The AI & Data Analytics Team is looking for a Senior Data Engineer. In this role, you will be responsible for designing, building, and optimizing robust data pipelines that process massive datasets in both batch and real time. You will work at the intersection of software engineering and data science, ensuring that our data architecture is scalable, reliable, and follows industry best practices.
Priorities can change in a fast-paced environment like ours, so this role's responsibilities include, but are not limited to, the following:
- Pipeline Development: Design and implement complex data processing pipelines using Apache Spark.
- Architectural Leadership: Build scalable, distributed systems that handle high-throughput data streams and large-scale batch processing.
- Infrastructure as Code: Manage and provision cloud infrastructure using Terraform.
- CI/CD & Automation: Streamline development workflows by implementing and maintaining GitHub Actions for automated testing and deployment.
- Code Quality: Uphold rigorous software engineering standards, including comprehensive unit/integration testing, code reviews, and maintainable documentation.
- Collaboration: Work closely with stakeholders to translate business requirements into technical specifications.
Required Qualifications:
- BA/BSc in Computer Science, Engineering, Mathematics, or a related technical discipline
- 5+ years of experience in data engineering and the software development life cycle
- 4+ years of hands-on experience building and maintaining production data applications, including current experience with both relational and columnar data stores
- 4+ years of hands-on experience working with AWS cloud services
- Comprehensive experience with one or more programming languages such as Python, Java, or Rust
- Comprehensive experience working with Big Data platforms (e.g., Spark, Google BigQuery, Azure, AWS S3)
- Familiarity with time-series databases, data-streaming applications, and event-driven architectures (e.g., Kafka, Flink)
- Experience with workflow management engines (e.g., Airflow, Luigi, Azure Data Factory)
- Experience with designing and implementing real-time pipelines
- Experience with data quality and validation
- Experience with API design
- Distributed Computing: Deep expertise in Apache Spark (Core, SQL, and Structured Streaming).
- Programming Mastery: Strong proficiency in Scala or Java. You should be comfortable building production-grade applications in a JVM-based environment.
- SQL Proficiency: Advanced knowledge of SQL for data transformation, analysis, and performance tuning.
- DevOps & Tools: Hands-on experience with Terraform for infrastructure management and GitHub Actions for CI/CD pipelines.
- Software Engineering Foundation: Solid understanding of data structures, algorithms, and design patterns. Experience applying "Clean Code" principles to data engineering.
- Stream Processing: Experience with Apache Flink for low-latency stream processing.
- Scripting: Proficiency in Python for automation, data analysis, or scripting.
- Cloud Platforms: Experience with AWS, Azure, or GCP data services (e.g., EMR, Glue, Databricks).
- Data Modeling: Familiarity with dimensional modeling, Lakehouse architectures (Delta Lake, Iceberg), or NoSQL databases.
Preferred Experience:
- Comprehensive knowledge of relational database concepts, including data architecture, operational data stores, interface processes, multidimensional modeling, master data management, and data manipulation
- Expert knowledge of and experience with custom ETL design, implementation, and maintenance
- Comprehensive experience designing, implementing, and iterating data pipelines using Big Data technologies
- Certification in AWS or other cloud providers
- Experience with Databricks notebook workflows
- Experience with Terraform
What We Do
Our storied and iconic brands embody the passion of their visionary founders and today’s customers in their innovative products and services: they include Abarth, Alfa Romeo, Chrysler, Citroën, Dodge, DS Automobiles, Fiat, Jeep®, Lancia, Maserati, Opel, Peugeot, Ram, Vauxhall and mobility brands Free2move and Leasys. Powered by our diversity, we lead the way the world moves – aspiring to become the greatest sustainable mobility tech company, not the biggest, while creating added value for all stakeholders as well as the communities in which we operate.
