Our client is one of the United States’ largest insurers, providing a wide range of insurance and financial services products with gross written premiums well over US$25 billion (P&C). They proudly serve more than 10 million U.S. households with more than 19 million individual policies across all 50 states, through the efforts of over 48,000 exclusive and independent agents and nearly 18,500 employees. Finally, our client is part of one of the largest insurance groups in the world.
Job Summary: We are looking for a skilled Data Engineer to design, build, and maintain data pipelines that support analytics and business intelligence initiatives. This role involves both enhancing existing pipelines and developing new ones to integrate data from diverse internal and external sources. The ideal candidate will have advanced SQL and Informatica skills, experience in ETL development, and a foundational understanding of dimensional data modeling. Experience with dbt is a plus.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL workflows to ensure reliable data integration across platforms.
- Enhance and optimize existing data pipelines by adding new attributes, improving performance, or increasing maintainability.
- Build new data ingestion pipelines from a variety of structured and semi-structured sources.
- Use Informatica to develop and manage ETL processes in alignment with business requirements.
- Write and optimize complex SQL queries for data transformation, validation, and extraction.
- Apply basic knowledge of dimensional data modeling to support reporting and data warehousing needs.
- Collaborate with data analysts, data scientists, and business teams to understand data needs and deliver clean, structured datasets.
- Participate in code reviews, documentation, and testing to ensure quality and accuracy in data delivery.
- Work in agile or project-based environments to deliver on sprint goals and project timelines.
Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
- 2–4 years of hands-on experience in data engineering or ETL development (MUST)
- Previous experience using Informatica Cloud (MUST)
- SQL: Advanced-level proficiency in writing, optimizing, and troubleshooting queries.
- ETL Tools: Intermediate experience building and managing pipelines using ETL platforms.
- Informatica: Advanced experience with PowerCenter or Informatica Cloud for data integration tasks.
- Dimensional Data Modeling: Basic understanding of star and snowflake schema designs.
- Excellent problem-solving and communication skills with an ability to collaborate across teams.
Nice to Have:
- Experience with dbt (data build tool) for modular and scalable transformation logic.
- Exposure to cloud data platforms (AWS, GCP, Azure).
This position comes with a competitive compensation and benefits package:
- Competitive salary and performance-based bonuses
- Comprehensive benefits package
- Career development and training opportunities
- Flexible work arrangements (remote and/or office-based)
- Dynamic and inclusive work culture within a globally renowned group
- Private Health Insurance
- Pension Plan
- Paid Time Off
- Training & Development
What We Do
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of 270,000 team members in nearly 50 countries. With its strong 50-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering, and platforms. The Group reported 2020 global revenues of €16 billion.