Principal, Data Engineer

Posted 23 Days Ago
Bangalore, Bengaluru Urban, Karnataka, IND
In-Office
Expert/Leader
Fintech
The Role
Lead the architecture and development of cloud data platforms, implementing data engineering solutions, optimizing performance, and mentoring teams.
Job Description:

The Purpose of this Role

We are seeking a Principal Data Engineer with 10+ years of experience to architect and scale modern data platforms, lead cloud data engineering initiatives, and deliver robust, secure, high‑performance data solutions. We use data and analytics to personalize customer experiences and to develop solutions that help our customers live the lives they want. As part of our digital transformation, we have made significant investments in cloud data lake platforms. We are looking for a hands-on data engineer to help design and develop our next-generation Cloud Data Lake and Analytics Platform for Workplace Solutions.

The Expertise You Have

  • 10+ years in Data Engineering / Big Data / Platform Engineering with end‑to‑end delivery of large enterprise programs.

  • Multi‑tenant database and data modeling expertise—designing tenant‑aware schemas and delivering efficient, robust database code for websites and products.

  • Advanced SQL across Oracle, Azure SQL Managed Instance (SQL MI), MySQL, and Snowflake; strong performance tuning and query optimization.

  • Hands‑on with cloud data platforms: Azure (preferred), including big data services and integrations.

  • Deep experience building ETL/ELT pipelines using Informatica, SnapLogic, Azure Functions, and Python, targeting Snowflake and other cloud data warehouses.

  • Proven track record designing, developing, and deploying batch data processing pipelines and feature generation platforms using Azure, Python, and SQL.

  • Strong DataOps/CI/CD for data and code: GitHub‑based workflows, automated database releases, and source control best practices across the SDLC.

  • Experience with event/streaming: Azure Event Hubs, Kafka, or Amazon Kinesis.

  • Data architecture design & documentation using Confluence and draw.io; metadata management and lineage.

  • Production excellence in financial services / benefits administration—triaging incidents, root cause analysis (logs/reports), and minimizing business impact.

  • Experience leading or mentoring teams on business‑critical data platform solutions; strong stakeholder communication.
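To make the tenant‑aware schema expectation above concrete, here is a minimal, hedged sketch of the common pattern: every table carries a `tenant_id`, the tenant leads the primary key, and every access path filters on it. Table and column names are hypothetical, and SQLite stands in for the production database purely so the example is self‑contained.

```python
import sqlite3

# Illustrative multi-tenant pattern (hypothetical names): each table
# carries tenant_id, and tenant_id leads the composite primary key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE plan_balances (
        tenant_id   TEXT NOT NULL,
        account_id  TEXT NOT NULL,
        balance     REAL NOT NULL,
        PRIMARY KEY (tenant_id, account_id)  -- tenant leads the key
    );
""")
conn.executemany(
    "INSERT INTO plan_balances VALUES (?, ?, ?)",
    [("acme", "a-1", 1200.0), ("acme", "a-2", 300.0), ("globex", "g-1", 950.0)],
)

def total_balance(tenant_id: str) -> float:
    # Every query is scoped by tenant_id, so one tenant can never
    # read or aggregate another tenant's rows.
    row = conn.execute(
        "SELECT COALESCE(SUM(balance), 0) FROM plan_balances WHERE tenant_id = ?",
        (tenant_id,),
    ).fetchone()
    return row[0]
```

With this layout, per‑tenant lookups hit the leading edge of the primary‑key index, which is one reason tenant‑first keys are a common design choice.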

The Skills That Are Key to This Role

  • Translate complex business and analytical needs into scalable, resilient data architectures and pipelines.

  • Design and implement high‑performance distributed pipelines (batch & streaming) with strong SLAs, monitoring, and cost efficiency.

  • Build lakehouse and warehouse solutions (e.g., Delta/Iceberg/Hudi + Snowflake) with robust semantic/data models.

  • End‑to‑end ownership: requirements → architecture → build → deploy → monitor → optimize.

  • ETL/ELT excellence with Informatica, SnapLogic, Azure Data Factory, Azure Functions, Python, and Databricks/Spark.

  • Azure platform proficiency, including:

    • Azure Data Factory, Azure Databricks, AKS, Azure Service Bus (ASB), API Management, Storage Accounts, Event Hub/Event Grid, Redis Cache.

  • Strong RDBMS/NoSQL experience: Oracle, SQL Server/SQL MI, MySQL, Postgres, Cosmos DB, MongoDB.

  • DataOps/DevOps: CI/CD pipelines (GitHub Actions/Azure DevOps), unit/integration tests, automated data quality, and infrastructure‑as‑code (Bicep/ARM/Terraform).

  • Data quality & observability (e.g., Great Expectations/Deequ/Monte Carlo), schema evolution, partitioning, and performance tuning.

  • Build tools and frameworks that turn pipelines into actionable insights for key business KPIs; partner with BI/analytics teams.

  • Strong Agile delivery—driving epics, stories, and tasks using Agile/Scrum and Jira; excellent written and verbal communication.
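As one way to picture the "automated data quality" expectation in the list above: lightweight checks that run in CI before a load is promoted, failing the pipeline when any expectation is violated. This is a hand‑rolled sketch, not the Great Expectations or Deequ API, and the field names are hypothetical.

```python
from typing import Callable

# Each expectation maps a name to a predicate over one record.
EXPECTATIONS: dict[str, Callable[[dict], bool]] = {
    "balance_non_negative": lambda r: r["balance"] >= 0,
    "account_id_present":   lambda r: bool(r.get("account_id")),
    "tenant_id_present":    lambda r: bool(r.get("tenant_id")),
}

def validate_batch(records: list[dict]) -> dict[str, int]:
    """Return a failure count per expectation; a CI gate can block
    promotion of the batch when any count is non-zero."""
    failures = {name: 0 for name in EXPECTATIONS}
    for record in records:
        for name, check in EXPECTATIONS.items():
            if not check(record):
                failures[name] += 1
    return failures

batch = [
    {"tenant_id": "acme", "account_id": "a-1", "balance": 1200.0},
    {"tenant_id": "acme", "account_id": "",    "balance": -5.0},  # two violations
]
report = validate_batch(batch)
```

Purpose‑built tools add profiling, docs, and persistence of results, but the core contract is the same: named expectations evaluated per batch, with failures surfaced before downstream consumers see the data.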

Good to Have Skills for This Role

  • dbt for modular SQL development, testing, and documentation.

  • Mainframe/enterprise integrations (e.g., Control‑M, DB2, CICS) and hybrid/cloud connectivity patterns.

  • Advanced Spark optimization and distributed compute internals.

  • Containerization/Kubernetes for data workloads and platform services.
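For context on the dbt bullet above: dbt's built‑in schema tests compile down to SQL that returns failing rows, and a test passes when the query returns nothing. A hedged, standalone sketch of the equivalent `unique` and `not_null` checks follows (SQLite here for portability; dbt would template this SQL against the warehouse, and the table is hypothetical).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_account (account_id TEXT, tenant_id TEXT);
    INSERT INTO dim_account VALUES ('a-1', 'acme'), ('a-2', 'acme'), ('a-1', 'acme');
""")

# unique test: any account_id appearing more than once is a failing row.
unique_failures = conn.execute("""
    SELECT account_id FROM dim_account
    GROUP BY account_id HAVING COUNT(*) > 1
""").fetchall()

# not_null test: rows with a NULL key are failing rows.
not_null_failures = conn.execute(
    "SELECT account_id FROM dim_account WHERE account_id IS NULL"
).fetchall()
```

Here the duplicate `a-1` row makes the uniqueness test fail while the not‑null test passes, which is exactly how dbt would report it.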

The Value You Deliver

  • Establish and scale a multi‑tenant, secure, and cost‑efficient data platform that powers analytics, ML, and real‑time applications.

  • Convert leadership vision into reference architectures, patterns, and platform roadmaps; influence data strategy across domains.

  • Standardize data modeling, metadata, and pipeline frameworks; raise engineering quality via code reviews, design forums, and mentorship.

  • Optimize compute/storage spend, improve pipeline reliability/throughput, and reduce manual toil through automation.

  • Build robust batch and streaming foundations with clear SLAs, lineage, observability, and self‑service enablement.

  • Ensure operational excellence—monitor, triage, conduct RCA, and continuously harden systems to protect business outcomes.

  • Deliver actionable, trusted data that accelerates decision‑making and KPI visibility across business units and products.

The Expertise We’re Looking For

  • 10+ years professional experience with multiple successful enterprise data platform deliveries.

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or equivalent experience.

How Your Work Impacts the Organization

You will strengthen Workplace Investments (WI) by modernizing data platforms and enabling reliable, timely, and secure data for analytics, customer experiences, and operational excellence. Your work underpins retirement solutions, employer services, investor centers, and advisory capabilities—improving both business outcomes and customer trust.

Location & Shift

  • Location: Bangalore

  • Shift: 11:00 AM – 08:00 PM


Category: Information Technology
The Company
HQ: Boston, MA
58,848 Employees
Year Founded: 1946

What We Do

At Fidelity, our goal is to make financial expertise broadly accessible and effective in helping people live the lives they want. We do this by focusing on a diverse set of customers: from 23 million people investing their life savings, to 20,000 businesses managing their employee benefits, to 10,000 advisors needing innovative technology to invest their clients’ money. We offer investment management, retirement planning, portfolio guidance, brokerage, and many other financial products. Privately held for nearly 70 years, we’ve always believed that by providing investors with access to information and expertise, we can help them achieve better results. That’s been our approach: innovative yet personal, compassionate yet responsible, and grounded by a tireless work ethic. It is the heart of the Fidelity way.
