Principal Data Engineer

Reposted 10 Hours Ago
Westlake, TX
In-Office
Senior level
Fintech
The Role
Leads data analysis projects, develops ETL/ELT pipelines for data management, and supports Agile/SCRUM methodology for project execution.
Summary Generated by Built In
Job Description:

Position Description: 

 

Provides data analysis leadership on complex systems analysis projects, often across subsystems and companies, in a matrix organization. Develops and deploys pipelines leveraging DevOps and Continuous Integration/Continuous Delivery (CI/CD) best practices with cloud-native infrastructure. Develops Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) pipelines to move data to and from the Snowflake data store, using Python and Snowflake SnowSQL. Establishes CD pipelines to deploy tools and practices, using GitHub, Jenkins, Stash, and Artifactory. Supports the creation, maintenance, and compliance of Agile/SCRUM development standards and guidelines. Performs data manipulation using Amazon Web Services (AWS). Influences decisions and solutions by working closely with squad leaders, engineers, and architects to form strategic partnerships. Performs data mining and data analysis using Oracle, SQL Server, and NoSQL databases.
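The ETL/ELT duties described above can be pictured with a minimal extract/transform/load sketch. This is an illustrative assumption, not Fidelity's actual code: the `staging.transactions` table, the field names, and the `execute_sql` callback are all hypothetical, and a production Snowflake pipeline would typically use `snowflake-connector-python` or bulk `COPY INTO` rather than row-by-row `INSERT`s.

```python
# Illustrative ETL sketch: extract rows, transform them, and stage them for a
# warehouse load. All names here are hypothetical examples.

def extract(source_rows):
    """Extract step: in practice this might query Oracle or read files from S3."""
    return [dict(row) for row in source_rows]

def transform(rows):
    """Transform step: normalize field values and drop invalid records."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:
            continue  # filter records with no amount
        cleaned.append({
            "account_id": str(row["account_id"]).strip(),
            "amount_usd": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows, execute_sql):
    """Load step: emit parameterized INSERTs via a caller-supplied executor."""
    for row in rows:
        execute_sql(
            "INSERT INTO staging.transactions (account_id, amount_usd) VALUES (%s, %s)",
            (row["account_id"], row["amount_usd"]),
        )

if __name__ == "__main__":
    source = [
        {"account_id": " A-100 ", "amount": "12.5"},
        {"account_id": "A-101", "amount": None},  # dropped by transform
    ]
    issued = []
    load(transform(extract(source)), lambda sql, params: issued.append(params))
    print(issued)  # [('A-100', 12.5)]
```

Separating the three steps behind plain functions like this is what makes such pipelines testable in the CI/CD setup the posting describes: each stage can be exercised in isolation before anything touches the warehouse.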

 

Primary Responsibilities: 

 

  • Designs, implements, and maintains data structures, batch jobs, and interfaces to external systems.  

  • Develops original and creative technical solutions to ongoing development efforts.

  • Develops applications for multiple projects supporting several divisional initiatives.  

  • Supports and performs all phases of testing leading to implementation.  

  • Assists in the planning and conducting of user acceptance testing.  

  • Develops comprehensive documentation for multiple applications supporting several corporate initiatives.  

  • Responsible for post-installation validation and triaging of any issues.  

  • Establishes project plans for projects of moderate scope.  

  • Performs independent and complex technical and functional analysis for multiple projects supporting several initiatives.  

  • Manages data services hosted on the operational data stores and file-based interfaces.  

  • Confers with systems analysts and other software engineers/developers to design systems.  

  • Gathers information on project limitations and capabilities, performance requirements, and interfaces.  

  • Develops and oversees software system testing and validation procedures, programming, and documentation. 

 

 

Education and Experience: 

 

Bachelor’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and five (5) years of experience as a Principal Data Engineer (or closely related occupation) analyzing, designing, and building ETL processes, and using data integration solutions to ensure reliable and scalable data management across operational or analytical capability platforms, using Informatica PowerCenter, Java Spring Batch, and AWS.  

 

Or, alternatively, Master’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and three (3) years of experience as a Principal Data Engineer (or closely related occupation) analyzing, designing, and building ETL processes, and using data integration solutions to ensure reliable and scalable data management across operational or analytical capability platforms, using Informatica PowerCenter, Java Spring Batch, and AWS. 

 

 

Skills and Knowledge: 

 

Candidate must also possess: 

 

  • Demonstrated Expertise (“DE”) architecting, designing, and developing microservices-based Application Programming Interfaces (APIs) and testing automation frameworks, using Java Spring, Swagger, Amazon Elastic Kubernetes Service (EKS), and serverless technologies.   

  • DE developing CI/CD pipelines in a hybrid on-prem and Cloud environment (AWS) to deliver changes in production and non-production environments, using DevOps tools (GitHub, Jenkins, Maven, and Terraform).  

  • DE analyzing, designing, developing, and testing ETL batch processing applications for data warehouse and OLTP-based systems, using Informatica PowerCenter, Java, and Oracle PL/SQL.

  • DE performing logical and physical data modeling for relational databases (Oracle, Snowflake, or Cockroach), and optimizing database and query performance by implementing appropriate data types, indexing strategies, and partitioning techniques based on data access patterns.
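The partitioning technique named in the last bullet can be illustrated with a short sketch of partition pruning: bucket rows by a partition key so a range query scans only the partitions that overlap the requested range. Everything here (the `PartitionedStore` class, the month-granularity scheme, the `trade_date` field) is a hypothetical illustration, not anything specified by the posting.

```python
from collections import defaultdict
from datetime import date

# Illustrative sketch of date-based partitioning: rows are bucketed by
# (year, month), so a range query visits only overlapping partitions
# instead of scanning every row.

class PartitionedStore:
    def __init__(self):
        self.partitions = defaultdict(list)  # (year, month) -> list of rows

    def insert(self, row):
        d = row["trade_date"]
        self.partitions[(d.year, d.month)].append(row)

    def query_range(self, start, end):
        """Return matching rows and the number of partitions scanned."""
        hits, scanned = [], 0
        for key, rows in sorted(self.partitions.items()):
            # partition pruning: skip months entirely outside [start, end]
            if (start.year, start.month) <= key <= (end.year, end.month):
                scanned += 1
                hits.extend(r for r in rows if start <= r["trade_date"] <= end)
        return hits, scanned

if __name__ == "__main__":
    store = PartitionedStore()
    for d in (date(2024, 1, 15), date(2024, 2, 10), date(2024, 6, 1)):
        store.insert({"trade_date": d})
    hits, scanned = store.query_range(date(2024, 2, 1), date(2024, 3, 31))
    print(len(hits), scanned)  # 1 1
```

In a real warehouse the same pruning happens declaratively: the optimizer skips partitions whose key range cannot match the predicate, which is why choosing a partition key aligned with the dominant access pattern matters.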



Category: Information Technology

Most roles at Fidelity are Hybrid, requiring associates to work onsite every other week (all business days, M-F) in a Fidelity office. This does not apply to Remote or fully Onsite roles.

Please be advised that Fidelity’s business is governed by the provisions of the Securities Exchange Act of 1934, the Investment Advisers Act of 1940, the Investment Company Act of 1940, ERISA, numerous state laws governing securities, investment and retirement-related financial activities and the rules and regulations of numerous self-regulatory organizations, including FINRA, among others. Those laws and regulations may restrict Fidelity from hiring and/or associating with individuals with certain Criminal Histories.

Top Skills

Amazon Elastic Kubernetes Service (EKS)
Amazon Web Services (AWS)
Artifactory
Git
Informatica PowerCenter
Java Spring Batch
Jenkins
NoSQL
Oracle
Python
Snowflake SnowSQL
SQL Server
Stash
Terraform

The Company
HQ: Boston, MA
58,848 Employees
Year Founded: 1946

What We Do

At Fidelity, our goal is to make financial expertise broadly accessible and effective in helping people live the lives they want. We do this by focusing on a diverse set of customers: from 23 million people investing their life savings, to 20,000 businesses managing their employee benefits, to 10,000 advisors needing innovative technology to invest their clients’ money. We offer investment management, retirement planning, portfolio guidance, brokerage, and many other financial products.

Privately held for nearly 70 years, we’ve always believed that by providing investors with access to information and expertise, we can help them achieve better results. That’s been our approach: innovative yet personal, compassionate yet responsible, grounded by a tireless work ethic. It is the heart of the Fidelity way.
