Position Description:
Designs, develops, tests, deploys, maintains, and enhances data science Machine Learning (ML) models by building Artificial Intelligence/Machine Learning (AI/ML) pipelines. Develops and tests products and applications to support cloud integrations and incorporate AI/ML and Large Language Model (LLM) technologies. Applies technical tools and methodologies to enable efficiency, performance, and agility by automating DevOps processes and Continuous Integration and Delivery (CI/CD) pipelines. Designs, develops, and implements Application Programming Interfaces (APIs). Creates and maintains cloud-based data pipeline infrastructure and API applications. Improves Unit Test Automation coverage to achieve the Automation Pyramid. Develops applications supporting data pipelines for feature engineering. Develops vector databases to optimize storage and querying of high-dimensional data, enabling rapid access to relevant information for LLM-driven applications.
Primary Responsibilities:
Builds and coordinates large-scale ML platforms and tools to enable prediction and optimize models.
Designs and develops a feature generation/store framework that promotes sharing of data/features among different ML models.
Partners with Data Scientists to help them use the foundational platform upon which models can be built and trained.
Operationalizes ML models at scale.
Builds tools to help detect shifts in data/features used by ML models and identifies issues in advance of deteriorating prediction quality.
Monitors the uncertainty of model outputs.
Automates prediction explanations for model diagnostics.
Explores new technology trends to simplify data and ML ecosystem.
Drives innovation and implements forward-looking solutions.
Designs and develops REST-based API solutions.
Delivers system automations by setting up Continuous Integration/Continuous Delivery (CI/CD) pipelines.
Mentors and coaches team members on project activities.
Education and Experience:
Bachelor’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and five (5) years of experience as a Principal AI/ML Engineer (or closely related occupation) building large-scale AI/ML pipelines for feature engineering and model deployment on AWS and Snowflake data warehouses.
Or, alternatively, Master’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and three (3) years of experience as a Principal AI/ML Engineer (or closely related occupation) building large-scale AI/ML pipelines for feature engineering and model deployment on AWS and Snowflake data warehouses.
Skills and Knowledge:
Candidate must also possess:
Demonstrated Expertise (“DE”) architecting, designing, and building feature engineering pipelines, deploying AI models, and optimizing model inference using PyTorch or TensorFlow; developing ML infrastructure and MLOps in the Cloud using Amazon Web Services (AWS): SageMaker, Lambda, Glue, and Step Functions; and working with predictive and optimization ML models in a development and production environment for deployment, inference, tuning, and required measurements.
DE architecting, designing, and building highly scalable Cloud-based Big Data applications according to business user requirements in AWS, using S3, EMR, Lambda, and Athena; acting as a member of a team responsible for implementing data lake strategies using Snowflake as a platform for structured and semi-structured data; and building and formulating data lake design patterns for data ingestion, processing, and extraction for personalization teams using Snowflake, SQL, Python, data warehousing, and advanced data modeling techniques (Entity-Relationship and Dimensional modeling).
DE architecting, designing, and building Retrieval Augmented Generation (RAG) techniques with vector databases to enhance the capabilities of generative AI/LLMs; and developing API frameworks for real-time and near real-time data ingestion (for customer interactions flowing from different channels) using AWS Services - Kinesis (Stream and Firehose).
DE maintaining Continuous Integration/Continuous Delivery (CI/CD) pipelines for application code using Jenkins and GitHub; developing Unix shell scripts and creating Control-M jobs to automate and schedule end-to-end processes; performing platform migration by transitioning on-premises systems to AWS cloud infrastructure; and leveraging the full potential of cloud-based environments and modern data warehousing technologies (Star Schema or Snowflake Schema) through End-to-End (E2E) migration planning, execution, and optimization.
#PE1M2
#LI-DNI
Most roles at Fidelity are Hybrid, requiring associates to work onsite every other week (all business days, M-F) in a Fidelity office. This does not apply to Remote or fully Onsite roles.
Please be advised that Fidelity’s business is governed by the provisions of the Securities Exchange Act of 1934, the Investment Advisers Act of 1940, the Investment Company Act of 1940, ERISA, numerous state laws governing securities, investment and retirement-related financial activities and the rules and regulations of numerous self-regulatory organizations, including FINRA, among others. Those laws and regulations may restrict Fidelity from hiring and/or associating with individuals with certain Criminal Histories.
What We Do
At Fidelity, our goal is to make financial expertise broadly accessible and effective in helping people live the lives they want. We do this by focusing on a diverse set of customers: from 23 million people investing their life savings, to 20,000 businesses managing their employee benefits, to 10,000 advisors needing innovative technology to invest their clients’ money. We offer investment management, retirement planning, portfolio guidance, brokerage, and many other financial products.
Privately held for nearly 70 years, we’ve always believed that by providing investors with access to information and expertise, we can help them achieve better results. That’s been our approach: innovative yet personal, compassionate yet responsible, grounded by a tireless work ethic. It is the heart of the Fidelity way.