Position Summary
Continuing its tradition of innovation, MetLife, as part of its Data and Analytics (DnA) function, has established a dedicated center for advanced analytics and research in India. DnA Hyderabad, also known as the Global Advanced Analytics and Research Center (GARC), is part of MetLife's larger Data and Analytics organization. It focuses on scaling data governance, data management, data engineering, data science/machine learning/artificial intelligence, visualization, and techno-project management capabilities, enabling a more cost-effective analytics operating model and increasing data and analytics maturity across the MetLife global community.
Driven by passion and purpose, we are looking for you, a high-performing data and analytics professional, to
drive and support the development and deployment of actionable, high-impact data and analytics solutions in
support of MetLife's enterprise functions and lines of business across markets. The portfolio of work
delivers data-driven solutions across key business functions such as customer acquisition and targeting,
engagement, retention, distribution, underwriting, claims service and operations, risk management,
investments, and audit, and tackles hard, open-ended problems.
The portfolio of work will support deployment of models to various clusters and environments, with support and guidance from Big Data Engineers, by following a set of standards, and will ensure operational readiness by incorporating configuration management, exception handling, and logging for end-to-end batch and real-time model operationalization. The position requires an understanding of data engineering, Azure DevOps, the Atlassian stack, and Containers as a Service.
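To illustrate the operational-readiness practices mentioned above (configuration management, exception handling, and logging), a minimal batch-scoring sketch in Python might look like the following. The function names, config format, and `model` callable are invented for illustration and are not MetLife's actual tooling:

```python
import json
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("model_batch")

def load_config(path: str) -> dict:
    """Configuration management: load environment-specific settings
    (paths, cluster endpoints) from a JSON file instead of hard-coding."""
    with open(path) as f:
        return json.load(f)

def score_batch(records, model):
    """Apply a scoring callable to each record, isolating per-record
    failures so one bad record does not abort the whole batch."""
    results, failures = [], 0
    for rec in records:
        try:
            results.append(model(rec))
        except Exception:
            failures += 1
            log.exception("scoring failed for record %r", rec)
    log.info("scored %d records, %d failures", len(results), failures)
    return results, failures
```

In a real deployment, the per-record try/except plus structured logging is what makes a batch job debuggable after the fact; the failure count can also be emitted as an alerting metric.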
You will work and collaborate with a nimble, autonomous, cross-functional team of makers, breakers, doers,
and disruptors who love to solve real problems and meet real customer needs. You will be using cutting-edge
technologies and frameworks to process data, create data pipelines and collaborate with the data science
team to operationalize end-to-end machine learning and AI solutions.
Job Responsibilities
- Contribute towards supporting the build/implementation of data ingestion and curation processes developed using big data tools such as Spark (Scala/Python), Hive, HDFS, Kafka, Pig, Oozie, Sqoop, Flume, ZooKeeper, Kerberos, Sentry, Impala, and CDP 7.x, under the guidance of Big Data Engineers.
- Support the ingestion of large data volumes from various platforms for analytics needs, and prepare high-performance, reliable, and maintainable ETL code with support and review guidance from senior team members.
- Support the monitoring of performance and recommend any necessary infrastructure changes to senior team members for their review.
- Understand the defined data security principles and policies developed using Ranger and Kerberos.
- Gain a broader understanding of how to support application developers and progressively work on efficient big data application development using cutting-edge technologies.
- Collaborate with business systems analysts, technical leads, project managers, and business/operations teams in building data enablement solutions across different LOBs and use cases.
- Understand and support the creation of reusable frameworks that optimize the development effort involved.
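The "reusable frameworks" responsibility above can be pictured with a small, framework-agnostic sketch in plain Python. The step names and record fields are invented for illustration; in practice each step would typically wrap a Spark or Hive transformation:

```python
from typing import Callable, Iterable, List

# A curation step takes a record (dict) and returns a transformed record.
Step = Callable[[dict], dict]

class CurationPipeline:
    """Minimal reusable pipeline: register steps once, apply to any source."""

    def __init__(self) -> None:
        self.steps: List[Step] = []

    def add(self, step: Step) -> "CurationPipeline":
        self.steps.append(step)
        return self  # return self to allow fluent chaining

    def run(self, records: Iterable[dict]) -> List[dict]:
        out = []
        for rec in records:
            for step in self.steps:
                rec = step(rec)
            out.append(rec)
        return out

# Example curation steps (hypothetical field names):
def trim_name(rec: dict) -> dict:
    rec["name"] = rec["name"].strip()
    return rec

def tag_source(rec: dict) -> dict:
    rec["source"] = "claims_feed"
    return rec
```

A team can then compose pipelines per use case, e.g. `CurationPipeline().add(trim_name).add(tag_source).run(rows)`, rather than re-implementing the plumbing for each LOB.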
Knowledge, Skills and Abilities
Education
- Bachelor's degree in Computer Science, Engineering, or related discipline
Experience
- 1 - 3 years of solutions development and delivery experience.
- Hive database management and performance tuning (partitioning/bucketing).
- Good SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.
- Basic experience building stream-processing systems using solutions such as Storm or Spark Streaming.
- Able to support the build/design of data warehouses and data stores for analytics consumption, on-premises or in the cloud (for both real-time and batch use cases), with guidance and help from a Sr. Big Data Engineer/Big Data Engineer.
- Experience in any model management methodologies is a plus.
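For context on the bucketing skill listed above: Hive's `CLUSTERED BY ... INTO n BUCKETS` assigns each row to a bucket by hashing the clustering key modulo the bucket count, which enables efficient sampling and bucketed map joins. The Python stand-in below shows the idea only; Hive itself uses Java `hashCode` semantics, and the toy hash here is purely illustrative:

```python
from collections import defaultdict

def bucket_for(key: str, n_buckets: int) -> int:
    # Toy hash (sum of character codes) modulo the bucket count.
    # Hive's real implementation hashes with Java hashCode semantics.
    return sum(ord(c) for c in key) % n_buckets

def bucket_records(records, key_field, n_buckets):
    """Group records into buckets the way a bucketed Hive table
    distributes rows on write."""
    buckets = defaultdict(list)
    for rec in records:
        buckets[bucket_for(rec[key_field], n_buckets)].append(rec)
    return dict(buckets)
```

Because the bucket is a pure function of the key, two tables bucketed the same way on the same key can be joined bucket-by-bucket without a full shuffle.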
Knowledge and skills (general and technical)
Required:
- Proficiency with hands-on development experience in some of the key tools: HDFS, Hive, Spark, Scala, Java, Python, Databricks/Delta Lake, Flume, Kafka, etc.
- Analytical skills to assess situations and arrive at optimal, efficient solutions based on requirements.
- Performance tuning and problem-solving skills.
- Able to support/contribute towards the design of a multi-tenant, containerized Hadoop architecture for memory/CPU management and sharing across different LOBs, under the guidance of senior data engineers in the team.
- Code versioning experience using Bitbucket.
- Good communication skills both written and verbal.
- Proficiency in preparing project documentation and providing other support as required.
Additional skills (good to have):
- Experience in Python and writing Azure Functions using Python/Node.js.
- Experience using Event Hub for data integrations.
- Supporting the implementation of analytical data stores on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/graph API).
- Proficiency in using tools like Git, Bamboo and other continuous integration and deployment tools.
- Exposure to data governance principles such as metadata and lineage (Collibra/Atlas).
About MetLife
Recognized on Fortune magazine's list of the "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East.
Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by our core values - Win Together, Do the Right Thing, Deliver Impact Over Activity, and Think Ahead - we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible . Join us!
#BI-Hybrid
What We Do
We're honored to be No. 10 on Great Place to Work's World's Best Workplaces and recognized in the Fortune 100 Best Companies to Work For® list in 2025. At MetLife, we're leading the global transformation of an industry we’ve defined for over 157 years.
At MetLife, every innovation and line of code is a lifeline for our customers and their families—from victims of natural disasters to people living with disabilities and beyond. With operations in more than 40 markets and leading positions across the globe, MetLife fosters an inclusive culture where our people are energized and inspired to deliver for our customers and communities.
Join our remarkable journey—one in which you help write the next century of innovation in financial services—because with MetLife, making the world a better place is All Together Possible.
Why Work With Us
At MetLife, you’ll be working for a company whose purpose is to help customers throughout their life’s journey, and often in their most critical time of need. You’ll be a part of developing leading-edge platforms that will have a lasting impact on the lives and well-being of tens of millions of customers.
Hybrid Workspace
Employees engage in a combination of remote and on-site work.
MetLife's current workplace policies classify roles as Office, Hybrid, or Virtual based on the nature of the work, encouraging new ways of working together.