Senior Data Engineer (8+ Years in Python, ETL, and Snowflake Development; 2+ Years in Azure Data Factory)
Senior Data Engineer, Assurant, GCC-India
Reports To: Director of Product Engineering & Integrations
Position Summary
We are seeking a highly skilled Senior Data Engineer to design, develop, and optimize data pipelines and cloud-based data solutions. The ideal candidate will have strong expertise in Azure Data Factory, Snowflake, and modern ETL/ELT practices, enabling scalable, secure, and high-performance data workflows. This role will collaborate closely with analytics and BI teams to deliver reliable data infrastructure supporting enterprise reporting and advanced analytics.
This position will be based in Bangalore, Chennai, or Hyderabad at one of our India locations.
Work Time: 3:30 PM to 12:30 AM IST
What will be my duties and responsibilities in this job?
Data Engineering & ETL Development
- Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory, Snowflake Tasks, and Snowpipe for real-time and batch ingestion.
- Implement best practices for data modeling, transformation, and performance tuning within Snowflake.
- Build and manage pipelines connecting multiple structured and unstructured data sources across cloud and on-prem environments.
- Automate data quality checks, data lineage tracking, and error handling within ETL workflows.
Snowflake Development & Optimization
- Develop and maintain Snowflake schemas, views, stored procedures, and materialized views.
- Configure and optimize Snowpipe for continuous data loading.
- Utilize Snowsight for monitoring query performance, cost optimization, and workload analysis.
- Implement role-based access control and ensure data security in Snowflake.
Azure & Cloud Integration
- Integrate Azure Data Factory with other Azure services (Blob Storage, Synapse, Key Vault).
- Design scalable cloud architectures and orchestrate pipelines across hybrid environments.
- Implement CI/CD pipelines for data workflows using GitHub Actions.
Analytics & Reporting Enablement
- Collaborate with business analysts and BI teams to enable Power BI dashboards backed by optimized Snowflake data models.
- Create semantic models and data marts to support self-service analytics and reporting.
Scripting & Automation
- Develop Python scripts for automation, data processing, and custom integrations.
- Leverage Python-based frameworks (Pandas, PySpark, Airflow) to enhance pipeline capabilities.
What are the requirements needed for this position?
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 8+ years of experience in data engineering, ETL development, and cloud data platforms.
- Strong proficiency in Snowflake, Azure Data Factory, and Python.
- Experience with CI/CD, data security, and performance optimization.
- Familiarity with BI tools (Power BI, Looker, etc.) and data modeling best practices.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
What are the preferred requirements for this position?
- Knowledge of Airflow, PySpark, and data orchestration frameworks.
- Experience with real-time data ingestion and streaming architectures.
- Understanding of cost optimization in cloud environments.
What We Do
We work with the world’s top brands to make smart devices simpler. Vehicles last longer. Homes more secure. Problems easier to solve. And we volunteer in communities all over the globe to help the world become a greener, better place. We come from a variety of countries, cultures, and backgrounds. But we’re united by our enduring values of common sense, common decency, uncommon thinking, and uncommon results. So connect with us. Bring us your best work and your brightest ideas. And we’ll bring you a place where you can thrive.
We protect and secure:
62 million mobile devices
54 million motor vehicles
102 million household valuables, appliances, and electronics
31 million mortgages
55 million travelers and credit card holders
And that's just the beginning.