Job Description:
Job Title: Data Engineer
Experience: 2–5 years
Location: Remote
About the Role
We’re seeking a Data Engineer who combines strong technical skills with a structured and collaborative approach to engineering. You’ll be responsible for building reliable data pipelines and optimizing data systems that power analytics, reporting, and product intelligence. This role requires hands-on expertise in Python, SQL, and Snowflake.
Key Responsibilities
Data Engineering & Development
- Design, implement, and maintain efficient ETL/ELT workflows using Python and SQL.
- Build, optimize, and maintain scalable Snowflake data models for analytics and operational use cases.
- Ensure high data quality, consistency, and accessibility across different teams and systems.
- Automate repetitive data processes and continuously improve data infrastructure performance.
- Collaborate closely with data analysts, scientists, and product teams to understand data requirements and deliver actionable datasets.
Engineering Excellence & Collaboration
- Participate in and lead technical design reviews, ensuring alignment with established engineering standards.
- Contribute to planning and prioritization across concurrent data projects, breaking down tasks, identifying risks, and maintaining delivery momentum.
- Provide visibility into team progress, workloads, and potential bottlenecks, escalating issues where needed.
- Support resourcing efforts by identifying capacity gaps and recommending solutions such as internal alignment or contractor engagement.
- Encourage cross-team knowledge sharing and alignment on engineering practices.
Leadership & Communication
- Demonstrate strong project management and organizational abilities, balancing technical depth with delivery focus.
- Prepare and present technical solutions and recommendations for internal stakeholders and client discussions.
- Ensure all technical contributions align with product goals and overall data architecture strategy.
Skills & Experience
- Proficiency in Python (data processing, automation, APIs).
- Advanced SQL skills (optimization, data transformation, complex queries).
- Hands-on experience with Snowflake (data warehousing, schema design, performance tuning).
- Working knowledge of cloud data platforms (AWS) and pipeline orchestration tools (Airflow, dbt, etc.).
- Working knowledge of AWS Glue, RDS, Aurora, and Redshift.
- Experience with Git, CI/CD, and modern engineering workflows.
- Strong analytical, communication, and collaboration skills.
Why You’ll Love Working Here
- Collaborate with talented engineers on data challenges that drive real business impact.
- Work with a modern cloud data stack and best-in-class tools.
- Be part of a culture that values ownership, learning, and continuous improvement.
Location: DGS India - Bengaluru - Manyata N1 Block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
What We Do
Merkle is a leading technology-enabled, data-driven customer experience management (CXM) company. For over 30 years, Fortune 1000 companies and leading nonprofit organizations have partnered with us to build and maximize the value of their customer portfolios. We work with world-class brands like Dell, T-Mobile, Samsung, GEICO, Regions, Kimberly-Clark, AARP, Lilly, Sanofi, NBC Universal, DIRECTV, American Cancer Society, Habitat for Humanity, and many others to build and execute customer-centric business strategies. With more than 9,600 smart, dedicated people in more than 50 offices around the world, we are still growing at a rate that outpaces the market, with 2019 net revenue of $1.1 billion.