Data Engineer (Databricks Focus)
About the Role
We are seeking an experienced Data Engineer to join our growing data team and play a key role in modernizing our analytics platform. In 2026, we will be executing a large-scale migration and rehydration of ~500 existing PowerBI reports, including re-connecting and optimizing data sources in a new lakehouse environment.
Key Responsibilities
- Design, build, and maintain scalable data pipelines using Databricks (Delta Lake, Unity Catalog, Spark).
- Lead or significantly contribute to the migration and rehydration of approximately 500 PowerBI reports in 2026, including re-pointing and optimizing data sources.
- Implement and maintain CI/CD pipelines for data assets using Databricks Asset Bundles (DAB), GitHub Actions, and other modern DevOps practices.
- Collaborate with data analysts, BI developers, and business stakeholders to ensure data availability, performance, and reliability.
- Optimize ETL/ELT processes for performance, cost, and maintainability.
- Establish best practices for version control, testing, and deployment of notebooks, workflows, and Delta Live Tables.
Required Experience & Skills
- 4+ years of hands-on data engineering experience (5-7+ years of experience overall).
- Strong proficiency in Python and SQL.
- Deep experience with Databricks (workspace administration, cluster management, Delta Lake, Unity Catalog, workflows, and notebooks).
- Proven track record implementing CI/CD for data workloads (preferably using Databricks Asset Bundles and GitHub Actions).
- Solid understanding of Spark (PySpark and/or Spark SQL).
- Experience with infrastructure-as-code and modern data DevOps practices.
- Relevant certifications strongly preferred:
  - Databricks Certified Data Engineer Associate or Professional
  - Azure Data Engineer Associate (DP-203) or equivalent AWS/GCP certifications
Nice-to-Have / Bonus Skills
- Experience extracting data from SAP/HANA or S/4HANA systems (via ODP, CDS views, SDA, etc.).
- Previous large-scale PowerBI migration or re-platforming projects.
- Familiarity with Databricks SQL warehouses, Serverless, or Lakehouse Monitoring.
- Experience with dbt, Delta Live Tables, or Lakeflow.
If you have strong Databricks + CI/CD + Asset Bundles experience and are excited about transforming a large PowerBI footprint into a modern Lakehouse architecture, you’re a good fit.
What We Do
Headquartered in Santa Clara, California, and backed by renowned private equity firms Advent International and Warburg Pincus, Encora is the preferred technology modernization and innovation partner to some of the world’s leading enterprise companies. It provides award-winning digital engineering services including Product Engineering & Development, Cloud Services, Quality Engineering, DevSecOps, Data & Analytics, Digital Experience, Cybersecurity, and AI & LLM Engineering. Encora's deep vertical capabilities extend across diverse industries, including HiTech, Healthcare & Life Sciences, Retail & CPG, Energy & Utilities, Banking Financial Services & Insurance, Travel, Hospitality & Logistics, Telecom & Media, Automotive, and other specialized industries.
With over 9,000 associates in 47+ offices and delivery centers across the U.S., Canada, Latin America, Europe, India, and Southeast Asia, Encora delivers nearshore agility to clients anywhere in the world, coupled with expertise at scale in India. Encora’s Cloud-first, Data-first, AI-first approach enables clients to create differentiated enterprise value through technology.