Sr. Databricks Data Engineer - I

India
Information Technology • Software • Analytics
The Role

Job Title: Databricks Data Engineer - I
Experience: 5+ years
Location: Remote
Job Type: Full-time with AB2

We are seeking an experienced Databricks Data Engineer who can play a crucial role in our Fintech data lake project.

What You Bring
• 5+ years of experience working with data warehousing systems
• 3+ years of strong hands-on programming expertise in the Databricks landscape, including Spark SQL and Workflows, for data processing and pipeline development
• 3+ years of strong hands-on data transformation/ETL skills using Spark SQL, PySpark, and Unity Catalog in a Databricks Medallion architecture (see the sketch after this list)
• 2+ years of work experience with one of the cloud platforms: Azure, AWS, or GCP
• Experience using Git version control, and well versed in CI/CD best practices to automate the deployment and management of data pipelines and infrastructure
• Nice to have: hands-on experience building data ingestion pipelines from ERP systems (preferably Oracle Fusion) into a Databricks environment using Fivetran or an alternative data connector
• Experience in a fast-paced, ever-changing, and growing environment
• Understanding of metadata management, data lineage, and data glossaries is a plus
• Must have report development experience using PowerBI, SplashBI, or any enterprise reporting tool
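
For illustration only, here is a minimal PySpark sketch of the kind of bronze-to-silver Medallion transformation this role involves. The Unity Catalog three-level names (fintech_lake.bronze.erp_invoices, fintech_lake.silver.erp_invoices) and the column names are hypothetical placeholders, not part of any actual project codebase.

# Minimal sketch: bronze -> silver transformation in a Databricks Medallion layout.
# All table and column names below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

bronze = spark.table("fintech_lake.bronze.erp_invoices")

silver = (
    bronze
    .dropDuplicates(["invoice_id"])                         # remove duplicate source rows
    .filter(F.col("invoice_amount").isNotNull())            # drop incomplete records
    .withColumn("invoice_date", F.to_date("invoice_date"))  # normalize the date type
    .withColumn("_ingested_at", F.current_timestamp())      # audit column for lineage
)

(
    silver.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("fintech_lake.silver.erp_invoices")
)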
What You’ll Do
• Participate in the design and development of enterprise data solutions in Databricks, from ideation to deployment, ensuring robustness and scalability
• Work with the Data Architect to build and maintain robust, scalable data pipeline architectures on Databricks using PySpark and SQL
• Assemble and process large, complex ERP datasets to meet diverse functional and non-functional requirements
• Participate in continuous optimization efforts, implementing testing and tooling techniques to enhance data solution quality
• Focus on improving the performance, reliability, and maintainability of data pipelines
• Implement and maintain PySpark and Databricks SQL workflows for querying and analyzing large datasets
• Participate in release management using Git and CI/CD practices
• Develop business reports using the SplashBI reporting tool, leveraging data from the Databricks gold layer (see the gold-layer sketch after this list)
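
As a companion to the responsibilities above, here is a minimal sketch of a silver-to-gold aggregation using Databricks SQL from PySpark, producing the kind of gold-layer table a PowerBI or SplashBI report could read. The table and column names are hypothetical assumptions.

# Minimal sketch: silver -> gold aggregation feeding an enterprise report.
# All table and column names below are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

gold = spark.sql("""
    SELECT
        customer_id,
        date_trunc('month', invoice_date) AS invoice_month,
        SUM(invoice_amount)               AS total_invoiced,
        COUNT(*)                          AS invoice_count
    FROM fintech_lake.silver.erp_invoices
    GROUP BY customer_id, date_trunc('month', invoice_date)
""")

(
    gold.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("fintech_lake.gold.monthly_invoice_summary")
)

In practice, a step like this would typically run as a scheduled Databricks Workflows task and be deployed through the Git-based CI/CD process described above.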
Qualifications
• Bachelor's degree in Computer Science, Engineering, Finance, or equivalent experience
• Good communication skills


The Company
HQ: Reston, Virginia
28 Employees
Year Founded: 2003

What We Do

DATAMAXIS takes pride in delivering a wide range of business IT modernization, data analytics, and technology management services. With command of cutting-edge developments in these fields, our team and consultants are ready to provide you with a robust technology modernization experience that delivers a significant boost in performance capability and operational efficiency.
