The Role
The Data Platform DevOps Engineer will build and automate an enterprise Data Platform on AWS, managing infrastructure and CI/CD pipelines for data teams.
We are seeking an experienced Data Platform Engineer to join our core platform team. In this role, you will be responsible for building, securing, and automating our enterprise Data Platform on AWS. You will go beyond basic pipeline creation by designing and maintaining the underlying infrastructure and CI/CD frameworks that enable our data teams to operate and scale efficiently.
Job Responsibilities
• Cloud & Platform Infrastructure (IaC): Deploy and maintain Databricks workspaces and AWS infrastructure (VPC, PrivateLink, IAM, S3, Lambda, EKS, and Fargate) using Terraform.
• Unity Catalog Implementation: Automate the governance layer, including metastore configuration, external locations, and access controls within Unity Catalog.
• Security & Compliance: Ensure the platform adheres to enterprise security standards by implementing and managing automated security controls for cloud infrastructure and data protection.
• Workspace Lifecycle Management: Use Terraform for end-to-end workspace provisioning, ensuring consistent setup across Dev, Acc, and Prod environments.
• Governance & Cost Control (Policies): Design and implement policies and guardrails that enforce platform standards and keep costs under control.
• Identity & Access Automation: Automate assignment of permissions using Terraform. Manage Service Principals for pipelines and map groups to specific Workspace roles and Unity Catalog grants.
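To give a flavor of the Unity Catalog and access automation described above, a minimal Terraform sketch using the Databricks provider might look as follows. All names here (bucket, credential, group) are hypothetical placeholders, not details from this posting:

```hcl
# Hypothetical sketch: an external location plus a Unity Catalog grant,
# managed via the Databricks Terraform provider. Bucket, credential,
# and group names are placeholders.
resource "databricks_external_location" "raw" {
  name            = "raw-data"
  url             = "s3://example-data-bucket/raw"
  credential_name = databricks_storage_credential.main.name
}

resource "databricks_grants" "raw" {
  external_location = databricks_external_location.raw.id
  grant {
    principal  = "data-engineers"   # placeholder account group
    privileges = ["READ_FILES"]
  }
}
```

In practice, grants to groups (rather than individual users) keep the Terraform state small and make environment promotion repeatable.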
• DevOps & Automation (CI/CD)
• Pipeline Architecture: Oversee GitLab CI/CD pipelines for data projects, transitioning the team from manual notebook deployments to automated workflows.
• Databricks Asset Bundles (DABs): Standardize deployment strategies using DABs. Develop templates and presets for Data Engineers to deploy jobs and workflows.
• Release Management: Implement branching strategies, code review policies, and environment promotion rules (Dev → Acc → Prod).
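To make the deployment model concrete, a minimal Databricks Asset Bundle might resemble the sketch below; the bundle name, job, and notebook path are hypothetical. A GitLab CI stage per environment would then run `databricks bundle deploy -t <target>` to promote through Dev → Acc → Prod:

```yaml
# databricks.yml — minimal hypothetical bundle; names and paths are placeholders
bundle:
  name: example_etl

targets:
  dev:
    mode: development
  acc:
    mode: production
  prod:
    mode: production

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/etl
```

Templates like this are what let data engineers deploy jobs without hand-editing workspace configuration.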
• Service Organization & Operations
• Observability: Configure monitoring, alerting, and logging (using system tables or integration with tools like CloudWatch) to ensure platform stability.
• Support & Incident Management: Serve as an escalation point for platform-related incidents.
• Knowledge Sharing: Document best practices and conduct workshops to upskill data engineers on effective platform usage.
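For the observability work, Databricks system tables expose billing and usage data that can feed dashboards or alerts. A representative query (assuming system schemas are enabled on the metastore) might be:

```sql
-- Sketch: daily DBU consumption per workspace and SKU from system tables.
-- Assumes the system.billing schema has been enabled for the metastore.
SELECT
  workspace_id,
  usage_date,
  sku_name,
  SUM(usage_quantity) AS dbus
FROM system.billing.usage
GROUP BY workspace_id, usage_date, sku_name
ORDER BY usage_date DESC;
```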
Job Qualifications:
• Bachelor’s degree in Computer Science, Software Engineering, Mathematics, or a related field.
• 5+ years industry experience in Data Engineering, Cloud Infrastructure, or DevOps; 3+ years with Databricks in enterprise settings.
• Advanced Terraform skills for managing cloud infrastructure and Databricks resources.
• Broad, hands-on knowledge of the AWS service portfolio.
• Expertise in CI/CD pipelines using GitLab CI and Databricks Asset Bundles.
• Deep understanding of Databricks Lakehouse architecture, Unity Catalog, Serverless Compute, Delta Lake, and Workflow orchestration.
• Solid grasp of SDLC/DataOps, including unit testing, modular code, and Git strategies.
• Proficient in Python (e.g., automation, PySpark, pandas) and Bash/Shell scripting for CI/CD.
• Excellent communication, documentation, mentoring, and collaboration skills.
• Preferred: Databricks Certified Data Engineer Professional or AWS Solutions Architect certification.
The Company
What We Do
NXP Semiconductors N.V. (NASDAQ: NXPI) enables a smarter, safer and more sustainable world through innovation. As a world leader in secure connectivity solutions for embedded applications, NXP is pushing boundaries in the automotive, industrial & IoT, mobile, and communication infrastructure markets. Built on more than 60 years of combined experience and expertise, the company has approximately 34,500 employees in more than 30 countries and posted revenue of $13.21 billion in 2022. Find out more at www.nxp.com. Privacy Policy: https://www.nxp.com/company/about-nxp/privacy-policy-for-social-media-pages:PRIVACY-POLICY-SOCIAL-MEDIA