The Role
The Senior Data Engineer will modernize ETL pipelines using Python and dbt, manage data ingestion with Kafka and Apache NiFi, and collaborate in an Agile environment to maintain high-quality data workflows.
Job Brief
We are seeking a skilled Senior Data Engineer to support the modernization of existing Talend-based ETL pipelines into a modern data engineering ecosystem leveraging Python, dbt, Kafka, Apache NiFi, and orchestration tools such as Airflow or Dagster.
You will work closely with senior engineers to migrate, build, test, and maintain high-quality data pipelines across the organization. This role is ideal for professionals with strong hands-on data engineering skills, a collaborative mindset, and an eagerness to work with modern data stack technologies.
VentureDive Overview
Founded in 2012 by veteran technology entrepreneurs from MIT and Stanford, VentureDive is the fastest-growing technology company in the region that develops and invests in products and solutions that simplify and improve the lives of people worldwide. We aspire to create a technology organization and an entrepreneurial ecosystem in the region that is recognized as second to none in the world.
Key Responsibilities:
Pipeline Migration & Development
- Assist in re-engineering legacy Talend pipelines into Python, dbt, and Airflow/Dagster workflows (see the sketch after this list).
- Ensure pipeline logic, data mappings, and tests are accurately replicated and validated.
- Support both legacy and new pipeline environments during the transition period.
- Develop and maintain data ingestion flows using Kafka, Apache NiFi, and REST APIs.
- Work with batch and streaming data across structured, semi-structured, and unstructured formats.
- Implement data validation, quality checks, schema enforcement, and row-level transformations.
- Contribute to dbt development (models, tests, documentation, snapshots).
- Support transformation logic to maintain accuracy, maintainability, and lineage.
- Monitor daily ETL/ELT workflows for failures, bottlenecks, or data quality issues.
- Perform root-cause analysis and escalate complex issues when needed.
- Optimize performance across data ingestion, processing, and transformation layers.
- Maintain well-structured documentation for pipeline logic, migration work, and data flows.
- Collaborate with senior engineers, QA, data analysts, architects, and platform teams.
- Participate in Agile ceremonies: stand-ups, planning, reviews, and retrospectives.
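To make the target stack concrete, here is a minimal sketch of what a re-engineered Talend job could look like as an Airflow DAG that runs ingestion, dbt models, and dbt tests in sequence. It assumes Airflow 2.4+ and a dbt project available on the worker; the pipeline name `orders_daily`, the `ingest_orders.py` script, and the `orders` dbt selector are hypothetical placeholders, not details from this posting.

```python
# Minimal, hypothetical sketch: one migrated daily pipeline as an Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="orders_daily",             # placeholder name for a migrated Talend job
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Land the day's raw data; the script stands in for a Kafka/NiFi/REST ingestion step.
    ingest = BashOperator(
        task_id="ingest_orders",
        bash_command="python ingest_orders.py --date {{ ds }}",
    )

    # Rebuild the pipeline's dbt models, then run their tests before downstream use.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select orders",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --select orders",
    )

    ingest >> dbt_run >> dbt_test
```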
Required Skills:
- 5+ years of experience in Data Engineering or ETL development.
- Demonstrated ability to design, build, and maintain robust ELT/ETL pipelines.
- Proficiency writing production-grade code (preferably in Python).
- Hands-on with SQL (including analytical queries, CTEs, window functions, optimization).
- Experience building pipelines using orchestration tools (Airflow, Databricks Workflows, etc.).
- Proven comfort with version control, automated testing, code review, CI/CD for data.
- A strong plus: practical Databricks experience, including confident use of Spark APIs, Delta Lake features (ACID transactions, schema evolution, time travel), and Unity Catalog for data management and access governance.
- Familiarity with data quality frameworks is welcome.
- Skilled in designing scalable, maintainable, and performant data models (e.g., star/snowflake, normalization, partitioning, incremental strategies).
- Can articulate and justify trade-offs in storage, compute, and access layer designs.
- Proactive in identifying and fixing pipeline/data quality issues.
- Strong troubleshooting, debugging, and root cause analysis skills; goes beyond surface-level solutions.
- Able to reason about idempotency, error handling, recovery, backfilling, and other critical production concerns (see the sketch after this list).
- Actively seeks to understand how and why things work.
- Consistently dives deeper.
- Explains in depth: not just what was done, but why, and what trade-offs were considered or learned.
- Excellent communicator: explains choices and solutions clearly, tailors depth for technical and non-technical audiences.
- Can demonstrate hands-on Python and SQL skills through technical interviews or assessments (not just design/theoretical expertise).
- Has built/maintained production pipelines for large and/or complex data sets.
- Excels in writing, debugging, and optimizing both SQL and Python.
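As a concrete reading of the idempotency and backfilling point above, the sketch below shows a partition-scoped, delete-then-insert load that can be rerun or backfilled for any date without duplicating data. Everything in it is illustrative: the `orders` table, its columns, and the in-memory SQLite connection are placeholders for whatever warehouse a real pipeline targets.

```python
# Illustrative idempotent load: reload exactly one date partition per call.
import sqlite3
from datetime import date


def load_partition(conn: sqlite3.Connection, run_date: date, rows: list[tuple[str, float]]) -> None:
    """Reload one date partition atomically, so reruns and backfills are safe."""
    key = run_date.isoformat()
    with conn:  # one transaction: the partition is replaced completely or not at all
        # Drop whatever an earlier (possibly failed) run wrote for this date...
        conn.execute("DELETE FROM orders WHERE order_date = ?", (key,))
        # ...then insert the freshly recomputed partition.
        conn.executemany(
            "INSERT INTO orders (order_date, order_id, amount) VALUES (?, ?, ?)",
            [(key, order_id, amount) for order_id, amount in rows],
        )


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_date TEXT, order_id TEXT, amount REAL)")
    load_partition(conn, date(2024, 1, 1), [("A-1", 10.0), ("A-2", 25.5)])
    # Rerunning the same date replaces the partition instead of duplicating rows.
    load_partition(conn, date(2024, 1, 1), [("A-1", 10.0), ("A-2", 25.5)])
    print(conn.execute("SELECT COUNT(*) FROM orders").fetchone())  # (2,)
```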
What we look for beyond required skills
In order to thrive at VentureDive, you
…are intellectually smart and curious
…have passion for and take pride in your work
…deeply believe in VentureDive’s mission, vision, and values
…have a no-frills attitude
…are a collaborative team player
…are ethical and honest
Are you ready to put your ideas into products and solutions that will be used by millions?
You will find VentureDive to be a fast-paced, high-standards, fun, and rewarding place to work. Not only will your work reach millions of users worldwide, but you will also be rewarded with a competitive salary and benefits. If you think you have what it takes to be a VenDian, come join us ... we're having a ball!
#LI-Hybrid
Top Skills
Airflow
Apache NiFi
CI/CD
Dagster
dbt
Git
Kafka
Python
The Company
What We Do
VentureDive is an award-winning digital development company that builds cutting-edge technology solutions to improve lives globally. Since its inception in 2012, the firm has enabled two tech unicorns and successfully driven digital transformation initiatives for large enterprises. Led by co-founders Atif Azim and Shehzaad Nakhoda, VentureDive has a presence in Silicon Valley, London, Portugal, Dubai, and Pakistan. To learn more, visit https://www.venturedive.com.