Senior Data Engineer

Bangalore, Bengaluru Urban, Karnataka, IND
Hybrid
Senior level
AdTech • Artificial Intelligence • Digital Media • Marketing Tech • Mobile • Other • Software
Let's make media better.
The Role
The Senior Data Engineer will design and maintain scalable data processing pipelines, focusing on ETL workflows and real-time data processing with technologies like Kafka and Spark, while collaborating with analytics and data science teams.

Who We Are

Verve has created a more efficient and privacy-focused way to buy and monetize advertising. Verve is an ecosystem of demand and supply technologies fusing data, media, and technology to deliver results and growth to both advertisers and publishers, no matter the screen or location, and no matter who, what, or where a customer is. With 30 offices across the globe and an eye toward serving forward-thinking advertising customers, Verve’s solutions are trusted by more than 90 of the United States’ top 100 advertisers, 4,000 publishers globally, and the world’s top demand-side platforms. Learn more at www.verve.com.


Who You Are

We're looking for a Senior Data Engineer to join our Data Processing team at Verve. This team is responsible for designing and maintaining scalable, real-time, and batch data processing pipelines that enable high-quality insights and analytics. The Data Processing team works closely with the Data Analytics and Data Science teams to ensure efficient data flows across the organization.

In this role, you'll focus on developing and optimizing data processing workflows, ETL pipelines, and streaming solutions. Precision, efficiency, and high-quality deliverables are essential, and you'll report to the Engineering Manager, Data Processing.

If you are passionate about working with big data technologies, real-time data processing, and optimizing large-scale ETL workflows, we’d love to hear from you!

This is a hybrid position with three days per week in our Bengaluru office.

What You Will Do

Domain Expertise:

  • Design, develop, and maintain scalable, efficient, and cost-effective data pipelines for real-time and batch data processing.

  • Build high-performance ETL workflows leveraging Kafka, Apache Spark, and Scala.

  • Implement best practices for data modeling, storage, and retrieval, ensuring high availability and reliability.

  • Work with GCP-based data services, including BigQuery, Dataflow, and Cloud Storage.

  • Optimize the performance and cost efficiency of data processing jobs.
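
For illustration only, a batch ETL job of the kind described above might look like the following Spark (Scala) sketch. The bucket paths, table layout, and column names are hypothetical examples, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch of a daily batch ETL job: read raw events from Cloud
// Storage, aggregate them, and write results back in a query-friendly
// layout. All paths and column names here are hypothetical examples.
object DailyAdEventsEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-ad-events-etl")
      .getOrCreate()

    val raw = spark.read
      .parquet("gs://example-bucket/raw/ad_events/dt=2024-01-01/")

    // Aggregate impressions and clicks per publisher per hour.
    val hourly = raw
      .withColumn("hour", date_trunc("hour", col("event_time")))
      .groupBy(col("publisher_id"), col("hour"))
      .agg(
        count(when(col("event_type") === "impression", true)).as("impressions"),
        count(when(col("event_type") === "click", true)).as("clicks")
      )

    // Partition output by hour so downstream queries can prune cheaply.
    hourly.write
      .mode("overwrite")
      .partitionBy("hour")
      .parquet("gs://example-bucket/agg/hourly_publisher_stats/")

    spark.stop()
  }
}
```

Partitioning the output by the aggregation key is one common way to keep downstream BigQuery or Spark reads cheap; the actual storage layout would depend on the team's conventions.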

Data Processing & Engineering

  • Architect and implement scalable streaming and batch processing solutions.

  • Develop real-time data processing pipelines using Kafka and Apache Spark Structured Streaming.

  • Ensure the stability and reliability of high-volume data ingestion pipelines.

  • Write clean, testable, and efficient code using Scala.

  • Ensure compliance with data privacy and security best practices.
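
As a rough sketch of the kind of pipeline described above, a Kafka-to-Spark Structured Streaming job could take the following shape. The broker addresses, topic name, event schema, and checkpoint path are all hypothetical; a production job would write to a real sink rather than the console.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

// Minimal sketch of a Kafka -> Spark Structured Streaming pipeline.
// Broker addresses, topic name, and the event schema are hypothetical.
object AdEventsStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ad-events-stream")
      .getOrCreate()

    val eventSchema = new StructType()
      .add("publisher_id", StringType)
      .add("event_type", StringType)
      .add("event_time", TimestampType)

    // Read raw Kafka records and parse the JSON value into typed columns.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092,broker-2:9092")
      .option("subscribe", "ad-events")
      .load()
      .select(from_json(col("value").cast("string"), eventSchema).as("e"))
      .select("e.*")

    // Windowed counts with a watermark so state for late events stays bounded.
    val counts = events
      .withWatermark("event_time", "10 minutes")
      .groupBy(window(col("event_time"), "1 minute"), col("event_type"))
      .count()

    counts.writeStream
      .outputMode("update")
      .format("console") // stand-in for a real sink such as BigQuery or GCS
      .option("checkpointLocation", "gs://example-bucket/checkpoints/ad-events")
      .start()
      .awaitTermination()
  }
}
```

The watermark is what makes high-volume ingestion sustainable here: without it, the aggregation state for late-arriving events would grow without bound.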

Collaborative Work

  • Work closely with Data Analytics and Data Science teams to ensure data is processed, stored, and accessed efficiently.

  • Collaborate with the Data Analytics team to provide structured data for data warehouses, reporting APIs, and UIs.

  • Support the Data Science team by ensuring high-quality data availability for machine learning models.

  • Work cross-functionally with engineering and product teams to design end-to-end data solutions.

  • Coordinate with global teams in different time zones to deliver projects successfully.

  • Extend optimized solutions to sister companies within Verve Group.

What You Will Bring

Must-Have Qualifications:

  • 5+ years of experience as a Data Engineer working on large-scale ETL/data processing pipelines.

  • Experience handling daily terabytes of data efficiently.

  • Strong knowledge of Apache Kafka for streaming data processing.

  • Strong expertise in Apache Spark (Scala) for batch and real-time processing.

  • Proficiency in orchestration tools like Apache Airflow.

  • Proficiency in GCP, including services like BigQuery, Dataflow, and Dataproc.

  • Solid understanding of data modeling techniques, including Star/Snowflake schemas and modern lakehouse architectures.

  • Strong SQL skills, including query optimization and performance tuning.

  • Familiarity with Apache Druid or other OLAP engines.

  • Strong knowledge of infrastructure as code (Terraform).

Nice-to-Have Qualifications:

  • Experience with AdTech or high-velocity data processing environments.

  • Hands-on experience with Hadoop ecosystem tools.

  • Knowledge of containerization (Docker, Kubernetes).

  • Familiarity with BI tools like Looker.

  • Basic DevOps knowledge.

What We Offer

Just a few of the benefits waiting for you at Verve: 

  • Keep healthy through our Apollo Health Check-ups, and stay covered with our Medical Insurance for you and your family

  • Pick what matters most to you with our INR 4,100/month Personalized Benefits Platform: travel, entertainment, food, fitness, and healthcare

  • Have us cover your office lunches 3 times a week through Zomato, and enjoy office sweets, snacks, and beverages

  • Boost your professional knowledge with our annual training budget & internal webinars, and level up your language skills through our German/English classes

  • Recharge with 19 paid vacation days + 3 Wellness days throughout the year, in addition to the public holidays

  • Strengthen team connections while exploring new cultures through our monthly Work&Travel raffle, offering a chance to work from one of our global offices (after 2 years of employment)

  • … and even more reasons to join us!

Verve provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.



The Company
HQ: New York, New York
1,000 Employees

What We Do

Verve connects advertisers to publishers in emerging channels. We provide AI-driven tools for effective, responsible ad campaigns. Our digital media solutions optimize under-leveraged ad inventory, enhancing outcomes across digital devices.

As the digital world rapidly evolves, promises of privacy and transparency haven’t always kept pace. Advertisers, consumers, and businesses alike have faced challenges in navigating this complex landscape. At Verve, we believe there’s a better way. Our mission is to strengthen the internet, making it safer and more effective for everyone. By improving media through a commitment to privacy, transparency, and responsibility, we aim to build a digital ecosystem that truly serves people better. We’re dedicated to creating an open, privacy-first environment that drives growth and trust for brands, agencies, and publishers worldwide. Better media isn’t just about success in numbers; it’s about achieving it with sustainability, diversity, and integrity at the forefront.

Why Work With Us

  • Growth runs in our veins: learn and develop your skills with us

  • Diversity is what brings us together: 55 nationalities and growing

  • In autonomy we trust: make an impact from day one

  • We live and breathe innovation: bring your revolutionary ideas to life