Senior Data Engineer - GCP

Hyderabad, Telangana
Hybrid
Senior level
Artificial Intelligence • Big Data • Cloud • Information Technology • Machine Learning
The Role
The Senior Data Engineer designs and maintains data pipelines on Google Cloud Platform, focusing on ETL processes, ensuring data quality, and collaborating with cross-functional teams.

Job Overview:
 
We are looking for a skilled and motivated Senior Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Experience Level: 7 to 10 years of relevant IT experience
 

Key Responsibilities:

  • Design, develop, test, and maintain scalable ETL data pipelines using Python (see the pipeline sketch below).
  • Work extensively on Google Cloud Platform (GCP) services such as:
      • Dataflow for real-time and batch data processing
      • Cloud Functions for lightweight serverless compute
      • BigQuery for data warehousing and analytics
      • Cloud Composer for orchestration of data workflows (based on Apache Airflow)
      • Google Cloud Storage (GCS) for managing data at scale
      • IAM for access control and security
      • Cloud Run for containerized applications
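
For illustration only, a minimal sketch of the kind of batch pipeline described above, written with the Apache Beam Python SDK (the programming model Dataflow executes). The bucket, table, and schema names are assumptions, not details from this posting:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line: str) -> dict:
    """Split a raw CSV line into a row dict matching the BigQuery schema below."""
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "event_ts": ts}


def run():
    # On Dataflow this would run with --runner=DataflowRunner; all resource
    # names below are illustrative placeholders.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://example-bucket/events/*.csv", skip_header_lines=1
            )
            | "ParseCSV" >> beam.Map(parse_csv_line)
            | "DropEmptyUsers" >> beam.Filter(lambda row: row["user_id"])  # basic cleansing
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```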

Should have experience in the following areas:

  • API framework: Python FastAPI
  • Processing engine: Apache Spark
  • Messaging and streaming data processing: Kafka
  • Storage: MongoDB, Redis/Bigtable
  • Orchestration: Airflow (see the DAG sketch below)

Additional responsibilities:

  • Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
  • Implement and enforce data quality checks, validation rules, and monitoring.
  • Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
  • Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
  • Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
  • Document pipeline designs, data flow diagrams, and operational support procedures.
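
For illustration, a minimal sketch of the kind of orchestration DAG mentioned above, using the Airflow 2.x API that Cloud Composer runs. The DAG name, schedule, and task bodies are assumptions:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_source_data():
    # Placeholder for a real ingestion step (e.g., landing files in GCS).
    print("ingesting source data")


def run_quality_checks():
    # Placeholder for validation rules (row counts, null checks, etc.).
    print("running data quality checks")


with DAG(
    dag_id="example_etl_pipeline",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_source_data)
    validate = PythonOperator(task_id="validate", python_callable=run_quality_checks)

    ingest >> validate  # quality checks run only after ingestion succeeds
```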

Required Skills:

  • 7–10 years of hands-on experience in Python for backend or data engineering projects.
  • Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
  • Solid understanding of data pipeline architecture, data integration, and transformation techniques.
  • Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
  • Experience in Apache Spark, Kafka, Redis, FastAPI, Airflow, and Cloud Composer DAGs (see the Spark sketch below).
  • Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
  • Experience in data migrations from on-premises data sources to cloud platforms.
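
For illustration, a brief sketch of the kind of Spark-based transformation and validation work listed above, using the PySpark DataFrame API. Paths, column names, and the rejection threshold are assumptions:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-cleansing").getOrCreate()

# Illustrative source: a raw CSV extract landed in GCS.
raw = spark.read.option("header", True).csv("gs://example-bucket/raw/orders/*.csv")

cleaned = (
    raw
    .dropDuplicates(["order_id"])               # dedupe on the business key
    .filter(F.col("order_id").isNotNull())      # basic validation rule
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)

# Simple quality gate: fail the job if too many rows were rejected.
total, kept = raw.count(), cleaned.count()
if total - kept > 0.05 * total:
    raise ValueError(f"rejected {total - kept} of {total} rows; exceeds 5% threshold")

cleaned.write.mode("overwrite").parquet("gs://example-bucket/curated/orders/")
```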

Good to Have (Optional Skills):

  • Experience working with Snowflake cloud data platform.
  • Experience with deployments on GKE and Cloud Run.
  • Hands-on knowledge of Databricks for big data processing and analytics.
  • Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.

Additional Details:

  • Excellent problem-solving and analytical skills.
  • Strong communication skills and ability to collaborate in a team environment.

Education:

  • Bachelor's degree in Computer Science, a related field, or equivalent experience.

Top Skills

Airflow
Spark
BigQuery
Cloud Composer
Cloud Functions
Dataflow
Git
Google Cloud Platform (GCP)
Google Cloud Storage
Kafka
MongoDB
Oracle
Postgres
Python
Redis
SQL Server

The Company
HQ: Naperville, IL
240 Employees
Year Founded: 2000

What We Do

Egen is a data engineering and cloud modernization firm partnering with leading Chicagoland companies to launch, scale, and modernize industry-changing technologies. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts is trusted by top clients in pursuit of the extraordinary.

Our mission is to be an enabler of amazing possibilities for companies looking to use the power of cloud and data. We want to stand shoulder to shoulder with clients, as true technology partners, and make sure they succeed at what they have set out to do. We want to be disruptors, game-changers, and innovators who have played an important part in moving the world forward.
