Senior Data Engineer (English Required)

4 Locations
In-Office or Remote
Senior level
Information Technology • Software
The Role
Design, build, and optimize data pipelines for applications, ensuring scalability, performance, and security while collaborating with data scientists and software engineers.

Work at DaCodes!

We are a team of experts in software development and high-impact digital transformation.

For over 10 years, we've created technology- and innovation-driven solutions thanks to our team of 220+ talented #DaCoders, including developers, architects, UX/UI designers, PMs, QA testers, and more. Our teams integrate into client projects across LATAM and the United States, delivering outstanding results.

At DaCodes, you'll accelerate your professional growth by collaborating on diverse projects across various industries and sectors.

Working with us will make you versatile and agile, giving you the opportunity to work with cutting-edge technologies and collaborate with top-level professionals.

Our DaCoders play a crucial role in the success of our business and that of our clients. You'll become an expert contributor to our projects while gaining access to disruptive startups and global brands. Does this sound interesting to you?

We’re looking for talent to join our team—let’s work together!

The ideal candidate brings a unique mix of technical experience, curiosity, a logical and analytical mindset, proactivity, ownership, and a passion for teamwork.


Requirements

We are looking for a Senior Data Engineer to join our team and help design, build, and optimize data pipelines for large-scale applications. The ideal candidate has strong experience in data architecture, ETL/ELT processes, cloud platforms, and distributed systems.

This role requires expertise in handling big data, real-time processing, and data lakes while ensuring scalability, performance, and security. The candidate should be comfortable working in a fast-paced, agile environment and collaborating with data scientists, analysts, and software engineers to deliver high-quality data solutions.

Required Qualifications

🔹 5+ years of experience in data engineering, data architecture, or backend development.
🔹 Strong expertise in SQL and NoSQL databases (PostgreSQL, MySQL, MongoDB, DynamoDB, etc.).
🔹 Cloud expertise with AWS (preferred), GCP, or Azure.
🔹 Proficiency in Python, Java, or Scala for data processing and pipeline development.
🔹 Experience with big data frameworks like Apache Spark, Hadoop, or Flink.
🔹 Hands-on experience with ETL/ELT processes and data pipeline orchestration tools (Apache Airflow, dbt, Luigi, or Prefect); a short orchestration sketch follows this list.
🔹 Experience with message queues and streaming technologies (Kafka, Kinesis, Pub/Sub, or RabbitMQ).
🔹 Knowledge of containerization and orchestration tools (Docker, Kubernetes).
🔹 Strong problem-solving skills and the ability to optimize performance and scalability.
🔹 English proficiency (B2 or higher) to collaborate with international teams.
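
For illustration only, not an additional requirement: the sketch below shows the kind of pipeline orchestration work this role involves, written against the Airflow 2.x TaskFlow API (the schedule argument assumes Airflow 2.4+). The DAG, task, and field names are hypothetical placeholders, not a description of DaCodes' actual pipelines.

```python
# Minimal, hypothetical ETL DAG sketch using Airflow's TaskFlow API.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["illustration"])
def orders_daily_etl():
    @task
    def extract():
        # Placeholder extract step; a real pipeline would pull from an API or source database.
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

    @task
    def transform(rows):
        # Placeholder transform step: drop invalid rows and round amounts.
        return [{**r, "amount": round(r["amount"], 2)} for r in rows if r["amount"] > 0]

    @task
    def load(rows):
        # Placeholder load step; a real pipeline would write to a warehouse or data lake.
        print(f"Loaded {len(rows)} rows")

    load(transform(extract()))


orders_daily_etl()
```

In a production pipeline, the extract and load steps would talk to real sources and targets (APIs, Kafka, object storage, a warehouse) and pass references rather than raw rows between tasks.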

Nice-to-Have Skills (Preferred)

✅ Experience with data lakehouse architectures (Delta Lake, Iceberg, Hudi).
✅ Familiarity with Machine Learning (ML) and AI-related data workflows.
✅ Experience with Infrastructure as Code (Terraform, CloudFormation) for managing data environments.
✅ Knowledge of data security and compliance regulations (GDPR, CCPA, HIPAA).

Key Responsibilities

Design, develop, and maintain scalable and efficient data pipelines for batch and real-time processing.
Build and optimize data lakes, warehouses, and analytics solutions on cloud platforms (AWS, GCP, or Azure).
Implement ETL/ELT workflows using tools such as Apache Airflow, dbt, or Prefect.
Ensure data integrity, consistency, and governance through proper architecture and best practices.
Integrate data from various sources (structured and unstructured), including APIs, streaming services, and databases; a streaming ingestion sketch follows this list.
Work with data scientists and analysts to ensure high availability and accessibility of data for analytics and machine learning models.
Monitor, troubleshoot, and improve the performance of data pipelines.
Implement security best practices for data access, encryption, and compliance.
Collaborate with software engineers to integrate data pipelines into applications and services.
Stay up to date with the latest trends in big data, cloud technologies, and data engineering best practices.
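
Purely as an illustration of the real-time side of these responsibilities (not a statement of the team's actual stack): a minimal PySpark Structured Streaming sketch that reads JSON events from a Kafka topic and appends them to a data lake path. The broker address, topic name, schema, and storage paths are hypothetical placeholders, and running it assumes the spark-sql-kafka connector is available on the Spark classpath.

```python
# Hypothetical streaming ingestion sketch: Kafka -> parsed JSON -> data lake (Parquet).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("orders-stream-ingest").getOrCreate()

# Assumed shape of the JSON messages on the topic.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", StringType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                     # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; cast the value column to a string and parse it as JSON.
parsed = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(F.from_json("json", schema).alias("data"))
    .select("data.*")
)

# Append each micro-batch to a lake path; the checkpoint makes the stream restartable.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/lake/orders/")            # placeholder path
    .option("checkpointLocation", "s3a://example-bucket/chk/orders/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```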


Benefits
  • Integration with global brands and disruptive startups.
  • Remote work/Home office. *You will be informed in the first session if a position requires a hybrid or on-site format. Don't worry, most are remote!
  • Work schedule aligned with your assigned team/project. (Client's time zone)
  • Monday to Friday work week.
  • Legal benefits.
  • Official holidays according to your assigned team/project.
  • Vacation days *You can use these days after six months with the company.
  • Day off on your birthday.
  • Major medical insurance.
  • Life insurance.
  • Virtual integration events and interest groups.
  • Meetups with special guests from companies, IT professionals, and prestigious universities.
  • Constant feedback and performance tracking.
  • Access to courses and certifications.
  • Multicultural work teams.
  • English classes.
  • Opportunities across our different business lines.

Proudly certified as a Great Place to Work!

Top Skills

Apache Airflow
Spark
AWS
Azure
dbt
Docker
DynamoDB
Flink
GCP
Hadoop
Java
Kafka
Kinesis
Kubernetes
Luigi
MongoDB
MySQL
NoSQL
Postgres
Prefect
Pub/Sub
Python
RabbitMQ
Scala
SQL

The Company
HQ: Houston, Texas
220 Employees
Year Founded: 2014

What We Do

DaCodes empowers organizations to achieve their full potential through custom software solutions. We believe in the power of technology to drive innovation and growth.

✅ We understand your unique vision.
✅ We engineer for impact.
✅ We're a global team, dedicated to your success.

Our 220+ DaCoders worldwide provide personalized support and technical expertise to amplify your engineering capabilities, enhance workflows, and help you scale your business.

Let's code something great together.
