Big Data Engineer (Airflow/Snowflake)

Reposted 6 Days Ago
Hiring Remotely in Ciudad de México, Cuauhtémoc, Ciudad de México
In-Office or Remote
Senior level
Information Technology • Consulting
The Role
Design and maintain ELT pipelines and integrations using S3, Apache Airflow, and Snowflake in AWS. Develop Python code and optimize SQL queries.
Summary Generated by Built In

Our client is a rapidly growing, automation-led service provider specializing in IT, business process outsourcing (BPO), and consulting services. With a strong focus on digital transformation, cloud solutions, and AI-driven automation, they help businesses optimize operations and enhance customer experiences. Backed by a global workforce of over 32,000 employees, our client fosters a culture of innovation, collaboration, and continuous learning, making it an exciting environment for professionals looking to advance their careers.

Committed to excellence, our client serves 31 Fortune 500 companies across industries such as financial services, healthcare, and manufacturing. Their approach is driven by the Automate Everything, Cloudify Everything, and Transform Customer Experiences strategy, ensuring they stay ahead in an evolving digital landscape. 

As a company that values growth and professional development, our client offers global career opportunities, a dynamic work environment, and exposure to high-impact projects. With 54 offices worldwide and a presence in 39 delivery centers across 28 countries, employees benefit from an international network of expertise and innovation. Their commitment to a 'customer success, first and always' philosophy ensures a rewarding and forward-thinking workplace for driven professionals.

We are currently searching for a Big Data Engineer (Airflow/Snowflake):

Responsibilities:

  • Design, build, and maintain ELT pipelines, orchestration, and integrations using S3, Apache Airflow, dbt Cloud, and Snowflake within an AWS cloud environment.
  • Develop modular Python code for reusable packages, scripting, and automation.
  • Implement and optimize SQL queries and Snowflake data models for performance and scalability.
  • Deploy container-based services in AWS with proper monitoring setup.
  • Integrate and process data from REST APIs to support analytical and operational needs.
  • Ensure data quality and governance, and implement cloud-native troubleshooting practices.
  • Collaborate with cross-functional teams to define and execute data engineering best practices.

Requirements:

  • 5+ years of experience in data engineering roles.
  • Proficient in Python for scripting, automation, and API interactions.
  • Strong SQL skills and Snowflake expertise, including performance tuning and data modeling.
  • Hands-on experience with Apache Airflow for workflow orchestration and monitoring.
  • Experience with dbt for modular, version-controlled data transformations.
  • Solid background with AWS services such as S3, Lambda, IAM, and CloudWatch.
  • Strong understanding of data quality, governance, and troubleshooting in cloud environments.

Languages:

  • Advanced Oral English.
  • Native Spanish.

Note:

  • Fully remote.

If you meet these qualifications and are pursuing new challenges, start your application on our website to join an award-winning employer. Explore all our job openings on the Sequoia Careers page: https://www.sequoia-connect.com/careers/

Top Skills

Apache Airflow
AWS CloudWatch
AWS IAM
AWS Lambda
AWS S3
dbt
Python
Snowflake
SQL
The Company
Austin, Texas
30 Employees
Year Founded: 2017

What We Do

From Technologist to Technologist.

We are the catalysts of digital evolution. Our core expertise lies in connecting Top Technologists with Top Companies through unparalleled IT headhunting solutions.

Our international expertise helps businesses of all sizes innovate and succeed.

Our clients are global companies with over 100k employees serving 700+ clients in 50+ countries. Rooted in a profound grasp of technology, our premier IT Headhunting Services drive transformative growth for Fortune 500 corporations. We maintain the highest standards of technological innovation to meet global demands.
