Data Engineer

Reposted 12 Days Ago
Hiring Remotely in Brazil
Senior level
Information Technology • Consulting
The Role
The Data Engineer role involves creating and managing data pipelines, optimizing data infrastructure, and automating data processes using tools like Snowflake, dbt, and Terraform.

This role has a specialized focus on building and maintaining robust, scalable, and automated data pipelines, and plays a key role in optimizing our data infrastructure and enabling efficient data delivery across the organization. As the organization enhances its cloud data platform (Snowflake or a similar platform), this role will be instrumental in implementing and managing CI/CD processes, infrastructure as code (Terraform), and data transformation workflows (dbt).
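To make the "dbt in CI/CD" workflow concrete, here is a minimal sketch of how a CI runner might invoke dbt for a pull-request build. This is an illustration, not part of the posting; the target name `ci` and the `state:modified+` selector are assumptions, though `dbt build`, `--target`, `--select`, and `--fail-fast` are real dbt CLI options.

```python
# Hypothetical sketch: construct the dbt CLI invocation a CI runner might execute.
def build_dbt_ci_command(target: str, select: str) -> list[str]:
    """Build the argument list for a dbt CI run (models + tests, stop on first failure)."""
    return ["dbt", "build", "--target", target, "--select", select, "--fail-fast"]

# Example: a pull-request pipeline building only models changed relative to production,
# using dbt's state comparison ("slim CI") pattern.
cmd = build_dbt_ci_command(target="ci", select="state:modified+")
# subprocess.run(cmd, check=True)  # executed on the CI runner, not here
```

In practice the command would be run by the CI system (e.g., a GitHub Actions or GitLab CI job) against an isolated CI schema.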

Job Responsibilities:

  • Design, build, and maintain scalable and resilient CI/CD pipelines for data applications and infrastructure, with a focus on Snowflake, dbt, and related data tools.
  • Implement and manage Snowflake dbt projects for data transformation, including developing dbt models, tests, and documentation, and integrating dbt into CI/CD workflows.
  • Develop and manage infrastructure as code (IaC) using Terraform to provision and configure cloud resources for data storage, processing, and analytics on GCP.
  • Automate the deployment, monitoring, and management of Snowflake data warehouse environments, ensuring optimal performance, security, and cost-effectiveness.
  • Collaborate with data engineers and data scientists to understand their requirements and provide robust, automated solutions for data ingestion, processing, and delivery.
  • Implement and manage monitoring, logging, and alerting systems for data pipelines and infrastructure to ensure high availability and proactive issue resolution.
  • Develop and maintain robust automation scripts and tools, primarily using Python, to streamline operational tasks, manage data pipelines, and improve efficiency; Bash scripting for system-level tasks is also required.
  • Ensure security best practices are implemented and maintained across the data infrastructure and pipelines.
  • Troubleshoot and resolve issues related to data infrastructure, pipelines, and deployments in a timely manner.
  • Participate in code reviews for infrastructure code, dbt models, and automation scripts.
  • Document system architectures, configurations, and operational procedures.
  • Stay current with emerging DevOps technologies, data engineering tools, and cloud best practices, particularly related to Snowflake, dbt, and Terraform.
  • Optimize data pipelines for performance, scalability, and cost.
  • Support and contribute to data governance and data quality initiatives from an operational perspective.
  • Help implement AI features.
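The monitoring and alerting responsibility above often starts with dbt's own artifacts. As a hedged sketch (not from the posting), the helper below scans a `run_results.json`-shaped dict for failed nodes; only the high-level shape of that artifact (a top-level `results` list with `unique_id` and `status` fields) is assumed here.

```python
def failed_nodes(run_results: dict) -> list[str]:
    """Return the unique_ids of dbt nodes that did not succeed.

    Assumes the rough shape of dbt's run_results.json artifact:
    {"results": [{"unique_id": ..., "status": ...}, ...]}.
    Models report "success" and tests report "pass" when healthy.
    """
    return [
        r["unique_id"]
        for r in run_results.get("results", [])
        if r.get("status") not in ("success", "pass")
    ]

# Trimmed-down sample mirroring the artifact's shape (illustrative, not real output):
sample = {
    "results": [
        {"unique_id": "model.analytics.orders", "status": "success"},
        {"unique_id": "test.analytics.not_null_orders_id", "status": "fail"},
    ]
}
```

A real alerting hook would load the artifact from the dbt target directory after each run and page or post to a channel when this list is non-empty.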

Requirements
  • Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent technical experience.
  • 5+ years of hands-on experience in a DevOps, SRE, or infrastructure engineering role.
  • 3+ years of experience specifically focused on automating and managing data infrastructure and pipelines.
  • 1+ years of experience enabling AI features.

Others:

  • Strong, demonstrable experience with Infrastructure as Code tools, particularly Terraform.
  • Strong background in DevOps principles and practices, and hands-on experience in building business intelligence solutions.
  • Strong automation and problem-solving skills, with proficiency in cloud technologies.
  • Ability to collaborate effectively with data engineers, analysts, and other stakeholders to ensure the reliability and performance of our data ecosystem.
  • Proven experience with dbt for data transformation, including developing models, tests, and managing dbt projects in a production environment.
  • Hands-on experience managing and optimizing Snowflake data warehouse environments.
  • Demonstrable experience with data modeling techniques for ODS, dimensional modeling (Facts, Dimensions), and semantic models for analytics and BI.
  • Strong proficiency in Python for automation, scripting, and data-related tasks; experience with relevant Python libraries is a plus. Strong Bash scripting skills are also expected.
  • Solid understanding of CI/CD principles and tools (e.g., Bitbucket Runners, Jenkins, GitLab CI, GitHub Actions, Azure DevOps).
  • Experience with cloud platforms (GCP preferred, AWS, or Azure) and their data services.
  • Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus.
  • Knowledge of data integration tools and ETL/ELT concepts.
  • Familiarity with monitoring and logging tools.
  • Strong SQL skills.
  • Ability to work independently and as part of a collaborative team in an agile environment.
  • Strong communication skills, with the ability to explain complex technical concepts clearly.
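The dimensional-modeling requirement above usually involves generating deterministic surrogate keys for dimension rows. As an illustration only, the sketch below mirrors the common hash-the-natural-key pattern (as in dbt_utils' `generate_surrogate_key`); the separator and null sentinel chosen here are assumptions, not a standard.

```python
import hashlib

def surrogate_key(*parts: "str | None") -> str:
    """Deterministic surrogate key for a dimension row.

    Hashes the concatenated natural-key columns so the same business key
    always maps to the same warehouse key. The "|" separator and the
    "__null__" placeholder for missing values are illustrative choices.
    """
    joined = "|".join("__null__" if p is None else str(p) for p in parts)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()
```

Because the key is a pure function of the inputs, fact tables loaded in separate pipeline runs can join to the dimension without coordinating sequence values.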

Benefits
  • Fully remote.
  • Flexible hours: you decide your work schedule.
  • Market-competitive compensation (in USD).
  • Exceptional learning and growth opportunities.

Top Skills

AWS
Azure
Bash
CI/CD
dbt
Docker
GCP
Kubernetes
Python
Snowflake
SQL
Terraform

The Company
San Mateo, California
175 Employees
Year Founded: 2020

What We Do

At Remotebase, we are on a mission to bring together great ideas and great people, transcending physical borders. As the world embraces remote work, we stand at the forefront of this transformative shift, empowering exceptional companies to collaborate with top talent on a global scale.

Our purpose is clear: to help organizations build the best remote engineering teams, comprising the top 1% of talent, within 24 hours. We specialize in partnering with forward-thinking tech startups that are actively shaping the world we inhabit. By deeply understanding their ideas, needs, and aspirations, we create innovative technology solutions that drive their success.

At the heart of our approach is the relentless pursuit of excellence. We meticulously source the finest tech talent, curating passionate teams eager to tackle complex challenges head-on. Whether you require expertise in Software Engineering, UX Design, Product Management, Data Engineering, Data Analysis, Data Science, AI, QA, or DevOps, we have you covered.

What sets us apart is our unwavering commitment to delivering exceptional results swiftly. We understand the urgency inherent in today's fast-paced business landscape and are dedicated to providing you with a highly skilled, motivated team from day one. With Remotebase, your organization can hit the ground running, accelerate growth, and achieve remarkable outcomes.

Join us as we revolutionize the way companies thrive in a borderless world. Let's create a future where remarkable ideas and exceptional talent know no bounds.
