KF - Data Engineer - 0062

Posted 2 Days Ago
Hiring Remotely in Brazil
Remote
Mid level
Software
The Role
Design, develop, and maintain data pipelines using SQL and modern technologies like Azure Databricks and Python to transform and load data into the data warehouse, optimizing performance and ensuring data quality.

We are seeking an experienced and skilled Data Engineer to join our team. As a Data Engineer, your primary responsibility will be to design, develop, and maintain data pipelines and solutions using modern technologies. You should have expertise in SQL with a focus on data warehousing, as well as experience with Azure Databricks, PySpark, Azure Data Factory, and Azure Data Lake. Strong knowledge of data engineering fundamentals and experience working with Parquet/Delta tables in Azure Data Lake are required, as are proficiency in Python programming and the ability to create data pipelines.
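
For illustration only, here is a minimal sketch of the kind of pipeline described above, written in PySpark as it might run in an Azure Databricks notebook. The storage paths, table layout, and column names are hypothetical assumptions, not details taken from this posting.

    from pyspark.sql import SparkSession, functions as F

    # On Databricks a SparkSession already exists; getOrCreate() reuses it.
    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    # Extract: read raw Parquet files landed in Azure Data Lake (hypothetical path).
    raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/sales/orders/")

    # Transform: basic cleansing and a derived column (hypothetical columns).
    curated = (
        raw.filter(F.col("order_id").isNotNull())
           .withColumn("order_total", F.col("quantity") * F.col("unit_price"))
           .withColumn("order_date", F.to_date("order_ts"))
    )

    # Load: write a partitioned Delta table back to the lake for the warehouse layer.
    # Delta Lake support is built into the Databricks runtime.
    (curated.write.format("delta")
            .mode("overwrite")
            .partitionBy("order_date")
            .save("abfss://curated@examplelake.dfs.core.windows.net/sales/orders/"))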

Job Responsibilities

    • Designing and developing data pipelines to extract, transform, and load data from various sources into the data warehouse, leveraging Python and notebooks.
    • Writing complex SQL queries for data extraction and manipulation from the data warehouse.
    • Building and maintaining ETL processes using Azure Databricks with PySpark (a brief sketch follows this list).
    • Implementing data integration workflows using Azure Data Factory.
    • Collaborating with cross-functional teams, including developers, data analysts, and business stakeholders, to understand requirements and deliver high-quality solutions.
    • Optimizing performance of the data pipelines and ensuring scalability and reliability of the systems.
    • Monitoring data quality and troubleshooting issues in collaboration with the operations team.
    • Maintaining documentation of the design and implementation of the data pipelines.
    • Collaborating on best practices in code creation while maintaining communication with the team about new business-logic transformations.
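
As referenced in the list above, here is a minimal sketch of a Databricks notebook step that combines SQL-based transformation with PySpark to load a warehouse table. The schema, table, and column names are hypothetical assumptions.

    from pyspark.sql import SparkSession

    # `spark` is predefined in a Databricks notebook; getOrCreate() reuses it elsewhere.
    spark = SparkSession.builder.getOrCreate()

    # Aggregate curated Delta tables with Spark SQL (hypothetical tables/columns).
    customer_ltv = spark.sql("""
        SELECT c.customer_id,
               c.region,
               SUM(o.order_total) AS lifetime_value,
               MAX(o.order_date)  AS last_order_date
        FROM curated.orders o
        JOIN curated.customers c
          ON o.customer_id = c.customer_id
        GROUP BY c.customer_id, c.region
    """)

    # Persist the result as a managed Delta table for downstream analysts.
    (customer_ltv.write.format("delta")
                 .mode("overwrite")
                 .saveAsTable("warehouse.customer_ltv"))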

Requirements
    • Expertise in SQL, ideally with experience in data warehousing concepts.
    • Strong hands-on experience with Azure Databricks and Spark.
    • Proficiency in designing and implementing data integration workflows using Azure Data Factory.
    • Proficiency in Python programming and the ability to develop scalable data engineering pipelines in Python.
    • Solid understanding of data engineering fundamentals, including data modeling, data transformation, change data capture, and performance optimization techniques (a brief change-data-capture sketch follows this list).
    • Experience working with Azure Data Lake for storing large data sets, maintaining Parquet/Delta tables, and performing efficient querying.
    • Experience with version control systems and familiarity with CI/CD practices.
    • Strong interpersonal skills, with the ability to communicate clearly and voice concerns in a group setting.
    • An initiative-taking, self-reliant approach, with a willingness to learn business logic and work on critical faults. Candidates should be able to understand business requirements independently, without relying on subject matter experts for ongoing explanations.
    • Ability to collaborate effectively in planning and refinement sessions.
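
As referenced in the list above, here is a minimal sketch of the change-data-capture pattern: merging a batch of source changes into a Delta table in Azure Data Lake using the delta-spark API. Paths, keys, and column names are hypothetical assumptions.

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Incoming change records staged by an upstream extract (hypothetical path).
    changes = spark.read.format("delta").load(
        "abfss://staging@examplelake.dfs.core.windows.net/cdc/customers/"
    )

    # Target Delta table maintained in the lake.
    target = DeltaTable.forPath(
        spark, "abfss://curated@examplelake.dfs.core.windows.net/customers/"
    )

    # Upsert: update rows that match on the key, insert the rest.
    (target.alias("t")
           .merge(changes.alias("s"), "t.customer_id = s.customer_id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())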

Top Skills

Azure Data Factory
Azure Data Lake
Azure Databricks
PySpark
Python
SQL

The Company
San Francisco, CA
39 Employees
Year Founded: 2020

What We Do

Experts in crafting digital products ⚡️

At Thaloz, the mission is to support clients at every stage of the digital product journey. With a team of over 100 experts and a global presence in 30 countries, we leverage top-tier Latin American talent to deliver exceptional software development solutions that drive success.

Our Services:
→ Product Lab: Comprehensive product development services to build and scale software solutions. From strategy and design to development, testing, and launch, every aspect is handled with expertise.
→ Talent Hub: Accelerate the team-building process by 50% with carefully vetted LATAM talent. Select your team members, and they will be seamlessly integrated into your projects under your leadership.
→ Enterprise Pod: Optimize operations with streamlined complex integrations and flawless implementations of digital products for B2B companies, ensuring rapid and smooth deployments.

We're ready to assist in turning ideas into reality. Get in touch through www.thaloz.com/contact-us

Join our community! 👨‍💻
Instagram: @thalozteam
YouTube: @thalozteam
Clutch: @thaloz
