Data Integration Engineer

Posted 3 Days Ago
2 Locations
Remote
Mid level
Artificial Intelligence • Cloud • Analytics • Automation
The Role
The Data Integration Engineer is responsible for designing, maintaining, and optimizing data integration processes across various IT systems, ensuring data quality and alignment with business needs.

This is a fully remote position

This role focuses on creating, maintaining, transforming, and decommissioning various systems that span a wide hybrid IT landscape. It is vital that data flows smoothly, accurately, and completely, both to maintain overall performance and to meet internal and security project requirements. The engineer will also optimize (or replace) existing data workflows and pipelines that multiple internal stakeholders rely on.
The IT landscape spans on-premises systems, SaaS applications, and multiple cloud providers (primarily AWS), all integrated into business processes that need to be modernized or re-integrated into modern systems and processes. The goal is to increase the resiliency, data quality, observability, and completeness of these integrations to support the needs of the organization.
While this role relies on automation and integration between systems, the right candidate will work with internal stakeholders to ensure today's requirements are met with tomorrow's integrations in mind. This person should be able to gather requirements and own the end-to-end delivery of a solution. They should also be able to see the larger data-pipeline picture across several integrations and communicate effectively with those involved about possible solutions, as well as the eventual pitfalls of particular technical decisions, understanding that some changes may depend on future product enhancements and their implementation priority within a development cycle.

Responsibilities:

  • Design, develop, and maintain data integration processes using ETL tools and frameworks.
  • Collaborate with cross-functional teams to establish data integration requirements and ensure that data flows align with business objectives.
  • Implement data transformation, cleansing, and enrichment processes to enhance data quality and usability (a brief illustrative sketch follows this list).
  • Monitor and troubleshoot data integration workflows to identify and resolve issues promptly.
  • Document data integration processes, data flows, and operational procedures for reference and future improvements.
  • Evaluate new data integration technologies and tools to improve existing processes and ensure scalability.
  • Provide technical support and training to team members on data integration best practices.
  • Work closely with data engineers to optimize data storage and retrieval strategies.
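
As an illustration of the transformation and cleansing work described above, here is a minimal sketch, in Python with only the standard library, of the general shape of an extract-transform-load step. The file name, table, columns, and cleansing rules are hypothetical stand-ins rather than a description of any actual PartnerOne pipeline, and SQLite stands in for whatever target system a real integration would load.

    import csv
    import sqlite3

    # Hypothetical source extract and target store; real pipelines would pull
    # from on-prem systems, SaaS APIs, or cloud storage and land in a warehouse.
    SOURCE_CSV = "accounts_export.csv"
    TARGET_DB = "integration_target.db"

    def extract(path):
        """Read raw rows from the source extract."""
        with open(path, newline="", encoding="utf-8") as f:
            yield from csv.DictReader(f)

    def transform(rows):
        """Cleanse and normalize rows; drop records missing required fields."""
        for row in rows:
            account_id = (row.get("account_id") or "").strip()
            email = (row.get("email") or "").strip().lower()
            if not account_id or not email:
                continue  # incomplete record: skip (or route to a reject file)
            yield (account_id, email, (row.get("region") or "UNKNOWN").upper())

    def load(records, db_path):
        """Load cleansed records into the target store idempotently."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS accounts "
                "(account_id TEXT PRIMARY KEY, email TEXT, region TEXT)"
            )
            conn.executemany(
                "INSERT OR REPLACE INTO accounts VALUES (?, ?, ?)", records
            )

    if __name__ == "__main__":
        load(transform(extract(SOURCE_CSV)), TARGET_DB)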

Requirements

The ideal candidate will be able to write, edit, and maintain code in Python, Bash, PowerShell, and similar languages, and to use DevOps tooling in CI/CD workflows with code checked into source control systems; a brief illustrative sketch of the kind of automated check such a pipeline might run appears after the list below. Experience with Linux, Windows, containers, and cloud infrastructure is required. The ability to integrate, develop, and troubleshoot Splunk (and Splunk apps, which are generally Python-based) is highly desired.

  • Proven experience as a Data Integration Engineer, ETL Developer, or similar role.
  • Strong proficiency in SQL and experience with database systems (e.g., MySQL, PostgreSQL, SQL Server).
  • Familiarity with ETL tools such as Talend, Informatica, Apache NiFi, or similar.
  • Experience with data modeling and design, data warehousing concepts, and data governance.
  • Strong programming skills in languages such as Python, Java, or similar.
  • Excellent problem-solving skills and the ability to work with complex data structures.
  • Strong communication and collaboration skills, with experience working in cross-functional teams.
  • Ability to work independently and manage multiple tasks in a remote environment.
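
As referenced above the list, the following is a minimal sketch, again in Python with only the standard library, of the kind of automated reconciliation check a CI/CD or scheduled job might run against an integration: it compares row counts between a source snapshot and the integrated target and fails loudly on drift. The database paths and table name are hypothetical, and SQLite again stands in for the real systems involved.

    import sqlite3
    import sys

    # Hypothetical paths and table; a real check would point at the actual
    # source and target systems (MySQL, PostgreSQL, SQL Server, etc.).
    SOURCE_DB = "source_snapshot.db"
    TARGET_DB = "integration_target.db"
    TABLE = "accounts"

    def row_count(db_path, table):
        """Count rows in a table; `table` is a trusted constant here."""
        with sqlite3.connect(db_path) as conn:
            return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    def main():
        src = row_count(SOURCE_DB, TABLE)
        tgt = row_count(TARGET_DB, TABLE)
        print(f"source={src} target={tgt}")
        if src != tgt:
            # A non-zero exit lets a pipeline stage fail and alert stakeholders.
            sys.exit(1)

    if __name__ == "__main__":
        main()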

Benefits

  • Competitive salary.
  • Opportunity to work in a dynamic and innovative environment.
  • Professional development and growth opportunities.
  • Flexible work schedule and remote work options.

Top Skills

Python, Bash, PowerShell, DevOps, CI/CD, Linux, Windows, Containers, Cloud Infrastructure, SQL, MySQL, PostgreSQL, SQL Server, Talend, Informatica, Apache NiFi
The Company
Montreal, Quebec
62 Employees

What We Do

PartnerOne is an enterprise software company that manages the world’s largest data environments through virtualized cloud storage, hyper-automation, artificial intelligence, and metadata analytics. Unlike other software companies, we play a mission-critical role in not just one but many aspects of the enterprise Big Data cycle.

Over 1250 of the world’s largest data environments rely on our software for their most critical needs and to safeguard their most valuable data.
