Data Engineer

3 Locations
Remote
Logistics • Sales
The Role

E2open is the connected supply chain platform that enables the world’s largest companies to transform the way they make, move, and sell goods and services. We connect more than 400,000 partners as one multi-enterprise network. Powered by the network, data, and applications, our SaaS platform anticipates disruptions and opportunities to help companies improve efficiency, reduce waste, and operate sustainably. Our employees around the world are focused on delivering enduring value for our clients.

Job Summary:

E2open seeks a Data Engineer with approximately 3-5 years of experience in building and maintaining scalable data pipelines, architectures, and infrastructure. The ideal candidate will have hands-on experience with Databricks and/or Snowflake, as well as a strong understanding of data governance, regulatory requirements, and global data hosting.


***This is a hybrid role requiring 3 days in the office***

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines, architectures, and infrastructure using Databricks and/or Snowflake
  • Work with large data sets, ensuring data quality, integrity, and compliance with regulatory requirements
  • Collaborate with cross-functional teams, including data science, product, and engineering, to identify and prioritize data requirements
  • Develop and implement data governance policies, procedures, and standards to ensure data quality, security, and compliance
  • Ensure compliance with global data hosting regulatory requirements such as GDPR
  • Optimize data infrastructure for performance, scalability, and reliability
  • Develop and maintain technical documentation for data infrastructure and pipelines
  • Stay current with industry trends, best practices, and emerging technologies in data engineering

Requirements:

  • 5+ years of experience in data engineering, with a focus on building and maintaining scalable data pipelines, architectures, and infrastructure
  • Hands-on experience with Databricks and/or Snowflake
  • Strong understanding of data governance, regulatory requirements, and global data hosting
  • Experience working with large data sets, ensuring data quality, integrity, and compliance
  • Strong programming skills in languages such as Python or Java
  • Experience with data warehousing, ETL/ELT, and data modeling
  • Strong understanding of data security, access controls, and compliance
  • Excellent problem-solving skills, with the ability to work in a fast-paced environment
  • Strong communication and collaboration skills, with the ability to work with cross-functional teams

Nice to Have:

  • Experience with cloud data platforms on AWS, Azure, or GCP
  • Knowledge of data discovery, metadata management, and data cataloging
  • Experience with agile development methodologies and version control systems, such as Git
  • Certification in data engineering, data governance, or related fields



E2open is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics.

E2open participates in the E-Verify program in certain locations, as required by law.

E2open does not accept unsolicited referrals or resumes from any source other than directly from candidates or preferred vendors. We will not consider unsolicited referrals.


The Company
HQ: Austin, TX
31,304 Employees
On-site Workplace

What We Do

At E2open, we’re creating a more connected, intelligent supply chain. It starts with sensing and responding to real-time demand, supply and delivery constraints. Bringing together data from customers, distribution channels, suppliers, contract manufacturers and logistics partners, our collaborative and agile supply chain platform enables companies to use data in real-time, with artificial intelligence and machine learning to drive smarter decisions. All this complex information is delivered in a single view that encompasses your demand, supply and logistics ecosystems. E2open is changing everything. Demand. Supply. Delivered.
