Data Engineer

Posted 5 Days Ago
Hiring Remotely in United States
Remote
3-5 Years Experience
Artificial Intelligence • Big Data • Logistics • Machine Learning
We keep the world moving by keeping risk out of the way.
The Role

Everstream Analytics risk scores and predictive insights set the world’s supply chain standard, helping global companies turn supply chains into business-changing, market-shifting, competition-crushing assets. By removing the blinders of traditional data, we offer more complete information, sharper analysis, and more accurate predictions. Companies like Google, Schneider Electric, Unilever, and Campbell’s rely on Everstream Analytics to push their supply chains to be faster, smarter, safer, and more sustainable!

What Matters Most to Everstreamers

Doing our best, no matter what challenges lie in front of us. We’re sharp, focused, determined, and as a team, we’re unstoppable. Of course, we have values like “integrity” and “honesty”—that’s a given—but our core values run deeper:

Audacity | We are bold. We break through the status quo and do what others haven’t, can’t, or won’t

Grit | We get the job done and keep going, so our customers can do the same

Optimism | We have a can-do attitude, and instead of saying “no”, we figure out how

Virtue | We do what’s right, the right way—especially when it’s difficult

Solidarity | When we celebrate each other and our differences, we all do better

 

Job Description:

As a Data Engineer at Everstream Analytics, you will play a critical role in building and maintaining our data infrastructure. You will work with a team of talented engineers to design, develop, and optimize data pipelines and data products that support our multi-tenant cloud-native data platform, leveraging AWS services such as Lambda, EMR, S3, Glue, and Redshift, as well as helping drive our future toolset. Your expertise in distributed system design, data warehousing, data lakes, and ETL/orchestration is essential to ensuring the scalability, reliability, and efficiency of our data infrastructure.

  

Key Responsibilities: 

  • Design, implement, and maintain data pipelines that handle large volumes of data from various sources, ensuring data quality, integrity, and availability. 
  • AWS Expertise: Utilize AWS services like Lambda, EMR, S3, Glue, and others to create scalable and cost-effective data solutions.
  • Relational Database Experience: Utilize PostgreSQL on RDS or similar database technologies, where applicable.
  • Stream Processing: Use Apache Kafka, Apache Spark, or similar technologies for real-time data processing and stream analytics.
  • Python Development: Primarily use Python for data engineering tasks, data transformation, and ETL processes.
  • Data Warehousing: Implement and manage data warehousing and/or data lake solutions for efficient data storage and retrieval to support engineering, data science, applications, and groups across our organization.
  • Collaboration: Work closely with Product Management, Data Science, and the leadership team to understand data requirements and deliver data solutions that meet business needs.
  • Monitoring and Optimization: Continuously monitor the performance of data pipelines to optimize scalability and efficiency.
  • Documentation: Maintain comprehensive documentation for data engineering processes, ensuring knowledge transfer within the team.
  • Leadership: Lead by example within the data engineering team, taking pride in your team’s deliverables, and performing as technical lead for a scrum team or on various projects, where applicable. 

  

Qualifications: 

  • Proven experience in designing and building multi-tenant cloud-native data platforms in a SaaS or PaaS environment. 
  • Strong experience with Cloud Data Warehouses such as AWS Redshift, Snowflake, BigQuery, Databricks.
  • Extensive experience with relational database technologies in a production environment, specifically PostgreSQL.
  • Strong expertise in AWS services and ETL/orchestration (Glue, Spark, Airflow, Apache SeaTunnel).
  • Proficiency in distributed system design, data warehousing, data lakes, and stream processing using Spark or similar.
  • Strong programming skills in Python.
  • Excellent problem-solving and troubleshooting skills.
  • Ability to work collaboratively with cross-functional teams and convey complex technical concepts to non-technical stakeholders.
  • Bachelor's or Master's degree in Computer Science, Data Engineering, a related field, or equivalent experience.


100% Remote Position

Applicants must be currently authorized to work in the United States on a full-time basis.

#LI-AB1

Thanks to our remarkable people, we are at the forefront of change, bringing cutting-edge products and services to market. We focus on growth, so our people, our business, and our customers can achieve their full potential. It takes determination, focus, and resilience to scale a high-growth, global business. We're looking for people intrinsically driven to create, build, solve, and push boundaries to deliver the unrivaled innovation and service our clients know and love. Everstreamers aren't afraid of ambiguity, changing priorities, shifting org structures, or pivoting to new strategies. They thrive on change and put in the effort to achieve the seemingly impossible. It isn't always easy, but it's always worth it. Does this sound like you? Grow your career at Everstream.

Top Skills

Python
The Company
HQ: San Marcos, California
170 Employees
On-site Workplace

What We Do

Everstream Analytics sets the global supply chain standard. Through the application of artificial intelligence and predictive analytics to its vast proprietary dataset, Everstream delivers the predictive insights and risk analytics businesses need for a smarter, more autonomous and sustainable supply chain. Everstream’s proven solution integrates with procurement, logistics and business continuity platforms generating the complete information, sharper analysis, and accurate predictions required to turn the supply chain into a business asset. To learn more, visit www.everstream.ai.

Why Work With Us

We’re sharp, focused, determined, and as a team, we’re unstoppable. We use the latest innovations, data science, and machine learning to empower companies in all industries to build better supply chains, improve people’s lives, and change the world.

