Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results.
As an Egen Data Engineer, you will be responsible for building and implementing distributed ETL/ELT data pipelines to enable the processing of huge data sets, solving ingestion and data modeling challenges at scale, and continuously improving data processing turnaround and usability.
The ideal candidate is very resourceful and has strong production experience working with Python, complex SQL procedures, relational and NoSQL databases, distributed data warehouses, data mapping, data transformations, and data integration.
Responsibilities:
- Design, build, and improve scalable, resilient ETL pipelines and integrate them with cloud-native data warehouses (Google Cloud) and relational or NoSQL databases.
- Follow and manage best practices and standards for data quality, scalability, reliability, and reusability.
- Debug production issues across data platform services.
- Partner with the business, product, and data science teams to automate processes to improve data sets for analytical and reporting needs.
- Write test cases, perform QA, and participate with stakeholders in UAT.
- Automate...automate...automate.
Required Experience:
- Bachelor’s degree in a relevant field
- Strong professional knowledge of, and experience building, event-driven and/or batch-processed ETL data pipelines
- Enterprise data migration and multi-source ingestion experience
- Proficiency in Python and complex SQL procedures, and familiarity with distributed data warehouse architecture and processing
Nice to haves (but not required):
- Familiarity with event-driven architecture using Kafka, Airflow, and/or Spark.
- Experience with Docker, Kubernetes.
- Experience developing and deploying CI/CD pipelines for data engineering.
What We Do
Egen is a data engineering and cloud modernization firm partnering with leading Chicagoland companies to launch, scale, and modernize industry-changing technologies. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts is trusted by top clients in pursuit of the extraordinary.
Our mission is to be an enabler of amazing possibilities for companies looking to use the power of cloud and data. We want to stand shoulder to shoulder with clients, as true technology partners, and make sure they succeed at what they have set out to do. We want to be disruptors, game-changers, and innovators who have played an important part in moving the world forward.