About the job:
- Design, develop, and support GCP data pipelines to extract, load, and transform data
- Maintain a holistic view of information assets by creating and maintaining artifacts that illustrate how information is stored, processed, and supported (i.e., documentation)
- Work with the project team to estimate and plan the work effort
- Attend daily team stand-ups
- Work with business analysts and project management to review business requirements and produce technical design specs that meet those requirements
About you:
- 3+ years of hands-on experience as a Data Engineer or Data Architect
- Proven track record leading technical projects and teams
- Expert proficiency in Google Cloud Platform (GCP) tools, including: Google Cloud Storage (GCS), BigQuery, Cloud Composer/Airflow, Dataproc/Spark
- Strong data analysis and problem-solving skills
- Advanced SQL skills, including writing, tuning, and interpreting complex SQL queries
- Experience writing and maintaining Unix/Linux shell scripts
- Solid understanding of Data Build Tool (DBT)
- Experience with CI/CD pipelines
- Machine Learning experience
- Proficiency in developing Python-based ELT data pipelines
- Expertise in optimizing GCP BigQuery SQL queries and scripts
What We Do
Egen is a data engineering and cloud modernization firm partnering with leading Chicagoland companies to launch, scale, and modernize industry-changing technologies. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts is trusted by top clients in pursuit of the extraordinary.
Our mission is to be an enabler of amazing possibilities for companies looking to use the power of cloud and data. We want to stand shoulder to shoulder with clients, as true technology partners, and make sure they succeed at what they have set out to do. We want to be disruptors, game-changers, and innovators who have played an important part in moving the world forward.