Responsibilities:
- Lead the end-to-end architecture, design, and implementation of scalable Data Lakehouse solutions on Google Cloud Platform (GCP) using BigQuery, GCS, BigLake, and Dataplex
- Collaborate directly with customers to understand business goals, data challenges, and technical requirements; translate them into robust architectural blueprints and actionable plans
- Design and implement data pipelines supporting both real-time and batch ingestion using modern orchestration and streaming frameworks
- Establish and enforce best practices for data cataloging, metadata management, lineage, and data quality across multiple systems
- Define and implement data security, access control, and governance models in compliance with enterprise and regulatory standards
- Serve as the technical lead for project teams: mentoring engineers, reviewing solutions, and ensuring architectural consistency across deliverables
- Balance strategic architecture discussions with hands-on solutioning, POCs, and deep dives into data pipelines or performance tuning
- Partner with stakeholders, cloud architects, and delivery leads to drive solution adoption, scalability, and long-term maintainability
- Represent the company as a trusted technical advisor in client engagements, clearly articulating trade-offs, best practices, and recommendations
Qualifications:
- 8–10 years of progressive experience in Software Engineering and Data Platform development, with 5+ years architecting data platforms on GCP and/or Databricks
- Proven hands-on experience designing and deploying Data Lakehouse platforms with data products and medallion architectures
- Strong understanding of data ingestion patterns (real-time and batch), ETL/ELT pipeline design, and data orchestration using tools such as Airflow and Pub/Sub
- Expertise in data modeling, storage optimization, partitioning, and performance tuning for large-scale analytical workloads
- Experience implementing data governance, security, and cataloging solutions (Dataplex, Data Catalog, IAM, or equivalent)
- Excellent communication and presentation skills: able to confidently engage with technical and non-technical stakeholders and guide clients through solution decisions
- Demonstrated ability to lead by example in mixed teams of engineers, analysts, and architects, balancing architectural vision with hands-on delivery
- Nice to have: Experience with Databricks (Delta Lake, Unity Catalog) and hybrid GCP-Databricks data architectures
- Strong problem-solving mindset, curiosity to explore new technologies, and ability to “zoom out” for architecture discussions and “zoom in” for code-level troubleshooting
What We Do
Egen is a data engineering and cloud modernization firm partnering with leading Chicagoland companies to launch, scale, and modernize industry-changing technologies. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts is trusted by top clients in pursuit of the extraordinary.
Our mission is to be an enabler of amazing possibilities for companies looking to use the power of cloud and data. We want to stand shoulder to shoulder with clients, as true technology partners, and make sure they succeed at what they have set out to do. We want to be disruptors, game-changers, and innovators who have played an important part in moving the world forward.