Egen
Jobs at Egen
Egen currently lists 7 open jobs.
Recently posted jobs
The Data Solutions Architect will design and develop data solutions using Google Cloud Platform to improve patient care through data-driven innovation. Responsibilities include architecting data pipelines, collaborating with teams to meet project requirements, and maintaining documentation of data assets.
As a Cloud Data Engineer, you will build distributed data pipelines and workflows, manage large data sets, and improve data processing and usability, along with providing technical guidance and support to clients.
Seeking a Site Reliability Engineer to ensure system reliability and support infrastructure, with responsibilities spanning scalability, performance optimization, incident management, and root-cause analysis.
Join a team of data engineers at Egen, a data engineering and cloud modernization firm. Responsibilities include building event-driven data pipelines, solving data integration challenges, and working on cloud-native data warehouse processing. Requires strong skills in Python, SQL, Bash, and experience with distributed data warehousing technologies and Kafka.
The Cloud Software Engineer at Egen is responsible for implementing cloud-based Infrastructure as Code solutions, automating deployment processes, and ensuring secure integrations using Google Cloud Platform services. The role involves managing infrastructure on GCP, AWS, and Azure, deploying containers with Docker and Kubernetes, and maintaining monitoring and CI/CD pipelines. The ideal candidate has 4+ years of experience in cloud infrastructure management and expertise across a range of cloud technologies.
Lead a team of data engineers in designing and creating scalable data platforms and fault-tolerant pipelines for modern analytics and AI services. Responsible for complex data migrations and developing distributed ETL/ELT pipelines with cloud-native data stores.
Join a team of data engineers at Egen, a data engineering and cloud modernization firm, to build scalable data platforms and fault-tolerant pipelines using Python, SQL, Linux, Bash, and cloud services. Responsibilities include leading data engineering teams, designing data pipelines, consulting with partner teams, preparing documentation, and facilitating data migrations. Requires a Bachelor's degree in a related field, experience with event-driven architectures (EDA) leveraging Kafka, data warehousing knowledge, and strong programming skills in SQL, Python, and Bash.