We'll trust you to:
- Ensure reliability and stability of the enterprise data platform and tools used by internal teams and clients.
- Demonstrate knowledge of the data platform components (warehouse, monitoring, language, etc.) and data engineering (ETL design, schema design, etc.), and perform unit testing.
- Build and enhance data integration, management, and analytics tools and pipelines.
- Document code, deployments, best practices, and improvements.
- Design and build ETL/ELT data systems, including cloud-based APIs, automated data transfer systems, and code management solutions.
- Act as a regular point of contact for data engineering projects, collaborating with the internal consulting team and the data platform and technology teams to complete projects for clients.
- Participate in and contribute ideas to brainstorming, design, and implementation of more complex processes and analyses that solve challenging and abstract problems.
- Develop and implement best-practice guidance and supporting materials for data management and advanced analytics pipelines.
- Create high-quality presentations, deliverables, and code that meet objectives, with project manager guidance.
- Perform other job-related duties as assigned.
You'll need to have:
- At least 5 years’ experience in data engineering using Python, including the use of pandas or PySpark.
- Extensive experience with Databricks (installing packages, understanding and setting cluster configurations, managing jobs, user management and permissions, managing Unity Catalog, Databricks APIs and AI tools, and resolving configuration issues), as well as relational database technologies such as PostgreSQL, Oracle, MySQL, Redshift, and Snowflake.
- Software development fundamentals, including Agile development, version control systems such as Git or Azure DevOps, code reviews, testing, and documentation.
- Experience configuring Azure AD/SAML/Okta/OAuth and applying AWS or Azure security best practices, preferred.
- Experience with WYSIWYG ETL tools (Azure Data Factory, Informatica, SnapLogic, Boomi), preferred.
- Experience with container and orchestration systems such as Docker, Kubernetes, or AWS ECS, preferred.
- At least 6 months of experience in web application development using Flask, Django, JavaScript, Ajax, or CSS/HTML is preferred.
- Candidates holding a recognized data engineering certification are preferred (e.g., Databricks Data Engineer, Google Cloud Professional Data Engineer, Azure Data/Fabric Data Engineer, AWS Certified Data Engineer).
- Proficiency using Microsoft Office products, including Excel, PowerPoint, and Word.
- Life Sciences industry experience, preferred.
- Experience: 5 - 7 years
What We Do
Beghou Consulting provides sales force and marketing consulting services to clients in the pharmaceutical and health care industries. We bring significant expertise in addressing sales and marketing issues and in developing advanced analytic approaches to support our clients' decision-making. We pride ourselves on our growing list of long-term clients, for whom we deliver an increasing array of services and analyses. Our clients are developing and launching innovative, high-profile products, and as such require a partner that provides similarly innovative insights and processes to support their sales force and marketing management.