Senior Data Engineer - Python and Snowflake (1099 contract, Remote, must be based in US)
The Role
Design and build ETL/ELT data pipelines using Python and Snowflake, manage data workflows, and support analytics through Power BI dashboards.
About Us:
At CompassX, our clients rely on us to lead high-priority strategic initiatives and transformational projects. Our mission is to create a community of people who come up with innovative approaches and deliver the best outcomes for our clients.
You will have the opportunity to leverage your experience, creativity, and skills to make an impact for our clients and to influence the trajectory of our firm, driving growth for the team and for your career.
We are honored to be recognized as a “Best Place to Work” in Southern California and to be included on the Inc. 5000 list of the fastest-growing private companies in the U.S.
We’re looking for a Senior Data Engineer (Python and Snowflake) to support one of our life sciences clients. This role will focus on designing and building scalable data pipelines, integrating data into Snowflake, and enabling downstream analytics and reporting in Power BI.
The client environment is still maturing, so your ability to shape structure, define logic, and deliver value will be key.
What you'll do
- Design, build, and maintain ETL/ELT pipelines using Python, integrating data from APIs, flat files, and relational systems into Snowflake
- Develop and optimize data models and transformations (dbt) to support reporting and analytical use cases
- Implement data validation, testing, and quality checks to ensure accuracy and reliability across datasets
- Manage data workflows, orchestration, and automation using modern tools and practices (e.g., Airflow, GitHub Actions)
- Support downstream users and analysts by preparing clean, well-structured datasets for Power BI dashboards and reports
- Contribute to the development and management of containerized environments using Docker and Linux
- Collaborate with BI developers, analysts, and business stakeholders to deliver end-to-end data solutions
- Help define and promote data engineering best practices, frameworks, and standards within a growing data environment
What we're looking for
- 7–10 years of data engineering experience across the full data lifecycle
- Strong programming experience in Python, including data libraries such as Pandas, PySpark, or SQLAlchemy
- Advanced SQL skills and hands-on experience developing transformations using dbt
- Experience with Snowflake or similar cloud data platforms (e.g., Redshift, BigQuery)
- Working knowledge of Linux, Docker, and GitHub Actions for environment management and CI/CD automation
- Understanding of data architecture concepts, including modeling, lineage, and orchestration
- Exposure to Power BI and experience supporting analytics or BI teams
- Comfortable working in a fast-paced and collaborative environment
Top Skills
Airflow
dbt
Docker
ELT
ETL
GitHub Actions
Linux
Pandas
Power BI
PySpark
Python
Snowflake
SQL
SQLAlchemy
The Company
What We Do
CompassX is a local-model consulting firm headquartered in Southern California. Founded in 2009, we focus on Project Leadership and the flawless execution of our clients’ projects and strategic initiatives. Our team members have direct access to a leadership team that truly cares and listens, and the ability to customize their consulting career journey.
You will have the opportunity to use your experience, creativity, and passion to leave your mark as we build the next great consulting firm!
Join us and give yourself a chance to love consulting again.