At the forefront of health tech innovation, CopilotIQ+Biofourmis is transforming in-home care with the industry's first AI-driven platform that supports individuals through every stage of their health journey, from pre-surgical optimization to acute, post-acute, and chronic care. We are helping people live healthier, longer lives by bringing personalized, proactive care directly into their homes. With CopilotIQ's commitment to enhancing the lives of seniors with chronic conditions and Biofourmis' advanced data-driven insights and virtual care solutions, we're setting a new standard in accessible healthcare. If you're passionate about driving real change in healthcare, join the CopilotIQ+Biofourmis Team!
What is the Data Engineer role?
CopilotIQ is seeking a Data Engineer to join our team as an individual contributor in a remote capacity. In this hands-on role, you will be responsible for designing, developing, and maintaining robust data pipelines, as well as optimizing our data warehouse and data lake infrastructure. You will also support the organization’s business intelligence needs by ensuring the availability, reliability, and quality of data across the ecosystem. We are looking for a highly collaborative and detail-oriented professional with strong problem-solving skills and a passion for building efficient, scalable data solutions that drive impactful insights.
What you’ll be doing:
- Operate and own production data pipelines, with a focus on data quality—reliability, accuracy, timeliness—and rapid incident resolution.
- Design, build, and optimize batch and streaming pipelines in Airflow, landing data in our Redshift-based warehouse.
- Lead hands-on development. Build, test, and deploy high-throughput Airflow workflows and Python/SQL transformations; introduce DataOps practices (CI/CD, versioned schemas, automated data quality tests) to raise the engineering bar.
- Ensure rigorous governance, lineage, and privacy controls across data platforms.
- Write clean, efficient, and high-performance code in Python and SQL for data transformations and automation workflows.
- Partner closely with business stakeholders to support reporting and analytics through BI tools such as Sigma Computing and Superset.
- Optimize relentlessly. Monitor performance, scalability, and infrastructure cost; tune queries, caching, and compression. Introduce new value-creating technologies (e.g., Redshift Serverless, Kinesis, Iceberg).
- Champion best practices in data engineering, including automation, data security, and operational excellence.
- Collaborate cross-functionally with analysts and software engineers to design and implement scalable, production-grade data solutions.
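To give candidates a feel for the "automated data quality tests" mentioned above, here is a minimal, hypothetical sketch (not part of the actual CopilotIQ stack): a Python check that gates a table on row count and null rates before it is published downstream, shown against an in-memory SQLite database standing in for the warehouse.

```python
import sqlite3

def check_data_quality(conn, table, required_columns, min_rows=1, max_null_rate=0.0):
    """Basic quality gates: the table must have at least `min_rows` rows,
    and each required column's null rate must not exceed `max_null_rate`.
    Returns a list of failure messages (empty list means the checks pass)."""
    failures = []
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if total < min_rows:
        failures.append(f"{table}: expected >= {min_rows} rows, found {total}")
        return failures  # null-rate checks are meaningless on an empty table
    for col in required_columns:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()[0]
        rate = nulls / total
        if rate > max_null_rate:
            failures.append(
                f"{table}.{col}: null rate {rate:.1%} exceeds {max_null_rate:.1%}"
            )
    return failures

if __name__ == "__main__":
    # Hypothetical example data; table and column names are illustrative only.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE vitals (patient_id TEXT, reading REAL)")
    conn.executemany("INSERT INTO vitals VALUES (?, ?)",
                     [("p1", 120.0), ("p2", None), ("p3", 118.5)])
    for msg in check_data_quality(conn, "vitals", ["patient_id", "reading"]):
        print(msg)
```

In a production pipeline, a check like this would typically run as an Airflow task that blocks downstream loads on failure; dedicated frameworks such as Great Expectations play a similar role at scale.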
Requirements
- 5+ years of dedicated hands-on experience in designing, building, and operating production data pipelines and warehouses.
- Bachelor’s or Master’s degree in Computer Science.
- Programming & query languages: expert-level Python for data processing/automation and advanced SQL for analytical workloads.
- Core data-engineering stack: orchestration with Airflow (or Prefect); distributed processing with Apache Spark; AWS services (Redshift, Glue, Athena, S3, DMS, Lambda, and SQS); and at least one streaming technology such as Kinesis or Kafka.
- Deep understanding of dimensional and event-driven data modeling, partitioning, and performance tuning at terabyte-to-petabyte scale; comfortable balancing performance, cost, and governance at that scale.
- Foundational knowledge of NoSQL technologies such as MongoDB and DynamoDB.
- Excellent problem-solving skills, strong attention to detail, and the ability to thrive in a fast-paced, dynamic environment.
- A mission-driven, collaborative mindset with strong product thinking and a desire to learn, grow, and make meaningful technical contributions.
- AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Professional certification.
- Experience administering access control, datasets, and embedded dashboards in BI tools such as Sigma Computing and Superset.
- Experience rolling out DataOps practices (CI/CD for data, automated quality tests, lineage/observability tooling).
- Experience leading technical initiatives in a growing team.
What We Do
At CopilotIQ, our mission is clear: we are dedicated to delivering world-class care to seniors facing chronic conditions such as diabetes and hypertension. We firmly believe that every patient should have access to expert healthcare monitoring, consistent data collection, thoughtful support, and personalized care.
Our primary focus is on prevention – averting heart attacks and strokes before they even arise. CopilotIQ is at the forefront of pioneering solutions that proactively address diabetes and hypertension, preventing them from escalating into crises.
We collect an extraordinary 1,000 times more data than traditional healthcare providers. Our AI-powered software anticipates problems, signals nurses for timely intervention, and drives comprehensive treatment plans. By uniting the expertise of consistent remote nursing with cutting-edge AI technology, we provide patients with customized and proactive care.
In our ongoing pursuit of excellence, we're expanding the use of wearables and non-invasive devices to multiply the volume of data points from thousands to millions. CopilotIQ is not merely transforming healthcare; we are empowering seniors to lead healthier lives, free from the looming threat of medical emergencies.
Why Work With Us
At CopilotIQ, our uniqueness lies in fusing advanced AI with personalized clinical care, revolutionizing senior chronic condition management. We're a mission-driven team, dedicated to improving lives and shaping healthcare's future. Our remote-friendly culture values diversity, fostering innovation through varied backgrounds and abilities.
