Senior Data Integration Engineer
Location: REMOTE
Description: Our client is currently seeking a Senior Data Integration Engineer (W2 Only)
For immediate consideration, please send your resume to [email protected]
LOCATION: Fully remote
PROJECT: A commercial cloud (Azure) based data ingestion engine that is highly performant, scalable, and elastic, and that transforms and stores member/clinical data.
IDEAL BACKGROUND: Healthcare data background, especially provider data (versus payer data), including clinical data. Experience with FHIR is highly desirable. A deep background in various ETL methods is required. Programming and full-stack development experience would be a strong addition.
TOP REQUIREMENTS: Healthcare-specific data knowledge, ETL (Delta Lake), FHIR
Function Description:
- Functions may include database architecture, engineering, design, optimization, security, and administration, as well as data modeling, big data development, Extract, Transform, and Load (ETL) development (a data ingestion pipeline leveraging Delta Lake on Databricks), storage engineering, data warehousing, data provisioning, and other similar roles.
- Evaluating the performance and applicability of multiple tools against customer requirements
- Responsibilities may include managing design services, providing sizing and configuration assistance, ensuring strict data quality, and performing needs assessments.
- Analyzes current business practices, processes, and procedures, and identifies future business opportunities for leveraging data storage and retrieval system capabilities.
- Manages relationships with software and hardware vendors to understand the potential architectural impact of different vendor strategies and data acquisition.
- May design schemas, write SQL or other data scripting, and help support the development of analytics and applications that build on top of the data.
Core Tasks:
- Maintain, improve, clean, and manipulate data
- Transform data for consumption by various clinical systems
- Improve data efficiency, reliability, and quality
- Enrich data
- Build for high performance
- Ensure data integrity
- Create and manage data stores at scale
- Ensure data governance - security, quality, access and compliance
- Partner with other members to support delivery of additional project components (API interfaces, Cognitive Search)
Contact: [email protected]
This job and many more are available through The Judge Group. Find us on the web at www.judge.com