Sr. Data Engineer - Remote

Job Description

Sharecare is a digital health company that helps people manage all their health in one place. The Sharecare platform provides each person -- no matter where they are in their health journey -- with a comprehensive and personalized health profile, where they can dynamically and easily connect to the information, evidence-based programs, and health professionals they need to live their healthiest, happiest, and most productive life. With award-winning, innovative, frictionless technologies, scientifically validated clinical protocols, and best-in-class coaching tools, Sharecare helps providers, employers, and health plans effectively scale outcomes-based health and wellness solutions across their entire populations. We are always looking for people who value the opportunity to work hard, have fun on the job, and make a difference in the lives of others through their work every day!

Job Summary:

We are seeking a Sr. Data Engineer to contribute to a new cutting-edge platform supporting big-name partners and customers. In this fast-paced environment, the team works with Product, QA, Operations, and Analytics to meet business needs. The Senior Data Engineer maintains both strategic and tactical working knowledge of the production data systems, along with the users, workflows, data flows, data models, use cases, and requirements associated with these systems, so that they can effectively transform production data for use in the data warehouse. This position is a chance to join a highly collaborative agile team breaking new ground in the health information industry.

An ideal candidate is a fast learner, self-driven, and detail-focused. The candidate should be comfortable with Python and related frameworks, or have strong object-oriented programming (OOP) experience and a desire to learn functional programming, along with experience performing ETL and big-data processing with Pandas, Dask, and SQL. Strong analytical skills are a must when designing our platform architecture. The candidate should be able to follow requirement documents, design future-proof solutions, and voice their opinions on possible improvements and innovations.
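As a rough illustration of the kind of ETL and big-data processing named above, the sketch below uses Dask to batch-process raw event files into a daily aggregate. The file paths, column names, and aggregation are hypothetical placeholders for illustration only, not details of this role's actual pipelines.

```python
# Illustrative sketch only: paths, column names, and the aggregation are
# hypothetical and not taken from the job description.
import dask.dataframe as dd


def transform_member_events(input_glob: str, output_path: str) -> None:
    """Batch ETL step: read raw event files, clean them, aggregate per member."""
    # Extract: lazily read a set of CSV files into a Dask DataFrame.
    events = dd.read_csv(input_glob, parse_dates=["event_time"])

    # Transform: drop incomplete rows and count events per member and day.
    events = events.dropna(subset=["member_id"])
    events["event_date"] = events["event_time"].dt.date
    daily_counts = (
        events.groupby(["member_id", "event_date"])
        .size()
        .reset_index()
        .rename(columns={0: "event_count"})
    )

    # Load: write the result as Parquet for downstream warehouse use.
    daily_counts.to_parquet(output_path, write_index=False)


if __name__ == "__main__":
    transform_member_events("raw_events/*.csv", "curated/daily_event_counts")
```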

Essential Functions:

* Design, develop, test, and deploy data pipelines to manage enterprise data.
* Evaluate requirements, issues, and proposed enhancements to provide recommendations for improvement.
* Design orchestration solutions to meet internal and external analytical needs.
* Build architecture and libraries to be leveraged across the enterprise.
* Apply a fundamental understanding of events, data streams, and batch processing.
* Bring a passion for big data, data transformations, data validations, and data processing.
* Perform research and analysis of operations performance data, agent performance data, member eligibility, healthcare, response data, etc.
* Proactively monitor production orchestrations for trends.
* Occasionally publish written documentation related to the technology solutions.
* Work on problems of moderate scope where analysis of situations or data requires a review of a variety of identifiable factors or solutions.
* Use and design clean RESTful APIs that conform to company standards and best practices.
* Exhibit a passion for clean code and simple solutions to complex problems.

Requirements

Qualifications:

* Bachelor's degree (or higher) in Computer Science or related field.
* Detail focused engineer excited to learn new technologies and solve complex problems.
* Experience programming in Python (OOP or functional language experience a plus).
* Proficient in writing SQL queries with complex datasets.
* A strong understanding of concurrent, parallel, and distributed systems.
* Experience with Pandas, Apache Airflow, Apache Spark, Apache Beam, Redshift, and related technologies is a plus.
* Experience with Amazon SNS, S3, SQS, Kafka, Kinesis, MongoDB, Kubernetes, Unix shell scripts, and Git is desired.
* Experience implementing REST services is a plus.
* Knowledge or experience in test-driven development (TDD) and continuous deployment best practices.
* Experience with machine learning algorithms and big data processing is a plus.
* A "can do" attitude and enjoys working within a highly collaborative and friendly work environment.

EEO Statement

Sharecare is an Equal Opportunity Employer and doesn't discriminate on the basis of race, color, sex, national origin, sexual orientation, gender identity, religion, age, disability, genetic information, protected veteran status, or any other non-merit factor.

More Information on Sharecare
Sharecare operates in the fitness industry, is located in Atlanta, GA, was founded in 2010, and has 3,221 total employees. It offers perks and benefits such as a Flexible Spending Account (FSA), disability insurance, dental insurance, vision insurance, health insurance, and life insurance.