Senior Data Platform Engineer

Peninsula


Hippo is modernizing the home insurance industry by putting customers at the center of everything we do, from the coverage we offer to the customer service we provide. Hippo’s true ambition lies in helping people protect their homes in the first place by leveraging technology and data to find small issues before they become big headaches. Because, at the end of the day, the best home insurance policy is the one you never have to use.

We're looking for a talented Data Platform Engineer to scale up our analytical data infrastructure and partner with the product team to ensure data integrity across the entire system. An ideal candidate brings curiosity, a passion for data, a deep understanding of the technologies behind data pipelines, warehousing, big data, and analytics, and a strong grasp of the software development life cycle. Prior startup experience and the ability to thrive in a fast-paced environment are a big plus.

Responsibilities:

  • Manage analytics data systems and continuously scale them to meet growing business demand
  • Develop and continuously improve the workflow orchestration system
  • Manage the build and deployment process for company-wide data systems
  • Manage data ingestion flows to meet the company’s batch-processing and real-time use cases
  • Develop and improve the company’s reporting framework to meet regulatory and partner requirements
  • Partner with product engineering and other platform teams to ensure data integrity across the broader system
  • Improve anomaly-detection platform integration to achieve a resilient data system and empower data QA as well as business users

Required Qualifications:

  • Bachelor’s degree or equivalent in Computer Science, Computer Engineering, Information Technology or any related field of study
  • 5+ years of work experience as a software engineer in data infrastructure or analytical data warehousing
  • Demonstrable knowledge, experience, and proficiency with the following:
    • Python
    • SQL
    • Shell scripting
    • AWS/GCP environments
  • Experience with building ETL workflow management systems
  • Experience with Docker containers and Kubernetes

Preferred Qualifications:

  • Experience working with dbt and BigQuery
  • Experience in building or integrating data anomaly and validation tools

Compensation & Perks:

  • Healthy Hippos Benefits - 100% employer-paid medical, dental & vision plan options for our team members AND their families (yes, you read that correctly), as well as 401(k), long- and short-term disability, EAP, and flexible spending accounts
  • Flexible Paid Time Off - Uncapped PTO (with manager approval)
  • Hippo Habitat - Fun offices located in downtown Austin. Stocked kitchen and BBQ Fridays for all!
Hippo is an equal opportunity employer, and we are committed to building a team culture that celebrates diversity and inclusion. Hippo’s applicants are considered solely based on their qualifications, without regard to an applicant’s disability or need for accommodation. Any Hippo applicant who requires reasonable accommodations during the application process should contact Hippo’s People Team to make the need for an accommodation known.

