What you'll do
- Document, improve, and maintain data strategies and artifacts, including logical and physical data models, a data dictionary, a data architecture roadmap, data security policies, and operational runbook procedures, using industry best practices and adhering to government standards
- Collect data access patterns and review current data models to optimize designs for customer use cases
- Standardize data ingestion and processing pipelines to scale with increasing utilization
- Audit and reverse-engineer business rules in legacy systems, and build data connectors for integrating them into a data and analytics platform
- Implement large-scale data ecosystems within cloud-based platforms that include data management and data governance of structured and unstructured data
- Leverage and enhance automation to speed development and improve reliability and performance
- Work with cross-functional project teams to gather business requirements and translate them into detailed technical specifications
- Work with Government partners to develop data engineering applications and pipelines that enable data services and processing capabilities, such as advanced analytics, AI/ML, and experimentation
- Design, develop, test, automate, and deploy data engineering solutions in cloud platforms, such as AWS
- Participate in software design and code reviews
- Develop automated testing, monitoring and alerting, and CI/CD for production systems
What we're looking for
- 10+ years of software engineering experience
- 5+ years of experience with Cloud Data architecture (AWS preferred)
- Experience with professional software engineering practices, using tools and methodologies such as Agile Software Development, Test-Driven Development, CI/CD, and Source Code Management
- Familiar with modern data technologies including data lakes, data warehouses, and real-time analytics
- Experience with building ETL pipelines to ingest, process, and store data
- Experience with data orchestration frameworks, such as Apache Airflow, AWS Step Functions, or Temporal
- Experience with business intelligence and analytics platforms
- Proficient with relational databases and advanced SQL queries
- Prior experience with Python, Java, or Scala programming across multiple use cases
- Experience with data cleaning and data modeling while protecting sensitive data
- Proficient with building data integrations using both API and file-based protocols
- Comfortable troubleshooting complex data and systems interaction problems
- Highly organized, with demonstrated critical reasoning and problem-solving skills
- Works well independently, as part of a team, and with stakeholders
- Effective oral and written communication skills to relay technical concepts clearly and concisely
Desired Skills
- Familiarity with DevOps IaC tools (e.g., Ansible, Terraform, and CloudFormation)
- Experience with security scanning tools (e.g., Nessus, BurpSuite, Netsparker, OWASP)
- Experience working with and programmatically interfacing with legacy systems
- Experience with healthcare quality data, e.g., provider, beneficiary, claims, and quality measure data
- Familiarity with test automation frameworks
What We Do
Nava is a public benefit corporation working to radically improve how government serves people. Formed as a team of designers and engineers in the effort to fix HealthCare.gov in 2013, Nava now works with Medicare and the Department of Veterans Affairs. With a strong research practice and a depth of experience scaling digital services, Nava helps more than sixty million people access critical government services.
We’re thinkers and designers of civic technology. We work holistically across engineering, research, design, and operations to proactively envision the services of a better future. We build with empathy and inclusion, and as we come from many backgrounds and countries ourselves, we seek and value different perspectives. We’d love for you to come join us.