Data Engineer

2 Locations
Hybrid
115K-138K Annually
Social Impact
The Role
The Data Operations & Cloud Engineer will oversee the development, management, and optimization of a cloud data platform on Google Cloud Platform (GCP). This role will ensure secure, efficient, and scalable data solutions and be instrumental in establishing and maintaining the data infrastructure that supports advanced analytics and research on workforce development, higher education, labor market, and economic data for state partners. Key responsibilities include managing data ingestion pipelines, optimizing system performance, implementing data governance policies, and providing technical leadership to support data-driven decision making. Our data environment leverages GCP services such as BigQuery, Cloud Run, Cloud Storage, and Vertex AI. We welcome candidates with experience in comparable cloud data ecosystems (e.g., Snowflake, Amazon Redshift, Databricks) who are eager to pivot their expertise to GCP's toolset in support of our mission.

Context

Strada Education Foundation supports programs, policies, and organizations that strengthen connections between education and employment in the U.S., with a special focus on helping those who have faced the greatest challenges in securing economic opportunity through postsecondary education and training. Strada’s strategic plan focuses on five key areas to improve pathways to opportunity in post-high school education:

Clear outcomes – defining and measuring educational outcomes
Quality coaching – providing guidance to learners
Affordability – ensuring education and training are financially attainable
Work-based learning – expanding on-the-job training opportunities
Employer alignment – aligning educational programs with workforce needs

Strada leverages research, strategic philanthropy, investments, communications, advocacy, and collaboration in the pursuit of this mission. The Data Operations Engineer supports the Education Analytics and Technical Services team within the Employer Alignment focus area, ensuring that data infrastructure and practices meet the needs of these strategic initiatives.

Key responsibilities: the Data Operations Engineer has four core responsibility areas, listed below with the approximate percentage of time required for execution.

Data Management (40%)

  • Cloud Data Platform Administration: Manage, maintain, and optimize the Google Cloud-based data warehouse and storage environment (e.g., BigQuery, Cloud Storage) to ensure secure, efficient, and scalable data solutions.
  • ETL/ELT Pipeline Development: Develop and orchestrate scalable ETL/ELT data pipelines for data ingestion and transformation, using GCP services (such as Cloud Run or Cloud Dataflow) to handle large-scale data processing.
  • Data Quality & Governance: Ensure data integrity, quality, and governance compliance across all workflows, establishing best practices for data security, access control, and regulatory compliance.
  • Third-Party Data Integration: Collaborate with internal research teams and state agency partners to integrate third-party data sources (e.g. via APIs and data marketplaces) into the platform. Identify and onboard new data sources and technology to expand the workforce and education data model.
  • Performance Optimization: Optimize data structures, partitioning, and query strategies for performance and cost efficiency in BigQuery. Monitor and tune resource usage to ensure cost-effective operations (see the partitioning sketch after this list).
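
By way of illustration, the sketch below shows one way the partitioning and cost-efficiency work described above might look in Python with the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical and do not reflect our actual environment.

```python
from google.cloud import bigquery

# Illustrative sketch only: project, dataset, table, and column names are placeholders.
client = bigquery.Client(project="example-project")

table = bigquery.Table(
    "example-project.workforce.wage_records",
    schema=[
        bigquery.SchemaField("record_date", "DATE"),
        bigquery.SchemaField("state", "STRING"),
        bigquery.SchemaField("industry_code", "STRING"),
        bigquery.SchemaField("avg_weekly_wage", "NUMERIC"),
    ],
)

# Partition by date and cluster on common filter columns so queries scan
# (and are billed for) only the partitions and blocks they actually touch.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.MONTH, field="record_date"
)
table.clustering_fields = ["state", "industry_code"]
client.create_table(table, exists_ok=True)

# A filter on the partition column lets BigQuery prune untouched partitions.
query = """
    SELECT state, AVG(avg_weekly_wage) AS avg_wage
    FROM `example-project.workforce.wage_records`
    WHERE record_date >= DATE '2024-01-01'
    GROUP BY state
"""
for row in client.query(query).result():
    print(row.state, row.avg_wage)
```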

System Administration and Performance Optimization (30%)

  • Monitoring & Troubleshooting: Monitor system performance and troubleshoot issues across GCP data services. Ensure high availability and responsiveness of databases, pipelines, and applications.
  • Resource & Cost Management: Manage storage resources, query performance, and workload scheduling in the GCP environment (BigQuery, Cloud Run, etc.). Implement cost-effective strategies for managing cloud data warehouse expenditures, including rightsizing storage and compute resources (see the cost-monitoring sketch after this list).
  • Automation & DevOps: Automate data workflows, pipeline scheduling, and deployments (leveraging Infrastructure-as-Code and CI/CD where possible) to streamline data processing and reporting. Maintain comprehensive documentation for data models, system configurations, and integration processes to support maintainability and knowledge sharing.
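
As another illustrative sketch (not a prescribed approach), the snippet below uses BigQuery's INFORMATION_SCHEMA job metadata to surface the most expensive recent queries, the kind of starting point the cost-management bullet above refers to; the project ID is a placeholder.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project ID

# Rank the past week's queries by bytes billed, a common first step when
# looking for workloads worth tuning, partitioning, or rescheduling.
sql = """
    SELECT
      user_email,
      query,
      IFNULL(total_bytes_billed, 0) / POW(1024, 3) AS gib_billed
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
      AND job_type = 'QUERY'
      AND state = 'DONE'
    ORDER BY gib_billed DESC
    LIMIT 20
"""
for row in client.query(sql).result():
    print(f"{row.gib_billed:10.2f} GiB  {row.user_email}  {row.query[:80]!r}")
```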

Data Collaboration and Reporting Support (20%)

  • Analytics Enablement: Provide technical data access and support to the Education Analytics and Technical Services team working within the data platform. Ensure that analysts and researchers can easily retrieve and analyze data needed for workforce and education insights.
  • Data Quality Collaboration: Drive data quality improvements through close collaboration with stakeholders and contractors, supporting effective analytics and reporting outcomes. Establish feedback loops to continually refine data definitions and accuracy.
  • Dashboard and App Support: Assist in the development and maintenance of analytics dashboards and applications by ensuring data is accessible, well-structured, and up to date. This includes supporting business intelligence tools like Tableau and custom analytics apps (e.g., Streamlit) by provisioning data and optimizing queries for front-end use (see the dashboard sketch after this list).
  • Alignment with Analytics Needs: Ensure ongoing alignment between the data infrastructure and the analytic capabilities of the team. Work closely with analysts to understand their data needs and adjust data models or pipelines to enable new metrics, visualizations, and insights.
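
The following is a minimal, hypothetical sketch of how a Streamlit app might be backed by provisioned BigQuery data, as referenced in the Dashboard and App Support bullet; table, column, and project names are illustrative rather than drawn from our actual applications.

```python
import streamlit as st
from google.cloud import bigquery

# Hypothetical table and column names, for illustration only.
OUTCOMES_TABLE = "example-project.workforce.program_outcomes"

@st.cache_data(ttl=3600)  # cache results so each widget change doesn't rerun the query
def load_outcomes(state: str):
    client = bigquery.Client()
    sql = f"""
        SELECT program, completions, median_wage
        FROM `{OUTCOMES_TABLE}`
        WHERE state = @state
        ORDER BY completions DESC
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("state", "STRING", state)]
    )
    return client.query(sql, job_config=job_config).result().to_dataframe()

st.title("Program outcomes explorer")
state = st.selectbox("State", ["IN", "OH", "TX"])
st.dataframe(load_outcomes(state))
```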

Team Leadership & DEI Commitment (10%)

  • Technical Leadership: Provide guidance and training to research and analytics team members on best practices in data warehousing, data engineering, and governance. Foster data literacy and efficient use of the data platform across the team.
  • Diversity, Equity & Inclusion: Partner with Human Resources and DEI leadership to promote equitable workplace practices and embrace diverse perspectives in data operations.
  • Collaborative Culture: Foster a collaborative, inclusive environment that encourages innovation and cross-functional teamwork. Model transparency, respect, and inclusion in all professional interactions, ensuring that all team members feel valued and heard as we develop data solutions.

The Person: Qualifications and Experience

  • Education: Bachelor’s degree in computer science, data engineering, information systems, or a related field (or equivalent work experience). A master’s degree is a plus.
  • Cloud Data Platform Expertise: 5+ years of experience managing and operating cloud-based data platforms in enterprise environments with multiple internal and external stakeholders, preferably on Google Cloud Platform (BigQuery, Cloud Storage, etc.). Experience with other data warehouse ecosystems such as Snowflake, Amazon Redshift, or Databricks is highly valued, with an expectation of willingness to pivot and learn GCP tools.
  • Data Modeling & Architecture: Proven experience in designing and implementing data models that facilitate continuous research, dashboarding, reporting, and advanced analytics. Ability to optimize data workflows and performance for large-scale datasets.
  • Programming & Scripting: Strong proficiency in SQL for data manipulation and query optimization. Experience with Python (or similar languages) for data engineering tasks and script automation (e.g., using Pandas, Apache Beam/Dataflow); a brief pipeline sketch follows this list.
  • ETL/ELT & Integration: Hands-on experience with ETL/ELT pipeline development and cloud-based data integration processes. Experience integrating third-party data from external APIs and data marketplaces to enrich internal datasets.
  • Infrastructure-as-Code & CI/CD: Familiarity with tools such as Terraform, Cloud Build, GitHub Actions, or Jenkins for infrastructure provisioning and deployment automation.
  • Data Governance & Security: Demonstrated expertise managing security, permissions, and controls for a large organization. Knowledge of key U.S. data privacy regulations (e.g., FERPA, CCPA) and cloud compliance frameworks (e.g., SOC 2, ISO 27001) for data handling in education and labor contexts is essential.
  • Version Control: Proficient with Git for version control and collaborative development.
  • Analytical Tools: Familiarity with business intelligence and data visualization tools such as Tableau (preferred) or Power BI, and exposure to building simple analytics applications or dashboards to support end-users.
  • Domain Experience: Experience working with state-level workforce, education, and/or economic datasets is a strong plus. An understanding of labor market data or higher education data conventions will help in contextualizing and validating data.
  • Soft Skills: Excellent problem-solving and troubleshooting skills in a data-centric environment. Strong communication and collaboration abilities, with experience working in cross-functional teams and explaining technical concepts to non-technical stakeholders. Experience with agile project management methodologies for data infrastructure projects is beneficial.
  • Desired Certifications: Current Google Cloud Professional Data Engineer and/or Cloud Architect certification.
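
For candidates newer to the GCP stack, the hedged sketch below shows roughly what a small Apache Beam pipeline of the kind mentioned under Programming & Scripting looks like, reading a CSV from Cloud Storage into BigQuery; all bucket, project, and schema names are made up.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical bucket, project, and schema; running this for real requires GCP
# credentials and, on the Dataflow runner, --project/--region/--temp_location
# pipeline options.
options = PipelineOptions()

def parse_row(line: str) -> dict:
    county, quarter, avg_wage = line.split(",")
    return {"county": county, "quarter": quarter, "avg_wage": float(avg_wage)}

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadCSV" >> beam.io.ReadFromText("gs://example-bucket/wages.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_row)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:labor.county_wages",
            schema="county:STRING,quarter:STRING,avg_wage:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```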

Travel Requirements:
This role may require occasional travel for conferences, meetings, or site visits, estimated at up to 10% of the time.

Please note: We are unable to offer visa sponsorship for this position. Applicants must be legally authorized to work in the United States on a full-time basis without current or future sponsorship requirements.

The Company
HQ: Indianapolis, IN
197 Employees
Year Founded: 2017

What We Do

Strada Education Network is a new kind of social impact organization dedicated to improving lives by forging clearer and more purposeful pathways between education and employment. Our approach combines innovative research, thought leadership, strategic philanthropy, mission-aligned investments, and a network of affiliate organizations. Together, we work to better serve millions of individuals in the United States seeking to complete postsecondary education and training, gain clear value from those experiences, and build meaningful careers. Learn more at stradaeducation.org.
