Senior Data Engineer

Posted 18 Days Ago
San Francisco, CA
Senior level
Software
The Role
The Senior Data Engineer will design, develop, and maintain complex ETL pipelines using SQL and PySpark, focusing on processing time-series data and ensuring data quality. The role involves collaborating with product stakeholders and contributing to the production of data products.

Plato Systems is pioneering the use of spatial AI to boost capacity, productivity, and safety across physical operations, starting with manufacturing. Established as a spin-off from Stanford and funded by NEA in 2019, Plato has developed an Operations Digital Twin and AI copilot fueled by our proprietary hardware, which uses machine perception and sensor fusion on the edge to digitize patterns of activity. Advanced manufacturing companies use our platform to root-cause complex systemic operational issues rapidly and at scale, taking Kaizen and compliance tracking into the age of AI. We are deployed across multinational electronics manufacturers, semiconductor fabs, and EMS companies in several countries. You can find out more about us by visiting our website.

We are seeking a Senior Data Engineer with over 7 years of relevant experience to join our growing team. The ideal candidate will have deep experience designing, developing, and maintaining many ETL pipelines concurrently, will be confident working with time-series data, and will have prior exposure to business analysis or business intelligence functions.

Core Responsibilities & Qualifications:

  • ETL Pipeline Design and Development: Work closely with our Head of Product to create and maintain complex ETL data pipelines using SQL and PySpark, while ensuring data quality. Familiarity with Databricks is preferred.
  • Time-Series Data Processing: Prior experience designing and implementing production pipelines that normalize, aggregate, align, and correlate time-series data from sensors, machines, and processes, as well as orchestrating and monitoring those pipelines (a minimal sketch of this kind of transformation follows this list).
  • Production Data Product Experience: Demonstrated ability to work on the production of data products, including handling data quality issues, orchestration and automation, and testing.
  • Coding Skills: Proficiency in Python, SQL, and Java or other backend languages. Familiarity with cloud platforms like AWS, GCP or Azure is a plus.
  • Communication Requirements: Effective communication skills, both verbal and written, to clearly explain complex data concepts to non-technical team members and stakeholders.
  • Team Collaboration: Proven experience working effectively in cross-functional teams, demonstrating a collaborative mindset.
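
As a rough illustration of the kind of pipeline described above, here is a minimal PySpark sketch, assuming hypothetical table and column names (raw_sensor_readings, machine_id, ts, value) rather than Plato's actual schema:

    # Minimal sketch: align irregular sensor readings to a fixed time grid and
    # aggregate per machine. Table and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("sensor-etl-sketch").getOrCreate()

    raw = spark.read.table("raw_sensor_readings")  # hypothetical source table

    aligned = (
        raw
        # Basic data-quality gate: drop null or negative readings before aggregation.
        .filter(F.col("value").isNotNull() & (F.col("value") >= 0))
        # Bucket irregular timestamps into 5-minute windows.
        .withColumn("bucket", F.window("ts", "5 minutes"))
        # Aggregate per machine and window.
        .groupBy("machine_id", "bucket")
        .agg(
            F.avg("value").alias("avg_value"),
            F.max("value").alias("max_value"),
            F.count("*").alias("n_readings"),
        )
        .select(
            "machine_id",
            F.col("bucket.start").alias("window_start"),
            "avg_value",
            "max_value",
            "n_readings",
        )
    )

    # Write the derived table that downstream dashboards and analytics consume.
    aligned.write.mode("overwrite").saveAsTable("sensor_5min_agg")  # hypothetical target

On Databricks this kind of transformation would typically run as a scheduled job writing to a managed table that dashboards query directly.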

Preferred Qualifications:

  • Domain Engagement: Demonstrated experience translating business requirements and domain knowledge into data products that address customer needs. Success in this role hinges on grasping the nuances and intricacies of our domain in order to deliver relevant and impactful data solutions.
  • Autonomy & attention to detail: Derived tables that feed dashboards and analytics are only useful if they contain clean, high-quality data. This requires good judgment in balancing unit tests, integration tests, regression tests, and monitoring/alerting to ensure both data flow and data quality (a minimal example of such a check follows this list).
  • BI, Data Visualization & Reporting: Prior experience creating and maintaining automated reporting, dashboards, and consistent analysis to bolster data-driven decisions.
  • Versatility of Prior Work: Experience working across different industries or domains and at different stages (from early-stage build-out to mature pipelines and processes), showing flexibility and the ability to adapt to new types of data and business domains.
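
As a rough illustration of the data-quality checks mentioned above, here is a minimal PySpark sketch that validates a derived table before it is published; the table, column names, and thresholds are hypothetical:

    # Minimal sketch: assert basic invariants on a derived table so the
    # orchestrator can alert instead of silently publishing bad data.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-check-sketch").getOrCreate()

    agg = spark.read.table("sensor_5min_agg")  # hypothetical derived table

    # Invariant 1: no null keys or null aggregates.
    null_rows = agg.filter(
        F.col("machine_id").isNull() | F.col("avg_value").isNull()
    ).count()

    # Invariant 2: aggregates stay within a plausible range (threshold is illustrative).
    out_of_range = agg.filter(
        (F.col("avg_value") < 0) | (F.col("avg_value") > 1e6)
    ).count()

    if null_rows or out_of_range:
        raise ValueError(
            f"Data-quality check failed: {null_rows} null rows, "
            f"{out_of_range} out-of-range rows"
        )

Failing loudly like this lets a scheduler such as a Databricks job or Airflow surface the problem through its normal alerting rather than letting bad data reach dashboards.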

Top Skills

Java
PySpark
Python
SQL
The Company
HQ: Washington, DC
23 Employees
On-site Workplace

What We Do

Plato unlocks the power of digital transformation and industrial automation through its integrated Spatial Intelligence™ system and platform. Plato Deep Fusion perception technology tracks activity patterns of people, equipment, and assets in industrial environments, making it the world’s first tagless activity tracking system. Plato provides a reliable, frictionless, and scalable automation solution that delivers actionable KPIs so businesses can improve safety, optimize process efficiency, and increase productivity.

Pioneered by Stanford academics and commercialized by industry experts, the Deep Fusion multi-sensor system provides robust tagless target tracking and scene understanding using a combination of model-based and ML-based algorithms on an embedded edge compute platform.
