Core Responsibilities
- Collaborate with Product Management to understand use cases and personas, and engineer the product to support a strong user experience.
- Own schema design and data modeling for energy metering and building management system (BMS) data.
- Architect and maintain cost-effective, performant, next-generation data storage (e.g., ClickHouse, StarTree).
- Lead data architecture decisions, including evaluating and integrating tools in our modern data stack.
- Build and manage robust, scalable ETL/ELT pipelines to ingest, transform, and serve data.
- Ensure performance and efficiency of analytical queries across large datasets.
- Develop and enforce data quality, validation, and governance standards.
Adjacent Responsibilities
- Support real-time IoT analytics and streaming pipelines.
- Own BI tooling (e.g., Superset, Looker, Tableau).
- Contribute to building internal data tools for engineers and analysts.
- Collaborate with AI/ML teams to support model training and inference pipelines.
- Work with web and application teams to ensure real-time and batch data access needs are met.
- Manage team projects and coordinate with other technical leads.
- Mentor junior engineers and contribute to technical hiring.
Required Qualifications
- Align with core working hours of 10:00 AM to 5:00 PM PST; must be based in the Pacific, Mountain, or Central time zone.
- 5+ years of experience in data engineering with large-scale, high-throughput systems.
- Proven experience designing dimensional models and OLAP schemas (fact and dimension tables).
- Deep understanding of columnar stores and database internals (e.g., ClickHouse, Druid, StarTree, Pinot).
- Strong SQL skills and proficiency with Python for data pipelines.
- Experience handling updates, inserts, and Type 2 slowly changing dimensions for time-series or large-scale event stores.
Preferred Qualifications
- Experience with BMS/HVAC or energy data.
- Experience using time-series and energy data for diagnostics and efficiency analysis.
- Experience with IoT or sensor data systems.
- Experience working in AWS Cloud.
- Experience with Postgres.
- Proficiency in orchestrating ETL workflows (e.g., Dagster, Airflow, AWS Step Functions).
- Familiarity with stream processing tools (e.g., Kafka, Flink, Spark Streaming).
- Exposure to machine learning feature stores or MLOps tooling.
- Experience with data observability and data cataloging tools.
- Experience managing a team or leading the work of others.
What We Do
Verdigris is an artificial intelligence IoT platform that makes buildings smarter and more connected while reducing energy consumption and costs. By combining proprietary hardware sensors, machine learning, and software, Verdigris “learns” the energy patterns of a building. Its AI software produces comprehensive reports including energy forecasts, alerts about faulty equipment, maintenance reminders, and detailed energy usage information for every device and appliance. Verdigris offers a suite of applications that gives building engineers a comprehensive overview, an “itemized utility bill”, powerful reporting, and simple automation tools for their facility. For more information, visit www.verdigris.co.