Engineering Manager - Transformation & Data Lake

Posted Yesterday
Bellevue, WA, USA
Hybrid
236K-339K Annually
Entry level
Artificial Intelligence • Big Data • Cloud • Machine Learning • Software • Database • Analytics
Let's build a world where data and AI turn possibilities into reality.
The Role
As an Engineering Manager for Transformation & Data Lake at Snowflake, you'll lead efforts in building interoperable data solutions, focusing on metadata, integration platforms, and data pipeline automation, driving innovation across cloud data ecosystems.
Summary Generated by Built In

At Snowflake, we are powering the era of the agentic enterprise. To usher in this new era, we seek AI-native thinkers across every function who are energized by the opportunity to reinvent how they work. You don’t just use tools; you possess an innate curiosity, treating AI as a high-trust collaborator that is core to how you solve problems and accelerate your impact. We look for low-ego individuals who thrive in dynamic and fast-moving environments and move with an experimental mindset — who rapidly test emerging capabilities to discover simpler, more powerful ways to deliver results. At Snowflake, your role isn't just to execute a function, but to help redefine the future of how work gets done.

Metadata:
  • Our team owns the interoperable metadata and transaction foundation for Snowflake’s Lakehouse, building the core platforms behind Snowflake Managed and Externally Managed Iceberg tables. We enable Iceberg tables to tap into the best of Snowflake—replication, sharing, change tracking/row lineage, managed storage, and cross-cloud reliability—while bringing the best of Iceberg, including branching, tagging, and rich versioning, natively into the Snowflake platform. We work at the intersection of distributed systems, open table formats, and core Snowflake services to make open-format workloads first-class citizens in the AI Data Cloud.
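As context for this team's scope, a Snowflake-managed Iceberg table is declared roughly as below; the table, volume, and path names are illustrative, and exact options depend on account setup:

```sql
-- Sketch of a Snowflake-managed Iceberg table (names are illustrative).
-- EXTERNAL_VOLUME points at customer-managed cloud storage; BASE_LOCATION
-- is the path under that volume where Iceberg data and metadata files land.
CREATE ICEBERG TABLE customer_events (
  event_id   BIGINT,
  event_time TIMESTAMP_NTZ,
  payload    VARCHAR
)
  CATALOG = 'SNOWFLAKE'          -- Snowflake-managed: Snowflake owns the metadata
  EXTERNAL_VOLUME = 'events_vol'
  BASE_LOCATION = 'events/';
```

Because the data lands in open Iceberg format on external storage, other engines can read it while Snowflake provides the transactional and governance layer described above.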

Data Lake:
  • Catalog Services and Integrations Platform Team
    We are looking for visionary leaders to join the Catalog Services & Integrations Platform team. Our mission is to make Snowflake the most interoperable data platform on the planet. We build the foundational infrastructure that enables seamless connectivity between Snowflake and the broader data ecosystem, including platforms like Google BigLake, Microsoft OneLake, and Databricks Unity Catalog. If you are passionate about distributed systems, multi-cloud architectures, and shaping the future of open data standards, this is where you can have outsized impact.
    What We Do
    Catalog Integrations Platform
    We design and build a high-performance, scalable integration layer that allows Snowflake to interoperate with external catalogs. Our platform enables frictionless metadata discovery and access across the modern data stack, breaking down silos and empowering true data mobility.
    Horizon Catalog & Managed Polaris
    We power the next generation of open cataloging within Snowflake through Horizon Catalog. Our team delivers managed Polaris capabilities and implements Iceberg REST Catalog (IRC) support, bringing open standards, governance, and interoperability together in a unified experience.
    Open Ecosystem Enablement
    We are at the forefront of enabling open table formats and cross-platform compatibility, ensuring Snowflake remains a central player in an increasingly interconnected data landscape.
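To give a flavor of the integration layer this team builds, connecting Snowflake to an external Iceberg REST catalog is configured along these lines; the integration name and endpoint URL are placeholders, and exact parameters vary by catalog provider:

```sql
-- Sketch of a catalog integration against an Iceberg REST Catalog (IRC)
-- endpoint (names and URL are placeholders; authentication options and
-- parameter details vary by provider).
CREATE CATALOG INTEGRATION external_irc
  CATALOG_SOURCE = ICEBERG_REST
  TABLE_FORMAT = ICEBERG
  REST_CONFIG = (
    CATALOG_URI = 'https://catalog.example.com/api/catalog'
  )
  ENABLED = TRUE;
```

Once such an integration exists, Iceberg tables managed by the external catalog can be surfaced inside Snowflake, which is the interoperability story the mission statement above describes.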

Dynamic Tables:
  • Dynamic Tables (DT) are Snowflake’s declarative engine for building automated, incremental data pipelines in simple SQL—Snowflake handles scheduling, incremental refresh, and reliable DAG execution so customers can focus on modeling their business logic. The Dynamic Tables Orchestration team builds towards the next-generation of “Unified Pipelines” that ties DTs together with Tasks, Streams, Snowpark, and dbt. We also build AI-enabled pipeline authoring and streamlining experiences, including prompt-based tuning and data refinement, turning complex platform signals into simple, intuitive workflows for data engineers.
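For a sense of the declarative model described above, a Dynamic Table is defined roughly as follows; the table, warehouse, and source names are illustrative:

```sql
-- Sketch of a Dynamic Table (names are illustrative). Snowflake derives
-- the refresh schedule and incremental plan from the query and the
-- TARGET_LAG freshness goal; no manual orchestration is written.
CREATE DYNAMIC TABLE daily_revenue
  TARGET_LAG = '1 hour'      -- how stale the results are allowed to get
  WAREHOUSE = transform_wh   -- compute used for refreshes
AS
  SELECT order_date, SUM(amount) AS revenue
  FROM raw_orders
  GROUP BY order_date;
```

Chains of such tables form the DAG the platform executes and refreshes automatically, which is what makes the pipelines "declarative" rather than hand-orchestrated.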

Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.

How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com

Top Skills

AI
Cloud Technologies
Cross-Cloud Architecture
Data Catalogs
Data Lake Platforms
dbt
Distributed Systems
Metadata Management
Snowpark
SQL

The Company
HQ: Bozeman, MT
9,023 Employees
Year Founded: 2012

What We Do

Snowflake powers the end-to-end data lifecycle – from ingesting and processing data to analyzing and modeling it, to building and sharing data and AI applications – helping engineers, analysts, and leaders innovate faster and achieve more with their data. We're on a mission to empower every enterprise to achieve its full potential through data and AI.

Why Work With Us

Snowflake is where data does more, and so do you. More innovating, more growing, and more collaborating. Here, you’ll find the sweet spot between building big and moving fast, in technology and your career.

