Notion helps you build beautiful tools for your life’s work. In today's world of endless apps and tabs, Notion provides one place for teams to get everything done, seamlessly connecting docs, notes, projects, calendar, and email—with AI built in to find answers and automate work. Millions of users, from individuals to large organizations like Toyota, Figma, and OpenAI, love Notion for its flexibility and choose it because it helps them save time and money.
In-person collaboration is essential to Notion's culture. We require all team members to work from our offices on Mondays and Thursdays, our designated Anchor Days. Certain teams or positions may require additional in-office workdays.
Join Notion’s Data Platform team as we scale our infrastructure for enterprise customers. You’ll help design and build the core data platform that powers Notion’s AI, analytics, and search while meeting stringent security, privacy, and compliance requirements. This role focuses on the data platform layer (storage, compute, pipelines, governance) and partners closely with Security, Search Platform, AI, and Data Engineering.
What You'll Do:
Design and evolve the data lakehouse: Build and operate core lakehouse components (e.g., Iceberg/Hudi/Delta tables, catalogs, schema management) that serve as the source of truth for analytics, AI, and search.
Own critical data pipelines and services: Design, implement, and harden batch and streaming pipelines (Spark, Kafka, EMR, etc.) that move and transform data reliably across regions and cells.
Advance EKM and encryption-by-design: Work with Security and platform teams to integrate Enterprise Key Management (EKM) into data workflows, including file- and record-level encryption and safe key handling in Spark and storage systems.
Improve data access, auditability, and residency: Build primitives for fine-grained access control, auditing, and data residency so customers can see who accessed what, where, and under which guarantees.
Drive reliability and observability: Raise the operational bar for our data stack, improving the on-call experience, debugging, and alerting for data jobs and services.
Optimize large-scale performance and cost: Tackle performance and cost challenges across Kafka, Spark, and storage for very large workspaces (20k+ users, multi-cell deployments), including cluster migrations and workload tuning.
Enable ML and search workflows: Build infrastructure to support training and inference pipelines, ranking workflows, and embedding infrastructure on top of the shared data platform.
Shape the platform roadmap: Contribute to design docs and evaluations that influence our long-term platform direction and vendor choices.
Skills You'll Need:
Experience: 5+ years building and operating data platforms or large-scale data infrastructure for SaaS or similar environments.
Programming: Strong skills in at least one of Python, Java, or Scala; comfortable working with SQL for analytics and data modeling.
Distributed data systems: Hands-on experience with Spark or similar distributed processing systems, including debugging and performance tuning.
Streaming & ingestion: Experience with Kafka or equivalent streaming systems; familiarity with CDC/ingestion patterns (e.g., Debezium, Fivetran, custom connectors).
Lakehouse / storage: Experience with data lakes and table formats (Iceberg, Hudi, or Delta) and/or data catalogs and schema evolution.
Security & governance: Practical understanding of access control, encryption at rest/in transit, and auditing as they apply to data platforms.
Cloud infrastructure: Experience with at least one major cloud provider (AWS, GCP, or Azure) and managed data/compute services (e.g., EMR, Dataproc, Kubernetes-based compute).
Operations: Comfortable owning services and pipelines in production, including on-call, incident response, and reliability improvements.
Nice To Haves:
Experience working directly with enterprise customers or on features like data residency, EKM, or compliance-driven auditing.
Prior work on Databricks, Unity Catalog, Lake Formation, or similar catalog/governance systems.
Background implementing multi-region / multi-cell data architectures.
Experience building ML training/eval workflows or model/feature stores on top of a shared data platform.
Familiarity with vector databases or search infrastructure, and how they integrate with upstream data systems.
Experience designing or improving observability for data platforms (e.g., Honeycomb, OpenTelemetry, metrics/trace-heavy debugging).
Our customers come from all walks of life and so do we. We hire great people from a wide variety of backgrounds, not just because it's the right thing to do, but because it makes our company stronger. If you share our values and our enthusiasm for helping people shape the tools that shape their lives, you will find a home at Notion.
Notion is proud to be an equal opportunity employer. We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or other applicable legally protected characteristic. Notion considers qualified applicants with criminal histories, consistent with applicable federal, state and local law. Notion is also committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation made due to a disability, please let your recruiter know.
Notion is committed to providing highly competitive cash compensation, equity, and benefits. The compensation offered for this role will be based on multiple factors such as location, the role’s scope and complexity, and the candidate’s experience and expertise, and may vary from the range provided below. For roles based in San Francisco, the estimated base salary range for this role is $230,000 - $300,000 per year.
By clicking “Submit Application”, I understand and agree that Notion and its affiliates and subsidiaries will collect and process my information in accordance with Notion’s Global Recruiting Privacy Policy.
#LI-Onsite
What We Do
Notion blends your everyday work tools into one. Product roadmap? Company wiki? Meeting notes? With Notion, they're all in one place, and totally customizable to meet the needs of any workflow. It's the all-in-one workspace for you, your team, and your whole company.
Mission: We humans are toolmakers by nature, but most of us can't build or modify the software we use every day — arguably our most powerful tool. Here at Notion, we're on a mission to make it possible for everyone to shape the tools that shape their lives.
Why Work With Us
Here at Notion, our work shapes our culture and our culture inspires our work. We seek to hire creative toolmakers who want to be the best at their craft. If every employee can focus on being the best toolmaker in their craft, we'll be able to achieve our mission of enabling the world to better solve its problems.
Hybrid Workspace
Employees engage in a combination of remote and on-site work.
Employees work in person at our offices on Mondays and Thursdays; the other three days are flexible.