Foundation models have transformed text and images, but structured data - the largest and most consequential data modality in the world - has remained untouched. Tables power every clinical trial, every financial model, every scientific experiment, every business decision. No one has built a foundation model that truly understands them.
Until now. What LLMs did for language, we're doing for tables. The next modality shift in AI is happening - and we're hiring the team that makes it.
Momentum: We pioneered tabular foundation models and are now the world-leading organization in structured data ML. Our TabPFN v2 model was published in Nature and set a new state-of-the-art for tabular machine learning. Since its release, we've scaled model capabilities more than 20x, reached 3M+ downloads, 6,000+ GitHub stars, and are seeing accelerating adoption across research and industry - from detecting lung disease with Oxford Cancer Analytics to preventing train failures with Hitachi to improving clinical trial decisions with BostonGene.
The hardest work is in front of us. We're scaling tabular foundation models to handle millions of rows, thousands of features, real-time inference, and entirely new data modalities - while building the infrastructure to deploy them in production across some of the most demanding industries on earth. These are open problems no one else is working on at this level.
Our team: We’re a small, highly selective team of 20+ engineers, researchers and GTM specialists, selected from over 5,000 applicants, with backgrounds spanning Google, Apple, Amazon, Microsoft, G-Research, Jane Street, Goldman Sachs, and CERN, led by Frank Hutter, Noah Hollmann and Sauraj Gambhir and advised by world-leading AI researchers such as Bernhard Schölkopf and Turing Award winner Yann LeCun. We ship fast, create top-tier research, and hold each other to an extremely high bar.
What’s Next: In 2025, we raised a €9m pre-seed led by Balderton Capital, backed by leaders from Hugging Face, DeepMind, and Black Forest Labs. The next phase of growth is here, making this an optimal time to join.
Core Areas of Impact: You'll be among the first scientists working on an entirely new class of AI models, not just incremental improvements. As an early-stage startup building foundation models for tabular data, we have countless exciting research ideas and problems to explore - you're sure to find challenges that match your interests and expertise. We are working on problems such as:
Scaling our transformer architectures from 10K to 1M+ samples while maintaining performance
Building multimodal models that combine text and tabular understanding on proprietary data
Developing specialized architectures for time series, forecasting, and anomaly detection
Creating efficient inference methods for production deployment
Researching causal understanding in foundation models
Designing novel approaches for handling multiple related tables
What We're Looking For:
Currently pursuing or holding a PhD in Computer Science, Applied Mathematics, Statistics, Electrical Engineering, or a related field (we will also consider exceptional Master's students)
Deep experience with ML frameworks, especially PyTorch and scikit-learn
Strong engineering fundamentals with excellent Python expertise
Experience in data science and working with tabular data or time series
Publications at top-tier venues (NeurIPS, ICML, ICLR) or significant open-source contributions
What We Offer:
Strong mentorship and professional development opportunities
Work with state-of-the-art ML architecture, substantial compute resources, and a world-class team
Comprehensive benefits including healthcare, transportation, and fitness
Competitive compensation package with meaningful equity (We compete with the world's biggest AI companies for talent)
Annual company-wide offsites to bring the team together (last trip was to the Alps 🏔️)
Comprehensive benefits tailored to the location you join us from
Support with relocation where needed
We believe the best products and teams come from a wide range of perspectives, experiences, and backgrounds. That’s why we welcome applications from people of all identities and walks of life, especially anyone who’s ever felt discouraged by "not checking every box."
We’re committed to creating a safe, inclusive environment and providing equal opportunities regardless of gender, sexual orientation, origin, disabilities, or any other traits that make you who you are.
What We Do
Prior Labs is building breakthrough foundation models that understand spreadsheets and databases - the lifeblood of science and business. While foundation models have transformed text and images, tabular data has remained largely untouched. We're tackling this opportunity to revolutionize how we approach scientific discovery, medical research, financial modeling, and business intelligence. Backed by Balderton Capital, XTX Ventures, SAP founder Hans-Werner Hector's Hector Foundation, Atlantic Labs, Galion.exe and top AI leaders such as Peter Sarlin, Guy Podjarny, Thomas Wolf, Ed Grefenstette, Robin Rombach, Christopher Lynch and Ash Kulkarni.