Working Student - Machine Learning

Eindhoven, NLD
Hybrid
Internship
Artificial Intelligence • Cloud • Machine Learning • Mobile • Software • Virtual Reality • App development
We believe the camera presents the greatest opportunity to improve the way people live and communicate.
The Role
Join Snap Inc. as a thesis student exploring efficient on-device ML for AR applications, focusing on event-driven processors. Conduct research, develop models, and produce a thesis report over 8-12 months.
Summary Generated by Built In

Snap Inc. is a technology company. We believe the camera presents the greatest opportunity to improve the way people live and communicate. Snap contributes to human progress by empowering people to express themselves, live in the moment, learn about the world, and have fun together. The Company’s three core products are Snapchat, a visual messaging app that enhances your relationships with friends, family, and the world; Lens Studio, an augmented reality platform that powers AR across Snapchat and other services; and its AR glasses, Spectacles.

The Spectacles team is pushing the boundaries of technology to bring people closer together in the real world. Our fifth-generation Spectacles, powered by Snap OS, showcase how standalone, see-through AR glasses make playing, learning, and working better together.


Snap’s camera supports real friendships through visual communication, self-expression, and storytelling. Moving forward, our camera will play a transformative role in how people experience the world around them, combining what they see in the real world with all that’s available to them in the digital world.


We are looking for a Machine Learning / Software Engineering thesis student to join our team at Snap Inc.!


What you’ll do

Project background & research context: 

Conventional frame-based pipelines and large neural networks are often too slow and power-hungry for always-on, real-time AR. They process every frame exhaustively, move large amounts of data through memory, and quickly hit strict latency, energy, and bandwidth limits on embedded hardware. Event-based sensing and processing, combined with other efficiency-oriented techniques, open up a fundamentally different design space. By exploiting temporal and spatial sparsity, we can:

  • Turn always-on perception into something that fits within strict power budgets

  • Push more intelligence closer to the sensor, reducing latency and data movement

  • Co-design models and systems that are built for edge hardware, rather than shrinking down server-scale architectures
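
To make the "temporal and spatial sparsity" idea concrete, here is a minimal, illustrative sketch (not part of the project description; the function name and binning scheme are hypothetical) of a common way to feed an event camera's sparse `(t, x, y, polarity)` stream into a conventional deep learning model: accumulating events into a time-binned voxel grid. Only pixels that actually fired contribute, which is what makes event-based pipelines cheap when the scene is mostly static.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate a stream of (t, x, y, polarity) events into a
    time-binned voxel grid, a common dense input representation
    for event-based vision models.

    events: float array of shape (N, 4) with columns (t, x, y, polarity).
    Returns a float32 array of shape (num_bins, height, width).
    """
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(events) == 0:
        return grid

    t = events[:, 0]
    # Normalize timestamps into [0, num_bins) and assign each event a bin.
    span = max(t.max() - t.min(), 1e-9)
    bins = ((t - t.min()) / span * (num_bins - 1e-6)).astype(np.int64)

    xs = events[:, 1].astype(np.int64)
    ys = events[:, 2].astype(np.int64)
    pol = np.where(events[:, 3] > 0, 1.0, -1.0)

    # Unbuffered scatter-add: repeated (bin, y, x) indices accumulate.
    np.add.at(grid, (bins, ys, xs), pol)
    return grid
```

In practice such a grid would be the input tensor to a lightweight CNN or transformer; richer representations (per-bin timestamps, polarity channels) are common variants.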


In this project, you will explore how to combine modern deep learning with event-based and embedded processors to push the limits of what AR glasses can do on-device. You will help answer questions such as:

  • How can we architect models that are both accurate and ultra-efficient for real-world AR tasks on event-driven or low-power hardware?

  • What are the right trade-offs between accuracy, latency, memory, and energy for different AR scenarios?

  • How do we turn promising research ideas into practical, measurable improvements on realistic platforms and workloads?


Your work will directly inform how future AR experiences can run locally, responsively, and efficiently on next-generation devices.


As a thesis student, you will define and drive a focused research direction in efficient on-device ML for AR, with a particular emphasis on event-driven or embedded processors. Possible directions within this space include:

  • Design and prototype ML models tailored to AR use cases under embedded constraints (e.g., event-based vision models, lightweight CNNs/Vision Transformers, or hybrid frame+event pipelines).

  • Set up datasets and baselines relevant to AR tasks (e.g., detection, tracking, segmentation, gesture/interaction), and define evaluation metrics across accuracy, latency, memory usage, and energy.

  • Implement and train models in PyTorch, including data pipelines, training loops, and evaluation scripts that are easy to extend and reproduce.

  • Explore efficiency techniques such as sparsity, pruning, quantization (PTQ/QAT), or event-based representations, and study their impact on performance–efficiency trade-offs.

  • Profile models under embedded-like conditions using simulators, profiling tools, or edge accelerators to understand system-level behavior (e.g., FLOPs, latency, memory footprint, bandwidth).

  • Communicate your findings through ablation studies, a clear thesis report (and optionally a paper-style write-up), and a reproducible codebase with pre-trained checkpoints.
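
As a flavor of the efficiency techniques listed above, here is a small, self-contained sketch of unstructured magnitude pruning and symmetric int8 post-training quantization, written in plain NumPy so it stands alone. This is an illustrative toy, not the project's actual method; function names and the per-tensor scale scheme are assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights
    (unstructured magnitude pruning). Ties at the threshold may
    prune slightly more than the requested fraction."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

def quantize_int8(weights):
    """Symmetric per-tensor post-training quantization to int8.
    Returns the quantized tensor and the scale needed to dequantize."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale
```

A thesis in this space would study how such transforms shift the accuracy/latency/energy trade-off, typically using framework-native tooling (e.g., PyTorch's pruning and quantization utilities) rather than hand-rolled versions like these.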

Expected Outcomes

By the end of the project, you are expected to:

  • Demonstrate proof-of-concepts on AR hardware (e.g., Spectacles) showcasing real-world impact

  • Deliver measurable improvements in runtime performance, efficiency, and adaptability for representative AR tasks

  • Provide insights into model–system co-design for low-power, on-device ML

  • Contribute to ML frameworks, tooling, or deployment strategies for embedded AR systems

  • Produce a high-quality thesis report (and optionally a paper-style write-up) with reproducible code and results


Minimum qualifications

  • Currently enrolled in a Master’s program (e.g., Computer Science, Electrical/Computer Engineering, Artificial Intelligence, Robotics, or a related field).

  • Your degree program allows a Master’s thesis / graduation project in collaboration with an external organization.

  • Strong background in:

    • Linear algebra, probability, and optimization

    • Deep learning fundamentals, including backpropagation, regularization, and basic model architectures

  • Hands-on experience training deep learning models for computer vision, including:

    • Experience with PyTorch (preferred) or a similar framework

    • Comfort implementing and training CNNs and/or vision transformers

  • Proficiency in Python and standard ML tooling (e.g., NumPy, PyTorch, Git, basic experiment management).

  • Interest in turning research ideas into robust, reproducible codebases that others can build on.

Preferred qualifications

  • Experience with one or more of:

    • Event-based or streaming vision, or other non-conventional sensor modalities

    • Model compression techniques: pruning, sparsity, quantization, or knowledge distillation

    • Efficient architectures for embedded or real-time applications (e.g., lightweight backbones, dynamic computation, conditional execution)

  • Familiarity with embedded / on-device ML toolchains (e.g., TensorFlow Lite, ONNX Runtime, or similar frameworks).

  • Experience with AI-assisted development and research tools (e.g., experiment tracking, ML tooling, or LLM-based coding and analysis assistants).

  • Exposure to performance profiling and basic systems concepts: FLOPs, latency, memory access patterns, and bandwidth.
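
To illustrate the systems concepts named above, here is a minimal sketch of two basic profiling primitives: an analytical FLOP count for a convolution layer and a median wall-clock latency measurement. Both are standard back-of-the-envelope tools rather than anything specific to this project; the function names are illustrative.

```python
import time
import statistics

def conv2d_flops(c_in, c_out, kernel, h_out, w_out):
    """Analytical FLOPs for a dense 2D convolution:
    one multiply and one add per MAC, per output element."""
    macs_per_output = c_in * kernel * kernel
    return 2 * macs_per_output * c_out * h_out * w_out

def profile_latency(fn, warmup=5, iters=50):
    """Median wall-clock latency of fn() in milliseconds.
    Warmup runs exclude one-time costs (caches, JIT, allocation);
    the median resists scheduler noise better than the mean."""
    for _ in range(warmup):
        fn()
    samples = []
    for _ in range(iters):
        t0 = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - t0) * 1e3)
    return statistics.median(samples)
```

On real embedded targets, wall-clock timing on a host is only a proxy; hardware counters, simulators, or on-device power measurement give the numbers that actually matter for the trade-off studies described in this posting.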

Practical details

  • Project type: Master’s Thesis / Graduation Project

  • Focus: Efficient on-device ML for AR applications on embedded and/or event-driven processors

  • Duration & scope: Minimum of 8 months, up to 12 months, aligned with university and team requirements

  • Location: Eindhoven, the Netherlands, with a minimum of 4 days per week in the office

  • Start date: Flexible, to be agreed based on candidate and university timelines

If you have a disability or special need that requires accommodation, please don’t hesitate to let us know.

"Default Together" Policy at Snap: At Snap Inc. we believe that being together in person helps us build our culture faster, reinforce our values, and serve our community, customers and partners better through dynamic collaboration. To reflect this, we practice a “default together” approach and expect our team members to work in an office 4+ days per week. 

At Snap, we believe that having a team of diverse backgrounds and voices working together will enable us to create innovative products that improve the way people live and communicate. Snap is proud to be an equal opportunity employer, and committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification, in accordance with applicable federal, state, and local laws. EOE, including disability/vets.

Our Benefits: Snap Inc. is its own community, so we’ve got your back! We do our best to make sure you and your loved ones have everything you need to be happy and healthy, on your own terms. Our benefits are built around your needs and include paid parental leave, comprehensive medical coverage, emotional and mental health support programs, and compensation packages that let you share in Snap’s long-term success!

Top Skills

NumPy
ONNX Runtime
Python
PyTorch
TensorFlow Lite
