Research Engineer, Perception

Posted 6 Days Ago
El Segundo, CA
In-Office
$75K-$300K Annually
Mid level
Aerospace • Hardware
Our mission is to make life galactic through exponential growth.

The Mission

At GRAM, we believe the next great leap for humanity will be physical, not digital. What AGI is to bits, Self-Replication (SR) is to atoms. As an SR research and deployment company, our mission is to make humanity galactic by enabling physical exponential growth. To achieve this, we are conducting foundational research across general-purpose robotics, artificial intelligence, biology, and materials science to enable SR within our lifetimes.

Our first step is to challenge the consensus that robots should look like us. We believe true progress comes from optimizing for function over form. We are creating a new embodiment for intelligence, designed from first principles to build the future physical economy, on- and off-world.

The Role

A machine's body is only as capable as the mind that drives it. This role is for the engineers who breathe life into the hardware, teaching our Insectoid robot how to move, perceive, and act in the physical world. You will be responsible for creating the algorithms and software that form the robot's intelligence, from low-level motor skills to high-level reasoning.

This is an opportunity to solve foundational problems in AI and robotics on a novel morphology that has no precedent. You will work at the frontier of control theory, reinforcement learning, and computer vision to give our robot the ability to navigate and interact with complex, unstructured environments, creating the intelligent foundation for its ultimate mission.

Key Responsibilities:
  • Architect and build the complete perception stack for the Insectoid platform, responsible for turning raw sensor data into a coherent understanding of the world.
  • Develop robust algorithms for real-time state estimation and sensor fusion, combining data from IMUs, cameras, joint encoders, and other sensors to precisely track the robot's position and orientation.
  • Implement systems for 3D scene understanding, including object detection, semantic segmentation, and geometric reconstruction of the robot's immediate surroundings.
  • Design and manage the sensor calibration pipeline, working closely with the hardware team to ensure the highest possible data quality.
  • Provide clean, reliable, and low-latency perception outputs (e.g., world state, obstacle maps) that serve as the single source of truth for our autonomy, manipulation, and world model teams.
About You

You are a first-principles thinker who is energized by solving problems that have no playbook. You are drawn to ambitious, long-term missions and want to build technology that fundamentally changes our physical capabilities. You thrive in a high-agency environment where you are given the ownership and resources to tackle core challenges.

Basic Qualifications:
  • MS in Computer Science, Robotics, or a related field with 3+ years of experience building perception systems for robots or autonomous vehicles.
  • Expert proficiency in C++ for real-time applications.
  • A strong foundation in 3D geometry, linear algebra, and probabilistic filtering techniques (e.g., Kalman Filters, Particle Filters).
  • Experience with computer vision libraries (e.g., OpenCV) and robotics frameworks (e.g., ROS).
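As a flavor of the probabilistic filtering these qualifications refer to, here is a minimal sketch of a one-dimensional Kalman filter fusing noisy measurements of a constant value. All names and noise parameters are illustrative, not taken from the posting; a real state estimator for the platform would fuse multi-dimensional IMU, camera, and encoder data.

```python
import numpy as np

def kalman_update_1d(x, P, z, R, Q=1e-4):
    """One predict/update cycle of a 1D Kalman filter with a static state model."""
    # Predict: state is assumed unchanged; uncertainty grows by process noise Q.
    P = P + Q
    # Update: blend the prediction with measurement z (measurement variance R).
    K = P / (P + R)          # Kalman gain
    x = x + K * (z - x)      # corrected estimate
    P = (1 - K) * P          # corrected variance
    return x, P

# Fuse 200 noisy scalar measurements of a true value of 1.0.
rng = np.random.default_rng(0)
x, P = 0.0, 1.0              # initial guess and variance
for _ in range(200):
    z = 1.0 + rng.normal(0.0, 0.1)
    x, P = kalman_update_1d(x, P, z, R=0.01)
```

After the loop, the estimate converges near the true value and the posterior variance shrinks well below both the initial uncertainty and the per-measurement noise, which is the core behavior sensor fusion builds on.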
Preferred Qualifications:
  • A PhD with a focus on robotics perception, state estimation, or computer vision.
  • A track record of publications in top-tier robotics or computer vision conferences (e.g., ICRA, IROS, RSS, CVPR).
  • Hands-on experience deploying perception stacks on real-world hardware.
  • Expertise in SLAM (Simultaneous Localization and Mapping) or visual-inertial odometry.
  • Familiarity with a variety of sensor modalities, including cameras, LiDAR, IMUs, and radar.

Location

This is an on-site role at our research lab in El Segundo, California. We offer significant relocation assistance for exceptional candidates.

Interview Process

After submitting your application, we review your portfolio and any exceptional work you've shipped. If your application demonstrates the caliber we seek, you'll enter our interview process, which is designed for speed and substance. We aim to complete it within one week from start to finish.

Compensation

Your total compensation reflects both the significance of early-stage equity and competitive market rates. The following is the general framework for all roles at GRAM.
  • Company-Wide Base Salary Range: $75,000 - $300,000 USD, calibrated to your impact potential
  • Equity: Substantial ownership stake befitting founding team members
  • Benefits: Health, dental, and vision coverage; all meals are paid for; relocation assistance

GRAM is an equal opportunity employer. We evaluate solely on capability and drive.

Why Join?

We are an early, focused team of scientists and engineers building the system that builds itself. GRAM is funded by leading investors and long-term visionaries. We operate without bureaucracy. Ownership and leadership flow to those who demonstrate exceptional capability, and every team member works directly on our core technology. We are based in El Segundo, the heart of America's industrial future, focused on solving one of the most important problems of our time.

Top Skills

C++
OpenCV
ROS

The Company
4 Employees
Year Founded: 2025

What We Do

Humanity's birthright is to become galactic. GRAM was founded to serve this mission. We are a foundational AI and robotics startup building the machines and intelligence needed to solve the physical-labor bottleneck that prevents space industrialization.

Our first product is a general-purpose insectoid designed to operate in vast swarms in extreme environments. Over the coming century, our systems will build the critical infrastructure needed for humanity's expansion across the Milky Way.

