AI Cloud Platform Software Engineer - Internship (PEY 2025)

Toronto, ON
In-Office
Artificial Intelligence
The Role

Cerebras has developed a radically new chip and system to dramatically accelerate deep learning applications. Our system runs training and inference workloads orders of magnitude faster than contemporary machines, fundamentally changing the way ML researchers work and pursue AI innovation.

We are innovating at every level of the stack – from chip, to microcode, to power delivery and cooling, to new algorithms and network architectures at the cutting edge of ML research. Our fully-integrated system delivers unprecedented performance because it is built from the ground up for deep learning workloads.

Cerebras is building a team of exceptional people to work together on big problems. Join us!

About the Role

As a software engineer on our AI cloud platform team, you will build and operate the cloud platform that powers AI model training and inference. In this role, you will be responsible for optimizing deployments for minimal latency and efficient load balancing, and for ensuring the reliability, scalability, security, and observability of our distributed training and inference infrastructure. You will define and document production requirements, implement scaling strategies, and ensure robust API server management and high availability.

You will develop tools to map out system bottlenecks and sources of instability, and design and build solutions to address them. 

We're looking for talented software engineers who thrive in ambiguity, view change as an opportunity, and have a voracious desire to learn and share knowledge clearly and concisely. 

Requirements

  • Enrolled in the University of Toronto's PEY program with a degree in Computer Science, Computer Engineering, or other related disciplines.
  • Experience building high-reliability, production-grade cloud services.
  • Exposure to or experience building ML serving and/or training services, preferably for modern generative AI models.
  • Familiarity with the latest AI model architectures and efficient implementation of serving systems for these models.
  • Strong problem-solving skills and excellent communication skills.
  • A real passion for AI!

Cerebras Systems is committed to creating an equal and diverse environment and is proud to be an equal opportunity employer. We celebrate different backgrounds, perspectives, and skills. We believe inclusive teams build better products and companies. We try every day to build a work environment that empowers people to do their best work through continuous learning, growth and support of those around them.



The Company
HQ: Sunnyvale, CA
402 Employees
Year Founded: 2016

What We Do

Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, functional business experts and engineers of all types. We have come together to build a new class of computer to accelerate artificial intelligence work by three orders of magnitude beyond the current state of the art.

The CS-2 is the fastest AI computer in existence. It contains a collection of industry firsts, including the Cerebras Wafer Scale Engine (WSE-2). The WSE-2 is the largest chip ever built: it contains 2.6 trillion transistors and covers 46,225 square millimeters of silicon, while the largest graphics processor on the market has 54 billion transistors and covers 815 square millimeters. In artificial intelligence work, larger chips process information more quickly, producing answers in less time. As a result, neural networks that once took months to train can now train in minutes on the Cerebras CS-2 powered by the WSE-2.

Join us: https://cerebras.net/careers/

