Exascale computing is a type of high-performance supercomputing that can perform at least one exaflop, or one quintillion calculations per second. In terms of processing power, exascale computers are the most powerful machines in the world.
What Is Exascale Computing?
Exascale computing refers to massive, high-performance machines that analyze large data volumes at a rate of one quintillion calculations per second. It represents the fastest class of computing in the world.
Using advanced modeling and simulations, these hefty hardware systems crunch vast data volumes to provide predictive analysis about national security, precision medicine and climate research. This unlocks “new competition in the realms of science and computer engineering,” Lucas Ochoa, an ex-Google artificial intelligence engineer and former Microsoft product designer, told Built In.
Exascale computing refers to computing systems capable of performing one quintillion calculations per second, known as an exaflop. That’s 18 zeros, or a billion billion calculations. These systems run on highly parallel software algorithms and require robust hardware architecture — made up of thousands of CPUs, GPUs and nodes — that’s so massive it must be stored in warehouse-sized buildings.
Take, for example, Frontier, the first supercomputer to break the exascale barrier, which beat out Fujitsu’s Fugaku supercomputer as the fastest in the world. Operating out of Tennessee’s Oak Ridge National Laboratory, the 7,300-square-foot system debuted in 2022 and is made up of 74 cabinets with more than 9,400 nodes, roughly 10,000 CPUs and 38,000 GPUs. It has a peak performance of 1.6 exaflops.
This extraordinary processing power allows exascale computers to analyze massive amounts of data at rapid speeds and simulate real-world phenomena in fine detail. A single exascale machine could outperform the combined effort of today’s top 100 petascale supercomputers.
What Is Exascale Computing Used For?
Exascale computers are multi-million-dollar machines owned by government entities or large conglomerates, so scientists typically must apply for grant-funded access to these systems. Once they do, researchers can use exascale computing for a range of applications:
- Cybersecurity: With exascale computing’s ability to rapidly execute billions of calculations, researchers can anticipate, detect and handle many types of cyber attacks and malicious actors.
- Greentech: Researchers can use exascale computing-powered analyses to create sustainable materials that can handle challenging conditions and produce crops that can adapt to different stressors.
- Energy and national security: Exascale computing makes it possible to model nuclear weapons’ performance and gather data on whether they’re safe, all without conducting an actual nuclear test.
- Manufacturing: The massive computational capacity of exascale computers allows researchers to more accurately model and simulate manufacturing materials, resulting in faster additive manufacturing processes.
- Healthcare: Exascale computing can work on the nanoscale to help researchers understand how various drug molecules interact with the human body and design more effective cancer treatments.
- Aerospace: Scientists can use exascale computing to develop detailed simulations of the universe, making it possible to unlock some of the universe’s biggest mysteries (like how stars explode).
Exascale computing delivers a gateway to previously unattainable knowledge with better, faster science at scale. George Amvrosiadis, an associate research professor of electrical and computer engineering at Carnegie Mellon University and a member of the Parallel Data Lab, told Built In that exascale computing “allows us to execute complex simulations and analyses that would be too costly, dangerous or even impossible to conduct in real life.”
How Does Exascale Computing Work?
Think of exascale computers as a bunch of smaller networked computers working as one. Each of these smaller computers, known as nodes, shares access to rapid data storage and comes equipped with its own set of processors that solve complex problems by breaking them down into smaller parts. So when a task is submitted to an exascale system, it’s broken up into smaller pieces that are then distributed among thousands, or even millions, of processors working concurrently in a process known as parallel computing.
For exascale computing to work, each processor, either a CPU or GPU, is assigned a specific job to complete. The processors then communicate with one another over a high-speed interconnect, ensuring that they’re all working toward the same goal. To prevent disruptions, exascale computers have built-in fail-safes that handle errors and keep the system running smoothly even if some parts fail.
Parallel processing enables intricate, large-scale systems to balance their computational power for efficient performance and optimal energy usage.
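As a rough illustration of this divide-and-distribute pattern, here is a minimal sketch in Python. It is a toy, assuming a single machine and the standard multiprocessing module rather than a real exascale interconnect, and a simple summation standing in for a scientific workload: the job is split into chunks, each chunk is handed to a separate worker process, and the partial results are combined at the end.

```python
# Toy sketch of parallel decomposition: split one big task into chunks,
# let independent worker processes compute partial results concurrently,
# then combine the pieces. Real exascale systems apply the same pattern
# across thousands of networked nodes; here we only use local processes.
from multiprocessing import Pool

def partial_sum(chunk):
    """Each worker handles one piece of the overall problem."""
    start, stop = chunk
    return sum(i * i for i in range(start, stop))

if __name__ == "__main__":
    n = 10_000_000          # total size of the "problem"
    workers = 8             # stand-in for thousands of nodes
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)   # make sure the last chunk reaches n

    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)   # distribute the pieces

    total = sum(partials)   # combine the partial results ("reduce" step)
    print(total)
```

On a real system the distribution and combination steps travel over the high-speed interconnect described above, with fault-tolerance layers stepping in when individual nodes fail.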
Benefits of Exascale Computing
Expanded Computational Power
By definition, exascale computers are built with unprecedented power. Frontier, the machine that first broke the exascale barrier, debuted as more powerful than the next seven fastest supercomputers combined, with a peak performance of 1.6 exaflops and a theoretical ceiling approaching 2 exaflops. These machines will reign supreme until the next generation of zettascale technology takes over, potentially within a decade.
Enhanced Performance
Exascale computing systems are built with size in mind, not just efficiency. Using parallelism, exascale computers split one task across thousands or millions of processing units that concurrently work on different parts of a problem. This method optimizes the use of computational resources for faster task execution.
Ochoa, who is also the CEO and founder of AI startup Automat, said that exascale performance makes it possible to build AI models that are about four-and-a-half times faster and eight times larger. This capacity “allows for training on more extensive data sets, enhancing predictability and accelerating the time-to-discovery process,” he added.
Improved Speed
In raw number-crunching, exascale computers outpace a human performing one calculation per second by a factor of one quintillion. Since Frontier’s debut in 2022, exascale computers have set a new pace for computational speed at one exaflop, a thousandfold jump over the petaflop threshold that defines petascale computing.
“If every person on the planet performed one calculation per second, it would still take over four years to accomplish what an exascale computer can do in a single second,” Ochoa said.
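Ochoa’s comparison holds up as back-of-the-envelope arithmetic. A minimal check, assuming a world population of roughly 7.8 billion people, each doing one calculation per second:

```python
# Back-of-the-envelope check of the "every person on the planet" comparison.
# The population figure is an assumption (~7.8 billion people).
exaflop = 1e18                 # calculations per second for an exascale machine
population = 7.8e9             # people, each doing one calculation per second
seconds_needed = exaflop / population
years_needed = seconds_needed / (365 * 24 * 3600)
print(f"{years_needed:.1f} years")   # roughly 4.1 years
```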
Increased Scientific Discovery
With the ability to create better simulations and predictive analysis at unimaginable speeds and scale, exascale computers facilitate groundbreaking discoveries by providing valuable insights that are otherwise impossible to achieve. By modeling natural processes and unobservable phenomena, scientists and researchers get one step closer to cracking the codes of fundamental principles that shape the world around us.
Ability to Handle Complex Challenges
Exascale computing empowers researchers to address some of society’s biggest challenges, such as climate change, disease prevention, renewable energy optimization and urban planning. It provides the computational resources needed to model, simulate and optimize complex systems.
Challenges of Exascale Computing
Demanding Power Consumption
Due to their grand scale and high-performance parts, exascale systems consume an enormous amount of power. The world’s fastest supercomputer draws roughly 20 megawatts per exaflop, about 22.7 megawatts in total. That’s enough to power a small American town, or about as much as a jet engine puts out.
Greater Chances for Errors to Occur
Exascale systems consist of many moving parts working independently in parallel, yet as one. In the interest of peak performance, minimizing data movement and streamlining communication between networked nodes and processors is crucial. Given their sheer size, these larger-than-life systems won’t perform without efficient data transfer, low latency and high bandwidth.
“Exascale systems will deploy so many components that it’s improbable for the entire system to operate without issues at any given time,” Ochoa said. “The hardware, system software and applications must be capable of handling both minor malfunctions and major failures.”
Higher Costs
Building and operating exascale computing facilities comes with a hefty price tag, which is why they are largely owned by government entities or conglomerates. On top of ongoing operational costs, they demand specialized hardware, infrastructure and energy. The initial investment alone puts exascale computing out of reach for most organizations: it took $600 million to physically build Frontier, plus an additional $100 million per year to run the machine. Frontier is just one of three exascale projects announced by the Department of Energy, collectively budgeted at $1.8 billion.
Heightened Security Risks
To a cybercriminal, each node of a networked system is a window inside. So while the thousands of interconnected nodes that make up an exascale computer increase its computational performance, they also create a sizable attack surface, increasing vulnerability to security threats such as cyberattacks, data breaches and unauthorized access.
Exascale Computing vs. Quantum Computing
Exascale computers are essentially classical computers with extremely powerful hardware, while quantum computers are entirely different computers that rely on quantum mechanics to function.
Exascale Computing
You can think of exascale computing as classical computing at scale. It uses the same kind of code that runs our smartphones, laptops and PCs, then scales it out across a massive amount of hardware.
“Our everyday computers cannot solve large-scale scientific problems that involve immense amounts of data or must be solved before a tight deadline,” Omer Subasi, a computer scientist at Pacific Northwest National Laboratory, told Built In. “This is why we need exascale supercomputers.”
Classical computation uses binary logic, where units of information are represented as “bits” that can be in one of two states: 1 (true) or 0 (false).
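As a minimal illustration of that binary representation (plain Python, nothing exascale-specific), every value a classical machine works with ultimately reduces to a string of these two-state bits:

```python
# A classical machine stores everything as bits, each either 0 or 1.
value = 42
bits = format(value, "08b")      # the same number as eight binary digits
print(bits)                      # -> 00101010
print(int(bits, 2) == value)     # -> True: the bit string encodes the value exactly
```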
Quantum Computing
Quantum computing is a fundamentally different style of computation that leverages quantum mechanics and is capable of simulating atomic and subatomic phenomena. Quantum computers use “qubits,” which can exist in a superposition of states rather than a single 0 or 1. So while a classical machine has to work through one path at a time, a quantum computer effectively has a larger workspace in which to explore multiple paths simultaneously.
Quantum computers are expected to surpass any conventional binary machine in complexity and speed for certain problems. They also differ from traditional supercomputers in that they tackle different types of questions, specific to fields like chemistry, materials science and physics, Subasi explained.
Frequently Asked Questions
What is exascale used for?
Exascale computing processes massive volumes of data at extremely high speeds to produce predictive analyses and better simulations, which are used to facilitate scientific breakthroughs and tackle some of society’s most pressing issues. These insights can be applied to create more accurate disease models for precision medicine, bolster national security, discover alternative forms of energy production and mitigate climate change.
Do exascale computers exist?
Yes. The world’s first exascale computer, Frontier, debuted in 2022. Operating out of Tennessee’s Oak Ridge National Laboratory, the supercomputer has a peak performance of 1.6 exaflops and a theoretical maximum of 2 exaflops.
How fast is exascale computing?
Exascale computing exceeds one exaflop, meaning that these supercomputers perform one quintillion (or one billion billion) calculations per second.
Why do we need exascale computing?
Exascale computing is essential for conducting more advanced experiments, analyses and simulations that require tremendous amounts of data. This improved research process can be used to explore the nature of stars, enhance cancer treatments and design more sustainable materials, among other use cases.
How much does an exascale computer cost?
As of now, the Department of Energy has budgeted $1.8 billion to complete three exascale computing projects. So far, the exascale computer Frontier has cost $600 million to build and $100 million per year to operate.