By Stephen Gossett | November 6, 2019
The simple concept of splitting up a task, computationally speaking, is spawning profound changes in drug research, energy exploration, medical imaging and much more.

Take all the help you can get.

If parallel computing has a central tenet, that might be it. Some of the crazy-complex computations asked of today’s hardware are so demanding that the compute burden must be borne by multiple processors, effectively “parallelizing” whatever task is being performed. The result? Slashed latencies and turbocharged completion times.

Perhaps the most notable push toward parallelism happened around 2006, when tech hardware powerhouse Nvidia approached Wen-mei Hwu, a professor of electrical and computer engineering at the University of Illinois Urbana-Champaign. Nvidia was designing graphics processing units (GPUs) — which, thanks to large numbers of threads and cores, had far higher memory bandwidth than traditional central processing units (CPUs) — as a way to process huge numbers of pixels.

Parallel Processing Applications

Parallel processing refers to speeding up a computational task by dividing it into smaller jobs spread across multiple processors. Notable applications for parallel processing (also known as parallel computing) include computational astrophysics, geoprocessing (or seismic surveying), climate modeling, agriculture estimates, financial risk management, video color correction, computational fluid dynamics, medical imaging and drug discovery.
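To make the idea concrete, here is a minimal sketch in Python — purely illustrative, and not drawn from any of the systems described below — that splits one job into chunks and hands each chunk to a separate worker process before combining the partial results. The function and chunking scheme are hypothetical stand-ins for whatever real workload is being parallelized.

```python
# A minimal sketch of the core idea: split one job into smaller chunks
# and hand each chunk to a separate worker process.
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(chunk):
    """Worker task: one small, independent piece of the overall job."""
    lo, hi = chunk
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Divide the range [0, n) into chunks and process them in parallel."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partial_results = pool.map(sum_of_squares, chunks)
    return sum(partial_results)  # combine the partial results

if __name__ == "__main__":
    print(parallel_sum_of_squares(10_000_000))
```

Because each chunk is independent, adding workers shortens the wall-clock time — the same principle that GPUs push to an extreme with thousands of lightweight threads.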

“They asked if I’d be willing to extend this into general parallel computing applications,” Hwu, who’s now considered a godfather of parallel computing, told Built In. 

That effectively sparked the use of GPUs for general-purpose computing — and, eventually, for massively parallel systems as well. Believe it or not, the circuit your computer uses to render fancy graphics for video games and 3D animations is built from the same root architecture as the circuits that make possible accurate climate pattern prediction. Wild, huh? And GPUs’ parallel infrastructure continues to power the most powerful computers.

“If you look at the workhorses for the scientific community today, the new computers, like [IBM supercomputer] Summit, and also the next generation, like Aurora, they're largely based on this model now,” Hwu said.

The model is a workhorse for medical and commercial applications, too, facilitating everything from drug discovery to interstellar simulations to post-production film techniques. 

Here are just a few ways parallel computing is helping improve results and solve the previously unsolvable.

 

Science, Research & Energy

When you tap the Weather Channel app on your phone to check the day’s forecast, thank parallel processing. Not because your phone is running multiple applications — parallel computing shouldn’t be confused with concurrent computing — but because maps of climate and weather patterns require the serious computational heft of parallel processing.

Parallel computing is the backbone of other scientific studies, too, including astrophysics simulations, seismic surveying, quantum chromodynamics and more. Here’s a closer look at a few.

 

Northwestern University

Location: Evanston, Ill.

How it’s using parallel computing: Astronomy moves slowly. It can take millions of years for stars to collide, galaxies to merge or black holes to swallow astronomical objects — which is why astrophysicists must turn to computer simulations to study these kinds of processes. And such complex models demand massive compute power.

A recent breakthrough in the study of black holes, for example, happened courtesy of a parallel supercomputer. Researchers solved a four-decade-old mystery, proving that the innermost region of the matter that orbits, and eventually collapses into, a black hole aligns with the black hole itself. That’s key to helping scientists better understand how this still-mysterious phenomenon behaves.

“These details around the black hole may seem small, but they enormously impact what happens in the galaxy as a whole,” said researcher Alexander Tchekhovskoy of Northwestern University, which partnered with the University of Amsterdam and the University of Oxford on the study. “They control how fast the black holes spin and, as a result, what effect black holes have on their entire galaxies.”

 

DownUnder GeoSolutions

Location: Houston, Texas

How it’s using parallel computing: One of the oil industry’s biggest players lives in suburban Houston and goes by the name Bubba. But Bubba is no black-gold bigwig; it’s a supercomputer (among the fastest on the planet) owned by Australian geoprocessing company DownUnder GeoSolutions.

Seismic data processing has long helped provide a clearer picture of underground strata, an obvious must for industries like oil and gas. Supercomputing, though, is practically de rigueur in energy exploration nowadays — especially as algorithms process massive amounts of data to help drillers work difficult terrain, like salt domes. (French energy titan Total uses the world’s most powerful commercial supercomputer.)

Bubba’s backbone is formed by thousands of Intel Xeon Phi processors cooled in chilled oil baths, a technique that allows for extremely high-performance parallel processing. The hope is that by selling access to that parallel power to third-party companies, fewer energy outfits will feel compelled to build their own, less efficient systems.

 

University of Illinois

Location: Urbana-Champaign, Ill.

How it’s using parallel computing: Every month, the U.S. Department of Agriculture estimates supply and demand figures for a number of major crops. The crucially important forecasts can impact everyone from legislators striving to stabilize markets to farmers who want to manage their finances.

Last year, researchers at U of I’s Department of Natural Resources and Environmental Sciences topped the feds’ industry-standard forecast by incorporating more data (crop growth calculations and seasonal climate information as well as satellite figures), which they then crunched using machine learning algorithms run on the university’s parallel petascale supercomputer, Blue Waters. Their prediction proved more accurate than the official estimate by nearly five bushels per acre.

In 2019, the team turned its predictive eye to Australian wheat yields with similarly impressive results.

 

The Commercial World

Even though parallel computing is often the domain of academic and government research institutions, the commercial world has definitely taken notice. 

“The banking industry, investment industry traders, cryptocurrency — those are the big communities that are using a lot of GPUs for making money,” Hwu said.

Parallel computing also has roots in the entertainment world — no surprise given that GPUs were first designed for heavy graphics loads. It’s also a boon to industries that rely on computational fluid dynamics, a mechanical analysis that has several big commercial applications. Here’s a closer look.

 

Wells Fargo

Location: San Francisco, Calif.

How it’s using parallel computing: Nearly every major aspect of today’s banking, from credit scoring to risk modeling to fraud detection, is GPU-accelerated. In a way, the departure from traditional CPU-powered analysis was inevitable. GPU offloading came of age around 2008, just as lawmakers ushered in several rounds of post-crash financial legislation. “It’s not uncommon now to find a bank with tens of thousands of Tesla GPUs,” Xcelerit co-founder Hicham Lahlou told The Next Platform in 2017. “And this wouldn’t have been the case without that mandatory push from regulation.”
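For a sense of the kind of workload banks offload to GPUs, here is a toy sketch of a Monte Carlo value-at-risk estimate — not Wells Fargo’s or any bank’s actual model. The drift, volatility, position size and confidence level are made-up illustrative inputs, and NumPy stands in for the GPU array libraries production systems would use; because every scenario is independent, the same calculation parallelizes trivially across GPU threads or CPU cores.

```python
# Toy Monte Carlo value-at-risk sketch (illustrative parameters only).
import numpy as np

def simulate_var(position_value, mu=0.0002, sigma=0.02,
                 n_scenarios=1_000_000, confidence=0.99, seed=0):
    """Estimate 1-day value-at-risk by simulating many independent return scenarios."""
    rng = np.random.default_rng(seed)
    # Each scenario is independent, which is what makes this workload
    # a natural fit for massively parallel hardware.
    returns = rng.normal(mu, sigma, n_scenarios)
    pnl = position_value * returns
    # VaR is the loss at the chosen percentile of the P&L distribution.
    return -np.percentile(pnl, (1 - confidence) * 100)

if __name__ == "__main__":
    print(f"99% 1-day VaR: ${simulate_var(10_000_000):,.0f}")
```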

One early adopter was JPMorgan Chase, which announced in 2011 that its switch from CPU-only to GPU-CPU hybrid processing had improved risk calculations at its data centers by 40 percent and netted 80 percent savings. More recently, Wells Fargo used Nvidia’s GPUs for processes as varied as accelerating AI models for liquidity risk and virtualizing its desktop infrastructure.

GPUs also found themselves at the center of the crypto-mining craze, though chip sales have since stabilized after that particular boom and bust.

 

Blackmagic Design

Location: Port Melbourne, Australia

How it’s using parallel computing: If you saw either Brad Pitt’s character working out his intergalactic daddy issues in Ad Astra or John Wick’s latest round of elaborately choreographed assassin dispatch, you also saw the handiwork of parallel processing. Both films were colored using Blackmagic Design’s DaVinci Resolve Studio, one of a handful of Hollywood-standard post-production suites (including Adobe After Effects and Avid Media Composer) that incorporate GPU-accelerated tools. “The high-quality rendering based on what they call the ray-tracing technique are all using some of these processors now,” Hwu said. Color correction and 3D animation, he added, both commonly use GPU parallel processing.

 

Volkswagen

Location: Wolfsburg, Germany

How it’s using parallel computing: When French driver Romain Dumas drove an electric Volkswagen prototype to auto-racing glory last year — smashing the Pikes Peak hill climb record with the first-ever sub-eight-minute run on the legendary course and kicking off a barnstorming tour of more all-time-best finishes — the win was arguably as notable for computing as it was for electric automobiles.

Engineers relied on Ansys Fluent software in at least two key facets: running a virtual simulation of the course, and finding the ideal balance of low weight and minimal aerodynamic drag for the battery cooling system.

Such cooling is one of a number of so-called computational fluid dynamics (CFD) simulations users can run on Ansys, a program that easily supports GPU acceleration. It’s one of the more headline-worthy examples of how high-powered parallel computing has become a go-to for all manner of CFD research, in everything from numerical weather prediction to combustion engine optimization.

 

Medicine & Drug Discovery

Emergent technology is reshaping the medical landscape in countless ways, from virtual reality that ameliorates macular degeneration to developments in bioprinting tissue and organs to the countless ways Amazon is poised to further impact healthcare. Parallel computing has for years made its presence felt in this arena, but it’s poised to fuel even more breakthroughs. Here’s how.

 

Nvidia

Location: Santa Clara, Calif.

How it’s using parallel computing: One of the first industries to see a sea change thanks to parallel processing, particularly the GPU-for-general-computing revolution, was medical imaging. Today, a whole body of medical literature catalogs how GPUs’ computational power and memory bandwidth have led to vast improvements in speed and definition for, well, just about everything: MRI, CT, X-rays, optical tomography and more.

The next great leap in medical imaging will likely be similarly parallel-focused, and parallel pioneer Nvidia is at the forefront. Using the company’s recently released toolkit, radiologists can more easily access the power of AI, which helps imaging systems handle growing amounts of data and computational load. Dubbed Clara, the GPU-leveraging system reportedly lets doctors create imaging models with one-tenth the data otherwise required. Among the institutions that have already signed on are Ohio State University and the National Institutes of Health.

 

Acellera

Location: London

How it’s using parallel computing: If you think of parallel processing as a nesting doll, one of the innermost figures could be a life-saving drug. Parallel architectures are ideal for running molecular dynamics simulations, which have proven highly useful in drug discovery.

Medical research company Acellera has developed multiple programs that harness the powerful offloading infrastructure of GPUs: simulation code ACEMD and Python package HTMD. They’ve been used to perform simulations on some of the world’s most powerful computers, including a Titan run that helped scientists better understand how our neurotransmitters communicate. And Acellera has partnered with the likes of Janssen and Pfizer for pharma research.

Since advanced parallel computing allows for fine-grain study of molecular machinery, it could also have major applications in the study of genetic disease — something researchers are currently looking into.

 

Oak Ridge National Laboratory

Location: Oak Ridge, Tenn.

How it’s using parallel computing: Beyond image rendering and pharmaceutical research, parallel processing’s herculean data-analytics power holds great promise for public health. Take one particularly harrowing epidemic: veteran suicide. Around 20 veterans have died by suicide every day since 2014, according to data from the Department of Veterans Affairs. The issue is one of precious few to garner genuine bipartisan attention.

After the VA developed a model that dove into veterans’ prescription and refill patterns, researchers at Oak Ridge National Laboratory were able to run the algorithm on a high-performance computer 300 times faster than the VA’s own systems could. The hope is to eventually use IBM’s fabled (and GPU-packed) Summit supercomputer to allow real-time risk alerts to be sent to doctors.

“We don’t want veterans to walk into a clinic and get missed because someone hasn’t been specifically trained to recognize these symptoms,” said ORNL researcher Edmon Begoli. “We never want it to be too late to reach someone.”
