A few years ago, scientists at Argonne National Laboratory outside Chicago concluded that it would take only a couple of months for zombies to overrun the city and wipe out its population.
“No part of the city would be spared,” Chick Macal, an Argonne senior systems engineer, told the Chicago Tribune in 2016.
Fortunately, as he recently reassured Built In, we now have “the knowledge to develop an actionable program to train the population to both better defend themselves against zombies and also take offensive actions that are the most effective.”
Zombies aren’t real, of course, but merely an entertaining explanatory device that Macal and his co-researchers employed to predict how more plausible infectious diseases might spread, and to determine the most effective methods of intervention and policy action. Their research relied on what’s called agent-based computer modeling and simulation. Along with its equation-based cousin, the method (not to be confused with 3D visualization) has for decades allowed researchers in all types of academic disciplines and commercial industries to figure out how things (equipment, viruses, etc.) would function or act in certain environments without having to physically replicate those conditions. In the case of Macal and his cohorts, that means no humans, living or undead, were harmed in the course of their work. Again, phew.
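The agent-based approach is easier to see in miniature. The toy below gives every member of a synthetic population a simple state and a few behavioral rules, then lets an outbreak play out; the population size, contact rate and recovery time are invented for illustration, and this is a sketch of the technique, not the Argonne team's model.

```python
import random

# A minimal agent-based infection model: each "agent" is susceptible
# ('S'), infected ('I') or recovered ('R'). All the numbers below are
# arbitrary assumptions chosen to make the toy run, not real parameters.

random.seed(42)

POPULATION = 1000
CONTACTS_PER_DAY = 5     # how many other agents each infected agent meets daily
INFECTION_PROB = 0.05    # chance a contact transmits the infection
RECOVERY_DAYS = 10       # infected agents recover after this many days

agents = [{"state": "S", "days_infected": 0} for _ in range(POPULATION)]
for agent in random.sample(agents, 5):   # seed the outbreak with 5 cases
    agent["state"] = "I"

def step(agents):
    """Advance the simulation by one day."""
    infected = [a for a in agents if a["state"] == "I"]
    for a in infected:
        # each infected agent contacts a few random others
        for other in random.sample(agents, CONTACTS_PER_DAY):
            if other["state"] == "S" and random.random() < INFECTION_PROB:
                other["state"] = "I"
        a["days_infected"] += 1
        if a["days_infected"] >= RECOVERY_DAYS:
            a["state"] = "R"

for day in range(60):
    step(agents)

counts = {s: sum(a["state"] == s for a in agents) for s in "SIR"}
print(counts)
```

The appeal of the method is visible even here: swap in a different contact rate or recovery rule and you have a new "what if" experiment that costs nothing to run.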
Macal’s colleague, computational scientist Jonathan Ozik, described this part of their work as the “computational discovery of effective interventions,” an approach especially well suited to studying how interventions play out within a particular population. An added benefit, he said, is that “we can do these experiments without worrying about the cost of experiments or even ethical and privacy considerations,” because the populations they study are synthetic — mathematical representations and not the real thing.
What is a Computer Simulation?
Still not clear on what simulation does or is? Let’s let Encyclopedia Britannica take a crack at it, with some added italics for emphasis: Computer simulation, the venerable knowledge repository tells us, involves “the use of a computer to represent the dynamic responses of one system by the behavior of another system modeled after it. A simulation uses a mathematical description, or model, of a real system in the form of a computer program. This model is composed of equations that duplicate the functional relationships within the real system. When the program is run, the resulting mathematical dynamics form an analog of the behavior of the real system, with the results presented in the form of data.”
Better? Here’s hoping.
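If the definition still feels abstract, here it is in a dozen lines: a program whose equations stand in for a physical system. This toy uses Newton's law of cooling, dT/dt = -k(T - T_ambient), stepped forward with Euler's method; the starting temperature and cooling constant are made-up values for illustration.

```python
# An equation-based simulation in miniature: the model is one equation,
# dT/dt = -k * (T - T_ambient), and running the program produces an
# analog of the real system's behavior (a hot drink cooling to room
# temperature). Constants are invented for illustration.

def simulate_cooling(t_start=90.0, t_ambient=20.0, k=0.1,
                     dt=0.5, steps=120):
    """Step the temperature forward in time with Euler's method."""
    temps = [t_start]
    for _ in range(steps):
        t = temps[-1]
        temps.append(t + dt * (-k * (t - t_ambient)))
    return temps

temps = simulate_cooling()
print(round(temps[0], 1), round(temps[-1], 1))
```

Run it and the simulated temperature decays toward the ambient 20 degrees, exactly the "resulting mathematical dynamics" the Britannica definition describes, "presented in the form of data."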
5 Quick Examples of Computer Simulators at Work
1. Responding To Deadly Pandemics
Along with Ozik and their fellow researcher Nick Collier, Macal also worked on a modeling and simulation project that determined what might happen if the deadly Ebola virus (which spread through West Africa from 2013 to 2016, with devastating effects) were to reach the U.S. population. Part of that process involved visiting Chicago hospitals to learn about Ebola-related procedures, then incorporating those procedures into their models (a.k.a. mathematical descriptions).
2. Improving Cancer Treatment
Other Argonne scientists have used modeling and simulation to improve cancer treatment through predictive medicine — finding out how various patients and tumors respond to different drugs. And those are just a couple of examples. Whether in academic science or industry, computer simulation is everywhere these days.
“If it’s too large-scale, too expensive or too risky to work with the real system itself — that’s why we use computer simulation.”
“If it’s too large-scale, too expensive or too risky to work with the real system itself — that’s why we use computer simulation,” said Barry Nelson, a professor of engineering at Northwestern University in Evanston, Ill. “Simulation allows you to create data or systems that are conceptual, that people want to build, want to consider or want to change. I sometimes say that simulation is data analytics for systems that don’t yet exist.”
3. Predicting Health Code Violations
Or systems that are unwieldy. In Chicago, the city’s Department of Public Health uses computer modeling and simulation to predict where critical violations might pop up first. Those restaurants are then bumped to the top of a 15,000-establishment list that’s overseen by only three dozen inspectors. And apparently it’s working; a recent simulation yielded 14 percent more violations, which ideally means earlier inspection and a lower chance of patrons getting sick from poorly refrigerated monkfish.
4. Understanding Our Relationship With Religion
Over at Boston University, Wesley Wildman, a professor of philosophy, theology and ethics, uses computer simulation to study, as he put it in a 2018 article for The Conversation, “how religion interacts with complex human minds, including in processes such as managing reactions to terrifying events.”
In order to do so, he and his team designed a world and filled it with computer-controlled characters, or “agents,” that are “programmed to follow rules and tendencies identified in humans through psychological experiments, ethnographic observation and social analysis.” Then they observed what happened when their agents were tested against “well-known, real-world” examples like a massive earthquake that struck Christchurch, New Zealand in 2011.
“The better our agents mimic the behavior of real humans in those sorts of circumstances,” Wildman goes on, “the more closely aligned the model is with reality, and the more comfortable we are saying humans are likely to behave the way the agents did in new and unexplored situations.”
5. Earthquake Research
And in Germany, a team at the Leibniz Supercomputing Centre performed earthquake simulations using the devastating 2004 Indian Ocean earthquake, which triggered a massive tsunami, as their starting point. According to one of the researchers, Professor Michael Bader of Germany’s Institut für Informatik, they wanted to “better understand the entire process of why some earthquakes and resulting tsunamis are so much bigger than others. Sometimes we see relatively small tsunamis when earthquakes are large, or surprisingly large tsunamis connected with relatively small earthquakes. Simulation is one of the tools to get insight into these events.”
But it’s far from perfect. In a recent New York Times article titled “This High-Tech Solution to Disaster Response May Be Too Good to Be True,” reporter Sheri Fink detailed how a disaster response startup called One Concern built an earthquake simulation for Seattle that failed to include many densely populated commercial structures in its test runs “because damage calculations relied largely on residential census data.” The potential real-world result of this faulty predictive model: rescuers might not have known the location of many victims in need. And that was just one of many issues highlighted.
What It Takes to Simulate
Thanks to the robust data-crunching powers of extremely expensive supercomputers (Argonne currently has two, with another on the way, all harnessing what’s called “massively parallel processing”), simulation is more advanced than ever — and evolving at a rapid pace.
“We’re not interested in simply extrapolating into the future,” Macal said. “We’re interested in looking at all the uncertainties as well as different parameters that characterize the model, and doing thousands or millions of simulations of all the different possibilities and trying to understand which interventions would be most robust. And this is where high-performance computing comes in.”
The computational resources at their disposal, Ozik added, allow Argonne researchers (and anyone with supercomputing access) “to fully explore the behaviors that these models can exhibit rather than just applying ad hoc approaches to find certain interesting behaviors that might reflect some aspect of reality.”
Which is to say, the simulations are much broader, and therefore even more realistic — at least from a hypothetical perspective.
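Macal's point about robustness can be sketched concretely: run each candidate intervention under many random draws of the uncertain parameters, then favor the one that holds up in the worst case. The disease model, intervention names and every number below are invented for illustration; real studies sweep far more parameters across far more runs.

```python
import random

# Robustness via brute-force parameter sweep: evaluate each candidate
# intervention under hundreds of random transmission-rate draws and
# compare worst cases. All values here are illustrative assumptions.

random.seed(0)

def outbreak_size(transmission, recovery=0.1, days=200):
    """Final fraction infected in a simple SIR difference-equation model."""
    s, i, r = 0.999, 0.001, 0.0
    for _ in range(days):
        new_infections = transmission * s * i
        new_recoveries = recovery * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return r

# hypothetical interventions, expressed as a multiplier on transmission
interventions = {"none": 1.0, "masks": 0.7, "distancing": 0.5}

worst_case = {}
for name, factor in interventions.items():
    # the true transmission rate is uncertain, so sample it many times
    sizes = [outbreak_size(random.uniform(0.2, 0.4) * factor)
             for _ in range(500)]
    worst_case[name] = max(sizes)

best = min(worst_case, key=worst_case.get)
print(best, round(worst_case[best], 3))
```

Scaling this pattern from 500 runs to millions, across dozens of uncertain parameters, is exactly where the high-performance computing Macal mentions comes in.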
Then again, plenty of simulations are done with far less compute power than Argonne possesses. Alison Bridger, department chair of meteorology and climate science at San Jose State University in California, said on-site cluster computers are strong enough to run the climate simulation models she builds. Cloud computing services like those offered by Amazon (AWS) and Microsoft (Azure) are gradually gaining a foothold in the space as well.
Along with nuclear physics, meteorology was one of the first disciplines to make use of computer simulation after World War II. And climate modeling, Bridger said, “is like a close cousin of weather forecasting. Back in the 1960s, people used early weather forecasting models to predict the climate. Before you can predict the weather, you have to be able to properly reproduce it with your model.”
Bridger’s work employs a widely used “local scale” model called WRF, which stands for Weather Research and Forecasting and can produce “reasonably good simulations of weather on the scale of, say, Northern Illinois — so Chicago up to Green Bay and down into the central part of the state. It will forecast things like high and low temperatures, rain and so forth. And it’s typically only run to simulate 24, 48 or 72 hours of weather.”
In further explaining her process, Bridger employs the imagery of a cube centered over Chicago that’s roughly a kilometer east-west by a kilometer north-south. The goal is to predict the temperature in the cube’s center and extrapolate that reading to the entire thing. There are also, in her telling, additional cubes surrounding the initial one “stacked up all the way to the top of the atmosphere” whose future temperatures will be predicted in various time increments — in an hour, in 12 hours, in one day, in three days and so on. Next, temperature-affecting variables are added to the mix, such as amount of sunshine, cloud cover, natural disasters like wildfires and manmade pollution. It’s then a matter of applying the laws of physics to determine a variety of weather-related events: rising and falling temperatures, amount of wind and rain.
Said Bridger, “You’re doing thousands, probably millions of calculations to get that answer.”
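The stacked-cubes picture Bridger paints can be caricatured in a few lines: divide the air into cells, give each a temperature, and repeatedly update every cell from its neighbors. The toy below uses simple one-dimensional heat diffusion with made-up temperatures; real models like WRF solve vastly richer physics over three-dimensional grids.

```python
# A cartoon of the gridded-atmosphere idea: a column of cells from the
# ground (warm) to the upper atmosphere (cold), each repeatedly nudged
# toward its neighbors. Temperatures and the diffusion rate are
# invented for illustration.

def step(cells, alpha=0.1):
    """One time step: each interior cell relaxes toward its neighbors."""
    new = cells[:]
    for i in range(1, len(cells) - 1):
        new[i] = cells[i] + alpha * (cells[i-1] - 2*cells[i] + cells[i+1])
    return new

# a stack of cells, ground level first, top of the atmosphere last
column = [15.0, 10.0, 5.0, 0.0, -10.0, -25.0, -40.0]
for _ in range(100):
    column = step(column)
print([round(t, 1) for t in column])
```

Even this cartoon makes Bridger's arithmetic plausible: seven cells and one variable already cost seven hundred updates, so a continent-sized 3D grid with sunshine, clouds, wind and moisture quickly reaches the thousands or millions of calculations she describes.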
Computer Simulation and Industry
In the past 75 years, computer modeling and simulation has evolved from a primarily scientific tool to something industry has embraced for the purposes of optimization and, ultimately, increased profitability.
“Industry is embracing simulation at a faster rate than ever before and connecting it to what I would call data analytics for things like scheduling and supply chain management,” Macal said. “Industry is trying to simulate everything they do because they realize it’s cheaper and quicker than actually building a prototype system.”
When Northwestern’s Nelson spoke with Built In, he had recently returned from the annual Applied Probability Conference. There, the simulation applications discussed included but weren’t limited to the following: aviation modeling, cybersecurity, environmental sustainability and risk, financial risk management, health care, logistics, supply chain and transportation, semiconductor manufacturing, military applications, networking communications, project management and construction.
“Frequently, companies that use simulation want to optimize system performance in some sense,” Nelson said, using as an example a car company that wants to build a new assembly plant or decide what vehicles to bring to market.
“So optimization is a key to lots of business in industry, but optimal solutions are often brittle. By which I mean, if small issues about the assumptions or the modeling approximations you made are wrong, then suddenly something that appeared to be optimal in your model can be catastrophically bad.”
The technical term for that is “model risk,” and those who build models and run simulations try to assess the risks inherent in decisions that are made based on those models. It’s a difficult subject to dissect, let alone make widely comprehensible, but Nelson makes a fine attempt. This is, after all, his area of expertise.
“When people built mathematical and computer models,” he said, “even though the model may have been built from data, they treat it as if the model is correct and therefore the solution that [results] is optimal. What we try to do is continue to incorporate in the model the uncertainty that was created when we built it.”
The financial crisis of 2008, Nelson said, is one instance where model risk was detrimentally downplayed.
“The financial industry uses a tremendous number of very sophisticated mathematical computer modeling [methods]. And it’s quite clear that the correlations among various financial instruments and securities and so on were kind of ignored, so we got cascading failures.”
Such cautionary tales, however, don’t mean that those who create the mathematical and computer models on which simulations are based must strive for perfection, Nelson adds, because no model is perfect and “models move us forward.” Demanding perfection, he said, “would paralyze us. But as we start to make more life-critical decisions based on models, then it does become more important to account for risks.”
Imagine this: It’s years from now and someone you know has been diagnosed with a cancerous tumor. But instead of immediately bombarding them with radiation and highly toxic chemotherapy drugs and hoping for the best, doctors instead perform tests from which they create a virtual (mathematical) twin of that person’s malignant growth. The digital replica is then subjected to computational interventions in the form of millions or even billions of simulations that quickly determine the most effective form of treatment.
It’s less fantastical than it sounds.
“Recent developments in cancer-specific ‘big data’ and experimental technologies, coupled with advances in data analysis and high-performance computational capabilities, are creating unprecedented opportunities to advance understanding of cancer at greater and more precise scales,” the National Cancer Institute recently reported.
Other revolutionary developments with far-reaching impact are already being implemented. Science Daily has reported on many of them.
This one, for instance: “[A]rtificial neural nets can be trained to encode quantum mechanical laws to describe the motion of molecules, supercharging simulations potentially across a broad range of fields.”
According to Los Alamos National Laboratory physicist Justin Smith, that means “we can now model materials and molecular dynamics billions of times faster compared to conventional quantum methods, while retaining the same level of accuracy.”
That’s good news for drug developers, whose researchers study molecular movement in order to see what’s suitable for use in pharmaceutical manufacturing, as well as patients who are all too often caught up in a detrimental guessing game when it comes to treatment.
And this: Over at Penn State, researchers working in tandem with colleagues at the University of Almeria in Spain developed “a computer model that can help forecasters recognize potential severe storms more quickly and accurately.” As Steve Wistar, a senior forensic meteorologist at AccuWeather, explained, the tool could lead to better forecasts because he and his fellow forecasters will have “a snapshot of the most complete look of the atmosphere.”
And so, while we may or may not be living in a computer-simulated world (another topic for another story), the world is being transformed by computer simulation. As computers get faster and research methods more refined, there’s no telling where it might lead.
Mudi Yang, a cosmos-simulating high school senior from Nashville, put it eloquently last year when he said, “Computer simulations gave us the ability to create virtual worlds, and those virtual worlds allowed us to better understand our real one.”