In November 2019, a truck drove itself down Interstate 90, relying on a mix of cameras, sensors and, just to be safe, a human driver ready to take the wheel in an emergency.
The truck was fitted with software by Autobon, a firm based in the northwest Chicago suburb of Rolling Meadows. Autobon is one of a growing number of companies racing to put self-driving cars on the map — and soon. In three years, research consultancy Gartner predicts, the number of autonomous vehicles on the roads will rise to more than 745,700, up more than 400 percent from the approximately 137,130 self-driving cars counted in 2018.
After decades of speculation, autonomous vehicles are finally going places — but only some places. Mike Ramsey, an analyst at Gartner, said he expects that widespread adoption of self-driving cars — that is, when these vehicles are easily accessible in every part of the country all the time — will take about 10 years.
“You might see it now, but you’re going to have to seek it out,” he said.
Right now, high costs, the lack of a clear business case and no standard safety regulations all present roadblocks to getting cars without drivers, um, driving all over.
Regulation is a big hurdle for autonomous cars — but not in the way you’d think
Autonomous vehicles likely won’t become widespread until a widely accepted road-readiness standard is in place, Ramsey said. But while these types of standards often come in the form of regulation, Ramsey said the federal government’s current inaction on the issue suggests that these standards are more likely to originate within the industry itself.
“What I see as more likely is that some outside third-party entity, like insurance companies, may end up implementing a requirement for testing to be insured,” Ramsey said. “That requirement will end up being the de facto test.”
The feds have created some outlines for how autonomous vehicles should operate. In 2016, President Barack Obama introduced the first framework governing steps for getting self-driving cars on the road. An updated 2017 version contains 12 guidelines and a public comment process required before self-driving wheels touch taxpayers’ streets. Before he left office, Obama also created a federal committee — made up of academics, business leaders, engineers and more — to study automation in transportation.
The Trump administration terminated the committee in early 2019. It also revised the guidelines for what autonomous vehicle companies needed to submit to the federal government before they got on the road, re-affirming that companies can decide for themselves when they’re ready.
“Under the Obama administration, they had more clear guidelines set out for what autonomous vehicles had to do,” Ramsey said. “Then the Trump administration kind of pulled back, and made it more soft.”
While conversations about tech regulation tend to focus on whether too much red tape is getting in the way of innovation, many autonomous vehicle companies would welcome more regulatory meddling.
Until a national standard is created, self-driving car firms whose vehicles are involved in accidents risk lawsuits questioning whether the car was really ready for the road, and will likely have to pay the high cost of insuring themselves, Ramsey said.
In recent years, the conversation about risk moved from hypothetical scenarios to real ones. In March 2018, Uber’s self-driving car struck a woman in Arizona, for example, representing the first pedestrian death associated with self-driving tech. The terms of the company’s settlement with the victim’s family were never disclosed. Tesla vehicles, which currently offer a semi-autonomous autopilot feature, have also been involved in three fatal crashes to date.
And some states are done waiting on the feds. Thirty-five states have enacted legislation related to autonomous vehicles, according to the National Conference of State Legislatures. Countries like Japan, Singapore and China are working on creating national laws, too.
Deciding when the rubber hits the road
Today, the federal government asks companies to voluntarily submit an outline of what they’ve done to ensure their self-driving vehicles are safe for passengers, and for everyone else on the road. The makers of at least 19 autonomous vehicle models have submitted safety reports.
Their criteria for deciding when they’re ready for the road vary. Waymo boasts that it has racked up more than a billion miles in simulated driving, along with millions of miles of real road time. GM’s cars currently only drive within known, geo-fenced boundaries in clear weather conditions. Uber’s goal is for its self-driving robot cars to get in fewer crashes than humans.
Ramsey said the closest companies have come to creating a unified safety standard came in 2018, when Nvidia and Mobileye worked together to create a set of rules governing how autonomous vehicles reacted to road conditions. But their collaboration fell through, with Mobileye eventually accusing Nvidia of copying its tech.
And while the idea of an industry-driven standard creation process might sound appealing in principle, Ramsey said, getting all the major players on the same page might be difficult.
“Everybody is working on their own thing,” he said. “It’s possible that [the standard] might require a significant tear up of what they’ve been working on. So you may get some pushback.”
Some companies are still collaborating on the issue, however.
In early 2019, a group of analysts representing major players in the space — car companies including BMW, Daimler and Volkswagen Group, as well as tech companies like Intel and Here Technologies — came together to release a report outlining the requirements an autonomous vehicle should meet before it hits the road. The report named items like building in redundancies for key parts, using multiple technologies to snap pictures of the surrounding environment, installing a cybersecurity system and more. It did not go into the technical means necessary to achieve such standards, however, noting that companies have multiple ways of dealing with these issues.
Krystian Gebis, Autobon’s CEO, said its metric for road-readiness is simple: every year, Autobon crunches data from the Federal Motor Carrier Safety Administration to find out how often human drivers get in accidents, and then aims for its vehicles to meet and beat this statistic.
“Your system really needs to be tested and show statistical efficacy to say, ‘Hey, look, our system was exposed to or drove across the same amount of miles that all human truck drivers have driven and did better,’” Gebis said.
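The benchmark Gebis describes boils down to simple exposure-adjusted arithmetic: normalize crash counts by miles driven, then compare the fleet against the human baseline. The sketch below illustrates the idea — the function name and every number in it are hypothetical placeholders, not real FMCSA figures or Autobon’s actual methodology.

```python
# A minimal sketch of a road-readiness benchmark: compare crash rates
# per million miles for human drivers vs. an autonomous fleet.
# All numbers here are hypothetical, for illustration only.

def crashes_per_million_miles(crashes: int, miles_driven: float) -> float:
    """Normalize crash counts by exposure so fleets of any size compare fairly."""
    return crashes / (miles_driven / 1_000_000)

# Hypothetical figures, not real FMCSA data.
human_rate = crashes_per_million_miles(crashes=450_000, miles_driven=300_000_000_000)
av_rate = crashes_per_million_miles(crashes=3, miles_driven=4_000_000)

meets_benchmark = av_rate < human_rate
print(f"human: {human_rate:.2f}/M mi, autonomous: {av_rate:.2f}/M mi, ready: {meets_benchmark}")
```

The catch, as Gebis notes, is exposure: the comparison is only statistically meaningful once the autonomous fleet has accumulated mileage on a scale comparable to the human baseline.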
AI needs driving instructors, too
Autobon, like most autonomous vehicle firms, uses an artificial intelligence system to decide if a car should stop, go, accelerate or slow.
This AI system is trained by human drivers. Self-driving cars essentially ride along as passengers with these human chauffeurs, with machines learning from the decisions humans make before they’re allowed to take over the wheel.
Many firms, including Autobon, base these learning experiences in the cloud, so the data can be updated continuously and across all vehicles — when one truck learns how to respond to a tractor on the road, for example, all the trucks learn from it.
“Essentially it just learns behavior, like ‘What do other drivers do?’ And it basically teaches itself, ‘OK, this is how I drive a vehicle,’” Ramsey said.
These findings are then added to a set of “core logic,” or rules for driving that underpin Autobon’s AI system. These rules include items like never going over the speed limit or always stopping at red lights.
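One way to picture this core logic is as a fixed rule layer that overrides whatever the learned model proposes. The sketch below is illustrative only — the function and its structure are assumptions, not Autobon’s actual design — but it uses the two rules the article names: never exceed the speed limit, always stop at red lights.

```python
# Illustrative "core logic" layer: hard driving rules that clamp the
# learned model's proposed action. Names and structure are hypothetical.

def apply_core_logic(proposed_speed: float, speed_limit: float, light: str) -> float:
    """Return the speed the vehicle is actually allowed to command."""
    # Rule: always stop at red lights, regardless of what the model proposes.
    if light == "red":
        return 0.0
    # Rule: never go over the posted speed limit.
    return min(proposed_speed, speed_limit)

print(apply_core_logic(72.0, speed_limit=65.0, light="green"))  # clamped to 65.0
print(apply_core_logic(40.0, speed_limit=65.0, light="red"))    # 0.0
```

Separating hard rules from learned behavior like this is what would make Ramsey’s proposed standard testable: the rule layer can be audited directly, without reverse-engineering the neural network above it.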
Ramsey said the ideal universal safety standard would regulate self-driving cars’ core logic and AI system.
“If everyone agrees to these basic set of standards underneath [the AI], then you can actually say, ‘Hey, look, our vehicle adheres to this, and we know that this is safe driving,’” Ramsey said. “We would have a test.”
AIs log millions of miles before they hit the road
Most companies build their AI systems — and test their core logic — through millions of miles of simulated driving. These simulations let them check how a vehicle is programmed to respond in different situations, and then use machine learning to learn from new, simulated conditions.
Many companies base their simulated driving systems on gaming engines, which are physics-based and can replicate the effects of things like gravity or speed, Ramsey said. Autobon’s neural network is built on a homegrown ResNet system. Companies like Microsoft and Siemens also offer off-the-shelf tools.
Autonomous vehicles need to be tested in tons of bizarre situations, some of which humans might not think to train their vehicles on, Ramsey said. One current limitation is that computers are unable to generalize the way humans can. If a person sees a bulldog and a Great Dane, they know they are both dogs and can react accordingly, based on how dogs typically behave. An AI system needs to be trained separately on both types of dogs to figure out how to react.
“We ignore most of what is out there. It just comes in and we don't really think about it,” Ramsey said. “Whereas the computer doesn't really have the capability of saying, ‘I only need to pay attention to a few things.’”
Once a company does decide its autonomous car has clocked enough simulated miles — and trained its AI accordingly — it’s then ready to test on the road. Ramsey said the tech powering how different types of self-driving cars figure out what’s in front of them presents the greatest variation among firms.
Autobon uses a mix of cameras and radar sensors to figure out what’s going on around the vehicle. These technologies are embedded in a visor above a truck’s windshield and snap at least 20 images of the road per second. A computer uses graphics processing units to process the images quickly. A neural network then analyzes the images to determine how the truck should be driving, and communicates its decision to machine controls for the steering, acceleration and brakes.
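The pipeline described above — capture a frame, run inference, send the decision to the vehicle controls, roughly 20 times a second — can be sketched as a simple loop. Every function and value below is a hypothetical stand-in, not Autobon’s software.

```python
# Illustrative skeleton of a perception-to-control loop running at ~20 Hz.
# All names, structures and values are hypothetical placeholders.

import time

log = []  # records each decision sent to the controls, for inspection

def capture_frame():
    """Stand-in for grabbing one image from the visor-mounted cameras."""
    return {"timestamp": time.time()}

def run_neural_net(frame):
    """Stand-in for GPU inference turning a frame into a driving decision."""
    return {"steering": 0.0, "throttle": 0.3, "brake": 0.0}

def send_to_controls(decision):
    """Stand-in for actuating steering, acceleration and brakes."""
    log.append(decision)

def control_loop(frames_per_second: int = 20, steps: int = 2):
    interval = 1.0 / frames_per_second  # ~50 ms budget per frame at 20 fps
    for _ in range(steps):
        decision = run_neural_net(capture_frame())
        send_to_controls(decision)
        time.sleep(interval)

control_loop(steps=2)
```

The 20-frames-per-second figure from the article implies the whole capture-infer-actuate cycle has roughly a 50-millisecond budget — which is why the processing is pushed onto GPUs.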
Right now, Autobon’s system instructs trucks on how to drive straight and keep a safe distance from the vehicles around them. Two aftermarket trucks have the system installed in them, and Gebis aims to add the system to 10 more vehicles this year.
By 2026, he plans to have installed Autobon’s software in 50,000 trucks. It will use machine learning to collect images — or data — from these vehicles to teach its AI how to react in situations the system is not confident in, like recognizing a chicken crossing the road. Its system will cost about $5,000 a pop, Gebis said, and take less than two hours to install in a truck.
He said he’s lent out the two trucks installed with Autobon’s tech to trucking firms and shippers interested in testing its system.
“We have not only fleets coming to us, but we actually have also shippers coming to us asking, ‘Hey, when can we start contracting trucks?’” he said.
Self-driving cars will have to pay their own way
Autobon doesn’t aim to use its technology to replace drivers: “Someone needs to be unloading the truck,” Gebis said. Rather, it aims to use its driver assist technology to ease some of the struggles these drivers face, like boredom, and make trucking safer for everyone on the road.
Ramsey said widespread adoption of truck technologies like Autobon’s — and those of competitors like Torc and TuSimple — will likely happen sooner than consumer systems like robocabs, because it’s easy for trucking companies to figure out the return on investment.
Truck driving is dangerous. Turnover in the field is high. Drivers are well-paid and carrying expensive cargo, which makes the benefit to reducing accidents very clear, Ramsey said.
Conditions matter, too. Drivers often aren’t driving in congested and complex downtown streets. Their routes are also typically short and well known, from a warehouse to a distribution center.
“I don’t think you’re going to see autonomous vehicles running around in a city environment,” Ramsey said. “But like between rail yards and factories, between long strips of highway, I think you might start to see a bigger clip of proliferation in the near term.”
Even for these use cases, the lack of standard safety regulations presents a roadblock to uptake, since trucks often drive cargo from one state to another, meaning a company’s legal ability to use the technology may change over the course of one trip. Consumer vehicles represent a bigger challenge still, Ramsey said. The potential is clear: Self-driving cars could help the blind hit the road, gaining independence. Autonomous vehicles could free up humans’ hands and attention, giving them more time to devote to other tasks.
But until a clear safety standard is enacted, and self-driving cab and consumer car companies make a clear business case for the tech, it will not be widespread, Ramsey said. Cars can drive themselves. But right now, he said the costs are a little too high for the technology to hit every street, everywhere.
“It’s not just whether we can do it,” Ramsey said. “It’s, ‘Does it make sense to actually do it?’”