The Fuse

Autonomous Driving Takes Power. Will That Matter for Electrification?

by Jesse Dunietz | July 27, 2018

Electric autonomous vehicles will soon be all the rage—or at least that’s the bet autonomous vehicle (AV) companies appear to be making. Tesla, GM, Volkswagen, BMW, and Google affiliate Waymo are just a sampling of the manufacturers and tech companies promising all-electric or plug-in hybrid AVs in the next few years.

On its face, electrifying AVs makes sense: self-driving cars are often touted for their environmental benefits, and what better environmental choice than an electric vehicle (EV)? But a nagging issue has cast doubt on the wisdom of the AV-EV combo: power. Autonomous driving takes serious computational chops, which in turn draws serious electric current. For an EV, that means precious electrons must be diverted from the vehicle’s core business of going places.

The power draw that AV firms report for their prototypes—between 1.5 and 4 kilowatts—“would almost compete with the power consumption for propulsion if you are cruising down the street in an urban area,” says Zoran Filipi, chair of Clemson University’s Department of Automotive Engineering. Estimates of the resulting hit to battery range run from 5–10% to as much as a third.
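
To see why the estimates span such a wide range, it helps to run the numbers. The sketch below is a rough back-of-the-envelope illustration, not a figure from any of the studies discussed here; the power draws, speed, and propulsion figures in it are assumed placeholders.

# Rough sketch: how a constant computing load eats into EV range.
# All figures below are illustrative assumptions, not measured values.
def range_hit(compute_kw, avg_speed_mph, propulsion_wh_per_mile):
    """Fraction of range lost to a constant electrical load while driving."""
    compute_wh_per_mile = compute_kw * 1000 / avg_speed_mph  # spread the load over each mile
    return compute_wh_per_mile / (propulsion_wh_per_mile + compute_wh_per_mile)

# Assumed propulsion demand: ~250 Wh/mile.
print(range_hit(1.5, avg_speed_mph=60, propulsion_wh_per_mile=250))  # ~0.09: modest hit at highway speed
print(range_hit(4.0, avg_speed_mph=20, propulsion_wh_per_mile=250))  # ~0.44: a fixed load looms large in slow traffic

Slow urban driving spreads a fixed electrical load over fewer miles, which is one reason published estimates of the range penalty disagree so widely.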

Ford’s AV team has taken heed, citing the energy draw of computing equipment as a reason for sticking to hybrids. But most companies’ portfolios continue to include some electric vehicles. With range on the line, why are so many forging ahead with battery-driven technology?

The answer is probably a mix of factors. First, some studies point to a less dire energy picture, suggesting both that the energy demands will be lower and that range won’t be a limiting factor. Second, engineering synergies between EVs and AVs balance out battery concerns. And finally, automakers may be betting that AVs’ energy demands won’t stay so severe forever.

Range optimism

One source of optimism is a recent study co-sponsored by none other than Ford. The study, carried out by the University of Michigan, is the first to examine the energy impacts of automation over a vehicle’s entire lifecycle. The results vary with vehicle configuration; Tesla’s minimalist hardware consumes slightly less power than Waymo’s bulkier rig. But for intermediate-size rigs, autonomous hardware imposes only a 3% increase in energy consumption during driving—and that’s including the added drag and weight. The computers account for just half of that 3%.

Those numbers aren’t nothing, but they’re far below previous estimates. Jim Gawron, who led the study as a research assistant at Michigan, attributes the difference to prototype vs. production hardware: his study is based on specifications for the latest chips and sensors, which may not have made it into prototypes. Jeff Greenblatt, CEO of energy and space technology consultancy Emerging Futures and former staff scientist at Lawrence Berkeley National Laboratory, agrees that the higher numbers were probably based on older equipment. Greenblatt is “pretty comfortable” asserting that they’re well above what we’ll see in production AVs.

Others, though, see the discrepancy as cause for skepticism. “This is an outlier compared to all other sources, including sources from Ford,” says Aymeric Rousseau, manager of the Systems Modeling and Control Section at Argonne National Laboratory. He doesn’t buy the pre-production hardware argument, either; “It’s too big of a difference,” he says. To him, “the only plausible explanation…is that [the Michigan study] did not fully consider every single accessory load.”

Gawron acknowledges that AVs may draw more power than the specs suggest, perhaps due to unexpectedly computing-intensive software. Still, his study did consider a hypothetical computer chip drawing ten times as much power, a figure some prototypes already exceed. Even then, the computer would impose about a 5–6% load increase—still on the low end of earlier estimates. Bulkier gear like Waymo’s could raise that by a few more percentage points, so a pessimistic scenario would see a substantial energy hit, but still not an outrageous one.

Embracing short ranges

Greenblatt’s own research bolsters the conclusion that autonomy won’t be a battery-buster. Early AV deployments will be fleets of urban robotaxis. A recent study co-authored by Greenblatt found that the economically optimal battery range for an electric fleet in Manhattan would be a mere 90 miles. With larger ranges, each car would spend less time charging, allowing a smaller fleet—but then the cars would need to be replaced sooner, since lifespan is governed by miles driven. Cost-wise, these factors cancel each other out.
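
The cancellation is easiest to see in a toy calculation. The sketch below is not Greenblatt’s model; the fleet demand, vehicle price, and lifetime mileage are illustrative assumptions chosen only to show why the two effects offset each other.

# Toy fleet-cost model (illustrative assumptions, not Greenblatt's analysis).
def annual_vehicle_cost(fleet_miles_per_year, vehicle_price, lifetime_miles, fleet_size):
    miles_per_car_per_year = fleet_miles_per_year / fleet_size
    lifespan_years = lifetime_miles / miles_per_car_per_year  # lifespan governed by miles driven
    return fleet_size * vehicle_price / lifespan_years

# Longer range means less charging downtime, so fewer cars cover the same demand.
for fleet_size in (450, 400, 350):
    print(fleet_size, annual_vehicle_cost(10_000_000, 40_000, 200_000, fleet_size))
# Every line prints the same annual cost: a smaller fleet racks up miles faster
# and gets replaced sooner, so per-year vehicle spending is unchanged.

In practice, battery cost, charging infrastructure, and the wear factors Rousseau raises below would tip the balance one way or the other.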

Again, Rousseau is skeptical. For one thing, Greenblatt’s study did not account for the energy increases from computing. There’s also the broader issue that energy consumption hinges on the scenario. In simulations, Rousseau’s team “can literally double the [energy impact of automation] by changing the trip” to include a different balance of highways, traffic lights, and other road factors. Similarly, heavy use of fast charging and air conditioning can slash the lifespan of a battery pack, especially smaller ones like those Greenblatt proposes. “I don’t think we’re at the point where we can have narrow ranges” for automation’s energy effects, Rousseau says.

The uncertainties in both studies are compounded by the fact that nobody knows for sure what hardware will ultimately be needed. But despite all the unknowns, these studies at least give hope that the impact on range won’t be as severe as feared, and that moderate decreases in range will be at worst neutral for fleet operators.

The other side of the balance sheet

Even if AVs prove range-limited, manufacturers still have compelling technical and economic reasons to prefer electrification. First, electricity costs around half as much as gas per mile. That’s one reason Greenblatt’s study found that a battery-powered fleet, which would see near-continuous operation, would be cheaper to operate than a gas-powered one. Any unaccounted-for energy powering autonomous hardware would only widen this gap: internal combustion engines are less efficient electrical generators than the power grid, so a big electrical load would make a gas-powered fleet even less cost-effective.
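
The fuel-cost arithmetic is simple to sketch. The prices and efficiencies below are assumed for illustration rather than drawn from Greenblatt’s study, but they show both the baseline advantage and why an added electrical load penalizes a gasoline fleet more.

# Illustrative per-mile fuel costs (all prices and efficiencies are assumptions).
GAS_PRICE = 2.80        # $/gallon
MPG = 30                # gasoline car fuel economy
GRID_PRICE = 0.13       # $/kWh
EV_MILES_PER_KWH = 3.0  # electric car efficiency

print("gas car:", round(GAS_PRICE / MPG, 3), "$/mile")                 # ~0.093
print("electric:", round(GRID_PRICE / EV_MILES_PER_KWH, 3), "$/mile")  # ~0.043, roughly half

# Running AV computers off a gasoline engine is worse still: a gallon of gas
# holds about 33.7 kWh, but an engine generating electricity from it might be
# only ~25% efficient (assumed), so onboard power costs far more than grid power.
ENGINE_EFFICIENCY = 0.25
print("engine electricity:", round(GAS_PRICE / (33.7 * ENGINE_EFFICIENCY), 2), "$/kWh")  # ~0.33 vs 0.13 from the grid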

There are also engineering benefits, because autonomous equipment integrates more easily with an EV’s existing hardware. Electric vehicles come with the high-voltage electrical conduits needed to power AV gear. They also typically replace traditional mechanical control systems for brakes, steering, and so on with electrical control (“drive-by-wire”), which is easier for computers to manipulate. That difference is particularly significant for the powertrain: internal combustion engines must be turned on and off and geared up and down, whereas electric motors are simply told how much torque to apply. The same mechanical simplicity also makes EVs highly reliable, another critical factor for fleets.

Shifts to come

Of course, the energy burden of the hardware is not static; “it’s going to go down as these systems go through improvements,” says Filipi. Costa Samaras, director of Carnegie Mellon University’s Center for Engineering and Resilience for Climate Adaptation, notes that the priority to date has been getting any version of the technology working. “There’s not a lot of folks optimizing MPG for autonomous vehicles right now,” he says.

Chip suppliers are already hard at work preparing lower-power offerings. Nvidia will soon release an update to its AV chip that draws 60% less power than the version Gawron modeled. A host of startups are promising more radical solutions, mostly focused on processors optimized for the intensive neural network computations that take up most of existing chips’ time. Their proposed energy-savers include custom traditional chips hardcoded for efficient neural network calculations; light-driven chips that offload the most energy-sucking operations to low-power optical components (which are uniquely suited to these applications); and chips that strive to emulate neurons more faithfully, which reduce power requirements by skipping low-impact computations or by eschewing digital logic altogether for analog.

Such efforts will certainly help. Still, they’re unlikely to dispatch the energy issue altogether. Historically, “there’s never enough compute horsepower,” says Danny Shapiro, Nvidia’s Senior Director of Automotive; “we always want more, because the software continues to get more and more complex.” Samaras sees parallels to past improvements to vehicle efficiency, too: “Companies have made strides in energy efficiency of the combustion engine” and other vehicle components. “Some of those efficiencies went into improving miles per gallon, but more of the efficiency went into more horsepower.” Both Samaras and Greenblatt suspect that AV processors will meet a similar fate: once energy use is within an acceptable range, manufacturers will use subsequent efficiency improvements to deliver better performance rather than lower energy use.

In short, then, power will probably remain an engineering nuisance for many generations of AV technology—but an EV-killer it does not appear to be.
