Newswise – In the future, the energy needed to run the powerful computers onboard a global fleet of autonomous vehicles could generate as many greenhouse gas emissions as all the data centers in the world today.
This is one of the main findings of a new study by MIT researchers that explored the potential energy consumption and related carbon emissions if autonomous vehicles were to be widely adopted.
Data centers, which house the physical computing infrastructure used to run large-scale applications, are notorious for their carbon footprint: they currently account for about 0.3 percent of global greenhouse gas emissions, or roughly as much carbon as the country of Argentina produces annually, according to the International Energy Agency. Realizing that less attention has been paid to the potential footprint of autonomous vehicles, the MIT researchers built a statistical model to study the problem. They determined that 1 billion autonomous vehicles, each driving for one hour per day with a computer consuming 840 watts, would consume enough energy to generate about the same amount of emissions as data centers do today.
The researchers also found that, in more than 90 percent of modeled scenarios, keeping autonomous vehicle emissions from exceeding current data center emissions would require each vehicle to use less than 1.2 kilowatts of power for computing, which would demand more efficient hardware. In one scenario (where 95 percent of the global vehicle fleet is autonomous in 2050, computational workloads double every three years, and the world continues to decarbonize at the current rate), they found that hardware efficiency would need to double faster than every 1.1 years to keep emissions below those levels.
“If we maintain business-as-usual trends in decarbonization and the current rate of hardware efficiency improvements, it doesn’t seem like it will be enough to constrain emissions from onboard computing in autonomous vehicles. This has the potential to become a massive problem. But if we get ahead of it, we could design autonomous vehicles that are more efficient and have a smaller carbon footprint from the start,” says first author Soumya Sudhakar, a graduate student in aeronautics and astronautics.
Sudhakar wrote the paper with her co-advisers Vivienne Sze, associate professor in the Department of Electrical Engineering and Computer Science (EECS) and a member of the Research Laboratory of Electronics (RLE); and Sertac Karaman, associate professor of aeronautics and astronautics and director of the Laboratory for Information and Decision Systems (LIDS). The research appears in the January-February issue of IEEE Micro.
The researchers built a framework to explore operational emissions from the on-board computers of a global fleet of fully autonomous electric vehicles, meaning they don’t require a backup human driver.
The model is a function of the number of vehicles in the global fleet, the power of each computer in each vehicle, the hours traveled by each vehicle, and the carbon intensity of the electricity that powers each computer.
On its own, that seems like a deceptively simple equation, but each of these variables contains a lot of uncertainty, because the researchers are studying an emerging application that isn’t here yet.
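The model's basic structure can be sketched as a back-of-envelope calculation. The sketch below is illustrative, not the authors' actual model: the fleet size, power draw, and driving hours come from the figures quoted in this article, while the grid carbon intensity is a rough global-average assumption, not a number from the study.

```python
# Back-of-envelope sketch of the emissions model: emissions scale with
# fleet size, per-vehicle computer power, hours driven, and the carbon
# intensity of the electricity grid. All values are illustrative.

FLEET_SIZE = 1_000_000_000   # vehicles (scenario quoted in the article)
COMPUTER_POWER_W = 840       # watts per onboard computer (from the article)
HOURS_PER_DAY = 1.0          # driving hours per vehicle per day
CARBON_INTENSITY = 0.475     # kg CO2 per kWh (assumed rough global average)

def annual_compute_emissions_tonnes(fleet, power_w, hours_per_day, kg_co2_per_kwh):
    """Annual CO2 (tonnes) from onboard computing across the whole fleet."""
    kwh_per_year = fleet * (power_w / 1000) * hours_per_day * 365
    return kwh_per_year * kg_co2_per_kwh / 1000  # kg -> tonnes

result = annual_compute_emissions_tonnes(
    FLEET_SIZE, COMPUTER_POWER_W, HOURS_PER_DAY, CARBON_INTENSITY)
print(f"{result:.3g} tonnes CO2 per year from onboard computing")
```

Under these assumed numbers the fleet's computing draws on the order of 150 million tonnes of CO2 per year, which is the same order of magnitude as current data center emissions, consistent with the comparison in the study.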
For example, some research suggests that the amount of time spent driving in autonomous vehicles may increase, because people can multitask while traveling and younger and older people could travel more. But other research suggests that time spent driving may decrease, because algorithms could find optimal routes that get people to their destinations faster.
In addition to considering these uncertainties, the researchers also needed to model advanced computing hardware and software that doesn’t exist yet.
To do that, they modeled the workload of a popular algorithm for autonomous vehicles, known as a multitask deep neural network because it can perform many tasks at once. They explored how much energy this network would consume if it were simultaneously processing many high-resolution inputs from many cameras with high frame rates.
When they used the probabilistic model to explore different scenarios, Sudhakar was surprised at how quickly the algorithms’ workload increased.
For example, if an autonomous vehicle has 10 deep neural networks processing images from 10 cameras, and that vehicle drives for one hour a day, it will make 21.6 million inferences each day. One billion vehicles would make 21.6 quadrillion inferences. To put that into perspective, all of Facebook’s data centers worldwide make a few trillion inferences each day (1 quadrillion is 1,000 trillion).
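The arithmetic behind those figures can be checked directly. One caveat: the 60 frames-per-second camera rate below is an assumption made here because it reproduces the quoted numbers; the article itself does not state the frame rate used in the study.

```python
# Reproducing the article's inference counts from first principles.
NETWORKS = 10          # deep neural networks per vehicle
CAMERAS = 10           # camera feeds, each processed by every network
FPS = 60               # frames per second per camera (assumed)
SECONDS_DRIVEN = 3600  # one hour of driving per day
FLEET = 1_000_000_000  # one billion vehicles

per_car_per_day = NETWORKS * CAMERAS * FPS * SECONDS_DRIVEN
fleet_per_day = per_car_per_day * FLEET

print(per_car_per_day)  # inferences per vehicle per day
print(fleet_per_day)    # inferences per day across the fleet
```

With these assumptions, each vehicle makes 21.6 million inferences per day, and the billion-vehicle fleet makes 21.6 quadrillion, matching the figures above.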
“After seeing the results, this makes a lot of sense, but it’s not something that’s on a lot of people’s radar. These vehicles could actually use a ton of computing power. They have a 360-degree view of the world, so while we have two eyes, they may have 20 eyes, looking everywhere and trying to understand all the things that are happening at the same time,” says Karaman.
Autonomous vehicles will be used to transport goods, as well as people, so there could be an enormous amount of computing power distributed along global supply chains, he says. And their model only takes into account computing — it doesn’t take into account the energy consumed by the vehicle’s sensors or the emissions produced during manufacturing.
To prevent emissions from getting out of control, the researchers found that each self-driving vehicle needs to consume less than 1.2 kilowatts of power for computing. For this to be possible, computing devices must become more efficient at a significantly faster pace, doubling in efficiency approximately every 1.1 years.
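The compounding behind that 1.1-year figure can be illustrated with a short calculation. The 25-year horizon and the three-year workload doubling come from the article's 2050 scenario; everything else is simple exponential growth, not the study's actual model.

```python
# Comparing compound growth of workload (doubling every 3 years, per the
# article's scenario) against hardware efficiency (doubling every 1.1
# years, the rate the researchers say is needed). Per-vehicle computing
# power scales roughly as workload divided by efficiency.

HORIZON_YEARS = 25  # e.g., from now to the article's 2050 scenario

def compound_growth(years, doubling_period_years):
    """Total growth factor after `years` of repeated doubling."""
    return 2 ** (years / doubling_period_years)

workload_growth = compound_growth(HORIZON_YEARS, 3.0)
efficiency_growth = compound_growth(HORIZON_YEARS, 1.1)

print(f"workload grows {workload_growth:,.0f}x")
print(f"efficiency grows {efficiency_growth:,.0f}x")
print(f"power per vehicle scales by {workload_growth / efficiency_growth:.2e}x")
```

The point of the comparison: because efficiency must outpace workload, doubling efficiency every 1.1 years shrinks per-vehicle computing power over time, while any slower doubling rate lets it grow past the 1.2-kilowatt cap.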
One way to boost this efficiency could be to use more specialized hardware, designed to run specific driving algorithms. Because the researchers know the navigation and perception tasks required for autonomous driving, it could be easier to design specialized hardware for those tasks, says Sudhakar. But vehicles tend to have lifespans of 10 or 20 years, so one challenge in developing specialized hardware will be to “future-proof” it so it can run new algorithms.
In the future, researchers could also make the algorithms more efficient, so they would need less computing power. However, this is also a challenge, because trading off some accuracy for more efficiency could hamper vehicle safety.
Now that they have demonstrated this framework, the researchers want to continue exploring hardware efficiency and algorithm improvements. In addition, they say their model could be improved by characterizing the embodied carbon of autonomous vehicles (the emissions generated when a car is manufactured) and emissions from the vehicles’ sensors.
While there are still many scenarios to explore, the researchers hope that this work will shed light on a potential problem that people may not have considered.
“We hope people will think of emissions and carbon efficiency as important metrics to consider in their designs. The energy consumption of an autonomous vehicle is really critical, not only for battery life, but also for sustainability,” says Sze.
This research was funded in part by the National Science Foundation and the MIT-Accenture Fellowship.
By Adam Zewe, MIT News Office
Paper: “Data Centers on Wheels: Emissions From Computing Onboard Autonomous Vehicles”