MIT Study Finds Something No One Mentions About Self-Driving Cars: They’re Lousy For The Environment

I’ve been talking about the coming of automated vehicles (AVs) for years now, even writing a whole damn book about them, and while I’ve discussed the ethics and issues behind AVs, I’m a bit embarrassed to say that there are some obvious things I’ve never really considered. One of these is the subject of a recent MIT study titled Data Centers on Wheels: Emissions From Computing Onboard Autonomous Vehicles, which came to this conclusion: AVs are potentially an environmental disaster, even if they’re all battery EVs. The reason is really pretty simple: It takes a lot of computing power to drive a car, and all that computing power takes energy. And making that energy can produce emissions. In short, nothing is free.

Here’s the abstract to the study, which gives a good sense of what MIT found. I guess that’s literally what an abstract is supposed to do, isn’t it? Anyway, here:


While much attention has been paid to data centers’ greenhouse gas emissions, less attention has been paid to autonomous vehicles’ (AVs) potential emissions. In this work, we introduce a framework to probabilistically model the emissions from computing onboard a global fleet of AVs and show that the emissions have the potential to make a nonnegligible impact on global emissions, comparable to that of all data centers today.

Holy crap, that’s an alarming comparison: as much emissions output as all data centers? We’ll get to that more in a bit.

Based on current trends, a widespread AV adoption scenario where approximately 95% of all vehicles are autonomous requires computer power to be less than 1.2 kW for emissions from computing on AVs to be less than emissions from all data centers in 2018 in 90% of modeled scenarios. Anticipating a future scenario with high adoption of AVs, business-as-usual decarbonization, and workloads doubling every three years, hardware efficiency must double every 1.1 years for emissions in 2050 to equal 2018 data center emissions. The rate of increase in hardware efficiency needed in many scenarios to contain emissions is faster than the current rate. We discuss several avenues of future research unique to AVs to further analyze and potentially reduce the carbon footprint of AVs.
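To get a feel for the race the abstract is describing, here’s a quick back-of-the-envelope sketch in Python. The function name and every number in it are my own illustrative assumptions, not MIT’s modeled inputs; the point is just to show how fast per-car compute power balloons whenever hardware efficiency doubles even a little more slowly than the workload does.

```python
# Toy compounding sketch, NOT the study's probabilistic model.
# Assumed values throughout: a 1.0 kW starting draw, workload doubling
# every 3 years, and two hypothetical hardware-efficiency doubling times.

def per_vehicle_power_kw(years, base_power_kw=1.0,
                         workload_doubling_yrs=3.0,
                         efficiency_doubling_yrs=4.0):
    """Onboard compute power after `years`, if the driving workload doubles
    every `workload_doubling_yrs` while ops-per-watt doubles every
    `efficiency_doubling_yrs` (all parameter values are assumptions)."""
    workload_growth = 2 ** (years / workload_doubling_yrs)
    efficiency_growth = 2 ** (years / efficiency_doubling_yrs)
    return base_power_kw * workload_growth / efficiency_growth

for eff_doubling in (3.0, 4.0):  # efficiency keeping pace vs. lagging a bit
    trajectory = ", ".join(
        f"+{y}y: {per_vehicle_power_kw(y, efficiency_doubling_yrs=eff_doubling):.2f} kW"
        for y in (0, 9, 18, 27)
    )
    print(f"efficiency doubles every {eff_doubling:.0f} years -> {trajectory}")
```

Even a modest lag compounds into several times the power per car over a few decades, which is the crunch the abstract describes: the efficiency gains required outrun the ones we’re currently getting.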

I mean, think about it: All cars today are driven using energy-hungry computing power, it’s just that the computer in question there lives in your skull and is powered by fistfuls of Pizza Rolls and lots of caffeine. And, if we think about driving in terms of computational efficiency, human driving is incredible, because you’re just re-purposing your brain-computer to drive the car, and you’d need to keep that thing running anyway, so the computational cost is really zero. But once we move to automated driving, then we end up having to add a completely separate computational system to drive the car, and that computer is doing a hell of a lot of work, and that sort of computation takes a lot of energy.

How much energy? Well, look at this:


The study found that with a mass global takeup of autonomous vehicles, the powerful onboard computers needed to run them could generate as many greenhouse gas emissions as all the data centres in operation today.

These data centres currently produce around 0.14 gigatonnes of greenhouse gas emissions per year, equivalent to the entire output of Argentina or around 0.3 per cent of global emissions, according to the researchers.

So, if AVs really catch on in a big way, the advanced, deep-neural-network-running computers that drive them could create emissions equal to those of all the computer data centers currently running, which together produce as much greenhouse gas as Argentina! Holy crap. And that doesn’t really factor in other consequences, like how the constant demand for power from computational hardware would necessarily reduce the available range of a battery-electric automated vehicle, possibly spurring the need for larger batteries, which means more weight, which means less efficiency and greater vehicle energy demand, and so on.

Like I said, nothing is free. If you want your car to drive itself, that takes energy. To compute the amount of energy needed and emissions produced, the study used four main variables: the number of AVs operating on Earth, how many hours each vehicle operates, the power of each vehicle’s computer, and the amount of emissions produced per unit of electricity used to power the car and its systems. The study doesn’t just point out this issue; it also suggests some possible ways to mitigate it, mostly via the development of specialized hardware optimized for just the sorts of tasks AVs require.
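Just to make that four-variable bookkeeping concrete, here’s a minimal sketch with placeholder numbers I picked myself (the study runs this probabilistically across many scenarios, so don’t treat these inputs as MIT’s):

```python
# Back-of-the-envelope version of the study's four-variable framework.
# Every input below is an illustrative placeholder, not a figure from the paper.

AV_FLEET_SIZE      = 1_000_000_000  # assumed number of AVs operating worldwide
HOURS_PER_DAY      = 1.0            # assumed hours each AV drives per day
COMPUTER_POWER_KW  = 0.84           # assumed onboard compute draw, in kW
GRID_G_CO2_PER_KWH = 475            # assumed grid carbon intensity, gCO2 per kWh

annual_compute_kwh = AV_FLEET_SIZE * HOURS_PER_DAY * 365 * COMPUTER_POWER_KW
annual_gt_co2 = annual_compute_kwh * GRID_G_CO2_PER_KWH / 1e15  # grams -> gigatonnes

print(f"Onboard-compute emissions: ~{annual_gt_co2:.2f} Gt CO2 per year")
print("Data centers in 2018 (the study's benchmark): ~0.14 Gt CO2 per year")
```

With those particular placeholders the fleet’s onboard computing lands right around the data-center benchmark, which is the comparison the abstract is making; swap in your own assumptions and the answer swings by an order of magnitude either way, which is exactly why the study models it probabilistically.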

It’s also worth noting that nothing is certain here by a long shot. Even the basic premise that there will be vast numbers of Level 5-ish AVs on the road is by no means guaranteed, for example, and, if that many AVs are deployed, it’s very possible they could have positive environmental effects too, as they could lessen private car ownership and make trips more efficient in ways not possible today. Or they could end up driving as much or even more, even if there are fewer cars on the road. The point is that nothing is really known yet, and while the study brings up an excellent, fundamental point – computation in these cars demands significant energy – how the widespread deployment of AVs will affect the environment really isn’t clear at all.

I personally think that AVs could be less computationally demanding if there’s more infrastructural assistance for their use, and if the focus is on Level 4 – that is, restricted to a particular operating domain – as opposed to the near-magical Level 5, which can operate anywhere, anytime. But infrastructure assistance has its own significant hurdles, including getting actual standards in place and buy-in from government agencies, all of which are highly demanding in their own ways.

It really is amazing that this has been glossed over for as long as it has been; currently deployed Level 2 systems like Tesla FSD Beta may be taking a toll on range, but it doesn’t seem that a really careful test of this has been undertaken. It seems a lot of the computation happens regardless of whether the system is active or not (operating in an observational “shadow mode”), but I can’t completely confirm this.


The more AVs that get deployed into the world, the more demand for power. It’s also possible that AVs may encourage more driving, since cars could hypothetically do things like drop you off at work, drive back home, and then come back to pick you up later. This could ease demand for parking places, but increase driving. There are so many unknowns here. But we do know that computation takes power, often nontrivial amounts of it. No way around that.

 


50 Comments
Cheap Bastard
11 months ago

So how does the energy lost to extra computational power stack up with the energy losses caused by human drivers?

In the ideal AV world AVs will travel at the posted speed limit in the fast lane with each lane to the right going a bit slower to minimize risk of collisions. AVs will keep lane changes to a minimum and leave shorter but adequate safety distances between vehicles. If there is a minor collision they will record everything and pull over to the shoulder or if possible exit the roadway entirely to exchange information and wait for the cops they automatically called. In the case of bad collisions AVs will also not worsen congestion by rubbernecking as they pass a scene no matter how gruesomely entertaining and click bait-y it might be. AVs can also avoid road hazards by knowing exactly where those hazards are thanks to that information being passed down from cars ahead. (Ideally) AVs will also not run red lights or stop signs preventing even more collisions. AVs can also bring themselves in for simple needed maintenance like topping off the tire pressure.

No speeding, no hard starts and stops means less energy used. Fewer collisions and rigorously following the rules when they do happen means less congestion which means even more energy savings. Proper maintenance helps save energy too.

Many humans OTOH tend to feel they are the exception when it comes to traffic laws: They speed, run lights and stop signs, tailgate, weave through traffic, brake hard and accelerate at will. Humans also don’t multitask well behind the wheel but boy do they try! Some humans can’t even be bothered to keep their tires at the right pressure which is a huge energy drain. Many humans if involved in a fender bender stop right in the middle of the road blocking traffic despite repeated instructions to get off the damn road.

That all adds up to big energy losses.

Michael
1 year ago

“The rate of increase in hardware efficiency needed in many scenarios to contain emissions is faster than the current rate. ”

I’m noticing a theme present in much of tech evangelism in regards to autonomous and electric vehicles. It seems that a lot of this technology’s promise is predicated upon an assumption that the future rate of technological advancement can only remain constant or possibly even increase. No one ever seems to ask what might happen when it hits a brick wall as progress dependent on human ingenuity and limited by laws of physics sometimes tends to do. We’re told that major revolutionary milestones are just around the corner (see: solid state batteries), and that we all just need to believe.

Cheap Bastard
11 months ago
Reply to  Michael

“We’re told that major revolutionary milestones are just around the corner (see: solid state batteries), and that we all just need to believe.”

*cough* hydrogen *cough*

Ron888
1 year ago

I probably shouldn’t bother commenting because full autonomy isn’t happening, but their numbers feel WAY off as well.
The best existing safety suites use very little energy. Did they base their numbers on the worst system out there?

TL/DR Nothing to see here

Geemy
1 year ago

“It takes a lot of computing power to drive a car, and all that computing power takes energy”
Spoiler alert: it takes a lot of power to move a two-ton car. Autonomous driving systems will get more efficient with time and will also be able to improve driving efficiency.
The big question is robotaxis. There will be overhead from miles driven with no passenger, but shared robotaxis vs. personal cars have the potential to save a lot of energy and resources too:
-replace some two-way trips with one-way trips, like driving kids to school
-fewer parking spaces
-smoother traffic, less idling, and better efficiency for everyone with more autonomous cars on the roads. In a far future where (almost) all cars are autonomous or have autonomous capabilities, the same amount of traffic could be handled with fewer lanes. Even traffic lights and stop signs could be completely overhauled with autonomous driving in mind.
-fewer cars need to be manufactured
-they will drive more miles per year, so batteries will be less susceptible to degradation over time and will cover more miles
-fewer miles driven looking for street parking
-you can imagine family pods with 2/4/6 reclining “business class” seats that will only be used for long trips and will transport you comfortably 500/800 miles overnight to your destination far more efficiently than planes, plus small lightweight robotaxis to drive you around town for commutes and short trips. This could be way more efficient than everyone owning heavy long-range SUVs.

Fred Allegrezza
1 year ago
Reply to  Geemy

Unfortunately, people will likely increase their commute times since they can read a book, play games, or even do work while they drive. That extra driving will use much more energy than the computing does.

Eric R
1 year ago

Apart from electricity AVs also tend to use other things at less eco friendly rates.
Take this with a grain of salt as it’s based on my personal experience and it may change as technology progresses, but AVs eat brake pads like cheerios. I loved the adaptive cruise I had in an ’18 Volvo, but the brake pads needed changing at 25k.

Humans (when paying attention) can anticipate changing circumstances better than our current AVs can. Take the scenario where the car in front of you is braking because a truck in front of them is turning. A human can recognize the scenario, braking less and coasting into the space they’ve left between them and the car in front, anticipating that car to resume its previous speed. You’d then regain your following distance by accelerating appropriately. In my experience the AV has to “play it safe” and brake as much as the car in front of them is and maintain their following distance at all times.

Geemy
1 year ago
Reply to  Eric R

Can a good driver drive more efficiently than current adaptive cruise control or autonomous systems? Yes, of course.
Can a bad driver do way worse? I think you already have the answer.
Once autonomous driving is safe, the focus will shift to comfort and efficiency, and I have no doubt that they will outperform human drivers even accounting for the computing power. I think worrying about autonomous systems’ power consumption is really looking in the wrong direction.

M K
1 year ago

This went a different direction than I expected. I was anticipating a calculation of all the dead-headed miles en route to pick up a passenger, or roaming when there are no customers…But yeah, computers use energy too. At least they are not just mindlessly calculating numbers in hopes of a crypto coin reward. Once we rid the earth of that scourge, I’ll start worrying about the energy other computers are using.

Boggart Hole-Clough
1 year ago

If autonomous driving is to become more efficient in terms of both operation and power consumption then global standardisation is going to be required. Especially if road infrastructure is to play a part.
To what end is the FIA involved in this and how much energy are they putting into it? (sorry, bad pun but I couldn’t resist*) They should be all over this; after all, it is one of the reasons they exist.

*resist – second bad pun, still not sorry.

Kal Wayne
1 year ago

I’m not convinced that the actual power usage of the onboard “self driving” systems will be significant compared to the travel overhead.

I recall a NYC traffic engineer talking about how taxis and “disruptive” services like Uber travel about 1.6 miles for every mile a car driven by the occupant drives. Even if you reduce the number of cars on the road, you increase the number of miles driven. This applies equally to the self driving taxis that will likely be more profitable and thus be pushed in favor of car ownership when/if self driving cars become popular.

60% more miles driven will almost certainly dwarf any power usage by the onboard computers. An EV (or any car, really) absolutely obliterates my power hungry desktop in power usage, and this thing is perfectly capable of training computer vision models. Given that we can train neural networks that can then run on a smartphone, I think it is a bit naive to assume that future cars will use anywhere near 60% more energy to power their onboard computers. Even 10 or 20% being sent to the onboard computer sounds like a major engineering failure.

Here’s the thing though: if you don’t want to worry about driving and just want to get to your destination safe, unharmed, and comfortable, we already have a solution. (You can even get a sleeper car.) They are called trains and street cars. Unfortunately, no one wants to invest in public transportation this side of the pond, so here we are.

A F
1 year ago

Conclusions are only as valid as the assumptions the intervening calculations are based on. The big assumption here is 1.2 kW of constant computational power needed to handle the autonomous driving. That’s basically the peak demand of a very high-end gaming or video editing PC. But (assuming L5 is even possible at any power draw) that number could be off by three full orders of magnitude on the low end, i.e., current smartphone power draws, to maybe one order of magnitude greater on the high end (which would be limited by packaging and heat elimination constraints).

And that’s not even considering any of the potential energy usage changes from widespread adoption of autonomous vehicles. For example, people could buy fewer cars. It takes 10,000 to 20,000 kWh of energy to produce 1 car (largely depending on weight), so that could be a substantial offset.

Given this wide range of possible inputs, there is zero utility in the conclusions made in this article.

Roofless
1 year ago

I mean, this is the same argument leveled against EVs – “If the electricity is created in a coal power plant, it still releases GHGs!” Well, yes – and if it’s created via a solar plant? “Datacenters use electricity!” Ok, and if that electricity is hydro, does that still release GHGs? Yes, we need to move away from fossil fuels for power generation if we want EVs to not cause GHG emissions, but we’ve actually _got_ that option for the EVs. Now go complain to your utility company to solve the other half.

The handwringing over the energy cost of the computing power belies a fair lack of understanding of how much energy’s actually available in an EV as well – a Tesla’s got an 80 kWh battery. They’re saying the compute budget’s gotta stay under 1.2 kW, which is an absolutely gobsmacking amount of juice to play with for dedicated hardware, and rounds to about a 10% hit on range if you drive the car for 7 hours. You could power your house for a week on what your car uses to ferry you around for half a day.

Acid Tonic
1 year ago

Still think diesel is a far better technology. Can we compare EV datacenter emissions to diesel now?

I recently talked a friend out of a gas engine for his next vehicle and got him into a ’17 A8L TDI and he is blown away and now a diesel convert. Keeps telling me how he gets 42 mpg on the highway with a huge barge of an AWD car and simply can’t believe how sporty 466 ft-lbs of torque is at low rpm for “only 240hp”.

We need more actual efficient vehicles. News flash, efficiency is literally always going to mean more emissions. Even the fancy new homogenous charge combustion gas engines claiming diesel-like efficiency started emitting diesel-like levels of NOx. EVs are using rare earth minerals, AI and datacenter pollution, plus batteries’ internal resistance means all our hard work to charge them is bleeding away as self-discharge, which no one talks about (but would if gas evaporated as quickly).

I just will go against the fad and say I don’t think 9,000-lb tanks going 0-60 in 2 seconds with self-discharging fire-starting rare earth batteries are doing *anything* to save our environment.

Small aluminum non-rusting 2 seater diesel commuter cars are what we need but nobody will build them. But we can get aluminum frames for batteries. And we can put 7 L diesels into trucks no problem.

We are misguided.

David Muse
1 year ago

This is nonsense. Tesla cars, with their “massive” self driving computer, regularly top efficiency studies compared to other EVs. So the computer seems to hardly make a dent in energy usage.

Sorry, but the concern just doesn’t sound valid to me. The study even points out that improvements in computing efficiency (pretty certain to occur) will eliminate this problem.

Ian McClure
1 year ago
Reply to  David Muse

I think it’s perfectly reasonable to assume any car would need a significant increase in computing power to *actually* autonomously drive, which the Tesla does not.

Boulevard_Yachtsman
1 year ago

Wait, I thought AVs were supposed to be powered by self-sustaining Blockchain? The AVs were going to create Non Functional Trips (NFTs), which one could leverage into an Electrons Likely Operating Nowhere payment. These ELON-payments could then be saved up as bits of tokens until the Grid becomes smart enough to use them as a power source to complete more Blockchain.

Is this not yet happening? Should I check my FTX balance?

Kal Wayne
1 year ago

For whatever it’s worth, the difference here is that the technology in question also powers your search engine, the suggestions on whatever video services you might use, and machine translation like Google Translate.

Blockchain and NFTs on the other hand have no proven use case. Idiots like Elon misuse machine learning because they don’t understand the current limitations, and don’t want to hear anything but “yes” from their employees.

RootWyrm
1 year ago

And leave it to MIT to miss the other part of things.
That being: any sort of “self-driving” car is also fully dependent on very large datacenters. No, your car does not even remotely have the required computing power to do half of the tasks required. No matter what car it is. The NV DGX H100, “AI for enterprise” using H100 Tensors, requires over 11kW of continuous power to operate.

That’s the kind of box doing the hard work behind the scenes, figuring out what collection of pixels is a kid and which is a cone, no matter how poorly it does it. 11 kilowatts, continuous, for each one. That’s 264kWh, per machine, per day. Requiring dozens to hundreds of them to run image classification in order to construct inference structures that the relative equivalent of a broken solar powered pocket calculator in your car can handle.
So in addition to all the other waste, you absolutely have to include multiple datacenters in the 1-9MW and 10MW+ bands at minimum, and more likely in the 100MW+ class. Yes, megawatts. Do the math. Just 4 H100’s is a megawatt of demand. 32 of them is 8.5MW continuous not including necessary storage (~7-20kW per cabinet,) networking (+1-2kW per cabinet,) and ancillary systems (anywhere from 4kW to 50kW per cabinet.) And we haven’t even talked about cooling.

East Cermak (Lakeside) is now leasing over 120MW of power for just 1.1 million square feet. Lots of that space is taken up by 50+ generators, 20+ 30,000 gallon tanks of diesel, large gothic hallways, and four fiber vaults. The “iDataCenter” (Maiden, NC) is about 500k square feet and has over 80MW of supporting solar alone.

You really have no concept at all of just how much power these facilities actually use without having first-hand experience at that scale. The best way that I can give you some idea as to the scale?
The combined output of the two largest nuclear power plants in the US produce 62,973GWh annually.
This would only fulfill 30% of global datacenter power usage for 2018.

“The rate of increase in hardware efficiency needed in many scenarios to contain emissions is faster than the current rate.”

Which is a colossal understatement. MIT’s vastly underestimated the necessary gains (it’s not 1.1x unless you ignore necessary precision gains. With those, it’s likely 10x.) And this ain’t coming. Period. Everyone’s rushing headlong into ARM as the savior, but in fact, ARM’s only trick is offering lower performance at lower per-watt cost.
The world’s “greenest” supercomputer is the Lenovo ThinkSystem SR670V2 at Flatiron. Period. It achieves 65.091 GFLOPS/watt. It is powered by Intel Xeon Platinum 8362’s and 80 Nvidia H100’s. Each node is 10kW. That’s every single node. Number two is ORNL’s HPE/Cray Frontier TDS at 62.684 using EPYC’s and MI250X’s. 74 cabinets of them. Eating over 21MW continuous, excluding cooling.
ARM doesn’t even make the list. No, I don’t mean “doesn’t make the top 10.” I mean it does not make the list, at all. And no, Fugaku doesn’t count. The A64FX is a Fujitsu vector processor core with an ARMv8.2 (circa 2016) tacked on to be the interface.

Moore’s Law is long dead, just like the company its author founded and the company its author was CEO of later on! (But not according to Gelsinger because talk about a reality distortion field. Putting two dies on one package does not count, Pat.) It’s been dead over a decade. And the net performance difference between an AMD EPYC (Zen1) from 2017 and an AMD EPYC Genoa (2022) on a per core basis is not that big. EPYC 7601 (32c/64t 2.2/2.6GHz) to EPYC 7502 (32c/64t 2.5/3.35GHz) saw best case numbers of about 20% or 0.20x, mostly from clock and cache. And 7502 to 7513 (32c/64t 2.6/3.65GHz) saw another ~18% typical or 0.18x.

So in a 5 year span, the greatest leap in CPU performance is not “X + Y + Z.” That’s not how it works. It’s actually about 25% average, or 0.25x; the average difference between the 7601 and the 7513. Certain workloads got much much higher benefit (as much as 40%,) others not as much (as little as 5%.) That’s nowhere near doubling. And the power increased from 180W to 200W.

And as MIT succinctly points out: what is required at minimum is doubling efficiency, every year. This can be split between performance and power equally for simplicity. So to meet these requirements, what actually would have had to happen would be the 7601 to 7513 would need not 500% performance gains, but a 1,600% performance gain over 5 years.
There is not a single class or piece of silicon out there that has doubled in performance or increased in performance and efficiency at that rate in many decades. And no, GPUs don’t even come close to meeting it either. The A100 to H100 would have needed 38,800 GFLOPS (DP-FMA) at 250W; it got 33,500 at 700W.
And like it or not? This isn’t fixable. Because physics – not just a suggestion, it’s the LAW! You want Moore’s law with double the transistors every year, then you get double the power every year. You want to shrink the process to offset it? You can’t. Source-to-drain leakage and materials long ago hit the wall, and publicized process ‘shrinks’ are marketing bullshit. TSMC’s “3 nanometer” has zero relation to any physical measurement. It’s actually 48nm gate pitch, 24nm minimum theoretical, if EUV multi litho gets sorted – and that’s a big if. 5nm was supposed to be 42nm/24nm; instead it ended up 51-57nm/28-36nm – not even close to the IRDS roadmap. “3nm” doesn’t even hit the IRDS “7nm” roadmap that was supposed to be volume by 2019!

Like I said. They’re underselling and understating just how screwed we are, and just how permanently unfixable it is. Because it’s not about “oh nobody’s investing.” Everybody’s pouring cash into this. Mountains of it. It’s not more profitable to be behind the curve, because smaller gates mean smaller dies mean selling more parts per fixed-size wafer. Even worse, immersion (pre-7nm) is double the throughput of wafers per day over EUV until you hit 4+ pattern layer. So the new processes mean much, much lower output potential. Samsung’s 7nm capped at 1500 wafers/day, TSMC N7+ at 1000 wafers/day. TSMC is still hoping to get up to 2,000 wafers/day on 5nm versus 6,000+ on immersion.
Which is why everyone in the know has been increasingly concerned for years now. You can spend a million a month on lawyers to bankrupt the competition, but you can’t sue physics out of existence. And you can’t spend physics into submission either. Trust them. They’ve tried. It didn’t work.

Kal Wayne
1 year ago
Reply to  RootWyrm

The power used to train a model is significant, yes. But in general (pun not intended) smaller models generalize better, have less danger of overfitting, and as such use less power in the vehicle itself. Plenty of these models are trained on larger computers so even smartphones can forward-feed data through a neural network and get a reasonable classification. The power usage of the data centers and supercomputers pales in comparison to how much energy we spend on transportation in general. The end users’ power usage will very likely still dwarf the supercomputers training said model.

Still, the technology to make “self driving” safe enough to not require human interaction (in other words, what would need to happen to make it useful, outside of driver aids that make up for our lack of attention rather than replace it) just isn’t there yet. When a system has a 0.5% false positive rate, that sounds good. So does one with a 0.5% false negative rate. Applied across the literally trillions of miles Americans drive every year, that would add up to a hell of a lot of accidents.

RootWyrm
1 year ago
Reply to  Kal Wayne

What you’ve just said is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul.

And that’s not just a quote.

Kal Wayne
1 year ago
Reply to  RootWyrm

It is really difficult not to come back with something just as idiotic as your completely out of context response, but here we go anyway.

If you don’t know what overfitting is, or how that might relate to smaller models, you don’t know the first thing about neural networks. Blathering on about how Moore’s law hasn’t held up will convince people without higher degrees in STEM that you know what you’re talking about, but I’m guessing your expertise lies outside this specific kind of model. Training is energy intensive, but not compared to transportation. Worldwide data center energy usage is projected to be about 3% in 2025. In the US, transportation is about 28%. That’s before we account for the fact that most of the data centers and supercomputers are serving everyday requests like Google searches or loading websites, not training a computer vision model.

Honestly, by your previous post and your response, I take it you have some familiarity with computers or engineering, but if you were unable to understand my post, you obviously don’t understand even basic terminology that even CS undergrads should know if they’ve had a single class dealing with deep learning. If you didn’t understand the bit about base rate fallacy, you apparently don’t understand statistics.

Kal Wayne
1 year ago
Reply to  RootWyrm

Oh, and for the record, a single GPU with mere hundreds of watts is sufficient to do a surprisingly good job with many classification and segmentation tasks. A forward on such a model is not *nearly* as energy intensive as training in the first place. Not that I expect you to know what I’m talking about.

RootWyrm
1 year ago
Reply to  Kal Wayne

Son, you really are just embarrassing yourself by putting your ignorance on show at this point. You’re just another crypto-turned-AI desktop kiddie who thinks that a few hours with a 3090Ti and PyTorch and DeepDream on Ubuntu somehow makes you the world’s leading expert on this.

I’m the guy who’s been designing and shipping systems used for this kind of work longer than you’ve been on this earth. You don’t even know the difference between faux-AI types, and can’t even begin to understand the concepts of adversarial versus cooperative models, much less the advantages of vector to FMA.

You’re literally doing nothing but regurgitating a bunch of nonsense buzzword marketing. None of which is true or accurate. Or are you going to prove how great AI is at ‘classification’ by having your Python render a speed limit sign?

Kal Wayne
1 year ago
Reply to  RootWyrm

One more comment… You bring up ARM vs x86_64 server processors… You realize M2 exists? I won’t buy an Apple product for other reasons, but you are cherry picking your targets. And for that matter, CPU architecture is almost entirely irrelevant to the power usage of training computer vision models. I’m sure you already knew that CPUs are mostly useless for training these kinds of models, right?

Ron888
1 year ago
Reply to  RootWyrm

A perhaps amusing side note-
Since I got my first computer in about 2001 I have followed computer industry news. For years the hand-wringing talk was about the coming hard limit, and ohmygodwhatwillwedo?!
Then someone decided to use several cores in one computer and I’m like “what? You can do that? You’ve been able to do it all along??!”
From that time on I lost a ton of respect for computer people.

Vetatur Fumare
1 year ago

But obviously the miles driven will increase exponentially with (full) AVs. People will set them to drive in a loop while they’re at work to avoid having to pay for parking, others will send them back and forth on pointless errands – a wealthy acquaintance had Uber pick up the spare key for their beach car in Manhattan and drive it out to East Hampton, now imagine everyone doing this all of the time.
Humans kinda suck and will find a way to make the worst of any new gadget or ability.

Dogisbadob
1 year ago
Reply to  Vetatur Fumare

This. Especially when parking is scarce. They will keep driving around.

Nlpnt
1 year ago
Reply to  Vetatur Fumare

Not only that, but ride services will flood the zone with their cars to boost market share once the labor cost of one driver per car is gone. Right now, with the medallion system that was cities’ traditional way of regulating the number of cabs on the street having been demolished in the last round of “disruption”, the labor cost is the only limiting factor.

Jason Roth
1 year ago

“equivalent to the entire output of Argentina”

I’m sorry. I’m going to need a more comprehensible measurement. How about a number of ’73 Beetles?

GhosnInABox
1 year ago

“…but it’s bad for the environment” caps the conversation for most automotive technology discussions. Followed immediately by: “Well…will it make us rich?”

Funny how a certain EV millionaire is also developing ways to leave the planet.

Larry
1 year ago
Reply to  GhosnInABox

Since it’s a one-way trip to leave the planet, it would be a pity if the certain EV millionaire were to be the first passenger. /s

Cheap Bastard
11 months ago
Reply to  Larry

“Since it’s a one-way trip to leave the planet, it would be a pity if the certain EV millionaire were to be the first passenger. /s”

FWIW those first colonists off planet will endure a lifestyle that will make the trashiest, shittiest trailer park on Earth seem opulent by comparison. At least in a trailer park you have food (even if it’s just low nutrition empty lot weeds, dumpster maggots or roadkill skunk), filthy gutter water that can be filtered and steam condensed to become potable, earth normal gravity, unlimited breathable air, (probably) trivial radiation, unlimited iodized salt, a FAR more temperate climate and WAY better medical facilities within reach.

In many US trailer parks you can have food delivered in 30 minutes or less and two day delivery of stuff from Amazon Prime. What luxury! Try that on Mars. You’d be lucky to get your delivery within a year and boy will it cost ya!

What do you see out your colony window? No plants, no animals, no water, just rocks, dust, rocks, dust, more rocks maybe sometimes a dust storm. If you’re lucky MAYBE some ice but that’s a kind of rock too. What do you see when you look up? The exact same stars looking a whole lot like what you’d see had you’d stayed on Earth. It’s going to get old, real fast.

Changed your mind? Tough! You’re stuck there. Take heart, your misery won’t last long.

Those first off world colonists will likely die within a short time from something that would be trivial on Earth but is catastrophic off world (or simply commit suicide) and thanks to the cryogenic oxygenless environment their remains will be preserved to eventually be eaten in desperation by a subsequent wave of colonists when their own food runs out.

The only bright side is unlike terrestrial colonists off world colonists won’t have to worry about the natives or the animals.

Good luck Mr/Ms EV millionaires.

Pro tip for those subsequent colonists: Bring plenty of Teriyaki marinade/sauce. You’re gonna need it!

Querty
1 year ago

If the energy you use is cleanly produced, who really cares?

BolognaBurrito
1 year ago
Reply to  Querty

Even cleanly produced energy has a cost (ecological, financial, and societal) behind it.

RootWyrm
1 year ago
Reply to  BolognaBurrito

The world’s largest solar plant, Bhadla Solar Park in India, occupies over 22 square miles, cost more than $1.3B despite using the cheapest type of photovoltaic cells, has had no end of problems because the region is nearly uninhabitable due to temperatures and extreme sand storms (which are bad for solar, duh,) and doesn’t even make enough power to be in the top 100 power plants in the US.
You got another 14,000 unusable acres lying around conveniently located near major transmission infrastructure? I don’t. Yeah. Bit of a rub there innit?

And hydroelectric… hooboy do I know hydro. Thank you, day job and curiosity.
Hydroelectric dams invariably and without exception have absolutely massive ecological impacts. Dozens to hundreds of square miles of land are flooded and downstream water delivery is severely curtailed and sometimes even completely eliminated. Which destroys vital wetlands, shoals, wildlife habitats, and the like. Which is why many perfectly serviceable old hydroelectric plants, instead of being refit, are being very carefully deconstructed. (Because you can’t just tear them down, especially after a bunch of people built homes in dried up wetlands.)

Which is also why nobody except China (holy shit just look at the real history of Three Gorges) and India is aggressively building hydroelectric, and overhauls are being very, very heavily scrutinized. And India’s tearing up treaties building theirs. (Water rights: don’t even google that shitshow. Seriously.) Any new hydroelectric dam in sane countries is looking very long and hard first and foremost at the ecological impacts, not just “hey we can make zappies for cheap.”

And this is despite the fact that it is not just ‘clean,’ but also economical, by far the most reliable, and incredibly efficient. Properly constructed and operated turbines can run continuously for decades. They’re extremely affordable compared to many other turbines. It’s extremely low maintenance. Hydroelectric is genuinely really really good stuff. Hell, OPP’s 15 generator plant (Horseshoe falls, not Niagara) was built in 1905, didn’t get upgraded to 60Hz until 1972-1976, and kept running until 1999. How’s that for reducing waste?

But none of those benefits are realized if drying up a river results in billions of dollars in storm damage on an annual basis. Or if you cause the extinction of several species that turn out to be vital to the ecosystem. Or you just look at Lake Mead; Hoover Dam’s capacity dropped by over 40% from low water levels, and Lake Powell got so low upstream that Glen Canyon had to be reduced to almost nothing.

Kal Wayne
1 year ago
Reply to  RootWyrm

One of my earlier jobs involved building bathymetric grids for hindcasting hurricane storm surge in the aftermath of Katrina. The lab has since moved on to forecasting and has worked extensively with NOAA on a global model.

Suffice it to say, warmer temperatures (sea surface temperature in particular) have affected the intensity and rate of intensification of hurricanes. Reduced wind shear means that as the globe warms, fewer hurricanes will form in the first place, but those that do will gain more energy, intensify faster and more unpredictably, and cause a lot more damage. (I would recommend anyone who is interested in how climate change affects hurricanes look into Kerry Emanuel’s publications. He may be from *gasp* MIT but he is well renowned in the field. And for the record, his conservative ideology is somewhat at odds with my own, but he hasn’t let his political leanings get in the way of the facts.)

Hydroelectric absolutely has its drawbacks. Nonetheless, doing nothing has multi-billion dollar drawbacks as well, not to mention massive migrant crises etc. Typhoons affect a lot of countries in the third world, not to mention the sea level rise issue. I don’t mean to sound confrontational, but it is worth keeping these things in context.

Brian Beaulaurier
1 year ago
Reply to  Querty

Pop1

...getstoneyII
1 year ago

Boy, aside from the computational alley that this study is examining (which is fine, but stats can always be manipulated), I really feel strongly that AVs are such a horrible idea for the gen pop.

I see immense value in them for use cases like industrial areas and the like, no doubt about that. But, to expect people to be able to only half-ass drive and then be able to take over in a split second is insane. By insane, I mean that there is not any training to prove a person is capable of responding correctly. It should be a whole separate category of licensing to be able to “operate”, which requires hours of in-class and in-the-field testing.

My 70-year-old neighbor should not be able to operate any level of AV without some sort of update to their 54-year-old license. Tesla, Super Cruise, or whatever else is out there…

Detroit-Lightning
1 year ago

So at this point, AVs *might* make driving safer…but aside from all of that, the only benefit is ultimately to the owners of the fleets that will, potentially, be making money on all of these vehicles driving paying customers around?

Drew
1 year ago

“the computer in question there lives in your skull and is powered by fistfuls of Pizza Rolls and lots of caffeine.”
Man, you better not convince everyone to start my ideal diet. If pizza roll demand skyrockets, I’m going to be reduced to just eating heaping scoops of peanut butter.

Eric Busch
1 year ago
Reply to  Drew

With your fingers, no doubt.

Fuzzyweis
1 year ago

I think for automated vehicles, they should develop a ‘designated road’ for them, maybe even using some kind of physical link to the road, so the vehicle can’t really stray from the road, so they can just troll on down the road along with other traffic, and make the vehicle fairly large with big entrances and lots of seating, and people could get on and off as they need….

it’s a Trolley, I’m talking about a Trolley….

Drew
1 year ago
Reply to  Fuzzyweis

but how are you going to get a trolley to go through Elon’s underground Vegas tunnel, huh? We’re going to need a whole underground trolley technology. Perhaps a series of linked trolleys. Oh, dear, I’ve come up with the idea for a subway.

Nlpnt
1 year ago
Reply to  Drew

I think Adam Something has the copyright on that schtick.

Phuzz
1 year ago
Reply to  Nlpnt

Wasn’t that everyone’s first reaction when Elon’s HyperTube was first announced? “Well done Elon, you’ve invented the underground train 150 years after everyone else”.

Drew
1 year ago
Reply to  Phuzz

Yeah, everyone has noticed that tech bros tend to invent shittier versions of existing things to solve problems that aren’t technological, and Elon went all-in on that.

Jim Stock
1 year ago

I have been reading automotive news since 1987 and do not remember any argument for AVs other than that they are for safety and replacing bad human drivers. I must have missed the stuff about them being better for the environment, as that is not something I have a memory of ever seeing. I am not saying there was none; I am just saying that never seemed to be the main reason to go AV.

Drew
1 year ago
Reply to  Jim Stock

Generally speaking, tech bros have posited that autonomous cars could be better for the environment with the “vehicle as a service” model, which would theoretically reduce the number of overall vehicles. Basically, they assume the car is not in use the majority of the time, so others would use it in the interim. They forget that there are times of heaviest use and that a large number of the vehicles would still sit stagnant between rush hours. They also forget that public transit could accomplish the same thing in cities and rural areas don’t have the density for vehicle-as-a-service to work.

So, yeah, automotive news doesn’t say it, but tech news sometimes does. And it is almost entirely based on faulty assumptions.

Mike Harrell
1 year ago
Reply to  Jim Stock

I believe the point is not that anyone had necessarily been arguing as a primary benefit that AVs would be better for the environment but rather that we have been overlooking the possibility that they might be significantly worse for the environment than, say, an otherwise equivalent fleet of EVs that are not self-driving.
