In one of my favorite movies of all time, The Matrix, humans become the power source that keeps machines alive.
Elon Musk must have watched that movie recently, because he just pitched a similar idea. Except he wants idle machines to power the future of intelligence, not the other way around.
On Tesla’s recent third-quarter earnings call, Musk floated this wild idea:
Actually, one of the things I thought, if we’ve got all these cars that maybe are bored, while they’re sort of, if they are bored, we could actually have a giant distributed inference fleet and say, if they’re not actively driving, let’s just have a giant distributed inference fleet.
Translation: every idle Tesla could soon act as a node in a massive AI network. Tens of millions of parked cars, thinking together.
But how would Elon’s mobile supercomputer work?
That’s where things get really interesting…
A Fleet That Thinks
Estimates vary, but as of 2024, there were around five million Teslas on the road worldwide.
Elon Musk has much bigger plans, predicting the fleet might eventually total 100 million cars.
Here’s what he said during Tesla’s recent earnings call:
At some point, if you’ve got tens of millions of cars in the fleet, or maybe at some point 100 million cars in the fleet, and let’s say they had at that point, I don’t know, a kilowatt of inference capability, of high-performance inference capability, that’s 100 gigawatts of inference distributed with power and cooling taken, with cooling and power conversion taken care of. That seems like a pretty significant asset.
In other words, 100 million Teslas, each capable of about one kilowatt of high-performance inference.
That works out to roughly 100 gigawatts of compute power.
To put that in perspective, 100 gigawatts is close to the combined output of 100 nuclear reactors or enough electricity to power 75 million U.S. homes.
A single hyperscale data center from Amazon Web Services or Google Cloud can draw 50 to 100 megawatts of power. You’d need around 1,000 of those to match Musk’s theoretical 100-gigawatt network.
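The back-of-envelope math behind those comparisons is simple enough to check. A minimal sketch (the one-kilowatt-per-car figure is Musk's own assumption, not a published Tesla spec):

```python
# Back-of-envelope math for Musk's distributed-inference fleet.
# The 1 kW per car figure is Musk's stated assumption, not a Tesla spec.

cars = 100_000_000            # hypothetical future fleet size
kw_per_car = 1                # assumed inference capability per car

fleet_kw = cars * kw_per_car
fleet_gw = fleet_kw / 1_000_000     # 1 GW = 1,000,000 kW
print(f"Fleet inference capacity: {fleet_gw:.0f} GW")      # 100 GW

# Compare against a large hyperscale data center drawing ~100 MW.
datacenter_mw = 100
equivalent_datacenters = (fleet_gw * 1000) / datacenter_mw
print(f"Equivalent ~100 MW data centers: {equivalent_datacenters:.0f}")  # 1000
```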
And all that potential computing power would already be built, paid for and sitting in driveways.

Tesla’s full-self-driving computer — known as Hardware 4 — is designed to approach the kind of performance seen in high-end data center chips.
And a next-generation system called AI5 is in development that could deliver several times more processing power, giving every Tesla the kind of onboard compute once reserved for data centers.
What’s more, each car already contains a high-performance processor and power system capable of running complex AI tasks. Each one already has a built-in thermal-management system that keeps chips cool and batteries balanced. And every vehicle is connected to Tesla’s cloud through the same over-the-air update network that pushes new software and maps.
But unlike a server rack, these systems spend most of their time doing nothing, because the average car sits parked roughly 95% of the day.
So Musk’s pitch is simple. Let’s put those idle processors to work.
If you could borrow a little bit of energy and compute from every parked Tesla, you could form a global computing grid that would make today’s cloud networks look far too centralized and inefficient by comparison.
Need to run an image-recognition model, simulate an autonomous-driving scenario or process video data?
Tesla could parcel out those jobs across millions of cars overnight.
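One way to picture that parceling-out is a central scheduler that hands jobs only to cars that are parked and well charged. The sketch below is purely illustrative; every name and rule in it is a hypothetical assumption, since Tesla hasn't described an actual implementation:

```python
# Hypothetical sketch of a scheduler farming inference jobs out to
# parked cars. All names and eligibility rules here are illustrative
# assumptions, not Tesla's actual software.
from dataclasses import dataclass, field

@dataclass
class CarNode:
    vin: str
    parked: bool          # only parked cars accept work
    battery_pct: float    # skip cars with a low charge
    jobs: list = field(default_factory=list)

def eligible(car: CarNode, min_battery: float = 50.0) -> bool:
    """A car can take work only while parked and sufficiently charged."""
    return car.parked and car.battery_pct >= min_battery

def dispatch(jobs: list, fleet: list) -> list:
    """Round-robin jobs across eligible cars; return any jobs left over."""
    nodes = [c for c in fleet if eligible(c)]
    if not nodes:
        return jobs
    for i, job in enumerate(jobs):
        nodes[i % len(nodes)].jobs.append(job)
    return []

fleet = [
    CarNode("VIN001", parked=True, battery_pct=80.0),
    CarNode("VIN002", parked=False, battery_pct=90.0),  # driving: skipped
    CarNode("VIN003", parked=True, battery_pct=30.0),   # low charge: skipped
]
leftover = dispatch(["image-recognition", "driving-sim", "video-processing"], fleet)
print(fleet[0].jobs)   # all three jobs land on the only eligible car
```

The interesting design constraint isn't the scheduling, which is routine, but the eligibility rules: battery protection and owner opt-in would decide how much of the fleet is actually available at any moment.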
This would give Tesla a potential moat that no other automaker — or cloud company — could easily match.
After all, GM and Ford don’t have proprietary chips like the AI5 in their cars. And Amazon doesn’t have five million connected vehicles plugged into its cloud.
It would also help shift AI from centralized supercomputers to distributed inference. That’s the same kind of edge computing model that powers smartphones, drones and industrial robots today.
Because in this scenario, the network wouldn’t need to exist in one central place.
It would live wherever a Tesla is parked.
Here’s My Take
If Musk can actually execute on this wild idea, Tesla’s fleet could rival the largest AI compute clusters on Earth.
But there are hurdles to solve before it could become reality.
Running inference jobs on vehicle batteries could shorten their lifespan if they aren’t managed carefully.
Some owners might refuse to allow their car to be used for Tesla’s compute work, even if they’re compensated. And data-privacy laws in Europe and California would require consent and transparency.
But Tesla already has experience orchestrating massive distributed systems. Every time it updates Autopilot or trains new vision models, it collects and processes video data from millions of cars worldwide.
The difference here is that Musk would want the Tesla fleet not just to train AI, but to run it.
In this future, Tesla's cars would no longer be just vehicles; they'd double as mobile computing assets. Owners might opt in through software, renting out their car's compute cycles while it's parked in exchange for credits or cash.
For Tesla, it would be an entirely new revenue stream layered on top of the existing fleet. And like Musk’s robotaxi venture, it would scale automatically.
Because every new car sold would expand the network’s computing power.
It’s a radical idea. And it would represent a radical shift for the company. If Tesla can pull it off, Musk could end up running the world’s most powerful, most distributed AI network…
Without ever building a data center.
Regards,
Ian King
Chief Strategist, Banyan Hill Publishing
Editor’s Note: We’d love to hear from you!
If you want to share your thoughts or suggestions about the Daily Disruptor, or if there are any specific topics you’d like us to cover, just send an email to [email protected].
Don’t worry, we won’t reveal your full name in the event we publish a response. So feel free to comment away!