Tesla (TSLA) just delivered a rare double blow to Nvidia (NVDA) this past weekend.
CEO Elon Musk revealed that Tesla’s much-talked-about self-driving AI5 chip is almost complete, and that the next one, AI6, is already underway.
With the AI inference side covered, Musk said on Sunday on X that Dojo 3 is being rebooted, pushing Tesla back into full-scale AI training after previously pulling back.
Nvidia struck first, however, when it launched “Alpamayo” at CES 2026 (an open-source AI toolkit for autonomous vehicles), which it aims to make the default autonomy platform powering a wide range of brands.
Musk quickly responded, downplaying the risk.
Clearly, this is an extremely exciting time for the AV industry, with a tug of war between two giants, Nvidia and Tesla.
For Tesla, it’s all about building a closed loop that spans the entire AV stack.
In-car computing designed by Tesla (this includes AI5, which is “almost finished,” and AI6, which is already underway)
Tesla’s camera-based software stack
Tesla’s data flywheel, powered by its own fleet
So for Tesla, it’s all about keeping autonomy locked inside its own ecosystem as Nvidia looks to power everyone else.
For investors, these promises are not new, which makes follow-through all the more critical.
Elon Musk says Tesla’s AI5 chip is nearing completion as next-gen self-driving hardware advances. Photo by Bloomberg on Getty Images
Tesla is trying to tighten its grip on the hardware behind autonomous driving.
Musk announced in an X post on Saturday that the EV giant is nearing completion of its AI5 computer chip and that AI6 is already in development.
According to Musk, AI5 chips, which are manufactured by Taiwan Semiconductor Manufacturing Company, will enter high-volume production in 2027, replacing AI4 hardware. Tesla has also lined up Samsung Electronics for US chip production.
It’s pretty easy to get lost in AI jargon, so it’s worth being clear about what’s going on at every step of the way.
The move to AI5 and AI6 is essentially about “edge inference”: running Tesla’s full self-driving neural networks inside the vehicle itself instead of relying on a third-party compute stack.
So if Tesla runs the software on its own chips, it gains major competitive advantages:
Tesla doesn’t need Nvidia’s in-vehicle SoC (or its full “DRIVE” platform) for its cars.
Tesla gains control over unit costs, supply chain leverage and chip design.
It’s important to note, however, that Tesla already moved away from Nvidia’s in-vehicle compute back in 2019, so the latest moves are more of a doubling down than a switch.
Nvidia offers automakers a complete solution, essentially a shortcut to full autonomy. Under its NVIDIA DRIVE umbrella, it is basically selling an integrated “brain, operating system, and toolset.”
So instead of building custom chips, software, safety frameworks and more, automakers can just plug and play into Nvidia’s robust ecosystem and get started. A big part of its appeal is that it’s essentially a hack for companies that don’t have Tesla’s decade-long autonomy drive or the billions to spend on research and development.
What Nvidia bundles:
DRIVE AGX in-vehicle computers, such as Orin and Thor.
A complete software stack that includes DRIVE OS and DriveWorks.
DRIVE Hyperion, a reference vehicle platform that comes with validated sensors and architecture.
Safety and validation tools under the NVIDIA Halos umbrella, along with powerful AI models such as Alpamayo to speed up training and simulation.
Tesla is ramping up its in-car AI chips, but clearly Nvidia still holds a critical lead in computing power.
AI5 and AI6 are customized for edge inference, but training frontier-scale models is a whole other challenge.
Training modern AI systems is remarkably compute-intensive.
For perspective, Meta said it trained its Llama 3.1 (405B) model using over 16,000 Nvidia H100 GPUs. At roughly 700 watts per chip, that works out to about 11.2 megawatts for the GPUs alone. This level of scale is where Nvidia’s economics, availability, and ecosystem continue to dominate.
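To sanity-check that figure, here is a minimal back-of-the-envelope sketch in Python. It assumes a flat 700 watts per H100 with all 16,000 GPUs at peak draw simultaneously, and it ignores cooling, networking, and host CPUs, so the real site-level power would be higher.

# Back-of-the-envelope estimate of the GPU power figure cited above.
# Assumptions: a flat 700 W per H100, every GPU at peak draw at once;
# cooling, networking, and host CPUs are not counted.
gpus = 16_000          # H100s Meta cited for training Llama 3.1 (405B)
watts_per_gpu = 700    # approximate peak draw of a single H100

total_watts = gpus * watts_per_gpu
print(f"{total_watts / 1e6:.1f} MW for the GPUs alone")  # prints 11.2 MW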
However, Tesla’s decision to reboot Dojo 3 signals that it is getting back into the training game.
At this point, though, I feel the return of Dojo 3 most likely points to a hybrid future.
Tesla will continue to develop its training capability using the AI5 and AI6 architectures, while relying on Nvidia where scale and economics matter.
When we see strong evidence of large-scale training clusters running on Tesla silicon, backed by throughput and cost data, that’s when the rivalry really ramps up on the training front.
This story was originally published by TheStreet on January 19, 2026, where it first appeared in the Technology section.