By Karan Singh

Not a Tesla App
Tesla has officially launched its FSD Supervised capability in Australia and New Zealand, marking the fifth and sixth countries in which FSD is now available. The release makes Tesla’s latest version of FSD, V13.2.9 Down Under edition, available to customers with AI4 (Hardware 4) vehicles, in two of the world’s largest right-hand-drive markets.
Tesla will soon introduce its first FSD subscription outside of North America.
FSD Supervised is now available in Australia and New Zealand pic.twitter.com/0Lxsgz82na
— Tesla Australia & New Zealand (@TeslaAUNZ) September 17, 2025
Pricing and International Subscription
For customers in the region, FSD Supervised is available as an outright purchase for $11,400 NZD in New Zealand and $10,100 AUD in Australia.
Tesla has also confirmed in emails to customers that a monthly subscription option will be coming soon to both New Zealand and Australia, priced at $159 NZD and $149 AUD, respectively. This is the first time they’ve offered the FSD subscription model outside of the United States and Canada.
At current exchange rates, the subscription is roughly equivalent to the $99 USD monthly price in the United States, but works out higher than the $99 CAD pricing in Canada.
How to Get FSD
The new FSD capability is being delivered via software update 2025.26.7.10 to customers on AI4 vehicles.
If you've already purchased FSD, you'll receive 2025.26.7.10 shortly as Tesla rolls it out to your vehicle. Once the update is installed, you'll be prompted on your display to enable FSD.
For new buyers or potential subscribers, your vehicle is likely on a branch newer than 2025.26 - most likely 2025.32. You'll need to wait for Tesla to release an FSD-compatible build based on that newer software version before you're eligible.
The expansion into Australia and New Zealand is a major technical and commercial milestone for Tesla. It validates FSD’s adaptability and ability to solve new complex challenges, whether it be on roads north or south of the equator.
According to a recent vehicle approval filing in the European Union, the Model 3 is slated for a minor refresh that will add some long-requested functionality. The approved changes include the addition of a front bumper camera, bringing it in line with the rest of Tesla’s refreshed fleet, as well as the return of physical indicator stalks, aligning it with the refreshed Model Y.
These updates, which are expected to begin rolling out to newly manufactured vehicles in the near future, not only improve the Model 3’s feature set but also bring it closer to Tesla’s updated design philosophy that was cemented with the refreshed Model Y.
Front Bumper Camera
One of the most anticipated hardware additions for the Model 3 is finally on the way to production vehicles. The filing confirms the inclusion of a front bumper camera, something owners have been requesting since the transition from ultrasonic sensors to Tesla Vision.
While Tesla Vision is remarkably accurate for a camera system and can build 3D maps of its surroundings, the front camera in the windshield simply cannot see the area directly in front of the bumper. That makes low-speed maneuvers and parking in tight spaces challenging. The new camera will provide visibility in those situations, letting drivers - and, one day, FSD - accurately judge distances to otherwise unseen obstacles.
Turn Signal Stalks
In a move that will be celebrated by many drivers and critics, the European filing also confirms that Tesla is bringing back the traditional indicator stalks to the Model 3. This change will be accompanied by an updated steering wheel design that removes the haptic indicator buttons.
This is a fairly significant change. Now, Tesla will once again only ship indicator buttons on the Model S, Model X, and Cybertruck. The refreshed Model 3 adopted the button-based indicator system in its refresh in 2024, matching the Model S and Model X at the time.
While Tesla has perfected the steering indicator buttons on the Cybertruck thanks to Steer-by-Wire, this hardware has yet to make its way to the rest of the lineup, and for the most part, using physical stalks is just easier.
The stalks that Tesla will be including will likely match those included in the Model Y refresh, so they will only have two simple actions - tilt down and up. All other functionality, from wipers to high beams, is not included in the stalk. Wipers will continue to be on the steering wheel, and gear shifting will continue to be on the center display.
For customers, these changes to the Model 3 are a big win - the addition of a feature many have been waiting for, alongside the return of physical turn stalks. The question for many will be whether Tesla will support retrofits for the new turn stalk, wheel, and bumper camera onto existing Model 3s, or if it will require considerable DIY work, similar to the Model S Refresh.
In China, Tesla already offers a turn signal stalk retrofit for about $350 on vehicles produced after February 2025, and it plans to add support for older vehicles as well.
This is a sign that Tesla is listening to customers and bringing back features to better align with what people actually want and use. We’ll have to wait and see if there are any additional changes besides the front camera and turn signal stalks.
Thanks to Eivissa for bringing this to our attention.
In a recent discussion on X, Elon Musk provided some updates on Tesla’s custom AI silicon roadmap, clarifying the manufacturing plan for its future chips, and offering a glimpse into Tesla’s first-principles focused engineering philosophy.
Elon clarified that the new Samsung fab in Taylor, Texas, will produce the future AI6 chip, not the upcoming AI5 chip. He also revealed an ongoing debate inside Tesla that challenges a broader trend in the hardware industry: Tesla's chip design engineers have not yet committed to the industry-standard High-Bandwidth Memory (HBM) for AI6, suggesting that, for Tesla's specific needs, cheaper and more conventional RAM might actually be the more efficient choice.
AI6 in Texas
The confirmation that Samsung's Texas fab will produce AI6 isn't new - we recently heard about it alongside Tesla's pivot from Dojo to AI6 as its primary training chipset. However, it does confirm that Tesla intends to onshore all of its chip production; even though it won't manufacture the chips itself, it will do the design work right alongside Samsung.
While the monster AI5 chip is the next major leap for Tesla's training and inference hardware, the plan for its successor, AI6, is already in motion. This will help Tesla secure the massive number of chips it needs, not just for training, but also to embed in every future Tesla vehicle and Optimus robot it sells.
Is High-Bandwidth Memory a Slam Dunk?
Beyond the manufacturing roadmap, Elon’s comments on memory architecture were the most insightful. The current gold standard for high-performance AI accelerators is High-Bandwidth Memory (HBM), a type of RAM that offers incredibly fast data transfer speeds, allowing the processor to access model parameters with minimal delay.
While acknowledging that HBM might be the right choice, Elon also explained that it really isn’t the slam dunk choice that many take for granted, especially not for Tesla’s use case.
The reasoning stems from the evolving nature of the neural networks Tesla is building. As the ratio of total parameters to active or frequently used parameters increases, the cost per unit of useful compute and compute per watt might favor conventional RAM. There’s a lot to digest in that single statement, so let’s dig in.
You can fit more total RAM on the board if you use “normal” memory than high-bandwidth memory and it is super cheap.
Maybe high-bandwidth memory is still the right choice, but using HBM isn’t the slam dunk many people think it is.
— Elon Musk (@elonmusk) September 15, 2025
First Principles & AI Hardware
Think of a massive AI model as a gigantic library that contains billions of books - the total number of parameters. To perform a specific task, like identifying a pedestrian, the AI only needs to instantly access a few specific and relevant books (the active parameters).
The industry-standard approach with HBM is like building a smaller library with an incredibly fast retrieval system. It’s built for nothing but speed, assuming you need access to many or all of the books very quickly.
Elon’s argument suggests a different approach. As Tesla’s AI models grow ever larger, the library is becoming astronomically huge. This isn’t your local city library; we’re talking mythical Tower of Babel-sized libraries. The point here is that even in a massive library, you’re still only pulling a few key books at a time for any given task.
That leads to a different set of priorities entirely, focused on capacity, cost, and efficiency. In terms of capacity and cost, you can fit more total RAM on the board if you use normal memory rather than HBM, and it is far cheaper. For a model with trillions of parameters, having enough sheer capacity to hold the entire library is paramount, and the cost-effectiveness of regular RAM is a massive advantage.
In terms of efficiency, if only a small fraction of parameters are active at once, the extreme speed of HBM might simply be overkill. The cost per useful computation, and the energy used for that computation, could actually be better with a giant, cheaper pool of conventional RAM.
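To make that tradeoff concrete, here is a back-of-envelope sketch. All of the numbers below - model size, active fraction, memory prices and capacities - are made-up illustrative assumptions, not Tesla figures or real part specs; the point is only the shape of the math: if just a small slice of parameters must stream each pass, the required bandwidth drops, and capacity and cost-per-gigabyte start to dominate the decision.

```python
# Back-of-envelope math for HBM vs conventional RAM on a sparsely
# activated model. Every number here is an illustrative assumption,
# not a real product spec or a Tesla figure.

def cost_per_gb(cost_usd: float, capacity_gb: float) -> float:
    """Dollars per gigabyte of memory capacity."""
    return cost_usd / capacity_gb

def min_bandwidth_needed(total_params_gb: float,
                         active_fraction: float,
                         passes_per_sec: float) -> float:
    """GB/s needed to stream only the active parameter slice each pass."""
    return total_params_gb * active_fraction * passes_per_sec

# Hypothetical model: 1 TB of parameters, 5% active, read 10x per second.
needed = min_bandwidth_needed(1000, 0.05, 10)  # 500 GB/s
print(f"Bandwidth actually needed: {needed:.0f} GB/s")

# Hypothetical memory options (invented prices and capacities):
hbm_cost = cost_per_gb(12000, 192)   # ~$62.5/GB for a small fast pool
ddr_cost = cost_per_gb(4000, 1024)   # ~$3.9/GB for a big cheap pool
print(f"HBM ~${hbm_cost:.1f}/GB vs conventional RAM ~${ddr_cost:.1f}/GB")
```

Under these toy assumptions, a large conventional-RAM pool that can sustain a few hundred GB/s clears the bandwidth bar while holding the entire model, whereas the HBM option pays an order of magnitude more per gigabyte for speed the workload never uses.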
Third Choice: Hybrid
Of course, there is always a third choice. If you have a moderate number of books that you need to access extremely quickly, and a large number of other books that you need access to, but not as quickly, perhaps taking a hybrid approach is the best option. This is similar to the caching system in processors today.
The lower-level caches hold the critical data needed for the operations at hand; in Tesla's terms, a small pool of HBM could do just that - keep safety-critical and frequently accessed parameters in HBM, while everything else sits in conventional RAM.
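The hybrid idea can be sketched as a two-tier store, much like a cache hierarchy. The class and names below are purely illustrative - this is a toy model of the concept, not anything resembling an actual AI6 memory design.

```python
# Toy sketch of a hybrid memory layout: a small fast tier for hot
# parameters, a large slow tier for everything else. Names, sizes,
# and the "hot" flag are illustrative assumptions only.

class TwoTierStore:
    def __init__(self, fast_capacity: int):
        self.fast_capacity = fast_capacity
        self.fast = {}  # stands in for a small HBM pool
        self.slow = {}  # stands in for a large conventional-RAM pool

    def put(self, key, value, hot=False):
        # Pin hot (safety-critical / frequently used) entries in the
        # fast tier while room remains; everything else goes to slow.
        if hot and len(self.fast) < self.fast_capacity:
            self.fast[key] = value
        else:
            self.slow[key] = value

    def get(self, key):
        # Check the fast tier first, exactly like a cache lookup.
        if key in self.fast:
            return self.fast[key], "fast"
        return self.slow[key], "slow"

store = TwoTierStore(fast_capacity=2)
store.put("pedestrian_detector", "weights_a", hot=True)
store.put("rare_roadsign_head", "weights_b")
print(store.get("pedestrian_detector"))  # ('weights_a', 'fast')
print(store.get("rare_roadsign_head"))   # ('weights_b', 'slow')
```

A real design would evict and promote entries based on access patterns rather than a static flag, but the lookup order - fast tier first, big pool as fallback - is the essence of the hybrid approach.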
While the final decision for AI6 remains to be made, this debate offers a window into the first-principles thinking that drives Tesla.