If that were true, my Tesla should have learned over the last couple of years to stop phantom braking under the same bridge and at the same highway turn every single day when on cruise control. I did notice that the phantom braking happened less often when I had the full autopilot, so maybe smarter software?

Do you have FSDs?

The highway stack (used for Autopilot, and currently for FSDs on highways) is still the old version from v11. It has not moved to the new end-to-end NN used on city streets (and non-highways) as part of FSDs.

So, if you had an issue when using Autopilot/FSDs on a highway (divided highway), it will not have changed, because it is still using the original code.
 
  • Like
Reactions: 960design
If that were true, my Tesla should have learned over the last couple of years to stop phantom braking under the same bridge and at the same highway turn every single day when on cruise control. I did notice that the phantom braking happened less often when I had the full autopilot, so maybe smarter software?
There was no claim that any software is 100%, especially an FSD or equivalent on any automobile.

I would expect version 12.5 and beyond to get significantly better, but it’s still labeled supervised. The 12.5 stack is allegedly merged.
 
There was no claim that any software is 100%, especially an FSD or equivalent on any automobile.
I would expect version 12.5 and beyond to get significantly better, but it's still labeled supervised.
Oh, I agree there was no claim; it's a small gripe. I'm so used to it now that I just hold down the go pedal at those points to prevent braking. My 2020 Caddy never had a problem with phantom braking; that thing would drive you off a cliff, but it did not freak out the cars behind me or the passengers with me.
 
  • Like
Reactions: I7guy
Do you have FSDs?

The highway stack (used for Autopilot, and currently for FSDs on highways) is still the old version from v11. It has not moved to the new end-to-end NN used on city streets (and non-highways) as part of FSDs.

So, if you had an issue when using Autopilot/FSDs on a highway (divided highway), it will not have changed, because it is still using the original code.
No FSD, only had it in April 2024 when it was given to everyone for 30 days for free. I liked it, a lot more information displayed, but not worth $8000 for my use case.
 
  • Like
Reactions: I7guy and JT2002TJ
There was no claim that any software is 100%, especially an FSD or equivalent on any automobile.

I would expect version 12.5 and beyond to get significantly better, but it’s still labeled supervised. The 12.5 stack is allegedly merged.

No FSD, only had it in April 2024 when it was given to everyone for 30 days for free. I liked it, a lot more information displayed, but not worth $8000 for my use case.
12.5.6 is the release with the NN for highways. There are mixed reviews, and it still isn't 100% clear whether plain TACC/Autosteer will use the NN or stay on the v11 stack. Though if you believe what is on the Cybertruck, it will most likely stay on the v11 stack (the CT doesn't have plain TACC/AS because it doesn't have the v11 stack).
 
  • Like
Reactions: I7guy and JT2002TJ
No FSD, only had it in April 2024 when it was given to everyone for 30 days for free. I liked it, a lot more information displayed, but not worth $8000 for my use case.

This would explain why you would have a problem that persists even today. You are on the old stack that hasn't been updated yet. Until the end-to-end NN is moved to highways and also pushed to Autopilot (I'm sure it will be), a problem that always happens with a specific set of inputs will continue. The programming hasn't changed.

All the software updates that have come out since April have not changed the highway stack (not significantly, at least).

Luckily, where I am, I only get hard braking when the vehicle sees someone approaching the edge of their lane and, out of an abundance of caution, slows down to avoid a potential sideswipe.
 
  • Like
Reactions: 960design
Really? My son has a new XC90 and their nav definitely doesn't have crowdsourced data.
Every XC90 from the past ~3-4 years runs Android Automotive OS (AAOS) with the full-fledged Google Maps app. It most certainly has the best crowdsourced data, as it has the benefit of every Android phone on the road out there.


No, the rumor is that Tesla is using their cameras and AI to take pictures of road conditions and transmit them to vehicles in the same flight path, so to speak.
Yea, I read the rumor. Check out Volvo Connected Safety, which they've had since 2016 or so and is probably included in your son's XC90: https://www.volvocars.com/mt/support/topic/7c6e6ec6ab0de3aac0a801516430580e

Basically, if a Volvo further up the road from you has to engage any kind of stability control because it slipped on an icy or wet road, it will warn the other Volvos approaching that stretch. It's a pretty neat system, but it needs to be standardized so all OEMs both feed into and benefit from the data.
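
Just to make the mechanics concrete, here's a minimal sketch of how that kind of warning could propagate, in Python. This is my own illustration with invented message fields and thresholds, not Volvo's actual Connected Safety protocol: a car that triggers stability control publishes a hazard report, and another car warns its driver if the report is recent and within a couple of kilometres.

```python
import math
import time

# Hypothetical hazard report; the real Connected Safety message format is not public here.
class HazardReport:
    def __init__(self, lat, lon, kind, timestamp=None):
        self.lat = lat
        self.lon = lon
        self.kind = kind              # e.g. "traction_loss"
        self.timestamp = timestamp or time.time()

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_warn(report, my_lat, my_lon, max_age_s=1800, max_dist_m=2000):
    """Warn if the hazard is recent and close enough to matter to this car."""
    fresh = (time.time() - report.timestamp) < max_age_s
    near = haversine_m(report.lat, report.lon, my_lat, my_lon) < max_dist_m
    return fresh and near

# A car ahead slips on ice and publishes a report; a following car checks it.
report = HazardReport(lat=59.3326, lon=18.0649, kind="traction_loss")
print(should_warn(report, my_lat=59.3300, my_lon=18.0600))  # True: roughly 400 m away
```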


Tesla will, if these rumors are true, because they will be using AI and cameras to detect not only motion but also what is on the road.
It's a matter of numbers. Every smartphone (even a cheap one these days) has a pretty advanced MPU, and smartphones on the road will pretty much always outnumber cars on the road (and certainly cars from any single OEM), because there is at least one smartphone per car, and many cars have multiple passengers and multiple smartphones. So right off the bat, Apple's and Google's networks simply have more endpoints, scattered more widely.

Add in that data aggregators are also getting feeds from municipal tracking systems, fleet management software, private data collectors, etc., which further enrich the data. It's already been far more sophisticated than "AI and cameras" for over a decade.


And Elon, if you're following, has sunk a massive amount into AI for these purposes.
I don't really believe anything that clown promises because he's failed to deliver on his promises >50% of the time, and when he does deliver, it's later than he anticipated 100% of the time.

I am aware that Elon's companies have made some purchases of nVidia H100 cards in impressive quantities. However, I think the real story is a bit different from what you said - Elon did not increase the initial order, rather he redirected cards earmarked for Tesla to X instead. Setting aside the questionable SEC legality of doing such a thing, if anything it shows he is de-emphasizing the importance of datacenter AI for driving purposes and instead putting it into whatever the heck X is for these days.

Regardless and setting aside the Elon bashing, it remains to be seen if anyone does anything useful with all that AI hardware. So far, I think healthcare is the only industry that has successfully monetized and put all that AI hardware to good use. I don't think anyone else has actually created a solid AI-based business model where the outputs are worth more than the cost of the inputs. Not Microsoft, not Facebook, not OpenAI, not Apple, not Musk - nobody outside of healthcare has done it.
 
Every XC90 from the past ~3-4 years runs Android Automotive OS (AAOS) with the full-fledged Google Maps app. It most certainly has the best crowdsourced data, as it has the benefit of every Android phone on the road out there.



Yea, I read the rumor. Check out Volvo Connected Safety, which they've had since 2016 or so and is probably included in your son's XC90: https://www.volvocars.com/mt/support/topic/7c6e6ec6ab0de3aac0a801516430580e

Basically, if a Volvo further up the road from you has to engage any kind of stability control because it slipped on an icy or wet road, it will warn the other Volvos approaching that stretch. It's a pretty neat system, but it needs to be standardized so all OEMs both feed into and benefit from the data.



It's a matter of numbers. Every smartphone (even a cheap one these days) has a pretty advanced MPU, and smartphones on the road will pretty much always outnumber cars on the road (and certainly cars from any single OEM), because there is at least one smartphone per car, and many cars have multiple passengers and multiple smartphones. So right off the bat, Apple's and Google's networks simply have more endpoints, scattered more widely.

Add in that data aggregators are also getting feeds from municipal tracking systems, fleet management software, private data collectors, etc., which further enrich the data. It's already been far more sophisticated than "AI and cameras" for over a decade.



I don't really believe anything that clown promises because he's failed to deliver on his promises >50% of the time, and when he does deliver, it's later than he anticipated 100% of the time.

I am aware that Elon's companies have made some purchases of nVidia H100 cards in impressive quantities. However, I think the real story is a bit different from what you said - Elon did not increase the initial order, rather he redirected cards earmarked for Tesla to X instead. Setting aside the questionable SEC legality of doing such a thing, if anything it shows he is de-emphasizing the importance of datacenter AI for driving purposes and instead putting it into whatever the heck X is for these days.

Regardless and setting aside the Elon bashing, it remains to be seen if anyone does anything useful with all that AI hardware. So far, I think healthcare is the only industry that has successfully monetized and put all that AI hardware to good use. I don't think anyone else has actually created a solid AI-based business model where the outputs are worth more than the cost of the inputs. Not Microsoft, not Facebook, not OpenAI, not Apple, not Musk - nobody outside of healthcare has done it.
Okay, whether or not you put any credence in what Tesla is doing, I don't really care. You downplay Tesla's accomplishments while aggrandizing Volvo. I haven't seen any crowdsourced data on my son's XC90, period. Maybe it's turned off.

I'll reiterate that Tesla will be using their cameras and AI in a way that will be revolutionary; it doesn't matter what anyone feels about the CEO. The 10 cameras that Teslas have are what make the difference, and the ability to take pictures and have AI act on them one-ups anything that is out there.
 
Last edited:
Exactly! Because unlike in the US market, BMW and Mercedes make non-premium models for the European market. In Europe you can get a bare-bones BMW/Mercedes without stitched leather, without a heated steering wheel and seats, without all the infotainment bells and whistles. There, it is possible to get a BMW 3-series that is akin to a Camry.
Nobody compares a 3er or a C class with a Camry.
And the Camry only recently started to be offered in Europe because nobody was buying the Avensis (the European D-segment model), and they probably decided to offer something already designed elsewhere; at least that's cheaper if it doesn't sell. Even so, it seems the 2024 model will only be offered in some European countries.
You don't get to be premium by loading gadgets into a car.
At least that's how the European market has always worked.
Even the lower A-Class or 1er are considered premium cars and command a price premium over a Golf, a 308, or whatever.
 
  • Like
Reactions: cyb3rdud3 and I7guy
I think Tesla's push into AI will ultimately make it able to navigate anywhere a vehicle can safely travel - that's not to say FSD is good enough if it can't navigate the edge cases shown here.

I get it: if someone lives on unpaved, unimproved and unmarked roads, there will be cases where maybe full self-driving of any type would not work.
Edge case is right. Personally I'd expect cars to get motorways and dual carriageways sorted long before the lanes where I live. But that's to be expected. The way it's developed in the last few years is leaps and bounds better than I expected to see in my lifetime.
 
  • Like
Reactions: I7guy
Okay, whether or not you put any credence in what Tesla is doing, I don't really care. You downplay Tesla's accomplishments while aggrandizing Volvo. I haven't seen any crowdsourced data on my son's XC90, period.

I'll reiterate that Tesla will be using their cameras and AI in a way that will be revolutionary; it doesn't matter what anyone feels about the CEO. The 10 cameras that Teslas have are what make the difference, and the ability to take pictures and have AI act on them one-ups anything that is out there.
It's literally Google - the same app and same crowdsourced data as in everyone's phone. And I'm not aggrandizing Volvo. Every other OEM that uses AAOS has the same.

I'm telling you as fact, Tesla is behind the 8ball on driving assist tech. Just about every competitor (Waymo, Cruise) is ahead. Indeed, even nVidia's in-house self-driving tech is ahead of Tesla's and I guarantee you they have more H100s working on it than Tesla and X. Tesla messed up big time by ditching radar/ultrasonic (and foregoing lidar) in ~2020, and putting all their money on a vision-only solution. At the time Musk said it was to save money, but it was one of those penny-wise pound-foolish moves.

It doesn't matter if you have 10 cameras, 11 cameras (as Rivian has), or 100 cameras. CMOS sensors alone are simply too limited and insufficient. Tesla is now using cameras as a rain sensor too, and it works terribly. If they can't get a camera to accurately detect rain, how can anyone trust them to detect unexpected roadside events?
 
It's literally Google - the same app and same crowdsourced data as in everyone's phone. And I'm not aggrandizing Volvo. Every other OEM that uses AAOS has the same.

I'm telling you as fact, Tesla is behind the 8ball on driving assist tech. Just about every competitor (Waymo, Cruise) is ahead. Indeed, even nVidia's in-house self-driving tech is ahead of Tesla's and I guarantee you they have more H100s working on it than Tesla and X. Tesla messed up big time by ditching radar/ultrasonic (and foregoing lidar) in ~2020, and putting all their money on a vision-only solution. At the time Musk said it was to save money, but it was one of those penny-wise pound-foolish moves.

It doesn't matter if you have 10 cameras, 11 cameras (as Rivian has), or 100 cameras. CMOS sensors alone are simply too limited and insufficient. Tesla is now using cameras as a rain sensor too, and it works terribly. If they can't get a camera to accurately detect rain, how can anyone trust them to detect unexpected roadside events?
Is detection the problem, or doing something with what it detected?
 
It's literally Google - the same app and same crowdsourced data as in everyone's phone. And I'm not aggrandizing Volvo. Every other OEM that uses AAOS has the same.
Exactly how are Google's cameras seeing, for example, a pothole or a downed tree in real time? I don't know if you are ignoring this, downplaying it, or don't understand it. Either way, it's a game changer.
I'm telling you as fact, Tesla is behind the 8ball on driving assist tech.
You’re telling me as an opinion.
Just about every competitor (Waymo, Cruise) is ahead.
Both have had their own share of stuff so nothing is perfect.
Indeed, even nVidia's in-house self-driving tech is ahead of Tesla's and I guarantee you they have more H100s working on it than Tesla and X. Tesla messed up big time by ditching radar/ultrasonic (and foregoing lidar) in ~2020, and putting all their money on a vision-only solution. At the time Musk said it was to save money, but it was one of those penny-wise pound-foolish moves.
My opinion it will work out well for Tesla.
It doesn't matter if you have 10 cameras, 11 cameras (as Rivian has), or 100 cameras. CMOS sensors alone are simply too limited and insufficient.
Tesla believes differently. And if I had to pick which side was probably going to be on the right side of this conversation….
Tesla is now using cameras as a rain sensor too, and it works terribly.
No, it’s okay, not great and not terrible.
If they can't get a camera to accurately detect rain, how can anyone trust them to detect unexpected roadside events?
This is called a logical fallacy.

Anyway, hope you are watching the Tesla event at 10 EDT.
 
Last edited:
Exactly how are Google's cameras seeing, for example, a pothole or a downed tree in real time? I don't know if you are ignoring this, downplaying it, or don't understand it. Either way, it's a game changer.
First, IMUs can easily sense potholes. But more importantly, nobody actually wants alerts for potholes ahead. Waze tried it; people hated it.

Second, whether it is a downed tree or a car accident or whatever up ahead doesn't matter. The only thing that matters is how it affects or impedes the movement of traffic, which we've been easily determining using IMUs and GPS for a decade. Real-time traffic and road closures are not a problem that needs to be solved with cameras and AI.
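
Both of those signals are genuinely simple to compute. Here's a rough sketch with made-up thresholds (not any vendor's actual algorithm): a pothole hit shows up as a spike in the IMU's vertical acceleration, and congestion shows up as GPS probe speeds on a road segment dropping well below free-flow speed.

```python
# Illustrative thresholds only; production systems tune these per vehicle and per road.
GRAVITY = 9.81

def pothole_hit(vertical_accel_ms2, threshold_g=0.5):
    """Flag a pothole-like impact when vertical acceleration deviates
    from gravity by more than ~0.5 g (hypothetical threshold)."""
    return abs(vertical_accel_ms2 - GRAVITY) > threshold_g * GRAVITY

def segment_congested(probe_speeds_kmh, free_flow_kmh, ratio=0.5, min_probes=5):
    """Flag congestion when enough GPS probes report speeds well below free flow."""
    if len(probe_speeds_kmh) < min_probes:
        return False  # not enough data to say anything
    avg = sum(probe_speeds_kmh) / len(probe_speeds_kmh)
    return avg < ratio * free_flow_kmh

print(pothole_hit(16.0))                             # True: ~0.6 g jolt
print(segment_congested([22, 18, 25, 30, 19], 100))  # True: avg ~23 km/h on a 100 km/h road
```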

It's not a game changer. It's more vaporware from the king of vaporware.


You’re telling me as an opinion.
No, this is something that is literally my job to know as fact.

Both have had their own share of stuff so nothing is perfect.
Yes, "stuff". Very precise. Tesla's FSD will steer you into the back of a truck as sunset, into a highway barrier at full speed, and will drive on the wrong side of the road for whatever reason. These aren't niche corner cases, this is basic table stakes stuff that everyone else but Tesla have mastered.


My opinion it will work out well for Tesla.
Good luck to you. I honestly hope it does; we need more American car companies to succeed. I think Tesla will succeed long term, but not at FSD.

Tesla believes differently. And if I had to pick which side was probably going to be in the right side of this conversation….
Again, Tesla has been wrong about this a lot. Remember they've been saying FSD is right around the corner for the better part of a decade now, and they're really no better off than they were 6 years ago. Indeed, in objective metrics, Tesla FSD has gotten worse since they've disabled all the radar sensors.

No, it’s okay, not great and not terrible.
Isn't that wildly bad though? Rain sensing is a problem that has been perfectly solved for DECADES using 5-cent diodes. There was 0 reason to "innovate" here - it was already solved and super low cost. So Tesla's "innovative" solution is (a) more expensive, and (b) works "okay, not great." What?!? And you trust the guy in charge?

This is called a logical fallacy.
It isn't... it's the same CMOS camera.

Anyway hope you are watching the Tesla event at 10 edt.
Unfortunately I am. As I said, it's my job. The analysts already know what is going to be announced. Look at the puts - I would buy some, but it seems the bad news is already priced in...
 
  • Love
Reactions: SalisburySam
First, IMUs can easily sense potholes. But more importantly, nobody actually wants alerts for potholes ahead. Waze tried it; people hated it.
This is Tesla, not Google. And what I used was an example of something that could be done using the Tesla interconnected network... not that it will be a final product.
Second, whether it is a downed tree or a car accident or whatever up ahead doesn't matter.
In your opinion. In Tesla's opinion it may matter.
The only thing that matters is how it affects or impedes the movement of traffic, which we've been easily determining using IMUs and GPS for a decade. Real-time traffic and road closures are not a problem that needs to be solved with cameras and AI.
Again, in your opinion. Since we don't know what Tesla has planned, which could include things such as automatic rerouting or making calls to the appropriate government agency, etc., this could be a game changer.
It's not a game changer. It's more vaporware from the king of vaporware.
Now you are being intellectually honest in this post. All this other stuff is just noise. Thankfully, one person's lack of faith will not stop Tesla from going down its path. The people who criticized Apple and predicted its implosion 10 years after Jobs died didn't learn.
No, this is something that is literally my job to know as fact.
Unless you work for Tesla, everything you say is an opinion about what Tesla is doing.
Yes, "stuff". Very precise. Tesla's FSD will steer you into the back of a truck as sunset, into a highway barrier at full speed, and will drive on the wrong side of the road for whatever reason. These aren't niche corner cases, this is basic table stakes stuff that everyone else but Tesla have mastered.



Good luck to you. I honestly hope it does; we need more American car companies to succeed. I think Tesla will succeed long term, but not at FSD.


Again, Tesla has been wrong about this a lot. Remember they've been saying FSD is right around the corner for the better part of a decade now, and they're really no better off than they were 6 years ago. Indeed, in objective metrics, Tesla FSD has gotten worse since they've disabled all the radar sensors.


Isn't that wildly bad though? Rain sensing is a problem that has been perfectly solved for DECADES using 5-cent diodes. There was 0 reason to "innovate" here - it was already solved and super low cost. So Tesla's "innovative" solution is (a) more expensive, and (b) works "okay, not great." What?!? And you trust the guy in charge?


It isn't... it's the same CMOS camera.


Unfortunately I am. As I said, it's my job. The analysts already know what is going to be announced. Look at the puts - I would buy some, but it seems the bad news is already priced in...
Well, we could argue opinions back and forth till the cows come home. Tesla and Apple have a common denominator: people said analogous things about Apple, and Apple proved them wrong, as Tesla will prove you wrong. I sincerely hope you are accurate about Tesla: I get to be wrong, my portfolio takes a mega-hit, and you get a bonus. But if it is the other way around, $$$⬆️ and you're on the bread line.

Anyway, it's been fun. The universe is going to do what it is going to do, and it's pretty interesting to see an alleged internet expert (on any topic, any forum) go down in flames.
 
Last edited:
Is detection the problem, or doing something with what it detected?
Vision-only systems have many problems, but broadly, to answer your question: detection is the problem.

Getting a computer to accurately interpret a 2D image in real time with all the problems of optics is pretty hard. AI can do it more accurately but not in real time and still not perfectly (and it's very computationally expensive). Real-time algorithms have a much harder time. For example, if all you're working with is 2D images, a big white truck in front of a big white sky can be almost invisible. Or, the lens flare from the camera being pointed towards a setting sun can disguise a traffic light or a pedestrian. In poor lighting conditions, it can be difficult to tell whether a person is walking towards the car or away from the car--especially if the car itself is moving fast.

There are also issues with how CMOS cameras handle wide differences in lighting - ever notice how you can be outside on a sunny day and still see fairly well with your eyes through the open door of a dark house? But if you try to take a picture of it, you either get a blown-out exterior in order to see into the house, or a crushed-black interior in order to see the outside. To get both, you need to take two photos and combine them. But your eyes manage it fine, right? This is because our eyes are really good at interpreting light intensity logarithmically, whereas the way CMOS sensors physically work (transistor wells and all) makes such a feat impossible.
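
Here's a toy numeric illustration of that dynamic-range point (made-up luminance values, not any particular sensor's real response curve): with a linear 8-bit encoding the whole dark interior collapses onto a single code value, while a log encoding keeps the interior detail distinguishable, roughly as our eyes do.

```python
import math

# Scene luminances in arbitrary units: dark interior ~1-10, sunlit exterior ~5000-10000.
scene = [1, 2, 5, 10, 5000, 8000, 10000]
max_lum = max(scene)

def linear_8bit(lum):
    """Linear quantization: code value proportional to luminance."""
    return round(255 * lum / max_lum)

def log_8bit(lum, floor=1.0):
    """Log quantization: code value proportional to the log of luminance."""
    return round(255 * math.log(lum / floor + 1) / math.log(max_lum / floor + 1))

for lum in scene:
    print(f"{lum:>6}  linear={linear_8bit(lum):>3}  log={log_8bit(lum):>3}")
# The interior values (1..10) all land on linear code 0, so that detail is crushed,
# while the log encoding keeps them distinguishable (codes ~19 to ~66).
```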

Vision has to be supplemented with 3D or distance data. This is why lidar is king - you get an almost perfect 3D view across a 120° field of view in front of the car, and poor lighting conditions have no negative effect. However, rain makes lidar tricky (it still works, just worse). Also, lidar sensors are expensive even at OEM volumes, at about $500 per unit. Some of the Chinese OEMs have it down to $200/unit, but it's still major money. By comparison, a CMOS sensor (a camera) is <$5. Another great supplement is radar or ultrasonic sensors (like a more advanced version of the parking sensors most cars have today). These work great in the rain and in all lighting conditions; their resolution and field of view are pretty limited, but at roughly $20 per unit the price is not bad.
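
To show why that distance data matters, here's a hypothetical fusion rule (my own sketch, not any OEM's actual stack): a camera classifier can miss a white truck against a white sky, but a single close range return from lidar or radar in the same direction is enough to flag that something physical is really there.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CameraDetection:
    label: str            # e.g. "truck" or "clear_road"
    confidence: float     # classifier confidence, 0..1

@dataclass
class RangeMeasurement:
    distance_m: Optional[float]   # None if the beam returned nothing

def obstacle_ahead(cam: CameraDetection, rng: RangeMeasurement,
                   cam_conf_min=0.8, stop_dist_m=60.0) -> bool:
    """Hypothetical fusion rule: trust either a confident camera detection
    or a close range return; the range sensor catches what the camera misses."""
    camera_says_obstacle = cam.label != "clear_road" and cam.confidence >= cam_conf_min
    range_says_obstacle = rng.distance_m is not None and rng.distance_m < stop_dist_m
    return camera_says_obstacle or range_says_obstacle

# White truck against a white sky: the camera reports "clear_road", the radar does not.
cam = CameraDetection(label="clear_road", confidence=0.9)
rng = RangeMeasurement(distance_m=35.0)
print(obstacle_ahead(cam, rng))   # True: the close range return overrides the camera miss
```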

Waymo and Cruise use all of the above: vision cameras, lidar, and radar. They have their quirks, but in terms of driving safely their track record is nearly perfect. Just about every automotive OEM except Tesla is using, or plans to use, all three in combination: Audi uses Valeo's system, Mercedes and Volvo use Luminar's system, BMW is trialing Innoviz's system, Toyota is working with Luminar and Aeva, I'm not sure who Honda uses but they said they're going to use lidar, Ford is working with Princeton Lightwave, and GM owns Cruise. I can go on. Tesla is the outlier here.
 
Waymo, the self-driving car company owned by Alphabet Inc., has recently come under investigation by the National Highway Traffic Safety Administration (NHTSA). The investigation was initiated after reports of 22 incidents involving Waymo's autonomous vehicles, including 17 crashes and five possible traffic law violations. Fortunately, no injuries were reported in connection with these incidents.

The crashes mainly involved Waymo vehicles colliding with stationary objects like gates, chains, or parked vehicles. The NHTSA is looking into these incidents to ensure the safety of autonomous driving systems.

And looking at YouTube videos about Mercedes' system, it seems very limited and prone to dangerous and unexpected issues. Empirically, self-driving systems are not perfect, and personally I wouldn't trust my life to them. But Tesla seems to be moving forward with RoboTaxi. Either it will be what they say, or not.
 
Last edited:
I guess I don't use FSDs in NYC/Long Island every day, since it can't work as a camera-only system. I guess I don't have it come across the parking lot using ASS to pick up my son and me. It can't work, since it's vision only.

Waymo/Cruise are a different beast. I know zero owners who have been inside their own Waymo/Cruise vehicle. Yet I know dozens of people who have had FSDs drive them thousands of miles.

Robotaxis are not the same problem. We need both solved (Level 5 and Levels 2-3). People need to be able to take their personal vehicles and have them drive them, not just try to get into a $200k+ vehicle with sensors all over it that is geofenced to certain areas and won't even get on a highway.

I have 20k+ miles on my Teslas in the last year; about 10-15k of those are using FSDs.
 
Is this Tesla in the room with us right now?
They will be at 7pm today.
Wait wait, you're telling me Tesla doesn't have automatic rerouting today!?
I think we all know the answer to that, and that wasn't what was being said. While we don't know the ultimate feature set of the rumor being discussed, or its trajectory, just projecting, there seems to be much more that can be done with real-time camera views from the Tesla fleet combined with AI and GPS data on a particular road issue, rather than just crowdsourced GPS location data.
 
My parents are in their late 80s and live alone in a remote part of their town. They have a hard time getting around and don't like driving, but still have to shop, visit doctors, and do other things. We need to solve the issue of vehicles driving themselves, so people like my parents, or even younger people without the physical ability to drive, have access to transportation.

Musk claims Tesla will have self-driving Cybercabs ready by 2027 for about $30k. While the technology may be right around the corner, regulation isn't.
 
It doesn't matter if you have 10 cameras, 11 cameras (as Rivian has), or 100 cameras. CMOS sensors alone are simply too limited and insufficient. Tesla is now using cameras as a rain sensor too, and it works terribly. If they can't get a camera to accurately detect rain, how can anyone trust them to detect unexpected roadside events?
I don't know, I have two cameras (eyes) and no radar (my bald head appears to be a radar dome, but it is not) and I do pretty good.
Me = developed vision software that scored 112 IQ on Raven's (that's ridiculously good, if you are in the know). I think it is mostly a software thing that will be figured out. To be clear, I do not disagree with keeping lidar/radar, although it creates cross-dependencies and complications that make software development and troubleshooting more difficult.
 
  • Like
Reactions: I7guy
Guess Teslas are not made for police/law enforcement duty?

https://www.sfgate.com/bayarea/article/california-switch-electric-cars-cops-19816671.php

 
So far, I think healthcare is the only industry that has successfully monetized and put all that AI hardware to good use. I don't think anyone else has actually created a solid AI-based business model where the outputs are worth more than the cost of the inputs. Not Microsoft, not Facebook, not OpenAI, not Apple, not Musk - nobody outside of healthcare has done it.
That's an excellent observation. Big Pharma is doing well with AI-generated medications! I would add that AI-generated Instagram models are not doing too shabby either. ;)

My company is working hard at finding the proper niche for AI solutions. Right now it feels more like square pegs being shoved into round holes with the only tool available, a hammer. Clients only appear to know 'buzzwords' but do not understand what they mean. It becomes difficult not to laugh at the ignorance and the buddy promotion (CEO to CTO to CFO) built on buzzword salads. (Is there a weekly buzzword handout that gets delivered to CEOs?)
 