A Chinese TV channel spent a bunch of money doing ADAS tests, and Tesla came out on top of all the Chinese brands, including all the LIDAR systems, though the tests were all in the daytime.
https://www.youtube.com/watch?v=0xumyEf-WRI&t=1203s
https://electrek.co/2025/07/29/another-huge-chinese-self-dri...
XPENG (major Chinese ADAS brand) recently decided to copy Tesla's vision-only + AI world-generation data approach, after originally focusing on LIDAR https://electrek.co/2026/04/29/xpeng-vla-2-test-drive-tesla-...
There's also been talk of companies pushing a hybrid LIDAR+vision approach using custom hardware since it's complex to merge the two datasets. So the answer might eventually be somewhere in between instead of companies choosing one or the other depending on costs.
My god, from this video I learned two things:
- Tesla's vision only approach seems a lot more competent than the Lidar suites from smaller Chinese makers. Perhaps I misjudged how necessary Lidar was to achieve safe driving.
- Virtually all of the Chinese car infotainment were basically a 1:1 copy of Tesla's. I couldn't find any that genuinely tried something unique lol
> - Tesla's vision only approach seems a lot more competent than the Lidar suites from smaller Chinese makers. Perhaps I misjudged how necessary Lidar was to achieve safe driving.
Three things can be simultaneously true:
* Tesla's cameras are sufficient for some scenarios.
* Tesla's cameras are insufficient for other scenarios.
* A system with good data and bad algorithmic processing is still going to be bad. The Chinese vehicles almost always fail the tests because they see the obstacle but drive into it anyway.
The article notes that these tests were all done in daylight, where Lidar provides less of an advantage.
Yeah, it's interesting hearing their engineering logic: fewer sensor types means fewer sensor conflicts and faster iteration, and iteration speed is really what matters. I also think people overhyped lidar because they don't understand it, and human nature is to associate things we don't understand with magic. It's not magic: it performs poorly in inclement weather and can have issues with resolution over range and with data processing (although lidar does do a lot of things well).
All of this said, since Karpathy left they have slowly looked at adding sensors back (recently radar), so who knows what the future holds for Tesla's sensor suite.
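To make the "fewer sensor types, faster iteration" argument concrete, here's a toy sketch of the simplest possible fusion step, inverse-variance weighting of two range estimates (purely illustrative numbers and function names, not any vendor's actual pipeline). Even this trivial version implies extra work a single-sensor stack skips: per-sensor noise models, time alignment, and a policy for outright disagreement.

```python
def fuse_range(cam_m, cam_var, lidar_m, lidar_var):
    """Inverse-variance weighted fusion of two distance estimates.

    Each sensor reports a range (m) and a variance; the fused
    estimate weights the lower-variance sensor more heavily.
    A real stack additionally has to calibrate extrinsics,
    time-align the streams, and handle flat disagreement.
    """
    w_cam = 1.0 / cam_var
    w_lidar = 1.0 / lidar_var
    fused = (cam_m * w_cam + lidar_m * w_lidar) / (w_cam + w_lidar)
    fused_var = 1.0 / (w_cam + w_lidar)
    return fused, fused_var

# Camera says 50 m (noisy), lidar says 48 m (precise):
# the fused answer lands close to the lidar estimate.
print(fuse_range(50.0, 4.0, 48.0, 0.25))
```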
It'll be difficult for any single company to compete with Tesla on scale and the AI we have so far rewards scale like no other technology before it.
Yes Waymo exists, but the amount of training data they have is a few orders of magnitude lower.
And yet Waymo is operating a real commercial service in multiple cities, and Tesla is in just one.
The Tesla vehicles failed 4 of the 9 SAE Level 2 tests, during ideal weather conditions. How is it reasonable to operate an autonomous taxi service with these results?
That was 1 year ago.
Neat. I wonder which others will pass. I wonder if safety sense 3 cars will pass too. Speaking of which it’s insane a sienna doesn’t have that. I wish Tesla made a van instead of the cyber truck. Americans and their truck obsession…
A Tesla van would be amazing. Unfortunately not going to happen.
I keep hoping Edison motors up in Canada comes out with a Van conversion kit like their pick up kit
... Do the Edison kits still use the original transmission?
Cause if not, it would be hilarious to do that to a clapped out van...
Looks like they replace the entire drivetrain. Some older Ford vans, like the 90s windstars, were built on small truck platforms, so you could probably do one. But I'm not sure they'd sell one to ya
Still would love to see it. The idea reminds me of farm truck https://okcfarmtruck.com/pages/about
Why would they make something that's like 5% of market share? Tiny car + huge SUV (YL doesn't count) makes far more sense.
yet i still can’t use basic autopilot on the highway because it phantom brakes every 2 hours
Autopilot (no longer for sale) is so unsafe I’m surprised there’s no class action for owners to force Tesla to upgrade it to FSD for free.
Especially in right hand drive markets (non US) it’s even worse than Toyota’s radar cruise.
I’ve nearly been killed by it about 5 times because it randomly steers into fences and things. It also randomly fails to change lanes (1 in 100), and then just randomly steers full lock and goes out of control.
Model 3 - Highland
Autopilot in my 2024 Model Y never changes lanes. That has always been a feature restricted to “Full” Self Driving. Autopilot is just lane assist and cruise control.
I can’t recall anytime either Autopilot or FSD put me in danger though.
The branding is confusing. I’m talking about the paid version of Autopilot, which was for sale for $5000, sometimes called “Autopilot Plus”.
For right hand drive markets, it seems to be a stripped down version of FSD 10 or 11. It automatically changes lanes, takes corners and highway exits, but does not stop at traffic lights. It drives exactly in the middle of the lane, doesn’t shuffle over for trucks, and is easily confused.
Are you sure you own a Tesla? Because nothing you say makes sense.
https://www.drive.com.au/news/tesla-enhanced-autopilot-goes-...
Are you on an extremely old version or something? I have had my model Y for 5 years and it only phantom braked once ever.
Many countries in the world are on a 6+ year old version of Autopilot, yeah..
Really? That's weird. I owned a Tesla in Sweden for 2 years and had perhaps 3 phantom braking events total, and I used Autopilot (not FSD) a lot.
So, that's not my experience with current FSD versions. But whatever, sure. Let's accept your data point as measured:
Every... TWO HOURS?! I mean, come on. Put a camera on yourself or another human driver. There's an unexpected braking event at least that often, almost always in a more dangerous situation. The human failure tends to be failing to detect a real obstacle, vs. slowing for a phantom one.
This is just too much. If you don't like it don't use it. But to pretend that stomps-the-brakes-every-few-hours is a stop ship kind of safety bug is quite frankly ridiculous.
If you are having unexpected braking events every 2 hours you should be paying better attention to your driving. I go months without them
> Every... TWO HOURS?! I mean, come on. Put a camera on yourself or another human driver. There's an unexpected braking event at least that often, almost always in a more dangerous situation
Wait... what are you counting as an "unexpected braking event"? I can't think of anything I do with brakes that would not be counted as ordinary braking that happens anywhere near as often as every two hours.
I mean that you stomp on your brakes to avoid rear ending a car you didn't notice stopping. That is objectively more dangerous a situation than "phantom braking" and far, far, FAR more common.
I know, I know, you're a perfect driver and would never fail to notice an obstacle. But the rest of us aren't.
You should not be driving if you are having to slam on your brakes to avoid rear ending someone every two hours. You are a danger to yourself and everyone around you
I'm pretty sure I only very rarely stomp my brakes: since I drive with one-pedal mode on, I use the brakes rarely enough that I notice when I do.
Maybe I simply don't drive often in circumstances where there are cars suddenly stopping ahead of me? I keep a pretty decent gap between me and the car ahead, unless I'm in stop and go traffic in which case I usually am using adaptive cruise control.
I've been planning to get an OBD2 scanner and an app that can report details on EV battery health. A battery health report is the only thing I'd get from the dealer, which is 15 miles away (and an hour's bus ride away if I don't want to wait there), at scheduled maintenance that I wouldn't get from the independent service center a 10-minute walk away.
Perhaps I'll try to find one that can also log my brake activity to see if I'm somehow just not noticing a lot of brake stomping.
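For what it's worth, if the scanner app can log speed over time, hard-braking events can be inferred without a dedicated brake-pedal reading (standard OBD-II doesn't expose one on most cars). A rough sketch of the deceleration math; the 4 m/s² threshold is an arbitrary guess at "stomping the brakes", not an established cutoff:

```python
def hard_braking_events(samples, threshold_ms2=4.0):
    """Flag timestamps where deceleration exceeds a threshold.

    samples: list of (time_s, speed_kmh) tuples, e.g. from an
    OBD-II speed log sampled once per second.
    """
    events = []
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        # convert km/h delta to m/s, divide by elapsed time
        decel = ((v0 - v1) / 3.6) / (t1 - t0)  # positive = slowing
        if decel > threshold_ms2:
            events.append(t1)
    return events

# Dropping 50 -> 20 km/h in one second is ~8.3 m/s^2:
print(hard_braking_events([(0, 50), (1, 50), (2, 20), (3, 20)]))  # [2]
```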
I, uh, don't think the median driver goes through that. Maybe you tailgate really aggressively?
It's like clockwork. Everyone who hates Tesla just so happens to be a preternaturally fantastic driver. Yet everyone I drive with, and me, and everyone around me isn't. I just can't understand it.
To wit, I, uh, don't believe a word you're saying. No one drives in traffic without an occasional oops. And the people who claim not to are, uh, almost certainly the worst of the bunch.
> It's like clockwork. Everyone who hates Tesla just so happens to be a preternaturally fantastic driver.
I said nothing about Tesla and I said nothing about my driving.
If you really want to know, I would not claim I'm a better than average driver. I don't think "very rarely has to unexpectedly brake hard" is something you need to be a particularly good driver to accomplish.
I wonder if we're going to see a different spin on dieselgate in the future, where a car company collects all the data from the NHTSA's test environment through the car's cameras/sensors and then includes that data in the training datasets for other cars/software updates. (I'm not implying that this happened, but I imagine it will at some point.)
> four newly integrated advanced safety tests

Don't most cars do something like that now? I'm curious what's different between Tesla and, say, a Honda Accord?
The article is vague, but I suspect this refers to FMVSS-127, which makes certain active safety features mandatory in 2029 and also increases the difficulty of some required-to-pass scenarios. The new scenarios require responding from higher initial speeds, which effectively requires longer sensor ranges and/or lower latency.
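To see why higher initial speeds push sensor requirements, a quick back-of-envelope stopping-distance sketch. The latency and deceleration figures below are illustrative assumptions, not FMVSS-127 values: required range grows with the square of speed, so a modest speed bump demands a lot more sensing distance.

```python
def required_detection_range(speed_kmh, latency_s, decel_ms2):
    """Distance needed to stop: distance covered during system
    latency plus braking distance v^2 / (2a)."""
    v = speed_kmh / 3.6  # km/h -> m/s
    return v * latency_s + v ** 2 / (2 * decel_ms2)

# Assuming 0.5 s total pipeline latency and 8 m/s^2 braking:
for kmh in (40, 60, 80, 100):
    print(f"{kmh} km/h -> {required_detection_range(kmh, 0.5, 8.0):.0f} m")
# roughly 13, 26, 42 and 62 m respectively
```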
How much did Honda's CEO give the president in the last election?
10% for the big guy, iirc
There is a big difference between "something like" and actually passing the tests; I would be surprised if any non-vision-based system has the reaction time needed to pass the new pedestrian tests.
Tesla has a Dog mode
Is this only when used with an FSD subscription, since the Model Y no longer has autopilot?
wth man, I was told my Model Y that I bought around 2021 was going to do all this, but now it's too old or something?
Apparently some HW3 cars can get it. It's listed available for my 2022 Model 3 (Australia/Sydney). However the cost is twice what they charge for HW4, I believe.
It seems other HW3 cars might get an FSD-lite version. There's no official way to upgrade from HW3 to HW4.
I'm in the same boat; this is a whole thing right now. There is some kind of class action in Europe which will hopefully make them pay up or deliver something useful. I think a refund plus interest plus a hefty fine for lying would be a good start.
It seems you got musked, overpromised and underdelivered.
He at least bought me a horse
[flagged]
China is more repressive than the United States on basically every metric of fascism/authoritarianism that political scientists actually use. Do we need to elaborate?
[flagged]
You gonna post a source that this happened? Or are you just spreading FUD?
seriously?
It’s pretty well documented that this is the case generally.
A Tesla still can't detect a motorcycle next to it, so I can't see how it would ace the blind spot warning test.
Any other administration and I would be willing to grant the benefit of the doubt, but Musk's spent a lot of money to corrupt government agencies over the past year and a half so that he could get silly pronouncements that the most dangerous "advanced" driving system in the world is somehow also the safest. (More people have been killed by Tesla's ADAS systems than every other automaker's ADAS systems, in the world, combined.)
This is obviously a factually incorrect post as anybody who uses the product can attest
Obviously your priors are wrong, it can ace a blind spot warning test because it can detect a motorcycle next to it.
As of today it can't. The Uber driver was as surprised as the guy he almost hit.