Tesla actually re-introduced radar sensors in HW4. https://www.teslarati.com/tesla-hardware-4-hd-radar-first-lo...
They might not use them for Autopilot, but perhaps for emergency braking when everything else has failed.
I wouldn't be surprised if this were a better solution. While radar may have worse spatial resolution, its depth perception, speed measurement capability, and general robustness to adverse weather might make it the better complementary sensor.
Lidar struggles with things like rain and snow way worse than cameras do.
Is there anyone using only cameras except Tesla?
Xpeng, Wayne, aiMotive to name three. Probably many others, who claim to use LIDAR but don’t actually rely on it. Because LIDAR is perceived as a prerequisite for autonomous safety, admitting to not needing it is a bad PR move — for now.
There is a massive technical difference between a vision-first architecture with LiDAR redundancy and Tesla's no-LiDAR-at-all approach. Those are not the same architecture, so claiming XPeng, Waymo, or aiMotive validate Tesla is technically misleading.
XPeng's system is sensor fusion; it is not camera-only. Waymo is even clearer: for them, LiDAR is not optional. aiMotive has started to market a camera-only stack, but it's experimental, with no production deployments.
Xpeng is abandoning sensor fusion. aiMotive has never bothered with sensor fusion. I never mentioned Waymo; unfortunately the AI gods at Apple auto-corrected me typing Wayve, as in Wayve Technologies Ltd.
Tesla FSD is not accurately described as a "no LIDAR at all" approach, if you're attempting to contrast it to other LIDAR-trained systems like aiMotive, Xpeng and Wayve.
Tesla is using radars as well as cameras. No one is using only cameras.
Nope...
Yes, silly using just cameras. I mean, humans have Lidar sensors, that's why they can drive; why didn't we just copy that... oh wait.
In all seriousness though, Tesla are producing cyber cabs now which are a 10th of the price of Waymo's and can drive autonomously anywhere in the world. I think we can see where this is going. (Hint: not well for Waymo)
Also the article is speculative 'MicroVision says its sensor could one day break the $100 barrier'. One day...
Humans also don't have wheels, but we build objects with wheels. It is as if we can build objects that don't resemble humans for specific purposes. Crazy...
> Tesla are producing cyber cabs now which are a 10th of the price of Waymo's and can drive autonomously anywhere in the world.
My understanding is that cyber cabs still need safety drivers to operate, is that not the case?
They have no steering wheel or pedals, so no.
Yes, but they are useless: they can't steer, which is why they have more accidents than humans per mile driven.
Robotaxis in Austin are in the process of removing in-car safety monitors; there is a chance you would get one today.
They are just moving the safety monitor into a car that drives behind you.
https://electrek.co/2026/01/22/tesla-didnt-remove-the-robota...
It would be funny, but tbh it's just sad.
Everything for the stock pump
Tesla robotaxi crash rates are also currently (as in, with safety drivers) 4x higher than humans', so that's not very promising.
> Tesla are producing cyber cabs now which are a 10th of the price of Waymo's and can drive autonomously anywhere in the world.
Wait, what? When did they actually enter mass production?
> I mean humans have Lidar sensors
Real-time SLAM is actually pretty good; the hard part is reliable object detection using just vision. Tesla's forward facing cameras are effectively monocular, which means it's much, much harder to get depth (not impossible, but moving objects are much more difficult to observe if you only have cameras aligned on the same plane with no real parallax).
Ultimately Musk is right: you probably don't need lidar to drive safely. But it's far simpler and easier to do if you have lidar. It's also safer. Musk said "lidars are a crutch" not because he is some sort of genius; it's been obvious since the mid-00s (if not earlier) that SLAM-only driving is the way forward. He said it because he thought he could save money by not having lidar. The problem for him is that he didn't do the research to see how far away machine perception is from the last 1% of accuracy needed to make vision-only driving safe and reliable.
> Tesla's forward facing cameras are effectively monocular
Notably, human perception is effectively monocular in driving situations at distances of 60 feet or farther. It's best in the area where your limbs can reach.
We don't need stereoscopic vision to drive.
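A back-of-the-envelope sketch of why stereo stops helping at distance: depth discrimination error grows with the square of range. The baseline and acuity numbers below are rough textbook assumptions, not measurements.

```python
import math

# Assumed round numbers (not measured values):
#   BASELINE_M: human interpupillary baseline, ~6.5 cm
#   ACUITY_RAD: smallest detectable disparity change, ~10 arcseconds
BASELINE_M = 0.065
ACUITY_RAD = 10 / 3600 * math.pi / 180  # 10 arcsec in radians

def depth_resolution(z_m: float) -> float:
    # Small-angle approximation: delta_z ~= z^2 * delta_theta / B.
    # The error term carries z squared, so it degrades quadratically.
    return z_m ** 2 * ACUITY_RAD / BASELINE_M

for z in (2, 20, 60):
    print(f"{z:>2} m -> depth discrimination ~ +/- {depth_resolution(z) * 100:.1f} cm")
```

Up close the discrimination is millimetre-scale; by 60 m it is metres, which is roughly why vision at driving distances leans on monocular cues (apparent size, motion parallax, occlusion) rather than stereopsis.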
> Wait what? when did they actually enter mass production?
"mass" is a strong word but the first one came off their production line 5 days ago
ramp to high volume will probably be extremely slow
Not mass production yet, but the first one rolled off the completed assembly line at Giga Texas last week.
Sensor fusion is not far simpler: when the sensors disagree, and they often will, you have to pick which one to trust.
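To make the disagreement problem concrete, here's a toy sketch of the decision a fusion stack has to bake in: fuse when the sensors roughly agree, pick a winner when they don't. All numbers are illustrative, not any real vehicle's tuning.

```python
def fuse_ranges(cam_m, cam_var, lidar_m, lidar_var, gate_sigma=3.0):
    """Inverse-variance fusion of two range estimates, with a consistency gate.

    If the readings differ by more than gate_sigma combined standard
    deviations, fall back to the lower-variance (more trusted) sensor alone.
    """
    combined_std = (cam_var + lidar_var) ** 0.5
    if abs(cam_m - lidar_m) > gate_sigma * combined_std:
        # Sensors disagree: a policy decision, not math, picks the winner.
        return lidar_m if lidar_var < cam_var else cam_m
    # Sensors agree: weight each reading by the inverse of its variance.
    w_cam, w_lidar = 1.0 / cam_var, 1.0 / lidar_var
    return (w_cam * cam_m + w_lidar * lidar_m) / (w_cam + w_lidar)

print(fuse_ranges(50.0, 4.0, 49.0, 0.25))  # agreement: pulled toward lidar
print(fuse_ranges(50.0, 4.0, 20.0, 0.25))  # disagreement: lidar wins outright
```

The hard part isn't the weighted average; it's the gate and the fallback policy, which is exactly the "which do you trust?" question above.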
It is amazing to see how many people here are confident they know the one true way to build autonomous systems based on nothing but wanting to confirm their biases
This is a weirdly tired counterpoint that Elon and Elonstans like to bandy about as if it's an apples-to-apples comparison. Humans have an ultra-high-dynamic-range binocular vision system mounted on an advanced pan-tilt-zoom swivel gimbal that allows for a great degree of freedom of movement, parallax effects, and a complex heuristic system for analyzing vision data.
The Tesla FSD system has... well, sure, a few more cameras, but they're low resolution, and in inconveniently fixed locations.
My alley has an occlusion at the corner where it connects to the main road: a very tall, very ample bush that makes it basically impossible to authoritatively check oncoming traffic to my left. I, a human, can determine that if I see the light flicker even slightly as it filters through the bushes, the path is not clear: a car is likely causing that very slight change in light. My Tesla has no clue at all that that's happening. And worse, the perpendicular camera responsible for checking cross-traffic is mounted _behind my head_ on the b-pillar, in a fixed location, which means that without nosing my car _into_ the travel lane, there is literally no way for it to be sure the path is clear.
This edge case is navigated near-perfectly by Waymo, since its roof-mounted lidar can see above and beyond the bush and determine that the path is clear. And to hit back on the "Tesla is making cheaper cars that can drive autonomously anywhere in the world" claim: I mean, they still aren't? Not authoritatively. Not authoritatively enough that they aren't seeing all sorts of interventions in the few "driverless" trials they're doing in Austin. Not authoritatively enough in my own experience with Tesla FSD. It works well enough on the fat part of the bell curve, but those edges will get you, and a vision-only system is extremely brittle in certain conditions and failure modes that a lidar/radar backup would help cover.
Moreover, Waymo has brought lidar development in-house, they're working to dramatically reduce their vehicle platform cost by reducing some redundant sensors, and they can now simulate a ground truth model of an absurd number of edge cases and odd scenarios, as well as simulate different conditions for real-world locations in parallel with their new world modeling systems.
None of which reads to me as "not going well for Waymo." Waymo completes over 450,000 fully autonomous rides per week right now. They're dramatically lowering their own barriers to new cities/geographies/conditions, and they're pushing down the cost per unit substantially. Yeah, it won't get to be as cheap as Tesla owning the entire means of production, but I'm still extremely bullish on Waymo being the frontrunner for autonomous driving for the foreseeable future.
Waymos are still making lots of errors that a human wouldn't (stopping in the middle of a road due to a puddle was a recent one: https://dmnews.co.uk/waymo-robotaxi-spotted-unable-to-cross-...). 17 years after betting on LIDAR, I think Tesla is ahead now in most respects. I could be wrong though; we will probably know by the end of this year.
> I think Tesla is ahead now in most respects
Do you actually own a Tesla? I do. With FSD. And let me assure you, you are very wrong.
How old? The 2023+ models with HW4 are pretty good at FSD. A 2021 model with HW3 was scary bad when I tried it about a year and a half ago.
> I tried it about a year and a half ago.
So, you do not own a Tesla.
> My Tesla has no clue at all that that's happening. And worse, the perpendicular camera responsible for checking cross-traffic is mounted _behind my head_ on the b-pillar
It has a wide-angle camera in front that you usually can never see outside the service menu. It should cover that case.
> Yes, silly using just cameras. I mean, humans have Lidar sensors, that's why they can drive; why didn't we just copy that... oh wait.
Humans don't have wheels and cannot go 70MPH. Humans also don't have rear view cameras and cannot process video feeds from 8 cameras simultaneously. The point of these machines is to be better than humans for transportation. If adding LIDAR means that these vehicles can see better than humans and avoid accidents that humans do get into, then I for one want them in my vehicle.
The human brain is a product of millions of years of dealing with spatial problems for survival — and most individual humans are the product of thousands of hours of experience using it to navigate the physical world.
We're always getting closer to emulating this, but we're still a ways off from matching it.
I don't understand what you're saying.
Stereo-based depth mapping is kind of bad, especially so if it is not IR-assisted. The quality you get from lidar out of the box is crazy good in comparison.
What you can do is train a model using both the camera and lidar data to produce a good disparity and depth map, but this just means you're using more lidar, not less.
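For a sense of why raw stereo output gets coarse at range: depth is f·B/d, and the disparity d is quantized to pixels, so a one-pixel matching error blows up quadratically with distance. The focal length and baseline below are generic assumed values, not any particular rig's.

```python
# Assumed camera parameters (illustrative only):
FOCAL_PX = 1000.0   # focal length, in pixels
BASELINE_M = 0.3    # distance between the two cameras, in metres

def depth_from_disparity(d_px: float) -> float:
    # Standard rectified-stereo relation: depth = focal * baseline / disparity.
    return FOCAL_PX * BASELINE_M / d_px

# Depth jump caused by a one-pixel disparity error, at decreasing disparities
# (i.e., increasing range):
for d in (100, 10, 4):
    z = depth_from_disparity(d)
    step = depth_from_disparity(d - 1) - z
    print(f"disparity {d:>3} px -> depth {z:6.1f} m, one-pixel error ~ {step:.2f} m")
```

At a few metres the quantization step is centimetres; at highway range a single-pixel mismatch shifts the estimate by tens of metres, which is the gap lidar fills "out of the box".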
> In all seriousness though, Tesla are producing cyber cabs now which are a 10th of the price of Waymo's and can drive autonomously anywhere in the world. I think we can see where this is going. (Hint: not well for Waymo)
This feels like a highly misleading claim that might technically be true in the sense that there are fewer restrictions, but a reduction in restrictions doesn't imply an increase in capability.
The comment about Waymo seems to be particularly myopic. Waymo has self driving technology and is operating as a financially successful business. There is no conceivable situation where the mere existence of competition with almost the same capabilities would shake that up. Why isn't it companies like Uber, who have significantly fallen behind, that are in trouble?
>Also the article is speculative 'MicroVision says its sensor could one day break the $100 barrier'. One day...
And so is the comment about Tesla cyber cabs.
Humans cannot drive safely. Human drivers kill someone every 26 seconds. Waymos have never killed a person.
Part of that is that humans are distractible, and their performance can be degraded in many ways, and that silicon thinks faster than meat.
But part of it is the sensor suite. Look at Waymo vs Tesla robotaxi accident rates.
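The "every 26 seconds" figure is worth a quick sanity check; it lines up with the commonly cited ~1.2 million road traffic deaths per year worldwide:

```python
# One death every 26 seconds, over a year:
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000
deaths_per_year = SECONDS_PER_YEAR / 26
print(f"~{deaths_per_year:,.0f} road deaths per year")  # ~1.2 million
```

So the cadence quoted is the global rate across all drivers and roads, not a per-fleet figure; comparing it to any single robotaxi fleet needs per-mile normalization, as the last line above suggests.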