Moon Tricks Tesla's Full Self-Driving Feature into Thinking It's a Yellow Light
interestingengineering.com

I'm not going to knock Tesla for trying a vision-only approach, but I will knock them for insisting it will work out to Level 4 autonomous driving with just software.
I predict that these kinds of unexpected edge cases will keep popping up for several years. It will certainly be interesting to see how robust their image processing algorithms get as time goes on, but I wouldn't hold my breath for reliable full self-driving for a while.
I would hold it against them. I think we should start with as many sensors as are necessary to get it working robustly. Over time we can then strip out one sensor or another as we see what software improvements can do.
First get something working, then reduce costs, not the other way around.
With hardware comes software. Having fewer sensor inputs to handle can simplify the software and lead to superior results. Edge cases will always be there. The vision-only approach might introduce some in a particular area while substantially reducing them elsewhere. With radar, they had phantom braking events, and according to them that's the primary reason they want to ditch it.
I'm on the fence as to whether it's a good or bad decision ... but let's not pretend they're idiots.
Wouldn't two cameras and parallax show that the moon is a bit too far away to worry about?
It would if the cameras were good enough, plentiful enough, and properly rectified. It seems pretty clear from all of the failure stories that Tesla either doesn't do stereopsis up front to establish geometry at all, or doesn't do it enough, and instead relies primarily on black-box identification NNs.
Once the range gets large relative to the baseline between the cameras, the depth accuracy you can resolve from parallax drops into the noise floor very quickly.
E.g. just a quarter mile is already in the noise.
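To put rough numbers on it, here's a minimal sketch with hypothetical camera parameters (a 0.3 m baseline, 1400 px focal length, and quarter-pixel disparity precision; these are illustrative assumptions, not Tesla's actual specs). Depth from disparity is Z = f·B/d, so a small disparity error grows quadratically with range:

```python
# Back-of-the-envelope stereo depth uncertainty.
# All camera parameters below are assumed for illustration.

def depth_error(z_m, baseline_m=0.3, focal_px=1400.0, disp_err_px=0.25):
    """Approximate depth uncertainty at range z_m.

    Depth from disparity: Z = f * B / d, so a disparity error
    delta_d maps to a depth error of roughly Z^2 / (f * B) * delta_d.
    """
    return (z_m ** 2) / (focal_px * baseline_m) * disp_err_px

for z in (10, 50, 100, 400):  # 400 m is about a quarter mile
    print(f"Z = {z:4d} m -> depth uncertainty ~ {depth_error(z):6.1f} m")

# Approximate output:
# Z =   10 m -> depth uncertainty ~    0.1 m
# Z =   50 m -> depth uncertainty ~    1.5 m
# Z =  100 m -> depth uncertainty ~    6.0 m
# Z =  400 m -> depth uncertainty ~   95.2 m

# The moon at ~3.8e8 m would produce a disparity of about 1e-6 px
# with this baseline, i.e. it is indistinguishable from infinity.
```

With those assumptions, the uncertainty at a quarter mile is on the order of the distance itself, which is what "in the noise" means here.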
> just a quarter mile is already in the noise
There's no reason to care about precisely measuring things a quarter of a mile away. Human drivers don't.
You can do structure from motion on a single camera without the need for stereo.
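Roughly like this with OpenCV (an illustrative two-frame sketch, not Tesla's pipeline; the intrinsic matrix and file names are placeholders). Two frames from one moving camera act like a stereo pair whose baseline is the distance driven between frames:

```python
# Minimal monocular structure-from-motion sketch (assumed setup).
import cv2
import numpy as np

img1 = cv2.imread("frame_t0.png", cv2.IMREAD_GRAYSCALE)  # placeholder frames
img2 = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)
K = np.array([[1400.0, 0, 960],   # assumed camera intrinsics
              [0, 1400.0, 540],
              [0, 0, 1]])

# Match features between the two frames.
orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Recover relative camera motion, then triangulate 3D points.
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                  prob=0.999, threshold=1.0)
_, R, t, inliers = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
depths = pts4d[2] / pts4d[3]  # z-coordinates, up to an unknown scale

# Caveat: monocular SfM only recovers structure up to scale; absolute
# distance needs something extra, e.g. wheel odometry. A point at
# effectively infinite depth, like the moon, shows ~zero parallax
# between frames, which is itself a usable "very far away" signal.
```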
Just planning ahead. Musk knows that soon he'll be on Starlink 5, completely blocking out the sun and moon. Why bother programming a workaround for an issue when a better solution is coming!
Discussed here 2 days ago (85 comments):
I had this happen on my commute to work, but from the sun. It's really smoky here, and the sun was breaking through the smoke just enough and at the right angle to look like a yellow light. It was pretty amusing to see a bunch of yellow traffic lights stream by as I was driving.
"LIDAR is stupid." and also "Tesla to collaborate with SpaceX on star-shade" lol
How long before Teslas start identifying Jupiter and weather balloons as UFOs?