Moon Tricks Tesla's Full Self-Driving Feature into Thinking It's a Yellow Light

interestingengineering.com

29 points by clashmeifyoucan 4 years ago · 13 comments

ayane_m 4 years ago

I'm not going to knock Tesla for trying a primarily visual approach, but I will knock them for insisting that it will work out to level 4 autonomous driving with just software.

I predict that these kinds of unexpected edge cases will keep popping up for several years. It certainly will be interesting to see how robust their image processing algorithms get as time goes on, but I wouldn't hold my breath for reliable full self driving for a while.

  • carlmr 4 years ago

    I would hold it against them. I think we should start with as many sensors as is necessary to get it working robustly. Over time we can then strip one or the other sensor as we see what we can do with software improvements.

    First get something working, then reduce costs, not the other way around.

    • revnode 4 years ago

      With hardware comes software. Having fewer targets for your software can simplify the software and lead to superior results. Edge cases will always be there. You might be introducing some in a particular area with their vision-only approach, but reducing them substantially elsewhere. With radar, they had phantom braking events, and according to them, that's the primary reason they want to ditch it.

      I'm on the fence as to whether it's a good or bad decision ... but let's not pretend they're idiots.

  • BenjiWiebe 4 years ago

    Wouldn't two cameras and parallax show that the moon is a bit too far away to worry about?

    • BugsJustFindMe 4 years ago

      It would if the cameras were good enough, plentiful enough, and properly rectified. It seems pretty clear from all of the failure stories that Tesla either does not do stereopsis up front to establish geometry at all, or doesn't do it enough, and instead relies primarily on blackbox identification NNs.

    • xyzzy21 4 years ago

      Beyond distances much larger than the baseline between the cameras, the depth accuracy you can resolve from parallax very quickly drops into the noise floor.

      E.g. just a quarter mile is already in the noise.

      • BugsJustFindMe 4 years ago

        > just a quarter mile is already in the noise

        There's no reason to care about precisely measuring things a quarter of a mile away. Human drivers don't.

    • twiddlebits 4 years ago

      You can do structure from motion on a single camera without the need for stereo.
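The noise-floor claim in this subthread can be sanity-checked with the standard first-order stereo error model: depth from disparity is Z = f·B/d, so the depth uncertainty is roughly ΔZ ≈ Z²·Δd / (f·B). The sketch below uses hypothetical numbers (a 15 cm baseline, a 1000 px focal length, and 0.25 px of disparity noise are illustrative assumptions, not Tesla's actual camera specs):

```python
# Rough first-order stereo depth-error estimate.
# Depth from disparity: Z = f * B / d  =>  error dZ ≈ Z^2 * Δd / (f * B)

def stereo_depth_error(range_m, baseline_m, focal_px, disparity_noise_px):
    """Return (disparity in px, depth uncertainty in m) at the given range."""
    disparity_px = focal_px * baseline_m / range_m
    depth_err_m = range_m ** 2 * disparity_noise_px / (focal_px * baseline_m)
    return disparity_px, depth_err_m

# Assumed (hypothetical) rig: 15 cm baseline, 1000 px focal length,
# 0.25 px disparity noise; a quarter mile is about 402 m.
d, err = stereo_depth_error(402.0, 0.15, 1000.0, 0.25)
print(f"disparity ~ {d:.2f} px, depth error ~ +/-{err:.0f} m")
# prints: disparity ~ 0.37 px, depth error ~ +/-269 m
```

With those assumptions, the signal itself is subpixel and the uncertainty is hundreds of meters, consistent with xyzzy21's point that a quarter mile is already in the noise, and with BugsJustFindMe's point that precise ranging at that distance doesn't matter for driving anyway.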

HWR_14 4 years ago

Just planning ahead. Musk knows that soon he'll be on Starlink 5, completely blocking out the sun and moon. Why bother with programming a workaround for an issue with a better solution coming!

Tempest1981 4 years ago

Discussed here 2 days ago (85 comments):

https://news.ycombinator.com/item?id=27934539

kevin42 4 years ago

I had this happen on my commute to work, but from the sun. It's really smoky here, and the sun was breaking through the smoke just enough and at the right angle to look like a yellow light. It was pretty amusing to see a bunch of yellow traffic lights stream by as I was driving.

high_byte 4 years ago

"LIDAR is stupid." and also "Tesla to collaborate with SpaceX on star-shade" lol

simonh 4 years ago

How long before Teslas start identifying Jupiter and weather balloons as UFOs?
