Ask HN: Why are we so far from reaching acceptable autonomous driving?
We seem to have large datasets that can be used, and the best of algorithms. Why have we as a species failed to make acceptable autonomous driving a reality by now? What is missing? Do we not have the math to do this, like some folks say we don't have the math to tackle the hardest problems in quantum mechanics? (By acceptable I mean a state wherein there exists a handful of people using it to drive across all terrains and in all weather conditions a normal human is capable of operating a car in.)

I recently read about this in an article [1] about Tesla's ambition to build a level 3 autopilot. The article points out several issues with this project that I hadn't thought about before reading it. Basically, there are more issues than just "finding the right math", and on different levels, not only the technical one. I'd recommend reading it.

[1] https://dev.to/bosepchuk/is-it-ethical-to-work-on-the-tesla-...

There's a really huge variety of stuff that can happen when you're driving a car. In my lifetime, I've seen:

- A signal light malfunction such that I had a green turn arrow across the path of oncoming traffic, where the oncoming traffic also had a green light.
- The axle break on the van going around the cloverleaf in front of me (and props to the driver for not rolling the van after the tire rolled off; I'd be surprised if autonomous drivers were trained for that).
- A pickup truck drop an extension ladder on the freeway. It hit in such a way that it started spinning, and the centrifugal force then made it extend. So there was this growing, spinning, sliding thing in the middle of the freeway. It was the single hardest-to-avoid thing I've ever experienced.
- A baby stroller rolling across the crosswalk across my path (unaccompanied by any adults!) just as my light turned green.
- Hydroplaning.
- Torrential rain.
- Snow blowing across the road in the grip of 100-mile-an-hour winds.
- Dirt roads with rocks in the middle that can take out your oil pan.
- Failing to clear my car of all the snow on it before I drove it. When I hit a red light and hit the brakes hard, the remaining snow dumped onto the road in front of me, and I started to slide on what had been (until a moment ago) a clear road.
- A semi in the next lane hitting a groove in the pavement that was full of water, sending a wall of water across my windshield and rendering me abruptly blind. On a curve. With a semi beside me.

Sure, you can train the autonomous driver to handle each one of these. But that's just stuff that I personally have seen. You've made a list of goofy circumstances to train for, but did you get all the ones that actually happen? No, you didn't. How's it going to do in some strange situation it never trained for? Such situations are going to occur.

Out of curiosity, how DID you handle the situation with the semi?

I hit the brakes to try to get out of the wall of water it was throwing, tried to remember in my head what the shape of the curve looked like from my last look at it just before the water hit, and just waited to be able to see again. It took much longer than I wanted it to...

That sounds terrifying. Oddly, an autonomous vehicle might have the advantage in that situation, since ideally it'd have some representation of the shape of the road ahead that would persist for a few seconds even if its sensors were blinded briefly.
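Purely as an illustration of that persistence idea, here's a minimal Python sketch. Everything in it is an assumption for the example: the RoadMemory class, the world-frame centerline points from perception, and the speed/yaw-rate odometry feed are all hypothetical names, not any real stack's API.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # metres, world frame
    y: float        # metres, world frame
    heading: float  # radians

class RoadMemory:
    """Keeps the last confidently perceived road centerline (in world
    coordinates) so the planner can keep tracking it for a short grace
    period after the cameras are blinded."""

    def __init__(self, max_blind_time_s: float = 3.0):
        self.centerline: list[tuple[float, float]] = []  # world-frame points
        self.blind_for_s = 0.0
        self.max_blind_time_s = max_blind_time_s

    def update_from_perception(self, centerline_world):
        # Called whenever vision produces a fresh, trusted lane estimate.
        self.centerline = list(centerline_world)
        self.blind_for_s = 0.0

    def dead_reckon(self, pose: Pose, speed_mps: float,
                    yaw_rate_rps: float, dt_s: float) -> Pose:
        # While blind, propagate the ego pose from wheel odometry / IMU;
        # the stored centerline stays valid in world coordinates.
        self.blind_for_s += dt_s
        heading = pose.heading + yaw_rate_rps * dt_s
        return Pose(pose.x + speed_mps * math.cos(heading) * dt_s,
                    pose.y + speed_mps * math.sin(heading) * dt_s,
                    heading)

    def should_degrade(self) -> bool:
        # Past the grace period, odometry drift makes the stored
        # centerline untrustworthy: slow down or pull over instead.
        return self.blind_for_s > self.max_blind_time_s
```

While `should_degrade()` is false, a controller could keep steering toward the nearest stored centerline point, which is roughly what the blinded human driver did from memory; once it trips, the safe move is to shed speed rather than keep trusting drifting odometry.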
Kinda off topic, but pranks for the next generation are going to be amazing. Code is just so predictable.

I can picture a couple of teens blocking an empty self-driving Uber/Waymo going to pick up its next passenger, covering its cameras with paint or sticky notes, and leaving an empty, disabled car in the middle of an intersection for the police to deal with. I don't think the AI is going to be able to get out and chase them down. Yes, there are cameras, but a handkerchief and a hoodie will solve that. Genuinely curious as to how they're going to solve this problem.