A story just came out in The Verge on how far away self-driving cars are. It is more pessimistic than most on when we will see truly self-driving cars on our existing roads. For those of you who have read my blog posts on the unexpected consequences and the edge cases for self-driving cars, or my technology adoption predictions, you will know that I too am pessimistic about when they will actually arrive. So I tend to agree with this particular story, and with the assessment of the outstanding problems for AI pointed out by the various people interviewed for it.
BUT, there is one section that stands out for me.
Drive.AI founder Andrew Ng, a former Baidu executive and one of the industry’s most prominent boosters, argues the problem is less about building a perfect driving system than training bystanders to anticipate self-driving behavior. In other words, we can make roads safe for the cars instead of the other way around. As an example of an unpredictable case, I asked him whether he thought modern systems could handle a pedestrian on a pogo stick, even if they had never seen one before. “I think many AV teams could handle a pogo stick user in pedestrian crosswalk,” Ng told me. “Having said that, bouncing on a pogo stick in the middle of a highway would be really dangerous.”
“Rather than building AI to solve the pogo stick problem, we should partner with the government to ask people to be lawful and considerate,” he said. “Safety isn’t just about the quality of the AI technology.”
Now I really hope that Andrew didn’t say all this stuff. Really, I hope that. So let’s assume someone else actually said this. Let’s call him Professor Confused, whoever he was, just so we can reference him.
The quoted section above comes right after two paragraphs about recent fatal accidents involving self-driving cars (though in each case the car probably should not have been left unattended by the person in the driver’s seat). Of the three accidents, only one involved a person outside the car: the woman pushing a bicycle across the road in Phoenix this last March, killed by an experimental Uber vehicle driving itself.
In the first sentence Professor Confused seems to be saying that he is giving up on the promise of self-driving cars seamlessly slotting into the existing infrastructure. Now he is saying that every person, every “bystander”, is going to be responsible for changing their behavior to accommodate imperfect self-driving systems. And they are all going to have to be trained! I guess that means all of us.
Whoa!!!!
The great promise of self-driving cars has been that they will eliminate traffic deaths. Now Professor Confused is saying that they will eliminate traffic deaths as long as all humans are trained to change their behavior? What just happened?
If changing everyone’s behavior is on the table, then let’s change everyone’s behavior today, right now, and eliminate the annual 35,000 fatalities on US roads and the 1 million annual fatalities worldwide. Let’s do it today, and save all those lives.
Professor Confused suggests having the government ask people to be lawful. Excellent idea! The government should make it illegal for people to drive drunk, and then ask everyone to obey that law. That will eliminate roughly a third of the deaths on US roads immediately. Let’s just do that today!
Oh, wait…
I don’t know who the real Professor Confused is that the reporter spoke to. But whoever it is just completely upended the whole rationale for self-driving cars. Now the goal, according to Professor Confused, as reported here, is self-driving cars, right or wrong, über alles (so to speak). And you people who think you know how to get around safely on the street today had better beware, or those self-driving cars are licensed to kill you and it will be your own damn fault.
PS This is why the world’s relative handful of self-driving train systems have elaborate safeguards to make sure that people can never get onto the tracks. Take a look next time you are at an airport and you will see the glass wall and doors that keep you separated from the track at all times when you are on the platform. And the track does not intersect with any pedestrian or other transport route. The track is routed above and under them all. We are more likely to geofence self-driving cars than accept poor safety from them in our human spaces.
PPS Dear Professor Confused, first rule of product management: if you need the government to coerce a change in the behavior of all your potential customers in order for them to become your actual customers, then you don’t got no customers for what you are trying to sell. Hmmm. In this case I guess they are not your customers. They are just the potential, literal, roadkill on the way to the self-satisfaction your actual customers will experience knowing that they have gotten the very latest gee-whiz technology all for themselves.