Man trapped inside driverless Waymo car as it spins in circles
bbc.co.uk

My guess is that the car is looking for a place to pull over (perhaps because the rider pressed the "pull over" button). In this mode, the car hugs the right side of the road until it can find a spot to stop, which would normally work alright, except in this one particular spot where the "right side of the road" is only this tiny island.
So the car just circles around it indefinitely.
It's all of these edge cases that make self-driving cars impractical outside of optimal conditions.
Buridan's principle will tell you that there will always be edge cases.
There will always be room for improvement. The best you can do is to make them vanishingly rare.
That would be the definition of an edge case
Islands are so common in driving that I would expect them to be anticipated or encountered early on in development.
The video is wild - Waymo support apparently struggles to remotely control the car.
these things need to have a giant E-STOP button accessible to the passengers
otherwise they should simply be banned
Absolutely not. While that might sound like a safety feature it can also be extremely unsafe.
Imagine letting a passenger in a car use the e-brake at any time.
It's a tricky problem.
> Imagine letting a passenger in a car use the e-brake at any time.
Alright, I'm imagining. Seems extremely sensible, especially in an AI-operated vehicle.
Much of the world (the US being an outlier) puts passenger-facing emergency brakes in every carriage of passenger trains.
Buses and trains already have publicly usable e-stops. Hell, English ones even have them on the OUTSIDE of the bus.
Clearly autonomous vehicles need e-stops, like all automated machines.
they tried that. in an emergency they just stopped. the problem was they were stopping in the middle of traffic and in the middle of intersections, so it's now a pull-over button. or just open the door.
I expressed this exact same concern a couple months ago (and was downvoted, a common theme w/ Google fanbois nowadays):
https://news.ycombinator.com/item?id=42117343
From @simonw back then,
>Yes, there's a "pullover now" button on the dashboard at all times.
Well, doesn't seem to be the case here.
The pull over button is in fact on the screen.
According to his spokesperson, he wasn't aware of it.
the button is giant and blue and impossible to miss and you can see from the video it isn't on the screen. the screen in the video is mostly brown with a sliver of blue on the side.
Realistically, how much can she do from a cubicle in the Philippines?
Maybe driverless cars should have remained a relic of 80s Schwarzenegger movies.
I guess the only option they have is "pull over" which in this case just caused the car to continue circling looking for a safe place to pull over. If they had an actual kill switch, we'd probably be watching another video of some guy on a call to waymo support while stuck in the middle of a highway.
> If they had an actual kill switch
I'm unfamiliar - we have these in train cars, did the architects behind driverless techshit not think it was necessary?
To be clear, I'm talking specifically about the first line of support at Waymo here. I am not precluding that they have higher levels of control behind layers of authorisation.
Train cars? As in the vehicle on a track where an immediate stop is almost never more dangerous?
Yes, in much of the world there are mandatory passenger-facing emergency brake levers in every carriage of passenger trains. The US is the outlier here.
And yes, passengers should absolutely be able to bring their vehicle to an immediate stop. It's an "emergency brake"! Of course you need an emergency brake in an autonomous car! What exact alternative are you proposing for when you're in an AI-operated car hurtling under the chassis of a white truck that it failed to detect in snow conditions?
It seems like an incredibly obvious and basic legislative requirement for self-driving cars to have some kind of immediate manual brake for emergencies. I'm kind of shocked that that apparently isn't the case now?
I didn't say pull the hand brake at 60mph.
There should be an emergency "pull over and stop gracefully" button.
They do. I'm guessing there was a bug in finding a place to safely pull over?
Sounds likely, in which case there needs to be a much more "break glass in case of emergency" control which gradually lowers the maximum speed cap of the vehicle.
So even if the vision/pathfinding believes there is nowhere to park and nowhere else to turn, it will still coast to a stop in a way that is not inherently less safe than an ordinary car running out of gas and stalling on the road.
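For illustration, here's a minimal sketch (Python) of what such a decaying speed cap could look like. The class, the tick interface, and the 1.5 m/s² figure are all invented assumptions for the sake of the example, not anything from Waymo's actual stack:

    # Hypothetical "break glass" control: once engaged, the speed cap
    # ratchets down toward zero on every control tick, so the vehicle
    # coasts to a stop even if the planner never finds a pull-over spot.
    DECAY_MPS2 = 1.5  # assumed gentle deceleration, roughly 0.15 g

    class EmergencySpeedCap:
        def __init__(self, normal_cap_mps: float):
            self.cap_mps = normal_cap_mps
            self.engaged = False

        def engage(self) -> None:
            # Passenger pressed the break-glass control; one-way latch.
            self.engaged = True

        def update(self, dt_s: float) -> None:
            # Called every control tick; lowers the cap once engaged.
            if self.engaged:
                self.cap_mps = max(0.0, self.cap_mps - DECAY_MPS2 * dt_s)

        def clamp(self, planner_speed_mps: float) -> float:
            # Whatever speed the planner commands can never exceed the cap.
            return min(planner_speed_mps, self.cap_mps)

At 1.5 m/s², a car doing 30 mph (about 13 m/s) would be stationary in under ten seconds, roughly comparable to stalling on an empty tank.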
If they can remotely kill the engine of any Waymo car in motion, so can hackers.
If they can't secure it, then the entire project should be scrapped.
But GM already has this capability with OnStar. Stolen vehicles can be slowed down, then stopped and disabled remotely.
It's easy enough to imagine an actual emergency which would necessitate remote or local intervention to stop the car, and the call seems to indicate that they don't have an emergency override or at least not without escalation.
What if there were:
- a medical emergency of the passenger
- a crash up ahead
- a fire up ahead
- an earthquake
- a flood
- a malfunction of the driverless car
- really anything that would make you pull over your actual car to the side of the road for your own safety or emergent needs.
And then you have to ask whether, even with an e-stop button, you're in a less safe situation if you don't have the ability to reach the wheel from the back seat.
Those are concerns that would give me pause.
There's a Pull Over button that will stop the vehicle.
He was "not immediately aware" of it and "did not have an opportunity to use it."
edit: there's a small blurry one that plausibly could be it at the very end of the video.
~you can see in the video that it's not showing that button on the screen though.~
It is a little blurry, but the icon for pull over appears to be visible at 1:41 on the screen at the bottom.
ah you're right, edited.
these conditions will be, if not already, covered in the TOS, exempting the operator from being responsible for them.
I usually think it's a stretch when people compare new tech to old sci-fi stories, but "brilliant machine just runs in circles endlessly and nobody can stop it" is straight out of I, Robot.
I have never worked on a single robot that didn't go through a circle spin phase. It's a meme at this point, at least for folks I've worked with.
I remember "Speedy": https://en.wikipedia.org/wiki/Runaround_(story)
So the guy is in the car and he's concerned about catching his flight but when the operator asks him to do something in the waymo app he doesn't want to do it. Could it be that he'd rather keep filming for internet notoriety than stop filming and actually solve his problem?
Could it be that Waymo should just be able to stop the car? It doesn't seem at all ridiculous to you that he is already on the phone with them, and then they just read the script to tell him to pull out his phone and fire up the app? Just what I want is to ride in a car with Comcast level of customer service.
>Could it be that Waymo should just be able to stop the car?
That kind of remote control opens up the possibility - maliciously or accidentally, likely or unlikely - for every Waymo in the fleet to abruptly stop, regardless of whether it's safe to do so. That scenario is orders of magnitude worse than "sometimes a Waymo gets lost in a parking lot and it takes a thirty second call to fix it".
You don't think Google is able to stop a Waymo vehicle? They can literally control it remotely. The issue is that the person behind the phone did not have the authorization or access.
Do you think customer support agents should have that capability?
Of course not, but I'm open to the idea that they should be able to escalate to someone who can. This is a transitional period with self-driving cars, and it helps to mitigate serious potential safety issues.
Ideally, in the future you would have to expressly press some button to send an OTP to someone you wanted to allow to control your car, with fine-grained permissions (see the sketch below).
Will that future happen? Probably not, and we'll probably see extreme corpgov oversight and mass reduction in individual freedom in return for convenience.
Anyway, I was only responding to OP's claim that support for remote access capabilities could lead to an exploit: there are already remote access capabilities in these cars.
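To make the OTP idea above concrete, here's a minimal sketch (Python). Every scope name, the TTL, and the interface are hypothetical assumptions, not anything Waymo or OnStar actually ships:

    # Hypothetical passenger-initiated grant: the cabin screen shows a
    # one-time code, and a support agent who presents it gets a short-lived,
    # narrowly scoped permission; note there is no "drive" scope at all.
    import secrets
    import time

    ALLOWED_SCOPES = {"pull_over", "unlock_doors", "reroute"}
    GRANT_TTL_S = 300  # grant self-expires after five minutes

    class RemoteAccessGrant:
        def __init__(self, scopes: set[str]):
            assert scopes <= ALLOWED_SCOPES, "scope not grantable"
            self.otp = f"{secrets.randbelow(10**6):06d}"  # shown to the passenger
            self.scopes = scopes
            self.expires_at = time.monotonic() + GRANT_TTL_S

        def authorize(self, presented_otp: str, action: str) -> bool:
            # The agent must present the code the passenger read out, and the
            # requested action must fall inside the granted scope and TTL.
            return (secrets.compare_digest(presented_otp, self.otp)
                    and action in self.scopes
                    and time.monotonic() < self.expires_at)

The point of the design is that the passenger, not the fleet operator, opens the window of remote control, the permission is narrow, and the window closes by itself.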
> ride in a car
It's riding in a robot with no reliable safety fencing. This is the mayhem these companies are subjecting the general public to.
What's mayhem is humans driving. This is downright calm by comparison. Inconvenient, sure, but it's not Mad Max, he's just slowly circling.
He didn't even miss his flight.
This makes sense. If one thing is bad, another somewhat similar thing is good. Being trapped in a moving vehicle without a driver is not bad because being in a vehicle with a person is bad. Similarly getting hit with a brick is not bad because getting hit with a shovel is bad.
Yeah totes fake drama.
Like 90% of our news. Even the weather is like all drama "atmospheric cyclone tsunami" and ends up as light rain with some flooding, like last year.
Maybe for you—we were devastated by two hurricanes last year
I survived multiple "atmospheric rivers" unscathed.
And a super typhoon on vacation.
they do have that capability but they don't give level one customer support access to it.
> Could it be that he'd rather keep filming for internet notoriety than stop filming and actually solve his problem?
The guy kept filming for under sixteen seconds after asking about and receiving clarification that she is unable to intervene with the car and needs him to use the app. I like the idea that this somehow indicates bad faith or ineptitude in your estimation.
I've also never missed a flight by 16 seconds, so it is interesting that this chap is worried about missing his flight because of a brief delay?
Once something wrong was identified (and it should have identified it independently), the passenger should not have to intervene at all.
Passenger:1 Google: 0
That clearly seems to be the case, but that doesn't mean the video wasn't worth making. This situation would terrify a lot of people.
It's worth making, but there's some serious drama queen vibes that make it feel pretty overblown. If an uber is late to the airport a reasonable person doesn't threaten the driver with covering the cost of their flight.
If an Uber driver caused you to miss a flight by driving around a parking lot in circles at a speed at which you can't exit the vehicle, you don't think it would be a reasonable request for the customer to ask Uber to make it right?
Fair enough, there is a difference. But now we are not looking at a missed flight so much as attempted kidnapping or imprisonment or some other much more serious crime. Which is interesting to think about with the Waymo example, but hard to take seriously in the context of the video since the rider declines to do what the customer service rep asks them to do (at least appears to for the sake of producing additional outrage for their video)
Reasonable to make a request. Also reasonable for it to be denied.
it seems important to note that they didn't miss their flight.
I might if my uber driver was doing donuts in the parking lot lmao
I.e. the user is always at fault, the engineering manifesto.
Indeed.
Let's see how that works out when we're doing 65mph in an autonomous drone car with no steering wheel.
It's interesting how quickly we've gone from discussing an interesting failure mode of autonomous robots that travel in public spaces to calling it PEBKAC.
This is unironically a great signal that society is willing to accept self-driving cars. Even computer security isn't this good at playing blame the user.
"Society" didn't blame the user, I think "society" would have no problem blaming the car. Hacker News isn't representative of society at large.
The signal you're seeing is the tendency of tech people to consider any risk to human safety to be less important than the benefits of the technology itself, and to always blame the human and never the tech.
You also see this manifest in any conversation about the failure modes of AI, in the inevitable knee-jerk response of "humans also do x."
Yes, that could be. And you could focus on being annoyed at this guy's social media behavior, if you like. However, it doesn't mitigate, and is less important than, the problem of the car getting into this state and Waymo not having control over it.
"…the operator asks him to do something in the waymo app he doesn't want to do it."
Why are you trying to offer an excuse for the inexcusable?
The app should have nothing to do with it. Where are the emergency stop and exit controls for the passenger? He should be able to exit the vehicle at any time.
I'm fed up with this bad tech and the fact that governments let Big Tech act irresponsibly and get away with this shit. Why aren't there regulations in place before this tech is allowed loose on an unsuspecting public?
If CEOs of tech companies were held directly responsible with the threat of jail time it'd stop almost instantly.
Jail time?
This poor man was forced to slowly circle around a parking lot eight times! It took over five minutes before they fixed the problem! He almost missed his flight! Outrage, outrage.
This is clearly worthy of the electric chair.
Reckon that's a bridge too far even for those miserable reprobates. Besides, it'd only take a couple of them being locked away to scare the shit out of the rest. There's nothing like making an example of a few to bring about better behavior.
The real problem these days is that the culprits who perpetrate this shit are able to hide behind corporate walls—at worst the company gets fined (albeit rarely) but those who are actually responsible escape both freely and anonymously. Laws need to be changed to make employees directly accountable for their actions and to take the consequences.
Unfortunately, introducing such laws is somewhat complicated and would meet with huge resistance for many reasons too involved to list here. Nevertheless, there's one I should mention: sometimes unavoidable mistakes occur (or important facts remain unknown) even after due diligence and rigorous testing. Laws should not hold individuals responsible for force majeure† (so-called acts of God) that they have no control over. Any change in the law would have to allow for this.
That said, we know damned well that that provision/exception is totally irrelevant in most of these cases and customer-unfriendly fuck-ups; clearly they're as guilty as hell.
I'd advocate another change in the law in that shareholders need to be identifiable. We need a way of publicly embarrassing and shaming shareholders when they invest in 'carpetbagger' companies. To avoid shame investors who invested in good faith would be left with little option than to divest themselves of their shares. In effect, shareholders have to share the responsibility for a company's bad behavior and be seen doing so.
Mind you, I can't see this happening anytime soon, as many shareholders are just too greedy to allow laws like this to come into effect. Unfortunately, the political will just isn't there.
_
† A classic example of where hubris, cocksureness, cost cutting and insufficient engineering rigor — AND force majeure led to a disaster was the collapse of the Tacoma Narrows Bridge. If engineering rigor had been followed to the letter and penny-pinching avoided then it's likely the bridge wouldn't have collapsed despite force majeure entering the scene (the physics of Kármán vortex street turbulence was not well understood at the time).
Is "he's just doing it for clout" really passing for a thoughtful response on this topic, on this site?
Yeah, I got a negative impression of the guy. It looped around the parking lot for five minutes. I'm not a fan of self-driving cars, but this doesn't seem like a huge deal.
It's called "drifting donuts" and that's a feature, not a bug.
At least he didn't get shipped to a billionaire's private island
Quality reference ;)
Welcome to AI, people. It will only get worse from here. Enjoy the ride!
Can we please get some clarification on "trapped" from someone who's used a Waymo? Surely the driverless car hasn't locked him inside. Right?
Surely every driverless vehicle ought to have a button that when pressed gracefully stops the car so the passengers can safely exit.
> Surely the driverless car hasn't locked him inside. Right?
Every (modern) car I’ve been in, driverless or not, will lock the doors once the vehicle is in motion - automakers are not in the habit of letting passengers fall into the road like that. Just imagine the lawsuits…
It's been my (limited) experience that there's a manual override that can still be unlocked. Can't I slide the lock lever to the unlocked position while the car is in motion?
I'd also submit that in any case where I'd open the door while the vehicle is in motion, I'd have a damned good reason for wanting to do so[0] and would not take kindly to being thwarted.
[0] Usually: backing the plow truck up to a small, flat utility trailer that's invisible through the rear window.
Child safety locks that prevent people from opening locked doors from the inside are standard on four door vehicles for the rear doors, but they can usually be disabled.
My two-door had them disabled by default and it's not uncommon for taxis and rideshares to disable them.
It seems insane not to disable them in a Waymo - you wouldn't be able to escape in an emergency.
I think it's usually locked from the outside. Opening the door from the inside (in most cars I've been in) works.
I would actually be surprised by the opposite (not being able to leave a vehicle because the doors are locked). This sounds very scary in an emergency situation where you have to leave the vehicle.
The locks on my car don’t prevent me from opening the front doors while driving. Only the rear doors have a child safety switch, which would stop the doors from opening when locked. But you have to turn that on.
You don’t want a locked door preventing you from exiting the car if you have a crash.
>Surely the driverless car hasn't locked him inside.
Yeah, why didn't he just jump out of a moving car, right?
This is wild! Wow! I used to think highly of Waymo but this could be the worst possible way to handle a situation where a 2-ton object went awry.
She should have stopped the car immediately after she became aware of the situation, within the first few seconds of the call. She kept following this dumb scripted conversation as if it was someone calling support because their router won't turn on. What an absolute shit display of incompetence and recklessness.
Thank you for calling Waymo, have you tried turning it off and on again?
Obligatory sci-fi reference: "Road Stop" by David Mason.
Yeah, this happened to me at home. I have this lock on the door and I couldn’t get in because it was locked. I called support and they said I needed to use the key on the lock but that SHOULDN’T BE NECESSARY. Completely unsafe that the door wouldn’t unlock. Shelter is a human right and I was trapped outside of my house just because I wouldn’t use some “device” that so-called “support” was trying to get me to use.
Same situation as this guy and the “End Ride” button. It’s actually horrifying.
Beneath the satire, you're asserting that the primary purpose of locks on cars is to prevent adult passengers from exiting? That's not right at all.
Might as well design a desktop environment that forces you to enter your password in order to log off.
Ah you’re right. I should’ve used The Adage of The Man With The Door Handle. Unusual story about a guy who was trapped in his home because he had to press down on a device to disengage a lock. Perhaps President Biden will issue an order against these terrible things.
Are door handles evil? Find out this weekend on Hacker News as it explores the concept of clicking “End Trip” on an app dispatched taxi ride that was embarked on by app.