Waymo cars ignored stopped school buses in Atlanta. What happens now?

ajc.com

25 points by themaninthedark 17 days ago · 36 comments

m-s-y 17 days ago

A high school classmate of mine (many many years ago) was unexpectedly and brazenly pulled out of a school-wide assembly by local police one morning.

It was the talk of the school. Rumors spread like wildfire. Consensus was that whatever she did, it must have been terrible.

She had driven past a stopped school bus.

If this reaction is acceptable when a person does it, a $1 fine for a company is a slap in the face to law-abiding citizens.

  • zug_zug 17 days ago

    I mean, my immediate reaction is that what happened to your classmate probably wasn't reasonable. One wrong doesn't justify another...

    • Atomic_Torrfisk 17 days ago

      So Waymo should go relatively unpunished? Sure, the laws might be draconian, but at least apply them evenly, or change them for everyone.

      • SpicyLemonZest 17 days ago

        I don't think punishments should be decided relative to social media anecdotes. If there's some area of the country where local police routinely show up at assemblies or other gatherings and arrest people for driving past school buses, I support reforming their laws; in my local jurisdiction it's a traffic violation, and police don't do that.

      • LorenPechtel 17 days ago

        Edge case that Waymo missed. They'll fix it. Their track record is good enough that I have no problem with not punishing them.

    • moomoo11 17 days ago

      I think it’s fine. She could have killed someone.

NullHypothesist 17 days ago

A few software engineers work a weekend to fix the issue, and it never happens again?

  • belter 17 days ago

    Do such software engineers reflect on the child they could have killed, or are the stock options too sweet?

    • justonceokay 17 days ago

      I’m ideologically aligned with you, but that isn’t an argument in good faith. We let convicted murderers buy cars.

      Your quip about stock options is actually funny, because if the engineers were killing people then those stock options shouldn’t be worth so much.

      • belter 14 days ago

        Looking at the massive downvoting of my comment and all the subsequent replies, lots of people here missed their MIT class on Ethics in Software Engineering...

        In safety-critical engineering, ethics are not opinions but process guarantees. Waymo's system is ML-dominant, non-deterministic, and validated statistically, without a public end-to-end safety case, formal failure bounds, or a provably safe fallback under unknown conditions.

        Shipping anyway is not a technical necessity but a choice to externalize unbounded risk onto non-consenting bystanders. Comparing that to bad human drivers or stock prices misses the point, which is about what risks you knowingly impose.

        Looks like the Waymo Software team could apply at Boeing. I hear they are hiring....

    • Workaccount2 17 days ago

      I know this might be a hot take but:

      I'd bet all my money, and all the money I could borrow, that a Waymo would stop or swerve for a child running out before the sensory nerves in a human's eye had even reacted to that child. I just think it's not as egregious a violation when committed by something with a 0.1 ms response time. Still a violation, still shouldn't happen, but the worst-case outcome would be much, much harder to realize than with a human driver.

      Also, just to add: the fact that there aren't cases of this from Phoenix or SF seems to signal it's a dumb bug in the "Atlanta" build.

      • nobodyandproud 17 days ago

        You’re giving a technical answer to a question that’s actually about the economic and policy incentives.

        Yes, electronic sensors can enable the car to react more quickly: But react how?

        A buggy or unexpected reaction will just lead to equal or faster tragedy.

        Individual drivers are incentivized to keep their behavior in check (or be taken off the road). What legal incentives are there when a faceless company is involved and creates one or two drivers “at scale”?

      • belter 14 days ago

        If your safety argument is a bet, you already failed the ethics test.

      • klooney 17 days ago

        Does SF have school buses?

    • alpha-male-swe 17 days ago

      wow nice virtue signal. and your point is???

ChrisArchitect 17 days ago

Related:

Authorities investigating Waymo over failure to stop for school buses

https://news.ycombinator.com/item?id=46169695

semiquaver 17 days ago

Funny how the words are all backwards on archive.is: https://archive.ph/3BvNR

  • jostylr 17 days ago

    It is even a bit more scrambled, such as this part: hcihw selcihev selcihev ot ot ot ot erauqs strap. Looking at the original site, that text sits in various nested structures, with the paragraphs containing that kind of text. There are multiple article blocks with an .is-paywalled class governing various behaviors, such as showing ads, and the scrambled text is in paragraphs within the separate article portions. Presumably they have a script that decodes it for those who log in, though I do not understand why they even provide the text. Why not just return it after login? Maybe it is total trash text, just there to pad things out like lorem ipsum. Kind of interesting.
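
    For what it's worth, the quoted sample looks like each word has simply been reversed in place. A minimal sketch of undoing that (this is a guess at the obfuscation, not archive.ph's actual scheme):

        # Guess at the obfuscation: each whitespace-separated word reversed in place.
        def unscramble(text: str) -> str:
            return " ".join(word[::-1] for word in text.split())

        print(unscramble("hcihw selcihev selcihev ot ot ot ot erauqs strap"))
        # -> "which vehicles vehicles to to to to square parts"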

  • m463 17 days ago

    I saw that too; I have an adblocker.

    Whatever, close the page.

  • mikestew 17 days ago

    The words are backward if you go to the original page and turn on reader mode to get around the paywall. “Ha ha!”, they say, “your reader mode powers are no good here!”

    So archive.ph is presumably just picking that up.

functionmouse 17 days ago

Fines start at a dollar and double for each repeat occurrence.

If it gets to the point where the fine is prohibitively expensive, then the system should in fact be prohibited.
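
As a rough illustration of how quickly that escalates (a sketch assuming the fine starts at $1 and simply doubles with each repeat, per the comment above):

    # Hypothetical schedule: $1 fine that doubles with each repeat occurrence.
    for n in (1, 5, 10, 20, 30):
        fine = 2 ** (n - 1)
        print(f"occurrence #{n}: ${fine:,}")
    # occurrence #1: $1
    # occurrence #5: $16
    # occurrence #10: $512
    # occurrence #20: $524,288
    # occurrence #30: $536,870,912

Under that schedule, the roughly 30 documented instances mentioned elsewhere in the thread would already put a single fine in the hundreds of millions of dollars.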

seg_lol 17 days ago

One week of jailtime for everyone involved.

When I was twelve, a 10-year-old kid from the next town over was hit and killed, his body thrown over 100 feet, when someone sped around a stopped bus with its flashers on.

  • java-man 17 days ago

    No: give the jail time to the CEO and all the managers who approved the process.

    In addition to that, fine the company. Calculate the fine as the usual penalty multiplied by the number of vehicles on the road, and suddenly companies will begin focusing on safety.

prepend 17 days ago

Would AI be better at stopping for children jumping out from behind a stopped school bus, so that stopping isn't as necessary as it is with human drivers?

That being said, just ticket the company and make them pay. Isn’t this how it works with all moving violations? Does Waymo get pulled over for speeding?

  • vablings 17 days ago

    The first point is exactly my thought. Self-driving cars are completely different from human drivers. We should not hold them to the same standards while simultaneously holding them to much higher standards. Many driving violations are laws only because the violation could lead to an unsafe scenario that is purely the fault of the driver.

    E.g. stop signs: the only reason a full stop is required is to ensure that drivers take a clear look and give way to traffic at the other stop signs. If there is no other traffic and no other drivers to give way to, why should a self-driving car come to a full stop?

    • drob518 17 days ago

      You’re probably right in the long term. So, when the world is 100% self-driving cars, we can probably change the rules to favor the machines. In the near-term, however, it’s probably good to make the robots obey the human laws so that the humans don’t start getting the idea that they can disobey them, too.

    • Atomic_Torrfisk 17 days ago

      The laws of physics still apply. The car still takes time to slow down, even with perfect reaction times. Well, maybe you could get it to stop in time, but it might break the necks of everyone in the car.

      • vablings 13 days ago

        At 30 miles per hour, the majority of a human's stopping distance is reaction time. Self-driving cars have maybe a tenth or a twentieth of that reaction time in the case of an imminent collision. I also don't know about you, but my car can stop in a fraction of pretty much all of the commonly stated distances.

        • Atomic_Torrfisk 11 days ago

          At 30 mph, stopping is about 50/50 perception time and braking time, for a total of 3-4 s. Self-driving cars would be an improvement for sure, with maybe a 2-second emergency stop at worst, but not quick enough as far as I understand (rough numbers sketched below). Even if that were enough, I would not appreciate my cab emergency-braking randomly because a kid steps out in front of a bus. It's best to slowly stop, then slowly accelerate. Maybe the optimal solution is to creep past the bus?
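
          A rough back-of-the-envelope comparison, assuming 30 mph, 7 m/s² of braking deceleration, 1.5 s of human perception/reaction time, and 0.5 s for an automated system (all illustrative assumptions, not measured Waymo figures):

            # Illustrative stopping-distance comparison; all numbers are assumptions.
            V = 30 * 0.44704   # 30 mph in m/s (~13.4 m/s)
            DECEL = 7.0        # assumed braking deceleration in m/s^2

            def stopping_distance(reaction_s: float) -> float:
                """Distance covered during the reaction time plus the braking distance."""
                return V * reaction_s + V ** 2 / (2 * DECEL)

            print(f"human (1.5 s reaction): {stopping_distance(1.5):.1f} m")  # ~33.0 m
            print(f"robot (0.5 s reaction): {stopping_distance(0.5):.1f} m")  # ~19.6 m

          Under these assumptions the faster reaction cuts the total stopping distance by roughly 40%, but the braking portion (~13 m) is identical for both, which is the physics point being made here.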

  • Atomic_Torrfisk 17 days ago

    Given how hidden children are when walking in front of the bus, if the AI instantly applied the brakes upon seeing the child, would the car slow down in time? Probably not. Better, yes; good enough, no.

  • simulator5g 17 days ago

    Their license would be suspended if this were any of us. There have been at least 30 documented instances of this behavior.

Bender 17 days ago

Or the Waymo that drove into an active crime scene, with loads of cop cars and guns drawn? [1] Cops were yelling at it to get away, and instead the Waymo pulled over closer to the crime scene, causing the passengers to panic.

[1] - https://www.youtube.com/watch?v=p2XoMKwZE3o [video][1m42s]
