"Liability sponge": When algorithms mess up, the nearest human gets the blame

technologyreview.com

2 points by bnabholz 7 years ago · 3 comments

AnimalMuppet 7 years ago

Do we want to let algorithms be the liability sponge? Do we want humans to say "It's not my fault, blame the algorithm (that I wrote)"?

The algorithms aren't responsible; humans are (because they wrote the algorithms).

  • bnabholzOP 7 years ago

    My apologies - when posting I made an attempt to shorten the title to fit HN restrictions; I've revised it now to be a bit clearer. The implication is that whichever human is closest to the accident is at fault. There is some sense in saying "the human in the car is ultimately responsible," but if that is the case, I think it will kill self-driving cars. Why would you accept the blame for something you didn't "do"?

    In my very humble opinion... I think the classic "corporate IP" laws should extend here. If you write software and that software belongs to your company, they should be responsible for the liability of said software. Ownership and consequences should be associated, otherwise you've misplaced the incentive for people/organizations to do the right thing.

    It's hard to see a positive outcome. Blaming the driver will make people not want self-driving tech. Blaming the dev will just make them pick a less risky line of work. Blaming the company is probably the most fair, but then companies will be less likely to develop it.

    • AnimalMuppet 7 years ago

      Well... if the algorithm is certified as being able to do X autonomously, and it fails at doing X, then whoever certified it should be liable (presumably a corporation). If it's supposed to be monitored, then whoever was supposed to be monitoring it should be liable (but see below).

      So in the case of a "self driving car", if it's claimed to be truly autonomous, if the ads say you can take a nap and the car will wake you at your destination, and the car crashes, then the company that made those claims is liable. If the claim is that the car is supposed to help the human but the human still has to drive, then the human had better be paying attention enough to drive. (And in cases where the company doesn't certify that it's truly a self-driving car, but the advertisements heavily imply that, you've got some nice court cases about who's liable.)

      Here's the "but see below" part. The Boeing 737 Max crashes are... I don't quite know what to do with them. The pilot is responsible for flying the plane, no matter what. And yet, the planes had a certification of airworthiness (not certified by the pilot). I am not trying to pin all the blame for those crashes on the pilots.
