Man raped in jail after AI technology wrongfully identifies him in robbery

star-telegram.com

31 points by landonxjames 2 years ago · 24 comments

neilv 2 years ago

Unless EssilorLuxottica or Macy's developed all the "AI", "facial recognition", "biometric information", etc. tech in-house, are there missing company names here?

There's also this, which I'm not sure how to interpret:

> “So, when EssilorLuxottica and Macy’s compared unclear security footage to (the man’s) mugshots from the 1980s, these companies knew that there was an error rate of almost 90%. Yet these companies told (Houston Police) with absolute certainty that they identified the person who robbed the Sunglass Hut,” the lawsuit said.

  • romwell 2 years ago

    Yup, I'd love to know the name of that wonderful software and the company that made it.

    Also I'd love to know the names of the judge, prosecutor, and investigator who put a man in jail simply because a man from the store told them to.

    • bb88 2 years ago

      There's not really any need.

      The judge did it because the prosecution said he did.

      The prosecution brought charges because the police said it was him.

      The police arrested him because the corporation said it was him.

      The corporation said it was him because the software said it was him.

      The software said it was him because it was SHIT.

      • benwad 2 years ago

        The whole point of those people's jobs is to do due diligence. If they're blindly trusting the next person/machine down the line then they're abdicating their responsibility.

      • romwell 2 years ago

        >There's not really any need.

        No, there is a need.

        We know what happened and why.

        We also need to know who all these people were, because all of them failed at their jobs.

        The last person in the chain using a shitty tool doesn't excuse anyone else. The store employee could've consulted astrological charts FFS; the use of astrology wouldn't be the problem in this scenario.

        • neilv 2 years ago

          People must be seeing different reporting than I am.

          Because otherwise it looks like some people are reacting to a headline and a vague statement (possibly morphed through the telephone game of reporting) about one piece of how the outcome happened, and are ready to extrapolate from that and be judge, jury, and executioner.

          Which would seem ironic: making much the same error themselves that they believe they're correcting in someone else.

      • true_religion 2 years ago

        I feel like the software did its job.

        It reported a 90% chance that this was not the suspect when asked to compare photos. Yet the company decided 10% possibility of being correct was good enough for them to report to the police that it’s definitely this guy.

      • neilv 2 years ago

        How do you know all these things, about this specific case?

    • deadbolt 2 years ago

      Can we learn that with an FOIA request?

    • neilv 2 years ago

      Journalists might look into exactly what happened on that side of things, but this particular article doesn't seem to have done that, and some assertion in there might be incorrect or misleading.

      Best not to feed Internet pitchfork villagers, who have shown countless times that they're collectively dumb as snot, with no sense of process, critical thinking, or decency.

      I'd think the identities of public servants aren't really relevant at this point, but rather, what was the evidence and chronology.

      As techies, we're best suited to tackle some of the evidence around tech, like what was the tech, how does it work, how was it represented, how was it used, what did it do, etc.

      • deadbolt 2 years ago

        > As techies, we're best suited to tackle some of the evidence around tech, like what was the tech, how does it work, how was it represented, how was it used, what did it do, etc.

        Go ahead then...

a_bonobo 2 years ago

How many more cases will there be where data and AI people messed up? This is on us as a professional community of people who build the future. Is this the future we want? We seem to be actively working towards it.

  • krapp 2 years ago

    As many as necessary. Late stage capitalism has entered its paperclip maximizer phase - AI will pervade everything, everywhere all the time and we will simply learn (or more likely be trained) to accept the failures as the necessary price we must pay for efficient markets.

    Is this the future we want? For many people, the answer is a clear no. But like the Luddites of generations before (who were right about everything) we are not the ones who get to decide what the future will be. Maybe there's still time to go be a potato farmer in Idaho or something.

Buttons840 2 years ago

> “He was followed into the bathroom by three violent criminals. He was beaten, forced on the ground, and brutally gang raped. After this violent attack, one of the criminals held a shank against his neck and told him that if he reported the rape to anyone, he would be murdered,” the lawsuit said.

Every square inch of the prison should be covered by cameras. Of all the new powers and fancy equipment law enforcement seeks, this should come first.

Yes, put cameras in the bathroom. It's stupid that we wring our hands over bathroom privacy in jails while rapes are so common. "You'll get raped, but at least you'll have privacy". Stupid.

ChrisArchitect 2 years ago

[dupe]

More discussion last week: https://news.ycombinator.com/item?id=39118534

deadbolt 2 years ago

And not a single person will face any consequences.
