Facewatch, a big biometric security company in the UK, is in hot water after its facial recognition system caused a major snafu: it wrongly identified a 19-year-old woman as a shoplifter.

  • Hacksaw@lemmy.ca · 2 months ago

    Stores in most developed countries, the UK included, can refuse service only for legitimate reasons, and they have to do so uniformly, based on fair and unbiased rules. If they don’t, they’re at risk of an unlawful discrimination suit.

    https://www.milnerslaw.co.uk/can-i-choose-my-customers-the-right-to-refuse-service-in-uk-law

    She didn’t do anything that would count as a “legitimate reason”, and even if the rule is applied uniformly, it’s difficult to prove that an AI model doesn’t discriminate against protected groups, especially with so many studies showing the opposite.

    I think she has as much standing as anyone to sue for discrimination. There was no legitimate reason to refuse service, and AI models famously discriminate against women and minorities, especially when it comes to “lower class” criminal behavior like shoplifting.

    • SSTF@lemmy.world · 2 months ago

      I am waiting to follow the case for updates, because while I hope the outcome pushes back on AI systems like this, I am skeptical that current laws will treat what happened as protected-class discrimination. I presume that in the UK the burden of proving fault in the AI lies on the plaintiff, which is at the heart of whether the reason counts as legitimate in the eyes of the law.

    • المنطقة عكف عفريت@lemmy.world · 2 months ago

      It’s not only that it discriminated against certain groups, but also that the system in itself has a high enough error rate to make it unusable for any decision making. MAYBE for selecting people for screening, but then we would be falling further down into a dystopian future.

      At the current performance, none of these AIs should be involved in anything this critical.
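
      To put rough numbers on that error-rate point: when actual shoplifters are a tiny fraction of a store’s visitors, even a seemingly accurate system produces mostly false accusations. A minimal sketch of the base-rate arithmetic, where every figure is a made-up assumption rather than Facewatch’s real performance data:

      ```python
      # Base-rate sketch: every number here is an illustrative assumption,
      # not Facewatch's actual performance data.
      visitors = 10_000           # assumed daily footfall in a monitored store
      shoplifter_rate = 0.001     # assume 1 in 1,000 visitors is a known shoplifter
      hit_rate = 0.99             # assumed chance a real shoplifter gets flagged
      false_alarm_rate = 0.01     # assumed chance an innocent visitor gets flagged

      shoplifters = visitors * shoplifter_rate      # 10 people
      innocents = visitors - shoplifters            # 9,990 people

      true_flags = shoplifters * hit_rate           # ~9.9 correct flags per day
      false_flags = innocents * false_alarm_rate    # ~99.9 false flags per day

      # Chance that a flagged person is actually a shoplifter:
      precision = true_flags / (true_flags + false_flags)
      print(f"{precision:.0%} of flags are correct")  # roughly 9%
      ```

      Even granting a generous 1% false-alarm rate under these assumptions, roughly nine out of ten people the system flags would be innocent.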

    • phx@lemmy.ca · 2 months ago

      I’d also wonder about a case against the company supplying the facial rec software/DB. Essentially they are fingering her as a criminal in a way that has a notable impact, which seems like a potential case for slander/libel to me (not sure what a “bad db entry” would fall under in this case).

      • chiliedogg@lemmy.world · 2 months ago

        She needs to apply for jobs at these companies that use the software in order to generate damages she can sue over.