• lemillionsocks@beehaw.org · 1 year ago

        We’ve already got numerous examples of how these AI and face-recognition models tend to have biases, or are fed data that accidentally has a racial bias. It’s not a stretch of the imagination to see how this can go wrong.

        • Scrubbles@poptalk.scrubbles.tech · 1 year ago

          Yep, the age-old “garbage in, garbage out”. If we had a perfect track record we could just feed in all the cop data, but we know for a fact that the poor and PoC are stopped more often than others. Send that into an AI and it will learn those same biases.
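          A back-of-the-envelope sketch of that effect (made-up numbers, not real stop data): even if two groups offend at exactly the same rate, the group that gets stopped more often racks up more records, and anything trained on those records “learns” the policing bias as if it were behavior.

          ```python
          # Hypothetical numbers: the true offense rate is identical for
          # both groups; only the stop rate differs.
          true_offense_rate = 0.05
          stop_rate = {"A": 0.10, "B": 0.40}     # group B is stopped 4x as often
          population = {"A": 10_000, "B": 10_000}

          # Recorded offenses scale with how often people are stopped,
          # not with how often they actually offend.
          recorded = {g: population[g] * stop_rate[g] * true_offense_rate
                      for g in population}

          for g in recorded:
              print(f"group {g}: {recorded[g] / population[g]:.1%} recorded offense rate")
          # group A: 0.5% recorded offense rate
          # group B: 2.0% recorded offense rate
          # A model trained on these records "learns" that B offends 4x as
          # often -- it has learned the stop-rate bias, not real behavior.
          ```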

  • money_loo@1337lemmy.com · 1 year ago

    Oh no, the A.I. identified someone as a drug trafficker, and the police pulled that person over on suspicion of being a drug trafficker, and found out that he was indeed a drug trafficker, and now he’s upset he got caught by a robot dragnet.

    I don’t think drugs should be criminalized, but are we supposed to be upset that A.I. is going to finally help parse data and solve crimes?

    • webghost0101@sopuli.xyz · edited · 1 year ago

      This time they were right, because it was indeed a drug dealer, but just look at what it took to get this data:

      “in this case it was used to examine the driving patterns of anyone passing one of Westchester County’s 480 cameras over a two-year period.”

      “the AI determined that Zayas’ car was on a journey typical of a drug trafficker. According to a Department of Justice prosecutor filing, it made nine trips from Massachusetts to different parts of New York between October 2020 and August 2021 following routes known to be used by narcotics pushers and for conspicuously short stays.”

      So apparently making long trips with short stays is now enough proof to be searched by police. And if they can extrapolate that into “this guy’s a dealer”, how much other data and possible extrapolations got caught in the crossfire of all those cameras? How long till someone in power decides selling some of that info to corporations is a good way to line state/government/their own pockets?

      Maybe we should place cameras in everyone’s house and listening bugs in every single phone? Criminality solved? Or, hear me out: the real criminals will adapt and find new and novel ways, while the common citizen is kept in line with fines for even the smallest offense.

      Then the police state will want to escalate the tools again, with even more suppressive technology. Good thing we’re spending so many resources on continuing to bully normal citizens into generating cash flow from fines. Money and resources well spent?

      Or maybe the world needs some actually intelligent people who can find the root causes of criminal behavior and restructure society to improve well-being and opportunities, so people want to belong and maintain the system rather than feeling it’s rigged against them and they have to cheat to survive.

        • lobelia581@lemmy.dbzer0.com · 1 year ago

          There’s a difference between using AI as a tool and using it as a solution. Knowing how this society works, though, it’ll start off as a tool like it is now, and soon enough the higher-ups will wonder why humans are even necessary in the process, especially when they need to be paid. Against everyone else’s objections they’ll get rid of the human-verification step and use only AI, and when things go wrong the people in charge will go “who could have seen it coming?”

      • money_loo@1337lemmy.com · 1 year ago

        “McDonald’s and White Castle have already begun using ALPR to tailor drive-through experiences, detecting returning customers and using past orders to guide them through the ordering process.”

        Yeah you’re right, helping people order lunch is literally 1984.

        • Programmer Belch@lemmy.dbzer0.com · 1 year ago

          Uh oh, the things you are buying look pretty suspicious, we are going to wiretap your house to make sure you are not doing any no-nos.

          I really dislike people using technology to analyze my habits; I prefer just stumbling onto things I like because they were around the places I usually look. And yes, I also don’t like algorithmic content: it just makes people try to appease the algorithm, meaning less effort goes into the thing they actually do.