AI hiring tools may be filtering out the best job applicants

As firms increasingly rely on artificial intelligence-driven hiring platforms, many highly qualified candidates are finding themselves on the cutting room floor.

  • FuglyDuck@lemmy.world · 5 months ago

    Almost certainly true.

    But how does it compare to HR hiring staff doing the same?

    There are those guys who take the top x applicants off the stack and discard the rest because “they’re unlucky.”

    Now I can hear the screams already: “AI is programmed with bias!” As if arbitrary bullshit were somehow less objectionable.

    The reality is that the AI is programmed by the HR staff and inherits their biases anyhow; but at least an AI can be made to ignore race on the face of it, where humans who are racist can’t.
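    A minimal sketch of what “ignoring race on the face of it” could look like, assuming a pandas/scikit-learn style pipeline; the column names (race, name, zip_code, hired) are hypothetical:

    ```python
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Columns the screener is never allowed to see (hypothetical names):
    PROTECTED = ["race", "name", "zip_code"]

    def fit_screening_model(applicants: pd.DataFrame) -> LogisticRegression:
        """Train a résumé screener on everything except the protected columns.

        Assumes the remaining columns are already numeric features.
        """
        features = applicants.drop(columns=PROTECTED + ["hired"])
        labels = applicants["hired"]
        return LogisticRegression(max_iter=1000).fit(features, labels)
    ```

    Dropping the columns only removes bias “on the face of it”, though; correlated features can leak it right back in.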

    • Tja@programming.dev · 5 months ago

      > AI is programmed by the HR staff

      Yes, those famous elite code wizards, the guys and gals in HR.

      • bl4ckblooc@lemmy.world · 5 months ago

        If firms are the ones using AI, then it isn’t even HR wizards. It’s the geniuses who manage to pick out the right candidate for a job even though they know nothing about the position, company, location, candidates, or workload.

        They’re already throwing out your best candidates because they don’t know what they’re doing, but their firm costs less than a couple of HR people, so the shareholders still think it’s a good idea.

      • cynar@lemmy.world · 5 months ago

        While they are not doing the coding, they are often providing the training data. This is what screws the process over. Garbage in, garbage out.

    • pine@lemmy.blahaj.zone · 5 months ago

      Nah. They just inherit the racial biases HR has, through correlation: associations based on zip code or name. But this time it’s an inscrutable black box being racist.
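      A toy demonstration of that proxy effect (entirely synthetic data, assuming pandas/scikit-learn): the model never sees race, but reconstructs the bias from zip code anyway.

      ```python
      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 10_000
      race = rng.integers(0, 2, n)                   # group label; never shown to the model
      zip_code = 0.8 * race + rng.normal(0, 0.3, n)  # zip code strongly tracks race
      skill = rng.normal(0, 1, n)
      # Historical hiring was biased against group 1, independent of skill:
      hired = ((skill - 0.7 * race + rng.normal(0, 0.5, n)) > 0).astype(int)

      X = pd.DataFrame({"zip_code": zip_code, "skill": skill})  # no race column
      pred = LogisticRegression().fit(X, hired).predict(X)

      for g in (0, 1):
          print(f"group {g}: predicted hire rate = {pred[race == g].mean():.2f}")
      # The model recovers the historical bias through zip_code even though
      # it never saw race: garbage in, garbage out.
      ```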

      • Deello@lemm.ee · 5 months ago

        Here are some dark-timeline thoughts: companies are people, patents and copyrights can only be granted to humans, AI hallucinates, and companies are responsible for the actions of their AI.

        Does this mean we will eventually have to compete with commercial AI personas that also have a verifiable work history, complete with an itemized list of financial-liability incidents? HR would then have to decide whether it wants a human employee or an AI employee.

      • FuglyDuck@lemmy.world · 5 months ago

        It isn’t necessarily inscrutable.

        It almost certainly will be. But for many organizations, the entire reason to use one is to try to remove that bias, so the model won’t have the applicant’s name, and maybe not their zip code, as a considered factor.

        This will depend very much on the organization in question, which gets to my point: there’s some high-level bullshit that many applicants never get through, just because.

        Race. Zip code. Name. They just don’t have to be used.

        Or, more complicated: the college where they got their degrees. If an applicant graduated from an HBCU, it’s not a bad bet that they’re… Black. Or at least “woke”. On the other hand, someone who graduated from some deeply conservative college is almost certainly a conservative.

        A human is going to make those connections regardless. An AI can be made to not even know there is a connection.

    • Deestan@lemmy.world · 5 months ago

      Yeah, in reality AI isn’t competing with perfection.

      Though I do think it will do worse than run-of-the-mill HR, as HR people are almost certainly incapable of competently training an AI.

    • afraid_of_zombies@lemmy.world · 5 months ago

      > There are those guys who take the top x applicants off the stack and discard the rest because “they’re unlucky.”

      Fuck, my blood boils whenever I think about that incident.

  • Stamets@lemmy.world · 5 months ago

    There was a story not long ago about an AI hiring bot that was dismissing every candidate of color. Turns out that if you build these things off of inherently racist hiring practices, the AI learns to become stupidly racist. This was a surprise to some people. Same thing with motion-sensor sinks: way too many don’t work for people with darker skin because they were only ever tested on white people.

    It’s almost like, if you build your systems off of extremely narrow data and profiles, the system will keep those biases. It isn’t a person. There’s no reasoning or empathy or stepping outside the lines yet. It’s going to do what you tell it to, for better or for worse.
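    For what it’s worth, there is a standard check for exactly this failure mode: the “four-fifths rule” from US employment-discrimination guidance, which compares selection rates across groups. A minimal sketch, with hypothetical column and variable names:

    ```python
    import pandas as pd

    def selection_rate_ratio(df: pd.DataFrame, group_col: str, selected_col: str) -> float:
        """Lowest group selection rate divided by the highest."""
        rates = df.groupby(group_col)[selected_col].mean()
        return rates.min() / rates.max()

    # Conventionally, a ratio below 0.8 is a red flag for disparate impact:
    # selection_rate_ratio(screened, "race", "advanced") < 0.8
    ```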

    • Moneo@lemmy.world · 5 months ago

      > Same thing with motion-sensor sinks: way too many don’t work for people with darker skin because they were only ever tested on white people.

      Epitomizes white privilege for me. The white male is considered the default, and as a result we enjoy benefits daily without even realizing it. The book Invisible Women does a great job of pointing out the many ways in which the “male default bias” (idk if that’s the correct terminology) negatively affects women. Some of these are ‘trivial’, like how women’s bathrooms can serve fewer people at a time because stalls are less space-efficient than urinals. So instead of giving women’s bathrooms more space, we just laugh at how women’s bathrooms always have long lines. Many are obviously not so trivial.