Inevitable Waffles [Ohio]

Mid 30’s IT/Medical Device support and quality guy. I like cycling, video games, and singing.

  • 0 Posts
  • 40 Comments
Joined 1 year ago
Cake day: June 1st, 2023




  • It depends on how the company goes about it. The larger the company, the more established the HR department. They may use their HR platform to conduct the check, which may turn up anything and everything. Smaller companies may only check recent background with a local firm. Price is the name of the game: the more in-depth the background check, the more it costs. If you are going to work in a bank or with kids, be prepared for the company/school to use the state equivalent of the FBI. Mom-and-pop shops may just take your word on the application. If you see a national HR platform like Paycom, then the results can vary depending on the package the company purchases.

    I just realized I didn’t answer your question, though. The main issue with using data brokers is that you, as the employer, for the most part can’t use them, or are legally dissuaded from doing so. We can only use official records to judge your trustworthiness. Things like data brokers are a grey area. They’re not admissible in a background check by most EEOC standards, but people use any system they want. It’s on the job seeker to prove they were discriminated against.

    As someone who hires people regularly, I only use the information provided by the HR platform. I don’t google people because I wouldn’t want that to happen to me. Other people may not have the same compunctions.

    Edit: Actually answering the damn question.




  • You can doubt all you like, but we keep seeing the training data leak out with passwords and personal information. This problem won’t be solved by the people who created it, since they don’t care, and fundamentally the technology will always reflect that lack of care. FOSS models may do better in this regard, but they are still datasets without context. That’s the crux of the issue. The program, or LLM, has no context for what it says. That’s why you get these nonsensical responses telling people that killing themselves is a valid treatment for a toothache. Intelligence is understanding. The “AI” or LLM or, as I like to call them, glorified predictive text bars, doesn’t understand the words it is stringing together, and most people don’t know that due to flowery marketing language and hype. The threat is real.


  • As someone who frequently interacts with the tech illiterate: no, they don’t. This sudden rush to put weighted text-hallucination tables into everything isn’t that helpful. The hype feels like self-driving cars or 3D TVs, for those of us old enough to remember those. The potential for damage is much higher than with either of those two preceding fads, and the cars actually killed people. I think many of us are expressing a healthy level of skepticism toward the people who need to sell us the next big thing, and it is absolutely warranted.