this post was submitted on 20 Feb 2024
0 points

World News


Breaking news from around the world.

News that is American but has an international facet may also be posted here.


Guidelines for submissions:

These guidelines will be enforced on a know-it-when-I-see-it basis.


For US News, see the US News community.


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago

AI hiring tools may be filtering out the best job applicants

As firms increasingly rely on artificial intelligence-driven hiring platforms, many highly qualified candidates are finding themselves on the cutting room floor.

[–] Gaywallet@beehaw.org 0 points 8 months ago (1 children)

This really does not surprise me one bit. But also, nobody using these tools really cares. They reduce the number of applications recruiters need to review, which is often all anyone cares about. Can't wait for the inevitable company to pop up offering the AI equivalent of SEO: stuffing your resume with the right keywords so you can get past the filter and land a job.

Also, perhaps more importantly, this is just going to undo fifty years of antiracism and antisexism work. The biggest problem with AI is that it's trained on the output of a biased system, and when it's then used to gatekeep that same system, it just compounds the existing inequality.

[–] TehPers@beehaw.org 0 points 8 months ago (1 children)

Building off your last point: with AI models, bias can show up in ways you might not expect. For example, I once saw an image model that was trained with diversity in mind, but it then only ever output Asian people, with a strong bias towards women. It seems to me that diversity is difficult to train into a model, since it's hard to avoid overfitting on whatever demographic mix dominates the training data.

It might be interesting to see whether randomizing the model's input could increase the diversity of its outputs. That wouldn't help with resume screening tools, though (which are probably classifiers), only with generative models.
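The randomized-input idea could be sketched roughly like this. Everything here is hypothetical: the attribute pools, the prompt, and the helper function are invented for illustration, and a real taxonomy of attributes would be far more careful (and contested):

```python
import random

# Hypothetical attribute pools (invented for this sketch).
ATTRIBUTES = {
    "age": ["young", "middle-aged", "older"],
    "region": ["East Asian", "West African", "Northern European", "South American"],
}

def randomized_prompt(base_prompt, rng=random):
    """Append randomly sampled attributes so each generation is conditioned
    on a different point in the attribute space, instead of letting the
    model fall back on its overfitted default demographic."""
    extras = ", ".join(rng.choice(options) for options in ATTRIBUTES.values())
    return f"{base_prompt}, {extras}"

# 100 prompts for the same subject now span many attribute combinations.
prompts = [randomized_prompt("portrait of a software engineer") for _ in range(100)]
```

The sampling happens outside the model, so it sidesteps the overfitting problem upstream rather than trying to fix the model's learned distribution.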

[–] agressivelyPassive@feddit.de 0 points 8 months ago

There isn't really a good way to even define diversity.

The bad approach is corporate token diversity, where every picture has to include a white, a Black, and an Asian person, at least 50% of them have to be women, and one has to wear a hijab. That ticks a lot of boxes, but it isn't actually representative.

You could also use the "blind test" approach many tech solutions take, where you simply leave out any hints about cultural background. But as has been shown repeatedly, if the underlying data is biased, the AI will find that bias anyway, for example by devaluing certain zip codes that correlate with it.
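The zip-code point is easy to demonstrate on synthetic data. A minimal sketch, assuming a toy world where group membership drives both zip code and the historical hiring outcome; all the numbers are invented for illustration:

```python
import random

random.seed(0)

# Toy world: group membership correlates with zip code (90% of the time)
# and with the historical hiring outcome a model would learn from.
rows = []
for _ in range(1000):
    group = random.choice([0, 1])
    zip_code = group if random.random() < 0.9 else 1 - group
    hired = 1 if random.random() < (0.8 if group == 1 else 0.3) else 0
    rows.append((group, zip_code, hired))

# "Blind" screening: drop `group` entirely and look only at `zip_code`.
hire_rate = {}
for z in (0, 1):
    subset = [r for r in rows if r[1] == z]
    hire_rate[z] = sum(r[2] for r in subset) / len(subset)

# The disparity survives the blinding, because zip code proxies for group.
print(hire_rate)
```

Even though the protected attribute never reaches the "model" here, the hire-rate gap between zip codes reproduces the gap between groups almost exactly, which is the whole problem with blinding as a fix.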

And of course there's the "equal opportunity" approach, where you try to represent the relevant groups in your selection in the same proportions as they occur in the underlying population, but that is essentially *-ism by another name.
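Mechanically, that proportional approach just means turning population shares into selection quotas. A trivial sketch with invented group names and shares:

```python
# Sketch of proportional representation: allocate selection slots to
# match each group's share of the underlying population.
# Groups and shares are made up for illustration.
population_share = {"A": 0.6, "B": 0.3, "C": 0.1}
slots = 20

quota = {group: round(share * slots) for group, share in population_share.items()}
print(quota)  # {'A': 12, 'B': 6, 'C': 2}
```

Which is exactly why it's "*-ism by another name": the selection is still made on group membership, just pointed at a target distribution instead of away from one.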