[–] Zwuzelmaus 53 points 1 day ago (2 children)

Unions on a global scale. A very interesting development.

[–] oxysis@lemmy.blahaj.zone 17 points 1 day ago

One I am all for

[–] A_norny_mousse 9 points 1 day ago (1 children)

I'm for it.

Yet, cynical me cannot help but wonder how the big platforms will react. How they'll try to quash it, undoubtedly.

Not saying I'm pessimistic though!

[–] SaharaMaleikuhm 2 points 1 day ago (1 children)

I sincerely hope they just die. More likely they will just get rid of content moderation entirely.

[–] A_norny_mousse 3 points 19 hours ago* (last edited 16 hours ago)

In the current USA this might even be an option.

Or they'll try to use AI only. But these systems are not intelligent and won't produce satisfying results; since everyone is chasing that hype right now, they'll probably spend a billion before they figure that out...

Of course, if moderation gets even worse than it already is, the platforms may eventually be banned in other countries, notably the EU.

[–] solrize@lemmy.world 19 points 1 day ago* (last edited 1 day ago) (3 children)

Contract workers for Meta, TikTok, Google, and more are forming a global group to fight for better working conditions.

Try Reddit also, the moderators there don't even get paid :(.

[–] Vegasvator@lemmy.today 3 points 1 day ago

Jannies don't deserve to get paid. They just want control over whatever narrative they want to push.

[–] Ledericas@lemm.ee 1 points 20 hours ago (1 children)

They don't care as long as they hold power from modding. Especially the power mods and supermods, the ones that hold 500+ subs with just 92 mods.

[–] A_norny_mousse 3 points 19 hours ago

The ones that hold 500+ subs with just 92 mods

Those probably get paid, one way or another.

[–] pulido@lemmings.world 5 points 1 day ago

That's their fault.

[–] 001Guy001@lemm.ee 9 points 1 day ago (3 children)

The pressure to review thousands of horrific videos each day – beheadings, child abuse, torture – takes a devastating toll on our mental health

What could be a solution for dealing with that? I wouldn't want to be exposed to that type of content even if I was paid to do so and had access to health support to deal with the aftermath every time.

Whilst automated tools can help with this, there is a heckton of human labour to be done in training those tools, and in reviewing moderation decisions that require a human's eye. I think that in a world where we can't eradicate that need, the least we can do is ensure that people are paid well, in non-exploitative conditions, with additional support to cope.

Actually securing these things in a way that's more than just lip service is part of that battle. I remember a harrowing article a while back about content moderators in Kenya, working for Sama, which was contracted by Facebook. There were so many layers of exploitation in that situation that it made me sick. If the "mental health support" you have access to is an on-site therapist who guilt-trips you into going back to work asap, and you're so hurried and stressed that you don't even have time to take a breather after seeing something rough, then conditions like that are going to cause a disproportionate amount of preventable human harm.

Even if we can't solve this problem entirely, there's so much needless harm being done, and that's part of what this fight is about now.

[–] wizardbeard@lemmy.dbzer0.com 6 points 1 day ago

On paper, it's one of the uses for AI image recognition. It could reduce the amount that needs human review drastically.

In reality, YouTube's partially automated system (to my knowledge the most robust one around) regularly flags highly stylized videogame violence as if it were real gore. It also has some very dumb workarounds, like simply putting the violence more than 30 seconds into the video (which has concerning implications for its ability to filter real gore).
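For a concrete sense of the triage being described, here is a minimal Python sketch of confidence-threshold routing. Everything in it is hypothetical (the thresholds, the `route` function, the example probabilities); real pipelines like YouTube's are far more complex and not publicly documented. It just shows how a classifier can auto-action only the cases it is very sure about while sending the ambiguous middle to human reviewers, which is the part that shrinks the workload but never eliminates it.

```python
# Hypothetical sketch of confidence-threshold triage for content moderation.
# Thresholds and values are illustrative, not from any real system.

AUTO_REMOVE = 0.98   # act automatically only when the model is very sure
AUTO_ALLOW  = 0.02   # below this, treat the content as almost certainly fine

def route(violation_prob: float) -> str:
    """Route one item given the model's estimated probability of a violation."""
    if violation_prob >= AUTO_REMOVE:
        return "auto-remove"      # no human ever has to see it
    if violation_prob <= AUTO_ALLOW:
        return "publish"
    return "human-review"         # the ambiguous middle is where humans work

# Stylized game violence tends to land in that ambiguous band, which is
# exactly where false flags (and human workload) pile up.
for p in (0.995, 0.65, 0.01):
    print(p, "->", route(p))
```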

[–] desktop_user@lemmy.blahaj.zone -3 points 20 hours ago (1 children)

Employ people who aren't as bothered by it and pay them well. Presumably pedophiles would be more willing to moderate CSAM, and people with psychopathy more willing to moderate torture and abuse. And as long as there is no paper trail of intentionally hiring these people, I don't know that it would be illegal.

[–] A_norny_mousse 4 points 19 hours ago

That gave me a good laugh, thank you.