this post was submitted on 28 Feb 2025
264 points (99.3% liked)

Technology

[–] Kusimulkku@lemm.ee 37 points 2 months ago (3 children)

Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

I get how fucking creepy and downright sickening this all feels, but I'm genuinely surprised that it's illegal or criminal if there's no actual children involved.

It mentions sexual extortion, and that's definitely something that should be illegal, same for spreading AI-generated explicit material about real people without their consent, whether children or adults, but idk about the case mentioned here.

[–] HappySkullsplitter@lemmy.world 20 points 2 months ago

It's certainly creepy and disgusting

It also seems like we're half a step away from thought police regulating any thought or expression a person has that those in power do not like

[–] sugar_in_your_tea@sh.itjust.works 10 points 2 months ago

Exactly. If there's no victim, there's no crime.

[–] Korhaka@sopuli.xyz 5 points 2 months ago* (last edited 2 months ago) (1 children)

It would depend on the country. In the UK even drawn depictions are illegal. I assume it has to at least be realistic and stick figures don't count.

[–] Kusimulkku@lemm.ee 15 points 2 months ago (2 children)

It sounds like a very iffy thing to police. Since drawn characters don't have an actual age, how do you determine it? Looks? That wouldn't be great.

[–] JuxtaposedJaguar@lemmy.ml 12 points 2 months ago

Imagine having to argue to a jury that a wolf-human hybrid with bright neon fur is underage because it isn’t similar enough to a wolf for dog years to apply.

[–] jacksilver@lemmy.world 4 points 2 months ago (1 children)

I mean, that's the same problem with AI-generated content. It's all trained on a wide range of real people, so how do you know what's generated isn't depicting an underage person? That's why laws like this are really dangerous.

Exactly. Any time there's subjectivity, it's ripe for abuse.

The law should punish:

  • creating images of actual underage people
  • creating images of actual non-consenting people of legal age
  • knowingly distributing one of the above

Each of those has a clearly identifiable victim. Creating a new work of a fictitious person doesn't have any clearly identifiable victim.

Don't make laws to make prosecution easier, make laws to protect actual people from becoming victims or at least punish those who victimize others.

[–] Allero@lemmy.today 35 points 2 months ago (5 children)

I'm afraid Europol is shooting themselves in the foot here.

What should be done is develop better ways to mark and identify AI-generated content, not a blanket ban and criminalization.

Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as continuous investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of that is needed anyway, it's easier to escalate, and that's dangerous.

As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

[–] turnip@sh.itjust.works 8 points 2 months ago (3 children)

You can download the models and compile them yourself, that will be as effective as the US government was at banning encryption.

[–] drmoose@lemmy.world 3 points 2 months ago (1 children)

This relies on the idea that an "outlet" is not harmful. It might even be encouraging, but who would ever study this to help us find out? Can you imagine being the scientist who'd have to lead studies like this: an incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.

[–] raptir@lemmy.zip 3 points 2 months ago (5 children)

What would stop someone from creating a tool that tagged real images as AI generated?

Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.

[–] BrianTheeBiscuiteer@lemmy.world 34 points 2 months ago (11 children)

On one hand, I don't think this kind of thing can be consequence-free (from a practical standpoint). On the other hand... how old were the subjects? You can't look at a person to determine their age, and someone who looks like a child but is actually an adult wouldn't be charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance that the subject is not being exploited or otherwise harmed by the act.

This is a massive grey area and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor provided the creator didn't use underage subjects to train or influence the output.

[–] Stanley_Pain@lemmy.dbzer0.com 58 points 2 months ago (5 children)

I think it's pretty stupid. Borders on Thought Crime kind of stuff.

I'd rather see that kind of enforcement and effort go towards actually finding people who are harming children.

[–] Inucune@lemmy.world 15 points 2 months ago

This is also my take: any person can set up an image generator and churn out any content they want. The focus should be on actual people being trafficked and abused.

[–] SharkAttak@kbin.melroy.org 7 points 2 months ago

I've read it described as a "victimless crime"; not that I condone it, but thinking about the energy and resources spent on such a large operation... over drawn porn? C'mon.

[–] raoulduke85@lemm.ee 3 points 2 months ago

There’s a few in the White House.


I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.

So could I, but that doesn't make it just. It should only be a crime if someone is actually harmed, or intended to be harmed.

Creating a work about a fictitious individual shouldn't be illegal, regardless of how distasteful the work is.

[–] Xanza@lemm.ee 23 points 2 months ago (4 children)

I totally agree with these guys being arrested. I want to get that out of the way first.

But what crime did they commit? They didn't abuse children... the children are AI-generated and do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it's flat, but where's the line here? If they draw pictures of non-existent children, is that also a crime?

Does that open artists up to legal interpretation when it comes to art? Can they be put in prison because they did a professional painting of a child? What if they did a painting of their own child in the bath or something? Sure, the content's questionable, but it's not exactly predatory. And if you add safeguards for these people, couldn't predators then just claim artistic expression?

It just seems entirely unenforceable and an entire goddamn can of worms...

[–] Allero@lemmy.today 24 points 2 months ago (3 children)

I actually do not agree with them being arrested.

While I recognize the issue of identification posed in the article, I hold a strong opinion it should be tackled in another way.

AI-generated CSAM might be a powerful tool to reduce demand for the content featuring real children. If we leave it legal to watch and produce, and keep the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image that was produced with no children being harmed.

By introducing actions against AI-generated materials, they make such materials as illegal as the real thing, and there's one less reason for an interested party not to go to a CSAM site and watch actual children getting abused, perpetuating the cycle and leading to more real-world victims.

It's strange to me that it's referred to as CSAM. No real people are involved, so no one is being sexually assaulted. It's creepy, but calling it that implies a drawing is a person, to me.

[–] sugar_in_your_tea@sh.itjust.works 11 points 2 months ago

Exactly, which is why I'm against your first line, I don't want them arrested specifically because of artistic expression. I think they're absolutely disgusting and should stop, but they're not harming anyone so they shouldn't go to jail.

In my opinion, you should only go to jail if there's an actual victim. Who exactly is the victim here?

[–] JuxtaposedJaguar@lemmy.ml 10 points 2 months ago (40 children)

Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal, because otherwise it would become impossible to police actual CP.

As long as it's clearly fictional, though, let people get off to whatever imaginary stuff they want. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting, yet they shouldn't be illegal.

[–] badbytes@lemmy.world 7 points 2 months ago (1 children)

If an underage AI character is portrayed in, say, a movie or a game, is that wrong? Seems like a very slippery slope.

[–] General_Effort@lemmy.world 3 points 2 months ago (1 children)

There have been controversies about that sort of thing.

The Oscar-winning movie The Tin Drum is one example I know of. The book by Günter Grass is a very serious, highly celebrated piece of German post-war literature, set around WW2. The protagonist has the mind of an adult in the body of a child. I guess the idea is that he is the other way around from most people?

The movie was banned in Ontario and Oklahoma, for a time. https://en.wikipedia.org/wiki/The_Tin_Drum_(film)#Censorship

With European societies shifting right, I doubt such a movie could be made today, but we aren't at a point where it would be outright illegal.

[–] Muscle_Meteor@discuss.tchncs.de 3 points 2 months ago

Followed swiftly by operation jizzberworld
