This post was submitted on 19 Sep 2023

Europe


News/Interesting Stories/Beautiful Pictures from Europe 🇪🇺

(Current banner: Thunder Mountain, Germany 🇩🇪.) Feel free to post submissions for banner pictures.

Rules

(This list is obviously incomplete, but it will get expanded when necessary)

  1. Be nice to each other (e.g. no direct insults against each other);
  2. No racism, antisemitism, dehumanisation of minorities, or glorification of National Socialism allowed;
  3. No posts linking to misinformation funded by foreign states or billionaires.

Also check out !yurop@lemm.ee


The police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

23 comments
[–] MargotRobbie@lemm.ee 0 points 1 year ago (1 children)

Banning diffusion models doesn't work; the tech is already out there and you can't put it back in the box. Fake nudes used to be made with Photoshop; current generative AI models just make them faster to produce.

This can only be stopped on the distribution side, and any new laws should focus on that.

But the silver lining of this whole thing is that nude scandals for celebs aren't really possible any more if you can just say it's probably a deepfake.

[–] GCostanzaStepOnMe@feddit.de 0 points 1 year ago (1 children)

Other than banning those websites and apps that offer such services, I think we also need to seriously rethink our overall exposure to the internet, and especially rethink how and how much children access it.

[–] MadSurgeon@sh.itjust.works 0 points 1 year ago (1 children)

We'll need an AI-run police state to stop this technology. I doubt anybody has even the slightest interest in that.

[–] GCostanzaStepOnMe@feddit.de 0 points 1 year ago* (last edited 1 year ago)

"We'll need an AI-run police state to stop this technology."

No? You really just need to ban websites that run ads for these apps.

[–] Aetherion@feddit.de 0 points 1 year ago

Better not stop posting your life on the internet; that would push people to create more child porn! /s

[–] iByteABit@lemm.ee 0 points 1 year ago (1 children)

Governments need to strike hard against all kinds of platforms like this, even if they can be used for legitimate reasons.

AI is far too dangerous a tool to allow free innovation and a free market around; it's the number one technology right now that must be heavily regulated.

[–] Blapoo@lemmy.ml 0 points 1 year ago (1 children)

What, exactly, would they regulate? The training data? The output? What kinds of user inputs are accepted?

All of this is hackable.

[–] RaivoKulli@sopuli.xyz 0 points 1 year ago (1 children)

Making unauthorized nude images of other people, probably. The service did advertise "undress anyone".

[–] jet@hackertalks.com 0 points 1 year ago* (last edited 1 year ago) (2 children)

The philosophical question becomes: if it's AI-generated, is it really a photo of them?

Let's take it to an extreme: if you cut the face out of somebody's Polaroid and paste it into a nudie magazine over the face of an actress, is that amalgam a nude photo of the person in the Polaroid?

It's a debate that could go either way, and I'm sure we will have an exciting legal landscape, with different countries adopting different rules.

[–] taladar@feddit.de 0 points 1 year ago

I suppose you could make a Ship of Theseus-like argument there too. At what point does it matter where the parts of the picture came from? Most people would probably be okay with their hairstyle being added to someone else's picture; what about their eyes, their mouth...? Where exactly is the line?

[–] ParsnipWitch@feddit.de 0 points 1 year ago

How about we teach people some baseline of respect towards other people? Punishing behaviour like this can help show that it's not okay to treat other people like pieces of meat.

[–] ciko22i3@sopuli.xyz 0 points 1 year ago (1 children)

At least now you can claim it's AI if your real nudes leak

[–] taladar@feddit.de 0 points 1 year ago

In the long term that might even lead to society no longer freaking out every time someone in a semi-sensitive position is discovered to have nude pictures online.

[–] duxbellorum@lemm.ee 0 points 1 year ago (1 children)

This seems like a pretty significant overreaction. Like yes, it's gross and it feels personal, but it's not like any of the subjects were willing participants… their reputation is not being damaged. Would they lose their shit about a kid gluing a cut-out of their crush's face over the face of a pornstar in a magazine? Is this really any different from that?

[–] 0x815@feddit.de 0 points 1 year ago (1 children)

These are schoolgirls in their teenage years. To them and their parents, this must be a nightmare.

[–] duxbellorum@lemm.ee 0 points 1 year ago* (last edited 1 year ago) (2 children)

Why? They didn’t take or share any nudes, and nobody believes they did.

This is only a nightmare if an ignorant adult tells them that it is.

[–] 0x815@feddit.de 0 points 1 year ago

@duxbellorum

"Why? They didn't take or share any nudes, and nobody believes they did. This is only a nightmare if an ignorant adult tells them that it is."

So you don't have children, right?

[–] ParsnipWitch@feddit.de 0 points 1 year ago* (last edited 1 year ago)

Did your picture get taken and shared when you were a teenager? Were you heavily sexualised and harassed? Believe me, it feels like a nightmare even if no one is telling you that it should.

Take your "sexual harassment is only bad for teenage girls if you tell them" shit elsewhere.

[–] aard@kyu.de 0 points 1 year ago (1 children)

This was just a matter of time - and there isn't really much the affected can do (and, in some cases, should do). Shutting down that service is the correct thing - but that will only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are within reach of the age group in question.

So in the long term we'll see this shift to images generated at home, by kids often too young to be prosecuted - and you won't be able to stop that unless you start outlawing most AI image-generation tools.

At least in Germany, the law dealing with child/youth pornography was badly botched by incompetent populists in the government - it would send any of those parents to jail for at least a year if they took possession of one of the generated pictures. Having one sent to their phone and going to the police to file a complaint would be sufficient to start a prosecution against them.

There's one blessing coming out of that mess, though: for girls who did take pictures and had them leaked, saying "they're AI-generated" is becoming a plausible way out.

[–] Turun@feddit.de 0 points 1 year ago

Source for the law you mentioned, please. I want to read it in detail.

[–] A2PKXG@feddit.de 0 points 1 year ago

So, how close to the real bodies are the fakes?

[–] rufus@discuss.tchncs.de 0 points 1 year ago* (last edited 1 year ago) (1 children)

Interesting. Replika AI, ChatGPT, etc. crack down on me for writing erotic stories and role-play dialogues, and this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head...

I wonder why they have no address or anything on their website, and why the app isn't available in any of the proper app stores.

Obviously the police should ask Instagram who is blackmailing all these girls... teach them a proper lesson. And then stop this company. Fine them a few million for generating and spreading synthetic CP. At least write a letter to their hosting or payment providers.

[–] crispy_kilt@feddit.de 0 points 1 year ago

Fined? Fuck that. CP must result in jail time.