this post was submitted on 22 Aug 2023
1 points (100.0% liked)

Technology


Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

[–] p03locke@lemmy.dbzer0.com 0 points 1 year ago* (last edited 1 year ago) (2 children)

There is so much wrong with just the title of this article:

  1. What marketplace? CivitAI is free. Unstable Diffusion Discord is free. Stable Diffusion is free. All of the models and LoRAs are free to download. The only cost is a video card (even a basic one) and some time to figure this shit out.
  2. "Everyone is for sale". No, that's the current fucking situation, where human trafficking runs rampant throughout the sex and porn industry. AI porn is conflict-free. You don't need to force an underage, kidnapped teenager to perform a sex act in front of a camera to create AI porn.
  3. "For Sale". Again, where's the sale? This shit is free.

A 404 Media investigation shows that recent developments

Get the fuck outta here! This two-bit blog wants to call itself "a 404 Media investigation"? Maybe don't tackle subjects you have no knowledge or expertise in.

The Product

Repeat: FOR FREE! No product!

In one user’s feed, I saw eight images of Gwen Tennyson, the cartoon character from the children's show Ben 10, in a revealing maid’s uniform. Then, nine images of her making the “ahegao” face in front of an erect penis. Then more than a dozen images of her in bed, in pajamas, with very large breasts. Earlier the same day, that user generated dozens of innocuous images of various female celebrities in the style of red carpet or fashion magazine photos. Scrolling down further, I can see the user fixate on specific celebrities and fictional characters, Disney princesses, anime characters, and actresses, each rotated through a series of images posing them in lingerie, schoolgirl uniforms, and hardcore pornography.

Have you seen Danbooru? Or F95 Zone? This shit is out there, everywhere. Rule 34 has existed for decades. So has the literal site called "Rule 34". You remember that whole Tifa porn video that showed up in an Italian courtroom? Somebody had to animate that. 3D porn artists take their donations through Patreon. Are you going to go after Patreon, too?

These dumbasses are describing things like they've been living under a rock for the past 25 years, watching cable TV with no Internet access, just NOW discovered AI porn as their first vice, and decided to write an article about it to get rid of the undeserved guilt over what they found.

What a shitty, pathetic attempt at creating some sort of moral panic.

[–] Send_me_nude_girls@feddit.de 0 points 1 year ago

I just wanted to say I love your comment. You're totally correct, and I enjoyed the passion in your words. That's how we ought to deal with shit articles more often. Thx

[–] tux0r@feddit.de 0 points 1 year ago (1 children)

Repeat: FOR FREE! No product!

If it's free, chances are you're the product. I assume that there is a market for user-generated "prompts" somewhere.

[–] p03locke@lemmy.dbzer0.com 0 points 1 year ago (1 children)

No, that's not how open-source or open-source philosophies work. They share their work because they were able to download other people's work, and sometimes people improve upon their own work.

These aren't corporations. You don't need to immediately jump to capitalistic conclusions. Just jump on Unstable Diffusion Discord or CivitAI yourself. It's all free.

[–] tux0r@feddit.de 0 points 1 year ago

These aren’t corporations.

I know, I know: "but the website is free" (for now). However, Civit AI, Inc. is not a loose community. There must be something that pays their bills. I wonder what it is.

[–] ArbitraryValue@sh.itjust.works 0 points 1 year ago* (last edited 1 year ago) (1 children)

I'm unconvinced by this attempt to create a moral panic. IMO nothing here is shocking or offensive once I remember that people could already use their imaginations to picture celebrities naked.

[–] ThetaDev@lemm.ee 0 points 1 year ago (1 children)

The main issue with this would be public defamation, i.e. wrongfully portraying someone as a porn actor, which might destroy their career. You can't really do that with written or drawn fiction.

But for that the pictures would have to be photorealistic, which is not the case just yet. The tech is going to improve, though, and the generated images could be further manipulated (e.g. adding blur/noise to make them look like a bad phone picture).

[–] ArbitraryValue@sh.itjust.works 0 points 1 year ago (1 children)

Once the ability to make photo-realistic images like that becomes commonplace, those images won't be evidence of anything anymore. Now I can tell you a story about how I had sex with a celebrity, and you won't believe me because you know I easily could have made it all up. In the future I will be able to show you a 100% realistic video of me having sex with a celebrity, and you won't believe me because you'll know that I easily could have made it all up.

[–] Savaran@lemmy.world 0 points 1 year ago (1 children)

The obvious thing is that at some point any camera worth its salt will have a nice embedded key that signs its output, traceable to a vendor's CA at the least. No signature, and the image would be considered fake.

[–] gandalf_der_12te@feddit.de 0 points 1 year ago (1 children)

As a programmer, I gotta say, that's probably not technically feasible in a sensible way.

Every camera has got to have an embedded key, and if any one of them leaks, the system becomes worthless.

[–] Turun@feddit.de 0 points 1 year ago* (last edited 1 year ago)

No, that would actually be feasible with enough effort.

The real question is what you do if someone takes a screenshot of that image. Since the picture must be in a format that can be shown, nothing stops people from writing software that just strips the authentication from the camera file.

Edit: misread the problem. You need to get a private key to make forgeries and be able to say "no look, this was taken with a camera". Stripping the signature from photographs is the opposite of what we want here.
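The scheme being debated above can be sketched in a few lines. This is a minimal, stdlib-only illustration assuming a hypothetical per-device secret key; real designs (e.g. C2PA-style content credentials) use asymmetric signatures chained to a vendor CA, so verifiers would never need the camera's secret. It shows both points from the thread: stripping or breaking the tag only makes an image *unverified*, while leaking the key lets anyone forge "authentic" images.

```python
import hmac
import hashlib

# Hypothetical per-device secret, burned into the camera at manufacture.
# (Illustrative only — a real scheme would use an asymmetric keypair so
# this secret never has to be shared with verifiers.)
CAMERA_KEY = b"per-device-secret-example"

def sign_image(pixels: bytes) -> bytes:
    """Camera side: produce an authentication tag over the raw pixel data."""
    return hmac.new(CAMERA_KEY, pixels, hashlib.sha256).digest()

def verify_image(pixels: bytes, tag: bytes) -> bool:
    """Verifier side: any edit, re-encode, or screenshot changes the bytes,
    so the tag no longer matches and the image reads as unauthenticated."""
    expected = hmac.new(CAMERA_KEY, pixels, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

photo = b"raw sensor bytes (stand-in for an actual image)"
tag = sign_image(photo)

assert verify_image(photo, tag)                 # untouched photo verifies
assert not verify_image(photo + b"edit", tag)   # edited/re-encoded copy fails
```

Note how the failure modes line up with the thread: a screenshot simply loses the tag (bad, but only downgrades the image to "unverified"), whereas extracting `CAMERA_KEY` from even one device lets an attacker sign arbitrary fakes, which is the leak problem raised above.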

[–] Kazumara@feddit.de 0 points 1 year ago

Ha, the image description just says "An AI-generated woman found on CivitAI" even though that's clearly the character Power from Chainsaw Man.

[–] regalia@literature.cafe 0 points 1 year ago (1 children)

Who the fuck is buying this lol, also it's considered public domain.

[–] tux0r@feddit.de 0 points 1 year ago* (last edited 1 year ago) (1 children)

The problem with Public Domain is that it does not exist in most jurisdictions. There is no "Public Domain" in (edit: at least parts of) the EU, for example.

[–] fred@lemmy.ml 0 points 1 year ago (1 children)

What do you mean? Anything that isn't copyrighted is public domain, including old works.

[–] tux0r@feddit.de 0 points 1 year ago* (last edited 1 year ago)

In some countries, that might be the case. However, in Germany (where I live), there is no way to have something "not copyrighted". The author holds the copyright unless explicitly licensed. (Here's where the CC0 comes in handy, but the CC licenses weren't made for software...)

Our § 29 UrhG explicitly denies the possibility to give up your copyright before your death. Austria has similar laws. So no, nothing is "public domain" in Germany.

(edit:) See also this discussion on Hacker News for broader details.

[–] altima_neo@lemmy.zip 0 points 1 year ago (1 children)

Brb, firing up Stable Diffusion and at least 4 LoRAs

[–] tux0r@feddit.de 0 points 1 year ago

Ah, a horse collector.

[–] db2@sopuli.xyz 0 points 1 year ago (2 children)

This is not a troll: zoom in on the feet of the yellow dress image. It's hilariously bad.

Oh no, the realism, it's just too much! 🤡

[–] lightnsfw@reddthat.com 0 points 1 year ago (1 children)

She has the correct number of toes. What's the problem?

[–] tux0r@feddit.de 0 points 1 year ago

Wait, why did you zoom in on the feet?

[–] afraid_of_zombies@lemmy.world 0 points 1 year ago (1 children)

Maybe we do live in the best possible world. Wouldn't it be great to get rid of this industry, so you could consume porn knowing there's a zero percent chance it was made without someone's consent?

[–] hh93@lemm.ee 0 points 1 year ago (1 children)

Isn't the main problem with those models that you can create porn of anyone without their consent, too?

[–] gandalf_der_12te@feddit.de 0 points 1 year ago

Yeah, so what? It's not as if somebody is "sold on the market" because there's a nude picture of them. Photoshop is not a real threat to society. We gotta stop treating moral imaginings as more important than physical things.