this post was submitted on 10 Dec 2024
321 points (99.4% liked)

A Boring Dystopia

[–] TommySoda@lemmy.world 67 points 1 week ago* (last edited 1 week ago) (5 children)

Let's not do anything about the unregulated technology that can spread lies faster than ever before as websites get absolutely flooded with believable bots that outnumber the actual users. Let's make secret passwords and handshakes like we're in a clubhouse.

Regardless, it's not a bad idea, since it's probably not gonna get better for a while, if at all.

[–] Ledivin@lemmy.world 29 points 1 week ago

The technology is out. While something should be done on that side of things, it also doesn't remove the technology from existence - you will still need other protections.

[–] rudyharrelson@lemmy.radio 8 points 1 week ago

Regulations virtually always lag years behind technology, don't they? In the interim period with absolutely no regulations, we must take it upon ourselves to protect ourselves and our loved ones from being exploited.

Given just how wealthy the AI bubble is making some people, we may not see any common sense regulation for quite some time. Best to adapt to that reality imo. Gonna tell my friends and family to call me by my hacker alias, "X360N0_sc0peX" on the phone or I'll assume they're a bot.

[–] zecg@lemmy.world 4 points 1 week ago

What can be done? You can download an LLM and run it locally; they're not going away.

[–] westyvw@lemm.ee 2 points 1 week ago* (last edited 1 week ago)

Websites have been full of shit, bots or not, since forever. Nothing new here.

[–] IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com 2 points 1 week ago (1 children)

Regulating it does nothing. Only rich people get to have deepfakes? Nah, let it be public, so everyone can have some vigilance.

[–] kibiz0r@midwest.social 4 points 1 week ago (1 children)

vigilance

Vigilance is like, not drinking the water that comes out of a nuclear reactor.

What we’re talking about here is letting everyone run their own reactor and dump the waste into the street.

You don’t gain vigilance, you lose all habitable public space.

[–] TechLich@lemmy.world 1 points 1 week ago (1 children)

It's a bit late for that. This particular nuclear reactor is open source, free to download and runs on consumer hardware. Can't really unfry that egg and the quality is getting better all the time. Identity fraud is already illegal in most places so not sure exactly what regulation would be appropriate here.

[–] phneutral 1 points 1 week ago (1 children)

First of all: you need giant data centres to train the models.

Identity fraud is illegal, copyright theft is illegal as well — put the blame on the owner of the data centres.

I know from valid sources that governments know who these folks are.

[–] TechLich@lemmy.world 1 points 1 week ago

Not entirely true. You don't need your own personal data centre; you can use GPU cloud instances for a lot of that stuff. It's expensive, but not so expensive that it would be impossible without being a huge tech company (thousands of dollars, not billions). This can be done by anyone with a credit card and some cash to burn. Also, you don't need to train a model from scratch; you can build on existing models that others have published to cut down on training.

However, to impersonate someone's voice you don't need any of that. You only need about 5-10 seconds of audio for a zero-shot impersonation with a pre-trained model. A minute or so for few-shot. This runs on consumer hardware and in some cases even in real time.

Even to build your own model from scratch for high quality voice audio, there doesn't need to be a huge amount of initial training data. Something like XTTS was trained with about 10-15K hours of English audio, which is actually pretty easy to come by in the public domain. There are a lot of open and public research datasets specifically for this kind of thing, no copyright infringement necessary. If a big tech company wants more audio data than what's publicly available, they just pay people to record audio; no need to steal it or risk copyright claims and breaking surveillance laws, they have a budget to exploit people to record whatever they want.

This tech wasn't invented by some evil giant tech company stealing everybody's data, it was mostly geeky computer scientists presenting things at computer speech synthesis conferences. That's not to say there aren't a bunch of huge evil tech companies profiting from this or contributing to this kind of tech, but in the context of audio deepfakes being accessible to scammers, it's not on them and I don't think that some kind of extra copyright regulation on data centres would do anything about it.

The current industry leader in this space, in terms of companies trying to monetize speech synthesis, is ElevenLabs, a private start-up with only a few dozen employees.

The current tech is not perfect but definitely good enough to fool someone who isn't thinking too hard over a noisy phone call and a scammer doesn't need server time or access to a data centre to do it.

[–] IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com 34 points 1 week ago (2 children)

A secret phrase could just get wiretapped, and then it's no longer a secret.

You're gonna need to change them every day, nay, every conversation.

Might need some RSA-4096 to handshake each phone call for authentication, and might as well do encryption too.
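Tongue-in-cheek as it is, the per-call handshake idea is real. The Python stdlib has no RSA, so here's the same challenge-response shape sketched with a shared-secret HMAC instead (the key exchange, names, and 8-character truncation are all my own assumptions, not anything from the comment):

```python
import hmac
import hashlib
import secrets

# Assumed setup: both parties exchange this key in person once,
# like a secret phrase -- but it's never spoken on a call.
shared_key = secrets.token_bytes(32)

def answer(challenge: bytes, key: bytes) -> str:
    # Prove knowledge of the key without revealing it. A wiretapper
    # who records this call learns nothing reusable, because every
    # call uses a fresh random challenge.
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()[:8]

# Caller picks a fresh challenge each call and reads it out:
challenge = secrets.token_hex(8).encode()
# Callee computes the answer with their copy of the key:
reply = answer(challenge, shared_key)
# Caller checks the reply against their own computation:
assert hmac.compare_digest(reply, answer(challenge, shared_key))
```

Unlike a fixed secret phrase, recording one call gives an impersonator nothing: the next call uses a different challenge and therefore a different answer.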

Or we might need to generate some one-time pads and do a challenge-response thing: read 5 digits, have the other person reply with the 5 digits after that, then cross both groups out.
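The pad scheme above can be sketched in a few lines. This is a hypothetical toy (the `PadSide` class and its bookkeeping are my invention, not a real protocol): both parties hold identical printed pads of 5-digit groups, the caller reads the next unused group, the callee replies with the group after it, and both cross the pair out.

```python
import secrets

def make_pad(n_groups: int = 100) -> list[str]:
    # Generated once and shared in person; both parties
    # keep an identical printed copy.
    return [f"{secrets.randbelow(100_000):05d}" for _ in range(n_groups)]

class PadSide:
    """One party's copy of the pad; used groups are 'crossed out'."""

    def __init__(self, pad: list[str]):
        self.pad = list(pad)
        self.i = 0  # index of the next unused group

    def challenge(self) -> str:
        # Caller reads the next unused group aloud.
        return self.pad[self.i]

    def respond(self, heard: str):
        # Callee: verify the caller's group, reply with the next one,
        # then cross both out. A replayed or fabricated group fails.
        if heard != self.pad[self.i]:
            return None  # hang up
        reply = self.pad[self.i + 1]
        self.i += 2
        return reply

    def verify(self, heard: str) -> bool:
        # Caller: check the callee's reply, then cross both out.
        ok = heard == self.pad[self.i + 1]
        self.i += 2
        return ok
```

Because each pair of groups is crossed out after use, a wiretapper replaying an old recording fails the very next call:

```python
pad = make_pad()
alice, bob = PadSide(pad), PadSide(pad)
c = alice.challenge()
r = bob.respond(c)
assert alice.verify(r)
assert bob.respond(c) is None  # replayed group is rejected
```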

The future is gonna be so weird.

I feel like CSAM might go out of control.

Any video of politicians/candidates doing bad things would be responded with "CNN FAKE NEWS DEEPFAKE".

Like, you could just murder someone on a 4K camera and claim it's a deepfake.

We're so fucked.

[–] xorollo@leminal.space 27 points 1 week ago (2 children)

It's the Terminator "your mother is dead" scene.

[–] Kolanaki@yiffit.net 7 points 1 week ago

I'm just imagining the scene in question, but the dialogue is Arnold asking Connor's mom about that very scene.

"What does Arnold ask you in Terminator 2?"

"I think he asked me about the weather."

"Your foster parents are dead."

[–] Asetru 19 points 1 week ago

So, let's make the formula for concentrated dark matter our secret code.

[–] thesohoriots@lemmy.world 15 points 1 week ago

“Mom, I’m getting fed up with this orgasm!”

[–] InternetCitizen2@lemmy.world 15 points 1 week ago

HA HA HA fellow humankind member. This has given me a pointer to a disk location of a friend back in university

Lol, but seriously, back in uni a friend of mine got their social media hacked. The hacker was trying to beg for money and such. One person got suspicious and asked what their favorite beer was, so the scammer texted me: "hey, what is my favorite beer?"

Fortunately the account got locked for some reason, so no money was stolen. Bro still has not recovered it.

[–] dsilverz@thelemmy.club 14 points 1 week ago (1 children)

All of a sudden, "Poughkeepsie" pops up inside my head as a curious "secret distress signal".

[–] otter@lemmy.dbzer0.com 5 points 1 week ago (1 children)

I thought ours was "Tahiti", though.

[–] dsilverz@thelemmy.club 4 points 1 week ago

My previous comment is a reference to the Supernatural TV series. The protagonist brothers Sam and Dean Winchester had Poughkeepsie as a distress signal whenever one of them needed to inform the other to "pack up and run". One of the situations involved Dean telling Crowley the distress signal so Crowley could enter Sam's mind and warn him about his ongoing angelic possession.

[–] CubitOom@infosec.pub 8 points 1 week ago

Now I have a reason to spend the night looking at my Klingon dictionary.

[–] Infynis@midwest.social 5 points 1 week ago

Saying the same thing over and over again in different conversations would be super useful if your goal is to train an AI to listen to calls

[–] JeeBaiChow@lemmy.world 5 points 1 week ago (2 children)

Just ask them about something you've experienced together, assuming there has been contact.

Remember that time we went out and talked about your father while having tea?

Starting to realize I wouldn't remember most things

[–] Delphia@lemmy.world 3 points 1 week ago (1 children)

It's honestly not hard.

"Son, it's dad. I've had to borrow a phone..."

"Before I transfer the money, dad: when's Gran getting out of the hospital?"

(Gran's been dead for a decade)

[–] JeeBaiChow@lemmy.world 3 points 1 week ago

(puts the phone down)

'Your foster parents are dead.'

[–] sag@lemm.ee 3 points 1 week ago

El Psy Congroo

[–] Hiro8811@lemmy.world 0 points 1 week ago

I mean, close family members will recognise if it's you, so scammers will have to contact someone else, most likely someone they find through Facebook or other platforms. Still, this seems like a way to make people even more distinguishable from one another.