this post was submitted on 28 Aug 2024
202 points (94.7% liked)

Technology


The ideologues of Silicon Valley are in model collapse.

To train an AI model, you need to give it a ton of data, and the quality of output from the model depends upon whether that data is any good. A risk AI models face, especially as AI-generated output makes up a larger share of what’s published online, is “model collapse”: the rapid degradation that results from AI models being trained on the output of AI models. Essentially, the AI is primarily talking to, and learning from, itself, and this creates a self-reinforcing cascade of bad thinking.
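The degradation loop described above can be sketched in a few lines. This is a toy illustration of my own, not from the article: a "model" that just fits a mean and standard deviation, then trains each new generation only on a small sample of the previous generation's output. All names here are hypothetical; with a small sample per generation, sampling noise compounds, so the fit tends to drift and narrow over time.

```python
import random
import statistics

def train(data):
    # "Training" here is just estimating a mean and standard deviation.
    return statistics.mean(data), statistics.stdev(data)

def generate(model, n):
    # The trained model "publishes" n synthetic data points.
    mu, sigma = model
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
real_world = [random.gauss(0, 1) for _ in range(1000)]
model = train(real_world)  # generation 0 learns from real data

for gen in range(1, 11):
    synthetic = generate(model, 20)  # a small sample of its own output
    model = train(synthetic)         # the next generation learns only from that
    print(f"gen {gen:2d}: mean={model[0]:+.3f} stdev={model[1]:.3f}")
```

Each generation sees only the previous generation's output, never the real data again; that is the self-reinforcing cascade the article maps onto an insular social circle.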

We’ve been watching something similar happen, in real time, with the Elon Musks, Marc Andreessens, Peter Thiels, and other chronically online Silicon Valley representatives of far-right ideology. It’s not just that they have bad values that are leading to bad politics. They also seem to be talking themselves into believing nonsense at an increasing rate. The world they seem to believe exists, and which they’re reacting and warning against, bears less and less resemblance to the actual world, and instead represents an imagined lore they’ve gotten themselves lost in.

top 16 comments
[–] cygnus@lemmy.ca 67 points 2 months ago

Very good article, but it would have benefited from a few historical references, because none of this is new. The insularity and navel-gazing of royal and imperial courts, for example, are legendary, with their own customs, shibboleths, and of course everyone within believing they have the answers, despite being totally disconnected from the "real world".

[–] stevedidwhat_infosec@infosec.pub 36 points 2 months ago* (last edited 2 months ago) (1 children)

Beautiful article, thanks for sharing.

This highlights the problem that inherently exists in nature: feedback loops.

Repeated exposure to similar information or signals amplifies/entrenches existing patterns and can lead to distortion, narrowing frequencies, and reduced adaptability.

It's why these people turn into racists, sexists, abusers, etc.: they get what they want all the time simply because they have money and power and know how to game systems. So they learn, along those narrow pathways, that they get what they want whenever they want it.

They learn a reality that doesn’t exist for the greater whole because they themselves are not living as the greater whole does. None of this is rocket science.

I have to conclude that extreme wealth threatens the stability of our society. I'm not arguing for communism or anything else; every path has its hazards and shortcuts. I'm just pointing out the problem and opening the floor to discussion of it and of potential solutions.

[–] metaStatic@kbin.earth 19 points 2 months ago (2 children)

Not arguing for communism

This is one of those feedback loops. Communism threatens the current system so you are told repeatedly that it's a bad thing.

And it doesn't help that Communists refuse to learn from the mistakes of the 20th century.

... Not arguing for communism either but can I interest you in a black flag?

[–] kurwa@lemmy.world 4 points 2 months ago (1 children)

We going to be pirates now?

[–] TSG_Asmodeus@lemmy.world 5 points 2 months ago

I mean, yeah, basically.

[–] stevedidwhat_infosec@infosec.pub 1 points 2 months ago

I just stated that I wasn't interested in anything specific; I don't have any one solution in mind. Not that it's bad per se.

[–] spark947@lemm.ee 27 points 2 months ago (1 children)

You can label it with a trendy new tech term, but it is the age-old yes-manism that I have watched eat up the tech executives and CEOs I have worked with.

[–] criticalthreshold@lemmy.ml 7 points 2 months ago

This is worse. They (Musk et al.) are seemingly exposed only to information algorithmically tailored to what they already subscribe to, and thus see a patently false view of the world. That in turn leads them to take on another layer of dogma, leading to another layer of insular views that further feeds the cycle.

I believe this goes beyond the yes-men problem.

[–] fubarx@lemmy.ml 26 points 2 months ago (2 children)

TLDR: People need to go touch grass.

[–] harrys_balzac@lemmy.dbzer0.com 4 points 2 months ago (2 children)

TLDR: Some people need to be touched by blades of not grass.

[–] sik0fewl@lemmy.ca 4 points 2 months ago (1 children)
[–] Etterra@lemmy.world 3 points 2 months ago (1 children)

Sure. Let's call them that.

[–] harrys_balzac@lemmy.dbzer0.com 5 points 2 months ago

Single bladed French style fans

[–] drwho@beehaw.org 3 points 2 months ago

There isn't a whole lot of grass around here, it's pretty built up. And they fly over the parts of California that do have grass in private jets.

[–] SnotFlickerman@lemmy.blahaj.zone 15 points 2 months ago* (last edited 2 months ago)

They have fallen victim to the hyper-reality of the Spectacle.