
SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

[–] dgerard@awful.systems 33 points 4 months ago* (last edited 4 months ago) (14 children)

I am well acquainted with this genre of article and I ain't reading all that. Not bothering to be involved with this example was the obviously correct decision, even if Trace kept nagging after I'd already said "no thank you" (that famous rationalist grasp of consent).

This in the companion article caught my eye:

While I am not personally a rationalist,

Trace, I have some unfortunate news for you.

[–] pja@awful.systems 22 points 4 months ago* (last edited 4 months ago) (13 children)

I regret to inform you that Trace is hate-reading awful.systems too & has posted this comment on their Twitter.

You’d think these people would have learned by now that there’s no upside in them spending their precious time on this earth obsessing over why a group of people don’t like them, but nevertheless here they are: drawn like moths to the flame.

[–] barsquid@lemmy.world 14 points 4 months ago (7 children)

There would be an upside if they could magically acquire some self-awareness, and reflect on why a whole group thinks their ideas are idiotic. Alas,

[–] Soyweiser@awful.systems 16 points 4 months ago* (last edited 4 months ago) (2 children)

Yeah, see also his denouncement of Roko's Basilisk (ctrl-f the page). We know it wasn't that important; the funny part was that it was a dumb rehash of Pascal's wager, and that at the time Yud took it very seriously.

Wood also doesn't seem to link to the actual RationalWiki article, which makes clear that Yud doesn't really believe in it (probably). It also mentions just how few people were worried about it (though above the 4% lizardman constant, so cause for concern, if they took their own ideas and mental health seriously). And every now and then you do find a person online who takes the idea seriously and worries about it, which is a bit of a concern. So, oddly, they should take it more seriously, but only because it wrecks a small percentage of minds.

It is weird not to mention Yud's freakout:

Listen to me very closely, you idiot.

YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT SUPERINTELLIGENCES CONSIDERING WHETHER OR NOT TO BLACKMAIL YOU. THAT IS THE ONLY POSSIBLE THING WHICH GIVES THEM A MOTIVE TO FOLLOW THROUGH ON THE BLACKMAIL.

There's an obvious equilibrium to this problem where you engage in all positive acausal trades and ignore all attempts at acausal blackmail. Until we have a better worked-out version of TDT and we can prove that formally, it should just be OBVIOUS that you DO NOT THINK ABOUT DISTANT BLACKMAILERS in SUFFICIENT DETAIL that they have a motive toACTUALLY[sic] BLACKMAIL YOU.

And then pretend this was just a blip and nothing more. Mf'er acted like he was in a Stross novel.

(Also, right after he fails to clearly share the history of Roko's Basilisk, which is what we sneer at, I came across this sentence: "then cites his pet article on Roko’s Basilisk directly while giggling about how mad it made Yudkowsky fans." Lol, no self-awareness there, Wood.)

[–] barsquid@lemmy.world 20 points 4 months ago (1 children)

Roko's basilisk is one of my favorite things because of the combination of how stupid it is and also how utterly panicked they all were. Desperately imagining magical communication across time and space with their dumb paperclip demon and panicking.

Why don't they just believe in a deity if they want one so bad?

[–] YouKnowWhoTheFuckIAM@awful.systems 17 points 4 months ago* (last edited 4 months ago) (1 children)

It’s funny because you will hear over and over again from them online, in almost rank-and-file prose, about how it was all a big storm in the collective teacup, and then some time later run across yet another story of a real-life, non-anonymous person who was freaked the fuck out for a good period of time, as were some large portion of their friends.

[–] Soyweiser@awful.systems 14 points 4 months ago

And it still randomly freaks people out to this day; it clearly isn't a storm in a teacup for some people. And it is quite easily countered, but the counter also introduces ideas that undercut the whole FOOM bs, so it is clear why they'd all rather sweep this under the rug, no matter the mental health cost to any Rationalists or would-be Rationalists.

[–] ibt3321@lemmy.blahaj.zone 17 points 4 months ago* (last edited 4 months ago) (1 children)

HOW does he seriously use the phrase 'acausal blackmail'? I assumed the majority of 'acausal' + noun combinations were just jokes from people on here, but apparently not.

[–] Soyweiser@awful.systems 11 points 4 months ago* (last edited 4 months ago)

Ikr, literally a plot point from Charles Stross's Singularity Sky (2003) series, and they take it seriously. (Which fits a pattern. CW: sexual abuse

spoiler: Yud's math pets thing also has a similarity to the horrible sadist torture scene in the last book of Peter Watts's Rifters series (2004), in which a woman gets abused more horribly if she answers questions wrong (with the added plot twist that the torturer isn't as smart as he thinks he is and is partially wrong about the questions he 'rewards', which could be some 4d chess move, but doubtful). Anyway, not the greatest read. Esp 2 decades later, when a lot of the weird science fiction terms used in the books have taken a very alt-right/dark enlightenment turn, which makes me wonder if they stole that.)
