swlabr

joined 1 year ago
[–] swlabr@awful.systems 1 points 10 months ago

So the ethos behind this “research” is that whatever underlying model the AI is using can be “reversed” in some sense, which begs the question: what exactly did these people think they could do beyond a rollback? That they could beg the AI to stop being mean or something?

They were probably inspired by the Blanka creation scene from the Street Fighter movie, where they brainwash some guy by showing him video clips of bad stuff and then switch to showing good stuff.

[–] swlabr@awful.systems 1 points 10 months ago

my reference point for this kind of extension is the one that replaces “social justice” and “sjw” with “skeleton” and “skeleton warrior.” For example:

“sjws are taking over X” -> “skeleton warriors are taking over X”

Actually now that I’m typing this I hope there’s a good one for “woke”.

[–] swlabr@awful.systems 1 points 10 months ago (1 children)

Scott: "Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries. What can I do to help? I know, I'll bring up 9/11"

Empty room: "..."

"And I'll throw out some made up statistics about terrorist attacks and how statistically we were due for a 9/11 and we overreacted by having any response whatsoever. And then I'll show how that's the same as when someone big in EA does something bad."

"..."

"Especially since it's common for people to, after a big scandal, try and push their agenda to improve things. We definitely don't want that."

"..."

"Also, on average there's less SA in STEM, and even though there is still plenty of SA, we don't need to change anything, because averages."

"..."

"Anyway, time for dexy no. 5"

[–] swlabr@awful.systems 0 points 11 months ago

She’s 35 and her sum total understanding of culture and race has led to this. I think she has been effectively a Nazi from way back.

[–] swlabr@awful.systems 1 points 11 months ago (3 children)

Those are asymptotically approaching "Are we the baddies?" levels of self-awareness.

[–] swlabr@awful.systems 1 points 1 year ago* (last edited 1 year ago)

It’s only “due diligence” if it comes from the lesswrong region of the internet; otherwise it’s just sparkling willful ignorance.

[–] swlabr@awful.systems 1 points 1 year ago* (last edited 1 year ago)

A thought I had a while back with google (and any other tech company I guess) with the same emotion that Rorschach has just before Dr. Manhattan disintegrates him: if they’ve already won, aka achieved virtual dominance over how we experience the web, then fine. Fucking break me with your personalised ads. Show me deep cut references from my personal life as emotional leverage. Orchestrate my nightmares with jingles. Show me the logical end of advertising. Just fucking end the human experience entirely since you’ve monetised all our dignity away anyway. Anything less than that is just an insult to my ability to hope.

Anyway yeah I hate this. Big ick

[–] swlabr@awful.systems 1 points 1 year ago* (last edited 1 year ago) (1 children)

Just sneering at a couple of comments, mostly the first.

This situation is best modeled by conflict theory, not mistake theory.

I thought rationalists were supposed to be strict mistake theorists (in their own terms). Seeing someone here essentially say, "Their opposition to us can't be resolved simply, just like how issues in the world are complex and not simple mistakes," when they actually believe (as any good liberal/nrx would) that any societal issue is a simple mistake to be corrected is... weird.

Since that does not seem likely to be the sort of answer you’re looking for though, if I wanted to bridge the inferential gap with a hypothetical Sneer Clubber who genuinely cared about truth, or indeed about anything other than status (which they do not)

This is the finest copium. Pure, uncut. Yes, I'm here to "boost my status" by collecting internet points. Everyone knows my name and keeps track of how cool I am. I don't sleep in a hotel and I own triples of every classic car. Triples makes it safe.

If you think that the conventional way to approach the world is usually right, the rationalist community will seem unusually stupid. We ignore all this free wisdom lying around and try to reinvent the wheel! If the conventional wisdom is correct, then concerns about the world changing, whether due to AI or any other reason, are pointless. If they were important, conventional wisdom would already be talking about them.

Hey, don't try to position yourselves as the plucky underdog/maverick here. That's a culture war move, and you aren't allowed to do that!

/r/SneerClub users are not the sort of entities with whom you can have that conversation. You might as well ask a group of chimpanzees why they're throwing shit at you.

LW talking to us would be more like this: a group of chimpanzees is throwing shit at some LWers. The LWers ask the chimps why. The chimps explain, using everyday language and concepts, that they think the worldview of the LWers is wrong and skewed in weird directions, and that any time someone tries to explain this, the chimps are met with condescension and the accusation that they can't understand the LWers because they are chimps. So in protest, the chimps explain they throw shit. The LWers shrug and say they can't understand what the chimps are saying, because they are chimps and chimps can't speak human language. The chimps continue to throw shit.

I think Sneer Club understands the Less Wrong worldview well enough. They just happen to reject it.

Least wrong LWer.

[–] swlabr@awful.systems 1 points 1 year ago

I constantly experience [the Gell-Mann amnesia] effect on this subreddit; everyone sounds so smart and so knowledgeable until they start talking about the handful of things I know a little bit about (leftism, the arts, philosophy) and they’re so far off the mark — then there’s another post and I’ve forgotten all about it

Bias noted, impact not reduced. Basic rationality failed. These people are so willing to discard their own sense of right and wrong, moral or rational, just to belong in their weird cult. Why is it so hard for these dorks to admit that they don't actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?
