blakestacey

joined 1 year ago
[–] blakestacey@awful.systems 3 points 5 months ago* (last edited 5 months ago)

In the first Foundation story, there's a weird mention of applying symbolic logic to human language that comes from nowhere and goes nowhere. Campbell insisted upon it because

he felt in our discussions that symbolic logic, further developed, would so clear up the mysteries of the human mind as to leave human actions predictable. The reason human beings are so unpredictable was we didn't really know what they were saying and thinking because language is generally used obscurely. So what we needed was something that would unobscure the language and leave everything clear.

Clear being a fortuitous choice of wording on Asimov's part there, given, well.

TESCREAL and Scientology don't just share methodology; they both descend directly from "Golden Age" science fiction. In this essay I will

[–] blakestacey@awful.systems 2 points 5 months ago

"And a waifu is only a waifu, but a good cigar is a smoke."

[–] blakestacey@awful.systems 2 points 5 months ago (4 children)

Mastodon has Reply Guys. Lemmy has Cater To Me Whilst I Am Literally, Not Figuratively, Taking a Shit Guys.

[–] blakestacey@awful.systems 2 points 5 months ago

banned for obnoxious not-pology

[–] blakestacey@awful.systems 1 points 5 months ago (4 children)

If we trace one ancestry path back to science-fiction fandom, well, there's John W. Campbell.

[–] blakestacey@awful.systems 2 points 5 months ago (1 children)

I'm trying to think of a polite way to say "in short, no" and "the linked tweet having 'effectivealtruism' in it twice should have been a clue", because I'm not that mean, but I probably need more coffee too.

[–] blakestacey@awful.systems 1 points 5 months ago

"Pronouns" with a hard R

[–] blakestacey@awful.systems 1 points 5 months ago

Bio people here are poorly informed. Just in general some of the presentations are factually incorrect

B-but rationalists are experts at covalent bonds

Also meeting people.... as a woman I have never felt as ignored and disrespected as I have in some instances the pa...

I'm sure the feedback becomes more positive in the cut-off part, no doubt about it

[–] blakestacey@awful.systems 1 points 5 months ago

It's still hard to beat the dck pck

[–] blakestacey@awful.systems 1 points 5 months ago (1 children)

Quoth Yud:

There is a way of seeing the world where you look at a blade of grass and see "a solar-powered self-replicating factory". I've never figured out how to explain how hard a superintelligence can hit us, to someone who does not see from that angle. It's not just the one fact.

It's almost as if basing an entire worldview upon a literal reading of metaphors in grade-school science books and whatever Carl Sagan said just after "these edibles ain't shit" is, I dunno, bad?

[–] blakestacey@awful.systems 1 points 5 months ago (6 children)

Carl T. Bergstrom, 13 February 2023:

Meta. OpenAI. Google.

Your AI chatbot is not hallucinating.

It's bullshitting.

It's bullshitting, because that's what you designed it to do. You designed it to generate seemingly authoritative text "with a blatant disregard for truth and logical coherence," i.e., to bullshit.

Me, 2 February 2023:

I confess myself a bit baffled by people who act like "how to interact with ChatGPT" is a useful classroom skill. It's not a word processor or a spreadsheet; it doesn't have documented, well-defined, reproducible behaviors. No, it's not remotely analogous to a calculator. Calculators are built to be right, not to sound convincing. It's a bullshit fountain. Stop acting like you're a waterbender making emotive shapes by expressing your will in the medium of liquid bullshit. The lesson one needs about a bullshit fountain is not to swim in it.

[–] blakestacey@awful.systems 1 points 5 months ago* (last edited 5 months ago) (1 children)

Vitalik Buterin:

A few months ago I was looking into Lojban and trying to figure out how I would translate "charge" (as in, "my laptop is charging") and the best I could come up with is "pinxe lo dikca" ("drink electricity")

So... if you think LLMs don't drink, that's your imagination, not mine.

My parents said that the car was "thirsty" if the gas tank was nearly empty, therefore gas cars are sentient and electric vehicles are murder, checkmate atheists

That was in the replies to this, which Yud retweeted:

Hats off to Isaac Asimov for correctly predicting exactly this 75 years ago in I, Robot: Some people won't accept anything that doesn't eat, drink, and eventually die as being sentient.

Um, well, actually, mortality was a precondition of humanity, not of sentience, and that was in "The Bicentennial Man", not I, Robot. It's also presented textually as correct....

In the I, Robot story collection, Stephen Byerley eats, drinks and dies, and none of this is proof that he was human and not a robot.
