blakestacey

joined 1 year ago
[–] blakestacey@awful.systems 11 points 2 months ago (2 children)

and hot young singles in your area have a bridge in Brooklyn to sell

on the blockchain

[–] blakestacey@awful.systems 8 points 2 months ago (2 children)

So many techbros have decided to scrape the fediverse that they all blur together now... I was able to dig up this:

"I hear I’m supposed to experiment with tech not people, and must not use data for unintended purposes without explicit consent. That all sounds great. But what does it mean?" He whined.

[–] blakestacey@awful.systems 29 points 2 months ago (27 children)

When you don’t have anything new, use brute force. Just as GPT-4 was eight instances of GPT-3 in a trenchcoat, o1 is GPT-4o, but running each query multiple times and evaluating the results. o1 even says “Thought for [number] seconds” so you can be impressed by how hard it’s “thinking.”

This “thinking” costs money. o1 increases accuracy by taking much longer for everything, so it costs developers three to four times as much per token as GPT-4o.

Because the industry wasn't doing enough climate damage already.... Let's quadruple the carbon we shit into the air!

[–] blakestacey@awful.systems 10 points 2 months ago* (last edited 2 months ago)

I have to admit that I wasn't expecting LinkedIn to become a wretched hive of "quantum" bullshit, but hey, here we are.

Tangentially: Schrödinger is a one-man argument for not naming ideas after people.

[–] blakestacey@awful.systems 7 points 2 months ago

(smashes imaginary intercom button) "Who is this 'some guy'? Find him and find out what he knows!!"

[–] blakestacey@awful.systems 5 points 2 months ago

Happy belated birthday!

[–] blakestacey@awful.systems 9 points 2 months ago (13 children)

Elon Musk in the replies:

Have you read Asimov’s Foundation books?

They pose an interesting question: if you knew a dark age was coming, what actions would you take to preserve knowledge and minimize the length of the dark age?

For humanity, a city on Mars. Terminus.

Isaac Asimov:

I'm a New Deal Democrat who believes in soaking the rich, even when I'm the rich.

(From a 1968 letter quoted in Yours, Isaac Asimov.)

[–] blakestacey@awful.systems 5 points 2 months ago* (last edited 2 months ago) (14 children)

Lex Fridman: "I'm going to do a deep dive on Ancient Rome. Turns out it was a land of contrasts"

I'm doing a podcast episode on the Roman Empire.

It's a deep dive into military conquest, technology, politics, economics, religion... from its rise to its collapse (in the west & the east).

History really does put everything in perspective.

(xcancel)

[–] blakestacey@awful.systems 15 points 2 months ago (1 children)

... "Coming of Age" also, oddly, describes another form of novel cognitive dissonance; encountering people who did not think Eliezer was the most intelligent person they had ever met, and then, more shocking yet, personally encountering people who seemed possibly more intelligent than himself.

The latter link is to "Competent Elites", a.k.a. "Yud fails to recognize that cocaine is a helluva drug".

I've met Jurvetson a few times. After the first I texted a friend: “Every other time I’ve met a VC I walked away thinking ‘Wow, I and all my friends are smarter than you.’ This time it was ‘Wow, you are smarter than me and all my friends.’”

Uh-huh.

Quick, to the Bat-Wikipedia:

On November 13, 2017, Jurvetson stepped down from his role at DFJ Venture Capital in addition to taking leave from the boards of SpaceX and Tesla following an internal DFJ investigation into allegations of sexual harassment.

Not smart enough to keep his dick in his pants, apparently.

Then, from 2006 to 2009, in what can be interpreted as an attempt to discover how his younger self made such a terrible mistake, and to avoid doing so again, Eliezer writes the 600,000 words of his Sequences, by blogging “almost daily, on the subjects of epistemology, language, cognitive biases, decision-making, quantum mechanics, metaethics, and artificial intelligence”

Or, in short, cult shit.

Between his Sequences and his Harry Potter fanfic, come 2015, Eliezer had promulgated his personal framework of rational thought — which was, as he put it, “about forming true beliefs and making decisions that help you win” — with extraordinary success. All the pieces seemed in place to foster a cohort of bright people who would overcome their unconscious biases, adjust their mindsets to consistently distinguish truth from falseness, and become effective thinkers who could build a better world ... and maybe save it from the scourge of runaway AI.

Which is why what happened next, explored in tomorrow’s chapter — the demons, the cults, the hells, the suicides — was, and is, so shocking.

Or not. See above, RE: cult shit.

[–] blakestacey@awful.systems 11 points 2 months ago

Something tells me they’re not just slapping ChatGPT on the school computers and telling kids to go at it; surely one of the parents would have been up-to-date enough to know it’s a scam otherwise.

If people with money had that much good sense, the world would be a well-nigh unfathomably different place....
