Architeuthis

joined 1 year ago
Architeuthis@awful.systems 1 point 2 months ago (4 children)

but it can make a human way more efficient, and make 1 human able to do the work of 3-5 humans.

Not if you have to proofread everything to spot the entirely convincing-looking but completely inaccurate parts, which is the problem the article cites.

Architeuthis@awful.systems 12 points 2 months ago (2 children)

I’m truly surprised they didn’t cart Yud out for this shit

Self-proclaimed sexual sadist Yud is probably a sex-scandal time bomb and really not ready for prime time. Plus, it's not like he has anything of substance to add on top of Saltman's alarmist bullshit, so it would just remind people how weird, in a bad way, people in this subculture tend to be.

Architeuthis@awful.systems 8 points 2 months ago

I liked how Scalzi brushed it away: basically, your consciousness gets copied to a new body, which kills the old one, and an artifact of the transfer process is that for a few moments you experience yourself as one mind with two bodies. That gives you at least the impression of continuity of self, which is enough for most people to get on with living in the new body and leave the worrying to philosophers.

Architeuthis@awful.systems 10 points 2 months ago* (last edited 2 months ago) (2 children)

I feel like a subset of sci-fi and philosophical meandering really is just increasingly convoluted paths of trying to avoid or come to terms with death as a possibly necessary component of life.

Given rationalism's intellectual heritage, this is absolutely transhumanist cope for people who were counting on some sort of digital personhood upload as a last-resort route to immortality in their lifetimes.

Architeuthis@awful.systems 9 points 2 months ago

You mean swapped out for something that has feelings that can be hurt by mean language? Wouldn't that be something.

Are we putting endocrine systems in LLMs now?

Architeuthis@awful.systems 27 points 2 months ago (7 children)

Archive the weights of the models we build today, so we can rebuild them in the future if we need to recompense them for moral harms.

To be clear, this means that if you treat someone like shit all their life, saying you're sorry to their Sufficiently Similar Simulation™ like a hundred years after they are dead makes it ok.

This must be one of the most blatantly supernatural rationalist Accepted Truths: that if your simulation is of sufficiently high fidelity, you will share some ontology of self with it, which, by the way, is how the basilisk can torture you even if you've been dead for centuries.

Architeuthis@awful.systems 4 points 2 months ago

Seems unnecessary; per the paradox of tolerance, it's trivial to be made to look like the bad guy if you're actively trying to curtail fash influence in public discourse.

Architeuthis@awful.systems 9 points 2 months ago (1 child)

IQ test performance correlates with level of education

I read somewhere that this claim owes a little too much to the inclusion of pathological cases at the lower end of the spectrum: below a certain score, like 85, you're basically intellectually disabled (or even literally brain-dead, or just dead) and academic achievement becomes nonexistent, so the correlation is far more pronounced than if we were comparing educational attainment only across the more functional ranges.

Will post source if I find it.

Architeuthis@awful.systems 5 points 3 months ago

Why? Programmers should be legally liable for what they program.

Too many degrees of separation between a programmer and the final product and how it's used, usually.

Additionally, the decision to deploy an incomplete product or one that contains known flaws is an administrative decision, not a programming one.

Architeuthis@awful.systems 7 points 3 months ago

Yeah, but national socialist power metal isn't a thing in the way NSBM (national socialist black metal) is.

I wonder if it's primarily occultism's Nazi problem metastasizing, foundational dorks like Vikernes notwithstanding.

Architeuthis@awful.systems 11 points 3 months ago

SV Scientology: they can't land you a leading role in a summer blockbuster, but they sure as hell can put you in the running for AI-policy positions of influence, or for the board of a company run by one of their more successful groomings. Their current most popular product is court philosophers for the worst kind of aspiring technofeudalist billionaire.

If this gets them interested, you'll eventually get your chance to do a deep dive into any details of cosmist lore you find relevant.

Architeuthis@awful.systems 13 points 3 months ago* (last edited 3 months ago)

The whole article is sneertastic. Nothing to add, will be sharing.

What you’re dealing with here is a cult. These tech billionaires are building a religion. They believe they’re creating something with AI that’s going to be the most powerful thing that’s ever existed — this omniscient, all-knowing God-like entity — and they see themselves as the prophets of that future.

eugenic TESCREAL screed (an acronym for … oh, never mind).

“Immortality is a key part of this belief system. In that way, it’s very much like a religion. That’s why some people are calling it the Scientology of Silicon Valley.”

Others in San Francisco are calling it “The Nerd Reich.”

“I think these guys see Trump as an empty vessel,” says the well-known exec who’s supporting Harris. “They see him as a way to pursue their political agenda, which is survival of the fittest, no regulation, burn-the-house-down nihilism that lacks any empathy or nuance.”
