blakestacey

joined 1 year ago
[–] blakestacey@awful.systems 7 points 2 days ago

"So, professor sir, are you OK with psychologically torturing Black people, or do you just not care?"

[–] blakestacey@awful.systems 1 point 3 days ago

Superficially, it looks like he's making a testable prediction. But that "prediction" is a number from a bullshit calculation (or maybe two or three different, mutually inconsistent bullshit calculations — it's hard to be sure). So if someone wasted their time and did the experiment, he'd handwave away the null result by fiddling the input bullshit.

[–] blakestacey@awful.systems 6 points 3 days ago (7 children)

I will try to have some more comments about the physics when I have time and energy. In the meanwhile:

Entropy in thermodynamics is not actually a hard concept. It's the ratio of the size of a heat flow to the temperature at which that flow is happening. (So, joules per kelvin, if you're using SI units.) See episodes 46 and 47 of The Mechanical Universe for the old-school PBS treatment of the story. The last time I taught thermodynamics for undergraduates, we used Finn's Thermal Physics, for the sophisticated reason that the previous professor used Finn's Thermal Physics.
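A toy calculation (my numbers, not from the thread) makes the "joules per kelvin" point concrete:

```python
# Entropy change for a reversible heat flow: delta_S = Q / T (joules per kelvin).
# Toy numbers (mine): melting 10 g of ice at 273.15 K, latent heat ~334 J/g.
Q = 10 * 334.0    # heat flowing into the ice, in joules
T = 273.15        # temperature at which the flow happens, in kelvin
delta_S = Q / T   # entropy change, in J/K
print(f"{delta_S:.2f} J/K")  # about 12.23 J/K
```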

Entropy in information theory is also not actually that hard of a concept. It's a numerical measure of how spread-out a probability distribution is.
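A minimal sketch of that spread-out-ness (the Shannon entropy, in bits; my example, not the commenter's):

```python
import math

def shannon_entropy(p):
    """Shannon entropy, in bits, of a probability distribution p."""
    return sum(x * math.log2(1 / x) for x in p if x > 0)

# A distribution concentrated on one outcome carries no surprise:
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits
# A uniform distribution over four outcomes is maximally spread out:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```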

It's relating the two meanings that is tricky and subtle. The big picture is something like this: A microstate is a complete specification of the positions and momenta of all the pieces of a system. We can consider a probability distribution over all the possible microstates, and then do information theory to that. This bridges the two definitions, if we are very careful about it. One thing that trips people up (particularly if they got poisoned by pop-science oversimplifications about "disorder" first) is forgetting the momentum part. We have to consider probabilities, not just for where the pieces are, but also for how they are moving. I suspect that this is among Vopson's many problems. Either he doesn't get it, or he's not capable of writing clearly enough to explain it.
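For reference, the standard bridge (the textbook Gibbs entropy, my gloss, nothing of Vopson's) is:

```latex
% Gibbs entropy over microstates i (positions AND momenta),
% with probabilities p_i and Boltzmann's constant k_B:
S = -k_B \sum_i p_i \ln p_i
% For a uniform distribution over \Omega microstates this reduces to
% Boltzmann's S = k_B \ln \Omega, and one bit of information-theoretic
% entropy corresponds to k_B \ln 2 joules per kelvin.
```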

So these two were published in American Institute of Physics Advances, which looks like a serious journal about physics. Does anyone know about it? It occupies a space where I can’t easily find any obvious issues, but I also can’t find anyone saying “ye this is legit”. It claims to be peer-reviewed, and at least isn’t just a place where you dump a PDF and get a DOI in return.

I have never heard of anything important being published there. I think it's the kind of journal where one submits a paper after it has been rejected by one's first and second (and possibly third) choices.

However, after skimming, I can at least say that it doesn’t seem outlandish?

Oh, it's worse than "outlandish". It's nonsensical. He's basically operating at a level of "there's an E in this formula and an E in this other formula, so I will set them equal and declare it revolutionary new physics".

Here's a passage from the second paragraph of the 2023 paper:

The physical entropy of a given system is a measure of all its possible physical microstates compatible with the macrostate, S_Phys. This is a characteristic of the non-information bearing microstates within the system. Assuming the same system, and assuming that one is able to create N information states within the same physical system (for example, by writing digital bits in it), the effect of creating a number of N information states is to form N additional information microstates superimposed onto the existing physical microstates. These additional microstates are information bearing states, and the additional entropy associated with them is called the entropy of information, S_Info. We can now define the total entropy of the system as the sum of the initial physical entropy and the newly created entropy of information, S_tot = S_Phys + S_Info, showing that the information creation increases the entropy of a given system.

wat

Storing a message in a system doesn't make new microstates. How could it? You're just rearranging the pieces to spell out a message — selecting those microstates that are consistent with that message. Choosing from a list of available options doesn't magically add new options to the list.
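The point can be made with a sketch (my illustration): the state space of an 8-bit register is exactly the same size whether or not a message has been written into it.

```python
from itertools import product

# All possible microstates of 8 two-state pieces: 2**8 = 256 of them.
microstates = set(product([0, 1], repeat=8))
print(len(microstates))  # 256

# "Writing a message" selects one arrangement that already exists:
message = (0, 1, 1, 0, 0, 0, 0, 1)
assert message in microstates  # it was on the list all along

# Writing it added nothing; the state space is unchanged:
print(len(microstates))  # still 256
```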

[–] blakestacey@awful.systems 11 points 3 days ago (1 children)

The "simulation hypothesis" is an ego flex for men who want God to look like them.

[–] blakestacey@awful.systems 9 points 3 days ago (2 children)

I sneered that in a blog post last year, as it happens.

[–] blakestacey@awful.systems 17 points 5 days ago* (last edited 5 days ago)

From the Wired story:

As a comparison, Cui cited another analysis that GPTZero ran on Wikipedia earlier this year, which estimated that around one in 20 articles on the site are likely AI-generated—about half the frequency of the posts GPTZero looked at on Substack.

That should be one in 20 new articles, per the story they cite, which is ultimately based on arXiv:2410.08044.

David Skilling, a sports agency CEO who runs the popular soccer newsletter Original Football (over 630,000 subscribers), told WIRED he sees AI as a substitute editor. “I proudly use modern tools for productivity in my businesses,” says Skilling.

Babe wake up, a new insufferable prick just dropped.

Edit to add: There's an interesting example here of a dubious claim being laundered into truthiness. That arXiv preprint says this in the conclusion section.

Shao et al. (2024) have even designed a retrieval-based LLM workflow for writing Wikipedia-like articles and gathered perspectives from experienced Wikipedia editors on using it—the editors unanimously agreed that it would be helpful in their pre-writing stage.

But if we dig up arXiv:2402.14207, we find that the "unanimous" agreement depends upon lumping together "somewhat" and "strongly agree" on their Likert scale. Moreover, this grand claim rests upon a survey of a grand total of ten people. Ten people, we hasten to add, who agreed to the study in the first place, practically guaranteeing a selection bias against those Wikipedians who find "AI" morally repugnant.
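To see how easily that kind of "unanimity" is manufactured, here is a hypothetical n=10 split (illustrative numbers of mine; the comment doesn't give the study's raw counts):

```python
from collections import Counter

# Hypothetical Likert responses from ten volunteers (numbers invented):
responses = ["strongly agree"] * 3 + ["somewhat agree"] * 7
counts = Counter(responses)

# Strictly, agreement is nowhere near unanimous:
print(counts["strongly agree"] == len(responses))  # False: 3 of 10

# Lump "somewhat" in with "strongly" and unanimity appears:
agree = counts["strongly agree"] + counts["somewhat agree"]
print(agree == len(responses))  # True: now "the editors unanimously agreed"
```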

[–] blakestacey@awful.systems 8 points 5 days ago (6 children)

shocked, shocked to find that gambling is going on in here.gif

[–] blakestacey@awful.systems 15 points 5 days ago

Breaking news: "AI-generated poetry is indistinguishable from human-written poetry and is rated more favorably"!

Or, you know, not.

[–] blakestacey@awful.systems 10 points 6 days ago* (last edited 6 days ago)

If you find yourself saying

There isn't a single good term in English for people who are post-pubertal but below the legal age of consent or majority

you may already be morally diseased.

[–] blakestacey@awful.systems 10 points 6 days ago

My own final project was a parody of the IMDb that was "what if the IMDb was about books instead of movies", except that the user reviews told stories about people who turned out to have all gone to high school together before scattering around the world, and reading them in the right sequence unlocked a finale in which they reunited for a New Year's party and their world dissolved so that their author could repurpose them for other stories.

[–] blakestacey@awful.systems 13 points 1 week ago (4 children)

Senior year of college, I took an elective seminar on interactive fiction. For the final project, one of my classmates wrote a program that scraped a LiveJournal and converted it into a text adventure game.

[–] blakestacey@awful.systems 18 points 1 week ago

"I was somewhere in the middle of your mother last night, Trebek!"

 

So, after the Routledge thing, I got to wondering. I've had experience with a few noble projects that fizzled for lacking a clear goal, or at least a clear breathing point where we could say, "Having done this, we're in a good place. Stage One complete." And a project driven by volunteer idealism — the usual mix of spite and whimsy — can splutter out if it requires more than one person to be making it a high/top priority. If half a dozen people all like the idea but each of them ranks it 5th or 6th among things to do, academic life will ensure that it never gets done.

With all that in mind, here is where my thinking went. I provisionally tagged the idea "Harmonice Mundi Books", because Kepler writing about the harmony of the world at the outbreak of the Thirty Years' War is particularly resonant to me. It would be a micro-publisher with the tagline "By scholars, for scholars; by humans, for humans."

The Stage One goal would be six books. At least one would be by a "big name" (e.g., someone with a Wikipedia article that they didn't write themselves). At least one would be suitable for undergraduates: a supplemental text for a standard course, or even a drop-in replacement for one of those books that's so famous it's known by the author's last name. The idea is to be both reputable and useful in a readily apparent way.

Why six books? I want the authors to get paid, and I looked at the standard flat fee that a major publisher paid me for a monograph. Multiplying a figure in that range by 6 is a budget that I can imagine cobbling together. Not to make any binding promises here, but I think that authors should also get a chunk of the proceeds (printing will likely be on demand), which would be a deal that I didn't get for my monograph.

Possible entries in the Harmonice Mundi series:

  • anything you were going to send to a publisher that has since made a deal with the LLM devil

  • doctoral theses

  • lecture notes (I find these often fall short of being full-fledged textbooks, chiefly by lacking exercises, but perhaps a stipend is motivation to go the extra km)

  • collections of existing long-form online writing, like the science blogs of yore

  • text versions of video essays — zany, perhaps, but the intense essayists already have manual subtitles, so maybe one would be willing to take the next, highly experimental step

Skills necessary for this project to take off:

  • subject-matter editor(s) — making the call about what books to accept, in case we end up with the problem we'd like to have, i.e., too many books; and supervising the revision of drafts

  • production editing — everything from the final spellcheck to a print-ready PDF

  • website person — the site could practically be static, but some kind of storefront integration would be necessary (and, e.g., rigging the server to provide LLM scrapers with garbled material would be pleasingly Puckish)

  • visuals — logo, website design, book covers, etc. We could have all the cover art be pictures of flowers that I have taken around town, but we probably shouldn't.

  • publicity — getting authors to hear about us, and getting our books into libraries and in front of reviewers

Anyway, I have just barely started looking into all the various pieces here. An unknown but probably large amount of volunteer enthusiasm will be needed to get the ball rolling. And cultures will have to be juggled. I know that there are some tasks I am willing to do pro bono because they are part of advancing the scientific community: I am already getting a salary, and nobody else is profiting. I suspect that other academics have made similar mental calculations (e.g., about which journals to peer review for). But I am not going to go around asking creative folks to work "for exposure".

 

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

Last week's thread

(Semi-obligatory thanks to @dgerard for starting this)

 



So, here I am, listening to the Cosmos soundtrack and strangely not stoned. And I realize that it's been a while since we've had a random music recommendation thread. What's the musical haps in your worlds, friends?

 


Bumping this up from the comments.

 



Many magazines have closed their submission portals because people thought they could send in AI-written stories.

For years I would tell people who wanted to be writers that the only way to be a writer was to write your own stories because elves would not come in the night and do it for you.

With AI, drunk plagiaristic elves who cannot actually write and would not know an idea or a sentence if it bit their little elvish arses will actually turn up and write something unpublishable for you. This is not a good thing.

 

[Eupalinos of Megara appears out of a time portal from ancient Ionia] Wow, you guys must be really good at digging tunnels by now, right?

 

a lesswrong: 47-minute read extolling the ambition and insights of Christopher Langan's "CTMU"

a science blogger back in the day: not so impressed

[I]t’s sort of like saying “I’m going to fix the sink in my bathroom by replacing the leaky washer with the color blue”, or “I’m going to fly to the moon by correctly spelling my left leg.”

Langan, incidentally, is a 9/11 truther, a believer in the "white genocide" conspiracy theory and much more besides.
