this post was submitted on 07 Jul 2024

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

[–] blakestacey@awful.systems 10 points 4 months ago (9 children)

This part of Ed Zitron's latest post jumped out at me:

While Acemoglu has some positive things to say — for example, that AI models could be trained to help scientists conceive of and test new materials (which happened last year) — his general verdict is quite harsh: that using generative AI and "too much automation too soon could create bottlenecks and other problems for firms that no longer have the flexibility and trouble-shooting capabilities that human capital provides."

Click, click, search... Oh:

The recent report from a group of scientists at Google who employ a combination of existing data sets, high-throughput density functional theory calculations of structural stability, and the tools of artificial intelligence and machine learning (AI/ML) to propose new compounds is an exciting advance. We examine the claims of this work here, unfortunately finding scant evidence for compounds that fulfill the trifecta of novelty, credibility, and utility.
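
For anyone outside the field: the “high-throughput density functional theory calculations of structural stability” step mostly boils down to asking how far each candidate sits above the convex hull of already-known phases. Here’s a minimal sketch of that kind of screen, assuming pymatgen and completely made-up energies rather than anything from the actual papers:

```python
# Sketch of an energy-above-hull stability screen, roughly what
# "high-throughput DFT calculations of structural stability" cash out to.
# Compositions and energies below are invented for illustration only.
from pymatgen.core import Composition
from pymatgen.analysis.phase_diagram import PhaseDiagram, PDEntry

entries = [
    PDEntry(Composition("Li"), -1.9),      # elemental references (made-up energies, eV)
    PDEntry(Composition("O2"), -9.8),
    PDEntry(Composition("Li2O"), -14.2),   # known phase
    PDEntry(Composition("Li3O2"), -17.0),  # hypothetical "new" candidate to test
]

pd = PhaseDiagram(entries)

for entry in entries:
    e_hull = pd.get_e_above_hull(entry)  # eV/atom above the convex hull
    verdict = "looks stable" if e_hull < 0.025 else "probably not stable"
    print(f"{entry.composition.reduced_formula}: {e_hull:.3f} eV/atom -> {verdict}")
```

Passing a screen like this on paper says nothing about whether a compound is actually novel, synthesizable, or correctly described, which is the “novelty, credibility, and utility” trifecta the rebuttal finds missing.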

[–] froztbyte@awful.systems 8 points 4 months ago (8 children)

The materials (and some subtle walkbacks on PR) shit has featured here before too iirc

[–] blakestacey@awful.systems 9 points 4 months ago (5 children)

I think the only one discussed in depth was a different paper (here), but all these things blur together.

[–] froztbyte@awful.systems 8 points 4 months ago (1 children)

ah, my mistake. I guess it was another total bullshit google materials project. easy to confuse those, just like their 734 chat services

[–] skillissuer@discuss.tchncs.de 9 points 4 months ago* (last edited 4 months ago) (1 children)

different paper, same line of work. the A-lab paper has two people from DeepMind (Cubuk and Merchant) as authors who were also authors of the other paper. these two papers were published back to back in Nature for some reason. the rebuttals come from different authors tho, and happen at different stages (but point at exactly the same errors: excessively low symmetry, unlikely ordering of similar ions/metals, and not looking for disordered structures)

so in retrospect it's even dumber, because they were called on their bullshit twice in the space of three months, in the form of a full paper and a preprint, and all it produced was a weak attempt at damage control
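
the low-symmetry red flag is at least easy to check mechanically. a rough sketch of what that looks like with pymatgen's symmetry analyzer, where the folder of CIFs and the cutoff are my own illustrative assumptions, not anything from either rebuttal:

```python
# Rough sketch: flag predicted structures that land in very low-symmetry
# space groups, the kind of red flag both rebuttals point at.
# The input folder and the cutoff are illustrative assumptions.
import glob

from pymatgen.core import Structure
from pymatgen.symmetry.analyzer import SpacegroupAnalyzer

LOW_SYMMETRY_CUTOFF = 15  # space groups 1-15 are triclinic or monoclinic

for path in glob.glob("predicted_structures/*.cif"):  # hypothetical folder of predictions
    structure = Structure.from_file(path)
    sga = SpacegroupAnalyzer(structure, symprec=0.1)
    spg = sga.get_space_group_number()
    if spg <= LOW_SYMMETRY_CUTOFF:
        print(f"{path}: space group {spg} ({sga.get_space_group_symbol()}) - suspiciously low symmetry")
```

checking for the disordered alternatives is the harder part; a symmetry pass like this only catches ordered structures being reported where disorder is more plausible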

[–] blakestacey@awful.systems 5 points 4 months ago (1 children)
[–] skillissuer@discuss.tchncs.de 5 points 4 months ago (1 children)

sorry, had a brain fart, edited now

note to self: don't post after midnight

[–] blakestacey@awful.systems 5 points 4 months ago

No worries. I figured it was a typo for "excoriated" or something like that.
