this post was submitted on 16 Sep 2024

TechTakes



Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

Last week's thread

(Semi-obligatory thanks to @dgerard for starting this)

[–] self@awful.systems 11 points 2 months ago (1 children)

I’m so sorry. The tech industry is shockingly good at finding people who are susceptible to conversion, like your CEO and DSO, and subjecting them to intense propaganda that unfortunately tends to work. For someone lower in the company, like your DSO, that’s a conference where they’ll be subjected to induction techniques cribbed from cults and MLM schemes. I don’t know what they do to the executives (I imagine it involves a variety of expensive favors, high levels of intoxication, and a variant of the same techniques yud used), but it works instantly and produces someone who can’t be convinced they’ve been fed a lie until it indisputably loses them a ton of money.

[–] mii@awful.systems 8 points 2 months ago (1 children)

Yeah, I assume that's exactly what happened when our CEO went to Silicon Valley to talk to "important people". Despite previously being on a cost-cutting course, he dumped tens of thousands into AI infrastructure that hasn't delivered anything so far, and he's suddenly very happy to send people to AI workshops and conferences.

But I'm only half-surprised. He's somewhat known for making weird decisions after talking to people who want to sell him something. This time it's gonna be totally different, of course.

The "important people" line is a huge part of how the grift works and makes tech media partially responsible. Legitimizing the grift rather than criticizing it makes it easy for sales folks to push "the next big thing." And after all, don't you want to be an important person?