UnseriousAcademic

joined 4 months ago
 

Hello everybody, after a lengthy delay, my talk at the University of Sydney about Neoreaction and the ways I tried to map its various communities is now available.

Please ignore the coughing. My keynote slides were very dusty.

Huh, I never imagined Wikipedia would have such a thing. Thanks!

 

I want to make the case to my employer that we should drop Twitter/X as a promotional channel. I could go about drawing together the various examples of disinfo spreading, the reinstatement of CSAM posters, the rise of content inciting violence, etc., but I thought I'd check whether someone has already been tracking this. The sooner I can pull the info together the better, but I don't have time right now to compile it myself.

Anyone know if there's a site, wiki, resource, thread etc that could set me up?

[–] UnseriousAcademic@awful.systems 14 points 1 month ago (2 children)

Money represents the aggregate value of the intersection between human labour, ingenuity and scarce, finite resources. Human lives are routinely rendered down, ground up and consumed by the drive to generate this representative value. Entire ways of living, forms of self-perception and our understanding of what makes a human worthy of existing are inextricably wrapped up in this value-generating process.

As a society we have declared that these people are best placed to decide what to do with that value. They chose anime.

[–] UnseriousAcademic@awful.systems 18 points 2 months ago* (last edited 2 months ago)

Good to see some reporting that continues to gloss over Srinivasan's obsession with culture-war "wokeness" and writing that veers a little too close to the idea of purging undesirables. It also misses my favourite: the open musing that the next step might be to corrupt the police force through financial incentives so they can have them as their own private security force to take over San Francisco.

To be fair I've spent an inordinate amount of time looking at stuff on the Internet that doesn't interest me. Especially since my workplace moved their employee training online.

[–] UnseriousAcademic@awful.systems 35 points 2 months ago (2 children)

Man, I feel this, particularly the sudden shutting down of data access because all the platforms want OpenAI money. I spent three years building a tool that pulled follower-relation data from Twitter and exponentially crawled its way outwards from a few seed accounts to millions of users. Using that data it was able to make a compressed summary network, identify community structures, name the communities based on words in user profiles, and then use sampled tweet data to tell us the extent to which different communities interacted.
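A rough sketch of that kind of pipeline, for anyone curious what it looks like in practice. This is purely illustrative: the account names, bios and the greedy-modularity algorithm are my own assumptions standing in for whatever the real tool crawled and used.

```python
# Hypothetical sketch: build a follower graph, detect communities, and
# label each community by the most frequent words in members' profile bios.
from collections import Counter

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy follower-relation data (account -> accounts it follows); the real
# tool crawled this outwards from seed accounts via the Twitter API.
follows = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "carol"],
    "carol": ["alice"],
    "dave": ["erin"],
    "erin": ["dave", "frank"],
    "frank": ["erin"],
}

# Toy profile bios, used to name the detected communities.
bios = {
    "alice": "sociology researcher data",
    "bob": "sociology lecturer",
    "carol": "sociology phd data",
    "dave": "crypto trader bitcoin",
    "erin": "bitcoin maximalist",
    "frank": "crypto bitcoin nft",
}

# Build an undirected graph from the follower relations.
G = nx.Graph()
for user, followed in follows.items():
    G.add_edges_from((user, f) for f in followed)

# Community detection (greedy modularity maximisation here, as a
# stand-in for whatever algorithm the original project used).
communities = list(greedy_modularity_communities(G))

# Label each community by the two most common words in member bios.
for i, members in enumerate(communities):
    words = Counter(w for m in members for w in bios.get(m, "").split())
    label = ", ".join(w for w, _ in words.most_common(2))
    print(f"community {i}: {sorted(members)} -> {label}")
```

On this toy data the two clusters separate cleanly; at the millions-of-users scale described above you would obviously need batched crawling, a graph database, and a far more scalable community-detection method.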

I spent eight months in ethics committees getting approval to do it and got a prototype working, but rather than just publish, I wanted to make it accessible to the academic community, so I spent even more time building an interface, making it user-friendly, and improving performance and stability.

I wanted to ensure that when we published our results I could also say "here is this method we've developed, and here you can test it and use it too for free, even if you don't know how to code". Some people at my institution wanted me to explore commercialising it, but I always intended to go open source. I'm not a professional developer by any means, so the project was always going to be a janky academic thing, but it worked for our purposes and offered a new way of working with social media data to ask questions that couldn't be answered before.

Then the API got put behind a $48K-a-month paywall and the project was dead. Then everywhere else started shutting their doors too. I don't do social media research anymore.

It's truly a wonder where these topics will take you.

[–] UnseriousAcademic@awful.systems 7 points 2 months ago (3 children)

These people aren't real nerds.

[–] UnseriousAcademic@awful.systems 3 points 2 months ago (2 children)

As with ELIZA, where we interpret there as being humanity behind it? Or that ultimately "humans demanding we leave stuff to humans because those things are human" is OK?

 

The benefits of crypto are self evident, thus it is necessary to build an elaborate faux education system to demonstrate them.

I'm sure there will also be some Network Fascism in there for good measure.

[–] UnseriousAcademic@awful.systems 4 points 2 months ago (1 children)

To be fair the more imaginative ones have entire educational models built around teaching the societally transformative power of bitcoin.

[–] UnseriousAcademic@awful.systems 15 points 2 months ago (1 children)

Promptfondler sounds like an Aphex Twin song title.

the truth in the joke is that you're a huge nerd

Oh absolutely. Yes, I think part of my fascination with all of this is that I could quite easily have gone the tech-bro hype-train route. I'm naturally very good at getting into the weeds of tech and understanding how it works. I love systems (I love factory, strategy and logistics games), and I love learning techy skills purely to see how things work. I taught myself to code just because the primary software for a particular form of qualitative analysis annoyed me. I feel I'm a prime candidate for this whole world.

But at the same time I really dislike the impoverished viewpoint that comes with existing only in that space. There are just some things that don't fit that mode of thought. I also don't have ultimate faith in science and tech, probably because the social sciences captured me at an early age, but also because I have an annoying habit of never being comfortable with what I think, so I'm constantly reflecting and rethinking, which doesn't gel well with the tech-bro hype train. That's why I embrace the moniker of "Luddite with an IDE". It captures most of it!

 

Revered friends. I wrote a thing. Mainly because I had a stack of stuff on Joseph Weizenbaum on tap and the AI classroom thing was stuck in my head. I don't know if it's good, but it's certainly written.

[–] UnseriousAcademic@awful.systems 34 points 2 months ago (10 children)

The "learning facilitators" they mention are the key to understanding all of this. They need them to actually maintain discipline and ensure the kids engage with the AI, so they still need humans in the room. But roles that were once teachers have been redefined as "learning facilitators". Apparently former teachers have rejoined the school in these new roles.

Like a lot of automation, the main selling point is deskilling roles, reducing pay, making people more easily replaceable (you don't need a teaching qualification to be a "learning facilitator" to the AI) and producing a worse service that is just good enough if it's wrapped in hard-to-verify claims and assumptions about what education actually is. Of course, it also means you get a new middleman parasite siphoning off funds that used to flow to staff.

 

With Yarvin renewing interest in Urbit I was reminded of this paper that focuses on Urbit as a representation of the politics of "exit". It's free/open access if anyone is interested.

From the abstract...

"This paper examines the impact of neoreactionary (NRx) thinking – that of Curtis Yarvin, Nick Land, Peter Thiel and Patri Friedman in particular – on contemporary political debates manifest in ‘architectures of exit’... While technological programmes such as Urbit may never ultimately succeed, we argue that these, and other speculative investments such as ‘seasteading’, reflect broader post-neoliberal NRx imaginaries that were, perhaps, prefigured a quarter of a century ago in The Sovereign Individual."

 

Hello all. People were very kind when I originally posted the start of this series. I've refrained from spamming you with every part but I thought I'd post to say the very final installment is done.

I got a bit weird with it this time, as I felt like I had an infinite amount to say, all of which only barely got to the underlying point I was trying to make. I also cut so much of what I wrote; it's ridiculous.

Anyway, now that the series is done I'm going to move on to smaller, discrete pieces as I work on my book about tech culture's propensity for far-right politics. I'll be dropping interesting stuff I find, and examples of Right Libertarians saying ridiculous things, so follow along if that's your jam.

 

The cost of simply retrieving an answer from the Web is infinitely smaller than the cost of generating a new one.

Great interview with Sasha Luccioni from Hugging Face on all the ways that using generative AI for everything is both a) hugely costly compared to existing methods, and b) insane.

 

Seeing a sudden surge in interest in the "Tech Right", as they're being dubbed. Often the focus is on business motivations like tax breaks, but I think there's more to it. The narrative that Silicon Valley is a bunch of tech hippies was sown early on, particularly by Stewart Brand and his ilk, but throughout that period and before it, the intersection between tech and authoritarian politics that favours systems over people is well established.

 

Hello all,

TLDR: I've written some stuff about tech ideology via the TV show Devs. It's all free, no paid subs etc. Would love it if anyone interested wanted to take a look - link is to my blog.

Longer blurb: Firstly if this is severely poor form please tell me to do one, throw tomatoes etc.

I'm a sociologist who focuses on tech culture, particularly elite tech culture and the far right. I started off writing about the piracy cultures of the 2000s and their role in the switch to digital distribution back in 2013. Just by virtue of paying attention to tech ideology, I've now ended up also researching far-right extremism and radicalisation, and I do a lot of data analysis with anti-fascist orgs. I also used to flirt around the Sneerclub post-rat spaces on Reddit and Twitter a few years back too.

Anyway, I've been researching NRx and the wider fashy nature of tech since 2016, but because of "issues" I've not yet got much out into the world. I'm working on a book that more closely examines how the history and ideologies of tech culture play into far-right extremism, and what that might say about the process of radicalisation more generally.

However, because I'm tired of glacial academic publishing timelines, I've also started a research blog called Unserious Academic, and for my first project I use the Alex Garland TV show Devs to illustrate and explore some of the things I know about tech culture. I've put out three parts so far, with a fourth ready for Monday. I'm not looking for paid subs or anything; it's all free. I just figured some people might be interested.

I also desperately need a place where people know what a neoreactionary is so I can more easily complain about them so I'd like to hang around longer term too. Thanks for your time!
