corbin

joined 1 year ago
[–] corbin@awful.systems 5 points 2 days ago

I hear you. You're largely right, and I think it's a perspective shift.

… explain the implications.

I need to write a longer post about the justification (basically, what is a moat anyway?) but without a moat, a computation vendor can't profit from their capital investment. This kills the OpenAI.

[–] corbin@awful.systems 3 points 2 days ago

Fully agreed, but also I was doing multicloud in 2017. Managed k8s is not just a meme; I was able to use the same YAML on four different providers (Azure, Bluemix, DigitalOcean, Google Cloud) to get my N+2. Indeed, YMMV.
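The portability claim is concrete: a vanilla Deployment that touches no provider-specific annotations applies unchanged on any conformant managed k8s. A minimal sketch (the name, labels, and image are my own illustration, not from any of the four providers mentioned):

```yaml
# Hypothetical portable Deployment: only core apps/v1 fields,
# no cloud-specific annotations, node selectors, or storage classes,
# so the same manifest applies on any conformant managed cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: nginx:1.25
          ports:
            - containerPort: 80
```

Provider lock-in creeps in through annotations, LoadBalancer quirks, and storage classes; staying inside the core API surface is what makes the multi-provider trick work.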

[–] corbin@awful.systems 7 points 2 days ago (1 children)

Note that he uses the same strategy as Joe Rogan: invite a smart person on, ask them introductory questions about their research, and then just kind of sit there with a dumb look and fail to understand what they're saying. I gather that it's easy to empathize with and doesn't require listeners to actually learn much since they're essentially sitting in a 101 course with a professor who is reading the curriculum aloud. What puzzles me is why MIT funds this shit.

[–] corbin@awful.systems 11 points 2 days ago (2 children)

To be clear: Cohost did take funding from an anonymous angel, and as a result will not be sharing their source code; quoting from your link:

> Majority control of the cohost source code will be transferred to the person who funded the majority of our operations, as per the terms of the funding documents we signed with them; Colin and I will retain small stakes so we have some input on what happens to it, at their request.
>
> We are unable to make cohost open source. the source code for cohost was the collateral used for the loan from our funder.

Somebody paid a very small amount of money to get a cleanroom implementation of Tumblr and did not mind that they would have to build a community and then burn it to the ground in the process. It turns out that angels are not better people than VCs.

[–] corbin@awful.systems 10 points 3 days ago

It may help to know about Californian neoliberalism. Newsom is running on a faulty ontology; the world is built out of different building blocks for him than for us. We often contrast it with the more dominant flavor of neoliberalism on the other coast:

"Live in New York City once, but leave before it makes you hard. Live in Northern California once, but leave before it makes you soft." ~ Mary Schmich, Wear Sunscreen

[–] corbin@awful.systems 6 points 3 days ago

Bonus sneer: Zuck is obsessed with the Roman Empire and Latin culture. I have a Facebook challenge coin with a Latin inscription surrounding an Earth criss-crossed with circular paths. Here's my [implied] translation: "[Stretching] our reach around the world, we are the connection [between people]." Do the pieces fit yet?

[–] corbin@awful.systems 13 points 3 days ago (5 children)

Y'know what, I started out agreeing with the author, but now I realize that their critique is fundamentally not technical enough to hit the mark.

> Meta and Google own half of the fiber optic cables supplying internet services across continents.

This is the only part of this article I'll endorse. Y'know how this happened, right? Google bought dark fiber that was laid by the USA and privatized repeatedly. Meta set up Internet.org, a project that put phones with free Facebook into the hands of ~~exploitable~~ impoverished folks around the world, and then lobbied local governments to subsidize fiber rollouts to handle the induced demand. At some point, you gotta suck it up and start blaming capitalism, not only oligarchs or trusts, for this situation.

> The cloud is a lie.

The clouds are products and services. They are a consumer's understanding of the underlying infrastructure. It's a lot more work than a mere fib!

> Over a decade or more, while our politicians were busy sub-tweeting fascists for clout, GAMM was buying up all the infrastructure it could carry. … The production cost of data storage plummeted by 94% in just ten years. You can't sell 50GB plans to college kids who own M2 Macbook Pros with a terabyte of solid-state storage.

Okay, now read between the lines. If an oligopoly (1) buys many warehouse-scale computers, (2) in an environment where prices are rapidly dropping on new hardware, (3) in a market which already provided basic local compute to all of its customers, then this is going to produce a massive second-hand market from all of the smaller shops which were using commodity hardware until they got displaced. Google, Apple, and Meta all purchase custom datacenter hardware at a scale which requires a consortium merely to ensure that the motherboards are printed fast enough, obsoleting workstations from Dell and HP.

This has led to something of a boon for USA homelabs. I can purchase RAM-heavy workstations at less than $1/GiB, disks are at least half a TiB, small form factors are available as long as you're willing to do some BIOS work, rackables are something like $100/U, etc. We're talking discounts of 90-95%. In my house, a $200 workstation has more disk, RAM, cores, and system stability than a $600 gaming desktop, and the only thing missing is purple gamer LED strips.

> Amazon controls 35% of the cloud computing market and has created a tight seal around its customer base. … Amazon is mostly quiet as the frontrunner in the cloud computing market.

The author hasn't worked in the business. That's fine, but it means they don't know that AWS is not secure in its position. AWS is only tolerated because product managers ask for it, not because engineers like it; AWS is shit. For comparison, Google Cloud is fine but expensive and a third of the services are bad, Microsoft Azure is awful aside from their k8s, and Meta doesn't operate a public cloud.

> Yes, if everyone open-sources its AI models, they cannot build a moat on proprietary software. However, Google's memo fails to mention that it already has the infrastructure to run computing-hungry AI models and that infrastructure is wildly expensive to build.

Click through to their side rant. This is where I realized that the author could be more clueful. If any of GAMM train another Llama-sized model, and it is at all good, somebody will put it up on Bittorrent and leak it to 4chan. This is literally how we got Llama. There is no moat.

> Don’t get me wrong, open-source tech is great and important, and wonderful. But it’s not like the average person runs a Large Language Model on their Mac to make grocery lists. If you are, in fact, doing this, you are a nerd and I love you. But you’re not the average user.

He is a year behind in a field where things change every few months. See RWKV's recent blog post. There is no moat.

> So, who gives a shit if Meta put Llama on Github for free? … Read the terms and conditions. Llama is not open-source.

You naïve motherfucker, we the neighbors took it from Meta and we will take it again. There is no moat.

> Mark Zuckerberg is a capable businessman who understands the industry better than most tech founders. I don’t know the guy personally, but look at the facts.

This is the most sneerable part of the article for me. You're supposed to be a writer and humanist. It should be obvious after doing maybe five minutes of research that Zuck thinks of himself as a modern-day Octavian. Same haircut, same daily routine, same politics. Zuck is exactly the kind of person to hire a private navy to win a civil war for him by sailing off to defeat a pirate captain while he sits on a beach and idly thinks of how cool it will be to rule the Roman Empire.

> Because why give a shit who sells the milk jars when you own the motherfucking cows, baby!

Have you seen the prices on the pre-owned cow market lately? Maybe milk is just permanently getting cheaper. The existence of Big Dairy and government cheese doesn't preclude local dairies, either.

My tip for this guy: Look up this new company "nVidia", they make computer chips or something, I dunno. I wonder if they ever do anything anti-competitive~

[–] corbin@awful.systems 3 points 1 week ago (2 children)

I'm saying that we shouldn't radiate if it would be expensive. It's not easy to force the heat out to the radiators; normally radiation only works because the radiator is more conductive than the rest of the system, and so it tends to pull heat from other components.

We can set up massive convection currents in datacenters on Earth, using air as a fluid. I live in Oregon, where we have a high desert region which enables the following pattern: pull in cold dry air, add water to cool it further and make it more conductive, let it fall into cold rows and rise out of hot rows, condition again to recover water and energy, and exhaust back out to the desert. Apple and Meta have these in Prineville and Google has a campus in The Dalles. If you do the same thing in space, then you end up with a section of looped pipe that has fairly hot convective fluid inside. What to do with it?

I'm merely suggesting that we can reuse that concentrated heat, at reduced efficiency (not breaking thermodynamics), rather than spending extra effort pumping it outside. NASA mentions fluid loops in this catalogue of cooling options for cubesats and I can explain exactly what I mean with Figure 7.13. Note the blue-green transition from "heat" to "heat exchanger"; that's a differential, and at the sorts of power requirements that a datacenter has, it may well be a significant amount of usable entropy.

[–] corbin@awful.systems 3 points 1 week ago (9 children)

You're entirely right. Any sort of computation in space needs to be fluid-cooled or very sedate. Like, inside the ISS, think of the laptops as actively cooled by the central air system, with the local fan and heatsink merely connecting the laptop to air. Also, they're shielded by the "skin" of the station, which you'd think is a given, but many spacebros think about unshielded electronics hanging out in the aether like it's a nude beach or something.

I'd imagine that a serious datacenter in space would need to concentrate heat into some sort of battery rather than trying to radiate it off into space. Keep it in one spot, compress it with heat pumps, and extract another round of work from the heat differential. Maybe do it all again until the differential is small enough to safely radiate.
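The "extract another round of work" idea is bounded by Carnot. A back-of-the-envelope sketch, with assumed numbers (the 350 K hot loop, 280 K radiator, and 1 MW heat load are my illustrations, not figures from any real design):

```python
# Carnot bound on work recoverable from a waste-heat differential.
# All temperatures and power figures below are assumed for illustration.
T_hot = 350.0   # K, assumed hot-loop temperature after concentration
T_cold = 280.0  # K, assumed radiator-side temperature

# Carnot efficiency: the hard upper limit on the fraction of heat
# that any engine can convert to work across this differential.
carnot = 1 - T_cold / T_hot

# For a datacenter-scale 1 MW of waste heat, the ideal recoverable power:
waste_heat_w = 1_000_000
recoverable_w = waste_heat_w * carnot
print(f"Carnot limit: {carnot:.1%}, ideal recovery: {recoverable_w / 1000:.0f} kW")
# prints "Carnot limit: 20.0%, ideal recovery: 200 kW"
```

Real recovery would be well below the Carnot line, and each successive round shrinks the differential, which is exactly why the last bit eventually has to be radiated rather than harvested.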

[–] corbin@awful.systems 9 points 1 week ago

It's been such a letdown watching him transition from Cosmic to Alex. He's become such a milquetoast and can barely hold an opinion upright. It feels like he gets more out of fart-sniffing than actually doing logic and coming to conclusions.

[–] corbin@awful.systems 6 points 2 weeks ago (5 children)

Somebody quotes Paul Graham's rules for argument at him: shot, chaser.

[–] corbin@awful.systems 9 points 2 weeks ago (1 children)

I mean, this is why I left during the Python 3 arguments. It was obvious that the core development team only functions to the extent that it can improve the (economic) exploitability of CPython by the consortium which has captured it, and that we'd become so technically dysfunctional that we were no longer able to implement forward-compatible syntax, something we'd had as recently as Python 2.5 but had lost by Python 2.7. The inability of the various "authority" groups like PyCA or PyPA to get things done once-and-for-all is another symptom; there is still no single holistic solution for cryptography or packaging in Python 3.

Like, I recall having dinner with Guido and Barry (and others; like ten of us at a Chinese restaurant) in Montreal. It was very obvious that Guido not only didn't grok concepts like pure functions or capabilities or asynchrony, but was fundamentally uninterested in how they could improve the state of software engineering; he is forever in the mindset of making a teaching language, not a professional language. I also recall discussing with him years earlier (Portland?) how libraries like Twisted or Django fundamentally only justify their existence by pointing to deficiencies in the standard library, and he didn't understand that a bad standard-library package can be worse than not having one at all. At least he's a nice person; at no point was there any yelling or tenseness, and I appreciate that.

That said, I use Python 3 all the time. I just keep in mind that I shouldn't prefer it, and I only choose it when there's a clear developer-time tradeoff, because I know that its maintainers are contemptuous of me merely for using Python 2.7 and PyPy.


After a decade of cryptofascism and failed political activism, our dear friend jart is realizing that they don't really have much of a positive legacy. If only there was something they could have done about that.
