this post was submitted on 05 Oct 2024

solarpunk memes


For when you need a laugh!

The definition of a "meme" here is intentionally pretty loose. Images, screenshots, and the like are welcome!

But, keep it lighthearted and/or within our server's ideals.

Posts and comments that are hateful, trolling, inciting, and/or overly negative will be removed at the moderators' discretion.

Please follow all slrpnk.net rules and community guidelines

Have fun!

founded 2 years ago
[–] NigelFrobisher@aussie.zone 64 points 1 month ago (2 children)

What’s actually going to kill LLMs is when the sweet VC money runs out and the vendors have to start charging what it actually costs to run them.

[–] PriorityMotif@lemmy.world 19 points 1 month ago (3 children)

You can run it on your own machine. It won't work well on a phone yet, but I guarantee chip manufacturers are working on custom SoCs right now that will be able to run a rudimentary local model.

[–] TherapyGary@lemmy.blahaj.zone 10 points 1 month ago

You can already run 3B LLMs on cheap phones using MLCChat; it's just hella slow

[–] MystikIncarnate@lemmy.ca 7 points 1 month ago (1 children)

Both Apple and Google have integrated hardware optimisations, specifically for running ML algorithms, into their processors.

As long as you have something optimized to run the model, it will work fairly well.

They don't want independent ML chips; they want it baked into every processor.

[–] PriorityMotif@lemmy.world 4 points 1 month ago (1 children)

Joke's on them because I can't afford their overpriced phones 😎

[–] MystikIncarnate@lemmy.ca 2 points 1 month ago

That's fine, Qualcomm has followed suit, and Samsung is doing the same.

I'm sure Intel and AMD are not far behind. They may already be doing this; I just haven't kept up on the latest information from them.

Eventually all processors will have it, whether you want it or not.

I'm not saying this is a good thing, I'm saying this as a matter of fact.

[–] TriflingToad@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

It will run on a phone right now: Llama 3.2 on a Pixel 8.

Only drawback is that it requires a lot of RAM, so I needed to close all other applications, but that could easily be fixed on the next phone. Other than that it was quite fast and only took ~3 GB of storage!
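For what it's worth, ~3 GB is about what you'd expect for a 3B-parameter model with 4-bit quantized weights. A back-of-the-envelope sketch (the 4-bit quantization and the 50% runtime overhead are illustrative assumptions, not measured values):

```python
# Back-of-the-envelope memory estimate for a locally run, quantized LLM.
# The quantization level and overhead factor below are assumptions.

def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight storage in gigabytes (decimal GB)."""
    return n_params * bits_per_weight / 8 / 1e9

weights = model_size_gb(3e9, 4)   # 3B params at 4 bits -> 1.5 GB of weights
overhead = weights * 0.5          # assumed runtime buffers / KV cache headroom
total = weights + overhead        # ~2.25 GB, in the ballpark of the ~3 GB reported
print(round(total, 2))
```

That's why closing other apps helps: the whole weight file plus working buffers has to sit in RAM at once on a phone-sized memory budget.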

[–] gnarly@lemmy.world 12 points 1 month ago* (last edited 1 month ago) (1 children)

This isn't the case. Midjourney doesn't receive any VC money since it has no investors, and this ignores generated imagery made locally on your own rig.

[–] match@pawb.social 1 points 1 month ago

yeah, but that's pretty alright all told; the tech bros don't have the basic competency to do that, and they can't sell it to dollar-sign-eyed CEOs