this post was submitted on 10 Aug 2024
250 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] sunzu@kbin.run 14 points 3 months ago (15 children)

Do we know if local models are any safer, or is that a "trust me bro"?

[–] sturlabragason@lemmy.world -4 points 3 months ago* (last edited 3 months ago) (1 children)

You can download various LLMs yourself and run them locally. It's relatively straightforward:

https://ollama.com/

Then you can switch off your network after download, wireshark the shit out of it, run it behind a proxy, etc.
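A minimal sketch of what that check can look like, assuming you've already pulled a model (llama3 here, but any locally pulled model works) and the ollama daemon is running on its default port. Everything talks to localhost only:

```python
# Minimal sketch: query a locally running ollama server (default port 11434).
# Assumes `ollama pull llama3` has already been run and the daemon is up.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3",          # any model you've pulled locally
        "prompt": "Say hi in five words.",
        "stream": False,            # return one JSON object instead of a stream
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Run it once with the network up to confirm it works, then pull the plug and run it again: the answer should still come back, and wireshark should show nothing leaving the box.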

[–] froztbyte@awful.systems 8 points 3 months ago

you didn’t need to give random llms free advertising to make your point, y’know
