this post was submitted on 01 Oct 2024
357 points (91.0% liked)

Programmer Humor

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.

[–] passepartout 13 points 1 month ago (1 children)

If you have a supported GPU, you could try Ollama (with Open WebUI); it works like a charm.
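
For anyone who wants to script against it, here's a minimal Python sketch that talks to the local Ollama REST API (the same API Open WebUI sits on top of). It assumes Ollama is serving on its default port 11434 and that you've already pulled a model; the model name "llama3" is just an example:

```python
# Minimal sketch: query a local Ollama server over its REST API.
# Assumes Ollama is running on the default port (11434) and that a model
# such as "llama3" has already been pulled; adjust names to your setup.
import json
import urllib.request

def ask(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Why is my GPU fan screaming?"))
```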

[–] bi_tux@lemmy.world 6 points 1 month ago (2 children)

You don't even need a supported GPU; I run Ollama on my RX 6700 XT.
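
For the curious: the RX 6700 XT (gfx1031) isn't on the official ROCm support list, and the workaround people commonly use is to override the reported GFX version so ROCm treats it like a supported gfx1030 card. Here's a minimal Python sketch of that idea, launching the server with the override set; the exact value depends on your card, so treat it as an assumption rather than a guarantee:

```python
# Minimal sketch: start "ollama serve" with the ROCm GFX-version override that
# is commonly used for RDNA2 cards (e.g. RX 6700 XT, gfx1031) that aren't on
# the official support list. The value below is an assumption for that card.
import os
import subprocess

env = os.environ.copy()
env["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"  # report the card as a supported gfx1030

# Launch the Ollama server as a child process with the overridden environment.
subprocess.run(["ollama", "serve"], env=env, check=True)
```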

[–] BaroqueInMind@lemmy.one 3 points 1 month ago (1 children)

You don't even need a GPU; I can run Ollama with Open WebUI on my CPU with an 8B model, fast af.
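
If you want to put a number on "fast af": Ollama reports token counts and generation time in its response, so a rough tokens-per-second check is only a few lines of Python. This assumes a local server and an already-pulled model; "llama3:8b" is just an example tag:

```python
# Minimal sketch: rough tokens-per-second check for a model on your machine,
# using the timing fields Ollama reports in its /api/generate response.
# Assumes a local Ollama server and a pulled model; the model tag is an example.
import json
import urllib.request

def tokens_per_second(model: str = "llama3:8b", prompt: str = "Tell me a programming joke.") -> float:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        stats = json.loads(resp.read())
    # eval_count = generated tokens, eval_duration = generation time in nanoseconds
    return stats["eval_count"] / (stats["eval_duration"] / 1e9)

if __name__ == "__main__":
    print(f"{tokens_per_second():.1f} tokens/s")
```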

[–] bi_tux@lemmy.world 2 points 1 month ago (1 children)

I tried it on my CPU (with Llama 3 8B), but unfortunately it ran really slowly (I've got a Ryzen 5700X).

[–] tomjuggler@lemmy.world 2 points 1 month ago

I ran it on my dual-core Celeron and... just kidding. Try the mini Llama 1B model. I'm in the same boat with a Ryzen 5000-something CPU.
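
If anyone wants to try the small-model route from Python, here's a minimal sketch using the official ollama client (pip install ollama). The "llama3.2:1b" tag is an example of a roughly 1B-parameter model; pick whichever small model your Ollama install actually offers:

```python
# Minimal sketch: pull and run a ~1B-parameter model, which is far more
# forgiving on CPU-only machines. Uses the official "ollama" Python client
# (pip install ollama); the model tag below is an example.
import ollama

ollama.pull("llama3.2:1b")  # small download, runs tolerably without a GPU

result = ollama.generate(
    model="llama3.2:1b",
    prompt="Explain recursion in one sentence.",
)
print(result["response"])
```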

[–] passepartout 2 points 1 month ago

[–] passepartout 2 points 1 month ago
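
I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D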