[–] Hackworth@lemmy.world 8 points 2 months ago

My money's on analog optical computing replacing GPUs as the hardware used for inference / generation. Analog computing in general is promising, and I look forward to seeing how the chips fall re: training.

[–] Anivia 8 points 2 months ago

I can't tell if this article is satire or not

[–] adespoton@lemmy.ca 5 points 2 months ago (3 children)

Both the article and the pushback are kind of silly here — the dGPU’s heyday was over a decade ago, back when “serious gamers” had a custom built PC on their desk and upgraded their GPU every two years at a minimum.

Back in 2008, gaming on a laptop started to become a possibility, and dGPUs were part of that story — but for the most part, good luck swapping out your GPU for a newer model; it generally wasn’t so easy to do on a laptop.

THAT was the beginning of the end for dGPUs.

By 2015, I had a laptop with both an iGPU and a dGPU. eGPUs were just appearing on the market as a way around the lack of upgradeability, but these were niche, and not required for most computing tasks, including gaming.

At the same time, console hardware began to converge with desktop hardware, so gaming houses, who had for over 20 years driven the dGPU market, fell into a slower demand cadence that matched the console hardware. GPUs stagnated.

And then came cryptomining, a totally new driver of GPUs. And it almost destroyed the market, gobbling up the hardware so that none was available for any other compute task.

Computer designers responded by doubling down on the iGPU, making them good enough for almost all tasks you’d use a personal computer for.

Then came AI. It too was a new driver for GPUs and, like crypto, sucked some of the oxygen out of the PC market… which responded by adding integrated NPUs to handle ML tasks.

So yeah; GPUs are now for the cloud services market and niche developers; everyone else can get their hands on a “good enough” SoC with enough CPU, GPU and NPU compute to do what they need, and the ability to offload to a remote server cluster for weightier jobs.

[–] unmagical@lemmy.ml 13 points 2 months ago (2 children)

Show me an iGPU that will compete with my 3080ti.

[–] jacksilver@lemmy.world 3 points 2 months ago

Yeah, dGPUs have been for niche applications for decades. I didn't read the article, but the parent comment is vastly overestimating iGPU capabilities.

[–] Anticorp@lemmy.world 12 points 2 months ago

over a decade ago, back when “serious gamers” had a custom built PC on their desk and upgraded their GPU every two years at a minimum.

Hey man, I'm old, but I still have a custom-built gaming PC on my desk and a fairly recent GFX card (3070 Ti), though I'd say I only upgrade my card maybe once every 3-5 years, depending on necessity.

[–] Randomgal@lemmy.ca -1 points 2 months ago

This is the real write-up right here. The article is pretty meh.

[–] pantherina -4 points 2 months ago

I bought myself an NVidia Titan X for LLMs, and I get the feeling it was a complete mistake.

The power consumption must be absurd; modern chips are surely vastly better.

It was about €1000 cheaper, I think. But maybe I'm wrong about that.