this post was submitted on 19 Nov 2024
483 points (99.6% liked)

Technology

[–] schizo@forum.uncomfortable.business 126 points 5 days ago* (last edited 5 days ago) (3 children)

Amazing what happens when your primary competitor spends 18 months stepping on every rake they can find.

And then, having run out of rakes, they invested heavily in a rake factory so they can keep right on stepping on them.

This'll probably be a lot more interesting a year from now, given that the product lines for the next ~9 months or so are out and uh, well.....

[–] empireOfLove2@lemmy.dbzer0.com 82 points 5 days ago (2 children)

18 months? Lol.

Intel has been stagnating since the 4th gen Core uarch in 2014 with little competition. They knew they were top dog and they sat on their hands until their hands went numb. There's a reason "14nm++++++++++" was a running joke. This is a decade of monopolistic market behavior finally coming home to roost.

[–] cygnus@lemmy.ca 31 points 5 days ago (2 children)

So you're telling me that milking my 4770k until this year when I built a new rig with AMD was in fact a genius move?

[–] SapphironZA@sh.itjust.works 13 points 5 days ago

Perfect market timing.

[–] empireOfLove2@lemmy.dbzer0.com 8 points 5 days ago (1 children)

Basically, yeah. Up until Zen 2, Intel didn't do much innovating, and only around the Zen 2 era did those 4th/6th-gen chips start to really struggle in modern workloads.

[–] cygnus@lemmy.ca 1 points 5 days ago

TBH the only thing that caused me grief with that old beast of an i7 (other than the fact it would have bottlenecked my new GPU) was playing Stellaris.

[–] schizo@forum.uncomfortable.business 19 points 5 days ago (6 children)

That's a wee bit revisionist: Zen/Zen+/Zen 2 were not especially performant, and Intel still ran circles around them with Coffee Lake chips, though in fairness that was probably because Zen forced Intel to stuff more cores on them.

Zen3 and newer, though, yeah, Intel has been firmly in 2nd place or 1st place with asterisks.

But the last 18 months has them fucking up in such a way that if you told me that they were doing it on purpose, I wouldn't really doubt it.

It's not so much failing to execute well-conceived plans as shipping meltingly hot, sub-par performing chips that turned out to self-immolate, combined with giving up on being their own fab and THEN torching the relationship with TSMC before launching the first products TSMC is fabbing for them.

You could write the story as a malicious evil CEO wanting to destroy the company and it'd read much the same as what's actually happening (not that I think Patty G is doing that, mind you) right now.

[–] Dudewitbow@lemmy.zip 28 points 5 days ago (1 children)

Early Zen wasn't performant in lower-core-count loads, but it was extremely competitive in multi-core workloads, especially when performance per dollar was added into the equation. Even if one revisits heavy multi-core workload benchmarks, those chips fared fairly well. It's just that at a consumer level they weren't up to snuff yet, because in gaming developers were still optimizing for an 8-thread console, and in laptops AMD's presence was near nonexistent.

[–] TheGrandNagus@lemmy.world 4 points 4 days ago

Not only that, but it was vastly more power efficient, and didn't have the glaring security vulnerabilities that Intel had. All while being on a worse Global Foundries manufacturing process.

Unless you were a PC gamer who also didn't care about $/perf, Zen1 was the better architecture.

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 9 points 5 days ago* (last edited 5 days ago)

Single core workloads Intel still had the lead. But multi core (or just multi tasking) Zen 1 was a beast. By zen 2 there was hardly a reason to get Intel even for gaming, and especially at normal setups (nobody is using a top of the line GPU at 1080p). Even when you’re “just” playing a game you still have stuff running in the background, and those extra cores helped a lot.

Plus newer games are much more multi threaded than when zen first came out so those chips aged better as well.

[–] SapphironZA@sh.itjust.works 9 points 5 days ago (1 children)

It's chronic underinvestment in engineering to "maximize shareholder value" for the decade before AMD launched Zen. Intel ended up 5 years behind on engineering, and has only managed to claw back a couple of those years. The newest tile-based architecture only just matches the performance of AMD's 3-year-old AM4 parts.

[–] frezik@midwest.social 1 points 4 days ago

There's also a little tidbit that I think gets overlooked: the Arrow Lake CPUs are on a better TSMC node than AMD's Ryzen 9000 series. You wouldn't know it from any of the charts.

Which puts into perspective any Intel fanbois saying this is their Zen 1 moment. They're on a better node but still doing worse. There are no signs of life here, which was not the case for Zen 1.

[–] TheGrandNagus@lemmy.world 6 points 4 days ago* (last edited 4 days ago) (1 children)

Zen1 was slower in gaming and most 1-2 core workloads, but it was immediately far faster in server, faster in highly-threaded tasks, was hugely cheaper to manufacture, didn't have the huge security flaws Intel chips had, and was way more power efficient.

They achieved that while still being on an inferior Global Foundries manufacturing process.

Zen1 was overall better than Coffee Lake. Just not to PC gamers, the loudest online PC hardware demographic.

[–] frezik@midwest.social 5 points 4 days ago

Also, PC gamers are loud, but they make up a pretty small portion of the market. There was a time when Intel's server division made more revenue than all of AMD. Even now, AMD as a whole is only a little above that. That's not even considering the OEM market, which is far, far larger than PC gamers.

I got really annoyed with /r/buildapc. Everyone is a gamer and thinks they're the center of the universe. They haven't the faintest conception that someone would do a build for anything other than gaming, or how that changes the choices.

[–] iopq@lemmy.world 6 points 5 days ago

Zen 2 was only a little slower for gaming, but it cooked the 8-core Intel 9900K in multi-core performance. You could stick a 16-core 3950X into a normal mobo. The chiplet design was a revolution.

[–] shortwavesurfer@lemmy.zip 5 points 5 days ago

All of my computers had been Intel for many, many years. About a year and a half ago I got my first AMD computer - I'd seen other people's machines with AMD processors but had never owned one myself. Now I have one with an AMD Ryzen 5.

[–] simplejack@lemmy.world 61 points 5 days ago (3 children)

If you look at who is manufacturing silicon, the numbers look even worse for Intel. All of these competitors are using TSMC fabs. AMD, Apple, Qualcomm, etc.

TSMC is the real 800 lb gorilla in the room.

[–] ayyy@sh.itjust.works 26 points 5 days ago (2 children)

It’s gonna suck so hard for the whole world when they get invaded :(

[–] Wahots@pawb.social 23 points 5 days ago

Pray they don't, but I'm almost certain they will now that the US is appointing complete morons to every portion of the US government. The US won't really be able to help until this rot gets cleaned out. China has a four-year window before we can really help Taiwan again (or at least give them air superiority).

[–] randon31415@lemmy.world 7 points 5 days ago (1 children)

Biden just finalized the Arizona TSMC plant.

If that gets invaded, I think semiconductors are the least of our problems.

[–] chutchatut@lemm.ee 14 points 5 days ago (1 children)

But the Arizona plant wouldn't be allowed to manufacture the most cutting edge chips.

[–] capital@lemmy.world 2 points 5 days ago (2 children)

How/why is that blocked?

Would it still be if China invades Taiwan?

[–] dabaldeagul@feddit.nl 17 points 5 days ago

Taiwan is incentivized to keep the latest and greatest local, so they can hopefully get protection from the USA and Europe

[–] ColeSloth@discuss.tchncs.de 9 points 4 days ago (1 children)

Taiwan's rule: foreign TSMC fabs have to be a gen behind. This would definitely change if China took over Taiwan, but who knows what China will do or allow at that point. They could shut the whole US fab down if they want. Even if someone (Taiwan, China, or TSMC) did try to re-tool the US fab in a few years, it would cost billions and a lot of time to get it done.

[–] Dudewitbow@lemmy.zip 1 points 4 days ago* (last edited 4 days ago)

You also have to keep in mind that the client that purchases cutting-edge nodes first is Apple. AMD currently only uses it for Zen 5c, and Qualcomm uses it for the Snapdragon Elite/8 Gen 4. Mobile usually gets new nodes first for efficiency reasons (and better yields due to smaller dies). Other markets have historically been a node behind already (e.g., despite the 9800X3D being new, it's only an N4 die with an N6 IO die).

[–] TheGrandNagus@lemmy.world 15 points 4 days ago

And Intel. Intel has been using TSMC fabs for a while.

They used to get a 40% discount, too, but that stopped recently when Pat Gelsinger said people should stop buying from TSMC because there's a good chance they'll be invaded.

TSMC's CEO didn't like that, and said "ok, no more 40% discount for you. Effective immediately." (TL;DR'd, obviously).

[–] GamingChairModel@lemmy.world 2 points 5 days ago

Even some of Intel's Arrow Lake/Lunar Lake chips are being fabbed at TSMC.

[–] walden@sub.wetshaving.social 54 points 5 days ago* (last edited 5 days ago) (1 children)

I just built a computer for a friend and she decided to get an AMD when I told her it was about the same performance but used half as much electricity.

This is a person who knows nothing about computers. Intel is losing their "household name" status in a big way judging by that.

[–] simplejack@lemmy.world 27 points 5 days ago (1 children)

People like long battery life and computers that don’t cook your crotch.

[–] capital@lemmy.world 11 points 5 days ago (2 children)

What are the chances they were building a laptop?

Wait, you don't straddle your desktop tower?

[–] simplejack@lemmy.world 6 points 5 days ago

Good point.

[–] circuitfarmer@lemmy.sdf.org 32 points 5 days ago (1 children)

Not surprised. I switched to AMD CPU and GPU about a year ago. Could not be happier. Ryzen sips power and I run mine in Eco mode (since I'm on an air cooler). Performance is still fantastic.

[–] addie@feddit.uk 6 points 4 days ago

Invested in a water-cooling setup back when I had a Bulldozer chip, when it was near essential. Now on a Ryzen, and getting it to exceed about 35 degrees is very difficult. It's been very good for the long-term stability of my desktop - all the niggling hard-disk issues seem to just go away now that they're not subjected to such thermal cycling any more.

Fantastic chips.

[–] Aatube@kbin.melroy.org 27 points 5 days ago (1 children)

it's the year of the linux des-

oh wait, wrong thread

[–] GhiLA@sh.itjust.works 5 points 4 days ago (1 children)

The joke: every year is the year of the Linux desktop

head tap

because Linux is rather awesome.

[–] Aatube@kbin.melroy.org 1 points 4 days ago

get away from me you filthy crocodile

[–] GhiLA@sh.itjust.works 10 points 4 days ago* (last edited 4 days ago) (2 children)

My thought process:

Desktop: I need cost for performance...

Server: fps for the Jellyfin, transcodes for the transcode god

[–] frezik@midwest.social 4 points 4 days ago* (last edited 4 days ago)

I'd drop in an old Nvidia GPU for transcoding, anyway. There's lots of old cards that support nvenc. Don't neglect the Quadro cards, either. Lots of them are cheap on ebay and will transcode just fine without even needing their own cooling fan.
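For anyone trying this route, a quick way to check whether a given ffmpeg build (Jellyfin's transcoder is ffmpeg-based) was compiled with NVENC support - the filenames in the commented command are just placeholders:

```shell
# List the NVENC encoders this ffmpeg build was compiled with;
# prints nothing (and exits non-zero) if NVENC isn't available
ffmpeg -hide_banner -encoders | grep nvenc

# A typical hardware transcode using one of them (input/output names are examples):
# ffmpeg -hwaccel cuda -i input.mkv -c:v h264_nvenc -preset p5 -c:a copy output.mkv
```

If the grep prints nothing, that build can't use the card for encoding and Jellyfin will fall back to software transcoding regardless of the GPU.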

[–] frazorth@feddit.uk 1 points 4 days ago (1 children)

Transcodes worked vastly better with QuickSync last time I bought a machine.

Does the AMD transcoded work as well these days?

[–] yonder@sh.itjust.works 5 points 4 days ago (1 children)

I don't think so. The Jellyfin documentation still says it sucks lol.

[–] frazorth@feddit.uk 1 points 4 days ago* (last edited 4 days ago)

Damnit.

I wonder if that's because the transcoding hardware is crap or they just aren't concentrating on it in the software.

[–] brucethemoose@lemmy.world 20 points 5 days ago* (last edited 5 days ago) (1 children)

I feel like they are dropping the ball in the GPU space though, both on desktop and in servers.

They're not really leveraging it. They killed the Steam Deck line of "small core count, GPU-heavy APUs", which is why Valve hasn't updated it and competitors seem so power hungry. They all but killed server APUs, making them mega expensive and HPC-only. They're finally coming out with an M Pro-like consumer APU, but it took until 2025, and pricing will probably be a joke, just like their Radeon Pro GPUs...

And I don't even wanna get into the AI space. They get like 99% there and then go "nah, we don't really care about this market, let Nvidia have their monopoly and screw everyone over." It makes me want to pull my hair out.

[–] WalnutLum@lemmy.ml 4 points 4 days ago

The fact they pulled ROCM support for older cards boggles the mind.

[–] Myro@lemm.ee 6 points 4 days ago

Sad but true. Intel's performance was poor over the last year. I shudder thinking about my Mac with an Intel CPU - there must be burn victims from that thing. Still, less competition is never a good thing.

[–] finitebanjo@lemmy.world 4 points 4 days ago* (last edited 4 days ago)

EDIT: Sorry, the article isn't about GPUs; rather, it's about the CPU market, where AMD is projected to overtake Intel in the far future.


When the AI crash wipes out Nvidia's demand in the server market, they're not gonna have any loyal customers in the desktop market, right as the tech boom comes to places formerly reliant only on smartphones. Then they're gonna be like surprised_pikachu.jpg

[–] Routhinator@startrek.website 1 points 5 days ago* (last edited 5 days ago)

Still not ready to trust AMD/ATI again. I used them exclusively right up until they bought ATI and then decided "fuck open source", and the drivers for Linux tanked.

I hear all the issues folks have had with Intel/NVIDIA, but I have yet to experience any of them. From where I'm sitting, they're still working great. And their open source support has not been perfect, but it's consistent - instead of going from golden to "fuck you, Linux folks" overnight.