[–] MudMan@fedia.io 53 points 4 months ago* (last edited 4 months ago) (2 children)

So here's the thing about that: the real performance I lose is... not negligible, but somewhere between 0 and 10% in most scenarios, and I went pretty hard keeping the power limits low. Once I set it up this way, realizing just how much power and heat I'm saving for the last few drops of performance made me angrier than having to do this. The dumb performance race with all the built-in overclocking has led to these insanely power-hungry parts that are super sensitive to small defects and require super aggressive cooling solutions.
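(For anyone who wants to try the same thing without digging through BIOS menus: on Linux the package power limits are also exposed through the RAPL powercap interface, so you can cap them in software. This is just a minimal sketch; the sysfs path layout is the standard intel-rapl one, but the 125W/150W numbers are placeholder assumptions, not what my board is actually set to.)

```python
#!/usr/bin/env python3
# Minimal sketch: cap the CPU package power limits (PL1/PL2) through the
# Linux RAPL powercap interface instead of the BIOS. Requires root.
# The wattage values below are illustrative assumptions, not a recommendation.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")  # package power domain


def set_limit(constraint: int, watts: float) -> None:
    """Write a power limit for the given constraint (0 = long term, 1 = short term)."""
    limit_file = RAPL / f"constraint_{constraint}_power_limit_uw"
    limit_file.write_text(str(int(watts * 1_000_000)))  # value is in microwatts


if __name__ == "__main__":
    print("domain:", (RAPL / "name").read_text().strip())
    set_limit(0, 125)  # PL1 / sustained limit (assumed value)
    set_limit(1, 150)  # PL2 / short-term boost limit (assumed value)
```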

I would have been fine with a part rated for 150W instead of 250 that worked fine with an air cooler. I could have chosen whether to push it. But instead here we are, with extremely expensive motherboards massaging those electrons into a firehose automatically and turning my computer into a space heater for the sake of bragging about shaving half a millisecond per frame in Counter-Strike. It's absurd.

None of which changes that I got sold a bum part, that Intel is fairly obviously trying to weasel out of the recall and warranty extension it clearly owes, and that I'm suddenly on the hook for close to a grand in superfluous hardware next time I want to upgrade, because my futureproof parts are apparently made of rust and happy thoughts.

[–] tal@lemmy.today 8 points 4 months ago* (last edited 4 months ago) (1 children)

> 150W instead of 250

Yeah, when I saw that the CPU could pull 250W, I initially thought it was a misprint in the spec sheet. That is kind of a nutty number. I have a space heater that can run on low at 400W, which is getting into that range, and you can get very low-power space heaters that consume less power than the TDP of that processor. That's an awful lot of heat to be putting into an incredibly small, fragile part.

That being said, I don't believe Intel knowingly passed the 13th generation through initial QA thinking there were problems. They probably thought there was a healthy safety margin. You can certainly blame them for insufficient QA, or for how they handled the problem once the issue was ongoing, though.

And you could also have said "this is absurd" at many points in the past when other performance barriers came up. I remember -- a long time ago now -- when the idea of processors that needed active cooling or they would destroy themselves seemed rather alarming and fragile. I mean, fans do fail. Processors capable of at least shutting down on overheat to avoid destroying themselves, or later throttling themselves, didn't come along until much later. But if we'd stopped at passive heatsink cooling, we'd be using far slower systems (though probably a lot quieter ones!).

[–] MudMan@fedia.io 3 points 4 months ago

You're not wrong, but "we've been winging it for decades" is not necessarily a good defense here.

That said, I do think they looked at their performance numbers and made a conscious choice to lean into feeding these chips more power and running them hotter. Whether the impact would be lower with more conservative power specs is debatable, but as you say, there are other reasons why trying to fake generational leaps by making CPUs capable of fusing helium is not a great idea.

[–] Nighed@sffa.community 1 points 4 months ago (1 children)

Could you not have just bought a lower-power chip then?

Or does that lose you cores?

[–] MudMan@fedia.io 3 points 4 months ago

Oh, I absolutely could have. It would lose a couple of cores, but the 13th gen scales pretty linearly, so it would have performed more or less the same.

Thing is, I couldn't have known that then, could I? Chip reviews aren't aiming at normalizing for temps; everybody is reviewing for moar pahwah. So is there a way for me to know that gimping this chip to run silently basically gets me a slightly overclocked 13600K? Not really. Do I know, even at this point, that getting a 13600K wouldn't deliver the same performance while requiring my fans to go back to being noticeable? I don't.

Because the actual performance of these parts isn't built to a reliable spec beyond "run flat out and see how much heat your thermal solution can soak," there is no good way to evaluate them for applications that aren't just that without buying them and checking. Maybe I could have saved a hundred bucks. Maybe not. Who knows?
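(The closest I've found to a workaround is normalizing the comparison myself: cap each part to the same package power, run the same workload, and compare the times. Rough sketch below, using the same RAPL interface as above; the 125W cap and the stand-in benchmark command are placeholder assumptions, not anything from a real review.)

```python
#!/usr/bin/env python3
# Rough sketch of "normalize for power, then compare": cap the package power
# limit via the RAPL powercap sysfs interface, run a fixed workload, and
# report wall-clock time. Requires root. The cap and the workload are
# placeholders, not anything from a real review.
import subprocess
import time
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")
CAP_WATTS = 125  # assumed common power budget for both chips under comparison
BENCH = ["python3", "-c", "sum(i * i for i in range(50_000_000))"]  # stand-in workload


def cap_package_power(watts: int) -> None:
    """Set both the long-term (PL1) and short-term (PL2) limits to the same cap."""
    for constraint in (0, 1):
        limit_file = RAPL / f"constraint_{constraint}_power_limit_uw"
        limit_file.write_text(str(watts * 1_000_000))


if __name__ == "__main__":
    cap_package_power(CAP_WATTS)
    start = time.perf_counter()
    subprocess.run(BENCH, check=True)
    print(f"{CAP_WATTS} W cap: {time.perf_counter() - start:.1f} s")
```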

This is less of a problem if you buy laptops, but for casual DIY I frankly find the current status quo absurd.