this post was submitted on 05 Nov 2024
56 points (100.0% liked)

Hardware

698 readers
118 users here now

All things related to technology hardware, with a focus on computing hardware.


Rules (Click to Expand):

  1. Follow the Lemmy.world Rules - https://mastodon.world/about

  2. Be kind. No bullying, harassment, racism, sexism etc. against other users.

  3. No Spam, illegal content, or NSFW content.

  4. Please stay on topic, adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware. Another example would be business news for hardware-focused companies.

  5. Please try and post original sources when possible (as opposed to summaries).

  6. If posting an archived version of the article, please include a URL link to the original article in the body of the post.


Some other hardware communities across Lemmy:

Icon by "icon lauk" under CC BY 3.0

founded 1 year ago
MODERATORS
top 10 comments
[–] Rekall_Incorporated@lemm.ee 19 points 2 weeks ago (2 children)

I was really hoping to see more competition in the dGPU space. But considering Intel's overall troubles and the challenges of gaming dGPUs (even AMD can't come anywhere close to Nvidia in the gaming dGPU space), this is to be expected.

[–] filister@lemmy.world 5 points 2 weeks ago (2 children)

Yeah, unfortunately Nvidia killed it with CUDA. They spent a lot of time on that software and made it truly ubiquitous, unlike AMD, which made ROCm one big mess.

I don't like Nvidia, but my next GPU will be from them just because of that CUDA support.

[–] grue@lemmy.world 9 points 2 weeks ago

Yeah, unfortunately Nvidia killed it with CUDA.

This shit is why anti-trust law is important. CUDA should've been forcibly opened so that competitors were allowed to implement it.

[–] Hubi 4 points 2 weeks ago

Same here, it's the only reason I buy Nvidia. I'd still like to see more Intel GPUs; they are great for budget builds at a time when prices have skyrocketed. I was thinking about getting one as a secondary GPU.

[–] yonder@sh.itjust.works 1 points 2 weeks ago (2 children)

I wonder how many of Arc's issues were a result of Windows, since I've been using an Arc A750 for about a year now on Linux and it has been really solid, better than the 2060 I had before. The only compute I use the card for is in Blender, and it works really well for that.

[–] Rekall_Incorporated@lemm.ee 1 points 2 weeks ago (1 children)

I believe many of the major driver issues were sorted out after release. Although I doubt support is anywhere close to being as good as AMD's, let alone Nvidia's.

[–] yonder@sh.itjust.works 2 points 2 weeks ago

I tried to run Windows with Arc earlier this year and the driver updater would shit itself and refuse to update the drivers, requiring a full reinstall. Some of the software I wanted to use just straight up did not work (Steam Link, ALVR, Minecraft with the Vivecraft mod), and Half-Life: Alyx had some annoying graphical issues. It was pretty performant, though, so Arc is still a good option when every penny counts.

[–] yonder@sh.itjust.works 1 points 2 weeks ago

To add on, on Windows the driver needs to support all the DirectX versions on top of Vulkan and OpenGL, while on Linux only Vulkan and OpenGL need to work.
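A rough way to picture that gap in driver API surface (an illustrative sketch, not an exhaustive list; DXVK and vkd3d-proton are the usual Direct3D-to-Vulkan translation layers on Linux):

```python
# Sketch: the graphics APIs a Windows GPU driver is expected to implement
# natively, vs. what a Linux driver needs. On Linux, the Direct3D versions
# are typically translated to Vulkan by layers like DXVK/vkd3d-proton,
# so the native driver mainly needs solid Vulkan and OpenGL support.
windows_driver_apis = {"Direct3D 9", "Direct3D 11", "Direct3D 12", "Vulkan", "OpenGL"}
linux_driver_apis = {"Vulkan", "OpenGL"}

# APIs that Linux covers via translation layers rather than the native driver:
translated = windows_driver_apis - linux_driver_apis
print(sorted(translated))  # only the Direct3D versions remain
```

The version list is simplified, but the point stands: a smaller native API surface means fewer places for a young driver like Arc's to break.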

[–] DarkThoughts@fedia.io 18 points 2 weeks ago

Welp. Not sure how you can call this committed. Intel already did integrated graphics before Arc so this would just be an evolution of them, while it sounds like they're likely going to abandon the discrete / desktop GPU market.

[–] tekato@lemmy.world 2 points 2 weeks ago

Intel Arc GPUs are actually very good. They just tried being a little too based by saying fuck you to HDMI and DirectX, both market leaders. They only support DP (HDMI is actually converted from DP in Arc GPUs), and only cared to properly implement Vulkan drivers.