this post was submitted on 21 Aug 2024
327 points (98.8% liked)

Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display's refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today.
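
To illustrate the core idea described above (not any vendor's actual implementation): with VRR, the panel refreshes when a frame is ready rather than on a fixed clock, within whatever range it supports. The range values and the `present` helper below are assumptions made up for the sketch.

```python
# Rough sketch of the core VRR idea: the display refreshes when a frame is
# ready instead of on a fixed clock. Range values are illustrative only,
# not tied to any specific G-Sync/FreeSync implementation.

def present(frame_time_ms: float, vrr_min_hz: float = 48.0, vrr_max_hz: float = 144.0) -> float:
    """Return the refresh interval (ms) the panel would use for this frame."""
    min_interval = 1000.0 / vrr_max_hz   # fastest the panel can refresh
    max_interval = 1000.0 / vrr_min_hz   # slowest before the panel must refresh anyway
    # Refresh as soon as the frame is done, clamped to the panel's window.
    return min(max(frame_time_ms, min_interval), max_interval)

# A 60 fps frame (16.7 ms) gets a matching 16.7 ms refresh; a 30 fps frame
# (33.3 ms) exceeds the 48 Hz floor (20.8 ms), so it gets clamped and
# something like LFC has to step in instead.
print(present(16.7), present(33.3))
```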

The issue for Nvidia is that G-Sync isn't what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside of displays, increasing the costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync's most important features without requiring extra hardware. Nvidia more or less acknowledged that the free-to-use, cheap-to-implement VRR technologies had won in 2019 when it announced its "G-Sync Compatible" certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.

[–] vikingtons@lemmy.world 4 points 3 months ago (2 children)

Good for them if it helps eliminate the markup on displays advertising G-Sync Ultimate. I have my doubts, but it'd make sense if they're no longer using dedicated boards with FPGAs and RAM.

One has to wonder if VESA will extend their VRR standard to support refresh rates as low as 1 Hz.

[–] AngryMob@lemmy.one 2 points 3 months ago (1 children)

Yeah, it feels premature since so many FreeSync displays still only go down to 48 Hz.

Maybe if the MediaTek chip can go down to 30 Hz, then VESA will update.

[–] vikingtons@lemmy.world 4 points 3 months ago

I think below that range they can frame-double (low framerate compensation, LFC) to go as low as 24 FPS.
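
A hypothetical worked example of how frame doubling can work: repeat each frame just enough times that the effective refresh rate lands back inside the panel's VRR window. The 48 Hz floor and the `lfc_multiplier` helper are assumptions for illustration, not how any particular driver implements LFC.

```python
# Illustrative sketch of low framerate compensation (LFC): when the content
# frame rate drops below the panel's VRR floor, repeat each frame enough
# times that the effective refresh rate is back inside the window.

import math

def lfc_multiplier(content_fps: float, vrr_min_hz: float = 48.0, vrr_max_hz: float = 144.0) -> int:
    """Smallest whole-number frame repeat that puts content back in VRR range."""
    if content_fps >= vrr_min_hz:
        return 1  # already inside the window, no compensation needed
    n = math.ceil(vrr_min_hz / content_fps)
    if content_fps * n > vrr_max_hz:
        raise ValueError("content rate too low for this VRR window")
    return n

# 24 fps film content gets doubled to 48 Hz; 20 fps gets tripled to 60 Hz.
for fps in (24, 20):
    n = lfc_multiplier(fps)
    print(f"{fps} fps -> repeat x{n} -> {fps * n} Hz effective")
```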

[–] barsoap@lemm.ee 2 points 3 months ago* (last edited 3 months ago) (1 children)

I'm not aware of any protocol limitations there; it's just that monitors don't bother to support refresh rates that low.

Experience at low frame rates will be choppy anyway. If it's a fixed low framerate, you can use LFC without quality degradation (say, for movies), and if it's a variable low framerate (where LFC causes jitter)... you should be lowering your graphics settings to get better fps. Why spend extra engineering and hardware on a capability that won't ever result in a good experience anyway?

...has it really come to this? From laughing at console people for their "cinematic FPS" to Nvidia fanboys saying "my monitor supports lower framerates than yours"? Aren't we supposed to brag about our displays (pointlessly) reaching haptic fps? (That'd be 1 kHz, btw.)

[–] vikingtons@lemmy.world 3 points 3 months ago

Higher-end phones can gear down to 1 Hz to save power when displaying static content. It would be nice to see that on notebook eDP panels and, hell, even on desktop monitors too.