this post was submitted on 23 Aug 2024
98 points (98.0% liked)


Absolutely bizarre that a first-party title doesn't seem optimized for the very console it's being developed for. This makes me skeptical that the PC version will be optimized either.

[–] ShinkanTrain@lemmy.ml 43 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

30, 60 or whatever fps is (or at least should be) a decision made very early in development. It's only a case of poor optimization if the game doesn't reach the target they've set.

I don't like it either, but an Unreal 5 game running at 30 fps (if that lol) on current gen is the norm.

[–] variants@possumpat.io 19 points 2 weeks ago (1 children)

The human eye can't see more than 30fps anyway /s

[–] Ghoelian@lemmy.dbzer0.com 18 points 2 weeks ago (3 children)

The people that keep saying that should really just try to use a 144+hz monitor for a while. Surely they'll be able to notice the difference as well.

[–] lengau@midwest.social 16 points 2 weeks ago (1 children)

If someone's saying that about 30fps they should just set their refresh rate to 30 and move their mouse.

[–] intensely_human@lemm.ee 7 points 2 weeks ago

Or just stand in a room with fluorescent lights and move their eyes

[–] Thrashy@lemmy.world 6 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

Might just be my middle-aged eyes, but I recently went from a 75Hz monitor to a 160Hz one and I'll be damned if I can see the difference in motion. Granted, I don't play much in the way of twitch-style shooters anymore, but for me the threshold of visual smoothness is closer to 60Hz than whatever bonkers 240Hz+ refresh rates current OLEDs are pushing.

I'll agree that 30fps is pretty marginal for any sort of action gameplay, though historically console players have been more forgiving of mediocre performance in service of more eye candy.

[–] mephiska@fedia.io 7 points 2 weeks ago (1 children)

Are you sure you have the refresh rate set correctly on your video card? The difference between 75Hz and 160Hz is very clear just from moving your mouse cursor around. Age shouldn't have anything to do with it.

[–] Thrashy@lemmy.world 3 points 2 weeks ago

Quite sure -- and one game I've been playing lately (the exception to the lack of shooters in my portfolio) is Selaco, so I ought to have noticed by now.

There's a very slight difference in smoothness when I'm rapidly waving a mouse cursor around on one screen versus the other, but it's hardly the night-and-day difference that going from 30fps to 60fps was back in Ye Olden Days, and watching a small, fast-moving, high-contrast object doesn't make up the bulk of gameplay in anything I play these days.

[–] OminousOrange@lemmy.ca 6 points 2 weeks ago (1 children)

If the two are beside each other, you'll definitely see the difference.

[–] Thrashy@lemmy.world 4 points 2 weeks ago (1 children)

The old one and the new one are literally side by side on my desktop, don't know what to tell you...

[–] OminousOrange@lemmy.ca 3 points 2 weeks ago (1 children)

Hmm, I've found it quite noticeable. Perhaps turn on an FPS counter and see what it's actually running at. If you have a game showing on both screens, it'll likely cap the fps to match the slower display's refresh rate.

[–] Gerudo@lemm.ee 4 points 2 weeks ago

This is a good point. A lot of people assume plugging the monitor in is enough, but a lot of the time you have to manually select the higher refresh rate in your settings.
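(For anyone who wants to sanity-check this without an in-game overlay, here is a minimal, framework-agnostic sketch of the kind of frame-time counter being suggested above. `measure_fps` and `render_frame` are hypothetical names, and the placeholder stands in for whatever vsynced per-frame work a real renderer would do.)

```python
import time

def measure_fps(render_frame, seconds=2.0):
    """Rough FPS counter: run an arbitrary per-frame callable in a loop
    and report the average frame time and effective frame rate."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        render_frame()  # hypothetical placeholder for one vsynced frame of work
        frames += 1
    elapsed = time.perf_counter() - start
    print(f"{frames} frames in {elapsed:.2f}s -> "
          f"{elapsed / frames * 1000:.2f} ms/frame, {frames / elapsed:.0f} FPS")

# With a real, vsynced renderer in place of the placeholder, the reported
# rate reveals the refresh rate the OS is actually driving the monitor at.
measure_fps(lambda: None, seconds=1.0)
```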

[–] jorp@lemmy.world 2 points 2 weeks ago (2 children)

A 160Hz refresh rate gives the software a ~6ms render budget; do things actually even run at that rate?

[–] SpacetimeMachine@lemmy.world 9 points 2 weeks ago

If your computer is good enough, absolutely. Strong PCs can now hit sub-5ms frame times at 4K pretty regularly, especially in competitive games that aren't designed to look incredible.

[–] ParetoOptimalDev@lemmy.today 4 points 2 weeks ago

I can confirm 3-5ms frametimes with a popular shooter at 165hz.
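(For context, the frame-budget numbers in this exchange come straight from dividing 1000 ms by the refresh rate; the figures below are just that arithmetic, not anything taken from the thread.)

```python
# Frame-time budget is simply 1000 ms divided by the target refresh rate.
for hz in (30, 60, 75, 144, 160, 165, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per frame")

# Output:
#  30 Hz -> 33.33 ms per frame
#  60 Hz -> 16.67 ms per frame
#  75 Hz -> 13.33 ms per frame
# 144 Hz ->  6.94 ms per frame
# 160 Hz ->  6.25 ms per frame
# 165 Hz ->  6.06 ms per frame
# 240 Hz ->  4.17 ms per frame
```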

[–] ParetoOptimalDev@lemmy.today 2 points 2 weeks ago (1 children)

Games feel almost disgusting on 60hz now, but they felt fine before I tried 144hz.

Maybe if I was stuck at 60Hz for a long time I'd get used to it.

Now though, if I switch for 30m I can't ignore the difference.

[–] lustyargonian@lemm.ee 1 points 2 weeks ago

Do you primarily game on mouse or controller?

[–] lengau@midwest.social 1 points 2 weeks ago

It really depends on what one's doing, too. For many things, including many games, 30fps is fine for me, but I need at least 60fps for mousing. I don't notice the mouse getting any smoother above 60fps, though in some games I do have a better experience at 120fps. And I'm absolutely sold on 500+ fps for simulating paper.

[–] lustyargonian@lemm.ee 3 points 2 weeks ago

My work MacBook can only do 60 while my ROG Ally can do 120, and damn, the mouse feel at 120 is so much better that I hate that my work laptop can't do it.

[–] lustyargonian@lemm.ee 3 points 2 weeks ago

In the interview they said they show the game as it currently is and focus on whatever part of development they're on. They said combat hadn't been worked on yet when they showed the game, and it now looks pretty reactive. They're going to focus on sound next and performance last, and when they said 30 it sounded like "the bare minimum is a solid 30". Given the feedback, there's a chance they'll try to incorporate 60fps now.

While it's a design decision, UE is also generally a bit more scalable, assuming the game isn't entirely reliant on Lumen, Nanite, and virtual shadow maps.

Either way, they need to learn from previous 30fps launches and try to communicate better. Saying it doesn't need 60 is dismissive of a large audience of gamers who don't like trading frame rate for image quality.

[–] dinckelman@lemmy.world 22 points 2 weeks ago (1 children)

No wonder consoles are just not as appealing anymore.

We used to get systems that were purposely designed to only play games, but to do it phenomenally well. That shit absolutely defined an entire generation of gaming.

Now we get a crippled PC with Doritos ads on the dashboard.

[–] Thrashy@lemmy.world 8 points 2 weeks ago (1 children)

Eh... Consoles used to be horribly crippled compared to a dedicated gaming PC of a similar era, but people were more lenient about it because TVs were low-res and the hardware was vastly cheaper. Do you remember Perfect Dark multiplayer on the N64, for instance? I do, and it was a slideshow -- that didn't stop the game from being lauded as the apex of console shooters at the time. I remember Xbox 360 flagship titles upscaling from sub-720p resolutions in order to maintain a consistent 30fps.

The console model has always been cheap hardware masked by lenient output resolutions and a less discerning player base. Only in the era of 4K televisions and ubiquitous crossplay with PC has that become a problem.

[–] ShinkanTrain@lemmy.ml 6 points 2 weeks ago (1 children)

The Xbox 360, at launch, was more powerful than the most powerful PC you could build at the time.

[–] Thrashy@lemmy.world 7 points 2 weeks ago (1 children)

At launch the 360 was on par graphically with contemporary high-end GPUs, you're right. By even the midpoint of its seven-year lifespan, though, it was getting outclassed by midrange PC hardware. You've got to factor in the insanely long refresh cycles of consoles, starting with the sixth and seventh generations, when you talk about processing power. Sony and Microsoft have tried to fix this with mid-cycle refresh consoles, but I think that has honestly hurt more than helped, since it breaks the basic promise of console gaming -- that you buy the hardware and you're promised a consistent experience with it for the whole lifecycle. Giving developers multiple performance targets to aim for complicates development and takes away from the consumer appeal.

[–] Stovetop@lemmy.world 3 points 2 weeks ago* (last edited 2 weeks ago)

Between last generation and this one, though, we've reached the point where consoles are more like prebuilts. Games have performance targets, and it's up to users to decide when they feel like an upgrade. The only difference is that games (usually) won't release for models that can't run them well, compared to some people who try to squeeze every frame they can out of their 10-year-old potato PCs, though every now and then you still get a Cyberpunk 2077 on consoles.

But there's a reason some games still target the PS4 in 2024: if you're a small-budget indie game that doesn't need the full power of the PS5, why not? Since you no longer get locked out of older titles when you upgrade, which lets newer games keep releasing on older systems, anyone can hold on to a console until they run into a game worth upgrading for.

[–] Deestan@lemmy.world 21 points 2 weeks ago

It's playable and you can enjoy the game, but 30FPS is embarrassing. It makes me feel like I'm a kid playing on a PC assembled out of old leftover components. Which was tolerable when I was a cashless kid playing pirated games on inherited frankenPCs, but it feels so wrong when playing a bought game on its intended spec hardware.

[–] PunchingWood@lemmy.world 16 points 2 weeks ago* (last edited 2 weeks ago) (5 children)

It’s a first-person, single-player game, you don’t necessarily need that 60 frames

These people shouldn't be allowed to work in game development.

Just grow a fucking pair and say that the Xbox isn't powerful enough to run it at anything beyond that.

[–] ShinkanTrain@lemmy.ml 10 points 2 weeks ago

Dev: The Xbox isn't powerful enough for that

Phil Spencer: You now work at the CoD mines

[–] Ghoelian@lemmy.dbzer0.com 8 points 2 weeks ago (1 children)

I'd say 60+fps is especially necessary for first-person games. I seriously have issues making out objects and other things when looking around first-person at 30fps.

[–] xavier666@lemm.ee 8 points 2 weeks ago (2 children)

60 fps is the bare minimum for FPS games

[–] RxBrad@infosec.pub 4 points 2 weeks ago

Luckily, this is about as much of an FPS as Skyrim.

Skyrim, too, was 30fps when it first released on PS3/360 back in 2011. None of this is new.

[–] intensely_human@lemm.ee 2 points 2 weeks ago

I’d say 1 FPS is the minimum for an FPS game 🤷

[–] RxBrad@infosec.pub 5 points 2 weeks ago (3 children)

Both can be true.

I mean... 30fps has been the single-player console experience for as long as I can remember. (Except for the PS4/Xbox One-native games -- seemingly this entire generation -- which get 60fps on current gen.)

Yes, PC can do 60fps+ if your rig is beefy enough. Yay.

Console wars bullshit is insufferable. Even when PC is one of the consoles.

[–] Ghoelian@lemmy.dbzer0.com 9 points 2 weeks ago (1 children)

Yeah, but on PC you usually get graphics settings you can tune to whatever you like. I'd personally rather have a slightly worse-looking game running at 60+fps than a beautiful one at 30.

[–] RxBrad@infosec.pub 4 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

That was an option on console for most of the generation so far: Performance Mode vs. Quality Mode. But that's mostly because nearly every game released so far has been a hastily ported last-gen title. It feels like this gen has really just barely started.

Single-player console games being 30fps is not new by any stretch. That's basically what consoles do. And they've managed pretty well with it so far. If you want to spend 2-3x more on a beefy PC, you can get all the frames you want. More power to you.

In the PS3/360 generation: Skyrim, Fallout, The Last of Us 1, GTA 4-5. 30fps.

In the PS4/Xbox One generation: God of War, Gears of War single-player, Fallout 4, The Last of Us 2. Also 30fps.

[–] Ghoelian@lemmy.dbzer0.com 1 points 2 weeks ago* (last edited 2 weeks ago)

Single-player console games being 30fps is not new by any stretch

Yeah, I know; that's why I never really got into console gaming, unfortunately. As I said elsewhere, I genuinely have trouble making out objects while looking around in first-person games if they're running at 30fps.

Didn't know the current gen has performance settings; that's pretty neat. I might actually consider getting one if I can run games at a reasonable framerate with a lower quality setting.

[–] simple@lemm.ee 5 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

say that the Xbox isn’t powerful enough to run it at anything beyond that.

There's no way they can't just lower the resolution and apply upscaling like every other game that has a quality and performance mode. They're intentionally locking it to 30 for some bizarre reason.

"It's 4K in the X. It's 1440 on the S. We do lock it at 30, because we want that fidelity, we want all that stuff. We don't want to sacrifice any of it."

[–] PunchingWood@lemmy.world 2 points 2 weeks ago* (last edited 2 weeks ago)

I'd hope it's not for the same reason Bethesda locked their framerates -- their games' physics and other systems would break when you unlocked them. I assume it's not, if it's only locked on Xbox, which would just mean the console is too weak.

[–] ampersandrew@lemmy.world 2 points 2 weeks ago

If they said that, Microsoft wouldn't allow them to work in game development.

[–] bokherif@lemmy.world 14 points 2 weeks ago

30FPS gaming should be illegal in 2024

[–] AFC1886VCC@reddthat.com 7 points 2 weeks ago (1 children)

Lmao, I can run Ghost of Tsushima on my Steam Deck at 60fps. This is pathetic for current-gen consoles.

What settings? GoT Director's Cut only runs at 40fps on my OLED.

[–] aluminium@lemmy.world 6 points 2 weeks ago

UE5, surprise surprise.

[–] Katana314@lemmy.world 3 points 2 weeks ago (1 children)

I'm aware that the PS5 is low on "exclusives". A big part of the reason I got it was for simple things like being able to run old PS4 games at higher framerates.

We're past the point of diminishing returns on visuals for games; not to say games can't look ugly, but with decent art direction, the capabilities of current consoles are more than enough. That's why Nintendo was still able to sell Tears of the Kingdom for $70.

[–] acosmichippo@lemmy.world 4 points 2 weeks ago (1 children)

The PS5 certainly has better exclusives than Xbox.

[–] Katana314@lemmy.world 2 points 2 weeks ago

True - I think I meant to refer to it more as a generational issue; many people haven't upgraded to either current-gen console yet because they don't technically need them. PS5 might have few exclusives, but Xbox has basically none. (Many of their heavy-hitters like Sea of Thieves still run on Xbox One)

[–] Dariusmiles2123@sh.itjust.works 1 points 2 weeks ago (1 children)

I think people give too much importance to such things.

I'm not saying 60fps isn't nice, but it ain't the most important thing, and I feel like draw distance and stuff like that are more distracting.

For me it's crazy to still have racing games where shadows or trees appear too late, or where things just abnormally disappear in the rear-view mirror.

[–] acosmichippo@lemmy.world 2 points 2 weeks ago

I generally agree, but it should still be trivially easy to ship the game with at least two options, as most PS5 games do: one high-fidelity mode at 30fps and one high-performance mode at ~60fps.
