It’s not necessarily poor optimization when they’ve been working with Epic to push the state of the art of real-time graphics tech for the past 5 years.
I’m with you, but 30 FPS isn’t a slideshow. And ever since we were able to get 60 FPS on consoles, devs have been willing to forgo it for fidelity at lower frame rates. I don’t see any reason that trend would stop now.
And that new Marvel game will also be a game with state of the art real time graphics, and it will also run at 30 FPS. Same with GTA 6.
Saying it’s “state of the art” isn’t an excuse for poor optimization. Developers have been able to pump out 60 FPS on way worse hardware and still make it look good. We have more powerful hardware now but worse software. In the past 8 years alone, new optimization techniques have been found, but no one uses them.
Just because Hellblade 2 runs at 30 FPS, it doesn’t mean it’s optimized worse than Metal Gear Solid 2. There’s way more being processed per second in order to render Senua than there is to render Raiden, and it’s a trade-off that the developers decided was worth it, even if you and I disagree. That still doesn’t mean it’s poorly optimized.
No, clearly not, but we’re talking about the latest console from Microsoft. Not saying it’s insanely powerful, but it sure as shit ain’t weak or outdated hardware.
Then you can acknowledge that there are limits to what video games can run on a given set of hardware, regardless of optimization. There’s been diminishing returns in graphics processing since the beginning of time. In order to get to that next step of realism, it’s going to cost more than it took the last time we saw a similar leap.
Well, GTA 6 is also going to be running at 1080p instead of 4K, so that is an option some games use on consoles. I think Jedi Survivor only ran at 720p30, but that one we can definitely say is poorly optimized.
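For anyone curious why dropping resolution is such a common lever, here’s a rough back-of-the-envelope sketch of the raw pixel counts involved (assuming standard 16:9 render targets; real games often use dynamic resolution and upscaling, so treat these as ballpark numbers, not exact workloads):

```python
# Rough per-frame pixel counts at common 16:9 resolutions.
# These are assumed native render targets; actual games may
# render internally at other resolutions and upscale.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
}

base = resolutions["4K"][0] * resolutions["4K"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels/frame "
          f"({base / pixels:.1f}x fewer than 4K)")
```

By this napkin math, 1080p shades roughly a quarter of the pixels that 4K does, which is exactly the kind of budget headroom a console game can trade for fancier lighting or a locked frame rate.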
And we’ve also got a new generation of graphics hardware in the 4 years since the last gen came out. This isn’t new and is always going to happen when you can’t upgrade your hardware. 30 FPS isn’t even that bad, even if it’s not great.
I’m worried this will mean the game runs poorly on PC as well. I’ll wait for a sale down the line before picking this up, even though I really enjoyed the first one.
That the eye can only perceive 24 fps is a myth. Visual perception is very complicated, with many different processes involved, and your eyes and brain don’t strictly perceive things in frames per second. 24 fps was a relatively arbitrary number picked by the early movie industry: it stayed a good amount above 16 fps (below which the illusion of continuous motion starts to break down) without wasting too much more film, and it’s a nice, easily divisible number.

The difference between higher frame rates is quite obvious. Just grab any older PC game so you can get a high frame rate, then cap it at 24, and the difference is night and day. The tons of people complaining about how much they hated the look of the Hobbit movies with their 48 fps footage can attest to this as well. You certainly do start to get some diminishing returns the higher you go, though.

Movies can be shot to deliberately avoid quick camera movements and other things that wouldn’t do well at 24 fps, but video games don’t always have that luxury. For an RPG or something, sure, 30 fps is probably fine. But fighting, action, racing, anything with a lot of movement or especially quick movements of the camera starts to feel pretty bad at 30 compared to 60.
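If you want the rough math behind why fast camera movement is where low frame rates hurt most, here’s a quick sketch (the screen width and sweep time are made-up illustration numbers, not anything from a real game):

```python
# Frame-time budget at common frame rates, plus how far an object
# crossing a screen in one second jumps between consecutive frames.
# SCREEN_WIDTH_PX and SWEEP_TIME_S are assumed values for illustration.
SCREEN_WIDTH_PX = 1920  # assumed display width in pixels
SWEEP_TIME_S = 1.0      # assumed time for an object to cross the screen

for fps in (24, 30, 60, 120):
    frame_time_ms = 1000.0 / fps
    jump_px = SCREEN_WIDTH_PX / (fps * SWEEP_TIME_S)
    print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms/frame, "
          f"~{jump_px:5.1f} px jump per frame")
```

At 30 fps, anything moving that fast teleports twice as many pixels between frames as it does at 60 fps, which is why a quick camera pan reads as judder at 30 even when a slow walking scene looks fine.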