I’ve played some action games in the teens and was fine with it. Maybe a low frame rate at a low resolution (1080p) isn’t as apparent as it is at 4K, but I’ve never understood why people can’t play at frame rates still far faster than film (assuming the display is truly refreshing frames completely and not tearing the picture, of course). I suppose this argument goes the same direction as the vinyl/CD one, with both sides dead sure they’re right.
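(For what I mean by “refreshing frames completely”: with vsync on, the buffer swap waits for the vertical blank, so you see whole frames instead of torn ones. A minimal sketch with GLFW, purely my own example, nothing to do with the game being discussed:)

```cpp
// Minimal vsync sketch (my illustration, not the game's code).
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* win = glfwCreateWindow(1920, 1080, "demo", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);  // 1 = wait for the vertical blank before swapping

    while (!glfwWindowShouldClose(win)) {
        // ... render here ...
        glfwSwapBuffers(win);  // with vsync on, blocks until the next refresh
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```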
If the game handles frame rate variations during play badly, that’s a different story. The goal is for the player not to notice the change and to stay focused on the game.
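(Roughly what “handling it well” means to me: tie movement to measured frame time rather than frame count, so a dip from 60 to 30 fps costs smoothness but not game speed. A minimal sketch, again just my own illustration:)

```cpp
// Frame-rate-independent movement sketch: scale by measured frame time.
#include <chrono>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    auto prev = clock::now();
    double x = 0.0;             // player position
    const double speed = 5.0;   // units per *second*, not per frame

    for (int frame = 0; frame < 300; ++frame) {
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - prev).count();
        prev = now;

        x += speed * dt;        // same distance per second at any frame rate
        // render(x) would go here
    }
    std::printf("final position: %f\n", x);
    return 0;
}
```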
I started out playing Doom on a 386, in a tiny, tiny viewport, and until recently my hardware was always behind the curve. I remember playing Oblivion at 640x480, and enjoying foggy weather in San Andreas because the reduced draw distance made my fps a lot better.
Over the years I’ve trained my brain to do amazing real-time upscaling, anti-aliasing, hell, even frame generation. nVidia has nothing on the neural network in my head.
But not everyone has that experience, and smooth FPS is always better, even if I can tolerate terrible performance when the game is good enough.
Some of the settings are messed up, I think. The game can definitely run faster than that on that hardware by toning down some settings. They really should have changed the defaults or straight up removed some visual settings, given what they do to the game. In my experience, the volumetric clouds, reflections, and GI presets are all messed up and cost a disproportionate amount of performance when maxed out.
We’ve been warned, and I expected performance to be rough, but ~35 fps on a 4090 is a new low for me.
Yeah, there’s “bad” and there’s “embarrassingly terrible”.
And then there’s everything not triple-A, which is 99% terrible but 1% gold.
Well, maybe the high settings were created for a 6090, or even a 7090?