I must admit, when I got my 144hz monitor I was excited, coming from a 60hz monitor. But even if a game runs at 144 fps I don’t see much of a difference; many people do, but I don’t. It’s a bit smoother, but not much.
But if a game runs at 30 fps it’s horrible. The Crew, for example, can be switched between 30 and 60 fps, and that’s night and day!
Yeah, 144hz makes a significant difference for competitive FPS games (especially fast-paced ones like Overwatch), but I hardly notice a difference when playing single-player or PvE-oriented games.
Hell, on some games (e.g. Borderlands 3 and CP2077) I actually prefer to play on my 60hz monitor since a smooth 60hz is much more enjoyable IMO than an inconsistent 100-144hz experience. My computer is admittedly pretty old though.
Just to make sure since it does happen a lot, you did change your monitor refresh rate in your OS, right? Windows for some reason really likes to not default to higher than 60hz. You’d also probably want to enable variable refresh rate in your GPU settings if available. And if you do have VRR, some games are weird and have a specific Vsync option for it, while in others you can just use VRR with normal Vsync fine.
Was gonna say the same. I’ve had this discussion before…
“Dude 144hz is a scam it’s the same as 60 for me” my brother in Christ, did you enable it in windows?!
Yea it honestly shocks me - I mean… not really but yknow - that Microsoft has not done anything about it. Surely someone from the team that keeps trying to jam Edge down people’s throats could just port that shit over for when people have 60hz+ monitors plugged in.
Two things are important here:
The faster something on screen moves, the higher your framerate needs to be for a certain level of motion blur.
A 2D point and click adventure at 30fps could have comparable motion blur to a competitive shooter at 180, for example
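One way to put rough numbers on that (the speeds below are made up, just for illustration): what your eye reacts to is how far something jumps between two frames, i.e. on-screen speed divided by framerate:

```python
# Per-frame displacement: how far an on-screen object travels between
# consecutive frames. Bigger jumps per frame read as blur/judder.
def px_per_frame(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

# Slow-moving cursor in a 2D point-and-click at 30 fps...
adventure = px_per_frame(300, 30)    # 10.0 px per frame
# ...vs a fast flick in a competitive shooter at 180 fps.
shooter = px_per_frame(1800, 180)    # 10.0 px per frame

# Same step size between frames, so comparable perceived smoothness
# despite a 6x difference in framerate.
print(adventure, shooter)
```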
Framerate is inversely proportional to frametimes, which is what makes it harder to notice a difference the higher you go.
From 30 to 60? That’s an improvement of 16.67ms. 60 to 120 makes 8.33ms, 120 to 240 only improves by 4.17ms, and so on
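Those deltas fall straight out of frametime = 1000 / fps; a quick sketch of the diminishing returns:

```python
# Frametime in milliseconds for a given framerate.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

# How much the frametime shrinks when stepping up a framerate tier.
def improvement_ms(fps_from: float, fps_to: float) -> float:
    return frametime_ms(fps_from) - frametime_ms(fps_to)

# Doubling the framerate each time halves the absolute gain.
for lo, hi in [(30, 60), (60, 120), (120, 240)]:
    print(f"{lo} -> {hi} fps: {improvement_ms(lo, hi):.2f} ms less per frame")
```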
Ah, something I want to add:
That’s only explaining the visual aspect, but frametimes are also directly tied to latency.
Some people might notice the visual difference less than the latency benefit. That’s the one topic where opinions on frame generation seem to clash the most, since the interpolated frames provide smoother motion on screen, but don’t change the latency.
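As a rough model (deliberately simplified, ignoring render queues and driver buffering): 2x frame generation doubles how many frames you see, but your inputs only take effect on the real rendered frames, so the latency floor still tracks the rendered rate:

```python
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

rendered_fps = 60     # frames the game actually simulates and renders
displayed_fps = 120   # what hits the screen with 2x frame generation

# Motion smoothness follows the displayed rate...
print(f"displayed frametime:  {frametime_ms(displayed_fps):.2f} ms")
# ...but input latency is bounded by the real frames (interpolation
# typically even holds back one real frame to blend towards).
latency_floor = frametime_ms(rendered_fps)
print(f"input latency floor:  {latency_floor:.2f} ms")
```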
It’s super dependent on the game. Baldur’s Gate 3? 30 fps is more than enough. League of Legends? Yeah, I’ll take those 144hz, tho to be honest I don’t notice a big difference compared to 60 fps.
144hz in Overwatch feels like putting glasses on for the first time; my brain can actually track movement properly.
Most other games I barely notice the difference though
You can cap the fps in software, no need to switch monitors.
Also personally I always notice the difference, even when scrolling webpages
Going back to 60, I notice an extreme difference.
Yeah, the difference is very noticeable once you get used to the higher frame rate.
Yes, many do. I’m just one of the unlucky ones. But at least I can see the difference between 1080p and 4k. It’s the little things in life…