• Rhaedas@kbin.social · 1 year ago

    I’ve played some action games at frame rates in the teens and was fine with it. Maybe a low frame rate is less apparent at a low resolution (1080p) than at 4K, but I’ve never understood why people can’t play at frame rates still far faster than film (if it’s truly refreshing the frames completely and not tearing the picture, of course). I suppose this argument goes the same direction as the vinyl/CD one, with both sides dead sure they’re right.
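    (A note on “refreshing the frames completely”: tearing shows up when the buffer swap isn’t synced to the display refresh, so part of the old frame is still on screen when the new one lands. A minimal sketch of turning vsync on, assuming GLFW and a working OpenGL context; every engine exposes this differently:)

    ```c
    /* Minimal GLFW sketch: vsync makes the buffer swap wait for the
     * display's vertical refresh, which is what prevents tearing. */
    #include <GLFW/glfw3.h>

    int main(void) {
        if (!glfwInit()) return 1;
        GLFWwindow *win = glfwCreateWindow(1280, 720, "demo", NULL, NULL);
        if (!win) { glfwTerminate(); return 1; }
        glfwMakeContextCurrent(win);
        glfwSwapInterval(1); /* 1 = wait for vblank (vsync on); 0 = may tear */

        while (!glfwWindowShouldClose(win)) {
            /* ... render the frame ... */
            glfwSwapBuffers(win); /* with vsync on, blocks until next refresh */
            glfwPollEvents();
        }
        glfwTerminate();
        return 0;
    }
    ```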

    If the game handles frame rate variations badly during play, that’s a different story. The goal is for the player not to notice the change and to stay focused on the game.
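    (Handling it well usually comes down to scaling each update by the real time the last frame took, instead of assuming a fixed frame rate. A rough sketch, again with GLFW’s clock just for illustration; the speed value here is made up:)

    ```c
    /* Frame-rate-independent movement: scale updates by elapsed time,
     * so the game runs at the same speed at 15 fps or 144 fps. */
    #include <stdio.h>
    #include <GLFW/glfw3.h>

    int main(void) {
        if (!glfwInit()) return 1;

        double last = glfwGetTime();
        double x = 0.0;
        const double speed = 120.0;  /* units per SECOND, not per frame */

        for (int frame = 0; frame < 300; frame++) {
            double now = glfwGetTime();
            double dt = now - last;  /* how long the previous frame took */
            last = now;

            x += speed * dt;         /* same distance per second at any fps */
            /* ... render, poll events ... */
        }
        printf("moved %.1f units\n", x);
        glfwTerminate();
        return 0;
    }
    ```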

    • Klear@sh.itjust.works · 1 year ago

      I started out playing Doom on a 386, in a tiny tiny viewport, and until recently my hardware was always behind the curve. I remember playing Oblivion at 640 x 380, and enjoying foggy weather in San Andreas because the reduced draw distance made my fps a lot better.

      Over the years I’ve trained my brain to do amazing real-time upscaling, anti-aliasing, hell, even frame generation. nVidia has nothing on the neural network in my head.

      But not everyone has this experience, and a smooth frame rate is always better, even if I can handle terrible performance when the game is any good.