I know they can be situational, but if you had to pick one to rule them all, which of these two would you pick?

  • BombOmOm@lemmy.world · 20 hours ago

    Since I do quite a bit of gaming, 2k/120hz by a wide margin. The lower pixel count makes it easier to turn the graphics settings up (rough numbers below), and the higher frame-rate cap keeps games nice and smooth.

    My main display is actually a 2k/144hz screen right now. :)
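    For a rough sense of the trade-off, here is a back-of-envelope sketch (not from the thread; it assumes "2k" here means 2560×1440 and that the other option in the question is 4K at 60Hz):

    ```python
    # Back-of-envelope numbers for the two options, reading "2K" as
    # 2560x1440 and the alternative as 4K (3840x2160) at 60 Hz.
    options = {
        "2K / 120 Hz": (2560, 1440, 120),
        "4K / 60 Hz":  (3840, 2160, 60),
    }

    for name, (width, height, refresh_hz) in options.items():
        pixels_per_frame = width * height
        frame_budget_ms = 1000 / refresh_hz  # time available to render one frame
        print(f"{name}: {pixels_per_frame / 1e6:.1f} Mpix/frame, "
              f"{frame_budget_ms:.1f} ms/frame")

    # 2K / 120 Hz: 3.7 Mpix/frame, 8.3 ms/frame
    # 4K / 60 Hz:  8.3 Mpix/frame, 16.7 ms/frame
    #
    # 4K renders 2.25x the pixels per frame; at 2K that per-frame work drops
    # by more than half, which is the headroom for higher settings and 120 fps.
    ```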

    • MrVilliam@lemmy.world · 18 hours ago

      Same, though for some reason my wife dislikes the look of high refresh rates in games for the same reason I dislike low ones. I played Jedi Survivor on PS5 for a little bit and felt uneasy until I found the setting to switch it from quality to performance mode. I was getting mild headache and nausea symptoms until I switched it, especially when spinning the camera around quickly. Running at the higher framerate also made gameplay timing much easier. The only drawback was that cutscenes broke immersion, since they were still locked at 30fps.

      I really wish game devs would prioritize hitting 60fps over 4k resolution. This question even assumes 60fps is the lowest acceptable option, yet it's still not a given even at 2k. EA released a game in 2023 that runs at 30fps by default. I enjoyed the game overall, but it ran like dog shit, crashed multiple times, and felt a little short/barren to me. I could excuse all of that from a smaller studio or in the first year of a console's life, but neither was the case here.

      Anyway, yeah, 60fps should be the bare minimum, but it still sometimes isn't, and it literally makes me sick.