No matter how hard I try to fight it, every year during Black Friday I come down with a bad case of technolust. Stricken with deal frenzy, the rational part of my brain that says "do you really need that?" or "you can't eat a stick of RAM when you're hungry" shorts out. I start eyeing PC upgrades that, for the rest of the year, I'd never seriously consider. This fugue state is how I ended up buying a $600 RTX 3070 in 2020 just to play Quake 2 with ray tracing (and it was worth it). Yesterday I bought a speedy PCIe Gen 4 SSD, and I'm dangerously close to getting a new motherboard and CPU to squeeze every possible ounce of speed out of it.
The one thing I absolutely won't be buying this year is a fancy 4K monitor.
I won't be buying one next year, either. Or the year after. Until GPU power has doubled and then doubled again, I'm sticking with my beloved 2560x1440 monitors, and I think most PC gamers should, too.
4K monitors have been around for years now, and they're no longer the hyper-expensive beasts they were at the beginning. Monitor tech has also pushed past some of 4K's biggest early drawbacks, like limited refresh rates: you can now get a 144Hz 4K panel for $600, or pay $1,000 for a G-Sync model with HDR support and every bell and whistle you could ask for. Pricey, but it's not really the cost that keeps me away from 4K: it's how demanding 4K gaming remains today.
While console games regularly offer a choice between 30 fps "quality" and 60 fps "performance" modes, I think the choice for most PC gamers with a new-ish desktop is a bit different. We treat 60 fps as the acceptable minimum for the most demanding games, while aiming for 120 fps or 144 fps to match today's gaming monitors.
And yes, the human eye can distinguish well beyond 60 fps, if you're wondering. And if you're a fan of lighter competitive games like CS:GO or League of Legends, you're probably aiming for a 240 fps experience instead.
Hitting those kinds of framerates at 4K is still really challenging—especially with ray tracing. In our RTX 4080 review, for example, you'll see that Cyberpunk 2077 with ray tracing on averaged only 31 frames per second; F1 22 came closer, at 57 fps. There are games that can run at native 4K well over 60 fps, of course—Far Cry 6, for example, hit nearly 100. But that's all on a $1,200 graphics card. When the GTX 980 launched eight years ago, it cost only $550, less than half the price.
To be fair, most people gaming on 4K monitors aren't trying to run games at native resolution—they're using DLSS. Upscaling tech like DLSS and FSR is indeed amazing to have at our fingertips: on the RTX 4080, Cyberpunk 2077 jumped to an average framerate of 83 fps with DLSS enabled on its Quality setting. That's a huge improvement, but it's still shy of 120 fps on an incredibly expensive graphics card.
There's an eternal arms race between demanding games and ever-beefier graphics cards. At 4K, the two are in a dead heat, though only if you can afford the latest top-of-the-line hardware. But if you play games at 1440p, you're giving your graphics card a huge advantage—arming it with a crate of shiny new machine guns while your games, even with ray tracing enabled, are still firing bolt actions.
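To put a rough number on that advantage, here's the raw pixel-count arithmetic. (Pixel count is only a crude proxy for GPU workload—per-pixel shading and ray-tracing costs vary—but it shows the scale of the gap.)

```python
# Raw pixels per frame at each common gaming resolution
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 4K asks the GPU to fill 2.25x as many pixels as 1440p, every single frame
print(pixels["4K"] / pixels["1440p"])   # 2.25
print(pixels["4K"] / pixels["1080p"])   # 4.0
```

In other words, the same graphics card has 2.25x the pixel budget per frame to spend on framerate (or ray tracing) at 1440p compared to 4K.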
Even without DLSS, our RTX 4080 tests saw Cyberpunk hit 64 fps on average, F1 22 cracking 106 fps, and Far Cry 6 topping 120 fps. With DLSS enabled, all three of those games will rocket up in performance. Now we're talking high framerates.
Above: 1440p doesn't get as much buzz as 4K, but there are still exciting developments happening with it.
With DLSS on my side, I'm ready for 60 fps to be my worst-case scenario, with the expectation that I'll play most PC games at a smooth, smooth 120-plus. And that's actually affordable with 1440p hardware now. My RTX 3070 is still capable of topping 60 fps with ultra ray tracing in the brand new (and very demanding) Warhammer 40K: Darktide, and I expect to cruise through this graphics card generation without feeling the need to spend $1,000 on a marginal upgrade.
1440p at its most common screen size, 27 inches, is also a great resolution for everyday work. Here's a pixel density comparison with some common screen sizes:
- 1080p monitor, 24 inches: 91.8 PPI
- 1440p monitor, 27 inches: 108.8 PPI
- 4K monitor, 32 inches: 137.7 PPI
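Those figures fall out of the standard pixel-density formula: the diagonal pixel count (via the Pythagorean theorem) divided by the diagonal screen size in inches. A quick sketch, if you want to check your own monitor:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24), 1))  # 91.8  (1080p, 24 inches)
print(round(ppi(2560, 1440, 27), 1))  # 108.8 (1440p, 27 inches)
print(round(ppi(3840, 2160, 32), 1))  # 137.7 (4K, 32 inches)
```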
The screen is dense enough for multiple windows, doesn't require mucking about with scaling settings at the distance I sit (about 29 inches from the screen), and I'm not constantly moving my head to look at a different corner of my monitor.
I've been using these same 1440p, 144Hz monitors since 2015, and every other part of my PC has changed in that time. Multiple cases, multiple motherboards and CPUs, new SSDs and keyboards and mice. But nothing's tempted me to replace the displays so far—and when I do, I can almost guarantee it'll be with another 1440p monitor, unless DLSS 4.0 or 5.0 has completely brought 4K to its knees. Even then, 240 fps sure sounds tempting…