
I Finally Upgraded to a 1440p Monitor and I Don’t Regret a Thing

Stalker: Anomaly running on a Dell SG3220DGF monitor.
Photo: Tom McKay/Gizmodo

I’m a few years behind the curve when it comes to PCs. I’ve never really had the latest and greatest hardware; I’ve historically made do with a mix of older, secondhand, or budget parts, whether it’s the CPU, RAM, and graphics card or peripherals like speakers and screens.

For example, my PC’s sound system is a Samsung Blu-ray player from 2012 that I’ve had to rip various wires out of and install hacked drivers to keep working. I’ve got a secondhand gaming mouse from an online auction site and a mélange of hard drives, some dating back to at least 2013. And until last week, my primary monitor was still running at a resolution that first became standard like a decade ago.

But I finally bit the bullet and upgraded from a 1920x1080 monitor to a 2560x1440, 32-inch Dell SG3220DGF that was recently on sale. I’m not here to talk up the Dell (it is quite good, though). But I am here to say that if you, like me, have been hemming and hawing for years about going beyond 1080p for everyday work and gaming, the time to make the leap is now. And if you thought going beyond 1080p was just an overly hyped talking point from gamers who think sub-$700 graphics cards are for “casuals,” you’ve probably held out long enough. I am admitting defeat. 1080p as a primary display resolution is never going to cut it for me again.

My monitor history, as best I can recall it:

  • Two low-quality 1680 x 1050 monitors: an Acer X193W+BD acquired in 2009, and a Dell found abandoned in a Florida dorm room sometime between 2007 and 2011. The Acer is still in service, and the Dell would be if it hadn’t been damaged during a move in 2018.
  • In July 2018, I bought a 24-inch 1080p Acer GN246HL gaming monitor.
  • I apparently wasn’t satisfied with the upgrade, because in September 2018 I bought a 32-inch, 1080p Viotek GN32C with a 144Hz refresh rate.

That means, essentially, that I skipped an entire decade of monitor upgrades from 2007 to 2018 before buying two 1080p panels. Even when I finally sprang for a huge 32-inch monitor, it was just 69 pixels per inch. Instead of buying one cutting-edge monitor, I hooked up three older ones.

This made complete and total sense to me. For me, 1080p was the sweet spot. I didn’t need a higher resolution monitor for day-to-day work, and 1080p was exactly the resolution where someone with acceptable hardware could crank up the settings to very high or max and still get playable framerates. (It apparently never even occurred to me until 2018 that my framerates were being capped at the older monitors’ max refresh rate of 60-75 Hz.) The 32-inch Viotek monitor—which by the time I purchased it in September 2018 would have been considered headache-inducingly blurry by many gamers—seemed like a treasure at the time.

And this put me squarely in line with most other gamers. If you were to base your impression of what type of hardware the average PC gamer has on the specs of tech that’s shown up in reviews in the past few years, you’d be mistaking the aspirational for what most people actually own. The typical consumer doesn’t upgrade hardware at anywhere near the breakneck pace manufacturers would like you to believe because staying a few generations behind is the optimal approach for anyone who isn’t rich.

The Steam Hardware Survey shows that over two-thirds of respondents (67.29 percent) in February 2021 reported that they’re still using a primary display with a resolution of 1080p. (Similarly, some 42 percent were still using Nvidia 10-series graphics cards, and 57.5 percent had four or fewer CPU cores.)

That all said, there’s no way in hell I could go back to 1080p now that I have a 1440p monitor. What was I thinking? 

I’m telling you, I was somewhat skeptical it would make a huge difference, but at 32 inches the difference between 1080p and 1440p is massive. The on-screen text seemed unreadably tiny at first, but now that my eyes have adjusted, the increased clarity actually makes reading much less taxing on my eyes than before. 1440p is a whopping 78 percent increase in pixels over 1080p, which translated into a far bigger jump in gaming clarity than I anticipated.
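If you want to double-check that math, both the pixel-count jump and the pixels-per-inch figure I mentioned earlier fall out of a few lines of arithmetic. Here’s a quick, purely illustrative Python sketch (the helper names are mine, not from any particular tool):

    import math

    def pixel_count(width, height):
        # Total pixels on the panel.
        return width * height

    def ppi(width, height, diagonal_inches):
        # Pixel density: diagonal resolution in pixels divided by diagonal size in inches.
        return math.hypot(width, height) / diagonal_inches

    p1080 = pixel_count(1920, 1080)  # 2,073,600 pixels
    p1440 = pixel_count(2560, 1440)  # 3,686,400 pixels

    print(f"1440p vs. 1080p: {(p1440 / p1080 - 1) * 100:.0f}% more pixels")  # ~78%
    print(f"32-inch 1080p: {ppi(1920, 1080, 32):.0f} PPI")                   # ~69 PPI
    print(f"32-inch 1440p: {ppi(2560, 1440, 32):.0f} PPI")                   # ~92 PPI

Same 32-inch screen, roughly a third more pixels per inch, which lines up with the jump in sharpness my eyes noticed.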

And while it’s more taxing on hardware, 1440p comes with nowhere near the performance cost of 4K gaming, which is still galling for anyone not willing to drop $500 on a graphics card. 1440p gaming is realistically achievable on middle-of-the-pack hardware and, if you’re already getting good performance at 1080p, you likely won’t require any PC upgrades at all. Conversely, buying a 4K monitor that your PC just can’t keep up with means you’ll have to rely on an interpolated lower resolution that looks a good bit worse than if you’d just gone with 1440p. Sticking with 1080p has its own diminishing returns: as game visuals grow more complex, more jagged edges appear that have to be dealt with through increased anti-aliasing or other hardware-taxing solutions like Dynamic Super Resolution (Nvidia) or Virtual Super Resolution (AMD).

There are other benefits to the monitor upgrade, too: if you’re still running 1080p, odds are pretty good you have an older TN or IPS panel that underperforms relative to newer screens in many more ways than raw resolution. A moderately priced 1440p VA panel released in the past few years is likely to have far better brightness, contrast, and color, as well as significantly less backlight bleed. For example, while playing Stalker: Anomaly on the Viotek monitor in a dark room, walking around nighttime levels resulted in an obnoxious and very visible glow around the top of the screen—enough to make it look like the sky was covered in moonlit fog. Your results may vary, but in my case, the new Dell monitor reduced the backlight bleed to far less perceptible levels.

I paid $380 for the Dell monitor, which is a lot. But most 1440p monitors are nowhere near that price point. RTings.com’s Winter 2021 list has models in the $300 range, and even ones cheaper than that are likely to be a visual upgrade if you’re still using a 1080p panel from 2015 or whatever. Used or refurbished hardware is often just as good, and 1440p is now widespread enough that it’s possible to find good deals on used equipment in fine condition.

I’m an equipment hoarder who never throws out or sells old hardware, and thus I now have three monitors plugged in (it’d be all four if my graphics card had enough ports). But if you’re like most PC owners and just have one, an additional upshot of getting a 1440p monitor is that you can just leave the old one hooked up too and enjoy a multi-screen desktop.

Look, 1080p is fine, it’s not obsolete, and I’m not calling for it to go gentle into that good night. It’s served us well for over a decade, and it’s going to continue to serve us well for at least one more (and possibly far beyond that). There’s no need to buy a new monitor just for the hell of it. What I am saying, however, is that if you’re considering an upgrade at any time in the near future, 1080p is no longer the best bang for your buck: 1440p is. I’m a convert.

