When it comes to a proper gaming setup, everything counts. Most gamers take care of the major stuff first – the mouse, keyboard, tower, and of course the internals of the PC – to make sure they have the best advantage possible and avoid even a hint of lag.
But what about monitors? We’ve already covered the big question surrounding HDR, so now it’s time to look at refresh rate – what is it, and is it worth shelling out for that expensive monitor that boasts 144Hz?
Let’s take a look!
Everybody Hertz – Understanding Refresh Rate
First of all, let’s take a quick look at what exactly refresh rate is – how it changes a monitor’s performance, and by extension, your gaming performance.
To clear up any confusion, it's worth noting that refresh rate and FPS (frames per second – not first-person shooter, in this instance) are related but not the same thing, even though they're often used interchangeably. Refresh rate refers to how often your monitor updates the image on your screen.
The time between these updates is measured in milliseconds (ms), as even the monitors with the slowest refresh rates update many times in a second. If they didn't, moving images on your screen would appear jumpy and delayed.

The refresh rate itself is measured in hertz (Hz), the number of updates per second, so the time in ms between updates is simply 1000 divided by the Hz figure. A rate such as 144Hz is usually quoted as the headline measure of a monitor's capabilities.
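To make the Hz-to-milliseconds relationship concrete, here's a minimal sketch (the `frame_time_ms` helper is just an illustrative name, not part of any real tool) that works out the gap between updates for some common refresh rates:

```python
def frame_time_ms(refresh_rate_hz: float) -> float:
    """Milliseconds between screen updates at a given refresh rate (Hz)."""
    return 1000.0 / refresh_rate_hz

# Common monitor refresh rates and the resulting time per frame.
for hz in (60, 120, 144, 240):
    print(f"{hz}Hz -> {frame_time_ms(hz):.2f} ms per frame")
```

Running this shows a 60Hz monitor redrawing roughly every 16.7ms, while a 240Hz monitor redraws about every 4.2ms.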
FPS usually relates to the GPU (graphics processing unit) of your computer, and the FPS number is the maximum number of frames your GPU is pushing (for your monitor to display).
But, if you’re thinking of stopping there and going out to spend hundreds of dollars on a new monitor, wait! If your GPU and CPU (central processing unit) aren’t capable of higher refresh rates, they won’t be able to keep up with your monitor, and you’re essentially wasting your money.
So, when it comes to higher refresh rates – we’re talking 120Hz, 144Hz, 180Hz, and even a staggering 240Hz – what is the actual benefit of shelling out on a high-end monitor? Do those extra few milliseconds make all the difference, or is it money ill-spent on something you won’t even notice?
Refresh Rates and Resolution
Many of you may be wondering how refresh rate works in tandem with your screen’s resolution. Obviously, the higher your screen resolution, the sharper, crisper, and more detailed the images on it are going to look.
Problems can arise when you have a super high-quality image and a super-fast refresh rate. Your GPU and your monitor may not be capable enough to deliver both top-notch image quality as well as a super high refresh rate.
At 4K, for example, most GPUs will struggle to keep up with a 144Hz refresh rate monitor, so it’s better to seriously consider which is more important to you: image quality or speed and smoothness?
At 1440p (often marketed as 2K), modern GPUs will fare a lot better with higher refresh rates – even 120Hz and 144Hz. But if you start pushing that refresh rate higher (180Hz to 240Hz), again, you’ll likely run into problems.
120Hz, 144Hz, and Beyond – When Is It Worth It?
Let’s assume that if you’re considering buying a high refresh rate monitor, you’ve got all other aspects in place, namely a capable GPU and CPU. When, if ever, is it worth making the jump to those high refresh rate screens?
Well, in reality, if you’re already using a good high refresh rate monitor, it’s unlikely that you’ll notice much of a difference if you were to make the change. That doesn’t mean it’s never worth doing, however.
Let’s take a look at a few situations where upgrading your gaming monitor might be worth it.
Your Current Monitor Has a Low Refresh Rate
Most standard monitors have a refresh rate of 60Hz. That means the image on the screen updates 60 times every second.
That sounds like a lot, and in many other applications something happening 60 times in a second would be deemed incredibly fast. But when it comes to the human eye perceiving images on a screen, it’s simply the standard.
Things will look fine; not spectacular. If you’re just gaming casually, or you’re not that fussed about the smoothness of the graphics or any kind of competitive edge, you might not even consider upgrading.
But if you do, you will notice a world of difference. The jump from 60Hz to 120Hz or more is a big one, and your games (and other media) will appear a lot smoother if you upgrade. In this instance, if you’re looking to improve your gaming experience and/or performance, upgrading from a standard (60Hz) or very low (30Hz) refresh rate to 120Hz or more will bring a noticeable jump in quality.
You’re a Competitive or Professional Gamer
Most competitive or professional gamers will probably already be using a high-spec monitor of 120Hz or more. The fact of the matter is that these players are probably among the only people out there who will notice a difference between 120Hz and 144Hz.
When you’re playing fast-paced, detail-oriented games like popular online multiplayer first-person shooters, every (milli)second counts! The fraction-of-a-second delay difference between 120Hz and 144Hz can make all the difference in the heat of the moment.
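Just how small is that fraction of a second? A quick back-of-the-envelope calculation (plain arithmetic, using the 1000-divided-by-Hz frame time mentioned earlier) puts a number on it:

```python
# Per-frame delay at each refresh rate, and the gap between them.
delay_120 = 1000 / 120  # about 8.33 ms between updates
delay_144 = 1000 / 144  # about 6.94 ms between updates
delta_ms = delay_120 - delay_144

print(f"144Hz shaves roughly {delta_ms:.2f} ms off each frame")  # roughly 1.39 ms
```

So the edge is on the order of a millisecond and a half per frame – imperceptible to most of us, but the kind of margin competitive players care about.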
In reality, most pro gamers will see 120Hz as the minimum entry-level requirement for a monitor that they may consider using. A lot of pros will use 180Hz or even 240Hz monitors to ensure they don’t lose any competitive edge.
As you can see, when you really get down to it, for most gamers, the difference between 120Hz and 144Hz is negligible. The truth is you may spend hundreds of dollars upgrading your monitor and not even be able to see the difference when you start gaming.
Unless you’re playing a lot of fast-paced games, you’re a pro who’s taking their gaming career seriously, or you’re stuck with a low-spec 60Hz (or less) monitor and want to improve your gaming experience, upgrading your screen doesn’t really need to be at the top of your list of priorities.