Why Framerate Matters
In this article, I try to dispel some of the false notions behind framerates and display technology.
Gamers have long clamored for better framerates, but many developers tend to disagree. Bragging about hitting a consistent 60 FPS is fine as a bare minimum, but where should we draw the line?
The 60 FPS benchmark largely stems from consoles and the displays they’re most commonly attached to. The vast majority of HDTVs only display images at 60 Hz, meaning rendering your game at a higher framerate is largely a waste of processing power, especially when you could soup up the graphics a bit instead. A handful of HDTVs advertise 120 Hz or more, but they often cheat by interpolating between frames (motion smoothing) instead of displaying 120 fresh images per second from the source device. That processing often adds, rather than reduces, input delay. That’s unfortunate, since lower input lag is the primary appeal of high-refresh-rate displays intended for gaming, as opposed to simply watching movies.
Let’s talk a little bit about input delay and some of the math behind it. At 60 Hz, your display shows a new image sixty times per second, or once every 16.67 milliseconds. On top of that, you have to factor in the time it takes your input device (mouse, keyboard or controller) to get its signal to the computer, followed by the time it takes the operating system to process the signal and feed the game a command, followed by any additional delays imposed by the game and its engine (some games are worse than others), and then the time it takes the GPU to render the latest image.
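The pipeline above is essentially a sum of per-stage delays. Here is a minimal sketch of that accounting; the individual stage timings are assumptions chosen for illustration, not measurements of any real system:

```python
# Illustrative input-to-photon latency model.
# Every stage timing below is an assumed, worst-case figure for the example.
STAGES_MS = {
    "input_device": 1.0,      # device transmits its signal (assumed)
    "os_processing": 1.0,     # OS turns the signal into a game command (assumed)
    "game_engine": 16.7,      # up to one simulation tick at 60 Hz
    "gpu_render": 16.7,       # rendering one frame at 60 FPS
    "display_refresh": 16.7,  # a 60 Hz display shows a new image every 16.67 ms
}

def total_latency_ms(stages):
    """Worst-case latency is simply the sum of the stage delays."""
    return sum(stages.values())

print(f"worst-case latency: {total_latency_ms(STAGES_MS):.1f} ms")
```

Even with a near-instant input device and OS, the frame-sized stages dominate, which is why raising the framerate (and refresh rate) is the most effective lever on input delay.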
If that seems confusing, check out this informative video from Battle(non)sense explaining input delay and how it behaves in Overwatch. To simplify: there is very little delay between your input device transmitting the signal and your operating system processing it; the delays come primarily from the GPU and the display, which is precisely why framerate matters.
Wait, am I a mutant?
A meme has long existed in PC gaming “master race” culture that sarcastically claims “the human eye can only see 60 FPS”. That number has gradually increased over the years, from 30 FPS to 40, and from there to today’s 60. The reality is that perception differs from person to person, largely due to genetics alongside having a trained eye. Did the fluorescent lights in your classroom give you headaches as a youngster? You might be one of the statistical outliers who can perceive these rapid changes in light. Not everyone has the same biology: some fighter pilots can reportedly perceive flashes up to 220 Hz, while some folks simply can’t tell the difference between 60 Hz and 120 Hz.
Reaction time also does not directly apply here. The average human reaction time is approximately 250 milliseconds for visual stimuli, and 150 milliseconds for touch stimuli. Saying humans can only react to changes in frames once every 250 milliseconds (i.e., at a rate of 4 Hz) is simply incorrect: we receive a constant stream of information, but there is an approximately 250-millisecond delay between a change on screen and our processing of the new data and inputting a response.
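A quick calculation makes the distinction concrete: during a single ~250 ms reaction window, the display keeps delivering new frames the whole time. A small sketch, using the approximate figures from above:

```python
# How many display refreshes elapse during one ~250 ms visual reaction window?
# The point: perception is a continuous stream, not a 4 Hz sampling loop.
REACTION_MS = 250  # approximate average visual reaction time

def frames_during_reaction(refresh_hz, reaction_ms=REACTION_MS):
    """Number of refreshes that occur while a player is still reacting."""
    return round(refresh_hz * reaction_ms / 1000)

for hz in (60, 120, 240):
    print(f"{hz} Hz: {frames_during_reaction(hz)} frames per reaction window")
```

At 60 Hz you receive roughly 15 fresh images while reacting; at 240 Hz, roughly 60. The reaction delay shifts when you respond, but the higher refresh rate still gets newer information to your eyes sooner.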
It’d be fairer to say that a player is approximately 250 ms behind what is taking place in-game, though even that is somewhat inaccurate. You’re sending the system constant inputs. Say you move your mouse to make a 90-degree turn: you anticipate the turn happening as quickly as your system will allow (the aforementioned delay between the input and the display update). It’s when new stimuli come into play that reaction time becomes a factor. You turn 90 degrees and spot an enemy you didn’t know was there. Now, depending on your reaction time and aptitude with the game, it can take you approximately 250 ms to process that you’ve spotted an enemy, move the crosshair over his or her face and click to fire.
There is also a difference between interactive and non-interactive content. Movies are inherently more viewable at 24 frames per second than a game is playable at that rate. When you’re watching a movie, you’re not constantly made aware of the delay between each frame, because you’re not sending the movie inputs and anticipating feedback. When you’re playing a game and the framerate dips to 24 FPS, your brain immediately recognizes a disparity between when it expects events to happen and when they actually take place, leading to frustration.
Frustration leads to dizziness… Dizziness leads to suffering
Virtual reality has seen explosive growth over the last few years, and one of the biggest technical challenges has been the display refresh rate. Testers reported motion sickness if the display they strapped to their eyes had too low a refresh rate, again because the brain anticipates instant changes in visual stimuli and isn’t receiving them. The solution is simple: use displays that refresh more often, though getting there is easier said than done.
First, shrinking display panels to that size while still offering high resolutions is a challenge that lowers manufacturing yields, driving up cost. Thankfully, smartphone technology has pushed the field far enough along that we now have displays with resolutions of 1080p, 1440p and higher, at refresh rates of 90 Hz, all compact enough to fit into a VR headset and affordable enough to be put into consumer products.
The next challenge is making sure the system driving the VR display can push an equivalent number of frames per second. Your computer must be able to render the VR game at a resolution high enough that you can’t easily perceive individual pixels (the “screen door effect”), and at a rate high enough that your brain isn’t disoriented or confused when your head motions aren’t replicated visually as fast as it expects. For most people this threshold starts a little above 60 Hz, with 90 Hz being a comfortable medium. If your graphics card can’t push out 90 frames per second at 1440p, you won’t just be frustrated by some added input delay; you might start to feel dizzy and disoriented.
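The 90 Hz target translates directly into a per-frame time budget for the GPU. A minimal sketch of that budget check, where the example render times are invented purely for illustration:

```python
# Frame-budget check: at a given refresh rate, how long may the GPU spend
# rendering a single frame without missing a refresh?
def frame_budget_ms(refresh_hz):
    """Milliseconds available per frame at the given refresh rate."""
    return 1000.0 / refresh_hz

def meets_target(render_time_ms, refresh_hz):
    """True if a frame rendered this quickly never misses a refresh."""
    return render_time_ms <= frame_budget_ms(refresh_hz)

print(f"{frame_budget_ms(90):.2f} ms per frame at 90 Hz")
print(meets_target(10.0, 90))  # a 10 ms frame fits the ~11.1 ms budget
print(meets_target(14.0, 90))  # a 14 ms frame misses it
```

At 90 Hz the budget is roughly 11.1 ms, versus 16.67 ms at 60 Hz, which is why a GPU that comfortably drives a monitor can still fall short in VR.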
This all adds up to a relatively high barrier to entry, but as these displays get cheaper, games get better optimized and high-end graphics cards more popular, we’ll see these costs come down and more consumers flood into the market. Understanding the requirements for low input delay and the implications of a high framerate is essential to getting the most out of the traditional console or PC experience, as well as virtual reality.