This whole frame rate debacle seems to have picked up a lot of heat for a number of reasons. First, during the seventh generation some key developers made aesthetic arguments for a lower-frame-rate experience. One prominent example came from Insomniac Games’ Mike Acton, who explained in interviews the studio’s mid-generation decision to develop its games at thirty frames per second (fps), noting that there seemed to be no difference in sales or review scores between 60 and 30 fps games (1). Ironically, no 30 fps Insomniac Games title has yet matched the Metacritic score or the sales numbers of Ratchet and Clank: A Crack in Time, their last 60 fps game. Beyond those winds of change among developers last gen, a lot of pre-announcement hype surrounded the eighth-generation consoles, with both Microsoft’s and Sony’s new hardware supposedly making 1080p resolution and 60 fps the new gold standard. Another case of PR hype getting the best of people. Now that the generation is underway, some developers keep reaching for questionable marketing terms like “cinematic” to explain their aesthetic choices, conveniently avoiding the word “compromise” altogether. It’s that lack of candor in a couple of recent examples that made me want to dig into the topic further.
Before addressing how important frame rate is in the present, it’s worth looking at how we arrived at today’s standards. The advent of motion pictures introduced the concept of individual frames per second and the question of a practical limit that would still be fine for the viewer experience. Back then there wasn’t a specific standard, since film cameras were hand-cranked devices and projectors could output anywhere between 16 and 24 fps (2). Some silent-film artists even took artistic license in speeding up or slowing down the frame rate to suit the mood of a specific scene. When the ability to add sound came into the picture, a standard had to be set: cranking too fast or too slow resulted in inconsistent sound quality, so 24 fps became the standard for 35 mm sound film. That particular number won out for several reasons: it fit practically onto a film reel, the costs weren’t too radical, and there was research Thomas Edison had done on the topic back in the day. He found that eye strain occurs if a person sees fewer than forty-six images in a second (3). While the adopted standard appears to be much less than Edison’s suggested number, that 24 fps figure effectively means 48 or 72 flashes per second, because the two-blade or three-blade shutters in movie projectors show each individual frame twice or three times respectively (3).
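To make that shutter arithmetic concrete, here’s a minimal sketch in plain Python, using only the 24 fps and two/three-blade figures mentioned above:

# Sanity check of the projector shutter math: a projector running at
# 24 frames per second re-flashes each frame with a two- or three-blade
# shutter, so the screen is actually lit 48 or 72 times per second.
FILM_FPS = 24

for blades in (2, 3):
    flashes_per_second = FILM_FPS * blades
    print(f"{blades}-blade shutter: {FILM_FPS} fps x {blades} = {flashes_per_second} flashes per second")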
With something like Edison’s minimum sitting at that number…where does that leave us with regard to the maximum frame rate the human eye can detect? If you’re like me, you’ve probably been told the ceiling is anything from 60 fps down to 30 or even 24 fps. None of those answers really hold up, largely because of how the question is framed. Getting a clearer picture relies on two separate pieces of information:
1.) The first way to dismantle those old preconceived notions is a study done by the US Air Force. It showed that, under specific circumstances, fighter pilots could correctly identify an image flashed on screen for 1/220 of a second (roughly 4.5 milliseconds) (4). By comparison, 60 fps means each individual image sits on screen for about 16.7 milliseconds (see the rough numbers sketched out after this list).
2.) Interesting as that data point is, the fundamental problem is that this may be the wrong question to ask in the first place. The eye is one of the most complex organs humans have, and its intricacies mean it doesn’t transmit vision to the brain as a consistent series of still images the way a computer screen does. Variables such as genetics, lifestyle, training, and so on all play a role in how brief a flash of light a given eye can detect.
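For reference, here’s the rough math behind those two points: a small Python sketch comparing the flash duration from the Air Force study with how long a single frame stays on screen at common frame rates.

# Approximate numbers only: flash duration from the Air Force anecdote
# versus the time one frame is displayed at common film/game frame rates.
flash_duration_ms = 1000 / 220          # ~4.5 ms flash from the study

for fps in (24, 30, 60):
    frame_time_ms = 1000 / fps          # how long each frame stays on screen
    print(f"{fps:>2} fps -> {frame_time_ms:5.1f} ms per frame "
          f"(flash in the study: {flash_duration_ms:.1f} ms)")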
While more research is needed to pin down a “frame rate cap” for the average human eye, it clearly isn’t the mid-to-low double-digit number that some corners of the internet have espoused for so long.
If that’s the case, why has film gotten away with such a low frame rate for so long, and why, whenever changes do come along, like The Hobbit films being shot at 48 fps, are the results mixed for certain film-goers (5)? Before getting into that, we can’t ignore that using film as a crutch when defending 30 fps games is a false equivalence. It’s just that simple. Film has the advantage of every shot being composed from a fixed perspective, typically by someone who knows how to frame a scene, and the advantage of motion blur filling in the gaps between individual images. The reason motion blur doesn’t draw heaps of complaints in film the way it can in games is that on film it’s a natural side effect of how light is captured by celluloid or a camera sensor, whereas in games it has to be simulated. I didn’t find it to be a big issue in the likes of Alan Wake or Killzone 2, but that clearly wasn’t true for everyone, and that artificiality could be part of why it annoyed some players. So why did certain film critics push back against The Hobbit running at 48 fps in select theaters? Keep in mind that those filmmakers are dealing with nascent technology they’ll have to work out over time; it’s more reasonable to appreciate the new demands that doubled visual clarity places on set design, costumes, CGI, and so on than to slam the format outright. Funnily enough, the fact that some critics and film-goers noticed the change at all is yet another way to dismantle the idea that 24 or 30 fps is the highest threshold the human eye can perceive.
“Fine,” one might say, “you make a case for how film and games differ when it comes to the importance of frame rate, but could it really be THAT important?” Well, that depends on what you mean by THAT important, but it stands to reason that the experience is undeniably smoother.
Whenever I’ve seen these arguments downplayed, still images are often used to “prove” that the difference between 30 and 60 fps is barely noticeable. The problem with that kind of comparison is that it doesn’t represent what the difference looks like once player movement is involved; and since there are plenty of side-by-side demonstrations across the web (6), that counter-argument strikes me as intellectually dishonest. A relatable analogy for sports fans is a burst-mode camera that takes a fixed number of shots per second, say eight or twelve. Look at that camera’s sequence of a baseball player taking a swing and you can see the drastic amount of new information in each frame. You can see the same kind of difference in gameplay videos running at full speed versus slowed to about 25% speed, and notice how long it takes new visual information to appear when the camera moves around.
Bear in mind: it’s not just about seeing new visual information arrive on screen at a quicker rate, but also REACTING to it, be it enemies or otherwise, at a faster rate. This is why 60 fps became the heralded gold standard back in the 8- and 16-bit days. Games being fundamentally built on player input raises the stakes compared to passive forms of media. Take the still-picture example from earlier, but change it to reacting to an enemy sniper in the distance: the transitions between spotting the target, getting a bead on it, and taking the shot feel smoother with double the frames of the typical seventh-gen console shooter, even though the action takes the same amount of time to accomplish; it’s essentially the Inception logic of game feel when you think about it.
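To illustrate that reaction angle, here’s a hedged sketch; the 250-millisecond reaction window below is a hypothetical round number chosen purely for the example, not a figure from anything cited here:

# Illustration only: how many frame updates land inside a fixed reaction
# window at 30 versus 60 fps. More updates means the target's movement is
# shown in smaller, smoother steps while you react to it.
REACTION_WINDOW_MS = 250  # hypothetical reaction time for the sketch

for fps in (30, 60):
    frame_time_ms = 1000 / fps
    frames_shown = REACTION_WINDOW_MS / frame_time_ms
    print(f"At {fps} fps, a {REACTION_WINDOW_MS} ms reaction window contains "
          f"{frames_shown:.1f} frame updates")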
There are plenty of studies that go much, much deeper into this subject, not to mention the first-hand accounts of the typical or hardcore PC crowd whenever they get to toy with the technical settings of their games. Even the simple math of a 33.3 millisecond gap between frames at 30 fps versus 16.7 milliseconds at 60 fps shows that a higher frame rate objectively trims down input lag. In an ideal world this debate would have been over a long time ago; as it stands, with marketing being such a key part of the AAA scene, you can still watch developers trying to conjure up some kind of excuse, or even validating a lower frame rate as the better option.
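And here’s that simple math spelled out: a quick sketch of the per-frame times and the worst-case wait an input can face before the next frame is even drawn, ignoring every other source of latency in the engine, OS, and display.

# At a steady frame rate, an input that arrives just after a frame is
# presented can wait up to one full frame period before its result is
# shown. Other latency sources (engine, OS, display) are ignored here.
for fps in (30, 60):
    frame_time_ms = 1000 / fps
    print(f"{fps} fps: new frame every {frame_time_ms:.1f} ms, "
          f"so an input may wait up to ~{frame_time_ms:.1f} ms before the next frame reflects it")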
=============================
Continued in Part 2
=============================
Provided Links:
(1.): http://www.eurogamer.net/ar...
(2.): http://books.google.com/boo...
(3.): http://web.archive.org/web/... http://www.cinemaweb.com/si...
(4.): http://amo.net/NT/02-21-01F...
(5.): http://www.theguardian.com/...
(6.): http://30vs60.com/