[Continued from Part 1]
There have been two key examples—at least in my recent memory—that most would scratch their heads over. The first was Ubisoft’s damage control over Assassin’s Creed: Unity running at 900p and 30 fps (a rate that wasn’t even locked) across both 8th-gen platforms. While the console parity is another argument that’s fair to make, the excuse Ubisoft used for that frame rate was that it made the game more “cinematic,” an excuse that’s gotten pretty old at this point. Another key instance in 2014 was when developers at Ready at Dawn explained their rationale behind two key visual choices: locking at 30 fps and setting the resolution to 1920×800, the letterboxed 2.40:1 aspect ratio standard for widescreen films.
“60 fps is really responsive and really cool. I enjoy playing games in 60 fps. But one thing that really changes is the aesthetic of the game in 60 fps. We're going for this filmic look, so one thing that we knew immediately was films run at 24 fps. We're gonna run at 30 because 24 fps does not feel good to play. So there's one concession in terms of making it aesthetically pleasing, because it just has to feel good to play. (7)” (Dana Jan, The Order: 1886 Director)
“[30 fps] feels more cinematic [and] it actually feels better for people when it’s at 30 fps. (8)” (Alex Amancio, Assassin’s Creed: Unity Creative Director)
I have no qualms with that underused type of resolution, which I thought worked well in a game like Beyond: Two Souls, especially considering it can also help with frame rate. But this idea of deliberately seeking out a less smooth frame rate than that once-golden standard, as if the smoother option somehow harms the game’s aesthetic, doesn’t sit well with me, because it’s a lie. In both AC:U’s case and this one, it’s pretty obvious the marketing department is just doing damage control. There have been many examples in the past where that sacrifice was made for the visual fidelity of the time, but it’s tough for me to remember hearing or seeing a creator suggest that a lower frame rate cap could actually be seen as some kind of benefit. And in a gaming world where Ready at Dawn’s new IP is backed by one of the most popular publishers in the industry and consistently showcased on its demo stages, I worry about the influence such statements can have on those less knowledgeable about the subject, or even on developers looking to make games that emulate cinema.
Judging by recent comments from other big-name developers, it seems this sort of PR dodge isn’t going away any time soon. Whether it’s Naughty Dog not pushing for 60 fps in Uncharted 4 if doing so means they’d “lose something that would really impact the player’s experience (9),” or CD Projekt Red providing a mostly harmless response regarding the PS4/Xbox One versions of The Witcher 3 going for a locked 30 fps experience (10), it’s becoming frustrating that I, as a consumer who really enjoys gaming on a console and would certainly like to pay upfront for some of these exciting AAA blockbusters, also have to deal with this sort of PR bunkum from developers. It’s not enjoyable to belabor this whole fps ordeal, but the part that feels most condescending about some of the quotes listed above is that developers KNOW the obvious advantages a higher frame rate provides for this unique, interactive medium. It puts me in a rather uncomfortable bind: do I now expand my early/late-sale purchasing principle to cover ludicrous claims by developers on this issue? That’s aggravating to consider when I really, really want to play some of these games.
Before attempting to explain what I hope to see in the future when it comes to this, it seems fair to acknowledge my own inconsistency in the past as far as this topic is concerned. So…
[DISCLAIMER: My favorite game of all time, Star Wars: KOTOR (I played the original Xbox version), had frame rate hiccups that dipped below 30 during specific action scenes. They were jarring even at the time, but the quality of the storytelling, gameplay design, choices, etc. consistently impressed me and left me with dozens of memorable moments I can still talk about to this day. Mass Effect on 360 was definitely annoying when it came to technical aspects like pop-in and frame rate before the option to install to the hard drive came along. I couldn’t get enough of everything else when it released, and I still cherish it and that ‘fridge on rollerskates’ BioWare decided to axe in the sequels. I’ve played a lot of 30 fps titles that I really, really, REALLY enjoyed. A lot of my favorite games from the last generation ran at a locked 30 fps and still felt great to play.]
…you’re reading the words of a super-hypocrite, I guess? Keep in mind that none of what I’ve said here is meant to be some about-face on what I think about frame rate, or to claim anything ridiculous like 30 fps games no longer being enjoyable. Some of my favorites running at that rate will REMAIN my favorites because of the many other aspects I appreciate about them; heck, Ocarina of Time hasn’t lost that GOAT title (which I’m basing solely on popular opinion) despite having a frame rate that would sometimes dip pretty low. This isn’t me saying I’m going to hold my nose while walking past the console game aisles at a retail store like some PC elitist. How could I think that way while being able to pour hours into something like Destiny and really like the game feel of that product? I suppose part of my inspiration for this blog is venting disappointment over the promises this gen was hyped to offer; another part is that I’ve recently been playing more games on my laptop. Perhaps the flexibility offered there makes me wonder why console games can’t start emulating PC games more in that respect.
The usual expectation for console games has been a single, fixed frame rate target; players just get what the developers give them in this regard. But after playing an MMO like Star Wars: The Old Republic and toggling between medium, low, and the lowest settings, I’ve wondered why that sort of mentality isn’t more widespread in console games. Having two presets, one with a lower frame rate and all the visual options maximized (for that console) and one with a higher frame rate and degraded visual quality (resolution, texture quality, etc.), doesn’t seem like some back-breaking ordeal for a dev to accomplish; in fact, MMOs like the PS4 version of FFXIV: A Realm Reborn already have 1080p/30fps and 720p/60fps options in place. In the past, a handful of console games have offered a choice between a consistent 30 fps and a variable frame rate as well. The fact that such options are still largely unheard of in console games today seems outdated; offering them could be a great way to alleviate some of these frustrations altogether.
Perhaps there is some ray of hope here from a marketing perspective as well, considering some new dynamics of recent years. Both stem from one site in particular: YouTube. With the rise of video playback and social media features during the 7th gen changing gaming in drastic ways, YouTube had been—and continues to be—THE spot to watch Let’s Players, trailers, and gameplay during the PS360Wii era. One issue with the site was that video playback was locked to 30 fps across the board. Considering how marketing-focused the AAA industry had become, that may be one reason pushing shinier graphics to the forefront was in the industry’s best interest then. Now, with YouTube supporting 60 fps video playback (11), could we see new marketing considerations being made in the future? Tie this in with the rise of certain YouTubers who have clout in community influence and often focus on frame rate, and there’s a chance the big-budget industry will weigh the advantages and disadvantages of presenting gameplay trailers at a higher or lower frame rate.
With all that’s been stated here, I certainly hope this doesn’t come off as some snobbish attack against anyone who doesn’t really mind frame rate either way. Considering just how lenient I’m capable of being when evaluating games on this particular subject, I’d be slandering myself if that were the case. It just seems that whenever I assess some kind of industry problem, the image of an ouroboros is sure to spring up: the creative side and the consumer side have fed each other enough bait over the years to keep this newer standard (without any option) in place for so long that we’re still willing to argue over something that was answered a long time ago, while developers willingly present false information that inevitably trickles down to consumers less knowledgeable about the subject.
I guess I’m just tired of feeling as though I’ve been effectively taking part in that feeding frenzy for so long.
Provided Links:
(7.) http://kotaku.com/a-develop...
(8.) http://gamerant.com/assassi...
(9.) http://www.videogamer.com/p...
(10.) http://gamingbolt.com/cd-pr...
(11.) http://www.theverge.com/201...
Hope everyone enjoyed this 2-piece blog (if you bothered to read both). I hope I was able to explain my back-and-forth feelings on this topic clearly. Please feel free to leave any comments and/or questions below. Hopefully all of those get funneled into the Part 2 section just so they're all pooled in one place, but it's no big deal if that doesn't happen. It's strange to finally be running into the character limit with blogs (here and with my Halo 5 Beta impressions) while my TLOU review gets away with so much more.
I'm just responding to the 30 FPS part for now. One thing I do not understand is how 30 FPS feels more cinematic. It makes no sense. Choppier animation with fewer frames is more cinematic?
Few comments:
1. Motion blur in games can feel awkward because of the way it is generated.
In games, the shutter speed is effectively instantaneous: a frame rendered at 24 fps is as sharp as a frame rendered at 300 fps. A real camera, on the other hand, will produce more blur at the lower frame rate.
This natural form of motion blur doesn't occur in video games and has to be approximated by post-processing or blending frames, which leads to an unnatural result when you only have 30 frames per second to begin with.
2. Most movies are not shot in the first-person, whereas the first-person view is very common in games. It doesn't make a whole lot of sense to introduce characteristics of movie cameras (24 fps, heavy blur) if the idea is to be looking through someone's eyes.
3. The biggest reason a higher frame-rate is important to me is not the visual smoothness, but rather the feedback. Games are interactive, unlike movies. The gameplay is closely tied with how fast the player gets feedback. So, while 60fps in youtube might make a small difference, the big difference can only be felt when the controller is in my hands.
As an aside, it's a bit of a shame that developers don't take advantage of this fact to create unique gameplay possibilities. Suppose in a game like DeusEx or MetalGear, the player's implants/nanomachines are compromised. Sure, you could show this as a cutscene where the main character struggles to walk around... or, let the player keep playing, but start artificially and violently spiking the frame-rate so that the player has difficulty moving around.
Make the player experience the emotion that the character is supposed to be going through, rather than just having the character display that emotion in a cutscene. Though, I suppose, that's a different topic.
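Point (1) above can be illustrated with a tiny sketch. This is a hypothetical toy model, not code from any real engine: it approximates "accumulation" motion blur by averaging the last few rendered frames, the crude post-process stand-in for the blur a camera shutter captures naturally. With only 30 samples per second, the blended positions are far apart in time, which is exactly why the result looks smeary rather than filmic.

```python
# Toy sketch of post-process motion blur via frame blending.
# All names here are illustrative, not taken from any engine or API.

from collections import deque

class FrameBlender:
    def __init__(self, num_samples=4):
        # Keep only the most recent frames; more samples = heavier blur.
        self.history = deque(maxlen=num_samples)

    def submit(self, frame):
        """frame: a flat list of pixel intensities (0.0-1.0)."""
        self.history.append(frame)

    def blended(self):
        # Average the stored frames pixel-by-pixel. Since each game frame
        # is rendered with an instantaneous "shutter", this averaging is
        # only an approximation of real in-exposure camera blur.
        n = len(self.history)
        return [sum(px) / n for px in zip(*self.history)]

blender = FrameBlender(num_samples=2)
blender.submit([0.0, 0.0, 1.0, 0.0])  # bright pixel at index 2
blender.submit([0.0, 0.0, 0.0, 1.0])  # object has moved to index 3
print(blender.blended())              # -> [0.0, 0.0, 0.5, 0.5]
```

The moving pixel ends up half-bright in both positions at once: at a high sample rate those positions are nearly adjacent and read as smooth blur, but at 30 fps they can be far apart, producing the ghosting the comment describes.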
Remember the 6th gen, when there were titles running at 60 fps with great graphics and no loading screens? Jak II, Metroid Prime, Ratchet and Clank, etc.
I think it is very important to prioritize framerate over graphics. Some genres can get away with a locked 30, but everything is still better at 60. Look at Devil May Cry 4 vs. DmC (2013) and you see a world of difference because the former has a better framerate. It looks more impressive despite being five years older. I probably wouldn't have gotten so into PC gaming if the prolonged 7th gen hadn't delivered everything at sub-30 just to look pretty.
This is one reason I've played a lot of the Wii U thus far this gen... Nintendo always focuses on delivering good games at 60 frames that aren't broken at launch.
More developers should also follow the BioShock devs, who included both a locked framerate option and an unlocked one.
One mistake I see a lot of gamers make is calling developers lazy and the hardware bad when games don't usually run at 60fps on it.
Framerate is a trade-off with visual fidelity. A higher framerate inherently means an opportunity cost in visual quality and in the scope and detail of the game environments. Better optimization can significantly lessen the severity of the trade-off, but only within a limited range.
Basically, this means the PS4 is no more capable of 60fps than the PS2 was. Its games are just allowed to look better at 60fps, and would still be able to look even better at 30fps. The trade-off doesn't change. (The trade-off exists on PC as well, but players are allowed to customize it based on their settings and specs. PC specs well above a game's requirements can do both.)
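The trade-off described above comes down to simple arithmetic on the per-frame time budget. A quick sketch, using a hypothetical two-frame pipeline model for latency (a simplifying assumption; real engines vary):

```python
# Back-of-the-envelope frame-time arithmetic (illustrative model only).

def frame_budget_ms(fps):
    # Time available to simulate and render one frame at a target rate.
    return 1000.0 / fps

def worst_case_latency_ms(fps, pipeline_frames=2):
    # Assumed model: input sampled at the start of one frame, result
    # displayed when the next frame scans out (two frame intervals).
    return pipeline_frames * frame_budget_ms(fps)

print(round(frame_budget_ms(30), 1))     # 33.3 ms of render time per frame
print(round(frame_budget_ms(60), 1))     # 16.7 ms -- half the room for effects
print(round(worst_case_latency_ms(30)))  # ~67 ms before an input shows on screen
print(round(worst_case_latency_ms(60)))  # ~33 ms -- the "feel" difference
```

Targeting 60fps halves the render budget, which is exactly the visual-quality opportunity cost, but it also halves how long the player waits to see their input take effect, which is the responsiveness the earlier comment pointed to.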
I wish more developers would explain this, the way it actually is, rather than use insincere excuses like "cinematic" and such. Choosing a better-looking game over a better-running game is an understandable decision, and it's going to be difficult to move the debate forward if people don't truly understand the heart of the issue.