
coolbeans


Tired of this Schmidt Episode VII: The Non-Argument Reawakens [Part 2]

[Continued from Part 1]

There have been two key examples, limited to my recent memory, that most would scratch their heads over. The first is Ubisoft’s damage control over Assassin’s Creed: Unity running at 900p and 30 fps (which wasn’t even locked) across both 8th-gen platforms. While the console parity is a fair argument in its own right, the excuse Ubisoft used for that frame rate was that it made the game more “cinematic,” an excuse that’s gotten pretty old at this point. Another key instance in 2014 was when developers at Ready at Dawn explained their rationale behind two key visual choices: locking at 30 fps and setting the resolution to 1920x800, the 2.40:1 widescreen aspect ratio used in films.

“60 fps is really responsive and really cool. I enjoy playing games in 60 fps. But one thing that really changes is the aesthetic of the game in 60 fps. We're going for this filmic look, so one thing that we knew immediately was films run at 24 fps. We're gonna run at 30 because 24 fps does not feel good to play. So there's one concession in terms of making it aesthetically pleasing, because it just has to feel good to play. (7)” (Dana Jan, The Order: 1886 Director)

“[30 fps] feels more cinematic [and] it actually feels better for people when it’s at 30 fps. (8)” (Alex Amancio, Assassin’s Creed: Unity Creative Director)

http://replygif.net/i/101.g...

While I have no qualms over that underused type of resolution (I thought it worked well with a game like Beyond: Two Souls, especially considering it can also help with frame rate), this idea of deliberately seeking out a less smooth frame rate than that once-golden standard, as if the smoother option somehow harms the game’s aesthetic, doesn’t sit well with me; the reason being that it’s a lie. In both AC:U’s case and this one, it’s pretty obvious the marketing department is just doing damage control. There have been many examples in the past of that sacrifice being made for the visual fidelity of the time, but it’s tough for me to remember hearing or seeing a creator suggest that a lower frame rate cap could actually be seen as some kind of benefit. And in a gaming world where Ready at Dawn’s new IP is backed by one of the most popular publishers in the industry and consistently showcased on their demo stages, I have a worrisome presentiment about the influence such statements can have on those less knowledgeable about the subject, or even on developers looking to make games that emulate cinema.

Judging by recent comments from other big-name developers, it seems this sort of PR dodge isn’t going away any time soon. Whether it’s Naughty Dog not pushing for 60 fps in Uncharted 4 if doing so means they’d “lose something that would really impact the player’s experience (9),” or CD Projekt Red providing a mostly-harmless response regarding the PS4/Xbox One versions of The Witcher 3 going for a locked 30 fps experience (10), it’s becoming frustrating that I, as a consumer who really enjoys gaming on a console and would certainly like to pay upfront for some of these exciting AAA blockbusters, also have to deal with this sort of PR bunkum from developers. It’s not enjoyable to expatiate on this whole fps ordeal, but the part that feels most condescending about some of the quotes listed above is that developers KNOW the obvious advantages a higher frame rate specifically provides for this unique medium. It puts me in a rather uncomfortable bind: do I now expand my early/late sale purchasing principle to account for ludicrous claims by developers on this issue? That’s aggravating to consider when I really, really want to play some of these games.

Before attempting to explain what I hope to see in the future when it comes to this, it seems fair to acknowledge my own inconsistency in the past as far as this topic is concerned. So…

[DISCLAIMER: My favorite game of all time, Star Wars: KOTOR (played the Original Xbox version), had frame rate hiccups that dipped below 30 during specific action scenes. These were even jarring at the time but the quality in storytelling, gameplay design, choices, etc. consistently impressed me and filled me with dozens of memorable moments I can still talk about to this day. Mass Effect on 360 was definitely annoying when it came to technical aspects of pop-in and framerate before the option of installing had come along. I couldn’t get enough of everything else when it released and still cherish it and that ‘fridge on rollerskates’ Bioware decided to axe in the sequels. I’ve played a lot of 30 fps titles that I really, really, REALLY enjoyed. A lot of my favorite games from the last generation did run at locked 30 fps and still felt great to play.]

…you’re reading the words of a super-hypocrite, I guess? Keep in mind that none of what I’ve said here is meant to act as some about-face on what I think about frame rate, or to say anything ridiculous like 30 fps games should no longer be considered enjoyable. Some of my favorites running at that rate will REMAIN my favorites because of the many other aspects I appreciate about them; heck, Ocarina of Time hasn’t lost that GOAT title (which I’m basing solely on popular opinion) despite having a frame rate that would sometimes dip pretty low. This isn’t me saying I’m going to hold my nose while walking past the console game aisles at a retail store like some super PC elitist. How could I think like that while being able to pour hours into something like Destiny and really like the game feel of that product? I suppose part of my inspiration for this blog could be venting disappointment over the promises this gen was hyped to offer; another could be recently playing more games on my laptop. Perhaps the flexibility offered there makes me wonder why console games can’t start emulating PC more in that respect.

The usual expectation for console games has been one set frame rate option: players just get what the developers give them in this regard. But after playing an MMO like Star Wars: The Old Republic and toying between medium, low, and the lowest settings, I’ve wondered why that mentality isn’t more widespread on consoles. Having two pre-set options, a lower frame rate with all the visual options maximized (for that console) or a higher frame rate with degraded visual quality (resolution, texture quality, etc.), doesn’t seem like some back-breaking ordeal for a dev to accomplish; in fact, MMOs like the PS4 version of FFXIV: A Realm Reborn already have 1080p/30fps and 720p/60fps options in place. In the past, a handful of console games have also given options between a consistent 30 fps and a variable frame rate. The fact that such options are still largely unheard of in console games today seems outdated; they could be a great way to alleviate some of these frustrations altogether.
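To put some rough numbers behind why a two-preset toggle like FFXIV’s shouldn’t be back-breaking, here’s a minimal sketch (the names and structure are mine, purely illustrative of the idea, not any game’s actual settings code). Notice that both presets demand a similar pixel throughput from the GPU; the player is mostly choosing where the horsepower goes, not asking for more of it.

```python
# Hypothetical sketch of a fidelity/performance preset pair, modeled
# on the PS4 FFXIV: A Realm Reborn options mentioned above.
PRESETS = {
    "fidelity":    {"resolution": (1920, 1080), "target_fps": 30},
    "performance": {"resolution": (1280, 720),  "target_fps": 60},
}

def pixels_per_second(preset_name):
    """Pixels the GPU must shade per second to hold the preset's target."""
    preset = PRESETS[preset_name]
    width, height = preset["resolution"]
    return width * height * preset["target_fps"]

print(pixels_per_second("fidelity"))     # 62,208,000 px/s at 1080p/30
print(pixels_per_second("performance"))  # 55,296,000 px/s at 720p/60
```

The two throughput figures land within about 12% of each other, which is part of why 1080p/30 and 720p/60 make such a natural pair of presets.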

Perhaps there’s some ray of hope from a marketing perspective as well, considering some new dynamics of recent years. Both stem from one site in particular: YouTube. With the rise of video playback and social media features during the 7th gen changing gaming in drastic ways, YouTube had been, and continues to be, THE spot to watch Let’s Players, trailers, and gameplay during the PS360Wii era. One issue with the site was that video playback was locked to 30 fps across the board. Considering how marketing-focused the AAA industry had become, that could be one reason pushing shinier graphics to the forefront seemed to be in the industry’s best interest then. Now that YouTube supports 60 fps playback (11), could we see new marketing considerations in the future? Tie that together with the rise of YouTube Let’s Players who carry clout in the community and often focus on frame rate, and there’s a chance of seeing the big-budget industry weigh the advantages and disadvantages of presenting gameplay trailers at a higher or lower frame rate.

With all that’s been stated here, I certainly hope this doesn’t come off as some snobbish attack against anyone who doesn’t really mind frame rate either way. Considering just how lenient I’m capable of being when evaluating games on this particular subject, I’d be slandering myself if that were the case. It just seems that whenever I assess some kind of industry problem, it ends up resembling an ouroboros: the creative side and the consumer side have fed each other enough bait over the years to keep this newer standard (without any option) around for so long that we’re willing to argue over something that was answered a long time ago, while developers willingly present false information that may inevitably trickle down to consumers less knowledgeable about the subject.

I guess I’m just tired of feeling as though I’ve been effectively taking part in that feeding frenzy for so long.

Provided Links:

(7.) http://kotaku.com/a-develop...

(8.) http://gamerant.com/assassi...

(9.) http://www.videogamer.com/p...

(10.) http://gamingbolt.com/cd-pr...

(11.) http://www.theverge.com/201...

coolbeans3763d ago

many parts
such Lionsgate
much milking
wow

:P

Hope everyone enjoyed this 2-piece blog (if you bothered to read both). I hope I was able to explain my back-and-forth feelings on this topic clearly. Please feel free to leave any comment and/or questions below. Hopefully all of those get funneled into the Part 2 section just so they're all pooled in one place, but it's no big deal if that doesn't happen. It's strange to be finally running into this character limit with blogs (here and with Halo 5 Beta Impressions) while my TLOU review gets away with so much more.

garrettbobbyferguson3763d ago

I'm just responding to the 30 FPS part for now. One thing I do not understand is how 30 FPS feels more cinematic. It makes no sense. Choppier animation with less frames is more cinematic?

coolbeans3762d ago

It depends on how one's trying to define it, really. Even the first definition in Merriam-Webster is "of, relating to, suggestive of, or suitable for motion pictures or the filming of motion pictures." And there are subtle nuances of its definition elsewhere that let someone frame it however they're trying to make their game look.

As far as a definition goes, I see the logic behind the word; as far as acting like a badge of a team's artistic vision, it's very troubling to see.

uth113762d ago

We are used to seeing films in 24fps.

When they screened the Hobbit in the 48fps version, many people hated it. They said it ruined the fantasy aspect of it all.

I have not seen the Hobbit in 48fps, but I do know that I hate watching TV on those TVs that interpolate extra frames to bring the content up to 60fps. Looks unnatural to me. They call it the soap opera effect.

So when they say 30fps games look more cinematic, they do have a point.

garrettbobbyferguson3762d ago

Except Ducky explained everything below which disproves anything about 30 FPS being cinematic.

Plus there's a complete difference between 24 FPS live action films and 24 FPS animation. Hell, even in animated tv shows and movies you can tell very clearly when something is animated with less frames than another thing in the same scene to save some budget.

So the only reason 30 FPS is considered "cinematic" is because it's closer to the Frame Rate that a live action movie is shot in.

Ducky3763d ago (Edited 3763d ago )

Few comments:

1. Motion blur in games can feel awkward because of the way it is generated.
In games, the shutter speed is pretty much instantaneous; A frame at 24 fps is as sharp as a frame at 300fps. A real camera, on the other hand, will produce more blur at the lower frame-rate.
This natural form of motion blur doesn't occur in video games, and has to be accomplished by post-processing or blending frames, which leads to an unnatural result when you only have 30 frames to begin with.
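A toy model of the frame-blending approach described here (purely illustrative, not any engine's actual technique): average a few instantaneous sub-frame samples to fake the continuous light accumulation a real camera shutter performs for free.

```python
# Toy frame-blending motion blur: each game frame is an instantaneous
# sample, so blur must be faked by averaging several samples taken
# across the frame's "exposure" window. With only 30 output frames,
# those samples sit far apart in time, which is why the result can
# look like an unnatural smear rather than camera blur.

def blend_samples(samples):
    """Average brightness samples taken within one frame interval."""
    return sum(samples) / len(samples)

# A bright object sweeping across one pixel, sampled 4 times within
# a single ~33ms frame: the blend yields a half-bright smear.
sub_frame_samples = [0.0, 1.0, 1.0, 0.0]
print(blend_samples(sub_frame_samples))  # 0.5
```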

2. Most movies are not shot in the first-person, whereas the first-person view is very common in games. It doesn't make a whole lot of sense to introduce characteristics of movie cameras (24 fps, heavy blur) if the idea is to be looking through someone's eyes.

3. The biggest reason a higher frame-rate is important to me is not the visual smoothness, but rather the feedback. Games are interactive, unlike movies. The gameplay is closely tied with how fast the player gets feedback. So, while 60fps in youtube might make a small difference, the big difference can only be felt when the controller is in my hands.
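The feedback point can be put in rough numbers. Assuming a deliberately simplified pipeline where an input that just misses a frame waits about one frame to be sampled and one more to be rendered (real console/TV pipelines add further stages, so the constant here is an illustrative assumption):

```python
# Simplified input-to-display model: worst case, an input waits
# ~1 frame to be sampled plus ~1 frame to be rendered before the
# result reaches the screen. "pipeline_frames=2" is an assumption
# for illustration, not a measurement of any real game.

def worst_case_feedback_ms(fps, pipeline_frames=2):
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * pipeline_frames

print(round(worst_case_feedback_ms(30)))  # 67 ms at 30 fps
print(round(worst_case_feedback_ms(60)))  # 33 ms at 60 fps
```

Even under this crude model, 60 fps cuts the feedback delay roughly in half, which is the "controller in my hands" difference being described.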

As an aside, it's a bit of a shame that developers don't take advantage of this fact to create unique gameplay possibilities. Suppose in a game like DeusEx or MetalGear, the player's implants/nanomachines are compromised. Sure, you could show this as a cutscene where the main character struggles to walk around... or, let the player keep playing, but start artificially and violently spiking the frame-rate so that the player has difficulty moving around.
Make the player experience the emotion that the character is supposed to be going through, rather than just having the character display that emotion in a cutscene. Though, I suppose, that's a different topic.

coolbeans3762d ago (Edited 3762d ago )

Some interesting info in regards to motion blur, Ducky. I know people often bring up some of those issues, but I honestly didn't experience them in the games I listed in Part 1; it was maybe a bit more annoying in KZ2 due to the first-person perspective (like you mentioned), but perhaps techniques can improve on this avenue in the future. In fact, I think an old Kotaku article and certain forums brought up The Witcher 2's blurring effect as working effectively.

Agreed that the lessened input lag would be what I like more about it too. While I can admit to being lenient on this technical aspect when it comes to grading (perhaps some users don't like that), I do think of COD and RAGE from last gen as examples that kept me going back for more and more. The more I think about it, the more it feels like that typically-locked 60 fps really bolstered my overall enjoyment of them.

"Sure, you could show this as a cutscene where the main character struggles to walk around... or, let the player keep playing, but start artificially and violently spiking the frame-rate so that the player has difficulty moving around."

lol

Concertoine3763d ago (Edited 3763d ago )

Remember in the 6th gen when there were titles running at 60fps, with great graphics and no loading screens? Jak II, Metroid Prime, Ratchet and Clank, etc.

I think it is very important to prioritize framerate over graphics. Some genres can get away with a locked 30, but everything is still better at 60. Look at Devil May Cry 4 vs. DmC 2013 and you see a world of difference because the former has a better framerate. It looks more impressive despite being 5 years older. I probably wouldn't have even gotten so into PC gaming if the prolonged 7th gen didn't deliver everything in sub-30 just to look pretty.

This is one reason I've played a lot of the Wii U thus far this gen... Nintendo always focuses on delivering good games at 60 frames that aren't broken at launch.

More developers should also follow the Bioshock devs, who included a locked framerate and an unlocked one.

coolbeans3762d ago (Edited 3762d ago )

"This is one reason i've played a lot of the Wii U thus far this gen... nintendo always focuses on delivering good games at 60 frames that aren't broken at launch."

That's also one big reason why I feel bad about not getting one yet. Although the Wii U does have a number of stupid problems (namely online infrastructure), it's actually been the only console to present one exclusive per year I've been excited to try out: ZombiU, The Wonderful 101, and Smash Bros. + Mario Kart 8 (really enjoyed the demo) for 2014. Then again, there's no way I can avoid it once Star Fox comes out.

They really do deserve greater sales for the Wii U when looking from a game release standpoint.

"More developers should also follow the Bioshock devs, who included a locked framerate and an unlocked one."

Either the locked 30/locked 60 or the locked 30/unlocked 60 route would sit well with me, preferably the former. Even though I liked playing Bioshock + Titanfall (360) unlocked, it's still not really possible for a TV to iron out the visual oddities that come from the game's variable frame rate meeting the TV's fixed 60 Hz refresh rate. I've heard G-Sync is a way of syncing computer screens to variable frame rates, but that doesn't seem like it'll affect TV and console gaming altogether.

Concertoine3762d ago

What's your opinion on screen tear? Honestly im not very susceptible to it. I'd prefer a solid 60 with tears to a target 60 without them.

coolbeans3762d ago (Edited 3762d ago )

I guess it depends on the situation, but I'm not greatly susceptible to it either. In the case of something like Titanfall (360), those kinds of oddities became more noticeable simply due to the newer TV my renter bought, which I then moved my 360 over to. I'm also not sure if a transition from component 1080p to HDMI during the middle of 2014 could've played any part as well.

Needless to say, a bunch of minor changes to my 360 gaming situation took place last year. But even in the face of having to deal with tearing, I'd still take a variable frame rate in TF since it captures the playstyle so darn well. While I don't really have any kind of scientific research to back this assumption up, I really did feel an incremental advantage in simply having it instead of locked 30. It's one of those unique cases of a higher framerate giving a console game an aesthetic and competitive edge. I haven't gotten that far into Bioshock yet, so I can't say for sure how drastic that may seem.

Blacklash933763d ago (Edited 3763d ago )

One mistake I see a lot of gamers make is calling developers lazy and hardware bad if the games don't usually run at 60fps on it.

Framerate is a trade-off with visual fidelity. A higher framerate inherently means an opportunity cost of visual quality and the scope and detail of the game environments. Better optimization can significantly lessen the severity of the trade-off, but only within a limited range.

Basically, this means the PS4 is no more capable of 60fps than the PS2 was. Its games are just allowed to look better at 60fps, and would still be able to look even better at 30fps. The trade-off doesn't change. (The trade-off exists on PC as well, but players are allowed to customize it based on their settings and specs. PCs with specs well above a game's requirements can do both.)

I wish more developers would explain it this way, the way it actually is, rather than use insincere excuses like "cinematic" and such. Choosing a better-looking game over a better-running game is an understandable decision, and it's going to be difficult to move the debate forward if people don't truly understand the heart of the issue.

uth113762d ago

because too many gamers expect their games to do everything.

Yes, if you double the frame rate, then you have to halve the visual quality (assuming you're pushing the hardware to the max already).

A lot of gamers say they'd rather have the 60fps and less visual quality, but when TLOU:R gave the choice between 30 and 60, many were outraged that shadow-quality was better at 30!

If they start making 60fps standard for console games, there will be lots of complaints about games looking last-gen. Can't win! In the end, developers choose 30fps and better visuals because that sells more games; you can't show 30fps in screenshots.

coolbeans3762d ago (Edited 3762d ago )

I guess that's the key issue then: choice. Just as I would hold no bad opinion of a PC gamer who likes to crank the max out of their PC at 30 FPS so too would I not judge a console player going for 30 FPS in TLOU:R or whatever else. It would just be great to see those concessions made much more often.

"If they start making 60fps standard for console games, there will be lots of complaints about games looking last-gen."

I'm not sure this is a realistic scenario given that I haven't seen/heard anyone complain about the Halo 5 Beta (which ran at 60 FPS) looking anywhere near last-gen. It looked really, really good for a this-gen game imo. And I'm not quite sold on your 'visual quality cut in half by doubling the FPS' logic. It makes sense as math, but I'm not sure that's really how it works in the practical game-making sense.

uth113762d ago

@coolbeans - it depends a lot on optimization too. At 30fps you have to render your entire scene in 33ms. At 60, you have to do it in 17ms. Can you do everything you wanted in 17? Maybe you can; good. But it's always possible to do twice as much at 30. It's just a question of whether your game needs twice as much.

I'm not saying that a 60fps game has to suck visually on today's consoles. But I do know that not every dev is a god at optimization. E.g., based on what Ubi put out last year, their games would probably suffer visually if they went to 60fps. Someone like Naughty Dog might pull it off with great visuals.
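The budgets cited above fall straight out of the arithmetic; a quick sketch:

```python
# Per-frame render budget at common target frame rates: everything a
# game does each frame (simulation, draw calls, post-processing) must
# finish inside this window, which is why targeting 60 fps leaves
# half the headroom that 30 fps does.

def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 24 fps -> 41.7 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms
```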
