wishingW3L

The Frame-rate Debate

People often say that you have bad eyesight if you can't see the difference between 30 frames and 60, but that's only a joke. What actually depends heavily on eyesight is resolution, not frame-rate.

The only way you wouldn't be able to see 60 frames is if the video is interlaced (1080i) instead of progressive (1080p), because interlacing halves the frames. One thing to keep in mind is that the lower the frame-rate, the lower the effective resolution once the camera starts moving, and this introduces an element called video "judder", caused by the image content not being fully translated to the screen. In games this annoyance is masked with a motion blur post-process, but in movies it's a whole different thing.
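To put rough numbers on it, here's a minimal sketch (the pan speed is an assumption for illustration, not a measurement of anything):

```python
# Illustrative only: per-frame jump of a horizontally panning camera.
PAN_SPEED_PX_PER_S = 1800  # assumed: pan covers about one 1080p screen width per second

for fps in (24, 30, 60):
    jump_px = PAN_SPEED_PX_PER_S / fps
    print(f"{fps:2d} fps -> {jump_px:5.1f} px jump per frame")

# 24 fps -> 75.0 px, 30 fps -> 60.0 px, 60 fps -> 30.0 px: halving the
# frame-rate doubles the jump, which the eye reads as judder and as lost
# detail in motion.
```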

If you've watched any of Nolan's Batman films, then you know how confusing the action can get when Batman is beating up the bad guys up close. Most of the time you can't even tell what movements Batman is making because of the film's low frame-rate. This is where directing experience comes into play, making it easier on the viewer with smart editing and camera angles, but lots of judder will still be present.

Judder in a fully CGI film works a bit differently because you have a bunch of artificial effects at your disposal, just as in a gaming environment. But relying on them too much can make a film that combines CGI with live-action scenes look very unnatural.

But moving on: frame-rate is more important for games than for movies because it's not just a visual effect that maintains resolution and clarity during fast camera movements by displaying more visual content on the screen; it's also a way to reduce control latency. This is the very reason the annual EVO tournament is now moving to Xbox. Believe it or not, one slight drop in frame-rate can mess up the execution of high-level combos. Most PS3 fighting games are Vsynced to avoid screen tearing when the frame-rate goes a bit erratic, but the Xbox versions aren't, so the Xbox can skip frames without raising latency the way the PS3 does with Vsync. This may sound a bit contradictory, and the full picture is complex enough to deserve a blog entry of its own explaining how Vsync forces the system to finish displaying a frame before moving on to the next one, and where triple buffering fits in. Why most fighting game developers like Capcom decided to enable Vsync on their PS3 versions, only they know. Maybe they thought the PS3 was capable enough to handle the task while the X360 wasn't, but at the end of the day the pro players prefer performance over image quality.
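To make the Vsync cost concrete, here's a back-of-the-envelope sketch (the render times are made-up numbers, and real pipelines with triple buffering behave differently, as noted above):

```python
import math

REFRESH_MS = 1000 / 60  # one 60 Hz vsync interval, ~16.7 ms

def display_delay_vsync(render_ms):
    """With vsync (double buffering), a finished frame waits for the next
    refresh boundary, so barely missing one interval costs a whole extra one."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def display_delay_tearing(render_ms):
    """Without vsync, the new frame is scanned out as soon as it's ready,
    possibly mid-refresh -- which is the tear."""
    return render_ms

for render_ms in (15.0, 17.0, 25.0):
    print(f"render {render_ms:4.1f} ms -> "
          f"vsync {display_delay_vsync(render_ms):4.1f} ms, "
          f"torn  {display_delay_tearing(render_ms):4.1f} ms")
```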

Another example of a game having higher latency due to Vsync is the Uncharted series. The first Uncharted had serious issues with screen tearing, but because it lacked Vsync it could skip frames, so latency stayed consistent throughout the game. Then for Uncharted 2 and 3, Naughty Dog wanted to increase the visual quality, so they implemented Vsync at the expense of latency, knowing that Uncharted was less a competitive game than a cinematic experience. Here the use of Vsync makes sense, but if the PS3 had better specs, a higher frame-rate would have been a welcome addition for both visual quality and latency. With lower latency the gameplay, particularly the aiming (the platforming has very forgiving timing), would have been much more precise and snappy. So when you see Call of Duty players complaining about Uncharted's bad shooting mechanics, try to see it from their point of view: they take high frame-rate for granted without even knowing it. At the end of the day the developers are the ones who decide where to allocate resources depending on their vision for the game, but to me it's a good thing that lately gamers have been very vocal about wanting higher frame-rates in next-gen games. If there's one good thing to be said about Nintendo, it's their focus on gameplay over graphics; because of that, their games always have a very consistent frame-rate (not always 60, but never below 30) compared to many current-gen console games.
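The "skip frames so latency stays the same" behaviour maps onto the classic fixed-timestep game loop. A minimal sketch of the generic pattern (not Naughty Dog's actual code):

```python
import time

TIMESTEP = 1 / 30  # simulation always advances in fixed 1/30 s steps

def run_loop(update, render, run_seconds=1.0):
    """Input/simulation run at a fixed rate; rendering simply happens as
    often as the machine allows, so responsiveness stays constant even
    when the on-screen frame-rate drops."""
    start = previous = time.perf_counter()
    lag = 0.0
    while time.perf_counter() - start < run_seconds:
        now = time.perf_counter()
        lag += now - previous
        previous = now
        while lag >= TIMESTEP:   # catch the simulation up first...
            update(TIMESTEP)
            lag -= TIMESTEP
        render()                 # ...then draw whatever frame we can

# dummy callbacks just to show the loop runs
run_loop(update=lambda dt: None, render=lambda: None)
```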

There are many more things about this topic that could fill a lot more text: what exactly a frame is and how the display produces it, be it interlaced or progressive; Vsync vs. triple buffering; frame-skipping; or the "120Hz" post-process featured on many modern LCD TVs (not a real native frame-rate, since the content is still capped at 60 frames), which inserts duplicated or interpolated frames between the real ones at the expense of latency (not good for gaming at all, even if the image is artificially smoothed). But this is the end of my two cents in the discussion. I'm not a native English speaker and my vocabulary is limited, so sorry if some of this sounds like it was written by an elementary school kid. Bye ;)
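For the curious, here's a toy version of what that motion-smoothing post-process conceptually does (a naive blend; real TVs estimate motion vectors, and this is no vendor's actual algorithm):

```python
import numpy as np

def tween(frame_a, frame_b, t=0.5):
    """Fabricate an in-between frame from two real ones. The latency cost
    is baked into the idea: frame_b must already have arrived before the
    synthetic frame between them can be shown."""
    return ((1 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

a = np.zeros((4, 4), dtype=np.uint8)      # fake dark frame
b = np.full((4, 4), 200, dtype=np.uint8)  # fake bright frame
print(tween(a, b))                        # the halfway frame, all 100s
```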

Meep3861d ago

What is this? I learned new things from information backed up by examples, presented in a well thought-out manner and not shoved down my throat? Well, this obviously doesn't belong here.

DoctorJones3861d ago

Yes, this isn't the done thing. A blog of this type should immediately inform you that you are an idiot and then tell you why you're an idiot in a condescending tone.

Joking about a similar blog aside, good blog wishingW3L, good to see the other side of the coin.

Godlovesgamers3861d ago

There's no excuse for next gen not being 1080p 60fps, so stop acting like a stumbling block to progress.

Pandamobile3861d ago

There is an excuse. The PS4 and Xbox One are not nearly as powerful as people hoped they would be a year or two ago.

DragonKnight3861d ago

@Pandamobile: Can you prove that it's the consoles and not the engines, or the developers, or design choice?

Pandamobile3861d ago (Edited 3861d ago )

I'm not stating my opinions as facts, but anyone who has spent a little bit of time learning about computer hardware will probably paint you a similar picture.

If you look at the amount of computational power the previous generation of consoles (360, PS3) had, you'll notice that those things were absolute beasts when they came out. The 360 had a 3-core multi-threaded CPU when most PCs barely had two cores. It also had a unified shader architecture GPU that was near top of the line for 2005. The PS3 was an enigma as well. The Cell was some hardcore stuff for 2006. It had a lot of power locked away that took programmers years to wrap their heads around. As such, it wasn't until 2009-2010 that games started to really take advantage of the hardware.

This generation will not be like the last one. We have $600 GPUs in 2013 with 2.5x the total compute power of a PS4. Both consoles, at their core, are mid-range AMD PCs. Each has a mobile 8-core AMD CPU, a mobile AMD GPU and 8 GB of RAM. These are PC parts. They're not specially engineered from the ground up by IBM or the Cell consortium. These are largely the same chips that AMD stuffs into laptops and desktop PCs. As such, every software engineer in the world should have a pretty good idea of what they're dealing with. Each console has a specific way of handling memory (the Xbox One with its ESRAM + 8 GB DDR3, the PS4 with its 8 GB of GDDR5) that developers will have to learn to master, but this is about where the differences stop.

Sony and Microsoft COULD have made monstrous machines that would give today's gaming rigs a run for their money, but after last gen, I doubt either company wants to be in that same situation. The PS3 almost destroyed Sony because it was so expensive to produce for the first few years of the cycle. On the other hand, the Xbox 360 cost MS a fortune in warranty support because of the RROD (caused by a hardware failure). The PS4 and Xbox One are a safe approach to next-gen hardware. No exotic hardware means monumental screw-ups are unlikely to happen. And best of all, Sony and Microsoft might actually make a few dollars on every console sold rather than lose hundreds (*cough* PS3 *cough*).

What it boils down to is that next-gen consoles are pretty comparable to high-end gaming rigs from 2011. Yes, optimizations will help keep console graphics moving forward until the generation after this one, but the gains likely won't be as drastic as they were last gen. It took console developers about 3-4 years to hit the graphical wall on the 360 and PS3. It'll probably take developers 2-3 years to hit that same wall this time (accounting for the number of cross-gen titles that will be produced between now and then).

Long story short, a game like Watch Dogs will require an enormous amount of compute power to run at 1080p60. More computational power than the PS4 and Xbox One offer.

DragonKnight3861d ago

You're still basing your point on consoles vs. PC and it's not about that. But even using your methodology, and knowing that you're a PC fan first and foremost, I said in my own blog that there are countless PC fans who will be all too willing to explain why a 2006 PC can run a game at 1080p 60FPS. So given that the PS4/Xbox One are more PC than any other console in the history of the industry, and that their specs are better than the aforementioned 2006 PC, how can anyone honestly think that the consoles (which have fixed hardware) are at fault?

Does it not register in people's minds that Watch Dogs is a cross-gen multiplat? Is it not a possibility that the lack of 60FPS could be a fault of the engine or a design choice? How can it automatically be the consoles when their specs aren't that bad and comparable PC specs will run the games at better than 1080p/60FPS?

You don't even need to be a PC hobbyist to know that.

Pandamobile3861d ago (Edited 3861d ago )

1080p60 isn't that hard to achieve if you dial back the graphics. I'm sure that the Watch Dogs team is more interested in getting the best visuals on PS4 and Xbox One than pushing for higher resolutions and frame rate.

It's easy to get a game to run at 1080p60 when it doesn't use fancy shaders, lighting and a massive draw distance. It's a different story otherwise.

"could be a fault of the engine or a design choice?"

If it can run at 1080p60 on a good PC, it's not an engine limitation. If you were arguing about Dark Souls being stuck at 30 FPS on PC, then that would be an engine limitation. 30 FPS is not so much a design choice as it is a technical limitation.
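A minimal sketch of that kind of engine limitation (illustrative only, not FromSoftware's actual code): when game time advances a fixed amount per rendered frame, the frame-rate cap IS the game speed, and unlocking it breaks everything.

```python
FRAME_STEP = 1 / 30  # engine hard-codes: one rendered frame = 1/30 s of game time

def game_time_after(frames_rendered):
    """Simulation tied directly to frame count, no fixed timestep."""
    return frames_rendered * FRAME_STEP

print(game_time_after(30))  # 1.0 s of game time per real second at 30 fps
print(game_time_after(60))  # 2.0 s -- force the same engine to 60 fps and
                            # the whole game runs at double speed
```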

"PC fans who will be all too willing to explain why a 2006 PC can run a game at 1080p 60FPS."

Yeah, a 2006 PC can run games at 1080p60, but not newer games. You can probably play COD:WAW at 1080p60, but most newer AAA games require a 2006 PC as a minimum requirement.

The short and skinny of this whole debate is that Ubisoft are opting for better visuals over resolution and frame rate. Why? Because they're facing hardware limitations.

-Gespenst-3861d ago

1080i/p and 60fps are BOTH unmistakable when you see them; how could they not be?

However, IT DOESN'T MATTER. We don't NEED either of those things to make good games.

CrossingEden3860d ago

The fact that more than half the Sony fanbase believes Infamous: Second Son is running at 60fps shows that they really should stop pretending they can tell the difference. And these same people pretend they understand the difference between 150K polygons and 85K polygons, yet none of them respond when I ask them to explain the difference between the two models in this picture.
http://www.cgexplorer.com/_...

maniacmayhem3861d ago

These are launch games, people.

When (besides on the SNES *wink*) have launch games taken full advantage of a console's power?

I can't believe that people are actually concerned about 60fps/1080p... I thought we were here to discuss the games, not specs.

Isn't there a tech site run by these guys where all of this should be discussed?

DragonKnight3861d ago (Edited 3861d ago )

I said the same thing about discussing specs. You're just going to be met with "just because we're talking about tech, doesn't mean we don't enjoy the game."

But take a look at N4G's news lately: how many discussions are there about how good a game is, or looks to be, based on its story, gameplay or characters, versus discussions about how many polygons are in a character or whether a game is below 1080p? The latter severely outweighs the former, and anyone could be forgiven for looking at those discussions and questioning whether gamers were speaking, or amateur developers.

maniacmayhem3861d ago

What really boggles my mind is how everyone expects these full-powered games at console release.
We didn't get the real meaty games until halfway into each console's life cycle.

People tend to forget what most launch games were like because we have so much on the shelves now.

I remember Gun, King Kong and Lair being some of the first games for the new systems, and in no way were they a true example of what the systems were actually capable of.

But I agree more of us need to discuss games, design, gameplay and how fun the game actually is.

DragonKnight3861d ago

If you have the time, look up ReviewTechUSA on Youtube and you'll see a guy who explains why people have these expectations. He's wrong, and he reaches quite a bit in his videos, but the gist of it is "consoles are more like PCs now more than ever before so they should be able to do 1080p 60fps no problem and if they can't they're underpowered."

No one takes the time to look objectively at WHY a game isn't at 60fps; they just know that it's not and blame the console immediately.

Oschino19073861d ago (Edited 3861d ago )

What about my TV? It has a native 240Hz display and can be artificially boosted to 960. It also has tons of other bells and whistles to get the most out of any experience.

Every game looks so much better (smoother, fewer jaggies, better colors, etc.), all my friends notice it immediately, and they often assume the PS3 version is just that much better than their 360 version when really it's just the TV itself.

I've had Sony XBR models since 2007-2008; my current TV is an XBR HX929. My cousin brings his PC over and even his eyes melt compared to the TV he plays on daily at home.

A more capable game is great for overall performance, but having a great display can make a world of difference as well.

Pandamobile3861d ago

In my experience, TVs that interpolate content look absolutely disgusting, and it really makes no difference in latency.

If you have a 240 Hz panel, then it's pretty useless for games and movies that aren't actually 240 Hz.

I have two monitors: a 27" 120 Hz and a 24" 60 Hz. The best part is that my 120 Hz panel doesn't have to interpolate motion. When I move around my desktop, I actually get to see things at 120 Hz, and it really is beautifully smooth. It's gotten to the point where the difference between 120 and 60 Hz is night and day.

Don't fool yourself into buying those artificially boosted refresh rates. If you're buying a TV specifically for the 120 Hz refresh rate, don't waste your money unless you have 120 Hz content to back it up.
