Ninja Theory On Hellblade Running At 1080p/60fps On PS4: We’ll Be Thinking About It Down The Line

Gamingbolt "Ninja Theory's next big game Hellblade will be a timed PlayStation 4 exclusive. The studio is famous for several underrated games like DmC: Devil May Cry, Enslaved and one of the early PlayStation 3 exclusives, Heavenly Sword."

Read Full Story >>
gamingbolt.com
JMaine518 (1089d ago)

This question needs to die. We all knew early on that 1080p/60fps is not the standard for these games.

NuggetsOfGod (1089d ago, edited)

But what about Planetside?? Oh yeah, never mind..

At least it will have 2000 players!! Oh yeah, never mind..

Love console games but I hate console hardware.

"Thinking about it down the line" = graphics first and probably 30fps.

GTA5 looks amazing! Dips below 30.
1080p is hurting console games.

Cernunnos (1089d ago)

Old hardware is hurting console games, not 1080p.

Svinya (1089d ago, edited)

1080p/60 is fine for games like Forza or arcade titles, but let's be honest here: it won't be anywhere near a standard this gen, even on the "OMG SUPERCOMPUTER GIGAFLOPS" PS4.

loganbdh (1089d ago)

Ugh, I hate Ninja Theory. Hack-and-slash games should automatically be 60fps, no matter the resolution.

NuggetsOfGod (1089d ago, edited)

Here's the thing... a good amount of console gamers have been trained to believe 60fps is unnecessary and messes up some games, hence your disagrees.

Though they hardly ever play at 60fps in 2014, a lot of them believe this and will die for it.

I wonder if they will complain when PS5 is 4K/60fps?

Who am I kidding, PS5 will be 4K/30fps and they will love it lol.

The Last of Us was great at 60fps, btw.

Wonder why the Naughty Gods made it 60fps? Maybe they were too lazy for a solid 30? Who knows.

Truth is, all games should be experienced at 60fps+.

MGSV is not even a fast-paced game, yet Kojima wants it at 60! He could have just made it 30fps and pushed the graphics higher.

Fps is god, not resolution lol. If devs gave the option of 720p/60fps, I bet many would love that.

MegaRay (1088d ago)

Almost all genres need to be 60fps. Gameplay before graphics. To me, at least.

younglj01 (1089d ago)

Can't wait for more info.

bunfighterii (1089d ago)

I've got a question for someone with technical know-how on this:

Is it possible for devs to lock games in between 30 and 60fps? Why does it always seem to be a choice between 30 and 60fps?

Like, if it runs stable at 1080p and 43fps, could they lock it there, or is there some reason they don't?

PersonMan (1089d ago)

Most TVs and monitors run at 60Hz, meaning the TV refreshes the image 60 times a second. If the frame rate doesn't sync up with the monitor's refresh rate, you'll notice an odd stutter effect, because 43 doesn't divide evenly into 60. However, 30 is half of 60, so you're basically seeing each frame exactly twice, resulting in a smooth display of frames.

However, if the framerate is an uneven divisor like 40, you'll never know what you'll get. You'll get a couple of unique frames here and a couple of duplicates there, and it will appear to stutter irregularly.

60fps is like this: 1,2,3,4,5,6,7,8,9,10
30fps is like: 1,1,2,2,3,3,4,4,5,5,6,6,7,7,8,8,9,9,10,10
20fps is like: 1,1,1,2,2,2,3,3,3,4,4,4 etc.

40fps is like: 1,1,2,3,3,4,5,5,6,7,7,8 (where you see those duplicates, the game appears to pause briefly because it's displaying the same frame twice in only those spots, which interrupts the smooth motion of the video).
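A minimal Python sketch of the duplication pattern PersonMan describes (my own illustration, not from any poster; it assumes every frame is finished in time for the refresh that follows it):

    def displayed_frames(game_fps, refreshes=12, refresh_hz=60):
        # For each screen refresh, pick the newest source frame
        # the game has completed by that moment.
        shown = []
        for r in range(refreshes):
            t = r / refresh_hz                   # wall-clock time of this refresh
            shown.append(int(t * game_fps) + 1)  # latest frame finished by then
        return shown

    print(displayed_frames(60))  # [1, 2, 3, 4, ...]        every frame once
    print(displayed_frames(30))  # [1, 1, 2, 2, 3, 3, ...]  each frame twice
    print(displayed_frames(40))  # [1, 1, 2, 3, 3, 4, ...]  irregular duplicates

The irregularly spaced duplicates in the 40fps output are exactly the judder being described.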

Cernunnos (1089d ago)

It won't cause stuttering, but it will cause screen-tearing.

PersonMan (1089d ago)

Cernunnos: It doesn't have to cause screen-tearing. Triple-buffering means a game can run at 43fps with no screen tearing. However, you'll still see judder.
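A rough sketch of that trade-off, under the simplifying assumption that double-buffered vsync stalls the renderer until the next tick while triple buffering lets it keep working:

    import math

    def effective_fps(frame_time, triple_buffered, hz=60):
        # Rate the player actually gets from a renderer that needs
        # frame_time seconds per frame on a hz-refresh display.
        if triple_buffered:
            return 1 / frame_time            # renderer never waits for the swap
        ticks = math.ceil(frame_time * hz)   # double buffer: stall until next vsync
        return hz / ticks

    print(effective_fps(1 / 43, triple_buffered=False))  # 30.0 -> quantized down
    print(effective_fps(1 / 43, triple_buffered=True))   # ~43  -> kept, but judders

So triple buffering preserves the internal 43fps without tearing; the judder comes only from how those frames land on the 60Hz ticks.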

Anonagrog (1089d ago, edited)

Edit: Moved my comment further down.

Svinya (1089d ago, edited)

Frame rates other than 30 or 60fps cause judder on TVs; anything that doesn't divide evenly into 60.

A locked 30 is smoother than 45fps and arguably the better choice, even if it adds a little more latency to the controls than 45fps would.

Anonagrog (1089d ago, edited)

PersonMan's example is based on the use of vertical sync (vsync), where frame output is synchronized with the monitor's native refresh rate. The 43fps you mentioned may be the internal frame rate the engine can achieve, but how we see that on the monitor can vary, so another scenario worth exploring is one without vsync.

With vsync enabled, if the current frame isn't complete in time for the next monitor refresh, the previously completed frame has to be re-used. The downside is that we may negatively affect the smoothness of the game, and potentially incur a hit to input latency too.

Why is "smoothness" affected?

If you consider any motion in a game scene, that motion is inevitably a function of time. When we display what is happening to the player via the monitor, we are effectively taking a snapshot of a moment in time within the game. If the speed at which we render and present the scene is consistent with respect to the motions of things within the scene, and thus time, we can represent this motion fairly smoothly to the player. Vsync can cause this smoothness to break down if the internal frame rate often deviates from the vsync multiples for a monitor (e.g. on a 60Hz monitor: 60fps, 30fps, 20fps, etc.), because frames end up being delayed every now and then. The result is almost like time stretching and contracting, which we perceive as a lack of "smoothness".
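A numeric sketch of that stretching, with made-up numbers: an object moves at constant speed, frames finish at a steady 43fps, and each 60Hz refresh shows the newest finished frame:

    speed = 10.0                              # units per second, arbitrary
    frame_times = [n / 43 for n in range(8)]  # frames completed at 43fps
    positions = []
    for tick in range(8):                     # 60Hz refreshes
        t = tick / 60
        newest = max(ft for ft in frame_times if ft <= t)
        positions.append(speed * newest)      # snapshot taken at render time

    steps = [round(b - a, 2) for a, b in zip(positions, positions[1:])]
    print(steps)  # e.g. [0.0, 0.23, 0.23, 0.0, ...] -> motion stalls, then resumes

Even though the object never changes speed, some refreshes show zero movement while time marches on, which the eye reads as stutter.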

If you ignore vsync and just 'send' the frame as and when it's rendered instead, we no longer introduce that unwanted judder and input latency; however, we may now be hit by 'screen tear'. In short, that's the occurrence of seeing the previous and the current frames spliced together on screen at the same time. As you may imagine, if a lot has changed between the previous frame and the current one (e.g. fast motion of you through the environment, or of the environment around you), there may be a noticeable 'tear' in the screen image that lowers the overall apparent image quality.
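A toy sketch of that splice, treating the screen as a handful of scanlines and letting the buffer swap land mid-scanout (no vsync), with made-up labels:

    old_frame = ["old"] * 8   # scanlines from the previous frame
    new_frame = ["new"] * 8   # scanlines from the just-finished frame
    swap_row = 3              # the swap happens partway down the screen

    on_screen = old_frame[:swap_row] + new_frame[swap_row:]
    print(on_screen)  # ['old', 'old', 'old', 'new', ...] -> tear line at row 3

The more the two frames differ (fast motion), the more visible that seam becomes.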

The pros and cons of each suit different situations, but it's ultimately subjective depending on the user.

In the console development realm a lot of effort is put into minimizing these downsides by trying to keep the internal frame rate consistently near the desired 60 or 30fps vsync target. Any lower target (20fps being the next multiple) would make visuals and responsiveness suffer. That's why some console games temporarily endure screen tear (e.g. at 27fps) when dropping below 30fps, instead of dropping straight down to 20 with vsync.
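That behaviour amounts to what PC players call 'adaptive vsync'; a hypothetical per-frame policy (names and numbers my own) might look like:

    def present_policy(frame_ms, budget_ms=1000 / 30):
        # Frame made the 33.3ms budget: wait for the tick -> clean 30fps pacing.
        if frame_ms <= budget_ms:
            return "wait for vsync"
        # Frame ran long: flip immediately -> some tearing at ~27fps,
        # instead of letting vsync quantize the whole game down to 20fps.
        return "present immediately (tear)"

    print(present_policy(30.0))  # within budget -> synced
    print(present_policy(37.0))  # ~27fps frame -> tear beats a 20fps drop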

NB: You may have heard of PC monitors with tech like Nvidia's 'G-Sync'. That's a way of combating the above by having the monitor vary its own refresh rate to suit, instead of the other way around. All of this is really a legacy support problem that serves little purpose with modern displays. It's an unfortunate thing that needs to "go the way of the Dodo" in my eyes. Bring on an open standard.
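A small sketch of the difference, with made-up frame times: a fixed-refresh display shows each frame at the next 60Hz tick after it completes, while a variable-refresh ('G-Sync'-style) display refreshes the moment the frame is ready:

    import math

    def presentation_times(frame_done, fixed_hz=None):
        if fixed_hz is None:  # variable refresh: show each frame as it completes
            return frame_done
        # fixed refresh: wait for the first tick strictly after completion
        return [math.floor(t * fixed_hz + 1) / fixed_hz for t in frame_done]

    frames = [n / 43 for n in range(4)]  # renderer finishing at a steady 43fps
    print(presentation_times(frames))               # even spacing -> smooth
    print(presentation_times(frames, fixed_hz=60))  # uneven tick gaps -> judder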

Illusive_Man (1089d ago)

In other words, the system isn't powerful enough, but we'll try.
