In Theory: 1080p30 vs 720p60 - could next-gen let us choose?

It's an intriguing thought. With so many next-gen console titles targeting 1080p at 30 frames per second, why not offer gamers the option of playing at 720p60 instead? After all, the ability to trade pixels for frames has long been part and parcel of the PC gaming experience - and both Xbox One and PlayStation 4 are based on x86 CPU cores and the Radeon GCN graphics architecture.

yewles1 (2189d ago)

What about Witcher 3? If rumors of the X1 version being 720p/30 are true (just like Dead Rising 3), would gamers REALLY be desiring a 576p/60, 540p/60, or even a 480p/60 option after all these years?

GarrusVakarian (2189d ago, edited)

Like RexLex said, you can't just switch 1080p @ 30fps for 720p @ 60fps, but I wouldn't mind if MP modes were graphically inferior to their SP counterparts in exchange for more fps. Fps matters much more in competitive MP.

For example, Uncharted 4 could be 1080p @ 30fps in the SP, but the MP could be 900p @ 60fps or 720p @ 60fps.

I don't mind if devs make the SP portions of their games 30fps so they can use the extra power to make them look more visually amazing, but for ALL MP modes, I think framerate should take priority.

ABizzel1 (2189d ago, edited)


Stop changing your profile pic. I can't recognize you as often :D

OT: Generally, moving from 1080p down to 720p with the same settings gains you an average of 15fps in most current PC games, so realistically only games with unlocked framerates would reach 60fps (example: Tomb Raider PS4), whereas games capped at 30fps would basically reach an unlocked framerate (Tomb Raider XBO).

But I completely agree with Lukas on multiplayer. 9/10 I want 60fps for my multiplayer games, especially if they're fast and action-based. 30fps is fine if it's a slower-paced game.

These consoles aren't amazing spec-wise, but that's because this is a transitional generation. Hardware costs too much to produce an affordable console with high-end specs, which is why we have a mid-range console (PS4), a mixed low-to-high console (XBO), and the Wii U, PS3, and 360 still around.

The GPU is great in the PS4, good in the XBO, decent in the Wii U, and acceptable in the PS360 still.

The CPUs in all the consoles are dated and only acceptable compared to what's being offered on the PC side of things.

And RAM-wise, only the PS4 is where the consoles need to be.

These are transitional consoles, and this is why I keep saying I can't wait for the next generation of consoles, because technology, power, and performance will finally be a non-factor for the majority of developers, and games can finally focus on creativity over graphics and power.

nukeitall (2189d ago)

I think frame rate is overrated in multiplayer. Why?

A higher frame rate only serves to make the game slightly smoother, yet a laggy experience online will negate any frame rate increase you have.

In a similar vein, resolution increases, just like graphics, have diminishing returns. The improvements in games are hardly noticeable even when a massive amount of computing power is added.

Just see the latest crop of games on PS4/Xbox One compared to previous generation.

So I think this is mostly an irrelevant discussion. What I would like to see instead is all online games moving onto robust cloud infrastructure like MS Azure. The fact that the majority of gamers would get 15-30ms ping times is huge and, ironically, makes 60fps matter again.

That should be the standard, especially now that PS4/Xbox One gamers are paying a fee for their online gaming access.

Which still leaves me scratching my head as to why Killzone: Shadow Fall is P2P with a proxy server in the middle. You are still subject to lag switches, host advantage, and so on.

AliTheSnake1 (2188d ago, edited)

I would choose 1080p30 for everything but multiplayer games. It's not all about resolution, though; there are a ton more graphical options like texture quality, AA, shadows, reflections, ambient occlusion, texture filtering, and particle effects. But I would rather not mess with these and have the developers optimize the game, so I can get the full experience they want me to get. (Funny, I just described console gaming.)

starchild (2188d ago)

The problem is, going from 30fps to 60fps requires twice as much hardware performance, whereas jumping from 720p to 1080p only requires around 30% more performance.

For this reason you can't simply trade resolution for framerate straight across like that.
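Raw pixel math shows why the trade is lopsided, though it only gives an upper bound on the real cost, since games are rarely purely fill-rate bound. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope comparison of the two trades being discussed.
# Actual GPU cost does not scale linearly with pixel count, so these
# ratios are rough bounds, not benchmarks.

def pixels(width, height):
    return width * height

p720 = pixels(1280, 720)    # 921,600 pixels
p1080 = pixels(1920, 1080)  # 2,073,600 pixels

# 1080p pushes 2.25x the pixels of 720p...
print(p1080 / p720)         # 2.25

# ...while 60fps doubles the per-second rendering work versus 30fps.
print(60 / 30)              # 2.0
```

So even in the worst case, the pixel-count side of the trade is in the same ballpark as the framerate side, which is why the two don't convert cleanly into each other.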

Edsword (2189d ago)

One of the reasons consoles can be optimized is because the game designers set certain targets and then press for them. If there are two targets, more has to be compromised to reach both. A game will not always run 30fps faster just because it is at a lower resolution; a lot more goes into it than that. I say let the game designers choose what is best for the system they are designing the game for. If you can't stand 30fps or 720p, you can vote with your dollars. There is no reason every game cannot be 1080p 60fps, if that is what the public wants, but it will come at the sacrifice of other visual elements of the game. I think smaller choices could be practical, such as implementing higher-quality AA for lower fps, but because of costs that is unlikely to happen.

bicfitness (2189d ago)

The only reason we are entertaining these sorts of ridiculous fancies, especially considering that the average cellphone outputs at 1080p these days, is because one of the manufacturers has a gimped box. If the shoe was reversed, and MS had the 1080p box, we'd be hearing all about the "value" you get for the extra $100. 1080p is a standard for modern displays. Enough, Mr. Leadbetter, you're supposed to be a tech "enthusiast". So show some enthusiasm and expertise.

nukeitall (2189d ago)

So would you say that an average cellphone outputting games at 1080p produces the same graphical quality as a console?

Once you start focusing on resolution instead of other things that matter, you get the Gigahertz war again. Pointless, with minimal benefits to consumers at the cost of "heat". In this case, resolution in trade for actual gaming benefits.

I suspect that if the shoe were reversed and Sony had the 900p box, we'd be hearing more about how resolution doesn't matter.

Remember how the Wii dazzled everyone with last generation graphics?

That is the sort of tech enthusiasm I was hoping for. More of innovation, and less of "more of the same".

RexLex (2189d ago)

Engines don't work that way. 720p @ 60 does not guarantee 1080p @ 30, or the other way around. It would be cool if it were that simple.

XtraTrstrL (2189d ago)

I assume that in a setup like this for consoles, if you are choosing between a 1080p x 30 option and a 720p x 60 option, other effects and settings would be adjusted in the background by the devs for each of those resolutions to be able to hit the set fps target.

KimoNoir (2189d ago)

It is simple. It is up to the design manager to decide what to compromise to maintain 30fps at 1080p for PS4 (or even 1920x1080 w/60fps) or 720p @ 60fps (for Xbox).

I doubt the PS4 will go any lower than 1600x900 this gen (Battlefield 4's res).

dcj0524 (2189d ago)

Yes, I think 900p will be the resolution for the bigger, more ambitious games like The Division or Battlefield, without sacrificing graphics.

yewles1 (2189d ago)

"Killzone Shadow Fall's multiplayer runs at 960x1080 with a high quality temporal upscale. Fill-rate is reduced, but we're still not at 60fps."

That's 1,036,800 pixels, less than the 1,440,000 in 1600x900...
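The arithmetic in that comment checks out; a quick sanity check, using the resolutions quoted in the thread:

```python
# Verify the pixel counts quoted above.
kz_mp = 960 * 1080   # Killzone Shadow Fall multiplayer framebuffer
bf4 = 1600 * 900     # Battlefield 4's 900p framebuffer
full = 1920 * 1080   # native 1080p

print(kz_mp)          # 1036800
print(bf4)            # 1440000
print(kz_mp < bf4)    # True
print(kz_mp / full)   # 0.5 -- exactly half the pixels of full 1080p
```

The 0.5 ratio is also why Digital Foundry describes the 960x1080 framebuffer as halving fill-rate relative to native 1080p.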

WrAiTh Sp3cTr3 (2189d ago)

So KZ:SF has less pixels than TitanFall? That's interesting...

BitbyDeath (2188d ago)

Hard to believe it runs at 960x1080 when more reputable sources had already confirmed it as running at 1080p.

The levels are much smaller in multiplayer which is why it can reach 60FPS.

rafaman (2188d ago)

Nice damage control title. lol

"In the single-player mode, the game runs at full 1080p with an unlocked frame-rate (though a 30fps cap has been introduced as an option in a recent patch), but it's a different story altogether with multiplayer. Here Guerrilla Games has opted for a 960x1080 framebuffer, in pursuit of a 60fps refresh. Across a range of clips, we see the game handing in a 50fps average on multiplayer. It makes a palpable difference, but it's probably not the sort of boost you might expect from halving fill-rate."

CharlesSwann (2189d ago)

Like RexLex says, it doesn't work that way. Methinks video game writers should stick to talking about video games and not technical matters.

ChickeyCantor (2189d ago)

It actually does work that way. Engines are built with scalability in mind (and I'm not just talking about resolution).

Sophisticated game engines have options that allow the developers to tweak their output settings. If they expose those options to the end user, the settings are hardcoded for each profile and tweaked to run optimally.

Obviously some quality settings might be tweaked to go lower to sustain the higher resolutions.

porkChop (2188d ago)

But it does work that way. Say you have 1080p/30fps with all the graphical effects turned on, then when the player selects 60fps some of the more intensive effects are scaled down or turned off. For example, switching from HBAO to SSAO, switching from MSAA to FXAA, turning Parallax Occlusion Mapping off, etc. It's very possible to do, though it would take a little bit of extra work to optimize both modes.
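The per-mode profiles described in the last few comments can be sketched minimally as follows. All setting names and values here are hypothetical, chosen only to illustrate the shape of the idea (a developer-tuned profile per mode, with the player picking the mode rather than individual settings):

```python
# Hypothetical hardcoded quality profiles: the developer tunes each one
# to meet its GPU budget, and the player only selects which mode to use.
PROFILES = {
    "quality": {
        "resolution": (1920, 1080),
        "target_fps": 30,
        "ambient_occlusion": "HBAO",   # more expensive AO technique
        "anti_aliasing": "MSAA",       # more expensive AA technique
        "parallax_occlusion": True,
    },
    "performance": {
        "resolution": (1280, 720),
        "target_fps": 60,
        "ambient_occlusion": "SSAO",   # cheaper AO stands in for HBAO
        "anti_aliasing": "FXAA",       # cheaper AA stands in for MSAA
        "parallax_occlusion": False,   # dropped entirely for fps headroom
    },
}

def apply_profile(name):
    """Return the fixed, pre-tuned settings for the chosen mode."""
    return PROFILES[name]

settings = apply_profile("performance")
print(settings["target_fps"])       # 60
print(settings["anti_aliasing"])    # FXAA
```

The point of the design is that both profiles are optimized ahead of time, so the player never faces an untested combination of settings.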

Destrania (2189d ago)

How about we just let developers make games how they believe is best for their games because we're not developers? Hmm...

ProjectVulcan (2189d ago, edited)

While I don't disagree, what frustrates me is that we still get games that are technically incompetent.

Thief recently is the perfect example.

You would think that if you left the programmers and developers up to their own devices, they would produce something nice and polished and technically excellent as we should expect from professionals.

However, we still get all these games that are quite obviously technically flawed on release.

This may be the result of a political decision, an internal release date fixed by the publisher that the developers can't argue against.

But it is still unacceptable that we have games shipped that, even to the untrained eye, are deficient and not the quality, polished product gamers deserve when they are dropping their hard-earned money on it.

Fundamentally, you expect a game to have a nice playable framerate and not be full of game-breaking bugs.

Why is it that so many games can't manage even that? One of the reasons we need people and sites like this to point out games that suffer these problems is to try and stem the tide a little and spread the word that it is not good enough.

Destrania (2188d ago)

Well said. I didn't mean we should just let developers slack and release incompetent/incomplete games and just accept that either.

Qrphe (2189d ago)

It's not that easy, and it's kind of ridiculous (it would take a lot of effort from the dev [time and money]) to cater to a few. I know it's a slow news day at Eurogamer, but this is a new low for them.
