This is kind of a trick question because it all depends on how you look at things.
Most gamers these days don’t really understand it either.
Someone asked me this the other day, so I thought I would explain. Let’s start with the basics: resolution is important because it determines how good the picture will look. More pixels mean better detail (even though sometimes it’s hard to tell the difference between 720p and 1080p, since they are, after all, both HD).
Think about your cell phone camera: the more pixels it has, the better the image quality, because there are more pixels making up the picture. (It’s just like watching a video on YouTube: at 360p it’s not that clear, at 720p it looks really sweet, and at 1080p it’s even more detailed.) Same thing with games. When you go from 720p or 900p to 1080p, you are talking about a lot more pixels making up the picture you see. The more pixels there are, the better and clearer the image will look.
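If you want to see just how many pixels we’re talking about, here’s a quick back-of-the-envelope sketch in plain Python (the width-by-height numbers are just the standard definitions of each resolution):

```python
# Standard pixel dimensions for the resolutions consoles target.
resolutions = {
    "720p": (1280, 720),
    "900p": (1600, 900),
    "1080p": (1920, 1080),
}

base = 1280 * 720  # 720p as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the pixels of 720p)")
```

Run it and you’ll see 1080p is 2.25 times the pixels of 720p, which is why the jump is such a big ask for the hardware.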
And fps is important because frames per second is how fast, or how fluid, the image is being shown to you. Higher is better, because faster motion looks truer to real life. That’s kind of why 30 fps works fine: at 30 the motion already looks pretty close to natural. Of course, 60 fps is the icing on the cake, because at that speed everything looks as fluid as it does in real life. Anything above that is overkill; you really can’t tell the difference past 60 fps.
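Another way to look at frame rate is as a time budget: at a given fps, the console has 1000/fps milliseconds to finish drawing each frame before the next one is due. A tiny sketch:

```python
# Frame-time budget: each frame must be rendered within 1000/fps ms.
for fps in (30, 60, 120):
    budget_ms = 1000 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms to render each frame")
```

So going from 30 to 60 fps cuts the time the hardware has for every frame in half, from about 33.3 ms down to about 16.7 ms.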
But here is where it gets tricky: you need both, and of course that’s the problem. You rarely get both, and the reason is not as complex as you might think. It all comes down to hardware. If you were on a high-end PC with a monster graphics card, no problem. PCs have been doing 1080p gaming for years; hell, for today’s gaming PCs, 1080p is kind of the minimum you expect. But pulling this off on consoles is actually very tricky.
Why? Because it usually means you sacrifice one to get the other. By that I mean: do I want really amazing picture detail, but less physics and fewer extras? Or do I want a really fluid game experience with more physics and more extras, but a less detailed world with lower-resolution textures?
But wasn’t 1080p last year’s big thing? So why in the world are today’s next-gen consoles having problems hitting 1080p? Good question. Here’s the simple answer: games today are trying to do much more than they did on last-gen consoles. Graphics are much more detailed, games are getting bigger, and developers keep trying to do more. And that brings you right back to the same problem from before: hitting 1080p is not as easy as it might look.
It’s mainly because it requires a pretty strong graphics card. So why not just put a super graphics card in the PS4 and XB1? Sure, that sounds like a great idea, but there’s one problem: it already has a really high-end card for its price, and no matter what you put in it, the problem will still be there. Why? Because as graphics cards get stronger, game companies just push the graphics even further to look even better. So the problem starts all over again.
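You can see why hitting both targets at once is so hard with one more rough sketch. The raw number of pixels the GPU has to produce per second is resolution times frame rate (this is only a crude proxy, since the real cost also depends on how much work each pixel takes, but it shows the scale of the jump):

```python
# Rough workload proxy: pixels the GPU must produce per second
# is resolution * fps. Ignores per-pixel shading cost, but the
# relative numbers show why 1080p at 60 fps is so demanding.

def pixels_per_second(width, height, fps):
    return width * height * fps

targets = [
    ("720p @ 30 fps", 1280, 720, 30),
    ("1080p @ 30 fps", 1920, 1080, 30),
    ("720p @ 60 fps", 1280, 720, 60),
    ("1080p @ 60 fps", 1920, 1080, 60),
]

baseline = pixels_per_second(1280, 720, 30)
for name, w, h, fps in targets:
    load = pixels_per_second(w, h, fps)
    print(f"{name}: {load / 1e6:.1f}M pixels/s "
          f"({load / baseline:.2f}x the 720p/30fps workload)")
```

By this measure, 1080p at 60 fps is 4.5 times the raw pixel workload of 720p at 30 fps, on hardware that has to hit a fixed price point.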
(It’s a tricky balance, because gamers want next-gen graphics that look like a high-end gaming PC with a super graphics card in it, but then they want that console to cost no more than $399.) It’s like saying I want a car that can go 0-60 as fast as a Porsche, but I want it to cost the same as a station wagon.
This is a double-edged sword: consumers want amazing graphics, and they want them running at 60 fps. So game companies face a really tough decision: make the game look amazing, much better than the last-gen version, and then somehow get it to run at 60 fps too. Game companies shoot for the stars (it’s part of the reason why every demo you see of a new game coming out, like Watch Dogs or The Division, looks amazing). Then the developers start trying to build the whole game and deliver on the promise, but they run into a problem. The engineers say, “OK, we have a problem: as soon as we hit 60 fps, there is no way it can keep looking that detailed. We probably won’t get both, so choose which one suffers.”
So now the downgrade begins. Make it still look good, like the insane demo we showed, but get it to run really well. The only way for that to happen is for the textures to be turned down slightly. So it still looks like the demo, but not as detailed or as amazing as what you saw at E3. And if it does get close to that demo, then the frame rate is going to suffer; no way it stays locked at 60 fps. Project CARS, I’m looking at you. The game does dip below 60 (it gets as low as 45), but it stays above 30, which is more than playable, so I really don’t mind, because it looks and plays amazing.
Game companies don’t really get to ask gamers which one is more important. Instead, they just try to deliver a great game experience and meet the demands of the game while still looking as good as possible. As they should; they basically just try to deliver on the promise.
The problem is most of the time they fall just a little short, but no one wants to hear that.
So which one is really more important to you? Do you want 1080p at 30 fps, or is 720p or 900p fine as long as it’s 60 fps? Or do you say, oh hell no, give me 1080p at 60 fps above everything else or you have failed? What’s your preference when it comes to console games?
Here’s a great video below which helps explain this a little more.