Smell that? Yep, it's that magical time of year for games to come out: a showcase of the latest tech, the evolution of gameplay, and of course the endless bickering over trivial things. 'Tis the season, I suppose, but a rather interesting piece of news has come out about the latest iteration of the Assassin's Creed series, Assassin's Creed Unity, coming only to current-gen consoles and PC. We've seen a lot of what French Revolution-era Paris has to offer, and people are understandably hyped. However, Ubisoft recently revealed that the new game will run at 900p with a locked 30 fps on both consoles. Now people are asking the usual questions. Is this all next-gen has to offer? Why is Ubisoft always so stupid? Why do AAA games never deliver on hype? Allow me to throw a question into the mix.
Why does it matter?
Of course the resolution and frames-per-second battle has been going on since long before AC Unity's conception, but I ask everyone reading this to honestly think. What is a pixel? It's the smallest element displayed on a screen. What is resolution? It's the number of pixels output horizontally by the number output vertically. The current max that games have been aiming for is 1080p, which is 1,920 pixels wide by 1,080 pixels high. So AC Unity is 180 pixels shy of the current max for consoles.
So yes, people are crying foul and complaining over a mere 180 pixels, a difference I can't even see, and I doubt many people can either. The problem comes when you start to separate graphical fidelity and design from resolution. Case in point: there was a certain game a few years ago that output 1080p and attempted 60 fps. I say attempted because the game struggled to stay there, but moving on. By current gamer logic, that 1080p game is better looking than AC Unity. Well, the game I'm talking about is the infamous Sonic the Hedgehog, known as Sonic '06 to most. The game was a buggy mess, the general design was terrible, the graphics were hardly impressive, the people looked like mannequins, and the game largely consisted of ugly textures that were painful to look at. But it ran at 1080p, so it's clearly better looking than AC Unity, right? On the flip side, The Last of Us ran at a meager 720p at 30 fps, but did that stop it from becoming one of the greatest games of last gen and winning a pile of Game of the Year awards in 2013?
If people want to play the numbers game, then I can too. AC Unity promises at least 12,000 active NPCs on screen at a time, 1 in 4 buildings can be entered without a loading screen, and the recreation of Paris is a 1:1 scale of the real-world city, with hundreds of buildings all rendered in stunning detail. So out of all those numbers, 900p and 30 fps are the ones that stick?
Starting to see what I mean? While the difference between 30 and 60 fps is definitely more noticeable, the game being 900p instead of 1080p does not matter. A crisper image doesn't mean anything if the game itself isn't pretty to look at. Just because I can stare at a piece of crap with glasses on doesn't change the fact that it's a piece of crap. This is where the true idea of graphics comes in: not in output, but in design, and many games thankfully prioritize design over output. I dare you to look inside the recreated Notre Dame cathedral and tell me it doesn't look beautiful.
These are the moments in the community that make me ashamed to be a gamer. Have we gotten so bad that we rely on numbers instead of our eyes? Must we really argue over whether Ubisoft did this just to keep the game looking virtually the same on both consoles? That happens anyway with most multi-platform titles, so why does it matter now? To be honest, it's hard to have fun talking with people about my favorite games when most gamers want to do nothing but argue over a meaningless number. A game that looks pretty looks pretty; it's as simple as that.
If the real reason people are fighting is the proof of concept that Sony and Microsoft promised with the new hardware, then that's even sillier. A game falling a whisker (180 pixels) short does not make your $400 system useless, and the idea that it would even come close means you are one picky person with a very loose wallet.
If you still aren't convinced that these numbers amount to nothing more than cannon fodder for purists, then I'm sorry I couldn't convince you. However, if you need me, I'll be having fun playing pretty games, regardless of missing pixels.
I think you are missing the big picture. It's not about a game running at 900p vs 1080p; it's about how Ubisoft lowered the PS4's resolution deliberately, even though they didn't have to. It sounds even more suspicious when you take into account that MS and Ubi have a DLC deal.
Cgoodno... you are forgetting something... First: it doesn't really matter for the most part, and resolution shouldn't affect your enjoyment of an awesome game as long as it runs smoothly. And if you remember Watch Dogs, the PC version isn't guaranteed to be flawless either... especially coming from Ubisoft.
If resolution discrepancies didn't matter to PS3 fans last gen, then they SURELY don't matter to Xbox One fans now. Even when we compare the original Xbox to the PS2, the PS2's resolution was inferior on multiplats. PlayStation has NEVER had the lead in graphics, even comparing the PS1 to the N64, so there was no room to flaunt graphics UNTIL the PS4... and even then, it's mostly irrelevant to enjoyment.
The script can't be flipped either, because the people who USED to NEVER care about resolution (PlayStation gamers) suddenly DO care??? Either PlayStation fans have been lying about pixel count being important since the PS1, or they have no room to brag about FINALLY having a graphical edge now...
It either NEVER matters, or it ALWAYS will. Flip-flopping between the two proves that sometimes blind loyalty is JUST blind loyalty.
I don't know about you, but I notice the difference, so yes, the pixel difference does matter.
I've made my stance pretty clear about graphics many times, so I won't go into what I think about their importance.
I will however state that the issue isn't whether or not the GRAPHICS are important, but whether the GAMERS are.
Ubisoft has sent a message with this move. The message is "we can program our games to be the absolute best they can be on the PS4, but we don't want to, so you shouldn't expect to have anything more than what the Xbox One can handle."
This tells us, once again, that Ubisoft cares more about Microsoft's business and PR situation than about its own consumers. I find that funny, considering that Microsoft is in no position to turn down Ubisoft's games, so Ubisoft shouldn't care whether Microsoft dislikes the Xbox One version being lesser than the PS4 version. After all, AC Unity wouldn't even be the first game where that was the case.
I personally haven't ruled out that Ubisoft bit off more than they could chew with AC Unity and is throwing the PS4 under the bus to cover it, but in any event the graphics are simply the catalyst, not the core problem.
Choosing to purposely weaken your game in the interest of the single platform holder in the weakest position smacks of a bad business decision, and it clearly didn't have the results Ubisoft intended. Everyone knows the PS4 is the better piece of hardware, and that discussion is happening right now. What matters is that Ubisoft continues to show that the bigger a company gets, the more out of touch it gets with its user base.
If anyone thinks this is just about a few pixels that no one will notice, remember that graphics aren't the only thing publishers and developers can gimp in the name of parity. Parity is taking something from one side to prop up another. It is forced equality, and it shows disrespect to the group that has had something taken from them, no matter what it is.
I just wanna come out and say you're dead wrong assuming the difference is 180 pixels.
900p and 1080p don't refer to the total number of pixels on a screen.
They describe the resolution, and that number (900 or 1080) is the pixel count in JUST ONE DIMENSION of the screen: the vertical one.
900p  = 1600 × 900  = 1,440,000 pixels on screen => (a)
1080p = 1920 × 1080 = 2,073,600 pixels on screen => (b)
(b) − (a) = 633,600 => (c)
(c) is 44% of (a) => (b) represents a 44% increase over (a)
Now you tell me if a 44% difference matters or not.
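The commenter's corrected arithmetic is easy to verify. Here's a minimal Python sketch, assuming the standard 16:9 render dimensions used above (1600×900 for 900p, 1920×1080 for 1080p):

    # Compare total pixel counts for 900p vs 1080p (16:9 assumed).
    pixels_900p = 1600 * 900      # (a) = 1,440,000
    pixels_1080p = 1920 * 1080    # (b) = 2,073,600

    diff = pixels_1080p - pixels_900p     # (c) = 633,600
    increase = diff / pixels_900p * 100   # how much larger (b) is than (a)

    print(f"900p:  {pixels_900p:,} pixels")
    print(f"1080p: {pixels_1080p:,} pixels")
    print(f"1080p has {diff:,} more pixels, a {increase:.0f}% increase over 900p")

Running it prints a 44% increase: 633,600 extra pixels, not the 180 the article assumed.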