There are many who feel they've been slighted by recent resolution announcements. We put your eyes to the test.
Yes, next question?
Why settle for less when you can have the best?
You mean 4K? There is a difference between 900p and 1080p but I bet not many would notice unless they had it side by side.
Because less resolution means more processing power, which equals a better-looking game. For example, if Dead Rising 3 ran at 1080p it would have far fewer zombies on screen, making the game less enjoyable. If Infamous ran at 720p they would have more power to put into enemy AI or more units on screen, etc. The same goes for every title. 1080p is only a small part of making games look better. A game at 720p using MSAAx8 will look better than a game at 1080p using just MSAA. If you're a PC gamer you can benefit from both quantity and quality, but for console gamers there are always sacrifices to reach their goals.
@azznation No, a game at 720p with MSAA won't look better than 1080p unless the TV is below 42 inches and you sit at least 4 feet away. If you are on PC the difference is obvious because you are so close. It depends on the size of the screen and your distance from it. For example, I have a 70-inch TV and I sit roughly 6-7 feet away, and there is a HUGE visible difference between 720p and 1080p; however, there is no obvious difference between 900p and 1080p. The human eye can only discern certain levels of detail at certain ranges. If I move to 3 feet away I can see slight differences between 900p and 1080p, but it's not drastic.

Generally, if a game is running 900p on X1 and 1080p on PS4, my decision comes down to frame rate and controller. I prefer the Xbox controller, so if the frame rate is the same I'll go X1. If there are dips below 30 I'll go PS4, and if PS4 is 60 and X1 is 30 I'll go PS4. People will argue there is a difference between the two resolutions at the screen sizes and ranges I have listed above, but they are fooling themselves. This is a scientific fact: detail loss in the human eye starts happening immediately with distance. The size of the subject affects the distance required, but in order to see the difference between 900p and 1080p at a range of 6 feet you would need a 100-inch TV screen.
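Claims like this can at least be sanity-checked against the common one-arcminute visual acuity rule of thumb. A minimal Python sketch, assuming that simplified model (real acuity varies per person, and the function name here is just illustrative):

```python
import math

def max_resolving_distance_in(diagonal_in, vertical_pixels, aspect=(16, 9)):
    """Farthest viewing distance (inches) at which one pixel still
    subtends one arcminute -- a common visual-acuity rule of thumb."""
    w, h = aspect
    screen_height = diagonal_in * h / math.hypot(w, h)  # panel height from diagonal
    pixel_size = screen_height / vertical_pixels
    one_arcmin = math.radians(1 / 60)
    return pixel_size / math.tan(one_arcmin)

# 70-inch TV: distance beyond which individual pixels blur together
d_1080 = max_resolving_distance_in(70, 1080)  # ~109 inches (~9.1 ft)
d_900 = max_resolving_distance_in(70, 900)    # ~131 inches (~10.9 ft)
print(round(d_1080), round(d_900))
```

Under this simplified model, pixels on a 70-inch 1080p panel stop being individually resolvable only beyond roughly 9 feet, so whether a given viewer sees the 900p/1080p gap at 6-7 feet is plausibly borderline rather than settled either way.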
AC4: the bottom one is better. UFC: the upper one clearly is better. WD: WTF? Both versions of Watch Dogs were 800-something-p on X1 and 900p on the PS4, yet both those images are downgraded to 720p? It'll look 720p if you downgrade it.
@d0x360 What the heck, a logical-sounding post. Bubbles.
@demeador Except... he is using common sense to bend the truth. There is an obvious difference between 900p and 1080p if you have a GOOD display and your eyes aren't fried. What kind of made-up argument is this? 6 feet from a 70-inch TV is really close, but he is claiming you need a 100-inch TV to actually see a difference between 900p and 1080p. I sit 3 feet away from a 17-inch screen, and 900p to 1080p is blatantly obvious. If you can't see it on your 70-inch TV from 6 feet away, then you are blind. The pixels aren't the same size; they grow with the screen. Don't use the "only PC gamers can see it" excuse. You shouldn't be making these calls if your vision is obviously impaired, coincidentally existing 70-inch 1080p TV or not.

I have prescription glasses, which I only need for far-away objects. There is a slight blur on objects 6 or more feet away, but I can still make out what they are no problem and watch TV at that distance no problem. I see an improvement on my aunt's 65-inch 1080p TV from 900p to 1080p from 9 feet away. If you don't notice a difference between the two resolutions, then you are probably playing games where so many things are happening or moving on screen that you don't actually look at the menus and the extra clarity in what's on screen. Or you are looking at comparisons which add filters or upscale the lower resolution to make it appear less aliased.
You can when your TV is over 50 inches
That's what I'm saying... but not everybody has big screens, and not all big TVs are equal. My Samsung vs my Sony are different. Both are 1080p, but they are different.
Please stop with this "it has to be 50 inches or more" crap. My TV is 43 inches and I can clearly tell the difference. I ain't a damn blind bat.
My TV is 42 inches and I can tell the difference from 10 feet away. Have I got super-human eyes, or are you just full of the brown stuff? I'm thinking the latter. :P
I don't think he's saying you can't tell below 50, just that you definitely can tell over 50. I can tell the difference on my old Galaxy Tab vs the Tab S. Screen size isn't tricking my eyes. I can see the difference in picture and clarity between the Galaxy S5 and the LG G3, which is Quad HD, and those screens are relatively tiny compared to TVs. If an image is upscaled to the point a difference cannot be seen, then what is the point of trying to hit 1080p if 900p would suffice?
I can tell the difference on my 15.6-inch laptop screen lol.
I have a 42" TV and I can tell the difference. It's not bad or anything (same for 720), and you have to be looking for it, but it's there. Typically it's in the finer details, or the AA. Sometimes it's hard to notice when a game is in motion, but easy if you just sit still in a scene.
Took the test on my 24" monitor. Got them all correct. You don't need a big TV to tell the difference.
Wow, instead of questioning the parity, this article wants to question whether 1080p is needed over 900p? Encouraging gamers to settle for less, huh? Hear that, Nvidia? AMD? Apparently high-res graphics aren't important anymore. Close your shops and settle for what you already have.
My TV is a 40-inch Bravia and I could see a huuuuuuuuuuge difference in AC Black Flag before the patch and after it. The people that can't see that are in the minority, I suspect.
On my 1080p 47" LED TV it looks crisper at 1080p, but it's not a killer; 900p is fine. But when I'm on my 106" 1080p projector, 900p is just OK and playable, yet 1080p looks perfect.
Bullshit myth. I've got a 40-inch TV. I played Black Flag at 900p and then later at 1080p after the patch, and it looked MUCH better.
AC: I don't see it on a 7", but I do on a 13". Granted, on a couch you are farther away, but you can see it.
Tell this to a PC gamer who has a 27-inch monitor and games at 1440p or even 4K. They can see the difference; it's very obvious. The problem is most console players will claim there is no difference because they barely play anything that's even at 1080p.
"Can You Honestly Tell the Difference Between 1080p and 900p?" Sitting far away, not much difference. As a PC gamer coming from 1440p+ gaming, yes :)
A man was just about to say: if PC gamers are paying for 22" 4K monitors, we can tell the difference on any size screen. This is real, and it goes real far beyond the difference between 900p and 1080p.
@Jaqen_Hghar, having a 4K monitor has nothing to do with anything. It's not like you're actually playing anything at 4K. You're wasting your money on that.
Yes I do, like the majority of people.
Yes, I can as well. It's not that hard to tell. And by that hard, I mean super easy.
Whether people can tell the difference between 900p and 1080p is moot; the real issue here is that they intentionally gimped and held back the PS4 version due to the limitations of the XB1 hardware. I sure as hell didn't buy the more powerful console only to be limited to the lowest common denominator because developers don't want to hurt MS's feelings. But even if we consider the resolution differences alone, take this into consideration: the difference in pixel count between 900p and 1080p is actually nearly the same as the difference between 480p and 720p, which is a bit over 600,000 pixels. So can gamers really say they wouldn't mind playing a game at 480p when it could be 720p? Or that they cannot tell the difference?
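The pixel-count comparison is easy to verify. A quick Python sketch, assuming 1600x900 for "900p" and 640x480 for "480p" (frame sizes under these labels can vary):

```python
def pixel_count(width, height):
    """Total pixels in a frame of the given dimensions."""
    return width * height

# Common frame sizes for each resolution label
res_1080p = pixel_count(1920, 1080)  # 2,073,600
res_900p = pixel_count(1600, 900)    # 1,440,000
res_720p = pixel_count(1280, 720)    #   921,600
res_480p = pixel_count(640, 480)     #   307,200

print(res_1080p - res_900p)  # 633600 extra pixels at 1080p vs 900p
print(res_720p - res_480p)   # 614400 extra pixels at 720p vs 480p
```

Both gaps do land "a bit over 600,000" pixels, as claimed, though as a fraction of the frame the 480p-to-720p jump (a tripling) is far larger than 900p-to-1080p (about 44% more).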
It's not even about 1080p vs 900p. It's the fact that Ubisoft is blatantly gimping the PS4 version to match the X1 version when the entire industry knows the PS4 can do 1080p quite easily.
From what they said, it had nothing to do with the graphics at all; it had to do with the A.I. and the CPU. They had already said both versions could be 1080p, but not with all the NPCs and the A.I. for all of them in the game and on screen at the same time. It has nothing to do with the XB1 gimping the PS4 version.
I'd like to see a blind test where 10 gamers have to identify which game is playing at 720/900/1080. Let's include PS4 and X1 too, and see if anyone can tell. Let's see that on N4G... Oh, and no "pausing the game" to count pixels... just good ol' fashioned gameplay. Good luck...
They tried that on sites, and they tried it with 60 vs 30fps too, and around 90% usually tell the difference within seconds. If you can't see the difference, it doesn't mean everyone else needs glasses.
If even Mr. Magoo can tell the difference... then everyone can.
It depends on the aspect ratio of the screen. When talking about HDTV, most people would have a screen with a 16:9 aspect ratio, which means 720p (1280x720 pixels), 1080p (1920x1080 pixels), and, for 4K TVs or UHDTVs, 2160p (3840x2160 pixels). With the exception of the standard-definition standards, a normal HDTV will only display 720p and 1080p signals. So-called 900p has to be upscaled to 1080p, assuming it can be scaled to a 16:9 aspect ratio. If it can't, you will see "letter-boxing", which in itself is not a bad thing, since the display ends up looking like a Cinemascope picture, but on a 16:9 screen you will have "letter-boxing". That is, unless the so-called 900p is actually 1600x900 pixels, which is a 16:9 aspect ratio that, when upscaled at the source (i.e. the console), will be 1920x1080 pixels on input to the HDTV. If the 900p is 1920x900 pixels, you are going to get "letter-boxing", and that you will definitely pick up on a 16:9 screen.

Can you pick up-scaling like I just mentioned? I think the best answer is "possibly", since the result may be slightly "washed out" (maybe). How's that for a wishy-washy answer. Before anyone disagrees with me, read the following, since it goes into much greater detail: http://www.projectorcentral... In addition, there are plenty of sites that discuss this topic.

It must also be noted that there are other screens with different aspect ratios, but the main one agreed by all manufacturers to be best for general purposes was 16:9. Other aspect ratios can be better for some content but bad for other content; it really depends on what you prefer to display on your screen.
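The 16:9 check above is just a ratio reduction. A minimal Python sketch, using the two "900p" candidates mentioned as example inputs (the helper name is illustrative):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a frame size to its simplest aspect ratio, e.g. (16, 9)."""
    g = gcd(width, height)
    return (width // g, height // g)

print(aspect_ratio(1600, 900))   # (16, 9)  -> scales cleanly to a 16:9 panel
print(aspect_ratio(1920, 900))   # (32, 15) -> wider than 16:9, so letter-boxing
print(aspect_ratio(1920, 1080))  # (16, 9)
```

Since 1920x900 reduces to 32:15, which is wider than 16:9, filling the panel width would leave black bars top and bottom, exactly the letter-boxing case described.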
"Unless the so called 900p is actually 1600x900 pixels which is an aspect ratio of 16:9..." What happens when you divide both of those numbers by 100? Yeah. That's probably the most obvious one of all outside of ye olde 400x300.
Who gives a raccoon's rotten ass whether the differences are negligible or not? My 60 bucks says get the best version possible.
Everyone says yes, but when Call of Duty and Killzone 3's multiplayer were found out to NOT be native 1080p, no one knew. Ok.
Yes, the difference is there. Usually 900p looks a bit more blurry than 1080p, but it also depends on how far away you sit and how big the TV is.
A better question would be: can you tell the difference between native 1080p and 900p upscaled to 1080p? And that answer would be NO. lol!! If you took the same game on two different systems, put them side by side, and asked gamers to pick out which one was native 900p, I'm sure most of them would get it wrong.
I bet you £1000 you can't tell the difference between 1080p and 900p. If it was between 720p and 1080p, yeah, you can tell the difference if you've got a TV which is 60 inches or more. But PS fanboys are special, different from other people, and God gave them special eyes so they can tell the difference between 1080p and 900p.
It also appears that God gave bad eyes to the other side of the fanbase
Yes I can, as well as the difference between 1080p, 1440p, and 2160p. I can also tell the difference between 30fps, 60fps, 120fps, etc.
They are trying to downplay people like you by using screenshots instead of video. Resolution is vastly harder to discern in static images. But games aren't static images, now, are they?
I often wonder if people actually can't tell the difference or they just say that to "toe the party line".
The only people who can't tell the difference are people who only own a certain console. If you can't tell the difference between 900p and 1080p, then you need to buy a new 1080p TV, wear your glasses, or have laser eye surgery.
Or they have a crappy TV. 900 to 1080 may be visually minute, but you can most definitely see the difference between 30 and 60 FPS. I was blown away when I fired up TLOU. I really hope the PS4 version is amended: 1080p/60fps!
PC gamers claim to see a difference between 4K monitors and 2160p monitors that are both 22". A man is pretty sure they don't have a party line to toe, since both of those are far superior to any console in resolution and they could simply claim 2160p is good enough and still laugh at console graphics. A man hasn't seen either of those things, but if it makes a difference at those high levels of pixel count on a small screen, it DEFINITELY makes a difference when we're talking 900p to 1080p, considering the smaller the overall numbers, the more the differences are noticed.
I have an Insignia 42" TV, and I would consider it a low-end TV, despite it having a nice picture for the price. It is most certainly possible to tell the difference between resolutions, and I have put this to the test. It's not hard, as all the systems and most Blu-ray players let you set which resolution to output at. I also have a setup disc with videos at different resolutions for adjusting things, and there is a clear difference. Try it and see what you think. Just make sure you pick a game that is native 1080p, otherwise it may be upscaled depending on the game.
Some of us play games on our pcs. And I'll tell you right now, it's damn obvious.
Playing on a 4K TV, it is upscaled either way (and upscaled very well, I might add). Yes, I can tell the difference in some cases. Destiny, for example, was noticeably different between the beta and the retail release. But good-quality AA makes a bigger difference, and that is sorely lacking in most next-gen games so far. And whatever magic techniques Crytek used to make Ryse look like it does support that (they use their own software techniques, not the hardware scaler).
Between 60 and 120? No difference; no human eye can see that!
So you're one of those who thinks the human eye sees in frames per second? sigh..
Eyes don't see in frames... a human eye can most certainly tell the difference.
I notice it all the time, and it's just as jarring as winding down from 60 to 30. The only difference is that 60 is very playable and 30 verges on unplayable in many cases (particularly on last-gen consoles, where dips into the sub-30 range were a regular occurrence). I find it funny how so many are now touting the "clear as day" difference between 1080p and sub-1080p now that it suits them as an argument. PC gamers were berated over this very topic and defended it until they were blue in the face last gen, but this moronically popular opinion persisted until it became fanboy fuel for the console crowd. Guess how it's going to go with 120FPS vs. 60? Or 60 vs. 30 (also gaining momentum now that it's something one camp can lord over the other)? Yes, these differences are real. Your eyes do not perceive reality in FPS. Your eyes absorb all of the light they are capable of sensing, and your brain then processes that information. People can perceive north of 260Hz/frames per second (that's as far as the testing went in that case).
There have been experiments to measure the shortest period of time an image can be shown to the human eye and still be recognized, which is something like 1/250th of a second. So one could say that the most the human eye can see, before being completely unable to notice improvement, is something around 250-300fps, and that's with people who have exceptional vision. The average person is probably quite a bit less.
I agree with you, FasterThanFTL, but it's also true that there isn't that big a difference between 900 and 1080. That said, I'll still get the highest possible.
900p vs 1080p? Probably... Native 1080p vs upscaled 1080p? Probably not. People always pretend to overlook that tidbit of info. We aren't talking about pixel count, because that's your display, not the source; how that fixed number of pixels is drawn is the topic.
Upscaling artifacts tend to be more pronounced than native resolutions due to distortions from the upscaler's algorithms. I'd prefer native 900p over upscaled 1080p for games. My receiver upscales content to 4K if I so desire, but I turned the feature off since I find native resolution a bit less jarring. Upscaling an image does not make the image clearer; it just makes it bigger. This is easily evidenced by zooming in a bit on any picture on a PC, even with viewers that interpolate pixels well.
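The point that scaling adds size but not detail can be shown with a toy nearest-neighbour upscale in Python (a deliberately minimal sketch; real scalers interpolate between pixels, but they still can't invent detail that isn't in the source):

```python
def upscale_nearest(image, factor):
    """Nearest-neighbour upscale: each source pixel becomes a factor x factor block."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in image
        for _ in range(factor)
    ]

# A 2x2 "image" with four pixel values
small = [[1, 2],
         [3, 4]]
big = upscale_nearest(small, 2)  # now 4x4, but still only four distinct values

print(big)
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
distinct = {p for row in big for p in row}
print(len(distinct))  # 4 -- no new information was created
```

The output has four times the pixels but exactly the same information content, which is why upscaled 1080p is not equivalent to native 1080p.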
Everything is scaled, though. You can't view 900p on a 1080p display; if the console didn't scale it, your display would stretch it, unless it letterboxed the image, but I can't see anyone playing like that. Still, the difference can't be too large. 720 upscaled to 1080 is quite noticeable, but 900 to 1080 isn't; 1080 to 4K is also quite the leap. My only point was people treat this debate like it's 1080 displays vs 720 displays when it's totally different. Pixel counts are identical at the output; it's the method used to draw each pixel that varies.
Too true. From my experience, though, the upscaling seems off when coming from the gaming devices. I don't know why this is, as it seems it should be much better than a TV's upscaler, but in some cases just leaving everything at its normal setting on the devices works better for me. Maybe it's because the TV's upscaler doesn't mess so much with the AA. I never really thought about it much, so I may have to give it another run-down. My receiver is apparently rated highly for upscaling, so I was a little disappointed in the results.
I can't tell the difference in fps past 30 or so unless it's a really simple game without too many moving elements or environmental detail :(
As long as you sit at the right distance. Sat 10ft or 15ft from a 40", it's impossible to tell. 5ft from a 47" you can tell; 2ft from a 15" you can tell.
While I do agree with what you said, I think it is important to note the following. Yes, you will definitely notice the difference between 1080p content and 4K (2160p) content, but if the content fed to both TVs is 1080p or even 720p, you probably won't notice much of a difference, even side by side, unless the 4K TV has a very good upscaler. As for frame rate, that also depends on the content: a film may be shown at 24fps and be quite comfortable to watch, while content with a lot of motion, such as sports, action movies, or action/FPS games, would definitely benefit from higher frame rates. Eventually you reach the limits of human vision (we already have with sound), and unless you modify the human eye, and therefore the brain, it becomes pointless to keep increasing screen resolution and frame rate. For your interest, the BBC is experimenting with 300fps, while some TV manufacturers are experimenting with 8K TVs, although I would not hold my breath waiting for either.
Yes. But resolution is not the issue with AC Unity. Dumb article.
Very true. It's about not using the full power of a console to "avoid debate".
Yeah, it's a matter of principle, even if it's a single-p difference. The stupid thing is, though, whether or not you can tell the difference by COMPARISON, it makes no difference when you're actually playing the game. If you were given a random game, you couldn't tell what the resolution was.