Comments (71)
user7402931  +   825d ago | Well said
all you need to know

1080p > 720p

120Hz > 60Hz

ps4 > one
Fanboyssuck27  +   825d ago

Wii u > ps4 > xbone
#1.1 (Edited 825d ago ) | Agree(26) | Disagree(134) | Report | Reply
CGI-Quality  +   825d ago
Unless you're talking personal game experience, the technology in a WiiU is NOT greater than the PS4.
LOL_WUT  +   825d ago
Even the X1 has more appealing next-gen titles than the WiiU! And yes, resolution and specs all matter; if they didn't, devs wouldn't pass on Nintendo. ;)
u got owned  +   825d ago

Oh boy, you will be at one bubble in no time if you keep that up.
#1.1.3 (Edited 825d ago ) | Agree(14) | Disagree(7) | Report
PlanetBathwater  +   825d ago
More Correction.

ziggurcat  +   825d ago
"Wii u > ps4..."

nice try, but that's beyond delusional.
Godhimself_In_3d  +   825d ago
You're f@#king high.
triforce79  +   825d ago
Mario Kart 8, X, Project CARS, and Titanfall have the best next-gen textures I've seen so far...
#1.1.7 (Edited 825d ago ) | Agree(3) | Disagree(17) | Report
Dehnus  +   825d ago
Sir, you get a bubble for being so truthful. Everybody knows this fact yet they keep denying it. The facts look them straight in the face! Why do these fanboys keep denying it?

Why, we can already play games on it, while they can only compare spec sheets. So by definition it is better ;).

In 2 weeks that might change a bit though ;).
UltimateMaster  +   825d ago
You're a brave one.
Here's your Mario kitty vs. COD dog.
Magicite  +   825d ago
you need to check your head, mate
LaserEyeKitty  +   825d ago
LOL at all the morons who just got trolled. You'd figure the name would give it away.

But, if you can't beat'em - PC Specs trump all.
user5575708  +   824d ago
it doesn't matter, because once you're focused on gameplay, graphics tend to become an afterthought

however, it does matter because before you play the game you want to know you're buying the best possible experience. So it definitely matters for sales. Why pay $100 more for a technologically inferior product that outputs inferior resolution? On paper it doesn't make sense.
0ut1awed  +   825d ago
I really doubt 120hz is going to matter for a PS4 other than maybe smoothing it out a little bit. 60fps seems to be the sweet spot for the system so you really don't need anything over 60hz.

Also, FYI, pretty much every TV marketed as "120hz" or "240hz" is actually not true. For all intents and purposes it is basically "upscaled," so to speak - the panel interpolates extra frames. None of those TVs actually allow for more than a 60fps input.

You need a computer monitor with a display port or DVI-D input to actually get anything over 60fps.
#1.2 (Edited 825d ago ) | Agree(14) | Disagree(4) | Report | Reply
Sci0n  +   825d ago
This is true. As I was TV shopping I was really considering getting a TV with like 240Hz, mainly for gaming. I spoke with a Best Buy TV salesperson and told him why I was searching for a 240Hz TV, and he kindly asked: are you going to be gaming from a PC rig or a console? When I told him a console, he said, "Well, I don't think any console games are rendering over 60fps, so you really don't need an insane amount of Hz at the moment - 60Hz is enough." I am now waiting for 4K TVs before I upgrade.
0ut1awed  +   825d ago
I guess it's his job but if you would have said PC gaming I know for a fact he would have tried to sell you one of those 120hz/240hz TVs (otherwise he wouldn't have even asked that question). Kind of messed up because as I already stated those TVs don't actually support higher FPS inputs. HDMI just doesn't have enough throughput to support it.
#1.2.2 (Edited 825d ago ) | Agree(3) | Disagree(0) | Report
Knal  +   825d ago

The latest versions of the HDMI standard do not allow throughput for more than 60fps? You have proof of this? Could you support that with some links?
mabreu  +   825d ago
@ 0ut1awed

According to Wikipedia, HDMI can output 1080p @ 120Hz as of version 1.4b, which was released back in October 2011.

However, I'm not sure if the PS4 or Xbox1 can support it. I assume they can but don't see any evidence on the web.
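A rough back-of-envelope check supports mabreu's correction (a sketch; blanking intervals are ignored, and the ~8.16 Gbit/s effective-video figure for HDMI 1.4's TMDS link is my assumption):

```python
def video_gbits(width, height, fps, bits_per_pixel=24):
    """Raw uncompressed video payload in Gbit/s (ignores blanking)."""
    return width * height * fps * bits_per_pixel / 1e9

# 1080p at 120 Hz with 24-bit color
rate = video_gbits(1920, 1080, 120)
print(f"1080p120 needs ~{rate:.2f} Gbit/s")  # ~5.97 Gbit/s

HDMI_14_VIDEO_GBITS = 8.16  # assumed effective video payload of HDMI 1.4
print("fits in HDMI 1.4:", rate < HDMI_14_VIDEO_GBITS)  # True
```

So on paper the cable isn't the bottleneck for 1080p120; whether a given TV actually accepts that signal is a separate question.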
CrusRuss  +   825d ago
You guys are confusing TV refresh rate (hz) with fps. If you read the article it explains the difference.
Visiblemarc  +   825d ago
It really is basically that simple. I'm tired of how the articles claiming "resolutiongate" is "overblown" have become more exhausting than the genuinely relevant articles about "resolutiongate" (those that compare price/performance).

If you don't like price/performance articles, take your ignorant opinions and go find a hobby that caters to morons.
BlueShirt7749  +   825d ago
With a name like that you need to get over yourself and get out more. Do you work for Sony? No? Then those figures are certainly not "all you need to know."

Simple mind....
WorldGamer  +   825d ago
You didn't really provide any evidence. You basically personally attacked him, then trailed off.

I think what he says is correct, and many other reputable forums also agree. I don't see why you would get so upset.
BOLO  +   824d ago
No one needs to work for Sony or Microsoft to know the facts...PS4 is a more technically capable console compared to Xbox One and Wii U. Try again.
skydragoonity  +   825d ago
Ur right goddamit!!!
bornsinner  +   825d ago
xbox one has dedicated servers, a better online experience, and better games; ps4 has killzone & losers who count pixels all day
GW212  +   825d ago
* better launch games.

* PS4 has Killzone, better graphics, better hardware, and losers who count pixels all day because they are actually there.
AngelicIceDiamond  +   825d ago
"720p? 1080p? ESRAM? Why it matters and why it doesn’t"

It matters now. But when everything gets fixed, we'll move on to the next thing and forget about it.
Biggest  +   825d ago
Fixed? Do you mean the next generation of consoles?
MRMagoo123  +   825d ago
If you know anything about tech, you'll know the Xbone is only going to get worse: more and more will need to be crammed into the ESRAM in the future to keep up with the PS4, and the lower-end GPU plus the ESRAM will make that impossible in a couple of years. GPGPU will be the Xbone's downfall.
AngelicIceDiamond  +   824d ago

Yeah that's not even close to being right.
linkenski  +   825d ago
720p @60fps > 1080p @30fps, though
H0RSE  +   825d ago
I agree. I'll take 60fps and lower resolution over the opposite. Hell, in competitive PC gaming, it's pretty common (or at least was) for players to disable a lot of visual features to obtain the best framerates possible. When I played competitively, that is what I did. It was also used to gain advantages, like disabling lighting to eliminate dark spots/shadows for players to hide in.
#1.8.1 (Edited 825d ago ) | Agree(1) | Disagree(3) | Report
Mitchblue  +   825d ago
@linkenski: Not a fair assessment. You cut the FPS in half with your illustration but don't double the resolution.

Fairer would be 720p @60fps vs 1080p @45fps.
#1.8.2 (Edited 825d ago ) | Agree(0) | Disagree(2) | Report
H0RSE  +   825d ago
The assessment is completely fair, since it's using standard measurements for comparison. Resolution has known increments - 360, 480, 720, 1080, etc. Framerates follow a similar philosophy: 24fps for film, 30 or 60fps for games, and then you can unlock the framerate (like on PC) and obtain whatever your system can handle. Although devs can aim for virtually any framerate they desire, 30 and/or 60 tends to be the "sweet spot."

So it isn't about actually halving everything equally, but rather comparing the different increments. That being said, I would still prefer 720p/60fps over 1080p/45fps.
#1.8.3 (Edited 825d ago ) | Agree(0) | Disagree(0) | Report
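One way to frame the 720p/60 vs 1080p/30 trade-off behind this sub-thread is raw pixel throughput (a sketch; it assumes rendering cost scales linearly with pixel count, which real engines don't strictly obey):

```python
def pixels_per_second(width, height, fps):
    """Pixels the GPU must fill each second at a given resolution/framerate."""
    return width * height * fps

p720_60 = pixels_per_second(1280, 720, 60)    # 55,296,000
p1080_30 = pixels_per_second(1920, 1080, 30)  # 62,208,000

# 1080p/30 actually pushes MORE pixels per second than 720p/60:
print(p1080_30 / p720_60)  # 1.125

# Equal-throughput framerate at 1080p, given a 720p/60 pixel budget:
print(p720_60 / (1920 * 1080))  # ~26.7 fps
```

By that crude metric, 1080p/45 is well past "fair" - the equal-pixel-budget counterpart of 720p/60 is closer to 1080p/27.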
GentlemenRUs  +   825d ago
Killzone runs at 1080p@60fps
So does Knack? I forgot.
Resogun does.
New'n'Tasty does.

Your statement is invalid.

1080p@60fps > 720p@15fps
#1.8.4 (Edited 825d ago ) | Agree(3) | Disagree(1) | Report
mabreu  +   825d ago
@ linkenski
I agree that frame rate is more important than resolution. It helps with response times when playing competitive games while making the animation run smoothly.

However, this is relative to what type of game you're playing. 30fps is acceptable and suitable for games that don't have fast-moving images and don't require fast response times. Also, a lower frame rate frees up graphical power to render other visuals like textures and lighting effects.

When I play BioShock on my PC, I always favor more fps. When I play Civilization 5, I put the graphics settings on high and don't mind sacrificing frame rate.
#1.8.5 (Edited 825d ago ) | Agree(0) | Disagree(0) | Report
MRMagoo123  +   825d ago
Shame the Xbone isn't getting 720p @60fps though, huh? It's barely getting 720p @30fps, and even that dips on DR3; even COD Ghosts is 720p with frame drops.
sovietsoldier  +   825d ago
How did I know the first comment would mention a PS4 and at the same time slam Xbox One?
Ol_G  +   825d ago
This is
fonger08  +   825d ago
Yeah unless you have a blind pro-Sony comment, don't bother.
True_Samurai  +   825d ago
Hahah no thanks
I'll take that x1
oof46  +   825d ago
Your own preference > Anyone else's opinion
Magicite  +   825d ago
Erdrick  +   825d ago
forgot pc > all ;)
inf3cted1  +   825d ago
PC > PS4 > Xbone
mistertwoturbo  +   825d ago
Such a poor article.
mewhy32  +   825d ago
Wow, more spin. It does matter, it does matter. I wonder if the spinners think that if they say it doesn't matter enough times, people will start believing it?
Knightshade  +   825d ago
It matters if you want to choose the right platform that meets your setup, but it matters little in terms of overall image quality. It's not spin. There is plenty of math and knowledge in the article to back it up.
oof46  +   825d ago
Apparently, a lot of people disagree with math and knowledge. :)
buckley  +   825d ago
Spin? Did you read the article?
lnvisibleMan  +   825d ago
Of course it does. Unless you're gaming on a tiny screen smaller than an iPad.
SpinalRemains138  +   825d ago
It matters. A new generation, you want a new standard. You pay 500 bucks for a new machine, you would want it to run contemporary settings. Why is this just not understood by some?
mightyhokie  +   825d ago
I'm getting the X1 but I agree. We should expect a lot better graphics and all that stuff for our money. I'm willing to let them work out the kinks. In a year you won't be able to tell the difference between X1 and PS4 visually. Look at Oblivion vs Skyrim. And the PS3 had some major issues at launch, not to mention the 'far more powerful PS3 will give you much better graphics' that didn't come true. Both systems are going to be awesome. But just like the issue with frame rate drops on the PS4 at 1080p, those issues will be gone rather quickly as game developers get more used to the hardware.

But I do agree that when I spend that $500 i am expecting better looking stuff moving faster and offering more.
MrCastle  +   825d ago
I am getting an XB1 on launch day as well, and I've decided to reserve judgment until these games actually come out. Everyone has their opinion of what looks good. A number is just a number to me. If it looks as good as it plays, it's a winner.
RegorL  +   825d ago
I wouldn't bet on XBox One ever catching up, but you do - good luck to you!

A. PS3 had a more difficult architecture
B. worse GPU
C. better CPU+SPU.
The difficult architecture made initial games worse.
The SPU let it catch up.
But the XBox 360's better GPU kept the playing field equal.

A. XBox One has a more difficult architecture
B. worse GPU
C. most likely worse CPU+GPGPU than PS4.
The architecture should not be that cumbersome, as it resembles the XBox 360's a lot - how to utilize it is known. But the eSRAM is only slightly faster than the PS4's GDDR5. Even with perfect (100%) utilization of it, XBox One's memory bandwidth will only be slightly better than the PS4's, and the likelihood of perfect utilization will decrease with bigger graphical environments...

XBox One has a cool camera, if game developers really find ways to use that it could sell systems, like the Wii motes sold Wiis.
WeedyOne  +   825d ago
"not to mention the 'far more powerful ps3 will give you much better graphics' that didn't come true"

This statement is only true for multiplatform games. Ps3 exclusives had WAY better graphics than anything on the 360. Killzone, God of War, and The Last of Us say hi!

The cell was completely different than anything any developer had ever worked on, so I understand the learning curve with that. But the XB1 and PS4 both use x86 which has been used for decades. You wouldn't think that would have the same steep learning curve.
Prime157  +   824d ago
@Hokie: "I'm willing to let them work out the kinks. In a year you won't be able to tell the difference between x1 and ps4 visually." /@

Look, you used Oblivion vs Skyrim, and I'm laughing because Bethesda never figured out the PS3. I'm not defending the PS3, but I think ESRAM is a very different kind of problem than Cell was. However, I'm going to quote the article in question to help explain my skepticism (I cut out a lot so it's not a lengthy read, as I assume you already read it or can find it).

"Sony put 8GB of GDDR5... 176GB/s of memory bandwidth ... Microsoft went a different direction, putting 8GB of DDR3 memory that hits a bandwidth of 68.3 GB/s, but also features a memory subsystem of 32 MB of ESRAM... at a theoretical speed of 192 GB/s. "

The way I see it is that the GDDR5 @ 176 is easy to use. The DDR3 RAM @ 68 is easy to use. That being said, devs have to find a way to CRAM 124 GB/s (the difference) of throughput through a TINY 32MB of ESRAM.

OK, so, I'm just disagreeing that we'll see the differences that soon - ESPECIALLY IN MULTIPLAT which are usually the same. I'm highly skeptical of if we will see that by the time the next next gen comes out (PS5 vs XBOXall(?))...
#4.1.4 (Edited 824d ago ) | Agree(0) | Disagree(0) | Report
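Plugging in the article's quoted figures makes the shape of the problem concrete (a sketch; these are the theoretical peaks quoted above, not measured throughput):

```python
PS4_GDDR5 = 176.0       # GB/s, quoted peak for PS4 main memory
XB1_DDR3 = 68.3         # GB/s, quoted peak for XB1 main memory
XB1_ESRAM_PEAK = 192.0  # GB/s, quoted theoretical ESRAM peak
XB1_ESRAM_MB = 32       # working set the ESRAM can hold at once

# Shortfall the ESRAM has to cover just to match the PS4's main RAM:
print(round(PS4_GDDR5 - XB1_DDR3, 1))       # 107.7 GB/s
# Headroom the ESRAM offers over DDR3, per the quoted peaks:
print(round(XB1_ESRAM_PEAK - XB1_DDR3, 1))  # 123.7 GB/s
```

The catch is the last constant: only 32 MB of data at a time can live at the higher rate, while the PS4's full 8 GB runs at 176 GB/s.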
halfblackcanadian  +   825d ago
PS3 ran some early games sub-HD as well...look how it turned out for them.

Resolution, in the end, is one aspect. There is a lot more that this next gen is expected to (and will) do that the current/last cannot achieve. Yes, PS4 will always have the edge in raw computational power, but people are seriously taking this as "Xbox One cannot produce 1080p" (which, beyond being proven wrong by launch titles, is ludicrous considering that last-gen machines can do it and these are exponentially more powerful).

Argue that one machine is more powerful than the other, that's fine, but to assume that one is last gen (when they share so much of the same DNA to boot) is being ignorant.
aquamala  +   825d ago
Did you think only early PS3 games are sub-HD? All COD games on PS3 are sub-HD (960x540, 880x720),

Saints Row 4 is sub-HD (960x720) and Splinter Cell Blacklist is sub-HD (1152x648); both games came out just a couple of months ago.
#4.2.1 (Edited 825d ago ) | Agree(1) | Disagree(0) | Report
Prime157  +   824d ago
What is sub-HD to you? A good guide is 720p, but some might argue that 480p was the start of HD. If it's 720p, then a lot of PS3 AND 360 games were sub-HD. If it's 480p, then a lot of 360 AND PS3 games were sub-HD. Do you see what I did there?

Current (soon to be last gen) games had very similar native resolutions to the new gen. Here's a quick find of early launch games:

Keep in mind that as resolutions get higher and higher (See also: 4k tvs are out) the more we'll notice the difference. The fact that we're entering this generation with such differences (104 million is and will always be better than 46 million) as this article quotes, makes me wonder if that ESRam can really boost it to 2.26x of what the XBone is now...
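For the resolution multiples being thrown around here, the per-frame pixel arithmetic is simple (a sketch; the 2.25x figure is exact, so the article's 2.26x presumably comes from a slightly different rounding):

```python
def frame_pixels(width, height):
    """Total pixels in one frame at a given resolution."""
    return width * height

p1080 = frame_pixels(1920, 1080)  # 2,073,600 pixels per frame
p720 = frame_pixels(1280, 720)    #   921,600 pixels per frame

print(p1080 / p720)  # 2.25 - 1080p pushes 2.25x the pixels of 720p
print(frame_pixels(3840, 2160) / p1080)  # 4.0 - and 4K is 4x 1080p again
```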
H0RSE  +   825d ago

It is understood by more than you think; it just isn't emphasized because it isn't as big a deal as many claim. So many people are focusing on resolution that even something as substantial as framerate seems to take a backseat.

The new consoles are offering enough new features that, without even getting into resolution, they justify their price tag. Battlefield will have 64-player support and be running at 60fps. Many if not all MP games will be utilizing dedicated servers, and many games will be able to interact with other devices, such as phones, tablets, and PCs. Cloud computing is being utilized for things such as AI processing and real-time stats, to a degree never achieved on consoles, with more potential to come. DVR features are being implemented; other PC-centric features, such as being able to play a game while it's downloading, are being introduced.

Real-time snapping between multiple apps, TV integration, an HDMI input that can be used with virtually any device with an HDMI-out port, including a 360 and even a PS4. A completely redesigned Kinect, built from the ground up to act as an extension of the X1 rather than an optional peripheral released years after the launch of the console, which was the case with Kinect 1.0.

With all these new features and ideas taking place, my $500 spent on an X1 is more than justified, even if every game I ever play is only 720p.
#4.3 (Edited 825d ago ) | Agree(0) | Disagree(0) | Report | Reply
miklo84  +   825d ago
Good read. Best part, "Your conclusion should be this – buy the games you want to play and stop worrying so much about the platform. The debate will rage on, but in the end it is and will always be about gameplay. Both systems will have their ups and downs, Microsoft and Sony fans will lash out at one another, Nintendo will do their own thing, and the PC crowd will lord their 4K resolution over the top of all of them."
davidrobots  +   825d ago
Yep. In time I'm sure we'll see more and more games hitting that magical 1080p/60fps mark, and this will all be a moot point anyway.
cyguration  +   825d ago
Except when one system hits that "magical mark" and one doesn't, will it still be a moot point?
MonkeyOne   825d ago | Trolling | show | Replies(7)
RVanner_  +   825d ago
ahhh, not more of this nonsense.

It matters to some and doesn't to others, 1080p does look better than 720p.

Just choose your platform and play your games.
buckley  +   825d ago
"Just choose your platform and play your games."

That's essentially the message of the article. It goes into technical explanation, sure, but it's not arguing for any particular console.
RVanner_  +   825d ago
It is the need for the article in the first place that I am having a pop at not the articles outcome.
buckley  +   825d ago
@RVanner_ Still, it seems to me the intent is actually to quell the fanboy fanfare, not encourage it.
RVanner_  +   825d ago
@Buckley - I agree, and that's my point, the fact that we have a fanboy fanfare that we need to quell if you will. Have no issues with the article at all.
Ron_Danger  +   825d ago
If I spend the money to buy a tv that can output at 1080p, then I want to hook hardware to it that I know is guaranteed to output at 1080p.

Just like if I buy a 7.1 surround system, I don't want to listen to mono through it.
Neonridr  +   825d ago
Yes but take into account how far away you sit from that 1080p TV. If you are gaming from more than 11 feet away from that TV (Using a 55" as an example - that distance would be less for a smaller TV) then your eye can't see the difference between 720p and 1080p. So while you think you're getting a better experience, you are only fooling yourself into thinking that.

Now if you are gaming at less than 10 feet from that TV, then I could understand the nitpicking...
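Neonridr's distance claim can be sanity-checked with the usual one-arcminute visual-acuity rule of thumb (a sketch; the 1-arcminute threshold for a 20/20 eye and the 16:9 screen geometry are the assumptions):

```python
import math

def max_benefit_distance_ft(diag_inches, horizontal_px, acuity_arcmin=1.0):
    """Farthest viewing distance (feet) at which a 20/20 eye can still
    resolve individual pixels on a 16:9 screen; beyond it, extra
    resolution is wasted."""
    width = diag_inches * 16 / math.hypot(16, 9)  # screen width in inches
    pixel_pitch = width / horizontal_px           # one pixel's width
    return pixel_pitch / math.tan(math.radians(acuity_arcmin / 60)) / 12

# 55" TV: 720p detail stops mattering past ~10.7 ft, 1080p past ~7.2 ft
print(round(max_benefit_distance_ft(55, 1280), 1))  # 10.7
print(round(max_benefit_distance_ft(55, 1920), 1))  # 7.2
```

That lands right around the ~11-foot figure quoted above: past roughly 10.7 feet, a 55" 720p and 1080p image are indistinguishable by this model.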
Campy da Camper  +   825d ago
There are some of us here who do understand that. I have a 54" LED and I used a measuring tape from my screen to the headrest of my recliner. I am exactly 6.5 feet away. I also measured out my surround sound speakers for maximum output. I adjust my TV settings for each game I play.

That said, I don't care if Dark Souls looks like ass, because it's fun as heck to play, BUT when I am spending 400 bucks on new tech I do want the best picture out there. Not saying I wouldn't play a game in lesser quality, but if I have to pay for it I will always choose the one with the best resolution.
Neonridr  +   825d ago
@Campy - totally agree, if you are spending money on new tech you expect it to do certain things for you. I was merely pointing out there are a lot of people here who probably sit pretty far from their TVs but go on tirades about the whole 720p/1080p dilemma, not realizing that they couldn't tell the difference between the two at that distance anyway... sorta ironic I guess.
MasterCornholio  +   824d ago
In my case I sit about a meter away from my 1080P monitor which is why 1080P is very important to me.

Nexus 7 2013
H0RSE  +   824d ago
You're hooking a PC to your TV? I do that too...oh, you're talking about PS4, aren't you?...

pssst, not all PS4 games will be 1080p...

Also, the mindset from others that spending $400 on a console warrants a "demand" for the best is eye-roll material. Really? $400 and you demand "the best"...consistently? Talk about setting your expectations high...

This isn't the first time a console has released, and it isn't the most expensive console either, and never have consoles been able to achieve "the best." PCs will always reign king in the resolution and framerate realm. If you want the best in this regard, you get a PC. Period. If you choose not to and instead choose a console, then you are willfully forgoing "the best" for "good enough."
#8.2 (Edited 824d ago ) | Agree(0) | Disagree(0) | Report | Reply
KingDadXVI  +   825d ago
A good article, although upscaling is far more complex than the article makes out. There are some very complex algorithms used to determine the added pixels, not the simple stretching implied, so you do get an image really close to a fully rendered one.

That being said, the Xbone has a really complex SoC that will definitely take time for developers to come to terms with. You will end up seeing lots of native 1080p coming down the pipe.
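The "not simple stretching" point can be illustrated with the most basic of those algorithms, bilinear interpolation, where each new pixel is a weighted blend of its four nearest source pixels (a toy sketch on a grayscale grid; real hardware scalers use fancier kernels such as Lanczos):

```python
def bilinear_upscale(src, new_w, new_h):
    """Upscale a 2D grid of grayscale values via bilinear interpolation."""
    h, w = len(src), len(src[0])
    out = []
    for y in range(new_h):
        fy = y * (h - 1) / (new_h - 1) if new_h > 1 else 0.0
        y0 = int(fy)
        y1 = min(y0 + 1, h - 1)
        ty = fy - y0  # vertical blend weight
        row = []
        for x in range(new_w):
            fx = x * (w - 1) / (new_w - 1) if new_w > 1 else 0.0
            x0 = int(fx)
            x1 = min(x0 + 1, w - 1)
            tx = fx - x0  # horizontal blend weight
            top = src[y0][x0] * (1 - tx) + src[y0][x1] * tx
            bot = src[y1][x0] * (1 - tx) + src[y1][x1] * tx
            row.append(top * (1 - ty) + bot * ty)
        out.append(row)
    return out

# A 2x2 gradient upscaled to 3x3: the new center pixel is invented
# as a blend of all four neighbors, not a copy of any source pixel.
print(bilinear_upscale([[0, 100], [100, 200]], 3, 3)[1][1])  # 100.0
```

The blending is also why an upscaled 720p frame looks softer than a native 1080p one: the interpolated pixels smooth detail rather than add it.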
davidrobots  +   825d ago
Yep. People forget that the PS3 was initially incredibly difficult to program for, and look at the games it ended up getting. I'm sure Xbox One will have some great looking games, just like the PS4. No need for everyone to get upset over it.
Knightshade  +   825d ago
I completely agree. I just didn't want the article to stretch to infinity either. :) There is a ton of tech that you really have to present to get the whole picture, but it really does come down to "don't stretch things - it looks bad". hehe
svoulis  +   825d ago
Couldn't read the article as the site was down. But here is what I think.

The issue isn't 720p vs 1080p, and it isn't 30fps vs 60fps.

It's simply that you are paying a higher premium price for a worse product - and not by a few dollars, but by a quarter of a PS4's cost. That's almost two retail games' worth. That is the gripe I have with it.
halfblackcanadian  +   825d ago
Everyone seems to hate it (on the internet) but the Kinect is a factor in the $100. If it was pound-for-pound just the systems in the box I would argue that, taking into account loss on each piece, MS would undercut PS4 by $50 (that is to say that Kinect accounts for $150 of the total, not $100). MS believes in the vision it has with Kinect enough to risk it, so who knows, there may be a lot coming down the line that we aren't privy to.
svoulis  +   825d ago
Yeah, I agree - a vision that will soon backfire in their face. They need to release a $349.99 SKU to win back an audience. Remove the Kinect. EVEN remove the HDMI-in. Things we don't need for gaming.

It's a fact that the PS4 is more than just slightly more powerful than the XO - a fact on paper and in games.

I dunno. I am getting a PS4 day one; I am going to wait on the XO for a few months to see what happens.
Stringerbell  +   825d ago
Screw you losers! I'm playing my Atari Jaguar at 1080p at 60 frames! I'll be playing Tempest 2000 basically I'll be gaming in the future while you losers play your generic nonsense. Excelsior!
Sci0n  +   825d ago
Gaming on my PS4 should look good on this TV, right? That's my TV - I bought it in 2012 and it's been amazing for gaming on PS3.
Dunban67  +   825d ago
Why would both MSFT and Sony strive for 1080p and 60fps if it did not matter?

Why would they spend the resources to "upgrade" the specs, the graphics, and the general performance of consoles each generation if it did not matter?

I don't own an Xbox or PlayStation, so my question is sincere and objective.

I can understand that opinions may differ on how big or little a difference these things make to the overall console experience, but I do not understand how an objective person/organization can say "it does not matter."

Obviously it matters - I think the proper question is "how much?"

Why would MSFT and developers come out and say more of their games will be 1080p and 60fps after the devs get more familiar with the machine? Why would they even care if it did not matter at all?
H0RSE  +   824d ago
They strive for it because it's wanted, in many cases demanded. It is less about whether or not it actually matters, and more about what is going to sell consoles. It's about marketing and bragging rights.

Whether or not it "matters" is completely subjective, and there is no right or wrong answer.
#13.1 (Edited 824d ago ) | Agree(0) | Disagree(0) | Report | Reply
assdan  +   825d ago
Showing 720p and 1080p photos that are compressed doesn't offer a good comparison.
smokeyf  +   825d ago

Yes, it will look good, depending on how close you sit to it.

This is another reason I think people are dumb when it comes to this resolution pissing match. If you aren't within the proper range of your display, you wouldn't notice any difference.

Neonridr  +   825d ago
Exactly, I just made the same statement in a reply up above. I have a 55" TV, and if I sit more than 10' away from it, my eyes can't tell jack $hit between 720p and 1080p. I know tons of people who have their sofas a good 10' from the TV.

So everyone think about that before you gripe about 1080p..
Sci0n  +   825d ago
How far away should I sit? Cause I have noticed my TV looks even better depending on what angle I am viewing it from or how far away I am. Never mind, I see that chart you posted. Should we not want the highest resolution available, though?
#16 (Edited 825d ago ) | Agree(0) | Disagree(0) | Report | Reply
Neonridr  +   825d ago
Then make sure you sit close enough to notice. The larger the TV the further away you can sit and have your eyes able to differentiate between the resolutions.
Campy da Camper  +   825d ago
About one foot for every 12 inches of screen size...maybe a tad more depending on your display.
neoandrew  +   825d ago
Who cares about ESRAM?

Best gpus have gddr5, not pathetic ddr3...
ATiElite  +   825d ago
Esram is there to boost the multitasking performance of the XB1.

Laptops use DDR3 in GPUs, and a laptop with GTX 750M SLI will run rings around a PS4.

So your statement about pathetic DDR3 is so WRONG! It's obvious you are not fully clued in to tech hardware.

I'm here to help.

*edit* OMG I just saw your name. I think I told you this before, but you prefer to flamebait!
#17.1 (Edited 825d ago ) | Agree(1) | Disagree(4) | Report | Reply
neoandrew  +   825d ago
Best pc desktop gpus have gddr5, not pathetic ddr3...

I need a gaming console, not a pathetic multitasking center.

If DDR3 is so GOOD, why do 99% of gaming GPUs have GDDR5? Forget that, don't answer - I'm not interested in what an M$ fanboy has to say.

Enjoy your 720p games and multitasking...
ATiElite  +   822d ago
You are not bright!

Anyway, go ahead and be stuck on simple.

Like I said, there are laptops out there with GPUs using DDR3 that will outperform a PS4.

I'm not saying DDR3 is better than GDDR5 in a GPU, because obviously GDDR5 is better, but DDR3 is very capable.

720p LMFAO! You filthy consoler, I am The Glorious PC Gaming Master Race; I game at 1600p and will upgrade to 4K this January.

So enjoy Battlefield 4 at 900p.

Next gen but LAST GEN resolution ha ha ha ha
neoandrew  +   822d ago
"becuase obviously GDDR5 is better"

I know, so whats the problem man?
edonus   825d ago | Spam
ATiElite  +   825d ago
The ONLY thing that matters is that the GPU in the XB1 is WEAKER than the GPU in the PS4

PS4 GPU = HD7850
XB1 GPU = HD7790

You can swap out the ESRAM and DDR3 in the XB1 and give it 10GB of GDDR5, and it STILL will NOT achieve 1080p with the same eye candy and fidelity as the PS4, because of the WEAKER GPU! FACT!

So enough with this ESRAM/DDR3 B.S., because the ONLY thing gimping the XB1 is its WEAK GPU.

Microsoft had enough time to delay the XB1 a few months and upgrade the GPU way back at E3, when it was CLEAR that the PS4's GPU was so dominant, but they chose Kinect over gaming performance.

Live by Kinect DIE by Kinect!
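The gap ATiElite is pointing at is usually expressed as theoretical shader throughput. Using the widely cited GCN configurations for the two consoles (a sketch; the CU counts and clocks below are the commonly reported figures, not official spec sheets):

```python
def gcn_gflops(compute_units, clock_ghz):
    """Theoretical FP32 throughput of an AMD GCN GPU:
    64 shaders per CU, 2 ops per clock (fused multiply-add)."""
    return compute_units * 64 * 2 * clock_ghz

ps4 = gcn_gflops(18, 0.800)  # 18 CUs @ 800 MHz -> 1843.2 GFLOPS
xb1 = gcn_gflops(12, 0.853)  # 12 CUs @ 853 MHz -> ~1310.2 GFLOPS
print(round(ps4), round(xb1), f"{ps4 / xb1:.2f}x")  # 1843 1310 1.41x
```

That ~40% raw compute gap, rather than the memory setup alone, is the number most of the 1080p-vs-720p arguments ultimately trace back to.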
MasterCornholio  +   824d ago
"ONLY thing gimping the XB1 is it's WEAK GPU. "

And the 32MBs of ESRAM.

Nexus 7 2013
RegorL  +   825d ago
Game consoles are running up against your television.

During the lifespan of the XBox 360 and PS3, _minimum_ TV resolution has gone from SD to Full HD (now some try to sell you 4K).

Interestingly, computer screens almost went the opposite direction; I had a difficult time finding a screen with more than 1080p/60 when I had to buy a new one about a year and a half ago... (power of mass market - the same LCD/plasma panels were used in both TVs and computer screens)

Now PCs will surely go to 4k.

What about TVs? I do not think they will go to anything higher than 1080p/30 as the needed transmission bandwidth is TOO HIGH.

Now wouldn't you expect a new game console to drive the most common TVs?

Since even the XBox One can drive 1080p/30 (NFS Rivals), it should be fine. The only problem: the PS4 can drive 1080p/60, so when it is reduced to 1080p/30
- its frame rate should be rock solid
- it leaves more resources free for physics/AI

The Cloud won't help make the frame rate rock solid. It could help with AI, but only when you are online, and only at a strategic level
- Suggesting the ideal line in a car game, but that could be calculated in advance... unless you add destruction into play, requiring reroutes during the race.
- Note: you still need a local AI to handle driver-to-driver situations, like someone turning in front of you or trying to push the AI player off the road. The AI could play aggressively, trying to push you off the road; a cloud AI could select where on the track this should be done, but it cannot execute it.
ATiElite  +   822d ago
I don't understand your post, but during the lifetime of the Xbox 360, PC screens were at 1080p before the Xbox 360 released, and went like this:


Yes, 4K monitors are available, but at $5K I'd rather buy a 4K TV.
Godhimself_In_3d  +   825d ago
If you want 1080p on Xbox One, get the PS4 and run it through the Xbox One's HDMI input lol
dells17  +   825d ago
I also found it useful: if you did not purchase a PS4/Xbone but instead purchased a Wii U, you can fully utilize its power, which is greater than a GTX Titan.
N311V  +   825d ago
Excellent article. Very well written and informative.
Prime157  +   824d ago
"This means that the show CSI has been lying to you for quite a while – zoom and enhance is bullshit."

That made me laugh.
