"At this year's Game Developers Conference, Avalanche Studios head of research Emil Persson offered his views on the PS4 and Xbox One effectively utilizing ROPs for higher bandwidth textures."
Good read, thank you. I also like the fact that the eSRAM is still a factor in this :)
Regardless of specs, if the games don't show a difference then what does it matter? It's a console. You're stuck with it the way it is for the next ~6-7 years. Just play your games.
But they do show a difference. I mean, Xbox One is less than full HD most of the time. Right now we have a huge GPU power gap, RAM gap, ROP gap. ROPs may not matter yet, but once GPGPU takes off, it will create another huge gap. The power difference exists; devs just haven't had the time and tools to fully utilize the additional power on the PS4. Either way, the gaps aren't going anywhere. They'll most likely just get bigger.
@GameNameFame Resolution is not a benchmark for computer graphics. Minecraft in 4K will never look as good as, let's say, GTAV. It's all about lighting, just like in real life. Ryse is still the best-looking title even though it's in 900p with some great AA going on. I've played them all on both systems, and you don't need to trust me, I can prove it (PM if need be). Devs should focus on animation, lighting and simulation, not resolution.
@BallsEye That is quite possibly the stupidest thing I have ever personally read on this site.
@BallsEye That is just a desperate argument. Framerates are commonly used as benchmark figures, and Ryse dips to 15 fps. It has good lighting at the cost of fps and resolution, which tells you the hardware cannot handle good lighting. Lol. Also, hardware benchmarks are always done on the same games, so everything else remains equal. Look at COD, BF4, Watch Dogs and so many others: all at inferior resolution on Xbox One. You cry for a legit benchmark? That's how it is done: one variable while all else remains the same or similar. In this case, that variable is resolution. If you want exactly the same as a PC benchmark, look at Tomb Raider: 30 fps vs 45-60 fps. The 50 percent power gap on the spec sheet is being shown in an actual benchmark.
@BallsEye And to think you used your only bubble on that comment! Get a grip, man!
@BallsEye: "Ryse is still the best looking title eventho it's in 900p with some great AA going on." No. Just no. Somebody hasn't been paying attention.
@BallsEye - lol, sorry but no. Infamous: SS dominates Ryse in every possible way. Which is quite sad, since Ryse is an interactive-cutscene game with pre-rendered sequences (extremely light on resources), while Infamous: SS is a real-time rendered open-world game (extremely heavy on resources).
@BallsEye Killzone and Infamous destroy Ryse in terms of graphics and performance. To me Killzone is still the best looking next generation game. Infamous looks great too especially with the lighting. Forza looks great... well at least the cars... The backgrounds were very bad though.
@GameNameFame, if devs haven't had the time and tools to fully utilize the additional power on the PS4, it seems only logical that the same circumstances apply to the Xbone. Over time, devs will learn to utilize the hardware of both platforms for better results. Of course there's a power gap, but given that the hardware on both systems is static, I'm curious how you can conclude that the gap will likely get bigger?
"PS4 and Xbox One effectively utilizing ROPs for higher bandwidth textures. According to Persson, the PS4 could render 64 bit textures while the Xbox One could only handle 32 bit textures before it ran out of bandwidth." And "In most cases I predict memory access patterns will be your biggest challenge." Referring to the eSRAM. What article did you read?
Ryse had dips down to 15-20 fps, but it was patched at launch. Please give accurate information: it now runs at a locked 30 fps. So much misinformation against the X1 on here. Every time I come on here, someone is lying to make the PS4 look better. Why would you even do this? It's just childish and insecure. Play the damn games and quit worrying if the X1 might do something better. I have never seen anything as bad as this site for insecure children fighting over toys. I love gaming, but if you stay on this site long you will learn to hate it, as it is negative and constantly bashing anything that is not a PS4. This is not gaming; it's pathetic.
Why do you assume the memory access patterns comment was made against the One in particular? This article talks about eSRAM as a good thing in the Xbox spec, not a negative. The comment about memory access patterns seems to be referring to both consoles. Just a couple of days ago Road Hog were saying that they would have preferred the PS4 to have split DDR and GDDR memory, because the differing memory access patterns would help them get more out of the hardware. Reality clearly favours the PS4, but it's somewhat encouraging to see some of game development's finest say the One can give it a run for its money. Can only mean better games.
Just FYI. The PS4 has 3 memory paths: CPU only (cacheable), GPU only (cacheable) and CPU/GPU shared (non-cacheable). All three have different bandwidth and are usually pooled (static configuration). Devs have the freedom to split this as they like within the 5.5GB boundary.
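The static split described above can be pictured as a simple budget check across the three paths. A toy sketch; the pool names and sizes are invented for illustration, and only the three access paths and the ~5.5 GB game-visible cap come from the comment:

```python
# Toy illustration of statically splitting the PS4's ~5.5 GB of
# game-visible memory across the three access paths described above.
# Pool names and sizes below are made-up examples.

BUDGET_GB = 5.5  # game-visible boundary from the comment

def validate_split(pools):
    """Check a static pool configuration fits the budget; return total."""
    total = sum(pools.values())
    assert total <= BUDGET_GB, f"over budget: {total} GB > {BUDGET_GB} GB"
    return total

split = {
    "cpu_cacheable":   1.0,  # CPU-only path
    "gpu_cacheable":   3.5,  # GPU-only path (textures, render targets)
    "shared_uncached": 1.0,  # CPU/GPU shared, non-cacheable path
}
print(f"allocated {validate_split(split)} GB of {BUDGET_GB} GB")
```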
@Gamerbynight I remember the days as an N4G reader back when the PS3 and Xbox were in their infancy. Xbox fans then were like the Sony fans here today. Both sides have an equal number of haters, and sadly journalists add fuel to the flames. You just have to live with it and do your best not to get caught up in it. After all, these are so-called "toys", and it's a given that children are going to be involved in the gaming community. Happy gaming.
Wait, wtf, you like the fact that a console is underpowered? All you care about is your PS4 being powerful? Shouldn't all consoles be equal so that multiplats won't suffer?
@Yo Mama - now i like that lol
Standby for PS fanboys..
I don't think baiting them in is the smartest decision
Since when do piranhas need baiting?
@MaxwellBuddha When you give them something to bait about
@Angainor7 & MaxwellBuddha "Standby for PS fanboys.. " "Since when do piranhas need baiting? " Oh the irony and hypocrisy!
That's a big circle jerk of denial. I'll just say this... rhetoric vs proof.
You guys don't sound like gaming fans. Has it now gotten to the point where we create these fake wars just to get a click or a reaction? I love to hear good news about the X1 but have no interest in talking about its competitors.
When you say rubbish like that, you're forcing the fanboys to come out to play, because you're being overly and unnecessarily defensive.
He probably wanted that reaction (fanboys coming out to play). It's called trolling. What I don't get, and what I've noticed a lot lately, is how he has (as of now) 19 agrees and 55 disagrees... a ratio that suggests 55 people noticed him trolling, yet haven't bubbled him down for trolling. Meanwhile, I see people on the opposite camp getting voted down with 10-to-1 agrees to disagrees these days. N4G: sociology at its finest.
Lol, it's true, and we have seen that in every single game out there. Lol. I'm going to buy a cheaper GPU because it won't matter...
When you buy a graphics card you look at three things: the graphics processor (GPU), clock rates, and memory bandwidth. You can look it up for yourselves: the XB1 GPU has a faster clock rate. It's not a lie. Where I think the PS4's GPU wins against the GPU inside the XB1 is the extra grunt with shading and some extra features that came with the Sony GPU. The speculated 50 percent power difference is due to the difference in CUs: 18 for Sony and 14 for Microsoft (12 being used). Sony fans are of the mistaken belief that all 18 CUs are being used for games; this is not the case. Only 14 CUs, confirmed by a PlayStation dev a few months ago. This is why some Sony fans are not getting it: 14 CUs for Sony, 12 for Microsoft. The PS4 does have an advantage over the XB1, but it's only about 10 to 15 percent on the GPU side.
No, you look at BENCHMARKS which document real-world performance. I've seen nVidia cards run circles around supposedly superior offerings from AMD.
It's still 18 CUs. 14 are directly usable by devs for the visuals; the other 4 are automatically set aside for compute. From what I understand, they can still use all 18 for graphics, or run a 14+4, 15+3, 16+2 or 17+1 split for gfx + compute. Regardless, the other 4 are still used, so you are misrepresenting the difference. It's still 18 vs 12 CUs, so there's still that big gap in performance.
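The CU arithmetic being argued over is straightforward to lay out. A hedged sketch using the widely quoted GCN figures (64 ALUs per compute unit, 2 FLOPs per ALU per clock via fused multiply-add) and the commonly cited clocks; the 14+4 split is the claim under discussion in these comments, not a confirmed spec:

```python
# Theoretical single-precision throughput from CU count and clock,
# using standard GCN assumptions: 64 ALUs/CU, 2 FLOPs/ALU/clock (FMA).

def gflops(cus, clock_ghz, alus_per_cu=64, flops_per_alu=2):
    """Peak GFLOPS for a GCN-style GPU configuration."""
    return cus * alus_per_cu * flops_per_alu * clock_ghz

ps4_all = gflops(18, 0.800)  # all 18 CUs
ps4_gfx = gflops(14, 0.800)  # 14 CUs, if 4 were reserved for compute
xb1     = gflops(12, 0.853)  # 12 active CUs

print(f"PS4, 18 CUs: {ps4_all:.0f} GFLOPS")
print(f"PS4, 14 CUs: {ps4_gfx:.0f} GFLOPS")
print(f"XB1, 12 CUs: {xb1:.0f} GFLOPS")
print(f"18-vs-12 ratio: {ps4_all / xb1:.2f}x, "
      f"14-vs-12 ratio: {ps4_gfx / xb1:.2f}x")
```

Counting all 18 CUs gives roughly a 1.4x gap over 12; counting only 14 shrinks it to roughly 10 percent, which is where the "10 to 15 percent" figure cited earlier in the thread comes from.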
That's what you wish. The difference is already more than 10 to 15 percent. I don't know how many years you have been playing, but Sony understates power and then releases more later. Normal. The clock speed difference is minimal. And the GPU is not just CUs; it's the combination of every single aspect that makes it much better, with room to evolve in the future. The same can't be said of the Xbox's GPU. That's why all the talk about eSRAM, DX12, SDK, dGPU...