"At this year's Game Developers Conference, Avalanche Studios head of research Emil Persson offered his views on the PS4 and Xbox One effectively utilizing ROPs for higher bandwidth textures."
Good read, thank you. I also like the fact that the ESRAM is still a factor in this :)
Regardless of specs, if the games don't show a difference then what does it matter? It's a console. You're stuck with it the way it is for the next ~6-7 years. Just play your games.
But they do show a difference. I mean, the Xbox One is less than full HD most of the time. Right now we have a huge GPU power gap, RAM gap, and ROP gap. ROPs may not matter yet, but once GPGPU takes off, they will create another huge gap. The power difference exists; devs just haven't had the time and tools to fully utilize the additional power on the PS4. Either way, the gaps aren't going anywhere. They will likely just get bigger.
@GameNameFame Resolution is not a benchmark for computer graphics. Minecraft in 4K will never look as good as, let's say, GTAV. It's all about lighting, just like in real life. Ryse is still the best-looking title even though it's 900p with some great AA going on. I've played them all on both systems, and you don't need to trust me, I can prove it (PM if need be). Devs should focus on animation, lighting and simulation, not resolution.
@BallsEye That is quite possibly the stupidest thing I have ever personally read on this site.
@BallsEye That is just a desperate argument. Framerates are commonly used as benchmark figures, and Ryse dips to 15 fps. It has good lighting at the cost of fps and resolution, which tells you the hardware cannot handle that lighting. Lol. Also, hardware benchmarks are always done on the same games, so everything else remains equal. Look at COD, BF4, Watch Dogs and so many others: all at a lower resolution on Xbox One. You cry for a legit benchmark? That's how it's done: one variable, and everything else the same or similar. In this case, that variable is resolution. If you want something exactly like a PC benchmark, look at Tomb Raider: 30 fps vs 45-60 fps. The 50% power gap on the spec sheet is being shown in an actual benchmark.
@BallsEye And to think you used your only bubble on that comment! Get a grip man!
@BallsEye: "Ryse is still the best looking title eventho it's in 900p with some great AA going on." No. Just no. Somebody hasn't been paying attention.
@BallsEye - lol, sorry but no. Infamous:SS dominates Ryse in every possible way. Which is quite sad, since Ryse is an interactive-cutscene game with pre-rendered sequences (extremely light on resources), while Infamous:SS is a real-time-rendered open-world game (extremely heavy on resources).
@BallsEye Killzone and Infamous destroy Ryse in terms of graphics and performance. To me Killzone is still the best looking next generation game. Infamous looks great too especially with the lighting. Forza looks great... well at least the cars... The backgrounds were very bad though.
@GameNameFame, if devs haven't had the time and tools to fully utilize the additional power on the PS4, it seems only logical that the same circumstances apply to the Xbone. Over time, devs will learn to utilize the hardware of both platforms for better results. Of course there's a power gap, but given that the hardware on both systems is static, I'm curious how you can conclude that the gap will likely get bigger?
"PS4 and Xbox One effectively utilizing ROPs for higher bandwidth textures. According to Persson, the PS4 could render 64-bit textures while the Xbox One could only handle 32-bit textures before it ran out of bandwidth." And "In most cases I predict memory access patterns will be your biggest challenge," referring to the eSRAM. What article did you read?
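For what it's worth, the 64-bit vs 32-bit point quoted above is really just back-of-the-envelope bandwidth arithmetic. A minimal sketch, assuming the commonly cited launch clocks and bandwidth figures and ignoring blending reads, compression and caches (so these are upper bounds, not measurements):

```python
# Rough sketch: peak write bandwidth the ROPs can demand, in GB/s.
# Figures below are the commonly cited launch specs, not official numbers.
def rop_demand_gbs(rops, clock_ghz, bytes_per_pixel):
    """Bandwidth the ROPs can consume if fill-rate limited: ROPs x clock x bpp."""
    return rops * clock_ghz * bytes_per_pixel

# 64-bit (8-byte) render target:
ps4_64bpp = rop_demand_gbs(32, 0.800, 8)  # ~204.8 GB/s vs ~176 GB/s GDDR5
xb1_64bpp = rop_demand_gbs(16, 0.853, 8)  # ~109.2 GB/s vs ~68 GB/s DDR3

# 32-bit (4-byte) render target:
xb1_32bpp = rop_demand_gbs(16, 0.853, 4)  # ~54.6 GB/s -- fits in DDR3

print(ps4_64bpp, xb1_64bpp, xb1_32bpp)
```

By this crude measure, a 64-bit target saturates the Xbox One's DDR3 long before the ROPs run out, which is presumably where the ESRAM (and Persson's "memory access patterns" warning) comes in.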
Ryse had dips down to 15-20 fps, but it was patched at launch. Please give accurate information. It now runs at a locked 30fps. So much misinformation against the X1 on here. Every time I come on here, someone is lying to make the PS4 look better. Why would you even do this? It's just childish and insecure. Play the damn games and quit worrying if the X1 might do something better. I have never seen anything as bad as this site for insecure children fighting over toys. I love gaming, but if you stay on this site long you will learn to hate it, as it is negative and constantly bashing anything that is not a PS4. This is not gaming, it's pathetic.
Why do you assume the memory access patterns comment was made against the One in particular? This article talks about ESRAM as a good thing in the Xbox spec, not a negative. The comment about memory access patterns seems to refer to both consoles. Just a couple of days ago Road Hog were saying that they would prefer the PS4 to have split DDR and GDDR memory, because the differing memory access patterns would help them get more out of the hardware. Reality clearly favours the PS4, but it's somewhat encouraging to see some of game development's finest say the One can give it a run for its money. Can only mean better games.
Just FYI. The PS4 has 3 memory paths: CPU only (cacheable), GPU only (cacheable) and CPU/GPU shared (non-cacheable). All three have different bandwidth and are usually pooled (static configuration). Devs have the freedom to split this as they like within the 5.5GB boundary.
@Gamerbynight I remember my days as an N4G reader back when the PS3 and Xbox 360 were in their infancy. Xbox fans back then were like the Sony fans here today. Both sides have an equal amount of haters, and sadly journalists add fuel to the flames. You just have to live with it and do your best not to get caught up in it. After all, these are so-called "toys", and it's a given that children are going to be involved in the gaming community. Happy gaming.
Wait, wtf, you like the fact that a console is underpowered? All you care about is your PS4 being powerful? Shouldn't all consoles be equal so that multiplats won't suffer?
@Yo Mama - now i like that lol
Standby for PS fanboys..
I don't think baiting them in is the smartest decision
Since when do piranhas need baiting?
@MaxwellBuddha When you give them something to bait about
@Angainor7 & MaxwellBuddha "Standby for PS fanboys.. " "Since when do piranhas need baiting? " Oh the irony and hypocrisy!
That's a big circle jerk of denial. I'll just say this... rhetoric vs proof.
You guys don't sound like gaming fans. Has it now gotten to the point where we create these fake wars just to get a click or a reaction? I love to hear good news about the X1, but I have no interest in talking about its competitors.
When you say rubbish like that, you're forcing the fanboy to come out to play, because you're being overly and unnecessarily defensive.
He probably wanted that reaction (of fanboys coming out to play). It's called trolling. What I don't get, and what I've noticed a lot lately, is how he has (as of now) 19 agrees and 55 disagrees... a ratio that suggests 55 people noticed him trolling, yet haven't bubbled him down for it. Meanwhile, I see people in the opposite camp getting down-voted at 10-to-1 agrees to disagrees. N4G: sociology at its finest.
Lol, it's true and we have seen that in every single game out there. Lol. I'm going to buy a cheaper GPU because it won't matter...
When you buy a graphics card you look at three things: the graphics processor (GPU), clock rates, and memory bandwidth. You can look it up for yourself: the XB1 GPU has a faster clock rate. It's not a lie. Where I think the PS4's GPU wins against the GPU inside the XB1 is the extra grunt with shading, along with some extra features that came with the Sony GPU. The 50 per cent power difference is speculated based on the difference in CUs: 18 for Sony and 14 for Microsoft (12 being used). Sony fans are of the mistaken belief that all 18 CUs are being used for games; this is not the case. Only 14 CUs, confirmed by a PlayStation dev a few months ago. This is why some Sony fans are not getting it. 14 CUs for Sony, 12 for Microsoft. The Sony PS4 does have an advantage over the XB1, but it's only about 10 to 15 per cent on the GPU side.
No, you look at BENCHMARKS which document real-world performance. I've seen nVidia cards run circles around supposedly superior offerings from AMD.
It's still 18 CUs. 14 are directly usable by devs for the visuals, the other 4 are automatically set directly for compute. From what I understand they can still use all 18 for graphics or 14+4, 15+3, 16+2 or a 17+1 setup for gfx + compute. Regardless though, the other 4 are still used, so you are misrepresenting the difference. It's still 18 vs 12 CUs. So, there's still that big gap in performance.
That's what you wish. The difference is already more than 10 to 15 per cent. I don't know how many years you have been playing, but Sony understates power and then releases more later. Normal. The clock speed difference is minimal. And the GPU is not just CUs; it's the combination of every single aspect that makes it much better, with room to evolve in the future. The same can't be said of the Xbox's GPU. That's why all the talk about ESRAM, DX12, SDK, dGPU...
I looked at specs and benchmarks when I picked my GPU. I also looked at what CPU I could couple it with to give the best overall results. I personally will admit, I failed at choosing the best CPU for my GPU.
@KNWS If the X1 has 14 CUs then the PS4 has 20 CUs. You know damn well that both have CUs disabled for redundancy. Also, even with a measly 6% clock speed advantage over the PS4 GPU (853MHz vs 800MHz), it still performs worse than the PS4 GPU. And you should know this, the PS4 has:
40.7% more teraflop performance at 1.843TF vs 1.31TF (less, actually)
50% more shaders than the X1 at 1152 vs 768
100% more ROPs at 32 vs 16
50% more texture units at 72 TUs vs 48 TUs
4x the ACEs/queues at 8 ACE/64 queues vs 2 ACE/16 queues for the X1
88% better pixel fillrate at 25.6GPixels/s vs 13.65 for the X1
And 41% higher texel fillrate at 57.6GTexels/s vs 40.9
PS4 vs XOne specs.
X1 GPU:
1.18 TF (12 CUs) for games
768 shaders
48 texture units
16 ROPs
2 ACEs / 16 queues
PS4 GPU:
1.84 TF (18 CUs) for games (+56%)
1152 shaders (+50%)
72 texture units (+50%)
32 ROPs (+100%)
8 ACEs / 64 queues (4x)
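The percentages in spec lists like the one above can be checked mechanically. A minimal sketch (the raw spec figures are the commonly cited launch numbers, not verified here); note that 8 ACEs vs 2 is 4x, i.e. 300% more, not "+400%":

```python
# How much bigger spec a is than spec b, as a percentage.
def pct(a, b):
    return round((a / b - 1) * 100, 1)

print(pct(1.84, 1.18))  # TFLOPS for games -> ~55.9%
print(pct(1152, 768))   # shaders        -> 50.0%
print(pct(72, 48))      # texture units  -> 50.0%
print(pct(32, 16))      # ROPs           -> 100.0%
print(pct(8, 2))        # ACEs           -> 300.0% more (i.e. 4x)
```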
@XtraTrstrL "From what I understand they can still use all 18 for graphics or 14+4, 15+3, 16+2 or a 17+1 setup for gfx + compute." Exactly.
Erm, benchmarks? I think that's where you go if you want comparisons, and as for your CU usage, you're not right at all. You're twisting the facts to suit your argument. Not cool.
That's your opinion, and it's incorrect. The differences in the games we've seen so far are greater than 10-15%; that's a fact. So you look foolish saying what's obviously not true. The X1 will NEVER outperform the PS4 when it comes to games, so you need to stop spreading lies that have already been proven to be the exact opposite of your claim.
"Sony PS4 does have an advantage over the xb1, but its only about 10 to 15 per cent on the GPU side." The difference in the number of pixels alone works out to a bigger percentage than that, and when you start adding in other differences (better AA, ambient occlusion, etc) the power gap becomes even more apparent.
Don't bother with the specs. These fools would believe a Radeon HD 4000 series GPU would outrun a Titan if MS told them it would.
"Only 14 CU's confirmed by a Playstation dev a few months ago. This is why some Sony fans are not getting it." - Lmaooooo, this is gold. Man, IGN should employ you. Yeah, confirmed by the chief architect of the system to be a suggestion, not a hardware split:

"Digital Foundry: Going back to GPU compute for a moment, I wouldn't call it a rumour - it was more than that. There was a recommendation - a suggestion? - for 14 cores [GPU compute units] allocated to visuals and four to GPU compute...

Mark Cerny: That comes from a leak and is not any form of formal evangelisation. The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU." http://www.eurogamer.net/ar...

Also, the official announcement: "The GPU contains a unified array of 18 compute units, which collectively generate 1.84 Teraflops of processing power that can freely be applied to graphics, simulation tasks, or some mixture of the two." http://www.scei.co.jp/corpo...

Edit - Even VGLeaks doesn't agree with your BS: "We understand that Sony found that using 4 CUs in GPGPU tasks can be more efficient and with better overall results than using all of them for rendering tasks, but they don't stop developers to use them only for rendering or GPGPU, as they wish depending on the game demands. It's not mandatory at all. Sony is only providing more resources to the developers." http://www.vgleaks.com/play...
Yup, a faster GPU clock speed at 853MHz, and a faster CPU clock speed at 1.75GHz. A 50% reduction in CPU overhead with DirectX 12. The gap isn't that big, it's slight.
@lifeisgamesok "Yup a faster gpu clock speed at 853" "And a faster CPU clock speed at 1.75" - Yeah, those things sure are helping the XB1 outpace the PS4, aren't they...

"The gap isn't that big it's slight" - But you are delusional about the DirectX 12 prospects for the XB1. Hint - there will be a performance boost, but nothing like on PC, which doesn't already enjoy the benefits of "console efficiency". http://i.imgur.com/xxNpozh....

"Traditionally this level of efficiency was only available on console – now, Direct3D 12, even in an alpha state, brings this efficiency to PC and Phone as well. By porting their Xbox One Direct3D 11.X core rendering engine to use Direct3D 12 on PC, Turn 10 was able to bring that console-level efficiency to their PC tech demo." http://blogs.msdn.com/b/dir...

June 16: Insight into the Xbox One Technology - ISCA 2014 - this will be the final blow to your delusions, seeing as from the recent events we got no info.
Destiny, very good advanced trolling... I guess you expect a reaction... All the effort you put in to say the PS4 is more powerful... Well done... You told the world something they already knew; a revelation, you probably think...
You think, KNWS, that people are really stupid and senile? You write the same shit every time, and what you write is 99% ALWAYS FALSE! I mean, what the hell are you trying to prove to all of us? You really look stupid! For example, the Xbone can't handle the Metro Redux collection (2033 and Last Light) at 1080p. How will you react if UFC on PS4 gets a 1080p patch and the Xbone remains at 900p? Will you cry in your room or smash your TV? Devs CAN USE ALL 18 CUs IF THEY WANT for rendering on PS4. The Xbone has 12 CUs (2 more are locked (14 CUs then) because of production yields). The PS4 has 18 CUs (2 more are locked (20 CUs then) because of production yields).

@lifegameisok Quote: "Yup a faster gpu clock speed at 853 And a faster CPU clock speed at 1.75 50% reduction of CPU with Directx 12 The gap isn't that big it's slight" - You're lifegameisbullshit! So the PS4 CPU won't get less overhead in the future? Sony will just stagnate? :rolleyes: It's proven that you CAN GET MORE from the PS4 CPU. Here, read, god dammit: http://www.neogaf.com/forum... http://gamingbolt.com/subst... http://www.lazygamer.net/24...
Doesn't the PS4 have 20 CUs, 2 of which are completely disabled right now?
First of all, Sony hasn't released the official speeds of their CPU/GPU that I know of. People are going off of the default specs. MS overclocked their hardware... Sony could easily do it too.
Give developers time. Remember when the PS3 came out, they struggled with the Cell processor, but over time they overcame the mountain. We can already see that some developers are taking advantage of the ESRAM: Crytek for one, and Turn 10.
True, complicated architecture does take more time to master, just like what we saw with the PS3's Cell. But I don't think those two games you mentioned are the best examples of "mastering" the ESRAM. Both of those games had to make sacrifices in other areas in order to run, or look, as good as they do. Forza had to make pretty big visual sacrifices in order to hold 60fps at 1080p, and Ryse had to make performance sacrifices, with such narrow level design, in order to pull off such great visuals. Not to bash those games, but I don't think we have really seen any dev "master" the ESRAM yet, at all, definitely not with those games. Those two games would have been made rather hastily for the launch of the new gen. If anyone is going to put in the work to get the most out of ESRAM, it's going to be studios like 343i and Black Tusk, studios that have had a while with the system before putting out their first game.
Yeah, if Crytek had gone with the initial build, prior to the downgrade... it would probably have mustered an average of five frames as opposed to fifteen.
@lukas Yeah, you are right, but I do believe that all of MS's 1st-party studios, including 2nd-party devs like Playground and Remedy, will eventually lead the way when it comes to mastering the ESRAM. I expect Turn 10 (Forza 6), 343i and BT to blow us away, and if Crytek actually develops a sequel to Ryse, I think it would make the first game look like child's play. I also expect some great-looking titles from Rare; they have pulled off some fantastic-looking titles in the past.
Developers never really overcame the PS3's complexity, which is why multiplat games ran objectively worse on the PS3. Only first-party developers really had a grasp on the Cell and unlocked what it could do, and Sony's first-party exclusives looked better than the Xbox 360's. This is where I've got a problem with Xbox fans/customers. Last gen, the differences in games were a lot smaller than people will admit, and this gen the differences we're seeing are noticeably bigger, yet they are downplayed by Xbox fans as insignificant. If the small differences mattered to Xfans last gen, you don't get to ignore and dismiss the huge differences in the games we're seeing now.
All the ESRAM did for Crytek on the X1 was make a company that likes to push specs and power to get top-notch graphics and performance out of their games underachieve, with Ryse running below 1080p and the framerate dipping into the teens. They aren't happy about the ESRAM, nor do they embrace it; they have to work with it.
OK guys, listen. Ryse runs horribly, is extremely linear, and is just so goddamn boring to play. It is embarrassing that anyone would tout it as a showpiece of the Xbox One's capabilities.
I don't get why everyone is seeing "ESRAM is a factor" in the title, and reading into it as "ESRAM is a problem". Read the article, he's saying that ESRAM throws the on-paper specs off. That's not "ESRAM makes it even worse". He's implying it can mitigate some of the raw spec differences between the GPUs.
Buying a console for graphics these days is like buying pancakes for shoes.
Haha, best comment this week (and so true)
Yea, or about as ridiculous as buying a graphics card for graphics, huh?
I wasn't saying the new consoles are pointless, I'll be picking them up in the future. But the whole fickle argument between fanboys about graphics is pointless, you buy consoles for the games and your preference.
If that's the case, then why upgrade to the Xbox One at all, when the games that are on both the 360 and X1 look and run almost identically? Graphical fidelity and framerate do matter, even on consoles, because you can't build a $400 PC and get the same performance you're getting out of the PS4. Suddenly, now that the X1 can't outperform the PS4, people who back MS are saying "graphics don't matter". It's hypocrisy, plain and simple.
Copen, agreed... I've never understood that notion of console gamers NOT caring about graphics and performance. Why upgrade at all? Of course we cared about graphics and performance. Those saying different should go check the DF and LoT articles from last gen. After sifting through some of the comments, they should come back and tell us graphics and performance didn't matter. After that, they should try to explain why some who used to laud the differences are now playing the morality card, and when they've finished doing that, they can explain why those who are now screaming 'resolution doesn't matter' flock to and champion any rhetorical gap-closing article.
I assume, then, that you still only own an original Xbox and play games on a 24-inch CRT TV that weighs 100 lbs. Do you still watch movies on VHS? Funny how all of a sudden graphics don't matter, yet they're one of the biggest driving forces behind the evolution of video games. I heard denial is a nice place to live.
I didn't say graphics don't matter, but whether a game is good depends on numerous other factors, such as gameplay, story and fun. If people only play games for the graphics and nothing else, then I don't see why they haven't bought a PC.
@ainsleyharriott Hmmmm, so because we care about graphics we should get a PC? OK, can you please link where I can buy Infamous: Second Son on PC? I will pick it up.
@magoo You obviously didn't read my other comment then, did you? To quote myself: 'But the whole fickle argument between fanboys about graphics is pointless, you buy consoles for the games and your preference.'
All those technical things are useless when the games show big differences. I guess there are some devs who are pro-Xbox, and I am not surprised.
Which games are you talking about that have "big" difference between the versions?
The Order: 1886! I have been unimpressed by the graphics on the Xbone. Ryse is the only game on the Xbone I would consider next-gen, and it's 900p with a horrible frame rate. The Xbone has no game close to the graphical level of The Order; Ryse doesn't come close. The Order will be the start of true next-gen on PS4. Can't wait for more of The Order and Uncharted 4, a new console graphics revolution, the beginning of true next-gen gaming. Quantum Break and Sunset Overdrive look like last gen 2.0; neither has better graphics than Infamous:SS. Instead of disagrees, please explain why I'm incorrect.
So the only examples you have are 2 games that aren't out yet, one of which we haven't seen anything of. Come on, I know you can do better than that. Sunset Overdrive looks great for the cartoony look they're going after, and we've hardly seen anything about QB. Of course the PS4's graphics are better, but it's nothing like you make it out to be. Listening to some of the fanboys, you'd think the Xbox is almost unplayable because of how bad it looks.