IR: "It’s a well known fact that the Xbox One has a pretty slower clock speed. In fact both the new consoles have slower CPUs."
I'll wait & see
Informative vid. Not writing anything off, but I am cautiously excited about the potential cloud and DX12 capabilities. @Notorious you're right, it's simply a wait and see.
Nice vid. DirectX 12 will help optimise the XB One's graphics, but it would be incorrect to think that the advantages of cloud compute can make the Xbox's graphics run as if it had more powerful hardware than the competitor.
DX12 I am eager to see what it can do. The Cloud, however, I'm skeptical about. I need a working example of it on the Xbox One, or at least a hint of it coming to the X1 anytime soon. The term "the Cloud" has become a bit of a running joke when it's associated with MS. It'll take a real, practical working example for it to be taken remotely seriously.
I believe it will improve X1 games, but I have trouble accepting it will be anything like MS (and fanboys) hype it to be. Truth is, I'm pretty happy with the X1 now, so any improvements the Cloud/DX12 can offer are awesome in my book. I'm inclined to agree with the wait 'n see philosophy :/
@angelice Yup, I'm waiting to see what it can do. Hope we see some things at E3.
I'm gonna go with what Microsoft said: DX12 will improve performance by making CPU computations more efficient by spreading jobs across multiple cores. The result is a situation where the framerate can be increased - because you are spending less time waiting on the CPU. Everything else is complete speculation because it is only theory and is not based on anything that Microsoft has said themselves.
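The multi-core claim above can be sketched as a toy example (Python here purely for illustration; DX12 itself is a C++ API, and the "job" here, visibility culling, is my hypothetical stand-in for any per-frame CPU work): splitting independent jobs across worker threads instead of one submission thread is the mechanism Microsoft described.

```python
from concurrent.futures import ThreadPoolExecutor

def cull_objects(chunk):
    # Hypothetical per-core job: decide which scene objects are visible.
    return [obj for obj in chunk if obj["visible"]]

# A made-up scene where every other object is visible.
scene = [{"id": i, "visible": i % 2 == 0} for i in range(1000)]

# Split the work into one chunk per "core" (4 here) -- the kind of
# distribution a single-threaded DX11-style driver could not do.
chunks = [scene[i::4] for i in range(4)]

with ThreadPoolExecutor(max_workers=4) as pool:
    visible = [obj for part in pool.map(cull_objects, chunks) for obj in part]

print(len(visible))  # 500 objects survive culling
```

The point is only structural: the total work is unchanged, but no single core has to do all of it before the GPU can be fed.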
@Thantalas, cloud compute can make graphics run better, and here's how. Say you're running a game, and 50% of your RAM is working on rendering, 25% on physics, and 25% on AI (the percentages are completely arbitrary). Now suppose you offload your physics and AI onto cloud compute; the 50% that was focused on those other tasks is freed up to do more rendering. BOOM, graphics have been enhanced.
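The reallocation argument above is just budget arithmetic; a minimal sketch, using the commenter's own arbitrary percentages (not real console numbers):

```python
# Toy resource-budget model: percentages are the commenter's arbitrary
# example, not measured figures from any console.
def offload_to_cloud(budget, tasks):
    # Remove the listed tasks from the local budget and hand their
    # share back to rendering.
    freed = sum(budget.pop(t) for t in tasks)
    budget["rendering"] += freed
    return budget

budget = {"rendering": 50, "physics": 25, "ai": 25}
print(offload_to_cloud(budget, ["physics", "ai"]))
# {'rendering': 100}
```

Note this models only the freed local share; it says nothing about the latency or bandwidth cost of getting results back from the cloud, which the reply below raises.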
@balcrist, one problem though: online games generally have no AI, as people prefer to play against real people rather than bots, and when they do play against bots that is usually handled offline. Which means the cloud would only work for offline games by forcing them to be online, and that brings us back to the whole DRM scenario MS wanted in the first place. How do you know they are not trying to use the cloud to lure you back in?

You may get some added benefits, but even if 25% were freed up, it would then likely cost another 15% to reinterpret the results back into the game. There are just too many holes, and that's without taking different connection speeds into account. Will slower connections cause tearing in your single-player games, or will they stop working altogether if your connection is lost? And what happens when they shut the "AI" servers down? No more AI in your games, which means you can no longer play them. People need to think about what clouds will bring, because I highly suspect it is rain.
Definitely a wait-and-see moment right here!
Tbh I've yet to see any real evidence as to why it will not improve xb1. The proof will be in the pudding though.
I honestly want to see the cloud capabilities in action, I want to know what they are able to do with it and I'm not going to take any word over seeing it myself. As for DX12, that is easy to see what kind of improvements it can make.
Quote from the article: "DirectX 12 along with eSRAM could resolve the 1080p problem. I know that eSRAM is the major culprit behind Xbox One's inability to run games at full 1080p. Some may argue that the amount is less, after all it's just 32MB. But eSRAM has an extremely high bandwidth of 204 GB/s..." WTF??? How in the world can a DX API boost memory bandwidth? 204 GB/s is the theoretical bandwidth, and only with a simultaneous read and write on every cycle. The problem is, eSRAM can only do that on every 8th cycle, so the peak figure has little practical use.
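The gap between peak and realistic bandwidth in the comment above is easy to put in numbers. A back-of-envelope sketch under my own assumptions (a 1024-bit eSRAM path at 853 MHz; these are assumed figures, not official specs, and they land near but not exactly on the quoted 204 GB/s, which itself shows how sensitive the headline number is to duty-cycle assumptions):

```python
# Assumed (not official) figures for the eSRAM interface.
CLOCK_HZ = 853e6                      # GPU clock, assumed
BYTES_PER_DIR_PER_CYCLE = 1024 // 8   # 128 bytes each direction, assumed

# Marketing-style peak: a read AND a write land on every single cycle.
peak = CLOCK_HZ * BYTES_PER_DIR_PER_CYCLE * 2 / 1e9

# The commenter's scenario: the extra write only lands every 8th cycle.
effective = CLOCK_HZ * BYTES_PER_DIR_PER_CYCLE * (1 + 1 / 8) / 1e9

print(round(peak, 1), round(effective, 1))  # 218.4 122.8
```

Under these assumptions, the realistic figure is a bit more than half the peak, which is the commenter's point: an API cannot change either number, only how efficiently the available cycles are used.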
Bandwidth cannot be increased, but with an improved API the amount of data being sent can be optimised and even compressed. Instead of, for example, sending a half-empty data 'box', you send a full box, and/or remove unneeded duplicate content. Some data might also be used by both the GPU and CPU; instead of writing it twice to different memory, you write it once to a shared virtual memory pool.
What's so useless about using eSRAM? It's like the eDRAM in the Xbox 360: you put the framebuffer in it, apply effects like anti-aliasing or particles, then pass it off to be displayed. It's similar to the 360's eDRAM in that it has logic units surrounding it rather than just being a chunk of fast memory, but it can only be accessed by the GPU... The only problem I see with the Xbox One is DX11. It's always been a temporary solution until DX12 was finished, and when it's released we will all see the Xbox One will not have any more problems running games at 1080p @60fps.
Quote: "The only problem I see with the Xbox One is DX11... it's always been a temporary solution till DX12 was finished, and when it's released we will all see the Xbox One will not have any more problems running games at 1080p @60fps."

DX12 reduces CPU overhead, and the CPU has nothing to do with resolution. DX12 is just software and will not change anything on the GPU. The hardware specs of the GPU remain the same, and the major factor for any framebuffer resolution is the ROPs (Render Output Units). The Xbone GPU doesn't have enough ROPs for rendering games @1080p with decent graphical elements. The eSRAM size is a problem as well. The Xbone WILL HAVE problems in the future with graphically demanding games @1080p/60fps.

Quote from one article: Ian Bell took it upon himself to confirm their frame rate target for both the PS4 and Xbox One. Speaking about the frame rate, he said: "We're still aiming for 60 FPS on those consoles." Here "those consoles" refers to the PS4 and Xbox One, not the Wii U. When questioned whether 60 fps is possible at all on the PS4/XBO, he replied: "We're already there on PS4, so high : )" http://gearnuke.com/project... So, the PS4 version is very likely 1080p/locked 60fps. And the Xbone???
Deferred rendering needs large framebuffers. Killzone 2 needed 36 MB for its G-buffer, which is larger than the ESRAM, and that is at 720p. Page 18: http://www.slideshare.net/g...
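One plausible way a ~36 MB G-buffer figure arises at 720p, as a quick sanity check. The breakdown here (2x MSAA, five 32-bit buffers: four colour targets plus depth) is my assumption for illustration, not taken from the slide deck:

```python
# Assumed G-buffer layout: 720p, 2x MSAA, four 32-bit colour render
# targets plus a 32-bit depth buffer. These assumptions are mine.
width, height, msaa = 1280, 720, 2
buffers, bytes_per_sample = 5, 4

gbuffer_bytes = width * height * msaa * buffers * bytes_per_sample
print(gbuffer_bytes / 1e6)  # 36.864 MB -- already larger than 32 MB of ESRAM
```

Whatever the exact layout, the arithmetic shows why a full-fat deferred G-buffer at even 720p can overflow a 32 MB pool, which is the commenter's point.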
@cchum True, Sony has the advantage with 'normal deferred rendering' as it requires large framebuffers. Probably for this reason Ryse used 'tiled deferred shading', as this takes up fewer resources compared to standard deferred rendering. I also believe Project CARS will use ESRAM for deferred rendering; not sure if they will use the same solution as Ryse for the large framebuffers. edit, sources on tiled deferred shading: http://www.crytek.com/downl... http://www.dualshockers.com...
Slightly off topic, but looking for clarification on this: is that the same Ian Bell who once worked with David Braben to create the amazing game 'Elite'?
Leave it to the professional software engineers, and keep the uneducated guesses to what you actually "know" works and how... ;)
@cchum Render targets must be prioritized to either the ESRAM (204 GB/s) or the 5 GB DDR3 memory pool (68 GB/s) on Xbox One; that is how you achieve 1080p at 60fps on it. New SDKs and DX12 will make this much easier. ESRAM was also designed for hardware-based Tiled Resources, which will eliminate the size problem that is present atm and will be utilized in further game development. There is only so far optimization and processing power can take games, so tiled streaming will be very important for game devs on these consoles in the future.

@imt558 And don't act like you know exactly what DX12 will do for the console, because you don't; there is a lot of undisclosed info. You are neither a dev nor the creator of DX12, so why don't you stay off Xbox articles, you troll. And if you really don't think an API affects how efficient a GPU is, then you're ignorant. Mantle and DX12 are new low-level graphics APIs specifically geared for the Graphics Core Next (GCN) architecture. http://www.legitreviews.com... Quote: "Raja Koduri today announced that AMD will be supporting the DirectX 12 API on all of their GCN (Graphics Core Next) hardware solutions. This is good news for owners of AMD Radeon graphics cards, AMD APUs and even Xbox One game console owners that use GCN hardware. This means that AMD will have full compatibility with DirectX 12 on day one and be able to give users an instant performance boost on DX12 applications thanks to the lower API overhead."

You see that, "even Xbox One game consoles"... So DX12 will remove CPU overhead and make the GPU more efficient, and TR/PRT with ESRAM will make the Xbox One a much better console. http://gamingbolt.com/graph... Quote: "He stated that, 'DX12 continues to build on DX11.1+ and as such, also includes the Tiled Resources feature. DX12 is however closer to the metal and gives more control to the developer.'"
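The "prioritize render targets into ESRAM" idea above amounts to a placement problem: put the most bandwidth-hungry targets in the small fast pool and spill the rest to DDR3. A toy greedy sketch (target names, sizes, and bandwidth scores are all made up for illustration; real engines make this call per-pass and can alias or tile targets):

```python
# Hypothetical greedy placement of render targets into a 32 MB fast
# pool; spillover goes to the large slow pool. All numbers invented.
ESRAM_BYTES = 32 * 1024 * 1024

def place_targets(targets):
    """Fill ESRAM with the most bandwidth-hungry targets first;
    everything that doesn't fit spills to the DDR3 pool."""
    esram, ddr3, used = [], [], 0
    for name, size, bw_need in sorted(targets, key=lambda t: -t[2]):
        if used + size <= ESRAM_BYTES:
            esram.append(name)
            used += size
        else:
            ddr3.append(name)
    return esram, ddr3

targets = [
    ("color",  8 * 1024 * 1024, 10),   # main colour buffer, hit every pixel
    ("depth",  8 * 1024 * 1024, 9),
    ("gbuf0",  8 * 1024 * 1024, 8),
    ("gbuf1",  8 * 1024 * 1024, 7),
    ("shadow", 16 * 1024 * 1024, 3),   # read less often per frame
]
esram, ddr3 = place_targets(targets)
print(esram, ddr3)  # ['color', 'depth', 'gbuf0', 'gbuf1'] ['shadow']
```

This also illustrates why Tiled Resources matter here: if a large target can be split into tiles, only the hot tiles need the fast pool, easing the 32 MB limit.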
You wanna downplay DX12 for the Xbox One? Go right ahead, but it'll just make you look that much more stupid when it's finally released and used in games on the Xbox One...
I really believe that if you were this smart, Imt, you would never in a thousand years be on this site...
Everyone's been talking about DX12 as if it's going to allow Microsoft to catch up with or even surpass the PS4, but what makes everyone think that Sony isn't going to work on improving their APIs as well? The reality is that Microsoft screwed up on the hardware side; DX12 might help, but I foresee the PS4 remaining ahead in terms of graphical fidelity and performance.
"Everyone's been talking about DX12 as if it's going to allow Microsoft to catch up with or even surpass the PS4." The only people I have seen saying this are Sony fans. Notice that before your comment, no one was talking about the PS4.
The top of the page mentions "In fact both the new consoles have slower CPUs", bringing in the PS4. And every other article about DX12 mentions it will put the Xbone on par with the PS4 or make it better. Hell, the article says the "cloud" could make the system 32 times more powerful... And if I'm not mistaken, the Azure servers are being used for Titanfall, and that's only 792p... Software can always be updated and fine-tuned, but with consoles the hardware will never change.
@rdgneoz3 "The top of the page mentions "In fact both the new consoles have slower CPUs", bringing in the PS4. And every other article about DX12 mentions it will put the xbone on par with the PS4 or make it better." That is probably the weakest excuse to bring up the PS4, but whatever. Also, most articles talk about making the existing hardware more capable through optimization and don't even mention power in relation to the PS4. You are confusing the actual articles with the comments section, which unfortunately changes the narrative to more of a "versus" thing.
"Everyone's been talking about DX12 as if it's going to allow Microsoft to catch up with or even surpass the PS4" NOBODY is saying that. "but what makes everyone think that Sony isn't going to work on improving their APIs as well?" NOBODY is saying that either...what is wrong with you people lately?
@realness Idk what it is, honestly. They get so defensive if any news about the Xbox comes up.
Because the PS4 APIs are already very good. The DX11 API has a ton of overhead issues on both the PC and X1 side.

The PS4 also has very good, easier, and more direct access to its single unified pool of GDDR5. Devs can use Garlic to get full access to that high-bandwidth (176 GB/s) 4.5 GB of RAM immediately on the GPU side. The CPU, however, only has access to 20 GB/s of that RAM, so they have to optimize accordingly. http://www.eurogamer.net/ar...

Xbox One is very different. Render targets must be prioritized to either the ESRAM (204 GB/s) or the 4.5 GB DDR3 memory pool (68 GB/s). ESRAM is GPU-bound only, so the GPU can use both memory systems; the CPU can freely use the 68 GB/s pool but does not see ESRAM at all.

It is well known that DX11 has held back gaming performance: http://www.tomshardware.com... http://www.bit-tech.net/har... The PS4 API is already very efficient. The PS4's OpenGL-based API, like the PS2's API and the PS3's libGCM, offers much more low-level access to the hardware without needing to go through nearly as many abstraction layers as DirectX does. http://www.eurogamer.net/ar...
I actually learned tons from your very informative comment, and I found it quite interesting to read as well. Thanks! Bubble up for being "Interesting" :)
Sony's 8 gigs of GDDR5 in the PS4 is great; just because of that it should be a winner for years to come. I would have gone a different route, though, and used 6 to 7 gigs of it for gaming, with maybe 2 to 3 gigs of DDR3 for the OS. The split of 5 gigs for gaming and 3 gigs for the OS was the wrong decision. GDDR5 is best used for graphics; it's kind of a waste using it for the OS.
@KNWS Like on the PS3 (unsure about the 360), Sony can reduce the OS footprint over time, and I see no reason why MS can't do the same with the Xbox One. Over time I think we'll see more RAM allocated to games. It's not yet required though; 4 GB for games is more than enough right now ;)
But what happens when the multiplats and games for X1 hit 1080p/60fps then what? Where can the "catch up" go from there?