IR: "It’s a well known fact that the Xbox One has a pretty slower clock speed. In fact both the new consoles have slower CPUs."
I'll wait & see
Informative vid. Not writing anything off, but I am cautiously excited about the potential cloud and DX12 capabilities. @Notorious you're right, it's simply a wait and see.
Nice vid. DirectX 12 will help optimise the XB One's graphics, but it would be incorrect to think that the advantages of cloud compute can make the Xbox's graphics run as if it has more powerful hardware than the competitor.
DX12 I am eager to see what it can do. The Cloud, however, I'm skeptical about. I need a working example of it on the Xbox One, or at least a hint of it coming to the X1 anytime soon. The term "the Cloud" has become a bit of a running joke when it's associated with MS. It'll take a real, practical working example for it to be taken remotely seriously.
I believe it will improve X1 games, but I have trouble accepting it will be anything like MS (and fanboys) hype it to be. Truth is, I'm pretty happy with the X1 now, so any improvements the Cloud/DX12 can offer are awesome in my book. I'm inclined to agree with the wait 'n see philosophy :/
@angelice Yup, I'm waiting to see what it can do. Hope we see some things at E3.
I'm gonna go with what Microsoft said: DX12 will improve performance by making CPU computation more efficient, spreading jobs across multiple cores. The result is that the framerate can increase because you spend less time waiting on the CPU. Everything else is pure speculation; it's only theory, not based on anything Microsoft has said themselves.
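The "spreading jobs across multiple cores" idea above can be sketched in a few lines. This is a toy model in Python, not the actual DX12 API: it just shows a frame's CPU-side work being partitioned across worker threads instead of serialized on one core (the names `prepare_batch` and `submit_frame` are made up for illustration).

```python
from concurrent.futures import ThreadPoolExecutor

def prepare_batch(batch):
    # Stand-in for per-object CPU work (culling, state setup,
    # command-list recording). Here it just doubles each value.
    return [obj * 2 for obj in batch]

def submit_frame(objects, workers=4):
    # Partition the frame's objects into batches so no single core
    # becomes the bottleneck: the idea behind DX12-style multithreaded
    # command-list recording.
    size = max(1, len(objects) // workers)
    batches = [objects[i:i + size] for i in range(0, len(objects), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(prepare_batch, batches)  # order is preserved
    return [cmd for batch in results for cmd in batch]
```

The win in real engines is not the arithmetic but the waiting: if one core records all draw commands, the GPU idles; if several cores record in parallel, the CPU stops being the limiting factor.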
@Thantalas, cloud compute can make graphics run better, and here's how. Say you're running a game, and 50% of your RAM is working on rendering, 25% on physics, and 25% on AI (percentages are completely arbitrary). Now suppose you offload physics and AI onto cloud compute: the 50% that was focused on those other tasks is freed up to do more rendering. BOOM, graphics have been enhanced.
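The budget-shuffling argument above can be written as a toy calculation. The percentages are arbitrary (as the comment itself says), and this deliberately ignores the latency and sync costs other commenters raise below:

```python
def offload(budget, to_cloud):
    # Remove the named tasks from the local budget and hand their share
    # back to rendering. Toy model only: real offloading still costs
    # bandwidth and CPU time to merge cloud results into the frame.
    freed = sum(budget.pop(task) for task in to_cloud)
    budget["rendering"] += freed
    return budget

budget = {"rendering": 0.50, "physics": 0.25, "ai": 0.25}
offload(budget, ["physics", "ai"])
# rendering now holds 100% of the local budget
```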
@balcrist, the problem is that online games generally have no AI, since people prefer to play against real people and not bots; when they do play against bots, that's usually handled offline. Which means the cloud would only work for offline games by forcing them to be online, which brings us back to the whole DRM scenario MS wanted in the first place. How do you know they are not trying to use the cloud to lure you back? You may get some added benefit, but even if 25% were freed up, it would likely take another 15% to reintegrate the results back into the game. There are just way too many holes, and that's without taking different connection speeds into account. Will slower connections cause tearing in your single-player games, or will they stop working altogether if your connection is lost? Also, what happens when they shut the "AI" servers down? No more AI in your games, which means you can no longer play them. People need to think about what clouds will bring, because I highly suspect it is rain.
definitely wait and see moment right here!
Tbh I've yet to see any real evidence as to why it will not improve the XB1. The proof will be in the pudding, though.
I honestly want to see the cloud capabilities in action, I want to know what they are able to do with it and I'm not going to take any word over seeing it myself. As for DX12, that is easy to see what kind of improvements it can make.
Quote from article: "DirectX12 along with eSRAM could resolve the 1080p problem. I know that eSRAM is the major culprit behind Xbox One's inability to run games at full 1080p. Some may argue that the amount is less, after all it's just 32MB. But eSRAM has an extremely high bandwidth of 204 GB/s..." WTF??? How in the world can a DX API boost memory bandwidth? 204 GB/s is the theoretical bandwidth, and only with a read/write on every cycle. But the problem is, eSRAM can only do that every 8th cycle, so there is no practical use.
Bandwidth cannot be increased, but with an improved API the amount of data being sent can be optimised and even compressed. Instead of sending a half-empty data 'box', for example, they send a full box and/or remove unneeded duplicate content. Some data might also be used by both the GPU and the CPU; instead of writing it twice to different memory, they write it once to a shared virtual memory pool.
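A rough sketch of that "full box" idea, purely illustrative (this is not how any real driver packs data): deduplicate repeated content first, then fill each fixed-size box completely instead of shipping half-empty ones.

```python
def pack_payload(items, box_size):
    # Drop duplicate entries first (don't send the same content twice),
    # then fill each 'box' completely instead of shipping half-empty ones.
    unique = list(dict.fromkeys(items))  # dedupe, preserving order
    return [unique[i:i + box_size] for i in range(0, len(unique), box_size)]

pack_payload(["tree", "rock", "tree", "grass"], box_size=2)
```

Four items with one duplicate pack into two boxes instead of two half-full ones plus a redundant copy: less data moved for the same content.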
What's so useless about using eSRAM? It's like the eDRAM in the Xbox 360: you put the framebuffer in it, apply effects like anti-aliasing or particles, then pass it off to be displayed. It's similar to the 360's eDRAM in that it has logic units surrounding it rather than just being a chunk of fast memory, but it can only be accessed by the GPU... The only problem I see with the Xbox One is DX11; it's always been a temporary solution till DX12 was finished, and when it's released we will all see the Xbox One will not have any more problems running games at 1080p @60fps.
Quote: "The only problem I see with the Xbox One is Dx11... its always been a temporary solution till DX12 was finished and when it's released we will all see the Xbox One will not have any more problems running games at 1080p @60fps" DX12 reduces CPU overhead, and the CPU has nothing to do with resolution. DX12 is just software and will change nothing on the GPU. The hardware specs of the GPU remain the same, and the major factor for any framebuffer resolution is the ROPs (Render Output Units). The Xbone GPU doesn't have enough ROPs to render games @1080p with decent graphical elements. The eSRAM size is a problem too. The Xbone WILL HAVE problems in the future with graphically demanding games @1080p/60fps. Quote from one article: Ian Bell took it upon himself to confirm their frame rate target for both the PS4 and Xbox One. Speaking about the frame rate, he said: "We're still aiming for 60 FPS on those consoles." Here the consoles refer to PS4 and Xbox One, not Wii U. When questioned whether 60 fps is possible at all on the PS4/XBO, he replied: "We're already there on PS4, so high : )" http://gearnuke.com/project... So, the PS4 version is very likely 1080p/locked 60fps. And the Xbone???
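The ROP argument is just fill-rate arithmetic. A hedged sketch: the 16 ROPs and 853 MHz below are the commonly reported Xbox One GPU figures, not something confirmed in this thread, and peak fill rate is only one of several limits (bandwidth and shader throughput matter too).

```python
def fill_rate_gpix(rops, clock_mhz):
    # Theoretical peak pixel fill rate in gigapixels/second:
    # one pixel written per ROP per clock.
    return rops * clock_mhz * 1e6 / 1e9

def pixel_demand_mpix(width, height, fps, overdraw=1.0):
    # Megapixels/second a target resolution and framerate demands,
    # scaled by an overdraw factor for multi-layer rendering.
    return width * height * fps * overdraw / 1e6

fill_rate_gpix(16, 853)            # ~13.6 Gpix/s with the cited figures
pixel_demand_mpix(1920, 1080, 60)  # ~124 Mpix/s per rendered layer
```

On these numbers the raw fill rate dwarfs the 1080p/60 pixel demand, which is why the debate in this thread is really about bandwidth, ESRAM capacity, and shader load rather than ROP count alone.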
Deferred rendering needs large framebuffers. Killzone 2 needed 36 MB for its G-buffer, which is larger than the ESRAM, and that's at 720p. Page 18: http://www.slideshare.net/g...
@cchum True, Sony has the advantage with 'normal' deferred rendering as it requires large framebuffers. Probably for this reason Ryse used 'tiled deferred shading', which takes fewer resources than full deferred rendering. I also believe Project CARS will use ESRAM for deferred rendering; not sure if they will use the same solution as Ryse for the large framebuffers. edit: source on tiled deferred shading: http://www.crytek.com/downl... http://www.dualshockers.com...
Slightly off topic, but looking for clarification on this: is that the same Ian Bell who once worked with David Braben to create the amazing game 'Elite'?
Leave it to the professional software engineers instead of arguing from your uneducated ignorance of the subject and what you "only" know works and how... ;)
@cchum Render targets must be prioritized to either the ESRAM (204 GB/s) or the 5 GB DDR3 memory pool (68 GB/s) on Xbox One; that is how you achieve 1080p at 60fps on it... New SDKs and DX12 will make this much easier. And ESRAM was designed for hardware-based Tiled Resources, which will eliminate the size problem that is present atm, and which will be utilized in further game development. There is only so far optimization and raw processing power can take games, so tiled streaming will be very important for devs on these consoles in the future... @imt558 And don't act like you know exactly what DX12 will do for the console, because you don't; there is a lot of undisclosed info... You are neither a dev nor the creator of DX12, so why don't you stay off Xbox articles, you troll. And if you really don't think an API affects how efficient a GPU is, then you're ignorant. Mantle and DX12 are new low-level graphics APIs specifically geared for the Graphics Core Next (GCN) architecture. http://www.legitreviews.com... Quote: "Raja Kadouri today announced that AMD will be supporting the DirectX 12 API on all of their GCN (Graphics Core Next) hardware solutions. This is good news for owners of AMD Radeon graphics cards, AMD APUs and even Xbox One game console owners that use GCN hardware. This means that AMD will have full compatibility with DirectX 12 on day one and be able to give users an instant performance boost on DX12 applications thanks to the lower API overhead." You see that, "even Xbox One game consoles"... So DX12 will remove CPU overhead and make the GPU more efficient, and TR/PRT with ESRAM will make the Xbox One a much better console... http://gamingbolt.com/graph... Quote: "He stated that, "DX12 continues to build on DX11.1+ and as such, also includes the Tiled Resources feature. DX12 is however closer to the metal and gives more control to the developer."
You wanna downplay DX12 for the Xbox One? Go right ahead, but it'll just make you look that much more stupid when it's finally released and used in all games on Xbox One...
Really, I believe if you were this smart, Imt, you would never in a thousand years be on this site.....
Everyone's been talking about DX12 as if it's going to allow Microsoft to catch up with or even surpass the PS4, but what makes everyone think that Sony isn't going to work on improving their APIs as well? The reality is that Microsoft screwed up on the hardware side, DX12 might help, but I foresee the PS4 remaining ahead in terms of graphical fidelity and performance.
"Everyone's been talking about DX12 as if it's going to allow Microsoft to catch up with or even surpass the PS4" The only people I have seen saying this are Sony fans. Notice that before your comment, no one was talking about the PS4.
The top of the page mentions "In fact both the new consoles have slower CPUs", bringing in the PS4. And every other article about DX12 claims it will put the Xbone on par with the PS4 or make it better. Hell, the article says the "cloud" could make the system 32 times more powerful... And if I'm not mistaken, the Azure servers are being used for Titanfall, and that's only 792p... Software can always be updated and fine-tuned, but with consoles the hardware will never change.
@rdgneoz3 "The top of the page mentions "In fact both the new consoles have slower CPUs", bringing in the PS4. And every other article about DX12 mentions it will put the xbone on par with the PS4 or make it better." That is probably the weakest excuse to bring up the PS4, but whatever. Also, most articles talk about making the existing hardware more capable through optimization and don't even mention power in relation to the PS4. You are confusing the actual articles with the comments section, which unfortunately changes the narrative to more of a "versus" thing.
"Everyone's been talking about DX12 as if it's going to allow Microsoft to catch up with or even surpass the PS4" NOBODY is saying that. "but what makes everyone think that Sony isn't going to work on improving their APIs as well?" NOBODY is saying that either...what is wrong with you people lately?
@realness idk what it is, honestly. they get so defensive if any news comes up about the xbox
Because the PS4 APIs are already very good. The DX11 API has a ton of overhead issues on both the PC and X1 side. PS4 also has very good, easier, and more direct access to the single unified pool of GDDR5. Devs can use Garlic to get full access to that high-bandwidth (176 GB/s) 4.5 GB of RAM immediately on the GPU side. The CPU, however, only has access to 20 GB/s of the RAM, so they have to optimize accordingly. http://www.eurogamer.net/ar... Xbox One is very different. Render targets must be prioritized to either the ESRAM (204 GB/s) or the 4.5 GB DDR3 memory pool (68 GB/s). ESRAM is GPU-bound only, so the GPU can use both memory systems. The CPU can freely use the 68 GB/s pool but does not see the ESRAM at all. It is well known that DX11 has held back gaming performance: http://www.tomshardware.com... http://www.bit-tech.net/har... The PS4 API is already very efficient. The PS4's OpenGL-based API, like the PS2's API and the PS3's libGCM, offers much more low-level access to the hardware without needing to go through nearly as many abstraction layers as DirectX does. http://www.eurogamer.net/ar...
I actually learned tons from your very informative article, and I found it quite interesting to read too. Thanks! Bubble up for being "Interesting" :)
Sony's 8 gigs of GDDR5 in the PS4 is great; just because of that it should be a winner for years to come. I would have gone a different route, though, and used 6 to 7 gigs of it for gaming with maybe 2 to 3 gigs of DDR3 for the OS. The 5 gigs for gaming and 3 gigs for the OS was the wrong decision. GDDR5 is best used for graphics; it's kind of a waste using it for the OS.
@KNWS Like the PS3 (unsure about the 360), Sony can reduce the OS footprint, and I see no reason why MS can't do the same with the Xbox One. Over time I think we'll see more RAM allocated to games. It's not yet required, though. 4 GB for games is more than enough right now ;)
But what happens when the multiplats and games for X1 hit 1080p/60fps then what? Where can the "catch up" go from there?
http://m.techradar.com/news... What people forget is that DX12 is mainly for PCs and is just an afterthought on the XB1!
Because it's not? The Xbox One is literally running Windows 8, which is the same OS as a PC.
Oh, OK, so the Xbox One can play PC games? Do the devs only have to make a PC version and the Xbox One version would be the EXACT same? While it does have Windows 8, it is not a PC!
You clearly have no idea what you are talking about, and have no idea what/how an API works.
So you are saying the Direct X box using Direct X was an after thought and not something Microsoft had in mind when the Xbox was created? Makes perfect sense if you hate Microsoft and everything associated with it. http://en.wikipedia.org/wik... Just for fun, this is how the Playstation was created, http://en.wikipedia.org/wik...
you're in denial if you don't think x1 was made with dx12 in mind
I doubt very much that it would magically fix the resolution problems.
A fitting quote for DX12 from our good ole friend Major Nelson: "I can't wait until the truth comes out." What, we only have to wait a year or two to see if DX12 helps the Xbox One?
Makes me scratch my head and wonder WTH Larry was talking about??? There's been time enough for any and all truths to have been put out to the public by now, so what is the holdup??? It's like waiting for a secret sauce recipe guarded at the highest level of MS security before it goes public. DX12? Cloud power? Mysterious fairy-dust SDKs? We have heard all of this already; please let us in on this potentially industry-shattering secret, Larry and the rest of the MS big wigs.
Because imagination. For the moment at least.
OMG here we go again..... More secret DX12 sauce and gravy goodness served up by Phil AND Major Nelson who are both wearing accompanying chef hats.
DirectX will only help if software is the major bottleneck now and developers cannot work around current DX limitations. I'm not sure how much of the Xbox resolution problem is due to hardware limitations and how much is due to poorly optimized software libraries; the jury is still out on this. On the other hand, I fail to see how cloud computing will boost graphics... there's just way too much latency there!
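The latency objection is easy to quantify. At 60 fps a frame lasts about 16.7 ms, so a cloud round trip comes back several frames after it was requested. A toy calculation (the 80 ms round trip below is an assumed typical internet latency, not a measured figure):

```python
import math

def frames_late(rtt_ms, fps=60):
    # Number of whole frames that pass before a cloud result requested
    # this frame can possibly come back to the console.
    frame_budget_ms = 1000.0 / fps  # ~16.7 ms at 60 fps
    return math.ceil(rtt_ms / frame_budget_ms)

frames_late(80)  # an 80 ms round trip is ~5 frames behind at 60 fps
```

That is why latency-tolerant work (long-horizon AI planning, large-scale background simulation) is the plausible cloud candidate, while anything contributing to the current frame's image is not.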
"DirectX is the industry standard!" Why yes, let's just ignore OpenGL! The absolute arrogance never gets tiresome. Keep up the great work.
Both consoles will have to make performance improvements to remain competitive towards the mid and end of this 7-8 year cycle, especially when/if Steam, Apple, Samsung and co decide to enter the market with more aggressive refresh cycles and such. If anything it's good to know how Microsoft intends to do that; now it's over to Sony to say how they will improve their system as it ages.
Eh.. the PS4's API already does things like tiled streaming, and Mantle is only relevant to PC users, as the PS4 APIs are already similarly low-level (there will be no Mantle for PS4). The cloud is more interesting, but we'll have to see if it ends up doing anything particularly impressive.
This is really the first major rewrite of DX since DX9, which is why DX11.x still acts like multicore CPUs don't exist; back then they really didn't, outside of servers. As code moves toward running in parallel across all platforms, utilizing multiple cores becomes the highest priority, and that is the reason DX12 is so important. The days of the standalone GPU are ending; the era of parallel computing is upon us. DX12 is not only necessary but the most important step in Microsoft's move to that model. Basically, DX12 isn't the reason you should be excited: parallel computing is, and DX12 plays a big part in making it a reality. And yes, parallel computing will happen across all platforms. It already is to a small extent, but with APIs like DX12 making it a priority "and easier to implement", the amount of code running in parallel will increase tenfold.
@lemoncake Mark Cerny and DICE already talked about this quite a while back; there are API and core engine-level advances being made for the PS4 hardware right now. They are even porting engine-level techniques they used on the PS3 to the PS4's hardware as we speak. Sony uses a combination of DirectX and OpenGL, and I don't get why many think the measure of advanced software techniques is Microsoft's domain, or that only Microsoft could teach Sony a thing or two about software. That's what many seem to think, yet in this very video he claimed DirectX was the standard while ignoring OpenGL. He brings up AMD Mantle, so he knows his APIs, it seems. Everything always gets measured against Microsoft's API, but in the console industry OpenGL is the standard, so why is DirectX still being touted as the de facto one? That's exactly what I'm talking about: this arrogance is one of the reasons even Valve is backing OpenGL more now!
What's funny is that many current APU features originated on the CELL processor. The CELL itself has become mostly irrelevant for general-purpose computing, but most of what it started has evolved into what is being used in processors today, and will likely continue into the future. What MS is offering in its DX12 is what Sony has been offering for years with its PS3 APIs.
Another day, another "something magical will make Xbones hardware better than it physically is" article.
Every two weeks, insert article: "Why XXX could be a game changer for the XBO". 1. Titanfall 2. The Cloud 3. Microsoft Surface 4. DX12 5. Gears of War 6. HALO 5 7. No Kinect 8. Spawts, Spawts, TV, TV, TV, Spawts 9. Price drop. If this many game changers are needed after only 6 months on the market, then there's something much deeper wrong with the business model.
Bravo, have a bubble! It's funny how very few people see it from this perspective. MS have been throwing everything and anything out the PR window to try and improve their image ever since E3 last year.
I love all the arguments about DX12 when literally not a single person knows what it'll do, lol. It's like watching a bunch of kids throw crap at the wall hoping it sticks.
@daniel2115 And I just love the arrogance of viewing the DirectX API as the "standard" against which all other APIs are measured. That's quite untrue, but many gamers on here seem to think it is. There is all this talk about Microsoft fixing the API issue, but just like last generation, do people really think Microsoft is going to push the hardware? The past two cycles say no. Microsoft's head honchos would rather invest little and gather more returns than pay for a slight increase in performance; that's why they don't push the hardware, and that's exactly why all the founding-father CEOs of the Xbox platform left. It's one thing for Phil or the others in charge now to push for more investment in Xbox as a platform; it's a whole other ballpark to get it done. Let's see developers put more hardware-pushing investment into Xbox IP software despite higher costs and lower profit. Let's see if Microsoft will do that, because the cost of making hardware-pushing software goes up, not down. In all of Microsoft's history in the console market, where have they shown any real drive to do that for a "slight" increase in performance? Which brings me to the next point: why many are saying it will gain parity. If they do reach 1080p/60fps, people can claim the PS4 offers nothing over the Xbox One at that point but price. It's about perception; that's exactly what Microsoft is fighting for, not really showing what the hardware can do, because if they could, they wouldn't need to talk about it, they'd just show it. I'm all for Microsoft showing what the Xbox One can do, but the people at Microsoft have to convince those in charge to spend the extra money, and from what I've seen this is being made more for third-party benefit than for Microsoft's own internal studios, because at least third parties might in fact push the hardware. As for Microsoft's first party... well, I'm not holding my breath.
You're the kid, obviously. If you had been around in the past, you would know how much of a difference a new API can make. But you were probably still in diapers when DirectX first came out.
Did I say that DirectX would not be able to get better performance? No, and I'm not a kid; what I pointed out is the truth. Microsoft doesn't spend more resources than it has to, and they know that many gamers are happy with the results they release. Just look at Titanfall and how many were happy with those results already. Could Microsoft push the Xbox One further? The blunt truth is that many gamers have already said "you really cannot tell the difference between 900p and 1080p"; that has been stated over and over again on here. Like I said before, Microsoft's higher-ups look at profit and see the cost of a slightly higher increase in performance as not worth it. The real point is to get a certain gaming group to stop talking about resolution, so the Xbox One is "seen as just as powerful" as the PS4; it's mainly about winning over the many who feel the Xbox One is being slighted by FUD. The blunt truth of the matter is that despite the performance increase DX12+ will provide for the Xbox One, OpenGL will provide just as many enhancements to the PS4. Both will see performance gains, but to be blunt, Sony's hardware just has a higher ceiling it can reach.
I was around when DirectX first came out. It was heralded as a godsend for PC gaming, which it really was given the nature of PC graphics hardware at the time. I also remember how crappy it was, how it didn't work, and how it took a couple of years before we saw any games and hardware that took advantage of it. Since I was around for the first DirectX, I was also around for subsequent updates, and I recall how each one failed to live up to its promises, taking a couple of years before games really took advantage of it, by which time MS was usually ready to announce the next iteration that would fix all the problems of the prior version. Hell, every other release of DX was usually worthless and unsupported, only updated to try and force OS upgrades. Their bigger updates to DirectX were often buggy, in fact, and I don't doubt this time will be the same until GPU makers design their chips around the DX12 feature set. There is hope that MS customized their GPU with DX12 in mind, but I fail to see how that will matter given the low-level nature of console programming to begin with. It'll only streamline processes which are already available, not open up new ones, unless those processes are currently blocked from the developer, something which has been mentioned or speculated already due to DX11 limitations. So, a question for you: how many years are we going to have to wait until we see games that take advantage of it? What advantages will it actually bring, in particular for the Xbox? I'm all for improving APIs, being an engineer myself, but seriously, people, the improvements to games are a minimum of 1 year away, and that's being very generous. You're probably looking at 1.5-2 years before the real power of DX12 is utilized in a consumer game release.
DX12 will reduce CPU bottlenecking and API overhead so the software can communicate more directly with the hardware. DX12 also makes cross-porting between PC and XB1 much easier, since they will share the same standard API. http://blogs.nvidia.com/blo... http://channel9.msdn.com/Ev... http://www.amd.com/en-us/pr... http://gamingbolt.com/stard... ESRAM was designed for hardware-based Tiled Resources. Put your textures in the 4.5-5 GB DDR3 RAM pool as has always been done, and as is done on PS4 (all textures get thrown into the single pool of RAM). The only problem is that on X1 the big pool has lower bandwidth, so you have to prioritize which render targets go where. For X1, you put the slower, more static textures in the DDR3 pool. The ESRAM can be used concurrently for the lighting and fast-moving render targets, prioritizing "tiles" based on a pre-specified level of detail: up to an additional 6 GB of textures streamed on top of the 4.5-5 GB in DDR3, for 10.5-11 GB of textures in total. http://channel9.msdn.com/Ev... Azure Cloud just allows additional offloading in the cloud: physics calculations, lighting, etc., things that do not require very low latency. All of this together will help the Xbox One get much closer to its peak performance. Currently, more games are launching at 900p-1080p. Performance increases are already appearing on the old APIs, without the ESRAM being used as designed, and the new techniques will make all of that easier.
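The "prioritize which render targets go where" step can be sketched as a greedy placement. Illustrative only: the target names, sizes, and bandwidth scores below are invented, and real engines place targets per render pass rather than once per frame.

```python
ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB on-die pool

def place_render_targets(targets):
    # targets: (name, size_bytes, bandwidth_need) tuples.
    # Greedy rule: the most bandwidth-hungry targets go to ESRAM until
    # it is full; everything else falls back to the larger, slower DDR3.
    esram, ddr3, used = [], [], 0
    for name, size, _need in sorted(targets, key=lambda t: -t[2]):
        if used + size <= ESRAM_BYTES:
            esram.append(name)
            used += size
        else:
            ddr3.append(name)
    return esram, ddr3

placement = place_render_targets([
    ("gbuffer", 24 * 2**20, 9),  # hypothetical sizes and scores
    ("shadow",  16 * 2**20, 5),
    ("static",   8 * 2**20, 1),
])
```

With these made-up numbers the 24 MB G-buffer claims the ESRAM first, the 16 MB shadow map no longer fits and falls back to DDR3, and the small static target slots into the remaining 8 MB. That is the whole tension in the thread: the highest-bandwidth consumers rarely all fit in 32 MB at once.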
Your second paragraph completely contradicts your first. How is porting easier from PC if you have to completely change how you render the screen? Last I checked, PCs don't use a single pool of RAM for games; they have split memory. PCs also don't have an ESRAM buffer to contend with, and currently all that is managed by the GPU, so more code has to be written. DX12 will probably reduce API overhead, I have no doubt there, but DX was always overhead used as an abstraction layer so developers could code against it instead of the hardware (which wasn't constant on PC). This isn't an issue for consoles. On consoles, that abstraction layer is already much smaller, or completely non-existent in some cases. All this is even simpler on the PS4, as there is simply a single unified pool to work with, no extra step involved. The bandwidth of the GDDR5 means it doesn't really need an extra pool of faster graphics memory on the GPU. However, a simple port is still not possible regardless of what the original API was, and if simple ports are to come, don't expect amazing results. In any case, the simplicity of a port will more likely be determined by which engine is used. If the engine is designed to be flexible and not rely too heavily on DX- or OpenGL-specific functions, porting will be simpler. And, though I don't want to say this is definitively true, if developers use the PS4 as the lead platform, then porting will not be simple. There is also nothing that says devs have to use DirectX for their games on PC, as OpenGL is still a commonly used option. So really what you mean is that devs who use DirectX on PC will have an easier time porting. But I wouldn't go so far as to call it simple: any level of abstraction that DX offers has to be worked around if the devs want more optimized results on the console.
Anyone can say whatever they want. Real results have to be proved, and until this stuff is proved to work in reality, it's all just wishful thinking and hopes.
Wasn't that the reason they used a third-party benchmark to show the differences during the announcement? It's no secret that CPU-bound tasks are limited to one core with DX11, so I think the benefits in this case are pretty apparent, and the benchmarks they provided show those benefits rather well. Nothing they showed was a mystery; they laid out the differences in a very detailed and logical way.
The cloud demonstration by Microsoft was impressive. They showed a high-end PC with a very powerful graphics card struggling to do the physics (stuff blowing up); the demo was dropping frames like crazy. They then used a high-end PC, a powerful graphics card, and the cloud. Look how much better it performed the demo! The Xbox One is built around the cloud. It's a cloud-based gaming machine directly connected to many data centres around the world. The potential is there, and the demo is proof that Microsoft is going to use the cloud eventually for Xbox One first-party games. The latency should be lower than people expect, too. I don't believe you'll need a super fast internet connection, and I think Microsoft has already confirmed this. DirectX 12 improves CPU performance. The Xbox One only uses 1 core currently, according to one dev, so it'd be foolish to think the Xbox One won't perform better using the new DirectX 12. The new features coming with DirectX 12 will optimise graphics, it's a given. All DirectX releases have improved the look and feel of video games. Search out images of DirectX 10 games on the PC versus DirectX 11 games!
Dude, in that demo that PC was hardlined right to the server. Things are a bit different when data has to travel hundreds or even thousands of miles of phone line, bouncing from stop to stop along the way. You're talking about increasing a system's power over the internet and thinking it won't take high speeds to accomplish? Hate to break it to you, buddy, but if these servers were capable of pushing that much data through the internet in real time, and it was that easy, then there would be no such thing as lag and the other glitches people experience in online multiplayer now. The truth is that while the machines themselves are capable, the network isn't, and likely won't be for most of the world even by the end of this console cycle.
Why would the Xbox only use one core? All CPUs made now are multi-core, so why would only the Xbox be programmed to use just one core? And if DX12 improves CPU performance, how is that going to improve Xbox graphics, which struggle because of a weak GPU? Of course if you compare pictures of DX10 and DX11, DX11 will look better: NEW HARDWARE was required to run DX11.
Cloud is the future for electronics, but I don't think we are close yet. I'll wait and see if it does anything for Xbox this gen. As for DirectX 12: having been around in PC gaming since before there was a DirectX, I know that yes, a new API can make a huge difference. This will certainly help. I remember the shifts we saw with DirectX 3 and 9; they were huge performance-wise. On the flip side, the same is true for Sony. MS has the bigger hill to climb, but newer games will be vastly better.
I'm glad you remember DX3 and DX9 as I mentioned older DX versions up above. The immediate graphics improvements came from the optimization of the particular lines of code used to render said graphics. But here's the kicker. How long was it before we really saw games that took advantage of the DX3 and DX9 features? It was over a year in both instances as I recall, with a few exceptions where devs sent out updates to their game engine for DX9.
Obviously it will take time, but we are not discussing time here, we are discussing the improvements it will bring to the system, which COULD be significant (or irrelevant). As for how fast the improvements come: if MS puts this all out on the back end, we could see immediate improvements in games (once DX12 goes out to the X1 AND those games go to market). Otherwise, yes, it will take time. The system will be around for a while, so we have that time. Current games will suck when compared to later-cycle games.