DirectX 12 allows developers more control over how they manage their resources, but this also means they have to do a lot of work upfront.
Will we see DX12 tests performed on Xbox One software, or will the only testing we see involve the DX12 API being used on PC software?
I'm in the Fable beta, which uses DX12, and I can say hands down Fable is by far the best-looking next-gen game graphically. Better than Ryse and The Order; I played and own both games, and mind you, Fable is in BETA and it looks and plays that good. Edit: I'm in the Xbox One beta.
On which platform? I thought DX12 was coming only with Windows 10 later in the year?
I disagree. Ryse is far better looking, and yes, I'm in the beta too. There's a lot going on graphically, but the textures are murky and the character models aren't detailed enough. Plus the gameplay is very basic. The game is very similar to the DA: Inquisition multiplayer mode, but without the awesome single-player campaign. A good decision by Lionhead to go free-to-play, imo, based on what I've seen. But it was only a beta.
So tired of word-flipping, clickbait, console-war-fondling articles. Even after the boss of Xbox REPEATEDLY said it won't do much, these articles still come. Mixing PC and Xbox, and posting a pic of an Xbox instead of a PC, even though PC is the thing DX12 will impact most. Man, and y'all sit there and eat this stuff up. DX12 will be great for PC and help a little for Xbox. It's not a game changer, and who cares other than fanboys? Enjoy games and stop feeding this very old beast... I'm waiting for more cloud articles... I bet those will kick in when DX12 launches and is seen to be exactly what Phil said it would be for Xbox... minor.
@Kingthrash360 Do you not see how every slide shown in this article is stamped with the Xbox logo in the corner? Phil Spencer never once said it won't do anything for Xbox; he said the changes won't be dramatic. 12-15 fps gains and easily achievable 1080p resolution are not dramatic changes, they are just performance gains, but they are definitely welcome.
^If you think the AMD 7850M in the Xbox One is going to magically start hitting 1080p and/or 60fps across the board... because of a new API... I don't know what to say... The Xbox One already uses a to-the-metal API... so does the PS4... PCs, other than the handful of games using Mantle, DO NOT... A switch from the ridiculous DX11.1 to something like Mantle or DX12 very much does give 10 to 15 fps reliably on a PC strangled by Windows... but it won't come anywhere near that on a console, as their current APIs are already heavily optimized... Marketing and hollow promises in regards to DX12 and the XO... I 100% promise... Haha, 12-15 fps on an Xbox One 'isn't dramatic'... dude, that is like a 50% jump in frame output; MS would be ALL over that if they could be... 'Easily achievable 1080p'... do you have any idea how big a deal it would be PR-wise for the Xbox division to accomplish that? They'd be all over it, but the sad truth is that an API switch two years in isn't going to make it happen... No Xbox product is going to get 1080p/60fps as standard until the Xbox One is replaced...
Please add me to the beta list? Sopoem is my gamertag.
@Kleptic "xbox one is going to magically start hitting 1080 and/or 60fps across the board...because of a new api...i don't know what to say..." Um... the "Xbox One" doesn't "hit" 1080p/60fps... games do. Resolution and frame rate are GAME dependent, not system dependent. E.g. Forza 5 ran 1080p/60fps, with many limitations, effects turned off, etc. That is a choice a developer makes, just like it's a choice on PC when a consumer decides they want "1080p 60fps" settings in their games.

Developers are having a hard time on XONE not because it's weak as such, but because it's "weaker" in comparison to the PS4. Thus, they can make the game with XONE in mind and port to PS4, which adds to the PS4 having the better res and frame rate, a la porting from last gen to current gen. PC hits those settings quite easily most times due to overpowered systems running underpowered games: we had a gen with lots of games that didn't really require beast rigs, so you could max most games out with mid-range cards. XONE games are being maxed out, so when ported to PS4 they get the fancy treatment of 1080p/60fps, or whatever the developer can manage while maintaining graphics, textures, lighting, etc.

"No xbox product is going to get 1080/60fps as standard until the xbox one is replaced." Again, you don't really know what you're talking about. 1080p/60fps is GAME dependent, not system dependent. Developers choose their targets; it's not that they can't hit them. Have you considered that all the developers with 900p games have seen what they look like at 1080p? Have you considered that it's NOT as good-looking as at 900p, or at 30fps, etc.? Have you not considered that developers are purposely maxing out the systems beyond the point at which 1080p/60fps would actually be the LESSER result? A rig that MAXED OUT L4D at 1080p/60fps (i.e. it couldn't do any better than that) would NOT be able to do the same with something like Crysis...
Thus, the developers are choosing to create Crysis at 900p/30fps versus something on Source at 1080p/60fps. They are choosing to create a more demanding game... If XONE were MORE powerful than it is, it would still get those same resolutions and frame rates, as those are game dependent, and the team could just make a BETTER engine that was still more demanding. You're not really understanding that those settings come about because teams choose to focus on graphics, textures, lighting, etc. versus... a number. A more demanding engine will always look better than a lesser one; it's just the way it is in gaming. More effects will always come at a cost, and no developer wants to waste time on a 1080p/60fps last-gen-looking game. Most don't care; they want next-gen engines, not outdated engines...
Then why is the resolution difference happening? Why are third-party games, Hardline for example, running 900p at 60 on PS4 and 720p at 60 on XBO, if it's game specific? Again, I don't really care... it's just that all your explanation did was not explain why we are having this conversation in the first place. Many... MANY third-party games have had this problem. Yeah, Forza hit 1080 at 60... but at what cost? Forza Horizon didn't hit 1080 at 60... and cost? Cost, really? MS paid for Titanfall, an online-only FPS with OK graphics, no destruction, 6v6, low-end bots, very little content... at 792p? While a game like Battlefield has 64 players on much larger maps with destruction... yada, yada, yada, at 1080p on PS4, while X1 was at 900p or 720p... You are all wrong on that, bro.
@Yetter Man, if you think 12-15 extra fps is a minor increase, what do you consider a major increase? 60? I've seen some PC guys in here buy brand new GPUs and get 10-20 fps extra, and they're stoked.
@Kleptic Little correction here: the Xbox One uses an HD 7790 and the PS4 an HD 7850/7870, and it's not even the 'M' mobile version being used.
Really!?!? I'm sorry, I don't think it's better than Ryse and The Order.
@kumomeme My mistake, but the reality is that neither console is directly comparable to any off-the-shelf PC parts GPU-wise... and a 7800-series mobile GPU from AMD is, in most cases, outperformed even by 7770 desktop GPUs, just because of TDP figures and limited clock speeds on the mobile versions... And EDMIX: a big long post that went in circles... I never declared 'why' a particular game is 1080p @ 60fps... I fully understand that is a developer decision... All I said is that a low-to-mid-range GPU will NEVER... ever... set up a situation where it pumps out 1920 by 1080 pixels native... at or near 60 times per second... on modern games... at a frequency that makes it a standard... and no API switch is going to change that, either... Of course a developer could make a game (or games) in which the above is true, but it'll come at the cost of everything 'modern' about it... so it won't happen very often... exactly like last generation... the 'cinematic' effects and everything are far more important to developers, and apparently to 'us', so they will always be pushed harder than native res and frame rates...
It would be nice to see some benchmarks so we could put some of this arguing to rest. One thing is for sure, though: DX12 will make a difference on the console. We now know that it will absolutely improve eSRAM and CPU performance to some degree. What we don't know yet is how the CPU/GPU relationship will be affected by the direct link shown in the Microsoft slides, or why the GPU is split. We also don't understand to what degree developers can take advantage of the new API to get the same effects using less code. To analogize, look at the difference between UE3 and UE4 running on the same hardware, and you can see that coding can significantly impact visual quality in a closed system. There is no secret sauce? Maybe not. Perhaps some people just haven't read the label on the bottle.
@TheCommentator Why is the GPU split? Good question, because I've been wondering the same thing for a while. I think the GPU is split for a reason unknown to us, but it can be used in a number of ways. I thought about one possibility: I asked myself how the X1 talks to MS's cloud tech. According to a recent X1 spec sheet on the internet, the X1 has a two-channel GPU. Perhaps one of the channels handles incoming compute instructions for MS's cloud tech, while the second channel is used for the X1's hardware. Perhaps the two channels on the X1's GPU are used as some sort of bridge to combine the cloud and X1 compute processes. There might be other uses...
MS went out of their way to show only PC. They even said it won't have a dramatic effect on console. Asked about improvements, they said it was for PC. So on what possible basis do you conclude that it will "absolutely improve" it? That's like saying: I have no supporting data or facts, and there's data that actually disproves it, but I'm going to say it will "absolutely" work.
@GameNameFame I guess you missed GDC, where it was stated that eSRAM would see a 15% boost with Win 10/DX12? It has also been confirmed that this update will reduce CPU binding by allowing the system's cores to talk simultaneously with the GPU. As I stated, both will absolutely improve performance; the only thing up for debate is how much. I clearly pointed out everything else as speculation, since none of us can answer those questions. Not even an angry troll.
@jhoward There are a couple of possibilities.

1. It's a derivative of AMD's DirectGMA, which allows DMA between the GPU and memory. Typically this would be done with on-board graphics card memory, but it could also work with shared memory. It allows DMA read/write access to happen concurrently, which has already been stated as possible on the X1.
2. It allows for more efficient data transfer between the CPU and GPU for GPU compute. Not sure how this plays out in an APU, but thinking off the top of my head, it's a logical guess.
3. It reduces latency in non-traditional memory management processes. Pretty technical, won't go into detail.
4. It allows direct access to other parts of the system not directly tied to system memory (move engines or other co-processors on the board).

Multiple memory channels are not uncommon, and I'd be more surprised if they weren't there in an APU. A GPU channel is a path between the CPU and GPU, so it would make sense that in a system that can work with GPU compute, multi-channel would be there to allow for simultaneous read/write operations and allow independent access to the memory between the CPU and GPU... sometimes referred to as hUMA. Edit: I'd like to state that I'm not saying it's any of the things above, just possible reasons based on my understanding of how computers work. Take them as a starting point for your own research if you so choose. :)
Why is this being disagreed with? I would LOVE to see this information, if only to shut up both sides of the argument: just show concrete details instead of allowing everyone on either side to keep pushing their end of the agenda.
Well, someone didn't read the end of the article: "Using DirectX 12, the developers will have the final word on where and how they want to utilize their resources, but eventually they would need to do a lot of work in providing high-level information to the application. But there is no doubt that there are some serious performance gains to be had via DirectX 12, AT LEAST ON THE PC!" Well, it will be a game changer... ON PC!
Why does it matter so much to you? X1 and PC will get a boost... PC is obviously superior to the PS4/X1, but people keep downplaying the X1's medium gains. Wouldn't any gamer want better FPS, more pixels, and more characters on screen? Why troll, why? PS4 is winning, but for some reason I feel this lead isn't as "secure" as many portray. Stop being silly.
Yeah... Fanboys like you try to cock-block positive stories.
It's so astonishing to see people say it will improve PC but not XB1, smh. First, it is a new API, so right there you get new features and better efficiency. Yes, PC gains will be bigger due to PC not being as close to the metal, but XB1 will have substantial gains too. DX12 is more about all the CPU cores being used equally instead of one doing 70% of the work, so this has nothing to do with being to-the-metal as much as it has to do with unused cores being used. DX12 brings bundles; can XB1 do that now? No. And those bundles can make thousands of draw calls with one command. Tiled resources tier 3 comes with DX12. Also, eSRAM gets PIX, which gives a 15% boost, and eSRAM also gets its own freaking API. Keep saying DX12 won't do anything; it won't change the fact that it does. Notice that all the slides from this article and all the GDC DX12 conferences have Xbox and Windows on the bottom, meaning it applies to both: http://gamingbolt.com/wp-co... Seriously, MS makes both DX12 and the XB1; they have said they knew what DX12 was doing when they built the XB1, and Phil said the XB1 gets full DX12.
DX12 will help XB1 as much as any API update for a console that already has DX 11.x low level access.
As long as the Xbox One has games that are as good as The Order, which it will, then what does it matter if DX12 or the cloud might not be all it's cracked up to be? Articles that say the PS4 is capable of much more than The Order get positive feedback, which is all right, because we're only in the second full year of the generation. But when Phil says DX12 won't triple the power of the Xbox, Sony fans say the Xbox is already maxed.
DX12 is good for PC. I'm a PC gamer and I'm very excited about it. For the Xbox One? It won't change anything. The Xbox One is not a PC; it's a gaming console that is already 99% optimized for gaming. The PC is a multitask system. It's not just for games; heck, even the most powerful rigs usually leave 30% of their performance on the table, because it's really hard to optimize for every PC in the world, unlike the Xbox One, where the hardware is the same in each and every Xbox One out there.
...Hey... you... post more! Agreed! Someone who knows what they are talking about, finally! Most of N4G knows very little about PC gaming; they think a system makes the "1080p 60fps", as if it's not a movable figure that can be altered or changed. I.e. "XONE can do 1080p 60fps", lol! I've never heard someone say this about a PC, that it can, um... "do" a set setting. Never mind quality, never mind settings, never mind effects... just do the res and frame, lol. I want to sell them volcano insurance sooooo bad!
What are you talking about? DX12 brings new tools, new methods of doing things. The tools allow the CPU to do more work with less power. Right now the XB1 has 7 cores that can access the GPU; one does 70% of the work, the others around 10-20%. DX12 splits the work out equally across all cores. So now you have 7 cores each doing 40%, and devs can ramp it up to 60% each, and so on. This alone will give it a boost and is something that can't be done now, but DX12 enables it; this is baked in, man. eSRAM gets a 15% boost; this also will make a difference, and gains are gains, man. Then eSRAM gets its own API, helping devs use it better, and who knows what that API will enable. Can XB1 do bundles now? Nope; DX12 enables bundles, another new thing. Do you even know what bundles do, how many thousands of draw calls they allow with one command? What about tiled resources through eSRAM? Please, just stop.

People like to say "close to the metal," yet they don't know what it means. Yes, XB1 is close to the metal, but do you realize how much work and code is required to do it? DX12 simplifies the work and at the same time gets devs closer to the metal, even on XB1. I suggest you watch the GDC DX12 conferences and study the slides before making nonsense claims. If DX12 did nothing for XB1, why even bother putting it on there in the first place? Build is coming up, and even more gets revealed there. Read up, bro.
We'll see about Xbox. Can't wait to see how it helps my PC games.
Like everything else related to promises for this new gen....let's wait and see. I'm tired of lofty promises from both Sony and Microsoft about the offerings of this gen. Let's wait and see.
Once again, it won't help the Xbone; now carry on.
I'm really tired of the so-called secret sauce comments... There is no secret. M$ developed an award-winning console in the Xone. When information was first leaked about the console, developers called it a monster, describing it as effectively having two CPUs and two GPUs. Now why is that? Well, the key word is "effectively." Unlike consoles of the past, or even most PCs, the Xone can't be measured by just its CPU and GPU. Those 15 co-processors do something: 7 are dedicated to sound, then there are 8 others, not including the move engines.

My point is that, unlike the X360, PS3, or even PS4, when programmed properly the Xone can offload a massive amount of work that on the other consoles would have been assigned to the CPU or GPU. This is why those super-early leaks said what they did. Fast forward to 6 months before launch, and all the way up to launch, and M$ still had developers using weak SDKs and bad programming tools that had no chance of taking advantage of the hardware in the box. The Xone wasn't even designed for DX11; it was designed for DX12. So many of the features in the box have never been used and can't be utilized without DX12, or without actually writing to the metal, which no developer has done to date. The truth is we have no idea how much DX12 will improve the Xone, but let's give it a chance to do it without dismissing it as secret sauce.
This is pure misterxmedia garbage.
It will help the Xbox One, no doubt. However, Sony isn't going to sit around and watch the Xbox One improve leaps and bounds over it. I like how the Xbox One has evolved over its life cycle, but it will never be stronger than the PS4. As long as the exclusives come and we don't get another delay, 2015 looks to be great for both.
Lol, tell that to EA and so on. The PS4 is just much easier to work with from the start; that's what the studios themselves have said. This DX12 news is just a silly cover for the problems of a machine that, two months before launch, needed to overclock its CPU and GPU... I don't understand what Microsoft was thinking, bringing out such an underpowered machine when they have so much more money than Sony. If we compare the 360 to the Xbox One in terms of specs when each came out, the 360 crushes, destroys, and burns the Xbox One to the ground: CPU, GPU, RAM, ease of development, design, price, games. The only thing you could complain about was the lack of Blu-ray. Then they make a machine that struggles to reach 1080p in 2015. OK, res doesn't matter for some, but I just can't understand what they were thinking, and neither can the millions that jumped from the Xbox 360 to the PS4, and that's a fact we see each week in the sales.
DX12 is a big benefit for PC gamers.
The denial from you Sony fanboys on this post is unbelievable. You all keep cycling back to Phil Spencer's Twitter quote about "dramatic changes." Well, here are a couple of other quotes from Phil Spencer and others on the DirectX team, from Twitter. Phil: "DX12 will have impact on XBOX One games written for DX12. Some DX12 features are already in XBOX One but full DX12 coming." Phil: "We knew what DX12 was doing when we built Xbox One." And one of my favorites: Syned69 (@Blight7): "@DirectX12 can't wait to hear how dx12 will work with Xbox one" DirectX 12 (@DirectX12): "@Blight7 Oh, we know. :)"