We already know DirectX 12 will drastically improve game performance when Windows 10 ships later this year, but just how much of a "free" boost you'll get isn't exactly known.
"Before you look at the results from these tests and assume you're going to see a frickin' 10x free performance boost from DX12 games later this year, zing, zam, zow! You won't. So ease off the hype engine." I have to say, after reading this article, I am damn excited to see the improvements DX12 will bring. It sounds way too damn awesome to be true. Only time will tell, but colour me super interested.
It can be true; it's been shown. Plus, look at the leap from DX6 to 8 and from 9 to 10, while there was very little difference going to 11. Now they're changing the way DX works, so I'm optimistic. Unlike Mantle, where to get the most benefit you need a complete AMD setup. I myself have an Intel and AMD graphics card setup.
What is interesting is how MS designed the Xbox One compared to the PS4. The Xbox One has a more powerful CPU and less GPU power than the PS4, but as the benchmarks show, it is the CPU holding the GPU back. We have heard numerous game developers say the CPU is the bottleneck, so it all falls in line with the idea that the Xbox One is a very well balanced system. I cannot wait to see what this will do for both PC/Windows and Xbox One gaming! That said, more pixels hasn't really mattered to me for some time. A stable frame rate, I will take that, but I truly hope this opens the door to more innovative game mechanics instead of another graphics war.
@donthate What the hell are you talking about?? The Xbox One and the PS4 have the exact same CPU. The Xbox One's CPU is just clocked slightly higher. You don't think Sony could clock theirs slightly higher if they wanted? Heat would not be an issue, because the PS4 runs cooler than the Xbox One even though the Xbox One has a bigger fan.
I really hope DX12 does what it's been said to do, given that Mantle is AMD only. I run Intel/Nvidia, planning to do an AMD setup later. I don't want to be left behind lol.
@johndoe11211 Actually the Xbox One runs cooler than the PS4. Yes, they both have the same CPUs, but the Xbox has a slight edge due to the small overclock Microsoft gave it before release. Sony actually can't overclock the PS4 because it runs hot; overclocking would make it even hotter, and with so little space inside the PS4 for air circulation, that wouldn't be a good idea.
@Mercenary You do know the PS4 also has a dedicated separate processor. The PS4 can free up more CPU power from the OS than MS ever could. The funny thing is, a "hidden processor" was an X1 secret-sauce rumor for a long time, hyped like crazy, but it was confirmed the PS4 has one. Sony also said that later in the console's life, after OS updates and optimization, more resources will be freed up.

Yup. DX12 benefits PC only. Even MS said all those benchmark figures were for PC, not X1. And are we seriously arguing about the CPU? That's just desperate. http://www.cinemablend.com/... The difference is TINY; that benchmark was done by Ubisoft. This is where GPGPU comes in. The GPU gap is massive. And even MS themselves said DX12 is giving PC what consoles already have. How desperate do you have to be to believe all those benefit figures for PC will apply to X1, when even MS said it won't do much for X1? LOL. Are you so desperately trying to believe something even MS said won't happen?
^^^ Sounds salty. MS has specifically stated that DirectX 12 performance on Xbox One will depend on how developers use it. "We haven't lied about DX12 on XB1. Speed increase over DX11 depends on how your code works." Straight from Phil Spencer. No need to downplay it and say nothing's gonna change, because as long as developers use it, there will be performance gains. http://www.gamepur.com/news...
DX12 shown on PC... cue the APU discussion.
@Kribwalker Even then, MS said the "effects won't be dramatic" - Spencer. An MS dev says: "Traditionally this level of efficiency was only available on console – now, Direct3D 12, even in an alpha state, brings this efficiency to PC" http://blogs.msdn.com/b/dir... It brings what consoles have to PC. You are holding on to your bubble so dearly.
@johndoe Who are you lying to, kid? The PS4's heat output is hotter than the Xbox One's. That's a fact!! It's also the main reason I put my PS4 on top of my desk with open ventilation, and my Xbox enclosed but with a good amount of airflow. Edit: Also, the CPUs are not the same but similar, as both have their own customizations. And I don't think MS would pay AMD the insane amount of money they did just to have the same CPU as the PS4.
@mercenary & bakpain Both of you guys are full of it. Facts don't matter to you, only blind loyalty, and the sad thing is people are actually agreeing with that false information. Maybe they think that if they lie to themselves enough, it may become true. The thing is, no amount of delusion can change facts: http://www.extremetech.com/... http://gimmegimmegames.com/... @mercenary It speaks volumes about the mentality of Xbox fans on this site that you got a "well said" for a factually inaccurate statement.
Good luck with DX12, I hope it does good things for the X1. I am personally skeptical that it will do much on X1, but it would be cool if it makes it run better.
It's a non argument. Even AMD say to use DX12
I have a complete AMD setup and mantle was VERY disappointing. BF4 performs far worse, and about 2fps difference with Dragon Age Inquisition. DX12 MASSIVE improvement on PC? Sorry but I'll believe it when I see it.
So I've just noticed a flaw in some of the Xbox fanboys' logic. Donthate is clamoring on about MS's design decision to give it a more powerful CPU to keep it from holding back the GPU (again)... presumably Sony is OK with the CPU holding back the GPU. Anyhow, since they both use the same CPU, how is it a design decision that was there from the start, when the increase in clock speed came a mere month before the console's release? If the X1 was designed around DX12 from the start, with all its special stuff that will make it better, then why not have the higher clock speed from the beginning, if it was so important? The console was originally designed to run with a slower CPU (not sure how it relates to the PS4's clock, but that's irrelevant to my point). So why is a small uptick in clock speed suddenly a huge thing for DX12 being the bestest thing ever to blow us away when it finally gets implemented in full?
The PS4 still doesn't have any 1080p 60fps Locked AAA Exclusive Games yet. Uncharted 4 had a chance, they delayed it to 2016. Forza Motorsport 5 is the only 1080p 60fps Locked AAA Exclusive out. Windows 10 & DirectX 12 will ensure Forza Motorsport 6, Fable Legends, & Halo 5 are Locked at 1080p 60fps. What other developers do is up to them. However, M$ Exclusives are going to set the tone for Next Generation Games.
Because we all care who has 1080p 60fps locked AAA exclusives! That's totally the barometer for a title's worth! How many of the greatest games of all time have been 1080p 60fps? Any? Right.
@Big-finger "It's also a racing game so running at 1080p 60p is nothing to be proud of." Huh? I guess you are not proud of DC either, which runs at 30 FPS.
MLB the show is AAA and runs at 1080p 60fps on PS4... But that is a moot point anyways, X1 is simply not as strong a machine, hardware wise. And the PS4 generally runs the same game better, so it would probably do the same with Forza. Exclusive game performance literally doesn't prove anything!
Nobody knows what framerate or resolution those games will run at yet!
@johndoe11211 A clear example of someone repeating what they've been told. In real life, outside BS online misinformation, the PS4 is louder and runs much, much hotter than the XBO. If you had any experience with both consoles you would know this.
Can't speak from personal experience on behalf of the X1, but if the PS4 is louder, then the X1 must be whisper quiet because the PS4 is virtually silent except on start up if there's a disc in the drive. Heat is probably a variable thing, but never noticed the PS4 being particularly hot, and it certainly runs well within the boundaries of the chip manufacturers.
@rainslacker Yeah, I agree, it depends on certain conditions, like where you keep your console and what game you play, etc. The PS4 is all good, aye, but the XBO is def quieter and always cool. Even under heavy load I can put my hand on it and it's barely warm. Always lots of cold air coming out of the vents :)
Seriously, all this talk about heat between the X1 and PS4 is ridiculous. Is anybody gonna mention the fact that the Xbox is the size of a VCR and has a power brick instead of a built-in PSU? Why are you guys arguing about CPUs vs GPUs? Fact: when it comes to gaming, it's all about the GPU. Only crappy devs like Ubisoft think it's all about the CPU, and that's why their games are unoptimized piles of crap on PC. Most of the heat in the PS4 is coming from the damn PSU. You guys are comparing apples to oranges. GPU > CPU, so just seriously shut up about the consoles; this article was supposed to be about PC anyway. And people mentioning Forza for X1 grind my gears; that game was downgraded to the tits to hit that 60 FPS. That game literally is a polished basic turd. Driveclub even at 30fps is better than Forza at 60. Minus the server issues, because that's not graphics.
I'm pumped to see what it can do. I hope it's as good as they're making it out to be!
DX12 is incredible. My guess is that MS will demo the first DX12 games at E3.
They are already demoing Fable Legends, which is confirmed to run on DX12.
I am keen to see what the final product looks like with Fable, but it was a DX11 game ported over. I'm more interested to see Forza 6 and Halo 5; I think they will be the first true DX12 games. And the new Gears on UE4 will look even better still, no doubt :)
Wouldn't this hurt the sales of high-end components on PC? I have a feeling that something is fishy here; we just need to wait for Win 10 and DX12 to officially launch.
It won't hurt sales. They'll increase the graphics by 1300% and beyond so you're forced to continue spending on high end rigs if you want the best.
There is no increase in graphics by 1300% lmao. It will be a big boost to pc but it will be to all pcs no matter the specs.
It's actually a really fair point. I think there will be a bit of a wobble for a while as "the best bang for buck" is re-established, since the criteria for judging hardware have fundamentally changed. I seem to remember, though, that there are a lot of older medium-high end cards (560s, 570s, 6XXs, etc.) still surfacing in the Valve hardware reports. I would assume a number of people will upgrade those to take advantage of "full DX12" when the games start rolling around. I also think there's a big opportunity for Intel/AMD here, as to be honest, the CPU market is quite unexciting unless you love to overclock!
I doubt it. It just means that high-end graphics cards will be able to produce even better results. Performance nuts are still going to want the best hardware available. And they'll be happy knowing that their expensive hardware will be used more efficiently.
Higher end graphics won't mean much when displays are 1080p. 99% of all displays are 1080p and only now are we moving towards 4k. If DX12 can boost performance from 30-60 pretty easily, then the need for an ultra high end card becomes even less desirable since the extra horsepower would be for resolution, which again, is useless since most everyone runs a 1080p display. I think card manufacturers are going to have to try and sell the whole 4k thing now, more than anything else. I also see them jumping on VR the same way TV manufacturers jumped on 3D to try and push some tech out there that'll cost more to the consumer. VR and 4K would need more horsepower to run fluidly, so I definitely think those two will be the selling point of cards in the future.
@outthink You've countered your own argument. People who value the highest performance aren't going to settle with having 1080p screens. They want to get the most out of their hardware.
It shows how underoptimized PC games are now. And gamers are currently forced to buy more expensive hardware than they need. It probably will cause a hardware slump if it delivers as promised, as many people will no longer feel forced to upgrade.
I've been wondering why GPUs haven't had the leaps in performance that they did previously, and the prices remain high. Maybe they are just waiting for 3D memory to show off the new guns.
I'm not a PC gamer, but I don't see how this hurts PC. I mean, if you're gaining a good amount of performance from old hardware, just imagine what you'll be getting with actual DX12 components.
LOL, graphics will improve, but not by 1300%; that's quite the overstatement. No, it won't hurt sales of high-end components; devs will just crank it up a few notches. Games played on mid-range GPUs will just look crisper. Think of it this way: the low settings will become medium, medium will become high, high will become ultra, ultra will become mega super ultra. The bright side is Low settings won't look like muddy crap anymore.
Not really, because it would mean if you bought a powerful CPU/GPU-Card you would get the extra benefit of performance, and perhaps 4k resolution.
Possibly in the very short term, as all current games will probably see a boost in performance and not many will see the need to upgrade. But as DX12 adoption grows, developers will be able to further push the boundaries of what they can create, leading to higher requirements for better graphical quality and better frame rates.
Graphics will improve, and frame rate will improve, but you can't force a GPU to perform tasks in real time that it wasn't designed to do in real time. No API can do that. It can simply make the hardware do what it is capable of doing more efficiently. There is a lot of misunderstanding about what exactly these APIs do and what the GPU is designed to do. There are also a lot of people conflating the two, as if the API is the GPU, which is extremely wrong.

The most relevant example I can give from around here is this: one of the arguments when it comes to consoles is that the X1 will begin to have the same performance as the PS4, or at least close to it. This disregards the fact that the two GPUs, while similar, are actually designed to do different things... or rather, the PS4's can do a bit more on this front, and more of it, due to the increase in processing power and specs. Since the X1's GPU wasn't designed to go beyond a certain number of tasks at a certain spec, there is no way it ever will, no matter how good an API gets, even if some dev were to completely disregard the API and code everything in machine code.

I don't know how this will relate to PC hardware buyers. It may stave off some people's desire to upgrade for a little while if they suddenly have better performance, but eventually new games are going to come out which will require them to upgrade to move past what would be considered low spec. I think we will actually see a surge in new hardware sales as chip vendors push their "fully DX12 compliant" chips, which would allow gamers to take full advantage of their games' capabilities.
The number of tasks the XB1 was designed to do with full DX12 includes things like tiled resources being stored in eSRAM; GDDR5 is not nearly as efficient with tiled resources due to packet loss. MS also has a much better sound processor than the PS4, which frees up more CPU power. The XB1 has more ways to move data around the system to reduce bottlenecks, too. There's even more, but to summarize: the XB1 is anything but an off-the-shelf PC, since it's the first fully DX12-compliant device on the market, so we can't count it out yet. The real question is, what was the XB1 designed to do in real time? That answer will tell us just how close the two systems may be.
I wasn't speaking of techniques to allow the GPU to reach its peak performance; I was speaking purely of hardware internal to the GPU itself. GDDR5 is fine with tiled resources; it's capable of doing tiled resources now. Tiled resources are possible on the PS3 and the 360 as well. It's not a new technique. eSRAM allows for a faster frame buffer, which is good, but despite what MS says, it will not allow 6GB of data to reside compressed in 32MB of RAM... at least not with acceptable results. The eSRAM in the X1 exists precisely to deal with the bottleneck that DDR3 introduces to a memory system, which isn't a problem with GDDR5 — which is why GDDR5 is used for graphics, which is highly parallel.

I don't think you are qualified to speak on the ways MS made the X1 move data around the system. I don't think you even know what that means. When it comes to graphics, which again is highly parallel, you don't introduce ways to move stuff around; you make the memory available directly to the GPU at the fastest speed possible. This is why gaming PCs use dedicated graphics memory, so they don't have to move stuff around.

I never claimed that the X1 was an off-the-shelf PC; I know that isn't the case. Also, the X1 uses a subset of DirectX; it's unnecessary to have the full DX12 feature set on the X1. As far as what the GPU is capable of doing at a hardware level, the general consensus is that it lies somewhere between an AMD HD 7700 and 7900. Not sure if the exact die being used was ever confirmed. Those specs are readily available through Google if you care to look them up.
People will still try to downplay it since its tied to Microsoft.
"On the DX12 question, I was asked early on by people if DX12 is gonna dramatically change the graphics capabilities of Xbox One and I said it wouldn’t. I’m not trying to rain on anybody’s parade, but the CPU, GPU and memory that are on Xbox One don’t change when you go to DX12." - Phil Spencer Pretty sure Phil Spencer is tied to Microsoft as well, and he was downplaying it too.
It is kind of funny and sad how much hype and attention DirectX 12 gets. Vulkan is coming, and it is doing the same and more (it's extensible). Unfortunately no one is writing about that. While Microsoft is working on DX12, I'm glad that Valve and others are behind Vulkan.
Okay. The reasons there's no Vulkan info are simple: it is not close to being completed yet; it covers a much wider variety of devices and would need much more testing and patching; those who are working on it are not providing any details, except that it will be based on Mantle; and most of the PC world will be using DX12, unless Vulkan can somehow outperform DX12 on Win7 and 8, as all the other OSes combined are still tiny in comparison to the Windows install base. I understand that a lot of you are worried about M$ gaining a development advantage because of DX12 on PC and Xbox. And yes, that will happen initially, until Sony adopts something like Mantle or Vulkan and developers figure out how to best port to it. In the meantime, there will be plenty of developers who will still put in the extra time to build the PS4 version separately, like they did last gen.
"Most in the PC world will be using DX12, unless Vulkan can some how out perform DX12 using Win7 and 8, as all the other OS combined are still considerable tiny in comparison to windows install base." Vulkan is supported on Windows as well, so Windows is part of the Vulkan install base. Additionally, Android will most likely support Vulkan, and who knows, maybe even Apple. That said, the total install base will be hugely bigger than Windows/DX12 alone. Microsoft has huge PR resources and it really shows. That's fine, of course.
Vulkan is a base API for OEMs to implement on their own hardware. The testing and patching only has to be done on a particular vendor's own hardware, not across the board. Vulkan is much closer to Mantle than it is to DX12, because DX12 is actually based on Mantle's principles, with MS doing its own implementation of the ideas.

No one is worried about MS gaining a development advantage because of DX12. OpenGL (Mantle) is still much more widely used than MS's relatively proprietary API. MS is not the industry leader it once was. Even though it still holds the bulk of the PC OS market, that pales in comparison to the number of devices that run non-Windows OSes. Windows just has the advantage of not having a decent competitor in this field.

When it comes to games support, I imagine that so long as the PS4 continues to sell as well as it does, that won't be a problem. Ease of development is great and all, but you don't ignore the competing console with double the number of units in customers' hands, and more powerful hardware to do more with to boot.

I do agree that the reason we aren't seeing much Vulkan stuff is that there just isn't much being reported on it, but I don't think it's a problem of being close to completion; these APIs are never really finished. I just think there isn't any current showable demo of performance gains, and most of the stuff about it was shown with Mantle already. The reason it's not called Mantle anymore is that AMD decided to go with the Vulkan initiative instead.
My impression is Valve is pursuing Vulkan and its gaming OS because Microsoft will be doing the same, and that will be difficult competition for them, especially if MS changes the rules on how gaming is accessed on their new Windows 10/Xbox.
So then go write or submit something about Vulkan. Don't hate on developers and others for praising DX12. Maybe it really is that good, and if Vulkan is, as you say, better... well, we need to see it.
We probably won't feel the difference, though, because new games with DX12 will come out maxing out its features, therefore making them feel like any other newer game out there. The only difference we'll feel is if they enable DX12 in all current games, i.e. DayZ etc. But newer games will just feel like new games, everything clocked.
My Xbox and PC will be loving dx12 :D they will embrace its power to the fullest
It's a damn shame DX12 wasn't ready 6 to 12 months before the XB1's release, so that developers could have included its benefits in the first two years of games.
Guys, this is a synthetic benchmark designed to stress one very specific bottleneck: draw calls. If you watch the demo run, ask yourself what % of a game might benefit. 10%? 20%? "Microsoft execs expect frame rates to more than double when comparing DX12 to the current DX11 API." That's probably for PCs. The Xone might get a 30% bump? Trying to be conservative here so the Sony army of pretend graphics programmers doesn't attack me.
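To put rough numbers on the "what % of a game might benefit" question, a quick Amdahl's-law estimate helps: the overall speedup is capped by the share of CPU frame time the draw-call path actually occupies. The percentages below are invented for illustration, not measurements from this benchmark:

```python
# Amdahl's-law sketch: if only part of the CPU frame time is draw-call
# submission, speeding up that part caps the overall frame-time gain.
def overall_speedup(fraction, path_speedup):
    """fraction: share of frame time spent in the accelerated path (0..1).
    path_speedup: factor by which that path gets faster (e.g. 10.0 = 10x)."""
    return 1.0 / ((1.0 - fraction) + fraction / path_speedup)

# Invented example numbers: draw submission is 30% of CPU frame time,
# and the new API makes that path 10x cheaper.
print(round(overall_speedup(0.30, 10.0), 2))  # 1.37 -> ~37% overall, not 10x
```

So even a dramatic 10x win on draw calls alone translates into a much smaller whole-frame gain unless the game is heavily draw-call-bound, which is why synthetic draw-call demos overstate real-game improvements.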
Draw calls are the primary reason Assassin's Creed Unity ran like dogsh*t. Double the framerate sounds pretty amazing! Also, a 30% bump for the Xbone could mean the difference between 900p and 1080p. All of these are reasons to be excited!
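For context on how far a ~30% bump would have to stretch, the raw pixel counts are easy to check (assuming the common 1600×900 for "900p"):

```python
# Pixel-count gap between 900p and 1080p.
pixels_1080p = 1920 * 1080  # 2,073,600 pixels
pixels_900p = 1600 * 900    # 1,440,000 pixels
ratio = pixels_1080p / pixels_900p
print(ratio)  # 1.44 -> 1080p pushes 44% more pixels per frame
```

So at the same frame rate, going from 900p to 1080p means rendering 44% more pixels; performance doesn't scale perfectly linearly with pixel count, so a ~30% boost is at least in the right ballpark.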
Let's say it does receive 30% better performance. The main thing that helps the Xbox reach 1080p is the eSRAM, and with DX12 the eSRAM runs 15% better, according to Phil Spencer I believe. That would be able to help the Xbox reach 1080p, and that 30%, if true, is basically extra performance for anything, be it frame rate or increased visual fidelity. If all goes well it will be great for the Xbox.
Actually the X1 does not need a 30% jump to run games at 1080p; it ran games at 1080p from day one. What has happened with the X1 is that because it was built so focused on DX12, M$ did not initially have the tools (SDK) in place to help developers use the eSRAM and multiple cores. So to get games running at an acceptable frame rate, developers just used 900p. The Witcher game artist said that in their case, there weren't enough shaders to apply to their 1080p images, so I don't know what the solution would be for that scenario.

PS4 fanboys like to point to the Battlefield games and other early 720p games, but it is completely up to developers how much time they are willing to spend redoing engines and coding to the metal to get optimal results. Sometimes trying to change an engine too much creates more instability and can lead to massive delays. But the point I really wanted to make is that we can't use the difference between 900p and 1080p to gauge how much more gain the X1 needs to get more games to 1080p. It has more to do with tools and coding for the architecture than anything else. M$ is changing all the software in the X1 with Win10 and DX12. Let's see if this makes things so easy for developers that 1080p/60fps becomes easily achievable.

With that said, I would prefer they use the additional gains from draw calls to focus on lighting, physics, AI, particle effects, facial expressions and animation. These things are way more immersive and lead to much better games than 1080p. 1080p is more about volume of graphics than quality of graphics. I hope with the X1 they put their focus on quality of graphics, as this would be the best benefit of more draw calls, and X1 games would show a huge advantage that would get them the proper attention when gamers see the difference between mind-blowing graphics and high resolution. If they achieve both, then that would of course be ideal. But quality over volume all day, as shown by Ryse.
"Battle Field games and other early 720p games, but it is completely up to developers as to how much time they are willing to spend in redoing engines and coding to the metal to get optimal results. Sometime trying to change an engine to much creates more instability and can lead to massive delays." I think that going forward, DICE's Frostbite engine will port across to the XB1 far better than it did in BF4 and Hardline, once they have implemented DX12 (which I have no doubt they will announce this year). Just in time for Star Wars and Mass Effect. Along with EA's partnership with MS, I expect far better ports via DX12. I'm certain the PC versions of these games will use it, so the smart money is on the X1 versions being highly optimised ports of the PC versions of those games. BF5 on XB1 will be much better than BF4 for sure.

"With that said, I would prefer there use the additional gains from draw calls to focus on lighting, physics, AI, particle effect, facial expressing and animation. These things are way more immersive lead to much better games than 1080p." ^^ THIS, times a hundred. This is next gen to me. I've been playing 1080p games for years on PC, so I don't really call that next gen. Show me a game with a world that really... breathes. Like the original Watch Dogs demo, with tonnes of NPCs and AI, and multiple simulations going on within the world that affect gameplay in dynamic ways... THAT is next gen. I'm looking forward to seeing Elite Dangerous running on the XB1, a game from a team with X1 experience who also happen to have a cutting-edge game on their hands. Seeing that, and eventually others like Star Citizen, come to the X1 will really show DX12's party tricks. And the tech implications of DX12 will really shine in a game like Scalebound and its large-scale dragon battles. Crackdown as well. Halo 5 will be a more linear experience than those games, even though it has a sandbox way of playing out, so I expect that game to have some of the best visuals in any game this year.
Halo 4 is still one of the best looking last-gen games. Forza 6 will look stunning; something tells me the difference between FM6 and FM5 will be huge. And QB continually looks impressive. And Gears will benefit from being over a whole year away, a matured DX12 XB1 environment, and the UE4/DX12 implementation. And that's before we even see what Rare is working on, or any other surprises at E3. We could see any of the old Rare IPs show up, being developed by any AAA dev. MS have the resources to get Bethesda to make them an exclusive RPG based on one of their old IPs, if they so wish. They were ready to let Insomniac make a new Banjo/Conker game if they had the time. For the second-place console, the X1 is looking REAL tasty on the AAA exclusive front. In all this talk, the one thing I have been able to talk on is the games on X1 that will use DX12. We all know this stuff. But it's the surprises that MS will shock us with over the next few years. It really is all about the games. But it's nice to know the tech behind it is being optimised constantly.
If you can get twice the number of draw calls per frame, then yes, you can virtually double the frame rate, as it essentially allows the frame to be drawn twice as fast. Draw calls are everything to a frame; they are the frame, and everything in it. So every game should benefit from this regardless of whether it is written directly for DX12 or not. I'm not sure how this relates to the X1, since I don't know its current abilities, nor what it will be capable of once DX12 releases.

You didn't have to bring Sony into this. I am a graphics programmer (the real kind, not the fake Sony kind), so I find it ironic that you are worried about the fake Sony ones when you talk BS with such authority; it's apparent you have no clue what you're talking about either.

@Big That last paragraph of yours is spot on. The more draw calls you have, the more the GPU can render in a single frame. However, the GPU is still limited by its hardware capabilities, so even if you had unlimited draw calls, the GPU might not be able to physically handle them all. The best thing about this is that if there are gains, developers can do more without spending more time getting things to work with their own low-level implementations. This includes all those things you mentioned. One point of correction, though: only lighting and particle effects would be directly influenced by draw calls; the rest is GPU compute, which is a different beast altogether. :)
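To illustrate the "double the draw calls, double the frame rate" point for the purely CPU-bound case, here is a toy model; the draw-call count and per-call costs below are invented numbers, not real DX11/DX12 measurements:

```python
# Toy model: when the CPU's draw-call submission is the only bottleneck,
# the frame-rate ceiling is just 1 / (calls * cost per call).
def max_fps(draw_calls, cost_per_call_ms):
    """Frame-rate ceiling for a purely submission-bound frame."""
    frame_time_ms = draw_calls * cost_per_call_ms
    return 1000.0 / frame_time_ms

# Invented numbers: 10,000 draw calls per frame.
print(max_fps(10_000, 0.004))  # 25.0 fps at 0.004 ms per call
print(max_fps(10_000, 0.002))  # 50.0 fps -- halve per-call cost, double FPS
```

In practice the GPU side takes over as the bottleneck at some point, which is exactly the hardware ceiling described above: cheaper draw calls raise the CPU's ceiling, not the GPU's.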
I'm new to PC gaming. My question is: is there any truth to unified GPU memory in SLI or CrossFire mode? I currently have 2 GTX 970s in SLI that I'm hoping will be enough for VR at the end of the year with Windows 10.
Unified GPU memory in dual graphics mode (not SLI or CrossFire; DX12 has no such limitation) will work only with DX12 games.
This is insane, that users are waiting for DX12 to make it equal to the PS4. That's like me buying a console and thinking, oh well, in 3 years I should have resolution bumps lol.
Nobody here has said DX12 will make the Xbox One equal to the PS4. But if DX12 does make hardware more efficient on Xbox One, which it is said to do, then we can expect a performance increase of how much? We don't know, but we will see sometime this year. So don't act like DX12 won't help the Xbox One until you see some results.
DX12 won't change the hardware in the Xbone, but it sure can use it more effectively. LIKE ANY OTHER API FROM THE PREVIOUS CONSOLE generation, because every console has low-level coding. PC didn't have that for a long time, and now PC gets those benefits with Mantle, and later with DX12 and Vulkan. So, it won't bring anything new to the Xbone, because the Xbone already has what PC will soon get! From Phil S.: "On the DX12 question, I was asked early on by people if DX12 is gonna dramatically change the graphics capabilities of Xbox One and I said it wouldn’t. I’m not trying to rain on anybody’s parade, but the CPU, GPU and memory that are on Xbox One don’t change when you go to DX12. DX12 makes it easier to do some of the things that Xbox One’s good at, which will be nice and you’ll see improvement in games that use DX12, but people ask me if it’s gonna be dramatic and I think I answered no at the time and I’ll say the same thing." What DX12 is for PC: http://image.slidesharecdn....
No, what's insane is that so many Sony fanboys on N4G are obviously threatened by it when it doesn't have anything to do with Sony. This news is for PC (like myself) and Xbox gamers; this has nothing to do with the PS4 at all.
Actually, this is just for PC, and has nothing to do with Xbox (like yourself) at all. Thing is: Xbox fans have been flaunting for months that DX12 will bring all these massive changes for the XB1. It won't. Not to say that it won't bring improvements, but the biggest gains will be for the PC, as this article shows. As for Sony fanboys or whatever, IF this article referred to the XB1, it WOULD have something to do with the PS4; the consoles don't exist in a vacuum, else the Wii and Kinect wouldn't have existed, having been influenced by the EyeToy. Else the XB1 wouldn't have Blu-ray, and the PS4 wouldn't have party chat. That said, this article's title doesn't specify WHAT the gains are for; you'd have to at least click this far to find out what it referred to. And just as the ambiguity will bring the Sony-aligned in here, so will those who strongly favor Xbox find their way in.