
DirectX 12 On PC And Xbox One: Improved GPU Efficiency, Reduced CPU Overhead And More Analyzed

DirectX 12 gives developers more control over how they manage their resources, but it also means they have to do more work upfront.

gamingbolt.com
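
For readers wondering what "more control" and "work upfront" look like in practice, here is a minimal illustrative C++ sketch (mine, not from the article) of the explicit setup Direct3D 12 expects before any work reaches the GPU: the application creates and owns the command queue, command allocator, command list, and fence, and handles CPU/GPU synchronization itself. Error handling is omitted and the command recording is left empty.

```cpp
// Minimal D3D12 setup sketch (illustrative only): explicit queue, allocator,
// command list, and fence management that DX11 previously hid inside the driver.
// Link against d3d12.lib. HRESULT error checking is omitted for brevity.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // The application owns the command queue and command list explicitly.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&allocator));

    ComPtr<ID3D12GraphicsCommandList> cmdList;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&cmdList));

    // ...record copy/draw/dispatch commands and resource barriers here...
    cmdList->Close();

    ID3D12CommandList* lists[] = { cmdList.Get() };
    queue->ExecuteCommandLists(1, lists);

    // CPU/GPU synchronization is also the application's job: signal a fence
    // on the queue and wait for the GPU to reach it.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
    return 0;
}
```
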
IGiveHugs2NakedWomen984d ago

Will we see tests performed with DX12 on Xbox One software, or will the only testing we see involve the DX12 API being used on PC software?

WCxAlchemist984d ago (Edited 984d ago )

I'm in the Fable beta, which uses DX12, and I can say hands down Fable is by far the best-looking next-gen game graphically. Better than Ryse and The Order; I played and own both games, and mind you, Fable is in BETA and it looks and plays that good.

Edit: I'm in the Xbox One beta.

SniperControl984d ago (Edited 984d ago )

On which platform? I thought DX12 was only coming with Windows 10 later in the year?

lazyboyblue984d ago

I disagree. Ryse is far better looking and yes, I'm in the beta too.
There's a lot going on graphically but the textures are murky and the character models aren't detailed enough.
Plus the gameplay is very basic.
The game is very similar to DA: Inquisition's multiplayer mode, but without the awesome single-player campaign.
A good decision by Lionhead to go free-to-play, imo, based on what I've seen.

But it was only a beta.

Kingthrash360984d ago (Edited 984d ago )

So tired of word-flipping, clickbait, console-war-fondling articles.
Even after the boss of Xbox REPEATEDLY said it won't do much, these articles still come. Mixing PC and Xbox and posting a pic of an Xbox instead of a PC, even though PC is the thing DX12 will impact most.
Man, and y'all sit there and eat this stuff up. DX12 will be great for PC and help a little for Xbox. It's not a game changer, and who cares other than fanboys? Enjoy games and stop feeding this very old beast....I'm waiting for more cloud articles....I bet those will kick in when DX12 launches and is seen to be exactly what Phil said it would be for Xbox...minor.

Yetter984d ago

@Kingthrash360 Do you not see how every slide shown in this article is stamped with the Xbox logo in the corner? Phil Spencer never once said it won't do anything for Xbox; he said the changes won't be dramatic. 12-15 fps gains and easily achievable 1080p resolution are not dramatic changes, they are just performance gains, but they are definitely welcome.

Kleptic984d ago (Edited 984d ago )

^If you think an AMD 7850M in the Xbox One is going to magically start hitting 1080p and/or 60fps across the board...because of a new API...I don't know what to say...

the Xbox One already uses a to-the-metal API...so does the PS4...PCs, other than like 4 games using Mantle, DO NOT...

a switch from the ridiculous DX11.1 to something like Mantle or DX12 very much does give 10 to 15 fps reliably on a PC strangled by Windows...but it won't come anywhere near that on a console, as their current APIs are already heavily optimized...

marketing and hollow promises in regards to DX12 and the XO...I 100% promise...haha, 12-15 fps on an Xbox One 'isn't dramatic'...dude, that is like a 50% jump in frame output; MS would be ALL over that if they could be...'easily achievable 1080p'...do you have any idea how big a deal it would be PR-wise for the Xbox division to accomplish that?...they'd be all over it, but the sad truth: an API switch 2 years in isn't going to make it happen...no Xbox product is going to get 1080p/60fps as standard until the Xbox One is replaced...

proudxgamer984d ago

Please add me to the beta list? Sopoem is my gamer tag.

_-EDMIX-_983d ago

@Kleptic "xbox one is going to magically start hitting 1080 and/or 60fps across the board...because of a new api...i don't know what to say... "

um...."xbox one" doesn't "hit" "1080p, 60fps"

...games do. Resolution and frame rate are GAME DEPENDENT, not system dependent.

i.e. Forza 5 ran at 1080p 60fps, with many limitations, effects turned off, etc.

That is a choice a developer makes, just like it's a choice on PC when a consumer decides they want "1080p 60fps" settings in their games.

Developers are having a hard time on XONE not because it's weak as such, but because it's "weaker", i.e. in comparison to the PS4.

Thus...they can make the game with XONE in mind and port to PS4, which adds up to PS4 having the better res and frame rate.

a la porting from last gen to current gen.

PC gets those settings quite easily most of the time, due to overpowered systems running underpowered games.

i.e....we had a gen with lots of games that didn't really require beast rigs, thus....you could max most games out with mid-range cards.

XONE games are being maxed out, so when ported to PS4 they are able to get the fancy treatment of being bumped to 1080p 60fps, or whatever the developer is able to do while maintaining graphics, textures, lighting etc.

".no xbox product is going to get 1080/60fps as standard until the xbox one is replaced."

Again...you don't really know what you're talking about.

1080p 60fps is GAME DEPENDENT, not system dependent.

Developers do so by choice, not because they can't.

Have you considered that all the developers with 900p games have seen what it looks like in 1080p?

Have you considered that it's NOT as good looking as it is at 900p, or at 30fps, etc.?

Have you not considered that developers are purposely maxing out the systems to the point where 1080p 60fps would actually be the lesser option?

A rig that MAXED OUT L4D at 1080p 60fps (i.e. it couldn't do any better than that)

would NOT be able to do the same with something like Crysis...

Thus...the developers are choosing to create Crysis at 30fps 900p, vs something on Source at 1080p 60fps.

They are choosing to create a more demanding game...

If XONE were MORE powerful than it is...it would still get those same resolutions and frame rates, as those are game dependent, and the team could just make a BETTER engine that was still more demanding.

You're not really understanding that those settings happen because teams choose to focus on graphics, textures, lighting etc. vs...a number.

A more demanding engine will always be better looking than a lesser one; it's just the way it is in terms of gaming.

More effects will always come at a cost; no developer wants to waste time on a 1080p 60fps last-gen-looking game.

Most don't care; they want next-gen engines, not outdated engines...

Kingthrash360983d ago

Then why is the resolution difference happening?
Why are 3rd party games like Hardline, for example, running 900p at 60 on PS4 and 720p at 60 on XBO, if it's game specific?
Again, I don't really care....it's just that all your explanation did was fail to explain why we are having this conversation in the first place.
Many.....MANY 3rd party games have had this problem. Yeah, Forza hit 1080 at 60....but at what cost? Forza Horizon didn't hit 1080 at 60. ...and cost? Cost, really? MS paid for Titanfall...an online-only FPS with OK graphics, no destruction, 6v6, low-end bots....very little content....at 792p? While a game like Battlefield has 64 players on much larger maps with destruction.....yada, yada, yada. At 1080p on PS4. ...X1 was at 900 or 720. ....you are all wrong on that, bro.

its_JEFF983d ago

@Yetter Man, if you think 12-15 extra fps is a minor increase, what do you consider a major increase? 60? I've seen some PC guys in here buy brand-new GPUs and get 10-20 fps extra, and they're stoked.

Kumomeme983d ago

@Kleptic
A little correction here:

the Xbox One uses the HD 7790,
the PS4 uses the HD 7850/7870.

Not even the 'M' mobile version is being used.

UltraAtomic983d ago

Really!?!? I am sorry, I don't think it's better than Ryse and The Order.

Kleptic982d ago

@Kumomeme

my mistake, but the reality is that neither console is directly comparable to any off-the-shelf PC part GPU-wise...and a 7800-series mobile GPU from AMD, in most cases, is outperformed by even 7770 desktop GPUs...just because of TDP figures and limited clock speeds on the mobile versions...

and EDMIX...a big long post that went in circles...I never declared 'why' a particular game is 1080 @ 60fps...I fully understand that is a developer decision...

All I said is that a low-to-mid-range GPU will NEVER...ever...set up a situation where it pumps out 1920 by 1080 pixels native...at or near 60 times per second...on modern games...at a frequency that makes it a standard...and no API switch is going to change that, either...

of course a developer could make a game(s) in which the above is true, but it'll come at the cost of everything 'modern' about it...so it won't happen very often...exactly like last generation...the 'cinematic' effects and everything are so much more important to developers, and apparently 'us', that they will always be pushed harder than native res and frame rates...

TheCommentator984d ago (Edited 984d ago )

It would be nice to see some benchmarks so we could put some of this arguing to rest. One thing is for sure, though: DX12 will make a difference on the console. We now know that it will absolutely improve eSRAM and CPU performance to some degree.

What we don't know yet is how the CPU/GPU relationship will be affected by the direct link shown in the Microsoft slides, or why the GPU is split. We also don't understand to what degree developers can take advantage of the new API to get the same effects using less code. To analogize, look at the difference between UE3 and UE4 running on the same hardware and you can see that coding can significantly impact visual quality in a closed system.

There is no secret sauce? Maybe not. Perhaps some people just haven't read the label on the bottle.

jhoward585984d ago (Edited 984d ago )

@TheCommentator

Why is the GPU split? Good question, because I've been wondering the same thing for a while.

I think the GPU is split for a reason unknown to us, but it could be used in a number of ways.

I thought about one possibility. I asked myself how the X1 talks to MS's cloud tech.

According to a recent X1 spec sheet on the internet, the X1 has a two-channel GPU. Perhaps one of the channels handles incoming compute instructions for MS's cloud tech, while the second channel is used for the X1's hardware. Perhaps the two channels on the X1's GPU are used as some sort of bridge to combine the cloud and X1 compute processes.

There might be other uses...

GameNameFame984d ago

MS went out of their way to show only PC.

They even said it won't have a dramatic effect on console.

When asked about improvements, they said it was for PC.

So what possible basis do you have to conclude that it will "absolutely improve" it?

That's like saying: I have no supporting data or facts, and the facts actually disprove it, but I'm going to say it will "absolutely" work.

TheCommentator983d ago

@GameNameFame

I guess you missed GDC, where it was stated that eSRAM would see a 15% boost with Win 10/DX12? It has also been confirmed that this update will reduce CPU binding by allowing the system's cores to talk to the GPU simultaneously. As I stated, both will absolutely improve performance; the only thing up for debate is how much.

I clearly pointed out everything else as speculation, since none of us can answer those questions. Not even an angry troll.
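
(A minimal sketch, mine and not from the thread or Microsoft's slides, of the "cores talking to the GPU simultaneously" pattern being described: in D3D12 each CPU thread records into its own command allocator and command list, and the finished lists are submitted together from one thread. The RecordInParallel name and the empty recording body are placeholders, and device/queue creation is assumed to have happened already.)

```cpp
// Illustrative multi-threaded command recording in D3D12: one allocator + list
// per worker thread (they are not thread-safe to share), joined and submitted
// in a single ExecuteCommandLists call.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

struct PerThread {
    ComPtr<ID3D12CommandAllocator>    allocator;
    ComPtr<ID3D12GraphicsCommandList> list;
};

void RecordInParallel(ID3D12Device* device, ID3D12CommandQueue* queue, unsigned threadCount)
{
    std::vector<PerThread> ctx(threadCount);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < threadCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&ctx[i].allocator));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  ctx[i].allocator.Get(), nullptr,
                                  IID_PPV_ARGS(&ctx[i].list));

        workers.emplace_back([&ctx, i] {
            // Record draw/dispatch/copy commands here on this worker thread
            // (left empty in this sketch), then finish recording.
            ctx[i].list->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submit every recorded list to the GPU in one call from a single thread.
    std::vector<ID3D12CommandList*> lists;
    for (auto& c : ctx) lists.push_back(c.list.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(lists.size()), lists.data());
}
```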

rainslacker983d ago (Edited 983d ago )

@jhoward

There are a couple of possibilities.

1. It's a derivative of AMD's DirectGMA, which allows DMA between the GPU and memory. Typically this would be done with on board graphics card memory, but could also work with shared memory. It allows DMA read/write access to happen concurrently, which has already been stated as possible on the X1.

2. It allows for more efficient data transfer between the CPU and GPU for GPU compute. Not sure how this plays out in an APU, but thinking off the top of my head it's a logical guess.

3. Reduces latency in non-traditional memory management processes. Pretty technical, won't go into detail.

4. Allows direct access to other parts of the system not directly tied to system memory (move engines or other co-processors on the board).

Multiple memory channels are not uncommon, and I'd be more surprised if it wasn't there in an APU. A GPU channel is a path between the CPU and GPU, so it would make sense that in a system that can work with GPU compute, that multi-channel would be there to allow for simultaneous read/write operations, and allow independent access to the memory between the CPU and GPU...sometimes referred to as hUMA.

Edit: I'd like to state that I'm not saying that it's any of the things above, just possible reasons based on my understanding of how computers work. Take them as a starting point for your own research if you so choose. :)

+ Show (1) more replyLast reply 983d ago
halfblackcanadian984d ago

Why is this being disagreed with? I would LOVE to see this information (if only to shut up both sides of the argument - just show concrete details instead of allowing everyone on either side to keep pushing their end of the agenda).

imt558984d ago (Edited 984d ago )

Well, someone didn't read the end of the article:

"Using DirectX 12, the developers will have the final word on where and how they want to utilize their resources but eventually they would need to do a lot of work in providing high level information to the application. But there is no doubt that there are some serious performance gains to be had via DirectX 12, AT LEAST ON THE PC!"

Well, it will be a game changer......ON PC!

proudxgamer984d ago

Why does it matter so much to you? X1 and PC will get a boost....PC is obviously superior to PS4/X1, but people keep downplaying the X1's medium gains. Wouldn't any gamer want better FPS, more pixels and more characters on screen? Why troll, why? PS4 is winning, but for some reason I feel this lead isn't as "secure" as many portray. Stop being silly.

ShottyGibs983d ago

Yeah... Fanboys like you try and cock block positive stories.
