
DirectX 12 Gives A Boost of 330% To Draw Calls On 3 Year Old Hardware, More Details Shared

Over 300% improvement on 3 year old hardware and more details shared.

Read Full Story >>
gamingbolt.com
Ghost_of_Tsushima940d ago (Edited 940d ago )

When asked how the new API will impact the Xbox One, he stated, “The problem there is the Xbone is hideously GPU bound and while multithreaded performance SHOULD be good, the CPU is operating at a painfully slow clock rate.”

I really hate the CPUs used in this gen's consoles.
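
A toy Python model of the point he's quoting (illustrative numbers only, nothing measured): faster draw-call submission only lifts the frame rate when the CPU side is the bottleneck, which is exactly why a GPU-bound machine sees little benefit.

```python
# Frame time is bounded by the slower of CPU command submission and GPU
# rendering; speeding up draw calls only matters when the CPU side stalls.

def fps(draw_calls, cpu_us_per_call, gpu_frame_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0
    frame_ms = max(cpu_ms, gpu_frame_ms)  # whichever side stalls the frame
    return 1000.0 / frame_ms

# CPU-bound: 10,000 draws at 5 us each (50 ms of CPU) vs a 16 ms GPU.
# A 4.3x faster submission path pushes CPU cost below the GPU's 16 ms.
print(fps(10_000, 5.0, 16.0))        # 20 fps, CPU-limited
print(fps(10_000, 5.0 / 4.3, 16.0))  # 62.5 fps, now GPU-limited

# GPU-bound: the same submission speedup changes nothing.
print(fps(1_000, 5.0, 33.0) == fps(1_000, 5.0 / 4.3, 33.0))  # True
```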

Gwiz940d ago

Yep, that's my main problem with 8th gen.

Nio-Nai940d ago

It was the main problem with 7th gen as well.

6th gen was the last one to release with then-on-par PC tech.

Gwiz940d ago

7th gen had decent CPUs; hell, they were top tier when they released.

johndoe11211940d ago

@nionai

Do you have any idea how horrendously powerful the cell in the ps3 was? The problem with it was that it was difficult to develop for but in no way was it underpowered.

shloobmm3940d ago

The 360 was state of the art when it launched. Ahead of pc hardware in its time of launch.

Nio-Nai939d ago

The cell was not "difficult" to program for and I find it funny as hell people still believe this nonsense..

And yes, both the 360 and the PS3 CPUs released behind PC tech at the time; the 6th gen was the last to release with then-current tech.

rainslacker939d ago

@Nio

It wasn't hard to develop for, but it did require using techniques that weren't common in game programming, or even regular programming. The biggest obstacle with it was that multi-plat development became much more of a thing last gen, and many of the things that could be done on a PowerPC architecture couldn't be done directly on a CELL, so a lot of things had to be rewritten.

Nowadays, many of the techniques from CELL are actually in modern processors, so developers learned to take advantage of it.

It wasn't much different with prior PS consoles either. The Emotion Engine was somewhat unique, but at least it built on what was already being done. However, a lot of the specialized things it could do took developers a few years to get a handle on.

Mr Pumblechook939d ago (Edited 939d ago )

So many articles telling us how a new system update or SDK will dramatically improve performance. But stop using words to try and persuade us this is true; just show us.

DarXyde939d ago (Edited 939d ago )

Replying to your comment about CELL being difficult to program for, we've had it for, what, 9 years? I don't think that holds true anymore. Developers got a great handle on it and it's much easier now. The results of late generation games are a testament to that.

Still, I can agree that it's better for the industry to have gone with x86 architectures. Everything is uniform now and multiplatform titles can be released faster. This benefits people moving from consoles to PC and vice versa. It also helps out indies quite a bit. I imagine the porting process is much cheaper and faster now.

funkateer939d ago

@rainslacker
"many of the things that could be done on a PowerPC architecture couldn't be done directly on a CELL"

Cell *is* (a specialized) PowerPC architecture.
PPC code runs on Cell, but not necessarily the other way around.
What made the Cell difficult is that it was only a single-core CPU with satellite SPEs, which is uncommon.
And it wasn't only uncommon: the SPEs weren't general-purpose cores, so they weren't good for everything.

Also, there was no direct language support for it, so developers had to learn how to go low-level on Cell to get the performance out of it. That usually involved writing specialized assembly and really knowing about the limited memory available in an SPE core.
There were some C++ APIs for the SPEs, but you couldn't directly code the SPEs in C++ as you would a general purpose CPU core.

Last but not least, in the first years there wasn't really any development tools support or good documentation for the SPEs to speak of.
Mark Cerny has gone on record saying that in the early days, the PS3 was this board of exotic components with some sparse documentation, in Japanese. Go figure...
Gabe Newell had to say something about PS3 as well, and it wasn't about how easy it was...

So in short, yes, the Cell *was* difficult for the average developer. And it still is.

"Nowadays, many of the techniques from CELL are actually in modern processors, so developers learned to take advantage of it. "
Which ones are those? X86 doesn't have something comparable to Cell SPEs, does it?
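
For anyone who hasn't seen the pattern funkateer is describing, here is a rough sketch in Python (standing in for what was really C plus DMA intrinsics and assembly) of the chunked offload an SPE job had to follow. The 256 KB local store size matches Cell's documented hardware; the chunking scheme itself is illustrative, not a real Cell API.

```python
# Each SPE had only a 256 KB local store, so work had to be cut into
# chunks, DMA'd in, processed locally, and DMA'd back out.

LOCAL_STORE = 256 * 1024     # bytes of SPE local store
CHUNK = LOCAL_STORE // 2     # leave room for code and double buffering

def spe_offload(data, kernel):
    """Run `kernel` over `data` in local-store-sized chunks, as an SPE job would."""
    out = []
    for start in range(0, len(data), CHUNK):
        chunk = data[start:start + CHUNK]  # "DMA in" to local store
        out.extend(kernel(chunk))          # compute entirely in local store
    return out                             # "DMA out" results

# A trivial per-element kernel, chunked transparently:
doubled = spe_offload(list(range(300_000)), lambda c: [x * 2 for x in c])
```

The real difficulty was hiding the DMA latency (double-buffering the next chunk while the current one computes) and keeping kernels SIMD-friendly, which is what pushed developers toward assembly.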

rainslacker939d ago

@funk

The main core was a PowerPC core, but it wasn't a full PowerPC core; it used a subset of the PowerPC architecture. And there is no single PowerPC instruction set to begin with, just like there is no single x86 instruction set. All x86 processors can run legacy code, but sometimes older processors cannot run newer code. It's rare that you see this though, as most of those things are rather specialized and wouldn't exist in general-purpose computing.

Anyhow, the idea behind CELL is that some of the things a regular PowerPC could do could be done more effectively on the SPEs instead of the PPE. Those SPEs required that they be programmed to directly, because there was no high-level way to make it happen early on. The APIs to make it work existed in rudimentary form for regular development and got better over time. Sony's own PS3 APIs most certainly had libraries which handled many SPE functions. I don't know how robust they were out of the gate, but they were pretty well refined later in the generation.

Otherwise, you are correct: assembly was the better way to go with the SPEs. Assembly, though, isn't exactly rocket science. You simply figure out your algorithm, find the instructions to make it work, then program the instructions with the variables and derive a result. That's all assembly really is.

That being said, the first few years I can concede that the documentation would be lacking, although I found a ton of it when I was looking into CELL programming. It wasn't terribly well organized though.

rainslacker939d ago

For the average developer, I can say you're probably right. The principles involved in CELL would be extremely foreign. However, AAA devs aren't average developers; they're well versed in assembly and know how to manage data. But the time it takes to learn the new stuff could be seen as difficult either way.

"So in short, yes, the Cell *was* difficult for the average developer. And it still is."

I'd have to disagree here. I wouldn't consider myself a genius-level programmer, and I had no issue finding and understanding how to utilize the CELL. I never got up to the advanced stuff like you would see in games, due to it being more an area of interest than a job, but CELL is hardly foreign once you understand the basic principle of how to make it work.

I can understand there may be a lot of resistance towards programming for CELL, but that likely comes from programmers who learn one thing, and then don't adapt well. Most programmers I know find one thing they like, then have a hard time moving onto other concepts because it's just not what they're used to. I don't think a quality game engineer would have this issue though.

"Which ones are those? X86 doesn't have something comparable to Cell SPEs, does it?"

Many modern processors actually use a RISC-style micro-op instruction set for internal processing. This allows more to be done with less need to feed instructions in sequence. This actually started in the Pentium chips (can't remember which one), but was rather rudimentary, and didn't involve offloading tasks the way the CELL design can.

The EIB bus is becoming the standard in modern CPUs, as it removes a level of hardware for interconnecting various parts of the system bus.

Hyperthreading also became relatively common with the CELL chip, while past ventures didn't really get utilized due to restraints on the memory bus and were reserved for more specialized applications. Nowadays, hyperthreading is more mainstream due to the EIB listed above. This particular thing isn't really a CELL trait, but CELL was one of the first to utilize it for general-purpose computing.

CELL is mostly defunct now outside of some specialized applications, but there are things from it that have improved modern computing. There's a good scholarly article on it somewhere on the net. When I have some time to hunt it down I'll link it to you.

JazMac34940d ago

Buy a PC then and you can choose your own CPU ^_^

ABizzel1940d ago

The thing is most people are truly computer illiterate outside of using them to type documents and get online.

Most people don't know hardware specs and differences, don't know how to download driver updates, etc... all things a simple Google/YouTube search can bring up.

And then most people just don't want to deal with having a PC as a gaming platform when they can have everything bundled right there for them in a console, that ensures that most of the gamers have an identical gaming experience.

Just two different mindsets.

I enjoy both personally, but I also don't expect my consoles to compete with my PC, they both have their uses.

nitus10939d ago

Build yourself a PC and you can choose your motherboard, CPU type, memory, graphics card, hard and/or solid-state disks, not to mention a decent monitor, power supply, keyboard and mouse.

Buy a pre-built PC and you get what it has, and if you get one with really good specs then you are going to pay for it.

BTW, before anyone chimes in with "but keyboards and mice are cheap", I would say you really aren't serious about PC gaming.

cell989940d ago

Last gen it was a good CPU, decent GPU, and bad RAM. Now we have great RAM, an OK GPU, but a bad CPU. Will they ever get it right?

johndoe11211940d ago (Edited 940d ago )

It's ridiculous to expect all 3 of those things to be great for $400. Something needs to be sacrificed somewhere.

Gwiz940d ago

True, but the CPU should not be an afterthought, like it was with the GPU on the PS3. While it still worked out, it was anything but a popular move for third-party devs.

ABizzel1940d ago

Well next gen should get the right balance, but they won't be high end at all.

The following specs should be the minimum if they stick with AMD.

FX 8320+ (they need a single-core performance boost)
R9 390x equivalent GPU (8+ TFLOPS)
8GB - 16GB DDR4 System RAM (most likely 8GB)
4GB - 8GB HBM (Probably 8GB)

$399 - $499

So, by today's standards: a mid-range CPU (even without the single-core boost), a top-end GPU, a good amount of system RAM from a much better source, and VRAM from a much better source.

By 2020 when the consoles release, it'll still be a solid CPU (with the single-core boost, it'll still be mid-range), a mid-range GPU (in 5 years' time GPU tech will be much better, R9 770x), and the RAM should still be good, if a bit on the small side pool-wise (although 8GB of VRAM is still good considering the most demanding games of today are capping at 4GB, and 8GB of system RAM is still plenty).
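
For reference, the TFLOPS figures thrown around above come from the standard peak-throughput formula (peak, not sustained): shader cores × clock × 2 ops per cycle, since each core can issue one fused multiply-add per cycle. A quick sanity check against the PS4's well-known 1.84 TFLOPS rating:

```python
# Peak shader throughput in TFLOPS: cores * clock (GHz) * 2 FMA ops / 1000.

def tflops(shader_cores, clock_ghz):
    return shader_cores * clock_ghz * 2 / 1000.0

# PS4: 1152 shaders at 0.8 GHz.
print(tflops(1152, 0.8))  # ~1.84
```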

JasonKCK939d ago

They should have kept the blu-ray drives out and added extra horse power.

GUTZnPAPERCUTZ940d ago

Yep, X1 - 1.75GHz per core, PS4 - 1.6GHz per core, meanwhile PCs are pushing over 4GHz easily :/

shloobmm3940d ago

PCs have been running at 4GHz for 15 years. It was originally believed that the Intel Pentium 4 could hit 10GHz. However, the reason CPUs haven't surpassed those marks is because it's not beneficial. We're forced to overclock our CPUs because they aren't being used correctly. The speeds aren't the problem; it's just a problem of them not being used the way they should be.

2pacalypsenow940d ago (Edited 940d ago )

@shloobmm3

I'm pretty sure in 2000 we barely hit 1GHz. I know because I had a Pentium III that was one of the first, along with the AMD Athlon. No way we had 4GHz CPUs in 2000, at least in the consumer market.

Lordani66939d ago

So what? Console hardware should not be compared to PC hardware. It's like comparing PC hardware to that of space rockets... Look at the PS2: it had a freaking 300MHz processor (single core)! It played every multiplatform game like a charm, and at the end of its life it had games like Black that looked just like PC games at the time. And the PCs at that time were 3.0GHz and up, and Core 2 Duos were around with 3.2GHz on 2 cores. PCs need that extra memory, frequency etc. to not only play games but also do 9412412789 other things at the same time that consoles simply don't do.

This, plus PC being the open platform it is. No game developer will optimise their game for 4124152 PC builds that each have just one hardware part different from the others (and as a soon-to-be IT engineer, believe me, it MATTERS), so that's another reason consoles can have "slower" CPUs.

Lordani66939d ago

You can dislike my previous comment as many times as you like, I don't care, but these are the facts. And I know butthurt PC fanboys don't like them ;).

shloobmm3940d ago

I've run the API test through the preview build. It's significant. Anyone with the Win10 technical preview and 3DMark can run the benchmark. However, his answer about the Xbox One is ludicrous. Since when do we care about not only a random person's benchmark but also his assumptions of what it will do for the X1?

DragonKnight939d ago

So we're supposed to care about your benchmark, but not his? Let me guess, your benchmark says DX12 turns the Xbox One into a high end PC right?

andibandit939d ago

@dk
Did you read the article?

Neither the guy in the article nor shloobmm3 has benchmarked the X1 with DX12.

Eonjay939d ago

Well, all someone has to do is benchmark a rig with a similar CPU and a DDR3-based GPU.

Father__Merrin940d ago

But who will pay for a higher-spec CPU in consoles? It doesn't come for free; a £500 console simply won't cut it.

UKmilitia940d ago (Edited 940d ago )

What console was £500?
The PS4 I got at launch was £379 with 2 games.
I would have happily paid £400 or £450 if it would have made a huge improvement.

I love my PS4, but I would have been happy to spend up to £450.

lemoncake939d ago

In past gens, console makers would make a loss on each console sold and recover it with the premium fee they add to games. This gen it's been all about making profit from day one, which is why the machines are massively underpowered. You have to remember these console manufacturers attach huge fees to the games sold, not to mention the other stuff like subscription fees and advertising revenue they make per console.

ShottyGibs939d ago

I will. Not wasting my gaming time playing crap at 30fps. Screw this gen

MonkeyOne939d ago

I think this is also probably why we don't see the PS4, with its presumably faster GPU and GDDR5 memory, blowing the Xbox One out of the water in multi-platform games.

The CPUs are probably the bottleneck in many benchmarks.

In the PC gaming world, 3.4GHz is now the base, low-end speed for a CPU.

Raw CPU horsepower matters.

A lot.

headblackman939d ago

@MonkeyOne

"I think this is also probably why we don't see the PS4, with its presumably faster GPU and GDDR5 memory, blowing the Xbox One out of the water in multi-platform games."

Huh??? Where at??? I think our understanding of blowing something out of the water is tooootally different from one another.

The PS4 and the X1 both have the same everything when it comes to multiplat games, with a slight bump in resolution from time to time in favor of the PS4. But 900p to 1080p, to the naked eye (depending on the size of your TV and the distance you're sitting from it), won't let you see the difference (this is why most people rely on machines and companies like Digital Foundry to tell us what the true performance of a game is).

Now if it had better textures, better lighting, more polygons, better particle and liquid effects, then you might be able to say that one blows the other out of the water, but 900p to 1080p isn't enough to say that, sir. 720p to 1080p isn't even enough. You need way more than a resolution bump on a multiplat game to blow one console out of the water. Now you can say that in the resolution department the PS4 has a slight advantage (at the moment), but that's about it, and even someone with half a brain wouldn't contest that.
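
For what it's worth, the resolution gap being argued over is easy to put a number on:

```python
# 1080p renders 44% more pixels per frame than 900p -- a real gap,
# but a long way from "blowing out of the water".

def pixels(width, height):
    return width * height

p900 = pixels(1600, 900)     # 1,440,000 pixels
p1080 = pixels(1920, 1080)   # 2,073,600 pixels
print(p1080 / p900)          # 1.44
```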

r2oB939d ago

@ head black man

I think you need to read his comment again.

brich233940d ago

OK, now show me a game that has 330% boost over Directx 11.

TheXgamerLive940d ago (Edited 940d ago )

So, I see you do not understand the statement's meaning OR the fact that DX12 games aren't out yet.

brich233940d ago

I know the games aren't out yet. They need to stop saying how much of a performance boost DX12 is and actually show us a game with real-time performance, not just some benchmark.

death_gun940d ago

I was going to build a new gaming rig, but if this is true then my actual PC should be able to play all games maxed out once DirectX 12 is out.

brish940d ago

"DirectX 12 Gives A Boost of 330% To Draw Calls On 3 Year Old Hardware"

What this means is ONLY the draw calls are 330% faster. It doesn't mean the frame rate is 330% faster.

Bigpappy939d ago

Very true. But if the draw calls are that much better, and the game is optimized to use the CPU in that manner, then the frame rate will increase. They are by no means directly proportional though. Draw calls will do more on the screen at a given frame rate with the DX12 API as opposed to DX11.
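
Bigpappy's "better but not proportional" point is essentially Amdahl's law: if draw submission is only part of the frame, making that part 4.3x faster (i.e. 330% faster) gives a much smaller overall gain. A sketch with made-up numbers:

```python
# Amdahl's law: overall speedup when only a fraction of the work
# gets faster. The 30% fraction here is illustrative, not measured.

def overall_speedup(fraction_sped_up, local_speedup):
    return 1.0 / ((1.0 - fraction_sped_up) + fraction_sped_up / local_speedup)

# If 30% of frame time is spent issuing draw calls, a 4.3x faster
# submission path yields only ~1.3x overall -- not 4.3x.
print(overall_speedup(0.30, 4.3))
```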

TheCommentator940d ago

A draw call affects the complexity of a given scene, including things like geometry and texturing. More texture layers applied to the same surface increase the number of draw calls, resulting in a more detailed image. 330% more draw calls, however, does not make the whole system 330% more powerful.

ShottyGibs939d ago

They aren't saying it's a 330% fps boost.

You do know what draw calls are right?
Well, that's what they are talking about. Geez... do some more reading before you make simpleton comments.

Dario_DC940d ago

So my laptop's 820M is gonna become a 750TI?
Joking aside, I'm sure this will make gaming quite a bit easier for low- to middle-end PCs.

hockeyglory940d ago

Avoid clicking it. Click bait.

It is an article focusing on PC hardware, yet the N4G pic shows an Xbox One, making gamers think the benchmark tests are for Xbox when they were really PC tests.

Kayant940d ago

Gamingbolt at their finest...

SlavisH2940d ago

You know Xbox doesn't have PC hardware, and Xbox does run DX12. Microsoft made DX12, not Xbox. Wtf.

IGiveHugs2NakedWomen940d ago

"It is an article focusing on PC hardware, yet the N4G pic shows an Xbox One making gamers think the benchmark tests are for Xbox when it was really PC tests."

"Avoid clicking it. Click bait."

This^^^

Nothing more needs to be said.

Haru940d ago

Now I can finally play The Witcher 3 at ultra 4K on my old laptop!!

Thank you GamingBolt! You're always right and you always state true facts and never fake ones :)

gangsta_red940d ago

Does Witcher 3 support DX12? I think the game has to support it first right?

sinspirit940d ago

Why did we ever buy new consoles when we can just download an update?!

/s
