Screenshot Comparison of DirectX 9/10/11

A screenshot comparison of DirectX 9, 10 and 11.

red2tango5317d ago

even though I only play games on my PS3, I'm glad to see improvements on the PC side.

SL1M DADDY5317d ago

As for the DX11 pics, a horse and carriage would be all busted up after a trip down that road. I know that DX11 is going to be awesome but some devs need to use the options a little more realistically.

Xgamerzus5317d ago

Cheesy DX11, MS trying to come up with software that replaces hardware!! LOL
Kinda useless crap. I'm thinking PC games have the luxury of raw bandwidth and horsepower and RAM; DX or not, or OpenGL, it's all the same when you pump it through a 200GB-per-sec Nvidia card.

Christopher5316d ago

DX10 looks the best to me. The 'changes' made in DX11 are a bit extreme, IMHO. Such as oversized, jutting rocks on the road that make it look less like a road and more like terrain that's been pounded randomly by a jackhammer.

Same with the courtyard view, where DX10 looks like cracked and eroded stones, but DX11 looks like huge slabs of stone jutting above the ground in an unnatural way.

The Lazy One5316d ago

I'm sure the results are scalable somehow. The point was to show the difference. If they'd made one road with a flat texture and another with a subtly bumped texture, you wouldn't be able to tell the difference, even if one was technically much more impressive.

In terms of graphical power (not necessarily looks, since the images are exaggerated to show off that power), DX11 is much more impressive.

dirthurts5316d ago

The point is to show what can be done.
As for the road, they used an extreme example of what can be done. You can bump map in full 3D now, which is great.
The road could have been bump mapped in a more subtle way, showing only mild crevices and cracks. But the idea of this is to show how far you can take it.
It's a sliding scale. You can go from flat, to bumps, all the way to spikes (like you saw on the dragon).
Look at the rocks that make up the curb of the road. That's a more subtle approach, which honestly would have looked better on the road. But that's not the idea. It's a tech demo, showing tech.

dirthurts5316d ago

And in the dragon pic, did anyone else notice how it worked with the stairs?
It took a flat textured surface and gave it full-blown steps. Look closely.
Also, the ground in the dragon pics looks amazing.

Christopher5316d ago

Oh, I know what DX11 does; specifically, in this regard it's showing off the new tessellation capabilities. The issue is that if you can't make it look better without making it hard to tell the difference, then what's the point of a 'graphical' review?

I know the performance is better, but I don't need to see extreme concepts that don't look better and just look like they're using the technology for the wrong purposes. What they should have done is made the best-looking scene they could, not just an overdone example of the technology, and then run some benchmark tests from one to the other.

What DX11 really needs is for someone to get their hands on it and use the tessellation for something outside of the box that really makes it shine alongside DX10 without doing ridiculous things just because they could.

The Lazy One5316d ago

You don't know who these pictures were made for, though. If they're made for programmers, programmers don't really care how pretty it looks, only that it shows what the system can do.

The pictures aren't even especially ugly. Just not very realistic.

Christopher5315d ago

Speaking as a programmer, I don't give a rat's arse about screenshots. You want to show me something? Then give me the sample code. Programmers aren't the ones who care about looks; they care about performance at almost every level possible, matched against the data being handled. If they did something for programmers, they'd give them the code for a sample program, not some screenshots.

MNicholas5310d ago

Consoles have been doing things like this for quite some time, though.

There are two main reasons.

In consoles, the developer has a fixed hardware platform and can be creative in how it's used but when developing for the PC, developers have to follow a fairly restrictive rendering path because they have to support a wide range of end-user configurations.

Also, some consoles have been specifically designed to blur the line between CPU and GPU in rendering yet PC GPUs are only now gaining the level of programmability required to implement these kinds of effects.

The first GPU with the ability to completely redefine how PCs render games will be Fermi, yet, sadly, developers will still be hamstrung by having to support older systems. It will probably be a few years after Fermi comes out before we really see some interesting things from PC visuals.

KRONie5317d ago

DX11 looks the same as DX10, 'cept more things sticking out. I don't see how that's an improvement, though, since you could do that in DX10; except you had to manually draw the road texture, etc., instead of telling Direct3D which things to stick out.

Pandamobile5317d ago

Tessellation and OpenCL: two major additions in DX11. Tessellation allows textures to translate into geometry, which adds a LOT more detail without any more work for the modelers.

OpenCL allows programs to be run on the GPU, using the ridiculous amount of processing power inside a modern GPU for more than just graphics.

DX11 isn't just additions to the surface.
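To picture what "textures translating into geometry" means in practice, here's a minimal sketch (plain Python, not actual D3D11 API code) of the geometry-amplification side of tessellation: a single quad the modeler authored becomes a dense vertex grid generated at render time, with the tessellation factor controlling how dense. The function name and numbers are illustrative assumptions, not anything from DX11 itself.

```python
# Hypothetical sketch: how a tessellation factor amplifies a coarse mesh
# into many more vertices, so fine detail can come from a texture instead
# of hand-modelled geometry. Not D3D11 code, just the idea.

def tessellate_quad(tess_factor):
    """Subdivide a unit quad into a (tess_factor + 1)^2 grid of vertices."""
    step = 1.0 / tess_factor
    return [(i * step, j * step)
            for j in range(tess_factor + 1)
            for i in range(tess_factor + 1)]

coarse = tessellate_quad(1)    # the 4 corners the modeler actually made
fine = tessellate_quad(16)     # what the tessellator generates on the GPU

print(len(coarse), len(fine))  # 4 vs 289 vertices from the same quad
```

The modeler's asset stays tiny; the extra vertices exist only on the GPU, which is why there's "no more work for the modelers".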

El_Colombiano5317d ago

That's what I like about your comments, Panda. They actually have knowledge and know-how behind them.

superrey195317d ago

That tessellation in DX11 sure makes a big difference in my opinion. If I'm not mistaken, tessellation in DX11 renders a lot more detail without a huge hit on performance. The only difference I could see between DX9 and 10 was the grass textures lol.

REALgamer5317d ago

One of the features ATi / AMD kept touting was tessellation, which the ATi Radeon 3000 and 4000 series were capable of under DirectX 10.1.

Any chance, then, that owners of these DirectX 10.1 cards will be able to use the tessellation effects of future DirectX 11 games?

OpenGL5317d ago (Edited 5317d ago )

@ Pandamobile

OpenCL has nothing to do with DirectX 11, other than that all DirectX 11 cards work with OpenCL. It is a platform-agnostic, C-based programming standard, similar to CUDA, from the Khronos Group, the same group behind OpenGL. It allows developers to get away from things like HLSL and work with a more familiar C-based language when programming GPUs. All Nvidia GPUs from the GeForce 8 series and later are OpenCL-ready, and I believe only ATI's 4800 series and up are OpenCL-ready.

Eventually OpenGL and DirectX will be a thing of the past, when we get to 100% software pipelines on fully programmable GPUs. Tim Sweeney says that Epic Games will not be using DirectX or OpenGL for Unreal 4, but a software render pipeline based on OpenCL or CUDA.
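For readers unfamiliar with what "programming GPUs in a C-based language" looks like, here's a toy sketch of OpenCL's execution model in plain Python (this is not OpenCL code; the kernel and helper names are made up for illustration): a small "kernel" function runs once per work-item across a global index range, the way OpenCL launches a kernel over many GPU cores at once.

```python
# Not OpenCL itself, just its data-parallel execution model mimicked in
# Python: one kernel invocation per work-item over a global index range.

def saxpy_kernel(gid, a, x, y, out):
    # Each work-item handles exactly one element, identified by its
    # global id (gid); on a GPU these all run in parallel.
    out[gid] = a * x[gid] + y[gid]

def enqueue(kernel, global_size, *args):
    # A GPU runtime would dispatch these across hundreds of cores;
    # here we just loop sequentially to show the semantics.
    for gid in range(global_size):
        kernel(gid, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 10.0, 10.0]
out = [0.0] * 3
enqueue(saxpy_kernel, 3, 2.0, x, y, out)
print(out)  # [12.0, 14.0, 16.0]
```

The point of the model is that the kernel has no loops over the data; parallelism comes entirely from the launch, which is what lets the same code scale across however many cores the GPU has.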

Lich1205317d ago (Edited 5317d ago )

I think you're getting tessellation mixed up with normal maps, which, yes, DX10 was well capable of. Granted, it doesn't look like the roof was being rendered with a map applied, so it looked flatter than it probably could have. Still, it wouldn't match DX11 in terms of realism.

I noticed improvements in the soft shadows; DX9 had really hard edges. Also, not pictured, but DX10 brought shader improvements as well.

evrfighter5317d ago

That's all fine and dandy, OpenGL, but when's the last time Epic made a worthwhile PC game?

Truth be told, Epic is no longer a PC developer whose games I'm interested in. I'm sure many other PC gamers feel the same way.

Pandamobile5317d ago

I can't have all the answers :(

Kakkoii5317d ago


ATI liked the idea of tessellation, but they never created a good enough standard. Their hardware tessellation lacked some key functionality that DirectX 11's tessellation has. Not to mention, with ATI's low market share it was hard to convince a developer to program hardware tessellation into their game, knowing it would only work for people with ATI GPUs. Nvidia has a hard enough time getting PhysX into games. Imagine ATI trying to get tessellation into games back then lol.

And no, you will not be able to use the hardware tessellation in previous ATI cards with DirectX 11's tessellation. ATI's previous hardware doesn't support all of the tessellation features DX11 implements, so it's not supported. You need a DX11 GPU.

ATI's tessellation has been around for quite some time, with small improvements along the way.

It was supposed to be supported in DX10.1, but nothing seems to have come of that. There's very little information about it, so something about it must have been a failure.

With DX11 it's a whole new tessellation engine created by Microsoft, and a hardware requirement that both Nvidia and ATI will be meeting, and thus its adoption will be much easier.

mal_tez925317d ago

That's FU**ING amazing. How much for a graphics card that can do this? I want it!

Electricear5317d ago

This is known as displacement mapping in the 3D industry. The displacement map is the image that "describes" how the 3D geometry will be deformed when it is tessellated. A displacement map is similar to a bump or normal map, depending on the type of image used. Tessellation, for displacement maps at least, is the process where the render engine subdivides the polygons that make up the model, based on the image, and deforms them to reproduce a more complex surface.

The thing about tessellation is that it can vary from render engine to render engine, and as you increase the tessellation you also increase how long it takes to render. Sub-pixel displacement tessellation is considered the ultimate level, as it is pointless to subdivide past that point; it won't have any major impact on the image. From what I can tell, DirectX 11 hasn't reached sub-pixel tessellation yet; however, it is at a very high level. It's pretty awesome to see this stuff finally happening in real time.
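The subdivide-then-deform process described above can be sketched in a few lines. This is an illustrative toy, not engine code; the function names and the fake "bump" height map are invented for the example. Note how the coarse mesh misses the bump entirely, while more subdivision turns the same map into real geometry, which is exactly the render-time cost trade-off mentioned.

```python
# Illustrative sketch of displacement mapping: tessellate a unit quad,
# then push each vertex "up" by a height sampled from a displacement map.

def displace(grid_n, height_map, scale=1.0):
    """Subdivide a unit quad into a grid_n x grid_n grid, then displace
    each vertex along z by the sampled height. Returns (x, y, z) triples."""
    verts = []
    for j in range(grid_n + 1):
        for i in range(grid_n + 1):
            u, v = i / grid_n, j / grid_n
            z = scale * height_map(u, v)   # sample the displacement map
            verts.append((u, v, z))
    return verts

# A fake displacement map: one smooth bump in the middle of the quad.
bump = lambda u, v: max(0.0, 1.0 - 4.0 * ((u - 0.5) ** 2 + (v - 0.5) ** 2))

flat = displace(1, bump)    # coarse: only corners sampled, bump is missed
fine = displace(32, bump)   # fine: the bump becomes actual geometry
print(max(v[2] for v in flat), max(v[2] for v in fine))  # 0.0 1.0
```

A bump or normal map only fakes this shading-side; displacement actually moves vertices, which is why it changes silhouettes and why it needs the tessellated vertices to exist in the first place.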

Ju5317d ago

@OpenGL "100% software pipelines on fully programmable GPUs"

That's an interesting thought, because I was playing around with the CELL (Linux) a bit and actually ended up writing some sort of SW renderer. It doesn't do much, but I got some sort of a flat geometry engine running. The basic idea was a parallel core I could spread across my SPUs, and where that eventually left me was that I could possibly drive that backend framework via a shader or so (CUDA or whatnot). So, there you go: I finally ended up exactly there, doing a pure SW renderer on top of a HW-accelerated implementation (and that could be CUDA, OpenCL, DirectCompute, or pure SW/SIMD/SPU, or whatever). I never finished it, and I have yet to implement a texture engine and some sort of abstract shaders on top of that. But that was exactly my thought: "Eventually we are back at software rendering", extremely fast, though, and basically with dedicated HW running the code.

That's just so funny, because I started poking pixels in the mid-80s. And when we got a "blitter" (Block Image Transferer, for the younglings here) we were all excited. Then it became more and more fixed-function HW (even though the first blitter was programmable), and now we are going back to a sophisticated "blitter" which is basically fully programmable.
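For anyone curious what the core of a "flat geometry engine" like the one described above boils down to, here's a toy software rasterizer in Python. It is only a sketch of the general edge-function technique, with invented names, and has nothing to do with the actual SPU code being discussed; a real renderer would add clipping, correct fill rules, and per-vertex attributes.

```python
# A toy flat-shaded software rasterizer: fill one 2D triangle into a
# framebuffer using the classic edge-function inside test.

def edge(ax, ay, bx, by, px, py):
    """Signed area test: >= 0 when point p is on/left of edge a->b
    (for counter-clockwise winding)."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def raster_triangle(w, h, tri, color):
    """Fill a single triangle into a w x h framebuffer (list of rows)."""
    fb = [[0] * w for _ in range(h)]
    (x0, y0), (x1, y1), (x2, y2) = tri
    for y in range(h):
        for x in range(w):
            # A pixel is inside if it's on the same side of all three edges.
            inside = (edge(x0, y0, x1, y1, x, y) >= 0 and
                      edge(x1, y1, x2, y2, x, y) >= 0 and
                      edge(x2, y2, x0, y0, x, y) >= 0)
            if inside:
                fb[y][x] = color
    return fb

fb = raster_triangle(8, 8, [(0, 0), (7, 0), (0, 7)], 1)
print(sum(row.count(1) for row in fb))  # pixels covered by the triangle
```

The per-pixel inner loop is embarrassingly parallel, which is exactly why this maps so naturally onto SPUs, CUDA, OpenCL, or any other data-parallel backend, and why "back to software rendering" on programmable hardware makes sense.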