A screenshot comparison of DirectX 9, 10 and 11.
Even though I only play games on my PS3, I'm glad to see improvements on the PC.
As for the DX11 pics, a horse and carriage would be all busted up after a trip down that road. I know DX11 is going to be awesome, but some devs need to use the options a little more realistically.
Cheesy DX11, MS trying to come up with software that replaces hardware!! LOL. Kinda useless crap, I'm thinking. PC games have the luxury of raw bandwidth and horsepower and RAM; DX or OpenGL, it's all the same when you pump it through a 200GB-per-second Nvidia card...
DX10 looks the best to me. The 'changes' made in DX11 are a bit extreme, IMHO. Such as the oversized, jutting rocks on the road that make it look less like a road and more like terrain that's been pounded randomly by a jackhammer. Same with the courtyard view, where DX10 looks like cracked and eroded stones, but DX11 looks like huge slabs of stone jutting above the ground in an unnatural way.
I'm sure the results are scalable somehow. The point was to show the difference. If they compared a road with a flat texture against a flat road with a subtly bumped texture, you wouldn't be able to tell the difference, even if one was technically much more impressive. In terms of graphical power (not necessarily looks, since the images are exaggerated to show off that power), DX11 is much more impressive.
The point is to show what can be done. As for the road, they used an extreme example. You can bump map in full 3D now, which is great. The road could have been bump mapped in a more subtle way, showing only mild crevices and cracks, but the idea is to show how far you can take it. It's a sliding scale: you can go from flat, to bumps, all the way to spikes (like you saw on the dragon). Look at the rocks that make up the curb of the road. That's a more subtle approach, which honestly would have looked better on the road. But that's not the idea. It's a tech demo. Showing tech.
And in the dragon pic, did anyone else notice how it worked with the stairs? It took a flat textured surface and gave it full-blown steps. Look closely. The ground in the dragon pics looks amazing too.
Oh, I know what DX11 does; specifically, this is showing off the new tessellation capabilities. The issue is that if you can't make it look better without making it look ridiculous, then what's the point of a 'graphical' comparison? I know the performance is better, but I don't need to see extreme concepts that don't look better and just use the technology for the wrong purposes. What they should have done is build the best-looking scene they could, not just an overdone example of the technology, and then run some benchmark tests from one version to another. What DX11 really needs is for someone to get their hands on it and use the tessellation for something outside the box that really makes it shine alongside DX10, without doing ridiculous things just because they can.
You don't know who these pictures were made for, though. If they're made for programmers, they don't really care how pretty it looks, only that it shows what the system can do. The pictures aren't even especially ugly, just not very realistic.
Speaking as a programmer, I don't give a rat's arse about screenshots. You want to show me something? Give me the sample code. Programmers aren't the ones who care about looks; they care about performance at every level where the data has to be handled. If this were made for programmers, they'd have been given a sample program's source code, not some screenshots.
But consoles have been doing things like this for quite some time, for two main reasons. On consoles, the developer has a fixed hardware platform and can be creative in how it's used, but when developing for the PC, developers have to follow a fairly restrictive rendering path because they have to support a wide range of end-user configurations. Also, some consoles have been specifically designed to blur the line between CPU and GPU in rendering, yet PC GPUs are only now gaining the level of programmability required to implement these kinds of effects. The first GPU with the ability to completely redefine how PCs render games will be Fermi, yet, sadly, developers will still be hamstrung by having to support older systems. It will probably be a few years after Fermi comes out that we really see some interesting things from PC visuals.
DX11 looks the same as DX10, except with more things sticking out. I don't see how that's an improvement, though, since you could do that in DX10, except you had to manually draw the road texture etc. instead of telling Direct3D which things to stick out.
Tessellation and OpenCL. Two major additions in DX11. Tessellation allows textures to translate into geometry, which adds a LOT more detail without any more work for the modelers. OpenCL allows programs to be run inside the GPU, putting the ridiculous amount of processing power in a modern GPU to use for more than just graphics. DX11 isn't just surface-level additions.
That's what I like about your comments, Panda. They actually have knowledge and know-how to them.
That tessellation in DX11 sure makes a big difference, in my opinion. If I'm not mistaken, tessellation in DX11 renders a lot more detail without a huge hit on performance. The only difference I could see between DX9 and 10 was the grass textures, lol.
One of the features ATI/AMD kept touting was tessellation, which the ATI Radeon 3000 and 4000 series were capable of under DirectX 10.1. Any chance, then, that owners of these DirectX 10.1 cards will be able to use the tessellation effects of future DirectX 11 games?
@Pandamobile: OpenCL has nothing to do with DirectX 11, other than that all DirectX 11 cards work with OpenCL. It is a platform-agnostic, C-based programming standard from the Khronos Group (the same group behind OpenGL), similar to CUDA. It lets developers get away from things like HLSL and work with a more familiar C-based language when programming for GPUs. All Nvidia GPUs from the GeForce 8 series and later are OpenCL-ready, and I believe only ATI's 4800 series and up are OpenCL-ready. Eventually OpenGL and DirectX will be a thing of the past, when we get to 100% software pipelines on fully programmable GPUs. Tim Sweeney has said that Epic Games will not be using DirectX or OpenGL for Unreal Engine 4, but a software-based OpenCL or CUDA render pipeline.
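For the curious, here's roughly what that "C-based" GPU programming looks like in practice. This is a minimal sketch of the OpenCL model (a tiny C-like kernel compiled at runtime and run across the GPU's cores), assuming an OpenCL 1.x runtime and headers are installed; error checking is stripped, so treat it as illustrative rather than production code:

```cpp
// Minimal OpenCL sketch: square 1024 floats on the GPU.
// Assumes an OpenCL 1.x SDK; all error checking omitted for brevity.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// Kernel source: plain C with a few qualifiers. Each work-item squares one element.
static const char* kSource = R"(
__kernel void square(__global const float* in, __global float* out) {
    size_t i = get_global_id(0);
    out[i] = in[i] * in[i];
})";

int main() {
    cl_platform_id platform; clGetPlatformIDs(1, &platform, nullptr);
    cl_device_id device; clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, nullptr);

    // Compile the C-like kernel for this specific GPU at runtime.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kern = clCreateKernel(prog, "square", nullptr);

    std::vector<float> host(1024, 3.0f);
    size_t bytes = host.size() * sizeof(float);
    cl_mem in  = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                bytes, host.data(), nullptr);
    cl_mem out = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, bytes, nullptr, nullptr);

    clSetKernelArg(kern, 0, sizeof(cl_mem), &in);
    clSetKernelArg(kern, 1, sizeof(cl_mem), &out);
    size_t global = host.size();
    clEnqueueNDRangeKernel(q, kern, 1, nullptr, &global, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, out, CL_TRUE, 0, bytes, host.data(), 0, nullptr, nullptr);
    printf("3^2 = %f\n", host[0]);   // expect 9.0
    return 0;
}
```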
@Kronie I think you're getting the tessellation mixed up with normal maps, which, yes, DX10 was well capable of. Granted, it doesn't look like the roof was being rendered with a map applied, so it looked flatter than it probably could have. Still, it wouldn't match DX11 in terms of realism. @Supprey I noticed improvements in the soft shadows; DX9 had really hard edges. Also, not pictured, but DX10 brought shader improvements too.
That's all fine and dandy, OpenGL, but when's the last time Epic made a worthwhile PC game? Truth be told, Epic is no longer a PC developer whose games I'm interested in. I'm sure many other PC gamers feel the same way.
I can't have all the answers :(
@REALgamer: ATI liked the idea of tessellation, but they never created a good enough standard. Their hardware tessellation lacked some key functionality that DirectX 11's tessellation has. Not to mention, with ATI's low market share it was hard to convince a developer to program hardware tessellation into their game, knowing it would only work for people with ATI GPUs. Nvidia has a hard enough time getting PhysX into games. Imagine ATI trying to get tessellation into games back then, lol. http://en.wikipedia.org/wik... And no, you will not be able to use the hardware tessellation in previous ATI cards with DirectX 11's tessellation. ATI's previous hardware doesn't support all of the tessellation features DX11 implements, so it's not supported; you need a DX11 GPU. ATI's tessellation has been around for quite some time, with small improvements along the way. http://en.wikipedia.org/wik... It was supposed to be supported in DX10.1, but nothing seems to have come of that. There's very little information about it, so something about it must have failed. DX11 brings a whole new tessellation engine created by Microsoft, and a hardware requirement that both Nvidia and ATI will be meeting, so its adoption will be much easier.
That's FU**ING amazing. How much for a graphics card that can do this? I want it!
$144.99 to $409.99 http://www.newegg.com/Produ...
This is known as displacement mapping in the 3D industry. The displacement map is the image that "describes" how the 3D geometry will be deformed when it is tessellated. A displacement map is similar to a bump or normal map, depending on the type of image used. Tessellation, for displacement maps at least, is the process where the render engine subdivides the polygons that make up the model and deforms them, based on the image, to reproduce a more complex surface. The thing about tessellation is that it can vary from render engine to render engine, and as you increase the tessellation you also increase how long the image takes to render. Sub-pixel displacement tessellation is considered the ultimate level, as it is pointless to subdivide past that point; it won't have any major impact on the image. From what I can tell, DirectX 11 hasn't reached sub-pixel tessellation yet; however, it operates at a very high level. It's pretty awesome to see this stuff finally happening in real time.
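To make that concrete, here's a rough CPU-side C++ sketch of the displacement step: each vertex gets pushed out along its normal by the map's brightness at its UV coordinate. All the names (Vertex, sampleHeight, displace) are made up for illustration; a real tessellator does this on the GPU for every generated vertex:

```cpp
// Hypothetical CPU-side displacement mapping sketch.
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct Vertex { Vec3 pos, normal; float u, v; };

// Sample an 8-bit grayscale displacement map; 0 = black, 255 = white.
float sampleHeight(const std::vector<uint8_t>& map, int w, int h, float u, float v) {
    int x = std::min(w - 1, int(u * w));
    int y = std::min(h - 1, int(v * h));
    return map[y * w + x] / 255.0f;   // normalize to 0..1
}

// Push each vertex out along its normal by the map's brightness at its UV.
void displace(std::vector<Vertex>& mesh, const std::vector<uint8_t>& map,
              int w, int h, float maxHeight) {
    for (Vertex& vert : mesh) {
        float d = sampleHeight(map, w, h, vert.u, vert.v) * maxHeight;
        vert.pos = { vert.pos.x + vert.normal.x * d,
                     vert.pos.y + vert.normal.y * d,
                     vert.pos.z + vert.normal.z * d };
    }
}

int main() {
    // 2x2 map: one white texel displaces fully, the rest not at all.
    std::vector<uint8_t> map = { 255, 0, 0, 0 };
    std::vector<Vertex> mesh = { { {0,0,0}, {0,0,1}, 0.1f, 0.1f } };
    displace(mesh, map, 2, 2, 0.5f);
    printf("displaced z = %f\n", mesh[0].pos.z);   // expect 0.5
}
```

The more the mesh is subdivided first, the more vertices there are for the map to displace, which is exactly why tessellation level controls how much surface detail you recover.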
@OpenGL: "100% software pipelines on fully programmable GPUs" - that's an interesting thought, because I was playing around with the Cell (Linux) a bit and actually ended up writing a sort of SW renderer. It doesn't do much, but I got some sort of flat geometry engine running. The basic idea was a parallel core I could spread across my SPUs, and where that eventually left me was that I could possibly reuse that backend framework via a shader or so (CUDA or whatnot). So there you go: I ended up exactly there, doing a pure SW renderer on top of a HW-accelerated implementation (and that could be CUDA, OpenCL, DirectCompute or pure SW/SIMD/SPU or whatever). I never finished it, and I have yet to implement a texture engine and some sort of abstract shaders on top of it. But that was exactly my thought: "Eventually we are back at software rendering" - extremely fast, though, and basically with dedicated HW running the code. It's just so funny, because I started poking pixels in the mid '80s, and when we got a "blitter" (Block Image Transferer, for the younglings here) we were all excited. Then it became more and more fixed-function HW (even though the first blitter was programmable), and now we're going back to a sophisticated, fully programmable "blitter".
Actually, I believe the term for what they're using here is not normal maps but displacement maps. It's a texture map that makes the surface actually protrude rather than simply providing embossing.
We're currently in an in-between stage. BTW, what they're showing is displacement map tessellation. Displacement map tessellation or, for that matter, any kind of tessellation is good for some types of detail creation but not good for others. What they've shown are some of the more suitable applications for the technology. In the overall scheme of things, it's not going to dramatically transform game graphics as perceived by the user but it does make the artist's job a lot easier by lowering the incremental cost of some kinds of additional detail. @ju: Interesting work. Would love to hear more about it over at beyond3d.
OpenGL is better imo.
OpenGL 3.2 supports everything DirectX 11 does without the requirement of Windows Vista / Windows 7 and it's royalty free.
OpenGL has been garbage for years; it's nowhere near as good as DirectX.
The problem with OpenGL is that there is little developer support behind it compared to DirectX. DX is progressing at a much faster rate.
Actually OpenGL is not better. And this TomsHardware article will help explain at least a bit why: http://www.tomshardware.com...
Wuh oh, OpenGL vs. DX <----a neverending debate.
OpenGL has its purpose, and that's mainly in high-precision software like AutoCAD and 3ds Max, or anything workstation-related. Compared to DX it's slower and has fewer features to work with; its libraries are also, for lack of a better term, wacky. If you want precision then go with OpenGL, but if you want better, faster results then go with D3D. Hence the reason DX has been the staple for video games since DX8, while OpenGL is the staple for workstations. Open source is great because it's got a lot of freedom and there's a lot that can be added to it, but it lacks the definition, boundaries and regular upgrade cadence that a closed version has.
Open source over closed. Simply put, guys.
@kakkoi Thanks for the good read. Very well written article.
Eventually, down the road, it won't matter. DX and OpenGL will only provide the framework for sophisticated shaders. OpenCL or DirectCompute (will CUDA die?) is all that will be left.
Thank god Macs don't have DirectX 11. That crap sucks. OpenGL FTW.
You're aware that PCs use OpenGL too, right?
Funny that someone with a Mac is replying to a PC gaming article...
All the PS3 fanboys spend more time in 360 articles than in PS3 articles.
What 360 articles? I thought we'd already held funeral services for that POS.
Wait... the 360 is still in production?
Holy CRAP... look at the texture difference between DX10 and DX11! DirectX 10: http://img25.imageshack.us/... vs. DirectX 11: http://img98.imageshack.us/... DirectX 10: http://img263.imageshack.us... vs. DirectX 11: http://img20.imageshack.us/...
That road is most likely a difference in (real) geometry rather than "just" textures. DX10 might be a bump map, while DX11 looks like real geometry using tessellation (which is nothing but a very efficient way to compress/expand vertices in hardware).
I mainly see texture differences between DX11 and its predecessors. But still, changes always help progress.
Wow! I was *this* close to placing an order on Newegg for a 9800GTX, but now I'm completely sold on DX11. That tessellation really makes a huge difference and should make developers' lives a little easier. The improved multi-threading support is also a big deal. *Waits for Nvidia to release some DX11 cards*
Wow man, why such an old card like the 9800GTX?
Lol, I hope you were being sarcastic, El_Colombiano... XD And yeah, Nvidia's high-end DX11 cards should be out in time for Christmas :P
I am not up to date (well, I never was) with the 9 series. What I read was that it was the 8 series rebranded with smaller dies. The 9800GTX can't compete with, say, a GTX 295.
Yeah, it can't compete with a 295. The part about the card being "old" was all I was talking about XD.
I don't trust those pics. Anyone can make it look like DX11 is all the rage by purposely under-developing the DX10 version so it looks only slightly better than DX9, or shows no difference at all, to inflate the DX11 image. Just my gut feeling.
Yes, they didn't seem to use many of the typical technologies you'd find in DX10/9 games, like occlusion culling, so it made DX11 look like even more of a change. But still, the very fact that tessellation is finally set in stone with DX11, with both teams having to support it in their DX11 cards, makes DX11 an awesome update. Not to mention a GPGPU standard called DirectCompute, and much improved multi-threading, allowing future games to better utilize multi-core CPUs.

Tessellation works by increasing geometry in a fractal way, then using displacement maps to move the now highly detailed mesh's thousands of points up or down. So you create a basic 3D model, then you tessellate it and use displacement maps to add all the fine surface details, like those large spikes on the dragon or the tiles on the roof. Normally this would have to be done by making high-polygon-count models, which would be very taxing on your GPU. But since you're merely tessellating the existing basic geometry and then displacing points with a map, it's a lot less taxing on your GPU than usual, and thus games can look better while still having playable frame rates.
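To picture the "fractal" growth, here's a rough C++ sketch of one common scheme, midpoint subdivision, where every pass splits each triangle into four. DX11's tessellator is more flexible than this, so take it as an illustration of why detail ramps up so quickly from a small base mesh, not as the actual algorithm:

```cpp
// Illustrative midpoint subdivision: triangle count grows 4x per pass.
#include <cstdio>
#include <vector>

struct V3 { float x, y, z; };
struct Tri { V3 a, b, c; };

// Midpoint of two vertices.
static V3 mid(const V3& p, const V3& q) {
    return { (p.x + q.x) * 0.5f, (p.y + q.y) * 0.5f, (p.z + q.z) * 0.5f };
}

// One subdivision pass: split every triangle into four at its edge midpoints.
std::vector<Tri> subdivide(const std::vector<Tri>& in) {
    std::vector<Tri> out;
    out.reserve(in.size() * 4);
    for (const Tri& t : in) {
        V3 ab = mid(t.a, t.b), bc = mid(t.b, t.c), ca = mid(t.c, t.a);
        out.push_back({t.a, ab, ca});
        out.push_back({ab, t.b, bc});
        out.push_back({ca, bc, t.c});
        out.push_back({ab, bc, ca});
    }
    return out;
}

int main() {
    std::vector<Tri> mesh = { { {0,0,0}, {1,0,0}, {0,1,0} } };
    for (int level = 1; level <= 6; ++level) {
        mesh = subdivide(mesh);
        printf("level %d: %zu triangles\n", level, mesh.size());
    }
    // Six passes turn a single source triangle into 4,096; the new vertices
    // are then moved by the displacement map to form the fine detail.
}
```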
I agree that tessellation is the "new hotness" when it comes to GPUs, but I'll use the Crysis example: there's no point introducing brand new tech if more than half of $1k rigs are incapable of running it at acceptable frame rates. I am a PC gamer and I know how much it costs to maintain my hobby, but I can't fathom having to buy a second GPU if I'm just trying to get one game at 14x9 to barely inch over the 30 FPS mark (max IQ, of course). Taking the pessimistic approach here, the things DX11 promises sound just as good as multi-threaded apps did: the theory and the tech are good, but what good is that if two years from now we barely see any fruition? Multi-core CPUs have been out since 2005-2006; 3-4 years later we still barely have a handful of apps that really take advantage of them. The standing question is, what gives gamers the incentive to choose DX11 over a really nice DX10 card when the games aren't there yet, or won't be for two years?

To go off-topic, tessellation is a heavily touted feature that Codemasters is featuring in DiRT 2, enough for ATI to heavily invest in it and pre-sell copies with their existing DX11 products. But Codemasters aren't rebuilding their engine for DX11; it's a port at best, with only some of the features.

To make a long-winded post short: I don't start touting features until the games walk the talk first. It's like the PS3 back in 2006, when the games didn't justify the price tag at launch, or the long PR babble of POWER OF THE CELL!!!!lololololzzzzzz. I don't mean to downplay your post, Kakkoii; I'm just offering an alternative viewpoint, not dissenting. You're clearly well-informed about the gears under the hood. That was just my reaction when I see others write about the tech before the games. DiRT 2 barely qualifies as a DX11 game, and I'd bite my tongue before using DiRT 2 and possibly Crysis 2 to justify a DX11 purchase between now and the end of Q4.
Well, firstly, about the multi-threading in DX11: TH explains it here much better than I could: http://www.tomshardware.com... Multi-threading an application yourself is a very hard job, and hard to do well. That's one of the reasons you still don't see many multi-threaded programs, even though multi-threaded CPUs have been around since the Pentium 4.

On tessellation and the Crysis example: tessellation isn't just about making games look better, it's also about increasing performance. You're keeping the base models low-res, and thus small in file size. From there you load the model, then turn the tessellation level up appropriately for how much that person's GPU is rated for. That's the nice thing about this tessellation: it's dynamic and can be changed on the fly. A game engine can instantly adjust the magnitude of tessellation and displacement on a model. http://developer.amd.com/gp...

If you have an ATI card, download the Character Rendering zip on that page, extract it to a folder and run "CharacterRenderingTessellationDx9.exe". It's an interactive tessellation and displacement map demo: you can adjust the level of tessellation and displacement, see the wireframe and see the number of triangles. With the tessellation slider all the way up, the mesh consists of a ridiculous 3,651,324 triangles. A normal 3D model with that many triangles would easily fill all my RAM and probably freeze my system, but through tessellation I'm able to rotate the head around at a steady 118 FPS on my Radeon HD4650 with 1GB DDR2. The amount of displacement has no effect on the FPS. And this is just a demo made for ATI's HD 2000 series years ago under DX9; DX11's tessellation has even better functionality and performance. So if you have an ATI card, that gives you a nice hands-on with tessellation to understand the impact on quality and performance it can bring.

And yeah, I'm not arguing that games ARE going to be awesome and whatnot; I'm just arguing for the technology put in place with DX11. Whether or not many developers adopt it and utilize it well is something we have to hope for. But at least the tech is here, and it opens up the possibility for even more amazing games in the future. That's the only reason I debate this. I just want people to understand that DX11 itself is very good, with mainly performance-increasing features, and one that increases performance AND quality, which almost sounds crazy, right? lol.

And I agree that there isn't too much reason to upgrade just for DX11 until at least Jan/Feb. At that point there will be a clear view of what Nvidia has against ATI, and what is coming further down the road from both teams. Plus we'll have a larger list of upcoming DX11 titles and dates. Keep an eye on zee list! :P http://en.wikipedia.org/wik...
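As a sketch of what "adjusted on the fly" could look like on the engine side, here's a hypothetical C++ routine that picks a subdivision level per model from a per-frame triangle budget and the camera distance. All names and numbers are invented for illustration; real engines use their own heuristics:

```cpp
// Hypothetical level-of-detail picker for dynamic tessellation.
#include <cmath>
#include <cstdio>

// Triangles after `level` subdivision passes (4x growth per pass).
long trianglesAt(long baseTris, int level) {
    return baseTris << (2 * level);   // baseTris * 4^level
}

// Pick the highest level whose triangle count fits the distance-scaled budget.
int pickTessLevel(long baseTris, float distance, long frameBudget) {
    // Closer objects deserve more detail; halve the allowance per 10 units.
    long allowance = (long)(frameBudget / std::pow(2.0f, distance / 10.0f));
    int level = 0;
    while (level < 6 && trianglesAt(baseTris, level + 1) <= allowance)
        ++level;
    return level;
}

int main() {
    long base = 2000;   // low-poly source mesh on disk stays this small
    float dists[] = { 1.0f, 15.0f, 40.0f };
    for (float d : dists)
        printf("distance %.0f -> tess level %d\n", d, pickTessLevel(base, d, 2000000L));
}
```

The point is that the asset itself never changes; only the subdivision level does, which is why the same model can scale from a weak GPU to a strong one.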
@Fantasy Star: I'm curious what graphics card you can't peak over 30 FPS on in Crysis? My rig is a year old (8 gigs of DDR2, 2.66 quad core, and an ATI 4870) and I run it on high settings at 1680x1050 (no AA, unfortunately) at a solid 40 FPS. I'm asking because I'd expect you to have a comparable setup in that price range, and if you do have a similar card, I was wondering whether some of your settings might be making you take a bit of a hit.
@Lich120: I built my rig in 2007 and accrued expenses along the way that tally over $1k. My GFX card is an 8800GT from '07 (you can check my bio for the full list). I haven't upgraded in some time, knowing that DX11 is coming soon and then I'll be good to go. I'm aware that cards like the HD 4870 and GTX 275 are capable of producing the necessary FPS for my needs. With ToD mods and .cfg mods I'm able to get a playable 20-24 FPS with minor slowdowns most of the time, but I would like to strive for more in the future. I do appreciate the gesture of help, though.

I referenced Crysis because I believe that tech shouldn't come at the price of performance. When Crysis released in 2007, it brought every single card on the market to its knees. What good is tech if we can't even play it properly, the way it was meant to be played, for at least 1-2 years? If Crysis 2 debuts with DX11 and repeats this history, it's just going to hurt our wallets. What I will do when I get my first DX11 card is reinstall Crysis and unleash the full weight (along with ArmA II). And I'm a firm believer that at my resolution (14x9), I should be able to buy a single GPU and get acceptable frame rates (30+ FPS). Taking a page from consoles, adding more raw power isn't the solution to performance issues; using existing hardware and optimizing for the current envelope is. I would add that I was one of those people who thought Crysis was very unoptimized, but after Crysis Warhead I believe Crytek did all they could and their tech was just years ahead of our hardware, which is a good thing. It's a great benchmark and gives us something to strive for.

@Kakkoii: I've always been aware that ATI had a hand up on the competition with their built-in tessellation units, vastly underused due to how developers code their games. Oh boy, you nailed all the points I wanted to write. As usual, always a pleasure to discuss PC topics with ya. DX11 is the new DX9, the way DX10 was the DX8 of its day. History does repeat, and that's reason enough to be excited for DX11. I hope Rage uses DX11, given Carmack's position.
Thanks for the link to the demos! Awesome stuff.
You nerds always look at the bad. Looks good to me.
Automatic bubbles for life for you, my friend. Buckethead is a sexy beast and a ****ing amazing guitarist. Rock on!
Yup, he's one of my favorite guitarists ever! Oh, and thanks! :D Positive feedback to you as well! \m/
Sorry for double post.
What a jerk...
WHAT A TWIST!
Unigine is actually a good reflection of the three iterations of DX. I thought this was going to be a noob article posting screenshots from three different games across the three generations (which would have different art styles/settings, with no real way to compare), but that engine has been well optimised. Unlike the new STALKER screenshots we've seen, where the "DX10" character shots have people with octagonal heads, and then when they post the DX11 pics it just looks like they've turned on a little bit of AA. It's a shame to think of all that could have been in Crysis 2 had the consoles not got their greedy hands on it. But tessellation is just one feature; I'm sure an optimised, more scalable Crysis 2 will still keep up with the best, and surpass the best if the new STALKER is anything to go by. I also hope the AvP engine shots I've seen are from the console version, because if not, it's not looking pretty for the first DX11 games at all. Now to start saving for Fermi again after another PC hardware expenditure... New PSU: $229. Not having to RMA your card because Antec PSUs are ****suckers: priceless.

EDIT: When I first saw Unigine, I thought it was another deceptive trick where one version of DX is heavily optimised and the other is not, making one look better by comparison... I was blown away when I found out that tessellation could do that automatically. I wonder how they program games for it. If the hardware can just turn textures into geometry, wouldn't it be possible in DX10 games, even via very small tweaks, or would they have to re-texture an entire game with compatible textures?

EDIT 2: Apparently Crysis 2 will have DX11 support. That's the first I've heard of it :S
Well, it's not so much that textures are being turned into geometry, but that a texture is displacing geometry, hence the name "displacement map". The texture is being used as a sort of displacement value sheet. Each pixel of the texture has a brightness value, from pure black to white; the brighter the pixel, the higher its displacement value. So when you map this texture onto a model at, say, 1024x1024, you have over a million pixels' worth of displacement values spread across the model. And the more polygons/triangles a model has, the more points these displacement maps can displace, creating more defined detail. http://en.wikipedia.org/wik... So no, you can't reconfigure an old game with just a few tweaks. You need to create displacement maps, and there's no tessellation engine present in DX10, only a hard-coded solution on ATI's HD2000 and up. DX11 brings a unified hardware/software standard that both teams will be using in their DX11 GPUs.
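Here's a small illustrative C++ sampler for that "brightness = displacement value" idea, using bilinear filtering so the displacement ramps smoothly between texels instead of stair-stepping per pixel. The function names are hypothetical:

```cpp
// Illustrative displacement-map sampler: brightness maps to height.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

// Clamped texel fetch from an 8-bit grayscale map: black = 0.0, white = 1.0.
float texel(const std::vector<uint8_t>& map, int w, int h, int x, int y) {
    x = std::clamp(x, 0, w - 1);
    y = std::clamp(y, 0, h - 1);
    return map[y * w + x] / 255.0f;
}

// Bilinear sample: blend the four surrounding texels by fractional position.
float displacementAt(const std::vector<uint8_t>& map, int w, int h, float u, float v) {
    float fx = u * w - 0.5f, fy = v * h - 0.5f;
    int x0 = (int)std::floor(fx), y0 = (int)std::floor(fy);
    float tx = fx - x0, ty = fy - y0;
    float top = texel(map, w, h, x0, y0)     * (1 - tx) + texel(map, w, h, x0 + 1, y0)     * tx;
    float bot = texel(map, w, h, x0, y0 + 1) * (1 - tx) + texel(map, w, h, x0 + 1, y0 + 1) * tx;
    return top * (1 - ty) + bot * ty;   // 0..1, scaled by the engine's max height
}

int main() {
    // 2x1 map going from black to white: sampling halfway lands near 0.5.
    std::vector<uint8_t> map = { 0, 255 };
    printf("mid displacement = %f\n", displacementAt(map, 2, 1, 0.5f, 0.5f));
}
```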