
Opinion: Why tessellation is the most overhyped DirectX 11 feature

Nvidia and AMD have been going on about tessellation for ages, but what is it, and why is Nvidia's all-in approach misleading?

pcauthority.com.au
chak_2409d ago

It's not overhyped, it's just not yet mastered.

But it has real potential for resource-saving. I'm waiting for BF3 to see what they can come up with, as they seem to concentrate their efforts on DX10/11.

JsonHenry2409d ago

I don't think it is overhyped. No one has used it enough yet. The famous DX11 benchmark programs all show how awesome tess can make things look. The problem is devs haven't fully utilized it yet.

DelbertGrady2409d ago

That was a tessellating read.

Syaz12409d ago

Tessellation is still something devs are mastering, but it's getting popular very quickly. I think it's the next step in advancing video game graphics. Imagine if you could see the Necromorphs' scaly skin, or the shape of the scar on Commander Shepard's face? It would improve realism a lot, and once devs have perfected it, it would be mainstream.

Btw, personal opinion, Unigine's Heaven tessellation is good looking, although extreme tessellation is seriously extreme. How do you walk on a pavement like that in real life anyway?

WhiteNoise2409d ago (Edited 2409d ago )

Ummm... just because it sucks in DA2 does not mean it sucks.

That's like picking a console game with 16-bit HDR and saying HDR sucks as a result. It's only when people have seen the mind-blowing 128-bit HDR in Crysis and Far Cry 2 in DX10 that they realise how amazing it is.

I turn tessellation off in DA2 because of the 'sinking feet' B.S.

It's the fault of the developers, not the technique.

@Snake

It increases polygon count and adds depth to objects, e.g.

http://www.hardocp.com/imag...
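To make the "increases polygon count" part concrete, here's a rough sketch of the basic idea (illustrative Python, not actual DX11 hull/domain shader code): one level of subdivision splits every triangle into four by connecting edge midpoints, so the count multiplies fast.

```python
# Conceptual sketch of tessellation's refinement step (not real GPU code):
# each triangle is split into 4 smaller ones via edge midpoints.

def midpoint(a, b):
    return tuple((a[i] + b[i]) / 2 for i in range(3))

def subdivide(tri):
    """One level of midpoint subdivision: 1 triangle -> 4 triangles."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    """Apply subdivision repeatedly; count grows as 4**levels."""
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris

base = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
print(len(tessellate(base, 3)))  # 1 triangle -> 64 after 3 levels
```

The "adds depth" part then comes from pushing the new vertices around (e.g. by a displacement map), which a flat, coarse mesh simply doesn't have the vertices to do.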

Syaz12409d ago

People who disagree with you are clearly clueless about tessellation, or ps3 fanboys butthurt over having only software tessellation.

BK-2012409d ago (Edited 2409d ago )

Why the hell do you idiots have to mention PS3 fanboys in every goddamn comment you make? They aren't the problem, it's retards like you who are paranoid as shit.

drexl2408d ago (Edited 2408d ago )

I love how you PC elitists always single out the ps3 for hate. Why not the 360 as well? Are you and the 360 fanboys fuckbuddies sucking each other's cocks or something?

bozebo2409d ago (Edited 2409d ago )

WhiteNoise understands.

Tessellation is the core reason why the Samaritan demo looks so good; it is also heavily used in Battlefield 3 and The Witcher 2. The three best-looking gameplay graphics examples around.

@syaz1
The 360 has to tessellate on the same graphics cores as the other vertex and pixel shaders (it uses a unified architecture), so it limits how much power is available for other graphical effects. The PS3's CPU is particularly good at floating-point calculations (like a graphics chip), so it is able to dedicate a core to tessellation and use some of that spare CPU power that games generally don't need. The problem with that is the programmers must manually write and debug the software tessellation code rather than call upon the graphics hardware to do the job, which is why it isn't/won't be done very much.

LostDjinn2409d ago (Edited 2409d ago )

Right on D!

Now for me to annoy people with the reason why.

Firstly, DA2's required compute power for tessellation is minimal.

The rasterized objects/image (done by all GPUs) are normal. A very simple algorithm is then applied to the object/image, with only the baseline report requiring any real compute power (and very little at that). After the report is crunched, the GPU applies the outcome to the image and rescales it. There ya have it, basic tessellation.

As time goes by the algorithm will become more elegant and the number of baseline reports processed will grow. It'll lead to far better outcomes and less overlap.

The thing is that if you have a fast enough bus, anything with compute capacity (CPUs included) can read the baseline results. The DX11 bit just means it'll take place on the GPU.
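A hedged sketch of the refine-then-displace idea behind all this (illustrative Python, not GPU code; the `height` function here is a made-up toy, real pipelines sample a displacement texture): after subdivision has added vertices, each one gets pushed along the surface normal by a sampled height, which is where the extra depth comes from.

```python
import math

# Toy stand-in for a displacement map: a smooth bump function.
# (Assumption for illustration only; real renderers sample a texture.)
def height(u, v):
    return 0.05 * math.sin(6 * u) * math.cos(6 * v)

def displace(vertices, normal=(0.0, 0.0, 1.0)):
    """Push each vertex along the normal by the sampled height."""
    out = []
    for (x, y, z) in vertices:
        h = height(x, y)
        out.append((x + normal[0] * h,
                    y + normal[1] * h,
                    z + normal[2] * h))
    return out

flat = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (0.0, 0.5, 0.0)]
bumpy = displace(flat)  # same vertex count, no longer flat
```

Without the extra vertices that tessellation provides, the displacement would have nothing to move, so a coarse mesh stays flat no matter how detailed the map is.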

*smiles*
Told you I'd annoy people. ;)

@above: In the 360 the baseline reports cannot be read by GPU cores using DX9. NEC's daughter die or the Xenon CPU would have to be used, and with the bandwidth of the FSB array on the 360 you know it's going to be NEC's handiwork for sure.
