AMD Radeon HD 7900 series brings us closer to Pixal quality graphics; behold 'Leo' DX11 Tech-demo

DSOGaming writes: "This is simply too good to pass by. AMD has released a DX11 tech demo for its new 7900 series GPUs that looks phenomenal. The tech demo is a proof-of-concept, with the aim of demonstrating certain rendering methods at run-time. The two main points here are the use of hardware-managed virtual texturing (PRT) and a forward rendering pipeline with compute shader based light occlusion, which allows the application of hardware MSAA by avoiding the pitfalls of deferred rendering. And it looks pretty amazing."
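
The compute-shader "light occlusion" the blurb describes (the tiled light culling approach later popularized as Forward+) can be sketched on the CPU: split the screen into small tiles, and for each tile keep only the lights whose bounding volumes touch it, so the forward shading pass loops over a short per-tile list instead of every light in the scene. A minimal Python sketch of that idea, with illustrative numbers and screen-space circles standing in for the view-space frustum tests a real implementation would use — this is not AMD's actual demo code:

```python
TILE = 16  # tile size in pixels; 16x16 is a common choice for tiled lighting

def cull_lights(width, height, lights):
    """Build a per-tile list of light indices.

    lights: list of (x, y, radius) in screen space. Real implementations
    cull light volumes against per-tile frusta in view space; a
    circle-vs-rectangle test captures the same idea in 2D.
    """
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    grid = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for i, (lx, ly, r) in enumerate(lights):
        # Only visit tiles overlapped by the light's bounding square,
        # then do an exact circle-vs-tile test for each.
        x0 = max(0, int((lx - r) // TILE))
        x1 = min(tiles_x - 1, int((lx + r) // TILE))
        y0 = max(0, int((ly - r) // TILE))
        y1 = min(tiles_y - 1, int((ly + r) // TILE))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                # Closest point on the tile rectangle to the light center.
                cx = min(max(lx, tx * TILE), tx * TILE + TILE)
                cy = min(max(ly, ty * TILE), ty * TILE + TILE)
                if (lx - cx) ** 2 + (ly - cy) ** 2 <= r * r:
                    grid[ty][tx].append(i)
    return grid

grid = cull_lights(64, 64, [(8, 8, 10), (60, 60, 4)])
print(len(grid[0][0]))  # lights touching the top-left tile → 1
```

Because shading stays in a single forward pass, the hardware MSAA resolve works normally — the pitfall of classic deferred rendering that the blurb alludes to is that MSAA on a G-buffer is expensive and awkward.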


Obvious typo, lol. Pixal = Pixar

CarlitoBrigante2305d ago

I'm going to get a 7950 as soon as possible! Currently have the 6870.

dark-hollow2305d ago

That's it??
Sorry, but I'm not impressed.

kevnb2305d ago

if it was a console you would jizz yourself.

dark-hollow2305d ago

Yes i would.

A non-playable tech demo of that level, on a GPU that's supposed to be generations above current consoles, is not impressive.

Don't get me wrong, the GPU itself smokes anything available right now, but that demo doesn't really do the GPU justice.
I'll wait for graphics-heavy games like Crysis 3 to show this GPU's real potential.

FanboyPunisher2305d ago (Edited 2305d ago )

Nice marketing demo, I'm sure it'll make some ignorant people go get it.

A tiny feature like that isn't a reason to upgrade; FPS and settings are the main reasons.

Waiting on the Nvidia 600 series; the 7970 ain't that hot with all the problems I've been reading about on HardOCP and other tech forums.

Nvidia has yet to let me down, more so when it's an EVGA card with a lifetime warranty and RMA upgrade. Can't say the same for ATI, I mean AMD.

CaptCalvin2305d ago

I don't think I've heard of a "Pixal." "Pixar" though sounds more familiar.

Kewl_Kat2305d ago

Very true. And that's also why "murder" is murder and not "murdel". Sounds tougher, just ask Dwight Schrute lol.

plato2305d ago (Edited 2305d ago )

Which will never happen...unless you want the PS4 to cost $800+


And look at what happened to the PS3 at its $600+ price point, and to the NEO-GEO when it launched.

The problem is MANY people cannot afford a console at those prices, which is why consoles launch at a set price.
The other reason you will NEVER see a high-end GPU like the 7970 or GTX 580 inside any of these consoles is that the heat dissipation AND cost of development would be insane.

Expect to see a low-end to mid-range GPU inside the PS4 AND Xbox 720 because of this.

Ares84HU2305d ago

I would pay the money if we were given a jaw-dropping atomic reactor of a console. I don't care about people who can't afford it. They are the ones holding tech back anyway, because consoles have to be built so that even the cheap bastards can afford them.

Machioto2305d ago

@Plato I say this every time people bring up the PS4's graphics card: it's supposed to be a high-end 6 series PowerVR chip.

Half-Mafia2305d ago

@Ares84HU Build a gaming PC then, because you're not going to be getting it on the PS4/720.

I built a gaming PC last year for less than £1,000 and it has two HD 6970s inside. I also don't care what the next PlayStation's price is.

'I don't care about people who can't afford it'

But Sony does, because they want to sell to as many people as possible. So don't expect the PS4 to be any more than £300.

If price were never a deciding factor, we would be living in one hell of a world right now.

Hayabusa 1172305d ago


YOU may not care, but Sony does...and no one is holding technology back. AMD just built their Radeon HD 7900 and you can buy it if you want, so why wait another 2 years to see it in the PS4? Why not just build your own PC with as many graphics cards as you want? Oh wait, lemme guess...

CarlitoBrigante2305d ago

"Which will never happen?" Sorry for the harsh words, but are you an idiot?

The PS4 won't be releasing until the end of 2013, maybe even 2014. The 79xx series has been/is being released since last month.

When the PS4 launches, we will be two graphics card generations further on (±2 years), and these cards will cost less than $150.

plato2305d ago


"When the PS4 launches, we will be two graphics card generations further on (±2 years), and these cards will cost less than $150."

How old is the AMD 4850, and that was not even a high-end AMD card? It's more than 2 years old and it still sells for $179 in some places.

You ACTUALLY think a 7970 type card will be $150 in two years? LOL.
AND you're calling me an idiot? Jesus Christ...

danthebios2305d ago

If I'm not mistaken, Sony has the money and technology to make their own graphics card.

plato2304d ago


"If I'm not mistaken, Sony has the money and technology to make their own graphics card."

It would cost Sony BILLIONS to do this. Moreover, this is not an area of their expertise.
A dedicated GPU would mean an entire subdivision Sony would have to create. That's BILLIONS of dollars.
Don't expect to see that happen any time soon.

Ares84HU2305d ago

As far as I know, the PS3 uses an nVidia-based GPU and the Xbox uses an ATI one, so maybe not, but it would be nice.

KMCROC542305d ago

Just from what I know, anything DX11 will never be put in a PS product; maybe an OpenGL equivalent that can render similar tech.

anonym2305d ago

A "DirectX 11 GPU" is just one that meets the hardware- and software-support specifications to qualify it as being DirectX 11 compliant. It doesn't imply any form of exclusivity. Any consumer GPU you might buy will have support for both DirectX and OpenGL.

KMCROC542305d ago (Edited 2305d ago )

@anonym I was under the impression that DX was MS tech, meaning that anyone not allowed to use said tech would have to use something similar to DX in their products. I have always heard that Sony uses OpenGL as opposed to DX tech.

anonym2305d ago

Yes, DirectX is a Microsoft product, but it's just a graphics API, a software layer running on the CPU that interfaces with a machine's graphics driver. Like OpenGL, it allows programmers to write code that can run on any supported GPU instead of writing low-level code for each card.

So while Sony won't be using DirectX itself, there's no reason for them not to use a GPU just because it can support it.
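
The point anonym is making, that a graphics API is an interchangeable software layer the application codes against, can be illustrated with a toy abstraction. The class and method names below are hypothetical stand-ins, not any real driver interface; the returned strings merely echo the style of real Direct3D and OpenGL draw calls:

```python
class Renderer:
    """Toy stand-in for a graphics API: the game talks to this
    interface and never to the GPU directly."""
    def draw_triangles(self, count):
        raise NotImplementedError

class D3D11Renderer(Renderer):
    def draw_triangles(self, count):
        return f"DrawIndexed({count * 3})"      # Direct3D-flavored call

class OpenGLRenderer(Renderer):
    def draw_triangles(self, count):
        return f"glDrawElements({count * 3})"   # OpenGL-flavored call

def render_scene(api: Renderer):
    # Game code is written once against the interface; the same GPU
    # hardware can sit behind either backend.
    return api.draw_triangles(2)

print(render_scene(D3D11Renderer()))   # DrawIndexed(6)
print(render_scene(OpenGLRenderer()))  # glDrawElements(6)
```

This is why a "DirectX 11 GPU" in a PlayStation is unremarkable: the label describes the hardware's feature level, and the same silicon serves whichever API the platform ships.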

adorie2305d ago

Hm, if Sony does go with a PowerVR-based gfx chip, then that would mean it's capable of rocking DX11.

Not saying they will use that API, since Sony seems to like OpenGL.

EditorAtGNG2305d ago

I didn't understand anything from the technical "mumbo-jumbo" in the description, but the video does speak for itself.

Time to upgrade from 5770 to 7770 I guess :)

RedDead2305d ago

Why? The 5770 should be fine for a few more years, shouldn't it?

aaaaaaaaa2305d ago

@ Dragonshardz
That would be the 7870; AMD changed the numbering system last year for the mid-range cards.

@ RedDeadDestroyer
All depends on how much detail you want in a game. I've got two 5770s and they are starting to show their age.

CaptCalvin2305d ago (Edited 2305d ago )

HD5850 here. Runs BF3 butter smooth at max graphics, with the exception of the faux AA (FXAA, which looks fine anyway), at 1920x1200. Still see no big reason to upgrade yet.

LightofDarkness2305d ago

I can't imagine that the 7770 would be a worthy upgrade, but I could be wrong. I'd wait a little while longer and possibly go for something in the 7800 series, which should be very affordable and noticeably faster.

Mikhail2305d ago

Wait for Nvidia to show its cards. Even if they disappoint, AMD will lower the 7000 series prices due to competition.

john22305d ago

Precisely that. Word has it, though, that Nvidia's latest GPUs will be even better than the 7900 series. Can't wait to see their tech demo for them :D

CaptCalvin2305d ago (Edited 2305d ago )

My HD5850 runs BF3 at max settings except the faux AA (FXAA, which works for me as I don't gaze at pixels) @ 1920x1200. I still see no real big reason to upgrade. Games won't come out looking like Pixar movies anytime soon anyway, probably not until the next console generation.
