
Exclusive: DirectX 12 Will Allow Multi-GPU Between GeForce And Radeon

A source with knowledge of the matter gave us some early information about an "unspoken API," which we strongly infer is DirectX 12.

Read Full Story >>
tomshardware.com
mikeslemonade3760d ago

This is awesome. Hopefully all developers optimize for this feature.

NuggetsOfGod3760d ago (Edited 3760d ago )

Ok pc...

You win..

Seriously, as much as I like Nvidia, if this is true they'll find a way to stop it lol.

AMD GPUs won't work with Nvidia's new gxcfxkAA feature lol

Let's hope the devs have more control over this and the companies can't stop it.

jmd7493760d ago

Not gonna happen.

Not if Nvidia and AMD have anything to say about it.

Volkama3760d ago

Should be marked as a rumour really.

-The report is about an "unspoken API" that they assume is DirectX 12
-Even if the API allows it, nVidia have previously taken steps to prevent people from using AMD and nVidia cards in conjunction (it was possible to use an AMD primary card with an nVidia slave for PhysX, until nVidia put a stop to it).

Christopher3760d ago

I thought this has been rumored for a while now. Am I mistaken?

Volkama3760d ago

DirectX 12 is said to be able to see multiple GPUs as one entity, but that's not the same as cross-vendor support. Right now you should use the DDU tool in safe mode to completely remove all traces of an Nvidia driver before attempting to swap to an AMD card. And let's not talk about GameWorks and the like.

But regardless, the article is an anonymous source talking about an "unspoken API". I'm not dismissing it as trash, just saying it should probably be marked as a rumour until we see some sort of confirmation.
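To make that distinction a bit more concrete: DXGI will happily enumerate adapters from any vendor, and under DX12-style explicit multi-adapter it's the application, not the driver, that decides what to do with each one. A rough, untested sketch of that application-side view (my own illustration, nothing to do with the article's source):

```cpp
// Rough sketch: list every GPU the OS exposes through DXGI. Under explicit
// multi-adapter the app decides how to use each adapter, which is why mixed
// AMD/NVIDIA setups are at least visible to it.
// Build with: cl /EHsc adapters.cpp dxgi.lib
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cwchar>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        // VendorId 0x10DE = NVIDIA, 0x1002 = AMD, 0x8086 = Intel.
        std::wprintf(L"Adapter %u: %ls (vendor 0x%04X)\n",
                     i, desc.Description, desc.VendorId);
        adapter.Reset();
    }
    return 0;
}
```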

joeorc3760d ago (Edited 3760d ago )

It doesn't really need to be marked as a rumour, in my opinion; see for yourself:
I doubt this is that much of a stretch for DirectX 12 to gain such functionality.
SLI or CrossFireX is just the same concept on a smaller scale: setting up a small GPU grid-based cluster.

"Explicit Asynchronous Multi-GPU"

Date: 25 Feb 2012
Efficient parallel implementation of the lattice Boltzmann method on large clusters of graphic processing units

http://link.springer.com/ar...

GPU clustering for grid distribution across GPUs is not that much of a leap; it's not unlike what grid computing has been doing with CPUs for so long. A further refined form of grid distribution for GPGPU was always going to be the next step.

It really is a natural progression, in my opinion: first more multi-threading, then the move to multi-core, and now grid clustering, but with GPUs, and it could account for many types of GPU inside your grid. MSI, for example, has boards that support such a configuration with both an AMD and an Nvidia card installed.

jhoward5853760d ago (Edited 3760d ago )

This is great news. My only concern is how well multiple GPUs can run together in sync.

Kavorklestein3760d ago

In the words of Hannah Montana, "It's the best of both worlds." Haha, jokes aside, if this rumour is true then Mantle should just go away forever. Being able to mix two cards would be revolutionary: you could have one card that is good for heavy shadows and one that is good for dynamic lighting and textures, and get the best of both worlds.

Bigpappy3760d ago

I have never tried anything like this and have no idea what the benefits would be. I understand that there is more GPU power, but how do you combine that power for the same app and then have both cards feed the same output? How does the app know how to split the work?

peowpeow3760d ago

http://goo.gl/hmznkw

This gem connects the two GPUs. It is up to the developer to support CrossFire/SLI; generally they do.
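To give a feel for how the split usually works, the classic answer under SLI/CrossFire is alternate-frame rendering: one GPU takes the even frames, the other the odd ones, and the driver/bridge presents them as a single stream. A purely conceptual, self-contained sketch (stand-in types, not a real driver API):

```cpp
// Conceptual only: "rendering" a frame is just a printf here. The point is
// the scheduling pattern, not the graphics work.
#include <cstdint>
#include <cstdio>

struct Gpu { int index; };

// Stand-in for recording and submitting one frame's command lists to a GPU.
void submit_frame(const Gpu& gpu, uint64_t frame)
{
    std::printf("frame %llu -> GPU %d\n",
                static_cast<unsigned long long>(frame), gpu.index);
}

int main()
{
    Gpu gpus[2] = { {0}, {1} };
    for (uint64_t frame = 0; frame < 8; ++frame)
    {
        // Alternate frames between the two GPUs; the driver/bridge is
        // responsible for pacing and presenting them in order.
        submit_frame(gpus[frame % 2], frame);
    }
    return 0;
}
```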


NVIDIA Smooth Motion: Up to 70% More FPS Using Driver Level Frame Gen on RTX 50 GPUs

NVIDIA's RTX 50 "Blackwell" architecture has been a bit of a bore for us gamers. Apart from Multi Frame Generation, which has limited use-case scenarios, there isn't much to be excited about. Multi Frame Generation is achieved using GPU-side Flip Metering, with the optical flow field data generated by AI models on the Tensor cores.
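Conceptually, this kind of frame generation inserts one synthesized frame between each pair of rendered frames, which is where the "up to 70% more FPS" headroom comes from. A toy sketch of the frame-count arithmetic only (my own illustration; a midpoint blend stands in for the real optical-flow model):

```cpp
// Toy illustration: frames are just numbers, and "generation" is a midpoint
// blend. The real pipeline infers motion with AI models on the Tensor cores.
#include <cstdio>
#include <vector>

int main()
{
    std::vector<double> rendered  = {0.0, 1.0, 2.0, 3.0}; // frames from the game
    std::vector<double> presented;

    for (size_t i = 0; i < rendered.size(); ++i)
    {
        if (i > 0)
        {
            // Synthesize an in-between frame from the previous pair.
            presented.push_back((rendered[i - 1] + rendered[i]) / 2.0);
        }
        presented.push_back(rendered[i]);
    }

    std::printf("rendered %zu frames, presented %zu frames\n",
                rendered.size(), presented.size());
    return 0;
}
```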

Read Full Story >>
pcoptimizedsettings.com

PNY NVIDIA GeForce RTX 5060 Ti GPU Review

Between the price, performance and power draw, with the GeForce RTX 5060 Ti, NVIDIA nailed the mainstream formula.

Read Full Story >>
cgmagonline.com

Nintendo Switch 2 Leveled Up With NVIDIA AI-Powered DLSS and 4K Gaming

Nvidia writes:

The Nintendo Switch 2 takes performance to the next level, powered by a custom NVIDIA processor featuring an NVIDIA GPU with dedicated RT Cores and Tensor Cores for stunning visuals and AI-driven enhancements.

Read Full Story >>
blogs.nvidia.com
ZycoFox72d ago

The ray tracing probably doesn't even equal a low-end PC GPU, and even if it did, it would probably be mostly useless. They'll probably force it into some game now that will run like shit, maybe 30fps at best, just because "it can do it".

B5R72d ago

Raytracing is so unnecessary for a handheld. I just hope you can turn it off.

Vits71d ago

A lot of gamers don’t realize that ray tracing isn’t really about making games look better. It’s mainly there to make development easier and cheaper, since it lets devs skip a bunch of old-school tricks to fake reflections and lighting. The visual upgrade is just a nice bonus, but that’s not the main reason the tech exists.

So you can be 100% sure that developers will try to implement it every chance they get.

RaidenBlack71d ago (Edited 71d ago )

Agree with Vits... but to add to that: if devs and designers just drop RT into a game world, it won't always work as expected. RT is not just reflections but lighting and illumination as well. For example, if you create a room with minimal windows, it will look dark af with RTGI enabled. Devs and designers need to sort out the game world design accordingly as well.
DF's Metro Exodus RT upgrade is an amazing reference video to go through, if anybody's interested.

darthv7271d ago

So is HDR... but they have it anyway.

thesoftware73071d ago

Some PS5 and Series X games run at 30fps with RT... Just like on those systems, if you don't like it, turn it off.

I only bring this up because you make it seem like a problem exclusive to the Switch 2.

Neonridr71d ago (Edited 71d ago )

sour grapes much?

"It probably doesn't do it well because it's Nintendo and they suck". That's how your comment reads. Why don't you just wait and see before making these ridiculous statements?

Goodguy0172d ago

Please. I'd like to play my Switch games on my 4K TV without them looking all doodoo.

PRIMORDUS72d ago

Nvidia could have said this months ago and cut the bullshit. Anyway the rumors were true.

Profchaos71d ago

Would have been nice, but an NDA likely prevented them from saying anything.

PRIMORDUS71d ago

TBH I don't think Nvidia would have cared if they broke the NDA. They'd pay a little fine and go back to their AI shit. They don't even care about GPUs anymore. I myself would like them to leave the PC and console market.

Tacoboto71d ago

This story was written half a decade ago when the world knew Nvidia would provide the chip for Switch 2 and DLSS was taking off.

Profchaos71d ago

Yeah, but a similar thing happened a long time ago: when 3dfx announced they were working with Sega as they took the company public, Sega pulled out of the contract for the Dreamcast GPU. In response to the announcement, Sega terminated the contract and went with an ultimately weaker chipset.

So there's a precedent, but it's not like Nintendo would have had much of an option either way; it's AMD, NVIDIA or Intel.

Profchaos71d ago

I'm not expecting anything from ray tracing, but DLSS will be the thing that sees the unit get some impossible ports.

andy8571d ago

Correct. All I'm seeing online is that it'll never run FF7 Rebirth. If it can run Cyberpunk, it'll run Rebirth, and DLSS will help. Obviously only 30fps, but a lot of people don't care.

Profchaos71d ago (Edited 71d ago )

Exactly right. When I buy a game on Switch I know what I'm getting into: I'm buying it for portability, and I'm willing to sacrifice fidelity and performance to play on a train or comfortably from a hotel room when I travel for work.
