
Nvidia Showcases Cloud Based Indirect Lighting At Siggraph 2013

Siggraph 2013 is currently under way, and Nvidia has showcased some impressive cloud techniques it is working on.

Read Full Story >>
dsogaming.com
Need4Game 4347d ago

Cloud-based Nvidia PhysX please, since the PS4 console doesn't have dedicated Nvidia hardware for PhysX.

john2 4347d ago

This opens up plenty of possibilities. My guess is that PhysX Cloud will happen in the near future.

Pandamobile 4346d ago (Edited 4346d ago)

I'm not sure how many applications there would be for cloud-computed physics.

The only cloud-based technologies that will have real-world applications are the ones that aren't latency sensitive. Physics interactions are usually some of the most latency-sensitive computations in a game engine. They usually have their own sub-routines in the rendering pipeline that update much more often than the screen actually displays. The LOWEST level of latency you're ever going to want between physics objects in a game is 1/60 of a second. In a lot of cases, engines update their physics at 1/120 or even 1/300 of a second.

Anything that has a direct impact on gameplay will always be done on the client's side.
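Pandamobile's point can be put in concrete terms. A minimal sketch (all figures illustrative, not from the article; the 150 ms round trip is an assumed cloud latency) comparing a round trip against the physics tick rates mentioned above:

```python
# Illustrative numbers only: compare an assumed cloud round-trip time
# against the fixed physics timesteps mentioned above (1/60, 1/120, 1/300 s).
PHYSICS_RATES_HZ = [60, 120, 300]
CLOUD_RTT_MS = 150  # hypothetical round trip to a cloud server

for rate in PHYSICS_RATES_HZ:
    step_ms = 1000 / rate          # duration of one physics substep
    missed = CLOUD_RTT_MS / step_ms
    print(f"{rate} Hz physics: {step_ms:.1f} ms/step, "
          f"{missed:.0f} steps elapse during one 150 ms round trip")
```

At 300 Hz, 45 physics steps would pass before a cloud reply arrived, which is why latency-sensitive interactions stay on the client.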

GameNameFame 4346d ago

There was already discussion of cloud lighting.

It is still very limited in many ways and does not look as impressive as lighting run on local hardware.

Regardless, partial cloud computing is limited; the real future is full cloud computing.

Both Sony and MS are heading for this.

In the meantime, we are stuck with local hardware. That means the weaker X1 specs won't change.

fr0sty 4346d ago (Edited 4346d ago)

Eventually, when we're all connected with fiber, we'll be able to have in-game visuals that are aided by the cloud to create nearly unlimited capabilities. However, given that they still have lag and artifacts even while only running at 30fps and using top-of-the-line hardware on both ends, it's going to be a little while. There is a very noticeable lag between when light sources move and when they illuminate the environment. There are also some odd artifacts around some light sources.

nukeitall 4346d ago

Notice how huge latencies barely affect the experience, if at all.

Now who said the cloud is smoke and mirrors?

"We [Microsoft] have something over a million servers in our data center infrastructure. Google is bigger than we are. Amazon is a little bit smaller. You get Yahoo! and Facebook, and then everybody else is 100,000 units probably or less."

http://www.neowin.net/news/...

DeadlyFire 4346d ago (Edited 4346d ago)

Cloud tech is very limited for games you play from a disc or installed on your hardware. It can offer some tricks here and there, but to take full advantage of it, bandwidth would have to be pretty high, I believe. Although it could be funny to see a game or two with all-cloud lighting have the lights go out when a lag hiccup occurs.

In the future, when games are all streamable and everyone in the country has cheap access to 50 Mbps or more, cloud tech has lots of possibilities. Right now it's just a showcase of what it can do at the moment, which is basically playing with the lights in the background while you're looking around. I would like solid games with high fps before cloud lighting takes a place in any game design.

awi5951 4346d ago

OnLive already proved you don't need the best hardware to play games at max settings. You can run the game elsewhere, and your console is just a display device.

Kleptic 4346d ago

@nukeitall...

did you notice the KEY difference with this cloud processing?...

there is no 'internet' in this example...the 150ms latency is just in the short-range network between these devices...the most powerful option requires a hard line...and still has over double what is generally acceptable latency for real-time rendering...

and this response latency is completely different from ISP-related latency...which has absolutely nothing to do with processing latency...it's just built-in lag that ISPs add for request queues...

So...until MS...or Sony...or Nvidia...or anyone...is putting the 'cloud' in your house with a purpose-built network for it...and you don't have to use Comcast or some other hopelessly shatty residential ISP...this reflects nothing for consoles launching this year...

BallsEye 4346d ago

@DeadlyFire

In the future? In Europe you can get 100 Mbps anywhere. In my country it costs 16 bucks (converted from local currency) and it's available even in small towns. You guys need to keep up with the internet!

Foxgod 4347d ago

Havok would be better; otherwise you would have to build GPU clouds, which are way too expensive and energy-consuming.

DJMarty 4346d ago

I believe Gaikai is already powered by top-of-the-range Nvidia GPUs, so this is entirely possible.

JunioRS101 4346d ago

There was an article explaining how it can't be used for physics...

Apparently, anything that changes in real time can't be cloud-computed because of latency, but things like lighting, which change very slowly over time, can be done in the cloud.

Interesting to see that it can be used for some sort of lighting. Hope it works well.
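The distinction drawn above can be written as a simple rule of thumb. This is a toy sketch: `can_offload`, the 150 ms round trip, and the staleness numbers are all hypothetical, chosen to match the comment's reasoning.

```python
# Hypothetical rule of thumb: a computation can tolerate the cloud only
# if its result stays usable for longer than the network round trip.
def can_offload(staleness_budget_ms: float, round_trip_ms: float) -> bool:
    """True if a result that is round_trip_ms old is still usable."""
    return staleness_budget_ms > round_trip_ms

RTT_MS = 150  # assumed cloud round trip

# Rigid-body physics must be fresh every step (~8 ms at 120 Hz): stays local.
print(can_offload(8, RTT_MS))      # False
# Slowly drifting indirect lighting can lag by a second or two: offloadable.
print(can_offload(2000, RTT_MS))   # True
```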

RegorL 4346d ago

Ever heard of the speed of light?

Lighting situations can change very quickly in a game built around those kinds of events.

Trees moving in the wind or due to explosions cast shadows.

Someone punches a hole in a wall, turns a flashlight on or off, or someone outside sprays your dark hiding place with tiny holes.

I do not really think Microsoft intends to put three GPUs and a Titan per blade... but who knows...

Kleptic 4346d ago

it 'can't' be dynamic, which this video doesn't really get into...at least it can't be dynamic in the sense of reacting to player control...it's all scripted, that's the only way it can work...

but the significance is that many lighting engines are not a single layer...so the cloud can compute any scripted sequences...while the local hardware handles anything related to 'real time'...

on the flip side though, what many of us have been saying since all this cloud-computed gaming stuff became 'cool'...is that those scripted lighting situations take very few cycles to create locally...it's a very complex way to offload a marginally small amount of processing...the real-time stuff is what's expensive resource-wise...but there is no way to have any of that done 'upstairs'...yet...

chaldo 4346d ago

@need4game

HATERS GONNA HATE!

Gawdl3y 4346d ago

There are other technologies with similar (and even superior) feature sets to PhysX, such as OpenCL. PhysX is just a gimmick on Nvidia cards; they usually pay developers to use it. That being said, I still use an Nvidia card in my PC.

NewsForge 4346d ago

I see the potential of cloud computing, but MS's statement that it has three times the power of one Xbox One in the cloud for every Xbox One to be released is complete rubbish.

Xbox One = 1230 GFLOPS

Cloud power allocated per Xbox One = 1230 GFLOPS x 3 = 3690 GFLOPS

The price per GFLOPS in 2013 is about $0.20/GFLOPS
http://en.wikipedia.org/wik... (Scroll until you see the table)

$0.20/GFLOPS x 3690 GFLOPS = $738

It means that MS would be spending more than $700 per Xbox One on server infrastructure.

How in hell can the Xbox division be profitable?
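The back-of-envelope math above checks out arithmetically; a quick reproduction (the GFLOPS figure and the $0.20/GFLOPS price are the commenter's 2013 estimates, not verified numbers):

```python
# Reproduce the comment's back-of-envelope cost estimate.
XBOX_ONE_GFLOPS = 1230   # commenter's figure for the console GPU
CLOUD_MULTIPLE = 3       # "3x the power of one Xbox One" in the cloud
USD_PER_GFLOPS = 0.20    # commenter's 2013 price-per-GFLOPS estimate

cloud_gflops = XBOX_ONE_GFLOPS * CLOUD_MULTIPLE
cost_per_console = cloud_gflops * USD_PER_GFLOPS
print(f"{cloud_gflops} GFLOPS allocated, ~${cost_per_console:.0f} per console")
```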

dedicatedtogamers 4347d ago

Definitely cool to see, although Nvidia makes it sound like this is more a thing of the near future, once they can bring down the cost of maintaining GPU-focused server farms. Currently, CPU-focused clouds wouldn't be able to do what they're talking about.

theWB27 4347d ago (Edited 4347d ago)

Why can't they do this? Why isn't this what Microsoft is offering with the Azure cloud?

DragonKnight 4346d ago

You really need to stop believing Microsoft's Cloud hype. What they are claiming is not possible.

NewsForge 4346d ago (Edited 4346d ago)

Azure is a "CPU-focused cloud"; you need GPUs for graphics data (like the video displayed above).

With MS's cloud, I expect CPU-related work like AI and matchmaking to be offloaded, but don't expect your games to have Avatar-level graphics anytime soon with their "We are going to boost the graphics"...

Foxgod 4347d ago (Edited 4347d ago)

Sure they can; everything can be done in software, it just takes more resources, and that's the nice thing about a cloud: you can hot-plug more resources when needed.

dedicatedtogamers 4346d ago (Edited 4346d ago)

@ people asking "why can't we do this now?"

Nvidia, in their own video (you know, the video linked at the top of the page), is showing off their upcoming GPU-focused cloud framework. What makes it special isn't that it is "yet another cloud". It's special because it is unlike other cloud computing services, which are typically CPU-focused frameworks.

Its focus on GPU processing, instead of relying on the "virtual machines" that Azure will offer, is why this particular cloud will be able to do what it does. In other words, the Nvidia cloud will assist with graphics because it is DESIGNED from the ground up to assist with graphics. Neither Microsoft nor Sony has announced anything that leads me to believe their cloud frameworks will assist with graphics. In fact, on the Sony side Cerny has specifically said that the cloud will not be used for graphics, and I believe the same has been said on the Microsoft side by Respawn Entertainment.

pedrof93 4346d ago (Edited 4346d ago)

Well, Nvidia partners with Gaikai.

Fireseed 4347d ago

Would LOVE to see something like this in the form of a Maya perspective viewport renderer. Never again would I deal with the malignant tumor that is VPR rendering >_>


NVIDIA Smooth Motion: Up to 70% More FPS Using Driver Level Frame Gen on RTX 50 GPUs

NVIDIA’s RTX 50 “Blackwell” architecture has been a bit of a bore for us gamers. Apart from Multi Frame Generation, which has limited use-case scenarios, there isn’t much to be excited about. Smooth Motion is achieved using GPU-side flip metering, with the optical flow data generated by AI models in the Tensor cores.

Read Full Story >>
pcoptimizedsettings.com

PNY NVIDIA GeForce RTX 5060 Ti GPU Review

Between the price, performance and power draw, with the GeForce RTX 5060 Ti, NVIDIA nailed the mainstream formula.

Read Full Story >>
cgmagonline.com

Nintendo Switch 2 Leveled Up With NVIDIA AI-Powered DLSS and 4K Gaming

Nvidia writes:

The Nintendo Switch 2 takes performance to the next level, powered by a custom NVIDIA processor featuring an NVIDIA GPU with dedicated RT Cores and Tensor Cores for stunning visuals and AI-driven enhancements.

Read Full Story >>
blogs.nvidia.com
ZycoFox 79d ago

The ray tracing probably doesn't even equal a low-end PC GPU, and even if it did, it would probably be mostly useless. They'll probably force it into some game now that will run like shit, maybe 30fps at best, just because "it can do it".

B5R 79d ago

Raytracing is so unnecessary for a handheld. I just hope you can turn it off.

Vits 79d ago

A lot of gamers don’t realize that ray tracing isn’t really about making games look better. It’s mainly there to make development easier and cheaper, since it lets devs skip a bunch of old-school tricks to fake reflections and lighting. The visual upgrade is just a nice bonus, but that’s not the main reason the tech exists.

So you can be 100% sure that developers will try to implement it every chance they get.

RaidenBlack 79d ago (Edited 79d ago)

Agree with Vits... but to add: if devs and designers just drop RT into a game world, it won't always work as expected. RT is not just reflections but lighting and illumination as well. For example, if you create a room with minimal windows, it will look dark af if RTGI is enabled. Devs and designers need to sort out the game world design accordingly as well.
DF's Metro Exodus RT upgrade is an amazing reference video to go through, if anybody's interested.

darthv72 79d ago

So is HDR... but they have it anyway.

thesoftware730 79d ago

Some PS5 and Series X games run at 30fps with RT... just like on those systems, if you don't like it, turn it off.

I only say this because you make it seem like a problem exclusive to the Switch 2.

Neonridr 79d ago (Edited 79d ago)

sour grapes much?

"It probably doesn't do it well because it's Nintendo and they suck". That's how your comment reads. Why don't you just wait and see before making these ridiculous statements?

Goodguy01 79d ago

Please. I'd like to play my Switch games on my 4K TV without it looking all doodoo.

PRIMORDUS 79d ago

Nvidia could have said this months ago and cut the bullshit. Anyway, the rumors were true.

Profchaos 79d ago

Would have been nice, but an NDA likely prevented them from saying anything.

PRIMORDUS 78d ago

TBH I don't think Nvidia would have cared if they broke the NDA. A little fine they'd pay, and they'd go back to their AI shit. They don't even care about GPUs anymore. I myself would like them to leave the PC and console market.

Tacoboto 78d ago

This story was written half a decade ago when the world knew Nvidia would provide the chip for Switch 2 and DLSS was taking off.

Profchaos 78d ago

Yeah, but a similar thing happened a long time ago: 3dfx announced they were working with Sega when they took the company public.

In response, Sega terminated the contract for the Dreamcast GPU and went with an ultimately weaker chipset.

So there's a precedent, but it's not as if Nintendo would have had much of an option: it's AMD, NVIDIA or Intel.

Profchaos 79d ago

I'm not expecting anything from ray tracing, but DLSS will be the thing that sees the unit get some impossible ports.

andy85 79d ago

Correct. All I'm seeing online is that it'll never run FF7 Rebirth. If it can run Cyberpunk, it'll run it. The DLSS will help. Obviously only 30 fps, but a lot of people don't care.

Profchaos 79d ago (Edited 79d ago)

Exactly right. When I buy a game on Switch, I know what I'm getting into: I'm buying it for its portability, and I'm willing to sacrifice fidelity and performance to play on a train or comfortably from a hotel room when I travel for work.
