Nvidia Showcases Cloud Based Indirect Lighting At Siggraph 2013

Siggraph 2013 is currently under way and Nvidia has showcased some impressive cloud techniques that it is currently working on.

Read Full Story >>
dsogaming.com
Need4Game3914d ago

Cloud-based Nvidia PhysX please, since the PS4 console doesn't have dedicated Nvidia hardware for PhysX.

john23914d ago

This opens up plenty of possibilities. My guess is that PhysX Cloud will happen in the near future.

Pandamobile3914d ago (Edited 3914d ago )

I'm not sure how many applications there would be for cloud-computed physics.

The only cloud-based technologies that will have real-world applications are the ones that aren't latency-sensitive. Physics interactions are usually some of the most latency-sensitive computations in a game engine. They usually have their own sub-routines in the rendering pipeline that update much more often than the screen actually displays. The LOWEST update rate you're ever going to want for physics objects in a game is 1/60 of a second. In a lot of cases, engines update their physics every 1/120 or even 1/300 of a second.

Anything that has a direct impact on gameplay will always be done on the client's side.
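
For reference, a minimal sketch of the fixed-timestep sub-stepping described above (illustrative Python; the 1/120 s step and the world/render/running hooks are assumptions, not any particular engine's code):

```python
# Minimal fixed-timestep loop: the simulation advances in fixed 1/120 s
# steps regardless of render rate, so any step that had to wait on a
# 100 ms+ server round trip would stall the whole simulation.
import time

PHYSICS_DT = 1.0 / 120.0  # fixed physics step; engines often use 1/60 to 1/300

def game_loop(world, render, running):
    accumulator = 0.0
    previous = time.perf_counter()
    while running():
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Step the simulation as many times as needed to catch up to real time.
        while accumulator >= PHYSICS_DT:
            world.step(PHYSICS_DT)   # must complete locally within ~8 ms
            accumulator -= PHYSICS_DT
        render(world)                # draw whenever a frame is ready
```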

GameNameFame3914d ago

There was already discussion of cloud lighting.

It is still very limited in many ways and does not look as impressive as lighting run on local hardware.

Regardless, partial cloud rendering is limited; the real future is full cloud rendering.

Both Sony and MS are heading in that direction.

In the meantime, we are stuck with local hardware. That means the weaker X1 specs won't change.

fr0sty3914d ago (Edited 3914d ago )

Eventually, when we're all connected with fiber, we'll be able to have in-game visuals that are aided by the cloud to create nearly unlimited capabilities. However, given that they still have lag and artifacts even while only running at 30fps and using top-of-the-line hardware on both ends, it's going to be a little while. There is a very noticeable lag between when light sources move and when they illuminate the environment. There are also some odd artifacts around some light sources.

nukeitall3914d ago

Notice how the huge latencies barely affect the experience, if at all.

Now who said the cloud is smoke and mirrors?

"We[Microsoft] have something over a million servers in our data center infrastructure. Google is bigger than we are. Amazon is a little bit smaller. You get Yahoo! and Facebook, and then everybody else is 100,000 units probably or less."

http://www.neowin.net/news/...

DeadlyFire3914d ago (Edited 3914d ago )

The cloud tech for games you play from a disc or install on your hardware is very limited. It can provide some tricks here and there, but to take full advantage of it, bandwidth would have to be pretty high, I would believe. Although it could be funny to see a game or two with all-cloud lighting have the lights go out when a lag hiccup occurs.

In the future, when games are all streamable and everyone in the country has cheap access to 50 Mbps or more, cloud tech will have lots of possibilities. Right now it's just a showcase of what it can do at the moment, which is basically playing with the lights in the background while you're looking around. I would like solid games with high fps before cloud lighting takes a place in any game design.

awi59513914d ago

OnLive already proved you don't need the best hardware to play games at max settings. You can play the game elsewhere and your console is just a display device.

Kleptic3914d ago

@nukeitall...

Did you notice the KEY difference with this cloud processing?...

There is no 'internet' in this example... the 150ms latency is just in the short-range network between these devices... the most powerful option requires a hard line... and it still has over double what is generally acceptable latency for real-time rendering...

And this response latency is completely different from ISP-related latency... which has absolutely nothing to do with processing latency... it's just built-in lag that ISPs add for request queues...

So... until MS... or Sony... or Nvidia... or anyone... is putting the 'cloud' in your house with a purpose-built network for it... and you don't have to use Comcast or some other hopelessly shatty residential ISP... this reflects nothing for consoles launching this year...
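
To put the numbers above in perspective, a rough back-of-the-envelope comparison of per-frame budgets against the ~150 ms figure quoted for the demo (illustrative Python only):

```python
# Per-frame budgets vs. the ~150 ms response latency quoted for the demo.
LATENCY_MS = 150.0

for fps in (30, 60, 120):
    frame_budget_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_budget_ms:5.1f} ms per frame; "
          f"150 ms spans ~{LATENCY_MS / frame_budget_ms:.1f} frames")
# e.g. 60 fps -> 16.7 ms per frame; 150 ms spans ~9.0 frames
```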

BallsEye3913d ago

@DeadlyFire

In the future? In Europe you can get 100 Mbps anywhere. In my country it costs 16 bucks (converted from local currency) and it's available even in small towns. You guys need to keep up with the internet!

Foxgod3914d ago

Havok would be better; otherwise you would have to build GPU clouds, which are way too expensive and energy-consuming.

DJMarty3914d ago

I believe Gaikai is already powered by top-of-the-range Nvidia GPUs, so this is entirely possible.

JunioRS1013914d ago

There was an article explaining how it can't be used for physics...

Apparently, anything that changes in real time can't be cloud-computed because of latency, but things like lighting, which change very slowly over time, can be done in the cloud.

Interesting to see that it can be used for some sort of lighting stuff. Hope it works well.

RegorL3914d ago

Ever heard of the speed of light?

Lighting conditions can change very quickly in a game built around those kinds of events.

Trees moving in the wind or due to explosions cast shadows.

Someone punches a hole in a wall, turns on or off a flashlight, or someone outside sprays your dark hiding place with tiny holes.

I do not really think Microsoft intends to put three GPUs and a Titan in each blade... but who knows...

Kleptic3914d ago

It 'can't' be dynamic, which this video doesn't really get into... at least it can't be dynamic in the sense of reacting to player control... it's all scripted, that's the only way it can work...

but the significance is that many lighting engines are not a single layer... so the cloud can compute any scripted sequences... while the local hardware will handle anything related to 'real time'...

on the flip side though, what many of us have been saying since all this cloud-computed gaming stuff became 'cool'... is that those scripted lighting situations take very few cycles to create locally... it's a very complex way to offload a marginally small amount of processing... the real-time stuff is what's expensive resource-wise... but there is no way to have any of that done 'upstairs'... yet...
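
As an illustration of the layering idea described above, here is a toy sketch of how a slowly updated, cloud-delivered indirect term could be combined with locally computed direct lighting (Python/NumPy; the class and buffer names are hypothetical, not Nvidia's actual pipeline):

```python
# Toy "layered" lighting: indirect light arrives from a remote service at a
# slow cadence, while direct light is recomputed locally every frame.
import numpy as np

class LayeredLighting:
    def __init__(self, width, height):
        # Cached indirect term; starts dark until the first cloud update arrives.
        self.indirect = np.zeros((height, width, 3), dtype=np.float32)

    def on_cloud_update(self, indirect_buffer):
        # Called whenever a new indirect-lighting buffer arrives (seconds apart).
        self.indirect = indirect_buffer

    def shade_frame(self, direct_buffer):
        # Direct lighting is computed locally each frame; the (possibly stale)
        # indirect term is added on top, so a network hiccup only delays bounce light.
        return np.clip(direct_buffer + self.indirect, 0.0, 1.0)
```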

chaldo3914d ago

@need4game

HATERS GONNA HATE!

Gawdl3y3914d ago

There are other technologies with similar (and even superior) feature sets to PhysX, such as OpenCL. PhysX is just a gimmick on Nvidia cards; they usually pay developers to use PhysX. That being said, I still use an Nvidia card in my PC.

NewsForge3913d ago

I see the potential of cloud computing, but MS's statement that there is 3 times the power of one Xbox One in the cloud for every Xbox One to be released is complete rubbish.

Xbox One = 1230 GFLOPS

Cloud power allocated per Xbox One = 1230 GFLOPS x 3 = 3690 GFLOPS

The hardware price per GFLOPS in 2013 is about $0.2/GFLOPS
http://en.wikipedia.org/wik... (Scroll until you see the table)

$0.2/GFLOPS x 3690 GFLOPS = $738

It means that MS would be spending more than $700 per Xbox One on server infrastructure.

How in hell can the Xbox division be profitable?
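
For what it's worth, the back-of-the-envelope math above checks out arithmetically if you accept the commenter's figures (a quick sketch using their assumed $0.2/GFLOPS hardware cost and 1230 GFLOPS per console):

```python
# Reproducing the commenter's estimate (their figures, not verified).
xbox_one_gflops = 1230
cloud_multiplier = 3
cost_per_gflops = 0.2  # USD per GFLOPS, 2013 hardware-cost estimate cited above

cloud_gflops = xbox_one_gflops * cloud_multiplier       # 3690 GFLOPS
hardware_cost = cloud_gflops * cost_per_gflops          # ~$738 per console
print(f"{cloud_gflops} GFLOPS -> ${hardware_cost:.0f} per console")
```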

dedicatedtogamers3914d ago

Definitely cool to see, although Nvidia makes it sound like this is more a thing of the near future, once they can bring down the cost of maintaining GPU-focused server farms. Currently, the CPU-focused clouds wouldn't be able to do what they're talking about.

theWB273914d ago (Edited 3914d ago )

Why can't they do this? Why isn't this what Microsoft is offering with the Azure cloud?

DragonKnight3914d ago

You really need to stop believing Microsoft's Cloud hype. What they are claiming is not possible.

NewsForge3913d ago (Edited 3913d ago )

Azure is a "CPU-focused cloud"; you need GPUs for graphics data (like the video displayed above).

With MS's cloud, I expect CPU-related activity like AI and matchmaking to be offloaded, but don't expect your games to have Avatar-level graphics anytime soon from their "We are going to boost the graphics" claims...

Foxgod3914d ago (Edited 3914d ago )

Sure they can; everything can be done in software, it just takes more resources, and that's the nice thing about a cloud: you can hot-plug more resources when needed.

dedicatedtogamers3914d ago (Edited 3914d ago )

@ people asking "why can't we do this now?"

Nvidia in their own video (you know, the video linked at the top of the page) is showing off their upcoming GPU-focused cloud framework. The reason why it's special isn't because it is "yet another cloud". It's special because it is unlike other cloud computing services, which are typically CPU-focused frameworks.

Their focus on GPU processing, instead of relying on the "virtual machines" that Azure will offer, is why this particular cloud will be able to do what it does. In other words, the Nvidia cloud will assist with graphics because it is DESIGNED from the ground up to assist with graphics. Neither Microsoft nor Sony have announced anything that leads me to believe their cloud framework will assist with graphics. In fact, on the Sony side Cerny has specifically said that the cloud will not be used for graphics, and I believe the same has been said on the Microsoft side by Respawn Entertainment.

pedrof933914d ago (Edited 3914d ago )

Well, Nvidia partners with Gaikai.

Fireseed3914d ago

Would LOVE to see something like this in the form of a Maya perspective viewport renderer. Never again would I deal with the malignant tumor that is VPR rendering >_>


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan3d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS & AI in general will lower the power draw. It would seem like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.

Tapani3d ago (Edited 3d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating do not tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.

However, in today's world chip manufacturing is limited by physics, and we will see power increases over the next 5-10 years at the very least to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we will have new tech on the market that we have yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.
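
As a rough check of the 4090 claim above (the commenter's figures, not measured data), the implied efficiency gain works out like this:

```python
# 90% of stock performance at 320 W instead of 450 W (figures from the comment).
stock_perf, stock_watts = 1.00, 450
tuned_perf, tuned_watts = 0.90, 320

gain = (tuned_perf / tuned_watts) / (stock_perf / stock_watts)
print(f"Performance per watt improves by ~{(gain - 1) * 100:.0f}%")  # ~27%
```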

MrBaskerville3d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive before it does, though, and then that becomes the viable option.

Firebird3602d ago

We need to stop the smear campaign against nuclear energy.
We could power everything forever if we wanted to.

Tacoboto3d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
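
For context, a toy sketch of the checkerboard idea referenced above (Python/NumPy; real implementations also use motion vectors and ID buffers, so this only shows the alternating shading mask):

```python
# Each frame shades only half the pixels in a checker pattern and fills the
# gaps from the previous frame's full-resolution result.
import numpy as np

def checkerboard_mask(height, width, frame_index):
    yy, xx = np.mgrid[0:height, 0:width]
    return (xx + yy + frame_index) % 2 == 0  # which pixels get shaded this frame

def reconstruct(shaded_now, previous_full, mask):
    # Keep freshly shaded pixels, reuse last frame's result for the rest.
    return np.where(mask[..., None], shaded_now, previous_full)
```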

InUrFoxHole3d ago

Well... it's a coffin, man. So at least 4?

Tacoboto3d ago

PSSR in the fall can assume that role.

anast3d ago

and those nails need to be replaced annually

Einhander19723d ago

I'm not sure what the point you're trying to make is, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto3d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack2d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offers.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" argument (that's been going around in the PC community since 2019) right out of the window.

Einhander19722d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard; FSR 3 has only started to get games this year, though, so checkerboard, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic3d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself..
And of course, then the idea of checkerboard rendering not being native....
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems performance and "close enough", not native, is what matters now.....
I want to see it run native without DLSS.. why not?

RonsonPL3d ago

Almost-deaf person:
- lightweight portable $5 speakers with 0.5cm drivers are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (look at the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is to use a display that handles motion well, so either get true 500fps on a 500Hz LCD TN or OLED (or faster tech), or use low-persistence mode (check blurbusters.com if you don't know what it means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.5 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to flow with the current.

There's no coffin for native-res quality and there never will be. Eventually, we'll have enough performance in rasterization to drive 500fps, which will be a game changer for motion quality while also adding another benefit - lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developer's design choices - you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass on it later. When you disable it, you will see a ruined image, horrible pixelation and other visual "glitches", but it is NOT what native would have looked like if you wanted to honestly compare the two.

Stay informed.
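
The timing argument in the comment above can be put in rough numbers (illustrative Python; the 1 ms upscaling cost is an assumption, not a measured figure):

```python
# Frame budget vs. a fixed upscaling cost at increasing frame rates.
UPSCALE_MS = 1.0  # assumed per-frame cost of an upscaling pass (illustrative)

for fps in (60, 120, 240, 500):
    budget_ms = 1000.0 / fps
    share = 100.0 * UPSCALE_MS / budget_ms
    print(f"{fps:>3} fps: {budget_ms:4.1f} ms budget, "
          f"upscaling eats ~{share:.0f}% of it")
# At 500 fps the 2 ms budget leaves roughly half the frame to the upscaler alone.
```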

RaidenBlack2d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of that time, but it'll slowly narrow to the point where it's indistinguishable.
Something similar happened with the genAI Sora... AI-generated videos were turds back when they were introduced (the infamous Will Smith eating video)... but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya3d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple1012d ago

Maybe better to get a budget gaming laptop and link a dualsense to it

= Portable console with far better graphics than a steam deck! + bigger screen and able to use it for work / etc


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal1696d ago

Echo sentiment here - I think the way GPUs are going, gaming could be secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan6d ago

No.. Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, then there's always AMD, which is doing OK at the mid-range.

Christopher6d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priority to other things: the market reach for new users is shrinking as more PC gamers focus less on replacing older, still-working parts that run RT/AI fine enough as it is. Not to say there aren't people who still do it, but I think the market for having the latest and greatest is shrinking compared to the past two decades. The problem is we aren't growing things at the rate we were; we're reaching the flattening of that exponential curve in regard to advancement. We need another major technological advancement to restart that curve.

D0nkeyBoi6d ago

The unremovable ad makes it impossible to read the article.

Tzuno6d ago (Edited 6d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing6d ago

You also need to consider that NVIDIA is heavily invested in cloud gaming. So they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser815d ago

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper.
It happened before in the days of the GTX 280, when they changed the price from $650 to $450 in a matter of 2 weeks because of the HD 4870, which was being sold at $380.


Nvidia AI Demo Unwittingly Proves that Human Voice Actors, Artists, and Writers are Irreplaceable

Nvidia presented Covert Protocol, a tech demo aiming to showcase the "power" of the Nvidia Ace technology applied to video game characters.

Read Full Story >>
techraptor.net
Eonjay26d ago (Edited 26d ago )

They look like they are in pain. Almost begging to be put down. It was uncomfortable to watch.

PRIMORDUS27d ago

The tech is too early. Come back in 10+ years and see what it can do then.

N3mzor27d ago

That presentation sounds like it was written by an AI using corporate buzzwords.

CS727d ago

I don’t know why people keep thinking of it as AI vs no AI.

A much more likely scenario is the use of AI alongside human work.

E.g. AI voices used during side quests or banter to boost the number of dialog lines.

AI generating additional predetermined branches in dialog tree options for more freedom in conversations with NPCs.

Smellsforfree26d ago

"AI generating additional pre determined branches in dialog tree options for more freedom in conversations with NPCs"

I'm wondering about that last one. Will that make a game more fun or more immersive? In the end, how can it possibly be more than filler content? And if it is filler content, how much do I really want to engage in conversing with it if I know it will lead nowhere?

MrBaskerville26d ago

It's one of those things that sounds cool on paper. But will probably get old fast.

DivineHand12526d ago

The tech is now available, and it is up to creators to create something unique with it.

Profchaos27d ago (Edited 27d ago )

The biggest thing to talk about here is that every interaction requires communication with Inworld servers, so there are three big impacts here:
1) Games are always online, no question about it.
2) Delays in processing on Inworld servers, outages, or unexpected load from some astronomically popular game will cause real-time delays in-game. Ever waited for a ChatGPT response? This will be similar, as the context must be pulled via the LLM.

Now as for the other impact, the artistic one: no, I don't think writers can be replaced. I've mentioned before that AI-generated writing is often word soup, and I still stand by that; it's also evident in the video too.
AI cannot accurately convey human emotions, and I don't think it ever will.

I know publishers are looking to cut down on development costs, but what happens when Inworld decides to charge per interaction or updates their pricing a year after your game goes live? You have no choice but to pay it or shutter the game.

I've felt for a while that we are heading towards games being disposable entertainment, and now that feels more and more accurate.
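
To make the dependency described above concrete, here is a hypothetical sketch of per-interaction NPC dialog that needs a remote call and falls back to canned lines on timeout (Python; the endpoint, payload, and aiohttp-style session are assumptions for illustration, not Inworld's actual API):

```python
# Hypothetical per-interaction NPC dialog with a remote dependency.
# Requires Python 3.11+ for asyncio.timeout; 'session' is assumed to be an
# aiohttp-style ClientSession. Endpoint and payload fields are made up.
import asyncio

CANNED_LINES = ["Hm?", "Not now.", "Come back later."]

async def npc_reply(session, player_line, timeout_s=0.5):
    try:
        async with asyncio.timeout(timeout_s):
            async with session.post("https://example.invalid/npc/dialog",
                                    json={"text": player_line}) as resp:
                return (await resp.json())["reply"]
    except (TimeoutError, OSError):
        # Server hiccup, outage, or overload: fall back to scripted filler
        # so the game keeps running, at the cost of the "AI" illusion.
        return CANNED_LINES[hash(player_line) % len(CANNED_LINES)]
```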
