
Improving Graphics Performance Using Cloud Is Going To Be Really Hard: Ex-Naughty Dog Dev

The idea is tempting, but simply offloading assets to the cloud won't work, says Filmic Worlds boss and ex-Naughty Dog developer John Hable.

Read Full Story >>
gamingbolt.com
Sonyslave3931d ago (Edited 931d ago )

"I can’t comment on any specific applications."

In other words, I can't really comment on what's going on at MS because I have no idea what they are doing. -___-

Eonjay931d ago (Edited 931d ago )

Well, the last demo they showed was cloud-based physics calculations. They never said anything about graphics processing in the cloud, but they have demonstrated other uses, like cloud-based software updating.

But indeed, the issue is that Microsoft hasn't actually shown anything new or compelling yet. This is part of the problem. They made a lot of comments at the beginning of the gen about the potential but it hasn't come to fruition yet.

vega275931d ago

http://www.pcgamer.com/nvid...

He must have forgotten Nvidia was also messing with cloud computing for lighting effects in games. So I call B.S. on his statement.

Dark_king931d ago

@vega275 That is not graphics rendering though; it's just doing the calculations, then sending the data back to be rendered.

“for computing indirect lighting in the Cloud to support real-time rendering for interactive 3D applications on a user's local device.”

says so right there
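
To illustrate the split being described there (a rough sketch of the idea only, not Nvidia's actual CloudLight code; all names and numbers are made up): the server does the slow, expensive indirect-lighting pass, and the local device combines it with direct lighting and renders.

```python
# Hypothetical illustration of cloud-computed indirect lighting + local rendering.

def cloud_compute_indirect(probes):
    """Stand-in for the remote pass: expensive bounce lighting per light probe."""
    # Pretend each probe's indirect term is a fraction of its bounced light energy.
    return {name: 0.25 * strength for name, strength in probes.items()}

def client_shade(direct_light: float, indirect: float) -> float:
    """Local, latency-critical work: combine the terms and 'render' a pixel value."""
    return min(1.0, direct_light + indirect)

probes = {"hallway": 0.8, "atrium": 1.0, "basement": 0.2}   # made-up scene probes
indirect_terms = cloud_compute_indirect(probes)             # would arrive over the network

for name, indirect in indirect_terms.items():
    pixel = client_shade(direct_light=0.6, indirect=indirect)
    print(f"{name}: indirect={indirect:.2f}, shaded value={pixel:.2f}")
```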

Lennoxb63931d ago (Edited 931d ago )

And he would know about everything that's going on behind the scenes at MS. I mean, he does work for Naughty... Oh wait. He wouldn't know a damn thing, would he? lol

Ghost_of_Tsushima931d ago (Edited 931d ago )

I'm pretty sure he'd know the technical limitations of how it would be done using the cloud, and according to him it would be very complicated. Maybe we'll see something using the cloud at E3, like Crackdown.

At least he said it would be really hard and not impossible ;)

Lennoxb63931d ago (Edited 931d ago )

Well, since developers are getting comfortable with each other's projects, I would like to hear 343's or Rare's opinion on Uncharted 4 or the PS4's API updates. Maybe even on a possible TLOU sequel. I mean, it's only right. Right?

Eonjay931d ago

The laws of physics as they apply to modern graphics processing are not the property of any entity. Anyone can comment on technology and they do all the time.

rainslacker930d ago (Edited 930d ago )

MS isn't changing the way rendering is done just because they do some of the rendering on the cloud. The rendering process remains the same; it's only the pipelines that change.

Given that, and given that this guy writes high-speed, extremely powerful graphics rendering tools (seriously, it's what his company does; he's an ex-ND employee), I'd say he's pretty well versed in how a frame is rendered.

Given that, I'm sure he can speak with a good amount of understanding and authority on whether or not improving graphics using the cloud would be really hard. He didn't say it was impossible, just really hard.

Given that the current premise of game development is to make everything easier and more streamlined, it does seem counterproductive to focus resources on things that make it harder, not to mention more expensive, since server time does have to be paid for by someone. Cloud services don't just give away their server time and bandwidth for free. Nvidia's own example even stated that psychologists found high levels of lighting detail had only a marginally positive effect in their user analysis, which means it's far from the most important thing in the world (in other words, graphics don't matter).

I actually wonder if anyone here could provide a reason why this guy is wrong beyond their own feelings, and actually discredit his statements on a technical level, going beyond the high-level PR that MS provides for the masses.

Fez931d ago (Edited 931d ago )

This is not a specific comment on MS... It's to do with offloading graphical computations to remote servers.

Did you even read and think about the article or did you just see the thumbnail and go into "console war mode"? Pretty low of the submitter to do that but I would expect almost everyone to see past this media manipulation by now. RTFA.

Of course there are limitations in trying to perform low-latency graphical computations over a network - this is all that has been stated and is common sense.

It will be cool to see what does emerge from cloud computing in terms of gaming. Maybe AI can finally make big leaps, especially as the generation unfolds and there are more servers and more power.
Tbh, it is hard to think of something in gaming that isn't required almost immediately, though. I don't know what the average would be, but something like a >100ms round-trip time seems a reasonable assumption... and that's an awfully long time. But perhaps there are tricks and techniques to overcome this. Maybe some things can be calculated in advance for the next few frames.
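
A rough sketch of that last idea (purely illustrative; the frame time, RTT, and the kind of work involved are all assumptions on my part): if you ask the server for results a few frames ahead of when you need them, a ~100ms round trip can be hidden behind local work.

```python
# Hypothetical sketch: hide a ~100 ms round trip by requesting results a few frames early.

FRAME_TIME_MS = 16.7   # 60 fps
RTT_MS = 100.0         # assumed round-trip time to the server

# How many frames ahead a request must be issued so the reply arrives in time.
frames_ahead = int(RTT_MS // FRAME_TIME_MS) + 1   # -> 6 frames at 60 fps

pending = {}   # frame index -> simulated "in flight" request

def request_cloud_work(frame):
    """Pretend to send work (e.g. distant lighting or AI) to a server."""
    pending[frame + frames_ahead] = f"cloud result ready for frame {frame + frames_ahead}"

def render_frame(frame):
    """Use the cloud result if it was requested early enough, otherwise fall back."""
    result = pending.pop(frame, "local fallback result")
    print(f"frame {frame}: {result}")

for frame in range(12):
    request_cloud_work(frame)   # issue work for a future frame
    render_frame(frame)         # frames 0-5 use the fallback, later frames use the cloud data
```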

Lennoxb63931d ago

It doesn't make sense to even talk about offloading graphical assets to the cloud, since that was never the goal of cloud compute in the first place. It was meant to take care of some of the CPU tasks (most of them very small) in order to give the console's CPU more room to take on other things. The cloud is not rendering graphics; the console is still doing that. All the cloud does is compute bits and pieces of data that the console needs for things like AI or physics and send them to the console. Nothing is rendered in the cloud.
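
For what it's worth, the kind of traffic being described would look something like this sketch (a hypothetical payload shape of my own, not any actual Xbox One or Azure API): small packets of game state go up, small packets of computed results come back, and every pixel is still drawn locally.

```python
# Hypothetical request/response shapes for offloading AI/physics work, not rendering.
from dataclasses import dataclass
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class CloudRequest:      # what the console uploads: a handful of numbers, not pixels
    tick: int
    npc_positions: Dict[int, Vec3]

@dataclass
class CloudResponse:     # what comes back: computed results for the console to consume
    tick: int
    npc_waypoints: Dict[int, Vec3]

def fake_server(req: CloudRequest) -> CloudResponse:
    """Stand-in for the remote service: here it just nudges every NPC along +x."""
    waypoints = {i: (p[0] + 1.0, p[1], p[2]) for i, p in req.npc_positions.items()}
    return CloudResponse(tick=req.tick, npc_waypoints=waypoints)

# The console still renders everything itself; it only consumes the returned data.
request = CloudRequest(tick=42, npc_positions={0: (0.0, 0.0, 0.0), 1: (5.0, 0.0, 2.0)})
response = fake_server(request)
print(response.npc_waypoints)   # {0: (1.0, 0.0, 0.0), 1: (6.0, 0.0, 2.0)}
```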

rainslacker930d ago

The paper that Nvidia provided with their demonstration has a lot of great information on what is going on, as well as the requirements to make it work.

http://graphics.cs.williams...

It's a pretty technical read, but good nonetheless.

Nvidia claims that standard ping times are sufficient in most cases; however, actual real-world results would depend heavily on a lot of factors. Certain applications work better with lower ping.

This of course only took into account three different types of lighting systems, but they are fairly common approaches in today's games, though they usually aren't done at this scale, since it would be resource-prohibitive.

This is of course only one implementation of this kind of tech. There are others out there, and I'm sure there are some that have yet to be revealed.

@Lennox

That's actually true. However, Wardell, in his infinite wisdom, decided to discuss offloading the lighting, which many in the media took as MS actually saying it. MS has mentioned this procedure, but never specifically in regards to the X1. Much of the expectation some people have for the tech comes from misrepresentation by a third party who was only talking about theoretical possibilities, not actual intended applications.

Cloud compute, to MS, was 3X the resources; however, 3X the resources does not equate to 3X the power. For example, 3X the resources in a physics engine means you can run 3X the number of physics calculations per interval. 3X the power would mean being able to process vastly more calculations per interval, or far more complex physics calculations than would ever be necessary for a game.

That being said, whether games even need to calculate 3X the amount of physics is questionable. I imagine there are times when it could be nice, but I'm not sure of the overall practicality of such a feat, as there does need to be an object for every physics calculation.

AI makes a lot more sense: it can be quite complex, but the results, and the variables needed to derive those results, are typically very small and allow for quick and easy transport under standard latency. Any lag introduced would likely be imperceptible to the end user, unless there was a huge spike or a disconnect, and there would likely be backup routines should the data not arrive in time.
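
A rough sketch of what such a backup routine could look like (entirely my own illustration, with made-up timings and behaviours, not anything MS or a developer has published): ask the server for a decision, and if the answer doesn't come back within the frame budget, use a cheap local one instead.

```python
# Hypothetical timeout-and-fallback pattern for cloud-assisted AI decisions.
import asyncio
import random

async def cloud_ai_decision(npc_id: int) -> str:
    """Pretend remote call; latency varies the way a real connection would."""
    await asyncio.sleep(random.uniform(0.01, 0.2))  # 10-200 ms simulated round trip
    return f"flank player (cloud plan for npc {npc_id})"

def local_fallback(npc_id: int) -> str:
    """Cheap on-console behaviour used whenever the cloud result is late."""
    return f"hold position (local fallback for npc {npc_id})"

async def decide(npc_id: int, budget_s: float = 0.05) -> str:
    try:
        return await asyncio.wait_for(cloud_ai_decision(npc_id), timeout=budget_s)
    except asyncio.TimeoutError:
        return local_fallback(npc_id)

async def main():
    decisions = await asyncio.gather(*(decide(i) for i in range(4)))
    for decision in decisions:
        print(decision)

asyncio.run(main())
```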

Fez931d ago (Edited 931d ago )

It does make sense to talk about offloading graphical assets to the cloud if you're asked the question in an interview though lol. And it's also an interesting topic outwith any narrow console war you may or may not be involved in.

The goal of cloud computing (for console gaming in particular) is to provide a better experience to the user by working around the limitations of the hardware in whatever way is feasible.
No doubt work is being done on distributed graphics right now, and if it is a feasible option, you can bet it will be tried at some point.

Asking the question to a developer (ex or otherwise) for their thoughts on this subject seems okay to me.

Spotie931d ago

Lennoxb63 says, "It doesn't make sense to even talk about offloading graphical assets to the cloud since that was never the goal of cloud compute in the first place."

Ars Technica interviews Matt Booty.
http://arstechnica.com/gami...

I'm sorry, what was that?

Bizzare21931d ago

I like how former Sony developers are commenting on the new things MS is trying to do. lol

XeyedGamer931d ago

I don't see a direct reference to MS... You just assumed the cloud tech is exclusive to them.
That being said, I think it's actually Sony who are pushing it forward at this point with game streaming. I've seen very little other than a couple of demos touting one thing or another from the super-BS PR team over at MS.

TheCommentator931d ago

Game streaming is as different from cloud processing as playing games online is.

MS has built the XB1 architecture around this function. Each of the four multipurpose DMEs can move data at 26Gb/sec to/from any two locations at no cost to the CPU, GPU, or memory bandwidth. My understanding is that server systems use these types of accelerators to communicate between nodes within a server bank. MS will show Crackdown at E3, so they will talk more about the cloud's usefulness in gaming then.

Bizzare21931d ago

Well the thumbnail is XB1 vs PS4...

madmonkey01931d ago

Nothing new about remote servers, just a new buzzword to market them.

Fin_The_Human931d ago (Edited 931d ago )

Read the article; to sum it all up, it all depends on internet speed and reliability.

Cloud graphics and physics improvements are still too early because of the slow and unreliable internet speeds that 80% of gamers have.

MS is onto something, but it's way ahead of its time.

kaizokuspy931d ago

This, exactly. Microsoft is right, but it's not practical yet; when it becomes practical, it will be amazing.

Brisco931d ago

80% of gamers? Maybe in the States, but Europe is ready for this.

MysticStrummer931d ago

The States are the biggest gaming market. I'd say 80% is conservative.

dcbronco930d ago

Eighty percent is way overblown. The Washington D.C.-Baltimore region and Northern Virginia have 10-13 million people. The average internet speed here is easily 30Mbps. I would bet most of the major cities are similar. I would bet a third of this country has around that speed or higher. The problem is that low-population states like North and South Dakota and Montana bring the average down. And we have too many states like that. But more than half of us live in the major cities.

On topic: I think this guy is too busy with his new business to be up on what Microsoft is doing. Some of his comments make that clear.

He mentioned server cost.

It's like he doesn't know what Azure is. The answer to who will pay for all those servers is the companies that use Azure. Microsoft offers the servers to developers for free. He has a business; he knows businesses pay a premium for every service they use.

Bandwidth.

Microsoft and Duke University have already cut bandwidth needs by 83%. Plus, if he looks up some of the information on Project Orleans, a big part of that is near-instantaneous hydration and dehydration of information to reduce bandwidth needs.

Wi-Fi?

Not even sure that is a real problem. If your Wi-Fi sucks, buy a long Cat 5 cable.

Server goes down.

Again, Azure has protocols in place to instantly switch anything running over to another server. He has to remember Azure is being sold as a business tool with Quality of Service guarantees. They want to use this to create mobile disaster infrastructure that can quickly switch over to a new host if needed. Look up agents in Project Orleans.

Latency is an issue.

Some of the other things above will help address that. The predictive technology Microsoft has been working on may play a part in it as well, along with handling lost packets.

The reality is it may be really, really hard. But there are people thinking outside the box working on it. One of the things they were working on was the console handling everything in a certain area around the player and the cloud handling the things further away. Plus I think people should remember some of the rumors we've heard over the years. It could be a matter of the cloud just adding more details to things the console draws.

I think people need to wait a couple more weeks.

Fez931d ago

I think the round-trip time is the big problem, though, not bandwidth or reliability. And unless servers are in your country and close to your location, you might not be able to benefit from cloud computing.

It's a really interesting problem to overcome. So many variables that will affect everyone's experience differently.
50ms RTT vs 250ms RTT.
Internet cutting out.
Internet traffic at peak times.

I wouldn't like to be the guys programming that... actually, that's a lie, it would be awesome, but very, very difficult!

Outthink_The_Room931d ago

It's always funny seeing people talk about RTT but never bring up how MMOs receive data, update state and game logic, apply the data, and then return the packet.

If a cloud compute approach is held back by RTT, why would any MMO be playable when it requires a similar approach for its data?

Fez930d ago (Edited 930d ago )

That's a good point... also any online game in general must do the same.

I'm not an expert on this at all, but I think there are a lot of tricks to achieve this, and that's why the experience is not always optimal and varies per game.
For example, the Client-Side Prediction and Server Reconciliation discussion here: ( http://www.gabrielgambetta.... )

This kind of workaround may not be possible here, because (at least for graphical computations) you would need the result immediately. But maybe things like AI could lag a few frames behind and still be better than local AI.
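
For anyone curious, client-side prediction with server reconciliation boils down to something like this sketch (a toy version of the general technique in the article linked above, with made-up tick counts and a trivial movement rule): the client predicts immediately, and when the authoritative state arrives late, it rewinds to it and replays the unacknowledged inputs.

```python
# Toy client-side prediction + server reconciliation (illustrative numbers only).

SERVER_DELAY_TICKS = 6   # pretend the authoritative reply arrives 6 ticks late

def simulate(pos: float, move: float) -> float:
    """Shared movement rule, run identically on client and server."""
    return pos + move

client_pos = 0.0
unacked = []            # inputs sent to the server but not yet acknowledged

for tick in range(12):
    move = 1.0                               # player holds "right" every tick
    client_pos = simulate(client_pos, move)  # predict immediately, no waiting
    unacked.append((tick, move))

    # Every tick past the delay, the authoritative state for an old tick arrives.
    if tick >= SERVER_DELAY_TICKS:
        acked_tick = tick - SERVER_DELAY_TICKS
        server_pos = float(acked_tick + 1)   # the server ran the same inputs
        # Reconcile: start from the server's state and replay unacknowledged inputs.
        unacked = [(t, m) for (t, m) in unacked if t > acked_tick]
        corrected = server_pos
        for _, m in unacked:
            corrected = simulate(corrected, m)
        client_pos = corrected               # matches the prediction when nothing diverged

    print(f"tick {tick}: predicted position {client_pos}")
```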

It could be a great thing for some people, just like game streaming could be, but in my experience with my internet connection the lag sometimes creates noticeable problems.

rainslacker930d ago

In terms of physics, beyond the feasibility question, developers still have to want to implement these things on such a grand scale. Implementing all these little extras takes time and resources that honestly could often be better spent elsewhere. What's the point in having a billion pieces of a destructible window when one million will be sufficient?

Over a great period of time, obviously things will become available which make this kind of stuff more feasible on a development level, but there comes a point of diminishing returns. No matter how much a computer system may be able to do something, that something still has to be implemented at some point, and that takes time and money. One of the basic tenets of AAA game design is to make the complex out of the simple, because the simple is cheaper and more flexible across multiple implementations.

In graphics there is a concept called "Level of Detail", or LOD. The premise is that objects very close to the user's view have a high level of detail applied to them, whereas things that are very far away have very little detail applied. The same is true of physics calculations. How many objects can a user reasonably have within their immediate view that require such vast amounts of physics processing to make them move? Again, diminishing returns for what amounts to a lot of work on the development side.
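
As a quick sketch of that idea (the distance thresholds, mesh names, and physics tiers here are made up for illustration): the further an object is from the camera, the cheaper the representation you pick, for rendering and for physics alike.

```python
# Hypothetical LOD table: distance thresholds pick cheaper meshes and physics.

LOD_TABLE = [
    (10.0,  "mesh_high",   "per-shard destruction physics"),
    (50.0,  "mesh_medium", "rigid-body physics"),
    (200.0, "mesh_low",    "no physics, animation only"),
]

def pick_lod(distance: float):
    """Return the render mesh and physics treatment for an object at `distance`."""
    for max_dist, mesh, physics in LOD_TABLE:
        if distance <= max_dist:
            return mesh, physics
    return "billboard", "none"   # far beyond the last threshold: cheapest option

for d in (3.0, 25.0, 120.0, 500.0):
    mesh, physics = pick_lod(d)
    print(f"{d:6.1f} m -> {mesh:12s} | {physics}")
```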

On top of all this, for graphics rendering, GPUs are becoming more powerful faster than internet infrastructure is becoming faster and more ubiquitous, so over time the idea of rendering in the cloud may actually get in the way of what a graphics processor is designed to do almost routinely. Remote rendering makes sense on certain types of devices, say mobile, due to heat issues in a very compact device; one day those devices won't be able to go much further on today's technology, which is why Moore's law is effectively running out for them at about twice the speed of the average PC component. However, if a device has a reasonably recent GPU, chances are its abilities are going to far outpace whatever the extra cloud rendering could provide.
