Nvidia Shows Why Gaming Consoles Won't Be Around Forever

Nvidia Chief Executive Jen-Hsun Huang gave an audience at his company’s annual GPU Technology Conference a peek at what could be the next generation of gaming hardware Tuesday. It’s long. It’s skinny. It’s green. It’s nothing more than a cord linking a video game controller to an LG Cinema 3D Smart TV.

GamingPerson1614d ago (Edited 1614d ago )

It really is revolutionary that they can get the latency as low as a console's.

Also, since the GPUs are made specifically for cloud computing, they will get better every year.

I am 150% sure consoles like Sony's PlayStation and the Xbox will someday be cloud services only, and you'll be able to play on anything you want.

People will hate it, but they'll have no choice.

OnLive is being built into Vizio TVs, Gaikai is being built into LG TVs and is launching on Facebook. Electronics companies will push this to make lots of money.

Gaikai uses more bandwidth than OnLive, but it has higher quality and is more responsive because they use Nvidia technology.

The only hardware market left will be PC builders. I am sure Nvidia won't stop selling $200-$1000 GPUs to enthusiasts.

Console gamers, on the other hand, tend to care less about, and be less involved in, the world of hardware.

Lots of people have crap connections, but lots of people have good ones. So I can see this happening within the next two years or so.

Orpheus1614d ago (Edited 1614d ago )

If this gets going, and nVIDIA doesn't skimp on upgrading their GPU clusters... I guess it will be like the 90s, when no developer could retain the graphics/tech crown for more than a year... or perhaps even six months :-)

Even a few years back... Far Cry came out and wowed everybody... then a few months later Doom 3 surpassed it... then F.E.A.R... then The Elder Scrolls... and then the pathetic new consoles came out, progress slowed down, and the world got stuck with Crysis as the graphics king for around four years :-(

Hope that old trend comes back.

raytraceme1614d ago

One thing I really hate is how NVIDIA and AMD don't get their due credit for gaming. It's their GPUs that are in consoles; they create the GPUs that power a game. They don't skimp on development, and they release top products for consumers to buy.

Anyone see the UE4 shots? They were taken on a current-gen GPU from Nvidia's Kepler series, probably the $1000 GTX 690 or even the $500 GTX 680. In terms of price, a GPU that's about 15x more powerful than the 360's and PS3's is well worth twice the price tag of a PS3 or Xbox console.

Yes, I am an nvidia/amd fanboy, sue me.

Machioto1614d ago

Consoles are not the entire reason PC graphics haven't advanced as much; there's also devs' willingness to push PCs, GPU drivers, and the fact that PCs don't have standard hardware.

kevnb1613d ago (Edited 1613d ago )

You know there was a recession that we are still recovering from, right? Also, the trend of better graphics every few months was short-lived; we only saw it in that one era, not throughout the 90s. I'm glad it died out. Who wants to upgrade all the time?

sikbeta1613d ago

Only the rich and enthusiast crowd, which is tiny at this point given the f*cking recession out there, would chase this thing of upgrading every 6 months for graphical bumps. The rest of the crowd, aka almost everyone out there, won't give a sh*t about graphical upgrades in any time frame. The thing with consoles, and what's made them popular, is this: buy once, enjoy it for years without worrying about upgrades, with the bonus that after years of playing on the same console, games get their graphical upgrades as well. Think about a launch game, then compare it with a game released 5 years later: the later game looks X times better. People notice that, and that's all they care about.


Forget the whole PC vs. Consoles vs. Cloud thing. When it comes to graphics, it's not just a hardware or software problem.

We are right now at almost lifelike graphics in all three gaming "ways", so what can you really do next?

Yes, you can add more polygons, better lighting and whatnot... but quite frankly, that just doesn't "wow" us anymore, at least not to the extent it once did.

If we look over the decades, the 70s were not much more than some controllable pixels, the 80s brought complex 2D graphics, the 90s brought 3D environments... But from there to now, we can argue that everything just evolved straight out of 90s tech, reaching for lifelike textures, models, lighting and view range, four elements that were present even in the very first 3D games.

Graphics simply don't have that many milestones left to overtake. Real-time physics and animation do, but they'll also suffer from the same problem down the road.

That's the main reason the latest trends have been to look for different genres, controllers and optical features (screen resolution, 3D, head tracking, etc.), not computer graphics per se.

Don't get me wrong, I'm not saying that gaming or computer graphics in general has reached its peak or anything like that; I'm eager to hear more about that point-cloud technique Unlimited Detail showed some years ago.

But I believe that right now the evolution of gaming is not as noticeable as it once was, especially in visual detail. Game and tech developers today face this challenge: "what can we really do to make a major visual advance that the current and last gen partially lack?"

FarEastOrient1613d ago

This isn't going to work if American ISPs keep placing bandwidth caps on users.

plmkoh1613d ago

"This really is revolutionary that they can get the latency as low as a console."

Highly deceptive: they only reduced the latency on the server side and completely failed to mention the time it takes from server to client (which is the most important part).

They also assumed a 10-mile distance from the data centre directly to the client, which will never happen for 99% of users, considering ISPs need to bounce traffic across nodes which can easily be 20-30 miles apart; let's not even think about the added distance to the main hub.
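For the curious, the wire-propagation part of that argument is easy to check with back-of-envelope numbers. A quick sketch (the distances and the roughly-2/3-of-c fibre factor are round-number assumptions, and this ignores routing hops, queueing, encoding and display latency, which all add much more):

```python
# Rough round-trip propagation delay over fibre, nothing else counted.
SPEED_OF_LIGHT_KM_S = 300_000
FIBRE_FACTOR = 0.66  # signals in fibre travel at roughly 2/3 of c

def round_trip_ms(distance_km):
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000

for km in (16, 160, 1600):  # roughly 10, 100 and 1000 miles
    print(f"{km:5d} km: {round_trip_ms(km):6.2f} ms round trip")
```

Even at 1000 miles the raw propagation delay is small; it's all the other stages in the pipeline that eat the latency budget.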

humbleopinion1613d ago

Exactly. Cloud gaming will simply never get as fast as dedicated local gaming: you're bound not only by your TV's refresh rate and your local hardware's decoding, but also by the speed of light and your distance from the servers.
And it gets even more complicated when we expect next-gen games to run at 60fps and 1080p: that's roughly 4.5 times the pixel throughput of current console games running at 30fps 720p (or even more, since many run below 720p). Currently Gaikai and the rest of the services don't even offer video compression that can compete with current-gen output (compression artifacts such as blocking are very easy to see on their streams).
Add to this the more advanced audio formats expected next gen, which also require more bandwidth, and now you have not only a latency issue but also a bandwidth issue with your ISP (and local network and whatnot). Expecting a 60fps game like COD to run at less than 100ms latency (combined with your screen's latency) when played from the cloud is still wishful thinking.
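The pixel-throughput comparison is simple arithmetic; a tiny sketch (uncompressed pixel rate is only a rough proxy for the video bitrate needed at equal quality, but the ratio is what matters):

```python
# Pixel throughput: 1080p at 60fps vs. 720p at 30fps.
def pixels_per_second(width, height, fps):
    return width * height * fps

ratio = pixels_per_second(1920, 1080, 60) / pixels_per_second(1280, 720, 30)
print(ratio)  # 4.5
```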

Also: don't forget that, at least according to the rumours, Nvidia was passed over for the next gen of console gaming, with MS, Sony and Nintendo all choosing AMD/ATI for their graphics solutions.
Nvidia is probably not very interested in promoting the next gen of consoles all that much; their business now lies in dedicated PC graphics cards and mobile platform architecture (with their SoC solutions).

andibandit1613d ago (Edited 1613d ago )

blah blah blah, latency... back in the 90s I played Action Quake with horrible modem latency and I still loved it, and so did a lot of others. In the end, as long as the playing field is somewhat level, I'll accept a little latency.

SilentNegotiator1613d ago


And people wore parachute pants.
But it isn't the 90s anymore.

humbleopinion1612d ago

But latency hasn't dropped since the nineties: signals travelled, and still travel, at close to the speed of light. If anything, overall latency has actually increased in some cases, thanks to non-CRT monitors producing more lag, home wireless connections introducing more lag, wireless mice/keyboards/gamepads, etc.

It's just the bandwidth that has increased tenfold since then. But let's not forget that Quake was also very effective with compact messages over the net, with most of the work happening on the client side. You can't seriously compare that to streaming actual video...
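That gap between compact state updates and a video stream is worth putting numbers on. A quick sketch (the packet size, tick rate and stream bitrate are illustrative assumptions, not measurements of Quake or Gaikai):

```python
# Classic client/server state updates vs. streaming rendered video.
STATE_PACKET_BYTES = 200       # a small snapshot/delta per server tick
TICK_RATE = 20                 # ticks per second, typical of the era
VIDEO_BITRATE_BPS = 5_000_000  # ~5 Mbit/s for a compressed 720p stream

state_bps = STATE_PACKET_BYTES * 8 * TICK_RATE
print(state_bps)                       # 32000 bits/s for game state
print(VIDEO_BITRATE_BPS / state_bps)   # 156.25x more for video
```

Under these assumptions the video stream needs over 150 times the bandwidth, which is why the 90s modem comparison doesn't carry over.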

Denethor_II1613d ago

You speak as if you are a messiah bringing us, the people, a revolutionary concept to which we must bow down and be grateful. It's just your opinion; no need to ram it down my throat.

uncharted561613d ago

The biggest obstacle to this technology is the internet connection. There are so many people with low-speed connections in the world, even in developed countries, specifically the US. The only country in the world with really good internet infrastructure is Japan: they have hotspots all over the place and good speeds available to everyone. That's the only place I can see this working.

KiLLeRCLaM1613d ago

I had a 100Mbit connection 4 years ago when I lived in Sweden. Now I live in Toronto and have 50Mbit...

SilentNegotiator1613d ago

Physical, private hardware will ALWAYS be more responsive than "cloud" technology. Unless you live in a server room.

Ser1613d ago

Yup. Was just about to say that as well. Agreed fully.

Soldierone1613d ago

We will have a choice... a choice not to buy the "hardware." Honestly, if Sony and MS want to go full-on "cloud computing," why not just jump on one of the already established services on PC? Go buy a cheap arse PC and do some cloud gaming.

If consoles lose physical formats, they lose consumers like me.

Syanara171613d ago

I don't see console gaming ever becoming a full cloud service, for practical reasons, despite all technological advancements:

1) If the internet at your house (or wherever) goes out, a gamer is pretty much screwed.

2) Distrust of a company holding a person's games and save data.

3) Many consumers' attachment to having a physical disc, despite the inconveniences it brings.

Nes_Daze1614d ago

Nvidia should get more sun...

Burning_Finger1614d ago (Edited 1614d ago )

Nvidia won't be around forever.

AMD Radeon FTW

Tachyon_Nova1613d ago

Guess you haven't seen the sales figures of AMD vs. nVidia then?

SilentNegotiator1613d ago

Oh please. Neither of them are going anywhere.

ATi_Elite1613d ago (Edited 1613d ago )

Just exactly what are you smoking?

Nvidia is killing AMD in the graphics department this series. AMD took the lead with the HD 5000 series, and the HD 6000 did well, but Nvidia's GTX 600 is killing the HD 7000 (in both performance and sales), and the GTX 660 Ti isn't even out yet!

Nvidia is making big money with Tegra 2, and soon Tegra 3, as slate/tablet sales increase. Meanwhile AMD has nothing for slates and is almost nonexistent on the desktop, where Intel's dominance is too strong, although AMD has a strong following in the laptop arena, oh, and next-gen consoles.

Sure, the consoles skipped Nvidia this time around (like Nvidia gives a crap, seeing how they'll sell more Tegra 3s than AMD will sell GPUs in consoles), and Nvidia is busy with other projects that will once again take gaming to new levels.

It's only a matter of time before AMD is broken in half and sold off to Intel (graphics division) and Nvidia (CPU division)! AMD doesn't have the money to properly finance research and development like Intel and Nvidia do!!

Nvidia = The Way It's Meant to be Played!!!

RedDragan1613d ago

That's not true regarding performance; the green team does better on some titles while the red team does better on others.

This fanboy crap is just that, crap! Grow up, FFS!

ThatHappyGamer1614d ago

Nvidia is mad because both MS and Sony are going with AMD for the GPUs in their next-gen consoles.

WeskerChildReborned1613d ago

Was that confirmed, or was it just a rumour? I heard they were going with AMD, but I wasn't sure whether it was confirmed or not.

house1613d ago

Same here, but the way Sony has been pushing 3D this gen, you would think they would back Nvidia, since they have some of the best 3D tech out there.

SignifiedSix1613d ago

We all know Microsoft is going with AMD after Nvidia ripped them off last gen.

house1613d ago (Edited 1613d ago )

Exactly what I was about to post. What else can they say but this? It's not like their GPUs are going to be used for a whole gen, so they have to bank on cloud and PC gaming.


It's a rumour, but people like Forbes have been saying that AMD is making the (PS4) GPU. It's still just a rumour though... personally, I hope Nvidia is still in one of them (PS4).

WeskerChildReborned1613d ago

Yeah, I wanted Nvidia to be part of it.

sjaakiejj1613d ago

I would take any of these rumours with a large bag of salt.

I think it's highly unlikely that the PS4 and Xbox 720 will use off-the-shelf AMD components, or even highly modified ones. Sony in particular has worked a lot with IBM over the years, so why drop them and lose backward compatibility with PS3 software?

KiLLeRCLaM1613d ago

Maybe because Nvidia knows that this will be the future of gaming.