
Nvidia Shows Why Gaming Consoles Won't Be Around Forever

Nvidia Chief Executive Jen-Hsun Huang gave an audience at his company's annual GPU Technology Conference a peek at what could be the next generation of gaming hardware on Tuesday. It's long. It's skinny. It's green. It's nothing more than a cord linking a video game controller to an LG Cinema 3D Smart TV.

Source: finance.yahoo.com
GamingPerson4348d ago (Edited 4348d ago )

It really is revolutionary that they can get the latency as low as a console's.

Also, since the GPUs are made specifically for cloud computing, they will get better every year.

I am 150% sure consoles like Sony's PlayStation and Xbox will someday be nothing but a cloud service that you can play on anything you want.

People will hate it, but they'll have no choice.

OnLive is being built into Vizio TVs, and Gaikai is being built into LG TVs and is launching on Facebook. Electronics companies will push this to make lots of money.

Gaikai uses more bandwidth than OnLive, but it has higher quality and is more responsive because they use Nvidia technology.
http://www.gaikai.com/games

The only dedicated hardware left in the future will be for PC builders. I am sure Nvidia won't stop selling $200-$1,000 GPUs to enthusiasts.

Console gamers, on the other hand, tend to care less about and are less involved in the world of hardware.

Lots of people have a crap connection, but lots of people have a good one, so I can see this taking off within the next two years or so.

Orpheus4348d ago (Edited 4348d ago )

If this gets going, and Nvidia doesn't skimp on upgrading their GPU clusters... I guess it will be like the '90s, when no developer could retain the graphics/tech crown for more than a year... or maybe even six months :-)

Even a few years back... Far Cry came out and wowed everybody... then a few months later Doom 3 surpassed it... then F.E.A.R... then The Elder Scrolls... and then the pathetic new consoles came out, progress slowed down, and the world got stuck with Crysis as the graphics king for around four years :-(

Hope that old trend comes back.

raytraceme4348d ago

One thing I really hate is how Nvidia and AMD don't get their due credit for gaming. It's their GPUs that are in consoles. They create the GPUs that power a game. They don't skimp on development, and they release top products for consumers to buy.

Anyone see the UE4 shots? They were taken on a current-gen GPU from Nvidia's Kepler series, probably the $1,000 GTX 690 or even the $500 GTX 680. In terms of price, a GPU that's roughly 15x more powerful than the 360's and PS3's GPUs is well worth a 2x price tag over a PS3 or Xbox console.

Yes I am an nvidia/amd fanboy sue me.
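For a rough sense of that ratio, here is a quick back-of-the-envelope check using commonly cited theoretical FP32 throughput figures; these numbers are approximate, and raw FLOPS are only one way to measure "power":

```python
# Rough sanity check of the "many times the power for ~2x the price" claim.
# Figures are commonly cited theoretical numbers, not benchmarks.

gtx_680_gflops = 3090      # Kepler GTX 680, ~3.09 TFLOPS theoretical FP32
xenos_gflops = 240         # Xbox 360 GPU (Xenos), commonly cited figure

gtx_680_price = 500        # USD at launch
console_price = 250        # assumed street price of a 360/PS3 around 2012

print(f"Performance ratio: ~{gtx_680_gflops / xenos_gflops:.0f}x")
print(f"Price ratio:       ~{gtx_680_price / console_price:.0f}x")
```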

Machioto4348d ago

Consoles are not the entire reason why PC graphics haven't advanced as much; other factors are devs' willingness to push PCs, GPU drivers, and the fact that PCs don't have standard hardware.

kevnb4347d ago (Edited 4347d ago )

You know there was a recession that we are still recovering from, right? Also, the trend of better graphics every few months was short-lived; the only time we saw it was in that era, not throughout the '90s. I'm glad it died out. Who wants to upgrade all the time?

sikbeta4347d ago

Only the rich and enthusiast crowd, which is tiny at this point given the f*cking recession out there, would chase this business of upgrading every six months for graphical bumps. The rest of the crowd, i.e. almost everyone else, won't give a sh*t about graphical upgrades in any time frame. The thing with consoles, and what's made them popular, is that you buy once and enjoy it for years without worrying about upgrades, with the bonus that after years of playing on the same console, games get their graphical upgrades as well. Think about a launch game, then compare it with a game released five years later: the newer game looks X times better. People notice that, and that's all they care about.

BISHOP-BRASIL4347d ago

Forget the whole PC vs. consoles vs. cloud debate. When it comes to graphics, it's not just a hardware or software problem.

We are right now at almost lifelike graphics on all three of those gaming "ways", so what can you really do next?

Yes, you can add more polygons, better lighting and whatnot... but quite frankly, that just doesn't "wow" us anymore, at least not to the extent it once did.

If we look over the decades, the '70s were not much more than some controllable pixels, the '80s brought complex 2D graphics, and the '90s brought 3D environments... But from there to now, you can argue that everything just evolved straight out of '90s tech, reaching for lifelike textures, models, lighting and view range, four elements that were present even in the very first 3D games.

Graphics simply don't have that many milestones left to hit anymore. Real-time physics and animation do, but they'll also suffer from the same problem down the road.

That's the main reason the latest trends have been toward different genres, controllers and optical features (screen resolution, 3D, head tracking, etc.), not computer graphics per se.

Don't get me wrong, I'm not saying that gaming or computer graphics in general has reached its peak or anything like that; I'm eager to hear more about that point-cloud technique Unlimited Detail showed some years ago.

But I believe that right now the evolution of gaming is not as notable as it once was, especially in visual detail. Game and tech developers today face the challenge: "What can we really do to deliver a major visual advance that the current and last gen don't already partially offer?"

FarEastOrient4347d ago

This isn't going to work if American ISPs keep placing bandwidth caps on users.

plmkoh4348d ago

"This really is revolutionary that they can get the latency as low as a console."

Highly deceptive: they only reduced the latency on the server side and never mentioned the time it takes to get from the server to the client (which is the most important part).

They also assumed a 10-mile distance from the data centre directly to the client, which will never happen for 99% of users, considering even ISPs need to bounce information across nodes that can easily be 20-30 miles apart; let's not even think about the added distance to the main hub.
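As a rough illustration of why distance and routing matter, here is a minimal sketch; the fiber speed is a physical constant, but the per-hop delay and the example distances/hop counts are assumptions, not measurements of any real service:

```python
# Rough, illustrative estimate of one-way network latency for cloud gaming.

SPEED_IN_FIBER_KM_S = 200_000        # light in fiber travels at roughly 2/3 of c
PER_HOP_DELAY_MS = 0.5               # assumed queuing/switching delay per router hop

def one_way_latency_ms(distance_km: float, hops: int) -> float:
    propagation_ms = distance_km / SPEED_IN_FIBER_KM_S * 1000
    return propagation_ms + hops * PER_HOP_DELAY_MS

# "Best case" 10-mile (~16 km) path vs. more realistic routed paths.
for distance_km, hops in [(16, 2), (160, 8), (800, 12)]:
    total = one_way_latency_ms(distance_km, hops)
    print(f"{distance_km:>4} km, {hops:>2} hops -> ~{total:.2f} ms one way")
```

Propagation alone stays small even over hundreds of kilometres; it is the extra hops, queuing, and the capture/encode/decode chain stacked on top that eat the budget.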

humbleopinion4348d ago

Exactly. Cloud gaming will simply never get as fast as dedicated gaming: you're bound not only by your TV's refresh rate and your local hardware decoding, but also by the speed of light and your distance from the servers.
And it gets even more complicated when we expect next-gen games to run at 60fps and 1080p; this requires roughly 4-8 times more graphics bandwidth than most current console games running at 30fps/720p (or less). Currently, Gaikai and the rest of the services don't even offer video compression that can compete with current-gen output (artifacts such as macroblocking are very easy to see in their compressed video).
Add to this the more advanced audio formats expected next gen, which also require more bandwidth, and now you don't only have a latency issue but also a bandwidth issue with your ISP (and local network and whatnot). Expecting a 60fps game like COD to run at less than 100ms of latency (combined with your screen's latency) when played from the cloud is still wishful thinking.
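To put numbers on that, here is a quick pixel-throughput comparison; the resolutions and framerates are illustrative, and raw pixel counts ignore whatever the video codec can compress away:

```python
# Back-of-the-envelope pixel-throughput comparison (raw pixels per second).

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

target = pixels_per_second(1920, 1080, 60)          # hoped-for next-gen stream
exact_720p30 = pixels_per_second(1280, 720, 30)     # nominal current-gen game
sub_720p30 = pixels_per_second(1024, 576, 30)       # common sub-720p render

print(f"vs. 720p/30:     {target / exact_720p30:.1f}x more pixels per second")
print(f"vs. sub-720p/30: {target / sub_720p30:.1f}x more pixels per second")
```

So the jump is about 4.5x against true 720p/30 and closer to 7x against the sub-720p renders many current-gen games actually use.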

Also: don't forget that, at least according to the rumors, Nvidia was kicked out of the next gen of console gaming, and that MS, Sony and Nintendo all chose AMD/ATI for their graphics solutions.
Nvidia is probably not very interested in promoting the next gen of consoles all that much; their business now lies in dedicated PC graphics cards and mobile platform architecture (with their SoC solutions).

andibandit4347d ago (Edited 4347d ago )

Blah blah blah, latency... back in the '90s I played Action Quake with horrible modem latency and I still loved it, and so did a lot of others. In the end, as long as the playing field is somewhat level, I'll accept a little latency.

SilentNegotiator4347d ago

@and

And people wore parachute pants.
But it isn't the 90s anymore.

humbleopinion4347d ago

@andibandit
But latency hasn't dropped since the nineties: signals traveled, and still travel, at close to the speed of light. If anything, overall latency has actually increased in some cases, thanks to non-CRT monitors adding display lag, home wireless connections introducing more lag, wireless mice/keyboards/gamepads, etc.

It's just bandwidth that has increased tenfold since then. But let's not forget that Quake was also very effective at sending compact messages over the net, with most of the work happening on the client side. You can't seriously compare that to streaming actual video...
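A rough comparison of the data rates involved makes that gap concrete; the packet size, tick rate and stream bitrate below are assumed ballpark figures, not measurements:

```python
# Quake-style netcode vs. streamed video, in raw data-rate terms.

packet_bytes = 200          # assumed size of one server state update
updates_per_second = 20     # assumed tick rate over a dial-up era connection
netcode_kbps = packet_bytes * 8 * updates_per_second / 1000

video_stream_mbps = 5       # assumed bitrate for a compressed 720p game stream

print(f"Quake-style netcode: ~{netcode_kbps:.0f} kbit/s")
print(f"Video game stream:   ~{video_stream_mbps * 1000:.0f} kbit/s "
      f"(~{video_stream_mbps * 1000 / netcode_kbps:.0f}x more)")
```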

Denethor_II4347d ago

You speak as if you were a messiah bringing us, the people, a revolutionary concept before which we must bow down and be grateful. It's just your opinion; no need to ram it down my throat.

uncharted564347d ago

The biggest obstacle to this technology is the internet connection. There are so many people with low-speed connections in the world, even in developed countries, specifically the US. The only country in the world with really good internet infrastructure is Japan: they have hotspots all over the place and high-speed connections available to everyone. That's the only place I can see this working.

KiLLeRCLaM4347d ago

I had a 100Mbit connection 4 years ago when I lived in Sweden. Now I live in Toronto and have 50Mbit...

SilentNegotiator4347d ago

Physical, private hardware will ALWAYS be more responsive than "cloud" technology. Unless you live in a server room.

Ser4347d ago

Yup. Was just about to say that as well. Agreed fully.

Soldierone4347d ago

We will have a choice... a choice not to buy the "hardware." Honestly, if Sony and MS want to go all-in on "cloud computing," why not just jump on one of the already-established services on PC? Go buy a cheap-arse PC and do some cloud gaming.

If consoles lose physical formats, they lose consumers like me.

Syanara174347d ago

I don't see console gaming ever becoming a full cloud service, for practical reasons, despite all the technological advancements:

1) If the internet at your house (or wherever) goes out, a gamer is pretty much screwed.

2) Distrust of a company holding a person's games and save data.

3) Many consumers' attachment to having a physical disc, despite the inconveniences it brings.

Nes_Daze4348d ago

Nvidia should get more sun...

Burning_Finger4348d ago (Edited 4348d ago )

Nvidia won't be around forever.

AMD Radeon FTW

Tachyon_Nova4348d ago

Guess you haven't seen the sales figures of AMD vs. Nvidia, then?

SilentNegotiator4347d ago

Oh please. Neither of them are going anywhere.

ATi_Elite4347d ago (Edited 4347d ago )

Just exactly what are you smoking?

Nvidia is killing AMD in the graphics department this series. AMD took the lead with the HD 5000 series, and the HD 6000 did well, but Nvidia's GTX 600 is killing the HD 7000 (in both performance and sales), and the GTX 660 Ti isn't even out yet!

Nvidia is making big money with Tegra 2, and soon Tegra 3, as slate/tablet sales increase. Meanwhile, AMD has nothing for slates and is almost nonexistent in the desktop department, as Intel's dominance is too strong, although AMD has a strong following in the laptop arena, oh, and in next-gen consoles.

Sure, the consoles skipped Nvidia this time around (like Nvidia gives a crap, seeing how they will sell more Tegra 3s than AMD will sell GPUs in consoles), and Nvidia is too busy with other projects that will once again take gaming to new levels.

It's only a matter of time before AMD is broken in half and sold off to Intel (Graphics division) and Nvidia (CPU division)! AMD doesn't have the money to properly finance research and development like Intel and Nvidia!!

Nvidia = The Way It's Meant to be Played!!!

RedDragan4347d ago

That's not true regarding performance; the green team is doing better in some games while the red team is doing better in others.

This fanboy crap is just that, crap! Grow up FFS!

ThatHappyGamer4348d ago

Nvidia is mad because both MS & Sony are going with AMD for the GPUs in their next-gen consoles.

WeskerChildReborned4348d ago

Was that confirmed, or was it just a rumor? I heard they were going with AMD, but I wasn't sure if it was confirmed or not.

house4348d ago

Same here, but the way Sony has been pushing 3D this gen, you would think they would back Nvidia, since they have some of the best 3D tech out there.

SignifiedSix4347d ago

We all know Microsoft is going with AMD after Nvidia ripped them off last gen.

house4348d ago (Edited 4348d ago )

Exactly what I was about to post. What else can they say but this? It's not like their GPUs are going to be used for a whole gen, so they have to bank on cloud and PC gaming.

@wesker

It's a rumor, but outlets like Forbes have been saying that AMD is making their (PS4) GPU; it's still a rumor, though... Personally, I hope Nvidia is still in one of them (PS4).

WeskerChildReborned4348d ago

Yeah, I wanted Nvidia to be part of it.

sjaakiejj4347d ago

I would take any of these rumors with a large bag of salt.

I think it's highly unlikely that the PS4 and Xbox 720 will be using off-the-shelf AMD components, or even highly modified ones. Sony in particular has worked a lot with IBM over the years, so why drop them and lose backward compatibility with PS3 software?

KiLLeRCLaM4347d ago

Maybe because Nvidia knows that this will be the future of gaming.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan3d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS & AI in general will lower the power draw. It seems like the days of just adding more VRAM & horsepower are over; law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it. Reliance on AI is only going to pick up more & more.

Tapani3d ago (Edited 3d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPUs of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while using 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
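Taking the comment's own figures at face value, the efficiency gain is easy to check; this is a minimal sketch, and both numbers come from the claim above rather than from measurement:

```python
# Quick perf-per-watt check using the figures quoted in the comment.
stock_perf, stock_watts = 1.00, 450      # stock 4090 at its rated board power
tuned_perf, tuned_watts = 0.90, 320      # undervolted/power-limited, per the claim

stock_eff = stock_perf / stock_watts
tuned_eff = tuned_perf / tuned_watts
print(f"Efficiency gain: {tuned_eff / stock_eff - 1:.0%}")   # roughly +27% perf/W
```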

However, in today's world chip manufacturing is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech coming to market that we have yet to invent, or perhaps we can solve existing technologies' manufacturing or production-cost problems.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.

MrBaskerville2d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive before it, though, and then that becomes the viable option.

Firebird3602d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto3d ago

The PS4 Pro had dedicated hardware in it supporting checkerboard rendering, which was used significantly in PS4 first-party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards were released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole3d ago

Well... it's a coffin, man. So at least 4?

Tacoboto3d ago

PSSR in the fall can assume that role.

anast3d ago

and those nails need to be replaced annually

Einhander19723d ago

I'm not sure what the point you're trying to make is, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto3d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack2d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, yet in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offers.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" argument (which has been going around the PC community since 2019) right out of the window.

Einhander19722d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboarding.

But you're talking about FSR 3, which probably is better than checkerboarding; FSR 3 has only started getting games this year, though, so checkerboarding, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for; heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic2d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower-res than the frame itself...
And of course, then there was the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal when pixel counting, but alas, it seems "performant and close enough", not native, is what matters now...
I want to see it run native without DLSS. Why not?

RonsonPL3d ago

An almost-deaf person:
- lightweight, portable $5 speakers, 0.5cm in diameter, are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz TN LCD or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as black frame insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually, we'll have enough rasterization performance to drive 500fps, which will be a game-changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the time required for upscaling makes it completely useless.
This stuff is only usable for cinematic things, like cutscenes and such, not for gaming. Beware of ignorance on the internet. TAA is not "native", and the shoddy look of modern games when you disable any TAA is not "native" either, since it's ruined by the developer's design choices: you can cheat by rendering every fourth pixel when you plan to put a smeary TAA pass on it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you wanted to honestly compare the two.

Stay informed.
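A quick frame-budget calculation shows why a fixed upscaling cost matters so much more at very high framerates; the upscaler cost below is an assumed ballpark figure, not a measured one:

```python
# Frame-time budget sanity check for the 500fps argument above.

target_fps = 500
frame_budget_ms = 1000 / target_fps          # 2.0 ms per frame
assumed_upscaler_cost_ms = 1.0               # assumed cost of an AI upscale pass

remaining_ms = frame_budget_ms - assumed_upscaler_cost_ms
print(f"Frame budget at {target_fps}fps: {frame_budget_ms:.1f} ms")
print(f"Left for actual rendering after upscaling: {remaining_ms:.1f} ms "
      f"({remaining_ms / frame_budget_ms:.0%} of the frame)")
```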

RaidenBlack2d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to whatever native rendering tech looks like then, but it will slowly narrow to the point of being indistinguishable.
Something similar happened with generative AI like Sora: AI-generated videos were turds back when they were introduced (the infamous Will Smith eating-spaghetti video), but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya2d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple1012d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= A portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal1696d ago

Echoing the sentiment here: I think the way GPUs are going, gaming could become secondary to deep learning. I wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan6d ago

No... Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, then there's always AMD, which is doing OK at the mid-range.

Christopher6d ago

***or play at just 1080p and 60fps or something***

My over-two-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priority to other things: the market of new buyers is shrinking as more PC gamers put off replacing older, still-working parts that handle RT/AI well enough as it is. Not to say there aren't people who still upgrade, but I think the market for having the latest and greatest is shrinking compared to the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve of advancement. We need another major technological breakthrough to restart that curve.

D0nkeyBoi6d ago

The unremovable ad makes it impossible to read the article.

Tzuno5d ago (Edited 5d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing5d ago

You also need to consider that Nvidia is heavily invested in cloud gaming, so they are likely to make moves to push you into yet another lifelong subscription service.

Kayser815d ago

Nvidia will never change their price points until AMD or Intel makes a GPU that is comparable and cheaper.
It happened before in the days of the GTX 280: they changed its price from $650 to $450 in a matter of 2 weeks because of the HD 4870, which was being sold at $380.


Nvidia AI Demo Unwittingly Proves that Human Voice Actors, Artists, and Writers are Irreplaceable

Nvidia presented Covert Protocol, a tech demo aiming to showcase the "power" of the Nvidia Ace technology applied to video game characters.

Source: techraptor.net
Eonjay26d ago (Edited 26d ago )

They look like they are in pain. Almost begging to be put down. It was uncomfortable to watch.

PRIMORDUS27d ago

The tech is too early. Come back in 10+ years and see what it can do then.

N3mzor27d ago

That presentation sounds like it was written by an AI using corporate buzzwords.

CS727d ago

I don’t know why people keep thinking of it as AI vs no AI.

A much more likely scenario is the use of AI alongside human work.

E.g., AI voices used during side quests or banter to boost the number of dialogue lines.

AI generating additional pre determined branches in dialog tree options for more freedom in conversations with NPCs

Smellsforfree26d ago

"AI generating additional pre determined branches in dialog tree options for more freedom in conversations with NPCs"

I'm wondering about that last one. Will that make a game more fun or more immersive? In the end, how can it possibly be more than filler content? And if it is filler content, how much do I really want to engage in conversation with it if I know it will lead nowhere?

MrBaskerville26d ago

It's one of those things that sounds cool on paper. But will probably get old fast.

DivineHand12526d ago

The tech is now available, and it is up to creators to create something unique with it.

Profchaos27d ago (Edited 27d ago )

The biggest thing to talk about here is that every interaction requires communication with Inworld servers, so there are three big impacts here (a rough sketch of what each interaction involves is at the end of this comment):
1) Games are always online, no question about it.
2) Delays in processing on Inworld servers, outages, or unexpected load from some astronomically popular game will cause real-time delays in-game. Ever waited for a ChatGPT response? This will be similar, as the context must be pulled via the LLM.

Now, as for the other impact, the artistic one: no, I don't think writers can be replaced. I've mentioned before that AI-generated writing is often word soup; I still stand by that, and it's evident in the video too.
AI cannot accurately convey human emotions, and I don't think it ever will.

I know publishers are looking to cut down on development costs, but what happens when Inworld decides to charge per interaction, or updates their pricing a year after your game goes live? You have no choice but to pay up or shutter the game.

I've felt for a while that we are heading toward games being disposable entertainment, and now that's feeling more and more accurate.
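As mentioned above, here is a minimal sketch of what a single cloud-generated NPC reply involves. The endpoint, payload and timeout are hypothetical; this is not Inworld's actual API, it only illustrates that every line of dialogue becomes a network round trip the game has to wait for or work around:

```python
# Hypothetical client-side call for a server-generated NPC line.
import time
import requests

NPC_DIALOGUE_URL = "https://npc-service.example.com/v1/reply"  # hypothetical endpoint

def npc_reply(npc_id: str, player_line: str, timeout_s: float = 2.0) -> str:
    payload = {"npc_id": npc_id, "player_line": player_line}
    start = time.perf_counter()
    try:
        resp = requests.post(NPC_DIALOGUE_URL, json=payload, timeout=timeout_s)
        resp.raise_for_status()
        reply = resp.json().get("text", "")
    except requests.RequestException:
        # If the service is slow or down, the game needs a canned fallback line.
        reply = "..."
    print(f"Round trip took {time.perf_counter() - start:.2f}s")
    return reply
```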
