
Xbox One already has an answer to Nvidia G-Sync

With the announcement of its G-Sync technology, many believe that Nvidia may already have made the next generation of consoles obsolete before their launch. Sure enough, the GPU manufacturer makes a rather fascinating proposition, with the promise of gaming without annoyances like input lag, stutter, and screen tearing.

Read Full Story >>
gearnuke.com
GalacticEmpire3833d ago

Did misterXmedia write this?

Somebody with some tech know-how explain this crap.

meatysausage3832d ago (Edited 3832d ago )

Haha, I saw that. The people on that blog are mental.

Trying to be fair, they need to fix the problem of not hitting a high enough res.
Although the COD 720p rumors are not confirmed, it's worrying for Xbox owners that this is starting.
The Xbox One might have a great scaler chip, but for people like me who game on a 1080p projector, the difference between native 1080p and upscaled 720p is enormous.

More on topic, that would be good if true, but I doubt it, as the conclusions are not that sound.

Hydrolex3832d ago

I work in tech and I will tell you how this works...

Well it doesn't, you just dream it

meatysausage3832d ago

Hydrolex

which part are you talking about?

Hydrolex3832d ago (Edited 3832d ago )

go home hydrolex, you're drunk

Eonjay3832d ago

Does misterXmedia run a sci-fi blog?

jaosobno3832d ago (Edited 3832d ago )

What dynamic framebuffer does is the following: it changes image resolution on the fly, based on performance analysis.

For example, let's say the Xbox One runs a game at 900p@30FPS. Suddenly the scene becomes too complex for 30 FPS to be maintained. Instead of dropping FPS, the game drops the resolution to 720p in order to maintain the performance target of 30 FPS. When things "get back to normal", the game goes back to 900p.
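A minimal sketch of that kind of heuristic in C++; the resolutions, thresholds, and step sizes are illustrative guesses, not anything Microsoft has published:

// Illustrative dynamic-resolution heuristic: drop the render resolution when the GPU
// misses its frame budget, climb back up when there is headroom again.
#include <algorithm>

struct RenderTarget {
    int width  = 1600;   // 900p render target
    int height = 900;
};

// Call once per frame with the last measured GPU frame time in milliseconds.
void UpdateDynamicResolution(RenderTarget& rt, double gpuFrameTimeMs) {
    const double budgetMs = 1000.0 / 30.0;   // 30 FPS target

    if (gpuFrameTimeMs > budgetMs * 1.05) {
        // Scene got too heavy: step down toward 720p instead of dropping frames.
        rt.width  = std::max(1280, rt.width  - 160);
        rt.height = std::max(720,  rt.height - 90);
    } else if (gpuFrameTimeMs < budgetMs * 0.85) {
        // Headroom again: step back up toward 900p.
        rt.width  = std::min(1600, rt.width  + 160);
        rt.height = std::min(900,  rt.height + 90);
    }
    // The hardware scaler then upscales whatever was rendered to the 1080p output.
}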

So this has nothing to do with things that G-Sync fixes.

Author of this article is an idiot.

meatysausage3832d ago (Edited 3832d ago )

@jaosobno

That's odd, wouldn't that mean you could notice a decrease in quality when it's changing resolutions (in-game) to maintain a stable framerate?

Would be annoying

Kleptic3832d ago

This is actually nothing new, at least in terms of how it ends up for the person playing the game...

the PS3 did it, albeit through software, with WipeOut HD... most of the time the game ran at full HD 60fps... but 60fps is what was locked, not res, and the buffer could change on the fly in order to keep the frames where they needed to be...

Rage's id Tech 5 engine also did the exact same thing, but instead of an overall rendered frame change... it dropped the resolution of specific textures...

I played the hell out of WipeOut HD... can't say I ever noticed it... but with Rage, without an SSD... the texture res pop was very noticeable (doing a quick 180 gave a delay on environment textures, and everyone complained)... but I still thought that game at 60fps was worth more than having it at 30fps with no pop-in...

the Xbox One apparently has some sort of hardware scaler to do this for 'free'... that's fine, I guess... but it's NOTHING like G-Sync... G-Sync fixes the issues created by a lower framerate without gutting the image quality... it, instead, steps the monitor's refresh rate around... the worst thing you'd notice would be some flicker on your monitor, not the helplessly shitty input lag and stutter that you currently get...

so it's hardly an 'answer' to G-Sync... it's a method to try and reduce frame drops by lowering image quality... G-Sync is trying to remove the problems associated with lower frame rates entirely... two completely different things...

BattleTorn3832d ago

Hydrolex,

did you just troll against yourself?

loulou3832d ago

lol misterxmedia... he did call this

Utalkin2me3832d ago

@Hydrolex

Did someone forget to log in with the other account?

UltimateMaster3831d ago

Article:
~Of course, all of this is only in theory based on what the Xbox One architects have claimed. It remains to be seen if such favorable circumstances for game performance will actually be realized in games.

@GalacticEmpire
It's somewhat written by Microsoft, since the article is basing itself on the engineering team at Microsoft.

Whether all that is true or not remains to be seen.
So far, I don't see any problems with the next gen consoles.

heliumhead20303832d ago

This is old news. Basically, when things get hectic, instead of dropping the frame rate the game will drop its resolution. And he lied, because this is already being used in Dead Rising 3.

Kleptic3832d ago

and Rage... and WipeOut HD/Fury... and a few others IIRC...

wishingW3L3832d ago

and let's not forget about the worst-case game: Ninja Gaiden 3.

Seriously, this dynamic res stuff is not a good thing AT ALL. Just optimize and use triple buffering better, like BF4.

Shake_Zula3832d ago

Sure... Hopefully I don't get too many disagrees for this. lol

When they are referring to the dynamic frame buffer, they are talking about the ESRAM implementation. What happens is, first, compressed textures and graphical data are loaded into the 8GB RAM space. When specific data is needed, it is decompressed and transferred to ESRAM, which is then fed to the GPU.

Traditionally, v-sync was a GPU-only process that limited the frame rate to prevent screen tearing. What is detailed in the article is throttling data from ESRAM to the GPU to achieve a target frame rate. So conceptually, it's still v-sync, but at a different point in the process. Nothing new here.

The way this differs from G-Sync is that with G-Sync, hardware in the monitor allows the GPU and the monitor to sync frame rates, which eliminates screen tearing completely and reduces display delay. The latter feature is something completely new, as most high-end displays still have at least a 5ms delay.

In other words, this article is incorrect.

meetajhu3832d ago

This is not possible. Microsoft is feeding gamers false news, because there is no way for the television or monitor to know at what dynamic framerate the image is being rendered each second. What the Xbox One could do is have the year-old Adaptive VSync.

Let me explain what VSync does and why G-Sync exists.

VSync - Locks the game's framerate to the monitor's refresh rate. E.g. if you're playing BF3 on PC at 60fps with VSync ON and your game drops a frame, instead of dropping to 59fps it drops to 45fps. When this happens you see a massive slowdown in the fluidity of the game, but you can actually eliminate the screen tear by doing so. This is why some games don't cause screen tear even without VSync. But games that do screen tear are not syncing with the monitor, because the monitor doesn't know at what rate the game is being rendered apart from its standard steps of 15, 30, 45 & 60. This is the case with the Xbox One.
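A minimal sketch of the quantization being described, for the simplest double-buffered case (triple buffering and adaptive v-sync give different steps, such as the 45fps average mentioned above):

// With plain double-buffered v-sync, a frame that misses a refresh waits for the
// next one, so the effective rate snaps to refresh/N (60 -> 30 -> 20 -> 15 ...)
// instead of degrading smoothly to 59fps.
#include <cmath>
#include <cstdio>

double EffectiveFps(double renderTimeMs, double refreshHz) {
    const double refreshPeriodMs = 1000.0 / refreshHz;
    // Whole refresh intervals the frame occupies before it can be displayed.
    const double intervals = std::ceil(renderTimeMs / refreshPeriodMs);
    return refreshHz / intervals;
}

int main() {
    std::printf("16 ms -> %.1f fps\n", EffectiveFps(16.0, 60.0));  // 60.0
    std::printf("17 ms -> %.1f fps\n", EffectiveFps(17.0, 60.0));  // 30.0 (just missed the budget)
    std::printf("35 ms -> %.1f fps\n", EffectiveFps(35.0, 60.0));  // 20.0
    return 0;
}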

G-Sync - This isn't new tech. It's already been in Nvidia's Quadro series and the iPhone. This is only possible over a DisplayPort cable or similar medium, where, according to the DisplayPort 1.0 specification, monitors are capable of changing their refresh rate dynamically depending on GPU output. And the fluidity is maintained because the G-Sync spec requires 144Hz, and nobody will notice it at that refresh rate. I have no time to write a detailed explanation.

malokevi3832d ago (Edited 3832d ago )

Doesn't seem like it needs explaining. They do a good job explaining how it works in the article.

Sounds like a cool feature to me. Momentary drops in resolution to maintain framerate.

No need to be threatened by this, guys. If PS4 is as perfect as you all seem to believe, then the framerate in PS4 games will never dip below 60fps and the res will always be 1080p... right?

dantesparda3831d ago

Malokevi

Ok, now which Sony fanboys think that the PS4 will never drop frames?

What they believe is that anything the X1 can do, the PS4 can do better.

Ok!? You got that!? Or is that too much for your fanboy mind to understand?

malokevi3831d ago (Edited 3831d ago )

The way I've heard it, "it's the console of 1080p 60fps", no ifs, ands, or buts.

Also that 60fps 1080p is something that simply happens because you're on a PlayStation 4, and that only "slacked-ass developers who are being brought down by the X1" could ever create a game to any other standard.

Little realizing how crazy that really is.

Edit: also, you sound mad... need a hug? 😊

dantesparda3831d ago (Edited 3831d ago )

I sound mad? Why, cuz you MS fanboys are delusional? No, I laugh at you. It's sad really, I think you need the hug. I know you fanboys are crying inside, with the X1 being such a huge letdown and all. With all the downgrades and all.

malokevi3831d ago

Beer + codeine formula = relief from sadness

Don't be such a snickerpuss! Cure your twisty-knickers syndrome. Life ain't so bad.

rainslacker3831d ago

Basically, G-Sync syncs the monitor's refresh rate to the video card's frame output to prevent screen tearing and input lag, as well as hopefully preventing screen skipping or freezing.

MS's apparent answer is to downgrade the native resolution, then upscale through an additional hardware component to maintain frame rate. This upscaler can be thought of as akin to how some Blu-ray players upscale DVD content for HDTVs. It's worth noting that this is already done this gen. Games often have a lower resolution, but are upscaled for output independently of the GPU.

MS's solution is solving a different problem than G-Sync, and quite honestly, it doesn't seem like an answer to G-Sync in the slightest based on this article's description.

Pandamobile3833d ago (Edited 3833d ago )

I don't think the writer of this really understands what G-Sync is all about.

All this sounds like is a pretty standard V-sync implementation. The whole point of G-Sync is to only refresh the display when the GPU sends a new frame, instead of just updating at 60 Hz no matter what.
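A rough sketch of that difference; the numbers are only illustrative, and real G-Sync panels have minimum refresh rates and other details this ignores:

// On a fixed 60 Hz display a finished frame waits for the next scheduled refresh;
// on a variable-refresh (G-Sync-style) display the panel refreshes when the frame is ready.
#include <cmath>
#include <cstdio>

double FixedRefreshScanoutMs(double frameReadyMs, double refreshHz) {
    const double period = 1000.0 / refreshHz;
    return std::ceil(frameReadyMs / period) * period;   // wait for the next vblank
}

double VariableRefreshScanoutMs(double frameReadyMs) {
    return frameReadyMs;   // scan out as soon as the GPU presents the frame
}

int main() {
    const double ready = 20.0;  // frame finished 20 ms after the last refresh
    std::printf("fixed 60 Hz: shown at %.1f ms\n", FixedRefreshScanoutMs(ready, 60.0));  // 33.3
    std::printf("variable:    shown at %.1f ms\n", VariableRefreshScanoutMs(ready));     // 20.0
    return 0;
}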

Kayant3833d ago

Not to mention that it's a hardware module vs a software implementation. Dedicated hardware + software > software alone.

jeffgoldwin3832d ago

True, but you can only supercharge a Honda Civic so much.

Studio-YaMi3832d ago

So what you're saying is that the gameplay would be smooth and won't have that "lag" when the frames drop!?

Is what I'm understanding here right?? :0
Because if so, I'm buying me a freakin' monitor with G-Sync built in!

Mariusmssj3832d ago

Yes, essentially that! Plus, a monitor with G-Sync running a game at 40fps will look as smooth as a monitor running a game at 60fps without G-Sync.

Gimmemorebubblez3833d ago

"this would lead to the elimination of input lag, stutter, and SCREEN TEARING"
Scrolls down to related articles
"The new Ryse build still shows some flaws, mainly SCREEN-TEARING."
-___-
This article is based on assumptions made by the author from a month-old Digital Foundry interview.

I broke my promise to myself not to comment again on N4G, damn.

MorePowerOfGreen3833d ago (Edited 3833d ago )

Explain in detail please. No C-boat spin and guessing.

SignifiedSix913832d ago

"Multiple industry sources have indicated that developers are keen on making use of this hardware feature. However, given the unfinished and evolving state of early Xbox One development kits and considering that development on launch titles was already well underway, we won’t be seeing its utilization any time soon."

Guess you didn't read that, eh?

Godz Kastro3832d ago

@gimmeLESSbubbles

Did you read the entire article? He clearly mentioned no launch games would support it as it wasn't available on early dev kits. Should've kept your promise :/

2cents3832d ago

Most importantly, it was also mentioned that none of the launch window games will benefit from any of the newly implemented development pipelines, as they were finalised way too late in the cycle. The next wave of games should start to tweak and play with these touted features.

We all need to accept that this is a new generation; it's gonna take time for both to really shine. Sony have the advantage in the short term; long term we just have to wait and see. I'm guessing a sensible answer like this won't be taken too well.

ThatCanadianGuy5143833d ago

More wishful thinking and wild theories.

One day it's "yeah, but specs don't matter". Then rumors pop up from the stupidest sources and they get their game face on and head back into the spec-war trenches - until the rumors turn bust again.

After the double GPU rumor went bust this is what's next?
These guys are too much. Borderline insanity at this point.

Godz Kastro3832d ago

@DayZ

Can you link an article where MS stated there were double GPUs?

saikorican3832d ago

Well I think you'd be hard pressed to find a source with Microsoft stating it because he just said himself that it was a rumor.

Godz Kastro3832d ago (Edited 3832d ago )

@rican... my point exactly. He's mixing up a rumor that MS never acknowledged with a story where MS has acknowledged a certain feature.

ziggurcat3832d ago

@ godz:

it was all over the misterx blog. they were even claiming that there were 3 GPUs in the xbone, and that xbone is a 6tflop console...

Godz Kastro3832d ago

@cat... I'm familiar because I followed it, but misterx is misterx and MS is MS. It's not their fault everyone on NeoGAF and N4G was taking that guy seriously. No need to bring his fallacies up in a legit article.

BlackTar1873831d ago

No one takes MisterX seriously except Mister X's 90-second accounts he uses to respond to his own articles.

MightyNoX3832d ago

First it was

"PS4 is just PS3.5! Next gen gaming only possible on Xbox One."

Then it was

"It's about gameplay! It's not the graphics!"

I'm having trouble keeping up...

monkeyDzoro3832d ago

LoooooL.
You killed me there.

BBBirdistheWord3832d ago

"PS4 is just PS3.5! Next gen gaming only possible on Xbox One."

That's an interesting quote.
Who said it?
When did they say it?

I think you made the quote up, but if you give me a link I will stand corrected. Please give me the exact quote though.

You seem to have garnered lots of agrees, so the quote should not be too hard to find.
chop chop.

MightyNoX3832d ago

@BBB

- Many Xbots on Gamefaqs Xbox One board, Gamespot, IGN, Reddit. Mostly before the Xbox's weaker specs became public. Too busy to find it but I've always been soft on helping the lazy and the challenged

Here: http://microsoft-news.com/h...

and Here: http://lmgtfy.com/?q=PS4+is...

5eriously3832d ago

I have to waste a bubble on this. You had me tearing up. I almost spilled my brew. Seems to me that some fanbois have convenient memory loss. They also conveniently forgot how they reacted and what was said before, during and after the PS3 launch window.

Bubbles up!

monkeyDzoro3832d ago

@BBB

Just... step away from your keyboard.

ATi_Elite3832d ago (Edited 3832d ago )

@ MightyNox Let me help you!

Fanboy argument rules 101:

1. When you're talking about Graphics but the PC version is involved, then it's "All about Gameplay", and then "Sony Exclusives" automatically get dragged into the conversation.
(like the PC doesn't have a gazillion exclusives)

2. Now if the argument is about Graphics and just between consoles, the "PS4 is automatically SUPERIOR" no matter what. WHY? Because somehow the PS4 MAGICALLY is 50% more powerful than the XB1 and costs $100 less.

(Now based on Factual Human Laws of Compute Physics the PS4 is NOT 50% more powerful than the XB1, but then AGAIN common sense and FACTS are thrown out the window in fanboy arguments)

3. Now if the argument is about "True Next Gen Gaming" and the PC is NOT involved, then again the PS4 wins automatically, even when the XB1 begins using Cloud Gaming to render, multi-tasking power, and advanced Kinect 2.0 features.

Why? Because most fanboys on N4G do NOT understand Cloud Gaming therefore PS4 wins by default.

4. Now if the PC is involved in Next Gen Gaming just bring up "Sony Exclusives" and then say "I'm not paying $990000 for a Gaming PC"!

5. Last but not least, if the argument is over a Game NOT coming to the PS4, then that GAME sucks (at least until a PS4 version is announced). Doesn't matter the Game, that game is not worth money unless it's on the PS4.

It either has bad shadows, repetitive gameplay, boring story, clunky animation, short SP experience, no split screen Co-op, Press X to win gameplay, QTE, or whatever YOU wanna make up to justify it not being worthy of the PS4.

(Now when said game does get a PS4 release date it MAGICALLY becomes a great experience)

I hope this chart helps and it should make your debates with other fanboys a lot easier as this is the definitive Guide to fanboy debates on N4G.

ziggurcat3832d ago (Edited 3832d ago )

Fanboy argument rules 101:

1. When you're talking about Graphics but the PC version is involved, then it's "All about Gameplay", and then "Ryse" automatically gets dragged into the conversation.
(like the PC doesn't have a gazillion exclusives)

2. Now if the argument is about Graphics and just between consoles "Xbox is automatically SUPERIOR" no matter what. WHY? because somehow the Xbox MAGICALLY is 50% more powerful than the PS4 because it costs $100 more.

(Now based on Factual Human Laws of Compute Physics the Xbox is NOT 50% more powerful than the PS4, but then AGAIN common sense and FACTS are thrown out the window in fanboy arguments)

3. Now if the argument is about "True Next Gen Gaming" and the PC is NOT involved, then again the Xbox wins automatically because it uses Cloud Gaming to render, multi-tasking power, and advanced Kinect 2.0 features.

Why? Because most fanboys on N4G do NOT understand Cloud Gaming therefore Xbox wins by default.

4. Now if the PC is involved in Next Gen Gaming just bring up "Ryse" and then say "I'm not paying $990000 for a Gaming PC"!

5. Last but not least if the argument is over a Game NOT coming to the Xbox then that GAME sucks (at least until an Xbox version is announced). Doesn't matter the Game, that game is not worth money unless it's on the Xbox.

It either has bad shadows, repetitive gameplay, boring story, clunky animation, short SP experience, no split screen Co-op, Press X to win gameplay, QTE, Indie or whatever YOU wanna make up to justify it not being worthy of the Xbox.

(Now when said game does get an Xbox release date it MAGICALLY becomes a great experience)

I hope this chart helps and it should make your debates with other fanboys a lot easier as this is the definitive Guide to fanboy debates on N4G.

there... i fixed it for you.

Baka-akaB3832d ago

Forgot one important rule in both guides ..

90% of the time, the PC fanboys chime in on the Xbox and PS fanboys' brawl, when they are almost never invited or dragged into the mess.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan8d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS & AI in general will lower the power draw. It would seem like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough to hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.

Tapani7d ago (Edited 7d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating do not tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.

However, chip manufacturing today is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech on the market that we have yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion and solar etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.

MrBaskerville7d ago

I don't think fusion is the way forward. It will most likely be too late when it's finally ready, meaning it will probably never be ready. Something else might arrive before it, though, and then it becomes viable.

Firebird3607d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto7d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole7d ago

Well... it's a coffin, man. So at least 4?

Tacoboto7d ago

PSSR in the fall can assume that role.

anast7d ago

and those nails need to be replaced annually

Einhander19727d ago

I'm not sure what the point you're trying to make is, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto7d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack7d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering and basically throws the "it's still blurry and inferior to native rendering" debate (that's been going around in the PC community since 2019) right out of the window.

Einhander19726d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboarding.

But you're talking about FSR 3, which probably is better than checkerboarding; FSR 3 has only started to get games this year, though, so checkerboarding, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic7d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems performance and "close enough", not native, is what matters now...
I want to see it run native without DLSS... why not?

RonsonPL7d ago

An almost-deaf person:
- lightweight portable $5 speakers with 0.5cm diameter drivers are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well, so either true 500fps on a 500Hz TN LCD or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually, we'll have enough performance in rasterization to drive 500fps, which will be a game changer for motion quality while also adding another benefit - lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorance on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developer's design choice - you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass on it later on. When you disable it, you will see a ruined image, horrible pixelation and other visual "glitches", but it is NOT what native would've looked like if you wanted to honestly compare the two.

Stay informed.

RaidenBlack7d ago

Main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the time, but it'll slowly narrow to the point it'll be indistinguishable.
Something similar happened with the genAI Sora... AI-generated videos were a turd when they were introduced (the infamous Will Smith eating video)... but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya7d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple1017d ago

Maybe better to get a budget gaming laptop and link a DualSense to it

= Portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you're able to use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal16910d ago

Echo sentiment here - I think the way GPUs are going, gaming could be secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan10d ago

No.. Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance then there's always AMD, who are doing OK at the mid-range.

Christopher10d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are moving to other priorities: the market reach for new users is shrinking as more PC gamers focus less on replacing older and still-working parts that run RT/AI fine enough as it is. Not to say there aren't people who still do it, but I think the market for having the latest and greatest is shrinking, unlike the past two decades. Problem is we aren't growing things at the rate we were; we're reaching the flattening of that exponential curve in regard to advancement. We need another major technological advancement to restart that curve.

D0nkeyBoi10d ago

The unremovable ad makes it impossible to read the article.

Tzuno10d ago (Edited 10d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing10d ago

You also need to consider that NVIDIA are heavily invested in cloud gaming. So they are likely going to make moves to push you into yet another lifetime subscription service.

Kayser8110d ago

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they changed the price from $650 to $450 in a matter of 2 weeks because of the HD 4870, which was being sold at $380.


Nvidia AI Demo Unwittingly Proves that Human Voice Actors, Artists, and Writers are Irreplaceable

Nvidia presented Covert Protocol, a tech demo aiming to showcase the "power" of the Nvidia Ace technology applied to video game characters.

Read Full Story >>
techraptor.net
Eonjay31d ago (Edited 31d ago )

They look like they are in pain. Almost begging to be put down. It was uncomfortable to watch.

PRIMORDUS31d ago

The tech is too early. Come back in 10+ years and see what it can do then.

N3mzor31d ago

That presentation sounds like it was written by an AI using corporate buzzwords.

CS731d ago

I don’t know why people keep thinking of it as AI vs no AI.

A much more likely scenario is the use of AI alongside human work.

E.g. AI voices used during side quests or banter to boost the number of lines of dialogue.

AI generating additional predetermined branches in dialogue tree options for more freedom in conversations with NPCs.

Smellsforfree30d ago

"AI generating additional pre determined branches in dialog tree options for more freedom in conversations with NPCs"

I'm wondering about that last one. Will that make a game more fun or more immersive? In the end, how can it possibly be more than filler content and then if it is filler content how much do I really want to engage with conversing with it if I know it will lead no where?

MrBaskerville30d ago

It's one of those things that sounds cool on paper. But will probably get old fast.

DivineHand12530d ago

The tech is now available, and it is up to creators to create something unique with it.

Profchaos31d ago (Edited 31d ago )

The biggest thing to talk about here is that every interaction requires communication with Inworld servers, so there are three big impacts here:
1) games are always online, no question about it
2) delays in processing on Inworld servers; outages or unexpected load from some astronomically popular game will cause real-time delays in-game. Ever waited for a ChatGPT response? This will be similar, since the context must be pulled via the LLM (see the sketch below).
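A hypothetical sketch of that dependency (the server call below is a made-up stand-in, not Inworld's actual API): the game has to fire the request off asynchronously and poll for the reply, so any server slowdown, outage, or load spike shows up directly as the NPC going silent.

// Made-up stand-in for a round trip to a hosted dialogue/LLM service;
// here it just sleeps to simulate ~300 ms of network + inference time.
#include <chrono>
#include <cstdio>
#include <future>
#include <optional>
#include <string>
#include <thread>

std::string FetchNpcReplyFromServer(const std::string& playerLine) {
    std::this_thread::sleep_for(std::chrono::milliseconds(300));
    return "Server-generated reply to: " + playerLine;
}

struct NpcDialogue {
    std::future<std::string> inFlight;

    void Ask(const std::string& playerLine) {
        // Kick the request off without blocking the game loop.
        inFlight = std::async(std::launch::async, FetchNpcReplyFromServer, playerLine);
    }

    // Poll once per frame: returns the reply when it has arrived, nothing while waiting.
    std::optional<std::string> Poll() {
        if (inFlight.valid() &&
            inFlight.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
            return inFlight.get();
        }
        return std::nullopt;  // still waiting: show a "thinking" idle, or fall back to canned lines
    }
};

int main() {
    NpcDialogue npc;
    npc.Ask("Where is the safehouse?");
    while (true) {
        if (auto reply = npc.Poll()) {
            std::printf("%s\n", reply->c_str());
            break;
        }
        // ...render a frame, keep the game responsive while the server thinks...
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    return 0;
}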

Now as for the other impact, the artistic one: no, I don't think writers can be replaced. I've mentioned before that AI-generated writing is often word soup, and I still stand by that; it's also evident in the video too.
AI cannot accurately convey human emotions, and I don't think it ever will.

I know publishers are looking to cut down on development costs, but what happens when Inworld decides to charge per interaction or updates their pricing a year after your game goes live? You have no choice but to pay it or shutter the game.

I've felt for a while that we are heading towards a place where games are disposable entertainment, and now it's feeling more and more accurate.
