
Xbox One already has an answer to Nvidia G-Sync

With the announcement of its G-Sync technology, many believe that Nvidia may already have outclassed the next generation of consoles before their launch. Sure enough, the GPU manufacturer makes a rather fascinating proposition, with the promise of gaming without annoyances like input lag, stutter, and screen tearing.

Read Full Story >>
gearnuke.com
GalacticEmpire4266d ago

Did misterXmedia write this?

Somebody with some tech know-how explain this crap.

meatysausage4265d ago (Edited 4265d ago )

Haha, I saw that. The people on that blog are mental

Trying to be fair, they need to fix the problem of not hitting a high enough res.
Although the COD 720p rumors are not confirmed, it's worrying for Xbox owners that this is starting.
Xbox One might have a great scaler chip, but for people like me who game on a 1080p projector, the difference between native and 720p is enormous.

More on topic: that would be good if true, but I doubt it, as the conclusions are not that sound

Hydrolex4265d ago

I work in tech and I will tell you how this works...

Well it doesn't, you just dream it

meatysausage4265d ago

Hydrolex

which part are you talking about?

Hydrolex4265d ago (Edited 4265d ago )

go home hydrolex, you're drunk

Eonjay4265d ago

Does misterXmedia run a sci-fi blog?

jaosobno4265d ago (Edited 4265d ago )

What dynamic framebuffer does is the following: it changes image resolution on the fly, based on performance analysis.

For example, let's say Xbox One runs the game at 900p@30FPS. Suddenly the scene becomes too complex for 30 FPS to be maintained. Instead of dropping FPS, the game drops the resolution to 720p in order to maintain the performance target of 30 FPS. When things "get back to normal", the game goes back to 900p.
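The switching logic described above can be sketched in a few lines of Python. The resolutions, frame-time budget, and hysteresis threshold here are illustrative assumptions, not Xbox One internals:

```python
# Hypothetical sketch of a dynamic-framebuffer controller: drop render
# resolution when a frame runs over budget, restore it when headroom returns.
# All numbers are made up for illustration.

TARGET_MS = 1000 / 30                      # 30 FPS budget: ~33.3 ms per frame
RESOLUTIONS = [(1600, 900), (1280, 720)]   # 900p preferred, 720p fallback

def next_resolution(frame_ms, current_idx):
    """Pick the resolution index for the next frame from the last frame's cost."""
    if frame_ms > TARGET_MS and current_idx < len(RESOLUTIONS) - 1:
        return current_idx + 1             # scene too heavy: drop resolution
    if frame_ms < TARGET_MS * 0.8 and current_idx > 0:
        return current_idx - 1             # comfortable headroom: restore
    return current_idx

idx = 0
for ms in [30.0, 36.0, 35.0, 25.0]:        # simulated frame times in ms
    idx = next_resolution(ms, idx)
    print(RESOLUTIONS[idx])
```

The 0.8 factor is a hysteresis guard so the resolution doesn't oscillate every frame around the budget.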

So this has nothing to do with things that G-Sync fixes.

Author of this article is an idiot.

meatysausage4265d ago (Edited 4265d ago )

@jaosobno

That's odd, wouldn't that mean you could notice a decrease in quality when it's changing resolutions (in game) to maintain a stable framerate?

Would be annoying

Kleptic4265d ago

This is actually nothing new, at least how it ends up for the person playing the game...

the PS3 did it, albeit through software, with WipeOut HD...most of the time the game ran at full HD 60fps...but 60fps is what was locked, not res, and the buffer could change on the fly in order to keep the frames where they needed to be...

Rage's ID Tech 5 engine also did the exact same thing, but instead of an overall rendered frame change...it dropped resolution of specific textures...

I played the hell out of wipeout HD...can't say i ever noticed it...but with Rage, without an SSD...the texture res pop was very noticeable (doing a quick 180 gave a delay on environment textures, and everyone complained)...but i still thought that game at 60fps was worth more than having it at 30fps with no pop in...

the xbox one apparently has some sort of hardware scaler to do this for 'free'...that's fine, i guess...but it's NOTHING like g-sync...as g-sync fixes the issues created by a lower framerate without gutting the image quality...it, instead, steps the monitor refresh rate around...the worst thing you'd notice would be some flicker on your monitor, but not helplessly shitty input lag and stutter that you currently get...

so it's hardly an 'answer' to g-sync...it's a method to try and reduce frame drops by lowering image quality...g-sync is trying to remove the problems associated with lower frame rates entirely...two completely different things...

BattleTorn4265d ago

Hydrolex,

did you just troll against yourself?

loulou4265d ago

lol misterxmedia... he did call this

Utalkin2me4265d ago

@Hydrolex

Did someone forget to log in with the other account?

UltimateMaster4265d ago

Article:
~Of course, all of this is only in theory based on what the Xbox One architects have claimed. It remains to be seen if such favorable circumstances for game performance will actually be realized in games.

@GalacticEmpire
It's somewhat written by Microsoft, since the article is basing itself on claims from the engineering team at Microsoft.

Whether all that is true or not remains to be seen.
So far, I don't see any problems with the next gen consoles.

heliumhead20304265d ago

This is old news. Basically, when things get hectic, instead of dropping the frame rate the game will drop its resolution. And he lied, because this is already being used in Dead Rising 3

Kleptic4265d ago

and rage...and wipeout HD/fury...and a few others iirc...

wishingW3L4265d ago

and let's not forget about the worst case game: Ninja Gaiden 3.

Seriously, this dynamic res stuff is not a good thing AT ALL. Just optimize and use Triple Buffering better like BF4.

Shake_Zula4265d ago

Sure... Hopefully I don't get too many disagrees for this. lol

When they are referring to the dynamic frame buffer, they are talking about the ESRAM implementation. What happens is, first, compressed textures and graphical data are loaded into the 8GB RAM space. When specific data is needed, it is decompressed and transferred to ESRAM, which is then fed to the GPU.

Traditionally, v-sync was a GPU-only process that limits the frame rate to prevent screen tearing. What is detailed in the article is throttling data from ESRAM to the GPU to achieve a target frame rate. So conceptually, it's still v-sync, but at a different point in the process. Nothing new here.

The way this differs from G-sync is that in G-sync, hardware in the monitor allows the GPU and the monitor to sync framerates, which eliminates screen tearing completely and reduces graphical delay. The latter feature is something completely new as most high-end displays still have at least a 5ms delay.

In other words, this article is incorrect.

meetajhu4265d ago

This is not possible; Microsoft is feeding false news to gamers, because there is no way for the television or monitor to know at what dynamic framerate the image is being rendered each second. What Xbox One could do is have the year-old Adaptive VSync.

Let me explain what VSync does and why G-Sync exists.

VSync: locks the game's framerate to the monitor's refresh rate. E.g., if you're playing BF3 on PC at 60fps with VSync on and the game misses a frame, instead of dropping to 59fps it drops to 45fps. When this happens you see a massive slowdown in fluidity, but it does eliminate screen tearing. This is why some games don't tear even without VSync. Games that do tear are not syncing with the monitor, because the monitor doesn't know at what refresh rate the game is being rendered apart from its standard steps of 15, 30, 45 & 60. This is the case with Xbox One.
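The stepping behavior being described can be sketched with a toy model. This assumes simple double-buffered v-sync at 60 Hz, where a frame that misses a refresh must wait for the next one, so only 60/n rates are reachable; real drivers (and triple buffering, which is where in-between figures like 45fps come from) behave differently:

```python
# Toy model: under double-buffered v-sync, a finished frame can only be
# shown on a refresh boundary, so effective FPS quantizes to 60/n.
import math

def vsync_fps(frame_ms, refresh_hz=60):
    """Effective FPS when every frame must land on a refresh tick."""
    period_ms = 1000 / refresh_hz
    refreshes_waited = max(1, math.ceil(frame_ms / period_ms))
    return refresh_hz / refreshes_waited

print(vsync_fps(16.0))   # within the ~16.7 ms budget -> 60.0
print(vsync_fps(17.0))   # just missed the deadline   -> 30.0
print(vsync_fps(40.0))   # two missed refreshes       -> 20.0
```

Note the cliff: missing the budget by 1 ms halves the frame rate, which is exactly the "massive slowdown in fluidity" complaint.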

G-Sync: this isn't a new tech. It's already been in Nvidia's Quadro series and the iPhone. It's only possible using a DisplayPort cable (or similar medium), where, according to the DisplayPort 1.0 specification, monitors are capable of changing their refresh rate dynamically depending on GPU output. And the fluidity is maintained because G-Sync's spec calls for 144Hz, and nobody will notice it at that refresh rate. I have no time to write a detailed explanation.

malokevi4265d ago (Edited 4265d ago )

Doesn't seem like it needs explaining. They do a good job explaining how it works in the article.

Sounds like a cool feature to me. Momentary drops in resolution to maintain framerate.

No need to be threatened by this, guys. If PS4 is as perfect as you all seem to believe, then the framerate in PS4 games will never dip below 60fps and the res will always be 1080p... right?

dantesparda4265d ago

Malokavi

Ok, now which Sony fanboys think that the PS4 will never drop in frames?

What they believe is that anything the x1 can do, the PS4 can do better.

Ok!? You got that!? Or is that too much for your fanboy mind to understand?

malokevi4265d ago (Edited 4265d ago )

The way I've heard it, "it's the console of 1080p 60fps", no ifs, ands, or buts.

Also that 60fps 1080p is something that simply happens because you're on a playstation4, and that only "slacked-ass developers who are being brought down by the X1" could ever create a game to any other standard.

Little realizing how crazy that really is.

Edit: also, you sound mad... need a hug? 😊

dantesparda4265d ago (Edited 4265d ago )

I sound mad? Why, cuz you MS fanboys are delusional? No, I laugh at you's. It's sad really, I think you need the hug. I know you fanboys are crying inside, with the X1 being such a huge letdown and all. With all the downgrades and all.

malokevi4265d ago

Beer + codeine formula = relief from sadness

Don't be such a snickerpuss! Cure your twisty nickers syndrome. Life aint so bad.

rainslacker4265d ago

Basically, G-Sync syncs the monitors refresh rate to the Video cards refresh rate to prevent screen tearing and input lag, as well as hopefully preventing screen skipping or freezing.

MS's apparent answer is to downgrade the native resolution, then upscale through an additional hardware component to maintain frame rate. This upscaler can be thought of as akin to how some Blu-ray players can upscale DVD content for HDTVs. It's worth noting that this is done this gen already: games often render at a lower resolution but are upscaled for output independently of the GPU.

MS's solution is solving a different problem than G-Sync, and quite honestly, doesn't seem like an answer to G-Sync in the slightest based on this article's description.

Pandamobile4266d ago (Edited 4266d ago )

I don't think the writer of this really understands what G-Sync is all about.

All this sounds like is a pretty standard V-sync implementation. The whole point of G-Sync is to only refresh the display when the GPU sends a new frame, instead of just updating at 60 Hz no matter what.
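That distinction can be illustrated with a toy timing model. The numbers here are made up for illustration; real display pipelines add more stages:

```python
# Toy model of fixed vs variable refresh: with a fixed 60 Hz display a
# finished frame waits for the next refresh tick, while a variable-refresh
# (G-Sync-style) display shows it the moment the GPU delivers it.
import math

def fixed_refresh_display_time(render_done_ms, refresh_hz=60):
    """Frame is shown at the next refresh boundary after it finishes."""
    period = 1000 / refresh_hz
    return math.ceil(render_done_ms / period) * period

def variable_refresh_display_time(render_done_ms):
    """Display refreshes when the frame arrives, adding no extra wait."""
    return render_done_ms

done = 20.0  # frame finished 20 ms after the last refresh
print(fixed_refresh_display_time(done))     # held until the ~33.3 ms tick
print(variable_refresh_display_time(done))  # shown at 20.0 ms
```

The ~13 ms gap in this example is the stutter/latency that a fixed-refresh display adds whenever the GPU's frame pacing doesn't line up with 60 Hz.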

Kayant4266d ago

And not to mention that it's a hardware module vs a software implementation. Dedicated hardware + software > just software.

jeffgoldwin4265d ago

True, but you can only supercharge a Honda Civic so much.

Studio-YaMi4265d ago

So what you're saying is that the gameplay would be smooth and won't have that "lag" when the frames drop !?

Is what I'm understanding here right?? :0
Because if so, I'm buying me a freakin' monitor with G-Sync implemented!

Mariusmssj4265d ago

Yes, essentially that! Plus, a monitor with G-Sync running a game at 40fps will look as smooth as a monitor running a game at 60fps without G-Sync.

Gimmemorebubblez4266d ago

"this would lead to the elimination of input lag, stutter, and SCREEN TEARING"
Scrolls down to related articles
"The new Ryse build still shows some flaws, mainly SCREEN-TEARING."
-___-
This article is based on assumptions made by the author from a month old Digital Foundry interview.

I broke my promise to myself not to comment again on N4G, damn.

MorePowerOfGreen4266d ago (Edited 4266d ago )

Explain in detail please. No C-boat spin and guessing.

SignifiedSix914265d ago

"Multiple industry sources have indicated that developers are keen on making use of this hardware feature. However, given the unfinished and evolving state of early Xbox One development kits and considering that development on launch titles was already well underway, we won’t be seeing its utilization any time soon."

Guess you didn't read that, eh?

Godz Kastro4265d ago

@gimmeLESSbubbles

Did you read the entire article? He clearly mentioned no launch games would support it as it wasn't available on early dev kits. Should've kept your promise :/

2cents4265d ago

Most importantly, it was also mentioned that none of the launch window games will benefit from any of the newly implemented development pipelines, as they were finalised way too late in the cycle. The next wave of games should start to tweak and play with these touted features.

We all need to accept that this is a new generation; it's gonna take time for both to really shine. Sony has the advantage in the short term; long term we just have to wait and see. I'm guessing a sensible answer like this won't be taken too well.

ThatCanadianGuy5144266d ago

More wishful thinking and wild theories.

One day it's "yeah but specs don't matter". Then rumors crop up from the stupidest sources and now they get their game face on and head back into the spec war trenches - until the rumors turn bust again.

After the double GPU rumor went bust this is what's next?
These guys are too much. Borderline insanity at this point.

Godz Kastro4265d ago

@DayZ

Can you link an article where MS stated there were double gpu's?

saikorican4265d ago

Well I think you'd be hard pressed to find a source with Microsoft stating it because he just said himself that it was a rumor.

Godz Kastro4265d ago (Edited 4265d ago )

@rican... my point exactly. He's mixing up a rumor that MS never acknowledged with a story where MS has acknowledged a certain feature.

ziggurcat4265d ago

@ godz:

it was all over the misterx blog. they were even claiming that there were 3 GPUs in the xbone, and that xbone is a 6tflop console...

Godz Kastro4265d ago

@cat... I'm familiar because I followed, but MisterX is MisterX and MS is MS. It's not their fault everyone on NeoGAF and N4G was taking that guy seriously. No need to bring his fallacies up in a legit article.

BlackTar1874265d ago

No one takes MisterX seriously except MisterX's 90-second accounts he uses to respond to his own articles.

MightyNoX4265d ago

First it was

"PS4 is just PS3.5! Next gen gaming only possible on Xbox One."

Then it was

"It's about gameplay! It's not the graphics!"

I'm having trouble keeping up...

monkeyDzoro4265d ago

LoooooL.
You killed me there.

BBBirdistheWord4265d ago

"PS4 is just PS3.5! Next gen gaming only possible on Xbox One."

That's an interesting quote.
Who said it?
When did they say it?

I think you made the quote up, but if you give me a link I will stand corrected. Please give me the exact quote though.

You seem to have garnered lots of agrees, so the quote should not be too hard to find.
chop chop.

MightyNoX4265d ago

@BBB

- Many Xbots on Gamefaqs Xbox One board, Gamespot, IGN, Reddit. Mostly before the Xbox's weaker specs became public. Too busy to find it but I've always been soft on helping the lazy and the challenged

Here: http://microsoft-news.com/h...

and Here: http://lmgtfy.com/?q=PS4+is...

5eriously4265d ago

I have to waste a bubble on this. You had me tearing. I almost spilled my brew. Seems to me that some fanbois have convenient memory loss. They also forgot conveniently how they reacted and what was said before, during and after the PS3 launch window.

Bubbles up!

monkeyDzoro4265d ago

@BBB

Just... step away from your keyboard.

ATi_Elite4265d ago (Edited 4265d ago )

@ MightyNox Let me help you!

Fanboy argument rules 101:

1. When you're talking about Graphics but the PC version is involved then it's "All about Gameplay " and then "Sony Exclusives" automatically gets dragged into the conversation.
(like the PC doesn't have a Gazillion Exclusives)

2. Now if the argument is about Graphics and just between consoles "PS4 is automatically SUPERIOR" no matter what. WHY? because somehow the PS4 MAGICALLY is 50% more powerful than the XB1 and cost $100 less.

(Now based on Factual Human Laws of Compute Physics the PS4 is NOT 50% more powerful than the XB1 but then AGAIN commonsense and FACTS are thrown out the window in fanboy arguments)

3. Now if the argument is about "True Next Gen Gaming" and the PC is NOT involved, then again the PS4 wins automatically, even when the XB1 begins using Cloud Gaming to render, multi-tasking power, and advanced Kinect 2.0 features.

Why? Because most fanboys on N4G do NOT understand Cloud Gaming therefore PS4 wins by default.

4. Now if the PC is involved in Next Gen Gaming just bring up "Sony Exclusives" and then say "I'm not paying $990000 for a Gaming PC"!

5. Last but not least, if the argument is over a Game NOT coming to the PS4, then that GAME sucks (at least until a PS4 version is announced). Doesn't matter the Game; that game is not worth money unless it's on the PS4.

It either has bad shadows, repetitive gameplay, boring story, clunky animation, short SP experience, no split screen Co-op, Press X to win gameplay, QTE, or whatever YOU wanna make up to justify it not being worthy of the PS4.

(Now when said game does get a PS4 release date it MAGICALLY becomes a great experience)

I hope this chart helps and it should make your debates with other fanboys a lot easier as this is the definitive Guide to fanboy debates on N4G.

ziggurcat4265d ago (Edited 4265d ago )

Fanboy argument rules 101:

1. When you're talking about Graphics but the PC version is involved then it's "All about Gameplay " and then "Ryse" automatically gets dragged into the conversation.
(like the PC doesn't have a Gazillion Exclusives)

2. Now if the argument is about Graphics and just between consoles "Xbox is automatically SUPERIOR" no matter what. WHY? because somehow the Xbox MAGICALLY is 50% more powerful than the PS4 because it costs $100 more.

(Now based on Factual Human Laws of Compute Physics the Xbox is NOT 50% more powerful than the PS4 but then AGAIN commonsense and FACTS are thrown out the window in fanboy arguments)

3. Now if the argument is about "True Next Gen Gaming" and the PC is NOT involved, then again the Xbox wins automatically because it uses Cloud Gaming to render, multi-tasking power, and advanced Kinect 2.0 features.

Why? Because most fanboys on N4G do NOT understand Cloud Gaming therefore Xbox wins by default.

4. Now if the PC is involved in Next Gen Gaming just bring up "Ryse" and then say "I'm not paying $990000 for a Gaming PC"!

5. Last but not least if the argument is over a Game NOT coming to the Xbox then that GAME sucks (at least until an Xbox version is announced). Doesn't matter the Game, that game is not worth money unless it's on the Xbox.

It either has bad shadows, repetitive gameplay, boring story, clunky animation, short SP experience, no split screen Co-op, Press X to win gameplay, QTE, Indie or whatever YOU wanna make up to justify it not being worthy of the Xbox.

(Now when said game does get a Xbox release date it MAGICALLY becomes a great experience)

I hope this chart helps and it should make your debates with other fanboys a lot easier as this is the definitive Guide to fanboy debates on N4G.

there... i fixed it for you.

Baka-akaB4265d ago

Forgot one important rule in both guides ..

90% of the time, the PC fanboys chime in when the Xbox and PS fanboys brawl, even though they are almost never invited or dragged into the mess.


Nintendo Switch 2 Leveled Up With NVIDIA AI-Powered DLSS and 4K Gaming

Nvidia writes:

The Nintendo Switch 2 takes performance to the next level, powered by a custom NVIDIA processor featuring an NVIDIA GPU with dedicated RT Cores and Tensor Cores for stunning visuals and AI-driven enhancements.

Read Full Story >>
blogs.nvidia.com
ZycoFox85d ago

The raytracing probably doesn't even equal a low-end PC GPU, and even if it did, it would probably be mostly useless. They'll probably force it into some game now that will run like shit, maybe 30fps at best, just because "it can do it".

B5R84d ago

Raytracing is so unnecessary for a handheld. I just hope you can turn it off.

Vits84d ago

A lot of gamers don’t realize that ray tracing isn’t really about making games look better. It’s mainly there to make development easier and cheaper, since it lets devs skip a bunch of old-school tricks to fake reflections and lighting. The visual upgrade is just a nice bonus, but that’s not the main reason the tech exists.

So you can be 100% sure that developers will try to implement it every chance they get.

RaidenBlack84d ago (Edited 84d ago )

Agree with Vits... but also to add: if devs and designers just bolt RT onto a game world, it won't always work as expected. RT is not just reflections but also lighting and illumination. For example, if you create a room with minimal windows, it will look dark af if RTGI is enabled. Devs and designers need to sort out the game world design accordingly as well.
DF's Metro Exodus RT upgrade is an amazing reference video to go through, if anybody's interested.

darthv7284d ago

So is HDR... but they have it anyway.

thesoftware73084d ago

Some PS5 and SX games run at 30fps with RT...just like those systems, if you don't like it, turn it off.

I only say this to say, you make it seem like a problem exclusive to the Switch 2.

Neonridr84d ago (Edited 84d ago )

sour grapes much?

"It probably doesn't do it well because it's Nintendo and they suck". That's how your comment reads. Why don't you just wait and see before making these ridiculous statements?

Goodguy0184d ago

Please. I'd like to play my switch games on my 4k tv without it looking all doodoo.

PRIMORDUS84d ago

Nvidia could have said this months ago and cut the bullshit. Anyway the rumors were true.

Profchaos84d ago

Would have been nice but NDA likely prevented them from saying anything

PRIMORDUS84d ago

TBH I don't think Nvidia would have cared if they broke the NDA. A little fine they'd pay, and then back to their AI shit. They don't even care about GPUs anymore. I myself would like them to leave the PC and console market.

Tacoboto83d ago

This story was written half a decade ago when the world knew Nvidia would provide the chip for Switch 2 and DLSS was taking off.

Profchaos83d ago

Yeah, but a similar thing happened a long time ago with 3dfx: they announced they were working with Sega when they took the company public, and Sega terminated the Dreamcast GPU contract and went with an ultimately weaker chipset.

So there's a precedent, but it's not like Nintendo would have had much of an option: it's AMD, NVIDIA, or Intel.

Profchaos84d ago

I'm not expecting anything from ray tracing, but DLSS will be the thing that sees the unit get some impossible ports.

andy8584d ago

Correct. All I'm seeing online is that it'll never run FF7 Rebirth. If it can run Cyberpunk, it'll run it. The DLSS will help. Obviously only 30fps, but a lot don't care.

Profchaos84d ago (Edited 84d ago )

Exactly right. When I buy a game on Switch I know what I'm getting into: I'm buying it for its portability, and I'm willing to sacrifice fidelity and performance to play on a train or comfortably from a hotel room when I travel for work.
