Xbox One already has an answer to Nvidia G-Sync

With the announcement of its G-Sync technology, many believe Nvidia may have already rendered the next generation of consoles obsolete before their launch. Sure enough, the GPU manufacturer makes a rather fascinating proposition, with the promise of gaming without annoyances like input lag, stutter, and screen tearing.

GalacticEmpire3890d ago

Did misterXmedia write this?

Somebody with some tech know-how, explain this crap.

meatysausage3889d ago (Edited 3889d ago )

Haha, I saw that. The people on that blog are mental.

Trying to be fair, they need to fix the problem of not hitting a high enough res.
Although the COD 720p rumors are not confirmed, it's worrying for Xbox owners that this is starting.
Xbox One might have a great scaler chip, but for people like me who game on a 1080p projector, the difference between native and 720p is enormous.

More on topic, that would be good if true, but I doubt it, as the conclusions are not that sound.

Hydrolex3889d ago

I work in tech and I will tell you how this works...

Well it doesn't, you just dream it

meatysausage3889d ago


which part are you talking about?

Hydrolex3889d ago (Edited 3889d ago )

go home hydrolex, you're drunk

Eonjay3889d ago

Does misterXmedia run a sci-fi blog?

jaosobno3889d ago (Edited 3889d ago )

What a dynamic framebuffer does is the following: it changes image resolution on the fly, based on performance analysis.

For example, let's say the Xbox One runs a game at 900p@30FPS. Suddenly the scene becomes too complex for 30 FPS to be maintained. Instead of dropping FPS, the game drops the resolution to 720p in order to maintain the performance target of 30 FPS. When things "get back to normal", the game goes back to 900p.
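A minimal sketch of that heuristic, assuming made-up resolution steps, thresholds, and a 30 FPS frame-time budget (these are illustrative numbers, not Xbox One internals):

```python
FRAME_BUDGET_MS = 1000.0 / 30.0          # target: 30 FPS
RES_STEPS = [(1600, 900), (1280, 720)]   # hypothetical render resolutions

def pick_resolution(last_frame_ms, current_idx):
    """Drop to a lower render resolution when the last frame blew the
    budget; climb back up when there is comfortable headroom."""
    if last_frame_ms > FRAME_BUDGET_MS and current_idx < len(RES_STEPS) - 1:
        return current_idx + 1           # scene too heavy -> step down to 720p
    if last_frame_ms < FRAME_BUDGET_MS * 0.8 and current_idx > 0:
        return current_idx - 1           # headroom again -> back up to 900p
    return current_idx

idx = 0
for frame_ms in [30.0, 36.0, 35.0, 25.0]:   # simulated frame times (ms)
    idx = pick_resolution(frame_ms, idx)
    print(frame_ms, RES_STEPS[idx])
```

The display output stays at a fixed resolution the whole time; only the internal render target changes, and the hardware scaler stretches it back out for the TV.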

So this has nothing to do with things that G-Sync fixes.

Author of this article is an idiot.

meatysausage3889d ago (Edited 3889d ago )


That's odd. Wouldn't that mean you could notice a decrease in quality when it's changing resolutions (in game) to maintain a stable framerate?

Would be annoying.

Kleptic3889d ago

This is actually nothing new, at least in how it ends up for the person playing the game...

The PS3 did it, albeit through software, with WipEout HD... most of the time the game ran at full HD 60fps... but 60fps is what was locked, not res, and the buffer could change on the fly in order to keep the frames where they needed to be...

Rage's id Tech 5 engine also did the exact same thing, but instead of an overall rendered frame change... it dropped the resolution of specific textures...

I played the hell out of WipEout HD... can't say I ever noticed it... but with Rage, without an SSD... the texture res pop was very noticeable (doing a quick 180 gave a delay on environment textures, and everyone complained)... but I still thought that game at 60fps was worth more than having it at 30fps with no pop-in...

The Xbox One apparently has some sort of hardware scaler to do this for 'free'... that's fine, I guess... but it's NOTHING like G-Sync... G-Sync fixes the issues created by a lower framerate without gutting the image quality... it, instead, steps the monitor's refresh rate around... the worst thing you'd notice would be some flicker on your monitor, not the hopelessly shitty input lag and stutter that you currently get...

So it's hardly an 'answer' to G-Sync... it's a method to try and reduce frame drops by lowering image quality... G-Sync is trying to remove the problems associated with lower frame rates entirely... two completely different things...

BattleTorn3889d ago


did you just troll against yourself?

loulou3889d ago

lol misterxmedia... he did call this

Utalkin2me3889d ago


Did someone forget to log in with the other account?

UltimateMaster3889d ago

~Of course, all of this is only in theory based on what the Xbox One architects have claimed. It remains to be seen if such favorable circumstances for game performance will actually be realized in games.

It's somewhat written by Microsoft, since the article bases itself on claims from the engineering team at Microsoft.

Whether all that is true or not remains to be seen.
So far, I don't see any problems with the next gen consoles.

heliumhead20303889d ago

This is old news. Basically, when things get hectic, instead of dropping the frame rate the game will drop its resolution. And the author is wrong to call it new, because this is already being used in Dead Rising 3.

Kleptic3889d ago

And Rage... and WipEout HD/Fury... and a few others, IIRC...

wishingW3L3889d ago

And let's not forget the worst-case game: Ninja Gaiden 3.

Seriously, this dynamic-res stuff is not a good thing AT ALL. Just optimize properly and use triple buffering, like BF4.

Shake_Zula3889d ago

Sure... Hopefully I don't get too many disagrees for this. lol

When they refer to the dynamic frame buffer, they are talking about the ESRAM implementation. What happens is, first, compressed textures and graphical data are loaded into the 8GB RAM space. When specific data is needed, it is decompressed and transferred to ESRAM, from which it is fed to the GPU.

Traditionally, v-sync is a GPU-side process that limits the frame rate to prevent screen tearing. What the article details is throttling data from ESRAM to the GPU to achieve a target frame rate. So conceptually, it's still v-sync, just at a different point in the pipeline. Nothing new here.

The way this differs from G-Sync is that with G-Sync, hardware in the monitor lets the GPU and the monitor sync their refresh timing, which eliminates screen tearing completely and reduces display latency. The latter is something genuinely new, as most high-end displays still have at least a 5ms delay.

In other words, this article is incorrect.

meetajhu3889d ago

This is not possible. Microsoft is feeding false news to gamers, because there is no way for the television or monitor to know at what dynamic framerate the image is being rendered each second. What the Xbox One could do is have the year-old Adaptive VSync.

Let me explain what VSync does and why G-Sync exists.

VSync - Locks the framerate to the monitor's refresh rate. E.g., if you're playing BF3 on PC, which runs at 60fps with VSync ON, when your game drops 1fps, instead of dropping to 59fps it drops to 45fps. When this happens you see a massive slowdown in the fluidity of the game, but you do eliminate the screen tear. This is why some games don't screen-tear even without VSync. Games that do screen-tear are not syncing with the monitor, because the monitor doesn't know at what refresh rate the game is being rendered apart from its standard steps of 15, 30, 45 & 60. This is the case with the Xbox One.

G-Sync - This isn't entirely new tech. It's already been in Nvidia's Quadro series and the iPhone. It is only possible over a DisplayPort cable, or a medium in which, according to the DisplayPort 1.0 specification, monitors are capable of changing their refresh rate dynamically depending on GPU output. And fluidity is maintained because the G-Sync spec targets 144Hz, and nobody will notice at that framerate. I have no time to write a detailed explanation.
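Whatever the exact step values, the core quantization effect being described is real for classic double-buffered VSync: a finished frame is held until the next refresh tick, so the effective frame rate snaps to divisors of the refresh rate. A toy sketch (the 60 Hz panel and frame times are illustrative assumptions):

```python
import math

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ   # one refresh interval, ~16.7 ms

def vsynced_fps(render_ms):
    """With double-buffered VSync at 60 Hz, a frame that takes even
    slightly longer than one refresh waits for the next one, so the
    effective rate snaps to 60/n: 60, 30, 20, 15..."""
    refreshes_waited = math.ceil(render_ms / REFRESH_MS)
    return REFRESH_HZ / refreshes_waited

for ms in [16.0, 17.0, 33.0, 34.0]:   # simulated GPU render times (ms)
    print(ms, vsynced_fps(ms))
```

Note how a 1 ms miss (16.0 vs 17.0 ms) halves the delivered frame rate; that cliff is exactly what variable-refresh schemes like G-Sync are designed to remove.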

malokevi3889d ago (Edited 3889d ago )

Doesn't seem like it needs explaining. They do a good job explaining how it works in the article.

Sounds like a cool feature to me. Momentary drops in resolution to maintain framerate.

No need to be threatened by this, guys. If PS4 is as perfect as you all seem to believe, then the framerate in PS4 games will never dip below 60fps and the res will always be 1080p... right?

dantesparda3889d ago


Ok, now which Sony fanboys think that the PS4 will never drop frames?

What they believe is that anything the X1 can do, the PS4 can do better.

Ok!? You got that!? Or is that too much for your fanboy mind to understand?

malokevi3889d ago (Edited 3889d ago )

The way I've heard it, "it's the console of 1080p 60fps", no ifs, ands, or buts.

Also, that 60fps 1080p is something that simply happens because you're on a PlayStation 4, and that only "slack-ass developers who are being brought down by the X1" could ever create a game to any other standard.

Little do they realize how crazy that really is.

Edit: also, you sound mad... need a hug? 😊

dantesparda3889d ago (Edited 3889d ago )

I sound mad? Why, because you MS fanboys are delusional? No, I laugh at you. It's sad really; I think you need the hug. I know you fanboys are crying inside, with the X1 being such a huge letdown and all. With all the downgrades and all.

malokevi3889d ago

Beer + codeine formula = relief from sadness

Don't be such a snickerpuss! Cure your twisty-knickers syndrome. Life ain't so bad.

rainslacker3889d ago

Basically, G-Sync syncs the monitor's refresh rate to the video card's output rate to prevent screen tearing and input lag, as well as hopefully preventing screen skipping or freezing.

MS' apparent answer is to downgrade the native resolution, then upscale through an additional hardware component to maintain frame rate. This upscaler can be thought of as akin to how some Blu-ray players upscale DVD content to HDTV resolution. It's worth noting that this is already done this gen: games often render at a lower resolution but are upscaled for output independently of the GPU.

MS' solution is solving a different problem than G-Sync, and quite honestly, it doesn't seem like an answer to G-Sync in the slightest, based on this article's description.

Pandamobile3890d ago (Edited 3890d ago )

I don't think the writer of this really understands what G-Sync is all about.

All this sounds like is a pretty standard V-sync implementation. The whole point of G-Sync is to only refresh the display when the GPU sends a new frame, instead of just updating at 60 Hz no matter what.
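The "refresh only when a frame arrives" distinction can be sketched with a toy timing model (the 60 Hz panel and GPU finish times below are made-up illustration values, not anything from the article):

```python
import math

REFRESH_MS = 1000.0 / 60.0   # one refresh interval of a fixed 60 Hz panel

def display_time(frame_done_ms, variable_refresh):
    """When a frame that finished rendering at `frame_done_ms` would
    actually appear on screen. A G-Sync-style panel refreshes the moment
    the frame arrives; a fixed 60 Hz panel holds it until the next tick."""
    if variable_refresh:
        return frame_done_ms                               # shown immediately
    return math.ceil(frame_done_ms / REFRESH_MS) * REFRESH_MS

for done in [20.0, 45.0, 70.0]:   # simulated GPU finish times (ms)
    print(done,
          display_time(done, variable_refresh=True),
          round(display_time(done, variable_refresh=False), 1))
```

On the fixed panel every frame that misses a tick waits for the next one (added latency and judder); on the variable-refresh panel the wait is simply gone.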

Kayant3890d ago

And that's not to mention it's a hardware module vs a software implementation. Dedicated hardware + software > software alone.

jeffgoldwin3889d ago

True, but you can only supercharge a Honda Civic so much.

Studio-YaMi3889d ago

So what you're saying is that the gameplay would be smooth and won't have that "lag" when the frames drop!?

Is what I'm understanding here right?? :0
Because if so, I'm buying me a freakin' monitor with G-Sync built in!