NVIDIA wowed the industry with the reveal of its G-SYNC technology at a press event in Montreal. Read the full report and find out why it could leave the PlayStation 4 and Xbox One trailing behind before they even launch.
^ What a compelling argument you make.
The graphics card companies can make any tech they'd like, but if developers don't use it, the tech won't be making anything "obsolete". It's not obsolete until it's been replaced.
@inveni0 G-SYNC works with every game ever made. If you didn't understand that, you're not really well informed on the subject.
Most people on Steam are running integrated graphics. The minimum GPU required for this will be the GeForce 650 Ti Boost. There are monitor compatibility issues too, so the only way to get G-SYNC will be to buy a modded Asus VG248QE, or wait until sometime next year. I don't think people know the meaning of the word obsolete. Especially if you're mainly a PC gamer. Your world is different than a console gamer's. You care too much about some things that console gamers don't stress about. @kingduqc Except for those games made specifically for PlayStation, Xbox and Nintendo platforms... so in other words, no. Not for "every game ever made."
The PS4/Xbox One will have no problems dominating. The reason is that Nvidia's technology, while state of the art, is so far from average discretionary spending that it won't matter. It's the same reason Aston Martins aren't mass-market vehicles. Lest we forget that Gaikai is powered by Nvidia... we begin to realize that Nvidia has more invested in consoles than meets the eye.
But I thought everything Nvidia has released since the PS4 and Xbone were announced has made next-gen consoles "obsolete", according to Nvidia? So is this what makes them obsolete, or does everything else make them obsolete, or does everything Nvidia does from now on make next-gen consoles obsolete? These are rhetorical questions, btw.
@ShinMaster That's only a matter of time. Plus, I was referring to him saying it can't be useful if devs don't use it, which actually has nothing to do with it. I live close to Montreal and got the chance to be there at Nvidia's showcase. I actually saw it hands-on, and I truthfully believe G-SYNC is one of the upcoming things in gaming that is a game changer. And that's coming from someone who has a great 2560x1440 monitor and a rig with SLI 770s. (That, and the Oculus, but that's another story.) But I'm sure you will be fine without it; after all, your standards are so low you won't even notice.
Nvidia is trying hard to make next-gen consoles obsolete. Not going to happen.
No, you're wrong, they already are obsolete.
They're trying hard to improve PC gamers' experience.
Technically speaking, the consoles are based on less than state of the art hardware at this point, but it doesn't matter. Even if development tools allow pc devs to code to the metal like console devs, the big selling points of consoles are 1) A static hardware configuration that always works, 2) Ease of use and 3) The cost as compared to the experience- it's relatively cheap and easy to play great games on consoles. That is why they exist. To call them obsolete is to entirely miss the point and the value proposition of the console as a product. And this is coming from a PC gamer, so no fanboyism here at all. I just wish the authors of these articles would realize the larger point of the console as a software delivery device and a viable business proposition for these consumer electronics companies.
Here's the thing. Not every one of your friends is shelling out $1,300 for a gaming PC. $400, or even $300 in a bit, isn't too bad. Gaming is more fun with friends. And the exclusives make it well worth the price. So keep your cool graphics cards, you'll be playing alone anyway.
I'm sure anyone can make a friend among the 55 million users on Steam.
You're poor, we get it. It's cool. Don't be hating.
Huh, I hear this all the time. Are you trying to say PC doesn't have exclusives?
Lol domce. I have a fully built gaming rig. I still don't play it as much as I do my PS3. Maybe get a job and stop leeching off your mom, and then talk.
So now PC gamers finally have a reason to be thankful to consoles. Finally, the benefits of certain aspects of console game development and functionality (buy it and it just works) can be implemented on the far more time-, resource- and technically demanding PC platform. Simpler accessibility and setup for the end user, and easier and more powerful tools for development.

This coming generation of gaming is really going to start to balloon. I fear that both the One and PS4 will struggle to make it 10 years if the PC continues to flex the way it currently is. When the latest PC gubbins becomes affordable, it will be a very tempting alternative, especially if game devs segregate console and PC development to really take advantage of all the power that is soon to be unleashed. DDR4 is coming soon, so that outdoes the One's memory. Then there is talk of GDDR6, so...

The idea of gaming is that of playing and detaching oneself from the real world for a time. These things mean nothing when you nail that headshot from across the map with a sniper rifle while in mid-air, or pull off a dragon punch in Street Fighter for the first time. Regardless of what platform we play on, our enjoyment will be no less, as imagination is a powerful thing. Being 'lost' in the game is what it's all about. Bring on new tech and experiences; all it brings is a wealth of choices for us. The real difficulty is choosing which one to play first!
I've never had any tearing since I finished Half-Life 2 (V-sync, steady 60fps). Right now this new tech only seems to work with TN panels... who the hell wants to play Crysis 4 at 1440p on a TN monitor? I'm not going to throw my 30" IPS or PVA monitor out of the window just because Nvidia said so. Since this new tech is meant for new monitors, nothing will stop anyone from connecting a Wii U, Box One or PS4 and enjoying "a smoother experience; 30+fps (45)". Besides, from a noob POV, this looks like it's going to be a "cache" for monitors, like hard drives have 64MB caches these days... a few years ago most hard drives had 8MB. More space means less caching (smoother). In other words, we have to see whether or not this increases the input lag a bit. Big announcement? Not really... it has potential, but nothing more.
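The input-lag worry is easy to put rough numbers on. Here's a back-of-envelope sketch in Python (my own toy model with assumed values, nothing from Nvidia's announcement): every extra queued buffer holds a finished frame for roughly one more refresh before it reaches the screen.

```python
FRAME_MS = 1000 / 60  # one refresh period on a 60 Hz display, ~16.7 ms

def input_to_display_ms(queued_buffers, render_ms=FRAME_MS):
    """Rough worst-case input-to-photon latency: the input is sampled at
    the start of a frame, waits render_ms to be drawn, then sits behind
    every buffer already queued for scanout."""
    return render_ms + queued_buffers * FRAME_MS

print(round(input_to_display_ms(1)))  # double buffering: ~33 ms
print(round(input_to_display_ms(2)))  # triple buffering: ~50 ms
```

So more buffering does smooth delivery at the cost of latency, which is exactly the trade-off being asked about; whether G-SYNC's approach adds or removes a buffer's worth of delay is the thing to watch for in reviews.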
We get it, you're a pc gamer. Move along.
By the time the cost of this new hardware is cheap enough for most people to buy, the PS5 and Xbone 2 will already be implementing this new hybrid of tech. Just like old, crappy LCD TVs used to cost $2,000, but the same-sized LED TVs now only cost $350. Until then, first-party publishers will no doubt be making AAA titles that PC gamers wish they had on their PCs, and vice versa.
Um, sorry, but it uses currently existing tech. The only difference will be that the monitors have a chip inside that interacts with current GPUs.
The only problem is that Nvidia isn't releasing this for AMD-based cards or APUs.
Next gen didn't wow me as much as I expected it would; G-SYNC is just making the Steam Machines even more appealing. What advantages do next-gen consoles even have compared to Steam Machines?
Exclusives, which is what has always set them apart.
Personally I'll go with PS4 (for the exclusives) and PC with G-SYNC.
Except console exclusives can't hold a candle to PC exclusives. Console exclusives are pretty much two overdone genres: the usual third-person action game, and the generic shooter with a 6-8 hour campaign. Nothing special about either, really. PC, on the other hand, has whole genres that are exclusive, along with plenty of third-person action games and FPSes to play. Best of both worlds.
@sorane, let's just agree to disagree. I find that Sony, MS and Nintendo make great exclusive games.
@sorane I'm primarily a PC gamer. But to be fair, Nintendo's exclusives are very different from what we can get on PC. There isn't a single game on PC that's like Rhythm Heaven, Wario Ware, Smash Bros, Rune Factory, or even Pokémon. That's why Nintendo's consoles and portables are the first ones I buy. Some months (or years) later I think about Sony's consoles, depending on the exclusives that were released. I only bought a PS3 when Heavy Rain was released. I don't even consider Microsoft's consoles. Its exclusives are all FPS or racing, and there are tons of those on PC. (And the 3RL fiasco, and how badly they handled the situation, crushed any trust I had in them.)
Well, Steam Machines are mainly aimed at the TV/living room environment, while G-SYNC will, for quite a while, be exclusive to monitors.
They are cheaper.
PC gaming is far more advanced than console gaming, but console gamers don't care. These articles strive to make them feel inadequate, and although I'm a PC gamer, I don't think that's right. To each their own. Now let me get a G-SYNC monitor at 1440p, please. :)
I'm usually a console gamer, but I do have a pretty beast gaming rig, and I switch back and forth between PC and PS3. It's gonna be the same thing when the PS4 comes out: Battlefield 4 for PC, Killzone: Shadow Fall for PS4, you get the idea. Still don't know why people have to act like dicks to one another.
Yet everyday on this site we hear "ps4 has betta grafix zen xboxs" or "xboxs haz betta grafix zenz ps4zzzzzz" then go on to talk about how great their features are when they can do them already on the other devices and things they own. So obviously console gamers do care for that.
PC gamer, nothing to see here folks, move along.
It's an article from PC Gamer, about a PC feature. What were you expecting?
An article about PCs, not an article about insecure PC gamers talking about consoles and how they think they're becoming obsolete... again... for the 13289412394823894th time.
These guys act as if triple buffering didn't exist. On top of that, not every game needs to be V-synced... In the end, no V-sync will give you the best results, but people are so dumb that they don't realize that and fall for the marketing ploy. Another thing: console gamers have been playing sub-30fps games for years without complaining anyway. With the more stable frame rates of next gen, console gamers will notice huge improvements without the need for a proprietary gimmick.
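To put some rough numbers on the V-sync argument, here's a minimal Python sketch (my own toy model, not anything from the article or from Nvidia) of why classic double-buffered V-sync snaps the frame rate down to 30fps when the GPU can't hold 60, while a variable-refresh scheme simply tracks the GPU:

```python
import math

REFRESH_MS = 1000 / 60  # a 60 Hz panel refreshes every ~16.7 ms

def vsync_double_buffered_fps(render_ms):
    """Classic double-buffered V-sync: a frame that misses a refresh waits
    for the next one, so the display interval snaps to a whole multiple of
    the refresh period (60 -> 30 -> 20 fps steps)."""
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 1000 / (intervals * REFRESH_MS)

def variable_refresh_fps(render_ms):
    """The variable-refresh idea: the panel refreshes when the frame is
    ready, so display rate simply tracks render rate (within panel limits)."""
    return 1000 / render_ms

# A GPU that takes 20 ms per frame (50 fps of raw throughput):
print(round(vsync_double_buffered_fps(20)))  # -> 30: V-sync halves it
print(round(variable_refresh_fps(20)))       # -> 50: tracks the GPU
```

Triple buffering, which the comment above mentions, lands between the two: the GPU keeps rendering at its full 50fps, but frames are still shown on the fixed 60Hz grid, so delivery is faster than double buffering at the cost of uneven frame pacing.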
This guy clearly doesn't understand the advantages and disadvantages of V-sync. Go watch the Nvidia conference and educate yourself.
This guy is 10 at best, plays DS, and knows nothing about the visual impact of fps. Not to mention he didn't read the article, or he'd at least be informed of V-sync's advantages and disadvantages on current displays. If we want games to be cinematic, interactive experiences, graphics by itself won't be everything; output also counts.
Well, let's be fair here... Console games typically aim for 30fps because that is as much as the hardware will allow. The PS4 and X1 are a bit different, because one of the "next-gen" targets is 60fps. Essentially, the PS4 and X1 may very well run into some tearing issues depending on the TV/monitor. And G-SYNC, in a nutshell, eliminates tearing by eliminating frame lag at the highest frame rate possible for the given hardware. This is going to become a standard soon, and I'd imagine Sony is kicking themselves, since they could have marketed this for their Bravias. PS4-ready TVs would have captured at least 25% of the PS4 market easily.
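On the tearing point, here's a toy illustration (my own assumed numbers, just to make the mechanism concrete) of why a buffer swap mid-scanout on a fixed-refresh panel leaves a visible tear line, which is precisely what a refresh-when-ready scheme like G-SYNC removes:

```python
REFRESH_MS = 1000 / 60  # fixed 60 Hz scanout: the panel redraws top to bottom

def tear_row(swap_ms, height=1080):
    """If the GPU swaps buffers without waiting for V-sync, the panel is
    partway through its top-to-bottom scanout, and everything below the
    current row comes from the new frame: a visible tear line."""
    phase = (swap_ms % REFRESH_MS) / REFRESH_MS  # fraction of the scanout done
    return round(phase * height)

# A frame swapped in 20 ms after scanout began lands ~3.3 ms into the
# second refresh, tearing about 20% of the way down a 1080p screen:
print(tear_row(20))  # -> 216
```

With a variable refresh, the panel just waits those extra 3.3 ms and starts a fresh scanout with the complete frame, so there is no row where old and new frames meet.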
Stupid from the get-go. PCs are not meant to battle consoles; only ignorant people compare a closed system with an evolving one... stupid. If anyone wants to rant, well, pit Nvidia GPUs against AMD GPUs, or PC vs Mac. Consoles are closed, never-evolving ecosystems that get optimized to the teeth and then replaced entirely, and they don't cost an arm and a leg every 6 to 12 months... Stupid article.
I'm sure you're trying to say something, but I'm also sure people stop reading from the moment they read the first or second time you say the word "stupid". I only got to "ignorant people".
Yeah, guess I was too harsh on the stupidity. My bad.
What the heck kind of article is this? OBSOLETE? This is just next-gen V-sync.
This generally doesn't matter, as V-sync generally takes care of any issues, and I've never noticed any input lag while using it. The only thing that I would suggest for Sony and Microsoft is to require developers to use V-sync. Nothing ruins a game experience more for me than seeing screen tearing. It was really prominent last generation, especially with games using the Unreal Engine. In some of them it was so horrible that it made the games unplayable. Some people see screen tearing, some people claim they don't. It is a problem that needs to go away.

Hopefully, with the start of the new generation, most developers will enable it by default. I don't think any of them want to see screen tearing in any of their games. But some of them use it as a fallback, instead of putting more effort into getting a better frame rate. It's why I think that Sony and Microsoft should mandate it in all of the titles released on their systems. It's been a bullshit copout for far too long in gaming.

As someone else here said, this technology probably will not take effect completely in the industry until a generation after the new machines. It will require Nvidia to work out deals with every TV manufacturer in the world to make sure this is built into the hardware. That will take quite a bit of time, and TV manufacturers will have to redesign their circuitry to accommodate it; not something I'm sure they readily want to do just to please gamers. Monitor manufacturers, on the other hand, know that their monitors are used for gaming and will be far more willing to justify the redesigns.

So, as someone else here said... nothing to see here, move along. This will not matter for this generation of gaming, and may not matter until the next (after Xbox One and PS4).
Speaking of unplayable, Rage on the PS3 was practically unplayable for me. Never before has a game's screen tearing and object/texture popping annoyed me so much. I just couldn't stand it! So I sold that version and used the money to buy the PC version, and never had the problem again. I'm not normally a PC kind of guy when it comes to games, but I seriously had no other option if I wanted to play the game and actually enjoy it.
The biggest problem with games is the jaggies that are all over the place. Instead of thinking they can change gaming and that the PS4/Xbox One are obsolete because of some bollocks tech they just invented called G-SYNC, they should work on how to get rid of the damn jaggies. I hardly see any screen tearing when I play a game, if any at all!
Please don't tell me someone actually bought the Shield hardware, rofl. That's about as portable as my TV. I do find, however, that most PC fanboys own Nvidia cards. Btw, Titan just got spanked.
PC fanboys do tend to be nvidia fanboys. They're the ones disagreeing if you praise AMD for any reason.
You're crazy; most people go with whatever's in their budget. I know more PC gamers with AMD cards, simply because they are usually cheaper.
You obviously have never played on PC if you're making that kind of statement. PC "fanboys" go for the company that provides the best bang for their buck, and will switch between Nvidia and AMD. Just look at the Steam Hardware Survey. The survey clearly shows only a minor lead for Nvidia, which is hardly the picture you are trying to paint. http://store.steampowered.c...
This new technology is going to be extremely expensive for the average person to even consider buying until it becomes a standard thing. Until today's high-end GPUs are the "standard" PC 4-5 years from now, it's going to be something people want but can't afford.
It's a monitor, bro. It won't be expensive at all.
Well yeah, it's in the monitor, but I was referring more to my inference that only soon-to-be-released GPUs would be able to sync to the monitor (which I should have stated previously, my bad). If it's done with software rather than hardware, then any (Nvidia) GPU can be used to sync to the monitor, which is awesome. My only question then is: when is Nvidia going to partner with different television makers like Vizio or LG, or are they going to make their own proprietary monitors? ...Very interesting stuff.
They are partnering up with different manufacturers, not making their own. Also, it works with the 600 series cards and onward.
You seem to forget that new series of GPUs are released at multiple price points, and thus, aren't necessarily expensive.
Screen tearing? Games hardly have any screen tearing, and it doesn't bother anyone. How about you create something to get rid of the jaggies, which are all over the place every time we play a game, you smartasses? Thinking you can change gaming...
Full-screen anti-aliasing (FSAA), which has been around for a while, fixes jaggies. Plus, jaggies are far more common in console games than PC games.
I think I've said this before, but a lot of consumers (generally not the people browsing N4G) buy based on brand. Sony PlayStation whatever? Hey! That sounds cool. I've heard of that brand. PC with NVIDIA G-SYNC to synchronize the monitor's refresh rate with the GPU's rendering rate? Sounds complicated. What's NVIDIA? There's essentially no point where consoles being obsolete compared to their PC gaming rig counterparts will stop consumers from buying them.
I don't understand this site; I posted a perfectly reasonable comment. You guys think I'm a fanboy or something? It's clear from my comment that I am not.
It's an amazing new technology, but it will cost entirely too much for a relatively overlooked problem. It's something nice to have, but I won't be buying a brand new, crazy expensive monitor just for it. Sorry... call me when the price is reasonable.
The Nvidia CEO said it will only be a small price increase for a monitor that has G-SYNC installed... you didn't read the press release?
As of right now it's a little more than a small price increase imo. http://www.maximumpc.com/as...
Well if someone is going from a $99 monitor to a $399 monitor, because of this, that's a big increase. But to be fair, this might just be at launch. HOPEFULLY they extend this to EVERY monitor and HDTV maker and it becomes standard. It's really amazing tech, but it needs to be standardized. Somehow I doubt my 40" Dynex HDTV will be compatible.
$180 just for the tech to be added? Somebody could buy a fully capable monitor for that price. Much more than some small price increase, imo...
Then I apologize, mates. My bad.
G-SYNC... heh!? Wait... don't some HDTVs have post-processing also? I guess it makes sense to move away from dumb monitors, right? Let's put a microchip in eva thang. Question is: what protocol are they leveraging to make this happen? Does it interface by other means in addition to DVI & HDMI?