Check out one hour of Tomb Raider Definitive Edition on PS4. The game is confirmed to be 1080p/60fps on PS4.
60fps on PS4 confirmed at 50:35 :) There is a second question at 59:24 where the guy asks if it's "locked 30fps?" He answers that their minimum was 1080p/30fps, and he goes further and says that they were impressed by how well it was running and that it runs at 60fps on PS4. He never said the Xbone version will be the same. lol From both questions it sounds like the Xbone version will only be ~30fps.
Does this mean Tomb Raider runs at 1080p/60fps only on PS4? http://img.pandawhale.com/m...
Ps4 is the future and the future is ps4.
PC wipes the floor with both versions. PC is the future, unless you can't afford a decent one; then you're just stuck with old, outdated hardware that PC had two years ago.
^We can talk once PC gets exclusives in the league of The Last of Us, God Of War or Journey.
The frames are not locked. A well-known Gaffer wrote this: http://www.rocketchainsaw.c...

"On average:
PlayStation 4 = 60 fps
Xbox One = 30 fps

Yes, the PlayStation 4 build is, on average, twice the framerate of the Xbox One build. Both builds are rendering at native 1080p resolution, and generally look the same in graphical effects (though some minor differences may apply). Performance is a different matter. Both feature unlocked framerates, meaning framerate fluctuates between higher and lower values. The Xbox One build can technically reach around 45 fps, though this performance is generally only achieved during the most empty, simplest environments. For most of your play, and during action scenes, the Xbox One build will sit on around 30 fps. On the other hand, the PlayStation 4 build will attempt to hit 60 fps as often as possible, and does a pretty good job of doing so, but does have slight dips under 60 fps during certain scenarios.

So there it is. Both 1080p. PlayStation 4 = ~60fps average. Xbox One = ~30fps average. Take with a grain of salt if you'd like, but we've confirmed it on our end, and confirmation for the rest of the world is only a week away."

EDIT: Just saw this article is on the previous page posted to N4G: http://n4g.com/news/1442403...
So are you disagreeing that the frames are unlocked, that a well-known Gaffer wrote that article, or that I just saw it? Hurts, doesn't it. Here, this was written a year ago by the creator of FXAA: http://www.neogaf.com/forum... And the reason the differences between the X1 and PS4 are already appearing:

Xbone: 1.18 TF GPU (12 CUs) for games
Xbone: 768 shaders
Xbone: 48 texture units
Xbone: 16 ROPs
Xbone: 2 ACEs / 16 queues

PS4: 1.84 TF GPU (18 CUs) for games (+56%)
PS4: 1152 shaders (+50%)
PS4: 72 texture units (+50%)
PS4: 32 ROPs (+100%)
PS4: 8 ACEs / 64 queues (+300%)
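Those percentage gaps can be sanity-checked with a quick script. This is just a sketch using the commonly cited spec figures from the list above (note the ACE queue jump, 64 vs. 16, works out to 4×, i.e. +300%):

```python
# Rough check of the commonly cited GPU spec gaps.
# Values are (Xbox One, PS4) as listed in the comment above.
specs = {
    "GPU TFLOPS":    (1.18, 1.84),
    "Shaders":       (768, 1152),
    "Texture units": (48, 72),
    "ROPs":          (16, 32),
    "ACE queues":    (16, 64),
}

for name, (xbo, ps4) in specs.items():
    gain = (ps4 / xbo - 1) * 100          # percentage advantage of the PS4 part
    print(f"{name}: PS4 +{gain:.0f}% over Xbox One")
```

Percentages alone don't predict real-world framerate, but they line up roughly with the ~2× average fps gap the article claims.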
@gapecanpie So true, can't wait to play The Last of Us 2 on PC, the sequel to the hands down Game of the Decade according to many sources.
Wow, this is a repeating pattern. The devs are taking full advantage of both consoles, pushing them to their limits. I'm glad they cared more about achieving maximum results from both consoles instead of worrying about parity.
Well, I was already buying this game on PS4 since I missed it last year, 60fps confirmed for PS4...? Sold
Oh man, it feels great officially knowing this. Hahaha! I laugh at all the naysayers and disagreeing people who tried to tell me the PS4 wasn't capable of 1080p@60fps for this game just because they thought the tech difference wasn't as big a leap as going from PS2 --> PS3. LMAO! I told you fools the PS4 was capable of it, and it only depended on the developers whether or not they wanted to implement it in the PS4 version :-D http://n4g.com/user/blogpos...
Considering TressFX, which brings most high-end PCs to their knees, plus the extra next-gen effects, lighting and physics, all running at a native 1920x1080 at 60fps, it's a technical achievement even for high-end machines. (You can bring up your Alienware or Origin quad-SLI Titan with 64 gigs of RAM, which you'll only see in your dreams or fairy tales; we are talking mainstream here. Check the Steam surveys: most users are stuck with something like the 9800 GT, which I used to have before my old HDD was fried.) Devs who knew what they were doing weren't lying about the PS4 being high-end.
Lol, the 9800 GT is a 2008-era card and you're comparing its performance to a 2013 game. Failed trolling. The ignorance of PS4 fanboys continues to astound. Oddly enough, my $700 PC runs Tomb Raider with TressFX on at 60fps just fine. Oh, and I only paid $15 for Tomb Raider complete. It's irrelevant anyway. For all this hooting and praising, I doubt any of you clowns are actually stupid enough to pay $60 for this Tomb Raider Full Price Edition.
@gapecanpie Lol well I have a high end gaming PC, but believe it or not, I use consoles way more. In my opinion, the console experience is far more enjoyable than PC. Not to mention PC has no decent exclusives. Only good indie exclusives. Also, why is it that PC gamers have never seen games that look like Watch Dogs and The Division? Because of the next gen systems and PC gamers finally not being dumbed down by consoles. If anything, you fanboys should be happy for the next gen systems.
Wait, I could have sworn I read it was 30fps?? Edit: Found it, so PS4 is running at 60fps?? http://n4g.com/news/1438387... @Infected Lol, great minds think alike.
"There were some misquotes. Our goal was 1080p 30 FPS minimum. And as you can see here, the PS4 version is clearly running at 60."
I'm pretty sure they hadn't said anything on it because, perhaps, Microsoft tried to buy their silence, or they were avoiding the question till launch. It's pretty impressive that from the get-go the PS4 can manage 60fps compared to the X1's 45fps at most. Can't imagine what developers will be capable of once they get deeper into the systems.
Big deal. Just another multiplat looking and running better on the more powerful and efficient console.
don't celebrate just yet, save it
Save it for this you mean? http://n4g.com/news/1442403... These are fun times.
lol nothin' to see here.......
That's great and all but it sucks to not see it in 4k at 60 fps, another generation held back graphically by the consoles :/
>implying you or even a fraction of gamers have 4K monitors to begin with Typical PC elitist hyperbole lol.
So many ms console champions now championing pc....isn't that a coincidence. I guess running for your bigger brother to fight your battles is the in thing
@ThatCanadianGuy514 I have one... why should I care if others don't? They have options for their monitors. @Why o why Console champions championing PC... yeah, how about you read these and stop calling me a console champion. http://steamcommunity.com/i... http://www.lolking.net/summ...
Humankind is held back by your ignorance. http://ho9od35yvs05ejqn.zip...
@Fireseed Dual Titan GPUs can barely do 4k at 30fps. Two $1k graphics cards can't do it. Sorry, but 4k gaming isn't here yet.
Fair enough, and my apologies, but many MS console lovers are doing just what I said: if the XB isn't on top and the graphics are better on the PS, then all of a sudden PS owners shouldn't care about graphics, because if we did, we'd get PCs. It's nonsense. I prefer consoles; N4G is predominantly a console site, but some PC guys love to pop up and spew superiority. PCs have always been and probably always will be superior, we get that, but I can't play the games I've enjoyed over the years on PC because they just aren't available. Show me the PC game of the year winners over the past 8 years... With consoles, it's a fixed, more level playing field. Basically it comes down to comparing apples to apples.
"Dual Titans can barely..." Why the **** would you get dual Titans?
Did people lose their sense of humor? @Fireseed
@DarkHeroZX I have a dual GeForce GTX 780 Ti rig... trust me, it can handle it. @Irishguy95 The utility of dual GTX Titans comes from their increased floating-point precision performance. I've gotten to play around with a rig that had two of them, and holy crap was it a beautiful thing. @Why o why It's all good, dude. But I mainly come in here and make this point on occasion because it does get sickening after a while seeing absolute fanboys like Canadian and the rest of them whooping and hollering about how 1080p/60 is the only acceptable standard NOW... when it's been the standard for A LOT of PC gamers since around the time of the PS3's launch. It's the same feeling you get when you know a great deal about a subject and see someone who most likely read a Wikipedia snippet passing himself off as a genius. And I like consoles too (hence why I own the PS4 and X1), but this hubris SonyPonies have, where their middle-of-the-road specs are, as one of them puts it, "The future is PS4, and PS4 is the future," is laughably stupid. Sure it's more powerful than the X1... but to say those specs are the future... ummmm, it's adorable at best and pathetic at worst.
Verified sources close to Rocket Chainsaw have detailed performance and rendering quality of both the Xbox One and PlayStation builds of Tomb Raider: Definitive Edition. And for that we're thankful. So here it is!

On average:
PlayStation 4 = 60 fps
Xbox One = 30 fps

Yes, the PlayStation 4 build is, on average, twice the framerate of the Xbox One build. Both builds are rendering at native 1080p resolution, and generally look the same in graphical effects (though some minor differences may apply). Performance is a different matter. Both feature unlocked framerates, meaning framerate fluctuates between higher and lower values. The Xbox One build can technically reach around 45 fps, though this performance is generally only achieved during the most empty, simplest environments. For most of your play, and during action scenes, the Xbox One build will sit on around 30 fps. On the other hand, the PlayStation 4 build will attempt to hit 60 fps as often as possible, and does a pretty good job of doing so, but does have slight dips under 60 fps during certain scenarios.

So there it is. Both 1080p. PlayStation 4 = ~60fps average. Xbox One = ~30fps average. Take with a grain of salt if you'd like, but we've confirmed it on our end, and confirmation for the rest of the world is only a week away.

http://media1.giphy.com/med...
Forget it, I hate unlocked framerates. It would be far better to cap it at 30fps for a consistent frame interval. I hate uneven, juddery framerates. People screaming for 60fps is going to result in a whole bunch of stuttery games with unlocked framerates, just so they can say their game is 60fps (well, some of the time).
I think you are confusing bad frame times with inconsistent frame rate.
Agree. This is bad news. I don't want a framerate that fluctuates. Look at Need for Speed Rivals on PS4: the framerate is a ROCK SOLID 30fps (after the patch) and it plays just as smoothly as a 60fps game.
@kevnb An inconsistent framerate is a bad framerate; there is no such thing as fluctuating good framerates. Any frame dip, no matter how high the rate, will cause tearing and stuttering. That's why there is v-sync, and why Nvidia put resources into developing G-Sync. I game on PC a lot, and most games can reach 80-90fps, but when the action starts it drops to 60, 50, 40fps, and I can see the change happening and it's not pretty, so I always put v-sync on. Even then, if it's running at 60 and drops to 50-45fps I can see the stuttering. Most PC gamers have that problem, whereas console games usually don't, since if a game can't hold a steady 60fps it's probably just going to be 30fps to avoid it. There are no graphics settings for console gamers to tinker with to make the framerate as consistent as possible.
You know there's no such thing as a "locked framerate". Vblank isn't used any more (TFTs don't have accurate vblank anyhow; that's a relic from tubes), and what other games do is use an upper fps threshold, basically freeing up cycles. Neither is used here; there are plenty of cores to run the render loop at full speed. And actually, you can't see that fluctuation. KZ or BF run dynamically, close to 60, and you can't tell when they dip; there's no screen tear, nothing. NFS is just fine as it is at 30, but it also could probably run faster. We'll see the verdict eventually on DF. PCs have latency issues, and that's what you see there, solved yet again by brute-force overpowering. Also, PC fans boast they can run games at 60+ fps when in fact a TFT has a fixed (simulated) refresh rate of 60fps, which would then result in interference, yet I haven't heard anyone complaining about that.
kevnb, with all due respect, I do know what I am talking about. When your display refreshes at 60Hz, a game has to output at 30fps or 60fps or you will get judder caused by the uneven frame intervals. At 60fps a unique frame is sent to the display every 16.7ms; if there are latency spikes somewhere in the rendering pipeline and a frame misses that deadline, and the delay is pronounced enough, you will perceive it as a stutter.

Now, even though 30fps doesn't give you as much visual information (temporal resolution) as 60fps, it isn't too bad as long as it sticks close to that 30fps. That means each frame is rendered in 33.3ms. It might not be as smooth and responsive as 60fps, but it is still even and consistent. With double buffering the framerate has to be 30fps or 60fps; there is no in between. This is why games like AC4 on PC run at 30fps with v-sync engaged if you can't maintain over 60fps. With triple buffering you can get framerates between 30 and 60fps, but while you won't have to deal with screen tearing, you will still have to deal with judder, because the frame intervals are all over the place.

I hate judder, and it's one of the reasons I game on PC (because I can always buy more powerful hardware to guarantee I stay above 60fps). It's also the whole reason Nvidia and AMD are developing things like G-Sync and the so-called "FreeSync". If judder due to uneven frame intervals didn't matter, there would be no purpose for technologies like G-Sync. This unlocked-framerate trend on consoles is not a good thing at all.
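The interval math above can be sketched with a quick simulation. This is a deliberately simplified model (my assumptions: v-sync on, double buffering, one frame in flight, 60Hz display, a game rendering at a steady ~45fps):

```python
# Why ~45 fps on a 60 Hz v-synced display judders: each frame takes
# ~22.2 ms to render, but can only be shown at the next 16.7 ms vblank,
# so displayed-frame intervals fall into a 16.7 / 16.7 / 33.3 ms pattern.
REFRESH_US = 16_667   # 60 Hz vblank interval, in microseconds
RENDER_US = 22_222    # ~45 fps render time per frame, in microseconds

ready, shown = 0, []
for _ in range(8):
    ready += RENDER_US                              # frame finishes rendering
    vblank = -(-ready // REFRESH_US) * REFRESH_US   # next vblank at or after that moment
    shown.append(vblank)

intervals = [b - a for a, b in zip(shown, shown[1:])]
print([round(i / 1000, 1) for i in intervals])
# -> [16.7, 16.7, 33.3, 16.7, 16.7, 33.3, 16.7]
```

Even though the GPU is producing a frame every 22.2ms like clockwork, the viewer sees some frames held for one refresh and others for two, which is exactly the uneven cadence G-Sync-style variable refresh was invented to remove.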
So many disagrees and yet not a one of those people can prove that what I am saying is untrue, or is willing to argue that an uneven framerate is superior to a steady, smooth framerate. Again, if variable framerates and the resulting judder weren't a problem then companies like Nvidia wouldn't be trying to come up with technologies like G-Sync to deal with it. I watched the entire video now and it is easy to see the judder. I can also see that the devs weren't bullshitting about the graphical improvements. I easily noticed the improved particle effects, fuller foliage with more animation, richer physics on certain objects, etc. And all of that is great. I just think it would have been a much better experience had they capped it at 30fps. It would have given the game smoother motion from a visual perspective and a more consistent controller response.
Ok, I play games on PC and I usually don't use v-sync. Framerate will always go up or down depending on what's happening; that's why video card reviews give the average framerate. To have a good experience, the minimum framerate is what's important. So if you have a minimum framerate of 50, then it will look smooth, but if the minimum framerate is low with a high max framerate, it will be noticeable. "A rate of 100 Hz is comfortable at almost any size. However, this does not apply to LCD monitors. The closest equivalent to a refresh rate on an LCD monitor is its frame rate, which is often locked at 60 frame/s. But this is rarely a problem, because the only part of an LCD monitor that could produce CRT-like flicker—its backlight—typically operates at around 200 Hz." http://en.wikipedia.org/wik...
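The average-vs-minimum point above is easy to demonstrate. The frame times below are made up for the example; the idea is that one bad hitch barely moves the average but tanks the minimum, which is what you actually feel:

```python
# An average framerate can look healthy while the minimum dips badly.
# Hypothetical frame-time log: nine smooth 10 ms frames and one 50 ms hitch.
frame_times_ms = [10, 10, 10, 10, 10, 10, 10, 10, 10, 50]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))  # fps from mean frame time
min_fps = 1000 / max(frame_times_ms)                          # fps at the worst frame

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
# -> average: 71 fps, minimum: 20 fps
```

A benchmark that reports only the 71fps average hides the 20fps moment entirely, which is why minimum (or 1% low) figures are the better smoothness metric.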
My reaction to this news: http://img.gawkerassets.com...
If it's 30fps on Xbox One, I wonder how many people will keep saying the developers chose to make it that framerate, and it has nothing to do with the hardware.
INCOMING ASS STORM!!!!
From the PS blog: http://blog.us.playstation.... Someone asked and Clements replied:

"RecklessOnion_ on January 6th, 2014 at 12:02 pm said: Buttery framerate? Does that mean 60FPS??

Ryan Clements on January 6th, 2014 at 12:08 pm said: No specifics yet on that! But I can assure you it was smooth. ;)"

Interesting. If it is 60fps at 1080p, I will buy it day one.
Very weird, every other interview says 30fps.
I'M CRYING!!! Tears of joy
But the Xbox One has better achievements and voice commands?
Maybe if you flap your arms for Kinect, the framerate will increase. Kinect and Cloud FTW!!!!
Me: "Xbox One, increase frame rate on game"
Xb1: "..."
Me: "Xbox One, use the power of the cloud"
Xb1: "..."
Me: "Xbox One, become more powerful"
Xb1: "..."
Me: "Xbox One, I wish you were a PS4"
Front door: "knock knock"
Me: *opens door*
PS4: "Hi, I will be your new friend"
Should have left it after the "more powerful" line; I was finding it funny till the other bits.
Voice commands are in both. PS4 supports the touchpad, the speaker in the controller, Sixaxis motion (the parachute) and the light bar.
So 1080p. Such framerate. Much awesome.
I suppose the PS4 version is the definitive "definitive" version then.
It won't match the Xbox One version. Just like Battlefield 4 on the PS4 looks dull and doesn't have a lot of detail, while on the X1 Battlefield 4 has the vibrant color and a lot more detail. Even EA said the X1 was the better version. Tomb Raider will be no different, no doubt. The PS4 just can't match the X1.
You really have to stop leaving the /s off your comments, or people may take what you say as serious lol.
After Microsoft paid YouTubers to say kind words about the Xbox One, and EA has done the same thing, plus EA is in bed with Microsoft over Titanfall and the BF4 DLC, I wouldn't trust anything EA says about the X1.
NBA 2K14, Forza 5, FIFA 14. Crytek said they could have made Ryse 1080p easily but chose not to, and the head of Naughty Dog said on GameTrailers that "they could hit that 1080p 60fps target in a week." The devs of these games could hit 1080p/60fps; how come the others can't? (Genuine question, if anyone knows.) This attached image came into my head:
Any developer can hit 1080p 60fps. Is the quality going to be as good as a 1080p 30fps game? No. When you're given TWICE the amount of time to render each frame, you can devote more resources to other things such as effects, lighting, physics, AI and so on. If you want this entire generation to be 60fps, then be prepared to be disappointed when you don't see the games improving in those areas.
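The trade-off described above is just the frame-time budget: targeting 30fps gives the renderer twice as long per frame as 60fps does, and that surplus is what pays for the extra effects, lighting, physics and AI. A trivial sketch of the arithmetic:

```python
# Frame-time budget at the two common console framerate targets.
budgets = {fps: 1000 / fps for fps in (30, 60)}  # milliseconds per frame

for fps, ms in budgets.items():
    print(f"{fps} fps -> {ms:.1f} ms per frame")
# -> 30 fps -> 33.3 ms per frame
# -> 60 fps -> 16.7 ms per frame
```

Every system in the engine has to fit inside that per-frame budget, which is why a 60fps game has to cut somewhere that a 30fps game doesn't.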
I'd take the 1080p over 60fps any day. I play on my Xbox more than my PS4, so at least I'll have a new game to play on it. But even Infamous: Second Son is 30fps. I found an article stating that MS doesn't force devs to hit that target, but I think the devs should want to hit it. If anything, I don't blame the hardware, I blame the devs.
@quaneylfc Well, the hardware is also to blame. If a game runs at 1080p @ 60fps on PS4 and it can't on XBO, then developers have no choice but to drop the resolution down to 900p or 720p to hit 60fps, or drop the graphics quality down a level and a half (High on PS4, Medium on XBO) to get the same results. Either way it's going to be a noticeable difference, and the hardware is to blame for that, not the developers. IMO it's best just to drop the graphics quality down to medium and keep resolution and fps up.
@quaneylfc Hitting 1080p @ 60fps depends on how demanding the game is, how well the game/engine is coded, and how powerful the hardware is.

Hardware is the simplest: either you have the power to hit 1080p @ 60fps or you don't. In this case the PS4 and XBO both offer GPUs that can do gaming at that resolution and framerate. However, the PS4 has a more powerful GPU and significantly better RAM for gaming/rendering graphics.

How well the game/engine are coded and optimized for release depends on the developer. Multiplatform games usually aren't optimized to the same degree as an exclusive, simply because they generally have at least 2 (often 3, 4, or currently 6) platforms to launch on, so poorly optimized versions don't run properly; for example, BF4 multiplayer early on. These consoles have the benefit of being very similar to PCs, so it should have been easy to port Tomb Raider over and focus on coding and optimization. The problem is a combination of hardware and Tomb Raider being an open game with high-res textures and demanding graphics that the XBO GPU can't handle at those levels.

NBA 2K, Forza 5, and FIFA 14 are all good-looking games, but Forza is a racing game from point A to B with a lot of last-gen rendering, and sports games are set in a small box compared to more open games like Tomb Raider. The PS4 has stronger hardware, so it can maintain 1080p @ 60fps, and the XBO could as well if the graphics settings were lowered. You might think FIFA is demanding simply because of all the characters on screen, but the crowds are generally 2D pictures or such low-poly models that they don't really take away from the game. However, think of EVERY single tree in Tomb Raider as a character on the field as well: the lighting has to cast real shadows off those trees, the animals running around need AI, the wind blowing needs physics, rocks need tessellation, etc. There's a lot more that needs to be rendered in an action game than in a sports or racing game.
The PS4 uses an under-clocked 7870/R9 270 hybrid; those are still solid mid-range GPUs that can handle most current PC games (aka these consoles' graphics settings) at 1080p @ 40-50fps. Look at this chart; the blue is the PS4's GPU, and the orange is the XBO's: http://www.anandtech.com/be... Look at Crysis 3: as you can see, it can hit 1080p @ 30fps on the XBO (mostly high settings), so Ryse could have been 1080p @ 30fps if it hadn't been in the launch rush. At the same time, it would have been effortless to get 1080p @ 30fps on the PS4 for launch, and with more time it could have been 1080p @ 40-45fps, like Killzone: Shadow Fall. More powerful hardware produces better fps and resolution.
Well if multiplats continue the way they have been on next gen systems (and unless publishers all of a sudden enforce parity for some strange reason it will continue) I'm guessing you'll be playing A LOT more on your PS4.
Depends on how many of my friends get which console. @ABizzel1 thanks for the info :)
What kind of shit tech are they using in that box? Something is seriously wrong. All its games are gimped to shit.
It's not "shit tech", it's an under-clocked 7790/R7 260X hybrid, which is a solid GPU, just not ground-breaking.