The gaming world appears to be moving on to the 60fps standard, and we aren't sure that's such a good move. Here's why, in the 30fps vs 60fps debate, the "Magical Story-Telling" and "Cinematic Experience" will always stay with 30fps and not 60fps.
Yeah well, 60FPS will always be better for gameplay. If devs really want, drop to 30FPS for cutscenes, but please aim for 60FPS if it's an FPS or racing game
Yea Exactly, 60FPS is great for Racing and Fighting. But RPGs and story-driven games should be 30fps :)
this author is making a very bad comparison across two different mediums, and it's apples to oranges. comparing frames per second in movies and video games is NOT a direct comparison, so saying one way will work best in both is not correct. do movies have screen tearing, or anti-aliasing, or input lag, or anything involved in rendering video game code? no. movies deal in captured light, and it's reproducing the picture already shown, and it's going to reproduce it perfectly fine and perfectly smooth (assuming all the equipment is top notch, etc). video games have to deal with so much more because it's a different medium. 30fps in video games can mean more input delay, possibly more screen tearing, and other problems. frankly, going to 60fps and higher in video games just approaches the smooth quality that Hollywood already gives us. sorry, not trying to sound like a dick or anything, but this author clearly has no idea what he or she is talking about. you don't just get an idea in your head and run with it. think it over, do some research, and write a piece based on facts and logic.
He's only talking about Alan Wake - didn't that come out on PC? There you can make an easy comparison between 30 & 60FPS. I'm still thinking that games like GTA5, Killzone or InFamous are better with 60fps - this is the key to responsive and fluid gameplay.
If the game is 30FPS but is locked solid to it, no dropped frames or tears, and the control response is totally consistent, I don't really care too much. This is the key to making a good 30FPS game. The best games that tell a story, like the best movies, capture you and get you to suspend your disbelief. This is much more difficult if the technical aspects of the game get in the way. Quality motion blur at 30FPS helps no end as well. We have some console games with a good implementation, but most don't do it or don't get it right. Alan Wake has a nice implementation. The game is plainly aiming at being cinematic. This has been a pretty costly effect for existing consoles to manage. Again, Alan Wake manages it, but that game is very, very low resolution on 360. I would expect the new consoles to be able to afford it. It'll help a lot. There just hasn't been enough power to do high resolution, high quality motion blur and high quality AA together on a console. The newer consoles will be much better at that.
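The "locked solid" behaviour described above comes down to finishing all work inside a fixed budget and presenting frames on a steady cadence. Here is a minimal sketch of a frame limiter, not any engine's actual loop; `run`, `TARGET_FPS` and the omitted update/render stand-ins are made up for illustration:

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms budget per frame

def run(num_frames):
    """Run num_frames iterations locked to TARGET_FPS; return per-frame durations."""
    durations = []
    next_deadline = time.perf_counter() + FRAME_TIME
    for _ in range(num_frames):
        start = time.perf_counter()
        # ... read input, update simulation, render (omitted) ...
        # Sleep off the remainder of the budget so pacing stays even
        # even when the work finishes early.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        next_deadline += FRAME_TIME
        durations.append(time.perf_counter() - start)
    return durations
```

The point of tracking a running deadline (rather than always sleeping a fixed amount) is that a frame which runs slightly long doesn't push every later frame off cadence.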
60fps is a very different experience for many games than 30fps; a game that moves much more smoothly has gameplay advantages. @matdillahunty -- totally agree with you. When it comes to games there are so many variables in getting a perfect-looking game with a great balance of gameplay running @ 60fps. Tomb Raider at 30fps is not the same experience as it is at 60fps; it's way better.
In short: 30FPS gives the application roughly 0.0333 seconds per frame for all the operations it has to do: read input, run logic and then draw everything. Locked 30FPS means they manage to do all of this in that fraction of a second, every frame. 60FPS, on the other hand, gives you roughly 0.0167 seconds of frame time. Since most of the frame time usually goes to visuals (those textures or vertex data can be fat), there's simply more time to draw at 30FPS. The other side of the coin is that input is usually one of the first things read during a frame, and drawing is among the last, which makes the frame time calculated above pretty much the delay on your input. That delay is obviously close to twice as big at 30FPS. Sh*t hits the fan if frames are displayed incorrectly: too fast, too slow, unstable (varied frame time), etc. Or if your PC is too slow to render 48FPS HD footage properly (twice the bandwidth, maybe?), which might often be the case. I'd take 48FPS over 24 anytime. But it's still within the 60-updates-a-second range that supposedly limits the brain (and which some claim diminishes multitasking capabilities).
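The per-frame budget numbers above are just 1/fps. A quick sketch to put them side by side; `frame_budget_ms` is a made-up helper, and the one-frame latency figure is the deliberately simplified model from the comment (input read at frame start, result shown when the frame is presented):

```python
def frame_budget_ms(fps):
    """Time available per frame at a given framerate, in milliseconds."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    budget = frame_budget_ms(fps)
    # Simplified model: input sampled near the start of a frame, result
    # visible when that frame is presented, so baseline input latency is
    # on the order of one frame time.
    print(f"{fps:>3} fps -> {budget:5.1f} ms per frame "
          f"(~{budget:.1f} ms baseline input latency)")
```

Real input latency is larger (display lag, buffering, driver queues add on top), but the one-frame baseline is why 30FPS roughly doubles the floor compared to 60FPS.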
@mattdillahunty "going to 60fps and higher in video games just approaches the smooth quality that Hollywood already gives us." Hate to burst your bubble man, but the standard for Hollywood films is 24fps.
@frostypants hate to burst YOUR bubble, but i already knew that and you completely missed my point. even though movies are only 24fps, they're a lot smoother and aren't subject to things like screen tearing, framerate drops, etc. ie things that happen in video games when a bunch of frames have to be rendered one by one. so even if video games have higher frame rates, movies tend to be smoother because of what they're displaying and the method used for recording and displaying it. hence why i said the higher the fps goes in video games, the closer it gets to movies in the QUALITY of the picture. not the frames.
that IS what the article is saying :P That being said, semi-action games like RYSE should also be 30fps imo... too much smoothness ruins the Heat of the Moment feel.
RYSE is all about timing. It would play pretty poorly at 30fps.
Idk why you think smoothness is ever a bad thing. A game can be crisp and clear, visually, exactly as the developer intends, without having to sacrifice frame rate. The beauty of having more frames is that it lets you make sure the things you want seen are seen. You can clean things up or muddy them. If devs want something to be blurry, choppy or inconsistent, well then, they can do that at 60fps too. We have the tech.
But...Ryse IS 30fps...(if you are lucky).
Could it be that we are just not used to it? Using the movie example (the Hobbit is a bad example anyway, as it already looks weird in the CGI-intensive moments), we are not used to more frames per second. That doesn't mean the lower frame rate is better for the reason you gave (a weak link, in my opinion). I assume 60fps looked weird to start with when we first played games at it, but it looks normal now. edit: to me, 30fps and 60fps are suited to different genres and game modes. Not because it affects storytelling, but because it can be a waste of limited resources (i.e. they could be used to do other stuff without really sacrificing quality)
There is a reason they chose 24fps for movies. Under 24fps the human eye perceives the images differently (as a sort of slide show). Just over 23fps, the eye perceives the events as "cinematic" and experiences the action in front of it as more "epic" or "meaningful" than at 30 or 60fps. This was a very conscious choice. For games, however, developers are trying to keep the artistic direction coherent. This is why they choose 30fps for cut-scenes AND gameplay: it allows better gameplay while the cinematic action also looks "gamey". Nowadays, I would like to see what games really look like with cut-scenes at 24fps and gameplay at a full 60fps (for responsiveness).
Actually, they chose 24fps because it was cheaper (it used less film) than 30fps. It was the lowest they could go without sacrificing too much smoothness. There was no scientific finding about it "being more epic" behind this decision; it was purely about money.
It has nothing to do with how appealing it is. Film stock is extremely expensive, so they lowered the count to 24 so they could produce the clearest and most fluid image without any choppiness. It just stuck that way and has become standard, so that's why we see it as cinematic. Anything above just seems unnatural now and is not appealing. I saw the Hobbit in 48fps and it was not appealing in the least bit. I'm a film student.
I rest my case and admit I was (under) wrong (impression)! Thanks for correcting me. Cheers! :)
FYI, 60fps (and 50fps PAL) was there long before we actually had to drop to 30fps. Interestingly, "ancient" consoles were not fast enough to multitask: everything was linked to the VBlank (which is nowadays simulated on LCDs), a hardware limit of the tubes used back then. Anything not in sync with the beam would tear, which made those games pretty much unplayable. LCDs have no raster beam and thus no physical (hard) refresh rate, more like a "pixel refresh frequency" (usually given in ms). The current gen is a weird thing: it actually dropped the standard frequency to 30fps, but it also, for the first time, made cinematic games possible. The next gen will be interesting, because it seems we have enough bandwidth to consider 60fps the way it was originally intended. But at the same time, I wonder whether the assumption is true that the lower frequency is better for cinematic experiences. I also believe it is necessary for a better, more lifelike animation system - 60fps would just "skip" frames. 60fps is awesome for fast-paced action games, but not for movie-like animations.
No, just no. You are saying that a slower rate allows your brain to fill in the gaps and that this is somehow better. Well, how about 20fps so your brain can make it even better? Or 10fps? Or how about I hold up some pictures for you? If the lower frame rate gives YOU the magical feeling, then that is simply down to the associations YOUR brain makes. If you had grown up watching movies at 60fps and someone showed you one at 30fps, your reaction would be "What the hell is this crap?". People always resist change. I want to feel immersed in my games/movies and I don't want to be distracted by 30fps. You sound like the people 10 years ago who said HD wasn't needed. Or the people who say that the imperfections and crackles of vinyl records are what make them better. These people are/were wrong. With higher resolution you need higher frame rates: 30fps was OK in SD on a CRT, but it's very noticeable at 1080p. I expect to hear your argument more and more as PS4 games are released at 60fps and XB1 games at 30fps, but that will just be the usual fanboys. 60fps is the future.
You clearly did not read the article. Anything below 24FPS won't register as seamless video. Why do you think every single movie in Hollywood is shot at 24fps? Because it has the "Magic", that's why. If you think the ENTIRE production industry is plain wrong and you are right, then that's up to you.
Well, aren't you lucky that your eyes don't work as well as mine. Because 24fps doesn't register as seamless video to me; if there is any camera movement, it gives me one hell of a headache. Why is every movie filmed at 24fps? Money is most likely the answer. 1. 24fps was the minimum they could get away with, so they did. 48 would have been better, but twice as expensive for the makers, distributors and cinemas. 2. Because every movie maker wants the maximum profit, they film their movies for maximum distribution, so they have to film for the lowest common denominator. Many cinemas wouldn't be able to show a movie at 48fps. 3. DVDs, which are still the biggest sellers, wouldn't be large enough to store a 48fps movie in a format that DVD players could show. At every stage, 24fps makes things cheaper. NOT BETTER.
The thing is, like HD or even 4K, the technology is there to shoot movies at whatever frequency you want, and yet we are stuck with 24fps. And they tried (see the 48fps experiment), but it doesn't seem to work, otherwise we would already have it. In the same way, we use post-processing filters over the image: technically you could shoot ultra-sharp images at whatever frequency with no motion blur whatsoever, and yet it's not happening, and it sure isn't what we would enjoy.
This is all just brainwashing, trying to make it seem okay that most "next gen" games need to conform to 30fps.
A slower rate allows for that horsepower to be spent on better detail and effects. You cannot increase the framerate without giving up detail. I do find it stunning how many video game enthusiasts don't understand this.
Game and film production, and how motion is conveyed in each, are two completely different things. Stop comparing the two. 24 frames of film is 24 frames of real-time motion captured within one second, which correlates to something above 40 frames in 3D animation.
It doesn't matter whether it's 60 or 30 as long as the FPS is stable and steady.
Am I the only one here who thought the Hobbit in 48fps was freaking amazing??
Camera pans look superb, for starters. 24fps puts severe restrictions on how fast camera moves while maintaining detail.
i only know what my eyes show me, and they definitely prefer more fps than less.. in any type of game.
There is no valid reason to ever say 30fps is better than 60. Once you're used to a smooth 60fps, the drop to 30 is noticeable in any genre. Movies and games are different mediums.
And I'm used to smooth looking movies! I too was complaining about the "soap opera effect" in the beginning. But once I got used to motion interpolated video provided by my LG (even with the artefacts on specific cases), I now have a hard time watching movies at the native 24fps.
I'm in the same situation as you. I bought a Philips TV with a motion-interpolated mode and tried it with Avatar just to see how it looked, and now if I turn it off, everything feels a little choppy. DVDs are the ones that benefit the most for me, since the little artifacts are not as visible because of the lower resolution. Since nowadays everything is shot on digital cameras, and film stock prices are no longer a factor in choosing a lower framerate, I wish they would shoot everything at a higher framerate like they did with The Hobbit.
The majority of console gamers don't realize how awesome 60fps is. I don't think I could name one genre that wouldn't benefit from 60fps. Most console-only gamers' experience with 60fps is limited to Call of Duty.
Please enlighten me as to how the fps affects the script. Oh wait, it doesn't. A crappy story will be just as bad at 30fps as it is at 60fps.
This guy hasn't a clue. The reason 24FPS works on film is because you are capturing it in real time via lenses and light exposure on the film, which creates a natural motion blur that mimics how our own eyes perceive motion and creates a somewhat dynamic effect. It appears "smooth enough", and only just smooth enough for our eyes to perceive it as fluid motion. Games are not filmed. They are rendered frame by frame to the screen without lenses or natural motion blur, so 24FPS looks very stuttered and does not produce a perceived sense of fluidity in the slightest. 48FPS film is becoming the standard in Hollywood now for heavily 3D-animation-laden films, because faking motion blur at 24FPS for CGI still looks too fake and stands out obviously when juxtaposed with real-life actors. Higher FPS is better in general; in time, you will adjust to it. In games, it's VERY important, particularly in fast-paced, skill-based games. The higher the FPS, the better. Our eyes can perceive above 240Hz in some cases.
But that is exactly what this is. Higher frame rates need to simulate the "post-processing" an analog lens achieves naturally, and obviously those artificial filters are not good enough (yet) to achieve the same result. This applies to video games in the same way: 30fps games simply spend more resources on those filters, while 60fps games don't necessarily require heavy post-processing - fast gameplay and low lag are more important than motion blur and sophisticated lighting and animation. And maybe it isn't relevant whether it's 60 or 30fps; if machines were fast enough, the same filter could be applied at whatever frequency you want with the same end result. But at the same time, you have twice the time for those calculations at 30fps - and nothing in between gets used, because of synchronization requirements.
For consoles you mean, because in PC 60 FPS is the standard, while 120 FPS is what some crazy dudes are using lol
I'm getting a little bit tired of "on PC 60FPS" is standard. On PC a dynamic frame rate is standard. You get results all over the place from sub 30 to above 60. Everything on the PC is fully dynamic due to the fact the HW equipment is so "fragmented".
yea, but what he is trying to say is that PC gamers will always aim to make their games run at 60fps, even if it means lowering the settings
Fixed: dynamic framerates, if you don't know how to use vsync.
Well call me crazy but I personally LOVE gaming at 120 fps.
lol cool, I heard it was indeed good, will give it a try when I can... PS: I never said crazy was bad, just a little out of the standard ;)
60 FPS should be a staple for all next-gen games, it only enhances the experience in every possible way.
"in Cinema the Smokes and Mirrors fall away with increased fps" - then we just need more smoke and more mirrors. The higher fps in the Hobbit was nice, but you must understand that, yes, Middle-earth is a magical land; we want that magic preserved, so why taint it with realism? I think a film like the Hobbit was a poor example of what a higher fps can bring to media. Give me a higher fps in Kill Bill, give me a higher fps in movies like Life of Pi. The editing, the effects, the sets - the magic needs to be turned up to 11, and then that higher fps will be much more welcome in film. As far as gaming goes, as we get better at handling graphics and focus on story as a whole, there is no reason a good story-driven game couldn't run at 60fps. 30 works, but 60 could feel and look much better.
I'd say the more real the image looks, the more immersed one can be in this fantasy world.
This doesn't make sense. 60fps will always be better for any game, cinematic or not.
Is this a joke?
When you are controlling the action the higher the frame-rate the better.
This is silly. I'm usually very open-minded, but really, this is just silly. ** Also, movies and video games are two totally different mediums, but whatever... ** Frames per second (fps), in terms of functionality, is purely a matter of how smooth your experience will be; there are no other factors. As such, a higher fps will always be better than a lower fps. Fact. I cannot grasp how you can believe otherwise. This 'movie magic' that the author keeps going on about is nonsense. Am I to believe that, due to the lower fps, the experience is suddenly more 'magical', because it basically gives your brain more time to think? That's ridiculous. A higher fps does not mean that time itself is being sped up. A scene with a higher fps will last just as long as a scene with lower fps, it's just a matter of how visually smooth you want it to be. Or, if it's a matter of someone just not being used to a smoother visual experience, then no big deal. It's not like we and our senses are set in stone. If they give it a chance, they can and will get used to it. People are just afraid of change, because it relates to our inherent fear of the unknown. In summary, a higher fps > a lower fps.
I think 60fps should only be for fast-paced games, it looks weird in slower games especially without good motion-blur effects, then it just looks horrible to me as everything moves in a very unnatural way.
30fps is NEVER better than 60fps. 60fps is just way more fluid and detailed. I guess next this guy will say 480p is better than 1080p (anything for web hits, I suppose)
but sometimes 60 FPS makes some stuff seem robotic, unnatural, fake, etc... it sure is better for gameplay, but for cutscenes, I don't know ??
No... 60 fps makes things look smooth, 30 is just chunky
If it's locked solid at 30fps, it can be. :P Every movie you have watched (except the Hobbit) was at 24fps. You liked those, didn't you? The reason we don't like 30fps games is that the framerate keeps going up and down, never rock solid at 30. That said, some genres are better at a high fps: action, racing, etc.
1. The Hobbit was not as good as TLOTR, IMHO.
2. Computer animation needs higher frame rates to TRY to mimic the real-time, FLUID movements of humans.
3. Watching the amount of CPU/GPU power needed to mimic real-life hair movement (TressFX) puts things into perspective.
4. Also, watch this video on the state-of-the-art motion capture used in Avatar and the computer power needed to do it, and you will realize that even with ALL that might, Avatar still only had about 1/3 the animation of a real human body/face. Your face alone has 43 muscles (650 in the body total), which means at least 86 connection points, which leads to even more twisting - and how many points were on the mo-cap face in Avatar? Hell, even the camera has problems processing all that info into the computer. http://www.youtube.com/watc...
You really have no clue about cinematography. You play video games, probably the same type and have no clue about production. Why would you like the movie Star Wars in the theater but want Star Wars as a video game to play at 60fps???!!! It would NO LONGER look like a movie thus ruining the experience! Again- If you love movies, then guess what, one day video games will look just like movies, you won't be able to tell the difference! And for that to happen the games will move at either 30fps or even 24fps. If this happens who the hell in their right mind will complain????!!!!!! If I turn on a game like Star Wars and I'm really thinking it's a movie but it's a video game..... that's when video games have finally reached their peak, don't you think???
No. Videogames aren't movies and shouldn't try to be movies. They won't reach their peak when/if they look exactly like a movie; they have a better chance of that when a game can be mistaken for reality. Your entire argument boils down to familiarity. Movies up to this point have been shot in 24FPS, and many new ones are using 48FPS. You'll get used to it. I've been used to 60FPS cutscenes for years, they look no less engaging to me in the slightest. They don't look like movies, but then again, they shouldn't. They're games. And 48FPS films will make that difference disappear even more. You will get used to it in time.
Solid 30fps for campaign and solid 60fps for multiplayer. I like what GG did with Killzone Shadow Fall. They balanced everything out in the campaign and focused on the graphics, gameplay and story, whereas in multiplayer they focused on gameplay with a touch of very nice graphics.
Pretty sure they said multiplayer ran at 60FPS 'sometimes'; I'd hardly call that "solid 60fps for multiplayer". If they do indeed just cap at 60FPS and allow the framerate to drop below that, it's going to feel horrible. There's nothing worse than a variable frame rate that drops from a v-synced 60FPS to, say, 45FPS or lower. It will impact gameplay dramatically.
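For what it's worth, the judder from missing a v-synced 60FPS target follows from quantisation: on a 60Hz display with plain double-buffered vsync, a frame that misses the 16.7ms deadline has to wait for the next refresh, so on-screen frame times snap to multiples of 1/60s. A small sketch of that rounding (assuming double-buffered vsync; `vsynced_frame_time` is made up for illustration):

```python
import math

REFRESH = 1.0 / 60.0  # 60 Hz display refresh interval

def vsynced_frame_time(render_time):
    """On-screen duration of a frame once vsync rounds it up to a refresh boundary."""
    return math.ceil(render_time / REFRESH) * REFRESH

# A frame rendered in 18 ms just misses the 16.7 ms deadline, so it is
# held until the next refresh and shown for a full 33.3 ms instead.
shown = vsynced_frame_time(0.018)
print(f"18 ms render -> shown for {shown * 1000:.1f} ms (~{1 / shown:.0f} fps)")
```

Averages like 45FPS under vsync typically come from a mix of 16.7ms and 33.3ms frames rather than an even 22ms cadence, which is exactly the inconsistency that feels so bad in play.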
I play games for the gameplay, not cutscenes, why should I suffer a sub par gameplay experience just so that cutscenes look more movie-esque? 24P is merely 'satisfactory' for movies not ideal, if movies were shot at 48P that would require double the bandwidth (TV) or storage space (DVD). "When you shoot video at 24 fps, you need to avoid quick pans and tilts because they may cause the image to stutter." - yep sounds ideal for games... not. You'll also notice that most 30fps games have a motion blur effect added to make frame transitions look smoother, it's not needed with 60fps.
It really amazes me when idiots on the internet post about things they have no clue about. Most of the posts on this topic really show why the people on N4G must average 16 years of age; just because you play video games does not make you an expert on cinematography. I've read every post and 90% of you are 100% incorrect. Honestly, you are so far off, but in your minds you really believe you are correct, when you know you have NO EXPERIENCE or training to even make a comment. This is the power of the internet... smh. The poster of this topic is trying to educate you guys, as I have been trying to do when it comes to this stupid 60fps myth. If you want your game to have a cinematic look and feel, it should be 30fps (closest to 24). It does not matter the type of game; 30fps will yield a more cinematic look. The same is true for movies, from Star Wars to Shrek: 24fps creates the cinematic experience. To those of you who think Hollywood does this because of budget constraints, you're showing your ignorance once more. Hollywood shoots digital now and the frame rate is still 24fps. If you have a Blu-ray player, it has the capability to play back Blu-ray movies at 24fps at 1080p resolution. It's all about a cinematic experience. As far as video games go, developers decide on the cinematography of a game while it's in production: do they want the game to look like a movie, or do they want it to look like a broadcast? I prefer the cinematic look for movie-type stories and the video look (60fps) for fighting and sports games. In the end it's an artistic choice, not a barometer that determines the power of a system. So fanboys, please stop using 60fps out of context to justify your system; it's an idiotic statement and it shows just how ignorant you are.
You go ahead and call 90% of the users here ignorant, idiot fanboys and go on to post THAT drivel. Bravo, Bravo... *slow claps*
Comparing a game to a movie is already a pretty stupid statement. A movie doesn't have all the technical workload that a game has: a movie doesn't use polygons, dynamic lighting, physics, A.I. or any of the other things that need rendering during actual gameplay. A movie is a static structure, rather than a dynamic structure where everything changes with the interaction of the player.
If you like 24fps so much, go play Dark Souls and go to Blighttown. Enjoy your "cinematic experience".
the headline is so fucked up.
This article is total crap. It's about games and 60 FPS is awesome for games. If a game runs at 30 FPS, that's cool, but 60 would be nice. To me, this sounds like an excuse for Killzone not hitting 60FPS during campaign. If it can't run at 60, it can't run at 60. Quit the "better experience BS." The game looks great (AI is suspect though), no need to make excuses.
The closer the framerate is to natural sight, the more life like and immersive the experience. Stop trying to hold back technology, you clown.
Framerate has a huge impact on immersion. 30 FPS is the absolute lowest you can ever target for a game. 60 is optimal, and anything more than 60 is just icing on the cake (if you have a display to support it).