People often say that you have bad eyesight if you can't see the difference between 30 frames and 60, but it's only said as a joke. What actually depends heavily on eyesight is the resolution, not the frame rate.
The only way you wouldn't be able to see 60 frames is if the video is interlaced (1080i) instead of progressive (1080p), because interlacing halves the frame rate. One thing to keep in mind is that the lower the frame rate, the lower the effective resolution becomes once the camera starts moving, and this introduces a new element called video "judder", where the image content is not fully translated to the screen. In games this annoyance is masked with a motion-blur post-process, but in movies it's a whole different thing.
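To put a rough number on that judder, here is a tiny back-of-the-envelope sketch. The 1920 px/s pan speed is just an illustrative figure I picked, not any standard; the point is how far the whole image jumps between consecutive frames during a camera pan, and how halving the frame rate doubles the size of that jump.

```python
# How far the image shifts per frame during a camera pan.
# The 1920 px/s pan speed is an illustrative number, not a standard.

def jump_per_frame(pan_px_per_sec, fps):
    """Pixels the whole image moves from one frame to the next."""
    return pan_px_per_sec / fps

for fps in (24, 30, 60):
    print(f"{fps} fps: {jump_per_frame(1920, fps):.0f} px jump per frame")
```

The bigger that per-frame jump, the more the motion reads as a series of discrete steps instead of smooth movement, which is exactly the judder you notice in film pans.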
If you have watched any of Nolan's Batman films, then you know how confusing the action can get when Batman is beating up the bad guys up close. Most of the time you can't even tell what movements Batman is making because of the film's low frame rate. This is where directing experience comes into play, making it easier on the viewer with smart editing and camera angles, but lots of judder will still be present.
Judder in a fully CGI film works a bit differently, because you have a bunch of artificial effects at your disposal, just as in games. But relying too much on them can make a film that combines CGI with live-action scenes look very unnatural.
But moving on: frame rate is more important for games than for movies, because it is not just a visual effect that maintains resolution and clarity during fast camera movements by displaying more visual content on the screen; it is also a way to reduce control latency. This is the very reason the yearly EVO tourney is now moving to Xbox. Believe it or not, one slight drop in frame rate can mess up high-level execution combos. Most PS3 fighting games are Vsynced to avoid screen tear when the frame rate gets a bit erratic, but their Xbox versions are not. This means the Xbox can skip frames without increasing latency the way the PS3 does because of Vsync. That may sound a bit contradictory, but it is more complex than that, and it would need an entire blog entry of its own to explain the technicality of how Vsync forces the system to finish displaying a frame before moving on to the next one, and then there's Triple Buffering too. Why most fighting game developers like Capcom decided to implement Vsync on their PS3 versions, only they know. Maybe they thought the PS3 was capable enough to handle the task while the X360 wasn't, but at the end of the day the pro players prefer performance over image quality.
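To make the Vsync latency point a bit more concrete, here is a toy sketch, assuming a 60Hz display; all the render times are made-up illustrative numbers, not measurements of any real console or game. Without Vsync a finished frame goes to the screen immediately (causing a tear), while with Vsync it has to wait for the next refresh boundary, so a frame that barely misses a boundary slips a whole extra refresh interval.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # one refresh interval on a 60 Hz display

def display_latency_no_vsync(render_ms):
    # Without Vsync the frame is scanned out as soon as it is ready,
    # which causes a tear but adds no extra waiting.
    return render_ms

def display_latency_vsync(render_ms):
    # With Vsync the finished frame waits for the next refresh boundary,
    # so a frame that misses one boundary slips a whole interval.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (10.0, 17.0, 30.0):
    print(f"render {render_ms:5.1f} ms -> "
          f"no vsync {display_latency_no_vsync(render_ms):5.1f} ms, "
          f"vsync {display_latency_vsync(render_ms):5.1f} ms")
```

Notice the 17 ms case: it barely misses the ~16.7 ms boundary, so with Vsync it waits for the boundary after that. That small slip repeated over a long combo is exactly what messes up frame-precise execution.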
Another example of a game having higher latency due to Vsync would be the Uncharted series. The first Uncharted had serious issues with screen tear, but thanks to its lack of Vsync it could skip frames, so the latency stayed consistent throughout the whole game. Then for Uncharted 2 and 3 Naughty Dog wanted to increase the visual quality, so they implemented Vsync at the expense of latency, knowing that Uncharted was less a competitive game and more about the cinematic experience. Here the use of Vsync makes sense, but if the PS3 had had better specs, a higher frame rate would have been a welcome addition because of its advantages in both visual quality and latency. With lower latency the gameplay, particularly the aiming (the platforming has very forgiving timing), would have been much more precise and snappy. So when you see Call of Duty players complaining about Uncharted's bad shooting mechanics, try to see it from their point of view: they take a high frame rate for granted without even knowing it. At the end of the day the developers are the ones who decide where to allocate resources depending on the vision they have for the game, but to me it's a good thing that lately gamers have been very vocal about wanting higher frame rates in next-gen games. If there's one good thing that can be said about Nintendo, it's their focus on gameplay over graphics; because of that their games always have a very consistent frame rate, not always 60 but never below 30, compared to many current-gen console games.
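The aiming point can be put in rough numbers too. Here is a very simplified toy model where input is read once per frame and its effect shows up on screen two frames later; real pipelines have more stages, so these are illustrative figures and not measurements of Uncharted or any other game.

```python
def input_delay_ms(fps, pipeline_frames=2):
    # Toy model: input sampled on frame N, effect visible on frame
    # N + pipeline_frames. Real engines vary; the numbers are illustrative.
    return pipeline_frames * 1000.0 / fps

print(f"30 fps: ~{input_delay_ms(30):.0f} ms from stick to screen")
print(f"60 fps: ~{input_delay_ms(60):.0f} ms from stick to screen")
```

Even in this simplified model, doubling the frame rate halves the stick-to-screen delay, which is why a 60 fps shooter feels snappier to aim in than a 30 fps one.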
There are many more things that could be discussed about this topic, and they would take a lot more text: "what exactly is a frame and how is it produced by the display, be it interlaced or progressive", "Vsync vs Triple Buffering", frame-skipping, or the 120Hz motion post-process featured on many modern LCD TVs (not a real native frame rate; the TV still only receives 60 real frames max) which adds latency by repeating or interpolating frames (not good for gaming at all, even if the motion looks artificially smoother). But this is the end of my 2 cents in the discussion. I'm not a native English speaker and my vocabulary is limited, so sorry if some stuff sounds like it was written by an elementary-school kid. bye ;)