
Why Ratchet and Clank: Rift Apart's 40fps support is a potential game-changer

A few weeks back, Insomniac patched Ratchet and Clank: Rift Apart on PlayStation 5 to introduce a revised version of its 4K30 fidelity mode. Tapping into the capabilities of 120Hz displays, what the team delivered is a potential game-changer for console titles - a 40fps mode that looked just as good as the older 30fps offering, but running considerably more smoothly and feeling better to play. On the face of it, a bonus 10fps doesn't sound like a huge bump, but in actuality, it's a very big deal.

Read Full Story >>
eurogamer.net
darthv7266d ago

Interesting. It presents a nice balance of quality and performance. I'd just need the right TV to take advantage. I hope more games will offer this option.

SeTTriP66d ago (Edited 66d ago )

This game is gorgeous on any setting on an LG C1 (finally decided to go OLED).

But the added smoothness with all the bells and whistles, let's just say I'm loving my PS5.

This console is freaking amazing.

camel_toad66d ago (Edited 66d ago )

Nice TV! I've got a 65" LG C9 that I'm using. I went from a no-HDR, sub-4K LCD TV to the OLED C9, and man, it blew me away.

Anyway, how are you liking the OLED? Coming from LCD or something a little better?

Popsicle66d ago (Edited 66d ago )

I have an LG B6 and a C8, so I missed 120Hz by one year. Also, even if I upgraded my OLED, I would need to upgrade the Denon audio/video receiver that I paid $600 for when I upgraded to Atmos in 2018, because it will only pass through 60Hz to my TV. As such, I won't be getting that smooth experience anytime soon. Grrrrrrrrrr!

sinjonezp66d ago

I have the 65" B9, and I'll tell anyone: once you go OLED, you can't go back to any other display. It is that big of a difference, and having a PS5 connected has been amazing. The only thing is that 4K is only available up to 60Hz, unlike the C series which does 4K 120Hz. It's still an incredible experience with 4K 60 on PS5, especially titles like FF7 Remake and Ratchet. I would recommend OLED to anyone; burn-in is a non-factor. I've had mine for a year and it's still as beautiful as the day I purchased it.

Yppupdam66d ago

55" CX here, what a great display from LG.

nyctophilia1365d ago

Just in case anyone was wondering about burn-in, I have a 65 inch LG OLED from 2016 with a ton of miles on it and ZERO burn-in.

Neonridr65d ago

I have the 65" C9 as well. Love LG OLEDs. Beautiful TVs. Upgraded from an older 55" E6, so I'm all aboard the OLED train.

camel_toad66d ago

I tried the 40 fps @ 120Hz mode and it was definitely better than the 30 fps by a long shot. But still, I had to go back to the 60 fps. Hard to go below it unless you just absolutely have to.

SeTTriP66d ago

I'm coming from a mid-range Samsung Q80T.

OLED is just the best TV for gaming. For movies and other things it can be a toss-up, but for gaming nothing compares.

Yppupdam66d ago (Edited 66d ago )

wrong thread, sorry

ABizzel165d ago

Yep, the benefits of this practically require a 120 Hz panel.

sinspirit66d ago

GoW on PS4 Pro was so much better with an unlocked framerate. Even being inconsistent, it felt way better than 30fps to me, when it was 30-40 most of the time and sometimes above 40. Glad devs are being more experimental and trying these things out now.

Storm2366d ago

I wonder if any devs could create a mode at 80fps? Just speaking of experimenting. They wouldn't have to drop as many of the effects or as much resolution as the 120fps mode. Just a thought.

MoonConquistador65d ago (Edited 65d ago )

Not as beneficial as you might think, so I can't see why it would be on any developer's radar.

The 40fps mode works so well because it syncs with the refresh rate of 120Hz panels: a new frame lands on every 3rd refresh.

An 80fps signal into a 120Hz panel would be unsynced with the refresh rate, as you would get a new frame every 1.5 refreshes.

The panel's refresh rate has to divide evenly by the frame rate:
120 FPS = new frame every refresh
60 FPS = new frame every 2nd refresh
40 FPS = new frame every 3rd refresh
30 FPS = new frame every 4th refresh

80 FPS doesn't fit in anywhere here.

For the same reason, the 40fps mode wouldn't provide the same benefits on a 60Hz panel. That would hit the same issue of a new frame trying to display every 1.5 refreshes.
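The divisibility rule above can be sketched in a few lines of Python. This is just an illustration of the arithmetic in the comment, not anything from the game or display hardware, and it ignores variable-refresh-rate (VRR) displays, which relax this constraint:

```python
# Frame pacing on a fixed-refresh panel: a frame rate paces evenly only
# when the panel's refresh rate divides evenly by it, so every frame is
# held on screen for a whole number of refreshes.

def refreshes_per_frame(panel_hz: float, fps: float) -> float:
    """How many panel refreshes elapse per rendered frame."""
    return panel_hz / fps

def paces_evenly(panel_hz: float, fps: float) -> bool:
    """True when every frame is held for a whole number of refreshes."""
    return refreshes_per_frame(panel_hz, fps).is_integer()

for fps in (120, 60, 40, 30, 80):
    r = refreshes_per_frame(120, fps)
    print(f"{fps:>3} fps on a 120Hz panel: new frame every {r:g} refresh(es), "
          f"{'even' if paces_evenly(120, fps) else 'uneven'} pacing")

# 40fps on a 60Hz panel hits the same problem as 80fps on 120Hz:
print(refreshes_per_frame(60, 40))  # 1.5 refreshes per frame -> judder
```

Running this shows 120, 60, 40, and 30fps all pacing evenly on 120Hz, while 80fps lands on an uneven 1.5 refreshes per frame.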

Storm2365d ago

Cool. Thanks for the info. Makes sense.

Sunny1234566d ago

This should be adopted by other games. Give the gamer options: some like more fps, some prefer higher resolution. It's the devs who should be praised for such an inclusion.

Sunny1234565d ago

Sure, I prefer PC too, but you can't have the best of both worlds. PS5 exclusives are the best games out there, and sadly I cannot play them on my PC.

aaronaton66d ago

If you haven't seen the video, basically:
The normal 4K fidelity mode had some headroom, so they could cap it at 40fps instead of 30fps. On a standard 60Hz TV the frame timing would be uneven, but on 120Hz TVs 40fps has great frame pacing, since 120 divides evenly by 40.
This seems like a great option and should be popping up a lot more with the adoption of HDMI 2.1 capable TVs.

Vanfernal66d ago

There was a lot of jargon in that video, but I just wanted to make sure: since the 40fps mode runs at 120Hz, does the resolution get capped at 1080p?

Destiny108066d ago

Digital Foundry put the output at 1080p because their capture card can't capture 120fps and 4K at the same time.

If you have a 4K 120fps capable TV, you will get a 4K image while playing Ratchet and Clank at 40fps.

StoneyYoshi66d ago

As Destiny1080 said, it's because there are no capture cards that support HDMI 2.1 yet, so they had to drop to 1080p to capture the 120Hz footage for the video.

Destiny108066d ago (Edited 66d ago )

60 fps = 16.7ms frame time
30 fps = 33.3ms frame time
40 fps = 25.0ms frame time

While 45fps may sound like the mid-point between 30fps and 60fps, in frame-time terms that is not the case: 25ms sits precisely between 16.7ms and 33.3ms.

40fps is significantly faster and smoother, and there are input lag benefits too. I tested this by pointing a 240fps camera at both the screen and the DualSense controller. I measured input lag by jumping 10 times and counting the frames between pressing the button and the character beginning the jump animation.

On average, the input lag is only 6ms slower than the 60fps performance RT mode running on a 60Hz display.

Insomniac mentions in the patch notes that the new 40fps fidelity mode is about reduced latency, but I was somewhat surprised at just how much of a win it is on my screen.

What's clear is that many games do have CPU and GPU overhead left over when a 30fps cap is in place. The game Control has some overhead, judging by the unlocked framerate in its photo mode, and is also a candidate for a 40fps mode.
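The frame-time arithmetic in this comment is easy to verify yourself. This is a quick sketch of the math only (the input-lag figures come from the poster's camera test, not from this code):

```python
# Frame *times* (ms per frame), not frame rates, are what you actually feel.
# That is why 40fps, not 45fps, is the true midpoint between 30fps and 60fps.

def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is displayed at a given frame rate."""
    return 1000.0 / fps

t30, t60 = frame_time_ms(30), frame_time_ms(60)
midpoint_ms = (t30 + t60) / 2        # 25.0 ms
midpoint_fps = 1000.0 / midpoint_ms  # 40.0 fps

print(f"30fps = {t30:.1f}ms, 60fps = {t60:.1f}ms")
print(f"frame-time midpoint = {midpoint_ms:.1f}ms, i.e. {midpoint_fps:.0f}fps")
print(f"45fps = {frame_time_ms(45):.1f}ms (closer to 60fps than to 30fps)")
```

In other words, halving the gap in frame rate (45fps) does not halve the gap in frame time; 40fps does.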
