Both next-gen consoles will run Watch Dogs Legion at 4K 30FPS
Give us a 60fps option without Ray Tracing!
Just because it runs 30fps with RT doesn't mean it automatically runs at 60 if they disable RT. Other things can be bottlenecking it to a lower framerate than just the RT effects.
Well give us the option to play at 1080p then. It is sad that even with next gen consoles, the fight for 60fps continues.
Their frame rate could be limited due to CPU usage or memory bandwidth, for example, and then lowering resolution wouldn't make a difference. With that said, just because 4k/30 is confirmed, that doesn't mean that there won't be a performance mode setting.
Cernunnos I mean these consoles are more like PCs now, so why can't we gamers be given the option to pick performance over graphics? Some may take 60FPS at 1440p. I just want gamers to have that option to pick, instead of a one-setting-fits-all kind of thing.
But world’s most powerful console with 12 TFlops can run at 60 fps. Right?
@SamTheGamer Why would you assume that? Even PC GPUs which cost more than an entire PS5 or XSX won't run every next-gen game at 4k 60fps. I think in the long run, after we're past the stage of being sold upgraded current-gen games as next-gen games, 4k/30 will be standard for most 3rd-party developers on consoles (unless it's a genre like racing or an arcade-style game). But those games will still look smoother than on the current and previous gens, where 30fps actually meant 20 to 30fps in a lot of AAA games, so it's still an improvement.
@GameZenith If you want 60 fps you can throw another $500 at a gaming PC. You're not limited to consoles.
"Even PC GPUs which cost more than an entire PS5 or XSX wont run every next-gen game at 4k 60fps." Yeah, they will. Because on PC you can choose individual settings to get the balance you prefer, and not the one the developer imposes on you. That's why people like PC gaming. Ultimate control, Ultimate choice.
Right. I also like the pretentious demands, as if Ubisoft will agree.
While that can be true, benchmarks for the RTX 3080 show a 40-50% decrease in fps at 4k with ray tracing turned on. With ray tracing and upscaling on, it's more like a 25-35% decrease, but we don't know how good the upscaling is on consoles. At 4k you're less likely to have a CPU bottleneck, and this early into the consoles' life I can't imagine they would be bottlenecked there - that would mean Sony and Microsoft both dropped the ball on their designs. Turning ray tracing off might not get you all the way to 60, but it definitely would make a big difference. Especially since many of the ray tracing implementations I've seen don't add enough to a game to warrant the performance hit.
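To put rough numbers on that argument: a back-of-the-envelope sketch (the percentages are illustrative assumptions based on typical RTX 3080 4K benchmarks quoted above, not console measurements) of what a ~45% RT frame-rate hit implies when you disable it:

```python
# Rough frame-rate arithmetic for the RT-cost argument above.
# The 45% figure is an assumption, not a console measurement.

def fps_without_rt(fps_with_rt, rt_fps_hit):
    """Given the fps measured with RT on and the fractional fps loss
    RT causes, estimate the fps with RT disabled."""
    return fps_with_rt / (1.0 - rt_fps_hit)

# A 30fps cap with RT on, assuming RT costs ~45% of the frame rate:
print(round(fps_without_rt(30, 0.45), 1))  # ~54.5 fps
```

Which matches the point: under these assumed numbers, disabling RT alone gets you most of the way to 60, but not necessarily all of it.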
"Other things can be bottlenecking it to a lower framerate..." Yes. The resolution. Put it in 1440p or 1080p and turn off ray tracing, and 60fps is more than possible. There will be a performance mode for this game, and if not, I hope gamers will complain. It's time that console gamers stop accepting "cinematic framerates".
It's a last gen game though.
I will be playing 4k/60 with ray tracing on my 3080 :)
I would hope so seeing as the card alone is $700.00. Here, have a sticker!
@Sprucegoose77 The one I got, Aorus Master, was $850. Thanks for the sticker, I will put it on my pc case!
Don't count on it. It's Ubisoft; their open world games are terribly optimised on PC.
"I thought this was a gaming enthusiast website, not a salty sore loser site..." Acting as if one side is losing because they prefer consoles is part of your misconception.
Personally, I think 1440p (QuadHD) is the sweetspot visual/performance wise when playing on a monitor, but hey, If your setup can run 4k60, enjoy! ;-)
If you hold your breath and wait for someone to care, you won’t be playing Watchdogs at all.🙂
@torreth1 Why would I hold my breath? Sounds like something a Switch owner would do ;)
@sagapo I have my pc connected to a 50" Samsung 4k tv with HDR...pc monitors are too small for me.
And sadly, like with most P.C. ports, it **should** look multiple generations better --- but it won't. Beyond boosting resolution and/or fps, the P.C. graphical differences are often so minuscule that an average (non-gamer) would be hard pressed to say something looks much better on P.C. (maybe just different). Certainly, the average person would not see a fraction of the difference that the actual power gap SHOULD show.

I am so annoyed that, beyond boosted framerates (or the waste/joke that is "8K"), there will be VERY, VERY little visual difference between what is playing on my P.C. and the consoles. Heavens, Horizon & GOW3 on what is frankly a *WEAK* PS4 Pro still look nearly as good as anything running on P.C. (with the exception of 60+ fps vs. 30). Sorry, but when my P.C. has nearly **7x!!** the TFLOP performance, some extra FPS and effects that my wife/kids/friends can barely see, or only notice if I try really hard to point them out, is COMPLETELY RIDICULOUS. For this reason, I have become disillusioned with the glitz of the P.C. cutting edge.

Further, this is with "last gen" games. We are already at such a quality level that as visual fidelity increases, so do the diminishing returns. These differences will only become smaller! Oh, but ray tracing! Whatever. It's nice. Pre-baked effects can come very close - they're just WAY more work for developers - and until hardware is powerful enough to do global illumination/path tracing at full scale, it is going to HAVE to be hybrid solutions to actually look any better than the best pre-baked work already done.

The power discussion, as well as P.C. potential, drew me back into P.C. gaming as the promises associated with DX12/Vulkan and other closer-to-the-metal ideas began entering the P.C. market. I went with the 980 Ti, then the 1080 Ti, and now I'm moving on to the 3080 (yes, I skipped Turing). I like my P.C., obviously. However, the rosy "my P.C. is so much more powerful" glasses are now heavily smudged (or perhaps the pleasant but false filter is simply wearing off) with the reality that ***unless everyone else has the same P.C. and developers focus on it only***, even with the RTX 5xxx series, games will, to most people, largely look very similar across platforms - but hey, extra fps will ABSOLUTELY be an option, so there's that - and yes, I like it.

In F1 a team can spend millions of dollars to shave 0.2 seconds off a single lap. For gaming, I would prefer A WHOLE LOT more than what MOSTLY amounts to some extra fps. The hardware is actually capable of it, and I see that it is nearly NEVER utilized for much more (if I'm honest). This is just the reality. So I say: good for consoles, reaching a level of graphical fidelity where I'm frankly not so excited about showing off my P.C. to anyone who doesn't know the difference between rasterization and ray tracing - and maybe even to anyone who does know the difference but remains brutally honest about the actual appearances, despite suppressing the excitement that comes with knowing just how cool it is that the hardware can do what it is doing.

Sounds sad that "showing off" is a thing here, but I guess the validation helps when I have to convince myself that the differences go much beyond my silky smooth high FPS - but they do not - hence my comment. Disagree away now - keep telling yourself your P.C. somehow looks so much better than mine ;-). I've put up good P.C. money, but I'm honest and wish to see the hardware do what it actually COULD (but likely never will) do.
Came here to brag, asshole? Got myself a 3090, so what? And no, you paid more than $850, liar.
hahahaha so many butthurt responses just from one factual sentence. Stay insecure guys! Never seen so much jealousy and hatred in one post on here. @ThereGoThatManQ I am compensating for my penis. Is that what you want to hear? I got a new graphics card because of my penis? because that makes sense in your head somehow. hahaha ok! If it makes you feel better, yes I did get it for that reason, not to future proof myself for the next few years. @833 Nice novel lol @Promachos You are right, I had to pay $11 shipping too. haha I hope you enjoy your 3090.
I found it funny you get a good GPU and then you end up playing it on a TV LMAO. What is the point of getting that GPU anyway, when you're running at a lower FPS compared to a monitor?
@RedDevils, are you really that simple minded and obtuse, or are you just acting cute? If you weren't aware, a new gen is about to start with games that are going to be pushing the limits over the next few years. I am future proofing myself so that I can play all the newest games at minimum 4k/60 with raytracing and the highest settings. I guarantee these new consoles and gpus are going to be hitting the limits in terms of frame rates. Even with the new call of duty beta, I was getting max 90-100 fps at 4k.
@fax lol actually I meant "insecurities" but I guess we now know what you're insecure about. Many others got the same card, but you felt the need to boast about it for whatever social gain you thought this would get you, or to fill whatever void it is you have within yourself. It's not hard to catch a troll, but you actually mean what you say, which is sad to be honest.
@JustTheFax You ain't getting the full potential of that GPU on a 4k TV, ever - not compared to a monitor (even though prices are higher). Anyway, it's not my money to waste, as I have the same card. You can enjoy that GPU on whatever screen you want. But playing on a TV is such a waste if you care about refresh rates/input lag; even the best TV will still come in 2nd to an average monitor. As for me, I'd rather game on a TV with a PS5/XSX.
@jin You also have to figure out if the game was made with ray tracing only; they may not have another... oh wait a minute, it's releasing on last-gen consoles, so it should probably have an option to turn it off. My bad. Edit: all the people saying it won't magically work by turning off ray tracing are also not admitting that ray tracing affects frame rates, for the most part - or at least I've never seen a game where it doesn't.
Ubisoft knows jack shit about optimization. High-end CPUs are required to run their shitty games at 60fps. So I don't think the GPU will be the limiting factor.
As long as games strive to push the limits of hardware 30fps will always be the standard. Even on my PC if it's a game I don't feel the need to use a mouse in I have no problem cranking the graphics options and locking 30fps with a controller.
I've found my nice minimum is 42fps. It still feels pretty smooth. Feels a lot better than 30, and a good spot if you're not getting a solid 60. My PC is no slouch, but it doesn't hit a straight 60 on most taxing games when cranked up. If I don't manage even that 42, I'm more than happy to drop a couple settings. I only do 30fps when that's literally my only option - so the odd locked PC game, or applicable console exclusives.
A lot feel 42 is just about right, but I settled on 45. On my last playthrough of Assassin's Creed Origins there's no way you can hold a constant 60, so a 45fps lock in RivaTuner made it very playable for me. You get used to it after a while.
Especially for open world games. People who know nothing about game development really do need to shut up and just play the game.
But Timdog said XSX will play games 4K60 without breaking a sweat😂. On topic though, I would prefer 1440p upscaled with a 60 FPS mode; both consoles should go for that and have more overhead for effects and such.
It’s easy to say just drop the resolution and up the frames, but there may be other variables in play. Maybe it’s the animations, or maybe they are bottlenecked on the CPU because of the amount of playable characters and AI required. Who knows. Maybe it’s just not optimized well.
I'd honestly be surprised if this didn't have an fps option.
Lol timdog and the xdealer crew is a joke.
Entertaining though lol
I wouldn't use this game as the smoking gun to disprove that. It's Ubisoft, they're far from the kings of optimization and getting everything they can out of hardware.
Ugh. Cutting the framerate in half for slightly better lighting effects. I have a feeling that's going to be an annoying trend this generation.
Ray tracing isn’t “slightly better lighting effects”. Wanna see the difference, try comparing the console versions of Cyberpunk 2077 to the PC version capable of ray tracing and the difference will be night and day.
Thanks, I've seen plenty of comparisons (using PC games with ray tracing on and off, because comparing console to PC is stupid), and except for Minecraft, they all prove my point rather than refute it. Massive framerate hit for a moderate improvement.
30 FPS for maxed out graphics isn't a new thing. Don't know why folks thought that was going away
I didn't actually believe it would, I'm just going to criticize it anyway. Also, the CPUs aren't trash this time around, so if there was going to be a gen where 60 could become the norm, this is it.
Yup, and if any less was acceptable, they'd push for that too lol.
Games that are heavy on the CPU don't just magically run at double the frame rate if you drop the resolution or graphics quality - especially games that have a lot of systems at play. Dense urban environments with several dozen civilians, combat AI and vehicles potentially on screen at once means that the CPU is going to have a lot to contend with.
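A minimal way to see why (the millisecond numbers below are made up for illustration; the point is the `max()`, not the values): each frame is gated by whichever of the CPU and GPU finishes last, and dropping resolution only shrinks the GPU side.

```python
# Frame time is bounded by the slower of the CPU and GPU each frame.
# All timings below are hypothetical, chosen only to show the shape
# of the bottleneck argument.

def frame_time_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

def fps(cpu_ms, gpu_ms):
    return 1000.0 / frame_time_ms(cpu_ms, gpu_ms)

# GPU-bound: cutting the GPU's per-frame work roughly doubles fps.
print(round(fps(cpu_ms=16.0, gpu_ms=33.0)))   # ~30 fps
print(round(fps(cpu_ms=16.0, gpu_ms=16.5)))   # ~61 fps after lowering resolution

# CPU-bound (AI, physics, crowds): the same GPU cut changes nothing.
print(round(fps(cpu_ms=33.0, gpu_ms=33.0)))   # ~30 fps
print(round(fps(cpu_ms=33.0, gpu_ms=16.5)))   # still ~30 fps
```

So a dense simulation like Legion's crowds can sit at 30fps regardless of resolution if the CPU side is the 33ms term.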
A lot of people don't understand this lol. They think resolution or RT is the only thing holding back frames.
Yes, but consoles finally have decent CPUs, as opposed to last gen.
These are the recommended PC specs for 4K. Consoles can do more with less, but no way either console's CPU is within spitting distance of a 9700K. So yes, the CPUs are leagues better than last gen, but they're still not the top-of-the-line hardware this game is asking for at 4K.

CPU: Intel Core i7-9700K 3.6 GHz / AMD Ryzen 7 3700X 3.6 GHz
GPU: NVIDIA GeForce RTX 2080 Ti or GeForce RTX 3080
Video memory: 10GB
RAM: 16GB (dual-channel setup)
Hard disk space: 45GB (+20GB HD Texture Pack)
OS: Windows 10 (64-bit only; for the optimal experience use DX12, also compatible with DX11)
They can do more with less because they're nowhere near maxing games out when compared to recommended settings on PC. Thermal stability is carefully taken into consideration when it comes to consoles, always has been. Due to their compact nature it's a limiting factor when stability becomes prioritized over performance.
Sweet, maybe 1080p with ray tracing at 60 FPS option.
I hope their TV has fantastic upscaling, because I'm finally starting to find 1080p coming off as legitimately blurry. You can have all the best effects to be had, but if it's not at a proper resolution you're just looking at it through a muddy lens. In all honesty, so many people value certain things over others. The 3 main things right now are resolution, framerate, and ray tracing. While I'd like to see consoles go the PC route and just play with everything, it would really be cool to offer the ability to at least play around with those 3 in-game. Personally, ray tracing is cool, but I don't need it and would rather be given the option to simply turn it off to boost the other two on console. Hoping that's a thing we see when everything is said and done.
Too bad they admitted I proved it all (and that everyone knows it). Where is your counter-proof? If you can't ball up and answer that, then there's no point in responding with the back and forth.
Don't care for ray tracing over 60fps. Hope it will be an option in this game and others.
So what about a 1080p/60fps option? Or the ability to turn off RT at 4k to increase fps?
It would be better at a lower res - 1080p but at 120fps. Always FPS over res, until 4k can hit the same FPS as 1080p does.
Sounds like a PC is more up your alley. 120fps is still gonna be a bit of a rarity in console gaming this gen. Not enough ppl have TVs to support it
So few people have 120fps capable displays; I doubt most developers will give that option much thought.
Series S must be holding it back.
The Series S is holding back also the PS5 version. Oh yes; that makes sense.