World of Tanks technical director offers his thoughts on the console's CPU.
Better Optimization is always welcomed: “We’ve actually found the CPU and GPU improvements to complement each other quite well. Increasing the resolution from 1080p to 4K uses much of the additional power of the GPU but has basically no effect on the CPU. With those extra pixels available to display fine detail, though, we’ve chosen to extend our LOD ranges so that more objects render in the distance. That kind of change has CPU cost that is a good fit for the improved hardware there.”
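The LOD change the director describes is worth unpacking: choosing a level of detail per object by distance is CPU-side work that scales with object count and draw distance, not with output resolution. A minimal sketch of the idea (all names, ranges, and distances here are hypothetical, not World of Tanks' actual values):

```python
# Toy sketch: distance-based LOD selection. Extending the draw
# distance (the last range) adds objects to the visible set, which
# is CPU work, while the output resolution never enters the math.

def select_lod(distance, lod_ranges):
    """Return the LOD index for an object at `distance`, or None if culled."""
    for lod, max_dist in enumerate(lod_ranges):
        if distance <= max_dist:
            return lod
    return None  # beyond the last range: not rendered at all

def visible_objects(object_distances, lod_ranges):
    """CPU-side pass: decide what to draw and at which detail level."""
    draws = []
    for obj_id, dist in object_distances.items():
        lod = select_lod(dist, lod_ranges)
        if lod is not None:
            draws.append((obj_id, lod))
    return draws

# Base console: LODs cut off at 50/150/400 metres (made-up numbers).
base_ranges = [50, 150, 400]
# One X: same art, final range extended so distant objects still draw.
extended_ranges = [50, 150, 800]

scene = {"tank": 30, "tree": 120, "hill": 600}
print(visible_objects(scene, base_ranges))      # hill is culled
print(visible_objects(scene, extended_ranges))  # hill now drawn at LOD 2
```

The extra CPU cost the director mentions comes from the larger visible set: more objects pass the cull, so more animation, culling, and submission work per frame, independent of whether the frame is 1080p or 4K.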
well that explains how they were able to use a cheap CPU
I genuinely believe the reason both (MS/Sony) stuck with the same CPU was to ensure straightforward development and guaranteed (base level) performance would be doable on the OG systems. Popping in a different CPU would be a bitch for developers. Slapping in a better GPU was an easy way to up the visuals and keep things simple for everyone.
Yeah, it is called efficiency and optimization instead of wasteful.
Yeah, but a weak CPU will bottleneck fps... which at the end of the day is all that matters to me. I'd much rather have 90 fps. 60 fps, while still OK, seems like last gen. We've had 60fps since the arcade days. Give us more. Doesn't matter the console.
@SierraGuy OK, one more time, a reminder of why the CPU and GPU on the X aren't set up to work the same way as a PC's. The slowdown you'd hit more easily on a PC, or even on the base model, won't happen the same way on the X. Much of the CPU demand a PC incurs while rendering, processing all those DirectX calls in software, has been moved directly onto the silicon by integrating DirectX into the GPU portion of the APU itself. PCs handle DirectX calls in software, and that puts most of the work on the CPU. A few calls on the X's CPU side can translate into thousands of calls a PC would make on its CPU side. On the X, the optimized DirectX path means the GPU silicon does work that the CPU and software have to do on a PC. Sure, this doesn't cure every issue that may tax a game, but it cures many of them. Consoles (PS4/Xbox) have always had advantages that let them bat way out of their spec league:
1. All consoles are static devices that are easy to develop and optimize for. PCs are thousands of moving parts (one machine is rarely exactly like the next), and even the best optimization can't plan for every issue. So PCs must rely on much more brute CPU speed to make up for the lack of true optimization that all consoles get.
2. I'm going to repeat #1 because it's so important to why all consoles (not just the X) continually outperform PCs, even ones with newer and better specs. The console is a svelte device. Extra lean. 2% body fat. A PC, even with all that super strength and training, still can't get below 13% fat. Sorry, no Mr. Fitness awards for them. And over time, even those in-shape PC gaming rigs keep adding pounds and pounds of fat. So yeah, they need all that extra CPU muscle to get where they need to be. The console, not so much, in most cases. Additionally, the Xbox One X has added the following: 3.
As I mentioned above, the Xbox One X has moved DirectX away from software, which commonly wastes thousands of CPU cycles on every frame a PC has to render. By placing most of Direct3D directly on the GPU silicon, Microsoft takes away much of the need for a powerful CPU to do those calculations. 5. Memory bandwidth. Like the GTX 1080, the X has a high-bandwidth 320GB/s memory pathway. The catch is that even a super PC gaming rig still has a CPU bottleneck, because the CPU does all its work through much slower PC system memory and then has to transfer that data to the GPU. The X gets the full 320GB/s. Party time on the X: it's 320GB/s everywhere on the memory bandwidth train.
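The draw-call argument in the two comments above can be made concrete with a toy cost model. Whether the "thousands of calls" figure is accurate can't be verified here, but the claimed mechanism, per-call CPU overhead in a thick software driver path versus a thinner console-style path, would look roughly like this (every number below is invented purely for illustration):

```python
# Toy cost model: the same frame's draw calls cost more CPU time
# when each call goes through a thick software driver layer than
# when it hits a thin, console-style command path. All constants
# are made up; only the shape of the comparison matters.

def frame_cpu_cost_us(draw_calls, per_call_overhead_us):
    """CPU time (microseconds) spent just submitting draws for one frame."""
    return draw_calls * per_call_overhead_us

draws = 2000                # hypothetical draw calls in one frame
thick_driver_us = 10.0      # hypothetical per-call cost, PC-style software path
thin_path_us = 1.0          # hypothetical per-call cost, console-style path

pc_cost = frame_cpu_cost_us(draws, thick_driver_us)    # 20000.0 us = 20 ms
console_cost = frame_cpu_cost_us(draws, thin_path_us)  # 2000.0 us  = 2 ms

# Against a 16.6 ms frame budget (60 fps), the thick path alone blows
# the budget, while the thin path leaves most of the frame for game logic.
print(pc_cost, console_cost)
```

This is why a modest CPU can keep up on a fixed console: if submission overhead per call is low, the same CPU clock drives many more draws per frame.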
@ImGumbyDammit, I think you are missing a #4.
@Sitdown 1, 2, 3, 5! I dont see the problem guize.. /s
Well, no duh. Upping the resolution to 4K will stress the GPU, not the CPU. If you wanted to tax the CPU, try playing your games at lower resolutions such as 1080p or 720p; the GPU spits out frames faster than the CPU can call for the next one xD.
So this is for games that don't tax the CPU, then, right? I mean, there are already confirmed dynamic-resolution games, most of them bigger AAA games... so what about those games? What stops native 4K resolution there?
Yup. I've said this for years. Resolution is determined by the GPU, not the CPU. That's why the new consoles feature improvements to resolution as the main focus. Frame rate improvements are far less common, as they require the CPU too.
Good optimization = better games
Imagine how games could look on PC if developers optimized as much as some devs do for console games. Instead it's just raw power and sloppy code in most cases.
Buy but but my Gtx 1080ti eats blah blah blah ..closed system biatch
I wouldn't call that sloppy code. Would you like to optimize for the myriad of CPU types out there? It simply isn't practical. The price of the hardware freedom on PC is a lack of optimization.
Developers simply cannot optimize for PC on the same level that they can for consoles. Even if PCs were all that existed, instead of consoles, the multitude of GPU and CPU configurations (including the same GPU with more or less RAM, which is not something to be taken lightly) would make it incredibly unviable to optimize for specific hardware configurations.
On PC the dev doesn't write directly to the hardware. Even with the new low-level APIs in Vulkan or DX12, they still write to a software layer which then communicates what is needed to the hardware. While it is possible to optimize this code, it isn't the same as optimizing the code to the hardware itself. Basically, there is only so much that can be done, because the API that handles hardware calls at a software level can't be bypassed enough to really get to the meat of the optimization. This is why PCs can sometimes seem less efficient. There is really no way around this on a system whose operating system doesn't allow direct access to the hardware, due to the need to keep everything stable within the operating system itself. Back before DirectX, writing directly to hardware, or writing specific drivers for the hardware on various systems, wasn't uncommon. But it was just a mess in terms of compatibility, and it took so much time that optimization was kept to a minimum: the more optimization that was done, the fewer systems could run the code.
Code for most games is not sloppy at all. The openness of PCs is the problem. Take 1,000 gaming rigs and there will be 1,000 different configurations. Take 1,000 different Xbox One S consoles and you will probably find one configuration. PC developers have to optimize against an impossible list of variables:
* Different CPUs (even different generations of the same CPU)
* Different motherboards
* Different BIOS setups
* Different memory manufacturers, different memory speeds, different total memory
* Different GPUs; even the same GPU type may come from a different manufacturer or carry different GPU memory
* Thousands of combinations of drivers: mouse, printer, network, graphics, system, keyboard, audio, and so on
* Third-party software loaded in the background. BAD ACTORS! Many issues aren't optimization/CPU related.
* Anti-virus: not just one, but dozens of programs, all acting differently.
More issues are caused by third-party interaction with a game (e.g. a bad driver) than by sloppy game code.
Yeah sloppy was not the best word to use in this case. Unoptimized.
If thats the case then why do frames drop with higher resolution?? Lol
No disrespect, but this question answers itself. The answer is that GPUs have limits...
The CPU in Xbox One X maxes out long before the GPU does, this is what's called a bottleneck. This absolutely impacts performance.
Memory is usually the cause of frame drops on the GPU, not the GPU's inability to process fast enough. As for the CPU, there is physics processing that increases in detail as resolution increases; otherwise you get blocky particle effects. That would affect frame rate on the CPU side if the game isn't GPU bound.
True, but the CPU still needs to figure into feeding the GPU the data. Even with direct system-memory access from the GPU, the CPU is going to have to do some significant work in setting up buffers, decompressing assets from the game image on the fly (no sizeable game's graphics fit in RAM all at once), and probably even some side stuff related to rendering. (Your mentioned particles, as a possible example.) Quadrupling that load will impact performance to some degree--has to. Still, I do agree that increasing GPU performance and memory to match the increase in resolution will go a long way in handling the extra load transparently.
Totally agree. If PC games were optimised as much as console games, PC gamers would be happier. But yes: raw power and sloppy coding.
Not really true anymore with modern games. Forza 7's demo generally runs better on PC in DX12 than equivalent Xbox One X hardware.
You know your s*** better than me, no denying that. But I'm not sure I agree with you. It's much better, admittedly (than it was, and improving). But when you consider what they can still squeeze out of the OG (and updated) consoles, it'd be hard to match anything like that price/performance (optimisation) with your PC. Me personally, though, I'm happy to pay a (pretty big) bit more to get a bit more.
Impossible; optimisation always wins. That's why these asset-demanding games run badly even on the highest-quality PCs: they get little or no optimisation, and the PC just brute-forces it with raw power. That's not games running better. Looking better, yes...
They won't believe you but you can genuinely get 4K 60FPS with ultra settings AND MSAA on a GTX1060 in the Forza 7 demo. That's a midrange video card that is like 14 months old. I said several months ago when Forza 7 tech was shown on X1X that this would be the case and nobody believed me. Well sorry, but I'm right again.....
Well, that is only because big companies rarely, if ever, put Hollywood budgets into PC exclusives or even timed exclusives.
In a perfect world, you would be correct. Console games can be optimized to the metal--directly to the fixed hardware spec. But are they? They used to be, when the hardware was more primitive. Now that consoles are basically PCs dedicated to closed-environment gaming, I doubt it. They run through an (optimized, but still present) abstraction layer, like DirectX or Vulkan. Economics have to figure into maintaining the code as portable as possible. I'd be very curious to find out exactly how much more efficient the console HAL code is, compared to the more hardware-flexible equivalent on PC.
"Xbox One X’s 4K Resolution Has No Impact on CPU..." - Captain Obvious --- GPU handles resolution to begin with, so I don't see how the CPU would affect anything like this anyway.
Yeah! News at 11
I guess it depends on the games? Other devs may produce games that are CPU bound, and then we might have an issue. Hey, what do we gamers know? I'll leave the wizardry to the devs. -- The base PS4 has a 1.6GHz CPU; theoretically a lot of games should not be possible. Try running today's games on a PC equipped with a 1GHz CPU, lol.
Resolution isn't a function of the CPU. Every aspect of resolution is handled by the GPU. The CPU requests a frame draw; the GPU takes the data for all the objects in the scene, which is provided by the CPU, renders it based on the "camera", and outputs the frame. If the scene has a lot of objects to draw, then the CPU can indeed slow things down, because it has to process all that stuff before making the frame draw request. If the frame rate is high, then it's possible that not all those objects can be updated on every game loop in time to maintain the frame rate. But things like that are independent of the resolution itself.
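That division of labor can be sketched as a toy model: estimated CPU time scales with the number of objects submitted, estimated GPU time scales with the number of pixels shaded, and resolution never appears in the CPU term. All cost constants below are invented purely for illustration:

```python
# Toy model of the CPU/GPU split described above. The constants are
# made up; the point is which variables each side's cost depends on.

CPU_US_PER_OBJECT = 2.0        # hypothetical: update + submit one object
GPU_US_PER_MEGAPIXEL = 900.0   # hypothetical: shade one million pixels

def cpu_frame_us(object_count):
    """CPU cost per frame: depends only on scene complexity."""
    return object_count * CPU_US_PER_OBJECT

def gpu_frame_us(width, height):
    """GPU cost per frame: depends only on pixels shaded."""
    return (width * height / 1e6) * GPU_US_PER_MEGAPIXEL

objects = 5000
# Moving from 1080p to 4K quadruples the GPU estimate...
print(gpu_frame_us(3840, 2160) / gpu_frame_us(1920, 1080))
# ...while the CPU estimate is identical at any resolution.
print(cpu_frame_us(objects))
```

In this framing, raising resolution moves only the GPU term, and extending LOD ranges (more objects) moves only the CPU term, which is exactly the trade the World of Tanks team says they made.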
I ran the Forza 7 demo at 4K maxed out @ 60fps on an RX 470, i5 2400, 8GB RAM.
That is a bold claim. Do you have proof of this? My Skylake i5 with a GTX 970 can't do it.
I have an i5 2400 instead, but I get 60fps at 4K. https://youtu.be/GaeW_PEBqf...
So your i5 2400 from 2011 is better than a Ryzen 5 from 2017? Press [x] to doubt.
Xbox One X was always designed to give a 4x boost to resolution only; that's why the CPU wasn't majorly overhauled. If a game is 1080p/60fps on X1, then that game should be 4K/60fps on X1X. So when I see PlayStation fans trolling Xbox articles with "True 4k, Uncompromised 4k xdxdxd" I just laugh, because they don't know what they're talking about. If a game was 720p on X1, trolling it because it isn't native 4K on X1X is just a waste of time.
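The "4x" figure above is at least arithmetically sound: going from 1080p to 4K UHD doubles both dimensions, quadrupling the pixel count, which is why a roughly 4x-stronger GPU can cover the resolution jump on its own while the CPU load stays put:

```python
# 1080p -> 4K UHD is exactly a 4x increase in pixel count:
# both dimensions double, so the area quadruples.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_4k = 3840 * 2160      # 8,294,400 pixels
print(pixels_4k / pixels_1080p)  # 4.0
```

By the same arithmetic, a 720p game would need a 9x pixel boost (1280x720 vs 3840x2160) to reach native 4K, which supports the commenter's point that expecting it is unreasonable.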
It really makes me happy seeing devs being able to utilize the more powerful consoles... The PS4 Pro and Xbox One X are great machines. Glad they are being embraced so well! Here's to the future of gaming everyone! :)
@ImGumbyDammit consoles never outperform high-spec PCs, what are you on about? I mean never.
That's a bit of a broad statement. Half of games these days are open world, so I'd say "no impact" simply isn't true.
It's talking about using DX12.
That's complete nonsense, of course it has an impact on CPU. Obviously not as much as the GPU, and how much depends on the game, but to say it has no impact is absurd. Where the CPU matters most is framerate. Bottlenecking is the real problem, which obviously will be less of a problem on a 7 year old game like WoT.
It's still a POS that MS made.
Nice to see a positive article instead of the trash "xbux haz no gamez" crap that gets approved.
Lol at the ppl who act like this is some new info that doesn't apply to other console games, smh. The GPU handles the resolution amongst other things, and the CPU deals with performance such as fps. This has always been so; on occasion devs can offload certain things to the GPU that the CPU usually does, but not everything.
Uh..... duh? The CPU has nothing to do with pixel count.
When they were creating the XB1 & PS4, games were still more GPU intensive. That has switched to being more CPU intensive at times, and that factors in more now than around 2012 or so. For the next Xbox & PS5 you will see much better CPUs than would have been possible before these recent tech industry boosts. Tech shifted recently and it's in our favor (the consumer... the gamers). I expect a nice jump next gen simply because the downscaling of chips and the power within them allows things to be smaller yet more powerful in a console. I have high hopes for next gen... not just for graphics, but more so for 60fps gameplay all around.
Isn't the GPU inside this box basically a mobile 970/1070?
A 580 that performs similarly to a 1070.
This is just PR. We already know in detail from DF that the Jaguar CPU was weak when it launched in these consoles in 2013, and by 2017 it's even worse. The new CPUs from AMD are 200%+ better; it's just crazy how weak the Jaguars are.
Not only the most powerful console in the world, it's also the most efficient console in the world. Amazing.
N4G is a community of gamers posting and discussing the latest game news. It’s part of NewsBoiler, a network of social news sites covering today’s pop culture.