At IDF in Beijing, Intel showed a technology demo of the Vision Engine 8 running twelve threads on a Core i7-980X. A YouTube video shows the engine in action, and PCGH has more information on the tech demo.
Push the devs to use such powerful CPUs. I certainly don't see a point in upgrading anymore if all we are getting are games that barely require a Core 2 Duo.
That, plus running a home-made engine on your own CPU, isn't that impressive to me. I'd like to see GTA run on that baby, for example. And we all know how Hyper-Threading sometimes messes things up.
But it's gonna be a pain to develop a game that utilizes six cores; heaps of concurrent programming will be involved.
Well with all of the experience that developers are getting from working on the PS3, we will see more multi-threading in PC gaming. Just wait till Crytek is done with Crysis 2 and see if they don't start incorporating multi-threading into their game engine.
@smurfee: Look into what DirectX 11 brings in terms of multi-threading. Microsoft has made it much easier for developers to implement a good amount of multi-threading in a game.
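For anyone curious what that D3D11 feature actually looks like: deferred contexts let worker threads record command lists in parallel, which only the main thread then executes. This is not real Direct3D code, just a minimal Python analogy of that record-in-parallel, submit-in-order model (all names here are made up for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

# Python analogy (not real Direct3D code) of D3D11's deferred-context
# model: worker threads each record a command list independently, and
# only the main thread replays them, in a fixed submission order.

def record_command_list(object_batch):
    # Like a deferred context: records draw commands without
    # touching the GPU or any shared state.
    return [f"draw {name}" for name in object_batch]

batches = [["terrain", "trees"], ["characters"], ["particles", "ui"]]

# Parallel stage: each batch is recorded on its own thread.
# pool.map preserves input order in its results.
with ThreadPoolExecutor(max_workers=3) as pool:
    command_lists = list(pool.map(record_command_list, batches))

# Serial stage: the "immediate context" replays the lists in order.
frame = [cmd for cl in command_lists for cmd in cl]
```

The point of the split is that the expensive part (recording) scales across cores, while the order-sensitive part (submission) stays on one thread.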
So this CPU will be great when the next consoles are out and the ported games come to the PC. But maybe this CPU could run a PS3 or Xbox 360 emulator, making it cheaper to port games. I don't see IBM offering the Cell chip for their servers anymore... lol. CC :-)
I'm doubtful that even with the right software, this CPU would be capable of emulating the PS3's overly complex hardware. The 360 will probably be emulated first, but even that is still years away. By the time PS3 and 360 emulators are feasible, the next generation of consoles will be out and this CPU will be considered old.
Emulators require faster cores, not more of them. So it will be a very long time before the 360 or PS3 can be emulated, and the PS3 will be especially hard.
Emulating the 360 or PS3? Lol, that won't happen for a long time. There's still no emulator for the original Xbox, and the PS2 one took years to make. What I'm trying to say is that consoles are just getting too hard to emulate for some guy living in a basement.
I'd say this CPU is better than what the next generation of consoles will have. The next consoles won't need that many cores to significantly improve on what we have now, so why would console makers go to the expense? The GPU does most of the work anyway, and GPUs in PCs nowadays are miles ahead of those in both the 360 and PS3. The ability to change and upgrade the GPU will always give PCs the advantage over consoles, and by the time the next-generation consoles are out, they will already be outdated by the latest GPUs and CPUs.
An unimpressive demo run by a guy who sounds like a used car salesman. We're only now getting games that really utilize two cores, let alone four, so increasing the count seems pointless apart from bragging rights in a tech demo. Developers have to cater to the lower specs, so you're always held back to some extent, as much as I'd like to see some of the flight sims I own run across multiple cores to give me a better experience.
Anyone and their dog can kick off a thread to do *something*. Making a game engine that does something interesting on all cores at once, while maintaining the sequential order necessary to run a game loop, is the hard part. Multiple threads per core are useful for some stuff... but, for example, you need to know the position of a character in the game world to know where the camera is, and you need to know where the camera is to know where to start culling geometry for the render pipeline that frame, and the render pipeline can't start until it's fed geometry, and so on. Some stuff just happens serially. Lots of stuff, actually. You can't parallelize everything.

One of the reasons the PS3 is so cool for parallel game code is that individual SPUs are only about half the cost of a regular core, so you can afford to double the number on the chip and make the game-loop steps that *do* fit with parallelism much faster. They just require more effort to program, since they lack the automated conveniences of a "full core", complete with full memory addressing and an automatic memory caching mechanism. Having a ton of complete cores, for a game machine, is pretty wasteful.
I'm not sure why this guy thinks 20 fps is a good framerate for a couple of characters and some rain on a small city street with clean, square buildings. The per-thread workload percentages look... wrong. What on earth could be keeping all 12 hardware threads so busy while doing these relatively simple things? It's almost as if some threads were just spinning, and they were counting that wasted time in the workload.
That's exactly what I was thinking. They must be throwing away a lot of computation on stuff that real video game developers would not, such as doing a collision trace for every single raindrop or physically realistic multi-drop splashes; things a gamer would not really notice. The end result is a pretty (boring) scene that would only hold your attention for a minute.
Notice in this case, the GPU was the 'bottleneck'. Good stuff.
N4G is a community of gamers posting and discussing the latest game news. It’s part of NewsBoiler, a network of social news sites covering today’s pop culture.