
Intel wants Larrabee in next generation of game consoles

Computer chip giant Intel has told Develop that it hopes to convince console manufacturers that its Larrabee chip will be an ideal processor for their next generation of games hardware.

Read Full Story >>
developmag.com
XXXRATED3329d ago

Won't be enough power for the likes of Sony; I see them using a 32-core processor or higher by then. In 8 years I don't think this processor will even be around.

Ju3329d ago

Larrabee isn't even out yet. I bet both the PS4 and the next Xbox are in the early dev stage already. They are thinking about which CPU to take right now (or within 1-2 years). Ideal timing for Intel. The 32-core Cell is on the roadmap; I don't think it physically exists yet, but it should be ready in 8 years. Larrabee, however, was presented now. I think, though, that for the next generation backwards compatibility is crucial. I won't spend $600 again and not be able to play my current HD games. I'd rather stick with what I have instead - unless I can have both worlds. That argument, however, would point toward a CPU evolution rather than another break with the architecture...

LightofDarkness3329d ago (Edited 3329d ago )

FYI, Larrabee will likely use a 32-core design. The actual cores more closely resemble older Pentium designs, with a much higher vector register count than any GPU or CPU on the market (a 16-wide vector ALU). It's an interesting approach, which theoretically also produces massive gains in graphics processing.
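
To picture what a 16-wide vector ALU buys you, here is a rough sketch in C. This is purely illustrative (the VEC_WIDTH constant and the vec_add function are made up; Larrabee's actual vector instructions aren't public): each inner chunk stands in for what would be a single vector instruction on 16-wide hardware.

```c
#include <stddef.h>

#define VEC_WIDTH 16  /* hypothetical lane count, mirroring a 16-wide vector ALU */

/* Add two float arrays in chunks of VEC_WIDTH elements. On real SIMD
 * hardware each inner chunk would be one vector instruction; here it is
 * a plain loop the compiler is free to vectorize. */
void vec_add(const float *a, const float *b, float *out, size_t n)
{
    size_t i = 0;
    for (; i + VEC_WIDTH <= n; i += VEC_WIDTH)           /* "vector" body */
        for (size_t lane = 0; lane < VEC_WIDTH; lane++)
            out[i + lane] = a[i + lane] + b[i + lane];
    for (; i < n; i++)                                   /* scalar tail */
        out[i] = a[i] + b[i];
}
```

The win is that one instruction retires 16 lanes of work, which is why a modest in-order core can still push a lot of graphics math.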

INehalemEXI3329d ago

The video card containing Larrabee is expected to compete with GeForce and Radeon products from NVIDIA and AMD/ATI respectively. Larrabee will also compete in the GPGPU and high-performance computing markets. Intel plans to have engineering samples of Larrabee ready by the end of 2008, with a video card hitting shelves in late 2009 or 2010.

zapass3329d ago

- asymmetric multicore architecture quite similar to the Cell: general-purpose cores and SIMD cores with their own cache. Gee, Intel is years behind STI!

- a lot of whiners who complained about the difficulty of coding for the Cell are gonna have to suck it up somewhat on the PC too (Gabe, where are you?)

- PS3 coders who already made the transition will port their code very easily, while 360 coders will have a bitter pill to swallow.

Lemme laugh that one out really loud!

DaKid3329d ago (Edited 3329d ago )

Where do you find the specs on it? It sounds cool; I like the idea that they are asking what the game industry wants in a processor. If you give them what they want and design it to meet requirements, then it should be a great tool to further improve what games can do.

But I don't know who will use it; Sony has Cell, and Microsoft may be too cheap to pay for it.

Elven63329d ago

I don't see Sony using it unless Cell completely fails, since they co-created the thing. Microsoft and Nintendo are really the only companies that would probably use it.

Xi3329d ago

And I highly doubt developers will be willing to code for a chipset featuring the high number of SPEs required to match other CPU configurations.

If developers have trouble coding for 6-8 SPEs, then they're going to have a lot more trouble with something like 32; the micromanaging of system resources and threads would drive programmers nuts. Unlike Larrabee, which greatly simplifies the process.

DaKid3329d ago

Good point Xi, I see what you're saying.

SuperM3329d ago (Edited 3329d ago )

Multithreading is the future. Developers are going to have to learn to program for several cores no matter what. And it doesn't have to be a lot more difficult to program for 32 cores than for 8 cores. Once they get the hang of programming for several cores it will be just fine. If you ask any Sony first-party developer now, I don't think they have any problem programming for the Cell. Developers just need to learn how to do it right; after that it goes smoothly.
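
The point above - that going from 8 cores to 32 needn't change the code much - can be sketched as a plain data-parallel split. This is an illustrative pthreads example in C (all names invented); the thread count is one constant and the per-thread logic never changes:

```c
#include <pthread.h>
#include <stddef.h>

#define NUM_THREADS 8   /* hypothetical core count: 8 or 32, same code */

typedef struct { const int *data; size_t lo, hi; long long sum; } Chunk;

/* Each worker sums its own slice of the array. */
static void *partial_sum(void *arg)
{
    Chunk *c = arg;
    c->sum = 0;
    for (size_t i = c->lo; i < c->hi; i++)
        c->sum += c->data[i];
    return NULL;
}

/* Split [0, n) evenly across NUM_THREADS workers and combine. */
long long parallel_sum(const int *data, size_t n)
{
    pthread_t tid[NUM_THREADS];
    Chunk chunk[NUM_THREADS];
    size_t per = n / NUM_THREADS;

    for (int t = 0; t < NUM_THREADS; t++) {
        chunk[t].data = data;
        chunk[t].lo = (size_t)t * per;
        chunk[t].hi = (t == NUM_THREADS - 1) ? n : (size_t)(t + 1) * per;
        pthread_create(&tid[t], NULL, partial_sum, &chunk[t]);
    }

    long long total = 0;
    for (int t = 0; t < NUM_THREADS; t++) {
        pthread_join(tid[t], NULL);
        total += chunk[t].sum;
    }
    return total;
}
```

Raising NUM_THREADS from 8 to 32 touches one constant; the hard part in practice is keeping the chunks balanced and the data from thrashing between caches.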

Xi3329d ago

And developers are fine at multithreaded and hyperthreaded programming. For example, my PC has a total of 4 CPUs and 2 GPUs; that's 6 cores. However, the type of threading, the type of programming required, and how each instruction is executed all differ. Ever done parallel programming? It's a pain.

INehalemEXI3329d ago (Edited 3329d ago )

The Cell has little need to hyperthread. It has 1 PPE that kicks tasks out to the 7 SPEs, which are very quick. The next Cell in a console could look something like 4 PPEs (though maybe 4 would be redundant) and 32 SPEs, and there would be little need to have those PPEs hyperthread.

The Xenon, with its 3 PPE cores, is the type of CPU that is reliant on hyperthreading, and hyperthreading is not very efficient. It's not like you get the power of 6 cores when hyperthreading.

New algorithms created for Cell programming these days should lead to wisdom in programming for the next generation of Cell. Programming is always tedious; if we were to just use the same old code and a traditional chipset, sure, it would be easy, but there would be fewer advancements.

The Larrabee GPU could be in a PS4 with the next Cell as the CPU, who knows.

Ju3329d ago

The next Cell on the roadmap has 2 PPE cores and 32 SPEs. But it's not clear if it is meant to go into a console. It's more likely a replacement for the blade CPU (the CellX, or whatever it's called).

However, I agree that tools available for the current Cell should scale very well to a new derivative. I also think knowledge gained with the Cell today will support the next version - and the programmers. A faster Cell could also use improved "virtualisation": it could allow virtual "paging" of SDRAM (even in HW) and/or dynamic address translation for SRAM/local addresses. Furthermore, the instruction set could be expanded/optimized to catch up with current GPUs. A lot more graphics tasks could be offloaded to the SPUs - hence 32 instead of 6 SPUs - which would eliminate the need for a high-end GPU. And eventually, the memory bandwidth could reach 100GB/s within 10 years.

So I think there is some potential for a next-gen Cell as well.

But back on topic: is Larrabee an integrated CPU with a heavyweight SIMD core, or what's the main intent of the chip? I think for PCs it's meant to be a companion to the CPU, or integrated as a GPU. AFAIK the SIMD cores are more general purpose but can rival GPU shader programs. Is that right?

Elven63329d ago

http://en.wikipedia.org/wik...

Here's more info on Larrabee; so far, up to 48 cores have been benchmarked!

DaKid3329d ago (Edited 3329d ago )

wow, thanks

+ bubbles

doshey3329d ago

Wow, no way would I want to develop for something like that if I was a developer.

Saigon3329d ago

This chip will also have several cores... Xi, that makes it no different from the Cell; if devs have problems with the Cell, they will have problems with this... but that last statement I made is false... why, you might ask... because multi-core programming is and will be the future... devs have no choice but to get used to it... it will be the norm... think about it: the Cell is not only used in the PS3... it will be used in all kinds of appliances...

Xi3329d ago (Edited 3329d ago )

The difference is in the architecture of the chipset as much as it is in the system itself.

All current PCs which feature a CPU and a GPU are multi-core, and the same goes for dual-core CPUs and systems running SLI/Crossfire; we're fine at multi-core programming. The way the Cell is designed isn't to provide ease of programming, and that's the problem: unless you radically change the threading on the Cell, you'll see those problems expand with more cores. Larrabee simplifies the process and its growth is linear, meaning one could simply split the threads over more cores, unlike on the Cell.

Gam713329d ago

This will be easier to programme for than the Cell, as its architecture is closer to today's chips.

It will be made with feedback from the devs and will use OpenGL and DirectX, unlike the Cell, which was made by Sony for Sony and has forced devs to programme differently.

It sounds like Intel are desperate to get the next gen wrapped up and are courting the industry to ensure they get it right.

An interesting and obvious approach, which is refreshing to hear. It's like when a game developer listens to the feedback from the people buying the games, and you think: why don't they all do that?

Ju3329d ago (Edited 3329d ago )

@Xi, I do not agree. Larrabee just addresses some issues which the Cell brought to light, namely cache coherency and addressing local storage. This, however, can be implemented on the Cell as well through an SPE-specific kernel which does all that for you, treating the SRAM virtually as cache and using explicit DMA to load/store (map) SRAM into local RAM. Using that kernel would give you a "virtual" multi-core CPU.

However, such a system is not 100% deterministic, and you lose some control over when the SRAM gets loaded or flushed, hence I'd guess programmers prefer the low-level approach. libspe2, for example, allows "pseudo" preemptive scheduling of SPU threads. Others use "micro" tasks and a scheduler to split work into small tasks and distribute them across SPUs (this actually scales pretty well across any number of SPUs; the more the better). Others use static SPU allocation (e.g. dedicated tasks on various SPUs, which eliminates complex SRAM "coherency" management).

But all this that Larrabee does in HW can also be done in SW; Larrabee just moves some portions into dedicated HW. It's not so far off from the Cell (well, except that each core also has a discrete in-order general-purpose execution unit - a mini PPU, if you want).

Oh, and one thing I forgot: Larrabee adds a texture unit (like an on-die RSX).
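
The "micro task" scheduler idea can be sketched as a shared queue that any number of workers drain. This is an illustrative shared-memory version in C with pthreads (all names invented; real SPU code would move data with explicit DMA via libspe2 rather than touch shared memory directly):

```c
#include <pthread.h>

#define NUM_TASKS   64
#define NUM_WORKERS 4   /* could be 6 SPUs or 32: the code is unchanged */

static int results[NUM_TASKS];
static int next_task = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void run_task(int id) { results[id] = id * id; }  /* stand-in work */

/* Workers pull task ids from the queue until it is drained. */
static void *worker(void *arg)
{
    (void)arg;
    for (;;) {
        pthread_mutex_lock(&lock);
        int id = (next_task < NUM_TASKS) ? next_task++ : -1;
        pthread_mutex_unlock(&lock);
        if (id < 0)
            return NULL;        /* no tasks left */
        run_task(id);
    }
}

void run_all(void)
{
    pthread_t tid[NUM_WORKERS];
    for (int w = 0; w < NUM_WORKERS; w++)
        pthread_create(&tid[w], NULL, worker, NULL);
    for (int w = 0; w < NUM_WORKERS; w++)
        pthread_join(tid[w], NULL);
}
```

Because workers grab whatever task is next, the scheme scales with the worker count, which is exactly why the micro-task style gets better, not worse, with more SPUs.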

LightofDarkness3329d ago

Speaking purely conceptually (I've never worked with the Cell, so I can't comment for sure), it seems the Cell could very well produce playable ray-traced scenes if it had a larger array of both PPEs and SPEs. If you think about it, the SPEs are basically very similar to the shader processing units featured in modern GPUs, but with extra general-purpose power that pure shader units lack. If you could offload vector and ray processing to the PPEs and GPUs, you could leave the SPEs to process an enormous number of shader operations, meaning detail levels could be maintained with the shaders while the scene is lit appropriately. Purely speculative, I know; I can't comment on the Cell's power, having not dealt with it, but a man can dream...

Ju3329d ago

Google is your friend: http://www.google.com/searc...

One good example is this: http://www.youtube.com/watc...

Now, this is 3 PS3s, with 3 PPUs and 18 SPUs (and gigabit interconnects?). No DP floating point. Now imagine a next-gen Cell with 2 PPU cores (with maybe two real HW threads each) and full DP SPU x32 performance. I do not know how deep you could render, but partially, for sure.

DJ3329d ago

But Microsoft is making their own CPU chip from scratch, and Sony knows that x86 architecture offers less than optimal performance, and will simply stick with Cell.

Elven63329d ago

With the Xenon though, didn't they take Intel's Xeon chip and just modify it?

Xi3329d ago

And like both the Xenon in the Xbox and the Cell's SPEs, they're all based on IBM PowerPC architecture.

DJ3329d ago

And the Cell's SPUs aren't PowerPC based. They're based off the old SIMD chips from the late 80s. But they do have a very similar instruction set, specifically to the VMX units seen in IBM G5 chips.

LightofDarkness3329d ago

Yep, I believe the Cell's single general-purpose CPU is a PowerPC-based architecture. And the SPEs are indeed of an older design, hence the vast criticism levelled against the Cell since its inception (which I believe has been proven ill-informed).

Ju3329d ago

The SPUs are not based on the Power (or PowerPC) architecture. They couldn't get that to run with that high a number of cores at such low power. SPUs are custom SIMD processors. "Old" might refer to their in-order execution and pipelining, but other than that, the SPUs were designed from scratch. The PPU is a Power core (stripped down somewhat).

LightofDarkness3329d ago

I doubt Larrabee will be attractive enough for the next gen, but Intel is boasting that it can achieve much better experiences than the competing discrete solutions offered by both AMD and nVidia.

Intel's overall aim is to produce the first CPU/GPU that can process in-game ray-traced graphics at playable framerates. The problem is: can it produce ray-traced scenes with the same level of detail and graphical fidelity that we're accustomed to?

Larrabee itself is not designed to produce ray-traced scenes at playable speeds, but simply to show that a CPU/GPU hybrid can be competitive. Note that a CPU/GPU would be far cheaper to implement, as it only requires purchasing a single chip and striking a single licensing deal with one company. I can see someone like Nintendo taking the bait, alright.

Larrabee will be interesting, but its successor will be much more intriguing given Intel's overall aim. The real question is: will one of the console giants take the risk and bite the big, juicy ray-tracing worm for their next big console...

DJ3329d ago (Edited 3329d ago )

I can see the next Nintendo console simply using a G5 core, or a dual-core Intel chip. Nintendo doesn't really care about having cutting-edge technology; they're more about cutting-edge ideas.

DaKid3329d ago

I agree, this is just a building step for the future. You know, it takes a step to walk a mile, kinda thing.

LightofDarkness3329d ago (Edited 3329d ago )

All I'm thinking of is costs. Nintendo will likely want to repeat their Wii business model, making a small profit off of each unit sold rather than taking a small loss. It all depends on how expensive the Larrabee manufacturing process is. Seeing as it's based on the much older Pentium design, it could be quite cheap to produce, though we'll have to see how much the massive increase in vector registers affects the costs. AFAIK, they have only one lithography machine that has been engineered to produce a few test designs of Larrabee, and they won't be ready for weeks.

But if Nintendo doesn't have to shell out extra dough for separate GPU and CPU designs, they may opt for a single contract/license deal.

DJ3329d ago

I'm still a bit skeptical about integrated graphics, but if it has as many cores as I'm hearing, then it should be fine. By the time a new Nintendo console is released, Larrabee (or at least a form of it) should be fairly cheap to produce.

LightofDarkness3329d ago

Bubble up for decent conversation @ DJ ;)

Ju3329d ago

I think one of the deciding factors will be the price. If the thing delivers current (or future) visuals, and you can save the space for the CPU in a console, that's a big advantage. However, reading about the power consumption and cooling requirements might scare some people.
