290°

Intel wants Larrabee in next generation of game consoles

Computer chip giant Intel has told Develop that it hopes to convince console manufacturers that its Larrabee chip will be an ideal processor for their next generation of games hardware.

Read Full Story >>
developmag.com
XXXRATED6153d ago

Won't be enough power for the likes of Sony; I see them using a 32-core processor or higher by then. In 8 years I don't think this processor will even be around.

Ju6153d ago

Larrabee isn't even out yet. I bet both the PS4 and the next Xbox are in early development already; they're deciding which CPU to use right now (or within 1-2 years). Ideal timing for Intel. The 32-core Cell is on the roadmap; I don't think it physically exists yet, but it should be ready in 8 years, whereas Larrabee was presented now. I think, however, that for the next generation backwards compatibility is crucial. I won't spend $600 again and not be able to play my current HD games; I'd rather stick with what I have, unless I can have both worlds. That argument, however, points toward a CPU evolution rather than another break with the architecture...

LightofDarkness6153d ago (Edited 6153d ago )

FYI, Larrabee will likely use a 32-core design. The actual cores more closely resemble older Pentium designs, but with much wider vector hardware than any GPU or CPU on the market (a 16-wide vector ALU). Interesting approach, which theoretically also produces massive gains in graphics processing.
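
To make "16-wide" concrete, here is a conceptual sketch in plain C (not actual Larrabee code): a single Larrabee-style vector instruction could, in principle, retire all 16 iterations of this scalar loop at once.

```c
/* Conceptual sketch only: on a 16-wide vector ALU, the whole loop body
   below maps to one vector multiply-add across 16 lanes. */
void saxpy16(float a, const float x[16], const float y[16], float out[16]) {
    for (int i = 0; i < 16; i++)    /* one lane per iteration */
        out[i] = a * x[i] + y[i];
}
```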

INehalemEXI6153d ago

The video card containing Larrabee is expected to compete with GeForce and Radeon products from NVIDIA and AMD/ATI, respectively. Larrabee will also compete in the GPGPU and high-performance computing markets. Intel plans to have engineering samples of Larrabee ready by the end of 2008, with a video card hitting shelves in late 2009 or 2010.

zapass6153d ago

- asymmetric multicore architecture quite similar to the Cell: general-purpose cores and SIMD cores with their own cache. Gee, Intel is years behind STI!

- a lot of whiners who complained about the difficulty of coding the Cell are gonna have to suck it up somewhat on the PC too (Gabe, where are you?)

- PS3 coders who already made the transition will port their code very easily, while 360 coders will have a bitter pill to swallow

Lemme laugh that one out really loud!

DaKid6153d ago (Edited 6153d ago )

Where do you find the specs on it? Also, sounds cool; I like that they're asking what the game industry wants in a processor. If you give developers what they want and design it to meet their requirements, it should be a great tool to further improve what games can do.

But I don't know who will use it; Sony has Cell, and Microsoft may be too cheap to pay for it.

Elven66153d ago

I don't see Sony using it unless Cell completely fails, since they co-created the thing. Microsoft and Nintendo are really the only companies that would probably use it.

Xi6153d ago

And I highly doubt developers will be willing to code for a chipset featuring the high number of SPEs required to match other CPU configurations.

If developers have trouble coding for 6-8 SPEs, then they're going to have a lot more trouble with something like 32; the micromanaging of system resources and threads would drive programmers nuts. Unlike Larrabee, which greatly simplifies the process.

DaKid6153d ago

Good point Xi, I see what you're saying.

SuperM6153d ago (Edited 6153d ago )

Multithreading is the future. Developers are going to have to learn to program for several cores no matter what. And it doesn't have to be a lot more difficult to program for 32 cores than for 8. Once they get the hang of programming for several cores, it will be just fine. If you ask any Sony first-party developer now, I don't think they have any problem programming for the Cell. Developers just need to learn how to do it right; after that it goes smoothly.

Xi6153d ago

And developers are fine at multithreaded and hyperthreaded programming. For example, my PC has a total of 4 CPUs and 2 GPUs; that's 6 cores. However, the type of threading, the type of programming required, and how each instruction is executed all differ. Ever done parallel programming? It's a pain.
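
For anyone who hasn't felt that pain, a minimal generic example (ordinary pthreads C, nothing platform-specific) shows the classic trap: two threads bumping a shared counter without synchronization almost never produce the expected total.

```c
#include <pthread.h>
#include <stdio.h>

long counter = 0;                     /* shared, unprotected */

void *worker(void *arg) {
    (void)arg;
    for (int i = 0; i < 1000000; i++)
        counter++;                    /* read-modify-write: not atomic */
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("%ld\n", counter);         /* almost never 2000000 */
    return 0;
}
```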

INehalemEXI6153d ago (Edited 6153d ago )

The Cell has little need to hyperthread. It has 1 PPE that kicks tasks out to the 7 SPEs, which are very quick. The next Cell that's in a console could look something like 4 PPEs and 32 SPEs (maybe 4 PPEs would be redundant), and there would be little need to have those PPEs hyperthread.

The Xenon, with 3 PPE cores, is the type of CPU that is reliant on hyperthreading, and hyperthreading is not very efficient. It's not like you get the power of 6 cores when hyperthreading.

New algorithms created for Cell programming these days should carry over as wisdom for programming the next generation of Cell. Programming is always tedious; if we just used the same old code and a traditional chipset, sure, it would be easy, but there would be fewer advancements.

A Larrabee GPU could be in a PS4 alongside the next Cell as the CPU, who knows.

Ju6153d ago

The next Cell on the roadmap has 2 PPE cores and 32 SPEs. But it's not clear if it is meant to go into a console. It's more likely a replacement for the blade CPU (the CellX or whatever it's called).

However, I agree that tools available for the current Cell should scale very well to a new derivative. I also think knowledge gained with the Cell today will support the next version - and the programmers. A faster Cell could also use improved "virtualisation": it could allow virtual "paging" of SDRAM (even in HW) and/or dynamic address translation for SRAM/local addresses. Furthermore, the instruction set could be expanded/optimized to catch up with current GPUs. A lot more graphics tasks could be offloaded to the SPUs - hence 32 SPUs instead of 6 - which would eliminate the need for a high-end GPU. And eventually, the memory bandwidth could reach 100GB/s (or even more) within 10 years.

So, I think there is some potential for a next-gen Cell as well.

But back on topic: is Larrabee an integrated CPU with a heavyweight SIMD core, or what's the main intent of the chip? I think for PCs it's meant to be a companion to the CPU, or integrated as a GPU. AFAIK the SIMD cores are more general-purpose but can rival GPU shader programs. Is that right?

Elven66153d ago

http://en.wikipedia.org/wik...

Here's more info on Larrabee; so far, up to 48 cores have been benchmarked!

DaKid6153d ago (Edited 6153d ago )

wow, thanks

+ bubbles

doshey6153d ago

Wow, no way would I want to develop for something like that if I was a developer.

Saigon6153d ago

This chip will also have several cores... Xi, that makes it no different than the Cell; if devs have problems with the Cell, they will have problems with this... but that last statement I made is false... why, you might ask... multi-core programming is and will be the future... devs have no choice but to get used to it... it will be the norm... think about it: the Cell is not only used in the PS3... it will be used in all sorts of appliances...

Xi6153d ago (Edited 6153d ago )

The difference is in the architecture of the chipset as much as it is in the system itself.

All current PCs which feature a CPU and a GPU are multi-core, and the same goes for dual-core CPUs and systems running SLI/Crossfire; we're fine at multi-core programming. The way the Cell is designed isn't to provide ease of programming - that's the problem - and unless you radically change the threading on the Cell, you'll see those problems expand with more cores. Larrabee simplifies the process, and its growth is linear, meaning one could simply split the threads over more cores, unlike on the Cell.
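
A hypothetical sketch of that linear split on a conventional shared-memory design (generic pthreads C; the core count and names are illustrative, not Larrabee specifics): scaling from 8 to 32 cores just means each worker gets a smaller slice of the same loop.

```c
#include <pthread.h>

#define N      1024
#define NCORES 32                 /* set to 8 or 32: the worker is identical */

float data[N];

/* Each thread doubles its contiguous slice of the shared array. */
void *worker(void *arg) {
    int id  = (int)(long)arg;
    int per = N / NCORES;
    for (int i = id * per; i < (id + 1) * per; i++)
        data[i] *= 2.0f;
    return NULL;
}

int main(void) {
    pthread_t t[NCORES];
    for (long id = 0; id < NCORES; id++)
        pthread_create(&t[id], NULL, worker, (void *)id);
    for (int id = 0; id < NCORES; id++)
        pthread_join(t[id], NULL);
    return 0;
}
```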

Gam716153d ago

This will be easier to programme than the Cell, as its architecture is closer to today's chips.

It will be made with feedback from the devs and will use OpenGL and DirectX, unlike the Cell, which was made by Sony for Sony and has forced devs to programme differently.

It sounds like Intel is desperate to get the next gen wrapped up and is courting the industry to ensure they get it right.

An interesting and obvious approach, which is refreshing to hear.
It's like when a game developer listens to feedback from the people who buy the games, and you think: why don't they all do that?

Ju6153d ago (Edited 6153d ago )

@Xi, I do not agree. Larrabee just addresses some issues which the Cell brought to light, namely cache coherency and addressing local storage. This, however, can be implemented on the Cell as well through an SPE-specific kernel which does all that for you, treating the SRAM virtually as a cache and using explicit DMA to load/store (map) main RAM into the local SRAM. Using that kernel would give you a "virtual" multi-core CPU.
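
A sketch of that explicit-DMA "software cache" idea, written from memory of the SPU MFC intrinsics in spu_mfcio.h (treat the details as approximate): with two buffers, the SPU computes on one chunk of local store while the MFC streams in the next.

```c
#include <spu_mfcio.h>

#define CHUNK 4096
static volatile char buf[2][CHUNK] __attribute__((aligned(128)));

/* Stream main memory through local store: start the DMA for the next
   chunk, then compute on the current one while the transfer runs. */
void process_stream(unsigned long long ea, int nchunks) {
    int cur = 0;
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);             /* prefetch chunk 0 */
    for (int i = 0; i < nchunks; i++) {
        if (i + 1 < nchunks)                             /* kick off next DMA */
            mfc_get(buf[cur ^ 1],
                    ea + (unsigned long long)(i + 1) * CHUNK,
                    CHUNK, cur ^ 1, 0, 0);
        mfc_write_tag_mask(1 << cur);
        mfc_read_tag_status_all();                       /* wait for buf[cur] */
        /* ... compute on buf[cur] here ... */
        cur ^= 1;
    }
}
```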

However, such a system is not 100% deterministic, and you lose some control over when the SRAM gets loaded or flushed; hence I'd guess programmers prefer the low-level approach. libspe2, for example, allows "pseudo" preemptive scheduling of SPU threads. Others use "micro" tasks and a scheduler to split work into small tasks and spread them synchronously across SPUs (this actually would scale pretty well through any number of SPUs; the more the better). Others use static SPU allocation (e.g. dedicated tasks on various SPUs, which eliminates complex SRAM "coherency" management).
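
For reference, the PPU-side boilerplate for farming work out to an SPE with libspe2 looks roughly like this (a sketch of the documented API from memory; "spu_task.elf" is a made-up program name). Spawn one such thread per SPE and you have the skeleton of the micro-task approach described above.

```c
#include <libspe2.h>
#include <pthread.h>

/* Run one SPU program in its own PPU thread. */
static void *run_spe(void *arg) {
    spe_context_ptr_t ctx = (spe_context_ptr_t)arg;
    unsigned int entry = SPE_DEFAULT_ENTRY;
    spe_context_run(ctx, &entry, 0, NULL, NULL, NULL);   /* blocks until done */
    return NULL;
}

int main(void) {
    spe_program_handle_t *prog = spe_image_open("spu_task.elf");
    spe_context_ptr_t ctx = spe_context_create(0, NULL);
    spe_program_load(ctx, prog);

    pthread_t t;
    pthread_create(&t, NULL, run_spe, ctx);
    pthread_join(t, NULL);

    spe_context_destroy(ctx);
    spe_image_close(prog);
    return 0;
}
```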

But all this that Larrabee does in HW can also be done in SW; Larrabee just moves some portions into dedicated HW. It's not so far off from the Cell (well, except that each core has a discrete in-order general-purpose execution unit as well - a mini PPU, if you want).

Oh, and one thing I forgot: Larrabee adds a texture unit (like an on-die RSX).

LightofDarkness6153d ago

Speaking purely conceptually (I've never worked with the Cell, so I can't comment for sure), it seems the Cell could very well produce playable ray-traced scenes if it had a larger array of both PPEs and SPEs. If you think about it, the SPEs are basically very similar to the shader processing units featured in modern GPUs, with extra general-purpose power that pure shader units lack. If you could offload vector and ray processing to the PPEs and GPUs, you could leave the SPEs to process an enormous number of shader operations, meaning detail levels could be maintained by the shaders while the scene is lit appropriately. Purely speculative, I know - I can't comment on the Cell's power, having not dealt with it - but a man can dream...
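
For scale, the inner loop of any ray tracer is an intersection test like the generic C sketch below (not Cell-specific code); the dream scenario above amounts to running millions of these per frame across the SPEs.

```c
#include <math.h>

typedef struct { float x, y, z; } vec3;

static float dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Distance t along the ray to the nearest hit, or -1.0f on a miss.
   Ray: origin o, unit direction d. Sphere: center c, radius r. */
float ray_sphere(vec3 o, vec3 d, vec3 c, float r) {
    vec3  oc   = { o.x - c.x, o.y - c.y, o.z - c.z };
    float b    = dot(oc, d);
    float disc = b * b - (dot(oc, oc) - r * r);
    if (disc < 0.0f) return -1.0f;          /* ray misses the sphere */
    float t = -b - sqrtf(disc);             /* nearer of the two roots */
    return (t > 0.0f) ? t : -1.0f;
}
```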

Ju6153d ago

Google is your friend: http://www.google.com/searc...

One good example is this: http://www.youtube.com/watc...

Now, this is 3 PS3s, with 3 PPUs and 18 SPUs (and gigabit interconnects?). No DP floating point. Now imagine a next-gen Cell with 2 PPU cores (with maybe two real HW threads each) and full-DP SPU performance x32. I do not know how deep you could render, but partially, for sure.

DJ6153d ago

But Microsoft is making their own CPU from scratch, and Sony knows that x86 architecture offers less-than-optimal performance and will simply stick with Cell.

Elven66153d ago

With the Xeon though, didn't they take Intel's Xeon chip and just modify it?

Xi6153d ago

And like both the Xenon in the Xbox and the Cell's SPEs, they're all based on IBM PowerPC architecture.

DJ6153d ago

And Cell's SPUs aren't PowerPC-based. They're based on old SIMD chips from the late '80s, but they do have a very similar instruction set, specifically to the VMX units seen in IBM G5 chips.

LightofDarkness6153d ago

Yep, I believe the Cell's single general-purpose CPU is a PowerPC-based architecture. And the SPEs are indeed of an older design, hence the vast criticism levelled against the Cell since its inception (which I believe has been proven ill-informed).

Ju6153d ago

The SPUs are not based on the POWER (or PowerPC) architecture; they couldn't get that to run with that many cores at that low a power draw. SPUs are custom SIMD processors. "Old" might refer to their in-order execution and pipelining, but other than that, SPUs were designed from scratch. The PPU is a POWER core (stripped down, somewhat).

LightofDarkness6153d ago

I doubt Larrabee will be attractive enough for the next gen, but Intel is boasting that it can achieve much better experiences than competing discrete solutions offered by both AMD and nVidia.

Intel's overall aim is to produce the first CPU/GPU that can process in-game ray-traced graphics at playable framerates. The problem is: can it produce ray-traced scenes with the same level of detail and graphical fidelity that we're accustomed to?

Larrabee is not designed to produce ray-traced scenes at playable speeds, but simply to show that a CPU/GPU hybrid can be competitive. Note that a CPU/GPU would be far cheaper to implement, as it only requires purchasing a single chip and striking a single licensing deal with one company. I can see someone like Nintendo taking the bait, alright.

Larrabee will be interesting, but its successor will be much more intriguing given Intel's overall aim. The real question is: will one of the console giants take the risk and bite the big, juicy ray-tracing worm for their next big console...

DJ6153d ago (Edited 6153d ago )

I can see the next Nintendo console simply using a G5 core, or a dual-core Intel chip. Nintendo doesn't really care about having cutting edge technology. They're more about cutting edge ideas.

DaKid6153d ago

I agree, this is just a building step for the future. You know, "it takes a step to walk a mile," kinda thing.

LightofDarkness6153d ago (Edited 6153d ago )

All I'm thinking of is costs. Nintendo will likely want to repeat their Wii business model, making a small profit off each unit sold rather than a small loss. It all depends on how expensive the Larrabee manufacturing process is. Seeing as it's based on the much older Pentium design, it could be quite cheap to produce, though we'll have to see how much the massive increase in vector registers affects the cost. AFAIK, they have only one lithography machine that has been engineered to produce a few test designs of Larrabee, and they won't be ready for weeks.

But if Nintendo don't have to shell out extra dough for a separate GPU/CPU design, they may opt for a single contract/license deal.

DJ6153d ago

I'm still a bit skeptical about integrated graphics, but if they have as many cores as I'm hearing, then they should be fine. By the time a new Nintendo console is released, Larrabee (or at least a form of it) should be fairly cheap to produce.

LightofDarkness6153d ago

Bubble up for decent conversation @ DJ ;)

Ju6153d ago

I think one of the deciding factors will be the price. If the thing delivers current (or future) visuals and you can save the space of a separate CPU in a console, that's a big advantage. However, reading about the power consumption and cooling requirements might scare some people.

70°

Josh Sawyer: "I feel good about the ability for people to create games."

Game Pressure met with the one and only Josh Sawyer at Digital Dragons and chatted about RPGs, Pentiment, Pillars of Eternity, the state of the industry, and the genre.

Read Full Story >>
gamepressure.com
80°

Phantom Squad — Game Overview • VGMM

Phantom Squad is an intense 1-4 player tactical top-down shooter that blends fast-paced combat with strategic planning, drawing inspiration from games like Hotline Miami and Rainbow Six. Set to release in 2025 on Steam, players take on the role of disavowed operatives who must carefully plan their assault before breaching rooms.

Read Full Story >>
videogamesmademe.com
170°

Switch 2 accessory makers are “working hard” on magnetic analogue sticks to combat stick drift

Nintendo Switch 2 stick drift is already an issue, but accessory makers are working on magnetic joysticks to combat it.

Read Full Story >>
videogamer.com
Neonridr5d ago

I've never had stick drift in any controller I've ever owned. All my Joy-Cons (3 sets) from my Switch are perfectly fine, and my Switch 2 ones are good. Never had a DualShock/DualSense get it (did have a DualShock get a stuck trigger once). Even my Valve Index controllers, which were notorious for drift, were fine for me.

Goodguy015d ago

Varies. Just because you've never had one doesn't mean others won't. I've had PS, Xbox, and Nintendo controllers all drift on me. Remember the drift lawsuits Nintendo has faced.

Please stop trying to say that traditional potentiometers are absolutely fine when Hall effect/TMR sticks are beyond superior lol. Not sure why so many people keep trying to defend potentiometers when Hall effect sticks are so pro-consumer.
Companies like Nintendo, Sony, and Microsoft don't want to implement them simply because they want you, the consumer, to keep buying controllers. Most casuals will just buy a new controller if theirs drifts; easy money for the big 3.
I've gone Hall effect and have never gone back; I can leave my controllers out without worrying about dust ruining the sticks.
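
Some context on why potentiometer drift shows up the way it does: a worn pot reports a small nonzero value even with the stick at rest, and games mask that with a software deadzone along the lines of the hypothetical sketch below. Hall effect and TMR sticks sense position magnetically, with no wiper contact to wear out, so the masking is far less necessary.

```c
#include <math.h>

#define DEADZONE 0.15f   /* illustrative threshold, not any console's value */

/* Radial deadzone: inputs inside the threshold clamp to neutral, which
   hides a drifting pot; outside it, values rescale so motion stays smooth. */
void apply_deadzone(float *x, float *y) {
    float mag = sqrtf(*x * *x + *y * *y);
    if (mag < DEADZONE) {
        *x = 0.0f;
        *y = 0.0f;
    } else {
        float scale = (mag - DEADZONE) / (1.0f - DEADZONE) / mag;
        *x *= scale;
        *y *= scale;
    }
}
```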

Neonridr5d ago

When did I say that traditional joysticks are absolutely fine? I'm sharing my own experience here. I never said that others don't experience issues, nor that this is somehow acceptable. Strange leap from A to B on that one.

People love to say it's a huge widespread issue, yet you only hear about it because the minority is vocal whereas the majority don't report anything because they are fine.

I'm happy you switched to hall effect sticks, if that's what works for you, then awesome.

Christopher5d ago

Cool? Pretty sure this is for those who are having it happen. Not sure how your anecdote addresses other people's problems.

Neonridr5d ago

people around here act like everyone will get drift. I'm willing to wager more people won't get drift than will.

Pyrofire954d ago

Lucky you. I've only had minor drift once, in my left Joy-Con, which I fixed with some card stock, but drift is an undeniable problem as a whole.

repsahj4d ago

My 2 sets of Joy-Cons from my OG Switch started to drift after just a month of use. That is why I bought Dobe Joy-Cons for just $25, and they are still fine to this day. I hope this never happens again on my Switch 2.

jznrpg4d ago

My left Joy-Con for the OG Switch got stick drift right away as well. I never had issues with my PS5 controllers since launch, until I got my kids' PS5s and they had stick drift with 2 controllers. They drop them because they are kids, and although I have them wash their hands before gaming, they still get gunk under the sticks somehow. I got their controllers modded because they would have issues no matter what I have them do.

jambola3d ago

"people around here act like everyone will get drift. I'm willing to wager more people won't get drift than will."
No, nobody is acting like that; you're just exaggerating and lying to downplay a problem.

-Foxtrot3d ago (Edited 3d ago )

Nintendo's biggest defender on here, so it figures.

gold_drake3d ago

You know, I said the same thing not too long ago, and then BOOM, both controllers, ha.

One drifts on the movement stick and one on the look-around stick.

Both suck arse. Ha.

jznrpg4d ago (Edited 4d ago )

The tech is already there. I had a couple of my PS5 controllers modded with Hall effect modules and they work great. Controllers should come with them as standard these days, but they don't.

OtterX4d ago (Edited 4d ago )

Nyxi already made some fantastic Hall effect controllers for the Switch 1. I don't know why Nintendo couldn't have this by now if Nyxi has had it for years.

spoonard4d ago

Cheap, frictionless sensors ALREADY exist. Why are they "working hard to combat stick drift"? Stick drift should be a thing of the past at this point. The technology is here...NOW. It has been...for YEARS! Why is stick drift even still spoken about? It shouldn't exist!

Chocoburger4d ago

Because it's a scam for Nintendo, Sony, and Microsoft to sell more overpriced controllers.

jambola3d ago

Because pathetic fanboys will continue to defend and support them.

Doomeduk3d ago (Edited 3d ago )

WD-40, if it's shagged anyway, why not? I ordered a new PS5 pad after Helldivers 2 and POE 2 became unplayable due to drift, but in the meantime I fired a bit of WD on my balls just below my stick, rotated in a clockwise fashion, massaging it in so to speak, and also did the pin-reset thingy. All clean, no drift, and I hit that cancel-purchase button like I meant it.

UltimateOwnage3d ago

Honestly I’ve used my original Switch JoyCons and Pro Controller since launch and only in the last year did I see drift start to show up on one of my JoyCons. I’m sure it happens depending on how much and how firm the joystick is used, but it seems like a minor issue that goes with wear and tear after thousands of hours of play. I wish there had been Hall Effect sticks on Switch 2 just so there’s one less thing to worry about, but I’m not really concerned about it.