450°
Submitted by Rashid Sayed 262d ago | news

Targeting Full HD Using Xbox One's eSRAM Is A Challenge, Constantly Optimizing Usage: Witcher 3 Dev

"It's no secret that several developers have faced issues in outputting 1080p resolution due to Xbox One's eSRAM. In fact one such developer stated that Microsoft cheaped out on the RAM." (The Witcher 3: Wild Hunt, Xbox One)

Axios2  +   262d ago
Sounds like when developers had to get used to the PS3.

Back then ppl said they were lazy.

Keep optimizing those tools

@Ziggurcat, you just described what some developers said about the X1, so by that logic they must be lazy, even though almost two dozen titles are already 1080p.
#1 (Edited 261d ago ) | Agree(26) | Disagree(57) | Report | Reply
Rashid Sayed  +   262d ago
It will eventually happen, though I'm not sure how long it will take. Things do seem to be getting better with the new SDK update.
DeadRabbits  +   262d ago
It should have already "happened", and by the time it does we will be playing our PS5s.

As long as these hardware restrictions don't affect game quality on other formats, it can easily be contained!
#1.1.1 (Edited 262d ago ) | Agree(15) | Disagree(49) | Report
johndoe11211  +   262d ago
Is it just me, or didn't this dev state a few months ago that it was much easier for them to work with the Xbox One than the PS4? I'm pretty sure I saw an article saying that. This sounds contradictory.
blitz0623  +   262d ago
They didn't say it was easier on the X1, just that it's not as hard as others make it out to be.
OpieWinston  +   262d ago
@Johndoe
The CD Projekt Red team has a lot more experience with DirectX, and that's why it's easier for them to develop for the X1.

This is in relation to the RAM and not the API.
ShinMaster  +   261d ago
Even if developers MASTERED Xbox One's eSRAM...
http://i.imgur.com/FSd2iWd....
That's still a no. Sorry.
#1.1.5 (Edited 261d ago ) | Agree(5) | Disagree(3) | Report
DirtyPimp   262d ago | Trolling | show | Replies(6)
Jury  +   262d ago
It's underpowered. Plain and simple.
KNWS  +   262d ago | Well said
It's underpowered, the XB1? Well, let's look at the PS4: Infamous was only 30 frames, and so was Killzone (in fact, Sony is getting sued for lying about the game, claiming the multiplayer was 1080p when it was 720p in reality).

The PS4 isn't a beast at all, and Destiny proved it once and for all. If the PS4 was vastly superior, Destiny would be 60 frames on PS4, but it isn't. Even the remastered The Last of Us drops frames down to the 40s in some areas. Hardly a 60-frames game!!

I laugh at you Sony fanboys all the time; you guys are delusional and need serious help. I can accept that the XB1 is slightly weaker than the PS4, but this 50% power difference has no basis in reality.

When the PS4 comes out with all those 1080p, 60-frames titles, let me know, will you.

When Crackdown comes out using the cloud for destruction, freeing up resources on the GPU, it might dawn on you guys how different the XB1 really is.
imt558  +   262d ago
@KNWS

Destiny is a cross-gen game and Bungie aimed for parity. Anyway, Destiny IS NOT a graphically demanding game.

I smell Metro: Redux knocking on your Xbone door @ 900p.

Well, the Metro games ARE graphically demanding, and they look astonishing on high-end PCs.
marioJP87  +   262d ago
The PS4 is also underpowered; it's just not as underpowered as the X1. It was released at a mid-level PC's performance. Killzone SF couldn't even manage native 1080p across single player and multiplayer. And don't dare call me a fanboy, I have all 3 consoles.
Xsilver  +   262d ago
@KNWS
OK, I'll play your little game.

"Well, let's look at the PS4: Infamous was only 30 frames"
Well, to be accurate, Infamous SS is 30-45fps, but I guess you prefer much lower, since Ryse drops to 16fps and never holds a constant 30 frames, and Dead Rising 3 runs at 720p with frames dropping to 14fps. So don't bring up Sony games' performance when Xbox games can't do any better :/

"(in fact, Sony is getting sued for lying about the game, claiming the multiplayer was 1080p when it was 720p in reality)"

Well, next we should sue MS for false advertising, since Forza 5 didn't look anything like the gameplay we saw before the game released. Wouldn't you say MS lied? :o

"The PS4 isn't a beast at all, and Destiny proved it once and for all. If the PS4 was vastly superior, Destiny would be 60 frames on PS4"
So what about Tomb Raider Definitive Edition, which was 60fps on PS4 but 30 on the XOne? Even then you denied that the PS4 was more powerful. So now, because Destiny aimed for parity, it suddenly means something? Or it could mean that they are trying to make sure everyone gets the same experience on all platforms:
"It’s also worth noting that, ultimately, there might not be any real-world difference at all. Most games are developed with cross-platform, lowest-common-denominator compatibility in mind. Developers are unlikely to create a game that runs well on the PS4, but chugs along on the Xbox One"

"Even the remastered The Last of Us drops frames down to the 40s in some areas. Hardly a 60-frames game!!"
Like I said before, the last remastered game on the XOne couldn't even hit 60 frames, and I know there are a lot of "hardly 30 frames" games on the XOne, buuuuut that's not my business http://www.marketmenot.com/...

"I laugh at you Sony fanboys all the time; you guys are delusional and need serious help"

Bruh http://24.media.tumblr.com/...

"I can accept that the XB1 is slightly weaker than the PS4, but this 50% power difference has no basis in reality."

"Slightly," you say http://gameswallpaperhd.com...

"When Crackdown comes out using the cloud for destruction, freeing up resources on the GPU, it might dawn on you guys how different the XB1 really is."

http://gifatron.com/wp-cont... ummmmmmm, OK.
#1.3.4 (Edited 262d ago ) | Agree(33) | Disagree(23) | Report
ocelot07  +   261d ago
@KNWS When someone says something negative about the XB1 on here, you always seem to drag the PS4 into it, even though the post has nothing to do with the PS4.

Why? You are actually getting worse than that truefan guy.
turdburgler1080  +   261d ago
I love how butthurt xsilver got. Look at that post. It's like a junior high kid got upset and started throwing a temper tantrum. You can feel the fanboy seeping out of it lol. I love that he went and probably spent ten minutes looking for articles and gifs to link to it. Maybe you should take a break from n4g if you can't handle a little ribbing.
#1.3.6 (Edited 261d ago ) | Agree(4) | Disagree(12) | Report
Evilsnuggle  +   261d ago
@KNWS

You are a delusional Xbone fanboy. I posted an article explaining temporal reprojection and Shadow Fall's 1080p, but you and your Xbone fanboys are in deep denial and don't believe in facts or technology. Yes, Killzone SF is output in full HD 1080p, but it uses a technique called temporal reprojection that takes data from an old frame and combines it with the newly rendered frame to reconstruct a full 1080p image before output, rather than simply upscaling.
http://n4g.com/news/1469012...

But I'm sure you and your Xbone fanboys will enjoy your fantasies.
#1.3.7 (Edited 261d ago ) | Agree(10) | Disagree(7) | Report
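For anyone wondering what "temporal reprojection" actually means here, a very rough sketch of the idea follows, assuming the simplified description above (half a frame rendered fresh each time, the rest filled in from the previous output). The real technique reprojects along motion vectors and rejects stale pixels rather than naively copying columns, so treat this as an illustration only.

```python
# Rough sketch of the temporal reprojection idea described above.
# Assumption: 960x1080 columns are rendered fresh each frame and interleaved
# with the previous 1920x1080 output. The real pipeline uses motion vectors
# and per-pixel rejection; this only illustrates the data flow.
FULL_W, HALF_W, H = 1920, 960, 1080

def combine(new_columns, previous_frame, frame_index):
    """new_columns: HALF_W x H freshly rendered; previous_frame: FULL_W x H last output."""
    out = [row[:] for row in previous_frame]              # start from last frame's pixels
    for y in range(H):
        for i, x in enumerate(range(frame_index % 2, FULL_W, 2)):
            out[y][x] = new_columns[y][i]                 # slot in the newly rendered columns
    return out                                            # a full 1920x1080 image
```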
XSpike  +   261d ago
@KNWS

You know Killzone does run at 1080p, just not 1920x1080; the multiplayer renders 960x1080. "1080p" normally refers to 1920x1080, but strictly it only describes the number of vertical lines of resolution, so it doesn't always mean 1920x1080; it always means 1080 lines.

The person above explained it better: KZ:SF combines two images to make a full 1080p image, but even without that it's still 1080p.
#1.3.8 (Edited 261d ago ) | Agree(7) | Disagree(4) | Report
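The pixel counts behind that argument, for reference (the 960x1080 figure for KZ:SF multiplayer is the one used in this thread):

```python
# Pixel-count arithmetic for the resolutions being argued about.
full_1080p   = 1920 * 1080   # 2,073,600 pixels in a standard 1080p frame
kz_mp_native = 960 * 1080    # 1,036,800 pixels rendered fresh per frame (thread's figure)
print(kz_mp_native / full_1080p)   # 0.5 -> exactly half the pixels; reprojection fills the rest
```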
TruthInsider  +   261d ago
Killzone single player is full 1080p @ 40 - 60fps. Only multiplayer is 960x1080.
Titanfall is 792p @ 35 - 60fps with near-constant tearing.

Infamous is full 1080p at around 45fps.
Ryse is 900p at 25fps

COD Ghosts is 1080p @ 40 - 60fps (PS4)
COD Ghosts is 720p @ 30 - 60fps (Xbone)

BF4 is 900p @ 40 - 60fps (PS4 10fps advantage average)
BF4 is 792p @ 30 - 60fps (Xbone)

AC4 is 1080p at solid 30fps (PS4)
AC4 is 900p at solid 60fps (Xbone) (reduced/missing assets)

Watchdogs is 900p at 30fps (PS4)
Watchdogs is 792p at 30fps (Xbone reduced/missing assets)

MGS:GZ is full 1080p @ solid 60fps (PS4)
MGS:GZ is 720p @ 60fps (Xbone)

Tomb Raider is 1080p at 35 - 60fps (PS4)
Tomb Raider is 1080p at 18 - 30fps (Xbone)

Etc etc...
#1.3.9 (Edited 261d ago ) | Agree(4) | Disagree(3) | Report
ziggurcat  +   261d ago
@KNWS:

"Killzone 30 frames (in fact Sony is getting sued by the courts for lying about the game claiming the multiplayer was 1080p when it was 720p in reality."

uh... no. killzone is native 1080p. where are you getting 720p from?

and the lawsuit is going to fall flat on its face (if it hasn't already).
#1.3.10 (Edited 261d ago ) | Agree(3) | Disagree(2) | Report
dirkdady  +   262d ago
This is a little different, isn't it? The PS3 on paper was more powerful but difficult to develop for, while the Xbox One is less powerful hardware-wise and still slightly more difficult to develop for.

If the hardware were more powerful, I suspect using eSRAM wouldn't be as difficult, or they wouldn't need it at all.
MorePowerOfGreen  +   262d ago
What's interesting is that the Xbox 360 had the better GPU and the weaker CPU compared to the PS3, yet that CPU was touted as the reason the PS3 was more powerful. Now it's supposedly the other way around with the XB1 and PS4.

Xbox will never be allowed to be seen as having an advantage over PlayStation. The 360 had a better GPU and better multiplats, yet the PS3 was "more powerful" because of its CPU; the XB1 doesn't get the same treatment.

Fanboys used exclusives to negate the CPU and GPU points last gen. So is the XB1 more powerful because it's coming out with exclusives that are open, bigger and more advanced/complex, with social features pushed to the next level (games that are doing much more at once/on screen)?

Bu Bu BU the PS3 was harder to develop for, but not the XB1? ;)

None of this will matter once devs start using DX12 with the XB1's GPU fully unlocked for the first time. We already saw what the GPU unlock did with Destiny, and that was just a last-minute tweak, versus start-to-finish development with 100% of the GPU. Now imagine GPU unlock + DX12 + Azure.
#1.4.1 (Edited 262d ago ) | Agree(18) | Disagree(32) | Report
Majin-vegeta  +   262d ago
@MPG: "now imagine GPU unlock + DX12 + Azure"

Too bad your own Xbox messiah has already come out and denied any sort of significant change with DX12 on the X1.

http://www.dualshockers.com...
TheRedButterfly  +   262d ago
I don't think anyone said that the devs were lazy... just that the platform was hard as dicks to develop for (especially in its early days)
TheXgamerLive  +   262d ago
eSRAM is a good add-on considering they used DDR3, but it's too little to do what's needed. It's getting done, but this is live and learn for MS. They'll never make this mistake again, nor will Sony.
That aside, I'm fine if a game is 720p or higher. It all upscales to 1080p and looks great, so no worries here. It's all about playability. Past Witcher games were... complicated, so I hope they've made the Witcher 3 control scheme much easier to learn. On-the-fly decisions need simplicity in the controller.
DarkHeroZX  +   261d ago
It's not the same thing. The Cell was a completely new concept in how you program for a CPU; a CPU that was more powerful than the GPU and could take on GPU-related tasks was unheard of. The X1's issue comes from not having enough of this fast RAM, so devs are forced to find little tricks to get such a trivial task done.
ziggurcat  +   261d ago
@axios:

It's absolutely nothing like the Cell processor, so don't kid yourself.

And they were lazy because the excuse was that it was "hard." They refused to put the time and effort into learning an entirely new architecture, which was made very apparent any time ND released a game.
ziggurcat  +   261d ago
axios:

"@ Ziggurcat, you just described what some developers said about X1, must be lazy as almost 2 dozen titles are already 1080p."

xbone's architecture isn't entirely different, though. it's very similar to X360's architecture (just replace eSRAM with eDRAM, and you have the X360 architecture), so it's nothing like what devs were saying about the cell architecture. they're not saying xbone's architecture is "hard", they're just saying that it takes a bit more time. optimising for a tiny amount of eSRAM is clearly not the same (and less complicated) than having to deal with several SPUs.
SilentNegotiator  +   261d ago
Except that developers making use of the Cell resulted in games with graphics better than anything on the Xbox 360.
#1.10 (Edited 261d ago ) | Agree(5) | Disagree(1) | Report | Reply
Iffyrhyme  +   261d ago
Hey, you're right, I agree with you. I remember people saying that the PS3 was more powerful but complicated. I owned a PS3, so I would feel a bit disappointed when a third-party game wasn't pushed to its limits.
XiSasukeUchiha  +   262d ago
Sounds like the Cell processor all over again; let it rest in peace, and hopefully this bottleneck can be fixed.
ABizzel1  +   262d ago
Not really, because eSRAM is something well known to many developers, and at the end of the day it's still RAM.

The problem is trying to code for such a small pool of it while trying to match high-end PC graphics at 1080p. The fact that it's an APU meant RAM was very important to overall performance, and MS went with the safe bet of higher-speed DDR3; the eSRAM would have helped if it were more than 32MB (but they wouldn't risk losing CPU/GPU cores, which would have put them at another disadvantage).

The GPU in the XBO wasn't designed for that. The XBO is running roughly an R7 260 (w/o the X), and it's meant for gaming at 720p - 900p. That's not to say it can't run games at 1080p, but often those will be less graphically demanding games, or 1080p will come at the cost of image quality.

If you really want to know the XBO's real performance, simply look up the R7 260.

Here's a description of it from guru3d:

The R7-260 is a fun, almost entry-level product that is going to allow you to game at 1280x720 to say 1600x1200. If you stick to these resolutions then you may apply good image quality settings. At an MSRP of 79 EURO + VAT (or $109 USD) however it is a fair deal. You'll be looking at perf in-between say a Radeon HD 7770 and Radeon HD 7790. It also would be a really excellent HTPC card quite honestly. The small PCB footprint and cooler would even allow Mini-ITX setups. Then again at 75 Degrees C it does run a bit hot for such a setup. Overall though, a fun little card but don't expect to game properly at 1080P.

http://www.guru3d.com/artic...
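To put the 32MB figure in perspective, here is a back-of-the-envelope sketch. The G-buffer layout (five 32-bit full-screen targets) is an assumption for illustration, not any developer's actual setup; real engines vary a lot:

```python
# Why 32 MB of eSRAM is tight for 1080p rendering (illustrative numbers only).
ESRAM_BYTES = 32 * 1024 * 1024          # 33,554,432 bytes

def gbuffer_bytes(width, height, targets=5, bytes_per_pixel=4):
    """Size of an assumed deferred-rendering G-buffer: `targets` full-screen buffers."""
    return width * height * targets * bytes_per_pixel

print(gbuffer_bytes(1920, 1080) / ESRAM_BYTES)   # ~1.24 -> does not fit in eSRAM
print(gbuffer_bytes(1600,  900) / ESRAM_BYTES)   # ~0.86 -> fits, which is why 900p is common
```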
headblackman  +   262d ago
I mean no disrespect, nor am I saying that your claims aren't true, but has Microsoft ever given out the specs of the X1 GPU? I personally have never seen them, only assumptions about what it could be equivalent to.
ABizzel1  +   261d ago | Helpful
@headblackman

First off, thanks for simply being mature about your comment.

As far as the XBO specs go, MS themselves have released some data, such as that the XBO would have an 8-core AMD processor, an AMD GPU, 8GB of DDR3 RAM, and 32MB of eSRAM, along with the speeds for both sets of RAM (64GB/s DDR3, 192GB/s eSRAM).

As far as the gaming internals go, that's all MS has given away. Everything else was found out through teardowns of the console, compare and contrast, and reverse engineering.

The confirmations started rolling in when the XBO specs were being compared to the PS4 specs, and MS execs (specifically Albert Penello), the marketing team, and developers started chiming in on the differences in ROPs, TMUs, TFLOPS, shader cores, etc.

http://wimages.vr-zone.net/...
http://i2.wp.com/www.sonyru...

Ultimately this is why MS did their first performance boost and overclocked the CPU and GPU, which 1. proved the PS4 was more powerful, and 2. gave the technologically challenged the impression that the XBO had caught up to the PS4.

(see how the TFLOPS jumped from 1.23 up to 1.31)
http://cdn.eteknix.com/wp-c...

After the consoles finally went on sale, everything that had been treated as rumor was shown to be true, as many sites finally had the hardware in hand and could see what was powering these consoles. The power difference has since been borne out in several multiplatform games, which to date have had higher resolutions, framerates, or both on the PS4 compared to the XBO.

http://www.ign.com/wikis/xb...

Now look at the IGN list and check the performance of Tomb Raider on the XBO at 1080p against the performance of Tomb Raider on a gaming PC powered by an R7 260 at 1080p (below).

http://www.guru3d.com/artic...

Pretty much identical, with the 260 edging out the XBO by 8 fps.

The removal of Kinect gave the XBO a 10% GPU boost, bringing it up from 1.31 TFLOPS to its current 1.441 TFLOPS. Now compare the XBO specs (keeping in mind its 1.441 TFLOPS) to the R7 260 specs.

(Scroll down until you get to the Specifications section)
http://gpuboss.com/graphics...

http://www.insiderp.com/ps4...

Exact same number of shading units, exact same number of render output units, same compute units, and even though it's missing from the XBO chart, they also have the same number of texture mapping units. The main difference is clock rate, which is why the TFLOP totals differ, since the R7 260 can run at a higher clock than the XBO without worrying about overheating a console.
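The TFLOPS numbers quoted in this thread fall out of a simple formula: shader count x 2 FLOPs per clock (one fused multiply-add) x clock speed. A quick check, assuming the shader and clock figures cited in this thread; the PS4 line (1,152 shaders at 800 MHz) is the commonly cited spec rather than something stated here:

```python
# TFLOPS = shaders * 2 FLOPs per clock (FMA) * clock in MHz / 1,000,000
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1_000_000

print(tflops(768,  800))   # ~1.23 -> XBO as first reported
print(tflops(768,  853))   # ~1.31 -> XBO after the clock bump
print(tflops(1152, 800))   # ~1.84 -> PS4, for comparison (assumed spec)
```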
Dread  +   261d ago
Thanks for the explanation, ABizzel1.

Bubbles up!!
imt558  +   261d ago
Quote :

"...The removal of Kinect gave the XBO a 10% GPU boost, bringing it up from 1.31 TFLOPS to it's current 1.441 TFLOPS..."

WRONG!

The Xbone has had 1.31 TFLOPS from the start (or rather, after the upclock; before the upclock the Xbone GPU had 1.23 TFLOPS). There IS NO 10% boost after the Kinect removal! They just freed up some GPU resources.

10% was reserved for Kinect before, so games effectively had 1.18 TFLOPS. Now the Kinect reservation is GONE.

You gave the link from insiderp! http://www.insiderp.com/ps4...

Quote :

FLOPS : 1.3 TF (10% of the SYSTEM RESOURCES DEDICATED to Kinect)

Get it?

EDIT :

"...It's possible, but the 10% boost would put it above the 1.32 TFLOPS, and if it was originally 1.18 TFLOPS it puts it below the 1.32 TFLOPS (although in this case other components could bump it up to 1.32).

I know it says 1.3 with 10% of the system resources dedicated to Kinect, but there's nothing stating that the 1.3 TFLOPS is including the Kinect boost or not, unless I'm missing something.

So for now I'll say 1.32 TFLOPS - 1.441 TFLOPS. ..."

Yes, you are missing something! Your math is just really bad.

I can't believe what I'm reading here. It is just simple math. The 10% GPU "boost" is Microsoft PR.

Once again.

The Xbox One GPU is 1.31 TFLOPS (from the start). 10% of that 1.31 was RESERVED for Kinect before. You know what RESERVED means, right? So 10% of 1.31 is 0.131, which leaves 1.179 TFLOPS. Now MS has FREED UP that 10% Kinect GPU reservation for developers, NOT ADDED any new GPU resources, and developers have the full 1.31 at their disposal.

There is NO 1.32 TFLOPS! There is NO 1.38 TFLOPS! There is NO 1.44 TFLOPS!!! It is just 1.31 TFLOPS! Get it now?

Xb1: 1.18 TF GPU (12 CUs)
Xb1: 768 Shaders
Xb1: 48 Texture units
Xb1: 16 ROPS
Xb1: 2 ACE/ 16 queues
Xb1: 13.65GPixels/s
Xb1: 40.90GTexels/s
#2.1.4 (Edited 261d ago ) | Agree(9) | Disagree(2) | Report
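Spelled out, the arithmetic imt558 is making looks like this, taking the thread's own 1.31 TFLOPS and 10% reservation figures at face value:

```python
# The Kinect reservation was a slice of the existing GPU, not extra hardware.
total_gpu_tflops   = 1.31                        # after the 853 MHz upclock
kinect_reservation = 0.10 * total_gpu_tflops     # ~0.131 TFLOPS held back for Kinect
games_before       = total_gpu_tflops - kinect_reservation   # ~1.179 TFLOPS usable by games
games_after        = total_gpu_tflops            # 1.31 TFLOPS once the reservation is freed
print(games_before, games_after)                 # ~1.179 and 1.31 -> freeing it adds no new TFLOPS
```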
ABizzel1  +   261d ago
@Dread

No problem. I have no hidden agenda. I'm a multi-console owner (the XBO is all I have left to buy this generation, and I'm getting it in August) and a PC gamer. Hardware is simply hardware; there's no getting around it. The XBO isn't a bad system at all, none of them are. But for gaming, the XBO's specs are suited to 900p @ 30 - 60fps at high-end PC settings.
ABizzel1  +   261d ago
@imt558

It's possible, but the 10% boost would put it above 1.32 TFLOPS, and if it was originally 1.18 TFLOPS that puts it below 1.32 TFLOPS (although in this case other components could bump it up to 1.32).

I know it says 1.3 with 10% of the system resources dedicated to Kinect, but there's nothing stating whether that 1.3 TFLOPS is before or after the Kinect reservation, unless I'm missing something.

So for now I'll say 1.32 TFLOPS - 1.441 TFLOPS.
ABizzel1  +   261d ago
@imt558

First off, you don't have to be rude. I understood what you were saying, but obviously you're the one with the misunderstanding.

I was saying there's no way of knowing if the XBO's 1.31 TFLOPS is from before or after the Kinect boost.

The R7 260 has 1,536 GFLOPS of performance. As I've shown, the XBO has practically identical specs outside of the clock rates (1 GHz vs the XBO's original 800 MHz). They basically underclocked the GPU by 20%.

1,000 MHz (aka 1 GHz) * 0.8 (aka 80%) = 800 MHz

When you do the same math with the FLOPS:

1,536 GFLOPS * 0.8 (again, 80%) = 1,228.8 GFLOPS, or......... 1.23 TFLOPS, as originally announced.

Then they increased the clocks by around 6.5% (800 * 1.065 = 852 MHz, 1 MHz away from the current 853 MHz), which is where the 1.31 TFLOPS came from (1,309.95 GFLOPS to be exact).

"Xbox One GPU 1.31 TFLOPS ( from the start ). 10% of that 1.31 were RESERVED for Kinect before. You know what RE
SERVED means, right? So 10% of that 1.31 is 0.131 which is 1.179 TFLOPS. Now, MS FREED UP that 10% Kinect GPU resources for developers, NOT ADD ANY new GPU resources and developers have full 1.31 at their disposal."

Like I said, this is a flat out lie, because the increase of the clock speeds for the CPU and GPU wouldn't have been needed or done if the XBO was already doing 1.31 TFLOPS. As you said 10% of 1.31 TFLOPS is 1.179 TFLOPS, which completely goes against what SEVERAL sources have said the XBO was running at prior to the announcement of the console.

1.23 TFLOPS (Jan 2013)
http://www.vg247.com/2013/0...

1.23 TFLOPS (May 2013)
http://www.anandtech.com/sh...

1.23 TFLOPS (July 2013)
http://www.neowin.net/news/...

And then the boost.

Clock speed from 800MHz to 853MHz
http://www.ign.com/articles...

Clock Speed from 800MHz to 853MHz
http://www.techspot.com/new...

http://i.qkme.me/35qo9j.jpg
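The clock-scaling math in that reply, condensed. The 1,536 GFLOPS at 1 GHz desktop R7 260 baseline is ABizzel1's own assumption, carried over here:

```python
# Scale the desktop R7 260's throughput down to the XBO's clocks.
r7_260_gflops, r7_260_clock_mhz = 1536, 1000

xbo_launch  = r7_260_gflops * 800 / r7_260_clock_mhz   # 1228.8 GFLOPS -> the 1.23 TF figure
xbo_upclock = xbo_launch * 853 / 800                   # ~1310.2 GFLOPS -> the 1.31 TF figure
```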
Letthewookiewin  +   262d ago
1080p on PS4 with high graphics settings @30fps. 900p on X1 with med-high graphics settings @25-30fps. Calling it now.
d_g  +   262d ago
edit

1080p on PS4 with high graphics settings @30-16fps

https://www.youtube.com/wat...
ABizzel1  +   262d ago
@d_g

You forgot to add the note from the About section:

Note: In-production code, final game may vary.
KNWS  +   262d ago
Lol at your frame rates, clueless Sony fanboy. I bet you right now it will be 1080p on both consoles, 30 frames for both.
manchesterman22  +   262d ago
I don't know where some of you people get this info from!! I have a PS4 and I find the frame rate on some of the games shocking compared to my Xbox; it's almost as if it's a forced 1080p at the expense of smooth gameplay. I just wish both consoles would sort their shizzle out, to be honest, so I can enjoy true next-gen gaming.
asyouburn  +   262d ago
Which games have shocking frame rate in comparison?
manchesterman22  +   262d ago
COD Ghosts, Need for Speed, AC4, FIFA. It's not all of them, but it is noticeable when you have played both versions. Maybe I have a broken PS4, but if that were the case wouldn't it happen on all games?

Also, I'm not really interested in the disagrees; they are either in denial or can't spot frame rate issues, but I'm not the only one who has seen them.
TheXgamerLive  +   262d ago
Don't forget Tomb Raider Definitive Edition: the XB1's locked 30fps was better and smoother than the unlocked PS4's 13-to-60fps stumble.
manchesterman22  +   261d ago
I'm glad someone agrees. I never played the PS4 version of Tomb Raider, but I do know that the PS4 version of Thief suffered too; I remember one of my friends got rid of it and bought it for the Xbone. I'm not a fanboy or trying to cause a war between you children on here, I'm simply stating a fact that I noticed between the different console versions of the same game.
Chevalier  +   261d ago
Maybe because those were launch titles? Did Watchdogs or newer titles by those publishers suffer as well?
manchesterman22  +   261d ago
I don't know, dude, I gave up. I wanted to play stable versions of the games, so I haven't tried anything since; I just get them on my Xbox. I will be getting TLOU on Friday, but I know that will not suffer any frame rate problems (I hope). As for multiplatform games, I will get them on Xbox only from now on and get exclusives on PS4, the same as I did with the 360/PS3.
windblowsagain  +   262d ago
Just Use the secret sauce.
Nicaragua  +   262d ago
They can't because the sauce bottle was stacked on top of the hidden GPUGPU, which is too high for the extra compute units to tron.
windblowsagain  +   262d ago
LMAO EPIC
Goku781  +   262d ago
Sounds like another Xbox One problem, the all in one problem system.
GundalfDeGrej  +   262d ago
"Hey guys did you know that the Xbox One is less powerful than the PS4?"

...and boom, this article goes to the top of the page.

In all seriousness though, I'll probably get this game for PC but I hope things will improve for the X1.
Stapleface  +   262d ago
Every time I see that guy's last name I think of Turok: Dinosaur Hunter and how a new version of that would be pretty awesome.
Mega24  +   262d ago
Turok needs a reboot!
Genuine-User  +   262d ago
This game will rock regardless of what current gen platform it's on.
strangeaeon  +   262d ago
Marking anyone that says "secret sauce" as trolling from now on, fair warning.
MegaRay  +   261d ago
You're trolling... -bubble.
fossilfern  +   262d ago
Funny how everyone seems to forget the early days of the PS3, or the fact that the PS2 was the weakest machine of its generation and it didn't do it any harm.
Spotie  +   261d ago
Nobody has forgotten. It's just that those situations were very different.

For example: the PS3 was more powerful, but harder to code for.

The PS2 was less powerful, but had all the third party support and big names.

The XB1 has neither of these things.

Oh, there's one other big thing the XB1 is lacking:

It's not a Playstation.
fossilfern  +   258d ago
"Xbox has neither of these things" thats all relative, I dont really like Sonys IPs so its down to taste. You are just spewing fanboy nonsense.
lemoncake  +   262d ago
Technically all the consoles are disappointing this gen; the Xbox 360 and PS3 were a lot more advanced relative to the tech of their day. The only slightly interesting spec is the ddr5 in the PS4. I don't know what Microsoft were smoking when they did the spec for their machine; surely these problems showed up during testing.

After the 360's success Microsoft really needed to put the pressure on Sony, but they have completely fumbled a great opportunity. Hopefully the engines and other tech improve over time to overcome the eSRAM.
IrishSt0ner  +   262d ago
I agree with your premise, although it's GDDR5; DDR5 doesn't exist yet. DDR RAM is much better for OS functionality than GDDR, and I expect that was part of the 'justification' for going with DDR3 plus eSRAM for high bandwidth.

Personally I think the only chance MS has is to go cheaper on the console and push a better-value service. I can't see true performance parity at all this generation... things like EA Access might turn out to be a great avenue.
MasterCornholio  +   262d ago
"I don't know what Microsoft were smoking when they did the spec for their machine"

That's easy to answer.

They were basically thinking this: Xbox One + Kinect = $499. They had to make a choice between producing a more powerful console or including Kinect in the package. They chose the latter, which is why the Xbox One is weaker than the PS4. Had they gone the other way, they would have had a system more powerful than the PS4.

P.S. Sony was in a similar situation, but they gave more importance to the PS4's hardware than to creating a bundle with the camera.
#11.2 (Edited 262d ago ) | Agree(9) | Disagree(4) | Report | Reply
n4rc  +   262d ago
GDDR5.

And many rumors tend to agree the PS4 was originally going to have 4GB. If MS planned around that, then it's a whole different story.
strangeaeon  +   262d ago
I agree; near the beginning of gen 7 the consoles were pumping out better-looking games than PCs.
lemoncake  +   261d ago
When I first saw Kameo on the 360 it blew my mind; it was such a beautiful-looking launch title, and my PC had no chance of competing with it.
Walker  +   262d ago
eSRAM is just a bottleneck and the Xbone's biggest issue for 1080p gaming!
windblowsagain  +   262d ago
It's not just that; the weak GPU doesn't help either, and neither does the slower DDR3 RAM.
gamingisnotacrime  +   262d ago
Regardless of the XONE's specs, it's clear that the PS4 was better thought out. Developers had their say, and now they have a platform to go wild on, while the XONE is a platform to grind against.
corroios  +   262d ago
Not just that. To reach 1080p, studios are forced to cut back the GFX: AA, AO, textures and more...
stormplyr  +   262d ago
Eventually GamingBolt is going to run out of developers to ask about the eSRAM in the XB1.
#15 (Edited 262d ago ) | Agree(6) | Disagree(0) | Report | Reply
ATi_Elite  +   261d ago
Today at GamingBolt we asked gamers about eSRAM in the XB1:

"it smells funny and looks weird"

There you have it, folks, the truth about eSRAM! The secret is out!
ATi_Elite  +   261d ago
" In fact one such developer stated that Microsoft cheaped out on the RAM."

Microsoft did NOT cheap out on the RAM (well, sorta kinda); they cheaped out on the GPU!

8GB of DDR3 is about as capable as 8GB of GDDR5, a slight bit slower but capable. The GPU determines resolution, NOT the RAM!

GDDR5 is based on DDR3 but made with much larger bandwidth to handle rendering, video, textures and graphics, unlike DDR3, which is general-purpose multitasking RAM.

(DDR4 is GDDR6)

You could put 12GB of GDDR5 in the XB1 right NOW and it still couldn't run BF4 at 1080p, because the GPU can't handle it.
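For the bandwidth side of that argument, the raw numbers look roughly like this. The XB1 figures are the ones quoted earlier in the thread; the PS4 number is the commonly cited GDDR5 spec, so treat both as approximate:

```python
# Peak memory bandwidth in GB/s (approximate, as cited in this thread).
xb1_ddr3  = 68     # main 8 GB DDR3 pool (the thread quotes 64; MS's figure is ~68)
xb1_esram = 192    # the 32 MB eSRAM scratchpad
ps4_gddr5 = 176    # single unified 8 GB GDDR5 pool
print(ps4_gddr5 / xb1_ddr3)   # ~2.6x more main-memory bandwidth, no tiny fast pool to juggle
```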
VealParmHero  +   261d ago
Not too surprised. It's nice to hear a dev just be honest and realistic about the situation. I expect we may see differences similar to the Destiny beta, where the PS4 looks better but the XB1 version is surprisingly not that far behind... though I have my doubts about 1080p. I would say 900p on XB1 vs 1080p on PS4, where the PS4 version might even have better AA, etc.

I'm not sure which platform to get this one on.
LogicalReason  +   261d ago
If you havin' 1080p problems I feel bad for you, son.

I got 99 problems but an Xbox ain't One.
MonstaTruk  +   261d ago
Say it loud. The truth hurts, Xbots. You'll at least have access to 4 old EA games for $4.99/month, though... :-/
#19 (Edited 261d ago ) | Agree(5) | Disagree(3) | Report | Reply
larrysdirtydrawss  +   261d ago
Can we just say what we're all thinking? Can we just call it the piece-of-shit box, because that's what the XOne is... I can't believe anyone will pay $400 for a console with that GPU in it... MS knows they messed up with the Kinect junk; so much time, energy, effort and resources wasted on that garbage. MS learned their lesson, and their next console will have a monster GPU in it like the PS4 has now... just get a PS4, people, it's the console master race.
#20 (Edited 261d ago ) | Agree(2) | Disagree(1) | Report | Reply
thexmanone  +   261d ago
Ah, now I see why you have one bubble.
#20.1 (Edited 261d ago ) | Agree(2) | Disagree(1) | Report | Reply
Reaper29r  +   261d ago
The game will look great regardless of what system it's on. I haven't decided which system I'm going to get it on yet, but I prefer smoother frames over resolution (which is usually the only place the systems really differ), so I hope they don't sacrifice frame rate for resolution on either platform (looking at you, Thief). At worst I'm sure the XB1 will pull off 900p, so XB1-only gamers shouldn't worry; 900p isn't as bad as people make it sound when the game is running in front of you. When I was mostly a PC guy I used to play at 1050p (the 16:10 equivalent of 900p) and my games still looked good. It's also easier (at least on PC) to run a lower res with better AA, which makes things look smoother, albeit softer. 1080p is awesome, but I think it's a shame the average gamer is so dead set on it these days (it's a console, not a super high-end rig), because that's a lot of juice that could go towards better stuff.
#21 (Edited 261d ago ) | Agree(1) | Disagree(1) | Report | Reply
CaptDnaDonut  +   261d ago
I like how they put what the website said in quotes, as if the people from CD Projekt Red had said it. That's really deceptive. Anyway, if they can get 1080p on the Xbox One with a 30fps lock I would be happy; if not, I would still prefer a 30fps lock.
FayZ_  +   261d ago
The problem with eSRAM is that it's designed for tile-based deferred rendering, and a lot of developers will have to learn to adapt to this type of method. The reason they chose this fixed design could be cloud gaming: maybe moving and downloading data through the super-fast, low-latency eSRAM in small chunks suited to the average broadband connection. Hell, I'm no dev, but with all the data move engines it seems the Xbox One is well designed for cloud gaming, and not for the standard 'throw it in the large chunk of memory' approach most PC developers usually take.

I think they should have just used GDDR5 as well, to avoid this type of issue, since no one is currently using cloud compute. But I suppose the DDR3 choice could be down to lower latency.
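A minimal sketch of the tiled idea FayZ_ is describing: split a big render target into strips small enough to pass through the 32MB eSRAM. The bytes-per-pixel and budget numbers are made up for illustration; they are not from any actual XDK documentation:

```python
import math

ESRAM_BYTES = 32 * 1024 * 1024

def strips_needed(width, height, bytes_per_pixel=16, esram_budget=ESRAM_BYTES // 2):
    """How many horizontal strips a render target must be split into so each
    strip fits in the slice of eSRAM set aside for it (illustrative figures)."""
    total = width * height * bytes_per_pixel
    return math.ceil(total / esram_budget)

print(strips_needed(1920, 1080))   # 2 -> render/resolve the target in two passes
```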
FBC   261d ago | Spam
cruzngta  +   261d ago
eSRAM is cool if there is enough of it for graphical use. 32MB is not much, and that is why they keep having to optimize all the time to try and use that small amount. They should have put in 64 or 128MB, and that would have made these graphics comparisons pretty interesting. In the end I am sure the Bone will get 900p for this game while the PS4 gets the usual 1080p. Just a thought.
