Many gamers around the world were surprised by the price point of the PS4, and it looks like Hideo Kojima was taken by surprise by Sony's unveiling as well. On the other hand, he's not worried about the difference in power between the PS4 and Xbox One.
I was pleasantly surprised by the PlayStation 4 price as well, especially after hearing the Xbox One would cost 500 dollars.
I actually went on record saying that the PS4 would have costed 349, but in the end I was nearer to it than those who argued 500 :D
I went on record and said the console would be 399 and they would not have any of the Xbox restrictions. I was somewhat surprised.
PS4 plays vinyl? If that's the case, then it's the greatest machine ever designed... haha
"...would have costed 349" I think you meant to say "...would have cost 349."
I had the XB1 at $450 and the PS4 at $400, and I guessed a Wii U price drop, which won't be happening until probably late 2014 at the earliest. 1/3 isn't too bad..
I wonder if there will be more SKUs, with $499 and $599 options?
I was the same. I wasn't surprised when it turned out to be cheap. If you look at how Sony went out of their way to reduce the Vita's price as much as possible, you'd know the PS4 would be cheap.
He's a big Hollywood movie fan, but I'm afraid he's just as misguided on motion control as MS is. Interesting read for anyone interested => http://www.theawl.com/2013/...
Can you find info on the PS4 camera? Sony never went into detail about what exactly it's capable of. I hope it can do head tracking; many people use head tracking on PC, and it's an amazing game-changer for the FPS genre.
Search for Playroom on YouTube; there's some cool stuff the camera does.
@Donnieboi It's so the NSA and Obama can keep tabs on you in the privacy of your own home. When you factor in how PRISM monitors all internet communication, including videos, Skype, etc., I sure as hell don't want a camera on a console (or a webcam, etc.)
He's not going to put down MS anyway; hell, he opened the X1 show, so why would he? He has a good relationship with MS, bringing his games over to Xbox. Why he didn't show it running on Xbox One, no one knows; he demoed MGS4 on PS3. I'm sure he has a contract with MS, and in that contract there are things like not putting the X1 down and praising the system.
It's in Kojima's best interest to downplay the differences. People thinking this proves they're the same are not thinking this through. Besides, I doubt there'd be that much difference in near-launch titles anyway, but it will show soon enough. People can like MS exclusives; that's great. But please stop trying to delude others who plainly see the PS4 is not just more powerful but also easier to develop for. We aren't talking Cell's difficulty here. If anything, the Xbone has the trickier architecture that may make it more difficult to develop for, though I doubt it's as notoriously difficult as the PS3 used to be.
Joystick, read this: http://www.itproportal.com/... Both the Xbox One and PS4 use the same Jaguar CPU and a 7000-series GPU. The One uses DDR3, which is CPU RAM; the PS4 uses GPU GDDR5. Neither is ideal for a combined CPU and GPU, which is why the Xbox One uses a buffer to increase bandwidth. The two consoles are nearly identical this generation. Microsoft has been making easy-to-develop x86-based consoles from the start. To say the PS4 is easier to develop for is silly, since they copied the Xbox design. If you get bored and want the information, read up on the difference between DDR3 and GDDR5 online. Gaming PCs use DDR3 on the motherboard for CPU calculations, and high-end video cards use a much smaller amount of GDDR5 for the GPU. If GDDR5 were better RAM, we would be installing it on the motherboard. I paid $200 for the 16 gigs of Vengeance DDR3 in my PC. Do you think I would have had a problem buying GDDR5 if it was available and better?
I don't ever recall Kojima talking negatively about any console. At most he might say how it could be limited in one way or another, but he always says there are ways around it. He's a techie kind of guy. He likes to utilize the tech that's available, and he's quite good at doing it in unique ways. Even mundane things get a notice from him.
Sigh... OK, I'll bite. From the article you linked: "The PS4 has 8GB of GDDR5 RAM, providing 176GB/s of bandwidth to both the CPU and GPU. The Xbox One MOSTLY ameliorates this difference with 32MB of high-speed SRAM on the GPU, but it will be a more complex architecture to take advantage of."

Simply stated, the X1 can theoretically almost bypass the difference in RAM (due to eSRAM and Move Engines), but it requires clever coding by the game developers due to the "more complex architecture". And the article is simply comparing specs, not the actual RAM available for games: the X1's 5GB of DDR3 versus the PS4's 7GB of GDDR5. And as you said, DDR3 is for the CPU, which is weak in both systems because they are streamlined consoles that don't need the beefy CPUs of PCs to push all kinds of data.

What will make a significant difference is the PS4's more powerful GPU, in combination with more available RAM (2 extra GB for games), with more bandwidth (GDDR5), in a less complex architecture. The PS3 was technically more powerful than the 360, but it took some truly talented and dedicated devs to showcase that system. Now the PS4 is more powerful while being less complex than the X1. This is not slant or spin. Let's not move goalposts here. The differences between the PS4 and X1 will begin to show sooner rather than later as we get into the new generation.
Another article I just read said that there is no external power supply for the Xbox One. Anyone know if the PS4 will still have the brick?
@Joysticks The principal differences are:
•DDR3 runs at a higher voltage than GDDR5 (typically 1.25-1.65V versus ~1V)
•DDR3 uses a 64-bit memory controller per channel (so, a 128-bit bus for dual channel, 256-bit for quad channel), whereas GDDR5 is paired with controllers of a nominal 32 bits (16 bits each for input and output). But whereas the CPU's memory controller is 64-bit per channel, a GPU can utilise any number of 32-bit I/Os (at the cost of die size) depending upon the application (2 for a 64-bit bus, 4 for 128-bit, 6 for 192-bit, 8 for 256-bit, 12 for 384-bit, etc.). The GDDR5 setup also allows for doubling or asymmetric memory configurations. Normally (using this generation of cards as an example) GDDR5 memory uses 2Gbit memory chips for each 32-bit I/O (i.e. for a 256-bit bus/2GB card: 8 x 32-bit I/Os, each connected by a circuit to a 2Gbit IC = 8 x 2Gbit = 16Gbit = 2GB), but GDDR5 can also operate in what is known as clamshell mode, where the 32-bit I/O, instead of being connected to one IC, is split between two (one on each side of the PCB), allowing for a doubling of memory capacity. Mixing the arrangement of 32-bit memory controllers, memory IC density, and memory circuit splitting allows for asymmetric configurations (192-bit, 2GB VRAM, for example).
•Physically, a GDDR5 controller/IC doubles the I/O of DDR3. With DDR, an I/O handles an input (written to memory) or an output (read from memory), but not both on the same cycle. GDDR handles input and output on the same cycle.
The memory is also fundamentally set up for the application it serves: system memory (DDR3) benefits from low latency (tight timings) at the expense of bandwidth; for GDDR5 it's the opposite. Timings for GDDR5 would seem unbelievably slow in relation to DDR3, but the speed of VRAM is blazing fast in comparison with desktop RAM. This has resulted from the relative workloads that a CPU and GPU undertake.
Latency isn't much of an issue with GPUs, since their parallel nature allows them to move on to other calculations when latency cycles cause a stall in the current workload/thread. A CPU running GDDR5 memory is at a great disadvantage because of the high latency caused by these memory timings. Also, the performance of a graphics card, for instance, is greatly affected (as a percentage) by altering the internal bandwidth, yet altering the external bandwidth (the PCI-Express bus, say lowering from x16 to x8 or x4 lanes) has a minimal effect. This is because there is a great deal of I/O (textures, for example) that gets swapped in and out of VRAM continuously; the nature of a GPU is many parallel computations, whereas a CPU computes in a basically linear way.

Nobody really seems to be mentioning the fact that even though the PS4 has very high memory bandwidth (176GB/s), the GPU isn't nearly powerful enough to utilize that much bandwidth. My PC's GTX 680 has a memory bandwidth not much higher than the PS4's (208.3GB/s). At face value, one would think the PS4's Radeon HD 7850 (the flops are the same) would only be slightly slower, but the GTX 680 wipes the floor with it. I still think the addition of eSRAM will do big things to bridge the bandwidth gap, while the DDR3's fast timings will give the XBONE more number-crunching power than the PS4.

I know all the PS4 fanboys are going to raise hell over this, but the sooner you get used to the fact that these systems are going to be very close performance-wise, the easier it will be to get over the disappointment when the systems are actually available and the reviews are released. And one last note... I plan on buying a PS4 when they're released. But until then, many of you should try to be at least somewhat objective in your analysis of these systems. Blind faith in a company or a product always results in being let down, followed by unhinged nerd rage.
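The capacity arithmetic described a couple of comments up (number of 32-bit I/Os times IC density, doubled in clamshell mode) can be sketched as a quick calculation. This is purely illustrative: the function name and the 2Gbit default are assumptions based on the generic examples in the comment, not any console's actual configuration.

```python
# Rough sketch of the GDDR5 capacity arithmetic described above.
# Assumptions: one memory IC per 32-bit I/O (two per I/O in clamshell mode),
# with 2Gbit ICs as the default density, per the comment's examples.

def gddr5_capacity_gb(bus_width_bits, ic_density_gbit=2, clamshell=False):
    """Total VRAM in GB for a given bus width and memory IC density."""
    io_count = bus_width_bits // 32      # number of 32-bit memory controllers
    ics_per_io = 2 if clamshell else 1   # clamshell splits each I/O across two ICs
    total_gbit = io_count * ics_per_io * ic_density_gbit
    return total_gbit / 8                # 8 Gbit per GB

print(gddr5_capacity_gb(256))                  # 256-bit bus, normal mode -> 2.0 GB
print(gddr5_capacity_gb(256, clamshell=True))  # clamshell doubles it -> 4.0 GB
```

This reproduces the worked example from the comment: a 256-bit bus is 8 x 32-bit I/Os, 8 x 2Gbit = 16Gbit = 2GB, and clamshell mode doubles that.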
@Andrew I don't think you're trolling, but people see, ooh, numbers, or read the textbook definition in school or on some website, attempt to decipher them, bend logic, and ignore what is now widely accepted. And what is generally accepted, that the PS4 is at the very least a bit more powerful than the X1, suddenly becomes "the X1 is the same or more powerful than the PS4."

Which system is more powerful doesn't matter to me. I own all systems this gen, and I like the X1 exclusives I saw at E3 (though I'm holding out until MS changes their DRM policies). What does irk me is that some people keep posting all of this information without fully understanding it or its real-world application, and then call others fanboys. And it's the same argument that has been debunked countless times since the console spec leaks.

When will people understand that DDR3 vs GDDR5 latency in these console architectures, with their usage, is negligible? We're talking nanoseconds here. Where that DDR3 MIGHT make a difference is in zipping around the X1's apps and three OSes. With games, however, the bandwidth difference, coupled with the X1's more exotic design and the difference in available RAM for games (the X1's 5GB vs the PS4's 7), is significant and very real. That's the simplest way I can explain it.

Heck, if everything is equal as you're trying to say, how does the X1 make up the 2GB difference in available RAM? That alone is going to make a difference. We've seen the PS3 struggle at times with the same amount, but in split RAM. So having MORE RAM in a unified pool, in an easier-to-develop-for system, doesn't count now? That's called being blind, the same thing you're accusing others of. People are trying to overcomplicate things and end up confusing / deluding themselves and potentially others. Don't be that guy, Andrew. Don't be that guy.
Joystick- The purpose of that brick of text was to accurately explain the differences between the two memory standards, because some people here continually post inaccurate information, even more so when it comes to people's understanding of the use of eSRAM.

The Xbox One will have an 8-core 64-bit x86 Jaguar AMD CPU @ 1.6GHz, coupled with a GPU that's very close to the Radeon HD 7790. The Xbox One will have 68GB/s of bandwidth between the CPU/GPU and RAM; the GPU will have 102GB/s of bandwidth to a local 32MB SRAM cache, and another 30GB/s of bandwidth to gamepads, Kinect, and other peripherals. The PS4, in comparison, has an 8-core Jaguar AMD CPU, with a GPU that's around the same level as the Radeon HD 7870 (which is significantly more powerful than the 7790). The PS4 has 8GB of GDDR5 RAM, providing 176GB/s of bandwidth to both the CPU and GPU. The Xbox One mostly ameliorates this difference with 32MB of high-speed SRAM on the GPU, but it will be a more complex architecture to take advantage of.

And again, I'm only speaking of bandwidth here. Everyone knows the PS4 has a more powerful GPU, but the fact remains that nobody here really knows whether either GPU has the processing power to actually saturate that bandwidth, how the PS4's CPU will perform with high-latency GDDR5, or how effective 32MB of on-die eSRAM dedicated exclusively to GPU functions will be compared to how effectively the PS4's CPU and GPU share that monolithic block of high-latency RAM. I highly doubt the difference in performance will be anywhere near as great as the fanboys claim. They'll probably be a lot closer than everyone thinks, especially since nobody here has any idea what the clock speeds are for the Xbone's RAM and GPU. The sizable amount of power saved by going with DDR3 could easily be used to clock the GPU much higher. We'll see...
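For what it's worth, the bandwidth figures quoted in this thread can be lined up in a small sketch. These are the unofficial pre-launch numbers the commenters are citing, not confirmed specs, and simply summing the DDR3 and eSRAM paths is the best-case theoretical argument made above, not a guaranteed real-world result.

```python
# Pre-launch bandwidth figures as quoted in this thread (unofficial estimates, GB/s).
bandwidth = {
    "PS4 GDDR5 (shared by CPU and GPU)": 176,
    "Xbox One DDR3 (main memory)":        68,
    "Xbox One eSRAM (32MB GPU cache)":   102,
}

# Best case for the X1: the GPU reads from eSRAM while DDR3 serves other traffic,
# so the two paths are (theoretically) additive.
x1_peak = bandwidth["Xbox One DDR3 (main memory)"] + bandwidth["Xbox One eSRAM (32MB GPU cache)"]
ps4_peak = bandwidth["PS4 GDDR5 (shared by CPU and GPU)"]

print(f"X1 theoretical peak: {x1_peak} GB/s vs PS4: {ps4_peak} GB/s")
```

On paper the two come out close (170 vs 176 GB/s), which is exactly the "mostly ameliorates" claim; the catch, as the surrounding comments note, is that hitting the X1's combined figure requires developers to manage the 32MB eSRAM explicitly.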
@Andrew, you also have to keep in mind that when you talk PC specs, a PC runs a full OS; the PS4 doesn't run like one, giving it the advantage of taking on more without having to deal with the OS overhead like a PC does. Which is where people say the PS4 has the advantage.
Aren't you all the same people who complained when the 360 was released, saying the cost was not the true cost because you have to buy all of the accessories? Well, where is all that anger? The Eye is not included with the P$4!?! Where is the outrage that Sony would make you pay extra for accessories you have to have to fully use the system? Before you say you don't need it, remember the light on the DS4? What was that light supposed to work with again?
What?? Are you serious? Including Wi-Fi and Blu-ray in a console should be a no-brainer in this day and age. But forcing an expensive camera attachment on a hardcore gaming crowd is just stupid.
You don't need the camera for the DS4 to work...
The light can be used for things other than motion control. One instance was to give players visual cues of what was going on in game. I personally haven't seen any use for it, but the camera isn't needed, and some people don't care about it. The PS4 wasn't built around the camera, so yeah, in this case, I'd rather not spend money on something I care nothing about.
Simple answer... it's an OPTION! Things like online adapters, high-density disc storage, and hard drives are somewhat essential to a complete gaming experience. However, a camera really just ENHANCES the experience for the vast majority of games (there are PS Eye-only and Kinect-only games, but they're a small percentage of the market). It's really apples to oranges.
Sony played this perfectly. Nobody wanted a Kinect, so they made the PS4 come without their camera, and by doing so they undercut their competitor by 100 bucks on console price.
Great. Now remind me again why the PS4 camera is *necessary*? It's still cheaper with it anyway.
The difference is not small. The thing is that he can't talk badly about Xbox and Microsoft, because they made a deal.
what deal? certainly not an exclusivity deal lol.
Probably the deal is not to talk shit about the X1, lol.