GR - "Their processing power includes 40,000 processors and 104 terabytes of RAM. Hardly the kind of thing you’d find on a $400 console."
What are tomorrow's lottery numbers?
Are you serious? This is common logic! The author doesn't just spout a bunch of crap, either; it's a well-thought-out, well-researched article. There is no way games next generation will look like Avatar, EVEN ON PC! The guy even said, "Their processing power includes 40,000 processors and 104 terabytes of RAM." I also read somewhere that Avatar took those computers MONTHS to render. The article even points out that films like Avatar are pre-rendered, and the processing power needed for a real-time version is immense. And if we compare it to games, Avatar would be the most linear game in the world, because they only render specific shots, not game worlds with AI and everything else.
" A recent article on Cinemablend quotes Chris Doran, the co-founder of Geometrics, a middleware company currently developing technology for the next generation of consoles, as saying that new consoles like the Xbox 720 and PlayStation 4 will offer graphics comparable to the James Cameron movie." There are improvements made in technology on a daily basis and the way some things are achieved today using High tech doesn't mean it can't be achieved through other means of technological advancements. They don't need to achieve something that is pixel to pixel as accurate as "avatar" they just have to simulate the image fidelity or be close. Something that is "comparable"
Give it 20 years and we'll be close.
"The author doesn't just spout a bunch of crap either, it is a well thought out, well researched article." So well researched in fact that he even misinterpreted what Chris Doran said in the first place. Chris never said we'd see Avatar running in real-time, so any argument attempting to disprove that is already moot. Here's the actual quote, even posted in the article: "I am confident that the lighting itself could get close to Avatar quality. Then the question moves onto other aspects of content creation. Is that level of modeling detail feasible, and will the animation, physics, and AI all be equally plausible," said Doran of his company’s upcoming engine, Enlighten. If you read that sentence properly, you'll realize that Chris was only referring to the lighting, not anything else. If you've seen any of Geometrics' work, you'd know that that's perfectly plausible. Also, who to believe - random blogger that can't even interpret a quote correctly, or CEO of a company that has delivered incredible innovations in graphics in the past years?
lol @ wishful thinking. This embedded video is far and away the best-looking thing I've seen in a game. I'll be surprised if the next PlayStation and Xbox can even touch it.
There are a lot of things, like the real-time next-gen Final Fantasy tech demo, that already look as good as that Crysis video, if not better.
Think about what you're saying. Basically, it's common sense, like you said (some people don't have it), that Avatar will never be matched one-to-one in a game, because the computers used to render it were insanely powerful, and even then it took a very long time. Now you're saying it doesn't have to meet Avatar's quality pixel for pixel, but could get very close to looking like it. In other words, next-generation consoles (more than likely launching in 2013) that cost around $350 will be able to render photorealistic CG-quality images in real time, at a minimum of 30 FPS. That is insane.
You guys are ridiculous. CURRENT consoles can do Avatar. I already played James Cameron's Avatar: The Game on the ps3 and 360, and watched the Blu-ray on my ps3. /s lol
My question is, who the hell stated it was going to be? I mean, sure, one can dream, but I've yet to hear anyone actually state that next gen would look like "Avatar" graphics. Will it in the future? Yes. It's just a matter of time; Moore's law pretty much guarantees it will some day be a reality. http://www.pcgamer.com/2011...

Now, if such a thing were to happen in gaming, it would change a whole lot. On top of that, we need to remember this is a film and not a game. To argue whether it's next gen or not is quite silly, because I'm not sure anyone in their right mind is even suggesting such a thing, and to argue whether it would EVER happen is downright stupid. Moore's law says otherwise.

The PS1 had 2 MB of RAM, the PS2 had 32 MB, and the PS3 has 256 MB (which, to my understanding, can be shared with the GPU's memory, making it 512 MB total like the 360). So roughly 16x is what we seem to be looking at in terms of RAM increase per generation. Mind you, if in the future an SSD can indeed be used as RAM, this will CHANGE: it will no longer be 16x, it might be 100x or even 200x (just based on SSDs and the capacity they will hold by the time such a thing happens). So the PS4 should be around 8 gigs or so, just based on what history has shown us. If history stays true, the PS5 would have 128 GB of RAM (8 GB x 16), but by then, if SSDs indeed act as RAM too, it could actually be anywhere from 500 GB to 1 TB.

Mind you, this is all being gauged against Avatar the CG film; by then, the technology that made Avatar would clearly be dated and far better technology would have taken its place. This theory on RAM only works if Avatar is considered the "staple" of films to be copied in terms of in-game graphics, with technology somehow standing still and Avatar remaining the target render. Games will look PAST Avatar by then; none of us know what ATI and Nvidia have in store for their GPUs/APUs/CPUs in the next 15 years.
MIND YOU, this is just based on what "consoles" might do; I haven't even said crap about the PC yet. Just 10 years ago, having 256 MB of RAM in your PC was considered "beast." People have 32 GB setups now. From a quarter of a gig to 128x that in a decade. Now, what happens if SSDs do end up acting as RAM for gaming in the future?
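The 16x-per-generation extrapolation above fits in a few lines. A minimal sketch, assuming the comment's starting figure of 2 MB for the PS1 and a flat 16x multiplier per generation; the PS4/PS5 numbers are the comment's hypothetical projection, not real specs:

```python
# Sketch of the 16x-per-generation RAM extrapolation from the comment.
# PS1/PS2/PS3 figures are the comment's; PS4/PS5 are projections, not specs.
def projected_ram_mb(generations, base_mb=2, factor=16):
    """Return RAM in MB for each generation under a fixed growth factor."""
    return [base_mb * factor ** g for g in range(generations)]

names = ["PS1", "PS2", "PS3 (with shared VRAM)",
         "PS4 (projected)", "PS5 (projected)"]
for name, mb in zip(names, projected_ram_mb(5)):
    print(f"{name}: {mb} MB" if mb < 1024 else f"{name}: {mb // 1024} GB")
```

Under this naive model the PS4 lands at 8 GB and the PS5 at 128 GB, which is exactly the comment's arithmetic; the SSD-as-RAM scenario would break the fixed-factor assumption entirely.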
@TheRealSpy: ROFL "Top Secret Tessellated Toad Tech"
Moore's law states that the number of transistors in a given amount of silicon will double roughly every 18 months. Here's how GPUs have actually tracked:

Processor / Transistors / Introduced / Manufacturer / Process / Die area
NV3 / 3,500,000 / 1997 / NVIDIA / 350 nm
NV5 / 15,000,000 / 1999 / NVIDIA / 250 nm
NV15 / 25,000,000 / 2000 / NVIDIA / 180 nm
NV40 / 222,000,000 / 2004 / NVIDIA / 130 nm / 305 mm²
G80 / 681,000,000 / 2006 / NVIDIA / 90 nm / 480 mm²
RV770 / 956,000,000 / 2008 / AMD / 55 nm / 260 mm²
GT200 / 1,400,000,000 / 2008 / NVIDIA / 55 nm / 576 mm²
RV870 / 2,154,000,000 / 2009 / AMD / 40 nm / 334 mm²
Cayman / 2,640,000,000 / 2010 / AMD / 40 nm / 389 mm²
GF100 / 3,000,000,000 / 2010 / NVIDIA / 40 nm / 529 mm²
Tahiti / 4,310,000,000 / 2011 / AMD / 28 nm / 365 mm²
Kepler / 3,540,000,000 / 2012 / NVIDIA / 28 nm / 294 mm²

As the process node shrinks, engineers are now struggling with cross-talk noise: on 28 nm dies the wires are so close together that electrons start to interfere with neighboring signals. I do CGI as a hobby and can give an example of how render times have changed. On a Duron 600 with 512 MB, a one-minute animation with one light source took me 119 hours to render at 30 fps and 800x600; the same animation on a Phenom II 955 quad core with 4 GB of RAM took less than an hour. That's a huge improvement in tech over 10 years. Further die shrinks would keep Moore's law going, but until we get 14 nm chips we're more or less at the limit; only increasing chip size gives us more room.
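As a quick sanity check on the table above, you can compute the doubling time its endpoints actually imply. A small sketch using only the first and last entries (NV3 in 1997 vs. Tahiti in 2011):

```python
import math

# Implied transistor doubling time between two entries in the table above.
t0_year, t0_count = 1997, 3_500_000      # NVIDIA NV3
t1_year, t1_count = 2011, 4_310_000_000  # AMD Tahiti

doublings = math.log2(t1_count / t0_count)           # ~10.3 doublings
months_per_doubling = (t1_year - t0_year) * 12 / doublings
print(f"{doublings:.1f} doublings, ~{months_per_doubling:.0f} months each")
```

That works out to roughly 16 months per doubling, so over this 14-year span GPU transistor counts actually tracked the 18-month formulation fairly well; the question is whether the curve holds going forward.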
EDMIX: Moore's law will probably end within the next 10-15 years. It applies to existing silicon lithography, which will either hit its practical ceiling within a decade or become so expensive that shrinking any further is impractical. Other ways will have to be found to keep increasing computer performance while reducing the size and power consumption of the chips. We have taken it for granted for the past 40 years that this process just continues, but it WILL end for existing computer technology, and fairly soon. Maybe we can get into 3D chip stacking and such, but we'll see; it probably won't prevent power consumption from increasing.
@sjaakiejj Just the lighting? Lighting is THE factor in pre-rendered CG today. The rest is very feasible, or at least much more nearly feasible, on today's hardware. It's the lighting that makes the difference. Geometry is one thing, but you can literally pack tons of it on screen; more often than not, movie studios just get by on polycount and use advanced lighting techniques to hide the limits of their models, which is perfectly normal on a budget. The lighting is the MOST compute-intensive part of the entire process. No way in hell. Not to mention these Avatar frames are rendered at 4K and then downsampled to 1080p, which hugely improves the cinematic AA. Then again, Sony fans can't seem to tell the difference between 720p and 1080p, so odds are the claim will be made anyway.
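The 4K-to-1080p trick described above is just supersampling with a box filter: render at twice the resolution in each dimension (3840x2160 down to 1920x1080 is exactly 2x), then average each 2x2 block into one pixel. A toy sketch on a grayscale "image"; the function and data are illustrative, not from any real renderer:

```python
def downsample_2x(img):
    """Box-filter: average each 2x2 block of a grayscale image (list of lists)."""
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, len(img[0]), 2)]
            for y in range(0, len(img), 2)]

# A hard, jagged edge at the higher resolution...
hi_res = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 1, 1, 1],
          [0, 1, 1, 1]]
# ...comes out with intermediate values (0.5) where the jag was,
# which is exactly the edge smoothing that reads as anti-aliasing.
print(downsample_2x(hi_res))
```

A real-time game can't normally afford to render 4x the pixels, which is why it falls back on cheaper approximations like MSAA or post-process AA instead.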
Although I agree it's not feasible for next-gen hardware to meet Avatar's "photorealistic" graphics (and since there aren't 9-foot-tall blue cat people walking around, or natural Christmas trees that talk to people through their hair, we can't readily spot the differences between reality and that CG, whereas when movies try to show CG humans we can still spot it), I would like to point out that 10 years ago, even thinking a game could look like a Pixar or DreamWorks animation was tinfoil-hat crazy. So never say never. Personally, I'm much more worried about how animation, physics, and AI (especially with multiple enemies) will evolve. I don't feel the urge to upgrade graphics just yet; in fact, after watching several of those engine-sponsored "next-gen tech demos," I gotta say I'm not impressed. It's not that the new tech doesn't look good or doesn't evolve enough, it does, but the current tech already looks good enough, if that's the right term. I mean, it's not as if current 3D models look like piles of polygons, or textures look like nothing more than a flat matching color. Developers have also been pulling off some impressive lighting on the so-called "outdated consoles." The moments that break the fourth wall and remind us "hey, it's just a game" are the weird animations, funny physics, and stupid AI.
There's a difference between real-time rendering and rendering a movie out to Blu-ray, theater, or DVD format; the latter takes forever per frame. Playing a video game is different because all of the resources are actual image/sound assets that get loaded and unloaded as the camera brings them in and out of view.
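Some rough arithmetic shows how far apart the two really are. The hours-per-frame figure below is an assumption for illustration, not a published number for Avatar or any other film:

```python
# Offline vs. real-time frame budgets (illustrative numbers only).
offline_hours_per_frame = 30        # assumed render-farm time per film frame
realtime_budget_ms = 1000 / 30      # a 30 fps game gets ~33 ms per frame

offline_ms = offline_hours_per_frame * 3600 * 1000
ratio = offline_ms / realtime_budget_ms
print(f"offline frame is ~{ratio:,.0f}x over the real-time budget")
```

At these numbers an offline frame overshoots the real-time budget by a factor in the millions, which is the gap any "Avatar in real time" claim would have to close.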
You are right, but think of the technological advances we could make years from now. A $400 console in the future could end up making Avatar's movie graphics look like a joke :D I didn't give a year; I'm simply saying that many years from now, someone could invent or discover crazier stuff that'll become cheap and affordable. Look how far the technology available to the public has come in the last 12 years.
@jereththegryphon Tell that to Intel; they are already working on 14 nm and are aiming to get down to 5 nm by 2015.
@angels and miku: Even if this were REMOTELY possible, a PS4/720 with the technology for Avatar graphics would have a price tag out of our reach: $1,500-$2,000.
I see a lot of comments saying "some day" But, uh....I'm pretty sure this article is about the NEXT gen, and I think we can absolutely all agree that isn't going to happen.
When Moore formulated his law, I don't think he saw the PS3 coming.
This was posted below, and I think you should all watch it before acting like you know what the future holds for gaming. Btw, didn't a dude from AMD say next gen would be able to produce Avatar graphics? I think if anyone would know, it would be the dudes producing the GPUs... just sayin'.

vortis +15h ago: "Here is a correction since you are wrong. Real-time game simulation with ray-tracing. Actually works. Actually runs. http://www.cinemablend.com/..."
Actually, we are getting closer, really close. GPGPUs and CPU hybrids are paving the way; the problem now is price. You can actually get such systems commercially (openly), as opposed to specialty systems from some avid reseller; I believe Origin sells something close to what I use. Yet unless the next gen has GTX 690s and a CPU geared for certain tasks, we may not see it this generation. You have to understand: this gen's jump in graphics was not due to brute force. Shaders supported in silicon made that happen. We now have interactive light-source solutions in multiple engines, culling is moving onto the GPU, pre-baked effects are switched on and off in real time, and ultimately we may see voxels replace mip mapping. It is about efficient use of what you have. That is not the goal of movie studios; it is the goal of AMD, Nvidia, and countless engine designers. There will always be limits to real-time graphics, but it is up to the studios to hide those limits creatively. Look up shaders, normal mapping, polygon culling, voxels, tessellation, GPU ray tracing, GPU subsurface scattering, GPGPU progress on all of the above, good old deferred rendering, cloud-based solutions, the limits of compute shaders, bandwidth advances, and system-on-chip designs. A lot is possible; we will see. Even MS is learning how to lower OS overhead! So anything is possible. Many did not see normal mapping coming, and I'm sure the next smart trick will be a surprise once more. Understand: more brute force is not the way forward; focus and efficient methods are.
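As one concrete example of the "efficient methods" mentioned above, here is a toy sketch of backface culling, one of the oldest tricks for not wasting work: triangles facing away from the camera are discarded before any shading happens. This 2D screen-space version uses the sign of the projected triangle's area; the counter-clockwise-equals-front convention and the data are illustrative assumptions:

```python
def is_front_facing(a, b, c):
    """a, b, c: (x, y) screen-space vertices. CCW winding counts as front-facing."""
    signed_area = (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])
    return signed_area > 0

tris = [((0, 0), (1, 0), (0, 1)),   # counter-clockwise: kept and shaded
        ((0, 0), (0, 1), (1, 0))]   # clockwise: culled, no shading cost
visible = [t for t in tris if is_front_facing(*t)]
print(f"{len(visible)} of {len(tris)} triangles survive culling")
```

For a closed mesh this skips roughly half the triangles before the expensive per-pixel work, which is the general shape of every trick in the list above: spend the budget only where it shows.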
I've become a Hatsune Miku fanboy after watching some videos last night. Sony needs to keep that IP exclusive and make her the Japanese mascot of PlayStation. Back on topic: do you know how much it would cost for a console to have that image fidelity in the far future? I'm guessing $600. People don't want to pay $600 for a console in this economy, and I heard the economy isn't going to get better; I'm talking about the mainstream. I know some who would pay over $1,000 for it.
Miku belongs to Crypton Future Media, and the games are produced by Sega under strict licenses. The character has no affiliation to any console or console company, wouldn't make sense to have her as a mascot for a company that doesn't own her likeness or any rights to her at all >.<
I mean, I think it's common sense, so I literally don't see why they bothered writing a whole article about it, lol. But yeah, there's no way in hell next-gen graphics on the PS4/720 will be on par with Avatar.
Yeah, let's at least see a game that can reach early Pixar movies before we jump straight to Avatar.
The consoles will easily produce Avatar graphics as soon as they get dual GTX 690's, 16GB of RAM, and a 16 core processor in 2025.
Even that's not enough to render Avatar like graphics in real time. You didn't read the article, did you?
And you're taking things too literally. My point was that it would cost too much and take 10+ years for console or even PC games to render something that even resembles Avatar, or for that matter the 1990s Pixar movies.
Engines like Unreal 4 have tools that assist with, and even eliminate, certain tedious and stressful tasks devs have to go through. Technology advancements will allow many new things to be accomplished in shorter time frames. I don't think it will be dot-for-dot Avatar graphics, but I know some sexy things will happen. Cloud gaming will be implemented into next gen as well; some devs will only need to make one build for certain games and stream them.
*cough* PC gaming already can *cough*
@ tachyon: *cough*that's pretty nice, but no... they can't*cough*