At the beginning of 2010 I made some calculations to predict when we would see Toy Story-level graphics and posted them in a forum ... but my calculation went horribly wrong when Crytek let us down
To prove I have not copied it, I have added a reply in the forum verifying that I, Orpheus, am Anil Mahmud
I also posted this thing in vr-zone at that time
http://www.fudzilla.com/for...
Here is the post :
""
When NVIDIA launched the GeForce 2 in 2000, Jen-Hsun Huang said it was a "major step" towards achieving "Pixar-level animation" in real time, only to be criticized by Pixar's Tom Duff:
"These guys just have no idea what goes into `Pixar-level animation.' (That's not quite fair, their engineers do, they come and visit all the time. But their managers and marketing monkeys haven't a clue, or possibly just think that you don't.)
`Pixar-level animation' runs about 8 hundred thousand times slower than real-time on our renderfarm cpus. (I'm guessing. There's about 1000 cpus in the renderfarm and I guess we could produce all the frames in TS2 in about 50 days of renderfarm time. That comes to 1.2 million cpu hours for a 1.5 hour movie. That lags real time by a factor of 800,000.)
Do you really believe that their toy is a million times faster than one of the cpus on our Ultra Sparc servers? What's the chance that we wouldn't put one of these babies on every desk in the building? They cost a couple of hundred bucks, right? Why hasn't NVIDIA tried to give us a carton of these things? -- think of the publicity milage they could get out of it!
Don't forget that the scene descriptions of TS2 frames average between 500MB and 1GB. The data rate required to read the data in real time is at least 96Gb/sec. Think your AGP port can do that? Think again. 96 Gb/sec means that if they clock data in at 250 MHz, they need a bus 384 bits wide. NBL!
At Moore's Law-like rates (a factor of 10 in 5 years), even if the hardware they have today is 80 times more powerful than what we use now, it will take them 20 years before they can do
the frames we do today in real time. And 20 years from now, Pixar won't be even remotely interested in TS2-level images, and I'll be retired, sitting on the front porch and picking my banjo, laughing at the same press release, recycled by NVIDIA's heirs and assigns. "
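Duff's numbers are internally consistent. A minimal sketch (taking his year-2000 figures at face value, including the optimistic 80x head start he grants the GPU) reproduces both the 800,000x gap and the 20-year estimate:

```python
import math

# Tom Duff's back-of-the-envelope figures (year 2000), taken at face value.
renderfarm_cpus = 1000        # CPUs in Pixar's renderfarm
render_days = 50              # his guess for rendering all of TS2
movie_hours = 1.5

cpu_hours = renderfarm_cpus * render_days * 24   # 1,200,000 CPU-hours
slowdown = cpu_hours / movie_hours               # factor behind real time

# Even granting the GPU an 80x head start over one renderfarm CPU,
# growth of 10x every 5 years must close the remaining gap.
head_start = 80
remaining_gap = slowdown / head_start            # 10,000x
years_to_close = 5 * math.log10(remaining_gap)   # 10^4 gap -> 4 * 5 years

print(f"{slowdown:,.0f}x slower, {years_to_close:.0f} years to close")
# 800,000x slower, 20 years to close
```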
Many did not find this response a very warm one and started blogging about it. For example, this one is from a brother:
http://industrialarithmetic...
Now I ventured to do some calculations. Feel free to correct me if I am wrong, as I am no expert in the computer industry.
Machines used to render Toy Story :
87 dual-processor and 30 quad-processor 100-MHz SPARCstation 20s
Total number of processors = 294
According to :
http://ftp.sunet.se/pub/ben...
SPARCstation 20 (single processor) had SunOS 5.4 installed and used a HyperSPARC @100 MHz with 27.5066 MFLOPS
Theoretical maximum performance of the setup used by PIXAR
294 * 27.5066 = 8086.94 MFLOPS
Movie was rendered at 1526x922 pixels using Stochastic Anti-Aliasing
Scan-line rendering used, shadow mapping for shadows ( no ray tracing )
Movie Length ~ 75 minutes
Number of rendered frames = 110064
Movie frame rate ~ 25 frames per second
Rendering time = 46 days
Total data sent to renderer = 34 Terabytes
Factor by which more power is needed for this to be real time
46 × 24 × 60 minutes (rendering time) / 75 minutes (movie length) = 66240 / 75 = 883.2
So required computational power = 883.2 * 8086.94 MFLOPS = ~ 7.14 TFLOPS
This is without considering all the network bottlenecks.
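The arithmetic above can be sketched in a few lines (Python; assumes perfect linear scaling across all 294 CPUs, which is optimistic):

```python
# Renderfarm throughput: 87 dual- and 30 quad-processor SPARCstation 20s.
cpus = 87 * 2 + 30 * 4                    # 294 processors
mflops_per_cpu = 27.5066                  # HyperSPARC @ 100 MHz
farm_mflops = cpus * mflops_per_cpu       # ~8086.94 MFLOPS

# How much faster than the 46-day render would real-time playback need to be?
speedup = (46 * 24 * 60) / 75             # render minutes / movie minutes

required_tflops = farm_mflops * speedup / 1e6
print(f"{farm_mflops:.2f} MFLOPS x {speedup} = {required_tflops:.2f} TFLOPS")
# 8086.94 MFLOPS x 883.2 = 7.14 TFLOPS
```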
Theoretical performance of the GeForce GTX 480 ~ 1.5 TFLOPS
(not considering ATI solution because it is less programmable)
If 4 cards are placed in quad SLI (e.g. on an EVGA Classified SLI motherboard, though that board can handle 7 cards):
4*1.5 = 6 TFLOPS
Now hardware rendering is much faster than software rendering, and there is less of a network bottleneck here.
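The supply side can be checked the same way (a sketch; the 1.5 TFLOPS per card is the figure used in the post, and perfect 4-way SLI scaling is an optimistic assumption):

```python
gtx480_tflops = 1.5                        # single-precision figure used in the post
cards = 4                                  # quad SLI
aggregate_tflops = gtx480_tflops * cards   # assumes perfect linear scaling
required_tflops = 7.14                     # from the renderfarm estimate above

shortfall = required_tflops - aggregate_tflops
print(f"{aggregate_tflops} TFLOPS available, {shortfall:.2f} TFLOPS short")
# 6.0 TFLOPS available, 1.14 TFLOPS short
```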
Quick facts
Per-frame data of Toy Story = 34 × 1024 × 1024 MB (total data sent to renderer) / 110064 frames ≈ 324 MB
Per-frame data of Crysis ~ 200 MB
Per-frame data of Crysis 2 = :-)
Polygons per frame in Toy Story = 5-6 million
Polygons per frame in Crysis ~ 1.7 million, and the Nanosuit = 67,000
Polygons in the Nanosuit of Crysis 2 ~ 1 million
Texture Streaming can now allow for extremely detailed textures
Example : RAGE from Id Soft
Something similar is probably used in Crysis 2
Global illumination is used in Crysis 2, which is probably better than the lighting system in Toy Story
The only missing feature is probably stochastic anti-aliasing
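The per-frame figure in the quick facts follows directly from the totals quoted earlier (a sketch; 34 TB is converted with binary prefixes, as in the post):

```python
total_tb = 34                 # total data sent to the renderer
frames = 110064               # rendered frames in the movie
per_frame_mb = total_tb * 1024 * 1024 / frames
print(f"{per_frame_mb:.1f} MB per frame")   # 323.9 MB per frame
```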
""
Some of the data, like the 1 million polygons in the Nanosuit, was circulating on the web at that time.
Now I was expecting Metro: Last Light / BF3 to do the thing, but those GTA mods ..... put a smile on my face :-)
I have found another link providing performance data of the Sun workstations:
http://home.iae.nl/users/mh...
The link for Sun's workstation information is not working anymore.
Does not compute.......
Please speak in dumb person talk. All I got from that is there's no fu**ing way we can get Pixar graphics for a loooong time.
I think these are two totally different beasts, one being prerendered video and one playing in real time with physics and all that stuff running. Look at prerendered stuff from KZ3, StarCraft 2 and other games; it looks about as good as CGI from major films. Games have come FAR in these past few years. Crysis 2 really blew me away just looking at the trees and the lighting.
I watched a documentary about Toy Story 3 and how the image didn't really get better, because they couldn't do much more to the surfaces since it's mainly plastic, so they used the improvements in tech on things like the way the garbage bag reacted naturally to the toys in it. The physics and motions are much more lifelike and fluid. I think this is what we will see in gaming: more movement on screen, grass blowing, birds flying, destruction, more characters on screen behaving differently, etc. This is a bigger leap than graphics because of how much it improves the world feeling alive and immersive.
... but this is just based on numbers.
A game can reach Toy Story's level of graphics without matching its polygon count, can't it?
I'd assume that the models and rigs for animating a movie are different than they are for a game...
Having all that detail in a continuous level would be pretty taxing on the system too, something which animated movies don't have to deal with.
With improved lighting techniques, I think games have a fair chance of matching Toy Story... or well, being comparable to it at least. It's hard to really match things with different art styles.
I understand that Pixar has made huge advances in computer animation, and I know that it all started with Toy Story, but comparing it to Crysis seems odd. Not because of the tech behind the two, but because of the vast differences between the two stories. One is a kids' movie and the other is an FPS.
On the subject of the tech, while it's clear that Pixar has become the top computer animation studio it would be interesting to see what they could do if they decided to make a video game. I'm sure they would deliver something that was both unique and pleasing to the eye, but would the process of making it an interactive experience hinder the level of graphical performance? I'm sure there would have to be some give and take involved, but I would be very curious to see how a Pixar game would turn out.