PCGH reports: Nvidia claims to have demonstrated the first GPU-only ray tracer at Siggraph 2008.
If it doesn't cost $2000 and doesn't take an hour to render half a frame, then there's a chance for this in the gaming industry.
because they just want to make the next best thing
Come on guys, be realistic. They are not saying "Oh, this will be in games in 3 months' time." It is future technology, giving you a glimpse of what's to come.
I think ray tracing is still a few console generations away, especially as we gamers get used to more visually complex games. Though it wouldn't surprise me if in the next generation someone creates a ray-traced Pac-Man game.
I wouldn't be surprised if Sony and M$ designed their hardware with ray tracing in mind down the road. It's been around for too long, with AMAZING results, to completely disregard it. It probably won't be in launch games, but the power will probably be there to have it later in the generation. Also, I wouldn't be surprised if one of the consoles added a third processor for near-dedicated physics.

I think Sony took one step forward in using consoles as a way to experiment with new hardware. If the Cell had already been well supported with programming knowledge, it probably would have had an enormous effect. If someone did something similar (expanding the power in an experimental way) without making it so hard to program for, it would probably make an amazing machine.
I'll say it'll be in games in 5 years.
A PS3 can do ray tracing in real time. Though this is very unrealistic for actual games, it shows it is possible. I would say that within the next two generations of graphics cards and processors it could arrive, in about 5 years.
While the PS3 does use ray tracing in real time, it's only some form of it, not everything 100% ray traced. Games like GT5P utilize it, as well as Getaway 3 http://boardsus.playstation... and even Killzone 2 (listen at around 28 seconds) http://www.gametrailers.com...
Relax, I did not say games use it, but there was a project that did use "real" ray tracing. Though it's not a feasible solution, it just goes to show what can really happen.
can someone please explain ray tracing? (simplified)
It's basically taking a scene, the light sources, and a set of mathematical algorithms, and using them to compute each pixel directly, instead of the normal methods (render image, apply shadow properties, apply specular properties, apply depth of field). Think of ray tracing as an "all-in-one" package, or as the tech behind holograms.
Basically there are two basic "schools" of rendering. The one used in gaming is called ray casting. Ray casting involves sending rays out from the virtual camera (your perspective in the game). When a ray intersects an object, a pixel is rendered at that point depending on the object's proximity to light sources and whatever effects are supposed to appear on the object, such as reflection maps or shadow maps. This is a very efficient system, as it only sends out enough rays to render whatever the camera is looking at.

Ray tracing instead follows the paths that light actually takes through the scene. In the textbook "forward" formulation, each light source fires rays of virtual light out into the scene; a ray is traced from its source, bouncing off surfaces, and if it ends up entering the camera it contributes to a pixel on screen. (In practice, most ray tracers trace in reverse, from the camera back toward the lights, spawning extra reflection, refraction, and shadow rays at each surface hit, which is far cheaper but models the same physics.) By tracing light between source and camera it is possible to gather much more information about the scene: reflections, shadows, and other lighting effects appear much more realistic. But because you're dealing with a ton of rays, many of which bounce around without contributing much to the final image, the system is not very efficient and the process takes much longer to complete.
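The camera-ray idea above boils down to one bit of math: for each pixel you shoot a ray and solve for where it first hits geometry. Here's a toy Python sketch of a ray-sphere intersection test, the classic building block of both ray casting and ray tracing. This is an illustrative example only, not code from any actual engine; the function name and scene values are made up.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest intersection of a ray with a
    sphere, or None if the ray misses. Solves the quadratic
    |origin + t*direction - center|^2 = radius^2 for t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a) # nearer of the two intersections
    return t if t > 0 else None          # ignore hits behind the camera

# Camera at the origin looking down -z toward a unit sphere at (0, 0, -5):
hit = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(hit)  # 4.0 -- the ray enters the sphere one unit in front of its center
```

A ray caster runs this once per pixel against the scene and shades the nearest hit; a ray tracer then recursively fires more rays from that hit point, which is where the cost explodes.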
@Leathersoup Thanks for that. Have a bubble, sir.
This will likely make it into the PS4 if Nvidia is the GPU partner.
...ray-tracing isn't the holy grail. It's not good for everything. I also remember Sony/IBM showing a demo of ray tracing done on the Cell. Also google for ray tracing with multiple PS3s. I also saw a video of a Quake 3 mod with objects created with ray tracing.
It's great for making things look really realistic, but that comes at a price, especially when you can already make things look realistic by generating them in entirely unrealistic ways. With what Carmack is doing for textures and meshes (I heard he had an idea for doing to meshes something similar to what idTech5 does with textures, which would give artists almost no restrictions on polygon counts), I would definitely be excited if he started looking at lighting.
What's the benefit of RT? Easier to program for? What? I don't get the big deal.
Shadows are the first thing to benefit, but reflections and refractions become incredibly more realistic. For instance, video games usually just use a texture image to simulate reflective surfaces, but with ray tracing you can eliminate all the work needed to create seamless transitions (Call of Duty 4 suffers from this) and just let the render engine take care of it. Honestly, most people don't know the difference until a side-by-side comparison is made. For example, play Gran Turismo 5 Prologue or Forza 2: the cars reflect the environment, but not each other, because texture maps are being used. Some games, like GTA IV, F1 Championship, and Project Gotham, do a basic render projection to simulate ray-traced reflections.
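The "let the render engine take care of it" part comes down to one cheap formula: when a traced ray hits a mirror-like surface, the engine just reflects it about the surface normal and keeps tracing, so every object automatically appears in every other object's reflection. A minimal sketch of that reflection step (illustrative only, with made-up vectors):

```python
def reflect(d, n):
    """Reflect an incoming direction d about a unit surface normal n,
    using the standard formula r = d - 2*(d . n)*n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

# A ray heading down and to the right hits a horizontal floor
# whose normal points straight up:
print(reflect((1, -1, 0), (0, 1, 0)))  # (1, 1, 0) -- it bounces upward
```

A texture-mapped "reflection" has no equivalent of this: the artist has to bake what the surface should reflect ahead of time, which is exactly why cars in GT5P can't mirror each other.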
In 5-7 years we'll see real-time ray tracing with Monte Carlo radiosity. Now THAT will be awesome. Gran Turismo 5: Prologue already does 1080p ray tracing for its garage and car-selection segments.
Really? Damn I would love to see that!
Radiosity adds so much more to the feeling of environments than most other raytracing benefits. You get a lot of detail with a lot of raytracing, but some radiosity effects really change the whole way a scene feels. http://en.wikipedia.org/wik... Look at those rooms. Dayum. Edit to save a bubble: HOLY JESUS, did they render those at high enough quality? That would only take a few days to render on my PC... :'(
Radiosity is indeed very cool, but it's stupidly expensive compared with normal raytracing techniques, and can increase the time taken to render a single frame by at least a factor of 10. I use V-Ray to do proper light-scattering radiosity rendering. If I can render an environmental frame in the 3ds Max or Maya default renderer, or mental ray, in about 60 seconds, it would take at least 10 minutes to do a full radiosity render, and that factor increases as the complexity of the scene and the number of light sources go up. The memory requirements for the calculation are huge too, almost always requiring huge swap files on my machines!

The results are staggering of course, and I'd never do a cut scene, advert, or film render without using V-Ray. But for real time, we are a very long way away from true radiosity rendering! I guess it will happen one day, and I can't wait to see it, but they're only just getting real ray tracing going, so I think we're gonna have to be patient on this one!
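For anyone wondering why radiosity costs so much more: every surface patch's brightness depends on every other patch's brightness, so you're solving a system of coupled equations, B_i = E_i + rho_i * sum_j F_ij * B_j, rather than shooting independent rays. A toy Python sketch of solving that system by simple iteration (the patch values, form factors, and function name here are invented for illustration; real renderers also have to compute the F_ij terms, which is the truly expensive part):

```python
def solve_radiosity(emission, reflectance, form_factors, iters=100):
    """Iteratively solve B_i = E_i + rho_i * sum_j F_ij * B_j.
    emission[i]:      light the patch emits on its own (E_i)
    reflectance[i]:   fraction of incoming light re-emitted (rho_i)
    form_factors[i][j]: fraction of light leaving patch j that reaches patch i
    """
    n = len(emission)
    B = list(emission)  # start from direct emission only
    for _ in range(iters):
        # each pass propagates light one more bounce through the scene
        B = [emission[i] + reflectance[i] *
             sum(form_factors[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

# Two patches: patch 0 is a light (emits 1, reflects nothing),
# patch 1 is a wall that sees half the light's output and reflects 50%:
B = solve_radiosity([1.0, 0.0], [0.0, 0.5], [[0.0, 0.0], [0.5, 0.0]])
print(B)  # the wall ends up carrying 0.25 units of bounced light
```

The pain is that a real scene has thousands of patches, so the matrix of form factors is enormous, which is exactly where those huge swap files come from.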
I highly doubt it. You can get those Wikipedia renders in just a few minutes on a Pentium 4. If you want some examples of Lightwave, I posted a few of my renders below: http://img84.imageshack.us/... http://img223.imageshack.us...
Not a fan of Lightwave myself, but I know plenty of people are. Nice environments, but if I can be constructively critical: you are getting grainy results because the sample frequency is too low. Turning it up is what gives you longer rendering times, but you don't get the 'grunge'. I appreciate that sometimes the grunge is a nice effect for a static shot, but you often don't want that in commercial work, as I'm sure you know, and if you render motion it's just noisy, as the grain is not consistent from frame to frame! :)

As a comparison of rendering times, here's a completely untextured mesh, just a concept design with two lights, and it takes about 7 minutes on a fast P4. :) It's Max with V-Ray. If I were to texture it properly, put it in a full environment, and add some proper HDR light maps, it would take a very long time. Radiosity IS expensive computationally. :) http://tinypic.com/view.php...

The last job I had to do radiosity rendering for was a TV advert last year, with lots of digital robots and digital set extension that I had to track in from 2D tracking data using Boujou. I was on set and measured all the lights and took light-meter readings so we could match the real footage we were comping in. Each HD frame of that took well over an hour to render on a quad core, for a few high-res robots and a bit of set extension!

Anyhow, nice talking it over with you, dude. Though I'm skeptical we'll see true radiosity rendering in real time soon, I'd like to! In the meantime, some of the faking techniques will start to make their way into game engines and we'll all appreciate the difference!
Radiosity doesn't really look super good until you add colors to the walls and light sources; that's where radiosity really makes you go :O. The Wikipedia article just happened to have a good explanation of it, as well as pictures with colored objects. One interesting thing about these tools becoming available is that they make a lot of 3D artists stupider. Compare someone who's had to fake radiosity with someone who's had it all along, and the difference is really kind of funny. There's definitely something to be said for artists who do it without the extra tools. I always thought you shouldn't use the faster/better ways until you know how to do it the long, crappy way.
We need more powerful processors before we see this in-game, I reckon.
Also, all my games run at 120fps.
...and say yes to chin-ups. :D Just kidding.
Anyone notice that's a Bugatti Veyron? Cost of tech = cost of car? xD Ah well.
... anything the PS3 can't do ;) Looks like GT4
N4G is a community of gamers posting and discussing the latest game news.