"During the Q&A session at QuakeCon 2013, John Carmack was asked whether he plans to use mega-textures in future games developed by id Software, to which he gave a pretty long explanation, detailing the benefits of using mega textures."
Would so like to see Carmack and Cerny on stage together...
I have an intermediate understanding of computer tech at best, so with respect, would one of you tech gurus out there help me understand this statement: "We have no downsides to dynamic lights, where right now we have to approximate them but you can also have the full glory of the completely baked world view." What does he mean specifically when he says fully baked world view? I am still a nerd in training, thank you.
I think he's talking about fake lighting. Coloring/shading textures so they LOOK like they're being lit/shaded when they're really not.
Baking generally refers to taking the texture data and pre-applying mesh/material effects to it, which affects how light is reflected off the actual object in a game engine. When done procedurally (in real time) it can slow performance down. However, in many cases these effects can be applied to the textures ahead of time, because the difference either wouldn't be noticeable or wouldn't be worth the performance cost. This can greatly improve performance in complex scenes. For instance, a brick wall has quite a bit of depth to its texture. By baking, you take how the wall would look with the light hitting it and put that image on the wall, instead of having the engine work out the depth for every individual light ray that hits it. That way you get the textured look, and the wall only has to process a simple light ray instead of complex depth/light information. The real description is a bit more in depth and involves vectors, RGB values, and things of that nature.
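If it helps to see the difference in code, here's a minimal sketch. The struct names and the Lambert-only lighting model are just illustrative, not anything from an actual engine:

    #include <math.h>

    typedef struct { float r, g, b; } Color;
    typedef struct { float x, y, z; } Vec3;

    /* "Baked": the lighting was computed offline and stored in a lightmap,
       so at runtime shading is just two texture fetches and a multiply. */
    Color shade_baked(Color albedo, Color lightmap_sample)
    {
        Color out = { albedo.r * lightmap_sample.r,
                      albedo.g * lightmap_sample.g,
                      albedo.b * lightmap_sample.b };
        return out;
    }

    /* Dynamic: the same (Lambert-only) lighting term has to be evaluated
       per pixel, per light, every frame. */
    Color shade_dynamic(Color albedo, Vec3 n, Vec3 l, Color light_color)
    {
        float lengths = sqrtf(n.x*n.x + n.y*n.y + n.z*n.z) *
                        sqrtf(l.x*l.x + l.y*l.y + l.z*l.z);
        float ndotl = (n.x*l.x + n.y*l.y + n.z*l.z) / lengths;
        if (ndotl < 0.0f) ndotl = 0.0f;   /* surface facing away from the light */
        Color out = { albedo.r * light_color.r * ndotl,
                      albedo.g * light_color.g * ndotl,
                      albedo.b * light_color.b * ndotl };
        return out;
    }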
I know Carmack is highly respected but the guy didn't come close to top graphics this gen nor did he release a single great game. You only as relevant as you games imo.
I agree he didn't release a "great" game but RAGE (especially on pc) was one of the best looking.
RAGE was okay on PC and nothing more really.
Even on consoles Rage looked really good.
as long as you didn't get too close, the textures looked fine, but once you did they looked horrible.
Since when has RAGE been touted as this amazing looking game? The game has great art direction, but with horrible textures like these http://cloud.steampowered.c... I can't see it being touted as something that is amazing.
Stop with the BS. Rage may not have been a great game (it was decent), but graphically it was easily one of the top 10 on consoles, especially considering it was running at 60fps in an open environment. http://www.youtube.com/watc...
I ran Rage maxed on PC, but one of the best? WTF. Best thing was you could drive. Graphically it was OK, some areas better than others. Hopefully they've improved the engine a good amount for the next game.
Would be cool if you could name a single game that looked as good as Rage and ran at 60FPS on both PS3 and 360.
Rage suffers from the same thing COD does on console. There's nothing going on in the environment, so it feels stale and lifeless to me, because all the work is being spent on the framerate.
@LeoD Name one developer who's got the balls to create something like Rage's tech on 7 year old hardware running at 60fps.
My only problem was the pop in
I think that RAGE was one of the best looking console games. Seriously looked better than 99% of the other games out there. The game, I thought, was horrible. But the looks were great. But when it comes to making game engines I trust this guy. Megatextures/hardware tiling are the future. But how far into the future is the question.
And there is only one console that can do hardware tiling. I may wait and see what happens at Gamescom before I reconsider my PS4 preorder.
^^ OpenGL can, in theory, do everything that DirectX can do. It is called "open" for a reason. I wouldn't doubt the process is cloned in some form or fashion for an OpenGL environment whether that be on the PC or on the PS4. But yes, for the foreseeable future the only console that can accomplish this particular feature is the Xbox1.
Ummmm... hardware tiling is in both consoles. PRT is a standard feature of the GCN architecture. Guys, this was on OpenGL first; id Tech 5/Rage is an OpenGL game engine.
@cchum - you couldn't possibly be more wrong if you tried. As of right now hardware tiling is NEW, never been done before, and only available on Windows 8 and Xbox1. The ability to do this on opengl in a future update is currently being considered however. http://www.bit-tech.net/new...
Nope you guys are wrong. http://diaryofagraphicsprog... http://www.anandtech.com/sh... now with AMD doing a hardware implementation of a Carmack inspired technology. "Among the features added to Graphics Core Next that were explicitly for gaming, the final feature was Partially Resident Textures, which many of you are probably more familiar with in concept as Carmack’s MegaTexture technology." I didn't say that megatexture is hardware tiling, but Partially resident textures are.
Megatextures are NOT hardware tiling. There is a huge difference.
@cchum no, it's not hardware tiling; you're getting confused with software tiling, which requires a lot more processing power and can't switch as many tiles, and doing it properly requires very low latency and very high bandwidth, which is why the Xbone has eDRAM.
@jsonhenry tiled rendering isn't new; both the Dreamcast and PS Vita can do it. But hardware tiling is. I don't think GDDR5 will be as good as DDR3 for tiled rendering, as it relies on out-of-order memory access to quickly change between different tile sets. Unless this could be got around with clever programming, queuing textures in order.
OpenGL Sparse textures: http://www.opengl.org/regis... By AMD since 4.2. There is no real benefit to doing tile-based rendering with GDDR5 memory. It is used to overcome bandwidth limitations by transferring smaller chunks of memory to in fact reach or exceed GDDR bandwidth - and it will still be limited by buffer (eSRAM or physical GDDR) bandwidth. But with GDDR5 you simply do not need to swap/transfer anything - the GPU has access to the whole texture at full bandwidth - and since e.g. the PS4 does not have any other type of memory, this is irrelevant there. The XBox needs to fill eSRAM from the "slow" DDR3, so yes, it'll work great there. Streaming from disk will not benefit from tiled "acceleration" because fetching a block from disk will be far too slow anyhow; that must be done by the engine, not the "gpu-pager". It would be a different story if your eSRAM had 1000GB/s bandwidth. But it doesn't, and GDDR5 bandwidth is in fact faster.
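For reference, this is roughly what the sparse-texture path looks like on the OpenGL side. A minimal sketch assuming the ARB_sparse_texture extension (the registry link above points at the AMD variant, which is similar but not identical), GLEW, and an already-created GL context:

    /* Minimal sketch: reserve a sparse (partially resident) texture and
       commit physical memory for a single region of it. Assumes a current
       GL context, GLEW initialised, and GL_ARB_sparse_texture support. */
    #include <GL/glew.h>

    GLuint create_sparse_texture(GLsizei width, GLsizei height)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);

        /* Mark the texture sparse BEFORE allocating storage: this reserves
           virtual address space only, no physical memory yet. */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SPARSE_ARB, GL_TRUE);
        glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, width, height);

        /* Back one 256x256 corner of the texture with real memory. The legal
           region size is hardware dependent (query GL_VIRTUAL_PAGE_SIZE_*_ARB
           for the format). */
        glTexPageCommitmentARB(GL_TEXTURE_2D,
                               0,            /* mip level            */
                               0, 0, 0,      /* x, y, z offset       */
                               256, 256, 1,  /* width, height, depth */
                               GL_TRUE);     /* GL_FALSE would evict */
        return tex;
    }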
^^ Interesting. Make sure MS knows that. Because in this demo, they use the GDDR as a cache for "tiled" textures from system memory: https://www.youtube.com/wat... to overcome the memory limits of the graphics card and the bandwidth limits of PCIe. It's "memory mapping" for the GPU, but like paging it will never be faster than no swapping at all.
I think you should read up a little on that subject. The whole point of "resident" textures is that (GDDR) VRAM is simply too small to store large amounts of textures. We are talking about the average 1-2GB GDDR-based graphics cards. Now, you have two choices: a) increase GDDR to 8-12GB (expensive, and the most you can get today is 6) or b) find a way to use system memory (and in that regard also stream textures from disk). Partially resident textures, or tile-based rendering, or megatextures (which include streaming) do the latter and are cheaper to build. They use the 1-2GB (or even less - MS demonstrated 16MB buffers!) to "cache" high-detail textures close to the GPU while the HW does paging to "out of buffer" areas, wherever that is. The tests show that 16MB is sufficient to reach the same effect as no paging with GDDR5!

So, you see, this fits perfectly into the XBone's design: instead of GDDR they use eSRAM, and at 32MB it is big enough to function as a buffer. But it has no (!) advantage over GDDR5 other than being cheaper to make. This is irrelevant for the PS4 because it has one bank of GDDR5 - which is solution a) to the problem.

Neither one will solve the problem that you'll run out of memory for massive open-world games. Neither one can page textures to disk (well, the PS4 supports VMEM, not sure about the XBone). But even if the OS supported it, this must be software driven, because the game must have full control over when and what parts to swap to disk or load from there. The "MoveEngines" are far too slow to do all that - with "only" 20GB/s bandwidth you can't swap tiles with those. This is integrated into the GPU, which does "mmu"-like memory addressing for texture tiles and most likely runs its own memory controller. Or not. Who knows. Also, what makes you think the MoveEngines are low latency?

What Rage does is use the whole amount of console RAM as a cache and actually stream from disk. But if you have 4-5GB of fast texture memory, this only makes sense if your game actually requires more than that - and even then, more than that at the very same time. Otherwise you'd simply create a streaming buffer and stream textures in as you move through the world.

BTW: the eSRAM idea is very interesting for notebooks, which usually do not have dedicated VRAM. Those could simply use a small eSRAM buffer and run tiled textures. That would make mobile chips as fast as desktop parts while running standard DDR memory.
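To make the "small fast buffer as a tile cache" idea concrete, here's a rough sketch. The sizes, names, and the trivial LRU policy are invented for illustration and aren't taken from either console:

    #include <stddef.h>

    /* Illustrative only: a fixed pool of 128x128 RGBA8 tiles (64 KB each)
       inside a small 32 MB buffer, evicted with a trivial LRU stamp. */
    #define TILE_BYTES  (128 * 128 * 4)          /* 64 KB per tile          */
    #define POOL_BYTES  (32 * 1024 * 1024)       /* the small fast buffer   */
    #define NUM_SLOTS   (POOL_BYTES / TILE_BYTES)

    typedef struct {
        int      tile_id;     /* which tile of the huge texture lives here */
        unsigned last_used;   /* LRU stamp, 0 = slot empty                 */
    } Slot;

    static unsigned char pool[POOL_BYTES];   /* stand-in for eSRAM / VRAM cache */
    static Slot          slots[NUM_SLOTS];
    static unsigned      now;

    /* Return a pointer into the fast pool for the requested tile, pulling it
       from slow memory (or disk) only on a miss. */
    unsigned char *get_tile(int tile_id,
                            void (*fetch_slow)(int id, unsigned char *dst))
    {
        int victim = 0;
        now++;
        for (int i = 0; i < NUM_SLOTS; i++) {
            if (slots[i].last_used && slots[i].tile_id == tile_id) {
                slots[i].last_used = now;                 /* hit */
                return pool + (size_t)i * TILE_BYTES;
            }
            if (slots[i].last_used < slots[victim].last_used)
                victim = i;                               /* oldest so far */
        }
        /* Miss: evict the least recently used slot and fetch the tile. */
        fetch_slow(tile_id, pool + (size_t)victim * TILE_BYTES);
        slots[victim].tile_id   = tile_id;
        slots[victim].last_used = now;
        return pool + (size_t)victim * TILE_BYTES;
    }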
I think pretty graphics were never his intention. If he wanted to make another PC-exclusive, power-hogging game like Crysis he could've done it, but he didn't; he decided to show off his megatexture tech, which had never been done before. "You only as relevant as you games imo" - funny you say that. Going by his game portfolio, Carmack would be considered the father of the FPS.
LOL, the top-selling game in the world, COD, is still using an engine he created in 1999.
IT IS hardware tiling. http://images.anandtech.com...
I don't think Carmack is a game developer per se any longer; I think he's a programmer who provides tools for coding. The PC is his background, and he is pleasantly surprised at how close the Xbone and PS4 are and how much alike they are to working on the PC now. Memory will not be a problem any longer, it seems, and it's highly unlikely we will ever see a Wii U game from them.
So then why do we listen to Cerny?
"You only as relevant as you games imo." Yeah, let's just pretend he's not one of the literal fathers of FPS as we know it. Seriously, do people even do research or anything anymore before spouting BS like this?
I really wish I read that over, not because I don't fully stand behind what I said but because the "you" instead of "you're" and "your" makes me cringe at an otherwise accurate statement imo. Nolan Bushnell and Ralph H. Baer may be the fathers of video games and consoles, but people wouldn't take their opinions on success in the next-gen very seriously. Is Carmack a legend? Yes. Does he know how to make great games and the best graphics? Until he releases another great AAA game, no.
Something like MegaTextures is already entering with DX11.2 and tiled resources. So yes, they're here.
But Carmack used MegaTextures with Rage a few years ago. He knew it would be very useful in the near future. :) I hope we will see Doom 4.
He was one of the architects of using relatively slow system RAM with a fast on-chip buffer to overcome bandwidth limitations. But this is really only relevant if you have slow memory - or even worse, PCIe. It is what MS adopted and implemented in the XB1, but it's really useless if you have a full pool of GDDR memory, which will always beat it. It's a "paging" mechanism to "swap" tiles between a fast "cache" (usually VRAM) and system RAM (or over PCIe). Too slow for streaming. The main reason is that PCs are still VRAM limited, and there is a huge bottleneck "swapping" textures to system memory. His research was based on the finding that a lot of high-detail textures only use a small portion of the actual texture, and instead of swapping the whole texture this could be optimized by swapping "tiles" instead. This can be software driven (e.g. PS360) or HW (with some sort of GPU MMU). And DX11.2 supports this at an API level. The whole idea goes back as far as the Permedia 2 chips, or how PowerVR uses tile-based rendering in HW.
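A tiny sketch of the software side of that tile-swapping idea - how a lookup into one huge virtual texture gets redirected into a small physical tile cache. All sizes and names here are made up for the example:

    /* Illustrative sketch of the indirection step in software virtual
       texturing: a page table maps each tile of one huge "virtual" texture
       to a slot in a small physical tile cache. */
    #define VT_TILES_X    256      /* virtual texture is 256x256 tiles */
    #define VT_TILES_Y    256
    #define TILE_TEXELS   128      /* each tile is 128x128 texels      */

    typedef struct { int cache_x, cache_y, resident; } PageEntry;

    static PageEntry page_table[VT_TILES_Y][VT_TILES_X];

    /* Translate a texel address in the huge virtual texture into a texel
       address inside the small physical cache. Returns 0 if the tile is not
       resident - the caller would then request it and fall back to a lower
       mip in the meantime, which is roughly what shows up as pop-in. */
    int virtual_to_physical(int vx, int vy, int *px, int *py)
    {
        int tile_x = vx / TILE_TEXELS, in_x = vx % TILE_TEXELS;
        int tile_y = vy / TILE_TEXELS, in_y = vy % TILE_TEXELS;
        PageEntry e = page_table[tile_y][tile_x];
        if (!e.resident)
            return 0;
        *px = e.cache_x * TILE_TEXELS + in_x;
        *py = e.cache_y * TILE_TEXELS + in_y;
        return 1;
    }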
@ju it's as relevant to next-gen texturing as scalable tessellation is to pushing high polygon counts. If you consider that every texture this gen will have several layers of textures and translucency at a high resolution, you can see how important high-to-low texture swapping will be.
We'll see. More relevant if you don't have the bandwidth using standard mip maps to begin with. But, sure, whatever reduces bandwidth requirements. I doubt we'll see tiled renderers within GDDR, though - at least not with the current gen of HW.
We may see a more advanced version of tile resources in the upcoming Dx11.5/12
It's true graphics only go so far, it's gameplay that matters, because if the game sucks it sucks no matter how good it looks.
Dem true statements
Agreed, except people tend to twist that statement depending on the game to favor their opinions.
Crysis 3, for example: all eye candy but no substance.
Actually Killzone 2 was damn great.
Hopefully the next Bethesda studio games will use something similar to the megatexture tech, since id Software is part of Bethesda. Too many times the textures in Skyrim and Fallout seemed so repetitive and boring.
Just based on Rage, I felt mega texture would have been better for a game where you didn't get too close to the textures. Like a game where you flew "helicopters" or giant mechs. Anything that kept you at a medium distance. At that distance the textures looked great and the advantages of megatexture could be appreciated. Wouldn't be good for flight sims as the performance just wasn't that great. That all assumes the tech doesn't evolve. The "idea" of megatexture seems to have promise but in Rage I really didn't see any benefits. File size was huge, the lighting was meh, and up close the textures were ugly.
I know a lot of people regard Carmack as some kind of programming messiah but in reality he is proving to be a bit irrelevant as of late. Rage was beautiful and fun on consoles but that was years ago now. It also took Id a long time to finish that game. They need to focus on showcasing their dev engine more prominently and consistently like Epic and Crytek. Why haven't we seen Doom 4 yet? Carmack is indeed a programming beast but he needs to stop holding these Quakecons and get to work on Doom. I would love to play the sh!t out of that game.
Doom 4 has been scrapped and sent back to the drawing board numerous times because the devs and some publishers all kept fighting over things and it was generally agreed that the quality of the game wasn't up to a level that they liked. The game is still being developed, they will just show more of it when they get ready. As for Quakecon, it's a convention that happens once a year for just a few days at that... It's not like it's really cutting into the games' development or anything. Doom 4 will be shown and released when id feels like it.
I sensationalized my response for the sake of effect. I know what, when and where QuakeCon is and I am fully aware of the protracted development tailspin Doom is currently in. The meat of my argument pivots on the fact that Id is inconsistent and disorganized. Again, why haven't we seen anything of Doom 4? Carmack waxes prophetic every year about the future of gaming but barely contributes. Doom 3, Rage, uh...9 years and they only have 2 games to show for it. What Carmack and crew should do is stop being so pedantic and start making games. I couldn't care any less about Id's internal struggles. They need to get the lead out and realize that nostalgia will only get them so far in an ever advancing development environment. They aren't even developing Wolfenstein. So again, where is Doom 4? What the hell are they doing? Figure the sh!t out. I know we have to wait for Carmack's next opus but by the time they are finished we will probably have 2 to 3 more COD games and a Duke Nukem sequel. Id is becoming irrelevant as a development studio.
Your question was answered by both me and yourself. We haven't seen Doom 4 because of internal strife and it being re-booted several times. You *should* care about the internal troubles because that's the main reason why idS hasn't been outputting many games in the last decade. In a recent interview, they said they aren't gonna do any new IP's anytime soon because they wanna focus on getting Doom 4 released. The new Wolfenstein is coming, which was outsourced primarily to another studio so they could focus on Doom 4 as well. They said they haven't forgotten about Quake, and that they don't know what they'll do with it and Rage, but basically Quake and RAGE are still alive. So just be patient. It's not as bad as people who are waiting on Half-Life 3 or anything (even though Doom 3 came out around the time HL2 did). At least we know Doom 4 is really a *thing*.
I guess I just can't excuse their ineptitudes. When I pose the question why, again, I am being hyperbolic for effect. I am always wary of projects slipping into the aether. Id is a longstanding and storied developer and I just expected more. I guess I have no other option but to be patient. I just hope this doesn't go the route of Half Life ep 3.
How many developers are buying his tech? Yeah, that's right, none.
Well, there's a game called The Evil Within being built on the id Tech 5 engine.
Wolfenstein too. Now pretty much an internal Bethesda engine
@baldilf id Tech 3: Infinity Ward's engine was developed off of id Tech 3 (an engine made in 1999), so every Call of Duty since COD2, Wolfenstein, the James Bond games, and basically every shooter from Activision uses an engine Carmack designed back then (none of those games hold the crown for top 10 graphics, but that 14-year-old engine held its own). id Tech 4: Wolfenstein 2009, Prey, and Brink. id Tech 4.5: while not a real engine, this is basically what Infinity Ward's "NEW" engine is upgraded to, so Call of Duty: Ghosts and more than likely all following COD games. id Tech 5: Rage, Wolfenstein (next-gen), The Evil Within, Doom 4. All games that use / buy this tech.
Besides that list which that guy just posted which pwned you, id Tech 5 isn't available for external licensing. Besides id Software themselves, only internal Zenimax studios will be able to do things with it until years from now when the engine is released as open source, like the engines before it.
Yeah, an impressive list of games built on a 14-year-old engine and a bunch of internal Bethesda projects. I say it again: no one is buying his tech. id is at death's door as a company and Carmack is trying to sound relevant in a world that has already forgotten about him.
I just wish John Carmack and Cevat Yerli from Crytek would concentrate more on "gameplay". Ok we know... you got graphics and hardware down to a tee... how about getting gameplay down too????
I like both lol, bet you play on PS3! Shooting in Rage was smoother than anything I've ever played.
When I hear people talk about Rage, I rarely hear how great that game looked but rather "wtf was up with that ending?"
Rage was missing the "play" more than the "game". You got excellent graphics but no sense of enjoyment from it. I'll be waiting for Doom 4, but if it sucks as much as Rage did then I'm all against MegaTextures, because gameplay always comes before graphics.
Can't wait to see it in The Evil Within.
I'm enjoying Rage on PC. Solid game, and I tweaked it to get rid of pop-in and to get higher-res textures. I'm enjoying it and have nothing to really complain about. Thank you Steam sales.
Rage is a decent game that has a scary atmosphere. I thought that the AI was brilliant - too brilliant as it was so easy to get in to a deadly situation. Some beautiful graphical effects if you can overlook some pop up textures.
I felt Rage was a decent game (albeit a little lacking in the gameplay dept, but still fun) if you had a nice rig to run it on. Mega-textures sound fine to this graphics whore's ears.
haha what's wrong with shitting on these kids that don't know shit? Look at the comments that I'm replying to. It's hilarious. And this is kind of amusing; that's probably the right word in our case. I thought you think I'm an immature little kid. Why are you still replying to me? Do you ever see grown-ups arguing with or mocking a kid they don't even know? They would just walk away. Look at you. All "grown up." You started the mocking and then got pissed off. Then blocked me. Stop being so angry. Even if you can't stop, you pretty much brought it upon yourself. LOL Now you are messaging me all this bullshit acting all grown up. That's pretty funny. My statement still stands: "Speak for yourself." Besides, you are insulting too. If we are looking at this technically, you are not better than I am. Along with your insults, you are just supporting my statement/message (moral of the story).
Your grammar and sentence composition are atrocious, and I'm getting a kick out of noticing you are trying really, really hard to make your point, a point I really couldn't care less about. You keep following any post I make to press your desperate need to be correct. Sorry to burst your bubble Mimi, I'm not mad or annoyed, or ever was for that matter, but if only to make you feel better about yourself and your moot opinion, then fine, I am steaming mad, you're right, I'm just a stupid kid, I'm just fuming, so mad I can't even type. There, feel better now? Give it a rest, you're making yourself look like a fool. Man, you are some kind of sociopath.