
Developer puts Unreal Engine 5 to the test with 10 billion polygons of dog

And Unreal barely broke a sweat.

RonsonPL 52d ago

Yeah. Awesome.
Meanwhile, how much unique geometry can it render without the framerate dipping below 60fps?
Take a guess:
a) also 10 billion
b) 1 billion
c) 100 million
d) 10 million
e) 1 million
f) none of the above. We're still in the shitty 100-500k range; a drunk optimist might say 1M (rendered, not "in scene"), but even that's probably too much. There's no hard data, because no game consists solely of unique geometry.

Instancing was already utilized in the 2004 Far Cry.
I watched the keynotes; I'm not an idiot who just doesn't understand. This is just stupid. No one needs 10 billion useless instances.

99% of the hype around ray tracing and UE 5.0 consists of things which are exciting only for developers: this lets you move assets around easily, that takes some of the work off your hands...
Meanwhile, as a gamer, I miss being hyped for things which are actually cool for GAMERS.
This "news" is worthless.

Suddenly, high-end cards that have existed since 2016 appeared on the radar of the mainstream gaming industry.
Suddenly SSDs appeared. As if PC gamers hadn't been using them for a decade.

And WOW! You can put a lot of data into the 24GB of memory on a 3090! What a f.. surprise! And you can stream data from an SSD better than from an HDD! And you can take better photos with drones and stuff in 2021.
Yeah, UE 5.0 is worth making 10,000s of news stories, including shit like this.

Give me a break. Or something which makes me excited from a gamer's perspective.
This can render textured polygons in real time! = exciting (I remember vector games)
This can do bumpmapping = exciting
This can do antialiasing or downsampling = exciting
vs.
This can help the movie industry, or cut the time and cost of game development so even more can be wasted on marketing, and it's not even viable for 120fps gaming.

No. Abandon this stupid hype train already.

Rebel_Scum 52d ago

"No one needs 10 billion useless instances."

idk man, Skyrim could sure use that much cheese to roll down the Throat of the World.

roadkillers 52d ago

I can only assume you're a graphic designer, game developer, or huge nerd... I did not understand what you said.

RonsonPL 52d ago

Just a guy who enjoyed gaming in its golden era (the '80s, '90s, and early 2000s) and who's not in love with the things served up to gamers as awesome nowadays, while being "meh" at best and actually backwards at worst.
Gaming journalism drums up the hype just as the companies ask it to. Digital Foundry is one example, and ridiculous, worthless "news" like this is another.
Well, OK, I'm a nerd too. :P
Or maybe more like "was". There were so many things to get excited about back in the day and so few nowadays that I just turned into a grumpy old ex-nerd, I guess ;)

sleepyhead62 52d ago

I don't know, man... Nanite essentially does away with authored LODs and replaces them with real-time LOD calculations. Theoretically, this removes asset pop-in entirely (which is still pretty evident in most games if you look closely).

That's something to be excited about.
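The pop-in point above can be illustrated with a toy model. This is a rough sketch with made-up thresholds and counts, not Nanite's actual algorithm: discrete authored LODs snap at fixed distances (the moment of snapping is the visible "pop"), while a continuous scheme scales detail smoothly with distance.

```python
# Toy comparison: discrete authored LODs vs. continuous detail selection.
# All names, thresholds, and triangle counts are illustrative assumptions.
AUTHORED_LODS = [(0, "LOD0_full"), (20, "LOD1_half"), (50, "LOD2_quarter")]

def pick_authored_lod(distance):
    """Classic approach: snap to the last LOD whose distance threshold
    we passed. The visible 'pop' happens when distance crosses a threshold."""
    chosen = AUTHORED_LODS[0][1]
    for threshold, name in AUTHORED_LODS:
        if distance >= threshold:
            chosen = name
    return chosen

def continuous_detail(distance, base_triangles=1_000_000):
    """Nanite-style idea (greatly simplified): detail falls off smoothly
    with distance instead of in big discrete steps, so nothing 'pops'."""
    return max(1, int(base_triangles / max(1.0, distance) ** 2))

# At 19.9 vs 20.1 units away, the discrete scheme pops between two models:
print(pick_authored_lod(19.9), pick_authored_lod(20.1))  # LOD0_full LOD1_half
```

The continuous version returns a slightly different triangle budget for every distance, which is why there is no single frame where the mesh visibly swaps.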

chronoforce 52d ago (Edited 52d ago)

Being able to render so many polygons at playable frame rates is a pretty big deal. Games are going to get far more detailed and no less interactive. That's enough reason for gamers to be excited.

Gaming has always lagged somewhat behind film and other industries in terms of 3D technology, and it is always exciting when technology that could previously only be used efficiently in offline or HPC workloads becomes possible on consumer hardware. By the time Blinn-Phong shading became widely used in games, the model must have been at least a decade old.

I can't see a difference between being hyped for Nanite and being hyped for bump mapping, a technique introduced to the world in 1978 but infeasible for video games at the time.

RonsonPL 51d ago (Edited 51d ago)

"Being able to render so many polygons at playable frame rates is a pretty big deal."

30fps on the most powerful console (the Xbox Series X will never be the target for console games until the next gen, or a PRO refresh at the end of this decade) is NOT a playable framerate. It doesn't exactly run fast on PCs either.
I can't tag people in responses, so I wrote more about it in a reply to Ryushaa. I elaborated there.

@sleepyhead62
same thing

Ryushaa 52d ago

TL;DR: Artists don't need to constrain their designs by polygon count. More detailed meshes can be instanced repeatedly in different positions to make more natural scenes.

You really don't know what you're talking about or what Nanite is meant to do, right? This has nothing to do with instances. You should really, I mean REALLY, first look up how Level of Detail (LOD) works for 3D models in games, then watch a video explaining what Nanite is.

Then there's the number of instances. Yes, Nanite by itself won't make that go up. But the instance count will probably rise because of SSDs, as we won't need to preemptively keep geometry loaded in VRAM.
Nanite can also help produce more detailed environments and scenes by instancing the same hi-res geometry and manipulating it in various forms: by rotating it, scaling it, and mashing one on another. Look up the 'Megapacks' Unreal made using only a handful of Megascans.
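The instancing idea in this thread comes down to simple bookkeeping. Here is a toy sketch (not any real engine API; the function and counts are made up) of why one hi-res mesh reused many times is so much cheaper than storing unique geometry for every copy:

```python
# Hypothetical memory model of instancing: one mesh stored once, plus a
# small per-instance transform, vs. a full duplicate per copy.
unique_mesh = list(range(100_000))  # stand-in for a 100k-vertex mesh

def instanced_scene(mesh, instance_count):
    """Return (instanced_cost, duplicated_cost) in 'numbers stored'.
    Instancing pays for the mesh once plus a 4x4 matrix per instance;
    duplication pays for the whole mesh again for every copy."""
    transform_floats = 16  # one 4x4 transform matrix per instance
    instanced = len(mesh) + instance_count * transform_floats
    duplicated = len(mesh) * instance_count
    return instanced, duplicated

inst, dup = instanced_scene(unique_mesh, 1000)
print(inst, dup)  # 116000 vs 100000000: ~1000x less storage for 1000 copies
```

This is exactly why "10 billion polygons of dog" is cheap when it's the same dog: the geometry budget barely grows with the instance count.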

RonsonPL 51d ago

yeah.. no :)

"Artists don't need to constrain their design by number of polygons" - Not true. It CAN be true
if all they care about is narrative, so only things close to the camera matter.
I've seen professionals saying we'd had enough geometry for everything since the X360/PS3 era. They wrote that there was no point in having more because you can't see the difference, and they showed a human character as the example.
But how about 120fps gaming? How about VR, especially with a wider FOV? How about large worlds other than those that can be built 90% by cloning/instancing, etc.?

"By rotating it, scaling and mashing one on another."
Yes, I'm aware. Remember early methods of rendering trees? Take the first Bayonetta, for example. It's the first thing that comes to mind when I read "rotate" ;)

About Megascans:
personally, I really don't care about realism, and I prefer things that are either nicer than realism (Far Cry 1's first level) or more useful (enemies being easier to see in UT '99 or BF2). So it's not a silver bullet for everything.

Too many laymen think the only next-gen things worth caring about are UE5 and RT.
I was already expecting a huge bump in asset variety thanks to 12GB + SSD anyway. We should've gotten this in 2013, with SSD-equipped PS4 PRO/ULTRA versions.
I also don't care how cool it is for movie creators to be able to move assets directly into game engines. I've spent a fair amount of time learning 3D Studio Max, and I can agree this will surely be fun.
But as a gamer, I get too little to be excited about, and UE 5.0 and ray tracing are definitely NOT it.
Let's see what UE 5.0 can do at 120fps (which should be the standard 35 years after 60fps became mainstream in computer games) with a variety of environments, with enemies, explosions, projectiles, and everything else. Then let's see what it can do. I really don't care what can be done at a miserable 30fps on a PS5 that can't even hold 1440p. Where are the 60fps demos? Nowhere? Oh...

Cost benefits
Let's say a game costs $300 million: $200M goes to marketing, CEO bonuses, etc. Let's say graphics design is $50M. That's 1/6 of the budget. Even if the engine is revolutionary and cuts that cost, it won't really matter to me when I buy a game. Also: $70-80 pricing.
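A quick sanity check of the back-of-envelope figures above (these are the comment's hypothetical numbers, not real budget data):

```python
# Verifying the claimed budget split: $50M graphics out of $300M total.
total_budget = 300_000_000
graphics = 50_000_000
share = graphics / total_budget
print(f"{share:.0%}")  # prints "17%", i.e. 1/6 of the budget as claimed
```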

An analogy: remember 2013? How awesome it was for game devs! How different from the difficult PS2 and PS3 times! Awesome! Let's not mention the shitty tablet CPU and the shitty HDD.

If I see another news story about how we should worship Sony for giving us an SSD, or Epic for giving us such a revolutionary engine, or about ray tracing, all of which are basically useless for 120fps gaming and "meh" at 60fps at best, I might lose my shit. ;)

Well, I put my P166 with 64MB of RAM to the test when I made a lamp in 3DS MAX in the '90s and cloned it 10 times. That meant more polygons in the viewport than I ever saw in any game. No news stories about that, though ;)
People got hyped for Cerny's presentation about the PS4. Or Carmack's keynotes. Remember megatexture technology? Yeah, sounded nice.
Now we have a UE5 demo with 1/3 of what's in real games, running below 1440p at 30fps on the most powerful console (Xbox Series X not relevant).
I'd rather get games at normal framerates (120fps for fast games, 60fps for slow ones). I want to stop getting bounced around in rally games because the "road" is still blocky as hell. I want VR and 3D, which give 1000x more than realistic lighting.
Show me Unreal Engine doing sophisticated framerate interpolation with minimal latency, so we can get 60→120fps interpolation or 480Hz gaming and finally leave motion blur behind. Then I'll jump in.
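The 60→120fps interpolation being asked for can be sketched in its simplest possible form. This is a minimal toy, assuming a plain linear blend between two already-rendered frames; real engine interpolation would use motion vectors and occlusion handling rather than a per-pixel blend:

```python
# Hedged sketch of frame interpolation: synthesize a midpoint frame between
# two rendered frames, turning a 60fps stream into 120fps output.
def interpolate(frame_a, frame_b, t=0.5):
    """Blend per-pixel values between two frames at time t in [0, 1].
    t=0.5 produces the frame halfway between them in time."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

rendered_60fps = [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]]  # two real frames
mid = interpolate(*rendered_60fps)                    # the inserted frame
print(mid)  # [0.5, 0.5, 0.5]
```

The latency problem the comment raises is visible even here: the interpolated frame can only be shown after frame B has been rendered, so naive interpolation always adds at least one source-frame of delay.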

airshiraz 52d ago (Edited 52d ago)

It's so crazy. It's like we have a PlayStation 8. Wow, just wow. I can't wait for Nanite to take off. No more bad draw distances, stupid LOD problems, game-quality assets... It's so wild. At last we have CGI-quality games, this time for real.

Duke19 52d ago

Alright, well, put your money where your mouth is. Build us a game that's just 10,000s of dogs lying in bed.

annoyedgamer 52d ago

And all will be lost when scaled down for consoles.