Tech Demo – Testing UE5, Nanite & Lumen

Hi there. My name is Bernhard. I am an Unreal Engine artist, born and raised in Austria, and I have been living and working in the U.S. for 18 years now. Like many others, I love film and cinematography. As a kid, I watched the first epic Mad Max movie with Mel Gibson, and its specific style and creative direction remain unique to this day.

In the last five years, a lot has changed in the entertainment industry. Nvidia released RTX, and Epic's flagship Unreal Engine 5 was announced with a major overhaul of its render engine: Nanite, Lumen, Metahumans, volumetric effects, and more, all at runtime.

Below, you can watch a very short clip I created over yet another weekend. So yes, I should call these weekend productions, because I also have a regular job that keeps me very, very busy. Being a 3D visualizer is fun, but it also requires a lot of hard work.

Short Teaser by Bernhard Rieder aka FattyBull using Epic’s Unreal Engine 5

I love cinematography, lighting, and look development, so I thought it would be really cool to create some fan art and a tech demo level. One important thing to say up front: I will not create a full fan game, because Mad Max is a copyright-protected title, and simply put, I am not allowed to do that.

And second, I wouldn't have the time to create a real full game. This is just a small level I created to test the new Unreal Engine 5, and that's about it.

When I showed my short clip to a few of my friends, they told me it was a nice offline ray-traced 3D animation. “Hmm… but wait a second!” I responded. “This is not an offline render. This is an in-game cinematic short, it runs in real time, and I hit a solid 60 FPS on my RTX 2080.”

How the heck is that possible, especially when the game level contains millions and millions of polygons built from high-resolution scanned data provided by the Quixel Megascans team? How can that all be true?

Well, I couldn't believe it either when I ran my first tests using the unofficial beta release of UE5. But once I installed the new engine and tried out Nanite and Lumen: kaboom.

Behind the Scenes

I started with a new level and focused first on setting up the environment. For that purpose, I used Quixel's scanned assets; see below for a screenshot of the Megascans.

3D Environment Design

Once I had the landscape roughly done, I started working on some look development. Let there be light, right? lol; I decided to go with a totally different tonal direction than the one used in Fury Road. As we all know, Fury Road was very saturated, with the vibrancy of the colors cranked up quite a bit.

I wanted to create something a bit different while making sure the overall look and feel still fits the Mad Max theme. And of course, I was curious to test all the new features Unreal Engine 5 has to offer.

For my level, I used the built-in SunSky plugin. It's a Blueprint that already contains a Skylight, a Directional Light, and the Sky Atmosphere. In order to use all the integrated volumetric effects, you need to turn on volumetrics within your project settings.

On top of that, I used an Exponential Height Fog to get as much realism as possible. All these actors combined let me play digital weather and atmosphere god.
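As an illustration, this kind of volumetric setup is typically enabled project-wide via console variables in DefaultEngine.ini. A minimal sketch using stock UE5 variable names (check them against your engine version; exact settings depend on your project):

```ini
; DefaultEngine.ini — sketch, not taken from this project
[/Script/Engine.RendererSettings]
r.VolumetricFog=1          ; let Exponential Height Fog actors render volumetrically
r.SupportSkyAtmosphere=1   ; required by the Sky Atmosphere inside the SunSky Blueprint
```

The same switches can also be flipped in the editor under Project Settings > Rendering, which is usually the safer route.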

Checking overall Lighting and Global Illumination
Kudos to Harald Wraunek, one of my friends in Austria, who does an amazing job scanning high-resolution assets
The visualization buffers help me see what every pass contributes

I also use every option the PostProcess Volume and the camera settings offer: adding glare, changing the aperture blades, swapping lenses, etc. In real life, I could never afford all those fancy cameras. But in the digital world, heck yeah.
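Those digital camera settings mirror real photography, so the standard exposure-value formula applies to them. A minimal Python sketch (the f-stop, shutter, and ISO values here are illustrative, not taken from the project):

```python
import math

def ev100(f_stop: float, shutter_s: float, iso: float = 100.0) -> float:
    """Exposure value normalized to ISO 100: EV100 = log2(N^2 / t * 100 / S)."""
    return math.log2((f_stop ** 2) / shutter_s * (100.0 / iso))

# Example: f/2.8 at 1/60 s, ISO 100 — a brighter scene needs a higher EV
print(round(ev100(2.8, 1 / 60), 1))
```

Unreal's physical camera and auto-exposure settings work in these same EV100 units, which makes it easy to cross-check digital settings against real-world exposure tables.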

Metahumans for Game Production

Please find below a short clip that gives a peek behind the curtains of the level creation.

For my characters, I also use Epic's Metahumans. You can use Bridge to launch the creator, and once you are happy with the result, you can export the fully rigged and skinned character to your game.

Inside the Unreal Engine Sequencer, I can use the Control Rig for the body and the face to create poses. And yes, if you are an animator, you can manually animate your character using the Control Rig and the Sequencer as a combo.

Metahumans for VR Production – Aerobics from the 80s

Yes, I loved the 80s and their wonderful TV aerobics sessions. Guess what? It's coming back soon, this time fully immersive. That's right. Please check out my crazy VR development process for the Oculus Quest 2 if you haven't seen it yet.

My mom always told me that stretching your legs helps with your mobility
As you can imagine, I believe everything my mom told me. So I highly recommend doing some aerobics.

Metahumans for Animation using Motion Capture and Facial Expressions

For the game production, it would take me too long to animate the characters using the Control Rig. Also, if you want to achieve really good animations, you need to be a real master of character animation. That's an art, and it's very time-consuming.

That led to the decision to rely mainly on motion-capture performances and facial-expression capturing. Ha, and there is one master in that field. See how lucky I am that I know this master. lol; Let me introduce you to my friend Gabby, also known as Feeding_Wolves. Her full name is Gabriella Krousaniotakis. We found each other in the metaverse, and since then, well, let's put it this way: we can't let go of each other anymore. lol;

We have so much fun working together on our ideas that the way we do things feels natural and fluent. Gabby records all the mocap and the facial expressions we use for our animations. And let me assure you: it's not just about wearing a jumpsuit, acting, and recording. There is a ton of other stuff you need to consider and set up before you get to that point.

We can use the captures for gameplay at runtime, but also for offline animations, shows, events, and of course for our VR experiences. Check out Gabby's latest performance tests below.

Gabby records everything in real time. Once we have our takes, we bring them into the Sequencer and create our edits, actions, and whatever else we need to tell our stories and entertain our audience. This is how we combine our Jedi forces and create our current metaverse. And maybeeeeeee….. we do some other stuff in between.. lol; but hey, spoiler alert!

Metahuman Control Rig

Yes, I was just having some fun playing around with the Metahuman Control Rig.

Game & Asset Creation

Since UE5 and Nanite allow me to use assets with very high polygon counts, I don't use super low-poly models anymore. For the V8 model, I used Blender and Substance Painter. I also like to use Marmoset Toolbag. Its baking process is a bit different, but I love that it includes a render engine as well.

That helped me visualize my model instantly, not just by checking my textures in the viewport, but also by seeing how the entire model will look inside a render engine.

Mad Max Interceptor V8 Model textured in Substance Painter

V8 Interceptor scenes from the set

Some other early, quick vehicle concepts done in ZBrush

Not quite ready yet for the annual Mad Max Wasteland Party in California

VR Unreal Engine Artist

On top of that, I am working really hard on the latest possibilities with VR and Metahumans. I never thought I would have so much fun as a VR Unreal Engine artist. I am pretty happy with my latest results, achieving high-quality rendering while maintaining high performance using forward shading.

You can follow my VR development process on my Slack channel. I am testing with the Oculus Rift S and, of course, with the Quest 2. Can't wait until the Quest 3 comes out, possibly with an LED screen? Wuuuhaaaaaa…. lol;

You'll find all my latest virtual reality development tests on my VR blog, especially shader and material tests for maximum performance. I am currently profiling a ton of different methods to hit a solid 11 ms, i.e. 90 FPS, while also running Air Link.
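For reference, the 11 ms figure is just the frame-time budget that a 90 Hz headset imposes. A trivial check across common VR refresh rates:

```python
# Frame-time budget in milliseconds for common VR refresh rates
for fps in (72, 90, 120):
    budget_ms = 1000.0 / fps
    print(f"{fps} FPS -> {budget_ms:.1f} ms per frame")
```

Every pass (shadows, translucency, particles, UI) has to fit inside that budget, which is why the profiling below is done feature by feature.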

My latest forward-rendering performance tests cover the following features:

  • Distance Field Shadows
  • DX12, Raytracing and GPU Lightmass
  • Emissive Material
  • Translucency to render the “cheapest” Glass Material possible
  • Normal Maps, Displacement Maps
  • Volumetric Fog
  • Niagara Particle Effects
  • Optimized Shaders for Metahumans
  • Hair Solutions
  • Real Time Cloth Simulation (UE5 Native)
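For context, a forward-rendering VR test like this usually starts from a handful of project settings. A hedged sketch using stock UE5 console variable names (verify against your engine version; the values here are illustrative defaults, not my project's exact configuration):

```ini
; DefaultEngine.ini — forward shading baseline for VR (sketch)
[/Script/Engine.RendererSettings]
r.ForwardShading=1             ; forward renderer, which enables MSAA in VR
r.MSAACount=4                  ; 4x MSAA instead of temporal anti-aliasing
r.GenerateMeshDistanceFields=1 ; prerequisite for distance-field shadows
```

Each feature in the list above is then toggled on and profiled individually against the frame budget, since costs like volumetric fog and translucency add up quickly on mobile-class VR hardware.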

Ohhh.. and before I forget.. shout out to all my annual Mad Max Wasteland Party fans. This year, I'll be there as well… still working on my Mad Max outfit. And guess what.. I could use some serious help with the costume design. If you can help me with that.. please DM me on Slack and let me know, that would be epic.

Please check out my blog for more information and my current VR development process. What else is there to say? Feel free to download all my free Unreal Engine projects for learning and study purposes, or just download some of my game experiences that you can run on your computer.

May The Pixel Be With You, Have Fun, and talk to you soon. Happy Pixeling, Bernhard aka FattyBull