We wanted to share what it has been like, and still is, to work on a VR game compared to a traditional one. For world creation in particular, it makes a big difference, because the experience of using a VR device changes almost everything we know about making games.
Contrary to what many people think, ideas evolve and consolidate as we work on the game. We watch them grow and change, and sometimes the changes need to be drastic if it's for the good of the game. Below is a collection of screenshots taken during the two months of development for the August build:
There is a lot going on behind the scenes, but the basic formula is to take the idea we have on paper and put it right away inside Unreal Engine. Most ideas won't work as intended, especially in VR, since the experience is completely different. The process comes down to iterating, seeing what works and what doesn't, and continuing to evolve the idea.
In this article we are going to talk about the considerations needed to create VR worlds inside Unreal Engine.
Most people don't realize it, but creating VR games can be extremely difficult because you need to maintain a high frame rate; in our case, a minimum of 90 frames per second for the game to work. Otherwise it can cause motion sickness and make the experience an unenjoyable one for the player. While the hardware required to run these types of games is very powerful, we still need to optimize a lot to achieve high frame rates.
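To make the 90 FPS target concrete, here is a minimal sketch of the frame-time budget it implies. The function name is ours, not part of any engine API:

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to produce a single frame at the target rate."""
    return 1000.0 / target_fps

# At 90 FPS, everything -- game logic, draw calls, GPU work -- must fit
# in roughly 11.1 ms per frame, versus ~16.7 ms for a traditional 60 FPS game.
vr_budget = frame_budget_ms(90)
flat_budget = frame_budget_ms(60)
```

That ~11 ms ceiling is why every optimization discussed below matters so much more in VR than on a flat screen.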
Texture Size
This isn't exclusive to VR games, but we always need to take texture size into consideration, since it can decrease the frame rate significantly if not handled well. The challenge here is to reduce texture resolution while keeping the same amount of detail: in a VR game players can look at objects very closely, so we need to make sure the detail stays there.
Fortunately, Unreal gives us many tools for optimizing textures. Many developers wonder how big a texture should be, and it really varies from project to project, since the same object won't have the same impact in a First Person Shooter versus a Third Person Action game, and VR is no exception. The good news is that we can easily find the required size with the Required Texture Resolution view mode.
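To see why texture resolution matters so much, here is a back-of-the-envelope memory estimate for an uncompressed RGBA8 texture with a full mip chain. This is our simplified arithmetic, not Unreal's exact accounting (compression formats change the absolute numbers, but not the ratio):

```python
def texture_bytes(size_px: int, bytes_per_texel: int = 4, mips: bool = True) -> int:
    """Approximate memory for a square texture, optionally with its mip chain."""
    total, s = 0, size_px
    while s >= 1:
        total += s * s * bytes_per_texel
        if not mips:
            break
        s //= 2
    return total

# Halving resolution from 2048 to 1024 cuts memory (and bandwidth) by ~4x,
# which is why dropping one unnecessary resolution step is such a cheap win.
mb_2048 = texture_bytes(2048) / 2**20
mb_1024 = texture_bytes(1024) / 2**20
```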
This is how the object looks in the game:
And with the Required Texture Resolution view mode on, we can see how much texture resolution is required depending on how much screen space the object occupies.
With this mode you can see how much texture resolution is required depending on the type of game you are making. The problem with VR games is that most objects have the potential to occupy a lot of screen space, since the player sees them at real-life scale. That means that even when the object is close, we need to maintain the same quality. To achieve this, we can use detail textures to give the player the resolution they are looking for while keeping the base textures at a low resolution. This can be done easily with the DetailTexturing function inside the Material Editor.
You can see the difference between the same object with detail textures applied and without them. With this, we are able to achieve high-quality graphics while keeping texture resolution low and increasing the frame rate, which is our primary goal.
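The rule of thumb behind the Required Texture Resolution check can be sketched as: a texture only needs about one texel per screen pixel the object covers. This is our own simplified heuristic, not the engine's exact formula:

```python
def required_texture_size(object_screen_px: int) -> int:
    """Smallest power-of-two texture size giving ~1 texel per covered pixel."""
    size = 1
    while size < object_screen_px:
        size *= 2
    return size

# An object spanning ~600 screen pixels is well served by a 1024 texture;
# shipping a 4096 texture for it wastes memory without adding visible detail.
```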
Level of Detail
This is a huge one. While testing the game's performance without the VR headset, you may notice that frame rates look fine, probably above 90 FPS. However, when you test with the headset, performance can change drastically. We noticed that using Levels of Detail (LODs) on all our models increased our frame rates by a considerable amount. While it may seem obvious, you can't find the problem until you test with the headset on, and this one was a huge bottleneck.
Like everything inside Unreal, this can be easily achieved in the Static Mesh Editor using the LOD feature, which automatically reduces the triangle count and calculates the screen size at which each LOD should appear. This can save hours of work, since it calculates everything for you and assigns the correct distance to each LOD.
As you can see, the highest LOD uses 1014 triangles for the 3D model and the lowest uses around 100. While the silhouette is terribly wrong when the object is close, it works perfectly well from a distance. For Stage 3: Azaria we used 5 LODs for all models, and it increased the frame rate significantly.
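The LOD switching described above can be sketched as a simple threshold lookup: each LOD carries a screen-size threshold, and the smaller the object appears, the coarser the mesh that gets rendered. The thresholds and intermediate triangle counts below are invented for illustration; only the 1014/100 endpoints come from the model discussed above:

```python
# (screen-size threshold, triangle count), from finest to coarsest.
LODS = [
    (0.50, 1014),  # LOD0: full detail, used when the object fills the view
    (0.25, 500),   # hypothetical intermediate LODs
    (0.12, 250),
    (0.00, 100),   # coarsest LOD, used at a distance
]

def lod_triangles(screen_size: float) -> int:
    """Triangle count of the LOD selected for a given screen-size fraction."""
    for threshold, tris in LODS:
        if screen_size >= threshold:
            return tris
    return LODS[-1][1]
```

The payoff is that distant objects, which dominate most scenes, cost a tenth of the triangles while looking identical through the headset.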
Draw Calls
In our experience, the best thing you can do to optimize a VR game is to reduce draw calls. As with everything else, you won't notice frame rate problems until you wear the headset, but if you use Unreal's tools to analyze the rendering statistics, you can see that draw calls are the biggest bottleneck for a VR game. They are less of a problem for PC and console projects, and we knew they were a huge one on mobile devices, but we were surprised that they have a heavy impact on VR as well, even when the machine is powerful enough.
It is easy to find out how many draw calls your game issues by using Unreal's optimization tools through console commands. These are the ones we use most to track performance issues:
- Stat scenerendering
- Stat unit
- Stat fps
These give us an overview of how many milliseconds it takes to process a frame on the GPU and the CPU, broken down into game code and draw calls.
As you can see, draw calls appear to be the smallest problem here, but if you wear the headset you will find they are actually the biggest bottleneck. You can also check how many draw calls your scene has and work to reduce them; for Stage 3: Azaria we went from 2000 draw calls to 300. The easiest way to reduce them is inside Unreal. The typical workflow would be something along the lines of: export content from the engine to a 3D package, merge objects inside the 3D package, create new LODs, then reimport and replace the old meshes with the new ones. Like many things in Unreal, you can reduce these steps to a single click! Just use the Merge Actors feature and you are good to go.
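The arithmetic behind merging can be sketched with a simplified model in which each mesh/material pair costs one draw call, so merging meshes that share a material collapses many calls into one. The mesh names and material name below are invented for illustration:

```python
def draw_calls(meshes: list[tuple[str, str]]) -> int:
    """One draw call per mesh in this simplified (mesh, material) model."""
    return len(meshes)

def merged_draw_calls(meshes: list[tuple[str, str]]) -> int:
    """After merging, roughly one draw call per unique material."""
    return len({material for _, material in meshes})

# 50 separate column meshes sharing one stone material: 50 calls before
# merging, 1 call after.
columns = [(f"column_{i}", "M_CastleStone") for i in range(50)]
```

This is why the gains are largest for repeated props like columns: many instances, few materials.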
You can see in this image that we have merged some of Heorogan's Castle columns, reducing draw calls from roughly 200 to 4.
This is what the Merge Actors window looks like, and it does all those steps in a single click! The best part is that it also merges LODs, so you don't need to create them again. Our advice is to create LODs before reducing draw calls, since you will get the new LODs for free.
Advice and final words
If we had to list the insights we gained from experience, these would be the most important:
- Always test with a VR headset; you can't know how well or badly the game performs until you try it with the headset on.
- Epic has a lot of resources and information about VR development; be sure to read there first. Most likely they have had the same problem you are experiencing and have a solution for it.
- Use the tools provided by Epic: performance statistics, LOD generation, Merge Actors, and the list goes on. These tools really speed up the process and help you focus on what matters most: the game.
- Avoid dynamic lighting whenever possible.