
Friday, June 30, 2017

Unity versus Unreal

This topic is as divisive as the US 2016 presidential election, so I'll tread carefully.

As a middleware maker, we have no business playing favorites. We do our best to keep the Voxel Farm integrations on par with each other so we reach as many users as possible. As an individual, I see no problem stating I prefer Unreal, but that is only because it is an all-C++ environment. It is not a rational preference.

This post, however, is not about how I feel. It is rather about the state of the two engines and how much they facilitate procedural generation and working with voxel data. I think many of the issues we have encountered over the past few years are common if you are doing a similar type of work with these engines. Hopefully, our story can help.

Let's start with the visuals. Both Unity and Unreal are capable of rendering beautiful scenes. Both are also able to render at very high frame rates, even for fairly complex content. This has likely been the lion's share of their R&D for years now. Unity has one crucial advantage over Unreal: it natively supports texture arrays. Unreal almost supports them; in fact, we managed to make them work in a custom branch of UE4 with little effort. However, this is not possible with the out-of-the-box Unreal distribution. That is a dealbreaker if your middleware is to be used as a plugin, as is our case.

Texture Arrays in Unity allow precise filtering and high detail

Texture arrays make a big difference if you need complex materials where many different types of surfaces must be splatted in a single draw call. When an engine lacks texture array support, you must fall back to 2D atlasing. This raises a whole host of issues, like having to pick mip levels yourself and wasting precious memory on padding textures to avoid bleeding. When you hit this low point, you begin to seriously question your career choices.
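To give an idea of what "picking mip levels yourself" means, here is a small sketch of the kind of math involved. This is not our shader code, just plain C++ mirroring what a fragment shader would do with the UV derivatives, and the padding figure is made up for illustration:

#include <algorithm>
#include <cmath>

struct Vec2 { float x, y; };

// Sketch only: manual mip selection when splatting materials from a 2D atlas.
// In a real shader the UV derivatives come from ddx/ddy; here they are inputs.
float AtlasMipLevel(Vec2 dUVdx, Vec2 dUVdy,  // UV derivatives per pixel, in atlas space
                    float atlasSize,         // atlas width in texels, e.g. 4096
                    float paddingTexels)     // padding around each tile, e.g. 16
{
    // Standard LOD estimate: log2 of the largest texel footprint.
    float fx = std::sqrt(dUVdx.x * dUVdx.x + dUVdx.y * dUVdx.y) * atlasSize;
    float fy = std::sqrt(dUVdy.x * dUVdy.x + dUVdy.y * dUVdy.y) * atlasSize;
    float lod = std::log2(std::max(std::max(fx, fy), 1.0f));

    // Clamp so at least one texel of padding survives at the chosen mip,
    // otherwise neighboring tiles bleed into the filter footprint.
    float maxLod = std::log2(std::max(paddingTexels, 1.0f));
    return std::min(lod, maxLod);
}

With texture arrays, none of this bookkeeping exists: each slice is filtered and mip-mapped by the hardware as if it were a standalone texture.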

If your application uses procedural generation, it likely means the contents of the scene are not known while the application is in design mode. This is at odds with how these engines have evolved to work. If your application allows users to change the world, it only gets worse. For the most part, both engines expect you to manage new chunks of content in the main thread. This is something that, if left unattended, can cause severe spikes in your framerate.
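For reference, the usual workaround has roughly the shape of the sketch below. This is not the actual plugin code, only the general idea: chunk meshes are produced on a worker thread, and the engine's main thread drains a small number of finished meshes per tick, so the cost of creating engine-side resources is spread over many frames.

#include <mutex>
#include <queue>
#include <vector>

struct ChunkMesh { std::vector<float> vertices; std::vector<int> indices; };

class ChunkBuilder {
public:
    // Called from the worker thread once a chunk mesh is ready.
    void Push(ChunkMesh m) {
        std::lock_guard<std::mutex> lock(mutex_);
        ready_.push(std::move(m));
    }
    // Called from the engine's main thread / game tick. UploadFn creates the
    // engine-side mesh or mesh section; only a few are processed per frame.
    template <typename UploadFn>
    void DrainSome(UploadFn upload, int maxPerFrame = 2) {
        for (int i = 0; i < maxPerFrame; ++i) {
            ChunkMesh m;
            {
                std::lock_guard<std::mutex> lock(mutex_);
                if (ready_.empty()) return;
                m = std::move(ready_.front());
                ready_.pop();
            }
            upload(m);
        }
    }
private:
    std::mutex mutex_;
    std::queue<ChunkMesh> ready_;
};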

There are multiple aspects involved in maintaining a dynamic world. First, you must update the meshes you use to render the world. This is fairly quick in both engines, but it does not come free. Then, you must create collision models for the new geometry. Here Unreal does better. Since you have closer access to the PhysX implementation, you can submit a much simpler version of the content. In Unity, you may be stuck using the same geometry for rendering and for collision. (EDIT: I was wrong about this, see the comments section.) From reading their latest update, I see this motivated the Card Life developers to ditch PhysX collisions altogether.

Card Life, made in Unity, features a hi-res voxel world

Voxel Farm allows players to cut arbitrary chunks of the world, which then become subject to physics. Unity was able to take fragments of any complexity and properly simulate physics for them. Unreal, on the other hand, would model each fragment as a cube. Apparently, PhysX is not able to compute convex hulls on its own, so for any object subject to physics you must supply a simplified model. Unity appears to create these on the fly. For Unreal, we had to plug in a separate convex hull generation algorithm. Only then could we get the ball rolling, literally.
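For what it's worth, here is a rough sketch of one way to produce such a simplified model. This is not the exact algorithm we plugged into UE4; it only illustrates the idea of reducing a fragment's vertices to a small point set (the extreme points along a few dozen sampled directions) that the physics cooking step can then turn into a convex shape.

#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Sketch: keep the vertex that reaches farthest along each of a set of
// directions sampled over the sphere. The result is a small point cloud that
// still roughly bounds the fragment.
std::vector<Vec3> SimplifiedHullPoints(const std::vector<Vec3>& verts, int directions = 26)
{
    std::vector<Vec3> result;
    if (verts.empty()) return result;
    for (int i = 0; i < directions; ++i) {
        // Golden-angle spiral gives a reasonably uniform direction set.
        float t = (i + 0.5f) / directions;
        float phi = std::acos(1.0f - 2.0f * t);
        float theta = 2.399963f * i; // golden angle in radians
        Vec3 dir{ std::sin(phi) * std::cos(theta),
                  std::sin(phi) * std::sin(theta),
                  std::cos(phi) };
        std::size_t best = 0;
        float bestDot = -1e30f;
        for (std::size_t v = 0; v < verts.size(); ++v) {
            float d = verts[v].x * dir.x + verts[v].y * dir.y + verts[v].z * dir.z;
            if (d > bestDot) { bestDot = d; best = v; }
        }
        result.push_back(verts[best]);
    }
    return result; // may contain duplicates; dedupe before handing to the cooker
}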

When it comes to AI and pathfinding, both engines appear to use Recast, a third-party navigation mesh library. Recast uses voxels under the hood (go voxels!), but this aspect is not exposed by its interface. For a voxel system like ours, it is a bit awkward to be submitting meshes to Recast, which are then voxelized again and ultimately contoured back into navigation meshes. This is not bad, just messy. There is one key difference here between Unreal and Unity. Unreal will not let you change the scope of the nav-mesh solution in real time. That means you cannot have the nav-mesh scope follow the player across a large open world. It is unfortunate, since this is a tiny correction if you can modify the source code, but again, for a plugin like Voxel Farm that is not an option.

Dynamic nav-mesh in UE4

This brings me to the last issue in this post, which is the fact that Unreal is open source while Unity is closed. As a plugin developer, I find myself surprised to think a closed-source system may be more amenable to plugin development. Here is my rationale: so far the open-source model has been great for letting us discover why a given feature will not work in the official distribution. You can clearly see the brick wall you are about to hit. For application developers, open source works better because you can always fork the engine code and remove the brick wall. The problem is this takes the pressure off, and the brick wall stays there for longer. In Unity, both application and middleware developers must use the same version of the engine. I believe this creates an incentive for a more complete interface.

I'm sure there is more to add to this topic. There are some key aspects we still need to cover for both engines, like multiplayer. If you find any of our issues to be unjustified, I would love to be proven wrong, for the betterment of our little engine. Just let me know by dropping a comment.

Tuesday, May 2, 2017

Plugin Status

I'm happy to see our team of excellent developers chez Voxel Farm has made quick and substantial improvements to our plugins for Unreal Engine 4 and Unity 5.

The UE4 plugin is now a proper UE4 plugin, no longer just an integration example. This opened a whole new set of possibilities. In a very short time, we were able to put together this video from different scenes and interaction modes within UE4:


The new Voxel Farm UI makes it quite simple to add Voxel Farm to any existing or new project. There is a button that will do it for you, requiring only that you point it to the target project:


The plugin already offers blueprint access for typical tasks like block editing, voxelization and physics. The threading model is much better, resulting in a smoother experience.

There is a new demo for UE4, now including the plugin:


If you want to get a feel for how the plugin is used in UE4, these topics will help.

If you are thinking Unity gets no love, you would be wrong. Most of our recent efforts went into the Unity plugin. I will cover this in my next post.


Thursday, March 23, 2017

Destroy The Block

"Destroy The Block" is a new demo we put together to showcase the new Unreal Engine 4 plugin. In this post, I will go over what this took.

The demo will soon be included in Voxel Farm's demo package; meanwhile, here is a video:



If 20 minutes of that were not enough, here is an earlier video of just driving around town in different cars:



We did not create this town model. It was a Minecraft import. Following a comment in this blog by Piotr Kucharczyk, I took a look at Minecraft's Anvil format. It turned out to be quite reasonable and easy to use. 
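For anyone curious about what "reasonable and easy to use" means in practice, this is roughly what reading a region file header looks like. It is a from-memory sketch of the format (a region file stores 32x32 chunks; the first 4 KiB is a table of chunk locations; each chunk payload is zlib-compressed NBT), so details may vary between Minecraft versions, and the NBT parsing itself is left out:

#include <cstdint>
#include <cstdio>
#include <vector>

// One entry per chunk: where its data lives inside the .mca file,
// both values in units of 4 KiB sectors.
struct ChunkLocation { uint32_t sectorOffset; uint8_t sectorCount; };

std::vector<ChunkLocation> ReadRegionHeader(const char* path)
{
    std::vector<ChunkLocation> locations;
    FILE* f = std::fopen(path, "rb");
    if (!f) return locations;
    uint8_t header[4096];
    if (std::fread(header, 1, sizeof(header), f) == sizeof(header)) {
        locations.reserve(1024);
        for (int i = 0; i < 1024; ++i) {       // 32x32 chunks per region
            const uint8_t* e = header + i * 4;
            ChunkLocation loc;
            loc.sectorOffset = (uint32_t(e[0]) << 16) | (uint32_t(e[1]) << 8) | e[2];
            loc.sectorCount  = e[3];
            locations.push_back(loc);
        }
    }
    std::fclose(f);
    return locations;
}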

After a few hours of work, I was able to see Minecraft levels in Voxel Studio. I started with the King's Landing model, curious whether our systems would cope with such a complex model. It turned out to be alright:


This model was not a good option for several reasons, mainly that it was too crowded for any first-person gameplay to happen. Maybe riding a voxel dragon and setting the city on fire, but that would be too obvious.

So we settled on the town level. The natural environment, which is not blocky but rather smooth, was created with a single Smart Biome in Voxel Studio in a few minutes:


We then imported this project into Unreal Engine 4, using the new Voxel Farm plugin. It took some time to figure out what would be the right scale for the scene. Since Voxel Farm's voxels are much smaller, the default configuration felt closer to a Godzilla/Kaiju simulator. That would have been a nice demo, but I was looking for a more human-level experience.

Minecraft levels may appear simple in the mind's eye, but a level like this town is insanely complex. All buildings, without exception, have intricate interiors. Here you can see a cross section of a residential tower:


As you can see, each apartment is fully defined; they even have little beds!

To further complicate things, the draw distance needed to be insanely high so detail like windows would appear when viewed from far away:


This tall building is 1.5 km away, but it is still rendered in full detail. The player can use the sniper scope at any time, and the switch must be immediate. There is not enough time to load a higher-definition version of the building.

Mesh optimization really helped here. Any surface that contains multiple voxels of the same material can be heavily optimized. The following image shows how this makes a big difference in triangle counts:


I do not think vanilla Minecraft does this. Just thinking about how many triangles they need to push gave me a new sense of respect for their rendering engine.
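To make the idea concrete, here is a toy version of that optimization. Voxel Farm's mesh optimizer is more general than this, but the core trick can be shown on a single slice of voxel faces: runs of the same material are merged into wide quads, so a flat wall of identical blocks emits a handful of quads instead of one per voxel.

#include <vector>

struct Quad { int x, y, width, material; };

// Illustration only: merge horizontal runs of identical material in one slice
// of a voxel face grid. slice[y][x] holds a material id, 0 means empty.
std::vector<Quad> MergeRuns(const std::vector<std::vector<int>>& slice)
{
    std::vector<Quad> quads;
    for (int y = 0; y < (int)slice.size(); ++y) {
        const std::vector<int>& row = slice[y];
        int x = 0;
        while (x < (int)row.size()) {
            int material = row[x];
            if (material == 0) { ++x; continue; }
            int start = x;
            while (x < (int)row.size() && row[x] == material) ++x; // extend the run
            quads.push_back({ start, y, x - start, material });
        }
    }
    return quads; // a full greedy mesher would also merge runs vertically
}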

The main goal of this demo was to tune the UE4 plugin and in particular the physics. We also spent some effort making sure the whole scene, including terrain, would load in 10 seconds or so. The demo's behavior and interaction with the Voxel Farm plugin were done using UE's blueprints. There is not a single line of C++ in this project. 

I can say the demo is quite fun. My girls have spent countless hours just driving around and destroying stuff. At the beginning, they were quite afraid of breaking anything, as if the police would come after them. Once they realized there were no consequences, they were able to fully unleash their destructive instincts.

One last thing I would like to point out in these videos is what you cannot see: LOD changes and framerate hitches. This is achieved thanks to our new scene management system, which I began to cover here and here (third and final part of this series coming next).

The Minecraft import should become a standard Voxel Farm feature soon, also depending on the interest we see around it. If you would like to do the same for one of your projects, just let us know.

It was great that this entire exercise was triggered by a reader's comment to another post. As usual, I look forward to your comments and feedback.

Thursday, February 2, 2017

The very-far-away

The previous post described a new system that allows rendering the rich surfaces we call "meta-materials" using low-resolution geometry. Meta-materials cover viewing distances from roughly 100 meters down to 10 meters. What about anything more distant than that, that is, the range from tens of kilometers down to 100 meters?

It turns out the same system applies. You can think of this as "meta-meta-materials"; we just do not call them that because one "meta" in a name is already too much. We have multiple objects that fit that description. A terrain biome is one example.

In this post, you can see the results of applying this method to biome objects. All images are in faux solid color, which we use to make sure feature placement is correct.

Here is a single biome and the amount of geometry it takes to represent it:


In order to capture the detail, this biome also uses 1024x1024 texture maps for diffuse color, normals and the other maps required for physically based rendering. Terrain voxels, which are generated on the fly, emit UV coordinate pairs that link the voxel's position in the world to the right section of these texture maps.
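The mapping itself is simple. The sketch below is not the actual Voxel Farm code, just a minimal version of the idea: project the voxel's world position into the biome's footprint and normalize it into the [0,1] range of the biome's texture maps.

struct UV { float u, v; };

// Minimal sketch: map a voxel's world position to the biome's texture space.
UV BiomeUV(float worldX, float worldZ,
           float biomeMinX, float biomeMinZ,   // biome footprint in world space
           float biomeSizeX, float biomeSizeZ)
{
    UV uv;
    uv.u = (worldX - biomeMinX) / biomeSizeX;
    uv.v = (worldZ - biomeMinZ) / biomeSizeZ;
    // Clamp so voxels at the biome border still land inside the maps.
    if (uv.u < 0.0f) uv.u = 0.0f; else if (uv.u > 1.0f) uv.u = 1.0f;
    if (uv.v < 0.0f) uv.v = 0.0f; else if (uv.v > 1.0f) uv.v = 1.0f;
    return uv;
}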

Here you can see multiple biomes in the same image, again in faux color, covering an area of approximately 3000 square kilometers:


Since most detail is contained in textures, it is possible to use much coarser geometry. The following images show that we can crank up the mesh simplification and still obtain fairly good-looking features:


If you are a creator of worlds, this feature is entirely transparent to you. These detail textures are automatically generated. Actually, all the content in these images was generated by our procedural algorithms, but even if you had custom-made maps, you would not need to be concerned about creating and maintaining the detail textures.

As I said in the previous post, this is a technique frequently used in modern polygon-based terrain rendering. The key here is that it now works on voxel terrains. These environments can be modified in real time by players. They can harvest materials, dig trenches, even blow out entire craters in real time.

Monday, January 30, 2017

Prettier, Faster Terrains

We will be updating the terrain systems in Voxel Farm soon. Hopefully, terrain will get a lot prettier and faster. It is not often that you get improvements in these two, for the most part opposing, directions. In this case, it seems we got lucky.

This is thanks to a synergy between two existing aspects of the engine that happen to play together really well. One is UV-mapped voxels; the other is meta-materials.

Here is how it works: a single meta-material describes a type of terrain, for instance a mountain cliff. Within this single meta-material you may find different materials. In the case of a cliff, those could be exposed rock, mossy rock, grass, dislodged stones, dirt, etc. An artist gets to define how the meta-material surface is broken down into these sub-materials. The meta-material also has a volumetric definition, which is a displacement map that can be carefully tied to the sub-material map.
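A minimal sketch of what that data might look like (this is not Voxel Farm's actual data model, only an illustration of the idea): the sub-material breakdown and the displacement are two maps sampled with the same coordinates, which is what keeps them tied together.

#include <cstdint>
#include <vector>

struct MetaMaterial {
    int width = 0, height = 0;
    std::vector<uint8_t> subMaterial;   // width*height, index into the sub-material list
    std::vector<float>   displacement;  // width*height, offset along the surface normal
};

struct SurfaceSample { uint8_t subMaterial; float displacement; };

// Sample the meta-material at a surface point; u and v are assumed to be in [0,1].
SurfaceSample Evaluate(const MetaMaterial& mm, float u, float v)
{
    int x = (int)(u * (mm.width  - 1) + 0.5f);
    int y = (int)(v * (mm.height - 1) + 0.5f);
    int i = y * mm.width + x;
    return { mm.subMaterial[i], mm.displacement[i] };
}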

When you are close to the meta-material's surface, it must be rendered as full geometry. This is because features in the meta-material, let's say a rock that sticks out, can measure up to dozens of meters. This content must be made of actual voxels so it can be harvested, destroyed, etc. It is not just a GPU displacement trick.

As you get farther away from these features, using geometry to capture detail becomes expensive. You face the hard choice of keeping a high geometry density or dialing down the geometry and losing detail.

The new terrain system can dial down geometry but keep the appearance of detail by using automatically generated textures for the meta-material. For the close range, it still uses geometry to capture detail, but at a certain distance the meta-material displacement can be represented with just a normal map. High-resolution sub-material textures for grass, rock, etc. are not needed anymore. A single color map is able to capture the look of the meta-material from this distance. These are only a few extra maps that can be reused anywhere in the scene where the meta-material appears.
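In rough code form, the per-region decision looks something like the sketch below. The texture names and the distance threshold are made up for the example; they are not engine constants.

#include <string>
#include <vector>

struct TerrainTextures { std::vector<std::string> maps; };

// Illustration only: which textures get bound for a terrain region, depending
// on how far it is from the viewer.
TerrainTextures SelectTerrainTextures(float distance, float nearRange /* e.g. 100.0f */)
{
    if (distance < nearRange) {
        // Near: real displaced geometry, splatted high-resolution sub-materials.
        return { { "rock_albedo", "rock_normal",
                   "grass_albedo", "grass_normal",
                   "dirt_albedo", "dirt_normal" } };
    }
    // Far: coarse geometry; one baked color map and one baked normal map
    // capture the whole meta-material and can be reused wherever it appears.
    return { { "cliff_meta_albedo_baked", "cliff_meta_normal_baked" } };
}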

The following image shows a single meta-material that uses geometry for the close range and texture maps for the medium-far ranges:


The colors in the wireframe view show where each method is applied. Just by comparing the triangle densities you can see this saves a massive amount of geometry:


This method is not new in terrain rendering; however, it is quite new in a voxel terrain. It is all possible thanks to the fact that our voxels can have UV coordinates. Voxels output by the terrain component in the green area have UV coordinates. These coordinates make sure the normal, diffuse and other maps created to render the meta-material at this distance match the volumetric profile and sub-material patterns of the meta-material up close.

The beauty of it is that this works with any type of terrain topology, not just heightmaps. If you are doing caves, the walls, ceilings and floor are very distinct meta-materials, and they would all benefit from this method. And it should all be automatic: we can turn this system on, and it won't require artists to create any new assets.

We are still figuring out how to solve some kinks in the system, but so far I am very pleased with the results. I will be posting more pictures and videos eventually.