Photogrammetry Breakdown: Retopology and Re-UVing in zBrush

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

This is, in my opinion, one of the trickiest parts of the process. In theory, one could think that, thanks to Nanite, it’s now possible to import the assets produced by Metashape directly into UE5. Although this is certainly an option, I don’t use it for several reasons. First of all, I don’t like the way Metashape produces UVs and I prefer to re-UV my meshes. Second, even though Nanite supports very high-poly meshes, one shouldn’t go too high, so I try to keep my high-poly assets at something still very dense but reasonable for the Nanite scene I’m building. In theory I could reduce the number of polygons in Metashape, but since I need new UVs anyway, I prefer to do all of this in zBrush. I don’t know if this is the best way to proceed and I’m still trying a few variations, so, if you have some ideas, please share them in the comments.

In zBrush I have experimented with different approaches; the one shown here is the one that works for me in most cases. First, I import the mesh into zBrush and check that its topology is OK (no flying polygons). Then I duplicate the mesh, so I can work on the copy and project from the original. Even though my target is around 1 million points for importing the asset into Unreal as a Nanite asset, I need something low-poly to start with in order to do the UVs in a simple way. Going straight from the very high-poly, triangulated mesh that Metashape produces to a low-poly quad mesh with zRemesher often causes zBrush to crash (at least in my experience), and I have also noticed that zRemesher sometimes softens the sharp features of the object’s shape. For this reason, I use the Decimation Master plugin to reduce the mesh to around 10-15k ActivePoints. The mesh produced by Decimation Master is triangulated and its density isn’t uniform, but it follows the shape of the original as closely as possible. Then I use zRemesher to produce a quad mesh with roughly the same number of ActivePoints, and I use the UV Master plugin to generate some nice UVs. Once this is done, I project from the original mesh onto the new one, subdividing the latter a few times until I reach the level of detail I want. In a few cases, I’ve skipped the retopology with zRemesher and done the UVs, subdivisions and projections directly on the triangulated mesh.
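I do this part by hand in zBrush, but if you prefer to script the decimation and automatic-UV step, a rough equivalent can be put together with Blender’s Python API. This is only a sketch of an alternative route, not my zBrush process; the file names are examples and an automatic unwrap will not match what UV Master produces.

```python
import bpy

# Import the photoscan exported from Metashape (example file name).
# Recent Blender versions (3.2+) use wm.obj_import for OBJ files.
bpy.ops.wm.obj_import(filepath="rock_scan.obj")
obj = bpy.context.selected_objects[0]
bpy.context.view_layer.objects.active = obj

# Reduce to roughly 10-15k faces, similar in spirit to the Decimation Master step.
mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
mod.ratio = min(1.0, 15000 / max(len(obj.data.polygons), 1))
bpy.ops.object.modifier_apply(modifier=mod.name)

# Quick automatic UVs on the reduced mesh (a stand-in for UV Master).
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project(angle_limit=1.15, island_margin=0.003)
bpy.ops.object.mode_set(mode='OBJECT')

# Export the low-poly, UVed mesh for the projection and baking steps.
bpy.ops.wm.obj_export(filepath="rock_lowpoly.obj", export_selected_objects=True)
```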

Next: De-lighting


Photogrammetry Breakdown: Reconstructing the Mesh in Metashape

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

To generate the 3D mesh from the photos, we use the Agisoft Metashape software.

First, create a new project and add the photos to it. There is no such thing as “too many photos” in photogrammetry: the more, the better.

Once the photos are added, we proceed to align them. I’ve found that the default values proposed by Metashape are suboptimal in most cases. For this reason, I increase the key point limit and the tie point limit to something like 40k. I set the accuracy to “high” and check “Generic preselection” to reduce the processing time. In some cases, I use masks too and apply them to tie points. In that case, it’s enough to set up the mask on only a few photos; Metashape is able to use it for all of them. If the mask is applied to key points instead, each photo needs its own mask for it to work. I usually use masks on tie points when I scan an object by rotating it on an almost uniform background. In my limited experience, masks are not useful, or even counterproductive, when scanning a big rock out in nature.

Align photos
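For what it’s worth, these alignment settings can also be driven from Metashape’s Python API, which is available in the Professional edition. This is a minimal sketch with example paths, and the exact argument names vary a bit between Metashape versions:

```python
import glob
import Metashape

# Example folder containing the exported DNGs.
photos = sorted(glob.glob(r"D:\scans\rock_01\dng\*.dng"))

doc = Metashape.Document()
chunk = doc.addChunk()
chunk.addPhotos(photos)

# Roughly the settings described above: "High" accuracy (downscale=1),
# generic preselection, and key/tie point limits raised to about 40k.
chunk.matchPhotos(
    downscale=1,
    generic_preselection=True,
    reference_preselection=False,
    keypoint_limit=40000,
    tiepoint_limit=40000,
)
chunk.alignCameras()

doc.save(r"D:\scans\rock_01\rock_01.psx")  # example project path
```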

Once the photos are aligned, we see a cloud of points and we can already redefine the region of the scan, reducing it to the object of interest without, however, making it too tight. After that, we need to build the dense cloud. I usually set the quality to “high”. If, for some reason, it’s not the first time I’m building it, I check “Reuse depth maps” to save some time.

Build dense cloud

Once the dense cloud is built, we can finally build the mesh. This is the step in the whole process where I change the default parameters the most. As source data, I set “Depth maps”, as I have already built them in the previous step. This saves processing time. I set the quality and the face count to “high” and check “Reuse depth maps” again.

Build mesh
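If you script this part, the depth maps and mesh steps look roughly like this in the same Python API sketch, continuing from the aligned chunk above (argument names again depend on the Metashape version):

```python
# "High" quality depth maps (downscale=2; 1 would be "Ultra high").
chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.MildFiltering)

# The dense cloud is optional when the mesh is built from depth maps;
# uncomment to inspect it. (buildDenseCloud() in Metashape 1.x,
# buildPointCloud() in 2.x.)
# chunk.buildPointCloud()

# Build the mesh from the depth maps with a high face count.
chunk.buildModel(
    surface_type=Metashape.Arbitrary,
    source_data=Metashape.DepthMapsData,
    face_count=Metashape.HighFaceCount,
)

doc.save()
```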

After some waiting, I have the mesh.

Generated 3D mesh.

At this point, I select and delete all the parts I don’t need, paying special attention to eliminating all flying polygons (they can cause a lot of problems in the retopology and UVing process later).

Cleaned mesh

Once the cleaning is roughly done, I check the mesh statistics and fix them. I usually refine my cleaning and recheck the Mesh Statistics a few times before I’m really satisfied with the result.

Mesh Statistics popup, click on Fix Topology

Once I’m done with cleaning, I use the “Close Holes” tool. I set the level quite high, but not at 100%, so that any small holes present in the mesh get closed without closing up the bottom of the rock.

Close Holes

Once I think I’m done with editing in Metashape, I build the texture. I usually set a very high resolution because Metashape usually generates a lot of small UV islands. Later, in zBrush, I generate a more acceptable UV map and then transfer the texture using xNormal. The final texture that I import into Unreal Engine can definitely be smaller than the one generated in Metashape.

Textured mesh
Texture

Finally, I export the mesh with its texture.
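To close the scripted sketch of this Metashape part, the UVs, texture build and export would look roughly like this (the texture size and export path are examples):

```python
# Let Metashape build its own (fragmented) UVs and bake a large texture;
# the proper UVs and the final, smaller texture come later in zBrush/xNormal.
chunk.buildUV()
chunk.buildTexture(blending_mode=Metashape.MosaicBlending, texture_size=16384)

# Export the mesh together with its texture for the retopology step.
chunk.exportModel(path=r"D:\scans\rock_01\rock_01.fbx")

doc.save()
```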

Next: Retopology and Re-UVing in zBrush


Photogrammetry Breakdown: Preparing Photos

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

To prepare the photos for use in photogrammetry software like Metashape, we follow these steps:

  • As our camera stores the raw photos in CR3 format, the first thing we do is convert them to DNG using the free Adobe DNG Converter tool (a small batch-conversion sketch follows this list);
  • Create the color checker profile using the “Colorchecker Camera Calibration” program, then save it in “C:\Users\UserName\AppData\Roaming\Adobe\CameraRaw\CameraProfiles”, which is actually the default location;
Step 1: Create the profile
  • Import the photos into Lightroom;
  • Edit the color checker photo: in the Develop module, apply the profile and set up the white balance;
Step 2: Set the profile in Lightroom
Step 3: Drag the White Balance pipette onto the white square for landscape on the color checker photo.
Step 4: Now that the profile and WB are corrected, we can sync the color profile and the white balance to all concerned photos.
  • Sync these edited properties to all concerned photos;
  • Bulk export the edited photos as DNG so they can be used in Metashape.
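If there are a lot of CR3 files, the conversion in the first bullet can be batched; Adobe DNG Converter accepts command-line switches, and a small script like the one below can run it over a whole folder. The install path, folders and flags are assumptions to check against Adobe’s documentation for your version.

```python
import subprocess
from pathlib import Path

# Assumed default install path on Windows; adjust to your machine.
DNG_CONVERTER = r"C:\Program Files\Adobe\Adobe DNG Converter\Adobe DNG Converter.exe"

SRC = Path(r"D:\scans\rock_01\raw")   # folder with the CR3 files (example)
DST = Path(r"D:\scans\rock_01\dng")   # output folder (example)
DST.mkdir(parents=True, exist_ok=True)

for cr3 in sorted(SRC.glob("*.CR3")):
    # -c = lossless compressed DNG, -d = output directory
    subprocess.run([DNG_CONVERTER, "-c", "-d", str(DST), str(cr3)], check=True)
    print("converted", cr3.name)
```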

Next: Reconstructing the Mesh in Metashape


Photogrammetry Breakdown: Taking Photos

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

Strong sunlight and water are not good for photogrammetry, so we try to shoot on a cloudy, dry day. Once we have found the subject, let’s say a rock or a wall, we first place the Colorchecker (we use the Xrite Colorchecker Passport) near the subject and take a good, big photo of it; that photo will be used for calibrating the color and white balance later.

Good big photo of the Colorchecker on the rock we’re about to scan.

Then we take many photos of the subject from every side, overviews and details too. We use a fixed 50mm lens, or a zoom lens without changing the zoom during shooting. We take pictures in RAW format. If you use JPEG, make sure your white balance does not change during shooting (i.e. not Auto). The subject should be sharp, so you may want to adjust the aperture accordingly. More photos are better; with too few photos the program may fail to work or produce a blurry scan. Typically you want to see every point of the subject in many photos, so something like 100 pictures is not too much; some people even take thousands of pictures! Anyway, once your scan is complete you can delete them all.

Read more: official tutorial about taking pictures for Metashape

Next: Preparing Photos


Photogrammetry Breakdown: Importing in UE5

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

This is the last step of our photogrammetry workflow. At this point we have the mesh to import, SM_Mesh.fbx, and either its base color texture T_Mesh_D.tga, or its combined color-roughness texture and the normal map (T_Mesh_DR.tga and T_Mesh_N.tga).

First we’ll import the mesh. Don’t forget to check Build Nanite and uncheck Build Lightmap UVs.

Mesh import dialog
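When importing many scans, the same settings can be applied through the editor’s Python scripting (with the Python Editor Script Plugin enabled). Here is a sketch with example paths; the build_nanite flag on the FBX import data is a UE5 addition, so check that it exists in your engine version:

```python
import unreal

# FBX import options mirroring the dialog above:
# Build Nanite on, lightmap UV generation off.
options = unreal.FbxImportUI()
options.import_mesh = True
options.import_materials = False
options.import_textures = False
options.static_mesh_import_data.build_nanite = True
options.static_mesh_import_data.generate_lightmap_u_vs = False
options.static_mesh_import_data.combine_meshes = True

task = unreal.AssetImportTask()
task.filename = r"D:\scans\rock_01\SM_Mesh.fbx"       # example path
task.destination_path = "/Game/PhotoScans/Rock01"     # example content folder
task.automated = True
task.save = True
task.options = options

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```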

We then use the Modeling Tools plugin to modify the pivot and sometimes the scale, but remember, the Nanite mesh should be pretty big. If your mesh is too small, it will appear partly black because Unreal can’t correctly calculate the normals for the micro triangles.

Using the Modeling Tools plugin to modify the pivot.

The next step is to import the textures. This is pretty straightforward: just drag them into the Content Browser. Check that Unreal recognized the normal map correctly.

The final step is to create the material. We have a master material (inspired by Epic’s examples) which allows us to manipulate color, normals and roughness. We’ll create a new instance, set it up with our imported textures and tweak various parameters. Then we set that material as the default material of the mesh and voilà, it’s done!
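Creating and assigning the material instance can be scripted too. This is a rough sketch, assuming the master material and the imported assets already exist at the example paths below and that the master material exposes texture parameters named “BaseColor” and “Normal” (the names depend on your material):

```python
import unreal

asset_tools = unreal.AssetToolsHelpers.get_asset_tools()

# Create an instance of the master material (example asset paths).
parent = unreal.EditorAssetLibrary.load_asset("/Game/Materials/M_PhotoScan_Master")
mi = asset_tools.create_asset(
    "MI_Rock01", "/Game/PhotoScans/Rock01",
    unreal.MaterialInstanceConstant, unreal.MaterialInstanceConstantFactoryNew())
unreal.MaterialEditingLibrary.set_material_instance_parent(mi, parent)

# Plug in the imported textures; the parameter names are assumptions.
base_color = unreal.EditorAssetLibrary.load_asset("/Game/PhotoScans/Rock01/T_Mesh_DR")
normal = unreal.EditorAssetLibrary.load_asset("/Game/PhotoScans/Rock01/T_Mesh_N")
unreal.MaterialEditingLibrary.set_material_instance_texture_parameter_value(mi, "BaseColor", base_color)
unreal.MaterialEditingLibrary.set_material_instance_texture_parameter_value(mi, "Normal", normal)

# Set it as the default material of the imported mesh and save.
mesh = unreal.EditorAssetLibrary.load_asset("/Game/PhotoScans/Rock01/SM_Mesh")
mesh.set_material(0, mi)
unreal.EditorAssetLibrary.save_loaded_asset(mesh)
```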

Final result: the Nanite mesh and its material

Photogrammetry Breakdown: De-lighting

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

“De-lighting” actually means removing shadows from the texture, making a good base color texture to use in Unreal. We use the free Agisoft De-Lighter tool (official tutorial).

Before
After (de-lit)

You start by importing the FBX produced by Metashape (or another program, but it must come with a texture).

On the right side of the UI there are two main tools to remove shadows: “Remove cast shadows” and “Remove shading”. Remove shading is an automatic tool suitable for simple situations. We try it first and, if it works, use the result.

“Remove cast shadows” requires manual markup of lit and shadowed areas. You basically give examples of lit and shadowed areas in your picture, then the program tries to remove the shading. To start, we paint some yellow strokes in the lit areas and blue strokes in the shadows (you don’t need to paint them all; just try to cover all the materials, like here, where we try to mark all the different stone colors).

Original texture (1) with light/shadow mask.

If the brushes (2) are grayed out, double-click on the “Illumination map” (1) to activate them. That also works with processed models. Check the other icons on the toolbar: you can erase the marks, hide them and change the size of the brush. The space bar switches between paint and rotate modes.

Once the texture is annotated, use the “Remove cast shadows” button to run the algorithm. The Preview button uses 1/4 resolution to speed things up. The process can create some unnatural color and light variation; in that case, try different Highlight and Color suppression parameters to see which works better.

The de-lighting process can remove some dark colors that are not actually shadows (like dark spots on the rocks, paint, etc.). To fix that, you need to create another mask called the “Shadow scale map” and annotate those places with a third color (light blue). To create this mask, right-click on the model on the left and choose “Add shadow scale mask”. Then paint with the light blue color over the lit areas where you want to preserve dark details. To return to the illumination mask or another mask, double-click on it.

Shadow scale mask (1) and some places where we want to preserve dark spots (2)

Run the preview again and, once you are satisfied, run “Remove cast shadows” at full resolution.

Sometimes this first pass will not remove all the shadows. In that case you can repeat the procedure on the processed texture. First double-click on the Illumination map of the processed texture to activate it; your original light/shadow markup from the previous step will be copied there. What we did for this wall was to keep all the yellow marks but remove the blue ones (using the blue cross on the toolbar), then repaint them in the places that remained too dark (between the stones). Finally, we also create a new Shadow scale map and use it to protect the lit areas.

New illumination map (1) and shadow marks (3)
New Shadow scale map (2)

When the markup is done, use the “Remove cast shadows” button again and, if it works well, export the final result by right-clicking on it and choosing Export model. This exports a new FBX (which we don’t use) and a texture, which we’ll use in the next step. We usually export in TIFF or TGA format.

Read more: check the official tutorial; at the end there are links to downloadable examples.

Next: Preparing Textures


Photogrammetry Breakdown: Preparing Textures

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

In this step we prepare the textures to import into Unreal. Right now we have the original photo-scanned mesh, let’s call it Mesh.fbx, its de-lit base color texture Mesh_delit.tif, and the retopologized mesh SM_Mesh.fbx with new UVs prepared for import into Unreal. We will first bake a base color texture that matches the new mesh, then, optionally, create normal and roughness textures.

But before the baking begins, we open our two meshes together in Maya and check two things: they should be in the same place in the world (otherwise the bake will fail, of course), and they should be at least 1 m across. If the meshes are too small (a few centimeters across), the tiny high-poly triangles become microscopic, and that causes issues with normals both for baking and for Unreal. Even if the mesh is small in real life, you need to scale it up (you can scale it back down in your level in Unreal). Finally, we freeze the transforms so the rotation is 0 and the scale is 1.
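We do these checks by hand, but if you like scripting Maya, a small maya.cmds snippet can report the size and freeze the transforms (the object names and the scale factor are examples):

```python
# Run in Maya's Script Editor; object names are examples.
import maya.cmds as cmds

for name in ["Mesh", "SM_Mesh"]:
    # World-space bounding box: xmin, ymin, zmin, xmax, ymax, zmax.
    bb = cmds.exactWorldBoundingBox(name)
    size = max(bb[3] - bb[0], bb[4] - bb[1], bb[5] - bb[2])
    print(name, "largest dimension (scene units):", size)

    # If the object is only a few centimeters across, scale it up before baking,
    # e.g. by 10x (example factor). Remember to scale both meshes identically.
    # cmds.scale(10, 10, 10, name, relative=True)

    # Freeze the transforms so rotation is 0 and scale is 1.
    cmds.makeIdentity(name, apply=True, rotate=True, scale=True, translate=False)
```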

For baking the base color texture onto the new mesh and its UVs we use xNormal. Specify Mesh.fbx as the “High definition mesh”, Mesh_delit.tif as its “Base texture to bake”, and SM_Mesh.fbx as the “Low definition mesh”. Then, in the Baking options tab, specify the output file, here we will call it Mesh_xn.tga, check “Bake base texture” and hit the Generate maps button. This gives you the base color texture you can import into Unreal; we usually rename it to something like T_Mesh_D.tga (D for Diffuse).

Baking base texture in xNormal

To create the optional roughness and normal maps we use Substance Alchemist. Simply drag T_Mesh_D.tga into the program and it will generate various textures using its AI-based algorithms. We export the roughness and normal maps, then, for our Unreal material, we add the roughness as the alpha channel of the base color texture in Photoshop. In this case we save the combined texture as a new file called T_Mesh_DR.tga (R for Roughness). We call the normal map T_Mesh_N.tga.
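The Photoshop step of packing the roughness into the alpha channel is also easy to script. Here is a small sketch using the Pillow library; the roughness file name is an example, use whatever name Alchemist exported:

```python
from PIL import Image

# Base color baked with xNormal and roughness exported from Alchemist (example names).
base_color = Image.open("T_Mesh_D.tga").convert("RGB")
roughness = Image.open("T_Mesh_R.tga").convert("L")

# Both maps must have the same resolution; resize the roughness if needed.
if roughness.size != base_color.size:
    roughness = roughness.resize(base_color.size)

# Pack the roughness into the alpha channel and save the combined texture.
combined = base_color.copy()
combined.putalpha(roughness)
combined.save("T_Mesh_DR.tga")
```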

Alchemist also allows you to do the de-lighting, but it doesn’t take the mesh shape into account. It’s totally possible to use it for simple cases, for example when the mesh is very flat, but with more complex meshes and lighting scenarios the Agisoft De-Lighter works better. When we want to use Alchemist for de-lighting, we skip the De-Lighter step, use xNormal to bake the original photo-scanned texture, then feed it to Alchemist and produce the base color there as well.

Exporting textures from Alchemist

Next: Importing in UE5


26 photo-scanned rocks, stones and walls from the Swiss Alps are available as game assets for UE5 on ArtStation.