Photogrammetry Breakdown: Retopology and Re-UVing in zBrush

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

This is, in my opinion, one of the trickiest parts of the process. In theory, one could think that now, thanks to Nanite, it’s possible to import the assets produced by Metashape directly into UE5. Although this is certainly an option, I don’t use it, for several reasons. First of all, I don’t like the way Metashape produces UVs and I prefer to re-UV my meshes. Second, even if Nanite supports very high-poly meshes, one shouldn’t go too high, so I try to keep my high-poly assets at a polycount that is still very high but reasonable for the Nanite scene I’m building. In theory I could reduce the number of polygons in Metashape, but since I need new UVs too, I prefer to do this whole part in zBrush. I don’t know if this is the best way to proceed and I’m still trying a few variations, so if you have some ideas, please write them in the comments.

In zBrush I have experimented with different approaches; the one shown here is the one that works for me in most cases. First, I import the mesh into zBrush and check that its topology is OK (no flying polygons). Then I duplicate the mesh, so I can work on the copy and project from the original. Even though my target is around 1 million points in order to import the asset as a Nanite asset into Unreal, I need something low-poly to start with so I can create the UVs in a simple way. Going directly from the very high-poly, triangulated mesh that Metashape produces to a low-poly quad mesh with zRemesher often causes zBrush to crash (at least in my experience). I have also noticed that zRemesher sometimes causes a loss of sharpness in the object’s shape. For these reasons, I use the Decimation Master plugin to reduce the mesh to around 10-15k ActivePoints. The mesh produced by Decimation Master is triangulated and its density isn’t uniform, but it matches the shape of the original as closely as possible. Then I use zRemesher to produce a quad mesh with about the same number of ActivePoints, and I use the UV Master plugin to generate some nice UVs. Once this is done, I project from the original mesh onto the new one, subdividing the latter a few times until I reach the level of detail I want. In a few cases, I’ve skipped the retopology with zRemesher and done the UVs, subdivisions and projections directly on the triangulated mesh.
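As a side note, the initial reduction step can also be scripted outside zBrush. The sketch below uses PyMeshLab’s quadric edge collapse decimation as a rough stand-in for the Decimation Master pass; it is not the workflow described above, just a scripted alternative for that first reduction, and the file names and target face count are placeholders (it also assumes a recent PyMeshLab release, where the filter is named meshing_decimation_quadric_edge_collapse).

```python
import pymeshlab

# Load the high-poly triangulated mesh exported from Metashape
# ("rock_highpoly.obj" is a placeholder file name).
ms = pymeshlab.MeshSet()
ms.load_new_mesh("rock_highpoly.obj")

# Quadric edge collapse decimation: a rough scripted equivalent of the
# first Decimation Master pass. ~25k faces on a triangulated mesh is in
# the same ballpark as 10-15k ActivePoints.
ms.meshing_decimation_quadric_edge_collapse(
    targetfacenum=25000,
    preservenormal=True,    # reduce the chance of flipped faces
    preserveboundary=True,  # keep the open bottom edge of the rock intact
)

ms.save_current_mesh("rock_decimated.obj")
```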

Next: De-lighting



Photogrammetry Breakdown: Reconstructing the Mesh in Metashape

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

To generate the 3D mesh from the photos, we use the Agisoft Metashape software.

First, create a new project and add the photos to it. There is no such thing as “too many photos” in photogrammetry: the more, the better.

Once the photos are added, we proceed to align them. I’ve found the default values proposed by Metashape to be suboptimal in most cases. For this reason, I increase the key point limit and the tie point limit to something like 40k. I set the accuracy to “high” and check “Generic preselection” to reduce the processing time. In some cases I use masks too and apply them to tie points; in this case, it’s enough to set up the mask on only a few photos, and Metashape is able to use it for all of them. If, instead, the masks are applied to key points, every photo needs a mask for this to work. I usually use masks on tie points when I scan an object by rotating it on an almost uniform background. In my limited experience, masking is not useful, and can even be counterproductive, when scanning a big rock out in nature.

Align photos
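For those who prefer scripting, here is a minimal sketch of the alignment settings described above using the Metashape Python API (available in the Professional edition only; parameter names vary slightly between versions, and the photo and project paths are placeholders).

```python
import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()

# Add the prepared DNG photos (placeholder path).
chunk.addPhotos(glob.glob("D:/scans/rock_01/dng/*.dng"))

# Alignment settings described above: "High" accuracy (downscale=1),
# generic preselection, and key/tie point limits raised to 40k.
chunk.matchPhotos(
    downscale=1,
    generic_preselection=True,
    keypoint_limit=40000,
    tiepoint_limit=40000,
)
chunk.alignCameras()

doc.save("rock_01.psx")
```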

Once the photos are aligned, we see a cloud of points and we can already redefine the region of the scan, reducing it to the object of interest without, however, making it too tight. After that, we need to build the dense cloud. I usually set the quality to “high”. If, for some reason, it’s not the first time I build it, I check “Reuse depth maps” to save some time.

Build dense cloud
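Scripted, the same step might look like the sketch below (again Professional edition only; note that Metashape 2.x renamed the dense cloud to “point cloud”, so the method name depends on your version).

```python
import Metashape

doc = Metashape.Document()
doc.open("rock_01.psx")  # project saved after alignment (placeholder name)
chunk = doc.chunk

# "High" quality depth maps correspond to downscale=2
# (downscale=1 is "Ultra high").
chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.MildFiltering)

# Metashape 2.x; in 1.x the equivalent call is chunk.buildDenseCloud().
chunk.buildPointCloud()

doc.save()
```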

Once the dense cloud is built, we can finally build the mesh. This is the step in the whole process where I change the default parameters the most. As source data, I set “Depth maps”, since I have built them in the previous step; this saves processing time. I set the quality and the face count to “high” and check “Reuse depth maps” again.

Build mesh
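A hedged sketch of the same mesh reconstruction through the Python API (Professional edition; enum and parameter names as in the 1.6+ API):

```python
import Metashape

doc = Metashape.Document()
doc.open("rock_01.psx")  # placeholder project name
chunk = doc.chunk

# Build the mesh directly from the depth maps computed in the previous
# step, with a high target face count.
chunk.buildModel(
    source_data=Metashape.DepthMapsData,
    surface_type=Metashape.Arbitrary,
    face_count=Metashape.HighFaceCount,
)

doc.save()
```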

After some waiting, I have the mesh.

Generated 3D mesh.

At this point, I select and delete all the parts I don’t need, paying special attention to eliminating all flying polygons (they can cause a lot of problems in the retopology and UVing process needed later).

Cleaned mesh

Once the cleaning is roughly done, I check the mesh statistics and fix them. I usually refine my cleaning and recheck the Mesh Statistics a few times before I’m really satisfied with the result.

Mesh Statistics popup, click on Fix Topology

Once I’m done with cleaning, I use the “Close Holes” tool. I set the level quite high, but not to 100%, so that any small holes present in the mesh get closed without also closing the bottom of the rock.

Close Holes

Once I think I’m done with editing in Metashape, I build the texture. I usually set a very high resolution because Metashape tends to generate a lot of small UV islands. Later, in zBrush, I generate a more acceptable UV map and then transfer the texture using xNormal. The final texture that I import into Unreal Engine can definitely be smaller than the one generated in Metashape.

Textured mesh
Texture
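In script form, the UV generation and texture bake at a deliberately large size could be sketched as follows (Professional edition; the 16384 px size is only an example of “very high”, and texture-related parameters move around a bit between Metashape versions).

```python
import Metashape

doc = Metashape.Document()
doc.open("rock_01.psx")  # placeholder project name
chunk = doc.chunk

# Let Metashape create its own UVs (many small islands) and bake an
# intentionally oversized texture onto them.
chunk.buildUV(mapping_mode=Metashape.GenericMapping, page_count=1)
chunk.buildTexture(
    blending_mode=Metashape.MosaicBlending,
    texture_size=16384,
    fill_holes=True,
)

doc.save()
```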

Finally, I export the mesh with its texture.
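For completeness, the export step in the same hedged script style (the output file name is a placeholder; the exact export options depend on your Metashape version):

```python
import Metashape

doc = Metashape.Document()
doc.open("rock_01.psx")  # placeholder project name
chunk = doc.chunk

# Export the cleaned, textured high-poly mesh for the zBrush stage.
chunk.exportModel(path="rock_highpoly.obj", save_texture=True)
```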

Next: Retopology and Re-UVing in zBrush



Photogrammetry Breakdown: Preparing Photos

See all parts of this breakdown: Photogrammetry: making Nanite meshes for UE5

To prepare the photos for use in photogrammetry software such as Metashape, we go through the following steps:

  • As our camera stores the raw photos in CR3 format, the first thing we do is convert them to DNG using the free Adobe DNG Converter tool (this can also be done in bulk from the command line, see the sketch after this list);
  • Create the color checker profile using the “ColorChecker Camera Calibration” program, then save it in “C:\Users\UserName\AppData\Roaming\Adobe\CameraRaw\CameraProfiles”, which is actually the default option;
Step 1 Create the profile
  • Import photos in Lightroom;
  • Edit the color checker photo: in Develop mode, apply the profile and set up the white balance;
Step 2 Set profile in Lightroom
Step 3 Drag the White Balance pipette on the white square for landscape on the color checker photo.
Step 4 Now that the profile and WB are correct, we can sync the color profile and the white balance to all the concerned photos.
  • Sync these edited properties to all concerned photos;
  • Bulk export the edited photos as DNG in order to use them in Metashape.
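As mentioned in the first step above, the CR3-to-DNG conversion can be scripted. Below is a minimal sketch driving Adobe DNG Converter’s command-line interface from Python; the install path and the folders are assumptions for a typical Windows setup, and -c / -d are the documented flags for lossless compressed output and the output directory.

```python
import glob
import subprocess
from pathlib import Path

# Assumed default install location of Adobe DNG Converter on Windows.
DNG_CONVERTER = r"C:\Program Files\Adobe\Adobe DNG Converter\Adobe DNG Converter.exe"

# Placeholder folders for the raw CR3 files and the converted DNGs.
src_dir = Path(r"D:\scans\rock_01\cr3")
out_dir = Path(r"D:\scans\rock_01\dng")
out_dir.mkdir(parents=True, exist_ok=True)

for cr3 in glob.glob(str(src_dir / "*.CR3")):
    # -c: write lossless compressed DNG, -d: output directory.
    subprocess.run([DNG_CONVERTER, "-c", "-d", str(out_dir), cr3], check=True)
```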

Next: Reconstructing the Mesh in Metashape


26 photo scanned rocks, stones and walls from Swiss Alps are available as game assets for UE5 on Artstation.