In this post I’m going to show how I create a tree in SpeedTree 9 using the new photogrammetry features it offers.
For this example, I decided to make a birch, more precisely a Japanese white birch (Betula platyphylla), a tree I need for my Unreal Engine 5 “Shinsengumi HQ” project.
Here are some useful references about traditional Japanese architecture, in particular about roofing. First, nowadays there are 3 standard tiled roof types in Japan:
J-shaped tiles: these are Japanese-style tiles whose main tile is named 桟瓦 (sangawara); they are the evolution of the traditional 本瓦葺 (hongawara) roofing still used for temples and shrines, and they are used for traditional houses;
This is, in my opinion, one of the trickiest parts of the process. In theory, one could think that, thanks to Nanite, it’s now possible to directly import the assets produced by Metashape into UE5. Although this is certainly an option, I don’t use it for several reasons. First of all, I don’t like the way Metashape produces UVs and I prefer to re-UV my meshes. The second reason is that, even if Nanite supports very high-poly meshes, one shouldn’t go too high, so I try to keep my high-poly assets at a count that is still very high but reasonable for the Nanite scene I’m building. In theory I could reduce the number of polygons in Metashape, but since I need new UVs too, I prefer to do all of this in zBrush. I don’t know if this is the best way to proceed and I’m still trying a few variations, so if you have some ideas, please write them in the comments.
In zBrush I have experimented with different approaches; the one shown here is the one that works for me in most cases. First, I import the mesh into zBrush and check that its topology is OK (no flying polys). Then I duplicate the mesh, so I can work on the copy and project from the original. Even though my target is around 1 million points, so the asset can be imported into Unreal as a Nanite asset, I need something low poly to start with in order to do the UVs in a simple way. Reducing a very high-poly, triangulated mesh like the one Metashape produces to a low-poly quad mesh with zRemesher often causes zBrush to crash (at least in my experience). I have also noticed that zRemesher sometimes causes a loss of sharpness in the object’s shape. For these reasons, I use the Decimation Master plugin to reduce the mesh to around 10-15k ActivePoints. The mesh produced by Decimation Master is triangulated and its density isn’t uniform, but it follows the shape of the mesh as closely as possible. Then I use zRemesher to produce a quad mesh with roughly the same number of ActivePoints, and I use the UV Master plugin to generate some nice UVs. Once this is done, I project from the original mesh onto the new one, subdividing the latter a few times until I reach the level of detail I want. In a few cases, I’ve skipped the retopology with zRemesher and done the UVs, subdivisions and projections directly on the triangulated mesh.
To generate the 3D mesh from the photos, we use the Agisoft Metashape software.
First, create a new project and add the photos to it. There is no such thing as “too many photos” when it comes to photogrammetry: the more, the better.
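These GUI steps can also be scripted. Below is a minimal sketch using the standalone Metashape Professional Python module (the API is only available in the Pro edition); the paths are placeholders and the exact parameter names depend on your API version.

```python
import glob
import Metashape  # Metashape Professional Python module (the API is Pro-only)

# Hypothetical paths -- adjust them to your own folder layout.
PHOTO_DIR = r"D:\scans\rock_01\dng"
PROJECT_PATH = r"D:\scans\rock_01\rock_01.psx"

doc = Metashape.Document()
doc.save(PROJECT_PATH)               # create the project file on disk

chunk = doc.addChunk()
chunk.addPhotos(glob.glob(PHOTO_DIR + r"\*.dng"))  # add all exported DNGs

doc.save()
print(f"Added {len(chunk.cameras)} photos")
```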
Once the photos are added, we proceed to align them. I’ve found the default values proposed by Metashape to be suboptimal in most cases. For this reason, I raise the key point limit and the tie point limit to something like 40k. I set the accuracy to “high” and check “Generic preselection” to reduce the processing time. In some cases, I use masks too and apply them to tie points. In that case, it’s enough to set up the mask on only a few photos; Metashape is able to use it for all of them. If instead the masks are applied to key points, each photo needs its own mask for them to work. I usually use masks on tie points when I scan an object by rotating it against an almost uniform background. In my limited experience, they are not useful, or even counterproductive, when scanning a big rock out in nature.
Align photos
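For reference, the same alignment settings can be applied through the Pro edition’s Python API. This is only a sketch, assuming the 1.6+/2.x parameter names (where `downscale=1` corresponds to the “High” accuracy preset); the mask-related flags are commented out and should be checked against the API reference before use.

```python
import Metashape

doc = Metashape.app.document         # the project currently open in the GUI
chunk = doc.chunk

# "High" accuracy corresponds to downscale=1 (0 would be "Highest").
chunk.matchPhotos(
    downscale=1,
    generic_preselection=True,       # speeds up matching considerably
    keypoint_limit=40000,
    tiepoint_limit=40000,
    # filter_mask=True, mask_tiepoints=True,  # assumption: applies masks to
    #                                         # tie points; check your API docs
)
chunk.alignCameras()
doc.save()
```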
Once the photos are aligned, we see a sparse cloud of points and we can already redefine the region of the scan, shrinking it to the object of interest without, however, making it too tight. After that, we need to build the dense cloud. I usually set the quality to “high”. If, for some reason, it’s not the first time I build it, I check “Reuse depth maps” to save some time.
Build dense cloud
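The dense cloud step can be scripted too. A minimal sketch under the same assumptions (Pro edition, 1.6+/2.x API), where `downscale=2` roughly corresponds to the “High” quality preset:

```python
import Metashape

doc = Metashape.app.document
chunk = doc.chunk

# "High" quality corresponds to downscale=2 for depth maps (1 would be "Ultra").
chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.MildFiltering)

# API 1.x calls this buildDenseCloud(); from 2.0 it was renamed buildPointCloud().
if hasattr(chunk, "buildPointCloud"):
    chunk.buildPointCloud()
else:
    chunk.buildDenseCloud()

doc.save()
```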
Once the dense cloud is built, we can finally build the mesh. This is the step in the whole process where I change the default parameters the most. As source data, I set “Depth maps”, since I built them in the previous step; this saves processing time. I set the quality and the face count to high, and I check “Reuse depth maps” again.
Build mesh
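The same mesh settings expressed through the Python API; again, a sketch assuming the Pro edition and the 1.6+/2.x enum names.

```python
import Metashape

doc = Metashape.app.document
chunk = doc.chunk

# Build the mesh from the depth maps computed in the previous step; existing
# depth maps are reused instead of being recomputed.
chunk.buildModel(
    source_data=Metashape.DepthMapsData,
    surface_type=Metashape.Arbitrary,
    face_count=Metashape.HighFaceCount,
)
doc.save()
```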
After some waiting, I have the mesh.
Generated 3D mesh.
At this point, I select and delete all the parts I don’t need, paying special attention to eliminating all flying polygons (they can cause a lot of problems in the retopology and UVing process needed later).
Cleaned mesh
Once the cleaning is roughly done, I check the Mesh Statistics and fix the topology. I usually refine my cleaning and recheck the Mesh Statistics a few times before I’m really satisfied with the result.
Mesh Statistics popup, click on Fix Topology
Once I’m done with cleaning, I use the “Close Holes” tool. I set the level quite high, but not at 100%, so that any small holes present in the mesh get closed without also closing off the bottom of the rock.
Close Holes
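If you drive Metashape from Python, the “Close Holes” tool appears to have a scripted counterpart on the model object. The sketch below treats the method name, signature and level value as assumptions; verify them against the API reference for your version.

```python
import Metashape

doc = Metashape.app.document
chunk = doc.chunk

# Assumption: Model.closeHoles(level=...) mirrors the "Close Holes" GUI tool,
# where level controls the maximum hole size to close (as a percentage).
# Keep it high but below 100 so the open bottom of the rock is not capped.
chunk.model.closeHoles(level=90)
doc.save()
```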
Once I think I’m done with editing in Metashape, I build the texture. I usually set a very high resolution because Metashape usually generates a lot of small UV islands. Later, in zBrush, I generate a more acceptable UV map and then transfer the texture using xNormal. The final texture I import into Unreal Engine can definitely be smaller than the one generated in Metashape.
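The UV and texture baking can be scripted in the same way. A sketch assuming the Pro API, with the texture size and export path as placeholder values:

```python
import Metashape

doc = Metashape.app.document
chunk = doc.chunk

# Let Metashape build its own (fragmented) UVs and bake an oversized texture;
# a cleaner UV set is made later in zBrush and the texture re-baked with xNormal.
chunk.buildUV(mapping_mode=Metashape.GenericMapping)
chunk.buildTexture(blending_mode=Metashape.MosaicBlending, texture_size=16384)

# Export the high-poly mesh and its texture for the zBrush / xNormal steps.
chunk.exportModel(r"D:\scans\rock_01\rock_01_highpoly.obj")
doc.save()
```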
In order to prepare the photos for use in photogrammetry software like Metashape, we go through the following steps:
As our camera stores the raw photos in CR3 format, the first thing we do is convert them to DNG using the free Adobe DNG Converter tool (see the batch-conversion sketch after this list);
Create the color checker profile using the “ColorChecker Camera Calibration” program, then save it in “C:\Users\UserName\AppData\Roaming\Adobe\CameraRaw\CameraProfiles”, which is actually the default location;
Step 1 Create the profile
Import photos in Lightroom;
Edit the color checker photo: in develop mode apply the profile and setup the white balance;
Step 2 Set the profile in Lightroom.
Step 3 Drag the White Balance pipette onto the white square for landscape on the color checker photo.
Step 4 Now that the profile and WB are corrected, we can sync the color profile and the white balance to all concerned photos.
Sync these edited properties to all concerned photos;
Bulk export the edited photos as DNG in order to use them in Metashape.
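The first step is easy to batch. Here is a small Python sketch that shells out to the Adobe DNG Converter; the executable path is the usual default install location on Windows, and the `-c` / `-d` flags (compressed output, output directory) are from Adobe’s command-line documentation as I remember it, so verify them against the documentation for your install. All paths are placeholders.

```python
import subprocess
from pathlib import Path

# Hypothetical paths -- adjust them to your own machine and folder layout.
DNG_CONVERTER = r"C:\Program Files\Adobe\Adobe DNG Converter\Adobe DNG Converter.exe"
CR3_DIR = Path(r"D:\scans\rock_01\cr3")
DNG_DIR = Path(r"D:\scans\rock_01\dng")

DNG_DIR.mkdir(parents=True, exist_ok=True)

for cr3 in sorted(CR3_DIR.glob("*.CR3")):
    # -c = compressed DNG output, -d = output directory (assumed flag names).
    subprocess.run([DNG_CONVERTER, "-c", "-d", str(DNG_DIR), str(cr3)], check=True)

print(f"Converted {len(list(DNG_DIR.glob('*.dng')))} files")
```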
Recently (UE4.25) Unreal Engine added the possibility to use Alembic hair. It’s a revolutionary feature, still in beta, but destined to radically change the way characters are created for the engine. Thanks to the fact that Autodesk finally offers Maya Indie in my country too, I was able to test this beautiful new feature that Epic has added to its engine.
This little tutorial is about how to create a braid in Maya using the nHair and xGen plugins. All I did was follow this video tutorial by Daryl Obert and adapt it to my needs. I’m writing this blog post essentially to write down what I have learned, as I’m a total noob when it comes to creating hair with Maya.
Just a few screenshots to show some progress. Still working on completing it. I’ve also started to modify and add some pieces of the architecture and I plan to add some exterior props too. Goals: create a new Unreal 4 Marketplace pack for Japanese furniture and improve my existing MJH-Modular Japanese Town House pack.
I’ve started this project as a zBrush learning project. It consists of remaking/improving some props I already had for the MJH – Modular Japanese Town House UE4 kit, like the tansu (箪笥) cabinetry, and adding new ones. In particular, I added a short-legged table (chabudai, ちゃぶ台), legless chairs (zaisu, 座椅子) and their cushions (zabuton, 座布団), and a sake set.
This is my first zBrush sculpt. I did it a couple of months ago, but I found neither the time nor the spirit to publish it on the blog, because of a sad event that happened at the time. Kitsune means fox in Japanese, and these statues are typical of Inari shrines, like the famous Fushimi Inari-taisha in Kyoto Prefecture. Ever since the MJH Modular Japanese Town House kit for the UE4 Marketplace, I had wanted to sculpt them, but only now have I started learning zBrush (and I really love it). As these statues are my very first zBrush sculpts, I’ve decided to give them away for free to everybody. You can download them on itch.io and use them in your projects under the Creative Commons Attribution-ShareAlike 4.0 International license.