How to use Microsoft Speech API (SAPI) in Unreal Engine 4 (UE4), with Visual Studio 2017 Community Edition. Because I’m always forgetting it.
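Since this chunk is the reminder, here is the canonical SAPI "hello world" as a minimal sketch. This is plain Win32/COM SAPI usage, not anything UE4-specific: inside a UE4 module you would additionally link `sapi.lib` from your `Build.cs` and wrap the Windows headers (e.g. with the engine's `AllowWindowsPlatformTypes` guards) so they don't clash with engine types; those integration details are assumptions, not shown here.

```cpp
// Minimal SAPI text-to-speech example (Windows-only).
// Compiles with Visual Studio; requires linking against sapi.lib.
#include <sapi.h>

int main()
{
    // SAPI is COM-based, so COM must be initialized first.
    if (FAILED(::CoInitialize(nullptr)))
        return 1;

    ISpVoice* voice = nullptr;
    HRESULT hr = ::CoCreateInstance(CLSID_SpVoice, nullptr, CLSCTX_ALL,
                                    IID_ISpVoice, (void**)&voice);
    if (SUCCEEDED(hr)) {
        // Speak synchronously with the default voice.
        voice->Speak(L"Hello from SAPI", SPF_DEFAULT, nullptr);
        voice->Release();
    }

    ::CoUninitialize();
    return SUCCEEDED(hr) ? 0 : 1;
}
```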
Work on our Asian friend continues. I adapted the teeth material from the new UE4 photo-realistic character sample. I also studied the subsurface profile, roughness, and specularity values in the sample and changed mine to match them better. There are still some problems with the eye occlusion mesh; I may need to remake it.
Another port from Daz to UE4. This time a guy, “Saejima”. Screenshots are from 4.19, so no new 4.20 features yet.
We started from the G8 basic male and used various shaping parameters to obtain an Asian-looking face, then added some more muscle.
For the normals and AO, we exported the high-res and low-res meshes as .obj files and baked them in Substance Painter, combining the result with the G8M bump maps.
For the eyes, we took some Paragon eyes, fit them to G8M, imported them into Daz, created a "Follower", fit it to our character, imported it back into Maya, and fit it again, because the occluder didn't really fit. In short, eyes are difficult to adapt.
In Unreal, the normals are intensified a little using the "FlattenNormal" node with a negative parameter.
It can probably be improved further, and it would be interesting to check the result in 4.20.
Screenshots were taken in UE 4.19 in an empty level, with a directional light from the front and a spot light from the back.
The Modular Snap System Editor Plugin lets you snap and align Actors simply by moving them close enough to each other. It is especially useful when working with modular assets, since you don't have to worry about the grid or alignment. It works with any Actor class that has Static Mesh Components, and existing static meshes can easily be prepared for use with the system.
Various stuff done: anim Blueprints for the player and NPCs, clothing, posing, trying MotionBuilder, a custom "look at", and fighting with JCMs (joint corrective morphs).