We’ve started work on a variety of other projects, each requiring a few new approaches. The first one I’ll discuss here is our updated pipeline utilizing Character Creator 3 and modern URP Unity.
The creation process within CC3 hasn’t changed much since my post on the subject back in 2020. We are using some new bases and outfits, but overall the workflow is the same, including clothing reduction and converting to the game-ready base model within CC3 once the character is built up. Where things start to change is the export. Most of the settings are the same: a Clothed FBX export “To Unity 3D”, with or without LoD based on the use case. It turns out we no longer have to include the T-Pose every time, as that now seems to be set up properly regardless.
The real update is within Unity. It seems that over the last couple of years Reallusion dropped their own pipeline support for newer versions of Unity. Instead, they released the plugin source code, and members of the Reallusion forum took it upon themselves to develop a solution for the various Unity pipelines, including the Universal Render Pipeline we at CIE are focusing on. Luckily, the solution is packaged as a convenient Git package, which we have become well versed in integrating into our projects. This new solution also comes with some unique options that we’ll be looking into in the future, but for now the main factor is the automatic Prefab setup, which is similar to the old system. Unlike that system, this one easily supports multiple character imports, organizing them into a list for easy switching and running the construction process. It also sets up all the materials so they render properly in URP.
From there, cleanup on the models is fairly simple. First, a quick texture fix is needed for these stylized models. For some reason, the occlusion map on the eye comes in at 100%, darkening the area around the eyeball to the point that it turns black. This may work for subtle movements, where the eye doesn’t travel much and the occlusion creates a nice shading effect, but with larger eye movements the forced shadow becomes obviously off. Turning this down or off is required for these characters at least.
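For anyone scripting that tweak rather than adjusting it in the Inspector, here is a minimal sketch. It assumes the eye material exposes the standard URP Lit “_OcclusionStrength” property; the community shaders may name it differently, and the class and method here are hypothetical names, not part of the plugin.

```csharp
using UnityEngine;

public static class EyeOcclusionFix
{
    // Zero out ambient occlusion on eye materials so the area around the
    // eyeball no longer renders black. Only touches materials that look
    // like eye materials and actually expose the property.
    public static void ZeroEyeOcclusion(GameObject character)
    {
        foreach (var renderer in character.GetComponentsInChildren<SkinnedMeshRenderer>(true))
        {
            foreach (var mat in renderer.sharedMaterials)
            {
                if (mat != null && mat.name.Contains("Eye") && mat.HasProperty("_OcclusionStrength"))
                    mat.SetFloat("_OcclusionStrength", 0f);
            }
        }
    }
}
```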
Next is a new setup for the character animator to make sharing animations easier: the main body is given a “CharAnim” controller that contains the Humanoid rig animation data we’ve gone over before. This allows every character to use the same humanoid animations made by the Center or pulled from Mixamo. Back when I first talked about this character pipeline, I mentioned issues with Blendshape animations not being shareable between characters. I didn’t realize it at the time, but the solution was quite simple: use a second animator entirely for Blendshapes. Every CC3 character is structured the same way, with a unique name on the top level but shared names for the pieces beneath, particularly “CC_Base_Body”. This is the skin of the character and where the Blendshape data is actually controlled. The reason I couldn’t share the animation data between characters before was because I was trying to do it from the same top-level animator, so every character had a unique path to their Blendshapes (i.e. “Neutral_Female_Base/CC_Base_Body” vs “Neutral_Male_Base/CC_Base_Body”). By creating a second animator on the CC_Base_Body object itself, the path is now uniform, meaning every CC3 character can share Blendshape animations as long as the Blendshapes are named the same way (which they are by default). So we can place things like blinking and mood shifting into this animator and connect it to the Dialogue System I developed to work with these characters previously.
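To illustrate the idea, here is a rough sketch of attaching that second animator to CC_Base_Body at runtime. The component and field names are illustrative placeholders rather than the actual pieces of our project; in practice the animator can just as easily be added by hand in the editor.

```csharp
using UnityEngine;

// Hypothetical helper: gives the CC_Base_Body child its own Animator so every
// CC3 character resolves the same relative Blendshape paths.
public class BlendshapeAnimatorSetup : MonoBehaviour
{
    [SerializeField] RuntimeAnimatorController blendshapeController; // shared Blendshape controller (blinking, moods, etc.)

    void Awake()
    {
        // Every CC3 export names its skinned body mesh "CC_Base_Body",
        // so animation paths rooted here are identical across characters.
        Transform body = transform.Find("CC_Base_Body");
        if (body == null)
        {
            Debug.LogWarning("CC_Base_Body not found under " + name);
            return;
        }

        Animator faceAnimator = body.GetComponent<Animator>();
        if (faceAnimator == null)
            faceAnimator = body.gameObject.AddComponent<Animator>();

        faceAnimator.runtimeAnimatorController = blendshapeController;
    }
}
```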
Finally, we apply the eye tracker setup I used previously. Take two Empty Objects placed at the positions of the eye joints and pull them forward. Create a new Empty Object under the facial joint, positioned between the two eye objects, so it stays connected to the head when not tracking; this is the Tracker. Place the two eye objects under this Tracker. Then simply apply Look At constraints to the eye joints, pointing them at the Empty Objects we dragged forward from their positions. Make sure the eyes are facing forward in the Look At setup, and the eyes should now follow the Tracker as it moves. Then I put the simple “Eye Tracker” script I put together on the Tracker object, which draws a gizmo for easy visualization and offers the option to stick the Tracker to a specific object in the scene (e.g. the Player Camera).
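The actual script isn’t shown here, but a minimal sketch of that kind of tracker behaviour might look like the following; the class and field names are placeholders, not the real script.

```csharp
using UnityEngine;

// Sketch of a tracker object that the eye Look At constraints aim toward:
// it draws a gizmo for easy visualization and can optionally stick itself
// to another transform (e.g. the Player Camera).
public class EyeTracker : MonoBehaviour
{
    [SerializeField] Transform followTarget;    // optional object to stick to
    [SerializeField] float gizmoRadius = 0.05f;

    void LateUpdate()
    {
        if (followTarget != null)
            transform.position = followTarget.position;
    }

    void OnDrawGizmos()
    {
        Gizmos.color = Color.cyan;
        Gizmos.DrawWireSphere(transform.position, gizmoRadius);
    }
}
```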
All of this means we now have a much more streamlined pipeline for getting characters into our projects. This work was driven by a project we’re developing that will involve interacting with stylized characters that share the same model but have different skin tones, so we needed a clean process for iterating on variations of the characters. Below are the WIP characters we developed to start with as we established this updated pipeline. We’ll share more about the project as it gets going, but for now, the system is progressing smoothly.