Showcased below are some new avatars I’ve developed for CIE projects. We’re currently comparing the strengths and weaknesses of realistic versus stylized avatars.
In my previous posts I explained how the avatars are built using Character Creator 3 (CC3). Once they’re in Unity, though, there’s still some processing to do. First, the CC3 export/import process results in a fairly unusable list of blendshapes for the character’s face: the blendshape names come out too long to be read individually in the Unity Editor. To fix this, I open the character’s FBX file (the base mesh file) in a separate 3D program, Maya. There I make whatever mesh tweaks are required (e.g., cutting off the legs of avatars we want “floating”) and rename the body’s blendshape node to something shorter.
Once back in Unity, I have to adjust the character’s textures. Applying two-sided shaders to clothing and hair lets them be seen from both sides, giving them a fuller, more finished look. The two-sided shader CC3 suggests for Unity, however, is nowhere near the quality of CC3’s own base shader, so the hair does end up somewhat jagged. This is one already evident benefit of the stylized characters: their hair is built from solid meshes instead of flat planes.
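For the curious, the “two-sided” part of such a shader boils down to disabling backface culling. Here is a minimal unlit ShaderLab sketch of the idea — the shader name and texture property are placeholders, not the actual shader CC3 suggests:

```shaderlab
Shader "Custom/TwoSidedUnlit"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        // Draw back faces too, so flat hair/cloth planes
        // are visible from both sides.
        Cull Off

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f { float2 uv : TEXCOORD0; float4 vertex : SV_POSITION; };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```

A full lit shader for hair involves much more (alpha handling, specular), which is where the quality gap comes from.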
Moving on with the CC3 character development, the most tedious task is adjusting the animations for the new avatars. While Unity’s Humanoid rig system is fantastic for sharing bone animation data between multiple humanoid characters, it unfortunately doesn’t retarget blendshape animation. Even if two characters’ blendshapes have exactly the same name, the objects they’re attached to are different, so their animation keys are separate. This means that to put, say, the simple blinking function onto each new agent, I have to go into the blinkOpen and blinkClosed animations and add keys for the blendshape “agentBody_Blendshape_blink”. This process isn’t difficult, just time consuming — more so now that we’re layering facial expressions as well, which involve multiple blendshapes per agent to achieve the right look.
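Since the re-keying is mechanical, it could in principle be scripted rather than done by hand. A rough C# sketch using Unity’s AnimationClip.SetCurve — the path “agentBody” and the shape name “blink” are placeholders that would differ per avatar, and this is not the actual project tooling:

```csharp
using UnityEngine;

// Sketch: programmatically add blink keys to a clip so the
// per-character re-keying doesn't have to be repeated by hand.
public static class BlinkKeyer
{
    public static void AddBlinkCurve(AnimationClip clip, float closeTime)
    {
        // Blendshape weights in Unity run from 0 (shape off) to 100 (fully applied).
        AnimationCurve curve = AnimationCurve.Linear(0f, 0f, closeTime, 100f);

        // The animated property is "blendShape." + the shape's name on the
        // SkinnedMeshRenderer; "agentBody" is the renderer's path in the hierarchy.
        clip.SetCurve("agentBody", typeof(SkinnedMeshRenderer),
                      "blendShape.blink", curve);
    }
}
```

Layered expressions would just add more SetCurve calls, one per blendshape involved in the look.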
With all that set up, I can finally add the final touches that let the agents come alive. The Salsa3D plugin controls mouth movements while speaking. For gaze, a simple eye-tracking script is attached to an empty object on the character’s head bone (so it stays in line with head movements when not tracking), and two more empty objects are made as children of that eye tracker. Each of the character’s eye bones is tied to one of those child objects with a “look at” constraint, and the eye tracker is positioned in front of the character’s face. This way, when the tracker is turned on, it moves to whatever the desired target is (e.g., the camera) and the character’s eyes look to it. In the future, I may expand upon this to include some subtle neck movements for a more realistic gaze.
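The eye-tracker rig above can be sketched as a small MonoBehaviour. This is a simplified stand-in, not the actual project script: the “look at” constraints on the eye bones are approximated here with LookAt calls, and all field names are illustrative:

```csharp
using UnityEngine;

// Sketch of the eye-tracker setup: this component lives on an empty object
// parented to the head bone and positioned in front of the face, so the
// resting gaze stays in line with head movement when tracking is off.
public class EyeTracker : MonoBehaviour
{
    public Transform target;      // what to look at, e.g. the main camera
    public Transform leftEye;     // the character's eye bones
    public Transform rightEye;
    public bool tracking;
    public float trackSpeed = 5f;

    void LateUpdate()
    {
        if (tracking && target != null)
        {
            // Ease the tracker toward the target; the eyes follow it below.
            transform.position = Vector3.Lerp(
                transform.position, target.position, trackSpeed * Time.deltaTime);
        }

        // Stand-in for the per-eye "look at" constraints on the child objects.
        leftEye.LookAt(transform);
        rightEye.LookAt(transform);
    }
}
```

Running the eye aim in LateUpdate keeps it applied after the frame’s skeletal animation, so the animated head pose doesn’t overwrite the gaze.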
And from there, the agent animations are controlled by the external components of the project. I’ll be going into a couple of those I’ve developed in my next post.