Original Discord Post by paulpowellii | 2024-07-23 16:42:53
Hello! I am working with a small team that is just starting to experiment with Convai. I’m an animator, and I’ve noticed that in conversation, some of the blendshapes on characters’ faces are being ignored while they are speaking. When you create a character on the Convai website and chat with them, you can see their emotional states under “State of Mind.” Is it possible to expose/access their State of Mind at runtime, and somehow blend in animations that tweak the blendshapes to make facial expressions that better match their emotions? For example, if they enter “Ecstasy,” use blendshapes to raise the eyebrows and widen the eyes. I realize this is a broad question, but I deeply appreciate anyone taking the time to respond!
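
To make the idea concrete, here is a minimal, engine-agnostic sketch of the kind of layering I have in mind. It assumes a hypothetical hook that reports the character’s current State of Mind label and a hand-authored table of per-emotion blendshape offsets; none of these names are real Convai API calls.

```python
# Hypothetical sketch: additively blend emotion-driven blendshape offsets on top
# of whatever the base animation/lipsync already wrote this frame.
# EMOTION_OFFSETS and apply_emotion_layer are illustrative, not Convai APIs.

# Hand-authored per-emotion offsets (0..1) for a few blendshapes.
EMOTION_OFFSETS = {
    "Ecstasy": {"browInnerUp": 0.8, "eyeWideLeft": 0.6, "eyeWideRight": 0.6},
    "Grief":   {"browDownLeft": 0.5, "browDownRight": 0.5, "mouthFrownLeft": 0.4},
}

def apply_emotion_layer(current_weights, emotion, blend=0.5):
    """Blend emotion offsets into the current blendshape weights.

    current_weights: dict of blendshape name -> weight set by the base
                     animation/lipsync this frame.
    emotion:         the character's current State of Mind label.
    blend:           how strongly the emotion layer is applied (0..1).
    """
    offsets = EMOTION_OFFSETS.get(emotion, {})
    result = dict(current_weights)
    for shape, offset in offsets.items():
        base = result.get(shape, 0.0)
        # Clamp so the stacked layers never push a shape outside 0..1.
        result[shape] = min(1.0, max(0.0, base + offset * blend))
    return result

# Example: each frame, after the base animation/lipsync has written its weights.
frame_weights = {"jawOpen": 0.3, "eyeWideLeft": 0.1}
frame_weights = apply_emotion_layer(frame_weights, "Ecstasy", blend=0.6)
print(frame_weights)
```

The key question is whether the current State of Mind is exposed at runtime so a layer like this could be driven from it, rather than only being visible on the website.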