I’m encountering an issue with the mouth movement of a 3D avatar. I created a character that animates and speaks correctly with audio, but the mouth does not open during speech.
I have integrated the Convai Lip Sync component, assigned the Head object's Skinned Mesh Renderer to it, and added the OVRHead Effector to the Viseme Effectors List. I've also verified that the avatar includes the necessary Blend Shapes for lip sync.
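In case it helps with debugging, a small throwaway script along these lines can print every blend shape on the head mesh, to confirm the viseme shapes actually live on the renderer assigned to the Lip Sync component (the class name and field are placeholders, not part of the Convai SDK):

```csharp
using UnityEngine;

// Hypothetical diagnostic helper (not part of the Convai SDK): lists every
// blend shape on the assigned renderer so you can check that the viseme
// shapes the lip sync expects actually exist on this mesh.
public class BlendShapeLister : MonoBehaviour
{
    // Drag in the same Skinned Mesh Renderer assigned to the Lip Sync component.
    public SkinnedMeshRenderer headRenderer;

    void Start()
    {
        Mesh mesh = headRenderer.sharedMesh;
        for (int i = 0; i < mesh.blendShapeCount; i++)
        {
            Debug.Log($"BlendShape {i}: {mesh.GetBlendShapeName(i)}");
        }
    }
}
```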
However, when I add the Voice Handler component, it gets automatically disabled when the scene runs.
The current components on the avatar include the Convai Lip Sync component, the Voice Handler, and the Skinned Mesh Renderer on the Head object.
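To narrow down what is switching the Voice Handler off, a watcher script like this sketch can log the exact frame on which the component's enabled flag flips (the class name is made up; drag the Voice Handler into the target field in the Inspector):

```csharp
using UnityEngine;

// Hypothetical watcher (not part of the Convai SDK): polls another component's
// enabled state and logs the frame on which it changes, to narrow down when
// the Voice Handler gets switched off at runtime.
public class EnabledWatcher : MonoBehaviour
{
    // Drag the Voice Handler component into this field in the Inspector.
    public Behaviour target;

    bool wasEnabled;

    void Start()
    {
        wasEnabled = target.isActiveAndEnabled;
    }

    void Update()
    {
        bool isEnabled = target.isActiveAndEnabled;
        if (isEnabled != wasEnabled)
        {
            Debug.LogWarning($"{target.GetType().Name} changed to enabled={isEnabled} at frame {Time.frameCount}");
            wasEnabled = isEnabled;
        }
    }
}
```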
Thank you for your response.
I'm attaching some screenshots of the Unity console from a session where I asked the avatar three questions and it responded correctly.
However, the issue remains: the avatar's mouth still does not open during the responses.
I look forward to any further guidance. Thanks again!
Thanks for sharing the screenshots. I didn’t notice any errors in the console logs.
Could you please try the following?

1. Click the three dots at the top-right of the LipSync component and test the available presets (such as OVR) to see if any of them work better with your avatar.
2. While speaking to the character, select the AvatarHead object in the Hierarchy and check whether the BlendShape values on its Skinned Mesh Renderer change during speech. This will confirm whether the blend shapes are being triggered at all; the sketch below this list shows one way to log them.
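If it's easier than watching the Inspector, a quick logger along these lines prints any non-zero blend shape weight each frame while the avatar speaks (sketch only; the class name and field are placeholders):

```csharp
using UnityEngine;

// Hypothetical logger: prints every blend shape whose weight is non-zero each
// frame, so you can see in the Console whether lip sync is driving the mesh.
public class BlendShapeWeightLogger : MonoBehaviour
{
    // The renderer to watch, e.g. the one on the AvatarHead object.
    public SkinnedMeshRenderer headRenderer;

    void LateUpdate()
    {
        Mesh mesh = headRenderer.sharedMesh;
        for (int i = 0; i < mesh.blendShapeCount; i++)
        {
            float weight = headRenderer.GetBlendShapeWeight(i);
            if (weight > 0.01f)
            {
                Debug.Log($"{mesh.GetBlendShapeName(i)} = {weight:F1}");
            }
        }
    }
}
```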
Let us know what you observe so we can help further!
Thank you for your response.
I tried switching to the OVR and Reallusion presets, but both generated errors. The only preset that didn't show any errors was ARKit. With it, the BlendShape values are triggered and do change during speech, but the mouth still doesn't animate visually.
I’m attaching a screenshot for reference.
Thank you very much for your help.
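One more sanity check that may separate the remaining failure modes: force a mouth blend shape to full weight directly on the renderer you actually see. If the mouth opens this way but not during speech, the lip sync is likely writing to a different renderer, or possibly using 0-1 values where Unity's SetBlendShapeWeight typically expects 0-100. This is only a sketch; "jawOpen" is an ARKit-style example name, so substitute a shape that exists on your mesh:

```csharp
using UnityEngine;

// Hypothetical sanity check: forces one mouth shape fully open so you can
// confirm the rendered mesh is able to deform at all.
public class ForceMouthOpen : MonoBehaviour
{
    // The renderer you actually see in the Game view.
    public SkinnedMeshRenderer headRenderer;

    // ARKit-style example name; replace with a shape listed on your mesh.
    public string shapeName = "jawOpen";

    void Start()
    {
        int index = headRenderer.sharedMesh.GetBlendShapeIndex(shapeName);
        if (index >= 0)
            headRenderer.SetBlendShapeWeight(index, 100f); // Unity weights typically run 0-100
        else
            Debug.LogError($"Blend shape '{shapeName}' not found on {headRenderer.name}");
    }
}
```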