I created a Convai character and pasted the Character ID into my MetaHuman setup in Unreal Engine.
Issue: when I talk to the character, it speaks the expression/gesture cues (like “smiles”, “nods”, “looks confused”, etc.) instead of showing them through face/body animation. It’s like the stage directions are being read by TTS.
How do I stop Convai from verbalizing expressions and make them apply only to the MetaHuman animation?
Is this a character prompt/system prompt issue, or a UE plugin/Blueprint routing issue?
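To make concrete what I expected: the cue text should be consumed by the animation layer and never reach TTS. Below is a minimal standalone sketch of that kind of filtering, assuming the cues arrive inline as asterisk- or parenthesis-wrapped tags; the tag formats and the function are my own illustration, not the Convai plugin's actual API:

```cpp
#include <iostream>
#include <regex>
#include <string>

// Illustrative only: strip stage-direction cues such as "*smiles*" or
// "(nods)" from a reply before it is handed to TTS. The cue formats and
// this helper are assumptions for the sake of the example, not the
// actual Convai plugin code.
std::string StripExpressionCues(const std::string& Text)
{
    // Remove asterisk-wrapped cues, e.g. "*smiles*"
    static const std::regex AsteriskCue(R"(\*[^*]+\*)");
    // Remove parenthesized cues, e.g. "(looks confused)"
    static const std::regex ParenCue(R"(\([^)]+\))");

    std::string Result = std::regex_replace(Text, AsteriskCue, "");
    Result = std::regex_replace(Result, ParenCue, "");
    return Result;
}

int main()
{
    const std::string Reply = "Hello! *smiles* Nice to meet you. (nods)";
    std::cout << StripExpressionCues(Reply) << '\n';
    // Prints: "Hello!  Nice to meet you. "
}
```

If the plugin already does this separation somewhere, I would like to know which setting or Blueprint node enables it.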
Thank you. I am providing the requested information below.
Character ID: c173b5ae-0113-11f1-b3dd-42010a7be027
Character conversation: Most of the conversations have more or less the same problem, which I am explaining below:
I followed your YouTube tutorial on creating an AI MetaHuman using Convai. However, the resulting character is not functioning properly: it does not maintain eye contact, does not follow the user, does not speak fluently, and performs sluggishly.
To rule out configuration errors on my end, I tested your pre-made characters and encountered the same issues.
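For clarity on what I mean by "does not follow the user": I expected the head and eyes to track the player, i.e. something like the look-at computation below driving the head rotation each frame. This is plain standalone math as my own illustration, not the plugin's implementation:

```cpp
#include <cmath>
#include <cstdio>

// Illustrative only: the head-tracking behavior I expected, reduced to
// the underlying math. Given the character's head position and the
// player's camera position, compute the yaw/pitch the head would need
// to face the player. The struct and names are hypothetical.
struct Vec3 { double X, Y, Z; };

void LookAtAngles(const Vec3& Head, const Vec3& Target,
                  double& YawDeg, double& PitchDeg)
{
    const double Dx = Target.X - Head.X;
    const double Dy = Target.Y - Head.Y;
    const double Dz = Target.Z - Head.Z;
    const double Flat = std::sqrt(Dx * Dx + Dy * Dy);

    const double RadToDeg = 180.0 / 3.14159265358979323846;
    YawDeg   = std::atan2(Dy, Dx) * RadToDeg;   // rotation around the up axis
    PitchDeg = std::atan2(Dz, Flat) * RadToDeg; // tilt up/down toward target
}

int main()
{
    double Yaw = 0.0, Pitch = 0.0;
    // Head at ~170 units high, player camera 200 units away and slightly lower.
    LookAtAngles({0, 0, 170}, {200, 100, 160}, Yaw, Pitch);
    std::printf("Yaw %.1f deg, Pitch %.1f deg\n", Yaw, Pitch);
}
```

With the pre-made characters, the head stays fixed regardless of where the player stands, which is why I suspect this is not just my configuration.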
Could you please advise whether these problems should be addressed individually through configuration adjustments, or whether this indicates a broader system issue?