I want to adapt my avatar’s facial emotions to the user’s emotions.
I am using a Python component to analyze the user’s facial emotions and send the result over OSC to Unreal Engine (which runs the OSC server that receives the messages).
The goal is to read those user emotions (e.g. “happy”) and then trigger a “Joy” emotion on my MetaHuman using the Convai facial anim class.
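For reference, the sending side can look something like the sketch below. This assumes the python-osc package; the IP, port, OSC address (`/emotion`), and the mapping dictionary are placeholders I chose for illustration, not values required by Convai or Unreal.

```python
# Minimal sketch of the Python sending side, assuming the python-osc package.
# The IP, port, and "/emotion" OSC address are placeholders, not Convai values.
from pythonosc.udp_client import SimpleUDPClient

# Map detected user emotion labels to the Convai emotion I want to trigger.
EMOTION_MAP = {"happy": "Joy"}  # extend with other detected labels as needed

client = SimpleUDPClient("127.0.0.1", 8000)  # Unreal's OSC server address/port

def send_emotion(detected_label: str) -> None:
    """Translate a detected emotion label and push it to Unreal over OSC."""
    convai_emotion = EMOTION_MAP.get(detected_label)
    if convai_emotion is not None:
        client.send_message("/emotion", convai_emotion)

send_emotion("happy")  # sends "Joy" to the /emotion address
```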
It’s possible to manually force a specific emotion animation, but keep in mind that the character’s LLM won’t be aware of this forced emotion. It will only affect the visual animation.
I do not mind if the LLM does not know about the emotion update. I basically want to start with a basic if condition: when IsHappy is true, perform a happy emotion. I did the following under Set Emotion in the Event Graph’s set emotion component of Convai_Metahuman_FaceAnim:
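As a side note, before debugging the Blueprint itself, it can help to confirm the OSC messages actually leave the Python component. A quick standalone listener (again a sketch assuming python-osc, using the same placeholder port as above) can print whatever arrives:

```python
# Standalone listener to verify the OSC stream, assuming python-osc.
# The port and "/emotion" address must match whatever the sender uses.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_emotion(address, *args):
    # Prints e.g. "/emotion ('Joy',)" for every message received.
    print(address, args)

dispatcher = Dispatcher()
dispatcher.map("/emotion", on_emotion)

server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
server.serve_forever()  # Ctrl+C to stop
```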