Always been curious about this. You can point to a ‘state of mind’ and the little dot goes there but the ‘Update’ button doesn’t light up and the little dot goes away next time you go to the SOM screen. I must assume it does nothing?
Hello @Bob_Hawkey,
This section visualizes your character’s emotional state during a conversation. The dot dynamically updates to reflect the emotions your character is experiencing in real time.
Try interacting with your character, and you’ll see how the emotional state shifts naturally based on the conversation.
Ahhhh, I suspected as much. Good for testing, but not really useful in my VR app. I imagine there is some way to get the emotional state in real time. I know my buddy Cris does it with TykeAI. Thanks!
You can refer to the documentation here:
However, from what I recall, it seems you might be using an older version of the SDK. This could prevent you from being able to properly test or use this feature.
You are correct. I use very few of the scripts you provide and most of them have been tweaked using ChatGPT. I would never expect support for those. The page you pointed to for the emotions thing is pretty sparse on information. It shows a list of emotions returned but do they come back before the audio or after? And why a list, why not just a single dominant emotion that I could use to modify a facial blendshape or body animation? I’m sure there is excellent logic behind it but I’m a bit lost.
Hi Bob, the emotions are provided in the ConvaiLipSync.cs script which you may or may not have added to your character.
They arrive from the Convai servers roughly every 1 to 3 character responses. They come in a list, and I've found that the first emotion in that list almost always accurately reflects the tone of the current conversation. I grab the first emotion in the list to drive my character's animations and facial expression. I don't handle every single emotion individually, as there are many of them; I have grouped the emotions into 'positive', 'negative', 'indifferent', 'neutral', etc. and use those groups later.
I think there's a method in that script called GetCharacterEmotions() which returns a list containing the emotions. What I do is apply the first emotion in that list ('positive', for example) to the Salsa EmoteR component, which as you know controls the blendshapes; in this instance it would induce a smile, although I find Salsa to be very conservative about this for Reallusion characters (I wish the smile was more smiley). A rough sketch of the idea is below.
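I'm going from memory, so treat this strictly as a sketch: I'm assuming GetCharacterEmotions() on ConvaiLipSync returns a List<string>, the emotion names in the lookup are just examples, and the actual EmoteR trigger is left as a placeholder (check the Salsa docs for the exact call).

```csharp
using System.Collections.Generic;
using UnityEngine;

public class EmotionDrivenExpression : MonoBehaviour
{
    // Reference to the Convai lipsync component on the character.
    [SerializeField] private ConvaiLipSync convaiLipSync;

    // Collapse the many raw emotions into the handful of groups I actually animate with.
    // The emotion names here are illustrative; extend the table with whatever Convai returns.
    private static readonly Dictionary<string, string> EmotionGroups = new Dictionary<string, string>
    {
        { "Joy", "positive" }, { "Gratitude", "positive" },
        { "Anger", "negative" }, { "Sadness", "negative" },
        { "Boredom", "indifferent" },
        { "Calmness", "neutral" }
    };

    private void Update()
    {
        // Assumed to return the latest emotion list received from the server.
        List<string> emotions = convaiLipSync.GetCharacterEmotions();
        if (emotions == null || emotions.Count == 0) return;

        // The first entry has matched the tone of the conversation best for me.
        string dominant = emotions[0];
        string group = EmotionGroups.TryGetValue(dominant, out var g) ? g : "neutral";

        ApplyExpression(group);
    }

    private void ApplyExpression(string group)
    {
        // Placeholder: trigger the matching Salsa EmoteR emote here,
        // e.g. a "smile" emote when the group is "positive".
    }
}
```

In my own setup I only re-apply the expression when the group actually changes rather than every frame, but the idea is the same.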
Send me an email for a more detailed description, I can make a video and give you my code relating to it no problem.
Thanks buddy. I’m gonna see what I can do with that. I don’t like them talking through expressions generally but I’d like to see something - maybe just non-mouth related. Brow morphs or eye squints!
Also - in Salsa, if you specify a blendshape weight you should be able to go above 100 to get more expressive. You can test this in the inspector for the head blendshapes by typing in the value rather than using the slider. Pretty sure.
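If you want to try the same thing from a script instead of the inspector, something like this should do it. The renderer reference and the blendshape name are placeholders for whatever your Reallusion head mesh actually uses, and weights above 100 only show up if blendshape weight clamping is disabled in the project settings.

```csharp
using UnityEngine;

public class BlendshapeOverdriveTest : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer headRenderer;

    private void Start()
    {
        // "Brow_Raise_L" is just an example name; use one from your own mesh.
        int index = headRenderer.sharedMesh.GetBlendShapeIndex("Brow_Raise_L");
        if (index >= 0)
        {
            // Values above 100 exaggerate the shape beyond its authored range.
            headRenderer.SetBlendShapeWeight(index, 150f);
        }
    }
}
```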