Original Discord Post by timguhlke | 2024-01-05 02:46:34
Hello everyone, I would like to ask about the general progress and development of Unreal and Reallusion. Progress seems a bit slow here. It would be great to have more tutorials or documentation on custom animations, and especially on emotions. I haven’t found anything on emotions except for the functions in MetaHuman itself. Thanks
Hello <@143473124963254272>, sorry that you are facing this inconvenience. The development we do for MetaHuman also applies to Reallusion characters. We are constantly introducing new features, and with them we introduce new tutorials and documentation.
Reply by freezfast | 2024-01-10 15:36:39
Hi <@143473124963254272>, we will be having a new tutorial on Reallusion where we will cover adding emotion animations
Reply by timguhlke | 2024-01-10 15:47:49
That’s awesome! Thank you! When is the tutorial coming?
We’re targeting the end of next week, but it usually takes longer due to the review process, so it will likely be around the end of this month. Until then, you can share what you need to do and we can help
Okay, that’s great. If you can help me beforehand, that would be great. I’m still unsure exactly what the emotion implementation process looks like with Reallusion.
I see that the MetaHumans have the animation layer “Emotions_Anim”, which I would have to rebuild? And it still has to be linked in the AnimGraph, right? In the EventGraph I also see “Set Emotions”, which I should also recreate?
Is the whole thing then automatically linked to the lipsync?
I think the biggest question is what exactly happens on the screen, and what the differences are. What do M and E stand for, and which part of a Reallusion character would have to be inserted there? Because a Reallusion character’s head is not separated like a MetaHuman’s. Would you have to insert the idle animation plus the matching emotion there?
A small to-do list for the complete workflow, with a focus on what you have to do differently with Reallusion, would be extremely helpful. Thanks
Here is a 5-minute video to clarify some of your points and help you start your integration with Reallusion.
For Reallusion, to get the latest updates like emotions, you should use the following animation blueprint, which is different from the one in the first YouTube tutorial.
Note from the video: the set of emotions, as well as the animation blueprint itself, will change slightly by the time we create the tutorial (end of next week)
Amazing, that helps me so much and is really incredible support. Thank you! Just two more questions, even though I will probably figure these out when testing.
What kind of mocap animations do I have to insert or export for the emotions? You can only export the whole figure, so should it be just the idle position + emotion, or T-pose + emotion? Or does it not matter, because the blueprint only takes the face? And is the lipsync feature included in the beta folder?
Yes indeed, it will not matter: the blueprint will only take the face mocap and discard the body. The lipsync is included in the same blueprint and should work out of the box, the same as with MetaHumans and ReadyPlayerMe. Just make sure to add the FaceSync component to the main character blueprint.
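(Side note for readers of this thread: the face-only behaviour described above can be pictured with a small sketch. This is plain illustrative C++, not the actual Unreal or Reallusion API; `BoneTrack`, `ExtractFaceTracks`, and the `CC_Base_Facial` bone-name prefix are all hypothetical names used only to show the idea of filtering a full-character export down to its facial tracks.)

```cpp
#include <string>
#include <vector>

// Hypothetical stand-in for one animation track; not an engine type.
struct BoneTrack {
    std::string BoneName;
    // ...keyframe data omitted for brevity...
};

// Keep only the facial tracks from a full-character mocap export.
// The "CC_Base_Facial" prefix is an assumption for illustration;
// check the actual bone names on your Reallusion skeleton.
std::vector<BoneTrack> ExtractFaceTracks(const std::vector<BoneTrack>& FullBody) {
    std::vector<BoneTrack> FaceOnly;
    for (const BoneTrack& Track : FullBody) {
        if (Track.BoneName.rfind("CC_Base_Facial", 0) == 0) {
            FaceOnly.push_back(Track);
        }
    }
    return FaceOnly;
}
```

In other words, whatever body pose you export with (idle or T-pose), the body portion is simply discarded and only the face data is used.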
Reply by timguhlke | 2024-01-18 18:46:14
Thanks again for your support here. There are a few things that are not working for me right now, and I was wondering if it’s because of the beta. Otherwise I’ll wait for the video about it, but maybe it should be working already.
1. The List/Think and Talk animations are not being triggered. Both are connected to mocap data.
2. The character switches to the T-pose when I type something in the chat and he responds.
3. The Trust, Fear, Sadness, Disgust, and Anticipation emotion slots are not working yet. When I connect a mocap animation (for example Fear) to a different slot (for example Joy), then the Fear animation works.
As I said, I know it’s still in beta, but I wanted to mention it here in case there is a quick solution.
Reply by freezfast | 2024-01-19 11:38:29
Hi <@143473124963254272>, we’re delaying the tutorial a bit. However, we’re doing another Reallusion blueprint update today; I will ping you as soon as it’s up.
For issues 1 & 2: the animations should normally be triggered, but we can get on a quick call if the issue persists even after updating.
For emotions: they trigger based on the character. To test a particular emotion, you can add it to the initial emotions list when clicking on the character in the viewport.
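(Footnote on slot wiring, for anyone hitting issue 3 above: the reported behaviour, where an animation wired into the Joy slot plays whenever Joy triggers regardless of which emotion the clip was made for, can be pictured with this purely illustrative C++ sketch. `EmotionSlots`, `Wire`, and `Trigger` are made-up names, not the plugin's API.)

```cpp
#include <map>
#include <string>

// Illustrative model of emotion slots: each slot name maps to whatever
// animation clip was wired into it. All names here are hypothetical.
class EmotionSlots {
public:
    void Wire(const std::string& Slot, const std::string& AnimName) {
        Slots[Slot] = AnimName;
    }
    // Triggering an emotion plays whatever is wired into that slot,
    // even if the clip was authored for a different emotion.
    std::string Trigger(const std::string& Emotion) const {
        auto It = Slots.find(Emotion);
        return It != Slots.end() ? It->second : "Idle";
    }
private:
    std::map<std::string, std::string> Slots;
};
```

So wiring the Fear mocap into the Joy slot means a Joy trigger plays the Fear clip, matching the report above; whether a given slot fires at all depends on which emotions the character actually produces, hence the suggestion to seed the initial emotions list for testing.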