UE5 + Reallusion documentation (Custom Events/Emotions)

Original Discord Post by timguhlke | 2024-01-05 02:46:34

Hello everyone, I would like to ask about the general progress of the Unreal and Reallusion development. It seems a bit slow here. It would be great to have more tutorials or documentation on custom animations and especially emotions. I haven’t found anything on emotions except for the functions in MetaHuman itself. Thanks

Reply by k3kalinix | 2024-01-09 10:51:39

Hello <@143473124963254272>,
I have forwarded your question to my teammate and we will respond as soon as possible.

Reply by k3kalinix | 2024-01-09 10:51:44

<@&1163218672580575372>

Reply by mrd7041 | 2024-01-09 11:11:02

Hello <@143473124963254272>, sorry for the inconvenience. The development we do for MetaHuman also applies to Reallusion characters. We are constantly introducing new features, and along with them we add new tutorials and documentation.

Reply by timguhlke | 2024-01-09 23:18:26

Thanks for the feedback. When could we expect something about emotions?

Reply by freezfast | 2024-01-10 15:36:39

Hi <@143473124963254272>, we will be publishing a new tutorial for Reallusion where we will cover adding emotion animations.

Replying to freezfast’s Message

Reply by timguhlke | 2024-01-10 15:47:49

That’s awesome! Thank you! When is the tutorial coming?

Reply by freezfast | 2024-01-10 15:49:41

We’re targeting the end of next week, but it usually takes longer due to the review process, so realistically around the end of this month. Until then, you can share what you need to do and we can help.

Reply by timguhlke | 2024-01-10 16:05:07

Okay, that’s great. It would be a big help if you could assist me beforehand. I’m still unsure exactly what the emotion implementation process looks like with Reallusion.

I see that the MetaHumans have the animation layer “Emotions_Anim”, which I would have to rebuild? And it still has to be linked in the AnimGraph, right? In the Event Graph I also see “Set Emotions”, which I should also recreate?

Is the whole thing then automatically linked to the lipsync?

I think the biggest question is what exactly happens in the screenshot and what the differences are. What do M and E stand for, and which part of a Reallusion character would have to be inserted there? A Reallusion character doesn’t have a separate head like a MetaHuman. Would you have to insert the idle animation + the matching emotion there?

A small to-do list for the complete workflow, focusing on what you have to do differently with Reallusion, would be extremely helpful. Thanks :heart:

Images:

Reply by freezfast | 2024-01-10 16:37:06

Here is a 5-minute video to clarify some of your points and help you get started with your Reallusion integration.

For Reallusion, to get the latest updates like emotions you should use the following animation blueprint, which is different from the one in the first YouTube tutorial.
Note from the video: the set of emotions as well as the animation blueprint itself will change slightly by the time we create the tutorial (end of next week).
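
As a rough orientation for this step, here is a minimal C++ sketch of the idea behind an emotion layer driven by a “Set Emotions”-style event: the layer only needs an emotion name and a blend weight that the AnimGraph can read. Every class, property, and function name below is an assumption for illustration, not the contents of the actual blueprint.

```cpp
// Illustrative sketch only: an emotion name plus a blend weight that an
// "Emotions" layer in the AnimGraph could read, with a "Set Emotions"-style
// setter. None of these names come from the actual Reallusion blueprint.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "ReallusionEmotionAnimInstance.generated.h"

UCLASS()
class UReallusionEmotionAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Emotion pose/animation to blend over the idle or talk pose (e.g. "Joy").
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Emotions")
    FName CurrentEmotion = NAME_None;

    // 0 = idle only, 1 = full emotion pose; the AnimGraph blends by this alpha.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Emotions",
              meta = (ClampMin = "0.0", ClampMax = "1.0"))
    float EmotionWeight = 0.f;

    // Rough stand-in for a "Set Emotions" custom event in the Event Graph.
    UFUNCTION(BlueprintCallable, Category = "Emotions")
    void SetEmotion(FName Emotion, float Weight)
    {
        CurrentEmotion = Emotion;
        EmotionWeight = FMath::Clamp(Weight, 0.f, 1.f);
    }
};
```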

Reply by timguhlke | 2024-01-10 16:50:46

Amazing, that helps me so much and is really incredible support. Thank you so much. Just two more questions, even though I will probably figure these out while testing.
What kind of mocap animations do I have to insert or export for the emotions, given that you can only export the whole figure? Just the idle pose + emotion, or T-pose + emotion? Or does it not matter because the blueprint only takes the face? And is the lipsync feature included in the beta folder?

Reply by freezfast | 2024-01-10 16:54:32

Yes indeed, it will not matter: the blueprint will only take the face mocap and discard the body. The lipsync is included in the same blueprint and should work out of the box, same as for MetaHumans and ReadyPlayerMe; just make sure to add the FaceSync component to the main character blueprint.
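
The “add the FaceSync component” step could look roughly like the sketch below if done in C++ instead of the editor (where it is simply Add Component on the character blueprint). The UFaceSyncComponent class name and its header path are assumptions standing in for whatever the shipped component is actually called.

```cpp
// Hypothetical character setup: attach a face-sync style component so the
// face mocap / lipsync curves can drive the Reallusion head. UFaceSyncComponent
// is an assumed name for the actual component that ships with the blueprint.
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "FaceSyncComponent.h"          // assumed header for the component
#include "MyReallusionCharacter.generated.h"

UCLASS()
class AMyReallusionCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    AMyReallusionCharacter()
    {
        // Create the component as a default subobject so every instance of the
        // character (and any derived blueprint) carries it automatically.
        FaceSync = CreateDefaultSubobject<UFaceSyncComponent>(TEXT("FaceSync"));
    }

    // Drives the facial curves on the skeletal mesh; the body part of the
    // mocap is ignored by the animation blueprint, as described above.
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Face")
    UFaceSyncComponent* FaceSync = nullptr;
};
```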

Reply by timguhlke | 2024-01-10 16:55:39

I’ll test it right away and get back to you if I have any questions. Thank you very much.

Reply by timguhlke | 2024-01-13 00:33:50

Is there a way to reduce the strength of the lipsync? The character opens his mouth much wider with the beta BP.

Reply by freezfast | 2024-01-14 13:35:48

Yes, it is possible.

Attachments:
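
Conceptually, reducing lipsync strength comes down to scaling the viseme / morph-target weights before they reach the mesh. Below is a minimal sketch of that idea, assuming the curves arrive as a name-to-weight map; the function name and the 0.6 factor are illustrative, and the actual fix shown in the attachments may simply be a blend or strength value inside the blueprint.

```cpp
// A minimal sketch of one way to tone down lipsync: scale the incoming viseme /
// morph-target values by a multiplier below 1.0 before applying them.
#include "Components/SkeletalMeshComponent.h"

void ApplyLipsyncCurves(USkeletalMeshComponent* FaceMesh,
                        const TMap<FName, float>& LipsyncCurves,
                        float Strength = 0.6f) // < 1.0 keeps the mouth more closed
{
    if (!FaceMesh)
    {
        return;
    }

    for (const TPair<FName, float>& Curve : LipsyncCurves)
    {
        // Dampen each viseme weight so the jaw/mouth opens less.
        FaceMesh->SetMorphTarget(Curve.Key, Curve.Value * Strength);
    }
}
```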

Reply by timguhlke | 2024-01-18 18:46:14

Thanks again for your support here. There are a few things that are not working for me right now, and I was wondering if it’s because of the beta. Otherwise I’ll wait for the video, but maybe they should be working already.

  1. The List/Think and Talk animations are not being triggered. Both are connected to mocap data.

  2. The character switches to the T-pose when I type something in the chat and he responds.

  3. The Trust, Fear, Sadness, Disgust, and Anticipation emotion slots are not working yet. When I connect a mocap animation, for example Fear, to a different slot, for example Joy, the Fear animation works.

As I said, I know it’s still in beta, but I wanted to mention it here and maybe there is a quick solution.

Reply by k3kalinix | 2024-01-19 11:00:38

<@&1163218672580575372>

Replying to timguhlke’s Message

Reply by mrd7041 | 2024-01-19 11:33:51

<@365628745886859267> Can you help him?

Replying to timguhlke’s Message

Reply by freezfast | 2024-01-19 11:38:29

Hi <@143473124963254272>, we’re delaying the tutorial a bit. However, we’re doing another Reallusion blueprint update today; I will ping you as soon as it’s up.

For issues 1 & 2, the animations should normally be triggered, but we can get on a quick call if the issues persist even after updating.

Reply by freezfast | 2024-01-19 11:40:29

As for the emotions, they trigger based on the character. To test a particular emotion, you can add it to the initial emotions list by clicking on the character in the viewport.
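
For illustration, this is roughly what an editable initial-emotions list looks like when exposed from C++; the real workflow is just selecting the placed character in the viewport and editing the list in the Details panel. The class, property, and logging below are assumptions, not the actual character blueprint.

```cpp
// Illustrative sketch of an "initial emotions" list that is editable per
// placed instance, mirroring the viewport/Details-panel workflow described
// above. Names are assumed for the example.
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "EmotionTestCharacter.generated.h"

UCLASS()
class AEmotionTestCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // Editable on the placed actor, e.g. add "Fear" here to force that emotion
    // for testing.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Emotions")
    TArray<FName> InitialEmotions = { FName(TEXT("Joy")) };

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Hand each starting emotion to whatever drives the emotion slots; here
        // it is only logged, the real blueprint would call its Set Emotions event.
        for (const FName& Emotion : InitialEmotions)
        {
            UE_LOG(LogTemp, Log, TEXT("Initial emotion: %s"), *Emotion.ToString());
        }
    }
};
```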