CC4 Character Audio Playing But Speech Animation and Lip Sync Not Triggering in UE 5.7

Hello Convai Support Team,

I am experiencing an issue where my Character Creator 4 (CC4) character receives audio from the Convai service, but neither the talking animation nor the lip-sync (blendshapes) is triggered.

My System Specifications:

  • Unreal Engine Version: 5.7 (Custom/Latest Build)

  • Convai_Reallusion_AnimBP Version: 1.0.3

  • Character Base: Reallusion Character Creator 4 (CC4)

Problem Description:

  1. Audio Output: The character successfully receives and plays audio from the Convai service.

  2. Animation Transition: The character remains in the “Idle Animation” state and does not transition to the “Talk Animation” state machine, even when audio is playing.

  3. Lip-Sync/Blendshapes: Although the logs show that blendshape frames are being received (e.g., 404 or 1180 frames matched), there is no visible mouth movement on the mesh.

Logs & Observations:

  • Log Warning: ConvaiUtilsLog: Warning: FixCC5LipsyncPostProcessBlendshapes: No post process anim instance found.

  • Log Success: ConvaiSubsystemLog: [BlendshapeTurnStats] Server: Match: YES.

  • Component Setup: Convai Face Sync is set to CC4 Extended Blendshapes and Neurosync provider.

  • AnimBP Issue: Despite casting to the character and setting the isTalking variable, the AnimGraph never switches to the “True” pose in the Blend Poses by bool node.

I have already resolved previous deprecated warnings regarding Get Skeletal Mesh by updating to Get Skeletal Mesh Asset, but the core animation trigger issue persists.

Could you please provide guidance on why the isTalking state might not be correctly recognized by the AnimBP or why the blendshapes are failing to apply to the post-process anim instance?

Hello,

Welcome to the Convai Developer Forum!

Did you select CC4 Extended?

Yes.
What Do the Logs Say?

  • Data Flow: Successful (298 frames received).

  • Face System: Ready (60 modifications completed).

  • LipSync: Appears to be working (bIsPlaying: true).

  • Result: Data is entering Unreal, but because the final connection (Input Pose) that “applies” that data to the character’s face hasn’t been made, no visual movement appears.

That’s what Gemini said when I shared the logs with it.

Could you please share the logs with us? Also, did you add CC4 Extended to your character?

Yes. CC4 Extended Blendshapes selected. I shared the logs as a text file.

Logs.txt (27.4 KB)

I see that you changed your character’s parent blueprint. Do not change the parent of your Player or Character.


Hi there,

In addition to K3’s suggestion to use BP_ConvaiChatbotComponent (rather than reparenting the character), we noticed that the bIsTalking boolean isn’t being set to true correctly.

Please try applying the modification shown in the attached image to the Lipsync node within the Animation Blueprint Event Graph.

Let us know if this resolves the issue!

I made the character the same as in the video. I also applied all the settings from the tutorial. I added the bIsTalking edit you sent as well. But it still doesn’t work. Could it be because I’m using a CC4 character? Did you optimize the face animations for a CC5 character?

New Logs.txt (18.4 KB)

While speaking, the idle animation continues to play. It doesn’t even switch to talk animation.

I fixed the lip issue by using ARKit Blendshapes. However, the problem of the character not transitioning to talk animation still persists, and the character doesn’t focus on me while speaking; the idle animation continues.

Please make sure you follow the tutorial.