Hello everyone,
We are thrilled to share our latest Unreal Engine tutorial! If you are working with high-fidelity avatars, this video shows you exactly how to enable real-time lip sync and facial animation for Reallusion Character Creator 5 (CC5) avatars using Convai’s Unreal Engine plugin.
By combining the incredible visual quality of Reallusion characters with Convai’s conversational AI pipeline, you can create fully interactive avatars that see, hear, and speak with stunning realism.
Powered by NeuroSync
The core of this integration is driven by NeuroSync, our in-house neural animation AI model.
- Real-Time Audio Analysis: NeuroSync processes AI-generated audio on the fly to produce synchronized blend shapes with ultra-low latency (a minimal sketch of this data flow follows this list).
- Nuanced Expressions: It allows your character to deliver highly accurate lip sync and nuanced facial expressions that adapt dynamically to the conversation.
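To make that data flow concrete, here is a minimal, self-contained C++ sketch of the contract the bullets describe: streamed audio chunks go in, per-frame blend-shape weights come out. Everything in it (the AudioFrame and BlendShapeFrame types, the RMS-based stand-in for the neural model) is illustrative and not part of the Convai or NeuroSync API.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// A chunk of streamed TTS audio, as it might arrive from the conversation pipeline.
struct AudioFrame {
    std::vector<float> Samples;
    int SampleRateHz = 16000;
};

// ARKit-style blend-shape weights in [0, 1]; real face models emit ~52 channels.
struct BlendShapeFrame {
    float JawOpen = 0.0f;
    float MouthFunnel = 0.0f;
    float MouthSmileLeft = 0.0f;
    // ...remaining channels omitted for brevity.
};

// Stand-in for the neural model: derives jaw openness from the chunk's RMS
// energy, purely to show the shape of the audio-in / weights-out pipeline.
BlendShapeFrame InferBlendShapes(const AudioFrame& Frame) {
    double Energy = 0.0;
    for (float Sample : Frame.Samples) {
        Energy += static_cast<double>(Sample) * Sample;
    }
    const float Rms = Frame.Samples.empty()
        ? 0.0f
        : static_cast<float>(std::sqrt(Energy / Frame.Samples.size()));

    BlendShapeFrame Out;
    Out.JawOpen = std::min(1.0f, Rms * 8.0f);  // crude loudness-to-jaw mapping
    return Out;
}

int main() {
    // Simulate one 30 ms chunk (480 samples at 16 kHz) arriving mid-sentence.
    AudioFrame Chunk{std::vector<float>(480, 0.2f), 16000};
    BlendShapeFrame Pose = InferBlendShapes(Chunk);
    std::printf("JawOpen = %.2f\n", Pose.JawOpen);
    return 0;
}
```

In the real plugin the inference runs continuously against the streaming TTS audio, so each incoming chunk yields a fresh frame of weights that the face rig consumes on the next tick.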
What You Will Learn in This Workflow
We walk you through the entire process from start to finish. By the end of this video, you will have a conversational, lip-syncing Reallusion character perfectly suited for training simulations, interactive NPCs, XR experiences, and brand avatars.
This tutorial covers:
- Exporting from CC5: The correct settings and workflow to export your avatar from Reallusion’s Character Creator 5.
- Component Setup: Configuring the “Chatbot” and “FaceSync” components inside Unreal Engine (see the C++ sketch after this list).
- Bringing It to Life: Tying the visual fidelity to our real-time AI conversation system.
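The video performs the component setup in Blueprints; for readers who prefer C++, here is a minimal sketch of what an equivalent character class might look like. The component class names (UConvaiChatbotComponent, UConvaiFaceSyncComponent) and the include paths are assumptions based on the component names above, so check the plugin's headers for the exact types in your version.

```cpp
// --- CC5ConvaiCharacter.h (sketch) ---
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "CC5ConvaiCharacter.generated.h"

class UConvaiChatbotComponent;   // assumed name: drives the AI conversation
class UConvaiFaceSyncComponent;  // assumed name: applies NeuroSync blend shapes

UCLASS()
class ACC5ConvaiCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    ACC5ConvaiCharacter();

protected:
    // Handles speech input, the AI response, and streamed TTS audio.
    UPROPERTY(VisibleAnywhere, Category = "Convai")
    UConvaiChatbotComponent* Chatbot;

    // Consumes the streamed audio and drives the CC5 facial blend shapes.
    UPROPERTY(VisibleAnywhere, Category = "Convai")
    UConvaiFaceSyncComponent* FaceSync;
};

// --- CC5ConvaiCharacter.cpp (sketch) ---
#include "CC5ConvaiCharacter.h"
// Header names below are assumptions; consult the plugin source for the real paths.
#include "ConvaiChatbotComponent.h"
#include "ConvaiFaceSyncComponent.h"

ACC5ConvaiCharacter::ACC5ConvaiCharacter()
{
    // Attach both components so the avatar can both converse and lip-sync.
    Chatbot  = CreateDefaultSubobject<UConvaiChatbotComponent>(TEXT("Chatbot"));
    FaceSync = CreateDefaultSubobject<UConvaiFaceSyncComponent>(TEXT("FaceSync"));
}
```

Whether you wire this up in C++ or follow the Blueprint steps in the video, the end result is the same: both components live on the character actor, and the FaceSync component targets the CC5 skeletal mesh.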
Resources & Downloads
Ready to bring your Reallusion characters to life? Get the plugin and read the setup guide below:
- Convai Plugin (Fab): Get the Plugin Here
- Setup Documentation: Unreal Engine Plugin Beta Overview
ICYMI (In Case You Missed It):
Want to make these high-fidelity avatars even smarter? Check out our other tutorial on giving your AI Characters Streaming Vision Input and Hands-Free Interactions.
We can’t wait to see the incredible, lifelike characters you create with CC5 and Convai. Feel free to share your projects or ask any questions in the replies below!
Happy developing!