Hello everyone,
We are thrilled to introduce the new Convai Unreal Engine plugin, powered by Convai's latest Live Character API.
Built on the WebRTC protocol, this update represents a massive leap forward in AI character interaction. It enables ultra-low-latency, real-time engagement and unlocks powerful new capabilities that let your AI characters see, remember, and understand their world like never before.
In this video, we take you to a Mars World Demo to showcase these features live in action!
Key Features in This Update
- WebRTC Powered: Experience significantly reduced latency for seamless, real-time conversations.
- Real-Time Vision: Characters can now "see" dynamic events and objects in the environment and react to them instantly.
- Long-Term Memory: Characters remember past interactions and context, making conversations deeper and more meaningful over time.
- Multimodal Knowledge: AI that understands context beyond just text, integrating visual cues into its knowledge base.
- Hands-Free Conversation: Fluid dialogue without the need for push-to-talk inputs.
Step-by-Step Setup Guide
This video is also a complete tutorial. We guide you through the entire process:
- Installation: How to download the new beta plugin from our GitHub Releases page.
- Importing: How to properly import the plugin into your Unreal Engine project.
- Activation: How to enable the new Vision and Hands-Free features within the Mars Demo Level.
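For the importing step, the usual Unreal workflow is to unzip the release into your project's `Plugins/` folder and then make sure the plugin is enabled in the project's `.uproject` file. A minimal sketch of that config fragment follows; the plugin name `ConvAI` and the engine version are assumptions here, so check the release itself for the exact values:

```json
{
  "FileVersion": 3,
  "EngineAssociation": "5.3",
  "Plugins": [
    {
      "Name": "ConvAI",
      "Enabled": true
    }
  ]
}
```

After editing the file, regenerate your project files (right-click the `.uproject`) and rebuild so the plugin's modules compile alongside your project.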
Resources & Download
You can test these features yourself! Download the sample project and the beta plugin from our documentation below:
- Download & Setup Guide: Unreal Engine Plugin Beta Installation and Setup
We are excited to see how you utilize Vision and Memory in your Unreal Engine projects. Let us know your feedback!
Happy developing!