AI Characters with Streaming Vision Input and Handsfree Interactions | Convai Unreal Engine Tutorial

Hello everyone,

We are thrilled to introduce the new Convai Unreal Engine Plugin, powered by Convai’s latest Live Character API.

Built on the WebRTC protocol, this update represents a massive leap forward in AI character interaction. It enables ultra-low latency, real-time engagement, and unlocks powerful new capabilities that allow your AI characters to see, remember, and understand their world like never before.

In this video, we take you to a Mars World Demo to showcase these features live in action!

:sparkles: Key Features in This Update

  • :high_voltage: WebRTC Powered: Experience significantly reduced latency for seamless, real-time conversations.

  • :eye: Real-Time Vision: Characters can now “see” dynamic events and objects in the environment and react to them instantly.

  • :brain: Long-Term Memory: Characters remember past interactions and context, making conversations deeper and more meaningful over time.

  • :books: Multimodal Knowledge: AI that understands context beyond just text, integrating visual cues into its knowledge base.

  • :speaking_head: Hands-Free Conversation: Fluid dialogue without the need for push-to-talk inputs.

:hammer_and_wrench: Step-by-Step Setup Guide

This video is also a complete tutorial. We guide you through the entire process:

  1. Installation: How to download the new Beta plugin from our GitHub Releases page.

  2. Importing: Properly importing the plugin into your Unreal Engine project.

  3. Activation: How to enable the new Vision and Hands-Free features within the Mars Demo Level.
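For step 2, Unreal Engine discovers plugins placed under the project's `Plugins/` folder. The sketch below shows that layout using placeholder names — `MyProject` and the `Convai` folder/archive names are illustrative, not the actual release filenames; use the names from the GitHub release you downloaded.

```shell
# Placeholder project folder (the directory containing your .uproject file)
PROJECT_DIR="MyProject"

# Unreal Engine looks for plugins under <Project>/Plugins/
mkdir -p "$PROJECT_DIR/Plugins/Convai"

# In practice, extract the downloaded release archive here instead, e.g.:
#   unzip Convai-Beta.zip -d "$PROJECT_DIR/Plugins/"
# UE identifies a plugin by its .uplugin descriptor file:
touch "$PROJECT_DIR/Plugins/Convai/Convai.uplugin"

# Verify the plugin folder is in place
ls "$PROJECT_DIR/Plugins"
```

After the plugin is in place, reopen the project; Unreal Engine will prompt to rebuild the plugin modules if needed, and the plugin can then be enabled from Edit → Plugins.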

:link: Resources & Download

You can test these features yourself! Download the sample project and the beta plugin from our documentation.

We are excited to see how you utilize Vision and Memory in your Unreal Engine projects. Let us know your feedback!

Happy developing!