Offline Support

Any plans to add an offline version of Convai?

I’ve heard Llama (or some other LLM) has a model that can be run locally, or that running locally is being actively pursued as an option. Is Convai also pursuing such a feature?


Hello @Josh_Hughes_NeoKuro,

Welcome to the Convai Developer Forum!

Thank you for your interest in Convai. I’ve added your request to our Feature Request list.