Any plans to add an offline version of Convai?
I’ve heard Llama (or some other LLM) has a model that can be run locally, or that they’re actively pursuing that as an option. Is Convai also pursuing such a feature?