Original Discord Post by rahulg1981 | 2024-10-17 17:46:40
Hi,
I have my own Ollama based server with my custom LLM, is it possible to use that in Convai?
Reply by k3kalinix | 2024-10-17 17:48:02
Hello <@117284011507712002>,
We can generally support custom LLMs under a Solutions Partner or Enterprise plan, provided there is a clear use case and a committed enterprise customer for deployment. Please contact our sales team to discuss your specific use case.
Reply by k3kalinix | 2024-10-17 17:48:14
Embedded Content:
Let’s chat! - Convai Team
Sign up for a friendly conversation where we can discuss your use case and answer your questions. For technical help, please join our Discord channel. We’d love to explore how Convai can help you achieve your goals.
Link: Calendly - Convai Team
Reply by rahulg1981 | 2024-10-17 17:49:02
OK, thanks. We do have a requirement, but I need to check this with the team. If they agree, we'll connect soon. Thanks for the quick response.
This conversation happened on the Convai Discord Server, so this post will be closed.