Convai Avatar conversation going off topic

We are currently on a Convai Scale subscription and run 8 avatar models, each with a different scenario assigned to it.

Recently we have encountered an issue where most of the avatars go way off topic during conversations.

It feels like they are unable to fetch the scenario information from their knowledge banks!

The LLM model used is gemini-2.5-flash.

As a temporary workaround, we had to copy the scenario information from the knowledge bank into the core description section of each avatar.

Is this caused by the Convai platform or by the LLM model?

Could you please help us investigate this problem and advise on a fix?

The project is being tested live at the client's facility, so this is a high-priority issue for us.

Could you please share the following details?

  • Character ID

  • Example Sessions

  • Expected Answers

  • Current Answer

  • KB File Names

  1. 54592db0-af37-11f0-bdc5-42010a7be025
  2. dd3632c0-aa57-11f0-8d31-42010a7be025
  3. cac6f424-b404-11f0-8ddd-42010a7be025
  4. 0b5d07e0-bb20-11f0-b7b9-42010a7be025
  5. 22158ac4-af37-11f0-9fe9-42010a7be025
  6. b9df8ab8-bb20-11f0-b667-42010a7be025
  7. 414dea96-bb29-11f0-b640-42010a7be025
  8. 64e03832-d0fb-11f0-a166-42010a7be027

The problem is that most of the avatars don't follow their scenario information and go way off topic.

Most of them talk about mobile contracts, washing machines, internet service, kettles, etc., which is way off topic. The actual scenarios deal with Service and Sales conversations in a car dealership setting.

You have only shared the character IDs so far.

Could you please share the remaining details as well?

  • Example Sessions

  • Expected Answers

  • Current Answers

  • KB File Names