Hello, I have no documents connected to the MetaHuman, and yet it answers things it should not know.
How can I force it to answer only from the knowledge base, with no ability to hallucinate? For example, when I ask it about “salt from the salt valley”, it answers with general information about how salt can be used, which it should not.
I would like it not to answer anything it doesn’t know; if the topic isn’t in the knowledge base, it should say “I can’t help you with this topic”.
Hi, I’m no expert, but I’m not sure there is a feature that does that. The AI characters draw information from the model’s training data; what you’re asking is for that not to happen, and I’m not sure it can be done.
If you want them to give specific answers to a question, you could add the question and answer in Q&A format to their Knowledge Bank text file, and make sure the Temperature setting in Core AI Settings is set to zero, which makes them less likely to deviate from that information.
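For instance, a Knowledge Bank entry in Q&A format might look something like this (the exact format expected by the platform may differ; this is just a sketch):

```
Q: What is salt from the salt valley?
A: [The exact answer you want the character to give, word for word.]
```

Putting the question in the same wording players are likely to use should make a match more likely.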
That way, if they are asked a question that matches one in the Knowledge Bank, they should more or less answer it as specified there. But I don’t see a way of stopping them from trying to answer things that aren’t in the Knowledge Bank, though I don’t know for sure.
You could try adding something like “You can only answer questions whose answers are in your Knowledge Bank” to their backstory, but I’m unsure whether that would work or not.
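If a prompt instruction alone turns out to be unreliable, another option is to enforce the fallback outside the character entirely: check whether the question actually matches an entry in your knowledge base before letting the character answer, and return the refusal yourself otherwise. This is not a built-in feature of the platform as far as I know; the sketch below is just the general idea, with the matching function, threshold, and fallback string all being assumptions you would tune yourself.

```python
FALLBACK = "I can't help you with this topic."


def keyword_overlap(a: str, b: str) -> float:
    """Fraction of words in question `a` that also appear in stored question `b`."""
    words_a = set(a.lower().split())
    words_b = set(b.lower().split())
    return len(words_a & words_b) / max(len(words_a), 1)


def answer(question: str, kb: dict[str, str], threshold: float = 0.6) -> str:
    """Answer from the knowledge base, or refuse if nothing matches well enough.

    `kb` maps stored questions to their canned answers. If no stored question
    overlaps the incoming one above `threshold`, return the fixed fallback
    instead of letting the model improvise.
    """
    best_q = max(kb, key=lambda q: keyword_overlap(question, q), default=None)
    if best_q is None or keyword_overlap(question, best_q) < threshold:
        return FALLBACK
    return kb[best_q]
```

Simple word overlap is crude (an embedding-based similarity search would match paraphrases better), but it shows the gating pattern: the refusal is guaranteed by your code, not by hoping the model obeys its backstory.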
Can you tell me whether adding “You can only answer questions that you have knowledge of in your Knowledge Bank” to the prompt will stop it from answering questions that aren’t in the knowledge base?