Why am I getting a wrong answer in Ask?

Getting weird response from Ask or any of our other AI tools? Read on to learn more.

Written by Sophia Terry

We understand that sometimes AI-generated content can throw a curveball, presenting information that appears to come from your Waybook documents but doesn't. In this article, we'll walk you through what to do when you encounter AI-generated responses that seem off-track or inaccurate. Rest assured, we've got your back!

Recognizing and acting on AI Hallucination:

AI Hallucination occurs when the AI model generates responses based on its vast general knowledge instead of your Waybook documents. If this situation occurs, here's what you should do:

  • Thumbs Down: If you come across a response that appears to be an AI Hallucination, use the "Thumbs Down" button on that message. This helps us identify and address such instances.

  • Report to Support: After giving a thumbs down, report the message to our support team. We use your feedback to improve the accuracy of AI-generated content.

Remember that we continuously refine our AI model to reduce AI Hallucination occurrences. Your reports are invaluable in this process, ensuring that Waybook remains a reliable platform for all users.

If you ever encounter AI-generated content that seems off-track, or if you have any questions, please reach out to our friendly and knowledgeable support team. We're here to assist you and ensure your Waybook experience is as smooth as possible. Together, we make Waybook a trusted platform for all your documentation needs!
