We understand that AI-generated content can sometimes throw a curveball by presenting incorrect information that appears to come from your Waybook documents. In this article, we'll walk you through what to do when AI-generated responses seem inaccurate. Rest assured, we've got your back!
Recognizing and Responding to AI Hallucination:
AI Hallucination occurs when the AI model draws on its vast general knowledge instead of your Waybook documents to generate a response. If this happens, here's what you should do:
Thumbs Down: If you receive a response that appears to be an AI Hallucination, use the "Thumbs Down" option to rate the message. This helps us identify and address such instances.
Report to Support: After giving a thumbs down, report the message to our support team. We use your feedback to improve the accuracy of AI-generated content.
Remember that we continuously refine our AI model to reduce AI Hallucination occurrences. Your reports are invaluable in this process, ensuring that Waybook remains a reliable platform for all users.
If you ever encounter AI-generated content that seems inaccurate or have any questions, please get in touch with our support team. We're here to help and ensure your Waybook experience is as smooth as possible.