Generative models are revolutionizing numerous industries, from producing stunning visual art to crafting persuasive text. However, these powerful tools can sometimes produce surprising results, known as hallucinations. When an AI model hallucinates, it generates incorrect or nonsensical output that deviates from the desired result. These hal