Evaluating Hallucinations in RAG
Join us as we discuss the critical importance of retrieval in text-based GenAI systems and how Vectara's Grounded Generation is leading the way.
Location: Online
Hallucinations continue to hinder the adoption of AI systems into production. Companies still do not fully trust that the conversational AI systems they build will avoid producing inaccurate results and degrading user confidence. Many vendors claim to all but eliminate hallucinations, but to date there are few impartial mechanisms to quantify the amount of hallucination.
In this webinar replay, we will explore open-source tools that provide visibility into the level and impact of hallucinations across many models.
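As a rough illustration of how such tools can be used to quantify hallucination (this is a hypothetical sketch, not Vectara's actual tooling): evaluation models such as Vectara's open-source HHEM assign each generated answer a factual-consistency score against its retrieved sources, and a simple aggregate like the fraction of answers scoring below a threshold gives a per-model hallucination rate.

```python
# Illustrative sketch: estimate a model's hallucination rate from
# factual-consistency scores (1.0 = fully grounded in the retrieved
# sources, 0.0 = unsupported). The scores below are made up.

def hallucination_rate(scores, threshold=0.5):
    """Fraction of generated answers whose consistency score falls
    below the threshold, i.e. answers judged unsupported."""
    if not scores:
        raise ValueError("need at least one score")
    return sum(1 for s in scores if s < threshold) / len(scores)

# Hypothetical scores for five RAG answers from one model:
scores = [0.92, 0.31, 0.88, 0.75, 0.44]
print(f"hallucination rate: {hallucination_rate(scores):.0%}")
```

In practice the threshold and the scoring model itself are the impartial pieces that matter; the aggregation is trivial once per-answer scores exist.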