Vectara
Webinar

Evaluating Hallucinations in RAG

Join us as we discuss the critical importance of retrieval in text-based GenAI systems and how Vectara's Grounded Generation is leading the way.

  • Date

  • Time

  • Location: Online


Hallucinations continue to hinder the adoption of AI systems into production. Companies still do not fully trust that the conversational AI systems they build will avoid producing inaccurate results that degrade user confidence. Many vendors claim to all but eliminate hallucinations, but to date there are few impartial mechanisms to quantify how often they occur.

In this webinar replay, we explore open-source tools that provide visibility into the level and impact of hallucinations across many models.

We’ll cover:

  • The impact of hallucinations on AI adoption
  • Open-source tools for quantifying the level of hallucination
  • Best practices for managing the reality of hallucination and mitigating its impact
  • Speaker: Simon Hughes
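To make the idea of "quantifying hallucination" concrete, here is a minimal toy sketch (not Vectara's method, and far simpler than the model-based evaluators discussed in the webinar): it scores a generated answer by the fraction of its tokens that are supported by the retrieved source passage, so an answer that drifts from its source scores lower.

```python
# Toy groundedness heuristic: fraction of answer tokens that also appear
# in the source passage. A low score suggests the answer contains
# unsupported (potentially hallucinated) content. Purely illustrative --
# real evaluators use trained classifiers, not token overlap.

def groundedness(source: str, answer: str) -> float:
    """Return the fraction of answer tokens found in the source (0.0-1.0)."""
    src_tokens = set(source.lower().split())
    ans_tokens = answer.lower().split()
    if not ans_tokens:
        return 1.0  # an empty answer asserts nothing unsupported
    supported = sum(1 for t in ans_tokens if t in src_tokens)
    return supported / len(ans_tokens)

source = "the eiffel tower is in paris and was completed in 1889"
faithful = "the eiffel tower is in paris"
hallucinated = "the eiffel tower is in berlin and opened in 1920"

print(groundedness(source, faithful))      # every token supported -> 1.0
print(groundedness(source, hallucinated))  # unsupported tokens lower the score
```

Token overlap is a deliberately crude proxy: it misses paraphrase and rewards copying. The webinar's point is that even imperfect, transparent metrics like this give teams a starting baseline before moving to stronger model-based evaluation.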

Get started with Vectara

Vectara is the shortest path between question and answer, delivering real business value in minimal time.