Hallucinations

Released Thursday, 14th September 2023

In this episode, host Jerry Cuomo introduces Mihai Criveti as the teammate he turns to first with any AI-related question. Together, they tackle the complex issue of hallucinations in Large Language Models (LLMs). Mihai explains why these models can sometimes produce misleading or incorrect outputs and offers practical mitigations such as few-shot prompting and Retrieval Augmented Generation (RAG). If you're a business leader interested in implementing AI or simply curious about its limitations, this episode is a must-listen. Gain valuable insights from two experts as they discuss how to use these models more responsibly and effectively.

For a deeper understanding of the topics discussed in this episode, we highly recommend reading Mihai Criveti's article, "Understanding GenAI Large Language Model Limitations, and How Retrieval Augmented Generation Can Help," available on Medium.
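As a rough illustration of the two mitigations mentioned in the episode, the sketch below combines few-shot examples and retrieved context into a single grounded prompt. It is not from the episode or the article: the `generate` function is a placeholder for whichever LLM client you use, and the retrieval step is a naive keyword match standing in for a real vector-store lookup.

```python
# Illustrative sketch: few-shot prompting + Retrieval Augmented Generation (RAG).
# `generate` is a placeholder for a real LLM call; retrieval here is a naive
# keyword overlap, standing in for an actual embedding/vector search.

KNOWLEDGE_BASE = [
    "RAG grounds model answers in documents retrieved at query time.",
    "Few-shot prompting supplies worked examples that steer the model's output.",
    "Hallucinations are fluent but factually incorrect model outputs.",
]

FEW_SHOT_EXAMPLES = [
    ("What is a hallucination in an LLM?",
     "A confident-sounding answer that is not supported by the source material."),
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    q_words = set(query.lower().split())
    ranked = sorted(KNOWLEDGE_BASE,
                    key=lambda doc: len(q_words & set(doc.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(question: str) -> str:
    """Combine retrieved context and few-shot examples into one prompt."""
    shots = "\n".join(f"Q: {q}\nA: {a}" for q, a in FEW_SHOT_EXAMPLES)
    context = "\n".join(f"- {doc}" for doc in retrieve(question))
    return (
        "Answer using ONLY the context below. If the answer is not in the "
        "context, say you don't know.\n\n"
        f"Context:\n{context}\n\n{shots}\n\nQ: {question}\nA:"
    )

def generate(prompt: str) -> str:
    """Placeholder for a call to an actual LLM API."""
    return "<model response>"

if __name__ == "__main__":
    print(build_prompt("How does RAG reduce hallucinations?"))
    print(generate(build_prompt("How does RAG reduce hallucinations?")))
```

The point of the sketch is the prompt construction: the model is asked to answer only from retrieved context and is shown a worked example of the expected answer style, which is the general shape of the mitigations discussed in the episode.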


Key Takeaways:

  • [00:32 - 01:06] Intro
  • [01:54 - 04:10] LLMs and their limitations
  • [04:32 - 07:31] What is meant by hallucination
  • [07:39 - 10:28] How to mitigate hallucinations


* Cover art was created with the assistance of DALL·E 2 by OpenAI.
** Music for the podcast created by Mind The Gap Band - Cox, Cuomo, Haberkorn, Martin, Mosakowski, and Rodriguez.
