684: Get More Language Context out of your LLM
Open-source LLMs, FlashAttention and generative AI terminology: Host Jon Krohn gives us the lift we need to explore the next big steps in generative AI. Listen to the specific way in which Stanford University's "exact attention" algorithm, FlashAttention, could become a competitor for GPT-4's capabilities.
Additional materials: www.superdatascience.com/684
Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.
An episode of the Super Data Science: ML & AI Podcast with Jon Krohn podcast, hosted by Jon Krohn, titled "684: Get More Language Context out of your LLM" was published on June 2, 2023 and runs 5 minutes.