PodParley

AI in 2025 – A global perspective, with Kai-Fu Lee

Kai-Fu Lee joins me to discuss AI in 2025. Kai-Fu is a storied AI researcher, investor, inventor and entrepreneur based in Taiwan. As he is one of the leading AI experts based in Asia, I wanted to get his take on this particular market.



First published

01/02/2025

Genres

technology, business, investing, news

Duration

50 minutes

Parent Podcast

Azeem Azhar's Exponential View


Episode Description

Kai-Fu Lee joins me to discuss AI in 2025. Kai-Fu is a storied AI researcher, investor, inventor and entrepreneur based in Taiwan. As he is one of the leading AI experts based in Asia, I wanted to get his take on this particular market.

Key insights:

- Kai-Fu noted that unlike the singular “ChatGPT moment” that stunned Western audiences, the Chinese market encountered generative AI in a more “incremental and distributed” fashion.
- A particularly fascinating shift is how Chinese enterprises are adopting generative AI. Without the entrenched SaaS layers common in the US, Chinese companies are “rolling their own” solutions. This deep integration may be tougher and messier, but it encourages thorough, domain-specific implementations.
- We reflected on a structural shift in how we think about productivity software. With AI “conceptualizing” the document and the user providing strategic nudges, the traditional creative process is effectively reversed.
- We’re moving from a training-centric world to an inference-centric one. Models need to be cheaper, faster and less resource-intensive to run, not just to train. For instance, his team at ZeroOne.ai managed to train a top-tier model on “just” 2,000 H100 GPUs and bring inference costs down to 10 cents per million tokens, a fraction of GPT-4’s early costs.
- In 2025, Kai-Fu predicts, we’ll see fewer “demos” and more “AI-first” applications deploying text, image and video generation tools into real-world workflows.

Connect with us: Exponential View (https://www.exponentialview.co)

Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
