PodParley

Kernels!

Episode 18 of the Machine Learning Street Talk (MLST) podcast, titled "Kernels!", was published on September 18, 2020 and runs 97 minutes.

September 18, 2020 · 97m · Machine Learning Street Talk (MLST)



Today Yannic "Lightspeed" Kilcher and I spoke with Alex Stenlake about kernel methods. What is a kernel? Do you remember those weird kernel things everyone obsessed over before deep learning? What about the representer theorem and reproducing kernel Hilbert spaces? SVMs and kernel ridge regression? Remember them?! Hope you enjoy the conversation!
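Since the conversation covers kernel ridge regression, the RBF kernel, and the representer theorem, here is a minimal NumPy sketch tying them together. The toy data, gamma, and regularisation value are illustrative choices, not anything from the episode; the point is that the fitted predictor is a weighted kernel expansion over the training points, which is exactly the form the representer theorem guarantees.

```python
# Minimal kernel ridge regression with an RBF kernel.
# Illustrative sketch only; data and hyperparameters are made up.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# Toy 1-D regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

# Dual (kernel) solution: alpha = (K + lambda * I)^{-1} y.
lam = 1e-2
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Predictions are kernel expansions over the training points:
# f(x) = sum_i alpha_i * k(x, x_i).
X_test = np.linspace(-3, 3, 200)[:, None]
y_pred = rbf_kernel(X_test, X) @ alpha
```

Note the cost structure discussed in the episode: the dual solve scales with the number of training points (the Gram matrix is n x n), not with the dimension of the feature expansion, which for the RBF kernel is infinite.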



00:00:00 Tim Intro

00:01:35 Yannic's clever insight from this discussion

00:03:25 Street Talk and Alex intro

00:05:06 How kernels are taught

00:09:20 Computational tractability

00:10:32 Maths 

00:11:50 What is a kernel? 

00:19:39 Kernel latent expansion 

00:23:57 Overfitting 

00:24:50 Hilbert spaces

00:30:20 Compare to DL

00:31:18 Back to Hilbert spaces

00:45:19 Computational tractability 2

00:52:23 Curse of dimensionality

00:55:01 RBF: infinite Taylor series

00:57:20 Margin/SVM 

01:00:07 KRR/dual

01:03:26 Computational complexity: kernels vs deep learning

01:05:03 Good for small problems? (vs deep learning)

01:07:50 What's special about the RBF kernel

01:11:06 Another DL comparison

01:14:01 Representer theorem

01:20:05 Relation to backprop

01:25:10 Connection with NLP/transformers

01:27:31 Where else kernels are good

01:33:29 Thoughts on AI

01:34:34 Deep learning vs dual kernel methods

01:34:35 Outro

