PodParley

#60 Geometric Deep Learning Blueprint (Special Edition)

Episode 60 of the Machine Learning Street Talk (MLST) podcast, titled "#60 Geometric Deep Learning Blueprint (Special Edition)", was published on September 19, 2021 and runs 213 minutes.




Patreon: https://www.patreon.com/mlst

The last decade has witnessed an experimental revolution in data science and machine learning, epitomised by deep learning methods. Many high-dimensional learning tasks previously thought to be beyond reach -- such as computer vision, playing Go, or protein folding -- are in fact tractable given enough computational horsepower. Remarkably, the essence of deep learning is built from two simple algorithmic principles: first, the notion of representation or feature learning and second, learning by local gradient-descent type methods, typically implemented as backpropagation.

While learning generic functions in high dimensions is a cursed estimation problem, most tasks of interest are not uniform and have strong repeating patterns as a result of the low-dimensionality and structure of the physical world.

Geometric Deep Learning unifies a broad class of ML problems from the perspectives of symmetry and invariance. These principles not only underlie the breakthrough performance of convolutional neural networks and the recent success of graph neural networks but also provide a principled way to construct new types of problem-specific inductive biases.
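The symmetry principle mentioned here can be made concrete with a small sketch (not from the episode, and simplified): a convolution is translation-equivariant, meaning that shifting the input and then convolving gives the same result as convolving and then shifting. With circular (wrap-around) padding the identity is exact:

```python
import numpy as np

def circ_conv(x, w):
    """Circular 1-D convolution of signal x with filter w."""
    n = len(x)
    return np.array([sum(w[k] * x[(i - k) % n] for k in range(len(w)))
                     for i in range(n)])

x = np.arange(8.0)              # a toy signal
w = np.array([1.0, -2.0, 1.0])  # a toy filter
shift = 3

lhs = circ_conv(np.roll(x, shift), w)  # shift the input, then convolve
rhs = np.roll(circ_conv(x, w), shift)  # convolve, then shift the output
assert np.allclose(lhs, rhs)           # translation equivariance holds
```

Graph neural networks play the same game with a different symmetry group: permutations of the node ordering instead of translations of the grid.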

This week we spoke with Professor Michael Bronstein (Head of Graph ML at Twitter), Dr. Petar Veličković (Senior Research Scientist at DeepMind), Dr. Taco Cohen, and Prof. Joan Bruna about their new proto-book Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges.

See the table of contents for this (long) show at https://youtu.be/bIZB1hIJ4u8 

