Stability and combinations, with Aleksa Gordić

EPISODE · Sep 28, 2022 · 32 MIN

from London Futurists · host London Futurists

This episode continues our discussion with AI researcher Aleksa Gordić from DeepMind on understanding today's most advanced AI systems.

00:07 This episode builds on Episode 5
01:05 We start with GANs – Generative Adversarial Networks
01:33 Solving the problem of stability, with higher resolution
03:24 GANs are notoriously hard to train; they suffer from mode collapse
03:45 Worse, the model might not learn anything, and the result is pure noise
03:55 DCGANs introduced convolutional layers to stabilise training and enable higher resolution
04:37 The technique of outpainting
05:55 Generating text as well as images, and producing stories
06:14 AI Dungeon
06:28 From GANs to diffusion models
06:48 DDPM (Denoising Diffusion Probabilistic Models) does for diffusion models what DCGANs did for GANs
07:20 They are more stable, and don't suffer from mode collapse
07:30 They do have downsides: they are much more computation-intensive
08:24 What does the word "diffusion" mean in this context?
08:40 It's adopted from physics; the model peels noise away from the image
09:17 Isn't that rewinding entropy?
09:45 One application is making a photo taken in 1830 look like one taken yesterday
09:58 Semantic segmentation masks convert bands of flat colour into realistic images of sky, earth, sea, etc.
10:35 Bounding boxes generate objects of a specified class from tiny inputs
11:00 The images are not taken from previously seen images on the internet, but invented from scratch
11:40 The model saw a lot of images during training, but during the creation process it does not refer back to them
12:40 Failures are eliminated by amendments, as always with models like this
12:55 Scott Alexander blogged about models producing images with wrong relationships, and how this was fixed within three months
13:30 The failure modes get harder to find as the obvious ones are eliminated
13:45 Even with 175 billion parameters, GPT-3 struggled with three-digit arithmetic
15:18 Are you often surprised by what the models do next?
15:50 The research community is like a hive mind, and you never know where the next idea will come from
16:40 Often the next thing comes from a couple of students at a university
16:58 How Ian Goodfellow created the first GAN
17:35 Are the older tribes described by Pedro Domingos (analogisers, evolutionists, Bayesians…) now obsolete?
18:15 We should cultivate different approaches, because you never know where they might lead
19:15 Symbolic AI (aka Good Old-Fashioned AI, or GOFAI) is still alive and kicking
19:40 AlphaGo combined deep learning and GOFAI
21:00 Doug Lenat is still persevering with Cyc, a purely GOFAI approach
21:30 GOFAI models had no learning element; they can't go beyond the humans whose expertise they encapsulate
22:25 The now-famous move 37 in AlphaGo's game two against Lee Sedol in 2016
23:40 Moravec's paradox: easy things are hard, and hard things are easy
24:20 The combination of deep learning and symbolic AI has long been urged, and in fact is already happening
24:40 Will models always demand more and more compute?
25:10 The human brain has far more compute power than even our biggest systems today
25:45 Sparse, or MoE (Mixture of Experts), systems are quite efficient
26:00 We need more compute, better algorithms, and more efficiency
26:55 Dedicated AI chips will help a lot with efficiency
26:25 Cerebras claims that GPT-3 could be trained…
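The stability discussion at 01:33–03:55 turns on the GAN training objective: a discriminator learns to tell real samples from generated ones, while the generator learns to fool it, and this adversarial tug-of-war is what makes training fragile. As a minimal NumPy sketch (not from the episode; the toy 1-D data, the logistic discriminator, and the learning rate are all illustrative assumptions), one discriminator update looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 1-D "images": real data ~ N(3, 1); the generator maps noise z -> w*z + b.
real = rng.normal(3.0, 1.0, size=64)
w, b = 1.0, 0.0                      # generator parameters (untrained)
fake = w * rng.normal(size=64) + b

# Logistic discriminator D(x) = sigmoid(a*x + c).
a, c = 0.5, 0.0
lr = 0.1

# Standard GAN discriminator loss:
#   L_D = -E[log D(real)] - E[log(1 - D(fake))]
d_real, d_fake = sigmoid(a * real + c), sigmoid(a * fake + c)
loss_d = -np.mean(np.log(d_real)) - np.mean(np.log(1.0 - d_fake))

# Gradients of L_D w.r.t. the discriminator parameters (derived by hand
# from the logistic form), followed by one gradient-descent step.
grad_a = -np.mean((1.0 - d_real) * real) + np.mean(d_fake * fake)
grad_c = -np.mean(1.0 - d_real) + np.mean(d_fake)
a -= lr * grad_a
c -= lr * grad_c
```

The generator is then updated in the opposite direction, to raise D(fake); alternating these two updates is the unstable loop the episode describes. Mode collapse (03:24) is the failure case where the generator maps many different noise inputs onto the same few outputs that currently fool the discriminator.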
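The diffusion segment (06:48–08:40) describes models that learn to peel noise away from an image step by step. The forward ("noising") direction has a simple closed form, which a short sketch can make concrete; the schedule constants below follow the commonly cited DDPM defaults and are used here illustratively, not taken from the episode:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear beta schedule; 1000 steps and these endpoints are the DDPM
# paper's commonly quoted defaults, used here only for illustration.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)          # cumulative product ᾱ_t

def q_sample(x0, t, noise):
    """Sample x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(ᾱ_t) * x0 + sqrt(1 - ᾱ_t) * noise."""
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * noise

x0 = rng.normal(size=(8, 8))            # a toy 8x8 "image"
eps = rng.normal(size=(8, 8))

early = q_sample(x0, 10, eps)           # mostly signal
late = q_sample(x0, T - 1, eps)         # almost pure noise
```

The trained network runs this in reverse: given a noisy `x_t`, it predicts the noise component so it can be subtracted out, which is the "peeling noise away" (and the "rewinding entropy" question at 09:17) discussed in the episode.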
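At 25:45 the conversation touches on sparse Mixture-of-Experts models, which save compute by routing each token to only one (or a few) expert sub-networks rather than running every parameter on every input. A hypothetical top-1 routing layer in NumPy (all shapes, names, and the linear experts are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MoE layer: 4 experts, but each token is processed by only its
# top-1 expert, so per-token compute does not grow with expert count.
n_experts, d = 4, 16
experts = [rng.normal(scale=0.1, size=(d, d)) for _ in range(n_experts)]
router = rng.normal(scale=0.1, size=(d, n_experts))

def moe_forward(x):
    logits = x @ router                      # router score per token, per expert
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    top1 = probs.argmax(axis=-1)             # hard top-1 routing decision
    out = np.zeros_like(x)
    for e in range(n_experts):
        mask = top1 == e
        if mask.any():                        # run expert e only on its tokens
            out[mask] = (x[mask] @ experts[e]) * probs[mask, e:e + 1]
    return out, top1

tokens = rng.normal(size=(32, d))
y, assignment = moe_forward(tokens)
```

Scaling the number of experts adds capacity (more parameters) at roughly constant per-token cost, which is why the episode cites sparse/MoE systems as an efficiency lever.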
