Make Stochastic Gradient Descent Fast Again (Ep. 113)
Episode 113 of the Data Science at Home podcast, hosted by Francesco Gadaleta and titled "Make Stochastic Gradient Descent Fast Again (Ep. 113)", was published on July 22, 2020 and runs 20 minutes.
July 22, 2020 · 20m · Data Science at Home
Summary
There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out: this approach does not generalize well.
Episode Description
There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out: this approach does not generalize well.
Join our Discord channel and chat with us.
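The episode doesn't spell out the method in the show notes, but given the "Taylor Series" and "More descent, less gradient" references below, one plausible reading is to spend an extra loss evaluation on a local second-order Taylor model along the gradient direction and let that model pick the step size, rather than using a fixed learning rate. Here is a minimal Python sketch of that general idea on a least-squares problem; the `taylor_step_size` helper and the demo data are assumptions for illustration, not the exact algorithm discussed in the episode.

```python
import numpy as np

def loss(w, X, y):
    # Mean squared error of a linear model: a simple stand-in objective.
    return 0.5 * np.mean((X @ w - y) ** 2)

def grad(w, X, y):
    return X.T @ (X @ w - y) / len(y)

def taylor_step_size(w, X, y, g, probe=1.0):
    # Hypothetical helper (illustration, not the episode's exact method):
    # fit a 1-D second-order Taylor model of the loss along the descent
    # direction -g, using one extra loss evaluation at a probe point,
    # then return the step that minimizes that quadratic model.
    f0 = loss(w, X, y)
    f1 = loss(w - probe * g, X, y)
    slope = g @ g  # magnitude of the directional derivative at w along -g
    # Model: f(eta) ~ f0 - eta * slope + 0.5 * curv * eta**2
    curv = 2.0 * (f1 - f0 + probe * slope) / probe ** 2
    if curv <= 0:
        return probe  # model is not convex along this line: fall back
    return slope / curv  # minimizer of the quadratic model

# Synthetic linear-regression data (assumed setup for the demo).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)

w = np.zeros(5)
for _ in range(50):
    g = grad(w, X, y)
    w -= taylor_step_size(w, X, y, g) * g

print("final loss:", loss(w, X, y))
```

Compared with a fixed learning rate, the extra loss evaluation buys an adaptive step length along the descent direction, which matches the spirit of "more descent, less gradient": fewer, better-sized steps per gradient computed.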
References
More descent, less gradient
Taylor Series