PhD Seminar: Adaptive Methods of Stochastic Optimization

The event has ended

Speaker: Aleksandr Ogaltsov, second-year PhD student, HDI Lab, Faculty of Computer Science
Where: https://zoom.us/j/442268392  
When: October 5, 18:10–19:30 

The talk will be dedicated to several line-search-based gradient methods for stochastic optimization: gradient and accelerated gradient methods for convex optimization, and a gradient method for non-convex optimization. These methods simultaneously adapt to the unknown Lipschitz constant of the gradient and to the variance of the stochastic approximation of the gradient. We will also consider the possibility of extending such methods to gradient-free oracles in the stochastic case.
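To illustrate the line-search idea underlying such methods, below is a minimal deterministic sketch (not the speaker's actual algorithm, which additionally adapts to stochastic variance): a gradient method that adapts its estimate of the Lipschitz constant L by halving it optimistically at each iteration and doubling it until the quadratic upper bound on f is satisfied. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def adaptive_gradient_method(f, grad, x0, L0=1.0, n_iters=200):
    """Gradient descent adapting a Lipschitz estimate L_k via line search
    on the quadratic model f(y) <= f(x) + <g, y - x> + (L/2) * ||y - x||^2.
    Illustrative sketch only; the seminar's methods also handle stochastic
    gradients and adapt to their variance."""
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(n_iters):
        g = grad(x)
        L = max(L / 2.0, 1e-12)  # optimistically shrink L each iteration
        while True:
            y = x - g / L  # gradient step with step size 1/L
            # Accept the step once the quadratic model majorizes f at y
            if f(y) <= f(x) + g @ (y - x) + 0.5 * L * np.sum((y - x) ** 2):
                break
            L *= 2.0  # model too loose: increase L and retry

        x = y
    return x

# Example: minimize the quadratic f(x) = ||x||^2 / 2, minimizer at 0
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x_star = adaptive_gradient_method(f, grad, np.ones(5))
```

The line search needs no prior knowledge of L: the doubling loop terminates once the trial L exceeds the true Lipschitz constant, and the halving step lets the method exploit regions where the gradient varies slowly.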