Oct 20 – 25, 2019
America/Mexico_City timezone

Neural Network optimisation

Oct 25, 2019, 3:00 PM


Andrey Ustyuzhanin (Yandex School of Data Analysis (RU))


Modern neural network architectures reflect the complexity of the problems they solve, so they can become quite complex and computationally heavy. There are usually plenty of meta-parameters to tune: number of layers, activation function, number of neurons per layer, drop-out rate, etc. Many different methods and tools exist for tuning those parameters towards various objectives: accuracy, memory footprint, or inference speed. This mini-course will cover the basic approaches to neural network optimization, including hyperparameter optimization, network architecture search, and the Bayesian Neural Network perspective. Practical hands-on sessions will follow the theoretical introduction.
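As a small illustration of the hyperparameter-optimization idea mentioned above, the sketch below runs a random search over the kinds of meta-parameters listed (number of layers, units per layer, drop-out rate, activation). The search space and the `score_config` function are hypothetical stand-ins, not part of the course materials; in practice the score would come from training the network and measuring validation accuracy.

```python
import random

# Hypothetical search space over the meta-parameters mentioned in the
# abstract; the exact choices here are illustrative, not prescribed.
SEARCH_SPACE = {
    "n_layers": [1, 2, 3, 4],
    "n_units": [32, 64, 128, 256],
    "dropout": [0.0, 0.1, 0.3, 0.5],
    "activation": ["relu", "tanh"],
}

def score_config(cfg):
    """Stand-in for a train-and-validate run: a made-up score that
    peaks at moderate depth/width and mild drop-out. Replace this
    with actual model training in a real experiment."""
    depth_term = -(cfg["n_layers"] - 2) ** 2
    width_term = -abs(cfg["n_units"] - 128) / 64.0
    drop_term = -abs(cfg["dropout"] - 0.1) * 2.0
    act_term = 0.5 if cfg["activation"] == "relu" else 0.0
    return depth_term + width_term + drop_term + act_term

def random_search(n_trials=50, seed=0):
    """Sample configurations at random and keep the best-scoring one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}
        score = score_config(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

best_cfg, best_score = random_search()
```

Random search is only the simplest baseline; the methods covered in the course (e.g. Bayesian approaches) use the scores of past trials to choose the next configuration more efficiently.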

Presentation materials

There are no materials yet.