Third HSE-Yandex Autumn School on Generative Models
Tuesday, November 23, 2021 (9:45 AM) - Friday, November 26, 2021 (10:00 PM)

Tuesday, November 23, 2021
9:45 AM - 10:00 AM
Welcome
Andrey Ustyuzhanin (Yandex School of Data Analysis (RU))
Room: Princeton
10:00 AM - 11:20 AM
Generative Models Intro
Denis Derkach (National Research University Higher School of Economics (RU))
Room: Princeton
11:20 AM - 11:40 AM
Coffee Break
Room: Princeton
11:40 AM - 1:00 PM
Introduction to Generative Models. Practice
Room: Princeton
1:00 PM - 2:00 PM
Lunch
2:00 PM - 3:20 PM
GANs Introduction
Denis Derkach (National Research University Higher School of Economics (RU))
Room: Princeton
3:20 PM - 3:40 PM
Tea Break
Room: Princeton
3:40 PM - 5:00 PM
Generative Adversarial Networks. Practice
Room: Princeton
Wednesday, November 24, 2021
10:00 AM - 11:20 AM
Diffusion Generative Models-1
Room: Princeton
11:20 AM - 11:40 AM
Coffee Break
Room: Princeton
11:40 AM - 1:00 PM
Diffusion Generative Models-1. Practice
Room: Princeton
1:00 PM - 2:00 PM
Lunch
2:00 PM - 3:20 PM
Diffusion Generative Models-2
Room: Princeton
Contributions:
2:00 PM  Generative adversarial networks, Artem Maevskiy (HSE University)
3:20 PM - 3:40 PM
Tea Break
Room: Princeton
3:40 PM - 5:00 PM
Diffusion Generative Models-2. Practice
Room: Princeton
Thursday, November 25, 2021
10:00 AM - 11:20 AM
Efficient sampling methods-1
Room: Princeton
11:20 AM - 11:40 AM
Coffee Break
Room: Princeton
11:40 AM - 1:00 PM
Efficient sampling methods-2
Room: Princeton
1:00 PM - 2:00 PM
Lunch
2:00 PM - 3:20 PM
Efficient sampling methods-3
Room: Princeton
3:20 PM - 3:40 PM
Tea Break
Room: Princeton
3:40 PM - 5:00 PM
Efficient sampling methods-4
Room: Princeton
Friday, November 26, 2021
10:00 AM - 10:30 AM
Monte Carlo Variational Auto-Encoders
Maxim Panov
Room: Princeton
10:30 AM - 11:00 AM
Survey of methods of k-means clustering with optimal transport
Alexandra Suvorikova
Room: Princeton
11:00 AM - 11:30 AM
RICH GAN
Sergey Mokhnenko
Room: Princeton
11:30 AM - 12:10 PM
The Robustness of Deep Networks: A Geometrical Perspective
Evgenii Burnaev
Room: Princeton
12:10 PM - 12:30 PM
Coffee Break
Room: Princeton
12:30 PM - 1:00 PM
Resolution-robust Large Mask Inpainting with Fourier Convolutions
Roman Suvorov, Aleksei Silvestrov
Room: Princeton
1:00 PM - 1:30 PM
Problems with Deep Learning: new AI winter or new synthesis?
Mikhail Burtsev
Room: Princeton
1:30 PM - 2:30 PM
Lunch
2:30 PM - 3:00 PM
Material generation with AI
Mikhail Lazarev
Room: Princeton
3:00 PM - 3:30 PM
Uncertainty of Generative Models
Agatha Shishigina
Room: Princeton
3:30 PM - 3:50 PM
Tea Break
Room: Princeton
3:50 PM - 4:10 PM
Black-Box Optimization with Local Generative Surrogates
Sergey Shirobokov
Room: Princeton
4:10 PM - 4:30 PM
Weather tuning via surrogates
Sergey Popov
Room: Princeton
4:30 PM - 5:10 PM
Tackling the Challenge of Uncertainty Estimation and Robustness to Distributional Shift in Real-World Applications
Andrey Malinin
Room: Princeton
While much research has been done on developing methods for improving robustness to distributional shift and uncertainty estimation, most of these methods were developed only for small-scale regression or image classification tasks. Limited work has examined developing standard datasets and benchmarks for assessing these approaches. Furthermore, many tasks of practical interest have different modalities, such as tabular data, audio, text, or sensor data, which offer significant challenges involving regression and discrete or continuous structured prediction. In this work, we propose the Shifts Dataset for evaluation of uncertainty estimates and robustness to distributional shift. The dataset, which has been collected from industrial sources and services, is composed of three tasks, each corresponding to a particular data modality: tabular weather prediction, machine translation, and self-driving car (SDC) vehicle motion prediction. All of these data modalities and tasks are affected by real, 'in-the-wild' distributional shifts and pose interesting challenges with respect to uncertainty estimation. We hope that this dataset will enable researchers to meaningfully evaluate the plethora of recently developed uncertainty quantification methods, assessment criteria, and baselines, and accelerate the development of safe and reliable machine learning in real-world risk-critical applications.

An additional challenge to uncertainty estimation in real-world tasks is that standard approaches, such as model ensembles, are computationally expensive. Ensemble Distribution Distillation (EnDD) is an approach that allows a single model to efficiently capture both the predictive performance and uncertainty estimates of an ensemble. Although theoretically principled, this work shows that the original Dirichlet log-likelihood criterion for EnDD exhibits poor convergence when applied to large-scale tasks where the number of classes is very high. Specifically, we show that in such conditions the original criterion focuses on the distribution of the ensemble tail-class probabilities rather than the probability of the correct and closely related classes. We propose a new training objective which resolves the gradient issues of EnDD and enables its application to tasks with many classes, as we demonstrate on the ImageNet, LibriSpeech, and WMT17 En-De datasets containing 1000, 5000, and 40,000 classes, respectively.
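For orientation, below is a minimal PyTorch sketch of the original Dirichlet log-likelihood criterion for EnDD that the abstract refers to: the student predicts Dirichlet concentration parameters, and the loss is the negative log-likelihood of the ensemble members' categorical predictions under that Dirichlet. The function name, the exp parameterisation of the concentrations, the epsilon smoothing, and the tensor shapes are illustrative assumptions, not the authors' code.

```python
# A minimal sketch of the original Dirichlet log-likelihood criterion for
# Ensemble Distribution Distillation (EnDD), based on the talk abstract.
# Parameterisation and shapes are assumptions for illustration.
import torch

def endd_dirichlet_nll(student_logits, ensemble_probs, eps=1e-8):
    """Negative log-likelihood of ensemble predictions under the
    student's Dirichlet.

    student_logits: (batch, K) -- student outputs; exp() gives the
                    Dirichlet concentrations alpha_k > 0.
    ensemble_probs: (batch, M, K) -- categorical predictions of the
                    M ensemble members.
    """
    alpha = student_logits.exp()               # (batch, K)
    alpha0 = alpha.sum(dim=-1)                 # (batch,), Dirichlet precision

    # Log normaliser: log Gamma(alpha0) - sum_k log Gamma(alpha_k)
    log_norm = torch.lgamma(alpha0) - torch.lgamma(alpha).sum(dim=-1)

    # sum_k (alpha_k - 1) * log pi_k, averaged over the M ensemble members
    log_probs = (ensemble_probs + eps).log()   # (batch, M, K)
    log_lik = ((alpha - 1.0).unsqueeze(1) * log_probs).sum(dim=-1).mean(dim=1)

    return -(log_norm + log_lik).mean()        # scalar training loss
```

Note how the inner sum runs over all K classes: with K in the tens of thousands, it is dominated by the many near-zero tail-class probabilities rather than the correct and closely related classes, which is precisely the convergence issue the abstract says the proposed objective resolves.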
5:10 PM - 5:20 PM
The end
Room: Princeton
6:00 PM - 8:30 PM
School dinner