Event Classification with Multi-step Machine Learning

18 May 2021, 11:29
13m
Short Talk · Offline Computing · Artificial Intelligence

Speaker

Masahiko Saito (University of Tokyo (JP))

Description

We present the usefulness of multi-step machine learning (ML), in which a task is organized into connected sub-tasks with known intermediate inference goals, as opposed to a single large model trained end-to-end without intermediate sub-tasks. Pre-optimized ML models are connected, and better performance is obtained by re-optimizing the connected system. The selection of an ML model from several small candidate models for each sub-task is performed with an approach based on neural architecture search (NAS). In this paper, DARTS and SPOS-NAS are tested, with the construction of the loss functions improved so that all ML models keep learning smoothly. Using DARTS and SPOS-NAS for the optimization, selection, and connection of multi-step machine learning systems, we find that (1) such a system can quickly and successfully select highly performant model combinations, and (2) the selected models are consistent with baseline algorithms such as grid search, and their outputs are well controlled.
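To illustrate the DARTS-style selection described above, the following is a minimal sketch (not the authors' implementation) of a two-step pipeline in PyTorch: each sub-task chooses among candidate networks through softmax-weighted architecture parameters, and the intermediate and final losses are summed so that every sub-model keeps learning. The network sizes, loss choices, and toy data are assumptions for illustration only.

```python
# Minimal sketch of DARTS-style candidate selection for a two-step pipeline.
# Candidate architectures, loss terms, and data are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftChoice(nn.Module):
    """Softmax-weighted mixture over candidate sub-models (DARTS relaxation)."""
    def __init__(self, candidates):
        super().__init__()
        self.candidates = nn.ModuleList(candidates)
        # Architecture parameters: one logit per candidate model.
        self.alpha = nn.Parameter(torch.zeros(len(candidates)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * m(x) for w, m in zip(weights, self.candidates))

def mlp(n_in, n_hidden, n_out):
    return nn.Sequential(nn.Linear(n_in, n_hidden), nn.ReLU(), nn.Linear(n_hidden, n_out))

# Step 1 infers an intermediate quantity; step 2 infers the final event class.
step1 = SoftChoice([mlp(16, h, 4) for h in (8, 32, 128)])
step2 = SoftChoice([mlp(4, h, 1) for h in (8, 32)])

model_params = [p for m in (step1, step2) for p in m.candidates.parameters()]
arch_params = [step1.alpha, step2.alpha]
opt_w = torch.optim.Adam(model_params, lr=1e-3)
opt_a = torch.optim.Adam(arch_params, lr=3e-3)

def total_loss(x, y_mid, y_final):
    # Sum intermediate and final losses so every sub-model keeps learning.
    mid = step1(x)
    out = step2(mid)
    return F.mse_loss(mid, y_mid) + F.binary_cross_entropy_with_logits(out, y_final)

# Toy data standing in for event features, intermediate targets, and labels.
x = torch.randn(256, 16)
y_mid = torch.randn(256, 4)
y_final = torch.randint(0, 2, (256, 1)).float()

for step in range(100):
    # Alternate weight and architecture updates (first-order DARTS; a single
    # dataset is used here instead of separate train/validation splits).
    for opt in (opt_w, opt_a):
        opt.zero_grad()
        total_loss(x, y_mid, y_final).backward()
        opt.step()

# Discretize: keep the highest-weighted candidate for each sub-task.
print([int(a.argmax()) for a in arch_params])
```

After training, the architecture parameters are discretized by keeping the candidate with the largest weight per sub-task; the chosen combination could then be compared against a grid search over all candidate pairs, in the spirit of the baseline mentioned in the abstract.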

Primary authors

Masahiko Saito (University of Tokyo (JP)), Tomoe Kishimoto (University of Tokyo (JP)), Yuya Kaneta (BrainPad Inc.), Taichi Itoh (BrainPad Inc.), Yoshiaki Umeda (BrainPad Inc.), Junichi Tanaka (University of Tokyo (JP)), Yutaro Iiyama (University of Tokyo (JP)), Ryu Sawada (University of Tokyo (JP)), Koji Terashi (University of Tokyo (JP))
