Description
In recent years, a correspondence has been established between appropriate asymptotic limits of deep neural networks (DNNs), including convolutional ones (CNNs), and machine learning methods based on Gaussian processes (GPs). The ultimate goal of establishing such interrelations is a better theoretical understanding of various machine learning (ML) methods and, ultimately, their improvement. Since Gaussian processes are mathematically similar to Euclidean quantum field theory (QFT), one intriguing consequence of this correspondence is the possibility of applying the vast arsenal of QFT methods to the analysis of deep neural networks.
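The wide-network limit behind this correspondence can be illustrated numerically. The following sketch (an illustration under standard NNGP-style assumptions, not code from the talk) samples many random one-hidden-layer ReLU networks and checks two things: the output variance already matches the analytic GP kernel value at any width, while Gaussianity of the output distribution (vanishing excess kurtosis) emerges only as the width grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(x, width):
    """Output of a random one-hidden-layer ReLU network with 1/sqrt(fan-in)
    weight scaling, the standard construction behind the NNGP limit."""
    d = x.shape[0]
    W = rng.normal(size=(width, d)) / np.sqrt(d)   # hidden-layer weights
    v = rng.normal(size=width) / np.sqrt(width)    # readout weights
    return v @ np.maximum(W @ x, 0.0)

def excess_kurtosis(samples):
    """Deviation from Gaussianity: zero for an exact Gaussian."""
    c = samples - samples.mean()
    return (c**4).mean() / (c**2).mean() ** 2 - 3.0

x = np.ones(10)
outs_narrow = np.array([random_relu_net(x, 4) for _ in range(4000)])
outs_wide = np.array([random_relu_net(x, 400) for _ in range(4000)])

# For ReLU, the GP kernel at equal arguments is K(x, x) = |x|^2 / (2 d) = 0.5
# here, so the output std over random initializations is sqrt(0.5) ~ 0.707 at
# any width; the narrow net, however, is visibly non-Gaussian (heavy-tailed),
# while the wide net's excess kurtosis is close to zero.
```

The many-channel limit of a CNN discussed in the abstract is the same mechanism, with the number of channels playing the role of the width.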
An important feature of convolutional networks is their equivariance (consistency) with respect to symmetry transformations of the input data. Equivariance guarantees that exactly the same filters are applied to every part of the input image, regardless of position, so that the network detects a given object equally well wherever it is located, respecting the symmetry properties of the data. Importantly, the above-mentioned works establishing the interrelations between CNNs and GPs deal only with the translational equivariance of images. On the other hand, there exist investigations of more general equivariant neural Gaussian processes, but without established relations to CNNs in the appropriate limit. In the present work, we fill this gap by establishing a relationship between the many-channel limit of equivariant CNNs and the corresponding equivariant Gaussian process, and hence the QFT with the appropriate symmetry.
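Translation equivariance of a convolutional layer can be verified directly: shifting the input and then convolving gives the same result as convolving and then shifting the output. The sketch below (an illustration with a made-up 1-D signal and filter, not from the talk) checks this for a convolution with periodic boundary conditions, implemented via the FFT.

```python
import numpy as np

rng = np.random.default_rng(1)

def circular_conv1d(signal, kernel):
    """1-D convolution with periodic boundary conditions (a conv layer with
    circular padding), computed via the convolution theorem: pointwise
    multiplication of FFTs, with the kernel zero-padded to the signal length."""
    spectrum = np.fft.fft(signal) * np.fft.fft(kernel, n=len(signal))
    return np.real(np.fft.ifft(spectrum))

signal = rng.normal(size=32)   # toy periodic input
kernel = rng.normal(size=5)    # toy convolutional filter

# Equivariance check: translate-then-filter vs filter-then-translate.
shift = 7
lhs = circular_conv1d(np.roll(signal, shift), kernel)
rhs = np.roll(circular_conv1d(signal, kernel), shift)
# lhs and rhs agree up to floating-point error.
```

For images the same identity holds with 2-D shifts; the present work concerns the analogous property for more general symmetry groups than translations.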
Speaker time zone: Compatible with Europe