Tsinghua Workshop on Machine Learning in Geometry and Physics 2018

Asia/Shanghai
Tsinghua Sanya International Mathematics Forum

Sanya, Hainan, China
Description


We are pleased to announce the first workshop on machine learning in geometry and physics at the Tsinghua Sanya International Mathematics Forum, 11-15 June 2018.

The goal of the workshop is to explore how machine learning techniques can be applied in modern mathematics and theoretical physics. We intend to bring together a diverse set of experts whose research interests and expertise are of general interest to researchers trying to approach problems in formal physics and mathematics from a data science perspective. Some of the topics the workshop intends to cover are:

  • String landscape from a data science perspective
  • Holography, RG flows and deep learning
  • Novel geometric relations from data mining

Machine learning is a powerful new tool for many scientific fields, and is already entering and affecting our daily lives. It allows us to uncover new, often highly non-linear, structures and relations buried in vast amounts of data. In order to apply this machinery, it is imperative to formulate the problem under consideration as a data science question – data is the key for machine learning. This explains why this novel tool has so far been of very limited interest and use in more formal scientific fields like theoretical physics or pure mathematics: what has been missing in these fields is a set of suitable questions which can be viewed from a data science perspective. Recently, however, it has become clear that there are in fact problems in theoretical physics and mathematics which can be approached from a data science point of view, showcasing the discovery potential of machine learning techniques. One emerging hope, for instance, is that hints for new relations or proofs can be discovered in a statistical sense with the aid of modern data analysis techniques based on machine learning – leading to a novel notion of "experimental" mathematics.

We hope that this workshop helps to accelerate the establishment of machine learning as a powerful new tool in research fields related to geometry and theoretical physics.

Venue

Tsinghua Sanya International Mathematics Forum
Sanya, Hainan, China
http://ymsc.tsinghua.edu.cn/sanya/

Accommodation

Participants are lodged at the center with full board. Please indicate any special dietary requirements during registration.

Confirmed Participants

Daniel Krefl (CERN)
Rak-Kyeong Seong (Tsinghua University)
Shing-Tung Yau (Tsinghua University & Harvard University)
Jim Halverson (Northeastern)
Fabian Ruehle (Oxford)
Stefano Carrazza (CERN)
Masato Taki (RIKEN Tokyo)
Sven Krippendorf (Oxford)
Fernando Quevedo (Abdus Salam International Centre for Theoretical Physics, ICTP)
Koji Hashimoto (Osaka University)
Sotaro Sugishita (Japan Society for the Promotion of Science)
Akinori Tanaka (RIKEN)
Akio Tomiya (Central China Normal University)
Jason Morton (Pennsylvania State University)
Cedric Beny (Hanyang University)
Dan Oprisa (Agoda Services Co., Ltd.)
Per Berglund (New Hampshire)
Vishnu Jejjala (Witwatersrand)
Gary Shiu (University of Wisconsin-Madison)
Peter Toth (Google DeepMind)
Artem Lenskiy (Korea University of Technology and Education)
Artur Garcia Saez (Barcelona Supercomputing Center)
J.A. Orduz-Ducuara (UNAM)
Gregory Chirikjian (Johns Hopkins University)
Yi-Zhuang You (UCSD & Harvard)
Maciej Koch-Janusz (ETH Zurich)
Zheng Sun (Sichuan University)
Jing Chen (Flatiron Institute)
Greg Yang (Microsoft Research)
Adam Smith (University of Oxford)
Hiroyuki Fujita (University of Tokyo)
Daham Lee (Tsinghua University)
Shi-Min Hu (Tsinghua University)

Conference Website

http://www.tsimf.cn/meeting/cnshow?id=148


Registration

-- registration is closed --

As the venue only offers limited space, it is mandatory to apply to participate in the workshop (email: stringsml2018 - at - gmail.com). 

Selected participants will receive an official invitation letter with further information regarding travel and visa requirements. 


Organizers

Daniel Krefl (CERN)
Rak-Kyeong Seong (Tsinghua University)
Shing-Tung Yau (Harvard University and Tsinghua University) 

 


Contact

stringsml2018 - at - gmail.com

    • 07:30
      Breakfast
    • 08:40
      Opening Ceremony
    • 1
      Learning and Lie Groups

      Machine learning methods are mostly based on calculus, probability, and statistics on Euclidean spaces. However, many interesting problems can be articulated as learning on lower-dimensional embedded manifolds and on Lie groups. This talk reviews how learning and Lie groups fit together, and how the machine learning community can benefit from modern mathematical developments. The topics include:

      • Introduction to Calculus on Lie Groups (Differential Operators, Integration)
      • Probability on Lie Groups (Convolution, Fourier Analysis, Diffusion Equations)
      • Application 1: Workspace Generation and Inverse Kinematics of Highly Articulated Robotic Manipulators
      • Application 2: Pose Distributions for Mobile Robots
      • Application 3: Lie-Theoretic Invariances in Image Processing and Computer Vision
      • Application 4: Coset Spaces of Lie Groups by Discrete Subgroups in Crystallography
      • Prospects for the Future

      Speaker: Gregory S. Chirikjian (Johns Hopkins University)
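
      A minimal numerical illustration of probability on a Lie group (toy parameters, not from the talk): composing many small random rotations in SO(3) and inspecting the resulting rotation angles, a crude stand-in for diffusion by repeated convolution. Only numpy and scipy are assumed.

        import numpy as np
        from scipy.linalg import expm

        rng = np.random.default_rng(0)

        def random_small_rotation(eps):
            """Rotation matrix exp(Omega) with Omega a random so(3) element of norm eps."""
            w = rng.normal(size=3)
            w *= eps / np.linalg.norm(w)
            Omega = np.array([[0, -w[2], w[1]],
                              [w[2], 0, -w[0]],
                              [-w[1], w[0], 0]])
            return expm(Omega)

        def rotation_angle(R):
            """Geodesic distance of R from the identity in SO(3)."""
            return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

        # Compose n_steps small rotations (repeated "convolution" of the step
        # distribution) and record the total rotation angle for many samples.
        n_samples, n_steps, eps = 2000, 50, 0.05
        angles = []
        for _ in range(n_samples):
            R = np.eye(3)
            for _ in range(n_steps):
                R = random_small_rotation(eps) @ R
            angles.append(rotation_angle(R))

        print("mean angle:", np.mean(angles), "std:", np.std(angles))
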
    • 10:00
      Tea Break
    • 2
      A Generalized Construction of Calabi-Yau Manifolds and Mirror Symmetry

      We extend the construction of Calabi-Yau manifolds to hypersurfaces in non-Fano toric varieties. The associated non-reflexive polytopes provide a generalization of Batyrev’s original work, allowing us to construct new pairs of mirror manifolds. In particular, this allows us to find new K3-fibered Calabi-Yau manifolds, relevant for string compactifications.

      Speaker: Per Berglund (New Hampshire)
    • 3
      Patterns in Calabi-Yau Threefolds

      TBA

      Speaker: Vishnu Jejjala (Witwatersrand)
    • 12:00
      Lunch Break
    • 4
      Neural Program Synthesis and Neural Automated Theorem Proving, via Curry-Howard Correspondence

      The Curry-Howard correspondence is, roughly speaking, the observation that proving a theorem is equivalent to writing a program. Using this principle, I will present a unified survey of recent trends in the application of deep learning to program synthesis and automated theorem proving, with commentary on their applicability to working mathematicians and physicists.

      Speaker: Greg Yang (Microsoft Research)
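
      A one-line illustration of the correspondence, sketched in Lean 4 (any proof assistant would do; not part of the talk): the same term structure can be read either as a program swapping the components of a pair or as a proof that A ∧ B implies B ∧ A.

        -- The "program" reading: swap the components of a pair.
        def swap {A B : Type} (p : A × B) : B × A := (p.2, p.1)

        -- The "proof" reading: the analogous term proves commutativity of ∧.
        theorem and_swap {A B : Prop} (h : A ∧ B) : B ∧ A := ⟨h.2, h.1⟩
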
    • 5
      Quantum Computers and Machine Learning

      I will discuss the two-fold relation between quantum computers and machine learning. On the one hand, quantum computers offer new algorithms to perform training tasks on classical or quantum data. On the other hand, machine learning offers new tools to study quantum matter and to control quantum experiments.

      Speaker: Artur Garcia Saez (Barcelona Supercomputing Center)
    • 15:30
      Tea Break
    • 16:00
      Free Discussion
    • 18:00
      Dinner
    • 07:30
      Breakfast
    • 6
      Renormalization and hierarchical knowledge representations

      Our understanding of any given complex physical system is made up of not just one, but many theories which capture different aspects of the system. These theories are often stitched together only in informal ways. An exception is given by renormalization group techniques, which provide formal ways of hierarchically connecting descriptions at different scales.

      In machine learning, the various layers of a deep neural network seem to represent different levels of abstraction. How does this compare to scale in renormalization? Can one build a common information-theoretic framework underlying those techniques?

      To approach these questions, I compare two different renormalization techniques (which emerged from quantum information theory), and attempt to adapt them to unsupervised learning tasks. One approach, MERA, superficially resembles a deep convolutional neural net, while another approach based on dimensional reduction yields something similar to principal component analysis.

      Speaker: Cedric Beny (Hanyang University)
    • 10:00
      Tea Break
    • 7
      Real-space renormalization group

      Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. These universal properties, largely determining their physical characteristics, are revealed by the renormalization group (RG) procedure, which systematically retains ‘slow’ degrees of freedom and integrates out the rest. We demonstrate a machine-learning algorithm based on a model-independent, information-theoretic characterization of a real-space RG capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We apply it to classical statistical physics problems in one and two dimensions: we demonstrate RG flow and extract critical exponents. We also discuss the optimality of the procedure.

      Speaker: Maciej Koch-Janusz (ETH Zurich)
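
      For contrast with the information-theoretic RG of the talk, the following is the textbook real-space decimation of the zero-field 1D Ising chain, included only to make "executing RG steps iteratively" concrete: integrating out every second spin maps the coupling K to K' with tanh K' = tanh² K.

        import numpy as np

        def decimate(K):
            """One decimation step for the zero-field 1D Ising chain:
            integrating out every second spin gives tanh(K') = tanh(K)**2."""
            return np.arctanh(np.tanh(K) ** 2)

        K = 1.5  # initial dimensionless coupling J / (k_B T)
        for step in range(10):
            print(f"step {step}: K = {K:.6f}")
            K = decimate(K)
        # K flows to 0: the 1D chain has no finite-temperature transition,
        # so every finite coupling is attracted to the trivial fixed point.
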
    • 8
      Topological Data Analysis for Cosmology and String Theory

      Topological data analysis (TDA) is a multi-scale approach in computational topology used to analyze the "shape" of large datasets by identifying which homological characteristics persist over a range of scales. In this talk, I will discuss how TDA can be used to extract physics from cosmological datasets (e.g., primordial non-Gaussianities generated by cosmic inflation) and to explore the structure of the string landscape.

      Speaker: Gary Shiu (University of Wisconsin-Madison)
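
      A small sketch of the basic persistent-homology workflow, assuming the open-source ripser.py package and a toy dataset (a noisy circle standing in for a real cosmological or landscape sample):

        import numpy as np
        from ripser import ripser  # pip install ripser

        rng = np.random.default_rng(1)

        # Noisy points on a circle: a single persistent 1-cycle should dominate H1.
        theta = rng.uniform(0, 2 * np.pi, size=200)
        X = np.stack([np.cos(theta), np.sin(theta)], axis=1)
        X += 0.05 * rng.normal(size=X.shape)

        dgms = ripser(X, maxdim=1)['dgms']   # persistence diagrams for H0 and H1
        lifetimes = dgms[1][:, 1] - dgms[1][:, 0]
        print("most persistent 1-cycle lifetime:", lifetimes.max())
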
    • 12:00
      Lunch Break
    • 14:00
      Excursion: Nanshan Temple
    • 18:00
      Banquet
    • 07:30
      Breakfast
    • 9
      Algebraic geometry of the restricted Boltzmann machine

      TBA

      Speaker: Jason Morton (Pennsylvania State University)
    • 10:00
      Tea Break
    • 10
      Reinforcement learning in the string landscape

      In studying the string landscape, we often want to find vacua with specific properties, but do not know how to select the string geometry that gives rise to such vacua. For this reason, we apply reinforcement learning, a semi-supervised approach to machine learning in which the algorithm explores the landscape autonomously while being guided towards models with given properties. We illustrate the approach using examples from heterotic, type II, and F-theory.

      Speaker: Fabian Ruehle (Oxford)
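
      A toy tabular Q-learning loop (a much simpler relative of the methods in the talk, not the actual string-landscape setup): the "landscape" is the integers 0..49, the action adds or subtracts one unit, and the reward steers the agent towards states with a desired property, here divisibility by 7.

        import numpy as np

        rng = np.random.default_rng(0)
        N, n_actions = 50, 2                  # states 0..N-1; actions: step down / step up
        Q = np.zeros((N, n_actions))
        alpha, gamma, eps = 0.5, 0.9, 0.1     # learning rate, discount, exploration rate

        def reward(s):
            # Toy "property we want": states divisible by 7 play the role of good vacua.
            return 1.0 if s % 7 == 0 else -0.01

        for episode in range(500):
            s = int(rng.integers(N))
            for _ in range(30):
                a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(Q[s]))
                s_next = int(np.clip(s + (1 if a == 1 else -1), 0, N - 1))
                r = reward(s_next)
                # Standard Q-learning update.
                Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
                s = s_next

        print("greedy actions (0 = down, 1 = up):", [int(np.argmax(Q[s])) for s in range(N)])
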
    • 11
      On Finding Small Cosmological Constants with Deep Reinforcement Learning

      I will review the Bousso-Polchinski model and aspects of its computational complexity. An asynchronous advantage actor-critic will be used to find small cosmological constants.

      Speaker: Jim Halverson (Northeastern)
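
      In the Bousso-Polchinski model the cosmological constant is Λ = Λ_bare + ½ Σ_i n_i² q_i², with Λ_bare < 0, integer fluxes n_i, and generic charges q_i. The brute-force baseline below (random search with toy numbers, not the actor-critic of the talk) makes the combinatorial nature of finding tiny |Λ| concrete.

        import numpy as np

        rng = np.random.default_rng(0)

        J = 20                                   # number of flux quanta
        q = rng.uniform(0.1, 1.0, size=J)        # toy charges q_i
        Lambda_bare = -50.0                      # negative bare cosmological constant

        def Lambda(n):
            """Bousso-Polchinski cosmological constant for an integer flux vector n."""
            return Lambda_bare + 0.5 * np.sum(n**2 * q**2)

        best_abs = np.inf
        for _ in range(200_000):                 # naive random search over flux vectors
            n = rng.integers(-10, 11, size=J)
            best_abs = min(best_abs, abs(Lambda(n)))

        print("smallest |Lambda| found by random search:", best_abs)
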
    • 12:00
      Lunch Break
    • 12
      Faster exploration of parameter space in supersymmetry and string theory using machine learning

      TBA

      Speaker: Sven Krippendorf (LMU Munich)
    • 13
      The Nelson-Seiberg theorem, its extensions, string realizations, and possible machine learning applications

      The Nelson-Seiberg theorem relates F-term SUSY breaking and R-symmetries in N=1 SUSY field theories. I will discuss several of its extensions, including a revision to a necessary and sufficient condition, discrete R-symmetries and non-Abelian R-symmetries, the relation to SUSY and W=0 vacua in the string landscape, and some possible machine learning applications in the search for SUSY vacua.

      Speaker: Zheng Sun (Sichuan University)
    • 15:30
      Tea Break
    • 16:00
      Free Discussion
    • 18:00
      Dinner
    • 07:30
      Breakfast
    • 14
      Tensor network from quantum simulations to machine learning

      Tensor networks are both a theoretical and a numerical tool, and have achieved great success in many-body physics, from calculating thermodynamic properties and quantum phase transitions to simulations of black holes. As a general form of high-dimensional data structure, tensors have been adopted in diverse branches of data analysis, such as signal and image processing, psychometrics, quantum chemistry, biometrics, quantum information, black holes, and brain science. A tensor network simulates the interactions between tensors and is becoming a powerful tool in these new fields. In recent years, tensor network numerical methods such as the matrix product state (MPS) and the projected entangled pair state (PEPS) have also found their way into machine learning. Besides, the physical concept of entanglement offers a new theoretical approach to the design of different neural networks.
      For example, we find that graphical models such as the restricted Boltzmann machine (RBM) are equivalent to a specific tensor network, which allows us to study the expressive power of the RBM.

      Phys. Rev. B 97, 085104 (2018)
      arXiv:1712.04144

      Speaker: Jing Chen (Flatiron Institute)
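
      A minimal matrix product state (MPS) sketch, independent of the references above: the amplitude of a basis state is obtained by multiplying one small matrix per site and contracting with boundary vectors, so an exponentially large state vector is encoded by a chain of low-rank tensors.

        import numpy as np

        rng = np.random.default_rng(0)
        L, D = 8, 4                              # number of sites, bond dimension

        # One pair of D x D matrices per site, A[site][spin], plus boundary vectors.
        A = rng.normal(size=(L, 2, D, D)) / np.sqrt(D)
        vL, vR = rng.normal(size=D), rng.normal(size=D)

        def amplitude(bits):
            """<bits|psi> = vL . A[0][s0] A[1][s1] ... A[L-1][s_{L-1}] . vR"""
            vec = vL
            for site, s in enumerate(bits):
                vec = vec @ A[site, s]
            return vec @ vR

        print(amplitude([0, 1, 1, 0, 0, 1, 0, 1]))
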
    • 10:00
      Tea Break
    • 15
      Tensor Network Holography and Deep Learning

      Motivated by the close relation of the renormalization group to both the holographic duality and deep learning, we propose that holographic geometry can emerge from deep learning the entanglement feature of a quantum many-body state. We develop a concrete algorithm, called entanglement feature learning (EFL), based on the random tensor network (RTN) model for tensor network holography. We show that each RTN can be mapped to a Boltzmann machine, trained by the entanglement entropies over all subregions of a given quantum many-body state. The goal is to construct the optimal RTN that best reproduces the entanglement feature. The RTN geometry can then be interpreted as the emergent holographic geometry. We demonstrate the EFL algorithm on a 1D free fermion system and observe the emergence of hyperbolic geometry (AdS_3 spatial geometry) as we tune the fermion system towards the gapless critical point (CFT_2 point).

      Speaker: Yi-Zhuang You (UCSD & Harvard)
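
      The training data ("entanglement feature") for such an approach can be generated for a 1D free fermion chain with the standard correlation-matrix method; the sketch below shows only that data-generation step, not the EFL algorithm itself. The subregion entropy follows from the eigenvalues of the restricted two-point function.

        import numpy as np

        L = 64
        # Tight-binding hopping matrix at half filling (open boundary conditions).
        H = -(np.eye(L, k=1) + np.eye(L, k=-1))
        energies, modes = np.linalg.eigh(H)
        occupied = modes[:, energies < 0]        # fill all negative-energy modes

        # Ground-state two-point correlation matrix C_ij = <c_i^dagger c_j>.
        C = occupied @ occupied.T

        def entanglement_entropy(region):
            """Von Neumann entropy of a subregion from the restricted correlation matrix."""
            nu = np.linalg.eigvalsh(C[np.ix_(region, region)])
            nu = np.clip(nu, 1e-12, 1 - 1e-12)   # regulate log(0)
            return float(-np.sum(nu * np.log(nu) + (1 - nu) * np.log(1 - nu)))

        # Entanglement feature: entropies over a family of subregions (here intervals [0, l)).
        for l in (2, 4, 8, 16, 32):
            print(l, entanglement_entropy(list(range(l))))
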
    • 16
      Reverse engineering Hamiltonian from spectrum

      Handling a large number of degrees of freedom with proper approximations, namely the construction of effective Hamiltonians, is at the heart of (condensed matter) physics. Here we propose a simple scheme for constructing Hamiltonians from a given energy spectrum. The sparse nature of physical Hamiltonians allows us to formulate this as a solvable supervised learning problem. Taking a simple model of correlated electron systems, we demonstrate the data-driven construction of its low-energy effective model. Moreover, we find that the same approach works for the construction of the entanglement Hamiltonian of a given quantum many-body state from its entanglement spectrum. Compared to the known approach based on the full diagonalization of the reduced density matrix, ours is computationally much cheaper, thus offering a way of studying the entanglement nature of large (sub)systems under various boundary conditions.

      Speaker: Hiroyuki Fujita (University of Tokyo)
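
      A toy version of the spectrum-to-Hamiltonian idea (far simpler than the scheme in the talk): fix a small operator basis, generate a target spectrum from hidden couplings, and recover couplings by least-squares matching of the sorted eigenvalues.

        import numpy as np
        from scipy.optimize import minimize

        # Pauli matrices and a two-spin operator basis {Z1 Z2, X1, X2}.
        I2 = np.eye(2)
        X = np.array([[0.0, 1.0], [1.0, 0.0]])
        Z = np.diag([1.0, -1.0])
        basis = [np.kron(Z, Z), np.kron(X, I2), np.kron(I2, X)]

        def spectrum(c):
            Hmat = sum(ci * Oi for ci, Oi in zip(c, basis))
            return np.sort(np.linalg.eigvalsh(Hmat))

        c_true = np.array([1.0, 0.7, 0.3])       # hidden couplings
        target = spectrum(c_true)                # the "given" energy spectrum

        def loss(c):
            return float(np.sum((spectrum(c) - target) ** 2))

        # Note: couplings reproducing a spectrum need not be unique (sign flips, site exchange).
        result = minimize(loss, x0=np.zeros(3), method='Nelder-Mead')
        print("recovered couplings:", result.x, "loss:", result.fun)
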
    • 12:00
      Lunch Break
    • 17
      Toward reduction of autocorrelation in HMC by machine learning

      Recent developments in machine learning (ML), especially deep learning, are remarkable. It has been applied to image recognition, image generation, and so on with very good precision. From a mathematical point of view, images are just real matrices, so it is a natural idea to replace these matrices with configurations of a physical system created by numerical simulation and see what happens. In this talk, I will review our attempt to reduce the autocorrelation of the Hamiltonian Monte Carlo (HMC) algorithm. In addition, I would like to discuss the possibility of using recent sophisticated generative models such as VAEs and GANs to improve HMC. (Work in collaboration with A. Tomiya, arXiv:1712.03893)

      Speaker: Akinori Tanaka (RIKEN)
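
      To make the quantity being attacked concrete, here is a minimal HMC sampler for a 1D Gaussian together with a naive autocorrelation estimate (toy parameters; the ML-assisted proposals of the talk are not reproduced):

        import numpy as np

        rng = np.random.default_rng(0)

        def U(x):      return 0.5 * x**2        # potential; target density ~ exp(-U)
        def grad_U(x): return x

        def hmc_step(x, eps=0.3, n_leapfrog=5):
            p = rng.normal()
            x_new, p_new = x, p
            # Leapfrog integration of Hamiltonian dynamics.
            p_new -= 0.5 * eps * grad_U(x_new)
            for _ in range(n_leapfrog - 1):
                x_new += eps * p_new
                p_new -= eps * grad_U(x_new)
            x_new += eps * p_new
            p_new -= 0.5 * eps * grad_U(x_new)
            # Metropolis accept/reject on the total energy.
            dH = (U(x_new) + 0.5 * p_new**2) - (U(x) + 0.5 * p**2)
            return x_new if rng.random() < np.exp(-dH) else x

        samples = np.empty(5000)
        x = 0.0
        for i in range(len(samples)):
            x = hmc_step(x)
            samples[i] = x

        def autocorr(s, lag):
            s = s - s.mean()
            return float(np.dot(s[:-lag], s[lag:]) / np.dot(s, s))

        print([round(autocorr(samples, k), 3) for k in (1, 2, 5, 10)])
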
    • 18
      Machine learning for parton distribution determination

      Parton Distribution Functions (PDFs) are a crucial ingredient for accurate and reliable theoretical predictions for precision phenomenology at the LHC. The NNPDF approach to the extraction of Parton Distribution Functions relies on Monte Carlo techniques and Artificial Neural Networks to provide an unbiased determination of parton densities with a reliable determination of their uncertainties. I will discuss the NNPDF methodology in general, the latest NNPDF global fit (NNPDF3.1) and then present ideas to improve the training methodology used in the NNPDF fits and related PDFs.

      Speaker: Stefano Carrazza (CERN)
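
      A schematic of the Monte Carlo replica idea behind such fits (a heavily simplified stand-in for the NNPDF methodology, assuming scikit-learn and invented pseudo-data): fluctuate the data within its errors to build replicas, fit one small neural network per replica, and read the uncertainty off the spread of the fits.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Pseudo-data: a smooth "PDF-like" curve measured with Gaussian errors.
        x = np.linspace(0.01, 1.0, 40)
        sigma = 0.03
        y_data = x**0.5 * (1 - x)**3 / 0.1 + rng.normal(0, sigma, size=x.size)

        n_replicas = 20
        fits = []
        for _ in range(n_replicas):
            # Each replica re-fluctuates the data within its quoted uncertainty.
            y_rep = y_data + rng.normal(0, sigma, size=x.size)
            net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000)
            net.fit(x.reshape(-1, 1), y_rep)
            fits.append(net.predict(x.reshape(-1, 1)))

        fits = np.array(fits)
        central, uncertainty = fits.mean(axis=0), fits.std(axis=0)
        i = int(x.searchsorted(0.5))
        print("central value and uncertainty at x ~ 0.5:", central[i], uncertainty[i])
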
    • 15:30
      Tea Break
    • 16:00
      Free Discussion
    • 18:00
      Dinner
    • 07:30
      Breakfast
    • 19
      Deep Learning and AdS/CFT

      We present a deep neural network representation of the AdS/CFT correspondence, and demonstrate the emergence of the bulk metric function via the learning process for given data sets of response in boundary quantum field theories. The emergent radial direction of the bulk is identified with the depth of the layers, and the network itself is interpreted as a bulk geometry. Our network provides a data-driven holographic modeling of strongly coupled systems. For a scalar ϕ4 theory with unknown mass and coupling, in an unknown curved spacetime with a black hole horizon, we demonstrate that our deep learning (DL) framework can determine these so as to fit given response data. First, we show that, from boundary data generated by the AdS Schwarzschild spacetime, our network can reproduce the metric. Second, we demonstrate that our network with experimental data as an input can determine the bulk metric, the mass, and the quadratic coupling of the holographic model. As an example we use the experimental data of the magnetic response of the strongly correlated material Sm0.6Sr0.4MnO3. This AdS/DL correspondence not only enables gravity modeling of strongly correlated systems, but also sheds light on a hidden mechanism of the emerging space in both AdS and DL.
      (Work in collaboration with A.Tanaka, A.Tomiya and S.Sugishita, arXiv:1802.08313)

      Speaker: Koji Hashimoto (Osaka University)
    • 10:00
      Tea Break
    • 20
      Detection of phase transition via convolutional neural networks

      A convolutional neural network (CNN) is designed to study the correlation between the temperature and the spin configuration of the two-dimensional Ising model. Our CNN is able to find the characteristic features of the phase transition without prior knowledge. A novel order parameter based on the CNN is also introduced to identify the location of the critical temperature; the result is found to be consistent with the exact value. This talk is based on the following paper:
      Journal of the Physical Society of Japan 86 (6), 063001 (arXiv:1609.09087).

      Speaker: Akio Tomiya (Central China Normal University)
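
      A compact end-to-end sketch of this kind of setup with a toy lattice size and an assumed small Keras CNN (hyperparameters invented here, not those of the paper): sample 2D Ising configurations with Metropolis on both sides of Tc ≈ 2.269 and train a CNN to tell the two phases apart.

        import numpy as np
        import tensorflow as tf

        rng = np.random.default_rng(0)
        L, Tc = 16, 2.269

        def metropolis(T, n_sweeps=100):
            """Sample one 2D Ising configuration at temperature T (J = 1, periodic b.c.)."""
            s = rng.choice([-1, 1], size=(L, L))
            for _ in range(n_sweeps):
                for _ in range(L * L):
                    i, j = rng.integers(L, size=2)
                    nb = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
                    dE = 2 * s[i, j] * nb
                    if dE <= 0 or rng.random() < np.exp(-dE / T):
                        s[i, j] *= -1
            return s

        # Training set: configurations labelled 0 (below Tc) or 1 (above Tc).
        temps = np.concatenate([rng.uniform(1.2, 2.0, 100), rng.uniform(2.6, 3.4, 100)])
        X = np.array([metropolis(T) for T in temps])[..., None].astype('float32')
        y = (temps > Tc).astype('float32')

        model = tf.keras.Sequential([
            tf.keras.layers.Conv2D(8, 3, activation='relu', input_shape=(L, L, 1)),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(16, activation='relu'),
            tf.keras.layers.Dense(1, activation='sigmoid'),
        ])
        model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
        model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
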
    • 21
      The Role of Machine Learning in High Energy Physics: A Theoretical View

      In this talk, I present some concepts in computing, physics, and mathematics, focusing on high energy physics. I discuss some programming languages and tools implemented for computing amplitudes, decays, and cross sections. In particular, I explore the Two-Higgs-Doublet Model and an Extended Gauge Group Model, and present some results obtained using Artificial Intelligence.

      Speaker: Javier Andres Orduz Ducuara (UNAM)
    • 12:00
      Lunch & Departure