Conveners
Submitted contributions: Session 1
- Steven Schramm (Universite de Geneve (CH))
Submitted contributions: Session 2
- Boris Escalante-Ramírez (UNAM)
Submitted contributions: Session 3
- Federico Carminati (CERN)
Submitted contributions: Session 4
- Steven Schramm (Universite de Geneve (CH))
Galaxy morphology is one of the most important parameters for understanding the assembly and evolution of galaxies in the universe. The most widely used classification nowadays was proposed by Hubble (1926). This classification is based on the presence of disk-, arm-, and bulge-like structures, and is carried out mostly by eye. Thanks to the upcoming large telescopes and surveys, the new...
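Although the truncated abstract does not name a method, studies of this kind commonly train a convolutional network on labelled galaxy images. The sketch below is illustrative only: the 64x64 single-band cutouts and the three class names are assumptions, not details from the contribution.

```python
# A small CNN that classifies galaxy cutouts into Hubble-type classes.
# Image size, band count, and class names are illustrative assumptions.
import torch
import torch.nn as nn

CLASSES = ["elliptical", "spiral", "irregular"]  # hypothetical labels

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                              # 64x64 -> 32x32
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                              # 32x32 -> 16x16
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
    nn.Linear(64, len(CLASSES)),                  # class logits
)

# One training step on a dummy batch of 8 single-band 64x64 cutouts.
images = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, len(CLASSES), (8,))
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()
```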
We explore the perspectives of machine learning techniques in the context of quantum field theories, based on our recent publication [1]. In particular, we discuss two-dimensional complex scalar field theory at nonzero temperature and chemical potential, a theory with a nontrivial phase diagram. A neural network is successfully trained to recognize the different phases of this system and to...
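As an illustration of the phase-recognition setup described here (a sketch, not the authors' code), the following trains a small classifier on synthetic lattice "configurations"; the 16x16 lattice and the way the two phases are faked are assumptions.

```python
# Train a classifier to distinguish two phases from lattice field
# configurations. The data below are synthetic stand-ins: phase 0 has
# small field magnitude, phase 1 large.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n, volume = 2000, 16 * 16

X0 = rng.normal(0.0, 0.5, size=(n // 2, volume))   # "phase 0" configs
X1 = rng.normal(0.0, 1.5, size=(n // 2, volume))   # "phase 1" configs
X = np.vstack([X0, X1])
y = np.repeat([0, 1], n // 2)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```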
Parton Distribution Functions (PDFs) model the parton content of the proton. Among the many collaborations that have focused on PDF determination over the last 20 years, NNPDF pioneered the use of Neural Networks to model the probability of finding partons (quarks and gluons) inside the proton with a given energy and momentum.
In this work we introduce state-of-the-art techniques to modernize the...
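For context, NNPDF-style fits parametrize a PDF with a neural network multiplied by a preprocessing factor. A minimal sketch of such an ansatz, f(x) = x^(-alpha) (1-x)^beta NN(x), follows; the network size and the fixed exponents are illustrative assumptions.

```python
# An NNPDF-style ansatz: preprocessing factor times a neural network.
# alpha, beta, and the network architecture are illustrative choices.
import torch
import torch.nn as nn

class PDFAnsatz(nn.Module):
    def __init__(self, alpha=0.5, beta=3.0):
        super().__init__()
        self.alpha, self.beta = alpha, beta
        self.net = nn.Sequential(
            nn.Linear(1, 20), nn.Tanh(),
            nn.Linear(20, 1), nn.Softplus(),   # keep NN(x) positive
        )

    def forward(self, x):
        # x: momentum fractions in (0, 1), shape (batch, 1)
        return x ** (-self.alpha) * (1 - x) ** self.beta * self.net(x)

pdf = PDFAnsatz()
x = torch.rand(5, 1) * 0.98 + 0.01
print(pdf(x))   # candidate f(x) values at the sampled x
```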
Studies in high-energy physics rely on cutting-edge particle detector technology. These detectors work under the harsh conditions of constant radiation, and their operation must be strictly supervised to achieve maximum efficiency and quality of data taking. The LHCb Vertex Locator is a silicon strip detector used in Runs I and II of the Large Hadron Collider. We study the application of...
The Belle II particle accelerator experiment experiences substantial background originating away from the interaction point. To avoid recording data from this background, track parameters are estimated within the pipelined and dead-time-free level 1 trigger system of the experiment and used to suppress such events. The estimation of a particle track's origin with respect to the z-axis,...
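A minimal sketch of the kind of regression involved: an MLP mapping a fixed-length vector of trigger-level features to the track's z origin. The 27 inputs, the network width, and the dummy data are assumptions, not details of the Belle II trigger.

```python
# Regress a track's z origin from trigger-level features; events whose
# tracks originate far from the interaction point can then be vetoed.
import torch
import torch.nn as nn

z_net = nn.Sequential(
    nn.Linear(27, 81), nn.Tanh(),
    nn.Linear(81, 1),                    # predicted z of the track origin
)

features = torch.randn(32, 27)           # dummy track-segment features
z_true = torch.randn(32, 1) * 10.0       # dummy true origins (cm)
loss = nn.MSELoss()(z_net(features), z_true)
loss.backward()
```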
At the High Luminosity Large Hadron Collider (HL-LHC), many proton-proton collisions happen during a single bunch crossing. This leads on average to tens of thousands of particles emerging from the interaction region. Two major factors impede finding charged particle trajectories from measured hits in the tracking detectors. First, deciding whether a given set of hits was produced by a common...
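As a concrete baseline for the "produced by a common particle?" decision (not necessarily the contribution's approach), one can fit a circle, the transverse projection of a helix, to the hits and threshold the residual:

```python
# Kasa circle fit: solve a*x + b*y + c = -(x^2 + y^2) by least squares,
# then check how far the hits deviate from the fitted circle.
import numpy as np

def circle_residual(xy):
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, -(x**2 + y**2), rcond=None)
    cx, cy = -a / 2, -b / 2
    r = np.sqrt(cx**2 + cy**2 - c)
    return np.abs(np.hypot(x - cx, y - cy) - r).mean()

# Four hits lying on a circle of radius 5 centred at (0, 5).
hits = np.array([[1.0, 0.101], [2.0, 0.417], [3.0, 1.0], [4.0, 2.0]])
print("same track" if circle_residual(hits) < 0.05 else "unrelated hits")
```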
Deep Learning techniques are being studied for different applications by the HEP community; in this talk, we discuss the case of detector simulation. The number of simulated events needed in the future by the LHC experiments and their High Luminosity upgrades is increasing dramatically and requires new fast simulation solutions. We will describe an R&D activity within CERN openlab, aimed...
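Work in this area frequently relies on generative adversarial networks. Purely as an illustration of that idea (the 8x8 "shower image" and the network shapes are assumptions), one adversarial training step might look like:

```python
# A generator maps noise to a positive "energy deposit" image; a
# discriminator tells real showers from generated ones.
import torch
import torch.nn as nn

latent = 16
G = nn.Sequential(nn.Linear(latent, 64), nn.ReLU(),
                  nn.Linear(64, 8 * 8), nn.Softplus())   # positive deposits
D = nn.Sequential(nn.Linear(8 * 8, 64), nn.LeakyReLU(0.2),
                  nn.Linear(64, 1))                      # real/fake logit

bce = nn.BCEWithLogitsLoss()
real = torch.rand(32, 8 * 8)            # stand-in for full-simulation showers
fake = G(torch.randn(32, latent))

# Discriminator step: real -> 1, fake -> 0.
d_loss = (bce(D(real), torch.ones(32, 1))
          + bce(D(fake.detach()), torch.zeros(32, 1)))
d_loss.backward()

# Generator step: try to make the discriminator call fakes real.
g_loss = bce(D(fake), torch.ones(32, 1))
g_loss.backward()
```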
We examine the discovery potential for double Higgs production at the high-luminosity LHC in the final state with two b-tagged jets, two leptons, and missing transverse momentum. Although this dilepton final state has been considered difficult due to the large backgrounds, we argue that it is possible to obtain sizable signal significance by adopting a deep learning framework making...
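A minimal sketch of such a deep learning classifier, assuming a hypothetical vector of ten event-level kinematic features (lepton and b-jet momenta, missing transverse momentum, and so on); none of the specifics come from the contribution.

```python
# A feed-forward network separating hh signal events from background
# using event-level kinematic features.
import torch
import torch.nn as nn

dnn = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),                    # signal/background logit
)

events = torch.randn(128, 10)            # dummy kinematic feature vectors
is_signal = torch.randint(0, 2, (128, 1)).float()
loss = nn.BCEWithLogitsLoss()(dnn(events), is_signal)
loss.backward()
```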
Online education keeps reaching new frontiers. Modern platforms scale well for transferring the necessary information to the student: illustrations of basic concepts, theoretical foundations, etc. Practical skills, however, are difficult to convey online; they can only be acquired through personal experience. This makes it challenging to study the natural sciences (physics, chemistry,...
In recent times, computational studies have emerged as a viable alternative for complementing the efforts of experienced radiologists in disease diagnosis. Computed tomography (CT) studies are a common way of predicting lung nodule malignancy for the early diagnosis and treatment of lung cancer in patients. Early detection of the type of nodule is key to determining the appropriate...
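Since CT studies are volumetric, a natural model family for nodule scoring is a 3D CNN applied to sub-volumes around candidate nodules. The sketch below is illustrative only; the 32^3 voxel crop and the architecture are assumptions.

```python
# A small 3D CNN that scores a CT sub-volume around a candidate nodule.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool3d(2),                     # 32^3 -> 16^3
    nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool3d(2),                     # 16^3 -> 8^3
    nn.Flatten(),
    nn.Linear(16 * 8 * 8 * 8, 1),        # malignancy logit
)

crops = torch.randn(4, 1, 32, 32, 32)    # dummy normalized CT crops
print(torch.sigmoid(net(crops)))         # malignancy scores in (0, 1)
```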
We demonstrate that it is possible to approach the skin lesion classification problem as a detection problem (localization of multiple lesions combined with classification), a much more complex and interesting problem, with satisfactory and promising results. The image dataset used in the experiments comes from the ISIC Dermoscopic Archive, an open-access dermatology repository. In particular, the ISIC 2017...
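One way to set up classification-as-detection (a sketch, not the authors' pipeline) is to fine-tune an off-the-shelf detector so each predicted box carries a lesion class; the class list below is invented.

```python
# Fine-tune torchvision's Faster R-CNN for lesion detection: swap the
# box predictor head for one with our (assumed) lesion classes.
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

num_classes = 4  # background + e.g. melanoma, nevus, seborrheic keratosis
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
# Training then proceeds with images plus per-image box/label targets.
```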
According to recent studies, Machine Learning has become an in-demand skill in the world of engineering. As a result, one can find hundreds of online courses offering to teach these abilities. Some companies now advertise "Machine Learning Engineer" as a new hiring position. For many engineering students this is a very difficult situation, since taught courses often focus on the...
Random number generation currently plays a fundamental role due to its many applications in probabilistic algorithms (e.g. Monte Carlo methods, stochastic gradient descent, etc.), but mainly because of its importance in cryptography. The most common methods for characterizing random number generators (RNGs) either lack formality (e.g. the battery of tests provided by NIST) or are not...
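As a concrete example of the NIST-style tests mentioned above, the SP 800-22 "monobit" frequency test reduces to a closed-form p-value:

```python
# Monobit frequency test: map bits to +/-1, sum, and compare the
# normalized statistic to a normal distribution.
import math
import random

def monobit_p_value(bits):
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2))   # p < 0.01 -> reject randomness

bits = [random.getrandbits(1) for _ in range(10_000)]
print(monobit_p_value(bits))
```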
Digital pulse shape analysis (DPSA) is becoming an essential tool for extracting relevant information from waveforms arising from different sources. For instance, in the particle detector field, digital techniques compete very favorably against the traditional analog way of extracting the information contained in the pulses coming from particle detectors. Nevertheless, the extraction of the...
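A minimal sketch of classical DPSA feature extraction (baseline, amplitude, 10-90% rise time) on a synthetic digitized pulse; the 4 ns sampling period and the pulse shape are assumptions.

```python
# Extract baseline, amplitude, and rise time from a digitized pulse.
import numpy as np

dt = 4e-9                                    # sampling period (assumed 4 ns)
t = np.arange(256) * dt
tt = np.clip(t - 200e-9, 0, None)            # time after pulse start
pulse = 50.0 * (1 - np.exp(-tt / 20e-9)) * np.exp(-tt / 200e-9)

baseline = pulse[:32].mean()                 # pre-trigger samples
signal = pulse - baseline
amplitude = signal.max()

i10 = np.argmax(signal > 0.1 * amplitude)    # first sample above 10%
i90 = np.argmax(signal > 0.9 * amplitude)    # first sample above 90%
print(f"amplitude={amplitude:.1f}, rise time={(i90 - i10) * dt * 1e9:.0f} ns")
```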
The increasing availability of data, due to the effective and fast sharing methods offered by technological advances, has catalyzed new approaches such as network science to process large data sets and transcend traditional statistical analysis tools. Hence we introduce the tripartite destinations-rates-advisories network to make connections between complex non-linear interrelations that affect...
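A sketch of how such a tripartite network could be assembled with networkx; all node names and edges below are invented placeholders.

```python
# Three node layers (destinations, rates, advisories) in one graph,
# distinguished by a "layer" attribute.
import networkx as nx

G = nx.Graph()
G.add_nodes_from(["Cancun", "Acapulco"], layer="destination")
G.add_nodes_from(["high_rate", "low_rate"], layer="rate")
G.add_nodes_from(["advisory_level_2", "advisory_level_4"], layer="advisory")

G.add_edges_from([
    ("Cancun", "low_rate"), ("Cancun", "advisory_level_2"),
    ("Acapulco", "high_rate"), ("Acapulco", "advisory_level_4"),
])

destinations = [n for n, d in G.nodes(data=True) if d["layer"] == "destination"]
print(destinations, G.degree("Cancun"))
```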
The application of AI methods in the energy industry has been increasing in recent years. The introduction of AI in the different segments of the energy industry has turned out to be revolutionary. Some of these segments are considered fundamental for the economic development of countries. This means that increasing their efficiency can highlight the difference between conservative and...
Human mobility in megacities is a fundamental problem to address and one of the most pressing societal challenges nowadays. Fortunately, we now have at our disposal a vast set of data, through mobile devices and geolocalized social networks, that allows us to explore, using Data Science, the mobility patterns of tens of millions of people on a daily basis. We present here recent results for...
Talk given at SXSW 2019 in Austin, TX.
Contrary to what tech news outlets like Mashable, Techcrunch, and Wired tell us about the future of work and the purported "rise of the machines", we're seeing that Machine Learning developments are still far away from general AI, and hence far away from becoming real problem solvers. However, it is precisely because of this early stage of development...
High Energy Physics utilizes powerful distributed computational networks, called grids, to process and analyze scientific data. Monitoring the security of these networks is a challenging task: arbitrary and untrusted applications can be executed inside the grid worker nodes by scientists. Innovative methods and tools are required to reduce the risk associated with the execution of users'...
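One illustrative approach (an assumption, since the abstract is truncated): describe each grid job by a few numeric features and flag outliers with an Isolation Forest.

```python
# Flag anomalous jobs from per-job features; the features, their
# distributions, and the contamination rate are invented for the sketch.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Columns: CPU seconds, bytes written, outbound connections (per job).
normal_jobs = rng.normal([3600, 1e8, 2], [600, 2e7, 1], size=(500, 3))
odd_job = np.array([[120, 5e10, 400]])     # short job, huge I/O, many conns

clf = IsolationForest(contamination=0.01, random_state=0).fit(normal_jobs)
print(clf.predict(odd_job))                # -1 means flagged as anomalous
```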
Perform big data analysis and visualisation on your own computer? Yes, you can! Commodity computers are now very powerful in comparison to only a few years ago. On top of that, the performance of today's software and data development techniques facilitates complex computation with fewer resources. Cloud computing is not always the solution, and reliability or even privacy is regularly a...
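One concrete technique behind this claim is out-of-core processing: streaming a large file in chunks and aggregating incrementally instead of loading it whole. A minimal pandas sketch, with placeholder file and column names:

```python
# Stream a large CSV in million-row chunks and build category counts
# incrementally; "events.csv" and "category" are placeholders.
import pandas as pd

totals = {}
for chunk in pd.read_csv("events.csv", chunksize=1_000_000):
    for key, count in chunk["category"].value_counts().items():
        totals[key] = totals.get(key, 0) + count

print(pd.Series(totals).sort_values(ascending=False).head())
```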
A critical infrastructure is a complex, interconnected system of systems providing basic and essential services, supporting the operation of particle accelerators as well as industries and households, for which high reliability of critical functions must be guaranteed.
Model-based approaches are usually adopted to provide an early identification of failures and to reveal hidden dependencies among...
The objective of this work is to present an analysis of the performance of a very well-known Convolutional Neural Network applied to the classification of animals in the wild. The interesting aspect of this application is that the images used have characteristics that differ from the ConvNet's training set. The analysis goes beyond the typical display of error rates, precision and...
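For an analysis that goes beyond headline error rates, per-class metrics and the confusion matrix are the usual starting point; the sketch below uses invented labels.

```python
# Per-class precision/recall and the confusion matrix reveal which
# species the network confuses with which.
from sklearn.metrics import classification_report, confusion_matrix

y_true = ["zebra", "zebra", "impala", "impala", "warthog", "warthog"]
y_pred = ["zebra", "impala", "impala", "impala", "warthog", "zebra"]

print(confusion_matrix(y_true, y_pred, labels=["zebra", "impala", "warthog"]))
print(classification_report(y_true, y_pred, zero_division=0))
```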
The comparison of particle collisions is a key component of understanding high-energy physics experiments: it allows the user to check theoretical knowledge against empirical results. The methods currently in use focus mainly on properties of the collision and of the collided particles themselves. In this work, we present a new solution for this task, using real-life examples from the Time...