Lorenzo Moneta
(CERN),
Omar Andres Zapata Mesa
(Metropolitan Institute of Technology)
19/01/2016, 14:00
Data Analysis - Algorithms and Tools
Oral
ROOT, a data analysis framework, provides advanced statistical methods needed by the LHC experiments for analyzing their data. These include machine learning tools required for classification, regression and clustering.
These methods are provided by TMVA, a toolkit for multivariate analysis within ROOT.
We will present recent developments in TMVA and new interfaces between ROOT and...
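As a rough illustration of the kind of workflow TMVA supports, here is a minimal classification sketch in PyROOT. It assumes a recent ROOT build with the TMVA::DataLoader interface; the input file, tree names and variables ("var1", "var2") are hypothetical.

import ROOT

ROOT.TMVA.Tools.Instance()
output = ROOT.TFile.Open("TMVA_output.root", "RECREATE")
factory = ROOT.TMVA.Factory("TMVAClassification", output,
                            "!V:!Silent:AnalysisType=Classification")

data = ROOT.TFile.Open("input.root")  # hypothetical input file
loader = ROOT.TMVA.DataLoader("dataset")
loader.AddVariable("var1", "F")
loader.AddVariable("var2", "F")
loader.AddSignalTree(data.Get("sig_tree"), 1.0)      # hypothetical tree names
loader.AddBackgroundTree(data.Get("bkg_tree"), 1.0)
loader.PrepareTrainingAndTestTree(ROOT.TCut(""), "SplitMode=Random")

# Book and train a boosted decision tree classifier.
factory.BookMethod(loader, ROOT.TMVA.Types.kBDT, "BDT", "NTrees=200")
factory.TrainAllMethods()
factory.TestAllMethods()
factory.EvaluateAllMethods()
output.Close()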
Thomas James Stevenson
(University of London (GB))
19/01/2016, 14:25
Data Analysis - Algorithms and Tools
Oral
We review the concept of support vector machines (SVMs) and discuss examples of their use in a number of scenarios.
One of the benefits of SVM algorithms, compared with neural networks and decision trees, is that they can be less susceptible to overfitting than those other algorithms are to overtraining. This issue is related to the generalisation of a multivariate algorithm (MVA); a...
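As a minimal sketch of how generalisation is commonly checked, the snippet below trains an SVM with scikit-learn (not the TMVA implementation discussed in the talk) and compares training and held-out test accuracy on a synthetic two-class dataset; a large gap between the two scores would indicate overtraining.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two Gaussian classes in two variables, a stand-in for signal/background.
signal = rng.normal(loc=1.0, scale=1.0, size=(500, 2))
background = rng.normal(loc=-1.0, scale=1.0, size=(500, 2))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(500), np.zeros(500)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)

# Comparable scores here suggest the classifier generalises well.
print("train accuracy:", clf.score(X_train, y_train))
print("test accuracy:", clf.score(X_test, y_test))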
Juan Guillermo Pavez Sepulveda
(Federico Santa Maria Technical University (CL))
19/01/2016, 14:50
Data Analysis - Algorithms and Tools
Oral
In High Energy Physics and many other fields, likelihood ratios are a key tool when reporting results from an experiment. In order to evaluate the likelihood ratio, the likelihood function is needed. However, it is common in HEP to have complex simulations that describe the distribution while not having a description of the likelihood that can be directly evaluated. These simulations are used to...
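One widely used way around this, plausibly related to the approach of this talk, is the classifier-based density-ratio trick: a classifier s(x) trained to separate samples simulated under two hypotheses approximates the likelihood ratio as p0(x)/p1(x) ≈ (1 − s(x))/s(x) when the training samples are balanced. A toy sketch with scikit-learn:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
x0 = rng.normal(0.0, 1.0, size=(5000, 1))  # samples simulated under H0
x1 = rng.normal(0.5, 1.0, size=(5000, 1))  # samples simulated under H1
X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(5000), np.ones(5000)])  # 0 -> H0, 1 -> H1

clf = LogisticRegression().fit(X, y)

x = np.array([[0.25]])
s = clf.predict_proba(x)[0, 1]   # P(H1 | x) for balanced training samples
ratio = (1.0 - s) / s            # approximates p0(x) / p1(x)
print("approximate likelihood ratio at x=0.25:", ratio)  # true value is 1 here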
Lucio Anderlini
(Universita e INFN, Firenze (IT))
19/01/2016, 15:45
Data Analysis - Algorithms and Tools
Oral
Density Estimation Trees (DETs) are decision trees trained on a multivariate dataset to estimate its probability density function. While not competitive with kernel techniques in terms of accuracy, they are incredibly fast, embarrassingly parallel and relatively small when stored to disk.
These properties make DETs appealing in the resource-expensive horizon of the LHC data...
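A toy sketch of the underlying idea follows: each leaf of a recursive partition estimates the density as the fraction of training points in the leaf divided by the leaf volume. Real DETs choose splits by minimising an integrated squared-error loss and prune by cross-validation; this one-dimensional sketch simply splits at the median.

import numpy as np

def build_det(data, lo, hi, n_total, min_leaf=100):
    """Return a nested-tuple tree; each leaf stores the density estimate
    (fraction of all points in the leaf) / (leaf width)."""
    if len(data) <= min_leaf:
        return ("leaf", len(data) / n_total / (hi - lo))
    cut = np.median(data)
    left, right = data[data < cut], data[data >= cut]
    if len(left) == 0 or len(right) == 0:
        return ("leaf", len(data) / n_total / (hi - lo))
    return ("node", cut,
            build_det(left, lo, cut, n_total, min_leaf),
            build_det(right, cut, hi, n_total, min_leaf))

def evaluate(tree, x):
    """Walk the tree to the leaf containing x and return its density."""
    if tree[0] == "leaf":
        return tree[1]
    _, cut, left, right = tree
    return evaluate(left, x) if x < cut else evaluate(right, x)

rng = np.random.default_rng(2)
sample = rng.normal(0.0, 1.0, size=10000)
tree = build_det(sample, sample.min(), sample.max(), len(sample))
print("estimated density at 0:", evaluate(tree, 0.0))  # true value ~0.399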
Aleksei Rogozhnikov
(Yandex School of Data Analysis (RU))
19/01/2016, 16:10
Data Analysis - Algorithms and Tools
Oral
Machine learning tools are commonly used in high energy physics (HEP) nowadays.
In most cases, these are classification models based on artificial neural networks (ANNs) or boosted decision trees (BDTs), which are used to select the "signal" events from data. These classification models are usually trained using Monte Carlo (MC) simulated events.
A frequently used method in HEP analyses is reweighting of MC to reduce the discrepancy between...
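A minimal sketch of the standard histogram-based reweighting referred to above: each simulated event is weighted by the data/MC ratio of the bin it falls into in a chosen variable. The toy samples and binning are illustrative.

import numpy as np

rng = np.random.default_rng(3)
mc = rng.normal(0.0, 1.0, size=100000)    # simulated events
data = rng.normal(0.2, 1.1, size=100000)  # "real" events with a discrepancy

bins = np.linspace(-4, 4, 41)
mc_counts, _ = np.histogram(mc, bins=bins)
data_counts, _ = np.histogram(data, bins=bins)

# Per-bin weight = data/MC ratio, with empty MC bins given weight 0.
ratio = np.divide(data_counts, mc_counts,
                  out=np.zeros_like(data_counts, dtype=float),
                  where=mc_counts > 0)
idx = np.clip(np.digitize(mc, bins) - 1, 0, len(ratio) - 1)
weights = ratio[idx]

# After reweighting, the MC mean should move toward the data mean.
print("MC mean before:", mc.mean(), "after:", np.average(mc, weights=weights))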
Joao Victor Da Fonseca Pinto
(Univ. Federal do Rio de Janeiro (BR))
19/01/2016, 16:35
Data Analysis - Algorithms and Tools
Oral
After the successful operation of the Large Hadron Collider, which resulted in the discovery of the Higgs boson, a new data-taking period (Run 2) has started. For the first time, collisions are produced with centre-of-mass energies of 13 TeV. The luminosity is foreseen to increase, reaching values as high as $10^{34}\,\mathrm{cm}^{-2}\,\mathrm{s}^{-1}$ already in 2015. These changes in experimental conditions...
Sameh Mannai
(Université Catholique de Louvain, Belgium)
19/01/2016, 17:00
Data Analysis - Algorithms and Tools
Oral
The Semi-Digital Hadronic CALorimeter (SDHCAL), using Glass Resistive Plate Chambers (GRPCs), is one of the two hadronic calorimeter options proposed by the ILD (International Large Detector) project for the future International Linear Collider (ILC) experiment.
It is a sampling calorimeter with 48 layers. Each layer has a size of 1 m² and is finely segmented into cells of 1 cm², ensuring a high...