The comparison of particle collisions is a key component of understanding high-energy physics experiments: it allows researchers to confront theoretical predictions with empirical results. Currently used methods focus mainly on the physical properties of the collision and of the colliding particles themselves. In this work, we present a new solution to this task, using real-life examples from the Time Projection Chamber (TPC) in the ALICE experiment at CERN. It focuses on the visual representation of particles in 3D space instead of the physical properties of the collision. For this purpose, we used a machine learning model, the Variational Autoencoder (VAE). We compared the obtained results with the edge histogram descriptor, an algorithm from the MPEG-7 Visual Standard used to measure the similarity of edges.
The main advantage of the proposed method is that it finds similar collisions in terms of their visual representation. Moreover, the complexity of the algorithm is linear, the same as in traditional methods. Nevertheless, this quality comes at the cost of execution time: the method is about four times slower than a model based on physical properties.
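The retrieval scheme described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the trained VAE encoder is replaced here by a hypothetical fixed linear projection, and similarity search is a plain linear scan over latent vectors, which is what makes the overall complexity linear in the number of stored collisions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained VAE encoder: a fixed random linear
# projection from a flattened 2D event image to a low-dimensional latent
# vector. In the actual method this would be the encoder of a trained VAE.
IMG_PIXELS, LATENT_DIM = 64 * 64, 16
W = rng.normal(size=(IMG_PIXELS, LATENT_DIM))

def encode(image: np.ndarray) -> np.ndarray:
    """Map a flattened collision image to its latent representation."""
    return image.ravel() @ W

def most_similar(query: np.ndarray, database: list) -> int:
    """Linear scan over latent distances: O(n) in the database size."""
    q = encode(query)
    dists = [np.linalg.norm(q - encode(img)) for img in database]
    return int(np.argmin(dists))

# Toy "collision images" standing in for rendered 3D event views.
db = [rng.random((64, 64)) for _ in range(10)]
query = db[3] + 0.01 * rng.random((64, 64))  # slightly perturbed copy

print(most_similar(query, db))  # retrieves index 3, the near-duplicate
```

Since both the latent-space scan and a scan over physical-property vectors cost one comparison per stored collision, the asymptotic complexity matches; the roughly fourfold slowdown reported above comes from the extra per-item cost of encoding and comparing visual representations.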