Description
Researchers have long highlighted the need for multi-modal approaches in teaching and learning environments, using inclusive material based on user-centred and universal design.
In most cases, the sonification mapping of a dataset has been defined by its creator and shared as a finished product, sometimes even as a musicalisation, rather than as a tool clearly devoted to studying the data, identifying its features, or supporting research.
Taking into account the increasing number of sonification examples in astrophysics, we developed sonification software that translates data in different formats into sound. It is an open-source application with a modular design that allows users to open different datasets and explore them through combined visual and auditory display, with adjustable visual and sound settings to enhance their perception.
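To illustrate the kind of data-to-sound translation such software performs, here is a minimal sketch of a linear pitch mapping. The function name, frequency range, and mapping are illustrative assumptions, not the application's actual implementation:

```python
def sonify_series(values, f_min=220.0, f_max=880.0):
    """Map each data value linearly onto a pitch between f_min and f_max (Hz).

    This is a simplified stand-in for a sonification mapping: higher data
    values become higher pitches, so trends in the data become audible.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for a constant series
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

# A rising data series maps to rising pitches.
freqs = sonify_series([0.0, 0.5, 1.0])
```

In a real application, the resulting frequencies would drive a synthesiser, and the user would be able to adjust the frequency range and mapping to suit their perception.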
The software emerged in the early stages of the REINFORCE project, which made it possible to add multiple new functionalities and developments, including the sonification of diverse sets of data from the demonstrator projects.
In this contribution, we present for the first time the sonification of particle data, applied to two specific cases:
- New Particle Search at CERN, with an innovative approach aimed at classifying different particles using sound alone.
- Cosmic Muon Images, where the sound representation was based on correlating the energy deposited across the three layers of the detector.
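One plausible way to render the three-layer energy correlation described above is to play one short tone per layer, with the loudness of each tone encoding the energy deposited in that layer. The following sketch is a hypothetical illustration of that idea; the frequencies, durations, and function names are assumptions, not the project's actual mapping:

```python
import math
import struct
import wave

SAMPLE_RATE = 44100

def layer_tones(deposits, freqs=(330.0, 440.0, 550.0), dur=0.3):
    """Synthesise one short tone per detector layer.

    Each layer's tone has a fixed pitch (so the listener knows which layer
    is sounding) and an amplitude proportional to the energy deposited in
    that layer, normalised to the largest deposit. Returns 16-bit samples.
    """
    peak = max(deposits) or 1.0
    samples = []
    for dep, f in zip(deposits, freqs):
        amp = 0.8 * dep / peak  # louder tone = more energy in this layer
        n = int(SAMPLE_RATE * dur)
        samples += [int(32767 * amp * math.sin(2 * math.pi * f * t / SAMPLE_RATE))
                    for t in range(n)]
    return samples

def write_wav(path, samples):
    """Write mono 16-bit samples to a WAV file for playback."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

# Hypothetical event: a muon depositing most of its energy in the middle layer
# is heard as a quiet-loud-quiet sequence of three tones.
write_wav("muon_event.wav", layer_tones([1.2, 4.8, 0.6]))
```

Hearing the three tones in sequence lets a listener judge how the energy deposit is distributed through the detector without looking at the image.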
Finally, drawing on two training courses, we will present results on the impact of the multi-modal approach and the improvement it brings to the detection of data features in the study of scientific data.
Collaboration(s): REINFORCE Project Collaboration