Speaker
Description
Time series are present in many sectors, such as weather forecasting, economics, medicine, artificial intelligence, and industry. Analysing these series to predict unknown values of variables in a complex system calls for sophisticated algorithms with a high computational cost. Machine learning algorithms, such as recurrent neural networks, offer plausible solutions.
However, when dealing with multi-layer networks and long series, issues such as overfitting and loss of long-term memory arise. Several approaches intend to address these issues, notably the long short-term memory (LSTM) cell, the gated recurrent unit (GRU), and the echo state network. Despite these approaches, learning from complex multivariate systems is still difficult and requires networks with several non-linear terms that are expensive to compute on classical devices.
Quantum computation emerges as a promising tool to tackle complex problems more efficiently, since it allows one to compute highly non-linear terms directly in a high-dimensional space without the need for exponentially scaling resources. We are currently working on a quantum recurrent neural network that consists of two entangled quantum registers, A and B. Register A exchanges information with the environment by encoding input data into quantum states and measuring qubits to obtain output data. Register B keeps a quantum state that stores information from previous inputs and outputs and is never measured. We focus our research on evaluating architectures for multivariate time series. Such datasets are of significant interest in industry, where one wants to learn the correlations between multiple sensors and predict the behaviour of a production chain.
The aim is to explain this new algorithm in terms of its structure, optimization, development perspectives, and performance relative to classical counterparts.
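The two-register idea described above can be illustrated with a toy classical simulation. The sketch below is an assumption-laden simplification, not the authors' actual circuit: each register is reduced to a single qubit, the entangling unitary is an arbitrary CNOT-plus-rotation, and the readout is the Z expectation value on register A. At every step, register A is re-encoded with the new input and read out, while register B is carried forward as a (possibly mixed) density matrix and never measured.

```python
import numpy as np

# Toy density-matrix simulation of a two-register quantum RNN step.
# Register A (1 qubit) is re-encoded and read out each step;
# register B (1 qubit) carries the memory and is never measured.
# Gates and register sizes here are illustrative choices only.

I2 = np.eye(2, dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def step(rho_B, x):
    """One recurrent step: encode x into A, entangle, read A, keep B."""
    psi_A = np.array([np.cos(x), np.sin(x)], dtype=complex)
    rho_A = np.outer(psi_A, psi_A.conj())          # fresh input state on A
    rho = np.kron(rho_A, rho_B)                    # join A with memory B
    U = np.kron(I2, ry(0.7)) @ CNOT                # illustrative entangler
    rho = U @ rho @ U.conj().T
    y = np.real(np.trace(rho @ np.kron(Z, I2)))    # readout: <Z> on A
    # Partial trace over A: B's state survives, carrying the memory.
    rho_B = np.einsum('abad->bd', rho.reshape(2, 2, 2, 2))
    return rho_B, y

rho_B = np.array([[1, 0], [0, 0]], dtype=complex)  # B starts in |0><0|
outputs = []
for x in [0.1, 0.5, 0.9]:
    rho_B, y = step(rho_B, x)
    outputs.append(y)
```

After each step, `rho_B` remains a valid density matrix (unit trace), and because register A is discarded only after the entangling unitary, `rho_B` depends on the whole input history, which is the memory mechanism the abstract describes.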