Description
Just like their classical counterparts, quantum algorithms require a set of inputs, provided, for example, as real numbers, and a list of operations to be performed on some reference initial state. Unlike in classical computers, however, information in a quantum processor is stored in the form of a wavefunction, which requires special procedures to read out the final results. In fact, it is in general neither possible nor convenient to fully reconstruct this quantum state, so useful insights must be extracted by performing specific observations.
Unfortunately, the number of measurements required by many popular applications is known to grow unsustainably large with the size of the system, even when only partial information is needed. This is, for example, the case for the Variational Quantum Eigensolver, which relies on the reconstruction of average energies. In this talk I will discuss a novel scheme to tackle this problem.
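To make the measurement cost concrete, here is a minimal toy sketch (not taken from the talk) of why estimating an average energy is expensive: for a single qubit in the state cos(θ)|0⟩ + sin(θ)|1⟩ with H = Z, the exact energy is cos(2θ), but a real device only returns ±1 outcomes, and the statistical error of their average shrinks only as 1/√shots.

```python
import numpy as np

# Illustrative sketch, assumed setup: estimate <Z> on the state
# cos(theta)|0> + sin(theta)|1> by simulating projective measurements.
# The exact value is cos(2*theta).

rng = np.random.default_rng(0)

def estimate_z(theta, shots):
    """Sample `shots` Z-basis outcomes (+1 or -1) and average them."""
    p0 = np.cos(theta) ** 2                      # probability of outcome +1
    outcomes = rng.choice([1.0, -1.0], size=shots, p=[p0, 1 - p0])
    return outcomes.mean()

theta = 0.4
exact = np.cos(2 * theta)
for shots in (100, 10_000, 1_000_000):
    est = estimate_z(theta, shots)
    print(shots, abs(est - exact))               # error decays like 1/sqrt(shots)
```

Reducing the error tenfold costs a hundred times more state preparations; for many-qubit Hamiltonians with many terms, this is the scaling problem the abstract refers to.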
We employ a generalised class of quantum measurements that can be iteratively adapted to minimize the number of times the target quantum state must be prepared and observed. As the algorithm proceeds, it reuses previous measurement outcomes to adjust its own settings and increase the accuracy of subsequent runs. We make the most of every sample by combining all data produced while fine-tuning the measurement into a single, highly accurate estimate of the energy, decreasing the expected runtime by several orders of magnitude. Furthermore, all the measurement data contain complete information about the state: once collected, they can be reused again and again to calculate any other property of the system at no additional cost.
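The adapt-and-pool idea can be caricatured with a much simpler toy than the generalised measurements of the talk: split a fixed shot budget between two commuting terms of H = c1·Z1 + c2·Z2, steering each round's shots toward the term whose outcomes have shown the larger spread, and pool all rounds into one estimate. All names and numbers below are illustrative assumptions.

```python
import numpy as np

# Toy sketch, assumed setup (not the talk's actual algorithm): adaptive
# shot allocation between two Pauli terms, with all rounds pooled into
# a single energy estimate rather than discarding earlier data.

rng = np.random.default_rng(1)

def sample_term(mean, shots):
    """Simulate +1/-1 outcomes of a Pauli term with expectation `mean`."""
    p = (1 + mean) / 2
    return rng.choice([1.0, -1.0], size=shots, p=[p, 1 - p])

true = {"Z1": 0.9, "Z2": 0.1}    # <Z1> near 1 => small variance, needs few shots
coeff = {"Z1": 1.0, "Z2": 1.0}   # true energy = 0.9 + 0.1 = 1.0
data = {k: [] for k in true}

for round_ in range(5):
    if round_ == 0:
        # No data yet: split the budget evenly.
        weights = {k: 0.5 for k in true}
    else:
        # Reuse all outcomes so far to estimate each term's spread,
        # then send more shots where the noise is largest.
        sig = {k: np.std(np.concatenate(data[k])) for k in true}
        total = sum(abs(coeff[k]) * sig[k] for k in true)
        weights = {k: abs(coeff[k]) * sig[k] / total for k in true}
    for k in true:
        data[k].append(sample_term(true[k], max(1, int(1000 * weights[k]))))

# Pool every sample from every round into one estimate.
energy = sum(coeff[k] * np.concatenate(data[k]).mean() for k in true)
print(energy)  # close to 1.0
```

The design choice this caricatures is the one stated in the abstract: calibration data are not thrown away, so the final estimate uses the full sample history.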