Description
In this work we relate the physics of time to information theory via a simple question: how many bits of information do we gain when we read off the value of a clock?
Our motivation is to understand, from an operational point of view, how much information clocks provide about time. Doing so would allow us to connect the performance of clocks with basic physical quantities such as size, energy, and thermodynamic resources. It would also help establish general results characterizing the cost of communicating timing information across clock networks.
We present a measure of information based on a relative entropy between real clocks and "zero-information" clocks, and demonstrate that it unifies existing quantifiers of accuracy.
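To make the flavor of such a quantity concrete, here is a minimal Python sketch of a relative-entropy comparison, assuming a toy model in which a clock reading is a discrete random variable and the "zero-information" clock is uniform over all possible readings. The function name clock_information_bits and this specific construction are illustrative assumptions, not the definition used in the work itself.

    import numpy as np

    def clock_information_bits(p):
        """Relative entropy D(p || u), in bits, between a clock's reading
        distribution p and the uniform ("zero-information") distribution u.
        For uniform u over N outcomes, D(p || u) = log2(N) - H(p)."""
        p = np.asarray(p, dtype=float)
        p = p / p.sum()                    # normalize to a probability vector
        u = np.full_like(p, 1.0 / p.size)  # uniform zero-information clock
        mask = p > 0                       # use the 0 * log 0 = 0 convention
        return float(np.sum(p[mask] * np.log2(p[mask] / u[mask])))

    N = 64  # number of distinguishable clock readings

    # A sharp clock: the reading pinpoints the time, gaining log2(N) bits.
    sharp = np.zeros(N)
    sharp[17] = 1.0
    print(clock_information_bits(sharp))           # 6.0 bits

    # A useless clock: the reading is uniform, gaining 0 bits.
    print(clock_information_bits(np.ones(N) / N))  # 0.0 bits

In this toy picture, reading a perfectly sharp clock with N equally likely settings gains log2(N) bits, while a clock whose reading is independent of the time gains none; the measure presented in the talk interpolates between these extremes for realistic clocks.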