Speaker
Description
Most commercial wearables still capture only basic metrics such as step counts or heart rate, and remain closed systems that offer no access to raw data. In this talk, I will present our holistic approach to full-body biosignal intelligence, in which ultra-low-power embedded platforms and machine learning algorithms are co-designed to capture and process signals from the brain, eyes, muscles, and cardiovascular system in real time while sustaining day-long battery life. I will show how open, modular platforms can be adapted to diverse wearable form factors, and how tailored ML algorithms make these signals usable for applications such as seizure detection and eye-movement classification. Finally, I will discuss how this vision extends to emerging modalities such as wearable ultrasound, the next leap in multimodal, ML-enabled wearables.