This project's vision is to unify sensors and signals from diverse domains, such as physical activity, health, and weather, into a common language that can be processed by today's foundation models.
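To make the "common language" idea concrete, here is a minimal sketch of one plausible approach: summarizing a raw sensor window into compact text that a text-only foundation model can consume. The sensor name, sampling rate, statistics, and prompt wording below are all illustrative assumptions, not the project's actual encoding.

```python
import json
from statistics import mean, stdev

# Hypothetical 2-second window from a 3-axis wrist accelerometer
# sampled at 50 Hz (units: g), with a small periodic spike on z.
window = [
    {"x": 0.02, "y": -0.01, "z": 0.98 + (0.3 if i % 25 == 0 else 0.0)}
    for i in range(100)
]

def summarize_axis(samples):
    """Reduce one axis to a few descriptive statistics."""
    return {
        "mean": round(mean(samples), 3),
        "std": round(stdev(samples), 3),
        "min": round(min(samples), 3),
        "max": round(max(samples), 3),
    }

def window_to_text(window):
    """Serialize a sensor window into a compact JSON 'sentence'
    that a text-only language model can read directly."""
    stats = {axis: summarize_axis([s[axis] for s in window])
             for axis in ("x", "y", "z")}
    return json.dumps({
        "sensor": "wrist_accelerometer",
        "sample_rate_hz": 50,
        "duration_s": len(window) / 50,
        "stats_g": stats,
    })

# The serialized window can then be embedded in an ordinary prompt.
prompt = (
    "You are a fitness coach. Given this movement summary, "
    "describe the user's activity and offer one form tip.\n"
    + window_to_text(window)
)
print(prompt)
```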
Our first proof of concept was SensorFit AI, a physical activity assistant that uses real-time movement data from a smartwatch to deliver personalized coaching. Its memory, planning, and reasoning features let it assess your skill level, point out mistakes in your form, and suggest ways to improve.
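As a rough illustration of how memory and planning might feed such an assistant's reasoning step, the hypothetical sketch below keeps a rolling session history and folds it into each prompt. `SessionMemory` and `plan_feedback` are invented names for this sketch; the real assistant's design may differ entirely.

```python
from dataclasses import dataclass, field

@dataclass
class SessionMemory:
    """Rolling record of past observations, so new advice
    can reference the user's history (the 'memory' feature)."""
    events: list = field(default_factory=list)

    def remember(self, note: str) -> None:
        self.events.append(note)

    def recall(self, n: int = 3) -> list:
        return self.events[-n:]

def plan_feedback(summary: str, memory: SessionMemory) -> str:
    """Hypothetical planning step: combine the latest sensor summary
    with recalled history into one prompt for the reasoning model."""
    history = "; ".join(memory.recall()) or "no prior sessions"
    return (
        f"History: {history}\n"
        f"Latest window: {summary}\n"
        "Assess skill level, flag form mistakes, suggest one improvement."
    )

memory = SessionMemory()
memory.remember("2024-05-01: squat depth inconsistent")
prompt = plan_feedback('{"activity": "squat", "reps": 8}', memory)
print(prompt)  # this prompt would be sent to the foundation model
```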
We are currently developing LLMs that can process, reason about, and explain sensor data across diverse domains and tasks, with a particular focus on ECG and EEG signals in the medical domain.