Leveraging Periodicity for Robustness with Multi-modal Mood Pattern Models
Authors: Jaya Narain*, Jenny Sun*, Oussama Elachqar, Haraldur Hallgrimsson, Feng Zhu, Shirley Ren
*Equal Contributors
Data from wearable sensors (e.g., heart rate, step count) can be used to model mood patterns. We characterize feature representations and modeling strategies for mood pattern classification with multi-modal discrete time series data, using a large dataset with naturalistic missingness (n=116,819 participants) spanning 12 wearable data streams, with a focus on capturing periodic trends in the data. Considering both performance and robustness, periodicity-based aggregate feature representations paired with gradient boosting models outperformed the other representations and architectures studied. Periodic features improved model performance over temporal statistics, and gradient boosting models were more robust to missingness, and to shifts in the missingness distribution, than a deep learning time series model.
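A minimal sketch of the idea of periodicity-based aggregate features: summarize each stream by its spectral power at the circadian (24-hour) period and feed those summaries to a gradient boosting classifier. The feature choices here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def periodic_features(x, period=24):
    """Summarize one discrete time series by simple periodic structure."""
    x = np.asarray(x, dtype=float)
    # Fill missing samples (NaN) with the series mean -- naturalistic
    # missingness is common in wearable data.
    x = np.where(np.isnan(x), np.nanmean(x), x)
    spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0)  # samples assumed hourly
    daily = spectrum[np.argmin(np.abs(freqs - 1.0 / period))]
    # [mean, std, fraction of power at the 24 h cycle]
    return np.array([x.mean(), x.std(), daily / (spectrum.sum() + 1e-9)])

rng = np.random.default_rng(0)
t = np.arange(24 * 14)  # two weeks of hourly samples
y = rng.integers(0, 2, size=200)
# Class 1 has a strong circadian rhythm; class 0 is mostly noise.
X = np.stack([
    periodic_features(label * np.sin(2 * np.pi * t / 24)
                      + rng.normal(0, 1, t.size))
    for label in y
])
clf = GradientBoostingClassifier(random_state=0).fit(X, y)
```

In a multi-stream setting one would concatenate such feature vectors across the 12 streams; missing streams reduce to constant or NaN-derived features, which tree ensembles tolerate more gracefully than many deep architectures.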
Speech Foundation Models Generalize to Time Series Tasks from Wearable Sensor Data
November 20, 2025 · Research areas: Health; Methods and Algorithms · Workshop at NeurIPS
This paper was accepted at the Learning from Time Series for Health workshop at NeurIPS 2025.
Both speech and wearable sensor time series encode information in the time and frequency domains, such as spectral powers and waveform shapelets. We show that speech foundation models learn representations that generalize beyond the speech domain and achieve state-of-the-art performance on diverse time series tasks from wearable sensors. Probes trained…
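The probing recipe described above can be sketched as a linear classifier over frozen encoder embeddings. Here `embed()` stands in for a pretrained speech encoder (replaced by a fixed, untrained random projection purely for illustration), and the task, data, and shapes are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
W = rng.normal(size=(256, 64))  # frozen stand-in "encoder" weights

def embed(signal):
    """Mean-pool a fixed (untrained) projection of 256-sample frames."""
    frames = signal.reshape(-1, 256)
    return np.tanh(frames @ W).mean(axis=0)

# Toy wearable-like task: classify series by their dominant frequency.
t = np.arange(1024)
y = rng.integers(0, 2, size=300)
X = np.stack([
    embed(np.sin(2 * np.pi * (3 + 5 * label) * t / 1024)
          + 0.3 * rng.normal(size=t.size))
    for label in y
])
# The probe is the only trained component; the encoder stays frozen.
probe = LogisticRegression(max_iter=1000).fit(X, y)
```

The key design point is that only the lightweight probe is fit to the downstream task, so performance reflects what the frozen representations already encode.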
Generalizable Autoregressive Modeling of Time Series Through Functional Narratives
October 15, 2024 · Research areas: Methods and Algorithms; Tools, Platforms, Frameworks
Time series data are inherently functions of time, yet current transformers often learn time series by modeling them as mere concatenations of time periods, overlooking their functional properties. In this work, we propose a novel objective for transformers that learns time series by re-interpreting them as temporal functions. We build an alternative sequence of time series by constructing degradation operators of different intensities in the…
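One way to picture the degradation operators mentioned above: apply smoothing of increasing intensity to a series to obtain a coarse-to-fine "alternative sequence" of views. The moving-average operator here is an illustrative assumption, not necessarily the one used in the paper.

```python
import numpy as np

def degrade(x, intensity):
    """Moving-average smoothing; larger intensity gives a coarser view."""
    k = 2 * intensity + 1
    kernel = np.ones(k) / k
    return np.convolve(x, kernel, mode="same")

t = np.linspace(0, 4 * np.pi, 256)
x = np.sin(t) + 0.2 * np.sin(8 * t)  # slow trend plus fast detail
# Views ordered coarse-to-fine; intensity 0 recovers the original series.
views = [degrade(x, s) for s in (32, 16, 8, 4, 0)]
```

An autoregressive model over such a sequence predicts progressively less-degraded views, so it learns the series as a function refined across scales rather than as a flat concatenation of time steps.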