
The growing demand for personalized and private on-device applications highlights the importance of source-free unsupervised domain adaptation (SFDA) methods, especially for time-series data, where individual differences produce large domain shifts. As sensor-embedded mobile devices become ubiquitous, optimizing SFDA methods for parameter utilization and data-sample efficiency in time-series contexts becomes crucial. Personalization in time series is necessary to accommodate the unique patterns and behaviors of individual users, enhancing the relevance and accuracy of the predictions. In this work, we introduce a novel paradigm for source-model preparation and target-side adaptation aimed at improving both parameter and sample efficiency during the target-side adaptation process. Our approach re-parameterizes source-model weights with Tucker-style decomposed factors during the source-model preparation phase. Then, at the time of target-side adaptation, only a subset of these decomposed factors is fine-tuned. This strategy not only enhances parameter efficiency, but also implicitly regularizes the adaptation process by constraining the model's capacity, which is essential for personalization in diverse and dynamic time-series environments. Moreover, the proposed strategy achieves overall model compression and improves inference efficiency, making it highly suitable for resource-constrained devices. Extensive experiments on various time-series SFDA benchmark datasets demonstrate the effectiveness and efficiency of our approach, underscoring its potential for advancing personalized on-device time-series applications.
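To make the core mechanism concrete, here is a minimal sketch of the idea, not the paper's implementation: a Conv1d layer whose (C_out, C_in, K) kernel is stored as Tucker-style factors (a small core tensor plus two factor matrices), with target-side adaptation fine-tuning only the cores. The class name `TuckerConv1d`, the ranks `r_in`/`r_out`, and the choice to update only the core tensors are illustrative assumptions, not details confirmed by the abstract.

```python
# Sketch only: names, ranks, and the fine-tuned subset are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TuckerConv1d(nn.Module):
    """Conv1d whose kernel is stored in Tucker-style decomposed form:
    W[o, i, k] = sum_{a, b} U_out[o, a] * U_in[i, b] * core[a, b, k]."""

    def __init__(self, c_in, c_out, kernel_size, r_in, r_out, padding=0):
        super().__init__()
        self.padding = padding
        # Source-model preparation re-parameterizes the weight into factors.
        self.core = nn.Parameter(0.02 * torch.randn(r_out, r_in, kernel_size))
        self.u_out = nn.Parameter(0.02 * torch.randn(c_out, r_out))
        self.u_in = nn.Parameter(0.02 * torch.randn(c_in, r_in))

    def forward(self, x):
        # Reconstruct the full (C_out, C_in, K) kernel from the factors.
        w = torch.einsum("oa,ib,abk->oik", self.u_out, self.u_in, self.core)
        return F.conv1d(x, w, padding=self.padding)


def freeze_for_target_adaptation(model: nn.Module):
    # Target-side adaptation: update only the small core tensors; the factor
    # matrices learned during source preparation stay frozen, which caps the
    # adapted capacity and keeps the trainable footprint small.
    for name, p in model.named_parameters():
        p.requires_grad = name.endswith("core")


# Usage: a tiny source encoder, then target-side fine-tuning of cores only.
encoder = nn.Sequential(
    TuckerConv1d(c_in=9, c_out=64, kernel_size=8, r_in=4, r_out=16, padding=4),
    nn.ReLU(),
    TuckerConv1d(c_in=64, c_out=128, kernel_size=8, r_in=16, r_out=32, padding=4),
)
freeze_for_target_adaptation(encoder)
trainable = sum(p.numel() for p in encoder.parameters() if p.requires_grad)
total = sum(p.numel() for p in encoder.parameters())
print(f"trainable {trainable} / total {total} parameters")
```

In this toy setup only the cores receive gradients, so the trainable parameter count at adaptation time is a small fraction of the total, which mirrors the parameter-efficiency and implicit-regularization argument in the abstract; the specific split of frozen versus tuned factors in the actual method may differ.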

Related readings and updates.

Towards Time-Series Reasoning with LLMs

Multi-modal large language models (MLLMs) have enabled numerous advances in understanding and reasoning in domains like vision, but we have not yet seen this broad success for time-series. Although prior works on time-series MLLMs have shown promising performance in time-series forecasting, very few works show how an LLM could be used for time-series reasoning in natural language. We propose a novel multi-modal time-series LLM approach that…

Generalizable Autoregressive Modeling of Time Series Through Functional Narratives

Time series data are inherently functions of time, yet current transformers often learn time series by modeling them as mere concatenations of time periods, overlooking their functional properties. In this work, we propose a novel objective for transformers that learn time series by re-interpreting them as temporal functions. We build an alternative sequence of time series by constructing degradation operators of different intensity in the…