
Rethinking Time Series Forecasting with LLMs via Nearest Neighbor Contrastive Learning

Authors: Bogahawatte, Jayanie; Seneviratne, Sachith; Perera, Maneesha; Halgamuge, Saman

Affiliation: AI Optimization and Pattern Recognition Research Group, Dept. of Mechanical Eng., University of Melbourne, Australia

Publication: arXiv

Year/Volume/Issue: 2024


Subject: Contrastive Learning

Abstract: Adapting Large Language Models (LLMs) that are extensively trained on abundant text data, and customizing the input prompt to enable time series forecasting, has received considerable attention. While recent work has shown great potential for adapting the learned prior of LLMs, formulating the prompt to fine-tune LLMs remains challenging, as the prompt should be aligned with the time series data. Additionally, current approaches do not effectively leverage word token embeddings, which embody the rich representation space learned by LLMs. This emphasizes the need for a robust approach to formulating the prompt that utilizes the word token embeddings while effectively representing the characteristics of the time series. To address these challenges, we propose NNCL-TLLM: Nearest Neighbor Contrastive Learning for Time series forecasting via LLMs. First, we generate time series compatible text prototypes such that each text prototype represents both the word token embeddings in its neighborhood and the characteristics of the time series via end-to-end fine-tuning. Next, we draw inspiration from Nearest Neighbor Contrastive Learning to formulate the prompt by retrieving the top-k nearest neighbor time series compatible text prototypes. We then fine-tune the layer normalization and positional embeddings of the LLM, keeping the other layers intact, which reduces the number of trainable parameters and the computational cost. Our comprehensive experiments demonstrate that NNCL-TLLM outperforms state-of-the-art methods in few-shot forecasting and achieves competitive or superior performance in long-term and short-term forecasting tasks. © 2024, CC BY.
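
The abstract names two concrete mechanisms: forming the prompt from the top-k nearest neighbor time series compatible text prototypes, and fine-tuning only the LLM's layer normalization and positional embeddings. The following is a minimal PyTorch sketch of those two steps under stated assumptions; the class and function names, the cosine-similarity retrieval, and the GPT-2-style parameter names are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F
from torch import nn

class TextPrototypeBank(nn.Module):
    """Learnable time-series-compatible text prototypes, assumed to live in the
    LLM's word-token embedding space (hypothetical helper, not from the paper)."""
    def __init__(self, num_prototypes: int, embed_dim: int):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(num_prototypes, embed_dim))

    def top_k_neighbors(self, ts_embedding: torch.Tensor, k: int) -> torch.Tensor:
        # ts_embedding: (batch, embed_dim) embedding of the input time series.
        sim = F.cosine_similarity(
            ts_embedding.unsqueeze(1),            # (batch, 1, embed_dim)
            self.prototypes.unsqueeze(0),         # (1, num_prototypes, embed_dim)
            dim=-1,
        )                                         # (batch, num_prototypes)
        idx = sim.topk(k, dim=-1).indices         # (batch, k)
        # Selected prototypes are used as prompt tokens prepended to the LLM input.
        return self.prototypes[idx]               # (batch, k, embed_dim)

def freeze_all_but_layernorm_and_positions(llm: nn.Module) -> None:
    """Keep only layer normalization and positional embeddings trainable,
    mirroring the parameter-efficient fine-tuning described in the abstract."""
    for name, param in llm.named_parameters():
        lname = name.lower()
        param.requires_grad = (
            "layernorm" in lname
            or "ln" in lname                      # e.g. GPT-2 'ln_1', 'ln_f'
            or "wpe" in lname                     # GPT-2 positional embeddings
            or "position_embeddings" in lname
        )
```

A usage sketch: embed the input window, retrieve the k nearest prototypes as prompt tokens, concatenate them with the time series token embeddings, and feed the result through the mostly frozen LLM; only the layer norm, positional embedding, and prototype parameters receive gradients.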
