Temporal distribution shifts are pervasive in real-world deployments of Large Language Models (LLMs), where data evolves continuously over time. Temporal Domain Generalization (TDG) seeks to model such structured evolution, but existing approaches characterize model adaptation in the full parameter space, a formulation that is computationally infeasible for modern LLMs. This paper introduces a geometric reformulation of TDG under parameter-efficient fine-tuning. We establish that the low-dimensional temporal structure underlying model evolution is preserved under parameter-efficient reparameterization, enabling temporal modeling without operating in the ambient parameter space. Building on this principle, we propose Manifold-aware Temporal LoRA (MaT-LoRA), which constrains temporal updates to a shared low-dimensional manifold within a low-rank adaptation subspace and models evolution along this manifold through a structured temporal core. This reparameterization dramatically reduces the complexity of temporal modeling while retaining expressive power. Extensive experiments on synthetic and real-world datasets, spanning scientific documents, news publishers, and review ratings, demonstrate that MaT-LoRA achieves superior temporal generalization with practical scalability for LLMs.
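Since the abstract does not spell out the parameterization, the following is a minimal sketch of one plausible reading: shared LoRA factors B (d_out x r) and A (r x d_in) fix a low-rank subspace, and a small time-conditioned core C(t) (r x r) models evolution within it, giving an update of the form Delta_W(t) = B C(t) A. All class names, shapes, and the MLP core generator below are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of a MaT-LoRA-style layer: the pretrained weight is
# frozen, the LoRA factors A and B are shared across time, and only a small
# r x r core is conditioned on a time embedding. The "structured" core in
# the paper may differ; a plain linear map is assumed here for illustration.
import torch
import torch.nn as nn

class MaTLoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, time_dim: int = 16):
        super().__init__()
        self.base = base                # frozen pretrained projection
        for p in self.base.parameters():
            p.requires_grad_(False)
        d_out, d_in = base.weight.shape
        self.A = nn.Parameter(torch.randn(rank, d_in) * 0.01)  # shared factor
        self.B = nn.Parameter(torch.zeros(d_out, rank))        # shared factor
        # Assumed core generator: maps a time embedding to an r x r core.
        self.core = nn.Linear(time_dim, rank * rank)
        self.rank = rank

    def forward(self, x: torch.Tensor, t_emb: torch.Tensor) -> torch.Tensor:
        # C(t): rank x rank temporal core conditioned on the time embedding
        C = self.core(t_emb).view(self.rank, self.rank)
        delta = self.B @ C @ self.A     # Delta_W(t) = B C(t) A, rank <= r
        return self.base(x) + x @ delta.T

# Usage: layer = MaTLoRALinear(nn.Linear(768, 768))
#        y = layer(torch.randn(4, 768), torch.randn(16))
```

Under this reading, the complexity reduction claimed in the abstract is concrete: temporal variation is captured by an r x r core per time step rather than a full d_out x d_in update, while the frozen subspace spanned by B and A keeps the adaptation expressive.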