Large language models have limited context capacity, hindering reasoning over long conversations. We propose the Hierarchical Aggregate Tree (HAT) memory structure to recursively aggregate relevant dialogue context through conditional tree traversals. HAT encapsulates information from child nodes, enabling broad coverage with depth control. We formulate finding the best context as an optimal tree traversal. Experiments show HAT improves dialogue coherence and summary quality over baseline contexts, demonstrating the technique's effectiveness for multi-turn reasoning without exponential parameter growth. This memory augmentation enables more consistent, grounded long-form conversations from LLMs.
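The core ideas in the abstract — bottom-up aggregation of child-node information and a conditional traversal that selects context under a depth cap — can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the `summarize` function (here, naive concatenation standing in for an LLM summarizer), the keyword-based `relevant` predicate, and the fixed `fanout` are all assumptions introduced for the example.

```python
# Minimal sketch of a Hierarchical Aggregate Tree for dialogue memory.
# Assumptions (not from the paper): leaves store dialogue turns; a parent's
# aggregate is produced by a pluggable summarize() function; context
# retrieval is a conditional traversal that expands only relevant nodes.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class HATNode:
    aggregate: str                      # summary of everything below this node
    children: List["HATNode"] = field(default_factory=list)

def build_hat(turns: List[str], fanout: int,
              summarize: Callable[[List[str]], str]) -> HATNode:
    """Bottom-up construction: group nodes into `fanout`-sized batches and
    aggregate each batch into a parent, until a single root remains."""
    level = [HATNode(aggregate=t) for t in turns]
    while len(level) > 1:
        parents = []
        for i in range(0, len(level), fanout):
            batch = level[i:i + fanout]
            parents.append(HATNode(
                aggregate=summarize([n.aggregate for n in batch]),
                children=batch))
        level = parents
    return level[0]

def retrieve_context(root: HATNode,
                     relevant: Callable[[str], bool],
                     max_depth: int) -> List[str]:
    """Conditional traversal: descend into a child only if its aggregate
    looks relevant; otherwise keep the compact aggregate. The depth cap
    keeps coverage broad while the returned context stays short."""
    if max_depth == 0 or not root.children:
        return [root.aggregate]
    out: List[str] = []
    for child in root.children:
        if relevant(child.aggregate):
            out.extend(retrieve_context(child, relevant, max_depth - 1))
        else:
            out.append(child.aggregate)
    return out

# Toy usage with a naive keyword-based relevance test.
turns = ["we talked about cats", "then about dogs",
         "budget planning for Q3", "Q3 hiring goals"]
root = build_hat(turns, fanout=2, summarize=lambda xs: " | ".join(xs))
ctx = retrieve_context(root, relevant=lambda s: "Q3" in s, max_depth=2)
# The irrelevant branch stays collapsed to its aggregate; the relevant
# branch is expanded down to its leaves.
```

In a real system, `summarize` would call an LLM to compress the batch and `relevant` would score aggregates against the current query, so the traversal trades depth (detail) for breadth (coverage) per branch.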