The next Point of Interest (POI) recommendation task aims to predict a user's immediate next POI visit given their historical data. Location-Based Social Network (LBSN) data, which is commonly used for this task, poses several challenges. One frequently overlooked challenge is how to effectively use the abundant contextual information present in LBSN data. Previous methods are limited by their numerical representations of this data and fail to address the challenge. In this paper, we propose a framework that uses pretrained Large Language Models (LLMs) to tackle it. Our framework preserves heterogeneous LBSN data in its original format, thereby avoiding the loss of contextual information. Furthermore, the framework can comprehend the inherent meaning of contextual information thanks to the commonsense knowledge embedded in the LLM. In experiments, we evaluate our framework on three real-world LBSN datasets. The results show that the proposed framework outperforms state-of-the-art models on all three datasets. Our analysis demonstrates the effectiveness of the proposed framework in exploiting contextual information as well as in alleviating the commonly encountered cold-start and short-trajectory problems.