We introduce rLLM (relationLLM), a PyTorch library designed for Relational Table Learning (RTL) with Large Language Models (LLMs). The core idea is to decompose state-of-the-art Graph Neural Networks, LLMs, and Table Neural Networks into standardized modules, enabling the fast construction of novel RTL-type models in a simple "combine, align, and co-train" manner. To illustrate the usage of rLLM, we present a simple RTL method named \textbf{BRIDGE}. Additionally, we introduce three novel relational tabular datasets (TML1M, TLF2K, and TACM12K) by enhancing classic datasets. We hope rLLM can serve as a useful and easy-to-use development framework for RTL-related tasks. Our code is available at: https://github.com/rllm-project/rllm.
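The "combine, align, and co-train" idea can be sketched in plain PyTorch: a table encoder and a graph encoder are treated as standardized modules, their outputs are aligned in a shared embedding space, and the whole pipeline is trained end-to-end under one loss. This is a minimal illustrative sketch only; the class names (TableEncoder, GraphEncoder, BridgeModel) are hypothetical and do not reflect the actual rLLM API.

```python
import torch
import torch.nn as nn

class TableEncoder(nn.Module):
    """Standardized module: embeds raw table rows (hypothetical name)."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())

    def forward(self, x):
        return self.mlp(x)

class GraphEncoder(nn.Module):
    """Standardized module: one round of mean neighbor aggregation,
    a minimal stand-in for a GNN layer (hypothetical name)."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.lin = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h, adj):
        # adj: row-normalized adjacency matrix (N x N) over table rows
        return torch.relu(self.lin(adj @ h))

class BridgeModel(nn.Module):
    """Combine: plug the two encoders together.
    Align: both operate in the same hidden_dim embedding space.
    Co-train: a single head and loss drive end-to-end training."""
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.table_enc = TableEncoder(in_dim, hidden_dim)
        self.graph_enc = GraphEncoder(hidden_dim)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, adj):
        h = self.table_enc(x)        # embed table features
        h = self.graph_enc(h, adj)   # propagate over the relation graph
        return self.head(h)

model = BridgeModel(in_dim=8, hidden_dim=16, num_classes=3)
x = torch.randn(5, 8)    # 5 table rows with 8 features each
adj = torch.eye(5)       # trivial relation graph (self-loops only)
logits = model(x, adj)
print(logits.shape)  # torch.Size([5, 3])
```

Because every component is an `nn.Module` with a fixed interface, any encoder can be swapped for another (e.g. an LLM-based text encoder producing the same `hidden_dim`) without touching the rest of the pipeline.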