Pretrained transformers readily adapt to new sequence modeling tasks via zero-shot prompting, but relational domains still lack architectures that transfer across datasets and tasks. The core challenge is the diversity of relational data, spanning heterogeneous schemas, graph structures, and functional dependencies. In this paper, we present the Relational Transformer (RT) architecture, which can be pretrained on diverse relational databases and directly applied to unseen datasets and tasks without task- or dataset-specific fine-tuning or retrieval of in-context examples. RT (i) incorporates task specification via task table prompting, (ii) tokenizes cells with table/column metadata, (iii) is pretrained via masked token prediction, and (iv) utilizes a novel Relational Attention mechanism over columns, rows, and primary-foreign key links. Pretrained on RelBench datasets spanning tasks such as churn and sales forecasting, RT attains strong zero-shot performance, averaging 93% of fully supervised AUROC on binary classification tasks with a single forward pass of a 22M parameter model, as opposed to 84% for a 27B LLM. Fine-tuning yields state-of-the-art results with high sample efficiency. Our experimental analyses show that RT's zero-shot transfer leverages task context, relational attention patterns, and schema semantics. Overall, RT provides a practical path toward foundation models for relational data. Code, models, data: https://github.com/snap-stanford/relational-transformer.
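To make the attention structure in (iv) concrete, the following is a minimal sketch of how a mask restricting attention to same-row, same-column, and primary-foreign-key-linked cells could be constructed. The function name, the per-cell `rows`/`cols` encoding, and the symmetric treatment of PK-FK links are all illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def relational_attention_mask(rows, cols, links):
    """Build a boolean mask M where M[i, j] = True iff cell j is
    visible to cell i under a relational attention pattern.

    rows, cols: per-cell row and column ids (hypothetical encoding).
    links: iterable of (row_a, row_b) primary-foreign key pairs;
    treated as symmetric here, which is an assumption.
    """
    linked = set()
    for a, b in links:
        linked.add((a, b))
        linked.add((b, a))
    n = len(rows)
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            same_row = rows[i] == rows[j]
            same_col = cols[i] == cols[j]
            fk_linked = (rows[i], rows[j]) in linked
            mask[i, j] = same_row or same_col or fk_linked
    return mask
```

Such a mask would then be supplied to a standard attention layer so each cell token aggregates information only along schema-aligned paths; dense attention over all cells is recovered by making the mask all-True.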