Detecting payment fraud in real-world banking streams requires models that can exploit both the order of events and the irregular time gaps between them. We introduce FraudTransformer, a sequence model that augments a vanilla GPT-style architecture with (i) a dedicated time encoder that embeds either absolute timestamps or inter-event gaps, and (ii) a learned positional encoder that preserves relative order. Experiments on a large industrial dataset of tens of millions of transactions and auxiliary events show that FraudTransformer surpasses strong classical baselines (Logistic Regression, XGBoost, and LightGBM) as well as transformer ablations that omit either the time or the positional component. On the held-out test set it delivers the highest AUROC and PRAUC.
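To make the two encoders concrete, here is a minimal sketch of how a time embedding and a learned positional embedding can be combined with token embeddings before the GPT-style blocks. The class name TimeAwareEmbedding, the log-scale bucketization of time values, and the additive combination are illustrative assumptions, not details confirmed by the paper.

```python
import torch
import torch.nn as nn

class TimeAwareEmbedding(nn.Module):
    """Token + learned positional + time embeddings (illustrative sketch).

    time_mode="delta" embeds inter-event gaps; time_mode="absolute"
    embeds raw timestamps. The log-scale bucketization below is an
    assumption chosen for simplicity, not the paper's scheme.
    """

    def __init__(self, vocab_size, d_model, max_len=512,
                 n_time_buckets=128, time_mode="delta"):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)       # learned positions
        self.time_emb = nn.Embedding(n_time_buckets, d_model)
        self.n_time_buckets = n_time_buckets
        self.time_mode = time_mode

    def bucketize(self, t):
        # Log-scale buckets over non-negative second-level values.
        b = torch.log1p(t.clamp(min=0)).long()
        return b.clamp(max=self.n_time_buckets - 1)

    def forward(self, tokens, timestamps):
        # tokens: (batch, seq) event ids; timestamps: (batch, seq) seconds.
        if self.time_mode == "delta":
            t = timestamps - torch.roll(timestamps, 1, dims=1)
            t[:, 0] = 0  # no gap before the first event in a sequence
        else:
            t = timestamps
        pos = torch.arange(tokens.size(1), device=tokens.device)
        # Sum the three embeddings; the result feeds the transformer blocks.
        return (self.token_emb(tokens)
                + self.pos_emb(pos)
                + self.time_emb(self.bucketize(t)))
```

The additive design keeps the rest of the GPT stack unchanged: swapping the standard embedding layer for this module is all that distinguishes the full model from the ablations that drop the time or positional term.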