This work introduces TRON, a scalable session-based Transformer Recommender using Optimized Negative-sampling. Motivated by the scalability and performance limitations of prevailing models such as SASRec and GRU4Rec+, TRON integrates top-k negative sampling and listwise loss functions to enhance recommendation accuracy. Evaluations on relevant large-scale e-commerce datasets show that TRON improves upon the recommendation quality of current methods while maintaining training speeds similar to SASRec. A live A/B test yielded an 18.14% increase in click-through rate over SASRec, highlighting the potential of TRON in practical settings. To support further research, we provide our source code at https://github.com/otto-de/TRON and an anonymized dataset at https://github.com/otto-de/recsys-dataset.
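To make the central idea concrete, the following is a minimal PyTorch sketch of top-k negative sampling combined with a listwise (sampled-softmax) loss. It assumes a uniform candidate pool scored against the session representation, with only the k highest-scoring ("hardest") negatives kept for the loss; the function name, pool size, and k are illustrative and do not reflect TRON's exact implementation or hyperparameters.

```python
import torch
import torch.nn.functional as F

def topk_negative_sampling_loss(session_emb, pos_item_emb, item_emb_table,
                                num_candidates=512, k=64):
    """Sketch of top-k negative sampling with a listwise loss.

    session_emb:    (B, D) encoder output for each session
    pos_item_emb:   (B, D) embedding of the next (positive) item
    item_emb_table: (V, D) full item embedding table
    """
    B, D = session_emb.shape
    V = item_emb_table.size(0)

    # 1. Draw a uniform pool of candidate negatives, shared across the batch.
    #    (A production variant would also mask accidental hits on the positive.)
    cand_ids = torch.randint(0, V, (num_candidates,), device=session_emb.device)
    cand_emb = item_emb_table[cand_ids]                              # (C, D)

    # 2. Score the candidates and keep the k highest-scoring negatives per session.
    cand_scores = session_emb @ cand_emb.T                           # (B, C)
    top_scores, _ = cand_scores.topk(k, dim=1)                       # (B, k)

    # 3. Listwise loss: softmax cross-entropy over the positive (index 0)
    #    and its top-k hardest negatives.
    pos_scores = (session_emb * pos_item_emb).sum(-1, keepdim=True)  # (B, 1)
    logits = torch.cat([pos_scores, top_scores], dim=1)              # (B, 1 + k)
    labels = torch.zeros(B, dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)
```

Filtering to the top-k candidates concentrates the loss on informative, hard negatives rather than easy random ones, which is what lets the listwise objective improve ranking quality without the cost of scoring the full catalog.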