This paper introduces a novel training methodology that enables a Transformer model trained on two-digit addition to generalize to operands with unseen digit lengths. The proposed approach employs an autoregressive generation scheme that proceeds from right to left, mimicking the common manual method of adding large numbers digit by digit. To the best of my knowledge, this methodology has not been previously explored in the literature. All results are reproducible, and the corresponding R code is available at github.com/AGPatriota/ALGA-R/.
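The abstract does not detail how targets are serialized, but one plausible way to realize right-to-left autoregressive generation is to emit the sum's digits least significant first, carrying exactly as in manual addition. The sketch below (in Python rather than the paper's R, with a hypothetical function name) illustrates this target format under that assumption:

```python
def to_reversed_digit_sequence(a: int, b: int) -> str:
    """Emit the digits of a + b least significant first,
    carrying as in manual column addition (illustration only;
    not necessarily the paper's exact serialization)."""
    da, db = str(a)[::-1], str(b)[::-1]  # reverse operand digits
    digits, carry = [], 0
    for i in range(max(len(da), len(db))):
        s = carry
        s += int(da[i]) if i < len(da) else 0
        s += int(db[i]) if i < len(db) else 0
        digits.append(str(s % 10))  # current output digit
        carry = s // 10             # propagate carry leftward
    if carry:
        digits.append(str(carry))
    return "".join(digits)

# Example: 57 + 86 = 143, emitted right to left as "341".
print(to_reversed_digit_sequence(57, 86))
```

Because each output digit then depends only on the two operand digits at the same position plus a carry, this ordering lets an autoregressive decoder apply the same local rule at every step, which is what would allow generalization beyond the digit lengths seen in training.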