The Transformer model, which first achieved significant success in natural language processing, has recently shown great potential in tactile perception. This review comprehensively outlines the application and development of Transformers in tactile technology. We first introduce the two fundamental concepts behind the Transformer's success: the self-attention mechanism and large-scale pre-training. We then examine the application of Transformers to various tactile tasks, including but not limited to object recognition, cross-modal generation, and object manipulation, offering a concise summary of the core methodologies, performance benchmarks, and design highlights. Finally, we suggest promising directions for future research, aiming to generate more interest within the community, tackle existing challenges, and encourage the use of Transformer models in the tactile field.
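As a reminder of the first of these concepts, the core of scaled dot-product self-attention can be sketched in a few lines of NumPy. This is an illustrative sketch only (the projection matrices, toy dimensions, and the framing of rows as tactile "frames" are our own assumptions, not from any reviewed work):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (n, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                  # project inputs to queries, keys, values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise similarities, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax over the sequence
    return weights @ V                                # attention-weighted sum of values

# Toy example: 4 hypothetical tactile frames, model dimension 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one context-aware vector per input frame
```

Because every output row is a weighted mixture of all input rows, each tactile frame's representation can draw on the entire sequence, which is what makes the mechanism attractive for temporally or spatially extended touch signals.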