For individuals who are blind or have low vision, tactile maps provide essential spatial information but are limited in the amount of data they can convey. Digitally augmented tactile maps extend these capabilities with audio feedback, combining the tactile feedback provided by the map with an audio description of the touched elements. In this context, we explore an embodied interaction paradigm that augments tactile maps with conversational interaction based on Large Language Models, enabling users to obtain answers to arbitrary questions about the map. We analyze the types of questions users are interested in asking, engineer the Large Language Model's prompt to provide reliable answers, and evaluate the resulting system in a study with 10 participants, assessing how users interact with the system, its usability, and the user experience.