We introduce iMotion-LLM, a large language model (LLM) integrated with trajectory prediction modules for interactive motion generation. Unlike conventional approaches, it generates feasible, safety-aligned trajectories from textual instructions, enabling adaptable and context-aware driving behavior. The framework combines an encoder-decoder multimodal trajectory prediction model with a pre-trained LLM fine-tuned using LoRA: scene features are projected into the LLM input space, and special tokens are mapped to a trajectory decoder, enabling text-based interaction and interpretable driving. To support this framework, we present two datasets: 1) InstructWaymo, an extension of the Waymo Open Motion Dataset with direction-based motion instructions, and 2) Open-Vocabulary InstructNuPlan, which provides safety-aligned instruction-caption pairs and corresponding safe trajectory scenarios. Our experiments confirm that conditioning on instructions yields trajectories that follow the given instruction. iMotion-LLM demonstrates strong contextual comprehension, achieving 84% average accuracy in direction feasibility detection and 96% average accuracy in safety evaluation of open-vocabulary instructions. This work lays a foundation for text-guided motion generation in autonomous driving, supporting simulated data generation, model interpretability, and robust safety-alignment testing for trajectory generation models. Our code, pre-trained model, and datasets are available at: https://vision-cair.github.io/iMotion-LLM/.
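The abstract describes an interface in which scene features are projected into the LLM input space and the hidden state of a special token is routed to a trajectory decoder. The minimal sketch below illustrates that interface only, under stated assumptions: the module names, dimensions, number of modes, prediction horizon, and the identity placeholder standing in for the LoRA-fine-tuned LLM are all illustrative, not the released implementation.

```python
# Minimal sketch (not the authors' implementation) of the interface described
# in the abstract: scene features -> LLM embedding space -> special-token
# hidden state -> multimodal trajectory decoder. Dimensions are assumptions.
import torch
import torch.nn as nn


class ScenePrefixProjector(nn.Module):
    """Projects encoder scene features into the LLM token-embedding space
    so they can be prepended to the embedded text instruction."""

    def __init__(self, scene_dim: int, llm_dim: int):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(scene_dim, llm_dim),
            nn.GELU(),
            nn.Linear(llm_dim, llm_dim),
        )

    def forward(self, scene_feats: torch.Tensor) -> torch.Tensor:
        # scene_feats: (batch, num_scene_tokens, scene_dim)
        return self.proj(scene_feats)  # (batch, num_scene_tokens, llm_dim)


class TrajectoryHead(nn.Module):
    """Maps the LLM hidden state at a special trajectory token to K candidate
    futures of T steps, each step an (x, y) offset."""

    def __init__(self, llm_dim: int, num_modes: int = 6, horizon: int = 80):
        super().__init__()
        self.num_modes, self.horizon = num_modes, horizon
        self.mlp = nn.Sequential(
            nn.Linear(llm_dim, llm_dim),
            nn.ReLU(),
            nn.Linear(llm_dim, num_modes * horizon * 2),
        )

    def forward(self, traj_token_state: torch.Tensor) -> torch.Tensor:
        # traj_token_state: (batch, llm_dim)
        out = self.mlp(traj_token_state)
        return out.view(-1, self.num_modes, self.horizon, 2)


if __name__ == "__main__":
    batch, scene_dim, llm_dim = 2, 256, 1024
    scene_feats = torch.randn(batch, 32, scene_dim)   # from the scene encoder
    text_embeds = torch.randn(batch, 16, llm_dim)     # embedded instruction tokens

    projector = ScenePrefixProjector(scene_dim, llm_dim)
    head = TrajectoryHead(llm_dim)

    # Concatenate projected scene tokens with instruction embeddings; a real
    # system would pass this sequence through the LoRA-fine-tuned LLM, which
    # we replace with an identity here purely for shape-checking.
    llm_inputs = torch.cat([projector(scene_feats), text_embeds], dim=1)
    llm_hidden = llm_inputs                           # placeholder for the LLM forward pass
    traj_token_state = llm_hidden[:, -1]              # hidden state at the special trajectory token
    trajectories = head(traj_token_state)
    print(trajectories.shape)                         # torch.Size([2, 6, 80, 2])
```

The key design point the sketch mirrors is that the trajectory decoder consumes an LLM hidden state rather than raw scene features, so the generated modes can be conditioned on the textual instruction.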