Accurate alignment of a mobile device equipped with inertial sensors and fixed inside a moving vehicle is important for navigation, activity recognition, and other applications. Accurate estimation of the device mounting angle is required to rotate the inertial measurements from the sensor frame to the moving platform frame, standardizing the measurements and improving the performance of the target task. In this work, a data-driven approach using deep neural networks (DNNs) is proposed to learn the yaw mounting angle of a smartphone equipped with an inertial measurement unit (IMU) and strapped to a car. The proposed model uses only the accelerometer and gyroscope readings from the IMU as input and, in contrast to existing solutions, does not require global position inputs from global navigation satellite systems (GNSS). To train the model in a supervised manner, IMU data is collected for training and validation with the sensor mounted at a known yaw mounting angle, and ground truth labels are generated by applying random rotations within a bounded range to the measurements. The trained model is tested on data with real rotations and shows performance similar to that obtained with synthetic rotations. The trained model is deployed on an Android device and evaluated in real time to assess the accuracy of the estimated yaw mounting angle. The model is shown to find the mounting angle to within 8 degrees in 5 seconds, and within 4 degrees in 27 seconds. An experiment is conducted to compare the proposed model with an existing off-the-shelf solution.
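The label-generation scheme described above (applying a random, bounded yaw rotation to IMU readings recorded at a known mounting angle, and using the applied angle as the supervision target) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name, the bound parameter, and the choice of a z-axis (vertical) rotation matrix are assumptions for the sketch.

```python
import numpy as np

def augment_with_random_yaw(acc, gyr, max_angle_deg=180.0, rng=None):
    """Hypothetical sketch of the augmentation described in the text:
    rotate accelerometer and gyroscope readings (each N x 3, sensor frame)
    about the vertical axis by a random yaw angle drawn from a bounded
    range, and return the rotated measurements with the angle as the
    ground-truth label for supervised training."""
    rng = np.random.default_rng() if rng is None else rng
    theta_deg = rng.uniform(-max_angle_deg, max_angle_deg)  # supervision label
    t = np.deg2rad(theta_deg)
    c, s = np.cos(t), np.sin(t)
    # Rotation about the z (vertical) axis; the yaw mounting angle only
    # affects the horizontal-plane components of the measurements.
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return acc @ R.T, gyr @ R.T, theta_deg
```

Because the rotation is orthonormal, it preserves measurement norms, so the augmented samples remain physically plausible IMU readings while the yaw label varies across the training set.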