In this paper, we propose Argus, a wearable add-on system built on stripped-down (i.e., compact, lightweight, low-power, limited-capability) mmWave radars. To the best of our knowledge, it is the first system to achieve egocentric human mesh reconstruction in a multi-view manner. Compared with conventional frontal-view mmWave sensing solutions, it addresses several pain points, including restricted sensing range, occlusion, and the multipath effect caused by surroundings. To overcome the limited capabilities of the stripped-down mmWave radars (each with only one transmit antenna and three receive antennas), we tackle three main challenges and propose a holistic solution comprising tailored hardware design, sophisticated signal processing, and a deep neural network optimized for high-dimensional, complex point clouds. Extensive evaluation shows that Argus achieves performance comparable to traditional solutions based on high-capability mmWave radars, with an average vertex error of 6.5 cm, using only stripped-down radars deployed in a multi-view configuration. It also demonstrates robustness and practicality across diverse conditions, including unseen users and different host devices.