This paper proposes small and efficient machine learning models (TinyML) for resource-constrained edge devices, specifically for on-device indoor localisation. Typical approaches to indoor localisation rely on centralised remote processing of data transmitted from low-powered devices such as wearables. However, there are several benefits to moving this processing to the edge device itself, including increased battery life, enhanced privacy, reduced latency and lowered operational costs, all of which are key for common applications such as health monitoring. This work focuses on model compression techniques, including quantization and knowledge distillation, to significantly reduce model size while maintaining high predictive performance. We base our work on a large state-of-the-art transformer-based model and seek to deploy it on low-power MCUs. We also propose a state-space-based architecture using Mamba as a more compact alternative to the transformer. Our results show that the quantized transformer model performs well within a 64 KB RAM constraint, achieving an effective balance between model size and localisation precision. Additionally, the compact Mamba model performs strongly under even tighter constraints, such as 32 KB of RAM, without the need for model compression, making it a viable option for more resource-limited environments. We demonstrate that, through our framework, it is feasible to deploy advanced indoor localisation models onto low-power MCUs with restricted memory. The application of these TinyML models in healthcare has the potential to revolutionise patient monitoring by providing accurate, real-time location data while minimising power consumption, increasing data privacy, improving latency and reducing infrastructure costs.