Eye-based emotion recognition enables eyewear devices to perceive users' emotional states and support emotion-aware interaction. However, deploying such functionality on the resource-limited embedded hardware of these devices remains challenging. Time-to-first-spike (TTFS)-coded spiking neural networks (SNNs), in which each neuron emits at most one binary spike, offer a promising solution thanks to their extremely sparse and energy-efficient computation. While prior work has focused primarily on improving TTFS SNN training algorithms, the role of network architecture has been largely overlooked. This gap is particularly critical because spike timing in TTFS SNNs is tightly coupled with architectural design, and eye-based emotion recognition demands compact yet highly efficient networks. In this paper, we propose TNAS-ER, the first neural architecture search (NAS) framework tailored to TTFS SNNs for eye-based emotion recognition. TNAS-ER introduces a novel ANN-assisted search strategy that leverages a ReLU-based ANN counterpart to guide architecture optimization and stabilize TTFS SNN training. TNAS-ER employs an evolutionary algorithm, with weighted and unweighted average recall jointly defined as fitness objectives for emotion recognition. Extensive experiments demonstrate that TNAS-ER achieves high recognition performance with significantly improved efficiency. We further evaluate TNAS-ER on neuromorphic hardware, confirming its superior energy efficiency and strong potential for real-world applications.
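The two fitness objectives named above can be made concrete with a short sketch; the function name `war_uar` and the toy label lists are illustrative, not from the paper. Weighted average recall (WAR) weights each class's recall by its support, which is equivalent to overall accuracy, while unweighted average recall (UAR) averages per-class recall uniformly and is therefore robust to the class imbalance typical of emotion datasets.

```python
from collections import Counter

def war_uar(y_true, y_pred, num_classes):
    """Return (WAR, UAR) for integer class labels in [0, num_classes)."""
    support = Counter(y_true)                                # samples per true class
    correct = Counter(t for t, p in zip(y_true, y_pred) if t == p)
    # Per-class recall, for classes actually present in y_true.
    recall = {c: correct[c] / support[c]
              for c in range(num_classes) if support[c] > 0}
    # WAR: recall weighted by class support (= overall accuracy).
    war = sum(recall[c] * support[c] for c in recall) / len(y_true)
    # UAR: unweighted mean of per-class recall.
    uar = sum(recall.values()) / len(recall)
    return war, uar

# Toy example: 3 emotion classes with imbalance (illustrative data only).
y_true = [0, 0, 0, 0, 1, 1, 2]
y_pred = [0, 0, 0, 1, 1, 0, 2]
war, uar = war_uar(y_true, y_pred, 3)
```

In the toy example, class 0 has recall 3/4, class 1 has 1/2, and class 2 has 1, so WAR is 5/7 while UAR is 0.75; an evolutionary search can combine the two so that architectures are not rewarded for neglecting minority emotion classes.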