Understanding and reconstructing the 3D world through omnidirectional perception is an inevitable trend in the development of autonomous agents and embodied intelligence. However, existing 3D occupancy prediction methods are constrained by limited perspective inputs and predefined training distributions, making them difficult to apply to embodied agents that require comprehensive and safe scene perception during open-world exploration. To address this, we present O3N, the first purely visual, end-to-end Omnidirectional Open-vocabulary Occupancy predictioN framework. O3N embeds omnidirectional voxels in a polar-spiral topology via the Polar-spiral Mamba (PsM) module, enabling continuous spatial representation and long-range context modeling across 360°. The Occupancy Cost Aggregation (OCA) module introduces a principled mechanism for unifying geometric and semantic supervision within the voxel space, ensuring consistency between the reconstructed geometry and the underlying semantic structure. Moreover, Natural Modality Alignment (NMA) establishes a gradient-free alignment pathway that harmonizes visual features, voxel embeddings, and text semantics, forming a consistent "pixel-voxel-text" representation triad. Extensive experiments across multiple models demonstrate that our method not only achieves state-of-the-art performance on the QuadOcc and Human360Occ benchmarks but also exhibits remarkable cross-scene generalization and semantic scalability, paving the way toward universal 3D world modeling. The source code will be made publicly available at https://github.com/MengfeiD/O3N.
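The abstract does not detail how the PsM module serializes voxels, but the core idea of embedding an omnidirectional grid in a polar-spiral topology can be illustrated with a minimal sketch. The function below (name and ring-offset scheme are hypothetical, not the paper's actual algorithm) flattens a polar voxel grid of `num_rings` radial rings and `num_angles` azimuth bins into a single 1D ordering, rotating each ring's angular start so the traversal spirals continuously across 360°, as a Mamba-style sequence model would require.

```python
import numpy as np

def polar_spiral_order(num_rings: int, num_angles: int) -> np.ndarray:
    """Hypothetical sketch: flatten a (num_rings, num_angles) polar grid
    into one 1D sequence by walking outward ring by ring, offsetting the
    angular start of each ring so consecutive rings join in a spiral."""
    order = []
    for r in range(num_rings):
        # rotate the starting azimuth per ring so the path spirals outward
        start = (r * num_angles // num_rings) % num_angles
        for k in range(num_angles):
            a = (start + k) % num_angles
            order.append(r * num_angles + a)  # flat index into the grid
    return np.array(order)

# Example: a 4-ring, 8-azimuth grid yields a permutation of all 32 voxels
seq = polar_spiral_order(4, 8)
```

A voxel feature tensor of shape `(num_rings * num_angles, C)` indexed by this permutation would then be consumed as an ordinary 1D token sequence; the spiral ordering keeps angularly adjacent voxels near each other in the sequence, which is what enables long-range 360° context modeling without a seam at the 0°/360° boundary.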