The injectivity of ReLU layers in neural networks, the recovery of vectors from clipped or saturated measurements, and (real) phase retrieval in $\mathbb{R}^n$ admit a common problem formulation and characterization in terms of frame theory. In this paper, we revisit all three problems from a unified perspective and derive lower Lipschitz bounds for ReLU layers and clipping that are analogous to the previously known result for phase retrieval and optimal up to a constant factor.
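The three problems share the same structure: a scalar nonlinearity applied entrywise to the frame coefficients $\langle x, a_i \rangle$. A minimal numerical sketch of the three measurement maps (the frame size $m = 8$, the random frame, and the clipping level $1$ are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Rows of A are the frame vectors a_1, ..., a_m in R^n (m >= n).
n, m = 3, 8
A = rng.standard_normal((m, n))
x = rng.standard_normal(n)

coeffs = A @ x  # frame coefficients <x, a_i>

# The three nonlinear measurement maps from the abstract,
# each acting entrywise on the frame coefficients:
relu_meas = np.maximum(coeffs, 0.0)     # ReLU layer (bias-free): negative part lost
clip_meas = np.clip(coeffs, -1.0, 1.0)  # clipping / saturation at level 1
phase_meas = np.abs(coeffs)             # real phase retrieval: signs lost
```

In each case the question is whether (and how stably) $x$ can be recovered from the nonlinear measurements, which is what the lower Lipschitz bounds quantify.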