In this work, we present and study Continuous Generative Neural Networks (CGNNs), namely, generative models in the continuous setting: the output of a CGNN belongs to an infinite-dimensional function space. The architecture is inspired by DCGAN, with one fully connected layer, several convolutional layers, and nonlinear activation functions. In the continuous $L^2$ setting, the dimensions of the spaces of each layer are replaced by the scales of a multiresolution analysis of a compactly supported wavelet. We present conditions on the convolutional filters and on the nonlinearity that guarantee that a CGNN is injective. This theory finds applications to inverse problems, and allows us to derive Lipschitz stability estimates for (possibly nonlinear) infinite-dimensional inverse problems with unknowns belonging to the manifold generated by a CGNN. Several numerical simulations, including signal deblurring, illustrate and validate this approach.
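To fix ideas, the discrete architecture that the abstract refers to can be sketched as follows. This is a minimal, hypothetical illustration of a DCGAN-style generator pipeline (one fully connected layer, then convolutional layers with nonlinearities), assuming nearest-neighbour upsampling followed by a 1-D convolution as a stand-in for a transposed convolution; all names, dimensions, and weights are invented for illustration. The paper's CGNN replaces these finite dimensions with the scales of a wavelet multiresolution analysis in $L^2$, which this sketch does not attempt to capture.

```python
import numpy as np

def relu(x):
    # nonlinear activation applied after each layer
    return np.maximum(x, 0.0)

def fully_connected(z, W, b):
    # single fully connected layer mapping the latent code upward
    return W @ z + b

def conv1d_upsample(x, kernel):
    # nearest-neighbour upsampling followed by a 1-D convolution,
    # mimicking a transposed-convolution (upsampling) layer
    up = np.repeat(x, 2)
    return np.convolve(up, kernel, mode="same")

def generator(z, W, b, kernels):
    # DCGAN-style pipeline: fully connected layer, then several
    # convolutional layers, each followed by a nonlinearity
    x = relu(fully_connected(z, W, b))
    for k in kernels:
        x = relu(conv1d_upsample(x, k))
    return x

rng = np.random.default_rng(0)
z = rng.standard_normal(8)            # latent code (hypothetical size)
W = rng.standard_normal((16, 8))      # fully connected weights
b = rng.standard_normal(16)
kernels = [rng.standard_normal(3) for _ in range(2)]  # two conv layers
signal = generator(z, W, b, kernels)
print(signal.shape)  # each layer doubles the signal length: 16 -> 32 -> 64
```

Each layer maps into a strictly larger space, which is the discrete analogue of the expanding scale structure in which the paper's injectivity conditions on the filters and the nonlinearity are formulated.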