We study shallow and deep neural networks whose inputs range over a general topological space. The model is built from a prescribed family of continuous feature maps and reduces to multilayer feedforward networks in the Euclidean case. We focus on the universal approximation property and establish general conditions under which such networks are dense in spaces of continuous vector-valued functions on arbitrary topological spaces and, in particular, locally convex spaces. Universality results obtained in the arbitrary-width case extend classical approximation theorems to non-Euclidean spaces. We also consider the deep narrow setting, in which the width of each hidden layer is uniformly bounded while the depth is allowed to grow. We identify conditions under which such networks retain the universal approximation property. As a concrete example, we employ Ostrand's extension of the Kolmogorov superposition theorem to derive an explicit universality result for products of compact metric spaces, with width bounds expressed in terms of topological dimension.
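The Ostrand extension invoked above can be stated as follows. This is a sketch of the classical formulation; the notation ($d_p$, $N$, $\varphi_{pq}$, $g_q$) is illustrative and need not match the paper's:

```latex
% Ostrand's generalization of the Kolmogorov superposition theorem
% (classical statement; notation here is illustrative).
% Let $X_1,\dots,X_n$ be compact metric spaces with $\dim X_p = d_p$,
% and set $N = \sum_{p=1}^{n} d_p$. Then there exist continuous maps
% $\varphi_{pq}\colon X_p \to [0,1]$, $q = 1,\dots,2N+1$, independent of $f$,
% such that every continuous $f\colon \prod_{p=1}^{n} X_p \to \mathbb{R}$
% can be written as
\[
  f(x_1,\dots,x_n)
  \;=\;
  \sum_{q=1}^{2N+1}
  g_q\!\Bigl(\,\sum_{p=1}^{n} \varphi_{pq}(x_p)\Bigr)
\]
% for suitable continuous $g_q\colon \mathbb{R}\to\mathbb{R}$.
% For $X_p = [0,1]$ (so $d_p = 1$, $N = n$) this recovers Kolmogorov's
% original theorem with $2n+1$ outer terms; the bound $2N+1$, expressed
% via topological dimension, is the kind of width bound referred to above.
```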