The great success of deep learning has stimulated avid research activity on verifying the power of depth in theory, and a common consensus is that deep nets are versatile in approximating and learning numerous functions. Such versatility certainly deepens our understanding of the power of depth, but it also makes it difficult to judge which data features are crucial in a specific learning task. This paper proposes a constructive approach that equips deep nets for feature qualification. Exploiting the product-gate nature and localized approximation property of deep nets with sigmoid activation (deep sigmoid nets), we construct a linear deep net operator that achieves optimal approximation performance for smooth and radial functions. Furthermore, we provide theoretical evidence that the constructed deep net operator is capable of qualifying multiple features of the target function, such as its smoothness and radialness.
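The "product-gate nature" of sigmoid nets mentioned above refers to the classical observation that a smooth sigmoid can emulate multiplication: a second divided difference of the sigmoid around a point where its second derivative is nonzero approximates the square function, and the polarization identity then yields products. The following is a minimal numerical sketch of that idea, assuming the logistic sigmoid; the names `square_gate` and `product_gate` and the constants `T0`, `H` are illustrative choices, not notation from the paper.

```python
import math

def sigmoid(t):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-t))

def sigmoid_pp(t):
    """Second derivative of the logistic sigmoid."""
    s = sigmoid(t)
    return s * (1.0 - s) * (1.0 - 2.0 * s)

T0 = 1.0   # point where sigmoid''(T0) != 0 (it vanishes at t = 0 for the logistic)
H = 1e-3   # small step; the squaring error is O(H**2)

def square_gate(x):
    # Taylor expansion gives
    #   sigmoid(T0 + H*x) + sigmoid(T0 - H*x) - 2*sigmoid(T0)
    #     = sigmoid''(T0) * H**2 * x**2 + O(H**4),
    # so dividing by H**2 * sigmoid''(T0) approximates x**2.
    num = sigmoid(T0 + H * x) + sigmoid(T0 - H * x) - 2.0 * sigmoid(T0)
    return num / (H ** 2 * sigmoid_pp(T0))

def product_gate(x, y):
    # Polarization identity: x*y = ((x + y)**2 - x**2 - y**2) / 2,
    # built from three approximate square gates.
    return (square_gate(x + y) - square_gate(x) - square_gate(y)) / 2.0
```

Each gate here is a shallow combination of sigmoid units; composing such gates across layers is what lets a deep sigmoid net realize products of many inputs, which underlies constructions like the one in this paper.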