This paper examines the influence of internal Gaussian noise on the performance of deep feedforward neural networks, focusing on the stage at which noise is injected relative to the activation function. Two scenarios are analyzed, with noise introduced either before or after the activation function, for both additive and multiplicative noise. Noise injected before the activation function is analogous to perturbations in the neuron's input channel, while noise introduced after the activation function corresponds to noise arising within the neuron itself or in its output channel. The noise types and their methods of introduction were inspired by analog neural networks. The results show that the activation function acts as an effective nonlinear noise filter. Networks with noise introduced before the activation function consistently achieve higher accuracy than those with noise applied after it, with additive noise being suppressed more effectively in this case. For noise introduced after the activation function, multiplicative noise is less detrimental than additive noise, and earlier hidden layers contribute more significantly to performance degradation because of cumulative noise amplification governed by the statistical properties of the subsequent weight matrices. The study also demonstrates that pooling-based noise reduction is effective regardless of whether noise is introduced before or after the activation function, consistently improving network performance.
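The two injection stages can be illustrated with a minimal sketch, assuming a single sigmoid layer; the function and parameter names (`noisy_layer`, `sigma`, `stage`, `kind`) are illustrative assumptions, not the paper's actual code. The comparison at the end shows the filtering effect described above: noise injected before the activation passes through the sigmoid, whose bounded slope attenuates it, whereas noise injected after the activation reaches the output unfiltered.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def noisy_layer(x, W, b, sigma=0.1, stage="before", kind="additive"):
    """One feedforward layer with internal Gaussian noise (illustrative).

    stage: "before" -> noise perturbs the pre-activation (input channel);
           "after"  -> noise perturbs the activation output (output channel).
    kind:  "additive" adds N(0, sigma^2); "multiplicative" scales by (1 + noise).
    """
    z = x @ W + b
    if stage == "before":
        noise = rng.normal(0.0, sigma, size=z.shape)
        z = z + noise if kind == "additive" else z * (1.0 + noise)
        return sigmoid(z)
    a = sigmoid(z)
    noise = rng.normal(0.0, sigma, size=a.shape)
    return a + noise if kind == "additive" else a * (1.0 + noise)

# Compare output perturbation for the two injection stages.
x = rng.normal(size=(1000, 8))
W = rng.normal(size=(8, 4)) / np.sqrt(8)
b = np.zeros(4)
clean = sigmoid(x @ W + b)
err_before = np.mean((noisy_layer(x, W, b, stage="before") - clean) ** 2)
err_after = np.mean((noisy_layer(x, W, b, stage="after") - clean) ** 2)
```

Because the sigmoid's derivative is bounded by 1/4, additive noise injected before the activation is attenuated before reaching the output, so `err_before` comes out smaller than `err_after` in this toy setting, consistent with the paper's finding.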