This paper presents an experimental study of the simplicity bias of neural networks under different hyperparameter configurations, specifically investigating effects on Lempel-Ziv complexity and sensitivity. By varying key hyperparameters such as the activation function, the number of hidden layers, and the learning rate, the study evaluates how these choices affect the complexity of network outputs and the networks' robustness to input perturbations. Experiments conducted on the MNIST dataset aim to provide insight into the relationships between hyperparameters, complexity, and sensitivity, contributing to a deeper theoretical understanding of these concepts in neural networks.
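To make the complexity measure concrete: one common way to assign a Lempel-Ziv complexity to a network is to binarize its outputs over an ordered set of inputs and count the distinct phrases in an incremental (LZ78-style) parsing of that bit string. The paper's exact variant is not specified here, so the following is a minimal sketch under that assumption:

```python
def lz_complexity(bits: str) -> int:
    """LZ78-style phrase count of a binary string.

    Scans left to right, starting a new phrase whenever the current
    prefix has not been produced as a phrase before. More repetitive
    (simpler) strings yield fewer phrases.
    """
    seen = set()
    phrase = ""
    count = 0
    for ch in bits:
        phrase += ch
        if phrase not in seen:
            seen.add(phrase)
            count += 1
            phrase = ""  # start a fresh phrase
    return count

# A highly repetitive output string parses into few phrases,
# while an irregular one parses into more:
print(lz_complexity("00000000"))  # constant output
print(lz_complexity("01011010"))  # mixed output
```

In the experimental setting described, `bits` would be the thresholded network outputs on a fixed input ordering; sensitivity can then be probed by perturbing inputs and checking how much this string (and its phrase count) changes.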