We present TropNNC, a structured framework for compressing neural networks with linear and convolutional layers and ReLU activations. TropNNC takes a geometric approach to machine/deep learning, using tropical geometry and extending the work of Misiakos et al. (2022). We use the Hausdorff distance between zonotopes, in its standard continuous form, to obtain a tighter approximation bound for tropical polynomials than prior work. This tighter bound yields an effective compression algorithm that produces superior functional approximations of neural networks. Our method is significantly easier to implement than other frameworks and does not depend on the availability of training data samples. We validate the framework through extensive empirical evaluation on the MNIST, CIFAR, and ImageNet datasets. Our results demonstrate that TropNNC achieves performance on par with state-of-the-art methods such as ThiNet (even surpassing it in compressing linear layers) and CUP. To the best of our knowledge, it is the first method to achieve this using tropical geometry.
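As a minimal illustration of the central geometric quantity, the sketch below approximates the continuous Hausdorff distance between two zonotopes. It relies on the standard fact that for compact convex bodies the Hausdorff distance equals the supremum over unit directions of the difference of support functions, and that a zonotope's support function has a closed form in terms of its center and generators. The function names and the Monte Carlo direction sampling are illustrative choices, not part of the TropNNC algorithm itself.

```python
import numpy as np

def zonotope_support(u, center, generators):
    # Support function of the zonotope Z = center + sum_i [-1, 1] * g_i:
    # h_Z(u) = <center, u> + sum_i |<g_i, u>|
    return center @ u + np.abs(generators @ u).sum()

def hausdorff_zonotopes(c1, G1, c2, G2, n_dirs=2000, seed=0):
    # Approximate d_H(Z1, Z2) = sup_{||u||=1} |h_{Z1}(u) - h_{Z2}(u)|
    # by sampling random unit directions (a sketch, not an exact method).
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n_dirs, c1.size))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return max(
        abs(zonotope_support(u, c1, G1) - zonotope_support(u, c2, G2))
        for u in dirs
    )
```

For example, comparing the unit square (generators along both axes) with the segment [-1, 1] x {0} gives a Hausdorff distance of 1; the sampled estimate approaches this value from below as the number of directions grows.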