The success of machine learning (ML) in the modern world is largely determined by the abundance of data. In many industrial and scientific problems, however, the amount of available data is limited. Applying ML methods to data-scarce scientific problems can be made more effective via several routes, one of which is the use of equivariant neural networks that possess knowledge of the underlying symmetries. Here we suggest that the combination of symmetry-aware invariant architectures and stacks of dilated convolutions is an effective and easy-to-implement recipe that yields sizable improvements in accuracy over standard approaches. We apply it to representative physical problems from different realms: prediction of the bandgaps of photonic crystals and network approximations of magnetic ground states. The suggested invariant multiscale architectures increase the expressive power of the networks, allowing them to perform better in all considered cases.
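As a brief illustration of the dilated-convolution ingredient named above, the sketch below builds a stack of 2D convolutions with exponentially growing dilation rates, which enlarges the receptive field without pooling and captures multiscale structure. This is a minimal sketch only: the input shape (a single-channel 64x64 field), layer count, and channel widths are illustrative assumptions, not the architecture used in the paper, and the symmetry-aware (invariant) part is omitted here.

```python
import torch
import torch.nn as nn

class DilatedConvStack(nn.Module):
    """Stack of 3x3 convolutions with dilation 1, 2, 4, ... for a
    multiscale receptive field at constant spatial resolution."""
    def __init__(self, channels=32, depth=4):
        super().__init__()
        layers, in_ch = [], 1
        for i in range(depth):
            d = 2 ** i  # exponentially increasing dilation
            layers += [
                nn.Conv2d(in_ch, channels, kernel_size=3,
                          padding=d, dilation=d),  # padding=d keeps size
                nn.ReLU(),
            ]
            in_ch = channels
        self.body = nn.Sequential(*layers)
        self.head = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):
        return self.head(self.body(x))

# Usage: a batch of 8 hypothetical single-channel 64x64 input fields
x = torch.randn(8, 1, 64, 64)
y = DilatedConvStack()(x)  # output shape: (8, 1, 64, 64)
```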