Neural networks have recently been employed as material discretizations within adjoint optimization frameworks for inverse problems and topology optimization. While beneficial regularization effects and better optima have been reported for some inverse problems, the benefit for topology optimization has so far been limited, with investigations focusing on the compliance problem. We demonstrate how neural network material discretizations can, under certain conditions, find better local optima in more challenging optimization problems, considering acoustic topology optimization as a specific example. The chances of identifying a better optimum can be improved significantly by running multiple partial optimizations with different neural network initializations. Furthermore, we show that the advantage of the neural network material discretization stems from its interplay with the Adam optimizer, and we highlight its current limitations when competing with constrained and higher-order optimization techniques. At present, this discretization has been shown to be beneficial only for unconstrained first-order optimization.
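The multi-start strategy described above can be illustrated with a minimal, purely hypothetical sketch (not the paper's actual setup): a tiny neural network maps element coordinates to a material density in [0, 1], its weights are optimized with a hand-rolled Adam loop against a toy non-convex objective standing in for an acoustic cost, and several random initializations are run so the best local optimum can be kept. All names, network sizes, and the objective are illustrative assumptions.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 32)[:, None]  # toy element centroid coordinates

def init_params(rng, hidden=8):
    # random initialization of a 1-hidden-layer network (illustrative sizes)
    return {"W1": rng.normal(0, 1, (1, hidden)), "b1": np.zeros(hidden),
            "W2": rng.normal(0, 1, (hidden, 1)), "b2": np.zeros(1)}

def density(p):
    # NN material discretization: coordinates -> density in [0, 1]
    h = np.tanh(x @ p["W1"] + p["b1"])
    return 1.0 / (1.0 + np.exp(-(h @ p["W2"] + p["b2"])))  # sigmoid output

def objective(rho):
    # toy non-convex surrogate for an acoustic cost plus a volume penalty
    return float(np.sum(np.sin(3 * np.pi * rho) ** 2)
                 + 5.0 * (rho.mean() - 0.5) ** 2)

def grad(p, eps=1e-6):
    # finite-difference gradients; a real framework would use adjoint gradients
    g = {}
    for k, v in p.items():
        gk = np.zeros_like(v)
        it = np.nditer(v, flags=["multi_index"])
        for _ in it:
            idx = it.multi_index
            old = v[idx]
            v[idx] = old + eps; fp = objective(density(p))
            v[idx] = old - eps; fm = objective(density(p))
            v[idx] = old
            gk[idx] = (fp - fm) / (2 * eps)
        g[k] = gk
    return g

def adam(p, steps=60, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    # standard Adam update with bias correction
    m = {k: np.zeros_like(v) for k, v in p.items()}
    s = {k: np.zeros_like(v) for k, v in p.items()}
    for t in range(1, steps + 1):
        g = grad(p)
        for k in p:
            m[k] = b1 * m[k] + (1 - b1) * g[k]
            s[k] = b2 * s[k] + (1 - b2) * g[k] ** 2
            mhat = m[k] / (1 - b1 ** t)
            shat = s[k] / (1 - b2 ** t)
            p[k] -= lr * mhat / (np.sqrt(shat) + eps)
    return p

# Multi-start: several partial optimizations from different NN initializations;
# the best local optimum found across restarts is kept.
results = []
for seed in range(3):
    p = init_params(np.random.default_rng(seed))
    f_init = objective(density(p))
    p = adam(p)
    results.append((f_init, objective(density(p))))
best = min(f_final for _, f_final in results)
```

Each restart lands in a different local optimum of the non-convex objective, so taking the minimum over restarts mimics the paper's observation that multiple partial optimizations with different network initializations raise the chance of finding a better optimum.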