This work presents a diffusion transformer framework for data-driven structural topology optimization that combines the accuracy of physics-based methods with the efficiency of generative deep learning. Conventional approaches such as the Solid Isotropic Material with Penalization (SIMP) method require repeated finite element analyses at every iteration, making large-scale or real-time optimization computationally expensive. We propose a hybrid conditioning diffusion transformer (DiT) model that learns to generate near-optimal topologies directly from problem definitions, eliminating iterative analysis during inference. The model integrates spatially distributed conditioning through concatenated stress and strain fields and global conditioning via adaptive layer normalization (AdaLN) using scalar descriptors such as load position, magnitude, and prescribed volume fraction. A dataset of 30,000 two-dimensional SIMP-optimized structures was generated for training and evaluation. Results demonstrate that the proposed DiT achieves compliance errors below 1% relative to ground-truth SIMP solutions while maintaining accurate volume fractions and structural connectivity. Deterministic DDIM sampling produces high-fidelity topologies in seconds with as few as five denoising steps, yielding near-real-time performance. The hybrid conditioning diffusion transformer thus provides an efficient and scalable alternative to traditional topology optimization methods, with strong potential for integration into interactive computer-aided design workflows.
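To make the global conditioning pathway concrete, the sketch below shows how AdaLN-style modulation works in principle: token features are layer-normalized without a learned affine transform, and a per-feature scale and shift are instead predicted from a global conditioning vector (here, hypothetical scalar descriptors such as load position, load magnitude, and volume fraction). This is a minimal NumPy illustration under assumed shapes and a single linear projection, not the paper's actual per-block implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaln(x, cond, W, b):
    """Adaptive layer normalization (AdaLN), minimal sketch.

    Normalizes token features, then modulates them with a scale (gamma)
    and shift (beta) predicted from a global conditioning vector. In a
    full DiT block this prediction is a learned MLP per block; here it
    is a single linear map for illustration.
    """
    # Layer norm over the feature dimension, no learned affine parameters.
    mu = x.mean(axis=-1, keepdims=True)
    sigma = x.std(axis=-1, keepdims=True)
    x_norm = (x - mu) / (sigma + 1e-6)
    # Predict per-feature modulation parameters from the condition vector.
    params = cond @ W + b                  # shape (2 * d,)
    gamma, beta = np.split(params, 2)      # each shape (d,)
    return x_norm * (1.0 + gamma) + beta

d = 8                                       # token feature dimension (assumed)
cond = np.array([0.5, 0.25, 0.4])           # e.g. load x-position, magnitude, volume fraction
W = rng.normal(scale=0.1, size=(3, 2 * d))  # linear projection (stand-in for a learned MLP)
b = np.zeros(2 * d)
tokens = rng.normal(size=(4, d))            # 4 patch tokens
out = adaln(tokens, cond, W, b)
print(out.shape)                            # (4, 8)
```

Because the scalars enter through the normalization parameters rather than being concatenated spatially, they steer every token uniformly, which is why this pathway suits global quantities like the prescribed volume fraction while stress and strain fields are concatenated channel-wise instead.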