Generating high-fidelity 3D geometries that satisfy specific parameter constraints has broad applications in design and engineering. However, current methods typically rely on large training datasets and struggle with controllability and generalization beyond the training distributions. To overcome these limitations, we introduce LAMP (Linear Affine Mixing of Parametric shapes), a data-efficient framework for controllable and interpretable 3D generation. LAMP first aligns signed distance function (SDF) decoders by overfitting each exemplar from a shared initialization, then synthesizes new geometries by solving a parameter-constrained mixing problem in the aligned weight space. To ensure robustness, we further propose a safety metric that detects geometric validity via linearity mismatch. We evaluate LAMP on two 3D parametric benchmarks, DrivAerNet++ and BlendedNet, and find that LAMP enables (i) controlled interpolation within bounds with as few as 100 samples, (ii) safe extrapolation by up to 100% parameter difference beyond training ranges, and (iii) physics performance-guided optimization under fixed parameters. LAMP significantly outperforms conditional autoencoder and Deep Network Interpolation (DNI) baselines in both extrapolation and data efficiency. Our results demonstrate that LAMP advances controllable, data-efficient, and safe 3D generation for design exploration, dataset generation, and performance-driven optimization.
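The parameter-constrained mixing step described above can be sketched as a small linear-algebra problem: given aligned decoder weights and their parameter vectors, find affine coefficients that hit a target parameter vector, then mix the weights with those coefficients. The following is a minimal illustrative sketch, not the paper's implementation; the function name `mix_weights` and the minimum-norm least-squares formulation are assumptions for illustration.

```python
import numpy as np

def mix_weights(P, W, p_target):
    """Hypothetical sketch of LAMP-style affine weight mixing.

    P: (n, d) parameter vectors of n exemplars.
    W: (n, m) flattened decoder weights of the exemplars, assumed aligned
       by overfitting each exemplar from a shared initialization.
    p_target: (d,) target parameter vector.

    Solves for affine coefficients a with sum(a) = 1 and P^T a = p_target
    (minimum-norm solution), then returns the mixed weights W^T a.
    """
    n = P.shape[0]
    # Stack the parameter constraints with the affine constraint sum(a) = 1.
    A = np.vstack([P.T, np.ones((1, n))])      # (d+1, n)
    b = np.concatenate([p_target, [1.0]])      # (d+1,)
    # Minimum-norm least-squares solution of the (typically underdetermined) system.
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a @ W, a                            # mixed weight vector, coefficients
```

Loading the mixed weight vector back into the shared decoder architecture would then yield the SDF of the synthesized geometry; interpolation corresponds to targets inside the exemplars' parameter hull, extrapolation to targets outside it.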