We present a novel, regression-based method for artistically styling images. Unlike recent neural style transfer or diffusion-based approaches, our method allows explicit control over the stroke composition and level of detail in the rendered image through an extensible set of stroke patches. The stroke patch sets are procedurally generated by small programs that control the shape, size, orientation, density, color, and noise level of the strokes in the individual patches. Once trained on a set of stroke patches, a U-Net-based regression model can render any input image in a variety of distinct, evocative, and customizable styles.
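As a rough illustration of what such a patch-generating program might look like, the sketch below rasterizes a square patch of strokes whose shape, orientation, density, color, and noise level are exposed as parameters. All function and parameter names here are hypothetical; the paper's actual generators are not specified in this abstract, and this is only a minimal stand-in assuming grayscale patches with straight-line strokes.

```python
import numpy as np

def make_stroke_patch(size=64, n_strokes=12, length=20, thickness=2,
                      angle=0.0, angle_jitter=0.2, color=1.0, noise=0.05,
                      seed=None):
    """Rasterize a square patch of roughly parallel strokes.

    Hypothetical parameters mirroring the attributes named in the
    abstract: shape/size (length, thickness), orientation (angle),
    density (n_strokes), color, and noise level (noise).
    """
    rng = np.random.default_rng(seed)
    patch = np.zeros((size, size), dtype=np.float32)
    for _ in range(n_strokes):
        # Random stroke center with a jittered global orientation.
        cx, cy = rng.uniform(0, size, 2)
        theta = angle + rng.normal(0.0, angle_jitter)
        dx, dy = np.cos(theta), np.sin(theta)
        # Sample points along the stroke and stamp a thick dot at each.
        for t in np.linspace(-length / 2, length / 2, length * 2):
            x, y = int(cx + t * dx), int(cy + t * dy)
            if 0 <= x < size and 0 <= y < size:
                x0, x1 = max(0, x - thickness // 2), min(size, x + thickness // 2 + 1)
                y0, y1 = max(0, y - thickness // 2), min(size, y + thickness // 2 + 1)
                patch[y0:y1, x0:x1] = color
    # Additive pixel noise, clipped back to the valid intensity range.
    patch += rng.normal(0.0, noise, patch.shape).astype(np.float32)
    return np.clip(patch, 0.0, 1.0)
```

Varying the parameters of a generator like this yields families of patches (sparse vs. dense, coarse vs. fine, oriented vs. isotropic) that could serve as regression targets for the rendering model.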