In this work, we introduce Progressive Growing of Patch Size, a resource-efficient implicit curriculum learning approach for dense prediction tasks. Our curriculum is defined by growing the patch size during model training, which gradually increases the task's difficulty. We integrate our curriculum into the nnU-Net framework and evaluate it on all 10 tasks of the Medical Segmentation Decathlon (MSD). Compared to classical constant-patch-size training, our approach substantially reduces the runtime, computational cost, and CO$_{2}$ emissions of network training. In our experiments, the curriculum also improved convergence: we outperform standard constant-patch-size nnU-Net training in terms of Dice Score on 7 out of 10 MSD tasks while spending only roughly 50\% of the original training runtime. To the best of our knowledge, Progressive Growing of Patch Size is the first successful application of a sample-length curriculum, in the form of patch size, in the field of computer vision. Our code is publicly available at \url{https://github.com}.
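The core idea can be sketched as a schedule that maps the current epoch to a training patch size. The snippet below is a minimal illustration, not the authors' exact schedule: it linearly interpolates from a small initial patch to the full patch size, rounding to a multiple of a divisor so the patch stays compatible with U-Net downsampling (all names and default sizes here are hypothetical).

```python
def patch_size_at_epoch(epoch, num_epochs,
                        min_size=(64, 64, 64),
                        max_size=(128, 128, 128),
                        divisor=8):
    """Illustrative patch-size curriculum: linearly grow the 3D patch size
    over training from `min_size` to `max_size`, rounded to a multiple of
    `divisor` so every axis remains divisible by the network's pooling."""
    # Fraction of training completed, clamped to [0, 1].
    frac = min(epoch / max(num_epochs - 1, 1), 1.0)
    size = []
    for lo, hi in zip(min_size, max_size):
        s = lo + frac * (hi - lo)                 # linear interpolation
        size.append(int(round(s / divisor)) * divisor)
    return tuple(size)
```

In practice the data loader would query this schedule at the start of each epoch and crop training patches of the returned size, so early epochs train on small, easy patches and later epochs on the full-size patches used by standard nnU-Net.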