Deep learning models have a large number of trainable parameters, often hundreds of thousands or more. Training such models requires a large amount of data, and generating a sufficiently large dataset is costly\cite{noguchi2019image}. Generative adversarial networks (GANs) are generative models composed of two competing deep networks: a generator and a discriminator. The generator tries to produce realistic images that resemble the training dataset by approximating the training data distribution, while the discriminator is trained to classify images as real or fake (generated)\cite{goodfellow2016nips}. Training GANs likewise requires a large training dataset\cite{noguchi2019image}.

In this study, we address the question: ``Given an unconditional pretrained generator network and a pretrained classifier, is it feasible to develop a conditional generator without relying on any training dataset?''

The paper begins with a general introduction to the problem. The remaining sections are structured as follows: Section 2 provides background on the problem, Section 3 reviews the relevant literature, Section 4 outlines the methodology employed in this study, Section 5 presents the experimental results, Section 6 discusses the findings and proposes directions for future research, and Section 7 offers concluding remarks. The implementation can be accessed \href{https://github.com/kidist-amde/BigGAN-PyTorch}{here}.
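The adversarial training described above can be summarized by the standard two-player minimax objective of Goodfellow et al., where $G$ denotes the generator, $D$ the discriminator, $p_{\text{data}}$ the data distribution, and $p_z$ the prior over latent codes:
\begin{equation}
\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}(x)}\left[\log D(x)\right] + \mathbb{E}_{z \sim p_z(z)}\left[\log\left(1 - D(G(z))\right)\right]
\end{equation}
The discriminator is trained to maximize this value function (correctly separating real from generated samples), while the generator is trained to minimize it, pushing the distribution of $G(z)$ toward $p_{\text{data}}$.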