Paper addressing a shortcoming of StyleGAN2
: addresses the problem that StyleGAN2 needs a very large amount of training data
1. Training Generative Adversarial Networks with Limited Data
Metrics
2. generator metric (see the FID sketch below)
: GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium (Fréchet Inception Distance (FID))
3. discriminator metric (see the LPIPS sketch below)
: The Unreasonable Effectiveness of Deep Features as a Perceptual Metric (Learned Perceptual Image Patch Similarity (LPIPS))
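A rough sketch of what FID computes (not the paper's reference code): FID is the Fréchet distance between two Gaussians fitted to Inception-v3 features of real and generated images, FID = ||mu_r − mu_g||² + Tr(C_r + C_g − 2(C_r C_g)^{1/2}). The feature arrays `act_real` / `act_gen` below are assumed to be precomputed activations; the Inception feature-extraction step is omitted.

```python
# Minimal FID sketch: Fréchet distance between Gaussians fitted to
# Inception activations of real and generated images.
# act_real / act_gen are assumed (N, D) feature arrays already extracted
# with an Inception-v3 network (extraction step omitted here).
import numpy as np
from scipy import linalg

def frechet_distance(act_real: np.ndarray, act_gen: np.ndarray) -> float:
    mu_r, mu_g = act_real.mean(axis=0), act_gen.mean(axis=0)
    cov_r = np.cov(act_real, rowvar=False)
    cov_g = np.cov(act_gen, rowvar=False)
    # Matrix square root of the covariance product; small imaginary parts
    # from numerical error are discarded.
    covmean, _ = linalg.sqrtm(cov_r @ cov_g, disp=False)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    diff = mu_r - mu_g
    return float(diff @ diff + np.trace(cov_r + cov_g - 2.0 * covmean))

# Toy usage with random features (a real pipeline uses Inception activations).
rng = np.random.default_rng(0)
fid = frechet_distance(rng.normal(size=(512, 64)),
                       rng.normal(0.1, 1.0, size=(512, 64)))
print(f"FID = {fid:.3f}")
```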
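And a minimal LPIPS usage sketch, assuming the `lpips` PyPI package (the implementation released with the Zhang et al. paper); the random tensors below are only placeholders for an image pair.

```python
# Minimal LPIPS sketch using the authors' `lpips` package (pip install lpips).
# Inputs are RGB tensors of shape (N, 3, H, W) scaled to [-1, 1].
import torch
import lpips

loss_fn = lpips.LPIPS(net='alex')  # AlexNet backbone; 'vgg' is also available

# Two random 64x64 images stand in for a generated/reference image pair.
img0 = torch.rand(1, 3, 64, 64) * 2 - 1
img1 = torch.rand(1, 3, 64, 64) * 2 - 1

distance = loss_fn(img0, img1)  # higher = more perceptually different
print(distance.item())
```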