betterze/StyleSpace
StyleSpace Analysis: Disentangled Controls for StyleGAN Image Generation
| field | value |
| --- | --- |
| repo name | betterze/StyleSpace |
| repo link | https://github.com/betterze/StyleSpace |
| homepage | |
| language | Jupyter Notebook |
| size (curr.) | 164588 kB |
| stars (curr.) | 38 |
| created | 2021-04-01 |
| license | |
StyleSpace Analysis: Disentangled Controls for StyleGAN Image Generation
Demo video: CVPR 2021 Oral presentation.
Demos: single-channel manipulation; localized or attribute-specific manipulation.
StyleSpace Analysis: Disentangled Controls for StyleGAN Image Generation
Zongze Wu, Dani Lischinski, Eli Shechtman
CVPR 2021 (Oral) | paper | video
Abstract: We explore and analyze the latent style space of StyleGAN2, a state-of-the-art architecture for image generation, using models pretrained on several different datasets. We first show that StyleSpace, the space of channel-wise style parameters, is significantly more disentangled than the other intermediate latent spaces explored by previous works. Next, we describe a method for discovering a large collection of style channels, each of which is shown to control a distinct visual attribute in a highly localized and disentangled manner. Third, we propose a simple method for identifying style channels that control a specific attribute, using a pretrained classifier or a small number of example images. Manipulation of visual attributes via these StyleSpace controls is shown to be better disentangled than via those proposed in previous works. To show this, we make use of a newly proposed Attribute Dependency metric. Finally, we demonstrate the applicability of StyleSpace controls to the manipulation of real images. Our findings pave the way to semantically meaningful and well-disentangled image manipulations via simple and intuitive interfaces.
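The single-channel manipulation the abstract describes can be expressed compactly: map a latent code z to w, compute the per-layer style vectors s (the outputs of StyleGAN2's per-layer affine transforms applied to w), shift one channel of one layer's style vector, and run synthesis from the modified styles. The sketch below is a minimal illustration of that idea under stated assumptions; the generator interface (`mapping`, `affine_layers`, `synthesis_from_styles`), the `style_std` statistics, and the example (layer, channel) indices are hypothetical stand-ins for illustration, not the API of this repository.

```python
import torch

# Hypothetical generator interface -- stand-ins for a StyleGAN2 implementation;
# the actual loading code and function names in this repo may differ.
#   G.mapping(z)             -> w in W space, shape [batch, num_layers, 512]
#   G.affine_layers          -> list of per-layer affine modules A_i (w -> s_i)
#   G.synthesis_from_styles  -> runs synthesis directly from style vectors s

def manipulate_single_channel(G, z, layer_idx, channel_idx, alpha, style_std):
    """Shift one StyleSpace channel by alpha standard deviations and synthesize.

    style_std[layer_idx][channel_idx] is the standard deviation of that channel,
    estimated offline from the styles of many random samples (assumption).
    """
    with torch.no_grad():
        w = G.mapping(z)                                              # z -> w
        styles = [A(w[:, i]) for i, A in enumerate(G.affine_layers)]  # s = A(w)

        # Single-channel edit in StyleSpace: move exactly one channel of one layer.
        styles[layer_idx][:, channel_idx] += alpha * style_std[layer_idx][channel_idx]

        return G.synthesis_from_styles(styles)                        # image tensor


# Example usage (assuming G and style_std have been prepared):
# z = torch.randn(1, 512)
# img = manipulate_single_channel(G, z, layer_idx=6, channel_idx=501,  # arbitrary indices
#                                 alpha=10.0, style_std=style_std)
```

Scaling the shift by the channel's estimated standard deviation keeps the manipulation strength comparable across channels whose raw values live on different scales.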