jason9693/Style-Transfer
Implementation of Lee-Jung-Seob style transfer
| repo name | jason9693/Style-Transfer |
| repo link | https://github.com/jason9693/Style-Transfer |
| homepage | |
| language | Jupyter Notebook |
| size (curr.) | 23482 kB |
| stars (curr.) | 34 |
| created | 2018-02-12 |
| license | |
Style Transfer
This repo originates from the CS231n (2017) assignment repository.
In this notebook we will implement the style transfer technique from “Image Style Transfer Using Convolutional Neural Networks” (Gatys et al., CVPR 2016).
The general idea is to take two images, and produce a new image that reflects the content of one but the artistic “style” of the other. We will do this by first formulating a loss function that matches the content and style of each respective image in the feature space of a deep network, and then performing gradient descent on the pixels of the image itself.
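As a concrete sketch of those two loss terms, here is a minimal PyTorch-style version: the content loss is a squared error between feature maps at one layer, and the style loss is a squared error between Gram matrices at several layers. This is an illustrative reimplementation rather than the notebook's exact code; the function names and the `normalize` option are assumptions. The gradient descent on the pixels themselves is sketched after the SqueezeNet paragraph below.

```python
import torch
import torch.nn.functional as F

def content_loss(weight, current_feats, target_feats):
    # Squared error between the generated image's features and the content
    # image's features at one chosen layer.
    return weight * F.mse_loss(current_feats, target_feats, reduction='sum')

def gram_matrix(feats, normalize=True):
    # feats: (N, C, H, W) activations -> (N, C, C) Gram matrix of channel
    # correlations, which captures "style" while discarding spatial layout.
    N, C, H, W = feats.shape
    flat = feats.view(N, C, H * W)
    gram = torch.bmm(flat, flat.transpose(1, 2))
    if normalize:
        gram = gram / (C * H * W)
    return gram

def style_loss(weights, current_feats_list, target_grams):
    # Sum over the chosen style layers of the squared difference between the
    # generated image's Gram matrices and the style image's Gram matrices.
    loss = 0.0
    for w, feats, target in zip(weights, current_feats_list, target_grams):
        loss = loss + w * torch.sum((gram_matrix(feats) - target) ** 2)
    return loss
```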
The deep network we use as a feature extractor is SqueezeNet, a small model that has been trained on ImageNet. You could use any network, but we chose SqueezeNet here for its small size and efficiency.
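A hedged sketch of that setup: load a pretrained SqueezeNet from torchvision, freeze it, collect its intermediate activations, and run gradient descent directly on the image pixels. It builds on the loss functions sketched above; `content_img` and `style_img` are assumed to be preprocessed `(1, 3, H, W)` tensors, and the layer indices, weights, learning rate, and iteration count are illustrative rather than the notebook's actual values.

```python
import torch
import torchvision

# Load a pretrained SqueezeNet (ImageNet weights) and freeze it; only its
# intermediate activations are needed, not its classifier head.
cnn = torchvision.models.squeezenet1_1(pretrained=True).features
cnn.eval()
for p in cnn.parameters():
    p.requires_grad_(False)

def extract_features(x, model=cnn):
    # Run x through the network layer by layer, keeping every activation.
    feats = []
    for layer in model:
        x = layer(x)
        feats.append(x)
    return feats

# Pixel-space optimization (hyperparameters below are illustrative).
content_layer, content_weight = 3, 6e-2
style_layers = [1, 4, 6, 7]
style_weights = [2e5, 8e2, 1e1, 1e0]

content_target = extract_features(content_img)[content_layer].detach()
style_targets = [gram_matrix(extract_features(style_img)[i]).detach()
                 for i in style_layers]

# The "parameters" being optimized are the pixels of the generated image.
img = content_img.clone().requires_grad_(True)
optimizer = torch.optim.Adam([img], lr=0.1)

for _ in range(200):
    optimizer.zero_grad()
    feats = extract_features(img)
    loss = content_loss(content_weight, feats[content_layer], content_target)
    loss = loss + style_loss(style_weights,
                             [feats[i] for i in style_layers],
                             style_targets)
    loss.backward()
    optimizer.step()
```

Because the network is frozen, gradients flow only into `img`, so each optimizer step nudges the pixels toward matching the content features of one image and the style statistics of the other.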
Here’s an example of the images you’ll be able to produce by the end of this notebook: