This script shows how to reconstruct face images from sketch-like images using pix2pix, a kind of conditional GAN.
This code is based on https://github.com/matlab-deep-learning/pix2pix.
First of all, please download the CelebAMask-HQ dataset. CelebAMask-HQ is a large-scale face image dataset containing 30,000 high-resolution face images selected from the CelebA dataset.
To download the dataset, the following sites are available.
After downloading CelebA-HQ-img, put the folder in the current path as shown below.
After the installation, please push the Run button.
Run the function install.m to ensure that all required files are added to the MATLAB path.

clear; clc; close all
install();
Run the function img2sketch.m to convert the face images into sketch-like ones. The folder "CelebA_Line" should then be created.
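For illustration, a typical image-to-sketch conversion uses a color-dodge blend of a grayscale image with its blurred negative. This is a hypothetical sketch of such a conversion; the actual img2sketch.m in the repository may use a different method, and the file path is only an example.

```matlab
% Hypothetical image-to-sketch conversion (pencil-sketch effect);
% the repository's img2sketch.m may differ.
img    = imread('./CelebA-HQ-img/1355.jpg');   % an example face image
gray   = im2double(rgb2gray(img));
neg    = imgaussfilt(1 - gray, 5);             % blurred negative (Gaussian sigma = 5)
sketch = min(gray ./ max(1 - neg, eps), 1);    % color-dodge blend -> sketch-like look
imwrite(sketch, './CelebA_Line/1355.jpg');
```

Brighter regions divide out to near-white, while edges (where the blurred negative differs from the original) remain dark, giving the sketch-like appearance.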
To train a model, you need pairs of "before" and "after" images, which correspond to the sketch images in CelebA_Line and the original face images in CelebA-HQ-img, respectively.
We can tune the training parameters as below.
options = p2p.trainingOptions('MaxEpochs',1,'MiniBatchSize',8,'VerboseFrequency',30);
Note that training the model will take several hours on a GPU and requires around 6 GB of GPU memory.
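The folder variables passed to p2p.train below can be set as follows. This assumes the folder names from the steps above; adjust the paths if yours differ.

```matlab
% Assumption: these folder names follow the earlier steps.
labelFolder  = './CelebA_Line';     % sketch-like "before" images
targetFolder = './CelebA-HQ-img';   % original "after" face images
```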
p2pModel = p2p.train(labelFolder, targetFolder, options);
Once the model is trained, we can use the generator to generate a new image.
exampleInput = imread('./CelebA_Line/1355.jpg');
exampleInput = imresize(exampleInput, [256, 256]);
We can then use the p2p.translate function to convert the input image using the trained model.
exampleOutput = p2p.translate(p2pModel, exampleInput);
figure; imshowpair(exampleInput, gather(exampleOutput), "montage");
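If you want to keep the result, the generated image can be saved to disk. This step is not part of the original script; the output file name is arbitrary.

```matlab
% gather() moves the data back from the GPU if exampleOutput is a gpuArray.
outputImage = gather(exampleOutput);
imwrite(outputImage, 'generated_face.jpg');   % save the reconstructed face
```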
This code was modified from https://github.com/matlab-deep-learning/pix2pix by Kenta Itakura.
Kenta (2022). sketch2im using Conditional GAN (pix2pix) (https://github.com/KentaItakura/pix2pix/releases/tag/1.0), GitHub. Retrieved .
Platform Compatibility: Windows, macOS, Linux