• Optics and Precision Engineering
  • Vol. 27, Issue 8, 1836 (2019)
DU Zhen-long*, SHEN Hai-yang, SONG Guo-mei, and LI Xiao-li
Author affiliations: [in Chinese]
    DOI: 10.3788/ope.20192708.1836
    DU Zhen-long, SHEN Hai-yang, SONG Guo-mei, LI Xiao-li. Image style transfer based on improved CycleGAN[J]. Optics and Precision Engineering, 2019, 27(8): 1836

    Abstract

    Image style transfer modifies the content of a given image according to a specified style. Automatic style transfer based on a Generative Adversarial Network (GAN) reduces manual workload and yields rich results; however, the paired datasets required by a classical GAN are often difficult to obtain. To overcome this limitation of the traditional GAN and improve the efficiency of style transfer, this study proposed an image style transfer method based on an improved cycle-consistent adversarial network (CycleGAN). The deep residual network used in the conventional generator was replaced with a densely connected convolutional network, and a novel loss function combining an identity-mapping loss and a perceptual loss was used to measure the style transfer loss. These improvements were shown to increase network performance, remove the network's dependence on paired samples, and improve the quality of the images generated by style transfer; they also improved training stability and accelerated network convergence. Experiments demonstrate that the peak signal-to-noise ratio of images generated by the proposed method increased by 6.27% on average, whereas the structural similarity index increased by approximately 10%. The improved CycleGAN image style transfer method proposed in this study can thus generate better-styled images.
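    The loss terms named in the abstract (identity mapping and perceptual loss) and the reported PSNR metric follow standard definitions, which can be sketched as below. This is a minimal NumPy illustration, not the authors' implementation: the feature extractor `phi`, the function names, and the absence of loss weights are assumptions for the sketch.

```python
import numpy as np

def l1(a, b):
    """Mean absolute error, the usual base distance in CycleGAN-style losses."""
    return float(np.mean(np.abs(a - b)))

def identity_loss(G, y):
    """Identity-mapping term: a generator fed an image already in its
    target style should return it (approximately) unchanged."""
    return l1(G(y), y)

def perceptual_loss(phi, y_fake, y_real):
    """Perceptual term: compare feature maps from an extractor `phi`
    (typically a pretrained CNN; here `phi` is a hypothetical placeholder)."""
    return l1(phi(y_fake), phi(y_real))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB, the quality metric reported above."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```

    In a full objective these terms would be weighted and added to the adversarial and cycle-consistency losses of the standard CycleGAN formulation.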