• Infrared and Laser Engineering
  • Vol. 51, Issue 2, 20210891 (2022)
Yinxu Bian1,*, Tao Xing1, Weijie Deng2, Qin Xian3, Honglei Qiao3, Qian Yu4, Jilong Peng4, Xiaofei Yang5, Yannan Jiang6, Jiaxiong Wang7, Shenmin Yang7, Renbin Shen6, Hua Shen1, and Cuifang Kuang8
Author Affiliations
  • 1School of Electronic and Optical Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
  • 2Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
  • 3Chongqing Jialing Huaguang Optoelectronic Technology Co., Ltd., Chongqing 400700, China
  • 4Beijing Environmental Satellite Engineering Institute, Beijing 100094, China
  • 5School of Optoelectronic Science and Engineering, Soochow University, Suzhou 215006, China
  • 6Department of General Surgery, the Affiliated Suzhou Hospital of Nanjing Medical University, Suzhou Municipal Hospital, Suzhou 215002, China
  • 7Center of Reproduction and Genetics, Affiliated Suzhou Hospital of Nanjing Medical University, Suzhou Municipal Hospital, Suzhou 215002, China
  • 8College of Optical Science and Engineering, Zhejiang University, Hangzhou 310027, China
    DOI: 10.3788/IRLA20210891
    Yinxu Bian, Tao Xing, Weijie Deng, Qin Xian, Honglei Qiao, Qian Yu, Jilong Peng, Xiaofei Yang, Yannan Jiang, Jiaxiong Wang, Shenmin Yang, Renbin Shen, Hua Shen, Cuifang Kuang. Deep learning-based color transfer biomedical imaging technology[J]. Infrared and Laser Engineering, 2022, 51(2): 20210891
    Fig. 1. Style transfer algorithm of Gatys et al[38]. (a) Style and content reconstruction; (b) Style transfer example
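The style representation behind Gatys et al.'s method (Fig. 1) is the Gram matrix of CNN feature maps: style is encoded by correlations between feature channels, and the style loss compares Gram matrices of the generated and style images. The following is a minimal NumPy sketch of that idea with toy feature maps; in the real algorithm the features come from a pretrained VGG network.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map with C channels and H*W positions.

    Style is represented by channel correlations:
    G[i, j] = sum_k F[i, k] * F[j, k], normalized by the number of positions.
    """
    c, hw = features.shape
    return features @ features.T / hw

def style_loss(gen_features, style_features):
    """Mean squared difference between the two Gram matrices."""
    g_gen = gram_matrix(gen_features)
    g_style = gram_matrix(style_features)
    return np.mean((g_gen - g_style) ** 2)

# Toy feature maps: 4 channels over an 8x8 grid flattened to 64 positions
rng = np.random.default_rng(0)
f_style = rng.normal(size=(4, 64))
f_gen = rng.normal(size=(4, 64))
print(style_loss(f_style, f_style))  # identical features -> 0.0
print(style_loss(f_gen, f_style) > 0.0)
```

In the full method this loss is minimized with respect to the pixels of the generated image (or, in feed-forward variants such as Johnson et al.'s in Fig. 2, used to train a transformation network).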
    Fig. 2. Network structure of Johnson et al[47]
    Fig. 3. Example results of pix2pix[57]
    Fig. 4. Paired training data and unpaired training data sets[59]
    Fig. 5. Structure of CycleGAN[59]. (a) Two generators generate images cyclically; (b) Cyclic reconstruction process of X-domain images; (c) Cyclic reconstruction process of Y-domain images
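The cyclic reconstruction in Fig. 5 is enforced by a cycle-consistency loss: with generators G: X→Y and F: Y→X, training penalizes the L1 distance between F(G(x)) and x, and between G(F(y)) and y, alongside the adversarial losses. A toy sketch of that loss term follows; the scalar "generators" here are illustrative stand-ins for the convolutional generators CycleGAN actually trains.

```python
import numpy as np

# Toy stand-in generators: G maps domain X to Y, F maps Y back to X.
# In CycleGAN these are CNNs trained jointly with two discriminators.
def G(x):  # X -> Y
    return x * 2.0

def F(y):  # Y -> X
    return y / 2.0

def cycle_consistency_loss(x, y):
    """L1 cycle loss: ||F(G(x)) - x||_1 + ||G(F(y)) - y||_1."""
    return np.abs(F(G(x)) - x).mean() + np.abs(G(F(y)) - y).mean()

x = np.ones((2, 2))
y = np.full((2, 2), 3.0)
print(cycle_consistency_loss(x, y))  # perfect inverses -> 0.0
```

Because the loss only requires that each generator invert the other, CycleGAN can be trained on the unpaired data sets of Fig. 4, which is what makes it attractive for stain-to-stain transfer where pixel-aligned image pairs are unavailable.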
    Fig. 6. Application of CycleGAN in the field of color-transfer[59]
    Fig. 7. Experimental results of Pranita et al[66]. (a) pix2pix model for paired image translation; (b) Cycle CGAN model for unpaired image translation
    Fig. 8. Experimental results of Teramoto et al[68]. (a) Transformation results of adenocarcinoma; (b) Transformation results of squamous cell carcinomas
    Fig. 9. Work done by Lo et al[73]. (a) CycleGAN structure of Lo et al; (b) The Faster R-CNN structure of Lo et al; (c) P–R curves using different H&E trained models to test images with different stains, where “O” and “×” denote the manual detection results of H&E and PAS images, respectively, performed by four doctors
    Fig. 10. Work done by Xu et al[75]. (a) Structure of cCGAN; (b) Example of the training datasets; (c) Experimental results with different parameter settings
    Fig. 11. Work done by de Bel et al[76]. (a) Architecture of the generator in the residual CycleGAN, closely resembling the standard U-net; (b) The generator learns the difference mapping or residual between a source and target domain; (c) Samples of colon tissue before and after transformation with the CycleGAN approaches
    Fig. 12. UV-PAM and Deep-PAM validation using a 7 µm thick frozen section of a mouse brain[80]
    Fig. 13. Stain normalization network architecture proposed by Chen et al[86]
    Fig. 14. Examples of style normalized images of different normalization methods in the target domain[86]
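Learned stain normalization (Figs. 13–14) is usually benchmarked against classical statistics-matching baselines such as Reinhard-style normalization, which simply shifts and scales each color channel of the source image to match the target domain's per-channel mean and standard deviation. A minimal sketch of that baseline follows; it is a reference method for context, not Chen et al.'s network, and the color space (here the raw channels) is an assumption — Reinhard's original method operates in lαβ space.

```python
import numpy as np

def match_channel_stats(source, target):
    """Reinhard-style normalization: for each channel, map the source's
    mean/std to the target's mean/std. Deep stain-normalization networks
    learn a richer, spatially aware version of this mapping."""
    src = source.astype(np.float64)
    tgt = target.astype(np.float64)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_mu, s_sd = src[..., c].mean(), src[..., c].std() + 1e-8
        t_mu, t_sd = tgt[..., c].mean(), tgt[..., c].std()
        out[..., c] = (src[..., c] - s_mu) / s_sd * t_sd + t_mu
    return out

# Toy 16x16 RGB "slides" from two staining conditions
rng = np.random.default_rng(1)
src = rng.uniform(0, 255, size=(16, 16, 3))
tgt = rng.uniform(50, 200, size=(16, 16, 3))
out = match_channel_stats(src, tgt)
# per-channel means of `out` now match those of `tgt`
```

Such global statistics matching cannot distinguish tissue structures from background, which is one motivation for GAN-based, content-aware normalization like the architecture in Fig. 13.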
    Fig. 15. Deep learning colorful PIE lens-less diffraction microscopy[94]. (a) Flow charts of the computational algorithms for colorful PIE microscopy with only one kind of illumination; (b) Visual comparisons of colorful PIE microscopy images and conventional RGB brightfield images
    Fig. 16. Virtual colorful lens-free on-chip microscopy[95]. (a) Lens-free on-chip microscope; (b) Data processing to achieve virtual colorful lens-free on-chip microscopy. The yellow scale bar is 200 μm; (c) Deep learning GAN network established to achieve virtual colorization; (d) Comparisons of the lens-free on-chip microscopy image, bench-top commercial microscopy image, and virtual colorization image
    Fig. 17. Singlet microscopy colorization[96]. (a) Overview of the singlet microscopy colorization pipeline; (b) 200 groups of images under B/G/R illumination used to evaluate the average PSNRs and SSIMs of the virtual colorized microscopy images
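The PSNR metric used in Fig. 17 to score virtual colorization against ground-truth B/G/R captures is 10·log10(MAX²/MSE) in decibels, where MAX is the peak pixel value and MSE the mean squared error between the two images. A minimal sketch with a toy image pair:

```python
import numpy as np

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE)."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

ref = np.full((8, 8), 128.0)
noisy = ref + 4.0  # uniform error of 4 gray levels -> MSE = 16
print(round(psnr(ref, noisy), 2))  # -> 36.09
```

SSIM, the second metric reported, additionally compares local luminance, contrast, and structure statistics rather than raw pixel errors, so the two scores together probe both fidelity and perceptual similarity of the virtually colorized images.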
Table 1. Statistics on the usage of color transfer technology

| Application field | Network structure | Learning method | Application problems |
|---|---|---|---|
| Color transfer technology of pathological section images | pix2pix | Supervised | Computational tissue staining |
| | Cycle CGAN | Unsupervised | Computational tissue staining |
| | CycleGAN | Unsupervised | Mutual stain |
| | CycleGAN, Faster R-CNN | Unsupervised | Tissue staining and detection |
| | cCGAN, Residual CycleGAN | Unsupervised | Model improvements for different demand backgrounds |
| | Deep-PAM | — | Combining different medical image information acquisition technologies |
| | GAN | Unsupervised | Unsupervised image style normalization |
| Virtual color enhancement for lensless and single-lens imaging | GAN | — | PIE |
| | GAN | — | Improvements in lensless microscopes |
| | U-Net | — | Computational virtual shading method for single-lens microscopy |