• Optoelectronics Letters
  • Vol. 19, Issue 2, 117 (2023)
Qingsong ZHANG1,2, Linjun SUN2, Guowei YANG1,*, Baoli LU2, Xin NING2, and Weijun LI2
Author Affiliations
  • 1School of Electronic Information, Qingdao University, Qingdao 266071, China
  • 2Institute of Semiconductors, Chinese Academy of Sciences, Beijing 100083, China
    DOI: 10.1007/s11801-023-2113-2
    ZHANG Qingsong, SUN Linjun, YANG Guowei, LU Baoli, NING Xin, and LI Weijun. TBNN: totally-binary neural network for image classification[J]. Optoelectronics Letters, 2023, 19(2): 117
    References

    [1] HE R, SUN S, YANG J, et al. Knowledge distillation as efficient pre-training: faster convergence, higher data-efficiency, and better transferability[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 19-24, 2022, New Orleans, Louisiana, USA. New York: IEEE, 2022: 9161-9171.

    [2] ZHANG L, CHEN X, TU X, et al. Wavelet knowledge distillation: towards efficient image-to-image translation[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 19-24, 2022, New Orleans, Louisiana, USA. New York: IEEE, 2022: 12464-12474.

    [3] ZHONG Y, LIN M, NAN G, et al. IntraQ: learning synthetic images with intra-class heterogeneity for zero-shot network quantization[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 19-24, 2022, New Orleans, Louisiana, USA. New York: IEEE, 2022: 12339-12348.

    [4] LIU C, DING W, XIA X, et al. Circulant binary convolutional networks: enhancing the performance of 1-bit DCNNs with circulant back propagation[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 13-19, 2019, Long Beach, CA, USA. New York: IEEE, 2019: 2691-2699.

    [5] LIU Z, SHEN Z, SAVVIDES M, et al. ReActNet: towards precise binary neural network with generalized activation functions[C]//European Conference on Computer Vision, August 23-28, 2020, Virtual. Cham: Springer, 2020: 143-159.

    [6] ZHOU S, WU Y, NI Z, et al. DoReFa-Net: training low bitwidth convolutional neural networks with low bitwidth gradients[EB/OL]. (2016-06-20) [2022-06-22]. https://arxiv.org/pdf/1606.06160.pdf.

    [7] DING R, CHIN T W, LIU Z, et al. Regularizing activation distribution for training binarized deep networks[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 13-19, 2019, Long Beach, CA, USA. New York: IEEE, 2019: 11408-11417.

    [8] HOWARD A G, ZHU M, CHEN B, et al. MobileNets: efficient convolutional neural networks for mobile vision applications[EB/OL]. (2017-04-17) [2022-06-22]. https://arxiv.org/pdf/1704.04861.pdf.

    [9] ZHANG X, ZHOU X, LIN M, et al. ShuffleNet: an extremely efficient convolutional neural network for mobile devices[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 18-22, 2018, Salt Lake City, UT, USA. New York: IEEE, 2018: 6848-6856.

    [10] MEHTA S, RASTEGARI M. MobileViT: light-weight, general-purpose, and mobile-friendly vision transformer[EB/OL]. (2021-10-17) [2022-06-22]. https://arxiv.org/pdf/2110.02178v2.pdf.

    [11] QIN H, GONG R, LIU X, et al. Forward and backward information retention for accurate binary neural networks[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 13-19, 2020, Seattle, WA, USA. New York: IEEE, 2020: 2250-2259.

    [12] LIU Z, WU B, LUO W, et al. Bi-Real Net: enhancing the performance of 1-bit CNNs with improved representational capability and advanced training algorithm[C]//Proceedings of the European Conference on Computer Vision, September 8-14, 2018, Munich, Germany. Berlin, Heidelberg: Springer-Verlag, 2018: 722-737.

    [13] LIN X, ZHAO C, PAN W. Towards accurate binary convolutional neural network[J]. Advances in Neural Information Processing Systems, 2017, 30: 345-353.

    [14] SU Z, FANG L, GUO D, et al. FTBNN: rethinking non-linearity for 1-bit CNNs and going beyond[EB/OL]. (2010-09-29) [2022-06-22]. https://www.xueshufan.com/reader/3096361616.

    [15] KIM H, KIM K, KIM J, et al. BinaryDuo: reducing gradient mismatch in binary activation network by coupling binary activations[EB/OL]. (2002-06-05) [2022-06-22]. https://arxiv.org/pdf/2002.06517v1.pdf.

    [16] XIE S, GIRSHICK R, DOLLAR P, et al. Aggregated residual transformations for deep neural networks[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, July 21-26, 2017, Honolulu, HI, USA. New York: IEEE, 2017: 1492-1500.

    [17] CHOLLET F. Xception: deep learning with depthwise separable convolutions[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, July 21-26, 2017, Honolulu, HI, USA. New York: IEEE, 2017: 1251-1258.

    [18] HUBARA I, COURBARIAUX M, SOUDRY D, et al. Binarized neural networks[J]. Advances in Neural Information Processing Systems, 2016, 29: 4107-4115.

    [19] GONG R, LIU X, JIANG S, et al. Differentiable soft quantization: bridging full-precision and low-bit neural networks[C]//Proceedings of the IEEE International Conference on Computer Vision, October 27-November 3, 2019, Seoul, South Korea. New York: IEEE, 2019: 4852-4861.
