[1] CHEN D Y, YANG X Y, WANG H X, et al. Analysis of secular trends in incidence of skin cancer in China based on Joinpoint regression and age-period-cohort model[J]. Chinese Journal of Disease Control & Prevention, 2022, 26(7): 756-765. (in Chinese)
[2] LIN Q L, ZHANG W J, WANG H, et al. Research progress on epidemiology, treatment and prevention of malignant melanoma of skin[J]. China Medical Herald, 2019, 16(3): 28-32. (in Chinese)
[3] N CODELLA, V ROTEMBERG, P TSCHANDL et al. Skin lesion analysis toward melanoma detection 2018: a challenge hosted by the international skin imaging collaboration (ISIC). ArXiv e-Prints(2019).
[4] LIANG L M, ZHOU L S, FENG J, et al. Skin lesion segmentation based on high-resolution composite network[J]. Opt. Precision Eng., 2022, 30(16): 2021-2038. (in Chinese). doi: 10.37188/ope.20223016.2021
[5] E SHELHAMER, J LONG, T DARRELL. Fully convolutional networks for semantic segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39, 640-651(2017).
[6] O RONNEBERGER, P FISCHER, T BROX. U-Net: convolutional networks for biomedical image segmentation. Medical Image Computing and Computer-Assisted Intervention - MICCAI 2015, 234-241(2015).
[7] S M PANG, C L PANG, L ZHAO et al. SpineParseNet: spine parsing for volumetric MR image by a two-stage segmentation framework with semantic image representation. IEEE Transactions on Medical Imaging, 40, 262-273(2021).
[8] Z W ZHOU, M M R SIDDIQUEE, N TAJBAKHSH et al. UNet++: redesigning skip connections to exploit multiscale features in image segmentation. IEEE Transactions on Medical Imaging, 39, 1856-1867(2020).
[9] O OKTAY, J SCHLEMPER, L LE FOLGOC et al. Attention U-net: learning where to look for the pancreas. ArXiv e-Prints(2018).
[10] R GU, G T WANG, T SONG et al. CA-net: comprehensive attention convolutional neural networks for explainable medical image segmentation. IEEE Transactions on Medical Imaging, 40, 699-711(2021).
[12] A DOSOVITSKIY, L BEYER, A KOLESNIKOV et al. An image is worth 16x16 words: transformers for image recognition at scale. ArXiv e-Prints(2020).
[14] Y D ZHANG, H Y LIU, Q HU. TransFuse: fusing transformers and CNNs for medical image segmentation. Medical Image Computing and Computer Assisted Intervention - MICCAI 2021, 14-24(2021).
[15] H Y WANG, Y K ZHU, B GREEN et al. Axial-DeepLab: stand-alone axial-attention for panoptic segmentation. Computer Vision - ECCV 2020, 108-126(2020).
[16] J HU, L SHEN, S ALBANIE et al. Squeeze-and-excitation networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 42, 2011-2023(2020).
[17] Q L WANG, B G WU, P F ZHU et al. ECA-Net: efficient channel attention for deep convolutional neural networks. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 11531-11539(2020).
[18] F MILLETARI, N NAVAB, S A AHMADI. V-Net: fully convolutional neural networks for volumetric medical image segmentation. 2016 Fourth International Conference on 3D Vision (3DV), 565-571(2016).
[19] D GUTMAN, N C F CODELLA, E CELEBI et al. Skin lesion analysis toward melanoma detection: a challenge at the international symposium on biomedical imaging (ISBI) 2016, hosted by the international skin imaging collaboration (ISIC). ArXiv e-Prints(2016).
[20] P TSCHANDL, C ROSENDAHL, H KITTLER. The HAM10000 dataset, a large collection of multi-source dermatoscopic images of common pigmented skin lesions. Scientific Data, 5, 180161(2018).
[21] XU C J, YI J B, CAO F, et al. Colorectal polyp segmentation algorithm using DoubleUNet network[J]. Opt. Precision Eng., 2022, 30(8): 970-983. (in Chinese). doi: 10.37188/ope.20223008.0970
[22] SUN J M, GE Q Q, LI X M, et al. A medical image segmentation network with boundary enhancement[J]. Journal of Electronics & Information Technology, 2022, 44(5): 1643-1652. (in Chinese). doi: 10.11999/JEIT210784
[23] J M J VALANARASU, I HACIHALILOGLU et al. Medical transformer: gated axial-attention for medical image segmentation. Medical Image Computing and Computer Assisted Intervention - MICCAI 2021, 36-46(2021).
[24] J M J VALANARASU, V M PATEL. UNeXt: MLP-based rapid medical image segmentation network. Medical Image Computing and Computer Assisted Intervention - MICCAI 2022, 23-33(2022).