Author Affiliations
Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, Yunnan 650500, China
Fig. 1. Example of bronze inscription data set
Fig. 2. Training loss and accuracy curves on training and test data sets. (a) Accuracy curve; (b) loss curve
Fig. 3. L1 norm distributions of convolution kernels in the C1, C2, and C3 convolution layers. (a) C1 convolution layer; (b) C2 convolution layer; (c) C3 convolution layer
Fig. 4. Final model retraining process and L1 norm distributions in convolution layers. (a) Retraining process; (b) L1 norm distributions
Fig. 5. L1 norm distributions of convolution kernels in VGG16 convolution layer. (a) Before pruning; (b) after downsizing; (c) after pruning
Fig. 6. L1 norm distributions of convolution kernels in ResNet18 convolution layer. (a) Before pruning; (b) after pruning
| Layer | Input size | Output size | Number of kernels | Number of parameters | FLOPS |
| --- | --- | --- | --- | --- | --- |
| C1 | 32×32×1 | 30×30×6 | 6 | 60 | 54000 |
| M1 | 30×30×6 | 15×15×6 | | | 5400 |
| C2 | 15×15×6 | 11×11×16 | 16 | 2416 | 292336 |
| M2 | 11×11×16 | 5×5×16 | | | 1936 |
| C3 | 5×5×16 | 1×1×32 | 32 | 12832 | 12832 |
| FC | 32 | 106 | | 3498 | 3392 |
| Total | | | | 18806 | 369896 |

Table 1. Network structure of LeNet
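The parameter and FLOP counts in Table 1 can be reproduced with a short calculation. The sketch below infers the kernel sizes from the input/output sizes in the table (3×3 for C1, 5×5 for C2 and C3, all with stride 1 and no padding) and counts one bias term per output kernel; this accounting convention is an assumption, chosen because it matches the table's totals.

```python
def conv_stats(out_h, out_w, out_c, k, in_c):
    """Parameters and FLOPs for one conv layer: each output value costs
    k*k*in_c multiply-accumulates plus one bias op; params include biases."""
    params = (k * k * in_c + 1) * out_c
    flops = out_h * out_w * out_c * (k * k * in_c + 1)
    return params, flops

# C1: 3x3 kernels (inferred from 32x32 -> 30x30), 1 input channel, 6 kernels
print(conv_stats(30, 30, 6, 3, 1))    # (60, 54000)
# C2: 5x5 kernels (15x15 -> 11x11), 6 input channels, 16 kernels
print(conv_stats(11, 11, 16, 5, 6))   # (2416, 292336)
# C3: 5x5 kernels (5x5 -> 1x1), 16 input channels, 32 kernels
print(conv_stats(1, 1, 32, 5, 16))    # (12832, 12832)
```

All three results agree with the C1, C2, and C3 rows of Table 1; the pooling rows (M1, M2) correspond to one operation per input element (30×30×6 = 5400 and 11×11×16 = 1936).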
| Number of iterations | C′1 | C′2 | C′3 | Number of parameters | FLOPS | Accuracy /% | Accuracy after retraining /% |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 0 | 6 | 16 | 32 | 18806 | 369896 | 100 | |
| 1 | 6 | 13 | 15 | 8609 | 298003 | 82.36 | 100 |
| 2 | 6 | 13 | 14 | 8177 | 297571 | 68.81 | 100 |
| 3 | 6 | 13 | 13 | 7745 | 297139 | 62.93 | 100 |
| 4 | 6 | 13 | 12 | 7313 | 296707 | 67.62 | 100 |
| 5 | 6 | 13 | 11 | 6881 | 296275 | 65.79 | 99.77 |
| 6 | 6 | 13 | 10 | 6449 | 295843 | 59.52 | 98.71 |
| 7 | 6 | 12 | 10 | 6048 | 277322 | 51.85 | 98.04 |
| 8 | 6 | 11 | 10 | 5647 | 258801 | 43.46 | 97.62 |
| 9 | 6 | 11 | 9 | 5265 | 258419 | 39.9 | 89.17 |

Table 2. Iterative process of pruning and retraining
| Network | Number of kernels | FLOPS /10⁶ | Number of parameters /10³ |
| --- | --- | --- | --- |
| VGG16 | 64(C1), 64(C2), 128(C3), 128(C4), 256(C5), 256(C6), 256(C7), 512(C8), 512(C9), 512(C10), 512(C11), 512(C12), 512(C13) | 333.19 | 34031 |
| ResNet18 | 64(C1), 64(C2), 64(C3), 64(C4), 64(C5), 128(C6), 128(C7), 128(C8), 128(C9), 256(C10), 256(C11), 256(C12), 256(C13), 512(C14), 512(C15), 512(C16), 512(C17) | 37.22 | 11230 |

Table 3. Convolution layer structures of VGG16 and ResNet18
| Network | Number of iterations | Number of kernels after pruning | FLOPS /10⁶ | Number of parameters /10³ | Accuracy /% |
| --- | --- | --- | --- | --- | --- |
| VGG16 | 18 | 39(C1), 27(C2), 86(C3), 62(C4), 93(C5), 156(C6), 0(C7), 176(C8), 0(C9), 0(C10), 0(C11), 0(C12), 0(C13) | 62.29 | 18450 | 77.96 |
| ResNet18 | 39 | 64(C1), 22(C2), 64(C3), 11(C4), 64(C5), 22(C6), 128(C7), 22(C8), 128(C9), 51(C10), 256(C11), 49(C12), 256(C13), 50(C14), 512(C15), 47(C16), 512(C17) | 9.22 | 1549 | 77.86 |

Table 4. Pruning results of VGG16 and ResNet18
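The compression in Table 4 can be summarized as reduction ratios against the unpruned structures of Table 3. A small check, using only the FLOPS and parameter counts listed in the two tables:

```python
def reduction(before, after):
    """Fraction of the original cost removed by pruning."""
    return 1 - after / before

# VGG16: FLOPS 333.19e6 -> 62.29e6, params 34031e3 -> 18450e3 (Tables 3-4)
print(f"VGG16 FLOPS cut:    {reduction(333.19, 62.29):.1%}")  # 81.3%
print(f"VGG16 params cut:   {reduction(34031, 18450):.1%}")   # 45.8%
# ResNet18: FLOPS 37.22e6 -> 9.22e6, params 11230e3 -> 1549e3
print(f"ResNet18 FLOPS cut:  {reduction(37.22, 9.22):.1%}")   # 75.2%
print(f"ResNet18 params cut: {reduction(11230, 1549):.1%}")   # 86.2%
```

ResNet18 compresses further than VGG16 here (86.2% vs. 45.8% of parameters removed) at nearly identical final accuracy (77.86% vs. 77.96%).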