Author Affiliations
School of Computer Science and Technology, Xinjiang Normal University, Urumqi, Xinjiang 830054, China
Fig. 1. Sample data of wild mushroom images. (a) Amanita exitialis; (b) Amanita fuliginea; (c) Amanita neoovoidea; (d) Amanita parvipantherina; (e) Amanita rubrovolvata; (f) Entoloma quadratum; (g) Panaeolus sphinctrinus; (h) Psilocybe coprophila; (i) Gyromitra infula; (j) Ionomidotis frondosa
Fig. 2. Effects of different data augmentation methods. (a) Original image; (b) random rotation; (c) horizontal flip; (d) color jitter; (e) Gaussian noise; (f) histogram equalization; (g) random crop
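The augmentation operations of Fig. 2 can be sketched in NumPy (a minimal illustration, not the authors' exact pipeline; the noise standard deviation and the fixed random seeds are assumptions):

```python
import numpy as np

def horizontal_flip(img):
    """Mirror the image left-to-right, as in Fig. 2(c)."""
    return img[:, ::-1]

def gaussian_noise(img, sigma=10.0, rng=None):
    """Add zero-mean Gaussian noise and clip back to the valid 8-bit range,
    as in Fig. 2(e). sigma is an illustrative choice."""
    if rng is None:
        rng = np.random.default_rng(0)
    noisy = img.astype(np.float64) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

def random_crop(img, out_h, out_w, rng=None):
    """Cut a random sub-window of fixed size, as in Fig. 2(g)."""
    if rng is None:
        rng = np.random.default_rng(0)
    h, w = img.shape[:2]
    top = rng.integers(0, h - out_h + 1)
    left = rng.integers(0, w - out_w + 1)
    return img[top:top + out_h, left:left + out_w]
```

Rotation, color jitter, and histogram equalization follow the same pattern: each transform maps one labeled image to a new labeled image, multiplying the effective size of the training set.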
Fig. 3. Structural diagram of Xception
Fig. 4. Experimental flow chart of wild mushroom species identification model
Fig. 5. Schematic diagram of the CBAM implementation
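CBAM applies a channel-attention gate followed by a spatial-attention gate to a feature map. A NumPy sketch of both gates (the weights are stand-ins for learned parameters, and the spatial gate's learned 7×7 convolution is replaced here by a plain average of the two pooled maps, purely for illustration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat, w1, w2):
    """CBAM channel attention: global average- and max-pooled channel
    descriptors pass through a shared two-layer MLP, and their sum is
    squashed into per-channel weights.
    feat: (H, W, C); w1: (C, C//r); w2: (C//r, C), r = reduction ratio."""
    avg = feat.mean(axis=(0, 1))                  # (C,)
    mx = feat.max(axis=(0, 1))                    # (C,)
    mlp = lambda v: np.maximum(v @ w1, 0) @ w2    # shared MLP, ReLU hidden
    gate = sigmoid(mlp(avg) + mlp(mx))            # (C,) in (0, 1)
    return feat * gate                            # reweight each channel

def spatial_attention(feat):
    """CBAM spatial attention: channel-wise average and max maps are fused
    into one (H, W) gate. The learned 7x7 conv is approximated here by a
    simple mean of the two maps (an illustrative stand-in)."""
    avg = feat.mean(axis=2, keepdims=True)        # (H, W, 1)
    mx = feat.max(axis=2, keepdims=True)          # (H, W, 1)
    gate = sigmoid((avg + mx) / 2.0)
    return feat * gate
```

Because both gates lie in (0, 1), CBAM can only rescale activations; it highlights informative channels and locations without changing the feature map's shape, which is why it can be inserted into Xception with a negligible parameter cost.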
Fig. 6. Comparison of three kinds of neural network structures. (a) Traditional neural network; (b) Dropout neural network; (c) Disout neural network
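The difference between the Dropout and Disout structures in Fig. 6 is what happens to a selected unit: Dropout zeroes it, while Disout displaces it with a perturbation. A simplified NumPy sketch (in the Disout function, tying the noise magnitude to the feature map's value range via `alpha` is an assumption of this sketch, not the paper's exact formulation):

```python
import numpy as np

def dropout(x, p, rng):
    """Standard (inverted) dropout: zero each unit with probability p and
    rescale the survivors by 1/(1-p) to keep the expected value."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def disout(x, p, alpha, rng):
    """Disout-style feature-map distortion (simplified): selected units are
    not zeroed but shifted by random noise scaled to the value range of the
    feature map, which perturbs features instead of discarding them."""
    mask = rng.random(x.shape) < p
    span = x.max() - x.min()
    noise = rng.normal(0.0, 1.0, x.shape) * alpha * span
    return np.where(mask, x + noise, x)
```

Both operate only at training time; at inference the feature map passes through unchanged.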
Fig. 7. Comparison of models trained with different methods. (a) Accuracy; (b) average training time
| A | B | C | D |
|---|---|---|---|
| Input 299×299×3 | Sep Conv 128, 3×3 | ReLU | ReLU |
| Conv 32, 3×3, stride 2×2 | ReLU | Sep Conv 256, 3×3 | Sep Conv 728, 3×3 |
| ReLU | Sep Conv 128, 3×3 | ReLU | ReLU |
| Conv 64, 3×3 | MaxPool 3×3, stride 2×2 | Sep Conv 256, 3×3 | Sep Conv 728, 3×3 |
| ReLU | | MaxPool 3×3, stride 2×2 | MaxPool 3×3, stride 2×2 |

Table 1. Structures of A, B, C, and D components in Xception
| E (repeated 8 times) | F | G |
|---|---|---|
| [ReLU, Sep Conv 728, 3×3] ×3 | ReLU | Sep Conv 1536, 3×3 |
| | Sep Conv 728, 3×3 | ReLU |
| | ReLU | Sep Conv 2048, 3×3 |
| | Sep Conv 1024, 3×3 | ReLU |
| | MaxPool 3×3, stride 2×2 | Global average pool |

Table 2. Structures of E, F, and G components in Xception
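The parameter savings behind the Sep Conv layers in Tables 1 and 2 can be checked with simple arithmetic (a sketch; bias terms are omitted):

```python
def conv_params(k, c_in, c_out):
    """Parameter count of a standard k×k convolution (no bias)."""
    return k * k * c_in * c_out

def sep_conv_params(k, c_in, c_out):
    """Depthwise separable convolution: one k×k depthwise filter per input
    channel, followed by a 1×1 pointwise convolution across channels."""
    return k * k * c_in + c_in * c_out

# Example: the 3×3, 728-to-728 layer repeated in Xception's middle flow (E).
standard = conv_params(3, 728, 728)       # 4,769,856 parameters
separable = sep_conv_params(3, 728, 728)  # 536,536 parameters
```

For this layer the separable form uses roughly 1/9 of the parameters of a standard convolution, which is why Xception stays compact despite its depth (see the parameter counts in Table 6).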
| Experimental number | Training structure | Top-1 /% | Top-5 /% |
|---|---|---|---|
| 1# | Original model | 94.91 | 99.30 |
| 2# | Dropout | 95.92 | 99.33 |
| 3# | Disout | 96.32 | 99.61 |

Table 3. Comparison among different feature-map disturbance forms
| Experimental number | Training method | Top-1 /% | Top-5 /% | Average training time /s |
|---|---|---|---|---|
| 1## | Random parameters | 92.10 | 98.10 | 1688.37 |
| 2## | Freezing all parameters | 87.26 | 98.34 | 599.38 |
| 3## | Freezing partial parameters | 96.47 | 99.69 | 647.91 |
| 4## | Training all network layers | 97.02 | 99.67 | 1645.39 |

Table 4. Comparison among different training methods
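The "freezing" in Table 4 amounts to excluding some pretrained parameters from gradient updates. A minimal sketch with a two-parameter toy model (the model, data, and learning rate are illustrative, not the paper's network):

```python
import numpy as np

# Toy linear model y = w1 * x + w2, trained by gradient descent on MSE.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x + 1.0

params = {"w1": 0.5, "w2": 0.0}
frozen = {"w1"}   # stands in for a pretrained layer we choose not to update
lr = 0.1

for _ in range(200):
    pred = params["w1"] * x + params["w2"]
    err = pred - y
    grads = {"w1": 2 * np.mean(err * x), "w2": 2 * np.mean(err)}
    for name, g in grads.items():
        if name not in frozen:   # frozen parameters keep their values
            params[name] -= lr * g

# w1 is untouched; only w2 was trained.
```

Skipping the frozen parameters' gradient computation and updates is what shortens the average training time for experiments 2## and 3## relative to full training.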
| Experimental number | Training set ratio | Validation set ratio | Top-1 /% | Top-5 /% |
|---|---|---|---|---|
| 1-1 | 5 | 5 | 94.23 | 98.91 |
| 2-1 | 6 | 4 | 95.15 | 98.99 |
| 3-1 | 7 | 3 | 95.67 | 99.36 |
| 4-1 | 8 | 2 | 96.32 | 99.61 |

Table 5. Comparison among different training/validation set proportions
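The split ratios compared in Table 5 can be produced by shuffling sample indices and slicing (a generic sketch; the paper's exact splitting procedure is not specified):

```python
import numpy as np

def split_dataset(n_samples, train_ratio, seed=0):
    """Shuffle sample indices and split them into disjoint
    training and validation index sets by the given ratio."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    n_train = int(round(n_samples * train_ratio))
    return idx[:n_train], idx[n_train:]

# The best-performing 8:2 split of Table 5, on a hypothetical 1000 samples.
train_idx, val_idx = split_dataset(1000, 0.8)
```

Shuffling before slicing matters: it keeps the class distribution of the two subsets comparable, so the validation accuracy reflects the split ratio rather than an ordering artifact.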
| Model | Top-1 /% | Top-5 /% | Number of parameters /10⁶ |
|---|---|---|---|
| AlexNet | 86.24 | 97.94 | 61.10 |
| ResNet50 | 89.39 | 98.64 | 25.64 |
| ResNet101 | 91.64 | 98.69 | 44.71 |
| ResNet152 | 92.36 | 99.32 | 60.42 |
| InceptionV1 | 90.64 | 98.85 | 13.02 |
| InceptionV3 | 93.16 | 99.09 | 23.93 |
| InceptionResNetV2 | 95.41 | 99.28 | 55.87 |
| Xception | 95.58 | 99.43 | 22.99 |
| Dis-Xception-CBAM | 96.32 | 99.61 | 24.04 |

Table 6. Comparison among different model experiments