Author Affiliations
1. College of Computer Science, Sichuan University, Chengdu 610065, Sichuan, China
2. National Key Laboratory of Fundamental Science on Synthetic Vision, Sichuan University, Chengdu 610065, Sichuan, China
Fig. 1. Overall architecture of AMCNet
Fig. 2. Illustration of the aggregation of local features in the pyramid feature extractor
Fig. 3. GAB. (a) Improved cross-attention module; (b) self-attention module; (c) channel-attention SELayer module (⊕ denotes element-wise addition and ⊙ denotes the Hadamard product)
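Fig. 3(c) names a channel-attention SELayer module. For reference, below is a minimal PyTorch sketch of the standard squeeze-and-excitation design applied to point features; the (B, C, N) feature layout and the reduction ratio are assumptions, not configuration taken from the paper.

```python
import torch
import torch.nn as nn

class SELayer(nn.Module):
    """Channel attention (squeeze-and-excitation) block, as in Fig. 3(c).

    Minimal sketch for point features shaped (batch, channels, num_points);
    the reduction ratio and placement inside the GAB are assumptions.
    """

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Squeeze: global average pool over the point dimension.
        weights = self.fc(x.mean(dim=-1))   # (B, C)
        # Excite: rescale each channel (the Hadamard product in the caption).
        return x * weights.unsqueeze(-1)    # (B, C, N)
```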
Fig. 4. Structure of the point generator
Fig. 5. Visualization of completion results of different networks on the PCN dataset
Fig. 6. Visualization of completion results of different networks on the chair class
Fig. 7. Visualization of completion results for different input resolutions
Fig. 8. Visualization of completion results for different point cloud block sizes
| Model | Average | Plane | Cabinet | Car | Chair | Lamp | Couch | Table | Boat |
|---|---|---|---|---|---|---|---|---|---|
| FoldingNet | 14.31 | 9.49 | 15.80 | 12.61 | 15.55 | 16.41 | 15.97 | 13.65 | 14.99 |
| PCN | 9.64 | 5.50 | 22.70 | 10.63 | 8.70 | 11.00 | 11.34 | 11.68 | 8.59 |
| GRNet | 8.83 | 6.45 | 10.37 | 9.45 | 9.41 | 7.96 | 10.51 | 5.44 | 8.04 |
| PMP-Net | 8.73 | 5.65 | 11.24 | 9.64 | 9.51 | 6.95 | 10.83 | 8.72 | 7.25 |
| PoinTr | 8.38 | 4.75 | 10.47 | 8.68 | 9.39 | 7.75 | 10.93 | 7.75 | 7.29 |
| SnowflakeNet | 7.21 | 4.29 | 9.16 | 8.08 | 7.89 | 6.07 | 9.23 | 6.55 | 6.40 |
| PointAttN | 6.86 | 3.87 | 9.00 | 7.63 | 7.43 | 5.90 | 8.68 | 6.32 | 6.09 |
| Ours | 6.45 | 3.58 | 8.74 | 7.36 | 6.86 | 5.28 | 8.32 | 5.88 | 5.69 |
Table 1. Point cloud completion comparison on the PCN dataset in terms of CD ×10³ (lower is better)
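The CD values in Tables 1–4 are Chamfer Distances between the completed and ground-truth point clouds. A minimal sketch of the metric, for reference: benchmarks differ on whether the unsquared (ℓ1-style) or squared (ℓ2) variant is reported, so the `squared` flag below is an assumption rather than the paper's exact evaluation code.

```python
import torch

def chamfer_distance(pred: torch.Tensor, gt: torch.Tensor,
                     squared: bool = False) -> torch.Tensor:
    """Bidirectional Chamfer Distance between pred (B, N, 3) and gt (B, M, 3).

    squared=False averages Euclidean distances (the l1-style variant);
    squared=True averages squared distances. Which variant a benchmark
    reports varies, so this choice is an assumption.
    """
    dist = torch.cdist(pred, gt)        # (B, N, M) pairwise Euclidean distances
    if squared:
        dist = dist ** 2
    # Nearest-neighbour distance in each direction, averaged over points.
    cd = dist.min(dim=2).values.mean(dim=1) + dist.min(dim=1).values.mean(dim=1)
    return cd.mean()                    # scalar, averaged over the batch
```

Multiplying the returned value by 10³ gives numbers on the scale reported in the tables.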
| Resolution | 2048 | 1024 | 512 | 256 |
|---|---|---|---|---|
| CD (×10³) | 6.45 | 6.54 | 6.66 | 7.23 |
Table 2. Effect of the resolution of the input point cloud
| Block size | K=16 | K=32 | K=64 |
|---|---|---|---|
| CD (×10³) | 6.48 | 6.45 | 6.47 |
Table 3. Effect of point cloud block size
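Table 3 varies the neighbourhood size K used to form local point cloud blocks. A minimal sketch of K-nearest-neighbour grouping under that reading; how AMCNet's pyramid feature extractor consumes these neighbourhoods is an assumption here.

```python
import torch

def knn_group(points: torch.Tensor, k: int = 32) -> torch.Tensor:
    """Group each point with its K nearest neighbours.

    points: (B, N, 3). Returns neighbour indices of shape (B, N, K).
    K=32 follows the best setting in Table 3; the downstream use of
    these local blocks is an assumption, not taken from the paper.
    """
    dist = torch.cdist(points, points)  # (B, N, N) pairwise distances
    # Indices of the K closest points (each point includes itself).
    return dist.topk(k, dim=-1, largest=False).indices
```

With K=32, each of the N points thus anchors a 32-point local block, balancing local detail (small K) against context (large K), which is consistent with the shallow optimum in Table 3.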
| Model | SELayer | Skip connection | CD (×10³) |
|---|---|---|---|
| A | | | 6.61 |
| B | √ | | 6.58 |
| C | | √ | 6.49 |
| AMCNet | √ | √ | 6.45 |
Table 4. Effect of the attention module