• Acta Photonica Sinica
  • Vol. 51, Issue 6, 0610003 (2022)
Haixia WANG1, Lubin SHAN2, Qiaoling PANG1, and Yilong ZHANG1,*
Author Affiliations
  • 1College of Computer Science and Technology,Zhejiang University of Technology,Hangzhou 310000,China
  • 2College of Information Engineering,Zhejiang University of Technology,Hangzhou 310000,China
    DOI: 10.3788/gzxb20225106.0610003
    Haixia WANG, Lubin SHAN, Qiaoling PANG, Yilong ZHANG. Phase Extraction Method of Fingerprint Fringe Pattern Based on Convolutional Neural Network[J]. Acta Photonica Sinica, 2022, 51(6): 0610003

    Abstract

    Fingerprint is the most widely used category among many identity features. Conventional fingerprint capturing devices use contact-based 2D measurement. The uncontrollability of the pressing force during measurement and the residue left by previous collections result in unsatisfactory fingerprint quality and certain security problems. Non-contact 3D fingerprint acquisition technology has attracted research attention due to its high security and abundant fingerprint information. Fringe Projection Profilometry (FPP) is a widely used non-contact 3D measurement technique, based on which a 3D fingerprint acquisition system is built in this paper. Phase demodulation is a key step of FPP, where the phase shifting method is widely used. The phase shifting method achieves high accuracy in the 3D measurement of static objects: the higher the number of phase-shifting steps, the higher the quality of the 3D information obtained. However, in fingerprint measurement, human fingers involuntarily shake due to heartbeat and fatigue, which introduces potential defects when the phase shifting method is used. How to shorten the acquisition time while maintaining high-precision phase extraction is therefore a central problem in this field. In recent years, with the continuous improvement of computation power and cloud resources, applying artificial intelligence technology to realize intelligent data processing has become a brand-new solution. Therefore, a convolutional neural network with a wrapping-aware loss is proposed in this paper to extract the phase from a single fringe pattern. The network consists of an encoder, a residual module and a decoder, and extracts the wrapped phase map directly from the fringe pattern. Two factors are considered during the network design. Firstly, the fingerprint is relatively subtle: the variations of fingerprint details are small compared with the variations of finger shape in the phase map. 
The normalization operation, which is usually used in neural networks to speed up convergence, and the current mainstream Mean Square Error (MSE) loss function both cause the fingerprint details to be blurred and smoothed, and thus need to be avoided. Secondly, in the 2π discontinuity regions of the wrapped phase map, a small phase difference causes a large error in the loss function, which misguides the optimization process and degrades the overall fingerprint details. Therefore, during the network design, instead of normalization, this paper adds nine residual blocks between the encoder and decoder to improve the network training speed. In the loss function design, this paper proposes a new wrapping-aware loss function, a fusion of the Mean Absolute Error (MAE) and a sine function error. The MAE better preserves subtle details, while the sine function error effectively reduces the sharp error caused by the sudden change in the 2π discontinuity region. Furthermore, considering that most of the acquired fringe image is background, the loss function is calculated only over the fingerprint region of interest. Experiments are carried out to test the performance of the proposed method. The four-step phase-shifting results are used as ground truth. The proposed method is compared with the conventional Fourier Transform (FT), the Windowed Fourier Transform (WFT), HU's phase extraction network and FENG's phase extraction network. Firstly, the MAEs of the phase differences are calculated. The MAEs are 0.394 7, 0.341 7, 0.165 1, 0.179 2 and 0.083 9 for the FT, the WFT, HU's method, FENG's method and the proposed method, respectively. Meanwhile, the MAEs at the 2π discontinuity region are 0.396 5, 0.355 4, 0.239 6, 0.370 4 and 0.104 9, respectively. The proposed method achieves the best performance. 
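A minimal sketch of a wrapping-aware loss along these lines, combining an MAE term with a sine-based term over a fingerprint region-of-interest mask. The fusion weight `lam` and the exact form of the combination are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def wrapping_aware_loss(pred, target, mask, lam=1.0):
    """Fused loss over the fingerprint region of interest.

    The MAE term preserves subtle detail; the |sin(error)| term shrinks
    the penalty when the error is close to a multiple of 2*pi, i.e. at
    the wrapped-phase 2*pi discontinuities. The weight `lam` and the
    exact fusion are assumptions for illustration.
    """
    diff = (pred - target)[mask]          # restrict to the fingerprint ROI
    mae = np.mean(np.abs(diff))
    sine = np.mean(np.abs(np.sin(diff)))  # periodic: a ~2*pi error is cheap
    return mae + lam * sine

# A pixel where the prediction misses a 2*pi wrap:
target = np.array([3.1, -3.1])  # ground truth wraps from +pi side to -pi side
pred = np.array([3.1, 3.1])     # second pixel is off by ~2*pi
mask = np.ones(2, dtype=bool)
loss = wrapping_aware_loss(pred, target, mask)
# The MAE term is ~3.1, while the sine term sees only ~0.04 for the
# near-2*pi error, so the wrap is not punished as a sharp outlier.
print(loss)
```

The same masking idea implements the paper's point that most of the fringe image is background: pixels outside the fingerprint region of interest simply never enter the loss.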
Secondly, since images may be distorted during neural network processing, it is necessary to evaluate the similarity between the obtained results and the ground truth. The Structural Similarity (SSIM) and histogram similarity indicators are used for comparison. The two indicators are 0.980 7 and 0.866 1 for the proposed method, respectively, closer to the four-step phase-shifting results than those of the other four comparison methods. Thirdly, in the evaluation of different loss functions, the MSE loss function, the MAE loss function and the fused loss function are compared. They achieve MAEs of 0.157 8, 0.084 7 and 0.083 9, respectively, over the whole fingerprint, and 0.206 1, 0.120 9 and 0.104 9 at the 2π discontinuity region. The results also show that the loss function proposed in this paper effectively improves the quality of the network prediction, and is suitable for images with subtle features such as fingerprints. Finally, the fingerprints estimated with the proposed method, HU's method, FENG's method and the four-step phase shifting method are visualized. The figures show that the fingerprint obtained with the proposed method outperforms those of HU's method and FENG's method, with clear ridges and valleys. In summary, the convolutional neural network constructed in this paper and the proposed wrapping-aware loss function achieve high phase extraction accuracy while retaining fingerprint details.
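For reference, the four-step phase-shifting demodulation used as ground truth throughout the comparisons above can be sketched with a synthetic example. The standard four-step relation for π/2 shifts is assumed; the intensity constants are arbitrary:

```python
import numpy as np

def wrapped_phase_four_step(i1, i2, i3, i4):
    """Wrapped phase from four fringe images with pi/2 phase shifts.

    With I_k = A + B*cos(phi + k*pi/2), the standard four-step relation
    is phi = atan2(I4 - I2, I1 - I3), wrapped to (-pi, pi].
    """
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringes over a known phase ramp inside (-pi, pi):
phi = np.linspace(-3.0, 3.0, 100)
frames = [128 + 100 * np.cos(phi + k * np.pi / 2) for k in range(4)]
est = wrapped_phase_four_step(*frames)
print(np.allclose(est, phi))  # True: the ramp is recovered exactly
```

A single-shot method such as the proposed network replaces the four captures with one, which is what removes the finger-motion sensitivity discussed above, at the cost of having to learn this demodulation from data.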