• Chinese Optics Letters
  • Vol. 13, Issue 8, 081101 (2015)
Kun Liu, Changhe Zhou*, Shaoqing Wang, Shengbin Wei, and Xin Fan
Author Affiliations
  • Laboratory of Information Optics and Optoelectronics Techniques, Shanghai Institute of Optics and Fine Mechanics, University of Chinese Academy of Sciences, Shanghai 201800, China
    DOI: 10.3788/COL201513.081101
    Kun Liu, Changhe Zhou, Shaoqing Wang, Shengbin Wei, Xin Fan. 3D shape measurement of a ground surface optical element using band-pass random patterns projection[J]. Chinese Optics Letters, 2015, 13(8): 081101

    Abstract

    For manufacturing a fine optical glass lens, it is important to obtain the 3D profile of a semi-finished product with a rough surface. We develop an active binocular 3D scanning setup to measure the 3D profile of a rough-surface optical element. Two cameras simultaneously capture the band-pass binary random patterns that are projected onto the target object. The highlight of this system is the use of the temporal correlation technique to determine the stereo correspondence between the pixels of the two cameras. The 3D point cloud can be reconstructed by the triangulation principle. Experimental results confirm that this method effectively measures the rough surface of an optical element with sufficient accuracy.

    An optical glass aspheric lens is usually made by grinding and polishing: point-contact contouring roughly shapes the surface, which is then polished to its final form. During the manufacturing process, the 3D shape of the semi-finished component must be known for the next production step[1].

    A white light interferometer is a noncontact optical instrument for surface-height measurement of 3D structures with surface profiles varying from tens of nanometers to a few centimeters[2]. However, it is not suitable for measuring the rough surface of an optical element. A contact profilometer is a surface-profile measuring instrument that uses a diamond stylus: the stylus is brought into vertical contact with the sample and then moved laterally across it over a specified distance with a specified contact force. Nevertheless, it is difficult for a contact profilometer to obtain dense point clouds. Both of the aforementioned instruments have very high accuracy, yet neither satisfies the practical requirement of 3D profile measurement of a rough optical element.

    In this work, we develop a noncontact active binocular 3D scanner[3,4] to measure the 3D profile of a glass optical element with a rough surface. The temporal correlation technique (TCT) is used to determine the correspondence between the two cameras. This scanner can measure the 3D profile of a rough-surface glass lens with sufficient accuracy.

    Figure 1 shows a schematic view of our 3D scanner. The setup employs two 1.3 MP FireWire cameras (1280×960, pixel size 3.75 μm) and a portable optical engine (800×600). The focal length of the camera lenses is 8 mm. The optical engine contains an LED, an illumination lens group, a polarization beam splitter, a liquid crystal on silicon panel, and a projection lens[5]. The distance between a camera and the measured object is about 70 cm. The system was calibrated before the experiment.

    Figure 1. Schematic view of the active binocular 3D measurement system.
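    The Letter does not detail the calibration procedure. The sketch below shows only one plausible way to calibrate such a binocular setup with OpenCV, assuming a printed checkerboard with 9×6 inner corners, 10 mm squares, and image pairs named left_*.png / right_*.png; all of these are placeholder assumptions, not values from the original work.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)   # inner corners of the assumed checkerboard
SQUARE = 10.0      # assumed square size (mm)

# 3D coordinates of the checkerboard corners in the board frame.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, pts_l, pts_r = [], [], []
for fl, fr in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    gl = cv2.imread(fl, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(fr, cv2.IMREAD_GRAYSCALE)
    ok_l, c_l = cv2.findChessboardCorners(gl, PATTERN)
    ok_r, c_r = cv2.findChessboardCorners(gr, PATTERN)
    if ok_l and ok_r:
        obj_pts.append(objp)
        pts_l.append(c_l)
        pts_r.append(c_r)

size = gl.shape[::-1]  # (width, height)
_, K1, D1, _, _ = cv2.calibrateCamera(obj_pts, pts_l, size, None, None)
_, K2, D2, _, _ = cv2.calibrateCamera(obj_pts, pts_r, size, None, None)
# Keep the per-camera intrinsics fixed and estimate the rotation R and
# translation T between the two cameras (needed later for rectification).
_, K1, D1, K2, D2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, pts_l, pts_r, K1, D1, K2, D2, size, flags=cv2.CALIB_FIX_INTRINSIC)
```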

    The camera:projector pixel ratio is defined as one camera pixel divided by the number of projector pixels that it sees. Often, a single camera pixel corresponds to a linear combination of two or more adjacent projector pixels, which is known as a low camera-projector pixel ratio. The Gray code method degrades as this ratio decreases, because the least significant bits become too blurred to be recovered and are simply discarded[6]. In our setup, the camera:projector pixel ratio was set close to 1.
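    As a back-of-the-envelope illustration (the surface footprints below are assumed for the example, not taken from the Letter), the ratio can be estimated from the sizes of one camera pixel and one projector pixel on the object surface:

```python
# Hypothetical footprints on the object surface, chosen so the ratio is ~1:
# the 800-column pattern is assumed to span ~320 mm and the 1280-column camera
# image ~512 mm at the working distance.
proj_pixel_mm = 320.0 / 800     # projector pixel footprint (mm)
cam_pixel_mm = 512.0 / 1280     # camera pixel footprint (mm)
projector_pixels_seen = cam_pixel_mm / proj_pixel_mm
ratio = 1.0 / projector_pixels_seen
print(f"one camera pixel sees {projector_pixels_seen:.2f} projector pixels; ratio = {ratio:.2f}")
```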

    As shown in Fig. 2, the rectified stereo configuration is viewed along the direction of the row axis of the images, i.e., the y-axis of the camera coordinate system. The depth of Pw depends mainly on the disparity between P1 and P2 and is given by

    $$z=\frac{fb}{d_W}=\frac{fb}{S_P\,d_P},\tag{1}$$

    where b is the baseline length, f is the focal length, and d_W is the disparity between P1 and P2. Since the image coordinates are given in pixels while d_W is given in world units, d_W is converted to pixel units by scaling it with the pixel size S_P, i.e., d_P = d_W/S_P. From Fig. 2 and Eq. (1), the lateral resolution is determined mainly by the pixel size and the focal length, whereas the longitudinal resolution also depends on the baseline length and the angles of the chief rays:

    $$\Delta x=\frac{zS_P}{f},\tag{2}$$

    $$\Delta z=\frac{b}{f}\cdot\frac{S_P}{(\tan\theta+\tan\beta)^{2}}.\tag{3}$$

    In our setup, the pixel-level lateral resolution is 0.4 mm and the longitudinal resolution is about 0.8 mm. The longitudinal resolution can be enhanced by replacing the lens with one of longer focal length, and the resolution can be further improved by sub-pixel interpolation[7]. As a reference, the theoretical lateral resolution is 0.04 mm when an 8 mm focal-length lens is used and the precision of the sub-pixel interpolation is 0.1 pixels. Here, θ and β are the angles between the principal rays of the two cameras and the normal of the baseline, as shown in Fig. 2.

    Figure 2. Rectified stereo configuration of the binocular 3D measurement system.
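    As a quick numerical check of Eqs. (1)-(3), the short sketch below evaluates the disparity and resolutions for the stated setup values (f = 8 mm, pixel size 3.75 μm, working distance about 700 mm). The baseline b is not given in the Letter; b = 290 mm is an assumed value used only for illustration.

```python
# Only f, S_P, and z come from the text; the baseline b is a guess for illustration.
f = 8.0          # focal length (mm)
S_P = 3.75e-3    # pixel size (mm)
z = 700.0        # working distance (mm)
b = 290.0        # assumed baseline (mm)

tan_sum = b / z                   # symmetric case: tan(theta) + tan(beta) = b/z
d_W = f * b / z                   # disparity in world units, from Eq. (1)
d_P = d_W / S_P                   # disparity in pixels, d_P = d_W / S_P
dx = z * S_P / f                  # lateral resolution, Eq. (2)
dz = (b / f) * S_P / tan_sum**2   # longitudinal resolution, Eq. (3)
print(f"d_P = {d_P:.0f} px, dx = {dx:.2f} mm, dz = {dz:.2f} mm")
```

    With these assumed numbers the sketch gives Δx ≈ 0.33 mm and Δz ≈ 0.79 mm, of the same order as the 0.4 mm and 0.8 mm quoted above.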

    The surface of the optical element under test is rough and translucent, so when a laser line is used to scan it, scattering prevents clear images from being obtained. Wiegmann et al. introduced band-pass random patterns[8] to suppress the negative influence of binary pixelized patterns on 3D reconstruction. Schaffer et al. further showed that the reconstruction quality is improved by using binary band-pass random patterns[9]: benefiting from better contrast, they deliver the most point matches as well as the lowest noise level. These previous works[8,9] focused on 3D measurement of human faces; we recognized that the TCT is also an excellent method for measuring a rough-surface optical element.
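    For concreteness, a minimal sketch of how such binary band-pass random patterns can be generated is given below; this is our illustration, not the authors' code. White noise is band-pass filtered in the Fourier domain and then binarized at its median. The pattern size matches the 800×600 projector, while the band limits f_lo and f_hi are assumed values.

```python
import numpy as np

def bandpass_binary_pattern(height=600, width=800, f_lo=0.02, f_hi=0.08, seed=0):
    """Binary band-pass random pattern: filtered white noise, thresholded at its median."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((height, width))
    fy = np.fft.fftfreq(height)[:, None]        # vertical spatial frequencies (cycles/pixel)
    fx = np.fft.fftfreq(width)[None, :]         # horizontal spatial frequencies
    radius = np.sqrt(fx**2 + fy**2)
    band = (radius >= f_lo) & (radius <= f_hi)  # annular band-pass mask
    filtered = np.fft.ifft2(np.fft.fft2(noise) * band).real
    return np.where(filtered > np.median(filtered), 255, 0).astype(np.uint8)

# 15 statistically independent patterns, as used in the measurement sequence.
patterns = [bandpass_binary_pattern(seed=k) for k in range(15)]
```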

    The process of finding homologous points in the left and right camera images can be regarded as template matching. Traditional template matching methods, such as the sum of absolute intensity differences (SAD) or normalized cross correlation (NCC), simply perform the matching in the spatial domain. With such areal correlation techniques, however, the deformation caused by the different camera viewing angles leads to many false matches and poor 3D reconstruction precision. This problem can be solved by reducing the correlation window to 1 pixel in the spatial domain and extending it to N pixels in the temporal domain, the so-called TCT[10], as shown in Fig. 3.

    Figure 3. Corresponding pixels with similar gray-value vectors using temporal correlation.

    Figure 4. Binarized image of the object to be measured.

    Left-right pixel correspondences can then be established directly and triangulated to reconstruct the scene depth. Because it also tolerates uniform brightness variations, zero-mean normalized cross-correlation-based matching is one of the most popular similarity measures. For the TCT-based stereo matching problem, it is given by

    $$\mathrm{TCT}(x,y,d)=\frac{\sum_{t=1}^{N}\left[I_L(x,y,t)-M_L\right]\cdot\left[I_R(x+d,y,t)-M_R\right]}{S_L(x,y)\cdot S_R(x+d,y)}.\tag{4}$$

    The numerator of Eq. (4) is the cross correlation between the two temporal intensity vectors, and M_i and S_i (i = L or R) denote the mean intensity value and the standard deviation of the temporal intensity vector in the left and right images, respectively. Two pixels are considered homologous when the TCT value exceeds a threshold, e.g., 0.9, and reaches a maximum.
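    A minimal sketch of this temporal correlation matching, written as our illustration of Eq. (4) rather than the authors' implementation, is shown below. It searches along the epipolar row of the rectified right image for the disparity that maximizes the zero-mean normalized temporal correlation; normalizing by the vector norms is equivalent to dividing by N·S_L·S_R, so the score lies in [-1, 1] and a threshold such as 0.9 can be applied directly. Sub-pixel refinement and consistency checks are omitted.

```python
import numpy as np

def temporal_match(left_stack, right_stack, y, x, d_min, d_max, thresh=0.9):
    """Return the disparity maximizing the temporal correlation of Eq. (4), or None.

    left_stack, right_stack: (N, H, W) arrays holding the N rectified pattern images.
    """
    v_l = left_stack[:, y, x].astype(np.float64)
    v_l -= v_l.mean()
    n_l = np.linalg.norm(v_l)
    best_d, best_score = None, thresh
    for d in range(d_min, d_max + 1):
        if not 0 <= x + d < right_stack.shape[2]:
            continue
        v_r = right_stack[:, y, x + d].astype(np.float64)
        v_r -= v_r.mean()
        denom = n_l * np.linalg.norm(v_r)
        if denom == 0:
            continue
        score = float(v_l @ v_r) / denom   # zero-mean normalized temporal correlation
        if score > best_score:
            best_score, best_d = score, d
    return best_d, best_score
```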

    Consequently, a series of 15 binary band-pass random patterns is used to encode the surface of the object. Each time a pattern is projected onto the object, the two cameras capture images simultaneously, so that 15 pairs of stereo images are obtained. Rectification is then performed to ensure that the epipolar line for a point is simply the line with the same row coordinate as the point[11]. After the rectified images are obtained, a series of operations, including extraction of the region of interest, self-adapting binarization, initial matching, and exhaustive matching, is performed to reconstruct the dense 3D point cloud. Figure 4 shows the result of the self-adapting binarization, which eliminates the adverse effects caused by uneven lighting and defocus; this procedure is described in detail in Ref. [12]. The 3D reconstruction result is shown in Fig. 5. The point cloud consists of more than 3.6×10^5 points. The measured spherical radius is 221.8 mm, the absolute error over the full field is less than 0.3 mm, and the standard deviation is smaller than 60 μm.

    Figure 5. 3D point cloud of the target object.
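    As an illustration of the first two steps of this pipeline (rectification and self-adapting binarization), the sketch below uses OpenCV and assumes that the calibration results K1, D1, K2, D2, R, T are available; the local-mean block size is an assumed value, and the matching and triangulation steps are not shown.

```python
import cv2

def rectify_and_binarize(img_l, img_r, K1, D1, K2, D2, R, T, block=21):
    """Rectify a grayscale stereo pair and apply local-mean (self-adapting) binarization."""
    size = (img_l.shape[1], img_l.shape[0])
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
    m1l, m2l = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
    m1r, m2r = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
    rect_l = cv2.remap(img_l, m1l, m2l, cv2.INTER_LINEAR)
    rect_r = cv2.remap(img_r, m1r, m2r, cv2.INTER_LINEAR)
    # Thresholding against a local mean suppresses uneven illumination and
    # projector defocus, yielding binary images like Fig. 4.
    bin_l = cv2.adaptiveThreshold(rect_l, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                  cv2.THRESH_BINARY, block, 0)
    bin_r = cv2.adaptiveThreshold(rect_r, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                  cv2.THRESH_BINARY, block, 0)
    return bin_l, bin_r, Q   # Q can be used to reproject disparities to 3D points
```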

    In conclusion, we have developed a noncontact active 3D scanner to measure the 3D profile of a rough-surface glass optical element. Two cameras simultaneously capture the band-pass binary random patterns projected onto the target object, and the highlight of the system is the use of the TCT to determine the stereo correspondence between the pixels of the two cameras. The 3D point cloud is then reconstructed by the triangulation principle. This scanner can measure a rough-surface glass lens with sufficient accuracy and should be a useful optical apparatus for practical applications.
