Published: Thursday, August 29, 2019  20:22
Fringe projection techniques offer fast, noncontact measurements of the surface form of manufactured parts. Fringe projection has seen successful implementation in the automotive, aerospace, and medical industries. Recently, advances in fringe projection have reduced the sensitivity of the measurement system to effects such as multiple surface reflections and projector defocus. Typically, the measurement method is altered to optimize the system for specific measurement conditions, without any regard for quantifying the effects of influence factors. Furthermore, there is no standardized calibration framework for fringe projection systems and uncertainty evaluation of surface measurements is rarely carried out in practice, which places some restrictions on the use of this technique in manufacturing industry.
Fringe projection systems detect the intensity of a projected fringe pattern that is reflected from the surface to be measured. Any process that alters the intensity of this pattern received by the camera will change the measurement outcome. Therefore, fringe projection systems typically have many influence factors that affect the measurement outcome, including the surface characteristics (e.g., optical properties and topography), imaging optics (e.g., defocus and aberrations), and external factors (e.g., ambient light intensity level, mechanical vibration, and temperature). The complexity of the measurement model makes current calibration methods given in ISO 15530 (for contact coordinate measuring machines) unsuitable for fringe projection. Additionally, it is unclear how to apply the calibration method in ISO 25178 for areal surface topography measuring instruments. A calibration framework for estimating spatial-frequency-dependent measurement uncertainty built on solid theoretical foundations is required.
To move towards a traceable surface measurement using fringe projection techniques, we are developing a measurement model to accurately predict the captured image and include all major uncertainty contributors. The first step of the model is to describe the optical field distribution within the projection volume of the projector by considering its point spread function in three dimensions. The optical field distribution is sampled at surface locations, using a ray tracing algorithm to map intensity values to corresponding camera pixels. The results are validated by comparing to an experimental fringe projection system with carefully controlled parameters. The intention is to use this simulation within a Monte Carlo framework to create an uncertainty map of the phase image that can be used to estimate the uncertainty at each point-cloud data point. Additionally, the model will give insights into the relationships between influence factors, allowing the implementation of improvements to fringe projection systems.
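The Monte Carlo step outlined above can be sketched in code. In this illustrative Python example (not the authors' implementation), `simulate_phase` is a hypothetical stand-in for the full image formation model, and the choice of perturbed influence factors (a defocus noise level and an ambient light offset) and their distributions are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_phase(defocus, ambient, shape=(32, 32)):
    """Hypothetical stand-in for the fringe projection forward model:
    returns a phase image given perturbed influence factors."""
    x = np.arange(shape[1])
    phase = 2 * np.pi * x / shape[1]                 # ideal fringe phase ramp
    noise = defocus * rng.standard_normal(shape) + ambient
    return phase + noise                             # broadcasts to `shape`

# Monte Carlo: draw influence factors, collect per-pixel phase statistics
N = 200
samples = np.stack([
    simulate_phase(defocus=abs(rng.normal(0.0, 0.01)),
                   ambient=rng.normal(0.0, 0.005))
    for _ in range(N)
])
uncertainty_map = samples.std(axis=0)   # per-pixel phase standard deviation
```

The per-pixel standard deviation plays the role of the uncertainty map of the phase image; in the intended framework, each sample would instead come from a full run of the image formation model.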
Fringe projection is a three-dimensional optical measurement technique that measures surface topography and geometrical dimensions of a part and is increasing in popularity in the aeronautics, automotive, and medical industries.^{1,2} Fringe projection uses a camera and a projector to take relatively quick measurements of surfaces, describing them as high-density point clouds. A pattern is projected onto a measurement surface. The camera, offset from the projector, records the image of the projected pattern, which has become distorted due to the surface geometry. The camera image is decoded to give correspondence between the camera and projector reference frames, allowing points to be triangulated between the images. Many different fringe projection techniques exist which project different patterns, all optimized for specific measurement scenarios.^{3,4}
Unlike other coordinate measurement systems, such as contact coordinate measuring machines (CMMs), there is no standardized calibration framework for fringe projection systems, and this restricts the use of this technique in manufacturing industry. Fringe projection measurements depend on surface characteristics, e.g., optical properties and topography, making current calibration methods in parts one to three of ISO 15530^{5} (for contact CMMs) unsuitable for fringe projection. Performance verification standards, such as VDI/VDE 2634^{6} or the draft ISO 10360-13,^{7} currently assume Lambertian reflection on the surface. Also, it is unclear how to apply the calibration approach in ISO 25178^{8} for areal surface topography measuring instruments. Given an accurate measurement model, an uncertainty evaluation using ISO 15530-4^{9} is promising, but it is difficult to create a rigorous model.
Sensitivity to small changes in intensity, a complicated image-decoding process, and the large number of fringe projection configurations make quantifying individual influence factors difficult in fringe projection systems. Fringe projection systems can operate from millimeter-range surface topography measurements^{10,11} to larger scales of meters and above.^{12} Current methods to evaluate fringe projection uncertainty are limited to geometrical optics only, ignoring diffraction and surface effects.^{13} Fringe projection measurement models generally assume specific surface optical characteristics, yet it is well understood that these characteristics strongly influence the measurement outcome.^{14–16}
Any rigorous measurement model that accurately describes fringe projection must take into account both surface characteristics and diffraction effects. Recently, there has been some progress in using linear systems theory to understand and model optical surface topography instruments.^{17–19} These instruments tend to work at smaller scales, where the numerical aperture (NA) is large enough that diffraction effects are significant within the working measurement volume. Previously, a 2D instrument transfer function was used to characterize the spatial resolution of a fringe projection system.^{16} The instrument transfer function was valid under the assumptions that the change in surface height was much smaller than the working distance of the camera, and that the amplitudes of the surface spatial frequency components were smaller than the surface height linearity limits. In this paper, a measurement model based on geometrical optics and linear systems theory is used to model a fringe projection system, with the aim of using it within a Monte Carlo simulation in the future.
The projector in a fringe projection system forms an image of a pattern, e.g., a 2D sinusoid, on the surface of an object. For objects that cannot be placed entirely within the depth of focus, it is important to model the entire output intensity distribution. In this work, we attempt to use 3D imaging theory to model the imaging process in a fringe projection system. Under linear systems theory, the intensity distribution at position r in the projection volume, I_{out}(r), can be described as:
I_{out}(r) = I_{in}(r) ⊗ h(r), (Equation 1)
where I_{in}(r) and h(r) are the input intensity distribution and point spread function, respectively. The Fourier transform of Equation 1 gives:
Ĩ_{out}(k) = H(k)Ĩ_{in}(k), (Equation 2)
where k is the spatial frequency vector. Assuming the projector is a shift-invariant and incoherent system, its optical transfer function (OTF), given by H(k), is similar to that of a microscope.^{20}
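Equations 1 and 2 can be evaluated with fast Fourier transforms. The Python sketch below (the model in this work runs in MATLAB; this is an independent illustration) uses a separable 3D Gaussian as a stand-in for the projector point spread function, an assumption made purely for demonstration:

```python
import numpy as np

def propagate(I_in, h):
    """Equation 1 as a frequency-domain product (Equation 2):
    I_out = F^-1{ F{I_in} . H }, where H = F{h} is the OTF."""
    H = np.fft.fftn(np.fft.ifftshift(h))       # OTF from the centered PSF
    return np.real(np.fft.ifftn(np.fft.fftn(I_in) * H))

# assumed 3D Gaussian PSF, centered on the grid, normalized to unit energy
shape = (16, 32, 32)                           # (z, y, x) samples
zz, yy, xx = np.meshgrid(*[np.arange(n) - n // 2 for n in shape], indexing="ij")
h = np.exp(-(xx**2 + yy**2) / 8.0 - zz**2 / 32.0)
h /= h.sum()

# input: vertical sinusoidal fringes, constant along y and z
I_in = 0.5 + 0.5 * np.cos(2.0 * np.pi * xx / 8.0)
I_out = propagate(I_in, h)                     # blurred fringes, lower contrast
```

Because the PSF is normalized, the total intensity is preserved while the fringe contrast is attenuated by the OTF value at the fringe frequency.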
If the volumetric intensity distribution of the fringe pattern can be predicted using Equation 1 and the location of the surface is given, light reflected from the surface can be calculated. To simplify the surface-scattering problem, a Lambertian surface that scatters uniformly in all angles is assumed. By further assuming that the surface is located within the depth of focus of the camera, the image of the fringe pattern recorded by the camera can be calculated by considering the camera modulation transfer function (MTF) and the perspective of the camera.
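The camera-side step can be sketched in the same frequency-domain fashion: the in-focus, Lambertian-scattered surface intensity is filtered by the camera MTF. A Gaussian MTF of width `sigma_px` (in pixels) is assumed here for illustration only; a real system would use the measured MTF:

```python
import numpy as np

def camera_image(I_surface, sigma_px):
    """Filter an in-focus surface intensity image with an assumed Gaussian
    camera MTF, applied as a product in the 2D frequency domain."""
    ny, nx = I_surface.shape
    fy = np.fft.fftfreq(ny)[:, None]          # cycles per pixel
    fx = np.fft.fftfreq(nx)[None, :]
    mtf = np.exp(-2.0 * (np.pi * sigma_px) ** 2 * (fx**2 + fy**2))
    return np.real(np.fft.ifft2(np.fft.fft2(I_surface) * mtf))

# example: fringe image of period 8 pixels, blurred by the camera
x = np.arange(64)
fringes = np.tile(0.5 + 0.5 * np.cos(2.0 * np.pi * x / 8.0), (64, 1))
blurred = camera_image(fringes, sigma_px=2.0)  # contrast reduced, mean kept
```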
A geometrical transformation Ξ is applied to the output field distribution that first expands the distribution uniformly, then compresses or stretches it in the lateral position (x, y) as a function of the axial position z. The transformation Ξ is chosen so that I_{out}(r) resembles the output of a projection system. The expansion is given by:
r_{M} = Ξr, (Equation 3)
with Ξ given by:
(Equation 4)
where M is the global magnification of I_{in}. For the total intensity of I_{out}(r) across successive planes perpendicular to the optical axis to remain constant, I_{out}(r) decreases in proportion to the increase in successive plane area. With the assumption that the scene remains fully within the depth of focus of the camera and that scattering is Lambertian, the surface intensity values can be mapped directly to camera pixels to create I_{cam}(u, w). A simple ray tracing algorithm maps surface intensity values to the corresponding pixels. The same algorithm is used to establish occlusions in the projector field of view.
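A minimal Python sketch of this step follows, using nearest-neighbor resampling and a hypothetical lateral scale function s(z); the exact form of Ξ is given by Equation 4 and is not reproduced here:

```python
import numpy as np

def project_volume(I_focal, M, s):
    """Apply the geometric transformation plane by plane: each z-plane is
    stretched laterally by M*s(z) about the optical axis, and its intensity
    is divided by the area increase so the per-plane total stays constant.
    `s` is an assumed callable giving the lateral scale at plane index z."""
    nz, ny, nx = I_focal.shape
    out = np.empty_like(I_focal, dtype=float)
    yc, xc = (ny - 1) / 2.0, (nx - 1) / 2.0    # optical axis at plane center
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    for z in range(nz):
        scale = M * s(z)
        # inverse map: sample the focal-space plane at shrunk coordinates
        ys = np.clip(np.round((yy - yc) / scale + yc).astype(int), 0, ny - 1)
        xs = np.clip(np.round((xx - xc) / scale + xc).astype(int), 0, nx - 1)
        out[z] = I_focal[z, ys, xs] / scale**2  # area-increase normalization
    return out

# uniform input: each output plane is simply dimmed by 1/(M*s(z))^2
volume = project_volume(np.ones((4, 8, 8)), M=1.0, s=lambda z: 1.0 + 0.1 * z)
```

The ray tracing used to map surface intensities to camera pixels and to find occlusions is omitted from this sketch.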
A number of images were taken using a simple fringe projection system. Fringes were generated on a Raspberry Pi, projected using an Optoma HD142X projector, and captured using a Nikon D3500 camera in a thermally stable environment. The laboratory wall was used as a flat plane target as it is a relatively flat, white, Lambertian surface that filled the projector field of view. A set of 20-mm-diameter optical spheres were also used as a target. The optical spheres are Al_{2}O_{3} balls with a matte finish, designed to be Lambertian. All images have been manually cropped to remove scenery and converted to grayscale.
Two scenes were simulated that recreate the features found in the experimental images. For the projector, NA = 0.2 and the projector image contains 800 × 450 pixels. The intensity distribution I(r) sampling resolution is 800 × 450 × 512. The sample distance for I_{in}(r) is set to 2 nm in (x, y). The model is run in MATLAB 2018b, on a computer with 32 GB of memory and an Intel Xeon W-2123 with a clock speed of 3.6 GHz. Each simulation took approximately 30 s.
Figure 1 shows fringes projected onto a sphere artifact. The fringes appear curved in the camera perspective due to the form of the spheres. There is some specular reflection on the spheres. Parts of the spheres are occluded from the view of the projector. A sphere located in the back right corner is out of focus of the camera. The simulation of a similar scene is shown in figure 2, where a single sphere is simulated at radii increasing from 0.08 m to 0.12 m and 0.16 m. The curved fringes found in figure 1 are also present in figure 2. The specular reflection is not present because the surface is assumed to be Lambertian. Figure 2 provides evidence that the model is capable of simulating the change in pattern due to surface topography.


Figure 3 shows further images taken with the projector placed at distances of 1.23 m, 0.89 m, and 0.66 m from the surface, with the surface and camera positions fixed. Across figure 3 the image of the surface undergoes magnification, with blurring visible in figure 3(c). Figure 4 shows a simulation of a scene that includes a surface that is both in and out of focus. The surface closest to the projector (right) is brighter and compressed compared to the surface that is furthest away (left). Figure 4 shows that the model presented here can qualitatively recreate the effects of focus and magnification found in a fringe projection system, which cannot be done using a two-dimensional linear model or a geometrical model alone.


We have modeled fringe projection by combining 3D imaging theory with a geometric transformation. The preliminary results show that this model can describe the projected image distortion caused by surface topography and diffraction effects, and qualitative agreement with experiment was achieved. The potential advantages of this modeling approach are the inclusion of surface scattering/diffraction effects and the use of the OTF as a metric to evaluate fringe projection performance. In future work, surface-light interactions will be included within the model. The model will continue to be developed for use in a framework that can evaluate uncertainty in fringe projection systems.
We would like to thank the Engineering and Physical Sciences Research Council (Grants EP/L01534X/1 and EP/M008983/1) and the Manufacturing Technology Centre (Coventry, UK).
^{1} Chatterjee A., Singh P., Bhatia V., and Prakash S., Ear biometrics recognition using laser biospeckled fringe projection profilometry, Opt. Laser. Technol., 112: p. 368-378, 2019.
^{2} He W., Zhong K., Li Z., Meng X., Cheng X., Liu X., and Shi Y., Accurate calibration method for blade 3D shape metrology system integrated by fringe projection profilometry and conoscopic holography, Opt. Laser. Eng., 110: p. 253-261, 2018.
^{3} Zhang S., High-speed 3D shape measurement with structured light methods: A review, Opt. Laser. Eng., 106: p. 119-131, 2018.
^{4} Feng S., Zhang Y., Chen Q., Zuo C., Li R., and Shen G., General solution for high dynamic range three-dimensional shape measurement using the fringe projection technique, Opt. Laser. Eng., 59: p. 56-71, 2014.
^{5} ISO 15530: Geometrical product specifications (GPS) - Coordinate measuring machines (CMM): Technique for determining the uncertainty of measurement, 2013.
^{6} VDI/VDE 2634: Optical 3D Measuring Systems, 2002.
^{7} ISO 10360: Geometrical product specifications (GPS) - Acceptance and reverification tests for coordinate measuring machines (CMM), 2005.
^{8} ISO 25178: Geometrical product specifications (GPS) - Surface texture: Areal, 2019.
^{9} ISO 15530-4: Geometrical product specifications (GPS) - Coordinate measuring machines (CMM): Technique for determining the uncertainty of measurement, Part 4: Evaluating task-specific measurement uncertainty using simulation, 2008.
^{10} Southon N., Stavroulakis P., Goodridge R., and Leach R.K., In-process measurement and monitoring of a polymer laser sintering powder bed with fringe projection, Mater. Des., 157: p. 227-234, 2018.
^{11} Inanç A., Kösoğlu G., Yüksel H., and Inci M.N., 3D optical profilometry at micron scale with multi-frequency fringe projection using modified fibre optic Lloyd's mirror technique, Opt. Laser. Eng., 105: p. 14-26, 2018.
^{12} Du H., Chen X., Xi J., Yu C., and Zhao B., Development and Verification of a Novel Robot-Integrated Fringe Projection 3D Scanning System for Large-Scale Metrology, Sensors, 17(12): p. 2886, 2017.
^{13} Zhou D., Wang Z., Gao N., Zhang Z., and Jiang X., Virtual fringe projection system with non-parallel illumination based on iteration, Meas. Sci. Technol., 28(6): p. 065201, 2017.
^{14} Ribo M. and Brandner M., State of the art on vision-based structured light systems for 3D measurements, ROSE, IEEE, 2005.
^{15} Vukašinović N., Bračun D., Možina J., and Duhovnik J., The influence of incident angle, object colour and distance on CNC laser scanning, Int. J. Adv. Manuf. Tech., 50(1-4): p. 265-274, 2010.
^{16} Zhang B., Davies A., Evans C., and Ziegert J., Validity of the instrument transfer function for fringe projection metrology, Appl. Opt., 57(11): p. 2795-2803, 2018.
^{17} Su R., Thomas M., Leach R.K., and Coupland J., Effects of defocus on the transfer function of coherence scanning interferometry, Opt. Lett., 43(1): p. 82-85, 2018.
^{18} Coupland J.M. and Lobera J., Holography, tomography and 3D microscopy as linear filtering operations, Meas. Sci. Technol., 19(7): p. 074012, 2008.
^{19} Su R., Wang Y., Coupland J., and Leach R.K., On tilt and curvature dependent errors and the calibration of coherence scanning interferometry, Opt. Express, 25(4): p. 3297-3310, 2017.
^{20} Streibl N., Three-dimensional imaging by a microscope, JOSA A, 2(2): p. 121-127, 1985.