Open Access | 14 November 2016
Vision-based displacement measurement sensor using modified Taylor approximation approach
Bingyou Liu, Dashan Zhang, Jie Guo, Chang’an Zhu
Abstract
The development of image sensors and optical lenses has contributed to the rapidly increasing use of vision-based methods for noncontact measurement in many areas. A high-speed camera system is developed to realize displacement measurement in real time. Conventional visual measurement algorithms commonly suffer from various shortcomings, such as complex processing, multiparameter adjustment, or integer-pixel accuracy. Inspired by the combination of a block-matching algorithm and simplified optical flow, a motion estimation algorithm that uses a modified Taylor approximation is proposed and applied to the vision sensor system. Replacing integer-pixel searching with a rounding-iterative operation enables the modified algorithm to accomplish one displacement extraction within 1 ms while yielding satisfactory subpixel accuracy. The performance of the vision sensor is evaluated through a simulation test and two experiments on a grating ruler motion platform and the steering wheel system of a forklift. Experimental results show that the developed vision sensor can extract accurate displacement signals and accomplish vibration measurement of engineering structures.

1. Introduction

Noncontact measurement techniques, such as speckle photography,1 hologram interferometry,2 and laser Doppler vibrometry,3 have been developed over many years and are well established in various fields. Compared with traditional measurement devices, such as accelerometers or linear displacement gauges, noncontact devices are more flexible to install and provide an intuitive view of the actual movement of the target without affecting its behavior. In environments where traditional sensors have no clear access or cannot work effectively, e.g., remote targets or fields with high temperature or strong magnetic fields, noncontact measurement devices have clear advantages over conventional ones. However, most noncontact equipment is costly and requires strict mounting structures, which limits the wide use of such systems in practical applications.

Technological developments in image sensors and optical lenses have contributed to the rapidly increasing use of vision-based measurement methods as noncontact measurement methods in numerous research and industrial areas, such as vibration analysis,4,5 condition monitoring,6–11 human motion,12,13 and underwater measurement.14 With relatively lower cost and better structural flexibility, optical devices and cameras offer effective alternatives to other noncontact equipment. Benefiting from the wide availability of affordable high-quality digital image sensors and high-performance computers, cheap high-resolution cameras are increasingly used in many areas. Recently, vision-based techniques have been successfully used to measure various structures with satisfactory results.15–24 Quan et al.25 achieved three-dimensional displacement measurement based on two-dimensional (2-D) digital image correlation (DIC). Kim et al.26 proposed a vision-based monitoring system that uses DIC to evaluate the cable tensile force of a cable-stayed bridge. The same method was also applied in experimental mechanics for noncontact, full-field deformation measurement.27,28 Park et al.29 realized displacement measurement for high-rise building structures using a partitioning approach. Wahbeh et al.30 measured the displacements and rotations of the Vincent Thomas Bridge in California by using a highly accurate camera in conjunction with a laser tracking reference. Fukuda et al.31 proposed a camera-based sensor system in which a robust object search algorithm was used to measure the dynamic displacements of large-scale structures. Feng et al.32 developed a vision-based sensor that employs an up-sampled cross-correlation (UCC) algorithm33 for noncontact structural displacement measurement and can accurately measure the displacements of bridges.

The traditional camera system for displacement measurement is composed of commercial digital cameras and video-processing devices (normally computers). However, ordinary digital cameras often have a low video-sampling rate, which limits the range of vibration frequencies they can measure.34 To overcome this restriction, high-speed vision systems running at 1000 frames per second (fps) or even higher have been developed and applied in practice.35 In the present paper, a high-speed camera sensor system is developed that is composed of a zoom optical lens, a high-speed camera body with a CCD receiver, and a notebook computer. A USB 3.0 interface is used to ensure stable data transfer between the camera body and the computer. On the notebook computer, the captured video is processed by software to track an object and extract its motion information in real time.

Similar to other measurement equipment, a vision-based measurement system is mainly judged by its measurement speed and accuracy, both of which depend significantly on the performance of the image-processing algorithm. Owing to their high sampling rates, high-speed sensors place strict demands on the computing speed of motion-tracking algorithms to satisfy real-time signal processing. Conventional motion extraction algorithms based on template-matching registration techniques [e.g., sum of absolute differences (SAD) or normalized cross-correlation (NCC)] are mostly complex and carry a heavy computation load. Moreover, template-matching techniques can only achieve integer-pixel resolution because the minimal unit in a video image is 1 pixel. Such accuracy is far from satisfactory in numerous practical applications, particularly where small structural vibrations must be resolved. Various methods have been proposed to refine measurement accuracy, including interpolation techniques and subpixel registration,36–38 most of which indeed improve accuracy but at some cost in computational efficiency.

Chan et al.39 proposed a subpixel motion estimation method that combines classical block matching with simplified optical flow. The method is much faster than existing block-matching algorithms because no interpolation is needed. In the first step, a block-matching algorithm, such as three-step search (TSS) or cross-diamond search, is used to determine the integer-pixel displacement. The result is then refined to the subpixel level through local approximation using simplified optical flow. In this paper, we simplify Chan’s algorithm by replacing block matching with a rounding-iterative Taylor approximation. Because no subpixel interpolation is needed when cutting the template in each iteration, the modified algorithm runs much faster than conventional iterative DIC optical flow. Given that the improvement introduces no additional parameter requiring specification, the modified algorithm naturally executes with a high degree of automation. After several rounds of optimization, the computation time of one extraction with the modified algorithm is reduced to less than 1 ms. The modified algorithm is adopted in the high-speed camera system for its high efficiency and satisfactory subpixel accuracy. A simulation and two experiments under laboratory and realistic conditions are carried out for performance verification. The positive results demonstrate the accuracy and efficiency of the camera sensor system in measuring dynamic displacement.

The rest of the paper is organized as follows. Section 2 introduces the components and capability parameters of the high-speed vision sensor system. Section 3 presents the theory of the motion estimation algorithm without interpolation and the modified Taylor algorithm. Section 4 evaluates the performance of the modified algorithm with a simulation test. Section 5 presents two experiments for performance verification. Section 6 discusses the results and outlook.

2. High-Speed Vision Sensor System

Traditional camera systems for displacement measurement are commonly composed of commercial digital cameras and personal computers. However, commercial digital cameras usually have low frame rates (e.g., 100 fps), which limits them to vibration frequencies below 50 Hz. In this paper, a high-speed sensor system composed of a notebook computer (Intel Core processor, 2.9 GHz, 2.75 GB RAM) and a video camera with a telescopic lens is developed for displacement measurement, as shown in Fig. 1(a). The telescopic lens has a large zooming capability [Fig. 1(b)] that can meet the measurement requirements at different distances. The camera head uses a CCD sensor as the image receiver, which can capture 8-bit gray-scale images at a maximum of 1000 fps when the image resolution is set to 300 × 300 pixels. A USB 3.0 interface is used to ensure stable data transfer between the camera and the computer. With this high sampling rate and computing efficiency, the image-processing software on the notebook computer can use the refined Taylor algorithm to track fast-moving objects and extract motion information in real time.

Fig. 1

High-speed vision sensor system: (a) experimental setups and (b) video camera and zoom optical lens.


A target panel preinstalled on the structure is very helpful for ensuring extraction accuracy during measurement. If a target panel is unavailable because of limitations of the measurement environment, distinct surface patterns of the structure, such as textures or edges, can be used as tracking templates. The camera system is then ready to capture images from a remote location, and the displacement time history of the structure can be obtained by applying the displacement tracking algorithm to the digital video images.

3. Motion Extraction Algorithm

3.1. Subpixel Motion Estimation Without Interpolation

Figure 2 shows the subpixel motion estimation algorithm that combines block matching and simplified optical flow. Two consecutive frames, $f(x,y)$ and $g(x,y)$, with real displacement $(\Delta x, \Delta y)$ are given. The real displacement can be divided into an integer part $(\overline{\Delta x}, \overline{\Delta y})$ and a subpixel part $(\delta_x, \delta_y)$ as

Eq. (1)

$$\Delta x = \overline{\Delta x} + \delta_x, \qquad \Delta y = \overline{\Delta y} + \delta_y.$$

Fig. 2

Flowchart of motion estimation without interpolation using a combination of block-matching algorithm and simplified optical flow.


A block-matching algorithm is first applied to estimate the integer-pixel displacements $\overline{\Delta x}$ and $\overline{\Delta y}$. When the integer part is determined, the image block $f(x,y)$ is shifted by $\overline{\Delta x}$ pixels in the x-direction and $\overline{\Delta y}$ pixels in the y-direction.

For the subpixel part, the Taylor series approximation is used to refine the search. The shifted image $f(x+\overline{\Delta x},\, y+\overline{\Delta y})$ differs from the accurate location only by $|\delta_x|<1$ and $|\delta_y|<1$, which can be computed using a one-step Taylor approximation.
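As a one-dimensional illustration (our notation; in 2-D both components are estimated jointly by least squares over all template pixels rather than at a single point), the one-step approximation amounts to

```latex
g(x) = f\!\left(x + \overline{\Delta x} + \delta_x\right)
     \approx f\!\left(x + \overline{\Delta x}\right) + \delta_x\, f'\!\left(x + \overline{\Delta x}\right)
\;\Longrightarrow\;
\delta_x \approx \frac{g(x) - f\!\left(x + \overline{\Delta x}\right)}{f'\!\left(x + \overline{\Delta x}\right)}.
```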

The total displacement is determined by combining the integer part and the subpixel part. Analytical error analysis39,40 is deduced in one dimension and can be generalized straightforwardly to the 2-D situation. The results imply that this two-step method can extract more accurate motion vectors than other block-matching algorithms. With no requirement for interpolation or motion-compensated frames, the algorithm is much faster than conventional methods.

3.2. Taylor Approximation with Rounding-Iterative Operation

In this part, an analytic model is built to illustrate the proposed modified algorithm used in the sensor system. Figure 3 illustrates the displacement extraction procedure for consecutive frames k and k+1. A subimage $f(x,y)$ in frame k is selected as the matching template. With all its pixels moved by the displacement vector $\mathbf{p} = (\Delta x, \Delta y)^T$, the template image becomes a new subimage $g(x,y)$ at the same position in the next frame.

Fig. 3

Illustration of the overall shift of the template image.


Under the assumption of brightness constancy or intensity conservation, the relationship between the template image $f(x,y)$ in frame k and the subimage $g(x,y)$ at the same position in frame k+1 can be written as

Eq. (2)

$$g(x,y) = f(x + \Delta x,\, y + \Delta y).$$

Note that the assumption that surface radiance remains fixed from one frame to the next rarely holds exactly. However, as long as the scene contains no specularities, object rotations, or secondary illumination (shadows or intersurface reflections), the brightness constancy assumption works well in practice.40

Given that the displacement vector $\mathbf{p} = (\Delta x, \Delta y)^T$ is usually small (normally several pixels), Eq. (2) can be approximated using a first-order Taylor expansion, with the higher-order terms ignored, as follows:

Eq. (3)

$$g(x,y) = f(x+\Delta x,\, y+\Delta y) \approx f(x,y) + \Delta x\,\frac{\partial f(x,y)}{\partial x} + \Delta y\,\frac{\partial f(x,y)}{\partial y}.$$
With two unknowns, $\Delta x$ and $\Delta y$, in one equation, the linear least-squares (LS) estimator minimizes the squared errors:

Eq. (4)

$$E(\mathbf{p}) = \sum_{x,y}\left[g(x,y) - f(x,y) - \Delta x\,\frac{\partial f(x,y)}{\partial x} - \Delta y\,\frac{\partial f(x,y)}{\partial y}\right]^2.$$
As a linear LS problem, the minimum of $E(\mathbf{p})$ can be found by setting its derivatives with respect to $\mathbf{p}$ to zero:

Eq. (5)

$$\frac{\partial E(\mathbf{p})}{\partial(\Delta x)} = 0, \qquad \frac{\partial E(\mathbf{p})}{\partial(\Delta y)} = 0.$$
Equation (5) can be written in matrix form

Eq. (6)

$$\nabla I\,\mathbf{p} = \Delta I,$$
in which $\Delta I$ is the difference matrix and $\nabla I$ denotes the gradient matrix

Eq. (7)

$$\Delta I = \begin{bmatrix} g(x_1,y_1)-f(x_1,y_1) \\ g(x_2,y_2)-f(x_2,y_2) \\ \vdots \\ g(x_n,y_n)-f(x_n,y_n) \end{bmatrix}, \qquad \nabla I = \begin{bmatrix} f_x(x_1,y_1) & f_y(x_1,y_1) \\ f_x(x_2,y_2) & f_y(x_2,y_2) \\ \vdots & \vdots \\ f_x(x_n,y_n) & f_y(x_n,y_n) \end{bmatrix},$$
where n refers to the number of pixels in the selected template.

In the sense of general LS, if $\nabla I^{T}\,\nabla I$ is invertible (full rank), then the displacement vector can be expressed with an LS estimate as

Eq. (8)

$$\mathbf{p} = \left[\nabla I^{T}\,\nabla I\right]^{-1} \nabla I^{T}\,\Delta I.$$
With Eq. (8), displacement vectors between adjacent frames can be obtained precisely on the condition of a minute interframe displacement, because the Taylor approximation is valid only for $|\Delta x|<1$ and $|\Delta y|<1$. However, the interframe displacement may be larger than this in practical applications, in which case the vector $\mathbf{p}$ might be correct in direction but inaccurate in magnitude. Therefore, a rounding-iterative process is introduced to solve this problem and guarantee accuracy. At each calculation step, the estimate $\mathbf{p}_j = (\Delta x_j, \Delta y_j)^T$ is rounded to the nearest integers and used to shift the template, and the process repeats until the termination condition $\|\mathbf{p}_j\| < 0.5$ is satisfied.
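Concretely, Eq. (8) is a single linear LS solve. The following minimal Python sketch (our illustrative variable names; NumPy's lstsq stands in for the explicit normal-equation inverse) computes one estimation step:

```python
import numpy as np

def taylor_step(f, g):
    """One LS Taylor step [Eq. (8)]: estimate p = (dx, dy) between subimages.

    f, g are equally sized 2-D float arrays: the template from frame k and
    the subimage at the same position in frame k+1.
    """
    fy, fx = np.gradient(f)                 # central-difference gradients
    grad = np.column_stack((fx.ravel(), fy.ravel()))  # gradient matrix, n x 2
    diff = (g - f).ravel()                  # difference vector, length n
    p, *_ = np.linalg.lstsq(grad, diff, rcond=None)   # solves the normal equations
    return p                                # p = [dx, dy]
```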

Figure 4 shows the architecture of the proposed displacement extraction method with the reformative iteration process involved. The procedure for the proposed modified method can be summarized as follows (a code sketch follows the list):

  • Step 1: Cut f(x,y) and g(x,y) from consecutive frames at the same position;

  • Step 2: Compute the partial derivatives fx and fy of f(x,y) by the central difference;

  • Step 3: Compute the difference matrix ΔI and gradient matrix I according to Eq. (7);

  • Step 4: Compute the displacement vector $\mathbf{p}_j = (\Delta x_j, \Delta y_j)^T$ according to Eq. (8) and check whether $\|\mathbf{p}_j\|$ is less than 0.5. If it is, proceed to Step 5; if not, update $f(x,y)$ to the new $f[x + \mathrm{round}(\Delta x_j),\, y + \mathrm{round}(\Delta y_j)]$ and return to Step 2 (round denotes rounding to the nearest integer);

  • Step 5: Accumulate the integer shifts applied in Step 4 together with the final $\Delta x_j$ and $\Delta y_j$ to obtain the refined displacement vector $\mathbf{p} = (\Delta x, \Delta y)^T$.
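A minimal sketch of the whole rounding-iterative loop, reusing the taylor_step function above (bounds checks and failure handling omitted; variable names are ours):

```python
import numpy as np  # assumes taylor_step from the sketch above is in scope

def taylor_displacement(frame_k, frame_k1, x0, y0, h, w, max_iter=20):
    """Steps 1-5 of the modified method; (x0, y0) is the template's top-left
    corner in frame k and (h, w) its size."""
    g = frame_k1[y0:y0 + h, x0:x0 + w].astype(float)  # Step 1: fixed subimage
    sx = sy = 0                 # integer shifts applied to the template so far
    dxj = dyj = 0.0
    for _ in range(max_iter):
        # re-cut the template from frame k at the shifted integer position
        f = frame_k[y0 + sy:y0 + sy + h, x0 + sx:x0 + sx + w].astype(float)
        dxj, dyj = taylor_step(f, g)        # Steps 2-4: one Eq. (8) estimate
        if np.hypot(dxj, dyj) < 0.5:        # termination: ||p_j|| < 0.5
            break
        sx += int(round(dxj))               # rounded update, no interpolation
        sy += int(round(dyj))
    # Step 5: accumulated integer shifts plus the final subpixel estimate
    return sx + dxj, sy + dyj
```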

Fig. 4

Displacement extraction using Taylor approximation with the reformative iteration operation.


With this rounding-iterative modification, the integer-level motion estimation is also accomplished with optical flow instead of block matching. The modification of the original method is simple in that it introduces no unnecessary computation or pixel interpolation. The rounding operation eliminates the subpixel interpolation in each frame cutting, which makes the algorithm much faster than conventional iterative DIC optical flow. The algorithm naturally executes with a high degree of automation because the improvement introduces no additional parameter requiring specification. Although the rounding operation significantly reduces the time spent on subpixel interpolation, the modified algorithm may, to some extent, sacrifice accuracy because of its relatively loose termination condition. In practice, the proposed method yields stable, satisfactory subpixel results and executes with high efficiency in the following simulation and experiments. Thus, the algorithm can be used in high-speed camera systems to measure displacement in real time. The comparative simulation and experiments for validation are presented in the following sections.

4. Simulation Test

The performance of the proposed modified Taylor algorithm is first evaluated through a simulation test. The simulation presents a simple case with a single vignetting black circle (diameter of 160 pixels) on a white background, as shown in Fig. 5. The black circle is programmed to move along the following elliptical trajectory:

Eq. (9)

$$x(t) = 10\cos(2\pi f t), \qquad y(t) = 6\sin(2\pi f t).$$

Fig. 5

Simulated black circle on a white background and the selected tracking template.


The maximum displacements in the x- and y-directions are 10 and 6 pixels, respectively. The rotation frequency is set to 1 Hz, and the sampling frequency is 50 Hz. Four algorithms, namely, classical NCC, UCC, TSS with optical flow, and the proposed modified Taylor, are applied to extract the motion displacement of the moving circle. All programming and computation are done in MATLAB R2015a.
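For readers who wish to reproduce the setup in Python rather than MATLAB, the following sketch synthesizes the elliptical motion with subpixel image shifts and feeds consecutive frames to the taylor_displacement sketch of Sec. 3.2 (scene size, template position, and edge smoothing are our assumptions, not values from the paper):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, shift

# Stand-in scene: a soft-edged ("vignetting") black circle, 160 pixels in
# diameter, on a white 300 x 300 background
yy, xx = np.mgrid[0:300, 0:300]
base = np.where((xx - 150.0) ** 2 + (yy - 150.0) ** 2 < 80.0 ** 2, 0.0, 255.0)
base = gaussian_filter(base, sigma=3)

f0, fs, n = 1.0, 50.0, 100                   # 1-Hz rotation sampled at 50 Hz
t = np.arange(n) / fs
xs, ys = 10 * np.cos(2 * np.pi * f0 * t), 6 * np.sin(2 * np.pi * f0 * t)

pos = [(xs[0], ys[0])]
for k in range(1, n):
    prev = shift(base, (ys[k - 1], xs[k - 1]), order=3, mode='nearest')
    cur = shift(base, (ys[k], xs[k]), order=3, mode='nearest')
    # 80 x 80 template straddling the circle's left edge; with Eq. (2)'s
    # convention the estimate is the negative of the on-screen motion
    dx, dy = taylor_displacement(prev, cur, 70, 110, 80, 80)
    pos.append((pos[-1][0] - dx, pos[-1][1] - dy))
```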

During the test, an 80 × 80 pixels region (within the red box) is selected as the tracking template to ensure a stable tracking error.41 The UCC algorithm is an advanced subpixel image registration technique that allows the resolution to be adjusted by changing the up-sampling factor.34 The up-sampling factors for the UCC algorithm are set to 1, 10, and 100 for subpixel levels of one integer pixel, 0.1 pixel, and 0.01 pixel, respectively. Moreover, the UCC algorithm gives accurate displacement results only when the template size is large enough. The comparison results under the same template condition are marked in Table 1 (footnote a).
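The UCC baseline is essentially the efficient subpixel registration method of Ref. 33. For readers reproducing the comparison, scikit-image ships an implementation in which upsample_factor plays the role of usfac (a sketch, assuming two consecutive frames prev and cur as in the simulation sketch above):

```python
from skimage.registration import phase_cross_correlation

# prev, cur: consecutive 2-D float frames; upsample_factor=100 resolves
# the shift to the 0.01-pixel level, as in Table 1
shift_rc, error, diffphase = phase_cross_correlation(prev, cur,
                                                     upsample_factor=100)
```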

Table 1

Errors and time consumption comparisons in the simulation test.

| Algorithm | Max error x (pixel) | Max error y (pixel) | Abs. avg. error x (pixel) | Abs. avg. error y (pixel) | T_total (s) | T_avg (ms) |
| --- | --- | --- | --- | --- | --- | --- |
| Classical NCC | 0.6279 | 0.6231 | 0.2340 | 0.2421 | 4.46 | 17.77 |
| UCC (usfac=1) | 0.5710 | 0.3742 | 0.2309 | 0.2378 | 2.45 | 9.89 |
| UCC (usfac=10) | 0.1897 | 0.1520 | 0.0580 | 0.0491 | 3.51 | 14.03 |
| UCC (usfac=100) | 0.1497 | 0.1220 | 0.0548 | 0.0481 | 4.20 | 16.78 |
| TSS + optical flow | 0.3383 | 0.3383 | 0.1151 | 0.1126 | 1.07 | 4.08 |
| Modified Taylor | 0.3327 | 0.2553 | 0.0817 | 0.0681 | 0.15 | 0.46 |
| Modified Taylor (a) | 0.1590 | 0.1480 | 0.0531 | 0.0463 | 0.32 | 0.97 |

(a) With the same template size as the UCC algorithm.

Motion extraction results of the two integer-level methods, NCC and UCC (usfac=1), are shown in Figs. 6(a) and 6(b). These two algorithms scan for the template's best-matching region in one-pixel steps, and this deficiency leads to a step-type curve shape and reduces extraction accuracy. Results of subpixel-level motion extraction are shown in Figs. 6(c)–6(f). The figures show that the motion curves of the four subpixel-level algorithms are obviously smoother than those of NCC and UCC (usfac=1).

Fig. 6

Comparisons of displacement extraction results between the actual input and different algorithms. (a) Classical NCC algorithm, (b) UCC algorithm (usfac=1), (c) UCC algorithm (usfac=10), (d) UCC algorithm (usfac=100), (e) TSS and optical flow, and (f) modified Taylor algorithm.


Quantitative comparison results regarding tracking error and computation time are given in Table 1. The table indicates that, as the subpixel resolution improves from 1 to 0.01 pixel, the absolute average horizontal error of the UCC algorithm decreases from 0.2309 to 0.0548 pixel, and the absolute average vertical error decreases from 0.2378 to 0.0481 pixel. Meanwhile, the time consumed increases from 14.03 to 16.78 ms/frame. Subject to its template size requirement, the UCC algorithm cannot deliver both high accuracy and high efficiency at once. The error analysis of the combination of TSS and optical flow reveals that the combined method has a clear advantage in time consumed (4.08 ms/frame) with acceptable average errors (0.1151 pixel horizontally and 0.1126 pixel vertically). The proposed modified Taylor method has the highest computational efficiency among all the listed methods. Handling one frame takes only 0.46 ms on average, with a better error performance than the combined TSS and optical flow. Furthermore, with a large template, the modified Taylor method gives an error performance similar to UCC (usfac=100) while spending only 1/17 of the latter's time per frame.

Owing to its satisfactory time efficiency and accuracy in the simulation test, the modified Taylor algorithm was integrated into the real-time vision sensor system described in Sec. 2. The software contains several modules. The high-speed camera module controls the parameters of the digital camera, such as contrast, brightness, and exposure time. The calibration module computes the actual displacement represented by one pixel based on a target of known size. The image-capturing module acquires the streaming image data in real time and sends them to the template-tracking module, where the modified Taylor algorithm operates. The entire sensor system is implemented with the Qt and OpenCV libraries and is capable of measuring the displacement of actual structures.

5. Experimental Verification

5.1. Case 1: Experiment on a Grating Ruler Motion Platform

To evaluate the performance of the developed vision-based sensor system, an experimental verification was carried out on a laboratory platform with a conventional grating ruler, as shown in Fig. 7. Using the Moiré fringe technology of gratings and photoelectric conversion, incremental grating displacement sensors are widely used as high-accuracy displacement measurement tools with numerous advantages, such as stability, reliability, and high accuracy. The experimental installation is shown in Fig. 7(a). The grating ruler displacement sensor was installed on the moving table platform, with its reading head moving synchronously with the junction plate in the horizontal direction. With this structure, the displacement of the target can be recorded simultaneously by the high-speed sensor system and the grating ruler for comparison. The sampling frequency of the grating ruler sensor used in the experiment is 20 Hz, and the grating pitch is 0.02 mm with a resolution of 1 μm.

Fig. 7

Experiment setup in grating ruler platform experiment. (a) Experimental device and (b) the cross target and selected template.


In the experiment, the vision-based high-speed camera system was evaluated against the grating ruler. As seen in Fig. 7(a), a circular target with a diameter of 20 mm was installed on the junction plate in advance. The target can be programmed to move with arbitrary amplitudes and frequencies in the horizontal direction. The video camera was placed at a stationary position 3 m away from the platform and captured the moving target at a resolution of 160 × 160 pixels at 200 fps. To express the measured displacement in physical dimensions, the actual size of the preinstalled target panel in the video images was calculated: 20 mm in real life corresponds to 104.7 pixels in the captured images, giving a pixel resolution of 0.191 mm/pixel. A 50 × 50 pixels region on the target, as shown in Fig. 7(b), was chosen as the template image.
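The pixel-to-millimeter conversion is a single scale factor obtained from the known target size; a minimal illustration with this experiment's numbers (the tracked displacement value is hypothetical):

```python
mm_per_pixel = 20.0 / 104.7         # 20-mm target spans 104.7 pixels: ~0.191 mm/pixel
dx_pixels = 0.52                    # hypothetical displacement from the tracker
dx_mm = dx_pixels * mm_per_pixel    # ~0.099 mm of physical motion
```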

The guide screw was driven by a 10-s manual arbitrary input. As shown in Fig. 8, the horizontal displacement time history measured by the vision-based system was compared with that measured by the grating ruler sensor. The grating ruler data (green dashed line) match well with the vision-based sensor data (blue dashed line). The integer-level tracking result is marked with a red solid line. Its step-type shape indicates that integer-level algorithms can only acquire the integer-pixel motion of the target, which leads to a large measurement error.

Fig. 8

Results of displacement measurement on grating ruler moving platform.


Similar to the simulation test, the captured video was analyzed with the different motion extraction algorithms mentioned previously. Quantitative experimental results regarding tracking error and computation time are given in Table 2. To further evaluate the error performance, the normalized root-mean-squared error (NRMSE) is introduced as

Eq. (10)

$$\mathrm{NRMSE} = \frac{\sqrt{\frac{1}{n}\sum_{i=1}^{n}(a_i - b_i)^2}}{b_{\max} - b_{\min}} \times 100\%,$$
where n denotes the number of frames and a and b refer to the displacement data measured by the vision-based sensor and the grating ruler, respectively. The results indicate that the vision sensor system with the modified Taylor algorithm has the lowest NRMSE, 0.75%, and the fastest average computing time per frame, 0.22 ms. The absolute average measuring error of the modified Taylor algorithm was 0.020 mm; with a pixel resolution of 0.191 mm/pixel, the proposed sensor system thus achieved an accuracy of roughly 1/9 pixel in this experiment.
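Equation (10) translates directly into a few lines of Python (a sketch; a is the vision-sensor record and b the grating-ruler reference, resampled to common time instants):

```python
import numpy as np

def nrmse(a, b):
    """Normalized root-mean-squared error of Eq. (10), in percent."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    rmse = np.sqrt(np.mean((a - b) ** 2))
    return 100.0 * rmse / (b.max() - b.min())
```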

Table 2

Errors and time consumption comparisons in grating ruler motion platform experiment.

| Algorithm | Error_avg (mm) | NRMSE (%) | T_avg (ms) |
| --- | --- | --- | --- |
| Classical NCC | 0.051 | – | 5.93 |
| UCC (usfac=1) | 0.058 | – | 3.78 |
| UCC (usfac=10) | 0.028 | 1.08 | 13.09 |
| UCC (usfac=100) | 0.022 | 0.87 | 15.35 |
| TSS + optical flow | 0.025 | 1.01 | 2.54 |
| Modified Taylor | 0.020 | 0.75 | 0.22 |
| Modified Taylor (a) | 0.019 | 0.73 | 0.45 |

(a) With the same template size as the UCC algorithm.

5.2. Case 2: Vibration Measurement of Steering Wheel System

To validate the effectiveness of the proposed sensor system in a practical environment, a vibration measurement experiment was conducted on a forklift's steering wheel system. The steering wheel system consists of several components, including the front panel, install panel, mounting plate, commutator pump, steering wheel, and tubular column. As shown in Fig. 9(a), the steering wheel and tubular column are assembled with an interference fit, and the mounting plate is connected to the commutator pump with a locking device. The mounting plate and front panel are both welded to the install panel. Owing to design defects, resonance occurs in the steering wheel system when the engine works at idle speed (22.8 to 28.3 Hz). This resonance may lead to physical complaints and industrial accidents if drivers operate the forklift for long periods.

Fig. 9

Model of the steering wheel system. (a) 3-D model of the steering wheel system and (b) hexahedral FEM mesh result.


A finite element model (FEM) of the steering wheel system was built with Pro/Engineer, as shown in Fig. 9(b). The locking device between the commutator pump and the mounting plate was simplified to a bolted connection in the FEM modeling. Figure 9 also illustrates the meshing results using hexahedral elements. Modal analysis results showed that a first-order natural frequency occurs at 22.3262 Hz. As shown in Fig. 10, the mode shape at this frequency is a horizontal bending of the steering wheel. The FEM analysis thus supports the resonance hypothesis because this natural frequency lies close to the idle-speed excitation range.

Fig. 10

Horizontal steering wheel bending mode at 22.3262 Hz using FEM modeling.


The vision-based experimental setup on the forklift's steering wheel system is shown in Fig. 11. The high-speed camera sensor was installed on a dedicated support to avoid additional interference. Measurement targets of 10 mm × 10 mm were marked on the upper surface of the steering wheel. The distance between the camera and the steering wheel was about 60 cm. The actual size of one pixel, 0.0375 mm/pixel, was calculated from the known physical size of the targets. With the efficient modified Taylor algorithm, the vibration can be analyzed and displayed in real time.

Fig. 11

Vibration measurement experiment on a forklift’s steering wheel system.


Figure 12 shows the horizontal and vertical vibration displacements and their corresponding Fourier spectra after the modified Taylor algorithm was applied to the vibration video. The results show that the center of the steering wheel vibrates with an amplitude under 0.5 mm after excitation. Two obvious spectral peaks can be observed at 22.27 and 44.24 Hz in the Fourier spectra; these can be regarded as the first-order dominant frequency of the steering wheel system and its double. During the motion extraction process, the elapsed time for each frame was less than 0.4 ms, and more than 87% of the extractions were completed within 0.1 ms. The measured frequency is very close to the natural frequency obtained from the FEM analysis, with an acceptable error. Therefore, the proposed vision-based displacement sensor can accurately recover this frequency.
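The spectra in Fig. 12 are one-sided FFT amplitude spectra of the displacement records; a minimal sketch of that computation (our function name; fs is the camera frame rate in Hz):

```python
import numpy as np

def amplitude_spectrum(disp, fs):
    """One-sided FFT amplitude spectrum of a displacement record."""
    disp = np.asarray(disp, float)
    disp = disp - disp.mean()                 # remove the static offset
    n = disp.size
    amp = 2.0 / n * np.abs(np.fft.rfft(disp))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, amp

# freqs[amp.argmax()] gives the dominant peak, near 22.27 Hz for this record
```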

Fig. 12

Vibration displacements and the frequency spectra of the steering wheel system.


6. Conclusions

This study developed a vision-based high-speed sensor system for dynamic displacement measurement. The sensor system is composed of a high-speed camera head with a zoom optical lens and a notebook computer. To meet the requirement of real-time measurement, a motion extraction algorithm with high efficiency is used. With the combination of block-matching algorithm and simplified optical flow, motion vectors between frames can be extracted accurately. The method is proven to be much faster than conventional algorithms because there is no interpolation or motion compensation. However, this combination method still has room for improvement.

In our proposed algorithm, the integer-pixel search is replaced with a rounding-iterative operation on the Taylor approximation. This simple modification introduces no unnecessary computation or pixel interpolation into the original method. Because no additional parameter requires specification, the modified algorithm executes with a high degree of automation and runs even faster with good accuracy. Based on the assumption of brightness constancy or intensity conservation, the proposed algorithm obtains the displacement vector between frames in the least-squares sense and achieves fast automatic computation by iteratively updating the template's position. Because no image feature extraction is involved, the algorithm avoids threshold selection and reduces to simple matrix operations. The high efficiency, high precision, and good robustness of the proposed algorithm contribute to the applications of the high-speed camera sensor system.

A simulation tracking the rotational motion of a black circle, as well as two experiments on a grating ruler motion platform and on the vibration of a steering wheel system, is conducted to verify the effectiveness of the modified algorithm and the developed sensor system. The displacement extraction results of the modified algorithm are compared with the actual values and with the results of three other extraction algorithms. In the simulation test, satisfactory agreement is observed between the real motion curve and the curve obtained with the modified algorithm. In the grating ruler experiment, the motion of the platform is accurately measured with the developed sensor system. In a realistic environment, the performance of the vision sensor is further confirmed by the vibration analysis of the forklift's steering wheel system. Across the simulation and experiments, the modified algorithm outperforms the alternatives in computing efficiency: the average time for handling one frame is reduced to less than 1 ms while maintaining low measurement error.

Although the brightness constancy assumption works well in practice, large variations in illumination intensity may still influence the measurement results and lead to large errors. Unlike the multiframe improvement of Ref. 39, the modified algorithm derives its image gradients from a single frame. This characteristic keeps the method concise and highly efficient, but the differential operation may amplify image noise and cause undesired errors. The developed sensor system can currently sustain real-time measurement only at sampling frequencies below 500 Hz, a limit set by the camera module available to us. Future work will focus on improving the algorithm's robustness under large illumination changes and on developing a sensor system that supports sampling frequencies above 500 Hz.

Acknowledgments

This work was supported by the Key Project of Natural Science by Education Department of Anhui Province (No. KJ2015A316) and the Outstanding Young Talents at Home Visit the School Training Project (No. gxfxZD2016101).

References

1. 

Y. Zhang et al., “Application of the Fourier transform in electronic speckle photography,” Exp. Mech., 42 409 –416 (2002). http://dx.doi.org/10.1007/BF02412146 EXMCAZ 0014-4851 Google Scholar

2. 

J. L. Valin et al., “Methodology for analysis of displacement using digital holography,” Opt. Laser Technol., 43 99 –111 (2005). http://dx.doi.org/10.1016/j.optlaseng.2004.05.010 OLTCAS 0030-3992 Google Scholar

3. 

H. H. Nassif et al., “Comparison of laser Doppler vibrometer with contact sensors for monitoring bridge deflection and vibration,” NDT&E Int., 38 213 –218 (2005). http://dx.doi.org/10.1016/j.ndteint.2004.06.012 Google Scholar

4. 

Y. Ji and C. Chang, “Nontarget stereo vision technique for spatiotemporal response measurement of line-like structure,” J. Eng. Mech., 134 466 –474 (2008). http://dx.doi.org/10.1061/(ASCE)0733-9399(2008)134:6(466) JENMDT 0733-9399 Google Scholar

5. 

D. L. B. R. Jurjo et al., “Experimental methodology for the dynamic analysis of slender structures based on digital image processing techniques,” Mech. Syst. Signal Process., 20 1112 –1133 (2006). http://dx.doi.org/10.1016/j.ymssp.2004.09.008 Google Scholar

6. 

T. Wu et al., “Full-life dynamic identification of wear state based on on-line wear debris image features,” Mech. Syst. Signal Process., 42 404 –414 (2014). http://dx.doi.org/10.1016/j.ymssp.2013.08.032 Google Scholar

7. 

Y. V. Filatov et al., “Noncontact measurement of angular position and angular movement by means of laser goniometer,” Opt. Eng., 54 (5), 054103 (2015). http://dx.doi.org/10.1117/1.OE.54.5.054103 Google Scholar

8. 

S. W. Park et al., “3D displacement measurement model for health monitoring of structures using a motion capture system,” Measurement, 59 352 –362 (2015). http://dx.doi.org/10.1016/j.measurement.2014.09.063 0263-2241 Google Scholar

9. 

Y. Arai, “Development of in-plane and out-of-plane deformations simultaneous measurement method for the analysis of buckling,” Opt. Eng., 54 (2), 024102 (2015). http://dx.doi.org/10.1117/1.OE.54.2.024102 Google Scholar

10. 

P. J. Figueroa, N. J. Leite and R. M. L. Barros, “Tracking soccer players aiming their kinematical motion analysis,” Comput. Vis. Image Understanding, 101 122 –135 (2006). http://dx.doi.org/10.1016/j.cviu.2005.07.006 CVIUF4 1077-3142 Google Scholar

11. 

R. Aharoni et al., “Real-time stand-off spatial detection and identification of gases and vapor using external-cavity quantum cascade laser open-path spectrometer,” Opt. Eng., 54 (6), 067103 (2015). http://dx.doi.org/10.1117/1.OE.54.6.067103 Google Scholar

12. 

F. Cheli et al., “Vision-based measuring system for rider’s pose estimation during motorcycle riding,” Mech. Syst. Signal Process., 38 399 –410 (2013). http://dx.doi.org/10.1016/j.ymssp.2013.01.009 Google Scholar

13. 

Y. Shao, Y. Guo and C. Gao, “Human action recognition using motion energy template,” Opt. Eng., 54 (6), 063107 (2015). http://dx.doi.org/10.1117/1.OE.54.6.063107 Google Scholar

14. 

F. C. Trigo et al., “Identification of a scaled-model riser dynamics through a combined computer vision and adaptive Kalman filter approach,” Mech. Syst. Signal Process., 43 124 –140 (2014). http://dx.doi.org/10.1016/j.ymssp.2013.10.005 Google Scholar

15. 

J. Guo, “Dynamic displacement measurement of large scale structures based on the Lucas–Kanade template tracking algorithm,” Mech. Syst. Signal Process., 66 425 –436 (2015). http://dx.doi.org/10.1016/j.ymssp.2015.06.004 Google Scholar

16. 

Y. Song et al., “Virtual visual sensors and their application in structural health monitoring,” Struct. Health Monit. An Int. J., 13 251 –264 (2014). http://dx.doi.org/10.1177/1475921714522841 Google Scholar

17. 

U. Yang et al., “Illumination-invariant color space and its application to skin-color detection,” Opt. Eng., 49 (10), 107004 (2010). http://dx.doi.org/10.1117/1.3497058 Google Scholar

18. 

J. J. Lee, H. N. Ho and J. H. Lee, “A vision-based dynamic rotational angle measurement system for large civil structures,” Sensors, 12 7326 –7336 (2012). http://dx.doi.org/10.3390/s120607326 SNSRES 0746-9462 Google Scholar

19. 

H. S. Park, “A new position measurement system using a motion-capture camera for wind tunnel tests,” Sensors, 13 12329 –12344 (2013). http://dx.doi.org/10.3390/s130912329 SNSRES 0746-9462 Google Scholar

20. 

J. Sadek et al., “Development of a vision based deflection measurement system and its accuracy assessment,” Measurement, 46 1237 –1249 (2013). http://dx.doi.org/10.1016/j.measurement.2012.10.021 0263-2241 Google Scholar

21. 

J. Morlier and G. Michon, “Virtual vibration measurement using KLT motion tracking algorithm,” J. Dyn. Syst. Meas. Control, 132 011003 (2010). http://dx.doi.org/10.1115/1.4000070 JDSMAA 0022-0434 Google Scholar

22. 

B. Ko and S. Kwak, “Survey of computer vision-based natural disaster warning systems machine vision and applications,” Opt. Eng., 51 (7), 070901 (2012). http://dx.doi.org/10.1117/1.OE.51.7.070901 Google Scholar

23. 

H. Wang et al., “Vision-based vehicle detection and tracking algorithm design,” Opt. Eng., 48 (2), 127201 (2009). http://dx.doi.org/10.1117/1.3269685 Google Scholar

24. 

A. Jaume-i-Capo et al., “Automatic human body modeling for vision-based motion capture system using B-spline parameterization of the silhouette,” Opt. Eng., 51 (2), 020501 (2012). http://dx.doi.org/10.1117/1.OE.51.2.020501 Google Scholar

25. 

C. Quan et al., “Determination of three-dimensional displacement using two-dimensional digital image correlation,” Appl. Opt., 47 583 –593 (2008). http://dx.doi.org/10.1364/AO.47.000583 APOPAI 0003-6935 Google Scholar

26. 

S. W. Kim et al., “Vision based monitoring system for evaluating cable tensile forces on a cable-stayed bridge,” Struct. Health Monit. An Int. J., 12 440 –456 (2013). http://dx.doi.org/10.1177/1475921713500513 Google Scholar

27. 

E. S. Bell, J. T. Peddle and A. Goudreau, “Bridge condition assessment using digital image correlation and structural modeling,” in 6th Int. Conf. on Bridge Maintenance, Safety and Management, 330 –337 (2012). Google Scholar

28. 

W. Tong, “Formulation of Lucas–Kanade digital image correlation algorithms for noncontact deformation measurements: a review,” Strain, 49 313 –334 (2013). http://dx.doi.org/10.1111/str.12039 Google Scholar

29. 

J. W. Park et al., “Vision based displacement measurement method for high-rise building structures using partitioning approach,” NDT&E Int., 43 642 –647 (2010). http://dx.doi.org/10.1016/j.ndteint.2010.06.009 Google Scholar

30. 

A. M. Wahbeh et al., “A vision-based approach for the direct measurement of displacements in vibrating systems,” Smart Mater. Struct., 12 785 –794 (2003). http://dx.doi.org/10.1088/0964-1726/12/5/016 SMSTER 0964-1726 Google Scholar

31. 

Y. Fukuda et al., “Vision-based displacement sensor for monitoring dynamic response using robust object search algorithm,” IEEE Sensors J., 13 4725 –4732 (2013). http://dx.doi.org/10.1109/JSEN.2013.2273309 ISJEAZ 1530-437X Google Scholar

32. 

D. Feng et al., “A vision based sensor for noncontact structural displacement measurement,” Sensors, 15 16557 –16575 (2015). http://dx.doi.org/10.3390/s150716557 SNSRES 0746-9462 Google Scholar

33. 

M. Guizar-Sicairos, S. T. Thurman and J. R. Fienup, “Efficient subpixel image registration algorithms,” Opt. Lett., 33 156 –158 (2008). http://dx.doi.org/10.1364/OL.33.000156 OPLEDP 0146-9592 Google Scholar

34. 

J. J. Lee and M. Shinozuka, “Real-time displacement measurement of a flexible bridge using digital image processing techniques,” Exp. Mech., 46 105 –114 (2006). http://dx.doi.org/10.1007/s11340-006-6124-2 EXMCAZ 0014-4851 Google Scholar

35. 

D. You, X. Gao and S. Katayama, “Monitoring of high power laser welding using high-speed photographing and image processing,” Mech. Syst. Signal Process., 49 39 –52 (2014). http://dx.doi.org/10.1016/j.ymssp.2013.10.024 Google Scholar

36. 

P. Bing et al., “Performance of sub-pixel registration algorithms in digital image correlation,” Meas. Sci. Technol., 17 1615 –1621 (2006). http://dx.doi.org/10.1088/0957-0233/17/6/045 MSTCEP 0957-0233 Google Scholar

37. 

Z. Zhang and R. Wang, “Robust image super resolution method to handle localized motion outliers,” Opt. Eng., 48 (7), 077005 (2009). http://dx.doi.org/10.1117/1.3159871 Google Scholar

38. 

L. Li et al., “Subpixel flood inundation mapping from multispectral remotely sensed images based on discrete particle swarm optimization,” J. Photogramm. Remote Sens., 101 10 –21 (2015). http://dx.doi.org/10.1016/j.isprsjprs.2014.11.006 Google Scholar

39. 

S. Chan et al., “Subpixel motion estimation without interpolation,” in 2010 IEEE Int. Conf. on Acoustics, Speech and Signal Processing, 722 –725 (2010). http://dx.doi.org/10.1109/ICASSP.2010.5495054 Google Scholar

40. 

D. Fleet and Y. Weiss, Handbook of Mathematical Models in Computer Vision, 239 –256 Springer, New York (2006). Google Scholar

41. 

X. Lei et al., “Vibration extraction based on fast NCC algorithm and high-speed camera,” Appl. Opt., 54 8198 –8206 (2015). http://dx.doi.org/10.1364/AO.54.008198 APOPAI 0003-6935 Google Scholar

Biography

Bingyou Liu received his BS and MS degrees in detection technology and automation devices from Anhui Polytechnic University in 2003 and 2008, respectively. He is an associate professor at Anhui Polytechnic University. He has authored 15 journal papers. His current research interests include optoelectronic systems, intelligent control, and weak signal detection.

Dashan Zhang received his BS degree in mechanical engineering from Guizhou University in 2012. Currently, he is a PhD candidate in the Department of Precision Machinery and Precision Instrumentation at the University of Science and Technology of China. His research interests include image processing and optical measurement.

Jie Guo received his BS and PhD degrees in mechanical engineering from the University of Science and Technology of China, Hefei, China, in 2010 and 2015, respectively. Currently, he is a postdoctoral fellow with the Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China. His current research interests include machine vision, pattern recognition, machine learning, and fault diagnosis.

Chang’an Zhu received his BS degree from HeFei University of Technology in 1982, his MS degree from Xidian University in 1985, and his PhD from National University of Defense Technology in 1989. He is a professor at the University of Science and Technology of China. His current research interests include intelligent control, fault diagnosis technology, and advanced manufacturing technology.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Bingyou Liu, Dashan Zhang, Jie Guo, and Chang’an Zhu "Vision-based displacement measurement sensor using modified Taylor approximation approach," Optical Engineering 55(11), 114103 (14 November 2016). https://doi.org/10.1117/1.OE.55.11.114103