Wavefront-sensing-based autofocusing in microscopy
Jing Xu, Xiaolin Tian, Xin Meng, Yan Kong, Shumei Gao, Haoyang Cui, Fei Liu, Liang Xue, Cheng Liu, Shouyu Wang
Abstract
Massive image acquisition along the optical axis is required in the classical image-analysis-based autofocus method, which significantly decreases autofocus efficiency. A wavefront-sensing-based autofocus technique is proposed to increase autofocusing speed while retaining high localization accuracy. After the wavefront at the defocus position is extracted with the transport-of-intensity equation method, intensities at different planes along the optical axis can be computed numerically. The focal plane can then be determined according to a focus criterion, and after the sample is shifted to this plane, the in-focus image can be recorded. The proposed approach allows fast, precise focus detection with far fewer image acquisitions than classical image-analysis-based autofocus techniques, and it can be applied in commercial microscopes with only an extra illumination filter.

1. Introduction

As a classical technique for microsample observation and measurement, microscopic imaging is widely used in both biological research and medical diagnosis. Accurate focusing is often essential for capturing sample details with high resolution and contrast. Though the optimal focal position can be found manually, manual focusing is inefficient and imprecise; as a result, commercial microscopes are often equipped with an autofocus system to increase focusing speed and accuracy.1,2 There are two main types of autofocus methods, based on laser reflection and on image analysis, respectively. Autofocusing via laser reflection can quickly locate the sample slice at a fixed plane without massive computation such as image read-in and processing; however, it fails when the sample's distance from the reference surface varies (e.g., when the coverslip thickness deviates from its standard value), since the focal plane is assumed to lie at a fixed distance above the reference surface.3 To overcome this shortcoming, image-analysis-based autofocus techniques were proposed: with many intensities recorded by scanning the sample (or micro-objective) stage along the optical axis, the focal position can be determined according to the contrast, resolution, frequency components, or entropy extracted from the captured images.4-6 This approach avoids focusing errors due to surface variations and, additionally, determines the focal position directly from the captured intensities; as a result, image-analysis-based autofocusing has been adopted in a majority of commercial microscopes because of its accuracy and robustness. However, it still suffers from disadvantages: first, the autofocus efficiency is limited by the substantial image acquisition requirements; second, the effective autofocus range is narrow, often within several micrometers, so the method fails in large-defocus cases. To realize autofocusing with high speed and accuracy, as well as to extend the effective range, Zheng's group designed rapid autofocus methods similar to those adopted in professional photography.7,8 Unfortunately, these methods require modifying the microscope with extra pinhole-modulated cameras, and incorrect installation often introduces errors that reduce the focusing accuracy. Based on the Brenner gradient, Yazdanfar et al.9 designed a three-shot autofocus method that localizes the focal plane with high accuracy and speed, making it a promising approach, especially in microscopy; however, when the sample is located far from the focal plane, the sensitivity of the Brenner gradient is remarkably reduced, limiting its application in large-defocus cases. Ferraro's group proposed autofocus methods in digital holography to retrieve high-quality in-focus images from holograms,10-12 and based on these techniques, they also realized highly accurate three-dimensional image reconstruction and particle tracking.13-15 However, because these methods demand a coherent source and an extra reference beam, they can hardly be integrated into commercial microscopes. To obtain high focusing speed and accuracy while avoiding imaging system modifications, in this paper, we propose a wavefront-sensing-based autofocus technique relying on wavefront retrieval, propagation, and analysis.16-22 The proposed method can be directly applied in commercial microscopes with only an extra illumination filter.
Verified by both numerical simulations and experiments, the newly designed approach can precisely determine the focal plane using relatively few image acquisitions (often <10) over a rather large effective range (over 100 μm). Considering its fast speed, high accuracy, and large effective range, we believe that the proposed method can be adopted for rapid autofocusing in commercial microscopes.

2. Principle

Figure 1 provides a flowchart of the wavefront-sensing-based autofocus method. First, three defocus intensities, equidistant along the optical axis, are captured by shifting the sample stage; note that these images are blurred since the sample is not located at the focal plane. Next, the phase distribution is retrieved by solving the transport-of-intensity equation (TIE)23-30 shown in Eq. (1), in which ∂I/∂z is the axial intensity derivative (computed from the images captured at different planes), k is the wave number, and φ is the phase distribution, which can be retrieved with a widely used fast Fourier transform-based solver31

Eq. (1)

$k\dfrac{\partial I(x,y)}{\partial z} = -\nabla\cdot\left[I(x,y)\,\nabla\varphi(x,y)\right].$
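For illustration, a minimal Python sketch of a standard FFT-based TIE solver (Teague's auxiliary-function form) is given below. The solver actually used in this work follows Ref. 31, which additionally exploits higher-order intensity derivatives, so this sketch should be read as a simplified stand-in under stated assumptions rather than the exact implementation; all function and variable names are ours.

```python
import numpy as np

def tie_phase_fft(I_minus, I_focus, I_plus, dz, wavelength, pixel):
    """Retrieve phase from three defocused intensities via the TIE (Teague form).

    I_minus/I_plus: intensities recorded dz below/above the central plane;
    I_focus: intensity at the central plane; pixel: effective pixel size
    in the sample plane (camera pixel divided by magnification).
    """
    k = 2 * np.pi / wavelength                   # wave number
    dIdz = (I_plus - I_minus) / (2 * dz)         # axial intensity derivative, Eq. (1)

    ny, nx = I_focus.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel), np.fft.fftfreq(ny, d=pixel))
    q2 = (2 * np.pi) ** 2 * (FX ** 2 + FY ** 2)  # squared spatial frequency
    q2[0, 0] = 1.0                               # placeholder; DC is zeroed below

    def inv_lap(f):
        """Inverse Laplacian computed in the Fourier domain."""
        F = np.fft.fft2(f) / (-q2)
        F[0, 0] = 0.0                            # the DC term is undetermined
        return np.real(np.fft.ifft2(F))

    # Teague's auxiliary function: lap(psi) = -k dI/dz, then grad(phi) = grad(psi)/I
    psi = inv_lap(-k * dIdz)
    gy, gx = np.gradient(psi, pixel)
    I_safe = np.maximum(I_focus, 1e-6 * I_focus.max())  # avoid division by ~0
    div = (np.gradient(gx / I_safe, pixel, axis=1)
           + np.gradient(gy / I_safe, pixel, axis=0))
    return inv_lap(div)                          # phase map in radians
```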

Fig. 1

Flowchart of the wavefront-sensing-based autofocus method.


Once the phase is obtained, it is combined with the amplitude directly extracted from the captured image, and intensities at different planes along the optical axis are numerically computed via wavefront propagation using the angular spectrum method. Then, the focal plane can be determined by evaluating these numerically propagated intensities with a focus criterion, such as a derivative-based, statistical, or intuitive algorithm.32 Here, we choose the Tamura-coefficient-based focus criterion (TC), which is robust in focus determination and has good noise suppression capability.33 This criterion quantifies the contrast between the gray-level variation and the average of an intensity distribution I, as shown in Eq. (2), in which σ(I) and ⟨I⟩ indicate the standard deviation and mean of the gray-level image, respectively. The focal plane is determined by tracking the TC peak

Eq. (2)

$\mathrm{TC} = \sqrt{\dfrac{\sigma(I)}{\langle I\rangle}}.$
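The two numerical ingredients of this step, angular spectrum propagation and TC evaluation, can be sketched in Python as follows. This is a minimal illustration under stated assumptions, not the implementation used in the paper: the function names are ours, and the square-root form of TC matches Eq. (2) as reconstructed above (any monotone variant of TC peaks at the same plane).

```python
import numpy as np

def angular_spectrum(field, z, wavelength, pixel):
    """Propagate a complex field over a distance z with the angular spectrum method."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel), np.fft.fftfreq(ny, d=pixel))
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))  # drop evanescent part
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def tamura(I):
    """Tamura-coefficient focus measure, Eq. (2)."""
    return np.sqrt(I.std() / I.mean())

def focus_search(amplitude, phase, z_offsets, wavelength, pixel):
    """Return the propagation offset whose refocused intensity maximizes the TC."""
    field = amplitude * np.exp(1j * phase)     # complex field at the defocus plane
    scores = [tamura(np.abs(angular_spectrum(field, z, wavelength, pixel)) ** 2)
              for z in z_offsets]
    return z_offsets[int(np.argmax(scores))]   # peak TC marks the focal plane
```

With the 0.5-μm step reported in Sec. 3, a grid such as z_offsets = np.arange(-60e-6, 60e-6, 0.5e-6) (values in meters) reproduces the discrete focus search, and the amplitude can be taken as the square root of the recorded central intensity.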

Finally, after the sample stage is shifted to the determined focal plane, one more processing loop is performed for verification; if the newly determined focal plane coincides with that from the previous loop, the focal plane has been identified; otherwise, another processing loop is executed until the focal positions determined by two successive loops coincide. Compared to the widely used image-analysis-based autofocus technique, which requires massive numbers of image captures, the newly designed method needs only 2 to 3 loops with <10 intensity captures. Moreover, considering the time saved in sample stage scanning, the proposed method can achieve a comparatively rapid autofocusing speed.
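The full procedure, including the confirmation loop, can then be summarized as below. This is a hedged sketch rather than the authors' code: capture_stack() and move_stage() are hypothetical placeholders for the camera and stage drivers, while tie_phase_fft() and focus_search() refer to the sketches above.

```python
import numpy as np

def autofocus(z0, wavelength, pixel, dz=2.0e-6, step=0.5e-6, span=60e-6):
    """Iterate capture -> retrieve -> propagate -> evaluate until two loops agree."""
    z = z0                                        # current stage position
    while True:
        I_m, I_0, I_p = capture_stack(z, dz)      # three images spaced +/- dz apart
        phi = tie_phase_fft(I_m, I_0, I_p, dz, wavelength, pixel)
        offsets = np.arange(-span, span + step, step)
        best = focus_search(np.sqrt(I_0), phi, offsets, wavelength, pixel)
        if abs(best) < step:                      # two successive loops coincide
            return z                              # focal plane identified
        z += best                                 # shift the stage and verify again
        move_stage(z)
```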

3. Numerical Simulations and Experiments

To prove the feasibility of the proposed autofocus technique, a numerical simulation was first implemented as shown in Fig. 2. To mimic a practical microscopic configuration, a lens system with 10× magnification, representing both the micro-objective and the tube lens of a commercial microscope, was introduced in the simulation model. The preset intensity and phase distributions are shown in Fig. 2(b). Random Gaussian noise was added to generate a signal-to-noise ratio of 30 dB, close to that estimated from experiments. The wavelength was set to 532 nm and the pixel size to 8.3 μm, both in accordance with the experimental devices. When the sample was located 40.0 μm away from the focal plane, only the blurred image in Fig. 2(c) could be obtained. Using the TIE algorithm, the phase at the defocus plane shown in Fig. 2(c) was extracted from another two symmetric defocus images with a separation of 4.0 μm. Then, the intensities at different planes along the optical axis were numerically computed from the extracted defocus wavefront. Figure 2(d) shows the quantitative evaluation of these numerically propagated intensities using the TC; the results indicate a defocus distance of approximately 40.0 μm, proving the feasibility of the wavefront-sensing-based autofocus technique. Moreover, to quantitatively analyze the accuracy of the proposed method, various patterns with defocus distances from 10.0 to 80.0 μm were used as examples; the error (the interval between the determined focal plane and the real one) was statistically computed as 0.16±0.05 μm over all simulated data, indicating the high accuracy of the proposed method.3,5
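For instance, 30-dB Gaussian noise can be added to a simulated intensity as follows (a convenience function of our own, not the authors' simulation code):

```python
import numpy as np

def add_gaussian_noise(I, snr_db=30.0, seed=0):
    """Add white Gaussian noise to an intensity image at a given SNR in dB."""
    rng = np.random.default_rng(seed)
    signal_power = np.mean(I.astype(float) ** 2)
    noise_power = signal_power / 10.0 ** (snr_db / 10.0)
    return I + rng.normal(0.0, np.sqrt(noise_power), I.shape)
```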

Fig. 2

Numerical verification of the wavefront-sensing-based autofocus technique. (a) Optical system in numerical simulations, (b) preset intensity and phase distributions, (c) defocus intensity and phase distributions, and (d) numerical quantitative evaluation of intensities of the propagated wavefront using TC.


Next, the proposed method was adopted for autofocusing in a commercial upright microscope (Mshot ML-32, China), in which the sample position can be scanned along the optical axis by a motorized sample stage (Mshot MS-300, China) with a step of 0.5 μm. A CCD camera (AVT Prosilica GC780, Germany) with a pixel size of 8.3 μm was used for image recording. Narrowband illumination at 532 nm with a FWHM of 10 nm was generated with an interference filter (Daheng Optics GCC-202003, China) to guarantee high-accuracy phase retrieval. In addition, Köhler illumination was implemented, and the condenser aperture was set to 40% of the objective aperture to ensure partially spatially coherent illumination.34-37 Plant rhizome cross cuttings and red blood cell smears (Keda Biological Sample Company, China) were used as samples. Before experimental verification of the wavefront-sensing-based autofocusing, the real focal plane was first determined using the classical image-analysis-based technique; the defocus distance could then be precisely set by moving the sample to a defocus plane. For the plant rhizome cross cutting, a 10× micro-objective was used, and the defocus distance was set to 40.0 μm in the sample plane.

Figure 3(a1) shows the recorded defocus intensity. The defocus phase in Fig. 3(b1) was computed by combining another two symmetric defocus images recorded around the central one with a separation of 4.0 μm. Then, using wavefront propagation, the intensities at different positions along the optical axis were numerically computed from the extracted complex amplitude. Because the minimum shifting step of the motorized sample stage was 0.5 μm (the same hardware limit that restricts the scanning step of the classical image-analysis-based method), the numerical wavefront propagation step in our method was also set to 0.5 μm. After evaluation by the TC, the focal plane could be determined as shown in Fig. 3(c1), which indicates a defocus interval of 40.0 μm, coinciding with the preset value. Though the numerically retrieved in-focus intensity in Fig. 3(d1) improved the image quality, it remained limited by errors and information loss in wavefront retrieval and propagation. Finally, after the sample stage was shifted to the determined focal plane, the real in-focus intensity was captured as shown in Fig. 3(e1). The proposed autofocus method was also tested on a red blood cell smear, a discrete sample in contrast to the continuous plant rhizome cross cutting. In this case, a 40× micro-objective was used, and the defocus distance was set to 10.0 μm in the sample plane. The defocus intensity and phase are shown in Figs. 3(a2) and 3(b2), respectively. Using the proposed approach, the real focal plane could be accurately localized, as shown in Fig. 3(c2), by searching for the numerically propagated in-focus image shown in Fig. 3(d2). Finally, the real in-focus image in Fig. 3(e2) was recorded after the sample stage was shifted to the determined focal plane. Though another loop was required for focal plane confirmation, only images captured at six planes were needed. Considering the discrete step used in wavefront propagation, the Tamura peaks were always located at the correct focal plane, indicating that the error of the proposed method stayed within [-0.25, 0.25] μm, which also fits well with the numerical simulation results. For the practical case shown in Figs. 3(a1)-3(e1) with a defocus distance of 40 μm, the classical image-analysis-based technique captures multifocal images with a step of 0.5 μm while scanning the sample along the imaging axis; moreover, to confirm the focal plane, extra scanning past the focal position (10 μm) is still required. Therefore, in this case, 100 multifocal images were captured along a scanning distance of 50 μm in the sample plane, whereas only six image captures were enough to localize the focal plane with the same accuracy using our proposed wavefront-sensing-based technique. Compared to the traditional image-analysis-based technique, this method considerably reduces image captures and time consumption. With traditional image-analysis-based autofocusing, a total of 4.88 s was needed on our self-built autofocus system, including 1.25 s for sample stage translation, 1.40 s for image recording, and 2.23 s for digital image processing (on a desktop with an Intel Core i5-3470 CPU at 3.20 GHz and 4 GB of RAM). In contrast, the time consumption of the proposed autofocus method was 3.45 s, with 0.40 s for stage moving, 0.14 s for image recording, and 2.91 s for digital image processing.
In traditional autofocusing, the required scanning distance for stage translation was longer than in the proposed method, and the stage had to be translated many times for image recording, so more time was spent than in the proposed approach. Conversely, though the proposed autofocusing does not require massive image read-in, phase retrieval and wavefront propagation are indispensable and consume extra time: compared to the image-analysis-based technique, the proposed method spends more time in digital image processing, since a large number of numerical wavefront propagations and in-focus evaluations are needed to maintain a wide focus-searching range and high focus determination accuracy. Moreover, these computations were executed as time-consuming sequential calculations, which clearly reduced the computational efficiency. In future work, introducing graphics processing unit computation for both numerical wavefront propagation and in-focus evaluation will markedly accelerate the image processing, making the advantage of the proposed technique in processing time even more significant.

Fig. 3

Experimental verification of the wavefront-sensing-based autofocusing in (1) a plant rhizome cross cutting and (2) a red blood cell smear. (a) Defocus intensity, (b) phase distribution, (c) numerical quantitative evaluation of intensities of the propagated wavefront, (d) numerically propagated in-focus image, and (e) real in-focus image obtained via sample stage shifting.


Figures 4(a) and 4(b) present quantitative comparisons of different focus criteria for the plant rhizome cross cutting shown in Fig. 3. In addition to the adopted TC, the Brenner gradient,38 image power,39 and autocorrelation40,41 methods were also implemented, representing derivative-based, intuitive, and statistical algorithms, respectively. All these focus criteria, including the Tamura-coefficient-based one, precisely determine the focal plane not only in the classical image-analysis-based autofocus method [Fig. 4(a)] but also in the proposed wavefront-sensing-based technique [Fig. 4(b)], confirming the capability of the TC.
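For reference, minimal single-image implementations of the three comparison criteria might look as follows. These are the standard textbook forms; the exact variants and any thresholds used in Refs. 38-41 may differ.

```python
import numpy as np

def brenner(I):
    """Brenner gradient (derivative-based): squared difference at a 2-pixel shift."""
    I = I.astype(float)
    return np.sum((I[:, 2:] - I[:, :-2]) ** 2)

def image_power(I, threshold=0.0):
    """Image power (intuitive): sum of squared gray levels above a threshold."""
    I = I.astype(float)
    return float(np.sum(np.where(I > threshold, I ** 2, 0.0)))

def vollath_f4(I):
    """Vollath autocorrelation (statistical): lag-1 minus lag-2 correlation."""
    I = I.astype(float)
    return np.sum(I[:, :-1] * I[:, 1:]) - np.sum(I[:, :-2] * I[:, 2:])
```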

Fig. 4

Experimental verification of TC by comparing with other criteria in both the (a) image-analysis-based autofocusing and (b) wavefront-sensing-based autofocusing.


If the sample has a large defocus distance, a single processing loop cannot localize the focal plane because of detail loss in wavefront extraction and errors in numerical wavefront propagation. However, the focal plane can still be determined with additional processing loops. Figure 5 provides two examples of autofocusing in large-defocus cases. For the plant rhizome cross cutting, the defocus distance was set to 120.0 μm; both the defocus intensity and phase are shown in Fig. 5(a1). After the first processing loop, Fig. 5(b1) shows the determined focal position, which was only 31.0 μm away from the real focal plane. Next, the second processing loop was implemented as shown in Figs. 5(c1) and 5(d1); the focal plane could be precisely determined after confirmation by an additional processing loop. Finally, the real in-focus image was captured, as shown in Fig. 5(e1). For the red blood cell smear with a large defocus distance of 30.0 μm, the focal plane could likewise be determined using extra processing loops, as shown in Figs. 5(a2) to 5(e2), proving that the wavefront-sensing-based autofocus method can precisely localize the focal plane even in large-defocus cases. Additionally, the effective autofocus ranges for the 40× and 10× micro-objectives, estimated from practical experiments, were ±50 μm and ±170 μm around the focal plane, respectively. When the sample lies outside the effective range, the proposed method cannot provide a location close to the actual focal plane because of high-frequency component loss and a low signal-to-noise ratio, which prevents the focal-position search from converging. Though the autofocus method designed by Yazdanfar et al.9 can rapidly and precisely localize the focal plane with only three multifocal intensity captures, its sensitivity is low in large-defocus conditions, and its effective ranges for the 40× and 10× micro-objectives, also estimated from practical experiments, were only around ±20 μm and ±50 μm. Compared to this three-shot autofocus technique, the effective ranges of the proposed method are relatively large, indicating that it can be successfully used in a wide range of applications.

Fig. 5

Experimental verification of wavefront-sensing-based autofocusing with a large defocus distance in (1) a plant rhizome cross cutting and (2) a red blood cell smear. (a) Defocus intensity/phase distributions and (b) numerical quantitative evaluation of intensities of the propagated wavefront from the first processing loop, (c) defocus intensity/phase distributions and (d) numerical quantitative evaluation of intensities of the propagated wavefront from the second processing loop, and (e) real in-focus image.


4. Conclusion

We combine wavefront retrieval, propagation, and analysis to realize wavefront-sensing-based autofocusing in microscopy. Since far less image recording is required than in the widely used image-analysis-based technique, the proposed method typically achieves faster autofocusing. Moreover, even if the initial sample location is far from the focal plane, the method can still effectively localize the focal position, as proven by both numerical simulations and practical measurements. Additionally, the newly designed autofocus method can be applied in commercial microscopes with only an extra illumination filter. Considering its high processing efficiency and large effective range, as well as its simple design and operation, the proposed approach is a good candidate for future adoption as a primary method of rapid autofocusing in microscopy.

Disclosures

S.W. and C.L. report patents, owned by Jiangnan University, related to the autofocus technique described in this paper.

Acknowledgments

The work was supported by the National Natural Science Foundation of China (Nos. 61705092, 11647144, and 31522056), the Natural Science Foundation of Jiangsu Province of China (Nos. BK20130162 and BK20170194), the Shanghai Sailing Program (No. 17YF1407000), the Fundamental Research Funds for the Central Universities (Nos. JUSRP115A14 and JUSRP51721B), and the Local Colleges and Universities Capacity Building Program (Nos. 15110500900 and 14110500900). The authors thank Professor Haijiao Jiang at the Nanjing Institute of Astronomical Optics and Technology, Chinese Academy of Sciences, for experimental support and help with paper preparation.

References

1. J. M. Castillo-Secilla et al., "Autofocus method for automated microscopy using embedded GPUs," Biomed. Opt. Express 8(3), 1731-1740 (2017). http://dx.doi.org/10.1364/BOE.8.001731
2. Z. Wang et al., "Compact multi-band fluorescent microscope with an electrically tunable lens for autofocusing," Biomed. Opt. Express 6(11), 4353-4364 (2015). http://dx.doi.org/10.1364/BOE.6.004353
3. M. C. Montalto, R. R. McKay, and R. J. Filkins, "Autofocus methods of whole slide imaging systems and the introduction of a second-generation independent dual sensor scanning method," J. Pathol. Inf. 2(1), 44 (2011). http://dx.doi.org/10.4103/2153-3539.86282
4. L. Firestone et al., "Comparison of autofocus methods for automated microscopy," Cytometry 12(3), 195-206 (1991). http://dx.doi.org/10.1002/(ISSN)1097-0320
5. R. R. McKay, V. A. Baxi, and M. C. Montalto, "The accuracy of dynamic predictive autofocusing for whole slide imaging," J. Pathol. Inf. 2(1), 38 (2011). http://dx.doi.org/10.4103/2153-3539.84231
6. R. Redondo et al., "Autofocus evaluation for brightfield microscopy pathology," J. Biomed. Opt. 17(3), 036008 (2012). http://dx.doi.org/10.1117/1.JBO.17.3.036008
7. K. Guo et al., "InstantScope: a low-cost whole slide imaging system with instant focal plane detection," Biomed. Opt. Express 6(9), 3210-3216 (2015). http://dx.doi.org/10.1364/BOE.6.003210
8. J. Liao et al., "Single-frame rapid autofocusing for brightfield and fluorescence whole slide imaging," Biomed. Opt. Express 7(11), 4763-4768 (2016). http://dx.doi.org/10.1364/BOE.7.004763
9. S. Yazdanfar et al., "Simple and robust image-based autofocusing for digital microscopy," Opt. Express 16(12), 8670-8677 (2008). http://dx.doi.org/10.1364/OE.16.008670
10. P. Ferraro et al., "Digital holographic microscope with automatic focus tracking by detecting sample displacement in real time," Opt. Lett. 28(14), 1257-1259 (2003). http://dx.doi.org/10.1364/OL.28.001257
11. P. Memmolo et al., "Automatic focusing in digital holography and its application to stretched holograms," Opt. Lett. 36(10), 1945-1947 (2011). http://dx.doi.org/10.1364/OL.36.001945
12. A. Pelagotti et al., "An automatic method for assembling a large synthetic aperture digital hologram," Opt. Express 20(5), 4830-4839 (2012). http://dx.doi.org/10.1364/OE.20.004830
13. P. Ferraro et al., "Controlling depth of focus in 3D image reconstructions by flexible and adaptive deformation of digital holograms," Opt. Lett. 34(18), 2787-2789 (2009). http://dx.doi.org/10.1364/OL.34.002787
14. L. Miccio et al., "Particle tracking by full-field complex wavefront subtraction in digital holography microscopy," Lab Chip 14(6), 1129-1134 (2014). http://dx.doi.org/10.1039/C3LC51104A
15. P. Memmolo et al., "On the holographic 3D tracking of in vitro cells characterized by a highly-morphological change," Opt. Express 20(27), 28485-28493 (2012). http://dx.doi.org/10.1364/OE.20.028485
16. M. Mir et al., "Quantitative phase imaging," Prog. Opt. 57, 133-217 (2012). http://dx.doi.org/10.1016/B978-0-44-459422-8.00003-5
17. V. Akondi, S. Castillo, and B. Vohnsen, "Multi-faceted digital pyramid wavefront sensor," Opt. Commun. 323, 77-86 (2014). http://dx.doi.org/10.1016/j.optcom.2014.03.004
18. J. Polans et al., "Compressed wavefront sensing," Opt. Lett. 39(5), 1189-1192 (2014). http://dx.doi.org/10.1364/OL.39.001189
19. D. Débarre, M. J. Booth, and T. Wilson, "Image based adaptive optics through optimisation of low spatial frequencies," Opt. Express 15(13), 8176-8190 (2007). http://dx.doi.org/10.1364/OE.15.008176
20. T. H. Nguyen et al., "Automatic Gleason grading of prostate cancer using quantitative phase imaging and machine learning," J. Biomed. Opt. 22(3), 036015 (2017). http://dx.doi.org/10.1117/1.JBO.22.3.036015
21. Y. K. Park et al., "Fresnel particle tracing in three dimensions using diffraction phase microscopy," Opt. Lett. 32(7), 811-813 (2007). http://dx.doi.org/10.1364/OL.32.000811
22. A. Facomprez, E. Beaurepaire, and D. Débarre, "Accuracy of correction in modal sensorless adaptive optics," Opt. Express 20(3), 2598-2612 (2012). http://dx.doi.org/10.1364/OE.20.002598
23. L. Waller et al., "Phase from chromatic aberrations," Opt. Express 18(22), 22817-22825 (2010). http://dx.doi.org/10.1364/OE.18.022817
24. Z. Jingshan et al., "Transport of intensity phase imaging by intensity spectrum fitting of exponentially spaced defocus planes," Opt. Express 22(9), 10661-10674 (2014). http://dx.doi.org/10.1364/OE.22.010661
25. X. Tian et al., "Real-time quantitative phase imaging based on transport of intensity equation with dual simultaneously recorded field of view," Opt. Lett. 41(7), 1427-1430 (2016). http://dx.doi.org/10.1364/OL.41.001427
26. W. Yu et al., "Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method," Appl. Phys. Lett. 109(7), 071112 (2016). http://dx.doi.org/10.1063/1.4961383
27. X. Meng et al., "Smartphone based hand-held quantitative phase microscope using the transport of intensity equation method," Lab Chip 17(1), 104-109 (2017). http://dx.doi.org/10.1039/C6LC01321J
28. L. G. Mesquita, U. Agero, and O. N. Mesquita, "Defocusing microscopy: an approach for red blood cell optics," Appl. Phys. Lett. 88(13), 133901 (2006). http://dx.doi.org/10.1063/1.2189010
29. X. Tian et al., "In-focus quantitative intensity and phase imaging with the numerical focusing transport of intensity equation method," J. Opt. 18(10), 105302 (2016). http://dx.doi.org/10.1088/2040-8978/18/10/105302
30. X. Meng et al., "Rapid in-focus corrections on quantitative amplitude and phase imaging using transport of intensity equation method," J. Microsc. 266(3), 253-262 (2017). http://dx.doi.org/10.1111/jmi.12535
31. L. Waller, L. Tian, and G. Barbastathis, "Transport of intensity phase-amplitude imaging with higher order intensity derivatives," Opt. Express 18(12), 12552-12561 (2010). http://dx.doi.org/10.1364/OE.18.012552
32. Y. Su, S. Duthaler, and B. J. Nelson, "Autofocusing in computer microscopy: selecting the optimal focus algorithm," Microsc. Res. Tech. 65(3), 139-149 (2004). http://dx.doi.org/10.1002/(ISSN)1097-0029
33. H. Tamura, S. Mori, and T. Yamawaki, "Textural features corresponding to visual perception," IEEE Trans. Syst. Man Cybern. 8(6), 460-473 (1978). http://dx.doi.org/10.1109/TSMC.1978.4309999
34. S. S. Kou et al., "Transport-of-intensity approach to differential interference contrast (TI-DIC) microscopy for quantitative phase imaging," Opt. Lett. 35(3), 447-449 (2010). http://dx.doi.org/10.1364/OL.35.000447
35. E. D. Barone-Nugent, A. Barty, and K. A. Nugent, "Quantitative phase-amplitude microscopy I: optical microscopy," J. Microsc. 206(3), 194-203 (2002). http://dx.doi.org/10.1046/j.1365-2818.2002.01027.x
36. C. J. R. Sheppard, "Defocused transfer function for a partially coherent microscope and application to phase retrieval," J. Opt. Soc. Am. A 21(5), 828-831 (2004). http://dx.doi.org/10.1364/JOSAA.21.000828
37. C. J. R. Sheppard, "Three-dimensional phase imaging with the intensity transport equation," Appl. Opt. 41(28), 5951-5955 (2002). http://dx.doi.org/10.1364/AO.41.005951
38. J. F. Brenner et al., "An automated microscope for cytologic research: a preliminary evaluation," J. Histochem. Cytochem. 24(1), 100-111 (1976). http://dx.doi.org/10.1177/24.1.1254907
39. A. Santos et al., "Evaluation of autofocus functions in molecular cytogenetic analysis," J. Microsc. 188(3), 264-272 (1997). http://dx.doi.org/10.1046/j.1365-2818.1997.2630819.x
40. D. Vollath, "Automatic focusing by correlative methods," J. Microsc. 147(3), 279-288 (1987). http://dx.doi.org/10.1111/jmi.1987.147.issue-3
41. D. Vollath, "The influence of the scene parameters and of noise on the behaviour of automatic focusing algorithms," J. Microsc. 151(2), 133-146 (1988). http://dx.doi.org/10.1111/j.1365-2818.1988.tb04620.x


CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Jing Xu, Xiaolin Tian, Xin Meng, Yan Kong, Shumei Gao, Haoyang Cui, Fei Liu, Liang Xue, Cheng Liu, and Shouyu Wang "Wavefront-sensing-based autofocusing in microscopy," Journal of Biomedical Optics 22(8), 086012 (30 August 2017). https://doi.org/10.1117/1.JBO.22.8.086012
Received: 26 May 2017; Accepted: 10 August 2017; Published: 30 August 2017
Keywords: Wavefronts, Wave propagation, Microscopy, Microscopes, Image acquisition, Image processing, Numerical simulations
