Partially compensated deconvolution from wavefront sensing images of the eye fundus


Optics Communications 284 (2011) 1548–1552


Justo Arines ⁎

Departamento de Física Aplicada (Área de Óptica), Escola Universitaria de Óptica e Optometría (Campus Sur), 15782 Santiago de Compostela, Spain
Departamento de Física Aplicada (Área de Óptica), Facultad de Ciencias, Universidad de Zaragoza, c/ Pedro Cerbuna 12, 50009 Zaragoza, Spain

Article history: Received 7 June 2010; Accepted 22 November 2010

Abstract

We present the application of the technique of partially compensated deconvolution from wavefront sensing to obtain high-resolution images of the eye fundus, in which the static component of the subject's ocular aberration is compensated with a customized static phase plate. The technique was tested on an artificial eye and a human eye. The results show the superior quality of the images restored with the proposed technique, and also the possibility of obtaining high-resolution images in real time. © 2010 Elsevier B.V. All rights reserved.

1. Introduction

In recent years, active research has been devoted to obtaining high-resolution images of the eye fundus. Astronomical techniques such as adaptive optics (AO) [1], deconvolution from wavefront sensing (DWFS) [2], and blind deconvolution (BD) [3], initially developed to overcome the image degradation induced by the turbulent atmosphere, have been successfully adapted to ophthalmic applications [4–11], producing in vivo images of anatomical structures that were previously hidden by the ocular optical aberrations.

Among these techniques the most outstanding, for both astronomical and ophthalmic purposes, is AO. The high signal-to-noise ratio of AO images and the possibility of real-time operation are its main advantages. BD has also experienced recent growth in ophthalmic and astronomical applications. This technique is applied directly to the recorded images, so no specific optical setup is involved; however, its performance depends strongly on the signal-to-noise ratio of the degraded image and on the BD algorithm. Finally, DWFS stands between the purely optical (AO systems) and the purely computational (BD algorithms) approaches. It is an image restoration technique developed in the framework of linear shift-invariant systems, where the degradation is described by the system's point spread function (PSF), which is in turn determined by its optical aberrations [2,12,13]. Unlike common deconvolution algorithms, which use predefined models of the PSF, DWFS employs a wavefront sensor to measure the system aberrations and to compute the system's PSF and optical transfer function (OTF) used to build the restoration filter employed in the deconvolution process [14–19].
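The linear-shift-invariant model that DWFS relies on can be illustrated with a minimal numerical sketch (not from the paper; the grid size, wavelength, and the defocus amplitude below are illustrative assumptions): the PSF is the squared modulus of the Fourier transform of the generalized pupil function, so any measured wavefront directly yields the degradation kernel.

```python
import numpy as np

def psf_from_wavefront(wavefront_um, pupil_mask, wavelength_um=0.633, grid=256):
    """PSF as |FFT(generalized pupil)|^2; wavefront given in microns."""
    phase = 2 * np.pi * wavefront_um / wavelength_um
    gp = pupil_mask * np.exp(1j * phase)                 # generalized pupil function
    field = np.fft.fftshift(np.fft.fft2(gp, s=(grid, grid)))
    psf = np.abs(field) ** 2
    return psf / psf.sum()                               # normalize to unit energy

# Circular pupil on a 128x128 grid
n = 128
y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)

flat = psf_from_wavefront(np.zeros((n, n)), pupil)                  # diffraction-limited
aberrated = psf_from_wavefront(0.3 * (2 * r2 - 1) * pupil, pupil)   # defocus-like term

# Aberrations spread light, so the aberrated PSF peak drops (lower Strehl ratio)
print(aberrated.max() / flat.max() < 1.0)
```

Since both PSFs are normalized to unit energy, the peak ratio behaves like a Strehl-ratio estimate of the aberration-induced degradation.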

⁎ Tel.: +34 981594488 13509. E-mail address: [email protected]. doi:10.1016/j.optcom.2010.11.063

Initial applications of DWFS were developed for astronomical purposes, but in 2000 Iglesias and Artal [6] proposed its application to the observation of the eye fundus. It was in 2002, however, that Catlin and Dainty [7] showed DWFS high-resolution images of the eye fundus, presenting images of the cone mosaic of one subject. The poor signal-to-noise ratio (SNR) of their retinal images forced them to average 20 frames in order to obtain a significant restoration. However, image SNR can be improved not only by frame averaging but also by reducing the system's optical aberrations, and this is the basis of compensated DWFS, where the restoration process is applied to AO-compensated images [13]. With this concept in mind, Arines and Bará [8] suggested in 2003 the technique named partially compensated deconvolution from wavefront sensing (PCDWFS), which is characterized by using a static element to compensate the static component of the optical aberrations, leaving the compensation of the dynamic part to the DWFS process. They showed the utility of their proposal through simulations considering a static ocular aberration and no fixational movements. Real eyes, however, are more complex: ocular movements, dynamic aberrations, and scattered light are present. We therefore think that our previous work should be complemented with an experimental validation.
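The division of labor behind PCDWFS can be illustrated with a toy numerical sketch (all the numbers are assumptions, not measurements from the paper): the static plate cancels the time-averaged component of an aberration coefficient, leaving only the small dynamic residual for the deconvolution to handle.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy time series of one Zernike coefficient (um): static mean + dynamic fluctuation
static = 0.8                                    # static component of the aberration
dynamic = 0.05 * rng.standard_normal(200)       # small dynamic fluctuations
coeff = static + dynamic

residual_after_plate = coeff - np.mean(coeff)   # phase plate cancels the static part

# The plate removes most of the wavefront error; DWFS only handles the remainder
print(np.sqrt(np.mean(residual_after_plate**2)) < np.sqrt(np.mean(coeff**2)))
```

The rms of the residual is set by the dynamic fluctuations alone, which is why the restoration filter has far less degradation to invert.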

2. Experimental setup and procedure

A general DWFS system consists of two channels, one for image acquisition (image channel) and the other for wavefront sensing (sensor channel). Data gathering in both channels is synchronized in order to record the degraded image and the aberrated wavefront at the same time. The image and sensor channels can be identified in the experimental setup of the high-resolution eye fundus camera shown in Fig. 1. The other parts of the setup are the illumination and common arms.


Fig. 1. Schematic diagram of the PCDWFS high-resolution retinal imaging camera.

The illumination arm involves the following elements: sources 1 and 2, DA2, E3, and PBS1. Source 1 was employed for wavefront sensing. It consisted of a low-coherence laser pointer with a 633 nm central wavelength and a 50 nm spectral bandwidth; the beam intensity ranged between 2 and 4 μW at the level of the cornea. Source 2, a superluminescent LED (Luxeon Star/O) with a 625 nm central wavelength and a 20 nm spectral bandwidth, was used to illuminate a wide region of the eye fundus; its intensity was 30 μW at the level of the cornea. In order to reduce the corneal reflex we used an annular illumination pattern, obtained by placing a central obscuration and a washer at the superluminescent LED collimating lens (Table 1).

The common path involves: BS1, L1, E1, E2, L2, PP, DA1, L3, DC1, L4, and BS2. L1–E1–E2–L2 form the Badal system used to induce controllable amounts of defocus (and in particular to focus the retinal image) without changing the magnification of the eye pupil image. DA1 is a pupil aperture conjugated with the eye pupil by the 4f system built with L1 and L2. Before DA1 we placed the customized phase plate that compensates the static component of the subject's ocular aberrations. L3 and L4 are also placed in a 4f scheme and provide an aerial image of the eye pupil. DC1 is a field diaphragm. BS2 distributes the light between the wavefront and imaging arms.

Table 1
Characteristics of the optical elements of the experimental setup.

Element                                      Focal length (cm)   Diameter (cm)
Lenses L1, L2                                10                  2.5
Lenses L3, L4, L5                            5                   2.5
Lens L6                                      5/10                2.5
Lens L7                                      15                  2.5
Lens L8                                      5                   2.5
Mirrors E1, E2                               –                   2.5
Mirror E3                                    –                   5
Polarizer P1                                 –                   5
Diaphragms DA1, DA2, DC1                     –                   0.53, 0.1, 0.2
Beam splitters BS1, BS2 (50/50)              –                   5
Pellicle beam splitters PBS1, PBS2 (90/10)   –                   5

The wavefront arm consisted of a Hartmann–Shack sensor built with a microlens array (MM) placed in a plane conjugated with the eye pupil, a relay lens, and camera 1 (Hamamatsu C5985-95, chilled). The MM consisted of 11 × 11 microlenses (each 560 μm in diameter) distributed in a square lattice. The image plane of the MM was located 5.12 cm away, with an equivalent pixel size of 15.7 μm (at the microlens image plane). The imaging arm consisted of the imaging lens L6 and camera 2 (Hamamatsu ORCA-285, C4742-96-12G04). The exposure time of cameras 1 and 2 was 120 ms, with an interval between frames of 400 ms.

The artificial eye was built with a variable diaphragm, a lens of 5 cm focal length and 2.5 cm diameter, and a scratched photographic paper that simulates the retina in the extended-object case. For the system alignment and the point-like-object validation we removed the photographic paper and placed a 100× microscope objective with its image plane conjugated with the focal object of the artificial eye, and an LED at the objective pupil entrance (Toshiba 10 mm Ultra-Bright TOSBRIGHT, 644 nm central emission wavelength). The aberrations of the artificial eye were generated with a phase plate that induces wavefront aberrations of the order of those presented by human eyes (in the absence of defocus and low-order astigmatism).

2.1. Data acquisition protocol

We made an initial validation of the PCDWFS technique with the artificial eye. First we evaluated the case of a point-like object, and then the extended-object case. In both situations we used the following steps: (1) image recording in the absence of aberrations; (2) degraded-image recording and wavefront measurement; and (3) partially compensated image recording and wavefront measurement.

The human eye validation was performed with one eye of a 30-year-old male, which presented a root mean square (rms) wavefront aberration of 1.5 μm (neglecting tip-tilt and defocus) over a pupil diameter of 6 mm. In step 1 we measured the subject's ocular aberrations.
In step 2 we manufactured a customized phase plate (PP) [20] to compensate the static component of the ocular aberration. In step 3 we instilled one drop of Tropicamide 1% in the subject's eye. In step 4 the subject was instructed to fixate on source 1 while we centered his pupil using the image of the sensor camera. In step 5 the retinal image was focused with the Badal system. In step 6 a sequence of several retinal images and wavefront measurements was recorded while maintaining the focus. In step 7 the phase plate was placed close to DA1. In step 8 a sequence of several partially compensated retinal images and wavefront measurements was recorded, again while maintaining the focus.

2.2. Restoration procedure

The first step is the computation of the OTF. To do so, we calculated the autocorrelation of the generalized pupil function. The pupil wavefront was obtained as a linear combination of 21 Zernike polynomials whose coefficients were provided by the wavefront sensor. The pupil dimensions and coordinates used for computing the generalized pupil function were obtained from the image coordinates through the relation

m_o = λ d / (N m_1)    (1)
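A sketch of this scaling and of the OTF computation described above (the distance and pixel values below are illustrative assumptions, and the Zernike basis is truncated to a single defocus-like mode rather than the 21 polynomials used in the paper):

```python
import numpy as np

def pupil_pixel_size(wavelength_um, d_cm, n_pix, image_pixel_um):
    """Eq. (1): m_o = lambda*d / (N*m_1), with all lengths converted to microns."""
    return wavelength_um * (d_cm * 1e4) / (n_pix * image_pixel_um)

def otf_from_zernike(coeffs_um, zernike_modes, pupil_mask, wavelength_um=0.633):
    """OTF as the normalized autocorrelation of the generalized pupil function."""
    wavefront = sum(c * z for c, z in zip(coeffs_um, zernike_modes))
    gp = pupil_mask * np.exp(1j * 2 * np.pi * wavefront / wavelength_um)
    psf = np.abs(np.fft.fft2(gp)) ** 2        # PSF of the aberrated system
    otf = np.fft.fft2(psf)                    # autocorrelation via the Fourier pair
    return otf / otf[0, 0]                    # unit value at zero frequency

# Illustrative numbers only: 633 nm, 25 cm pupil-to-image distance,
# a 256-sample transform, and 15.7 um sensor pixels
m_o = pupil_pixel_size(0.633, 25.0, 256, 15.7)

n = 64
y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
pupil = ((x**2 + y**2) <= 1.0).astype(float)
defocus = (2 * (x**2 + y**2) - 1) * pupil      # Zernike defocus-like mode
otf = otf_from_zernike([0.2], [defocus], pupil)
print(np.round(abs(otf[0, 0]), 6))             # normalized so the DC value is 1
```

Computing the autocorrelation through the PSF's Fourier transform is equivalent to correlating the generalized pupil directly, but avoids an explicit O(N⁴) correlation loop.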

where m_o is the pupil pixel size, m_1 is the image pixel size, N is the dimension of the extended matrix, d is the distance between the exit pupil and the image plane, and λ is the wavelength of the optical field [21]. The image restoration was performed with the Wiener filter

ô_e(v,u) = ĥ*(v,u) ĝ(v,u) / (|ĥ(v,u)|² + γ)    (2)
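The restoration step of Eq. (2) can be sketched as follows; a minimal implementation assuming the OTF and the degraded image are already available as arrays (the toy box-blur test at the end is an assumption for illustration, not the paper's data):

```python
import numpy as np

def wiener_restore(degraded, otf, gamma=1e-3):
    """Eq. (2): o_hat = conj(h_hat)*g_hat / (|h_hat|^2 + gamma), then inverse FFT."""
    g_hat = np.fft.fft2(degraded)
    o_hat = np.conj(otf) * g_hat / (np.abs(otf) ** 2 + gamma)
    return np.real(np.fft.ifft2(o_hat))

# Toy check: blur an impulse with a known OTF, then restore it
n = 64
obj = np.zeros((n, n)); obj[n // 2, n // 2] = 1.0
kernel = np.zeros((n, n))
kernel[:3, :3] = 1.0 / 9.0                           # 3x3 box blur
otf = np.fft.fft2(np.roll(kernel, (-1, -1), axis=(0, 1)))  # center kernel at origin
blurred = np.real(np.fft.ifft2(np.fft.fft2(obj) * otf))
restored = wiener_restore(blurred, otf, gamma=1e-6)

# Restoration re-concentrates the spread energy near the impulse location
print(restored.max() > blurred.max())
```

The parameter gamma plays exactly the role described below: larger values damp frequencies where the OTF is small (smoothing), while smaller values invert them more aggressively (amplifying noise).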


Fig. 2. Comparison of the image restoration of the point-like object: (a) reference image; (b) degraded image; (c) partially compensated image; (d) PCDWFS restored image; (e) DWFS restored image. Field of view 0.24°.

where ô_e(v,u) and ĝ(v,u) are the Fourier transforms of the restored and degraded images, ĥ(v,u) is the system optical transfer function (OTF), and γ is the regularization parameter. This last parameter controls the undesirable noise amplification: high values cause excessive smoothing of the restored image, while low values allow noise amplification and the appearance of spurious artifacts in the restored image. Because of this compromise we determined γ by trial and error. Finally, the restored image was obtained by computing the inverse Fourier transform of ô_e(v,u). The presented procedure was repeated to obtain each of the restored images. If we had used all

Fig. 3. Comparison of the image restoration of the extended object: (a) reference image; (b) degraded image; (c) partially compensated image; (d) PCDWFS restored image; (e) DWFS restored image. Field of view 1.11°.


the images recorded in each sequence to obtain one restored image, we should have used the vector Wiener filter, which is specifically designed for this purpose [17].

3. Experimental results

This section is devoted to the presentation of the experimental results obtained with the artificial and human eyes. We start with the artificial eye, first with the point-like object and then with the extended object, and then present the case of the human eye.

3.1. Artificial eye

We start with the results obtained for the point-like object. Fig. 2 shows: (a) the reference image; (b) the degraded image; (c) the partially compensated image; (d) the PCDWFS restored image; and (e) the DWFS restored image. The images were obtained with a magnification of 6.25× and an equivalent pixel size of 0.5 μm, with γ = 10⁻⁷ for PCDWFS and γ = 10⁻³ for DWFS. Note first the benefit of partial compensation, which manages the degradation induced by the system aberrations and provides a fairly good approximation to the reference image. The high degree of compensation of the optical aberrations achieved with the phase plate (rms 0.054 μm) leaves only a small amount of degradation to be compensated by the deconvolution process (see Fig. 2(c) and (d)). However, the comparison of the images obtained with DWFS and PCDWFS (Fig. 2(d) and (e)) shows that the resolution achieved with PCDWFS is significantly superior. Moreover, the low quality of Fig. 2(e) clearly shows that the magnitude of the degradation presented by our artificial eye (rms 0.227 μm) cannot be


completely managed by the DWFS process alone. This fact emphasizes the relevance of PCDWFS as an alternative to AO.

Similar results were obtained for the extended-object case (see Fig. 3). However, in this case the residual wavefront presented an rms of 0.100 μm (due to a decentration and/or rotation of the compensating phase plate with respect to the artificial eye), and therefore the benefit achieved with PCDWFS with respect to partial compensation alone is more noticeable. The images were obtained with a magnification of 6.25× and an equivalent pixel size of 0.5 μm, with γ = 10⁻³ for both PCDWFS and DWFS.

The experiments performed with the artificial eye show the amount of improvement that should be expected and the superiority of PCDWFS. The analysis with the artificial eye was performed under a high degree of optical correction, with no ocular movement, no defocus fluctuations, and no intraocular scattering. Thus, although the results are promising, the final test had to be made with human eyes, in the presence of the various factors mentioned before.

3.2. Human eye

In this case the comparison between DWFS and PCDWFS is especially affected by the ocular dynamics, which cause differences in the observed retinal section and in the optical aberrations. We nevertheless tried to observe the same retinal region while focusing on the retinal cone mosaic with the Badal system, although we sometimes lost that plane and other retinal features were observed. We present in Figs. 4 and 5 the images of two different trials obtained with a magnification of 12.5× (equivalent pixel size 1 μm) and γ = 10⁻². The mean rms of the residual wavefront measured with the HS sensor was 0.354 μm and 0.487 μm, respectively. Figs. 4(a) and 5(a) show the recorded image without partial compensation, Figs. 4(b) and 5(b) show

Fig. 4. Comparison of the image restoration procedures: (a) degraded image; (b) DWFS restored image; (c) partially compensated image; (d) PCDWFS restored image. Field of view 1.11°.


Fig. 5. (a) Degraded image (Media 1); (b) DWFS restored image (Media 2); (c) partially compensated image (Media 3); (d) PCDWFS restored image (Media 4). Field of view 1.11°.

the DWFS restored image, Figs. 4(c) and 5(c) the partially compensated image obtained with the PP, and Figs. 4(d) and 5(d) the PCDWFS image. The comparison of the images shows the benefit achieved with the proposed technique. We also present the complete image sequences in the videos associated with Fig. 5 in order to show the utility of the proposed technique for real-time applications. In the videos the different structures focused during the sequence can be observed; small capillaries and photoreceptors can be seen at different moments.

4. Conclusions

We have demonstrated with an artificial eye and a human eye the utility of partially compensated deconvolution from wavefront sensing for obtaining high-resolution images of the eye fundus in real time. We have also compared the PCDWFS and DWFS techniques, finding a significant superiority of the former in terms of the visual quality of the restored images. We have also shown that the amount of degradation induced by the human eye cannot be overcome by DWFS alone. This result was previously reported by other researchers; however, instead of using frame averaging to increase the SNR of the imaging system, we proposed improving the system OTF by compensating the static component of the ocular aberration. This approach allowed us to obtain real-time high-resolution images of the eye fundus.

Acknowledgements

I want to express my most sincere acknowledgement to Dr. Salvador X. Bará for his support, collaboration, and fruitful discussions. This work was supported by the Spanish Ministerio de Educación y Ciencia, grant FIS2005-05020-C03-02, and Ministerio de Ciencia e Innovación, grant FIS2008-03884. I also want to acknowledge financial support from the Isidro Parga Pondal Programme 2009 (Xunta de Galicia, Spain).

References

[1] R.K. Tyson, Principles of Adaptive Optics, Academic Press, 1991.
[2] J.C. Fontanella, J. Opt. (Paris) 16 (1985) 257.
[3] G.R. Ayers, J.C. Dainty, Opt. Lett. 13 (7) (1988) 547.
[4] A. Roorda, F. Romero-Borja, W.J. Donnelly III, H. Queener, T.J. Hebert, M.C.W. Campbell, Opt. Express 10 (2002) 405.
[5] H. Hofer, P. Artal, B. Singer, A.L. Aragón, D.R. Williams, J. Opt. Soc. Am. A 18 (3) (2001) 497.
[6] I. Iglesias, P. Artal, Opt. Lett. 25 (2000) 1804.
[7] D. Catlin, C. Dainty, J. Opt. Soc. Am. A 19 (8) (2002) 1515.
[8] J. Arines, S. Bará, Opt. Express 11 (2003) 761.
[9] V. Nourrit, B. Vohnsen, P. Artal, J. Opt. A: Pure Appl. Opt. 7 (2005) 585.
[10] J.C. Christou, A. Roorda, D.R. Williams, J. Opt. Soc. Am. A 21 (8) (2004) 1393.
[11] S. Yang, G. Erry, S. Nemeth, S. Mitra, P. Soliz, Proceedings of the 17th IEEE Symposium on Computer-Based Medical Systems (CBMS'04), 2004, p. 479.
[12] J. Primot, "Application des techniques d'analyse de surface d'onde à la restauration d'images dégradées par la turbulence atmosphérique", Ph.D. Thesis, Université de Paris-Sud, Orsay, France, 1989.
[13] M.C. Roggemann, B. Welsh, Imaging Through Turbulence, CRC Press, Boca Raton, 1996.
[14] R.C. Gonzalez, R.E. Woods, Digital Image Processing, 3rd ed., Pearson Prentice Hall, Upper Saddle River, 2008.
[15] J. Primot, G. Rousset, J.C. Fontanella, J. Opt. Soc. Am. A 7 (9) (1990) 1598.
[16] J.D. Gonglewski, D.G. Voelz, J.S. Fender, D.C. Dayton, B.K. Spielbusch, R.E. Pierson, Appl. Opt. 29 (31) (1990) 4527.
[17] S.D. Ford, B.W. Welsh, M.C. Roggemann, Opt. Eng. 37 (1998) 2491.
[18] L. Guan, R.K. Ward, Opt. Eng. 29 (4) (1990) 289.
[19] J. Arines, S. Bará, Opt. Eng. 39 (10) (2000) 2789.
[20] R. Navarro, E. Moreno-Barriuso, S. Bará, T. Mancebo, Opt. Lett. 25 (2000) 236.
[21] J. Arines, "Imagen de alta resolución del fondo de ojo por deconvolución tras compensación parcial", Ph.D. Thesis, Universidade de Santiago de Compostela, Galicia, Spain, 2006. http://www.unizar.es/fajap/Tesis%20Justo%20Arines.pdf.