A portable, optical scanning microsystem for large field of view, high resolution imaging of biological specimens

Sensors and Actuators A 279 (2018) 367–375


Georgia Korompili a,b, Georgios Kanakaris a, Christos Ampatis a, Nikos Chronis a,b,∗

a Institute of Nanotechnology and Nanoscience, N.C.S.R. Demokritos, Athens, Greece
b Department of Materials Science and Technology, University of Crete, Irakleion, Greece

Article history: Received 13 January 2018; Received in revised form 3 June 2018; Accepted 17 June 2018; Available online 24 June 2018

Keywords: Point-of-care; Microscopy; Large field of view; Optical imaging; Lens array

Abstract

Adapting medical technology for use at the point of care demands portable, robust and accurate systems for the early diagnosis and monitoring of a wide range of diseases. Point-of-care microscopy, fueled by recent advances in micro-optics, micro-electronics and micro-electromechanical systems, is an emerging and promising field. However, the imaging devices developed so far remain rather sophisticated and bulky, mainly because they fail to address the most challenging technical limitation: combining a large field of view (FOV) with high resolution imaging of biological specimens. To address this need, we developed a portable, optical scanning microsystem that can image large areas (6 mm × 40 mm) of various biological samples with approximately 1 μm resolution. This is achieved through the use of a microfabricated 2D lens array that scans a sample in one direction in a few minutes. We demonstrate that our system can image blood smears and identify single white blood cells immobilized in a microfluidic chip.

© 2018 Elsevier B.V. All rights reserved.

1. Introduction

Early diagnosis and accurate, continuous monitoring of the progress of a disease are both of critical importance. However, the time-consuming procedures and costly equipment required in many cases act as serious obstacles to early disease screening. Although advanced infrastructure and well-trained healthcare professionals are available in the developed world, the cost of healthcare can still be prohibitive for patients seeking diagnosis. In low-resource settings, access to medical equipment may not even be available [1]. To address these issues, there is an emerging trend toward patient-centered healthcare [2]. The use of point-of-care devices is part of this trend, with excellent applications in the field of microscopy imaging of biological specimens [3]. Recent technological advances in micro-optics, micro-electronics and micro-electromechanical systems (MEMS) have the potential to improve screening and detection of a wide range of diseases at the point of care in primary health care settings in both low- and high-resource countries. Microscopy imaging is the gold standard for the diagnosis of malaria, tuberculosis and sickle cell anemia, while it is also used

∗ Corresponding author. E-mail address: [email protected] (N. Chronis).
https://doi.org/10.1016/j.sna.2018.06.034
0924-4247/© 2018 Elsevier B.V. All rights reserved.

for progress monitoring and treatment adjustment in HIV-infected patients and in patients undergoing chemotherapy for cancer treatment. Whole-slide blood imaging and cell population counting represent irreplaceable steps of the diagnostic procedure in many cases. The conventional diagnostic process for malaria, a common infection in the sub-Saharan region, is based on microscopic examination of stained peripheral blood smears. The standard process requires examination of the entire smear slide with a light microscope to detect infected cells [4,5]. For HIV-infected individuals, the CD4+ T cell concentration is commonly tested to assess the response of the immune system to antiretroviral therapy. The cell count can be accurately extracted through microscopy imaging of a drop (1–10 μl) of the patient's blood, though flow cytometry remains the gold standard [6]. In tuberculosis-endemic countries, the low-cost, fast and accurate microscopy imaging of smears of non-concentrated sputum with Ziehl-Neelsen staining is the most common diagnostic test [7]. Other hematologic diseases, such as sickle cell anemia, also require microscopy imaging for highly specific diagnosis. In both low- and high-resource countries, frequent counting of white blood cells is required for patients undergoing chemotherapy or radiation therapy for cancer treatment, as a low white blood cell count is often an important side effect. In general, in vitro imaging of biological specimens such as blood films or sputum is considered the most promising field of application for automated and miniaturized medical instrumentation that

could be used at the point of care [8]. High accuracy, in terms of specificity and sensitivity, low cost, portability, ease of use and low power consumption are the specifications imposed by the World Health Organization for devices used at the point of care in poor-resource settings [9]. Although a large number of automated portable imaging systems have been developed, none has so far succeeded in complying with all criteria. In particular, a major technical challenge is the need to combine high resolution with large-FOV imaging of the biological sample [8]. To address this need, two main approaches have been widely studied: (a) whole-slide scanning [10–12] and (b) lens-free imaging techniques [13–16]. A number of devices perform two- or three-axis scanning to cover an extended area of a tissue or blood sample [11,12]. These approaches employ sophisticated motorized systems and complicated optics, resulting in increased manufacturing cost. Where scanning components are excluded, medium- or large-format detectors have to be used instead [13], further increasing the production cost. In non-scanning devices, the resolution is limited by the pixel size of the large-format detector, which is much larger than the desired resolution of at least a few micrometers. To compensate for the large pixel size of these sensors – approximately 10 μm – existing approaches use complicated and bulky optical systems or sophisticated software algorithms to extract image information. Consequently, time and computational power requirements increase radically. The samples tested also need to be rather sparse to reduce possible loss of information due to the low effective resolution [13].
The same problems of increased computational power demands and sparse-sample requirements are encountered in lens-free imaging systems [13–16]. In these platforms, image processing algorithms detect particles in a sample based on the shadows the particles produce. The achievable accuracy is therefore highly affected by sample density, and a large number of particles may be missed in dense specimens. Blood preprocessing is thus required – a possible obstacle in areas where the necessary laboratory equipment is unavailable. With the advent of mobile phones, their extensive use and their fast technological development, phone cameras have been integrated into devices for cytological or histological specimen imaging. These devices, though portable and autonomous, incorporate complicated and expensive optics to achieve high resolution imaging by radically decreasing the detection area [17,18]. Though phone cameras provide high quality imaging, they are restricted to a short range of applications and are incapable of extended-region imaging unless combined with a scanning system. Rapid progress in microfabrication technology has resulted in an emerging trend toward micro-lens arrays. Integrating high numerical aperture micro-lenses into optical systems offers high resolving power and large-FOV imaging while radically decreasing manufacturing cost. The inherent problem of whole-slide imaging with micro-lens arrays is that areas of the specimen go undetected because of the blind regions between neighboring lenses in the array. To mitigate this, a multilayer lens-array assembly is employed in some cases, which also improves resolution, while a sophisticated two- or three-axis scanning system can additionally be used [12].
Such an approach increases the manufacturing cost of the equipment, and it is questionable whether it meets the criteria of compactness and portability. Despite the numerous existing approaches, the field still presents a challenge, as no approach has overcome the aforementioned obstacles and provided a general-purpose, portable, low-cost device for high resolution, wide-FOV imaging. To be effective at the point of care, such technologies need to be simple [19]. Our approach uses a specially patterned mini-lens array combined with a simple, one-direction (1D) scanning system for whole-slide, high resolution imaging of a biological specimen. The large FOV (290 μm) and high numerical aperture (NA ∼ 0.7) of the sapphire ball lenses used reduce the number of required images to approximately 400 for a 4 cm long sample at 1X magnification. The simplicity of the proposed portable system – one-direction scanning and a single layer of mini-lenses (1 mm in diameter) – renders it compact, inexpensive and low-power. It is therefore ideal for use in low-resource areas.

2. Materials and methods

We developed an imaging, scanning platform (Fig. 1) that comprises three components: (a) an optical, scanning head that images the specimen of interest, (b) a motorized translation stage that moves the optical head along the specimen (1D scanning) and (c) an external white-light LED array (Edmund Optics, #66-830) that provides homogeneous illumination of the entire specimen. The optical head is an assembly of a 10.7 Mpixel monochrome CMOS sensor (Imaging Source, DMM 27UJ003-ML) and a custom-made mini-lens array. The optical head is attached to the translation stage, which is connected to a computer-controlled stepper motor through a lead screw. The specimen – typically a blood sample inserted in a microfluidic chamber or sitting on a glass slide or coverslip – is placed at a short distance (∼0–500 μm) from the optical head and is illuminated by the LED array. The CMOS/mini-lens array assembly moves in one direction with a travel distance of approximately 40 mm. This configuration creates a 40 mm × 6 mm image of the specimen.
The distances between the mini-lens array, the CMOS sensor and the specimen are controlled by two stepper motors (20 μm step, 3 mm range) that allow us to obtain a sharp, focused image of the sample. The focusing procedure is performed prior to every scan.

2.1. The optical, scanning head

The optical, scanning head is responsible for the image acquisition process. It comprises a microfabricated lens array, a 10.7 Mpixel CMOS camera (1.67 μm square pixels) and two computer-controlled stepper motors that control the distances between the sample, the lens array and the detector. The two stepper motors are used to achieve sharp, in-focus images as well as to operate in different magnification modes – from 1X up to 4X. The operating principle of the entire scanning system is based on the special design of the lens array: 36 sapphire ball lenses of high numerical aperture (NA ∼ 0.7), refractive index 1.67 and 1 mm diameter (Edmund Optics #43-638) are placed on top of a specially patterned silicon die to form the array of mini-lenses. The mini-lens array is fabricated using standard microfabrication techniques (supplementary Fig. 1). A 400 μm thick silicon wafer is Deep Reactive Ion Etched (DRIE) to create 960 μm diameter, wafer-through holes [20]. After the etching process is completed, individual dies are glued with a UV-curable optical adhesive (Norland 60, refractive index 1.62) onto thin glass coverslips of standard thickness (#1, 130–170 μm) to form the wells upon which the lenses are placed. The same adhesive is used to fill the wells and secure the lenses on top of them. The design of the lens array is the key element of the operation of the entire system. The wells are patterned so that all mini-lenses in the array are equally spaced, with an edge-to-edge distance of 50 μm. The lens array is placed at a tilt angle of φ = 17° with respect to the scanning direction. This tilted design ensures that


Fig. 1. (a) Schematic of the architecture of the portable scanning microsystem; (b) photograph of the prototype and close-up view of the optical head. The size of the prototype is 21 cm × 8 cm × 11 cm and it weighs ∼1 kg; (c) schematic of the optical/scanning head illustrating the imaging principle.

Fig. 2. (a) Top view schematic and picture of the lens array. The black dotted outline indicates the effective imaging area of the CMOS sensor; the green rectangle represents the silicon chip. (b) The tilt (17°) of the mini-lens array and the overlap between the FOVs of neighboring lenses are crucial for covering the blind areas between the lenses. (c) Estimation of the FOV of a lens (∼290 μm) based on a grid of lines imaged with the CMOS sensor in 1X magnification mode. The FOV is the diameter of the disc area where the light intensity is not lower than 90% of its maximum value (at the center of the lens).

the diameter of the FOV of one lens – estimated at approximately 290 μm – overlaps the FOV of the neighboring lenses by 20% (Fig. 2). Thus, with one-direction scanning, no areas of the sample go undetected.
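As a quick sanity check on these numbers, the strip width covered in a single pass can be estimated from the per-lens FOV and the 20% overlap. A minimal sketch (the function is ours; the 5.7 mm effective CMOS width is quoted in Section 2.2):

```python
def covered_strip_width_mm(n_lenses=36, fov_um=290.0, overlap=0.2):
    """Cross-scan strip width covered by the tilted lens array.

    The first lens contributes its full FOV; every subsequent lens adds
    only the non-overlapping (1 - overlap) fraction of its FOV.
    """
    covered_um = fov_um + (n_lenses - 1) * (1.0 - overlap) * fov_um
    return covered_um / 1000.0

# With the quoted numbers, 36 lenses cover ~8.4 mm, comfortably wider
# than the 5.7 mm effective width of the CMOS sensor.
```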

The tilt angle φ was calculated from the geometry of the array, where φ = (0.8 × lens FOV diameter) / (lens diameter + lens spacing). The FOV was measured by projecting a line grid onto the CMOS surface. The estimated 290 μm FOV is the diameter of the disc over which the light intensity is not lower than 90% of its maximum value, located at the center of the lens. The quality of the produced image depends on the manufacturing and assembly of the optical head components (see supplementary material). The characterization procedure aims at accurately quantifying the uniformity of the assembled mini-lens array in terms of resolution and contrast. Ideally, we require a perfectly uniform array, in the sense that all lenses focus on the same plane and image the sample with the same resolution, magnification and contrast. To quantify the optical performance of the scanner and the uniformity of the lens array, we conducted two experiments: (a) we measured the contrast across the mini-lens array (36 imaging units) and (b) we measured the in-focus factor for each lens, as the lenses do not lie in exactly the same plane within the array. The in-focus factor is a focus measure that represents the ability of each lens to focus the image onto the CMOS sensor; it reaches its maximum value at the best-focused image plane. The defocusing error in each case is obtained by subtracting the in-focus factor from 1, the maximum normalized in-focus factor. To quantify resolution and contrast, a line-pair pattern in a chrome film (100 nm thick) deposited on a glass slide was imaged in 1X and 4X magnification modes. The contrast (C), obtained from the greyscale images of each lens, was based on the Michelson formula [21]: C = standard deviation of the light intensity gradient / mean of the light intensity gradient. To quantify the defocusing error, a series of images of a whole blood smear was obtained by varying the mini-lens array to CMOS distance from 500 μm up to 1500 μm in 20 μm steps.
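The gradient-based contrast figure defined above can be computed directly from a greyscale image. A minimal NumPy sketch (the function name and test pattern are ours):

```python
import numpy as np

def gradient_contrast(img):
    """Contrast figure used in the text: standard deviation of the
    intensity-gradient magnitude divided by its mean."""
    gy, gx = np.gradient(img.astype(float))
    g = np.hypot(gx, gy)
    return float(g.std() / g.mean())

# Example: a line-pair pattern. Because the figure is a ratio, it is
# invariant to a uniform scaling of the illumination intensity.
pattern = np.tile([0.0, 255.0], (16, 8))  # 16 rows of alternating lines
```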
An image processing algorithm based on an image curvature focus measure [22] was used for the automatic detection of the sharpest image [23,24].

2.2. The motorized, translation stage

A motorized translation stage moves the optical, scanning head along the scanning direction. It comprises a stepper motor, a lead screw along which the optical head slides, a motor driver to control the stepper motor, an electronics board with an Arduino UNO microcontroller and a flat sample holder on top of the stage, where the biological specimen is placed. The stepper motor, providing 200 steps per revolution, is connected to the lead screw through a flexible motor coupler. The scanning head can move with a minimum 5 μm step over a maximum range of 40 mm. The scanning head moves right beneath the sample – either a typical microfluidic chamber or a glass slide – along its long axis. The sample is kept in place during scanning by two flexible strips. A window in the sample holder allows optical access to the specimen from the side of the scanning head, so that the lens array can move close to the sample and achieve high magnification. The sample holder can host a standard 60 mm × 22 mm glass slide; however, the scanned surface is defined by the CMOS sensor size and the maximum scanning range of the stepper motor. Thus, the surface that can be scanned is 5.7 mm wide – equal to the length of the CMOS sensor – and 40 mm long – equal to the maximum scanning range – rendering the device ideal for use with microfluidic biochips of corresponding width and length. The maximum step that still scans the entire chip without undetected areas is defined by the design of the lens array and by the overlap between neighboring lenses. Based on the theoretical study of the lens-array properties, the maximum allowed step size should not exceed 0.6 × FOV diameter = 174 μm at 1X magnification.
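The step-size bound translates directly into the number of frames per scan. A small sketch with the numbers quoted above (the 100 μm step is the one used for the stitching experiment in Fig. 6; function names are ours):

```python
import math

FOV_UM = 290.0        # measured single-lens field of view at 1X
SCAN_LEN_UM = 40_000  # 40 mm maximum scanning range

def max_step_um(step_factor=0.6, fov_um=FOV_UM):
    # the lens-array geometry caps the step at 0.6 x FOV diameter
    return step_factor * fov_um

def n_frames(step_um, scan_len_um=SCAN_LEN_UM):
    # frames needed to cover the full scanning range at a given step
    return math.ceil(scan_len_um / step_um)

# ~174 um maximum step; a conservative 100 um step gives ~400 frames,
# matching the image count quoted in the Introduction.
```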
To eliminate repeatability error between scans, the device is programmed to return to a 'zero position' before executing a new scan. During scanning, the device performs only a one-direction translation of the optical head. Given this working principle, the characterization procedure needs to quantify the error induced by the motorized stage during scanning. Any imperfections or misalignments in the assembly of the stage components may result in a defocusing error due to non-parallel motion of the optical head with respect to the sample holder. The image curvature focus measure [22] was used to quantify the defocusing error at 26 equally spaced (1000 μm) positions along the scanning range.

2.3. The illumination source

The illumination source is a square, homogeneous light source (Edmund Optics #66-830) with a 5 cm side length. It covers the entire imaged area and does not need to move during scanning. The source consists of an LED array (operated at 24 V) covered by a diffuser. The emitted light is homogeneously diffused onto the biological sample and is distributed over a surface 6% wider than the source itself. The entire scanner, along with the light source, is placed inside an opaque PMMA enclosure to prevent ambient light from entering. The distance between the light source and the sample affects the contrast of the produced image. If that distance is short, images obtained from neighboring lenses are projected onto overlapping areas of the CMOS sensor – a phenomenon known as crosstalk. Crosstalk is frequently encountered in micro-lens arrays [25] and degrades image quality, as the original image appears blurred. In the case of a non-collimated light source, such as an LED array, increasing the distance between the LED array and the sample can reduce crosstalk, as diverging rays originating from the LED array no longer reach the sample. To select the minimum LED-to-sample distance at which crosstalk is adequately reduced, we measured the contrast of the produced image at different distances.

3. Results

3.1. Magnification range of the optical head

The proposed imaging platform can operate over a 1X–4X magnification range. An image curvature focus measure algorithm was used to automatically select the best focal plane and determine the positions of the lens array and the CMOS sensor with respect to the sample (Fig. 3).

3.2. Resolution at maximum (4X) and minimum (1X) magnification

The resolution of our system at 1X and 4X magnification was experimentally measured using a custom-made resolution pattern (Fig. 4). The design of the resolution pattern was similar to the 1951 USAF resolution test chart. It was patterned in a thin Cr film using e-beam lithography. The maximum optical resolution was approximately 0.98 μm at 4X and 1.95 μm at 1X magnification.

3.3. Sharpness and contrast uniformity of the images produced by the lenses of the array

The image curvature focus factor was measured for each lens using a blood smear with the lens array at its closest position to the sample – 0 μm lens-to-sample distance. At this position of maximum magnification and resolution, the out-of-focus error of the lenses of the array is expected to be maximal. 31 of the 36 lenses were selected for this experiment, as their images fitted entirely into the CMOS effective detection area.
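The best-focus selection used throughout Sections 2 and 3 amounts to evaluating a sharpness score on every frame of a focus sweep and keeping the maximum. A minimal sketch using a gradient-energy score as a simple stand-in for the curvature measure of [22] (not the authors' implementation):

```python
import numpy as np

def focus_score(img):
    """Gradient energy: larger for sharper images (stand-in for the
    image curvature focus measure of [22])."""
    gy, gx = np.gradient(img.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def best_focus_index(stack):
    """Index of the sharpest frame in a focus sweep."""
    return int(np.argmax([focus_score(f) for f in stack]))
```

In the actual system the stack would be acquired by stepping the lens-array-to-CMOS distance in 20 μm increments; the winning index then fixes the motor position.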


Fig. 3. The distances of the lens array and the CMOS detector with respect to the sample are determined with an image curvature based autofocus algorithm. The magnification range (1X–4X) was calculated from the projection onto the CMOS effective area of 10 μm wide chrome line pairs patterned on a glass slide.

Fig. 4. A photograph of the resolution line pattern as imaged through a single lens at 1X (A) and 4X (B) magnification and the corresponding light intensity profiles. The red line in the images indicates the length along which the intensity profile was quantified. A minimum line width of 1.95 μm and 0.98 μm was detected at 1X and 4X magnification respectively.

The best-focused image produced by the array was selected as the one for which the averaged out-of-focus factor (averaged over all lenses) was minimal. In this image, the defocusing error, measured for each lens separately, was less than 20%, while the contrast was within 12% of its maximum value (Fig. 5).

3.4. Sharpness imaging uniformity along the scanning direction

The image curvature based autofocus algorithm was used to quantify the defocusing error induced during scanning. This error is presented as the variation of the in-focus factor measured for one lens in the array at equally spaced (1 mm) positions of the optical head over a 25 mm long blood smear. The slight variations (less than 5%) are attributed to differences in the pattern of blood cells observed at each position, as no noticeable drift is present. The precision of the stepper motor employed allows automatic stitching of the images acquired by each lens during scanning. Since there is no repeatability step error, the images of the specimen can be cropped at the borders of the FOV of each lens and stitched automatically to produce 36 images of the sample, each corresponding to a different optical unit in the lens array (Fig. 6).

3.5. Minimizing crosstalk

Crosstalk is a significant drawback of lens arrays [25,26] as it deteriorates image quality. It can be minimized by using collimated


Fig. 5. Normalized in-focus factor for each lens within the array, as measured from images of a blood smear (left); contrast of the image produced by each lens of the array (right). 5 lenses were excluded from this experiment as they did not produce high quality images.

Fig. 6. (a) De-focusing error versus position of the optical head along the scanning direction. The de-focusing error is induced by fabrication imperfections or assembly misalignments in the components of the motorized scanning stage. The in-focus factor – measured for one lens across the scanning length – remains within 5% of the best in-focus image for all positions of the optical head. (b) Stitching of 20 images acquired from one lens in the array with a scanning step of 100 μm. The resulting image covers an area of the sample 2 mm long. Precision stepping allows automatic stitching of the images.
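Because the stage steps are repeatable, stitching reduces to cropping a fixed-width strip from each frame and abutting the strips; no feature matching is needed. A minimal sketch (frame geometry and variable names are ours; at 1X with 1.67 μm pixels, a 100 μm step corresponds to roughly 60 pixels):

```python
import numpy as np

def stitch(frames, step_px):
    """Translation-only stitching: keep a step_px-wide strip centred in
    each frame and abut the strips in acquisition order."""
    strips = []
    for f in frames:
        c = f.shape[1] // 2  # centre column of the frame
        strips.append(f[:, c - step_px // 2 : c - step_px // 2 + step_px])
    return np.hstack(strips)
```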

light sources. The same result can be accomplished by placing a non-collimated light source at a large distance from the sample. To quantify the crosstalk, we measured the contrast in our system for different distances between the LED source and the sample (Fig. 7). The best contrast was obtained at distances of 40 cm and above. For practical reasons, we used a distance of 22 cm, where the contrast was ∼0.89 – adequate to clearly image blood cells from a blood smear.

3.6. Image mosaicing and cell detection

As a proof of concept for the operation of the device, we present the image acquired from a typical microfluidic chip in which white blood cells were captured and detected. The chip comprises two 16 μm thick chambers of 1 μl total volume each, functionalized with biotinylated human CD45 antibody to capture white blood cells (WBCs). Red blood cells (RBCs) were washed away. The 40 mm long chip was scanned with our device within a few minutes. The acquired images were automatically processed to produce the entire image of the scanned surface (Fig. 8). The image processing algorithm comprises the following steps: (a) cropping of the FOV of each lens, (b) vignetting correction for each cropped image and (c) automatic stitching of the images of each lens and mosaicing of the entire image. The vignetting effect induced by the spherical surface of the lenses is reduced through measurement of the sharpness of gradient distribution


Fig. 7. Normalized contrast versus light-source-to-sample distance (A). Larger distances result in less crosstalk and clearer images, as diverging rays do not reach the mini-lens array (B).
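The distance effect in Fig. 7 follows from simple geometry: the worst-case angle at which light from the extended source can strike a sample point shrinks as the source is moved away. An illustrative sketch (the 5 cm source width is from Section 2.3; the function is our simplification, not the model used in the paper):

```python
import math

def max_incidence_deg(source_half_width_mm, distance_mm):
    """Worst-case ray angle (from the normal) at a sample point directly
    under the centre of an extended source of the given half-width."""
    return math.degrees(math.atan(source_half_width_mm / distance_mm))

# A 5 cm wide source at the 22 cm working distance illuminates the sample
# with rays no more than ~6.5 degrees off-normal, versus ~27 degrees at 5 cm.
```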

Fig. 8. (a) Schematic of the microfluidic chip with 2 parallel chambers used for the experiment. The dotted square indicates the scanned surface. (b) Stitched image of the 4 cm long microfluidic chip with two parallel chambers. (c) Zoom-in of the image in the region of the inlets. (d) WBCs captured in the microfluidic chamber; (e) automatically detected WBCs (indicated with arrows) using a Sobel-operator image segmentation algorithm.
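The Sobel-based detection in panel (e) can be sketched as follows. This is our own minimal reconstruction with SciPy (threshold value and synthetic test are ours), not the authors' MATLAB code:

```python
import numpy as np
from scipy import ndimage as ndi

def detect_cells(img, thresh_rel=0.25):
    """Edge-based segmentation: Sobel gradient magnitude, threshold,
    fill the closed cell outlines, then label connected components."""
    img = img.astype(float)
    g = np.hypot(ndi.sobel(img, axis=0), ndi.sobel(img, axis=1))
    mask = g > thresh_rel * g.max()       # keep strong edges only
    mask = ndi.binary_fill_holes(mask)    # turn edge rings into solid blobs
    labels, n = ndi.label(mask)           # one label per detected cell
    return labels, n
```

On the real images a vignetting correction precedes this step, and the threshold would be tuned to the stain and illumination.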


[27]. Captured WBCs can be easily observed and automatically detected with an image segmentation technique using the Sobel operator [28]. All methods were implemented in MATLAB version 9.0.0.341360 (R2016a) with the Image Processing Toolbox 9.4.

4. Discussion

We developed a portable, optical scanning system that can image a large area (6 mm × 40 mm) with approximately 1 μm optical resolution at 4X magnification. The system operates in white-light transmission mode and completes a full scan in a few minutes. We validated the system by presenting blood smear images and by performing automated single white blood cell counting in a typical CD45-functionalized microfluidic chamber. Although the developed prototype is connected to a computer through an Arduino controller and to two power supplies that power the stepper motors and the LED array, an integrated, portable system could be envisioned by using a microcontroller with adequate memory (such as a Raspberry Pi) to control all the components (CMOS sensor, LED array, motors) and to perform scanning and cell counting. We estimate that such an integrated system would require ∼10 W (2 W for the stepper motors, 2 W for the LED array, 6 W for the microcontroller and the CMOS sensor) and would weigh < 3 kg. Such a power requirement can easily be provided by a commercial battery (e.g. a smartphone battery) or even a solar panel. Image quality could be further improved by making a few modifications to the current system. Optical resolution is limited by the pixel size of the CMOS sensor (currently 1.67 μm); a CMOS sensor with a smaller pixel size would greatly improve the resolution. Decreasing the lens-to-lens distance would minimize the dead imaging area between the lenses, enabling more lenses to be packed into the array and therefore reducing the time required to complete a full scan.
Furthermore, crosstalk could be reduced by using a collimator/condenser lens to minimize the divergence angle of the optical rays from the illumination source [29]. We envision that the proposed imaging system can be used for general-purpose imaging of biological specimens for the diagnosis and monitoring of a wide variety of diseases and conditions at the point of care.

Acknowledgments

We would like to thank Ning Gulari for useful discussions and for fabricating the silicon chip of the mini-lens array. Funding was provided by the ERC Starting Grant (project ID 693433) under the Horizon 2020 program.

Appendix A. Supplementary data

Supplementary material related to this article can be found in the online version at https://doi.org/10.1016/j.sna.2018.06.034.

References

[1] L. Frost, M. Reich, Access: How Do Good Health Technologies Get to Poor People in Poor Countries?, Harvard Center for Population and Development Studies, Cambridge, 2008.
[2] K. Davis, S.C. Schoenbaum, A.-M. Audet, A 2020 vision of patient-centered primary care, J. Gen. Intern. Med. 20 (2005) 953–957.
[3] C. Price, A.S. John, Point of Care Testing: Making Innovation for Patient-Centered Care, AACC Press, Washington, USA, 2012.
[4] N. Tangpukdee, C. Duangdee, P. Wilairatana, S. Krudsood, Malaria diagnosis: a brief review, Korean J. Parasitol. 47 (2) (2009) 93–102.
[5] A. Moody, Rapid diagnostic tests for malaria parasites, Clin. Microbiol. Rev. 15 (1) (2002) 66–78.
[6] M. Usdin, M. Guillerm, A. Calmy, Patient needs and point-of-care requirements for HIV load testing in resource-limited settings, J. Infect. Dis. 201 (2010).
[7] K. Steingart, V. Ng, M. Henry, P. Hopewell, A. Ramsay, J. Cunningham, R. Urbanczik, M. Perkins, M. Aziz, M. Pai, Sputum processing methods to improve the sensitivity of smear microscopy for tuberculosis: a systematic review, Lancet Infect. Dis. 6 (2006) 664–674.
[8] H. Zhu, S.O. Isikman, O. Mudanyali, A. Greenbaum, A. Ozcan, Optical imaging techniques for point-of-care diagnostics, Lab Chip 13 (1) (2013) 51–67.
[9] G. Wu, M.H. Zaman, Low-cost tools for diagnosing and monitoring HIV infection in low-resource settings, Bull. World Health Organ. 90 (2012) 914–920.
[10] F. Ghaznavi, A. Evans, A. Madabhushi, M. Feldman, Digital imaging in pathology: whole-slide imaging and beyond, Annu. Rev. Pathol. Mech. Dis. 8 (2013) 331–359.
[11] G. Huang, C. Deng, J. Zhu, S. Xu, C. Han, X. Song, X. Yang, Digital imaging scanning system and biomedical applications for biochips, J. Biomed. Opt. 13 (3) (2008).
[12] R. Weinstein, M. Descour, C. Liang, G. Barker, K. Scott, L. Richter, E. Krupinski, A. Bhattacharyya, J. Davis, A. Graham, M. Rennels, W. Russum, J. Goodall, P. Zhou, A. Olszak, B. Williams, J. Wyant, P. Bartels, An array microscope for ultrarapid virtual slide processing and telepathology: design, fabrication, and validation study, Hum. Pathol. 35 (11) (2004).
[13] S. Moon, H.O. Keles, Y.-G. Kim, D. Kuritzkes, U. Demirci, Lensless imaging for point-of-care testing, Conf. Proc. IEEE Eng. Med. Biol. Soc. (2009) 6376–6379.
[14] A. Ozcan, U. Demirci, Ultra wide-field lens-free monitoring of cells on-chip, Lab Chip 8 (1) (2008) 98–106.
[15] O. Mudanyali, W. Bishara, A. Ozcan, Lensfree super-resolution holographic microscopy using wetting films on a chip, Opt. Express 19 (18) (2011) 17378–17389.
[16] A. Greenbaum, W. Luo, T.-W. Su, Z. Göröcs, L. Xue, S.O. Isikman, A. Coskun, O. Mudanyali, A. Ozcan, Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy, Nat. Methods 9 (9) (2012) 889–895.
[17] D.N. Breslauer, R.N. Maamari, N.A. Switz, W.A. Lam, D.A. Fletcher, Mobile phone based clinical microscopy for global health applications, PLoS One 4 (7) (2009).
[18] A. Skandarajah, C. Reber, N. Switz, D. Fletcher, Quantitative imaging with a mobile phone microscope, PLoS One 9 (5) (2014).
[19] S.A. Boppart, R. Richards-Kortum, Point-of-care and point-of-procedure optical imaging technologies for primary care and global health, Sci. Transl. Med. 6 (253) (2014).
[20] M.N. Gulari, A. Tripathi, N. Chronis, Microfluidic-based oil-immersion lenses for high resolution microscopy, in: 16th International Conference on Miniaturized Systems for Chemistry and Life Sciences, Okinawa, Japan, October 28 – November 1, 2012.
[21] E. Peli, Contrast in complex images, J. Opt. Soc. Am. A 7 (10) (1990) 2032–2040.
[22] F. Helmli, S. Scherer, Adaptive shape from focus with an error estimation in light microscopy, in: Image and Signal Processing and Analysis (ISPA), Pula, Croatia, 2001.
[23] A. Santos, C. Ortiz de Solorzano, J.J. Vaquero, J.M. Peña, N. Malpica, F. del Pozo, Evaluation of autofocus functions in molecular cytogenetic analysis, J. Microsc. 188 (3) (1997) 264–272.
[24] Y. Sun, S. Duthaler, B. Nelson, Autofocusing in computer microscopy: selecting the optimal focus algorithm, Microsc. Res. Tech. 65 (2004) 139–149.
[25] J. Tanida, T. Kumagai, K. Yamada, S. Miyatake, K. Ishida, T. Morimoto, N. Kondou, D. Miyazaki, Y. Ichioka, Thin observation module by bound optics (TOMBO): concept and experimental verification, Appl. Opt. 40 (11) (2001) 1806–1813.
[26] J. Lee, J. Kim, G. Koo, C. Kim, D. Shin, J. Sim, Y. Won, Analysis and reduction of crosstalk in the liquid lenticular lens array, IEEE Photonics J. 9 (3) (2017).
[27] Y. Zheng, M. Grossman, S.P. Awate, J.C. Gee, Automatic correction of intensity nonuniformity from sparseness of gradient distribution in medical images, Med. Image Comput. Comput. Assist. Interv. 12 (2) (2009) 852–859.
[28] F. Sadeghian, Z. Seman, A.R. Ramli, B.H.A. Kahar, M. Saripan, A framework for white blood cell segmentation in microscopic blood images using digital image processing, Biol. Proced. Online 11 (1) (2009) 196–206.
[29] J.F. Ribeiro, A.C. Costa, J.M. Gomes, C.G. Costa, S.B. Goncalves, R.F. Wolffenbuttel, J.H. Correia, PDMS microlenses for optical biopsy microsystems, IEEE Trans. Ind. Electron. 64 (12) (2017) 9683–9690.

Biographies

G. Korompili holds a diploma in Electrical and Electronic Engineering from the National Technical University of Athens. After graduation, she worked in the field of systems and data acquisition for various applications, including a two-year fellowship at CERN. In 2012 she joined the Institute of Nanoscience and Nanotechnology of N.C.S.R. Demokritos, and in particular the laboratory of Microelectromechanical Systems for Biological Applications (bioMEMS). Since 2014 she has been a PhD candidate at the University of Crete, Department of Materials Science and Technology, in the field of bioMEMS, under the supervision of Prof. Nikos Chronis. Her research interests focus on the design and construction of bio-imaging and microscopy platforms, micro-optics systems, microfluidic systems, as well as automated microsystems for various biological applications.

G.P. Kanakaris is a mechanical engineer specializing in mechanical design and biotechnology instrument development. He received his M.Sc. from the National Technical University of Athens in 2011. His research interests include high-throughput proteomic measurement platforms as well as point-of-care systems for in vitro diagnostics. He is currently a PhD candidate in the field of proteomic measurement technologies.

C. Ampatis is a mechanical engineer with specialization in aeronautics, control systems and robotics. He holds two M.Sc. diplomas in the aforementioned fields, acquired from the National Technical University of Athens in 2010 and 2012, respectively. His research interests include axial compressor design and performance, wind tunnel design, and multi-rotor aerial vehicle design and performance, as well as mechatronics and its application to process development and point-of-care devices. He is currently working in the consumer products industry as a product design-prototype and process development engineer.

Professor Chronis received a Bachelor of Engineering (B.E.) in mechanical engineering in 1998 from the Aristotle University of Thessaloniki with honors (graduated 1st out of 145 students in his class). He completed his Ph.D. in mechanical engineering at the University of California at Berkeley (USA) in 2004. From 2004 to 2006, he held a post-doctoral research position at Rockefeller University, New York (USA). In 2006, he joined the faculty of the Department of Mechanical Engineering at the University of Michigan, Ann Arbor. He is the co-author of more than 60 journal and peer-reviewed conference publications and an inventor on 3 patents. Dr. Chronis is the recipient of the prestigious NIH Director's New Innovator Award.