In silico imaging: Definition, possibilities and challenges





Aldo Badano
Division of Imaging and Applied Mathematics, Office of Science and Engineering Laboratories, Center for Devices and Radiological Health, Food and Drug Administration, United States
E-mail address: [email protected]
Adapted from an invited talk given on June 11th, 2010, during the 4th International Conference on Imaging in Subatomic Physics, Astrophysics, Medicine and Biology, Stockholm, 2010.

Available online 23 November 2010

Abstract

The capability to simulate the imaging performance of new detector concepts is crucial to develop the next generation of medical imaging systems. Proper modeling tools allow for optimal designs that maximize image quality while minimizing patient and occupational radiation doses. In this context, in silico imaging has become an emerging field of imaging research. This paper reviews current progress and challenges in the simulation of imaging systems with a focus on Monte Carlo approaches to X-ray detector modeling, acceleration approaches, and validation strategies.

Keywords: Medical imaging simulation; Monte Carlo; In silico imaging

1. Introduction

Future editions of technical dictionaries might include the following definition:

in·si·li·co im·ag·ing. Pronunciation: in-'si-li-ko 'im-aj-in. Function: n sing. Etymology: New Latin, literally, imaging in computer. Date: circa 2010. : Computer simulation of an entire imaging system (including source, object, detection and observer components) used for research, development, optimization, technology assessment, and regulatory evaluation of new technology to complement bench testing. See also com·pu·ta·tion·al im·ag·ing. From Merriam & Webster Unabridged Dictionary, 2015.¹

¹ The device of quoting an entry from a fictitious future edition of an English dictionary is borrowed from Lichtman and Sanes, Curr. Opin. Neurobiol. 18(3) (June 2008) 346–353.

In this proposed definition, the uses of in silico imaging extend beyond the conventional applications of research and development of new imaging technology into areas where computer simulation has not yet been applied at any significant level. For instance, in an adaptive imaging system, real-time adjustment of the imaging technique to a given patient characteristic (e.g., size) would be possible by solving the optimization problem for the imaging task at hand [1,2]. In addition, powerful simulation tools could be used by industry and regulatory agencies to better understand modifications to existing devices and to predict the performance of a new technology, thus facilitating the rapid deployment of meritorious imaging devices while exposing significant pitfalls in defective alternative designs.

Scientists with optimistic views on technological progress often argue that in silico imaging is especially needed: a methodology already commonplace in other industries [3–9], it can provide insight into devices not yet available in the physical world. Many also argue that it can save resources, for example by eliminating unnecessary steps in the prototyping stages. Equally important, in silico imaging approaches contribute to reducing unnecessary radiation dose and to minimizing human and animal testing. Skeptics' counterarguments fall broadly into two classes: (a) accuracy (in silico imaging does not portray reality) and (b) computational efficiency (it takes too long).

In this paper, we review the current state of the art in in silico imaging through a description of available tools, and we discuss the areas where more work is needed before in silico imaging can be used widely in the sense of the definition proposed above. Within this scope, it is useful to define the imaging problem in more detail; it is represented schematically in Fig. 1. The imaging problem consists of an object, an image, and the use of the image as part of a medical decision process. The imaging system can be fully described [10] as the combined effect of a deterministic component and a stochastic component, denoted here H and n. The deterministic component H can be described in the spatial domain (with the detailed response function map) or, under the assumptions of linearity and shift-invariance, in the spatial frequency domain using the modulation transfer function. Similarly, the noise contributions can be described in the spatial and spatial frequency domains using the covariance matrix and the noise power spectrum, respectively. Once images are obtained, they are used for a particular task, in this case to support a medical decision based on the imaging data.
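In the linear formulation of Ref. [10], this description reduces to a single standard equation; it is reproduced here as a sketch, using the symbols of Fig. 1, only to make the roles of H and n explicit:

\[ \mathbf{g} = \mathcal{H} f + \mathbf{n} \]

where f is the object, \mathcal{H} the deterministic system operator, \mathbf{n} the measurement noise, and \mathbf{g} the recorded image. Under linearity and shift-invariance, \mathcal{H} is summarized by the modulation transfer function (the normalized modulus of the Fourier transform of the point response), and the second-order statistics of \mathbf{n} by the noise power spectrum (the Fourier transform of the noise covariance).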


Fig. 1. The imaging system to model in the computer: an object f is mapped by the deterministic system response H, with measurement noise n added, into an image g; a reader then uses g to reach a medical decision. Adapted from Ref. [10].

In silico imaging covers all aspects of this problem. The most widely used application of computational techniques is the modeling of the physics of X-ray generation, transport and detection [11]. However, computational methods have also been developed to address topics related to the object (e.g., dose, variability due to patient characteristics and to organ movement) and topics related to how users extract information from the images, known as model or mathematical observers. The integration of these tools to model all aspects of an entire imaging system in the computer is what we define as in silico imaging. In this paper, we concentrate on recent advances in accuracy and efficiency that make the definition proposed at the beginning of this paper feasible.

2. Example: investigation of the oblique incidence effect on the response of X-ray detectors

To illustrate the potential of in silico imaging, I briefly recount a recent development in the area of breast imaging detectors used in mammography and tomosynthesis. In August 2006, Mainprize et al. [12] published experimental measurements of the response of a full-field digital mammography detector for oblique X-ray incidence. The authors reported results in terms of the angular MTF (MTF_θ) computed for an edge placed perpendicularly with respect to the angular tilt of the X-ray beam. Their definition of MTF_θ is simply the ratio of the empirical MTF at an oblique X-ray incidence angle to the MTF at normal incidence, MTF_θ = MTF(θ)/MTF(θ = 0°). The data indicate that the degradation in detector resolution grows significantly with obliquity, reaching 50% at 30° incidence. Moreover, the authors concluded that obliquity blurring also contributes to a small spatial shift in the detector response that may affect the quality of a tomosynthesis reconstruction.

At about the same time, our group published simulation results that predicted a similar degradation in resolution for a similar detector model [13]. The results in this case were presented in terms of the response of the detector to an X-ray pencil beam at oblique incidence angles. Similar reports followed for different detector models [14] and for different breast imaging modalities, including breast CT with flat-panel detectors [15]. The predictions from the Monte Carlo simulations were later compared to the experimental results of Mainprize by building an in silico replica of Mainprize's experiment. In addition, the simulation platform allows us to predict the degradation in MTF for a number of different cases, including detectors with CsI(Tl) phosphors of different thicknesses and different X-ray energy spectra. This capability can be used by developers of medical imaging systems to investigate potential modifications for improving image quality. For instance, a developer could explore cases where a system is limited by the absorption of primary X-ray quanta in the detector or by elevated electronic noise levels. An increase in the detector active-layer thickness can provide the additional signal required to overcome such limitations; however, increased thickness also increases the degradation of the angular MTF.
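The physics behind this trade-off can be illustrated with a deliberately simple sketch (not the MANTIS model): X-ray interaction depths follow a Beer–Lambert distribution through the phosphor, and for a beam tilted by θ each interaction point is displaced laterally by depth × tan θ, so the spread of the response grows with both thickness and angle. The attenuation coefficient below is a round illustrative number, not a measured CsI(Tl) value.

import numpy as np

rng = np.random.default_rng(0)

def lateral_spread(thickness_um, theta_deg, mu_per_um=0.01, n=100_000):
    """Std. dev. (um) of the lateral displacement of X-ray interaction
    points for a pencil beam tilted by theta, in a slab of given thickness.
    Depths are sampled from an exponential law truncated to the slab."""
    u = rng.random(n)
    # inverse-CDF sampling of the truncated exponential depth distribution
    depth = -np.log(1.0 - u * (1.0 - np.exp(-mu_per_um * thickness_um))) / mu_per_um
    lateral = depth * np.tan(np.radians(theta_deg))
    return lateral.std()

for t in (100, 130, 200):          # phosphor thickness in micrometers
    for ang in (0, 15, 30):        # incidence angle in degrees
        print(f"T = {t:3d} um, theta = {ang:2d} deg -> spread ~ {lateral_spread(t, ang):5.1f} um")

The trend, a spread that grows roughly with tan θ and with thickness, is what drives the MTF_θ degradation discussed above; the full Monte Carlo treatment additionally tracks the optical transport that this sketch ignores.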
The in silico imager needs to understand the limitations of these tools and methods. Often, however, estimates of the uncertainty of the empirical and computational results are not directly available and are not trivial to calculate. For the Monte Carlo simulations, for instance, the statistical uncertainty introduced by a finite number of histories has to be evaluated together with other sources of uncertainty, such as the a priori knowledge of the geometry and materials of the detector model. While not necessarily quantitative in an absolute sense, the Monte Carlo tool MANTIS can provide definitive trends for this type of analysis.

Fig. 2. A comparison of experimental data from Ref. [12] with in silico predictions from the MANTIS code of the angular MTF for a full-field digital mammography detector.

Fig. 2 shows a sample of the results obtained with the in silico replica of Mainprize's experiment. The computational results were generated with MANTIS [11], a Monte Carlo x-rAy, electroN and opTical Imaging Simulation tool described in the literature [11]. Some of its distinguishing features are: (1) intimate coupling of γ, x, e± and optical photon transport, combining the PENELOPE package with the DETECT-II optical transport routines, which ensures accurate physics models over a wide energy range; (2) open source code (available at http://code.google.com/p/mantismc/); (3) simultaneous scoring of dose and imaging performance; and (4) partial validation against empirical data.

The analysis can be extended to other devices and conditions not available in the physical world. Fig. 3 shows a sample of those results. We used X-ray spectra from measured data obtained by Jennings et al. [16] for two beam qualities: a 26-kV Mo target with intrinsic Mo filtration, and a 40-kV Rh anode with Al filtration plus 1 mm of additional Al. The CsI thickness used for the in silico replica of the Mainprize experiment was Mainprize's estimate (130 μm) for the GE Senographe 2000D. These results allow the designer of an imaging system (in this case, a tomosynthesis system) to investigate the degradation in resolution to be expected as the thickness of the detector is increased to improve the detective quantum efficiency and reduce the detector noise currently limiting low-dose tomosynthesis approaches for breast imaging. Although our results are limited to indirect detectors, direct detectors based on semiconductors such as a-Se would also be affected by this anisotropy effect.
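For readers who want to reproduce the figure of merit itself, the angular MTF of Figs. 2 and 3 is simply a ratio of two ordinary MTFs, each obtained from a measured or simulated line (or pencil-beam) spread function. A minimal sketch with synthetic Gaussian spread functions standing in for the real detector data (the widths are arbitrary placeholders):

import numpy as np

def mtf_from_lsf(lsf, pixel_mm):
    """MTF as the normalized modulus of the FFT of a line spread function."""
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)   # cycles/mm
    return freqs, mtf / mtf[0]

pixel_mm = 0.01
x = (np.arange(2048) - 1024) * pixel_mm

# synthetic spread functions: oblique incidence broadens the response
lsf_normal  = np.exp(-0.5 * (x / 0.05) ** 2)   # placeholder width, 0 deg
lsf_oblique = np.exp(-0.5 * (x / 0.08) ** 2)   # placeholder width, 30 deg

f, mtf0  = mtf_from_lsf(lsf_normal, pixel_mm)
_, mtf30 = mtf_from_lsf(lsf_oblique, pixel_mm)

mtf_theta = mtf30 / np.clip(mtf0, 1e-12, None)  # MTF_theta = MTF(30 deg)/MTF(0 deg)
for fi, r in zip(f[:40:10], mtf_theta[:40:10]):
    print(f"{fi:5.1f} cycles/mm : MTF_theta = {r:.2f}")

With simulated pencil-beam responses or measured edge data in place of the synthetic Gaussians, the same arithmetic yields curves of the type plotted in Figs. 2 and 3.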


Fig. 3. Computational predictions of the effect of phosphor thickness on MTF_θ for 26-kV (top row) and 40-kV (bottom row) X-ray beams. The data were obtained using a Monte Carlo tool (MANTIS) for the study of light transport in radiation detectors (see text for details).

Fig. 4. (a) Pulse-height spectra for CsI(Tl) screens in front-screen configuration and 100 μm thickness at different X-ray energies (9.5–49.5 keV; frequency versus number of optical quanta per keV), and (b) the corresponding information factors (I) versus E (keV), along with experimental data from Ref. [20] (legend: Zhao04, Zhao06, MANTIS).

This effect was described early on by Que and Rowlands [17] using a simplified analytical model of a-Se devices based on geometry and on the simplistic assumption of a transducer gain that does not vary with depth of interaction. It is worth pointing out that even if depth effects (also known as Lubberts effects [18,19]) are present in semiconductor X-ray imagers, the effect is likely to be different in nature from that in scintillator-based detectors. The anisotropy from oblique incidence documented for indirect detectors is caused in part by the change in optical collection efficiency with the depth of interaction in the crystal. For semiconductor detectors, depending on the arrangement of the circuitry, the charge collection can be dominated by hole or electron transport, and the device can therefore be designed to generate a strong signal for interactions near the X-ray entrance surface or near the opposite electrode. More research is needed to explore depth effects in semiconductor detectors and their influence on spatial spread and detective quantum efficiency.
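The depth (Lubberts) effect mentioned above can be made concrete with a toy calculation that is not tied to any particular detector: interactions at each depth contribute a blur whose width depends on that depth, and the detector MTF is the absorption-weighted mix of those depth-dependent MTFs. The Gaussian blur model and the coefficients below are illustrative assumptions only.

import numpy as np

def depth_weighted_mtf(freqs, thickness_um, mu_per_um=0.01,
                       sigma0_um=20.0, sigma_slope=0.3, n_layers=200):
    """Toy Lubberts-style MTF: absorption-weighted average of Gaussian MTFs
    whose blur width grows linearly with distance from the exit (readout) side."""
    z = np.linspace(0.0, thickness_um, n_layers)            # interaction depth from entrance
    w = np.exp(-mu_per_um * z)                              # Beer-Lambert weights
    w /= w.sum()
    sigma = sigma0_um + sigma_slope * (thickness_um - z)    # blur vs. distance to readout
    # a Gaussian PSF of width sigma has MTF exp(-2 pi^2 sigma^2 f^2)
    f_um = freqs[:, None] / 1000.0                          # cycles/mm -> cycles/um
    mtf_z = np.exp(-2.0 * np.pi**2 * (sigma[None, :] * f_um) ** 2)
    return (mtf_z * w[None, :]).sum(axis=1)

freqs = np.array([0.0, 2.0, 5.0, 10.0])                     # cycles/mm
for t in (100, 200, 400):
    print(f"T = {t} um:", np.round(depth_weighted_mtf(freqs, t), 3))

The weighted sum shows the expected trend: the thicker the layer, the larger the share of interactions far from the readout plane and the lower the mid-frequency MTF, which is the qualitative behavior reported for the indirect detectors above.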

3. Accuracy

The accuracy of computational models has to be validated against a wide range of experimental results. In particular, it is common in the scientific literature for a model to be adjusted to fit the experimental results using a floating factor or coefficient whose value cannot be determined experimentally and is therefore allowed to vary until a satisfactory fit is obtained. To minimize the arbitrariness of this approach, it is always useful to validate against a range of experimental conditions and with a range of figures of merit. For example, in our previous work we have used two quantities that can be computed from the Monte Carlo simulation of indirect detectors and that can be measured experimentally for a range of X-ray energies and detector types. First, we have used the distribution of signal outputs characterized by the information (Swank) factor, defined as I = M1(q)^2/[M0(q) M2(q)], where Mi(q) is the ith moment of the pulse-height spectrum M(q). If the output M(q) follows a Poisson distribution, then I = q/(q+1) for a mean signal q, which is greater than 0.9 for any realistic mean signal; for a mean signal of q = 200, I = 0.995. It is also worth noting that I is related to the Fano factor (M1(q)/σ²(q)), which is unity for Poisson distributions and less than unity for most scintillators [21]. Validation results expressed in terms of I for a variety of screen designs have been published in Refs. [13,14]. A summary figure is shown in Fig. 4.
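As a sketch of how this figure of merit is obtained in practice from a simulated (or measured) pulse-height spectrum, the snippet below computes the moments and the information factor; the Poisson example reproduces the I = q/(q+1) value quoted above. The binning is arbitrary and the spectrum is synthetic.

import numpy as np

def swank_info_factor(counts, bin_values):
    """Information (Swank) factor I = M1^2 / (M0 * M2) of a pulse-height spectrum,
    where Mi is the ith moment of the spectrum over the output variable q."""
    m0 = counts.sum()
    m1 = (counts * bin_values).sum()
    m2 = (counts * bin_values**2).sum()
    return m1**2 / (m0 * m2)

# synthetic example: Poisson-distributed number of optical quanta, mean 200
rng = np.random.default_rng(1)
q = rng.poisson(200, size=500_000)
counts, edges = np.histogram(q, bins=np.arange(q.min(), q.max() + 2))
values = edges[:-1]          # integer-valued bins, so the left edge is the output value
print(f"I = {swank_info_factor(counts, values):.4f}   (expected 200/201 = {200/201:.4f})")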


Secondly, we have used the characterization of the spatial spread in response to a pencil-beam X-ray source. By combining these two quantities, which capture the resolution and noise aspects of the detector, it is possible to have some confidence in the results of the simulations. A closer match to the empirical data could of course be obtained by adjusting all model parameters to fit either the spread or the output statistics alone, but such tuning risks overfitting; the combined validation approach offers additional assurance that the models are not overly trained for a particular quantity of interest. The reader is referred to Refs. [19,22] for details.

Even with sound validation efforts, in silico imaging of indirect detectors remains a challenge, mostly because of the geometrical complexity of the structures to be modeled and their effect on the imaging characteristics of the detector. As an example of that variability, Fig. 5 shows scanning electron micrographs of several CsI(Tl) screens. Even though the structures appear only in a two-dimensional cut through the phosphor, it is obvious that the columnar arrays have features that vary significantly from screen to screen, even for samples from the same manufacturer, and that including these features in geometric models is not a trivial pursuit.

Another aspect that is crucial to the development of accurate in silico imaging tools is the openness and reproducibility of the results by independent investigators, a concept that is foundational for all scientific endeavors. First, the modeling of the physical processes that take place in a simulated experiment should be based on basic, well-known physical principles whenever possible. Second, it is necessary to state clearly the assumptions and approximations used in the implementation of the model and to discuss the limits of its generalization. One way to contribute to these goals is to distribute simulation codes as open source software packages, encouraging the scientific community to validate, adapt and evolve the models and thereby increasing confidence in the correctness of the models and of the particular implementation.

Fig. 5. SEMs of CsI(Tl) screens and the parameters derived from the images as input for the Monte Carlo MANTIS simulations: (a) HAMAMATSU, J8978, 150 μm, a-C backing, columnar tilt 8°, and (b) RMD, 390 μm, graphite, columnar tilt 2°. Images courtesy of Melanie Freed (FDA) and Stuart Miller (RMD).


4. Computational efficiency

The other major obstacle to in silico imaging applications is the need for approaches that improve current levels of computational efficiency by several orders of magnitude. To frame the severity of the problem, consider a typical simulation scenario in which we are interested in a clinical X-ray fluence (10^10 mm^-2), with each primary generating on the order of 10^4 scattering events in the detector. Each of these events deposits enough energy to create about 10^3 optical quanta, and each optical photon experiences, in a typical columnar CsI(Tl) phosphor, on the order of 10^4 scattering events, most of them total internal reflections at the column walls. In total, on the order of 10^21 events per mm^2 have to be tracked, which translates into 10^25 events for a detector of 10 × 10 cm. This number of events is about 2 orders of magnitude larger than N_A (Avogadro's number), and about 9 orders of magnitude larger than the current estimate of the number of neuronal connections in the human brain.

Several approaches can be used to increase computational efficiency: (a) implementing a less accurate model by simplifying the description of processes that are not critical to the final outcome of the simulation, (b) modifying the model implementation to avoid computational bottlenecks (e.g., using precalculated data when possible), and (c) using computer engineering methods to execute the existing code more efficiently (for example, parallel execution, high-performance hardware, or optimized compilers). Our group has focused on three solutions that help speed up complex Monte Carlo simulations: octree structures to accelerate particle tracking in complex, anatomically accurate phantoms; graphical processing units (GPUs) and their massively parallel architecture; and analytical models derived from the Monte Carlo output and validated with empirical data.

Badal et al. showed that for highly detailed anatomical phantoms described with triangle meshes, simulation efficiency can be increased by up to five orders of magnitude when the millions of triangles of the model are sorted in space using an octree structure [23]. Their general-purpose Monte Carlo simulation code, penMesh [24], combines the accurate physics subroutines of PENELOPE [25] with the flexibility of a geometry based on surface meshes. Triangular meshes can describe any arbitrary object and are used extensively in computer graphics to model complex scenes. Further speed-ups can be achieved by exploiting the massively parallel architecture of GPUs. For instance, Badal and Badano demonstrated that photon transport in voxelized geometries can be accelerated up to 27 times compared to the same code run on a CPU [26]. Badal's MC-GPU code efficiently simulates X-ray transport in voxelized phantoms; the algorithm and C code are in the public domain and are openly and freely distributed at http://code.google.com/p/mcgpu/. This has enabled Monte Carlo simulations of computed tomography of a detailed anatomical phantom from the Virtual Family [27]: the total simulation time for 360 projections of a 580 × 260 × 500 phantom with 1-mm^3 voxels can now be achieved in days (as opposed to months) on an 8-GPU cluster using MPI.
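A sketch of the back-of-envelope estimate above, using the same order-of-magnitude inputs quoted in the text (rough, illustrative figures, not measured values):

# Order-of-magnitude event count for a full-field indirect detector simulation.
fluence_per_mm2      = 1e10   # clinical X-ray fluence (photons / mm^2)
scatter_per_primary  = 1e4    # scattering events per primary in the detector
optical_per_event    = 1e3    # optical quanta created per energy-deposition event
scatters_per_optical = 1e4    # optical scattering events (mostly TIR at column walls)

events_per_mm2 = fluence_per_mm2 * scatter_per_primary * optical_per_event * scatters_per_optical
detector_area_mm2 = 100 * 100   # 10 x 10 cm detector

print(f"events per mm^2 : {events_per_mm2:.0e}")                              # ~1e21
print(f"events total    : {events_per_mm2 * detector_area_mm2:.0e}")          # ~1e25
print(f"ratio to N_A    : {events_per_mm2 * detector_area_mm2 / 6.022e23:.0f}x")

Running the numbers makes clear why improvements of several orders of magnitude, rather than constant factors, are required.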
In addition, we have recently demonstrated that for optical transport processes, speed-up factors of 50 can be achieved when comparing GPU with CPU performance [28]. However, Monte Carlo simulations will become increasingly detailed as scientists tackle more complex objects to be imaged and more complex system components. Many groups are also developing new modeling tools that describe fundamental aspects of the interaction of radiation and matter in imaging detectors. For instance, Fang et al. [29] have proposed a detailed Monte Carlo simulation of the direct detection process, involving electron–hole pair transport, that can be incorporated into simulation packages of imaging systems.


In that case, which involves simulating the interaction of many thousands of electron–hole pairs with the bias electric field in the detector, massive parallelization is indispensable.

Finally, while we wait for computer technology to reach new levels of processing power, alternative solutions need to be implemented. For instance, Freed et al. [30] recently described an analytical, physics-based model to compute detector response functions. The model describes, to a reasonable degree of accuracy, the dependence of the response on X-ray energy, transducer-layer thickness, and incidence angle of the X-ray beam. The analytical model, which can produce response functions in a fraction of a second on any modern computer hardware, provides a detector description similar to the Monte Carlo MANTIS results obtained after several hours of computing on large parallel clusters. Extensions of the model to a wider range of energies and thicknesses are currently being investigated.
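To illustrate what a physics-based analytical response model can look like (a schematic toy that combines the depth-weighting ideas used earlier in this paper, not the parameterization of Ref. [30]), the function below evaluates a pencil-beam response directly by integrating a depth-dependent Gaussian spread, shifted for oblique incidence and weighted by Beer–Lambert absorption; all coefficients are illustrative placeholders.

import numpy as np

def toy_response(x_um, energy_keV, thickness_um, theta_deg,
                 sigma0_um=20.0, sigma_slope=0.3, n_layers=200):
    """Toy analytical pencil-beam response: Beer-Lambert weighted sum over depth of
    Gaussians whose center shifts with obliquity and whose width grows with depth.
    mu(E) is a crude power-law placeholder, not tabulated CsI attenuation data."""
    mu = 0.05 * (20.0 / energy_keV) ** 3                    # 1/um, illustrative only
    z = np.linspace(0.0, thickness_um, n_layers)
    w = mu * np.exp(-mu * z)
    w /= w.sum()
    shift = z * np.tan(np.radians(theta_deg))               # lateral shift of each layer
    sigma = sigma0_um + sigma_slope * z                     # optical blur vs. depth
    g = np.exp(-0.5 * ((x_um[:, None] - shift[None, :]) / sigma[None, :]) ** 2)
    g /= sigma[None, :] * np.sqrt(2.0 * np.pi)              # normalized Gaussians
    return (g * w[None, :]).sum(axis=1)                     # response in 1/um

x = np.linspace(-500, 500, 1001)                            # um
for theta in (0, 15, 30):
    r = toy_response(x, energy_keV=20.0, thickness_um=130.0, theta_deg=theta)
    mean = (x * r).sum() / r.sum()
    width = np.sqrt(((x - mean) ** 2 * r).sum() / r.sum())
    print(f"theta = {theta:2d} deg: centroid shift ~ {mean:5.1f} um, spread ~ {width:5.1f} um")

Because the response is evaluated by a direct sum over depth layers rather than by tracking individual quanta, it runs in milliseconds, which is the practical appeal of analytical models of this kind.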

5. Discussion and summary

The age of in silico imaging is near. By analyzing the requirements and focusing our efforts on the roadblocks to the efficient development of the needed tools, we will soon be able to implement imaging systems in the computer with accuracy that approaches reality and with manageable computational cost. With the help of novel tools and techniques, we will be able to incorporate a complete simulation of the imaging process into the design and development of systems. These tools will also affect the regulatory review of system improvements and of new concepts by alleviating the need for clinical studies and by reducing resource consumption and the radiation dose delivered to humans and animals in the initial experimentation phases. This review of current efforts to improve the accuracy and the computational efficiency of in silico imaging suggests that, under our proposed definition, in silico imaging can be considered indispensable and unavoidable. The required computer power will soon be available and, perhaps, the only major obstacle that remains is the need for continuous and rigorous validation efforts.

References

[1] E. Clarkson, M. Kupinski, H. Barrett, L. Furenlid, Proceedings of the IEEE 96 (3) (2008) 500.
[2] M. Freed, M.A. Kupinski, L.R. Furenlid, D.W. Wilson, H.H. Barrett, Medical Physics 35 (5) (2008) 1912.
[3] J. Poropudas, K. Virtanen, IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans 40 (5) (2010) 1057.
[4] W. Jianqiang, L. Shengbo, H. Xiaoyu, L. Keqiang, IEEE Transactions on Intelligent Transport Systems 4 (2) (2010) 121.
[5] F. Gao, K. Strunz, IEEE Transactions on Power Systems 24 (2) (2009) 561.
[6] N. Allec, R. Knobel, L. Shang, IEEE Transactions on VLSI Systems 18 (8) (2010) 1253.
[7] W. Li, G. Joos, J. Belanger, IEEE Transactions on Industrial Electronics 57 (4) (2010) 1137.
[8] V. Jalili-Marandi, L.-F. Pak, V. Dinavahi, IEEE Transactions on Industrial Electronics 57 (9) (2010) 3010.
[9] T. Kerwin, H.-W. Shen, D. Stredney, IEEE Transactions on Visualization and Computer Graphics 15 (5) (2009) 747.
[10] H.H. Barrett, K. Myers, Foundations of Image Science, first ed., John Wiley & Sons, 2004.
[11] A. Badano, J. Sempau, Physics in Medicine and Biology 51 (6) (2006) 1545.
[12] J.G. Mainprize, A.K. Bloomquist, M.P. Kempston, M.J. Yaffe, Medical Physics 33 (9) (2006) 3159.
[13] A. Badano, I.S. Kyprianou, J. Sempau, Medical Physics 33 (8) (2006) 2698.
[14] A. Badano, I.S. Kyprianou, R.J. Jennings, J. Sempau, Medical Physics 34 (11) (2007) 4076.
[15] A. Badano, I.S. Kyprianou, M. Freed, R.J. Jennings, J. Sempau, IEEE Transactions on Medical Imaging 28 (5) (2009) 696.
[16] R.J. Jennings, P. Quinn, R.M. Gagne, T.R. Fewell, Proceedings of the SPIE 1896 (1993) 259.
[17] W. Que, J.A. Rowlands, Medical Physics 22 (4) (1995) 365.
[18] G. Lubberts, Journal of the Optical Society of America 58 (11) (1968) 1475.
[19] A. Badano, R.M. Gagne, B.D. Gallas, R.J. Jennings, J.S. Boswell, K.J. Myers, Medical Physics 31 (11) (2004) 3122.
[20] W. Zhao, G. Ristic, J.A. Rowlands, Medical Physics 31 (9) (2004) 2594.
[21] W.W. Moses, Nuclear Instruments and Methods in Physics Research A 487 (2002) 123.
[22] M. Freed, S. Miller, K. Tang, A. Badano, Medical Physics 36 (11) (2009) 4944.
[23] A. Badal, I. Kyprianou, A. Badano, J. Sempau, Radiotherapy and Oncology 86 (1) (2008) 99.
[24] A. Badal, I. Kyprianou, D.P. Banh, A. Badano, J. Sempau, IEEE Transactions on Medical Imaging 28 (12) (2009) 1894.
[25] F. Salvat, J.M. Fernández-Varea, J. Sempau, PENELOPE, A Code System for Monte Carlo Simulation of Electron and Photon Transport, OECD Nuclear Energy Agency, Issy-les-Moulineaux, France, 2003. Available at: http://www.nea.fr.
[26] A. Badal, A. Badano, Medical Physics 36 (11) (2009) 4878.
[27] A. Christ, W. Kainz, E. Hahn, Physics in Medicine and Biology 55 (2010) N23.
[28] D. Sharma, A. Badal, A. Badano, Improving the computational efficiency of optical Monte Carlo transport for indirect X-ray imaging using parallel architectures, submitted to ISBI 2011.
[29] Y. Fang, A. Badal, N. Allec, K. Karim, A. Badano, Monte Carlo simulation of amorphous selenium imaging detectors, Proceedings of the SPIE 7622 (2010).
[30] M. Freed, S. Park, A. Badano, Medical Physics 37 (6) (2010) 2593.