Computerized Medical Imaging and Graphics, Vol. 13, No. 1, pp. 3-30, 1989. Printed in the U.S.A. All rights reserved. 0895-6111/89 $3.00 + .00. Copyright © 1989 Pergamon Press plc.

IMAGE PROCESSING FOR THE REST OF US: THE POTENTIAL UTILITY OF INEXPENSIVE COMPUTERIZED IMAGE ANALYSIS IN CLINICAL PATHOLOGY AND RADIOLOGY

Donald L. McEachron, Image Processing Center and Department of Bioscience and Biotechnology, Drexel University, Philadelphia, PA 19104

Susan Hess, Biomedical Engineering and Science Institute, Drexel University, Philadelphia, PA 19104

Lewis B. Knecht, IBM, Inc., 10401 Fernwood Rd., Bethesda, MD 20817

and Lawrence D. True, Department of Radiology, School of Medicine, Yale University, 108 Lauder Hall, New Haven, CT 06510-8023

(Received 4 August 1988)

Abstract-Recent progress in computer technology in both hardware and software, combined with marked cost reductions, has placed quantitatively accurate video densitometry systems within the reach of the individual clinician, biomedical researcher, and community hospital. While much of the attention generated by advances in image processing has focused on larger scale procedures, such as CAT, chemical shift, and positron emission tomography, important applications can be found for considerably more modest systems. In this article, we discuss three such applications of DUMAS, a personal computer-based imaging system developed by the Image Processing Center at Drexel University. A potential technique for quantifying numbers of estrogen receptors in tumorous breast tissue samples as a predictor of patient responsiveness to hormonal therapy is described first, along with possible sources of error. The second application, also related to clinical pathology and cancer, outlines methods for relating changes in nuclear and cell morphology to the diagnosis of Sezary Cell Syndrome. The utility of binary image filtering methods in the classification of cell types is discussed. The third application involves the development of a semi-automatic procedure for the determination of vessel diameter in arteriograms. A detailed description of the optimization and curve-fitting algorithms is provided along with preliminary test results comparing various approaches. The need for user demand to fuel research and development in small-scale imaging systems is also discussed.

Key Words: Image processing, Video densitometry, Carcinoma, Estrogen receptors, Sezary cell, Lymphomas, Stenosis, Arteriograms

INTRODUCTION

In both medical practice and biomedical research, a large proportion of the data produced is in the form of visual images. When a pathologist examines a stained tissue section or a radiologist inspects an x-ray, he or she performs a type of image analysis to quantify a result or make a diagnosis. In recent years, the astonishing character of the images produced by CAT scanning, chemical shift imaging, or positron emission tomography (PET) using living tissue has fired both the public imagination and scientific curiosity. However, these images, no matter how startling or revealing, still represent only a very small fraction of the visual information which must be processed

daily by the medical and biomedical research communities. Most data still consist of stained sections, x-rays, or autoradiograms which must be processed by repetitive visual examination. To what extent can the fields of computer graphics and machine vision aid in these more commonplace, but vital, visual processing tasks?

Computers, plus and minus

In very basic terms, computers excel in performing repetitive tasks which can be specified mathematically. This translates into advantages for computer processing in terms of increases in efficiency, precision, and in certain circumstances, accuracy. The increase in efficiency should be clear to anyone who has



ever used a computer. Simply consider the time involved in adding a few thousand numbers by hand vs. the same addition performed by a computer. Indeed, some 20-30 years ago, working out a problem in multivariate statistics might have been considered sufficient for a degree in statistics from a reputable university. Presently, these same problems can be solved by commercial packages running on inexpensive personal computers. In statistical terms, precision refers to the degree of agreement obtained with repetitive counts or samples while accuracy refers to how close an estimate or measurement is to the "true" value. In a large census, manual counts are very likely to contain sizable errors which can be attributed to factors such as fatigue or boredom. This is one reason behind the development of random sampling techniques. Such samples may provide better estimates of population parameters than a complete census. Computers, however, do not suffer from boredom and are thus capable of manipulating large data sets with great precision. Unfortunately, this does not always translate into increases in accuracy. The old saw, "garbage in, garbage out" is especially applicable to problems in image processing for several reasons. One potential problem is the quality of the image input device. It does no good to use a $50,000 computer system if the input device is inaccurate. The use of video cameras as image scanners for densitometry has increased greatly in recent years, but examination of the fidelity of several different models has demonstrated that not all are sufficiently accurate for this purpose (38). In such cases, the researcher is left with very fancy, and expensive, garbage. Even in those cases where it can be demonstrated that the input device is sufficiently accurate, as is the case with the Circon 9015H video camera (23, 38), there is still the problem of defining exactly what the data is that one wishes to collect and how a computerized system can best facilitate the dual processes of collection and analysis. This may seem like a simple problem, but in reality, it can be extremely difficult. Consider the computer-assisted analysis of an autoradiogram representing glucose metabolism in a section of rat brain. The usual protocol involves the researcher outlining an important neural area on the digitized autoradiogram's image with a graphics device and recording the optical density within the outline. It has been found that if the procedure is repeated by the same or different individuals attempting to delineate the same region, the outlines vary considerably from one to the next (5). This highlights an important question involving the criteria by which a

particular outline is selected. What aspects of the image allow an individual to determine that the edge of the superior colliculus, for example, is where he or she placed it and not slightly to the right or left? Apparently, the rules are not limited just to the particular image under consideration, but also involve the investigator's previous experience with similar images as well as a viewpoint as to where a particular region "should" be (e.g., as indicated in a stereotaxic atlas). Even if one could limit the criteria to those in a particular image, a practical impossibility, it is very unlikely that an investigator could objectively specify all the cues being utilized in such a way that a computerized system could duplicate the feat. This will be true irrespective of whether the data is in the form of an autoradiogram, histological preparation, x-ray, or other common medical image. The human brain and eye represent a pattern recognition system which has evolved over a half billion years and simply cannot even be approximated in most situations. How serious a limitation does this pose on the use of computer systems for processing visual data? The answer is "it depends." One clear implication is that one should not expect to purchase any system which will automatically solve a complex visual processing task. In fact, it is probably unwise to expect even simple images to be analyzed automatically. One reason was given above (i.e., it is difficult to specify all the appropriate criteria one might be using for data collection). A second reason is that purchasing ready-to-use systems implies that the criteria for analysis have been preset by the manufacturer. This forces the buyer to accept rules for image analysis determined by an unknown software engineer which probably do not agree with his or her own. If a manual override or programming ability is not provided, the user will be stuck with an uncooperative system, a frustrating experience. This does leave, however, both manual and semi-automatic imaging techniques as possible alternatives. The difference between manual and semi-automatic techniques is a little vague insofar as almost all manual packages contain automated procedures. For example, the BRAIN software package running on DUMAS (Drexel's Unix-based iMage Analysis System) allows the user to manually select thresholds to eliminate the background and these thresholds are then used to help create an automatic contour trace around an autoradiographic representation of a brain section. The package will soon incorporate an automatic registration routine, which will use second order moments to align two images. However, the basic design is a manual one, with the user specifying regions of interest on the autoradiogram of a brain


section and then directing the system to return a derived measure, such as optical density or radioligand concentration, which characterizes that region. Even so, DUMAS and similar systems represent an immense increase in efficiency and accuracy when compared to direct densitometer readings of brain autoradiograms. The increase in accuracy comes from several different factors. One is simply that an enlarged image of the brain section displayed on a monitor is easier to see, which facilitates the process of selecting areas of interest for analysis. Another factor involves computer algorithms, such as pseudocolor, which enhance the image. The latter process uses a device known as a look-up table to change different optical densities in an image to different colors for display, highlighting subtle changes in density which would normally be overlooked (Fig. 1). Both of these factors can be useful when processing images other than autoradiograms. Finally, some computer systems, such as DUMAS, allow two or more images to be superimposed on the monitor. Using these techniques, an investigator can draw an outline on a histological section where a brain region can be clearly identified and then the outline is transferred automatically to the autoradiogram and used to determine any desired derived measure (Fig. 2). Thus, if there is a change in the pattern of glucose metabolism, as well as in the amount, which is not directly linked to standard nuclear areas, both the pattern and the difference will be uncovered. Even so, the greatest potential for increased accuracy with computerized systems is most likely to occur with the advent of true semi-automatic systems. An example of a semi-automatic system is under development at the Image Processing Center, Drexel University and is known as the parcellation project. Described in somewhat more detail in the article by Shivaramakrishnan and Tretiak in this issue, the parcellation project attempts to address the issue of region identification in brain film autoradiography. As described above, the standard protocol for analysis of autoradiograms involves an investigator locating a particular brain region or nucleus, outlining the region, and the computer returning a derived measurement characterizing the part of the image within the drawn boundary. The outlining process varies considerably from investigator to investigator and even from outline to outline drawn by the same investigator (5). To address this problem, as well as increase the efficiency of the analytic process, the parcellation project will utilize a library of pattern recognition routines, using texture analysis, edge detectors, etc., a relational database, and an expert system. The basic


protocol involves the investigator drawing outlines around nuclei in a series of sections. For each outline, the system will attempt to redraw the boundary using the algorithms available in the library. When this is completed to the satisfaction of the user, the section location (position in the brain from which the section came), nuclei, and set of rules for finding and drawing the boundary will be placed in the database. Eventually, the database will contain a set of canonical sections, nuclei, and rules for finding the boundaries. Then, when the investigator wishes to analyze a new series of brain sections, he or she will prompt the system with the general section location and nuclei desired and the computer will apply the proper rules to draw the boundaries, subject to the user's approval. This approach gets around the problem of pre-existing rules for analysis, since each investigator will literally "teach" the system his or her own rules. The increase in accuracy is derived not so much from the system recognizing the "true" boundaries for nuclei, a matter of some controversy among neuroscientists, but rather from the establishment of objective computer-recognizable criteria, which can then be discussed and exchanged. If successful, a similar approach may be used for image processing in clinical pathology or other biomedical applications involving repetitive visual processing. This, however, is the future. What about the use of fairly inexpensive systems for image processing now and in fields other than film autoradiography?

The development of DUMAS, an inexpensive imaging system

DUMAS, as the reader has probably guessed by now, was developed by the Image Processing Center (IPC) at Drexel University specifically for the analysis of film autoradiograms related to neuroscience research. The funding for the project came from the Biomedical Research Technology Program (BRTP) of the Division of Research Resources, National Institutes of Health. The program was established in response to the need of biomedical researchers to have access to state-of-the-art technology in exploring basic and clinical research questions and in recognition of the fact that many individual scientists did not have either the facilities or expertise to utilize sophisticated equipment to its full potential. In many cases, the projects funded by the BRTP have resulted in regional and national centers where individual scientists and clinicians can come to investigate questions in biology and medicine. In the specific case of the IPC, the approach was somewhat different. In addition to setting up a center where investigators using film autoradiography could analyze their data, it was


Fig. 1. An autoradiogram from a coronal section of rat brain enhanced using the pseudocolor feature of DUMAS.

Fig. 10. Mapping area losses due to the opening operation applied to a Sezary cell into contrasting colors.


Fig. 2. (a) A coronal section of rat brain stained with cresyl violet with a computer-generated graphics outline surrounding a region of interest. (b) The same outline transferred to the corresponding autoradiogram.
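Superimposition and outline transfer of this kind presuppose that the two images are in register; as noted above, the BRAIN package is to gain an automatic registration routine based on second order moments. The following is a minimal sketch of that general idea in C (our illustration, not the DUMAS/BRAIN source): the centroid and principal-axis orientation of each thresholded image are computed from first and second moments, and two images can then be aligned by translating their centroids together and rotating by the difference in orientation.

#include <math.h>

typedef struct { double cx, cy, theta; } Pose;

/* Centroid and principal-axis orientation of the pixels at or above a
 * threshold, derived from the first and central second moments. */
Pose image_pose(const unsigned char *img, int w, int h, unsigned char thresh)
{
    double m00 = 0.0, m10 = 0.0, m01 = 0.0;
    double mu20 = 0.0, mu02 = 0.0, mu11 = 0.0;
    Pose p = {0.0, 0.0, 0.0};
    int x, y;

    for (y = 0; y < h; y++)                      /* zeroth and first moments */
        for (x = 0; x < w; x++)
            if (img[y * w + x] >= thresh) {
                m00 += 1.0; m10 += x; m01 += y;
            }
    if (m00 == 0.0)
        return p;                                /* empty image */
    p.cx = m10 / m00;
    p.cy = m01 / m00;

    for (y = 0; y < h; y++)                      /* central second moments */
        for (x = 0; x < w; x++)
            if (img[y * w + x] >= thresh) {
                double dx = x - p.cx, dy = y - p.cy;
                mu20 += dx * dx; mu02 += dy * dy; mu11 += dx * dy;
            }
    p.theta = 0.5 * atan2(2.0 * mu11, mu20 - mu02);  /* principal axis angle */
    return p;
}

Aligning a histological image with its autoradiogram then amounts to shifting one image by the difference in centroids and rotating it by the difference in the two theta values.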



decided that the Center personnel should develop a low-cost, quantitatively accurate imaging system based upon a personal computer which could be set up in an individual's laboratory. Several inherent constraints in the design of both the hardware and software required by this approach have led to unexpected benefits in adapting DUMAS to new medical imaging tasks. The overall design for DUMAS was that of a 'kit.' The individual researcher was expected to purchase the components and the Center would provide, for a small fee, the software and some assistance in setting up the system. The assistance needed was expected to be minimal. In order for this plan to work, there were several requirements. The hardware had to be both reasonable in price and readily available. In addition, given that the system was to be used for quantitative densitometry, the components would have to be tested for accuracy. The software would require an extremely friendly user interface insofar as investigators were expected to utilize the system with little training. The software would also have to be quite modular and flexible in design, so that changes in both user requirements and computer hardware could be incorporated without major alterations in the interface or functions. The result of several years of effort and testing was DUMAS, originally implemented on the IBM PC-XT, moved to the IBM PC-AT, and presently running on ATs, AT clones, and on machines using Intel 80386 microprocessors in approximately 50 laboratories. By their very nature, Resource Centers funded by the BRT Program generate a great deal of interaction between engineers, programmers, and researchers in the life sciences. This is especially true at the IPC, where the hardware and software design of DUMAS was greatly influenced by scientists using prototype systems at the Center as well as by interactions with those investigators who installed versions of DUMAS in their own laboratories. Aside from its role in the BRTP, the IPC is also an integral part of the Drexel University community, associated with the Department of Electrical and Computer Engineering and the Biomedical Engineering and Science Institute. As such, the Center plays host to a number of graduate students investigating questions in computer science, image processing, and biomedical engineering. Situated in Philadelphia with its large number of research-oriented medical schools, such as Temple University and the University of Pennsylvania, the IPC has facilitated collaborations between these students and physicians in the area. Much of the success of these collaborations is due to the flexible and modular design of the imaging software as well as

to the accuracy testing program applied to all DUMAS components. In effect, by designing the system for ease-of-use, programming flexibility, and accuracy in quantitative film autoradiography, the IPC opened the door for development of approaches to other questions in biomedical imaging. We began this discussion by pointing out that the majority of images in medicine and biomedical research are not as spectacular as those produced by CAT, PET, or NMR but are nonetheless vital. The rest of this manuscript will describe three new approaches to biomedical problems developed at the IPC using the DUMAS system. We are constrained to point out that while DUMAS is indeed the product of a unique environment, it is by no means the only such system available or which could be imagined. The purpose is not to describe what DUMAS can do, but rather to demonstrate what any properly designed and engineered, inexpensive computerized imaging system might accomplish. We are hoping to provide the reader with a glimpse into the future, where personal imaging systems will become commonplace in medical practice and biomedical research.

The design of DUMAS

The hardware and software of DUMAS have been described previously (24, 25) and we will not review that material beyond what is necessary to understand the modifications of the system described below. The core of this video densitometry system is an IBM PC-AT microcomputer, AT clone, or compatible using the 80386 microprocessor. Image acquisition is handled by a Circon model 9015H solid state video camera using a MOS-AID two-dimensional photosensor array with 384 horizontal by 485 vertical picture elements (pixels) (Circon Microtechnology, Santa Barbara, CA) and a ChromaPro45 Image Analysis Illumination Stand (Circle S, Tucson, AZ) as the light source. The camera is voltage stabilized using a Minicomputer Regulator (Allied Electronics, Marlow Heights, MD) while the light source uses a regulated DC power supply. Images are digitized using a Datacube IVG-128 (Datacube, Inc., Peabody, MA) image processing board with 512 by 384 pixels of image memory which can be configured into two frames of 256 x 384 pixels each. In the two frame mode, the system is capable of superimposing two different images (i.e., the histological and autoradiographic images of a single section or autoradiograms of total and nonspecific binding for quantitative receptor autoradiography). Images are digitized to 8 bits or 256 grey levels from transmission values. The IVG-128 has two input look-up tables and two banks of three output look-up tables for pseudocolor


enhancement and display. Other components of the system include an interactive graphics input device, either a Hitachi HDG-1111B graphics tablet (Hitachi Denshi America, Ltd., New York, NY) or Logitech C7 Serial Mouse (Logitech Corp., Redwood City, CA), for communicating with various software options and delineating regions of interest for quantitative analysis, a Barco CD233 color monitor (Barco Industries, Menlo Park, CA) with a 720 line horizontal by 540 line vertical resolution for display, and an IBM Proprinter for hard-copy output. The software is written in 'C' using the Xenix operating system. A general description of the approach used in designing the software is given in Kishore and Feingold, this issue. At present, the entire system plus software costs less than $13,000.00.
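As an aside on how the pseudocolor enhancement mentioned above works in practice, the following sketch builds and applies an output look-up table in C. It is a simplified, hypothetical illustration rather than the IVG-128 interface or the DUMAS source: each of the 256 grey levels is mapped once to an RGB triple, and every displayed pixel is colored by indexing the table, so small density differences can be given strongly contrasting hues.

typedef struct { unsigned char r, g, b; } Rgb;

/* Fill a 256-entry look-up table mapping grey level to color. */
void build_pseudocolor_lut(Rgb lut[256])
{
    int g;
    for (g = 0; g < 256; g++) {
        /* simple blue-to-red ramp; real maps would be chosen by the user */
        lut[g].r = (unsigned char)g;
        lut[g].g = (unsigned char)(g < 128 ? 2 * g : 2 * (255 - g));
        lut[g].b = (unsigned char)(255 - g);
    }
}

/* Applying the table in software to a frame of n grey-level pixels. */
void apply_lut(const unsigned char *grey, Rgb *out, long n, const Rgb lut[256])
{
    long i;
    for (i = 0; i < n; i++)
        out[i] = lut[grey[i]];
}

On the actual hardware the table lives in the display board's output look-up tables, so changing the color map is instantaneous and does not alter the stored grey-level data.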

IMAGE PROCESSING IN CLINICAL PATHOLOGY

Until quite recently, there has been little demand for quantitative image processing in clinical pathology. The majority of critical assessments performed by pathologists have involved qualitative measurements which could be clearly differentiated. Quantitative evaluations of microscopic objects were limited to such procedures as counting mitoses which, while useful, did not provide vital information (41). This situation is changing and a number of factors are involved. One major factor is cost: computer-based imaging systems have become much less expensive in recent years, bringing such systems within the budgets of community hospitals and individual pathologists. At the same time, improvements in electronics, computer hardware, and video cameras have improved the quality of these inexpensive systems to such an extent that reliable quantitative measurements can be obtained. The development of "friendly" user interfaces, promoted by the success of the icon-based Apple Macintosh line of personal computers, has also played a role, providing individuals without computer experience access to sophisticated analysis programs. Concurrently with this revolution in computer science has been a revolution in histochemical techniques with the advent of monoclonal antibody technology as a means of quantifying biochemical content in tissue. The situation with these antibodies is quite similar to that seen with the use of radioligands to quantify neuroreceptor content in brain. Both techniques use labeled ligands to locate specific biochemicals of interest. Both techniques began by using extraction techniques or biochemical analysis to quantify binding and both ran into the same limitation using such analyses: the need for


relatively large amounts of tissue, preventing widespread application of the techniques when limited quantities of tissue are available. The solution is also the same in both cases: computerized image processing, which can quantify with a far better resolution than most biochemical assays (7, 41). So far, the use of imaging systems in quantitative receptor autoradiography is far ahead of applications in clinical pathology, but we believe, as discussed below in the case of estrogen receptors, that the use of such systems has great potential for the clinician.

Quantitative immunohistochemistry of estrogen receptors

The estrogen receptor (ER) content of primary carcinomas of the breast is predictive of response to hormonal therapy (6, 13, 14, 29, 32). Generally, in patients with multifocally metastatic breast carcinoma, about 70% of women with ER concentrations which exceed 10 femtomoles/mg of total protein in the tissue sample assayed have a significant, beneficial response of the metastatic tumor foci to hormonal therapy. Only 10% of patients with receptor concentrations less than 10 femtomoles/mg protein respond. The predictiveness of the ER assay can be improved if the age or menopausal status of the patient and the progesterone receptor concentrations of the tumor are also known. Even then, there are significant false positive and false negative predictions. One explanation which has been offered for the lack of perfect predictability involves the heterogeneity of receptor content in different foci of the same tumor. Extracts of tumor from separate metastatic foci taken at the same time, or from the same metastasis, have significantly different ER concentrations (12, 36). These observations have been cited to support the hypothesis that the cells of the same tumor vary in estrogen receptor concentrations. Breast carcinomas, however, are known to vary widely in their cellularity, the area or volume fraction of a piece of tissue which is actually neoplastic. Tumor cellularity values range from 4% to 92% (27, 43). The majority of studies which have correlated extractable ER levels with response of the metastatic tumor to hormonal therapy have expressed receptor concentrations as "fmoles of receptor protein" per "milligram of tissue assayed." Normalization of the denominator of this expression with respect to cellularity, so that receptor levels would be expressed per "milligram of tumor cell tissue assayed," should yield values which more precisely reflect tumor cell concentrations of estrogen receptor. Furthermore, such corrections might provide for a more accurate correlation of ER content with tumor behavior.
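As a hypothetical worked example of this correction (the numbers are illustrative only, not taken from the studies cited): a specimen assayed at 8 fmol of receptor per mg of tissue falls below the 10 fmol/mg threshold, but if only 40% of that tissue is actually neoplastic, the cellularity-corrected value is 8 / 0.40 = 20 fmol per mg of tumor cell tissue, well above the threshold; that is, ER_corrected = ER_extracted / (cellularity fraction).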


Recently, an assay has become available which allows quantification of estrogen receptor levels in individual tumor cells, a value unobtainable by previous extraction assays. The assay uses a rat monoclonal antibody which reportedly binds to the same protein assayed in ER extraction assays. Furthermore, binding is unaffected by the presence of free estradiol (8, 16). The assay, termed ERICA, localizes the antigen by an immunoperoxidase method in sections of tumorous breast tissue (Fig. 3). Comparability of the antigen immunohistochemically localized by this rat monoclonal antibody with extractable estrogen receptor levels assayed by charcoal-dextran sedimentation methods has been established (15, 22, 30). Correlations have been imperfect, however, causing questions to arise concerning the validity of the ERICA assay. There are several possible explanations which might account for these results, including differences in the biochemical resolutions of the two approaches as well as problems in the method of visual quantification utilized in the ERICA procedure. Another possible explanation is that the antibody does not recognize the same protein as does the extraction assay. To evaluate these various hypotheses, a more precise quantification of the antigen in tissue sections needs to be performed. The studies mentioned above assessed tumor cellularity manually by a relatively crude method. The procedure involved observers visually estimating tumor area fraction in the sections by quartiles, a technique which results in an interobserver variability averaging 30% (42). Such a high level of variability is known to significantly affect correlative studies. Another, potentially greater, source of error was the method by which estrogen receptor concentrations in tumor cell nuclei were calculated. Visual assessments of percentages of nuclei exhibiting from "1+ to 4+" or from "0 to 4+" intensity of staining were made manually. These estimates of receptor concentration were then treated as parametric values to obtain the total receptor concentration. Reliance upon the ability of the observer to reproducibly assay optical density on a parametric scale at different times with different tumors is probably misplaced and beyond the ability of an individual, although such a source of error has not been specifically examined. The DUMAS system is adaptable to analyzing the density of the immunoperoxidase reaction product in sections of breast carcinomas. Assuming ideal conditions, the integrated optical density can be obtained for a section of tumor as follows (a sketch of the computation appears after Fig. 3):

1. The pixel distribution by grey level is determined.

2. The observer selects the grey level range considered to indicate "true positive" deposition of reaction product.

3. As the system can be initially calibrated to absolute optical densities (O.D.) with an O.D. scale, the O.D. corresponding to each grey level is easily determined.

Fig. 3. A section of an infiltrating duct carcinoma of the breast which has been immunohistochemically stained for estrogen receptors.
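The sketch below shows, in C, how steps 1-3 combine into an integrated O.D. measurement. It is a minimal, hypothetical illustration rather than the DUMAS source: a calibration table maps each of the 256 grey levels to an absolute O.D. value, and every pixel whose grey level falls inside the observer-selected "true positive" range contributes its calibrated O.D. to the sum.

#define GREY_LEVELS 256

/* Integrated optical density of the "true positive" pixels in one frame.
 * od_table[] holds the calibrated O.D. for each grey level; low and high
 * bound the observer-selected grey level range. */
double integrated_od(const unsigned char *pixels, long n_pixels,
                     const double od_table[GREY_LEVELS], int low, int high)
{
    long hist[GREY_LEVELS] = {0};
    double sum = 0.0;
    long i;
    int g;

    for (i = 0; i < n_pixels; i++)       /* step 1: grey level distribution */
        hist[pixels[i]]++;

    for (g = low; g <= high; g++)        /* steps 2-3: sum calibrated O.D. over the selected range */
        sum += hist[g] * od_table[g];

    return sum;
}

The per-pixel values come from the calibration step, so the result is expressed in O.D. units rather than raw grey levels.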


The O.D. value of each pixel is logarithmically related to antigen concentration at that pixel; hence, the integrated O.D. reflects the total antigen content of the tissue in the image frame. The validity of this methodology can be assessed by comparing the relative estrogen receptor content, obtained using DUMAS, with the extractable estrogen receptor content from the same tumor, corrected for cellularity. Preliminary studies indicate that quantitative immunohistochemical assay of ER content correlates better with corrected extractable ER levels than do values of ER content obtained by "semi-quantitatively" visually estimating ER content (40). The predictive validity of this approach still needs to be established. Sources of variation which might be expected to reduce the correlation between the two estimates of ER include:

Variation in the biochemical assay.

Variation in the ERICA assay.

An actual difference in ER concentration of the tissues assayed by the two methods. As each assay destroys the sample for the other assay, identicality can only be approximated, by using frozen sections from the same block of tissue from which ER is extracted.

Variation in the DUMAS system. The analytic accuracy of DUMAS has been calculated by evaluating contributions to the observable error from photometric nonuniformity, noise, flare, transfer function, and long-term temporal instability (drift). All the sources of photometric error can be combined into an average analytic error in densitometric measurement of approximately 0.01 optical density units in a range from 0 to 2.0 optical density units, or less than 1% of the range of measured values (23). Thus, this error is unlikely to be a significant factor.

Studies in progress will assess the variability in the ERICA assay and the degree to which it contributes to an imperfect correlation with an extraction assay for ER. The following are potential sources of error:

1. The estimate of the grey level range of "true positive" immunoperoxidase reaction product depends upon the observer. The observer pseudocolors the digitized image displayed on the monitor and selects the grey level range by comparing trial ranges with the stained section microscopically viewed. Although there is probably no way to


automatically assess "true" reaction product, the variability introduced by selecting a marginally inaccurate grey level range can be determined. Thus, the effect of selecting a grey level range which is greater than or less than the selected grey level range on the ERICA assay can be assessed.

2. Variation in antigen concentration due to section thickness. Repeat assays on serial frozen sections will provide an indirect measure of variability arising from section thickness, as well as from the technique. There is no simple method to determine frozen section thickness in areas of pixel dimensions.

3. Variation due to nonideal immunologic conditions. Ideally, the antibody reagents are used at saturating conditions. Three immune reactions are involved in the ERICA assay: (a) binding of the monoclonal rat anti-ER to ER in the tissue section; (b) binding of the secondary, affinity-purified guinea pig anti-rat immunoglobulin to the primary rat antibody; and (c) binding of the enzyme complex of rat anti-peroxidase-peroxidase to the secondary antibody. The concentrations of these three sets of reagents will be varied. A plot of optical density vs. reagent concentration will be obtained, using serial sections. The concentration at which the plot deviates from a linear relationship is the optimal concentration; greater concentrations, which produce exponentially greater optical density, reflect nonimmunologic binding (9). According to the supplier of the ERICA assay, Abbott Laboratories, through the product developer Dr. G. Ring, the reagents are supplied at saturating concentrations. Additional reaction conditions are (d) the concentration of diaminobenzidine (DAB), (e) the concentration of hydrogen peroxide, and (f) the incubation time in the DAB/hydrogen peroxide solution. These values will also be varied, and the value at which the rate of change of O.D. changes from linear to exponential will be taken as the optimal value. That there is precedent for quantitating an immunohistochemical reaction product (1, 9) provides optimism that variability values will not be great.

Image processing application for diagnosis of lymphomas

Not all applications of quantitative image processing involve the use of monoclonal antibodies or mimic biochemical assays. In some instances, computer-assisted image analysis can be utilized to quantify changes in the morphology of cells in an attempt to predict the transformation of normal cells into neoplastic phenotypes. Semi-automated techniques


to determine DNA content and chromatin dispersion by means of optical density measurements have been used to distinguish the nuclei of cancerous cells from those of normal cells (46). Other studies have employed morphometric analysis of nuclear areas and contours in order to provide additional criteria toward a definitive diagnosis. Malignant cells and their nuclei generally exhibit a lack of uniformity. Irregularity in size and shape is most often seen, accompanied by a change in the condensed interphase chromatin (heterochromatin) pattern. Thought to be bound to the inner nuclear membrane by a thin filamentous layer known as the nuclear lamina, the uneven chromatin condensation produces an uneven nuclear border. In addition, the nuclei are usually enlarged, with an increase in the nucleocytoplasmic ratio which may vary considerably from cell to cell. The spatial relationship between the nucleus and cytoplasm may also change, thus altering the position of the nucleus. Cancers of the lymphoid system have been extensively investigated and are termed lymphomas. Unregulated cell proliferation may give rise to neoplasms which eventually metastasize to other sites in the body. Lymphomas are classified as Hodgkin's disease and non-Hodgkin's lymphomas and usually arise within the lymph nodes or lymphoid tissue of the spleen or alimentary tract. A definitive diagnosis of the cutaneous T-cell lymphomas such as Sezary syndrome (SS) and mycosis fungoides is often delayed due to unremarkable early histopathology. Diagnosis may be further complicated by the fact that SS may be preceded by atopic or contact dermatitis (26). Initial manifestations of Sezary syndrome include the appearance of isolated cutaneous lesions, followed by localized erythroderma and plaque formation. First described in 1938 by Sezary and Bouvrain (34), neoplastic lymphocytes termed Sezary cells are also found in the peripheral circulation (48). Sezary cells contain distinctively cerebriform, heterochromatic nuclei with lobulations connected by narrow nuclear bridges. The cytoplasmic organelles most commonly identified are mitochondria, ribosomes, some endoplasmic reticulum and fine microfilaments (19). Figure 4 displays both a normal and a Sezary cell lymphocyte. The interval from the onset of symptoms to an unequivocal diagnosis is approximately 5 years, with a median survival after diagnosis of 8 years (2). A study was initiated, therefore, in an attempt to confirm established diagnoses on the basis of morphometry, statistics and/or binary image filtering methods. Circulating lymphocytes of 11 ill patients were analyzed and compared with those of two control patients, one of whom was diagnosed as having atopic dermatitis.

It is anticipated that successful use of these techniques will eventually provide the basis for an earlier diagnosis of patients with suspected but unconfirmed cases of the syndrome. Written in the C programming language, the software for morphometry analysis and binary image filtering was developed on DUMAS under the Xenix operating system. Using the basic utilities and functions developed for Drexel's autoradiography package, modules from the BRAIN software were modified and combined with 16 new modules to create the morphometry application package. The cellular images were obtained from electron micrographs (X5000) provided by Drs. Eric Vonderheid and Thomas Griffin at the Skin and Cancer Hospital of Temple University in Philadelphia. The following parameters were measured or calculated using the software described below.

Measured Parameters
1. cellular area
2. cellular perimeter
3. nuclear area, total
   a. heterochromatic area
   b. euchromatic area
4. nuclear perimeter

Calculated Parameters
1. nuclear area/cellular area
2. nuclear perimeter/(area)^1/2 (nuclear contour index or NCI)
3. heterochromatic area/euchromatic area

The package has been designed specifically for use with contiguous images which must be isolated before analysis can proceed. The flexibility of the DUMAS system permits image modification via contrast enhancement and picture editing. While a copy of the unedited image is stored on the computer’s hard disk, the transformed image is maintained in image memory as a linear array (picture buffer) and modified as the need arises. In contrast enhancement, the range of pixel values within the image is linearly extended to 230 gray levels. Since the cell is superimposed upon a lighter background, contrast enhancement not only emphasizes the difference between dark and light nuclear areas but also places the cell against a whiter background (Fig. 5). As a result, less picture editing is required to isolate the cell from the background, and the automatic contour trace is less likely to deviate from the object of interest. Picture editing is carried out by using the graphics cursor as a brush of varying color and width; desired gray tones are substituted for unsuitable values to facilitate analysis of the desired area.
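A minimal sketch of the linear contrast enhancement just described is given below in C. It is our illustration, not the DUMAS code, and it assumes 8-bit pixels: the occupied grey range of the image is found and remapped linearly onto a wider range (here 230 levels, following the text).

/* Linearly stretch the occupied grey range of an 8-bit image onto
 * 0..out_levels-1 (e.g. out_levels = 230). */
void contrast_stretch(unsigned char *img, long n, int out_levels)
{
    unsigned char lo = 255, hi = 0;
    long i;

    for (i = 0; i < n; i++) {            /* find the occupied grey range */
        if (img[i] < lo) lo = img[i];
        if (img[i] > hi) hi = img[i];
    }
    if (hi == lo)
        return;                          /* flat image: nothing to stretch */

    for (i = 0; i < n; i++)              /* remap linearly */
        img[i] = (unsigned char)(((long)(img[i] - lo) * (out_levels - 1)) / (hi - lo));
}

Because the cell sits on a lighter background, the stretch both darkens the heterochromatic regions and pushes the background toward white, which is why less manual picture editing is then needed.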

Fig. 4. (a) A normal lymphocyte. (b) A Sezary cell.

Preliminary analysis begins by focusing the camera upon a cell illuminated by the underlying light source of the ChromaPro system. Using a module adapted from the BRAIN package, pixel calibration is carried out to determine the dimensions of the rectangular picture elements. Originally determined in pixels, all perimeter and area measurements are therefore converted to metric units. Thresholding

achieves a desired range of gray tone levels by eliminating values at either end of the gray scale which are not included in the object to be analyzed. The procedure therefore selects an image delimited by a continuous boundary. Any difficulties posed by the uneven nuclear border are corrected by moving the cursor over portions of the boundary while in the editing mode. Lighter sections of the perimeter are therefore

replaced with a darker line one pixel in width. Thresholding is followed by contour tracing, which proceeds from right to left across the image until a point, the reference pixel, is located whose gray level value lies within the established threshold range.

Fig. 5. The effect of contrast enhancement. (a) Before enhancement. (b) After enhancement.

The coordinates of the two pixels to the left and above or below the reference pixel are then determined. Their gray level values determine another point of the contour trace, the new search direction, and hence the new reference pixel (Fig. 6).

Fig. 6. Contour trace around the nucleus of a Sezary cell.

The x and y values of the contour trace are stored in two one-dimensional arrays for area computation. As each point of the contour trace is determined, the value is compared with the previous contour point and the direction determined. If the direction of the new point relative to the old point is up or down (left or right), one vertical (horizontal) unit is added to the perimeter variable. If the direction is diagonal, the pixel diagonal dimension is added to the perimeter variable. Two different area algorithms are employed, the first of which determines the nuclear area dynamically after the perimeter variable is incremented. The second area calculation (area by threshold) is commenced after the completed contour trace is shown on the monitor for confirmation by the analyst. Before obtaining the cellular or heterochromatic area by threshold, the two arrays containing the x and y values of the contour points are sorted in ascending order. The entire range of y values for each x value is accessed, and the gray values of the points (x, y) are compared to the low and high threshold values. If a gray value falls within the threshold limits, the area variable is incremented by one. If the area is nuclear, thresholding is used to select the heterochromatic or dark area, and the algorithm additionally sets the corresponding point in the graphics bit plane. After all points enclosed within the contour are checked, the dark area is displayed in color on the monitor.
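The perimeter accumulation and the nuclear contour index can be sketched in C roughly as follows. This is a hypothetical illustration rather than the DUMAS morphometry source: the contour is assumed to be stored as arrays of x and y pixel coordinates, and px and py are the calibrated pixel dimensions so that results come out in metric units.

#include <math.h>
#include <stdlib.h>

/* Accumulate perimeter along the contour: one pixel edge for
 * horizontal/vertical steps, the pixel diagonal for diagonal steps. */
double contour_perimeter(const int *cx, const int *cy, int n,
                         double px, double py)
{
    double perim = 0.0, diag = sqrt(px * px + py * py);
    int i;
    for (i = 1; i < n; i++) {
        int dx = abs(cx[i] - cx[i - 1]);
        int dy = abs(cy[i] - cy[i - 1]);
        if (dx && dy)
            perim += diag;              /* diagonal move */
        else
            perim += dx ? px : py;      /* horizontal or vertical move */
    }
    return perim;
}

/* Nuclear contour index: perimeter divided by the square root of area. */
double nuclear_contour_index(double perimeter, double area)
{
    return perimeter / sqrt(area);
}

The sketch measures between successive contour points only; closing the trace back to its starting point would add one final step.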

Using both algorithms simultaneously for the nuclear parameters saves considerable time, since subtraction of the dark area from the heuristically determined nuclear area yields the euchromatic area. After the cells were processed, the data were analyzed by using nonparametric or distribution-free statistical methods. Preliminary results indicate that the NCI and two other nuclear variables distinguished between the pooled Sezary and the control patient data. This suggests that the transformation of a normal lymphocyte to a Sezary cell may be accompanied by changes in nuclear rather than cellular morphology. The NCI and four other variables also were found to be different for two groups of Sezary patients with nuclei at two different chromosome levels. In addition, the control patients were separated on the basis of the NCI variable, confirming previous reports that Sezary-like cells have been identified in the blood of patients diagnosed with benign inflammatory disease such as dermatitis. Although employed by numerous investigators to characterize the shape of Sezary cell nuclei (44, 45, 47), the NCI reflects the boundary complexity but fails to distinguish between objects which are not circular. We felt that binary image processing techniques might be of value in distinguishing different nuclear shapes. Derived in part from the Minkowski addition and subtraction operations, these morphological techniques or filters simplify images by elimi-


Fig. 7. The nucleus of a Sezary cell as set X superimposed upon a lighter background, the complement of X.

Fig. 8. Translation of the structuring element B, where B = {(0,0), (0,1), ...}.

nating extraneous detail while preserving essential shape characteristics (10, 35). Image operators may be applied to a black object or nucleus, represented as a set X superimposed upon a lighter background, the complement of X (Fig. 7). Briefly stated, dual set operations known as erosions and dilations use a structuring element B to modify the object boundary to more nearly resemble the boundary of the element. B may be visualized graphically as a collection of points defining a simple shape approximating a circle which is systematically translated to each picture element of X. If B at point (xi, yj) of the image is contained entirely within X, the point is included in the eroded image (Fig. 8). Since DUMAS permits simultaneous storage of several images, an eroded image may be stored in a second picture buffer and displayed on the monitor. Dilation is the morphological dual of erosion, since eroding an object is equivalent to dilating the background (35). With dilation, the object is expanded at all boundary points by the element B. Cascaded erosion and dilation operations describe an important morphological filter known as opening (Fig. 9) (21). Opening the cerebriform nuclei of Sezary cells smooths the nuclear contours internally, breaks the narrow isthmuses and eliminates the characteristic peaks. In the opening operation, the original object is eroded i times (i = 1, 2, 3, ...), losing area each time, and the ith eroded object is dilated or expanded i times. The object vanishes with the nth or final erosion, where the magnitude of n is usually related to the size of the object. The normalized area loss per cycle i is then plotted automatically in histogram fashion by using combinations of basic utilities available in the BRAIN package. If the area losses stored in another picture buffer as different gray scale values are displayed on the monitor, the pseudocolor capabilities of the system may be utilized by mapping the gray levels into contrasting colors (Fig. 10). The binary histograms generated by the opening process are characteristic for different shapes and range from a single peak for a circular object to a series of peaks of different magnitudes for more complicated shapes (Fig. 11). The binary histograms for two nuclei with the same NCI value were different, indicating that the opening operation was a more sensitive shape descriptor than the nuclear contour index (Fig. 12). Moreover, the histograms of nuclei exhibiting hypotetraploidy sometimes displayed two prominent peak regions. Certain classes of nuclei may therefore exhibit unique histogram features regardless of the orientation in which the nuclear image is processed.
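The opening operation and the area measurements behind the binary histograms can be sketched in C as follows. This is a simplified, hypothetical illustration rather than the DUMAS implementation: the object is stored as 0/1 bytes, and the structuring element is fixed as a 3x3 square for brevity (the element described in the text approximates a circle). Successive differences of the areas returned for i = 1, 2, 3, ..., divided by the original area, give the normalized area-loss histogram.

#include <stdlib.h>
#include <string.h>

/* Erosion: a pixel survives only if the 3x3 element centered on it lies
 * entirely within the object. */
static void erode(const unsigned char *in, unsigned char *out, int w, int h)
{
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            int keep = 1;
            for (int dy = -1; dy <= 1 && keep; dy++)
                for (int dx = -1; dx <= 1 && keep; dx++) {
                    int xx = x + dx, yy = y + dy;
                    if (xx < 0 || yy < 0 || xx >= w || yy >= h || !in[yy * w + xx])
                        keep = 0;
                }
            out[y * w + x] = (unsigned char)keep;
        }
}

/* Dilation, the dual of erosion: a pixel is set if any neighbor under the
 * element belongs to the object. */
static void dilate(const unsigned char *in, unsigned char *out, int w, int h)
{
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            int hit = 0;
            for (int dy = -1; dy <= 1 && !hit; dy++)
                for (int dx = -1; dx <= 1 && !hit; dx++) {
                    int xx = x + dx, yy = y + dy;
                    if (xx >= 0 && yy >= 0 && xx < w && yy < h && in[yy * w + xx])
                        hit = 1;
                }
            out[y * w + x] = (unsigned char)hit;
        }
}

static long area(const unsigned char *img, int w, int h)
{
    long a = 0;
    for (long i = 0; i < (long)w * h; i++)
        a += img[i];
    return a;
}

/* Area remaining after the ith opening cycle: erode i times, dilate i times. */
long opened_area(const unsigned char *obj, int w, int h, int i)
{
    unsigned char *a = malloc((size_t)w * h), *b = malloc((size_t)w * h);
    long result;
    memcpy(a, obj, (size_t)w * h);
    for (int k = 0; k < i; k++) { erode(a, b, w, h);  memcpy(a, b, (size_t)w * h); }
    for (int k = 0; k < i; k++) { dilate(a, b, w, h); memcpy(a, b, (size_t)w * h); }
    result = area(a, w, h);
    free(a); free(b);
    return result;
}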

Fig. 9. Opening and closing an image by element B (modified from Maragos (21)). Panels show the original image, the opened image, and the closed image.

In addition, a relationship between nuclear shape and function based on histogram appearance may be detected in future investigations.

IMAGE PROCESSING IN RADIOLOGY

The impact of sophisticated and large-scale image processing, such as utilized in NMR and PET

Fig. 11. Distributions f(dX) for several shapes (modified from Serra (35)).

scanning, on health care and biomedical research is discussed elsewhere in this issue and therefore does not require extensive review in this manuscript. In addition, elegant computer-assisted image processing techniques involving laser scanning of x-rays and three-dimensional reconstruction and rotation of digitized images are destined to have an ever increasing influence on the diagnosis and treatment of disease. This influence, as stated in the Introduction, should not blind the medical and biomedical research communities to the possibilities inherent in more inexpensive imaging systems. To demonstrate the potential impact of such systems, a description of a new technique for measuring arterial diameters using DUMAS is presented.

A computer-assisted method for measuring arterial diameters

Considerable progress has been made in recent years in the treatment of cardiovascular disorders. Much of the experimental and clinical research concerned with heart disease and related maladies caused by atherosclerotic stenosis has focused upon three

major methodologies: (a) pharmacologic drugs; (b) dietary regimes; and (c) clinical interventions (i.e., catheterization) to reduce the arterial blockage. All three of these treatment methods have been integrated into normal procedures to relieve a patient of complications due to atherosclerosis. But first, in order to diagnose the severity of the affected area and eventually evaluate the efficacy of any given treatment regimen, an arteriogram is taken of the affected arteries and their surrounding structures. An arteriogram is an x-ray of the affected artery or arteries taken after a contrast agent has been injected into the vessels via a catheter (Fig. 13). From the arteriograms, an accurate quantitative evaluation of the arterial stenosis is expected. Successful follow-up evaluations to a treatment plan are based largely on the reduction in size of the stenotic lesion and the resumption of normal blood flow. Unfortunately, significant alterations in the amount of constriction can be hidden by intrinsic errors associated with the quantitative analysis method used, especially when trying to measure a stenosis in relatively small (<1.0 mm) arteries.


Fig. 12. Binary histograms for two lymphocytes with the same NCI value: cell #19 (NCI = 5.42) and control cell #21 (NCI = 5.40); 4-neighborhood.

A necessary corollary to correct evaluation of the treatment regimen, therefore, is achieving accurate and objective interpretations of the vessel diameters. In order to design a method to achieve such interpretations, it is necessary to understand the inherent errors associated with the present visual analysis of arteriograms.

One major dilemma facing a radiologist is that vessel images as displayed on arteriograms have edge gradients rather than sharply demarcated boundaries. In other words, the edges are fuzzy and it becomes a matter of subjective choice as to where the correct borders actually lie. As the true diameter of the artery decreases, the proportion of the image representing the boundary becomes larger, and the error associated with subjective demarcation of the edge increases. In addition, differences associated with varying levels of contrast available in different x-rays


Fig. 13. An example of an arteriogram.

can lead to considerable dissimilarity in the magnitude of this subjective error, further reducing the precision of repeated measurements. Finally, visual inspection can be quite tedious and naturally very time consuming, leading to additional errors associated with boredom and fatigue. Significant intra- and interobserver variability in diagnosing stenoses using manual measurements has been reported (4, 49), apparently caused by the factors described above. In addition, the magnitude of the errors associated with visual assessments was partially dependent on the technical quality of the film and the level of the radiologist's recent experience in reading arteriograms. It becomes of prime importance in evaluating treatment plans to reduce this variability to a minimum. In the Introduction, we pointed out that computerized systems can lead to significant increases in efficiency and precision. Accuracy can also be improved under certain circumstances if systems can be developed which utilize a more objective analysis of the data. From the description given above of the problems associated with visual inspection of arteriograms, we concluded that this task was a prime can-


didate for the application of computerized image processing and video densitometry. Naturally, we were not alone in coming to this conclusion. Several researchers have attempted to apply computerized imaging techniques to this problem. Brown and co-workers (3) stressed the importance of developing an objective method for quantitative coronary arteriogram analysis. They proposed a semi-automatic system using computer-aided tracing of stenosed vessels to improve analytic accuracy. Some of the more fundamental aspects in image measurement, like magnification and pincushion distortion, were mathematically modeled and correction factors implemented to improve accuracy. Three factors limited the clinical utility of the system: (a) the time needed for analysis was too great, taking as long as 22 minutes to obtain results for a single arteriogram; (b) the system only produced accurate results with high quality films; and (c) the method was only effective for an average vessel diameter of 3.0 millimeters or greater. Other investigators (31, 33, 37) focused on the use of scanning densitometry to analyze arteriographic projections. Specifically, various curve fitting


algorithms were developed to determine the arterial boundary that was obscured in the edge gradient. These methods were more objective yet even the best algorithms were plagued by systematic errors, including an inability to effectively deal with poor film quality. Film quality is undermined by variables such as: (a) contrast medium, (b) resolution, (c) energy levels during x-ray exposure and (d) superimposition of other anatomic structures. These and other similar factors play a role during both the creation of the x-ray image and its subsequent digitization and must be considered in any analysis scheme. An early attempt to quantitate coronary stenoses by densitometric measurement was reported by Sandor and co-workers (33). A typical densitometric profile across a vessel results in a bell-shaped curve with optical density or gray values on the ordinate and its location on the abscissa combined with various forms of interference and noise (Fig. 14). Sandor’s group smoothed the image profile using a low pass filter and then utilized an interactive operator technique to pick off the boundary points. After integrating the area between these selected points, the percent stenosis was calculated by taking the difference between a normal and a stenosed profile line. Again the accuracy and reproducibility of this method were affected by operator related errors. In

Fig. 14. A densitometric profile across an arteriogram of an artery injected with a contrast agent.

addition, this method suffered from an inability to select the profile with the narrowest lumen, hence maximum stenosis could not be determined. A more automated method seemed an appropriate goal to pursue in order to reduce these errors. Further research by Spears and collaborators (37), which was a continuation of Sandor's work, did produce a more sophisticated edge detection scheme utilizing the densitometric profile to measure vessel diameter. The ability of two different computer algorithms to calculate the coronary artery diameter from a single plane view was evaluated. The first algorithm automatically fit the best seventh order polynomial curve to the densitometric scan across the blood vessel image. From the curve, the base point and inflection points were identified and the distance between a corresponding pair of boundary points was taken as the diameter. According to Spears' results, the base point measurements were consistently larger and the inflection point measurements consistently smaller than the actual anatomic diameter. The second algorithm was based on finding the maximum slope inside a variable window (i.e., number of pixels) as it moved across the densitometric profile. The accuracy of this approach was significantly dependent on the geometry of the edge gradient and very sensitive to the exact shape of the


For instance, if there were two local maxima of the slope, it was not clear which one should be taken as the boundary point. Again, good radiographic technique had to be strictly followed in order to keep the fuzziness of the boundaries to a minimum. In all fairness, the accuracy of any approach is highly dependent on the contrast of the image's edge gradient. One can perform all the quantitative analysis imaginable, but if the radiographic process does not produce acceptable arteriograms in the first place, then the accuracy achievable by computerized image analysis will be inherently limited. An interesting algorithm was reported by Reiber and co-workers (31), which essentially performed an automatic contour detection of the arterial boundaries. The procedure required an operator to initially select various points corresponding to the centerline of the vessel. Subsequently, densitometric scanlines were generated perpendicular to the best-fit centerline curve through the selected points. A weighted sum of the first and second derivatives was computed for each point along the scanline. From this function, the right and left boundary points of the vessel were obtained. Finally, a two-step iterative contour detection was used to minimize the influence of the initially defined centerline. The theory behind using a weighted sum of the first and second derivatives is well documented by these investigators (31).

Fig. 15. Manual selection of vessel borders using the point/pick method.


Positions defined by the maximal response of the first derivative criterion lie within the projected arteries. In addition, the maximal response of the second derivative function resulted in detected positions outside of the arterial lumen. This phenomenon was caused by the limited frequency response of the imaging chain from x-ray to video. The protocol included the averaging of eight sequential video frames to improve the signal-to-noise ratio during digitization. Pincushion distortion was corrected by the use of a 1 cm rectangular grid; this grid was superimposed onto the arteriograph during the radiographic process and the correction factor determined. We did not concern ourselves with pincushion distortion in our study because image intensification was not a factor with our arteriograms. One important element of the process was the total analysis time: a single obstruction took ten minutes for a complete analysis, which is excessive given today's computer technologies. Using the DUMAS system, we developed both a manual and a semi-automatic method to perform diameter measurements on digitized arteriograms. The manual technique interactively allowed the operator to select points corresponding to the vessel borders; the computer then calculated the distance between the two points (Fig. 15). The semi-automatic technique is based on fitting the best parametric model to the curve produced by the densitometric (gray value) profile perpendicular to a contrasted artery.


The densitometric profile model across the vessel and the curve-fitting optimization technique are designed to be very robust, so that the parameters will closely fit the observed gray value information despite the various artifacts normally present on a digitized arteriogram. The application programs for diameter measurements were written using the same operating system and many of the same functions as found in the BRAIN package. The user interface was also quite similar, requiring little training for the user. As with BRAIN, main control of the application is through a graphic input device (i.e., mouse or graphics tablet) interacting with a pull-down menu system on the display monitor. Once the arteriogram has been digitized into processor memory, measurements can be made on any vessel regardless of its angle as displayed on the monitor. While much of the work is discussed below, a more detailed description of the assumptions, models and algorithms which were developed can be found in Knecht (18). If we consider an incident monoenergetic x-ray beam of intensity I_in passing through a body, it will encounter a variety of materials. Some of the x-ray photons will be absorbed by this material, some will be scattered, and the rest will interact with the x-ray film to produce an image of the internal structures. The attenuation is therefore governed by the incident beam, the distance traversed, and the material (20). The resulting equation is:

I_out = I_in exp{-μz}    (1)

where μ is the attenuation coefficient of the material and z is the thickness of body material traversed. Assigning the z-axis to be perpendicular to the x-ray source and film planes, we have an imaging chain as presented in Fig. 16. The x-ray beams emitted from the source diverge at various angles and contribute to the magnification of the body structures. In Fig. 16, the magnification factor can be calculated as (b + a)/a. The amount of magnification can be adjusted by changing these distances and was checked to confirm the magnification calibration obtained in image analysis. During x-ray transmission there are three major types of material the x-ray beam encounters along its path: bone, soft tissue, and the contrast-enhanced blood in the arteries. The above equation can be expanded to include these materials, eliminating the spatial variation in I_in(x, y) by assigning it the constant value I_in:

I_out(x, y) = I_in exp{-[μ_v t_v(x, y) + μ_b t_b(x, y) + μ_s t_s(x, y)]}    (2)

where μ_v, μ_b, and μ_s are the attenuation coefficients for contrast-enhanced vessels, bone, and soft tissue, respectively. We assume each attenuation coefficient is constant for its respective material. The thicknesses of the vessel lumen, bone, and soft tissue at a specific point (x, y) are represented by t_v, t_b, and t_s, respectively. We are able to obtain good contrast between the vessels and the other components of the x-ray image because the attenuation coefficient of the contrast medium in the blood is higher than that of the other materials, and because the thickness of the bone and soft tissue varies more slowly from one point to the next. In Eq. 3 we collect the bone term (μ_b t_b(x, y)) and the soft tissue term (μ_s t_s(x, y)) into a general background term B, which produces the equation:

I_out(x, y) = I_in exp{-[μ_v t_v(x, y) + B]}.    (3)

Taking the natural logarithm of both sides of the equation and assigning a new variable D(x, y) to the result gives us an expression for the projected image. The projected image D(x, y), or film density, is related to the transmitted x-ray intensity through a logarithmic conversion factor:

D(x, y) = k ln I_out(x, y) = -k μ_v t_v(x, y) + B1    (4)

where the constant B1 absorbs k ln I_in and the background term.

When an arteriogram is digitized, the digitizer produces a number, or gray value, proportional to the transmission of the film, so that

P(x, y) = kT(x, y)    (5)

where P(x, y) is the array of digitized values and T(x, y) is the transmission of the film. The projected image P(x, y) obtained from digital scanning does not undergo a logarithmic conversion as in Eq. 4; the image is therefore directly related to the transmission of the film and produces the densitometric profile shown in Fig. 14. Referring to Fig. 16, and recalling that we are interested in modelling the densitometric profile data in the horizontal plane (the x-z plane), we can drop the y term from the above equations to make them functions of x only, so that the thickness term t_v(x, y) becomes simply t_v(x). Combining Eqs. 4 and 5, the end result is a projected image P(x) along the x-axis in the image plane of the form

P(x) = k μ_v t_v(x) + B1    (6)

where t_v(x) is the thickness of the vessel. Next we have to develop functions to represent the thicknesses of each component of the projected image P(x).
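To make these relations concrete, the short sketch below simulates a single scan line through a contrast-filled vessel of circular cross-section on a uniform background, following Eqs. 2-6 as written above. Every numerical value (attenuation coefficients, vessel radius, background thickness, conversion constants) is assumed purely for illustration and is not taken from our data.

```python
import numpy as np

# Illustrative simulation of Eqs. 2-6; all parameter values are assumptions.
mu_v, mu_bkg = 2.0, 0.2        # attenuation coefficients (1/cm): contrast vessel, background
r, t_bkg = 0.15, 10.0          # vessel radius (cm) and background thickness (cm)
I_in, k, B1 = 1.0, 40.0, 20.0  # incident intensity, conversion constant, background gray level

x = np.linspace(-0.5, 0.5, 101)                       # position (cm) along the scan line
t_v = 2.0 * np.sqrt(np.clip(r**2 - x**2, 0.0, None))  # projected lumen thickness (cm)

B = mu_bkg * t_bkg                                    # collected background term (Eq. 3)
I_out = I_in * np.exp(-(mu_v * t_v + B))              # transmitted intensity (Eq. 3)
D = np.log(I_out)                                     # log-converted, density-like image (Eq. 4, k = 1)
P = k * mu_v * t_v + B1                               # digitized gray value, linear in t_v (Eq. 6)

# P(x) traces the bell-shaped curve of Fig. 14: it peaks over the vessel centre,
# where the projected lumen thickness is greatest, and falls to B1 outside it.
# (D is computed only for comparison with the film-density formulation.)
print(round(float(P[50]), 1), round(float(P[0]), 1))  # centre of the vessel vs. edge of the window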


Fig. 16. Pathway from x-ray to digital image.

An excellent model, based on an elliptical equation, for estimating the diameter of coronary arteries on arteriograms has been developed (17, 28). Insofar as our particular application was primarily concerned with measurements of iliac arteries, we were able to make several simplifying modifications. The point spread function was eliminated, because iliac arteriograms do not contain nearly the amount of blur found in coronary arteriograms. In addition, the background of our data was sufficiently uniform that a function to fit a varying background was not required. A typical densitometric profile across a vessel is diagrammed in Figs. 17 and 18, with the pixels comprising the vessel's spatial location plotted on the horizontal (x) axis and the corresponding gray value for each pixel on the vertical (y) axis. We developed a mathematical model to reproduce this curve, stepping from pixel to pixel along a segment perpendicular to the centerline of the vessel. Our model is comprised of two major components: the projected image of the contrasted vessel and the background. The term u(x) models the projected image of the contrasted artery in the overall densitometric profile model P(x) and is calculated using the equation below:

u(x) = β √(r² - (x - x_c)²)  if |x - x_c| ≤ r,  and u(x) = 0 otherwise,

where β is a scaling factor based on the contrast of the vessel, r is the radius of the vessel, and x_c is the center of the vessel.

Fig. 17. Densitometric profile and its model components.

We added a constant background term, B, to the densitometric profile model, and the overall equation is summarized as follows:

P(x) = B + u(x).
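As a concrete illustration, a minimal sketch of this four-parameter model is given below. The function name and the example values are our own, chosen only to show the functional form; they are not taken from the study.

```python
import numpy as np

# A minimal sketch of the densitometric profile model P(x) = B + u(x), where
# u(x) = beta * sqrt(r**2 - (x - xc)**2) inside the vessel and 0 outside it.
# The function name and the test values are assumptions for illustration only.
def profile_model(x, B, r, xc, beta):
    """Model gray value at pixel position(s) x, given background level B,
    vessel radius r (pixels), vessel centre xc (pixels) and contrast scale beta."""
    x = np.asarray(x, dtype=float)
    inside = np.clip(r**2 - (x - xc)**2, 0.0, None)   # zero outside |x - xc| <= r
    return B + beta * np.sqrt(inside)

# Evaluate the model across a hypothetical 50-pixel window.
x = np.arange(50)
print(profile_model(x, B=20.0, r=8.0, xc=25.0, beta=5.0)[20:31].round(1))
```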

This model contains a total of four parameters that need to be estimated by an established optimization technique. We feel that this model is more efficient than those discussed throughout this section because, given (1) the type of x-ray examined (i.e., iliac and femoral arteries, with a fairly constant background) and (2) the robustness of the curve-fitting technique, this simpler model can generate adequate results in less time. Before the parameters r, B, x_c, and β in the model P(x) can be optimized, initial estimates must be made to seed the optimization algorithm. The convergence time depends on how close the initial estimates are to the actual values. Artifacts such as noise and background structures may affect the estimation process, but by creating a robust estimating algorithm it is hoped that the effects of these artifacts will be minimized. One requirement for achieving maximum performance of the estimation process is to choose a window extending outside the perceived arterial boundaries so as to include background information.

Since the analysis process allows the user to select either a single densitometric profile or a window containing multiple densitometric scanlines, the user should be able to fulfill this requirement. Furthermore, we assume that the profile data run horizontally across the artery centerline, or can be modified to approximate this condition (18). Given these assumptions, the estimating procedure is ready to use any densitometric profile given to it. The estimation algorithm is as follows:
1. Let [x_i, x_f] be the initial and final x coordinates selected on the densitometric array G(x).
2. Find the average threshold value B by averaging {G(x_i), G(x_i + 1), G(x_i + 2), G(x_f - 2), G(x_f - 1), G(x_f)}.

3. Find the maximum data point x_m on the densitometric profile, such that G(x_m) ≥ G(x_j) where x_j is any element of [x_i, x_f].
4. To find the left-side boundary point, start at x_m and move to the left on the x-axis, calling the current point x_l:
   a. if G(x_l) ≤ B, go to step 6; else go to b.
   b. if G(x_l) ≤ G(x_l + 1) and G(x_l) ≤ G(x_l - 1), indicating the first local minimum point, go to step 5.
5. If G(x_l) ≤ G(x_l - 1) and G(x_l) ≤ G(x_l - 2), go to step 6; else go to step 4.
6. To find the right-side boundary point, start at x_m and move to the right on the x-axis, calling the current point x_r:
   a. if G(x_r) ≤ B, go to step 8; else go to b.
   b. if G(x_r) ≤ G(x_r - 1) and G(x_r) ≤ G(x_r + 1), indicating the first local minimum point, go to step 7.


7. If G(x_r) ≤ G(x_r + 1) and G(x_r) ≤ G(x_r + 2), go to step 8; else go to step 6.
8. The final results are the approximate boundary points [x_l, x_r] on the densitometric profile G(x).
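The sketch below implements these eight steps in compact form. It is our own rendering and not the original DUMAS routine; the synthetic test profile and the handling of the window edges are assumptions.

```python
import numpy as np

# A sketch of the eight-step boundary estimation; tie-breaking and window-edge
# handling are our own assumptions, not the original DUMAS implementation.
def estimate_boundaries(G, xi, xf):
    G = np.asarray(G, dtype=float)
    # Step 2: background threshold from the three outermost pixels on each side.
    B = np.mean([G[xi], G[xi + 1], G[xi + 2], G[xf - 2], G[xf - 1], G[xf]])
    # Step 3: position of the maximum gray value inside the window.
    xm = xi + int(np.argmax(G[xi:xf + 1]))

    def search(step, lo, hi):
        # Steps 4-7: walk outward from the maximum until the profile drops to the
        # background level, or reaches a local minimum that stays low going outward.
        x = xm + step
        while lo < x < hi:
            if G[x] <= B:                                           # 4a / 6a
                return x
            if G[x] <= G[x - step] and G[x] <= G[x + step]:         # 4b / 6b: local minimum
                if G[x] <= G[x + step] and G[x] <= G[x + 2 * step]: # 5 / 7: not just noise
                    return x
            x += step
        return x                                                    # reached the window edge

    xl = search(-1, xi + 1, xf - 1)   # left boundary point
    xr = search(+1, xi + 1, xf - 1)   # right boundary point
    return xl, xr, B

# Hypothetical noisy profile: background near 20 with a vessel bump centred at pixel 25.
px = np.arange(50)
G = 20 + 5 * np.sqrt(np.clip(64 - (px - 25.0)**2, 0, None)) + np.random.default_rng(0).normal(0, 1, 50)
print(estimate_boundaries(G, 2, 47))
```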

Fig. 18. (a) Selecting a window around the region of interest. (b) A densitometric profile across the vessel.

The rationale for this algorithm is based on the following description of a normal case. The densitometric profile data usually have only one maximum point x_m, and if the initial window is selected properly, x_m will not lie on an arterial boundary.


On the left side of x_m, a search is performed to find either a point where G(x) ≤ B, the established background value, or a local minimum point. If it finds a local minimum point and the next point is a local maximum point, the search disregards this point as a border point and continues. The same type of search is performed on the right side to approximate that border. The justification for this search is that the profile is noisy and contains many small local maxima; the algorithm will bypass these small local maxima until it reaches a threshold point or a larger maximum due to background structures such as bone or overlapping vessels. As a result, we now have approximations for the border points [x_l, x_r]. Once the vessel borders are obtained, we have the necessary information to make initial estimates of the remaining parameters r, x_c, and β of the parametric function (Fig. 17). A computer algorithm which optimizes the best-fit curve to the densitometric profile is then called. In the algorithm, the computed squared error is represented by the equation:

E(B, r, x_c, β) = Σ_i [G(i) - P(x(i), B, r, x_c, β)]²    (10)

where G(i) is the digitized gray value at location x(i), the x(i) are the samples of x along its axis, and P(x) is the model function. The algorithm minimizes this error measure, calculating the best fit for the model by iteratively selecting the proper parameter values for B, r, x_c, and β (39). The implementation of the curve-fitting algorithm uses a combination of principles from (a) steepest descent; (b) a gradient technique, quadratic fit, and the Fletcher-Reeves method; and (c) a partial conjugate gradient approach. Beginning with steepest descent, the object is to minimize a function by following its gradient. In Eq. 10, E(B, r, x_c, β) can be abbreviated E(x), where x is a four-dimensional vector representing the four arguments B, r, x_c, β. The gradient of the function E(x) is defined as:

∇E(x) = S(x) = (∂E/∂B, ∂E/∂r, ∂E/∂x_c, ∂E/∂β).

In order to simplify the notation, we let S_n equal S(x_n) = ∇E(x_n). The negative gradient, -S_n, is a vector that points in the direction of steepest descent, the direction of greatest rate of decrease of the function. To minimize E(x), we use an iterative algorithm which at each step moves in a direction that reduces the function value most rapidly:

x_(n+1) = x_n - a_n S_n.


The scaling factor a_n is iteratively calculated to minimize E(x_n - a_n S_n). Typically the process starts at x_n and searches along the negative gradient, -S_n, until it reaches a minimum point on the line. Then x_(n+1) is set equal to this minimum point and the search repeats starting at this new point. After replacing n + 1 with n, the new S_n is calculated and a new a_n is determined. The objective of this method is to minimize the error function E(x) over n variables. At the end of each search E(x) is reduced, but the process will continue as long as ∇E(x) ≠ 0. So, as a stopping measure which does not affect measurement accuracy, the procedure is programmed to stop if the change in E(x) is smaller than a preset threshold value. It is also time-consuming to continue with the process if no significant improvement in the error will be gained; therefore, some compromise must be made between accuracy and the time required to execute the program. The Fletcher-Reeves method applies the partial conjugate gradient principle to nonquadratic cases. In Fletcher-Reeves, the search directions are not specified beforehand, but rather at the start of each iteration. At each step i, a linear combination of the previous direction vectors is added to the current negative gradient to obtain a new conjugate direction vector along which to move. Then, after n steps, a pure steepest descent step is taken, which serves as a spacer step to achieve global convergence. Simply stated, convergence is improved using this method, and it performs more rapidly and efficiently, especially when used in combination with the quadratic fit. The curve-fitting algorithm using the Fletcher-Reeves method is summarized as follows. After initial estimates are determined for the parameters, the negative gradient -∇E(x_0) is computed and set to S_0. Because we have four parameters, the loop will be executed four times. During each loop the scalar term a_i is determined that will minimize the error function E(x_i + a S_i). At this point the error function is compared to the previous iteration error to see whether the parameters are converging on the best-fit curve. If the difference between the current error and the previous iteration error is less than a fixed threshold, the program ends; otherwise it repeats the sequence. A new gradient vector ∇E(x_(i+1)) is computed. Using this value and a scalar coefficient, the next search direction S_(i+1) is calculated. This direction is used to repeat the minimization of the error function E(x). However, if the loop has been performed n times, the whole process is reset to the beginning of the algorithm, with the x_0 values replaced by the last parameters x_k calculated in the looping process. Using the Fletcher-Reeves method should generate convergence in n cycles or fewer, depending on the threshold criteria.
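A compressed sketch of this procedure is given below. It follows the description above only loosely: the finite-difference gradient, the backtracking line search, and all numerical values are simplifications assumed here for illustration, not the DUMAS implementation.

```python
import numpy as np

# A loose sketch of the curve fit: minimise the squared error of Eq. 10 using
# Fletcher-Reeves conjugate directions, with a pure steepest-descent restart
# every n_restart iterations.  The numerical gradient and the crude backtracking
# line search are assumptions made for this sketch only.
def fit_profile(G, x, p0, model, n_restart=4, tol=1e-6, max_iter=200):
    def E(p):                                    # Eq. 10: sum of squared residuals
        return float(np.sum((G - model(x, *p)) ** 2))

    def grad(p, h=1e-4):                         # finite-difference gradient of E
        g = np.zeros(len(p))
        for i in range(len(p)):
            dp = np.zeros(len(p)); dp[i] = h
            g[i] = (E(p + dp) - E(p - dp)) / (2 * h)
        return g

    def step_size(p, d):                         # backtracking search along direction d
        a, e0 = 1.0, E(p)
        while a > 1e-10 and E(p + a * d) >= e0:
            a *= 0.5
        return a

    p = np.asarray(p0, dtype=float)
    g = grad(p); d = -g; e_prev = E(p)
    for k in range(max_iter):
        p = p + step_size(p, d) * d
        e = E(p)
        if abs(e_prev - e) < tol:                # stop when the error change is tiny
            break
        e_prev, g_new = e, grad(p)
        if (k + 1) % n_restart == 0:
            d = -g_new                           # spacer step: pure steepest descent
        else:
            gamma = np.dot(g_new, g_new) / max(np.dot(g, g), 1e-12)
            d = -g_new + gamma * d               # Fletcher-Reeves direction update
        g = g_new
    return p

# Hypothetical use: fit the four-parameter profile model to a synthetic scanline.
def model(x, B, r, xc, beta):
    return B + beta * np.sqrt(np.clip(r**2 - (x - xc)**2, 0.0, None))

x = np.arange(50, dtype=float)
G = model(x, 20.0, 8.0, 25.0, 5.0)               # noiseless synthetic data
print(fit_profile(G, x, p0=(18.0, 6.0, 23.0, 4.0), model=model).round(2))
```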


This optimization technique is not complete without a discussion of the quadratic fit. The underlying assumption is that the error function being searched is unimodal and possesses a certain degree of smoothness. By exploiting this smoothness using a quadratic fit, we obtain a more efficient search technique. The technique is implemented just before the convergence check in the optimization algorithm. We initiate this line search by finding three points along the search that bracket the minimizing step size a; let us call them x1, x2, x3 with x1 < x2 < x3. Corresponding error terms are obtained such that f(x1) ≥ f(x2) ≤ f(x3), with f(x1) = f1, f(x2) = f2, f(x3) = f3. A quadratic function can be found that passes through these three points, and a new point x4 can be determined where the derivative of this quadratic function vanishes. The point x4 is calculated by:

x4 = (1/2) (b23 f1 + b31 f2 + b12 f3) / (a23 f1 + a31 f2 + a12 f3)

where a_ij = x_i - x_j and b_ij = x_i² - x_j². Next, x4 is checked against one condition: whether f(x4) ≤ f(x2). The lower of the two values is compared against the previous iteration error, and if the difference is less than the preset threshold value, the best estimates for the four parameters have been obtained; otherwise the loop is repeated.
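With a_ij and b_ij defined as above, this is the standard three-point quadratic interpolation of the line search; a minimal sketch follows, using a test function chosen purely for illustration.

```python
# A sketch of the quadratic-fit step: given bracketing points x1 < x2 < x3 with
# f(x1) >= f(x2) <= f(x3), take the vertex x4 of the interpolating parabola as
# the next trial point (a_ij = x_i - x_j, b_ij = x_i**2 - x_j**2, as in the text).
def quadratic_fit_step(x1, x2, x3, f1, f2, f3):
    a23, a31, a12 = x2 - x3, x3 - x1, x1 - x2
    b23, b31, b12 = x2**2 - x3**2, x3**2 - x1**2, x1**2 - x2**2
    return 0.5 * (b23 * f1 + b31 * f2 + b12 * f3) / (a23 * f1 + a31 * f2 + a12 * f3)

# For the hypothetical test function f(x) = (x - 1.3)**2 the minimum is recovered exactly.
f = lambda x: (x - 1.3) ** 2
print(quadratic_fit_step(0.0, 1.0, 2.0, f(0.0), f(1.0), f(2.0)))   # -> 1.3
```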

A comprehensive set of evaluations of our approach was performed on standards cast from rabbit arteries and on in-vivo arteriograms, using four measurement methods: (a) direct caliper measurement of the arterial casts (caliper); (b) caliper measurement of the overhead-projected x-ray film (caliper/projection); (c) manual point picking on the digital image (point pick/digital); and (d) semi-automatic curve fitting of the densitometric profile on the digital image (curve fit/digital). The data listed below summarize the accuracy exhibited by the various measurement methods. The percent accuracy error is calculated by comparing each method to the standard reference method listed at the top of the column.

ARTERIAL CASTS

Method                 Std. reference: caliper    Std. reference: caliper/projection
Caliper/projection     7.0%                       --
Point pick/digital     8.0%                       0.7%
Curve fit/digital      7.0%                       0.1%

IN-VIVO ARTERIOGRAMS

Method                 Std. reference: caliper/projection
Point pick/digital     0.9%
Curve fit/digital      1.5%
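The study data are not reproduced here, but the percent accuracy error above is, as we use the term, simply the relative difference of a method's diameter measurement from the chosen standard reference; the sketch below shows the calculation with hypothetical numbers.

```python
# Percent accuracy error relative to a standard reference measurement.
# The 3.21 mm and 3.00 mm values are hypothetical, not study data.
def percent_accuracy_error(measured_mm, reference_mm):
    return abs(measured_mm - reference_mm) / reference_mm * 100.0

print(round(percent_accuracy_error(3.21, 3.00), 1))   # -> 7.0 (e.g., 3.21 mm vs. a 3.00 mm cast)
```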

Using arterial casts, we obtained excellent correlation between all the methods. The correlation coefficient between the direct caliper method and the projection/caliper method was r = .985. High correlation coefficients were also exhibited between the point pick/digital method and the direct caliper method (r = .987) and between the semi-automatic/digital method and the direct caliper method (r = .989). From the data we obtained, the semi-automatic technique produced results that were as accurate as any manual method when compared with direct vernier caliper measurement of the arterial endocasts. Thus, precision and efficiency were increased, as one might expect from a computerized system, without any loss in accuracy. However, for the reasons discussed above, it should be possible to improve the accuracy of DUMAS for this purpose by proper modification of the underlying model. This requires more extensive study and experimentation to determine what parameters and equations will be necessary.

CONCLUSION

We hope that this discussion has given the reader a taste of the potential utility of inexpensive image analysis based upon personal computers. Future advances in hardware and software will undoubtedly create more opportunities for computerized image processing in medicine and biomedical research. For the potential of such systems to be fully realized, however, there must be a user demand fueling research and development. One of the goals of this paper, and indeed of the Journal itself, is to help create such a demand by showing the reader what is presently possible and what might be possible in the near future. Only through extensive interaction among clinicians, biomedical researchers, engineers, and computer programmers can that future be summoned. The BRPT Centers funded by the NIH facilitate such interactions and, with continued support and greater utilization, will most certainly play a role in extending the realm of the possible in computerized image processing.

REFERENCES

1. Benno, R.H.; Tucker, L.W.; Joh, T.H.; Reis, D.J. Quantitative immunocytochemistry of tyrosine hydroxylase in rat brain. I. Development of a computer assisted method using the peroxidase-antiperoxidase technique. Brain Res. 246:225; 1982.
2. Broder, S.; Bunn, P.A., Jr. Cutaneous T-cell lymphomas. Sem. in Oncology 7:310; 1980.
3. Brown, G.; Bolson, E.; Frimer, H.; Dodge, H.T. Quantitative coronary arteriography: Estimation of dimensions, hemodynamic resistance and atheroma mass of coronary lesions using the arteriogram and digital computation. Circulation 55:329; 1977.
4. DeRouen, T.A.; Murray, J.A.; Owen, W. Variability in the analysis of coronary arteriograms. Circulation 55:324; 1977.
5. Eilbert, J.L.; Gallistel, C.R.; McEachron, D.L. The variation in user drawn outlines on digital images: Effects on quantitative autoradiography; manuscript in preparation.

6. Fisher, B.; Redmond, C.; Brown, A.; and other NSABP investigators. Influence of tumor estrogen and progesterone receptor levels on the response to tamoxifen and chemotherapy in primary breast cancer. J. Clin. Onc. 1:227; 1983.
7. Geary, W.A.; Wooten, G.F. Quantitative film autoradiography of agonist and antagonist binding in rat brain. J. Pharm. Exp. Ther. 225:234; 1983.
8. Greene, G.L.; Nolan, C.; Engler, J.P.; Jensen, E.V. Monoclonal antibodies to human estrogen receptor. PNAS 77:5115; 1980.
9. Gross, D.S.; Rothfeld, J.M. Quantitative immunocytochemistry of hypothalamic and pituitary hormones: validation of an automated, computerized image analysis system. JCH 33:11; 1985.
10. Haralick, R.M.; Sternberg, S.R.; Zhuang, X. Image analysis using mathematical morphology. IEEE Transactions on Pattern Analysis and Machine Intelligence 9:532; 1987.
11. Hawman, E. Digital boundary detection techniques for the analysis of gated cardiac scintigrams. Optical Engineering 20:719; 1981.
12. Hawkins, R.A.; Black, R.; Steele, R.J.C.; Dixon, J.M.J.; Forrest, A.P.M. Oestrogen receptor concentration in primary breast cancer and axillary node metastases. Breast Cancer Res. Treat. 1:245; 1981.
13. Jensen, E.V. Estrogen receptors in hormone-dependent breast cancers. Cancer Res. 35:3362; 1975.
14. Jensen, E.V.; Smith, S.; DeSombre, E.R. Hormone dependency in breast cancer. J. Steroid Biochem. 7:911; 1976.
15. King, W.J.; DeSombre, E.R.; Jensen, E.V.; Greene, G.L. Comparison of immunocytochemical and steroid-binding assays for estrogen receptor in human breast tumors. Cancer Res. 45:1293; 1985.
16. King, W.J.; Greene, G.L. Monoclonal antibodies localize oestrogen receptor in the nuclei of target cells. Nature 307:745; 1984.
17. Kitamura, K.; Sklansky, J.; Tobis, J. Estimating the transverse areas of coronary arteries. Technical Report, Pattern Recognition Project, University of California, Irvine; 1986.
18. Knecht, L.B. Computer-Assisted Method for Measuring Arterial Diameters. Master's Thesis, Drexel University; 1987.
19. Lutzner, M.A.; Emerit, I.; Duprepaire, R.; Flandren, G.; Gruppe, C.L.; Prunieras, M. Cytogenetic, cytophotometric and ultrastructural study of large cerebriform cells of Sezary syndrome and description of small cell variant. J. Nat. Cancer Instit. 50:1145; 1973.
20. Macovski, A. Medical Imaging Systems. Englewood Cliffs, NJ: Prentice-Hall, Inc.; 1983.
21. Maragos, P. Tutorial on advances in morphological image processing and analysis. Optical Engineering 26:623; 1987.
22. McCarty, K.S.; Miller, L.S.; Cox, E.B.; Konrath, J.; McCarty, K.S. Estrogen receptor analyses. Arch. Path. 109:716; 1985.
23. McEachron, D.L.; Gallistel, C.R.; Eilbert, J.L.; Tretiak, O.J. The analytic and functional accuracy of a video densitometry system. J. Neurosci. Methods; in press.
24. McEachron, D.L.; Tretiak, O.J.; Feingold, E. Autoradiography and video densitometry: Image processing with DUMAS, Part 1. Functional Photography 22:30; 1987.
25. McEachron, D.L.; Tretiak, O.J.; Feingold, E. Autoradiography and video densitometry: Image processing with DUMAS, Part 2. Functional Photography 22:26; 1987.
26. Meijer, C.J.L.M.; van der Loo, E.M.; van Vloten, W.A.; van der Velde, E.A.; Scheffer, E.; Cornelisse, C.J. Early diagnosis of mycosis fungoides and Sezary syndrome by morphometric analysis of lymphoid cells in the skin. Cancer 45:2864; 1980.
27. Mumford, C.J.; Elston, C.W.; Campbell, F.C.; Blamey, R.W.; Johnson, J.; Nicholsen, R.I.; Griffiths, K.K. Tumour epithelial cellularity and quantitative oestrogen receptor values in primary breast cancer. Br. J. Cancer 47:549; 1983.
28. Pappas, T.N.; Lim, J.S. Estimation of coronary artery boundaries in angiograms. In: Applications of Digital Image Processing VIII, SPIE 504:312; 1984.


29. Parl, F.F. Estrogen receptor determination in human breast cancer. Prog. Clin. Pathol. 9:155; 1983.
30. Pertschuk, L.P.; Eisenberg, K.B.; Carter, A.C.; Feldman, J.G. Immunohistologic localization of estrogen receptors in breast cancer with monoclonal antibodies. Cancer 55:1513; 1985.
31. Reiber, J.H.C.; Kooijman, C.J.; Gerbrands, J.J.; Schuurbiers, J.C.H.; den Boer, A.; Wijns, W.; Serruys, P.W.; Hugenholtz, P.G. Coronary artery dimensions from cineangiograms: Methodology and validation of a computer-assisted analysis procedure. IEEE Trans. Med. Imaging MI-3:131; 1984.
32. Roberts, M.M.; Rubens, R.D.; King, R.J.B.; Hawkins, R.A.; Millis, R.R.; Hayward, J.L.; Forrest, A.P.M. Oestrogen receptors and the response to endocrine therapy in advanced breast cancer. Br. J. Cancer 38:431; 1978.
33. Sandor, T.; Als, A.V.; Paulin, S. Cine-densitometric measurement of coronary arterial stenoses. Cath. Cardiovasc. Diagnosis 5:229; 1979.
34. Sezary, A.; Bouvrain, Y. Erythrodermie avec présence de cellules monstrueuses dans le derme et le sang circulant. Bull. Soc. Fr. Dermatol. Syphiligr. 45:254; 1938.
35. Serra, J. Image Analysis and Mathematical Morphology. London: Academic Press; 1982.
36. Silfversward, C.; Gustafsson, J.-A.; Gustafsson, S.A. Estrogen receptor concentrations in 269 cases of histologically classified human breast cancer. Cancer 45:2001; 1980.
37. Spears, J.R.; Sandor, T.; Als, A.V.; Malagold, M.; Markis, J.E.; Grossman, W.; Serur, J.R.; Paulin, S. Computerized image analysis for quantitative measurement of vessel diameter from cineangiograms. Circulation 68:453; 1983.
38. Tretiak, O.J.; Chu, C.-L.; McEachron, D.L. The evaluation of cameras for quantitative autoradiographic video densitometry. 16th Annual Meeting of the Society for Neuroscience, Nov. 9-14, 1986. Abstracts, Part 1: 32.19, p. 112.
39. Tretiak, O.J.; Yu, G. Curve-fitting method for the measurement of the resolution of digital image input devices. Optical Engineering 25:1312; 1986.
40. True, L.D. Assay of estrogen receptors by quantitative immunohistochemistry. Fed. Proc. 45:327; 1986.
41. True, L.D. Quantitative immunohistochemistry: A new tool for surgical pathology? Amer. J. Clin. Pathology; in press.
42. True, L.D.; Heimann, A.; Eisenfeld, A. Correlation of estrogen and progesterone receptor levels with cellularity of breast carcinoma. Lab. Invest. 52:69; 1985.
43. Underwood, J.C.E. A morphometric analysis of human breast carcinoma. Br. J. Cancer 26:234; 1972.
44. van der Loo, E.M.; Cnossen, J.; Meijer, C.J.L.M. Morphological aspects of T-cell subpopulations in human blood: Characterization of the cerebriform mononuclear cells in healthy individuals. Clin. Exp. Immunol. 43:506; 1981.
45. van der Loo, E.M.; van Vloten, W.A.; Cornelisse, C.J.; Scheffer, E.; Meijer, C.J.L.M. The relevance of morphometry in the differential diagnosis of cutaneous T cell lymphomas. Brit. J. Dermatol. 104:257; 1981.
46. Vonderheid, E.C.; Tam, D.W.; Johnson, W.C.; van Scott, E.J.; Wallner, P.E. Prognostic significance of cytomorphology in the cutaneous T-cell lymphomas. Cancer 47:119; 1981.
47. Willemze, R.; van Vloten, W.A.; Hermans, J.; Damsteeg, M.J.M.; Meijer, C.J.L.M. Diagnostic criteria in Sezary syndrome: A multiparameter study of peripheral blood lymphocytes in 32 patients with erythroderma. J. Invest. Dermatol. 81:392; 1983.
48. Winkler, C.F.; Bunn, P.A., Jr. Cutaneous T-cell lymphoma: A review. CRC Critical Reviews in Oncology/Hematology 1:49; 1983.
49. Zir, L.M.; Miller, S.W.; Dinsmore, R.E.
Interobserver variability in coronary angiography. Circulation 53:627; 1976.

About the Author-DR. DONALD L. MCEACHRON graduated with Highest Honors in Behavior Genetics from the University of California at Berkeley in June, 1977 and received his Ph.D. in Neuroscience from the University of California at San Diego in March, 1984.


In May, 1984, he became Scientific Director of the Image Processing Center at Drexel University, a position which he still holds. He is the editor of Volume 11 of the Experimental Biology and Medicine series: Functional Mapping in Biology and Medicine: Computer Assisted Autoradiography (S. Karger, 1986). In 1987, he joined the Department of Bioscience and Biotechnology at Drexel University as a Visiting Assistant Professor and holds positions as Lecturer at the University of Pennsylvania and Thomas Jefferson University. His current interests include quantitative autoradiography and the interactions of biological rhythms and affective states. About the Author-MS. S. K. HESS graduated from Goucher College in Maryland in 1971 and has fulfilled ABET requirements for an undergraduate degree in engineering from Drexel University in Philadelphia. After graduating from Drexel with an M.S. in biomedical engineering in August, 1988, she will attend the Temple University School of Medicine in Philadelphia. About the Author-MR. LEWIS B. KNECHT received his B.S. degree in Biology from Lafayette College, Easton, PA, in 1980 and his M.S. degree in Biomedical Engineering from Drexel University.

He is currently working for IBM in the areas of expert systems and artificial intelligence.

About the Author-DR. LAWRENCE D. TRUE received his B.A. at Harvard College and his M.D. at Tulane School of Medicine. During training in Anatomic Pathology at the University of Colorado Health Sciences Center, he specialized in immunoperoxidase procedures in the laboratory of Dr. Paul Nakane. Following a year at Colorado as Research Associate and Instructor, he joined the faculty at the University of Chicago in the section of Surgical Pathology where, in collaboration with Dr. Elaine Fuchs of the Biochemistry Department, he studied the expression of keratin in tumors. In 1984, he joined the faculty of Yale University Medical School as an Assistant Professor of Pathology in the section of Surgical Pathology. As Director of Diagnostic Immunohistochemistry, Dr. True has been developing quantitative immunohistochemistry as a diagnostic modality, using computer-based methodologies.