World abstracts on microelectronics and reliability
Process optimization for polysilicon-thermal oxide-polysilicon capacitors. TOMASZ BROZEK and ROBERT WISNIEWSKI. Electron Technology (Warsaw, Poland), 28(1/2), 51 (1995). The paper presents results of experimental optimization of processing for polysilicon-thermal oxide-polysilicon (POLOX) sandwich structures. This type of structure has recently been found important, mainly due to its significant role in both memory and analog ICs. Electrical characteristics of the POLOX structures, however, still remain much poorer than those of oxides grown on monocrystalline silicon. During the course of optimization we have found that processing based on dry oxidation of "amorphous" polysilicon results in the best breakdown properties of the POLOX oxides. The results show that the quality of both oxide interfaces, which determines conduction through the oxide and its polarity dependence, strongly depends on technological conditions. The process optimization allowed us to obtain good breakdown characteristics of POLOX structures (breakdown fields of 7-8 MV cm⁻¹) with symmetrical current-voltage characteristics. We also found that for this case the properties of both oxide interfaces, as far as electron injection and trapping are concerned, were similar.
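For orientation, a breakdown field translates into a breakdown voltage via V_bd ≈ E_bd × t_ox. The sketch below is illustrative only; the 20 nm interpoly-oxide thickness is an assumed example value, not a figure from the abstract.

```python
# Illustrative only: convert a breakdown field into a breakdown voltage.
# The 20 nm oxide thickness is an assumed example value, not from the abstract.
def breakdown_voltage(e_bd_mv_per_cm: float, t_ox_nm: float) -> float:
    """Return breakdown voltage in volts for a field in MV/cm and thickness in nm."""
    e_v_per_cm = e_bd_mv_per_cm * 1e6   # MV/cm -> V/cm
    t_cm = t_ox_nm * 1e-7               # nm -> cm
    return e_v_per_cm * t_cm

for field in (7.0, 8.0):                # range reported in the abstract
    print(f"{field} MV/cm over 20 nm -> {breakdown_voltage(field, 20.0):.1f} V")
```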
3. CIRCUIT AND SYSTEMS RELIABILITY, MAINTENANCE AND REDUNDANCY
Testing constant failure rate against NBAFR alternatives with randomly right-censored data. RAM C. TIWARI and JYOTI N. ZALKIKAR. IEEE Transactions on Reliability, 43(4), 634 (December 1994). Reliability analysts and biometricians have found it useful to categorize life distributions by the properties of the failure rate. This paper considers the problem of testing exponentiality vs (non-exponential) new better than average failure rate (NBAFR) alternatives. Often, in practice, the data are incomplete because of: (a) withdrawals from the study and (b) survivors at the time the data are analyzed. We propose a test statistic based on a function of the Kaplan-Meier estimator to accommodate randomly right-censored data. The asymptotic efficacy of the test is derived and the efficiency loss due to censoring is studied. Our test is applied to published survival data, and to simulated data.
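The test statistic is built on the Kaplan-Meier (product-limit) estimator of the survival function for right-censored data. A minimal sketch of that estimator follows; the data and names are illustrative, and this is not the authors' test statistic itself.

```python
# Minimal Kaplan-Meier product-limit estimator for right-censored data.
# times:  observed times (failure or censoring)
# events: 1 if the observation is a failure, 0 if it is right-censored
def kaplan_meier(times, events):
    """Return (time, S(time)) pairs at each distinct failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        ties = sum(1 for tt, _ in data if tt == t)
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        # everyone observed at time t (failed or censored) leaves the risk set
        n_at_risk -= ties
        i += ties
    return curve

# Example: failures at 2, 5, 7; censored observations at 3 and 6.
print(kaplan_meier([2, 3, 5, 6, 7], [1, 0, 1, 0, 1]))
```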
Fault-tolerant routing algorithms using estimator discretized learning automata for high-speed packet-switched networks. ATHANASIOS V. VASILAKOS and CONSTANTINOS T. PAXIMADIS. IEEE Transactions on Reliability, 43(4), 582 (December 1994). We present an adaptive routing algorithm (VP-LA) for high-speed packet-switched networks. We use the source routing strategy. VP-LA uses a new S-Model Ergodic Discretized Estimator Learning Automaton (SEDEL), specially designed for the routing problem, to select accurately and rapidly the minimum-delay routes in high-speed packet-switched networks. The estimator provides VP-LA with excellent fault-tolerant properties. Moreover, the VP-LA is ε-optimal. VP-LA was extensively simulated; the results showed the superiority of VP-LA over other source and link-by-link routing algorithms. VP-LA performs quite well even where the network feedback is misleading, and can be easily and efficiently applied because of its reduced complexity and overhead.
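The abstract does not spell out the SEDEL update rules. As a rough sketch of the general idea behind estimator-based, discretized learning-automaton route selection (a pursuit-style update; names, the delay model, and parameters are illustrative and not the authors' VP-LA):

```python
import random

# Rough sketch of an estimator-based discretized learning automaton choosing
# among candidate routes.  Illustrative only: the paper's SEDEL automaton uses
# its own S-model update rules, which the abstract does not give.
class DiscretizedEstimatorAutomaton:
    def __init__(self, n_routes, resolution=100):
        self.n = n_routes
        self.step = 1.0 / resolution              # discretization unit
        self.p = [1.0 / n_routes] * n_routes      # route-selection probabilities
        self.reward_sum = [0.0] * n_routes        # running reward totals
        self.count = [0] * n_routes

    def choose(self):
        r, acc = random.random(), 0.0
        for i, pi in enumerate(self.p):
            acc += pi
            if r <= acc:
                return i
        return self.n - 1

    def update(self, route, reward):
        """reward in [0, 1]; higher means lower observed delay (S-model feedback)."""
        self.reward_sum[route] += reward
        self.count[route] += 1
        estimates = [s / c if c else 0.0 for s, c in zip(self.reward_sum, self.count)]
        best = estimates.index(max(estimates))
        # shift probability mass toward the best-estimated route in fixed steps
        for i in range(self.n):
            if i != best:
                delta = min(self.step, self.p[i])
                self.p[i] -= delta
                self.p[best] += delta

# Toy usage: route 1 has the lowest (best) mean normalized delay.
automaton = DiscretizedEstimatorAutomaton(n_routes=3)
true_delay = [0.6, 0.2, 0.5]
for _ in range(500):
    route = automaton.choose()
    observed = min(1.0, max(0.0, random.gauss(true_delay[route], 0.05)))
    automaton.update(route, reward=1.0 - observed)
print([round(pi, 2) for pi in automaton.p])
```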
A generalized geometric de-eutrophication software-reliability model. OLIVER GAUDOIN, CHRISTIAN LAVERGNE and JEAN-LOUIS SOLER. IEEE Transactions on Reliability, 43(4), 536 (December 1994). We present a new software-reliability model, called the Lognormal Proportional Model (LPM). It belongs to the class of proportional models and can be viewed as a Bayes generalization of Moranda's geometric de-eutrophication model or Deterministic Proportional Model (DPM). It is based on the idea that the modeling of software improvement should be stochastic rather than deterministic. The LPM appears to be a Variance Components Linear Model that leads to the computation of several estimators of the parameters. We present a statistical test to compare the goodness-of-fit of the general LPM and the DPM, for a given realization of the failure process. An application to actual software failure data is briefly described. The LPM fits most data sets better than the DPM. This emphasizes the great variability of most software-reliability data.
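For reference, Moranda's geometric de-eutrophication model (the DPM) takes the failure rate after the i-th correction to be λ_i = D·k^(i-1) with 0 < k < 1, so inter-failure times are exponential with a geometrically decreasing rate; the LPM randomizes the rate-reduction factor. A minimal simulation sketch of the deterministic model, with parameter values chosen purely for illustration:

```python
import random

# Minimal sketch of Moranda's geometric de-eutrophication model (DPM):
# after the i-th fix the failure rate is lambda_i = D * k**(i - 1), 0 < k < 1,
# and inter-failure times are exponential with that rate.  The paper's LPM
# instead treats the rate-reduction factor as random; that is not reproduced here.
def simulate_dpm(D=2.0, k=0.8, n_failures=10, seed=1):
    random.seed(seed)
    times = []
    for i in range(n_failures):
        rate = D * k ** i            # failure rate for the (i + 1)-th interval
        times.append(random.expovariate(rate))
    return times

inter_failure_times = simulate_dpm()
print([round(t, 2) for t in inter_failure_times])
```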
Reliability development and qualification of a low-cost PQFP-based MCM. IEEE Transactions on Components, Packaging, and Manufacturing Technology, Part A, 18(1), 10 (March 1995). In Motorola's experience with commercial MCM customers, cost and size reduction are the largest driving factors for interest in MCMs. Speed and other performance factors are of secondary interest. Development and qualification can add significantly to the total cost of an MCM, so in addition to the normal desire to provide reliable products, the cost of doing so has gained increased importance. Motorola has identified three key factors in providing cost-effective MCMs: leverage single-chip package experience, qualify MCM product families (package types), and use only qualified silicon devices in MCM products. This paper describes application of the three key factors to the reliability qualification of the 28-mm MCML® Series package, a PQFP- (Plastic Quad Flat Pack) based MCM. An initial reliability evaluation was performed to investigate reliability issues. As a result of the initial evaluation, changes were made to assembly processes and materials. The MCM was then submitted to a suite of reliability stresses selected to evaluate mechanical, thermomechanical, moisture, and longevity performance. The MCM passed electrical and visual (SAT, or Scanning Acoustic Tomography) reliability requirements for all stresses, and performed well in extended stress tests.