A new technique to optimize system reliability


distribution of the t-th smallest value in a third-stage sample which exceeds the k-th smallest value in the second sample. Procedures and tables are given for two situations: (A) The usual 2-stage prediction interval has been applied, and a third stage is now required; sample sizes are given for this problem. (B) We know in advance that three stages will be necessary; the factors are given for the required procedure.
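Since the abstract above survives only as a fragment, the setting can be illustrated with a rough Monte Carlo sketch: for i.i.d. continuous samples the event that the t-th smallest third-stage value exceeds the k-th smallest second-stage value is distribution-free, so uniform draws suffice. The sample sizes and indices below are arbitrary, and this is not the paper's tabulated procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_t_exceeds_k(n2, n3, k, t, n_sims=100_000):
    """Monte Carlo estimate of P(t-th smallest of stage 3 > k-th smallest of stage 2).

    Both samples are i.i.d. from the same continuous distribution, so the
    probability is distribution-free and uniform draws suffice.
    """
    stage2 = np.sort(rng.uniform(size=(n_sims, n2)), axis=1)
    stage3 = np.sort(rng.uniform(size=(n_sims, n3)), axis=1)
    # k-th and t-th smallest values (1-indexed, as in the abstract)
    return np.mean(stage3[:, t - 1] > stage2[:, k - 1])

# Example: 20-unit second stage, 15-unit third stage
print(prob_t_exceeds_k(n2=20, n3=15, k=5, t=3))
```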


Credible and HPD intervals of the parameter and reliability of Rayleigh distribution. S. K. SINHA and H. A. HOWLADER. IEEE Trans. Reliab. R-32 (2), 217 (June 1983). Using a squared-error loss function and Jeffreys' noninformative prior, a Bayes estimator of the Rayleigh parameter and the associated reliability function have been obtained and the estimators compared with their maximum likelihood and uniformly minimum variance unbiased counterparts. Credible intervals and the highest posterior density intervals for the scale parameter and the reliability function are derived. A numerical example is given.
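As a rough numerical companion to the Sinha & Howlader abstract, a sketch of the standard closed forms, assuming the parameterization f(x; lambda) = 2*lambda*x*exp(-lambda*x^2) with Jeffreys prior proportional to 1/lambda, under which the posterior of lambda is Gamma(n, sum of x^2); the data are invented and the paper's own numerical example is not reproduced.

```python
import numpy as np
from scipy import stats

# Invented failure-time data (Rayleigh-distributed), for illustration only
x = np.array([1.2, 0.8, 2.1, 1.5, 0.9, 1.7, 1.1, 2.4])
n, S = len(x), np.sum(x**2)
t = 1.0                      # mission time for the reliability function

# Posterior for lambda under Jeffreys prior is Gamma(shape=n, rate=S)
lam_bayes = n / S            # posterior mean = Bayes estimate (squared-error loss)
lam_mle = n / S              # the MLE coincides under this parameterization

# Reliability R(t) = exp(-lambda * t^2) and its competing estimators
R_bayes = (S / (S + t**2))**n          # posterior mean of exp(-lambda * t^2)
R_mle = np.exp(-n * t**2 / S)
R_umvue = (1 - t**2 / S)**(n - 1) if t**2 < S else 0.0

# Equal-tailed 95% credible interval for lambda from the Gamma posterior
lo, hi = stats.gamma.ppf([0.025, 0.975], a=n, scale=1 / S)

print(f"lambda: Bayes/MLE = {lam_bayes:.4f}, 95% CrI = ({lo:.4f}, {hi:.4f})")
print(f"R({t}): Bayes = {R_bayes:.4f}, MLE = {R_mle:.4f}, UMVUE = {R_umvue:.4f}")
```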

Computer aided quality management in electronics and production--a challenge for the entire enterprise. B. P. DAVIS. Feinwerktechnik Messtechnik 91 (8), 365 (1983) (in German). The quality data from automatic test systems for electronic components, modules and devices can only be used effectively with a computer-supported hierarchic system. Bringing marketing, product development, design and customer support into the quality management of production leads to greater productivity and to numerous economic and strategic advantages for the entire enterprise.

Folded equations for a Weibull maximum likelihood ratio test. J. EDWARD BILIKAM and RAYKUN R. CHANG. IEEE Trans. Reliab. R-32 (2), 197 (June 1983). An improved non-myopic method of maximum likelihood estimation for a 2-parameter Weibull model with multi-censored samples is derived and demonstrated. Maximum likelihood ratio testing of two hypotheses is discussed for: (1) homogeneous shape parameters, and (2) a composite of homogeneous shape and scale parameters.
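A minimal sketch of 2-parameter Weibull maximum likelihood under multi-censoring (right-censored units contribute the survivor function to the likelihood), fitted numerically rather than by the paper's folded equations; the data are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative multi-censored sample: times with a censoring indicator
# (1 = observed failure, 0 = right-censored)
t = np.array([12.0, 25.0, 31.0, 34.0, 46.0, 58.0, 63.0, 70.0])
d = np.array([1,    1,    0,    1,    1,    0,    1,    0])

def neg_log_lik(params):
    """Negative Weibull log-likelihood with right censoring.

    Failures contribute log f(t); censored units contribute log S(t),
    where f(t) = (beta/eta)(t/eta)^(beta-1) exp(-(t/eta)^beta)."""
    beta, eta = np.exp(params)            # optimize on log scale for positivity
    z = (t / eta) ** beta
    log_f = np.log(beta / eta) + (beta - 1) * np.log(t / eta) - z
    log_S = -z
    return -np.sum(d * log_f + (1 - d) * log_S)

res = minimize(neg_log_lik, x0=np.log([1.0, 50.0]), method="Nelder-Mead")
beta_hat, eta_hat = np.exp(res.x)
print(f"shape beta = {beta_hat:.3f}, scale eta = {eta_hat:.3f}")
```

A likelihood ratio test of, say, homogeneous shape across two groups would then compare twice the difference in maximized log-likelihoods against a chi-squared quantile.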

On the determination of all tie sets and minimal cut sets between any two nodes of a graph through Petri nets. G. S. HURA. Microelectron. Reliab. 23 (3), 471 (1983). A technique utilizing the reachability concept of Petri nets is proposed to determine all the tie sets and the minimal cut sets between two specified nodes in a graph. The proposed technique is novel in that it requires only vector additions on a single matrix, compared with the large number of steps required by existing techniques; this eases both the computational burden and the memory requirements.
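The Petri-net matrix formulation itself is not reproduced here; as a point of reference, a brute-force sketch that enumerates tie sets (simple paths) between two nodes by depth-first search and recovers the minimal cut sets as the minimal edge sets hitting every tie set. The bridge-network example is invented.

```python
from itertools import combinations

# Illustrative undirected graph as an edge list (a bridge network)
edges = [("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")]

def tie_sets(edges, src, dst):
    """All simple paths (tie sets) from src to dst, as frozensets of edges."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append((v, (u, v)))
        adj.setdefault(v, []).append((u, (u, v)))
    paths = []
    def dfs(node, visited, used):
        if node == dst:
            paths.append(used)
            return
        for nxt, e in adj[node]:
            if nxt not in visited:
                dfs(nxt, visited | {node}, used | {e})
    dfs(src, set(), frozenset())
    return paths

def minimal_cut_sets(edges, ties):
    """Minimal edge sets intersecting every tie set (brute force, not Petri nets).

    Candidates are examined in increasing size, so any strict subset that
    cuts the graph is already recorded, making the minimality check valid."""
    cuts = []
    for r in range(1, len(edges) + 1):
        for cand in combinations(edges, r):
            c = set(cand)
            if all(c & t for t in ties) and not any(set(k) < c for k in cuts):
                cuts.append(cand)
    return cuts

ties = tie_sets(edges, "s", "t")
print("tie sets:", [sorted(t) for t in ties])
print("minimal cut sets:", minimal_cut_sets(edges, ties))
```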

Software reliability models: a review. J. G. SHANTHIKUMAR. Microelectron. Reliab. 23 (5), 903 (1983). In this paper we review several software reliability models and provide an extensive listing of papers dealing with software reliability modelling and their applications. The models discussed are grouped into two broad categories: empirical and analytical models. Analytical models are further subdivided into static and dynamic models, and the general theory behind these models is reviewed. Based on the observations made in this review, we provide suggestions for future research.
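As one concrete instance of the analytical dynamic models surveyed, a sketch fitting the Jelinski-Moranda model, whose hazard before the i-th failure is phi*(N - i + 1), by numerical maximum likelihood; the interfailure times are invented, and N is treated as continuous for simplicity.

```python
import numpy as np
from scipy.optimize import minimize

# Invented interfailure times (hours), for illustration only
x = np.array([9., 12., 11., 4., 7., 2., 5., 8., 5., 7.,
              1., 6., 1., 9., 4., 1., 3., 3., 6., 1.])
n = len(x)
i = np.arange(1, n + 1)

def neg_log_lik(params):
    """Jelinski-Moranda: x_i is exponential with rate phi * (N - i + 1)."""
    log_phi, extra = params
    phi, N = np.exp(log_phi), n + np.exp(extra)   # enforce phi > 0 and N > n
    remaining = N - i + 1                         # faults left before failure i
    return -(np.sum(np.log(phi * remaining)) - phi * np.sum(remaining * x))

res = minimize(neg_log_lik, x0=[np.log(0.01), np.log(5.0)], method="Nelder-Mead")
phi_hat, N_hat = np.exp(res.x[0]), n + np.exp(res.x[1])
print(f"phi = {phi_hat:.4f}, estimated total faults N = {N_hat:.1f}")
```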

On profit evaluation in a modular system with two types of failures. M. C. GUPTA and ASHOK KUMAR. Microelectron. Reliab. 23 (5), 823 (1983). A single-unit maintained system consisting of two types of modules is discussed. Failure of the type I module brings the system to the failed state, whereas failure of the type II module brings the system to a less productive state. The system is identified by up and down states and the expected profit is obtained. Two cases are discussed. In the first case, earnings (cost) in the failed state are assumed to be a continuous function of the repair rate, and the optimum repair rate which maximizes expected profit is obtained. In the second case, earnings (cost) in the failed state are taken as a discrete function of the repair rate. A procedure is suggested which enables one to make an optimum choice of repair policy. Numerical examples are included to illustrate the results.
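A hedged numerical sketch of the kind of continuous-case optimization described: a hypothetical three-state Markov model (up, less productive, down) whose failed-state cost grows with the repair rate mu, maximized over mu. All rates, earnings, and the cost function are invented, not the authors' model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical rates: type I failures take the system down, type II
# failures degrade it; mu is the failed-state repair rate to be chosen.
lam1, lam2, mu2 = 0.02, 0.05, 0.5   # type I, type II failure; type II repair

def steady_state(mu):
    """Stationary distribution of the 3-state chain (up, degraded, down)."""
    Q = np.array([
        [-(lam1 + lam2), lam2,          lam1],   # up
        [mu2,           -(mu2 + lam1),  lam1],   # degraded (less productive)
        [mu,             0.0,          -mu  ],   # down, repaired at rate mu
    ])
    A = np.vstack([Q.T, np.ones(3)])             # pi Q = 0, sum(pi) = 1
    b = np.array([0.0, 0.0, 0.0, 1.0])
    return np.linalg.lstsq(A, b, rcond=None)[0]

def expected_profit(mu):
    p_up, p_deg, p_down = steady_state(mu)
    c0, c1 = 100.0, 60.0            # earnings per unit time: up / degraded
    repair_cost = 5.0 * mu**2       # failed-state cost as a function of mu
    return c0 * p_up + c1 * p_deg - repair_cost * p_down

res = minimize_scalar(lambda mu: -expected_profit(mu),
                      bounds=(0.01, 5.0), method="bounded")
print(f"optimum repair rate mu* = {res.x:.3f}, profit = {-res.fun:.2f}")
```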

Steiner trees in probabilistic networks. JOSEPH A. WALD and CHARLES J. COLBOURN. Microelectron. Reliab. 23 (5), 837 (1983). Various network reliability problems are #P-complete; however, certain classes of networks, such as series-parallel networks, admit polynomial time algorithms. We extend these efficient methods to a superclass of series-parallel networks, the partial 2-trees. In fact, we solve a more general problem: given a probabilistic partial 2-tree and a set T of target nodes, we compute in linear time the probability of obtaining a subgraph connecting all of the target nodes. Equivalently, this is the probability of obtaining a Steiner tree for T. The algorithm exploits a characterization of partial 2-trees as graphs with no subgraph homeomorphic to K_4.
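The linear-time partial 2-tree algorithm is not reproduced here; for orientation, a brute-force Monte Carlo sketch of the quantity being computed, namely the probability that the surviving edges of a probabilistic network connect every node of a target set T. The network, edge probabilities, and targets are invented.

```python
import random

# Illustrative probabilistic network: edge -> operational probability
edges = {("a", "b"): 0.9, ("b", "c"): 0.8, ("a", "c"): 0.7,
         ("c", "d"): 0.9, ("b", "d"): 0.6}
targets = {"a", "c", "d"}       # the Steiner target set T

def connects(up_edges, targets):
    """Check whether the surviving edges connect all target nodes."""
    start = next(iter(targets))
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for x, y in up_edges:
            if u == x and y not in seen:
                seen.add(y); stack.append(y)
            elif u == y and x not in seen:
                seen.add(x); stack.append(x)
    return targets <= seen

def steiner_reliability(edges, targets, n_sims=50_000, seed=1):
    """Monte Carlo estimate (brute force, not the linear-time algorithm)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        up = [e for e, p in edges.items() if rng.random() < p]
        hits += connects(up, targets)
    return hits / n_sims

print(f"P(Steiner tree for T exists) ~= {steiner_reliability(edges, targets):.3f}")
```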

A new technique to optimize system reliability. RYOICHI SASAKI, TATEO OKADA and SADANORI SHINGAI. IEEE Trans. Reliab. R-32 (2), 175 (June 1983). This paper describes an algorithm for solving reliability optimization problems formulated as nonlinear binary programming problems with multiple-choice constraints. These constraints stand for restrictions in which only one variable is assigned to each subset making up the set; thus, they are expressed by equations whose r.h.s. is unity. Different types of methods for achieving high reliability (an increase in component reliability, parallel redundancy, standby redundancy, etc.) can easily be used simultaneously as design alternatives for each subsystem. In order to solve the problem effectively, the Lawler & Bell algorithm is improved by introducing a new lexicographic enumeration order which always satisfies the multiple-choice constraints. A function for obtaining the feasible solutions which give the first to L-th minimum values of the objective function is added to the algorithm in order to make it more useful for decision making. After a numerical example assists in understanding the algorithm, the computational efficiency is compared with that of the Lawler & Bell algorithm.
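Not the improved Lawler & Bell enumeration itself, but a brute-force sketch of the problem structure: exactly one design alternative per subsystem (the multiple-choice constraint), a series-system reliability objective, a cost budget, and a report of the first to L-th best feasible designs. All numbers are invented.

```python
from itertools import product

# Illustrative design alternatives per subsystem: (reliability, cost).
# Each row mixes strategies such as better components, parallel or standby
# redundancy; exactly one alternative is chosen per row, which is the
# multiple-choice constraint whose r.h.s. is unity.
alternatives = [
    [(0.90, 2.0), (0.95, 3.5), (0.99, 6.0)],   # subsystem 1
    [(0.85, 1.5), (0.93, 3.0), (0.97, 5.0)],   # subsystem 2
    [(0.92, 2.5), (0.96, 4.0)],                # subsystem 3
]
budget, L = 10.0, 3

designs = []
for choice in product(*(range(len(a)) for a in alternatives)):
    rel, cost = 1.0, 0.0
    for sub, idx in enumerate(choice):
        r, c = alternatives[sub][idx]
        rel *= r          # series system: subsystem reliabilities multiply
        cost += c
    if cost <= budget:    # feasibility: cost constraint
        designs.append((rel, cost, choice))

# Report the first to L-th best feasible designs, as a decision-making aid
for rel, cost, choice in sorted(designs, reverse=True)[:L]:
    print(f"choice {choice}: reliability = {rel:.4f}, cost = {cost:.1f}")
```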

Quality control techniques for "Zero Defects". THOMAS W. CALVIN. IEEE Trans. Components Hybrids Manuf. Technol. CHMT-6 (3), 323 (September 1983). Everyone is being exposed to the "zero defects" philosophy, which establishes zero as a goal. This will not be achieved overnight but approached over time by continually striving to reduce targets. What kind of techniques are needed to assure zero defects? What constitutes an out-of-control situation? An attributes control chart conveys little information at or near zero defects. Assuring zero defects through sampling inspection leads to infinite samples or 100% inspection, assuming 100% inspection efficiency (the latter rarely exists, and efficiency probably gets worse at lower defect levels). Obviously, some new approaches to quality control (QC) techniques will be necessary at zero defects. One old standby is the variables control chart, X-bar and R, but with the specification at least five standard deviations from the average. Thus one route to zero defects is a properly chosen specification. However, if attributes data must be used, the standard p and u charts are not very useful. Perhaps a control chart that plots the number of good items between defects on a logarithmic scale to accommodate large numbers can be used, establishing upper and lower limits on the number of items between defects. Another problem area at zero defects is sampling inspection to assure targets. Following the approach above on good items between defects, the number of accepted lots between rejected lots can be a criterion. Sample sizes can be related to lot sizes as in
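A minimal sketch of the geometric-quantile control limits such a count-between-defects chart could use: if the defect rate is p, the number of good items between successive defects is geometric, and the limits are its alpha/2 tail quantiles. The alpha and ppm values below are illustrative.

```python
import math

def ccc_limits(p, alpha=0.0027):
    """Control limits for a count-between-defects chart.

    The count of good items between successive defects is geometric with
    parameter p, so P(count >= n) = (1 - p)^n; solving the alpha/2 tail
    equations gives the upper and lower limits.
    """
    ucl = math.log(alpha / 2) / math.log(1 - p)
    lcl = math.log(1 - alpha / 2) / math.log(1 - p)
    return lcl, ucl

# Example: 50 ppm defect level; a run of fewer than ~27 good items between
# defects, or more than ~132,000, would signal an out-of-control process.
lcl, ucl = ccc_limits(50e-6)
print(f"LCL = {lcl:.1f}, UCL = {ucl:.0f} good items between defects")
```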