Combining qualitative and quantitative factors—an analytic hierarchy approach


WILLIAM C. WEDLEY

Faculty of Business Administration, Simon Fraser University, Burnaby, B.C., Canada V5A 1S6

(Received October 1989; received for publication January 1990)

Abstract: The Analytic Hierarchy Process (AHP) provides a general theory of measurement for expressing both tangible and intangible factors. In this paper, intangible or qualitative factors are looked upon as dimensions which we have not yet learned how to measure very well. Through a redundant paired-comparison process, AHP allows us to translate qualitative preferences into ratio-scaled data. In addition, the structuring stage of AHP facilitates problem understanding.

INTRODUCTION

In making decisions, we frequently encounter complex situations which involve a number of conflicting criteria, alternative perspectives, and ill-defined attributes. Sometimes we ignore such decisions and put them off. In most cases, however, we must make a decision, while the complexity leaves a great deal of uncertainty in our minds. Moreover, the difficulty of evaluating the complexity with suitable analytical techniques means that we are probably achieving lower quality outcomes than is desired. The complexity comes from two main sources. The first is our relative inability to conceptualize and structure the numerous components of the problem into a framework which facilitates understanding and analysis. The second is the nature of the components: some are quantitative (with measurements) whereas others are subjective or non-quantitative. Our problem as decision analysts is to discover some way to combine these seemingly non-commensurate factors into a common framework.

The Analytic Hierarchy Process (AHP) is one such process which is able to both structure problems and combine quantitative and qualitative attributes. Invented by Thomas L. Saaty [10-12], the AHP provides a general theory of measurement for expressing both tangible and intangible factors. Over the years, the technique has evolved into a very flexible tool for decision making [15]. The following are typical applications discussed at a recent international symposium on AHP [13]:

Evaluation of route choices for expressways.
Rural economic development strategies.
Risk assessment of international investments.
Checking and evaluating military and political officers.
Determining environmental treatments for dust and toxic pollutants.
Estimation of urban travel demand.
Division of labor amongst rail marshalling yards.
Designing metal cutting tools.
Grading tea leaves with sensory assessments.
Evaluating bidders of hydroelectric projects.
Decisions on urban land use.

Additional applications are presented in a recent book edited by Golden et al. [3].

The purpose of this paper is to explore the use of AHP in handling complex problems. To utilize the technique, three steps are required: (1) decomposition of the problem into a hierarchy of components or elements; (2) determination of ratio weights or priorities for the elements of the hierarchy; and (3) composition of those numbers into overall weights which measure the decision outcomes. The next section explains and illustrates how the hierarchy is structured. This is followed by two sections, one illustrating the derivation of weights for tangible factors and the other for intangibles. The process of synthesis is then explained. The paper closes with a discussion and summary.

STRUCTURING THE HIERARCHY

Typical of many decision analytic techniques, AHP uses a process of decomposition to represent the important dimensions of a problem. The representation is in the form of a functional hierarchy, with the overall focus or goal situated at the top. More specific dimensions, perspectives, criteria, or alternatives emanate outwards at lower levels. In the structure, lower level items are evaluated as to their importance, impact, or effect upon the item in the next higher level, and their ultimate effect on the overall goal. What to include in a hierarchy requires a certain amount of expertise and experience. Nevertheless, there is sufficient commonality between structures to allow generalized guidelines to be specified.

(1) Include only relevant and sufficient detail. Sufficient detail means that all relevant aspects and perspectives are included in the problem, but not so many that the problem becomes too large or unwieldy. Overspecifying the elements will complicate the subsequent analysis, while underspecifying will lead to an unrepresentative formulation. If uncertain, err on the side of slight overspecification and then undertake the greater effort required to analyze a larger structure.

(2) Work downwards from perspective levels, through criteria levels, to alternative levels. Table 1 gives generic titles to each of the levels and archetype-specific labels for different types of decision problems. By changing the generic names to specific names, we can change the structure to fit different types of problems. In a decision analysis problem, for example, we can have different stakeholders (perspectives), each with their own objectives (criteria), who are to evaluate different activities (alternatives). Similarly, in a backward planning problem, we can have different scenarios (main perspectives), each affecting different desired futures (sub-perspectives under each scenario), which have problems to be overcome (criteria) by specific policies (alternatives). As illustrated by this latter example, we can have sub-levels within any of the generic levels. The important point to note, however, is that by changing the labels we place on the generic levels, we can structure many different types of decision problems. This fact partly accounts for AHP's popularity and flexibility.

(3) Have nine items or fewer below each node. Based upon Miller's [8] finding that humans have difficulty when considering more than nine items at one time, Saaty [10] specifies that nine or fewer items should be below any node in the hierarchy. If more than nine items are relevant, then they should be decomposed into an additional level whereby there are two or more homogeneous sub-groups. Alternatively, the numerous items could be handled through absolute measurement, to be explained below.

(4) Cluster like elements of the same order of magnitude. On any level, the items should be homogeneous (similar in nature) and within one order of magnitude. If heterogeneous or different by more than one order of magnitude, then the items should again be broken down into sub-groups of similar nature and magnitude. Upon decomposition, evaluations can first be made within sub-groups and then between sub-groups.

Table 1. Possible hierarchical labels for different types of problems (decision analysis, benefit-cost, forecasting, forward planning and backward planning)

Perspective levels:  Actors, Stakeholders, Groups, Segments, Parties, Sectors, Risks, Factors, Horizons, Scenarios, Desired futures
Criteria levels:     Criteria, Objectives, Benefits, Costs, Constraints, Problems
Alternative levels:  Activities, Projects, Programs, Actions, Strategies, Policies, Decisions, Outcomes, Possibilities, Priorities, Categories, Alternatives, Options, Opportunities, Scenarios


There are no set rules for how hierarchies should be structured and how many levels they might contain. The structures can be complete (i.e. all elements at one level have the same lower level items) or incomplete (some or all of the lower level elements are different). The structure, the number of levels, and the labels used to describe items depend upon the nature of the problem.

An example of an incomplete hierarchy is presented in Fig. 1. It shows the objectives, programs and services offered by the Job Training Division of a government department. The division had recently undergone a major change in focus by shifting away from wage subsidy programs in preference for supporting identifiable and direct training costs. This new focus was initiated through greater involvement and cooperation with other government, business, labour, education and community-based organizations. As a result of the reorganization, the division was confronted by two problems: the need to set priorities on the different programs and services which it offered, and the need to predetermine criteria on which these programs would eventually be evaluated. To help solve the first problem, the division structured the hierarchy in Fig. 1. Since the overall goal of the division is to facilitate economic and social development through job training, this main focus was placed at the top of the hierarchy. Next, achievement of this goal was looked at from two perspectives: the short term (less than 2 years) and the long term (2-5 years). At the third level, we placed the specific objectives or criteria on which the various programs and services were to be judged. Since there were many different programs, they were decomposed into four homogeneous groups (general training programs, special needs training programs, services to employers, and services to employees/trainees), with the specific programs (the alternatives) placed at the bottom level. Even before evaluation, this process brought greater understanding of the division's numerous activities.

Fig. 1. Hierarchy for selecting programs in a Job Training Division.
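A hierarchy like the one in Fig. 1 can be recorded as a simple nested data structure before any judgments are made. The following Python sketch is illustrative only: it lists the upper levels and the four program groupings, borrows the criterion names that appear later in Fig. 3, and omits the 26 individual programs and services.

```python
# Illustrative sketch of the Fig. 1 hierarchy as a nested data structure.
# Only the upper levels are shown; the 26 individual programs and services
# that sit under the four groups are omitted.
hierarchy = {
    "goal": "Social and economic development through job training",
    "perspectives": ["Short term (< 2 years)", "Long term (2-5 years)"],
    "criteria": [  # the short run criteria of Fig. 3
        "Overall awareness", "Job creation", "Increased employer training",
        "Cost minimization", "Special needs employment", "Skills development",
    ],
    "alternative_groups": [  # homogeneous clusters of programs and services
        "General training programs", "Special needs training programs",
        "Services to employers", "Services to employees/trainees",
    ],
}
```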


ESTABLISHING RATIO MEASURES FOR TANGIBLES

After a problem has been decomposed and structured into a hierarchy, the next step is to establish weights for the items under each node. Since the mathematics to accomplish this step are complex, we need the assistance of an AHP computer program such as Expert Choice [1] or Fuzzy Choice [2]. How AHP determines weights is best understood, however, if we first consider the procedure for determining ratio weights for tangibles. In Fig. 1, the cost minimization criterion is a tangible factor. We know, or we can calculate, the actual costs. The other criteria are either difficult to measure (e.g. job creation) or are very subjective (e.g. overall awareness). We treat these other criteria as intangibles. The actual costs of the four program groupings are as follows:

General training programs          $23 million
Special needs training programs    $12 million
Services to employers               $8 million
Services to employees/trainees     $14 million

In AHP theory, we desire to express all relationships on a ratio scale. For tangible known measures, we can do this in two ways. The first is the method of direct entry. Here, we simply enter the measures into the computer and normalize them in ratio form. For our cost figures, where a lower cost is more desirable, we would normalize the inverse of the numbers, as follows:

w1 = General training programs       (1/23)/(1/23 + 1/12 + 1/8 + 1/14) = 0.134
w2 = Special needs training programs  (1/12)/(1/23 + 1/12 + 1/8 + 1/14) = 0.258
w3 = Services to employers             (1/8)/(1/23 + 1/12 + 1/8 + 1/14) = 0.387
w4 = Services to employees/trainees   (1/14)/(1/23 + 1/12 + 1/8 + 1/14) = 0.221
                                       Total                             = 1.000
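For readers who prefer to see the arithmetic spelled out, here is a minimal Python sketch of the direct-entry calculation (the variable names are ours, not part of any AHP package):

```python
# Direct entry of tangible measures: normalize the inverses of the costs,
# since a lower cost is more desirable.
costs = {
    "General training programs": 23.0,            # $ million
    "Special needs training programs": 12.0,
    "Services to employers": 8.0,
    "Services to employees/trainees": 14.0,
}

total_inverse = sum(1.0 / c for c in costs.values())
weights = {name: (1.0 / c) / total_inverse for name, c in costs.items()}

for name, weight in weights.items():
    print(f"{name}: {weight:.3f}")
# Prints 0.134, 0.258, 0.387 and 0.221; the ratio of any two weights
# preserves the (inverse) ratio of the underlying costs, e.g. 0.258/0.134 = 23/12.
```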

Thus, the special needs programs are 1.9 times (0.258/0.134, or 23/12) cheaper than the general programs. The ratio of the priority weights maintains the ratio of the real numbers.

The second procedure for determining the priority weights is to enter the actual numbers into a positive reciprocal pairwise comparison matrix (a_ji = 1/a_ij, a_ii = 1) as shown in Table 2. The eigenvector associated with the largest positive eigenvalue of this matrix is 0.134, 0.258, 0.387 and 0.221, exactly the same numbers achieved by direct entry. Moreover, the eigenvalue for this matrix is 4, the same as the size of the matrix. Saaty [11] has shown that the eigenvalue will be equal to the matrix size when all of the comparisons are consistent.

If the same answer can be achieved by both direct entry and pairwise comparisons, why would we ever want to use comparisons when the entries are more time-consuming and the calculations more difficult? The answer lies in the assumption behind direct entry. To use it, we assume linear utility for the directly entered numbers. In other words, we assume that one dollar of expenditures on general training is worth one dollar of training for special needs. If this assumption is violated, then direct entry should not be used. How do we know when we have a nonlinear ratio? The easiest approach is to ask ourselves whether the ratios from direct entry seem to be a reasonable representation of our priorities. If not, then we would assume a nonlinear relationship and should revert to the longer paired-comparison approach. As we will see, the paired-comparison method of generating priorities can handle both linear and nonlinear relationships and tangible and intangible criteria.

Table 2. Positive reciprocal pairwise comparison matrix

                                 General     Special needs   Services to   Services to
                                 programs    programs        employers     employees/trainees
General programs                    1           12/23           8/23            14/23
Special needs programs            23/12            1            8/12            14/12
Services to employers             23/8          12/8              1             14/8
Services to employees/trainees    23/14         12/14            8/14              1
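The eigenvector procedure can be sketched in a few lines of Python with NumPy. The matrix below is built from the inverse costs, so it reproduces Table 2, and its principal eigenvector reproduces the direct-entry weights:

```python
import numpy as np

# Positive reciprocal pairwise comparison matrix for the cost criterion.
# Entry a_ij expresses the preference of i over j, here (1/cost_i)/(1/cost_j),
# so a_ji = 1/a_ij and a_ii = 1, as in Table 2.
costs = np.array([23.0, 12.0, 8.0, 14.0])
preference = 1.0 / costs                      # lower cost means higher preference
A = preference[:, None] / preference[None, :]

eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)               # largest positive eigenvalue
w = np.abs(eigenvectors[:, k].real)
w = w / w.sum()                               # normalize the priorities to sum to one

print(np.round(w, 3))                 # [0.134 0.258 0.387 0.221], same as direct entry
print(round(eigenvalues[k].real, 3))  # 4.0: lambda_max equals n when judgments are consistent
```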

Combining qualitative and quantitative factors

r

I

EQUAL 1

2

MODERATE 3

I STRONG 5

4

Fig. 2. Saaty*s fundamental

61

I 6 AHP

VERY STRONG 7

EXTREME 9

8

scale.

ESTABLISHING RATIOS FOR INTANGIBLES

With intangibles, we do not have an accurate measure of the items in the hierarchy. We may look upon intangibles as dimensions which we have not yet learned how to measure very well. Yet, we can still interrogate people to help identify ratio priorities which reflect their values. It is in this intangible or fuzzy area where AHP excels.

The procedure builds upon the pairwise comparison technique introduced for tangibles. Rather than make comparisons of the actual tangible numbers, we ask the evaluator to make comparisons with Saaty's fundamental AHP scale [11]. This scale (see Fig. 2) measures the degree of dominance of one item over another. In the scale, 2, 4, 6, 8 and decimals may be used for intermediate values. In the pairwise comparison matrix, we treat the rated degree of dominance (the numbers) as a ratio relationship and calculate priorities by the eigenvector method. The difference in this case, however, is that, unlike entering numerical comparisons where all comparisons are perfectly consistent, intangibles will inherently have a certain degree of inconsistency. With inconsistency, the eigenvalue (λ_max) will be larger than the size of the matrix, and this fact can be used as a measure of inconsistency. Saaty [10] specified an inconsistency index, (λ_max - n)/(n - 1), as a measure of the average departure from pure consistency. When this measure for a specific matrix is compared to the average inconsistency index of randomly generated matrices, the result is a consistency ratio whose value should be below 0.1 and definitely below 0.2 [10].

Under many different experimental conditions, Saaty [11] has shown that redundant comparisons with the fundamental scale and the eigenvector routine capture a person's subjective evaluations with high accuracy. So long as the person's consistency ratio is low (i.e. the consistency between comparisons is good), then it is likely that the comparisons and the resulting priorities accurately reflect the person's values. Figure 3 gives one set of comparisons for the Job Training Division. It shows the paired comparisons, priorities and consistency ratio for the criteria level under the short run scenario (see Fig. 1).
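A brief Python sketch of the consistency calculation follows. The random index values are the commonly tabulated averages for randomly generated reciprocal matrices; they are quoted from the general AHP literature rather than from this paper, and the example matrix is hypothetical:

```python
import numpy as np

# Random index (RI): the average inconsistency index of randomly generated
# reciprocal matrices, by matrix size n (commonly tabulated values).
RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def consistency_ratio(A):
    """Saaty's consistency ratio for a positive reciprocal comparison matrix A."""
    n = A.shape[0]
    lambda_max = max(np.linalg.eigvals(A).real)        # largest positive eigenvalue
    consistency_index = (lambda_max - n) / (n - 1)     # average departure from consistency
    return consistency_index / RANDOM_INDEX[n]

# Hypothetical, slightly inconsistent 3 x 3 matrix of fundamental-scale judgments.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(round(consistency_ratio(A), 3))    # well below the 0.1 guideline
```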

Fig. 3. Fuzzy Choice pairwise comparisons for short run criteria. The resulting priority weights are: overall awareness 0.09, job creation 0.27, increased employer training 0.15, cost minimization 0.11, special needs employment 0.18 and skills development 0.20, with a comparison inconsistency of 0.02.

6’

WILLIAM c. WEWLEY

These data are from an AHP software package called Fuzzy Choice [2], which generates one half of the matrix (the other half is reciprocal). Notice that the consistency is very good and that job creation is the most important criterion, being three times more important than the least important criterion (overall awareness).

SYNTHESIS TO FORM COMPOSITE PRIORITIES

Through the above process of either direct entry or pairwise comparison, we establish what are called local weights under each node of the hierarchy. Each set of local weights sums to unity and represents the relative weight or importance which each element has at that particular level of the hierarchy. But in the hierarchy, higher levels are more important and dominate lower levels. Accordingly, the overall weight assigned to the lower levels must be adjusted to reflect diminishing importance as we move down the hierarchy. This is done by multiplying the local weights by the product of all local weights which lead to the topmost goal. These local weights adjusted for higher level impacts are called "global weights." They represent the weighted importance of each node within the overall hierarchy. Whereas local weights sum to unity under one node, global weights sum to unity across a complete level.

An example of global weights will help clarify how they are calculated. In Fig. 3, the calculated priorities are the local weights for the criteria under the short run scenario. If the paired comparisons at the higher level produced local weights of 0.20 and 0.80 for the short run and long run scenarios, respectively, then the global weights for the criteria under the short run scenario would be 0.20 times each of the local weights, yielding 0.018, 0.054, 0.030, 0.022, 0.036 and 0.040. Since these global weights are the product of all higher level weights, multiplying them by the local weights in their immediate lower level will generate the global weights of that lower level.

After the global weights have been calculated through to the lowest level (obviously, the computer will do this for us), we can synthesize by adding the global weights of all bottom level nodes which have the same labels. In the case of incomplete hierarchies, however, we usually have to take an extra step to introduce a structural adjustment to reflect different numbers of alternatives at the lower level. We do this by weighting each set of lower level weights by the ratio of the number of items in the set to the total number of elements in that level. This is the same as introducing a new hierarchical level just above the alternatives which weights each set for the relative number of items in it. If we ignored this structural adjustment, then we would be allocating excessive priority to sets which have few elements while penalizing those sets which have a large number of items. In our Training Division example, there are 26 items at the bottom level. Accordingly, we would multiply the General Program global weights by 4/26, the Special Needs programs by 6/26, and the items under the two Services nodes by 8/26. This adjustment proportionally allocates the criteria weights to the alternatives so that each alternative gets its fair share of the criterion weight.

In synthesizing the adjusted global weights, we should note that all the bottom level programs are unique except for the Training and Enterprise Centers, the Information Access Centers, and the Information Kiosks. These facilities provide services to both employers and employees/trainees. Accordingly, we add the adjusted global weights of these three facilities to obtain their composite weights over the entire hierarchy. We may now compare the total weights for each program or service to see which ones have the greatest benefit for the Division.
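A minimal Python sketch of this global weight calculation for the short run criteria, using the illustrative 0.20/0.80 scenario weights from the text, is as follows:

```python
# Local weights of the two perspectives (the illustrative 0.20/0.80 split from the text).
scenario_weights = {"Short run": 0.20, "Long run": 0.80}

# Local weights of the criteria under the short run scenario (from Fig. 3).
short_run_criteria = {
    "Overall awareness": 0.09, "Job creation": 0.27,
    "Increased employer training": 0.15, "Cost minimization": 0.11,
    "Special needs employment": 0.18, "Skills development": 0.20,
}

# A global weight is the local weight multiplied by the product of all local
# weights on the path up to the goal (here just the short run scenario weight).
global_weights = {name: scenario_weights["Short run"] * w
                  for name, w in short_run_criteria.items()}
print({name: round(w, 3) for name, w in global_weights.items()})
# {'Overall awareness': 0.018, 'Job creation': 0.054, ..., 'Skills development': 0.04}

# Structural adjustment for the incomplete bottom level: each group's weights are
# scaled by (items in the group) / (items in the whole level), i.e. 4/26, 6/26, 8/26, 8/26.
group_sizes = {"General training programs": 4, "Special needs training programs": 6,
               "Services to employers": 8, "Services to employees/trainees": 8}
total_items = sum(group_sizes.values())                  # 26
structural_adjustment = {g: n / total_items for g, n in group_sizes.items()}
```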
DISCUSSION

In the Job Training Division example, most of the levels are composed of intangible or non-quantitative factors which have an impact upon our decision. As was shown, AHP is capable of expressing both tangible and intangible factors on standard scales based upon ratio relationships. This ability to intermix non-quantitative and quantitative factors in the same decision framework gives AHP considerable power as a decision making tool.

But in spite of its simplicity, flexibility, and intuitive appeal, there are some difficulties in applying the AHP. One key issue involves what to do when there is a very large number of alternatives at the lowest level, implying that it would be too cumbersome to make paired comparisons amongst them all. An example would be the selection amongst hundreds of candidates who have applied for vocational training, or numerous employers who have applied for limited training funds. Fortunately, AHP has a methodology for handling such situations. It is called absolute measurement, although the implied precision in the word "absolute" may be a little misleading. It works as follows. Instead of placing the numerous alternatives at the bottom level below the criteria, semantic anchors or indicators of the criteria (e.g. excellent references, average references, poor references) are inserted. For each criterion, paired comparisons are then made between the indicators, keeping in mind a typical candidate who would have that semantic anchor. In this manner, relative weights are generated for a prototype person with that indicator. Next, in a second stage, each candidate is evaluated and assigned an indicator for each criterion. Since we know the global weight of each indicator assigned to the candidate, we can sum across criteria to get a global score for each person. These scores are then used to rank the candidates. In carrying out this process, we are treating the indicators as absolute values attributable to the candidate (a brief sketch of this two-stage procedure is given at the end of this section).

A second drawback of AHP is the number of paired comparisons which must be performed. As the number of levels and the number of items under each node increases, the number of required paired comparisons rises quite dramatically. For example, if pairwise comparisons were used for all nodes of Fig. 1, then a total of 262 comparisons would be required. Fortunately, again, there is help on the horizon. Harker [4-7] has developed a method for calculating priorities with partial comparisons and for picking the next missing comparison which has the greatest impact on attribute priorities. Wedley [14] has supplemented this work by developing equations for forecasting the final consistency ratio when there are incomplete comparisons. Finally, Millett and Harker [9] propose rules for reducing the number of initial questions, paring branches with negligible weights, and focusing on dominating alternatives. These procedures can be readily incorporated into AHP software, thereby reducing the amount of pairwise comparison effort.

An area for future development involves new techniques for applying AHP to group processes. All of the present software packages allow consensus groups to operate via procedures whereby group members huddle around the computer discussing the correct comparison values. Although this interaction is good for resolving conflict and reaching a consensus, it is time-consuming in terms of the human resources required. An alternative procedure would be a Delphi or Nominal Group Process. Here, the group members operate separately and perform individual evaluations with their input aggregated at a later time. This separated aggregation would enable significant differences in perceptions to be pinpointed via a computer program without actually having the group meet. Moreover, it would help avoid the lengthy discussions which ordinarily occur even when there are no significant differences in perception.
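As noted above, a brief sketch of the two-stage absolute measurement procedure is given here. The criteria, indicator weights and candidates are hypothetical, chosen only to illustrate the mechanics:

```python
# Absolute measurement: candidates are rated against indicator levels rather
# than compared with each other. The indicator weights below would come from
# paired comparisons of the indicators under each criterion; here they are
# hypothetical numbers chosen only to illustrate the mechanics.
criterion_weights = {"References": 0.4, "Experience": 0.6}     # global criterion weights
indicator_weights = {
    "References": {"excellent": 1.0, "average": 0.45, "poor": 0.15},
    "Experience": {"extensive": 1.0, "some": 0.50, "none": 0.10},
}

candidates = {
    "Candidate A": {"References": "excellent", "Experience": "some"},
    "Candidate B": {"References": "average", "Experience": "extensive"},
}

def global_score(ratings):
    """Sum, across criteria, the criterion weight times the weight of the assigned indicator."""
    return sum(criterion_weights[c] * indicator_weights[c][level]
               for c, level in ratings.items())

ranking = sorted(candidates, key=lambda name: global_score(candidates[name]), reverse=True)
for name in ranking:
    print(name, round(global_score(candidates[name]), 3))
```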
SUMMARY

Group processes and incomplete comparisons are just two directions in which AHP theory is evolving. New areas of application are conflict resolution, expert systems, and the merger of AHP principles into other methodologies. Central to these developments is AHP's ability to overcome two major obstacles in complex decisions. First, as has been shown in this paper, AHP can structure the relevant components of a problem into a framework which people understand. Complexity is reduced to a functional hierarchical representation of the important decision elements. Secondly, AHP can analyze both the quantitative and qualitative aspects of those elements through ratio scales which tie the framework together. It is this ability to structure and analyze which has allowed AHP to gain major success as a flexible tool of decision making. People understand AHP output: it makes intuitive sense. They appreciate its application and they find ever-more situations in which it can be applied.

Acknowledgements: The Natural Sciences and Engineering Research Council of Canada and the Social Sciences and Humanities Research Council of Canada have provided financial support for the author's research into AHP methodologies. Their continued support has contributed to the knowledge in this article.


REFERENCES

1. E. Forman et al. Expert Choice. Decision Support Software, McLean, Va. (1986).
2. Fuzzy Choice. General Decision Support Systems Inc., Burnaby, B.C. (1986).
3. B. Golden, E. Wasil and P. Harker. The Analytic Hierarchy Process: Applications and Studies. Springer, New York (1989).
4. P. T. Harker. Derivatives of the Perron root of a positive reciprocal matrix: with application to the Analytic Hierarchy Process. Appl. Math. Comput. 22, 277-283 (1987).
5. P. T. Harker. Incomplete pairwise comparisons in the Analytic Hierarchy Process. Math. Modelling 9, 837-848 (1987).
6. P. T. Harker. Alternative modes of questioning in the Analytic Hierarchy Process. Math. Modelling 9, 353-360 (1987).
7. P. T. Harker and L. G. Vargas. The theory of ratio scale estimation: Saaty's Analytic Hierarchy Process. Mgmt Sci. 33, 1383-1403 (1987).
8. G. A. Miller. The magical number seven plus or minus two: some limits on our capacity for processing information. Psychol. Rev. 63, 81-97 (1956).
9. I. Millett and P. T. Harker. Globally effective questioning in the Analytic Hierarchy Process. Decision Sciences Working Paper 88-06-05, Decision Sciences Department, The Wharton School, University of Pennsylvania, Philadelphia, Pa. (1988).
10. T. L. Saaty. A scaling method for priorities in hierarchical structures. J. Math. Psychol. 15, 234-281 (1977).
11. T. L. Saaty. The Analytic Hierarchy Process. McGraw-Hill, New York (1980).
12. T. L. Saaty. Decision Making for Leaders. Wadsworth, Belmont, Calif. (1982).
13. Tianjin University. Preprints of the International Symposium on the Analytic Hierarchy Process. Tianjin, China (1988).
14. W. C. Wedley. Consistency tests for incomplete AHP matrices: a comparison of two methods. Proceedings of the Administrative Sciences Association of Canada, Management Science Division (1989).
15. F. Zahedi. The analytic hierarchy process: a survey of the method and its applications. Interfaces 16, 96-108 (1986).