A Measure of Staff Perceptions of Quality-Oriented Organizational Performance: Initial Development and Internal Consistency

Patrick M. McCarthy
Middle Tennessee State University, Murfreesboro, TN, USA
Thomas J. Keefe
Indiana University Southeast, New Albany, IN, USA
The present study originated from a university's effort to develop a Total Quality Management (TQM) system for improving the quality of staff services. This extends TQM into a major service sector, higher education, which has only recently begun to explore TQM applications. Integral to this effort was the development of an instrument for assessing service quality based on staff perceptions. The instrument was based on a core set of dimensions identified as common to the Malcolm Baldridge National Quality Award and the psychology and management literatures. Initial support for this instrument came from high reliability coefficients for each of the component scales. Recommendations for further validation efforts and potential practical applications are also discussed.
Direct all correspondence to: Patrick M. McCarthy, Department of Psychology, Middle Tennessee State University, MTSU Box 87, Murfreesboro, TN 37132, USA; E-mail: [email protected]

Journal of Quality Management, Vol. 4, No. 2, pp. 185-206. ISSN: 1084-8568. Copyright © 2000 by Elsevier Science Inc. All rights of reproduction in any form reserved.

OVERVIEW OF STUDY

The present study emerged as part of an effort to develop a Total Quality Management (TQM) style plan for improving the quality of staff services to internal and external customers at a midwestern university. Only in recent years have such initiatives gotten underway in higher education, despite TQM's longer history in other service industries and in manufacturing. Along the way, we found not only a need for TQM-based approaches in higher education; we discovered a more general need for an empirically based unified
description of the foundations for effective TQM. While we found suggestions from the TQM literature (such as the widely accepted Baldridge Award criteria) to be plentiful, nearly all were anecdotally driven. Further, despite criticisms of TQM as atheoretical and lacking empirical research, few attempts have been made to integrate TQM's foundations with the more theoretically based and empirically evaluated dimensions of organizational performance from the psychology and management literatures. A major focus of the present study was to identify a common framework across these literatures.

Review and integration of the literatures provided the foundation for our next objective: the development of a survey battery that measures the component dimensions. We will now present, in sequence, a more detailed description of each of these steps in developing our measurement battery. We will then evaluate the internal consistency of each scale, offer recommendations for further validation, and discuss potential practical applications of this measure. First, though, we turn to its development.
SLOW RISE OF TQM IN HIGHER EDUCATION

In the 1970s, the tremendous success of Japanese companies that were driven by quality-oriented approaches jolted all sectors of the American economy, manufacturing and service alike, to redesign their management practices. To further promote such efforts, the Malcolm Baldridge National Quality Award was created in 1987; it is administered by the American Society for Quality Control under contract with the National Institute of Standards and Technology (NIST) of the U.S. Department of Commerce. Along the way, the principles of TQM, also commonly called Continuous Quality Improvement (CQI), rapidly gained acceptance as preeminent ways of achieving organizational success. This is evidenced by the countless articles on these approaches in professional journals and magazines, and by NIST's distribution of 200,000 copies of the Baldridge criteria annually (Kendall & Stern, 1997).

Yet, while TQM has been adopted by a wide range of manufacturing and service organizations, higher education is one major service sector that has been slow to transition into TQM. Colleges and universities have generally had a superficial awareness of TQM, but actual practice of TQM principles has been rare. One possible reason is that many in academia have argued against the need for involving business techniques on the basis of academic exceptionalism (Yudof & Busch-Vishniac, 1996). This perspective holds that a university's job is to enliven the minds of students, and that this process is too complex for TQM principles. Frankly, that argument does not hold up to scrutiny. Continuous improvement of organizational processes to improve the quality of services is not at cross-purposes with the mission of higher education. The value of TQM principles does not depend on what services an organization provides. Rather, their value lies in their guidance for how an organization can raise the quality of its services, no matter what those services happen to be. For instance, staff services at a university
can be quite varied (e.g., student counseling, financial planning, computers and other facilities planning and maintenance, etc.), yet it is clear that all those services could benefit from a systematic approach for improving their quality.

Unfortunately, many in academia still do not understand or welcome TQM as a valuable ally. Those that do are joined by taxpayers and politicians (Lee, 1993), and by potential students in an increasingly competitive education marketplace, who demand higher quality and accountability in higher education. Further supporting the value of TQM in higher education are the early signs of progress identified in the American Association for Higher Education's (AAHE) description of 25 campuses that have begun TQM/CQI efforts, most within the past 6 years (Brigham, 1994a). In addition, a group of Academic Quality Consortium institutions has begun work on developing a Baldridge award category for education (Seymour, 1995). Finally, in response to the question, "Is CQI right for higher education?," Steve Brigham (1994b) of AAHE noted that while 2 years earlier the question was met with great skepticism, the answer [now] is a conditional yes. CQI is right for higher education, provided there is appropriate translation and customization of its principles, proper training and practice, and a bona fide commitment from institutional leaders. The CQI movement in higher education is growing rapidly, stumbling at times, and learning from successes and failures as it ventures into very new territory ... Much remains to be learned about the "fit" of CQI to our academic environments (pp. vii-viii).
Thus, despite the recent attempts to begin TQM in higher education, significantly more progress is needed. Moreover, relatively few TQM initiatives have focused on the quality of non-faculty staff services. The present study emerged from a TQM initiative which did focus specifically on the quality of non-faculty staff services, in this case at a midwestern university serving approximately 5,500 students.
ORIGINS OF PRESENT STUDY

In November 1996, the university's chancellor appointed a committee of staff and faculty to develop a TQM-based plan of action to improve staff (non-faculty) services to customers, both internal and external. (Other university committees were charged with addressing faculty-based service improvements.) A crucial first step toward improving the quality of staff-based organizational performance was identified: assessing the current quality of that performance. While many sources of information would be valuable to such an effort (including students' perceptions), we elected to begin by surveying the staff's perceptions. Proponents of TQM, including followers of Crosby, Juran, and especially Deming, emphasize that quality efforts must be based on a profound knowledge of the processes essential to providing quality services in a particular organization (Dobyns & Crawford-Mason, 1991). Staff could provide that knowledge. Staff are key internal customers for service quality, and they are better informed than most on many organizational performance processes. Additionally,
we felt it was important to give staff a voice on issues directly affecting them, to help increase their feelings of both involvement and ownership for subsequent change efforts.
FOUNDATIONS FOR MEASURING TQM COMPONENTS

We initially searched for an existing measurement instrument that had been previously validated and might offer benchmark results from other universities. We discovered, though, what Cardy and Dobbins (1996) already knew: there is a dearth of survey work addressing TQM processes in any type of setting. In fact, a number of recent studies have noted that little academic research has addressed TQM at all, despite its popularity. Further, the proliferation of anecdotal information about TQM in the popular press has created considerable confusion (Gatewood & Riordan, 1997; Hackman & Wageman, 1995; Waldman & Gopalakrishnan, 1996).

Our search for a measure of quality service components yielded few options. Additionally, those we did find were either not well suited to an education staff services context or lacked sufficiently clear theoretical foundations. In the latter case, the surveys typically had sections of questions, but the overall meaning or operational definition of each section was unclear. Contributing to this confusion, there was typically little or no formal justification of why items were organized into particular sections (i.e., the surveys lacked sufficiently systematic qualitative assessments or quantitative evaluations of factor/dimension structure). These limitations could significantly obscure such surveys' ability to provide an understanding of processes essential to the quality of services.

Thus, we faced ambiguities and confusion regarding the very foundations of organizational performance quality. A well-established basis for attempting to clarify these foundations is the set of seven Baldridge criteria. However, a significant criticism of these criteria is their atheoretical nature (Kendall & Stern, 1997; Wilson & Durant, 1994). Like the TQM literature overall, these criteria have relied too heavily on anecdotally based arguments. Thus, we turned to the psychology and management literatures addressing organizational behavior for further theoretical and empirically based guidance. This literature has received little explicit recognition, let alone integration, from TQM writers, despite the considerable potential value in doing so (Gatewood & Riordan, 1997). Kanter and Brinkerhoff (1981) observed that while the topic of measuring organizational performance has a voluminous literature spanning several disciplines, its conceptual foundations were not yet sufficiently clear. They further noted that questions of what to measure "are not mere annoyances to be brushed aside as soon as better measurement techniques are invented; instead, they are fundamental aspects of modern organizations themselves" (p. 321). Thus, while the organizational behavior literature offered a substantial empirical basis that the TQM literature lacked, it still needed a unified conceptual foundation for integration with the TQM literature.

We sought to identify a unified conceptual foundation of organizational behavior from studies in this literature which reviewed and integrated the findings
from numerous other studies. First was Corts and Gowing's (1992) review and integration of approximately 28 studies. The dimensions they identified were later utilized to develop Gregory and Park's (1992) Multipurpose Occupational Systems Analysis Inventory-Close-Ended (MOSAIC). Second, Porras and Robertson (1992) provided an extensive review and integration of numerous studies for a chapter of the Handbook of Industrial and Organizational Psychology. Third, Burke and Litwin (1992) proposed a set of dimensions fundamental to organizational development and change based on their consulting experiences, then catalogued 29 research studies to support these dimensions. Fourth, McCarthy (1995) integrated findings from three major review articles and several other studies which attempted to identify dimensions of organizational performance for innovation-oriented organizations (e.g., those with a research and development mission). Several components of that study were further examined and supported by McCarthy, Fleishman, and Holt (1997). McCarthy's review/integration was included because universities' missions often include significant research, innovation, and creativity aspects.
DIMENSIONS OF ORGANIZATIONAL PERFORMANCE QUALITY

The theoretically based, empirically examined dimensions that emerged across these reviews of the psychology and management literatures on organizational performance were consistent with each other, and with the Baldridge criteria. Table 1 presents the dimensions we identified and summarizes their integration with the Baldridge criteria and the four literature reviews. The columns present information from each source, and the rows reveal the commonalities across these sources.

The first dimension identified was Planning, which consists of two sub-dimensions, Mission Clarity and Strategic Planning/Goal Setting. This is entirely consistent with the TQM emphasis on being mission-driven and with the Baldridge award's strategic planning criteria. Strong support also emerged from all of the reviews of the organizational performance literature. Essentially, each recognized that performance quality improvements first require a clear answer to the question, "improvements toward what?" Corresponding strategic planning for how to move ahead is also essential to maximize the efficiency and effectiveness of quality improvements.

The second major dimension, Culture, emerged from recognition that quality service requires an integration of individual actions housed within a cultural context, one devoted to the customer. For a quality service program to succeed, the cultural expectations and values of the organization need to be consistent with the expectations and values of high quality customer service. The cultural values need to be transmitted to staff, which may require adaptive cultural leadership (Kotter & Heskett, 1992; Kreitner & Kinicki, 1998). Speaking to this point, the Chancellor of the university from which the present study emerged said that his biggest challenge was "changing direction: overcoming inertia" (Muhammad, 1997). In fact, perhaps the strongest argument against pursuing a
quality strategy is that it requires a change in the status quo, which can be unwelcome to staff for a wide variety of reasons. These culture issues must be explicitly and carefully addressed for an organization's movement toward TQM to succeed, and thus they were included among our dimensions of organizational performance quality.

Table 1. Present Study's Dimensions of Organizational Performance (OP) Quality, Integrated with the Baldridge Criteria and Suggestions from Prior Extensive OP Literature Reviews

Present Study:
  Planning: Mission clarity; Strategic planning/goal setting
  Culture: Customer orientation; Quality improvement leadership
  Management of the Workforce: Workforce quality and training; Support for work and personal life quality; Workforce motivation; Rewards/recognition; Participative leadership/decision-making
  System Processes: Within-unit coordination; Between-unit coordination; Fairness and treatment of others
  Performance Measurement and Feedback
  Outcomes: Job satisfaction; Organizational commitment; Locus of control/empowerment

Baldridge Criteria (Widely Accepted in the TQM Literature):
  Strategic planning; Leadership; Customer focus and satisfaction; Human resources development and management (HRD&M); Process management; Information and analysis; Business results

Corts and Gowing (1992) (Integration of 28 OP Research Studies):
  Planning; Client orientation; Human resources management; Team focus; Empowerment; Evaluating

Porras and Robertson (1992) (Integration of OP Research in Handbook of I/O Psychology):
  Vision; Goals and strategies; Management style; Organizing structure; Interaction processes; Informal networks; Reward systems

Burke and Litwin (1992) (Integration of 29 Research Studies of Organizational Performance and Change):
  Mission and strategy; Leadership; Culture; Management practices; Systems; Structure; Climate; Individuals' skills/abilities; Individuals' needs/values; Motivation

McCarthy (1995) (Integration of OP Literature for Innovation-oriented Organizations):
  Planning (vision/mission; goals/strategies); Workforce quality; Motivation; Management of workforce; Management processes; Cross-functional coordination; Communication systems
Management of the Workforce was a major dimension in all of our sources, and it encompasses several sub-dimensions. Each of the sub-dimensions listed in Table 1 has a considerable literature of its own supporting its contribution to the quality of an organization's performance. Any of the four review studies listed in Table 1 can connect the reader to these literatures.

Process management is another major component of TQM and the Baldridge criteria that was further supported by the organizational performance literature. Thus, System Processes (and its sub-dimensions) was identified by our study as a key dimension of service quality.

Also central to TQM is the development of systems which effectively measure and communicate process and results information. Corts and Gowing (1992) supported this evaluation component, and it was a major part of Burke and Litwin's (1992) "systems" and McCarthy's (1995) "communication systems" categories. A number of other researchers and practitioners have particularly emphasized the importance of such measurement and feedback issues (e.g., Daniels, 1994; Gilbert, 1978). Consequently, the next dimension of organizational performance quality we identified was Performance Measurement and Feedback.

It could be argued that positive changes in the above dimensions define improved service quality. Yet without tying changes in these dimensions to customer-based outcomes, a tautology may result. Complicating matters is the fact that "there are serious measurement problems associated with even standard indices of [organizational] performance ... [and] exogenous disturbances can significantly obscure the link between work processes and organizational outcomes" (Hackman & Wageman, 1995, p. 323). Even subjective data, such as customer satisfaction, may not be readily available. Our study faced such an unavailability of data from external customers. However, we did identify three outcomes affecting those internal to the organization: job satisfaction, organizational commitment, and locus of control/empowerment perceptions. These variables have been described as the most immediate outcomes of the above process dimensions, and as direct mediators of external outcomes. Thus, these three variables can be used to independently gauge the impact of improvements on the other dimensions of organizational quality, and they are likely indicators of progress in improving external outcomes. Additional support (beyond that specified in Table 1) for selecting these outcomes is plentiful (e.g., Allen & Meyer, 1990; Gatewood & Riordan, 1997; Griffin & Bateman, 1986).

In summary, we utilized findings from the psychology and management literatures, along with the Baldridge criteria, to develop an integrated structure for describing organizational performance/service quality. The value of this integrated structure was that it offered an answer to the fundamental question, "what is it we need to know to adequately understand the quality of our service management?" Further, this structure directly guided our development of a service quality
assessment. More specifically, we developed a measurement battery to address the component dimensions of this structure.
DEVELOPMENT OF A SURVEY OF ORGANIZATIONAL PERFORMANCE/SERVICE QUALITY

Now that we had these dimensions to help us answer the question "what do we want to know?," the next task was to adopt or develop survey items for each dimension. Items from published measures were adopted for job satisfaction (Shouksmith, Pajo, & Jepsen, 1990), organizational commitment (Meyer, Allen, & Smith, 1993), and locus of control (empowerment perceptions) (Spector, 1988). However, we were unable to find published surveys which sufficiently addressed the remaining dimensions. Thus, we developed items to address them ourselves.

The first stage of our item development was to identify a range of issues which adequately represent our conceptualization of each respective dimension. Since these dimensions were formulated from an integration of literature reviews, we noted the key issues of relevance to each dimension identified by these reviews. Additional ideas for issues to address came as a by-product of our futile search for published surveys of these dimensions: we came across a few surveys which addressed issues that we thought might supplement our initial ideas about one or more dimensions. We then examined the range of issues we had identified for each dimension and looked for other important aspects meriting representation.

Next, we turned to crafting the language and format of our items. All the items were written as statements to be rated on a 5-point Likert agree-disagree scale. We sought to maximize clarity by simplifying the wording of each item, and we tailored key terminology to be consistent with that common within this university. We further edited, for brevity, to keep each item as concise as possible. A Grammatik analysis of our items' sentence complexity yielded a score of 43 (where 100 equals very complex), and its analysis of the items' vocabulary complexity yielded a score of 1 (on the same 100-point scale). After all editing was completed, we pilot tested the full set of items with 10-15 individuals and utilized their feedback to make final clarifying edits on a handful of items. We then prepared our survey instrument using the items presented in Appendix A.

The order of the items we developed was randomized within blocks. This assured that the items for each dimension/sub-dimension were evenly distributed throughout each portion of the survey. It also eliminated the possibility of inducing within-dimension consistency by grouping a dimension or sub-dimension's items together, and it eliminated the likelihood of any other potential order-based differential effects emerging across these dimensions. For each of the three previously published measures we adopted, the respective set of items remained grouped together in their originally published order, since that was the format in which they were previously validated. These three sets of measures followed the portion of the survey containing the items we had developed.
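The block randomization described above lends itself to a short illustration. The article does not spell out the exact procedure, so the following is only a minimal sketch of one way to spread each sub-dimension's items evenly across a survey and then shuffle within blocks; the function name, the four-block split, and the seed are hypothetical choices for illustration, not details from the original study.

```python
import random

def block_randomized_order(items_by_scale, n_blocks=4, seed=None):
    """Spread each scale's items as evenly as possible across n_blocks survey
    sections, then shuffle the item order within every section, so that no
    scale's items end up clustered together."""
    rng = random.Random(seed)
    blocks = [[] for _ in range(n_blocks)]
    for scale_items in items_by_scale.values():
        shuffled = rng.sample(scale_items, len(scale_items))
        for i, item in enumerate(shuffled):
            blocks[i % n_blocks].append(item)   # round-robin across blocks
    for block in blocks:
        rng.shuffle(block)                      # randomize order within each block
    return [item for block in blocks for item in block]

# Hypothetical usage with two of the Appendix A scales:
survey_order = block_randomized_order({
    "Mission Clarity": ["I have a good understanding of our mission.",
                        "I know what is expected of me by those I serve."],
    "Customer Orientation": ["Staff strive to provide high quality service to customers.",
                             "Staff are fully empowered to resolve customer problems."],
}, seed=1)
```

The round-robin assignment is what keeps items from any one scale from bunching together, which is the property the authors describe; the within-block shuffle then removes any fixed pattern in how the scales alternate.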
Table 2. Reliability of the Measures for each Dimension of Organizational Performance Quality

                                                  # of items   Coefficient alpha
Planning
  Mission clarity                                     10            .90
  Strategic planning/goal setting                     14            .93
Culture
  Customer orientation                                17            .92
  Quality improvement leadership                      15            .95
Management of the workforce
  Workforce quality and training                       8            .87
  Support for work and personal life quality           8            .81
  Workforce motivation                                13            .88
  Rewards/recognition                                 12            .88
  Participative leadership/decision-making            13            .94
System processes
  Within-unit coordination                            18            .91
  Between-unit coordination                            5            .72
  Fairness and treatment of others                    15            .92
Performance measurement and feedback                  21            .94
Outcomes
  Job satisfaction                                    22            .84
  Organizational commitment
    Normative commitment                               6            .81
    Affective commitment                               6            .82
    Continuance commitment                             5            .72
  Locus of control (empowerment)                      16            .80
Two hundred and twenty-two non-faculty staff were recruited to voluntarily participate in an anonymous survey. One hundred and fifty-five completed all sections of the survey (69.8% response rate). Missing data on individual items were rare and unsystematic.
SURVEY RELIABILITIES

Fundamental to the utility of our survey is the reliability of its component measures. We examined coefficient alpha for each component measure. Using listwise deletion for missing data, the number of respondents utilized for each respective measure ranged from 139 to 152. Typically, the reliability of the measure for each sub-dimension was above 0.80, and it often exceeded 0.90. As Table 2 shows, the reliabilities of the measures we developed compared quite favorably to those for the previously published measures. In fact, 11 of our 13 measures had reliabilities above the highest reliability among the previously published measures (i.e., job satisfaction's 0.84), and the other two had reliabilities within the range of those for the previously published measures.
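To make the reliability analysis concrete, here is a minimal sketch of coefficient alpha computed with listwise deletion within a scale; the data file name and the item column names are hypothetical placeholders, not details of the original dataset.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Coefficient alpha for one scale; rows are respondents, columns are items."""
    data = items.dropna()                          # listwise deletion within the scale
    k = data.shape[1]                              # number of items in the scale
    item_variances = data.var(axis=0, ddof=1)      # variance of each item
    total_variance = data.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: columns mc01..mc10 hold the ten Mission Clarity ratings (1-5).
responses = pd.read_csv("staff_survey.csv")
mission_clarity = [f"mc{i:02d}" for i in range(1, 11)]
print(round(cronbach_alpha(responses[mission_clarity]), 2))
```

One caveat when reading Table 2 this way: alpha tends to rise with the number of items, which is worth keeping in mind when comparing, say, the 5-item between-unit coordination scale (.72) with the 21-item performance measurement and feedback scale (.94).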
CONCLUSIONS

This study offered three significant contributions. First was the extension of TQM into higher education. Second was the integration of the TQM/Baldridge criteria literature with the psychology and management literatures on organizational performance. This produced a unified framework of dimensions comprising organizational performance/service quality, including key human resource management processes, which may be useful for a variety of organizations. The third major contribution of this study was the development of a battery of measures with highly desirable psychometric properties for assessing an organization on each of the aforementioned dimensions via a staff survey.

While the reliabilities for each of the sub-scales on the survey offer very encouraging support for these measures, further evaluations are needed. Specifically, while this study established that each sub-scale offers a reliable measure, further validity evidence would better clarify the meaning of what each measures. For instance, a factor analysis could empirically evaluate the dimensional structure of this battery's items. While the value of such an analysis is heightened by the high intercorrelations among a number of the proposed dimensions (see Appendices B-E), it requires a much larger sample than was available for this study. Potential validation strategies may also include an analysis of the empirical relations among the dimensions/sub-dimensions of this survey, as compared to theoretical models. Structural equation modeling might be particularly valuable, in that it can simultaneously evaluate our proposed factor/dimension structure as well as the relations among these dimensions and sub-dimensions. We hope to increase our sample size in future research and provide such analyses.

Nevertheless, our comprehensive battery provides a unique opportunity for further evaluation of these dimensions, including the eventual development of an empirically based model of their relations. These measures may even provide a basis for translating such a theoretical model into practical application. An organization could use this battery to identify which aspects of organizational performance quality most need improvement. Eventually, further model development has the potential to reveal how such needs affect the other dimensions which underlie quality, and thus would help guide prescriptions for improvement from assessments with this battery.
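As one concrete illustration of the factor-analytic validation proposed above, the sketch below runs an exploratory factor analysis over the locally developed items. The data file, the listwise deletion, and the 13-factor target (one factor per developed sub-dimension scale) are assumptions made purely for illustration, and, as the authors note, a much larger sample than the one reported here would be needed before such results could be trusted.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical layout: one column per locally developed item, listwise deleted.
items = pd.read_csv("staff_survey_items.csv").dropna()

# Eigenvalues of the item correlation matrix give a first, scree-style look at
# how many factors the battery can support.
corr = np.corrcoef(items.values, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]
print("Largest eigenvalues:", np.round(eigenvalues[:15], 2))

# Extract a 13-factor solution, one factor per developed sub-dimension scale.
# (The rotation argument requires scikit-learn >= 0.24.)
fa = FactorAnalysis(n_components=13, rotation="varimax", random_state=0)
fa.fit(items.values)
loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                        columns=[f"F{i + 1}" for i in range(13)])
print(loadings.round(2))
```

Confirmatory alternatives such as the structural equation modeling the authors mention would additionally require a dedicated SEM package and an explicit model of the relations among the dimensions and sub-dimensions.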
APPENDIX A

The items we developed for this survey instrument are presented below. The actual order of appearance of these 169 items was randomized within blocks in the instrument. All were rated using a 5-point Likert agree-disagree scale. The term "director" in these items is equivalent to what many organizations may call a manager, in that she/he has direct supervisory authority over the employees of the respective unit.
Use of these scales (in whole or part) for commercial purposes is strictly prohibited without written permission of one of the authors. We respectfully request that anyone using these scales for noncommercial research purposes simply notify us so that we may track their usage and expand our base of potential validity-related information.

I. Planning

A. Mission Clarity
1. I understand clearly how my performance affects the overall effectiveness of my unit.
2. Each staff member can answer the question "Who is it that my unit serves?"
3. I know what is expected of me by those I serve.
4. I have a good understanding of our mission.
5. Staff have a good understanding of our unit's mission.
6. My unit has a mission statement that sets priorities for our work.
7. Each staff member's job is clearly linked to our unit's mission.
8. The director behaves in ways that are consistent with our unit's mission.
9. The director clearly communicates our mission to the staff.
10. The director tells staff how work contributes to our unit's mission/goals.

B. Strategic Planning/Goal Setting
1. Standards for our work are based on the expectations of those we serve.
2. We are continually setting higher standards for the unit's performance.
3. Staff members know exactly what is expected of them to do high quality work.
4. There are quality goals that require my unit to strive continuously for excellence.
5. The director clearly communicates our goals and priorities.
6. There are service goals aimed at exceeding customer expectations.
7. Information about the external environment is used in the strategic planning process.
8. Day-to-day activities are guided by a long-term vision of where our unit should be.
9. The director sets challenging and attainable performance goals.
10. Clear standards are the basis against which we compare our performance.
11. Short-term and long-term quality improvement goals are established and integrated into the overall strategic planning process.
12. Systematic procedures are used to evaluate the strategic planning process.
13. There is an established process for developing and updating quality improvement goals.
14. Individual goals and objectives for improving work are included in staff performance appraisals.
II. Culture

A. Customer Orientation
1. Staff strive to provide high quality service to customers.
2. Our services and work processes are designed to meet customer needs and expectations.
3. Activities that link directly to those we serve get first priority.
4. We continuously try to identify and solve problems not yet recognized by those we serve.
5. Staff are willing to make major changes in the way they do their work to improve services.
6. The director is held accountable for meeting the needs of customers.
7. Staff know our unit's approach for improving the quality of our work.
8. The quality of our work has a priority at least as high as budgetary considerations.
9. Staff are fully empowered to resolve customer problems.
10. Staff are responsible for generating new ideas and suggestions for improvement.
11. Staff are encouraged to be involved in the improvement planning process.
12. Staff often discuss how well we are meeting the needs and expectations of those we serve.
13. Staff are encouraged to take risks to improve their work.
14. Our main focus has not been simply to satisfy those higher up in the University chain of command.
15. There are specific plans for improving the quality of our work.
16. Each staff member has a specific plan for improving quality of his or her performance.
17. Stories and examples of successful improvements occurring at the University are shared among the staff.

B. Quality Improvement Leadership
1. The director understands the constraints affecting staff performance.
2. The director communicates the importance of high quality work to staff members.
3. Our policies keep quality a number one priority.
4. The director is open to change.
5. The director asks staff about ways to improve our work.
6. The director is actively involved in removing obstacles or barriers so we can be more effective.
7. The director inspires employees to take pride in their work.
8. The director demonstrates that quality is important in his or her daily activities.
9. The director communicates the importance of constant improvement.
10. The director follows up on staff suggestions for improving our services and work processes.
11. The director leads by example; that is, she/he practices what she/he preaches.
12. The director bases decisions primarily on facts and data rather than on opinions and feelings.
13. The director communicates a clear vision of what our unit can achieve in the future.
14. The director provides sufficient resources to promote quality improvement.
15. Resources are distributed in a way that helps achieve our long-term vision for what our unit should be like.

III. Management of the Workforce

A. Workforce Quality and Training
1. Staff have opportunities to participate in training that improves their performance.
2. The people that are hired are those with the best qualifications to do the job.
3. Staff receive the training they need to perform their jobs effectively.
4. Staff are provided with adequate training when new technologies and tools are introduced.
5. Staff are provided with training that enhances their career advancement opportunities.
6. Training programs are identified and developed based on a comprehensive assessment of the training needs.
7. Training plans are fully integrated into our overall strategy and planning for quality.
8. The effectiveness of the training and development opportunities is evaluated.

B. Support for Work and Personal Life Quality
1. The director understands and supports the staff's family and personal life responsibilities.
2. Staff are given the opportunity to work on flexible work schedules, when the job permits.
3. Staff who take advantage of family/personal life policies and benefits do not hurt their career opportunities.
4. Staff effectively balance their work and family/personal life responsibilities.
5. Staff members are provided with opportunities for personal growth and development.
6. Staff are comfortable discussing their needs and concerns with the director.
7. Programs are made available to help staff balance work and family responsibilities.
8. The director takes steps to minimize work-related stress.

C. Workforce Motivation
1. My work gives me a sense of accomplishment and pride.
2. Avoiding reprimands from the director is not the primary basis for my efforts.
3. This is a desirable place to work.
4. My job is challenging to me.
5. My coworkers consider me to be an energetic worker.
6. My job is motivating to me.
7. Staff consistently show high levels of initiative.
8. My job is fascinating to me.
9. My job energizes me.
10. My coworkers help keep up my motivation.
11. The director helps keep up my motivation.
12. Morale in my unit is good.
13. I am rarely frustrated by my job.

D. Rewards/Recognition
1. The director provides more positive than negative feedback about my performance.
2. Risk-taking is encouraged without fear of punishment for mistakes.
3. The director is fair in recognizing personal and team accomplishments.
4. The director personally recognizes contributions of individuals on a regular basis.
5. High performing staff receive respect and recognition from their coworkers.
6. High performing staff are rewarded with more challenging and satisfying work.
7. Creativity and innovation are rewarded.
8. Outstanding service to customers is recognized or rewarded.
9. High performing staff receive non-monetary rewards.
10. High performing staff get promoted.
11. Staff are asked about their preferences for different types of rewards.
12. Pay raises depend on how well staff perform their job.

E. Participative Leadership/Decision-Making
1. Staff are encouraged to use personal initiative and independent judgment.
2. Staff can make and implement decisions that improve their work.
3. Staff take initiative to provide input regarding important decisions.
4. The director involves staff in the design of their jobs.
5. The director helps staff become aware of customers' needs and expectations.
6. Staff are kept up-to-date about issues that affect them.
7. Staff have strong feelings of personal empowerment and team ownership of work processes.
8. Staff have good opportunities to use their leadership skills.
9. Decisions are made by those most knowledgeable about the work being done.
10. The director asks staff for ideas and opinions before making important work decisions.
11. There is no fear of punishment for staff who speak their minds.
12. Overall, I am satisfied with the style of management of this unit.
13. Staff participation in decision-making is an important priority of the director.

IV. System Processes

A. Within-Unit Coordination
1. We continually try to respond more quickly to those we serve.
2. I think of myself as contributing to the whole unit rather than just doing a single job.
3. Staff consistently look for ways to improve how we work, even when things are running well.
4. When mistakes occur we determine why they happened.
5. We seldom miss deadlines.
6. We rarely make the same mistake twice.
7. Staff have the appropriate supplies, materials, and equipment to perform their jobs well.
8. Staff are working in the same direction with a unified effort.
9. Division of responsibilities minimizes duplication of efforts and maximizes efficiency.
10. Coordination of efforts is efficient.
11. Every important problem is being worked on by at least one staff member.
12. In order to become more efficient we have reorganized the way we do our work.
13. We regularly find ways to increase our efficiency.
14. There are constant efforts to reduce our costs without sacrificing quality.
15. Efforts are made to simplify procedures and reduce regulations.
16. The distribution of work among staff is well balanced.
17. Staff are involved in team(s) that study and suggest ways to improve how we work.
18. Work is organized so as to eliminate bureaucratic roadblocks.
B. Between-Unit Coordination
1. My unit has good working relationships with other units in the university.
2. Coordination between my unit and organizations external to the university is efficient.
3. The director promotes effective communication with other units in the university.
4. Coordination between units throughout the university is highly efficient and productive.
5. I rarely encounter problems in my job that are due to the performance of other units in the university.

C. Fairness and Treatment of Others
1. Staff maintain high ethical standards.
2. The director treats staff with respect.
3. The director maintains high ethical standards.
4. Staff treat the director with respect.
5. I rarely feel like anyone is questioning how I follow the procedures of my job.
6. Staff receive adequate help from the director when there are work-related problems.
7. There is trust between staff and the director.
8. Training and career development opportunities are allocated fairly.
9. Disputes or conflicts are resolved fairly.
10. The director provides staff with constructive suggestions to improve their job performance.
11. When disagreements occur, ideas are criticized, not people.
12. Staff are kept well informed on all issues affecting their jobs.
13. Disciplinary actions are applied fairly to staff.
14. The methods that we use to resolve conflicts are satisfactory.
15. The director provides fair and accurate ratings of staff performance.
V. Performance Measurement and Feedback
1. Staff use suggestions from customers to improve the quality of our services.
2. Staff are held accountable for their responsibilities.
3. Those we serve can easily give us feedback or make suggestions for improvement.
4. Staff review their own work to decide what changes or improvements are needed.
5. Information about problems is given to those most responsible so improvements can be made.
6. We have shortened the time it takes to gather and distribute information.
7. The quality of the services we provide is compared to the quality of services from other successful organizations.
8. We constantly seek feedback from those we serve so necessary adjustments can be made.
9. Information collected from those we serve is integrated with other data and used to improve the quality of our services.
10. Progress toward customer service goals is reported to relevant staff and used to plan for improvements.
11. The director is well informed about how those we serve perceive our service.
12. Staff are regularly asked to identify areas needing improvement.
13. There are effective ways to link feedback from those we serve to someone who can act on this.
14. We have quality assurance systems focusing on preventing future problems.
15. The director regularly reviews and evaluates our progress toward meeting our goals and objectives.
16. Staff are provided feedback about whether they are doing a good job.
17. Outcome/result measures are used to assess our overall performance.
18. Those we serve can formally evaluate the quality of the services we provide.
19. Assessments of the quality of our work processes and customer service are conducted regularly.
20. Diverse groups participate in the development of our performance measures.
21. Human resource policies and practices are evaluated and improved on an ongoing basis.
APPENDIX B

Correlations of Planning and Culture Indices with Complete Set of Indices
N = 154-155 (pairwise deletion of missing data)

                                                  Planning      Culture
                                                  (1)    (2)    (3)    (4)
Planning
 (1) Mission clarity                              1.0    .80    .82    .83
 (2) Strategic planning/goal setting              .80    1.0    .89    .88
Culture
 (3) Customer orientation                         .82    .89    1.0    .89
 (4) Quality improvement leadership               .83    .88    .89    1.0
Management of the workforce
 (5) Workforce quality and training               .67    .77    .77    .79
 (6) Support work and personal life quality       .77    .68    .78    .84
 (7) Workforce motivation                         .74    .67    .72    .75
 (8) Rewards/recognition                          .68    .80    .78    .83
 (9) Participative leadership/decision-making     .79    .79    .88    .90
System processes
 (10) Within-unit coordination                    .84    .84    .88    .88
 (11) Between-unit coordination                   .61    .66    .69    .68
 (12) Fairness and treatment of others            .79    .77    .84    .90
 (13) Performance measurement and feedback        .79    .94    .90    .87
Outcomes
 (14) Job satisfaction                            .68    .54    .67    .72
 Organizational commitment
 (15) Normative commitment                        .28    .33    .31    .31
 (16) Affective commitment                        .33    .28    .33    .35
 (17) Continuance commitment                      -.15   -.09   -.18   -.18
 (18) Locus of control (empowerment)              .23    .22    .28    .32

*Data in bold are non-significant, italicized correlations are p < .05, and all others are p < .01.
APPENDIX C

Correlations of Management Indices with Complete Set of Indices
N = 154-155 (pairwise deletion of missing data)

                                                  Management of the workforce
                                                  (5)    (6)    (7)    (8)    (9)
Planning
 (1) Mission clarity                              .67    .77    .74    .68    .79
 (2) Strategic planning/goal setting              .77    .68    .67    .80    .79
Culture
 (3) Customer orientation                         .77    .78    .72    .78    .88
 (4) Quality improvement leadership               .79    .84    .75    .83    .90
Management of the workforce
 (5) Workforce quality and training               1.0    .73    .61    .71    .74
 (6) Support work and personal life quality       .73    1.0    .71    .75    .86
 (7) Workforce motivation                         .61    .71    1.0    .68    .77
 (8) Rewards/recognition                          .71    .75    .68    1.0    .81
 (9) Participative leadership/decision-making     .74    .86    .77    .81    1.0
System processes
 (10) Within-unit coordination                    .73    .80    .74    .78    .86
 (11) Between-unit coordination                   .63    .62    .56    .61    .67
 (12) Fairness and treatment of others            .72    .88    .73    .83    .90
 (13) Performance measurement and feedback        .77    .69    .66    .81    .80
Outcomes
 (14) Job satisfaction                            .58    .74    .74    .64    .75
 Organizational commitment
 (15) Normative commitment                        .35    .28    .44    .38    .33
 (16) Affective commitment                        .34    .35    .56    .32    .39
 (17) Continuance commitment                      -.15   -.24   -.34   -.15   -.22
 (18) Locus of control (empowerment)              .34    .33    .46    .33    .34

*Data in bold are non-significant, italicized correlations are p < .05, and all others are p < .01.
APPENDIX D

Correlations of System Processes Indices and Performance Measurement and Feedback Index with Complete Set of Indices
N = 154-155 (pairwise deletion of missing data)

                                                  System processes       Meas. and FB
                                                  (10)   (11)   (12)     (13)
Planning
 (1) Mission clarity                              .84    .61    .79      .79
 (2) Strategic planning/goal setting              .84    .66    .77      .94
Culture
 (3) Customer orientation                         .88    .69    .84      .90
 (4) Quality improvement leadership               .88    .68    .90      .87
Management of the workforce
 (5) Workforce quality and training               .73    .63    .72      .77
 (6) Support work and personal life quality       .80    .62    .88      .69
 (7) Workforce motivation                         .74    .56    .73      .66
 (8) Rewards/recognition                          .78    .61    .83      .81
 (9) Participative leadership/decision-making     .86    .67    .90      .80
System processes
 (10) Within-unit coordination                    1.0    .68    .86      .84
 (11) Between-unit coordination                   .68    1.0    .71      .65
 (12) Fairness and treatment of others            .86    .71    1.0      .77
 (13) Performance measurement and feedback        .84    .65    .77      1.0
Outcomes
 (14) Job satisfaction                            .68    .59    .80      .56
 Organizational commitment
 (15) Normative commitment                        .34    .26    .32      .34
 (16) Affective commitment                        .39    .26    .38      .31
 (17) Continuance commitment                      -.14   -.09   -.17     -.12
 (18) Locus of control (empowerment)              .27    .18    .32      .23

*Data in bold are non-significant, italicized correlations are p < .05, and all others are p < .01.
APPENDIX E

Correlations of Outcomes Indices with Complete Set of Indices
N = 154-155 (pairwise deletion of missing data)

                                                               Outcomes
                                                  (14)   (15)   (16)   (17)   (18)
Planning
 (1) Mission clarity                              .68    .28    .33    -.15   .23
 (2) Strategic planning/goal setting              .54    .33    .28    -.09   .22
Culture
 (3) Customer orientation                         .67    .31    .33    -.18   .28
 (4) Quality improvement leadership               .72    .31    .35    -.18   .32
Management of the workforce
 (5) Workforce quality and training               .58    .35    .34    -.15   .34
 (6) Support work and personal life quality       .74    .28    .35    -.24   .33
 (7) Workforce motivation                         .74    .44    .56    -.34   .46
 (8) Rewards/recognition                          .64    .38    .32    -.15   .33
 (9) Participative leadership/decision-making     .75    .33    .39    -.22   .34
System processes
 (10) Within-unit coordination                    .68    .34    .39    -.14   .27
 (11) Between-unit coordination                   .59    .26    .26    -.09   .17
 (12) Fairness and treatment of others            .80    .32    .38    -.17   .32
 (13) Performance measurement and feedback        .56    .34    .31    -.12   .23
Outcomes
 (14) Job satisfaction                            1.0    .42    .54    -.25   .42
 Organizational commitment
 (15) Normative commitment                        .42    1.0    .68    -.19   .32
 (16) Affective commitment                        .54    .68    1.0    -.38   .47
 (17) Continuance commitment                      -.25   -.19   -.37   1.0    -.34
 (18) Locus of control (empowerment)              .42    .32    .47    -.34   1.0

*Data in bold are non-significant, italicized correlations are p < .05, and all others are p < .01.
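For readers who want to reproduce matrices like those in Appendices B-E from their own survey data, the following is a minimal sketch using pairwise deletion; the file name and column layout are hypothetical, and the significance step simply computes a two-tailed p-value for each pair from that pair's own sample size.

```python
import pandas as pd
from scipy import stats

# Hypothetical layout: one column per scale score (the 18 indices), with missing
# values wherever a respondent skipped a scale.
scores = pd.read_csv("staff_scale_scores.csv")

# Pairwise deletion: pandas computes each correlation from only the respondents
# who have values on both scales involved.
r = scores.corr(method="pearson")

def pairwise_p(x: pd.Series, y: pd.Series) -> float:
    """Two-tailed p-value for the Pearson correlation of one pair of columns."""
    ok = x.notna() & y.notna()
    return stats.pearsonr(x[ok], y[ok])[1]

p = pd.DataFrame({a: {b: pairwise_p(scores[a], scores[b]) for b in scores.columns}
                  for a in scores.columns})
print(r.round(2))
print((p < .05).sum())  # how many correlations involving each index reach p < .05
```

Pairwise deletion keeps the per-correlation N as large as possible, which is why the appendices report N = 154-155 rather than a single listwise N.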
REFERENCES

Allen, N. J., & Meyer, J. P. 1990. The measurement and antecedents of affective, continuance and normative commitment to the organization. Journal of Occupational Psychology, 63: 1-18.
Brigham, S. 1994a. Introduction. 25 Snapshots of a movement: Profiles of campuses implementing CQI. Washington, DC: The American Association for Higher Education Continuous Quality Improvement Project.
Brigham, S. 1994b. Introduction. CQI 101: A first reader for higher education. Washington, DC: The American Association for Higher Education Continuous Quality Improvement Project.
Burke, W. W., & Litwin, G. H. 1992. A causal model of organizational performance and change. Journal of Management, 18: 523-545.
Cardy, R. L., & Dobbins, G. H. 1996. Human resources management in a total quality organizational environment: Shifting from a traditional to a TQHRM approach. Journal of Quality Management, 1: 5-20.
Corts, D. B., & Gowing, M. K. 1992. Dimensions of effective behavior: Supervisors, managers, and executives. Washington, DC: United States Office of Personnel Management.
Daniels, A. C. 1994. Bringing out the best in people: How to apply the astonishing power of positive reinforcement. New York, NY: McGraw-Hill.
Dobyns, L., & Crawford-Mason, C. 1991. Quality or else: The revolution in world business. Boston, MA: Houghton Mifflin.
Gatewood, R. D., & Riordan, C. M. 1997. The development and test of a model of total quality: Organizational practices, TQ principles, employee attitudes and customer satisfaction. Journal of Quality Management, 2: 41-65.
Gilbert, T. F. 1978. Human competence: Engineering worthy performance. New York: McGraw-Hill.
Gregory, D. J., & Park, R. K. 1992. Occupational study of federal executives, managers, and supervisors: An application of the Multipurpose Occupational Systems Analysis Inventory-Close-Ended (MOSAIC). Washington, DC: United States Office of Personnel Management.
Griffin, R. W., & Bateman, T. S. 1986. Job satisfaction and organizational commitment. In C. L. Cooper & I. Robertson (Eds.), International review of industrial and organizational psychology. New York: Wiley.
Hackman, J. R., & Wageman, R. 1995. Total quality management: Empirical, conceptual, and practical issues. Administrative Science Quarterly, 40: 309-342.
Kanter, R. M., & Brinkerhoff, D. 1981. Organizational performance: Recent developments in measurement. Annual Review of Sociology, 7: 321-349.
Kendall, S. D., & Stern, N. A. 1997. A national assessment against the Baldridge criteria. Paper presented at the meeting of the Society for Industrial and Organizational Psychology, St. Louis, MO (April).
Kotter, J. P., & Heskett, J. L. 1992. Corporate culture and performance. New York, NY: The Free Press.
Kreitner, R., & Kinicki, A. 1998. Organizational behavior (4th ed.). Boston, MA: McGraw-Hill.
Lee, P. 1993. Some parallels of TQM and assessment. VCCA Journal, 8 (1): 12-14.
McCarthy, P. M. 1995. Measuring R&D organizational performance. Washington, DC: U.S. Office of Personnel Management (Personnel Resources Development Center). Unpublished manuscript.
McCarthy, P. M., Fleishman, E. A., & Holt, R. 1997. Where's the evidence that managers' long-term performance really does synchronize with established organizational performance dimensions? Proceedings: Institute of Behavioral and Applied Management, 5: 85-91.
Meyer, J. P., Allen, N. J., & Smith, C. A. 1993. Commitment to organizations and occupations: Extension and test of a three-component conceptualization. Journal of Applied Psychology, 78 (4): 538-551.
Muhammad, L. 1997. Busy man on and off campus: Chancellor at IUS sets pace for academic, civic leadership. The Louisville Courier-Journal (February 6, pp. E1 and E4).
Porras, J. I., & Robertson, P. J. 1992. Organizational development: Theory, practice, and research. In M. D. Dunnette & L. M. Hough (Eds.), Handbook of industrial and organizational psychology (2nd ed., vol. 3): 719-822. Palo Alto, CA: Consulting Psychologists Press.
Seymour, D. 1995. The Academic Quality Consortium Baldridge report: Lessons learned by nine colleges and universities undertaking self-study with the Malcolm Baldridge National award criteria. Washington, DC: American Association for Higher Education (April).
Shouksmith, G., Pajo, K., & Jepsen, A. 1990. Construction of a multidimensional scale of job satisfaction. Psychological Reports, 67: 355-364.
Spector, P. E. 1988. Development of the work locus of control scale. Journal of Occupational Psychology, 61: 335-340.
Waldman, D. A., & Gopalakrishnan, M. 1996. Operational, organizational, and human resource factors predictive of customer perceptions of service quality. Journal of Quality Management, 1: 91-107.
Wilson, L. A., & Durant, R. F. 1994. Evaluating TQM: The case for a theory driven approach. Public Administration Review, 54 (2): 137-146.
Yudof, M. G., & Busch-Vishniac, I. J. 1996. Total quality: Myth or management in universities? Change, November/December: 19-27.