Omega 39 (2011) 435–446
Knowledge dissemination in operations management: Published perceptions versus academic reality

Jack R. Meredith, Michelle D. Steward, Bruce R. Lewis

Wake Forest University, Schools of Business, P.O. Box 7285, Winston-Salem, NC 27109, USA
Article history: Received 17 January 2010; accepted 11 October 2010; processed by Associate Editor Teo; available online 16 October 2010.

Abstract
The channels for knowledge generation and dissemination in the business disciplines are many: presenting research at conferences, writing books, distributing working papers, offering insights in society newsletters, giving invited talks, publishing studies in academic journals, and many other venues, including even blogs and perhaps Facebook. But the most important venue is probably published research in "top-level" academic journals. In the discipline of operations management, many studies and lists have been published that attempt to determine which of these journals are supposedly the "top," according to citation analyses, the opinions of recognized experts, author affiliations, bibliometric studies, or other approaches. These lists may then, in turn, be used in different degrees to evaluate research. However, what really counts is what the academic institutions actually use for guidance in evaluating faculty research. Based on a new source of ranking data from AACSB-accredited schools, we compare published journal-ranking studies against the practice of academe to determine the degree to which the studies reflect academic "reality." We present rankings of OM journals based on this new source of data and on an aggregate of the stream of published studies, and evaluate their consistency.
Keywords: Operations management; Journal ranking; Journal perception; OM discipline
1. Introduction

The channels available for disseminating academic knowledge are extensive, ranging from informal "blogs" these days to formal presentations to published books, monographs, and articles in journals. In the field of business, the most important, arguably, are published articles in "top" journals (rather than grants, as might be the case in medicine or engineering), since these are frequently the most important basis for promotion and tenure (P&T) decisions. But beyond P&T and annual evaluations, such top publications are also often the basis for:
- research awards by universities, scholarly societies, governmental academies, and journal publishers;
- grants from federal, state, and private agencies such as the National Science Foundation;
- nominations for high-profile chairs, professorships, research grants, and fellowships;
- offers for joining, or perhaps taking joint professorships at, prestigious universities such as Yale, Harvard, MIT, and others;
- candidacy for high-visibility governmental positions, such as advisory positions to the President or Governor, secretarial positions with state or federal organizations such as the Treasury, or membership on the Federal Reserve or Council of Economic Advisors;
- and even, at least partially, the requisite, ubiquitous business school/program rankings.
Clearly, the role of top publications in each of these sets of decisions will be different, but their influence is often substantial. For example, annual evaluations may include aspects of teaching, service to the department (and school, university, discipline, and perhaps even local community), research in progress, and publications, while P&T may be based much more heavily on publications. Yet, universities are aware of the legal complications that can arise when a faculty member receives outstanding annual evaluations only to be turned down for P&T some years later. Hence, for schools that wish to place an emphasis on top journal publications for P&T, such publications must be weighted heavily in the annual evaluations as well [1,2]. It is also clear that there are more stakeholders in top journal publications than just the university. Attaining a reputation as the "national expert" in a particular field or topic can lead to such accolades as being appointed Assistant Secretary of the Treasury, Economic Czar of the State, or Chair of the Federal Bankruptcy Committee for the XYZ Corporation. Such eminent appointments are a great boon to a university, enhancing its reputation, bringing in donations and endowments, increasing its student applications,
and many other benefits. Hence, the university also has an interest in its faculty publishing in highly recognized journals, and may thus emphasize particular journals or fields over others. Understandably then, everyone from University Presidents to Deans to Assistant Professors is interested in knowing which journals the university considers to be the "top" journals, and especially the top journals in each professor's own field or discipline. To determine these top journals, and the rankings of other journals in a field, numerous authors have used approaches such as citation impact scores, surveys of recognized scholars, bibliometric analyses, author affiliation indexes, and other techniques. These studies commonly identify the same general set of journals, but their rankings may differ considerably. In the discipline of interest here – operations management – Holsapple and Lee-Post [3] detail many of these approaches and identify examples of their use. However, with such a range of rankings derived by a variety of methods, it is not clear whose ranking to use, or even which method to rely upon. All of the methods have some justification, but also some weaknesses and limitations, again well described in Holsapple and Lee-Post [3] and Lewis [4]. For that matter, it is not only the methods that confound the studies but a range of other factors as well, such as the geographic region (e.g., US versus Europe), the time period considered (e.g., 4 years versus 25 years), the date of publication, the selection of scholars, and the set of journals under consideration. In this study, we analyze all the ranking studies of operations management journals since 1990, and then compare them to a new source of data: the official in-house journal lists that AACSB-accredited business schools use to help evaluate faculty publications. This is a source of data that has not been used previously in OM journal ranking studies. (Although we asked schools for journal lists "that are used for evaluating publications," we cannot state how, or even if, they were actually used. Our research here informs those evaluating research, but does not report any criteria for that evaluation, such as "three top-level journal articles" or "six solid publications.") Certainly not all schools formally and explicitly use in-house lists to evaluate research. However, schools often do create formal in-house lists for evaluation. Van Fleet et al. [5, p. 839] suggest that "These rankings are designed to reduce difficulties in evaluating quality and to help faculty members identify target journals." Additionally, Vokurka [6, p. 345] argues that "The rankings of journals are important to academics because promotion and tenure decisions are based to a large extent on publication achievements. These decisions are based primarily on the journals in which research is published." Our focus is on identifying the most credible ranking studies, in the sense of conforming to the reality of academic guidelines provided by schools that have journal rating lists, and then commenting on these studies and their various approaches.
2. Background and data

The literature on journal evaluation in operations management has a long history, and it is diverse. We include here a discussion of these studies, as well as a description of the AACSB survey of journal lists used as a guide for making academic promotion, tenure, and salary decisions.

2.1. The journal ranking studies

Published studies ranking OM journals over two decades – from 1990 to 2009 – were selected. Table 1 presents the 12 articles, the nature of the data used in each, and the specific table within each study from which the ranking data were extracted. Seven of the studies compiled journal rankings based on perception scores derived from survey data, three based their analysis on citation data, and two utilized data from other sources: an author affiliation index and behavior-based publication counts. For each study, the final, overall ranking of quality was used. For cases where the ranking article ranked other dimensions, such as journal relevance, we selected the data corresponding to the article's overall measure of quality. As can be seen, seven studies were published in the Journal of Operations Management, three in Omega, and one each in Interfaces and Manufacturing & Service Operations Management. Eleven of the studies were published in the 14-year span between 1996 and 2009, about one such article per year, showing the increased interest in publication venues since the mid-1990s. Two of the senior authors each published a second such study; the eight other studies were published by different author teams.
2.2. The AACSB-accredited school survey

In order to capture the reality of the stature of OM journals, we collected the official lists used to help assess faculty research output at AACSB-accredited schools [4]. The AACSB was selected because it is recognized as the major accreditation entity for business schools worldwide. As Van Fleet et al. [5] note: "A list provides an explicit measure of how a department values research outlets." Moreover, such lists reflect the current state of the standings among competing journals in academic practice. To our knowledge, this data source has not been previously used in ranking OM journals. An email was sent to AACSB-accredited universities requesting a copy of the official journal list, if one was used, for evaluating faculty publications at their school. The email request was initially sent in November 2006, followed by reminders in each of the two following months. Of the 545 institutions receiving the request, 206 responded, a 38% response rate. Table 2 offers general demographics of both the responding schools and the entire population of AACSB-accredited schools. To determine the representativeness of the set of respondents, the demographics of the responding schools were statistically compared to those of the entire AACSB population (see Table 2). For the categorical measures (i.e., affiliation, geographic region, degree level offered, and mission priority), one-sample chi-square tests were run, but only one test (public/private affiliation) was even marginally significant at the 0.05 level. One-sample t-tests were utilized for the continuous variables (essentially, different versions of school size), but no significant differences were found at the 0.05 level. Our conclusion from these tests was that the respondent set was representative of the entire AACSB population. While our focus is on OM journal rankings, not on the type of evaluative mechanisms that different schools use, we did find it of interest to compare responding schools that have internal lists with the entire AACSB-accredited school population. Using the same one-sample tests on the same demographic variables as above, we found that, compared to the population of AACSB-accredited schools, the schools that have internal lists have a statistically significantly larger faculty (98 versus 76 on average) and undergraduate enrollment, more often name research as their "mission priority" (19% versus 12%), and, accordingly, are less often private (19% versus 32%). These differences, and some others, are presented in Table 3. Of the responding schools, 83 (40%) provided their formal target journal lists.
Table 1. The 12 published journal ranking articles.

Year | First author | Journal | Article title | Data type | Table | Column
1991 | Barman | Journal of Operations Management | An empirical assessment of the perceived relevance and quality of POM-related journals by academicians | Perception scores | 2 | Mean—quality ratings
1996 | Goh | Omega—International Journal of Management Science | An empirical assessment of influences on POM research | Citation scores | 4 | Normalized total—citation based
1996 | Vokurka | Journal of Operations Management | The relative importance of journals used in operations management research—a citation analysis | Citation rankings | 8 | Overall ranking—citations
1997 | Goh | Journal of Operations Management | Evaluating and classifying POM journals | Citation rankings | 2 | Overall rank—age-adjusted across base years
1999 | Soteriou | Journal of Operations Management | Assessing production and operations management related journals: the European perspective | Perception scores | 4 | Mean—quality ratings
2000 | Donohue | Omega—International Journal of Management Science | A multi-method evaluation of journals in the decisions and management sciences by US academics | Perception scores | 3 | Perceived quality rating (POM interest area)
2001 | Barman | Journal of Operations Management | Perceived relevance and quality of POM journals: a decade later | Perception scores | 4 | Mean—quality ratings
2005 | Gorman | Manufacturing & Service Operations Management | OM forum: evaluating operations management-related journals via the author affiliation index | Author affiliation index | 2 | Author affiliation index
2005 | Olson | Interfaces | Top 25-school professors rate journals in operations management and related fields | Perception scores | 2 | Mean quality rating—from top-25 business school professors
2007 | Theoharakis | Journal of Operations Management | Insights into factors affecting POM journal evaluation | Perception scores | 4 | Mean journal quality rating—worldwide
2007 | Zsidisin | Journal of Operations Management | Criteria development and assessment of purchasing and supply management journals | Perception scores | 5 | Mean rating—based on: "articles published in this journal regularly meet my journal evaluation criteria"
2009 | Holsapple | Omega—International Journal of Management Science | Behavior-based analysis of knowledge dissemination channels in operations management | Behavior: publication counts | 4 | Intensity score
Table 2. Demographics of AACSB schools: respondents vs. the population.

Demographic characteristic | Respondents n (%) | Population n (%) | One-sample test
Affiliation | | | χ² = 3.86, p = 0.050
  Private | 51 (25.37) | 169 (31.83) |
  Public | 150 (74.63) | 362 (68.17) |
Geographic region | | | χ² = 3.09, p = 0.213
  North America | 187 (90.78) | 446 (90.28) |
  Europe | 13 (6.31) | 23 (4.66) |
  Other | 6 (2.91) | 25 (5.06) |
Degree level offered | | | χ² = 4.73, p = 0.094
  Undergraduate only | 19 (9.79) | 37 (7.60) |
  Graduate only | 7 (3.61) | 35 (7.19) |
  Both | 168 (86.60) | 415 (85.22) |
Mission priority | | | χ² = 0.61, p = 0.894
  Teaching | 103 (53.09) | 250 (51.33) |
  Research | 21 (10.82) | 59 (12.11) |
  Teaching and research equal | 63 (32.47) | 157 (32.24) |
  Teaching, research and service equal | 7 (3.61) | 21 (4.31) |

Size | Respondents mean (std. dev.) | Population mean (std. dev.) | One-sample test
  Full-time equivalent faculty | 72.5 (44.12) | 76.7 (51.19) | t = 1.33, p = 0.184
  Undergrad enrolment—full time | 1818.2 (1472.16) | 1811.3 (1459.56) | t = 0.06, p = 0.949
  Graduate enrolment—full time | 243.5 (429.29) | 262.2 (404.87) | t = 0.58, p = 0.566
  Undergraduate degrees conferred | 413.7 (333.65) | 420.7 (355.53) | t = 0.29, p = 0.776
  Graduate degrees conferred | 178.9 (207.73) | 208.1 (282.13) | t = 1.86, p = 0.064

Notes: The n's for the respondents and for the population do not add to 206 and 545, respectively, due to the exclusion of missing data. p-values ≤ 0.05 are in bold.
Table 3. Demographics of respondent schools with an internal list vs. the population.

Demographic characteristic | Schools with an internal list n (%) | Population n (%) | One-sample test
Affiliation | | | χ² = 6.00, p = 0.014
  Private | 15 (18.99) | 169 (31.83) |
  Public | 64 (81.01) | 362 (68.17) |
Geographic region | | | χ² = 7.45, p = 0.042
  North America | 69 (83.13) | 446 (90.28) |
  Europe | 9 (10.84) | 23 (4.66) |
  Other | 5 (6.03) | 25 (5.06) |
Degree level offered | | | χ² = 6.71, p = 0.035
  Undergraduate only | 1 (1.37) | 37 (7.60) |
  Graduate only | 2 (2.74) | 35 (7.19) |
  Both | 70 (95.89) | 415 (85.22) |
Mission priority | | | χ² = 11.9, p = 0.008
  Teaching | 23 (31.50) | 250 (51.33) |
  Research | 14 (19.18) | 59 (12.11) |
  Teaching and research equal | 32 (43.84) | 157 (32.24) |
  Teaching, research and service equal | 4 (5.48) | 21 (4.31) |

Size | Schools with an internal list mean (std. dev.) | Population mean (std. dev.) | One-sample test
  Full-time equivalent faculty | 98.2 (44.78) | 76.7 (51.19) | t = 4.10, p < 0.001
  Undergrad enrolment—full time | 2614.0 (1749.03) | 1811.3 (1459.56) | t = 3.84, p < 0.001
  Graduate enrolment—full time | 357.7 (546.48) | 262.2 (404.87) | t = 1.47, p = 0.145
  Undergraduate degrees conferred | 618.3 (389.49) | 420.7 (355.53) | t = 4.21, p < 0.001
  Graduate degrees conferred | 261.7 (240.41) | 208.1 (282.13) | t = 1.88, p = 0.064

Notes: The n's for the respondent schools with an internal list and for the population do not add to 83 and 545, respectively, due to the exclusion of missing data. p-values ≤ 0.05 are in bold.
Table 4. Demographics of respondent schools with an OM list vs. all schools with an internal list.

Demographic characteristic | Schools with an OM list n (%) | Schools with an internal list n (%) | One-sample test
Affiliation | | | χ² < 0.01, p = 0.992
  Private | 7 (18.92) | 15 (18.99) |
  Public | 30 (81.08) | 64 (81.01) |
Geographic region | | | χ² = 0.03, p = 0.987
  North America | 31 (83.78) | 69 (83.13) |
  Europe | 4 (10.81) | 9 (10.84) |
  Other | 2 (5.41) | 5 (6.03) |
Degree level offered | | | χ² = 1.21, p = 0.271
  Undergraduate only | 0 (0.00) | 1 (1.37) |
  Graduate only | 2 (5.88) | 2 (2.74) |
  Both | 32 (94.12) | 70 (95.89) |
Mission priority | | | χ² = 1.03, p = 0.795
  Teaching | 8 (23.53) | 23 (31.50) |
  Research | 7 (20.59) | 14 (19.18) |
  Teaching and research equal | 17 (50.00) | 32 (43.84) |
  Teaching, research and service equal | 2 (5.88) | 4 (5.48) |

Size | Schools with an OM list mean (std. dev.) | Schools with an internal list mean (std. dev.) | One-sample test
  Full-time equivalent faculty | 98.3 (43.05) | 98.2 (44.78) | t = 0.01, p = 0.989
  Undergrad enrolment—full time | 2265.7 (1359.84) | 2614.0 (1749.03) | t = 1.43, p = 0.164
  Graduate enrolment—full time | 247.9 (191.53) | 357.7 (546.48) | t = 3.29, p = 0.002
  Undergraduate degrees conferred | 643.8 (384.27) | 618.3 (389.49) | t = 0.37, p = 0.714
  Graduate degrees conferred | 245.5 (168.24) | 261.7 (240.41) | t = 0.55, p = 0.583

Notes: The n's for the respondent schools with an OM list and all respondent schools with an internal list do not add to 37 and 83, respectively, due to the exclusion of missing data. p-values ≤ 0.05 are in bold.
This is a fairly large proportion of schools that have their own in-house lists. By comparison, a survey of Management departments [5] found that only 14% had internal journal lists, and a survey of Accounting departments [7] found that only 13% had
internal journal lists. The remainder of the respondents included 89 schools that indicated they did not have internally generated lists, 12 that stated they used external lists, such as the Financial
Times, and 22 that employed Cabell's Directory of Publishing Opportunities. As noted earlier, these lists reflect how journals may influence university decisions on annual faculty evaluations, tenure and promotion decisions, research awards, and other research-oriented activities. Since journals are generally ranked, or at least categorized, at individual schools by their perceived quality [5], the metrics derived from these lists serve as a reasonable depiction of journal standing from an "operational" standpoint. Of the 83 school lists submitted, 37 classified journals in graded tiers and specifically categorized OM journals in groups labeled as: operations, operations management, production and operations management, supply chain management, and logistics. These school lists were the ones used in our calculations. Table 4 offers demographic information on these 37 schools, with a comparison to the respondent schools with internal lists. There are no significant differences between the two groups.
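The one-sample chi-square test described above can be illustrated directly from the Table 2 counts. The following sketch (Python with SciPy; the paper does not state its software, so the tooling is an assumption) scales the population proportions to the respondent sample size to form the expected frequencies; with the public/private affiliation counts it should reproduce the reported χ² = 3.86 and p = 0.050.

```python
# One-sample chi-square test: respondent affiliation vs. AACSB population
# proportions (counts from Table 2; schools with missing data excluded).
from scipy.stats import chisquare

respondents = [51, 150]   # private, public (201 respondents with data)
population = [169, 362]   # private, public (531 schools with data)

n = sum(respondents)
expected = [n * c / sum(population) for c in population]  # scaled expectations

stat, p = chisquare(f_obs=respondents, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.3f}")  # chi-square = 3.86, p = 0.050
```

The same pattern applies to the other categorical rows of Tables 2-4; the continuous size variables were instead compared with one-sample t-tests against the population means.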
3. Results

We next describe our methodology and assessment of statistical reliability. We follow that with a description of the results of our analysis of published journal-ranking studies, an analysis of our AACSB journal-ranking data, and a comparison of the two.

3.1. Journal ranking studies

To assess the psychometric soundness of the published ranking studies, a reliability analysis was performed. Reliability is concerned with the consistency of a measure in different contexts and over successive trials [8]. Various methods can be used to evaluate reliability; at some level, each method correlates scores from one source with scores from another source, with higher correlations demonstrating more consistency, or reliability, across the sources. The reliability of the rankings of the OM journal studies was assessed by correlating the ratings of all possible pairings of the 12 studies listed in Table 1. The nonparametric Spearman rho correlation coefficient was used due to the ordinal nature of the data and the small number of journals common between studies. The results of this reliability analysis are reported in Table 5 [3,6,9,17-25]. The journal articles are arranged by the data source of each article (perception, citation, other). Of the 66 Spearman correlation coefficients shown in Table 5, 41 are statistically significant at the 0.05 level, with the majority of these significant beyond the 0.001 level. The magnitude of the statistically significant correlation coefficients ranged from 0.436 to 0.941. Pairs of articles within each data source type are significantly correlated with one another with rare exception, such as Zsidisin et al. [9], which is not significantly correlated with any other study. A possible reason for this is that Zsidisin et al. examine purchasing and supply chain management journals specifically, rather than the OM (or POM) journals that the other studies rank. It is also notable that the perception studies tend to be highly correlated with all the other studies.

3.2. OM journal categories

Our interest in validating the published OM journal ranking stream using the new data source of in-house journal lists actually used for evaluating faculty research requires that we identify which journals are specifically dedicated to OM. For our candidate list of journals that disseminate OM knowledge, we considered the set of journals that were included in both the published studies on the
dissemination of OM knowledge and the school lists of journals that are recognized as contributing to OM knowledge. To be conservative at this stage, we only required that a journal be mentioned in at least one published study and on at least one school's list. As seen in Table 6, this gave us a set of 71 journals; 7 of the 71 journals appeared in only one published study and on only one school list. The journals are listed in order of the number of published studies they appeared in, since the published studies preceded the more recent AACSB school lists. As can be seen, there is a close, but not perfect, correlation between the two lists. (The actual correlations are given later in the paper for the OM-dedicated journals.) Next, we followed Holsapple and Lee-Post's [3] approach to separating "OM-dedicated" journals from "Interdisciplinary" journals and reference-discipline ("Reference") journals. The contrast between OM-dedicated and interdisciplinary journals is relatively straightforward: whether virtually all the papers published in the journal are dedicated to OM topics or whether they range across various disciplines (business or even broader) with OM being only a portion of the subject matter. The contrast between OM-dedicated and reference journals is a bit more complex, since the reference discipline may be either a topic area (e.g., Economics, Engineering, Information Systems) or a methodology area (operations research, statistics). It was concluded that if the papers in a journal were related to one of the reference-discipline topic areas rather than OM, then it was a reference journal. For the more complicated situation of a journal with methodological papers related to an OM topic, we classified the journal as OM-dedicated if the focus of the papers was on the OM topic, but as Reference Discipline if the focus was on methodological issues (e.g., operations research methodology, statistical methodology). These three categories (OM-dedicated, Reference, Interdisciplinary) are listed in the right column of Table 6; most of the classifications are obvious, but we describe our logic regarding some of the less obvious journals. We relied on five potential sources of evidence to help categorize the journals:
1. The specific Aims and Scope stated by the journal.
2. The type of society (e.g., engineering, economics, operations research, production research), if the journal is sponsored by such an institution.
3. The variety of membership groups within the society, if society sponsored.
4. The diversity of articles actually published in the journal.
5. Holsapple and Lee-Post's [3] categorization, if the journal appeared on their list. Since we refer to this article often, we abbreviate it H&L.
In order to illustrate our logic, we describe our categorization for eleven journals that were problematic to classify for one reason or another.

Journal of Manufacturing and Operations Management (#30): This was a publication of INFORMS (the Institute for Operations Research and the Management Sciences) that was discontinued many years ago; its articles were then folded into Management Science. We designated it as an OM-dedicated journal.

Production and Inventory Management Journal (#15): This journal was originally a publication of APICS, a practitioner society for production and inventory control professionals, but it ceased publication many years ago. APICS has recently reinstated the journal, but with a different orientation, and primarily as an electronic journal directed to P&IC professionals and managers. Given the large number of papers published previously in the journal that fell in the OM domain, we have designated this, at least in its original form, as an OM-dedicated journal.
Table 5. Reliability analysis: Spearman correlations between published study journal rankings, arranged by data type (perception, citation, other).

Study | Barman [18] | Soteriou [19] | Donohue [20] | Barman [21] | Olson [22] | Theoharakis [23] | Zsidisin [9] | Goh [24] | Vokurka [6] | Goh [25] | Gorman [17]
Soteriou [19] | 0.757 (17) 0.001
Donohue [20] | 0.779 (15) 0.001 | 0.546 (15) 0.035
Barman [21] | 0.904 (17) 0.001 | 0.775 (19) 0.001 | 0.868 (15) 0.001
Olson [22] | 0.728 (16) 0.001 | 0.510 (19) 0.026 | 0.830 (16) 0.001 | 0.834 (18) 0.001
Theoharakis [23] | 0.850 (9) 0.004 | 0.529 (10) 0.116 | 0.898 (8) 0.002 | 0.863 (10) 0.001 | 0.939 (11) 0.001
Zsidisin [9] | 0.318 (11) 0.340 | 0.150 (15) 0.594 | 0.261 (10) 0.467 | 0.379 (13) 0.201 | 0.102 (13) 0.741 | -0.132 (8) 0.756
Goh [24] | 0.799 (17) 0.001 | 0.793 (19) 0.001 | 0.629 (15) 0.012 | 0.771 (16) 0.001 | 0.282 (19) 0.243 | 0.667 (8) 0.071 | 0.178 (14) 0.543
Vokurka [6] | 0.792 (12) 0.002 | 0.378 (12) 0.225 | 0.705 (10) 0.023 | 0.802 (12) 0.002 | 0.374 (11) 0.258 | 0.635 (8) 0.091 | 0.503 (8) 0.204 | 0.633 (11) 0.036
Goh [25] | 0.752 (17) 0.001 | 0.646 (19) 0.003 | 0.648 (14) 0.012 | 0.679 (17) 0.003 | 0.312 (18) 0.208 | 0.550 (9) 0.125 | 0.134 (14) 0.648 | 0.941 (30) 0.001 | 0.627 (12) 0.029
Gorman [17] | 0.600 (15) 0.018 | 0.431 (17) 0.084 | 0.679 (15) 0.005 | 0.750 (17) 0.001 | 0.859 (23) 0.001 | 0.843 (11) 0.001 | 0.314 (14) 0.274 | 0.444 (16) 0.085 | 0.456 (11) 0.159 | 0.326 (17) 0.202
Holsapple [3] | 0.852 (16) 0.001 | 0.790 (19) 0.001 | 0.654 (15) 0.008 | 0.853 (18) 0.001 | 0.436 (22) 0.043 | 0.615 (11) 0.044 | 0.506 (13) 0.078 | 0.869 (19) 0.001 | 0.460 (11) 0.154 | 0.799 (19) 0.001 | 0.372 (21) 0.097

Notes: Each cell shows the Spearman correlation coefficient, the number of common journals (in parentheses), and the p-value. Studies are identified by first author and reference number. Perception data: [18], [19], [20], [21], [22], [23], [9]; citation data: [24], [6], [25]; other: [17], [3]. p-values ≤ 0.05 are in bold.
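The pairwise reliability computation behind Table 5 is simple to sketch: restrict each pair of studies to the journals they both rank, then correlate the two rank vectors. The fragment below (Python with SciPy) uses invented ranks for a handful of journal abbreviations, not the data of any of the twelve studies.

```python
# Spearman reliability between two ranking studies, computed over the
# journals common to both (illustrative ranks, not data from the 12 studies).
from scipy.stats import spearmanr

study_a = {"JOM": 1, "IJPR": 2, "IJOPM": 3, "POM": 4, "JSCM": 5}
study_b = {"JOM": 1, "POM": 2, "MSOM": 3, "IJOPM": 4, "IJPR": 5}

common = sorted(set(study_a) & set(study_b))  # journals ranked by both studies
rho, p = spearmanr([study_a[j] for j in common],
                   [study_b[j] for j in common])
print(f"rho = {rho:.3f} over {len(common)} common journals (p = {p:.3f})")
```

Note that the number of common journals varies from pair to pair (the parenthesized counts in Table 5), which is one reason a nonparametric rank correlation is the appropriate choice here.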
Production and Inventory Management Review (and APICS News) (#71): This was another publication of APICS that was discontinued, but it was more of a newsletter than a journal. This publication and the P&IM Journal were occasionally mis-referenced, one for the other.

Interfaces (#7): This is another INFORMS journal and is more directed to practice and application than INFORMS's academic journals. As might be expected, the articles cover the full range of business disciplines, leading us to classify it, as did H&L, as Interdisciplinary.

Journal of Manufacturing Systems (#28): This is a publication of the Society of Manufacturing Engineers. Although the journal has occasional articles about the management of manufacturing, the focus is largely on engineering issues; hence we classified it as a reference-discipline journal, Engineering in particular. The society has numerous local chapters and eight "technical communities," but nothing related to management. The "aims and scope" of the
journal does identify such OM issues as strategy, production planning, quality, and the supply chain as topics of interest, but the great majority of topics are related to manufacturing engineering.

International Journal of Production Economics (#19): Although H&L classify this as an OM journal, close inspection of the heavily mathematical papers published in the journal leads to the conclusion that this is a reference-discipline journal, Economics in this case. The aims and scope of the journal note that it focuses on "the interface between engineering and management," but primarily through the "economic environment."
Table 6. The 71 journals both ranked in published OM studies and listed in school lists. Columns: number of the 12 published studies listing the journal; number of the 37 school lists listing it; category (D = OM-dedicated, R = reference discipline, I = interdisciplinary).

Journal | Published studies | School lists | Category
1. Journal of Operations Management | 12 | 28 | D
2. International Journal of Production Research | 12 | 24 | D
3. Management Science | 12 | 24 | I
4. International Journal of Operations and Production Management | 12 | 22 | D
5. Decision Sciences | 12 | 20 | I
6. European Journal of Operational Research | 12 | 18 | R
7. Interfaces | 11 | 17 | I
8. IIE Transactions | 11 | 15 | R
9. Operations Research | 10 | 24 | R
10. OMEGA—International Journal of Management Science | 10 | 23 | I
11. Journal of the Operations Research Society | 10 | 18 | R
12. Naval Research Logistics | 10 | 18 | R
13. Journal of Supply Chain Management | 10 | 17 | D
14. Computers and Operations Research | 9 | 16 | R
15. Production and Inventory Management Journal | 9 | 9 | D
16. Production and Operations Management | 8 | 28 | D
17. Computers and Industrial Engineering | 8 | 10 | R
18. Harvard Business Review | 7 | 1 | I
19. International Journal of Production Economics | 6 | 17 | R
20. Annals of Operations Research | 5 | 12 | R
21. IEEE Transactions on Engineering Management | 5 | 5 | R
22. Manufacturing and Service Operations Management | 4 | 19 | D
23. Operations Research Letters | 4 | 11 | R
24. Sloan Management Review | 4 | 1 | I
25. Strategic Management Journal | 4 | 1 | R
26. Journal of Business Logistics | 3 | 16 | D
27. Transportation Science | 3 | 16 | D
28. Journal of Manufacturing Systems | 3 | 6 | R
29. Mathematics of Operations Research | 3 | 6 | R
30. Journal of Manufacturing and Operations Management | 3 | 2 | D
31. Business Horizons | 3 | 1 | I
32. Journal of Quality Technology | 2 | 4 | D
33. Industrial Engineering | 2 | 3 | R
34. Quality Progress | 2 | 3 | D
35. Simulation | 2 | 3 | R
36. Journal of the American Statistical Association | 2 | 2 | R
37. Technometrics | 2 | 2 | R
38. California Management Review | 2 | 1 | I
39. IEEE Transactions on Systems, Man, Cybernetics | 2 | 1 | R
40. Journal of Management | 2 | 1 | R
41. Journal of Scheduling | 2 | 1 | D
42. International Journal of Physical Distribution and Logistics Management | 1 | 13 | D
43. International Journal of Logistics Management | 1 | 10 | D
44. Production Planning and Control | 1 | 8 | D
45. Supply Chain Management: An International Journal | 1 | 7 | D
46. INFORMS Journal on Computing | 1 | 6 | R
47. International Journal of Flexible Manufacturing Systems | 1 | 5 | D
48. International Transactions in Operational Research | 1 | 5 | R
49. Logistics and Transportation Review | 1 | 5 | D
50. International Journal of Operations and Quantitative Management | 1 | 4 | D
51. International Journal of Service Industry Management | 1 | 4 | D
52. Journal of Heuristics | 1 | 4 | R
53. Mathematical Programming | 1 | 4 | R
54. Networks | 1 | 4 | R
55. Quality Management Journal | 1 | 4 | D
56. Supply Chain Management Review | 1 | 4 | D
57. Communications of the ACM | 1 | 3 | R
58. Journal of Global Optimization | 1 | 3 | R
59. SIAM Review | 1 | 3 | R
60. International Journal of Purchasing and Materials Management | 1 | 2 | D
61. Journal of Purchasing and Supply Management | 1 | 2 | D
62. Quality Magazine | 1 | 2 | D
63. Service Industries Journal | 1 | 2 | D
64. TQM Magazine | 1 | 2 | D
65. American Journal of Mathematics and Management Sciences | 1 | 1 | R
66. Engineering Management Journal | 1 | 1 | R
67. IEEE Transactions on Reliability | 1 | 1 | R
68. Journal of the ACM | 1 | 1 | R
69. Manufacturing Review | 1 | 1 | D
70. Mathematical and Computer Modelling | 1 | 1 | R
71. Production and Inventory Management Review | 1 | 1 | D
European Journal of Operational Research (#6): This journal publishes both articles directed toward extending the theory of operations research and applications across multiple areas, far beyond business, such as agriculture, engineering, and government. H&L classified the journal as Interdisciplinary, probably because of its range of applications, but inspection of its aims and scope ("papers that contribute to the methodology of operational research (OR) and to the practice of decision making"), as well as the papers actually published, leads us to classify it as a reference-discipline journal, namely, Operations Research.

Omega: The International Journal of Management Science (#10): This journal is rather eclectic in the papers it publishes, but notes in its aims and scope that it publishes in the "specific fields or functions of management," and specifically with implications for the practice of management rather than purely theoretical articles. This focus on the implications for practice was emphasized in recent articles and editorials regarding the journal's audience and the nature of the types of problems authors might address [10-12]. Inspection of the articles appearing in the journal confirms that it covers all aspects of management, such as accounting, information systems, and OM. We hence designated this as an Interdisciplinary journal, which agrees with the H&L categorization.

Naval Research Logistics (#12): As stated in the journal's "Description," the journal focuses on "operations research, applied statistics, and general quantitative modeling," with special application to logistics. The papers reflect this orientation accurately, and we thus designated this as a reference-discipline journal, in this case both Operations Research and Statistics. H&L came to the same conclusion.

Decision Sciences (#5): This journal, the "face" of the Decision Sciences Institute, used to be a favorite for OM researchers to publish in when there were no OM-dedicated journals (i.e., prior to 1980). The society, which has 20 "interest areas," only one of which is OM, clearly states in its description that it publishes in "all the functional areas of business," and it does indeed publish across the spectrum of business disciplines; nevertheless, the journal still seems to carry its earlier "image" among academics of being an OM journal. Part of the reason may be that OM academics still represent a large interest area in the institute (24% of the membership as of 2009). The journal has always been an interdisciplinary journal, as H&L classified it, and so did we.

Management Science (#3): This, like Decision Sciences, is also a mainstream journal of a society, INFORMS in this case. And it too publishes across a wide range of disciplines, not just those limited to business. Its editorial statement notes that its articles address management issues with tools from foundational fields (e.g., economics, statistics, psychology, operations research), as well as multidisciplinary research. The institute comprises ten "societies," of which one is Manufacturing and Service Operations Management, constituting somewhat less than 10% of the INFORMS membership (as of 2009). And like Decision Sciences, the articles range across a spectrum of domains (including the military, government, and others). Again, like H&L, we classified this journal as Interdisciplinary.

Clearly, many journals have contributed to the dissemination of OM knowledge over the years. This is not unlike other business disciplines, which also develop and benefit from work published in a range of interdisciplinary and reference-discipline journals. However, not all such journals would be considered journals dedicated to a specific business discipline, and that is also the case with OM.
Hence, we have selected those journals shown as "OM-dedicated" in Table 6 and reproduced them in Table 7. As can be seen in Table 7, three OM-dedicated journals (JOM, IJPR, IJOPM) were listed in every ranking study, whereas no journal was listed by every single school. This may suggest that a bit of tailoring occurs in creating school lists of OM-dedicated journals to match faculty research interests. All OM-dedicated journals appeared at least once in both the collection of ranking studies and the school lists.
Table 7. The 30 OM-dedicated journals.

Journal | No. of times listed in 12 published studies | No. of times listed in 37 school lists
Journal of Operations Management | 12 | 28
International Journal of Production Research | 12 | 24
International Journal of Operations and Production Management | 12 | 22
Journal of Supply Chain Management | 10 | 17
Production and Inventory Management Journal (a) | 9 | 9
Production and Operations Management | 8 | 28
Manufacturing and Service Operations Management | 4 | 19
Journal of Business Logistics | 3 | 16
Transportation Science | 3 | 16
Journal of Manufacturing and Operations Management (a) | 3 | 2
Journal of Quality Technology | 2 | 4
Quality Progress | 2 | 3
Journal of Scheduling | 2 | 1
International Journal of Physical Distribution and Logistics Management | 1 | 13
International Journal of Logistics Management | 1 | 10
Production Planning and Control | 1 | 8
Supply Chain Management: An International Journal | 1 | 7
International Journal of Flexible Manufacturing Systems | 1 | 5
Logistics and Transportation Review | 1 | 5
International Journal of Operations and Quantitative Management | 1 | 4
International Journal of Service Industry Management | 1 | 4
Quality Management Journal | 1 | 4
Supply Chain Management Review | 1 | 4
International Journal of Purchasing and Materials Management | 1 | 2
Journal of Purchasing and Supply Management | 1 | 2
Quality Magazine | 1 | 2
Service Industries Journal | 1 | 2
TQM Magazine | 1 | 2
Manufacturing Review | 1 | 1
Production and Inventory Management Review (a) | 1 | 1

(a) These publications were discontinued; P&IMJ has recently been resurrected.
3.3. Final journal rankings

Finally, we proceeded to rank the journals. The 37 tiered school lists for OM journals collected from AACSB-accredited schools were used to compute a composite ranking [13] for each journal. Initially, three obvious measures were calculated: (1) the percentage of times each journal was listed in the top tier across schools, (2) the percentage of times in the top two tiers, and (3) the percentage of times in any tier. However, because the number of tiers differed among the schools in the sample, a more sophisticated fourth measure – a weighted mean percentile score – was then computed for each journal based on its assignment in each school's graded tiers. This score accounted for the actual number of tiers within each school list, the number of journals in each tier, the tier placement of each journal, and the number of schools listing that journal. All journals in the same tier at a given school were given the same percentile measure (the mean for that tier) for that school. These mean percentiles were then aggregated across the schools in the sample by creating an average of the mean percentiles for each journal. The final weighted average mean percentile score was calculated by multiplying the average mean percentile by the number of schools listing that journal in one of their tiers. This group of four school-list metrics, shown in Table 8, reflects how journals are actually judged and used in practice.
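To make the composite metric concrete, the sketch below implements one plausible reading of the weighted average mean percentile score. The within-tier percentile convention is an assumption (the text does not give the exact formula), and the tiered school lists shown are invented, not actual AACSB data.

```python
# Sketch of the weighted average mean percentile score. Each school list is a
# sequence of tiers, best first. Journals in a tier share that tier's mean
# percentile; a journal's percentiles are averaged across the schools listing
# it, and the average is then weighted by the number of listing schools.
from collections import defaultdict

school_lists = [  # hypothetical tiered lists
    [["JOM", "POM"], ["IJPR", "MSOM"], ["PPC", "QMJ", "SCMR"]],
    [["JOM"], ["POM", "IJOPM"], ["IJPR"]],
]

percentiles = defaultdict(list)
for tiers in school_lists:
    n = sum(len(tier) for tier in tiers)  # journals graded by this school
    position = 0
    for tier in tiers:
        # assumed convention: percentile of list position k (1 = best) is
        # (n - k + 1) / n; the tier's mean is averaged over its positions
        tier_mean = sum((n - k + 1) / n
                        for k in range(position + 1, position + len(tier) + 1)) / len(tier)
        for journal in tier:
            percentiles[journal].append(tier_mean)
        position += len(tier)

scores = {}
for journal, pcts in percentiles.items():
    average = sum(pcts) / len(pcts)        # average mean percentile
    scores[journal] = average * len(pcts)  # weighted by no. of listing schools

for journal in sorted(scores, key=scores.get, reverse=True):
    print(f"{journal}: {scores[journal]:.3f}")
```

Under this reading, a journal placed highly by many schools dominates one placed highly by only a few, which matches the separation visible in Table 8.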
Table 8. Rankings from the AACSB school lists data of the 30 focal journals.

Overall ranking | Journal | Weighted average mean percentile score | % listed in any tier | % listed in top tier | % listed in top 2 tiers
1 | Journal of Operations Management | 21.9693 | 75.68 | 64.86 | 75.68
2 | Production and Operations Management | 15.7971 | 75.68 | 29.73 | 62.16
3 | International Journal of Production Research | 13.1545 | 64.86 | 18.92 | 51.35
4 | Manufacturing and Service Operations Management | 10.6754 | 51.35 | 29.73 | 40.54
5 | International Journal of Operations and Production Management | 9.4535 | 59.46 | 13.51 | 35.14
6 | Journal of Business Logistics | 9.0677 | 43.24 | 21.62 | 32.43
7 | Transportation Science | 7.7144 | 43.24 | 10.81 | 32.43
8 | Journal of Supply Chain Management | 5.0965 | 45.95 | 2.70 | 18.92
9 | International Journal of Physical Distribution and Logistics Management | 4.6064 | 35.14 | 5.41 | 16.22
10 | International Journal of Logistics Management | 4.0541 | 27.03 | 2.70 | 18.92
11 | Production and Inventory Management Journal | 3.9122 | 24.32 | 5.41 | 10.81
12 | Production Planning and Control | 3.3259 | 21.62 | 5.41 | 13.51
13 | Supply Chain Management: An International Journal | 2.3941 | 18.92 | 0.00 | 10.81
14 | Journal of Quality Technology | 2.2922 | 10.81 | 2.70 | 8.11
15 | Quality Management Journal | 2.0168 | 10.81 | 2.70 | 8.11
16 | Logistics and Transportation Review | 1.9153 | 13.51 | 2.70 | 8.11
17 | International Journal of Flexible Manufacturing Systems | 1.7355 | 13.51 | 0.00 | 5.41
18 | International Journal of Service Industry Management | 1.3168 | 10.81 | 0.00 | 5.41
19 | Supply Chain Management Review | 1.0511 | 10.81 | 0.00 | 5.41
20 | Quality Progress | 1.0422 | 8.11 | 0.00 | 5.41
21 | International Journal of Purchasing and Materials Management | 0.9500 | 5.41 | 0.00 | 2.70
22 | International Journal of Operations and Quantitative Management | 0.8381 | 10.81 | 0.00 | 2.70
23 | TQM Magazine | 0.5688 | 5.41 | 0.00 | 2.70
24 | Journal of Manufacturing and Operations Management | 0.5000 | 5.41 | 0.00 | 0.00
24 | Journal of Scheduling | 0.5000 | 2.70 | 0.00 | 0.00
26 | Service Industries Journal | 0.4929 | 5.41 | 0.00 | 2.70
27 | Journal of Purchasing and Supply Management | 0.4292 | 5.41 | 0.00 | 2.70
28 | Quality Magazine | 0.2948 | 5.41 | 0.00 | 0.00
29 | Manufacturing Review | 0.1806 | 2.70 | 0.00 | 0.00
30 | Production and Inventory Management Review | 0.0595 | 2.70 | 0.00 | 0.00
However, the weighted average mean percentile score is the most nuanced of the four metrics in that it takes into consideration not only a given journal's tier placement at each school but also the number of tiers at the school and the number of journals graded by the school. Furthermore, in this metric each journal's score is weighted by the number of schools that graded that journal. We believe that this is the best school-list metric available for estimating a journal's actual standing, in that it is based on the most information. Table 8 presents the final ranking of the 30 OM journals from the AACSB data based on their weighted average mean percentile scores; the other scores (percentages of tier inclusion) are also shown in the table. In order to verify the bona fides of these rankings, we correlated them with the average of the journal ranks across the published OM journal ranking articles listed earlier. The correlation coefficient of 0.697 (p ≤ 0.001) with the published ranking aggregate shows a strong association, indicating that the rankings from the school lists are very consistent with the findings in the published stream. Additionally, we correlated the school rankings with each of the types of ranking studies (perception, citation, other). While each type of ranking correlated strongly with the school lists (reality), perception studies were slightly stronger in association (0.771, p ≤ 0.001). Table 9 presents the correlations. The rankings of OM-dedicated journals from the school lists and the published ranking studies are presented side by side in Table 10. The ranking from the published ranking studies was established by using a double-weighted calculation. First, the rank for each journal in each study was divided by the number of journals ranked in that study. The underlying basis for this adjustment is to take into account the number of journals ranked in each study (i.e., a journal that ranked number 5 in a study of 200 journals should get a better ranking than a journal that ranked 5 in a study of 10 journals). Second, the average of these adjusted ranks
was computed across studies and divided by the number of studies in which each journal appeared. This adjustment factored in the number of times each journal appeared in the collection of studies (i.e., a journal that was included in 10 studies presumably has more visibility and impact than one that was present in only one study). This final double-weighted average is the determinant of the journal’s overall rank (the smaller the double-weighted average the better the rank). The primary reason for discrepancies between the aggregate rankings from the published studies versus the AACSB-accredited school lists is believed to be the timing of the two sets of data, with the AACSB data reflecting publication reality in early 2007 whereas the published studies go back almost two decades.
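A minimal sketch of this double-weighted aggregation follows, with invented study rankings rather than the rankings of the twelve published studies: each rank is scaled by the length of its study's list, the scaled ranks are averaged, and that average is divided once more by the journal's number of appearances, so that widely ranked journals fare better.

```python
# Double-weighted average rank across published studies (smaller is better).
# The two study rankings below are illustrative, not the paper's data.
studies = [
    {"JOM": 1, "IJPR": 2, "POM": 3, "IJOPM": 4},
    {"IJOPM": 1, "JOM": 2, "MSOM": 3, "POM": 4, "IJPR": 5, "JSCM": 6},
]

journals = {j for study in studies for j in study}
scores = {}
for j in journals:
    adjusted = [study[j] / len(study) for study in studies if j in study]
    scores[j] = (sum(adjusted) / len(adjusted)) / len(adjusted)  # second weighting

for j in sorted(scores, key=scores.get):
    print(f"{j}: {scores[j]:.4f}")
```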
4. Discussion

The comparison conducted in this study of actual lists used to assess faculty research and the published ranking stream is both needed (given that examining measures is a norm in scientific inquiry) and valuable, given the importance attached to publishing in the "top" journals in evaluation, promotion, tenure, award, nomination, and other such high-profile decisions. Several points of interest arise from the results of comparing the published stream of OM ranking papers to the actual in-house lists used by AACSB-accredited schools to evaluate faculty research. First, while the aggregate ranking from the published studies is significantly (and positively) correlated with the in-house lists, there is some dissimilarity in the rankings, even for the top journals. As noted earlier, the major reason is believed to be the difference in time periods between the two data sources, on average about a decade.
Table 9. Correlations between sources of journal rankings. Each cell shows the Spearman correlation coefficient, the number of journals (in parentheses), and the p-value.

Source | School lists | Published studies—all | Published studies—perception | Published studies—citation
Published studies—all | 0.697 (30) 0.001 | | |
Published studies—perception | 0.771 (23) 0.001 | 0.962 (23) 0.001 | |
Published studies—citation | 0.771 (10) 0.011 | 0.796 (10) 0.006 | 0.886 (6) 0.019 |
Published studies—other | 0.747 (13) 0.003 | 0.842 (13) 0.002 | 0.673 (10) 0.033 | 0.600 (5) 0.285
Table 10. Ranking of 30 journals: school lists versus published ranking studies.

Journal | School lists | Published ranking studies
Journal of Operations Management | 1 | 1
Production and Operations Management | 2 | 3.5
International Journal of Production Research | 3 | 2
Manufacturing and Service Operations Management | 4 | 5
International Journal of Operations and Production Management | 5 | 3.5
Journal of Business Logistics | 6 | 9
Transportation Science | 7 | 8
Journal of Supply Chain Management | 8 | 6
International Journal of Physical Distribution and Logistics Management | 9 | 15
International Journal of Logistics Management | 10 | 10
Production and Inventory Management Journal | 11 | 7
Production Planning and Control | 12 | 22
Supply Chain Management: An International Journal | 13 | 17
Journal of Quality Technology | 14 | 16
Quality Management Journal | 15 | 25
Logistics and Transportation Review | 16 | 17
International Journal of Flexible Manufacturing Systems | 17 | 29
International Journal of Service Industry Management | 18 | 22
Supply Chain Management Review | 19 | 19
Quality Progress | 20 | 20
International Journal of Purchasing and Materials Management | 21 | 21
International Journal of Operations and Quantitative Management | 22 | 30
TQM Magazine | 23 | 26
Journal of Manufacturing and Operations Management | 24 | 11
Journal of Scheduling | 24 | 13
Service Industries Journal | 26 | 24
Journal of Purchasing and Supply Management | 27 | 14
Quality Magazine | 28 | 27
Manufacturing Review | 29 | 27
Production and Inventory Management Review | 30 | 12
For instance, the journal Production and Operations Management has substantially improved in stature since it began publication in the early 1990s, but it always takes time for a new journal to establish its reputation. Similarly, the even more recently initiated journal Manufacturing and Service Operations Management, sponsored by INFORMS, is ranked 4th in the AACSB list and 5th in the published studies list, reflecting the same delay in acceptance for new journals. And the British journal International Journal of Operations and Production Management, whose stature appears to have improved considerably over time in the published studies, suffers slightly in the
AACSB list, which is more heavily weighted with American schools (31 of the 37 schools with tiered OM lists). Other effects may also be seen in the data. For example, the large discrepancy in the published vs. AACSB rankings for the Production and Inventory Management Journal reflects the loss of stature when an organization decides to terminate a journal; and like the difficulty of establishing the stature of a new journal, the falloff in stature of a defunct journal is equally slow, this one having fallen only 4 ranks in a decade or so. Also, there may be a quantity effect, with the International Journal of Production Research, which is published 24 times a year with a large number of papers in each issue, holding a high ranking on both lists. And the long-established Journal of Business Logistics, ranked 6th in the AACSB list but only 9th in the published studies list, may attest to the value of keeping a journal publishing in its area of strength (in this case, an area that has come into prominence in the operations field: supply chains and logistics) for an extended duration. On the other hand, the Journal of Supply Chain Management shows the opposite effect, holding the rank of 6th in the published studies list but only 8th in the AACSB list, possibly due to the many changes in the name of this journal over the years and the confusion this generates. Since this study is the first to present and compare rankings of OM-dedicated journals from this new source of AACSB in-house lists and published journal ranking studies, we can also get a sense of how OM-dedicated journals fare when ranked with related reference and interdisciplinary journals, as compared to ranking solely OM journals. Journal rankings offer one cue suggesting the value placed on different collections of knowledge and, whether explicitly or implicitly, provide incentives for scholars to publish in certain journals. If journals of distinct disciplines or research areas are combined, the focus and potential development of the fields involved become diluted. We thus suggest that there are benefits in separating journals into distinct disciplines and ranking journals within those disciplines. An academic discipline is "a branch of learning or scholarly instruction" [14]. As such, "academic disciplines have become authoritative communities of expertise" [15, p. 63]. The community of a discipline influences theory development and shapes the types of phenomena that scholars in that community attempt to understand. Journals are a vital component of the academic community as a key vehicle of communication among scholars. In fact, it has been claimed that a discipline's identity is primarily created by the journals that publish in that discipline [16]. As faculty are offered incentives for publishing in a field's top or premier journals (e.g., greater mobility, research awards, enhanced reputation, merit pay, etc.), clarity in identifying these top journals is important. One need only reflect on past tenure and promotion
discussions, annual review evaluations, faculty hiring decisions, or target journal selection processes for new manuscripts to see the potential impact that "the #1 journal in our field" or "one of the top three journals in our field" has in academia. Clearly, being the editor of such a journal is also a prestigious position, but it carries as well the responsibility to maintain and enhance the journal's reputation. More difficult is the position of editors of lesser journals, who are trying to move the status of the journal up during their tenure. We have already seen some approaches for doing this:

- publishing well-respected authors (if they can be enticed to submit to a less prestigious journal);
- selecting only high-quality papers to publish and simultaneously reducing the all-important acceptance (selectivity) rate (but at the risk of not having enough papers to fill all the issues);
- publishing a lot of papers every year, thereby increasing the journal's citation rate and popularity (but this conflicts with being more selective about quality);
- having a long-established history (but hard for an editor to affect);
- publishing on a regular, continuous basis rather than late or intermittently (a good policy for any journal, but sometimes out of the hands of the editor, especially if the journal is sponsored by a society);
- picking a niche area in an established field (e.g., consumer research in marketing, logistics or quality in operations, banking in finance, and "practice" or "methodology" in any management area) and aiming to be the top journal in this niche area. But will this detract from the ability of the journal to be recognized as one of a field's top disciplinary journals?

5. Limitations

Our research offers guidance for schools that currently do not use formal lists to evaluate research but wish to do so in the future, and also for schools that wish to compare their list with those of other schools. However, the study does not aid schools that choose to use other metrics of quality besides journal rankings. Further, given that the focus of our paper is on the validation of the stream of published journal ranking studies by comparing them to the in-house journal lists used by AACSB-accredited schools, we do not know specifically how schools without such lists evaluate research. It would be interesting to know the variety of methods employed to evaluate faculty research, including the degree to which faculty are asked in the evaluation process to provide their own justification and support for the quality of their publications. Another limitation of our study involves the lack of knowledge that we have about schools that did not respond to the survey. We do not know whether non-responders are schools that tend to use lists or tend not to, or whether those schools share the same pattern of lists/no lists as the schools that responded. However, our sample does not differ significantly from the population on key demographic characteristics, suggesting a degree of representativeness. An additional point to note about the limitations of ranking studies in general, and ours in particular, is that schools have a variety of organizational structures, which may shape the lists used to evaluate research. For example, a school with a Department of Purchasing and Operations Management may include and rank journals differently than a Department of Management (or Decision Sciences) that has OM faculty as part of the department but not as the dominant group. While our study included only lists that were both tiered and specifically categorized OM journals in groups, even with these specifically designated groupings there could be some differences across departmental structures and faculty composition, and over time, as schools update their lists. Additionally, we do not have evidence that the lists provided are the only metric used to evaluate faculty research. Future studies might examine the extent to which journal rankings are the primary criterion for evaluating research at schools and the degree to which a portfolio of criteria is explicitly and formally considered in the evaluation process.

6. Conclusion
6. Conclusion

This paper relied on two sources of data to evaluate academic journals in operations management, arguably the most important channel for scholarly knowledge generation and dissemination. Using the published ranking studies, we categorized the studies by type: most were perception studies, a few were citation rankings, and two took other approaches, one using an author-affiliation index [17] and the other publication counts, a behavioral measure [3]. A new source of data, the in-house target journal lists of AACSB-accredited schools, was introduced in this study. These lists are used by universities for evaluation, promotion, tenure, and other research-oriented appraisal activities, as well as by others with an interest in identifying experts in particular business areas. We have shown that both sources tend to be relatively consistent and reliable.

Using the school lists and published studies, we identified 71 journals in which operations management academics tend to publish. Starting with this list, we then separated, based on a variety of measures, the OM-dedicated journals from those representing either reference disciplines for the field (primarily engineering, economics, and operations research) or the many broader, interdisciplinary journals, resulting in a final list of 30 OM-dedicated journals. Then, based on the in-house AACSB data, we derived a weighted mean percentile score reflecting how universities actually perceive the journals dedicated to the operations management field, and used it to rank the 30 OM-dedicated journals. We similarly derived one overall ranking list for the same journals from the published studies and compared the two lists, explaining the discrepancies between them. We concluded that much of the difference between the two lists is accounted for by inertia effects, that is, time lags between changes in journal policies and shifts in human perception. As a result, the most up-to-date data should be relied upon most heavily, along with, of course, the reality of what universities actually use for their research-oriented decisions. We suggest that updates of our AACSB data will be important for monitoring changes in the stature and respect of these journals in the future.
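To make the percentile-scoring and comparison steps concrete, the following is a minimal sketch, not the paper's exact procedure: it assumes a simple midpoint tier-to-percentile mapping and equal school weights, and the journal tiers and published ranks shown are invented for illustration.

from scipy.stats import spearmanr

def tier_percentile(tier, n_tiers):
    # Midpoint percentile of the band a tier occupies:
    # tier 1 of a 3-tier list -> 83.3, tier 2 -> 50.0, tier 3 -> 16.7.
    return 100.0 * (n_tiers - tier + 0.5) / n_tiers

# Each school's list: ({journal: tier}, number of tiers the list uses).
school_lists = [
    ({"JOM": 1, "POM": 1, "IJOPM": 2}, 3),
    ({"JOM": 1, "POM": 2, "IJOPM": 2}, 2),
]

journals = ["JOM", "POM", "IJOPM"]

# Mean percentile across the schools that list the journal
# (equal school weights assumed for simplicity).
school_scores = {}
for j in journals:
    scores = [tier_percentile(tiers[j], n)
              for tiers, n in school_lists if j in tiers]
    school_scores[j] = sum(scores) / len(scores)

# Rank by school-derived score (highest percentile = rank 1).
ordered = sorted(journals, key=lambda j: -school_scores[j])
school_ranks = {j: i + 1 for i, j in enumerate(ordered)}

# Hypothetical aggregate ranks distilled from the published studies.
published_ranks = {"JOM": 1, "POM": 2, "IJOPM": 3}

# Agreement between the two orderings.
rho, p = spearmanr([school_ranks[j] for j in journals],
                   [published_ranks[j] for j in journals])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

With real data, the same comparison could be run over all 30 OM-dedicated journals; a high rank correlation would indicate that the published studies track academic practice.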
Our results should be of interest to university and business school administrators and faculty, as well as to external stakeholders with an interest in identifying experts in areas such as quality, supply chains, scheduling, inventory management, process design, manufacturing management, and service management. To date, the great majority of published journal rankings for the operations management field have confounded these stakeholders by including inappropriate journals from either reference disciplines (e.g., engineering, economics, operations research, information systems, statistics) or interdisciplinary journals that span marketing, finance, behavior, ethics, and other business, and sometimes non-business, fields. An ethicist who is well-published in an interdisciplinary journal such as Business Horizons, or a statistician who is well-published in a reference-discipline journal such as the Journal of Heuristics, may not have any competence (or, probably, interest) in operations management or one of its sub-fields, such as scheduling.
Here we have identified 30 OM-dedicated journals, some representing the entire discipline and others focused on individual sub-areas, and ranked them in terms of how AACSB-accredited universities actually perceive their quality. This list should be used in the future to identify the journals relevant to the operations management field, whether for comparing regional perceptions of quality, analyzing coverage of individual topics, or any other particular interest of researchers in the field. And, of course, the list and rankings should be updated on a regular basis to stay current with the perceptions in the field.

References

[1] Diamond RM. Preparing for Promotion, Tenure, and Annual Review: A Faculty Guide. 2nd ed. Bolton, MA: Anker Publishing Company, Inc.; 2004.
[2] Fairweather JS. The mythologies of faculty productivity: implications for institutional policy and decision making. The Journal of Higher Education 2002;73(1):26–48.
[3] Holsapple CW, Lee-Post A. Behavior-based analysis of knowledge dissemination channels in operations management. OMEGA: The International Journal of Management Science 2009;38(3–4):167–78.
[4] Lewis BR. Judging the journals. BizEd 2008, November/December:42–45.
[5] Van Fleet DD, McWilliams A, Siegel DS. A theoretical and empirical analysis of journal rankings: the case of formal lists. Journal of Management 2000;26(5):839–61.
[6] Vokurka RJ. The relative importance of journals used in operations management research: a citation analysis. Journal of Operations Management 1996;14(4):345–55.
[7] Reinstein A, Calderon TG. Examining accounting departments' rankings of the quality of accounting journals. Critical Perspectives on Accounting 2006;17(4):457–90.
[8] Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika 1951;16(3):297–334.
[9] Zsidisin GA, Smith ME, McNally RC, Kull TJ. Evaluation criteria development and assessment of purchasing and supply management journals. Journal of Operations Management 2007;25(1):165–83.
[10] Muller-Merbach H. OR of the people, by the people, for the people. OMEGA: The International Journal of Management Science 2011;39(2):119.
[11] Muller-Merbach H. Five notions of OR/MS problems. OMEGA: The International Journal of Management Science 2011;39(1):1–2.
[12] Paucar-Caceres A. Mapping the changes in management science: a review of 'soft' OR/MS articles published in Omega (1973–2008). OMEGA: The International Journal of Management Science 2010;38(1–2):46–56.
[13] Steward MD, Lewis BR. A comprehensive analysis of marketing journal rankings. Journal of Marketing Education 2010;32(1):75–92.
[14] The Oxford English Dictionary. OED online, "discipline, n2". 2nd ed. Oxford University Press; 1989. Retrieved May 29, 2009, from http://dictionary.oed.com/cgi/entry/50065209.
[15] Clark BR. The Academic Life: Small Worlds, Different Worlds. Princeton, NJ: Princeton University Press; 1987.
[16] Lowry PB, Romans D, Curtis A. Global journal prestige and supporting disciplines: a scientometric study of information systems journals. Journal of the Association for Information Systems 2004;5(2):29–76.
[17] Gorman MF, Kanet JJ. Evaluating operations management-related journals via the author affiliation index. Manufacturing & Service Operations Management 2005;7(1):3–19.
[18] Barman S, Tersine RJ, Buckley MR. An empirical assessment of the perceived relevance and quality of POM-related journals by academicians. Journal of Operations Management 1991;10(2):194–212.
[19] Soteriou AC, Hadjinicola GC, Patsia K. Assessing production and operations management related journals: the European perspective. Journal of Operations Management 1999;17(2):225–38.
[20] Donohue JM, Fox JB. A multi-method evaluation of journals in the decision and management sciences by US academics. OMEGA: The International Journal of Management Science 2000;28(1):17–36.
[21] Barman S, Hanna MD, LaForge RL. Perceived relevance and quality of POM journals: a decade later. Journal of Operations Management 2001;19(3):367–85.
[22] Olson JE. Top-25-business-school professors rate journals in operations management and related fields. Interfaces 2005;35(4):323–38.
[23] Theoharakis V, Voss C, Hadjinicola GC, Soteriou AC. Insights into factors affecting Production and Operations Management (POM) journal evaluation. Journal of Operations Management 2007;25(4):932–55.
[24] Goh CH, Holsapple CW, Johnson LE, Tanner JR. An empirical assessment of influences on POM research. OMEGA: The International Journal of Management Science 1996;24(3):337–45.
[25] Goh CH, Holsapple CW, Johnson LE, Tanner JR. Evaluating and classifying POM journals. Journal of Operations Management 1997;15(2):123–38.