Critical Perspectives on Accounting 17 (2006) 457–490
Examining accounting departments' rankings of the quality of accounting journals

Alan Reinstein a,∗, Thomas G. Calderon b,1

a School of Business Administration, Wayne State University, Detroit, MI 48202-3930, USA
b School of Accountancy, College of Business Administration, The University of Akron, Akron, OH 44325-4802, USA
Received 18 June 2003; received in revised form 3 June 2004; accepted 29 September 2004
Abstract

Given the importance of research productivity to the tenure, promotion and merit raise processes and to the growth of the cognitive foundation of the accounting discipline, developing valid criteria for assessing the quality of accounting journals seems indispensable. While many studies have examined the quality of accounting and related journals, they relied mainly on surveys of the perceptions of faculty, accounting program chairs or deans. No study has yet ascertained the rankings that accounting departments actually use in evaluating journal quality. Moreover, few studies that examine journal quality have been published in the last decade. With the American Accounting Association's support and using a survey of accounting programs, we examine how accounting programs actually assess the quality of accounting journals. We document the rankings used by both doctoral-granting and non-doctoral-granting accounting programs, and confirm the existence of an elite set of journals whose rankings are invariant to school type, faculty size, resource base or mission. We interpret this as evidence of the influence of a select group of elite accounting programs in defining the parameters of value in accounting scholarship, which we observe can be detrimental to the scholarship of application, integration and teaching in accounting. Our results offer insight into why accounting departments use (or do not use) journal rankings and present detailed results that can help develop reasonable criteria for assessing research and scholarship.
© 2004 Elsevier Ltd. All rights reserved.
∗ Corresponding author. Tel.: +1 248 357 2400.
E-mail addresses: [email protected] (A. Reinstein), [email protected] (T.G. Calderon).
1 Tel.: +1 330 972 6099.
1045-2354/$ – see front matter © 2004 Elsevier Ltd. All rights reserved. doi:10.1016/j.cpa.2004.09.002
The vast literature dealing with research productivity in accounting departments generally assumes that tenure, promotion and merit processes consider the quality of the journals in which faculty members publish. Presumably, journal quality is based on an explicit or implicit ranking of refereed journals in accounting, and faculty and administrators use such rankings in evaluating research and scholarship. The Association to Advance Collegiate Schools of Business (AACSB) (2004) requires business schools and accounting programs to develop standards of achievement and to measure outcomes against those standards. A published list of journal rankings or a list developed in-house could serve as a benchmark for assessing journal quality. At the very least, such a list could provide an efficient and transparent medium for evaluating research quality.

Accounting faculty and administrators assign much importance to scholarship in making merit, tenure and promotion decisions in accounting programs. Academic administrators often seek objective research productivity data to make performance evaluation, hiring, tenure and promotion decisions. Both administrators and faculty are particularly interested in benchmark data to help set research productivity standards, to measure their own progress and to assess the progress of others. The academic literature has also long expressed a desire for information on faculty research productivity (see, for example, Brown, 2003; Brown and Huefner, 1994; Cargile and Bublitz, 1986; Hall and Ross, 1991; Hexer, 1969; Johnson et al., 2002; Kida and Mannino, 1980; Ostrowsky, 1986). However, the literature does not indicate the extent to which accounting departments rely on published or self-generated journal rankings in evaluating faculty research and scholarship.

With the American Accounting Association's (AAA) support, we examined (1) whether accounting departments currently use journal rankings; (2) if so, who helped develop them and which rankings they actually use; and (3) what factors explain the use of such lists. Our research also provides insight into the rationale some departments offered for using (or not using) formal journal rankings to evaluate faculty scholarship. This paper provides a snapshot of how accounting departments rate various journals. Faculty, academic administrators and others can use our research to help address the myriad of complex political and game-playing issues associated with using journal rankings in accounting programs.
1. Background

Most accounting academics recognize Journal of Accounting Research (JAR), The Accounting Review (TAR) and Journal of Accounting and Economics (JAE) as elite journals in the accounting domain (Schwartz et al., 2005). Their editorial boards consist primarily of graduates and faculty members from about 15 elite schools (Williams and Rodgers, 1995), with some boards being even more tightly controlled by fewer schools. Williams and Rodgers (1995) developed tables to show that these 15 elite schools have long controlled the editorship and editorial board membership of TAR, and that while the number of accounting doctorates has grown, the percentage of elites on such boards consistently exceeds 75% of the available slots. Additionally, publishing articles in JAR was the single most important factor in being selected to join the other major accounting journals' editorial boards. Moreover, of the 37 other journals that Williams and Rodgers (1995) analyzed, only publishing in Contemporary Accounting Research (CAR) helps "a bit" in being asked to serve on the elite journals' editorial boards.
Lee (1999) corroborates those findings. He observes that faculty from three universities (Illinois, Michigan and Texas) have generally controlled TAR's editorial board, and it is well known that the Universities of Chicago and Rochester, respectively, control JAR and JAE. Faculty from the three universities that Lee (1999) calls a super-elite have controlled the AAA's editorial and administrative leadership continually since its founding in 1917. For example, these three programs supply a very large share of AAA presidents, vice presidents, TAR editors and other key officers. He also found that graduates and faculty members from 20 schools dominate these journals' editorships, editorial boards and author ranks. Unsurprisingly, articles appearing in the top academic journals cite heavily from articles previously published in the journals in which they appear. For example, a JAR article cites JAR more than other major journals (Williams and Rodgers, 1995). This phenomenon often excludes ideas and theory that do not conform to elite programs' agenda and research philosophy, while minimizing opportunities to learn from and influence practice and teaching in the domain of accounting. Additionally, some have criticized the status quo for creating barriers to involvement by certain groups of faculty members in academic accounting research. For example, Brinn et al. (2001) observed that UK academics perceive "high barriers" that preclude non-U.S. faculty from seeking publication in such "top" U.S. journals as TAR, JAR and JAE. A major driver of these phenomena is that editors and reviewers of elite accounting journals, seeking more recognition for their research at elite universities, emulate economics research, since economics is often viewed as a "true" academic discipline. We also see similar phenomena as accounting emulates psychology, sociology and mathematics.

Besides defining elite journals and controlling their editorial boards, elite programs often require their faculty to publish exclusively in such journals (Lee, 1995). Such programs influence the direction of accounting research largely through a seemingly efficient market for academic recognition. Publishing in an elite journal typically brings faculty members immediate recognition, academic perquisites and commendations for making progress toward promotion and tenure, even at non-elite schools with lesser research missions. While benefiting individual faculty members, elitism can impair overall accounting scholarship. Breeding even more elitism, elite faculty usually shun publishing in non-elite journals or joining non-elite faculties. Many accounting faculty with virtually no chance of publishing in the three elite journals consistently rate them as the top accounting journals, which biases their research efforts toward the methodologies, domain and substance of those journals (Lee, 1999). This overwhelming focus on elite journals limits opportunities for accounting scholarship to contribute to the profession and to teaching (Lee, 1995). Elite journals generally examine a narrow range of issues that, relative to their costs, have minimal impact on accounting practice. Lee and Williams (1999) found that except for Accounting, Organizations and Society (which contains some accounting history and many behavioral articles), elite journals focus essentially on capital markets, positive theory and forecasting.
Authors of articles in the elite three often ignore non-elite journal sources, thus perpetuating a closed system that could impair innovation and growth in the accounting domain. Lee (1997) warns that this "inbreeding" discourages academics from criticizing accounting practice and practitioners.
Citing Demski et al. (1991), Reiter and Williams (2002) note a crisis in accounting research, observing that it no longer "leads" accounting practice (e.g., no discernible "real world" demand for accounting researchers exists, except from a few auditing firms for a few auditing/accounting types) or decision making and that accounting has developed few innovations. They contend that accounting scholarship is no closer now than it was 30 years ago to addressing major accounting issues. Elitism further segregates accounting teaching from practice and misdirects accounting resources, as many faculty devote substantial resources to seeking to publish in elite journals (Lee, 1995).

Baker and Bettner (1997, p. 293) note the importance of accounting research that takes greater interpretive and critical perspectives. They contend that since the field is a highly partisan activity and not a static reflection of economic reality, there is a need for interpretive and critical research that seeks to "describe, understand and interpret the meanings that human actors apply to the symbols and the structures of the setting in which they find themselves." They (p. 304) also argue that accounting research models focus on positivist perspectives and quantitative methods, and that "the popularity of a particular research paradigm does not necessarily relate to its relative contribution to knowledge." Chua (1986, p. 601) adds that mainstream accounting journals are "grounded in a common set of philosophical assumptions about knowledge, the empirical world, and the relationship between theory and practice." But Brown and Huefner (1994) note that the more innovative and sometimes controversial pursuits often contribute most to knowledge.

Increases in the number of accounting journals have not changed elite programs' and elite aspirants' perceptions of publication outlets. As the amount and diversity of accounting literature grows, so does its balkanization (Schwartz et al., 2005). Schwartz et al. surveyed 151 accounting doctoral students from 39 accounting Ph.D. programs to assess their familiarity with and "personal knowledge" of 35 actual and 2 fictitious accounting journals. While the students all ranked TAR, JAR and JAE as the top three, they ranked The Journal of Accountancy as number six. Those at elite programs, however, rated the elite three much higher and knew much less about other journals than did students at non-elite programs. Thus, through their training, elite programs seem to perpetuate the emphasis on the very narrow scope of accounting scholarship that the elite three have historically embraced. We examine the journal rankings actually used by elite and non-elite programs to evaluate the pervasiveness of the emphasis on elite journals in accounting programs.

1.1. Prior journal ranking studies

Prior research has generally confirmed the perception of an elite set of journals that are consistently ranked as the highest quality publications in the accounting domain. However, no prior research has measured which journals accounting programs actually use. Instead, prior studies have used one of three broad techniques to assess research productivity and publication quality: counting, citation analysis and surveys of journal quality. The remainder of this section discusses and gives examples of these techniques.
1.1.1. Counting

Counting techniques, presumably an objective and cost-efficient method, compile the number of articles faculty members or academic programs publish in certain journals, ignoring the articles' quality. While subjective attributes such as quality and rigor are important, decision makers often prefer to use a verifiable measure such as counting. Prior studies have generated intriguing results using counting techniques. For example, Zivney and Bertin (1992) found that only 5% of doctoral-degree faculty had published at least one article in the 128 accounting and finance journals included in their database. Chung et al. (1992) also noted that nearly one-third of the most prolific scholars had graduated from only seven doctoral programs. Dwyer (1994) used this method to show that females earning their doctorates in 1981 had written significantly fewer articles than male graduates of the same year. Streuly and Maranto (1994) reached similar conclusions for 2- and 5-year intervals. Programs that Brown (1996) and Fogarty (1995) ranked generally produce graduates who publish in academic journals significantly more often than their peers. Englebrecht et al. (1994) and Read et al. (1998) counted the research productivity of promoted faculty members. Fogarty and Ruhl (1997) counted and measured differences in accounting programs associated with their graduates' research records. Similarly, Kirchmeyer et al. (2000) and Rama et al. (1997) used counting to compare the research records of male and female promoted accounting faculty members.

Counting is neither as objective nor as simple as it may seem. Selecting journals to include in a study requires several subjective decisions, including identifying relevant and representative journals and justifying the inclusion of some journals and the exclusion of others.

1.1.2. Citation analysis

Citation analysis measures how often articles, authors or journals are referenced in other articles. It assumes that higher quality articles are cited more frequently than those of lower quality. The technique relies on a frequency analysis of cited articles in a pre-defined set of published studies. Sriram and Gopalakrishnan (1994) used citation analysis to rank the top 34 doctoral programs and their most prolific graduates. Seetharaman and Islam (1995) used this technique to rank the quality of 32 accounting journals, considering factors such as a journal's age and circulation, and citations of articles appearing in both premier accounting journals and non-accounting journals. They also compared their results from 1985–1987 and 1988–1989 to ascertain "movements" in these rankings over time.

Like counting, a valued attribute of citation analysis is its presumed objectivity: either an article is cited or it is not. But the method has similar pitfalls, as well as problems of its own. MacRoberts and MacRoberts (1989) note that citation analysis often fails to consider all but "first-named" authors in co-authored pieces, usually fails to differentiate between different types of journals and gives credit to cited articles whether they are praised or criticized. Citation frequency can also be influenced by the author's reputation, the sensitivity of the subject matter, and the journal's circulation and coverage. Citation analysis suffers from limited databases, perhaps to a greater extent than counting, because it requires the tedious analysis of articles, footnotes or references. For example, McRae (1974) first used citation analysis on accounting publications by measuring the frequency of citations in only 17 articles. Beattie and Ryan (1991) and Gamble and O'Doherty (1985a, 1985b) were also limited in scope due to the difficulty of developing databases.
Like counting, citation analysis must rely on objective criteria to ascertain which journals to count. The efficacy of citation analysis results depends on the representativeness of the publications used to conduct the frequency analysis of cited works.
1.1.3. Surveys of journal quality

Several studies have used surveys to assess the quality of accounting and related journals. Typically, faculty or administrators are asked to rank journals relative to an "anchor" journal. For example, Howard and Nikolai (1983) used The Journal of Accountancy as their anchor, assigning it a rating of 100. Average responses usually are used to rank-order journals. Smith (1994) used this technique to rank 93 major accounting and other business journals. Additionally, Johnson et al. (2002) surveyed accounting department administrators nationwide, asking them to rank 33 accounting and MIS journals in which accounting faculty often publish their works. They found that the respondents normally rated MIS journals significantly lower than accounting ones.

While surveys have been used primarily to measure the quality of journals, most counting and citation analysis studies have measured the quantity, but not the quality, of faculty research. Like other assessment techniques, however, surveys have potential flaws. Morris et al. (1990) found that faculty who publish frequently in top journals tend to exhibit significant bias in rating those journals. Jolly et al. (1995) found significant differences in quality ratings among nearly 1000 respondents at AACSB-accredited institutions. While productivity can be evaluated on an ordinal, interval or ratio basis, most recent studies, e.g., Ballas and Theoharakis (2003), Hull and Wright (1990) and Schroeder et al. (1988), have used the more inferential ratio scale. Other issues include the selection of the anchor, the identification of appropriate persons to evaluate journals, potential response biases due to the specialty interests of the respondents, and the use of cluster analysis to group journals rather than rank-ordering them.

As a guide to the many articles on the subject of accounting faculty members' research productivity, we offer two tables summarizing recent published articles. Table 1 lists some key articles that use a quantitative method to arrive at their conclusions. Table 2 lists some major articles that use surveys oriented to the quality of journals. Most striking is the relative shortness of Table 2. Only 4 of the 16 articles listed in the two tables addressed quality directly. None examined whether departments and colleges actually used the generated lists. We seek to fill this information gap.
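As an illustration only, the anchor-based survey scoring described above (an anchor journal fixed at 100, other journals rated relative to it, and journals rank-ordered by mean rating) can be sketched as follows. The journal names and ratings in this sketch are hypothetical and are not drawn from any of the studies cited.

```python
# Minimal sketch of anchor-based journal scoring: each respondent rates journals
# relative to an anchor journal that is fixed at 100; journals are then rank-ordered
# by their mean rating. All journal names and numbers below are hypothetical.
from statistics import mean

responses = [  # one dict of ratings per respondent (the anchor itself, rated 100, is omitted)
    {"Journal X": 140, "Journal Y": 95, "Journal Z": 70},
    {"Journal X": 120, "Journal Y": 105, "Journal Z": 80},
    {"Journal X": 130, "Journal Y": 90, "Journal Z": 85},
]

mean_scores = {journal: mean(r[journal] for r in responses) for journal in responses[0]}
for rank, (journal, score) in enumerate(
        sorted(mean_scores.items(), key=lambda item: item[1], reverse=True), start=1):
    print(rank, journal, round(score, 1))
```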
2. Method

As shown in Appendix A, we developed a short e-mail survey instrument that requested the journal-ranking documents that accounting departments used for promotion, tenure, merit and other purposes, who initiated each document and how departments used it. We used Hasselback's (2001) Faculty Directory to ascertain faculty size, AACSB accounting accreditation status and Ph.D. program offerings, and we consulted public information sources such as the web and school catalogues to determine whether institutions were public or private. The AAA e-mailed the questionnaire to all 295 members of the AAA's Accounting Program Leadership Group (273 U.S. department chairs, 7 Canadian programs and 15 other non-U.S. programs).
Table 1
Examining accounting programs' classification of accounting journals: applying quantitative techniques to qualitative bases (part I—quantitative)

Reference (study): Brown (2003)
Objective: Rank the quality of accounting journals by counting the number of times an article was downloaded
Number and types of journals used: All journals for which an article was downloaded; ultimately ranked the top 18 journals by downloads
Source of journals: Considered all "heavily downloaded" papers that accounting faculty wrote from the Social Science Research Network (SSRN)
Findings/information: Counted the number of times each article was downloaded; provides a demand-driven, micro-level approach to ranking journals

Reference (study): Buchheit et al. (2002)
Objective: Compare accounting productivity to that of other business disciplines
Number and types of journals used: 14 journals: 3 accounting, 3 marketing, 4 finance and 4 management
Source of journals: Used Trieschmann, Dennis, Northcraft and Niemi's classification system to select top-tier journals
Findings/information: Top-tier publication rates are lower in accounting relative to other business disciplines

Reference (study): Dyl and Lilly (1985)
Objective: Rank academic institutions
Number and types of journals used: 7 journals from 1978 to 1981
Source of journals: Used Hendrickson's (1980) list of scholarly academic journals
Findings/information: Number of publications per faculty member in accounting appears to be surprisingly low

Reference (study): Petry and Settle (1988)
Objective: Rank colleges of business in the U.S. by counting pages per article
Number and types of journals used: Total of 19 journals: 3 accounting, 4 finance, 7 management, 4 marketing, and 1 other
Source of journals: Selected journals from business disciplines using high standing among scholars (excluding economics)
Findings/information: Universities offering only doctorate and master degree programs have the highest productivity levels

Reference (study): Sriram and Gopalakrishnan (1994)
Objective: Consider the importance of publishing in non-U.S. journals
Number and types of journals used: Three academic accounting journals published outside of the U.S.
Source of journals: Examined the Author's Guide to Accounting and Financial Reporting
Findings/information: Rank authors by adjusted appearances and contributing authors to these journals

Reference (study): Heck et al. (1990, 1991)
Objective: Rank accounting and doctoral programs; identify 193 most prolific authors
Number and types of journals used: 24 academic journals, separating results from inception–1988 and 1979–1988
Source of journals: Leading accounting journals that target primarily academic researchers and readers, according to the authors
Findings/information: The 1979–1988 decade accounts for 44% of all articles written since the journals began operations

Reference (study): Englebrecht et al. (1994)
Objective: Examination of the productivity of 584 faculty members promoted to associate or full professor, 1987–1989
Number and types of journals used: Hull and Wright's 79 journals plus categories for some other accounting journals, books, monographs, and symposium proceedings
Source of journals: Used Hull and Wright's (1990) rankings with some additions
Findings/information: Professors in AACSB-accredited schools publish more than those in non-AACSB-accredited schools
Table 1 (Continued)

Reference (study): Zivney and Bertin (1992)
Objective: Measure performance data for doctoral graduates over a 25-year period
Number and types of journals used: 128 finance and accounting journals
Source of journals: Finance Literature Index, Accounting Literature Index, and DIALOG online databases
Findings/information: Only 5% of doctoral graduates published once a year over a prolonged period of time

Reference (study): Fogarty and Ruhl (1997)
Objective: Explore effects of doctoral school and initial faculty appointment on career publication productivity
Number and types of journals used: 33 accounting journals
Source of journals: Identified the total publications in the top 33 journals according to Heck et al. (1991)
Findings/information: Accounting faculty members earning doctoral degrees at top universities are more productive researchers

Reference (study): Zivney et al. (1995)
Objective: Examine rates of accounting faculty publishing
Number and types of journals used: 66 accounting and finance journals
Source of journals: Used Heck's Finance–Accounting Literature Database
Findings/information: Average publishing accounting faculty members publish about one article every 3 years in 1 of the 66 journals studied

Reference (study): Zeff (1996)
Objective: Update the listing of academic research journals in accounting
Number and types of journals used: 77 accounting journals
Source of journals: Used three operative variables when considering whether to consider a particular journal as an "academic research journal"
Findings/information: The future of journals is in electronic compilation and transmission
Table 2
Qualitative-survey research in ranking journal quality

Reference (study): Ballas and Theoharakis (2003)
Population surveyed: Opinions of 6994 accounting faculty worldwide
Sample/method: Measured familiarity, perceived rank, and readership to examine diversity in journal perceptions across geographic regions
Evidence provided: Rank 40 accounting journals based on three different metrics by geographic region and research orientation of faculty respondents

Reference (study): Jolly et al. (1995)
Population surveyed: Opinions of heads of 389 AACSB-accredited institutions
Sample/method: The Accounting Review article served as anchor to compare other journals
Evidence provided: Rank 59 accounting journals

Reference (study): Brown and Huefner (1994)
Population surveyed: Opinions of senior faculty at Business Week's "best 40 MBA programs"
Sample/method: Measured both the familiarity and prestige of the sampled journals
Evidence provided: Rank 44 accounting journals

Reference (study): Hull and Wright (1990)
Population surveyed: Opinions of 278 department heads, as well as tenured and untenured faculty members
Sample/method: A main The Journal of Accountancy article served as the base anchor for other surveyed journals
Evidence provided: Rank 79 accounting journals; much agreement exists on the rankings of the top 15 journals by rank and specialty (e.g., tax)
We received 145 usable e-mail responses (an approximate 47% response rate), including 19 usable journal-ranking attachments.1 We developed three separate databases containing (1) responses to closed-ended questions from the survey instrument and information about each respondent obtained from Hasselback's Faculty Directory or the Internet; (2) journal rankings, along with information identifying the schools that submitted them; and (3) voluntary comments related to journal rankings.
3. Results

Table 3 (panel A) displays attributes of the 145 responding institutions, 70% of which came from public institutions without separate AACSB accounting accreditation. Respondents from non-Ph.D. programs outnumber those from Ph.D. programs by a 3:1 ratio. The median number of faculty at responding institutions is 11. Responses also came from 11 elite institutions, 2 of which supplied actual journal rankings. Overall, responses came from a wide cross-section of departments with diverse characteristics that are fairly similar to the broad profile of accounting departments nationally. While not claiming a randomly distributed sample, we believe that our data provide comprehensive information on the journal rankings that accounting departments use. The data also provide limited insight into why departments may not use journal rankings.
1 We received all 145 responses after our initial mailing in May 2002. The AAA subsequently sent a second request but we received no additional responses.
Table 3
Respondents' background information (panel A) and number of journals included in journal rankings list (panel B)

Panel A
Non-Ph.D. programs = 110 (76%); Ph.D. programs = 35 (24%); N = 145
Public institution = 104 (72%); Private institution = 41 (28%); N = 145
Non-accounting AACSB = 101 (70%); Accounting AACSB = 44 (30%); N = 145
Have no journal rankings = 126 (87%); Have journal rankings = 19 (13%); N = 145

Average number of faculty (Median / Mean / S.D.)
All faculty: 11 / 11.55 / 6.80
Instructors: 1 / 1.34 / 1.86
Assistants: 3 / 2.69 / 2.16
Associates: 2 / 3.16 / 2.59
Full: 3 / 4.20 / 3.19

Panel B (a) (Non-Ph.D. / Ph.D. / All)
Average: 74 / 39 / 55
Median: 41 / 31 / 41
S.D.: 72 / 28 / 54
Highest: 191 / 79 / 191
Lowest: 10 / 7 / 7
Total number of programs with journal rankings: 9 / 10 / 19
Total number of programs without journal rankings: 101 / 25 / 126

a Differences in the number of ranked journals are statistically significant (p-value <0.05).
3.1. Using lists of journal rankings

A total of 126 programs (87% of respondents) do not use formal journal rankings for merit, promotion or tenure decisions, while 19 departments (13%) use in-house journal-ranking documents (see Table 3, panel B). The lists we examined contained an average of 55 publications, with statistically significant differences between non-Ph.D. and Ph.D. programs in the number of ranked journals (averages of 74 and 39, respectively; medians of 41 and 31, respectively). One accounting department (non-Ph.D. granting) ranked as many as 191 publications. Both responding elite programs that used journal rankings listed no more than six named accounting journals, including TAR, JAR, JAE, CAR and RAS. AOS and a category called "AAA specialty journals" (without naming specific journals) appeared on one of those lists.

Users of journal rankings state that they use them for promotion decisions (95%), tenure decisions (95%) and merit raises (100%). About 22% of respondents use their rankings for purposes other than traditional merit, tenure and promotion decisions. One chair noted that her department (non-elite, non-Ph.D. granting) uses its list of top business and accounting journals solely to reward faculty for publishing, paying faculty US$ 10,000 for publishing in top journals (not pro-rated for co-authors), US$ 6000 for other refereed journals (pro-rated by co-author) and US$ 100 (not pro-rated) for lower level ones.
Another non-Ph.D.-granting department uses the journal rankings to help allocate summer grants, research awards and faculty release time for research. A third offers release time for the next academic year based on the number of articles published in ranked journals during the preceding 5 years.

Many respondents wrote short notes to explain why their departments use or do not use journal rankings. The overwhelming tenor of their feedback is that even if they do not use formal journal rankings, most accounting departments routinely try to ascertain the quality of published articles. An analysis of the feedback received produced four broad themes. First, many programs do not have formal journal rankings because better alternatives are available given the nature of their programs. Second, some programs have a formal process for evaluating faculty publications based on either internal or external reviewers. Third, faculty peers use their professional judgment to review individual articles and do not see a need to rely exclusively on a formal list. Finally, many department chairs view journal rankings as a source of inter- and intra-departmental conflict. Comments that are representative of each theme are outlined below.

3.1.1. Programs that use an alternative approach

Many department chairs indicate that they use various means, other than formal journal rankings, to assess the quality of the journals in which their faculty publish. Although they do not use formal journal rankings, the focus among this group is on the perceived reputation of the journal rather than the quality of the specific article published by a professor. They presume that everyone can identify the top scholarly (and likewise the least scholarly) journals in the field, and that a journal's quality and the quality of a publication appearing in that journal are synonymous. However, some programs in this group simply do not evaluate faculty scholarship, but rather rely on the mere existence of a publication. The following are examples of comments from this group:

• We have no such document or ranking. People are conscious of which journals are more difficult to get into than others, however; so top journals certainly get more attention. All pubs are not created equal.
• We do not have a formal ranking system in which different journals are assigned a different ranking. Instead, the member of the promotion review committee from the School of Business provides a sort of informal idea of the stature of the journal.
• This does not mean that we do not have more informal lists of journal quality. There is a consensus on some rankings, and others are handled on a situational basis. TAR, JAR, CAR, JAE and AOS are thought of as A journals, also the newer Review of Accounting Studies. Journal of Accounting and Public Policy has been considered an A journal in the past. The higher quality section journals such as Auditing or BRIA or JATA are considered an A−. The best specialty journals in a field are generally considered A−, like JMAR or Critical Perspectives on Accounting or AAAJ or Journal of Business Ethics.
• We do not have one, but it is clear that JAR, TAR and JAE are considered here to be first rank. AOS, RAST, CAR and the AAA specialty journals come next, followed by others. In our promotion reports, we refer to them as such, but have no formal document.
• We do not use journal ratings at the present time, but do require publication in refereed journals for tenure, promotion and graduate faculty status.
• We certainly differentiate journals with different readerships, objectives and qualities. However, we do not have a formal list ranking the different journals.
• Although we do not have a ranking document, journals are evaluated with any promotion/tenure decision.
• Unfortunately, we have no journal rankings. We are fortunate if we can get our accounting faculty to publish in the New Accountant.
• While we do not have a journal-ranking list, I think the faculty has some sense of quality with regard to publications. Frankly, many of our faculty publish in a few outlets, not known for stellar quality. With poor salaries, heavy teaching loads and other obstacles, I consider [it] a miracle that any one publishes anything at all. At this institution, it is more of a quantity game than a quality game.
• We expect our faculty to publish in peer-reviewed/blind-reviewed journals but we do not require a particular level of journal. We are primarily a teaching institution. However, we still believe faculty should be engaged in scholarship. We just do not mandate a particular level of journal.
• The only criterion we use is whether a journal is peer-reviewed.
• We tend to use those that have been published in various accounting journals for promotion purposes only. Even then it is a fairly informal process.
• There is (currently) no formal ranking. Evaluation committees sometimes use Cabell's for reference and perspective.
• We do look at various factors to inform decisions such as Cabell's, SSCI data, AOM and other studies.

3.1.2. Programs that use reviewers

Some programs use internal or external reviews, rather than formal journal rankings, to evaluate the quality of scholarly activity. Among this group, there appears to be no presumption that a journal's quality and the quality of a publication appearing in that journal are synonymous. Therefore, the focus of their evaluation is on the individual article and its contribution to scholarship. Examples of comments among this group include the following:

• We do not use "journal ranking". Instead, we rely on the opinion of three outside reviewers to comment on the quality of the research (as well as the quality of the journal).
• We do not have a journal ranking system. All of the tenured professors read the papers and discuss them. Of course, whether or not the paper is published and where is taken into account, but only implicitly.
• We do not rank individual journals. Instead, we evaluate the quality of individual papers submitted in a candidate's documentation. Our view is that high quality works may be published in "lesser" outlets and that all papers published in a given journal are not necessarily created equally. Thus, rankings, per se, do not hold as much meaning here as in other places.

3.1.3. Programs that use professional judgment

Programs falling into this group explicitly do not rank an article based solely on the journal in which it appears. Unlike the other groups, this group relies on both formal journal rankings and faculty and administrators' judgment to assess the quality of faculty publications.
Believing that high quality journals can contain some low quality articles and vice versa, they review each article on a case-by-case basis, assessing it based on its contribution to scholarship. Examples of comments from this group include:

• Please note that it is used with judgment, not a blind count. That is, sometimes a paper in an A journal is not counted as such because its quality is not comparable to other papers in the journal. Or a paper in a lower-ranking journal might be moved up. This happens about equally in each direction.
• Overall, we believe that our list provides reasonable guidance in judging journal quality. However, it is possible for an article published in a non-"A" tier journal to be of exceptional quality; similarly, it is possible for a lesser quality article to appear in an "A" tier journal. Consequently, our departmental promotion and tenure committee reserves the right to make evaluations of publication quality on a case-by-case basis, guided by our list of journal quality.

3.1.4. Journal rankings as a source of conflict

Many comments show that ranking journals is a contentious issue that causes tension among deans, department chairs and faculty. While 83% of chairs state that their departments originated the need for journal rankings, 44% note pressure from their deans to require journal rankings.2 A department chair noted that programs emphasizing a professional mission and, thus, focusing their publications on practitioner journals could be adversely affected when a college uses journal rankings. Some chairs expressed concern that using journal rankings can lead to inter- and intra-departmental competition and game playing that may not be in the department's best interest. The following comments provide insight into such complications:

• Our Dean is currently pestering us to rank academic journals. Unfortunately, she does not appreciate professional journals. We in the accounting department are strongly opposed to this practice given that we are an undergraduate teaching institution.
• No? How about another category? Our Dean has one but the PTR committee does not use it because it violates the collective agreement and has no academic validity. [A subsequent e-mail confirmed this to be the chair's actual experience.]
• We tried years ago but people from other departments used it to deny contract renewal/tenure for accounting faculty.
• We are not departmentalized, so there is no "penalty" for publishing in the journals of other fields. One of the reasons we have resisted a written document is that we want to be able to evaluate niche or specialized research and one cannot think in advance of every scenario. What if we had a person in tax? What if we have a person in international?

This last comment speaks to a disadvantage of developing and rigidly applying a journal-ranking document. Artificially restricting faculty publications to only listed, ranked journals could ignore quality publications in non-ranked journals. In addition, because such lists are usually highly oriented to U.S.-based journals that publish mainly financial accounting research, they could inadvertently narrow the scope of accounting scholarship.
2 Since the respondents were asked to select all that applied in responding to this question, the sum of percentages need not add to 100. This implies that some initiatives for journal rankings originated from multiple sources.
In summary, while most departments have no formal journal-ranking documents, many accounting programs rely on means other than in-house journal-ranking documents to assess publication quality and to help broaden the review process. They use such methods as formal external reviews of published articles and faculty perceptions of journal quality. Only one respondent mentioned any of the lists generated in published scholarship, merely noting that the department generated its own list anyway.

3.2. Factors associated with use of journal rankings

Table 4 shows respondents' characteristics associated with formal journal-ranking documents. Departments that use such documents tended to have Ph.D. programs, larger faculty sizes and separate AACSB accounting accreditation. While 8% of departments without Ph.D. programs used formal journal rankings, 29% of those with Ph.D. programs used them. The heaviest concentration of journal-ranking lists exists among non-elite Ph.D. programs, which comprise 17% of the total number of respondents to our survey but make up 42% of journal-rankings users. About 5% of departments without separate accounting accreditation used journal rankings, while 32% of departments with separate accounting accreditation used them. Departments using journal rankings average 17 faculty members, while departments that do not average about 11 faculty members. Univariate differences in Ph.D. program representation, accounting program accreditation status and faculty size are all statistically significant at the 1% level or lower.

Characteristics of non-users of journal rankings are just as revealing as those of users. Non-Ph.D. (accounting) programs appear to use journal rankings less frequently than do their Ph.D. counterparts. While very few schools without separate AACSB accounting accreditation are users, the type of school (public or private) shows no statistically significant association with journal-rankings use. Chairs from elite programs who responded to our survey tend not to use such lists. Of the 15 classified elite programs, 10 responded to our survey, 8 of which stated that they use no formal journal-ranking lists to evaluate faculty scholarship. Similarly, of the seven doctoral programs identified by Chung et al. (1992) as producing the most prolific contributors to the accounting literature, all of which were part of the 15 elite programs, five responded to our survey, and none uses such formal journal-ranking lists. Thus, few elite programs use such lists. The comments we received may help to explain this pattern. An elite program chair commented that while his faculty is fully aware of the top journals in the field, they saw no merit in using an a priori journal-ranking list when they could review a published article to assess its quality. Considered in the context of the small number of highly selective journals that appear on the lists provided by the two elite programs that use formal journal rankings, this comment may reflect a more pervasive rationale for the relative absence of formal journal-ranking lists among the elite.

To further explore factors associated with the use of journal rankings, we ran a probit analysis on the data, using the following model:

Ui = α + β1 × Ph.D.i + β2 × Typei + β3 × Accreditationi + β4 × Sizei
Table 4
Factors related to rankings (a)

Panel A: highest degree (Have no rankings / Have rankings / Total)
Non-Ph.D.: 101 (92%) / 9 (8%) / 110
Ph.D.—all responding: 25 (71%) / 10 (29%) / 35
Ph.D.—non-elite: 17 (68%) / 8 (32%) / 25
Ph.D.—elite programs: 8 (80%) / 2 (20%) / 10
Ph.D.—programs that graduated most prolific contributors to the accounting literature (Chung et al., 1992): 5 (100%) / 0 (0%) / 5
Total (Ph.D. and non-Ph.D.): 126 / 19 / 145
Test statistic (test of homogeneity—have vs. do not have rankings across Ph.D. and non-Ph.D.-granting programs): Pearson chi-square value = 9.694, d.f. = 1, probability = 0.002

Panel B: institution type (Have no rankings / Have rankings / Total)
Public institution: 89 (86%) / 15 (14%) / 104
Private institution: 37 (90%) / 4 (10%) / 41
Total: 126 / 19 / 145
Test statistic (test of homogeneity—have vs. do not have rankings across public and private institutions): Pearson chi-square value = 0.563, d.f. = 1, probability = 0.453

Panel C: accounting accreditation (Have no rankings / Have rankings / Total)
No AACSB accounting accreditation: 96 (95%) / 5 (5%) / 101
Separate AACSB accounting accreditation: 30 (68%) / 14 (32%) / 44
Total: 126 / 19 / 145
Test statistic (test of homogeneity—have vs. do not have rankings across separately accredited and not-separately accredited accounting programs): Pearson chi-square value = 14.873, d.f. = 1, probability = 0.000

Panel D: number of faculty (Mean number of faculty / S.D.)
Have no rankings (b): 10.746 / 6.247
Have rankings: 17.118 / 7.983

a Percentages are based on row totals.
b t-test for difference: p-value = 0.005.
where Ui = use of journal rankings by department i; Ph.D.i = whether the department offered a doctoral program; Typei = whether the university was private or public; Accreditationi = whether the department had separate accounting accreditation; Sizei = the number of faculty members in the department; and α, β1, β2, β3 and β4 are constants representing the model coefficients.
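A minimal sketch of how a probit model of this form could be estimated follows; the data file and column names are hypothetical placeholders rather than the authors' actual dataset.

```python
# Illustrative probit estimation of U_i = a + b1*PhD_i + b2*Type_i + b3*Accreditation_i + b4*Size_i.
# The CSV file and column names below are hypothetical placeholders, one row per responding department.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("departments.csv")

y = df["use_rankings"]               # 1 if the department uses a journal-ranking list, else 0
X = sm.add_constant(df[["phd", "private", "acct_accredited", "faculty_size"]])

result = sm.Probit(y, X).fit()
print(result.summary())              # coefficient estimates, standard errors, z-statistics and p-values
```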
Table 5
Factors associated with use of journal rankings: results of probit analysis

Parameter (Estimate / Standard error / t-statistic / p-value)
Constant: −2.226 / 0.403 / −5.526 / 0.000
Ph.D.: 0.669 / 0.349 / 1.919 / 0.055
Accreditation: 0.845 / 0.345 / 2.451 / 0.014
Type: 0.124 / 0.372 / 0.333 / 0.739
Size: 0.028 / 0.026 / 1.07 / 0.284

−2 × log likelihood ratio = 21.338 with 4 degrees of freedom. Chi-square p-value = 0.000. Key: Ph.D.i = whether the department offered (1) or did not offer (0) a doctoral program; Typei = whether the university was private (1) or public (0); Accreditationi = whether the department had (1) or did not have (0) separate accounting accreditation; Sizei = the number of faculty members in the department.
The results, reported in Table 5, reveal that when considered in the context of a multivariate model, the existence of an accounting doctoral program and accounting accreditation status are statistically significant (p-value <0.10) predictors of the use of journal rankings. This finding is intuitive, as both factors are often viewed as being among the drivers of the need for high-quality refereed publications in accounting departments. Accounting accreditation standards require documented evidence of high-quality scholarship, and doctoral-granting programs must usually meet very high research expectations.

3.3. Constructing a composite list of journal rankings

In order to derive a composite journal-ranking document based on the various lists used by the 19 accounting programs, we first had to create a common process to combine their disparate rating scales. Some departments use letter grading scales (A, B and C) with or without plus and minus grades; others use qualitative rankings such as elite, high quality, quality and support. A few use numerical ranks; several use tiers or clusters. One places journals into various types of scholarship (basic, applied and instructional development) and assigns letter grades to journals within each scholarship type. To develop a consistent set of ratings, we used five tiers to capture the respondents' disparate rankings. The following table documents our conversion results, which we use to create the composite journal rankings reported in this paper:

Ranking assigned by department (tier assigned by authors):
A or A−: 1
Top 10: 1
Top 5: 1
Elite: 1
Tier/cluster 1: 1
High quality: 2
Tier/cluster 2: 2
B+: 2
Quality: 3
Tier/cluster 3: 3
B/B−: 3
Support: 4
Tier/cluster 4: 4
C+/C: 4
Tier/cluster 5: 5
C−: 5
We acknowledge that fewer tiers would place some lower ranked journals (based on the actual lists we reviewed) into our top tier, while adding tiers would push some journals that several schools placed into top tiers into lower tiers. However, we carefully reviewed all of the ranking lists we received to ensure that our method neither over- nor under-ranked journals. Based on that review, we used our method only after we observed that the overall rankings based on five tiers were not significantly sensitive to using fewer or more tiers. In essence, we took the rankings each responding school used and mapped them onto our scale. To help derive generalizable results, we analyze and report results only for journals appearing on at least three departmental lists. We used a simple sorting process based on the number of times a journal is ranked by various accounting departments in tiers 1–5. Thus, the more tier 1 rankings a journal receives, the more likely it is to be rated highly in our list. This method produced more intuitive results (though the differences were not statistically significant) than a weighted average process that used ranking frequencies as weights and tiers as ratings. Furthermore, we observed that a weighted average process is much more sensitive to the number of departments that rank a journal, particularly when there are divergent rankings. For example, if three departments ranked a journal in the top tier and no one else ranked that journal, that journal would receive a composite average score of one. But if eight departments ranked a journal as tier 1 and two others ranked the same journal as tier 2, then the composite average score for that journal would be computed as 1.2. Nonetheless, we report both the weighted average tier and the sorted rank for each journal.
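A minimal sketch of the two composite measures described above follows: the weighted average tier, and the frequency-based sort described in the notes to Tables 6–8 (journals sorted by the number of tier 1–3 rankings received, with tier 4–5 rankings breaking ties). The journal names and tier assignments are invented solely to illustrate the computation, including the 1.0 and 1.2 examples given in the text.

```python
# Sketch of the composite ranking measures described above. Tier assignments are hypothetical.
from collections import Counter

# journal -> list of tiers (1-5) assigned by the departments that ranked it
ratings = {
    "Journal A": [1] * 8 + [2] * 2,   # eight tier-1 and two tier-2 ratings -> weighted average 1.2
    "Journal B": [1, 1, 1],           # three tier-1 ratings -> weighted average 1.0
    "Journal C": [1, 2, 2, 3, 4],
}

def weighted_average_tier(tiers):
    return sum(tiers) / len(tiers)

def sort_key(tiers):
    counts = Counter(tiers)
    high = counts[1] + counts[2] + counts[3]   # number of tier 1-3 rankings received
    low = counts[4] + counts[5]                # tier 4-5 rankings break ties (more is worse)
    return (-high, low)

for rank, journal in enumerate(sorted(ratings, key=lambda j: sort_key(ratings[j])), start=1):
    tiers = ratings[journal]
    print(rank, journal, len(tiers), round(weighted_average_tier(tiers), 2))
```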
3.4. Rankings

Tables 6–8 show how accounting programs rank various journals. The rankings include only accounting journals that appeared in the journal rankings lists submitted to us by accounting department chairs. Journals that were not included in those lists or journals that were included in fewer than three lists do not appear in our rankings. In a sense, Tables 6–8 are sorted lists of publications that appear in the journal rankings lists used by accounting programs.
Table 6 Top accounting journals—all programs Journal
Number of departments
Number of times rated in
The Accounting Review Journal of Accounting Research Journal of Accounting and Economics Contemporary Accounting Research Journal of the American Taxation Association Auditing: A Journal of Practice and Theory Accounting, Organizations and Society Journal of Management Accounting Research Behavioral Research in Accounting Journal of Accounting, Auditing and Finance Journal of Information Systems Journal of Accounting and Public Policy Accounting Horizons Issues in Accounting Education National Tax Journal Review of Accounting Studies The Journal of Accountancy Abacus Journal of Accounting Literature Journal of Accounting Education International Journal of Accounting Education and Research Research in Governmental and Nonprofit Accounting CPA Journal International Journal of Intelligent Systems in Accounting, Finance and Management Journal of Business Finance and Accounting Advances in Accounting Accounting & Business Research
Tier 2
Tier 3
Tier 4
Sorted rank
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 18 20 21
Tier 5
18 17 15 17 17 17 15 17 16 14 15 14 16 14 14 9 11 12 12 12 15
18 17 15 13 12 12 10 10 10 10 9 8 7 6 6 5 4 3 3 3 3
4 5 3 5 4 4 4 4 3 6 6 5 2 3 6 6 6 5
3 2 2 3 3 1 5
2 2
1.00 1.00 1.00 1.24 1.29 1.41 1.33 1.59 1.50 1.29 1.67 1.64 1.88 1.86 1.79 1.67 2.18 2.00 2.00 2.17 2.40
9
3
3
1
2
2.22
22
11 5
3 3
2
3 2
3
2.55 1.80
23 24
12 12 13
2 2 2
7 7 6
3 1 4
2.08 2.25 2.31
25 26 27
2 3 2 2 3 1
2 2
2
2 1
Tier 1
Weighted average tier
9
2
4
3
2.11
28
9 8 8 9 10 11 7
2 2 2 2 2 2 2
4 3 3 3 3 2 1
3 3 2 2 2 5 4
2.11 2.13 2.25 2.44 2.60 2.64 2.29
28 30 31 32 33 34 35
3 9 11 9 7 8 4 4 8 7 4 5 5 5 3 3
2 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
5 5 4 3 3 2 2 1 1 1 1 1 1 1 1
3 3 3 2 2
2.00 2.22 2.55 2.44 2.43 2.63 2.25 2.25 2.63 2.57 2.25 2.80 2.80 2.80 2.00 2.33
36 37 38 30 40 41 42 42 44 45 46 48 48 48 50 51
4 6 3 4
1 1 1 1
2.75 3.33 2.33 2.75
52 53 54 55
1 2 3 2
1
6 5 2 1 1 1 1
2 1 1 2 1 1
2 2 2 1
1 1
2 3 2 2
1
1
International Journal of Accounting Information Systems Tax Adviser Taxes—The Tax Magazine Advances in International Accounting Management Accounting Accounting Educators Journal Research in Accounting Regulation Journal of International Accounting, Auditing and Taxation Journal of Accounting, Economics and Finance Journal of Taxation Advances in Taxation Advances in Management Accounting Journal of Corporate Taxation Critical Perspectives on Accounting Managerial Auditing Journal Tax Law Review Journal of Cost Management Accounting Historians Journal Accounting Systems Journal Accounting, Auditing and Accountability CMA Magazine EDP Auditor Research on Accounting Ethics Accounting, Management and Information Technologies Financial Analysts Journal Ohio CPA Journal Accounting Enquiries Journal of Accounting and Computers
Table 6 (Continued ) Journal
Number of departments
Number of times rated in Tier 1
4 6 6 7 5 5 4 7 7 5 3 3 3 3 4 6 4
Tier 2
1 4 3 3 3 3 3 2 2 2 2 2 2 2 2 1 1
5 3 3 4 5 3 3 4 3
1 1 1 1 1 1 1 1 1
3 3 4
1 1 1
Sorted rank
2.75 2.50 2.50 2.86 2.60 2.60 2.50 2.71 3.00 2.80 2.33 2.33 2.67 2.67 3.00 3.00 2.75
55 57 58 59 60 60 62 63 64 65 66 67 68 68 70 71 72
1 2 1 1 2 2
3.00 2.67 2.67 3.00 3.20 3.00 3.00 3.25 3.33
73 74 74 76 77 78 78 80 81
2 2 3
3.33 3.33 3.50
81 81 84
Tier 3
Tier 4
2 1 3 2 1 1
1 1
5 3 2 1 1
4 3 3 2 2 2 2 1 1 1
2 1 1 1 2 1
1 1 2 1
1
Tier 5
Journal of Corporate Accounting and Finance Internal Auditor Government Accountants Journal Accounting and Finance Financial Accountability and Management International Tax Journal Advances in Accounting Education Advances in Public Interest Accounting Internal Auditing Journal of Partnership Taxation Advances in Accounting Behavioral Research Journal of Accounting and Finance Research Information Systems Audit and Control Journal of Real Estate Taxation CA Magazine Tax Executive Journal of International Financial Management and Accounting Journal of Cost Analysis British Accounting Review Taxation for Accountants Taxation for Lawyers Practical Accountant Accountancy European Accounting Review Estates, Gifts, and Trust Journal Accounting Education: An International Journal Pacific Accounting Review Public Finance and Accountancy Georgia Journal of Accounting
Weighted average tier
4 4
1
3
3.50 3.00
84 86
1 2 1 1 1 1 1
3.20 3.33 3.00 3.25 3.25 3.33 3.33 3.33
87 88 89 90 91 92 92 92
2 2 3 1 3
3.50 3.67 3.75 4.00 4.00
95 96 97 98 99
4
5 6 3 4 4 3 3 3
4 4 3 3 3 2 2 2
4 3 4 3 3
2 1 1 1
1
Notes: (1) This list shows journals that appeared in the rankings used by at least three accounting departments. Journals appearing on fewer than three lists are not included in the table. Rankings were created by sorting the number of tier 1–3 rankings received. To break ties in the number of tier 1–3 rankings received, journals listed more frequently as tiers 4 and 5 are ranked lower. (2) “Number of departments” represents the number of departments that includes a specific journal in its rankings.
National Public Accountant Journal of Public Budgeting, Accounting, and Financial Management Tax Notes Review of Accounting Information Systems Studies in Accounting and Finance Review of Quantitative Finance and Accounting Tax Lawyer Bank Accounting and Finance Corporate Accounting Petroleum Accounting and Financial Management Journal Oil and Gas Tax Quarterly Massachusetts CPA Review Connecticut CPA Journal New Accountant Woman CPA
Table 7 Top accounting journals—Ph.D. programs
Columns: Journal; Number of departments; Number of times rated in Tiers 1–5; Weighted average tier; Sorted rank.
Journals, in ranked order: The Accounting Review; Journal of Accounting and Economics; Journal of Accounting Research; Contemporary Accounting Research; Auditing: A Journal of Practice and Theory; Journal of Accounting, Auditing and Finance; Journal of the American Taxation Association; Accounting, Organizations and Society; Journal of Accounting and Public Policy; Journal of Management Accounting Research; Behavioral Research in Accounting; National Tax Journal; Journal of Information Systems; Review of Accounting Studies; Accounting Horizons; Research in Governmental and Nonprofit Accounting; International Journal of Intelligent Systems in Accounting, Finance and Management; Journal of Accounting Literature; Issues in Accounting Education; Journal of Accounting Education; The Journal of Accountancy; International Journal of Accounting Education and Research; Journal of Business Finance and Accounting; Advances in Accounting; Accounting & Business Research; International Journal of Accounting Information Systems; Management Accounting; Tax Adviser; Accounting Educators Journal; Advances in International Accounting; Taxes—The Tax Magazine; Tax Law Review; CPA Journal; Journal of Cost Management; Journal of International Accounting, Auditing and Taxation; Abacus; Advances in Taxation; Journal of Taxation; Advances in Management Accounting; Critical Perspectives on Accounting; Accounting and Finance; Financial Accountability and Management; Government Accountants Journal; Internal Auditor; Journal of Corporate Taxation; Journal of Partnership Taxation; Research in Accounting Regulation; Internal Auditing; Accounting Historians Journal; Advances in Public Interest Accounting.
Notes: (1) This list shows journals that appeared in the rankings used by at least three accounting departments. Journals appearing on fewer than three lists are not included in the table. Rankings were created by sorting the number of tier 1–3 rankings received. To break ties in the number of tier 1–3 rankings received, journals listed more frequently as tiers 4 and 5 are ranked lower. (2) “Number of departments” represents the number of departments that includes a specific journal in its rankings.
Table 8 Top accounting journals—non-Ph.D. programs
Columns: Journal; Number of departments; Number of times rated in Tiers 1–5; Weighted average tier; Sorted rank.
Journals, in ranked order: The Accounting Review; Journal of Accounting Research; Journal of Accounting and Economics; Contemporary Accounting Research; Journal of the American Taxation Association; Accounting, Organizations and Society; Journal of Accounting, Auditing and Finance; Journal of Management Accounting Research; Auditing: A Journal of Practice and Theory; Behavioral Research in Accounting; Abacus; Journal of Information Systems; Issues in Accounting Education; Research on Accounting Ethics; Accounting Horizons; National Tax Journal; Journal of Business Finance and Accounting; Journal of Accounting Literature; Journal of Accounting and Public Policy; Taxes—The Tax Magazine; Tax Adviser; Journal of International Accounting, Auditing and Taxation; International Journal of Accounting Information Systems; Managerial Auditing Journal; Accounting Historians Journal; Journal of Taxation; The Journal of Accountancy; Accounting & Business Research; Journal of Corporate Taxation; Advances in International Accounting; Advances in Management Accounting; Research in Accounting Regulation; Journal of Accounting Education; CPA Journal; Advances in Accounting; Internal Auditor; Government Accountants Journal; Advances in Public Interest Accounting; International Journal of Accounting Education and Research; Journal of Cost Management; Journal of Corporate Accounting and Finance; Financial Analysts Journal; Advances in Taxation; Accounting Educators Journal; Research in Governmental and Nonprofit Accounting; Ohio CPA Journal; Management Accounting; Critical Perspectives on Accounting; Accounting, Auditing and Accountability; Tax Executive; Journal of Cost Analysis; International Tax Journal; Accountancy; Estates, Gifts, and Trust Journal; Accounting and Finance; Public Finance and Accountancy; Public Budgeting and Finance; Petroleum Accounting and Financial Management Journal; National Public Accountant; Journal of Accounting and Computers; Internal Auditing; Georgia Journal of Accounting; EDP Auditor; Corporate Accounting; CMA Magazine; CA Magazine; Accounting Education: An International Journal; Bank Accounting and Finance; Review of Accounting Information Systems; Oil and Gas Tax Quarterly; Connecticut CPA Journal.
Notes: (1) This list shows journals that appeared in the rankings used by at least three accounting departments. Journals appearing on fewer than three lists are not included in the table. Rankings were created by sorting the number of tier 1–3 rankings received. To break ties in the number of tier 1–3 rankings received, journals listed more frequently as tiers 4 and 5 are ranked lower. (2) “Number of departments” represents the number of departments that includes a specific journal in its rankings.
Table 6 shows the 99 top-rated journals in accounting, Table 7 shows the top 50 journals based on rankings used in doctoral programs, and Table 8 shows non-doctoral-granting programs' top 70 journals.3 Four journals stand out as the top-rated in all three lists—The Accounting Review (TAR), Journal of Accounting Research (JAR), Journal of Accounting and Economics (JAE) and Contemporary Accounting Research (CAR). All departments that rate TAR, JAR and JAE put them in the top tier, making them clearly the elite accounting journals. CAR does not fare as well; four departments ranked it in the second tier.
Some intriguing differences arose in how doctoral and non-doctoral programs use the rankings. For example, all three doctoral-granting programs that included International Journal of Intelligent Systems in Accounting, Finance and Management (IJISAFM) in their rankings called it a top-tier journal, while two of the four responding non-doctoral programs ranked IJISAFM in the third tier. Similarly, most doctoral-granting programs rank Review of Accounting Studies (RAS) in the top tier; a few, however, place it in the third tier. RAS appears on only two non-doctoral-granting program lists, and thus is not included in our composite journal rankings for that group. Had we included it in that list, it would rank as a tier 2 rather than a tier 1 journal. Eight of nine responding doctoral-granting programs rank Auditing: A Journal of Practice and Theory (AJPT) as a top-tier journal, but only four of eight non-doctoral-granting programs rank it in the top tier. These differences suggest that departments with doctoral programs are more likely than other departments to rank specialized journals in the top tier, probably because doctoral-granting departments usually have better-defined faculty specializations and thus need specific incentives to attract and retain such expertise. A journal's affiliation with the AAA also seems to produce a halo effect, as both doctoral and non-doctoral programs rank AAA-affiliated journals in the top tier. However, lower-tier, non-doctoral programs often rank high-quality, non-AAA journals such as RAS and IJISAFM much lower than do doctoral programs with strong research agendas.
While the contrast between RAS' recognition at doctoral and non-doctoral programs is interesting, equally interesting is RAS' rapid rise to the top among doctoral programs. The composition of its editorial board and its rapid ascendancy exemplify the structure and control of the accounting academy. Eighty-one percent of its editorial board graduated from or work at the 15 elite programs that Williams and Rodgers (1995) identified, and eight of the nine persons listed as editors either graduated from or work at the elite 15. Thus, the elite 15 continue to dominate both the setting of the agenda for accounting scholarship and the determination of the type of scholarship that the accounting academy values (Lee, 1997; Williams and Rodgers, 1995). We speculate that programs that rank RAS in the second and third tiers are unfamiliar with the journal and its affiliation with the elite. Use of journal rankings is concentrated among non-elite accounting programs, and seems to serve as a motivational tool for faculty to publish in elite journals and thereby move closer to those elite programs that set the research agenda for the accounting academy (Hasselback and Reinstein, 1995).
Viewing journal rankings in that light, we speculate that, like JAE, which made a similarly rapid ascendance to the top of the ranks among all programs, RAS will continue to move up the ranks.
3 In total, only 99 journals were included on at least three separate journal-ranking lists. Fifty journals appeared on at least three journal-ranking lists submitted by doctoral-granting programs. Seventy journals appeared on at least three journal-ranking lists submitted by non-doctoral-granting programs.
While we did not ask chairs to describe how their departments developed their journal rankings, one department volunteered its process. First, it surveyed faculty serving on its Promotion and Tenure Committee to elicit their opinions of journal quality, using median scores as a starting point. Second, it examined published studies of journal rankings, notably Brown and Huefner (1994) and Hasselback et al. (2000), to ascertain whether these more wide-ranging surveys were consistent with its preliminary results; despite some minor differences, general agreement emerged on the overall rankings. Third, the department benchmarked leading research-oriented state universities to obtain the journal rankings they use in the promotion and tenure process. Nine of those universities used processes similar to the case school's, but only two had developed an actual list, and both cautioned that programs still must use judgment in applying any such list. Nevertheless, all the schools agreed on the "A" tier journals in the case school's final list. Considered in light of the elite 15's dominance in setting the research agenda, the process this case school used strongly suggests that journal rankings are not necessarily the product of an independent, objective assessment of journal quality. Instead, they are a consequence of the structure of the accounting academy and a reflection of what becomes socially acceptable as a result of that structure.
4. Discussion
Our study differs from prior journal-ranking studies in both its methodology and its findings. Regarding methodology, we solicited actual accounting department journal rankings to produce composite rankings for all listed accounting journals; Brown and Huefner (1994), Hull and Wright (1990), Johnson et al. (2002) and others based their rankings on faculty and department chair perceptions. We also ascertained the purposes for which departments use those lists, verifying that relatively few accounting departments (13%) used them for promotion, tenure, merit pay and other similar decisions.
Johnson et al. (2002), the most recently published study in this area, aptly highlights the differences and similarities between our study and others. Providing about 600 department chairs with a list of 33 accounting and MIS journals, Johnson et al. asked them to classify the journals into "A" or "B" classes and to list how many class "A" and class "B" articles faculty would need to meet their programs' research requirements. All 33 journals had been cited as class "A" in the prior literature, including nine periodicals (Decision Science, MIS Quarterly, Information Systems Research, Communications of the ACM, Journal of Computer Information Systems, IEEE Transactions, Journal of Public Economics, Journal of Strategic Information Systems and Journal of Economic Psychology) that are not generally included in studies that rank accounting journals. While these are high-quality journals, our rankings do not include them. However, our rankings cover a much broader range of accounting journals and reflect rankings that accounting departments actually use. Rather than constrain chairs with a predefined list of journals, we derived a composite list from the journal rankings they themselves use for personnel decisions within their departments.
While Johnson et al. (2002) and our findings produced similar lists for the top three journals—TAR, JAR and JAE—differences between the two studies arise once we compare
journals beyond the top three. For example, CAR, which Johnson et al. (2002) rank as number eight, is consistently among the top five journals in our rankings. Similarly, IJISAFM ranks in the top tier among doctoral programs in our study, but it is not among the 33 journals listed in Johnson et al. (2002). Furthermore, Johnson et al. (and others) rate AOS as a very high-level journal, yet it is not among the top five in any of our composite rankings; Journal of the American Taxation Association consistently ranks ahead of AOS in all our composite rankings, and doctoral programs at U.S. schools ranked AJPT higher than AOS.
Despite the statistical significance of AACSB accreditation in our analysis, we note that more than 70% of departments with separate accounting accreditation indicated that they do not use formal journal rankings to evaluate their faculty research. This is intriguing, given the importance of documented processes and criteria in AACSB accreditation reviews. Intuitively, most AACSB-accredited departments would routinely develop such rankings, if for no other reason than to reduce the complexity of the accreditation review process. However, comments received from department chairs suggest that such lists are suspect and might be viewed as expedient, self-serving documents that do not necessarily reflect the quality, rigor and integrity of published articles.
We were surprised to receive only 19 lists of journal rankings, given that the AAA sponsored our mailing—especially when a second mailing, this time sent directly by the AAA, produced no additional responses. Any conclusion we draw from this lack of response must, of course, be speculative. But taken with other facts that emerge in our discussion, the small response fits into a pattern: a reluctance to use ranking lists appearing in published scholarship, and an aversion to admitting that any ranking system imposed from without can be accurate—a sort of adaptation of the bromide that "all politics is local." In some cases, the localization becomes so extreme that the three or four people on the committee making the rank, tenure and promotion decisions generate a list of journals ad hoc; such a list could change as the committee membership changes. In other cases, the ranking is never written down—it is a kind of [gentleman's] agreement. In light of the conclusions we draw from the total responses, we speculate, finally, that the small response could almost have been predicted from the other data.
The results suggest that accounting departments use journal rankings as a means of reducing the complexity of their processes for evaluating faculty scholarship. This is evidenced by the statistically significant difference between the number of faculty in departments that use journal rankings and the number in those that do not use such rankings. Since size is typically an indicator of complexity, it is not surprising that larger departments are more likely to have and use journal rankings.
While accounting program administrators may find this study useful, many writers urge caution in evaluating accounting faculty research productivity. For example, Zeff (1988, 1996) advised those assessing the quality of accounting journals to consider that new, high-quality journals may have begun publishing articles relatively recently.
Buchheit et al. (2002) show that the accounting discipline has considerably fewer top-tier articles than the other three major business disciplines (finance, management and marketing), after considering the number of faculty members in each discipline and the number of article slots in these top-tier publications. Thus, accounting faculty members may be at a comparative disadvantage when their research records are compared with those of their colleagues in other disciplines, especially when their institutions demand that they publish in top-tier journals.
Additionally, Ettredge and Wong-on-Wing (1991) found that the chances of publication in the top 20 accounting research journals declined significantly from 1970 to 1988, owing to the smaller number of main articles in these publications, particularly given the large growth in the number of accounting academics.
5. Limitations
Like all prior studies measuring faculty research productivity or ranking programs, this study has limitations. First, our relatively small sample of departments that provided journal rankings may be unrepresentative of all departments that use such rankings. On the other hand, our research suggests that the vast majority of accounting departments do not formalize their journal rankings, so it is plausible that not many departments beyond those in our sample actually have a list of ranked journals. Second, despite using a sorting process based on all the rankings assigned to individual journals, our composite rankings are sensitive to the number of responding departments. To address this concern, we sought to validate our rankings by using a clustering technique (K-means clustering)4 to see whether there were obvious inconsistencies between our rankings and the group into which each journal clusters. We observed no major inconsistencies. A third limitation involves the scope of the paper. We rank only accounting journals, thus excluding potentially high-quality scholarship such as notes, commentaries, monographs and other works published in non-accounting journals. As Christensen et al. (2002) found, many accounting scholars publish much of their work in non-accounting journals. A further scope limitation concerns the geographic regions represented. Ballas and Theoharakis (2003) report major differences in faculty perceptions across geographic regions, with North America and Asia falling in one sphere of influence and Europe and Australia/New Zealand falling in another. Clearly, our study is most applicable to the North America/Asia sphere of influence. Nonetheless, our study should provide useful input for accounting departments seeking to develop more defensible and consistent measures of their faculties' publication quality, given each program's distinct research mission and resources.
4 K-means clustering categorizes a set of objects into groups by maximizing the distance between clusters. It is the reverse of doing an analysis of variance in the sense that the groups are unknown a priori and the technique creates groups that maximize (through an iterative process) the between-group differences.
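The clustering check described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes the scikit-learn library, uses three clusters as a stand-in for broad quality groupings, and feeds in illustrative (not actual) weighted-average-tier values for a handful of journals identified by the abbreviations used in the text.

import numpy as np
from sklearn.cluster import KMeans

# Illustrative weighted-average-tier values (hypothetical, for demonstration only).
journals = ["TAR", "JAR", "JAE", "CAR", "AJPT", "Accounting Horizons", "CPA Journal"]
weighted_avg_tier = np.array([1.0, 1.0, 1.0, 1.1, 1.1, 1.7, 3.0]).reshape(-1, 1)

# Partition the journals into three clusters based on their weighted average tier.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(weighted_avg_tier)

# A journal whose cluster disagrees sharply with its position in the sorted
# composite ranking would signal an inconsistency worth investigating.
for name, tier, label in zip(journals, weighted_avg_tier.ravel(), km.labels_):
    print(f"{name:<20} weighted average tier = {tier:.2f}  cluster = {label}")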
6. Conclusions
We show that few accounting departments, whether at elite or non-elite programs, use formal journal-ranking lists to evaluate faculty scholarship. Departments with doctoral programs and separate AACSB accounting accreditation are more likely than other departments to use journal rankings. Elite and non-elite Ph.D.-granting programs use journal rankings very differently. Elite programs and programs that graduated the heaviest concentration of contributors to the accounting literature either use no journal rankings or consider only a small number of highly selective journals. Journal rankings are most heavily concentrated among non-elite Ph.D.-granting programs. As expected, relative to elite programs, non-elite programs that use journal rankings generally include a much wider range of journals in their lists. However, their lists of top journals almost always include the elite three, regardless of the program's mission, resource base, emphasis on teaching or research orientation.
Our study supports Lee's (1997, 1999) findings: while the 10 responding Ph.D.-granting programs use ranked publication lists averaging 39 journals to assess faculty, and non-Ph.D.-granting programs consider an average of 74 journals, the 2 elite Ph.D.-granting programs in our sample consider no more than 6 journals, journals that they control. We also offer a conjecture as to why doctoral students at non-elite doctoral programs are much more familiar with the accounting literature than their counterparts at elite programs, as Schwartz et al. (2005) found: perhaps, relative to elite programs, faculty at non-elite schools value a wide range of scholarship and journals and, in turn, expose their doctoral students to a similarly broad range of journals and scholarship.
Feedback received from accounting program chairs suggests that most departments assess the quality of a published article in many ways besides formalized lists of ranked journals. For example, accounting departments (a) rely on the blind review process used by refereed journals; (b) rely informally on their perception of journal quality; (c) rely on an informal faculty review of, and judgment on, the quality of individual published articles; and (d) rely on a formal review process that uses designated in-house or external reviewers. Some accounting program chairs view journal rankings as a source of conflict that could reduce innovation and place professional accounting programs at a disadvantage relative to other business disciplines. They therefore prefer to distance themselves from using journal rankings to assess the quality of faculty publications.
We also observe that academic accounting is taking on the characteristics of the elite, which tends to discount the value of applied scholarship, the scholarship of integration and the scholarship of teaching. Briefly, while the rigor and resources required to publish in elite journals have risen, many programs still ask or "demand" that their faculty publish in such journals. This comes at a time when the a priori probability of publishing in most elite journals has declined very significantly over time. For example, during the 1950s TAR published around 25 articles per issue; that figure has fallen to about 8 during the 2000s, despite the substantial increase in the number of doctorally qualified faculty members relative to the 1950s. Since then, TAR has also ceased publishing Correspondence, Educational, Small Sample Studies and Literature Reviews. Moreover, most non-elite programs simply cannot afford the substantial teaching-load reductions, computer support and research assistants needed for their faculty to compete for the few available slots in these elite journals. We find that elite programs demand that their faculty publish in five or six elite journals. Regardless of their missions, many other programs consider the same elite journals as their discipline's top journals.
Thus, elite programs—intentionally or unintentionally—have induced many aspiring accounting academics to focus on fewer and fewer journals, which can impede fulfillment of the broad research needs of accounting academe and the profession.
Appendix A
References
Association to Advance Collegiate Schools of Business (AACSB). Eligibility procedures and standards for business accreditation. St. Louis, MO: AACSB; 2004.
Baker CR, Bettner MS. Interpretive and critical research in accounting: a commentary on its absence from mainstream accounting research. Critic Perspect Acc 1997(8):293–310.
Ballas A, Theoharakis V. Exploring diversity in accounting through faculty journal perceptions. Contemp Acc Res 2003;20(3):619–44.
Beattie V, Ryan RJ. The impact of non-serial publications on research in accounting and finance. Abacus 1991(March):32–50.
Brinn T, Jones MJ, Pendlebury M. Why do UK accounting and finance academics not publish in top US journals? Brit Acc Rev 2001;33(2):223–32.
Brown LD. Influential accounting articles, individuals, PhD granting institutions and faculties: a citational analysis. Acc Organ Society 1996;21:262–77.
Brown LD. Ranking journals using SSRN downloads. Rev Quant Finance Acc 2003;20(3):291–307.
Brown LD, Huefner RJ. The familiarity with and perceived quality of accounting journals: views of senior accounting faculty in leading U.S. MBA programs. Contemp Acc Res 1994(Summer):223–50.
Buchheit S, Collins D, Reitenga A. A cross-discipline comparison of top-tier academic journal publication rates: 1997–1999. J Acc Educ 2002;20(2).
Cargile BR, Bublitz B. Factors contributing to published research by accounting faculties. Acc Rev 1986(January):158–78.
Christensen AL, Finger C, Latham C. New accounting scholars' publications in non-accounting journals. Issues Acc Educ 2002;17(3):233–51.
Chua W. Radical developments in accounting thought. Acc Rev 1986;61(4):601–32.
Chung KH, Pak HS, Cox RAK. Patterns of research output in the accounting literature: a study of the bibliometric distributions. Abacus 1992;28(2):168–85.
Demski JS, Dopuch N, Lev B, Ronen J, Searfoss G, Sunder S. A statement on the state of academic accounting. Statement to the Research Director of the American Accounting Association, Sarasota, FL; 1991.
Dwyer PD. Gender differences in the scholarly activities of accounting academics: an empirical investigation. Issues Acc Educ 1994(Fall (2)):231–46.
Dyl EA, Lilly MS. A note on institutional contributions to the accounting literature. Acc Organ Society 1985;10(2):171–5.
Englebrecht TD, Iyer GS, Patterson DM. An empirical investigation of the publication productivity of promoted accounting faculty. Acc Horizons 1994;8(1):45–68.
Ettredge M, Wong-on-Wing B. Publication opportunities in accounting research journals: 1970–1988. Issues Acc Educ 1991;6:239–47.
Fogarty TJ. A ranking to end all rankings: a meta-analysis and critique of studies ranking accounting departments. Acc Perspect 1995;1:1–15.
Fogarty TJ, Ruhl JM. Institutional antecedents of accounting faculty research productivity: a LISREL study of the best and brightest. Issues Acc Educ 1997;12(Spring):27–38.
Gamble GO, O'Doherty B. How accounting academicians can use citation indexing and analysis for research. J Acc Educ 1985a(Fall):123–44.
Gamble GO, O'Doherty B. Citation indexing and its use in accounting: an awareness survey and departmental ranking. Issues Acc Educ 1985b:28–40.
Hall TW, Ross WR. Contextual effect in measuring accounting faculty perceptions of accounting journals: an empirical test and updated journal rankings. Advances Acc 1991:161–82.
Hasselback JR. Accounting faculty directory. Englewood Cliffs, NJ: Prentice-Hall; 2001.
Hasselback JR, Reinstein A. A proposal for measuring scholarly productivity of accounting faculty. Issues Acc Educ 1995;10(2):269–306.
Hasselback JR, Reinstein A, Schwan ES. Toward the development of benchmarks to assess the academic research productivity of accounting faculty. J Acc Educ 2000(Spring):79–97.
Heck JL, Jensen RE, Cooley PL. An analysis of contributors to accounting journals. Part I. The aggregate performances. Int J Acc 1990:202–17.
Heck JL, Jensen RE, Cooley PL. An analysis of contributors to accounting journals. Part II. The individual academic accounting journals. Int J Acc 1991:1–15.
Hexer JH. Publish or perish—a defense. Public Interest 1969(Fall (17)):60–77.
Howard TP, Nikolai LA. Attitude measurement and perceptions of accounting faculty publication outlets. Acc Rev 1983(October):765–76.
Hull RP, Wright GB. Faculty perceptions of journal quality: an update. Acc Horizons 1990(March):77–98.
Johnson PM, Reckers PMJ, Solomon L. Evolving research benchmarks. Advances Acc 2002;19:235–43.
Jolly SA, Schroeder RG, Spear RK. An empirical investigation of the relationship between journal quality ratings and promotion and tenure decisions. Acc Educators' J 1995(Fall):47–68.
Kida T, Mannino RC. Job selection criteria of accounting PhD students and faculty members. Acc Rev 1980(October):491–500.
Kirchmeyer C, Reinstein A, Hasselback JR. Relational demography and career outcomes among academic accountants. Advances Acc Behavioral Res 2000;3:177–97.
Lee T. Shaping the U.S. academic accounting research profession: the American Accounting Association and the social construction of a professional elite. Critic Perspect Acc 1995;6(3):241–61.
Lee T. The editorial gatekeepers of the accounting academy. Acc Aud Accountability 1997;10(1):11–30.
Lee T. Anatomy of a professional elite: the Executive Committee of the American Accounting Association. Critic Perspect Acc 1999;10(6):247–64.
Lee T, Williams P. Accounting from the inside: legitimizing the academic accounting elite. Critic Perspect Acc 1999;10(6):867–95.
MacRoberts MH, MacRoberts BR. Problems of citation analysis: a critical review. J Am Soc Inform Sci 1989(September):342–9.
McRae TW. A citational analysis of the accounting information network. J Acc Res 1974(Spring):80–92.
Morris JL, Cudd RM, Crain JL. A study of the potential bias in accounting journal ratings: implications for promotion and tenure decisions. Acc Educators' J 1990(Fall):46–55.
Ostrowsky BA. First-time accounting faculty: the job search, acceptance and support processes. Issues Acc Educ 1986(Spring):48–55.
Petry G, Settle J. A comprehensive analysis of worldwide scholarly productivity in selected U.S. business journals. Quart Rev Econ Bus 1988(Autumn):88–104.
Rama DV, Raghunandan K, Logan LB, Barkman BV. Gender differences in publications by promoted faculty. Issues Acc Educ 1997;12(Fall):353–65.
Read WJ, Rama DV, Raghunandan K. Are publication requirements for accounting faculty promotions still increasing? Issues Acc Educ 1998;12(Fall):327–39.
Reiter S, Williams J. The structure and progressivity of accounting research: the crisis in the academy revisited. Acc Organ Society 2002;27:575–607.
Schroeder RG, Payne DD, Harris DG. Perceptions of accounting publications outlets. Acc Educators' J 1988(Fall):1–17.
Schwartz BN, Williams S, Williams PF. US doctoral students' familiarity with the accounting journals: insights into the structure of the U.S. academy. Critic Perspect Acc 2005;16:327–48.
Seetharaman A, Islam MQ. Assessing the relative quality of accounting journals: the use of citation analysis. In: Proceedings of the Southwest Regional AAA Meeting; 1995. p. 53–60.
Smith LM. Relative contributions of professional journals to the field of accounting. Acc Educators' J 1994(Spring):1–31.
Sriram RS, Gopalakrishnan V. Ranking of doctoral programs in accounting: productivity and citational analysis. Acc Educators' J 1994(Spring):32–53.
Streuly CA, Maranto C. Accounting faculty research productivity and citations: are there gender differences? Issues Acc Educ 1994(Fall):247–58.
Williams P, Rodgers J. The accounting review and the production of accounting knowledge. Critic Perspect Acc 1995;6(3):263–87.
Zeff SA. A study of academic journals in accounting. Acc Horizons 1996(September):158–77.
Zeff SA. The surge of academic journals in academe—boon or bane? Acc Educ News 1988(May):8–9.
Zivney TL, Bertin WJ. Publish or perish: what is the competition really doing. J Finance 1992(March):295–329.
Zivney TL, Bertin WJ, Gavin WJ. A comprehensive examination of accounting faculty publishing. Issues Acc Educ 1995(Spring):1–25.