Publication performance of fifty top economics departments: a per capita analysis




0271-7757/86 $3.00 + 0.00
© 1986 Pergamon Press Ltd.

JOHN GOLDEN, FRED V. CARSTENSEN,* PAUL WEINER and STEVE KANE

Department of Economics, U-63, University of Connecticut, Storrs, CT 06268, U.S.A.

Abstract - Assessments of scholarly productivity have typically relied on a narrow set of journals. Relying on a broader data set (JEL abstracts) and evaluating productivity on a per capita basis, this survey confirms the standing of some highly regarded departments but finds several whose records do not match their reputations and identifies a few small departments whose records merit more respect.

PRESUMPTIVELY, the quality of a department's faculty shows itself in the faculty's publication performance. But departments with more members would, ceteris paribus, outperform departments with fewer members when measured simply by aggregate publications. Thus per capita rather than aggregate output ought to be relevant in evaluating departmental quality. Indeed, Hogan (1981) has already shown that per capita publication performance is a key input in the production of research-oriented doctorates. This note extends the analysis of per capita performance through an evaluation of the per faculty publication productivity of the top 50 economics departments (out of 91 departments with doctoral programs). We measured publication performance on the basis of the abstracts in the Journal of Economic Literature for the five-year period September 1978 to September 1983; the recent Conference Board of Associated Research Councils (CBARC) (1982) survey provided information on faculty size; the underlying data from Weiner et al. (1984) provided departmental rankings of aggregate output (see note for Table 1 for a discussion of the potential shortcomings in this procedure). The use of abstracts rather than the number of articles published in an arbitrarily defined set of 'top' journals is appealing because it both encompasses research published in a much wider variety of journals and includes many subfields typically underrepresented in major journals.

Taking such a wide perspective shows that, while some highly regarded departments stay at the top, several drop significantly and a few, perhaps surprisingly, move up. Column 1 of Table 1 lists institutions; column 2 ranks institutions on the basis of the total number of abstracts; column 3 lists per capita abstract output, based on all abstracts published in the JEL during the survey period (variant I); and column 4 ranks the top 50 departments (out of 91) on the basis of column 3. Because a significant share of abstracts in the JEL's categories 500 and 700 (business/administration, agricultural economics/resources) are likely attributable to faculty outside an economics department (e.g. business schools, agricultural economics departments), variant II in column 5 shows per capita abstracts after eliminating abstracts from the 500s and 700s. Column 6 then ranks departments on the basis of variant II. Finally, column 7 shows a ranking of institutions based upon the 1981 CBARC (1982) graduate faculty ratings in economics.
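The procedure just described - abstract counts per department, with variant II dropping JEL categories 500 and 700, divided by faculty size and then ranked with ties - can be sketched as follows. The departments, counts and faculty sizes below are invented for illustration and are not the paper's data:

```python
from collections import defaultdict

# Invented data: JEL abstract counts keyed by (department, JEL category),
# and faculty sizes from a hypothetical survey.
abstracts = {
    ("Alpha U.", 100): 40, ("Alpha U.", 500): 12,
    ("Beta U.", 100): 90, ("Beta U.", 700): 30,
    ("Gamma U.", 300): 25,
}
faculty_size = {"Alpha U.": 10, "Beta U.": 40, "Gamma U.": 8}

def per_capita(exclude=()):
    """Abstracts per faculty member, optionally excluding JEL categories."""
    totals = defaultdict(int)
    for (dept, category), count in abstracts.items():
        if category not in exclude:
            totals[dept] += count
    return {d: totals[d] / size for d, size in faculty_size.items()}

def rank(scores):
    # Competition-style ranking: a department's rank is 1 plus the number
    # of strictly better departments, so equal scores share a rank ("tie").
    return {d: 1 + sum(s > scores[d] for s in scores.values()) for d in scores}

variant_1 = per_capita()                    # variant I: all abstracts
variant_2 = per_capita(exclude=(500, 700))  # variant II: drop 500s and 700s
```

Here Alpha U. posts 5.2 abstracts per faculty member in variant I but only 4.0 in variant II, illustrating how a department heavy in the 500s slips between the two rankings.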

Table 1 suggests three observations. First, though there is a high correlation between the rankings of departments in variant I and variant II - Chicago is first in either ranking, with a leading 6.59 abstracts per faculty member in variant I, 6.30 in variant II - departments at institutions with prominent agricultural economics programs do slip significantly.

*To whom all correspondence should be addressed. [Manuscript received 12 November 1984; revision accepted for publication 10 January 1985.]

Economics of Education Review

Table 1. Publication productivity rankings for 50 economics departments

Columns: (1) institution; (2) aggregate ranking; (3) per capita abstracts, variant I; (4) rank on variant I; (5) per capita abstracts, variant II; (6) rank on variant II; (7) graduate faculty rating.

[Most numeric entries of Table 1 are illegible in this copy. Listed in order of variant I rank (column 4), the 50 departments are: Chicago; Johns Hopkins; Rochester; Stanford; Harvard; Cornell; UC - Berkeley; M.I.T.; Columbia; V.P.I.; Carnegie Mellon; Pennsylvania; U.C.L.A.; N.Y.U.; Princeton; Ohio State; Texas A&M; Minnesota; Northwestern; Wisconsin - Madison; Purdue; U.C. - Davis; Penn State; U.C. - San Diego; Yale; Michigan State; North Carolina; Illinois - Urbana; Southern Methodist; Florida; Duke; Michigan; Caltech; U.S.C.; Tulane; Rutgers; Wisconsin - Milwaukee; Indiana; U.C. - Santa Barbara; Iowa; Virginia; Texas - Austin; Brown; Missouri; Washington - Seattle; South Carolina; Kentucky; Boston College; Florida State; Wayne State. Chicago leads both variants, with 6.59 abstracts per faculty member in variant I and 6.30 in variant II.]

CBARC (1982) did not include the departments at SUNY - Buffalo, Arizona, Georgia and Tennessee in its survey; hence they are not included in the table above. These departments would have ranked 42(tie), 44, …(tie) and …(tie) respectively in aggregate output (column 2 above) had they been included in the survey. SUNY - Buffalo, which may have had the fewest departmental members among these four schools during the survey period, would then have ranked reasonably high on per capita output. Vanderbilt ranked 49th on VAR II (columns 5 and 6) with a per capita output of 1.04 abstracts; Maryland - College Park, Washington University (St. Louis) and Washington State tied for 50th ranking in VAR II (columns 5 and 6), with a per capita output of 1.00.

There are three sources of potentially significant error in this procedure. (1) Faculty size is sometimes difficult to determine; this can, understandably, have an important influence on per capita measures. (2) The abstracting process may itself be biased, or the JEL may publish abstracts selectively. We have found no evidence of either: the selection process as well as the purpose of the abstracting service is to provide comprehensive coverage of articles published in the roughly 75 scholarly journals publishing economic research. The JEL publishes all abstracts received. The only limitation in the coverage of the abstracts is that, because of financial considerations, some of the less prestigious journals were, in some years, restricted to abstracting one half of their articles. This is a minor problem and, we assume, introduces no systematic bias. (3) The authors whose articles are abstracted are not necessarily housed in an economics department. We respond to this problem with VAR II (columns 5 and 6), which excludes the two categories where the problem is the most obvious. Of course, such categorical exclusion eliminated some publications by members of economics departments and still includes publications by non-departmentally affiliated authors in fields other than agricultural economics and business administration. However, this did not seem to raise significant problems - and there was no way around it given the nature of the data base.

Sources: Column 2: original data from Weiner et al. (1984), which give aggregate output and output by each of the ten major sub-fields of economics; column 3: computed from all abstracts in JEL, September 1978 to September 1983; column 4: ranking based on column 3; column 5: same as column 3, less all abstracts in categories 500 and 700; column 6: ranking based on column 5; column 7: ranking of graduate departments by faculty quality, from CBARC (1982).

Texas A&M, Illinois - Urbana and Purdue all drop at least five positions.

Second, some prestigious departments rated highly in the graduate faculty ratings (cf. column 7), which rank high in aggregate publication output (cf. column 2), do not compare favorably when evaluated on the per capita basis: Yale, Wisconsin - Madison and Michigan - Ann Arbor all drop significantly in the per capita rankings. Third, when the many institutions are judged on per capita productivity, some smaller departments show quite pronounced publication performance: Johns Hopkins, Southern Methodist, Tulane and South Carolina are the leading examples of such small, highly productive departments.

A brief caveat is in order. It should be obvious that no single measure of departmental publication productivity can provide a fully adequate picture of departmental research performance. Most efforts have focused on articles in a narrow set of top journals; any such measure takes no account of the increasing variety of forms of professional publication and production, including monographs, original essays in collections of articles, notes and comments, and working papers. Measurements of citations to the work of economists (Davis and Papanek, 1981; Hirsch et al., 1984) and of pages recently published, between 1975 and 1981, in the journals of the Social Science Citation Index can yield quite separate ratings of faculty quality and of departmental research productivity, sensitive to significant variations in the publication characteristics of departments. This note reminds us that these separate characteristics of departmental research productivity should be identified and synthesized into a comprehensive, multidimensional standard of appraisal.

REFERENCES

CONFERENCE BOARD OF ASSOCIATED RESEARCH COUNCILS (1982) An Assessment of Research-Doctorate Programs in the United States: Social & Behavioral Sciences (Edited by JONES, L.V., LINDZEY, G. and COGGESHALL, P.E.), pp. 54-62. Washington, DC: National Academy Press.

DAVIS, P. and PAPANEK, G. (1981) Faculty ratings of major economics departments by citations. Am. econ. Rev. 71, 225-230.

HIRSCH, B.T., AUSTIN, R., BROOKS, J. and MOORE, J.B. (1984) Economics department rankings: comment. Am. econ. Rev. 74, 822-826.

HOGAN, T.D. (1981) Faculty research activity and the quality of graduate training. J. Human Resources 16, 400-415.

WEINER, P., CARSTENSEN, F. and GOLDEN, J. (1984) Recent publication performance in economics: an abstract approach. Quart. Rev. Econ. Business 24, 93-98.