Measurement and analysis of Chinese journal discriminative capacity


Journal of Informetrics 14 (2020) 101000


Regular article

Baolong Zhang a,b, Hao Wang a,b,*, Sanhong Deng a,b, Xinning Su a,b

a School of Information Management, Nanjing University, Nanjing, 210023, China
b Jiangsu Key Laboratory of Data Engineering and Knowledge Service, Nanjing, 210023, China

* Corresponding author at: School of Information Management, Nanjing University, Nanjing, 210023, China. E-mail address: [email protected] (H. Wang). https://doi.org/10.1016/j.joi.2019.101000

Article info

Article history: Received 29 July 2019; Received in revised form 8 December 2019; Accepted 9 December 2019

Keywords: Journal discriminative capacity; Journal difference; Journal evaluation; Hierarchical clustering; PCA dimension reduction

Abstract

This study proposes a method for measuring the discriminative capacity of journals to evaluate and analyze differences in research content among academic journals. A total of 100 Chinese academic journals from five disciplines are selected as research objects. The discriminative capacities of journals in library, information, and archival science in 2017 and in 2013–2017 are calculated and analyzed using bibliographic data (titles, abstracts, and keywords). The characteristics of the discriminative capacities of journals within different time spans are compared based on changes in the spatial distribution and discriminative capacity of the journals. The discriminative capacity of journals published from 2013 to 2017 is determined and ranked to explore annual change rules over time. The discriminative capacity of journals in five disciplines is calculated and discussed, and the differences among disciplines are measured and analyzed using the mean discriminative capacity. The experimental results reveal that the discriminative capacities of journals in library, information, and archival science vary markedly, with journals in archival science having the highest discriminative capacity, followed by journals in information and library science. In addition, the journals show distinct characteristics in different time spans. The discriminative capacity of journals shows clear change rules over time and varies in its characteristics among disciplines. © 2019 Elsevier Ltd. All rights reserved.

1. Introduction

Assessment of academic journals has long been an important research topic. An objective and comprehensive assessment of academic journals is significant because it strongly influences the evaluation of scholars and institutions (Zhang, 2017). The evaluation of journals mainly focuses on influence, quality, and reputation (Cockriel & McDonald, 2018; Raj & Zainab, 2012; Rousseau, 2002; Xie, Wu, & Li, 2019). Previous studies provide a number of methods and indicators to measure the quality of journals, such as the journal impact factor (Garfield, 2006), PrestigeRank (Su, Pan, Zhen, & Ma, 2011), the journal's integrated impact index (Ma, Wang, Dong, & Cao, 2012), and the sub-impact factor (Xu, Liu, & Rousseau, 2015). However, studying journals only from these perspectives is not sufficiently comprehensive, and differences between journals also need to be examined. Each journal initially has its own research interests, which are an essential characteristic of the journal, yet the research focus of a journal gradually changes over time.


Some journals maintain their research characteristics during development, while others, particularly those in the same field, vary only slightly in research content (research topic, research object, research method, etc.) and may show homogeneous tendencies. These trends may be detrimental to the diversity of academic journals and the innovation of academic research. Consequently, journals need to be evaluated from the perspective of difference, which plays an important role in maintaining the characteristics of journals and in controlling their development direction. Measuring differences in journals is therefore a problem that needs to be resolved.

In information retrieval, Salton and Yang (1973, 1975) conceived of term discriminative capacity, which describes the importance of indexing terms, and proposed the term discrimination model to measure it. The core idea is to calculate the discrimination value of terms by measuring the change in document space density and to evaluate the quality of terms as retrieval words (Pushpalatha & Raju, 2010). Although the method entails high computational complexity and may not be suitable for large-scale data and large-grained academic objects (e.g., journals), its research idea helps analyze differences in journal research content quantitatively. The approach also provides a reliable theoretical basis for the quantitative assessment of differences in journals. Therefore, this study extends the idea of the term discrimination value to academic journals to reveal the differences among journals and to identify unique and distinct journals by measuring their research content. The purpose is to provide a reference for guiding diversified development and for counteracting the homogenization trend observed in academic journals.

This study thus evaluates academic journals from a new perspective (i.e., the difference in research content) and presents a new index, the journal discriminative capacity (JDC), together with a method for measuring the difference in research content among academic journals. The JDC is defined as the overall difference between a journal and all other journals within a predefined group and the extent of that difference. As a relative value, the JDC can only be calculated within a given group. A journal with a higher JDC differs more from the other journals within the group and has more distinct research content. The JDC method provides a reasonable and effective means for the quantitative analysis of differences in academic journals, allowing the measurement of differences in journals and the detection of unique individual journals. In addition, the JDC method complements the theory of journal evaluation and content analysis.

The proposed method is applied in an empirical analysis of library, information, and archival science (LIA) journals and of journals in a multidisciplinary group. Titles, keywords, and abstracts are used as data sources, and the terms extracted by data processing serve as the experimental corpus. First, the JDC calculation method is improved by optimizing the weights of the terms. The discriminative capacities of the journals in LIA are then calculated with this technique to analyze and evaluate the degree of variation in their research content. Next, the change trend in the JDC of LIA journals from 2013 to 2017 is determined. Finally, the characteristics of the JDCs in five disciplines are analyzed and discussed to explore the change rules of the JDCs within different time periods and to identify the characteristics of discriminative capacity among disciplines.

2. Related research

Currently, the evaluation of academic journals mainly focuses on their quality, reputation, and influence. The evaluation methods adopted include peer review and scientometric measurement, among others (Abramo & D'Angelo, 2011; Južnič, Pečlin, & Žaucer, 2010). However, owing to problems such as strong personal subjectivity (Bornmann & Daniel, 2009), peer review has gradually been replaced by scientometric measurement, which mainly uses single or compound indexes to evaluate journals quantitatively. Common indexes include the impact factor and its derivatives (Garfield, 1998; Moed, 2010; Vinkler, 2008), the H index and its derivatives (Braun, Glänzel, & Schubert, 2006; Egghe, 2006; Prathap, 2009; Woeginger, 2008), PageRank-like indexes (Cheang, Chu, Le, & Lim, 2014; González-Pereira, Guerrero-Bote, & Moya-Anegón, 2010; Guerrero-Bote & Moya-Anegón, 2012), and comprehensive evaluation indexes (Haddawy, Hassan, Asghar, & Amin, 2016; Kulczycki, 2017; Yu, Chen, & Pan, 2009; Zhang, Liu, & Xu, 2011). These indicators mainly rely on usage data (such as downloads) or citation data to analyze journals quantitatively; consequently, they cannot reveal differences in the research content of academic journals. Some studies also use diversity indicators (Bojović et al., 2014; Shen, Chen, Yang, & Wu, 2019) to evaluate and analyze differences in journals. These indicators are quantitative measures that reflect the number of subdisciplines in a journal. However, the degree of individual difference within a journal group is difficult to gauge using diversity indicators.

Content analysis of academic journals has been performed in numerous studies. In the early stages, most of these studies focused on the quantitative analysis of the research topics, viewpoints, and methods in journals through descriptive statistics (Buttlar, 1991; Järvelin & Vakkari, 1990, 1993; Kumpulainen, 1991; Peritz, 1980). Järvelin and Vakkari (1993) performed one of the most sophisticated content analyses and systematically investigated the research topics, viewpoints, and research methods in LIS. However, that work did not pay sufficient attention to the differences in research content (e.g., topics, methods) among journals. More recently, many studies have discussed such differences. Aharony (2012), Allen, Weber, and Howerton (2018), and Yoon, Bang, and Woo (2016) analyzed the research contents of journals and revealed differences in research topics. Chu (2015), Zhang, Zhao, and Wang (2016), and Zhang, Wang, and Zhao (2017) compared and analyzed, quantitatively and qualitatively, the differences in the research methods applied in journal articles. Content analysis, descriptive statistical analysis, and temporal analysis were mostly used to discuss these differences. These studies mainly focused on differences in a specific aspect of research content and did not further examine the differentiation level of an individual journal within its group.


Fig. 1. Flow chart of analysis.

Differences in journals have been studied using various scientometric approaches. Frequently used techniques include hierarchical clustering (HC) for grouping items in dendrograms and multidimensional scaling (MDS) for visualizing items in two- or three-dimensional maps (White & McCain, 1997, 1998). Tseng and Tsay (2013) and Gómez-Núñez, Vargas-Quesada, and de Moya-Anegón (2016) used cited references for hierarchical clustering to analyze the differences among journals. Leydesdorff and Rafols (2012), Tseng and Tsay (2013), and Leydesdorff, Bornmann, and Zhou (2016) further revealed differences in journal subdisciplines by MDS mapping. Clustering journals by HC and MDS usually yields similar clusters, but MDS mapping can further analyze the differences of individual journals and classify them into groups according to their distribution in space. Wolfram and Zhao (2014) and Wang and Wolfram (2015) used the disciplines of citing articles to evaluate the similarities among journals with MDS combined with hierarchical cluster analysis and principal component analysis, and discussed the similarity relationships among groups using MDS maps. These studies suggest that cited references (or the disciplines of cited references) can reflect the research content of journals to a certain extent; however, the ability of references to reveal research content (such as topics) is limited. Moreover, these studies focus on the analysis of differences among journal groups (Leydesdorff, Bornmann, & Wagner, 2017) or subgroups (subdisciplines) on the basis of dendrograms, MDS maps, or PCA maps (Bojović et al., 2014); the differences, and the degree of difference, of individual journals within a group are rarely measured and discussed.

In summary, with regard to journal evaluation, most evaluation indexes and methods are based on usage data and citation data, which cannot be used to measure differences in journal research content. With regard to journal content analysis, most studies lack a discussion of differences among journals. In hierarchical clustering and MDS mapping of journals, most studies focus on the differences between groups of journals and do not further study the differences, and the degree of difference, between individual journals within groups. Accordingly, this study proposes the JDC index to analyze and evaluate differences in journal content. This study also provides a scientific and reasonable calculation method that offers a more operational and practical analysis and evaluation of journal differences than current approaches. The proposed method also offers an idea for the development and improvement of the system for evaluating academic journals.

3. Data and methods

3.1. Research framework

This study calculates the discriminative capacity of academic journals from the difference perspective and analyzes the differences in academic journals deeply and comprehensively with respect to vertical individual differences, horizontal trends, and different disciplines. The research framework consists of three parts, as shown in Fig. 1. The first part consists of data acquisition and pre-processing. First, the journal objects are screened from the Chinese Social Science Citation Index (CSSCI) and the Chinese Science Citation Database (CSCD). The bibliographic data are then downloaded from the China National Knowledge Infrastructure (CNKI) database, and the titles, keywords, and abstracts are extracted from the bibliography as the core data.
Subsequently, cleaning operations, such as de-duplicating the acquired data and eliminating invalid records, are performed.


On this basis, the sets of terms are formed after word segmentation and removal of stop words. The second part is the calculation, which consists of (i) selecting the corresponding term subset from the term set as the experimental sample and then transforming it into the corresponding triple < Journal number, term, weight >; (ii) constructing the journal–term matrix and transforming it into a journal–journal matrix (JJM) by cosine similarity; and (iii) calculating the journal space density to measure the JDC. The third part is the analysis. First, this study analyzes the differences among LIA journals in accordance with the calculation, discusses the JDC change trends in LIA journals from 2013 to 2017, and finally conducts a comparative analysis of the discriminative capacity among the different disciplines.
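For concreteness, the calculation part of the framework can be sketched as follows. This is an illustrative reading rather than the authors' code; the function names and the toy journals and terms are assumptions.

```python
# Illustrative sketch: <journal number, term, weight> triples -> journal-term matrix (JTM)
# -> journal-journal matrix (JJM) via cosine similarity.
import numpy as np

def build_jtm(triples, journals, terms):
    """triples: iterable of (journal_no, term, weight) tuples."""
    j_index = {j: r for r, j in enumerate(journals)}
    t_index = {t: c for c, t in enumerate(terms)}
    jtm = np.zeros((len(journals), len(terms)))
    for journal_no, term, weight in triples:
        jtm[j_index[journal_no], t_index[term]] = weight
    return jtm

def build_jjm(jtm):
    """Cosine similarity between every pair of journal (row) vectors."""
    norms = np.clip(np.linalg.norm(jtm, axis=1, keepdims=True), 1e-12, None)
    unit = jtm / norms
    return unit @ unit.T

# Toy usage with three journals and three terms
triples = [(1, "archive", 2.0), (1, "record", 1.0),
           (2, "library", 1.5), (2, "record", 0.5),
           (3, "library", 1.0), (3, "archive", 0.2)]
jjm = build_jjm(build_jtm(triples, journals=[1, 2, 3], terms=["archive", "record", "library"]))
print(jjm.round(3))
```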

3.2. Data preprocessing

This study intends to analyze the differences in the research content of academic journals by measuring the JDC. Thus, high-quality journals from different disciplines should be selected as research objects. CSSCI and CSCD, widely recognized as authoritative and comprehensive databases for scholarly citations in Chinese, include a large number of high-impact and high-quality journals. CSSCI (Su, Deng, & Shen, 2014) mainly covers journals in the humanities and social sciences, selected by the Social Science Evaluation Center of Nanjing University. The disciplines by which journals are classified in CSSCI follow the Classification and Code of Disciplines (GB/T 13745–2009) (Standardization Administration of China, 2009), the classification standard of the Degree Office of the State Council (Degree Office of the State Council, 2011), and the Chinese Library Classification (5th ed.) (Editorial Board of Chinese Library Classification, 2010). CSCD (Jin & Wang, 1999), the counterpart of SCI in China, mainly covers journals in science, technology, engineering, and mathematics. The discipline categories in the CSCD database are based on the Chinese Library Classification (5th ed.) and are determined by calculating the coupling strength between subjects through reference relationships; the interdisciplinary citation relationships constructed from the citation network in the CSCD database are then used to determine the disciplinary affiliation of each journal. The method of grouping journals by discipline in CSSCI and CSCD is widely recognized by the academic community, which ensures the rationality of the predefined journal groups in the current study.

Owing to the large discrepancy in the number of journals covered by different discipline categories in CSSCI and CSCD, the disciplines to examine are determined based on two rules: (i) the number of journals in a discipline covered by CSSCI or CSCD has increased or decreased by no more than five in the last five years, so the inclusion status of covered journals is relatively stable; and (ii) the number of journals in a discipline is roughly equal to that in LIA science, that is, between 20 and 25, to avoid the potential impact of a discrepancy in journal numbers on the sample analysis and findings. Four journal disciplines (art, law, biology, and aerospace) were finally determined as additional research objects (LIA journals are the primary research object of this study) according to the areas of the disciplines (humanities, social sciences, sciences, and technology). Subsequently, the top 20 journals with a high impact factor in each discipline were selected as the experimental subjects. The list of selected journals appears in Appendix Table A1.

The appropriate selection of data material for the experiment is crucial. The full text would be the best material for measuring and analyzing differences in research content; however, its large size and sparsity make it impractical. Meanwhile, the title, keywords, and, particularly, the abstract are highly condensed and can efficiently reveal the content of the full text (Mack, 2012). They have a higher information density and a smaller volume of data compared with the full text. Many studies have shown that titles, abstracts, and keywords can be used for topic mining and content analysis (Tokarz & Bucy, 2019; Weismayer & Pezenka, 2017; Xu & Liu, 2018).
Accordingly, in the present study, the title, abstract, and keywords, which can easily be obtained from the bibliographic data, are regarded as the most appropriate material. Thus, the bibliographic data of 100 journals in five disciplines, published from 2013 to 2017, are obtained from the CNKI database. After data acquisition, preprocessing is required:

1) Cleaning and sampling of bibliographic data. Some fields, such as the abstract and keyword fields, may be empty in the original bibliography. Most of these records are non-research articles, such as conference notices, and are removed as invalid data. The number of valid records per journal varies from 46 to 805 after cleaning. Accordingly, 40 records are randomly selected as experimental samples from the valid records of each journal for each year.

2) Chinese text segmentation and deletion of stop words. The titles, keywords, and abstract texts are segmented using the Natural Language Processing and Information Retrieval (NLPIR) word segmentation system. The segmented words are then filtered based on the stop-word lists from Harbin University of Technology, the Machine Intelligence Laboratory of Sichuan University, and Baidu.com. The filtered words are regarded as terms. Notably, English words such as "iSchool" and "CiteSpace" are also regarded as terms and are thus retained.
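A hypothetical sketch of this preprocessing pipeline is shown below; the segment() function and the stop-word set merely stand in for the NLPIR segmenter and the merged stop-word lists, and the record field names are assumptions, not the authors' implementation.

```python
# Hypothetical preprocessing sketch: drop invalid records, sample 40 per journal-year,
# segment the text, and remove stop words.
import random

STOP_WORDS = {"的", "和", "以及", "the", "of"}   # illustrative stand-in for the merged lists

def segment(text):
    # Placeholder for the NLPIR word segmentation system used in the paper.
    return text.split()

def clean_and_sample(records, per_year=40, seed=2017):
    """Keep records with non-empty abstract and keywords, then sample up to 40 of them."""
    valid = [r for r in records if r.get("abstract") and r.get("keywords")]
    random.seed(seed)
    return random.sample(valid, min(per_year, len(valid)))

def extract_terms(record):
    """Merge title, abstract, and keywords into one term list, filtering stop words."""
    text = " ".join([record["title"], record["abstract"], " ".join(record["keywords"])])
    return [w for w in segment(text) if w not in STOP_WORDS]
```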

3.3. Calculation of journal discriminative capacity

In this study, JDC is mainly used to describe the difference in research contents among different journals. Its core objective is to measure the change in density in a journal space after a journal is removed from the space. The specific calculation processes are as follows:

(1) Construction of journal space


To measure the change in density, a journal space should first be set up. In this study, the high-dimensional space composed of JJM is defined as a journal space, such as (1):





$$\mathrm{JJM} = \begin{pmatrix} S_1 \\ \vdots \\ S_i \\ \vdots \\ S_m \end{pmatrix} = \begin{pmatrix} s_{11} & \cdots & s_{1m} \\ \vdots & s_{ij} & \vdots \\ s_{m1} & \cdots & s_{mm} \end{pmatrix} \tag{1}$$

where JJM is an m*m matrix; m represents the number of row vectors in the journal space, that is, the total number of journals; S_i = {s_ij | i, j ∈ m} is the journal vector; and s_ij represents the cosine similarity of journals i and j. Cosine similarity is calculated as shown in Formula (2):

$$\mathrm{Similarity}(C_i, C_j) = s_{ij} = \frac{\sum_{t=1}^{n} c_{it} \times c_{jt}}{\left( \sum_{t=1}^{n} c_{it}^{2} \times \sum_{t=1}^{n} c_{jt}^{2} \right)^{1/2}} \tag{2}$$

where Ci and Cj represent the term vectors, and all term vectors constitute a journal–term matrix (JTM), as shown in (3):





$$\mathrm{JTM} = \begin{pmatrix} C_1 \\ \vdots \\ C_i \\ \vdots \\ C_m \end{pmatrix} = \begin{pmatrix} c_{11} & \cdots & c_{1n} \\ \vdots & c_{it} & \vdots \\ c_{m1} & \cdots & c_{mn} \end{pmatrix} \tag{3}$$

where JTM is an m*n matrix; c_it and c_jt are the term weights; and n is the total number of unique terms contained in the m journals. JTM is transformed from the triples of < No., term, term weight >. The term weight is usually indicated by Presence (i.e., 1 for appearance, 0 for non-appearance), TF (term frequency), or TFIDF (term frequency – inverse document frequency), but the most suitable one has yet to be verified. TFIDF is calculated using Formula (4):

$$\mathrm{TFIDF} = \mathrm{TF} \times \mathrm{IDF} = tf_t \times \lg\left( m/m_t + \varepsilon \right) \tag{4}$$

where tf_t is the frequency of term t appearing in journal i; m_t (m_t ≠ 0) is the number of journal term vectors in which term t appears; and ε is an adjusting factor. In this study, ε = 1/m_t, which ensures that the IDF is not zero when m = m_t.

(2) Calculation of journal space density

The average similarity between all journal vectors and the journal space center (named Centroid) is defined as the journal space density (JSD), which reflects the aggregation of the journal collection in space. The similarity is measured by distance-based similarity, and the distance is the Euclidean distance from the journal vector S_i to the Centroid. The calculation method is shown in Formula (5):



$$\mathrm{Dist}(S_i, \mathrm{Centroid}) = \left( \sum_{j=1}^{m} \left( s_{ij} - \frac{1}{m} \sum_{i=1}^{m} s_{ij} \right)^{2} \right)^{1/2} \tag{5}$$

Centroid is calculated using Formula (6):

$$\mathrm{Centroid} = \left( \frac{1}{m} \sum_{j=1}^{m} s_{1j},\; \frac{1}{m} \sum_{j=1}^{m} s_{2j},\; \ldots,\; \frac{1}{m} \sum_{j=1}^{m} s_{mj} \right) \tag{6}$$

Thus, the similarity between the journal vector S_i and the Centroid can be calculated by distance-based similarity, as shown in Formula (7) (Zhang & Korfhage, 1999):

$$DS(S_i) = \frac{1}{c^{\,\mathrm{Dist}(S_i,\, \mathrm{Centroid})}}, \quad c = 1.3 \tag{7}$$

where c is a constant that affects the extent to which distance influences similarity. When c = 1.3, ideal similarity is obtained. Further, average similarity is used to determine JSD, as expressed in Formula (8):



$$\mathrm{JSD} = \frac{1}{m} \sum_{i=1}^{m} DS(S_i) \tag{8}$$


Table 1
Journal discriminative capacities based on three term weights.

No.   Presence   TF       TFIDF
1     2.787      3.130    6.991
2     1.007      2.838    6.941
10    4.116      1.860    0.937
7     1.012      0.843    0.502
15    3.004      0.206    0.495
6     0.332      0.868    0.447
18    0.262      0.429    0.443
4     0.303      1.778    0.421
5     −0.229     0.529    0.347
13    1.493      1.183    0.341
9     −0.229     0.768    0.313
12    1.555      1.663    0.266
3     0.722      1.498    0.153
11    0.906      0.361    0.132
8     −0.194     −0.157   0.044
19    −0.331     −0.291   −0.122
16    −0.329     −0.539   −0.170
14    −0.769     −0.356   −0.252
17    −0.064     0.057    −0.271
20    −0.357     −0.647   −0.412

(3) Calculation of journal discriminative capacity

When a journal vector in the journal space is removed, a new journal space is formed, resulting in a change in JSD. This change in density is then used to determine the difference between the journal and other journals. The calculation method is shown in Formula (9):

$$\mathrm{JDC}_{S_k} = \frac{\mathrm{JSD}_{S_k} - \mathrm{JSD}}{\mathrm{AVG\_JSD}} \tag{9}$$

where JSD_{S_k} represents the JSD calculated with journal S_k removed (the calculation method is the same as Formula (8)), and AVG_JSD represents the average difference in the density of the journal space, which is mainly used to standardize the difference in density. The absolute value of the difference is used to prevent positive values from counteracting negative values. The calculation method is shown in Formula (10):

$$\mathrm{AVG\_JSD} = \frac{1}{m} \sum_{k=1}^{m} \left| \mathrm{JSD}_{S_k} - \mathrm{JSD} \right| \tag{10}$$
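Putting Formulas (4)–(10) together, a minimal end-to-end sketch is given below. This is our illustrative reading rather than the authors' implementation: the variable names are assumptions, and the distance-to-similarity conversion of Formula (7) is interpreted as 1/c^Dist.

```python
# Illustrative sketch of the JDC calculation (Formulas (4)-(10)).
# `counts` is a journals-by-terms matrix of raw term frequencies.
import numpy as np

def tfidf_weights(counts):
    """Formula (4): TFIDF = tf * lg(m/m_t + eps), with eps = 1/m_t."""
    m = counts.shape[0]
    m_t = np.count_nonzero(counts, axis=0).astype(float)  # journals containing term t
    m_t[m_t == 0] = 1.0                                    # unused terms: avoid division by zero
    idf = np.log10(m / m_t + 1.0 / m_t)
    return counts * idf

def cosine_jjm(jtm):
    """Formulas (1)-(3): journal-journal cosine-similarity matrix from the JTM."""
    unit = jtm / np.clip(np.linalg.norm(jtm, axis=1, keepdims=True), 1e-12, None)
    return unit @ unit.T

def journal_space_density(jjm, c=1.3):
    """Formulas (5)-(8): centroid, Euclidean distance, distance-based similarity, JSD."""
    centroid = jjm.mean(axis=0)
    dist = np.linalg.norm(jjm - centroid, axis=1)
    ds = 1.0 / np.power(c, dist)   # assumed reading of the Zhang & Korfhage conversion
    return ds.mean()

def jdc(jjm):
    """Formulas (9)-(10): leave-one-journal-out change in density, normalized by AVG_JSD."""
    m = jjm.shape[0]
    jsd = journal_space_density(jjm)
    deltas = np.empty(m)
    for k in range(m):
        keep = [i for i in range(m) if i != k]
        deltas[k] = journal_space_density(jjm[np.ix_(keep, keep)]) - jsd
    return deltas / np.abs(deltas).mean()

# Toy usage: random term counts for 5 journals and 50 terms
rng = np.random.default_rng(0)
counts = rng.integers(0, 5, size=(5, 50)).astype(float)
print(jdc(cosine_jjm(tfidf_weights(counts))).round(3))
```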

4. Results and analysis

4.1. Calculation of journal discriminative capacity and analysis of library, information, and archival science journals

Preprocessing of the bibliographic data of LIA journals collected in 2017, 2015–2017, and 2013–2017 resulted in 5525, 9203, and 11557 terms, respectively. These terms are used as the experimental data for JDC calculation and analysis.

4.1.1. Setting of term weights

The description factor of the journal, that is, the term weight, needs to be preset to construct a JJM. However, different term weights vary in their ability to describe journal contents, and the term weight with the strongest descriptive ability needs to be identified. Thus, the JDC is determined based on three weights using the experimental data of the LIA journals in 2017, and the results are listed in Table 1. To verify the rationality of the different weights, the data in Table 1 are plotted as a boxplot, as shown in Fig. 2.

Fig. 2 shows that the numerical distributions of the JDCs based on the three term weights vary markedly. No outliers appear in the TF-based JDC boxplot, whereas one outlier appears in the Presence-based boxplot (Data Analysis and Knowledge Discovery, No. 10) and two outliers appear in the TFIDF-based boxplot (Archives Science Bulletin, No. 1, and Archives Science Study, No. 2). Archival science currently differs from library science and information science, which have a considerable degree of cross-integration and only slight differences in research content. However, the overall differences in the TF-based JDC results are not significant, as seen in the boxplot; this indicates that TF cannot fully describe the differences in journal research content. In the Presence-based JDC results, the JDC of No. 10 (belonging to information science) is larger than those of the archival science journals, which is clearly inconsistent with common sense; this indicates that this weight cannot accurately describe the differences in research content. The TFIDF-based JDC results are the most consistent with the relatively independent research status of archival science within LIA.


Fig. 2. Journal discriminative capacity boxplot based on three weights. The vertical axis represents the value of the JDC. The numbers in the figure indicate the journal numbers.

Fig. 3. JDC scatter diagram of 20 journals in library, information, and archival science. The vertical axis represents the value of the JDC.

Therefore, Presence and TF are not suitable for JDC measurement, and TFIDF is the most reasonable basis for calculating the JDC. The TFIDF-based JDC results are thus used in the experimental analysis in the remainder of the study.

4.1.2. Journal difference analysis based on JDC

The TFIDF-based JDC results in Table 1 are plotted as a scatter diagram (Fig. 3) to analyze the differences and JDC characteristics of the journals. Meanwhile, principal component analysis (PCA) offers a more intuitive way to detect journal discrepancies: it visualizes the journal space in two dimensions (Fig. 4) and thus supports an auxiliary analysis based on the spatial aggregation of the journals. As shown in Fig. 3, most LIA journals have a positive JDC, and only five journals have a negative JDC. This observation indicates that most LIA journals maintain certain research characteristics, and their research content varies markedly.

1) The JDC of LIA journals is evidently layered. Journals in archival science have the highest JDC, whereas journals spanning library and information science have the lowest and most negative JDC. As presented in Fig. 4, the journals in archival science are distributed independently at the edge of the journal space (area ①) and have the highest degree of dispersion. Journals in library science and information science are distributed in the periphery (area ②), whereas journals spanning library and information science are concentrated in area ③, which corresponds to the JDC results. This finding indicates that current research in archival science is significantly different from research in library and information science, and its research features appear rather distinct.


Fig. 4. Visualization of the spatial distribution of 20 journals in library, information, and archival science. The numbers in the figure indicate the journal numbers.

The research contents of journals in library and information science are closely related, and the research fields of some journals largely overlap and may even be homogenized to a certain extent. The distributions of Nos. 8 and 18 deviate, as shown in Fig. 4, which may be attributed to the distortion introduced by PCA dimensionality reduction.

2) The JDC features of representative academic journals are distinct. ① A wide gap exists between the JDCs of the archival science journals (No. 1, No. 2) and those of the other journals, indicating that these two journals exhibit distinct disciplinary characteristics. From another perspective, however, this may also reflect relatively closed archival research that is not closely related to library and information science. In addition, the JDCs of the two journals (No. 1, No. 2) are almost the same, and a tendency to "homogenize" is observed. ② The JDC of Data Analysis and Knowledge Discovery (No. 10) is the highest and most distinct among the journals of library and information science, showing that the journal presents more distinctive specialized features, which may be related to its stronger involvement in computer science, data science, and related fields. ③ As first-class journals in LIA, the Journal of the China Society for Scientific and Technical Information (No. 7) and the Journal of Library Science in China (No. 20) have JDCs at opposite ends. The JDC of the former ranks fourth, suggesting that its journal features are very clear and that its research content is distinct and integrated with more emerging research, which may be a driving force for research innovation in this journal. The JDC of the latter is the lowest among all journals, suggesting that the journal is devoted to studying the core issues of LIA; its academic research has a certain depth but lacks breadth, reflecting its important position in LIA.

On the basis of the aforementioned analysis, the archival science journals in LIA have the highest discriminative capacity and the most distinct research content. Journals in information science and library science have a high degree of cross-integration, and the difference in research content between information science and library science is relatively small. The two journals No. 7 and No. 20 exhibit distinct individual characteristics, which also confirms their core position in the field.

4.1.3. Comparative analysis of journal discriminative capacity in different time spans

The JDC calculated using the data for a particular year can only represent the difference status of the journals in that year, which may be accidental. To comparatively analyze the JDCs of LIA journals over a longer time span, sampling data for three years (2015–2017) and five years (2013–2017) are used.

(1) Comparative analysis of JDCs in one-year and five-year periods

The JDCs of LIA journals from 2013 to 2017 are determined, as shown in Fig. 5. To sharpen the contrast between the one-year and five-year JDCs, a two-dimensional distribution of the journal space is used for auxiliary analysis. As shown in Fig. 5, journals with a positive JDC still constitute the majority, indicating that most LIA journals maintain good differentiation within a five-year time span. However, this result is rather different from the JDC in the one-year period. ① The difference in JDC between journals in archival science and those in library and information science has narrowed. A comparison between Figs. 3 and 5 indicates that the difference in JDC between journals in archival science and the other journals is smaller.
This finding suggests that the longer the time span, the weaker the research features of archival journals and the smaller the difference in research content.


Fig. 5. JDC of LIA journals in the five-year period.

Fig. 6. Two-dimensional visualization of the spatial distribution of LIA journals in the one-year and five-year periods.

② The degree of dispersion of journals in library science and information science has increased, and the JDC layering has become clearer. As shown in Fig. 6, the journal distribution in 2013–2017 exhibits increased dispersion, and the JDC gap widens. Moreover, the JDCs of No. 7, No. 9, and No. 10 in information science increase and are stratified with respect to the other journals. ③ The individual differences between journals are fully presented. The JDCs of No. 7 and No. 9 in 2013–2017 increase, and their differences in research content are emphasized; meanwhile, the JDCs of No. 8 and No. 18 decrease to negative values, their differences in research content decrease, and their comprehensive features are strengthened.

(2) Ranking analysis of journal discriminative capacity in different time spans

The JDC rankings vary across time spans. Changes in the JDC rankings of LIA journals with different time spans are analyzed, as shown in Fig. 7, with the JDC ranking in 2013–2017 used as the baseline. Fig. 7 shows the following. ① The JDC rankings in 2017 change markedly; however, these changes become relatively small in the 2015–2017 period. This behavior shows that as the time span increases, the research features of journals are obscured and the degree of difference between journals decreases; thus, the fluctuation of JDC rankings is relatively mild. In addition, the JDC ranking in 2017 is close to the ranking in the 2013–2017 period, indicating that the discriminative capacities of journals within these two time spans are highly similar. ② The journals with the highest and the lowest JDC rankings change only slightly across time spans, indicating that the differences in their research content are relatively stable. However, the JDC rankings of most journals in library science and information science vary markedly, suggesting that the degree of difference between these journals is not stable and that the research contents of these journals have distinct temporal characteristics, which may change significantly as the time span increases. Overall, the degree of difference between journals changes with the time span, but the overall change is not considerably large. One-year JDC results can therefore reflect, to a certain extent, the differences between journals within a larger time span.

4.2. Change trend analysis of journal discriminative capacity in library, information, and archival science

The analysis in the previous section shows that JDCs possess certain temporal attributes. Whether the journals follow change rules with respect to time has yet to be determined. In this section, annual change rules are explored by measuring the JDCs in different years. Sample data of LIA journals from 2013 to 2017 are used to calculate the JDCs for each year. The results and rankings are listed in Table 2.


Fig. 7. Ranking of the journal discriminative capacity in LIA in different time spans.

Table 2
Journal discriminative capacity and ranking from 2013 to 2017.

       2013             2014             2015             2016             2017
No.    JDC      Rank    JDC      Rank    JDC      Rank    JDC      Rank    JDC      Rank
1      7.153    2       6.794    2       6.515    1       7.151    1       6.991    1
2      7.215    1       6.953    1       6.489    2       7.068    2       6.941    2
10     0.811    3       0.425    7       0.619    5       0.648    3       0.937    3
7      0.724    4       0.714    5       0.554    7       0.436    7       0.502    4
15     0.139    9       0.748    4       0.797    3       0.053    12      0.495    5
6      0.267    7       0.173    11      0.230    11      0.149    10      0.447    6
18     −0.106   14      0.284    10      0.548    8       −0.461   18      0.443    7
4      −0.012   10      0.432    6       0.404    9       0.463    6       0.421    8
5      0.382    6       0.124    13      0.222    12      0.168    9       0.347    9
13     −0.308   17      0.130    12      0.250    10      0.548    5       0.341    10
9      0.517    5       1.006    3       0.653    4       0.574    4       0.313    11
12     −0.095   13      0.078    14      0.563    6       0.130    11      0.266    12
3      0.214    8       0.374    9       0.214    13      0.200    8       0.153    13
11     −0.063   12      0.418    8       0.099    15      0.035    13      0.132    14
8      −0.288   16      −0.340   19      −0.115   16      −0.151   15      0.044    15
19     −0.018   11      −0.105   16      0.214    14      −0.088   14      −0.122   16
16     −0.515   19      −0.322   18      −0.465   19      −0.531   20      −0.170   17
14     −0.368   18      −0.166   17      −0.124   17      −0.374   17      −0.252   18
17     −0.548   20      −0.051   15      −0.411   18      −0.303   16      −0.271   19
20     −0.259   15      −0.363   20      −0.515   20      −0.470   19      −0.412   20

The JDC for each year is determined within a different journal space, so the values cannot be directly compared across years. Consequently, the JDC ranking is used to investigate the annual change trends of the journals. Hierarchical clustering (in SPSS) is conducted on the slopes of the annual JDC ranking polylines to identify journals with similar change trends and to determine the change rules of the JDC of LIA journals over the recent five years. A dendrogram showing the result of the hierarchical clustering is presented in Fig. 8. When the threshold is set to 14, for instance, the dendrogram can be split into five clusters. However, No. 12 is manually merged with No. 15 and No. 18, not because they are more similar but because they can be shown in one subgraph (i.e., Fig. 9(d)) to reduce the number of subgraphs. Accordingly, the corresponding broken-line graphs depicting the change trends are divided into four groups, as shown in Fig. 9. The four subgraphs in Fig. 9 show the JDC ranking trends of LIA journals, corresponding to the dendrogram in Fig. 8. Fig. 9 reveals the following:

1) The JDC rankings of LIA journals in 2013–2017 show certain regularities, and their trajectories generally present four trends. ① The JDC rankings of six journals generally show a change trend shaped like a flat "M", indicating numerous changes in research topics for these journals from 2014 to 2016, as shown in Fig. 9(a). ② The JDC rankings of five journals show a relatively gentle change trend, indicating only a slight change in research field for these journals and a relatively stable development trend, as shown in Fig. 9(b). ③ The JDC rankings of six journals decline significantly in 2014 and then change slowly or remain steady, indicating that the discriminative capacities of these journals have continued to improve or remain stable in recent years, as shown in Fig. 9(c). ④ The JDC rankings of three journals fluctuate widely, indicating a lack of stability in research field and an absence of characteristic research for these journals, as shown in Fig. 9(d).

2) The JDC rankings of journals in the LIA subfields exhibit different change trends. ① The JDC rankings of the journals in archival science (No. 1 and No. 2) consistently occupy the top two positions, showing a highly stable development trend.


Fig. 8. Clustering results of slope-based journal discriminative capacity ranking trend.

Fig. 9. Annual change trend of journal discriminative capacity ranking in LIA.


Table 3
Journal discriminative capacity results in five disciplines in 2017.

LIA             Aerospace        Biology          Art              Law
No.   JDC       No.   JDC        No.   JDC        No.   JDC        No.   JDC
3     3.5610    25    0.5738     51    1.7418     77    1.6389     100   2.2509
12    3.4930    36    0.5212     48    1.4118     76    1.5110     98    2.1074
4     3.4014    26    0.4915     53    1.3071     80    1.3846     83    1.9755
18    3.3664    31    0.4789     59    1.2663     79    1.0870     85    1.7649
14    3.3133    38    0.4660     52    1.2661     61    0.9824     81    1.6050
11    3.2220    32    0.4605     60    1.1132     74    0.8579     82    1.5897
13    3.2010    34    0.3553     50    1.0311     64    0.8017     96    1.4157
15    2.3829    29    0.2564     58    0.9794     63    0.6906     86    1.3469
19    2.2497    35    0.1788     43    0.8978     68    0.2721     89    1.3424
20    2.2290    23    0.1732     49    0.7389     70    0.1755     90    1.3085
16    2.0302    28    0.1414     57    0.6651     66    0.0876     94    1.2997
17    1.1370    40    0.1342     54    0.6525     73    0.0259     87    1.2419
8     1.0582    22    −0.0548    55    0.5634     67    −0.0609    91    1.2164
5     0.8672    27    −0.1017    42    0.2673     72    −0.1979    84    1.1587
7     0.8449    33    −0.1363    45    0.2655     69    −0.2058    99    1.1437
6     0.6752    24    −0.1896    46    0.1926     71    −0.3847    93    1.1255
9     0.6415    21    −0.2541    56    0.1406     75    −0.4475    97    0.9757
2     0.5752    39    −0.2569    44    0.0385     62    −0.5236    88    0.6559
1     0.5627    30    −0.2798    47    −0.5677    78    −0.6469    95    0.5781
10    −0.3482   37    −0.4087    41    −0.6577    65    −0.7571    92    0.3200

② The JDC rankings of the journals in information science (No. 5, No. 6, No. 7, No. 8, No. 10, and No. 19) show a slightly fluctuating change trend (initially decreasing and then increasing), which indicates that the disciplinary characteristics of the journals in information science have increased in prominence in recent years and may be in a stage of innovative development. ③ The JDC rankings of the journals in library science (No. 3, No. 4, No. 11, No. 13, No. 12, and No. 15) show a sharply fluctuating change trend, and the temporal characteristics of these journals are more apparent. Some journals exhibit a declining JDC ranking in 2017, indicating a tendency toward cross-integration with information science research.

3) The JDCs of high-ranking (or low-ranking) journals exhibit a relatively stable change trend, whereas the JDCs of journals in the middle have a relatively wide change range. To illustrate, first-class journals such as No. 7 and No. 20 develop relatively stably with small fluctuations in their change trends. However, the JDCs of the middle-ranked journals No. 12, No. 15, and No. 18 have a wide change range, indicating that the academic research of these journals tends to change considerably with time and that the features of their research are underdeveloped. In addition, the JDC of No. 13 generally maintains an upward change trend, which is rather distinct, indicating that the journal is constantly strengthening the features and individuality of its academic research.

In summary, the discriminative capacity of LIA journals shows clear change rules with respect to time. With respect to discipline, journals in archival science have the smallest change in JDC, followed by journals in information science and then those in library science. From the perspective of the individual journal, the more significant the individual or comprehensive features of a journal's research content, the more stable its JDC change trend.

4.3. Comparative analysis of JDC in different disciplines

To explore the JDC characteristics of journals in different disciplines, journals in LIA, aerospace, biology, art, and law are selected as the research objects. The JDCs are then compared and analyzed with respect to discipline.

4.3.1. JDC analysis in five disciplines

To detect the JDC characteristics of different disciplines, 40 records from each of the 100 journals in the five disciplines in 2017 are randomly selected, yielding 4000 records as experimental samples. Data preprocessing and JDC calculation are then conducted, and the results are listed in Table 3. The JDCs of 81 of the journals in the five disciplines are positive, indicating that the majority of the journals in the group have good discriminative capacity and exhibit good individualized characteristics. Meanwhile, 16 of the 19 journals with negative JDCs are from the disciplines of art and aerospace, which differ significantly from the other disciplines in this respect. To explore such disciplinary differences, the journal space (100*100 dimensions) is projected onto a two-dimensional plane by PCA dimensionality reduction to observe the spatial aggregation of the journals in the five disciplines. The results are shown in Fig. 10. Fig. 10 shows that the journals in the five disciplines are roughly divided into three parts, each of which is radially ribbon-like. Law journals are concentrated in the second quadrant, which is distant from the other journals.
This observation indicates that academic research in this discipline is distinct, and law journals do not overlap with journals of other disciplines; thus, their JDCs are non-negative. Most LIA journals are in the first quadrant and are relatively highly dispersed. Some LIA journals are close to art journals and to journals of other disciplines, indicating that research in these journals may intersect with research in other disciplines; thus, their JDCs are negative. Journals in biology, aerospace, and art are distributed in the third quadrant.


Fig. 10. Spatial distribution of journals in five disciplines.

Fig. 11. Journal discriminative capacity histogram in five disciplines.

The boundaries between the disciplines are clear, but the three disciplines fall within the same area as a whole; moreover, the distance between journals within each of these disciplines is relatively small, indicating that the research contents of these three disciplines are related in certain aspects. Consequently, their JDCs are generally smaller and include larger negative values. Owing to the distortion of dimensionality reduction, this method can only roughly indicate the degree of spatial aggregation of journals, and the characteristics of journals in different disciplines need to be discussed further. For a more intuitive and in-depth analysis of the JDC features in the different disciplines, the data in Table 3 are plotted as a grouped histogram, as shown in Fig. 11.

Fig. 11 shows that the discrimination features of journals in the five disciplines are quite different. More specifically:

1) The numerical distributions of the JDCs in the five disciplines are significantly different. ① Most JDCs of LIA journals are at a high level, and obvious stratification is observed within LIA. ② The JDCs of all journals in law are positive and relatively concentrated. This result indicates that academic research in law has distinct features and is relatively independent. In addition, the gaps between most JDCs in law are considerably small, and "homogenization" may occur. ③ The JDCs in biology are at a middle level, indicating that journals in this discipline also exhibit good discriminative capacity. ④ The JDCs in art and aerospace are polarized, with the polarization in art being more prominent, indicating that the journals of these two disciplines cover a wide range of research fields and that the research contents of different journals within the same discipline vary to a certain extent.

2) Journals with higher or lower JDCs are unevenly distributed among the disciplines. ① All journals with higher JDCs are LIA journals, particularly library science journals (No. 3, No. 4, and No. 12). This finding indicates that, among the five disciplines, library science has significantly distinct academic research and characteristics. ② Journals with lower JDCs are mainly art and biology journals, indicating that these two disciplines have a wide range of research fields and may share more commonalities with other disciplines.

The aforementioned analysis shows that the JDCs in different disciplines exhibit apparent characteristics. The JDCs of half of the LIA journals are high, whereas the JDCs of all journals in aerospace are low. The JDCs of all journals in law are positive, and the JDCs of journals in art are polarized.

4.3.2. Disciplinary analysis based on journal discriminative capacity

In exploring the JDCs of different disciplines, the mean JDC is used to represent the degree of variation of a discipline. Section 4.2 also indicates that the JDC varies across time periods. Therefore, the processed data of 2017, 2015–2017, and 2013–2017 in the five disciplines are used to measure the discriminative capacity and differentiated characteristics of the disciplines within different time spans. The results are presented in Fig. 12.
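The aggregation behind Fig. 12 can be sketched with a simple grouping operation such as the hypothetical example below; the data structure and field names are assumptions, not the authors' implementation.

```python
# Illustrative sketch: mean JDC per discipline for each time span (Section 4.3.2).
# `results` maps a time span label to a list of (discipline, jdc) pairs.
from collections import defaultdict

def mean_jdc_by_discipline(results):
    summary = {}
    for span, records in results.items():
        totals, counts = defaultdict(float), defaultdict(int)
        for discipline, jdc_value in records:
            totals[discipline] += jdc_value
            counts[discipline] += 1
        summary[span] = {d: totals[d] / counts[d] for d in totals}
    return summary

# Toy usage
results = {"2017": [("LIA", 2.0), ("LIA", 1.8), ("Law", 1.3), ("Art", 0.3)]}
print(mean_jdc_by_discipline(results))
```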


Fig. 12. Journal discriminative capacity in five disciplines in different time spans.

As shown in Fig. 12, the mean JDCs of the five disciplines in 2017, arranged from highest to lowest, are as follows: LIA > law > biology > art > aerospace. LIA and law belong to the social sciences, and relatively few of their fields overlap with other disciplines, resulting in a higher discriminative capacity. The research topics of biology and aerospace journals may involve various fields, and one discipline may intersect with another, resulting in a relatively low discriminative capacity. Art belongs to the humanities, and the discriminative capacities of the humanities and the social sciences should, in theory, be relatively close; however, the research field of art is extensive and overlaps with the social sciences, so its discriminative capacity is also relatively small. The journals in the five disciplines in 2015–2017 and 2013–2017 basically have the same order of discriminative capacity: law > LIA > biology > aerospace > art. When the time span is extended from one year to three years, the mean JDC of law becomes the largest, followed by LIA, while art has the smallest mean JDC. This demonstrates that the uniqueness of research in law is fully reflected when the time span is extended, whereas the research scope of art expands with an increasing time span, considerably narrowing its difference. When the time span is extended from three years to five years, the mean JDCs of the five disciplines do not change significantly, indicating that as the time span is extended, the mean JDCs of the various disciplines gradually stabilize. However, with respect to range, the differences in discriminative capacity among the disciplines become more obvious as the time span is extended.

5. Discussion

The JDC method proposed in this study is an extension of Salton's term discrimination model. Based on the algorithm for the term discrimination value, the JDC calculation method obtains the degree of difference in journal research content within a predefined group. This measure provides theoretical support for the quantitative evaluation of differences in academic journals. By measuring and analyzing LIA journals and journals from five disciplines, the JDCs are shown to depict well the differentiated characteristics of journals in a given group and to facilitate the identification of distinct journals and homogeneous journals. These results demonstrate the effectiveness of the proposed method for the measurement and evaluation of differences in journals.

Numerous studies have compared and analyzed differences in academic journal content. These studies mainly compare and contrast journals by content analysis and descriptive statistical analysis of the research topics, methods, theories, and viewpoints of journals (Tuomaala, Järvelin, & Vakkari, 2014). This approach can efficiently analyze differences in journal content. However, these studies focused on the specific analysis of research content without considering the degree of variation among journals within groups. On the basis of such findings, only differences in particular fields can be determined; journals with unique research content (which may be innovative) and journals with a high degree of similarity in content (which may be homogeneous) are difficult to identify. The JDC method in this study can describe the degree of difference of journals within a particular group and can easily detect the discriminative characteristics of journals by determining their JDCs.
Moreover, these findings confirm that the proposed method can complement content analysis, whose theoretical usefulness has been demonstrated. Hierarchical clustering analysis can detect differences among journals. However, as indicated by the dendrogram, this technique can only determine the presence of differences among clusters; the degree of difference among clusters and between individual journals within groups cannot be evaluated. With PCA and MDS mapping, a two-dimensional visualization of the journal space can be realized, and the degree of difference of individual journals within a particular group can be observed.


As shown in Figs. 4, 6, and 10, the individual journals distributed in the periphery may vary greatly, whereas the closer a journal is to the center of the journal space, the more similar it is to the other journals. This method can effectively detect the degree of differentiation between individuals in a journal group but cannot measure that difference. The JDC method not only uses the JTM and JJM generated during hierarchical clustering analysis and multidimensional scaling analysis (or principal component analysis); it also precisely calculates the degree of difference of journals within the predefined group. Therefore, the JDC method is a deepening of hierarchical clustering analysis and MDS analysis that uses precise values to depict the degree of variation and avoids subjectivity in the analysis. The JDC method can potentially compensate for the deficiency of comparison in journal content analysis, and it can quantify the degree of difference of journals that hierarchical clustering, PCA mapping, or MDS mapping can only reflect qualitatively. The application of this method can effectively promote diversity in journal groups and innovation in academic research. It can also help journal managers control the development direction of journals and improve the academic quality and impact of journals.

6. Conclusion

Through the empirical analysis of LIA journals and of journals from five disciplines, the proposed method is shown to measure the difference of journals, which demonstrates its theoretical and practical usefulness. The experimental results indicate that the JDC calculation method can efficiently detect differences in journal research content. The most significant difference in research content is observed in archival science journals, whereas integrated library and information journals show little difference in research content. With respect to time, the JDC changes of LIA journals exhibit certain regularities: the changes for archival and information science journals are relatively subtle, whereas the trends for library science and comprehensive journals fluctuate widely. With respect to discipline, the mean JDCs of LIA and other social sciences (e.g., law) are markedly higher than those of science and engineering disciplines (e.g., biology and aerospace), and distinct disciplinary characteristics are observed.

This study gauges and analyzes differences in Chinese journals by using the JDC method. Several concerns still need to be explored further: (1) In exploring the differences between LIA journals and interdisciplinary journals, only some journals are selected as research objects, which may limit the objectivity of the study. In the follow-up study, the number of samples will be increased to explore differences in journals on a larger scale. (2) This study uses Chinese journals as the research object and explores their degree of differentiation. The discriminative capacities of English journals require further research, together with the differences in characteristics between English journals and Chinese journals published in English. These issues will be covered in the follow-up study.

Author contributions

Baolong Zhang: Conceived and designed the analysis; collected the data; performed the analysis; wrote the paper. Hao Wang: Conceived and designed the analysis; performed the analysis. Sanhong Deng: Contributed data or analysis tools. Xinning Su: Conceived and designed the analysis.

Acknowledgments

This study was supported by the National Natural Science Foundation of China (No.
71503121), as well as Nanjing University World-class Universities and World-class Disciplines Construction of Humanities and Social Sciences, the third batch of “hundred-level” scientific research projects. We would also like to express our special thanks to the editor and two reviewers for their very constructive comments and suggestions. Appendix A

Table A1
Journals list of five disciplines.

No.  Journal Name

Library, Information and Archival Science
1  Archives Science Bulletin
2  Archives Science Study
3  Journal of Academic Libraries
4  Journal of the National Library of China
5  Information Science
6  Information Studies: Theory & Application
7  Journal of the China Society for Scientific and Technical Information
8  Information and Documentation Services
9  Journal of Intelligence
10  Data Analysis and Knowledge Discovery
11  Library
12  Library Development
13  Library Tribune
14  Research on Library Science
15  Library Journal
16  Library and Information Service
17  Documentation, Information & Knowledge
18  Library & Information
19  Journal of Modern Information
20  Journal of Library Science in China

Aerospace
21  Journal of Beijing University of Aeronautics and Astronautics
22  Missiles and Space Vehicles
23  Flight Dynamics
24  Journal of Solid Rocket Technology
25  Journal of Aerospace Power
26  Acta Aeronautica et Astronautica Sinica
27  Spacecraft Recovery & Remote Sensing
28  Aerospace Control
29  Spacecraft Engineering
30  Chinese Journal of Space Science
31  Aerospace Control and Application
32  Acta Aerodynamica Sinica
33  Journal of Nanjing University of Aeronautics & Astronautics
34  Gas Turbine Experiment and Research
35  Journal of Experiments in Fluid Mechanics
36  Journal of Propulsion Technology
37  Aerospace Materials & Technology
38  Journal of Astronautics
39  Journal of Chinese Inertial Technology
40  Chinese Space Science and Technology

Biology
41  Acta Palaeontologica Sinica
42  Genomics and Applied Biology
43  Chinese Bulletin of Life Sciences
44  Acta Hydrobiologica Sinica
45  Acta Ecologica Sinica
46  Chinese Journal of Ecology
47  Biodiversity Science
48  Chinese Journal of Biotechnology
49  Progress in Biochemistry and Biophysics
50  Biotechnology
51  Biotechnology Bulletin
52  Acta Microbiologica Sinica
53  Microbiology China
54  Journal of Microbiology
55  Hereditas
56  Chinese Journal of Applied Ecology
57  Chinese Journal of Applied and Environmental Biology
58  Scientia Sinica (Vitae)
59  China Biotechnology
60  Chinese Journal of Biochemistry and Molecular Biology

Art
61  Journal of Beijing Film Academy
62  Journal of Beijing Dance Academy
63  Contemporary Cinema
64  Film Art
65  Architectural Journal
66  Art Magazine
67  Art Research
68  National Arts
69  Journal of Nanjing Arts Institute (Fine Arts & Design)
70  Literature & Art Studies
71  Theatre Arts
72  New Arts
73  Opera Art
74  Hundred Schools in Arts
75  Art & Design Research
76  Music Research
77  Art of Music (Journal of the Shanghai Conservatory of Music)
78  China Television
79  Musicology in China
80  Journal of the Central Conservatory of Music

Law
81  Journal of Comparative Law
82  Contemporary Law Review
83  Science of Law
84  Studies in Law and Business
85  Law Science
86  The Jurist
87  Legal Forum
88  Law Review
89  Chinese Journal of Law
90  Law and Social Development
91  ECUPL Journal
92  Global Law Review
93  Tsinghua University Law Journal
94  Modern Law Science
95  Administrative Law Review
96  Journal of Political Science and Law
97  Tribune of Political Science and Law
98  China Legal Science
99  Peking University Law Journal
100  Political Science and Law

References

Abramo, G., & D'Angelo, C. A. (2011). Evaluating research: From informed peer review to bibliometrics. Scientometrics, 87(3), 499–514. http://dx.doi.org/10.1007/s11192-011-0352-7
Aharony, N. (2012). Library and Information Science research areas: A content analysis of articles from the top 10 journals 2007–8. Journal of Librarianship and Information Science, 44(1), 27–35. http://dx.doi.org/10.1177/0961000611424819
Allen, E., Weber, R., & Howerton, W. (2018). Library assessment research: A content comparison from three American library journals. Publications, 6(1), 12. http://dx.doi.org/10.3390/publications6010012
Bojović, S., Matić, R., Popović, Z., Smiljanić, M., Stefanović, M., & Vidaković, V. (2014). An overview of forestry journals in the period 2006–2010 as basis for ascertaining research trends. Scientometrics, 98(2), 1331–1346. http://dx.doi.org/10.1007/s11192-013-1171-9
Bornmann, L., & Daniel, H. D. (2009). Reviewer and editor biases in journal peer review: An investigation of manuscript refereeing at Angewandte Chemie International Edition. Research Evaluation, 18(4), 262–272. http://dx.doi.org/10.3152/095820209X477520
Braun, T., Glänzel, W., & Schubert, A. (2006). A Hirsch-type index for journals. Scientometrics, 69(1), 169–173. http://dx.doi.org/10.1007/s11192-006-0147-4
Buttlar, L. (1991). Analyzing the library periodical literature: Content and authorship. College and Research Libraries, 52(1), 38–53. http://dx.doi.org/10.5860/crl_52_01_38
Cheang, B., Chu, S., Le, C., Lim, A., et al. (2014). OR/MS journals evaluation based on a refined PageRank method: An updated and more comprehensive review. Scientometrics, 100(2), 339–361. http://dx.doi.org/10.1007/s11192-014-1272-0
Chu, H. (2015). Research methods in library and information science: A content analysis. Library & Information Science Research, 37(1), 36–41. http://dx.doi.org/10.1016/j.lisr.2014.09.003
Cockriel, W. M., & McDonald, J. B. (2018). The influence of dispersion on journal impact measures. Scientometrics, 116(1), 609–622. http://dx.doi.org/10.1007/s11192-018-2755-1
Degree Office of the State Council. (2011). Classification standard by the Degree Office of the State Council. Beijing: The Ministry of Education of China.
Editorial Board of Chinese Library Classification. (2010). Chinese library classification (5th ed.). Beijing: National Library of China.

Egghe, L. (2006). Theory and practise of the g-index. Scientometrics, 69(1), 131–152. http://dx.doi.org/10.1007/s11192-006-0144-7
Garfield, E. (1998). Long-term vs. short-term journal impact: Does it matter? Scientist, 12(3), 11–12.
Garfield, E. (2006). The history and meaning of the journal impact factor. JAMA, 295(1), 90–93. http://dx.doi.org/10.1001/jama.295.1.90
Gómez-Núñez, A. J., Vargas-Quesada, B., & de Moya-Anegón, F. (2016). Updating the SCImago journal and country rank classification: A new approach using Ward's clustering and alternative combination of citation measures. Journal of the Association for Information Science and Technology, 67(1), 178–190. http://dx.doi.org/10.1002/asi.23370
González-Pereira, B., Guerrero-Bote, V. P., & Moya-Anegón, F. (2010). A new approach to the metric of journals' scientific prestige: The SJR indicator. Journal of Informetrics, 4(3), 379–391. http://dx.doi.org/10.1016/j.joi.2010.03.002
Guerrero-Bote, V. P., & Moya-Anegón, F. (2012). A further step forward in measuring journals' scientific prestige: The SJR2 indicator. Journal of Informetrics, 6(4), 674–688. http://dx.doi.org/10.1016/j.joi.2012.07.001
Haddawy, P., Hassan, S. U., Asghar, A., & Amin, S. (2016). A comprehensive examination of the relation of three citation-based journal metrics to expert judgment of journal quality. Journal of Informetrics, 10(1), 162–173. http://dx.doi.org/10.1016/j.joi.2015.12.005
Järvelin, K., & Vakkari, P. (1990). Content analysis of research articles in library and information science. Library & Information Science Research, 12, 395–421.
Järvelin, K., & Vakkari, P. (1993). The evolution of library and information science 1965–85: A content analysis of journal articles. Information Processing & Management, 29(1), 129–144. http://dx.doi.org/10.1016/0306-4573(93)90028-C
Jin, B., & Wang, B. (1999). Chinese Science Citation Database: Its construction and application. Scientometrics, 45(2), 325–332. http://dx.doi.org/10.1007/BF02458440
Južnič, P., Pečlin, S., Žaucer, M., et al. (2010). Scientometric indicators: Peer-review, bibliometric methods and conflict of interests. Scientometrics, 85(2), 429–441. http://dx.doi.org/10.1007/s11192-010-0230-8
Kulczycki, E. (2017). Assessing publications through a bibliometric indicator: The case of comprehensive evaluation of scientific units in Poland. Research Evaluation, 26(1), 41–52. http://dx.doi.org/10.1093/reseval/rvw023
Kumpulainen, S. (1991). Library and information science research 1975: Content analysis of journal articles. Libri, 41(1), 59–76.
Leydesdorff, L., & Rafols, I. (2012). Interactive overlays: A new method for generating global journal maps from Web-of-Science data. Journal of Informetrics, 6(2), 318–332. http://dx.doi.org/10.1016/j.joi.2011.11.003
Leydesdorff, L., Bornmann, L., & Wagner, C. S. (2017). Generating clustered journal maps: An automated system for hierarchical classification. Scientometrics, 110(3), 1601–1614. http://dx.doi.org/10.1007/s11192-016-2226-5
Leydesdorff, L., Bornmann, L., & Zhou, P. (2016). Construction of a pragmatic base line for journal classifications and maps based on aggregated journal-journal citation relations. Journal of Informetrics, 10(4), 902–918. http://dx.doi.org/10.1016/j.joi.2016.07.008
Ma, T., Wang, G. F., Dong, K., & Cao, M. (2012). The Journal's Integrated Impact Index: A new indicator for journal evaluation. Scientometrics, 90(2), 649–658. http://dx.doi.org/10.1007/s11192-011-0538-z
Mack, C. A. (2012). How to write a good scientific paper: Title, abstract, and keywords. Journal of Micro/Nanolithography, MEMS, and MOEMS, 11(2), 020101. http://dx.doi.org/10.1117/1.jmm.11.2.020101
Moed, H. F. (2010). Measuring contextual citation impact of scientific journals. Journal of Informetrics, 4(3), 265–277. http://dx.doi.org/10.1016/j.joi.2010.01.002
Peritz, B. C. (1980). The methods of library science research: Some results from a bibliometric survey. Library Research, 2(3), 251–268.
Prathap, G. (2009). Is there a place for a mock h-index? Scientometrics, 84(1), 153–165. http://dx.doi.org/10.1007/s11192-009-0066-2
Pushpalatha, K. P., & Raju, G. (2010). Analysis of algorithms used to compute term discrimination values. In 2010 IEEE International Conference on Computational Intelligence and Computing Research (pp. 1–6). http://dx.doi.org/10.1109/ICCIC.2010.5705844
Raj, R. G., & Zainab, A. N. (2012). Relative measure index: A metric to measure the quality of journals. Scientometrics, 93(2), 305–317. http://dx.doi.org/10.1007/s11192-012-0675-z
Rousseau, R. (2002). Journal evaluation: Technical and practical issues. Library Trends, 50(3), 418–439.
Salton, G., & Yang, C. S. (1973). On the specification of term values in automatic indexing. Journal of Documentation, 29(4), 351–372. http://dx.doi.org/10.1108/eb026562
Salton, G., Yang, C. S., & Yu, C. T. (1975). A theory of term importance in automatic text analysis. Journal of the American Society for Information Science, 26(1), 33–44. http://dx.doi.org/10.1002/asi.4630260106
Shen, Z., Chen, F., Yang, L., & Wu, J. (2019). Node2vec representation for clustering journals and as a possible measure of diversity. Journal of Data and Information Science, 4(2), 79–92. http://dx.doi.org/10.2478/jdis-2019-0010
Standardization Administration of China. (2009). The People's Republic of China national standard (GB/T 13745–2009): Classification and code of disciplines. Beijing: Chinese Standard Publishing House.
Su, C., Pan, Y., Zhen, Y., Ma, Z., et al. (2011). PrestigeRank: A new evaluation method for papers and journals. Journal of Informetrics, 5(1), 1–13. http://dx.doi.org/10.1016/j.joi.2010.03.011
Su, X., Deng, S., & Shen, S. (2014). The design and application value of the Chinese Social Science Citation Index. Scientometrics, 98(3), 1567–1582. http://dx.doi.org/10.1007/s11192-012-0921-4
Tokarz, R. E., & Bucy, R. (2019). Global information literacy: A content analysis of three journals. Global Knowledge Memory and Communication, 68(3), 242–254. http://dx.doi.org/10.1108/gkmc-05-2018-0052
Tseng, Y. H., & Tsay, M. Y. (2013). Journal clustering of library and information science for subfield delineation using the bibliometric analysis toolkit: CATAR. Scientometrics, 95(2), 503–528. http://dx.doi.org/10.1007/s11192-013-0964-1
Tuomaala, O., Järvelin, K., & Vakkari, P. (2014). Evolution of library and information science, 1965–2005: Content analysis of journal articles. Journal of the Association for Information Science and Technology, 65(7), 1446–1462. http://dx.doi.org/10.1002/asi.23034
Vinkler, P. (2008). Introducing the Current Contribution Index for characterizing the recent, relevant impact of journals. Scientometrics, 79(2), 409–420. http://dx.doi.org/10.1007/s11192-009-0427-x
Wang, F. F., & Wolfram, D. (2015). Assessment of journal similarity based on citing discipline analysis. Journal of the Association for Information Science and Technology, 66(6), 1189–1198. http://dx.doi.org/10.1002/asi.23241
Weismayer, C., & Pezenka, I. (2017). Identifying emerging research fields: A longitudinal latent semantic keyword analysis. Scientometrics, 113(3), 1757–1785. http://dx.doi.org/10.1007/s11192-017-2555-z
White, H. D., & McCain, K. W. (1997). Visualization of literatures. Annual Review of Information Science and Technology (ARIST), 32, 99–168.
White, H. D., & McCain, K. W. (1998). Visualizing a discipline: An author co-citation analysis of information science, 1972–1995. Journal of the American Society for Information Science, 49(4), 327–355. https://doi.org/10.1002/(SICI)1097-4571(19980401)49:4<327::AID-ASI4>3.0.CO;2-4
Woeginger, G. J. (2008). An axiomatic characterization of the Hirsch-index. Mathematical Social Sciences, 56(2), 224–232. http://dx.doi.org/10.1016/j.mathsocsci.2008.03.001
Wolfram, D., & Zhao, Y. H. (2014). A comparison of journal similarity across six disciplines using citing discipline analysis. Journal of Informetrics, 8(4), 840–853. http://dx.doi.org/10.1016/j.joi.2014.08.003
Xie, Y., Wu, Q., & Li, X. (2019). Editorial team scholarly index (ETSI): An alternative indicator for evaluating academic journal reputation. Scientometrics, 120(3), 1333–1349. http://dx.doi.org/10.1007/s11192-019-03177-x
Xu, F., Liu, W., & Rousseau, R. (2015). Introducing sub-impact factor (SIF-) sequences and an aggregated SIF-indicator for journal ranking. Scientometrics, 102(2), 1577–1593. http://dx.doi.org/10.1007/s11192-014-1401-9
Xu, J., & Liu, Y. B. (2018). A bibliometric analysis for global research trends on ectomycorrhizae over the past thirty years. The Electronic Library, 36(4), 733–749. http://dx.doi.org/10.1108/EL-05-2017-0104

Yoon, H. Y., Bang, H. S., & Woo, S. H. (2016). A comparative study on the logistics research between international and Korean journals. The Asian Journal of Shipping and Logistics, 32(3), 149–156. http://dx.doi.org/10.1016/j.ajsl.2016.09.003
Yu, L. P., Chen, Y. Q., Pan, Y. T., et al. (2009). Research on the evaluation of academic journals based on structural equation modeling. Journal of Informetrics, 3(4), 304–311. http://dx.doi.org/10.1016/j.joi.2009.04.002
Zhang, C., Liu, X., Xu, Y., et al. (2011). Quality-structure index: A new metric to measure scientific journal influence. Journal of the American Society for Information Science and Technology, 62(4), 643–653. http://dx.doi.org/10.1002/asi.21487
Zhang, F. L. (2017). Evaluating journal impact based on weighted citations. Scientometrics, 113(2), 1155–1169. http://dx.doi.org/10.1007/s11192-017-2510-z
Zhang, J., & Korfhage, R. R. (1999). A distance and angle similarity measure method. Journal of the American Society for Information Science, 50(9), 772–778. https://doi.org/10.1002/(SICI)1097-4571(1999)50:9<772::AID-ASI5>3.0.CO;2-E
Zhang, J., Wang, Y., & Zhao, Y. (2017). Investigation on the statistical methods in research studies of library and information science. The Electronic Library, 35(6), 1070–1086. http://dx.doi.org/10.1007/s11192-017-2555-z
Zhang, J., Zhao, Y. H., & Wang, Y. Y. (2016). A study on statistical methods used in six journals of library and information science. Online Information Review, 40(3), 416–434. http://dx.doi.org/10.1108/OIR-07-2015-0247