Crowdsourced ranking of journals in Management; any news?

Vote for management journals

Management researchers love journal rankings. A new way to gather the judgement of thousands of voters was launched recently by Teppo Felin of Brigham Young University. Within a few weeks, more than 80,000 votes had been submitted to rank over 100 journals in the field of management. Voters can vote as many times as they like. After filtering out about 20,000 automated robot votes, an impressive 60,000 votes remained.

There are already some blog posts on this ranking in which peers discuss the pros and cons of ranking journals this way.

The crowdsourcing initiative is hosted at All Our Ideas: http://www.allourideas.org/management

Besides the obvious difficulty of comparing all these journals (a nearly impossible task), one can ask what the ‘wisdom of the crowds’ adds to the existing lists of top journals in the field.

A lot of the usual suspects turn up in the top 25 (as of today, 22/01/2011, with 80,000 votes; these are the results before the cleaning).

A score of 100 means a journal is expected to win every pairwise contest against another journal, and a score of zero means the journal is expected to lose every contest. There are no zero scores in this ranking (yet). A small sketch after the list shows how such a win score can be estimated from pairwise votes.

 

  1. JOURNAL OF APPLIED PSYCHOLOGY (score: 81)
  2. ADMINISTRATIVE SCIENCE QUARTERLY (76)
  3. ACADEMY OF MANAGEMENT REVIEW (74)
  4. ORGANIZATION SCIENCE (73)
  5. ACADEMY OF MANAGEMENT JOURNAL (72)
  6. JOURNAL OF MANAGEMENT (71)
  7. MIS QUARTERLY (70)
  8. STRATEGIC MANAGEMENT JOURNAL (68)
  9. ORGANIZATIONAL BEHAVIOR AND HUMAN DECISION PROCESSES (67)
  10. MANAGEMENT SCIENCE (66)
  11. DECISION SCIENCES (65)
  12. ORGANIZATIONAL RESEARCH METHODS (64)
  13. JOURNAL OF ORGANIZATIONAL BEHAVIOR (63)
  14. JOURNAL OF MANAGEMENT STUDIES (63)
  15. PERSONNEL PSYCHOLOGY (63)
  16. JOURNAL OF INTERNATIONAL BUSINESS STUDIES (60)
  17. RESEARCH IN ORGANIZATIONAL BEHAVIOR (60)
  18. ACADEMY OF MANAGEMENT PERSPECTIVES (60)
  19. HUMAN RELATIONS (60)
  20. LEADERSHIP QUARTERLY (59)
  21. ORGANIZATION STUDIES (59)
  22. JOURNAL OF OPERATIONS MANAGEMENT (59)
  23. HARVARD BUSINESS REVIEW (58)
  24. MIT SLOAN MANAGEMENT REVIEW (57)
  25. HUMAN RESOURCE MANAGEMENT-US (57)
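Purely as an illustration of the idea (this is not the All Our Ideas code, and the journal abbreviations and votes below are invented), the following sketch estimates a win percentage from a list of pairwise votes: for each journal, count how often it wins the head-to-head contests in which it appears.

```python
from collections import defaultdict

def win_scores(votes):
    """Estimate each journal's chance of winning a random pairwise contest.

    `votes` is a list of (winner, loser) tuples, one per submitted vote.
    Returns a dict mapping journal -> win percentage (0-100).
    The real All Our Ideas score is model-based; this raw win rate is
    only an illustration of the underlying idea.
    """
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    return {j: 100.0 * wins[j] / appearances[j] for j in appearances}

# Hypothetical votes, journal names abbreviated:
votes = [("ASQ", "AMJ"), ("AMJ", "AMR"), ("ASQ", "AMR"), ("AMR", "AMJ")]
print(win_scores(votes))  # {'ASQ': 100.0, 'AMJ': 33.3..., 'AMR': 33.3...}
```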

The initiator of the survey reported on the results today (29 January 2011):

So, what can a crowdsourced ranking tell us?  Nothing definitive is my guess, though I’m sure other rankings don’t necessarily give us a definitive signal either.  Though I do think that aggregated perceptions perhaps give us another data point when evaluating and comparing journals (along with impact and influence factors and other, more “objective” measures).  These rankings can of course mirror extant rankings (raising causal questions). But they might also capture more up-to-date anticipations of future performance. For example, the UC Irvine Law School (established in 2008) has not graduated a single student, though the school is already well within the top 50 in the crowdsourced ranking.

Lots of other questions can be raised, specifically related to a management journal ranking like this.  For example, should micro and macro journals be lumped together like this?  And certainly disciplinary journals play a large role in management – should they be included (sociology, psychology, economics)?

Strategic “gaming” of the results of course can also occur.  For example, I ended up having to delete some 25,000+ automatically generated votes (it looked like a computer script was created to throw the ranking off), votes that were explicitly cast to sabotage the effort (the African Journal of Management beat all the top journals according to this mega, robo-voter).  Though, it is interesting to see how the “crowd” essentially plays a role in averaging bias and in putting a check on strategic voting.

Ironically, I’m actually not one to necessarily really care about journal rankings like this.  I wonder whether article-effects trump journal-effects?  (I believe Joel Baum has a provocative paper on this.)  Of course I read and submit to “top” journals, but there are many “lesser” (ahem) journals that are just as much on my radar screen, for example Industrial and Corporate Change, Managerial and Decision Economics and Strategic Organization. Obsessions with journal standing can detract from ideas.

Pragmatically, yes, journal rankings matter: promotions indirectly depend on it, as do resource flows etc.  So, perhaps a “democratic,” crowdsourced ranking like this can provide additional information for decision-makers and scholars in the field.

 

The top 15 after filtering out the 20,000 automated robot votes: the picture is now even more ‘traditional’.

Rank Management journal Score (%)
1 Administrative Science Quarterly 90.43%
2 Academy of Management Journal 90.41%
3 Academy of Management Review 88.94%
4 Organization Science 88.31%
5 Strategic Management Journal 84.42%
6 Journal of Applied Psychology 84.00%
7 Management Science 82.56%
8 Journal of Management 82.46%
9 Organizational Behavior and Human Decision Processes 78.93%
10 Organizational Research Methods 74.06%
11 Journal of Organizational Behavior 72.79%
12 Personnel Psychology 71.93%
13 Journal of Management Studies 71.10%
14 Research in Organizational Behavior 70.37%
15 Organization Studies 69.68%

 

Where do all these votes come from?

The whole picture is US and EU dominated. For example, only 16 votes from Shanghai, China were recorded.

[Figure: world map of crowd votes, 22/01/2011]

The complete ranking is here as PDF file of the ranking website (22/01/2011): https://rmimr.files.wordpress.com/2011/01/all-our-ideas-crwodsourcing-management-journals-20110122.pdf


Science-Metrix develops a new journal classification system (and interactive web tool) using both the Web of Science (Thomson Reuters) and Scopus (Elsevier).

A new set of tools, recently released by Science-Metrix Inc. (first public release: 2010-12-01, v1.00), seeks to improve the way we talk about and understand science. The US/Canada-based research evaluation firm has developed a new, multilingual (18 languages!) classification of scientific journals, accompanied by an interactive web tool. The journal classification, which covers 15,000 peer-reviewed scientific journals, is translated by more than 22 international experts who volunteered their time and expertise, making the tools available to a worldwide audience. The complete journal list is available for download as an Excel sheet.

The interactive ‘Scientific Journals Ontology Explorer’ allows users to visualise the links between 175 scientific specialties (subject categories) in 18 languages, from Arabic to Swedish.

The visualization contains three different views: a circular “Subfield Citation Wheel” (representing both citations and references), a “Field Citation Wheel” (showing the links between distinct scientific disciplines) and a network “Map of Science” (revealing similarities between disciplines by relative distance). The goal of this visualization is to show how broad the universe of science is and how interlinked scientific research actually is.

How is the field of Business & Management covered in the journals list?

The field of Economics & Business (part of the domain Economic & Social Sciences) contains 822 journals (5.4% of the total Science-Metrix list) in 12 subfields; Business & Management is included with 222 journals (27%):

Every journal is classified into only one category.

Subfields of ‘Economics & Business’ N %
Accounting 32 3.9%
Agricultural Economics & Policy 27 3.3%
Business & Management 222 27.0%
Development Studies 42 5.1%
Econometrics 13 1.6%
Economic Theory 13 1.6%
Economics 244 29.7%
Finance 63 7.7%
Industrial Relations 21 2.6%
Logistics & Transportation 49 6.0%
Marketing 61 7.4%
Sport, Leisure & Tourism 35 4.3%
Total journals 822 100%

I will compare these with our ERIM Journals List, the ISI quartiles and the SJR (Scopus) quartile scores to see how the list is structured in terms of quality layers.

(I will add these details later this week.)

In the Ontology Explorer, you can create a map of science and see how the field of Business & Management connects to other subject categories. I have selected the closest fields in the screenshot below.

[Figure: Business & Management and its closest fields in the Science-Metrix Map of Science]

About Science-Metrix:

Science-Metrix is the world’s largest independent firm dedicated to scientometrics, technometrics, and science and technology (S&T) evaluation. The firm’s core business involves supporting evidence-based decision-making with strong empirical data and sound theoretical approaches. This contract research organization combines qualitative and quantitative techniques to deliver high quality program evaluations, performance and outcome assessments, and evaluation frameworks. Every Science-Metrix report is produced by a team of dedicated high-calibre experts and relies on the world-class data found in the Scopus, Web of Science and Questel databases.

SCImago Institutions Rankings (SIR): 2009 World Report; how do the Dutch institutions score?


The SCImago Research Group (SRG) has published its SIR 2009 World Report. This report ranks more than 2,000 of the best worldwide research institutions and organizations whose output surpassed 100 scholarly publications during 2007. The ranking shows five indicators of institutional research performance, stressing output (the ordering criterion), collaboration and impact. Analyzed institutions are grouped into five research sectors: Government, Higher Education, Health, Corporate and Others. The resulting list includes institutions from 84 countries across the five continents.
Publication and citation data used to build this report come from Elsevier’s Scopus database. Current coverage includes data from more than 17,000 research publications – mainly journals and proceedings – embracing the full range of scholarly research. The analyzed period runs from 2003 to 2007.
The construction of the report involves the challenging task of identifying and normalizing the 2,000 institutions across an overwhelming number of publications. The work, carried out by a mix of computer and human means, comprises the identification and gathering of institutions’ affiliation variants under a unique identifiable form, as well as the classification into research sectors.
Rank Indicators

Output: An institution’s publication output reveals its scientific outcomes and trends in terms of documents published in scholarly journals. Publication output values are affected by institution size and research profile, among other factors. This indicator forms the basis for more complex metrics. For co-authored publications, a score is assigned to each contributing institution through the authors’ institutional addresses.

Cites per Document (CxD): This indicator shows the average scientific impact of an institution’s publication output in terms of citations per document. The values express the average number of citations received by the institution’s published documents over the whole period and are affected by the institution’s research profile.

International Collaboration (Int. Coll.): This value shows the ratio of the institution’s output that has been produced in collaboration with foreign institutions. It is computed by analyzing the share of the institution’s output whose affiliations include more than one country address over the whole period.

Normalized SJR (Norm. SJR): This shows the average importance of the journals in which an institution’s output is published, measured with the SJR indicator (www.scimagojr.com). A value larger than 1 means that, on average, the institution’s papers appear in journals whose importance is above the average of their scientific field; a value smaller than 1 means the opposite.

Field Normalized Citation Score (Norm. Cit.): This indicator gives the ratio between the average scientific impact of an institution and the world average impact of publications from the same time frame and subject area. It is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the “item-oriented field-normalized citation score average”, since the normalization of the citation values is done at the level of individual articles. The values express the relationship of the institution’s average impact to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above.
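Purely to make the definitions above concrete, here is a toy calculation of three of these indicators for a handful of invented publications; the field baselines are made up and the real SIR methodology is more involved.

```python
# Toy illustration of three SIR-style indicators; all numbers are invented.
papers = [
    # (citations received, set of countries in the affiliations, field)
    (12, {"NL"}, "economics"),
    (3,  {"NL", "US"}, "economics"),
    (25, {"NL", "DE", "UK"}, "medicine"),
    (0,  {"NL"}, "medicine"),
]

# Hypothetical world-average citations per paper for each field and period.
world_avg = {"economics": 5.0, "medicine": 10.0}

n = len(papers)
cites_per_doc = sum(c for c, _, _ in papers) / n
intl_collab = sum(1 for _, countries, _ in papers if len(countries) > 1) / n

# "Item-oriented" normalization: each paper is compared with its own field
# baseline first, and the resulting ratios are then averaged.
norm_cit = sum(c / world_avg[field] for c, _, field in papers) / n

print(f"Cites per document (CxD):     {cites_per_doc:.2f}")  # 10.00
print(f"International collaboration:  {intl_collab:.0%}")    # 50%
print(f"Field-normalized citation:    {norm_cit:.2f}")       # 1.38 (above world average)
```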

The NL scores: with special attention to Erasmus University

I will focus on the normalized citation score only. Although Erasmus University is only number 6 in the Netherlands in terms of output, EUR scores highest of all Dutch university institutions in the list on citation impact (and fourth if the other sectors in the list are included; see the small sketch after the table). In other words, Erasmus University is the university with the highest citation impact in the Netherlands. I think this is an amazing score for one of the smaller universities in NL. Compared to the overall SIR ranking, however, EUR is at position 130 on this criterion.

Rank Organization Sector Output CxD Int. Coll. Norm. SJR Norm. Cit.
42 Universiteit Utrecht Higher educ. 23031 10,48 40,51 1,08 1,71
53 Universiteit van Amsterdam Higher educ. 20608 10,51 42,3 1,07 1,73
138 Universiteit Leiden Higher educ. 12090 10,55 48,97 1,08 1,52
197 Rijksuniversiteit Groningen Higher educ. 9649 8,32 43,43 1,06 1,47
227 Vrije Universiteit Amsterdam Higher educ. 8812 9,28 48,41 1,06 1,54
253 Erasmus Universiteit Rotterdam Higher educ. 8172 (6) 12,24 (3) 38,85 (16) 1,1 (2) 1,91 (4) (figures in parentheses: EUR’s rank within the Netherlands on each indicator)
256 Technische Universiteit Delft Higher educ. 8156 5,27 41,6 0,93 1,56
319 Wageningen Universiteit Higher educ. 6843 7,86 52,71 1,06 1,38
323 Technische Universiteit Eindhoven Higher educ. 6823 6,01 44 0,98 1,7
340 Radboud Universiteit Nijmegen Higher educ. 6437 8,21 44,79 1,06 1,44
397 Academic Medical Center Health 5500 12,52 31,24 1,1 1,81
414 Universiteit Twente Higher educ. 5372 5,64 41,7 0,97 1,54
478 TNO Government 4672 7,82 35,81 1,01 1,42
565 Universiteit Maastricht Higher educ. 4021 9,02 42,15 1,09 1,65
787 University Medical Center St Radboud Health 2733 8,45 26,16 1,08 1,39
885 Netherlands Institute for Metals Research Government 2372 9,65 34,78 1,05 1,91
918 Nederlands Kanker Instituut/ALZ Health 2267 18,2 39,52 1,13 2,1
995 Philips Research Private 2002 6,55 42,61 0,87 1,97
1090 Dutch Polymer Institute Government 1735 10,58 35,39 1,07 2,04
1124 Universiteit van Tilburg Higher educ. 1633 4,64 47,52 1,02 1,49
1257 Academic Hospital Maastricht Health 1325 10,83 38,04 1,1 1,79
1292 KNAW Government 1264 11,32 48,66 1,1 1,78
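To illustrate how the two statements above fit together (highest among Dutch universities, fourth once the other sectors are counted), here is a small sorting sketch over a few normalized-citation values copied from the table; decimal commas are written as points.

```python
# A few rows from the SIR table above: (institution, sector, Norm. Cit.).
institutions = [
    ("Erasmus Universiteit Rotterdam", "Higher educ.", 1.91),
    ("Universiteit Utrecht", "Higher educ.", 1.71),
    ("Universiteit van Amsterdam", "Higher educ.", 1.73),
    ("Nederlands Kanker Instituut/ALZ", "Health", 2.10),
    ("Dutch Polymer Institute", "Government", 2.04),
    ("Philips Research", "Private", 1.97),
]

# Rank all sectors together, then only the universities.
by_norm_cit = sorted(institutions, key=lambda row: row[2], reverse=True)
universities = [row for row in by_norm_cit if row[1] == "Higher educ."]

print(by_norm_cit[0][0])   # Nederlands Kanker Instituut/ALZ tops the full list
print(universities[0][0])  # Erasmus Universiteit Rotterdam leads the universities
```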

The citation chain: Eigenfactor maps the researcher’s citation trail and the amount of time spent at each journal along the way.

Another way of mapping and ranking journals has been developed which ranks journals much as Google ranks websites: the Eigenfactor. This concept and facility were developed as part of a non-commercial academic research project sponsored by the Bergstrom lab in the Department of Biology at the University of Washington. The aim is to develop novel methods for evaluating the influence of scholarly periodicals and for mapping the structure of academic research. The Eigenfactor score of a journal is an estimate of the percentage of time that library users spend with that journal. The Eigenfactor algorithm corresponds to a simple model of research in which readers follow chains of citations as they move from journal to journal. Imagine that a researcher goes to the library and selects a journal article at random. After reading the article, she selects at random one of its citations, proceeds to the cited journal, reads a random article there, and selects a citation to direct her to the next journal, and so on ad infinitum. The fraction of time this imaginary researcher spends with each journal gives a measure of that journal’s importance within the network of academic citations, and an estimate of the amount of time that real researchers spend with each journal. The developers use mathematics to simulate this process.
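What this paragraph describes is essentially a random walk on the journal citation network, much like PageRank. As a rough sketch only (the real Eigenfactor algorithm also adds “teleportation”, weights by article counts and handles journals with no outgoing citations), the snippet below follows citation chains through a tiny invented citation matrix and reports the fraction of time the imaginary reader spends at each journal.

```python
import numpy as np

# Invented citation counts: C[i, j] = citations from journal j to journal i.
journals = ["Journal A", "Journal B", "Journal C"]
C = np.array([
    [0.0, 4.0, 1.0],
    [3.0, 0.0, 5.0],
    [2.0, 1.0, 0.0],
])

# Column-normalize: from each journal, the reader picks one of its outgoing
# citations uniformly at random.
P = C / C.sum(axis=0)

# Power iteration towards the stationary distribution: the share of time the
# imaginary reader spends at each journal.
v = np.full(len(journals), 1.0 / len(journals))
for _ in range(200):
    v = P @ v

for name, share in zip(journals, v):
    print(f"{name}: {share:.1%} of the reader's time")
```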

As a quick test, I searched the Business category in the search option of this facility. This returned a list of 717 journals, with the Article Influence score listed for each. The following screenshot gives an overview of the result:

[Figure: Eigenfactor.org results for the Business category]

Top 20 economists in NL according to the Hirsch index

In a recent issue of the Dutch journal ESB, Albert Jolink updates the top 20 economists in the Netherlands, using the Hirsch index to do so.

ESB has a long tradition of ranking Dutch economists, and this new one reshuffles the top to some extent. (The latest version, dated March 2007, is used here.)
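For readers who have not met the measure: an author’s Hirsch index (h-index) is the largest number h such that h of his or her papers have each been cited at least h times. A minimal sketch, with invented citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Invented citation counts for one author's papers:
print(h_index([25, 18, 11, 7, 6, 3, 1]))  # -> 5 (five papers cited at least 5 times)
```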

The top 20 looks like this:

[Figure: ESB top 20 of Dutch economists by Hirsch index]

NL average = 4 citations / paper over the years 1995-2005

According to the latest in-cites statistics, published this month, the Netherlands is number 4 in the field of Economics & Business, measured over the last 10 years. I think this is a very good result, but at the same time I think it is very difficult to reach this average as a school. Still, it is a nice figure to remember as a general benchmark. This is the overall list:

( in-cites, January 2006, Citing URL: http://www.in-cites.com/countries/top20eco.html)

RANK COUNTRY PAPERS CITATIONS CITATIONS PER PAPER
1 USA 62,633 392,238 6.26
2 ENGLAND 15,012 65,196 4.34
3 CANADA 7,307 31,642 4.33
4 NETHERLANDS 4,208 16,831 4.00
5 FRANCE 4,251 15,569 3.66
6 AUSTRALIA 4,493 12,611 2.81
7 GERMANY 4,694 12,388 2.64
8 ISRAEL 1,725 9,130 5.29
9 SWEDEN 1,956 8,506 4.35
10 ITALY 2,468 7,488 3.03
11 SPAIN 2,609 7,430 2.85
12 BELGIUM 1,742 7,240 4.16
13 SCOTLAND 1,646 5,894 3.58
14 SWITZERLAND 1,333 5,634 4.23
15 HONG KONG 693 5,222 7.54
16 JAPAN 2,019 4,713 2.33
17 PEOPLES R CHINA 1,820 4,572 2.51
18 SOUTH KOREA 1,181 3,861 3.27
19 DENMARK 1,222 3,805 3.11
20 NORWAY 1,043 3,324 3.19
