ESI: only 15 universities produced at least 1400 papers in the field of Economics and Business over the last 10 years; Erasmus University Rotterdam included

Essential Science Indicators has been updated as of September 1, 2012, to cover a 10-year-plus-6-month period: January 1, 2002 – June 30, 2012.

The field of Economics & Business covers 289 journals in the Essential Science Indicators of Web of Science.


Erasmus University Rotterdam ranks second among the non-US universities, just behind LSE. This is a major achievement for Erasmus University Rotterdam. However, the citations per paper of the two non-US schools still clearly lag behind those of the US schools.

This is why citation impact is the next big excellence frontier for Erasmus University Rotterdam. By far the most research in economics and business is done at the Erasmus School of Economics (ESE) and the Rotterdam School of Management (RSM) and their research institutes, ERIM and TI.

Institution Papers Citations Citations per Paper
1 NATL BUR ECON RES 3,672 66,785 18.19
2 HARVARD UNIV 2,882 52,511 18.22
3 UNIV PENN 1,957 34,958 17.86
4 NEW YORK UNIV 1,772 25,964 14.65
5 STANFORD UNIV 1,742 25,381 14.57
6 UNIV CALIF BERKELEY 1,706 24,367 14.28
7 COLUMBIA UNIV 1,672 22,342 13.36
8 UNIV MICHIGAN 1,595 20,741 13.00
9 MIT 1,583 32,251 20.37
10 UNIV CHICAGO 1,552 28,600 18.43
11 LONDON SCH ECON & POLIT SCI 1,542 13,808 8.95
12 CORNELL UNIV 1,530 17,388 11.36
13 UNIV ILLINOIS 1,492 16,353 10.96
14 UNIV WISCONSIN 1,486 16,459 11.08
15 ERASMUS UNIV ROTTERDAM 1,430 12,179 8.52

ESI: only 4 countries received at least 50,000 citations in the period 2002-2012 in the field of Economics & Business, the Netherlands included.


Only 4 countries received at least 50,000 citations in the period covered by the ESI. But the size of the production differs very much: the USA is still by far the largest producer of papers in economics and business.

Country Papers Citations Citations per Paper
1 USA 76,363 735,703 9.63
2 UK 21,348 158,187 7.41
3 CANADA 10,231 72,803 7.12
4 NETHERLANDS 7,338 53,213 7.25
5 GERMANY 11,140 52,663 4.73
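The "Citations per Paper" column is simply total citations divided by total papers. As a quick sanity check on the country table above (a minimal Python sketch, not part of the ESI data itself), the ratios can be reproduced like this:

```python
# Recompute "Citations per Paper" for the country table:
# total citations / total papers, rounded to two decimals.
rows = [
    ("USA", 76_363, 735_703),
    ("UK", 21_348, 158_187),
    ("CANADA", 10_231, 72_803),
    ("NETHERLANDS", 7_338, 53_213),
    ("GERMANY", 11_140, 52_663),
]

for country, papers, citations in rows:
    cpp = round(citations / papers, 2)
    print(f"{country:12s} {cpp:.2f}")
```

Running this reproduces exactly the last column of the table (9.63, 7.41, 7.12, 7.25, 4.73), and the same ratio underlies the university and journal tables as well.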

ESI: only 20 ISI journals in Economics & Business receive on average at least 20 citations per article over the last ten years

Only about 20 journals received on average at least 20 citations per paper in the period covered by the ESI.

Here is the list:

Journal Papers Citations Citations per Paper
QUART J ECON 397 16,894 42.55
ACAD MANAGE REV 377 15,731 41.73
ADMIN SCI QUART 191 7,854 41.12
J ECON LIT 200 8,111 40.55
MIS QUART 380 14,346 37.75
ACAD MANAGE J 631 23,325 36.97
J FINAN 873 28,216 32.32
J MARKETING 469 14,346 30.59
J POLIT ECON 378 11,526 30.49
STRATEG MANAGE J 717 20,049 27.96
J ECON PERSPECT 457 11,199 24.51
ORGAN SCI 613 14,996 24.46
ECONOMETRICA 628 15,223 24.24
J MANAGE 488 11,413 23.39
J ECON GROWTH 132 3,038 23.02
J FINAN ECON 932 20,463 21.96
REV ACC STUD 57 1,142 20.04
J ACCOUNT ECON 351 7,028 20.02
J OPER MANAG 465 9,304 20.01

Twitter and your academic reputation: friends or enemies?

Trial by Twitter

Feedback from social media like Twitter can strike as fast as lightning, with unforeseen consequences. For many researchers, the pace and tone of this online review can be intimidating, and can sometimes feel like an attack. How do authors best deal with these forms of post-publication peer review? This "social peer review" is much faster than the peer review that took place during submission and acceptance of the paper at (top) academic journals. And the feedback comes from anywhere, not just from the circle of accepted experts in the field, as in the blind review process of journals.

The result can have an enormous impact on your academic reputation. What if thousands of tweets suddenly disapprove of the conclusions of a just-published paper? It creates a situation that is nearly impossible for the author to handle. A "negative sentiment" will surround the publication and affect aspects of your reputation, for example the chances that your paper will be cited often. How will this social sentiment affect other papers in the pipeline and under submission with (top) journals? How will it affect your co-authors? How will it influence the chances of your grant applications? How will it influence your tenure process if the sentiment is negative? The stakes for researchers are huge.

A recent article by Apoorva Mandavilli in Nature deals with this issue. It is about "fast feedback", "a chorus of (dis)approval", "meta-twitters", "new (alt)metrics of communication" and some possible solutions for the situation.

The potential power of social media for research and academic reputation is evident to me. Managing the communication and the speed of the feedback requires special skills and special publication strategies from researchers (and institutes!) who care about their future careers and their reputation. The open social media review dynamics on networks like Twitter currently carry many risks for the author of a paper. At the same time, the stakes and risks for the crowd that collectively performs these "trials" are, I guess, very low. A single tweet is not powerful, but the flock together is impactful. It is a collective review by the crowd, in which many people simply follow the sentiment by retweeting others.

I advise researchers to be very careful about which message about their paper is distributed in social networks, how it is distributed, by whom, and who replies to it. The social networks should not reproduce or copy the formal peer review process by selected experts; they should focus on adding value to the additional virtues of the work. The best approach might be to leverage social media by initiating stories on the possible practical value and practical impact of the research. When these are confirmed by the wider social network audiences, the author gains confidence that the practical / managerial value of the research is valued and tested immediately. In this way social networks can be very beneficial for an academic reputation: they act as a sounding board for testing the managerial / practical value of the research.

Crowdsourced ranking of journals in Management; any news?

Vote for management journals

Management researchers love journal rankings. A new way to collect the judgement of thousands of voters was launched recently by Teppo Felin of Brigham Young University. Within a few weeks, 80,000 votes had been submitted to rank more than 100 journals in the field of management. Voters can vote as many times as they like. After filtering out about 20,000 automated robot votes, an impressive 60,000 votes remained.

There are already some blog posts on this ranking in which peers discuss the pros and cons of ranking journals this way.

The crowd initiative is here @ All Our Ideas: http://www.allourideas.org/management

Besides the obvious difficulty of comparing all these journals (a nearly impossible task), one can ask what the 'wisdom of the crowds' adds to the existing lists of top journals in the field.

A lot of the usual suspects turn up in the top 25 (as of today, 22/01/2011, with 80,000 votes; these are the results before the cleaning):

A score of 100 means a journal is expected to win every time it is compared with another journal in the contest; a score of zero means the journal is expected to lose every time. There are no zero scores in this ranking (yet).
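The score described above can be read as a pairwise-contest win rate scaled to 0-100. All Our Ideas actually estimates scores with a statistical model, so the sketch below is only a naive approximation under that simplified reading; the journal abbreviations and votes are made up for illustration:

```python
from collections import defaultdict

def win_scores(votes):
    """Naive 0-100 score: the share of pairwise contests each journal won.
    votes is a list of (winner, loser) pairs. (All Our Ideas itself uses a
    statistical model; this only treats 'expected to win' as an observed
    win rate.)"""
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    return {j: round(100 * wins[j] / appearances[j]) for j in appearances}

# Hypothetical votes: each tuple means the first journal beat the second.
votes = [
    ("ASQ", "HBR"), ("ASQ", "AMR"), ("AMR", "HBR"), ("HBR", "ASQ"),
]
print(win_scores(votes))  # ASQ won 2 of 3, AMR 1 of 2, HBR 1 of 3
```

With enough votes, a journal that wins every matchup approaches 100 and one that loses every matchup approaches 0, matching the interpretation of the scale above.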

 

  1. JOURNAL OF APPLIED PSYCHOLOGY (score: 81)
  2. ADMINISTRATIVE SCIENCE QUARTERLY (76)
  3. ACADEMY OF MANAGEMENT REVIEW (74)
  4. ORGANIZATION SCIENCE (73)
  5. ACADEMY OF MANAGEMENT JOURNAL (72)
  6. JOURNAL OF MANAGEMENT (71)
  7. MIS QUARTERLY (70)
  8. STRATEGIC MANAGEMENT JOURNAL (68)
  9. ORGANIZATIONAL BEHAVIOR AND HUMAN DECISION PROCESSES (67)
  10. MANAGEMENT SCIENCE (66)
  11. DECISION SCIENCES (65)
  12. ORGANIZATIONAL RESEARCH METHODS (64)
  13. JOURNAL OF ORGANIZATIONAL BEHAVIOR (63)
  14. JOURNAL OF MANAGEMENT STUDIES (63)
  15. PERSONNEL PSYCHOLOGY (63)
  16. JOURNAL OF INTERNATIONAL BUSINESS STUDIES (60)
  17. RESEARCH IN ORGANIZATIONAL BEHAVIOR (60)
  18. ACADEMY OF MANAGEMENT PERSPECTIVES (60)
  19. HUMAN RELATIONS (60)
  20. LEADERSHIP QUARTERLY (59)
  21. ORGANISATION STUDIES (59)
  22. JOURNAL OF OPERATIONS MANAGEMENT (59)
  23. HARVARD BUSINESS REVIEW (58)
  24. MIT SLOAN MANAGEMENT REVIEW (57)
  25. HUMAN RESOURCE MANAGEMENT-US (57)

The initiator of the survey reported on the results today (29 January 2011):

So, what can a crowdsourced ranking tell us?  Nothing definitive is my guess, though I’m sure other rankings don’t necessarily give us a definitive signal either.  Though I do think that aggregated perceptions perhaps give us another data point when evaluating and comparing journals (along with impact and influence factors and other, more “objective” measures).  These rankings can of course mirror extant rankings (raising causal questions). But they might also capture more up-to-date anticipations of future performance. For example, the UC Irvine Law School (established in 2008) has not graduated a single student, though the school is already well within the top 50 in the crowdsourced ranking.

Lots of other questions can be raised, specifically related to a management journal ranking like this.  For example, should micro and macro journals be lumped together like this?  And certainly disciplinary journals play a large role in management – should they be included (sociology, psychology, economics)?

Strategic “gaming” of the results of course can also occur.  For example, I ended up having to delete some 25,000+ automatically generated votes (it looked like a computer script was created to throw the ranking off), votes that were explicitly cast to sabotage the effort (the African Journal of Management beat all the top journals according to this mega, robo-voter).  Though, it is interesting to see how the “crowd” essentially plays a role in averaging bias and in putting a check on strategic voting.

Ironically, I’m actually not one to necessarily really care about journal rankings like this.  I wonder whether article-effects trump journal-effects?  (I believe Joel Baum has a provocative paper on this.)  Of course I read and submit to “top” journals, but there are many “lesser” (ahem) journals that are just as much on my radar screen, for example Industrial and Corporate Change, Managerial and Decision Economics and Strategic Organization. Obsessions with journal standing can detract from ideas.

Pragmatically, yes, journal rankings matter: promotions indirectly depend on it, as do resource flows etc.  So, perhaps a “democratic,” crowdsourced ranking like this can provide additional information for decision-makers and scholars in the field.

 

The top 15 after filtering out the 20,000 automated robot votes: the picture is now even more 'traditional'.

Management Journal Score
1 Administrative Science Quarterly 90.43%
2 Academy of Management Journal 90.41%
3 Academy of Management Review 88.94%
4 Organization Science 88.31%
5 Strategic Management Journal 84.42%
6 Journal of Applied Psychology 84.00%
7 Management Science 82.56%
8 Journal of Management 82.46%
9 OBHDP 78.93%
10 Organizational Research Methods 74.06%
11 Journal of Organizational Behavior 72.79%
12 Personnel Psychology 71.93%
13 Journal of Management Studies 71.10%
14 Research in OB 70.37%
15 Organization Studies 69.68%

 

Where do all these votes come from?

The whole picture is dominated by the US and the EU. For example, only 16 votes from Shanghai, China were recorded.

[World map of crowd votes, 22/01/2011]

The complete ranking is here as PDF file of the ranking website (22/01/2011): https://rmimr.files.wordpress.com/2011/01/all-our-ideas-crwodsourcing-management-journals-20110122.pdf
