ESI: only 15 universities produced at least 1400 papers in the field of Economics and Business over the last 10 years; Erasmus University Rotterdam included

Essential Science Indicators has been updated as of September 1, 2012 to cover a 10-year plus 6-month period, January 1, 2002-June 30, 2012.

The field of Economics & Business covers 289 journals in the Essential Science Indicators of Web of Science.

Only 15 universities produced at least 1400 papers in the field of Economics and Business over the last 10 years, and Erasmus University Rotterdam is among them.

Erasmus University Rotterdam ranks second among the non-US institutions, just behind LSE. This is a major achievement for Erasmus University Rotterdam. However, the citations per paper of the two non-US schools still clearly lag behind those of the US schools.

This is why citation impact is the next big excellence frontier for Erasmus University Rotterdam. By far the most research in economics and business at the university is done at the Erasmus School of Economics (ESE) and the Rotterdam School of Management (RSM) and their research institutes ERIM and TI. A short illustrative sketch after the table below makes the citations-per-paper arithmetic explicit.

Institution Papers Citations Citations Per Paper
1 NATL BUR ECON RES 3,672 66,785 18.19
2 HARVARD UNIV 2,882 52,511 18.22
3 UNIV PENN 1,957 34,958 17.86
4 NEW YORK UNIV 1,772 25,964 14.65
5 STANFORD UNIV 1,742 25,381 14.57
6 UNIV CALIF BERKELEY 1,706 24,367 14.28
7 COLUMBIA UNIV 1,672 22,342 13.36
8 UNIV MICHIGAN 1,595 20,741 13.00
9 MIT 1,583 32,251 20.37
10 UNIV CHICAGO 1,552 28,600 18.43
11 LONDON SCH ECON & POLIT SCI 1,542 13,808 8.95
12 CORNELL UNIV 1,530 17,388 11.36
13 UNIV ILLINOIS 1,492 16,353 10.96
14 UNIV WISCONSIN 1,486 16,459 11.08
15 ERASMUS UNIV ROTTERDAM 1,430 12,179 8.52
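
Here is a minimal Python sketch, my own illustration rather than part of the ESI output, that recomputes the citations-per-paper column from the papers and citations in the table and pools the thirteen US schools for comparison with LSE and Erasmus University Rotterdam. The figures are simply copied from the table above.

```python
# Illustrative sketch: derive citations per paper from the ESI table above
# and compare the two non-US schools with the pooled US schools.

rows = [
    ("NATL BUR ECON RES",           3672, 66785),
    ("HARVARD UNIV",                2882, 52511),
    ("UNIV PENN",                   1957, 34958),
    ("NEW YORK UNIV",               1772, 25964),
    ("STANFORD UNIV",               1742, 25381),
    ("UNIV CALIF BERKELEY",         1706, 24367),
    ("COLUMBIA UNIV",               1672, 22342),
    ("UNIV MICHIGAN",               1595, 20741),
    ("MIT",                         1583, 32251),
    ("UNIV CHICAGO",                1552, 28600),
    ("LONDON SCH ECON & POLIT SCI", 1542, 13808),
    ("CORNELL UNIV",                1530, 17388),
    ("UNIV ILLINOIS",               1492, 16353),
    ("UNIV WISCONSIN",              1486, 16459),
    ("ERASMUS UNIV ROTTERDAM",      1430, 12179),
]

non_us = {"LONDON SCH ECON & POLIT SCI", "ERASMUS UNIV ROTTERDAM"}

# Citations per paper is simply total citations divided by total papers.
for name, papers, citations in rows:
    print(f"{name:30s} {citations / papers:6.2f}")

# Pool the US schools to show the gap with the two non-US schools.
us = [(p, c) for name, p, c in rows if name not in non_us]
us_cpp = sum(c for _, c in us) / sum(p for p, _ in us)
print(f"US schools (pooled): {us_cpp:.2f} citations per paper")
```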

ESI: only 20 ISI journals in Economics & Business receive on average at least 20 citations per article over the last ten years

Essential Science Indicators has been updated as of September 1, 2012 to cover a 10-year plus 6-month period, January 1, 2002-June 30, 2012.

The field of Economics & Business covers 289 journals in the Essential Science Indicators of Web of Science. Only about 20 journals received on average at least 20 citations per paper in the period covered by the ESI.

Here is the list:

Journal Papers Citations Citations Per Paper
QUART J ECON 397 16894 42.55
ACAD MANAGE REV 377 15731 41.73
ADMIN SCI QUART 191 7854 41.12
J ECON LIT 200 8111 40.55
MIS QUART 380 14346 37.75
ACAD MANAGE J 631 23325 36.97
J FINAN 873 28216 32.32
J MARKETING 469 14346 30.59
J POLIT ECON 378 11526 30.49
STRATEG MANAGE J 717 20049 27.96
J ECON PERSPECT 457 11199 24.51
ORGAN SCI 613 14996 24.46
ECONOMETRICA 628 15223 24.24
J MANAGE 488 11413 23.39
J ECON GROWTH 132 3038 23.02
J FINAN ECON 932 20463 21.96
REV ACC STUD 57 1142 20.04
J ACCOUNT ECON 351 7028 20.02
J OPER MANAG 465 9304 20.01

Twitter and your academic reputation: friends or enemies?

Trial by Twitter

The feedback from social media like Twitter can strike as fast as lightning, with unforeseen consequences. For many researchers, the pace and tone of this online review can be intimidating, and can sometimes feel like an attack. How do authors best deal with these forms of post-publication peer review? This "social peer review" is much faster than the peer review that took place during submission and acceptance of the paper at a (top) academic journal. And the feedback comes from anywhere, not just from the circle of accepted experts in the field, as in the blind review process of journals.

The result can have an enormous impact on your academic reputation. What if thousands of tweets suddenly disapprove of the conclusions of a just-published paper? That creates a situation that is nearly impossible for the author to handle. A "negative sentiment" will form around the publication and will affect aspects of your reputation, for example the chances that your paper will be cited often. How will this social sentiment affect other papers in the pipeline and under submission with (top) journals? How will it affect your co-authors? How will it influence the chances of your grant applications? How will it influence your tenure process if the sentiment is negative? These are all huge stakes for researchers.

A recent article by Apoorva Mandavilli in Nature deals with this issue. It is about "fast feedback", "a chorus of (dis)approval", "meta-Twitters", "new (alt)metrics of communication", and some possible solutions for the situation.

The potential power of social media for research and academic reputation is evident to me. Managing the communication and the speed of the feedback requires special skills and special publication strategies from researchers (and institutes!) who care about their future careers and their reputation. The open review dynamics on networks like Twitter currently carry many risks for the author of the paper. At the same time, I guess the stakes and risks for the crowd that collectively performs these "trials" are very low. A single tweet is not powerful, but the flock together is impactful. It is a collective review by the crowd, in which many people simply follow the sentiment by re-tweeting others.

I advise researchers to be very careful about which message about their paper is distributed in social networks, how it is distributed, by whom it is distributed, and who is replying to it. The social networks should not reproduce or copy the formal peer review process by selected experts; they should focus on adding value around the other virtues of the work. The best approach might be to leverage social media by initiating stories about the possible practical value and practical impact of the research. When these are confirmed by the wider social network audience, the author gains confidence that the practical / managerial value of the research is immediately valued and tested. In this way social networks can be very beneficial for the academic reputation: they serve as a sounding board for testing the managerial / practical value of the research.

New peer review guide published by the Research Information Network

Peer review: good for all purposes? | Research Information Network.

Peer review is both a principle and a set of mechanisms at the heart of the arrangements for evaluating and assuring the quality of research. A new guide from the Research Information Network provides for researchers and others an outline of how the peer review system works, and highlights some of the challenges as well as the opportunities it faces in the internet age.

Peer review: A guide for researchers sets out the processes involved in peer review for both grant applications and publications. It also looks at the issues that have been raised in a series of recent reports on the costs of the system, and how effective and fair it is.

The growth in the size of the research community and of the volumes of research being undertaken across the world means that the amount of time and effort put into the peer review system is growing too, and that it is coming under increasing scrutiny. The guide looks at how effective peer review is in selecting the best research proposals, as well as in detecting misconduct and malpractice.

The guide also looks at how fair the system is, and at the different levels of transparency involved in the process: from completely closed systems, where the identities of reviewers and those whose work is being reviewed are kept hidden from each other, and reports are not revealed, to completely transparent systems where identities and reports are openly revealed.

The burdens on researchers as submitters and reviewers are by far the biggest costs in the peer review system, and the guide outlines some of the measures that are being taken to reduce those burdens, or at least to keep them in check. A growing number of researchers are taking the view that they should be paid for the time they spend in reviewing grant applications and draft publications. But there are also concerns that such payment would significantly increase the costs of the system, and also of scholarly publications.

The internet has speeded up the process of peer review, and widened the pool of reviewers who can be drawn on. It has also provided new channels through which researchers can communicate their findings, and through which other researchers can comment on, annotate and evaluate them. These new opportunities bring new challenges as well. The take-up of the opportunities for open comments, ratings and recommender systems has been patchy to date; and we currently lack clear protocols for the review of findings circulated in multiple formats, including blogs and wikis. The mechanisms for peer review will undoubtedly change in coming years, but the principle will remain central to all those involved in the research community.

Peer review: A guide for researchers is available at www.rin.ac.uk/peer-review-guide.

How are the stocks (top journals) performing over a period of 28 years?

ScienceWatch has published the latest Journal Performance Indicators (JPIs) for the field of business. The table below shows which journals have performed at the top over decades. It shows that the very top of the field is very stable over a longer period. Publishing in these top journals can be seen as a stable investment for researchers; it increases the likelihood of a good citation yield.

The table below compares the citation impact of journals in the field of business as measured over three different time spans. The left-hand column ranks journals based on their 2008 "impact factor," as reported in the current edition of Journal Citation Reports. The 2008 impact factor is calculated by taking the number of citations in 2008 to source items published in the journal over the previous two years and dividing by the number of articles published in the journal during those two years: in other words, a ratio between recent citations and recent citable items. The rankings in the next two columns show impact over longer time spans, based on figures from Journal Performance Indicators. In these columns, total citations to a journal's published papers are divided by the total number of papers the journal published, producing a citations-per-paper impact score over a five-year period (middle column) and a 28-year period (right-hand column).
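
As a rough sketch of the two calculations just described, here is a small Python illustration. All counts below are invented for the example and do not come from JCR or JPI.

```python
# Hedged sketch of the metrics described above; the counts are hypothetical.

def impact_factor_2008(cites_2008_to_2006_07: int,
                       items_2006: int,
                       items_2007: int) -> float:
    """Two-year impact factor: citations received in 2008 to items
    published in 2006-2007, divided by the citable items from 2006-2007."""
    return cites_2008_to_2006_07 / (items_2006 + items_2007)

def citations_per_paper(total_citations: int, total_papers: int) -> float:
    """Longer-window JPI-style score: all citations to papers published
    in the window, divided by the number of papers in the window."""
    return total_citations / total_papers

# Hypothetical journal: 145 citable items in 2006, 155 in 2007,
# and 1,200 citations to those items in 2008.
print(round(impact_factor_2008(1200, 145, 155), 2))   # 4.0

# Hypothetical 28-year window: 2,500 papers drawing 60,000 citations.
print(round(citations_per_paper(60000, 2500), 2))     # 24.0
```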

Journals Ranked by Impact: Business
Rank | 2008 Impact Factor | Impact 2004-08 | Impact 1981-2008
1 | Academy of Management Review (6.13) | Administrative Science Quarterly (9.83) | Administrative Science Quarterly (89.77)
2 | Academy of Management Journal (6.08) | Academy of Management Journal (9.79) | Academy of Management Review (66.56)
3 | Journal of Retailing (4.10) | Academy of Management Review (8.91) | Journal of Marketing (53.84)
4 | Journal of Marketing (3.60) | Journal of Marketing (8.72) | Academy of Management Journal (50.48)
5 | Strategic Management Journal (3.34) | Strategic Management Journal (7.07) | Strategic Management Journal (46.01)
6 | Marketing Science (3.31) | Journal of Management (6.35) | Journal of Consumer Research (39.30)
7 | Journal of Management (3.08) | Marketing Science (6.10) | Journal of Marketing Research (36.34)
8 | Journal of International Business Studies (2.99) | Journal of Consumer Psychology (5.30) | Journal of Management (28.14)
9 | Administrative Science Quarterly (2.85) | Journal of International Business Studies (5.05) | Sloan Management Review (20.38)
10 | Journal of Consumer Psychology (2.84) | Journal of Organizational Behavior (4.85) | Marketing Science (20.36)

The variety of researcher information services

I came across a nice list of researcher information services at the REPINF wiki this week:

Research Crossroads

http://www.researchcrossroads.org

Self-registering searchable service that collects data on researchers in science and medicine: their research, grant funding, publications, affiliation, biography, etc. Also provides searchable databases on funders, grants awarded and clinical trials. Also acts as a networking tool.

Academia.edu

http://www.academia.edu

Authors self-register themselves, their departments and universities/institutions. Authors add details of their papers. Currently has around 24K people and 87K papers.

ResearcherID

http://www.researcherid.com/

Thomson Reuters’ author profiling service. Searchable researcher/research database with the benefit of Thomson Reuters’ unique author identifier system (see Author Identifiers topic).

ResearchGate

https://www.researchgate.net/

Scientists’ network with over 180,000 members to date (November 2009). Scientists add details about themselves and their work, upload full texts of their papers, and discuss relevant topics. It has just introduced a ‘micro-article’ concept, which encourages scientists to write and upload brief versions of their latest work for rapid communication and discussion. There is also an embryonic job advertising facility. Free to join and use. It is not clear how this initiative is supported financially, though it claims to be built ‘by scientists for scientists’.

BibApp

http://bibapp.org/

For use as an institutional ‘campus gateway’. Database of researchers, their publications and their institutional affiliations (group/department/school, etc.), enabling a search for ‘campus experts’. Accepts deposits in popular formats (e.g. RefWorks) and provides automatic rights-checking using SHERPA RoMEO. Makes SWORD-compliant deposits to the institutional repository or other locations.

VIVO

http://vivo.cornell.edu/

Developed at Cornell University Library. A campus research discovery tool. Researchers can manage their own page/profile, which usually links to their personal web page and departmental or other affiliation web pages.

Scholar Universe (ProQuest)

http://www.scholaruniverse.com/productinfo.jsp

2 million scholar profiles generated from ProQuest’s databases.

Selected Works (Berkeley Electronic Press)

http://works.bepress.com/

Authors create their own profiles for a campus profiling service.

Ensuring the Integrity, Accessibility, and Stewardship of Research Data in the Digital Age

Here is a new book from the NAP Committee on Ensuring the Utility and Integrity of Research Data in a Digital Age:

National Academy of Sciences

As digital technologies are expanding the power and reach of research, they are also raising complex issues. These include complications in ensuring the validity of research data; standards that do not keep pace with the high rate of innovation; restrictions on data sharing that reduce the ability of researchers to verify results and build on previous research; and huge increases in the amount of data being generated, creating severe challenges in preserving that data for long-term use.

Ensuring the Integrity, Accessibility, and Stewardship of Research Data in the Digital Age examines the consequences of the changes affecting research data with respect to three issues (integrity, accessibility, and stewardship) and finds a need for a new approach to the design and the management of research projects. The report recommends that all researchers receive appropriate training in the management of research data, and calls on researchers to make all research data, methods, and other information underlying results publicly accessible in a timely manner. The book also sees the stewardship of research data as a critical long-term task for the research enterprise and its stakeholders. Individual researchers, research institutions, research sponsors, professional societies, and journals involved in scientific, engineering, and medical research will find this book an essential guide to the principles affecting research data in the digital age.

Front Matter i-xvi
Summary 1-10
1 Research Data in the Digital Age 11-32
2 Ensuring the Integrity of Research Data 33-58
3 Ensuring Access to Research Data 59-94
4 Promoting the Stewardship of Research Data 95-114
5 Defining Roles and Responsibilities 115-120
Appendix A: Biographical Information on the Members of the Committee on Ensuring the Utility and Integrity of Research Data in a Digital Age 121-132
Appendix B: Relevant National Academy of Sciences, National Academy of Engineering, Institute of Medicine, and National Research Council Reports 133-142
Appendix C: Letters from Scientific Journals Requesting the Study 143-154
Index 155-162
