ESI: Only 16 authors have published at least 70 papers in the field of Economics and Business over the last 10 years; Franses (EUR/ESE) included.

Essential Science Indicators has been updated as of September 1, 2012 to cover a 10-year plus 6-month period, January 1, 2002-June 30, 2012.

The field of Economics & Business covers 289 journals in the Essential Science Indicators of Web of Science.

Erasmus University Rotterdam is included in this list with Professor Philip Hans Franses (Erasmus School of Economics), who produced 70 papers in the period 2002-2012 (according to ESI).

Personal page here: http://www.erim.eur.nl/people/philip-hans-franses/

 

Rank Scientist Papers Citations Citations per Paper
1 LEE, J 149 1,113 7.47
2 KIM, J 133 643 4.83
3 WRIGHT, M 127 1,762 13.87
4 KIM, S 116 391 3.37
5 ZHANG, J 115 497 4.32
6 LIST, JA 95 2,070 21.79
7 LI, Y 92 430 4.67
8 NARAYAN, PK 86 658 7.65
9 KIM, Y 85 294 3.46
10 SHOGREN, JF 84 813 9.68
11 ZHANG, Y 79 441 5.58
12 PHILLIPS, PCB 78 572 7.33
13 ROZELLE, S 77 765 9.94
14 KUMAR, V 71 1,162 16.37
15 LEE, K 71 551 7.76
16 FRANSES, PH 70 524 7.49
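
The citations-per-paper column is simply total citations divided by the number of papers. A minimal sketch of that arithmetic (a hypothetical helper, not ESI code) verifying a couple of rows from the table:

```python
# Minimal sketch (not ESI code): citations per paper = total citations / number of papers.
def citations_per_paper(citations: int, papers: int) -> float:
    return citations / papers

# Verify two rows from the table above.
assert round(citations_per_paper(524, 70), 2) == 7.49    # FRANSES, PH
assert round(citations_per_paper(2070, 95), 2) == 21.79  # LIST, JA
```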

ESI: only 15 universities produced at least 1400 papers in the field of Economics and Business over the last 10 years; Erasmus University Rotterdam included

Essential Science Indicators has been updated as of September 1, 2012 to cover a 10-year plus 6-month period, January 1, 2002-June 30, 2012.

The field of Economics & Business covers 289 journals in the Essential Science Indicators of Web of Science.

Only 15 universities produced at least 1400 papers in the field of Economics and Business over the last 10 years; Erasmus University Rotterdam included.

Erasmus University Rotterdam is the second-ranked non-US institution, just behind LSE. This is a major achievement for Erasmus University Rotterdam. The citations per paper of the two non-US schools, however, still clearly lag behind those of the US schools.

This is why citation impact is the next big excellence frontier for Erasmus University Rotterdam. By far most of the research in economics and business is done at the Erasmus School of Economics (ESE) and the Rotterdam School of Management (RSM) and their research institutes ERIM and TI.

Rank Institution Papers Citations Citations per Paper
1 NATL BUR ECON RES 3,672 66,785 18.19
2 HARVARD UNIV 2,882 52,511 18.22
3 UNIV PENN 1,957 34,958 17.86
4 NEW YORK UNIV 1,772 25,964 14.65
5 STANFORD UNIV 1,742 25,381 14.57
6 UNIV CALIF BERKELEY 1,706 24,367 14.28
7 COLUMBIA UNIV 1,672 22,342 13.36
8 UNIV MICHIGAN 1,595 20,741 13.00
9 MIT 1,583 32,251 20.37
10 UNIV CHICAGO 1,552 28,600 18.43
11 LONDON SCH ECON & POLIT SCI 1,542 13,808 8.95
12 CORNELL UNIV 1,530 17,388 11.36
13 UNIV ILLINOIS 1,492 16,353 10.96
14 UNIV WISCONSIN 1,486 16,459 11.08
15 ERASMUS UNIV ROTTERDAM 1,430 12,179 8.52

Twitter and your academic reputation: friends or enemies?

Trial by Twitter

Feedback from social media like Twitter can strike as fast as lightning, with unforeseen consequences. For many researchers, the pace and tone of this online review can be intimidating and can sometimes feel like an attack. How do authors best deal with these forms of peer review after publication? This "social peer review" moves much faster than the peer review that took place during submission and acceptance of the paper at (top) academic journals. And the feedback comes from anywhere, not just from the circle of accepted experts in the field, as with the blind review process of journals.

The result can have an enormous impact on your academic reputation. What if thousands of tweets suddenly disapprove of the conclusions of a paper just published? That creates a situation that is nearly impossible for the author to handle. A "negative sentiment" will form around the publication and will affect aspects of your reputation, for example the chances that your paper will be cited often. How will this social sentiment affect other papers in the pipeline or under submission at (top) journals? How will it affect your co-authors? How will it influence the chances of your grant applications? How will it influence your tenure process if the sentiment is negative? These are all huge stakes for researchers.

A recent article by Apoorva Mandavilli in Nature deals with this issue. It is about "fast feedback", "a chorus of (dis)approval", "meta-twitters", "new (alt)metrics of communication" and some possible solutions for the situation.

The potential power of social media for research and academic reputation is evident to me. Managing the communication and the speed of the feedback requires special skills and special publication strategies from researchers (and institutes!) who care about their future careers and their reputation. The open review dynamics of social networks like Twitter currently carry many risks for the author of a paper. At the same time, the stakes and risks for the crowd that collectively performs these "trials" are, I suspect, very low. A single tweet is not powerful, but the flock together has impact. It is a collective review by the crowd, often with many people who simply follow the sentiment by retweeting others.

I advise researchers to be very careful about which message about their paper is distributed in social networks, how it is distributed, by whom it is distributed and who is replying to it. The social networks should not reproduce or copy the formal peer review process by selected experts; they should focus on adding value to the additional virtues of the work. The best approach might be to leverage social media by initiating stories on the possible practical value and practical impact of the research. When these are confirmed by the wider social network audience, the author gains confidence that the practical / managerial value of the research is valued and tested immediately. In this way social networks can be very beneficial for academic reputation: they serve as a sounding board for testing the managerial / practical value of the research.

A new measure of esteem: prestige, or how often is a researcher cited by highly cited papers?

Prestige & Popularity

Ding and Cronin make a nice distinction between the popularity and prestige of a researcher: the popularity of a researcher is measured by the number of times he is cited by all papers in a dataset; the prestige of a researcher by the number of times he is cited by highly cited papers in the same set. A scholar may be highly cited but not highly regarded: popularity and prestige are not identical measures of esteem. The authors focus primarily on authors rather than journals.

Popularity vs. Prestige
Popularity and prestige exist in the following possible relations:
high popularity-high prestige, high popularity-low prestige, low popularity-high prestige and low popularity-low prestige.

 

Source: http://arxiv.org/ftp/arxiv/papers/1012/1012.4871.pdf

Popular and/or Prestigious? Measures of Scholarly Esteem
Ying Ding, Blaise Cronin
School of Library and Information Science, Indiana University, Bloomington, IN 47405, USA
Abstract
Citation analysis does not generally take the quality of citations into account: all citations are weighted equally irrespective of source. However, a scholar may be highly cited but not highly regarded: popularity and prestige are not identical measures of esteem. In this study we define popularity as the number of times an author is cited and prestige as the number of times an author is cited by highly cited papers. Information Retrieval (IR) is the test field. We compare the 40 leading researchers in terms of their popularity and prestige over time. Some authors are ranked high on prestige but not on popularity, while others are ranked high on popularity but not on prestige. We also relate measures of popularity and prestige to date of Ph.D. award, number of key publications, organizational affiliation, receipt of prizes/honors, and gender.
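
To make the distinction concrete, here is a minimal sketch of how popularity and prestige could be computed from a citation dataset. The "highly cited" cutoff used here (top 10% by citation count) is an assumption for illustration; Ding and Cronin use their own operationalisation.

```python
# Sketch only: popularity = citations from any paper in the set,
# prestige = citations from highly cited papers (here assumed: top 10% by citations).
from collections import defaultdict

def popularity_and_prestige(papers):
    """papers: list of dicts with 'id', 'authors' (list of names) and
    'cited_by' (list of ids of papers citing this one)."""
    cites = {p["id"]: len(p["cited_by"]) for p in papers}
    cutoff = sorted(cites.values(), reverse=True)[max(0, len(cites) // 10 - 1)]
    highly_cited = {pid for pid, c in cites.items() if c >= cutoff}

    popularity, prestige = defaultdict(int), defaultdict(int)
    for p in papers:
        for citing_id in p["cited_by"]:
            for author in p["authors"]:
                popularity[author] += 1          # cited by any paper in the set
                if citing_id in highly_cited:
                    prestige[author] += 1        # cited by a highly cited paper
    return popularity, prestige
```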

Science-Metrix develops new journal classification system (and interactive web tool) using both the Web of Science (Thomson Reuters) and Scopus (Elsevier).

A new set of tools recently released (first public release: 2010-12-01, v1.00) by Science-Metrix Inc. seeks to improve the way we talk about and understand science. The US/Canada-based research evaluation firm has developed a new, multilingual (18 languages!) classification of scientific journals, accompanied by an interactive web tool. The journal classification, which covers 15,000 peer-reviewed scientific journals, has been translated by more than 22 international experts who volunteered their time and expertise, making the tools available to a worldwide audience. The complete journal list is available for download as an Excel sheet.

The interactive ‘Scientific Journals Ontology Explorer’ allows users to visualise the links between 175 scientific specialties (subject categories) in 18 languages, from Arabic to Swedish.

The visualization contains 3 different views: a circular “Subfield Citation Wheel” (representing both citations as well as references), a “Field Citation Wheel” (showing the links between distinct scientific disciplines) and a network “Map of Science” (revealing similarities between disciplines by relative distance). The goal of this visualization is to show people how science spans a broad universe and how interlinked scientific research actually is.

How is the field of Business & Management covered in the journals list?

The field of Economics & Business (as part of the domain Economic & Social Sciences) contains 822 journals (5.4% of the total Science-Metrix list) in 12 subfields, where Business & Management is included with 222 journals (27%):

Every journal is classified into only one category.

Subfields of ‘Economics & Business’ N %
Accounting 32 3.9%
Agricultural Economics & Policy 27 3.3%
Business & Management 222 27.0%
Development Studies 42 5.1%
Econometrics 13 1.6%
Economic Theory 13 1.6%
Economics 244 29.7%
Finance 63 7.7%
Industrial Relations 21 2.6%
Logistics & Transportation 49 6.0%
Marketing 61 7.4%
Sport, Leisure & Tourism 35 4.3%
Total Journals 822 100%

I will compare these with our ERIM Journals List, the ISI quartiles and the SJR (Scopus) quartile scores to see how the list is structured in terms of quality layers.

(I will add these details later this week.)

In the ontology browser, you can create a map of science and learn how the field of business and management is connected to other subject categories. I have selected the closest fields in the screenshot below.

[Screenshot: Business & Management in the Science-Metrix Map of Science]

About Science-Metrix:

Science-Metrix is the world’s largest independent firm dedicated to scientometrics, technometrics, and science and technology (S&T) evaluation. The firm’s core business involves supporting evidence-based decision-making with strong empirical data and sound theoretical approaches. This contract research organization combines qualitative and quantitative techniques to deliver high quality program evaluations, performance and outcome assessments, and evaluation frameworks. Every Science-Metrix report is produced by a team of dedicated high-calibre experts and relies on the world-class data found in the Scopus, Web of Science and Questel databases.

How are the stocks (top journals) performing over a period of 28 years?

ScienceWatch has published the latest Journal Performance Indicators (JPIs) for the field of business. The table below shows which journals perform at the top over decades. It shows that the very top of the field is very stable over a longer period. Publishing in these top journals can be seen as a stable investment for researchers; it increases the chance of a good citation yield.

The table below compares the citation impact of journals in the field of business as measured over three different time spans. The left-hand column ranks journals based on their 2008 "impact factor," as enumerated in the current edition of Journal Citation Reports. The 2008 impact factor is calculated by taking the number of all current citations to source items published in a journal over the previous two years and dividing by the number of articles published in the journal during the same period; in other words, it is a ratio between citations and recent citable items published. The rankings in the next two columns show impact over longer time spans, based on figures from Journal Performance Indicators. In these columns, total citations to a journal's published papers are divided by the total number of papers that the journal published, producing a citations-per-paper impact score over a five-year period (middle column) and a 28-year period (right-hand column).
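
As a quick illustration of the two ratios described above (with made-up counts, not actual JCR or JPI data):

```python
# Hypothetical illustration of the two ratios described above.
def impact_factor_2008(cites_2008_to_2006_07_items, citable_items_2006_07):
    # JCR-style two-year impact factor: recent citations / recent citable items.
    return cites_2008_to_2006_07_items / citable_items_2006_07

def citations_per_paper(total_citations, total_papers):
    # JPI-style score over a longer window (5 or 28 years): all citations / all papers.
    return total_citations / total_papers

print(impact_factor_2008(613, 100))    # -> 6.13, roughly AMR's 2008 impact factor
print(citations_per_paper(8977, 100))  # -> 89.77, roughly ASQ's 1981-2008 score
```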

Journals Ranked by Impact: Business
Rank | 2008 Impact Factor | Impact 2004-08 | Impact 1981-2008
1 | Academy of Management Review (6.13) | Administrative Science Quarterly (9.83) | Administrative Science Quarterly (89.77)
2 | Academy of Management Journal (6.08) | Academy of Management Journal (9.79) | Academy of Management Review (66.56)
3 | Journal of Retailing (4.10) | Academy of Management Review (8.91) | Journal of Marketing (53.84)
4 | Journal of Marketing (3.60) | Journal of Marketing (8.72) | Academy of Management Journal (50.48)
5 | Strategic Management Journal (3.34) | Strategic Management Journal (7.07) | Strategic Management Journal (46.01)
6 | Marketing Science (3.31) | Journal of Management (6.35) | Journal of Consumer Research (39.30)
7 | Journal of Management (3.08) | Marketing Science (6.10) | Journal of Marketing Research (36.34)
8 | Journal of International Business Studies (2.99) | Journal of Consumer Psychology (5.30) | Journal of Management (28.14)
9 | Administrative Science Quarterly (2.85) | Journal of International Business Studies (5.05) | Sloan Management Review (20.38)
10 | Journal of Consumer Psychology (2.84) | Journal of Organizational Behavior (4.85) | Marketing Science (20.36)

SCImago Institutions Rankings (SIR): 2009 World Report; how does the Netherlands score?

SCImago Institutions Rankings (SIR):  2009 World Report

The SCImago Research Group (SRG) has published its SIR 2009 World Report. This report presents a ranking of more than 2,000 of the best worldwide research institutions and organizations whose output surpassed 100 scholarly publications during 2007. The ranking shows 5 indicators of institutional research performance, stressing output (the ordering criterion), collaboration and impact. Analyzed institutions are grouped into five research sectors: Government, Higher Education, Health, Corporate and Others. The resulting list includes institutions from 84 countries on five continents.
Publication and citation data used to build this report come from Elsevier's Scopus database. Current coverage includes data from more than 17,000 research publications, mainly journals and proceedings, embracing the full range of scholarly research. The analyzed period runs from 2003 to 2007.
The construction of the current report involves the challenging task of identifying and normalizing the 2,000 institutions across an overwhelming number of publications. The work, carried out by a mix of computer and human means, comprises the identification and gathering of institutions' affiliation variants under a single identifiable form as well as the classification into research sectors.
Rank Indicators

Output: An institution's publication output reveals its scientific outcomes and trends in terms of documents published in scholarly journals. Publication output values are affected by institution size and research profile, among other factors. This indicator forms the basis for more complex metrics. For co-authored publications, a score is assigned to each contributing institution through the authors' institutional addresses.

Cites per Document (CxD): This indicator shows the average scientific impact of an institution's publication output in terms of citations per document. The values express the average citations received by the institution's published documents over the whole period. The values are affected by institution research profiles.

International Collaboration (Int. Coll.): This value shows the ratio of the institution's output that has been produced in collaboration with foreign institutions. The values are computed by analyzing the share of the institution's output whose affiliations include more than one country address over the whole period.

Normalized SJR (Norm. SJR): This shows the average importance of the journals in which an institution's output is published. The indicator used to calculate this average is the SJR indicator (www.scimagojr.com). A value larger than 1 means that, on average, the papers of an institution have been published in journals whose importance is above the average of their scientific field, whereas a value smaller than 1 means that, on average, the papers have been published in journals whose importance is below that average.

Field Normalized Citation Score (Norm. Cit.): This indicator reveals the ratio between the average scientific impact of an institution and the world average impact of publications of the same time frame and subject area. It is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named the "item-oriented field normalized citation score average", since the normalization of the citation values is done at the level of individual articles. The values show the relationship of the institution's average impact to the world average, which is 1; i.e. a score of 0.8 means the institution is cited 20% below the world average and 1.3 means the institution is cited 30% above it.
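
A minimal sketch of the item-oriented normalization described above (the field baseline and paper counts below are hypothetical; this is not SCImago's code):

```python
# Sketch: each paper's citations are divided by the world average for its
# field and year, and the institution score is the mean of those ratios.
def field_normalized_citation_score(papers, world_baseline):
    """papers: list of (field, year, citations); world_baseline: dict mapping
    (field, year) -> world average citations per paper (hypothetical values here)."""
    ratios = [c / world_baseline[(field, year)] for field, year, c in papers]
    return sum(ratios) / len(ratios)

baseline = {("economics", 2007): 5.0}
papers = [("economics", 2007, 5), ("economics", 2007, 8)]
print(field_normalized_citation_score(papers, baseline))  # 1.3 -> cited 30% above world average
```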

The NL score: with special attention for Erasmus University

I will focus on the normalized citation score only. Although Erasmus University is no. 6 in NL in output, EUR scores highest of all NL university institutions in the list on citation impact (and 4th if we also include the other sectors in the list). This means Erasmus University is the university with the highest citation impact in the Netherlands. I think this is an amazing score for one of the smaller universities in NL. Compared to the overall SIR ranking, however, EUR is at position 130 on this criterion. In the table below, the figures in parentheses after the Erasmus values give its rank among the NL institutions on each indicator.

Rank Organization Sector Output CxD Int. Coll. Norm. SJR Norm. Cit.
42 Universiteit Utrecht Higher educ. 23031 10.48 40.51 1.08 1.71
53 Universiteit van Amsterdam Higher educ. 20608 10.51 42.3 1.07 1.73
138 Universiteit Leiden Higher educ. 12090 10.55 48.97 1.08 1.52
197 Rijksuniversiteit Groningen Higher educ. 9649 8.32 43.43 1.06 1.47
227 Vrije Universiteit Amsterdam Higher educ. 8812 9.28 48.41 1.06 1.54
253 Erasmus Universiteit Rotterdam Higher educ. 8172 (6) 12.24 (3) 38.85 (16) 1.1 (2) 1.91 (4)
256 Technische Universiteit Delft Higher educ. 8156 5.27 41.6 0.93 1.56
319 Wageningen Universiteit Higher educ. 6843 7.86 52.71 1.06 1.38
323 Technische Universiteit Eindhoven Higher educ. 6823 6.01 44 0.98 1.7
340 Radboud Universiteit Nijmegen Higher educ. 6437 8.21 44.79 1.06 1.44
397 Academic Medical Center Health 5500 12.52 31.24 1.1 1.81
414 Universiteit Twente Higher educ. 5372 5.64 41.7 0.97 1.54
478 TNO Government 4672 7.82 35.81 1.01 1.42
565 Universiteit Maastricht Higher educ. 4021 9.02 42.15 1.09 1.65
787 University Medical Center St Radboud Health 2733 8.45 26.16 1.08 1.39
885 Netherlands Institute for Metals Research Government 2372 9.65 34.78 1.05 1.91
918 Nederlands Kanker Instituut/ALZ Health 2267 18.2 39.52 1.13 2.1
995 Philips Research Private 2002 6.55 42.61 0.87 1.97
1090 Dutch Polymer Institute Government 1735 10.58 35.39 1.07 2.04
1124 Universiteit van Tilburg Higher educ. 1633 4.64 47.52 1.02 1.49
1257 Academic Hospital Maastricht Health 1325 10.83 38.04 1.1 1.79
1292 KNAW Government 1264 11.32 48.66 1.1 1.78

Capturing Research Impacts: A review of international practice

To help inform the development of the Research Excellence Framework (REF), HEFCE commissioned RAND Europe to carry out a review of international approaches to evaluating the impact of research. This report presents the findings of the review, based on four case study examples.  The full report is here: http://www.hefce.ac.uk/Pubs/RDreports/2009//rd23_09/rd23_09.pdf

The review identifies relevant challenges and lessons from international practice and suggests that the work of the Australian RQF Working Group on Impact Assessment might provide a basis for developing an approach to impact in the REF. The report makes a number of other recommendations concerning attribution, burden and the role of research users, which are outlined in the executive summary.

The purpose of this report is to inform the Higher Education Funding Council for England’s (HEFCE’s) formulation of an approach to assess research impact as part of the proposed Research Excellence Framework (REF). HEFCE has identified several criteria that would be significant in developing an impact assessment framework. The framework should:

  1. be credible and acceptable to the academic as well as user communities
  2. encompass the full range of economic, social, public policy, welfare, cultural and quality-of-life benefits
  3. within a single broad approach, be adaptable to apply to all disciplines
  4. be practicable and not generate an excessive workload for the sector
  5. avoid undesirable perceptions and incentives
  6. complement other funding streams including the research councils’ approach to increasing the impact of research.

To inform its thinking, HEFCE commissioned RAND Europe to undertake an international review of how other research agencies measure impact. The objectives of the review were: to review international practice in assessing research impact, and to identify relevant challenges, lessons and observations from international practice that will help HEFCE develop a framework for assessing research impact.

Following a quick scan of international examples of impact frameworks, the researchers  selected four frameworks for further analysis:

  1. the Australian Research Quality and Accessibility Framework (RQF),
  2. the UK RAND/ARC Impact Scoring System (RAISS),
  3. the US Program Assessment Rating Tool (PART), and
  4. the Dutch Evaluating Research in Context (ERiC).
