Measures of Value of a Journal Beyond the Impact Factor

Anita Coleman gives a nice overview of possible multi-dimensional measures of the value of academic journals beyond the traditional impact factor. To measure the value of a journal, Coleman first selects three measures: journal attraction power, author associativity, and journal consumption power. She redefines two of them as journal measures of affinity (the proportion of foreign authors) and associativity (the amount of collaboration), and calculates these as objective indicators of journal value. To illustrate the multi-dimensional character of the value of a journal, I selected the following list of possible measures from Coleman’s article (in alphabetical order):

  1. Acceptance and Rejection rates
  2. Adjusted Impact Factor
  3. Article Quality
  4. Author Reputation Score
  5. Average Ranking Position
  6. Circulation size
  7. Citation Rate
  8. Citing Half-Life
  9. Degree of specialization
  10. Disciplinary Impact Factor
  11. Editorial board
  12. Editorial standards
  13. Immediacy Index
  14. Impact Factor
  15. Importance Index
  16. Influence Weight
  17. Journal Age
  18. Journal origin and orientation
  19. Mean Response Time
  20. Popularity Factor
  21. Readership
  22. References per Paper
  23. Reprint distribution
  24. School Reputation Score
  25. Self-citedness
  26. Standing
  27. Type of research covered
  28. Uncitedness
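Coleman’s two redefined measures, affinity and associativity, are simple proportions and can be sketched in a few lines of code. A minimal illustration, assuming a toy data model in which each paper is just a list of its authors’ countries (this data model is my own assumption for the example, not Coleman’s actual dataset):

```python
# Sketch of Coleman's affinity and associativity measures on toy data.
# The data model (papers as lists of author countries) is an assumption
# made for this illustration only.

def affinity(papers, home_country):
    """Proportion of authors who are foreign to the journal's home country."""
    authors = [country for paper in papers for country in paper]
    foreign = sum(1 for country in authors if country != home_country)
    return foreign / len(authors)

def associativity(papers):
    """Proportion of papers written in collaboration (more than one author)."""
    collaborative = sum(1 for paper in papers if len(paper) > 1)
    return collaborative / len(papers)

papers = [["NL"], ["NL", "US"], ["UK", "DE"], ["NL", "NL"]]
print(affinity(papers, "NL"))   # foreign authors / all authors (3 of 7)
print(associativity(papers))    # multi-author papers / all papers (3 of 4)
```

Both values fall between 0 and 1, which makes them easy to compare across journals of very different sizes.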

Peer review in the internet age: 5 alternatives

Gerry McKiernan presents five alternative models for the classical peer review system:

  1. Open peer review (authors know the identity of reviewers)
  2. Commentary-based review (a two-stage procedure in which the first review phase is open)
  3. Community-based review (all submissions are accepted with minimal review in a standard tier, and only a few receive full peer review in the upper tier)
  4. Usage-based review (a metric that uses access statistics as an indicator of significance)
  5. Citation-based review (Citebase & Web Citation Index)
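The usage-based model (item 4) is the easiest of the five to make concrete. A minimal sketch, assuming we simply normalise a paper’s download count against the average downloads of papers in its field (the normalisation scheme here is my own illustration, not a metric defined by McKiernan):

```python
# Toy usage-based significance indicator: a paper's downloads relative
# to the field average. The normalisation choice is an assumption for
# this sketch, not McKiernan's definition.

def usage_score(downloads, field_downloads):
    """Downloads of one paper divided by the mean downloads in its field."""
    field_mean = sum(field_downloads) / len(field_downloads)
    return downloads / field_mean

field = [120, 80, 200, 40, 60]   # download counts for papers in the field
print(usage_score(250, field))    # > 1.0 means above-average usage
```

A score above 1.0 flags above-average usage; like the impact factor, though, any such metric inherits the biases of its denominator (how the field is delimited).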

The article ends with the following lines:

As observed by Harnad, “the Net …[not only] offers the possibility of implementing peer review more efficiently and equitably …,” but more significantly, provides a “real revolutionary dimension” with such features as “open peer commentary on published and ongoing work.” In addition, the Net provides “room … for unrefereed discussion too, [notably] in high-level discussion forums ….” Such enhancements to conventional peer review need not, however, be limited to features that some may view as simple extensions of the traditional model. In addition to ‘ideal’ conversations, metrics such as access statistics, as well as citing and linking, can also offer impartial indicators of valid and significant scholarship in all its forms, at any and all stages.

Open Access, downloads (‘hits’) & citation impact: a bibliography

The Open Citation Project (OpCit) maintains a bibliography on the effects of downloads on citation impact. The general assumption is that papers distributed in open archives before final publication as journal articles receive more citations (Times Cited) once the final article appears in a journal. This bibliography is very useful because it brings together the most relevant new papers and reports on the topic. I have put a watch on that site to make sure I stay in touch with this bibliography.

Review of International practices and measures of quality and impact of research

As part of the Australian Research Evaluation and Policy Project (REPP), a recent review report lists 40 possible indicators of quality. The report “A Review of Current Australian and International Practice in Measuring the Quality and Impact of Publicly Funded Research in the Humanities, Arts and Social Sciences” by C. Donovan (REPP Discussion Paper 05/3, November 2005) provides the following list (and much more very useful information on the design of the assessment process):

Indicator 1. Number of publications
Indicator 2. Number of funded papers
Indicator 3. Number of ISI publications
Indicator 4. Number of ISI publications, weighted by journal impact
Indicator 5. Number of publications in top journals
Indicator 6. Distribution of publications over journal impact classes
Indicator 7. Journal Impact Factor (IF)
Indicator 8. Expected Citation Impact
Indicator 9. World Citation Impact
Indicator 10. Journal to Field Impact Score (JFIS)
Indicator 11. Share of self-citations
Indicator 12. Number of Citations
Indicator 13. Number of external citations
Indicator 14. Citations per publication
Indicator 15. Uncitedness
Indicator 16. Distribution of publications over field-normalised impact classes
Indicator 17. Top 5% most frequently cited publications
Indicator 18. Comparison of actual citation rates to journal averages
Indicator 19. Comparison of actual citation rates to the field average
Indicator 20. Weighted impact
Indicator 21. Ratio journal impact to subfield impact
Indicator 22. Position in the journal spectrum
Indicator 23. Field distribution of publications
Indicator 24. Field distribution of citations
Indicator 25. Number of citers
Indicator 26. Level of collaboration
Indicator 27. Country analysis of collaborators and citers
Indicator 28. Level of Research
Indicator 29. Activity Index
Indicator 30. Number of researchers
Indicator 31. Research time
Indicator 32. External funding
Indicator 33. Research students data
Indicator 34. Keynote addresses
Indicator 35. International visits
Indicator 36. Honours and awards
Indicator 37. Election to learned societies
Indicator 38. Editorial board membership
Indicator 39. Membership of review committees
Indicator 40. Membership of government bodies
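Several of these indicators are straightforward to compute once the underlying citation data is available. As one example, indicator 11 (share of self-citations) can be sketched on toy data; the author-overlap definition and the data below are my own illustration, not Donovan’s specification:

```python
# Indicator 11 (share of self-citations) sketched on invented data.
# A citation counts as a self-citation when the citing and cited
# author sets overlap; this definition is an assumption for the example.

def self_citation_share(citations):
    """citations: list of (citing_authors, cited_authors) pairs."""
    self_cites = sum(
        1 for citing, cited in citations if set(citing) & set(cited))
    return self_cites / len(citations)

cites = [
    (["Smith"], ["Smith", "Jones"]),   # self-citation
    (["Brown"], ["Smith"]),            # external citation
    (["Jones", "Lee"], ["Jones"]),     # self-citation
    (["Lee"], ["Brown"]),              # external citation
]
print(self_citation_share(cites))  # 0.5
```

In practice the hard part is not the arithmetic but disambiguating author names, which is why indicator definitions matter so much in assessment exercises.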

China’s R&D intensity will catch up with Europe’s by 2010

Andreas von Bubnoff reports (Nature 436, 314 (21 July 2005)) that the rise in China’s share of global scientific output is striking. I think we had better accept this new reality in the international research landscape and adapt to it.

By comparison, Europe last year produced 38% of the world’s scientific papers, and the United States produced 33% (see Graph). Although it is the current world leader, Europe is beginning to worry. The European Commission is due to release a report this week saying that the European Union (EU) may not reach its stated spending goals for research and development by the end of this decade.

Thomson’s findings echo a highly regarded 2004 National Science Foundation (NSF) analysis — the biennial Science and Engineering Indicators. This showed that the number of US papers published has remained essentially flat over the past decade, whereas the rest of the world has been publishing more with every year.

Source: Thomson Scientific, National Science Indicators

Measuring scientific contributions through automatic acknowledgment indexing

In a recent article, C. Lee Giles and Isaac G. Councill (PNAS, December 21, 2004, vol. 101, no. 51) present an alternative way to measure impact: through acknowledgments. The approach got quite some media attention.

Abstract:

Acknowledgments in research publications, like citations, indicate influential contributions to scientific work. However, acknowledgments are different from citations; whereas citations are formal expressions of debt, acknowledgments are arguably more personal, singular, or private expressions of appreciation and contribution. Furthermore, many sources of research funding expect researchers to acknowledge any support that contributed to the published work. Just as citation indexing proved to be an important tool for evaluating research contributions, we argue that acknowledgments can be considered as a metric parallel to citations in the academic audit process. We have developed automated methods for acknowledgment extraction and analysis and show that combining acknowledgment analysis with citation indexing yields a measurable impact of the efficacy of various individuals as well as government, corporate, and university sponsors of scientific work.

The authors state that ‘Acknowledgments may be made for a number of reasons but often imply significant intellectual debt. Just as citation indexing proved to be an important tool for evaluating research contributions, acknowledgments can be considered a metric parallel to citations in the academic audit process’.

Acknowledgments embody a wide range of relationships among people, agencies, institutions, and research.
Classification schemes exist for six categories of acknowledgment:

  1. moral support;
  2. financial support;
  3. editorial support;
  4. presentational support (e.g., presenting a paper at a conference);
  5. instrumental/technical support; and
  6. conceptual support, or peer interactive communication (PIC).

In this article, acknowledged entities are listed in four categories:

  1. funding agencies,
  2. companies,
  3. educational institutions, and
  4. individuals
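The kind of automated extraction Giles and Councill describe can be sketched very crudely. A minimal illustration, assuming a regular expression that locates an “Acknowledgments” section and a hand-made keyword list for the funding-agency category; their actual system is far more sophisticated than this:

```python
import re

# Crude sketch of acknowledgment extraction and one-category tagging.
# The section heuristic and the keyword list are assumptions for this
# illustration; Giles and Councill's system is much more robust.

def extract_acknowledgments(text):
    """Return the acknowledgments passage, or None if not found."""
    match = re.search(
        r"Acknowledge?ments?\.?\s*(.+?)(?:\n\s*\n|$)",
        text, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

FUNDING_CUES = ("NSF", "grant", "funded", "foundation")

def mentions_funding(acknowledgment):
    """Very rough test for the 'funding agencies' category."""
    return any(cue.lower() in acknowledgment.lower() for cue in FUNDING_CUES)

paper = ("...\n\nAcknowledgments. This work was funded by "
         "NSF grant 0123.\n\nReferences")
ack = extract_acknowledgments(paper)
print(ack)
print(mentions_funding(ack))  # True
```

Even this toy version shows why the four entity categories above matter: funding agencies tend to be flagged by a small set of lexical cues, whereas individuals require name recognition.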

NL average = 4 citations / paper over the years 1995-2005

According to the latest in-cites statistics, published this month, the Netherlands is number 4, measured over the last 10 years, in the field of Economics & Business. I think this is a very good result, but at the same time I think it is very difficult to reach this average as a school. Still, it is a nice figure to remember as a general benchmark. This is the overall list:

(Source: in-cites, January 2006, http://www.in-cites.com/countries/top20eco.html)

RANK  COUNTRY          PAPERS   CITATIONS  CITATIONS PER PAPER
1     USA              62,633   392,238    6.26
2     ENGLAND          15,012   65,196     4.34
3     CANADA           7,307    31,642     4.33
4     NETHERLANDS      4,208    16,831     4.00
5     FRANCE           4,251    15,569     3.66
6     AUSTRALIA        4,493    12,611     2.81
7     GERMANY          4,694    12,388     2.64
8     ISRAEL           1,725    9,130      5.29
9     SWEDEN           1,956    8,506      4.35
10    ITALY            2,468    7,488      3.03
11    SPAIN            2,609    7,430      2.85
12    BELGIUM          1,742    7,240      4.16
13    SCOTLAND         1,646    5,894      3.58
14    SWITZERLAND      1,333    5,634      4.23
15    HONG KONG        693      5,222      7.54
16    JAPAN            2,019    4,713      2.33
17    PEOPLES R CHINA  1,820    4,572      2.51
18    SOUTH KOREA      1,181    3,861      3.27
19    DENMARK          1,222    3,805      3.11
20    NORWAY           1,043    3,324      3.19
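The citations-per-paper column is simply citations divided by papers, so the benchmark is easy to verify. A quick check on a few rows of the list (figures copied from the in-cites table above):

```python
# Recompute the citations-per-paper column from the in-cites figures.
rows = {
    "USA": (62_633, 392_238),
    "NETHERLANDS": (4_208, 16_831),
    "HONG KONG": (693, 5_222),
}

for country, (papers, citations) in rows.items():
    print(country, round(citations / papers, 2))
```

The Netherlands comes out at almost exactly 4 citations per paper, which is what makes it such a handy round-number benchmark; note also how a small producer like Hong Kong can top the per-paper ranking while publishing far fewer papers.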
