New OCLC Report on Research Assessment Regimes in Five Countries and the Role of Libraries in the Research Assessment Process

Comparing national research assessment systems is a hot topic at the moment. Here is another (pilot) study on the subject, this time commissioned by OCLC Research. The report was written by Key Perspectives Ltd, a UK library and scholarly publishing consultancy.

New Report, “A Comparative Review of Research Assessment Regimes in Five Countries and the Role of Libraries in the Research Assessment Process”

DUBLIN, Ohio, USA, 17 December 2009—This report examines the role of research libraries in research assessment regimes in five different countries and identifies a new set of responsibilities emerging for research libraries.

This report studies the role of research libraries in the higher education research assessment regimes in five countries: the Republic of Ireland, the UK, the Netherlands, Denmark and Australia.  A companion report, in summary form with recommendations, will be published by early January 2010.

Survey of Current Practice in Research Assessment Activity
http://www.oclc.org/research/activities/practice/default.htm

Capturing Research Impacts: A review of international practice

To help inform the development of the Research Excellence Framework (REF), HEFCE commissioned RAND Europe to carry out a review of international approaches to evaluating the impact of research. This report presents the findings of the review, based on four case study examples.  The full report is here: http://www.hefce.ac.uk/Pubs/RDreports/2009//rd23_09/rd23_09.pdf

The review identifies relevant challenges and lessons from international practice and suggests that the work of the Australian RQF Working Group on Impact Assessment might provide a basis for developing an approach to impact in the REF. The report makes a number of other recommendations concerning attribution, burden and the role of research users, which are outlined in the executive summary.

The purpose of this report is to inform the Higher Education Funding Council for England’s (HEFCE’s) formulation of an approach to assess research impact as part of the proposed Research Excellence Framework (REF). HEFCE has identified several criteria that would be significant in developing an impact assessment framework. The framework should:

  1. be credible and acceptable to the academic as well as user communities
  2. encompass the full range of economic, social, public policy, welfare, cultural and quality-of-life benefits
  3. within a single broad approach, be adaptable to apply to all disciplines
  4. be practicable and not generate an excessive workload for the sector
  5. avoid undesirable perceptions and incentives
  6. complement other funding streams including the research councils’ approach to increasing the impact of research.

To inform their thinking, HEFCE commissioned RAND Europe to undertake an international review of how other research agencies measure impact. The objectives of the review were:

  1. to review international practice in assessing research impact and
  2. to identify relevant challenges, lessons and observations from international practice that will help HEFCE develop a framework for assessing research impact.

Following a quick scan of international examples of impact frameworks, the researchers selected four frameworks for further analysis:

  1. the Australian Research Quality Framework (RQF)
  2. the UK RAND/ARC Impact Scoring System (RAISS)
  3. the US Program Assessment Rating Tool (PART)
  4. the Dutch Evaluating Research in Context (ERiC).

8 perspectives on measuring research

In a recent essay Douglas Comer lists 8 different ways to measure research. What is measured depends on the perspective and role of the evaluator. Here are the 8 ways, each with the point of view that favours it; a small sketch computing these measures follows the list:

Journal Paper Approach (preferred by journal publishers)
Measure: N, the total number of papers published.

Rate Of Publication Approach (preferred by young researchers)
Measure: N/T, the ratio of total papers published to the time in which they were published.

Weighted Publication Approach (preferred by accreditation agencies)
Measure: W, the sum of the weights assigned to published papers.

Millions Of Monkeys Approach (preferred by government granting agencies)
Measure: G, the total amount of taxpayer money distributed for research.

Direct Funding Approach (preferred by department heads)
Measure: D, the total dollars of grant funds acquired by a researcher.

Indirect Funding Approach (preferred by university administrators)
Measure: O, the total overhead dollars generated.

Bottom Line Approach (preferred by industrial research labs)
Measure: P, the profit generated by patents or products that result from the research.

Assessment Of Impact Approach (preferred by the handful of researchers who actually achieve something)
Measure: I/R, Ratio of the impact of the work to the amount of resources used to generate it.
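To make the arithmetic behind these measures concrete, here is a minimal Python sketch assuming a hypothetical per-researcher record. All field names and numbers are invented for illustration and are not taken from Comer's essay; G is left out because it measures a granting agency's total spending rather than a single researcher.

from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class ResearchRecord:
    # Hypothetical illustrations of the quantities named in Comer's list.
    papers: List[Tuple[str, float]]  # (title, accreditation weight) per paper
    years: float                     # T: span of the publication record, in years
    grant_dollars: float             # D: direct grant funds acquired
    overhead_rate: float             # fraction of grants charged as overhead
    profit: float                    # P: profit from resulting patents/products
    impact: float                    # I: agreed-upon impact score of the work
    resources: float                 # R: resources used to generate the work

def measures(r: ResearchRecord) -> Dict[str, float]:
    n = len(r.papers)  # N: total number of papers published
    return {
        "N (journal paper)": n,
        "N/T (rate of publication)": n / r.years,
        "W (weighted publication)": sum(w for _, w in r.papers),
        "D (direct funding)": r.grant_dollars,
        "O (indirect funding)": r.grant_dollars * r.overhead_rate,
        "P (bottom line)": r.profit,
        "I/R (impact per resource)": r.impact / r.resources,
    }

# A hypothetical five-year record; change the numbers and the ranking changes.
rec = ResearchRecord(
    papers=[("paper A", 1.0), ("paper B", 0.5), ("paper C", 2.0)],
    years=5.0,
    grant_dollars=300_000.0,
    overhead_rate=0.5,
    profit=0.0,
    impact=40.0,
    resources=350_000.0,
)
for name, value in measures(rec).items():
    print(f"{name}: {value:g}")

The point of the sketch is only that each measure is a one-line computation over different inputs; which line an evaluator reads off depends entirely on who is doing the measuring.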

All perspectives have pros and cons, and I advise reading the essay for the details. The important thing for research managers to keep in mind is the position of the person who measures research. There is nothing wrong with any of these perspectives, as long as it is clear that each one looks at only a partial aspect of the reality of research.

KNAW published new SEP protocol 2009-2015

This week the Royal Netherlands Academy of Arts and Sciences (KNAW) published the new Standard Evaluation Protocol (SEP), which applies to all fields of research in the Netherlands. The new feature is that it also looks at doctoral training. This means that research and doctoral training are seen as integrated, which is a good development.

SEP protocol
