Metrics can’t replace expert judgment in science assessments, report by the Canadian Expert Panel on Science Performance and Research Funding

See on Scoop.it: Dual impact of research; towards the impactelligent university

 

There are many quantitative indicators to assess science performance, the report notes. These include not just bibliometric indicators like publications and citation counts, but also patterns and trends in grant applications and research funding, measures of “esteem” such as academic honours and awards, and a host of new internet-based metrics such as the number of article downloads.
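To make the flavour of such quantitative indicators concrete, here is a minimal sketch in Python that tallies publication, citation, and download counts per field. The records, field names, and numbers are hypothetical illustrations, not data from the report.

```python
from collections import defaultdict

# Hypothetical publication records; fields and numbers are illustrative
# assumptions, not data from the report.
publications = [
    {"field": "Physics",   "citations": 14, "downloads": 230},
    {"field": "Physics",   "citations": 3,  "downloads": 85},
    {"field": "Chemistry", "citations": 9,  "downloads": 140},
]

# Tally simple quantitative indicators per field:
# publication count, total citations, and total downloads.
totals = defaultdict(lambda: {"papers": 0, "citations": 0, "downloads": 0})
for pub in publications:
    t = totals[pub["field"]]
    t["papers"] += 1
    t["citations"] += pub["citations"]
    t["downloads"] += pub["downloads"]

for field, counts in totals.items():
    print(field, counts)
```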

 

These indicators and assessment approaches “are sufficiently robust” to be used to assess science performance in the aggregate at the national level, the report concludes. However, these indicators “should be used to inform rather than replace expert judgment,” which the expert panel says remains “invaluable.” Examples of expert judgment include peer review and other “deliberative” methods, says the report.

 

Summary of Methodological Guidelines
Context is critical in determining whether any science indicator or assessment strategy is appropriate and informative. As a result, it is impossible to provide a list of universally applicable best practices. With respect to assessing scientific research in the natural sciences and engineering (NSE) at the level of nationally aggregated research fields, however, the following general methodological guidelines may be of assistance.

 

Assessments of Research Quality
Indicators associated with monitoring research quality often relate to different aspects of quality or different timeframes. As a result, the best approach relies on a combination of assessment strategies and indicators.

• For an assessment of research quality of a field at the national level, a balanced combination of deliberative methods and quantitative indicators is the strongest approach.

• For an assessment of the scientific impact of research in a field at the national level, indicators based on relative, field-normalized citations (e.g., average relative citations) offer the best available metrics. At this level of aggregation, when appropriately normalized by field and based on a sufficiently long citation window, these measures provide a defensible and informative assessment of the impacts of past research in the NSE (a minimal computational sketch follows this list).

• Quantitative indicators of research quality should always be evaluated by informed expert review because accurate interpretation of data from available indicators can require detailed contextual knowledge of a field.
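As a rough illustration of what a field-normalized indicator such as average relative citations (ARC) involves, the sketch below divides each paper’s citation count by a hypothetical world-average citation rate for its field and then averages the ratios. All records and baseline values are illustrative assumptions, not figures from the report; in practice the normalization is also done per publication year and over a sufficiently long citation window.

```python
# Minimal sketch of an average-relative-citations (ARC) style calculation.
# Each paper's citations are divided by the world average for its field
# (in practice also by publication year), then the ratios are averaged.
# The records and baselines below are illustrative assumptions.
papers = [
    {"field": "Physics",   "citations": 12},
    {"field": "Physics",   "citations": 4},
    {"field": "Chemistry", "citations": 20},
]

# Hypothetical world-average citations per paper, per field, over the window.
world_average = {"Physics": 8.0, "Chemistry": 10.0}

relative = [p["citations"] / world_average[p["field"]] for p in papers]
arc = sum(relative) / len(relative)

print(f"Average relative citations (ARC): {arc:.2f}")  # 1.0 means world average
```

A value above 1.0 indicates that the field’s papers are cited more than the world average for comparable papers; a value below 1.0 indicates the opposite.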

 

In addition to methodological guidelines, the Panel developed the following general principles for defining a process for NSE assessment in the context of informing research funding allocation:

 

• Context matters: Effective use of indicators or assessment strategies, as applied to research fields in the NSE, is context dependent. Thus any approach should take into account national science and technology objectives as well as the goals and priorities of the organization and funding program.

• Do no harm: Attempts to link funding allocation directly to specific indicators have the potential to lead to unintended consequences with negative impacts on the research community. Promising strategies identified by the Panel to mitigate this risk include relying on a balanced set of indicators and expert judgment in the assessment process.

• Transparency is critical: Assessment methods and indicators are most effective when fully transparent to the scientific community. Such transparency should include both the assessment methods or indicators (e.g., indicator construction and validation, data sources, criteria, procedures for selecting expert reviewers) and the method or process by which the indicators or assessments inform or influence funding decisions.

• The judgment of scientific experts remains invaluable: Many quantitative indicators are capable of providing useful information in the assessment of discovery research at the national and field level. In the context of informing research funding decisions, however, quantitative indicators are best interpreted by scientific experts with detailed knowledge and experience in the relevant fields of research, a deep and nuanced understanding of the research funding contexts in question, and an appreciation of the scientific issues, problems, questions, and opportunities at stake.

 

Source:

Council of Canadian Academies, Expert Panel on Science Performance and Research Funding. Informing Research Choices: Indicators and Judgment. ISBN 978-1-926558-42-4

 

Fulltext:

http://www.scienceadvice.ca/uploads/eng/assessments%20and%20publications%20and%20news%20releases/Science%20performance/SciencePerformance_FullReport_EN_Web.pdf

 

See on www.universityaffairs.ca


