New OCLC Report on Research Assessment Regimes in Five Countries and the Role of Libraries in the Research Assessment Process

Comparing national research assessment systems is a hot topic at the moment. Here is another (pilot) study on the subject, this time commissioned by OCLC Research. The report was written by Key Perspectives Ltd, a UK library and scholarly publishing consultancy.

New Report, “A Comparative Review of Research Assessment Regimes in Five Countries and the Role of Libraries in the Research Assessment Process”

DUBLIN, Ohio, USA, 17 December 2009—This report examines the role of research libraries in research assessment regimes in five different countries and describes a new set of responsibilities that is emerging for research libraries.

This report studies the role of research libraries in the higher education research assessment regimes in five countries: the Republic of Ireland, the UK, the Netherlands, Denmark and Australia.  A companion report, in summary form with recommendations, will be published by early January 2010.

Survey of Current Practice in Research Assessment Activity
http://www.oclc.org/research/activities/practice/default.htm


SCImago Institutions Rankings (SIR): 2009 World Report; how does the Netherlands score?

SCImago Institutions Rankings (SIR):  2009 World Report

The SCImago Research Group (SRG) has published its SIR 2009 World Report. This report ranks more than 2,000 of the best research institutions and organizations worldwide whose output surpassed 100 scholarly publications during 2007. The ranking shows 5 indicators of institutional research performance, stressing output (the ordering criterion), collaboration and impact. The analyzed institutions are grouped into five research sectors: Government, Higher Education, Health, Corporate and Others. The resulting list includes institutions from 84 countries across the five continents.
Publication and citation data used to build this report come from Elsevier’s Scopus database. Current coverage includes data from more than 17,000 research publications – mainly journals and proceedings – embracing the full range of scholarly research. The analyzed period runs from 2003 to 2007.
The construction of the report involves the challenging task of identifying and normalizing the 2,000 institutions across an overwhelming number of publications. The work, carried out by a mix of computational and human means, comprises the identification and gathering of each institution’s affiliation variants under a unique identifiable form, as well as their classification into research sectors.
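The report does not detail the matching rules, but purely as an invented illustration of what gathering affiliation variants "under a unique identifiable form" can mean in practice, here is a minimal Python sketch (all variant strings are made up):

```python
# Illustrative only: map raw affiliation strings, as they appear on papers,
# to one canonical institution name. Real pipelines combine rules like this
# lookup table with manual curation, as the report describes.
VARIANTS = {
    "erasmus university rotterdam": "Erasmus Universiteit Rotterdam",
    "erasmus univ rotterdam": "Erasmus Universiteit Rotterdam",
    "erasmus universiteit rotterdam": "Erasmus Universiteit Rotterdam",
}

def normalize_affiliation(raw: str) -> str:
    """Return the canonical institution name for a raw affiliation string."""
    key = raw.strip().lower()
    return VARIANTS.get(key, raw.strip())  # unknown variants pass through

print(normalize_affiliation("Erasmus University Rotterdam"))
# Erasmus Universiteit Rotterdam
```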
Rank Indicators

Output: An institution’s publication output reveals its scientific outcomes and trends in terms of documents published in scholarly journals. Publication output values are affected by institution size and research profile, among other factors. This indicator forms the basis for more complex metrics. For co-authored publications, a score is assigned to each contributing institution through the authors’ institutional addresses.
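The report does not spell out the counting method, but the wording suggests whole counting: every institution appearing in a paper’s author addresses scores that paper once. A minimal sketch of that reading, with made-up records:

```python
from collections import Counter

# Made-up records: each paper lists the institutions found in its authors'
# affiliation addresses (names assumed already normalized).
papers = [
    {"id": 1, "institutions": {"Erasmus Universiteit Rotterdam", "Universiteit Leiden"}},
    {"id": 2, "institutions": {"Erasmus Universiteit Rotterdam"}},
    {"id": 3, "institutions": {"Universiteit Utrecht"}},
]

# Whole counting: each contributing institution scores the paper once.
output = Counter()
for paper in papers:
    for institution in paper["institutions"]:
        output[institution] += 1

print(output.most_common())
# [('Erasmus Universiteit Rotterdam', 2), ('Universiteit Leiden', 1),
#  ('Universiteit Utrecht', 1)]
```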

Cites per Document (CxD): This indicator shows the average scientific impact of an institution’s publication output in terms of citations per document. The values express the average number of citations received by the institution’s published documents over the whole period, and are affected by the institution’s research profile.
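In symbols (my notation, not the report’s): with D the set of an institution’s documents published in 2003–2007 and c_d the citations received by document d,

\[
\mathrm{CxD} = \frac{\sum_{d \in D} c_d}{|D|}
\]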

International Collaboration (Int. Coll.): This value shows the ratio of the institution’s output produced in collaboration with foreign institutions. The values are computed by counting the institution’s documents whose affiliations include more than one country address over the whole period.
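In the same notation, this is the share of documents whose author addresses span at least two countries:

\[
\mathrm{Int.\ Coll.} = \frac{\left|\{\, d \in D : \text{the affiliations of } d \text{ include more than one country} \,\}\right|}{|D|}
\]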

Normalized SJR (Norm. SJR): This indicator shows the average importance of the journals in which an institution’s output is published, measured with the SJR indicator (www.scimagojr.com). A value larger than 1 means that “on average” the papers of an institution have been published in journals whose “importance” is above the average of their scientific field, whereas a value smaller than 1 means they have been published in journals whose “importance” is below that average.
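The report does not give the exact formula; a plausible reading, consistent with the above/below-field-average interpretation, averages each paper’s journal SJR after normalizing it by the mean SJR of that journal’s field:

\[
\mathrm{Norm.\ SJR} = \frac{1}{|D|} \sum_{d \in D} \frac{\mathrm{SJR}(j_d)}{\overline{\mathrm{SJR}}_{f(j_d)}}
\]

where j_d is the journal that published document d and the bar denotes the average over that journal’s scientific field f(j_d).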

Field Normalized Citation Score (Norm. Cit.): This indicator reveals the ratio between the average scientific impact of an institution and the world average impact of publications of the same time frame and subject area. It is computed using the methodology established by the Karolinska Institutet in Sweden, where it is named “item oriented field normalized citation score average” because the normalization of the citation values is done at the level of the individual article. The values are expressed relative to the world average, which is set to 1: a score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above.
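In symbols (again my notation): each document’s citation count is normalized by the world-average citations of comparable publications before averaging, which is what makes the score “item oriented”:

\[
\mathrm{Norm.\ Cit.} = \frac{1}{|D|} \sum_{d \in D} \frac{c_d}{\mu_{f(d),\,y(d)}}
\]

where \mu_{f(d),y(d)} is the world-average citation count for publications of the same subject area f(d) and year y(d) (the Karolinska methodology also matches on document type).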

The NL score, with special attention to Erasmus University

I will focus on the field normalized citation score only: although Erasmus University is no. 6 in the Netherlands in output, EUR scores highest of all Dutch university institutions in the list on citation impact (and 4th if the other sectors in the list are included). This means Erasmus University is the university with the highest citation impact in the Netherlands. I think this is an amazing score for one of the smaller universities in NL. Compared with the overall SIR ranking, though, EUR is at position 130 on this criterion.

Rank | Organization | Sector | Output | CxD | Int. Coll. | Norm. SJR | Norm. Cit.
---- | ------------ | ------ | ------ | --- | ---------- | --------- | ----------
42 | Universiteit Utrecht | Higher educ. | 23031 | 10,48 | 40,51 | 1,08 | 1,71
53 | Universiteit van Amsterdam | Higher educ. | 20608 | 10,51 | 42,3 | 1,07 | 1,73
138 | Universiteit Leiden | Higher educ. | 12090 | 10,55 | 48,97 | 1,08 | 1,52
197 | Rijksuniversiteit Groningen | Higher educ. | 9649 | 8,32 | 43,43 | 1,06 | 1,47
227 | Vrije Universiteit Amsterdam | Higher educ. | 8812 | 9,28 | 48,41 | 1,06 | 1,54
253 | Erasmus Universiteit Rotterdam | Higher educ. | 8172 (6) | 12,24 (3) | 38,85 (16) | 1,1 (2) | 1,91 (4)
256 | Technische Universiteit Delft | Higher educ. | 8156 | 5,27 | 41,6 | 0,93 | 1,56
319 | Wageningen Universiteit | Higher educ. | 6843 | 7,86 | 52,71 | 1,06 | 1,38
323 | Technische Universiteit Eindhoven | Higher educ. | 6823 | 6,01 | 44 | 0,98 | 1,7
340 | Radboud Universiteit Nijmegen | Higher educ. | 6437 | 8,21 | 44,79 | 1,06 | 1,44
397 | Academic Medical Center | Health | 5500 | 12,52 | 31,24 | 1,1 | 1,81
414 | Universiteit Twente | Higher educ. | 5372 | 5,64 | 41,7 | 0,97 | 1,54
478 | TNO | Government | 4672 | 7,82 | 35,81 | 1,01 | 1,42
565 | Universiteit Maastricht | Higher educ. | 4021 | 9,02 | 42,15 | 1,09 | 1,65
787 | University Medical Center St Radboud | Health | 2733 | 8,45 | 26,16 | 1,08 | 1,39
885 | Netherlands Institute for Metals Research | Government | 2372 | 9,65 | 34,78 | 1,05 | 1,91
918 | Nederlands Kanker Instituut/ALZ | Health | 2267 | 18,2 | 39,52 | 1,13 | 2,1
995 | Philips Research | Private | 2002 | 6,55 | 42,61 | 0,87 | 1,97
1090 | Dutch Polymer Institute | Government | 1735 | 10,58 | 35,39 | 1,07 | 2,04
1124 | Universiteit van Tilburg | Higher educ. | 1633 | 4,64 | 47,52 | 1,02 | 1,49
1257 | Academic Hospital Maastricht | Health | 1325 | 10,83 | 38,04 | 1,1 | 1,79
1292 | KNAW | Government | 1264 | 11,32 | 48,66 | 1,1 | 1,78

(The figures in parentheses in the Erasmus row appear to be EUR’s per-indicator ranks among the Dutch institutions listed, matching the “no. 6 in output” and “4th” on citation impact mentioned above.)
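As a quick sanity check of the claim above, one can re-sort the table by Norm. Cit.; here is a minimal sketch over a handful of the rows (values transcribed from the table, decimal commas converted to points):

```python
# A few rows transcribed from the table above: (organization, sector, norm_cit).
rows = [
    ("Erasmus Universiteit Rotterdam", "Higher educ.", 1.91),
    ("Universiteit van Amsterdam", "Higher educ.", 1.73),
    ("Universiteit Utrecht", "Higher educ.", 1.71),
    ("Nederlands Kanker Instituut/ALZ", "Health", 2.10),
    ("Dutch Polymer Institute", "Government", 2.04),
    ("Philips Research", "Private", 1.97),
]

# Sort by field normalized citation score, highest first.
ranked = sorted(rows, key=lambda row: row[2], reverse=True)
print([name for name, _, _ in ranked].index("Erasmus Universiteit Rotterdam") + 1)
# 4  -> fourth overall among these institutions, as noted above

# Among the universities ("Higher educ.") EUR comes out on top.
universities = [row for row in ranked if row[1] == "Higher educ."]
print(universities[0][0])  # Erasmus Universiteit Rotterdam
```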

Finally a good initiative: ORCID: Open Researcher Contributor Identification Initiative

ORCID: Open Researcher Contributor Identification Initiative – Home.

Name ambiguity and attribution are persistent, critical problems embedded in the scholarly research ecosystem. The ORCID Initiative represents a community effort to establish an open, independent registry that is adopted and embraced as the industry’s de facto standard. Our mission is to resolve systemic name ambiguity by assigning unique identifiers linkable to an individual’s research output, to enhance the scientific discovery process and to improve the efficiency of funding and collaboration. Accurate identification of researchers and their work is one of the pillars for the transition from science to e-Science, wherein scholarly publications can be mined to spot links and ideas hidden in the ever-growing volume of scholarly literature. A disambiguated set of authors will allow new services and benefits to be built for the research community by all stakeholders in scholarly communication: from commercial actors to non-profit organizations, from governments to universities.
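To make the ambiguity problem concrete, here is a small invented illustration (names and identifiers are made up): grouping by name string conflates different people and splits one person across variants, while grouping by an assigned identifier does neither:

```python
# Invented records: three papers by the same person under different name
# variants, plus one paper by a different "Smith, J.".
records = [
    {"author_name": "Smith, John", "researcher_id": "0000-0001"},
    {"author_name": "Smith, J.",   "researcher_id": "0000-0001"},
    {"author_name": "J. A. Smith", "researcher_id": "0000-0001"},
    {"author_name": "Smith, J.",   "researcher_id": "0000-0002"},
]

# Grouping by the name string mixes two different people...
by_name = {}
for record in records:
    by_name.setdefault(record["author_name"], []).append(record)
print(len(by_name["Smith, J."]))   # 2 papers, but by two different people

# ...whereas grouping by a unique researcher identifier keeps them apart.
by_id = {}
for record in records:
    by_id.setdefault(record["researcher_id"], []).append(record)
print(len(by_id["0000-0001"]))     # 3 papers, all by the same person
```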

And related news from Knowledgespeak:

http://www.knowledgespeak.com/newsArchieveviewdtl.asp?pickUpBatch=1321&pickUpID=9303

Research community members seek to resolve author name ambiguity issue (07 Dec 2009)

Various members of the research community have announced their intent to collaborate to resolve the existing author name ambiguity problem in scholarly communication. Together, the group hopes to develop an open, independent identification system for scholarly authors. This follows the first Name Identifier Summit held last month in Cambridge, MA, by Thomson Reuters and Nature Publishing Group, where a cross-section of the research community explored approaches to address name ambiguity. A follow-on meeting of this group took place in London last week to discuss the next steps.

Accurate identification of researchers and their work is seen as key for the transition from science to e-science, wherein scholarly publications can be mined to spot links and ideas hidden in the growing volume of scholarly literature. A disambiguated set of authors will allow new services and benefits to be built for the research community by all stakeholders in scholarly communication: from commercial actors to non-profit organisations, from governments to universities.

The organisations that have agreed to work together to overcome the contributor identification issue include: American Institute of Physics, American Psychological Association, Association for Computing Machinery, British Library, CrossRef, Elsevier, European Molecular Biology Organisation, Hindawi, INSPIRE (project of CERN, DESY, Fermilab, SLAC), Massachusetts Institute of Technology Libraries, Nature Publishing Group, Public Library of Science, ProQuest, SAGE Publications Inc., Springer, Thomson Reuters, University College London, University of Manchester (JISC Names Project), University of Vienna, Wellcome Trust and Wiley-Blackwell.

Capturing Research Impacts: A review of international practice

To help inform the development of the Research Excellence Framework (REF), HEFCE commissioned RAND Europe to carry out a review of international approaches to evaluating the impact of research. This report presents the findings of the review, based on four case study examples.  The full report is here: http://www.hefce.ac.uk/Pubs/RDreports/2009//rd23_09/rd23_09.pdf

The review identifies relevant challenges and lessons from international practice and suggests that the work of the Australian RQF Working Group on Impact Assessment might provide a basis for developing an approach to impact in the REF. The report makes a number of other recommendations concerning attribution, burden and the role of research users, which are outlined in the executive summary.

The purpose of this report is to inform the Higher Education Funding Council for England’s (HEFCE’s) formulation of an approach to assess research impact as part of the proposed Research Excellence Framework (REF). HEFCE has identified several criteria that would be significant in developing an impact assessment framework. The framework should:

  1. be credible and acceptable to the academic as well as user communities
  2. encompass the full range of economic, social, public policy, welfare, cultural and quality-of-life benefits
  3. within a single broad approach, be adaptable to apply to all disciplines
  4. be practicable and not generate an excessive workload for the sector
  5. avoid undesirable perceptions and incentives
  6. complement other funding streams including the research councils’ approach to increasing the impact of research.

To inform its thinking, HEFCE commissioned RAND Europe to undertake an international review of how other research agencies measure impact. The objectives of the review were to survey international practice in assessing research impact, and to identify relevant challenges, lessons and observations from international practice that would help HEFCE develop a framework for assessing research impact.

Following a quick scan of international examples of impact frameworks, the researchers selected four frameworks for further analysis:

  1. the Australian Research Quality and Accessibility Framework (RQF),
  2. the UK RAND/ARC Impact Scoring System (RAISS),
  3. the US Program Assessment Rating Tool (PART) and
  4. the Dutch Evaluating Research in Context (ERiC).

The variety of researcher information services

I came across a nice list of researcher information services at the REPINF wiki this week:

Research Crossroads

http://www.researchcrossroads.org

Self-registering searchable service that collects data on researchers in science and medicine: their research, grant funding, publications, affiliation, biography, etc. Also provides searchable databases on funders, grants awarded and clinical trials. Also acts as a networking tool.

Academia.edu

http://www.academia.edu

Authors self-register themselves, their departments and universities/institutions. Authors add details of their papers. Currently has around 24K people and 87K papers.

ResearcherID

http://www.researcherid.com/

Thomson Reuters’ author profiling service. Searchable researcher/research database with the benefit of Thomson Reuters’ unique author identifier system (see Author Identifiers topic).

ResearchGate

https://www.researchgate.net/

Scientists’ network with over 180,000 members to date (November 2009). Scientists add details about themselves and their work, upload full texts of their papers, and discuss relevant topics. It has just introduced a ‘micro-article’ idea, which is to encourage scientists to write and upload brief versions of their latest work for rapid communication and discussion. There is also an embryonic job advertising facility. Free to join and use. It is not clear how this initiative is supported financially, though it claims to be built ‘by scientists for scientists’.

bibapp

http://bibapp.org/

For use as an institutional ‘campus gateway’. Database of researchers, their publications and their institutional affiliations (group/department/school, etc), enabling a search for ‘campus experts’. Accepts deposits in popular formats (e.g. Refworks) and provides automatic rights-checking using SHERPA RoMEO. Makes SWORD-compliant deposits to the institutional repository or other locations.
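For a sense of what a SWORD-compliant deposit involves, here is a minimal sketch of a SWORD 1.x-style HTTP deposit using Python’s requests library; the endpoint URL, credentials, file name and package format are placeholders, not bibapp specifics:

```python
import requests  # third-party HTTP client: pip install requests

# Placeholder collection URI; a real repository advertises its deposit
# collections in a SWORD service document.
DEPOSIT_URI = "https://repository.example.edu/sword/deposit/articles"

with open("paper-package.zip", "rb") as package:
    response = requests.post(
        DEPOSIT_URI,
        data=package,
        headers={
            "Content-Type": "application/zip",
            "Content-Disposition": "filename=paper-package.zip",
            # SWORD 1.x names the package format via the X-Packaging header.
            "X-Packaging": "http://purl.org/net/sword-types/METSDSpaceSIP",
            "X-No-Op": "true",  # dry run: server should not actually store it
        },
        auth=("depositor", "secret"),  # placeholder credentials
    )

# On success the server responds with an Atom entry describing the deposit.
print(response.status_code)
print(response.text[:200])
```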

VIVO

http://vivo.cornell.edu/

Developed at Cornell University Library. A campus research discovery tool. Researchers can manage their own page/profile, which usually links to their personal web page and departmental or other affiliation web pages.

Scholar Universe (ProQuest)

http://www.scholaruniverse.com/productinfo.jsp

2 million scholar profiles generated from ProQuest’s databases.

Selected Works (Berkeley Electronic Press)

http://works.bepress.com/

Authors create their own profiles for a campus profiling service.
