Science-Metrix develops new journal classification system (and interactive web tool) using both the Web of Science (Thomson Reuters) and Scopus (Elsevier).

A new set of tools recently released (first public release: 2010-12-01, v1.00) by Science-Metrix Inc. seeks to improve the way we talk about and understand science. The US/Canada-based research evaluation firm has developed a new, multilingual (18 languages!) classification of scientific journals, accompanied by an interactive web tool. The journal classification, which covers 15,000 peer-reviewed scientific journals, has been translated by more than 22 international experts who volunteered their time and expertise, making the tools available to a worldwide audience. The complete journal list is available for download as an Excel sheet.

The interactive ‘Scientific Journals Ontology Explorer’ allows users to visualise the links between 175 scientific specialties (subject categories) in 18 languages, from Arabic to Swedish.

The visualization contains three different views: a circular “Subfield Citation Wheel” (representing both citations and references), a “Field Citation Wheel” (showing the links between distinct scientific disciplines) and a network “Map of Science” (revealing similarities between disciplines by relative distance). The goal of this visualization is to show how broad the universe of science is and how interlinked scientific research actually is.
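
The idea behind such a map can be illustrated with a force-directed (spring) layout, which pulls strongly citation-linked specialties closer together. The sketch below is a minimal, hypothetical example, not Science-Metrix's actual method: the field names and link weights are invented, and networkx's spring_layout is just one of many layout algorithms such a tool could use.

```python
# Minimal sketch: place fields that cite each other heavily closer together.
# Field names and link weights are invented for illustration.
import networkx as nx

G = nx.Graph()
links = [
    ("Business & Management", "Economics", 0.9),
    ("Business & Management", "Marketing", 0.8),
    ("Economics", "Finance", 0.7),
    ("Marketing", "Psychology", 0.4),
    ("Finance", "Mathematics", 0.3),
]
for a, b, w in links:
    G.add_edge(a, b, weight=w)

# The spring layout pulls heavily weighted (strongly linked) nodes together,
# so relative distance in the plot reflects citation similarity.
pos = nx.spring_layout(G, weight="weight", seed=42)
for field, (x, y) in pos.items():
    print(f"{field:25s} ({x:+.2f}, {y:+.2f})")
```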

How is the field of Business & Management covered in the journals list?

The field of Economics & Business (part of the domain Economic & Social Sciences) contains 822 journals (5.4% of the total Science-Metrix list) in 12 subfields, of which Business & Management accounts for 222 journals (27%):

Every journal is classified into only one category.

| Subfields of ‘Economics & Business’ | N | % |
| --- | ---: | ---: |
| Accounting | 32 | 3.9% |
| Agricultural Economics & Policy | 27 | 3.3% |
| Business & Management | 222 | 27.0% |
| Development Studies | 42 | 5.1% |
| Econometrics | 13 | 1.6% |
| Economic Theory | 13 | 1.6% |
| Economics | 244 | 29.7% |
| Finance | 63 | 7.7% |
| Industrial Relations | 21 | 2.6% |
| Logistics & Transportation | 49 | 6.0% |
| Marketing | 61 | 7.4% |
| Sport, Leisure & Tourism | 35 | 4.3% |
| Total journals | 822 | 100% |
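
The percentages are simple shares of the 822-journal field total; as a quick check, they can be recomputed from the journal counts in the table above:

```python
# Recompute the subfield shares from the journal counts in the table above.
counts = {
    "Accounting": 32, "Agricultural Economics & Policy": 27,
    "Business & Management": 222, "Development Studies": 42,
    "Econometrics": 13, "Economic Theory": 13, "Economics": 244,
    "Finance": 63, "Industrial Relations": 21,
    "Logistics & Transportation": 49, "Marketing": 61,
    "Sport, Leisure & Tourism": 35,
}
total = sum(counts.values())  # 822 journals in Economics & Business
for subfield, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{subfield:32s} {n:4d}  {100 * n / total:5.1f}%")
```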

I will compare these with our ERIM Journals List, the ISI quartiles and the SJR (Scopus) quartile scores to see how the list is structured in terms of quality layers.

(I will add these details later this week.)
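
In the meantime, here is a minimal sketch of how such a quartile comparison could be set up. The helper function and the numbers are hypothetical; the point is simply that a journal's quartile depends on the size and composition of the category list it is ranked in, which differs between the two systems.

```python
# Hypothetical sketch: assign quartiles within a ranked journal list,
# then compare a journal's quartile across two ranking systems.
def quartile(rank: int, list_size: int) -> str:
    """Q1 = top 25% of the ranked category list, Q4 = bottom 25%."""
    return f"Q{min(4, 1 + (4 * (rank - 1)) // list_size)}"

# Invented example: the same journal ranked in two systems whose
# subject categories (and thus list sizes and ranks) differ.
isi_rank, isi_size = 5, 80    # e.g. an ISI category of 80 journals
sjr_rank, sjr_size = 70, 160  # e.g. a larger Scopus/SJR category

print("ISI quartile:", quartile(isi_rank, isi_size))  # Q1
print("SJR quartile:", quartile(sjr_rank, sjr_size))  # Q2: same journal,
                                                      # different position
```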

In the ontology browser, you can create a map of science and learn how the field of business and management is connected to other subject categories. I have selected the closest fields in the screenshot below.

[Screenshot: Business & Management in the Science-Metrix Map of Science]

About Science-Metrix:

Science-Metrix is the world’s largest independent firm dedicated to scientometrics, technometrics, and science and technology (S&T) evaluation. The firm’s core business involves supporting evidence-based decision-making with strong empirical data and sound theoretical approaches. This contract research organization combines qualitative and quantitative techniques to deliver high quality program evaluations, performance and outcome assessments, and evaluation frameworks. Every Science-Metrix report is produced by a team of dedicated high-calibre experts and relies on the world-class data found in the Scopus, Web of Science and Questel databases.

New OCLC Report on Research Assessment Regimes in Five Countries and the Role of Libraries in the Research Assessment Process

Comparing national research assessment systems is a hot topic at the moment. Here is another (pilot) study on this topic, this time commissioned by OCLC Research. The report was written by Key Perspectives Ltd, a UK library and scholarly publishing consultancy.

New Report, “A Comparative Review of Research Assessment Regimes in Five Countries and the Role of Libraries in the Research Assessment Process”

DUBLIN, Ohio, USA, 17 December 2009—This report examines the role of research libraries in research assessment regimes in five different countries and helps establish a new set of responsibilities that is emerging for research libraries.

This report studies the role of research libraries in the higher education research assessment regimes in five countries: the Republic of Ireland, the UK, the Netherlands, Denmark and Australia.  A companion report, in summary form with recommendations, will be published by early January 2010.

Survey of Current Practice in Research Assessment Activity
http://www.oclc.org/research/activities/practice/default.htm

Capturing Research Impacts: A review of international practice

To help inform the development of the Research Excellence Framework (REF), HEFCE commissioned RAND Europe to carry out a review of international approaches to evaluating the impact of research. This report presents the findings of the review, based on four case study examples.  The full report is here: http://www.hefce.ac.uk/Pubs/RDreports/2009//rd23_09/rd23_09.pdf

The review identifies relevant challenges and lessons from international practice and suggests that the work of the Australian RQF Working Group on Impact Assessment might provide a basis for developing an approach to impact in the REF. The report makes a number of other recommendations concerning attribution, burden and the role of research users, which are outlined in the executive summary.

The purpose of this report is to inform the Higher Education Funding Council for England’s (HEFCE’s) formulation of an approach to assess research impact as part of the proposed Research Excellence Framework (REF). HEFCE has identified several criteria that would be significant in developing an impact assessment framework. The framework should:

  1. be credible and acceptable to the academic as well as user communities
  2. encompass the full range of economic, social, public policy, welfare, cultural and quality-of-life benefits
  3. within a single broad approach, be adaptable to apply to all disciplines
  4. be practicable and not generate an excessive workload for the sector
  5. avoid undesirable perceptions and incentives
  6. complement other funding streams including the research councils’ approach to increasing the impact of research.

To inform their thinking, HEFCE commissioned RAND Europe to undertake an international review of how other research agencies measure impact. The objectives of the review were: to review international practice in assessing research impact, and to identify relevant challenges, lessons and observations from international practice that will help HEFCE develop a framework for assessing research impact.

Following a quick scan of international examples of impact frameworks, the researchers selected four frameworks for further analysis:

  1. the Australian Research Quality and Accessibility Framework (RQF),
  2. the UK RAND/ARC Impact Scoring System (RAISS),
  3. the US Program Assessment Rating Tool (PART) and
  4. the Dutch Evaluating Research in Context (ERiC).

8 perspectives on measuring research

In a recent essay, Douglas Comer lists eight different ways to measure research. What is measured depends on the perspective and role of the evaluator. Here are the eight approaches, each with the point of view it represents (a small illustrative sketch follows the list):

Journal Paper Approach (preferred by journal publishers)
Measure: N, the total number of papers published.

Rate Of Publication Approach (preferred by young researchers)
Measure: N/T, the ratio of total papers published to the time in which they were published.

Weighted Publication Approach (preferred by accreditation agencies)
Measure: W, the sum of the weights assigned to published papers.

Millions Of Monkeys Approach (preferred by government granting agencies)
Measure: G, the total amount of taxpayer money distributed for research.

Direct Funding Approach (preferred by department heads)
Measure: D, the total dollars of grant funds acquired by a researcher.

Indirect Funding Approach (preferred by university administrators)
Measure: O, the total overhead dollars generated.

Bottom Line Approach (preferred by industrial research labs)
Measure: P, the profit generated by patents or products that result from the research.

Assessment Of Impact Approach (preferred by the handful of researchers who actually achieve something)
Measure: I/R, Ratio of the impact of the work to the amount of resources used to generate it.
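
All of these measures are simple arithmetic, which makes the contrast between them easy to see in code. The sketch below is purely illustrative: the record fields and numbers are invented, and only a few of Comer's quantities are computed.

```python
# Illustrative sketch of a few of Comer's measures; all numbers invented.
from dataclasses import dataclass

@dataclass
class ResearchRecord:
    papers: int           # total papers published (N)
    years: float          # publication period in years (T)
    grant_dollars: float  # direct grant funding acquired (D)
    overhead_rate: float  # university overhead fraction
    impact: float         # agreed-upon impact score of the work (I)
    resources: float      # resources used to generate the work (R)

r = ResearchRecord(papers=40, years=10, grant_dollars=2_000_000,
                   overhead_rate=0.5, impact=120.0, resources=60.0)

print("Journal Paper Approach, N      :", r.papers)
print("Rate of Publication, N/T       :", r.papers / r.years)
print("Direct Funding, D              :", r.grant_dollars)
print("Indirect Funding, O = D * rate :", r.grant_dollars * r.overhead_rate)
print("Impact Assessment, I/R         :", r.impact / r.resources)
```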

All perspectives have pros and cons, and I advise reading the essay for the details. The important thing for research managers to keep in mind is the position of the person who measures research. There is nothing wrong with any perspective, as long as it is clear that each perspective looks only at a partial aspect of research.

Another new Citation Impact tool on Scopus data: Scimago

Declan Butler, Free journal-ranking tool enters citation market, Nature News, January 2, 2008. Excerpt:

A new [OA] Internet database lets users generate on-the-fly citation statistics of published research papers for free. The tool also calculates papers’ impact factors using a new algorithm similar to PageRank, the algorithm Google uses to rank web pages. The open-access database is collaborating with Elsevier, the giant Amsterdam-based science publisher, and its underlying data come from Scopus, a subscription abstracts database created by Elsevier in 2004.

The SCImago Journal & Country Rank database was launched in December by SCImago.

Thomson is also under fire from researchers who want greater transparency over how citation metrics are calculated and the data sets used. In a hard-hitting editorial published in Journal of Cell Biology in December, Mike Rossner, head of Rockefeller University Press, and colleagues say their analyses of databases supplied by Thomson yielded different values for metrics from those published by the company (M. Rossner et al., J. Cell Biol. 179, 1091–1092; 2007). Thomson, they claim, was unable to supply data to support its published impact factors. “Just as scientists would not accept the findings in a scientific paper without seeing the primary data,” states the editorial, “so should they not rely on Thomson Scientific’s impact factor, which is based on hidden data.”

It also includes a new metric: the SCImago Journal Rank (SJR).

The familiar impact factor created by industry leader Thomson Scientific, based in Philadelphia, Pennsylvania, is calculated as the average number of citations received by the papers that each journal contains. The SJR also analyses the citation links between journals in a series of iterative cycles, in the same way as the Google PageRank algorithm. This means not all citations are considered equal; those coming from journals with higher SJRs are given more weight. The main difference between SJR and Google’s PageRank is that SJR uses a citation window of three years.
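
To make those “iterative cycles” concrete, here is a toy PageRank-style computation over a small journal citation matrix. This is not SCImago's actual SJR formula, which includes further normalisations and restricts citations to a three-year window; the sketch only illustrates the core idea that a citation is worth more when it comes from a highly ranked journal.

```python
# Toy PageRank-style journal ranking; NOT the actual SJR formula.
# cites[i][j] = citations from journal i to journal j.
cites = [
    [0, 10, 2],  # journal A's outgoing citations
    [4,  0, 6],  # journal B's outgoing citations
    [1,  3, 0],  # journal C's outgoing citations
]
n = len(cites)
damping = 0.85
rank = [1.0 / n] * n

for _ in range(50):  # iterate until the ranks (approximately) stabilise
    new_rank = []
    for j in range(n):
        # Each citing journal passes on its own rank in proportion to
        # where its outgoing citations go, so citations from highly
        # ranked journals count for more.
        inflow = sum(rank[i] * cites[i][j] / sum(cites[i])
                     for i in range(n) if cites[i][j] > 0)
        new_rank.append((1 - damping) / n + damping * inflow)
    rank = new_rank

for name, r in zip("ABC", rank):
    print(f"Journal {name}: {r:.3f}")
```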

I did some testing on the marketing subfield of Business & Management (see screenshot). I ranked the list by total cites over the last three years.

[Screenshot: SCImago ranking for the Marketing subfield]

SJR versus JCR:

Let’s take the highest-ranked journal from SCImago, the Journal of Marketing (SJR 0.107), and compare it with the JCR citation trend. JOM has the highest impact factor in the ISI subject category Business for 2006. So, in general, the best journals come out on top in both systems. But it remains a comparison of apples and oranges, because the subject categories differ between Scopus and ISI, so the relative position of a journal differs between the two measurement systems.

[Screenshot: JCR citation trend for the Journal of Marketing]

Review of International practices and measures of quality and impact of research

As part of the Australian Research Evaluation and Policy Project (REPP), a recent review report lists 40 possible indicators of quality. The report “A Review of Current Australian and International Practice in Measuring the Quality and Impact of Publicly Funded Research in the Humanities, Arts and Social Sciences” (C. Donovan, REPP Discussion Paper 05/3, November 2005) provides the following list (and much more very useful information on the design of the assessment process); a short illustration of a few of these indicators follows the list:

Indicator 1. Number of publications
Indicator 2. Number of funded papers
Indicator 3. Number of ISI publications
Indicator 4. Number of ISI publications, weighted by journal impact
Indicator 5. Number of publications in top journals
Indicator 6. Distribution of publications over journal impact classes
Indicator 7. Journal Impact Factor (IF)
Indicator 8. Expected Citation Impact
Indicator 9. World Citation Impact
Indicator 10. Journal to Field Impact Score (JFIS)
Indicator 11. Share of self-citations
Indicator 12. Number of Citations
Indicator 13. Number of external citations
Indicator 14. Citations per publication
Indicator 15. Uncitedness
Indicator 16. Distribution of publications over field-normalised impact classes
Indicator 17. Top 5% most frequently cited publications
Indicator 18. Comparison of actual citation rates to journal averages
Indicator 19. Comparison of actual citation rates to the field average
Indicator 20. Weighted impact
Indicator 21. Ratio journal impact to subfield impact
Indicator 22. Position in the journal spectrum
Indicator 23. Field distribution of publications
Indicator 24. Field distribution of citations
Indicator 25. Number of citers
Indicator 26. Level of collaboration
Indicator 27. Country analysis of collaborators and citers
Indicator 28. Level of Research
Indicator 29. Activity Index
Indicator 30. Number of researchers
Indicator 31. Research time
Indicator 32. External funding
Indicator 33. Research students data
Indicator 34. Keynote addresses
Indicator 35. International visits
Indicator 36. Honours and awards
Indicator 37. Election to learned societies
Indicator 38. Editorial board membership
Indicator 39. Membership of review committees
Indicator 40. Membership of government bodies
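
Several of these indicators are straightforward to compute once publication and citation data are in hand. As a small illustration, the sketch below computes Indicators 11, 12, 14 and 15 for a hypothetical set of five publications; all numbers are invented.

```python
# Hypothetical sketch of four of the 40 indicators; data invented.
pubs = [
    # (citations received, of which self-citations)
    (25, 3),
    (0, 0),
    (12, 1),
    (7, 2),
    (0, 0),
]

total_cites = sum(c for c, _ in pubs)                    # Indicator 12
self_share = sum(s for _, s in pubs) / total_cites       # Indicator 11
cites_per_pub = total_cites / len(pubs)                  # Indicator 14
uncited = sum(1 for c, _ in pubs if c == 0) / len(pubs)  # Indicator 15

print(f"Number of citations      : {total_cites}")        # 44
print(f"Share of self-citations  : {self_share:.1%}")     # 13.6%
print(f"Citations per publication: {cites_per_pub:.1f}")  # 8.8
print(f"Uncitedness              : {uncited:.1%}")        # 40.0%
```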
