Scholarly Impact Revisited: ranking the most highly cited management scholars of the past three decades by the number of pages indexed by Google, to assess scholarly impact on stakeholders outside the Academy

See on Scoop.it: Dual impact of research; towards the impactelligent university

Scholarly impact is one of the strongest currencies in the Academy and has traditionally been equated with the number of citations, be it for individuals, articles, departments, universities, journals, or entire fields. Adopting an alternative definition and measure, we use the number of pages indexed by Google.com to assess scholarly impact on stakeholders outside of the Academy. Based on a sample of 384 of the 550 most highly cited management scholars of the past three decades, results show that scholarly impact is a multidimensional construct: the impact of scholarly research on internal stakeholders (i.e., other members of the Academy) cannot be equated with impact on external stakeholders (i.e., those outside of the Academy). We illustrate these results with tables showing important changes in the rank ordering of individuals depending on whether impact is operationalized for internal stakeholders (i.e., number of citations) or external stakeholders (i.e., number of non-.edu web pages). We also provide tables listing the most influential scholars inside the Academy who also have an important impact outside of it. We discuss implications for empirical research, theory development, and practice regarding the meaning and measurement of scholarly impact.
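
The paper's core claim, that citation-based and web-based impact produce different rank orderings, is easy to illustrate in a few lines of code. The sketch below is not from the paper; the counts are invented and the tie handling is deliberately naive.

    # A minimal sketch (not from the paper): compare how scholars rank by
    # citations versus by non-.edu web pages, using invented counts.

    def ranks(values):
        """Rank 1 = highest value; ties broken by position (naive, fine here)."""
        order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    def spearman(x, y):
        """Spearman rank correlation: 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
        n = len(x)
        d2 = sum((a - b) ** 2 for a, b in zip(ranks(x), ranks(y)))
        return 1 - 6 * d2 / (n * (n ** 2 - 1))

    citations = [4200, 3100, 2800, 900, 650]   # hypothetical citation counts
    web_pages = [1200, 9800, 300, 4100, 2500]  # hypothetical non-.edu page counts

    print(spearman(citations, web_pages))  # -0.1: far from 1.0, the rankings diverge

A correlation near 1.0 would mean the two measures order scholars the same way; a value near zero, as in this invented example, is what a "multidimensional construct" looks like in the data.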

 

Source:

Issue: Volume 26, Number 2 May 2012
Title: Scholarly Impact Revisited
Authors: Herman Aguinis, Isabel Suarez-Gonzalez, Gustavo Lannelongue, Harry Joo

The Academy of Management Perspectives is a journal of the Academy of Management.

http://journals.aomonline.org/amp/

 

See on journals.aomonline.org


Making research count: applying H-indexes within and between disciplines to analyse publication cultures

See on Scoop.it: Dual impact of research; towards the impactelligent university

Conclusions:

This paper demonstrates that it is possible both to examine H-index scores within disciplines and to create discipline-normalized H-indexes for interdisciplinary comparisons. The same approach can be applied to other bibliometric measures and be used to improve comparative metrics.
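
To make the normalization idea concrete: the paper's exact method is not reproduced here, but one simple variant divides each scholar's H-index by the average H-index of their discipline. All names and citation counts below are invented.

    from statistics import mean

    def h_index(citations):
        """H-index: the largest h such that h papers each have >= h citations."""
        h = 0
        for i, c in enumerate(sorted(citations, reverse=True), start=1):
            if c >= i:
                h = i
        return h

    # Invented citation counts per paper, grouped by discipline.
    scholars = {
        "physics":    {"A": [60, 40, 25, 10, 5], "B": [30, 20, 15, 8]},
        "philosophy": {"C": [12, 9, 6, 3],       "D": [7, 5, 2]},
    }

    raw = {d: {s: h_index(c) for s, c in group.items()}
           for d, group in scholars.items()}

    # One simple discipline normalization: divide by the discipline's mean
    # H-index, so 1.0 means "average for that field's publication culture".
    normalized = {d: {s: h / mean(hs.values()) for s, h in hs.items()}
                  for d, hs in raw.items()}

    print(raw)         # physics H-indexes dwarf philosophy's
    print(normalized)  # within-field standing is now comparable across fields

In this toy data the physicists' raw H-indexes (5 and 4) dominate the philosophers' (3 and 2), yet after normalization the top philosopher (1.2) outscores the top physicist (1.11): exactly the kind of reversal discipline normalization is meant to surface.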

 

This is especially important for several common types of bibliometric comparisons. The first is international rankings, such as the Shanghai Jiao Tong Academic Ranking of World Universities (ARWU, or Shanghai Rankings, as it is colloquially known). In the 2011 edition, 20% of this ranking was determined by research output, measured by the number of papers cited in Nature and Science and by papers indexed in the Science Citation Index-Expanded and the Social Science Citation Index (with SSCI papers given double weight). Another 20% was determined by “highly cited researchers in 21 broad subject categories.” The absence of discipline normalization means that this ranking does not control for publication cultures and the biases they can generate. As a consequence, institutions with more faculty members in highly productive disciplines have an advantage. This issue is not unique to ARWU: while some ranking systems break out their comparisons by field of study, none to date have implemented valid discipline normalization.
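
The bias described here is visible in a toy computation. In the sketch below (all numbers assumed for illustration), two institutions perform identically relative to their own disciplines, yet the one weighted toward a high-output field wins on raw counts.

    # Assumed field-average papers per faculty member (illustrative only).
    FIELD_AVG_PAPERS = {"biomedicine": 12, "mathematics": 3}

    def raw_output(faculty_mix):
        """Total paper count if every faculty member publishes exactly at
        their field's average, i.e. nobody over- or under-performs."""
        return sum(n * FIELD_AVG_PAPERS[field] for field, n in faculty_mix.items())

    inst_a = {"biomedicine": 80, "mathematics": 20}  # tilted toward biomedicine
    inst_b = {"biomedicine": 20, "mathematics": 80}  # tilted toward mathematics

    print(raw_output(inst_a))  # 1020
    print(raw_output(inst_b))  # 480: same per-discipline performance, half the "output"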

 

Secondly, comparisons within institutions can be strengthened by discipline normalization. When institutions assess the relative performance of academic departments or units, often as part of unit review processes, using accurate discipline normalization to control for publication culture provides a more valid comparison.

 

Lastly, bibliometrics are increasingly applied to hiring decisions within universities. When candidates’ fields of study differ (as would be the case for a dean of science, for example), discipline normalization helps to ensure that the hiring committee understands differences in publication records (and corresponding H-indexes) in the context of differences in publication cultures.

 

It’s time for applications of bibliometrics to become more sensitive to publication culture. Just as importantly, users of bibliometrics need to be aware of the dramatic biases that publication culture can generate. We hope that this paper inspires both scholars and professionals to use accurate and normalized measures when applying bibliometrics to their research, assessment, and decision-making activities.

 

 

Source:

Jarvey, P., Usher, A., and McElroy, L. 2012. Making Research Count: Analyzing Canadian Academic Publishing Cultures. Toronto: Higher Education Strategy Associates.

Permalink: http://higheredstrategy.com/announcements/new-publication-making-research-count-analyzing-canadian-academic-publishing-cultures/

Fulltext: http://higheredstrategy.com/wp-content/uploads/2012/06/2012-Bibliometrics-and-Publication-Culture-HESA.pdf

See on higheredstrategy.com


Impact of HBR Articles in the Scientific and Twitter Communities

See on Scoop.it: Dual impact of research; towards the impactelligent university

The Harvard Business Review issued an open invitation to submit data visualizations of its archival history. The goal was to generate analyses and visualizations from the metadata and abstracts of every article the journal has published over the last 90 years. Winning entries will be featured in the Vision Statement section of the upcoming 90th-anniversary issue.

 

The competition generated a series of great visualizations. One of the entries focused on two types of impact that Harvard Business Review (HBR) articles can have:

 

Citation Impact: Articles with deep insight get cited by other peer-reviewed articles. A large number of citations usually implies (absent citation spam) a high-impact article.

 

Twitter Impact: Social media has become the new content-discovery channel. For the second type of impact, we study how HBR URLs are shared on Twitter.
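
A minimal sketch of this two-dimensional view of impact, with invented URLs and counts (the competition entry's actual data pipeline is not reproduced here):

    # Invented records: citation counts from a scholarly index and counts
    # of tweets linking to each HBR article URL.
    articles = {
        "https://hbr.org/example-article-1": {"citations": 120, "tweets": 15},
        "https://hbr.org/example-article-2": {"citations": 8,   "tweets": 940},
        "https://hbr.org/example-article-3": {"citations": 60,  "tweets": 310},
    }

    def top_by(metric, records, k=3):
        """Rank article URLs by a single impact dimension."""
        return sorted(records, key=lambda url: records[url][metric], reverse=True)[:k]

    print(top_by("citations", articles))  # scholarly-impact ordering
    print(top_by("tweets", articles))     # Twitter-impact ordering: different

The point of the visualization is precisely that these two orderings disagree: an article can be a citation classic yet invisible on Twitter, or vice versa.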

See on kaggle2.blob.core.windows.net

Scimago releases 4th Institutions ranking with impact, excellence and leadership (new!) indexes

See on Scoop.it: Dual impact of research; towards the impactelligent university

The 2012 SIR report presents a set of bibliometric indicators that unveil some of the main dimensions of the research performance of research-devoted institutions worldwide. As in former editions, the SIR World Report 2012 aims to become an evaluation framework of research performance for worldwide research organizations.

The report shows seven indicators that help users evaluate the scientific impact, thematic specialization, output size, and international collaboration networks of the institutions. The period analyzed in the current edition covers 2006-2010. The tables include institutions that published at least 100 scientific documents of any type (articles, reviews, short reviews, letters, conference papers, etc.) during 2010, as collected by Elsevier's Scopus, the world's leading scientific database. The report encompasses Higher Education Institutions (HEIs) as well as other research-focused organizations of different sizes, with different missions, and from countries on five continents. Institutions are grouped into five institutional sectors: Higher Education, Health System, Government Agencies, Corporations, and Others. Following the goal of embracing every institution around the world with meaningful scientific output, the ranking now includes 3,290 institutions that together are responsible for more than 80% of worldwide scientific output during the period 2006-10, as indexed in Elsevier's Scopus database.

 

Indicators in the Scimago Institutions Ranking 2012 (a toy computation sketch follows the list):

Output

Total number of documents published in scholarly journals indexed in Scopus.

International Collaboration

The ratio of an institution’s output produced in collaboration with foreign institutions. The values are computed by analyzing an institution’s output whose affiliations include more than one country address.

 

Normalized Impact

The values show the relationship between an institution’s average scientific impact and the world average, which is set to a score of 1; i.e., an NI score of 0.8 means the institution is cited 20% below the world average, and 1.3 means it is cited 30% above the world average.

High Quality Publications

The ratio of publications that an institution publishes in the most influential scholarly journals of the world: those ranked in the first quartile (25%) of their categories as ordered by the SCImago Journal Rank (SJR) indicator.

Specialization Index

The Specialization Index indicates the extent of thematic concentration or dispersion of an institution’s scientific output. Values range from 0 to 1, indicating generalist vs. specialized institutions respectively. This indicator is computed using the Gini index from economics.

 

Excellence Rate

Exc indicates the share (in %) of an institution’s scientific output that is included in the set of the 10% most cited papers in their respective scientific fields. It is a measure of the high-quality output of research institutions.

 

Scientific Leadership

Leadership indicates an institution’s “output as main contributor”, that is, the number of papers in which the corresponding author belongs to the institution.
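
To make these definitions concrete, here is a toy computation of several of the indicators from a handful of invented publication records. SCImago's real field normalization is more involved; this sketch simply averages each paper's citations relative to an assumed field average.

    # Invented publication records for one institution. Every value is an
    # assumption for illustration, not real SIR data.
    papers = [
        {"countries": {"NL", "US"}, "cites": 14, "field_avg": 7.0,
         "q1_journal": True,  "corresponding_here": True,  "top10_cited": True},
        {"countries": {"NL"},       "cites": 3,  "field_avg": 7.0,
         "q1_journal": False, "corresponding_here": True,  "top10_cited": False},
        {"countries": {"NL", "DE"}, "cites": 9,  "field_avg": 6.0,
         "q1_journal": True,  "corresponding_here": False, "top10_cited": False},
    ]

    output = len(papers)
    international_collaboration = sum(len(p["countries"]) > 1 for p in papers) / output
    normalized_impact = sum(p["cites"] / p["field_avg"] for p in papers) / output
    high_quality_publications = sum(p["q1_journal"] for p in papers) / output
    excellence_rate = sum(p["top10_cited"] for p in papers) / output
    scientific_leadership = sum(p["corresponding_here"] for p in papers)

    print(f"Output: {output}")
    print(f"International Collaboration: {international_collaboration:.0%}")
    print(f"Normalized Impact: {normalized_impact:.2f}")  # 1.0 = world average
    print(f"High Quality Publications: {high_quality_publications:.0%}")
    print(f"Excellence Rate: {excellence_rate:.0%}")
    print(f"Scientific Leadership: {scientific_leadership} papers as main contributor")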

 

Fulltext of the Scimago Institutions ranking:

http://www.scimagoir.com/pdf/sir_2012_world_report.pdf

See on www.scimagoir.com

Predatory Open-Access Publishers: corrupting Open Access publishing and creating impact-terrorism.

See on Scoop.it: Dual impact of research; towards the impactelligent university

Beall’s List of Predatory Open-Access Publishers

This is a list of questionable, scholarly open-access publishers. I recommend that scholars not do any business with these publishers, including submitting articles, serving as editors or on editorial boards, or advertising with them. Also, articles published in these publishers’ journals should be given extra scrutiny in the process of evaluation for tenure and promotion.

 

Source:

Scholarly Open Access Weblog

Jeffrey Beall

librarian at Auraria Library,

University of Colorado Denver, in Denver, Colorado.

See on scholarlyoa.com

Research Networking Tools and Research Profiling Systems for impacting research expertise

See on Scoop.it: Dual impact of research; towards the impactelligent university

Research Networking (RN) is about using web-based tools to discover and use research and scholarly information about people and resources. Research Networking tools (RN tools) serve as knowledge management systems for the research enterprise. RN tools connect institution-level/enterprise systems, national research networks, publicly available research data (e.g., grants and publications), and restricted/proprietary data by harvesting information from disparate sources into compiled expertise profiles for faculty, investigators, scholars, clinicians, community partners, and facilities. RN tools facilitate the development of new collaborations and team science to address new or existing research challenges through the rapid discovery and recommendation of researchers, expertise, and resources.
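
The harvesting idea behind RN tools is simple to sketch: pull records from several sources and fold them into one expertise profile per researcher. The source names and record shapes below are assumptions for illustration, not any specific RN tool's schema.

    from collections import defaultdict

    def build_profiles(sources):
        """Merge records from disparate sources into one expertise profile
        per researcher. `sources` is an iterable of (source_name, records)."""
        profiles = defaultdict(lambda: defaultdict(list))
        for source_name, records in sources:
            for rec in records:
                profiles[rec["researcher"]][source_name].append(rec["item"])
        return profiles

    # Invented records from two harvested sources.
    publications = [{"researcher": "Researcher A", "item": "Paper on team science"}]
    grants       = [{"researcher": "Researcher A", "item": "Collaborative grant, 2012"}]

    profiles = build_profiles([("publications", publications), ("grants", grants)])
    print(dict(profiles["Researcher A"]))
    # {'publications': ['Paper on team science'], 'grants': ['Collaborative grant, 2012']}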

 

Source:

Comparison of Research Networking Tools and Research Profiling Systems

From Wikipedia, the free encyclopedia

Credentials for initiating this comparison:

Holly J. Falk-Krzesinski, Ph.D.

Membership Chair and Past President

National Organization of Research Development Professionals (NORDP)

LinkedIn: http://www.linkedin.com/in/hollyfk

Director, Research Team Support & Development (RTS&D)

Clinical and Translational Sciences (NUCATS) Institute

Northwestern University

See on en.wikipedia.org
