Individual researchers benefit from online impact assessment, but online impact measurement is not yet suited for research assessment exercises

Via Scoop.it – Dual impact of research; towards the impactelligent university

Individual researchers are very interested in evidence of the impact of their publications. Research institutes and independent organisations assessing research have a special interest when comparing groups and organisations for research assessment. Thanks to the possibilities of web-based publishing, it is now possible to gauge the impact of some publications under certain conditions. New information filters and tools are helping researchers to assess their own progress and to find responses of others to their publications.

The report Users, Narcissism and Control, which was commissioned by SURF, offers a comprehensive overview of current tools for tracking online publications. The report shows to what extent it is possible to follow in real time how research results are being downloaded, read, cited, and applied.

Stricter protocols required
The fact that researchers can use these tools does not necessarily mean that this technology is also a legitimate source of information for research assessment. For that application, the tools need to adhere to a far stricter protocol of data quality and of indicator reliability and validity (for example: what does a download imply about the use of the research results?). Most new tools do not yet comply with the stricter quality criteria required for use in research assessments.

Tracking tools
The authors of the report ‘Users, narcissism and control’, Paul Wouters and Rodrigo Costas (CWTS), explore the explosion of tracking tools that has accompanied the surge of web-based information instruments. A total of 16 quite different tools have been assessed. The authors also highlight the potential risks and disadvantages of the new tracking tools. Among the tools discussed are Google Scholar, Microsoft Academic Search, Total-Impact, PLoS ONE altmetrics, and F1000.

Frank van Harmelen, professor of Knowledge Representation and Reasoning at the VU University Amsterdam, writes on the report: “New web-based metrics for scientific impact will make it possible to observe the developments in science in near real-time. I’m very happy to see that such web-based measures of scientific impact are now being considered by leading scholars involved in the science of measuring and analysing science (Scientometrics)”.

Research assessment exercises
The report also recommends starting a concerted research programme investigating the dynamics, properties, and potential use of new web-based metrics. This programme should relate these new measures to the already established indicators of scientific and scholarly impact, and can provide insight into how these developing metrics could be applied in research assessment exercises.

Source: Users, narcissism and control – tracking the impact of scholarly publications in the 21st century
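The real-time tracking the report describes amounts, at its core, to aggregating usage events (downloads, citations, mentions) per publication. The sketch below is a minimal illustration of that idea, not the report's actual method; the DOIs and event data are entirely hypothetical.

```python
from collections import defaultdict

# Hypothetical event stream: (DOI, event type) pairs, as a tracking tool
# might collect them from different platforms. Illustrative data only.
events = [
    ("10.1000/example.1", "download"),
    ("10.1000/example.1", "citation"),
    ("10.1000/example.2", "download"),
    ("10.1000/example.1", "download"),
]

def aggregate_impact(events):
    """Tally usage events per publication, broken down by event type."""
    counts = defaultdict(lambda: defaultdict(int))
    for doi, kind in events:
        counts[doi][kind] += 1
    # Convert nested defaultdicts to plain dicts for a clean result.
    return {doi: dict(kinds) for doi, kinds in counts.items()}

profile = aggregate_impact(events)
# profile["10.1000/example.1"] → {"download": 2, "citation": 1}
```

Note that the report's central caveat applies to any such tally: a raw download count says nothing by itself about how, or whether, the research results were actually used.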
SURFfoundation

Authors
Paul Wouters – CWTS
Rodrigo Costas – CWTS

Contributors
Wouter Boon – Rathenau Institute
Jeroen Bosman – Utrecht University
Mariken Elsen – Netherlands Organisation for Scientific Research (NWO)
Gert Goris – Erasmus University Rotterdam
Paul Groth – VU University Amsterdam
Henk van den Hoogen – Maastricht University
Wilfred Mijnhardt – Erasmus University Rotterdam
Ronald Snijder – Amsterdam University Press
Maurice Vanderfeesten – SURFfoundation
Ludo Waltman – CWTS, Leiden University

Editors
Marnix van Berchum – SURFfoundation
Keith Russell – SURFfoundation

The full-text report is available here:

Impact indicators listed in the 2012 AACSB report on impact of research

Via Scoop.it – Dual impact of research; towards the impactelligent university

The indicators were identified by schools that participated in the 2011–2012 exploratory study as potential indicators of research impact and/or alignment with expectations. The full list provided below is meant neither to be comprehensive (surely, schools will identify others not listed here, or find new variations) nor to be an endorsement of any particular indicator.
As emphasized within this report, schools must be discerning about whether any particular metric is relevant and cost-effective. Several of the measures included in the list below, for example, were identified by an exploratory study school as a potential measure but, for various reasons, not one it would choose to utilize.

PRACTICE/COMMUNITY
• media citations (number, distribution)
• requests from the practice community for faculty expertise (e.g., consulting projects, broadcast forums, researcher-practitioner meetings)
• publications in practitioner journals or other venues aimed directly at improving management expertise and application
• consulting reports
• research income from various types of industry and community collaborative schemes
• case studies of research leading to solutions to business problems or of research being adopted through new practices by industry and community partners
• presentations and workshops
• invitations to serve as experts on policy formulation panels, witnesses at legislative hearings, special interest groups/roundtables, etc.
• tools/methods developed for companies
• membership on boards of directors of corporate and non-profit organizations

ACADEMIC
• overall number of peer-reviewed publications (in designated journals, e.g. Top 3, 10, etc.)
• citation counts (e.g., SSCI/ISI, Google Scholar)
• download counts (e.g., electronic journals)
• faculty activities as editors, associate editors, or as editorial board members (for designated journals); reviews for journals
• elections and appointments to key positions in professional associations
• recognitions/awards (e.g., “Best Paper,” etc.) granted by university or scholarly societies
• invitations to participate in research conferences, scholarly programs, and/or national and regional research forums
• inclusion of academic work as part of syllabi for courses by other professors
• use of papers in doctoral seminars
• grants from major national and international agencies (e.g., NSF and NIH); third-party funded research projects and funds obtained
• patents
• appointments as visiting professors in other schools (or a designated set of schools)

DOCTORAL EDUCATION
• hiring/placement of PhD students, junior faculty, post-doctoral research assistants
• publications of PhD program students and graduates
• invited conference attendance, awards/nominations for doctoral students/graduates
• research fellowships awarded to doctoral students/graduates
• funding award levels for students of higher degree research training
• case studies of knowledge transfer to industry and impact on corporate or community practice through higher degree research training activities
• research output of junior faculty members (post-doctoral junior professors and assistant professors, as well as doctoral-level research assistants and PhD students), because they are often influenced by a mentor/supervisor

TEACHING
• grants for research that influences teaching practice
• case studies of research leading to the adoption of new teaching and learning practices
• textbooks, teaching manuals, and publications that focus on research methods and teaching: number, editions, sales volume, use in teaching
• research-based learning (e.g., in projects with companies, institutions, and non-profit organizations)
• instructional software (number developed, number of users)
• case study development (number developed, number of users)

UNDERGRADUATE EDUCATION
• mentorship of undergraduate research, by counting the papers produced by undergraduate students (under faculty supervision) that culminate in presentation at formal and recognized conferences for undergraduate research
• documented improvements in learning outcomes that result from teaching innovation (from learning and pedagogical research)

EXECUTIVE EDUCATION
• involvement of research-active faculty in executive education

RESEARCH CENTERS
• invitations by governmental agencies or other organizations for center representatives to serve on policy-making bodies
• continued funding (e.g., number of donors, scale of donations)
• number of hits (e.g., tracked by Google Analytics) on the research center website
• attendees (representing academics, practitioners, policymakers, etc.) at center-sponsored conferences
• alignment of intellectual contributions with themes valued by the school’s mission (e.g., “social justice,” “global,” “innovation”)
• percentage of intellectual contributions (at college level and/or department level) that align with one or more “mission-related categories”; or percentage of faculty with one or more intellectual contributions that align with one or more categories

See Appendix A, pages 38–39 of the report:

Impact of Research: A Guide for Business Schools; Insights from the AACSB International Impact of Research Exploratory Study

Via Scoop.it – Dual impact of research; towards the impactelligent university

AACSB’s latest research report draws upon the experiences of ten business schools that volunteered to participate in an exploratory study following the release of the Final Report of the Impact of Research Task Force. The study was intended to determine the overall feasibility of schools undertaking more concerted efforts to assess the impact of intellectual contributions, to assess the burden and costs to schools, and to begin to explore appropriate measures of impact.

The findings are presented in three sections that correspond generally to objectives a school might seek to achieve through such an exercise. The objectives and related insights build upon one another to suggest avenues and critical questions for schools that seek to define overall research expectations, produce evidence consistent with their practices and expectations, and then reflect on the relationship of the results to their missions and visions:
I. Defining Research Expectations
Who are we as a school, and what are we aiming to accomplish through our research? What might our “Statement of Impact Expectations” be?
II. Exploring Possible Measures
Is it possible to assess whether or not we are achieving those expectations? What metrics might we use?
III. Using and Communicating What Was Learned
What have we learned about our school through this process? Are there insights from this process that inform other actions or decisions? Are we effectively communicating about our scholarship to relevant audiences?

Full text:

The Five Stars of Online Journal Articles — a Framework for Article Evaluation

Via Scoop.it – Dual impact of research; towards the impactelligent university

From the abstract: “The author proposes five factors — peer review, open access, enriched content, available datasets and machine-readable metadata — as the Five Stars of Online Journal Articles, a constellation of five independent criteria within a multi-dimensional publishing universe against which online journal articles can be evaluated, to see how well they match up with current visions for enhanced research communications. Achievement along each of these publishing axes can vary, analogous to the different stars within the constellation shining with varying luminosities. I suggest a five-point scale for each, by which a journal article can be evaluated, and provide diagrammatic representations for such evaluations. While the criteria adopted for these scales are somewhat arbitrary, and while the rating of a particular article on each axis may involve elements of subjective judgment, these Five Stars of Online Journal Articles provide a conceptual framework by which to judge the degree to which any article achieves or falls short of the ideal, which should be useful to authors, editors and publishers. I exemplify such evaluations using my own recent publications of relevance to semantic publishing.”

Source: D-Lib Magazine, January/February 2012
Volume 18, Number 1/2
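The framework lends itself to a simple data representation: five independent axes, each scored on a five-point scale. The sketch below is only an illustration of that structure, not code from the article; the 0–4 scoring range and all names are assumptions.

```python
from dataclasses import dataclass

# The five axes named in the article; the 0-4 scale is an assumption.
AXES = ("peer_review", "open_access", "enriched_content",
        "available_datasets", "machine_readable_metadata")

@dataclass
class FiveStarRating:
    """One article's evaluation across the five independent axes."""
    peer_review: int = 0
    open_access: int = 0
    enriched_content: int = 0
    available_datasets: int = 0
    machine_readable_metadata: int = 0

    def __post_init__(self):
        # Each axis is rated independently on the assumed 0-4 scale.
        for axis in AXES:
            score = getattr(self, axis)
            if not 0 <= score <= 4:
                raise ValueError(f"{axis} must be between 0 and 4, got {score}")

    def profile(self):
        """Return the rating as an axis -> score mapping, e.g. for plotting."""
        return {axis: getattr(self, axis) for axis in AXES}

# Hypothetical article: strong on peer review and metadata, weak on datasets.
rating = FiveStarRating(peer_review=4, open_access=3, enriched_content=2,
                        available_datasets=1, machine_readable_metadata=4)
```

Keeping the axes independent, rather than collapsing them into a single total, mirrors the article's point that these are separate dimensions of a "multi-dimensional publishing universe" rather than one ranking.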

From Engagement to Impact? Articulating the Public Value of Academic Research

Via Scoop.it – Dual impact of research; towards the impactelligent university

This paper reviews recent culture change in British higher education (HE) and an increasing emphasis on academics evidencing, in meaningful and measurable ways, the value and contribution of their work to national societies. Discussion focuses on what is purported to be a shift from academics rationalizing the benefits of their work in terms of public engagement to a more contentious signifier of research worth, “impact”. The primary argument herein is that an impact agenda, framed in terms of assessment and by the upcoming Research Excellence Framework 2014, has not eclipsed the engagement initiative in UK HE but has actually lent it greater credence and tacit momentum. Where public engagement “pre-impact” was viewed by sections of the academic community as frivolous, faddish and tokenistic, it is now elevated as an integral component of impact-capture work and of plotting the pathways between research producer and research intermediary/end-user/collaborator. Where “impact” is a statement of the value of academic work, engagement is the method of its articulation and the means by which impacts are mobilized.

Richard Watermeyer (2012). From Engagement to Impact? Articulating the Public Value of Academic Research. Tertiary Education and Management. DOI: 10.1080/13583883.2011.641578
Available online: 30 Jan 2012

The never-ending search for credibility; Business Schools and their Contribution to Society

Via Scoop.it – Dual impact of research; towards the impactelligent university

Combining perspectives from business school deans from around the world, as well as scholars and business leaders, this book presents a unique discussion of the current and future challenges facing business schools today. Business schools are arguably some of the most influential institutions in contemporary society, heavily influencing the way much socioeconomic activity is conducted. The education they provide is an important theme to be considered in its own right – and perhaps even challenged. This exciting book explores the role of business schools in contemporary global society through three key dimensions: how business school legitimacy has been challenged by the recent economic crisis and corporate scandals; how business schools contribute to shaping and transforming business conduct; and how business schools, past and present, develop their identities to face the challenges presented by the ongoing globalization process.

Book: Business Schools and their Contribution to Society
Mette Morsing (Copenhagen Business School, Denmark) and Alfons Sauquet Rovira (ESADE Business School)
280 pages, SAGE Publications Ltd, 2011. ISBN: 9780857023872

Who still needs a university? 400 Free Online Courses from Top Universities | Open Culture

Via Scoop.it – Dual impact of research; towards the impactelligent university

Impact for all. Here is a list of free online courses published by the world’s leading universities like Stanford, Yale, MIT, Harvard & Berkeley. This collection includes over 250 free courses in economics, business, philosophy, the liberal arts and sciences. Download these audio & video courses straight to your computer or mp3 player.

