Twitter and your academic reputation: friends or enemies?

Trial by Twitter

Feedback from social media such as Twitter can strike as fast as lightning, with unforeseen consequences. For many researchers, the pace and tone of this online review can be intimidating and can sometimes feel like an attack. How do authors best deal with these forms of post-publication peer review? This "social peer review" moves far faster than the formal peer review a paper undergoes during submission to and acceptance by (top) academic journals, and the feedback can come from anywhere, not just from the circle of accepted experts in the field, as in a journal's blind review process.

The result can have an enormous impact on your academic reputation. What if thousands of tweets suddenly disapprove of the conclusions of a paper you have just published? That situation is nearly impossible for the author to handle. A "negative sentiment" will form around the publication and will affect several aspects of your reputation, for example the chances that your paper will be cited often. How will this social sentiment affect other papers in the pipeline or under submission at (top) journals? How will it affect your co-authors? How will it influence the chances of your grant applications? How will it influence your tenure process if the sentiment is negative? The stakes for researchers are huge.

A recent article by Apoorva Mandavilli in Nature deals with this issue. It covers "fast feedback", "a chorus of (dis)approval", "meta-Twitter", new (alt)metrics of communication, and some possible solutions for the situation.

The potential power of social media for research and academic reputation is evident to me. Managing this communication and the speed of the feedback requires special skills and special publication strategies from researchers (and institutes!) who care about their future careers and their reputation. The open review dynamics on networks like Twitter currently carry many risks for the author of a paper, while the stakes and risks for the crowd that collectively performs these "trials" are, I would guess, very low. A single tweet is not powerful, but the flock together is impactful: it is a collective review by the crowd, in which many people simply follow the sentiment by retweeting others.

I advise researchers to be very careful about which message about their paper is distributed in social networks, how it is distributed, by whom, and who replies to it. Social networks should not reproduce or copy the formal peer review process by selected experts; they should focus on adding value to the other virtues of the work. The best approach might be to leverage social media by initiating stories about the possible practical value and practical impact of the research. When these are confirmed by the wider social network audience, the author gains confidence that the practical and managerial value of the research is immediately tested and appreciated. Used this way, social networks can be very beneficial for academic reputation: they serve as a sounding board for testing the managerial and practical value of the research.

Crowdsourced ranking of journals in Management; any news?

Vote for management journals

Management researchers love journal rankings. A new way to gather the judgements of thousands of voters was launched recently by Teppo Felin of Brigham Young University. Within a few weeks, 80,000 votes had been submitted to rank more than 100 journals in the field of management; voters can vote as many times as they like. After filtering out about 20,000 automated robot votes, an impressive 60,000 votes remained.

There are already some blog posts on this ranking in which peers discuss the pros and cons of ranking journals this way.

The crowdsourcing initiative is hosted at All Our Ideas: http://www.allourideas.org/management

Besides the obvious difficulty of comparing all these journals (a nearly impossible task), one can ask what the 'wisdom of the crowds' adds to the existing lists of top journals in the field.

A lot of the usual suspects turn up in the top 25 (as of today, 22/01/2011, with 80,000 votes, before the cleaning):

A score of 100 means a journal is expected to win every pairwise contest against another journal, and a score of zero means it is expected to lose every contest. There are no zero scores in this ranking (yet). A minimal sketch of this pairwise scoring follows the top-25 list below.

 

  1. JOURNAL OF APPLIED PSYCHOLOGY (score: 81)
  2. ADMINISTRATIVE SCIENCE QUARTERLY (76)
  3. ACADEMY OF MANAGEMENT REVIEW (74)
  4. ORGANIZATION SCIENCE (73)
  5. ACADEMY OF MANAGEMENT JOURNAL (72)
  6. JOURNAL OF MANAGEMENT (71)
  7. MIS QUARTERLY (70)
  8. STRATEGIC MANAGEMENT JOURNAL (68)
  9. ORGANIZATIONAL BEHAVIOR AND HUMAN DECISION PROCESSES (67)
  10. MANAGEMENT SCIENCE (66)
  11. DECISION SCIENCES (65)
  12. ORGANIZATIONAL RESEARCH METHODS (64)
  13. JOURNAL OF ORGANIZATIONAL BEHAVIOR (63)
  14. JOURNAL OF MANAGEMENT STUDIES (63)
  15. PERSONNEL PSYCHOLOGY (63)
  16. JOURNAL OF INTERNATIONAL BUSINESS STUDIES (60)
  17. RESEARCH IN ORGANIZATIONAL BEHAVIOR (60)
  18. ACADEMY OF MANAGEMENT PERSPECTIVES (60)
  19. HUMAN RELATIONS (60)
  20. LEADERSHIP QUARTERLY (59)
  21. ORGANIZATION STUDIES (59)
  22. JOURNAL OF OPERATIONS MANAGEMENT (59)
  23. HARVARD BUSINESS REVIEW (58)
  24. MIT SLOAN MANAGEMENT REVIEW (57)
  25. HUMAN RESOURCE MANAGEMENT-US (57)
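
To make the scoring concrete, here is a minimal Python sketch that turns pairwise vote records into a 0-100 score. It simply uses the raw share of contests won; the estimator behind All Our Ideas is a more sophisticated statistical model, and the journal abbreviations and votes below are purely hypothetical.

    from collections import defaultdict

    # Hypothetical (winner, loser) records from "which journal is better?" contests.
    votes = [
        ("ASQ", "JOM"), ("AMJ", "ASQ"), ("ASQ", "AMR"),
        ("AMR", "JOM"), ("ASQ", "JOM"), ("JOM", "AMR"),
    ]

    wins = defaultdict(int)      # contests won per journal
    contests = defaultdict(int)  # contests entered per journal

    for winner, loser in votes:
        wins[winner] += 1
        contests[winner] += 1
        contests[loser] += 1

    # Score = share of contests won, scaled to 0-100: a journal that wins
    # every contest scores 100, one that loses every contest scores 0.
    scores = {j: 100 * wins[j] / contests[j] for j in contests}
    for journal, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{journal}: {score:.0f}")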

The initiator of the survey reported on the results today (29/01/2011):

So, what can a crowdsourced ranking tell us?  Nothing definitive is my guess, though I’m sure other rankings don’t necessarily give us a definitive signal either.  Though I do think that aggregated perceptions perhaps give us another data point when evaluating and comparing journals (along with impact and influence factors and other, more “objective” measures).  These rankings can of course mirror extant rankings (raising causal questions). But they might also capture more up-to-date anticipations of future performance. For example, the UC Irvine Law School (established in 2008) has not graduated a single student, though the school is already well within the top 50 in the crowdsourced ranking.

Lots of other questions can be raised, specifically related to a management journal ranking like this.  For example, should micro and macro journals be lumped together like this?  And certainly disciplinary journals play a large role in management – should they be included (sociology, psychology, economics)?

Strategic “gaming” of the results of course can also occur.  For example, I ended up having to delete some 25,000+ automatically generated votes (it looked like a computer script was created to throw the ranking off), votes that were explicitly cast to sabotage the effort (the African Journal of Management beat all the top journals according to this mega, robo-voter).  Though, it is interesting to see how the “crowd” essentially plays a role in averaging bias and in putting a check on strategic voting.

Ironically, I’m actually not one to necessarily really care about journal rankings like this.  I wonder whether article-effects trump journal-effects?  (I believe Joel Baum has a provocative paper on this.)  Of course I read and submit to “top” journals, but there are many “lesser” (ahem) journals that are just as much on my radar screen, for example Industrial and Corporate Change, Managerial and Decision Economics and Strategic Organization. Obsessions with journal standing can detract from ideas.

Pragmatically, yes, journal rankings matter: promotions indirectly depend on it, as do resource flows etc.  So, perhaps a “democratic,” crowdsourced ranking like this can provide additional information for decision-makers and scholars in the field.

 

The top 15 after filtering out the roughly 20,000 automated robot votes: the picture is now even more 'traditional'. (A sketch of how such robo-voting might be detected follows this list.)

Management journals (score, % of contests won):

  1. Administrative Science Quarterly (90.43%)
  2. Academy of Management Journal (90.41%)
  3. Academy of Management Review (88.94%)
  4. Organization Science (88.31%)
  5. Strategic Management Journal (84.42%)
  6. Journal of Applied Psychology (84.00%)
  7. Management Science (82.56%)
  8. Journal of Management (82.46%)
  9. Organizational Behavior and Human Decision Processes (78.93%)
  10. Organizational Research Methods (74.06%)
  11. Journal of Organizational Behavior (72.79%)
  12. Personnel Psychology (71.93%)
  13. Journal of Management Studies (71.10%)
  14. Research in Organizational Behavior (70.37%)
  15. Organization Studies (69.68%)
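
Neither this post nor Felin's quote explains how the robot votes were identified, so the following Python sketch is only an assumption about what such cleaning could look like: flag voting sessions that cast implausibly many votes with nearly all wins going to a single journal (as the 'mega robo-voter' described above did). The session IDs, thresholds and vote data are all hypothetical.

    from collections import Counter, defaultdict

    # Hypothetical raw vote log: (session_id, winner, loser).
    vote_log = [
        ("s1", "ASQ", "AMJ"),
        ("s2", "AMR", "JOM"),
        ("s3", "AMJ", "ASQ"),
    ]
    # Simulate a script hammering the contest in favour of one journal:
    vote_log += [("bot", "African Journal of Management", other)
                 for other in ("ASQ", "AMJ", "SMJ") for _ in range(200)]

    MIN_VOTES = 50          # assumed: far more votes than a person plausibly casts
    MAX_WINNER_SHARE = 0.9  # assumed: near-total win share for a single journal

    winners_by_session = defaultdict(list)
    for session, winner, _loser in vote_log:
        winners_by_session[session].append(winner)

    suspicious = set()
    for session, winners in winners_by_session.items():
        top_share = Counter(winners).most_common(1)[0][1] / len(winners)
        if len(winners) > MIN_VOTES and top_share > MAX_WINNER_SHARE:
            suspicious.add(session)

    clean_log = [v for v in vote_log if v[0] not in suspicious]
    print("flagged sessions:", suspicious)               # {'bot'}
    print("votes kept:", len(clean_log), "of", len(vote_log))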

 

Where do all these votes come from?

The overall picture is US- and EU-dominated. For example, only 16 votes from Shanghai, China, were recorded.

World map of overall crowd votes (22/01/2011)

The complete ranking (22/01/2011) is available as a PDF file: https://rmimr.wordpress.com/wp-content/uploads/2011/01/all-our-ideas-crwodsourcing-management-journals-20110122.pdf

A new measure of esteem: prestige, or how often is a researcher cited by highly cited papers?

Prestige & Popularity

Ding and Cronin make a nice distinction between the popularity and the prestige of a researcher: popularity is measured by the number of times a researcher is cited by all papers in a dataset; prestige by the number of times he or she is cited by highly cited papers in the same set. A scholar may be highly cited but not highly regarded: popularity and prestige are not identical measures of esteem. The authors focus primarily on authors rather than journals.

Popularity vs. Prestige
Popularity and prestige can exist in four possible combinations: high popularity-high prestige, high popularity-low prestige, low popularity-high prestige and low popularity-low prestige.
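
To make the distinction concrete, here is a minimal Python sketch of the two counts. It is not Ding and Cronin's exact procedure: the toy dataset and the "top 20% by citation count" threshold for "highly cited" are assumptions for illustration only.

    from collections import defaultdict

    # Toy dataset: each citing paper has its own citation count and a list of
    # authors whose work it cites.
    citing_papers = [
        {"citations": 200, "cites_authors": ["A"]},
        {"citations": 120, "cites_authors": ["A", "B"]},
        {"citations": 45,  "cites_authors": ["B"]},
        {"citations": 3,   "cites_authors": ["A", "C"]},
        {"citations": 1,   "cites_authors": ["C"]},
    ]

    # Assumed cut-off for "highly cited": the top 20% of citing papers by
    # citation count (an illustrative choice, not the paper's exact method).
    counts = sorted((p["citations"] for p in citing_papers), reverse=True)
    cutoff = counts[max(0, len(counts) // 5 - 1)]

    popularity = defaultdict(int)  # citations received from all papers
    prestige = defaultdict(int)    # citations received from highly cited papers only

    for paper in citing_papers:
        for author in paper["cites_authors"]:
            popularity[author] += 1
            if paper["citations"] >= cutoff:
                prestige[author] += 1

    for author in sorted(popularity):
        print(f"{author}: popularity={popularity[author]}, prestige={prestige[author]}")

In this toy set, authors A, B and C are nearly equal in popularity (3, 2 and 2 citations), but only A has any prestige: exactly the kind of divergence Ding and Cronin analyse.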

 

Source: http://arxiv.org/ftp/arxiv/papers/1012/1012.4871.pdf

Popular and/or Prestigious? Measures of Scholarly Esteem
Ying Ding, Blaise Cronin
School of Library and Information Science, Indiana University, Bloomington, IN 47405, USA
Abstract
Citation analysis does not generally take the quality of citations into account: all citations are weighted equally irrespective of source. However, a scholar may be highly cited but not highly regarded: popularity and prestige are not identical measures of esteem. In this study we define popularity as the number of times an author is cited and prestige as the number of times an author is cited by highly cited papers. Information Retrieval (IR) is the test field. We compare the 40 leading researchers in terms of their popularity and prestige over time. Some authors are ranked high on prestige but not on popularity, while others are ranked high on popularity but not on prestige. We also relate measures of popularity and prestige to date of Ph.D. award, number of key publications, organizational affiliation, receipt of prizes/honors, and gender.

Science-Metrix develops new journal classification system (and interactive web tool) using both the Web of Science (Thomson Reuters) and Scopus (Elsevier).

A new set of tools, first publicly released on 2010-12-01 (v1.00) by Science-Metrix Inc., seeks to improve the way we talk about and understand science. The US/Canada-based research evaluation firm has developed a new, multi-lingual (18 languages!) classification of scientific journals, accompanied by an interactive web tool. The journal classification, which covers 15,000 peer-reviewed scientific journals, was translated by more than 22 international experts who volunteered their time and expertise, making the tools available to a worldwide audience. The complete journal list is available for download as an Excel sheet.

The interactive 'Scientific Journals Ontology Explorer' allows users to visualize the links between 175 scientific specialties (subject categories) in 18 languages, from Arabic to Swedish.

The visualization offers three different views: a circular "Subfield Citation Wheel" (representing both citations and references), a "Field Citation Wheel" (showing the links between distinct scientific disciplines) and a network "Map of Science" (revealing similarities between disciplines by relative distance). The goal of this visualization is to show how broad the universe of science is and how interlinked scientific research actually is.

How is the field of Business & Management covered in the journals list?

The field of Economics & Business (part of the domain Economic & Social Sciences) contains 822 journals (5.4% of the total Science-Metrix list) in 12 subfields; Business & Management is one of them, with 222 journals (27%):

Every journal is classified into exactly one category. (A sketch of how these counts could be reproduced from the downloadable list follows the table.)

Subfields of 'Economics & Business' (N journals, % of field):

Accounting: 32 (3.9%)
Agricultural Economics & Policy: 27 (3.3%)
Business & Management: 222 (27.0%)
Development Studies: 42 (5.1%)
Econometrics: 13 (1.6%)
Economic Theory: 13 (1.6%)
Economics: 244 (29.7%)
Finance: 63 (7.7%)
Industrial Relations: 21 (2.6%)
Logistics & Transportation: 49 (6.0%)
Marketing: 61 (7.4%)
Sport, Leisure & Tourism: 35 (4.3%)
Total: 822 journals (100%)
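
As a quick check, counts like those in the table above can be tallied from the downloadable Excel list. The sketch below uses pandas; the filename and the column names ("Field", "Subfield") are assumptions, so check the actual sheet headers before running it.

    import pandas as pd

    # Assumed filename and column names ("Field", "Subfield"); verify them
    # against the actual downloadable sheet before running.
    df = pd.read_excel("science-metrix_journal_classification.xlsx")

    econ_business = df[df["Field"] == "Economics & Business"]
    counts = econ_business["Subfield"].value_counts()
    shares = (100 * counts / counts.sum()).round(1)

    # Prints journals per subfield (N) and their share of the field (%).
    print(pd.DataFrame({"N": counts, "%": shares}))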

I will compare these with our ERIM Journals List, the ISI quartiles and the SJR (Scopus) quartile scores to see how the list is structured in terms of quality layers.

(I will add these details later this week.)

In the ontology browser, you can create a map of science and learn how the field of business and management is connected to other subject categories. I have selected the closest fields in the screenshot below.

Business & Management in the Science-Metrix Map of Science (screenshot)

About Science-Metrix:

Science-Metrix is the world’s largest independent firm dedicated to scientometrics, technometrics, and science and technology (S&T) evaluation. The firm’s core business involves supporting evidence-based decision-making with strong empirical data and sound theoretical approaches. This contract research organization combines qualitative and quantitative techniques to deliver high quality program evaluations, performance and outcome assessments, and evaluation frameworks. Every Science-Metrix report is produced by a team of dedicated high-calibre experts and relies on the world-class data found in the Scopus, Web of Science and Questel databases.
