OC-2: Alternative Metrics ('Altmetric') Score for ICARDA publications

Definition: Alternative metrics measure the dissemination and citation of publications (including those that are not peer reviewed) in online media, providing an indication of reach and influence.

Unit of Measure: Number (Integer)

Disaggregated by: Type of knowledge product (Audio, Blog, Book, Book chapter, Brief, Brochure, Conf. paper, Conf. proceeding, Dataset, Donor report, Image, Journal Article ISI, Journal Article Non-ISI, Manual, Map, Newsletter, Other, Poster, Presentation, Report, Software, Template, Thesis, Tool, Training material, Video, Website, Working paper)

Method of Calculation:
The score is derived from an automated algorithm and represents a weighted count of the attention a research output has received. The count is weighted to reflect the relative reach of each type of source (for example, an average newspaper story is more likely to bring attention to a research output than an average tweet). The weights are listed in square brackets below.
  • News [8]
  • Blog [5]
  • Policy document (per source) [3]
  • Patent [3]
  • Wikipedia [3]
  • Twitter (tweets and retweets) [1]
  • Peer review (Publons, PubPeer) [1]
  • Weibo (not trackable since 2015, but historical data kept) [1]
  • Google+ (not trackable since 2019, but historical data kept) [1]
  • F1000 [1]
  • Syllabi (Open Syllabus) [1]
  • LinkedIn (not trackable since 2014, but historical data kept) [0.5]
  • Facebook (only a curated list of public Pages) [0.25]
  • Reddit [0.25]
  • Pinterest (not trackable since 2013, but historical data kept) [0.25]
  • Q&A (Stack Overflow) [0.25]
  • YouTube [0.25]
  • Number of Mendeley readers [0]
  • Number of Dimensions and Web of Science citations [0]
The Altmetric Attention Score is always a whole number; mentions that contribute less than 1 to the score are therefore sometimes rounded up to 1.
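The calculation can be illustrated as a weighted sum of mention counts. The sketch below is only an approximation for orientation: the real Altmetric algorithm is proprietary, and the exact rounding behaviour is not fully specified here, so rounding the total up to a whole number is an assumption made for illustration.

```python
import math

# Source weights as listed in the indicator definition above.
WEIGHTS = {
    "news": 8, "blog": 5, "policy": 3, "patent": 3, "wikipedia": 3,
    "twitter": 1, "peer_review": 1, "weibo": 1, "google_plus": 1,
    "f1000": 1, "syllabi": 1, "linkedin": 0.5,
    "facebook": 0.25, "reddit": 0.25, "pinterest": 0.25,
    "qa": 0.25, "youtube": 0.25,
    "mendeley": 0, "citations": 0,  # tracked but weighted zero
}

def attention_score(mentions):
    """Weighted count of mentions for one research output.

    `mentions` maps a source name to its mention count. The raw
    weighted sum is rounded up (an assumption), so any non-zero
    attention yields a whole-number score of at least 1.
    """
    raw = sum(WEIGHTS.get(source, 0) * count
              for source, count in mentions.items())
    return math.ceil(raw)

# Example: 2 news stories, 10 tweets, 1 Facebook post
# -> 2*8 + 10*1 + 1*0.25 = 26.25, rounded up to 27
print(attention_score({"news": 2, "twitter": 10, "facebook": 1}))
```

Note how a single Facebook post (weight 0.25) on its own would still produce a score of 1, matching the rounding-up rule described above.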

Data sources: MEL, DSpace

Data collection method: Automated

Data collection and reporting responsibility: Knowledge and Data Management team

Data Collection and Reporting Frequency: Annual

Evidence required: Publication statistics in CSV format, and a short narrative on the nature of the publications' Altmetric scores.

Rationale:

  • Traditional measures of the dissemination (publication in peer-reviewed journals) and use (academic citations) of research can fail to capture its use, influence and dissemination through non-traditional channels, for example HTML views and PDF downloads, or discussion in news sources, policy documents, science blogs, Wikipedia, Twitter, Facebook and other social media. As these channels become increasingly important for uptake, including by policy-makers, alternative metrics ('altmetrics') are a useful way to measure dissemination and influence.

  • Altmetric is particularly useful for non-peer-reviewed publications. There is often no permanent, stable way to track the use of these outputs (although individual projects may track downloads, etc.), and tracking them in Altmetric provides evidence of their importance, offering a counterbalance to an exclusive emphasis on peer-reviewed publications.

  • Tracking Altmetric scores gives research and administrative staff ideas for how to communicate research findings more effectively and reach target users.

Comments and limitations:

  • Use of Altmetric requires proper archiving and the use of stable links rather than temporary ones (e.g. to project websites), which encourages more sustainable information management of published materials. This is particularly important for non-peer-reviewed publications – for example briefing papers, working papers, games and decision trees – as there is often little incentive to archive these properly, and they can become 'lost to history' after projects finish, encouraging reinvention of the wheel and the loss of 'negative results'. Altmetric will be a useful metric only if projects and researchers archive knowledge products properly.

  • It is recognised that the annual reporting cycle does not give a full picture of the uptake of publications completed towards the end of the year, since full social media uptake may take some months (and conventional citations, years). Alternative reporting periods have been suggested, but none has found general acceptance.

  • Altmetric tracks a large number of sources (disaggregates) that evolve over time, creating disparities between the scores of knowledge products calculated at different points in time.

  • Interpretation of the scores is not straightforward, as different Altmetric sources reflect different kinds of sharing and spread.