altmetrics in education

Altmetrics: A Practical Guide for Librarians, Researchers and Academics


In scholarly and scientific publishing, altmetrics are non-traditional metrics[2] proposed as an alternative[3] to more traditional citation impact metrics, such as the impact factor and h-index.[4] The term altmetrics was proposed in 2010[1] as a generalization of article-level metrics,[5] and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. They are related to webometrics, which had similar goals but evolved before the social web. Altmetrics did not originally cover citation counts;[6] they capture other aspects of the impact of a work, such as how many data and knowledge bases refer to it, article views, downloads, and mentions in social media and news media.[7][8]


more on analytics and metrics in education in this IMS blog


3 Comments on altmetrics in education

    • Plamen Miltenoff
      December 15, 2016 at 11:49 pm
      Eysenbach, G. (2011). Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact. Journal of Medical Internet Research, 13(4).

      Harley, D., Acord, S. K., Earl-Novell, S., Lawrence, S., & King, C. J. (2010). Assessing the Future Landscape of Scholarly Communication: An Exploration of Faculty Values and Needs in Seven Disciplines. UC Berkeley: Center for Studies in Higher Education.
      Someday, Altmetrics Will No Longer Need “Alt” – The Chronicle of Higher Education. (n.d.). Retrieved December 15, 2016.

      Beyond Bibliometrics. (n.d.). Retrieved December 15, 2016.

      Roemer, R. C., & Borchardt, R. (2015). Chapter 3. Issues, Controversies, and Opportunities for Altmetrics. Library Technology Reports, 51(5), 20–30.
      Meaningful Metrics: A 21st Century Librarian’s Guide to Bibliometrics, Altmetrics, and Research Impact. ALA Store. (n.d.). Retrieved December 15, 2016.

      Tattersall, A. (2016). Altmetrics: A Practical Guide for Librarians, Researchers and Academics. Facet Publishing.
      Thelwall, M., & Wilson, P. (2016). Mendeley readership altmetrics for medical articles: An analysis of 45 fields. Journal of the Association for Information Science and Technology, 67(8), 1962–1972.
      p. 3 Bibliometric indicators have known biases, however. For example, citation-based impact indicators typically reflect an aspect of academic impact whereas funding agencies often target wider types of impact.
      A second problem with citation-based indicators is that they impose a delay of several years on evaluations. Citations take time to accrue whilst publications are read by others who then conduct new research informed by their reading, write up and publish studies and then wait for them to be peer reviewed and published. Alternative indicators derived from the web may be timelier because much of the web and all of the social web allows instant publishing. Web-derived indicators (for a review, see Thelwall and Kousha, 2015a, b; Kousha and Thelwall, 2015) may also reflect wider impacts than citation counts because non-academics widely publish online and may sometimes discuss researchers and research-related issues (Cronin et al., 1998; Priem et al., 2010). The web may also contain evidence of educational uses of research, such as citations in online syllabi (Kousha and Thelwall, 2008). This study explores a variety of alternative indicators, provides guidelines for research funding scheme evaluations and reports a pilot study of the Webometric/altmetric impact of research outputs produced by a sample of researchers supported by the Wellcome Trust.
      Indicators covered include: patent metrics, PowerPoint file mentions, PDF and DOC mentions, web (course) syllabus mentions, web mentions, URL citations, tweet links, and blog citations.
      • F1000 scores: these post-publication peer-review evaluations can be informative indicators of scientific and non-scientific value for biomedical science articles (Li and Thelwall, 2012; Mohammadi and Thelwall, 2013), although only a minority have scores and they may need to be bought from F1000. Since 90 per cent of these appear within half a year of an article being published (Waltman and Costas, 2014), they have a substantial time advantage over citations.
      • Online clinical guideline citations: citations from online clinical guidelines are direct evidence of the health benefits of medical research and these citations correlate weakly with citation counts.
      • Mendeley readers: the number of registered users of the social reference sharing site Mendeley who bookmark an article correlates highly (about 0.7) with citation counts in many fields and contexts (Li et al., 2012; Thelwall and Wilson, in press), and Mendeley readers seem to be more prevalent than other altmetrics (Zahedi et al., 2014a) except perhaps tweet counts (see also Borrego and Fry, 2012). These readership counts mainly reflect academic impact, although with an element of educational impact and a bias towards younger researchers (Mohammadi et al., 2015, in press; see also Zahedi et al., 2014b). Mendeley readers typically appear about one to two years before citations (Maflahi and Thelwall, in press), making them particularly useful for early impact evaluations.
      In addition to the above, Facebook wall posts, Zotero and CiteULike bookmarks, Reddit posts and LinkedIn citations seem to be too rare for use in evaluations, except perhaps those on a very large scale. There is some evidence of a weak correlation with citation counts for most of these (Costas et al., 2015; Thelwall et al., 2013).
      A generic problem with altmetrics and webometrics is that they are typically easy to manipulate because they are not subject to quality control. They are not suitable for formal evaluations of researchers or research groups (Wouters and Costas, 2012) unless steps are taken to guard against deliberate fraud.
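      The correlation-based validation described above (Mendeley reader counts against citation counts) can be sketched with a small rank-correlation calculation. The per-article counts below are hypothetical, invented purely for illustration; only the Spearman rho computation itself follows the standard definition:

```python
# Spearman rank correlation between Mendeley reader counts and
# citation counts, the kind of check used in altmetrics validation
# studies. The sample data below are hypothetical.

def ranks(values):
    """Assign 1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-article counts
mendeley_readers = [3, 7, 12, 22, 31, 45, 55, 80]
citations        = [5, 8, 1, 2, 20, 60, 15, 30]

rho = spearman(mendeley_readers, citations)
print(f"Spearman rho = {rho:.2f}")  # → Spearman rho = 0.69
```

      Rank correlation is preferred over Pearson here because both reader and citation counts are heavily skewed, and a rank-based measure is robust to that skew.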


