altmetrics in education

Altmetrics: A Practical Guide for Librarians, Researchers and Academics

http://www.alastore.ala.org/detail.aspx?ID=11531

——————————

http://altmetrics.org/tools/

https://en.wikipedia.org/wiki/Altmetrics

In scholarly and scientific publishing, altmetrics are non-traditional metrics[2] proposed as an alternative[3] to more traditional citation impact metrics, such as the impact factor and h-index.[4] The term altmetrics was proposed in 2010[1] as a generalization of article-level metrics,[5] and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, and more. They are related to webometrics, which had similar goals but evolved before the social web. Altmetrics originally did not cover citation counts;[6] the term also covers other aspects of a work's impact, such as how many data and knowledge bases refer to it, article views, downloads, and mentions in social media and news media.[7][8]

++++++++++++++++

more on analytics and metrics in education in this IMS blog

https://blog.stcloudstate.edu/ims?s=analytics

http://blog.stcloudstate.edu/ims?s=metrics


3 Comments on altmetrics in education

    • Plamen Miltenoff
      December 15, 2016 at 11:49 pm (9 months ago)

      http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3278109/
      Eysenbach, G. (2011). Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact. Journal of Medical Internet Research, 13(4). doi.org/10.2196/jmir.2012

      Harley, Diane; Acord, Sophia Krzys; Earl-Novell, Sarah; Lawrence, Shannon; & King, C. Judson. (2010). Assessing the Future Landscape of Scholarly Communication: An Exploration of Faculty Values and Needs in Seven Disciplines. UC Berkeley: Center for Studies in Higher Education. Retrieved from: escholarship.org/uc/item/15x7385g

      http://www.chronicle.com/article/Someday-Altmetrics-Will-No/237118
      Someday, Altmetrics Will No Longer Need “Alt” – The Chronicle of Higher Education. (n.d.). Retrieved December 15, 2016, from http://www.chronicle.com/article/Someday-Altmetrics-Will-No/237118

      Beyond Bibliometrics. (n.d.). Retrieved December 15, 2016, from mitpress.mit.edu/books/beyond-bibliometrics

      Roemer, R. C., & Borchardt, R. (2015). Chapter 3. Issues, Controversies, and Opportunities for Altmetrics. Library Technology Reports, 51(5), 20–30.
      journals.ala.org/ltr/article/view/5747

      http://www.ala.org/acrl/sites/ala.org.acrl/files/content/publications/booksanddigitalresources/digital/9780838987568_metrics_OA.pdf
      Meaningful Metrics: A 21st Century Librarian’s Guide to Bibliometrics, Altmetrics, and Research Impact. (n.d.). Retrieved December 15, 2016, from http://www.alastore.ala.org/detail.aspx?ID=11441

      Tattersall, A. (2016). Altmetrics: A Practical Guide for Librarians, Researchers and Academics. Facet Publishing.

      http://www.scit.wlv.ac.uk/~cm1993/papers/MendeleyInScienceAltmetricsPreprint.pdf
      Thelwall, M., & Wilson, P. (2016). Mendeley readership altmetrics for medical articles: An analysis of 45 fields. Journal of the Association for Information Science and Technology, 67(8), 1962–1972. https://doi.org/10.1002/asi.23501
      p. 3: Bibliometric indicators have known biases, however. For example, citation-based impact indicators typically reflect an aspect of academic impact, whereas funding agencies often target wider types of impact.
      A second problem with citation-based indicators is that they impose a delay of several years on evaluations. Citations take time to accrue whilst publications are read by others, who then conduct new research informed by their reading, write up and publish studies, and then wait for them to be peer reviewed and published. Alternative indicators derived from the web may be timelier because much of the web, and all of the social web, allows instant publishing. Web-derived indicators (for a review, see Thelwall and Kousha, 2015a, b; Kousha and Thelwall, 2015) may also reflect wider impacts than citation counts because non-academics widely publish online and may sometimes discuss researchers and research-related issues (Cronin et al., 1998; Priem et al., 2010). The web may also contain evidence of educational uses of research, such as citations in online syllabi (Kousha and Thelwall, 2008). This study explores a variety of alternative indicators, provides guidelines for research funding scheme evaluations, and reports a pilot study of the Webometric/altmetric impact of research outputs produced by a sample of researchers supported by the Wellcome Trust.
      Indicators covered include: patent metrics, PowerPoint file mentions, PDF and DOC mentions, web (course) syllabus mentions, web mentions, URL citations, tweet links, and blog citations.
      • F1000 scores: these post-publication peer-review evaluations can be informative indicators of scientific and non-scientific value for biomedical science articles (Li and Thelwall, 2012; Mohammadi and Thelwall, 2013), although only a minority of articles have scores and they may need to be bought from F1000. Since 90 per cent of these appear within half a year of an article being published (Waltman and Costas, 2014), they have a substantial time advantage over citations.
      • Online clinical guideline citations: citations from online clinical guidelines are direct evidence of the health benefits of medical research, and these citations correlate weakly with citation counts.
      • Mendeley readers: the number of registered users of the social reference sharing site Mendeley that bookmark an article correlates highly (about 0.7) with citation counts in many fields and contexts (Li et al., 2012; Thelwall and Wilson, in press), and seem to be more prevalent than other altmetrics (Zahedi et al., 2014a) except perhaps tweet counts (see also Borrego and Fry, 2012). These readership counts mainly reflect academic impact, although with an element of educational impact and a bias towards younger researchers (Mohammadi et al., 2015, in press; see also Zahedi et al., 2014b). Mendeley readers typically appear about one to two years before citations (Maflahi and Thelwall, in press), making them particularly useful for early impact evaluations.
      In addition to the above, Facebook wall posts, Zotero and CiteULike bookmarks, and Reddit and LinkedIn citations seem to be too rare for use in evaluations, except perhaps on a very large scale. There is some evidence of a weak correlation with citation counts for most of these (Costas et al., 2015; Thelwall et al., 2013).
      A generic problem with altmetrics and webometrics is that they are typically easy to manipulate because they are not subject to quality control. They are not suitable for formal evaluations of researchers or research groups (Wouters and Costas, 2012) unless steps are taken to guard against deliberate fraud.
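      The correlation figures quoted above (e.g. about 0.7 between Mendeley readers and citation counts) are rank correlations between per-article indicator counts. As a minimal sketch of how such a figure is computed, here is a self-contained Spearman rank correlation in Python; the per-article counts are made-up illustrative numbers, not data from Thelwall & Wilson (2016) or any other cited study.

```python
# Sketch: Spearman rank correlation between an altmetric indicator and
# citation counts. All counts below are hypothetical illustrative data.
from statistics import mean

def spearman_rho(xs, ys):
    """Spearman's rho: Pearson correlation computed on average ranks."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            # group tied values and give each the average rank (1-based)
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(xs), ranks(ys)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical per-article counts for eight articles
mendeley_readers = [12, 45, 3, 88, 20, 7, 55, 31]
citations        = [5, 30, 1, 60, 2, 12, 18, 40]

print(round(spearman_rho(mendeley_readers, citations), 2))  # → 0.81
```

      A high rho here only means the two counts rank articles similarly; as the quoted passage notes, that still leaves open whether the indicator reflects academic impact, educational use, or something else.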

      Reply
  1. Abhinav
    August 13, 2017 at 1:02 pm (1 month ago)

    Just sharing: I’ve found this interesting piece about educational blogging. Check it out!

    Reply
