
altmetrics library Lily Troia

Taking Altmetrics to the Next Level in Your Library’s Systems and Services

Instructor: Lily Troia, Engagement Manager, Altmetric
October 31, 2017, 1:00 pm – 2:30 pm Central time

Register here, courses are listed by date

This 90-minute webinar will bring participants up to speed on the current state of altmetrics and focus in on changes across the scholarly ecosystem. Through sharing of use cases, tips, and open discussion, this session will help participants develop a nuanced, strategic framework for incorporating and promoting wider adoption of altmetrics throughout the research lifecycle at their institution and beyond.

++++++++++++++

https://www.force11.org/sites/default/files/d7/presentation/1/fsci_mt9_altmetrics_day1.pdf

Definition by the National Information Standards Organization (NISO, http://www.niso.org/home/): “Altmetrics is a broad term that encapsulates the digital collection, creation, and use of multiple forms of assessment that are derived from activity and engagement among diverse stakeholders and scholarly outputs in the research ecosystem.”

“Altmetrics are data that help us understand how often and by whom research objects are discussed, shared, and used on the social Web.”

PlumX Metrics – Plum Analytics

Altmetric Explorer

https://www.altmetric.com/login.php

How are researchers & institutions using Altmetric?

  • Research and evaluation services – Identify & track influential research; assess impact & reach
  • Grants and reporting – Target new grants & grantees; demonstrate value to stakeholders
  • Communications and reputation management – Track press/social media; connect to opinion leaders
  • Marketing and promotion – Highlight vital findings; benchmark campaigns and outreach
  • Collaboration and partnerships – Discover disciplinary intersections & collaborative opportunities

DISCOVERY • Find trending research • Unearth conversations among new audiences • Locate collaborators & research opportunities • Identify key opinion leaders • Uncover disciplinary intersections

SHOWCASING • Identifying research to share • Share top mentions • Impact on public policy • Real-time tracking • Identifying key researchers • Recognizing early-career researchers

REPORTING • Grant applications • Funder reporting • Impact requirements • Reputation management • Benchmarking and KPIs (Key performance indicators) • Recruitment & review • Integration into researcher profiles/repositories

++++++++++++

https://www.force11.org/sites/default/files/d7/presentation/1/fsci_mt9_altmetrics_day_2.pdf

https://www.force11.org/sites/default/files/d7/presentation/1/fsci_mt9_altmetrics_fridaysummary.pptx

++++++++++++

+++++++++++++
more on altmetrics in the library in this IMS blog
https://blog.stcloudstate.edu/ims?s=altmetrics+library

citations bibliometrics

Bertin, M., Atanassova, I., Gingras, Y., & Larivière, V. (2016). The Invariant Distribution of References in Scientific Articles. Journal of the Association for Information Science & Technology, 67(1), 164-177. doi:10.1002/asi.23367

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3daph%26AN%3d112228404%26site%3dehost-live%26scope%3dsite

from the viewpoint of bibliometrics, how references are distributed along the structure of scientific papers as well as the age of these cited references

Once the sections of articles are realigned to follow the IMRaD sequence, the position of cited references along the text of articles is invariant across all PLoS journals, with the introduction and discussion accounting for most of the references. It also provides evidence that the age of cited references varies by section, with older references being found in the methods and more recent references in the discussion.

different roles citations have in the scholarly communication process.

+++++++++++++++++++
more on bibliometrics in this IMS blog
http://blog.stcloudstate.edu/ims?s=bibliometrics

bibliometrics altmetrics

International Benchmarks for Academic Library Use of Bibliometrics & Altmetrics, 2016-17

ID: 3807768. Report, August 2016. 115 pages. Primary Research Group.

http://www.researchandmarkets.com/publication/min3qqb/3807768

The report gives detailed data on the use of various bibliometric and altmetric tools such as Google Scholar, Web of Science, SCImago, and Plum Analytics.

20 predominantly research universities in the USA, continental Europe, the UK, Canada and Australia/New Zealand. Among the survey participants are: Carnegie Mellon, Cambridge University, Universitat Politècnica de Catalunya, the University at Albany, the University of Melbourne, Florida State University, the University of Alberta and Victoria University of Wellington.

– 50% of the institutions sampled help their researchers to obtain a Thomson Reuters ResearcherID.

ResearcherID provides a solution to the author ambiguity problem within the scholarly research community. Each member is assigned a unique identifier to enable researchers to manage their publication lists, track their times-cited counts and h-index, identify potential collaborators and avoid author misidentification. In addition, your ResearcherID information integrates with the Web of Science and is ORCID compliant, allowing you to claim and showcase your publications from a single account. Search the registry to find collaborators, review publication lists and explore how research is used around the world!

– Just 5% of those surveyed use Facebook Insights in their altmetrics efforts.


++++++++++++++
more on altmetrics in this IMS blog
http://blog.stcloudstate.edu/ims?s=altmetrics

social media metrics

more on social media and metrics in this IMS blog
http://blog.stcloudstate.edu/ims?s=social+media+metrics

social media and altmetrics

Sugimoto, C. R., Work, S., Larivière, V., & Haustein, S. (2016). Scholarly use of social media and altmetrics: a review of the literature. Retrieved from https://arxiv.org/abs/1608.08112
https://arxiv.org/ftp/arxiv/papers/1608/1608.08112.pdf
One of the central issues associated with altmetrics (short for alternative metrics) is the identification of communities engaging with scholarly content on social media (Haustein, Bowman, & Costas, 2015; Neylon, 2014; Tsou, Bowman, Ghazinejad, & Sugimoto, 2015). It is thus of central importance to understand the uses and users of social media in the context of scholarly communication.
Most identify the following major categories: social networking, social bookmarking, blogging, microblogging, wikis, and media and data sharing (Gu & Widén-Wulff, 2011; Rowlands, Nicholas, Russell, Canty, & Watkinson, 2011; Tenopir et al., 2013). Some also consider conferencing, collaborative authoring, scheduling and meeting tools (Rowlands et al., 2011) or RSS and online documents (Gu & Widén-Wulff, 2011; Tenopir et al., 2013) as social media. The landscape of social media, as well as that of altmetrics, is constantly changing and boundaries with other online platforms and traditional metrics are fuzzy. Many online platforms cannot be easily classified, and more traditional metrics, such as downloads and mentions in policy documents, have been referred to as altmetrics due to data provider policies.
Use of social media platforms by researchers is high—ranging from 75 to 80% in large-scale surveys (Rowlands et al., 2011; Tenopir et al., 2013; Van Eperen & Marincola, 2011)—but rates vary widely by platform: less than 10% of scholars reported using Twitter (Rowlands et al., 2011), while 46% used ResearchGate (Van Noorden, 2014) and more than 55% used YouTube (Tenopir et al., 2013), so it is necessary to discuss the use of various types of social media separately. Furthermore, there is a distinction among types of use, with studies showing higher uses of social media for dissemination, consumption, communication, and promotion (e.g., Arcila-Calderón, Piñuel-Raigada, & Calderín-Cruz, 2013; Van Noorden, 2014), and fewer instances of use for creation (i.e., using social media to construct scholarship) (British Library et al., 2012; Carpenter, Wetheridge, Tanner, & Smith, 2012; Procter et al., 2010b; Tenopir et al., 2013).
Frequently mentioned social platforms in scholarly communication research include research-specific tools such as Mendeley, Zotero, CiteULike, BibSonomy, and Connotea (now defunct) as well as general tools such as Delicious and Digg (Hammond, Hannay, Lund, & Scott, 2005; Hull, Pettifer, & Kell, 2008; Priem & Hemminger, 2010; Reher & Haustein, 2010).
Social data sharing platforms provide an infrastructure to share various types of scholarly objects—including datasets, software code, figures, presentation slides and videos—and for users to interact with these objects (e.g., comment on, favorite, like, and reuse). Platforms such as Figshare and SlideShare disseminate scholars’ various types of research outputs—datasets, figures, infographics, documents, videos, posters, or presentation slides (Enis, 2013)—and display views, likes, and shares by other users (Mas-Bleda et al., 2014). GitHub provides for uploading and storing of software code, which allows users to modify and expand existing code (Dabbish, Stuart, Tsay, & Herbsleb, 2012), which has been shown to lead to enhanced collaboration among developers (Thung, Bissyande, Lo, & Jiang, 2013). As with other social data sharing platforms, usage statistics on the number of views and contributions to a project are provided (Kubilius, 2014). The registry of research data repositories, re3data.org, had indexed more than 1,200 repositories as of May 2015. However, only a few of these repositories (i.e., Figshare, SlideShare and GitHub) include social functionalities and have reached a certain level of participation from scholars (e.g., Begel, Bosch, & Storey, 2013; Kubilius, 2014).
Video provides yet another genre for social interaction and scholarly communication (Kousha, Thelwall, & Abdoli, 2012; Sugimoto & Thelwall, 2013). Of the various video sharing platforms, YouTube, launched in 2005, is by far the most popular.
A study of UK scholars reports that the majority of respondents engaged with video for scholarly communication purposes (Tenopir et al., 2013), yet only 20% have ever created in that genre. Among British PhD students, 17% had used videos and podcasts passively for research, while 8% had actively contributed (British Library et al., 2012).
Blogs began in the mid-1990s and were considered ubiquitous by the mid-2000s (Gillmor, 2006; Hank, 2011; Lenhart & Fox, 2006; Rainie, 2005). Scholarly blogs emerged during this time with their own neologisms (e.g., blogademia, blawgosphere, bloggership) and body of research (Hank, 2011) and were considered to change the exclusive structure of scholarly communication.
Technorati, considered to be one of the largest indexes of blogs, deleted its entire blog directory in 2014. Individual blogs are also subject to abrupt cancellations and deletions, making questionable the degree to which blogging meets the permanence criteria of scholarly communication (Hank, 2011).
ResearchBlogging.org (RB)—“an aggregator of blog posts referencing peer-reviewed research in a structured manner” (Shema, Bar-Ilan, & Thelwall, 2015, p. 3)—was launched in 2007 and has been a fairly stable structure in the scholarly blogging environment. RB both aggregates and—through the use of the RB icon—credentials scholarly blogs (Shema et al., 2015). The informality of the genre (Mewburn & Thomson, 2013) and the ability to circumvent traditional publishing barriers has led advocates to claim that blogging can invert traditional academic power hierarchies (Walker, 2006), allowing people to construct scholarly identities outside of formal institutionalization (Ewins, 2005; Luzón, 2011; Potter, 2012) and democratize the scientific system (Gijón, 2013). Another positive characteristic of blogs is their “inherently social” nature (Walker, 2006, p. 132; see also Kjellberg, 2010; Luzón, 2011). Scholars have noted the potential for “communal scholarship” (Hendrick, 2012) made possible by linking and commenting, calling the platform “a new ‘third place’ for academic discourse” (Halavais, 2006, p. 117). Commenting functionalities were seen as making possible the “shift from public understanding to public engagement with science” (Kouper, 2010, p. 1).
Studies have also provided evidence of high rates of blogging among certain subpopulations: for example, approximately one-third of German university staff (Pscheida et al., 2013) and one-fifth of UK doctoral students use blogs (Carpenter et al., 2012).
Academics are not only producers, but also consumers of blogs: a 2007 survey of medical bloggers found that the large majority (86%) read blogs to find medical news (Kovic et al., 2008).

Mahrt and Puschmann (2014) defined science blogging as “the use of blogs for science communication” (p. 1). It has been similarly likened to a space for public intellectualism (Kirkup, 2010; Walker, 2006) and to a form of activism to combat perceived bias or pseudoscience (Riesch & Mendel, 2014). Yet there remains a tension between science bloggers and science journalists, with many science journalists dismissing the value of science blogs (Colson, 2011).
While there has been anecdotal evidence of the use of blogs in promotion and tenure (e.g., Podgor, 2006), the consensus seems to be that most institutions do not value blogging as highly as publishing in traditional outlets, or consider blogging a measure of service rather than research activity (Hendricks, 2010, para. 30).
Microblogging developed out of a particular blogging practice, wherein bloggers would post small messages or single files on a blog post. Blogs that focused on such “microposts” were then termed “tumblelogs” and were described as “a quick and dirty stream of consciousness” kind of blogging (Kottke, 2005, para. 2).
The most popular microblogs are Twitter (launched in 2006), Tumblr (launched in 2007), FriendFeed (launched in 2007 and available in several languages), Plurk (launched in 2008 and popular in Taiwan), and Sina Weibo (launched in 2009 and popular in China). These platforms allow users to follow other users, search tweets by keywords or hashtags, and link to other media or other tweets.

Conference chatter (backchanneling) is another widely studied area in the realm of scholarly microblogging. Twitter use at conferences is generally carried out by a minority of participants.

Wikis are collaborative content management platforms enabled by web browsers and embedded markup languages.
Wikipedia has been advocated as a replacement for traditional publishing and peer review models (Xiao & Askin, 2012) and pleas have been made to encourage experts to contribute (Rush & Tracy, 2010). Despite this, contribution rates remain low—likely hindered by the lack of explicit authorship in Wikipedia, a cornerstone of the traditional academic reward system (Black, 2008; Butler, 2008; Callaway, 2010; Whitworth & Friedman, 2009). Citations to scholarly documents—another critical component in the reward system—are increasingly being found in Wikipedia entries (Bould et al., 2014; Park, 2011; Rousidis et al., 2013), but are not yet seen as valid impact indicators (Haustein, Peters, Bar-Ilan, et al., 2014).
According to the altmetrics manifesto (Priem et al., 2010, para. 1), altmetrics can serve as filters, which “reflect the broad, rapid impact of scholarship in this burgeoning ecosystem.”
There are also a host of platforms which are being used informally to discuss and rate scholarly material. Reddit, for example, is a general topic platform where users can submit, discuss and rate online content. Historically, mentions of scientific journals on Reddit have been rare (Thelwall, Haustein, et al., 2013). However, several new subreddits—e.g., the science subreddit and Ask Me Anything sessions—have recently been launched, focusing on the discussion of scientific information. Sites like Amazon (Kousha & Thelwall, 2015) and Goodreads (Zuccala, Verleysen, Cornacchia, & Engels, 2015), which allow users to comment on and rate books, have also been mined as potential sources for the compilation of impact indicators.
Libraries provide services to support researchers’ use of social media tools and metrics (Lapinski, Piwowar, & Priem, 2013; Rodgers & Barbrow, 2013; Roemer & Borchardt, 2013). One example is Mendeley Institutional Edition (https://www.elsevier.com/solutions/mendeley/Mendeley-Institutional-Edition), which mines Mendeley documents, annotations, and behavior and provides these data to libraries (Galligan & Dyas-Correia, 2013). Libraries can use them for collection management, in a manner similar to other usage data, such as COUNTER statistics (Galligan & Dyas-Correia, 2013).
Factors affecting social media use include age, academic rank and status, gender, discipline, country, and language.

++++++++++++++++++++++++++
h-index

http://guides.library.cornell.edu/c.php?g=32272&p=203391
https://en.wikipedia.org/wiki/H-index
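The h-index linked above is simple to compute from a list of citation counts: a researcher has index h if h of their papers each have at least h citations. A minimal sketch in Python (the function name and sample counts are illustrative, not taken from any of the tools mentioned here):

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    each have at least h citations."""
    # Sort citation counts from highest to lowest, then find the
    # last rank i at which the i-th paper still has >= i citations.
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations
print(h_index([25, 8, 5, 3, 3]))  # 3
print(h_index([]))                # 0
```

Note that two very different researchers (one with a single highly cited paper, one with many modestly cited papers) can share the same h-index, which is part of why altmetrics are proposed as a complement.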

+++++++++++++
more on altmetrics in this IMS blog:
http://blog.stcloudstate.edu/ims?s=altmetrics

altmetrics in education

Altmetrics: A Practical Guide for Librarians, Researchers and Academics

http://www.alastore.ala.org/detail.aspx?ID=11531&zbrandid=4634&zidType=CH&zid=38109786&zsubscriberId=1026665847&zbdom=http://ala-publishing.informz.net

——————————–

http://altmetrics.org/tools/

https://en.wikipedia.org/wiki/Altmetrics

In scholarly and scientific publishing, altmetrics are non-traditional metrics[2] proposed as an alternative[3] to more traditional citation impact metrics, such as impact factor and h-index.[4] The term altmetrics was proposed in 2010,[1] as a generalization of article level metrics,[5] and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. They are related to Webometrics, which had similar goals but evolved before the social web. Altmetrics did not originally cover citation counts.[6] It also covers other aspects of the impact of a work, such as how many data and knowledge bases refer to it, article views, downloads, or mentions in social media and news media.[7][8]

++++++++++++++++

more on analytics and metrics in education in this IMS blog

https://blog.stcloudstate.edu/ims?s=analytics

http://blog.stcloudstate.edu/ims?s=metrics


games and psychometrics

Could Video Games Measure Skills That Tests Can’t Capture?

http://ww2.kqed.org/mindshift/2014/08/11/could-video-games-measure-skills-that-tests-cant-capture/

applying the mechanics of games to the science of psychometrics — the measurement of the mind.

Scholars like James Paul Gee believe video games actually come much closer to capturing the learning process in action than traditional fill-in-the-bubble tests. My note: Duh...

Schwartz’s theory of assessment focuses on choice. He argues that the ultimate goal of education is to create independent thinkers who make good decisions. And so we need assessments that test how students think, not what they happen to know at a given moment.

more on games and gamification in this IMS blog:

http://blog.stcloudstate.edu/ims/?s=gaming

http://blog.stcloudstate.edu/ims/?s=gamification

Metrics for Social Media Marketing

Metrics to Improve Your Social Media Marketing

http://www.socialmediaexaminer.com/metrics-improve-social-media-marketing/

optimize for sharing, click-throughs, signups or even just visits.

“…marketing strategy is the editorial calendar,” explains Ben Harper in his article, How to Use Data to Improve Your Content Marketing Strategy.

ROI

related articles in this blog:

http://blog.stcloudstate.edu/ims/2014/11/02/roi-of-social-media/

topics for IM260

proposed topics for IM 260 class

  • Media literacy. Differentiated instruction. Media literacy guide.
    Fake news as part of media literacy. Visual literacy as part of media literacy. Media literacy as part of digital citizenship.
  • Web design / web development
    the roles of HTML5, CSS, JavaScript, PHP, Bootstrap, jQuery and other scripting languages. Heat maps and other usability issues; website content strategy.
  • Social media for institutional use. Digital Curation. Social Media algorithms. Etiquette. Ethics. Mastodon
    I hosted a LITA webinar in the fall of 2016 (four weeks); I can accommodate any information from that webinar for the use of the IM students
  • OER and instructional designer’s assistance to book creators.
    I can cover both the “library part” (“free” OER, copyright issues etc) and the support / creative part of an OER book / textbook
  • “Big Data.” Data visualization. Large scale visualization. Text encoding. Analytics, Data mining. Unizin
    I can introduce the students to the large idea of Big Data and its importance in light of the upcoming IoT, but also break down its importance for academia, business, etc. From infographics to heavy-duty visualization (Primo X-Services API. JSON, Flask).
  • NetNeutrality, Digital Darwinism, Internet economy and the role of your professional in such environment
    I can introduce students to the issues, if they are not familiar with them, and/or lead a discussion on a rather controversial topic
  • Digital assessment. Digital Assessment literacy.
    I can introduce students to tools, how to evaluate and select tools and their pedagogical implications
  • Wikipedia
    a hands-on exercise on working with Wikipedia. After the session, students will be able to create Wikipedia entries thus knowing intimately the process of Wikipedia and its information.
  • Effective presentations. Tools, methods, concepts and theories (cognitive load). Presentations in the era of VR, AR and mixed reality.
    I can facilitate a discussion among experts (your students) on selection of tools and their didactically sound use to convey information. I can supplement the discussion with my own findings and conclusions.
  • eConferencing. Tools and methods
    I can facilitate a discussion among your students on selection of tools and comparison. Discussion about their future and their place in an increasingly online learning environment
  • Digital Storytelling. Immersive Storytelling. The Moth. Twine. Transmedia Storytelling
    I am teaching a LIB 490/590 Digital Storytelling class. I can adapt any information from that class to the use of IM students
  • VR, AR, Mixed Reality.
    besides Mark Gill, I can facilitate a discussion, which goes beyond hardware and brands, but expand on the implications for academia and corporate education / world
  • IoT. Arduino, Raspberry PI. Industry 4.0
  • Instructional design. ID2ID
    I can facilitate a discussion based on the Educause suggestions about the profession’s development
  • Microcredentialing in academia and corporate world. Blockchain
  • IT in K12. How to evaluate, prioritize, and select. Obsolete trends in 21st-century schools. K12 mobile learning
  • Podcasting: past, present, future. Beautiful Audio Editor.
    a definition of podcasting and delineation of similar activities; advantages and disadvantages.
  • Digital, Blended (Hybrid), Online teaching and learning: facilitation. Methods and techniques. Proctoring. Online students’ expectations. Faculty support. Asynch. Blended Synchronous Learning Environment
  • Gender, race and age in education. Digital divide. Xennials, Millennials and Gen Z. generational approach to teaching and learning. Young vs old Millennials. Millennial employees.
  • Privacy, [cyber]security, surveillance. K12 cyberincidents. Hackers.
  • Gaming and gamification. Appsmashing. Gradecraft
  • Lecture capture, course capture.
  • Bibliometrics, altmetrics
  • Technology and cheating, academic dishonesty, plagiarism, copyright.

Borgman data

book reviews:
https://bobmorris.biz/big-data-little-data-no-data-a-book-review-by-bob-morris
“The challenge is to make data discoverable, usable, assessable, intelligible, and interpretable, and do so for extended periods of time…To restate the premise of this book, the value of data lies in their use. Unless stakeholders can agree on what to keep and why, and invest in the invisible work necessary to sustain knowledge infrastructures, big data and little data alike will become no data.”
http://www.cjc-online.ca/index.php/journal/article/view/3152/3337
Starting from the premise that data are not natural objects with their own essence, Borgman explores the different values assigned to them, as well as their many variations according to place, time, and the context in which they are collected. It is specifically through six “provocations” that she offers a deep engagement with different aspects of the knowledge industry. These include the reproducibility, sharing, and reuse of data; the transmission and publication of knowledge; the stability of scholarly knowledge, despite its increasing proliferation of forms and modes; the very porosity of the borders between different areas of knowledge; the costs, benefits, risks, and responsibilities related to knowledge infrastructure; and finally, investment in the sustainable acquisition and exploitation of data for scientific research.
beyond the six provocations, there is a larger question concerning the legitimacy, continuity, and durability of all scientific research—hence the urgent need for further reflection, initiated eloquently by Borgman, on the fact that “despite the media hyperbole, having the right data is usually better than having more data”
o Data management (Pages xviii-xix)
o Data definition (4-5 and 18-29)
p. 5 big data and little data are only awkwardly analogous to big science and little science. Modern science, or big science in Derek J. de Solla Price’s sense (https://en.wikipedia.org/wiki/Big_Science), is characterized by international, collaborative efforts and by the invisible colleges of researchers who know each other and who exchange information on a formal and informal basis. Little science is the three hundred years of independent, smaller-scale work to develop theory and method for understanding research problems. Little science is typified by heterogeneous methods, heterogeneous data and by local control and analysis.
p. 8 The Long Tail
a popular way of characterizing the availability and use of data in research areas or in economic sectors. https://en.wikipedia.org/wiki/Long_tail

o Provocations (13-15)
o Digital data collections (21-26)
o Knowledge infrastructures (32-35)
o Open access to research (39-42)
o Open technologies (45-47)
o Metadata (65-70 and 79-80)
o Common resources in astronomy (71-76)
o Ethics (77-79)
o Research Methods and data practices, and, Sensor-networked science and technology (84-85 and 106-113)
o Knowledge infrastructures (94-100)
o COMPLETE survey (102-106)
o Internet surveys (128-143)
o Internet survey (128-143)
o Twitter (130-133, 138-141, and 157-158)
o Pisa Clark/CLAROS project (179-185)
o Collecting Data, Analyzing Data, and Publishing Findings (181-184)
o Buddhist studies (186-200)
o Data citation (241-268)
o Negotiating authorship credit (253-256)
o Personal names (258-261)
o Citation metrics (266-269)
o Access to data (279-283)

++++++++++++++++
more on big data in education in this IMS blog
http://blog.stcloudstate.edu/ims?s=big+data
