Searching for "metrics"

altmetrics library Lily Troia

Taking Altmetrics to the Next Level in Your Library’s Systems and Services

Instructor: Lily Troia, Engagement Manager, Altmetric
October 31, 2017, 1:00 pm – 2:30 pm Central time

Register here; courses are listed by date

This 90-minute webinar will bring participants up to speed on the current state of altmetrics and focus on changes across the scholarly ecosystem. Through shared use cases, tips, and open discussion, this session will help participants develop a nuanced, strategic framework for incorporating and promoting wider adoption of altmetrics throughout the research lifecycle at their institution and beyond.

++++++++++++++

https://www.force11.org/sites/default/files/d7/presentation/1/fsci_mt9_altmetrics_day1.pdf

Definition by the National Information Standards Organization, NISO (http://www.niso.org/home/): “Altmetrics is a broad term that encapsulates the digital collection, creation, and use of multiple forms of assessment that are derived from activity and engagement among diverse stakeholders and scholarly outputs in the research ecosystem.”

“Altmetrics are data that help us understand how often and by whom research objects are discussed, shared, and used on the social Web.”

PlumX Metrics – Plum Analytics

Altmetric Explorer

https://www.altmetric.com/login.php

How are researchers & institutions using Altmetric?

  • Research and evaluation services – Identify & track influential research; assess impact & reach
  • Grants and reporting – Target new grants & grantees; demonstrate value to stakeholders
  • Communications and reputation management – Track press/social media; connect to opinion leaders
  • Marketing and promotion – Highlight vital findings; benchmark campaigns and outreach
  • Collaboration and partnerships – Discover disciplinary intersections & collaborative opportunities

DISCOVERY
  • Find trending research
  • Unearth conversations among new audiences
  • Locate collaborators & research opportunities
  • Identify key opinion leaders
  • Uncover disciplinary intersections

SHOWCASING
  • Identify research to share
  • Share top mentions
  • Impact on public policy
  • Real-time tracking
  • Identify key researchers
  • Recognize early-career researchers

REPORTING
  • Grant applications
  • Funder reporting
  • Impact requirements
  • Reputation management
  • Benchmarking and KPIs (key performance indicators)
  • Recruitment & review
  • Integration into researcher profiles/repositories

++++++++++++

https://www.force11.org/sites/default/files/d7/presentation/1/fsci_mt9_altmetrics_day_2.pdf

https://www.force11.org/sites/default/files/d7/presentation/1/fsci_mt9_altmetrics_fridaysummary.pptx

++++++++++++

+++++++++++++
more on altmetrics in the library in this IMS blog
https://blog.stcloudstate.edu/ims?s=altmetrics+library

citations bibliometrics

Bertin, M., Atanassova, I., Gingras, Y., & Larivière, V. (2016). The Invariant Distribution of References in Scientific Articles. Journal of the Association for Information Science & Technology, 67(1), 164-177. doi:10.1002/asi.23367

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3daph%26AN%3d112228404%26site%3dehost-live%26scope%3dsite

The article examines, from the viewpoint of bibliometrics, how references are distributed along the structure of scientific papers, as well as the age of these cited references.

Once the sections of articles are realigned to follow the IMRaD sequence, the position of cited references along the text of articles is invariant across all PLoS journals, with the introduction and discussion accounting for most of the references. It also provides evidence that the age of cited references varies by section, with older references being found in the methods and more recent references in the discussion.

This reflects the different roles citations have in the scholarly communication process.

+++++++++++++++++++
more on bibliometrics in this IMS blog
http://blog.stcloudstate.edu/ims?s=bibliometrics

bibliometrics altmetrics

International Benchmarks for Academic Library Use of Bibliometrics & Altmetrics, 2016-17

ID: 3807768. Report, August 2016, 115 pages. Primary Research Group.

http://www.researchandmarkets.com/publication/min3qqb/3807768

The report gives detailed data on the use of various bibliometric and altmetric tools such as Google Scholar, Web of Science, Scimago, and Plum Analytics.

The survey covers 20 predominantly research universities in the USA, continental Europe, the UK, Canada, and Australia/New Zealand. Among the survey participants are Carnegie Mellon, Cambridge University, Universitat Politècnica de Catalunya, the University at Albany, the University of Melbourne, Florida State University, the University of Alberta, and Victoria University of Wellington.

– 50% of the institutions sampled help their researchers to obtain a Thomson Reuters ResearcherID.

ResearcherID provides a solution to the author ambiguity problem within the scholarly research community. Each member is assigned a unique identifier to enable researchers to manage their publication lists, track their times-cited counts and h-index, identify potential collaborators, and avoid author misidentification. In addition, your ResearcherID information integrates with the Web of Science and is ORCID compliant, allowing you to claim and showcase your publications from a single account. Search the registry to find collaborators, review publication lists and explore how research is used around the world!

– Just 5% of those surveyed use Facebook Insights in their altmetrics efforts.


++++++++++++++
more on altmetrics in this IMS blog
http://blog.stcloudstate.edu/ims?s=altmetrics

social media metrics

more on social media and metrics in this IMS blog
http://blog.stcloudstate.edu/ims?s=social+media+metrics

social media and altmetrics

Sugimoto, C. R., Work, S., Larivière, V., & Haustein, S. (2016). Scholarly use of social media and altmetrics: a review of the literature. Retrieved from https://arxiv.org/abs/1608.08112
https://arxiv.org/ftp/arxiv/papers/1608/1608.08112.pdf
One of the central issues associated with altmetrics (short for alternative metrics) is the identification of communities engaging with scholarly content on social media (Haustein, Bowman, & Costas, 2015; Neylon, 2014; Tsou, Bowman, Ghazinejad, & Sugimoto, 2015). It is thus of central importance to understand the uses and users of social media in the context of scholarly communication.
Most identify the following major categories: social networking, social bookmarking, blogging, microblogging, wikis, and media and data sharing (Gu & Widén-Wulff, 2011; Rowlands, Nicholas, Russell, Canty, & Watkinson, 2011; Tenopir et al., 2013). Some also consider conferencing, collaborative authoring, scheduling and meeting tools (Rowlands et al., 2011) or RSS and online documents (Gu & Widén-Wulff, 2011; Tenopir et al., 2013) as social media. The landscape of social media, as well as that of altmetrics, is constantly changing, and boundaries with other online platforms and traditional metrics are fuzzy. Many online platforms cannot be easily classified, and more traditional metrics, such as downloads and mentions in policy documents, have been referred to as altmetrics due to data provider policies.
Use of social media platforms by researchers is high, ranging from 75 to 80% in large-scale surveys (Rowlands et al., 2011; Tenopir et al., 2013; Van Eperen & Marincola, 2011). But because less than 10% of scholars reported using Twitter (Rowlands et al., 2011), while 46% used ResearchGate (Van Noorden, 2014) and more than 55% used YouTube (Tenopir et al., 2013), it is necessary to discuss the use of various types of social media separately. Furthermore, there is a distinction among types of use, with studies showing higher uses of social media for dissemination, consumption, communication, and promotion (e.g., Arcila-Calderón, Piñuel-Raigada, & Calderín-Cruz, 2013; Van Noorden, 2014), and fewer instances of use for creation (i.e., using social media to construct scholarship) (British Library et al., 2012; Carpenter, Wetheridge, Tanner, & Smith, 2012; Procter et al., 2010b; Tenopir et al., 2013).
Frequently mentioned social platforms in scholarly communication research include research-specific tools such as Mendeley, Zotero, CiteULike, BibSonomy, and Connotea (now defunct) as well as general tools such as Delicious and Digg (Hammond, Hannay, Lund, & Scott, 2005; Hull, Pettifer, & Kell, 2008; Priem & Hemminger, 2010; Reher & Haustein, 2010).
Social data sharing platforms provide an infrastructure to share various types of scholarly objects (including datasets, software code, figures, presentation slides, and videos) and for users to interact with these objects (e.g., comment on, favorite, like, and reuse). Platforms such as Figshare and SlideShare disseminate scholars' various types of research outputs, such as datasets, figures, infographics, documents, videos, posters, or presentation slides (Enis, 2013), and display views, likes, and shares by other users (Mas-Bleda et al., 2014). GitHub provides for uploading and storing of software code, which allows users to modify and expand existing code (Dabbish, Stuart, Tsay, & Herbsleb, 2012); this has been shown to lead to enhanced collaboration among developers (Thung, Bissyande, Lo, & Jiang, 2013). As with other social data sharing platforms, usage statistics on the number of views and contributions to a project are provided (Kubilius, 2014). The registry of research data repositories, re3data.org, has indexed more than 1,200 repositories as of May 2015. However, only a few of these repositories (i.e., Figshare, SlideShare, and GitHub) include social functionalities and have reached a certain level of participation from scholars (e.g., Begel, Bosch, & Storey, 2013; Kubilius, 2014).
Video provides yet another genre for social interaction and scholarly communication (Kousha, Thelwall, & Abdoli, 2012; Sugimoto & Thelwall, 2013). Of the various video sharing platforms, YouTube, launched in 2005, is by far the most popular.
A study of UK scholars reports that the majority of respondents engaged with video for scholarly communication purposes (Tenopir et al., 2013), yet only 20% have ever created in that genre. Among British PhD students, 17% had used videos and podcasts passively for research, while 8% had actively contributed (British Library et al., 2012).
Blogs began in the mid-1990s and were considered ubiquitous by the mid-2000s (Gillmor, 2006; Hank, 2011; Lenhart & Fox, 2006; Rainie, 2005). Scholarly blogs emerged during this time with their own neologisms (e.g., blogademia, blawgosphere, bloggership) and body of research (Hank, 2011) and were considered to change the exclusive structure of scholarly communication.
Technorati, considered to be one of the largest indexes of blogs, deleted their entire blog directory in 2014. Individual blogs are also subject to abrupt cancellations and deletions, making questionable the degree to which blogging meets the permanence criteria of scholarly communication (Hank, 2011).
ResearchBlogging.org (RB), “an aggregator of blog posts referencing peer-reviewed research in a structured manner” (Shema, Bar-Ilan, & Thelwall, 2015, p. 3), was launched in 2007 and has been a fairly stable structure in the scholarly blogging environment. RB both aggregates and, through the use of the RB icon, credentials scholarly blogs (Shema et al., 2015). The informality of the genre (Mewburn & Thomson, 2013) and the ability to circumvent traditional publishing barriers have led advocates to claim that blogging can invert traditional academic power hierarchies (Walker, 2006), allowing people to construct scholarly identities outside of formal institutionalization (Ewins, 2005; Luzón, 2011; Potter, 2012) and democratize the scientific system (Gijón, 2013). Another positive characteristic of blogs is their “inherently social” nature (Walker, 2006, p. 132; see also Kjellberg, 2010; Luzón, 2011). Scholars have noted the potential for “communal scholarship” (Hendrick, 2012) made possible by linking and commenting, calling the platform “a new ‘third place’ for academic discourse” (Halavais, 2006, p. 117). Commenting functionalities were seen as making possible the “shift from public understanding to public engagement with science” (Kouper, 2010, p. 1).
Studies have also provided evidence of high rates of blogging among certain subpopulations: for example, approximately one-third of German university staff (Pscheida et al., 2013) and one-fifth of UK doctoral students use blogs (Carpenter et al., 2012).
Academics are not only producers but also consumers of blogs: a 2007 survey of medical bloggers found that the large majority (86%) read blogs to find medical news (Kovic et al., 2008).

Mahrt and Puschmann (2014) defined science blogging as “the use of blogs for science communication” (p. 1). It has been similarly likened to a space for public intellectualism (Kirkup, 2010; Walker, 2006) and to a form of activism to combat perceived bias or pseudoscience (Riesch & Mendel, 2014). Yet there remains a tension between science bloggers and science journalists, with many science journalists dismissing the value of science blogs (Colson, 2011).
While there has been anecdotal evidence of the use of blogs in promotion and tenure (e.g., Podgor, 2006), the consensus seems to suggest that most institutions do not value blogging as highly as publishing in traditional outlets, or consider blogging as a measure of service rather than research activity (Hendricks, 2010, para. 30).
Microblogging developed out of a particular blogging practice, wherein bloggers would post small messages or single files on a blog post. Blogs that focused on such “microposts” were then termed “tumblelogs” and were described as “a quick and dirty stream of consciousness” kind of blogging (Kottke, 2005, para. 2).
The most popular microblogs are Twitter (launched in 2006), Tumblr (launched in 2007), FriendFeed (launched in 2007 and available in several languages), Plurk (launched in 2008 and popular in Taiwan), and Sina Weibo (launched in 2009 and popular in China).
Twitter allows users to follow other users, search tweets by keywords or hashtags, and link to other media or other tweets.

Conference chatter (backchanneling) is another widely studied area in the realm of scholarly microblogging. Twitter use at conferences is generally carried out by a minority of participants.

Wikis are collaborative content management platforms enabled by web browsers and embedded markup languages.
Wikipedia has been advocated as a replacement for traditional publishing and peer review models (Xiao & Askin, 2012), and pleas have been made to encourage experts to contribute (Rush & Tracy, 2010). Despite this, contribution rates remain low, likely hindered by the lack of explicit authorship in Wikipedia, a cornerstone of the traditional academic reward system (Black, 2008; Butler, 2008; Callaway, 2010; Whitworth & Friedman, 2009). Citations to scholarly documents, another critical component in the reward system, are increasingly being found in Wikipedia entries (Bould et al., 2014; Park, 2011; Rousidis et al., 2013), but are not yet seen as valid impact indicators (Haustein, Peters, Bar-Ilan, et al., 2014).
According to the altmetrics manifesto (Priem et al., 2010, para. 1), altmetrics can serve as filters, which “reflect the broad, rapid impact of scholarship in this burgeoning ecosystem.”
There are also a host of platforms which are being used informally to discuss and rate scholarly material. Reddit, for example, is a general topic platform where users can submit, discuss, and rate online content. Historically, mentions of scientific journals on Reddit have been rare (Thelwall, Haustein, et al., 2013). However, several new subreddits (e.g., the science subreddit and Ask Me Anything sessions) have recently been launched, focusing on the discussion of scientific information. Sites like Amazon (Kousha & Thelwall, 2015) and Goodreads (Zuccala, Verleysen, Cornacchia, & Engels, 2015), which allow users to comment on and rate books, have also been mined as potential sources for the compilation of impact indicators.
Libraries provide services to support researchers' use of social media tools and metrics (Lapinski, Piwowar, & Priem, 2013; Rodgers & Barbrow, 2013; Roemer & Borchardt, 2013). One example is Mendeley Institutional Edition, https://www.elsevier.com/solutions/mendeley/Mendeley-Institutional-Edition, which mines Mendeley documents, annotations, and behavior and provides these data to libraries (Galligan & Dyas-Correia, 2013). Libraries can use them for collection management, in a manner similar to other usage data, such as COUNTER statistics (Galligan & Dyas-Correia, 2013).
Factors affecting social media use include age, academic rank and status, gender, discipline, country, and language.

++++++++++++++++++++++++++
h-index

http://guides.library.cornell.edu/c.php?g=32272&p=203391
https://en.wikipedia.org/wiki/H-index
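Since the h-index described in the guides above is a simple formula (a researcher has index h when h of their papers each have at least h citations), it can be sketched in a few lines of Python. This is an illustrative sketch, not code from either linked guide:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    # Rank papers from most to least cited; walk down until the
    # citation count falls below the rank.
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have >= 4 citations
print(h_index([25, 8, 5, 3, 3]))  # 3: only three papers have >= 3 citations
```

Note how a single highly cited paper barely moves the index, which is exactly the property that makes the h-index a measure of sustained rather than one-off impact.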

+++++++++++++
more on altmetrics in this IMS blog:
http://blog.stcloudstate.edu/ims?s=altmetrics

altmetrics in education

Altmetrics: A Practical Guide for Librarians, Researchers and Academics

http://www.alastore.ala.org/detail.aspx?ID=11531&zbrandid=4634&zidType=CH&zid=38109786&zsubscriberId=1026665847&zbdom=http://ala-publishing.informz.net

——————————–

http://altmetrics.org/tools/

https://en.wikipedia.org/wiki/Altmetrics

In scholarly and scientific publishing, altmetrics are non-traditional metrics[2] proposed as an alternative[3] to more traditional citation impact metrics, such as impact factor and h-index.[4] The term altmetrics was proposed in 2010,[1] as a generalization of article level metrics,[5] and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. They are related to webometrics, which had similar goals but evolved before the social web. Altmetrics did not originally cover citation counts.[6] The term also covers other aspects of the impact of a work, such as how many data and knowledge bases refer to it, article views, downloads, or mentions in social media and news media.[7][8]
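A toy sketch of how such per-source counts (news stories, blog posts, tweets, policy mentions) might be rolled up into a single attention indicator. The weights below are invented purely for illustration; real providers such as Altmetric use their own weighting schemes:

```python
# Hypothetical per-source weights, for illustration only -- not the
# weighting used by any real altmetrics provider.
WEIGHTS = {"news": 8, "blog": 5, "twitter": 1, "policy": 3}

def attention_score(mentions):
    """Sum weighted mention counts across sources; unlisted sources count 0."""
    return sum(WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

# Two news stories, thirty tweets, and one blog post:
print(attention_score({"news": 2, "twitter": 30, "blog": 1}))  # 51
```

The weighting is the whole game here: it encodes a judgment that, say, a news story signals more attention than a tweet, which is why aggregate scores should always be read alongside the underlying counts.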

++++++++++++++++

more on analytics and metrics in education in this IMS blog

https://blog.stcloudstate.edu/ims?s=analytics

http://blog.stcloudstate.edu/ims?s=metrics


games and psychometrics

Could Video Games Measure Skills That Tests Can’t Capture?

http://ww2.kqed.org/mindshift/2014/08/11/could-video-games-measure-skills-that-tests-cant-capture/

applying the mechanics of games to the science of psychometrics — the measurement of the mind.

Scholars like James Paul Gee believe video games actually come much closer to capturing the learning process in action than traditional fill-in-the-bubble tests. My note: Duh...

Schwartz’s theory of assessment focuses on choice. He argues that the ultimate goal of education is to create independent thinkers who make good decisions. And so we need assessments that test how students think, not what they happen to know at a given moment.

more on games and gamification in this IMS blog:

http://blog.stcloudstate.edu/ims/?s=gaming

http://blog.stcloudstate.edu/ims/?s=gamification

Metrics for Social Media Marketing

Metrics to Improve Your Social Media Marketing

http://www.socialmediaexaminer.com/metrics-improve-social-media-marketing/

optimize for sharing, click-throughs, signups or even just visits.

“… marketing strategy is the editorial calendar,” explains Ben Harper in his article, How to Use Data to Improve Your Content Marketing Strategy.

ROI

related articles in this blog:

http://blog.stcloudstate.edu/ims/2014/11/02/roi-of-social-media/

smartphone detox

Smartphone Detox: How To Power Down In A Wired World

February 12, 2018, 5:03 AM ET

When we hear a ding or little ditty alerting us to a new text, email or Facebook post, cells in our brains likely release dopamine, one of the chemical transmitters in the brain’s reward circuitry, and that dopamine makes us feel pleasure, says David Greenfield, a psychologist and assistant clinical professor of psychiatry at the University of Connecticut.

“It’s a spectrum disorder,” says Dr. Anna Lembke, a psychiatrist at Stanford University, who studies addiction. “There are mild, moderate and extreme forms.” And for many people, there’s no problem at all.

Signs you might be experiencing problematic use, Lembke says, include these:

  • Interacting with the device keeps you up late or otherwise interferes with your sleep.
  • It reduces the time you have to be with friends or family.
  • It interferes with your ability to finish work or homework.
  • It causes you to be rude, even subconsciously. “For instance,” Lembke asks, “are you in the middle of having a conversation with someone and just dropping down and scrolling through your phone?” That’s a bad sign.
  • It’s squelching your creativity. “I think that’s really what people don’t realize with their smartphone usage,” Lembke says. “It can really deprive you of a kind of seamless flow of creative thought that generates from your own brain.”

Consider a digital detox one day a week

Tiffany Shlain, a San Francisco Bay Area filmmaker, and her family power down all their devices every Friday evening, for a 24-hour period.

“It’s something we look forward to each week,” Shlain says. She and her husband, Ken Goldberg, a professor in the field of robotics at the University of California, Berkeley, are very tech savvy.

A recent study of high school students, published in the journal Emotion, found that too much time spent on digital devices is linked to lower self-esteem and a decrease in well-being.

+++++++++++
more on contemplative computing in this IMS blog
http://blog.stcloudstate.edu/ims?s=contemplative+computing

bots, big data and the future

Computational Propaganda: Bots, Targeting And The Future

February 9, 2018, 11:37 AM ET

https://www.npr.org/sections/13.7/2018/02/09/584514805/computational-propaganda-yeah-that-s-a-thing-now

Combine the superfast calculational capacities of Big Compute with the oceans of specific personal information comprising Big Data — and the fertile ground for computational propaganda emerges. That’s how the small AI programs called bots can be unleashed into cyberspace to target and deliver misinformation exactly to the people who will be most vulnerable to it. These messages can be refined over and over again based on how well they perform (again in terms of clicks, likes and so on). Worst of all, all this can be done semiautonomously, allowing the targeted propaganda (like fake news stories or faked images) to spread like viruses through communities most vulnerable to their misinformation.

According to Bolsover and Howard, viewing computational propaganda only from a technical perspective would be a grave mistake. As they explain, seeing it just in terms of variables and algorithms “plays into the hands of those who create it, the platforms that serve it, and the firms that profit from it.”

Computational propaganda is a new thing. People just invented it. And they did so by realizing possibilities emerging from the intersection of new technologies (Big Compute, Big Data) and new behaviors those technologies allowed (social media). But the emphasis on behavior can’t be lost.

People are not machines. We do things for a whole lot of reasons, including emotions of loss, anger, fear and longing. To combat computational propaganda’s potentially dangerous effects on democracy in a digital age, we will need to focus on both its how and its why.

++++++++++++++++
more on big data in this IMS blog
http://blog.stcloudstate.edu/ims?s=big+data

more on bots in this IMS blog
http://blog.stcloudstate.edu/ims?s=bot

more on fake news in this IMS blog
http://blog.stcloudstate.edu/ims?s=fake+news
