School districts are more connected than ever. The latest Infrastructure Survey report from CoSN shows that over 90% of districts have sufficient broadband. So why isn’t everyone using it to generate measurable outcomes?
How technology can be used in the classroom to help support learning and productivity
How school leaders can calculate the value of their tech investments
The importance of video when it comes to keeping students engaged (hint: video is key)
The most important metrics to consider when collecting data on your technology (it’s ok to start small)
From FYE to ROI to HIP, librarians are seeing new acronyms emerge in their campus administrations’ initiatives. How can today’s academic libraries position themselves to improve student success and retention, using high-impact practices (HIPs) to demonstrate a return-on-investment (ROI)? Many libraries struggle to define and implement their services in a way that meets these shifting expectations.
Wednesday, June 13, 2018 2:00 PM Eastern | 1:00 PM Central | 12:00 PM Mountain | 11:00 AM Pacific
Librarians in universities, colleges, and community colleges can establish, assess, and link academic library outcomes to institutional outcomes related to the following areas: student enrollment, student retention and graduation rates, student success, student achievement, student learning, student engagement, faculty research productivity, faculty teaching, service, and overarching institutional quality.
Assessment management systems help higher education educators, including librarians, manage their outcomes, record and maintain data on each outcome, facilitate connections to similar outcomes throughout an institution, and generate reports.
Assessment management systems are helpful for documenting progress toward strategic/organizational goals, but their real strength lies in managing learning outcomes. To determine the impact of library interactions on users, libraries can collect data on how individual users engage with library resources and services.
Increase library impact on student enrollment.
p. 13-14 Improved student retention and graduation rates. High-impact practices include: first-year seminars and experiences, common intellectual experiences, learning communities, writing-intensive courses, collaborative assignments and projects, undergraduate research, diversity/global learning, service learning/community-based learning, internships, capstone courses and projects.
Libraries support students’ ability to do well in internships, secure job placements, earn salaries, gain acceptance to graduate/professional schools, and obtain marketable skills.
Librarians can investigate correlations between student library interactions and their GPAs, as well as conduct test item audits of major professional/educational tests to determine correlations between library services or resources and specific test items.
p. 15 Review course content, readings, reserves, and assignments.
Track and increase library contributions to faculty research productivity.
Continue to investigate library impact on faculty grant proposals and funding, a means of generating institutional income. Librarians contribute to faculty grant proposals in a number of ways.
Demonstrate and improve library support of faculty teaching.
p. 20 Internal focus: ROI – library value = perceived benefits / perceived costs
Production of a commodity – value = quantity of commodity produced × price per unit of commodity
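These two internal-focus definitions reduce to simple arithmetic. A minimal sketch; all of the benefit, cost, quantity, and price figures below are hypothetical, for illustration only:

```python
def roi_value(perceived_benefits: float, perceived_costs: float) -> float:
    """Internal-focus ROI: library value = perceived benefits / perceived costs."""
    return perceived_benefits / perceived_costs

def commodity_value(quantity: float, price_per_unit: float) -> float:
    """Production view: value = quantity of commodity produced x price per unit."""
    return quantity * price_per_unit

# Hypothetical figures: $450k in perceived benefits against a $300k budget,
# and 12,000 "units" of service delivered at an imputed $25 each.
print(roi_value(perceived_benefits=450_000, perceived_costs=300_000))  # 1.5
print(commodity_value(quantity=12_000, price_per_unit=25.0))           # 300000.0
```

The hard part in practice is not the division but deciding what counts as a "perceived benefit" or a "unit of commodity" for a library.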
p. 21 External focus
A fourth definition of value focuses on library impact on users. It asks, “What is the library trying to achieve? How can librarians tell if they have made a difference?” In universities, colleges, and community colleges, libraries impact learning, teaching, research, and service. A main method for measuring impact is to “observe what the [users] are actually doing and what they are producing as a result.”
A fifth definition of value is based on user perceptions of the library in relation to competing alternatives. A related definition is “desired value” or “what a [user] wants to have happen when interacting with a [library] and/or using a [library’s] product or service” (Flint, Woodruff and Fisher Gardial 2002) . Both “impact” and “competing alternatives” approaches to value require libraries to gain new understanding of their users’ goals as well as the results of their interactions with academic libraries.
p. 23 Increasingly, academic library value is linked to service, rather than products. Because information products are generally produced outside of libraries, library value is increasingly invested in service aspects and librarian expertise.
Service delivery supported by librarian expertise is an important library value.
p. 25 methodology based only on literature? weak!
p. 26 review and analysis of the literature: language and literature are old (e.g. educational administrators vs ed leaders).
Government often sees higher education as unresponsive to these economic demands. Other stakeholder groups (students, parents, communities, employers, and graduate/professional schools) expect higher education to make impacts in ways that are not primarily financial.
Because institutional missions vary (Keeling, et al. 2008, 86; Fraser, McClure and Leahy 2002, 512), the methods by which academic libraries contribute value vary as well. Consequently, each academic library must determine the unique ways in which it contributes to the mission of its institution and use that information to guide planning and decision making (Hernon and Altman, Assessing Service Quality 1998, 31). For example, the University of Minnesota Libraries has rewritten its mission and vision to increase alignment with its overarching institution’s goals and emphasis on strategic engagement (Lougee 2009). Allow institutional missions to guide library assessment.
Assessment vs. Research
In community colleges, colleges, and universities, assessment is about defining the purpose of higher education and determining the nature of quality (Astin 1987).
Academic libraries serve a number of purposes, often to the point of being
Assessment “strives to know…what is” and then uses that information to change the status quo (Keeling, et al. 2008, 28); in contrast, research is designed to test hypotheses. Assessment focuses on observations of change; research is concerned with the degree of correlation or causation among variables (Keeling, et al. 2008, 35). Assessment “virtually always occurs in a political context,” while research “attempts to be apolitical” (Upcraft and Schuh 2002, 19).
p. 31 Assessment seeks to document observations, but research seeks to prove or disprove ideas. Assessors have to complete assessment projects, even when there are significant design flaws (e.g., resource limitations, time limitations, organizational contexts, design limitations, or political contexts), whereas researchers can start over (Upcraft and Schuh 2002, 19). Assessors cannot always attain “perfect” studies, but must make do with “good enough” (Upcraft and Schuh 2002, 18). Of course, assessments should be well planned, be based on clear outcomes (Gorman 2009, 9-10), and use appropriate methods (Keeling, et al. 2008, 39); but assessors “must be comfortable with saying ‘after’ as well as ‘as a result of’…experiences” (Keeling, et al. 2008, 35).
Two multiple-measure approaches are most significant for library assessment: 1) triangulation, “where multiple methods are used to find areas of convergence of data from different methods with an aim of overcoming the biases or limitations of data gathered from any one particular method” (Keeling, et al. 2008, 53), and 2) complementary mixed methods, which “seek to use data from multiple methods to build upon each other by clarifying, enhancing, or illuminating findings between or among methods” (Keeling, et al. 2008, 53).
p. 34 Academic libraries can help higher education institutions retain and graduate students, a keystone part of institutional missions (Mezick 2007, 561), but the challenge lies in determining how libraries can contribute and then documenting their contribution.
p. 35. Student Engagement: In recent years, academic libraries have been transformed to provide “technology and content ubiquity” as well as individualized support. My Note: I read “technology and content ubiquity” as digital literacy / metaliteracies, where basic technology instructional sessions (everything that IMS has offered for years) are included, but this library still clings to information literacy only.
In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194) . This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007) . In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194) .
p. 38. For librarians, the main content area of student learning is information literacy; however, they are not alone in their interest in student information literacy skills (Oakleaf, Are They Learning? 2011). My note: Yep, it was. 20 years ago. Metaliteracies is now.
p. 41 surrogates for student learning in Table 3.
p. 42 strategic planning for learning:
According to Kantor, the university library “exists to benefit the students of the educational institution as individuals” (Library as an Information Utility 1976, 101). In contrast, academic libraries tend to assess learning outcomes using groups of students.
p. 45 Assessment Management Systems
Each assessment management system has a slightly different set of capabilities. Some guide outcomes creation, some develop rubrics, and some score student work or support student portfolios. All manage, maintain, and report assessment data.
p. 46 faculty teaching
However, as online collections grow and discovery tools evolve, that role has become less critical (Schonfeld and Housewright 2010; Housewright and Schonfeld, Ithaka’s 2006 Studies of Key Stakeholders 2008, 256). Now, libraries serve as research consultants, project managers, technical support professionals, purchasers, and archivists (Housewright, Themes of Change 2009, 256; Case 2008).
Librarians can count citations of faculty publications (Dominguez 2005)
Tenopir, C. (2012). Beyond usage: measuring library outcomes and value. Library Management, 33(1/2), 5-13.
Three main categories of methods can be used to measure the value of library products and services (Oakleaf, 2010; Tenopir and King, 2007):
Implicit value. Measuring usage through downloads or usage logs provides an implicit measure of value: it is assumed that because libraries are used, they are of value to the users. Usage of e-resources is relatively easy to measure on an ongoing basis and is especially useful in collection development decisions and in comparing specific journal titles or use across subject disciplines.
Usage measures, however, do not show purpose, satisfaction, or outcomes of use (or whether what is downloaded is actually read).
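Implicit value measurement is essentially log counting. A sketch, assuming a plain list of download events with one journal title each (real data would come in a vendor format such as COUNTER reports, not this made-up list):

```python
from collections import Counter

# Hypothetical download log: one journal title per download event
downloads = [
    "Journal of Academic Librarianship",
    "Library Management",
    "Journal of Academic Librarianship",
    "College & Research Libraries",
    "Library Management",
    "Journal of Academic Librarianship",
]

# Usage per title supports collection-development comparisons,
# but says nothing about purpose, satisfaction, or outcomes of use.
usage = Counter(downloads)
for title, count in usage.most_common():
    print(title, count)
```

Ranking titles this way is exactly the comparison across journal titles or disciplines that the text describes, and exactly as limited: a download is not evidence of reading.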
Explicit methods of measuring value include qualitative interview techniques that ask faculty members, students, or others specifically about the value or outcomes attributed to their use of the library collections or services, and surveys or interviews that focus on a specific (critical) incident of use.
Derived values, such as Return on Investment (ROI), use multiple types of data collected on both the returns (benefits) and the library and user costs (investment) to explain value in monetary terms.
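A derived-value calculation combines returns with both library and user costs. A minimal sketch; every dollar figure below is hypothetical, and real ROI studies source these numbers far more carefully:

```python
def monetary_roi(returns: float, library_costs: float, user_costs: float) -> float:
    """Derived value: ROI = total returns / total investment (library + user costs)."""
    return returns / (library_costs + user_costs)

# Hypothetical: $1.2M in returns (e.g., grant income attributable to
# library-supported proposals), a $400k library cost share, and $100k
# of monetized user time invested in using library services.
roi = monetary_roi(returns=1_200_000, library_costs=400_000, user_costs=100_000)
print(f"${roi:.2f} returned per $1 invested")  # $2.40 returned per $1 invested
```

Note that including user costs in the denominator, as the text specifies, makes the ratio less flattering but more honest than a library-costs-only ROI.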
A vlog is simply a blog in video form. In a vlog, you can share anything you might share in a blog post, such as a tutorial or a story from your life.
Consistency is best for vlogging. If you post a vlog here and there, you won’t gain much traction.
The purpose of a vlog is to help people discover you. Videos that may be suitable for YouTube but that don’t help people discover you, such as a product commercial or an introduction to your company, don’t make great vlog posts. To be discovered, think of the users who are searching for a concern, a specialty, or the answer to a question. Think about what a potential customer or audience member might want to know, create a video about the topic, and upload it to YouTube.
What It Takes to Vlog
Develop a strong message before you begin your video.
The camera is a vehicle delivering your message to people. When you talk to viewers the way you talk to another person, you do much better on camera.
ROI on Vlogging
To determine the return on investment for vlogging, you need to focus on your goals. Don’t worry about vanity metrics such as followers, likes, and subscribers. Instead, measure what actually matters for your goals. For example, if your goal is to get clients, consider how many clients you need to acquire to make the hours you put into vlogging worthwhile.
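The client-acquisition goal can be turned into a simple break-even check. A sketch with made-up numbers (the hours, hourly rate, and client value are all hypothetical):

```python
import math

def breakeven_clients(hours_per_month: float, hourly_rate: float,
                      value_per_client: float) -> int:
    """Clients per month needed to make the vlogging hours worthwhile."""
    cost = hours_per_month * hourly_rate  # opportunity cost of your time
    return math.ceil(cost / value_per_client)

# Hypothetical: 10 hours/month of vlogging time valued at $50/hour,
# with each new client worth $400.
print(breakeven_clients(10, 50, 400))  # 2 -> two clients/month to break even
```

Anything above the break-even count is positive ROI against your stated goal, which is the point: the threshold comes from your goals, not from follower counts.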
Goals and milestones are important for determining your ROI.
Consistency is another element for raising your channel’s profile on YouTube. If you post a video only here and there, you don’t consistently bring traffic and grow.
Social media has a strong return on investment (ROI) – how to measure it:
Social media data is the information collected from social networks that shows how users share, view, or engage with your content or profiles. These numbers, percentages, and statistics provide better insights into your social media strategy.
Use social media analytics to make sense of the raw information.
Think of social media data as the ingredients for your meal and the analysis as your recipe. Without the recipe, you wouldn’t know what to make or how to cook it.
Some of the raw social media data can include:
Key performance indicators (KPIs) are the various business metrics used to measure and analyze certain aspects of your business. Social media KPIs are the metrics that likely factor into your social media ROI.
If you have a Facebook business page, you can analyze some KPIs within the social network. The most essential Facebook metrics are covered in the entire article.
Here are the top Twitter metrics:
Engagement Rate: Total link clicks, Retweets, favorites and replies on your Tweet divided by total impressions.
Followers: Total number of Twitter followers.
Link Clicks: Total number of URL and hashtag links clicked.
Mentions: How many times your @username was mentioned by others.
Profile Visits: Total Twitter profile visits.
Replies: How many times people replied to your Tweets.
Retweets: Total Retweets received by others.
Tweet Impressions: Total number of times your Tweet has been viewed, whether it was clicked or not.
Tweets: How many Tweets you’ve posted.
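The engagement-rate definition above (link clicks, Retweets, favorites, and replies divided by impressions) is straightforward to compute once you export the raw counts. A sketch with hypothetical monthly totals:

```python
def engagement_rate(link_clicks: int, retweets: int, favorites: int,
                    replies: int, impressions: int) -> float:
    """Engagement rate = (clicks + Retweets + favorites + replies) / impressions."""
    if impressions == 0:
        return 0.0  # no impressions -> no measurable engagement
    return (link_clicks + retweets + favorites + replies) / impressions

# Hypothetical monthly totals exported from your analytics dashboard
rate = engagement_rate(link_clicks=120, retweets=45, favorites=230,
                       replies=25, impressions=10_000)
print(f"{rate:.1%}")  # 4.2%
```

Expressing engagement as a rate rather than a raw count is what makes posts with very different reach comparable.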
Here are the top LinkedIn metrics:
Clicks: Total clicks on a post, company name or logo.
Engagement: Total interactions divided by number of impressions.
Followers: Total number of new followers through a sponsored update.
Impressions: Total times your update was visible to other users.
Interactions: Total number of likes, comments, and shares.
Here are the top website metrics:
Average Session Duration: Average time users spend on your site per session.
Bounce Rate: Percentage of users leaving your site after one page view.
New Users: Total number of new users coming to your site for the first time.
Pages / Session: Average number of pages a user views each session.
Pageviews: Number of pages loaded or reloaded in a browser.
Sessions: Total times when users are active on your site.
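Several of these site metrics derive from per-session page counts. A sketch, assuming nothing more than a made-up list of how many pages were viewed in each session:

```python
# Hypothetical: number of pages viewed in each of seven sessions
pages_viewed = [1, 4, 2, 1, 6, 3, 1]

sessions = len(pages_viewed)                       # Sessions
pageviews = sum(pages_viewed)                      # Pageviews
bounces = sum(1 for p in pages_viewed if p == 1)   # single-page sessions

bounce_rate = bounces / sessions          # Bounce Rate: share of one-page visits
pages_per_session = pageviews / sessions  # Pages / Session

print(f"Bounce rate: {bounce_rate:.0%}")          # 43%
print(f"Pages/session: {pages_per_session:.1f}")  # 2.6
```

Seeing the metrics computed side by side makes their relationships obvious: a high bounce rate necessarily drags pages/session toward 1.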
With all of this data available, you need to decipher what’s most important.
If you want to track audience growth on Facebook, consider engagement rates, new followers, post reach, and organic Likes.
For example, if you launched a social media campaign, track data that highlights your ROI. According to Mashable, your ROI cycle for a social media campaign should be set up in three stages:
41% of companies and agencies have no clue about their social media financial impact. It’s nearly impossible to figure out your data overnight; instead, it takes months of tracking to ensure your future business decisions are valuable.
While measuring social media ROI can be tricky, especially since each company has different goals in mind with their campaigns, here are the key metrics that social media marketers should keep in mind: