THE VALUE OF ACADEMIC LIBRARIES
A Comprehensive Research Review and Report. Megan Oakleaf
Librarians in universities, colleges, and community colleges can establish, assess, and link
academic library outcomes to institutional outcomes related to the following areas:
student enrollment, student retention and graduation rates, student success, student
achievement, student learning, student engagement, faculty research productivity,
faculty teaching, service, and overarching institutional quality.
Assessment management systems help higher education educators, including librarians, manage their outcomes, record and maintain data on each outcome, facilitate connections to
similar outcomes throughout an institution, and generate reports.
Assessment management systems are helpful for documenting progress toward strategic/organizational goals, but their real strength lies in managing learning outcomes. To determine the impact of library interactions on users, libraries can collect data on how individual users engage with library resources and services.
Increase library impact on student enrollment.
p. 13-14 Improved student retention and graduation rates. High-impact practices include: first-year seminars and experiences, common intellectual experiences, learning communities, writing-intensive courses, collaborative assignments and projects, undergraduate research, diversity/global learning, service learning/community-based learning, internships, capstone courses and projects.
Libraries support students’ ability to do well in internships, secure job placements, earn salaries, gain acceptance to graduate/professional schools, and obtain marketable skills.
Librarians can investigate correlations between student library interactions and their GPAs, as well as conduct test item audits of major professional/educational tests to determine correlations between library services or resources and specific test items.
p. 15 Review course content, readings, reserves, and assignments.
Track and increase library contributions to faculty research productivity.
Continue to investigate library impact on faculty grant proposals and funding, a means of generating institutional income. Librarians contribute to faculty grant proposals in a number of ways.
Demonstrate and improve library support of faculty teaching.
p. 20 Internal Focus:
- ROI: library value = perceived benefits / perceived costs
- production of a commodity: value = quantity of commodity produced × price per unit of commodity
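The two formulas above can be made concrete with invented figures; none of these numbers come from the report:

```python
# Invented figures for the two internal-focus formulas; none of these
# numbers come from the report.
perceived_benefits = 120_000.0  # estimated dollar value of services used
perceived_costs = 80_000.0      # library budget attributed to those services

roi = perceived_benefits / perceived_costs  # library value as a ratio
print(f"ROI-style value: {roi:.2f}")        # > 1.0 suggests net benefit

quantity_produced = 1_500  # e.g., documents delivered
price_per_unit = 30.0      # e.g., market price per document
commodity_value = quantity_produced * price_per_unit
print(f"Commodity-production value: ${commodity_value:,.0f}")
```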
p. 21 External focus
A fourth definition of value focuses on library impact on users. It asks, “What is the library trying to achieve? How can librarians tell if they have made a difference?” In universities, colleges, and community colleges, libraries impact learning, teaching, research, and service. A main method for measuring impact is to “observe what the [users] are actually doing and what they are producing as a result.”
A fifth definition of value is based on user perceptions of the library in relation to competing alternatives. A related definition is “desired value,” or “what a [user] wants to have happen when interacting with a [library] and/or using a [library’s] product or service” (Flint, Woodruff and Fisher Gardial 2002). Both “impact” and “competing alternatives” approaches to value require libraries to gain new understanding of their users’ goals as well as the results of their interactions with academic libraries.
p. 23 Increasingly, academic library value is linked to service, rather than products. Because information products are generally produced outside of libraries, library value is increasingly invested in service aspects and librarian expertise.
Service delivery supported by librarian expertise is an important library value.
p. 25 methodology based only on literature? weak!
p. 26 review and analysis of the literature: language and literature are old (e.g. educational administrators vs ed leaders).
Government often sees higher education as unresponsive to these economic demands. Other stakeholder groups—students, parents, communities, employers, and graduate/professional schools—expect higher education to make impacts in ways that are not primarily financial.
Because institutional missions vary (Keeling, et al. 2008, 86; Fraser, McClure and
Leahy 2002, 512), the methods by which academic libraries contribute value vary as
well. Consequently, each academic library must determine the unique ways in which it contributes to the mission of its institution and use that information to guide planning and decision making (Hernon and Altman, Assessing Service Quality 1998, 31). For example, the University of Minnesota Libraries has rewritten its mission and vision to increase alignment with its overarching institution’s goals and emphasis on strategic engagement (Lougee 2009). Allow institutional missions to guide library assessment.
Assessment vs. Research
In community colleges, colleges, and universities, assessment is about defining the purpose of higher education and determining the nature of quality (Astin 1987).
Academic libraries serve a number of purposes, often to the point of being…
Assessment “strives to know…what is” and then uses that information to change the
status quo (Keeling, et al. 2008, 28); in contrast, research is designed to test
hypotheses. Assessment focuses on observations of change; research is concerned with the degree of correlation or causation among variables (Keeling, et al. 2008, 35). Assessment “virtually always occurs in a political context,” while research “attempts to be apolitical” (Upcraft and Schuh 2002, 19).
p. 31 Assessment seeks to document observations, but research seeks to prove or disprove ideas. Assessors have to complete assessment projects, even when there are significant design flaws (e.g., resource limitations, time limitations, organizational contexts, design limitations, or political contexts), whereas researchers can start over (Upcraft and Schuh 2002, 19). Assessors cannot always attain “perfect” studies, but must make do with “good enough” (Upcraft and Schuh 2002, 18). Of course, assessments should be well planned, be based on clear outcomes (Gorman 2009, 9-10), and use appropriate methods (Keeling, et al. 2008, 39); but assessors “must be comfortable with saying ‘after’ as well as ‘as a result of’…experiences” (Keeling, et al. 2008, 35).
Two multiple-measure approaches are most significant for library assessment: 1) triangulation, “where multiple methods are used to find areas of convergence of data from different methods with an aim of overcoming the biases or limitations of data gathered from any one particular method” (Keeling, et al. 2008, 53), and 2) complementary mixed methods, which “seek to use data from multiple methods to build upon each other by clarifying, enhancing, or illuminating findings between or among methods” (Keeling, et al. 2008, 53).
p. 34 Academic libraries can help higher education institutions retain and graduate students, a keystone part of institutional missions (Mezick 2007, 561), but the challenge lies in determining how libraries can contribute and then document their contribution.
p. 35 Student Engagement: In recent years, academic libraries have been transformed to provide “technology and content ubiquity” as well as individualized support.
My Note: I read the “technology and content ubiquity” as digital literacy / metaliteracies, where basic technology instruction sessions (everything that IMS has offered for years) are included, but this library still clings to information literacy only.
p. 37 Student Learning
In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194). This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007). In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194).
p. 38 For librarians, the main content area of student learning is information literacy; however, they are not alone in their interest in student information literacy skills (Oakleaf, Are They Learning? 2011).
My note: Yep. it was. 20 years ago. Metaliteracies is now.
p. 41 surrogates for student learning in Table 3.
p. 42 strategic planning for learning:
According to Kantor, the university library “exists to benefit the students of the educational institution as individuals” (Library as an Information Utility 1976, 101). In contrast, academic libraries tend to assess learning outcomes using groups of students.
p. 45 Assessment Management Systems
Each assessment management system has a slightly different set of capabilities. Some guide outcomes creation, some develop rubrics, some score student work, and some support student portfolios. All manage, maintain, and report assessment data.
p. 46 Faculty Teaching
However, as online collections grow and discovery tools evolve, the library’s traditional collection-provider role has become less critical (Schonfeld and Housewright 2010; Housewright and Schonfeld, Ithaka’s 2006 Studies of Key Stakeholders 2008, 256). Now, libraries serve as research consultants, project managers, technical support professionals, purchasers, and archivists (Housewright, Themes of Change 2009, 256; Case 2008).
Librarians can count citations of faculty publications (Dominguez 2005).
Tenopir, C. (2012). Beyond usage: measuring library outcomes and value. Library Management, 33(1/2), 5-13.
Three main categories of methods can be used to measure the value of library products and services (Oakleaf, 2010; Tenopir and King, 2007):
- Implicit value. Measuring usage through downloads or usage logs provides an implicit measure of value: it is assumed that because libraries are used, they are of value to the users. Usage of e-resources is relatively easy to measure on an ongoing basis and is especially useful in collection development decisions and comparisons of specific journal titles or use across subject disciplines. However, usage statistics do not show purpose, satisfaction, or outcomes of use (or whether what is downloaded is actually read).
- Explicit methods of measuring value include qualitative interview techniques that ask faculty members, students, or others specifically about the value or outcomes attributed to their use of the library collections or services and surveys or interviews that focus on a specific (critical) incident of use.
- Derived values, such as Return on Investment (ROI), use multiple types of data collected on both the returns (benefits) and the library and user costs (investment) to explain value in monetary terms.
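As a minimal sketch of the implicit-value approach in the first bullet, usage logs can simply be tallied per journal title; the log entries below are invented:

```python
from collections import Counter

# Hypothetical download log: (user_id, journal_title) pairs. Tallying it
# yields an implicit value measure; note it says nothing about purpose,
# satisfaction, or whether the downloads were actually read.
log = [
    ("u1", "Journal of Ecology"),
    ("u2", "Journal of Ecology"),
    ("u1", "Library Quarterly"),
    ("u3", "Journal of Ecology"),
    ("u2", "Nursing Review"),
]

downloads_per_title = Counter(title for _, title in log)
for title, n in downloads_per_title.most_common():
    print(f"{title}: {n} downloads")
```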
more on ROI in this IMS blog
Teaching Naked: How Moving Technology out of Your College Classroom Will Improve Student Learning
José Antonio Bowen, president, Goucher College
Technology is changing higher education, but the greatest value of a physical university will remain its face-to-face (naked) interaction between faculty and students. Technology has fundamentally changed our relationship to knowledge and this increases the value of critical thinking, but we need to redesign our courses to deliver this value. The most important benefits to using technology occur outside of the classroom. New technology can increase student preparation and engagement between classes and create more time for the in-class dialogue that makes the campus experience worth the extra money it will always cost to deliver. Students already use online content, but need better ways to interact with material before every class. By using online quizzes and games, rethinking our assignments and course design, we can create more class time for the activities and interactions that most spark the critical thinking and change of mental models we seek.
more on online teaching in this IMS blog
Eyal, L. (2012). Digital Assessment Literacy — the Core Role of the Teacher in a Digital Environment. Journal Of Educational Technology & Society, 15(2), 37-49.
Common to all is a view of the level of literacy as a measure of the quality of human capital of a society or a particular area. Literacy develops in interaction with the environment (Vygotsky, 1987).
digital assessment literacy refers to the role of the teacher as an assessor in a technology-rich environment.
Learning Management Systems (LMS) benefits and limitations
Measurement allows quantitative description of a particular characterization of an individual, expressed in numbers.
the combination of assessment and measurement provides a thorough and accurate picture, based upon which practical conclusions can be drawn (Wagner, 1997). A test is a systematic process in which an aspect of student behavior is quantitatively evaluated (Suen & Parkes, 2002).
For several decades this system of assessment has been criticized for a variety of reasons, including the separation between the teaching-learning process and the evaluation process, the relatively low level of thinking required, and the quantitative reporting of results, which does not contribute to students’ progress. In the last decade, the central argument against the tests system is that their predictability is limited to the field and context in which the students are tested, and that they do not predict student problem solving ability, teamwork, good work habits and honesty.
teachers mistakenly believe that repeating lessons will improve students’ achievements.
To evaluate how well the goals were achieved, objective measurement methods are employed (Black, et al., 2004).
Eshet-Alkalai (2004) offered a detailed conceptual framework for the term “digital literacy” that includes: photo-visual thinking; reproduction thinking; branching thinking; information thinking; and socio-emotional thinking.
Eshet-Alkalai, Y. (2004). Digital literacy: A conceptual framework for survival skills in the digital era. Journal of Educational Multimedia and Hypermedia, 13(1), 93–106.
Eshet-Alkalai, Y., & Chajut, E. (2009). Changes Over Time in Digital Literacy. Cyberpsychology & Behavior, 12(6), 713-715. doi:10.1089/cpb.2008.0264
Two major patterns of change over time: (a) closing the gap between younger and older participants in the tasks that emphasize proficiency and technical control, and (b) widening the gap between younger and older participants in tasks that emphasize creativity and critical thinking. Based on the comparison with the matched control groups, we suggest that experience with technology, and not age, accounts for the observed lifelong changes in digital literacy skills.
Eshet-Alkalai, Y., & Soffer, O. (2012). Guest Editorial – Navigating in the Digital Era: Digital Literacy: Socio-Cultural and Educational Aspects. Journal Of Educational Technology & Society, 15(2), 1.
a wide range of technological, cognitive and social competences—collectively termed “Digital Literacy.” Users thus must become “digitally literate” in order to cope effectively with the complex sociological, cognitive and pedagogical challenges these technologies pose. These skills include, for example, the ability to operate computers and navigate the net effectively, to cope with large volumes of information, to evaluate the reliability of information, and to critically assess what seem to be natural (and not ideologically biased) technological tools. In a different way from the spirit of modern print, learners construct and consume knowledge in non-linear environments. They need to learn, collaborate and solve problems effectively in virtual (non face-to-face) learning environments, and to communicate effectively in technology-mediated social participation environments.
It is important to note: digital literacy, then, is not limited simply to computer and Internet operation and orientation. It also relates to a variety of epistemological and ethical issues that arise due to the unique characteristics of digital technologies and that often overlap with trends related to the post-modern and post-structural era. These include questions regarding the authority of knowledge, intellectual property and ownership, copyright, authenticity and plagiarism. Furthermore, issues such as self-representation, virtual group dynamics, and online addiction also arise.
more on digital literacy in this IMS blog
We know that many of you have been interested in exploring Turnitin in the past, so we are excited to bring you an exclusive standardized price and more information on the roll out of Feedback Studio, replacing the Turnitin you have previously seen. We would like to share some exciting accessibility updates, how Feedback Studio can help faculty deliver formative feedback to students and help students become writers. Starting today thru December 31st non-integrated Feedback Studio will be $2.50 and integrated Feedback Studio will be $3 for new customers! Confused by the name? Don’t be! Turnitin is new and improved! Check out this video to learn about Feedback Studio!
Meet your exclusive Turnitin Team!
Ariel Ream – Account Executive, Indianapolis firstname.lastname@example.org – 317.650.2795
Juliessa Rivera – Relationship Manager, Oakland email@example.com – 510.764.7698
Juan Valladares – Account Representative, Oakland
firstname.lastname@example.org – 510.764.7552
To learn more, please join us for a WebEx on September 21st. We will be offering free 30 day pilots to anyone who attends!
Wednesday, September 21, 2016
11:00 am | Central Daylight Time (Chicago) | 1 hr
Meeting number (access code): 632 474 162
my notes from the webinar
I am prejudiced against TI and I am not hiding it; that does not mean that I am wrong.
For me, Turnitin (TI) is an anti-pedagogical “surfer,” using the hype of “technology” to ride the wave of overworked faculty, who hope to streamline an increasing workload with technology instead of working on pedagogical resolutions to issues that are not that new.
Lo and behold, Juan, the TI presenter, is trying to dazzle me with stuff that has not dazzled me in a long time.
WCAG 2.0 AA standards of the W3C and Section 508 of the Rehabilitation Act.
The sales pitch: 79% of students believe in feedback, but only 50%+ receive it. His source is Turnitin surveys from 2012 to 2016 (in a very, very small font size (ashamed of it?)).
It seems to me very much like “massaged” data.
Testimonials: one professor and one student. Ha. The apex of qualitative research…
Next sales pitch: Turnitin Feedback Studio, no longer the old Classic. It assesses originality. Drag-and-drop macro-style notes. Pushing rubrics, but we still fight for rubrics in D2L. If we have a large number of adjuncts. Ha. Another gem: “I know that you are, guys, IT folks.” So the IT folks are the Trojan horse to get the faculty on board. Put comments on…
This presentation is structured dangerously askew: IT people but no faculty. If faculty were present, they would object that they ARE capable of doing the same things that are proposed to be automated.
Moreover, why do I have to pay for another expensive piece of software if we have already paid Microsoft? MS Word can do everything that has been presented so far. Between MS Word and D2L, it becomes redundant.
And why the heck am I interested in middle school and high school?
TI was sued for illegal collection of papers; papers are stored in their database without the consent of the students who wrote them. TI goes to “great lengths to protect the identity of the students,” but still collects their work [illegally?].
November 10 – 30 day free trial
Otherwise, $3 per student, which prompts the question: between Google, MS Word, and D2L (which we already pay for heftily), why pay another exorbitant price?
D2L integration: a version which does not work; LTI.
“Small price to pay for such a beauty.” It does not matter how quick and easy the integration is; it is a redundancy that can already be resolved with existing tools, part of which we are paying a hefty price for.
Quantile Measures for Math Added to Kansas Student Assessments
By Dian Schaffhauser 05/27/16
There are two types of Lexile measures: a person’s reading ability and the text’s difficulty. Students who are tested against state standards receive a Lexile reader measure from the Kansas Reading Assessment. Books and other texts receive a Lexile text measure from a MetaMetrics software tool called the Lexile Analyzer, which describes the book’s reading demand or complexity. When used together, the two measures are intended to help match a reader with reading material that is at an appropriate difficulty or will at least help give an idea of how well a reader should comprehend text. The reader should encounter some level of difficulty with the text, but not enough to get frustrated. The Lexile reader measure is used to monitor reader progress.
My note: Is this another way / attempt to replace humans as educators? Or is it a supplemental approach to improve students’ reading abilities?
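A hedged sketch of how the two Lexile measures might be matched in code. The 100L-below / 50L-above band is the commonly cited MetaMetrics guideline, but treat the band and the book data here as illustrative assumptions:

```python
# Match a Lexile reader measure with texts whose measures fall inside a
# comfort band. The 100L-below / 50L-above band is the commonly cited
# MetaMetrics guideline; the band and book data here are illustrative.
def targeted_texts(reader_measure, texts, below=100, above=50):
    """Return (title, text_measure) pairs suited to the reader."""
    return [
        (title, measure)
        for title, measure in texts
        if reader_measure - below <= measure <= reader_measure + above
    ]

books = [("Book A", 700), ("Book B", 850), ("Book C", 920), ("Book D", 1100)]
print(targeted_texts(870, books))  # Book B and Book C fall within 770-920L
```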
Digital Badges in Education: Trends, Issues, and Cases.
In recent years, digital badging systems have become a credible means through which learners can establish portfolios and articulate knowledge and skills for both academic and professional settings. Digital Badges in Education provides the first comprehensive overview of this emerging tool. A digital badge is an online-based visual representation that uses detailed metadata to signify learners’ specific achievements and credentials in a variety of subjects across K-12 classrooms, higher education, and workplace learning. Focusing on learning design, assessment, and concrete cases in various contexts, this book explores the necessary components of badging systems, their functions and value, and the possible problems they face. These twenty-five chapters illustrate a range of successful applications of digital badges to address a broad spectrum of learning challenges and to help readers formulate solutions during the development of their digital badges learning projects.
Badges and Leaderboards: Professional Developments for Teachers in K12
Why should I bother earning badges?
issues to consider:
More on badges and gaming in education in this IMS blog:
ACRL e-Learning webcast series: Learning Analytics – Strategies for Optimizing Student Data on Your Campus
This three-part webinar series, co-sponsored by the ACRL Value of Academic Libraries Committee, the Student Learning and Information Committee, and the ACRL Instruction Section, will explore the advantages and opportunities of learning analytics as a tool which uses student data to demonstrate library impact and to identify learning weaknesses. How can librarians initiate learning analytics initiatives on their campuses and contribute to existing collaborations? The first webinar will provide an introduction to learning analytics and an overview of important issues. The second will focus on privacy issues and other ethical considerations as well as responsible practice, and the third will include a panel of librarians who are successfully using learning analytics on their campuses.
Webcast One: Learning Analytics and the Academic Library: The State of the Art and the Art of Connecting the Library with Campus Initiatives
March 29, 2016
Learning analytics are used nationwide to augment student success initiatives as well as bolster other institutional priorities. As a key aspect of educational reform and institutional improvement, learning analytics are essential to defining the value of higher education, and academic librarians can be both of great service to and well served by institutional learning analytics teams. In addition, librarians who seek to demonstrate, articulate, and grow the value of academic libraries should become more aware of how they can dovetail their efforts with institutional learning analytics projects. However, all too often, academic librarians are not asked to be part of initial learning analytics teams on their campuses, despite the benefits of library inclusion in these efforts. Librarians can counteract this trend by being conversant in learning analytics goals, advantages/disadvantages, and challenges as well as aware of existing examples of library successes in learning analytics projects.
Learn about the state of the art in learning analytics in higher education with an emphasis on 1) current models, 2) best practices, 3) ethics, privacy, and other difficult issues. The webcast will also focus on current academic library projects and successes in gaining access to and inclusion in learning analytics initiatives on their campus. Benefit from the inclusion of a “short list” of must-read resources as well as a clearly defined list of ways in which librarians can leverage their skills to be both contributing members of learning analytics teams, suitable for use in advocating on their campuses.
Open Academic Analytics Initiative
where data comes from:
- student information systems (SIS)
- Video streaming and web conferencing
- Co-curricular and extra-curricular involvement
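As an illustrative sketch (all field names and records are hypothetical), data from sources like these can be joined by student id before analysis:

```python
# Hypothetical records: a student information system (SIS) table and a
# stream of library-related events (e.g., database searches, streaming).
# All names and fields are invented for illustration.
sis = {
    "s1": {"major": "Biology", "gpa": 3.4},
    "s2": {"major": "History", "gpa": 2.9},
}
events = [
    {"student": "s1", "type": "db_search"},
    {"student": "s1", "type": "video_stream"},
    {"student": "s2", "type": "db_search"},
]

# Join by student id and count events per student.
merged = {}
for e in events:
    rec = merged.setdefault(e["student"], {**sis[e["student"]], "events": 0})
    rec["events"] += 1
print(merged)
```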
D2L degree compass
Predictive Analytics Reporting (PAR) – was open, but just bought by Hobsons (https://www.hobsons.com/)
IMS Caliper Enabled Services: the way to connect the library to campus analytics. https://www.imsglobal.org/activity/caliperram
students’ opinions of this process
benefits: self-assessment, personal learning, empowerment
analytics and data privacy – students are OK with the harvesting of their data (only 6% not happy)
8 in 10 are interested in a personal dashboard that will help them perform
Big Mother vs Big Brother: creepy vs helpful. Tracking classes is helpful; tracking out of class (where on campus, social media, etc.) is creepy. 87% see having access to their data as positive.
recognize: metrics, assessment, analytics, data visualization, data literacy, data science, interpretation
INSTRUCTION DEPARTMENT – N.B.
determine who is the key leader: director of institutional research, president, CIO
who does analytics services: institutional research, information technology, dedicated center
analytic maturity: data-driven decision-making culture; senior leadership commitment; policy supporting data collection, access, and use; data efficacy; investment and resources; staffing; technical infrastructure; information technology interaction
student success maturity: senior leadership commitment; funding of student success efforts; mechanisms for making student success decisions; interdepartmental collaboration; understanding of student success goals; advising and student support ability; policies; information systems
developing learning analytics strategy
understand institutional challenges; identify stakeholders; identify inhibitors/challenges; consider tools; scan the environment and see what others have done; develop a plan; communicate the plan to stakeholders; start small and build
ways librarians can help
identify institutional partners; be the partners; hone relevant learning analytics skills; participate in institutional analytics; identify questions and problems; assess and work to improve institutional culture; volunteer to be early adopters
questions to ask: environmental scanning
do we have a learning analytics system? does our culture support it? are leaders present? what do stakeholders need to know?
questions to ask: Data
questions to ask: Library role
learning analytics & the academic library: the state of the art of connecting the library with campus initiatives
Causation versus correlation studies: the speaker claims that it is difficult to establish a causation argument. Institutions try to predict as accurately as possible via correlation, rather than claiming “if you do this, then that will happen.”
More on analytics in this blog: