
IRDL proposal

Applications for the 2018 Institute will be accepted between December 1, 2017 and January 27, 2018. Scholars accepted to the program will be notified in early March 2018.

Title:

Learning to Harness Big Data in an Academic Library

Abstract (200 words)

Research on Big Data per se, as well as on the importance and organization of the process of Big Data collection and analysis, is well underway. The complexity of the process comprising "Big Data," however, deprives organizations of a ubiquitous "blueprint." The planning, structuring, administration, and execution of the process of adopting Big Data in an organization, be it a corporate or an educational one, remain elusive. No less elusive is the adoption of Big Data practices among libraries themselves. Seeking the commonalities and differences in the adoption of Big Data practices among libraries may be a suitable start to help libraries transition to Big Data and restructure organizational and daily activities based on Big Data decisions.
Introduction to the problem. Limitations

The redefinition of humanities scholarship has received major attention in higher education. The advent of digital humanities challenges aspects of academic librarianship. Data literacy is a critical need for digital humanities in academia. The March 2016 Library Juice Academy webinar led by John Russel exemplifies the efforts to help librarians acquire programming skills and, with them, the ability to handle data. Those are first steps on a rather long path toward building a robust infrastructure to collect, analyze, and interpret data intelligently, so it can be used to restructure daily and strategic activities. Since the phenomenon of Big Data is young, there is a lack of blueprints for the organization of such infrastructure. Collecting and sharing best practices is an efficient approach to establishing a feasible plan for setting up a library infrastructure for the collection, analysis, and implementation of Big Data.
Limitations. This research can only organize the responses of librarians and examine how libraries present themselves to the world in this arena. It may be able to make some rudimentary recommendations. However, based on each library's specific goals and tasks, further research and work will be needed.

 

 

Research Literature

“Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it…”
– Dan Ariely, 2013  https://www.asist.org/publications/bulletin/aprilmay-2017/big-datas-impact-on-privacy-for-librarians-and-information-professionals/

Big Data is becoming an omnipresent term. It is widespread among different disciplines in academia (De Mauro, Greco, & Grimaldi, 2016). This leads to "inconsistency in meanings and necessity for formal definitions" (De Mauro et al., 2016, p. 122). Similarly to De Mauro et al. (2016), Hashem, Yaqoob, Anuar, Mokhtar, Gani, and Ullah Khan (2015) seek standardization of definitions. The main connected "themes" of this phenomenon must be identified, and their connections to Library Science must be sought. A prerequisite for a comprehensive definition is the identification of Big Data methods. Bughin, Chui, and Manyika (2010), Chen et al. (2012), and De Mauro et al. (2015) single out the methods needed to complete the process of building a comprehensive definition.

In conjunction with identifying the methods, volume, velocity, and variety, as defined by Laney (2001), are the three properties of Big Data accepted across the literature. Daniel (2015) defines three stages of Big Data: collection, analysis, and visualization. According to Daniel (2015), Big Data in higher education "connotes the interpretation of a wide range of administrative and operational data" (p. 910), and according to Hilbert (2013), as cited in Daniel (2015), Big Data "delivers a cost-effective prospect to improve decision making" (p. 911).

The importance of understanding the process of Big Data analytics is well understood in academic libraries. Examples of such "administrative and operational" use for cost-effective improvement of decision making are the Finch & Flenner (2016) and Eaton (2017) case studies of the use of data visualization to assess an academic library collection and restructure the acquisition process. Sugimoto, Ding & Thelwall (2012) call for a discussion of Big Data for libraries. According to the 2017 NMC Horizon Report, "Big Data has become a major focus of academic and research libraries due to the rapid evolution of data mining technologies and the proliferation of data sources like mobile devices and social media" (Adams Becker et al., 2017, p. 38).

Power (2014) elaborates on the complexity of Big Data in regard to decision-making and offers ideas for organizations on building a system to deal with Big Data. As explained by Boyd and Crawford (2012) and cited in De Mauro et al (2016), there is a danger of a new digital divide among organizations with different access and ability to process data. Moreover, Big Data impacts current organizational entities in their ability to reconsider their structure and organization. The complexity of institutions’ performance under the impact of Big Data is further complicated by the change of human behavior, because, arguably, Big Data affects human behavior itself (Schroeder, 2014).

De Mauro et al. (2015) touch on the impact of Big Data on libraries. The reorganization of academic libraries in light of Big Data, and the handling of Big Data by libraries, is closely tied to the reorganization of the entire campus and the handling of Big Data by the educational institution. In addition to the disruption posed by the Big Data phenomenon, higher education is facing global changes of an economic, technological, social, and educational character. Daniel (2015) uses a chart to illustrate the complexity of these global trends. Parallel to the Big Data developments in America and Asia, the European Union is offering access to an EU open data portal (https://data.europa.eu/euodp/home). Moreover, the Association of European Research Libraries expects the H2020 program to increase "the digitization of cultural heritage, digital preservation, research data sharing, open access policies and the interoperability of research infrastructures" (Reilly, 2013).

The challenges posed by Big Data to human and social behavior (Schroeder, 2014) are no less significant than the impact of Big Data on learning. Cohen, Dolan, Dunlap, Hellerstein, & Welton (2009) propose a road map for "more conservative organizations" (p. 1492) to overcome their reservations and/or inability to handle Big Data and to adopt a practical approach to its complexity. Two Chinese researchers describe deep learning as the "set of machine learning techniques that learn multiple levels of representation in deep architectures" (Chen & Lin, 2014, p. 515). Deep learning requires "new ways of thinking and transformative solutions" (Chen & Lin, 2014, p. 523). Another pair of researchers from China present a broad overview of the various societal, business, and administrative applications of Big Data, including a detailed account and definitions of the processes and tools accompanying Big Data analytics (Philip Chen & Zhang, 2014). The American counterparts of these Chinese researchers are of the same opinion when it comes to the need to "think about the core principles and concepts that underline the techniques, and also the systematic thinking" (Provost & Fawcett, 2013, p. 58). De Mauro, Greco, and Grimaldi (2016), similarly to Provost and Fawcett (2013), draw attention to the urgent necessity to train new types of specialists to work with such data. As early as 2012, Davenport and Patil (2012), as cited in De Mauro et al. (2016), envisioned hybrid specialists able to manage both technological knowledge and academic research. Similarly, Provost and Fawcett (2013) mention the efforts of "academic institutions scrambling to put together programs to train data scientists" (p. 51). Further, Asamoah, Sharda, Zadeh & Kalgotra (2017) share a specific plan for the design and delivery of a big data analytics course. At the same time, librarians working with data acknowledge the shortcomings in the profession, since librarians "are practitioners first and generally do not view usability as a primary job responsibility, [and] usually lack the depth of research skills needed to carry out a fully valid" data-based study (Emanuel, 2013, p. 207).

Borgman (2015) devotes an entire book to data and scholarly research and goes beyond the already well-established facts regarding the importance of Big Data, the implications of Big Data, and the technical, societal, and educational impact and complications posed by Big Data. Borgman elucidates the importance of knowledge infrastructure and the necessity to understand the importance and complexity of building such infrastructure in order to be able to take advantage of Big Data. In a similar fashion, a team of Chinese scholars draws attention to the complexity of data mining and Big Data and the necessity to approach the issue in an organized fashion (Wu, Zhu, Wu, & Ding, 2014).

Bruns (2013) shifts the conversation from the "macro" architecture of Big Data, as addressed by Borgman (2015) and Wu et al. (2014), and ponders the influx of unprecedented opportunities for the humanities in academia with the advent of Big Data. Does the seemingly ubiquitous presence of Big Data mean that the humanities will be "railroaded" into "scientificity"? How will research and publishing change with the advent of Big Data across academic disciplines?

Reyes (2015) shares her “skinny” approach to Big Data in education. She presents a comprehensive structure for educational institutions to shift “traditional” analytics to “learner-centered” analytics (p. 75) and identifies the participants in the Big Data process in the organization. The model is applicable for library use.

Being a new and uncharted territory, Big Data and Big Data analytics can pose ethical issues. Willis (2013) focuses on Big Data applications in education, namely the ethical questions for higher education administrators and the expectation that Big Data analytics can predict students' success. Daries, Reich, Waldo, Young, and Whittinghill (2014) discuss rather similar issues regarding the balance between data and student privacy regulations. The privacy issues accompanying data are also discussed by Tene and Polonetsky (2013).

Privacy issues are habitually connected to security and surveillance issues. Andrejevic and Gates (2014) point out that in decision making "generated by data mining, the focus is not on particular individuals but on aggregate outcomes" (p. 195). Van Dijck (2014) goes into further detail regarding the perils posed by metadata and data to society, in particular to the privacy of citizens. Bail (2014) addresses the same issue regarding the impact of Big Data on societal issues, but underlines the leading role of cultural sociologists and their theories in the correct application of Big Data.

Library organizations have been traditional proponents of core democratic values such as the protection of privacy and the elucidation of related ethical questions (Miltenoff & Hauptman, 2005). In recent books about Big Data and libraries, ethical issues are an important part of the discussion (Weiss, 2018). Library blogs also discuss these issues (Harper & Oltmann, 2017). An academic library's role is to educate its patrons about those values. Sugimoto et al. (2012) reflect on the need for a discussion about Big Data in Library and Information Science. They clearly draw attention to the library "tradition of organizing, managing, retrieving, collecting, describing, and preserving information" (p. 1), as well as to library and information science being "a historically interdisciplinary and collaborative field, absorbing the knowledge of multiple domains and bringing the tools, techniques, and theories" (p. 1). Sugimoto et al. (2012) sought a wide discussion among the library profession regarding the implications of Big Data for the profession, no differently from the activities in other fields (e.g., Wixom, Ariyachandra, Douglas, Goul, Gupta, Iyer, Kulkarni, Mooney, Phillips-Wren, & Turetken, 2014). A current Andrew W. Mellon Foundation grant for Visualizing Digital Scholarship in Libraries seeks an opportunity to view "both macro and micro perspectives, multi-user collaboration and real-time data interaction, and a limitless number of visualization possibilities – critical capabilities for rapidly understanding today's large data sets" (Hwangbo, 2014).

The importance of the library with its traditional roles, as described by Sugimoto et al (2012) may continue, considering the Big Data platform proposed by Wu, Wu, Khabsa, Williams, Chen, Huang, Tuarob, Choudhury, Ororbia, Mitra, & Giles (2014). Such platforms will continue to emerge and be improved, with librarians as the ultimate drivers of such platforms and as the mediators between the patrons and the data generated by such platforms.

Every library needs to find its place in the large organization and in society in regard to this very new and very powerful phenomenon called Big Data. Libraries might not have the trained staff to become a leader in the process of organizing and building the complex mechanism of this new knowledge architecture, but librarians must educate and train themselves to be worthy participants in this new establishment.

 

Method

 

The study will be cleared by the SCSU IRB.
The survey will collect responses from the library population regarding its readiness to use, and current use of, Big Data. The survey URL will be sent to (academic?) libraries around the world.

Data will be processed through SPSS. Open-ended results will be processed manually. The preliminary research design presupposes a mixed-methods approach.

The study will include closed-ended survey questions and open-ended questions. The first part of the study (closed-ended, quantitative questions) will be completed through an online survey. Participants will be asked to complete the survey using a link they receive through e-mail.

Mixed methods research was defined by Johnson and Onwuegbuzie (2004) as "the class of research where the researcher mixes or combines quantitative and qualitative research techniques, methods, approaches, concepts, or language into a single study" (p. 17). Quantitative and qualitative methods can be combined if they are used to complement each other, because the methods can measure different aspects of the research questions (Sale, Lohfeld, & Brazil, 2002).

 

Sampling design

 

  • Online survey of 10-15 questions, with 3-5 demographic questions and the rest regarding the use of tools.
  • 1-2 open-ended questions at the end of the survey to probe for a follow-up mixed-methods approach (an opportunity for a qualitative study).
  • Data analysis techniques: survey results will be exported to SPSS and analyzed accordingly; the final survey design will determine the appropriate statistical approach (a preliminary analysis sketch follows this list).
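The quantitative responses are expected to be analyzed in SPSS, as noted above. Purely as an illustration of that first-pass analysis, the sketch below assumes a hypothetical export file (survey_results.csv) and hypothetical column names (library_type, uses_big_data); the instrument has not been designed yet, so none of these names are final.

```python
# Illustrative sketch only: first-pass summary of hypothetical closed-ended survey exports.
# The file name and column names are assumptions, not the final instrument.
import pandas as pd

responses = pd.read_csv("survey_results.csv")

# Descriptive statistics for a demographic item.
print(responses["library_type"].value_counts())

# Cross-tabulate reported use of Big Data against library type (row percentages).
crosstab = pd.crosstab(responses["library_type"], responses["uses_big_data"], normalize="index")
print(crosstab.round(2))
```

Open-ended responses would still be coded manually, as stated in the Method section.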

 

Project Schedule

 

Complete literature review and identify areas of interest – two months

Prepare and test instrument (survey) – one month

IRB and other details – one month

Generate a list of potential libraries to distribute survey – one month

Contact libraries. Follow up and contact again, if necessary (low turnaround) – one month

Collect, analyze data – two months

Write out data findings – one month

Complete manuscript – one month

Proofreading and other details – one month

 

Significance of the work 

While it has been widely acknowledged that Big Data (and its handling) is changing higher education (https://blog.stcloudstate.edu/ims?s=big+data) as well as academic libraries (https://blog.stcloudstate.edu/ims/2016/03/29/analytics-in-education/), it remains nebulous how Big Data is handled in the academic library and, respectively, how that handling relates to the handling of Big Data on campus. Moreover, the visualization of Big Data across campus units remains a work in progress, along with any policymaking based on the analysis of such data (hence the need for comprehensive visualization).

 

This research will aim to gain an understanding of: a. how librarians are handling Big Data; b. how they are relating their Big Data output to the campus output of Big Data; and c. how librarians in particular and campus administration in general are tuning their practices based on the analysis.

Based on the survey returns (if there is a statistically significant return), this research might consider juxtaposing the practices of academic libraries with the practices of special libraries (especially corporate libraries) and of public and school libraries.

 

 

References:

 

Adams Becker, S., Cummins, M., Davis, A., Freeman, A., Giesinger Hall, C., Ananthanarayanan, V., … Wolfson, N. (2017). NMC Horizon Report: 2017 Library Edition.

Andrejevic, M., & Gates, K. (2014). Big Data Surveillance: Introduction. Surveillance & Society, 12(2), 185–196.

Asamoah, D. A., Sharda, R., Hassan Zadeh, A., & Kalgotra, P. (2017). Preparing a Data Scientist: A Pedagogic Experience in Designing a Big Data Analytics Course. Decision Sciences Journal of Innovative Education, 15(2), 161–190. https://doi.org/10.1111/dsji.12125

Bail, C. A. (2014). The cultural environment: measuring culture with big data. Theory and Society, 43(3–4), 465–482. https://doi.org/10.1007/s11186-014-9216-5

Borgman, C. L. (2015). Big Data, Little Data, No Data: Scholarship in the Networked World. MIT Press.

Bruns, A. (2013). Faster than the speed of print: Reconciling ‘big data’ social media analysis and academic scholarship. First Monday, 18(10). Retrieved from http://firstmonday.org/ojs/index.php/fm/article/view/4879

Bughin, J., Chui, M., & Manyika, J. (2010). Clouds, big data, and smart assets: Ten tech-enabled business trends to watch. McKinsey Quarterly, 56(1), 75–86.

Chen, X. W., & Lin, X. (2014). Big Data Deep Learning: Challenges and Perspectives. IEEE Access, 2, 514–525. https://doi.org/10.1109/ACCESS.2014.2325029

Cohen, J., Dolan, B., Dunlap, M., Hellerstein, J. M., & Welton, C. (2009). MAD Skills: New Analysis Practices for Big Data. Proc. VLDB Endow., 2(2), 1481–1492. https://doi.org/10.14778/1687553.1687576

Daniel, B. (2015). Big Data and analytics in higher education: Opportunities and challenges. British Journal of Educational Technology, 46(5), 904–920. https://doi.org/10.1111/bjet.12230

Daries, J. P., Reich, J., Waldo, J., Young, E. M., Whittinghill, J., Ho, A. D., … Chuang, I. (2014). Privacy, Anonymity, and Big Data in the Social Sciences. Commun. ACM, 57(9), 56–63. https://doi.org/10.1145/2643132

De Mauro, A. D., Greco, M., & Grimaldi, M. (2016). A formal definition of Big Data based on its essential features. Library Review, 65(3), 122–135. https://doi.org/10.1108/LR-06-2015-0061

De Mauro, A., Greco, M., & Grimaldi, M. (2015). What is big data? A consensual definition and a review of key research topics. AIP Conference Proceedings, 1644(1), 97–104. https://doi.org/10.1063/1.4907823

Dumbill, E. (2012). Making Sense of Big Data. Big Data, 1(1), 1–2. https://doi.org/10.1089/big.2012.1503

Eaton, M. (2017). Seeing Library Data: A Prototype Data Visualization Application for Librarians. Publications and Research. Retrieved from http://academicworks.cuny.edu/kb_pubs/115

Emanuel, J. (2013). Usability testing in libraries: methods, limitations, and implications. OCLC Systems & Services: International Digital Library Perspectives, 29(4), 204–217. https://doi.org/10.1108/OCLC-02-2013-0009

Graham, M., & Shelton, T. (2013). Geography and the future of big data, big data and the future of geography. Dialogues in Human Geography, 3(3), 255–261. https://doi.org/10.1177/2043820613513121

Harper, L., & Oltmann, S. (2017, April 2). Big Data’s Impact on Privacy for Librarians and Information Professionals. Retrieved November 7, 2017, from https://www.asist.org/publications/bulletin/aprilmay-2017/big-datas-impact-on-privacy-for-librarians-and-information-professionals/

Hashem, I. A. T., Yaqoob, I., Anuar, N. B., Mokhtar, S., Gani, A., & Ullah Khan, S. (2015). The rise of “big data” on cloud computing: Review and open research issues. Information Systems, 47(Supplement C), 98–115. https://doi.org/10.1016/j.is.2014.07.006

Hwangbo, H. (2014, October 22). The future of collaboration: Large-scale visualization. Retrieved November 7, 2017, from http://usblogs.pwc.com/emerging-technology/the-future-of-collaboration-large-scale-visualization/

Laney, D. (2001, February 6). 3D Data Management: Controlling Data Volume, Velocity, and Variety.

Miltenoff, P., & Hauptman, R. (2005). Ethical dilemmas in libraries: an international perspective. The Electronic Library, 23(6), 664–670. https://doi.org/10.1108/02640470510635746

Philip Chen, C. L., & Zhang, C.-Y. (2014). Data-intensive applications, challenges, techniques and technologies: A survey on Big Data. Information Sciences, 275(Supplement C), 314–347. https://doi.org/10.1016/j.ins.2014.01.015

Power, D. J. (2014). Using ‘Big Data’ for analytics and decision support. Journal of Decision Systems, 23(2), 222–228. https://doi.org/10.1080/12460125.2014.888848

Provost, F., & Fawcett, T. (2013). Data Science and its Relationship to Big Data and Data-Driven Decision Making. Big Data, 1(1), 51–59. https://doi.org/10.1089/big.2013.1508

Reilly, S. (2013, December 12). What does Horizon 2020 mean for research libraries? Retrieved November 7, 2017, from http://libereurope.eu/blog/2013/12/12/what-does-horizon-2020-mean-for-research-libraries/

Reyes, J. (2015). The skinny on big data in education: Learning analytics simplified. TechTrends: Linking Research & Practice to Improve Learning, 59(2), 75–80. https://doi.org/10.1007/s11528-015-0842-1

Schroeder, R. (2014). Big Data and the brave new world of social media research. Big Data & Society, 1(2), 2053951714563194. https://doi.org/10.1177/2053951714563194

Sugimoto, C. R., Ding, Y., & Thelwall, M. (2012). Library and information science in the big data era: Funding, projects, and future [a panel proposal]. Proceedings of the American Society for Information Science and Technology, 49(1), 1–3. https://doi.org/10.1002/meet.14504901187

Tene, O., & Polonetsky, J. (2012). Big Data for All: Privacy and User Control in the Age of Analytics. Northwestern Journal of Technology and Intellectual Property, 11, [xxvii]-274.

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society; Newcastle upon Tyne, 12(2), 197–208.

Waller, M. A., & Fawcett, S. E. (2013). Data Science, Predictive Analytics, and Big Data: A Revolution That Will Transform Supply Chain Design and Management. Journal of Business Logistics, 34(2), 77–84. https://doi.org/10.1111/jbl.12010

Weiss, A. (2018). Big Data Shocks: An Introduction to Big Data for Librarians and Information Professionals. Rowman & Littlefield Publishers. Retrieved from https://rowman.com/ISBN/9781538103227/Big-Data-Shocks-An-Introduction-to-Big-Data-for-Librarians-and-Information-Professionals

West, D. M. (2012). Big data for education: Data mining, data analytics, and web dashboards. Governance Studies at Brookings, 4, 1–0.

Willis, J. (2013). Ethics, Big Data, and Analytics: A Model for Application. Educause Review Online. Retrieved from https://docs.lib.purdue.edu/idcpubs/1

Wixom, B., Ariyachandra, T., Douglas, D. E., Goul, M., Gupta, B., Iyer, L. S., … Turetken, O. (2014). The current state of business intelligence in academia: The arrival of big data. CAIS, 34, 1.

Wu, X., Zhu, X., Wu, G. Q., & Ding, W. (2014). Data mining with big data. IEEE Transactions on Knowledge and Data Engineering, 26(1), 97–107. https://doi.org/10.1109/TKDE.2013.109

Wu, Z., Wu, J., Khabsa, M., Williams, K., Chen, H. H., Huang, W., … Giles, C. L. (2014). Towards building a scholarly big data platform: Challenges, lessons and opportunities. In IEEE/ACM Joint Conference on Digital Libraries (pp. 117–126). https://doi.org/10.1109/JCDL.2014.6970157

 

+++++++++++++++++
more on big data





Key Issues in Teaching and Learning Survey

The EDUCAUSE Learning Initiative has just launched its 2018 Key Issues in Teaching and Learning Survey, so vote today: http://www.tinyurl.com/ki2018.

Each year, the ELI surveys the teaching and learning community in order to discover the key issues and themes in teaching and learning. These top issues provide the thematic foundation or basis for all of our conversations, courses, and publications for the coming year. Longitudinally they also provide the way to track the evolving discourse in the teaching and learning space. More information about this annual survey can be found at https://www.educause.edu/eli/initiatives/key-issues-in-teaching-and-learning.

ACADEMIC TRANSFORMATION (Holistic models supporting student success, leadership competencies for academic transformation, partnerships and collaborations across campus, IT transformation, academic transformation that is broad, strategic, and institutional in scope)

ACCESSIBILITY AND UNIVERSAL DESIGN FOR LEARNING (Supporting and educating the academic community in effective practice; intersections with instructional delivery modes; compliance issues)

ADAPTIVE TEACHING AND LEARNING (Digital courseware; adaptive technology; implications for course design and the instructor’s role; adaptive approaches that are not technology-based; integration with LMS; use of data to improve learner outcomes)

COMPETENCY-BASED EDUCATION AND NEW METHODS FOR THE ASSESSMENT OF STUDENT LEARNING (Developing collaborative cultures of assessment that bring together faculty, instructional designers, accreditation coordinators, and technical support personnel, real world experience credit)

DIGITAL AND INFORMATION LITERACIES (Student and faculty literacies; research skills; data discovery, management, and analysis skills; information visualization skills; partnerships for literacy programs; evaluation of student digital competencies; information evaluation)

EVALUATING TECHNOLOGY-BASED INSTRUCTIONAL INNOVATIONS (Tools and methods to gather data; data analysis techniques; qualitative vs. quantitative data; evaluation project design; using findings to change curricular practice; scholarship of teaching and learning; articulating results to stakeholders; just-in-time evaluation of innovations). Here is my bibliographical overview on Big Data (scroll down to "Research literature": https://blog.stcloudstate.edu/ims/2017/11/07/irdl-proposal/)

EVOLUTION OF THE TEACHING AND LEARNING SUPPORT PROFESSION (Professional skills for T&L support; increasing emphasis on instructional design; delineating the skills, knowledge, business acumen, and political savvy for success; role of inter-institutional communities of practices and consortia; career-oriented professional development planning)

FACULTY DEVELOPMENT (Incentivizing faculty innovation; new roles for faculty and those who support them; evidence of impact on student learning/engagement of faculty development programs; faculty development intersections with learning analytics; engagement with student success)

GAMIFICATION OF LEARNING (Gamification designs for course activities; adaptive approaches to gamification; alternate reality games; simulations; technological implementation options for faculty)

INSTRUCTIONAL DESIGN (Skills and competencies for designers; integration of technology into the profession; role of data in design; evolution of the design profession (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims/2017/10/04/instructional-design-3/); effective leadership and collaboration with faculty)

INTEGRATED PLANNING AND ADVISING FOR STUDENT SUCCESS (Change management and campus leadership; collaboration across units; integration of technology systems and data; dashboard design; data visualization (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims?s=data+visualization); counseling and coaching advising transformation; student success analytics)

LEARNING ANALYTICS (Leveraging open data standards; privacy and ethics; both faculty and student facing reports; implementing; learning analytics to transform other services; course design implications)

LEARNING SPACE DESIGNS (Makerspaces; funding; faculty development; learning designs across disciplines; supporting integrated campus planning; ROI; accessibility/UDL; rating of classroom designs)

MICRO-CREDENTIALING AND DIGITAL BADGING (Design of badging hierarchies; stackable credentials; certificates; role of open standards; ways to publish digital badges; approaches to meta-data; implications for the transcript; personalized learning transcripts and blockchain technology (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims?s=blockchain))

MOBILE LEARNING (Curricular use of mobile devices (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims/2015/09/25/mc218-remodel/); innovative curricular apps; approaches to use in the classroom; technology integration into learning spaces; BYOD issues and opportunities)

MULTI-DIMENSIONAL TECHNOLOGIES (Virtual, augmented, mixed, and immersive reality; video walls; integration with learning spaces; scalability, affordability, and accessibility; use of mobile devices; multi-dimensional printing and artifact creation)

NEXT-GENERATION DIGITAL LEARNING ENVIRONMENTS AND LMS SERVICES (Open standards; learning environments architectures (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims/2017/03/28/digital-learning/); social learning environments; customization and personalization; OER integration; intersections with learning modalities such as adaptive, online, etc.; LMS evaluation, integration and support)

ONLINE AND BLENDED TEACHING AND LEARNING (Flipped course models; leveraging MOOCs in online learning; course development models; intersections with analytics; humanization of online courses; student engagement)

OPEN EDUCATION (Resources, textbooks, content; quality and editorial issues; faculty development; intersections with student success/access; analytics; licensing; affordability; business models; accessibility and sustainability)

PRIVACY AND SECURITY (Formulation of policies on privacy and data protection; increased sharing of data via open standards for internal and external purposes; increased use of cloud-based and third party options; education of faculty, students, and administrators)

WORKING WITH EMERGING LEARNING TECHNOLOGY (Scalability and diffusion; effective piloting practices; investments; faculty development; funding; evaluation methods and rubrics; interoperability; data-driven decision-making)

+++++++++++
learning and teaching in this IMS blog
https://blog.stcloudstate.edu/ims?s=teaching+and+learning

digital assessment session for SCSU faculty

please consider the following opportunities:

  1. Remote attendance through: https://webmeeting.minnstate.edu/collaborate
  2. Recording of the session: (URL will be shared after the session)
  3. Request a follow up meeting for your individual project: https://doodle.com/digitalliteracy

+++++++++++++
more on digital assessment in this IMS blog
https://blog.stcloudstate.edu/ims?s=edpuzzle

pedagogically sound Minecraft examples

FridayLive!! Oct 27 THIS WEEK 2:00 PM EDT 

Minecraft for Higher Ed? Try it. Pros, Cons, Recommendations? 

Description: Why Minecraft, the online video game? How can Minecraft improve learning for higher education?
We’ll begin with a live demo in which all can participate (see “Minecraft for Free”).
We’ll review “Examples, Not Rumors” of successful adaptations and uses of Minecraft for teaching/learning in higher education, especially those submitted in advance.
And we’ll try to extract from these activities a few recommendations/questions/requests re Minecraft in higher education.

++++++++++
Examples:

Minecraft Education Edition: https://education.minecraft.net/
(more info: https://blog.stcloudstate.edu/ims/2017/05/23/minecraft-education-edition/)

K12: 

Minecraft empathy skills: http://www.gettingsmart.com/wp-content/uploads/2017/04/How-Minecraft-Supports-SEL.pdf

coding w MineCraft

Minecraft for Math

Higher Ed: 

Minecraft Higher Education?

Using MCEE in Higher Education

Why NOT to use minecraft in education:

https://higheredrevolution.com/why-educators-probably-shouldn-t-use-minecraft-in-their-classrooms-989f525c6e62

College Students Get Virtual Look at the Real World with ‘Minecraft’

Carnegie Mellon University uses the game-based learning tool to help students demonstrate engineering skills. (Sep 18, 2017)

https://edtechmagazine.com/higher/article/2017/09/college-students-get-virtual-look-real-world-minecraft

Using Minecraft in Higher Education

https://groups.google.com/forum/#!topic/minecraft-teachers/cED6MM0E0bQ

Using MinecraftEdu – Part 1 – Introduction

https://www.youtube.com/watch?v=Lsfd9J5UgVk

Physics with Minecraft example

Chemistry with Minecraft example

Biology

other disciplines

+++++++++++

Does learning really happen w Minecraft?

Callaghan, N. (2016). Investigating the role of Minecraft in educational learning environments. Educational Media International, 53(4), 244-260. doi:10.1080/09523987.2016.1254877

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d119571817%26site%3dehost-live%26scope%3dsite

Noelene Callaghan dissects the evolution of Australian education from a global perspective. She rightfully draws attention (p. 245) to inevitable changes in the educational world that still remain ignored: e.g., the demise of "traditional" LMSs (Educause is calling for their replacement with digital learning environments, https://blog.stcloudstate.edu/ims/2017/07/06/next-gen-digital-learning-environment/, and so does the corporate world of learning: https://blog.stcloudstate.edu/ims/2017/03/28/digital-learning/), the inevitability of BYOD (driven mainly by "budget restrictions and sustainability challenges," p. 245), the assertion of cloud computing, and, last but not least, the gamification of education.

p. 245: literature review. In my paper, I am offering a more comprehensive literature review. While Callaghan focuses on the positives, my attempt is to list both pros and cons: http://scsu.mn/1F008Re

 

  1. p. 246: General use of massive multiplayer online role-playing games (MMORPGs)

Levels of interaction have grown dramatically and have led to the general use of massive multiplayer online role-playing games (MMORPGs).

  1. p. 247: In teaching and learning environments, affordances associated with edugames within a project-based learning (PBL) environment permit:
  • (1)  Learner-centered environments
  • (2)  Collaboration
  • (3)  Curricular content
  • (4)  Authentic tasks
  • (5)  Multiple expression modes
  • (6)  Emphasis on time management
  • (7)  Innovative assessment (Han & Bhattacharya, 2001).

These affordances develop both social and cognitive abilities of students

 

Nebel, S., Schneider, S., Beege, M., Kolda, F., Mackiewicz, V., & Rey, G. (2017). You cannot do this alone! Increasing task interdependence in cooperative educational videogames to encourage collaboration. Educational Technology Research & Development, 65(4), 993-1014. doi:10.1007/s11423-017-9511-8

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d124132216%26site%3dehost-live%26scope%3dsite

Abrams, S. S., & Rowsell, J. (2017). Emotionally Crafted Experiences: Layering Literacies in Minecraft. Reading Teacher, 70(4), 501-506.

Nebel, S., Schneider, S., & Rey, G. D. (2016). Mining Learning and Crafting Scientific Experiments: A Literature Review on the Use of Minecraft in Education and Research. Journal of Educational Technology & Society, 19(2), 355–366. Retrieved from http://www.jstor.org/stable/jeductechsoci.19.2.355

Cipollone, M., Schifter, C. C., & Moffat, R. A. (2014). Minecraft as a Creative Tool: A Case Study. International Journal of Game-Based Learning, 4(2), 1-14.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3deric%26AN%3dEJ1111251%26site%3dehost-live%26scope%3dsite

Niemeyer, D. J., & Gerber, H. R. (2015). Maker culture and Minecraft: implications for the future of learning. Educational Media International, 52(3), 216-226. doi:10.1080/09523987.2015.1075103

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d111240626%26site%3dehost-live%26scope%3dsite


 

Wilkinson, B., Williams, N., & Armstrong, P. (2013). Improving Student Understanding, Application and Synthesis of Computer Programming Concepts with Minecraft. In The European Conference on Technology in the Classroom 2013. Retrieved from http://iafor.info/archives/offprints/ectc2013-offprints/ECTC2013_0477.pdf

Berg Marklund, B., & Alklind Taylor, A.-S. (2015). Teachers’ Many Roles in Game-Based Learning Projects. In Academic Conferences International Limited (pp. 359–367). Retrieved from https://search.proquest.com/openview/15e084a1c52fdda188c27b9d2de6d361/1?pq-origsite=gscholar&cbl=396495

Uusi-Mäkelä, M., & Uusi-Mäkelä, M. (2014). Immersive Language Learning with Games: Finding Flow in MinecraftEdu. EdMedia: World Conference on Educational Media and Technology (Vol. 2014). Association for the Advancement of Computing in Education (AACE). Retrieved from https://www.learntechlib.org/noaccess/148409/

Birt, J., & Hovorka, D. (2014). Effect of mixed media visualization on learner perceptions and outcomes. In 25th Australasian Conference on Information Systems (pp. 1–10). Retrieved from http://epublications.bond.edu.au/fsd_papers/74

Al Washmi, R., Bana, J., Knight, I., Benson, E., Afolabi, O., Kerr, A., Hopkins, G. (2014). Design of a Math Learning Game Using a Minecraft Mod. https://doi.org/10.13140/2.1.4660.4809
https://www.researchgate.net/publication/267135810_Design_of_a_Math_Learning_Game_Using_a_Minecraft_Mod
https://docs.google.com/document/d/1uch2iC_CGsESdF9lpATGwWkamNbqQ7JOYEu_D-V03LQ/edit?usp=sharing

+++++++++++++
more on Minecraft in this IMS blog
https://blog.stcloudstate.edu/ims?s=minecraft

data visualization for librarians

Eaton, M. E. (2017). Seeing Library Data: A Prototype Data Visualization Application for Librarians. Journal of Web Librarianship, 11(1), 69–78. Retrieved from http://academicworks.cuny.edu/kb_pubs

Visualization can increase the power of data, by showing the “patterns, trends and exceptions”

Librarians can benefit when they visually leverage data in support of library projects.

Nathan Yau suggests that exploratory learning is a significant benefit of data visualization initiatives (2013). We can learn about our libraries by tinkering with data. In addition, handling data can also challenge librarians to improve their technical skills. Visualization projects allow librarians to not only learn about their libraries, but to also learn programming and data science skills.
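As a concrete (and purely hypothetical) illustration of the kind of tinkering Yau describes, the short sketch below assumes a CSV export of catalog records with a subject column and charts the count of holdings per subject; it is not Eaton's application, only a minimal starting point for exploratory visualization.

```python
# Minimal sketch: count holdings by subject from a hypothetical catalog export
# and draw a horizontal bar chart. The file name and column name are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

records = pd.read_csv("catalog_export.csv")          # one row per title
counts = records["subject"].value_counts().head(15)  # fifteen largest subjects

counts.plot(kind="barh", title="Holdings by subject (top 15)")
plt.xlabel("Number of titles")
plt.tight_layout()
plt.savefig("holdings_by_subject.png")
```

Even a plot this simple can surface collection imbalances worth a closer look, which is the exploratory benefit Yau points to.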

The classic voice on data visualization theory is Edward Tufte. In Envisioning Information, Tufte unequivocally advocates for multi-dimensionality in visualizations. He praises some incredibly complex paper-based visualizations (1990). This discussion suggests that the principles of data visualization are strongly contested. Although Yau’s even-handed approach and Cairo’s willingness to find common ground are laudable, their positions are not authoritative or the only approach to data visualization.

The article describes a web application that visualizes the library's holdings of books and e-books according to certain facets and keywords. Users can visualize whatever topics they want by selecting keywords and facets that interest them.

The back end uses the Primo X-Services API and JSON, with Flask, a very flexible Python web micro-framework. In addition to creating the visualization, SeeCollections also makes this data available on the web. JavaScript is the front-end technology that ultimately presents data to the SeeCollections user. JavaScript is a cornerstone of contemporary web development; a great deal of today's interactive web content relies upon it. Many popular code libraries have been written for JavaScript. This project draws upon jQuery, Bootstrap, and d3.js.
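A minimal sketch of this kind of back end, assuming hard-coded sample data in place of the Primo X-Services API call (whose request format is not reproduced here) and a hypothetical /holdings route:

```python
# Sketch only: a small Flask service that returns holdings counts as JSON,
# ready to be consumed by a JavaScript/d3.js front end.
# In SeeCollections the counts would come from the Primo X-Services API;
# here a hard-coded dictionary stands in for that call.
from flask import Flask, jsonify, request

app = Flask(__name__)

SAMPLE_FACETS = {"books": 420, "ebooks": 310, "journals": 95}  # hypothetical counts

@app.route("/holdings")
def holdings():
    keyword = request.args.get("keyword", "")
    # A real implementation would query the catalog API with `keyword`.
    return jsonify({"keyword": keyword, "facets": SAMPLE_FACETS})

if __name__ == "__main__":
    app.run(debug=True)
```

The front end would then fetch /holdings?keyword=... and bind the returned facet counts to chart elements, which is the role d3.js plays in the article.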

To give SeeCollections a unified visual theme, I have used Bootstrap. Bootstrap is most commonly used to make webpages responsive to different devices

D3.js facilitates the binding of data to the content of a web page, which allows manipulation of the web content based on the underlying data.

 

digital badges in academic libraries

David Demaine, S., Lemmer, C. A., Keele, B. J., & Alcasid, H. (2015). Using Digital Badges to Enhance Research Instruction in Academic Libraries. In B. L. Eden (Ed.), Enhancing Teaching and Learning in the 21st-Century Academic Library: Successful Innovations That Make a Difference (2015th ed.). Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2882671

At their best, badges can create a sort of interactive e-resume.

The librarian may be invited into the classroom, or the students may be sent to the library for a single research lesson on databases and search terms, which is not enough for truly high-quality research. A better alternative may be that the professor require the students to complete a series of badges (designed, implemented, and managed by the librarian) that build thorough research skills and ultimately produce a better paper.

Meta-badges are simply badges that indicate completion of multiple related badges.

Authentication (determining that the badge has not been altered) and validation/verification (checking that the badge has actually been earned and issued by the stated issuer) are major concerns. It is also important, particularly in the academic context, to make sure that the badge does not come to replace the learning it represents. A badge is a symbol that other skills and knowledge exist in this individual's portfolio of skills and talents. Therefore, badges awarded in the educational context must reflect time and effort and be based on vetted standards, or they will become empty symbols.
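To make the authentication concern concrete, here is a deliberately simplified sketch (not the Open Badges specification, which relies on hosted assertions or public-key signatures) of how an issuer-held secret could let a third party check that a badge assertion has not been altered; the secret, field names, and workflow are all assumptions for illustration.

```python
# Illustrative sketch of badge "authentication": sign a badge assertion with an
# issuer-held secret and verify the signature later. Field names are hypothetical.
import hashlib
import hmac
import json

ISSUER_SECRET = b"hypothetical-issuer-secret"

def sign_badge(assertion: dict) -> str:
    payload = json.dumps(assertion, sort_keys=True).encode()
    return hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()

def verify_badge(assertion: dict, signature: str) -> bool:
    return hmac.compare_digest(sign_badge(assertion), signature)

badge = {"recipient": "student@example.edu", "badge": "Database Search Skills"}
sig = sign_badge(badge)
print(verify_badge(badge, sig))   # True: the assertion is unaltered
badge["badge"] = "Meta-badge"     # tampering with the earned badge...
print(verify_badge(badge, sig))   # False: the signature no longer matches
```

Validation (did this issuer really award this badge to this learner?) would additionally require checking the assertion against the issuer's own records, which the sketch does not cover.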

Digital credentialing recognizes "learning of many kinds which are acquired beyond formal education institutions … ; it proliferates and disperses authority over what learning to recognize; and it provides a means of translation and commensuration across multiple spheres" (Olneck, 2012, p. 1).

University digital badge projects are rarely a top-down undertaking. Typically, digital badge programs arise from collaborative efforts "of people agitating from the middle" (Raths, 2013).

 

library IT’s approach to managing tech support

your library IT’s approach to managing tech support within the framework of moving IT projects forward. Also, how big is your IT team vs your staff?

We have created an environment at our library where staff anticipate almost instant tech support. While this is great for our staff and patrons, it has proven not so great for the IT department, as the IT projects that must get done take longer than they should and seem to roll on endlessly. It can feel like we're sacrificing the "big boulders" for endless minutiae.

I wondered if you all could tell me your library IT’s approach to managing tech support within the framework of moving IT projects forward.

Also, how big is your IT team vs your staff?

Thank you,
Madeleine

Madeleine Sturmer, IT Manager | Teton County Library | msturmer@tclib.org | 307.733.2164 x143

+++++++++++++++++++++

While the responses will vary widely based on size, type and IT-issues approaches, I can share one.

Providence College is a private, medium-sized (4,300 FTE students) Masters-I institution.

Our library is a fully integrated (horizontally and vertically) Commons (Library+Commons = no silos, traditional + technology-rich), open 116 hours/week for a primarily residential campus.

IT issues are tiered (e.g., 1-5 in complexity) and we have in-house IT specialists (two – one M-F days, one S-Th evenings) and many "back-up specialists." The IT specialists handle most tier 1-3 issues (sometimes tier 4) very promptly and refer tier 4-5 issues to central IT. All Library+Commons staff are hired with relatively high tech/digital expertise, so that there is an articulated in-house IT team. This means that most IT issues are handled in-house and promptly. Library+Commons IT reports up to the Assistant Director and Head of Technology & Access.

Russell Bailey, Ph.D.     Professor & Library Director, Providence College  http://www.providence.edu/library  http://works.bepress.com/d_r_bailey/ http://www.providence.edu/library/faculty/Pages/drbailey.aspx

++++++++++++++++++++++++++++++++++

The biggest challenge (and the most important) is to get the word out to the staff about how it works. I spoke at multiple all-staff meetings about the process, put out a lot of documentation, and spoke at multiple meetings of various teams and departments to get the word out. Once you have a structure, you have to support and enforce it. Getting your administration on board is vital; if the director or associate director thinks that they can "jump the queue," it won't work. They have to understand that, for the good of the whole, they might have to wait for something that is non-emergency.

Hope that helps; glad to provide further info offline if needed.

Carolyn

Carolyn Coulter, PrairieCat LLSAP Services Manager / PrairieCat Director, Reaching Across Illinois Library System, Coal Valley Office | Phone: 309.623.4176 | Fax: 309.517.1567 | carolyn.coulter@railslibraries.info
++++++++++++++++++++
more on technology in the library in this IMS blog
https://blog.stcloudstate.edu/ims?s=library+technology

interactivity for the library

In 2015, the former library dean purchased two large touch-screen monitors (I believe around $3,000 each). Shortly before that, I had offered the campus applications well suited for touch screens (whether large screens or mobile devices):

Both applications fit perfect the idea of interactivity in teaching (and learning) – https://blog.stcloudstate.edu/ims?s=interactivity

With the large touch screens, I proposed to have one of them positioned outside in the Miller Center lobby and used as a dummy terminal (50"+ screens run around $700) to mount educational material (e.g., Guenter Grass's celebration of his work: https://blog.stcloudstate.edu/ims/2015/04/15/gunter-grass-1927-2015/) and have students explore by actively engaging rather than just passively absorbing information. The students waiting for the bus are excellent potential users, and they visibly are NOT engaged by the currently broadcast information on these screens, but they can potentially be engaged if that information is restructured into interactive content.

The initial library administration approval was stalled by a concern that students might "open porno sites" while the library is closed, which, indeed, would have been a problem.

In 2015 I inquired with the IT technicians about freezing a browser on a specific tab, which could prevent such issues, but the inquiry did not go far (please see the solution below). Failing to secure a relatively locked-down environment on the touch screen, the project was quietly left to rot.

I am renewing my proposal to reconsider the rather expensive touch-screen monitors, which have not been utilized to their potential, and to test my idea of engaging students in meaningful knowledge-building by using these applications to either create content or engage with content created by others.

Further, I am proposing to investigate with campus faculty the possibility of taking the endeavor a step further by having a regularly meeting group develop engaging content using these and similar apps, for their own classes or any other [campus-related] activities. The incentive can be a reward after users and creators "vote" for the best (semester? academic year?) project. The less conspicuous benefit will be the exposure of faculty to modern technology; some faculty still abide by a lecturing style, while other faculty who seek interactivity are engulfed in the "smart board" fiction. Engaging the faculty in creating teaching materials for the touch screens will allow them to expand the practice to their own and their students' mobile devices. The benefit for the library will be serving as the "hub" of these activities, where faculty can learn from each other's experience[s] in the library, rather than only in their own departments/schools. The reward will be an incentive from the upper administration (a document to attach to the PDR?). I will need both of your involvement/support: Tom Hergert, by helping me rally faculty interest, and the administrators, by incentivizing faculty to participate in the initial project until it gains momentum and recognition.

In the same fashion, as part of the aforementioned group or separately, I would like to host a regularly meeting group of students who, besides play and entertainment, aim at the same process of creating interactive learning materials for their classes/projects, with the same "best voted" selection by peers. My preferred reward: the upper administration leaves a recommendation in the students' LinkedIn accounts for future employers. I will need both of your involvement/support. The student union can be decisive in bringing students to this endeavor. Both of you have more clout with the student union than a regular faculty member such as me.

In regard to the security concern (the porn alert, see above), I have the agreement of Dr. Tirthankar Ghosh of the IS Department. Dr. Ghosh will be most pleased to announce as a class project the provision of a secure environment for the touch-screen monitor, to be left after the group meetings for "use" by students in the library. Dr. Ghosh is, however, concerned/uncertain about the level of cooperation from IT, considering that for his students to enable such an environment, they have to have the "right" access: namely behind firewalls, administrative privileges, etc. Each of you will definitely be more persuasive with Phil Thorson in convincing him of the merit of having an IS student work with an SCSU IT technician, since it is a win-win situation: the IT technician does not have to "waste time" (as in 2015) resolving the issue, and the IS student will have a project-based, real-life learning experience by enabling the project under the supervision of the IT technician. Besides (a) student-centered, project-based learning and (b) IT technician time saved, we also aim at (c) a no-silos, collaborative SCSU working environment, as promised by the reorganization process.

Scopus webinar

Scopus Content: High quality, historical depth and expert curation

Bibliographic Indexing Leader

Register for the September 28th webinar

https://www.brighttalk.com/webcast/13703/275301

metadata: counts of papers by year, researcher, institution, province, region, and country; scientific fields and subfields
metadata as a topic in a one-credit course:

publisher – suppliers – Elsevier processes – Scopus data

h-index: The h-index is an author-level metric that attempts to measure both the productivity and citation impact of the publications of a scientist or scholar. The index is based on the set of the scientist’s most cited papers and the number of citations that they have received in other publications.
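A small worked example of that definition (illustrative only; Scopus computes the metric from its own citation data):

```python
# Compute an h-index from per-paper citation counts:
# an author has index h if h of their papers have at least h citations each.
def h_index(citation_counts):
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers cited 25, 8, 5, 3, and 3 times: three papers have at least
# three citations each, but not four with at least four, so the h-index is 3.
print(h_index([25, 8, 5, 3, 3]))  # 3
```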

+++++++++++++++++++

https://www.brighttalk.com/webcast/9995/275813

Librarians and APIs 101: overview and use cases
Christina Harlow, Library Data Specialist;Jonathan Hartmann, Georgetown Univ Medical Center; Robert Phillips, Univ of Florida

https://zenodo.org/

+++++++++++++++

Slides | Research data literacy and the library from Library_Connect

 The era of e-science demands new skill sets and competencies of researchers to ensure their work is accessible, discoverable and reusable. Librarians are naturally positioned to assist in this education as part of their liaison and information literacy services.

Research data literacy and the library

Christian Lauersen, University of Copenhagen; Sarah Wright, Cornell University; Anita de Waard, Elsevier

https://www.brighttalk.com/webcast/9995/226043

Data Literacy: access, assess, manipulate, summarize and present data

Statistical Literacy: think critically about basic stats in everyday media

Information Literacy: think critically about concepts; read, interpret, evaluate information

Data information literacy: the ability to use, understand, and manage data; the skills needed through the whole data life cycle.

Schield, Milo. “Information literacy, statistical literacy and data literacy.” IASSIST Quarterly 28(2/3) (2004): 6-11.

Carlson, J., Fosmire, M., Miller, C. C., & Nelson, M. S. (2011). Determining data information literacy needs: A study of students and research faculty. Portal: Libraries & the Academy, 11(2), 629-657.

data information literacy needs

embedded librarianship,

Courses developed: NTRES 6600, Research Data Management Seminar; six sessions, one-credit mini course.

http://guides.library.cornell.edu/ntres6600
BIOG 3020: Seminar in Research Skills for Biologists; a one-credit, semester-long course for undergrads covering data management and organization. http://guides.library.cornell.edu/BIOG3020

lessons learned:

  • lack of formal training for students working with data.
  • faculty assumed that students have or should have acquired the competencies earlier
  • students were considered lacking in these competencies
  • the competencies were almost universally considered important by students and faculty interviewed

http://www.datainfolit.org/

http://www.thepress.purdue.edu/titles/format/9781612493527

ideas behind data information literacy, such as the twelve data competencies.

http://blogs.lib.purdue.edu/dil/the-twelve-dil-competencies/

http://blogs.lib.purdue.edu/dil/what-is-data-information-literacy/

Johnston, L., & Carlson, J. (2015). Data Information Literacy: Librarians, Data and the Education of a New Generation of Researchers. Ashland: Purdue University Press. http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dnlebk%26AN%3d987172%26site%3dehost-live%26scope%3dsite

NEW ROLES FOR LIBRARIANS: DATA MANAGEMENT AND CURATION

The capacity to manage and curate research data has not kept pace with the ability to produce them (Hey & Hey, 2006). In recognition of this gap, the NSF and other funding agencies are now mandating that every grant proposal must include a DMP (NSF, 2010). These mandates highlight the benefits of producing well-described data that can be shared, understood, and reused by others, but they generally offer little in the way of guidance or instruction on how to address the inherent issues and challenges researchers face in complying. Even with increasing expectations from funding agencies and research communities, such as the announcement by the White House for all federal funding agencies to better share research data (Holdren, 2013), the lack of data curation services tailored for the "small sciences," the single investigators or small labs that typically comprise science practice at universities, has been identified as a barrier in making research data more widely available (Cragin, Palmer, Carlson, & Witt, 2010). Academic libraries, which support the research and teaching activities of their home institutions, are recognizing the need to develop services and resources in support of the evolving demands of the information age. The curation of research data is an area that librarians are well suited to address, and a number of academic libraries are taking action to build capacity in this area (Soehner, Steeves, & Ward, 2010).

REIMAGINING AN EXISTING ROLE OF LIBRARIANS: TEACHING INFORMATION LITERACY SKILLS

By combining the use-based standards of information literacy with skill development across the whole data life cycle, we sought to support the practices of science by developing a DIL curriculum and providing training for higher education students and researchers. We increased capacity and enabled comparative work by involving several institutions in developing instruction in DIL. Finally, we grounded the instruction in the real-world needs as articulated by active researchers and their students from a variety of fields.

Chapter 1: The development of the 12 DIL competencies is explained, and a brief comparison is performed between DIL and information literacy, as defined by the 2000 ACRL standards.

Chapter 2: Thinking and approaches toward engaging researchers and students with the 12 competencies, a review of the literature on a variety of educational approaches to teaching data management and curation to students, and an articulation of our key assumptions in forming the DIL project.

Chapter 3: International Journal of Digital Curation. http://www.ijdc.net/

http://www.dcc.ac.uk/digital-curation

https://blog.stcloudstate.edu/ims/2017/10/19/digital-curation-2/

https://blog.stcloudstate.edu/ims/2016/12/06/digital-curation/

Chapter 4: Because these longitudinal data cannot be reproduced, acquiring the skills necessary to work with databases and to handle data entry was described as essential. Interventions took place in a classroom setting through a spring 2013 semester one-credit course entitled Managing Data to Facilitate Your Research, taught by this DIL team.

Chapter 5: An embedded librarian approach of working with the teaching assistants (TAs) to develop tools and resources to teach undergraduate students data management skills as a part of their EPICS experience.
Lack of organization and documentation presents a barrier to (a) successfully transferring code to new students who will continue its development, (b) delivering code and other project outputs to the community client, and (c) the center administration’s ability to understand and evaluate the impact on student learning.
The DIL team offered skill sessions to deliver instruction to team leaders, crafted a rubric for measuring the quality of documenting code and other data, served as critics in student design reviews, and attended student lab sessions to observe and consult on student work.

Chapter 6: Although the faculty researcher had created formal policies on data management practices for his lab, this case study demonstrated that students’ adherence to these guidelines was limited at best. Similar patterns arose in discussions concerning the quality of metadata. This case study addressed a situation in which students are at least somewhat aware of the need to manage their data.

Chapter 7: The University of Minnesota team designed and implemented a hybrid course to teach DIL competencies to graduate students in civil engineering.
The course addressed students’ abilities to understand and track issues affecting the quality of the data, the transfer of data from their custody to the custody of the lab upon graduation, and the steps necessary to maintain the value and utility of the data over time.

++++++++++++++
more on Scopus in this IMS blog
https://blog.stcloudstate.edu/ims?s=scopus

measuring library outcomes and value

THE VALUE OF ACADEMIC LIBRARIES
A Comprehensive Research Review and Report. Megan Oakleaf

http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/val_report.pdf

Librarians in universities, colleges, and community colleges can establish, assess, and link academic library outcomes to institutional outcomes related to the following areas: student enrollment, student retention and graduation rates, student success, student achievement, student learning, student engagement, faculty research productivity, faculty teaching, service, and overarching institutional quality.
Assessment management systems help higher education educators, including librarians, manage their outcomes, record and maintain data on each outcome, facilitate connections to similar outcomes throughout an institution, and generate reports.
Assessment management systems are helpful for documenting progress toward strategic/organizational goals, but their real strength lies in managing learning outcomes assessments.
To determine the impact of library interactions on users, libraries can collect data on how individual users engage with library resources and services.
Increase library impact on student enrollment.
p. 13-14 Improved student retention and graduation rates. High-impact practices include: first-year seminars and experiences, common intellectual experiences, learning communities, writing-intensive courses, collaborative assignments and projects, undergraduate research, diversity/global learning, service learning/community-based learning, internships, capstone courses and projects.

p. 14

Libraries support students’ ability to do well in internships, secure job placements, earn salaries, gain acceptance to graduate/professional schools, and obtain marketable skills.
Librarians can investigate correlations between student library interactions and their GPA, as well as conduct test item audits of major professional/educational tests to determine correlations between library services or resources and specific test items.
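To illustrate the kind of correlation analysis described above, here is a minimal Python sketch; the student records, numbers, and variable names are invented for the example, and any real analysis would require anonymized institutional data and privacy review:

# Sketch only: do students with more library interactions also tend to have higher GPAs?
from statistics import correlation  # Pearson's r, available in Python 3.10+

# (library_interactions, gpa) pairs for a handful of imaginary students
students = [(2, 2.8), (5, 3.1), (9, 3.4), (1, 2.5), (12, 3.7), (7, 3.2)]
interactions = [s[0] for s in students]
gpas = [s[1] for s in students]

r = correlation(interactions, gpas)
print(f"Pearson correlation between library interactions and GPA: {r:.2f}")
# Correlation is not causation: a positive r only justifies further investigation,
# not a claim that library use raises GPA.

Such a script would sit on top of whatever interaction counts the library can ethically collect (circulation, instruction attendance, reference consultations), and assembling those counts is the harder part of the investigation.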
p. 15 Review course content, readings, reserves, and assignments.
Track and increase library contributions to faculty research productivity.
Continue to investigate library impact on faculty grant proposals and funding, a means of generating institutional income. Librarians contribute to faculty grant proposals in a number of ways.
Demonstrate and improve library support of faculty teaching.
p. 20 Internal Focus: ROI – library value = perceived benefits / perceived costs
production of a commodity – value = quantity of commodity produced × price per unit of commodity
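To make the two formulas above concrete, a small worked sketch in Python follows; every dollar figure and quantity is hypothetical and stands in for the perceived benefits, perceived costs, quantities, and unit prices a library would have to estimate locally:

# Internal focus: ROI-style value = perceived benefits / perceived costs
perceived_benefits = 1_200_000   # hypothetical: estimated value of collections, instruction, space
perceived_costs = 800_000        # hypothetical: annual costs attributed to those services
roi_value = perceived_benefits / perceived_costs
print(f"ROI-style value: {roi_value:.2f}")   # 1.50, i.e. $1.50 of perceived benefit per $1 spent

# Commodity view: value = quantity of commodity produced x price per unit
quantity_produced = 50_000       # hypothetical: e.g., reference transactions per year
price_per_unit = 12.0            # hypothetical: estimated market price of one transaction
commodity_value = quantity_produced * price_per_unit
print(f"Commodity value: ${commodity_value:,.0f}")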
p. 21 External focus
a fourth definition of value focuses on library impact on users. It asks, “What is the library trying to achieve? How can librarians tell if they have made a difference?” In universities, colleges, and community colleges, libraries impact learning, teaching, research, and service. A main method for measuring impact is to “observe what the [users] are actually doing and what they are producing as a result”
A fifth definition of value is based on user perceptions of the library in relation to competing alternatives. A related definition is “desired value” or “what a [user] wants to have happen when interacting with a [library] and/or using a [library’s] product or service” (Flint, Woodruff and Fisher Gardial 2002). Both “impact” and “competing alternatives” approaches to value require libraries to gain new understanding of their users’ goals as well as the results of their interactions with academic libraries.
p. 23 Increasingly, academic library value is linked to service, rather than products. Because information products are generally produced outside of libraries, library value is increasingly invested in service aspects and librarian expertise.
service delivery supported by librarian expertise is an important library value.
p. 25 methodology based only on literature? weak!
p. 26 review and analysis of the literature: language and literature are old (e.g. educational administrators vs ed leaders).
Government often sees higher education as unresponsive to these economic demands. Other stakeholder groups (students, parents, communities, employers, and graduate/professional schools) expect higher education to make impacts in ways that are not primarily financial.

p. 29

Because institutional missions vary (Keeling, et al. 2008, 86; Fraser, McClure and Leahy 2002, 512), the methods by which academic libraries contribute value vary as well. Consequently, each academic library must determine the unique ways in which they contribute to the mission of their institution and use that information to guide planning and decision making (Hernon and Altman, Assessing Service Quality 1998, 31). For example, the University of Minnesota Libraries has rewritten their mission and vision to increase alignment with their overarching institution’s goals and emphasis on strategic engagement (Lougee 2009). Allow institutional missions to guide library assessment.
Assessment vs. Research
In community colleges, colleges, and universities, assessment is about defining the purpose of higher education and determining the nature of quality (Astin 1987).
Academic libraries serve a number of purposes, often to the point of being overextended.
Assessment “strives to know…what is” and then uses that information to change the status quo (Keeling, et al. 2008, 28); in contrast, research is designed to test hypotheses. Assessment focuses on observations of change; research is concerned with the degree of correlation or causation among variables (Keeling, et al. 2008, 35). Assessment “virtually always occurs in a political context,” while research “attempts to be apolitical” (Upcraft and Schuh 2002, 19).
p. 31 Assessment seeks to document observations, but research seeks to prove or disprove ideas. Assessors have to complete assessment projects, even when there are significant design flaws (e.g., resource limitations, time limitations, organizational contexts, design limitations, or political contexts); whereas researchers can start over (Upcraft and Schuh 2002, 19). Assessors cannot always attain “perfect” studies, but must make do with “good enough” (Upcraft and Schuh 2002, 18). Of course, assessments should be well planned, be based on clear outcomes (Gorman 2009, 9-10), and use appropriate methods (Keeling, et al. 2008, 39); but they “must be comfortable with saying ‘after’ as well as ‘as a result of’…experiences” (Keeling, et al. 2008, 35).
Two multiple measure approaches are most significant for library assessment: 1) triangulation, “where multiple methods are used to find areas of convergence of data from different methods with an aim of overcoming the biases or limitations of data gathered from any one particular method” (Keeling, et al. 2008, 53), and 2) complementary mixed methods, which “seek to use data from multiple methods to build upon each other by clarifying, enhancing, or illuminating findings between or among methods” (Keeling, et al. 2008, 53).
p. 34 Academic libraries can help higher education institutions retain and graduate students, a keystone part of institutional missions (Mezick 2007, 561), but the challenge lies in determining how libraries can contribute and then document their contribution.
p. 35. Student Engagement:  In recent years, academic libraries have been transformed to provide “technology and content ubiquity” as well as individualized support
My Note: I read “technology and content ubiquity” as digital literacy / metaliteracies, where basic technology instructional sessions (everything IMS has offered for years) are included, yet this library still clings to information literacy only.
National Survey of Student Engagement (NSSE) http://nsse.indiana.edu/
http://nsse.indiana.edu/2017_Institutional_Report/pdf/NSSE17%20Snapshot%20%28NSSEville%20State%29.pdf
p. 37 Student Learning
In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194). This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007). In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194).
p. 38. For librarians, the main content area of student learning is information literacy; however, they are not alone in their interest in student information literacy skills (Oakleaf, Are They Learning? 2011).
My note: Yep. it was. 20 years ago. Metaliteracies is now.
p. 41 surrogates for student learning in Table 3.
p. 42 strategic planning for learning:
According to Kantor, the university library “exists to benefit the students of the educational institution as individuals” (Library as an Information Utility 1976, 101). In contrast, academic libraries tend to assess learning outcomes using groups of students.
p. 45 Assessment Management Systems
Tk20
Each assessment management system has a slightly different set of capabilities. Some guide outcomes creation, some develop rubrics, some score student work or support student portfolios. All manage, maintain, and report assessment data.
p. 46 faculty teaching
However, as online collections grow and discovery tools evolve, that role has become less critical (Schonfeld and Housewright 2010; Housewright and Schonfeld, Ithaka’s 2006 Studies of Key Stakeholders 2008, 256). Now, libraries serve as research consultants, project managers, technical support professionals, purchasers, and archivists (Housewright, Themes of Change 2009, 256; Case 2008).
Librarians can count citations of faculty publications (Dominguez 2005).

+++++++++++++

Tenopir, C. (2012). Beyond usage: measuring library outcomes and value. Library Management, 33(1/2), 5-13.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d70921798%26site%3dehost-live%26scope%3dsite

Methods that can be used to measure the value of library products and services fall into three main categories (Oakleaf, 2010; Tenopir and King, 2007):

  1. Implicit value. Measuring usage through downloads or usage logs provides an implicit measure of value. It is assumed that because libraries are used, they are of value to the users. Usage of e-resources is relatively easy to measure on an ongoing basis and is especially useful in collection development decisions and comparison of specific journal titles or use across subject disciplines. Usage statistics, however, do not show purpose, satisfaction, or outcomes of use (or whether what is downloaded is actually read). A minimal aggregation sketch follows this list.
  2. Explicit methods of measuring value include qualitative interview techniques that ask faculty members, students, or others specifically about the value or outcomes attributed to their use of the library collections or services, and surveys or interviews that focus on a specific (critical) incident of use.
  3. Derived values, such as Return on Investment (ROI), use multiple types of data collected on both the returns (benefits) and the library and user costs (investment) to explain value in monetary terms.
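As referenced in item 1, here is a minimal sketch of the implicit-value approach, assuming a hypothetical CSV export of e-resource usage with journal_title and downloads columns; the file name and column names are assumptions, not a real vendor or COUNTER format:

# Sketch only: aggregate downloads per journal title from a hypothetical usage export.
import csv
from collections import Counter

downloads_by_title = Counter()

# "usage_export.csv" is an assumed file with columns: journal_title, downloads
with open("usage_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        downloads_by_title[row["journal_title"]] += int(row["downloads"])

# Rank titles to support collection-development comparisons; download counts
# still do not show purpose, satisfaction, or whether items were actually read.
for title, total in downloads_by_title.most_common(10):
    print(f"{total:>8}  {title}")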

++++++++++++++++++
more on ROI in this IMS blog
https://blog.stcloudstate.edu/ims/2014/11/02/roi-of-social-media/
