liberalism

Liberalism is the most successful idea of the past 400 years

But its best years are behind it, according to a new book

https://www.economist.com/news/books-and-arts/21735578-its-best-years-are-behind-it-according-new-book-liberalism-most

Equality of opportunity has produced a new meritocratic aristocracy that has all the aloofness of the old aristocracy with none of its sense of noblesse oblige. Democracy has degenerated into a theatre of the absurd. And technological advances are reducing ever more areas of work into meaningless drudgery.

Mr Deneen’s fixation on the essence of liberalism leads to the second big problem of his book: his failure to recognise liberalism’s ability to reform itself and address its internal problems. The late 19th century saw America suffering from many of the problems that are reappearing today, including the creation of a business aristocracy, the rise of vast companies, the corruption of politics and the sense that society was dividing into winners and losers. But a wide variety of reformers, working within the liberal tradition, tackled these problems head on. Theodore Roosevelt took on the trusts. Progressives cleaned up government corruption. University reformers modernised academic syllabuses and built ladders of opportunity. Rather than dying, liberalism reformed itself.

Key Issues in Teaching and Learning Survey

The EDUCAUSE Learning Initiative has just launched its 2018 Key Issues in Teaching and Learning Survey, so vote today: http://www.tinyurl.com/ki2018.

Each year, the ELI surveys the teaching and learning community in order to discover the key issues and themes in teaching and learning. These top issues provide the thematic foundation or basis for all of our conversations, courses, and publications for the coming year. Longitudinally they also provide the way to track the evolving discourse in the teaching and learning space. More information about this annual survey can be found at https://www.educause.edu/eli/initiatives/key-issues-in-teaching-and-learning.

ACADEMIC TRANSFORMATION (Holistic models supporting student success, leadership competencies for academic transformation, partnerships and collaborations across campus, IT transformation, academic transformation that is broad, strategic, and institutional in scope)

ACCESSIBILITY AND UNIVERSAL DESIGN FOR LEARNING (Supporting and educating the academic community in effective practice; intersections with instructional delivery modes; compliance issues)

ADAPTIVE TEACHING AND LEARNING (Digital courseware; adaptive technology; implications for course design and the instructor’s role; adaptive approaches that are not technology-based; integration with LMS; use of data to improve learner outcomes)

COMPETENCY-BASED EDUCATION AND NEW METHODS FOR THE ASSESSMENT OF STUDENT LEARNING (Developing collaborative cultures of assessment that bring together faculty, instructional designers, accreditation coordinators, and technical support personnel; credit for real-world experience)

DIGITAL AND INFORMATION LITERACIES (Student and faculty literacies; research skills; data discovery, management, and analysis skills; information visualization skills; partnerships for literacy programs; evaluation of student digital competencies; information evaluation)

EVALUATING TECHNOLOGY-BASED INSTRUCTIONAL INNOVATIONS (Tools and methods to gather data; data analysis techniques; qualitative vs. quantitative data; evaluation project design; using findings to change curricular practice; scholarship of teaching and learning; articulating results to stakeholders; just-in-time evaluation of innovations). Here is my bibliographical overview on Big Data (scroll down to “Research literature”): https://blog.stcloudstate.edu/ims/2017/11/07/irdl-proposal/

EVOLUTION OF THE TEACHING AND LEARNING SUPPORT PROFESSION (Professional skills for T&L support; increasing emphasis on instructional design; delineating the skills, knowledge, business acumen, and political savvy for success; role of inter-institutional communities of practices and consortia; career-oriented professional development planning)

FACULTY DEVELOPMENT (Incentivizing faculty innovation; new roles for faculty and those who support them; evidence of impact on student learning/engagement of faculty development programs; faculty development intersections with learning analytics; engagement with student success)

GAMIFICATION OF LEARNING (Gamification designs for course activities; adaptive approaches to gamification; alternate reality games; simulations; technological implementation options for faculty)

INSTRUCTIONAL DESIGN (Skills and competencies for designers; integration of technology into the profession; role of data in design; evolution of the design profession (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims/2017/10/04/instructional-design-3/); effective leadership and collaboration with faculty)

INTEGRATED PLANNING AND ADVISING FOR STUDENT SUCCESS (Change management and campus leadership; collaboration across units; integration of technology systems and data; dashboard design; data visualization (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims?s=data+visualization); counseling and coaching advising transformation; student success analytics)

LEARNING ANALYTICS (Leveraging open data standards; privacy and ethics; both faculty- and student-facing reports; implementing learning analytics to transform other services; course design implications)

LEARNING SPACE DESIGNS (Makerspaces; funding; faculty development; learning designs across disciplines; supporting integrated campus planning; ROI; accessibility/UDL; rating of classroom designs)

MICRO-CREDENTIALING AND DIGITAL BADGING (Design of badging hierarchies; stackable credentials; certificates; role of open standards; ways to publish digital badges; approaches to meta-data; implications for the transcript; personalized learning transcripts and blockchain technology (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims?s=blockchain))

MOBILE LEARNING (Curricular use of mobile devices (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims/2015/09/25/mc218-remodel/); innovative curricular apps; approaches to use in the classroom; technology integration into learning spaces; BYOD issues and opportunities)

MULTI-DIMENSIONAL TECHNOLOGIES (Virtual, augmented, mixed, and immersive reality; video walls; integration with learning spaces; scalability, affordability, and accessibility; use of mobile devices; multi-dimensional printing and artifact creation)

NEXT-GENERATION DIGITAL LEARNING ENVIRONMENTS AND LMS SERVICES (Open standards; learning environments architectures (here previous blog postings on this issue: https://blog.stcloudstate.edu/ims/2017/03/28/digital-learning/); social learning environments; customization and personalization; OER integration; intersections with learning modalities such as adaptive, online, etc.; LMS evaluation, integration and support)

ONLINE AND BLENDED TEACHING AND LEARNING (Flipped course models; leveraging MOOCs in online learning; course development models; intersections with analytics; humanization of online courses; student engagement)

OPEN EDUCATION (Resources, textbooks, content; quality and editorial issues; faculty development; intersections with student success/access; analytics; licensing; affordability; business models; accessibility and sustainability)

PRIVACY AND SECURITY (Formulation of policies on privacy and data protection; increased sharing of data via open standards for internal and external purposes; increased use of cloud-based and third party options; education of faculty, students, and administrators)

WORKING WITH EMERGING LEARNING TECHNOLOGY (Scalability and diffusion; effective piloting practices; investments; faculty development; funding; evaluation methods and rubrics; interoperability; data-driven decision-making)

+++++++++++
learning and teaching in this IMS blog
https://blog.stcloudstate.edu/ims?s=teaching+and+learning

Plagiarism 101

prepare your students – avoid plagiarism

DISCOURAGING & DETECTING PLAGIARISM

http://citl.illinois.edu/citl-101/teaching-learning/resources/classroom-environment/discouraging-detecting-plagiarism

concepts of plagiarism,

http://www.wpacouncil.org/node/9

https://owl.english.purdue.edu/owl/resource/589/05/

Tutorial

intellectual property,

copyright,

collaboration, and

fair dealing.

Teach students how to quote, paraphrase (https://youtu.be/MiL4H09v0gU), and cite correctly:

  • Remind students of available resources, such as consulting with the faculty member, TAs, librarians, and the writing center.
  • Exemplify academic integrity in class by citing sources on handouts and during lectures.
  • Inform students that you will randomly check their citations.

Rubrics to help avoid Plagiarism:

http://wehs.westex.libguides.com/content.php?pid=345788&sid=3018138

Free Plagiarism checker:

https://www.paperrater.com/plagiarism_checker

https://www.grammarly.com/plagiarism-checker
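At their core, automated checkers like these flag passages by measuring how many word sequences (n-grams) a submission shares with known sources. A minimal sketch of that idea in Python (purely illustrative; not how PaperRater or Grammarly actually implement their checkers):

```python
def ngrams(text, n=3):
    """Return the set of n-word sequences (shingles) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=3):
    """Fraction of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)
```

A score near 1.0 means most of the submission’s phrasing appears verbatim in the source; real checkers add normalization, stemming, and large source indexes on top of this.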

++++++++++++
more on plagiarism in this IMS blog
https://blog.stcloudstate.edu/ims?s=plagiarism

teaching and assessing soft skills

Teaching & Assessing Soft Skills

http://catlintucker.com/2017/09/teaching-assessing-soft-skills/

Communication in Person & Online (available in PDF format here: Communication in Person Online Rubric)

https://docs.google.com/document/d/16JVAivizIysXdmUVXzC2BP2NiclbJ21N9cOZQ6NdqxU/edit

Level 1

Limited to no participation in discussions.

Does not come to discussions prepared. As a result, fails to support statements with evidence from texts and other research.

Few attempts to ask questions or build on ideas shared.

Frequently violates the “dos and don’ts” of online communication.

Level 2

Limited participation in discussions (one-on-one, in groups, and teacher-led) with various partners.

Does not consistently come to discussions prepared; limited preparation and inability to support statements with evidence from texts and other research reflect a lack of preparation.

Limited attempts to ask questions, build on ideas shared, or invite quieter voices into the conversation.

Hesitant to respond to other perspectives and fails to summarize points or make connections.

Occasionally violates the “dos and don’ts” of online communication.

Level 3

Participates in a range of collaborative discussions (one-on-one, in groups, and teacher-led) with diverse partners.

Comes to discussions prepared, having read and researched material. Draws on that preparation by referring to evidence from texts and other research on the topic.

Attempts to drive conversations forward by asking questions, building on ideas shared, and inviting quieter voices into the conversation.

Responds to diverse perspectives, summarizes points, and makes connections.

Respects the “dos and don’ts” of online communication.

Level 4

Initiates and participates effectively in a range of collaborative discussions (one-on-one, in groups, and teacher-led) with diverse partners.

Comes to discussions prepared with a unique perspective, having read and researched material; explicitly draws on that preparation by referring to evidence from texts and other research on the topic.

Propels conversations by posing and responding to questions that relate to the current discussion. (Adds depth by providing a new, unique perspective to the discussion.)

Responds thoughtfully to diverse perspectives, summarizes points of agreement and disagreement, and makes new connections. Leans in and actively listens.

Makes eye contact, speaks loudly enough to be heard, and uses strong body language.

Respects the “dos and don’ts” of online communication.

Critical Thinking & Problem Solving (available in PDF format here: Critical Thinking Problem Solving Rubric)

https://docs.google.com/document/d/1fjlODmLvrVZyrKnzz54LbVa7CqfbAJvLfb98805fjuY/edit

Level 1

Reflects surface-level understanding of information.

Unable or unwilling to evaluate the quality of information or draw conclusions about information found.

Does not try to solve problems or help others solve problems. Lets others do the work.

Does not actively seek answers to questions or attempt to find information. Does not seek out peers or ask the teacher for guidance or support.

Level 2

Attempts to dive below the surface when analyzing information, but work lacks depth.

Struggles to evaluate the quality of information and does not draw insightful conclusions about information found.

Does not suggest or refine solutions, but is willing to try out solutions suggested by others.

Asks teachers or other students for answers but does not use online tools, like Google and YouTube, to attempt to answer questions or find information.

Level 3

Demonstrates a solid understanding of the information.

Evaluates the quality of information and makes inferences/draws conclusions.

Refines solutions suggested by others.

Attempts to use online tools, like Google and YouTube, to seek answers and find information.

Level 4

Demonstrates a comprehensive understanding of the information.

Effectively evaluates the quality of information and makes inferences/draws conclusions that are insightful.

Actively looks for and suggests solutions to problems.

Uses online tools, like Google and YouTube, to proactively seek answers and find information.

Collaboration & Contributions in a Team Dynamic (available in PDF format here: Collaboration Contributions in a Team Dynamic Rubric)

https://docs.google.com/document/d/1ucjgylXWz8nOM5Vq8FpTByur8smsbov3mR8pX-7n1SE/edit

Level 1

Fails to listen to, share with, and support the efforts of team members, making accomplishing a task more difficult for the team.

Frequently inattentive or distracting when team members talk. Requires frequent redirection by team members and/or the teacher.

Body language does not reflect engagement in the process. Focus on leaning in, asking questions, and actively listening (e.g., making eye contact).

Rarely offers feedback. Frequently becomes impatient, frustrated, and/or disrespectful.

Limited attempts to move between roles.

Does not use resources to support the team’s work.

Level 2

Attempts to listen to, share with, and support the efforts of team members are limited or inconsistent.

Does not always listen when team members talk and requires redirection by team members and/or the teacher.

Body language does not reflect engagement in the process. Focus on leaning in, asking questions, and actively listening (e.g., making eye contact).

Occasionally offers feedback. At times, becomes impatient or frustrated with the process, making teamwork more challenging.

Limited attempts to move between roles.

Does not consistently use resources to support the team’s work.

Level 3

Listens to, shares with, and supports the efforts of team members.

Listens when team members talk.

Attempts to engage in group tasks; however, body language does not consistently communicate interest or attention. Body language reflects engagement in the process, but there is room for improvement.

Offers feedback and treats team members with respect. At times, becomes impatient or frustrated with the process, making teamwork more challenging.

Attempts to be flexible and move between roles; at times dominates a particular role. This is an area of potential growth.

Uses resources to support the team’s work.

Level 4

Consistently listens to, shares with, and supports the efforts of team members.

Leans in and actively listens when team members talk.

Body language communicates interest in team tasks and engagement in the process.

Offers constructive feedback, treats team members with respect, and is patient with the process.

Creates balance on the team, moving between responsibilities without dominating any one role.

Uses resources effectively to support the team’s work.

++++++++++++++++++++++
more on soft skills in this IMS blog
https://blog.stcloudstate.edu/ims?s=soft+skills

Scopus webinar

Scopus Content: High quality, historical depth and expert curation

Bibliographic Indexing Leader

Register for the September 28th webinar

https://www.brighttalk.com/webcast/13703/275301

metadata: counts of papers by year, researcher, institution, province, region, and country; scientific fields and subfields
metadata as a topic in a one-credit course:

publisher – suppliers – Elsevier processes – Scopus data
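Counts like these are simple aggregations over bibliographic metadata. A sketch with made-up records (the field names and values here are hypothetical, not Scopus’s actual schema):

```python
from collections import Counter

# Hypothetical records; a real Scopus export has many more fields.
records = [
    {"year": 2015, "institution": "SCSU", "field": "Education"},
    {"year": 2016, "institution": "SCSU", "field": "Education"},
    {"year": 2016, "institution": "UMN", "field": "Engineering"},
]

def counts_by(records, key):
    """Tally papers by one metadata facet (year, institution, field, ...)."""
    return Counter(r[key] for r in records)
```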

h-index: The h-index is an author-level metric that attempts to measure both the productivity and citation impact of the publications of a scientist or scholar. The index is based on the set of the scientist’s most cited papers and the number of citations that they have received in other publications.
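That definition translates directly into code; a small sketch:

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h
```

For citation counts [10, 8, 5, 4, 3] the h-index is 4: four papers have at least four citations each.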

+++++++++++++++++++

https://www.brighttalk.com/webcast/9995/275813

Librarians and APIs 101: overview and use cases
Christina Harlow, Library Data Specialist; Jonathan Hartmann, Georgetown Univ Medical Center; Robert Phillips, Univ of Florida

https://zenodo.org/
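Zenodo’s public REST API makes its records searchable programmatically. The sketch below only builds the request URL for the documented /api/records endpoint (`q` and `size` are its standard query parameters), leaving the actual HTTP call to whatever client you prefer:

```python
from urllib.parse import urlencode

ZENODO_RECORDS = "https://zenodo.org/api/records"

def zenodo_search_url(query, size=10):
    """Build a search URL against Zenodo's /api/records endpoint."""
    return f"{ZENODO_RECORDS}?{urlencode({'q': query, 'size': size})}"
```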

+++++++++++++++

Slides | Research data literacy and the library from Library_Connect

 The era of e-science demands new skill sets and competencies of researchers to ensure their work is accessible, discoverable and reusable. Librarians are naturally positioned to assist in this education as part of their liaison and information literacy services.

Research data literacy and the library

Christian Lauersen, University of Copenhagen; Sarah Wright, Cornell University; Anita de Waard, Elsevier

https://www.brighttalk.com/webcast/9995/226043

Data Literacy: access, assess, manipulate, summarize and present data

Statistical Literacy: think critically about basic stats in everyday media

Information Literacy: think critically about concepts; read, interpret, evaluate information

data information literacy: the ability to use, understand and manage data. the skills needed through the whole data life cycle.

Schield, Milo. “Information Literacy, Statistical Literacy and Data Literacy.” IASSIST Quarterly 28.2/3 (2004): 6-11.

Carlson, J., Fosmire, M., Miller, C. C., & Nelson, M. S. (2011). Determining data information literacy needs: A study of students and research faculty. Portal: Libraries & the Academy, 11(2), 629-657.

data information literacy needs

embedded librarianship,

Courses developed: NTRES 6600 Research Data Management Seminar, a six-session, one-credit mini-course.

http://guides.library.cornell.edu/ntres6600
BIOG 3020: Seminar in Research Skills for Biologists, a one-credit, semester-long course for undergraduates on data management and organization: http://guides.library.cornell.edu/BIOG3020

lessons learned:

  • lack of formal training for students working with data.
  • faculty assumed that students have or should have acquired the competencies earlier
  • students were considered lacking in these competencies
  • the competencies were almost universally considered important by students and faculty interviewed

http://www.datainfolit.org/

http://www.thepress.purdue.edu/titles/format/9781612493527

ideas behind data information literacy, such as the twelve data competencies.

http://blogs.lib.purdue.edu/dil/the-twelve-dil-competencies/

http://blogs.lib.purdue.edu/dil/what-is-data-information-literacy/

Johnston, L., & Carlson, J. (2015). Data Information Literacy : Librarians, Data and the Education of a New Generation of Researchers. Ashland: Purdue University Press.  http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dnlebk%26AN%3d987172%26site%3dehost-live%26scope%3dsite

NEW ROLES FOR LIBRARIANS: DATA MANAGEMENT AND CURATION

The capacity to manage and curate research data has not kept pace with the ability to produce them (Hey & Hey, 2006). In recognition of this gap, the NSF and other funding agencies are now mandating that every grant proposal must include a DMP (NSF, 2010). These mandates highlight the benefits of producing well-described data that can be shared, understood, and reused by others, but they generally offer little in the way of guidance or instruction on how to address the inherent issues and challenges researchers face in complying. Even with increasing expectations from funding agencies and research communities, such as the announcement by the White House for all federal funding agencies to better share research data (Holdren, 2013), the lack of data curation services tailored for the “small sciences,” the single investigators or small labs that typically comprise science practice at universities, has been identified as a barrier in making research data more widely available (Cragin, Palmer, Carlson, & Witt, 2010). Academic libraries, which support the research and teaching activities of their home institutions, are recognizing the need to develop services and resources in support of the evolving demands of the information age. The curation of research data is an area that librarians are well suited to address, and a number of academic libraries are taking action to build capacity in this area (Soehner, Steeves, & Ward, 2010).

REIMAGINING AN EXISTING ROLE OF LIBRARIANS: TEACHING INFORMATION LITERACY SKILLS

By combining the use-based standards of information literacy with skill development across the whole data life cycle, we sought to support the practices of science by developing a DIL curriculum and providing training for higher education students and researchers. We increased capacity and enabled comparative work by involving several institutions in developing instruction in DIL. Finally, we grounded the instruction in the real-world needs as articulated by active researchers and their students from a variety of fields.

Chapter 1: The development of the 12 DIL competencies is explained, and a brief comparison is performed between DIL and information literacy, as defined by the 2000 ACRL standards.

Chapter 2: Thinking and approaches toward engaging researchers and students with the 12 competencies, a review of the literature on a variety of educational approaches to teaching data management and curation to students, and an articulation of our key assumptions in forming the DIL project.

Chapter 3: International Journal of Digital Curation: http://www.ijdc.net/

http://www.dcc.ac.uk/digital-curation

https://blog.stcloudstate.edu/ims/2017/10/19/digital-curation-2/

https://blog.stcloudstate.edu/ims/2016/12/06/digital-curation/

Chapter 4: Because these longitudinal data cannot be reproduced, acquiring the skills necessary to work with databases and to handle data entry was described as essential. Interventions took place in a classroom setting through a one-credit spring 2013 course entitled Managing Data to Facilitate Your Research, taught by this DIL team.

Chapter 5: An embedded-librarian approach of working with the teaching assistants (TAs) to develop tools and resources to teach undergraduate students data management skills as part of their EPICS experience.
Lack of organization and documentation presents a barrier to (a) successfully transferring code to new students who will continue its development, (b) delivering code and other project outputs to the community client, and (c) the center administration’s ability to understand and evaluate the impact on student learning.
The librarians ran skill sessions to deliver instruction to team leaders, crafted a rubric for measuring the quality of documentation of code and other data, served as critics in student design reviews, and attended student lab sessions to observe and consult on student work.

Chapter 6: Although the faculty researcher had created formal policies on data management practices for his lab, this case study demonstrated that students’ adherence to these guidelines was limited at best. Similar patterns arose in discussions concerning the quality of metadata. This case study addressed a situation in which students are at least somewhat aware of the need to manage their data.

Chapter 7: A University of Minnesota team designed and implemented a hybrid course to teach DIL competencies to graduate students in civil engineering.
The course addressed students’ abilities to understand and track issues affecting the quality of the data, the transfer of data from their custody to the custody of the lab upon graduation, and the steps necessary to maintain the value and utility of the data over time.

++++++++++++++
more on Scopus in this IMS blog
https://blog.stcloudstate.edu/ims?s=scopus

measuring library outcomes and value

THE VALUE OF ACADEMIC LIBRARIES
A Comprehensive Research Review and Report. Megan Oakleaf

http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/val_report.pdf

Librarians in universities, colleges, and community colleges can establish, assess, and link academic library outcomes to institutional outcomes related to the following areas: student enrollment, student retention and graduation rates, student success, student achievement, student learning, student engagement, faculty research productivity, faculty teaching, service, and overarching institutional quality.

Assessment management systems help higher education educators, including librarians, manage their outcomes, record and maintain data on each outcome, facilitate connections to similar outcomes throughout an institution, and generate reports. They are helpful for documenting progress toward strategic/organizational goals, but their real strength lies in managing learning outcomes assessments.

To determine the impact of library interactions on users, libraries can collect data on how individual users engage with library resources and services.

Increase library impact on student enrollment.
p. 13-14 Improved student retention and graduation rates. High-impact practices include: first-year seminars and experiences, common intellectual experiences, learning communities, writing-intensive courses, collaborative assignments and projects, undergraduate research, diversity/global learning, service learning/community-based learning, internships, capstone courses and projects.

p. 14

Libraries support students’ ability to do well in internships, secure job placements, earn salaries, gain acceptance to graduate/professional schools, and obtain marketable skills.
Librarians can investigate correlations between student library interactions and their GPA, as well as conduct test item audits of major professional/educational tests to determine correlations between library services or resources and specific test items.
p. 15 Review course content, readings, reserves, and assignments.
Track and increase library contributions to faculty research productivity.
Continue to investigate library impact on faculty grant proposals and funding, a means of generating institutional income. Librarians contribute to faculty grant proposals in a number of ways.
Demonstrate and improve library support of faculty teaching.
p. 20 Internal focus: ROI – library value = perceived benefits / perceived costs
production of a commodity – value = quantity of commodity produced × price per unit of commodity
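Both formulas are a simple ratio and a simple product; a toy illustration with invented numbers (not figures from the report):

```python
def library_roi(perceived_benefits, perceived_costs):
    """Internal-focus value: perceived benefits divided by perceived costs."""
    return perceived_benefits / perceived_costs

def commodity_value(quantity_produced, price_per_unit):
    """Commodity view of value: quantity produced times price per unit."""
    return quantity_produced * price_per_unit
```

For example, a library whose services are perceived to return $300,000 in benefits against $120,000 in costs has an ROI of 2.5.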
p. 21 External focus
a fourth definition of value focuses on library impact on users. It asks, “What is the library trying to achieve? How can librarians tell if they have made a difference?” In universities, colleges, and community colleges, libraries impact learning, teaching, research, and service. A main method for measuring impact is to “observe what the [users] are actually doing and what they are producing as a result”
A fifth definition of value is based on user perceptions of the library in relation to competing alternatives. A related definition is “desired value” or “what a [user] wants to have happen when interacting with a [library] and/or using a [library’s] product or service” (Flint, Woodruff and Fisher Gardial 2002). Both “impact” and “competing alternatives” approaches to value require libraries to gain new understanding of their users’ goals as well as the results of their interactions with academic libraries.
p. 23 Increasingly, academic library value is linked to service, rather than products. Because information products are generally produced outside of libraries, library value is increasingly invested in service aspects and librarian expertise.
service delivery supported by librarian expertise is an important library value.
p. 25 methodology based only on literature? weak!
p. 26 review and analysis of the literature: language and literature are old (e.g. educational administrators vs ed leaders).
Government often sees higher education as unresponsive to these economic demands. Other stakeholder groups (students, parents, communities, employers, and graduate/professional schools) expect higher education to make impacts in ways that are not primarily financial.

p. 29

Because institutional missions vary (Keeling, et al. 2008, 86; Fraser, McClure and Leahy 2002, 512), the methods by which academic libraries contribute value vary as well. Consequently, each academic library must determine the unique ways in which it contributes to the mission of its institution and use that information to guide planning and decision making (Hernon and Altman, Assessing Service Quality 1998, 31). For example, the University of Minnesota Libraries has rewritten its mission and vision to increase alignment with the overarching institution’s goals and emphasis on strategic engagement (Lougee 2009). Allow institutional missions to guide library assessment.
Assessment vs. Research
In community colleges, colleges, and universities, assessment is about defining the purpose of higher education and determining the nature of quality (Astin 1987). Academic libraries serve a number of purposes, often to the point of being overextended.
Assessment “strives to know…what is” and then uses that information to change the status quo (Keeling, et al. 2008, 28); in contrast, research is designed to test hypotheses. Assessment focuses on observations of change; research is concerned with the degree of correlation or causation among variables (Keeling, et al. 2008, 35). Assessment “virtually always occurs in a political context,” while research “attempts to be apolitical” (Upcraft and Schuh 2002, 19).
p. 31 Assessment seeks to document observations, but research seeks to prove or disprove ideas. Assessors have to complete assessment projects, even when there are significant design flaws (e.g., resource limitations, time limitations, organizational contexts, design limitations, or political contexts), whereas researchers can start over (Upcraft and Schuh 2002, 19). Assessors cannot always attain “perfect” studies, but must make do with “good enough” (Upcraft and Schuh 2002, 18). Of course, assessments should be well planned, be based on clear outcomes (Gorman 2009, 9-10), and use appropriate methods (Keeling, et al. 2008, 39); but they “must be comfortable with saying ‘after’ as well as ‘as a result of’…experiences” (Keeling, et al. 2008, 35).
Two multiple-measure approaches are most significant for library assessment: 1) triangulation, “where multiple methods are used to find areas of convergence of data from different methods with an aim of overcoming the biases or limitations of data gathered from any one particular method” (Keeling, et al. 2008, 53), and 2) complementary mixed methods, which “seek to use data from multiple methods to build upon each other by clarifying, enhancing, or illuminating findings between or among methods” (Keeling, et al. 2008, 53).
p. 34 Academic libraries can help higher education institutions retain and graduate students, a keystone part of institutional missions (Mezick 2007, 561), but the challenge lies in determining how libraries can contribute and then document their contribution.
p. 35. Student Engagement:  In recent years, academic libraries have been transformed to provide “technology and content ubiquity” as well as individualized support
My note: I read “technology and content ubiquity” as digital literacy / metaliteracies, where basic technology instruction sessions (everything that IMS has offered for years) are included, but this library still clings to information literacy only.
National Survey of Student Engagement (NSSE) http://nsse.indiana.edu/
http://nsse.indiana.edu/2017_Institutional_Report/pdf/NSSE17%20Snapshot%20%28NSSEville%20State%29.pdf
p. 37 Student Learning
In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194) . This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007) . In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194) .
p. 38 For librarians, the main content area of student learning is information literacy; however, they are not alone in their interest in student information literacy skills (Oakleaf, Are They Learning? 2011).
My note: Yep, it was. Twenty years ago. Metaliteracies is now.
p. 41 surrogates for student learning in Table 3.
p. 42 strategic planning for learning:
According to Kantor, the university library “exists to benefit the students of the educational institution as individuals” (Library as an Information Utility 1976, 101). In contrast, academic libraries tend to assess learning outcomes using groups of students.
p. 45 Assessment Management Systems
Tk20
Each assessment management system has a slightly different set of capabilities. Some guide outcomes creation, some develop rubrics, some score student work, and some support student portfolios. All manage, maintain, and report assessment data.
p. 46 faculty teaching
However, as online collections grow and discovery tools evolve, that role has become less critical (Schonfeld and Housewright 2010; Housewright and Schonfeld, Ithaka’s 2006 Studies of Key Stakeholders 2008, 256). Now, libraries serve as research consultants, project managers, technical support professionals, purchasers, and archivists (Housewright, Themes of Change 2009, 256; Case 2008).
Librarians can count citations of faculty publications (Dominguez 2005).

+++++++++++++

Tenopir, C. (2012). Beyond usage: measuring library outcomes and value. Library Management, 33(1/2), 5-13.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d70921798%26site%3dehost-live%26scope%3dsite

Methods that can be used to measure the value of library products and services (Oakleaf, 2010; Tenopir and King, 2007) fall into three main categories:

  1. Implicit value. Measuring usage through downloads or usage logs provides an implicit measure of value. It is assumed that because libraries are used, they are of value to the users. Usage of e-resources is relatively easy to measure on an ongoing basis and is especially useful in collection development decisions and comparison of specific journal titles or use across subject disciplines.

However, usage statistics do not show purpose, satisfaction, or outcomes of use (or whether what is downloaded is actually read).

  2. Explicit value. Explicit methods of measuring value include qualitative interview techniques that ask faculty members, students, or others specifically about the value or outcomes attributed to their use of library collections or services, and surveys or interviews that focus on a specific (critical) incident of use.
  3. Derived value. Derived values, such as Return on Investment (ROI), use multiple types of data collected on both the returns (benefits) and the library and user costs (investment) to explain value in monetary terms.
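The derived-value (ROI) approach boils down to simple arithmetic once the benefit and cost figures are collected. A minimal sketch, using invented figures purely for illustration (real studies derive these from locally gathered data):

```python
# Minimal sketch: a derived-value (ROI) calculation for library services.
# All names and dollar figures below are hypothetical illustrations,
# not data from the Tenopir article.

def library_roi(benefits: float, investment: float) -> float:
    """ROI as a ratio: (benefits - investment) / investment."""
    if investment <= 0:
        raise ValueError("investment must be positive")
    return (benefits - investment) / investment

# Example: estimated monetary returns attributed to library use
# versus the library and user costs (figures are made up).
benefits = 1_250_000.0
investment = 500_000.0
roi = library_roi(benefits, investment)
print(f"ROI: {roi:.2f}")  # prints "ROI: 1.50"
```

The hard part, as the article stresses, is not the division but defending what counts as a “return” and a “cost” in the first place.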

++++++++++++++++++
more on ROI in this IMS blog
https://blog.stcloudstate.edu/ims/2014/11/02/roi-of-social-media/

online teaching evaluation

Tobin, T. J., Mandernach, B. J., & Taylor, A. H. (2015). Evaluating Online Teaching: Implementing Best Practices (1 edition). San Francisco, CA: Jossey-Bass.
5. Measurable faculty competencies for online teaching:
  • attend to unique challenges of distance learning
  • Be familiar with unique learning needs
  • Achieve mastery of course content, structure, and organization
  • Respond to student inquiries
  • Provide detailed feedback
  • Communicate effectively
  • Promote a safe learning environment
  • Monitor student progress
  • Communicate course goals
  • Provide evidence of teaching presence.

Best practices include:

  • Making interactions challenging yet supportive for students
  • Asking learners to be active participants in the learning process
  • Acknowledging the variety of ways that students learn best
  • Providing timely and constructive feedback

Evaluation principles

  • Instructor knowledge
  • Method of instruction
  • Instructor-student rapport
  • Teaching behaviors
  • Enthusiastic teaching
  • Concern for teaching
  • Overall

8. The American Association for Higher Education’s 9 Principles of Good Practice for Assessing Student Learning from 1996 hold equally in the F2F and online environments:

The assessment of student learning begins with educational values.

assessment is most effective when it reflects an understanding of learning as multidimensional, integrated and revealed in performance over time

assessment works best when the programs it seeks to improve have clear, explicitly stated purposes.

Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes.

Assessment works best when it is ongoing, not episodic

Assessment fosters wider improvement when representatives from across the educational community are involved

Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about.

Assessment is most likely to lead to improvements when it is part of the larger set of conditions that promote change.

Through assessment, educators meet responsibilities to students and to the public.

9. Most of the online teaching evaluation instruments in use today are created to evaluate content design rather than teaching practices.

29. Stakeholders for the evaluation of online teaching:

  • faculty members with online teaching experience
  • campus faculty members as a means of establishing equitable evaluation across modes of teaching
  • contingent faculty members teaching online
  • department or college administrators
  • members of faculty unions or representative governing organizations
  • administrative support specialists
  • distance learning administrators
  • technology specialists
  • LMS administrators
  • Faculty development and training specialists
  • Institutional assessment and effectiveness specialists
  • Students

Sample student rating questions

University resources

Rate the effectiveness of the online library for locating course materials.

Based on your experience,

148. Checklist for Online Interactive Learning (COIL)

150. Quality Online Course Initiative (QOCI)

151. QM Rubric

154. The Online Instructor Evaluation System (OIES)

 

163. Data Analytics: moving beyond student learning

  • # of announcements posted per module
  • # of contributions to the asynchronous discussion boards
  • Quality of the contributions
  • Timeliness of posting student grades
  • Timeliness of student feedback
  • Quality of instructional supplements
  • Quality of feedback on student work
  • Frequency of logins
180. Understanding big data:
  • reliability
  • validity
  • factor structure
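Several of the instructor-activity metrics listed above (p. 163) are straightforward counts over LMS event logs. A minimal sketch of that tallying, with an invented event format for illustration (a real LMS exposes its own log schema):

```python
# Minimal sketch: tallying instructor-activity metrics from hypothetical
# LMS event records. Field names and actions are invented illustrations.
from collections import Counter

events = [
    {"user": "instructor", "action": "post_announcement", "module": 1},
    {"user": "instructor", "action": "post_announcement", "module": 2},
    {"user": "instructor", "action": "discussion_reply", "module": 1},
    {"user": "instructor", "action": "login", "module": None},
    {"user": "instructor", "action": "login", "module": None},
]

# Count each action type across the log.
counts = Counter(e["action"] for e in events)
print("announcements posted:", counts["post_announcement"])   # 2
print("discussion contributions:", counts["discussion_reply"])  # 1
print("logins:", counts["login"])                             # 2
```

Counts like these cover the frequency metrics; the “quality of contributions” and “quality of feedback” items in the list still require human rating or rubric scoring.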

187. A holistic evaluation plan should include both formative evaluation, in which observations and ratings are undertaken with the purpose of improving teaching and learning, and summative evaluation, in which observations and ratings are used to make personnel decisions, such as granting promotion and tenure, remediation, and asking contingent faculty to teach again.

195. Separating teaching behaviors from content design

+++++++++++++++++
more on online teaching in this IMS blog
https://blog.stcloudstate.edu/ims?s=online+teaching

blogging for Confucius Institute

Minutes from the Oct 17 meeting:

++++++++++++++++++

Plan for August 17

Introduce students to the blog idea. Short link to this plan: http://bit.ly/blog4ci

https://blog.stcloudstate.edu/ims/2017/08/17/blogging-for-confucius-institute/

  • Why blog
    • What is social media, when SM started
    • What is blog, when blogs started
    • Why blog
      • Blogging vs microblogging

https://blog.stcloudstate.edu/ims/2015/12/31/social-media-and-the-devaluation/
https://blog.stcloudstate.edu/ims/2016/01/01/blog-future/

+++++++++++++++++++++++++++++++++++++++++++++++
handout on basic functions with your blog

+++++++++++++++++++++++++++++++++++++++++++++++

fake news and video

Computer Scientists Demonstrate The Potential For Faking Video

http://www.npr.org/sections/alltechconsidered/2017/07/14/537154304/computer-scientists-demonstrate-the-potential-for-faking-video

As a team out of the University of Washington explains in a new paper titled “Synthesizing Obama: Learning Lip Sync from Audio,” they’ve made several fake videos of Obama.

+++++++++++++

++++++++++++++++++++++++++++++++++++++

Fake news: you ain’t seen nothing yet

Generating convincing audio and video of fake events, July 1, 2017

https://www.economist.com/news/science-and-technology/21724370-generating-convincing-audio-and-video-fake-events-fake-news-you-aint-seen

It took only a few days to create the clip on a desktop computer using a generative adversarial network (GAN), a type of machine-learning algorithm.

Faith in written information is under attack in some quarters by the spread of what is loosely known as “fake news”. But images and sound recordings retain for many an inherent trustworthiness. GANs are part of a technological wave that threatens this credibility.

Amnesty International is already grappling with some of these issues. Its Citizen Evidence Lab verifies videos and images of alleged human-rights abuses. It uses Google Earth to examine background landscapes and to test whether a video or image was captured when and where it claims. It uses Wolfram Alpha, a search engine, to cross-reference historical weather conditions against those claimed in the video. Amnesty’s work mostly catches old videos that are being labelled as a new atrocity, but it will have to watch out for generated video, too. Cryptography could also help to verify that content has come from a trusted organisation. Media could be signed with a unique key that only the signing organisation—or the originating device—possesses.
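The signing idea the Economist describes can be sketched in a few lines: the originating organisation attaches an authentication tag to a media file, and anyone holding the verification key can detect tampering. Real deployments would use public-key signatures (e.g., Ed25519), so verification needs no shared secret; this stdlib-only sketch uses HMAC-SHA256 as a simplified stand-in, and all keys and content are invented placeholders.

```python
# Simplified sketch of signing media so provenance can be checked.
# HMAC is a symmetric stand-in here; a real scheme would use a
# public/private key pair held by the signing organisation or device.
import hashlib
import hmac

def sign(key: bytes, content: bytes) -> bytes:
    """Produce an authentication tag for the content."""
    return hmac.new(key, content, hashlib.sha256).digest()

def verify(key: bytes, content: bytes, tag: bytes) -> bool:
    """Check the tag in constant time; any alteration fails."""
    return hmac.compare_digest(sign(key, content), tag)

key = b"org-signing-key"           # held only by the signing organisation
video = b"...raw video bytes..."   # placeholder for the media file
tag = sign(key, video)

assert verify(key, video, tag)             # untampered content verifies
assert not verify(key, video + b"x", tag)  # any edit breaks verification
```

Even with such a scheme, signing only proves who published the file, not that the events it depicts are real, which is why Amnesty's contextual checks remain necessary.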

+++++++++++++
more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news

next gen digital learning environment

Updating the Next Generation Digital Learning Environment for Better Student Learning Outcomes

a learning management system (LMS) is never the solution to every problem in education. Edtech is just one part of the whole learning ecosystem and student experience.

Therefore, the next generation digital learning environment (NGDLE), as envisioned by EDUCAUSE in 2015 …  Looking at the NGDLE requirements from an LMS perspective, I view the NGDLE as being about five areas: interoperability; personalization; analytics, advising, and learning assessment; collaboration; accessibility and universal design.

Interoperability

  • Content can easily be exchanged between systems.
  • Users are able to leverage the tools they love, including discipline-specific apps.
  • Learning data is available to trusted systems and people who need it.
  • The learning environment is “future proof” so that it can adapt and extend as the ecosystem evolves.

Personalization

  • The learning environment reflects individual preferences.
  • Departments, divisions, and institutions can be autonomous.
  • Instructors teach the way they want and are not constrained by the software design.
  • There are clear, individual learning paths.
  • Students have choice in activity, expression, and engagement.

Analytics, Advising, and Learning Assessment

  • Learning analytics helps to identify at-risk students, course progress, and adaptive learning pathways.
  • The learning environment enables integrated planning and assessment of student performance.
  • More data is made available, with greater context around the data.
  • The learning environment supports platform and data standards.

Collaboration

  • Individual spaces persist after courses and after graduation.
  • Learners are encouraged as creators and consumers.
  • Courses include public and private spaces.

Accessibility and Universal Design

  • Accessibility is part of the design of the learning experience.
  • The learning environment enables adaptive learning and supports different types of materials.
  • Learning design includes measurement rubrics and quality control.

The core analogy used in the NGDLE paper is that each component of the learning environment is a Lego brick:

  • The days of the LMS as a “walled garden” app that does everything are over.
  • Today many kinds of amazing learning and collaboration tools (Lego bricks) should be accessible to educators.
  • We have standards that let these tools (including an LMS) talk to each other. That is, all bricks share some properties that let them fit together.
  • Students and teachers sign in once to this “ecosystem of bricks.”
  • The bricks share results and data.
  • These bricks fit together; they can be interchanged and swapped at will, with confidence that the learning experience will continue uninterrupted.

Any “next-gen” attempt to completely rework the pedagogical model and introduce a “mash-up of whatever” to fulfil this model would fall victim to the same criticisms levied at the LMS today: there is too little time and training to expect faculty to figure out the nuances of implementation on their own.

The Lego metaphor works only if we’re talking about “old school” Lego design — bricks of two, three, and four-post pieces that neatly fit together. Modern edtech is a lot more like the modern Lego. There are wheels and rocket launchers and belts and all kinds of amazing pieces that work well with each other, but only when they are configured properly. A user cannot simply stick together different pieces and assume they will work harmoniously in creating an environment through which each student can be successful.

As the NGDLE paper states: “Despite the high percentages of LMS adoption, relatively few instructors use its more advanced features — just 41% of faculty surveyed report using the LMS ‘to promote interaction outside the classroom.'”

But this is what the next generation LMS is good at: being a central nervous system — or learning hub — through which a variety of learning activities and tools are used. This is also where the LMS needs to go: bringing together and making sense of all the amazing innovations happening around it. This is much harder to do, perhaps even impossible, if all the pieces involved are just bricks without anything to orchestrate them or to weave them together into a meaningful, personal experience for achieving well-defined learning outcomes.

  • Making a commitment to build easy, flexible, and smart technology
  • Working with colleges and universities to remove barriers to adopting new tools in the ecosystem
  • Standardizing the vetting of accessibility compliance (the Strategic Nonvisual Access Partner Program from the National Federation of the Blind is a great start)
  • Advancing standards for data exchange while protecting individual privacy
  • Building integrated components that work with the institutions using them — learning quickly about what is and is not working well and applying those lessons to the next generation of interoperability standards
  • Letting people use the tools they love [SIC] and providing more ways for nontechnical individuals (including students) to easily integrate new features into learning activities

My note: something just refused to be accepted at SCSU
Technologists are often very focused on the technology, but the reality is that the more deeply and closely we understand the pedagogy and the people in the institutions — students, faculty, instructional support staff, administrators — the better suited we are to actually making the tech work for them.

++++++++++++++++++++++

Under the Hood of a Next Generation Digital Learning Environment in Progress

The challenge is that although 85 percent of faculty use a campus learning management system (LMS),1 a recent Blackboard report found that, out of 70,000 courses across 927 North American institutions, 53 percent of LMS usage was classified as supplemental (content-heavy, low interaction) and 24 percent as complementary (one-way communication via content/announcements/gradebook).2 Only 11 percent were characterized as social, 10 percent as evaluative (heavy use of assessment), and 2 percent as holistic (balanced use of all previous). Our FYE course required innovating beyond the supplemental course-level LMS to create a more holistic cohort-wide NGDLE in order to fully support the teaching, learning, and student success missions of the program. The key design goals for our NGDLE were to:

  • Create a common platform that could deliver a standard curriculum and achieve parity in all course sections using existing systems and tools and readily available content
  • Capture, store, and analyze any generated learner data to support learning assessment, continuous program improvement, and research
  • Develop reports and actionable analytics for administrators, advisors, instructors, and students

++++++++++++
more on LMS in this blog
https://blog.stcloudstate.edu/ims?s=LMS

more on learning outcomes in this IMS blog
https://blog.stcloudstate.edu/ims?s=learning+outcomes
