
Preparing Learners for 21st Century Digital Citizenship

ID2ID webinar (my notes at the bottom)

Digital Fluency: Preparing Learners for 21st Century Digital Citizenship
Eighty-five percent of the jobs available in 2030 do not yet exist. How does higher education prepare our learners for careers that don't yet exist? One approach is to give students opportunities to grow their skills in creative problem solving, critical thinking, resiliency, novel thinking, social intelligence, and communication. Instructional designers and faculty can leverage the framework of digital fluency to create opportunities for learners to practice and hone the skills that will prepare them to be 21st-century digital citizens. In this session, join a discussion about several fluencies that comprise the overarching framework for digital fluency and help to define some of your own.

Please click this URL to join. https://arizona.zoom.us/j/222969448

Presenter: Dr. Jennifer Sparrow, Senior Director for Teaching and Learning with Technology and Affiliate Assistant Professor of Learning, Design, and Technology at Penn State. The webinar will take place on Friday, November 9, at 11am EST / 4pm UTC (login details below).

https://arizona.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=e15266ee-7368-4378-b63c-a99301274877

My notes:

Jennifer does NOT see phone use for learning as something to obstruct. Much as the calculator was frowned upon some 30-40 years ago, technology is frowned upon now. Add to this the fast-changing job market: new jobs are created while old ones disappear (https://www.nbcnews.com/news/us-news/students-are-being-prepared-jobs-no-longer-exist-here-s-n865096).

How is digital fluency (DF) different from digital literacy? DF enables students to define how new knowledge can be created through technology. If the analogy is learning a language, it is not only reading and writing but also creating poems and stories. See slide 4 in https://www.slideshare.net/aidemoreto/vr-library

Communication fluency: being able to choose the correct media. Curiosity/failure fluency. Creation fluency (makerspace: creating without soldering, programming, or 3D printing; PLA filament is a corn-based plastic; Makers-in-Residence).

Immersive fluency: 360 video, VR, and AR. Enable students to create new knowledge through environments beyond reality. Immersive Experiences Lab (IMEX). Design: physical vs. virtual spaces.

Data fluency: b.book; how to create my own textbook.

Rubrics and sample projects to assess digital fluency.

https://er.educause.edu/articles/2018/3/digital-fluency-preparing-students-to-create-big-bold-problems

https://events.educause.edu/annual-conference/2018/agenda/ethics-and-digital-fluency-in-vr-and-immersive-learning-environments

Literacy Is NOT Enough: 21st Century Fluencies for the Digital Age (The 21st Century Fluency Series)
https://www.amazon.com/Literacy-NOT-Enough-Century-Fluencies/dp/1412987806

What is Instructional Design 2.0 or 3.0? First, deep knowledge and understanding of faculty development. Second, once faculty understand the new technology, how does this translate into reworking the curriculum? Third, the research piece: how to improve in order to be ready for the next cycle. It is a partnership between the instructional designer and faculty.

ELI 2018 Key Issues in Teaching and Learning

Key Issues in Teaching and Learning

https://www.educause.edu/eli/initiatives/key-issues-in-teaching-and-learning

A roster of results since 2011 is available at the link above.

ELI 2018 key issues

1. Academic Transformation

2. Accessibility and UDL

3. Faculty Development

4. Privacy and Security

5. Digital and Information Literacies

https://cdn.nmc.org/media/2017-nmc-strategic-brief-digital-literacy-in-higher-education-II.pdf
Three Models of Digital Literacy: Universal, Creative, Literacy Across Disciplines

United States digital literacy frameworks tend to focus on educational policy details and personal empowerment, the latter encouraging learners to become more effective students, better creators, smarter information consumers, and more influential members of their community.

National policies are vitally important in European digital literacy work, unsurprising for a continent well populated with nation-states and struggling to redefine itself, while still trying to grow economies in the wake of the 2008 financial crisis and subsequent financial pressures.

African digital literacy is more business-oriented.

Middle Eastern nations offer yet another variation, with a strong focus on media literacy. As with other regions, this can be a response to countries with strong state influence or control over local media. It can also represent a drive to produce more locally-sourced content, as opposed to consuming material from abroad, which may elicit criticism of neocolonialism or religious challenges.

p. 14 Digital literacy for Humanities: What does it mean to be digitally literate in history, literature, or philosophy? Creativity in these disciplines often involves textuality, given the large role writing plays in them, as, for example, in the Folger Shakespeare Library’s instructor’s guide. In the digital realm, this can include web-based writing through social media, along with the creation of multimedia projects through posters, presentations, and video. Information literacy remains a key part of digital literacy in the humanities. The digital humanities movement has not seen much connection with digital literacy, unfortunately, but their alignment seems likely, given the turn toward using digital technologies to explore humanities questions. That development could then foster a spread of other technologies and approaches to the rest of the humanities, including mapping, data visualization, text mining, web-based digital archives, and “distant reading” (working with very large bodies of texts). The digital humanities’ emphasis on making projects may also increase

Digital Literacy for Business: Digital literacy in this world is focused on manipulation of data, from spreadsheets to more advanced modeling software, leading up to degrees in management information systems. Management classes unsurprisingly focus on how to organize people working on and with digital tools.

Digital Literacy for Computer Science: Naturally, coding appears as a central competency within this discipline. Other aspects of the digital world feature prominently, including hardware and network architecture. Some courses housed within the computer science discipline offer a deeper examination of the impact of computing on society and politics, along with how to use digital tools. Media production plays a minor role here, beyond publications (posters, videos), as many institutions assign multimedia to other departments. Looking forward to a future when automation has become both more widespread and powerful, developing artificial intelligence projects will potentially play a role in computer science literacy.

6. Integrated Planning and Advising Systems for Student Success (iPASS)

7. Instructional Design

8. Online and Blended Learning

In traditional instruction, students’ first contact with new ideas happens in class, usually through direct instruction from the professor; after exposure to the basics, students are turned out of the classroom to tackle the most difficult tasks in learning — those that involve application, analysis, synthesis, and creativity — in their individual spaces. Flipped learning reverses this, by moving first contact with new concepts to the individual space and using the newly-expanded time in class for students to pursue difficult, higher-level tasks together, with the instructor as a guide.

Let’s take a look at some of the myths about flipped learning and try to find the facts.

Myth: Flipped learning is predicated on recording videos for students to watch before class.

Fact: Flipped learning does not require video. Although many real-life implementations of flipped learning use video, there’s nothing that says video must be used. In fact, one of the earliest instances of flipped learning — Eric Mazur’s peer instruction concept, used in Harvard physics classes — uses no video but rather an online text outfitted with social annotation software. And one of the most successful public instances of flipped learning, an edX course on numerical methods designed by Lorena Barba of George Washington University, uses precisely one video. Video is simply not necessary for flipped learning, and many alternatives to video can lead to effective flipped learning environments [http://rtalbert.org/flipped-learning-without-video/].

Myth: Flipped learning replaces face-to-face teaching.

Fact: Flipped learning optimizes face-to-face teaching. Flipped learning may (but does not always) replace lectures in class, but this is not to say that it replaces teaching. Teaching and “telling” are not the same thing.

Myth: Flipped learning has no evidence to back up its effectiveness.

Fact: Flipped learning research is growing at an exponential pace and has been since at least 2014. That research — 131 peer-reviewed articles in the first half of 2017 alone — includes results from primary, secondary, and postsecondary education in nearly every discipline, most showing significant improvements in student learning, motivation, and critical thinking skills.

Myth: Flipped learning is a fad.

Fact: Flipped learning has been with us in the form defined here for nearly 20 years.

Myth: People have been doing flipped learning for centuries.

Fact: Flipped learning is not just a rebranding of old techniques. The basic concept of students doing individually active work to encounter new ideas that are then built upon in class is almost as old as the university itself. So flipped learning is, in a real sense, a modern means of returning higher education to its roots. Even so, flipped learning is different from these time-honored techniques.

Myth: Students and professors prefer lecture over flipped learning.

Fact: Students and professors embrace flipped learning once they understand the benefits. It's true that professors often enjoy their lectures, and students often enjoy being lectured to. But the question is not who "enjoys" what, but rather what helps students learn best. They know what the research says about the effectiveness of active learning.

Assertion: Flipped learning provides a platform for implementing active learning in a way that works powerfully for students.

9. Evaluating Technology-based Instructional Innovations

Transitioning to an ROI lens requires three fundamental shifts (a worked example with hypothetical numbers follows the three questions below):
What is the total cost of my innovation, including both new spending and the use of existing resources?

What’s the unit I should measure that connects cost with a change in performance?

How might the expected change in student performance also support a more sustainable financial model?
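
A small worked example can make these three questions concrete. The sketch below uses entirely hypothetical numbers (the figures are invented, not drawn from the EDUCAUSE piece) to show how total cost, a performance unit, and cost per unit fit together:

```python
# Hypothetical, illustrative numbers only -- not taken from the article.
# A minimal sketch of the three ROI questions: total cost of an innovation,
# a unit that connects cost with performance, and the resulting cost per unit.

new_spending = 40_000          # e.g., courseware licenses (assumed)
existing_resources = 20_000    # e.g., staff time redirected to the pilot (assumed)
total_cost = new_spending + existing_resources

# Unit connecting cost with a change in performance:
# additional students who successfully complete the course versus the prior year.
completions_before = 520
completions_after = 580
additional_completions = completions_after - completions_before

cost_per_additional_completion = total_cost / additional_completions
print(f"Total cost: ${total_cost:,}")
print(f"Cost per additional completion: ${cost_per_additional_completion:,.2f}")
```

The third question then becomes whether that cost per additional completion compares favorably with the revenue or retention value of each completion.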

The Exposure Approach: we don’t provide a way for participants to determine if they learned anything new or now have the confidence or competence to apply what they learned.

The Exemplar Approach: from ‘show and tell’ for adults to show, tell, do and learn.

The Tutorial Approach: Getting a group that can meet at the same time and place can be challenging. That is why many faculty report a preference for self-paced professional development. Build in simple self-assessment checks. We can add prompts that invite people to engage in some sort of follow-up activity with a colleague. We can also add an elective option for faculty in a tutorial to actually create or do something with what they learned and then submit it for direct or narrative feedback.

The Course Approach: In a non-credit format, these have the benefits of a more structured and lengthy learning experience, even if they are just three- to five-week short courses that meet online or in person once every week or two. Assessment can involve badges, portfolios, peer assessment, self-assessment, or one-on-one feedback from a facilitator.

The Academy Approach: Like the course approach, this tends to be a deeper and more extended experience. People might gather in a cohort over a year or longer. Assessment through coaching and mentoring, the use of portfolios, peer feedback, and much more can easily be incorporated to add a rich assessment element to such longer-term professional development programs.

The Mentoring Approach: Mentors often don't set specific learning goals with the mentee. Instead, it is often a set of structured meetings, with the mentor also being someone to whom mentees can turn with questions and tips along the way.

The Coaching Approach: Mentoring tends to be a broader type of relationship with a person, while a coaching relationship tends to be more focused on specific goals, tasks, or outcomes.

The Peer Approach: This can be done on a 1:1 basis or in small groups, where those who are teaching the same courses are able to compare notes on curricula and teaching models. They might give each other feedback on how to teach certain concepts, how to write syllabi, how to handle certain teaching and learning challenges, and much more. Faculty might sit in on each other's courses, observe, and give feedback afterward.

The Self-Directed Approach: A self-assessment strategy such as setting goals and creating simple checklists and rubrics to monitor our progress. Or we invite feedback from colleagues, often in a narrative and/or informal format. We might also create a portfolio of our work, or keep a learning journal that documents our thoughts, experiments, experiences, and learning along the way.

The Buffet Approach:

10. Open Education

Figure 1. A Model for Networked Education (Credit: Image by Catherine Cronin)
Interpretations of Balancing Privacy and Openness (Credit: Image by Catherine Cronin. CC BY-SA)

11. Learning Analytics

12. Adaptive Teaching and Learning

13. Working with Emerging Technology

In 2014, administrators at Central Piedmont Community College (CPCC) in Charlotte, North Carolina, began talks with members of the North Carolina State Board of Community Colleges and North Carolina Community College System (NCCCS) leadership about starting a CBE program.

Building on an existing project at CPCC for identifying the elements of a digital learning environment (DLE), which was itself influenced by the EDUCAUSE publication The Next Generation Digital Learning Environment: A Report on Research, the committee reached consensus on a DLE concept and a shared lexicon: the "Digital Learning Environment Operational Definitions."

Figure 1. NC-CBE Digital Learning Environment

gamification and learning

Student Perceptions of Learning and Instructional Effectiveness in College Courses

https://www.ets.org/Media/Products/perceptions.pdf

Students’ Perception of Gamification in Learning and Education.

https://link.springer.com/chapter/10.1007%2F978-3-319-47283-6_6

College students’ perceptions of pleasure in learning – Designing gameful gamification in education

investigate behavioral and psychological metrics that could affect learner perceptions of technology

today’s learners spend extensive time and effort posting and commenting in social media and playing video games

Creating pleasurable learning experiences for learners can improve learner engagement.

Gamification uses game-design elements in non-gaming environments with the purpose of motivating users to behave in a certain direction (Deterding et al., 2011).

How can we facilitate the gamefulness of gamification?

Most gamified activities include three basic parts: “goal-focused activity, reward mechanisms, and progress tracking” (Glover, 2013, p. 2000).

gamification works similarly to the instructional methods in education – clear learning and teaching objectives, meaningful learning activities, and assessment methods that are aligned with the objectives

The design of seven game elements (a hypothetical data-structure sketch follows the list):

  • Storytelling: It provides the rules of the gamified activities. A good gamified activity should have a clear and simple storyboard to direct learners to achieve the goals. This game-design element works like the guidelines and directions of an instructional activity in class.
  • Levels: A gamified activity usually consists of different levels for learners to advance through. At each level, learners will face different challenges. These levels and challenges can be viewed as the specific learning objectives/competencies for learners to accomplish.
  • Points: Points pertain to the progress-tracking element because learners can gain points when they complete the quests.
  • Leaderboard: This element provides a reward mechanism that shows which learners are leading in the gamified activities. This element is very controversial when gamification is used in educational contexts because some empirical evidence shows that a leaderboard is effective only for users who are aggressive and hardcore players (Hamari, Koivisto, & Sarsa, 2014).
  • Badges: These serve as milestones to resemble the rewards that learners have achieved when they complete certain quests. This element works as the extrinsic motivation for learners (Kapp, 2012).
  • Feedback: A well-designed gamification interface should provide learners with timely feedback in order to help them to stay on the right track.
  • Progress: A progress-tracking bar should appear in the learner profile to remind learners of how many quests remain and how many quests they have completed.
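
As a thought experiment, the seven elements above can be captured in a single data structure for one gamified course module. The sketch below is hypothetical: the class name, point values, and badge rules are invented for illustration and are not taken from the cited studies.

```python
from dataclasses import dataclass, field

# A hypothetical sketch of the seven game elements as one course-module record.
@dataclass
class GamifiedModule:
    storytelling: str                      # rules/narrative that frame the activity
    levels: list[str]                      # ordered challenges mapped to objectives
    points_per_quest: int = 10             # reward mechanism / progress tracking
    badges: dict[str, int] = field(default_factory=dict)  # badge name -> points required
    leaderboard: bool = False              # optional; controversial in classrooms
    completed_quests: int = 0
    total_quests: int = 0

    def feedback(self) -> str:
        """Timely feedback plus a progress indicator."""
        done = self.completed_quests
        total = max(self.total_quests, 1)
        earned = [b for b, need in self.badges.items() if done * self.points_per_quest >= need]
        return f"{done}/{total} quests complete ({done / total:.0%}); badges earned: {earned or 'none'}"

# Hypothetical usage
module = GamifiedModule(
    storytelling="Escape the data maze: apply each statistics concept to open the next door.",
    levels=["Descriptive stats", "Probability", "Inference"],
    badges={"Explorer": 20, "Analyst": 50},
    total_quests=6,
)
module.completed_quests = 3
print(module.feedback())
```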

Dominguez et al. (2013) suggested that gamification fosters high-order thinking, such as problem-solving skills, rather than factual knowledge. Critical thinking, which is commonly assessed in social science majors, is also a form of higher-order thinking.

Davis (1989) developed the technology acceptance model (TAM) to help people understand how users perceive technologies. The pleasure, arousal, and dominance (PAD) emotional-state model developed by Mehrabian (1995) is one of the fundamental design frameworks for scale development in understanding user perceptions of user-system interactions.

[Embedded slides: "Technology Acceptance Model" by Damian T. Gordon]

[Embedded slides: "Introduction of the Basic Emotional Impact of Environments" by Sekine Masato]

Van der Heijden (2004) asserted that pleasurable experiences encouraged users to use the system for a longer period of time.
Self-determination theory (Deci & Ryan, 1985) has been integrated into the design of gamification and addressed the balance between learners’ extrinsic and intrinsic motivation.

[Embedded slides: "Self-Determination Theory" by Jeannie Maraya]
Ryan and Deci (2000) concluded that extrinsic rewards might suppress learners’ intrinsic motivation. Exploiting the playfulness and gamefulness in gamification, therefore, becomes extremely important, as it would employ the most effective approaches to engage learners.
Sweetser and Wyeth (2005) developed GameFlow as an evaluation model to measure player enjoyment in games
Fu, Su, and Yu (2009) adapted this scale to EGameFlow in order to measure college students’ enjoyment of e-learning games. EGameFlow is a multidimensional scale that consists of self-evaluated emotions.

[Embedded slides: "Gamification and Flow" by Martin Sillaots]
Eppmann, Bekk, and Klein (2018) developed the gameful experience scale (GAMEX) to measure gameful experiences in gamification contexts. One limitation of using GAMEX in education is that its effects on learning outcomes have not been studied.
The Big Five Model, proposed as a trait theory by McCrae & Costa (1989) and widely accepted in the field, can be used to measure the linkages between the game mechanics in gamification and the influence of different personality traits.

[Embedded slides: "The Big Five Personality Model" by Devina Srivastava]

Storytelling in the subscale of Preferences for Instruction emphasizes the rules of the gamified learning environments, such as the syllabus of the course, the rubrics for the assignments, and the directions for tasks. Storytelling in the subscale of Preferences for Instructors’ Teaching Style focuses on the ways in which instructors present the content. For example, instructors could use multimedia resources to present their instructional materials. Storytelling in the subscale of Preferences for Learning Effectiveness emphasizes scaffolding materials for the learners, such as providing background information for newly introduced topics.

The effective use of badges would include three main elements: signifier, completion logic, and rewards (Hamari & Eranti, 2011). A useful badge needs clear goal-setting and prompt feedback. Therefore, badges correlate closely with the design of storytelling (rules) and feedback, which are the key game design elements in the subscale of Preferences for Instruction.

Students can use Google to search on their laptops or tablets in class when instructors introduce new concepts. By reading the reviews and viewing the numbers of “thumbs-up” (agreements by other users), students are able to select the best answers. Today’s learners also “tweet” on social media to share educational videos and news with their classmates and instructors. Well-designed gamified learning environments could increase pleasure in learning by allowing students to use familiar computing experiences in learning environments.

 

Exemplary Course Program Rubric


http://www.blackboard.com/resources/catalyst-awards
if problems with the link above, try this one:
/bb_exemplary_course_rubric_apr2017.pdf

Course Design

Course Design addresses elements of instructional design. For the purpose of this rubric, course design includes such elements as structure of the course, learning objectives, organization of content, and instructional strategies.

Interaction and Collaboration

Interaction denotes communication between and among learners and instructors, synchronously or asynchronously. Collaboration is a subset of interaction and refers specifically to those activities in which groups are working interdependently toward a shared result. This differs from group activities that can be completed by students working independently of one another and then combining the results, much as one would when assembling a jigsaw puzzle with parts of the puzzle worked out separately then assembled together. A learning community is defined here as the sense of belonging to a group, rather than each student perceiving himself/herself studying independently.

Assessment

Assessment focuses on instructional activities designed to measure progress toward learning outcomes, provide feedback to students and instructors, and/or enable grading or evaluation. This section addresses the quality and type of student assessments within the course.

Learner Support

Learner Support addresses the support resources made available to students taking the course. Such resources may be accessible within or external to the course environment. Learner support resources address a variety of student services.

+++++++++++
more on online teaching in this IMS blog
http://blog.stcloudstate.edu/ims?s=online+teaching

more on rubrics in this IMS blog
http://blog.stcloudstate.edu/ims?s=rubric

Key Issues in Teaching and Learning Survey

The EDUCAUSE Learning Initiative has just launched its 2018 Key Issues in Teaching and Learning Survey, so vote today: http://www.tinyurl.com/ki2018.

Each year, the ELI surveys the teaching and learning community in order to discover the key issues and themes in teaching and learning. These top issues provide the thematic foundation or basis for all of our conversations, courses, and publications for the coming year. Longitudinally they also provide the way to track the evolving discourse in the teaching and learning space. More information about this annual survey can be found at https://www.educause.edu/eli/initiatives/key-issues-in-teaching-and-learning.

ACADEMIC TRANSFORMATION (Holistic models supporting student success, leadership competencies for academic transformation, partnerships and collaborations across campus, IT transformation, academic transformation that is broad, strategic, and institutional in scope)

ACCESSIBILITY AND UNIVERSAL DESIGN FOR LEARNING (Supporting and educating the academic community in effective practice; intersections with instructional delivery modes; compliance issues)

ADAPTIVE TEACHING AND LEARNING (Digital courseware; adaptive technology; implications for course design and the instructor’s role; adaptive approaches that are not technology-based; integration with LMS; use of data to improve learner outcomes)

COMPETENCY-BASED EDUCATION AND NEW METHODS FOR THE ASSESSMENT OF STUDENT LEARNING (Developing collaborative cultures of assessment that bring together faculty, instructional designers, accreditation coordinators, and technical support personnel, real world experience credit)

DIGITAL AND INFORMATION LITERACIES (Student and faculty literacies; research skills; data discovery, management, and analysis skills; information visualization skills; partnerships for literacy programs; evaluation of student digital competencies; information evaluation)

EVALUATING TECHNOLOGY-BASED INSTRUCTIONAL INNOVATIONS (Tools and methods to gather data; data analysis techniques; qualitative vs. quantitative data; evaluation project design; using findings to change curricular practice; scholarship of teaching and learning; articulating results to stakeholders; just-in-time evaluation of innovations). Here is my bibliographical overview on Big Data (scroll down to "Research literature"): http://blog.stcloudstate.edu/ims/2017/11/07/irdl-proposal/

EVOLUTION OF THE TEACHING AND LEARNING SUPPORT PROFESSION (Professional skills for T&L support; increasing emphasis on instructional design; delineating the skills, knowledge, business acumen, and political savvy for success; role of inter-institutional communities of practices and consortia; career-oriented professional development planning)

FACULTY DEVELOPMENT (Incentivizing faculty innovation; new roles for faculty and those who support them; evidence of impact on student learning/engagement of faculty development programs; faculty development intersections with learning analytics; engagement with student success)

GAMIFICATION OF LEARNING (Gamification designs for course activities; adaptive approaches to gamification; alternate reality games; simulations; technological implementation options for faculty)

INSTRUCTIONAL DESIGN (Skills and competencies for designers; integration of technology into the profession; role of data in design; evolution of the design profession (here previous blog postings on this issue: http://blog.stcloudstate.edu/ims/2017/10/04/instructional-design-3/); effective leadership and collaboration with faculty)

INTEGRATED PLANNING AND ADVISING FOR STUDENT SUCCESS (Change management and campus leadership; collaboration across units; integration of technology systems and data; dashboard design; data visualization (here previous blog postings on this issue: http://blog.stcloudstate.edu/ims?s=data+visualization); counseling and coaching advising transformation; student success analytics)

LEARNING ANALYTICS (Leveraging open data standards; privacy and ethics; both faculty and student facing reports; implementing; learning analytics to transform other services; course design implications)

LEARNING SPACE DESIGNS (Makerspaces; funding; faculty development; learning designs across disciplines; supporting integrated campus planning; ROI; accessibility/UDL; rating of classroom designs)

MICRO-CREDENTIALING AND DIGITAL BADGING (Design of badging hierarchies; stackable credentials; certificates; role of open standards; ways to publish digital badges; approaches to meta-data; implications for the transcript; personalized learning transcripts and blockchain technology (here previous blog postings on this issue: http://blog.stcloudstate.edu/ims?s=blockchain))

MOBILE LEARNING (Curricular use of mobile devices (here previous blog postings on this issue: http://blog.stcloudstate.edu/ims/2015/09/25/mc218-remodel/); innovative curricular apps; approaches to use in the classroom; technology integration into learning spaces; BYOD issues and opportunities)

MULTI-DIMENSIONAL TECHNOLOGIES (Virtual, augmented, mixed, and immersive reality; video walls; integration with learning spaces; scalability, affordability, and accessibility; use of mobile devices; multi-dimensional printing and artifact creation)

NEXT-GENERATION DIGITAL LEARNING ENVIRONMENTS AND LMS SERVICES (Open standards; learning environments architectures (here previous blog postings on this issue: http://blog.stcloudstate.edu/ims/2017/03/28/digital-learning/); social learning environments; customization and personalization; OER integration; intersections with learning modalities such as adaptive, online, etc.; LMS evaluation, integration and support)

ONLINE AND BLENDED TEACHING AND LEARNING (Flipped course models; leveraging MOOCs in online learning; course development models; intersections with analytics; humanization of online courses; student engagement)

OPEN EDUCATION (Resources, textbooks, content; quality and editorial issues; faculty development; intersections with student success/access; analytics; licensing; affordability; business models; accessibility and sustainability)

PRIVACY AND SECURITY (Formulation of policies on privacy and data protection; increased sharing of data via open standards for internal and external purposes; increased use of cloud-based and third party options; education of faculty, students, and administrators)

WORKING WITH EMERGING LEARNING TECHNOLOGY (Scalability and diffusion; effective piloting practices; investments; faculty development; funding; evaluation methods and rubrics; interoperability; data-driven decision-making)

+++++++++++
learning and teaching in this IMS blog
http://blog.stcloudstate.edu/ims?s=teaching+and+learning

Plagiarism 101

prepare your students – avoid plagiarism

DISCOURAGING & DETECTING PLAGIARISM

http://citl.illinois.edu/citl-101/teaching-learning/resources/classroom-environment/discouraging-detecting-plagiarism

Concepts covered:

  • concepts of plagiarism (http://www.wpacouncil.org/node/9; https://owl.english.purdue.edu/owl/resource/589/05/; tutorial)
  • intellectual property
  • copyright
  • collaboration
  • fair dealing

Teach students how to quote, paraphrase (https://youtu.be/MiL4H09v0gU), and cite correctly:

  • Remind students of available resources, such as consulting with the faculty member, TAs, librarians, and the writing center.
  • Exemplify academic integrity in class by citing sources on handouts and during lectures.
  • Inform students that you will randomly check their citations.

Rubrics to help avoid Plagiarism:

http://wehs.westex.libguides.com/content.php?pid=345788&sid=3018138

Free Plagiarism checker:

https://www.paperrater.com/plagiarism_checker

https://www.grammarly.com/plagiarism-checker

++++++++++++
more on plagiarism in this IMS blog
http://blog.stcloudstate.edu/ims?s=plagiarism

measuring library outcomes and value

THE VALUE OF ACADEMIC LIBRARIES
A Comprehensive Research Review and Report. Megan Oakleaf

http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/val_report.pdf

Librarians in universities, colleges, and community colleges can establish, assess, and link academic library outcomes to institutional outcomes related to the following areas: student enrollment, student retention and graduation rates, student success, student achievement, student learning, student engagement, faculty research productivity, faculty teaching, service, and overarching institutional quality.

Assessment management systems help higher education educators, including librarians, manage their outcomes, record and maintain data on each outcome, facilitate connections to similar outcomes throughout an institution, and generate reports.

Assessment management systems are helpful for documenting progress toward strategic/organizational goals, but their real strength lies in managing learning outcomes assessments.

To determine the impact of library interactions on users, libraries can collect data on how individual users engage with library resources and services.

Increase library impact on student enrollment.

p. 13-14 Improved student retention and graduation rates. High-impact practices include: first-year seminars and experiences, common intellectual experiences, learning communities, writing-intensive courses, collaborative assignments and projects, undergraduate research, diversity/global learning, service learning/community-based learning, internships, capstone courses and projects.

p. 14

Libraries support students’ ability to do well in internships, secure job placements, earn salaries, gain acceptance to graduate/professional schools, and obtain marketable skills.
Librarians can investigate correlations between student library interactions and their GPA, as well as conduct test item audits of major professional/educational tests to determine correlations between library services or resources and specific test items.
p. 15 Review course content, readings, reserves, and assignments.
Track and increase library contributions to faculty research productivity.
Continue to investigate library impact on faculty grant proposals and funding, a means of generating institutional income. Librarians contribute to faculty grant proposals in a number of ways.
Demonstrate and improve library support of faculty teaching.
p. 20 Internal focus: ROI – library value = perceived benefits / perceived costs
production of a commodity – value = quantity of commodity produced × price per unit of commodity
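
Read as arithmetic, the two internal-focus formulas above are straightforward; this sketch plugs in hypothetical figures (assumed, not from the Oakleaf report) just to show the shape of the calculation:

```python
# Hypothetical, illustrative figures only -- not from the Oakleaf report.

# Internal focus: library value as a benefit/cost ratio
perceived_benefits = 1_200_000   # e.g., estimated value of services used (assumed)
perceived_costs = 800_000        # e.g., library budget attributed to those services (assumed)
roi_ratio = perceived_benefits / perceived_costs
print(f"Library value (benefits/costs): {roi_ratio:.2f}")

# Commodity view: value = quantity produced x price per unit
items_delivered = 50_000         # e.g., article downloads supplied (assumed)
price_per_item = 30.0            # e.g., market price to obtain the same item elsewhere (assumed)
commodity_value = items_delivered * price_per_item
print(f"Commodity value: ${commodity_value:,.0f}")
```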
p. 21 External focus
a fourth definition of value focuses on library impact on users. It asks, “What is the library trying to achieve? How can librarians tell if they have made a difference?” In universities, colleges, and community colleges, libraries impact learning, teaching, research, and service. A main method for measuring impact is to “observe what the [users] are actually doing and what they are producing as a result”
A fifth definition of value is based on user perceptions of the library in relation to competing alternatives. A related definition is “desired value” or “what a [user] wants to have happen when interacting with a [library] and/or using a [library’s] product or service” (Flint, Woodruff and Fisher Gardial 2002) . Both “impact” and “competing alternatives” approaches to value require libraries to gain new understanding of their users’ goals as well as the results of their interactions with academic libraries.
p. 23 Increasingly, academic library value is linked to service, rather than products. Because information products are generally produced outside of libraries, library value is increasingly invested in service aspects and librarian expertise.
service delivery supported by librarian expertise is an important library value.
p. 25 methodology based only on literature? weak!
p. 26 review and analysis of the literature: language and literature are old (e.g. educational administrators vs ed leaders).
Government often sees higher education as unresponsive to these economic demands. Other stakeholder groups (students, parents, communities, employers, and graduate/professional schools) expect higher education to make impacts in ways that are not primarily financial.

p. 29

Because institutional missions vary (Keeling, et al. 2008, 86; Fraser, McClure and Leahy 2002, 512), the methods by which academic libraries contribute value vary as well. Consequently, each academic library must determine the unique ways in which they contribute to the mission of their institution and use that information to guide planning and decision making (Hernon and Altman, Assessing Service Quality 1998, 31). For example, the University of Minnesota Libraries has rewritten their mission and vision to increase alignment with their overarching institution's goals and emphasis on strategic engagement (Lougee 2009). Allow institutional missions to guide library assessment.
Assessment vs. Research

In community colleges, colleges, and universities, assessment is about defining the purpose of higher education and determining the nature of quality (Astin 1987).

Academic libraries serve a number of purposes, often to the point of being overextended.

Assessment "strives to know…what is" and then uses that information to change the status quo (Keeling, et al. 2008, 28); in contrast, research is designed to test hypotheses. Assessment focuses on observations of change; research is concerned with the degree of correlation or causation among variables (Keeling, et al. 2008, 35). Assessment "virtually always occurs in a political context," while research attempts to be apolitical (Upcraft and Schuh 2002, 19).
p. 31 Assessment seeks to document observations, but research seeks to prove or disprove ideas. Assessors have to complete assessment projects, even when there are significant design flaws (e.g., resource limitations, time limitations, organizational contexts, design limitations, or political contexts); whereas researchers can start over (Upcraft and Schuh 2002, 19). Assessors cannot always attain "perfect" studies, but must make do with "good enough" (Upcraft and Schuh 2002, 18). Of course, assessments should be well planned, be based on clear outcomes (Gorman 2009, 9-10), and use appropriate methods (Keeling, et al. 2008, 39); but they "must be comfortable with saying 'after' as well as 'as a result of'…experiences" (Keeling, et al. 2008, 35).

Two multiple-measure approaches are most significant for library assessment: 1) triangulation, "where multiple methods are used to find areas of convergence of data from different methods with an aim of overcoming the biases or limitations of data gathered from any one particular method" (Keeling, et al. 2008, 53), and 2) complementary mixed methods, which "seek to use data from multiple methods to build upon each other by clarifying, enhancing, or illuminating findings between or among methods" (Keeling, et al. 2008, 53).
p. 34 Academic libraries can help higher education institutions retain and graduate students, a keystone part of institutional missions (Mezick 2007, 561), but the challenge lies in determining how libraries can contribute and then document their contribution.
p. 35 Student Engagement: In recent years, academic libraries have been transformed to provide "technology and content ubiquity" as well as individualized support.
My note: I read "technology and content ubiquity" as digital literacy / metaliteracies, where basic technology instructional sessions (everything that IMS has offered for years) are included, but this library still clings to information literacy only.
National Survey of Student Engagement (NSSE) http://nsse.indiana.edu/
http://nsse.indiana.edu/2017_Institutional_Report/pdf/NSSE17%20Snapshot%20%28NSSEville%20State%29.pdf
p. 37 Student Learning
In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194). This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007). In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194).
p. 38 For librarians, the main content area of student learning is information literacy; however, they are not alone in their interest in student information literacy skills (Oakleaf, Are They Learning? 2011).
My note: Yep. It was, 20 years ago. Metaliteracies is now.
p. 41 surrogates for student learning in Table 3.
p. 42 strategic planning for learning:
According to Kantor, the university library "exists to benefit the students of the educational institution as individuals" (Library as an Information Utility 1976, 101). In contrast, academic libraries tend to assess learning outcomes using groups of students.
p. 45 Assessment Management Systems
Tk20
Each assessment management system has a slightly different set of capabilities. Some guide outcomes creation, some develop rubrics, some score student work, or support student portfolios. All manage, maintain, and report assessment data
p. 46 faculty teaching
However, as online collections grow and discovery tools evolve, that role has become less critical (Schonfeld and Housewright 2010; Housewright and Schonfeld, Ithaka's 2006 Studies of Key Stakeholders 2008, 256). Now, libraries serve as research consultants, project managers, technical support professionals, purchasers, and archivists (Housewright, Themes of Change 2009, 256; Case 2008). Librarians can count citations of faculty publications (Dominguez 2005).

+++++++++++++

Tenopir, C. (2012). Beyond usage: measuring library outcomes and value. Library Management, 33(1/2), 5-13.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d70921798%26site%3dehost-live%26scope%3dsite

Methods that can be used to measure the value of library products and services (Oakleaf, 2010; Tenopir and King, 2007) fall into three main categories (a minimal sketch of an implicit-value tally follows the list):

  1. Implicit value. Measuring usage through downloads or usage logs provides an implicit measure of value. It is assumed that because libraries are used, they are of value to the users. Usage of e-resources is relatively easy to measure on an ongoing basis and is especially useful in collection development decisions and comparison of specific journal titles or use across subject disciplines. Usage measures do not show purpose, satisfaction, or outcomes of use (or whether what is downloaded is actually read).
  2. Explicit methods of measuring value include qualitative interview techniques that ask faculty members, students, or others specifically about the value or outcomes attributed to their use of the library collections or services, and surveys or interviews that focus on a specific (critical) incident of use.
  3. Derived values, such as Return on Investment (ROI), use multiple types of data collected on both the returns (benefits) and the library and user costs (investment) to explain value in monetary terms.
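
For the implicit-value category, the kind of measure described above amounts to tallying usage per title. Below is a minimal sketch, assuming a hypothetical CSV export with `journal` and `downloads` columns (the file name and column names are invented for illustration):

```python
import csv
from collections import Counter

# Hypothetical usage log: one row per month per journal, columns "journal" and "downloads".
def downloads_by_journal(path: str) -> Counter:
    totals: Counter = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            totals[row["journal"]] += int(row["downloads"])
    return totals

# Example (assumes a file named usage_log.csv exists in this format):
# totals = downloads_by_journal("usage_log.csv")
# for journal, count in totals.most_common(10):
#     print(f"{journal}: {count}")
```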

++++++++++++++++++
more on ROI in this IMS blog
http://blog.stcloudstate.edu/ims/2014/11/02/roi-of-social-media/

blogging for Confucius Institute

Minutes from the Oct 17 meeting:

++++++++++++++++++

Plan for August 17

Introduce students to the blog idea. Short link to this plan: http://bit.ly/blog4ci

http://blog.stcloudstate.edu/ims/2017/08/17/blogging-for-confucius-institute/

  • Why blog
    • What is social media, when SM started
    • What is a blog, when blogs started
    • Why blog
      • Blogging vs microblogging

http://blog.stcloudstate.edu/ims/2015/12/31/social-media-and-the-devaluation/
http://blog.stcloudstate.edu/ims/2016/01/01/blog-future/

+++++++++++++++++++++++++++++++++++++++++++++++
handout on basic functions with your blog

+++++++++++++++++++++++++++++++++++++++++++++++

 

 

next gen digital learning environment

Updating the Next Generation Digital Learning Environment for Better Student Learning Outcomes

a learning management system (LMS) is never the solution to every problem in education. Edtech is just one part of the whole learning ecosystem and student experience.

Therefore, the next generation digital learning environment (NGDLE), as envisioned by EDUCAUSE in 2015 …  Looking at the NGDLE requirements from an LMS perspective, I view the NGDLE as being about five areas: interoperability; personalization; analytics, advising, and learning assessment; collaboration; accessibility and universal design.

Interoperability

  • Content can easily be exchanged between systems.
  • Users are able to leverage the tools they love, including discipline-specific apps.
  • Learning data is available to trusted systems and people who need it.
  • The learning environment is “future proof” so that it can adapt and extend as the ecosystem evolves.

Personalization

  • The learning environment reflects individual preferences.
  • Departments, divisions, and institutions can be autonomous.
  • Instructors teach the way they want and are not constrained by the software design.
  • There are clear, individual learning paths.
  • Students have choice in activity, expression, and engagement.

Analytics, Advising, and Learning Assessment

  • Learning analytics helps to identify at-risk students, course progress, and adaptive learning pathways.
  • The learning environment enables integrated planning and assessment of student performance.
  • More data is made available, with greater context around the data.
  • The learning environment supports platform and data standards.

Collaboration

  • Individual spaces persist after courses and after graduation.
  • Learners are encouraged as creators and consumers.
  • Courses include public and private spaces.

Accessibility and Universal Design

  • Accessibility is part of the design of the learning experience.
  • The learning environment enables adaptive learning and supports different types of materials.
  • Learning design includes measurement rubrics and quality control.

The core analogy used in the NGDLE paper is that each component of the learning environment is a Lego brick:

  • The days of the LMS as a “walled garden” app that does everything is over.
  • Today many kinds of amazing learning and collaboration tools (Lego bricks) should be accessible to educators.
  • We have standards that let these tools (including an LMS) talk to each other. That is, all bricks share some properties that let them fit together.
  • Students and teachers sign in once to this “ecosystem of bricks.”
  • The bricks share results and data.
  • These bricks fit together; they can be interchanged and swapped at will, with confidence that the learning experience will continue uninterrupted.

Any “next-gen” attempt to completely rework the pedagogical model and introduce a “mash-up of whatever” to fulfil this model would fall victim to the same criticisms levied at the LMS today: there is too little time and training to expect faculty to figure out the nuances of implementation on their own.

The Lego metaphor works only if we’re talking about “old school” Lego design — bricks of two, three, and four-post pieces that neatly fit together. Modern edtech is a lot more like the modern Lego. There are wheels and rocket launchers and belts and all kinds of amazing pieces that work well with each other, but only when they are configured properly. A user cannot simply stick together different pieces and assume they will work harmoniously in creating an environment through which each student can be successful.

As the NGDLE paper states: “Despite the high percentages of LMS adoption, relatively few instructors use its more advanced features — just 41% of faculty surveyed report using the LMS ‘to promote interaction outside the classroom.'”

But this is what the next generation LMS is good at: being a central nervous system — or learning hub — through which a variety of learning activities and tools are used. This is also where the LMS needs to go: bringing together and making sense of all the amazing innovations happening around it. This is much harder to do, perhaps even impossible, if all the pieces involved are just bricks without anything to orchestrate them or to weave them together into a meaningful, personal experience for achieving well-defined learning outcomes.

  • Making a commitment to build easy, flexible, and smart technology
  • Working with colleges and universities to remove barriers to adopting new tools in the ecosystem
  • Standardizing the vetting of accessibility compliance (the Strategic Nonvisual Access Partner Program from the National Federation of the Blind is a great start)
  • Advancing standards for data exchange while protecting individual privacy
  • Building integrated components that work with the institutions using them — learning quickly about what is and is not working well and applying those lessons to the next generation of interoperability standards
  • Letting people use the tools they love [SIC] and providing more ways for nontechnical individuals (including students) to easily integrate new features into learning activities

My note: something that just refuses to be accepted at SCSU.
Technologists are often very focused on the technology, but the reality is that the more deeply and closely we understand the pedagogy and the people in the institutions — students, faculty, instructional support staff, administrators — the better suited we are to actually making the tech work for them.

++++++++++++++++++++++

Under the Hood of a Next Generation Digital Learning Environment in Progress

The challenge is that although 85 percent of faculty use a campus learning management system (LMS), a recent Blackboard report found that, out of 70,000 courses across 927 North American institutions, 53 percent of LMS usage was classified as supplemental (content-heavy, low interaction) and 24 percent as complementary (one-way communication via content/announcements/gradebook). Only 11 percent were characterized as social, 10 percent as evaluative (heavy use of assessment), and 2 percent as holistic (balanced use of all previous). Our FYE course required innovating beyond the supplemental course-level LMS to create a more holistic cohort-wide NGDLE in order to fully support the teaching, learning, and student success missions of the program. The key design goals for our NGDLE were to (a hypothetical sketch of the second and third goals follows the list):

  • Create a common platform that could deliver a standard curriculum and achieve parity in all course sections using existing systems and tools and readily available content
  • Capture, store, and analyze any generated learner data to support learning assessment, continuous program improvement, and research
  • Develop reports and actionable analytics for administrators, advisors, instructors, and students
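
To make the second and third goals concrete, here is one hypothetical way captured learner data could be rolled up into a simple, actionable report for advisors. The record fields and thresholds are invented for illustration and are not the program's actual design:

```python
from dataclasses import dataclass

# Hypothetical learner-activity record aggregated from the NGDLE's tools.
@dataclass
class LearnerActivity:
    student: str
    logins_last_week: int
    assignments_submitted: int
    assignments_due: int

def at_risk(records: list[LearnerActivity],
            min_logins: int = 2,
            min_submission_rate: float = 0.5) -> list[str]:
    """Flag students for advisors when engagement drops below assumed thresholds."""
    flagged = []
    for r in records:
        rate = r.assignments_submitted / max(r.assignments_due, 1)
        if r.logins_last_week < min_logins or rate < min_submission_rate:
            flagged.append(r.student)
    return flagged

# Hypothetical usage
cohort = [
    LearnerActivity("A. Student", logins_last_week=0, assignments_submitted=1, assignments_due=4),
    LearnerActivity("B. Student", logins_last_week=5, assignments_submitted=4, assignments_due=4),
]
print(at_risk(cohort))  # ['A. Student']
```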

++++++++++++
more on LMS in this blog
http://blog.stcloudstate.edu/ims?s=LMS

more on learning outcomes in this IMS blog
http://blog.stcloudstate.edu/ims?s=learning+outcomes

online teaching

A Return to Best Practices for Teaching Online

10/25/16

https://campustechnology.com/Articles/2016/10/25/A-Return-to-Best-Practices-for-Teaching-Online.aspx

Judith Boettcher's book, The Online Teaching Survival Guide (second edition, Jossey-Bass 2016). In chapter three, "Best Practices for Teaching Online: Ten Plus Four," you and your co-author Rita-Marie Conrad provide a list of 14 best practices for teaching online. How can these best practices help faculty?

https://books.google.com/books?id=Z5PqDAAAQBAJ&lpg=PP1&dq=Boettcher%2C%20The%20Online%20Teaching%20Survival%20Guide&pg=PR9#v=onepage&q=Boettcher,%20The%20Online%20Teaching%20Survival%20Guide&f=false

When faculty are first asked to teach online, most do not have a lot of time to prepare. They are seldom given much coaching, mentoring, or support; often they are just kind of thrown into it.

Personalized learning means that while all students master core concepts, students ideally practice increasingly difficult use of those core concepts in contexts and settings desired by individual students.

The Learning Experiences Framework graphic

We really need to step up to much more effective use of rubrics. Rubrics can define intellectual outcomes in several key areas, such as critical thinking, for example.

great course design is at the core of creating great online learning experiences. We need to ensure that the desired learning outcomes, the course experiences, and the ways we gather evidences of learning are all congruent, one with the other. Course experiences should help students develop the knowledge and expertise that they desire, and the evidences of learning we require of students should be meaningful and purposeful and where possible, personalized and customized.

+++++++++++++++++

more on online teaching in this IMS blog:

http://blog.stcloudstate.edu/ims?s=online+teaching
