Posts Tagged ‘Assessment and Evaluation’

peer to peer curation

Peer-to-Peer Curation Activities Boost Higher-Order Thinking

https://www.kritik.io/resources/peer-to-peer-curation-activities-boost-higher-order-thinking

Most professors we hear from want to assess their students at higher levels; if their assessments kept students at the lowest level of Bloom’s Taxonomy, they say, they would not feel rewarded as educators.

However, assessment is by far the most labour-intensive part of teaching. Assessment plans and rubrics must be prepped. Test questions must be written. Every student needs a mark, personalized feedback and a road-map for improvement. The larger the class, the more work for the instructor. Add in formative assessments like weekly assignments and exercises that precipitate subtle, ongoing tweaks to the syllabus, and it’s easy to see why many faculty opt to stick with what they know: an accumulation of easy-to-grade summative assessments that almost inevitably rely upon memorization and the most basic understanding of concepts.

Curation Activities can be one of the most effective teaching strategies to help students compare what they’re learning in the classroom with real-world examples, and gain insight into how the two relate to each other.

Curation Activities can apply to all disciplines, such as Business, Arts, or Sciences.

When students explain what they’ve learned to other students, they help consolidate and strengthen connections to those concepts while simultaneously engaging in active learning. Find more project ideas here.

By actively engaging with their classmates and applying their own evaluative skills to the feedback they’re delivering to their peers, students develop lifelong critical thinking and creative analysis skills. Additionally, peer assessment has been shown to get students faster feedback from diverse sources, to increase metacognition, independence and self-reflection, and to improve student learning. These are all important skills that provide value far beyond the classroom. More details on the benefits of peer assessment here.

++++++++++++++
more on curation in this IMS blog
https://blog.stcloudstate.edu/ims?s=curation

student evaluations of faculty

Exploring the personal and professional factors associated with student evaluations of tenure-track faculty

Dakota Murray, Clara Boothby, Huimeng Zhao, Vanessa Minik, Nicolas Bérubé, Vincent Larivière, Cassidy R. Sugimoto 

Published: June 3, 2020 https://doi.org/10.1371/journal.pone.0233515

Tenure-track faculty members in the United States are evaluated on their performance in both research and teaching. In spite of accusations of bias and invalidity, student evaluations of teaching have dominated teaching evaluation at U.S. universities. However, studies on the topic have tended to be limited to particular institutional and disciplinary contexts. Moreover, in spite of the idealistic assumption that research and teaching are mutually beneficial, few studies have examined the link between research performance and student evaluations of teaching. In this study, we conduct a large-scale exploratory analysis of the factors associated with student evaluations of teachers, controlling for heterogeneous institutional and disciplinary contexts. We source public student evaluations of teaching from RateMyProfessor.com and information regarding career and contemporary research performance indicators from the company Academic Analytics. The factors most associated with higher student ratings were the attractiveness of the faculty and the student’s interest in the class; the factors most associated with lower student ratings were course difficulty and whether student comments mentioned an accent or a teaching assistant. Moreover, faculty tended to be rated more highly when they were young, male, White, in the Humanities, and held the rank of full professor. We observed little to no evidence of any relationship, positive or negative, between student evaluations of teaching and research performance. These results shed light on what factors relate to student evaluations of teaching across diverse contexts and contribute to the continuing discussion of teaching evaluation and faculty assessment.
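As a purely illustrative sketch (not the authors’ actual code or data), the kind of analysis the abstract describes — student ratings modeled against faculty and course factors while controlling for institutional and disciplinary context — might look roughly like this in Python; the column names and the CSV file here are assumptions made for the example:

# Hypothetical sketch of an exploratory model of student ratings.
# Column names and the data file are placeholders, not the study's real data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ratings.csv")  # placeholder dataset of evaluations

# Ordinary least squares with categorical controls for discipline and institution.
model = smf.ols(
    "rating ~ difficulty + attractiveness + interest + gender + rank"
    " + C(discipline) + C(institution)",
    data=df,
).fit()
print(model.summary())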

game based learning

https://www.edsurge.com/news/2016-01-20-game-based-learning-has-practical-applications-for-nontraditional-learners

The study comes from Muzzy Lane Software, a Newbury, Mass.-based game development platform.

The study, “The Potential for Game-based Learning to Improve Outcomes for Nontraditional Students,” is based on research funded by the Bill & Melinda Gates Foundation and includes insights from a survey of 1,700 students, 11 in-person focus groups and interviews with teachers and school leaders. Educators said games could be especially helpful in several areas: auto-assessing whether students can apply what they’ve learned, building employment competencies and improving study skills.

Definition: Muzzy Lane characterizes nontraditional students as learners who meet two of the following criteria:
– returning to school after pausing their education,
– balancing education with work and family responsibilities,
– lower-income,
– English as a second language learners, or
– the first members of their families to attend college.

++++++++++++++
more about game based learning in this IMS blog
https://blog.stcloudstate.edu/ims?s=game+based+learning

microcredentialing and students’ abilities

Badge breakthroughs

Micro-credentials awarded for in-demand skills give employers deeper detail about a student’s abilities.
Matt Zalaznick, June 7, 2017
While employers increasingly demand that new hires have college degrees, the transcripts supporting those hard-earned credentials are no longer the most informative tool students have to exhibit their skills.

An estimated 1 in 5 institutions issue digital badges, which can be posted to social media, stored on digital portfolios and displayed by other specially designed platforms. When clicked on, the badge lists a range of skills a student has demonstrated beyond grades.

“The reason they’re taking off in higher education is most employers are not getting the information they need about people emerging from higher ed, with previous tools we’ve been using,” says Jonathan Finkelstein, founder and CEO of the widely used badging platform Credly. “The degree itself doesn’t get to the level of describing particular competencies.”
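As a purely illustrative sketch, the machine-readable metadata such a badge might expose when clicked could look like the following; the field names, values, and URLs are assumptions made for this example (loosely inspired by the open badge idea), not Credly’s actual schema or any particular institution’s badge:

# Hypothetical example of the skills metadata a digital badge might carry.
# All names, values, and URLs below are placeholders for illustration.
import json

badge = {
    "name": "Research Skills: Source Evaluation",
    "issuer": "Example University Library",
    "issued_on": "2017-06-07",
    "criteria": "https://example.edu/badges/source-evaluation/criteria",
    "evidence": "https://example.edu/portfolios/student123/project",
    "skills": ["critical thinking", "information literacy", "communication"],
}

# Serialize so the badge can be embedded in an e-portfolio or shared on social media.
print(json.dumps(badge, indent=2))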

For instance, a Notre Dame student who goes on a trip to Ecuador to build bridges can earn a badge for mastering the calculations involved in the construction, says G. Alex Ambrose, associate program director of e-portfolio assessment at the Indiana university’s Kaneb Center for Teaching & Learning.

Students can be pretty certain when they have passed calculus or creative writing, but they don’t always recognize when they’ve excelled in demonstrating soft skills such as critical thinking, communication and work ethic, says MJ Bishop, director of the University System of Maryland’s William E. Kirwan Center for Academic Innovation.

Badges have been most popular in the school of education—including with student teachers who, in turn, have created badges for the elementary and secondary classrooms where they’ve apprenticed, says Anna Catterson, the university’s educational technology director.

The campus library is another badging hotspot. Students there have earned microcredentials for research, 3D printing and other skills. These badges are being shared on LinkedIn and other platforms to obtain internships and scholarships.

The university runs faculty training sessions on badging and has established a review process for when faculty submit ideas for microcredentials.

One pothole to avoid is trying to create a schoolwide badge that’s standardized across a wide range of courses or majors. This can force the involvement of committees that can bog down the process, so it’s better to start with skills within single courses, says Ambrose at Notre Dame.

When creating a badge, system faculty have to identify a business or industry interested in that credential.

Badges that have the backing of a college or university are more impressive to job recruiters than are completion certificates from skill-building websites like Lynda.com.

Students won’t be motivated to earn a badge that’s a stock blue ribbon downloaded off the internet. Many institutions put a lot of work into the design, and this can include harnessing expertise from the marketing department and graphic designers.

+++++++++++
more on micro-credentialing in this IMS blog
https://blog.stcloudstate.edu/ims?s=microcredentialing

assessment and evaluation of serious games

Learners Assessment and Evaluation in Serious Games: Approaches and Techniques Review

Ibtissem Daoudi, Erwan Tranvouez, Raoudha Chebil, Bernard Espinasse, and Wided Lejouad Chaari
https://www.academia.edu/37625162/Learners_Assessment_and_Evaluation_in_Serious_Games_Approaches_and_Techniques_Review
An overview of crisis management serious games (SG), focused on their learners’ assessment and evaluation capabilities. This synthesis can help researchers and game creators by highlighting the main criteria and techniques for learners’ assessment and evaluation. The described benefits and limitations of each technique may facilitate the choice of the most adequate way to evaluate a particular SG.

+++++++++
more on serious games in this IMS blog
https://blog.stcloudstate.edu/ims?s=serious+games

Embedded Librarianship in Online Courses

Embedded Librarianship in Online Courses

Mimi O’Malley, October workshop; inquiries: inquiries@libraryjuiceacademy.com
http://libraryjuiceacademy.com/
https://libraryjuiceacademy.com/news/?p=559
This class will start with simple ways librarians may embed their skills remotely, beginning with the LMS, especially through the use of portal tabs, blocks, eReserves, knowledge bases, and student/faculty orientations. We’ll then move on to discussing how to bring the traditional face-to-face BI session (which librarians know so well) into the online class through the use of team teaching, guest lecturing, and conducting synchronous workshops. In the third week, we’ll explore how the librarian can become more influential in online course design and development. The session concludes with an examination of the ways librarians can evaluate whether or not their virtual efforts are impacting student access to library resources as well as possible learning outcomes.
++++++++++
more on embedded librarianship in this IMS blog
https://blog.stcloudstate.edu/ims?s=embedded+librarian

grading for art faculty

Meaningful Grading: A Guide for Faculty in the Arts

Natasha Haugnes, Hoag Holmgren, and Martin Springborg

https://wvupressonline.com/node/759#2

Martin’s own LinkedIn post: https://www.linkedin.com/feed/update/activity:6425014893287657472/

College and university faculty in the arts (visual, studio, language, music, design, and others) regularly grade and assess undergraduate student work but often with little guidance or support. As a result, many arts faculty, especially new faculty, adjunct faculty, and graduate student instructors, feel bewildered and must “reinvent the wheel” when grappling with the challenges and responsibilities of grading and assessing student work.

Meaningful Grading: A Guide for Faculty in the Arts enables faculty to create and implement effective assessment methodologies—research based and field tested—in traditional and online classrooms. In doing so, the book reveals how the daunting challenges of grading in the arts can be turned into opportunities for deeper student learning, increased student engagement, and an enlivened pedagogy.

Measuring Learning Outcomes of New Library Initiatives

International Conference on Qualitative and Quantitative Methods in Libraries 2018 (QQML2018)

conf@qqml.net

Where: Cultural Centre Of Chania
ΠΝΕΥΜΑΤΙΚΟ ΚΕΝΤΡΟ ΧΑΝΙΩΝ

https://goo.gl/maps/8KcyxTurBAL2

also live broadcast at https://www.facebook.com/InforMediaServices/videos/1542057332571425/

When: May 24, 12:30PM-2:30PM (local time; 4:30AM-6:30AM, Chicago Central)


Live broadcasts from some of the sessions:

Here is a link to Sebastian Bock’s presentation:
https://drive.google.com/file/d/1jSOyNXQuqgGTrhHIapq0uxAXQAvkC6Qb/view

Information literacy skills and college students, from Jade Geary

Session 1:
http://qqml.org/wp-content/uploads/2017/09/SESSION-Miltenoff.pdf

Session Title: Measuring Learning Outcomes of New Library Initiatives
Coordinator: Professor Plamen Miltenoff, Ph.D., MLIS, St. Cloud State University, USA
Contact: pmiltenoff@stcloudstate.edu
Scope & rationale: The advent of new technologies, such as virtual/augmented/mixed reality, and new pedagogical concepts, such as gaming and gamification, steers academic libraries into uncharted territories. There is not yet sufficient compiled research and, respectively, proof to justify financial and workforce investment in such endeavors. On the other hand, dwindling resources for education press administrations to demand justification for new endeavors. As has been established already, technology does not teach; teachers do, and a growing body of literature questions the impact of educational technology on educational outcomes. This session seeks to bring together presentations and discussion, both qualitative and quantitative research, related to new pedagogical and technological endeavors in academic libraries as part of education on campus. By experimenting with new technologies, such as 360-degree video, and new pedagogical approaches, such as gaming and gamification, does the library improve learning? By experimenting with new technologies and pedagogical approaches, does the library help campus faculty adopt these methods and improve their teaching? How can results be measured and demonstrated?

Conference program

http://qqml.org/wp-content/uploads/2017/09/7.5.2018-programme_final.pdf

More information and bibliography:

https://www.academia.edu/Documents/in/Videogame_and_Virtual_World_Technologies_Serious_Games_applications_in_Education_and_Training

https://www.academia.edu/Documents/in/Measurement_and_evaluation_in_education

Social Media:
https://www.facebook.com/QQML-International-Conference-575508262589919/

assessment of learning outcomes

The Misguided Drive to Measure ‘Learning Outcomes’
