May
2022
Digital Literacy for St. Cloud State University
Within these methods you’ll find close to 40 tools and tricks for finding out what your students know while they’re still learning.
edutopia.org/article/7-smart-fast-ways-do-formative-assessment
Entry and exit slips
Exit slips can take lots of forms beyond the old-school pencil and scrap paper. Whether you’re assessing at the bottom of Bloom’s taxonomy or the top, you can use tools like Padlet or Poll Everywhere, or measure progress toward attainment or retention of essential content or standards with tools like Google Classroom’s Question tool, Google Forms with Flubaroo, and Edulastic.
Low-stakes quizzes and polls: If you want to find out whether your students really know as much as you think they know, polls and quizzes created with Socrative or Quizlet, or in-class games and tools like Quizalize, Kahoot, FlipQuiz, Gimkit, Plickers, and Flippity, can give you a quick read on what they’ve actually retained.
Dipsticks: So-called alternative formative assessments are meant to be as easy and quick as checking the oil in your car, so they’re sometimes referred to as dipsticks. These are brief tasks, such as asking students to summarize a key idea in a sentence or two.
Interview assessments: If you want to dig a little deeper into students’ understanding of content, try discussion-based assessment methods. Casual chats with students in the classroom can help them feel at ease even as you get a sense of what they know, and you may find that five-minute interview assessments work well. Discussion can also happen through tech tools like Flipgrid, Explain Everything, or Seesaw.
Methods that incorporate art: Consider using visual art, photography, or videography as an assessment tool. Whether students draw, create a collage, or sculpt, you may find that the assessment helps them synthesize their learning.
Misconceptions and errors: Sometimes it’s helpful to see if students understand why something is incorrect or why a concept is hard. Ask students to explain the “muddiest point” in the lesson—the place where things got confusing or particularly difficult or where they still lack clarity. Or do a misconception check: present a common misunderstanding and ask students to explain why it’s wrong.
Self-assessment: Don’t forget to consult the experts—the kids. Often you can give your rubric to your students and ask them to assess their own work against it.
A comment below that posting:
Margaret Teall: Maybe if they’re experts on assessments. But…are they?
https://www.frontiersin.org/articles/10.3389/feduc.2021.673594/full
Readiness for teaching online has been defined as the qualities or predispositions of an instructor that exemplify teaching high-quality online courses (Palloff and Pratt, 2011). Mental and physical preparedness (Cutri and Mena, 2020), a willingness to create active, collaborative learning environments that foster a sense of community (Palloff and Pratt, 2011), and acceptance of online teaching (Gibson et al., 2008) also demonstrate readiness for the online teaching and learning modality. An inability or unwillingness to adopt student-focused approaches and the perception that online courses provide low-quality learning environments (Gibson et al., 2008) and are not worthwhile (Allen and Seaman, 2015) can be important barriers to the successful transition to teaching online.
Thumbs-up/thumbs-down, hand signals, online polls, and chat boxes have become the new mainstays of formative assessments in virtual classrooms. pic.twitter.com/E5QTOsGBIm
— edutopia (@edutopia) March 2, 2021
+++++++++++++++
more on formative assessment in this IMS blog
https://blog.stcloudstate.edu/ims?s=formative+assessment
https://www.chronicle.com/article/assessment-is-an-enormous-waste-of-time/
The assessment industry is not known for self-critical reflection. Assessors insist that faculty provide evidence that their teaching is effective, but they are dismissive of evidence that their own work is ineffective. They demand data, but they are indifferent to the quality of those data. So it’s not a surprise that the assessment project is built on an unexamined assumption: that learning, especially higher-order learning such as critical thinking, is central to the college experience.
the Lumina Foundation’s Degree Qualifications Profile “provides a qualitative set of important learning outcomes, not quantitative measures such as numbers of credits and grade-point averages, as the basis for awarding degrees.”
In an article in Change, Daniel Sullivan, president emeritus of St. Lawrence University and a senior fellow at the Association of American Colleges & Universities, and Kate McConnell, assistant vice president for research and assessment at the association, describe a project that looked at nearly 3,000 pieces of student work from 14 institutions. They used the critical-thinking and written-communication VALUE rubrics designed by the AAC&U to score the work. They discovered that most college-student work falls in the middle of the rubric’s four-point scale measuring skill attainment.
Richard Arum and Josipa Roksa’s 2011 book, Academically Adrift, used data from the Collegiate Learning Assessment to show that a large percentage of students don’t improve their critical thinking or writing. A 2017 study by The Wall Street Journal used data from the CLA at dozens of public colleges and concluded that the evidence for learning between the first and senior years was so scant that they called it “discouraging.”
The author isn’t suggesting that college is a waste of time or that there is no value in a college education. But before we spend scarce resources and time trying to assess and enhance student learning, shouldn’t we maybe check to be sure that learning is what actually happens in college?
+++++++++++++++
more on assessment in this IMS blog
https://blog.stcloudstate.edu/ims?s=assessment
and critical thinking
https://blog.stcloudstate.edu/ims?s=critical+thinking
Assessment in STEM courses (and how it is integrated in the LMS):
+++++++++++++++
Formative Assessment in Distance Learning https://t.co/QUPWB2sA15 #education #edchat #k12 #remotelearning #assessment #thriveinedu #edcovid #edutwitter #distancelearning #edtech
— Rachelle Dene Poth #ThriveinEDU #AI #ARVR (@Rdene915) April 8, 2020
https://www.edutopia.org/article/formative-assessment-distance-learning
Whether we use synchronous or asynchronous online sessions, whether we call it distance or virtual learning, we’re all challenged to provide meaningful education experiences at a distance as the education world grapples with the impact of Covid-19.
Know your purpose
Collect data over time
Focus on feedback
Check for understanding in synchronous sessions
Leverage personal conversations
Check in on SEL (social and emotional learning)
Make it useful
+++++++++++++++++++
more on formative assessment in this IMS blog
https://blog.stcloudstate.edu/ims?s=formative+assessment
Jones, C., Watkins, F., Williams, J., Lambros, A., Callahan, K., Lawlor, J., … Atkinson, H. (2019). A 360-degree assessment of teaching effectiveness using a structured-videorecorded observed teaching exercise for faculty development. Medical Education Online, 24(1), 1596708. https://doi.org/10.1080/10872981.2019.1596708
enable faculty to receive a detailed 360-degree assessment of their teaching
The faculty in Gerontology and Geriatric Medicine at Wake Forest School of Medicine (WFSM) saw an opportunity to incorporate a focused teaching practicum for faculty within a multiple-specialty faculty development program. 360-degree assessments involve a combination of feedback from subordinates, colleagues, and superiors. 360-degree feedback has been considered an essential tool in transformational leadership because the evaluation process avoids bias through the diversity of viewpoints represented, yet it is rarely applied to teaching assessments. Specifically, we designed a teaching practicum using a Videorecorded Observed Teaching Exercise (VOTE) to provide self-, peer-, and learner assessments of teaching.
Our design of videorecorded microteaching sessions embedded into a faculty development program presents a feasible, well-received model to provide faculty development in teaching and a robust 360-degree assessment of teaching skills.
Two strengths of our program are that it is feasible and reproducible.
In addition, costs for these sessions were low. VOTE video capture costs ranged from $45 – $90 per session depending on the audiovisual capacity of the room used for recording. Costs for this activity included an audiovisual technician who performed the room setup and videorecording. However, a handheld videorecorder or mobile device could be used for these sessions as well.
++++++++++
more on video 360 in this IMS blog
https://blog.stcloudstate.edu/ims?s=360