
rubrics in D2L: from students’ standpoint

One of the difficulties of working with D2L as an instructor is the inability to “see” what students see. D2L does have a Student role, but…

If you are working with rubrics and advertising this feature to your students (please share your rubrics with us!), and your students are perplexed that they don’t see rubrics under

Assessment

Rubrics

as you do, please keep in mind that you need to “connect” your rubric to the dropbox (click “Add Rubrics” under Assessment/Dropbox/Properties/Rubrics). Students will be able to see the rubric only after the dropbox is “open.”

Please let us know if you need more information.

d2l@stcloudstate.edu

Follow us on Twitter: @SCSUtechInstruc | #techworkshop

https://blog.stcloudstate.edu/ims 

Formative Assessment

7 Smart, Fast Ways to Do Formative Assessment

Within these methods you’ll find close to 40 tools and tricks for finding out what your students know while they’re still learning.

edutopia.org/article/7-smart-fast-ways-do-formative-assessment

Entry and exit slips

Exit slips can take lots of forms beyond the old-school pencil and scrap paper. Whether you’re assessing at the bottom of Bloom’s taxonomy or the top, you can use tools like Padlet or Poll Everywhere, or measure progress toward attainment or retention of essential content or standards with tools like Google Classroom’s Question tool, Google Forms with Flubaroo, and Edulastic.

Low-stakes quizzes and polls: If you want to find out whether your students really know as much as you think they know, try polls and quizzes created with Socrative or Quizlet, or in-class games and tools like Quizalize, Kahoot, FlipQuiz, Gimkit, Plickers, and Flippity.

Dipsticks: So-called alternative formative assessments are meant to be as easy and quick as checking the oil in your car, so they’re sometimes referred to as dipsticks. These can be things like asking students to:

  • write a letter explaining a key idea to a friend,
  • draw a sketch to visually represent new knowledge, or
  • do a think, pair, share exercise with a partner.

Interview assessments: If you want to dig a little deeper into students’ understanding of content, try discussion-based assessment methods. Casual chats with students in the classroom can help them feel at ease even as you get a sense of what they know, and you may find that five-minute interview assessments work well.

TAG feedback (Tell a peer something they did well, Ask a thoughtful question, Give a suggestion) works well for peer-to-peer interviews, using tools like Flipgrid, Explain Everything, or Seesaw.

Methods that incorporate art: Consider using visual art, photography, or videography as an assessment tool. Whether students draw, create a collage, or sculpt, you may find that the assessment helps them synthesize their learning.

Misconceptions and errors: Sometimes it’s helpful to see if students understand why something is incorrect or why a concept is hard. Ask students to explain the “muddiest point” in the lesson—the place where things got confusing or particularly difficult or where they still lack clarity. Or do a misconception check: present students with a common misconception and have them explain why it is wrong.

Self-assessment: Don’t forget to consult the experts—the kids. Often you can give your rubric to your students and have them identify their own strengths and weaknesses.

assessment and evaluation of immersive learning

Fegely, A., & Cherner, T. S. (2021). A comprehensive rubric for evaluating EduVR. Journal of Information Technology Education: Research, 20, 137–171. https://doi.org/10.28945/4737
The article addresses a gap in the literature by presenting a criterion-referenced, research-supported rubric for evaluating the quality of educational virtual reality for mobile devices.
++++++++++++++++++++

measuring learning outcomes

https://www.facebook.com/groups/onlinelearningcollective/permalink/746716582625709/

a discussion from the Higher Ed Learning Collective:

In my teaching career I worked at two colleges in Wisconsin, one public and one private. Both have learning outcomes for each course, program/major outcomes for each major, and institutional outcomes for the college (aka employable skills, career essentials, or abilities).
Recently a friend of mine started teaching an online class at a university in a different state, and she kept asking for the learning outcomes for the course. After some back-and-forth emails it turned out that this other state university doesn’t have them.
It blew my mind 🤯. How do they know what the scope and depth of teaching should be in that course? How do they get their accreditation?
I am curious to know whether it is just Wisconsin or selected states/countries where it is common practice to have outcomes. Also, how do you teach without them?

++++++++++++

Go to the AAC&U VALUE Rubrics website and adopt several of the outcomes for the course.
https://www.aacu.org/value-rubrics

+++++++++++++

We have LOs over here in MN.
My guess is that they’re buried deep in some filing cabinet in your friend’s university and most folks just ignore them.
+++++++++++++

Usually the instructor is required to create the outcomes for the class, but they are typically based on meta-outcomes from the department. That’s how it has been at every institution I have worked at. With that said, I have worked with colleagues, full professors with Ph.D.s, who didn’t understand the principle of learning outcomes; unless forced to put them in the syllabus, they either would not write them on their own or, when they did have them, would not follow them. And forget about triangulation of LOs to activities and assessments.
++++++++++++

Accreditors look for program LOs but not at course level. (We learned this after a faculty member was fired for pushing back on LOs on the syllabus, and when the Uni said “SACS requirements”, SACS responded w/“um no, not really…”) Since then, we’ve collected data as part of our assessment plan & can say w/confidence that students don’t read them…

APA 7th

https://www.facebook.com/groups/onlinelearningcollective/permalink/734573083840059/

“As I create and modify my course syllabi, I want to make sure my students use APA 7th ed. when writing their formal assignments. For those of you who also use APA, what do you say in your syllabi? What matters to you with your students giving proper credit to sources, images, or videos? I’m trying to do better and expect better”

There’s a new OER that I used with my students in the fall that introduces them to APA and has examples to work through.

http://blog.stcloudstate.edu/oer/2021/01/15/apa-style-citation-tutorial-7th-edition/

In case this is helpful, my university has a video on using APA. (I haven’t watched it yet.)


I never assume that they learned the format, and I build in an APA workshop. I use the Purdue OWL and go over a sample paper first, then the APA PowerPoint. Then I give them a low-stakes assignment (like a discussion post) to practice.

I talk about documentation more as a convention of their discourse community, not just citations. There is a certain structure and way of writing in APA that, along with citations, represents the values of a particular discourse community. Those are the things that matter to me. (I also get more buy-in from students.)

I was happy to discover that APA now has decent examples available free on their website. So in my instructions to students, I linked to the main page and also to three specific pages with commonly used items, such as online newspaper articles and YouTube videos. Step 1 is providing tools. Step 2 is clearly expressed grade penalties.

I actually don’t say anything in my syllabus. What I do is in my LMS: give them a template and links to the Purdue OWL and other relevant websites. I have also written a “Dr. Kaminski’s APA 7th Ed Guide.” It’s more of my pet peeves and the things students often miss that they should be focusing on. I give a lot of grace on the first (low-stakes) written assignment, with more focus on the APA portion than the actual content. After that, I expect them to have it down.

I say it (and link to resources) in my assignment sheets and have a spot in my rubric to reflect what I am asking of my students.

I post resources to our LMS, mostly the usual suspects (APA, Purdue OWL, etc.). I often add a short video on the bias-free writing chapter because that’s often not covered in their intro research writing courses. For citations, I’m more of a stickler for complete information than for semicolons and whatnot. I don’t feel good about deducting points for anything that students were taught in APA 6 that is different in 7, since we changed the rules on them.

I provide a free workshop at the beginning of the semester to explain the ‘why’ and provide practice. It carries a rather high weighting in our rubrics so…some understanding and ‘free points’ if they use it appropriately.

I have a separate document, referred to in the syllabus, titled “Writing Expectations.” I briefly explain the importance of using APA and the characteristics of academic writing (e.g., paraphrasing, avoiding overuse of direct quotes, and other things I see in student writing). The second page is an APA job aid that shows the basics for citations, reference lists, and leveled headings.

 

+++++++++++++
more on APA 7th edition in this IMS blog
https://blog.stcloudstate.edu/ims?s=apa+7

Assessment Is a Waste of Time?

Assessment Is an Enormous Waste of Time

https://www.chronicle.com/article/assessment-is-an-enormous-waste-of-time/

The assessment industry is not known for self-critical reflection. Assessors insist that faculty provide evidence that their teaching is effective, but they are dismissive of evidence that their own work is ineffective. They demand data, but they are indifferent to the quality of those data. So it’s not a surprise that the assessment project is built on an unexamined assumption: that learning, especially higher-order learning such as critical thinking, is central to the college experience.

The Lumina Foundation’s Degree Qualifications Profile “provides a qualitative set of important learning outcomes, not quantitative measures such as numbers of credits and grade-point averages, as the basis for awarding degrees.”

In an article in Change, Daniel Sullivan, president emeritus of St. Lawrence University and a senior fellow at the Association of American Colleges & Universities, and Kate McConnell, assistant vice president for research and assessment at the association, describe a project that looked at nearly 3,000 pieces of student work from 14 institutions. They used the critical-thinking and written-communication VALUE rubrics designed by the AAC&U to score the work. They discovered that most college-student work falls in the middle of the rubrics’ four-point scale measuring skill attainment.

Richard Arum and Josipa Roksa’s 2011 book, Academically Adrift, used data from the Collegiate Learning Assessment to show that a large percentage of students don’t improve their critical thinking or writing. A 2017 study by The Wall Street Journal used data from the CLA at dozens of public colleges and concluded that the evidence for learning between the first and senior years was so scant that they called it “discouraging.”

The author is not suggesting that college is a waste of time or that there is no value in a college education. But before we spend scarce resources and time trying to assess and enhance student learning, shouldn’t we first check to be sure that learning is what actually happens in college?

+++++++++++++++
more on assessment in this IMS blog
https://blog.stcloudstate.edu/ims?s=assessment

and critical thinking
https://blog.stcloudstate.edu/ims?s=critical+thinking

student-centered learning

Report: Most educators aren’t equipped for student-centered learning

https://www.educationdive.com/news/report-most-educators-arent-equipped-for-student-centered-learning/585012/

The disruption caused by the pandemic created “the perfect combination of catalysts for a rapid conversion to student-centered schooling,” according to a new report from the Christensen Institute.

However, most K-12 educators aren’t equipped with the skill sets needed to run student-centered schools. For student-centered learning to be adopted, educators must be trained in student-centered competencies.

the report suggests school and district leaders:

  • Work toward a more modular professional development system, which includes specific, verifiable and predictable microcredentials.
  • Specify competencies needed for student-centered educators.
  • Compensate educators with bonuses for microcredentials to incentivize earning them.
  • Purchase bulk licenses to allow teachers the opportunity to earn microcredentials.
  • Demand and pay for mastery of skills rather than a one-time workshop.
  • Vet microcredential issuers’ verification processes, like rubrics and evaluation systems.

While testing could help with personalized instruction, a report from the Center on Reinventing Public Education stressed the need for professional development so teachers can interpret the resulting data and let it guide instruction this year.

++++++++++++++++
more on microcredentials in this IMS blog
https://blog.stcloudstate.edu/ims?s=microcredential

peer to peer curation

Peer-to-Peer Curation Activities Boost Higher-Order Thinking

https://www.kritik.io/resources/peer-to-peer-curation-activities-boost-higher-order-thinking

Most professors we hear from want to assess their students at higher levels; they say that if current assessments kept students at the lowest level of Bloom’s Taxonomy, they wouldn’t feel rewarded as educators.

However, assessment is by far the most labour-intensive part of teaching. Assessment plans and rubrics must be prepped. Test questions must be written. Every student needs a mark, personalized feedback, and a road map for improvement. The larger the class, the more work for the instructor. Add in formative assessments like weekly assignments and exercises that precipitate subtle, ongoing tweaks to the syllabus, and it’s easy to see why many faculty opt to stick with what they know: an accumulation of easy-to-grade summative assessments that almost inevitably rely upon memorization and the most basic understanding of concepts.

Curation Activities can be one of the most effective teaching strategies to help students compare what they’re learning in the classroom with real-world examples, and gain insight into how they can relate to each other.

Curation Activities can apply to all disciplines, such as Business, Arts, or Sciences.

When students explain what they’ve learned to other students, they help consolidate and strengthen connections to those concepts while simultaneously engaging in active learning. Find more project ideas here.

By actively engaging with their classmates and applying their own evaluative skills to the feedback they’re delivering to their peers, students develop lifelong critical-thinking and creative-analysis skills. Additionally, peer assessment has been shown to get students faster feedback from diverse sources, to increase metacognition, independence, and self-reflection, and to improve student learning. These are all important skills that provide value far beyond the classroom. More details on the benefits of peer assessment here.

++++++++++++++
more on curation in this IMS blog
https://blog.stcloudstate.edu/ims?s=curation

digital agility

Digital Agility: Embracing a Holistic Approach to Digital Literacy in the Liberal Arts

https://er.educause.edu/blogs/2020/1/digital-agility-embracing-a-holistic-approach-to-digital-literacy-in-the-liberal-arts

A 2016 Pew Research Center study indicates that the digital divide in the United States is not solely about access to technology; it is also about the ability to use technology to get what we need.1 What does digital readiness mean? Applying cumulative knowledge to real-world situations. Having a tech or STEM-related degree does not ensure digital readiness.

How Can We Encourage Digital Agility in the Liberal Arts?

Digital pedagogy often creates opportunities for instructors to design non-disposable assignments—assignments that are not meant to be thrown away but rather have a purpose beyond being required.3

“We need to marry the best of our academic work with the best of edtech. In other words, what would it look like if education technology were embedded in the everyday practice of academic disciplines?”4

Project-based learning fits well within the curricular flexibility of the liberal arts. In project-based work, students apply what they are learning in the context of an engaging experience.

Building off frameworks that are already in place, like the Association of College and Research Libraries (ACRL) Framework for Information Literacy, is another way to support digital agility.

External-facing work offers students real situations where, if we imagine what digital agility looks like, they have to adjust to possible new digital environments and approaches.

Reflection provides a way for meaning-making to happen across individual assignments, projects, and classes. Without the chance to assemble assignments into a larger narrative, each experience lives in its own void.

How Can Institutions Build Systems-Level Support?

Liberal arts colleges in particular are interested in the ways they prepare graduates to be agile and critical in a digital world—as seen in the Association of American Colleges & Universities (AAC&U) Valid Assessment of Learning in Undergraduate Education (VALUE) Rubrics.

The Bryn Mawr Digital Competencies Framework5 was followed by more formal conversations and the formation of a working group (including Carleton College, among others).

++++++++++++
more on digital fluency in this IMS blog
https://blog.stcloudstate.edu/ims?s=digital+fluency

feedback with technology

How to Give Your Students Better Feedback With Technology (Advice Guide)

by Holly Fiock and Heather Garcia

https://www.chronicle.com/interactives/20191108-Advice-Feedback

students continue to report dissatisfaction with the feedback they get on assignments and tests — calling it vague, discouraging, and/or late.

The use of technology in the classroom (both in face-to-face and online environments) opens up a range of feedback options:

  • Rubrics: online scoring guides to evaluate students’ work.
  • Annotations: notes or comments added digitally to essays and other assignments.
  • Audio: a sound file of your voice giving feedback on students’ work.
  • Video: a recorded file of you offering feedback either as a “talking head,” a screencast, or a mix of both.
  • Peer review: online systems in which students review one another’s work.

Two main types of feedback — formative and summative — work together in that process but have different purposes. Formative feedback occurs during the learning process and is used to monitor progress. Summative feedback happens at the end of a lesson or a unit and is used to evaluate the achievement of the learning outcomes.

Good feedback should be: Frequent, Specific, Balanced, Timely

As noted in the advice guide on inclusive teaching, frequent, low-stakes assessments are an inclusive teaching practice.

Time-Saving Approaches: rubrics and peer review.

When to Use Audio or Video Tools for Feedback: to personalize your feedback, convey nuance, demonstrate a process, or avoid miscommunication.

Faculty interest in classroom innovation is on the rise. Professors are trying all sorts of new techniques to improve the first few minutes of class, to make their teaching more engaging, to hold better class discussions. Buzzwords like active learning, authentic assessment, technology integration, and case-based learning are more and more a part of faculty discussions.

Don’t assume technology will solve every problem.

Avoid making long videos

Video and audio feedback doesn’t have to be perfect.

There is such a thing as too much information.

Have a plan.

++++++++++
more on feedback in education in this IMS blog
https://blog.stcloudstate.edu/ims?s=feedback
