Archive of ‘assessment’ category

Formative Assessment

7 Smart, Fast Ways to Do Formative Assessment

Within these methods you’ll find close to 40 tools and tricks for finding out what your students know while they’re still learning.

edutopia.org/article/7-smart-fast-ways-do-formative-assessment

Entry and exit slips

Exit slips can take lots of forms beyond the old-school pencil and scrap paper. Whether you're assessing at the bottom of Bloom's taxonomy or the top, you can use tools like Padlet or Poll Everywhere, or measure progress toward attainment or retention of essential content or standards with tools like Google Classroom's Question tool, Google Forms with Flubaroo, and Edulastic.

Low-stakes quizzes and polls: If you want to find out whether your students really know as much as you think they know, try polls and quizzes created with Socrative or Quizlet, or in-class games and tools like Quizalize, Kahoot, FlipQuiz, Gimkit, Plickers, and Flippity.

Dipsticks: So-called alternative formative assessments are meant to be as easy and quick as checking the oil in your car, so they’re sometimes referred to as dipsticks. These can be things like asking students to:

  • write a letter explaining a key idea to a friend,
  • draw a sketch to visually represent new knowledge, or
  • do a think, pair, share exercise with a partner.

Interview assessments: If you want to dig a little deeper into students' understanding of content, try discussion-based assessment methods. Casual chats with students in the classroom can help them feel at ease even as you get a sense of what they know, and you may find that five-minute interview assessments work well.

TAG feedback (Tell your peer something they did well, Ask a thoughtful question, Give a suggestion) can be recorded and shared with tools like Flipgrid, Explain Everything, or Seesaw.

Methods that incorporate art: Consider using visual art, photography, or videography as an assessment tool. Whether students draw, create a collage, or sculpt, you may find that the assessment helps them synthesize their learning.

Misconceptions and errors: Sometimes it's helpful to see if students understand why something is incorrect or why a concept is hard. Ask students to explain the "muddiest point" in the lesson—the place where things got confusing or particularly difficult or where they still lack clarity. Or do a misconception check: present students with a common misconception and ask them to explain why it's wrong.

Self-assessment: Don't forget to consult the experts—the kids. Often you can give your rubric to your students and have them identify their own strengths and weaknesses.

Assessment Is a Waste of Time?

Assessment Is an Enormous Waste of Time

https://www.chronicle.com/article/assessment-is-an-enormous-waste-of-time/

The assessment industry is not known for self-critical reflection. Assessors insist that faculty provide evidence that their teaching is effective, but they are dismissive of evidence that their own work is ineffective. They demand data, but they are indifferent to the quality of those data. So it’s not a surprise that the assessment project is built on an unexamined assumption: that learning, especially higher-order learning such as critical thinking, is central to the college experience.

For example, the Lumina Foundation's Degree Qualifications Profile "provides a qualitative set of important learning outcomes, not quantitative measures such as numbers of credits and grade-point averages, as the basis for awarding degrees."

In an article in Change, Daniel Sullivan, president emeritus of St. Lawrence University and a senior fellow at the Association of American Colleges & Universities, and Kate McConnell, assistant vice president for research and assessment at the association, describe a project that looked at nearly 3,000 pieces of student work from 14 institutions. They used the critical-thinking and written-communication VALUE rubrics designed by the AAC&U to score the work. They discovered that most college-student work falls in the middle of the rubrics' four-point scale measuring skill attainment.

Richard Arum and Josipa Roksa's 2011 book, Academically Adrift, used data from the Collegiate Learning Assessment to show that a large percentage of students don't improve their critical thinking or writing in college. A 2017 study by The Wall Street Journal used data from the CLA at dozens of public colleges and concluded that the evidence of learning between the freshman and senior years was so scant that the paper called it "discouraging."

The author is not suggesting that college is a waste of time or that there is no value in a college education. But before we spend scarce resources and time trying to assess and enhance student learning, shouldn't we first check to be sure that learning is what actually happens in college?

+++++++++++++++
more on assessment in this IMS blog
https://blog.stcloudstate.edu/ims?s=assessment

and critical thinking
https://blog.stcloudstate.edu/ims?s=critical+thinking

PISA Estonia China US

+++++++++++++++++++++

https://www.washingtonpost.com/politics/2019/12/17/chinas-education-system-produces-stellar-test-scores-so-why-do-students-head-abroad-each-year-study/

Education scholars have already questioned whether PISA is a valid global measure of education quality — and analysts are also skeptical of the selective participation of Chinese students drawn from wealthier schools.

Chinese students, on average, study 55 hours a week — also No. 1 among PISA-participating countries. That is about 20 hours more than students in Finland, the country PISA declared to have the highest learning efficiency, or reading-test-score points per hour spent studying.
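To make "learning efficiency" concrete, here is a minimal sketch of the arithmetic in Python: reading-score points divided by weekly study hours. The study-hour figures come from the article above; the reading scores are approximate PISA 2018 means, included only to illustrate the calculation.

# "Learning efficiency" as PISA analysts use it:
# reading-test-score points per hour spent studying.
# Study hours come from the article; the reading scores are
# approximate PISA 2018 means, used only for illustration.
results = {
    "China (B-S-J-Z)": {"reading_score": 555, "weekly_study_hours": 55},
    "Finland": {"reading_score": 520, "weekly_study_hours": 35},  # ~20 hours fewer
}

for country, r in results.items():
    efficiency = r["reading_score"] / r["weekly_study_hours"]
    print(f"{country}: {efficiency:.1f} points per weekly study hour")

Even with the higher raw score, China comes out around 10 points per weekly study hour to Finland's roughly 15, which is the sense in which PISA declared Finland the most efficient.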

But PISA analysis also revealed that Chinese students are among the least satisfied with their lives.

Students look overseas for a more well-rounded education

Their top destination of choice, by far, is the United States. The roughly 1.1 million foreign students in the United States in 2018 included 369,500 Chinese college students.

Hostility in U.S.-China relations could dampen the appeal of a U.S. education. Britain, in fact, recorded a 30 percent surge in Chinese applicants in 2019, challenging U.S. dominance in global higher education.

++++++++++++++++++++++++++++

https://www.edweek.org/ew/articles/2019/12/03/us-students-gain-ground-against-global-peers.html

Immigrant students, who made up 23 percent of all U.S. students taking PISA, performed significantly better relative to their native-born U.S. peers than immigrant students did relative to native-born students on average across OECD countries.

https://www.msn.com/en-us/finance/news/pisa-rankings-2019-four-chinese-regions-top-international-student-survey/ar-BBXGCZU

The survey found that 15-year-old students from Beijing, Shanghai, and the eastern provinces of Jiangsu and Zhejiang ranked top for all three core subjects, achieving the highest level 4 rating.

Students from the United States were ranked level 3 for reading and science, and level 2 for math, while teens from Britain scored a level 3 ranking in all three categories.

++++++++++++++++

Looking for Post-PISA Answers? Here’s What Our Obsession With Test Scores Overlooks

https://www.edsurge.com/news/2019-12-03-looking-for-post-pisa-answers-here-s-what-our-obsession-with-test-scores-overlooks

By Tony Wan     Dec 3, 2019

Andreas Schleicher, director of education and skills at the OECD, the Paris-based organization behind PISA, wrote that "students who disagreed or strongly disagreed with the statement 'Your intelligence is something about you that you can't change very much' scored 32 points higher in reading than students who agreed or strongly agreed."

Those results are similar to recent findings published by Carol Dweck, a Stanford education professor who is often credited with making growth mindset a mainstream concept.

“Growth mindset is a very important thing that makes us active learners, and makes us invest in our personal education,” Schleicher states. “If learning isn’t based on effort and intelligence is predetermined, why would anyone bother?”

It's "absolutely fascinating" to see the relationship between teachers' enthusiasm, students' social-emotional well-being, and their learning outcomes, Schleicher notes. As one example, he wrote in his summary report that "in most countries and economies, students scored higher in reading when they perceived their teachers as more enthusiastic, especially when they said their teachers were interested in the subject."

In other words, happy teachers lead to better results. That's hardly a surprising revelation, says Schleicher. But professional-development support is one thing that can be overlooked by policymakers when so much of the focus is on test scores.

+++++++++++++++++

https://nces.ed.gov/surveys/pisa/
+++++++++++
more on Estonia in this IMS blog
https://blog.stcloudstate.edu/ims?s=estonia

Performance Assessment

What Is Performance Assessment?

February 5, 2019 https://www.edweek.org/ew/articles/2019/02/06/what-is-performance-assessment.html

The field traces back at least to William Heard Kilpatrick's 1918 essay "The Project Method."

Today, despite major advances in ways to measure learning, we still don’t have common definitions for project-based learning or performance assessment.

In the absence of agreed-upon definitions for this evolving field, Education Week reporters developed a glossary:

Proficiency-based or competency-based learning: These terms are interchangeable. They refer to the practice of allowing students to progress in their learning as they master a set of standards or competencies. Students can advance at different rates. Typically, there is an attempt to build students’ ownership and understanding of their learning goals and often a focus on “personalizing” students’ learning based on their needs and interests.

Project-based learning: Students learn through an extended project, which may have a number of checkpoints or assessments along the way. Key features are inquiry, exploration, the extended duration of the project, and iteration (requiring students to revise and reflect, for example). A subset of project-based learning is problem-based learning, which focuses on a specific challenge for which students must find a solution.

Standards-based grading: This refers to the practice of giving students nuanced and detailed descriptions of their performance against specific criteria or standards, not on a bell curve. It can stand alone or exist alongside traditional letter grading.

Performance assessment: This assessment measures how well students apply their knowledge, skills, and abilities to authentic problems. The key feature is that it requires the student to produce something, such as a report, experiment, or performance, which is scored against specific criteria.

Portfolio: This assessment consists of a body of student work collected over an extended period, from a few weeks to a year or more. This work can be produced in response to a test prompt or assignment but is often simply drawn from everyday classroom tasks. Frequently, portfolios also contain an element of student reflection.

Exhibition: A type of performance assessment that requires a public presentation, as in the sciences or performing arts. Other fields can also require an exhibition component. Students might be required, for instance, to justify their position in an oral presentation or debate.

Performance task: A piece of work students are asked to do to show how well they apply their knowledge, skills, or abilities—from writing an essay to diagnosing and fixing a broken circuit. A performance assessment typically consists of several performance tasks. Performance tasks also may be included in traditional multiple-choice tests.

 

Avoiding Mistakes in Microcredentialing

The Seven Deadly Sins Of Digital Badging In Education

An academic institution’s digital badging initiative is getting off the ground and students are “earning” badges, or micro-credentials, but are they actually providing value to the student toward his or her future career?
According to a report by the University Professional and Continuing Education Association (UPCEA), one in five institutions now offers digital badges, but as educators tinker with micro-credentialing, digital badging initiatives at educational institutions can prove worthless to students due to seven common mistakes.
1. Making faculty and staff issue badges manually (operational inefficiency; see the automation sketch below)
2. Issuing badges without authentic evidence
3. Issuing badges randomly
4. Expecting students to claim badges manually
5. Hiding badges where employers won't look
6. Storing badges in a separate silo
7. Issuing badges that don't map to internships or jobs
Troy Markowitz is vice president of academic partnerships at Portfolium.
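Sins 1, 2, and 4 share a common fix: let the system issue badges automatically when a milestone is met, with evidence attached, rather than asking faculty or students to push them through by hand. Below is a minimal sketch of what automated, evidence-backed issuance could look like using the Open Badges 2.0 assertion format; the issue_badge helper and all URLs and names are hypothetical placeholders, not any particular platform's API.

# Sketch: auto-issue an Open Badges 2.0-style assertion when a student
# completes a course milestone, with a link to authentic evidence.
# All URLs and identifiers below are hypothetical placeholders.
import json
from datetime import datetime, timezone

def issue_badge(recipient_email: str, badge_class_url: str, evidence_url: str) -> dict:
    """Build an Open Badges 2.0 assertion tied to real student work."""
    return {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "id": "https://badges.example.edu/assertions/12345",  # hypothetical
        "recipient": {"type": "email", "identity": recipient_email, "hashed": False},
        "badge": badge_class_url,    # IRI of the BadgeClass being awarded
        "evidence": evidence_url,    # link to the work that earned the badge
        "issuedOn": datetime.now(timezone.utc).isoformat(),
        "verification": {"type": "HostedBadge"},
    }

# Triggered by the LMS when the milestone is recorded, not claimed by hand.
assertion = issue_badge(
    "student@example.edu",
    "https://badges.example.edu/classes/data-analysis",  # hypothetical
    "https://portfolio.example.edu/projects/987",        # hypothetical
)
print(json.dumps(assertion, indent=2))

Because the assertion carries an evidence link and lives at a stable public URL, it can also surface where employers actually look (sins 5 and 6), for example embedded in an e-portfolio or professional profile rather than stored in a separate silo.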

+++++++++++
more on microcredentialing in this IMS blog
https://blog.stcloudstate.edu/ims?s=microcredentialing
