Connecting the Dots
Assessing Student Work Using the VALUE Rubrics
1:00 – 4:00
In this session, we will focus on assessing student work using the VALUE Rubrics.
Together, we will look at common work samples from students at different points in
their academic trajectory. We will identify evidence of critical thinking, quantitative
literacy, written communication, and civic engagement from those samples.
We will then connect that evidence to the appropriate domains and levels on
the VALUE rubrics. And we will consider the implications of what we learn for
our own practice in the classroom.
Fewer people than I expected.
Group work: our group was charged with "Connecting the Dots: Assessing Student Work Using the VALUE Rubrics."
Rubrics provide the criteria for assessing students’ work. Giving students the rubric along with the assignment can clarify the instructor’s expectations. A rubric allows for much quicker, fairer, and more transparent grading. After an instructor grades 30 essays, fairness can become secondary to exhaustion. Following the rubric takes less time, and doing so allows grading the first essay to look exactly like grading the last essay. Students will be less likely to say, for example, “She got a 3 on this section, and I got a 2 for almost the same content.”
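The consistency argument above can be sketched in code: if the rubric is data rather than a grader's memory, the first essay and the thirtieth are scored against exactly the same criteria. This is a hypothetical illustration; the criterion names and the 1-4 level scale below are stand-ins, not the actual VALUE rubric.

```python
# Hypothetical sketch: a rubric as a fixed data structure so that scoring
# is identical for every essay, first to last.

# Stand-in criteria; each is rated at a performance level from 1 to 4.
RUBRIC_CRITERIA = ["explanation_of_issues", "evidence", "conclusions"]

def score_essay(ratings: dict) -> int:
    """Validate and sum the level assigned to each rubric criterion."""
    for criterion in RUBRIC_CRITERIA:
        if criterion not in ratings:
            raise ValueError(f"missing rating for {criterion}")
        if not 1 <= ratings[criterion] <= 4:
            raise ValueError(f"level out of range for {criterion}")
    return sum(ratings[c] for c in RUBRIC_CRITERIA)

print(score_essay({"explanation_of_issues": 3, "evidence": 2, "conclusions": 3}))  # 8
```

Because every essay passes through the same checks and the same sum, a student comparing a 2 against a classmate's 3 can be pointed to the specific criterion and level description, not to the grader's mood.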
Instruction and Liaison Librarian, University of Northern Iowa
Games and gamification: the semantics are important, and using the right terms can be crucial in the next several years.
Gamification builds enthusiasm. Credit course with a buffet of options; the peer-to-peer element is very important.
Affordability; easy to use; speed to create.
Assessment: if you want heavy-duty, SPSS-style assessment, use Polldaddy or Poll Everywhere.
Kahoot supports only YouTube; it does not allow uploading your own video or using Kaltura (a.k.a. MediaSpace). Text versus multimedia.
Kahoot is replacing VoiceThread in K-12; ride the wave.
Kahoot allows sharing quizzes and surveys.
Kahoot is not about assessment or drilling knowledge; it is a conversation starter. Why do we read an article? There is no shame in a wrong answer.
The carrot: when they reach 1,000 points, they can leave the class.
Kahoot music can be turned off. Keep questions short; answers are limited in length, as on Twitter.
Students screenshot their final score and must reach 80%.
Gravity is hard; start with Scatter. Auditory output.
Day one is Kahoot; day two is the team challenge and test.
Embed it across the curriculum.
A gaming toolkit for campus.
What to take home: have students face students from a different library.
In the age of Big Data, there is an abundance of free or cheap data sources available to libraries about their users’ behavior across the many components that make up their web presence. Data from vendors, data from Google Analytics or other third-party tracking software, and data from user testing are all things libraries have access to at little or no cost. However, just as many students can become overloaded when they do not know how to navigate the many information sources available to them, many libraries can become overloaded by the continuous stream of data pouring in from these sources. This session will aim to help librarians understand 1) what sorts of data their library already has (or easily could have) access to about how their users use their various web tools, 2) what that data can and cannot tell them, and 3) how to use the datasets they are collecting in a holistic manner to help them make design decisions. The presentation will feature examples from the presenters’ own experience of incorporating user data in decisions related to the design of the Bethel University Libraries’ web presence.
Data tools: user testing, Google Analytics, click trackers, vendor data.
User testing: free, no visualization, cross-domain, easy to use, requires scripts.
Qualitative questions: why people do what they do, and how users will think about your content.
Three versions; variables: the options on the book search and the order/wording of the sections in the Articles tab.
Findings: a big difference between tabbed and single-page designs, but little difference between the single-page options. Takeaways: testing won't tell you how to fix the problem, but it shows you how to be empathetic about how the user is using the page.
Would like to do in the future: FAQ and chat. Problem: low use. Question: how to get them used (see PPT for details).
Crazy Egg (click tracker): not a free tool; the lowest tier is less than $10/month.
See PPT for details.
Interaction with the pages: clicks and scrolling.
Not easy to use; steep learning curve.
The "blob": Google Analytics recognizes the three different domains that are clicked through as one.
Vendor data: Springshare.
Chat and FAQ.
Is there a dashboard tool that can combine all these tools?
Optimal Workshop (Reframer), but it is more about qualitative data.
How long does it take to build this? About two years in general, but focused work in the last six months.
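No single dashboard combining all these tools came up in the session, but exports from different tools can be joined by page URL in a few lines. The dictionaries below are hypothetical stand-ins for CSV exports from Google Analytics, a click tracker, and vendor (Springshare) stats; none of the numbers or page names come from the presentation.

```python
import collections

# Hypothetical per-page metrics exported from three different tools.
ga_pageviews = {"/home": 5200, "/articles": 1400, "/faq": 90}
click_counts = {"/home": 3100, "/articles": 800}
chat_sessions = {"/faq": 12, "/home": 45}

def combine(*sources, names):
    """Join per-page metrics from several tools into one table (dict of dicts)."""
    table = collections.defaultdict(dict)
    for name, source in zip(names, sources):
        for page, value in source.items():
            table[page][name] = value
    return dict(table)

summary = combine(ga_pageviews, click_counts, chat_sessions,
                  names=["pageviews", "clicks", "chats"])
print(summary["/home"])  # {'pageviews': 5200, 'clicks': 3100, 'chats': 45}
```

A sketch like this is the "holistic" view the abstract describes: one row per page, one column per data source, so low-use tools such as FAQ and chat can be read alongside traffic data.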
Podcasts have become excellent sources for great storytelling, interviews, and journalism.
From a few minutes to more than an hour, podcasts give content creators a chance to speak directly to their listeners free of distractions, and give listeners a new way to expand their minds during their daily commutes.
This session will describe an approach to online discussions that moves beyond the threaded message boards of D2L Brightspace while still maintaining asynchronous online delivery. Using teams, discussions were differentiated by product, allowing students to turn in an artifact that represented their shared understanding during specific online course modules. Strategies, technology guides, rubrics, and student feedback will be shared.
Presenter: Michael Manderfeld
Senior Instructional Designer
Minnesota State University Mankato
Join Mario Callegaro, Senior Survey Research Scientist at Google UK, and one of our own survey research scientists, Sarah Cho, on February 24 at 10 am PT / 1 pm ET for our webinar, "Market Research Surveys Gone Mobile: Optimizing for Better Results."
Senior Survey Research Scientist
Quantitative Marketing Team, Google UK
Survey Research Scientist
My notes from the webinar:
Surveys uncover the WHY that Big Data alone cannot.
Why mobile matters: tablet and smartphone penetration is around 60-80% in Europe. According to Pew, in the US 68% own a smartphone and 45% a tablet.
Faster reactions but longer questionnaire completion times on smartphones = device effects.
Survey design device vs. survey-taking device: when there is a mismatch, questions arise.
Five strategies to handle mobile phone respondents:
1. Do nothing.
2. Discourage the use of mobile phones for answering.
3. Optimize the web questionnaire for mobile browsers.
4. Build a mobile app.
SurveyMonkey: do all surveys have to be mobile-optimized? No, so make sure you think about the context in which you are sending them out.
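Strategy 3 (optimizing the web questionnaire for mobile browsers) presumes the platform can tell which device a respondent is on. A crude way to do that is a User-Agent check; the sketch below is a hypothetical illustration only, as real survey platforms use far more robust detection, and the pattern here would misclassify unusual browsers.

```python
import re

# Hypothetical sketch: route respondents to a mobile-optimized questionnaire
# based on a crude User-Agent match. Not production-grade device detection.
MOBILE_PATTERN = re.compile(r"Mobile|Android|iPhone|iPad", re.IGNORECASE)

def questionnaire_for(user_agent: str) -> str:
    """Return which questionnaire layout to serve for this User-Agent string."""
    return "mobile" if MOBILE_PATTERN.search(user_agent) else "desktop"

print(questionnaire_for("Mozilla/5.0 (iPhone; CPU iPhone OS 9_0 like Mac OS X)"))  # mobile
print(questionnaire_for("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # desktop
```

The "design device vs. survey-taking device" mismatch discussed below is exactly what this kind of routing tries to avoid: the layout served should match the device actually answering.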
Design considerations for multi-device surveys. Two "actors": the survey designer and the survey platform.
Confounds when interpreting findings across devices: use a homogeneous population (e.g., students).
The difference between mouse and fingers as input devices.
What about tablets? As long as Flash is not used, a tablet is very much the same as a laptop/desktop. Phablets (the iPhone's growing screen size).
Mobile survey design tips (Sarah):
Multiple choice: OK to use, but keep the wording short and format response options vertically instead of horizontally.
Open-ended questions: hard to type on a phone (but no word on voice recognition?).
Multimedia: keep images clear; avoid video where possible (bandwidth constraints), and use YouTube so every device can play it, versus Flash, JavaScript, etc.