Archive of ‘technology’ category

value rubrics

Connecting the Dots
Assessing Student Work Using the VALUE Rubrics
1:00 – 4:00
In this session, we will focus on assessing student work using the VALUE Rubrics.
Together, we will look at common work samples from students at different points in
their academic trajectory. We will identify evidence of critical thinking, quantitative
literacy, written communication, and civic engagement from those samples.
We will then connect that evidence to the appropriate domains and levels on
the VALUE rubrics. And we will consider the implications of what we learn for
our own practice in the classroom.

my notes

Fewer people than I expected.

Group work: our group was charged with "Connecting the Dots: Assessing Student Work Using the VALUE Rubrics."

written communication value rubric

 

university web page

Posted on LinkedIn by Jonathan Moser

Check Out Thayer Academy’s new site: from Camps and Campus Maps to Infographics, History, and even some Digital Storytelling:

Home: http://www.thayer.org/
Camp Thayer: https://lnkd.in/e6CVMmk
Campus Map: https://lnkd.in/eu9aUGm
History of Thayer: https://lnkd.in/eSzgEbr
Facts & Figures: https://lnkd.in/eH_F6za

++++++++++++++++++

see also
http://blog.stcloudstate.edu/ims/2016/03/23/library-social-media-strategy/

 

rubrics and grade appeals

Using Rubrics as a Defense Against Grade Appeals

March 21st, 2016

Rubrics provide the criteria for assessing students’ work. Giving students the rubric along with the assignment clarifies the instructor’s expectations. A rubric also makes grading quicker, fairer, and more transparent: after an instructor grades 30 essays, fairness can become secondary to exhaustion. Following the rubric takes less time, and it means the first essay is graded by exactly the same standard as the last. Students are then less likely to say, for example, “She got a 3 on this section, and I got a 2 for almost the same content.”

more on rubrics in this IMS blog:

http://blog.stcloudstate.edu/ims/?s=rubrics&submit=Search

tech lib conference 2016

http://2016libtechconference.sched.org/event/69f9/come-on-down-gaming-in-the-flipped-classroom#

Angie Cox

Instruction and Liaison Librarian, University of Northern Iowa
Games and gamification: the semantics are important. Using the right terms will be crucial in the next several years.

Gamification for enthusiasm. A credit course with a “buffet” of activities; the peer-to-peer element is very important.

gaming types

affordability; ease of use; speed of creation.

Assessment: if you want heavy-duty, SPSS-style assessment, use Polldaddy or Poll Everywhere.
Kahoot supports only YouTube; it does not allow uploading your own video or using Kaltura (AKA MediaSpace). Text versus multimedia.
Kahoot is replacing VoiceThread in K-12; ride the wave.

Kahoot allows sharing quizzes and surveys.
Kahoot is not about assessment or drilling knowledge; it is a conversation starter: why do we read an article? There is no shame in a wrong answer.

The carrot: when students reach 1,000 points, they can leave class.

Kahoot’s music can be turned off; keep it short. Answers are limited in length, like Twitter.

Quizlet

Students screenshot their final score once they reach 80%.

Gravity is hard; start with Scatter. Auditory output.

drill game

Team Challenge.

Day 1 is Kahoot; day 2 is Team Challenge and a test.

embed across the curriculum

gaming toolkit for campus

 

What to take home: have students face students from a different library.

+++++++++++++

http://sched.co/69f2

Putting it all together: a holistic approach to utilizing your library’s user data for making informed web design decisions 

In the age of Big Data, there is an abundance of free or cheap data sources available to libraries about their users’ behavior across the many components that make up their web presence. Data from vendors, data from Google Analytics or other third-party tracking software, and data from user testing are all things libraries have access to at little or no cost. However, just like many students can become overloaded when they do not know how to navigate the many information sources available to them, many libraries can become overloaded by the continuous stream of data pouring in from these sources. This session will aim to help librarians understand 1) what sorts of data their library already has (or easily could have) access to about how their users use their various web tools, 2) what that data can and cannot tell them, and 3) how to use the datasets they are collecting in a holistic manner to help them make design decisions. The presentation will feature examples from the presenters’ own experience of incorporating user data in decisions related to designing the Bethel University Libraries’ web presence.

http://tinyurl.com/jbchapf

Data tools: user testing, Google Analytics, click trackers, vendor data.

  1. User testing: free, no visualization, cross-domain, easy to use, requires scripts.
    Qualitative questions: why people do what they do, and how users think about your content.
    Three versions; variables: options on the book search, and order/wording of the sections in the articles tab.
    Findings: big difference between tabbed and single-page; little difference between the single-page options. Take-aways: it won’t tell you how to fix the problem, but it builds empathy for how the user is using the page. (One way to check whether such differences are more than chance is sketched after this list.)
    Would like to do in the future: FAQ and chat. Problem: low use. Question: how to get them used (see PPT for details).
  2. Crazy Egg (click tracker): not a free tool; the lowest tier is less than $10/month.
    See PPT for details.
    Shows interaction with the pages: clicks and scrolling.
  3. Scroll analytics:
    Not easy to use; steep learning curve.
    The “blob”: Google Analytics recognizes the three different domains users click through as one.
  4. Vendor data (Springshare):
    Chat and FAQ.
    LibGuides.
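
For instance, a quick way to check whether user-testing results like “tabbed vs. single-page” differ by more than chance is a chi-square test. A minimal sketch in Python; the counts below are invented for illustration, so substitute your own task-success numbers:

    from scipy.stats import chi2_contingency

    # Rows are page versions; columns are [task succeeded, task failed].
    # These counts are made up for illustration only.
    observed = [
        [18, 12],  # tabbed layout
        [26, 4],   # single-page, option A
        [25, 5],   # single-page, option B
    ]

    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
    if p < 0.05:
        print("The versions differ by more than chance alone would explain.")
    else:
        print("No significant difference between the versions.")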

questions:

Is there a dashboard tool that can combine all these sources? (A rough do-it-yourself alternative is sketched below.)
Optimal Workshop (Reframer) comes close, but it is more about qualitative data.
How long does it take to build this? About two years in general, but focused work in the last six months.
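
Since no single dashboard was named in the session, one rough DIY approach is to join the CSV exports in a script. A minimal sketch with pandas; the filenames and column names are hypothetical stand-ins for your actual Google Analytics and Springshare exports:

    import pandas as pd

    # Hypothetical exports: adjust filenames and columns to your own data.
    ga = pd.read_csv("ga_pageviews.csv")          # columns: page, pageviews
    chats = pd.read_csv("springshare_chats.csv")  # columns: page, chat_sessions

    # Join the two sources per page and compute a simple combined metric.
    combined = ga.merge(chats, on="page", how="left").fillna(0)
    combined["chats_per_1k_views"] = (
        1000 * combined["chat_sessions"] / combined["pageviews"]
    )
    print(combined.sort_values("chats_per_1k_views", ascending=False).head(10))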

Artificial-intelligence engine: https://www.technologyreview.com/s/600984/an-ai-with-30-years-worth-of-knowledge-finally-goes-to-work/ (Doug Lenat)

best podcasts

how to subscribe to a podcast:

These are the best podcasts you should be listening to right now
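
Under the hood, subscribing to a podcast just means polling its RSS feed for new audio enclosures. A minimal Python sketch using the feedparser library; the feed URL is a placeholder:

    import feedparser

    # Placeholder URL: substitute any podcast's RSS feed.
    feed = feedparser.parse("https://example.com/podcast/feed.xml")
    print(feed.feed.title)

    # Podcast audio files are attached to entries as RSS "enclosures".
    for entry in feed.entries[:5]:
        enclosures = entry.get("enclosures") or []
        audio = enclosures[0].href if enclosures else "no audio attached"
        print(f"- {entry.title}: {audio}")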

Learning Spaces and Instructional Technology

Learning Spaces and Instructional Technology Special Interest Group (SIG) webinars are FREE and open to anyone. Please feel free to share this with others at your institution.

Dynamic Discussion Artifacts: Moving Beyond Threaded Discussion

Description

This session will describe an approach to online discussions that moves beyond the threaded message boards of D2L Brightspace while still maintaining asynchronous online delivery. Using teams, discussions were differentiated by product, allowing students to turn in an artifact that represented their shared understanding during specific online course modules. Strategies, technology guides, rubrics, and student feedback will be shared.

Presenter: Michael Manderfeld
Senior Instructional Designer
Minnesota State University Mankato

When
Where
https://moqi.zoom.us/j/672493176 (link to virtual room)

 

 

Notes from the previous session available here:

Active Learning Classrooms

surveys for mobile

https://smaudience.surveymonkey.com/webinar-google-mobile-surveys.html

Join Mario Callegaro, Senior Survey Research Scientist at Google UK, and one of our own survey research scientists, Sarah Cho, on February 24 at 10 am PT / 1 pm ET for our webinar, Market research surveys gone mobile: Optimizing for better results.

Mario Callegaro

Senior Survey Research Scientist

Quantitative Marketing Team, Google UK

 

Sarah Cho

Survey Research Scientist

SurveyMonkey

My notes from the webinar:

Surveys uncover the WHY; Big Data shows the what.

Why mobile matters: tablet and smartphone penetration is around 60-80% in Europe; according to Pew, in the US it is 68% for smartphones and 45% for tablets.

Faster reactions but longer questionnaire completion times on smartphones = device effects.

The device a survey is designed on vs. the device it is taken on: when there is a mismatch, questions arise.

Five strategies for handling mobile phone respondents:
1. Do nothing.
2. Discourage the use of mobile phones for answering.
3. Optimize the web questionnaire for mobile browsers (see the sketch after this list).
4. Offer a mobile app.

SurveyMonkey: do all surveys have to be mobile-optimized? No, so make sure you think about the context in which you are sending them out.
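
For strategy 3, the usual first step is server-side detection of a mobile browser from the User-Agent header. A minimal Python sketch; the regex is a rough heuristic rather than an exhaustive device list, and the template names are hypothetical:

    import re

    # Rough heuristic for common mobile browsers; not exhaustive.
    MOBILE_UA = re.compile(r"Mobi|Android|iPhone|iPad|Windows Phone", re.IGNORECASE)

    def is_mobile(user_agent: str) -> bool:
        return bool(MOBILE_UA.search(user_agent))

    # Example: pick which questionnaire template to serve.
    ua = "Mozilla/5.0 (iPhone; CPU iPhone OS 9_2 like Mac OS X) AppleWebKit/601.1"
    template = "survey_mobile.html" if is_mobile(ua) else "survey_desktop.html"
    print(template)  # -> survey_mobile.html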

Design considerations for multi-device surveys. Two “actors”: the survey designer and the survey platform.

Confounds when interpreting findings across devices: use a homogeneous population (e.g., students).

Differences between mouse and fingers as input devices.

What about tablets? As long as Flash is not used, a tablet behaves very much like a laptop/desktop. Phablets (the growth of the iPhone’s screen) blur the line.

mobile survey design tips (Sarah)

Multiple choice: OK to use, but keep wording short and format responses vertically instead of horizontally.

Open-ended question type: hard to type on mobile (but no word on voice recognition?).

logo

Multimedia: mind image clarity; avoid video where possible (bandwidth constraints); if you use it, host on YouTube so every device can play it, versus Flash, JavaScript, etc.

testing and length: as usual

URL: as short as possible; consider a QR code (see the sketch below).
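
Generating the QR code itself is nearly a one-liner. A minimal sketch with the Python qrcode library; the survey URL is a placeholder:

    import qrcode

    # Encode a (placeholder) short survey URL as a scannable QR image.
    img = qrcode.make("https://example.com/s/abc123")
    img.save("survey_qr.png")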

growth of survey taking on mobile devices

 

 
