
Classroom Response System CRS or clickers: questions to vendors

Good evening,

We are pleased to inform you that your classroom response system has been chosen as a final candidate for campus-wide adoption and support at St. Cloud State University. Should you be interested in pursuing this opportunity, we invite you to respond to the attached list of questions and to prepare a brief presentation for members of the selection committee and interested faculty and staff.

The deadline for responding to the questions is 12:00 pm (CST), Tuesday, April 9. This deadline will allow us to review the responses in time for the vendor presentations on Thursday, April 11, 11 AM-1 PM. The presentations will be held virtually via Adobe Connect: http://media4.stcloudstate.edu/scsu. Please let us know if you need to test and familiarize yourself with the presentation platform.

The presentation should be no more than 10 minutes long, followed by 10 minutes for follow-up questions. We suggest that you focus on the highlights of your system, presuming a moderately knowledgeable audience. We may follow up via email or telephone call prior to making our final selection.

Thank you, and we look forward to hearing from you soon.

Classroom Response System Taskforce:
Dr. Anthony Hansen
Dr. Michael Rentz
Dr. Joseph Melcher
Dr. Andrew Anda
Dr. Tracy Ore
Dr. Jack McKenna
Dr. Plamen Miltenoff

 

Questions to vendor
1. Is your system proprietary as far as the handheld device and the operating system software?
2. Describe the scalability of your system, from small classes (20-30 students) to large auditorium classes (500+).
3. Is your system receiver/transmitter based, wi-fi based, or other?
4. What is the usual process for students to register a “CRS” (or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this registration service itself rather than through your system? If so, how?
5. Once a “CRS” is purchased, can it be used for as long as the student is enrolled in classes? Could “CRS” purchases be made available through the campus bookstore? Once a student purchases a “clicker,” are they able to transfer ownership when finished with it?
6. Will your operating software integrate with other standard database formats? If so, list which ones.
7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.
8. What is your company’s history in providing this type of technology? Provide a list of higher education clients.
9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)
10. What personal data does your company collect on students and for what purpose? Is it shared or sold to others? How is it protected?
11. Do any of your business partners collect personal information about students that use your technology?
12. With what formats can test/quiz questions be imported/exported?
13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android).
14. What are the total costs to students, including device costs and periodic or one-time operating costs?
15. Describe your costs to the institution.
16. Describe how your software integrates with PowerPoint or other presentation systems.
17. Describe your level of integration with Desire2Learn (D2L). Does the integration require a server or other additional equipment the campus must purchase?

18. How does your company address disability accommodation for your product?
19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the maximum number of answers per question?
20. Does your software provide for integrating multimedia files? If so, list the file format types supported.
21. What has been your historic schedule for software releases and what pricing mechanism do you make available to your clients for upgrading?
22. Describe your “CRS” device(s).
23. If applicable, what is the average life span of a battery in your device and what battery type does it take?
24. Does your system automatically save upon shutdown?
25. What is your company’s projection/vision for this technology in the near and long term?
26. Does any of your software/apps require administrator permission to install?
27. If your system is radio frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smartphones, wireless tablets and laptops, and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?
28. What impact to the wireless network does the solution have?
29. Can the audience response system be used spontaneously for polling?
30. Can quiz questions and response distributions be imported and exported from and to plain text or a portable format? (This question is motivated by assessment and accreditation requirements.)
31. Is there a requirement that a portion of the course grade be based on the audience response system?

 

 

 

—————-

Plamen Miltenoff, Ph.D., MLIS

Professor

204-J James W. Miller Center

Learning Resources and Technology Services

720 Fourth Avenue South

St. Cloud, MN 56301-4498

320-308-3072

pmiltenoff@stcloudstate.edu

http://web.stcloudstate.edu/pmiltenoff/faculty/

“I am not strange, I am just not normal.” Salvador Dali

 

Your q/s, our a/s: math equations for quiz questions

Does anyone have experience with creating math equations for quiz questions either directly in D2L, using MathML and MathJax, Respondus, or another application that can be integrated into D2L?

https://mnsite.ims.mnscu.edu/shared/_instructor_and_coursedesigner_help/learningenvironment/assessment_tools/question_library/creating_arithmetic_questions.htm

More questions? Email d2l@stcloudstate.edu. Solutions: please log them there as well.

rubrics in D2L: from students’ standpoint

One of the difficulties of working with D2L as an instructor is the inability to “see” what students see. D2L does have a student role, but…

If you are working with rubrics and advertising this feature to your students (please share your rubrics with us!), and your students are perplexed that they don’t see rubrics under Assessment > Rubrics as you do, please keep in mind that you need to “connect” your rubrics to the dropbox (click “Add Rubrics” under Assessment/Dropbox/Properties/Rubrics). Students will be able to see the rubric only after the dropbox is “open.”

Please let us know if you need more information:

d2l@stcloudstate.edu

Follow us on Twitter: @SCSUtechInstruc | #techworkshop

http://blog.stcloudstate.edu/ims 

D2L camp Wednesday January 9, 2013

D2L: SHARING PRACTICES IN LEARNING AND TEACHING

– Mostly visual changes: D2L now uses many collapsing/scroll-down bars for navigation, and it is more compact.
– Changes and improvements in different tools: e.g., discussions, rubrics, grades (e.g., export straight to Excel), Pager, etc.
– Faculty cannot add tools to the default navbar, but can email d2l@stcloudstate.edu and request that a tool be added. Faculty CAN remove tools; don’t forget to save.
– A “must post first” option is available in discussions.

  • 10:00-10:30am: Make D2L work for you: discussions and grades in D2L. Dr. David Switzer, Economics

– Grades, and how to streamline them: copying again and again in D2L can be too time-consuming; exporting to Excel, calculating, and importing back is easier. Remember to export a blank D2L grade item so the template is set. Question: when will final grades export straight from D2L to R&R?
– Use subscriptions on discussions.
– Show students in class that surveys are indeed anonymous.
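The export-calculate-import workflow described above can be sketched in Python with the standard-library csv module. The column names (“Username”, “Quiz 1”, “Quiz 2”) and the weights are hypothetical stand-ins, not D2L’s actual export format:

```python
import csv
import io

# Hypothetical grade weights; a real D2L export's columns will differ.
WEIGHTS = {"Quiz 1": 0.4, "Quiz 2": 0.6}

def compute_final(rows):
    """Compute a weighted final score for each exported student row."""
    out = []
    for row in rows:
        final = sum(float(row[col]) * w for col, w in WEIGHTS.items())
        out.append({"Username": row["Username"], "Final": round(final, 2)})
    return out

# An in-memory CSV standing in for the file exported from D2L:
sample = "Username,Quiz 1,Quiz 2\njdoe,80,90\nasmith,70,100\n"
rows = list(csv.DictReader(io.StringIO(sample)))
finals = compute_final(rows)  # ready to write back out with csv.DictWriter
```

The resulting rows could then be written to a new CSV and imported back into the D2L gradebook.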

– Whom to turn to for help and ideas: colleagues, tech support, instructional technology staff, students.
– How to organize lecture content and put it online, in D2L in particular.
– F2F, hybrid, and online: how do we choose among them?

– Online learning as disruptive technology; touched on MOOCs and student-centered learning.
– Camtasia: free trial of Camtasia Studio 8.0 for Windows and Mac (shareware, 30 days). For every minute of recorded lecture, expect 5 to 10 minutes to record, edit, and prepare it.
– Adobe Captivate: use it through the virtual lab, though it is not that convenient; $30 per year for the key-server version.
– Blue Berry is superior to Camtasia in allowing you to draw.
– Jing: free.
– Screencast: bandwidth restrictions mean that too many students cannot view the lecture video simultaneously. It is Flash-based, which is not compatible with Apple products.
– Mediaserver (media4.stcloudstate.edu): upload a zipped folder (SCORM compliant). An account is needed; request one from Greg Jorgenson.
– Mike, from the Adobe Connect participants, shared: “I’ve used ScreenHunter, which is free software, to capture images (JPG).”
– Multimedia formats: video, audio, images, animations.
– Differences between raster and vector graphics: Camtasia will accept only JPG and PNG formats, not vector graphics.

  • 11:30-12:00pm: Open time for individual projects and problem solving.

Lunch Break

– Steve: rubrics and grading. D2L is not flexible, and we need to adapt our assessment to D2L’s capabilities.
– Homework and papers; holistic and analytic rubrics.
– The Amazon Kindle is much better for grading online than the iPad.
– Separate criteria did not work for Steve, but Ken organizes his rubrics into different criteria. KISS rule. A rubric properly defines students’ expectations. Create a grid of the rubric and then cut and paste it into the D2L rubric. Also go over the rubric details with students.
– Ken: have several levels in rubrics. A new rubric must be “published,” not a “draft”; otherwise it cannot be linked to grades.

– Calibrated peer review: another way of using rubrics. A potential advantage of this app is automated blind peer review, which D2L cannot do as well. It is handy for large classes and short writing assignments. Contact Joe Melcher (jmmelcher@stcloudstate.edu) to have an account created.
– Crowd control versus really learning the content: the software gives good feedback on what students have actually done (student progress tab).
– Calibrated results can be exported to D2L.

  • 2:00-3:00pm: Open time for individual projects and problem solving.

 

You can also join us via virtual synchronous connection through Adobe Connect at:
http://media4.stcloudstate.edu/d2lworkshop/

Limited space; please consider registering at:  https://secure.mnsu.edu/mnscupd/login/default.asp?campusid=0073

Would you like a similar event during the Spring 2013 semester? Please share with us your preference for day/time, as well as topics of interest.

For any questions, recommendations, suggestions, please use the following contact:

Plamen Miltenoff
320-308-3072
pmiltenoff@stcloudstate.edu

 

 

Quizzes and fun games (gamification)

Quizzes are mostly considered an assessment tool: the reward comes at the end of the game, and the player cannot “lose a life.”

Students who are used to the logic of a game, expect rewards throughout the game.

Therefore, instead of a single final assessment quiz, the class can be paced with several training quizzes. Each training quiz can allow students several attempts (the equivalent of “lives”). In addition, students can be motivated format-wise to play the quizzes as a gaming activity by a reward system: for example, for each training quiz scored above a B, students can collect badges/tokens, which they can redeem at the end of the class. Content-wise, students can be motivated by stepping up to the next level, switching from text-based quizzes to quizzes that include more multimedia: audio, video, and interactivity.
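The badge-and-lives scheme above can be sketched as a tiny Python model. The threshold (80 as a stand-in for “above a B”) and the attempt count per quiz are hypothetical design choices, not values from the post:

```python
# Hypothetical course-design parameters:
B_THRESHOLD = 80       # best score treated as earning a badge
ATTEMPTS_PER_QUIZ = 3  # "lives" allowed on each training quiz

def badges_earned(best_scores):
    """Count one badge per training quiz whose best score meets the threshold."""
    return sum(1 for score in best_scores if score >= B_THRESHOLD)

def attempts_left(attempts_used):
    """Remaining "lives" on a single training quiz."""
    return max(ATTEMPTS_PER_QUIZ - attempts_used, 0)

# A student's best scores across four training quizzes:
badge_count = badges_earned([85, 72, 91, 80])  # three quizzes meet the threshold
```

At the end of the term, `badge_count` would be the number of tokens the student can redeem.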

#techworkshop #pm great tool to combine with training D2L quizzes: http://quizlet.com
Here is a practical guide on games and quizzes with D2L
http://www.uww.edu/icit/instructional/teachingonline/games_quizzes.html

 

Those are the students we expect on campus: http://www.edweek.org/dd/articles/2012/06/13/03games.h05.html

Clickers, iPads, and styluses: http://www.as.ua.edu/ipad/drs-hong-min-park-emily-hencken-ritter-and-greg-vonnahme-ipads-in-political-science-pt-1/

Games and gamification

References

Frossard, F., Barajas, M., & Trifonova, A. (2012). A learner-centered game-design approach: Impacts on teachers’ creativity. Digital Education Review, (21), 13-22.

Fu-Hsing Tsai, Kuang-Chao Yu, & Hsien-Sheng Hsiao. (2012). Exploring the factors influencing learning effectiveness in digital game-based learning. Journal of Educational Technology & Society, 15(3), 240-250.
