A workshop for COLL 150 and HONS 100 instructors on May 10.
Here are the outline and resources.
Media Literacy and Skills
Media Literacy (according to Wikipedia — http://en.wikipedia.org/wiki/Media_literacy)
The term has been conceived in many different ways and across all academic departments (Mihailidis, 2008).
Media literacy is central to a broader concept of access (Sourbati, 2009).
The relationship between visual competencies and the notion of media literacy has not been fully explored or adequately specified (Griffin, 2008).
Media literacy interventions refer to education programs designed to reduce harmful effects of the media by informing the audience about one or more aspects of the media, thereby influencing media-related beliefs and attitudes, and ultimately preventing risky behaviors. Positive effects of media literacy interventions were observed across diverse agents, target age groups, settings, topics, and countries (Jeong et al., 2012).
Media literacy, information literacy, and digital literacy are the three most prevailing concepts that focus on a critical approach towards media messages (Koltay, 2011).
The 21st century has marked an unprecedented advancement of new media. New media has become so pervasive that it has penetrated every aspect of our society. New media literacy plays an essential role for any citizen to participate fully in 21st-century society. Researchers have documented that literacy has evolved historically from classic literacy (reading-writing-understanding) to audiovisual literacy to digital or information literacy and, recently, to new media literacy. A review of the literature on media literacy reveals a lack of thorough analysis of the unique characteristics of new media and their impact on the notion of new media literacy. The purpose of the study is to unpack new media literacy and propose a framework for a systematic investigation of new media literacy (Chen, Wu, & Wang, 2011).
Hobbs versus Potter
Ten basic new media skills that today’s journalist should know: http://www.siliconvalleywatcher.com/mt/archives/2008/03/ten_basic_new_m.php
- HTML is not dead. QR codes are only one new technology that can revive it. But:
- WordPress might be preferable to Adobe Dreamweaver.
- PPT is not enough. Prezi does not replace it. Then what? Desktop/laptop versus tablet (Stampsy)? Or the cloud (VoiceThread)? Do media skills = presentation skills?
- iMovie | Movie Maker (local) versus YouTube (Cloud)
- Flickr (Cloud) versus Photoshop (local).
Mihailidis, P. (2008). Are we speaking the same language? Assessing the state of media literacy in U.S. higher education. Simile, 8(4), 1-14. doi:10.3138/sim.8.4.001 http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=40303609
Hobbs, R. (2011). Empowering learners with digital and media literacy. Knowledge Quest, 39(5), 12-17. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=61819923
Koltay, T. (2011). The media and the literacies: Media literacy, information literacy, digital literacy. Media, Culture & Society, 33(2), 211-221. doi:10.1177/0163443710393382 http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=59569702
Chen, D. ("Victor"), Wu, J., & Wang, Y. (2011). Unpacking new media literacy. Journal of Systemics, Cybernetics & Informatics, 9(2), 84-88. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=83259046
Sourbati, M. (2009). Media literacy and universal access in Europe. Information Society, 25(4), 248-254. doi:10.1080/01972240903028680 http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=43050924
Griffin, M. (2008). Visual competence and media literacy: Can one exist without the other? Visual Studies, 23(2), 113-129. doi:10.1080/14725860802276255 http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=33944793
Jeong, S., Cho, H., & Hwang, Y. (2012). Media literacy interventions: A meta-analytic review. Journal of Communication, 62(3), 454-472. doi:10.1111/j.1460-2466.2012.01643.x http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=76349359
Yates, B. L. (2002). Media education's present and future: A survey of teachers. Simile, 2(3), N.PAG. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=10537377
Technology Literacy and Skills
consider this: http://blog.lib.umn.edu/chri1010/TLI/023958.html
Technology Literacy is the ability to responsibly use appropriate technology to communicate, solve problems, and access, manage, integrate, evaluate, and create information to improve learning in all subject areas and to acquire lifelong knowledge and skills in the 21st century.
Technology literacy is the ability of an individual, working independently and with others, to responsibly, appropriately and effectively use technology tools to access, manage, integrate, evaluate, create and communicate information.
“Technological Literacy is the ability to use, manage, assess, and understand technology” (Gallup Poll, 2004, p. 1). “Technological literacy encompasses three interdependent dimensions: (1) knowledge, (2) ways of thinking and acting; and (3) capabilities” (Technically Speaking, 2006, p. 1).
Comprehension of technological innovation and the impact of technology on society — may include the ability to select and use specific innovations appropriate to one’s interests and needs.
Technological Literacy Reconsidered: http://scholar.lib.vt.edu/ejournals/JTE/v4n2/waetjen.jte-v4n2.html
ICT literacy, which is increasingly referred to as the fourth literacy, is neither as well defined nor as readily assessed as reading, writing, and arithmetic (Pérez and Murray, 2010).
It is important for the public and educators to be proficient technology users, since technology literacy is one of the key skills of the 21st century (Eisenberg et al., 2010).
Technology literacy is hampered by well-intentioned educators who are trying to develop checklists and tests (Miners, 2007).
Pérez, J., & Murray, M. (2010). Generativity: The new frontier for information and communication technology literacy. Interdisciplinary Journal of Information, Knowledge & Management, 5, 127-137. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=58079824
Eisenberg, M., Johnson, D., & Berkowitz, B. (2010). Information, communications, and technology (ICT) skills curriculum based on the Big6 skills approach to information problem-solving. Library Media Connection, 28(6), 24-27. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=50728714
Miners, Z., & Pascopella, A. (2007). The new literacies. District Administration, 43(10), 26-34. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=27024204
NAEP will include technology literacy in 2012 (cover story). (2008). Electronic Education Report, 15(20), 1-7. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=62828392
Heller-Ross, H. (2004). Reinforcing information and technology literacy. College & Research Libraries News, 65(6), 321-325. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=13541089
Do you have ideas and materials regarding Media and Technology Literacy and Skills? Please contribute…
Digital literacy includes the ability to find and use information (otherwise known as information literacy) but goes beyond this to encompass communication, collaboration and teamwork, social awareness in the digital environment, understanding of e-safety and creation of new information. Both digital and information literacy are underpinned by critical thinking and evaluation.
how to evaluate digital literacy
working document for information literacy at
You are invited to participate in the “First Annual SCSU Technology in Teaching and Learning Summer Institute” co-sponsored by the Center for Continuing Studies, InfoMedia Services and the Center for Excellence in Teaching and Learning.
When? Monday, May 13 – Tuesday, May 14, 2013
Where? Miller Center
Space is limited to 75 participants. Registration is required and can be completed at this link: http://www.eventbrite.com/org/3606333855
The Institute program is available here: http://web.stcloudstate.edu/informedia/cetl/tech_institute_schedule.docx
Participants are eligible for incentive awards to support their teaching with technology. Please see the attachment, “participant incentives.”
The goal of the SCSU Technology in Teaching and Learning Summer Institute and its follow-up sessions is to provide high quality and effective pedagogical strategies, skills and discussions around the use of technology for teaching and learning in online, face-to-face and blended courses. This Institute is part of our on-going varied and collaborative efforts to foster a professional peer learning climate around teaching and learning with technology.
Participants who attend all sessions on both days, including the follow-ups, and complete all evaluations will have the opportunity to use their self-assessment of current technology skills and knowledge to select sessions in order to:
• Acquire basic and advanced skills in using the current Learning Management System, i.e. D2L
• Distinguish the appropriate use of pedagogical strategies with technology in online, face-to-face and blended settings
• Explore opportunities to improve student learning through application of e-conferencing tools (e.g. Adobe Connect), and Web 2.0 tools such as social media, etc.
• Meet and interact with faculty and staff experts and mentors and learn the processes by which they can get additional and on-going support for each of the above areas.
Please register no later than Wednesday, May 8.
“Full-time faculty and full-time professional staff with teaching responsibilities who participate in both days of the “Summer Institute” and complete the evaluations will be rewarded with a $300 coupon for a one-time purchase of material that directly supports teaching with technology at the SCSU Computer Store in the Miller Center. Faculty who participate in one of the two days will receive a $150 coupon for the same purpose. Coupons are not transferable.
Please remember that the items purchased remain the property of SCSU but may be used by the purchaser to support their teaching and related academic activities.
Upon completion of the “Summer Institute” participants will be contacted by the SCSU Online Office to verify level of participation in the institute and verify eligibility for funds. These funds must be spent by June 15, 2013.”
Clarification on Presenters Registration
- Presenters do not have to register unless they want to attend both days.
- If presenters are not going to participate in sessions other than the one(s) they are presenting, but want to eat lunch with us on either day, please contact me directly so we can add you to the lunch count and identify any dietary needs.
Thursday, April 11, 11AM-1PM, Miller Center B-37
We invite the campus community to a presentation by four vendors of Classroom Response Systems (CRS), aka “clickers”:
11:00-11:30AM Poll Everywhere, Mr. Alec Nuñez
11:30-12:00PM iClicker, Mr. Jeff Howard
12:00-12:30PM Top Hat Monocle, Mr. Steve Popovich
12:30-1:00PM Turning Technologies, Mr. Jordan Ferns
Links to documentation from the vendors:
Top Hat Monocle docs:
Questions to vendor: firstname.lastname@example.org
- 1. Is your system proprietary as far as the handheld device and the operating system software?
The site and the service are the property of Poll Everywhere. We do not provide handheld devices; participants use their own device, be it a smartphone, cell phone, laptop, tablet, etc.
- 2. Describe the scalability of your system, from small classes (20-30) to large auditorium classes (500+).
Poll Everywhere is used daily by thousands of users. Audience sizes upwards of 500+ are not uncommon. We’ve been used for events with 30,000 simultaneous participants in the past.
- 3. Is your system receiver/transmitter based, wi-fi based, or other?
- 4. What is the usual process for students to register a “CRS” (or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this service rather than through your system? If so, how?
Student participants may register by filling out a form. Or, student information can be uploaded via a CSV.
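For illustration only, here is a minimal sketch in Python of preparing such a CSV; the column names (name, phone, email, mirroring the personal data listed under question 10) are assumptions, since the vendor's actual upload template is not given here:

```python
# Hypothetical roster CSV for bulk registration; Poll Everywhere's
# real template may use different column names.
import csv

roster = [
    {"name": "Jane Doe", "phone": "320-555-0101", "email": "jdoe@example.edu"},
    {"name": "John Roe", "phone": "320-555-0102", "email": "jroe@example.edu"},
]

with open("participants.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "phone", "email"])
    writer.writeheader()
    writer.writerows(roster)
```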
- 5. Once a “CRS” is purchased, can it be used for as long as the student is enrolled in classes? Could “CRS” purchases be made available through the campus bookstore? Once a student purchases a “clicker,” are they able to transfer ownership when finished with it?
N/A. Poll Everywhere sells service licenses; the length and number of students supported would be outlined in a services agreement.
- 6. Will your operating software integrate with other standard database formats? If so, list which ones.
Need more information to answer.
- 7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.
8 am to 8 pm EST phone support (native English speakers) and email support.
- 8. What is your company’s history in providing this type of technology? Provide a list of higher education clients.
Company pioneered and invented the use of this technology for audience and classroom response (http://en.wikipedia.org/wiki/Poll_Everywhere). Higher education clients include:
- University of Notre Dame (South Bend, Indiana)
- University of North Carolina-Chapel Hill (Chapel Hill, North Carolina)
- University of Southern California (Los Angeles, California)
- San Diego State University (San Diego, California)
- King’s College London (London, United Kingdom)
- Fayetteville State University (Fayetteville, North Carolina)
- New Brunswick, New Jersey
- Texas A&M University (College Station, Texas)
- University of Illinois
- 9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)
- 10. What personal data does your company collect on students and for what purpose? Is it shared or sold to others? How is it protected?
Name, phone number, and email, for the purposes of voting and identification (graded quizzes, attendance, polls, etc.). This data is never shared or sold to others.
- 11. Do any of your business partners collect personal information about students that use your technology?
- 12. With what formats can test/quiz questions be imported/exported?
Import via text. Export via CSV.
- 13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android).
Works via standard web technology, including Safari, Chrome, Firefox, and Internet Explorer. Participant web voting is fully supported on Android and iOS devices. Text message participation is supported via both shortcode and longcode formats.
- 14. What are the total costs to students, including device costs and periodic or one-time operation costs?
Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.
- 15. Describe your costs to the institution.
Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.
- 16. Describe how your software integrates with PowerPoint or other presentation systems.
Downloadable slides from the website for Windows PowerPoint and downloadable app for PowerPoint and Keynote integration on a Mac.
- 17. State your level of integration with Desire2Learn (D2L). Does the integration require a server or other additional equipment the campus must purchase?
Export results from the site via CSV for import into D2L.
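To make the CSV round-trip concrete, here is a minimal sketch of reshaping an exported results file into a grade-import file. Every file and column name below is an assumption; check the actual export and D2L's sample import file for the exact headers:

```python
# Hypothetical reshaping of an exported results CSV (assumed "email"
# and "score" columns) into a two-column file for D2L grade import.
import csv

with open("poll_results.csv", newline="") as src, \
     open("d2l_import.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.writer(dst)
    writer.writerow(["Username", "Quiz Points Grade"])  # assumed headers
    for row in reader:
        username = row["email"].split("@")[0]  # assumes campus usernames
        writer.writerow([username, row["score"]])
```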
- 18. How does your company address disability accommodation for your product?
We follow the latest web standards best practices to make our website widely accessible to all. To make sure we live up to this, we test our website in a text-based browser called Lynx, which verifies that we’re structuring our content correctly for screen readers and other assistive technologies.
- 19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the max question limit?
- 20. Does your software provide for integrating multimedia files? If so, list the file format types supported.
Supports image formats (.PNG, .GIF, .JPG).
- 21. What has been your historic schedule for software releases, and what pricing mechanism do you make available to your clients for upgrading?
We ship new code daily. New features are released several times a year depending on when we finish them. New features are released to the website for use by all subscribers.
- 22. Describe your “CRS”(s).
Poll Everywhere is a web based classroom response system that allows students to participate from their existing devices. No expensive hardware “clickers” are required. More information can be found at http://www.polleverywhere.com/classroom-response-system.
- 23. If applicable, what is the average life span of a battery in your device and what battery type does it take?
N/A. Battery manufacturers hate us. Thirty percent of their annual profits can be attributed to clicker use (we made that up).
- 24. Does your system automatically save upon shutdown?
Ours is a “cloud-based” system; user data is stored there even when your computer is off.
- 25. What is your company’s projection/vision for this technology in the near and far term?
We want to put clicker companies out of business. We think it’s ridiculous to charge students and institutions a premium for outdated technology when existing devices and standard web technology can be used instead for less than a tenth of the price.
- 26. Do any of your software/apps require administrator permission to install?
- 27. If your system is radio frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smartphones, wireless tablets, laptops, and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?
- 28. What impact does the solution have on the wireless network?
Depends on a variety of factors. Most university wireless networks are capable of supporting Poll Everywhere. Poll Everywhere can also make use of cell phone carrier infrastructure through SMS and data networks on students’ phones.
- 29. Can the audience response system be used spontaneously for polling?
- 30. Can quiz questions and response distributions be imported and exported from and to plaintext or a portable format? (Motivated by assessment and accreditation requirements.)
- 31. Is there a requirement that a portion of the course grade be based on the audience response system?
Fall 2011 Student Response System Pilot
NDSU has standardized on a single student response (i.e., “clicker”) system for over a decade, with the intent to provide a reliable system for students and faculty that can be effectively and efficiently supported by ITS. In April 2011, Instructional Services made the decision to explore other response options and to identify a suitable replacement for the previously used e-Instruction Personal Response System (PRS). At the time, PRS was laden with technical problems that rendered the system ineffective and unsupportable. That system also had a steep learning curve, was difficult to navigate, and was unnecessarily time-consuming to use. In fact, many universities across the U.S. experienced similar problems with PRS and have since adopted alternative systems.
A pilot to explore alternative response systems was initiated at NDSU in fall 2011. The pilot was aimed at further investigating two systems—Turning Technologies and iClicker—in realistic classroom environments. As part of this pilot program, each company agreed to supply required hardware and software at no cost to faculty or students. Each vendor also visited campus to demonstrate their product to faculty, students and staff.
An open invitation to participate in the pilot was extended to all NDSU faculty on a first-come, first-served basis. Of those who indicated interest, 12 were included as participants in this pilot.
Pilot Faculty Participants:
- Angela Hodgson (Biological Sciences)
- Ed Deckard (AES Plant Science)
- Mary Wright (Nursing)
- Larry Peterson (History, Philosophy & Religious Studies)
- Ronald Degges (Statistics)
- Julia Bowsher (Biological Sciences)
- Sanku Mallik (Pharmaceutical Sciences)
- Adnan Akyuz (AES School of Natural Resource Sciences)
- Lonnie Hass (Mathematics)
- Nancy Lilleberg (ITS/Communications)
- Lisa Montplaisir (Biological Sciences)
- Lioudmila Kryjevskaia (Physics)
The pilot included three components: 1) Vendor demonstrations, 2) in-class testing of the two systems, and 3) side-by-side faculty demonstrations of the two systems.
After exploring several systems, Instructional Services narrowed the options down to two viable candidates—Turning Technologies and iClicker. Both of these systems met initial criteria that were assembled based on faculty input and previous usage of the existing response system. These criteria included durability, reliability, ease of use, radio frequency transmission, integration with the Blackboard LMS, cross-platform compatibility (Mac, PC), stand-alone software (i.e., no longer tied to PowerPoint or other programs), multiple answer formats (including multiple choice, true/false, numeric), potential to migrate to mobile/Web solutions at some point in the future, and cost to students and the university.
In the first stage of the pilot, both vendors were invited to campus to demonstrate their respective technologies. These presentations took place during spring semester 2011 and were attended by faculty, staff and students. The purpose of these presentations was to introduce both systems and provide faculty, staff, and students with an opportunity to take a more hands-on look at the systems and provide their initial feedback.
In the second stage of the pilot, faculty were invited to test the technologies in their classes during fall semester 2011. Both vendors supplied required hardware and software at no cost to faculty and students, and both provided online training to orient faculty to their respective systems. Additionally, Instructional Services staff provided follow-up support and training throughout the pilot program. Both vendors were requested to ensure system integration with Blackboard. Both vendors indicated that they would provide the number of clickers necessary to test the systems equally across campus. Clickers from both vendors were allocated to courses of varying sizes, ranging from 9 to 400+ students, to test viability in various facilities with differing numbers of users. Participating faculty agreed to offer personal feedback and collect feedback from students regarding experiences with the systems at the end of the pilot.
In the final stage of the pilot, Instructional Services facilitated a side-by-side demonstration led by two faculty members. Each faculty member showcased each product on a function-by-function basis so that attendees were able to easily compare and contrast the two systems. Feedback was collected from attendees.
Results of Pilot
In stage one, we established that both systems were viable, appeared to offer similar features and functions, and were compatible with existing IT systems at NDSU. The determination was made to include both products in a larger classroom trial.
In stage two, we discovered that both systems largely functioned as intended; however, several differences between the technologies in terms of advantages and disadvantages were discovered that influenced our final recommendation. (See Appendix A for a list of these advantages, disadvantages, and potential workarounds.) We also encountered two significant issues that altered the course of the pilot. Initially, it was intended that both systems would be tested in equal number in terms of courses and students. Unfortunately, at the time of the pilot, iClicker was not able to provide more than 675 clickers, which was far fewer than anticipated. Turning Technologies was able to provide 1,395 clickers. As a result, Turning Technologies was used by a larger number of faculty and students across campus.
At the beginning of the pilot, Blackboard integration with iClicker at NDSU was not functional. The iClicker vendor provided troubleshooting assistance immediately, but the problem was not resolved until mid-November. As a result, iClicker users had to use alternative solutions for registering clickers and uploading points to Blackboard for student viewing. Turning Technologies was functional and fully integrated with Blackboard throughout the pilot.
During the span of the pilot, additional minor issues were discovered with both systems. A faulty iClicker receiver slightly delayed the effective start date of clicker use in one course. The vendor responded by sending a new receiver; however, it was an incorrect model. Instructional Services temporarily exchanged receivers with another member of the pilot group until a functional replacement arrived. Similarly, a Turning Technologies receiver arrived with outdated firmware. Turning Technologies support staff identified the problem and assisted in updating the firmware with an update tool located on their website. A faculty participant discovered a flaw in the iClicker software that hides the software toolbar when disconnecting a laptop from a second monitor. iClicker technical support assisted in identifying the problem and stated that it would be addressed in a future software update. A workaround was identified that mitigated this problem for the remainder of the pilot. It is important to note that these issues were not widespread and did not affect all pilot users; however, they attest to the need for timely, reliable, and effective vendor support.
Students and faculty reported positive experiences with both technologies throughout the semester. Based on feedback, users of both systems found the new technologies to be much improved over the previous PRS system, indicating that adopting either technology would be perceived as an upgrade among students and faculty. Faculty pilot testers met several times during the semester to discuss their experiences with each system; feedback was sent to each vendor for their comments, suggestions, and solutions.
During the stage three demonstrations, feedback from attendees focused on the inability for iClicker to integrate with Blackboard at that time and the substantial differences between the two systems in terms of entering numeric values (i.e., Turning Technologies has numeric buttons, while iClicker requires the use of a directional key pad to scroll through numeric characters). Feedback indicated that attendees perceived Turning Technologies’ clickers to be much more efficient for submitting numeric responses. Feedback regarding other functionalities indicated relative equality between both systems.
Based on the findings of this pilot, Instructional Services recommends that NDSU IT adopt Turning Technologies as the replacement for the existing PRS system. While both pilot-tested systems are viable solutions, Turning Technologies appears to meet the needs of a larger user base. Additionally, the support offered by Turning Technologies was more timely and effective throughout the pilot. With the limited resources of IT, vendor support is critical and was a major reason for exploring alternative student response technologies.
From Instructional Services’ standpoint, standardizing to one solution is imperative for two major reasons: cost efficiency for students (i.e., preventing students from having to purchase duplicate technologies) and efficient utilization of IT resources (i.e., support and training). It is important to note that this recommendation is based on the opinion of the Instructional Services staff and the majority of pilot testers, but is not based on consensus among all participating faculty and staff. It is possible that individual faculty members may elect to use other options that best meet their individual teaching needs, including (but not limited to) iClicker. As an IT organization, we continue to support technology that serves faculty, student and staff needs across various colleges, disciplines, and courses. We feel that this pilot was effective in determining the student response technology—Turning Technologies—that will best serve NDSU faculty, students and staff for the foreseeable future.
Once a final decision concerning standardization is made, contract negotiations should begin in earnest with the goal of completion by January 1, 2012, in order to accommodate those wishing to use clickers during the spring session.
Appendix A: Clicker Comparisons
Turning Technologies and iClicker
Areas where both products have comparable functionality:
- Setting up the receiver and software
- Student registration of clickers
- Software interface floats above other software
- Can use with anything – PowerPoint, Websites, Word, etc.
- Asking questions on the fly
- Can create questions / answers files
- Managing scores and data
- Allow participation points, points for correct answer, change correct answer
- Reporting – Summary and Detailed
- Uploading scores and data to Blackboard (but there was a big delay with the iClicker product)
- Durability of the receivers and clickers
- Free software
- Offer mobile web device product to go “clickerless”
Areas where the products differ:
Main Shortcomings of Turning Technologies Product:
- Costs $5 more – no workaround
- Doesn’t have instructor readout window on receiver base –
- This is a handy function in iClicker that lets the instructor see the %’s of votes as they come in, allowing the instructor to plan how he/she will proceed.
- Workaround: As the time winds down to answer the question, the question and answers are displayed on the screen. Intermittently, the instructor would push a button to mute the projector, push a button to view graph results quickly, then push a button to hide graph and push a button to unmute the projector. In summary, push four buttons quickly each time you want to see the feedback, and the students will see a black screen momentarily.
- Processing multiple sessions when uploading grades –
- Turning Technologies uses its own file structure types, but iClicker uses comma-separated-value text files, which work easily with Excel
- Workaround: When uploading grades into Blackboard, upload them one session at a time, and use a calculated total column in Bb to combine them. Ideally, instructors would upload the grades daily or weekly to avoid backlog of sessions.
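If uploading one session at a time becomes tedious, the sessions can also be totaled offline before a single import. A minimal sketch, assuming each session exports as a CSV with "student" and "points" columns (all file and column names hypothetical):

```python
# Sum per-session clicker exports (session_*.csv) into one total per
# student; the result can then be imported as a single grade column.
import csv
from collections import defaultdict
from glob import glob

totals = defaultdict(float)
for path in glob("session_*.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["student"]] += float(row["points"])

with open("clicker_totals.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["student", "total_points"])
    for student, points in sorted(totals.items()):
        writer.writerow([student, points])
```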
Main Shortcomings of iClicker Product:
- Entering numeric answers –
- Questions that use numeric answers are widely used in Math and the sciences. Instead of choosing a multiple-choice answer, students solve the problem and enter the actual numeric answer, which can include numbers and symbols.
- Workaround: Students push mode button and use directional pad to scroll up and down through a list of numbers, letters and symbols to choose each character individually from left to right. Then they must submit the answer.
- Number of multiple choice answers –
- iClicker has 5 buttons on the transmitter for direct answer choices and Turning Technologies has 10.
- Workaround: Similar to numeric answer workaround. Once again the simpler transmitter becomes complex for the students.
- Potential Vendor Support Problems –
- It took iClicker over 3 months to get their grade upload interface working with NDSU’s Blackboard system; the Turning Technologies interface worked right away. No workaround.
We are pleased to inform you that your classroom response system has been chosen as a final candidate for campus-wide adoption/support at St. Cloud State University. Should you be interested in pursuing this opportunity, we invite you to respond to the attached list of questions and to prepare a brief presentation for members of the selection committee and interested faculty/staff.
The deadline for responding to the questions is 12:00 pm (CST), Tuesday, April 9. This deadline will allow us to review the responses in time for the vendor presentations on Thursday, April 11, 11AM-1PM. The presentations will be held virtually via Adobe Connect: http://media4.stcloudstate.edu/scsu. Please let us know if you need to test and familiarize yourself with the presentation platform.
The presentation should be no more than 10 minutes long, followed by 10 minutes for follow-up questions. We suggest that you focus on the highlights of your system, presuming a moderately knowledgeable audience. We may follow up via email or telephone call prior to making our final selection.
Thank you; we look forward to hearing from you soon.
Classroom Response System Taskforce:
Dr. Anthony Hansen
Dr. Michael Rentz
Dr. Joseph Melcher
Dr. Andrew Anda
Dr. Tracy Ore
Dr. Jack McKenna
Dr. Plamen Miltenoff
Questions to vendor
1. Is your system proprietary as far as the handheld device and the operating system software?
2. Describe the scalability of your system, from small classes (20-30) to large auditorium classes (500+).
3. Is your system receiver/transmitter based, wi-fi based, or other?
4. What is the usual process for students to register a “CRS” (or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this service rather than through your system? If so, how?
5. Once a “CRS” is purchased, can it be used for as long as the student is enrolled in classes? Could “CRS” purchases be made available through the campus bookstore? Once a student purchases a “clicker,” are they able to transfer ownership when finished with it?
6. Will your operating software integrate with other standard database formats? If so, list which ones.
7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.
8. What is your company’s history in providing this type of technology? Provide a list of higher education clients.
9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)
10. What personal data does your company collect on students and for what purpose? Is it shared or sold to others? How is it protected?
11. Do any of your business partners collect personal information about students that use your technology?
12. With what formats can test/quiz questions be imported/exported?
13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android).
14. What are the total costs to students, including device costs and periodic or one-time operation costs?
15. Describe your costs to the institution.
16. Describe how your software integrates with PowerPoint or other presentation systems.
17. State your level of integration with Desire2Learn (D2L). Does the integration require a server or other additional equipment the campus must purchase?
18. How does your company address disability accommodation for your product?
19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the max question limit?
20. Does your software provide for integrating multimedia files? If so, list the file format types supported.
21. What has been your historic schedule for software releases, and what pricing mechanism do you make available to your clients for upgrading?
22. Describe your “CRS”(s).
23. If applicable, what is the average life span of a battery in your device and what battery type does it take?
24. Does your system automatically save upon shutdown?
25. What is your company’s projection/vision for this technology in the near and far term?
26. Do any of your software/apps require administrator permission to install?
27. If your system is radio frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smartphones, wireless tablets, laptops, and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?
28. What impact does the solution have on the wireless network?
29. Can the audience response system be used spontaneously for polling?
30. Can quiz questions and response distributions be imported and exported from and to plaintext or a portable format? (Motivated by assessment and accreditation requirements.)
31. Is there a requirement that a portion of the course grade be based on the audience response system?
Plamen Miltenoff, Ph.D., MLIS
204-J James W. Miller Center
Learning Resources and Technology Services
720 Fourth Avenue South
St. Cloud, MN 56301-4498
“I am not strange, I am just not normal.” Salvador Dali
Does anyone have experience with creating math equations for quiz questions, either directly in D2L (using MathML and MathJax), with Respondus, or with another application that can be integrated into D2L?
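As one starting point, the equation can be authored as plain LaTeX, which MathJax renders in the browser and which LaTeX-aware equation editors can usually accept; a minimal example with the quadratic formula:

```latex
% Inline form, MathJax delimiters:
\( x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} \)

% Display form:
\[ x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} \]
```

The MathML equivalent of even this short formula is considerably more verbose, which is one reason to author in LaTeX and let the tool generate MathML where needed.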
More questions? email@example.com. Solutions: please log them in the comments.
One of the difficulties of working with D2L as an instructor is the inability to “see” what students see. Indeed, D2L has a student role, but…
If you are working with rubrics and advertising this feature to your students (please share your rubrics with us!), and your students are perplexed that they don’t see the rubrics as you do, please keep in mind that you need to “connect” your rubrics to the dropbox (click on “Add Rubrics” under Assessment/Dropbox/Properties/Rubrics). Students will be able to see the rubric only after the dropbox is “open.”
Please let us know if you need more information.
Follow us on Twitter: @SCSUtechInstruc | #techworkshop
D2L: SHARING PRACTICES IN LEARNING AND TEACHING
– Mostly visual changes: D2L now uses a lot of collapsing/scroll-down bars for navigation, making it more compact.
– Changes and improvements in different tools: e.g., discussions, rubrics, grades (e.g., export straight to Excel), pager, etc.
– Faculty cannot add tools to the default navbar, but can email firstname.lastname@example.org and request a tool to be added. Faculty CAN take a tool off; don’t forget to save.
– Discussions can require students to post first (before seeing others’ posts).
- 10:00-10:30am: Make D2L work for you: discussions and grades in D2L. Dr. David Switzer, Economics
– Grades: how to streamline them. Copying again and again in D2L can be too time-consuming; exporting to Excel, calculating, and importing back is easier. Remember to export a blank D2L grading item so the template can be set. Q: when will final grades export straight from D2L to R&R?
– Use subscriptions on discussions.
– Show students in class that surveys are indeed anonymous.
– Whom to turn to for help and ideas: colleagues, tech support, tech instruction people, students.
– How to organize lecture content and put it online, in D2L in particular.
– F2F, hybrid, and online: how do we choose and distinguish among them?
– Online learning and disruptive technology; touched on MOOCs and student-centered learning.
– Camtasia: free trial version of Camtasia Studio 8.0 for Windows and Mac (30-day shareware). For every minute of recorded lecture, it will take 5 to 10 minutes to record, edit, and prepare it.
– Adobe Captivate: use it through the virtual lab; it is not that convenient. $30 per year for the key-server version.
– Blueberry is superior to Camtasia in that it allows you to draw.
– Jing: free.
– Screencast: bandwidth restriction means that too many students cannot view the lecture video simultaneously. Flash-based, which is not compatible with Apple products.
– Mediaserver (media4.stcloudstate.edu): upload a zipped folder (SCORM compliant; see the zip sketch after these notes). Need an account; request one from Greg Jorgenson.
— Mike, one of the Adobe Connect participants, shared: “I’ve used ScreenHunter, which is free software, to capture images (JPG).”
– multimedia formats: video, audio, images, animations
– Differences between raster and vector graphics: Camtasia will accept only JPG and PNG formats, not vector graphics.
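On the Mediaserver note above: a common stumbling block with SCORM packages is zipping the parent folder so that imsmanifest.xml ends up inside a subfolder, when SCORM expects it at the root of the archive. A minimal sketch (folder and file names hypothetical):

```python
# Zip a SCORM folder so that imsmanifest.xml sits at the archive root.
import os
import zipfile

package_dir = "my_scorm_package"  # hypothetical folder containing imsmanifest.xml

with zipfile.ZipFile("my_scorm_package.zip", "w", zipfile.ZIP_DEFLATED) as z:
    for root, _dirs, files in os.walk(package_dir):
        for name in files:
            full_path = os.path.join(root, name)
            # Paths relative to package_dir keep the manifest at the top level.
            z.write(full_path, os.path.relpath(full_path, package_dir))
```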
- 11:30-12:00pm: Open time for individual projects and problem solving.
– Steve: rubrics and grading. D2L is not flexible, and we need to adapt our assessment to D2L’s capabilities.
– homework and papers, holistic and analytic.
– The Amazon Kindle is much better for grading online than the iPad.
– Separate criteria did not work for Steve, but Ken has his rubrics in different criteria. KISS rule. Properly define students’ expectations. Create a grid of the rubrics and then cut and paste it into the D2L rubrics. Also go over the rubric details with students.
– Ken: have several levels in rubrics. A new rubric must be “published,” not a “draft”; otherwise it cannot be linked to grades.
– calibrated peer review.
Another way of using rubrics. The potential advantage of this app is automated blind peer review; D2L cannot do it as well as this app. Handy for large classes and short writing assignments. Contact Joe Melcher (email@example.com) for an account to be created.
Crowd control versus really learning the content: the software gives good feedback on what students have actually done (student progress tab).
Export calibrated results to D2L.
- 2:00-3:00pm: Open time for individual projects and problem solving.
You can also join us via virtual synchronous connection through Adobe Connect at:
Limited space; please consider registering at: https://secure.mnsu.edu/mnscupd/login/default.asp?campusid=0073
Would you like a similar event during the Spring 2013 semester? Please share with us your preference for day/time, as well as topics of interest.
For any questions, recommendations, suggestions, please use the following contact:
Quizzes are considered mostly an assessment tool: the reward comes at the end of the game, and the player cannot “lose a life.”
Students who are used to the logic of a game, expect rewards throughout the game.
Therefore, instead of a final assessment quiz, the class can be paced with several training quizzes. Each training quiz can allow students several attempts (equal to “lives”). In addition, students can be stimulated format-wise to play the quizzes-as-gaming activity through reward systems: e.g., for each training quiz scored above a B, students can collect badges/tokens, which they can redeem at the end of class. Content-wise, students can be stimulated to play by stepping up to the next level and switching from text-based quizzes to quizzes that include more multimedia: audio, video, and interactivity.
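To make the scheme concrete, here is a minimal sketch of the bookkeeping it implies; this is not a D2L feature, and all names and thresholds are hypothetical:

```python
# Training quizzes with limited attempts ("lives") and a badge/token
# for each quiz scored above a B (here, 80%).
ATTEMPTS_ALLOWED = 3      # "lives" per training quiz
BADGE_THRESHOLD = 0.80    # score above a B earns a badge/token

def best_score(attempts):
    """Best score (0.0-1.0) across a student's attempts."""
    return max(attempts) if attempts else 0.0

def award_badges(gradebook):
    """gradebook maps student -> {quiz_name: [attempt scores]}."""
    badges = {}
    for student, quizzes in gradebook.items():
        badges[student] = [quiz for quiz, attempts in quizzes.items()
                           if len(attempts) <= ATTEMPTS_ALLOWED
                           and best_score(attempts) >= BADGE_THRESHOLD]
    return badges

if __name__ == "__main__":
    gradebook = {"alice": {"quiz1": [0.70, 0.85], "quiz2": [0.95]},
                 "bob":   {"quiz1": [0.60, 0.65, 0.75]}}
    print(award_badges(gradebook))  # {'alice': ['quiz1', 'quiz2'], 'bob': []}
```

At the end of the course, each student's badge count can be redeemed according to whatever exchange rate the instructor announces up front.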
#techworkshop #pm great tool to combine with training D2L quizzes: http://quizlet.com
Here is a practical guide on games and quizzes with D2L
Those are the students we expect on campus: http://www.edweek.org/dd/articles/2012/06/13/03games.h05.html
Clickers, iPads, and styluses: http://www.as.ua.edu/ipad/drs-hong-min-park-emily-hencken-ritter-and-greg-vonnahme-ipads-in-political-science-pt-1/
Games and gamification
Frossard, F., Barajas, M., & Trifonova, A. (2012). A learner-centered game-design approach: Impacts on teachers’ creativity. Digital Education Review, (21), 13-22.
Tsai, F.-H., Yu, K.-C., & Hsiao, H.-S. (2012). Exploring the factors influencing learning effectiveness in digital game-based learning. Journal of Educational Technology & Society, 15(3), 240-250.