Searching for "multiple choice"

7 Free Social Media Tools for Teachers

7 Fantastic Free Social Media Tools for Teachers

http://mashable.com/2010/10/16/free-social-media-tools-for-teachers/

1. EDU 2.0

EDU 2.0 is a lot like online course management systems Blackboard and Moodle, but with a couple of distinct advantages. First, teachers can share their lesson plans, quizzes, videos, experiments and other resources in a shared library that currently hosts more than 15,000 pieces of content. Second, a community section allows teachers and students to network and collaborate with other members who share the same educational interests. And third, everything is hosted in the cloud for free.

2. SymbalooEDU

The popular visual organizing and sharing tool Symbaloo launched its “EDU” version last month. According to the company, 50,000 teachers are already using Symbaloo to organize classroom resources. The new EDU version comes with academic subject-specific resource pages or “webmixes” and top tools like TeacherTube, Slideshare, Google Docs, Flickr and more are fully embeddable. Teachers with a “Free Plus” account can add their school logo and customize the links. The site also allows students to easily share their Symbaloo pages and projects with classmates.

3. Collaborize Classroom

This app gives teachers four discussion format choices. Students can either agree or disagree with a statement, answer a multiple choice question, post responses, or have the choice between adding a new response or voting for someone else’s response. Teachers can add photos or videos to their prompts and all of the discussions take place on one class page.

4. Edublogs

This WordPress-like blogging platform only supports educational content and thus, unlike WordPress, usually isn’t blocked by school filters. Since 2005, it has hosted more than a million blogs from students and teachers.

5. Kidblog

Kidblog is a bit more specific than Edublogs. There are fewer options to adjust the appearance of the main page, and it’s hard to use the platform for anything other than as a system for managing individual class blogs. The homepage serves as a catalog of student blogs on the right with a recent post feed on the left.

Teachers can also control how private they want the blogs to be. They can keep them student-and-teacher only, allow parents to log in with a password, or make them open to the public.

6. Edmodo

Edmodo looks and functions much like Facebook. But unlike Facebook, it’s a controlled environment that teachers can effectively leverage to encourage class engagement. The platform allows teachers and students to share ideas, files and assignments on a communal wall. Teachers can organize different groups of students and monitor them from the same dashboard. Once they’ve organized classes, they can post assignments to the wall and grade them online. They can then archive the class groups and begin new ones.

7. TeacherTube and SchoolTube and YouTube

As the name implies, TeacherTube is YouTube for teachers. It’s a great resource for lesson ideas but videos can also be used during class to supplement a lecture. For instance, you can let Mrs. Burk rap about perimeters if you like her idea but lack the rhyming skills to pull it off yourself. This site also has a crowdsourced stock of documents, audio and photos that can be added to your lesson plans. Unfortunately, every video is preceded by an ad.

SchoolTube is another YouTube alternative. Unlike other video sharing sites, it is not generally blocked by school filters because all of its content is moderated.

The original, generic YouTube also has a bevy of teacher resources, though it’s often blocked in schools. Khan Academy consistently puts out high-quality lessons for every subject, but a general search on any topic usually yields a handful of lesson approaches. Some of the better ones are indexed on WatchKnow.

Three Good Tools for Building Flipped Lessons That Include Assessment Tools

http://www.freetech4teachers.com/2014/01/three-good-tools-for-building-flipped.html#.UtFjEfRDuSo

Three Good Tools for Building Flipped Lessons That Include Assessment Tools

eduCanon is a free service for creating, assigning, and tracking your students’ progress on flipped lessons. eduCanon allows teachers to build flipped lessons using YouTube and Vimeo videos, create questions about the videos, then assign lessons to their students. Teachers can track the progress of their students within eduCanon.

Teachem is a service that uses the TED Ed model of creating lessons based on video. On Teachem teachers can build courses that are composed of a series of videos hosted on YouTube. Teachers can write questions and comments in “flashcards” that are tied to specific parts of each video and display next to each video. Students can take notes while watching the videos using the Teachem SmartNote system.

Knowmia is a website and a free iPad app for creating, sharing, and viewing video lessons. One of the best features of Knowmia is a tool that they call the Assignment Wizard. The Knowmia Assignment Wizard allows teachers to design assignments that their students have to complete after watching a video. Students can check their own Knowmia accounts to see the assignments that their teachers have distributed. To aid teachers in assessing their students, Knowmia offers an automatic scoring option. Knowmia’s automatic scoring function works for multiple choice questions and numeric questions.
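As a rough illustration of what automatic scoring for these two question types involves (not Knowmia’s actual implementation or API), here is a minimal Python sketch; the exact-match rule and the numeric tolerance value are assumptions for illustration only:

```python
# Minimal sketch of automatic scoring for multiple-choice and numeric answers.
# The response formats and the tolerance value are illustrative assumptions,
# not Knowmia's actual data model or API.

def score_multiple_choice(response: str, correct: str) -> float:
    """Full credit for an exact (case-insensitive) match, zero otherwise."""
    return 1.0 if response.strip().lower() == correct.strip().lower() else 0.0

def score_numeric(response: str, correct: float, tolerance: float = 0.01) -> float:
    """Full credit if the numeric answer falls within an absolute tolerance."""
    try:
        value = float(response)
    except ValueError:
        return 0.0
    return 1.0 if abs(value - correct) <= tolerance else 0.0

if __name__ == "__main__":
    print(score_multiple_choice("B", "b"))        # 1.0
    print(score_numeric("3.14", 3.14159, 0.01))   # 1.0
```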

D2L issue/solution: set the correct responses to quiz questions

Issue:  How do I set the correct responses to quiz questions and how can they be weighted?

 

Solution:  T/F, Multiple Choice, Short Answer, Fill in the Blank, and Multiple Short Answer question types allow for weighting the correct or partially correct responses for automatically grading the questions.  See details at:  https://d2l.custhelp.com/app/answers/detail/a_id/1293/
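For illustration only, here is a minimal Python sketch of the weighting idea, assuming a scheme in which each answer option carries a weight between 0 and 1; this is not D2L’s internal scoring model, just one way partial-credit weighting can work:

```python
# Minimal sketch of weighting correct and partially correct responses when
# auto-grading a question. The weights and response format are illustrative
# assumptions, not D2L's actual scoring implementation.

def weighted_score(selected: set[str], weights: dict[str, float], points: float) -> float:
    """
    Each answer option carries a weight between 0 and 1 (e.g., the best answer 1.0,
    a partially correct answer 0.5, a wrong answer 0.0). The earned score is the
    highest weight among the selected options, scaled by the question's point value.
    """
    if not selected:
        return 0.0
    best = max(weights.get(option, 0.0) for option in selected)
    return best * points

if __name__ == "__main__":
    option_weights = {"A": 1.0, "B": 0.5, "C": 0.0, "D": 0.0}
    print(weighted_score({"B"}, option_weights, 10))  # 5.0 (partially correct)
    print(weighted_score({"A"}, option_weights, 10))  # 10.0 (fully correct)
```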

 

clickers documentation

Thursday, April 11, 11AM-1PM, Miller Center B-37
and/or
http://media4.stcloudstate.edu/scsu

We invite the campus community to a presentation by four vendors of Classroom Response Systems (CRS), AKA “clickers”:

11:00-11:30AM          Poll Everywhere              Mr. Alec Nuñez

11:30AM-12:00PM        iClicker                     Mr. Jeff Howard

12:00-12:30PM          Top Hat Monocle              Mr. Steve Popovich

12:30-1:00PM           Turning Technologies         Mr. Jordan Ferns

links to documentation from the vendors:

http://web.stcloudstate.edu/informedia/crs/ClickerSummaryReport_NDSU.docx 

 http://web.stcloudstate.edu/informedia/crs/Poll%20Everywhere.docx

http://web.stcloudstate.edu/informedia/crs/tophat1.pdf

http://web.stcloudstate.edu/informedia/crs/tophat2.pdf

http://web.stcloudstate.edu/informedia/crs/turning.pdf

Top Hat Monocle docs:

http://web.stcloudstate.edu/informedia/crs/thm/FERPA.pdf

http://web.stcloudstate.edu/informedia/crs/thm/proposal.pdf

http://web.stcloudstate.edu/informedia/crs/thm/THM_CaseStudy_Eng.pdf

http://web.stcloudstate.edu/informedia/crs/thm/thm_vsCRS.pdf

iClicker docs:
http://web.stcloudstate.edu/informedia/crs/iclicker/iclicker.pdf

http://web.stcloudstate.edu/informedia/crs/iclicker/iclicker2VPAT.pdf

http://web.stcloudstate.edu/informedia/crs/iclicker/responses.doc

 

Questions to vendor: alec@polleverywhere.com 
1. Is your system proprietary as far as the handheld device and the operating system software?

The site and the service are the property of Poll Everywhere. We do not provide handheld devices. Participants use their own device, be it a smartphone, cell phone, laptop, tablet, etc.

2. Describe the scalability of your system, from small classes (20-30) to large auditorium classes (500+).

Poll Everywhere is used daily by thousands of users. Audience sizes upwards of 500+ are not uncommon. We’ve been used for events with 30,000 simultaneous participants in the past.

3. Is your system receiver/transmitter based, wi-fi based, or other?

N/A

4. What is the usual process for students to register a “CRS” (or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this service rather than through your system? If so, how?

Student participants may register by filling out a form. Or, student information can be uploaded via a CSV.

5. Once a “CRS” is purchased, can it be used for as long as the student is enrolled in classes? Could “CRS” purchases be made available through the campus bookstore? Once a student purchases a “clicker,” are they able to transfer ownership when finished with it?

N/A. Poll Everywhere sells service licenses; the length and number of students supported would be outlined in a services agreement.

6. Will your operating software integrate with other standard database formats? If so, list which ones.

Need more information to answer.

7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.

8am to 8pm EST phone and email support from native English speakers.

8. What is your company’s history in providing this type of technology? Provide a list of higher education clients.

The company pioneered and invented the use of this technology for audience and classroom response (http://en.wikipedia.org/wiki/Poll_Everywhere). Higher education clients include:

University of Notre Dame
South Bend, Indiana

University of North Carolina-Chapel Hill
Raleigh, North Carolina

University of Southern California
Los Angeles, California

San Diego State University
San Diego, California

Auburn University
Auburn, Alabama

King’s College London
London, United Kingdom

Raffles Institution
Singapore

Fayetteville State University
Fayetteville, North Carolina

Rutgers University
New Brunswick, New Jersey

Pepperdine University
Malibu, California

Texas A&M University
College Station, Texas

University of Illinois
Champaign, Illinois

9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)

Our Privacy Policy can be found here: http://www.polleverywhere.com/privacy-policy. We take privacy very seriously.

10. What personal data does your company collect on students and for what purpose? Is it shared or sold to others? How is it protected?

Name. Phone Number. Email. For the purposes of voting and identification (Graded quizzes, attendance, polls, etc.). It is never shared or sold to others.

11. Do any of your business partners collect personal information about students that use your technology?

No.

12. With what formats can test/quiz questions be imported/exported?

Import via text. Export via CSV.

13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android).

Works via standard web technology including Safari, Chrome, Firefox, and Internet Explorer. Participant web voting is fully supported on Android and iOS devices. Text message participation is supported via both shortcode and longcode formats.

14. What are the total costs to students, including device costs and periodic or one-time operation costs?

Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.

15. Describe your costs to the institution.

Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.

16. Describe how your software integrates with PowerPoint or other presentation systems.

Downloadable slides from the website for Windows PowerPoint and downloadable app for PowerPoint and Keynote integration on a Mac.

17. State your level of integration with Desire2Learn (D2L). Does the integration require a server or other additional equipment the campus must purchase?

Export results from the site via CSV for import into D2L (a minimal sketch of one such conversion follows).
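The sketch below assumes hypothetical column names in both the Poll Everywhere export (“Email”, “Score”) and the D2L grade-import file; a real conversion would need to match the headers your actual export and your D2L gradebook use.

```python
# Minimal sketch of reshaping a poll-results CSV export into a D2L-style
# grade-import CSV. Column names on both sides are illustrative assumptions.

import csv

def convert_results(export_path: str, import_path: str, item_name: str) -> None:
    with open(export_path, newline="") as src, open(import_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.writer(dst)
        writer.writerow(["Username", f"{item_name} Points Grade", "End-of-Line Indicator"])
        for row in reader:
            # Derive the campus username from the email address (an assumption).
            writer.writerow([row["Email"].split("@")[0], row["Score"], "#"])

if __name__ == "__main__":
    convert_results("poll_results.csv", "d2l_import.csv", "Clicker Quiz 1")
```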
18. How does your company address disability accommodation for your product?

We follow the latest web standards best practices to make our website widely accessible by all. To make sure we live up to this, we test our website in a text-based browser called Lynx that makes sure we’re structuring our content correctly for screen readers and other assisted technologies.

19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the maximum question limit?

No.

20. Does your software provide for integrating multimedia files? If so, list the file format types supported.

Supports image formats (.PNG, .GIF, .JPG).

21. What has been your historic schedule for software releases, and what pricing mechanism do you make available to your clients for upgrading?

We ship new code daily. New features are released several times a year depending on when we finish them. New features are released to the website for use by all subscribers.

22. Describe your “CRS”(s).

Poll Everywhere is a web based classroom response system that allows students to participate from their existing devices. No expensive hardware “clickers” are required. More information can be found at  http://www.polleverywhere.com/classroom-response-system.

23. If applicable, what is the average life span of a battery in your device, and what battery type does it take?

N/A. Battery manufacturers hate us. Thirty percent of their annual profits can be attributed to their use in clickers (we made that up).

24. Does your system automatically save upon shutdown?

Ours is a “cloud-based” system. User data is stored there even when your computer is not on.

25. What is your company’s projection/vision for this technology in the near and far term?

We want to put clicker companies out of business. We think it’s ridiculous to charge students and institutions a premium for outdated technology when existing devices and standard web technology can be used instead for less than a tenth of the price.

26. Do any of your software/apps require administrator permission to install?

No.

27. If your system is radio frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smart phones, wireless tablets, laptops, and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?

No.

28. What impact does the solution have on the wireless network?

Depends on a variety of factors. Most university wireless networks are capable of supporting Poll Everywhere. Poll Everywhere can also make use of cell phone carrier infrastructure through SMS and data networks on the students phones.

29. Can the audience response system be used spontaneously for polling?

Yes.

30. Can quiz questions and response distributions be imported and exported from and to plaintext or a portable format? (Motivated by assessment & accreditation requirements.)

Yes.

31. Is there a requirement that a portion of the course grade be based on the audience response system?

No.

Gloria Sheldon
MSU Moorhead

Fall 2011 Student Response System Pilot

Summary Report

 

NDSU has been standardized on a single student response (i.e., “clicker”) system for over a decade, with the intent to provide a reliable system for students and faculty that can be effectively and efficiently supported by ITS. In April 2011, Instructional Services made the decision to explore other response options and to identify a suitable replacement product for the previously used e-Instruction Personal Response System (PRS). At the time, PRS was laden with technical problems that rendered the system ineffective and unsupportable. That system also had a steep learning curve, was difficult to navigate, and was unnecessarily time-consuming to use. In fact, many universities across the U.S. experienced similar problems with PRS and have since then adopted alternative systems.

A pilot to explore alternative response systems was initiated at NDSU in fall 2011. The pilot was aimed at further investigating two systems—Turning Technologies and iClicker—in realistic classroom environments. As part of this pilot program, each company agreed to supply required hardware and software at no cost to faculty or students. Each vendor also visited campus to demonstrate their product to faculty, students and staff.

An open invitation to participate in the pilot was extended to all NDSU faculty on a first-come, first-served basis. Of those who indicated interest, 12 were included as participants in this pilot.

 

Pilot Faculty Participants:

  • Angela Hodgson (Biological Sciences)
  • Ed Deckard (AES Plant Science)
  • Mary Wright (Nursing)
  • Larry Peterson (History, Philosophy & Religious Studies)
  • Ronald Degges (Statistics)
  • Julia Bowsher (Biological Sciences)
  • Sanku Mallik (Pharmaceutical Sciences)
  • Adnan Akyuz (AES School of Natural Resource Sciences)
  • Lonnie Hass (Mathematics)
  • Nancy Lilleberg (ITS/Communications)
  • Lisa Montplaisir (Biological Sciences)
  • Lioudmila Kryjevskaia (Physics)

 

Pilot Overview

The pilot included three components: 1) Vendor demonstrations, 2) in-class testing of the two systems, and 3) side-by-side faculty demonstrations of the two systems.

After exploring several systems, Instructional Services narrowed the field to two viable options—Turning Technologies and iClicker. Both of these systems met initial criteria that were assembled based on faculty input and previous usage of the existing response system. These criteria included durability, reliability, ease of use, radio frequency transmission, integration with Blackboard LMS, cross-platform compatibility (Mac, PC), stand-alone software (i.e., no longer tied to PowerPoint or other programs), multiple answer formats (including multiple choice, true/false, numeric), potential to migrate to mobile/Web solutions at some point in the future, and cost to students and the university.

In the first stage of the pilot, both vendors were invited to campus to demonstrate their respective technologies. These presentations took place during spring semester 2011 and were attended by faculty, staff and students. The purpose of these presentations was to introduce both systems and provide faculty, staff, and students with an opportunity to take a more hands-on look at the systems and provide their initial feedback.

In the second stage of the pilot, faculty were invited to test the technologies in their classes during fall semester 2011. Both vendors supplied required hardware and software at no cost to faculty and students, and both provided online training to orient faculty to their respective system. Additionally, Instructional Services staff provided follow-up support and training throughout the pilot program. Both vendors were requested to ensure system integration with Blackboard. Both vendors indicated that they would provide the number of clickers necessary to test the systems equally across campus. Clickers from both vendors were allocated to courses of varying sizes, ranging from 9 to 400+ students, to test viability in various facilities with differing numbers of users. Participating faculty agreed to offer personal feedback and collect feedback from students regarding experiences with the systems at the end of the pilot.

In the final stage of the pilot, Instructional Services facilitated a side-by-side demonstration led by two faculty members. Each faculty member showcased each product on a function-by-function basis so that attendees were able to easily compare and contrast the two systems. Feedback was collected from attendees.

 

Results of Pilot

In stage one, we established that both systems were viable, appeared to offer similar features and functions, and were compatible with existing IT systems at NDSU. The determination was made to include both products in a larger classroom trial.

In stage two, we discovered that both systems largely functioned as intended; however, several differences between the technologies, in terms of advantages and disadvantages, emerged that influenced our final recommendation. (See Appendix A for a list of these advantages, disadvantages, and potential workarounds.) We also encountered two significant issues that altered the course of the pilot. Initially, it was intended that both systems would be tested in equal number in terms of courses and students. Unfortunately, at the time of the pilot, iClicker was not able to provide more than 675 clickers, which was far fewer than anticipated. Turning Technologies was able to provide 1,395 clickers. As a result, Turning Technologies was used by a larger number of faculty and students across campus.

At the beginning of the pilot, Blackboard integration with iClicker at NDSU was not functional. The iClicker vendor provided troubleshooting assistance immediately, but the problem was not resolved until mid-November. As a result, iClicker users had to use alternative solutions for registering clickers and uploading points to Blackboard for student viewing. Turning Technologies was functional and fully integrated with Blackboard throughout the pilot.

During the span of the pilot additional minor issues were discovered with both systems. A faulty iClicker receiver slightly delayed the effective start date of clicker use in one course. The vendor responded by sending a new receiver; however, it was an incorrect model. Instructional Services temporarily exchanged receivers with another member of the pilot group until a functional replacement arrived. Similarly, a Turning Technologies receiver was received with outdated firmware. Turning Technologies support staff identified the problem and assisted in updating the firmware with an update tool located on their website. A faculty participant discovered a software flaw in the iClicker software that hides the software toolbar when disconnecting a laptop from a second monitor. iClicker technical support assisted in identifying the problem and stated the problem would be addressed in a future software update. A workaround was identified that mitigated this problem for the remainder of the pilot. It is important to note that these issues were not widespread and did not affect all pilot users; however, they attest to the need for timely, reliable, and effective vendor support.

Students and faculty reported positive experiences with both technologies throughout the semester. Based on feedback, users of both systems found the new technologies to be much improved over the previous PRS system, indicating that adopting either technology would be perceived as an upgrade among students and faculty. Faculty pilot testers met several times during the semester to discuss their experiences with each system; feedback was sent to each vendor for their comments, suggestions, and solutions.

During the stage three demonstrations, feedback from attendees focused on the inability for iClicker to integrate with Blackboard at that time and the substantial differences between the two systems in terms of entering numeric values (i.e., Turning Technologies has numeric buttons, while iClicker requires the use of a directional key pad to scroll through numeric characters). Feedback indicated that attendees perceived Turning Technologies’ clickers to be much more efficient for submitting numeric responses. Feedback regarding other functionalities indicated relative equality between both systems.

Recommendation

Based on the findings of this pilot, Instructional Services recommends that NDSU IT adopt Turning Technologies as the replacement for the existing PRS system. While both pilot-tested systems are viable solutions, Turning Technologies appears to meet the needs of a larger user base. Additionally, the support offered by Turning Technologies was more timely and effective throughout the pilot. With the limited resources of IT, vendor support is critical and was a major reason for exploring alternative student response technologies.

From Instructional Services’ standpoint, standardizing to one solution is imperative for two major reasons: cost efficiency for students (i.e., preventing students from having to purchase duplicate technologies) and efficient utilization of IT resources (i.e., support and training). It is important to note that this recommendation is based on the opinion of the Instructional Services staff and the majority of pilot testers, but is not based on consensus among all participating faculty and staff. It is possible that individual faculty members may elect to use other options that best meet their individual teaching needs, including (but not limited to) iClicker. As an IT organization, we continue to support technology that serves faculty, student and staff needs across various colleges, disciplines, and courses. We feel that this pilot was effective in determining the student response technology—Turning Technologies—that will best serve NDSU faculty, students and staff for the foreseeable future.

Once a final decision concerning standardization is made, contract negotiations should begin in earnest with the goal of completion by January 1, 2012, in order to accommodate those wishing to use clickers during the spring session.

Appendix A: Clicker Comparisons
Turning Technologies and iClicker

 

Areas where both products have comparable functionality:

  • Setting up the receiver and software
  • Student registration of clickers
  • Software interface floats above other software
    • Can use with anything – PowerPoint, Websites, Word, etc.
  • Asking questions on the fly
  • Can create questions / answers files
  • Managing scores and data
    • Allow participation points, points for correct answer, change correct answer
    • Reporting – Summary and Detailed
  • Uploading scores and data to Blackboard (but there was a big delay with the iClicker product)
  • Durability of the receivers and clickers
  • Free software
  • Offer mobile web device product to go “clickerless”

Areas where the products differ:

Main Shortcomings of Turning Technology Product:

  • Costs $5 more – no workaround
  • Doesn’t have instructor readout window on receiver base –
    • This is a handy function in iClicker that lets the instructor see the %’s of votes as they come in, allowing the instructor to plan how he/she will proceed.
    • Workaround: As the time winds down to answer the question, the question and answers are displayed on the screen. Intermittently, the instructor would push a button to mute the projector, push a button to view graph results quickly, then push a button to hide the graph, and push a button to unmute the projector. In summary, push four buttons quickly each time you want to see the feedback, and the students will see a black screen momentarily.
  • Processing multiple sessions when uploading grading –
    • Turning Technologies uses their own file structure types, but iClicker uses comma-separated-value text files, which work easily with Excel.
    • Workaround: When uploading grades into Blackboard, upload them one session at a time, and use a calculated total column in Bb to combine them (see the sketch after this list). Ideally, instructors would upload the grades daily or weekly to avoid a backlog of sessions.
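For instructors who would rather combine sessions before uploading than inside Blackboard, a minimal Python sketch of that alternative is below; the file-name pattern and column headers (“Student ID”, “Points”) are illustrative assumptions, not the exact iClicker or Turning Technologies export layout:

```python
# Minimal sketch: combine several per-session clicker CSV exports into one
# total-points column before uploading to the LMS. File names and headers
# are illustrative assumptions.

import csv
from collections import defaultdict
from glob import glob

def combine_sessions(pattern: str, out_path: str) -> None:
    totals: dict[str, float] = defaultdict(float)
    for path in sorted(glob(pattern)):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[row["Student ID"]] += float(row["Points"] or 0)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Student ID", "Total Points"])
        for student, points in sorted(totals.items()):
            writer.writerow([student, points])

if __name__ == "__main__":
    combine_sessions("session_*.csv", "combined_totals.csv")
```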

 

Main Shortcomings of iClicker Product:

  • Entering numeric answers –
    • Questions that use numeric answers are widely used in math and the sciences. Instead of choosing a multiple-choice answer, students solve the problem and enter the actual numeric answer, which can include numbers and symbols.
    • Workaround: Students push the mode button and use the directional pad to scroll up and down through a list of numbers, letters, and symbols to choose each character individually from left to right. Then they must submit the answer.
  • Number of multiple choice answers –
    • iClicker has 5 buttons on the transmitter for direct answer choices; Turning Technologies has 10.
    • Workaround: Similar to the numeric answer workaround. Once again the simpler transmitter becomes complex for the students.
  • Potential Vendor Support Problems –
    • It took iClicker over 3 months to get their grade upload interface working with NDSU’s Blackboard system. The Turning Technologies interface worked right away. No workaround.

D2L camp Monday December 2012

  1. 9:00-9:30am: Snacks, networking and welcome.
  2. 9:30-10:00am: D2L Version 10 update.
  3. 10:00-10:30am: Overview of D2L basics and share best practices. Dr. Plamen Miltenoff, LRS
    • please enter ideas and suggestions
      who is helping students with the new D2L interface?
      PPT about the changes to the new version at:
      http://web.stcloudstate.edu/informedia/d2l10.pptx
the new version does not differentiate between the teacher, T2, and GA roles unless you change the Navbar. BE AWARE that you cannot add tools (you need to request via d2L@stcloudstate.edu) but you can take off tools from the new navbar. To take off a tool, go to “Edit Course” in the new version, click on “Tools” and find “Set Inactive”
      Dropbox addition. Feedback left for students can be kept as a draft

 

  1. 10:30-11:00am: Automation of lab reports using D2L.  Dr. Zengqiang “John” Liu, Physics
  • please enter ideas and suggestions
    – D2L dropbox:
    1. when papers come in as a big stack of paper versus in electronic format in the dropbox, is it a bigger psychological burden?
    2. Navbar CANNOT be changed by faculty. Need to request the change from D2L@stcloudstate.edu
    3. When assigning bonus points, they work fine but do not apply to the final grade
    4. Naming the file deposited in the dropbox is crucial to navigating later on
    5. “Properties: One file per submission | overwrite submissions” is probably the best way to streamline the dropbox flow
    6. “Restrictions: Display in Calendar” helps students as a reminder, even if the D2L calendar is not populated and used regularly
    7. “Restrictions: Additional Release and Conditions” is the overarching idea of successful teaching. Conditioning Dropbox with Content, Discussions and Quizzes can bring uniformity and structure in students’ learning
    8. “Restrictions: Special Access” is poorly phrased and can confuse faculty.
    9. Downloading all files at once via zipped file attaches the student’s Last Name First Name to the paper’s file name
  1. 11:00-11:30am: Organization of D2L Content delivery and student learning. Dr. Lakshmaiah Sreerama, Chemistry
    please have a link to Ram’s presentation: http://web.stcloudstate.edu/informedia/d2l/Organization_D2L_Content_Student_Learning.pptx  
  • please enter ideas and suggestions
    1. what is optimal when using a CMS?
    2. the switch from WebCT to D2L was very time-consuming. Will it be again when we switch to a different one?
    3. How to deliver content is a challenge: write versus speak. Does the student take notes or listen? Also engaging becomes too much. Classes become a “flipped classroom”
    4. Modular | recorded lectures | lecture notes in several formats | study guides
    5. develop best practices for my discipline
    6. modular guide: goals | outcomes | objectives | readings | activities | quizzes
    7. recorded lectures: in the sciences they are easier to organize; how will it be in the humanities? This is where we can be creative
    8. providing all this content in all these formats made me a better teacher. It also made students better prepared for class: student learning success
    9. Best Practices used by Ram: check his PPT. -) choose a simpler presentation format -) listen to student feedback -) privacy issues (release form about taping students), intellectual property rights
    10. Flipped classroom: -) capture
    11. discussion – Camtasia versus Adobe Connect: how do we manage this? Camtasia has larger file sizes. Kaltura is still being tested. The MediaSite server carries the heavy-duty files. Authentication is not needed if the files are made public.
  1. 11:30-12:00pm: New tools in D2L. Greg Jorgensen and Karin Duncan, ITS
  • please enter ideas and suggestions
    1. search option in minibar only if faculty has ten or more classes
    2. instant notifications: new features; elect to receive emails
    3. discussions managed in two spots: -) via subscription on the top as general, or -) subscribe for each topic. There is an option: include in my summary of activity
    4. D2L now keeps “sent” email. Combine an email to all six classes I teach; how do I do that?
    5. Classlist has inconsistency, be aware, ask D2L@stcloudstate.edu about it
    6. Assess discussions has a square to check “must post first.” It is off by default. Edit topic, under Options: “A user must compose a message before participating in the topic.”
    7. reset dates via Manage Dates: instead of going to separate modules one by one and changing dates. Notice the checkbox on the right for Calendar. The offset option makes the dates relevant to this semester.
    8. App for iPad, free, Assignment Grader: leave feedback, assess using rubrics, review as PDF, and feed results back into D2L.
    9. SCORM usage can be reported into D2L. If Poll Everywhere is SCORM compliant, it can be reported via SCORM, like a poll in Adobe Connect.
    10. In Grades, Discussions, and other areas that are wide, the header image goes away
  1. 1:00-1:30pm: Case study and sharing best practices. Dr. David Switzer, Economics
  • please enter ideas and suggestions
    1. creating groups in class, with each person in a group, and locking them; do that before subscribing for discussions.
    2. gradebook exporting and importing. Problem: D2L grading is not very flexible. First export to a CSV file. Sort in Excel by last name and have it in order.
    3. bonus items in grades: to curve grades, instead of exporting and importing, go to Grades and create a bonus item called “exam 1 curve,” thus not only automating the grading but also seeing the curve next semester
    4. switch in quiz from the default “users” tab to the “questions” tab; it saves time when grading
    5. a take-home exam goes in a quiz, not in the dropbox, because the dropbox cannot be timed
    6. tip for students: discussion forums. Have students subscribe to topics. It helps students a lot, since they don’t have to go and log into D2L; they get it via email. Question: how many of them are now using mobile devices to get these notifications?
    7. the News section shows only the most recent announcements. This can be changed via settings
    8. Video, mp4 format, 7 min, intro screen capture walking students through D2L. A MnSCU video might exist.
    9. Narrated PPTs do not work well with handwriting. Use Presenter for PPT, or Camtasia
    10. Surveys. Show in class that “anonymous” is real.
    11. practice quizzes. also similar in Content. also the gamification: can go to the next quiz after 75% of the previous one is resolved
  1. 1:30-2:00pm: Creating and assigning online quizzes. Dr. Eugmin Kang, SOB
  • please enter ideas and suggestions
    1. quiz structure: the option for randomly assigning questions, so every time the student takes the training quiz again, new questions are assigned.
    2. using different types: multiple choice, true/false, images as part of the quiz question. To ensure that an equal number of questions from each section is chosen, one needs to create separate sections in the question library. To do it, create a new “random” section, e.g., named “random1,” and import the quiz questions from the book’s section 1, etc.
    3. cumulative final: pull questions for the final quiz randomly from the training quizzes (a minimal sketch of this random-selection idea follows the agenda).
  1. 2:00-3:00pm: Open time for individual projects and problem solving.
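As referenced in the 1:30-2:00pm quiz session notes above, here is a minimal Python sketch of the random-selection idea: draw an equal number of questions from each section’s pool so every attempt gets a fresh but balanced set. The pool structure is an illustrative assumption, not D2L’s question library format.

```python
# Minimal sketch of balanced random question selection from per-section pools.
# The pool layout and question identifiers are illustrative assumptions.

import random

def build_quiz(pools: dict[str, list[str]], per_section: int, seed=None) -> list[str]:
    """Sample `per_section` questions from each section's pool, then shuffle the result."""
    rng = random.Random(seed)
    quiz = []
    for section, questions in pools.items():
        quiz.extend(rng.sample(questions, min(per_section, len(questions))))
    rng.shuffle(quiz)
    return quiz

if __name__ == "__main__":
    question_pools = {
        "Chapter 1": ["q1a", "q1b", "q1c", "q1d"],
        "Chapter 2": ["q2a", "q2b", "q2c"],
    }
    print(build_quiz(question_pools, per_section=2, seed=42))
```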

please enter ideas and requests

 

You can also join us via virtual synchronous connection through Adobe Connect at:
http://media4.stcloudstate.edu/d2lworkshop/

 

Limited space; please consider registering at https://secure.mnsu.edu/mnscupd/login/default.asp?campusid=0073

 

We would like to organize a similar event sometime in January. Please share with us your preference for day/time in January 2013, as well as topics of interest.

Follow us on Twitter: @SCSUtechinstruc   #techworkshop

Hamlet on the HoloDeck

Murray, J. H. (1997). Hamlet on the holodeck: The future of narrative in cyberspace. Free Press.
https://mnpals-scs.primo.exlibrisgroup.com/permalink/01MNPALS_SCS/qoo6di/alma990011592830104318
Since 1992 I have been teaching a course on how to write electronic fiction. My students include freshmen, writing majors, and Media Lab graduate students.
As I watch the yearly growth in ingenuity among my students, I find myself anticipating a new kind of storyteller, one who is half hacker, half bard. The spirit of the hacker is one of the great creative wellsprings of our time, causing the inanimate circuits to sing with ever more individualized and quirky voices; the spirit of the bard is eternal and irreplaceable, telling us what we are doing here and what we mean to one another.
p. 12 A New Medium of Storytelling
p. 18 Aldous Huxley Brave New World
(more on it here https://blog.stcloudstate.edu/ims?s=huxley)
Set 600 years from now, it describes a society that science has dehumanized by eliminating love, parenthood, and the family in favor of genetic engineering, test-tube delivery, and state indoctrination. Books are banned, and science has come up with a substitute form of storytelling to delight the masses.
p. 20 Ray Bradbury Fahrenheit 451
p. 22 These accounts of a digital dystopia both eroticize and demonize the computer. Cyberpunk surfers are like cowboys on the new frontier, motorcycle hoodlums with a joystick in their hands instead of a motorcycle between their legs. They are outlaw pirates on an endless voyage of exploration throughout the virtual world, raiding and plundering among the invisible data hoards of the world, menaced by stronger pirates who reach in and reprogram their minds.
p. 22 William Gibson, Neuromancer
p. 28 The Harbingers of the Holodeck
The technical and economic cultivation of this fertile new medium of communication has led to several new varieties of narrative entertainment. These new storytelling formats vary from the shoot-’em-up video game and the virtual dungeons of Internet role-playing games to the postmodern literary hypertext. This wide range of narrative art holds the promise of a new medium of expression that is as varied as the printed book or the moving picture.
Books printed before 1501 are called incunabula; the word is derived from the Latin for swaddling clothes and is used to indicate that these books are the work of a technology still in its infancy.
The garish video games and tangled websites of the current digital environment are part of a similar period of technical evolution, part of a similar struggle for the conventions of coherent communication.
p. 29 Now, in the incunabular days of the narrative computer, we can see how twentieth-century novels, films, and plays have been steadily pushing against the boundaries of linear storytelling. We therefore have to start our survey of the harbingers of the holodeck with a look at multiform stories, that is, linear narratives straining against the boundaries of pre-digital media like a two-dimensional picture trying to burst out of its frame.
p. 30 The multiform story
Frank Capra’s It’s a wonderful life
p. 34 Robert Zemeskis Back to the Future
p. 35 Harold Ramis’s Groundhog Day
Multiform stories often reflect different points of view of the same event. p. 37 In Kurosawa’s Rashomon, the same crime is narrated by four different people: a rape victim; her husband, who is murdered; the bandit who attacked them; and a bystander.
p. 37 Milorad Pavic’s Dictionary of the Khazars
p. 37 Multiform narrative attempts to give a simultaneous form to these possibilities, to allow us to hold in our minds at the same time multiple contradictory alternatives.
p. 38  active audience
When the writer expands the story to include multiple possibilities, the reader assumes a more active role.
p. 40 Although television viewers have long been accused of being less actively engaged than readers or theatergoers, research on fan culture provides considerable evidence that viewers actively appropriate the stories of their favorite series. In addition to sharing critical commentary and gossip, fans create their own stories by taking characters and situations from the series and developing them in ways closer to their own concerns.
p. 42 Role-playing games are theatrical in a nontraditional but thrilling way. Players are both actors and audience for one another, and the events they portray often have the immediacy of personal experience.
p. 43 Live theater has been incorporating the same qualities of spontaneity and audience involvement for some time.
p. 43 MUDs have allowed distant players on the Internet to share a common virtual space in which they can chat with one another in real time. As the social psychologist Sherry Turkle has persuasively demonstrated, MUDs are intensely “evocative” environments for fantasy play that allow people to create and sustain elaborate fictional personas.
p. 44 movies in three dimensions
p. 51 dramatic storytelling in electronic games
p. 55 story webs
p. 59 computer scientist as storytellers
p. 65 Chapter 3 From Additive to Expressive Form
beyond multimedia
Sept 28, 1895 Arrival of the Train at La Ciotat Station
p. 66 photoplays
p. 67 One of the lessons we can learn from the history of film is that additive formulations like “photoplay” or the contemporary catchall “multimedia” are a sign that the medium is in an early stage of development and is still depending on formats derived from earlier technologies instead of exploiting its own expressive power. Today the derivative mindset is apparent in the conception of cyberspace as a place to view “pages” of print or “clips” of moving video and of CD-ROMs as offering “extended books.”
p. 60 ELIZA, 1966 Joseph Weizenbaum
p. 71 the four essential properties of digital environments
Digital environments are procedural
Digital environments are participatory
Digital environments are spatial
Digital environments are encyclopedic
p. 90 Digital structures of complexity
p. 95 Part Two: The Aesthetics of the Medium
chapter 4 immersion
definition
The experience of being transported to an elaborately simulated place is pleasurable in itself, regardless of the fantasy content. We refer to this experience as immersion. Immersion is a metaphorical term derived from the physical experience of being submerged in water. We seek the same feeling from a psychologically immersive experience that we do from a plunge in the ocean or swimming pool: the sensation of being surrounded by a completely other reality, as different as water is from air, that takes over all of our attention, our whole perceptual apparatus.
p. 99 entering the enchanted place
my note: ghost in the machine
The computer itself, even without any fantasy content, is an enchanted object. Sometimes it can act like an autonomous, animate being, sensing its environment and carrying out internally generated processes, yet it can also seem like an extension of our own consciousness, capturing our words through the keyboard and displaying them on the screen as fast as we can think them.
p. 110 the active creation of belief
In digital environments we have new opportunities to practice this active creation of belief. For instance, in an interactive video program set in Paris that my research group designed in the 1980s for language learners, we included a working telephone, represented by a photograph of a phone whose keypad could be clicked on.
p. 112 structuring participation with a mask
p. 119 regulating arousal. According to Winnicott, “the pleasurable element in playing carries with it the implication that the instinctual arousal is not excessive”; that is, the objects of the imaginary world should not be too enticing, scary, or real, lest the immersive trance be broken. This is true in any medium. If a horror movie is too frightening, we cover our eyes or turn away from the screen.
p. 126 Chapter 5: Agency
Agency is the satisfying power to take meaningful action and see the results of our decisions and choices. We expect to feel agency on the computer when we double-click on a file and see it open before us or when we enter numbers in a spreadsheet and see the totals readjust. However, we do not usually expect to experience agency within a narrative environment.
p. 129 the pleasures of navigation
One form of agency not dependent on the game structure yet characteristic of digital environment is spatial navigation. The ability to move through virtual landscapes can be pleasurable in itself, independent of the content of the spaces.
p. 130 the story of the maze
The adventure maze embodies a classic fairy-tale narrative of danger and salvation. As a format for electronic narrative, the maze is a more active version of the immersive visit (chapter 4).
p. 134 Giving Shape to Anxiety
p. 137 The Journey Story and the Pleasure of Problem Solving
p. 140 Games into Stories
p. 142 Games as Symbolic Dramas

Algorithmic Test Proctoring

Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education

SHEA SWAUGER ED-TECH

https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/

While in-person test proctoring has been used to combat test-based cheating, this can be difficult to translate to online courses. Ed-tech companies have sought to address this concern by offering to watch students take online tests, in real time, through their webcams.

Some of the more prominent companies offering these services include Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity.

Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications. 

While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers do this), we see it deployed through face detection and facial recognition used by algorithmic proctoring systems.

While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies, but these technologies generally function similarly and have shown a consistent inability to identify people with darker skin or even tell the difference between Chinese people. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.

As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that a software asking students to verify their identity is compromising for students who identify as trans, non-binary, or express their gender in ways counter to cis/heteronormativity.

These features and settings create a system of asymmetric surveillance and lack of accountability, things which have always created a risk for abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.

Their promotional messaging functions similarly to dog whistle politics which is commonly used in anti-immigration rhetoric. It’s also not a coincidence that these technologies are being used to exclude people not wanted by an institution; biometrics and facial recognition have been connected to anti-immigration policies, supported by both Republican and Democratic administrations, going back to the 1990’s.

Borrowing from Henry A. Giroux, Kevin Seeber describes the pedagogy of punishment and some of its consequences in regards to higher education’s approach to plagiarism in his book chapter “The Failed Pedagogy of Punishment: Moving Discussions of Plagiarism beyond Detection and Discipline.”

my note: I have been repeating this for years
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.

Technological Solutionism

Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”

Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.

This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.

Higher education is deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research by publishing copious literature on it, establishing endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.

+++++++++++++++++
more on privacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy

2019 Educause Horizon Report

2019 Horizon Report

Tuesday, April 23, 2019 https://library.educause.edu/resources/2019/4/2019-horizon-report

https://library.educause.edu/-/media/files/library/2019/4/2019horizonreport.pdf

p. 8 Modularized and Disaggregated Degrees

Only 2% of institutions have deployed digital microcredentials (including badging) institution-wide, but 29% are expanding or planning their use. —EDUCAUSE Strategic Technologies, 2019

p. 15 Increasing Demand for Digital Learning Experience and Instructional Design Expertise

A driving factor for mobile learning is the ownership of mobile devices, particularly the smartphone. In 2018, the Pew Research Center reported that 59% of adults globally own a smartphone, and research from the EDUCAUSE Center for Analysis and Research indicated that 95% of undergraduate students own smartphones. As mobile device ownership and usage have increased, mobile learning is no longer just focused on asynchronous interaction, content creation, and reference. More emphasis is emerging on content that is responsive instead of adaptive and on creating microlearning experiences that can sync across multiple devices and give learners the flexibility to learn on the device of their choice

p. 25 Mixed Reality

p. 36 Fail or Scale: AR and MR –
In 2016, the Horizon Expert Panel determined that augmented reality and virtual reality were two to three years from widespread adoption. By 2018, the notion of mixed reality was, at four to five years from adoption, even further out.

p. 38 Bryan Alexander: Gaming and Gamification (Fail or Scale)

++++++++++++++
more on the Horizon reports in this IMS blog
https://blog.stcloudstate.edu/ims?s=horizon+report

Performance Assessment

What Is Performance Assessment?

February 5, 2019 https://www.edweek.org/ew/articles/2019/02/06/what-is-performance-assessment.html

William Heard Kilpatrick  “The Project Method”

Today, despite major advances in ways to measure learning, we still don’t have common definitions for project-based learning or performance assessment.

In the absence of agreed-upon definitions for this evolving field, Education Week reporters developed a glossary

Proficiency-based or competency-based learning: These terms are interchangeable. They refer to the practice of allowing students to progress in their learning as they master a set of standards or competencies. Students can advance at different rates. Typically, there is an attempt to build students’ ownership and understanding of their learning goals and often a focus on “personalizing” students’ learning based on their needs and interests.

Project-based learning: Students learn through an extended project, which may have a number of checkpoints or assessments along the way. Key features are inquiry, exploration, the extended duration of the project, and iteration (requiring students to revise and reflect, for example). A subset of project-based learning is problem-based learning, which focuses on a specific challenge for which students must find a solution.

Standards-based grading: This refers to the practice of giving students nuanced and detailed descriptions of their performance against specific criteria or standards, not on a bell curve. It can stand alone or exist alongside traditional letter grading.

Performance assessment: This assessment measures how well students apply their knowledge, skills, and abilities to authentic problems. The key feature is that it requires the student to produce something, such as a report, experiment, or performance, which is scored against specific criteria.

Portfolio: This assessment consists of a body of student work collected over an extended period, from a few weeks to a year or more. This work can be produced in response to a test prompt or assignment but is often simply drawn from everyday classroom tasks. Frequently, portfolios also contain an element of student reflection.

Exhibition: A type of performance assessment that requires a public presentation, as in the sciences or performing arts. Other fields can also require an exhibition component. Students might be required, for instance, to justify their position in an oral presentation or debate.

Performance task: A piece of work students are asked to do to show how well they apply their knowledge, skills, or abilities—from writing an essay to diagnosing and fixing a broken circuit. A performance assessment typically consists of several performance tasks. Performance tasks also may be included in traditional multiple-choice tests.

 

21st Century Teaching

6 Key Trends to 21st Century Teaching

Richard Nattoo

https://www.edsurge.com/research/guides/21st-century-teaching-guide

OER on the rise

Colleges around the country have also started hiring staff members with titles like OER Coordinator and Affordable Content Librarian. Our series looked into how the movement is changing, and the research into the costs and benefits. You can even hear a podcast version here.

Flipped classrooms seem to be growing exponentially

Robert Talbert is a professor of mathematics at Grand Valley State University and author of the book Flipped Learning. Talbert recently tabulated how many scholarly articles are published each year about “flipping” instruction, meaning that traditional lecture-style material is delivered before class (often using videos) so that classroom time can be used for discussion and other more active learning.


More professors are looking to experts to help them teach. (Though some resist.)

By 2016, there were an estimated 13,000 instructional designers on U.S. campuses, according to a report by Intentional Futures. And that number seems to be growing.

There’s also a growing acceptance of the scholarly discipline known as “learning sciences,” a body of research across disciplines of cognitive science, computer science, psychology, anthropology and other fields trying to unlock secrets of how people learn and how to best teach.

here’s a classic study that shows that professors think they’re better teachers than they actually are

The classroom isn’t the only place to learn

experiments with putting office hours online to get students to show up, bringing virtual reality to science labs to broaden what students could explore there, and changing how homework and tests are written.

Students are also finding their own new ways to learn online, by engaging in online activism. The era of a campus bubble seems over in the age of Twitter

Colleges are still struggling to find the best fit for online education

We dove into what lessons can be learned from MOOCs, as well as what research says so far about which audiences online education can best serve.

And what does it mean to teach an age of information overload and polarization?

Perhaps the toughest question of all about teaching in the 21st century is what exactly the professor’s role is in the Internet age. Once upon a time the goal was to be the ‘sage on the stage,’ when lecturing was king. Today many people argue that the college instructor should be more of a ‘guide on the side.’ But as one popular teaching expert notes, even that may not quite fit.

And in an era of intense political polarization, colleges and professors are looking for how best to train students to become digitally literate so they can play their roles as informed citizens. But just how to do that is up for debate, though some are looking for a nonpartisan solution.

 
