
Recommendations for games and gaming at LRS

Gaming and Gamification in academic and library settings (paper)
Short URL: http://scsu.mn/1F008Re 

Based on the literature regarding games, gaming, gamification, game-based learning, and serious games, several clear trends emerge:

  1. Gaming and gamification in the sense of game-based learning is about using games and game-like tactics in the education process for greater engagement and better learning outcomes. However, this is only the first level of such an initiative. The second, higher level is about involving students in the game-building and gamification of the learning process (as per Vygotsky's Zone of…), thus achieving student-centered, experiential learning.
  2. When hosting games and gaming in any library, "in-person" or electronic/online games are welcome but not sufficient on their own to fulfill their promise, especially in an academic library. Per (1), an academic library has the responsibility to involve students and guide them in learning how to engage in the building process required for true game-based learning.
  3. Game-based learning, and gaming and gamification in particular, in educational (academic library) settings must treat mobile devices and the BYOD movement as intrinsic parts of the entire process. Approaching the initiative primarily by acquiring online or "in-person" games, or game consoles, has the same limited educational potential as merely hosting games, rather than elevating students to full guided engagement with game-based learning. If public relations and a raised profile are the main goals for the academic library, such an approach is justified. If the academic library seeks to maximize the value of game-based learning, then it must consider: a. gaming consoles, b. mobile devices as part of a BYOD initiative, and c. cloud-based/social games such as Minecraft, SimCity, etc.
  4. Design for game-based learning, gaming and gamification in educational (academic library) settings must include multiple forms of assessment and reward, e.g. badges, leaderboards and/or certificates, as an intrinsic part of the entire process. Merely hosting games in the academic library cannot guarantee true game-based learning. The academic library, at the forefront of a game-based learning initiative on campus, must work with faculty on understanding and fine-tuning badges and similar new forms of assessment and reward as they implement large-scale game-based learning focused on students' learning gains.

Recommendations for LRS

  1. In regard to LRS, the gaming and gamification process must be organized and led by faculty, including housing and distributing the hardware, software and applications, when needed.
  2. The attached paper and its conclusions, summarized in four points, demand an educational and experiential background that is beyond the limits of the LRS staff. In addition, the LRS staff has clearly stated that the pedagogical value of gaming and gamification is beyond their interest. This recommendation does not contradict the fact that LRS staff have the opportunity to participate in and contribute to the process; it only rules out staff mandating and leading the process, since that would keep the gaming and gamification process at a very rudimentary level.
  3. The process must further be led by faculty with a terminal degree in education (Ph.D.) and experience in the educational field since, as shown by the attached paper and its four-point conclusion, the goal is not a public-library type of hosting activity but rather involving students in a pedagogically sound creative process, with the corresponding opportunity for assessment and future collaboration with instructors across campus. This recommendation does not contradict the fact that LRS library faculty have the opportunity to participate actively in and contribute to the process. It only safeguards against restricting the process to the realm of "public-library" hosting activities and failing to elevate it to the needs of an academic campus and to connections with instructors across campus.
  4. These conclusions adhere to and are derived from the document recommended by the LRS dean and discussed and accepted by LRS faculty in 2013 on new trends and directions in academic libraries, namely diversification of LRS faculty: breaking from the traditional library mold by including faculty from different disciplines with different opinions and ideas.

mega trends in technology

THE SIX BIGGEST TRENDS IN SOCIAL THAT WILL BLOW YOUR MIND

Mega shifts in social business will significantly affect the way that business will run in the future.

http://www.socialmediaexplorer.com/social-media-marketing/the-six-biggest-trends-in-social-that-will-blow-your-mind/

1. Big Data

How it works: Businesses collect multiple data points, helping to create hyper-specific marketing for users, while making better predictions with more information from a larger data set.

Examples: You've already seen this when Target figured out a teen was pregnant before her dad did. Even though she didn't buy diapers or formula, her purchasing habits correlated closely with those of other customers who were pregnant, and Target sent her coupons for her upcoming baby.
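To make the mechanics concrete, here is a minimal, hedged sketch of correlation-based prediction from purchase histories. Every product, purchase vector, and label below is fabricated for illustration and has nothing to do with Target's actual model; it only shows how a larger labeled data set lets a simple classifier score a new customer.

```python
# Toy sketch: predicting a shopper attribute from purchase history.
# All data is fabricated for illustration; real retailers use far
# larger feature sets and data volumes.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: bought unscented lotion, prenatal vitamins, cotton balls, beer
purchases = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [0, 0, 0, 1],
    [0, 1, 0, 0],
    [0, 0, 1, 1],
    [1, 1, 0, 0],
])
# 1 = shopper later signed up for a baby registry, 0 = did not (made up)
signed_up = np.array([1, 1, 0, 0, 0, 1])

model = LogisticRegression().fit(purchases, signed_up)

# Score a new basket: lotion + vitamins, no beer
new_basket = np.array([[1, 1, 0, 0]])
print(model.predict_proba(new_basket)[0, 1])  # estimated likelihood
```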

Factors: Big data is being powered by the reduction in costs of data storage, as well as an explosion in the ability of businesses to capture data points. Never before have retailers been able to capture as much data about purchases, never before has online tracking been so robust, nor have social platforms offered access to so much data about users.

How to Prepare: As a user, you can expect to see much more targeted marketing, and not necessarily what you may expect. By drawing conclusions from large sets of data, companies may even get a little creepy in their ability to predict your life – like the Target pregnancy example. For marketers, you can expect to find new ways to streamline your sales funnel and get more analytical data about customers through social networks, web analytics groups and at retail.

2. Social Tool Aggregation

How it works: More and more third-party tools are springing up to help marketers and social network users make sense of multiple networks. Furthermore, networks themselves are offering ways of connecting to other apps and networks.

Examples: Tools like IFTTT and Zapier use social network APIs to trigger responses, while others like HootSuite allow users to aggregate communications from multiple networks into one tool. At the same time, tools like About.me allow a combined view of an individual's social activity. Furthermore, networks themselves are beginning to integrate: Facebook allows cross-posts from Instagram, Foursquare, Yelp and a variety of others.
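The "trigger and action" pattern these tools rely on can be sketched in a few lines. The endpoints, field names, and polling interval below are hypothetical placeholders, not the real IFTTT, Zapier, or social network APIs.

```python
# Minimal sketch of the trigger-action pattern behind aggregation tools:
# poll one service for new events, then push a matching action to another.
# Both URLs and payload fields are hypothetical placeholders.
import time
import requests

SOURCE_FEED = "https://api.example-network.com/me/posts"    # placeholder
TARGET_HOOK = "https://hooks.example-target.com/cross-post"  # placeholder

seen_ids = set()

def poll_and_cross_post():
    posts = requests.get(SOURCE_FEED, timeout=10).json()
    for post in posts:
        if post["id"] not in seen_ids:
            seen_ids.add(post["id"])
            # "Action": forward the new post to a second service
            requests.post(TARGET_HOOK, json={"text": post["text"]}, timeout=10)

if __name__ == "__main__":
    while True:
        poll_and_cross_post()
        time.sleep(60)  # check for new "triggers" once a minute
```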

Factors: It’s already taking too much time for individuals and marketers alike to keep up with just a couple social networks, and both the social networks and third-party tools know this. By consolidating social network interaction into a single place, users may be able to spend less time trying to make sense of the chaos.

How to Prepare: Users and marketers alike should keep an eye out for how this data is being used. What happens if you like Eminem on Facebook, but check into a venue during a Taylor Swift concert on Foursquare? What happens if you listen to the Glee channel on Pandora? What says more about who you really are? Do these networks share that information? Is it part of the authorization you okayed? The future may tell.

3. Social Network Consolidation

How it works: Social networks and tool providers are consolidating to remain competitive, both in creating a better offering for users, as well as buying market share.

Examples: Facebook has made nearly 40 acquisitions since 2005, including technologies that help import contacts, manage photos, create mobile apps, and more, with its largest acquisition being Instagram for one beelion dollars (Doctor Evil style, of course). Not to be outdone, LinkedIn has scored about 10 acquisitions of its own, including SlideShare. Twitter has acquired tools like TweetDeck and platforms like Posterous, and has created Vine. Acquisitions aren't limited to social networks; they extend into social tools as well. Salesforce just had its largest couple of years so far, acquiring Radian6, Buddy Media and, most recently, its largest, ExactTarget. Adobe purchased Omniture, Google bought YouTube and Wildfire Apps, and Oracle took over Involver social apps. Everyone is finding some value in social.

Factors: Not only is social the big thing, but consolidation is the logical next step after social aggregation. People want to be able to publish easily across social networks, and marketers want the ability to create one true set of data. Rather than offering multiple tools, these companies are attempting to offer consolidated suites for data creation, storage and analysis.

How to Prepare: Marketers need to be aware of evolving tools and networks. When Twitter bought TweetDeck, it dropped many of the supported features for Facebook, LinkedIn, Myspace and others. Be aware of these types of changes so you can make plans for uninterrupted service.

4. Crowdsourcing

How it works: Companies are offering bigger roles to consumers.

Examples: Small and medium businesses often turn to sites like DesignCrowd, which offers thousands of designers the opportunity to design a logo, print piece or something else. The customer picks the best designs, requests revisions, and the winner gets about $200. Starbucks turned to crowdsourcing for new product ideas, with over 50,000 ideas coming through My Starbucks Idea. Doritos, Lincoln, Pepsi, Pizza Hut, Toyota and others have even crowdsourced Super Bowl ads.

Factors: Customers want to have a stake in companies. As more businesses go to greater and greater lengths to spotlight influential users or creative user-generated work, consumers are expecting to interact more and more with companies in these ways. Furthermore, consumers are expecting more unique messaging rather than traditional corporate marketing speak.

How to Prepare: Find new ways that you can incorporate customer feedback and ideas into marketing campaigns, product updates or other areas of the business.

5. Sharing Economy

How it works: Online networks – "peer-to-peer marketplaces" – are set up so people can pay to use other people's spare assets: rent a bedroom or a car from, or even eat a meal with, complete strangers.

Examples: Perhaps some of the first companies in this space followed the crowdfunding model, with Kickstarter and Indiegogo being the top two. Airbnb lets hosts rent out unoccupied living space, from a single bedroom to an entire island, with 250,000 listings in 192 countries. TaskRabbit allows users to outsource small jobs, such as picking up dog food and dropping it off at your door. RelayRides even offers unused personal vehicles for rent.

Factors: It could be the downturn in the economy making some folks want to rent out their cars and rooms for extra cash, or causing others to avoid committing to a car payment. Furthermore, people are increasingly aware of the toll on natural resources in manufacturing and the high costs of parking in major urban areas. Sharing-based businesses help to alleviate these problems and make use of otherwise idle resources.

How to Prepare: See if there may be a natural fit in working with one of these sharing services or offering your services through one. Jeremiah Owyang offers an example where Marriott could work with shared-lodging hosts to offer a "stamp of approval" of sorts, where hosts could agree to abide by certain standards or receive certain training to become certified. Marriott could even offer bedding, linens or other materials that would both help guests feel more confident in their accommodations and help hosts distinguish themselves from competitors.

6. Quantified Self

How it works: Individuals use devices or social networks to track information about themselves. This data can be cross-referenced to identify some interesting trends about your own behavior.

Examples: FitBit tracks your physical activity, while foursquare tracks the types of businesses where you check in. It's not too difficult to find out that when you go to movie theaters, you tend to eat poorly, and when you go to museums, you add an extra thousand steps to your routine. Apply that across other areas of life – music, work, love – and some very interesting trends can turn up.
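A minimal sketch of that kind of cross-referencing, assuming you have already exported your own data to two CSV files; the file names and column layouts are invented for illustration.

```python
# Sketch: join daily step counts with check-in categories to spot patterns.
# File names and column layouts are hypothetical examples of exported data.
import csv
from collections import defaultdict

steps_by_date = {}
with open("fitbit_steps.csv", newline="") as f:          # columns: date,steps
    for row in csv.DictReader(f):
        steps_by_date[row["date"]] = int(row["steps"])

steps_by_category = defaultdict(list)
with open("foursquare_checkins.csv", newline="") as f:   # columns: date,category
    for row in csv.DictReader(f):
        if row["date"] in steps_by_date:
            steps_by_category[row["category"]].append(steps_by_date[row["date"]])

for category, counts in steps_by_category.items():
    print(category, sum(counts) // len(counts), "avg steps on those days")
```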

Factors: People are increasingly using technology to extrapolate information and work more efficiently. Furthermore, increased scrutiny of the NSA and greater awareness of privacy have perhaps made people more interested in creating and storing their own information.

How to Prepare: Companies need to offer APIs and other ways for users to control and access their own information where possible. Connect to services like IFTTT and Zapier so users can import data and manipulate it, and make accommodations for people using personal technology like FitBits, Nike Fuelband, Jawbone Up, and others.
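As a hedged illustration of "offering an API" for users' own data, here is a minimal sketch using Flask; the route, fields, and in-memory data store are assumptions for illustration, not any vendor's actual API.

```python
# Minimal sketch of a user-data export endpoint, so users (or tools like
# IFTTT/Zapier) can pull their own activity. Route, fields, and storage
# are invented for illustration.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for a real database of per-user activity records
ACTIVITY = {
    "alice": [{"date": "2014-01-06", "steps": 9200, "checkins": 3}],
}

@app.route("/api/v1/users/<username>/activity")
def export_activity(username):
    # A real service would require authentication/authorization here.
    return jsonify(records=ACTIVITY.get(username, []))

if __name__ == "__main__":
    app.run()
```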

Overall, these mega shifts in social networking and social business can significantly affect the way that business will run in the future. Are you prepared? Have you seen these shifts or experienced them? Look for our future posts on the Micro-Trends within each of these larger trends and let us know your thoughts in the comments.

==========================================================================

CES 2014: Four mega-trends for the professionals

Summary: Trends matter at CES. While there may not be major product announcements, trends will emerge to shape 2014. Here’s what to watch in business tech.

http://www.zdnet.com/ces-2014-four-mega-trends-for-the-professionals-7000024727/

1. Wearables

2. The Internet of Things

3. Contextual computing

4. Consumerization of business tech

The blog comments under the article hold good information.

clickers documentation

Thursday, April 11, 11AM-1PM, Miller Center B-37
and/or
http://media4.stcloudstate.edu/scsu

We invite the campus community to a presentation by four vendors of Classroom Response Systems (CRS), AKA "clickers":

11:00-11:30 AM    Poll Everywhere         Mr. Alec Nuñez
11:30-12:00 PM    iClicker                Mr. Jeff Howard
12:00-12:30 PM    Top Hat Monocle         Mr. Steve Popovich
12:30-1:00 PM     Turning Technologies    Mr. Jordan Ferns

links to documentation from the vendors:

http://web.stcloudstate.edu/informedia/crs/ClickerSummaryReport_NDSU.docx 

 http://web.stcloudstate.edu/informedia/crs/Poll%20Everywhere.docx

http://web.stcloudstate.edu/informedia/crs/tophat1.pdf

http://web.stcloudstate.edu/informedia/crs/tophat2.pdf

http://web.stcloudstate.edu/informedia/crs/turning.pdf

Top Hat Monocle docs:

http://web.stcloudstate.edu/informedia/crs/thm/FERPA.pdf

http://web.stcloudstate.edu/informedia/crs/thm/proposal.pdf

http://web.stcloudstate.edu/informedia/crs/thm/THM_CaseStudy_Eng.pdf

http://web.stcloudstate.edu/informedia/crs/thm/thm_vsCRS.pdf

iClicker docs:
http://web.stcloudstate.edu/informedia/crs/iclicker/iclicker.pdf

http://web.stcloudstate.edu/informedia/crs/iclicker/iclicker2VPAT.pdf

http://web.stcloudstate.edu/informedia/crs/iclicker/responses.doc

 

Questions to vendor: alec@polleverywhere.com 
  1. Is your system proprietary as far as the handheld device and the operating system software?

The site and the service are the property of Poll Everywhere. We do not provide handheld devices. Participants use their own device, be it a smart phone, cell phone, laptop, tablet, etc.

  2. Describe the scalability of your system, from small classes (20-30) to large auditorium classes (500+).

Poll Everywhere is used daily by thousands of users. Audience sizes upwards of 500+ are not uncommon. We’ve been used for events with 30,000 simultaneous participants in the past.

  3. Is your system receiver/transmitter based, wi-fi based, or other?

N/A

  4. What is the usual process for students to register a "CRS" (or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this service rather than through your system? If so, how?

Student participants may register by filling out a form, or student information can be uploaded via a CSV.
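Poll Everywhere's exact upload format isn't specified here, so the sketch below only illustrates the general idea of preparing a roster CSV for bulk registration; the column names and sample students are assumptions.

```python
# Sketch: write a student roster to CSV for a bulk registration upload.
# Column names are illustrative guesses, not the vendor's documented format.
import csv

students = [
    {"first_name": "Jane", "last_name": "Doe", "email": "jane.doe@stcloudstate.edu"},
    {"first_name": "John", "last_name": "Smith", "email": "john.smith@stcloudstate.edu"},
]

with open("roster.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["first_name", "last_name", "email"])
    writer.writeheader()
    writer.writerows(students)
```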

  5. Once a "CRS" is purchased, can it be used for as long as the student is enrolled in classes? Could "CRS" purchases be made available through the campus bookstore? Once a student purchases a "clicker", are they able to transfer ownership when finished with it?

N/A. Poll Everywhere sells service licenses; the length and number of students supported would be outlined in a services agreement.

  6. Will your operating software integrate with other standard database formats? If so, list which ones.

Need more information to answer.

  7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.

8 AM to 8 PM EST native English-speaking phone support, plus email support.

  8. What is your company's history in providing this type of technology? Provide a list of higher education clients.

The company pioneered and invented the use of this technology for audience and classroom response: http://en.wikipedia.org/wiki/Poll_Everywhere. Higher education clients include:

University of Notre Dame
South Bend, Indiana

University of North Carolina-Chapel Hill
Chapel Hill, North Carolina

University of Southern California
Los Angeles, California

San Diego State University
San Diego, California

Auburn University
Auburn, Alabama

King’s College London
London, United Kingdom

Raffles Institution
Singapore

Fayetteville State University
Fayetteville, North Carolina

Rutgers University
New Brunswick, New Jersey

Pepperdine University
Malibu, California

Texas A&M University
College Station, Texas

University of Illinois
Champaign, Illinois

  9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)

Our Privacy Policy can be found here: http://www.polleverywhere.com/privacy-policy. We take privacy very seriously.

  10. What personal data does your company collect on students and for what purpose? Is it shared or sold to others? How is it protected?

Name, phone number, and email, for the purposes of voting and identification (graded quizzes, attendance, polls, etc.). It is never shared or sold to others.

  11. Do any of your business partners collect personal information about students that use your technology?

No.

  12. With what formats can test/quiz questions be imported/exported?

Import via text. Export via CSV.

  13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android)?

Works via standard web technology, including Safari, Chrome, Firefox, and Internet Explorer. Participant web voting is fully supported on Android and iOS devices. Text message participation is supported via both shortcode and longcode formats.

  14. What are the total costs to students, including device costs and periodic or one-time operation costs?

Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.

  15. Describe your costs to the institution.

Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.

  16. Describe how your software integrates with PowerPoint or other presentation systems.

Downloadable slides from the website for Windows PowerPoint and downloadable app for PowerPoint and Keynote integration on a Mac.

  17. State your level of integration with Desire2Learn (D2L). Does the integration require a server or other additional equipment the campus must purchase?

Export results from the site via CSV for import into D2L.
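A minimal sketch of that CSV hand-off, assuming an exported results file with email and score columns and a D2L import file keyed by username; both layouts are assumptions for illustration, so check the actual Poll Everywhere export and D2L import formats before relying on it.

```python
# Sketch: reshape an exported results CSV into a gradebook-import CSV.
# Both column layouts are assumed for illustration only.
import csv

with open("poll_results.csv", newline="") as src, \
        open("d2l_import.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)                         # e.g. email,score
    writer = csv.writer(dst)
    writer.writerow(["Username", "Quiz Points Grade"])   # assumed header
    for row in reader:
        username = row["email"].split("@")[0]            # derive login from email
        writer.writerow([username, row["score"]])
```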
  18. How does your company address disability accommodation for your product?

We follow the latest web standards best practices to make our website widely accessible to all. To make sure we live up to this, we test our website in a text-based browser called Lynx, which verifies that we're structuring our content correctly for screen readers and other assistive technologies.

  19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the maximum limit?

No.

  20. Does your software provide for integrating multimedia files? If so, list the file format types supported.

Supports image formats (.PNG, .GIF, .JPG).

  21. What has been your historic schedule for software releases, and what pricing mechanism do you make available to your clients for upgrading?

We ship new code daily. New features are released several times a year depending on when we finish them. New features are released to the website for use by all subscribers.

  22. Describe your "CRS"(s).

Poll Everywhere is a web based classroom response system that allows students to participate from their existing devices. No expensive hardware “clickers” are required. More information can be found at  http://www.polleverywhere.com/classroom-response-system.

  23. If applicable, what is the average life span of a battery in your device, and what battery type does it take?

N/A. Battery manufacturers hate us. Thirty percent of their annual profits can be attributed to their use in clickers (we made that up).

  24. Does your system automatically save upon shutdown?

Ours is a "cloud-based" system. User data is stored there even when your computer is not on.

  25. What is your company's projection/vision for this technology in the near and far term?

We want to put clicker companies out of business. We think it's ridiculous to charge students and institutions a premium for outdated technology when existing devices and standard web technology can be used instead for less than a tenth of the price.

  26. Do any of your software/apps require administrator permission to install?

No.

  27. If your system is radio frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smart phones, wireless tablets, laptops and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?

No.

  28. What impact does the solution have on the wireless network?

Depends on a variety of factors. Most university wireless networks are capable of supporting Poll Everywhere. Poll Everywhere can also make use of cell phone carrier infrastructure through SMS and data networks on the students' phones.

  29. Can the audience response system be used spontaneously for polling?

Yes.

  30. Can quiz questions and response distributions be imported and exported from and to plain text or a portable format? (Motivated by assessment & accreditation requirements.)

Yes.

  31. Is there a requirement that a portion of the course grade be based on the audience response system?

No.

Gloria Sheldon
MSU Moorhead

Fall 2011 Student Response System Pilot

Summary Report

 

NDSU has standardized on a single student response (i.e., "clicker") system for over a decade, with the intent of providing a reliable system for students and faculty that can be effectively and efficiently supported by ITS. In April 2011, Instructional Services made the decision to explore other response options and to identify a suitable replacement product for the previously used e-Instruction Personal Response System (PRS). At the time, PRS was laden with technical problems that rendered the system ineffective and unsupportable. That system also had a steep learning curve, was difficult to navigate, and was unnecessarily time-consuming to use. In fact, many universities across the U.S. experienced similar problems with PRS and have since adopted alternative systems.

A pilot to explore alternative response systems was initiated at NDSU in fall 2011. The pilot was aimed at further investigating two systems—Turning Technologies and iClicker—in realistic classroom environments. As part of this pilot program, each company agreed to supply required hardware and software at no cost to faculty or students. Each vendor also visited campus to demonstrate their product to faculty, students and staff.

An open invitation to participate in the pilot was extended to all NDSU faculty on a first-come, first-served basis. Of those who indicated interest, 12 were included as participants in this pilot.

 

Pilot Faculty Participants:

  • Angela Hodgson (Biological Sciences)
  • Ed Deckard (AES Plant Science)
  • Mary Wright (Nursing)
  • Larry Peterson (History, Philosophy & Religious Studies)
  • Ronald Degges (Statistics)
  • Julia Bowsher (Biological Sciences)
  • Sanku Mallik (Pharmaceutical Sciences)
  • Adnan Akyuz (AES School of Natural Resource Sciences)
  • Lonnie Hass (Mathematics)
  • Nancy Lilleberg (ITS/Communications)
  • Lisa Montplaisir (Biological Sciences)
  • Lioudmila Kryjevskaia (Physics)

 

Pilot Overview

The pilot included three components: 1) Vendor demonstrations, 2) in-class testing of the two systems, and 3) side-by-side faculty demonstrations of the two systems.

After exploring several systems, Instructional Services narrowed the field to two viable options—Turning Technologies and iClicker. Both of these systems met initial criteria that were assembled based on faculty input and previous usage of the existing response system. These criteria included durability, reliability, ease of use, radio frequency transmission, integration with Blackboard LMS, cross-platform compatibility (Mac, PC), stand-alone software (i.e., no longer tied to PowerPoint or other programs), multiple answer formats (including multiple choice, true/false, numeric), potential to migrate to mobile/Web solutions at some point in the future, and cost to students and the university.

In the first stage of the pilot, both vendors were invited to campus to demonstrate their respective technologies. These presentations took place during spring semester 2011 and were attended by faculty, staff and students. The purpose of these presentations was to introduce both systems and provide faculty, staff, and students with an opportunity to take a more hands-on look at the systems and provide their initial feedback.

In the second stage of the pilot, faculty were invited to test the technologies in their classes during fall semester 2011. Both vendors supplied required hardware and software at no cost to faculty and students, and both provided online training to orient faculty to their respective system. Additionally, Instructional Services staff provided follow-up support and training throughout the pilot program. Both vendors were requested to ensure system integration with Blackboard. Both vendors indicated that they would provide the number of clickers necessary to test the systems equally across campus. Clickers from both vendors were allocated to courses of varying sizes, ranging from 9 to 400+ students, to test viability in various facilities with differing numbers of users. Participating faculty agreed to offer personal feedback and collect feedback from students regarding experiences with the systems at the end of the pilot.

In the final stage of the pilot, Instructional Services facilitated a side-by-side demonstration led by two faculty members. Each faculty member showcased each product on a function-by-function basis so that attendees were able to easily compare and contrast the two systems. Feedback was collected from attendees.

 

Results of Pilot

In stage one, we established that both systems were viable, appeared to offer similar features and functions, and were compatible with existing IT systems at NDSU. The determination was made to include both products in a larger classroom trial.

In stage two, we discovered that both systems largely functioned as intended; however, several differences between the technologies in terms of advantages and disadvantages were discovered that influenced our final recommendation. (See Appendix A for a list of these advantages, disadvantages, and potential workarounds.) We also encountered two significant issues that altered the course of the pilot. Initially, it was intended that both systems would be tested in equal number in terms of courses and students. Unfortunately, at the time of the pilot, iClicker was not able to provide more than 675 clickers, which was far fewer than anticipated. Turning Technologies was able to provide 1,395 clickers. As a result, Turning Technologies was used by a larger number of faculty and students across campus.

At the beginning of the pilot, Blackboard integration with iClicker at NDSU was not functional. The iClicker vendor provided troubleshooting assistance immediately, but the problem was not resolved until mid-November. As a result, iClicker users had to use alternative solutions for registering clickers and uploading points to Blackboard for student viewing. Turning Technologies was functional and fully integrated with Blackboard throughout the pilot.

During the span of the pilot, additional minor issues were discovered with both systems. A faulty iClicker receiver slightly delayed the effective start date of clicker use in one course. The vendor responded by sending a new receiver; however, it was an incorrect model. Instructional Services temporarily exchanged receivers with another member of the pilot group until a functional replacement arrived. Similarly, a Turning Technologies receiver was received with outdated firmware. Turning Technologies support staff identified the problem and assisted in updating the firmware with an update tool located on their website. A faculty participant discovered a software flaw in the iClicker software that hides the software toolbar when disconnecting a laptop from a second monitor. iClicker technical support assisted in identifying the problem and stated the problem would be addressed in a future software update. A workaround was identified that mitigated this problem for the remainder of the pilot. It is important to note that these issues were not widespread and did not affect all pilot users; however, they attest to the need for timely, reliable, and effective vendor support.

Students and faculty reported positive experiences with both technologies throughout the semester. Based on feedback, users of both systems found the new technologies to be much improved over the previous PRS system, indicating that adopting either technology would be perceived as an upgrade among students and faculty. Faculty pilot testers met several times during the semester to discuss their experiences with each system; feedback was sent to each vendor for their comments, suggestions, and solutions.

During the stage three demonstrations, feedback from attendees focused on the inability for iClicker to integrate with Blackboard at that time and the substantial differences between the two systems in terms of entering numeric values (i.e., Turning Technologies has numeric buttons, while iClicker requires the use of a directional key pad to scroll through numeric characters). Feedback indicated that attendees perceived Turning Technologies’ clickers to be much more efficient for submitting numeric responses. Feedback regarding other functionalities indicated relative equality between both systems.

Recommendation

Based on the findings of this pilot, Instructional Services recommends that NDSU IT adopt Turning Technologies as the replacement for the existing PRS system. While both pilot-tested systems are viable solutions, Turning Technologies appears to meet the needs of a larger user base. Additionally, the support offered by Turning Technologies was more timely and effective throughout the pilot. With the limited resources of IT, vendor support is critical and was a major reason for exploring alternative student response technologies.

From Instructional Services’ standpoint, standardizing to one solution is imperative for two major reasons: cost efficiency for students (i.e., preventing students from having to purchase duplicate technologies) and efficient utilization of IT resources (i.e., support and training). It is important to note that this recommendation is based on the opinion of the Instructional Services staff and the majority of pilot testers, but is not based on consensus among all participating faculty and staff. It is possible that individual faculty members may elect to use other options that best meet their individual teaching needs, including (but not limited to) iClicker. As an IT organization, we continue to support technology that serves faculty, student and staff needs across various colleges, disciplines, and courses. We feel that this pilot was effective in determining the student response technology—Turning Technologies—that will best serve NDSU faculty, students and staff for the foreseeable future.

Once a final decision concerning standardization is made, contract negotiations should begin in earnest with the goal of completion by January 1, 2012, in order to accommodate those wishing to use clickers during the spring session.

Appendix A: Clicker Comparisons
Turning Technologies and iClicker

 

Areas where both products have comparable functionality:

  • Setting up the receiver and software
  • Student registration of clickers
  • Software interface floats above other software
    • Can use with anything – PowerPoint, websites, Word, etc.
  • Asking questions on the fly
  • Can create question/answer files
  • Managing scores and data
    • Allow participation points, points for correct answer, change correct answer
    • Reporting – summary and detailed
  • Uploading scores and data to Blackboard (but there was a big delay with the iClicker product)
  • Durability of the receivers and clickers
  • Free software
  • Offer a mobile web device product to go "clickerless"

Areas where the products differ:

Main Shortcomings of Turning Technologies Product:

  • Costs $5 more – no workaround.
  • Doesn't have an instructor readout window on the receiver base –
    • This is a handy function in iClicker that lets the instructor see the percentages of votes as they come in, allowing the instructor to plan how he/she will proceed.
    • Workaround: As the time winds down to answer the question, the question and answers are displayed on the screen. Intermittently, the instructor would push a button to mute the projector, push a button to view the graph results quickly, then push a button to hide the graph and push a button to unmute the projector. In summary, push four buttons quickly each time you want to see the feedback, and the students will see a black screen momentarily.
  • Processing multiple sessions when uploading grades –
    • Turning Technologies uses its own file structure types, while iClicker uses comma-separated-value text files, which work easily with Excel.
    • Workaround: When uploading grades into Blackboard, upload them one session at a time, and use a calculated total column in Bb to combine them. Ideally, instructors would upload the grades daily or weekly to avoid a backlog of sessions. (A sketch of totaling session files offline follows this list.)
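Where per-session results are available as CSV exports (as iClicker provides), they could alternatively be totaled offline before a single upload. The sketch below assumes each session file has "student_id" and "points" columns, which may not match the actual export layout.

```python
# Sketch: total clicker points across several per-session CSV exports.
# Assumes each file has "student_id" and "points" columns; the real
# export layout may differ.
import csv
import glob
from collections import defaultdict

totals = defaultdict(float)
for path in glob.glob("session_*.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["student_id"]] += float(row["points"])

with open("combined_totals.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["student_id", "total_points"])
    for student_id, total in sorted(totals.items()):
        writer.writerow([student_id, total])
```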

 

Main Shortcomings of iClicker Product:

  • Entering numeric answers –
    • Questions that use numeric answers are widely used in math and the sciences. Instead of choosing a multiple-choice answer, students solve the problem and enter the actual numeric answer, which can include numbers and symbols.
    • Workaround: Students push the mode button and use the directional pad to scroll up and down through a list of numbers, letters and symbols to choose each character individually from left to right. Then they must submit the answer.
  • Number of multiple-choice answers –
    • iClicker has 5 buttons on the transmitter for direct answer choices, while Turning Technologies has 10.
    • Workaround: Similar to the numeric answer workaround. Once again the simpler transmitter becomes complex for the students.
  • Potential vendor support problems –
    • It took iClicker over 3 months to get their grade upload interface working with NDSU's Blackboard system. The Turning Technologies interface worked right away. No workaround.
