- Zohoorian-Fooladi, N., & Abrizah, A. A. (2014). Academic librarians and their social media presence: a story of motivations and deterrents. Information Development, 30(2), 159-171.
Librarians also believed that social media tools are suitable not only to communicate with users but also
to facilitate the interaction of librarians with each other by creating librarian groups. (p. 169)
- Collins, G., & Quan-Haase, A. (2014). Are Social Media Ubiquitous in Academic Libraries? A Longitudinal Study of Adoption and Usage Patterns. Journal of Web Librarianship, 8(1), 48-68. doi:10.1080/19322909.2014.873663
- Reynolds, L. M., Smith, S. E., & D’Silva, M. U. (2013). The Search for Elusive Social Media Data: An Evolving Librarian-Faculty Collaboration. Journal of Academic Librarianship, 39(5), 378-384. doi:10.1016/j.acalib.2013.02.007
- Chawner, B., & Oliver, G. (2013). A survey of New Zealand academic reference librarians: Current and future skills and competencies. Australian Academic & Research Libraries, 44(1), 29-39. doi:10.1080/00048623.2013.773865
- Lilburn, J. (2012). Commercial Social Media and the Erosion of the Commons: Implications for Academic Libraries. Portal: Libraries and the Academy, 12(2), 139-153.
The general consensus emerging to date is that the Web 2.0 applications now widely used in academic libraries have given librarians new tools for interacting with users, promoting services, publicizing events and teaching information literacy skills. We are, by now, well versed in the language of Web 2.0. The 2.0 tools – wikis, blogs, microblogs, social networking sites, social bookmarking sites, video or photo sharing sites, to name just a few – are said to be open, user-centered, and to increase user engagement, interaction, collaboration, and participation. Web 2.0 is said to “empower creativity, to democratize media production, and to celebrate the individual while also relishing the power of collaboration and social networks.”4 All of this is in contrast with what is now viewed as the static, less interactive, less empowering pre-Web 2.0 online environment. (p. 140)
Taking into account the social, political, economic, and ethical issues associated with Web 2.0, other scholars raise questions about the generally accepted understanding of the benefits of Web 2.0. (p. 141)
- The decision to integrate commercial social media into existing library services seems almost inevitable, if not compulsory. Yet, research that considers the short- and long-term implications of this decision remains lacking. As discussed in the sections above, where and how institutions choose to establish a social media presence is significant. It confers meaning. Likewise, the absence of a presence can also confer meaning, and future … (p. 149)
- Nicholas, D., Watkinson, A., Rowlands, I., & Jubb, M. (2011). Social Media, Academic Research and the Role of University Libraries. Journal of Academic Librarianship, 37(5), 373-375. doi:10.1016/j.acalib.2011.06.023
- Brown, K., Lastres, S., & Murray, J. (2013). Social Media Strategies and Your Library. Information Outlook, 17(2), 22-24.
Establishing an open leadership relationship with these stakeholders necessitates practicing five rules of open leadership: (1) respecting the power that your patrons and employees have in their relationship with you and others, (2) sharing content constantly to assist in building trust, (3) nurturing curiosity and humility in yourself as well as in others, (4) holding openness accountable, and (5) forgiving the failures of others and yourself. The budding relationships that will flourish as a result of applying these rules will reward each party involved.
Whether you intend it or not, your organization’s leaders are part of your audience. As a result, you must know your organization’s policies and practices (in addition to its people) if you hope to succeed with social media. My note: so, if one defines a very narrow[sided] policy, then the entire social media enterprise is….
Third, be a leader and a follower. My note: not a Web 1.0-type control freak, where content must come ONLY from you and be vetted by you!
All library staff have their own login accounts and are expected to contribute to and review …
- Dority Baker, M. L. (2013). Using Buttons to Better Manage Online Presence: How One Academic Institution Harnessed the Power of Flair. Journal of Web Librarianship, 7(3), 322-332. doi:10.1080/19322909.2013.789333
This project was a partnership between the Law College Communications Department, Law College Administration, and the Law Library, involving law faculty, staff, and librarians.
- Van Wyk, J. (2009). Engaging academia through Library 2.0 tools: a case study: Education Library, University of Pretoria.
- Paul, J., Baker, H. M., & Cochran, J. (2012). Effect of online social networking on student academic performance. Computers in Human Behavior, 28(6), 2117-2127. doi:10.1016/j.chb.2012.06.016
#SocialMedia: students place a higher value on the technologies their instructors use effectively in the classroom. The study finds a negative impact of social media usage on academic performance; rather CONSERVATIVE conclusions.
Students should be made aware of the detrimental impact of online social networking on their potential academic performance. In addition to recommending changes in social networking related behavior based on our study results, findings with regard to relationships between academic performance and factors such as academic competence, time management skills, attention span, etc., suggest the need for academic institutions and faculty to put adequate emphasis on improving the student’s ability to manage time efficiently and to develop better study strategies. This could be achieved via workshops and seminars that familiarize and train students to use new and intuitive tools such as online calendars, reminders, etc. For example, online calendars are accessible in many devices and can be set up to send a text message or email reminder of events or due dates. There are also online applications that can help students organize assignments and tasks on a day-to-day basis. Further, such workshops could be a requirement of admission to academic programs. In the light of our results on the relationship between attention span and academic performance, instructors could use mandatory policies disallowing use of phones and computers unless required for course purposes. My note: I completely disagree with this decision: it can be argued that instructors must make their content delivery more engaging and thus, electronic devices will not be used for distraction.
- Mangan, K. (2012). Social Networks for Academics Proliferate, Despite Some Doubts. Chronicle of Higher Education, 58(35), A20.
While Mendeley’s users tend to have scientific backgrounds, Zotero offers similar technical tools for researchers in other disciplines, including many in the humanities. The free system helps researchers collect, organize, share, and cite research sources.
“After six years of running Zotero, it’s not clear that there is a whole lot of social value to academic social networks,” says Sean Takats, the site’s director, who is an assistant professor of history at George Mason University. “Everyone uses Twitter, which is an easy way to pop up on other people’s radar screens without having to formally join a network.”
- Beech, M. (2014). Key Issue – How to share and discuss your research successfully online. Insights: The UKSG Journal, 27(1), 92-95. doi:10.1629/2048-7754.142
The article focuses on the dissemination of academic research over the internet and presents five tenets for engaging the audience online. It comments on targeting an audience for the research and suggests the online social networks Twitter, LinkedIn, and ResearchGate as venues. It talks about the need to relate work to the target audience and examines the use of storytelling and blogs. It mentions engaging in online discussions and talks about open access research.
The Minecraft Experience: Panel Presentation, Games for Change, NYC, April 24, 2014
Last year at G4C, Nick Fortugno threw some controversy into the conversation about Minecraft by suggesting that Minecraft was not a game but a toy. The proposed panel extends that conversation by asking: what is the Minecraft experience, can it be defined or categorised, and what, as game designers and exponents, can we take from understanding its zeitgeist and the impact it has had on the serious gaming landscape?
In 2012/13, at both GLS and G4C, many presenters made jokes about including the obligatory Minecraft slide, and for very good reasons. Minecraft is arguably a game of immense impact. It has been embraced as part of learning programs focussing on seemingly disparate areas: digital citizenship, history, coding and the maker movement. It is probably the first game brought into the classroom by teachers to leverage the out-of-school groundswell of existing player excitement. Its impact is multigenerational and perhaps more global than any game before it. The fan base and user community/ies are strong and well supported, and are an exemplar of the potential Jim Gee describes for the Big G game. This panel proposes to leverage that Big G space in the lead-up to Games for Change 2014 and to honor the voices of its players.
Minecraft has been variously described as a game, toy, sandpit, learning space, creative environment, virtual world, and game-infused service. But what really are the affordances of this blocky 16-bit program, and how can we even begin to define its value to learning? Enter the Minecraft Experience, a global crowdsourced program managed by Bron Stuckey of The Massively Minecraft Project. People engaging in Minecraft activities around the globe are being invited to describe Minecraft in all its contexts and adaptations. The categories for these experiences will emerge from the crowdsourced content as members contribute thoughts, media, resources and questions to build the Minecraft Experience evidence base.
This panel of notable speakers has been drawn together to answer provocative questions about Minecraft’s success and define its relationship to and impact on learning. The panelists have been chosen to represent play in many contexts: formal education, informal learning, self-organised learning, schools, and non-school contexts. They include game designers, educators, researchers, learners and parents who have each had a personal and professional experience of this and many other games.
Panelists take a position on the Minecraft experience and use the resources provided by members of the project to inform, support and evidence their case.
How are players, educators and researchers invited to contribute?
- project wiki to prod, poke, stimulate and support crowdsourced content and dialog
- live youth speakers on the panel
- social media and wiki activity in the lead-up using the selected hashtag #minecraftproject
- video inclusions of educators, parents, kids/youth arguments, evidence and questions
- promotion of existing youth media pieces from YouTube, etc., to support and stimulate various provocative dialogs
- livestream of the panel to global contributors with live feedback and questions.
Who could benefit from joining this project and attending the G4C 2014 panel session?
- Educators seeking to understand Minecraft’s value to learning
- Programs seeking to adapt Minecraft as part of a program of impact or change.
- Game designers seeking to build in its wake
- Anyone wanting to consider issues of fidelity, adaptation, constructionism, popular culture, and impact in gaming.
Big Data and Privacy
April 17, 2014
Big data has been generating big hype for a while. In January, the White House jumped into the fray, launching a big data and privacy review. CDT participated in all three public workshops convened in connection with the review and submitted written comments.
CDT’s Big Data and Privacy Comments
In our comments, we focused on three main areas: applying the Fair Information Practice Principles (FIPPs) to both private sector and government big data programs; exploring technical measures such as de-identification to safeguard privacy; and reforming existing privacy laws, most notably the Electronic Communications Privacy Act, to account for rapid changes in the ways that digital data is collected, stored, and used.
CDT stressed that, as entities collect more data to offer innovative products and more efficient services, they must still be guided by purpose specification, consent, security, and the other elements of the FIPPs framework.
Government and Big Data
“Strong consensus is forming that the bulk collection of phone records should end.”
The Administration says that it will end its bulk collection of telephony metadata, although the details of what will replace it remain unsettled. Meanwhile, CDT is pointing out that the laws the government has used to justify bulk collection are not limited just to phone records. Instead, they could be used to justify collection of location data, Internet browsing history, financial records, and more. CDT has been vocal in advocating the end of all forms of bulk collection, and we endorse the USA FREEDOM Act as the best legislation to do just that.
A report from the White House review is due before the end of April, but it is expected to present more questions than answers. In this complex and unsettled space, CDT will continue to work with companies and other stakeholders to develop workable approaches that will protect privacy while pursuing the benefits promised by advanced data analytics.
Check Out CDT’s New Website
CDT has launched a totally revamped website: http://www.cdt.org. It has a fresh new look and tools that should make our content more easily accessible. Thanks to our partners at iStrategy Labs for their creative and technical efforts on the new site.
Do student evaluations measure teaching effectiveness?
Mauricio Vasquez, Ph.D., Assistant Professor in MIS
Higher Education institutions use course evaluations for a variety of purposes. They factor into retention analysis for adjuncts, tenure approval or rejection for full-time professors, and even salary bonuses and raises. But are the results of course evaluations an objective measure of high-quality scholarship in the classroom?
The goal, for instance, is not to teach a particular version of history, but to teach someone how to think like a historian.
SPIEGEL: But you also said that lists can establish order. So, do both order and anarchy apply? That would make the Internet, and the lists that the search engine Google creates, perfect for you.
Eco: Yes, in the case of Google, both things do converge. Google makes a list, but the minute I look at my Google-generated list, it has already changed. These lists can be dangerous — not for old people like me, who have acquired their knowledge in another way, but for young people, for whom Google is a tragedy. Schools ought to teach the high art of how to be discriminating.
SPIEGEL: Are you saying that teachers should instruct students on the difference between good and bad? If so, how should they do that?
Eco: Education should return to the way it was in the workshops of the Renaissance. There, the masters may not necessarily have been able to explain to their students why a painting was good in theoretical terms, but they did so in more practical ways. Look, this is what your finger can look like, and this is what it has to look like. Look, this is a good mixing of colors. The same approach should be used in school when dealing with the Internet. The teacher should say: “Choose any old subject, whether it be German history or the life of ants. Search 25 different Web pages and, by comparing them, try to figure out which one has good information.” If 10 pages describe the same thing, it can be a sign that the information printed there is correct. But it can also be a sign that some sites merely copied the others’ mistakes.
SPIEGEL: You yourself are more likely to work with books, and you have a library of 30,000 volumes. It probably doesn’t work without a list or catalogue.
Eco: I’m afraid that, by now, it might actually be 50,000 books. When my secretary wanted to catalogue them, I asked her not to. My interests change constantly, and so does my library. By the way, if you constantly change your interests, your library will constantly be saying something different about you. Besides, even without a catalogue, I’m forced to remember my books. I have a hallway for literature that’s 70 meters long. I walk through it several times a day, and I feel good when I do. Culture isn’t knowing when Napoleon died. Culture means knowing how I can find out in two minutes. Of course, nowadays I can find this kind of information on the Internet in no time. But, as I said, you never know with the Internet.
Thursday, April 11, 11AM-1PM, Miller Center B-37
We invite the campus community to a presentation by four vendors of Classroom Response Systems (CRS), AKA “clickers”:
11:00-11:30AM Poll Everywhere, Mr. Alec Nuñez
11:30-12:00PM iClicker, Mr. Jeff Howard
12:00-12:30PM Top Hat Monocle, Mr. Steve Popovich
12:30-1PM Turning Technologies, Mr. Jordan Ferns
Links to documentation from the vendors:
Top Hat Monocle docs:
|Questions to vendor: firstname.lastname@example.org|
The site and the service are the property of Poll Everywhere. We do not provide handheld devices. Participants use their own device, be it a smart phone, cell phone, laptop, tablet, etc.
Poll Everywhere is used daily by thousands of users. Audience sizes upwards of 500+ are not uncommon. We’ve been used for events with 30,000 simultaneous participants in the past.
Student participants may register by filling out a form. Or, student information can be uploaded via a CSV.
N/A. Poll Everywhere sells service licenses; the length of the license and the number of students supported would be outlined in a services agreement.
Need more information to answer.
8am to 8pm EST native-English-speaking phone support and email support.
Company pioneered and invented the use of this technology for audience and classroom response. http://en.wikipedia.org/wiki/Poll_Everywhere. Higher education clients include:
University of Notre Dame
University of North Carolina-Chapel Hill
University of Southern California
San Diego State University
King’s College London
Fayetteville State University
Texas A&M University
University of Illinois
Name. Phone Number. Email. For the purposes of voting and identification (Graded quizzes, attendance, polls, etc.). It is never shared or sold to others.
Import via text. Export via CSV.
Works via standard web technology including Safari, Chrome, Firefox, and Internet Explorer. Participant web voting fully supported on Android and iOS devices. Text message participation supported via both shortcode and longcode formats.
Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.
Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.
Downloadable slides from the website for Windows PowerPoint and downloadable app for PowerPoint and Keynote integration on a Mac.
Export results from the site via CSV for import into D2L.
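Since the D2L integration described above is just a CSV hand-off, the reshaping can be scripted. The following is a minimal sketch only, assuming hypothetical column names (Participant, Score) in the vendor export and a typical D2L gradebook import layout; the actual export headers and the D2L import template would need to be verified before use.

```python
import csv

# Hypothetical file names and column names; verify against the actual results
# export and the D2L gradebook import template before relying on this.
EXPORT_FILE = "poll_results_export.csv"   # vendor results export (assumed layout)
D2L_IMPORT_FILE = "d2l_grade_import.csv"  # file to import into D2L

def convert(export_path, import_path, grade_item="Clicker Points"):
    """Reshape a results export into a D2L-style gradebook import file."""
    with open(export_path, newline="") as src, open(import_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.writer(dst)
        # D2L grade imports typically expect a Username column, one column per
        # grade item, and a trailing "End-of-Line Indicator" column.
        writer.writerow(["Username", grade_item + " Points Grade", "End-of-Line Indicator"])
        for row in reader:
            writer.writerow([row["Participant"].strip(),  # assumed export column
                             row["Score"],                # assumed export column
                             "#"])

if __name__ == "__main__":
    convert(EXPORT_FILE, D2L_IMPORT_FILE)
```

A file produced this way could then be brought in through D2L’s grade import, one poll or one class session at a time.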
We follow the latest web standards best practices to make our website widely accessible by all. To make sure we live up to this, we test our website in a text-based browser called Lynx that makes sure we’re structuring our content correctly for screen readers and other assisted technologies.
Supports image formats (.PNG, .GIF, .JPG).
We ship new code daily. New features are released several times a year depending on when we finish them. New features are released to the website for use by all subscribers.
Poll Everywhere is a web based classroom response system that allows students to participate from their existing devices. No expensive hardware “clickers” are required. More information can be found at http://www.polleverywhere.com/classroom-response-system.
N/A. Battery manufacturers hate us. Thirty percent of their annual profits can be attributed to their use in clickers (we made that up).
Ours is a “cloud-based” system. User data is stored there even when your computer is not on.
We want to put clicker companies out of business. We think it’s ridiculous to charge students and institutions a premium for outdated technology when existing devices and standard web technology can be used instead for less than a tenth of the price.
Depends on a variety of factors. Most university wireless networks are capable of supporting Poll Everywhere. Poll Everywhere can also make use of cell phone carrier infrastructure through SMS and data networks on students’ phones.
Fall 2011 Student Response System Pilot
NDSU has been standardized on a single student response (i.e., “clicker”) system for over a decade, with the intent to provide a reliable system for students and faculty that can be effectively and efficiently supported by ITS. In April 2011, Instructional Services made the decision to explore other response options and to identify a suitable replacement product for the previously used e-Instruction Personal Response System (PRS). At the time, PRS was laden with technical problems that rendered the system ineffective and unsupportable. That system also had a steep learning curve, was difficult to navigate, and was unnecessarily time-consuming to use. In fact, many universities across the U.S. experienced similar problems with PRS and have since then adopted alternative systems.
A pilot to explore alternative response systems was initiated at NDSU in fall 2011. The pilot was aimed at further investigating two systems—Turning Technologies and iClicker—in realistic classroom environments. As part of this pilot program, each company agreed to supply required hardware and software at no cost to faculty or students. Each vendor also visited campus to demonstrate their product to faculty, students and staff.
An open invitation to participate in the pilot was extended to all NDSU faculty on a first-come, first-served basis. Of those who indicated interest, 12 were included as participants in this pilot.
Pilot Faculty Participants:
- Angela Hodgson (Biological Sciences)
- Ed Deckard (AES Plant Science)
- Mary Wright (Nursing)
- Larry Peterson (History, Philosophy & Religious Studies)
- Ronald Degges (Statistics)
- Julia Bowsher (Biological Sciences)
- Sanku Mallik (Pharmaceutical Sciences)
- Adnan Akyuz (AES School of Natural Resource Sciences)
- Lonnie Hass (Mathematics)
- Nancy Lilleberg (ITS/Communications)
- Lisa Montplaisir (Biological Sciences)
- Lioudmila Kryjevskaia (Physics)
The pilot included three components: 1) Vendor demonstrations, 2) in-class testing of the two systems, and 3) side-by-side faculty demonstrations of the two systems.
After exploring several systems, Instructional Services narrowed the field down to two viable options—Turning Technologies and iClicker. Both of these systems met initial criteria that were assembled based on faculty input and previous usage of the existing response system. These criteria included durability, reliability, ease of use, radio frequency transmission, integration with Blackboard LMS, cross-platform compatibility (Mac, PC), stand-alone software (i.e., no longer tied to PowerPoint or other programs), multiple answer formats (including multiple choice, true/false, numeric), potential to migrate to mobile/Web solutions at some point in the future, and cost to students and the university.
In the first stage of the pilot, both vendors were invited to campus to demonstrate their respective technologies. These presentations took place during spring semester 2011 and were attended by faculty, staff and students. The purpose of these presentations was to introduce both systems and provide faculty, staff, and students with an opportunity to take a more hands-on look at the systems and provide their initial feedback.
In the second stage of the pilot, faculty were invited to test the technologies in their classes during fall semester 2011. Both vendors supplied required hardware and software at no cost to faculty and students, and both provided online training to orient faculty to their respective system. Additionally, Instructional Services staff provided follow-up support and training throughout the pilot program. Both vendors were requested to ensure system integration with Blackboard. Both vendors indicated that they would provide the number of clickers necessary to test the systems equally across campus. Both clickers were allocated to courses of varying sizes, ranging from 9 to 400+ students, to test viability in various facilities with differing numbers of users. Participating faculty agreed to offer personal feedback and collect feedback from students regarding experiences with the systems at the end of the pilot.
In the final stage of the pilot, Instructional Services facilitated a side-by-side demonstration led by two faculty members. Each faculty member showcased each product on a function-by-function basis so that attendees were able to easily compare and contrast the two systems. Feedback was collected from attendees.
Results of Pilot
In stage one, we established that both systems were viable and appeared to offer similar features, functions, and were compatible with existing IT systems at NDSU. The determination was made to include both products in a larger classroom trial.
In stage two, we discovered that both systems largely functioned as intended; however, several differences between the technologies in terms of advantages and disadvantages were discovered that influenced our final recommendation. (See Appendix A for a list of these advantages, disadvantages, and potential workarounds.) We also encountered two significant issues that altered the course of the pilot. Initially, it was intended that both systems would be tested in equal number in terms of courses and students. Unfortunately, at the time of the pilot, iClicker was not able to provide more than 675 clickers, which was far fewer than anticipated. Turning Technologies was able to provide 1,395 clickers. As a result, Turning Technologies was used by a larger number of faculty and students across campus.
At the beginning of the pilot, Blackboard integration with iClicker at NDSU was not functional. The iClicker vendor provided troubleshooting assistance immediately, but the problem was not resolved until mid-November. As a result, iClicker users had to use alternative solutions for registering clickers and uploading points to Blackboard for student viewing. Turning Technologies was functional and fully integrated with Blackboard throughout the pilot.
During the span of the pilot, additional minor issues were discovered with both systems. A faulty iClicker receiver slightly delayed the effective start date of clicker use in one course. The vendor responded by sending a new receiver; however, it was an incorrect model. Instructional Services temporarily exchanged receivers with another member of the pilot group until a functional replacement arrived. Similarly, a Turning Technologies receiver was received with outdated firmware. Turning Technologies support staff identified the problem and assisted in updating the firmware with an update tool located on their website. A faculty participant discovered a flaw in the iClicker software that hides the software toolbar when a laptop is disconnected from a second monitor. iClicker technical support assisted in identifying the problem and stated that it would be addressed in a future software update. A workaround was identified that mitigated this problem for the remainder of the pilot. It is important to note that these issues were not widespread and did not affect all pilot users; however, they attest to the need for timely, reliable, and effective vendor support.
Students and faculty reported positive experiences with both technologies throughout the semester. Based on feedback, users of both systems found the new technologies to be much improved over the previous PRS system, indicating that adopting either technology would be perceived as an upgrade among students and faculty. Faculty pilot testers met several times during the semester to discuss their experiences with each system; feedback was sent to each vendor for their comments, suggestions, and solutions.
During the stage three demonstrations, feedback from attendees focused on iClicker’s inability to integrate with Blackboard at that time and the substantial differences between the two systems in terms of entering numeric values (i.e., Turning Technologies has numeric buttons, while iClicker requires the use of a directional key pad to scroll through numeric characters). Feedback indicated that attendees perceived Turning Technologies’ clickers to be much more efficient for submitting numeric responses. Feedback regarding other functionalities indicated relative equality between both systems.
Based on the findings of this pilot, Instructional Services recommends that NDSU IT adopt Turning Technologies as the replacement for the existing PRS system. While both pilot-tested systems are viable solutions, Turning Technologies appears to meet the needs of a larger user base. Additionally, the support offered by Turning Technologies was more timely and effective throughout the pilot. With the limited resources of IT, vendor support is critical and was a major reason for exploring alternative student response technologies.
From Instructional Services’ standpoint, standardizing to one solution is imperative for two major reasons: cost efficiency for students (i.e., preventing students from having to purchase duplicate technologies) and efficient utilization of IT resources (i.e., support and training). It is important to note that this recommendation is based on the opinion of the Instructional Services staff and the majority of pilot testers, but is not based on consensus among all participating faculty and staff. It is possible that individual faculty members may elect to use other options that best meet their individual teaching needs, including (but not limited to) iClicker. As an IT organization, we continue to support technology that serves faculty, student and staff needs across various colleges, disciplines, and courses. We feel that this pilot was effective in determining the student response technology—Turning Technologies—that will best serve NDSU faculty, students and staff for the foreseeable future.
Once a final decision concerning standardization is made, contract negotiations should begin in earnest with the goal of completion by January 1, 2012, in order to accommodate those wishing to use clickers during the spring session.
Appendix A: Clicker Comparisons
Turning Technologies and iClicker
Areas where both products have comparable functionality:
- Setting up the receiver and software
- Student registration of clickers
- Software interface floats above other software
- Can use with anything – PowerPoint, Websites, Word, etc.
- Asking questions on the fly
- Can create questions / answers files
- Managing scores and data
- Allow participation points, points for correct answer, change correct answer
- Reporting – Summary and Detailed
- Uploading scores and data to Blackboard (but there was a big delay with the iClicker product)
- Durability of the receivers and clickers
- Free software
- Offer mobile web device product to go “clickerless”
Areas where the products differ:
Main Shortcomings of Turning Technologies Product:
- Costs $5 more – no workaround
- Doesn’t have instructor readout window on receiver base –
- This is a handy function in iClicker that lets the instructor see the %’s of votes as they come in, allowing the instructor to plan how he/she will proceed.
- Workaround: As the time winds down to answer the question, the question and answers are displayed on the screen. Intermittently, the instructor would push a button to mute the projector, push a button to view graph results quickly, then push a button to hide graph and push a button to unmute the projector. In summary, push four buttons quickly each time you want to see the feedback, and the students will see a black screen momentarily.
- Processing multiple sessions when uploading grades –
- Turning Technologies uses its own file structure types, but iClicker uses comma-separated-value text files, which work easily with Excel.
- Workaround: When uploading grades into Blackboard, upload them one session at a time, and use a calculated total column in Bb to combine them. Ideally, instructors would upload the grades daily or weekly to avoid a backlog of sessions (see the session-merging sketch after this appendix).
Main Shortcomings of iClicker Product:
- Entering numeric answers –
- Questions that use numeric answers are widely used in Math and the sciences. Instead of choosing a multiple-choice answer, students solve the problem and enter the actual numeric answer, which can include numbers and symbols.
- Workaround: Students push mode button and use directional pad to scroll up and down through a list of numbers, letters and symbols to choose each character individually from left to right. Then they must submit the answer.
- Number of multiple choice answers –
- iClicker has 5 buttons on the transmitter for direct answer choices and Turning Technologies has 10.
- Workaround: Similar to numeric answer workaround. Once again the simpler transmitter becomes complex for the students.
- Potential Vendor Support Problems –
- It took iClicker over 3 months to get their grade upload interface working with NDSU’s Blackboard system. The Turning Technologies interface worked right away. No workaround.
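One alternative to the Blackboard workaround noted in Appendix A: rather than uploading each session separately and combining them with a calculated column, the per-session CSV exports could be merged locally before import. The following is a minimal sketch only, assuming each session export is a comma-separated file with hypothetical StudentID and Points columns; the real iClicker and Turning Technologies export layouts differ and would need to be checked first.

```python
import csv
import glob
from collections import defaultdict

# Hypothetical layout: each session_*.csv export has StudentID and Points
# columns. Real clicker exports differ by vendor; adjust the names accordingly.
totals = defaultdict(float)

for path in sorted(glob.glob("session_*.csv")):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["StudentID"].strip()] += float(row["Points"] or 0)

# Write a single combined file with one total per student, ready to import
# into the LMS gradebook as a single column.
with open("clicker_totals.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["StudentID", "Total Clicker Points"])
    for student, points in sorted(totals.items()):
        writer.writerow([student, points])
```

Merging locally trades the convenience of per-session uploads for a single import, which may or may not suit an instructor’s grading workflow.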
We are pleased to inform you that your classroom response system has been chosen as a final candidate for campus-wide adoption/support at St. Cloud State University. Should you be interested in pursuing this opportunity, we invite you to respond to the attached list of questions and to prepare a brief presentation for members of the selection committee and interested faculty/staff.
The deadline for responding to the questions is 12:00 pm (CST), Tuesday, April 9. This deadline will allow us to review the responses in time for the vendor presentations on Thursday, April 11, 11AM-1PM. The presentations will be held virtually via Adobe Connect: http://media4.stcloudstate.edu/scsu. Please let us know if you need to test and familiarize yourself with the presentation platform.
The presentation should be no more than 10 minutes long, followed by 10 minutes for follow-up questions. We suggest that you focus on the highlights of your system, presuming a moderately knowledgeable audience. We may follow up via email or telephone call prior to making our final selection.
Thank you, and we look forward to hearing from you soon.
Classroom Response System Taskforce:
Dr. Anthony Hansen
Dr. Michael Rentz
Dr. Joseph Melcher
Dr. Andrew Anda
Dr. Tracy Ore
Dr. Jack McKenna
Dr. Plamen Miltenoff
|Questions to vendor|
|1. Is your system proprietary as far as the handheld device and the operating system software?|
|2. Describe the scalability of your system, from small classes (20-30) to large auditorium classes. (500+).|
|3. Is your system receiver/transmitter based, wi-fi based, or other?|
|4. What is the usual process for students to register a “CRS”(or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this service rather than through your system? If so, how?|
|5. Once a “CRS” is purchased can it be used for as long as the student is enrolled in classes? Could “CRS” purchases be made available through the campus bookstore? Once a student purchases a “clicker” are they able to transfer ownership when finished with it?|
|6. Will your operating software integrate with other standard database formats? If so, list which ones.|
|7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.|
|8. What is your company’s history in providing this type of technology? Provide a list of higher education clients.|
|9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)|
|10. What personal data does your company collect on students and for what purpose? Is it shared or sold to others? How is it protected?|
|11. Do any of your business partners collect personal information about students that use your technology?|
|12. With what formats can test/quiz questions be imported/exported?|
|13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android)?|
|14. What are the total costs to students, including device costs and periodic or one-time operation costs?|
|15. Describe your costs to the institution.|
|16. Describe how your software integrates with PowerPoint or other presentation systems.|
|17. State your level of integration with Desire2Learn (D2L). Does the integration require a server or other additional equipment the campus must purchase?|
|18. How does your company address disability accommodation for your product?|
|19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the max question limit?|
|20. Does your software provide for integrating multimedia files? If so, list the file format types supported.|
|21. What has been your historic schedule for software releases and what pricing mechanism do you make available to your clients for upgrading?|
|22. Describe your “CRS”(s).|
|23. If applicable, what is the average life span of a battery in your device and what battery type does it take?|
|24. Does your system automatically save upon shutdown?|
|25. What is your company’s projection/vision for this technology in the near and far term?|
|26. Does any of your software/apps require administrator permission to install?|
|27. If your system is radio frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smart phones, wireless tablets and laptops, and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?|
|28. What impact to the wireless network does the solution have?|
|29. Can the audience response system be used spontaneously for polling?|
|30. Can quiz questions and response distributions be imported and exported from and to plaintext or a portable format? (motivated by assessment & accreditation requirements).|
|31. Is there a requirement that a portion of the course grade be based on the audience response system?|
Plamen Miltenoff, Ph.D., MLIS
204-J James W. Miller Center
Learning Resources and Technology Services
720 Fourth Avenue South
St. Cloud, MN 56301-4498
“I am not strange, I am just not normal.” Salvador Dali