Searching for "mobile phones"

LMS and embedded librarianship

Tumbleson, B. E., & Burke, J. J. (2013). Embedding librarianship in learning management systems: A how-to-do-it manual for librarians. Neal-Schuman, an imprint of the American Library Association.

Embedding librarianship in learning management systems:

https://scsu.mplus.mnpals.net/vufind/Record/007650037

see also:

Kvenild, C., & Calkins, K. (2011). Embedded librarians: Moving beyond one-shot instruction. ACRL. Retrieved from http://www.alastore.ala.org/detail.aspx?ID=3413

p. 20 Embedding Academic and Research Libraries in the Curriculum: 2014-nmc-horizon-report-library-EN

xi. the authors are convinced that LMS embedded librarianship is becoming the primary and most productive method for connecting with college and university students, who are increasingly mobile.

xii. reference librarians engage the individual, listen, discover what is wanted, and seek to point the stakeholder in profitable directions.
Instruction librarians, in contrast, step into the classroom and attempt to lead a group of students in new ways of searching for wanted information.
Sometimes the instruction librarian even designs curriculum and teaches his or her own credit course to guide information seekers in the ways of finding, evaluating, and using information published in various formats.
Librarians also work in systems, emerging technologies, and digital initiatives in order to provide infrastructure or improve access to collections and services for end users through the library website, discovery layers, etc. Although these arenas seemingly differ, librarians work as one.

xiii. working as an LMS embedded librarian is a proactive approach to library instruction that uses available technologies and enables a 24/7 presence.

1. Embeddedness involves more than just gaining perspective. It also allows the outsider to become part of the group through shared learning experiences and goals. 3. Embedded librarianship in the LMS is all about being as close as possible to where students are receiving their assignments and gaining instruction and advice from faculty members. p. 6 When embedded librarians provide ready access to scholarly electronic collections, research databases, and Web 2.0 tools and tutorials, the research experience becomes less frustrating and more focused for students. Undergraduates associate this familiar online environment with the academic world.

p. 7 describes embedding a reference librarian, which LRS reference librarians do, as a “partnership with the professor.” However, there is room for “Research Consultations” (p. 8). While “One-Shot Library Instruction Sessions” and “Information Literacy Credit Courses” are addressed (pp. 8-9), the content of these sessions remains in the old-fashioned lecturing style of delivering information.

p. 10-11. The manuscript points out clearly the weaknesses of using a library Web site. The authors fail to see that the efforts of academic librarians must go beyond the Web page and seek to ease information access by integrating the power of social media with the static information residing on the library Web page.

p. 12 what becomes disturbingly clear is that faculty focus on the mechanics of the research paper over the research process. Although students are using libraries, 70% avoid librarians. The authors urge academic librarians to “take an active role and initiate the dialogue with faculty to close a divide that may be growing between them and faculty and between them and students.”
Four research contexts with which undergraduates struggle: big picture, language, situational context, and information gathering.

p. 15 ACRL Standards One and Three: librarians might engage students who rely on their smartphones, while keeping in mind that “[s]tudents who retrieve information on their smartphones may also have trouble understanding or evaluating how the information on their phone is ‘produced, organized, and disseminated’ (Standard One).”
Standard One by its definition seems obsolete. If information is formatted for desktops, it will be confusing on smartphones. And by that, I do not mean adjusting the screen size, but changing the information delivery from old-fashioned lecturing to more constructivist forms, e.g. http://web.stcloudstate.edu/pmiltenoff/bi/

p. 15 As for Standard Two, which deals with effective search strategies, the LMS embedded librarian must go beyond Boolean operators and controlled vocabulary, since emerging technologies incorporate new means of searching. As unsuccessfully explained at LRS for about two years now: hashtag searches, LinkedIn groups, QR codes, voice recognition, etc.

p. 16. Standard Five. ethical and legal use of information.

p. 23 Pearson announced OpenClass in 2011 to compete with BB, Moodle, Angel, D2L, WebCT, Sakai, and others
p. 24 common features: content, email, discussion board, synchronous chat, and conferencing tools (Wimba and Elluminate for BB)

p. 31 information and resources which librarians could share via LMS
– post links to databases and other resources within the course: the library Web site, LibGuides, or other subject-related course guides
– information on research concepts can be placed in a similar fashion: brief explanations of key information literacy topics (e.g. the difference between scholarly and popular periodical articles, choosing or narrowing research topics, avoiding plagiarism, citing sources properly within the required citation style, understanding the merits of different types of sources (articles, books, websites, etc.))
– pertinent advice to students on approaching the assignment and where to find needed information
– tutorials on using databases or planning searches: step-by-step screencasts navigating a search in a database, canned video searches of the library catalog, or a video tour of the library

p. 33 embedded librarian being copied on the blanket emails from instructor to students.
the librarian monitors the discussion board

p. 35 examples: students place specific questions on the discussion board and are assured the librarian will reply by a certain time
instead of F2F instruction, create a D2L module, which can be placed in any course: videos, docs, links to databases, links to citation tools, etc., plus a quiz, which faculty can use to assess the students

p. 36 a discussion forum run by the embedded librarian for the students; faculty are encouraged to monitor it and provide content- or assignment-specific input
video tutorials and searching tips
contact information (email, phone, active IM chat) and information on the library’s open hours

p. 37 questions to consider
what is the status of the embedded librarian: T2, grad assistant

p. 41 pilot program: a small-scale trial run to discover and correct potential problems before a full rollout
one or two faculty members, or faculty from a single department
the pilot at Valdosta State U was a drop-in informational session with the hope of serving the information literacy needs of distance and online students, whereas at George Washington U a librarian contacted a distance education faculty member to request embedding in his upcoming online Master’s course
p. 43 when librarians sense that current public services are not being fully utilized, it may signal that a new approach is needed.
pilots permit tinkering. they are all about risk-taking to enhance delivery

p. 57 marketing LMS embedded librarianship

[librarians must promote] library collections, services, and facilities, because faculty may be uncertain how the service benefits their classroom teaching and learning outcomes.
my note on “it is incumbent upon librarians to promote this new mode of information literacy instruction”:
it is so passé. In times when digital humanities is discussed and faculty across campus delve into digital humanities, which de facto absorbs digital literacy, it is shortsighted for academic librarians to still limit themselves to “information literacy,” considering that lip service is paid to librarians being the leaders in the digital humanities movement. If academic librarians want to market themselves, they have to think broadly and start with topics which ARE of interest to the campus faculty (digital humanities included) and then “push” their agenda (information literacy). One of the reasons academic libraries are sinking into oblivion is that they are stuck in 1990-ish practices (information literacy) and miss the “hip” trends which are of interest to faculty and students. The authors (also paying lip service to 21st-century necessities) remain imprisoned by archaic content. In times when multi (meta) literacies are discussed as the goal for library instruction, they push for more arduous marketing of limited content. Indeed, marketing is needed, but the best marketing is delivering modern and user-sought content.
the stigma of “academic librarians keep doing what they know well, just do it better”: lip service is paid to change and life-long learning, but the truth is that the commitment to “information literacy,” versus the necessity to provide multi (meta) literacies instruction (Reframing Information Literacy as a metaliteracy), is minimizing the entire idea of academic librarians reinventing themselves in the 21st century.
Here is more: NRNT-New Roles for New Times

p. 58 According to the Burke and Tumbleson national LMS embedded librarianship survey, 280 participants yielded the following data regarding embedded librarianship:

  • traditional F2F LMS courses – 69%
  • online courses – 70%
  • hybrid courses – 54%
  • undergraduate LMS courses – 61%
  • graduate LMS courses – 42%

of those respondents in 2011, 18% had the initiative started four or more years earlier, which places the program start in 2007. Thus, SCSU is almost a decade behind.

p. 58 promotional methods:

  • word of mouth
  • personal invitation by librarians
  • email by librarians
  • library brochures
  • library blogs

four years later, the LRS reference librarians’ report https://magic.piktochart.com/output/5704744-libsmart-stats-1415 has no mention of online courses, let alone embedded librarianship

my note:
a library blog was offered numerous times to the LRS librarians and, consequently, to the LRS dean, but it was brushed away, as were the proposals for a modern institutional social media approach (social media at LRS does not favor proficiency in social media but rather sees social media as a learning ground for novices, as per the 11:45 AM visit to the LRS social media meeting of May 6, 2015). The advantages of a blog over a static HTML page were explained at length, but it was visible that the advantages were not understood, just as the difference between Web 2.0 tools (such as social media) and Web 1.0 tools (such as static web pages) is not understood. The consensus among LRS staff and faculty is to keep projecting Web 1.0 ideas onto Web 2.0 tools (e.g. using Facebook as a replacement for Adobe Dreamweaver: instead of learning how to create static HTML pages to broadcast static information, use Facebook for quick-and-dirty announcement of static information). It is flabbergasting to have the offer of a blog to replace Web 1.0 rejected at a time when the corporate world promotes live-streaming (http://www.socialmediaexaminer.com/live-streaming-video-for-business/) as a way to promote services (academic librarians can deliver their content live)

p. 59 Marketing 2.0 in the information age is consumer-oriented. Marketing 3.0 in the values-driven era touches the human spirit (Kotler, Kartajaya, and Setiawan 2010, 6).
The four Ps: products and services, place, price, and promotion. Libraries should consider two more Ps: positioning and politics.

Mathews (2009): library advertising should focus on the lifestyle of students. Academic library advertising to students today needs to be “tangible, experiential, relatable, measurable, sharable and surprising.” Leboff (2011, p. 400) agrees with Mathews: the battle in the marketplace is no longer for transactions, it is for attention. Formerly: billboards, magazines, newspapers, radio, TV, direct calls. Today: emphasize conversation, authenticity, values, establishing credibility, and demonstrating expertise and knowledge by supplying good content, to enhance reputation (Leboff, 2011, 134). Translated for the embedded librarian: Google only goes so far; students want answers to their personal research dilemmas and questions. Being a credentialed information specialist with years of experience is no longer enough to win over an admiring following. The embedded librarian must be seen as open and honest in his interactions with students.
p. 60 becoming attractive to end users is the essential message in advertising LMS embedded librarianship. That attractiveness relies upon two elements: being noticed and imparting values (Leboff, 2011, 99)

p. 61 connecting with faculty

p. 62 reaching students

  • attending a synchronous chat session
  • watching a digital tutorial
  • posting a question in a discussion board
  • using an instant messaging widget

be careful not to overload students with too much information. don’t make contact too frequently, lest you be perceived as an annoyance or an intruder.

p. 65. contemporary publicity and advertising incorporate storytelling. testimonials differ from stories

p. 66 no-cost marketing. social media

low-cost marketing – print materials, fliers, bookmarks, posters, floor plans, newsletters, giveaways (pens, magnets, USB drives), events (orientations, workshops, contests, film viewings), campus media, digital media (library web page, blogs, podcasts, social networking sites)

p. 69 Instructional Content and Instructional Design
p. 70 ADDIE Model


Analysis: the requirements for the given course and its assignments.
ask instructors about their expectations of students vis-à-vis research or information literacy activities
what students already know about the library related to their assignments
which are the essential resources for this course
is this a hybrid or online course, and what are the options for the librarian to interact with the students
due date for the research assignment; what is the timeline for completing the assignment
when research tips or any other librarian help can be inserted

copy of the syllabus or any other assignment document

p. 72 discuss the course with the faculty member. Analyze the instructional needs of the course. Analyze students’ needs. Create a list of goals, e.g.: how to find, navigate, and use the PsycINFO database; how to create citations in APA format; be able to identify scholarly sources and differentiate them from popular sources; know other subject-related databases to search; be able to create a bibliography and use in-text citations in APA format

p. 74 Design (ADDIE)
the embedded component is a course within a course. Add pre-developed IL components to the broader content of the course: multiple means of contact information for the librarians and/or other library staff; links to databases; a link to citation guidance and/or a tutorial on APA citations; information on how to distinguish scholarly and popular sources; information and guidance on bibliographic and in-text citations in APA, either through a link, content written within the course, a tutorial, or a combination; a forum or discussion board topic to take questions; an F2F library instruction session with students
p. 76 decide which resources to focus on and which skills to teach and reinforce. focus on key resources

p. 77 Development (ADDIE)
-building content; the “landing” page at LRS is the subject guides page. resources integrated into the assignment pages. video tutorials and screencasts

-finding existing content; google search of e.g.: “library handout narrowing topic” or “library quiz evaluating sources,” “avoiding plagiarism,” scholarly vs popular periodicals etc

-writing narrative content. p. 85

p. 87 Evaluation (ADDIE)

formative: to change what the embedded librarian offers in order to improve his or her services to students for the remainder of the course
summative at the end of the course:

p. 89  Online, F2F and Hybrid Courses

p. 97 assessing the impact of the embedded librarian.
what is the purpose of the assessment; who is the audience; what will it focus on; what resources are available
p. 98 surveys of faculty; of students; analysis of student research assignments; focus groups of students and faculty

p. 100 assessment methods: p. 103/4 survey template
https://www.ets.org/iskills/about
https://www.projectsails.org/ (paid)
http://www.trails-9.org/
http://www.library.ualberta.ca/augustana/infolit/wassail/
p. 106 gathering LMS stats. Usability testing
examples: p. 108-9, U of FL: pre-survey and post-survey of students’ perceptions of library skills, discussion forum analysis, and an interview with the instructor

p. 122 create an LMS module for reuse (standardized template)
p. 123 subject and course LibGuides, digital tutorials, PPTs,
research mind maps, charts, logs, or rubrics
http://creately.com/blog/wp-content/uploads/2012/12/Research-Proposal-mind-map-example.png
http://www.library.arizona.edu/help/tutorials/mindMap/sample.php  (excellent)
or paper-based if needed: Concept Map Worksheet
Productivity Tools for Graduate Students: MindMapping http://libguides.gatech.edu/c.php

rubrics:
http://www.cornellcollege.edu/LIBRARY/faculty/focusing-on-assignments/tools-for-assessment/research-paper-rubric.shtml
http://gvsu.edu/library/instruction/research-guidance-rubric-for-assignment-design-4.htm
Creating Effective Information Literacy Assignments http://www.lib.jmu.edu/instruction/assignments.aspx

course handouts
guides on research concepts http://library.olivet.edu/subject-guides/english/college-writing-ii/research-concepts/
http://louisville.libguides.com/c.php
Popular versus scholar http://www.library.arizona.edu/help/tutorials/scholarly/guide.html

list of frequently asked q/s:
blog posts
banks of reference q/s

p. 124. Resistance or Receptivity

p. 133 getting admin access to LMS for the librarians.

p. 136 mobile students, dominance of born-digital resources

———————-

Summey, T., & Valenti, S. (2013). But we don’t have an instructional designer: Designing online library instruction using ISD techniques. Journal of Library & Information Services in Distance Learning. Available from Scopus®, Ipswich, MA. Accessed May 11, 2015.
http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dedselc%26AN%3dedselc.2-52.0-84869866367%26site%3deds-live%26scope%3dsite

instructional designer library instruction using ISD techniques

Shank, J. (2006). The blended librarian: A job announcement analysis of the newly emerging position of instructional design librarian. College And Research Libraries, 67(6), 515-524.
http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dedselc%26AN%3dedselc.2-52.0-33845291135%26site%3deds-live%26scope%3dsite

The Blended Librarian_ A Job Announcement Analysis of the Newly Emerging Position of Instructional Design Librarian

Macklin, A. (2003). Theory into practice: Applying David Jonassen’s work in instructional design to instruction programs in academic libraries. College And Research Libraries, 64(6), 494-500.
http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dedselc%26AN%3dedselc.2-52.0-7044266019%26site%3deds-live%26scope%3dsite

Theory into Practice_ Applying David Jonassen_s Work in Instructional Design to Instruction Programs in Academic Libraries

Walster, D. (1995). Using Instructional Design Theories in Library and Information Science Education. Journal of Education for Library and Information Science, (3). 239.
http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dedsjsr%26AN%3dedsjsr.10.2307.40323743%26site%3deds-live%26scope%3dsite

Using Instructional Design Theories in Library and Information Science Education

Mackey, T. P., & Jacobson, T. E. (2011). Reframing information literacy as a metaliteracy. College and Research Libraries, 72(1), 62-78.
http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dedselc%26AN%3dedselc.2-52.0-79955018169%26site%3deds-live%26scope%3dsite

Reframing Information Literacy as a metaliteracy

Nichols, J. (2009). The 3 directions: Situated information literacy. College And Research Libraries, 70(6), 515-530.
http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dedselc%26AN%3dedselc.2-52.0-73949087581%26site%3deds-live%26scope%3dsite

The 3 Directions_ Situated literacy

 

—————

Journal of Library & Information Services in Distance Learning (J Libr Inform Serv Dist Learn)

https://www.researchgate.net/journal/1533-290X_Journal_of_Library_Information_Services_in_Distance_Learning

http://conference.acrl.org/

http://www.loex.org/conferences.php

http://www.ala.org/lita/about/igs/distance/lit-igdl

————

https://magic.piktochart.com/output/5704744-libsmart-stats-1415

BYOD

5 Essential Insights About Mobile Learning

http://ww2.kqed.org/mindshift/2014/07/15/5-essential-insights-about-mobile-learning/

1. Set goals and expectations for teaching and learning with mobile devices before worrying about the device itself.

St. Vrain Valley School District in Colorado,

Mooresville Graded School District

Consolidated High School District 230

2. Develop a strong community of support for the initiative early and keep up transparent communication with parents and community members throughout the process.

Forsyth County Schools in Georgia.

3. Think about equity, but don’t let it stop forward motion.

includes both urban and rural areas,

4. Evaluate the effectiveness of a mobile learning initiative based on the goals set at the beginning of the rollout.

5. Some of the biggest lessons learned include giving up control and trusting students.

included students in the discussions

STAY NIMBLE

While these mobile learning pioneers have seen some of the pitfalls and can help districts new to the game avoid the same stumbles, this space is changing quickly and every community’s needs will be different.

“It’s no longer just something you implement; it’s evolving and it’s unique in each location,” Bjerede said. “If you try to be cookie cutter about it you won’t meet the needs of every kid in every classroom.”

The technology will change, students will surprise their teachers and the best advice to district leaders is to stay open to all the possibilities and allow students to take control of the tremendous learning opportunity that having a device at all times could offer them.

=====================================

My note: Katrina Schwartz offers an opinion which reflects the second wave (withdrawal) in the 3 steps of innovation

The Struggles and Realities of Student-Driven Learning and BYOD

http://ww2.kqed.org/mindshift/2014/07/07/the-struggles-and-realities-of-student-driven-learning-and-byod/

A 2013 Pew study revealed that only 35 percent of teachers at the lowest income schools allow their students to look up information on their mobile devices, as compared to 52 percent of teachers at wealthier schools.

Many advocates of using mobile technologies say the often cited issues of student distraction are just excuses not to try something new.
“The way you discourage it is engage them in the activity so they don’t even think of sending a text. You’ve got to jump in and play their game or you’re going to lose them.”

Angela Crawford has heard all the arguments of BYOD evangelists, but doesn’t see how they match the reality of her classroom. “BYOD is very problematic in many schools, mine included, because we have a prominent engagement problem,” Crawford said.

Tactics to improve engagement like making work relevant to her students’ lives or letting them use their phones in class to look up information, haven’t worked for Crawford, although she’s tried.

When she first started, Crawford was enthusiastic about jumping into collaborative, project-based learning. “I thought my colleagues were monsters because of how they were teaching,” she said of a school where she previously worked and where teachers lectured all the time. She tried to teach students through projects, but found it was a disaster. To her students’ parents, her efforts to make the classroom “student-centered” looked like she wasn’t teaching. “There is a different perception of what a teacher should be in different cultures,” Crawford said. “And in the African-American community in the South the teacher is supposed to do direct instruction.”

“What works best for each student is really the heart of student-centered learning,” Crawford said. “Sometimes what the student needs best is direct instruction. They need that authoritative, in-control figure who is directing their learning and will get them where they need to go.” Many of Crawford’s students come from homes run by single mothers who rule with an iron hand. She tries to replicate that attitude and presence. “They respond to that; they like it,” Crawford said. “It’s comforting to them.”

Still, Crawford will not be experimenting with a bring-your-own-device program. “My problem with education innovation is we tend to want to take a new technology or a new idea and go forth with it as if it’s the silver bullet,” Crawford said. “What happens is that teachers who teach in my type of environment realize this would be a disaster in my classroom.”

Crawford is skeptical that kids in higher income areas aren’t misusing technology too. Her children attend school in a more affluent district and they tell her that kids are constantly messing around on their devices. They just switch screens when a teacher comes by. They get away with it because their teachers trust them to do their work.

“I think kids in middle class or upper middle class schools are equally distracted as low-income students,” said Bob Lenz, director of innovation at Envision Schools, a small charter network that’s part of the deeper learning movement. “It’s just that because of the privilege of their background the content and the skills that they need to gain in school — they’re coming with a lot of those skills already– so it’s not as urgently needed.”

Where the iPad should go next: Look toward Windows 10

What Microsoft is getting right with tablets–seamless synching between devices, more computing power, and accessories–and why Apple should go there too.

“The iPad is nearly 5 years old. That product, ever since, has continued to ride a thin dividing line between iPhones and Macs: mobile, and computers.

…Will there be both a 12-inch iPad and a 12-inch MacBook Air in 2015? If so, how will they co-exist? Could they be meant for different customers?

…the iPad needs a change. It needs something to ignite interest. It needs a few new ideas.

Microsoft — with its hardware, and with its upcoming Windows 10 operating system — is actually blazing a bold trail. One that Apple may actually be able to learn from.”

http://www.cnet.com/news/where-the-ipad-should-go-next-look-toward-windows-10/

Tech In 2015 and flops in 2014

What To Look Out For In Tech In 2015

 http://www.businessinsider.com/what-to-look-out-for-in-tech-in-2015-2014-12#ixzz3O0Jpgipy

Venmo, the peer-to-peer payments app, will offer a solution for in-store merchants.

By year-end 2015, more people will have used a smartphone to unlock their doors than will have used a mobile wallet. 

The Amazon Echo will succeed

YouTube will get a ‘social’ make-over

The Top Technology Failures of 2014

http://www.technologyreview.com/news/533546/the-top-technology-failures-of-2014/

Google Glass

(See “Google Glass Is Dead; Long Live Smart Glasses.”)

Brazil’s EEG Exoskeleton

(See “World Cup Mind-Control Demo Faces Deadlines, Critics.”)

Bitcoin

(See “Marginally Useful.”)

STAP Cells

(See coverage by the Los Angeles Times and by Nature.)

Sapphire iPhone Screens

(See “Why Apple Failed to Make Sapphire iPhones.”)

Aereo’s Tiny Antennas

disruptive technologies: from swarming to mesh networking

How Hong Kong Protesters Are Connecting, Without Cell Or Wi-Fi Networks

http://www.npr.org/blogs/alltechconsidered/2014/09/29/352476454/how-hong-kong-protesters-are-connecting-without-cell-or-wi-fi-networks

messaging one another through a network that doesn’t require cell towers or Wi-Fi nodes. They’re using an app called FireChat that launched in March and is underpinned by mesh networking, which lets phones unite to form a temporary Internet.
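The flooding idea behind a mesh app like FireChat can be illustrated with a toy sketch: each phone relays a message only to the peers it can reach directly (e.g. via Bluetooth), yet the message can traverse the whole mesh without any cell tower or Wi-Fi node. This is a minimal conceptual illustration, not FireChat's actual protocol; the topology and node names are invented.

```python
# Toy flooding simulation: phones relay a message hop by hop
# through whatever direct peer-to-peer links exist.
from collections import deque

def flood(adjacency, source):
    """Return the set of phones a message reaches by peer-to-peer relay."""
    reached = {source}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbor in adjacency.get(node, []):
            if neighbor not in reached:  # relay only to phones that haven't seen it yet
                reached.add(neighbor)
                queue.append(neighbor)
    return reached

# Five phones, each in direct range of its immediate neighbors only:
mesh = {
    "A": ["B"], "B": ["A", "C"], "C": ["B", "D"],
    "D": ["C", "E"], "E": ["D"],
}
print(flood(mesh, "A"))  # every phone is reachable via multi-hop relay
```

The same multi-hop relay is what lets a crowd form a "temporary Internet": as long as the graph of direct links is connected, no central infrastructure is needed.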

My note: it seems that civil disobedience provides excellent innovations in using technology; examples are:

  1. the 1999 World Trade Organization Protests in Seattle, where the “swarming” idea was implemented and later transformed by Bryan Alexander into “swarming for education” (http://www.educause.edu/ero/article/going-nomadic-mobile-learning-higher-education)  and depicted on this blog in September 2013
    https://blog.stcloudstate.edu/ims/tag/bryan-alexander/
    to be continued by Britt in Learning Swarms? (http://bwatwood.edublogs.org/2010/08/05/learning-swarms/) and by Howard Rheingold in his 2004 interview with Bryan Alexander (http://www.thefeaturearchives.com/topic/Culture/M-Learning_4_Generation_Txt_.html), which Howard calls “moblogging” and which is lately finally becoming popular (at least in K12, if not in higher ed) as “backchanneling.”
  2. In a very similar scenario to the 1999 Seattle unrest, people in Venezuela (#venezuelalibre – Zello) and Ukraine (Ukrainian roots shine through at WhatsApp) are turning to mobile apps to organize themselves and defy governments’ blocking of traditional social media (Protesters in Venezuela, Ukraine turn to peer-to – CNN.com). Ideas for using Zello and WhatsApp in education poured in: WhatsApp for education?, How to use WhatsApp Chat Messenger for Education

Mesh networking is still only an IT term. Internet and database searches return nothing on mesh networking as a tool for education and/or civil disobedience. Will it be the continuation of moblogging, backchanneling, and swarming?

related IMS blog post: https://blog.stcloudstate.edu/ims/2014/09/19/mobile-elearning/

FireChat

12 Embarrassing Gadgets And Apps You Should Stop Using

12 Embarrassing Gadgets And Apps You Should Stop Using

Read more: http://www.businessinsider.com/embarrassing-gadgets-2014-4?op=1#ixzz30I03rggb

Not sure if Google Glass will go into oblivion (but it might, considering that it ALSO tethers to a mobile device, like the vanishing BlackBerry tablet), but smartphones definitely are taking over.

 

clickers documentation

Thursday, April 11, 11AM-1PM, Miller Center B-37
and/or
http://media4.stcloudstate.edu/scsu

We invite the campus community to a presentation by four vendors of Classroom Response Systems (CRS), AKA “clickers”:

11:00-11:30AM          Poll Everywhere,              Mr. Alec Nuñez

11:30-12:00PM          iClickers,                            Mr. Jeff Howard

12:00-12:30PM          Top Hat Monocle,             Mr. Steve Popovich

12:30-1PM                  Turning Technologies,     Mr. Jordan Ferns

links to documentation from the vendors:

http://web.stcloudstate.edu/informedia/crs/ClickerSummaryReport_NDSU.docx 

 http://web.stcloudstate.edu/informedia/crs/Poll%20Everywhere.docx

http://web.stcloudstate.edu/informedia/crs/tophat1.pdf

http://web.stcloudstate.edu/informedia/crs/tophat2.pdf

http://web.stcloudstate.edu/informedia/crs/turning.pdf

Top Hat Monocle docs:

http://web.stcloudstate.edu/informedia/crs/thm/FERPA.pdf

http://web.stcloudstate.edu/informedia/crs/thm/proposal.pdf

http://web.stcloudstate.edu/informedia/crs/thm/THM_CaseStudy_Eng.pdf

http://web.stcloudstate.edu/informedia/crs/thm/thm_vsCRS.pdf

iClicker docs:

http://web.stcloudstate.edu/informedia/crs/iclicker/iclicker.pdf

http://web.stcloudstate.edu/informedia/crs/iclicker/iclicker2VPAT.pdf

http://web.stcloudstate.edu/informedia/crs/iclicker/responses.doc

 

Questions to vendor: alec@polleverywhere.com 
1. Is your system proprietary as far as the handheld device and the operating system software?

The site and the service are the property of Poll Everywhere. We do not provide handheld devices; participants use their own device, be it a smartphone, cell phone, laptop, tablet, etc.

2. Describe the scalability of your system, from small classes (20-30) to large auditorium classes (500+).

Poll Everywhere is used daily by thousands of users. Audience sizes upwards of 500 are not uncommon; we have supported events with 30,000 simultaneous participants in the past.

3. Is your system receiver/transmitter based, wi-fi based, or other?

N/A

4. What is the usual process for students to register a “CRS” (or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this service rather than through your system? If so, how?

Student participants may register by filling out a form, or student information can be uploaded via a CSV.
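As an illustration of the CSV path, here is a minimal Python sketch that serializes a class roster into CSV text for a bulk upload; the column names (`name`, `email`, `phone`) are assumptions for the example, not Poll Everywhere's documented template:

```python
import csv
import io

# Hypothetical roster rows; a real export would come from the campus SIS.
students = [
    {"name": "Jane Doe", "email": "jane.doe@example.edu", "phone": "5555550100"},
    {"name": "John Roe", "email": "john.roe@example.edu", "phone": "5555550101"},
]

def roster_to_csv(rows):
    """Serialize student records to CSV text suitable for a bulk upload."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "email", "phone"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(roster_to_csv(students))
```

In practice an instructor or LMS administrator would generate such a file from a student-information-system export rather than by hand.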

5. Once a “CRS” is purchased, can it be used for as long as the student is enrolled in classes? Could “CRS” purchases be made available through the campus bookstore? Once a student purchases a “clicker,” are they able to transfer ownership when finished with it?

N/A. Poll Everywhere sells service licenses; the length and number of students supported would be outlined in a services agreement.

6. Will your operating software integrate with other standard database formats? If so, list which ones.

Need more information to answer.

7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.

Phone support from native English speakers 8am to 8pm EST, plus email support.

8. What is your company’s history in providing this type of technology? Provide a list of higher education clients.

The company pioneered the use of this technology for audience and classroom response (http://en.wikipedia.org/wiki/Poll_Everywhere). Higher education clients include:

University of Notre Dame
South Bend, Indiana

University of North Carolina-Chapel Hill
Raleigh, North Carolina

University of Southern California
Los Angeles, California

San Diego State University
San Diego, California

Auburn University
Auburn, Alabama

King’s College London
London, United Kingdom

Raffles Institution
Singapore

Fayetteville State University
Fayetteville, North Carolina

Rutgers University
New Brunswick, New Jersey

Pepperdine University
Malibu, California

Texas A&M University
College Station, Texas

University of Illinois
Champaign, Illinois

9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)

Our Privacy Policy can be found here: http://www.polleverywhere.com/privacy-policy. We take privacy very seriously.

10. What personal data does your company collect on students, and for what purpose? Is it shared or sold to others? How is it protected?

Name, phone number, and email, for the purposes of voting and identification (graded quizzes, attendance, polls, etc.). It is never shared or sold to others.

11. Do any of your business partners collect personal information about students who use your technology?

No.

12. With what formats can test/quiz questions be imported/exported?

Import via text. Export via CSV.

13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android).

Works via standard web technology, including Safari, Chrome, Firefox, and Internet Explorer. Participant web voting is fully supported on Android and iOS devices; text-message participation is supported via both shortcode and longcode formats.

14. What are the total costs to students, including device costs and periodic or one-time operation costs?

Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.

15. Describe your costs to the institution.

Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.

16. Describe how your software integrates with PowerPoint or other presentation systems.

Downloadable slides from the website for PowerPoint on Windows, and a downloadable app for PowerPoint and Keynote integration on the Mac.

17. State your level of integration with Desire2Learn (D2L). Does the integration require a server or other additional equipment the campus must purchase?

Export results from the site via CSV for import into D2L.
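That CSV hand-off can be scripted so exported results land in a shape a D2L grade import accepts. A hedged sketch follows: the export columns (`email`, `score`) are assumptions for illustration, and the D2L headers (“Username”, “… Points Grade”, “End-of-Line Indicator”) follow the common Brightspace grade-import convention but should be checked against the campus configuration:

```python
import csv
import io

# Assumed shape of a results export: one row per student with a score.
# Column names are illustrative, not exact Poll Everywhere or D2L headers.
export_csv = """email,score
jane.doe@example.edu,8
john.roe@example.edu,6
"""

def to_d2l_rows(results_text):
    """Re-key an exported results CSV into rows a D2L grade import expects."""
    rows = []
    for rec in csv.DictReader(io.StringIO(results_text)):
        rows.append({
            "Username": rec["email"].split("@")[0],  # assumes username = email local part
            "Quiz Points Grade": rec["score"],
            "End-of-Line Indicator": "#",
        })
    return rows

for row in to_d2l_rows(export_csv):
    print(row)
```

The resulting rows would then be written back out with `csv.DictWriter` and uploaded through D2L’s grades import tool.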
18. How does your company address disability accommodation for your product?

We follow the latest web-standards best practices to make our website widely accessible to all. To make sure we live up to this, we test our website in a text-based browser called Lynx, which verifies that we’re structuring our content correctly for screen readers and other assistive technologies.

19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the maximum limit?

No.

20. Does your software provide for integrating multimedia files? If so, list the file format types supported.

Supports image formats (.PNG, .GIF, .JPG).

21. What has been your historic schedule for software releases, and what pricing mechanism do you make available to your clients for upgrading?

We ship new code daily. New features are released several times a year, as they are finished, and are made available on the website to all subscribers.

22. Describe your “CRS”(s).

Poll Everywhere is a web-based classroom response system that allows students to participate from their existing devices; no expensive hardware “clickers” are required. More information can be found at http://www.polleverywhere.com/classroom-response-system.

23. If applicable, what is the average life span of a battery in your device, and what battery type does it take?

N/A. Battery manufacturers hate us: thirty percent of their annual profits can be attributed to use in clickers (we made that up).

24. Does your system automatically save upon shutdown?

Ours is a cloud-based system; user data is stored there even when your computer is not on.

25. What is your company’s projection/vision for this technology in the near and far term?

We want to put clicker companies out of business. We think it’s ridiculous to charge students and institutions a premium for outdated technology when existing devices and standard web technology can be used instead, for less than a tenth of the price.

26. Do any of your software/apps require administrator permission to install?

No.

27. If your system is radio-frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smartphones, wireless tablets and laptops, and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?

No.

28. What impact to the wireless network does the solution have?

It depends on a variety of factors, but most university wireless networks are capable of supporting Poll Everywhere. Poll Everywhere can also make use of cell-phone carrier infrastructure through SMS and the data networks on students’ phones.

29. Can the audience response system be used spontaneously for polling?

Yes.

30. Can quiz questions and response distributions be imported and exported from and to plaintext or a portable format? (Motivated by assessment & accreditation requirements.)

Yes.

31. Is there a requirement that a portion of the course grade be based on the audience response system?

No.

Gloria Sheldon
MSU Moorhead

Fall 2011 Student Response System Pilot

Summary Report

 

NDSU has standardized on a single student response (i.e., “clicker”) system for over a decade, with the intent to provide a reliable system for students and faculty that can be effectively and efficiently supported by ITS. In April 2011, Instructional Services made the decision to explore other response options and to identify a suitable replacement for the previously used e-Instruction Personal Response System (PRS). At the time, PRS was laden with technical problems that rendered the system ineffective and unsupportable. That system also had a steep learning curve, was difficult to navigate, and was unnecessarily time-consuming to use. In fact, many universities across the U.S. experienced similar problems with PRS and have since adopted alternative systems.

A pilot to explore alternative response systems was initiated at NDSU in fall 2011. The pilot was aimed at further investigating two systems—Turning Technologies and iClicker—in realistic classroom environments. As part of this pilot program, each company agreed to supply required hardware and software at no cost to faculty or students. Each vendor also visited campus to demonstrate their product to faculty, students and staff.

An open invitation to participate in the pilot was extended to all NDSU faculty on a first-come, first-served basis. Of those who indicated interest, 12 were included as participants in this pilot.

 

Pilot Faculty Participants:

  • Angela Hodgson (Biological Sciences)
  • Ed Deckard (AES Plant Science)
  • Mary Wright (Nursing)
  • Larry Peterson (History, Philosophy & Religious Studies)
  • Ronald Degges (Statistics)
  • Julia Bowsher (Biological Sciences)
  • Sanku Mallik (Pharmaceutical Sciences)
  • Adnan Akyuz (AES School of Natural Resource Sciences)
  • Lonnie Hass (Mathematics)
  • Nancy Lilleberg (ITS/Communications)
  • Lisa Montplaisir (Biological Sciences)
  • Lioudmila Kryjevskaia (Physics)

 

Pilot Overview

The pilot included three components: 1) Vendor demonstrations, 2) in-class testing of the two systems, and 3) side-by-side faculty demonstrations of the two systems.

After exploring several systems, Instructional Services narrowed the field to two viable options—Turning Technologies and iClicker. Both of these systems met initial criteria that were assembled based on faculty input and previous usage of the existing response system. These criteria included durability, reliability, ease of use, radio frequency transmission, integration with the Blackboard LMS, cross-platform compatibility (Mac, PC), stand-alone software (i.e., no longer tied to PowerPoint or other programs), multiple answer formats (including multiple choice, true/false, and numeric), potential to migrate to mobile/Web solutions at some point in the future, and cost to students and the university.

In the first stage of the pilot, both vendors were invited to campus to demonstrate their respective technologies. These presentations took place during spring semester 2011 and were attended by faculty, staff and students. The purpose of these presentations was to introduce both systems and provide faculty, staff, and students with an opportunity to take a more hands-on look at the systems and provide their initial feedback.

In the second stage of the pilot, faculty were invited to test the technologies in their classes during fall semester 2011. Both vendors supplied the required hardware and software at no cost to faculty and students, and both provided online training to orient faculty to their respective systems. Additionally, Instructional Services staff provided follow-up support and training throughout the pilot program. Both vendors were requested to ensure system integration with Blackboard, and both indicated that they would provide the number of clickers necessary to test the systems equally across campus. Clickers from both vendors were allocated to courses of varying sizes, ranging from 9 to 400+ students, to test viability in various facilities with differing numbers of users. Participating faculty agreed to offer personal feedback and to collect feedback from students regarding experiences with the systems at the end of the pilot.

In the final stage of the pilot, Instructional Services facilitated a side-by-side demonstration led by two faculty members. Each faculty member showcased each product on a function-by-function basis so that attendees were able to easily compare and contrast the two systems. Feedback was collected from attendees.

 

Results of Pilot

In stage one, we established that both systems were viable, appeared to offer similar features and functions, and were compatible with existing IT systems at NDSU. The determination was made to include both products in a larger classroom trial.

In stage two, we discovered that both systems largely functioned as intended; however, several differences between the technologies in terms of advantages and disadvantages were discovered that influenced our final recommendation. (See Appendix A for a list of these advantages, disadvantages, and potential workarounds.) We also encountered two significant issues that altered the course of the pilot. Initially, it was intended that both systems would be tested in equal number in terms of courses and students. Unfortunately, at the time of the pilot, iClicker was not able to provide more than 675 clickers, which was far fewer than anticipated. Turning Technologies was able to provide 1,395 clickers. As a result, Turning Technologies was used by a larger number of faculty and students across campus.

At the beginning of the pilot, Blackboard integration with iClicker at NDSU was not functional. The iClicker vendor provided troubleshooting assistance immediately, but the problem was not resolved until mid-November. As a result, iClicker users had to use alternative solutions for registering clickers and uploading points to Blackboard for student viewing. Turning Technologies was functional and fully integrated with Blackboard throughout the pilot.

During the span of the pilot, additional minor issues were discovered with both systems. A faulty iClicker receiver slightly delayed the effective start date of clicker use in one course. The vendor responded by sending a new receiver; however, it was an incorrect model, so Instructional Services temporarily exchanged receivers with another member of the pilot group until a functional replacement arrived. Similarly, a Turning Technologies receiver arrived with outdated firmware; Turning Technologies support staff identified the problem and assisted in updating the firmware with an update tool located on their website. A faculty participant also discovered a flaw in the iClicker software that hides its toolbar when a laptop is disconnected from a second monitor. iClicker technical support assisted in identifying the problem and stated that it would be addressed in a future software update; a workaround mitigated the problem for the remainder of the pilot. It is important to note that these issues were not widespread and did not affect all pilot users; however, they attest to the need for timely, reliable, and effective vendor support.

Students and faculty reported positive experiences with both technologies throughout the semester. Based on feedback, users of both systems found the new technologies to be much improved over the previous PRS system, indicating that adopting either technology would be perceived as an upgrade among students and faculty. Faculty pilot testers met several times during the semester to discuss their experiences with each system; feedback was sent to each vendor for their comments, suggestions, and solutions.

During the stage three demonstrations, feedback from attendees focused on iClicker’s inability to integrate with Blackboard at that time and on the substantial differences between the two systems in entering numeric values (i.e., Turning Technologies has numeric buttons, while iClicker requires the use of a directional key pad to scroll through numeric characters). Feedback indicated that attendees perceived Turning Technologies’ clickers to be much more efficient for submitting numeric responses. Feedback regarding other functionalities indicated relative equality between the two systems.

Recommendation

Based on the findings of this pilot, Instructional Services recommends that NDSU IT adopt Turning Technologies as the replacement for the existing PRS system. While both pilot-tested systems are viable solutions, Turning Technologies appears to meet the needs of a larger user base. Additionally, the support offered by Turning Technologies was more timely and effective throughout the pilot. With the limited resources of IT, vendor support is critical and was a major reason for exploring alternative student response technologies.

From Instructional Services’ standpoint, standardizing to one solution is imperative for two major reasons: cost efficiency for students (i.e., preventing students from having to purchase duplicate technologies) and efficient utilization of IT resources (i.e., support and training). It is important to note that this recommendation is based on the opinion of the Instructional Services staff and the majority of pilot testers, but is not based on consensus among all participating faculty and staff. It is possible that individual faculty members may elect to use other options that best meet their individual teaching needs, including (but not limited to) iClicker. As an IT organization, we continue to support technology that serves faculty, student and staff needs across various colleges, disciplines, and courses. We feel that this pilot was effective in determining the student response technology—Turning Technologies—that will best serve NDSU faculty, students and staff for the foreseeable future.

Once a final decision concerning standardization is made, contract negotiations should begin in earnest with the goal of completion by January 1, 2012, in order to accommodate those wishing to use clickers during the spring session.

Appendix A: Clicker Comparisons
Turning Technologies and iClicker

 

Areas where both products have comparable functionality:

  • Setting up the receiver and software
  • Student registration of clickers
  • Software interface floats above other software
    • Can use with anything – PowerPoint, websites, Word, etc.
  • Asking questions on the fly
  • Can create question/answer files
  • Managing scores and data
    • Allows participation points, points for a correct answer, and changing the correct answer
  • Reporting – summary and detailed
  • Uploading scores and data to Blackboard (but there was a big delay with the iClicker product)
  • Durability of the receivers and clickers
  • Free software
  • Both offer a mobile web device product to go “clickerless”

Areas where the products differ:

Main Shortcomings of Turning Technologies Product:

  • Costs $5 more – no workaround.
  • No instructor readout window on the receiver base.
    • This is a handy function in iClicker that lets the instructor see the percentages of votes as they come in, allowing the instructor to plan how he/she will proceed.
    • Workaround: As the time to answer the question winds down, the question and answers are displayed on the screen. Intermittently, the instructor can mute the projector, view the graph results quickly, hide the graph, and unmute the projector. In summary: push four buttons quickly each time you want to see the feedback, and the students will see a black screen momentarily.
  • Processing multiple sessions when uploading grades.
    • Turning Technologies uses its own file structure types, while iClicker uses comma-separated-value text files, which work easily with Excel.
    • Workaround: When uploading grades into Blackboard, upload them one session at a time and use a calculated total column in Bb to combine them. Ideally, instructors would upload grades daily or weekly to avoid a backlog of sessions.
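The session-at-a-time workaround can also be handled locally: because the exports are plain comma-separated-value files, per-session points can be summed into one total before a single upload. A minimal sketch, assuming illustrative `student_id`/`points` column names rather than the vendor’s exact export headers:

```python
import csv
import io
from collections import defaultdict

# Two hypothetical session exports, one CSV per class meeting.
session1 = "student_id,points\ns001,2\ns002,1\n"
session2 = "student_id,points\ns001,1\ns003,2\n"

def total_scores(*session_texts):
    """Sum per-session clicker points into one total per student."""
    totals = defaultdict(int)
    for text in session_texts:
        for rec in csv.DictReader(io.StringIO(text)):
            totals[rec["student_id"]] += int(rec["points"])
    return dict(totals)

print(total_scores(session1, session2))  # → {'s001': 3, 's002': 1, 's003': 2}
```

The combined totals could then be uploaded to the LMS in a single pass instead of one session at a time.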

 

Main Shortcomings of iClicker Product:

  • Entering numeric answers.
    • Questions with numeric answers are widely used in math and the sciences. Instead of choosing a multiple-choice answer, students solve the problem and enter the actual numeric answer, which can include numbers and symbols.
    • Workaround: Students push the mode button and use the directional pad to scroll up and down through a list of numbers, letters and symbols, choosing each character individually from left to right. Then they must submit the answer.
  • Number of multiple-choice answers.
    • iClicker has 5 buttons on the transmitter for direct answer choices; Turning Technologies has 10.
    • Workaround: Similar to the numeric-answer workaround. Once again, the simpler transmitter becomes complex for the students.
  • Potential vendor support problems.
    • It took iClicker over 3 months to get their grade-upload interface working with NDSU’s Blackboard system; the Turning Technologies interface worked right away. No workaround.
