Find, vet and implement edtech – painlessly!

By Nicole Krueger 12/13/2018
https://www.iste.org/explore/articleDetail?articleid=2325

Pre-vetted tools are rated in several categories

Educators seeking new technology can start by consulting a database of pre-vetted edtech tools, rated based on alignment with both child data privacy laws and the district’s instructional vision. Each entry includes notes about what the software does, how it can be used in the classroom, and the appropriate age level. Kaye is also working on aligning the database to the ISTE Standards so teachers can see at a glance which standards each tool can help them meet.

Every app falls into one of four categories:

  • Tools the district approves, supports, pays for, and will train teachers to use.
  • Tools that are approved and can be freely used on an independent basis.
  • Tools that are approved with stipulations, such as age or parental permission requirements.
  • Tools that are not approved because they don’t align with the district’s vision or data privacy needs.
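
In a district's own tracking system, the four categories above could be modeled as a simple record type. A minimal sketch in Python (all names and fields are hypothetical, not the actual database described in the article):

```python
from dataclasses import dataclass, field
from enum import Enum

# Hypothetical schema for a district vetting database; a sketch only.
class ApprovalStatus(Enum):
    APPROVED_SUPPORTED = "approved; district pays for and trains teachers"
    APPROVED_INDEPENDENT = "approved for free, independent use"
    APPROVED_WITH_STIPULATIONS = "approved with stipulations"
    NOT_APPROVED = "not approved"

@dataclass
class EdtechTool:
    name: str
    description: str                  # what the software does
    classroom_uses: str               # how it can be used in the classroom
    age_level: str                    # appropriate age level
    status: ApprovalStatus
    stipulations: list = field(default_factory=list)    # e.g., parental permission
    iste_standards: list = field(default_factory=list)  # ISTE Standards alignment

# Example entry a teacher might look up before adopting a tool.
tool = EdtechTool(
    name="ExampleQuizApp",
    description="Formative assessment quizzes",
    classroom_uses="Exit tickets and quick comprehension checks",
    age_level="Grades 6-12",
    status=ApprovalStatus.APPROVED_WITH_STIPULATIONS,
    stipulations=["parental permission required for students under 13"],
)
print(tool.name, "->", tool.status.value)
```

A record like this makes the "at a glance" lookup the article describes possible: the status field answers whether a teacher can start right away, and the stipulations list carries any conditions.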

Teachers can request to have a tool vetted

Teachers who choose a pre-vetted app from the approved list can start using it right away, without any further action needed. Educators who have a specific tool in mind that hasn’t yet been vetted can submit a request form that asks questions such as:

  • How does the tool connect to the curriculum?
  • Will students be consumers or producers when using it?
  • How easy is it to learn and use?
  • What do you plan to do with it?

Encyclopedia of Criminal Activities and the Deep Web

Publishing Opportunity

Countries all over the world are seeing significant increases in criminal activity through the use of technological tools. Such crimes as identity theft, cyberattacks, drug trafficking, and human trafficking are conducted through the deep and dark web, while social media is utilized by murderers, sex offenders, and pedophiles to elicit information and contact their victims. As criminals continue to harness technology to their advantage, law enforcement and government officials are left to devise alternative strategies to learn more about all aspects of these modern criminal patterns and behaviors, to preserve the safety of society, and to ensure that proper justice is served. Regrettably, the lack of adequate research findings on these modern criminal activities limits the ability of law enforcement, researchers, and policymakers to devise effective strategies and programs to combat them.

In an effort to compile the most current research on this topic, a new major reference work titled Encyclopedia of Criminal Activities and the Deep Web is currently being developed. This comprehensive Encyclopedia is projected to encompass expert insights about the nature of these criminal activities, how they are conducted, and societal and technological limitations. It will also explore new methods and processes for monitoring and regulating the use of these tools, such as social media, online forums, and online ads, as well as hidden areas of the internet including the deep and dark web. Additionally, this Encyclopedia seeks to offer strategies for predicting and preventing criminals from using technology as a means to track, stalk, and lure their victims.

You are cordially invited to share your research to be featured in this Encyclopedia by submitting a chapter proposal/abstract using the link on the formal call for papers page. If your chapter proposal is accepted, guidelines for preparing your full chapter submission (which should be between 5,000 and 7,500 words in length) can be accessed at: http://www.igi-global.com/publish/contributor-resources/ (under the “For Authors” heading – “Encyclopedia Chapter Organization and Formatting”).

Recommended topics for papers include, but are not limited to:

  • Bitcoin and Crime
  • Botnets and Crime
  • Child Exploitation
  • Contract Killing
  • Criminology
  • Cryptocurrency
  • Cyber Espionage
  • Cyber Stalking
  • Cybercrime
  • Cybercriminals
  • Cybersecurity Legislation
  • Cyberterrorism
  • Fraud
  • Dark Web
  • Dark Web Vendors
  • Darknets
  • Data Privacy
  • Dating Websites and Crime
  • Deep Web
  • Drug Trafficking
  • E-Banking Fraud
  • Email Scams
  • Fraud and Internet
  • Gaming and Crime
  • Government Regulations of the Dark Web
  • Hacking and Crime
  • Hacktivism
  • Human Trafficking
  • Identity Theft
  • International Regulations of the Dark Web
  • Internet Privacy
  • Internet Regulations
  • Internet Safety & Crime
  • Online Advertisement Websites and Crime
  • Online Blackmail
  • Online Forums and Crime
  • Online Hate Crimes
  • Online Predators
  • Online Privacy
  • Social Media Deception
  • Social Networking Traps
  • Undercover Dark Web Busts
  • Undercover Operations
  • Vigilante Justice
  • Virtual Currencies & Crime
  • Whistleblowing

IMPORTANT DATES: Chapter Proposal Submission Deadline: October 15, 2018; Full Chapters Due: December 15, 2018

Note: There are no publication fees; however, contributors will be asked, as a courtesy to their fellow colleagues, to serve as a peer reviewer for this project for at least 2-3 articles. This will ensure the highest level of integrity and quality for the publication.

Should you have any questions regarding this publication, or this invitation, please do not hesitate to contact: EncyclopediaCADW@igi-global.com

Mehdi Khosrow-Pour, DBA
Editor-in-Chief
Encyclopedia of Criminal Activities and the Deep Web
EncyclopediaCADW@igi-global.com

social media addiction

Social media copies gambling methods ‘to create psychological cravings’

Methods activate the ‘same brain mechanisms as cocaine’ and lead to users experiencing ‘phantom’ notification buzzing, experts warn

https://www.theguardian.com/technology/2018/may/08/social-media-copies-gambling-methods-to-create-psychological-cravings

Social media platforms are using the same techniques as gambling firms to create psychological dependencies and ingrain their products in the lives of their users, experts warn.

Natasha Schüll, the author of Addiction by Design, reported how slot machines and other systems are designed to lock users into a cycle of addiction.

Whether it’s Snapchat streaks, Facebook photo-scrolling, or playing Candy Crush, Schüll explained, you get drawn into “ludic loops” or repeated cycles of uncertainty, anticipation and feedback — and the rewards are just enough to keep you going.

Like gambling, which physically alters the brain’s structure and makes people more susceptible to depression and anxiety, social media use has been linked to depression and its potential to have an adverse psychological impact on users cannot be overlooked or underestimated.

Tech insiders have previously said “our minds can be hijacked” and that Silicon Valley is addicting us to our phones, while some have confessed they ban their kids from using social media.

However, the number of monthly active users of Facebook hit 2.13 billion earlier this year, up 14% from a year ago. Despite the furore around its data privacy issues, the social media monolith posted record revenues for the first quarter of 2018, making $11.97bn, up 49% on last year.

++++++++++++++
more on addiction in this IMS blog
https://blog.stcloudstate.edu/ims?s=addiction

Susan Grajek at Bryan Alexander on IT and education

The forum takes a deep dive into higher education and technology. On Thursday, March 23, from 2-3 p.m. EST, we will be joined by Susan Grajek, the vice president for communities and research at EDUCAUSE.

+++++++++++++++++++

Top 10 IT Issues, 2017: Foundations for Student Success

analytics in education

ACRL e-Learning webcast series: Learning Analytics – Strategies for Optimizing Student Data on Your Campus

This three-part webinar series, co-sponsored by the ACRL Value of Academic Libraries Committee, the Student Learning and Information Committee, and the ACRL Instruction Section, will explore the advantages and opportunities of learning analytics as a tool which uses student data to demonstrate library impact and to identify learning weaknesses. How can librarians initiate learning analytics initiatives on their campuses and contribute to existing collaborations? The first webinar will provide an introduction to learning analytics and an overview of important issues. The second will focus on privacy issues and other ethical considerations as well as responsible practice, and the third will include a panel of librarians who are successfully using learning analytics on their campuses.

Webcast One: Learning Analytics and the Academic Library: The State of the Art and the Art of Connecting the Library with Campus Initiatives
March 29, 2016

Learning analytics are used nationwide to augment student success initiatives as well as bolster other institutional priorities.  As a key aspect of educational reform and institutional improvement, learning analytics are essential to defining the value of higher education, and academic librarians can be both of great service to and well served by institutional learning analytics teams.  In addition, librarians who seek to demonstrate, articulate, and grow the value of academic libraries should become more aware of how they can dovetail their efforts with institutional learning analytics projects.  However, all too often, academic librarians are not asked to be part of initial learning analytics teams on their campuses, despite the benefits of library inclusion in these efforts.  Librarians can counteract this trend by being conversant in learning analytics goals, advantages/disadvantages, and challenges as well as aware of existing examples of library successes in learning analytics projects.

Learn about the state of the art in learning analytics in higher education with an emphasis on 1) current models, 2) best practices, and 3) ethics, privacy, and other difficult issues. The webcast will also focus on current academic library projects and successes in gaining access to and inclusion in learning analytics initiatives on their campuses. Benefit from the inclusion of a “short list” of must-read resources as well as a clearly defined list of ways in which librarians can leverage their skills to be contributing members of learning analytics teams and advocates on their campuses.

my notes:

open academic analytics initiative
https://confluence.sakaiproject.org/pages/viewpage.action?pageId=75671025

where data comes from:

  • student information systems (SIS)
  • LMS
  • Publishers
  • Clickers
  • Video streaming and web conferencing
  • Surveys
  • Co-curricular and extra-curricular involvement

D2L Degree Compass
Predictive Analytics Reporting (PAR) – was open, but just bought by Hobsons (https://www.hobsons.com/)

Learning Analytics

IMS Caliper Enabled Services: a way to connect the library to campus analytics https://www.imsglobal.org/activity/caliperram
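
A Caliper event is a small JSON payload describing a learner activity, which a "sensor" in a library system could emit to the campus analytics store. A minimal sketch of the general shape (IDs, URLs, and values here are hypothetical; the IMS Caliper specification defines the authoritative vocabulary and required fields):

```python
import json

# Sketch of a Caliper-style event recording that a student viewed a
# library resource. Identifiers are made up for illustration.
event = {
    "@context": "http://purl.imsglobal.org/ctx/caliper/v1p1",
    "type": "ViewEvent",
    "actor": {"id": "https://example.edu/users/554433", "type": "Person"},
    "action": "Viewed",
    "object": {
        "id": "https://library.example.edu/catalog/record/12345",
        "type": "DigitalResource",
    },
    "eventTime": "2016-03-29T14:00:00.000Z",
}

print(json.dumps(event, indent=2))
```

Because every campus system emits the same event shape, library activity can sit alongside LMS and SIS data in one analytics pipeline, which is the "connect the library" point above.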

student’s opinion of this process
benefits: self-assessment, personal learning, empowerment
analytics and data privacy – students are OK with harvesting the data (only 6% not happy)
8 in 10 are interested in a personal dashboard, which will help them perform
Big Mother vs. Big Brother: creepy vs. helpful. Tracking for classes is helpful; tracking out of class (where on campus, social media, etc.) is creepy. 87% see having access to their data as positive

librarians:
recognize metrics, assessment, analytics, data visualization, data literacy, data science, interpretation

INSTRUCTION DEPARTMENT – N.B.

determine who is the key leader: director of institutional research, president, CIO

who does analytics services: institutional research, information technology, a dedicated center

analytics maturity: data-driven decision-making culture; senior leadership commitment; supporting policy (data collection, access, use); data efficacy; investment and resources; staffing; technical infrastructure; information technology interaction

student success maturity: senior leadership commitment; funding of student success efforts; mechanisms for making student success decisions; interdepartmental collaboration; understanding of student success goals; advising and student support ability; policies; information systems

developing learning analytics strategy

understand institutional challenges; identify stakeholders; identify inhibitors/challenges; consider tools; scan the environment and see what others have done; develop a plan; communicate the plan to stakeholders; start small and build

ways librarians can help
identify institutional partners; be the partners; hone relevant learning analytics skills; participate in institutional analytics; identify questions and problems; assess and work to improve institutional culture; volunteer to be early adopters

questions to ask: environmental scanning
do we have a learning analytics system? does our culture support it? are leaders present? what do stakeholders need to know?

questions to ask: Data

questions to ask: Library role

learning analytics & the academic library: the state of the art and the art of connecting the library with campus initiatives

questions:
poll: analytics in the library

literature

causation versus correlation studies. The speaker claims that it is difficult to establish a causation argument. Institutions try to predict as accurately as possible via correlation, rather than asserting “if you do this, then that will happen.”

++++++++++++

More on analytics in this blog:

https://blog.stcloudstate.edu/ims/?s=analytics&submit=Search

tech ed trends in 2016

What’s Hot, What’s Not in 2016

Our expert panelists weigh in on education technology to give us their verdict on which approaches to tech-enabled learning will have a major impact, which ones are stagnating and which ones might be better forgotten entirely.

By Greg Thompson 01/12/16

https://thejournal.com/articles/2016/01/12/whats-hot-whats-not-in-2016.aspx

  • Bring Your Own Device (BYOD): Lukewarm to Hot

  • Social Media for Teaching and Learning: Lukewarm to Hot

  • Digital Badges: Mostly Lukewarm

  • Open Educational Resources (OERs): Mostly Hot

  • E-Portfolios: Losing Steam

  • Learning Management Systems (LMS): Lukewarm to Hot

  • Flipped Learning: Mostly Hot (but Equitability a Question)

  • Blended Learning: Unanimously Hot

  • Student Data Privacy Concerns: Unanimously Hot

  • Apps for Learning: A Mostly Lukewarm Mixed Bag

  • Games for Learning: Hot

What are the hot devices?

Cameras like the Canon VIXIA, the Sony HDR-MV1 or the Zoom Q4 or Q8 range from $200 to $400. The secret of these small devices is a tradeoff between video flexibility and audio power. With digital-only zoom, these cameras still deliver full HD video (or better) but with limited distance capabilities. In return, the audio quality is unsurpassed by anything short of a professional boom or wireless microphone setup; most of these cameras feature high-end condenser microphone capsules that will make music or interview recordings shine.

The Chromebook is hot. Seventy-two percent of Chromebook sales were education-related purchases in 2014.

The smartphone is hot. Every day, the smartphone becomes less of a “phone” and more of a device for connecting with others via social media, researching information on the Internet, learning with apps and games and recording experiences with photos and videos.

clickers documentation

Thursday, April 11, 11AM-1PM, Miller Center B-37
and/or
http://media4.stcloudstate.edu/scsu

We invite the campus community to a presentation by four vendors of Classroom Response Systems (CRS), AKA “clickers”:

11:00-11:30 AM    Poll Everywhere         Mr. Alec Nuñez
11:30-12:00 PM    iClicker                Mr. Jeff Howard
12:00-12:30 PM    Top Hat Monocle         Mr. Steve Popovich
12:30-1:00 PM     Turning Technologies    Mr. Jordan Ferns

links to documentation from the vendors:

http://web.stcloudstate.edu/informedia/crs/ClickerSummaryReport_NDSU.docx 

 http://web.stcloudstate.edu/informedia/crs/Poll%20Everywhere.docx

http://web.stcloudstate.edu/informedia/crs/tophat1.pdf

http://web.stcloudstate.edu/informedia/crs/tophat2.pdf

http://web.stcloudstate.edu/informedia/crs/turning.pdf

Top Hat Monocle docs:

http://web.stcloudstate.edu/informedia/crs/thm/FERPA.pdf

http://web.stcloudstate.edu/informedia/crs/thm/proposal.pdf

http://web.stcloudstate.edu/informedia/crs/thm/THM_CaseStudy_Eng.pdf

http://web.stcloudstate.edu/informedia/crs/thm/thm_vsCRS.pdf

iClicker docs:
http://web.stcloudstate.edu/informedia/crs/iclicker/iclicker.pdf

http://web.stcloudstate.edu/informedia/crs/iclicker/iclicker2VPAT.pdf

http://web.stcloudstate.edu/informedia/crs/iclicker/responses.doc

 

Questions to vendor: alec@polleverywhere.com 
  1. Is your system proprietary as far as the handheld device and the operating system software?

The site and the service are the property of Poll Everywhere. We do not provide handheld devices. Participants use their own device, be it a smartphone, cell phone, laptop, tablet, etc.

  2. Describe the scalability of your system, from small classes (20-30) to large auditorium classes (500+).

Poll Everywhere is used daily by thousands of users. Audience sizes upwards of 500+ are not uncommon. We’ve been used for events with 30,000 simultaneous participants in the past.

  3. Is your system receiver/transmitter based, wi-fi based, or other?

N/A

  4. What is the usual process for students to register a “CRS” (or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this service rather than through your system? If so, how?

Student participants may register by filling out a form. Or, student information can be uploaded via a CSV.
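
The CSV path the vendor mentions might look like the following sketch (the column names are hypothetical; the vendor's actual import template may differ):

```python
import csv
import io

# Hypothetical roster file; real import formats and column names may differ.
roster_csv = """name,email,phone
Alice Johnson,alice@example.edu,555-0101
Bob Lee,bob@example.edu,555-0102
"""

# Parse the roster and enumerate the students to be registered.
students = list(csv.DictReader(io.StringIO(roster_csv)))
for s in students:
    print(f"Registering {s['name']} <{s['email']}>")
```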

  5. Once a “CRS” is purchased, can it be used for as long as the student is enrolled in classes? Could “CRS” purchases be made available through the campus bookstore? Once a student purchases a “clicker,” are they able to transfer ownership when finished with it?

N/A. Poll Everywhere sells service licenses; the length and number of students supported would be outlined in a services agreement.

  6. Will your operating software integrate with other standard database formats? If so, list which ones.

Need more information to answer.

  7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.

8 a.m. to 8 p.m. EST phone support (native English speaking) and email support.

  8. What is your company’s history in providing this type of technology? Provide a list of higher education clients.

The company pioneered and invented the use of this technology for audience and classroom response (http://en.wikipedia.org/wiki/Poll_Everywhere). Higher education clients include:

University of Notre Dame
South Bend, Indiana

University of North Carolina-Chapel Hill
Chapel Hill, North Carolina

University of Southern California
Los Angeles, California

San Diego State University
San Diego, California

Auburn University
Auburn, Alabama

King’s College London
London, United Kingdom

Raffles Institution
Singapore

Fayetteville State University
Fayetteville, North Carolina

Rutgers University
New Brunswick, New Jersey

Pepperdine University
Malibu, California

Texas A&M University
College Station, Texas

University of Illinois
Champaign, Illinois

  9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)

Our Privacy Policy can be found here: http://www.polleverywhere.com/privacy-policy. We take privacy very seriously.

  10. What personal data does your company collect on students, and for what purpose? Is it shared or sold to others? How is it protected?

Name. Phone Number. Email. For the purposes of voting and identification (Graded quizzes, attendance, polls, etc.). It is never shared or sold to others.

  11. Do any of your business partners collect personal information about students who use your technology?

No.

  12. With what formats can test/quiz questions be imported/exported?

Import via text. Export via CSV.

  13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android).

Works via standard web technology including Safari, Chrome, Firefox, and Internet Explorer. Participant web voting is fully supported on Android and iOS devices. Text message participation is supported via both shortcode and longcode formats.

  14. What are the total costs to students, including device costs and periodic or one-time operation costs?

Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.

  15. Describe your costs to the institution.

Depends on negotiated service level agreement. We offer a student pays model at $14 per year or Institutional Licensing.

  16. Describe how your software integrates with PowerPoint or other presentation systems.

Downloadable slides from the website for Windows PowerPoint and downloadable app for PowerPoint and Keynote integration on a Mac.

  17. State your level of integration with Desire2Learn (D2L). Does the integration require a server or other additional equipment the campus must purchase?

Export results from the site via CSV for import into D2L.
  18. How does your company address disability accommodation for your product?

We follow the latest web standards best practices to make our website widely accessible by all. To make sure we live up to this, we test our website in a text-based browser called Lynx that makes sure we’re structuring our content correctly for screen readers and other assisted technologies.

  19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the maximum limit?

No.

  20. Does your software provide for integrating multimedia files? If so, list the file format types supported.

Supports image formats (.PNG, .GIF, .JPG).

  21. What has been your historic schedule for software releases, and what pricing mechanism do you make available to your clients for upgrading?

We ship new code daily. New features are released several times a year depending on when we finish them. New features are released to the website for use by all subscribers.

  22. Describe your “CRS”(s).

Poll Everywhere is a web based classroom response system that allows students to participate from their existing devices. No expensive hardware “clickers” are required. More information can be found at  http://www.polleverywhere.com/classroom-response-system.

  23. If applicable, what is the average life span of a battery in your device, and what battery type does it take?

N/A. Battery manufacturers hate us. Thirty percent of their annual profits can be attributed to their use in clickers (we made that up).

  24. Does your system automatically save upon shutdown?

Ours is a “cloud-based” system. User data is stored there even when your computer is not on.

  25. What is your company’s projection/vision for this technology in the near and far term?

We want to put clicker companies out of business. We think it’s ridiculous to charge students and institutions a premium for outdated technology when existing devices and standard web technology can be used instead for less than a tenth of the price.

  26. Does any of your software/apps require administrator permission to install?

No.

  27. If your system is radio frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smartphones, wireless tablets, laptops, and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?

No.

  28. What impact on the wireless network does the solution have?

Depends on a variety of factors. Most university wireless networks are capable of supporting Poll Everywhere. Poll Everywhere can also make use of cell phone carrier infrastructure through SMS and data networks on the students’ phones.

  29. Can the audience response system be used spontaneously for polling?

Yes.

  30. Can quiz questions and response distributions be imported and exported from and to plaintext or a portable format? (Motivated by assessment & accreditation requirements.)

Yes.

  31. Is there a requirement that a portion of the course grade be based on the audience response system?

No.

Gloria Sheldon
MSU Moorhead

Fall 2011 Student Response System Pilot

Summary Report

 

NDSU has standardized on a single student response (i.e., “clicker”) system for over a decade, with the intent to provide a reliable system for students and faculty that can be effectively and efficiently supported by ITS. In April 2011, Instructional Services made the decision to explore other response options and to identify a suitable replacement for the previously used e-Instruction Personal Response System (PRS). At the time, PRS was laden with technical problems that rendered the system ineffective and unsupportable. That system also had a steep learning curve, was difficult to navigate, and was unnecessarily time-consuming to use. In fact, many universities across the U.S. experienced similar problems with PRS and have since adopted alternative systems.

A pilot to explore alternative response systems was initiated at NDSU in fall 2011. The pilot was aimed at further investigating two systems—Turning Technologies and iClicker—in realistic classroom environments. As part of this pilot program, each company agreed to supply required hardware and software at no cost to faculty or students. Each vendor also visited campus to demonstrate their product to faculty, students and staff.

An open invitation to participate in the pilot was extended to all NDSU faculty on a first-come, first-served basis. Of those who indicated interest, 12 were included as participants in this pilot.

 

Pilot Faculty Participants:

  • Angela Hodgson (Biological Sciences)
  • Ed Deckard (AES Plant Science)
  • Mary Wright (Nursing)
  • Larry Peterson (History, Philosophy & Religious Studies)
  • Ronald Degges (Statistics)
  • Julia Bowsher (Biological Sciences)
  • Sanku Mallik (Pharmaceutical Sciences)
  • Adnan Akyuz (AES School of Natural Resource Sciences)
  • Lonnie Hass (Mathematics)
  • Nancy Lilleberg (ITS/Communications)
  • Lisa Montplaisir (Biological Sciences)
  • Lioudmila Kryjevskaia (Physics)

 

Pilot Overview

The pilot included three components: 1) Vendor demonstrations, 2) in-class testing of the two systems, and 3) side-by-side faculty demonstrations of the two systems.

After exploring several systems, Instructional Services narrowed the field to two viable options—Turning Technologies and iClicker. Both of these systems met initial criteria assembled based on faculty input and previous usage of the existing response system. These criteria included durability, reliability, ease of use, radio frequency transmission, integration with the Blackboard LMS, cross-platform compatibility (Mac, PC), stand-alone software (i.e., no longer tied to PowerPoint or other programs), multiple answer formats (including multiple choice, true/false, numeric), potential to migrate to mobile/Web solutions at some point in the future, and cost to students and the university.

In the first stage of the pilot, both vendors were invited to campus to demonstrate their respective technologies. These presentations took place during spring semester 2011 and were attended by faculty, staff and students. The purpose of these presentations was to introduce both systems and provide faculty, staff, and students with an opportunity to take a more hands-on look at the systems and provide their initial feedback.

In the second stage of the pilot, faculty were invited to test the technologies in their classes during fall semester 2011. Both vendors supplied required hardware and software at no cost to faculty and students, and both provided online training to orient faculty to their respective systems. Additionally, Instructional Services staff provided follow-up support and training throughout the pilot program. Both vendors were requested to ensure system integration with Blackboard. Both vendors indicated that they would provide the number of clickers necessary to test the systems equally across campus. Clickers from both vendors were allocated to courses of varying sizes, ranging from 9 to 400+ students, to test viability in various facilities with differing numbers of users. Participating faculty agreed to offer personal feedback and collect feedback from students regarding experiences with the systems at the end of the pilot.

In the final stage of the pilot, Instructional Services facilitated a side-by-side demonstration led by two faculty members. Each faculty member showcased each product on a function-by-function basis so that attendees were able to easily compare and contrast the two systems. Feedback was collected from attendees.

 

Results of Pilot

In stage one, we established that both systems were viable and appeared to offer similar features, functions, and were compatible with existing IT systems at NDSU. The determination was made to include both products in a larger classroom trial.

In stage two, we discovered that both systems largely functioned as intended; however, several differences between the technologies in terms of advantages and disadvantages were discovered that influenced our final recommendation. (See Appendix A for a list of these advantages, disadvantages, and potential workarounds.) We also encountered two significant issues that altered the course of the pilot. Initially, it was intended that both systems would be tested in equal number in terms of courses and students. Unfortunately, at the time of the pilot, iClicker was not able to provide more than 675 clickers, which was far fewer than anticipated. Turning Technologies was able to provide 1,395 clickers. As a result, Turning Technologies was used by a larger number of faculty and students across campus.

At the beginning of the pilot, Blackboard integration with iClicker at NDSU was not functional. The iClicker vendor provided troubleshooting assistance immediately, but the problem was not resolved until mid-November. As a result, iClicker users had to use alternative solutions for registering clickers and uploading points to Blackboard for student viewing. Turning Technologies was functional and fully integrated with Blackboard throughout the pilot.

During the span of the pilot, additional minor issues were discovered with both systems. A faulty iClicker receiver slightly delayed the effective start date of clicker use in one course. The vendor responded by sending a new receiver; however, it was an incorrect model. Instructional Services temporarily exchanged receivers with another member of the pilot group until a functional replacement arrived. Similarly, a Turning Technologies receiver was received with outdated firmware. Turning Technologies support staff identified the problem and assisted in updating the firmware with an update tool located on their website. A faculty participant discovered a software flaw in the iClicker software that hides the software toolbar when disconnecting a laptop from a second monitor. iClicker technical support assisted in identifying the problem and stated that it would be addressed in a future software update. A workaround was identified that mitigated this problem for the remainder of the pilot. It is important to note that these issues were not widespread and did not affect all pilot users; however, they attest to the need for timely, reliable, and effective vendor support.

Students and faculty reported positive experiences with both technologies throughout the semester. Based on feedback, users of both systems found the new technologies to be much improved over the previous PRS system, indicating that adopting either technology would be perceived as an upgrade among students and faculty. Faculty pilot testers met several times during the semester to discuss their experiences with each system; feedback was sent to each vendor for their comments, suggestions, and solutions.

During the stage three demonstrations, feedback from attendees focused on iClicker's inability to integrate with Blackboard at that time and the substantial differences between the two systems in entering numeric values (i.e., Turning Technologies has numeric buttons, while iClicker requires using a directional keypad to scroll through numeric characters). Attendees perceived Turning Technologies' clickers to be much more efficient for submitting numeric responses. Feedback regarding other functionality indicated rough parity between the two systems.

Recommendation

Based on the findings of this pilot, Instructional Services recommends that NDSU IT adopt Turning Technologies as the replacement for the existing PRS system. While both pilot-tested systems are viable solutions, Turning Technologies appears to meet the needs of a larger user base. Additionally, the support offered by Turning Technologies was more timely and effective throughout the pilot. With the limited resources of IT, vendor support is critical and was a major reason for exploring alternative student response technologies.

From Instructional Services’ standpoint, standardizing to one solution is imperative for two major reasons: cost efficiency for students (i.e., preventing students from having to purchase duplicate technologies) and efficient utilization of IT resources (i.e., support and training). It is important to note that this recommendation is based on the opinion of the Instructional Services staff and the majority of pilot testers, but is not based on consensus among all participating faculty and staff. It is possible that individual faculty members may elect to use other options that best meet their individual teaching needs, including (but not limited to) iClicker. As an IT organization, we continue to support technology that serves faculty, student and staff needs across various colleges, disciplines, and courses. We feel that this pilot was effective in determining the student response technology—Turning Technologies—that will best serve NDSU faculty, students and staff for the foreseeable future.

Once a final decision concerning standardization is made, contract negotiations should begin in earnest with the goal of completion by January 1, 2012, in order to accommodate those wishing to use clickers during the spring session.

Appendix A: Clicker Comparisons
Turning Technologies and iClicker

 

Areas where both products have comparable functionality:

  • Setting up the receiver and software
  • Student registration of clickers
  • Software interface floats above other software
    • Can use with anything – PowerPoint, websites, Word, etc.
  • Asking questions on the fly
  • Can create question/answer files
  • Managing scores and data
    • Allow participation points, points for a correct answer, change the correct answer
    • Reporting – summary and detailed
  • Uploading scores and data to Blackboard (but there was a big delay with the iClicker product)
  • Durability of the receivers and clickers
  • Free software
  • Both offer a mobile web device product to go “clickerless”

Areas where the products differ:

Main Shortcomings of the Turning Technologies Product:

  • Costs $5 more – no workaround.
  • No instructor readout window on the receiver base.
    • This is a handy iClicker function that lets the instructor see the percentages of votes as they come in, allowing the instructor to plan how to proceed.
    • Workaround: As the time to answer winds down, the question and answers are displayed on the screen. The instructor can mute the projector, view the graph results quickly, hide the graph, and unmute the projector. In short, push four buttons quickly each time you want to see the feedback, and students will see a black screen momentarily.
  • Processing multiple sessions when uploading grades.
    • Turning Technologies uses its own file types, while iClicker uses comma-separated-value text files that work easily with Excel.
    • Workaround: When uploading grades into Blackboard, upload them one session at a time, and use a calculated total column in Blackboard to combine them. Ideally, instructors would upload grades daily or weekly to avoid a backlog of sessions.
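For instructors who let sessions accumulate anyway, the combining step can also be done outside Blackboard before upload. The sketch below is only illustrative: the file naming pattern `session_*.csv` and the column headers `Student ID` and `Points` are assumptions, since actual clicker CSV exports may use different names. It sums per-session points into one total per student:

```python
import csv
import glob
from collections import defaultdict

# Hypothetical column names; real clicker CSV exports may differ.
ID_COL = "Student ID"
POINTS_COL = "Points"

def combine_sessions(pattern="session_*.csv"):
    """Sum per-session clicker points into one total per student."""
    totals = defaultdict(float)
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # Treat a blank points cell as zero.
                totals[row[ID_COL]] += float(row[POINTS_COL] or 0)
    return dict(totals)
```

The resulting totals could then be imported into a single Blackboard grade column rather than one column per session.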

 

Main Shortcomings of the iClicker Product:

  • Entering numeric answers.
    • Questions that use numeric answers are widely used in math and the sciences. Instead of choosing a multiple-choice answer, students solve the problem and enter the actual numeric answer, which can include numbers and symbols.
    • Workaround: Students push the mode button and use the directional pad to scroll up and down through a list of numbers, letters, and symbols, choosing each character individually from left to right. Then they must submit the answer.
  • Number of multiple-choice answers.
    • iClicker has 5 buttons on the transmitter for direct answer choices; Turning Technologies has 10.
    • Workaround: Similar to the numeric-answer workaround. Once again, the simpler transmitter becomes complex for the students.
  • Potential vendor support problems.
    • It took iClicker over 3 months to get their grade upload interface working with NDSU’s Blackboard system. The Turning Technologies interface worked right away. No workaround.

Classroom Response System (CRS) or clickers: questions to vendors

Good evening,

We are pleased to inform you that your classroom response system has been chosen as a final candidate for campus-wide adoption/support at St. Cloud State University. Should you be interested in pursuing this opportunity, we invite you to respond to the attached list of questions and to prepare a brief presentation for members of the selection committee and interested faculty/staff.

The deadline for responding to the questions is 12:00 pm (CST), Tuesday, April 9. This deadline will allow us to review the responses in time for the vendor presentations on Thursday, April 11, 11AM-1PM. The presentations will be held virtually via Adobe Connect: http://media4.stcloudstate.edu/scsu. Please let us know if you need to test and familiarize yourself with the presentation platform.

The presentation should be no more than 10 minutes long, followed by 10 minutes for follow-up questions. We suggest that you focus on the highlights of your system, presuming a moderately knowledgeable audience. We may follow up via email or telephone call prior to making our final selection.

Thank you and looking forward to hearing from you soon.

Classroom Response System Taskforce:
Dr. Anthony Hansen
Dr. Michael Rentz
Dr. Joseph Melcher
Dr. Andrew Anda
Dr. Tracy Ore
Dr. Jack McKenna
Dr. Plamen Miltenoff

 

Questions to vendor
1. Is your system proprietary as far as the handheld device and the operating system software?
2. Describe the scalability of your system, from small classes (20-30) to large auditorium classes. (500+).
3. Is your system receiver/transmitter based, wi-fi based, or other?
4. What is the usual process for students to register a “CRS”(or other device) for a course? List all of the possible ways a student could register their device. Could a campus offer this service rather than through your system? If so, how?
5. Once a “CRS” is purchased, can it be used for as long as the student is enrolled in classes? Could “CRS” purchases be made available through the campus bookstore? Once a student purchases a “clicker,” are they able to transfer ownership when finished with it?
6. Will your operating software integrate with other standard database formats? If so, list which ones.
7. Describe the support levels you provide. If you offer maintenance agreements, describe what is covered.
8. What is your company’s history in providing this type of technology? Provide a list of higher education clients.
9. What measures does your company take to ensure student data privacy? Is your system in compliance with FERPA and the Minnesota Data Practices Act? (https://www.revisor.leg.state.mn.us/statutes/?id=13&view=chapter)
10. What personal data does your company collect on students and for what purpose? Is it shared or sold to others? How is it protected?
11. Do any of your business partners collect personal information about students that use your technology?
12. With what formats can test/quiz questions be imported/exported?
13. List compatible operating systems (e.g., Windows, Macintosh, Palm, Android)?
14. What are the total costs to students, including device costs and periodic or one-time operation costs?
15. Describe your costs to the institution.
16. Describe how your software integrates with PowerPoint or other presentation systems.
17. What is your level of integration with Desire2Learn (D2L)? Does the integration require a server or other additional equipment the campus must purchase?

18. How does your company address disability accommodation for your product?
19. Does your software limit the number of answers per question in tests or quizzes? If so, what is the maximum number of answers per question?
20. Does your software provide for integrating multimedia files? If so, list the file format types supported.
21. What has been your historic schedule for software releases and what pricing mechanism do you make available to your clients for upgrading?
22. Describe your “CRS”(s).
23. If applicable, what is the average life span of a battery in your device and what battery type does it take?
24. Does your system automatically save upon shutdown?
25. What is your company’s projection/vision for this technology in the near and far term?
26. Does any of your software/apps require administrator permission to install?
27. If your system is radio frequency based, what frequency spectrum does it operate in? If the system operates in the 2.4-2.5 GHz spectrum, have you tested to ensure that smartphones, wireless tablets, laptops, and 2.4 GHz wireless phones do not affect your system? If so, what are the results of those tests?
28. What impact does your solution have on the wireless network?
29. Can the audience response system be used spontaneously for polling?
30. Can quiz questions and response distributions be imported and exported from and to plaintext or a portable format? (Motivated by assessment and accreditation requirements.)
31. Is there a requirement that a portion of the course grade be based on the audience response system?

—————-

Plamen Miltenoff, Ph.D., MLIS

Professor

204-J James W. Miller Center

Learning Resources and Technology Services

720 Fourth Avenue South

St. Cloud, MN 56301-4498

320-308-3072

pmiltenoff@stcloudstate.edu

http://web.stcloudstate.edu/pmiltenoff/faculty/

“I am not strange, I am just not normal.” Salvador Dali

 

social media and social credit system

One of China’s biggest social networks is revealing user locations to head off ‘bad behaviour’

https://www.techradar.com/news/one-of-chinas-biggest-social-networks-is-revealing-user-locations-to-head-off-bad-behaviour

Reuters reports that Weibo will begin showing the rough locations of its users using IP addresses to combat “bad behaviour” online. The locations show up on both profiles and posts.

Chinese citizens have long resorted to using VPNs and other privacy tools to either access non-Chinese services or speak freely online, and you can see why.

Much like the Panopticon, visibly showing users that the service knows where they are will lead to self-censorship, reducing the strain on Chinese censors covering an internet with hundreds of millions of users.

+++++++++++++++
more on social credit system in this IMS blog
https://blog.stcloudstate.edu/ims?s=china+social

57 Jobs of the Future


Metaverse Jobs

  1. Metaverse World Designers
  2. Avatar Designers
  3. Metaverse Storefront Creators, Developers, and Operators
  4. Metaverse Law Enforcement
  5. DAO Attorneys

Cryptocurrency

  1. Crypto Coaches and Advisors
  2. Crypto Mortgage Specialists
  3. Decentralization Managers

Healthcare

  1. Amnesia Surgeons – Doctors who are skilled in removing bad memories or destructive behavior.
  2. Memory Augmentation Therapists – Entertainment is all about the great memories it creates. Creating a better grade of memories can dramatically change who we are and pave the way for an entirely new class of humans.
  3. Digital Implant Architects
  4. Genetic Troubleshooters
  5. Body Part Fabricators
  6. AI Health Managers

Big Data

  1. Privacy Strategists
  2. Personal Data Managers, Archivists, and Protectors
  3. Blockchain Designers
  4. Vulnerabilities Analysts

Future Education

  1. AI Memory Assessment Engineers
  2. AI Coach-Bot Designers
  3. AI Teacher-Bot Developers
