
Privacy & Security in Today’s Library by Amigos Library Services


From: Jodie Borgerding [mailto:Borgerding@amigos.org]
Sent: Wednesday, July 05, 2017 3:07 PM
To: Miltenoff, Plamen <pmiltenoff@stcloudstate.edu>
Cc: Nicole Walsh <WALSH@AMIGOS.ORG>
Subject: Proposal Submission for Privacy & Security Conference

Hi Plamen,

Thank you for your recent presentation proposal for the online conference, Privacy & Security in Today’s Library, presented by Amigos Library Services. Your proposal, The role of the library in teaching with technology unsupported by campus IT: the privacy and security issues of the “third-party,” has been accepted. I just wanted to confirm that you are still available to present on September 21, 2017, and to ask if you have a time preference for your presentation (11 am, 12 pm, or 2 pm Central). If you are no longer able to participate, please let me know.

Nicole will be in touch with you shortly with additional details and a speaker’s agreement.

Please let me know if you have any questions.

Thanks!
___________________

Jodie Borgerding
Consulting & Education Services Manager
Amigos Library Services
1190 Meramec Station Road, Suite 207 | Ballwin, MO 63021-6902
800-843-8482 x2897 | 972-340-2897 (direct)
http://www.amigos.org | borgerding@amigos.org

+++++++++++++++++

Bio

Dr. Plamen Miltenoff is an Information Specialist and Professor at St. Cloud State University. His education includes several graduate degrees in history and Library and Information Science and terminal degrees in education and psychology.

His professional interests encompass social media, multimedia, Web development and design, gaming and gamification, and learning environments (LEs).

Dr. Miltenoff organized and taught classes such as LIB 290 “Social Media in Global Context” (http://web.stcloudstate.edu/pmiltenoff/lib290/) and LIB 490/590 “Digital Storytelling” (http://web.stcloudstate.edu/pmiltenoff/lib490/) where issues of privacy and security are discussed.

Twitter handle: @SCSUtechinstruc

Facebook page: https://www.facebook.com/InforMediaServices/

The virtuality of privacy and security on the modern campus:

The role of the library in teaching with technology unsupported by campus IT: the privacy and security issues of “third-party software” in teaching and learning

Abstract/Summary of Your Proposed Session

Virtualization is rapidly changing every aspect of learning and teaching, from equipment to methodology. Just when faculty have finalized their syllabus, they must start anew if they want to keep abreast of content changes and upgrades and engage a very different student body: Millennials.

Mainframes have given way to microcomputers, microcomputers to smartphones and tablets, hard drives to cloud storage, and wearables to the IoT. The pace of hardware, software, and application upgrades is becoming unbearable for students and faculty. Content and methodology become obsolete faster than they can be created. In such an environment, faculty, students, and IT staff can barely devote time and energy to the rapidly increasing vulnerabilities connected with privacy and security.

In an effort to stretch ever-scarcer resources, campus IT “standardizes” the applications used on campus. These are the applications IT chooses to troubleshoot campus-wide, and the ones recommended to faculty and students.

Amid an unprecedented, burgeoning number of applications, particularly for mobile devices, it is difficult to constrain faculty and students to campus-IT-sanctioned applications, especially considering how quickly such applications become obsolete. Faculty and students often “stray” and go with their own choices. Such decisions put faculty and students personally, and the campus institutionally, at risk. A recent post by THE Journal draws attention to the fact that cyberattacks are now shifting from mobile devices to the IoT, while campuses often struggle even to guarantee the cybersecurity of mobile devices. Further, the use of third-party applications may conflict with campus-mandated FERPA policies. Such policies are lengthy and complex for faculty and students to absorb, and are often excessively restrictive toward innovative ways of improving the methodology and pedagogy of teaching and learning. The current procedure by which faculty and students propose new applications is a lengthy and cumbersome bureaucratic process, which often renders the end users’ proposals obsolete by the time they are vetted.

Where/what is the balance between safeguarding privacy on campus and fostering security without stifling innovation and creativity? Can the library be the campus hub for education about privacy and security, the sandbox for testing and innovation and the body to expedite decision-making?

Abstract

The pace of change in teaching and learning is becoming impossible to sustain: equipment evolves at an accelerated pace, the methodology of teaching and learning cannot catch up with the equipment changes, and on top of that there are constant content updates. With ever-shrinking budgets, faculty, students, and IT staff can barely address the issues above, leaving little time and energy for the increasing concerns about privacy and security.



http://blog.stcloudstate.edu/ims/2017/06/06/cybersecurity-and-students/

Anything else you would like to add

3 take-aways from this session:

  • Discuss and form an opinion about the education-pertinent issues of privacy and security from the broad campus perspective, versus the narrow library one
  • Discuss and form an opinion about the role of the library on campus in terms of the greater issues of privacy and security
  • Re-examine the thin red line of the balance between standardization and innovation, and between the need for security and privacy protection

++++++++++++++
presentation:
https://www.slideshare.net/aidemoreto/the-virtuality-of-privacy-and-security-on-the 

chat – slide 4, privacy. Please take 2 min and share your definition of privacy on campus. Does it differ between faculty and students? What are the main characteristics that determine privacy?

chat – slide 5, security. Please take 2 min and share your definition of security on campus regarding electronic activities. Whose responsibility is security? An IT issue [only]?

poll – slide 6, technology unsupported by campus IT: is it worth considering? 1. I am a great believer in my freedom of choice 2. I firmly follow rules, and this applies to the use of computer tools and applications 3. Whatever…

chat – slide 6, why third-party applications? Pros and cons. E.g., pros: familiarity with the third-party tool versus the campus-required one

poll – slide 6, app smashing. App smashing is the ability to combine mobile apps in your teaching process. How do you feel about it? 1. The force is with us 2. Nonsense…

poll – slide 7, third-party apps and the comfort of faculty. How do you see the freedom of using third-party apps? 1. All I want, thank you 2. I would rather follow the rules 3. Indifference is my middle name

poll – slide 8, technology standardization? 1. yes 2. no 3. indifferent

chat – slide 9, if the two major issues colliding in this instance are standardization versus third-party use, and both have an impact on privacy and security, how would you argue for one or the other?

++++++++++++++++
notes from the conference


Measuring Library Vendor Cyber Security: Seven Easy Questions Every Librarian Can Ask

http://journal.code4lib.org/articles/11413

Bill Walker: http://www.amigos.org/innovating_metadata

 

+++++++++++++++
more on security in education in this IMS blog
http://blog.stcloudstate.edu/ims?s=security

more on privacy in education in this IMS blog
http://blog.stcloudstate.edu/ims?s=privacy

classroom discussions on privacy

Dear colleagues,

The topic of privacy pertaining to technology is becoming ubiquitous.
If you feel that the content of your class material can benefit from such discussions, please let us know.

Here are some titles that can help you brainstorm discussion topics for your classes:

Power, Privacy, and the Internet
http://blog.stcloudstate.edu/ims/2015/12/03/power-privacy-internet/

Privacy groups slam Department of Homeland Security social media proposal
http://blog.stcloudstate.edu/ims/2016/08/24/dhs-social-media-proposal/

FBI quietly changes its privacy rules for accessing NSA data on Americans
http://blog.stcloudstate.edu/ims/2016/03/09/surveillance-and-privacy/

Facebook canceled a student’s internship after he highlighted a massive privacy issue
http://blog.stcloudstate.edu/ims/2015/08/17/facebook-and-privacy/

Samsung’s Privacy Policy Warns Customers Their Smart TVs Are Listening
http://blog.stcloudstate.edu/ims/2015/02/10/privacy-smart-devices/

Teenagers, The Internet, And Privacy
http://blog.stcloudstate.edu/ims/2014/11/05/teenagers-the-internet-and-privacy/

Online privacy: It’s time for a new security paradigm
http://blog.stcloudstate.edu/ims/2014/09/25/online-privacy-its-time-for-a-new-security-paradigm/

On social media, privacy, etc.
http://blog.stcloudstate.edu/ims/2014/03/14/on-social-media-privacy-etc/

Hacking the Future: Privacy, Identity, and Anonymity On the Web
http://blog.stcloudstate.edu/ims/2013/12/03/hacking-the-future-privacy-identity-and-anonymity-on-the-web/

Are We Puppets in a Wired World?
http://blog.stcloudstate.edu/ims/2013/10/23/pro-domo-sua-are-we-puppets-in-a-wired-world-surveillance-and-privacy-revisited/

How Teens Deal With Privacy and Mobile Apps
http://blog.stcloudstate.edu/ims/2013/08/28/how-teens-deal-with-privacy-and-mobile-apps/

If you would like more tangible, hands-on assistance with these or any other technology topics, please do not hesitate to contact us.

Teenagers, The Internet, And Privacy

The Truth About Teenagers, The Internet, And Privacy

http://www.fastcompany.com/3037962/then-and-now/the-truth-about-teenagers-the-internet-and-privacy

danah boyd, a professor at Harvard University’s Berkman Center for the Internet and Society, argues that teenagers closely scrutinize what they share online because it is a way for them to negotiate their changing identities. In her book, It’s Complicated: The Social Lives of Networked Teens, she describes how teenagers carefully curate their feeds based on the audience they are trying to reach.

Adolescents have been migrating away from Facebook and Twitter over the last few years, showing preference for sites like Snapchat, Whisper, Kik, and Secret that provide more anonymity and privacy. Part of this transition can be explained by the fact that the older social media sites stopped being cool when parents joined them, but perhaps another reason could be that teenagers growing up in the post-Snowden era implicitly understand the value of anonymity. For teens, it’s not a matter of which platform to use, but rather which works best in a particular context.

WHAT ARE YOU REVEALING ONLINE? MUCH MORE THAN YOU THINK


http://ideas.ted.com/2014/07/01/do-you-know-what-youre-revealing-online-much-more-than-you-think/

Right now in the U.S. it’s essentially the case that when you post information online, you give up control of it.

Some companies may give you that right, but you don’t have a natural, legal right to control your personal data. So if a company decides they want to sell it or market it or release it or change your privacy settings, they can do that.

The point is, we really don’t know how this information will be used. For instance, say I’m a merchant — once I get information about you, I can use this information to try to extract more economic surplus from the transaction. I can price-discriminate you, so that I can get more out of the transaction than you will.

I’m interested in working in this area, not because disclosure is bad — human beings disclose all the time, it’s an innate need as much as privacy is — but because we really don’t know how this information will be used in the long run.

On social media, privacy, etc.

Twitter, Rape and Privacy on Social Media – The Cut
http://nymag.com/thecut/2014/03/twitter-rape-and-privacy-on-social-media.html?mid=facebook_nymag

*****************

Three thoughtful and thought-provoking essays about teaching social media use:

“Why students should not be required to publicly participate online” online at http://prpost.wordpress.com/2010/04/25/why-students-should-not-be-required-to-publicly-participate-online/

“Notes on Student Privacy and Online Pedagogy” online at http://joshhonn.com/?p=65

“Why the Loon does not assign public social-media use” online at http://gavialib.com/2014/02/why-the-loon-does-not-assign-public-social-media-use/

I don’t necessarily advocate the point of view expressed in these posts, but I do think they merit both attention and discussion in a course focused on social media.

Keith Ewing

Professor, Library Systems & Digital Projects


OLC Collaborate


https://onlinelearningconsortium.org/attend-2019/innovate/

schedule:

https://onlinelearningconsortium.org/attend-2019/innovate/program/all_sessions/#streamed

Wednesday

++++++++++++++++
THE NEW PROFESSOR: HOW I PODCASTED MY WAY INTO STUDENTS’ LIVES (AND HOW YOU CAN, TOO)

Concurrent Session 1

https://onlinelearningconsortium.org/olc-innovate-2019-session-page/?session=6734&kwds=

+++++++++++++

Creating A Cost-Free Course

+++++++++++++++++

Idea Hose: AI Design For People
Date: Wednesday, April 3rd
Time: 3:30 PM to 4:15 PM
Conference Session: Concurrent Session 3
Streamed session
Lead Presenter: Brian Kane (General Design LLC)
Track: Research: Designs, Methods, and Findings
Location: Juniper A
Session Duration: 45min
Brief Abstract: What happens when you apply design thinking to AI? AI presents a fundamental change in the way people interact with machines. By applying design thinking to the way AI is made and used, we can generate an unlimited amount of new ideas for products and experiences that people will love and use.
https://onlinelearningconsortium.org/olc-innovate-2019-session-page/?session=6964&kwds=
Notes from the session:
design thinking: get out of old mental models; new narratives; get out of the sci-fi movies.
narrative generators: AI design for people stream
we need machines to make mistakes, AI even more than traditional software
lessons learned: don’t replace people
creativity engines – automated creativity
trends:
https://www.androidauthority.com/nvidia-jetson-nano-966609/
https://community.infiniteflight.com/t/virtualhub-ios-and-android-free/142837?u=sudafly
http://bit.ly/VirtualHub
Thursday
Chatbots, Game Theory, And AI: Adapting Learning For Humans, Or Innovating Humans Out Of The Picture?
Date: Thursday, April 4th
Time: 8:45 AM to 9:30 AM
Conference Session: Concurrent Session 4
Streamed session
Lead Presenter: Matt Crosslin (University of Texas at Arlington LINK Research Lab)
Track: Experiential and Life-Long Learning
Location: Cottonwood 4-5
Session Duration: 45min
Brief Abstract: How can teachers utilize chatbots and artificial intelligence in ways that won’t remove humans from the education picture? Using tools like Twine and Recast.AI chatbots, this session will focus on how to build adaptive content that allows learners to create their own heutagogical educational pathways based on individual needs.

++++++++++++++++

This Is Us: Fostering Effective Storytelling Through EdTech & Student’s Influence As Digital Citizens
Date: Thursday, April 4th
Time: 9:45 AM to 10:30 AM
Conference Session: Concurrent Session 5
Streamed session
Lead Presenter: Maikel Alendy (FIU Online)
Co-presenter: Sky V. King (FIU Online – Florida International University)
Track: Teaching and Learning Practice
Location: Cottonwood 4-5
Session Duration: 45min
Brief Abstract: “This is Us” demonstrates how leveraging storytelling in learning engages students to effectively communicate their authentic story, transitioning from consumers to creators and influencers. Responsibility as a digital citizen, information and digital literacy, online privacy, and strategies with examples using several edtech tools will be reviewed.

++++++++++++++++++

Personalized Learning At Scale: Using Adaptive Tools & Digital Assistants
Date: Thursday, April 4th
Time: 11:15 AM to 12:00 PM
Conference Session: Concurrent Session 6
Streamed session
Lead Presenter: Kristin Bushong (Arizona State University )
Co-presenter: Heather Nebrich (Arizona State University)
Track: Effective Tools, Toys and Technologies
Location: Juniper C
Session Duration: 45min
Brief Abstract: Considering today’s overstimulated lifestyle, how do we engage busy learners to stay on task? Join this session to discover current efforts to implement ubiquitous educational opportunities through customized interests and personalized learning aspirations, e.g., adaptive math tools, AI support communities, and memory management systems.

+++++++++++++

High-Impact Practices Online: Starting The Conversation
Date: Thursday, April 4th
Time: 1:15 PM to 2:00 PM
Conference Session: Concurrent Session 7
Streamed session
Lead Presenter: Katie Linder (Oregon State University)
Co-presenter: June Griffin (University of Nebraska-Lincoln)
Track: Teaching and Learning Practice
Location: Cottonwood 4-5
Session Duration: 45min
Brief Abstract: The concept of High-Impact Educational Practices (HIPs) is well-known, but the conversation about transitioning HIPs online is new. In this session, contributors from the edited collection High-Impact Practices in Online Education will share current HIP research, and offer ideas for participants to reflect on regarding implementing HIPs in online environments.
https://www.aacu.org/leap/hips
https://www.aacu.org/sites/default/files/files/LEAP/HIP_tables.pdf

+++++++++++++++++++++++

Human Skills For Digital Natives: Expanding Our Definition Of Tech And Media Literacy
Date: Thursday, April 4th
Time: 3:45 PM to 5:00 PM
Streamed session
Lead Presenter: Manoush Zomorodi (Stable Genius Productions)
Track: N/A
Location: Adams Ballroom
Session Duration: 1hr 15min
Brief Abstract: How can we ensure that students and educators thrive in increasingly digital environments, where change is the only constant? In this keynote, author and journalist Manoush Zomorodi shares her pioneering approach to researching the effects of technology on our behavior. Her unique brand of journalism includes deep-dive investigations into such timely topics as personal privacy, information overload, and the Attention Economy. These interactive multi-media experiments with tens of thousands of podcast listeners will inspire you to think creatively about how we use technology to educate and grow communities.

Friday

Anger Is An Energy
Date: Friday, April 5th
Time: 8:30 AM to 9:30 AM
Streamed session
Lead Presenter: Michael Caulfield (Washington State University-Vancouver)
Track: N/A
Location: Adams Ballroom
Position: 2
Session Duration: 60min
Brief Abstract: Years ago, John Lydon (then Johnny Rotten) sang that “anger is an energy.” And he was right, of course. Anger isn’t an emotion, like happiness or sadness. It’s a reaction, a swelling up of a confused urge. I’m a person profoundly uncomfortable with anger, and yet I’ve found in my professional career that often my most impactful work begins in a place of anger: anger against injustice, inequality, lies, or corruption. And often it is that anger that gives me the energy and endurance to make a difference, to move the mountains that need to be moved. In this talk I want to think through our uneasy relationship with anger; how it can be helpful, and how it can destroy us if we’re not careful.

++++++++++++++++

Improving Online Teaching Practice, Creating Community And Sharing Resources
Date: Friday, April 5th
Time: 10:45 AM to 11:30 AM
Conference Session: Concurrent Session 10
Streamed session
Lead Presenter: Laurie Daily (Augustana University)
Co-presenter: Sharon Gray (Augustana University)
Track: Problems, Processes, and Practices
Location: Juniper A
Session Duration: 45min
Brief Abstract: The purpose of this session is to explore the implementation of a Community of Practice to support professional development, enhance online course and program development efforts, and foster community and engagement between and among full- and part-time faculty.

+++++++++++++++

It’s Not What You Teach, It’s HOW You Teach: A Story-Driven Approach To Course Design
Date: Friday, April 5th
Time: 11:45 AM to 12:30 PM
Conference Session: Concurrent Session 11
Streamed session
Lead Presenter: Katrina Rainer (Strayer University)
Co-presenter: Jennifer M McVay-Dyche (Strayer University)
Track: Teaching and Learning Practice
Location: Cottonwood 2-3
Session Duration: 45min
Brief Abstract: Learning is more effective and organic when we teach through the art of storytelling. At Strayer University, we are blending the principles of story-driven learning with research-based instructional design practices to create engaging learning experiences. This session will provide you with strategies to infuse stories into any lesson, course, or curriculum.

Encyclopedia of Criminal Activities and the Deep Web

>>>>>>> Publishing Opportunity <<<<<<<<<<<<<<

Encyclopedia of Criminal Activities and the Deep Web

Countries all over the world are seeing significant increases in criminal activity through the use of technological tools. Such crimes as identity theft, cyberattacks, drug trafficking, and human trafficking are conducted through the deep and dark web, while social media is utilized by murderers, sex offenders, and pedophiles to elicit information and contact their victims. As criminals continue to harness technology to their advantage, law enforcement and government officials are left to devise alternative strategies to learn more about all aspects of these modern criminal patterns and behavior, to preserve the safety of society, and to ensure that proper justice is served. Regrettably, the lack of adequate research findings on these modern criminal activities is limiting everyone’s abilities to devise effective strategies and programs to combat these modern technology-related criminal activities.

In an effort to compile the most current research on this topic, a new major reference work titled Encyclopedia of Criminal Activities and the Deep Web is currently being developed. This comprehensive Encyclopedia is projected to encompass expert insights about the nature of these criminal activities, how they are conducted, and societal and technological limitations. It will also explore new methods and processes for monitoring and regulating the use of these tools, such as social media, online forums, and online ads, as well as hidden areas of the internet including the deep and dark web. Additionally, this Encyclopedia seeks to offer strategies for predicting and preventing criminals from using technology as a means to track, stalk, and lure their victims.

You are cordially invited to share your research to be featured in this Encyclopedia by submitting a chapter proposal/abstract using the link on the formal call for papers page here. If your chapter proposal is accepted, guidelines for preparing your full chapter submission (which should be between 5,000-7,500 total words in length) can be accessed at: http://www.igi-global.com/publish/contributor-resources/ (under the “For Authors” heading – “Encyclopedia Chapter Organization and Formatting”).

Recommended topics for papers include, but are not limited to:

  • Bitcoin and Crime
  • Botnets and Crime
  • Child Exploitation
  • Contract Killing
  • Criminology
  • Cryptocurrency
  • Cyber Espionage
  • Cyber Stalking
  • Cybercrime
  • Cybercriminals
  • Cybersecurity Legislation
  • Cyberterrorism Fraud
  • Dark Web
  • Dark Web Vendors
  • Darknets
  • Data Privacy
  • Dating Websites and Crime
  • Deep Web
  • Drug Trafficking
  • E-Banking Fraud
  • Email Scams
  • Fraud and Internet
  • Gaming and Crime
  • Government Regulations of the Dark Web
  • Hacking and Crime
  • Hacktivism
  • Human Trafficking
  • Identity Theft
  • International Regulations of the Dark Web
  • Internet Privacy
  • Internet Regulations
  • Internet Safety & Crime
  • Online Advertisement Websites and Crime
  • Online Blackmail
  • Online Forums and Crime
  • Online Hate Crimes
  • Online Predators
  • Online Privacy
  • Social Media Deception
  • Social Networking Traps
  • Undercover Dark Web Busts
  • Undercover Operations
  • Vigilante Justice
  • Virtual Currencies & Crime
  • Whistleblowing

IMPORTANT DATES: Chapter Proposal Submission Deadline: October 15, 2018; Full Chapters Due: December 15, 2018

Note: There are no publication fees; however, contributors will be asked to provide a courtesy to their fellow colleagues by serving as a peer reviewer for this project for at least 2-3 articles. This will ensure the highest level of integrity and quality for the publication.

Should you have any questions regarding this publication, or this invitation, please do not hesitate to contact: EncyclopediaCADW@igi-global.com

Mehdi Khosrow-Pour, DBA
Editor-in-Chief
Encyclopedia of Criminal Activities and the Deep Web
EncyclopediaCADW@igi-global.com

big data in ed

New Report Examines Use of Big Data in Ed

By Dian Schaffhauser  05/17/17

https://campustechnology.com/articles/2017/05/17/new-report-examines-use-of-big-data-in-ed.aspx

A new report from the National Academy of Education, “Big Data in Education,” summarizes the findings of a recent workshop held by the academy.

Three federal laws apply: the Family Educational Rights and Privacy Act (FERPA), the Children’s Online Privacy Protection Act (COPPA), and the Protection of Pupil Rights Amendment (PPRA).

Over the last four years, 49 states and the District of Columbia have introduced 410 bills related to student data privacy, and 36 states have passed 85 new education data privacy laws. Also, since 2014, 19 states have passed laws that in some way address the work done by researchers.

Researchers need to get better at communicating about their projects, especially with non-researchers.

One approach to follow in gaining trust “from parents, advocates and teachers” uses the acronym CUPS:

  • Collection: What data is collected by whom and from whom;
  • Use: How the data will be used and what the purpose of the research is;
  • Protection: What forms of data security protection are in place and how access will be limited; and
  • Sharing: How and with whom the results of the data work will be shared.

Second, researchers must pin down how to share data without making it vulnerable to theft.

Third, researchers should build partnerships of trust and “mutual interest” pertaining to their work with data. Those alliances may involve education technology developers, education agencies both local and state, and data privacy stakeholders.

Along with the summary report, the results of the workshop are being maintained on a page within the Academy’s website here.

+++++++++++++++++
more on big data in education in this IMS blog
http://blog.stcloudstate.edu/ims?s=big+data

big data

big-data-in-education-report

Center for Digital Education (CDE)

real-time impact on curriculum structure, instruction delivery and student learning, permitting change and improvement. It can also provide insight into important trends that affect present and future resource needs.

Big Data: Traditionally described as high-volume, high-velocity and high-variety information.
Learning or Data Analytics: The measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.
Educational Data Mining: The techniques, tools and research designed for automatically extracting meaning from large repositories of data generated by or related to people’s learning activities in educational settings.
Predictive Analytics: Algorithms that help analysts predict behavior or events based on data.
Predictive Modeling: The process of creating, testing and validating a model to best predict the probability of an outcome.
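The definitions above describe predictive modeling as creating, testing, and validating a model to predict the probability of an outcome. A minimal sketch of that workflow, using invented synthetic data and a hand-rolled logistic model (the feature names, numbers, and thresholds here are illustrative assumptions, not from the report):

```python
import math
import random

random.seed(42)

# Synthetic student records: (hours_active, assignments_done, passed?).
# The "passed" rule is an invented linear boundary for illustration.
data = []
for _ in range(200):
    hours = random.uniform(0, 10)
    assignments = random.randint(0, 10)
    data.append((hours, assignments, 1 if hours + assignments > 10 else 0))

train, test = data[:150], data[150:]

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

# Create: fit a logistic model with plain stochastic gradient descent.
w_hours = w_assign = bias = 0.0
lr = 0.05
for _ in range(500):
    for hours, assignments, outcome in train:
        p = sigmoid(w_hours * hours + w_assign * assignments + bias)
        grad = p - outcome  # gradient of the log-loss w.r.t. the logit
        w_hours -= lr * grad * hours
        w_assign -= lr * grad * assignments
        bias -= lr * grad

# Validate: predicted probability of the outcome on held-out records.
correct = sum(
    (sigmoid(w_hours * h + w_assign * a + bias) > 0.5) == bool(y)
    for h, a, y in test
)
accuracy = correct / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The train/held-out split is what the "testing and validating" part of the definition refers to: the model's probability estimates are only trusted to the extent they hold up on records it never saw.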

Data analytics, or the measurement, collection, analysis and reporting of data, is driving decision-making in many institutions. However, because of the unique nature of each district’s or college’s data needs, many are building their own solutions.

For example, in 2014 the nonprofit company inBloom, Inc., backed by $100 million from the Gates Foundation and the Carnegie Foundation for the Advancement of Teaching, closed its doors amid controversy regarding its plan to store, clean and aggregate a range of student information for states and districts and then make the data available to district-approved third parties to develop tools and dashboards so the data could be used by classroom educators.

Tips for Student Data Privacy

Know the Laws and Regulations
There are many regulations on the books intended to protect student privacy and safety: the Family Educational Rights and Privacy Act (FERPA), the Protection of Pupil Rights Amendment (PPRA), the Children’s Internet Protection Act (CIPA), the Children’s Online Privacy Protection Act (COPPA) and the Health Insurance Portability and Accountability Act (HIPAA)
— as well as state, district and community laws. Because technology changes so rapidly, it is unlikely laws and regulations will keep pace with new data protection needs. Establish a committee to ascertain your institution’s level of understanding of and compliance with these laws, along with additional safeguard measures.
Make a Checklist
Your institution’s privacy policies should cover security, user safety, communications, social media, access, identification rules, and intrusion detection and prevention.
Include Experts
To nail down compliance and stave off liability issues, consider tapping those who protect privacy for a living, such as your school attorney, IT professionals and security assessment vendors. Let them review your campus or district technologies as well as devices brought to campus by students, staff and instructors. Finally, a review of your privacy and security policies, terms of use and contract language is a good idea.
Communicate, Communicate, Communicate
Students, staff, faculty and parents all need to know their rights and responsibilities regarding data privacy. Communicate your technology plans, policies and requirements, then reassess and re-communicate them throughout each year.

“Anything-as-a-Service” or “X-as-a-Service” solutions can help K-12 and higher education institutions cope with big data by offering storage, analytics capabilities and more. These include:
• Infrastructure-as-a-Service (IaaS): Providers offer cloud-based storage, similar to a campus storage area network (SAN)

• Platform-as-a-Service (PaaS): Opens up application platforms, as opposed to the applications themselves, so others can build their own applications using underlying operating systems, data models and databases, along with pre-built application components and interfaces

• Software-as-a-Service (SaaS): The hosting of applications in the cloud

• Big-Data-as-a-Service (BDaaS): Combines all of the above, scaled up to handle enormously larger volumes of data
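One common way to compare these service models is a shared-responsibility matrix: which layers of the stack the customer still manages versus the provider. A minimal sketch in Python (the layer names and assignments are a simplified, generic assumption, not any vendor's official matrix):

```python
# Simplified shared-responsibility sketch for cloud service models.
# Layer names and assignments are illustrative assumptions only.

LAYERS = {"application", "data", "runtime", "os",
          "virtualization", "servers", "storage", "networking"}

# Layers the *customer* still manages under each model;
# the provider handles everything else.
CUSTOMER_MANAGED = {
    "on_premises": set(LAYERS),                      # you run it all
    "iaas": {"application", "data", "runtime", "os"},  # provider runs the hardware
    "paas": {"application", "data"},                   # provider also runs the platform
    "saas": set(),                                     # provider runs everything
}

def provider_managed(model: str) -> set[str]:
    """Return the layers the provider handles under a given service model."""
    return LAYERS - CUSTOMER_MANAGED[model]

for model in ("iaas", "paas", "saas"):
    print(f"{model}: provider manages {sorted(provider_managed(model))}")
```

Read bottom to top: each successive model shifts more layers (and therefore more of the privacy and security responsibility) onto the provider, which is exactly why vendor contracts and terms of use deserve the scrutiny described above.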

Suggestions:

Use accurate data correctly
Define goals and develop metrics
Eliminate silos; integrate data
Remember that intelligence is the goal
Maintain a robust, supportive enterprise infrastructure
Prioritize student privacy
Develop bulletproof data governance guidelines
Create a culture of collaboration and sharing, not compliance

more on big data in this IMS blog:

http://blog.stcloudstate.edu/ims/?s=big+data&submit=Search

Algorithmic Test Proctoring

Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education

SHEA SWAUGER ED-TECH

https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/

While in-person test proctoring has been used to combat test-based cheating, this can be difficult to translate to online courses. Ed-tech companies have sought to address this concern by offering to watch students take online tests, in real time, through their webcams.

Some of the more prominent companies offering these services include Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity.

Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications. 

While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers has exhibited this bias), we see it deployed through the face detection and facial recognition used by algorithmic proctoring systems.

While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies. These technologies generally function similarly and have shown a consistent inability to identify people with darker skin, or even to distinguish between individual Chinese faces. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.

As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that a software asking students to verify their identity is compromising for students who identify as trans, non-binary, or express their gender in ways counter to cis/heteronormativity.

These features and settings create a system of asymmetric surveillance and lack of accountability, things which have always created a risk for abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.

Their promotional messaging functions similarly to dog-whistle politics, which is commonly used in anti-immigration rhetoric. It’s also not a coincidence that these technologies are being used to exclude people not wanted by an institution; biometrics and facial recognition have been connected to anti-immigration policies, supported by both Republican and Democratic administrations, going back to the 1990s.

Borrowing from Henry A. Giroux, Kevin Seeber describes the pedagogy of punishment and some of its consequences in regards to higher education’s approach to plagiarism in his book chapter “The Failed Pedagogy of Punishment: Moving Discussions of Plagiarism beyond Detection and Discipline.”

my note: I have been repeating this for years
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.

Technological Solutionism

Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”

Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people, and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.

This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.

Higher education was deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research: they published copious literature on it and established endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.

+++++++++++++++++
more on privacy in this IMS blog
http://blog.stcloudstate.edu/ims?s=privacy
