Posts Tagged ‘online privacy’

social media and social credit system

One of China’s biggest social networks is revealing user locations to head off ‘bad behaviour’

https://www.techradar.com/news/one-of-chinas-biggest-social-networks-is-revealing-user-locations-to-head-off-bad-behaviour

Reuters reports that Weibo will begin showing the rough locations of its users, derived from IP addresses, to combat "bad behaviour" online. The locations appear on both profiles and posts.

Chinese citizens have long resorted to VPNs and other privacy tools, either to access non-Chinese services or to speak freely online, and it is easy to see why.

Much like the Panopticon, visibly showing users that the service knows where they are encourages self-censorship, reducing the strain on Chinese censors tasked with covering an internet of hundreds of millions of users.
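The coarse IP-to-region mapping described above can be sketched as follows. The prefix table and function here are hypothetical illustrations of the general technique, not Weibo's actual implementation; real services use large commercial geolocation databases.

```python
import ipaddress

# Hypothetical prefix-to-region table (illustration only); real
# geolocation databases map far more prefixes at finer granularity.
REGION_PREFIXES = {
    "36.0.0.0/8": "Guangdong",
    "58.0.0.0/8": "Beijing",
    "101.0.0.0/8": "Shanghai",
}

def rough_location(ip: str) -> str:
    """Return a coarse region label for an IP address, or 'Unknown'."""
    addr = ipaddress.ip_address(ip)
    for prefix, region in REGION_PREFIXES.items():
        if addr in ipaddress.ip_network(prefix):
            return region
    return "Unknown"
```

The point of the sketch is that the mapping is deliberately coarse: the service reveals only a region label next to each post, not the raw IP, which is enough to produce the self-censorship effect described above.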

+++++++++++++++
more on social credit system in this IMS blog
https://blog.stcloudstate.edu/ims?s=china+social

Digital violence

https://www.spiegel.de/international/tomorrow/digital-violence-olimpia-coral-s-fight-on-behalf-of-the-women-of-mexico-a-b1da72f2-b720-4102-849a-d5a02def7d58

The recently published report “Free to Be Online?” by Plan International found that more than half of the 14,000 girls and young women surveyed worldwide have experienced online harassment or abuse.

+++++++++++++
more on privacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy

education algorithms

https://www.edsurge.com/news/2016-06-10-humanizing-education-s-algorithms

predictive algorithms to better target students’ individual learning needs.

Personalized learning is a lofty aim, however you define it. To truly meet each student where they are, we would have to know their most intimate details, or discover them through their interactions with our digital tools. We would need to track their moods and preferences, their fears and beliefs…perhaps even their memories.

There’s something unsettling about capturing users’ most intimate details. Any prediction model based on historical records risks typecasting the very people it is intended to serve. Even if models can overcome the threat of discrimination, there is still an ethical question to confront: just how much are we entitled to know about students?

We can accept that tutoring algorithms, for all their processing power, are inherently limited in what they can account for. This means steering clear of mythical representations of what such algorithms can achieve. It may even mean giving up on personalization altogether. The alternative is to pack our algorithms to suffocation at the expense of users’ privacy. This approach does not end well.

There is only one way to resolve this trade-off: loop in the educators.

Algorithms and data must exist to serve educators.

 

++++++++++++
more on algorithms in this IMS blog
https://blog.stcloudstate.edu/ims?s=algor

game for online privacy and data security

Teachers Turn to Gaming for Online Privacy Lessons

By Dian Schaffhauser 10/10/18

https://thejournal.com/articles/2018/10/10/teachers-turn-to-gaming-for-online-privacy-lessons.aspx

Blind Protocol is an alternate reality game created by two high school English teachers to help students understand online privacy and data security. This form of gaming blends fact and fiction to immerse players in an interactive world that responds to their decisions and actions. In a recent article on KQED, Paul Darvasi and John Fallon described how they chose the gaming format to help their students gain a deeper look at how vulnerable their personal data is.

Darvasi, who blogs at “Ludic Learning,” and Fallon, who writes at “TheAlternativeClassroom,” are both immersed in the education gaming realm.

++++++++++
more on online privacy and data security

https://blog.stcloudstate.edu/ims?s=online+privacy

https://blog.stcloudstate.edu/ims?s=data+security