Because of technological advances and the sheer amount of data now available about billions of other people, discretion no longer suffices to protect your privacy. Computer algorithms and network analyses can now infer, with a sufficiently high degree of accuracy, a wide range of things about you that you may have never disclosed, including your moods, your political beliefs, your sexual orientation and your health.
There is no longer such a thing as individually “opting out” of our privacy-compromised world.
In 2017, the newspaper The Australian published an article, based on a leaked document from Facebook, revealing that the company had told advertisers that it could predict when younger users, including teenagers, were feeling “insecure,” “worthless” or otherwise in need of a “confidence boost.” Facebook was apparently able to draw these inferences by monitoring photos, posts and other social media data.
In 2017, academic researchers, armed with data from more than 40,000 Instagram photos, used machine-learning tools to accurately identify signs of depression in a group of 166 Instagram users. Their computer models turned out to be better predictors of depression than humans who were asked to rate whether photos were happy or sad, among other qualities.
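To make the approach concrete, here is a toy sketch of that kind of pipeline: simple per-photo features feed a classifier that separates the two groups. The feature choices echo the study's reported finding that depressed users' photos tended to be darker and less saturated, but all numbers, names, and the tiny hand-rolled logistic regression below are illustrative assumptions, not the researchers' actual model.

```python
# Illustrative only: classify users by photo features (brightness,
# saturation). Data is synthetic; the real study used far richer features.
import math
import random

random.seed(0)

def make_photo_features(depressed, n=200):
    """Simulate (brightness, saturation) pairs; the 'depressed' class
    skews darker and less saturated, per the study's reported trend."""
    data = []
    for _ in range(n):
        if depressed:
            data.append(([random.gauss(0.35, 0.1), random.gauss(0.30, 0.1)], 1))
        else:
            data.append(([random.gauss(0.60, 0.1), random.gauss(0.55, 0.1)], 0))
    return data

def train_logistic(data, lr=0.5, epochs=300):
    """Plain stochastic gradient descent on a 2-feature logistic model."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            g = p - y  # gradient of log-loss w.r.t. the logit
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return w, b

data = make_photo_features(True) + make_photo_features(False)
random.shuffle(data)
w, b = train_logistic(data[:300])
correct = sum(
    (1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b))) > 0.5) == (y == 1)
    for x, y in data[300:]
)
print(f"held-out accuracy: {correct / len(data[300:]):.2f}")
```

Even this crude two-feature model separates the synthetic groups well, which is the broader point of the study: statistical regularities in photos can reveal a state the user never disclosed.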
Computational inference can also be a tool of social control. The Chinese government, having gathered biometric data on its citizens, is trying to use big data and artificial intelligence to single out “threats” to Communist rule, including the country’s Uighurs, a mostly Muslim ethnic group.
Many academic libraries are counting how many times students use electronic library resources or visit in person, and comparing that with how well those students do in their classes and how likely they are to stay in school and earn a degree. Many library leaders are finding a strong correlation: students who consume more library materials tend to be more successful academically.
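The analysis these libraries describe can be as simple as a correlation coefficient between usage counts and outcomes. The sketch below uses made-up numbers (library visits per term against GPA); the data, and the strength of the resulting correlation, are purely illustrative.

```python
# Illustrative only: Pearson correlation between library visits and GPA
# on invented data, the kind of figure library analytics teams report.
import math

visits = [2, 5, 8, 1, 12, 7, 3, 10, 6, 9]
gpas = [2.4, 2.9, 3.3, 2.1, 3.8, 3.1, 2.6, 3.6, 3.0, 3.4]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(visits, gpas)
print(f"r = {r:.2f}")
```

A high r here shows only association, not causation: heavy library users may simply be more motivated students, which is why interventions like moving tutoring into the library are experiments rather than guaranteed fixes.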
One university is carefully tracking how library use compares with other metrics, and it has made changes as a result, like moving the tutoring center and the writing lab into the library. Those moves were designed not only to lure more people into the stacks, but to make seeking help more socially acceptable for students who might have been hesitant.
Another effort is a partnership between the library, which knows what electronic materials students use, and the technology office, which manages other campus data such as usage of the course-management system. The university is conducting a study to see whether library usage there also correlates with student success.
Last month, the nonprofit Center for Democracy and Technology (CDT) published a report arguing that schools and districts should follow the lead of other industries and hire a Chief Privacy Officer to oversee their organization’s privacy policies and practices.
But the reality is that Chief Privacy Officers in K-12 education are about as common as unicorns.
Two years ago, Denver Public Schools created a new role, the Student Data Privacy Officer, after the Colorado legislature passed a law to promote student data privacy and transparency.
Between the creation of a social rating system and street cameras with facial recognition capabilities, technology reports coming out of China have raised serious concerns for privacy advocates. These concerns are only heightened as Chinese investors turn their attention to the United States education technology space, acquiring companies with millions of public school users.
A particularly notable deal this year centers on Edmodo, a cross between a social networking platform and a learning management system for schools that boasts upwards of 90 million users. NetDragon, a Chinese gaming company that is building a significant education division, bought Edmodo earlier this month for a combination of cash and equity valued at $137.5 million.
Edmodo began shifting to an advertising model last year, after years of struggling to generate revenue. This has left critics wondering why the Chinese firm chose to acquire Edmodo at such a price; some have gone so far as to call the move a data grab.
As data becomes a tool that governments such as Russia and China could use to influence voting systems or induce citizens into espionage, more legislators are turning their attention to the acquisitions of early-stage technology startups.
NetDragon officials, however, say they have no interest in these types of activities. Their main goal in acquiring United States edtech companies lies in building profitability, says Pep So, NetDragon’s Director of Corporate Development.
In 2015, the firm acquired Promethean, an education technology company that creates interactive displays for schools. NetDragon executives say the Edmodo acquisition rounds out their education product portfolio, meaning the company will have tools supporting multiple aspects of learning, including preparation, instructional delivery, homework, assignment grading, communication among parents, students and teachers, and a content marketplace.
NetDragon’s monetization plan for Edmodo focuses on building out content that gets sold via its platform. Much as on tools like TeachersPayTeachers, So hopes to see users putting content on the platform’s marketplace, some free and some for a fee (including some virtual reality content), so that the community can buy, sell and review available educational tools.
As far as data privacy is concerned, So notes that NetDragon is still learning what it can and cannot do. He said the company will comply with the Children’s Online Privacy Protection Act (COPPA), a federal law created to protect the privacy of children online, but that the rules and regulations surrounding the law are confusing for everyone involved.
Historically, Chinese companies have faced trust and branding issues when moving into the United States market, and the reverse is also true for U.S. companies seeking to expand overseas. Companies have also struggled to learn the rules, regulations and operational procedures in place in other countries.
As research by the Norwegian Refugee Council found, 70 percent of Syrian refugees lack basic identification and documents showing ownership of property.
The global passport
Host nations also share in the damage, as they face problems accessing vital information about newcomers. When dealing with an undocumented refugee, an immigration service cannot obtain information about his or her health status, family ties or criminal record, or verify any other vital data that would help it make a decision. Needless to say, this may lead to refugee status being exploited by economic migrants, fugitives or even the war criminals who caused the mass displacement to begin with.
Another important issue is data security. Refugees’ personal identities are carefully re-established with the support of biometric systems set up by the U.N. Refugee Agency (UNHCR). UNHCR registers millions of refugees and maintains those records in a database. But the evidence suggests that centralized systems like this can be prone to attack. As a report on UNHCR’s site notes, Aadhaar, India’s massive biometric database and the largest national database of people in the world, has suffered serious breaches, and last year allegations were made that access was for sale on the internet for as little as $8.
Finland, a country with a population of 5.5 million, cannot boast huge numbers of refugees. For 2018, it set a quota of 750 people, mainly from Syria and the Democratic Republic of Congo. That’s far fewer than neighboring Sweden, which promised to take in 3,400. Nevertheless, the country sets a global example of the effective use of technology in immigration policy: It’s using blockchain to help newcomers get on their feet faster.
The system, developed by the Helsinki-based startup MONI, maintains a full analogue of a bank account for every one of its participants.
Speaking at the World Economic Forum in Davos in January 2018, the billionaire investor and philanthropist George Soros revealed that his organizations already use blockchain technology in their work on migration.
In 2017, Accenture and Microsoft Corp. teamed up to build a digital ID network using blockchain technology, as part of a U.N.-supported project to provide legal identification to 1.1 billion people worldwide with no official documents.
Another effort is a Memorandum of Understanding (MOU) with the blockchain platform IOTA to explore how the technology could increase efficiency.
Google’s sites in London, Madrid, Tel Aviv, Seoul, São Paulo and Warsaw (in a converted former vodka distillery) are hubs for entrepreneurs, providing workspace for startup founders as well as networking and educational events.
Critics point to the recent offer from Sidewalk Labs – a company owned by Alphabet, Google’s parent company – to redevelop Toronto’s waterfront as a reason to be concerned about the company’s interest in potentially extracting data from cities.
They cite Google’s history of tax evasion and mass surveillance as examples of actions that make the company incompatible with the progressive values of the local area.
A major global change in data protection law is about to hit the tech industry, thanks to the EU’s General Data Protection Regulation (GDPR). GDPR affects any company, anywhere in the world, that handles data about European citizens. It takes effect on 25 May 2018, and because that date precedes Brexit, it covers UK citizens too. It’s no surprise the EU has chosen to tighten the data protection belt: Europe has long opposed the tech industry’s expansionist tendencies, particularly through antitrust suits, and is perhaps the only regulatory body with the inclination and power to challenge Silicon Valley in the coming years.
So, no more harvesting data for unplanned analytics, future experimentation, or unspecified research. Teams must have specific uses for specific data.
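In engineering terms, this "purpose limitation" principle can be enforced at the point of collection: data is refused unless it is tied to a specific, pre-registered purpose. The registry, purposes, and field names below are hypothetical examples, not anything prescribed by the GDPR text.

```python
# Sketch of purpose limitation enforced in code: each registered purpose
# declares exactly which fields it justifies collecting. All names here
# are invented for illustration.
REGISTERED_PURPOSES = {
    "order_fulfilment": {"name", "shipping_address"},
    "fraud_detection": {"ip_address", "payment_method"},
}

def collect(field: str, purpose: str) -> str:
    """Refuse any collection not backed by a specific registered purpose."""
    allowed = REGISTERED_PURPOSES.get(purpose)
    if allowed is None:
        raise PermissionError(f"no registered purpose: {purpose!r}")
    if field not in allowed:
        raise PermissionError(f"{field!r} is not needed for {purpose!r}")
    return f"collected {field} for {purpose}"

print(collect("name", "order_fulfilment"))
try:
    # "Unplanned analytics" fails closed: the purpose was never registered.
    collect("browsing_history", "future_research")
except PermissionError as e:
    print("refused:", e)
```

The design choice worth noting is that the check fails closed: an unregistered purpose or an extra field is an error, which mirrors the shift GDPR demands from "collect everything, decide later" to "justify first, collect second."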
Augmented reality can be described as experiencing the real world with an overlay of additional computer-generated content. In contrast, virtual reality immerses a user in an entirely simulated environment, while mixed or merged reality blends real and virtual worlds in ways through which the physical and the digital can interact. AR, VR, and MR offer new opportunities to create a psychological sense of immersive presence in an environment that feels real enough to be viewed, experienced, explored, and manipulated. These technologies have the potential to democratize learning by giving everyone access to immersive experiences that were once restricted to relatively few learners.
In Grinnell College’s Immersive Experiences Lab (http://gciel.sites.grinnell.edu/), teams of faculty, staff, and students collaborate on research projects, then use 3D, VR, and MR technologies as a platform to synthesize and present their findings.
There is relatively little research about the most effective ways to use these technologies as instructional tools. Combined, these factors can be disincentives for institutions to invest in the equipment, facilities, and staffing that can be required to support these systems. AR, VR, and MR technologies also raise concerns about personal privacy and data security. Further, at least some of these tools and applications currently fail to meet accessibility standards. The user experience in some AR, VR, and MR applications can be intensely emotional and even disturbing (my note: but they can also be used to build empathy literacy).
These technologies can immerse users in recreated, remote, or even hypothetical environments as small as a molecule or as large as a universe, allowing learners to experience “reality” from multiple perspectives.