Between the creation of a social rating system and street cameras with facial recognition capabilities, technology reports coming out of China have raised serious concerns for privacy advocates. These concerns are only heightened as Chinese investors turn their attention to the United States education technology space, acquiring companies with millions of public school users.
A particularly notable deal this year centers on Edmodo, a cross between a social networking platform and a learning management system for schools that boasts upwards of 90 million users. NetDragon, a Chinese gaming company that is building a significant education division, bought Edmodo for a combination of cash and equity valued at $137.5 million earlier this month.
Edmodo began shifting to an advertising model last year, after years of struggling to generate revenue. This has left critics wondering why the Chinese firm chose to acquire Edmodo at such a price; some have gone so far as to call the move a data grab.
As data becomes a tool that governments such as Russia’s and China’s could use to influence voting systems or induce citizens into espionage, more legislators are turning their attention to the acquisitions of early-stage technology startups.
NetDragon officials, however, say they have no interest in these types of activities. Their main goal in acquiring United States edtech companies lies in building profitability, says Pep So, NetDragon’s Director of Corporate Development.
In 2015, the firm acquired Promethean, an education technology company that creates interactive displays for schools. NetDragon executives say the Edmodo acquisition rounds out their education product portfolio—meaning the company will have tools supporting multiple aspects of learning, including preparation, instructional delivery, homework, assignment grading, communication among parents, students and teachers, and a content marketplace.
NetDragon’s monetization plan for Edmodo focuses on building out content that gets sold via its platform. So hopes to see users post content on the platform’s marketplace, similar to tools like TeachersPayTeachers—some free and some for a fee (including some virtual reality content)—so that the community can buy, sell and review available educational tools.
As far as data privacy is concerned, So notes that NetDragon is still learning what it can and cannot do. He noted that the company will comply with the Children’s Online Privacy Protection Act (COPPA), a federal regulation created to protect the privacy of children online, but says that the rules and regulations surrounding the law are confusing for all actors involved.
Historically, Chinese companies have faced trust and branding issues when moving into the United States market, and the reverse is also true for U.S. companies seeking to expand overseas. Companies have also struggled to learn the rules, regulations and operational procedures in place in other countries.
Ubiquitous social media platforms—including Facebook, Twitter and Instagram—have created a venue for people to share and connect with others. We use these services by clicking “I Agree” on Terms of Service screens, trading off some of our private and personal data for seemingly free services. While these services say data collection helps create a better user experience, that data is also potentially exploitable.
The news about how third parties obtain and use Facebook users’ data to wage political campaigns, along with the mounting evidence of election interference, has shined a spotlight on just how secure our data is when we share it online. Educating youth about data security falls under the larger umbrella of digital citizenship, which covers topics such as social media uses and misuses and learning how not to embarrass or endanger oneself while using the internet.
Blind Protocol is an alternate reality game created by two high school English teachers to help students understand online privacy and data security. This form of gaming blends fact and fiction to immerse players in an interactive world that responds to their decisions and actions. In a recent article on KQED, Paul Darvasi and John Fallon described how they chose the gaming format to help their students gain a deeper look at how vulnerable their personal data is.
Darvasi’s students in Toronto can pool together 55 faux bitcoins to purchase and launch the BOTTING protocol against an opponent. The targeted student at Fallon’s school in Connecticut would then have 48 hours to record audio of 10 words of Darvasi’s students’ choosing and send it back to them through an intermediary (Darvasi or Fallon). For a higher price of 65 faux bitcoins, students can launch MORPHLING, which gives the opponent 48 hours to record a one-minute video explaining three ways to stay safe while using Facebook, while making their school mascot (or a close approximation of it) appear in the video during the entire minute.
The infamous former Bush administration lawyer John Yoo wrote in his 2006 book that Newstead was the “day-to-day manager of the Patriot Act in Congress.”
The Patriot Act was passed in the wake of the 9/11 attacks and created a series of new federal crimes related to terrorism. The legislation was broad, and much of the government’s expanded surveillance powers stemmed from parts of the act. It included, among other things, the controversial Section 215, which was used to justify the National Security Agency’s phone records collection program.
It also had a “roving wiretap” provision, which allowed the government to place a tap on all of an individual’s personal devices based purely on the approval of the notoriously permissive Foreign Intelligence Surveillance Court.
As The Verge points out, the Patriot Act also initiated the practice of “national security letters,” a procedure by which intelligence agencies can informally request data without any kind of court or ex parte authorization, citing threats to national security. Facebook fields thousands of these requests every year, the content of which is generally subject to gag orders and therefore remains publicly unknown. In her capacity as general counsel, Newstead will be able to approve or deny these requests.
Because of technological advances and the sheer amount of data now available about billions of other people, discretion no longer suffices to protect your privacy. Computer algorithms and network analyses can now infer, with a sufficiently high degree of accuracy, a wide range of things about you that you may have never disclosed, including your moods, your political beliefs, your sexual orientation and your health.
There is no longer such a thing as individually “opting out” of our privacy-compromised world.
In 2017, the newspaper The Australian published an article, based on a leaked document from Facebook, revealing that the company had told advertisers that it could predict when younger users, including teenagers, were feeling “insecure,” “worthless” or otherwise in need of a “confidence boost.” Facebook was apparently able to draw these inferences by monitoring photos, posts and other social media data.
In 2017, academic researchers, armed with data from more than 40,000 Instagram photos, used machine-learning tools to accurately identify signs of depression in a group of 166 Instagram users. Their computer models turned out to be better predictors of depression than humans who were asked to rate whether photos were happy or sad and so forth.
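The study’s general approach can be illustrated with a toy sketch: compute simple color statistics for each photo, then fit a classifier to a binary label. Everything below is synthetic and invented for illustration; the feature choices only loosely echo the published finding that darker, less saturated photos were associated with depression, and the data, model and thresholds are not those of the actual researchers.

```python
import math
import random

random.seed(0)

def make_photo_features(depressed):
    # Synthetic per-photo color averages (hue, saturation, brightness).
    # Assumption for illustration: "depressed" photos skew bluer, duller, darker.
    if depressed:
        return [random.gauss(0.60, 0.1),   # bluer hue
                random.gauss(0.30, 0.1),   # lower saturation
                random.gauss(0.35, 0.1)]   # darker
    return [random.gauss(0.45, 0.1),
            random.gauss(0.55, 0.1),
            random.gauss(0.60, 0.1)]

# 400 labeled "photos", half from each group.
data = [(make_photo_features(label), label) for label in [0, 1] * 200]

def predict(w, b, x):
    # Logistic (sigmoid) model: probability the photo belongs to class 1.
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# Minimal logistic regression trained by full-batch gradient descent.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(2000):
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in data:
        err = predict(w, b, x) - y
        for i in range(3):
            gw[i] += err * x[i]
        gb += err
    for i in range(3):
        w[i] -= lr * gw[i] / len(data)
    b -= lr * gb / len(data)

correct = sum((predict(w, b, x) > 0.5) == y for x, y in data)
accuracy = correct / len(data)
```

With even these crude features the model separates the two synthetic groups well above chance, which is the core of the privacy concern: signals people never meant to disclose are statistically recoverable from innocuous-looking data.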
Computational inference can also be a tool of social control. The Chinese government, having gathered biometric data on its citizens, is trying to use big data and artificial intelligence to single out “threats” to Communist rule, including the country’s Uighurs, a mostly Muslim ethnic group.
Zeynep Tufekci and Seth Stephens-Davidowitz: “Privacy is over”
The increasing availability of these kinds of tools raises concerns and questions for Doug Levin, founder of EdTech Strategies. Facial-recognition police tools have been decried as “staggeringly inaccurate.” School web filters can also impact low-income families inequitably, he adds, especially those that use school-issued devices at home.
Social-Emotional Learning: The New Surveillance?
Using data to profile students—even in attempts to reinforce positive behaviors—has Cerda concerned, especially in schools serving diverse demographics.
As in the insurance industry, much of the impetus (and sales pitches) in the school and online safety market can be driven by fear. But voicing such concerns and red flags can also steer the stakeholders toward dialogue and collaboration.
Many college libraries are counting how many times students use electronic library resources or visit in person, and comparing that to how well the students do in their classes and how likely they are to stay in school and earn a degree. And many library leaders are finding a strong correlation: students who consume more library materials tend to be more successful academically.
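The kind of analysis described—correlating per-student library activity with an outcome such as GPA—can be sketched in a few lines. The dataset, effect size and variable names below are entirely synthetic assumptions for illustration, not real institutional data.

```python
import math
import random

random.seed(1)

# Synthetic records: library visits per student, and a GPA that (by assumption)
# loosely tracks library use plus random noise, clipped to the 0.0-4.0 scale.
visits = [random.randint(0, 50) for _ in range(500)]
gpa = [min(4.0, max(0.0, 1.8 + 0.03 * v + random.gauss(0, 0.4)))
       for v in visits]

def pearson(xs, ys):
    # Pearson correlation coefficient: covariance normalized by both
    # standard deviations, ranging from -1 to 1.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(visits, gpa)
```

A coefficient near 1 indicates the strong positive relationship library leaders report; the usual caveat applies that correlation alone cannot show library use causes better grades.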
One university has been carefully tracking how library use compares to other metrics, and it has made changes as a result—like moving the tutoring center and the writing lab into the library. Those moves were designed not only to lure more people into the stacks, but to make seeking help more socially acceptable for students who might have been hesitant.
Another effort involves a partnership between the library, which knows what electronic materials students use, and the technology office, which manages other campus data such as usage of the course-management system. The university is doing a study to see whether library usage there also equates to student success.