What Happens to Student Data Privacy When Chinese Firms Acquire U.S. Edtech Companies?
Between the creation of a social credit system and street cameras with facial recognition capabilities, technology reports coming out of China have raised serious concerns for privacy advocates. These concerns are only heightened as Chinese investors turn their attention to the U.S. education technology space, acquiring companies with millions of public school users.
A particularly notable deal this year centers on Edmodo, a cross between a social networking platform and a learning management system for schools that boasts upwards of 90 million users. NetDragon, a Chinese gaming company that is building a significant education division, bought Edmodo for a combination of cash and equity valued at $137.5 million earlier this month.
Edmodo began shifting to an advertising model last year, after years of struggling to generate revenue. This has left critics wondering why the Chinese firm chose to acquire Edmodo at such a price; some have gone so far as to call the move a data grab.
As data becomes a tool that governments such as Russia's and China's could use to influence voting systems or recruit citizens for espionage, more legislators are turning their attention to acquisitions of early-stage technology startups.
NetDragon officials, however, say they have no interest in these types of activities. Their main goal in acquiring United States edtech companies lies in building profitability, says Pep So, NetDragon’s Director of Corporate Development.
In 2015, the firm acquired Promethean, a company that creates interactive displays for schools. NetDragon executives say that the Edmodo acquisition rounds out their education product portfolio, meaning the company will have tools supporting multiple aspects of learning, including lesson preparation, instructional delivery, homework, assignment grading, communication among parents, students and teachers, and a content marketplace.
NetDragon’s monetization plan for Edmodo focuses on building out content that gets sold via its platform. So hopes to see users posting content to the platform’s marketplace, much as they do on tools like TeachersPayTeachers, with some offered free and some for a fee (including virtual reality content), so that the community can buy, sell and review available educational tools.
As far as data privacy is concerned, So notes that NetDragon is still learning what it can and cannot do. He says the company will comply with the Children’s Online Privacy Protection Act (COPPA), a federal law created to protect children's privacy online, but adds that the rules and regulations surrounding the law are confusing for all actors involved.
Historically, Chinese companies have faced trust and branding issues when moving into the United States market, and the reverse is also true for U.S. companies seeking to expand overseas. Companies have also struggled to learn the rules, regulations and operational procedures in place in other countries.
more on data privacy in this IMS blog:
How Data Privacy Lessons in Alternative Reality Games Can Help Kids In Real Life
Ubiquitous social media platforms—including Facebook, Twitter and Instagram—have created a venue for people to share and connect with others. We use these services by clicking “I Agree” on Terms of Service screens, trading off some of our private and personal data for seemingly free services. While these services say data collection helps create a better user experience, that data is also potentially exploitable.
The news about how third parties obtain and use Facebook users’ data to wage political campaigns, along with the mounting evidence of election interference, has shined a spotlight on just how insecure our data can be when we share it online. Educating youth about data security falls under the larger umbrella of digital citizenship, which covers social media uses and misuses and learning how not to embarrass or endanger oneself while using the internet.
Darvasi’s students in Toronto can pool together 55 faux bitcoins to purchase and launch the BOTTING protocol against an opponent. The targeted student at Fallon’s school in Connecticut would then have 48 hours to record audio of 10 words of Darvasi’s students’ choosing and send it back to them through an intermediary (Darvasi or Fallon). For a higher price of 65 faux bitcoins, students can launch MORPHLING, which gives the opponent 48 hours to record a one-minute video explaining three ways to stay safe while using Facebook, while making their school mascot (or a close approximation of it) appear in the video in some way during the entire minute.
more on digital citizenship in this IMS blog
How To Keep Google From Collecting Your Data
Megan E. Holstein Dec 21, 2018 https://medium.com/swlh/how-to-keep-google-from-collecting-your-data-5fd97a6bf929
There are two places where you can turn off how Google tracks you: Activity Controls and Ad Settings.
Along with turning off Ad Personalization, you should turn off Shared Endorsements.
Part 2: Deleting The Data They Already Have
Part 3: Don’t Use Your Google Account for More Than You Have To
- Don’t use Sign In With Google
- Don’t use your Gmail, even for junk email
more on privacy in this IMS blog
Teachers Turn to Gaming for Online Privacy Lessons
By Dian Schaffhauser 10/10/18
Blind Protocol is an alternate reality game created by two high school English teachers to help students understand online privacy and data security. This form of gaming blends fact and fiction to immerse players in an interactive world that responds to their decisions and actions. In a recent article on KQED, Paul Darvasi and John Fallon described how they chose the gaming format to help their students gain a deeper look at how vulnerable their personal data is.
Darvasi, who blogs at “Ludic Learning,” and Fallon, who writes at “TheAlternativeClassroom,” are both immersed in the education gaming realm.
more on online privacy and data security
No cookie consent walls — and no, scrolling isn’t consent, says EU data protection body
That's the unambiguous message from the European Data Protection Board (EDPB), which has published updated guidelines on the rules around online consent to process people’s data.
The regional cookie wall has been crumbling for some time, as we reported last year — when the Dutch DPA clarified its guidance to ban cookie walls.
As the EDPB puts it, “actions such as scrolling or swiping through a webpage or similar user activity will not under any circumstances satisfy the requirement of a clear and affirmative action.”
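The EDPB's rule translates directly into site logic: only a deliberate opt-in counts, and passive behavior never does. A minimal sketch of that check, with hypothetical action labels and a function name invented for this example:

```python
# Illustrative sketch of the EDPB consent rule quoted above: only a clear,
# affirmative act can establish consent; scrolling, swiping or simply
# continuing to browse never can. The action labels below are assumptions
# for this example, not from any real consent-management library.

AFFIRMATIVE_ACTIONS = {"clicked_accept", "toggled_consent_on"}
PASSIVE_ACTIONS = {"scrolled", "swiped", "continued_browsing", "closed_banner"}

def is_valid_consent(action: str) -> bool:
    """Return True only for an unambiguous, affirmative opt-in."""
    return action in AFFIRMATIVE_ACTIONS

print(is_valid_consent("clicked_accept"))  # True
print(is_valid_consent("scrolled"))        # False
```

Note that even dismissing the banner ("closed_banner") fails the test: the absence of a refusal is not the presence of consent.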
more on privacy in this IMS blog
The Secretive Company That Might End Privacy as We Know It: It’s taken 3 billion images from the internet to build an AI-driven database that allows U.S. law enforcement agencies to identify any stranger.
Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” Some large cities, including San Francisco, have barred police from using facial recognition technology.
But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company, which declined to provide a list.
Facial recognition technology has always been controversial. It makes people nervous about Big Brother. It has a tendency to deliver false matches for certain groups, like people of color. And some facial recognition products used by the police — including Clearview’s — haven’t been vetted by independent experts.
Clearview deployed current and former Republican officials to approach police forces, offering free trials and annual licenses for as little as $2,000. Mr. Schwartz tapped his political connections to help make government officials aware of the tool, according to Mr. Ton-That.
“We have no data to suggest this tool is accurate,” said Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology, who has studied the government’s use of facial recognition. “The larger the database, the larger the risk of misidentification because of the doppelgänger effect. They’re talking about a massive database of random people they’ve found on the internet.”
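Garvie's "doppelgänger effect" has a simple back-of-the-envelope form: even a tiny per-comparison false-match rate p compounds across a database of N faces, since the chance of at least one false match is 1 − (1 − p)^N. The rate used below is an illustrative assumption, not a measured figure for Clearview:

```python
# Sketch of the "doppelganger effect": the probability of at least one
# false match grows with database size N even when the per-comparison
# false-match rate p is tiny.
#   P(at least one false match) = 1 - (1 - p)^N

def false_match_probability(p: float, n: int) -> float:
    """Chance of at least one false match in n independent comparisons."""
    return 1 - (1 - p) ** n

p = 1e-6  # assumed per-comparison false-match rate (illustrative only)
for n in (10_000, 1_000_000, 100_000_000):
    print(f"N = {n:>11,}: P(false match) = {false_match_probability(p, n):.4f}")
```

At a database of 100 million faces, even this optimistic error rate makes a false match near certain, which is exactly why "a massive database of random people" raises the stakes.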
Law enforcement is using a facial recognition app with huge privacy issues: Clearview AI’s software can find matches in billions of internet images.
Part of the problem stems from a lack of oversight. There has been no real public input into adoption of Clearview’s software, and the company’s ability to safeguard data hasn’t been tested in practice. Clearview itself remained highly secretive until late 2019.
The software also appears to explicitly violate policies at Facebook and elsewhere against collecting users’ images en masse.
While there’s underlying code that could theoretically be used for augmented reality glasses that identify people on the street, Ton-That said there were no plans for such a design.
Banning Facial Recognition Isn’t Enough
In May of last year, San Francisco banned facial recognition; the neighboring city of Oakland soon followed, as did Somerville and Brookline in Massachusetts (a statewide ban may follow). In December, San Diego suspended a facial recognition program before a new statewide law declaring it illegal came into effect. Forty major music festivals pledged not to use the technology, and activists are calling for a nationwide ban. Many Democratic presidential candidates support at least a partial ban on the technology.
facial recognition bans are the wrong way to fight against modern surveillance. Focusing on one particular identification method misconstrues the nature of the surveillance society we’re in the process of building. Ubiquitous mass surveillance is increasingly the norm. In countries like China, a surveillance infrastructure is being built by the government for social control. In countries like the United States, it’s being built by corporations in order to influence our buying behavior, and is incidentally used by the government.
People can be identified at a distance by their heart beat or by their gait, using a laser-based system. Cameras are so good that they can read fingerprints and iris patterns from meters away. And even without any of these technologies, we can always be identified because our smartphones broadcast unique numbers called MAC addresses.
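The MAC-address risk above is concrete enough to check in code. It is also why modern phones randomize their Wi-Fi MAC while scanning: a randomized address sets the "locally administered" bit (0x02 in the first octet), whereas a burned-in hardware address leaves it clear. A minimal sketch of that distinction (the sample addresses are made up for illustration):

```python
# A MAC address whose first octet has the 0x02 ("locally administered") bit
# set was assigned in software, e.g. by a phone randomizing its address to
# resist the kind of tracking described above. A clear bit means a
# manufacturer-assigned, globally unique hardware address.

def is_locally_administered(mac: str) -> bool:
    """True if the MAC's locally-administered bit is set (likely randomized)."""
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)

print(is_locally_administered("3c:22:fb:12:34:56"))  # hardware-style address -> False
print(is_locally_administered("da:a1:19:12:34:56"))  # randomized-style address -> True
```

Randomization helps against passive trackers, but once a phone joins a network or logs into an account, the other identifiers in this paragraph still apply.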
China, for example, uses multiple identification technologies to support its surveillance state.
There is a huge — and almost entirely unregulated — data broker industry in the United States that trades on our information.
This is why many companies buy license plate data from states. It’s also why companies like Google are buying health records, and part of the reason Google bought the company Fitbit, along with all of its data.
The data broker industry is almost entirely unregulated; there’s only one law — passed in Vermont in 2018 — that requires data brokers to register and explain in broad terms what kind of data they collect.
more on the social credit system in this IMS blog
How to Turn Bad Data Into Good Data
Date: Wednesday, January 22, 2020 Time: 1:00 pm CT
Join a panel of data and education experts to discuss how to make the most of your education data. In this webinar you’ll learn about:
- How rapid data turnover can hurt you (and your bottom line)
- How to access “good” data and what it looks like
- Opportunities open to you when your data is clean
- Avoiding the pitfalls of using outdated or irrelevant data and making decisions that are not data-informed
- Navigating the unique challenges of working in education, such as privacy regulations that might hinder communication
more on big data in this IMS blog
The American Library Association said in a statement Monday that the planned changes to Lynda.com, which are slated to happen by the end of September 2019, “would significantly impair library users’ privacy rights.” That same day, the California State Library recommended that its users discontinue Lynda.com when it fully merges with LinkedIn Learning if it institutes the changes.
The library groups argue that by requiring users to create LinkedIn accounts to watch Lynda videos, the company is going from following best practices about privacy and identity protection to potentially asking libraries to violate a range of ethics codes they have pledged to uphold. The ALA’s Library Bill of Rights, for instance, states that: “All people, regardless of origin, age, background, or views, possess a right to privacy and confidentiality in their library use. Libraries should advocate for, educate about, and protect people’s privacy, safeguarding all library use data, including personally identifiable information.”
The change will not affect most college and university libraries or corporate users of Lynda.com services, which will not be required to force users to set up a LinkedIn profile. LinkedIn officials say that’s because colleges and corporations have more robust ways to identify users than public libraries do.
LinkedIn acquired Lynda.com in 2015 for $1.5 billion. The following June, Microsoft bought LinkedIn for $26.2 billion, the company’s largest-ever acquisition.
more on privacy in this IMS blog