Searching for "privacy"

school privacy data

http://blogs.edweek.org/edweek/DigitalEducation/2020/11/student_data_future_criminals_pasco_privacy.html

Using Student Data to Identify Future Criminals: A Privacy Debacle

Under the federal Family Educational Rights and Privacy Act (FERPA), schools can share student records with a contractor or outside party to whom the school has outsourced certain functions, if that outside party (like a designated school resource officer) meets all three of these conditions:

  1. The outside party is performing a service that would otherwise be performed by school employees.
  2. The outside party’s use of education records is under the direct control of the school.
  3. The outside party does not use the education records for anything other than the reason they were originally shared, and does not share the education record with anyone else unless it secures written consent from the parent of the student.
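The three conditions form a conjunctive test: all of them must hold before records may be shared. As a minimal sketch (the class and attribute names here are hypothetical, not part of any real compliance library):

```python
from dataclasses import dataclass

@dataclass
class OutsideParty:
    # 1. Performs a service that school employees would otherwise perform
    performs_institutional_service: bool
    # 2. Its use of education records is under the direct control of the school
    under_direct_school_control: bool
    # 3. Uses records only for the original purpose and does not re-share
    #    them without written parental consent
    limits_use_to_original_purpose: bool

def ferpa_disclosure_allowed(party: OutsideParty) -> bool:
    """Disclosure is permitted only if all three conditions hold."""
    return (party.performs_institutional_service
            and party.under_direct_school_control
            and party.limits_use_to_original_purpose)
```

Failing any single condition blocks the disclosure; there is no balancing among the three.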

S Korea Facebook privacy

S. Korea fines Facebook 6.7 bln won for sharing users’ info without consent

https://en.yna.co.kr/view/AEN20201125006500320

South Korea’s information watchdog on Wednesday fined Facebook Inc. 6.7 billion won (US$6 million) for passing information of at least 3.3 million South Koreans to other companies in its first crackdown on the U.S. tech giant.

+++++++++++++++
more on Facebook privacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=facebook+privacy

Privacy data ignored by Android iPhone

Under EU law, citizens can demand a copy of all personal data that companies hold about them. However, more than a year after the new law took effect, most Android and iPhone apps still completely ignore this right, a new study has found.

https://dl.acm.org/doi/epdf/10.1145/3407023.3407057

How do App Vendors Respond to Subject Access Requests? A Longitudinal Privacy Study on iOS and Android Apps
The paper reports the results of a four-year undercover field study.

Besides a general lack of responsiveness, the observed problems range from malfunctioning download links and authentication mechanisms over confusing data labels and file structures to impoliteness, incomprehensible language, and even serious cases of carelessness and data leakage. It is evident from our results that there are no well-established and standardized processes for subject access requests in the mobile app industry. Moreover, we found that many vendors lack the motivation to respond adequately. Many of the responses we received were not only completely insufficient, but also deceptive or misleading. Equally worrisome are cases of unsolicited dissolution of personal data, for instance, due to the apparently widespread practice of deleting stale accounts without prior notice.

++++++++++++++

New lawsuit: Why do Android phones mysteriously exchange 260MB a month with Google via cellular data when they’re not even in use?

+++++++++++++++++
more on privacy data in this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy+data

Google policy privacy

Google is giving data to police based on search keywords, court docs show

Court records in an arson case show that Google gave away data on people who searched for a specific address.

https://www.cnet.com/news/google-is-giving-data-to-police-based-on-search-keywords-court-docs-show

A recently unsealed court document shows that investigators can request such data in reverse: rather than seeking information on a known suspect, they ask Google to disclose everyone who searched for a particular keyword.

++++++++++++++++
more on privacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy

students data privacy

https://www.edsurge.com/news/2020-06-26-researchers-raise-concerns-about-algorithmic-bias-in-online-course-tools

++++++++++++++

Students fear for their data privacy after University of California invests in private equity firm

A financial link between a virtual classroom platform and the University of California system is raising eyebrows

https://www.salon.com/2020/07/28/students-fear-for-their-data-privacy-after-university-of-california-invests-in-private-equity-firm/

Instructure has made it clear through their own language that they view the student data they aggregated as one of their chief assets, although they have also insisted that they do not use that data improperly. My note: “improperly” is relative and requires defining.

Yet an article published in the Virginia Journal of Law and Technology, titled “Transparency and the Marketplace for Student Data,” pointed out that there is “an overall lack of transparency in the student information commercial marketplace and an absence of law to protect student information.” As such, some students at the University of California are concerned that — despite reassurances to the contrary — their institution’s new financial relationship with Thoma Bravo will mean their personal data can be sold or otherwise misused.

The students’ concerns over surveillance and privacy are not unwarranted. Previously, the University of California used military surveillance technology to help quell the grad student strikes at UC Santa Cruz and other campuses.

cookies privacy EU

No cookie consent walls — and no, scrolling isn’t consent, says EU data protection body

That’s the unambiguous message from the European Data Protection Board (EDPB), which has published updated guidelines on the rules around online consent to process people’s data.

The regional cookie wall has been crumbling for some time, as we reported last year — when the Dutch DPA clarified its guidance to ban cookie walls.

as the EDPB puts it, “actions such as scrolling or swiping through a webpage or similar user activity will not under any circumstances satisfy the requirement of a clear and affirmative action”
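The EDPB rule reduces to a simple invariant: consent is recorded only on an explicit affirmative action, and passive signals such as scrolling are ignored. A minimal sketch (the class and method names are illustrative, not any real consent-management API):

```python
class ConsentState:
    """Tracks whether a user has given valid consent under the EDPB guidance."""

    def __init__(self) -> None:
        self.granted = False

    def accept_click(self) -> None:
        # Valid: an unambiguous click on an explicit "Accept" control.
        self.granted = True

    def on_scroll(self) -> None:
        # Invalid signal: scrolling or swiping must never grant consent,
        # so this handler deliberately changes nothing.
        pass

consent = ConsentState()
consent.on_scroll()
print(consent.granted)  # prints: False -- scrolling alone grants nothing
```

The design point is that only the affirmative handler writes to the consent flag; every passive event handler is a deliberate no-op.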
++++++++++++++++++++++
more on privacy on this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy

AI and privacy

The Secretive Company That Might End Privacy as We Know It: It’s taken 3 billion images from the internet to build an AI-driven database that allows US law enforcement agencies to identify any stranger.

Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” Some large cities, including San Francisco, have barred police from using facial recognition technology.

But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company, which declined to provide a list.

Facial recognition technology has always been controversial. It makes people nervous about Big Brother. It has a tendency to deliver false matches for certain groups, like people of color. And some facial recognition products used by the police — including Clearview’s — haven’t been vetted by independent experts.

Clearview deployed current and former Republican officials to approach police forces, offering free trials and annual licenses for as little as $2,000. Mr. Schwartz tapped his political connections to help make government officials aware of the tool, according to Mr. Ton-That.

“We have no data to suggest this tool is accurate,” said Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology, who has studied the government’s use of facial recognition. “The larger the database, the larger the risk of misidentification because of the doppelgänger effect. They’re talking about a massive database of random people they’ve found on the internet.”

Law enforcement is using a facial recognition app with huge privacy issues: Clearview AI’s software can find matches in billions of internet images.

Part of the problem stems from a lack of oversight. There has been no real public input into adoption of Clearview’s software, and the company’s ability to safeguard data hasn’t been tested in practice. Clearview itself remained highly secretive until late 2019.

The software also appears to explicitly violate policies at Facebook and elsewhere against collecting users’ images en masse.

While there’s underlying code that could theoretically be used for augmented reality glasses to identify people on the street, Ton-That said there were no plans for such a design.

Banning Facial Recognition Isn’t Enough

In May of last year, San Francisco banned facial recognition; the neighboring city of Oakland soon followed, as did Somerville and Brookline in Massachusetts (a statewide ban may follow). In December, San Diego suspended a facial recognition program before a new statewide law declaring it illegal came into effect. Forty major music festivals pledged not to use the technology, and activists are calling for a nationwide ban. Many Democratic presidential candidates support at least a partial ban on the technology.

Facial recognition bans are the wrong way to fight against modern surveillance. Focusing on one particular identification method misconstrues the nature of the surveillance society we’re in the process of building. Ubiquitous mass surveillance is increasingly the norm. In countries like China, a surveillance infrastructure is being built by the government for social control. In countries like the United States, it’s being built by corporations in order to influence our buying behavior, and is incidentally used by the government.

People can be identified at a distance by their heart beat or by their gait, using a laser-based system. Cameras are so good that they can read fingerprints and iris patterns from meters away. And even without any of these technologies, we can always be identified because our smartphones broadcast unique numbers called MAC addresses.
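A MAC address is a 48-bit hardware identifier: its first three octets name the manufacturer (the OUI), and one bit in the first octet marks locally administered (e.g., software-randomized) addresses. A small sketch of how such an identifier decomposes (the helper below is illustrative, not from any networking library):

```python
def parse_mac(mac: str) -> dict:
    """Split a MAC address into its vendor prefix and device-specific part."""
    octets = [int(part, 16) for part in mac.lower().replace("-", ":").split(":")]
    if len(octets) != 6:
        raise ValueError("a MAC address has exactly six octets")
    return {
        "oui": ":".join(f"{o:02x}" for o in octets[:3]),  # manufacturer prefix
        "nic": ":".join(f"{o:02x}" for o in octets[3:]),  # device-specific part
        # Second-least-significant bit of the first octet set means the
        # address is locally administered (randomized) rather than burned in.
        "locally_administered": bool(octets[0] & 0x02),
    }

print(parse_mac("3C-22-FB-12-34-56")["oui"])  # prints: 3c:22:fb
```

Because the burned-in address is globally unique and broadcast in the clear, it can serve as a persistent tracking identifier unless the device randomizes it.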

China, for example, uses multiple identification technologies to support its surveillance state.

There is a huge — and almost entirely unregulated — data broker industry in the United States that trades on our information.

This is why many companies buy license plate data from states. It’s also why companies like Google are buying health records, and part of the reason Google bought the company Fitbit, along with all of its data.

The data broker industry is almost entirely unregulated; there’s only one law — passed in Vermont in 2018 — that requires data brokers to register and explain in broad terms what kind of data they collect.

+++++++++++++
on social credit system in this IMS blog
https://blog.stcloudstate.edu/ims?s=social+credit

TikTok privacy analysis


Privacy Analysis of Tiktok’s App and Website

https://www.sueddeutsche.de/digital/tiktok-ueberwachung-daten-kritik-1.4709779

++++++++++++
more on TikTok in this IMS blog
https://blog.stcloudstate.edu/ims?s=tik+tok
