Searching for "surveillance"

Surveillance Age and Librarians

Privacy in the Surveillance Age: How Librarians Can Fight Back.
Wednesday, December 9, 2015
2pm Eastern (11am Pacific | 12pm Mountain | 1pm Central)
Register: https://goo.gl/6Qelrm
Description:
In the wake of Edward Snowden’s revelations about NSA and FBI dragnet surveillance, many Americans are concerned that their rights to privacy and intellectual freedom are under threat. But librarians are perfectly positioned to help our communities develop strategies to protect themselves against unwanted surveillance. In this webinar, Alison Macrina and April Glaser of the Library Freedom Project will talk about the landscape of surveillance, the work of the LFP, and some tips and tools librarians can use to resist pervasive surveillance in the digital age.
About the Presenters:
 
Alison Macrina is a librarian, privacy rights activist, and the founder and director of the Library Freedom Project, an initiative which aims to make real the promise of intellectual freedom in libraries by teaching librarians and their local communities about surveillance threats, privacy rights and law, and privacy-protecting technology tools to help safeguard digital freedoms. Alison is passionate about connecting surveillance issues to larger global struggles for justice, demystifying privacy and security technologies for ordinary users, and resisting an internet controlled by a handful of intelligence agencies and giant multinational corporations. When she’s not doing any of that, she’s reading.
April Glaser is a writer and an activist with the Library Freedom Project. She currently works as a mobilization specialist at Greenpeace USA, where she focuses on ending oil extraction in the Arctic. Prior to Greenpeace, April was at the Electronic Frontier Foundation, organizing around the net neutrality campaign and EFF’s grassroots programming. April also previously worked with the Prometheus Radio Project, where her efforts helped propel the passage of the Local Community Radio Act, the largest expansion of community radio in U.S. history. She lives in Oakland, California and continues to work with local organizations on a range of digital rights issues.
Can’t make it to the live show? That’s okay. The session will be recorded and available on the Carterette Series Webinars site for later viewing.
——————————————————-
To register for the online event
——————————————————-
1. Go to registration page: https://goo.gl/6Qelrm
2. Complete and submit the form.
3. A URL for the event will be emailed to you immediately after registration.
~~~
Contact a member of the Carterette Series planning team with questions or suggestions:
carteretteserieswebinars@gmail.com
More on privacy in this IMS blog:
http://blog.stcloudstate.edu/ims/?s=privacy&submit=Search
http://blog.stcloudstate.edu/ims/2013/10/23/pro-domo-sua-are-we-puppets-in-a-wired-world-surveillance-and-privacy-revisited/

AT&T allows NSA surveillance

New Documents and Reports Confirm AT&T and NSA’s Longstanding Surveillance Partnership

https://www.reddit.com/r/technology/comments/3h64l2/new_documents_and_reports_confirm_att_and_nsas/

Please consider previous IMS blog entries on this topic:

http://blog.stcloudstate.edu/ims/2014/09/25/online-privacy-its-time-for-a-new-security-paradigm/

http://blog.stcloudstate.edu/ims/2014/07/01/privacy-and-surveillance-obama-advisor-john-podesta-every-country-has-a-history-of-going-over-the-line/

Pro Domo Sua: Are We Puppets in a Wired World? Surveillance and privacy revisited…

http://www.nybooks.com/articles/archives/2013/nov/07/are-we-puppets-wired-world/

Are We Puppets in a Wired World?

But while we were having fun, we happily and willingly helped to create the greatest surveillance system ever imagined, a web whose strings give governments and businesses countless threads to pull, which makes us…puppets. The free flow of information over the Internet (except in places where that flow is blocked), which serves us well, may serve others better. Whether this distinction turns out to matter may be the one piece of information the Internet cannot deliver.

 

students data privacy

https://www.edsurge.com/news/2020-06-26-researchers-raise-concerns-about-algorithmic-bias-in-online-course-tools

++++++++++++++

Students fear for their data privacy after University of California invests in private equity firm

A financial link between a virtual classroom platform and the University of California system is raising eyebrows

https://www.salon.com/2020/07/28/students-fear-for-their-data-privacy-after-university-of-california-invests-in-private-equity-firm/

Instructure has made it clear through their own language that they view the student data they aggregated as one of their chief assets, although they have also insisted that they do not use that data improperly. My note: “improperly” is relative and requires defining.

Yet an article published in the Virginia Journal of Law and Technology, titled “Transparency and the Marketplace for Student Data,” pointed out that there is “an overall lack of transparency in the student information commercial marketplace and an absence of law to protect student information.” As such, some students at the University of California are concerned that — despite reassurances to the contrary — their institution’s new financial relationship with Thoma Bravo will mean their personal data can be sold or otherwise misused.

The students’ concerns over surveillance and privacy are not unwarranted. Previously, the University of California used military surveillance technology to help quell the grad student strikes at UC Santa Cruz and other campuses.

Encrypted Data Act

New anti-encryption bill worse than EARN IT. Act now to stop both. from r/technology

https://tutanota.com/blog/posts/lawful-access-encrypted-data-act-backdoor/

Once surveillance tools such as an encryption backdoor for the “good guys” are available, it’s only a matter of time until the “good guys” turn bad or abuse their power.

By stressing that tech companies must decrypt sensitive information only after a court issues a warrant, the three Senators believe they can swing public opinion in favor of this encryption backdoor law.

web browsing history

https://www.facebook.com/aboutness/posts/10218894782575532

Senate Votes to Allow FBI to Look at Your Web Browsing History Without a Warrant

https://www.vice.com/en_us/article/jgxxvk/senate-votes-to-allow-fbi-to-look-at-your-web-browsing-history-without-a-warrant

The US Senate has voted to give law enforcement agencies access to web browsing data without a warrant, dramatically expanding the government’s surveillance powers in the midst of the COVID-19 pandemic.

The power grab was led by Senate majority leader Mitch McConnell as part of a reauthorization of the Patriot Act, which gives federal agencies broad domestic surveillance powers.

“Today the Senate made clear that the purpose of the PATRIOT Act is to spy on Americans, no warrants or due process necessary,” Dayton Young, director of product at Fight for the Future, told Motherboard.

https://twitter.com/search?q=fbi%20browsing%20history&src=typed_query

https://www.reddit.com/search/?q=fbi%20browsing%20history

https://www.facebook.com/search/top/?q=fbi%20browsing%20history&epa=SEARCH_BOX

 

Algorithmic Test Proctoring

Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education

SHEA SWAUGER ED-TECH

https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/

While in-person test proctoring has been used to combat test-based cheating, this can be difficult to translate to online courses. Ed-tech companies have sought to address this concern by offering to watch students take online tests, in real time, through their webcams.

Some of the more prominent companies offering these services include Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity.

Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications. 

While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers do this), we see it deployed through face detection and facial recognition used by algorithmic proctoring systems.

While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies; these technologies generally function similarly and have shown a consistent inability to identify people with darker skin or even to tell Chinese people apart. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.

As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that a software asking students to verify their identity is compromising for students who identify as trans, non-binary, or express their gender in ways counter to cis/heteronormativity.

These features and settings create a system of asymmetric surveillance and lack of accountability, things which have always created a risk for abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.

Their promotional messaging functions similarly to dog-whistle politics, which is commonly used in anti-immigration rhetoric. It’s also not a coincidence that these technologies are being used to exclude people not wanted by an institution; biometrics and facial recognition have been connected to anti-immigration policies, supported by both Republican and Democratic administrations, going back to the 1990s.

Borrowing from Henry A. Giroux, Kevin Seeber describes the pedagogy of punishment and some of its consequences in regards to higher education’s approach to plagiarism in his book chapter “The Failed Pedagogy of Punishment: Moving Discussions of Plagiarism beyond Detection and Discipline.”

My note: I have been repeating this for years.
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.

Technological Solutionism

Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”

Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people, and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.

This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.

Higher education is deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research by publishing copious literature on it, establishing endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.

+++++++++++++++++
more on privacy in this IMS blog
http://blog.stcloudstate.edu/ims?s=privacy

Facial recognition technology breaches GDPR says Vestager

EU tech chief Margrethe Vestager said on Thursday that facial recognition technologies breach the requirement for consent, which is stipulated in Europe’s data protection rules (GDPR).

“China might have data and the US might have money, but Europe has purpose,” the Commission’s VP for a Europe Fit for the Digital Age said.

The use of facial recognition technology remains highly controversial due to fears of China-style surveillance regimes and human rights violations. EC President Ursula von der Leyen has pledged to distance Europe from these practices and to announce new ethical, human-centred AI rules in the first 100 days of her mandate.

++++++++++++++
more on facial recognition in this IMS blog
http://blog.stcloudstate.edu/ims?s=facial+recognition

AI and privacy

The Secretive Company That Might End Privacy as We Know It: It’s taken 3 billion images from the internet to build an AI-driven database that allows US law enforcement agencies to identify any stranger. from r/Futurology

Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” Some large cities, including San Francisco, have barred police from using facial recognition technology.

But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company, which declined to provide a list.

Facial recognition technology has always been controversial. It makes people nervous about Big Brother. It has a tendency to deliver false matches for certain groups, like people of color. And some facial recognition products used by the police — including Clearview’s — haven’t been vetted by independent experts.

Clearview deployed current and former Republican officials to approach police forces, offering free trials and annual licenses for as little as $2,000. Mr. Schwartz tapped his political connections to help make government officials aware of the tool, according to Mr. Ton-That.

“We have no data to suggest this tool is accurate,” said Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology, who has studied the government’s use of facial recognition. “The larger the database, the larger the risk of misidentification because of the doppelgänger effect. They’re talking about a massive database of random people they’ve found on the internet.”

Law enforcement is using a facial recognition app with huge privacy issues: Clearview AI’s software can find matches in billions of internet images. from r/technology

Part of the problem stems from a lack of oversight. There has been no real public input into adoption of Clearview’s software, and the company’s ability to safeguard data hasn’t been tested in practice. Clearview itself remained highly secretive until late 2019.

The software also appears to explicitly violate policies at Facebook and elsewhere against collecting users’ images en masse.

While there’s underlying code that could theoretically be used for augmented reality glasses that could identify people on the street, Ton-That said there were no plans for such a design.

Banning Facial Recognition Isn’t Enough from r/technology

In May of last year, San Francisco banned facial recognition; the neighboring city of Oakland soon followed, as did Somerville and Brookline in Massachusetts (a statewide ban may follow). In December, San Diego suspended a facial recognition program in advance of a new statewide law, which declared it illegal, coming into effect. Forty major music festivals pledged not to use the technology, and activists are calling for a nationwide ban. Many Democratic presidential candidates support at least a partial ban on the technology.

Facial recognition bans are the wrong way to fight against modern surveillance. Focusing on one particular identification method misconstrues the nature of the surveillance society we’re in the process of building. Ubiquitous mass surveillance is increasingly the norm. In countries like China, a surveillance infrastructure is being built by the government for social control. In countries like the United States, it’s being built by corporations in order to influence our buying behavior, and is incidentally used by the government.

People can be identified at a distance by their heart beat or by their gait, using a laser-based system. Cameras are so good that they can read fingerprints and iris patterns from meters away. And even without any of these technologies, we can always be identified because our smartphones broadcast unique numbers called MAC addresses.

China, for example, uses multiple identification technologies to support its surveillance state.

There is a huge — and almost entirely unregulated — data broker industry in the United States that trades on our information.

This is why many companies buy license plate data from states. It’s also why companies like Google are buying health records, and part of the reason Google bought the company Fitbit, along with all of its data.

The data broker industry is almost entirely unregulated; there’s only one law — passed in Vermont in 2018 — that requires data brokers to register and explain in broad terms what kind of data they collect.


+++++++++++++
on social credit system in this IMS blog
http://blog.stcloudstate.edu/ims?s=social+credit
