New Documents and Reports Confirm AT&T and NSA’s Longstanding Surveillance Partnership
Please consider previous IMS blog entries on this topic:
Obama Adviser John Podesta: ‘Every Country Has a History of Going Over the Line’
Instead of a no-spy deal, the US has begun a Cyber Dialogue with Germany. In a SPIEGEL interview, John Podesta, a special adviser to President Barack Obama, speaks of the balance between alliances and security and says that changes are being made to NSA espionage practices.
Please consider the following additional resources on the topic:
Power, Privacy, and the Internet
- Governments, Corporations and Hackers: The Internet and Threats to the Privacy and Dignity of the Citizen
- The Internet and the Future of the Press
- The Internet, Repression and Dissent
Merkel calls for separate EU internet
The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)
Are We Puppets in a Wired World?
To Save Everything, Click Here: The Folly of Technological Solutionism
Hacking the Future: Privacy, Identity and Anonymity on the Web
From Gutenberg to Zuckerberg: What You Really Need to Know About the Internet
Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die
Big Data: A Revolution That Will Transform How We Live, Work, and Think
Status Update: Celebrity, Publicity, and Branding in the Social Media Age
Privacy and Big Data: The Players, Regulators and Stakeholders
Students fear for their data privacy after University of California invests in private equity firm
A financial link between a virtual classroom platform and the University of California system is raising eyebrows
Instructure has made it clear through its own language that it views the student data it aggregates as one of its chief assets, although it has also insisted that it does not use that data improperly. My note: “improperly” is relative and requires defining.
Yet an article published in the Virginia Journal of Law and Technology, titled “Transparency and the Marketplace for Student Data,” pointed out that there is “an overall lack of transparency in the student information commercial marketplace and an absence of law to protect student information.” As such, some students at the University of California are concerned that — despite reassurances to the contrary — their institution’s new financial relationship with Thoma Bravo will mean their personal data can be sold or otherwise misused.
The students’ concerns over surveillance and privacy are not unwarranted. Previously, the University of California used military surveillance technology to help quell the grad student strikes at UC Santa Cruz and other campuses.
New anti-encryption bill worse than EARN IT. Act now to stop both. from r/technology
Once a surveillance measure such as an encryption backdoor for the “good guys” is in place, it’s only a matter of time until the “good guys” turn bad or abuse their power.
By stressing that tech companies would have to decrypt sensitive information only after a court issues a warrant, the three Senators believe they can swing public opinion in favor of this encryption-backdoor law.
Senate Votes to Allow FBI to Look at Your Web Browsing History Without a Warrant
The US Senate has voted to give law enforcement agencies access to web browsing data without a warrant, dramatically expanding the government’s surveillance powers in the midst of the COVID-19 pandemic.
The power grab was led by Senate majority leader Mitch McConnell as part of a reauthorization of the Patriot Act, which gives federal agencies broad domestic surveillance powers.
“Today the Senate made clear that the purpose of the PATRIOT Act is to spy on Americans, no warrants or due process necessary,” Dayton Young, director of product at Fight for the Future, told Motherboard.
Raise your hand if you agree that the Senate GOP’s vote to permit the FBI to review your browsing history without a warrant is a flagrant government overreach ✋
— Publius (@ThePubliusUSA) May 13, 2020
Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education
While in-person test proctoring has been used to combat test-based cheating, this can be difficult to translate to online courses. Ed-tech companies have sought to address this concern by offering to watch students take online tests, in real time, through their webcams.
Some of the more prominent companies offering these services include Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity.
Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications.
While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers do this), we see it deployed through face detection and facial recognition used by algorithmic proctoring systems.
While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies. These technologies generally function similarly, and they have shown a consistent inability to identify people with darker skin, or even to distinguish between Chinese faces. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.
As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that software asking students to verify their identity is compromising for students who identify as trans or non-binary, or who express their gender in ways counter to cis/heteronormativity.
These features and settings create a system of asymmetric surveillance and lack of accountability, things which have always created a risk for abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.
Their promotional messaging functions similarly to the dog-whistle politics commonly used in anti-immigration rhetoric. It’s also not a coincidence that these technologies are being used to exclude people not wanted by an institution; biometrics and facial recognition have been connected to anti-immigration policies, supported by both Republican and Democratic administrations, going back to the 1990s.
Borrowing from Henry A. Giroux, Kevin Seeber describes the pedagogy of punishment and some of its consequences in regards to higher education’s approach to plagiarism in his book chapter “The Failed Pedagogy of Punishment: Moving Discussions of Plagiarism beyond Detection and Discipline.”
my note: I have been repeating this for years
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.
Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”
Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people, and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.
This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.
Higher education is deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research by publishing copious literature on it, establishing endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.
more on privacy in this IMS blog
Facial recognition technology breaches GDPR says Vestager
The EU’s tech chief, Margrethe Vestager, said on Thursday that facial recognition technologies breach the consent requirement stipulated in Europe’s data protection rules (GDPR).
“China might have data and the US might have money, but Europe has purpose,” the Commission’s VP for a Europe Fit for the Digital Age said.
The use of facial recognition technology remains highly controversial due to fears of China-style surveillance regimes and human rights violations, with EC President Ursula von der Leyen pledging to distance Europe from these practices and to announce new ethical, human-centred AI rules in the first 100 days of her mandate.
more on facial recognition in this IMS blog
The Secretive Company That Might End Privacy as We Know It: It’s taken 3 billion images from the internet to build an AI-driven database that allows US law enforcement agencies to identify any stranger. from r/Futurology
Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” Some large cities, including San Francisco, have barred police from using facial recognition technology.
But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company, which declined to provide a list.
Facial recognition technology has always been controversial. It makes people nervous about Big Brother. It has a tendency to deliver false matches for certain groups, like people of color. And some facial recognition products used by the police — including Clearview’s — haven’t been vetted by independent experts.
Clearview deployed current and former Republican officials to approach police forces, offering free trials and annual licenses for as little as $2,000. Mr. Schwartz tapped his political connections to help make government officials aware of the tool, according to Mr. Ton-That.
“We have no data to suggest this tool is accurate,” said Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology, who has studied the government’s use of facial recognition. “The larger the database, the larger the risk of misidentification because of the doppelgänger effect. They’re talking about a massive database of random people they’ve found on the internet.”
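Garvie’s “doppelgänger effect” point can be made concrete with a back-of-the-envelope calculation: even a tiny per-comparison false-match rate compounds when a probe face is searched against billions of gallery images. The rate used below is an illustrative assumption, not a measured figure for Clearview or any real system:

```python
# Probability of at least one false match when searching a gallery of n faces,
# assuming independent comparisons with a per-comparison false-match rate p.
# p = 1e-9 here is purely illustrative.
def false_match_probability(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

for n in (1_000, 1_000_000, 3_000_000_000):
    print(f"{n:>13,} faces -> {false_match_probability(1e-9, n):.4f}")
```

Even under this optimistic error rate, a thousand-face gallery is essentially safe, while a three-billion-face gallery makes at least one false match close to certain — which is the researcher’s point about database size driving misidentification risk.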
Law enforcement is using a facial recognition app with huge privacy issues Clearview AI’s software can find matches in billions of internet images. from r/technology
Part of the problem stems from a lack of oversight. There has been no real public input into adoption of Clearview’s software, and the company’s ability to safeguard data hasn’t been tested in practice. Clearview itself remained highly secretive until late 2019.
The software also appears to explicitly violate policies at Facebook and elsewhere against collecting users’ images en masse.
While there’s underlying code that could theoretically power augmented-reality glasses able to identify people on the street, Ton-That said there were no plans for such a design.
Banning Facial Recognition Isn’t Enough from r/technology
In May of last year, San Francisco banned facial recognition; the neighboring city of Oakland soon followed, as did Somerville and Brookline in Massachusetts (a statewide ban may follow). In December, San Diego suspended a facial recognition program ahead of a new statewide law declaring it illegal. Forty major music festivals pledged not to use the technology, and activists are calling for a nationwide ban. Many Democratic presidential candidates support at least a partial ban on the technology.
Facial recognition bans are the wrong way to fight against modern surveillance. Focusing on one particular identification method misconstrues the nature of the surveillance society we’re in the process of building. Ubiquitous mass surveillance is increasingly the norm. In countries like China, a surveillance infrastructure is being built by the government for social control. In countries like the United States, it’s being built by corporations in order to influence our buying behavior, and is incidentally used by the government.
People can be identified at a distance by their heart beat or by their gait, using a laser-based system. Cameras are so good that they can read fingerprints and iris patterns from meters away. And even without any of these technologies, we can always be identified because our smartphones broadcast unique numbers called MAC addresses.
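The MAC-address point is easy to demonstrate: a network interface’s 48-bit hardware address is a stable identifier, and Python’s standard library exposes the local one via `uuid.getnode()`. A minimal sketch (note that modern mobile OSes now randomize the MAC broadcast in Wi-Fi probe requests precisely to blunt this kind of tracking):

```python
import uuid

def device_mac() -> str:
    """Format this machine's MAC address as colon-separated hex.

    uuid.getnode() returns the hardware address as a 48-bit integer
    (or a random 48-bit fallback, with the multicast bit set, if the
    real address cannot be determined).
    """
    node = uuid.getnode()
    return ":".join(f"{(node >> shift) & 0xff:02x}" for shift in range(40, -8, -8))

print(device_mac())
```

Any receiver within radio range that logs these addresses over time can, in effect, track the device — no camera required.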
China, for example, uses multiple identification technologies to support its surveillance state.
There is a huge — and almost entirely unregulated — data broker industry in the United States that trades on our information.
This is why many companies buy license plate data from states. It’s also why companies like Google are buying health records, and part of the reason Google bought the company Fitbit, along with all of its data.
The data broker industry is almost entirely unregulated; there’s only one law — passed in Vermont in 2018 — that requires data brokers to register and explain in broad terms what kind of data they collect.
The Secretive Company That Might End Privacy as We Know It from r/technews
on social credit system in this IMS blog