
surveillance technology and education

New York’s Lockport City School District is using public funds from a Smart Schools bond to help pay for a reported $3.8 million security system that uses facial recognition technology to identify individuals who don’t belong on campus.

The Lockport case has drawn the attention of national media, the ire of many parents, and criticism from the New York Civil Liberties Union, among other privacy groups.

The Future of Privacy Forum (FPF), a nonprofit think tank based in Washington, D.C., published an animated video that illustrates the possible harm that surveillance technology can cause to children and the steps schools should take before making any decisions, such as identifying specific goals for the technology and establishing who will have access to the data and for how long.

A few days later, the nonprofit Center for Democracy and Technology, in partnership with New York University’s Brennan Center for Justice, released a brief examining the same topic.

My note: the same considerations were relayed to the SCSU SOE dean regarding the purchase of Promethean boards and their installation in the SOE building without discussion with the faculty who work with technology. This information was also shared with the dean:

more on surveillance in education in this IMS blog

surveillance in schools

The phrase “school-to-prison pipeline” has long been used to describe how schools respond to disciplinary problems with excessively stringent policies that create prison-like environments and funnel children who don’t fall in line into the criminal justice system. Now, schools are investing in surveillance systems that will likely exacerbate existing disparities.

A number of tech companies are capitalizing on the growing market for student surveillance measures as various districts and school leaders commit themselves to preventing acts of violence. Rekor Systems, for instance, recently announced the launch of OnGuard, a program that claims to “advance student safety” by implementing countless surveillance and “threat assessment” mechanisms in and around schools.

While none of these methods have been proven to be effective in deterring violence, similar systems have resulted in diverting resources away from enrichment opportunities, policing school communities to a point where students feel afraid to express themselves, and placing especially dangerous targets on students of color who are already disproportionately mislabeled and punished. (ProPublica)

more on surveillance in this IMS blog

Algorithmic Test Proctoring

Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education


While in-person test proctoring has been used to combat test-based cheating, this can be difficult to translate to online courses. Ed-tech companies have sought to address this concern by offering to watch students take online tests, in real time, through their webcams.

Some of the more prominent companies offering these services include Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity.

Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications. 

While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers does this), we see it deployed through face detection and facial recognition used by algorithmic proctoring systems.

While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies. These technologies generally function similarly and have shown a consistent inability to identify people with darker skin, or even to distinguish between the faces of Chinese people. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.
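One way to make this failure concrete is to disaggregate a matcher’s error rate by group instead of reporting a single aggregate number. The audit below is a hypothetical sketch with invented match outcomes (no real system or dataset is being measured); it only illustrates how a respectable-looking overall accuracy can hide a much higher failure rate for one group.

```python
from collections import defaultdict

def false_non_match_rates(results):
    """results: iterable of (group, matched_correctly) pairs."""
    totals = defaultdict(int)
    misses = defaultdict(int)
    for group, matched in results:
        totals[group] += 1
        if not matched:
            misses[group] += 1
    # per-group share of genuine users the system failed to recognize
    return {g: misses[g] / totals[g] for g in totals}

# Invented outcomes: 100 verification attempts per group.
results = ([("lighter-skin", True)] * 97 + [("lighter-skin", False)] * 3
           + [("darker-skin", True)] * 80 + [("darker-skin", False)] * 20)

print(false_non_match_rates(results))
# {'lighter-skin': 0.03, 'darker-skin': 0.2} — roughly 7x higher for one group
```

With these invented numbers the overall error rate is 11.5%, which sounds like one system; disaggregation shows it is effectively two systems of very different quality.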

As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that software asking students to verify their identity is compromising for students who identify as trans or non-binary, or who express their gender in ways counter to cis/heteronormativity.

These features and settings create a system of asymmetric surveillance and a lack of accountability, conditions that have always created a risk of abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.

Their promotional messaging functions similarly to the dog-whistle politics commonly used in anti-immigration rhetoric. It’s also not a coincidence that these technologies are being used to exclude people not wanted by an institution; biometrics and facial recognition have been connected to anti-immigration policies, supported by both Republican and Democratic administrations, going back to the 1990s.

Borrowing from Henry A. Giroux, Kevin Seeber describes the pedagogy of punishment and some of its consequences with regard to higher education’s approach to plagiarism in his book chapter “The Failed Pedagogy of Punishment: Moving Discussions of Plagiarism beyond Detection and Discipline.”

My note: I have been repeating this for years.
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.

Technological Solutionism

Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”

Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.

This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.

Higher education was deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research: they published copious literature on it and established endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.

more on privacy in this IMS blog

Zuckerberg politics and money

Thiel is notorious among Silicon Valley billionaires for explicitly endorsing Trump in 2016 and speaking at the Republican National Convention that year. Thiel, a libertarian who runs a company that enhances government surveillance efforts, has also questioned the value of women voting.

A series of dinners at Zuckerberg’s home in California brought together conservative pundits and activists like white supremacist Tucker Carlson of Fox News.

It’s safe to say that Zuckerberg’s politics are issue-specific and generally party-agnostic.

Zuckerberg dropped out of Harvard after two years. Zuckerberg has enrolled for the past decade at the University of Davos, where rich people pretend they are smart and smart people pander to the rich. If someone chooses to study world politics from Henry Kissinger, you can assume that he will have some twisted views of how the world works.

More on in this IMS blog:

Facial Recognition Technology in schools

With Safety in Mind, Schools Turn to Facial Recognition Technology. But at What Cost?

By Emily Tate, Jan 31, 2019

SAFR (Secure, Accurate Facial Recognition)

Violent deaths in schools have stayed relatively constant over the last 30 years, according to data from the National Center for Education Statistics. But then there’s the emotive reality, which is that every time another event like Sandy Hook or Parkland occurs, many educators and students feel they are in peril when they go to school.

RealNetworks is a Seattle-based software company that was popular in the 1990s for its audio and video streaming services but has since expanded to offer other tools, including SAFR (Secure, Accurate Facial Recognition), its AI-supported facial recognition software.

After installing new security cameras, purchasing a few Apple devices and upgrading the school’s Wi-Fi, St. Therese was looking at a $24,000 technology tab.

The software is programmed to allow authorized users into the building with a smile.
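SAFR’s internals are proprietary, so the following is only a hypothetical sketch of the general pattern such systems use: compare a detected face’s embedding against the embeddings of enrolled, authorized users, and treat the smile as a crude liveness/intent gate before unlocking. The threshold, embeddings, and `smile_score` signal are all invented for illustration.

```python
import math

SIMILARITY_THRESHOLD = 0.85  # assumed value; real systems tune this carefully

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def should_unlock(face_embedding, smile_score, enrolled):
    """enrolled: dict of person_id -> reference embedding for authorized users.

    Returns (unlock?, matched_person_id)."""
    for person_id, ref in enrolled.items():
        if cosine_similarity(face_embedding, ref) >= SIMILARITY_THRESHOLD:
            # face matched; the smile acts as a confirmation signal
            return smile_score > 0.5, person_id
    return False, None  # unknown face: door stays locked

enrolled = {"staff-001": [0.9, 0.1, 0.4]}
print(should_unlock([0.88, 0.12, 0.41], smile_score=0.9, enrolled=enrolled))
# (True, 'staff-001')
```

Note how much rides on the threshold: set it low and strangers get in; set it high and the misidentification patterns described elsewhere in this post lock out legitimate students and staff.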

“Facial recognition isn’t a panacea. It is just a tool,” says Collins, who focuses on education privacy issues.

Another part of the problem with tools like SAFR is that they provide a false sense of security.

more on surveillance in this IMS blog

more on privacy in this IMS blog

media literacy backfire

Did Media Literacy Backfire?

Jan 5, 2017, by danah boyd

Understanding what sources to trust is a basic tenet of media literacy education.

Think about how this might play out in communities where the “liberal media” is viewed with disdain as an untrustworthy source of information…or in those where science is seen as contradicting the knowledge of religious people…or where degrees are viewed as a weapon of the elite to justify oppression of working people. Needless to say, not everyone agrees on what makes a trusted source.

Students are also encouraged to reflect on economic and political incentives that might bias reporting. Follow the money, they are told. Now watch what happens when they are given a list of names of major power players in the East Coast news media whose names are all clearly Jewish. Welcome to an opening for anti-Semitic ideology.

In the United States, we believe that worthy people lift themselves up by their bootstraps. This is our idea of freedom. To take away the power of individuals to control their own destiny is viewed as anti-American by so much of this country. You are your own master.

Children are indoctrinated into this cultural logic early, even as their parents restrict their mobility and limit their access to social situations. But when it comes to information, they are taught that they are the sole proprietors of knowledge. All they have to do is “do the research” for themselves and they will know better than anyone what is real.

Combine this with a deep distrust of media sources.

Many marginalized groups are justifiably angry about the ways in which their stories have been dismissed by mainstream media for decades. It took five days for major news outlets to cover Ferguson. It took months and a lot of celebrities for journalists to start discussing the Dakota Pipeline. But feeling marginalized from news media isn’t just about people of color.

Keep in mind that anti-vaxxers aren’t arguing that vaccinations definitively cause autism. They are arguing that we don’t know. They are arguing that experts are forcing children to be vaccinated against their will, which sounds like oppression. What they want is choice — the choice to not vaccinate. And they want information about the risks of vaccination, which they feel are not being given to them. In essence, they are doing what we taught them to do: questioning information sources and raising doubts about the incentives of those who are pushing a single message. Doubt has become a tool.

Addressing so-called fake news is going to require a lot more than labeling. It’s going to require a cultural change about how we make sense of information, whom we trust, and how we understand our own role in grappling with information. Quick and easy solutions may make the controversy go away, but they won’t address the underlying problems.

In the United States, we’re moving towards tribalism (see Fukuyama), and we’re undoing the social fabric of our country through polarization, distrust, and self-segregation.


boyd, danah. (2014). It’s Complicated: The Social Lives of Networked Teens (1 edition). New Haven: Yale University Press.
p. 8 networked publics are publics that are reconstructed by networked technologies. they are both space and imagined community.
p. 11 affordances: persistence, visibility, spreadability, searchability.
p. technological determinism both utopian and dystopian
p. 30 adults misinterpret teens online self-expression.
p. 31 taken out of context. Joshua Meyrowitz about Stokely Carmichael.
p. 43 as teens have embraced a plethora of social environments and helped co-create the norms that underpin them, a wide range of practices has emerged. teens have grown sophisticated with how they manage contexts and present themselves in order to be read by their intended audience.
p. 54 privacy. p. 59 Privacy is a complex concept without a clear definition. Supreme Court Justice Brandeis: the right to be let alone, but also ‘a measure of the access others have to you through information, attention, and physical proximity.’
control over access and visibility
p. 65 social steganography. hiding messages in plain sight
p. 69 subtweeting. encoding content
p. 70 living with surveillance. Foucault, Discipline and Punish
p. 77 addiction. what makes teens obsessed w/ social media.
p. 81 Ivan Goldberg coined the term internet addiction disorder. jokingly
p. 89 the decision to introduce programmed activities and limit unstructured time is not unwarranted; research has shown a correlation between boredom and deviance.
My interview with Myra, a middle-class white fifteen-year-old from Iowa, turned funny and sad when “lack of time” became a verbal trick in response to every question. From learning Czech to track, from orchestra to work in a nursery, she told me that her mother organized “98%” of her daily routine. Myra did not like all of these activities, but her mother thought they were important.
Myra noted that her mother meant well, but she was exhausted and felt socially disconnected because she did not have time to connect with friends outside of class.
p. 100 danger
are sexual predators lurking everywhere
p. 128 bullying. is social media amplifying meanness and cruelty.
p. 131 defining bullying in a digital era. p. 131 Dan Olweus in the 70s narrowed bullying to three components: aggression, repetition, and imbalance of power. p. 152 SM has not radically altered the dynamics of bullying, but it has made these dynamics more visible to more people. we must use this visibility not to justify increased punishment, but to help youth who are actually crying out for attention.
p. 153 inequality. can SM resolve social divisions?
p. 176 literacy. are today’s youth digital natives? p. 178 Barlow and Rushkoff p. 179 Prensky. p. 180 youth need new literacies. p. 181 youth must become media literate. when they engage with media–either as consumers or producers–they need to have the skills to ask questions about the construction and dissemination of particular media artifacts. what biases are embedded in the artifact? how did the creator intend for an audience to interpret the artifact, and what are the consequences of that interpretation.
p. 183 the politics of algorithms (see also these IMS blog entries). Wikipedia and Google are fundamentally different sites. p. 186 Eli Pariser, The Filter Bubble: the personalization algorithms produce social divisions that undermine any ability to create an informed public. As Harvard’s Berkman Center has shown, search engines like Google shape the quality of information experienced by youth.
p. 192 digital inequality. p. 194 (bottom) 195 Eszter Hargittai: there are significant differences in media literacy and technical skills even within age cohorts. teens’ technological skills are strongly correlated with socio-economic status. Hargittai argues that many youth, far from being digital natives, are quite digitally naive.
p. 195 Dmitry Epstein: when society frames the digital divide as a problem of access, we see government and industry as the responsible parties for addressing the issue. If the digital divide is framed as a skills issue, we place the onus of learning how to manage on individuals and families.
p. 196 beyond digital natives

Palfrey, J., & Gasser, U. (2008). Born Digital: Understanding the First Generation of Digital Natives (1 edition). New York: Basic Books.

John Palfrey, Urs Gasser: Born Digital
Digital Natives share a common global culture that is defined not by age, strictly, but by certain attributes and experiences related to how they interact with information technologies, information itself, one another, and other people and institutions. Those who were not “born digital” can be just as connected, if not more so, than their younger counterparts. And not everyone born since, say, 1982 happens to be a digital native.

p. 197. digital native rhetoric is worse than inaccurate: it is dangerous
many of the media literacy skills needed to be digitally savvy require a level of engagement that goes far beyond what the average teen picks up hanging out with friends on FB or Twitter. Technical skills, such as the ability to build online spaces, require active cultivation: understanding why some search queries return some content before others, or learning how to build your own systems versus simply using existing social media platforms. teens’ social status and position alone do not determine how fluent or informed they are vis-à-vis technology.
p. 199 Searching for a public on their own


Daum, M. (2018, August 24). My Affair With the Intellectual Dark Web – Great Escape. Retrieved October 9, 2018, from

the intellectual dark web

more on media literacy in this IMS blog

fake news in this IMS blog

AI tracks students writings

Schools are using AI to track what students write on their computers

By Simone Stolzoff August 19, 2018
50 million K-12 students in the US
Under the Children’s Internet Protection Act (CIPA), any US school that receives federal funding is required to have an internet-safety policy. As school-issued tablets and Chromebook laptops become more commonplace, schools must install technological guardrails to keep their students safe. For some, this simply means blocking inappropriate websites. Others, however, have turned to software companies like Gaggle, Securly, and GoGuardian to surface potentially worrisome communications to school administrators.
In an age of mass school shootings and increased student suicides, safety management platforms (SMPs) can play a vital role in preventing harm before it happens. Each of these companies has case studies where an intercepted message helped save lives.
Over 50% of teachers say their schools are one-to-one (the industry term for assigning every student a device of their own), according to a 2017 survey from Freckle Education
But even in an age of student suicides and school shootings, when do security precautions start to infringe on students’ freedoms?
When the Gaggle algorithm surfaces a word or phrase that may be of concern—like a mention of drugs or signs of cyberbullying—the “incident” gets sent to human reviewers before being passed on to the school. Using AI, the software is able to process thousands of student tweets, posts, and status updates to look for signs of harm.
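The flag-then-review workflow described above can be sketched minimally. The concern terms and messages below are invented for illustration; real SMPs use much larger lexicons plus machine-learned classifiers, and everything flagged is meant to pass through human reviewers before the school sees it.

```python
# Hypothetical concern lexicon; real systems are far larger and more nuanced.
CONCERN_TERMS = {"drugs", "hurt myself", "kill"}

def surface_incidents(messages):
    """Scan (author, text) pairs and queue matches for human review."""
    incidents = []
    for author, text in messages:
        lowered = text.lower()
        hits = [t for t in CONCERN_TERMS if t in lowered]
        if hits:
            incidents.append({"author": author, "text": text, "terms": hits})
    return incidents  # a human reviewer decides what reaches the school

messages = [
    ("student-a", "See you at practice"),
    ("student-b", "I want to hurt myself"),
]
print(surface_incidents(messages))
# [{'author': 'student-b', 'text': 'I want to hurt myself', 'terms': ['hurt myself']}]
```

Even this toy version makes the trade-off visible: the scanner necessarily reads every message to find the one that matters, which is exactly the surveillance concern the article raises.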
SMPs help normalize surveillance from a young age. In the wake of the Cambridge Analytica scandal at Facebook and other recent data breaches from companies like Equifax, we have the opportunity to teach kids the importance of protecting their online data
In an age of increased school violence, bullying, and depression, schools have an obligation to protect their students. But the protection of kids’ personal information is also a matter of their safety.

more on cybersecurity in this IMS blog

more on surveillance in this IMS blog

more on privacy in this IMS blog

Digital Literacy for SPED 405

Digital Literacy for SPED 405. Behavior Theories and Practices in Special Education.

Instructor Mark Markell. Mondays, 5:30 – 8:20 PM. SOE A235

Preliminary Plan for Monday, Sept 10, 5:45 PM to 8 PM

Introduction – who are the students in this class. About myself: Contact info, “embedded” librarian idea – I am available to help during the semester with research and papers

about 40 min: Intro to the library:
15 min for a Virtual Reality tour of the Library + quiz on how well they learned the library:
and 360 degree video on BYOD:
Play a scavenger hunt IN THE LIBRARY:
The VR (virtual reality) and AR (augmented reality) component; why is it important?
why is this technology brought up in a SPED class?
Social emotional learning
(transition to the next topic – digital literacy)

about 50 min:

  1. Digital Literacy

How important is technology in our life? Profession?

Do you think technology overlaps with the broad field of special education? How?
How do you define technology? What falls under “technology?”

What is “digital literacy?” Do we need to be literate in that sense? How does it differ from technology literacy?

Additional readings on “digital literacy”

Digital Citizenship:
Play Kahoot:
Privacy and surveillance: how do these two issues affect your students? Do they affect them more? If so, how?

Social Media: if you want to survey the class, here is the FB group page:

Is Social Media part of digital literacy? Why? How can SM help us become more literate?

Digital Storytelling:

How is digital storytelling essential in digital literacy?

about 50 min:

  1. Fake News and Research

Syllabus: Teaching Media Manipulation:

#FakeNews is a very timely and controversial issue. In 2-3 min, choose your best source on this issue. 1. Mind the prevalence of resources in the 21st century. 2. Mind the necessity to evaluate a) the veracity of your sources, b) the quality of your sources (the fact that they are “true” does not mean that they are the best). Be prepared to name your source and defend its quality.
How do you determine your sources? How do you decide the reliability of your sources? Are you sure you can distinguish “good” from “bad?”
Compare this entry
to this entry: to understand the scope

Do you know any fact-checking sites? Can you spot sponsored content? Do you understand syndication? What do you understand by “media literacy,” “news literacy,” and “information literacy”?

Why do we need to explore the “fake news” phenomenon? Do you find it relevant to your professional development?

Let’s watch another video and play this Kahoot:

So, how do we do academic research? Let’s play another Kahoot:
If you were to structure this Kahoot, what questions would you ask? What are the main steps in achieving successful research for your paper?

  • Research using social media

what is social media (examples)? why is it called SM? why is it so popular? what makes it so popular?

use SM tools for your research and education:

– Determining your topic. How to?
Digg, Reddit, Quora
Facebook, Twitter – hashtags (class assignment 2-3 min to search)
LinkedIn Groups
YouTube and Slideshare (class assignment 2-3 min to search)
Flickr, Instagram, Pinterest for visual aids (like YouTube, they are media repositories); a paper-sharing social network that has been informally dubbed “Facebook for academics”


– collecting and managing your resources:
Evernote; OneNote (Microsoft)

blogs and wikis for collecting data and collaborating

– Managing and sharing your information:

– Testing your work against your peers (globally):

First step: using Wikipedia. Second step: contributing to Wikipedia (editing a page). Third step: contributing to Wikipedia (creating a page).

– presenting your information

please use this form to cast your feedback. Please feel free to fill out only the relevant questions:

Google go home

‘Google go home’: the Berlin neighbourhood fighting off a tech giant

Other cities have embraced the company, but in Kreuzberg opposition to a planned Google campus is vociferous. What makes Berlin different?

Google’s sites in London, Madrid, Tel Aviv, Seoul, São Paulo and Warsaw (in a converted former vodka distillery) are hubs for entrepreneurs, providing workspace for startup founders as well as networking and educational events.

Critics point to the recent offer from Sidewalk Labs – a company owned by Alphabet, Google’s parent company – to redevelop Toronto’s waterfront as a reason to be concerned about the company’s interest in extracting data from cities.

Opponents cite Google’s history of tax evasion and mass surveillance as examples of actions that make it incompatible with the progressive values of the local area.

more on Google in this IMS blog

China of Xi

Time of Xi

My note: CCTV (China Central Television) accidentally overlaps with CCTV (closed-circuit television), “also known as video surveillance.”

China Central Television (formerly Beijing Television), commonly abbreviated as CCTV, is the predominant state television broadcaster in the People’s Republic of China. CCTV has a network of 50 channels broadcasting different programmes and is accessible to more than one billion viewers. At present, there are 50 television channels, and the broadcaster provides programming in six different languages. Most of its programmes are a mixture of news, documentary, social education, comedy, entertainment, and drama, the majority of which consists of Chinese soap operas and entertainment.

CCTV is one of the official mouthpieces of the Communist Party of China, and is part of what is known in China as the “central three” (中央三台), with the others being China National Radio and China Radio International.

Fake news and CCTV

CCTV mentioned positively:
