The error I see many educators beginning to make is forgetting about the diverse needs of our younger students or, worse, pushing tools intended for older students onto younger ones. When considering immersive technology resources for early elementary students, here are some important, practical areas to keep in mind.
A virtual reality (VR) medical training system built by Oxford Medical Simulation (OMS) is now being offered for free during the COVID-19 pandemic to help hospitals and medical schools bring in badly needed additional staffers to provide patient care.
Contact pmiltenoff@stcloudstate.edu if you need more information, support, or clarifications. For example, among the great tools in the list is EdPuzzle (https://edpuzzle.com/). EdPuzzle does very much the same as the Video Quiz in the MinnState MediaSpace (aka Kaltura); we can help you weigh the advantages and disadvantages of these tools and their pedagogical applications, and make a final choice.
Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications.
While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers does this), we see it deployed through face detection and facial recognition used by algorithmic proctoring systems.
While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies. These technologies generally function similarly and have shown a consistent inability to identify people with darker skin or even to distinguish between Chinese faces. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.
As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that software asking students to verify their identity is compromising for students who identify as trans or non-binary, or who express their gender in ways counter to cis/heteronormativity.
These features and settings create a system of asymmetric surveillance and a lack of accountability, conditions which have always created a risk of abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.
my note: I have been repeating this for years
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.
Technological Solutionism
Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”
Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people, and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.
This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.
Higher education was deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research: they published copious literature on it and established endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.
The Intercept reported that Zoom video calls are not end-to-end encrypted, despite the company’s claims that they are.
Motherboard reports that Zoom is leaking the email addresses of “at least a few thousand” people because personal addresses are treated as if they belong to the same company.
Apple was forced to step in to secure millions of Macs after a security researcher found that Zoom had failed to disclose that it installed a secret web server on users’ Macs — a server Zoom failed to remove when the client was uninstalled.
+++++++++++++
‘Zoom is malware’: why experts worry about the video conferencing platform
A report from Motherboard found Zoom sends data from users of its iOS app to Facebook for advertising purposes, even if the user does not have a Facebook account.
++++++++++++++++
I used to thoroughly love @zoom_us as a platform for collaborating. I still use it. But it’s not something that I would recommend to others anymore. Here’s a thread as to why: