
Algorithmic Test Proctoring

Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education

Shea Swauger, Ed-Tech

https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/

While in-person test proctoring has been used to combat test-based cheating, this can be difficult to translate to online courses. Ed-tech companies have sought to address this concern by offering to watch students take online tests, in real time, through their webcams.

Some of the more prominent companies offering these services include Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity.

Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications. 

While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers does this), we see it deployed through the face detection and facial recognition used by algorithmic proctoring systems.

While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies. These technologies generally function similarly and have shown a consistent inability to identify people with darker skin, or even to distinguish between Asian people. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.
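To make the failure mode concrete: the generic face-detection step these products build on can simply return “no face found” for some test-takers. Below is a minimal sketch using OpenCV’s stock Haar-cascade detector; it is not any vendor’s actual pipeline, and the webcam capture and flagging rule are illustrative assumptions.

# Minimal sketch of a webcam "presence check" of the kind proctoring
# tools build on. Uses OpenCV's stock Haar-cascade face detector; this
# is NOT any vendor's actual pipeline, just the generic building block.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_present(frame_bgr):
    """Return True if at least one face is detected in the frame.

    Haar cascades key on contrast patterns; under poor lighting or for
    darker skin tones they can return zero detections, which a naive
    proctoring rule would then flag as a violation.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0

cap = cv2.VideoCapture(0)  # default webcam
ok, frame = cap.read()
if ok and not face_present(frame):
    print("No face detected -- a naive system would flag this student.")
cap.release()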

As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that software asking students to verify their identity is compromising for students who identify as trans or non-binary, or who express their gender in ways counter to cis/heteronormativity.

These features and settings create a system of asymmetric surveillance and lack of accountability, things which have always created a risk for abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.

These companies’ promotional messaging functions similarly to the dog whistle politics commonly used in anti-immigration rhetoric. It’s also not a coincidence that these technologies are being used to exclude people not wanted by an institution; biometrics and facial recognition have been connected to anti-immigration policies, supported by both Republican and Democratic administrations, going back to the 1990s.

Borrowing from Henry A. Giroux, Kevin Seeber describes the pedagogy of punishment and some of its consequences with regard to higher education’s approach to plagiarism in his book chapter “The Failed Pedagogy of Punishment: Moving Discussions of Plagiarism beyond Detection and Discipline.”

my note: I have been repeating this for years
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.

Technological Solutionism

Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”

Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people, and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.

This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.

Higher education was deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research by publishing copious literature on it and by establishing endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.

+++++++++++++++++
more on privacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=privacy

algorithm literacy

Report: Colleges Must Teach ‘Algorithm Literacy’ to Help Students Navigate Internet

By Rebecca Koenig, Jan 16, 2020

https://www.edsurge.com/news/2020-01-16-report-colleges-must-teach-algorithm-literacy-to-help-students-navigate-internet

The report comes from Project Information Literacy, a nonprofit research institution that explores how college students find, evaluate, and use information. It was commissioned by the John S. and James L. Knight Foundation and The Harvard Graduate School of Education.

The study draws on focus groups and interviews with 103 undergraduates and 37 faculty members from eight U.S. colleges.

To better equip students for the modern information environment, the report recommends that faculty teach algorithm literacy in their classrooms. And given students’ reliance on learning from their peers when it comes to technology, the authors also suggest that students help co-design these learning experiences.

Algorithms and Media Literacy

While informed and critically aware media users may see past the resulting content found in suggestions provided after conducting a search on YouTube, Facebook, or Google, those without these skills, particularly young or inexperienced users, fail to realize the culpability of underlying algorithms in the resultant filter bubbles and echo chambers (Cohen, 2018).
Media literacy education is more important than ever. It’s not just the overwhelming calls to understand the effects of fake news or the data breaches threatening personal information; it is also the artificial intelligence systems being designed to predict and project what consumers of social media are perceived to want.
It’s time to revisit the Eight Key Concepts of media literacy with an algorithmic focus.
Literacy in today’s online and offline environments “means being able to use the dominant symbol systems of the culture for personal, aesthetic, cultural, social, and political goals” (Hobbs & Jensen, 2018, p. 4).

++++++++++++++++++++

Information Literacy in an Age of Algorithms from Kristen Yarmey

++++++++++++++++++++

Artificial Intelligence Literacy from Rogelio E. Cardona-Rivera

+++++++++++++
more on media literacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=media+literacy

more on news literacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=news+literate

education algorithms

https://www.edsurge.com/news/2016-06-10-humanizing-education-s-algorithms

The article discusses predictive algorithms used to better target students’ individual learning needs.

Personalized learning is a lofty aim, however you define it. To truly meet each student where they are, we would have to know their most intimate details, or discover them through their interactions with our digital tools. We would need to track their moods and preferences, their fears and beliefs…perhaps even their memories.

There’s something unsettling about capturing users’ most intimate details. Any prediction model based on historical records risks typecasting the very people it is intended to serve. Even if models can overcome the threat of discrimination, there is still an ethical question to confront – just how much are we entitled to know about students?
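The typecasting risk is easy to see in miniature. In the hedged sketch below, every number and feature is invented; the point is only that a model fit to historical records projects past outcomes onto any new student who resembles past students.

# Toy illustration of "typecasting": a model trained on historical
# records projects past outcomes onto new students. Data are invented.
from sklearn.linear_model import LogisticRegression

# Features: [weekly hours online, prior GPA]; label: passed the course?
X_history = [[2, 2.1], [3, 2.4], [9, 3.6], [8, 3.8], [2, 2.0], [9, 3.9]]
y_history = [0, 0, 1, 1, 0, 1]

model = LogisticRegression().fit(X_history, y_history)

# A new student who "looks like" past low scorers is pre-judged to fail,
# regardless of anything the two historical features failed to capture.
new_student = [[2, 2.2]]
print(model.predict(new_student))        # -> [0]
print(model.predict_proba(new_student))  # low predicted pass probability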

We can accept that tutoring algorithms, for all their processing power, are inherently limited in what they can account for. This means steering clear of mythical representations of what such algorithms can achieve. It may even mean giving up on personalization altogether. The alternative is to pack our algorithms to suffocation at the expense of users’ privacy. This approach does not end well.

There is only one way to resolve this trade-off: loop in the educators.

Algorithms and data must exist to serve educators

 

++++++++++++
more on algorithms in this IMS blog
https://blog.stcloudstate.edu/ims?s=algor

social media algorithms

How do algorithms impact our browsing behavior and browsing history?
What is the connection between social media algorithms and fake news?
Are there topic-detection algorithms, as there are community-detection ones?
How can I change the content of a [Google] search return? Can I? 

Larson, S. (2016, July 8). What is an Algorithm and How Does it Affect You? The Daily Dot. Retrieved from https://www.dailydot.com/debug/what-is-an-algorithm/
Berg, P. (2016, June 30). How Do Social Media Algorithms Affect You | Forge and Smith. Retrieved September 19, 2017, from https://forgeandsmith.com/how-do-social-media-algorithms-affect-you/
Oremus, W., & Chotiner, I. (2016, January 3). Who Controls Your Facebook Feed. Slate. Retrieved from http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html
Lehrman, R. A. (2013, August 11). The new age of algorithms: How it affects the way we live. Christian Science Monitor. Retrieved from https://www.csmonitor.com/USA/Society/2013/0811/The-new-age-of-algorithms-How-it-affects-the-way-we-live
Johnson, C. (2017, March 10). How algorithms affect our way of life. Deseret News. Retrieved from https://www.deseretnews.com/article/865675141/How-algorithms-affect-our-way-of-life.html
Understanding algorithms and their impact on human life goes far beyond basic digital literacy, some experts said.
An example could be the recent outcry over Facebook’s news algorithm, which enhances the so-called “filter bubble” of information.
personalized search (https://en.wikipedia.org/wiki/Personalized_search)
Kounine, A. (2016, August 24). How your personal data is used in personalization and advertising. Retrieved September 19, 2017, from https://www.tastehit.com/blog/personal-data-in-personalization-and-advertising/
Hotchkiss, G. (2007, March 9). The Pros & Cons Of Personalized Search. Retrieved September 19, 2017, from http://searchengineland.com/the-pros-cons-of-personalized-search-10697
Magid, L. (2012). How (and why) To Turn Off Google’s Personalized Search Results. Forbes. Retrieved from https://www.forbes.com/sites/larrymagid/2012/01/13/how-and-why-to-turn-off-googles-personalized-search-results/#53a30be838f2
Nelson, P. (n.d.). Big Data, Personalization and the No-Search of Tomorrow. Retrieved September 19, 2017, from https://www.searchtechnologies.com/blog/big-data-search-personalization
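The readings above all describe personalization as re-ranking: the same query returns different results depending on a stored user profile. Here is a minimal, hedged sketch of that idea; the documents, profiles, scores, and boost weight are all invented for illustration, and real systems are vastly more elaborate.

# Naive personalized re-ranking: boost results whose title terms overlap
# the user's (hypothetical) click-history terms.
def personalized_rank(results, profile_terms, boost=2.0):
    """Re-rank (title, base_score) pairs, boosting profile matches."""
    def score(item):
        title, base = item
        overlap = len(set(title.lower().split()) & profile_terms)
        return base + boost * overlap
    return sorted(results, key=score, reverse=True)

results = [("Climate policy explained", 1.0),
           ("Climate change hoax claims", 0.9),
           ("IPCC report summary", 0.8)]

# Two users, same query, different histories -> different top results:
# this is the mechanical core of a "filter bubble."
skeptic = {"hoax", "claims"}
scientist = {"ipcc", "report"}
print(personalized_rank(results, skeptic)[0][0])    # hoax article first
print(personalized_rank(results, scientist)[0][0])  # IPCC summary first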

gender

Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329-346. doi:10.1177/1461444815608807

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d121748152%26site%3dehost-live%26scope%3dsite

community detection algorithms:

Bedi, P., & Sharma, C. (2016). Community detection in social networks. WIREs Data Mining & Knowledge Discovery, 6(3), 115-135.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d114513548%26site%3dehost-live%26scope%3dsite

Cruz, J. D., Bothorel, C., & Poulet, F. (2014). Community detection and visualization in social networks: Integrating structural and semantic information. ACM Transactions on Intelligent Systems & Technology, 5(1), 1-26. doi:10.1145/2542182.2542193

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3daph%26AN%3d95584126%26site%3dehost-live%26scope%3dsite

Bai, X., Yang, P., & Shi, X. (2017). An overlapping community detection algorithm based on density peaks. Neurocomputing, 226, 7-15. doi:10.1016/j.neucom.2016.11.019

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d120321022%26site%3dehost-live%26scope%3dsite
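The papers above propose their own algorithms; as a minimal, hedged illustration of what “community detection” means in practice, the networkx library ships a standard modularity-based method that can be run on a classic benchmark graph in a few lines.

# Community detection with networkx's built-in greedy modularity
# algorithm (Clauset-Newman-Moore), run on Zachary's karate-club
# friendship network (34 members, 78 friendships).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()
communities = greedy_modularity_communities(G)

for i, members in enumerate(communities):
    print(f"community {i}: {sorted(members)}")
# The detected groups roughly track the club's real-world split
# into two factions.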

topic-detection algorithms:

Zeng, J., & Zhang, S. (2009). Incorporating topic transition in topic detection and tracking algorithms. Expert Systems with Applications, 36(1), 227-232. doi:10.1016/j.eswa.2007.09.013

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d34892957%26site%3dehost-live%26scope%3dsite

topic detection and tracking (TDT) algorithms based on topic models, such as latent Dirichlet allocation (LDA) and probabilistic latent semantic indexing (pLSI; https://en.wikipedia.org/wiki/Probabilistic_latent_semantic_analysis)
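As a small, hedged illustration of the topic-model building block (not the TDT algorithms from the papers cited here), scikit-learn’s LDA implementation can be run on a handful of invented toy documents:

# Toy LDA topic detection with scikit-learn; documents are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "election vote senate campaign ballot",
    "senate campaign debate election polls",
    "goals match league player transfer",
    "league player coach match season",
]

vec = CountVectorizer()
X = vec.fit_transform(docs)  # document-term count matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:3]]
    print(f"topic {k}: {top}")  # should roughly separate politics/sports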

Zhou, E., Zhong, N., & Li, Y. (2014). Extracting news blog hot topics based on the W2T Methodology. World Wide Web, 17(3), 377-404. doi:10.1007/s11280-013-0207-7

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d94609674%26site%3dehost-live%26scope%3dsite

The W2T (Wisdom Web of Things) methodology considers the information organization and management from the perspective of Web services, which contributes to a deep understanding of online phenomena such as users’ behaviors and comments in e-commerce platforms and online social networks.  (https://link.springer.com/chapter/10.1007/978-3-319-44198-6_10)

ethics of algorithms

Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 3(2), 2053951716679679. https://doi.org/10.1177/2053951716679679

journalism

Malyarov, N. (2016, October 18). Journalism in the age of algorithms, platforms and newsfeeds | News | FIPP.com. Retrieved September 19, 2017, from http://www.fipp.com/news/features/journalism-in-the-age-of-algorithms-platforms-newsfeeds

+++++++++++++++++
more on algorithms in this IMS blog
https://blog.stcloudstate.edu/ims?s=algorithm

see also

the Platform Transparency and Accountability Act

Meta, TikTok and YouTube may finally have to start sharing data with researchers

A Senate hearing this week and a new law in Europe show how “transparency” advocates are winning

The Platform Transparency and Accountability Act was introduced in December by an (ever-so-slightly) bipartisan group of senators.

“YouTube, TikTok, Telegram, and Snapchat represent some of the largest and most influential platforms in the United States, and they provide almost no functional transparency into their systems. And as a result, they avoid nearly all of the scrutiny and criticism that comes with it.”

When we do hear about what happens inside a tech company, it’s often because a Frances Haugen-type employee decides to leak it.

Cruz expressed great confusion about why he got relatively few new Twitter followers in the days before Elon Musk said he was going to buy it, but then got many more after the acquisition was announced.

The actual explanation is that Musk has lots of conservative fans, they flocked back to the platform when they heard he was buying it, and from there Twitter’s recommendation algorithms kicked into gear.

As usual, though, Europe is much further ahead of us. The Digital Services Act, which regulators reached an agreement on in April, includes provisions that would require big platforms to share data with qualified researchers. The law is expected to go into effect by next year. And so even if Congress dithers after today, transparency is coming to platforms one way or another. Here’s hoping it can begin to answer some very important questions.

new EU legislation for Google, Meta

Google, Meta, and others will have to explain their algorithms under new EU legislation

The Digital Services Act will reshape the online world

https://www.theverge.com/2022/4/23/23036976/eu-digital-services-act-finalized-algorithms-targeted-advertising

The EU has agreed on another ambitious piece of legislation to police the online world.

  • Targeted advertising based on an individual’s religion, sexual orientation, or ethnicity is banned. Minors cannot be subject to targeted advertising either.
  • “Dark patterns” — confusing or deceptive user interfaces designed to steer users into making certain choices — will be prohibited. The EU says that, as a rule, canceling subscriptions should be as easy as signing up for them.
  • Large online platforms like Facebook will have to make the working of their recommender algorithms (used for sorting content on the News Feed or suggesting TV shows on Netflix) transparent to users. Users should also be offered a recommender system “not based on profiling.” In the case of Instagram, for example, this would mean a chronological feed (as it introduced recently); see the minimal sketch after this list.
  • Hosting services and online platforms will have to explain clearly why they have removed illegal content as well as give users the ability to appeal such takedowns. The DSA itself does not define what content is illegal, though, and leaves this up to individual countries.
  • The largest online platforms will have to provide key data to researchers to “provide more insight into how online risks evolve.”
  • Online marketplaces must keep basic information about traders on their platform to track down individuals selling illegal goods or services.
  • Large platforms will also have to introduce new strategies for dealing with misinformation during crises (a provision inspired by the recent invasion of Ukraine).
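To make the “not based on profiling” option above concrete: the difference between a profiled, engagement-ranked feed and a chronological one comes down to the sort key. A minimal sketch follows; the post fields and engagement scores are invented for illustration.

# Two orderings over the same posts: a profiled, engagement-ranked feed
# vs. the chronological feed the DSA says users must be offered.
posts = [
    {"id": 1, "ts": 100, "predicted_engagement": 0.2},
    {"id": 2, "ts": 200, "predicted_engagement": 0.9},
    {"id": 3, "ts": 300, "predicted_engagement": 0.5},
]

def profiled_feed(posts):
    # Rank by a per-user engagement prediction (profiling).
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

def chronological_feed(posts):
    # Rank by recency only -- no per-user profile involved.
    return sorted(posts, key=lambda p: p["ts"], reverse=True)

print([p["id"] for p in profiled_feed(posts)])       # [2, 3, 1]
print([p["id"] for p in chronological_feed(posts)])  # [3, 2, 1]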

These tech companies have lobbied hard to water down the requirements in the DSA, particularly those concerning targeted advertising and handing over data to outside researchers.

Facebook and January 6

https://www.commondreams.org/news/2022/01/04/congress-could-help-prevent-another-jan-6-data-privacy-law-say-campaigners

“Facebook’s business model has evolved into social engineering via psychological warfare,” she declared. “The platform weaponizes user data to fuel algorithmic manipulation in order to maximize ad sales—not just for products, but for ideas like the disinformation that led to the conspiracy theories associated with the January 6 Capitol attack.”

“One thing is clear: Facebook and the other digital platforms that rely on an extractive business model will not change on their own,” the letter states. “Congress needs to step in.”

“The secretive collection, sale, and algorithmic manipulation of our personal data by platforms like Facebook must end,” he said. “It is a primary driver of the virality of the misinformation, hate speech, and online radicalization that people across the political spectrum are worried about.”

Instagram Carousel Posts

7 Tips For More Engaging, Top Performing Instagram Carousel Posts

https://www.searchenginejournal.com/instagram-carousel-tips/429880

Studies have shown that carousels are the most engaging type of post on the platform.

One of the most effective techniques: know your audience and talk directly to them as individuals.

Make sure you’re leading with the most compelling information or image for the reader, and think about “what’s in it for them” throughout the carousel.

Add a visual signal in the images, like an arrow pointing to the right in all but the final image.

You want every slide to stand on its own.

+++++++++++++
more on instagram in this IMS blog
https://blog.stcloudstate.edu/ims?s=instagram

automated proctoring

https://www.edsurge.com/news/2021-11-19-automated-proctoring-swept-in-during-pandemic-it-s-likely-to-stick-around-despite-concerns

A law student sued an automated proctoring company, students have complained about their use in student newspaper editorials, and professors have compared them to Big Brother.

ProctorU has decided not to sell software that uses algorithms to detect cheating.

A recent Educause study found that 63 percent of colleges and universities in the U.S. and Canada mention the use of remote proctoring on their websites.

One reason colleges are holding onto proctoring tools, Urdan adds, is that many colleges plan to expand their online course offerings even after campus activities return to normal. And the pandemic also saw rapid growth of another tech trend: students using websites to cheat on exams.

++++++++++++++++
More on proctoring in this blog
https://blog.stcloudstate.edu/ims?s=proctoring
