
Algorithmic Test Proctoring

Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education

SHEA SWAUGER ED-TECH

https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/

While in-person test proctoring has been used to combat test-based cheating, this can be difficult to translate to online courses. Ed-tech companies have sought to address this concern by offering to watch students take online tests, in real time, through their webcams.

Some of the more prominent companies offering these services include Proctorio, Respondus, ProctorU, HonorLock, Kryterion Global Testing Solutions, and Examity.

Algorithmic test proctoring’s settings have discriminatory consequences across multiple identities and serious privacy implications. 

While racist technology calibrated for white skin isn’t new (everything from photography to soap dispensers do this), we see it deployed through face detection and facial recognition used by algorithmic proctoring systems.

While some test proctoring companies develop their own facial recognition software, most purchase software developed by other companies. These technologies generally function similarly and have shown a consistent inability to identify people with darker skin, or even to distinguish between Chinese faces. Facial recognition literally encodes the invisibility of Black people and the racist stereotype that all Asian people look the same.

As Os Keyes has demonstrated, facial recognition has a terrible history with gender. This means that a software asking students to verify their identity is compromising for students who identify as trans, non-binary, or express their gender in ways counter to cis/heteronormativity.

These features and settings create a system of asymmetric surveillance and lack of accountability, things which have always created a risk for abuse and sexual harassment. Technologies like these have a long history of being abused, largely by heterosexual men at the expense of women’s bodies, privacy, and dignity.

Their promotional messaging functions similarly to dog-whistle politics, which is commonly used in anti-immigration rhetoric. It’s also not a coincidence that these technologies are being used to exclude people not wanted by an institution; biometrics and facial recognition have been connected to anti-immigration policies, supported by both Republican and Democratic administrations, going back to the 1990s.

Borrowing from Henry A. Giroux, Kevin Seeber describes the pedagogy of punishment and some of its consequences in regards to higher education’s approach to plagiarism in his book chapter “The Failed Pedagogy of Punishment: Moving Discussions of Plagiarism beyond Detection and Discipline.”

my note: I have been repeating this for years
Sean Michael Morris and Jesse Stommel’s ongoing critique of Turnitin, a plagiarism detection software, outlines exactly how this logic operates in ed-tech and higher education: 1) don’t trust students, 2) surveil them, 3) ignore the complexity of writing and citation, and 4) monetize the data.

Technological Solutionism

Cheating is not a technological problem, but a social and pedagogical problem.
Our habit of believing that technology will solve pedagogical problems is endemic to narratives produced by the ed-tech community and, as Audrey Watters writes, is tied to the Silicon Valley culture that often funds it. Scholars have been dismantling the narrative of technological solutionism and neutrality for some time now. In her book “Algorithms of Oppression,” Safiya Umoja Noble demonstrates how the algorithms that are responsible for Google Search amplify and “reinforce oppressive social relationships and enact new modes of racial profiling.”

Anna Lauren Hoffmann coined the term “data violence” to describe the impact harmful technological systems have on people, and how these systems retain the appearance of objectivity despite the disproportionate harm they inflict on marginalized communities.

This system of measuring bodies and behaviors, associating certain bodies and behaviors with desirability and others with inferiority, engages in what Lennard J. Davis calls the Eugenic Gaze.

Higher education is deeply complicit in the eugenics movement. Nazism borrowed many of its ideas about racial purity from the American school of eugenics, and universities were instrumental in supporting eugenics research by publishing copious literature on it, establishing endowed professorships, institutes, and scholarly societies that spearheaded eugenic research and propaganda.

+++++++++++++++++
more on privacy in this IMS blog
http://blog.stcloudstate.edu/ims?s=privacy

algorithm literacy

Report: Colleges Must Teach ‘Algorithm Literacy’ to Help Students Navigate Internet

By Rebecca Koenig     Jan 16, 2020

https://www.edsurge.com/news/2020-01-16-report-colleges-must-teach-algorithm-literacy-to-help-students-navigate-internet

The report comes from Project Information Literacy, a nonprofit research institute that explores how college students find, evaluate and use information. It was commissioned by the John S. and James L. Knight Foundation and The Harvard Graduate School of Education.

The findings draw on focus groups and interviews with 103 undergraduates and 37 faculty members from eight U.S. colleges.

To better equip students for the modern information environment, the report recommends that faculty teach algorithm literacy in their classrooms. And given students’ reliance on learning from their peers when it comes to technology, the authors also suggest that students help co-design these learning experiences.

Algorithms and Media Literacy

While informed and critically aware media users may see past the resulting content found in suggestions provided after conducting a search on YouTube, Facebook, or Google, those without these skills, particularly young or inexperienced users, fail to realize the culpability of underlying algorithms in the resultant filter bubbles and echo chambers (Cohen, 2018).
Media literacy education is more important than ever. It’s not just the overwhelming calls to understand the effects of fake news or to address data breaches threatening personal information; it is also the artificial intelligence systems being designed to predict and project what consumers of social media are perceived to want.
It’s time to revisit the Eight Key Concepts of media literacy with an algorithmic focus.
Literacy in today’s online and offline environments “means being able to use the dominant symbol systems of the culture for personal, aesthetic, cultural, social, and political goals” (Hobbs & Jensen, 2018, p 4).

+++++++++++++++++++

++++++++++++++++++++

Information Literacy in an Age of Algorithms from Kristen Yarmey

++++++++++++++++++++

Artificial Intelligence Literacy from Rogelio E. Cardona-Rivera

+++++++++++++
more on media literacy in this IMS blog
http://blog.stcloudstate.edu/ims?s=media+literacy

more on news literacy in this IMS blog
http://blog.stcloudstate.edu/ims?s=news+literate

education algorithms

https://www.edsurge.com/news/2016-06-10-humanizing-education-s-algorithms

predictive algorithms to better target students’ individual learning needs.

Personalized learning is a lofty aim, however you define it. To truly meet each student where they are, we would have to know their most intimate details, or discover them through their interactions with our digital tools. We would need to track their moods and preferences, their fears and beliefs…perhaps even their memories.

There’s something unsettling about capturing users’ most intimate details. Any prediction model based on historical records risks typecasting the very people it is intended to serve. Even if models can overcome the threat of discrimination, there is still an ethical question to confront – just how much are we entitled to know about students?

We can accept that tutoring algorithms, for all their processing power, are inherently limited in what they can account for. This means steering clear of mythical representations of what such algorithms can achieve. It may even mean giving up on personalization altogether. The alternative is to pack our algorithms to suffocation at the expense of users’ privacy. This approach does not end well.

There is only one way to resolve this trade-off: loop in the educators.

Algorithms and data must exist to serve educators

 

++++++++++++
more on algorithms in this IMS blog
http://blog.stcloudstate.edu/ims?s=algor

social media algorithms

How algorithms impact our browsing behavior? browsing history?
What is the connection between social media algorithms and fake news?
Are there topic-detection algorithms as they are community-detection ones?
How can I change the content of a [Google] search return? Can I? 

Larson, S. (2016, July 8). What is an Algorithm and How Does it Affect You? The Daily Dot. Retrieved from https://www.dailydot.com/debug/what-is-an-algorithm/
Berg, P. (2016, June 30). How Do Social Media Algorithms Affect You | Forge and Smith. Retrieved September 19, 2017, from https://forgeandsmith.com/how-do-social-media-algorithms-affect-you/
Oremus, W., & Chotiner, I. (2016, January 3). Who Controls Your Facebook Feed. Slate. Retrieved from http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html
Lehrman, R. A. (2013, August 11). The new age of algorithms: How it affects the way we live. Christian Science Monitor. Retrieved from https://www.csmonitor.com/USA/Society/2013/0811/The-new-age-of-algorithms-How-it-affects-the-way-we-live
Johnson, C. (2017, March 10). How algorithms affect our way of life. Desert News. Retrieved from https://www.deseretnews.com/article/865675141/How-algorithms-affect-our-way-of-life.html
Understanding algorithms and their impact on human life goes far beyond basic digital literacy, some experts said.
An example could be the recent outcry over Facebook’s news algorithm, which enhances the so-called “filter bubble” of information.
personalized search (https://en.wikipedia.org/wiki/Personalized_search)
Kounine, A. (2016, August 24). How your personal data is used in personalization and advertising. Retrieved September 19, 2017, from https://www.tastehit.com/blog/personal-data-in-personalization-and-advertising/
Hotchkiss, G. (2007, March 9). The Pros & Cons Of Personalized Search. Retrieved September 19, 2017, from http://searchengineland.com/the-pros-cons-of-personalized-search-10697
Magid, L. (2012). How (and why) To Turn Off Google’s Personalized Search Results. Forbes. Retrieved from https://www.forbes.com/sites/larrymagid/2012/01/13/how-and-why-to-turn-off-googles-personalized-search-results/#53a30be838f2
Nelson, P. (n.d.). Big Data, Personalization and the No-Search of Tomorrow. Retrieved September 19, 2017, from https://www.searchtechnologies.com/blog/big-data-search-personalization

gender

Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329-346. doi:10.1177/1461444815608807

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d121748152%26site%3dehost-live%26scope%3dsite

community detection algorithms:

Bedi, P., & Sharma, C. (2016). Community detection in social networks. WIREs: Data Mining & Knowledge Discovery, 6(3), 115-135.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d114513548%26site%3dehost-live%26scope%3dsite

Cruz, J. D., Bothorel, C., & Poulet, F. (2014). Community detection and visualization in social networks: Integrating structural and semantic information. ACM Transactions on Intelligent Systems & Technology, 5(1), 1-26. doi:10.1145/2542182.2542193

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3daph%26AN%3d95584126%26site%3dehost-live%26scope%3dsite

Bai, X., Yang, P., & Shi, X. (2017). An overlapping community detection algorithm based on density peaks. Neurocomputing, 226, 7-15. doi:10.1016/j.neucom.2016.11.019

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d120321022%26site%3dehost-live%26scope%3dsite
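
The papers above share a common goal: partition a network so that nodes are more densely connected inside their group than between groups. As a minimal sketch of the idea (label propagation, a classic simple method, not the specific algorithms in these papers), in plain Python:

```python
from collections import Counter

def label_propagation(adj, max_iters=100):
    """Synchronous label propagation: every node repeatedly adopts the most
    common label among its neighbors (ties broken by the smallest label)
    until no label changes. Densely connected groups converge to one label."""
    labels = {node: node for node in adj}
    for _ in range(max_iters):
        new = {}
        for node, nbrs in adj.items():
            counts = Counter(labels[n] for n in nbrs)
            best = max(counts.values())
            new[node] = min(l for l, c in counts.items() if c == best)
        if new == labels:
            break
        labels = new
    # group nodes by their final label
    communities = {}
    for node, lab in labels.items():
        communities.setdefault(lab, set()).add(node)
    return list(communities.values())

# Two 4-cliques joined by a single bridge edge (3, 4).
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7),
         (3, 4)]
adj = {i: set() for i in range(8)}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

print(sorted(map(sorted, label_propagation(adj))))
# → [[0, 1, 2, 3], [4, 5, 6, 7]]
```

The bridge edge is outvoted by each clique’s internal edges, so the two cliques settle into separate communities.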

topic-detection algorithms:

Zeng, J., & Zhang, S. (2009). Incorporating topic transition in topic detection and tracking algorithms. Expert Systems With Applications, 36(1), 227-232. doi:10.1016/j.eswa.2007.09.013

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d34892957%26site%3dehost-live%26scope%3dsite

topic detection and tracking (TDT) algorithms based on topic models, such as LDA, pLSI (https://en.wikipedia.org/wiki/Probabilistic_latent_semantic_analysis), etc.
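
Both LDA and pLSI model each document as a mixture of latent topics. A bare-bones pLSI trained with EM (a toy sketch on a made-up four-document corpus, not production code) shows the idea: the E-step computes the posterior P(topic | doc, word), and the M-step renormalizes the expected counts into P(topic | doc) and P(word | topic).

```python
import random
from collections import defaultdict

def plsi(docs, n_topics=2, n_iters=50, seed=0):
    """Minimal pLSI via EM: learns P(topic|doc) and P(word|topic)."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})

    def rand_dist(n):
        # random normalized distribution, to break symmetry at init
        xs = [rng.random() + 1e-3 for _ in range(n)]
        s = sum(xs)
        return [x / s for x in xs]

    p_z_d = [rand_dist(n_topics) for _ in docs]                    # P(z|d)
    p_w_z = [{w: p for w, p in zip(vocab, rand_dist(len(vocab)))}  # P(w|z)
             for _ in range(n_topics)]
    for _ in range(n_iters):
        nd = [[0.0] * n_topics for _ in docs]               # topic mass per doc
        nw = [defaultdict(float) for _ in range(n_topics)]  # word mass per topic
        for d, doc in enumerate(docs):
            for w in doc:
                # E-step: P(z|d,w) ∝ P(z|d) * P(w|z)
                post = [p_z_d[d][z] * p_w_z[z][w] for z in range(n_topics)]
                s = sum(post) or 1.0
                for z in range(n_topics):
                    nd[d][z] += post[z] / s
                    nw[z][w] += post[z] / s
        # M-step: renormalize expected counts into distributions
        p_z_d = [[c / sum(row) for c in row] for row in nd]
        p_w_z = []
        for z in range(n_topics):
            total = sum(nw[z].values()) or 1.0
            p_w_z.append({w: nw[z].get(w, 0.0) / total for w in vocab})
    return p_z_d, p_w_z

docs = [["cat", "cat", "pet"], ["dog", "pet", "dog"],
        ["stock", "market", "stock"], ["market", "trade", "stock"]]
p_z_d, p_w_z = plsi(docs)
for d, dist in enumerate(p_z_d):
    print(d, [round(p, 2) for p in dist])
```

LDA adds Dirichlet priors over these same distributions, which is what makes it generalize to unseen documents.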

Zhou, E., Zhong, N., & Li, Y. (2014). Extracting news blog hot topics based on the W2T methodology. World Wide Web, 17(3), 377-404. doi:10.1007/s11280-013-0207-7

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d94609674%26site%3dehost-live%26scope%3dsite

The W2T (Wisdom Web of Things) methodology considers the information organization and management from the perspective of Web services, which contributes to a deep understanding of online phenomena such as users’ behaviors and comments in e-commerce platforms and online social networks.  (https://link.springer.com/chapter/10.1007/978-3-319-44198-6_10)

ethics of algorithm

Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 3(2), 2053951716679679. https://doi.org/10.1177/2053951716679679

journalism

Malyarov, N. (2016, October 18). Journalism in the age of algorithms, platforms and newsfeeds | News | FIPP.com. Retrieved September 19, 2017, from http://www.fipp.com/news/features/journalism-in-the-age-of-algorithms-platforms-newsfeeds

+++++++++++++++++
http://blog.stcloudstate.edu/ims?s=algorithm
more on algorithms in this IMS blog

see also

smart anonymization

This startup claims its deepfakes will protect your privacy

But some experts say that D-ID’s “smart video anonymization” technique breaks the law.

https://www.technologyreview.com/s/614983/this-startup-claims-its-deepfakes-will-protect-your-privacy/

The upside for businesses is that this new, “anonymized” video no longer gives away the exact identity of a customer—which, Perry says, means companies using D-ID can “eliminate the need for consent” and analyze the footage for business and marketing purposes. A store might, for example, feed video of a happy-looking white woman to an algorithm that can surface the most effective ad for her in real time.

Three leading European privacy experts who spoke to MIT Technology Review voiced their concerns about D-ID’s technology and its intentions. All say that, in their opinion, D-ID actually violates GDPR.

Surveillance is becoming more and more widespread. A recent Pew study found that most Americans think they’re constantly being tracked but can’t do much about it, and the facial recognition market is expected to grow from around $4.5 billion in 2018 to $9 billion by 2024. Still, the reality of surveillance isn’t keeping activists from fighting back.

++++++++++++
more on deep fake in this IMS blog
http://blog.stcloudstate.edu/ims?s=deep+fake

digital media misinformation

Digital Media Has a Misinformation Problem—but It’s an Opportunity for Teaching.

Jennifer Sparrow    Dec 13, 2018

https://www.edsurge.com/news/2018-12-13-digital-media-has-a-misinformation-problem-but-it-s-an-opportunity-for-teaching

Research has shown that 50 percent of college students spend a minimum of five hours each week on social media. These social channels feed information from news outlets, private bloggers, friends and family, and myriad other sources that are often curated based on the user’s interests. But what really makes social media a tricky resource for students and educators alike is that most companies don’t view themselves as content publishers. This position essentially absolves social media platforms of the responsibility to monitor what their users share, and that can allow false, even harmful, information to circulate.

“How do we help students become better consumers of information, data, and communication?” Fluency in each of these areas is integral to 21st-century citizenry, for which we must prepare students.

In English 202C, a technical writing course, students use our Invention Studio and littleBits to practice inventing their own electronic devices, write instructions for how to construct the device, and have classmates reproduce the invention.

The proliferation of mobile devices and high-speed Wi-Fi have made videos a common outlet for information-sharing. To keep up with the changing means of communication, Penn State campuses are equipped with One Button Studio, where students can learn to produce professional-quality video. With this, students must learn how to take information and translate it into a visual medium in a way that will best benefit the intended audience. They can also use the studios to hone their presentation or interview skills by recording practice sessions and then reviewing the footage.
++++++++++++++
more on digital media in this IMS blog
http://blog.stcloudstate.edu/ims?s=digital+media

NLP and ACL

NLP – natural language processing; ACL – Association for Computational Linguistics (ACL 2019)

Major trends in NLP: a review of 20 years of ACL research

Janna Lipenkova, July 23, 2019

https://www.linkedin.com/pulse/major-trends-nlp-review-20-years-acl-research-janna-lipenkova

The 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019)

 Data: working around the bottlenecks

Large data is inherently noisy. In general, the more “democratic” the production channel, the dirtier the data – which means that more effort has to be spent on its cleaning. For example, data from social media will require a longer cleaning pipeline. Among others, you will need to deal with extravagancies of self-expression like smileys and irregular punctuation, which are normally absent in more formal settings such as scientific papers or legal contracts.
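
A cleaning pipeline of the kind described might look like this toy sketch (the regexes and rules are illustrative assumptions; real pipelines are task-specific): strip URLs, ASCII smileys, and emoji, then collapse irregular repeated punctuation.

```python
import re

# Illustrative patterns for noisy social-media text (not exhaustive):
URL = re.compile(r"https?://\S+")                     # links
SMILEY = re.compile(r"[:;=8][-o^']?[)(DPp/\\|]")      # ASCII smileys like :) ;-P
EMOJI = re.compile(r"[\U0001F300-\U0001FAFF\u2600-\u27BF]")  # common emoji blocks
REPEAT_PUNCT = re.compile(r"([!?.,])\1+")             # "!!!" -> "!"

def clean(text):
    """Run the cleaning steps in order, then normalize whitespace."""
    text = URL.sub(" ", text)
    text = SMILEY.sub(" ", text)
    text = EMOJI.sub(" ", text)
    text = REPEAT_PUNCT.sub(r"\1", text)
    return re.sub(r"\s+", " ", text).strip()

print(clean("soooo good!!! :) check https://t.co/abc 😀"))
# → "soooo good! check"
```

Whether to also normalize elongations like “soooo” depends on the downstream task; for sentiment analysis they can carry signal.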

The other major challenge is the labeled data bottleneck

One set of responses is crowd-sourcing and Training Data as a Service (TDaaS). On the other hand, a range of automatic workarounds for the creation of annotated datasets have also been suggested in the machine learning community.

Algorithms: a chain of disruptions in Deep Learning

Neural Networks are the workhorse of Deep Learning (cf. Goldberg and Hirst (2017) for an introduction to the basic architectures in the NLP context). Convolutional Neural Networks have seen an increase in use in the past years, whereas the popularity of the traditional Recurrent Neural Network (RNN) is dropping. This is due, on the one hand, to the availability of more efficient RNN-based architectures such as LSTM and GRU. On the other hand, a new and quite disruptive mechanism for sequential processing – attention – has been introduced for the sequence-to-sequence (seq2seq) model of Sutskever et al. (2014) by Bahdanau et al. (2015).
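
At its core, the attention mechanism mentioned here reduces to scoring a query against a set of keys and returning a weighted average of the corresponding values. A minimal scaled dot-product sketch in plain Python (toy vectors, single query; real implementations are batched tensor ops):

```python
import math

def softmax(xs):
    m = max(xs)                        # subtract max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(query, keys, values):
    """Scaled dot-product attention for one query: score each key against
    the query, softmax the scores, return the weighted sum of values."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# The query matches the first key best, so the output leans toward
# the first value vector.
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
out = attention([1.0, 0.0], keys, values)
print([round(x, 2) for x in out])
# → [6.7, 3.3]
```

The point of the mechanism is that every position in the input can contribute to every output step, weighted by learned similarity, instead of being squeezed through a single fixed-size RNN state.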

Consolidating various NLP tasks

the three “global” NLP development curves – syntax, semantics and context awareness
the third curve – the awareness of a larger context – has already become one of the main drivers behind new Deep Learning algorithms.

A note on multilingual research

Think of different languages as different lenses through which we view the same world – they share many properties, a fact that is fully accommodated by modern learning algorithms with their increasing power for abstraction and generalization.

Spurred by the global AI hype, the NLP field is exploding with new approaches and disruptive improvements. There is a shift towards modeling meaning and context dependence, probably the most universal and challenging fact of human language. The generalisation power of modern algorithms allows for efficient scaling across different tasks, languages and datasets, thus significantly speeding up the ROI cycle of NLP developments and allowing for a flexible and efficient integration of NLP into individual business scenarios.

break up Facebook

https://nyti.ms/2LzRzwq

Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 percent of voting shares. Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.

We are a nation with a tradition of reining in monopolies, no matter how well intentioned the leaders of these companies may be. Mark’s power is unprecedented and un-American.

It is time to break up Facebook.

America was built on the idea that power should not be concentrated in any one person, because we are all fallible. That’s why the founders created a system of checks and balances.

More legislation followed in the 20th century, creating legal and regulatory structures to promote competition and hold the biggest companies accountable.

Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones. Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective.

American industries, from airlines to pharmaceuticals, have experienced increased concentration, and the average size of public companies has tripled. The results are a decline in entrepreneurship, stalled productivity growth, and higher prices and fewer choices for consumers.

From our earliest days, Mark used the word “domination” to describe our ambitions, with no hint of irony or humility.

Facebook’s monopoly is also visible in its usage statistics. About 70 percent of American adults use social media, and a vast majority are on Facebook products. Over two-thirds use the core site, a third use Instagram, and a fifth use WhatsApp. By contrast, fewer than a third report using Pinterest, LinkedIn or Snapchat. What started out as lighthearted entertainment has become the primary way that people of all ages communicate online.

The F.T.C.’s biggest mistake was to allow Facebook to acquire Instagram and WhatsApp. In 2012, the newer platforms were nipping at Facebook’s heels because they had been built for the smartphone, where Facebook was still struggling to gain traction. Mark responded by buying them, and the F.T.C. approved.

The News Feed algorithm reportedly prioritized videos created through Facebook over videos from competitors, like YouTube and Vimeo. In 2012, Twitter introduced a video network called Vine that featured six-second videos. That same day, Facebook blocked Vine from hosting a tool that let its users search for their Facebook friends while on the new network. The decision hobbled Vine, which shut down four years later.

Unlike Vine, Snapchat wasn’t interfacing with the Facebook ecosystem; there was no obvious way to handicap the company or shut it out. So Facebook simply copied it. (Copyright law does not extend to the abstract concept itself.)

As markets become more concentrated, the number of new start-up businesses declines. This holds true in other high-tech areas dominated by single companies, like search (controlled by Google) and e-commerce (taken over by Amazon). Meanwhile, there has been plenty of innovation in areas where there is no monopolistic domination, such as in workplace productivity (Slack, Trello, Asana), urban transportation (Lyft, Uber, Lime, Bird) and cryptocurrency exchanges (Ripple, Coinbase, Circle).

The choice is mine, but it doesn’t feel like a choice. Facebook seeps into every corner of our lives to capture as much of our attention and data as possible and, without any alternative, we make the trade.

Just last month, Facebook seemingly tried to bury news that it had stored tens of millions of user passwords in plain text format, which thousands of Facebook employees could see. Competition alone wouldn’t necessarily spur privacy protection — regulation is required to ensure accountability — but Facebook’s lock on the market guarantees that users can’t protest by moving to alternative platforms.

Mark used to insist that Facebook was just a “social utility,” a neutral platform for people to communicate what they wished. Now he recognizes that Facebook is both a platform and a publisher and that it is inevitably making decisions about values. The company’s own lawyers have argued in court that Facebook is a publisher and thus entitled to First Amendment protection.

As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns.

Mark may never have a boss, but he needs to have some check on his power. The American government needs to do two things: break up Facebook’s monopoly and regulate the company to make it more accountable to the American people.

++++++++++++++++++++

We Don’t Need Social Media

The push to regulate or break up Facebook ignores the fact that its services do more harm than good

Colin Horgan, May 13, 2019

https://onezero.medium.com/we-dont-need-social-media-53d5455f4f6b

Hughes joins a growing chorus of former Silicon Valley unicorn riders who’ve recently had second thoughts about the utility or benefit of the surveillance-attention economy their products and platforms have helped create. He is also not the first to suggest that government might need to step in to clean up the mess they made

Nick Srnicek, author of the book Platform Capitalism and a lecturer in digital economy at King’s College London, wrote last month, “[I]t’s competition — not size — that demands more data, more attention, more engagement and more profits at all costs.”

 

++++++++++++++++++++
more on Facebook in this IMS blog
http://blog.stcloudstate.edu/ims?s=facebook

ARLD 2019

ARLD 2019

Paul Goodman

Technology is a branch of moral philosophy, not of science

The process of making technology is design

Design is a branch of moral philosophy, not of a science

 

System design reflects the designer’s values and the cultural content

Andreas Orphanides

 

Fulbright BOYD

 

A Bulgarian professor of Byzantine history: all that is less than 200 years old is politics, not history.

 

Access, privacy, equity, values for the prof organization ARLD.

 

Mike Monteiro

This is how bad design makes it out into the world: not due to malicious intent, but with no intent at all.

 

Cody Hanson

Our expertise, our service ethic, and our values remain our greatest strengths. But for us to have the impact we seek on the lives of our users, we must encode our services and our values into the software.

Ethical design.

Design interprets the world to create useful objects. Ethical design closes the loop, imagining how those objects will affect the world.

 

A good science fiction story should be able to predict not the automobile, but the traffic jam. – Frederik Pohl

Victor Papanek The designer’s social and moral judgement must be brought into play long before she begins to design.

 

We need to fear the consequences of our work more than we love the cleverness of our ideas. – Mike Monteiro

Analytics

Qualitative and quantitative data – librarians love data: usage, ILL, course reserves – QQML.

IDEO – the goal of design research isn’t to collect data, it is to synthesize information and provide insight and guidance that leads to action.

Google Analytics: the trade-off, besides privacy concerns – sometimes data and analytics are the only thing we can see.

Frank Chimero – remove a person’s humanity and she is just a curiosity, a pinpoint on a map, a line in a list, an entry in a database; a person turns into a granular bit of information.

Gale Analytics On Demand – similar to the keynote speaker at Macalester LibTech 2019. https://www.facebook.com/InforMediaServices/posts/1995793570531130?comment_id=1995795043864316&comment_tracking=%7B%22tn%22%3A%22R%22%7D

personas

By designing for yourself or your team, you are potentially building discrimination right into your product. – Erika Hall

Search algorithms.

What is relevance? The relevance of the ranking algorithm – for whom (which patron)? Crummy searches.
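
As a toy illustration that “relevance” is an algorithmic choice rather than a neutral fact: a bare-bones TF-IDF ranker (illustrative only, far simpler than a real search engine) scores documents purely by term statistics, and every design decision in that formula shapes what patrons see first.

```python
import math
from collections import Counter

def tfidf_rank(query, docs):
    """Rank documents for a query by summed TF-IDF scores of query terms."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    # document frequency: how many docs each term appears in
    df = Counter(w for toks in tokenized for w in set(toks))

    def score(toks):
        tf = Counter(toks)
        return sum(tf[w] * math.log(n / df[w])
                   for w in query.lower().split() if w in df)

    return sorted(range(n), key=lambda i: score(tokenized[i]), reverse=True)

docs = ["library database search tips",
        "database indexing and ranking algorithms",
        "campus parking information"]
print(tfidf_rank("database ranking", docs))
# → [1, 0, 2]
```

Even here, “relevance” is just term overlap weighted by rarity; anything the formula doesn’t measure (authority, bias, the patron’s actual need) is invisible to the ranking.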

Reckless associations – made by humans or computers – can do very real harm, especially when they appear in supposedly neutral environments.

Donna Lanclos and Andrew Asher: ethnography should be core to the business of the library.

Technology as information ecology: we co-evolve with it. Be prepared to start asking questions to see the effect of our design choices.

Ethnography of the library: touch-point tours – a student gives a tour to the librarians or draws a map of the library, giving a sense of what spaces they use and what is important; “ethnographish.”

Q from the audience: if instructors warn against Google and Wikipedia and steer students to the library and databases, how do you now warn about the perils of database bias? A: put fires down, and systematically try to build it into existing initiatives: the bi-annual magazine, in as many places as possible.
