The report was produced by Project Information Literacy, a nonprofit research institute that explores how college students find, evaluate and use information. It was commissioned by the John S. and James L. Knight Foundation and the Harvard Graduate School of Education.
The study drew on focus groups and interviews with 103 undergraduates and 37 faculty members at eight U.S. colleges.
To better equip students for the modern information environment, the report recommends that faculty teach algorithm literacy in their classrooms. And given students’ reliance on learning from their peers when it comes to technology, the authors also suggest that students help co-design these learning experiences.
While informed, critically aware media users may see past the content suggested after a search on YouTube, Facebook, or Google, those without these skills, particularly young or inexperienced users, often fail to recognize the role of underlying algorithms in producing filter bubbles and echo chambers (Cohen, 2018).
Media literacy education is more important than ever. The urgency comes not only from calls to understand the effects of fake news, or from data breaches threatening personal information, but also from artificial intelligence systems designed to predict and project what consumers of social media are perceived to want.
It’s time to revisit the Eight Key Concepts of media literacy with an algorithmic focus.
Literacy in today’s online and offline environments “means being able to use the dominant symbol systems of the culture for personal, aesthetic, cultural, social, and political goals” (Hobbs & Jensen, 2018, p. 4).
Some news organizations, including the BBC, the New York Times and BuzzFeed, have made their own “deepfake” videos, ostensibly to spread awareness about the techniques. Those videos, while of varying quality, have all contained clear statements that they are fake.
The Poynter Institute – an enlightened non-profit in St. Petersburg, Fla., that has an ownership role in the Tampa Bay Times and provides research, training and educational resources on journalism – provides many excellent online modules to help citizens improve their news media literacy.
Citizens should support local and regional publications that hew to ethical journalism standards and cover local government entities.
Report: Florida, Ohio called ‘advanced leaders’ in K-12 media literacy efforts
Advocacy group Media Literacy Now says 14 states have laws with “some media-literacy language” and others will consider bills this year, but some say progress “is too slow.”
Media Literacy Now considers digital citizenship as part of media literacy — not the other way around
Nine states — California, Colorado, Connecticut, Illinois, Massachusetts, Minnesota, New Jersey, Rhode Island and Utah — are identified as “emerging leaders” for “beginning the conversation” and consulting with experts and others.
Calls for increased attention to media literacy skills and demand from educators for training in this area increased following an outbreak of “fake news” reports associated with the 2016 presidential election. Studies and assessments showing students are easily misled by digital information have also contributed to a sense of urgency.
While the topic can fit into multiple content areas, it can also be overlooked because of other pressures on teachers. Media literacy, the group notes, also “encompasses the foundational skills of digital citizenship and internet safety including the norms of appropriate, responsible, ethical, and healthy behavior, and cyberbullying prevention.”
Lawmakers in Missouri and South Carolina have also pre-filed versions of Media Literacy Now’s model bill, the report noted, and legislation is expected in Hawaii and Arizona.
That’s the nickname given to computer-created artificial videos or other digital material in which images are combined to create new footage that depicts events that never actually happened. The term originates from the online message board Reddit.
One early use of the fake videos was in amateur-created pornography, in which the faces of famous Hollywood actresses were digitally placed onto the bodies of other performers to make it appear as though the stars themselves were performing.
How difficult is it to create fake media?
It can be done with specialized software, experts say, the same way that editing programs such as Photoshop have made it simpler to manipulate still images. And specialized software itself isn’t necessary for what have been dubbed “shallow fakes” or “cheap fakes.”
Researchers also say they are working on new ways to speed up systems that help establish when video or audio has been manipulated. But it has been called a “cat and mouse” game, in which detection rarely keeps exact pace with fabrication.
Released on Friday, the Zao app went viral as Chinese users seized on the chance to see themselves act out scenes from well-known movies using deepfake technology, which has already prompted concerns elsewhere over potential misuse.
As of Monday afternoon it remained the top free download in China, according to the app market data provider App Annie.
Concerns over deepfakes have grown since the 2016 US election campaign, which saw wide use of online misinformation, according to US investigations.
In June, Facebook’s chief executive, Mark Zuckerberg, said the social network was struggling to find ways to deal with deepfake videos, saying they may constitute “a completely different category” of misinformation than anything faced before.