Misinformation and disinformation are rife, but so far they have been seen as a challenge for policy-makers and big tech, including social media platforms.
The sheer volume of data being created makes it hard to tell what’s real and what’s not. From attacks on 5G towers to conspiracy theories like QAnon and unfounded concerns about election fraud, distrust is becoming the default – and this can have incredibly damaging effects on society.
So far, the tech sector – primarily social media companies, given that their platforms enable fake news to spread exponentially – has tried to implement some measures, with varying levels of success. For example, WhatsApp has placed a stricter limit on its message-forwarding capability and Twitter has begun to flag misleading posts.
There has also been a rise of tech startups that are exploring ways to detect and stem the flow of disinformation, such as Right of Reply, Astroscreen and Logically.
It has been created by University of Cambridge psychologists with support from the US Department of State’s Global Engagement Center and Department of Homeland Security Cybersecurity and Infrastructure Security Agency (CISA).
Last year, researchers at Oxford University found that 70 countries had run political disinformation campaigns over the previous two years.
Perhaps the most notable of such campaigns was that initiated by a Russian propaganda group to influence the 2016 US election result.
The US Federal Communications Commission hosted a period in 2017 where the public could comment on its plans to repeal net neutrality. Harvard Kennedy School lecturer Bruce Schneier notes that while the agency received 22 million comments, many of them were made by fake identities.
Schneier argues that the escalating prevalence of computer-generated personas could “starve” people of democracy.
In Media Manipulation and Disinformation Online, Marwick and Lewis (2017) of the Data & Society Research Institute described the agents of media manipulation, their modus operandi, their motivators, and how they have taken advantage of the vulnerability of online media. The researchers described the manipulators as right-wing extremists (RWE), also known as the alt-right, who run the gamut from sexists (including male sexual-conquest communities) to white nationalists to anti-immigration activists, and even those who reject the RWE label but whose actions warrant such classification. These manipulators rally behind shared beliefs on online forums, blogs, podcasts, and social media through pranks or ruinous anonymous trolling, usurping participatory-culture methods (networking, humor, mentorship) for harassment, and forming competitive cyber brigades that earn status by escalating bullying, such as sharing a target’s private information.
Marwick and Lewis reported on how RWE groups have exploited certain media tactics to gain viewers’ attention – such as novelty and sensationalism – as well as their interactions with the public via social media, to manipulate coverage for their agenda. For instance, YouTube provides any individual with a portal, and potential revenue, for contributing to the media ecosystem. The researchers cited the example of conspiracy theorists on YouTube, whose content can serve as fodder for extremist networks, since conspiracies generally focus on loss of control over important ideals, health, and safety.
One tactic they are using is to package their hate in a way that appeals to millennials. They use attention hacking to increase their status – for example, hate speech that is later recanted as trickster trolling – all the while gaining the media’s attention for further propagation.
SHARED MODUS OPERANDI
Marwick and Lewis reported the following shared tactics various RWE groups use for online exploits:
Ambiguity of persona or ideology,
Baiting a single or community target’s emotions,
Bots for amplification of propaganda that appears to come legitimately from a real person,
“…Embeddedness in Internet culture…” (p. 28),
Exploitation of young male rebelliousness,
Hate speech and offensive language (under the guise of First Amendment protections),
Irony to cloak ideology and/or skewer intended targets,
Memes for stickiness of propaganda,
Mentorship in argumentation, marketing strategies, and subversive literature in their communities of interest,
Networked and agile groups,
“…Permanent warfare…” (p. 12) calls to action,
Pseudo-scholarship to deceive readers,
“…Quasi moral arguments…” (p. 7),
Shocking images for filtering network membership,
“Trading stories up the chain…” (p. 38) from low-level news outlets to mainstream, and
Trolling others with asocial behavior.
Consider the teenagers in Veles, Macedonia, who profited around $16,000 per month via Google’s AdSense from Facebook post engagements.
There is also a long history of mistrust of mainstream media.
If you’re a college instructor of communications or teach digital literacy as a librarian, see the corresponding syllabus for this article. It provides discussion questions and assignments for teaching students about media manipulation. To teach your students how to combat fake news online, see my post on Navigating Post-Truth Societies: Strategies, Resources, and Technologies.
Tom Dickinson describes four different types of distributed ‘fake news’.
‘Fake news’ is lazy language. Be specific. Do you mean: A) Propaganda B) Disinformation C) Conspiracy theory D) Clickbait
The RAND Corporation, a US think-tank with strong ties to the military industrial complex, recently looked at the influence of the Russian Propaganda Model and how best to deal with it.
Three factors have been shown to increase the (limited) effectiveness of retractions and refutations: (1) warnings at the time of initial exposure to misinformation, (2) repetition of the retraction or refutation, and (3) corrections that provide an alternative story to help fill the resulting gap in understanding when false ‘facts’ are removed.
Critical thinking requires us to constantly question assumptions, especially our own. To develop these skills, questioning must be encouraged. This runs counter to most schooling and training practices. When do students or employees get to question underlying assumptions of their institutions? If they cannot do this, how can we expect them to challenge various and pervasive types of ‘fake news’?
+++++ Thank you for covering this information at home. Please don’t forget to bring your questions and ideas to class +++++
Why are we here today? We need to look deeper into the current 21st-century state of information and disinformation and determine how such awareness can help policy analysis. How do we make up our minds about news and information? Where do we get our information from? Whom do we believe, and whom do we mistrust?
How do these three items support a better analysis of policies?
Class assignment:
Share a topic that is close to your heart.
Please feel welcome to use the following resources, and/or contribute your own resources, to determine the sources and their potential bias.
Feel free also to use the following guidelines when establishing the veracity of information:
Here is a short (4 min) video introducing you to the well-known basics for evaluation of academic literature: https://youtu.be/qUd_gf2ypk4
ACCURACY
Does the author cite reliable sources?
How does the information compare with that in other works on the topic?
Can you determine if the information has gone through peer-review?
Are there factual, spelling, typographical, or grammatical errors?
AUDIENCE
Who do you think the authors are trying to reach?
Are the language, vocabulary, style and tone appropriate for the intended audience?
What are the audience demographics? (age, educational level, etc.)
Are the authors targeting a particular group or segment of society?
AUTHORITY
Who wrote the information found in the article or on the site?
What are the author’s credentials/qualifications for this particular topic?
Is the author affiliated with a particular organization or institution?
What does that affiliation suggest about the author?
CURRENCY
Is the content current?
Does the date of the information directly affect the accuracy or usefulness of the information?
OBJECTIVITY/BIAS
What is the author’s or website’s point of view?
Is the point of view subtle or explicit?
Is the information presented as fact or opinion?
If opinion, is the opinion supported by credible data or informed argument?
Is the information one-sided?
Are alternate views represented?
Does the point of view affect how you view the information?
PURPOSE
What is the author’s purpose or objective: to explain, provide new information or news, entertain, persuade, or sell?
Does the purpose affect how you view the information presented?
In 2021, however, all the suggestions above may not be sufficient to distinguish a reliable source of information, even if the article has made it through the peer-review process. In time, you should learn to evaluate the authors’ research methods and decide whether they are reliable. The same applies to the research findings and conclusions.
++++++++++++++++++++
Additional topics and ideas for exploring at home:
civil disobedience