Searching for "disinformation"

Twitter bots climate disinformation

Twitter Bots Are a Major Source of Climate Disinformation. Researchers determined that nearly 9.5% of the users in their sample were likely bots, but those bots accounted for about 25% of the total tweets about climate change on most days.
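
The overrepresentation in those figures can be made concrete with a little arithmetic. The percentages come from the article cited below; the calculation itself is just an illustration:

```python
# Figures from the study: ~9.5% of sampled accounts were likely bots,
# yet they produced ~25% of climate tweets on most days.
bot_share_accounts = 0.095
bot_share_tweets = 0.25

# Per-account output of bots relative to the overall average account.
vs_average = bot_share_tweets / bot_share_accounts

# Per-account output of bots relative to a typical non-bot account.
vs_human = (bot_share_tweets / bot_share_accounts) / (
    (1 - bot_share_tweets) / (1 - bot_share_accounts)
)

print(f"Bots tweet about {vs_average:.1f}x the average account's rate")
print(f"and about {vs_human:.1f}x the rate of a typical non-bot account")
```

In other words, each likely-bot account produced roughly two and a half times the output of the average account in the sample.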

https://www.scientificamerican.com/article/twitter-bots-are-a-major-source-of-climate-disinformation/

A paper published last week in the journal Climate Policy is part of an expanding body of research on the role of bots in online climate discourse.

+++++++++++
more on climate in this IMS blog
https://blog.stcloudstate.edu/ims?s=climate

disinformation cybersecurity

It’s time to accept that disinformation is a cyber security issue

https://www.computerweekly.com/opinion/Its-time-to-accept-that-disinformation-is-a-cyber-security-issue

Misinformation and disinformation are rife, but so far they have been seen as a challenge for policy-makers and big tech, including social media platforms.

The sheer volume of data being created makes it hard to tell what’s real and what’s not. From destroying 5G towers to conspiracies like QAnon and unfounded concern about election fraud, distrust is becoming the default – and this can have incredibly damaging effects on society.

So far, the tech sector – primarily social media companies, given that their platforms enable fake news to spread exponentially – has tried to implement some measures, with varying levels of success. For example, WhatsApp has placed a stricter limit on its message-forwarding capability and Twitter has begun to flag misleading posts.

Also notable is the rise of tech startups that are exploring ways to detect and stem the flow of disinformation, such as Right of Reply, Astroscreen and Logically.

disinformation has the potential to undermine national security

Data breaches result in the loss of value, but so can data manipulation

Chief Disinformation Officer

A short online game in which players are recruited as a “Chief Disinformation Officer” and use tactics like trolling to sabotage elections in a peaceful town has been shown to reduce susceptibility to political misinformation.

“Trying to debunk misinformation after it has spread is like shutting the barn door after the horse has bolted. By pre-bunking, we aim to stop the spread of fake news in the first place,” said Dr Sander van der Linden, Director of the Cambridge Social Decision-Making Lab and senior author of the new study.

https://www.cam.ac.uk/research/news/game-combats-political-misinformation-by-letting-players-undermine-democracy

Game combats political misinformation by letting players undermine democracy

The free-to-play Harmony Square is released to the public today, along with a study on its effectiveness published in the Harvard Misinformation Review.

It was created by University of Cambridge psychologists with support from the US Department of State’s Global Engagement Center and the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA).

+++++++++++++++
more on disinformation in this IMS blog
https://blog.stcloudstate.edu/ims?s=disinformation

Disinformation and the Cost of Fake News

https://youtu.be/ZZKKah4vNhc

+++++++++++++++
more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=%23fakenews

bots and disinformation

Computer-generated humans and disinformation campaigns could soon take over political debate. Last year, researchers found that 70 countries had political disinformation campaigns over two years.

Bots will dominate political debate, experts warn

Last year, researchers at Oxford University found that 70 countries had political disinformation campaigns over two years.
Perhaps the most notable of such campaigns was that initiated by a Russian propaganda group to influence the 2016 US election result.

The US Federal Communications Commission hosted a public comment period in 2017 on its plans to repeal net neutrality. Harvard Kennedy School lecturer Bruce Schneier notes that while the agency received 22 million comments, many of them were made by fake identities.
Schneier argues that the escalating prevalence of computer-generated personas could “starve” people of democracy.

++++++++++++
more on deepfake in this IMS blog
https://blog.stcloudstate.edu/ims?s=deepfake

Media Manipulation and Disinformation Online

A Review of ‘Media Manipulation and Disinformation Online’

In Media Manipulation and Disinformation Online, Marwick and Lewis (2017) of the Data & Society Research Institute described the agents of media manipulation, their modus operandi, their motivators, and how they have taken advantage of the vulnerability of online media. The researchers described the manipulators as right-wing extremists (RWE), also known as the alt-right, who run the gamut from sexists (including male sexual-conquest communities) to white nationalists to anti-immigration activists, and even those who rebuke the RWE label but whose actions confer such classification. These manipulators rally behind shared beliefs on online forums, blogs, podcasts, and social media; use anonymity for pranks or ruinous trolling; usurp participatory-culture methods (networking, humor, mentorship) for harassment; and form competitive cyber brigades that earn status by escalating bullying, such as sharing a target’s private information.

Marwick and Lewis reported on how RWE groups have taken advantage of certain media tactics to gain viewers’ attention, such as novelty and sensationalism, as well as their interactions with the public via social media, to manipulate the media for their agenda. For instance, YouTube provides any individual with a portal, and potential revenue, to contribute to the media ecosystem. The researchers shared the example of YouTube conspiracy theorists, whose videos can serve as fodder for extremist networks, since conspiracies generally focus on loss of control of important ideals, health, and safety.

One tactic they use is to package their hate in a way that appeals to millennials. Another is attention hacking to increase their status, such as spouting hate speech that is later recanted as trickster trolling, all the while gaining the media’s attention for further propagation.

SHARED MODUS OPERANDI

Marwick and Lewis reported the following shared tactics various RWE groups use for online exploits:

  • Ambiguity of persona or ideology,
  • Baiting a single or community target’s emotions,
  • Bots for amplification of propaganda that appears to come from a real person,
  • “…Embeddedness in Internet culture… (p. 28),”
  • Exploitation of young male rebelliousness,
  • Hate speech and offensive language (under the guise of First Amendment protections),
  • Irony to cloak ideology and/or skewer intended targets,
  • Memes for stickiness of propaganda,
  • Mentorship in argumentation, marketing strategies, and subversive literature in their communities of interest,
  • Networked and agile groups,
  • “…Permanent warfare… (p. 12)” call to action,
  • Pseudo scholarship to deceive readers,
  • “…Quasi moral arguments… (p. 7)”
  • Shocking images for filtering network membership,
  • “Trading stories up the chain… (p. 38)” from low-level news outlets to mainstream, and
  • Trolling others with asocial behavior.

teenagers in Veles, Macedonia, who profited around $16,000 per month via Google’s AdSense from Facebook post engagements

a long history of mistrust of mainstream media

If you’re a college instructor of communications or teach digital literacy as a librarian, see the corresponding syllabus for this article. It provides discussion questions and assignments for teaching students about media manipulation. To teach your students how to combat fake news online, see my post on Navigating Post-Truth Societies: Strategies, Resources, and Technologies.

+++++++++
more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news

fake news disinformation propaganda

the secret of freedom

If we are in a post-truth moment, then we need to understand the tools we have at hand to deal with falsehoods.

Tom Dickinson describes four different types of distributed ‘fake news’.

‘Fake news’ is lazy language. Be specific. Do you mean:
A) Propaganda
B) Disinformation
C) Conspiracy theory
D) Clickbait

The RAND Corporation, a US think-tank with strong ties to the military industrial complex, recently looked at the influence of the Russian Propaganda Model and how best to deal with it.

Three factors have been shown to increase the (limited) effectiveness of retractions and refutations: (1) warnings at the time of initial exposure to misinformation, (2) repetition of the retraction or refutation, and (3) corrections that provide an alternative story to help fill the resulting gap in understanding when false ‘facts’ are removed.

Critical thinking requires us to constantly question assumptions, especially our own. To develop these skills, questioning must be encouraged. This runs counter to most schooling and training practices. When do students or employees get to question underlying assumptions of their institutions? If they cannot do this, how can we expect them to challenge various and pervasive types of ‘fake news’?

++++++++++++++
more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news

critical news literacy session

Critical news literacy session for social policy analysis course

Katie Querna, Thursday, 11AM, Stewart Hall

post Higher Ed Learning Collective

https://www.theguardian.com/world/2022/feb/21/dumb-and-lazy-the-flawed-films-of-ukrainian-attacks-made-by-russias-fake-factory

https://english.elpais.com/science-tech/2022-02-24/the-war-in-ukraine-via-tiktok-how-ordinary-citizens-are-recording-russian-troops.html

+++ please cover this information at home and bring your ideas and questions to class +++++

Most students can’t tell fake news from real news, study shows
Read more: https://blog.stcloudstate.edu/ims/2017/03/28/fake-news-3/

Module 1 (video to introduce students to the readings and expected tasks)

  1. Fake News / Misinformation / Disinformation
    1. Definitions
      1. Fake news, alternative facts
        https://blog.stcloudstate.edu/ims?s=fake+news
        https://blog.stcloudstate.edu/ims?s=alternative+facts
      2. Misinformation vs disinformation
        https://blog.stcloudstate.edu/ims/2018/02/18/fake-news-disinformation-propaganda/

        1. Propaganda
        2. Conspiracy theories
          1. Bots, trolls
            https://blog.stcloudstate.edu/ims/2017/11/22/bots-trolls-and-fake-news/
            https://blog.stcloudstate.edu/ims/2020/04/30/fake-social-media-accounts-and-politicians/
            https://blog.stcloudstate.edu/ims/2020/01/20/bots-and-disinformation/
        3. Clickbait
          Filter bubbles, echo chambers
          (8 min) video explains filter bubbles
          https://www.ted.com/talks/eli_pariser_beware_online_filter

+++++ thank you for covering this information at home. Please don’t forget to bring your questions and ideas to class +++++

Why are we here today?
We need to look deeper into the current 21st-century state of information and disinformation and determine how such awareness can help policy analysis.
How do we make up our minds about news and information? Where do we get our information? Whom do we believe, and whom do we mistrust?

What do you understand by the following three items, and what is their place in our efforts to analyze policies?
“critical thinking,” https://blog.stcloudstate.edu/ims/2014/05/11/the-5-step-model-to-teach-students-critical-thinking-skills/

“media literacy,” “Media Literacy now considers digital citizenship as part of media literacy — not the other way around”
https://blog.stcloudstate.edu/ims/2020/01/07/k12-media-literacy/

“critical [news] literacy”
https://youtu.be/i2WyIkK9IOg

How do these three items assist in a better analysis of policies?

Class assignment:
Share a topic that is close to your heart.
Please feel welcome to use the following resources and/or contribute your own resources to determine the sources and potential bias.

library spot fake news

fake news resources

fake news and video

Feel free also to use the following guidelines when establishing the veracity of information:

Here is a short (4 min) video introducing you to the well-known basics for evaluation of academic literature:
https://youtu.be/qUd_gf2ypk4

  1. ACCURACY
    1. Does the author cite reliable sources?
    2. How does the information compare with that in other works on the topic?
    3. Can you determine if the information has gone through peer-review?
    4. Are there factual, spelling, typographical, or grammatical errors?
  2. AUDIENCE
    1. Who do you think the authors are trying to reach?
    2. Is the language, vocabulary, style and tone appropriate for intended audience?
    3. What are the audience demographics? (age, educational level, etc.)
    4. Are the authors targeting a particular group or segment of society?
  3. AUTHORITY
    1. Who wrote the information found in the article or on the site?
    2. What are the author’s credentials/qualifications for this particular topic?
    3. Is the author affiliated with a particular organization or institution?
    4. What does that affiliation suggest about the author?
  4. CURRENCY
    1. Is the content current?
    2. Does the date of the information directly affect the accuracy or usefulness of the information?
  5. OBJECTIVITY/BIAS
    1. What is the author’s or website’s point of view?
    2. Is the point of view subtle or explicit?
    3. Is the information presented as fact or opinion?
    4. If opinion, is the opinion supported by credible data or informed argument?
    5. Is the information one-sided?
    6. Are alternate views represented?
    7. Does the point of view affect how you view the information?
  6. PURPOSE
    1. What is the author’s purpose or objective: to explain, to provide new information or news, to entertain, to persuade, or to sell?
    2. Does the purpose affect how you view the information presented?
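
As a study aid, the six criteria above can be turned into a simple checklist. The small sketch below is purely illustrative: the criterion names come from the list, but the pass/fail scoring scheme is an assumption added here, not part of the guidelines.

```python
# Illustrative checklist built from the six evaluation criteria above.
# The binary pass/fail scoring is an assumption for this sketch.
CRITERIA = ("accuracy", "audience", "authority", "currency", "objectivity", "purpose")

def evaluate(answers: dict) -> tuple:
    """Count how many criteria a source passes; answers maps criterion -> bool."""
    passed = sum(1 for c in CRITERIA if answers.get(c, False))
    return passed, len(CRITERIA)

# Example: a source that checks out on four of the six criteria.
score, total = evaluate({
    "accuracy": True, "authority": True, "currency": True,
    "purpose": True, "audience": False, "objectivity": False,
})
print(f"Source passes {score} of {total} checks")  # Source passes 4 of 6 checks
```

In practice, of course, the answers to these questions are judgment calls rather than booleans; the point of the sketch is simply that each criterion should be checked explicitly rather than skipped.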

In 2021, however, all the suggestions above may not be sufficient to distinguish a reliable source of information, even if the article made it through the peer-review process. In time, you should learn to evaluate the authors’ research methods and decide whether they are reliable. The same applies to the research findings and conclusions.

++++++++++++++++++++
Additional topics and ideas for exploring at home:
civil disobedience

https://blog.stcloudstate.edu/ims/2014/09/30/disruptive-technologies-from-swarming-to-mesh-networking/
https://blog.stcloudstate.edu/ims/2019/08/30/tik-tok-students-and-teachers/
https://news.softpedia.com/news/Venezuela-Blocks-Walkie-Talkie-App-Zello-Amid-Protests-428583.shtml
http://www.businessinsider.com/yo-updates-on-israel-missile-attacks-2014-7

https://blog.stcloudstate.edu/ims/2016/11/14/internet-freedom/
https://blog.stcloudstate.edu/ims/2016/08/31/police-to-block-social-media/
https://blog.stcloudstate.edu/ims/2016/04/04/technology-and-activism/
