PISA scores were recently released, and results of the international test revealed that only 14 percent of U.S. students were able to reliably distinguish between fact and opinion.
Even on seemingly serious websites, credibility is not a given. When I was in middle and high school, we were taught that we could trust .org websites. Now, with the practice of astroturfing, responsible consumers of information must dig deeper and go further to verify the legitimacy of information. (See Merriam-Webster's definition of astroturfing: https://www.merriam-webster.com/dictionary/astroturfing)
Experiences like these, where students are challenged to consider the validity of information and sort what’s real from what’s fake, would better prepare them not only to be savvier consumers of news, but also to someday digest contradictory information to make complicated decisions about their own health care, finances or civic engagement.
Freely available resources can help educators teach students how to vet information and think critically about real-world topics.
That’s the nickname given to computer-created artificial videos or other digital material in which images are combined to create new footage that depicts events that never actually happened. The term originates from the online message board Reddit.
One initial use of the fake videos was in amateur-created pornography, in which the faces of famous Hollywood actresses were digitally placed onto the bodies of other performers to make it appear as though the stars themselves were performing.
How difficult is it to create fake media?
It can be done with specialized software, experts say, the same way that editing programs such as Photoshop have made it simpler to manipulate still images. And specialized software itself isn’t necessary for what have been dubbed “shallow fakes” or “cheap fakes.”
Researchers also say they are working on new ways to speed up systems aimed at helping establish when video or audio has been manipulated. But it’s been called a “cat and mouse” game in which there may seldom be exact parity between fabrication and detection.
Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., … Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094–1096. https://doi.org/10.1126/science.aao2998
Baum, M. A., & Lazer, D. (2017, May 11). Social media must be held to account on fake news. Winnipeg Free Press (MB), p. A7.
In a paper published in March in the journal Science, David Lazer, Matthew Baum and 14 co-authors consider what we do and don’t know about the science of fake news. They define fake news as “fabricated information that mimics news media content in form but not in organizational process or intent,” and they go on to discuss problems at multiple levels: individual, institutional and societal. What do we know about individuals’ exposure to fake news and its influence upon them? How can Internet platforms help limit the dissemination of fake news? And most fundamentally: How can we succeed in creating and perpetuating a culture that values and promotes truth?
Steven Sloman is a professor of cognitive, linguistic and psychological sciences at Brown University and one of the paper’s 16 authors. He is also the author of The Knowledge Illusion: Why We Never Think Alone, a book about the merits and failings of our collaborative minds, published in 2017 with co-author Philip Fernbach.
Sloman, S. A. (2017). The knowledge illusion: Why we never think alone. New York: Riverhead Books.
“They were taking both sides of the argument this past weekend and pushing them out from their troll farms as much as they could to just raise the noise level in America and make a big issue seem like an even bigger issue as they’re trying to push divisiveness in the country,” as Sen. James Lankford, R-Okla., said in the fall.
Tools such as the German Marshall Fund’s “Hamilton 68” dashboard track Russian-linked accounts in real time to show what links, phrases and hashtags are circulating.
The Trump administration detailed the threat — without any specific mention of the 2016 interference — in the new National Security Strategy it released at the end of December.
The Republicans might have been tarnished by the St Petersburg troll factory, but Democratic fantasies about social media were rubbished in the process.
The ads in question were memes, manufactured and posted to a number of bluntly named, pseudo-American Facebook accounts in 2016 by workers at a troll farm in St Petersburg, Russia. There were thousands of these ads, it seems, plus parallel efforts on Instagram and Twitter. Between them, they reached over 100 million people.
The memes were big news for a while because they showed what Russian interference in the 2016 election actually looked like, in vivid color. Eventually the story faded, though, in part because it was superseded by other stories, but also, I think, because the Russian ad story was deeply distasteful to both sides of our atrophied political debate.
The ads were clumsily written. They were rife with spelling errors and poor grammar. Their grasp of American history was awful. And over them all hovered a paranoid fear that the powerful were scheming to flip the world upside-down in the most outlandish ways: to turn our country over to the undocumented … to punish the hardworking … to crack down on patriots and Christians … to enact Sharia law right here at home.
The social media platforms aren’t neutral arbiters, selflessly serving the needs of society. As is all too obvious now, they are monopolies that manipulate us in a hundred different ways, selecting our news, steering us towards what we need to buy. The corporate entities behind them wield enormous power in Washington, too, filling Democratic campaign coffers and keeping the revolving door turning for trusted servants. Those who don’t comply get disciplined.
++++++++++++++
Russia calls for answers after Chechen leader’s Instagram is blocked
Internet watchdog demands explanation after Ramzan Kadyrov claimed Facebook also suspended him without explanation
Kadyrov has accused the US government of pressuring the social networks to disable his accounts, which he said were blocked on Saturday without explanation. The US imposed travel and financial sanctions on Kadyrov last week over numerous allegations of human rights abuses.
The former rebel fighter, who is now loyal to the Russian president, Vladimir Putin, is a fan of social media, particularly Instagram, which he has used in recent years to make barely veiled death threats against Kremlin critics.
Leonid Levin, the head of the Russian parliament’s information technologies and communications committee, suggested the move by Facebook and Instagram was an attack on freedom of speech.
Dzhambulat Umarov, the Chechen press and information minister, described the blocking of Kadyrov’s accounts as a “vile” cyber-attack by the US.
Neither Instagram nor Facebook had commented at the time of publication.
In 2015, Kadyrov urged Chechen men not to let their wives use the WhatsApp messaging service after an online outcry over the forced marriage of a 17-year-old Chechen to a 47-year-old police chief. “Do not write such things. Men, take your women out of WhatsApp,” he said.
Want to strengthen your own ability to tell real news from fake news? Start by asking these five questions of any news item:
Who wrote it?
Identify whether the item you’re reading is a reported news article (written by a journalist with the intent to inform), a persuasive opinion piece (written by an industry expert with a point of view), or something else entirely.
What claims does it make? Real news — like these Pulitzer Prize-winning articles — will include multiple primary sources when discussing a controversial claim. Fake news may include fake sources, false URLs, and/or “alternative facts.”
Where was it published? Real news is published by trustworthy media outlets with a strong fact-checking record, such as the BBC, NPR, ProPublica, Mother Jones, and Wired. (To learn more about any media outlet, look at their About page and examine their published body of work.) If you get your news primarily via social media, try to verify that the information is accurate before you share it. (On Twitter, for example, you might look for the blue “verified” checkmark next to a media outlet name to double-check a publication source before sharing a link.)
How does it make you feel? Fake news, like all propaganda, is designed to make you feel strong emotions. So if you read a news item that makes you feel super angry, pause and take a deep breath.
Starbird argues in a new paper, set to be presented at a computational social-science conference in May, that these “strange clusters” of wild conspiracy talk, when mapped, point to an emerging alternative media ecosystem on the web of surprising power and reach.
There are dozens of other conspiracy-propagating websites such as beforeitsnews.com, nodisinfo.com and veteranstoday.com.
It isn’t a traditional left-right political axis, she found. There are right-wing sites like Danger & Play and left-wing sensationalizers such as The Free Thought Project. Some appear to be just trying to make money, while others are aggressively pushing political agendas.
The true common denominator, she found, is anti-globalism — deep suspicion of free trade, multinational business and global institutions.
++++++++++++++++++++++++++++
The News Literacy Project
False information on the internet makes it harder and harder to know what’s true, and the consequences have been devastating. This hour, TED speakers explore ideas around technology and deception. Guests include law professor Danielle Citron, journalist Andrew Marantz, and computer scientist Joy Buolamwini.
The upside for businesses is that this new, “anonymized” video no longer gives away the exact identity of a customer—which, Perry says, means companies using D-ID can “eliminate the need for consent” and analyze the footage for business and marketing purposes. A store might, for example, feed video of a happy-looking white woman to an algorithm that can surface the most effective ad for her in real time.
Three leading European privacy experts who spoke to MIT Technology Review voiced their concerns about D-ID’s technology and its intentions. All say that, in their opinion, D-ID actually violates GDPR.
+++++ Thank you for covering this information at home. Please don’t forget to bring your questions and ideas to class. +++++
Why are we here today? We need to look deeper into the current 21st-century state of information and disinformation and determine how such awareness can help policy analysis. How do we make up our minds about news and information? Where do we get our information? Whom do we believe, and whom do we mistrust?
How do these three items assist a better analysis of policies?
Class assignment:
Share a topic that is close to your heart.
Please feel welcome to use the following resources, and/or contribute your own, to determine the sources and potential bias.
Feel free also to use the following guidelines when establishing the veracity of information:
Here is a short (4-minute) video introducing the well-known basics for evaluating academic literature: https://youtu.be/qUd_gf2ypk4
ACCURACY
Does the author cite reliable sources?
How does the information compare with that in other works on the topic?
Can you determine if the information has gone through peer-review?
Are there factual, spelling, typographical, or grammatical errors?
AUDIENCE
Who do you think the authors are trying to reach?
Is the language, vocabulary, style, and tone appropriate for the intended audience?
What are the audience demographics? (age, educational level, etc.)
Are the authors targeting a particular group or segment of society?
AUTHORITY
Who wrote the information found in the article or on the site?
What are the author’s credentials/qualifications for this particular topic?
Is the author affiliated with a particular organization or institution?
What does that affiliation suggest about the author?
CURRENCY
Is the content current?
Does the date of the information directly affect the accuracy or usefulness of the information?
OBJECTIVITY/BIAS
What is the author’s or website’s point of view?
Is the point of view subtle or explicit?
Is the information presented as fact or opinion?
If opinion, is the opinion supported by credible data or informed argument?
Is the information one-sided?
Are alternate views represented?
Does the point of view affect how you view the information?
PURPOSE
What is the author’s purpose or objective: to explain, provide new information or news, entertain, persuade, or sell?
Does the purpose affect how you view the information presented?
In 2021, however, the suggestions above may not be sufficient to distinguish a reliable source of information, even if the article made it through the peer-review process. In time, you should learn to evaluate the authors’ research methods and decide whether they are reliable. The same applies to the research findings and conclusions.
++++++++++++++++++++
Additional topics and ideas for exploring at home:
civil disobedience
From Mike Caulfield’s Web Literacy for Student Fact-Checkers (https://webliteracy.pressbooks.com/):
Fact-Checking Organizations
There are many fact-checking sites outside the U.S. Here is a small sample.