Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., … Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094–1096. https://doi.org/10.1126/science.aao2998
Baum, M. A., & Lazer, D. (2017, May 11). Social media must be held to account on fake news. Winnipeg Free Press (MB), p. A7.
In a paper published in March in the journal Science, David Lazer, Matthew Baum and 14 co-authors consider what we do and don’t know about the science of fake news. They define fake news as “fabricated information that mimics news media content in form but not in organizational process or intent,” and they go on to discuss problems at multiple levels: individual, institutional and societal. What do we know about individuals’ exposure to fake news and its influence upon them? How can Internet platforms help limit the dissemination of fake news? And most fundamentally: How can we succeed in creating and perpetuating a culture that values and promotes truth?
Steven Sloman, professor of cognitive, linguistic and psychological sciences at Brown University, is one of the paper’s 16 authors. Sloman is also the author of The Knowledge Illusion: Why We Never Think Alone, a book about the merits and failings of our collaborative minds, published in 2017 with co-author Philip Fernbach.
Sloman, S. A. (2017). The knowledge illusion: Why we never think alone. New York: Riverhead Books.
The digital age only heightens the need for a school librarian, a position he describes as going far beyond “book manager.”
A bill sponsored by Sen. Becky Harris in the 2017 session proposed changing that. SB143 would have required public schools, including charters, to maintain a school library and staff it with a certified librarian, except in specific circumstances.
“Students are technology literate when they come to Miller,” he said. “But they are informationally illiterate.”
The American Association of School Librarians is in the beginning stages of studying state policies regarding school libraries and librarians, said the organization’s president, Steven Yates. He doesn’t think eliminating school libraries or the librarian position is a widespread movement in K-12 education, but he acknowledged it is happening in more places than just Southern Nevada.
The anti-fake news bill, which must be approved by parliament, calls for penalising those who create, offer, circulate, print or publish fake news – or publications containing fake news – with a 10-year jail term, a fine of up to 500,000 ringgit (£90,000) or both.
The bill defines fake news as “any news, information, data and reports which is, or are, wholly or partly false whether in the form of features, visuals or audio recordings or in any other form capable of suggesting words or ideas”.
It covers all media and extends to foreigners outside Malaysia if Malaysia or its citizens are affected.
Tom Dickinson describes four different types of distributed ‘fake news’.
‘Fake news’ is lazy language. Be specific. Do you mean: A) Propaganda B) Disinformation C) Conspiracy theory D) Clickbait
The RAND Corporation, a US think-tank with strong ties to the military industrial complex, recently looked at the influence of the Russian Propaganda Model and how best to deal with it.
Three factors have been shown to increase the (limited) effectiveness of retractions and refutations: (1) warnings at the time of initial exposure to misinformation, (2) repetition of the retraction or refutation, and (3) corrections that provide an alternative story to help fill the resulting gap in understanding when false ‘facts’ are removed.
Critical thinking requires us to constantly question assumptions, especially our own. To develop these skills, questioning must be encouraged. This runs counter to most schooling and training practices. When do students or employees get to question underlying assumptions of their institutions? If they cannot do this, how can we expect them to challenge various and pervasive types of ‘fake news’?
CoSN and UNESCO, in partnership with the Global Education Conference, HP, ClassLink, Participate, Qatar Foundation International, Partnership for 21st Century Learning, ISTE, iEARN-USA, The Stevens Initiative at the Aspen Institute, World Savvy, Wikimedia, TakingITGlobal, Smithsonian Institute, and Project Tomorrow, are hosting this event to bring together thought leaders from across the world to explore the role of education in ensuring students are responsible digital citizens.
Internet safety has been a concern for policymakers and educators since the moment technology, particularly the Internet, was introduced to classrooms. Increasingly, school systems are shifting that focus from simply minimizing risk and blocking access to responsible use policies and strategies that empower the student as a digital citizen. Digital citizenship initiatives also seek to prepare students to live in a world where online hate and radicalization are all too common.
Join us for a lively and engaging exploration of the essential digital citizenship skills students need, including policies and practices that respond to the following questions:
How can technology be used to improve digital citizenship, and to what extent does technology pose new challenges to it?
How should we access information effectively and evaluate its accuracy?
How should we develop the skills to engage with others respectfully and in a sensitive and ethical manner?
How should we develop an appropriate balance between instruction and nurturing student behaviors that ensure ICT (Information and communications technology) is used safely and responsibly?
Combine the superfast calculational capacities of Big Compute with the oceans of specific personal information comprising Big Data — and the fertile ground for computational propaganda emerges. That’s how the small AI programs called bots can be unleashed into cyberspace to target and deliver misinformation exactly to the people who will be most vulnerable to it. These messages can be refined over and over again based on how well they perform (again in terms of clicks, likes and so on). Worst of all, all this can be done semiautonomously, allowing the targeted propaganda (like fake news stories or faked images) to spread like viruses through communities most vulnerable to their misinformation.
According to Bolsover and Howard, viewing computational propaganda only from a technical perspective would be a grave mistake. As they explain, seeing it just in terms of variables and algorithms “plays into the hands of those who create it, the platforms that serve it, and the firms that profit from it.”
Computational propaganda is a new thing. People just invented it. And they did so by realizing possibilities emerging from the intersection of new technologies (Big Compute, Big Data) and new behaviors those technologies allowed (social media). But the emphasis on behavior can’t be lost.
People are not machines. We do things for a whole lot of reasons, including emotions of loss, anger, fear and longing. To combat computational propaganda’s potentially dangerous effects on democracy in a digital age, we will need to focus on both its how and its why.
Common Sense Media recently partnered with the Center for Humane Technology, which supports the development of ethical technological tools, to lay out a fierce call for regulation and awareness about the health issues surrounding tech addiction.
To support educators making such decisions, Common Sense Media is taking its “Truth about Tech” campaign to schools through an upgraded version of its current Digital Citizenship curriculum. The new updates will include more information on subjects such as:
Creating a healthy media balance and digital wellness;
Concerns about the rise of hate speech in schools that go beyond talking about cyberbullying; and
Fake news, media literacy and curating your own content.
What Does ‘Tech Addiction’ Mean?
In a recent NPR report, writer Anya Kamenetz notes that clinicians are debating whether technology overuse is best categorized as a bad habit, a symptom of other mental struggles (such as depression or anxiety), or as an addiction.
Dr. Jenny Radesky, a developmental-behavioral pediatrician at the American Academy of Pediatrics, notes that though she has seen solid evidence linking heavy media usage to problems with sleep and obesity, she hesitates to call the usage “addiction.”
Dr. Robert Lustig, an endocrinologist who studies hormones at the University of Southern California, disagrees, noting that parents have to see the overuse of technology as an addiction.