Tom Dickinson describes four distinct types of ‘fake news’ in circulation.
‘Fake news’ is lazy language. Be specific. Do you mean: A) Propaganda B) Disinformation C) Conspiracy theory D) Clickbait
The RAND Corporation, a US think-tank with strong ties to the military industrial complex, recently looked at the influence of the Russian Propaganda Model and how best to deal with it.
Three factors have been shown to increase the (limited) effectiveness of retractions and refutations: (1) warnings at the time of initial exposure to misinformation, (2) repetition of the retraction or refutation, and (3) corrections that provide an alternative story to help fill the resulting gap in understanding when false ‘facts’ are removed.
Critical thinking requires us to constantly question assumptions, especially our own. To develop these skills, questioning must be encouraged. This runs counter to most schooling and training practices. When do students or employees get to question underlying assumptions of their institutions? If they cannot do this, how can we expect them to challenge various and pervasive types of ‘fake news’?
Registration costs $129. Select CoSN Member or Non-member, change the “0” next to “Symposium on Educating for Digital Citizenship ONLY” to a “1”, click “next”, and complete your registration.
CoSN and UNESCO, in partnership with the Global Education Conference, HP, ClassLink, Participate, Qatar Foundation International, Partnership for 21st Century Learning, ISTE, iEARN-USA, The Stevens Initiative at the Aspen Institute, World Savvy, Wikimedia, TakingITGlobal, Smithsonian Institute, and Project Tomorrow, are hosting this event to bring together thought leaders from across the world to explore the role of education in ensuring students are responsible digital citizens.
Internet safety has been a concern for policymakers and educators since the moment technology, particularly the Internet, was introduced to classrooms. Increasingly, many school systems are shifting that focus from simply minimizing risk and blocking access to responsible-use policies and strategies that empower the student as a digital citizen. Digital citizenship initiatives also seek to prepare students to live in a world where online hate and radicalization are all too common.
Join us for a lively and engaging exploration of the essential digital citizenship skills students need, including policies and practices that respond to the following questions:
How can technology be used to improve digital citizenship and to what extent is technology providing new challenges to digital citizenship?
How should we access information effectively and evaluate its accuracy?
How should we develop the skills to engage with others respectfully and in a sensitive and ethical manner?
Combine the superfast computational capacity of Big Compute with the oceans of specific personal information comprising Big Data, and the fertile ground for computational propaganda emerges. That’s how the small AI programs called bots can be unleashed into cyberspace to deliver misinformation precisely to the people most vulnerable to it. These messages can be refined over and over based on how well they perform (again, in terms of clicks, likes and so on). Worst of all, this can be done semi-autonomously, allowing targeted propaganda (such as fake news stories or faked images) to spread like a virus through the communities most vulnerable to it.
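The refine-by-feedback loop described above can be sketched as a toy epsilon-greedy simulation. Everything here is invented for illustration: the variant names, the underlying click-through rates, and the `refine_messages` helper do not come from any real system; the point is only to show how traffic drifts toward whichever message "performs" best.

```python
import random

def refine_messages(click_rates, rounds=5000, epsilon=0.1, seed=42):
    """Toy epsilon-greedy loop: repeatedly 'post' one of several message
    variants, observe simulated click feedback, and shift exposure toward
    the variant that performs best so far. All numbers are invented."""
    rng = random.Random(seed)
    variants = list(click_rates)
    clicks = {v: 0 for v in variants}
    shows = {v: 0 for v in variants}
    for _ in range(rounds):
        if rng.random() < epsilon or not any(shows.values()):
            v = rng.choice(variants)  # occasionally explore at random
        else:
            # otherwise exploit the variant with the best observed rate
            v = max(variants, key=lambda v: clicks[v] / max(shows[v], 1))
        shows[v] += 1
        if rng.random() < click_rates[v]:  # simulated audience reaction
            clicks[v] += 1
    return shows

# Hypothetical message variants with invented true click-through rates.
exposure = refine_messages({"variant_a": 0.02,
                            "variant_b": 0.05,
                            "variant_c": 0.11})
```

After a few thousand simulated rounds, almost all exposure concentrates on the highest-performing variant, which is the "refined over and over" dynamic the passage describes, minus the human targets.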
According to Bolsover and Howard, viewing computational propaganda only from a technical perspective would be a grave mistake. As they explain, seeing it just in terms of variables and algorithms “plays into the hands of those who create it, the platforms that serve it, and the firms that profit from it.”
Computational propaganda is a new thing. People just invented it. And they did so by realizing possibilities emerging from the intersection of new technologies (Big Compute, Big Data) and new behaviors those technologies allowed (social media). But the emphasis on behavior can’t be lost.
People are not machines. We do things for a whole lot of reasons, including emotions of loss, anger, fear and longing. To combat computational propaganda’s potentially dangerous effects on democracy in a digital age, we will need to focus on both its how and its why.
Common Sense Media recently partnered with the Center for Humane Technology, which supports the development of ethical technological tools, to lay out a fierce call for regulation and awareness about the health issues surrounding tech addiction.
To support educators making such decisions, Common Sense Media is taking their “Truth about Tech” campaign to schools through an upgraded version of their current Digital Citizenship curriculum. The new updates will include more information on subjects such as:
Creating a healthy media balance and digital wellness;
Concerns about the rise of hate speech in schools that go beyond discussions of cyberbullying; and
Fake news, media literacy, and curating your own content.
What Does ‘Tech Addiction’ Mean?
In a recent NPR report, writer Anya Kamenetz notes that clinicians are debating whether technology overuse is best categorized as a bad habit, a symptom of other mental struggles (such as depression or anxiety), or an addiction.
Dr. Jenny Radesky, a developmental-behavioral pediatrician with the American Academy of Pediatrics, notes that though she has seen solid evidence linking heavy media use to problems with sleep and obesity, she hesitates to call the usage “addiction.”
Dr. Robert Lustig, an endocrinologist at the University of California, San Francisco, disagrees, arguing that parents have to see the overuse of technology as an addiction.
Social Media Is Making Us Dumber. Here’s Exhibit A.
By JESSE SINGAL
My note: the genesis of #FakeNews, #tribalism
The idea that Mr. Pinker, a liberal, Jewish psychology professor, is a fan of a racist, anti-Semitic online movement is absurd on its face, so it might be tempting to roll your eyes and dismiss this blowup as just another instance of social media doing what it does best: generating outrage.
But it’s actually a worthwhile episode to unpack, because it highlights a disturbing, worsening tendency in social media in which tribal allegiances are replacing shared empirical understandings of the world.
What social media is doing is slicing the salami thinner and thinner, as it were, making it harder even for people who are otherwise in general ideological agreement to agree on basic facts about news events.
“They were taking both sides of the argument this past weekend and pushing them out from their troll farms as much as they could to just raise the noise level in America and make a big issue seem like an even bigger issue as they’re trying to push divisiveness in the country,” as Sen. James Lankford, R-Okla., said in the fall.
The Republicans might have been tarnished by the St Petersburg troll factory, but Democratic fantasies about social media were rubbished in the process
The ads in question were memes, manufactured and posted to a number of bluntly named, pseudo-American Facebook accounts in 2016 by workers at a troll farm in St Petersburg, Russia. There were thousands of these ads, it seems, plus parallel efforts on Instagram and Twitter. Between them, they reached over 100 million people.
The memes were big news for a while because they showed what Russian interference in the 2016 election actually looked like, in vivid color. Eventually the story faded, though, in part because it was superseded by other stories, but also, I think, because the Russian ad story was deeply distasteful to both sides of our atrophied political debate.
The ads were clumsily written. They were rife with spelling errors and poor grammar. Their grasp of American history was awful. And over them all hovered a paranoid fear that the powerful were scheming to flip the world upside-down in the most outlandish ways: to turn our country over to the undocumented … to punish the hardworking … to crack down on patriots and Christians … to enact Sharia law right here at home.
The social media platforms aren’t neutral arbiters, selflessly serving the needs of society. As is all too obvious now, they are monopolies that manipulate us in a hundred different ways, selecting our news, steering us towards what we need to buy. The corporate entities behind them wield enormous power in Washington, too, filling Democratic campaign coffers and keeping the revolving door turning for trusted servants. Those who don’t comply get disciplined.
Russia calls for answers after Chechen leader’s Instagram is blocked
Internet watchdog demands explanation after Ramzan Kadyrov claimed Facebook also suspended him without explanation
Kadyrov has accused the US government of pressuring the social networks to disable his accounts, which he said were blocked on Saturday without explanation. The US imposed travel and financial sanctions on Kadyrov last week over numerous allegations of human rights abuses.
The former rebel fighter, who is now loyal to the Russian president, Vladimir Putin, is a fan of social media, particularly Instagram, which he has used in recent years to make barely veiled death threats against Kremlin critics.
Leonid Levin, the head of the Russian parliament’s information technologies and communications committee, suggested the move by Facebook and Instagram was an attack on freedom of speech.
Dzhambulat Umarov, the Chechen press and information minister, described the blocking of Kadyrov’s accounts as a “vile” cyber-attack by the US.
Neither Instagram nor Facebook had commented at the time of publication.
In 2015, Kadyrov urged Chechen men not to let their wives use the WhatsApp messaging service after an online outcry over the forced marriage of a 17-year-old Chechen to a 47-year-old police chief. “Do not write such things. Men, take your women out of WhatsApp,” he said.
The digital attack that brought Estonia to a standstill 10 years ago was the first shot in a cyberwar that has been raging between Moscow and the west ever since
It began at exactly 10pm on 26 April 2007, when a Russian-speaking mob began rioting in the streets of Tallinn, the capital city of Estonia, killing one person and wounding dozens of others. The Estonian government had announced that a bronze statue of a heroic second world war Soviet soldier was to be removed from a central city square. For ethnic Estonians, the statue had less to do with the war than with the Soviet occupation that followed it, which lasted until independence in 1991. For the country’s Russian-speaking minority – 25% of Estonia’s 1.3 million people – the removal of the memorial was another sign of ethnic discrimination. The episode resonates powerfully with some of the recent conflicts in the US.
That evening, Jaan Priisalu – a former risk manager for Estonia’s largest bank, Hansabank, who was working closely with the government on its cybersecurity infrastructure – was at home in Tallinn with his girlfriend when his phone rang. On the line was Hillar Aarelaid, the chief of Estonia’s cybercrime police.
“It’s going down,” Aarelaid declared. Alongside the street fighting, reports of digital attacks were beginning to filter in. The websites of the parliament, major universities, and national newspapers were crashing. Priisalu and Aarelaid had suspected something like this could happen one day. A digital attack on Estonia had begun.
“The Russian theory of war allows you to defeat the enemy without ever having to touch him,” says Peter Pomerantsev, author of Nothing is True and Everything is Possible. “Estonia was an early experiment in that theory.”
Since then, Russia has only developed, and codified, these strategies. The techniques pioneered in Estonia are known as the “Gerasimov doctrine,” named after Valery Gerasimov, the chief of the general staff of the Russian military. In 2013, Gerasimov published an article in the Russian journal Military-Industrial Courier, articulating the strategy of what is now called “hybrid” or “nonlinear” warfare. “The lines between war and peace are blurred,” he wrote. New forms of antagonism, as seen in 2010’s Arab spring and the “colour revolutions” of the early 2000s, could transform a “perfectly thriving state, in a matter of months, and even days, into an arena of fierce armed conflict”.
Russia has deployed these strategies around the globe. Its 2008 war with Georgia, another former Soviet republic, relied on a mix of both conventional and cyber-attacks, as did the 2014 invasion of Crimea. Both began with civil unrest sparked via digital and social media – followed by tanks. Finland and Sweden have experienced near-constant Russian information operations. Russian hacks and social media operations have also occurred during recent elections in Holland, Germany, and France. Most recently, Spain’s leading daily, El País, reported on Russian meddling in the Catalonian independence referendum. Russian-supported hackers had allegedly worked with separatist groups, presumably with a mind to further undermining the EU in the wake of the Brexit vote.
The Kremlin has used the same strategies against its own people. Domestically, history books, school lessons, and media are manipulated, while laws are passed blocking foreign companies’ access to the Russian population’s online data – an essential resource in today’s global information-sharing culture. According to British military researcher Keir Giles, author of Nato’s Handbook of Russian Information Warfare, the Russian government, or actors that it supports, has even captured the social media accounts of celebrities in order to spread provocative messages under their names but without their knowledge. The goal, both at home and abroad, is to sever outside lines of communication so that people get their information only through controlled channels.
According to its detractors, RT is Vladimir Putin’s global disinformation service, countering one version of the truth with another in a bid to undermine the whole notion of empirical truth. And yet influential people from all walks of public life appear on it, or take its money. You can’t criticise RT’s standards, they say, if you don’t watch it. So I watched it. For a week.
My note: so this is why Oliver Stone, in his “documentary,” went easy on Putin – so his son can have a job. #Nepotism #FakeNews
RT’s stated mission is to offer an “alternative perspective on major global events”, but the world according to RT is often downright surreal.
Peter Pomerantsev, author of Nothing Is True and Everything Is Possible, about Putin’s Russia, and now a senior visiting fellow in global affairs at the London School of Economics, was in Moscow working in television when Russia Today first started hiring graduates from Britain and the US. “The people were really bright, they were being paid well,” he says. But they soon found they were being ordered to change their copy, or instructed how to cover certain stories to reflect well on the Kremlin. “Everyone had their own moment when they first twigged that this wasn’t like the BBC,” he says. “That, actually, this is being dictated from above.” The coverage of Russia’s war with Georgia in 2008 was a lightbulb moment for many, he says. They quit.
China Central Television (formerly Beijing Television), commonly abbreviated as CCTV, is the predominant state television broadcaster in the People’s Republic of China. CCTV operates a network of 50 channels broadcasting different programmes in six languages and is accessible to more than one billion viewers. Most of its programmes are a mixture of news, documentary, social education, comedy, entertainment, and drama, the majority consisting of Chinese soap operas and entertainment.