Half-truths and fake news
++++++++++++
more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news
Digital Literacy for St. Cloud State University
While the text does not mention specific cases, Russian interference has been proven in the 2016 election campaign in the United States, which saw Donald Trump victorious, as well as the Brexit referendum in the United Kingdom the same year, which saw voters narrowly decide they wanted their country to leave the European Union.
The text relies on the classification of the European Commission: “Verifiably false or misleading information created, presented and disseminated for economic gain or to intentionally deceive the public.” This covers not only electoral processes but also sectors such as health, the environment and security. The text underlines that the current coronavirus pandemic has been accompanied by an “unprecedented infodemic,” i.e. a proliferation of fake news.
The document recognizes that the “news media, digital platforms, academic world, technology sector, NGOs and society in general play an essential role in the fight against disinformation, with actions such as its identification and not contributing to its spread, the promotion of activities that raise awareness and training or the development of tools to avoid its propagation.”
++++++++++++
more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news
A historic report last week from the nation’s top counterintelligence official.
The need for the United States to order the closure of the Chinese government’s consulate in Houston.
A metaphor for this aspect of the spy game: a layer cake.
There’s a layer of activity that is visible to all — the actions or comments of public figures, or statements made via official channels.
Then there’s a clandestine layer that is usually visible only to another clandestine service: the work of spies being watched by other spies.
Counterintelligence officials watching Chinese intelligence activities in Houston, for example, knew the consulate was a base for efforts to steal intellectual property or recruit potential agents.
And there’s at least a third layer about which the official statements raised questions: the work of spies who are operating without being detected.
The challenges of election security include its incredible breadth — every county in the United States is a potential target — and vast depth, from the prospect of cyberattacks on voter systems, to the theft of information that can then be released to embarrass a target, to the ongoing and messy war on social media over disinformation and political agitation.
Witnesses have told Congress that when Facebook and Twitter made it more difficult to create and use fake accounts to spread disinformation and amplify controversy, Russia and China began to rely more on open channels.
In 2016, Russian influencemongers posed as Americans and engaged with real users as though they were experiencing the same election alongside one another. Russian operatives even used Facebook to organize real-world campaign events across the United States.
But RT’s account on Twitter or China’s foreign ministry representatives aren’t pretending to do anything but serve as voices for Moscow or Beijing.
the offer of a $10 million bounty for information about threats to the election.
+++++++++++++++++++
more on trolls in this IMS blog
https://blog.stcloudstate.edu/ims?s=troll
one cost of rampant fake reviews, fake accounts, fake views, & fake clicks –> the internet is increasingly becoming a low-trust environment, where an assumption of pervasive fraud is built into the way many things function https://t.co/keOZUYiARL @zeynep
— Rachel Thomas (@math_rachel) September 22, 2019
https://www.nytimes.com/video/opinion/100000006188102/what-is-pizzagate.html
‘The goal here is bigger than any one election. It is to constantly divide, increase distrust and undermine our faith in institutions and democracy itself’
Matt Apuzzo and Adam Satariano, 12 May 2019
Nicholas Thompson and Issy Lapowsky, 12.17.18
https://www.wired.com/story/russia-ira-propaganda-senate-report/
There’s a meme on Instagram, circulated by a group called “Born Liberal.” “Born Liberal” was a creation of the Internet Research Agency, the Russian propaganda wing.
Conversations around the IRA’s operations traditionally have focused on Facebook and Twitter, but like any hip millennial, the IRA was actually most obsessive about Instagram.
the IRA deployed 3,841 accounts, including several personas that “regularly played hashtag games.” That approach paid off; 1.4 million people engaged with the tweets, leading to nearly 73 million engagements. Most of this work was focused on news, while on Facebook and Instagram, the Russians prioritized “deeper relationships,” according to the researchers. On Facebook, the IRA notched a total of 3.3 million page followers, who engaged with their politically divisive content 76.5 million times. Russia’s most popular pages targeted the right wing and the black community. The trolls also knew their audiences; they deployed Pepe memes at pages intended for right-leaning millennials, but kept them away from posts directed at older conservative Facebook users. Not every attempt was a hit; while 33 of the 81 IRA Facebook pages had over 1,000 followers, dozens had none at all.
The report also points out new links between the IRA’s pages and Wikileaks, which helped disseminate hacked emails from Clinton campaign manager John Podesta.
Much of the Russian presence was unrelated to the relatively small ad spend that Facebook executives pointed to as the story first unfolded, in what the report authors describe as an attempt to downplay the problem.
While many people think of memes as “cat pictures with words,” the Defense Department and DARPA have studied them for years as a powerful tool of cultural influence, capable of reinforcing or even changing values and behavior.
“Over the past five years, disinformation has evolved from a nuisance into high-stakes information war,” the piece argues. And yet, rather than fighting back effectively, Americans are battling each other over what to do about it.
++++++++++
memetic warfare:
https://en.wikipedia.org/wiki/Memetic_warfare
++++++++++
A year after the Meme Warfare Center proposal was published, DARPA, the Pentagon agency that develops new military technology, commissioned a four-year study of memetics. The research was led by Dr. Robert Finkelstein, founder of the Robotic Technology Institute, and an academic with a background in physics and cybernetics.
Finkelstein’s study of “Military Memetics” centered on a basic problem in the field, determining “whether memetics can be established as a science with the ability to explain and predict phenomena.” It still had to be proved, in other words, that memes were actual components of reality and not just a nifty concept with great marketing.
++++++++++
more on social media in this IMS blog
https://blog.stcloudstate.edu/ims?s=social+media
Amid vote-hacking fears, election officials are jumping on the crypto bandwagon — but cybersecurity experts are sounding an alarm
+++++++++++
more on blockchain in this IMS blog
https://blog.stcloudstate.edu/ims?s=blockchain
Summary

This short paper lays out an attempt to measure how much activity from Russian state-operated accounts released in the dataset made available by Twitter in October 2018 was targeted at the United Kingdom. Finding UK-related Tweets is not an easy task. By applying a combination of geographic inference, keyword analysis and classification by algorithm, we identified UK-related Tweets sent by these accounts and subjected them to further qualitative and quantitative analytic techniques.
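The report does not publish its lexicons or classifier, but the combination it describes (keyword analysis plus geographic inference) can be sketched in a few lines. The keyword and location lists below are hypothetical placeholders, not the ones the researchers used:

```python
import re

# Hypothetical lexicons; the report's actual keyword lists and
# classification model are not included in this excerpt.
UK_KEYWORDS = {"brexit", "westminster", "london", "nhs", "ukip"}
UK_LOCATIONS = {"london", "manchester", "glasgow", "united kingdom", "uk"}

def is_uk_related(tweet_text: str, user_location: str = "") -> bool:
    """Flag a tweet as UK-related if its text mentions a UK keyword
    or the account's self-reported location matches a UK place name."""
    tokens = set(re.findall(r"[a-z']+", tweet_text.lower()))
    if tokens & UK_KEYWORDS:          # keyword analysis
        return True
    loc = user_location.lower()       # crude geographic inference
    return any(place in loc for place in UK_LOCATIONS)

print(is_uk_related("The Brexit vote shocked the markets"))      # True
print(is_uk_related("Just voted!", user_location="Texas, USA"))  # False
```

A real pipeline would feed such rule-based signals into a trained classifier rather than rely on them directly, since substring location matching and fixed keyword lists both over- and under-match.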
We find:
There were three phases in Russian influence operations: under-the-radar account building, minor Brexit vote visibility, and larger-scale visibility during the London terror attacks.
Russian influence operations linked to the UK were most visible when discussing Islam. Tweets discussing Islam over the period of terror attacks between March and June 2017 were retweeted 25 times more often than their other messages.
The most widely-followed and visible troll account, @TEN_GOP, shared 109 Tweets related to the UK. Of these, 60 percent were related to Islam.
The topology of tweet activity underlines the vulnerability of social media users to disinformation in the wake of a tragedy or outrage.
Focus on the UK was a minor part of wider influence operations in this data. Of the nine million Tweets released by Twitter, 3.1 million were in English (34 percent). Of these 3.1 million, we estimate 83 thousand were in some way linked to the UK (2.7 percent). Those Tweets were shared 222 thousand times. It is plausible that we are therefore seeing how the UK was caught up in Russian operations against the US.
Influence operations captured in this data show attempts to falsely amplify other news sources and to take part in conversations around Islam, and rarely show attempts to spread ‘fake news’ or influence at an electoral level.
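The headline percentages in the findings follow directly from the raw counts the report gives, which is easy to verify:

```python
# Reproduce the report's headline proportions from its raw counts.
total_tweets = 9_000_000     # tweets in the October 2018 Twitter release
english_tweets = 3_100_000   # identified as English
uk_linked = 83_000           # estimated as in some way linked to the UK

english_share = english_tweets / total_tweets * 100   # share of all tweets
uk_share = uk_linked / english_tweets * 100           # share of English tweets

print(f"English share: {english_share:.1f}%")              # 34.4%
print(f"UK share of English tweets: {uk_share:.1f}%")      # 2.7%
```

The computed 34.4 percent rounds to the report's "34 percent", and the UK share matches its 2.7 percent: the UK really was a sliver of the overall operation.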
On 17 October 2018, Twitter released data about 9 million tweets from 3,841 blocked accounts affiliated with the Internet Research Agency (IRA) – a Russian organisation founded in 2013 and based in St Petersburg, accused of using social media platforms to push pro-Kremlin propaganda and influence nation states beyond their borders, as well as being tasked with spreading pro-Kremlin messaging in Russia. It is one of the first major datasets linked to state-operated accounts engaging in influence operations released by a social media platform.
Conclusion
This report outlines the ways in which accounts linked to the Russian Internet Research Agency (IRA) carried out influence operations on social media and the ways their operations intersected with the UK.

The UK plays a reasonably small part in the wider context of this data. We see two possible explanations: either influence operations were primarily targeted at the US and British Twitter users were impacted as collateral, or this dataset is limited to US-focused operations where events in the UK were highlighted in an attempt to influence the US public, rather than a concerted effort against the UK. It is plausible that such efforts also existed but are not reflected in this dataset. Nevertheless, the data offers a highly useful window into how Russian influence operations are carried out, as well as highlighting the moments when we might be most vulnerable to them.

Between 2011 and 2016, these state-operated accounts were camouflaged. Through manual and automated methods, they were able to quietly build up the trappings of an active and well-followed Twitter account before eventually pivoting into attempts to influence the wider Twitter ecosystem. Their methods included engaging in unrelated and innocuous topics of conversation, often through automated methods, and sharing and engaging with other, more mainstream sources of news.

Although this data shows levels of electoral and party-political influence operations to be relatively low, the day of the Brexit referendum results showed how messaging originating from Russian state-controlled accounts might come to be visible: on June 24th 2016, we believe UK Twitter users discussing the Brexit vote would have encountered messages originating from these accounts.

As early as 2014, however, influence operations began taking part in conversations around Islam, and these accounts came to the fore during the three months of terror attacks that took place between March and June 2017. In the immediate wake of these attacks, messages related to Islam and circulated by Russian state-operated Twitter accounts were widely shared, and would likely have been visible in the UK.

The dataset released by Twitter begins to answer some questions about attempts by a foreign state to interfere in British affairs online. It is notable that overt political or electoral interference is poorly represented in this dataset: rather, we see attempts at stirring societal division, particularly around Islam in the UK, as the messages that resonated the most over the period.

What is perhaps most interesting about this moment is its portrayal of when we as social media users are most vulnerable to the kinds of messages circulated by those looking to influence us. In the immediate aftermath of terror attacks, the data suggests, social media users were more receptive to this kind of messaging than at any other time.
+++++++++++
more on cybersecurity in this IMS blog
https://blog.stcloudstate.edu/ims?s=cybersecurity
The digital attack that brought Estonia to a standstill 10 years ago was the first shot in a cyberwar that has been raging between Moscow and the west ever since
It began at exactly 10pm on 26 April, 2007, when a Russian-speaking mob began rioting in the streets of Tallinn, the capital city of Estonia, killing one person and wounding dozens of others. That incident resonates powerfully in some of the recent conflicts in the US. In 2007, the Estonian government had announced that a bronze statue of a heroic second world war Soviet soldier was to be removed from a central city square. For ethnic Estonians, the statue had less to do with the war than with the Soviet occupation that followed it, which lasted until independence in 1991. For the country’s Russian-speaking minority – 25% of Estonia’s 1.3 million people – the removal of the memorial was another sign of ethnic discrimination.
That evening, Jaan Priisalu – a former risk manager for Estonia’s largest bank, Hansabank, who was working closely with the government on its cybersecurity infrastructure – was at home in Tallinn with his girlfriend when his phone rang. On the line was Hillar Aarelaid, the chief of Estonia’s cybercrime police.
“It’s going down,” Aarelaid declared. Alongside the street fighting, reports of digital attacks were beginning to filter in. The websites of the parliament, major universities, and national newspapers were crashing. Priisalu and Aarelaid had suspected something like this could happen one day. A digital attack on Estonia had begun.
“The Russian theory of war allows you to defeat the enemy without ever having to touch him,” says Peter Pomerantsev, author of Nothing is True and Everything is Possible. “Estonia was an early experiment in that theory.”
Since then, Russia has only developed, and codified, these strategies. The techniques pioneered in Estonia are known as the “Gerasimov doctrine,” named after Valery Gerasimov, the chief of the general staff of the Russian military. In 2013, Gerasimov published an article in the Russian journal Military-Industrial Courier, articulating the strategy of what is now called “hybrid” or “nonlinear” warfare. “The lines between war and peace are blurred,” he wrote. New forms of antagonism, as seen in 2010’s Arab spring and the “colour revolutions” of the early 2000s, could transform a “perfectly thriving state, in a matter of months, and even days, into an arena of fierce armed conflict”.
Russia has deployed these strategies around the globe. Its 2008 war with Georgia, another former Soviet republic, relied on a mix of both conventional and cyber-attacks, as did the 2014 invasion of Crimea. Both began with civil unrest sparked via digital and social media – followed by tanks. Finland and Sweden have experienced near-constant Russian information operations. Russian hacks and social media operations have also occurred during recent elections in Holland, Germany, and France. Most recently, Spain’s leading daily, El País, reported on Russian meddling in the Catalonian independence referendum. Russian-supported hackers had allegedly worked with separatist groups, presumably with a mind to further undermining the EU in the wake of the Brexit vote.
The Kremlin has used the same strategies against its own people. Domestically, history books, school lessons, and media are manipulated, while laws are passed blocking foreign companies’ access to the Russian population’s online data – an essential resource in today’s global information-sharing culture. According to British military researcher Keir Giles, author of Nato’s Handbook of Russian Information Warfare, the Russian government, or actors that it supports, has even captured the social media accounts of celebrities in order to spread provocative messages under their names but without their knowledge. The goal, both at home and abroad, is to sever outside lines of communication so that people get their information only through controlled channels.
Tim Dowling, Wednesday 29 November 2017 12.39 EST
According to its detractors, RT is Vladimir Putin’s global disinformation service, countering one version of the truth with another in a bid to undermine the whole notion of empirical truth. And yet influential people from all walks of public life appear on it, or take its money. You can’t criticise RT’s standards, they say, if you don’t watch it. So I watched it. For a week.
Suchet, the son of former ITV newsreader John Suchet and the nephew of actor David Suchet, has been working for RT since 2009. The offspring of well-known people feature often on RT. Sophie Shevardnadze, who presents Sophie & Co, is the granddaughter of former Georgian president and Soviet foreign minister Eduard Shevardnadze. Tyrel Ventura, who presents Watching the Hawks on RT America, is the son of wrestler-turned-politician Jesse Ventura. His co-host is Oliver Stone’s son Sean.
My note: so this is why Oliver Stone went gentle on Putin in his “documentary” – so his son can have a job. #Nepotism #FakeNews
RT’s stated mission is to offer an “alternative perspective on major global events”, but the world according to RT is often downright surreal.
Peter Pomerantsev, author of Nothing Is True and Everything Is Possible, about Putin’s Russia, and now a senior visiting fellow in global affairs at the London School of Economics, was in Moscow working in television when Russia Today first started hiring graduates from Britain and the US. “The people were really bright, they were being paid well,” he says. But they soon found they were being ordered to change their copy, or instructed how to cover certain stories to reflect well on the Kremlin. “Everyone had their own moment when they first twigged that this wasn’t like the BBC,” he says. “That, actually, this is being dictated from above.” The coverage of Russia’s war with Georgia in 2008 was a lightbulb moment for many, he says. They quit.
+++++++++++++++
more on Russian bots, trolls:
https://blog.stcloudstate.edu/ims/2017/11/22/bots-trolls-and-fake-news/
+++++++++++++++
more on state propaganda in this IMS blog
https://blog.stcloudstate.edu/ims/2017/11/21/china-of-xi/