Posts Tagged ‘echo chambers’


Information Overload Helps Fake News Spread, and Social Media Knows It

Understanding how algorithm manipulators exploit our cognitive vulnerabilities empowers us to fight back

https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

a minefield of cognitive biases.

People who behaved in accordance with them—for example, by staying away from the overgrown pond bank where someone said there was a viper—were more likely to survive than those who did not.

Compounding the problem is the proliferation of online information. Viewing and producing blogs, videos, tweets and other units of information called memes has become so cheap and easy that the information marketplace is inundated. My note: folksonomy at its worst.

At the University of Warwick in England and at Indiana University Bloomington’s Observatory on Social Media (OSoMe, pronounced “awesome”), our teams are using cognitive experiments, simulations, data mining and artificial intelligence to comprehend the cognitive vulnerabilities of social media users, and are developing analytical and machine-learning aids to fight social media manipulation.

As Nobel Prize–winning economist and psychologist Herbert A. Simon noted, “What information consumes is rather obvious: it consumes the attention of its recipients.”

attention economy

Nodal diagrams representing 3 social media networks show that more memes correlate with higher load and lower quality of information shared

 Our models revealed that even when we want to see and share high-quality information, our inability to view everything in our news feeds inevitably leads us to share things that are partly or completely untrue.
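The published OSoMe models are more elaborate, but the core mechanism — a bounded feed forcing agents to choose among whatever happens to be in view — can be sketched in a few lines. Everything below (agent count, feed sizes, the quality-proportional resharing rule) is an illustrative assumption, not the authors' actual model:

```python
import random

def simulate(num_agents=50, feed_size=10, steps=5000, p_new=0.3, seed=1):
    """Toy model of meme sharing under limited attention.

    Each agent keeps a bounded feed. At every step one agent either posts
    a brand-new meme with a uniform-random quality in [0, 1] or reshares
    a meme from its feed, sampled with probability proportional to quality.
    Returns the average quality of the memes that got reshared.
    """
    rng = random.Random(seed)
    feeds = [[] for _ in range(num_agents)]
    reshared = []
    for _ in range(steps):
        agent = rng.randrange(num_agents)
        follower = rng.randrange(num_agents)
        if rng.random() < p_new or not feeds[agent]:
            meme = rng.random()  # new meme; quality drawn uniformly
        else:
            # Limited attention: the agent can only pick from its bounded feed.
            meme = rng.choices(feeds[agent], weights=feeds[agent])[0]
            reshared.append(meme)
        feeds[follower].append(meme)
        if len(feeds[follower]) > feed_size:
            feeds[follower].pop(0)  # old items scroll out of view
    return sum(reshared) / len(reshared)

low = simulate(feed_size=2)    # little attention: few candidates to choose from
high = simulate(feed_size=50)  # more attention: stronger selection for quality
```

In runs of this toy model, larger feeds tend to let agents select higher-quality memes; shrinking the feed pushes the shared mix toward whatever arrived most recently, regardless of quality.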

Psychologist Frederic Bartlett’s classic experiments showed that when people retell a story, they reshape it to fit patterns they already know. Cognitive biases greatly worsen the problem.

We now know that our minds do this all the time: they adjust our understanding of new information so that it fits in with what we already know. One consequence of this so-called confirmation bias is that people often seek out, recall and understand information that best confirms what they already believe.
This tendency is extremely difficult to correct.

Making matters worse, search engines and social media platforms provide personalized recommendations based on the vast amounts of data they have about users’ past preferences.

pollution by bots

Nodal diagrams representing 2 social media networks show that when more than 1% of real users follow bots, low-quality information prevails

Social Herding

Social groups create a pressure toward conformity so powerful that it can overcome individual preferences, and by amplifying random early differences, it can cause segregated groups to diverge to extremes.

Social media follows a similar dynamic. We confuse popularity with quality and end up copying the behavior we observe.
Information is transmitted via “complex contagion”: when we are repeatedly exposed to an idea, typically from many sources, we are more likely to adopt and reshare it.
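The distinction between simple and complex contagion can be sketched with the standard threshold model. The ring network, threshold values and seed choices below are illustrative assumptions, not taken from the article:

```python
def spread(adjacency, seeds, threshold):
    """Threshold model of complex contagion: a node adopts an idea only
    after at least `threshold` of its neighbours have adopted it."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neighbours in adjacency.items():
            if node not in adopted and sum(n in adopted for n in neighbours) >= threshold:
                adopted.add(node)
                changed = True
    return adopted

# A ring of 8 accounts where everyone also follows their second neighbours,
# so the same idea can arrive from several overlapping sources.
ring = {i: [(i - 2) % 8, (i - 1) % 8, (i + 1) % 8, (i + 2) % 8] for i in range(8)}

simple = spread(ring, seeds={0}, threshold=1)        # one exposure is enough
stalled = spread(ring, seeds={0}, threshold=2)       # a lone source stalls
complex_ = spread(ring, seeds={0, 1}, threshold=2)   # repeated exposure spreads
```

With threshold 1 a single seed reaches the whole ring; with threshold 2 the same seed goes nowhere, but two adjacent seeds provide the overlapping exposures needed for the idea to cascade all the way around.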

Twitter users with extreme political views are more likely than moderate users to share information from low credibility sources

In addition to showing us items that conform with our views, social media platforms such as Facebook, Twitter, YouTube and Instagram place popular content at the top of our screens and show us how many people have liked and shared something. Few of us realize that these cues do not provide independent assessments of quality.

programmers who design the algorithms for ranking memes on social media assume that the “wisdom of crowds” will quickly identify high-quality items; they use popularity as a proxy for quality. My note: again, ill-conceived folksonomy.

Echo Chambers
The political echo chambers on Twitter are so extreme that individual users’ political leanings can be predicted with high accuracy: you have the same opinions as the majority of your connections. This chambered structure efficiently spreads information within a community while insulating that community from other groups.

Socially shared information not only bolsters our biases but also becomes more resilient to correction.

OSoMe has developed machine-learning algorithms to detect social bots. One of these, Botometer, is a public tool that extracts 1,200 features from a given Twitter account to characterize its profile, friends, social network structure, temporal activity patterns, language and other features. The program compares these characteristics with those of tens of thousands of previously identified bots to give the Twitter account a score for its likely use of automation.
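Botometer's real pipeline uses 1,200 features and far richer models; as a hedged illustration of the comparison idea only, here is a toy nearest-neighbour scorer over three hand-made features. All feature values and labelled examples are invented for the example:

```python
import math

# Invented feature vectors: (tweets_per_day, follower_to_friend_ratio, fraction_retweets)
LABELLED = [
    ((450.0, 0.02, 0.95), 1),  # known bot: very high volume, mostly retweets
    ((300.0, 0.05, 0.90), 1),  # known bot
    ((12.0, 1.5, 0.30), 0),    # known human
    ((5.0, 0.8, 0.10), 0),     # known human
]

def bot_score(features, labelled=LABELLED, k=3):
    """Score an account by comparing its features with labelled examples
    (k-nearest neighbours; a real system would normalise the feature
    scales first -- skipped here for brevity)."""
    dists = sorted((math.dist(features, f), label) for f, label in labelled)
    nearest = [label for _, label in dists[:k]]
    return sum(nearest) / k  # fraction of nearest neighbours that are bots

hyperactive = bot_score((400.0, 0.03, 0.93))  # resembles the bot examples
quiet = bot_score((6.0, 1.0, 0.2))            # resembles the human examples
```

The score is simply the fraction of an account's nearest labelled neighbours that are bots, so the hyperactive account scores high and the quiet one scores low.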

Some manipulators play both sides of a divide through separate fake news sites and bots, driving political polarization or monetizing the resulting traffic through ads.
Researchers recently uncovered a network of inauthentic accounts on Twitter that were all coordinated by the same entity. Some pretended to be pro-Trump supporters of the Make America Great Again campaign, whereas others posed as Trump “resisters”; all asked for political donations.

OSoMe has also built a mobile app called Fakey that helps users learn how to spot misinformation. The game simulates a social media news feed, showing actual articles from low- and high-credibility sources. Users must decide what they can or should not share and what to fact-check. Analysis of data from Fakey confirms the prevalence of online social herding: users are more likely to share low-credibility articles when they believe that many other people have shared them.

Another OSoMe tool, Hoaxy, shows how any extant meme spreads through Twitter. In this visualization, nodes represent actual Twitter accounts, and links depict how retweets, quotes, mentions and replies propagate the meme from account to account.
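Hoaxy itself crawls live Twitter data; the underlying diffusion-graph construction can be sketched from a handful of invented retweet records (all account names and the meme tag below are made up):

```python
from collections import defaultdict

# Invented retweet records: (who_shared, whose_post_they_shared, meme_tag)
records = [
    ("alice", "bob", "#claim"),
    ("carol", "bob", "#claim"),
    ("dave", "alice", "#claim"),
    ("erin", "bob", "#other"),
]

def diffusion_graph(records, meme):
    """Edges of a Hoaxy-style diffusion network for one meme:
    source -> [targets] means each target propagated the source's post."""
    graph = defaultdict(list)
    for sharer, original, tag in records:
        if tag == meme:
            graph[original].append(sharer)
    return dict(graph)

print(diffusion_graph(records, "#claim"))
# {'bob': ['alice', 'carol'], 'alice': ['dave']}
```

Rendering this adjacency structure as nodes and arrows gives exactly the kind of account-to-account propagation picture the visualization draws.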

Free communication is not free. By decreasing the cost of information, we have decreased its value and invited its adulteration. 

information gerrymandering

Information gerrymandering in social networks skews collective decision-making

https://www.nature.com/articles/d41586-019-02562-z

https://www.facebook.com/mariana.damova/posts/10221298893368558

An analysis shows that information flow between individuals in a social network can be ‘gerrymandered’ to skew perceptions of how others in the community will vote — which can alter the outcomes of elections.

The Internet has erased geographical barriers and allowed people across the globe to interact in real time around their common interests. But social media is starting to compete with, or even replace, nationally visible conversations in print and on broadcast media with ad libitum, personalized discourse on virtual social networks. Instead of broadening their spheres of association, people gravitate towards interactions with ideologically aligned content and similarly minded individuals.

In information gerrymandering, the way in which voters are concentrated into districts is not what matters; rather, it is the way in which the connections between them are arranged (Fig. 1). Nevertheless, like geographical gerrymandering, information gerrymandering threatens ideas about proportional representation in a democracy.

Figure 1 | Social-network structure affects voters’ perceptions. In these social networks, ten individuals favour orange and eight favour blue. Each individual has four reciprocal social connections. a, In this random network, eight individuals correctly infer from their contacts’ preferences that orange is more popular, eight infer a draw and only two incorrectly infer that blue is more popular. b, When individuals largely interact with like-minded individuals, filter bubbles arise in which all individuals believe that their party is the most popular. Voting gridlock is more likely in such situations, because no one recognizes a need to compromise. c, Stewart et al. describe ‘information gerrymandering’, in which the network structure skews voters’ perceptions about others’ preferences. Here, two-thirds of voters mistakenly infer that blue is more popular. This is because blue proponents strategically influence a small number of orange-preferring individuals, whereas orange proponents squander their influence on like-minded individuals who have exclusively orange-preferring contacts, or on blue-preferring individuals who have enough blue-preferring contacts to remain unswayed.
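The inference step in the figure — each voter polling only their own contacts — can be sketched directly. The five-voter network below is a deliberately tiny, made-up miniature, not the figure's 18-node network:

```python
from collections import Counter

def perceived_winner(contacts, preferences, voter):
    """The majority a voter infers from their own contacts' preferences."""
    tally = Counter(preferences[c] for c in contacts[voter])
    ranked = tally.most_common()
    if len(ranked) > 1 and ranked[0][1] == ranked[1][1]:
        return "draw"
    return ranked[0][0]

# Made-up electorate: orange holds the true 3-2 majority.
preferences = {"a": "orange", "b": "orange", "c": "orange",
               "d": "blue", "e": "blue"}

# The two blue voters concentrate their influence on the orange voters,
# while orange voters barely hear from one another -- gerrymandered flow.
contacts = {"a": ["d", "e"], "b": ["d", "e"], "c": ["a", "d", "e"],
            "d": ["e"], "e": ["d"]}

views = {v: perceived_winner(contacts, preferences, v) for v in preferences}
# Every voter infers that blue is more popular, despite the orange majority.
```

Rewiring who hears from whom, without changing a single preference, is enough to flip everyone's perception of which side is winning — the core of the gerrymandering effect.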

social media and democracy

The biggest threat to democracy? Your social media feed

Vyacheslav Polonski, Network Scientist, Oxford Internet Institute
Yochai Benkler explains: “The various formats of the networked public sphere provide anyone with an outlet to speak, to inquire, to investigate, without need to access the resources of a major media organization.”
Democratic bodies are typically elected in periods of three to five years, yet citizen opinions seem to fluctuate daily and sometimes these mood swings grow to enormous proportions. When thousands of people all start tweeting about the same subject on the same day, you know that something is up. With so much dynamic and salient political diversity in the electorate, how can policy-makers ever reach a consensus that could satisfy everyone?
At the same time, it would be a grave mistake to discount the voices of the internet as something that has no connection to real political situations.
What happened in the UK was not only a political disaster, but also a vivid example of what happens when you combine the uncontrollable power of the internet with a lingering visceral feeling that ordinary people have lost control of the politics that shape their lives.


Polarization as a driver of populism

People who have long entertained right-wing populist ideas, but were never confident enough to voice them openly, are now in a position to connect to like-minded others online and use the internet as a megaphone for their opinions.

The resulting echo chambers tend to amplify and reinforce our existing opinions, which is dysfunctional for a healthy democratic discourse. And while social media platforms like Facebook and Twitter generally have the power to expose us to politically diverse opinions, research suggests that the filter bubbles they sometimes create are, in fact, exacerbated by the platforms’ personalization algorithms, which are based on our social networks and our previously expressed ideas. This means that instead of creating an ideal type of a digitally mediated “public agora”, which would allow citizens to voice their concerns and share their hopes, the internet has actually increased conflict and ideological segregation between opposing views, granting a disproportionate amount of clout to the most extreme opinions.

The disintegration of the general will

In political philosophy, the very idea of democracy is based on the principle of the general will, which was proposed by Jean-Jacques Rousseau in the 18th century. Rousseau envisioned that a society needs to be governed by a democratic body that acts according to the imperative will of the people as a whole.

There can be no doubt that a new form of digitally mediated politics is a crucial component of the Fourth Industrial Revolution: the internet is already used for bottom-up agenda-setting, empowering citizens to speak up in a networked public sphere, and pushing the boundaries of the size, sophistication and scope of collective action. In particular, social media has changed the nature of political campaigning and will continue to play an important role in future elections and political campaigns around the world.

++++++++++++++

more on the impact of technology on democracy in this IMS blog:

https://blog.stcloudstate.edu/ims?s=democracy