Apr 2021
virtual reality art
+++++++++++++++++
more on VR in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+reality
Digital Literacy for St. Cloud State University
Facebook is testing Hotline, a Q&A app that’s a bit like Clubhouse, but with video. A real-estate investor hosted the app’s first live session. https://t.co/h8vW0ZASvc
— Business Insider Tech (@BITech) April 8, 2021
https://www.businessinsider.com/facebook-hotline-app-clubhouse-test-video-instagram-2021-4
Hotline can be accessed via its website — there’s no app version available for smartphones yet. The website requires users to sign in via their Twitter account, rather than their Facebook account, then leave their name on a waiting list.
In contrast to audio-only Clubhouse, Hotline users can use videos and livestreams with a Q&A feature to chat to their audience, a bit like live videos on Instagram, which Facebook owns.
+++++++++++++++
more on Facebook in this IMS blog
https://blog.stcloudstate.edu/ims?s=facebook
Last year the Institute for Strategic Dialogue released a report that found QAnon’s following had grown considerably in Australia during 2020, with Facebook, Instagram, YouTube and Twitter driving increased engagement.
The report found Australia was the fourth-largest country for QAnon activity, behind the US, UK and Canada. Its presence in Australia is also evident on less mainstream sites. For example, as the Canadian QAnon researcher Marc-André Argentino has pointed out, there were at least six Australian Q “research boards” on the site 8kun, with about 4,000 posts by January last year. That had increased to 11 boards by the start of 2021.
Last year, Guardian Australia revealed QAnon had found a follower in Tim Stewart, a family friend of the prime minister, Scott Morrison. Stewart was behind one of Australia’s largest QAnon-linked accounts, BurnedSpy34.
++++++++++++++
more on QAnon in this IMS blog
https://blog.stcloudstate.edu/ims?s=qanon
more on QAnon in Germany: https://blog.stcloudstate.edu/ims?s=qanon+germany
How companies are failing social media managers—and wasting crucial talent
Facebook and Twitter are still the most frequently managed company channels. “The top five social media channels managed include Facebook (81%), Twitter (77%), LinkedIn (67%), Instagram (66%), and YouTube (51%),” according to the report.
(Just 6% are managing TikTok channels despite the platform’s meteoric rise.)
The primary role of social media managers is to create content. Forty-one percent of respondents said their primary role as a social media manager was to create content, while 20% said their main goal was to improve brand awareness and reputation.
It’s a female-dominated field. Seventy-nine percent of the 379 respondents are women.
Engagement and replies are the top metric for evaluating performance, but many have no clear objectives. According to the report, 45% of social media managers are evaluated on their engagement and replies, followed by progress toward goals (36%) and follower counts (33%).
Thirty percent said their social media performance was not evaluated at all.
++++++++++++++++++
more on social media in this IMS blog
https://blog.stcloudstate.edu/ims?s=social+media
Tens of thousands took to the streets across Russia, sharing photos and videos on social media faster than they could be removed, urging others to join.
Posted by NPR on Sunday, January 24, 2021
But the Russian media regulator Roskomnadzor quickly jumped to pressure social media platforms to remove videos that it said called for minors to participate in protests, warning that it was illegal for them to do so. On Friday, the government claimed that 38% of all offending videos on TikTok, 50% on YouTube and 17% on Instagram had been removed. By Saturday Roskomnadzor said more messages had been deleted on TikTok and Russian social media app Vkontakte. Facebook and Google, the owners of Instagram and YouTube respectively, expressed uncertainty about the accuracy of the numbers, Gizmodo reported.
++++++++++++
more on censorship in this IMS blog
https://blog.stcloudstate.edu/ims?s=censor
YouTube, TikTok and Instagram Delete Russian Posts Promoting Navalny Protests (via r/worldnews)
Social media platforms are taking down Russians’ calls to protest in support of jailed opposition leader Alexei Navalny over the government’s claims that they illegally incite minors to attend unauthorized rallies.
++++++++++++
more on censorship in this IMS blog
https://blog.stcloudstate.edu/ims?s=censor
Internet propaganda is becoming an industrialized commodity, warns Phil Howard, the director of the Oxford Internet…
Posted by SPIEGEL International on Friday, January 15, 2021
++++++++++++
more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news
Why the grandiose promises of multilevel marketing and QAnon conspiracy theories go hand in hand
The Concordia University researcher Marc-André Argentino has a name for people like Schrandt: “Pastel QAnon.” These women—they are almost universally women—are doing the work of sanitizing QAnon, often pairing its least objectionable elements (Save the children!) with equally inoffensive imagery: Millennial-pink-and-gold color schemes, a winning smile. And many of them are members of multilevel-marketing organizations—a massive, under-examined sector of the American retail economy that is uniquely fertile ground for conspiracism. These are organizations built on foundational myths (that the establishment is keeping secrets from you, that you are on a hero’s journey to enlightenment and wealth), charismatic leadership, and shameless, constant posting. The people at the top of them are enviable, rich, and gifted at wrapping everything that happens—in their personal lives, or in the world around them—into a grand narrative about how to become as happy as they are. In 2020, what’s happening to them is dark and dangerous, but it looks gorgeous.
+++++++++++++++
more on QAnon in this IMS blog
https://blog.stcloudstate.edu/ims?s=qanon
Understanding how algorithm manipulators exploit our cognitive vulnerabilities empowers us to fight back
a minefield of cognitive biases.
People who behaved in accordance with them—for example, by staying away from the overgrown pond bank where someone said there was a viper—were more likely to survive than those who did not.
Compounding the problem is the proliferation of online information. Viewing and producing blogs, videos, tweets and other units of information called memes has become so cheap and easy that the information marketplace is inundated. My note: folksonomy at its worst.
At the University of Warwick in England and at Indiana University Bloomington’s Observatory on Social Media (OSoMe, pronounced “awesome”), our teams are using cognitive experiments, simulations, data mining and artificial intelligence to comprehend the cognitive vulnerabilities of social media users.
developing analytical and machine-learning aids to fight social media manipulation.
As Nobel Prize–winning economist and psychologist Herbert A. Simon noted, “What information consumes is rather obvious: it consumes the attention of its recipients.”
attention economy
Our models revealed that even when we want to see and share high-quality information, our inability to view everything in our news feeds inevitably leads us to share things that are partly or completely untrue.
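This limited-attention result can be illustrated with a minimal simulation (the feed size, attention span and quality scores below are invented for illustration, not taken from the OSoMe models):

```python
import random

# Minimal agent-based sketch of the limited-attention idea described above:
# an agent wants to share the best meme, but can only view the top slice of
# its feed, so better content further down goes unshared. Parameters invented.

random.seed(42)

def simulate(feed_size=50, attention=5, steps=1000):
    """Fraction of trials in which a better meme than the one shared
    went unseen, given that only `attention` feed items are visible."""
    missed = 0
    for _ in range(steps):
        feed = [random.random() for _ in range(feed_size)]  # quality in [0, 1]
        visible = feed[:attention]        # limited attention: top slice only
        shared = max(visible)             # agent shares the best *visible* meme
        if shared < max(feed):            # a higher-quality meme was never seen
            missed += 1
    return missed / steps

print(simulate())
```

With these toy numbers the best available meme is invisible to the agent in roughly 90% of trials, so even a quality-seeking sharer routinely passes along second-best content.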
Frederic Bartlett
Cognitive biases greatly worsen the problem.
We now know that our minds do this all the time: they adjust our understanding of new information so that it fits in with what we already know. One consequence of this so-called confirmation bias is that people often seek out, recall and understand information that best confirms what they already believe.
This tendency is extremely difficult to correct.
Making matters worse, search engines and social media platforms provide personalized recommendations based on the vast amounts of data they have about users’ past preferences.
pollution by bots
Social Herding
social groups create a pressure toward conformity so powerful that it can overcome individual preferences, and by amplifying random early differences, it can cause segregated groups to diverge to extremes.
Social media follows a similar dynamic. We confuse popularity with quality and end up copying the behavior we observe.
information is transmitted via “complex contagion”: when we are repeatedly exposed to an idea, typically from many sources, we are more likely to adopt and reshare it.
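The “complex contagion” dynamic can be sketched as a threshold model: a node adopts an idea only once a sufficient fraction of its neighbors has, rather than after a single exposure. The toy graph and threshold below are invented for illustration:

```python
# Sketch of complex contagion: adoption requires a fraction of one's
# neighbors (i.e. repeated exposure from multiple sources), not just one.

def spread(adjacency, seeds, threshold=0.5, rounds=10):
    """Return the set of adopters after simulating threshold adoption."""
    adopted = set(seeds)
    for _ in range(rounds):
        new = set()
        for node, neighbors in adjacency.items():
            if node in adopted or not neighbors:
                continue
            exposure = sum(n in adopted for n in neighbors) / len(neighbors)
            if exposure >= threshold:   # enough neighbors already adopted
                new.add(node)
        if not new:
            break
        adopted |= new
    return adopted

# A small clustered network: a tight cluster {A, B, C} plus a chain to E.
graph = {
    "A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"],
    "D": ["C", "E"], "E": ["D"],
}
print(sorted(spread(graph, seeds={"A", "B"})))  # cascades through the network
```

Raising the threshold (say, to 0.6) stops the cascade at the first sparsely connected node, which is why dense, clustered communities spread such ideas far more effectively than thin chains of contacts.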
In addition to showing us items that conform with our views, social media platforms such as Facebook, Twitter, YouTube and Instagram place popular content at the top of our screens and show us how many people have liked and shared something. Few of us realize that these cues do not provide independent assessments of quality.
programmers who design the algorithms for ranking memes on social media assume that the “wisdom of crowds” will quickly identify high-quality items; they use popularity as a proxy for quality. My note: again, ill-conceived folksonomy.
Echo Chambers
the political echo chambers on Twitter are so extreme that individual users’ political leanings can be predicted with high accuracy: you have the same opinions as the majority of your connections. This chambered structure efficiently spreads information within a community while insulating that community from other groups.
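The claim that “you have the same opinions as the majority of your connections” amounts to a neighbor-majority heuristic. A minimal sketch, using an invented toy follow graph and labels (not the actual Twitter prediction model):

```python
from collections import Counter

# Neighbor-majority heuristic: guess a user's political leaning from the
# most common leaning among the accounts they follow. Data is invented.

def predict_leaning(user, follows, known_leanings):
    """Predict a user's leaning as the majority leaning of who they follow."""
    votes = Counter(known_leanings[f] for f in follows[user]
                    if f in known_leanings)
    return votes.most_common(1)[0][0] if votes else None

follows = {"u1": ["a", "b", "c"], "u2": ["c", "d", "e"]}
known_leanings = {"a": "left", "b": "left", "c": "right",
                  "d": "right", "e": "right"}
print(predict_leaning("u1", follows, known_leanings))  # prints "left"
```

The more chambered the network, the more one-sided each user's neighborhood becomes, and the more accurate even this crude majority vote gets.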
socially shared information not only bolsters our biases but also becomes more resilient to correction.
machine-learning algorithms to detect social bots. One of these, Botometer, is a public tool that extracts 1,200 features from a given Twitter account to characterize its profile, friends, social network structure, temporal activity patterns, language and other features. The program compares these characteristics with those of tens of thousands of previously identified bots to give the Twitter account a score for its likely use of automation.
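Botometer’s real model uses some 1,200 features and supervised machine learning; the sketch below only illustrates the general idea of scoring an account by its similarity to labeled bots, with three invented features and a nearest-centroid comparison standing in for the actual classifier:

```python
import math

# Toy illustration of feature-based bot scoring. The features (tweets per
# day, fraction of retweets, followers/friends ratio) and all numbers are
# invented; Botometer's real pipeline is far larger.

LABELED = {
    "bot":   [(200, 0.95, 0.10), (150, 0.90, 0.20), (300, 0.99, 0.05)],
    "human": [(5, 0.30, 1.5), (2, 0.10, 2.0), (10, 0.40, 0.8)],
}

def centroid(points):
    return tuple(sum(xs) / len(xs) for xs in zip(*points))

def bot_score(features):
    """Score in [0, 1]: closer to the bot centroid -> closer to 1."""
    d_bot = math.dist(features, centroid(LABELED["bot"]))
    d_human = math.dist(features, centroid(LABELED["human"]))
    return d_human / (d_bot + d_human)

print(round(bot_score((180, 0.92, 0.1)), 2))  # high -> likely automated
print(round(bot_score((4, 0.20, 1.2)), 2))    # low  -> likely human
```

The principle is the same at scale: characterize an account along many behavioral dimensions, then measure how closely it resembles previously identified bots.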
Some manipulators play both sides of a divide through separate fake news sites and bots, driving political polarization or monetization by ads.
recently uncovered a network of inauthentic accounts on Twitter that were all coordinated by the same entity. Some pretended to be pro-Trump supporters of the Make America Great Again campaign, whereas others posed as Trump “resisters”; all asked for political donations.
a mobile app called Fakey that helps users learn how to spot misinformation. The game simulates a social media news feed, showing actual articles from low- and high-credibility sources. Users must decide what they can or should not share and what to fact-check. Analysis of data from Fakey confirms the prevalence of online social herding: users are more likely to share low-credibility articles when they believe that many other people have shared them.
Hoaxy, shows how any extant meme spreads through Twitter. In this visualization, nodes represent actual Twitter accounts, and links depict how retweets, quotes, mentions and replies propagate the meme from account to account.
Free communication is not free. By decreasing the cost of information, we have decreased its value and invited its adulteration.
Kyle Chayka calls this “ambient television,” an artifact of contemporary dystopia made for our quarantine era, when nothing’s stopping you from leaving the TV on all day long.
https://www.newyorker.com/culture/cultural-comment/emily-in-paris-and-the-rise-of-ambient-tv
+++++++++++++++
more on social media in this IMS blog
https://blog.stcloudstate.edu/ims?s=social+media