Searching for "FAKE NEWS"

weaponizing the web RT hybrid war

Fake news and botnets: how Russia weaponised the web

https://www.theguardian.com/technology/2017/dec/02/fake-news-botnets-how-russia-weaponised-the-web-cyber-attack-estonia

The digital attack that brought Estonia to a standstill 10 years ago was the first shot in a cyberwar that has been raging between Moscow and the west ever since

It began at exactly 10pm on 26 April 2007, when a Russian-speaking mob began rioting in the streets of Tallinn, the capital city of Estonia, killing one person and wounding dozens of others. That incident resonates powerfully with some of the recent conflicts in the US. In 2007, the Estonian government had announced that a bronze statue of a heroic second world war Soviet soldier was to be removed from a central city square. For ethnic Estonians, the statue had less to do with the war than with the Soviet occupation that followed it, which lasted until independence in 1991. For the country’s Russian-speaking minority – 25% of Estonia’s 1.3 million people – the removal of the memorial was another sign of ethnic discrimination.

That evening, Jaan Priisalu – a former risk manager for Estonia’s largest bank, Hansabank, who was working closely with the government on its cybersecurity infrastructure – was at home in Tallinn with his girlfriend when his phone rang. On the line was Hillar Aarelaid, the chief of Estonia’s cybercrime police.

“It’s going down,” Aarelaid declared. Alongside the street fighting, reports of digital attacks were beginning to filter in. The websites of the parliament, major universities, and national newspapers were crashing. Priisalu and Aarelaid had suspected something like this could happen one day. A digital attack on Estonia had begun.

“The Russian theory of war allows you to defeat the enemy without ever having to touch him,” says Peter Pomerantsev, author of Nothing is True and Everything is Possible. “Estonia was an early experiment in that theory.”

Since then, Russia has only developed, and codified, these strategies. The techniques pioneered in Estonia are known as the “Gerasimov doctrine,” named after Valery Gerasimov, the chief of the general staff of the Russian military. In 2013, Gerasimov published an article in the Russian journal Military-Industrial Courier, articulating the strategy of what is now called “hybrid” or “nonlinear” warfare. “The lines between war and peace are blurred,” he wrote. New forms of antagonism, as seen in 2010’s Arab spring and the “colour revolutions” of the early 2000s, could transform a “perfectly thriving state, in a matter of months, and even days, into an arena of fierce armed conflict”.

Russia has deployed these strategies around the globe. Its 2008 war with Georgia, another former Soviet republic, relied on a mix of both conventional and cyber-attacks, as did the 2014 invasion of Crimea. Both began with civil unrest sparked via digital and social media – followed by tanks. Finland and Sweden have experienced near-constant Russian information operations. Russian hacks and social media operations have also occurred during recent elections in Holland, Germany, and France. Most recently, Spain’s leading daily, El País, reported on Russian meddling in the Catalonian independence referendum. Russian-supported hackers had allegedly worked with separatist groups, presumably with a mind to further undermining the EU in the wake of the Brexit vote.

The Kremlin has used the same strategies against its own people. Domestically, history books, school lessons, and media are manipulated, while laws are passed blocking foreign companies’ access to the Russian population’s online data – an essential resource in today’s global information-sharing culture. According to British military researcher Keir Giles, author of Nato’s Handbook of Russian Information Warfare, the Russian government, or actors that it supports, has even captured the social media accounts of celebrities in order to spread provocative messages under their names but without their knowledge. The goal, both at home and abroad, is to sever outside lines of communication so that people get their information only through controlled channels.

+++++++++++++++++++++
24-hour Putin people: my week watching Kremlin ‘propaganda channel’ RT

https://www.theguardian.com/media/2017/nov/29/24-hour-putin-people-my-week-watching-kremlin-propaganda-channel-rt-russia-today

 Wednesday 29 November 2017 

According to its detractors, RT is Vladimir Putin’s global disinformation service, countering one version of the truth with another in a bid to undermine the whole notion of empirical truth. And yet influential people from all walks of public life appear on it, or take its money. You can’t criticise RT’s standards, they say, if you don’t watch it. So I watched it. For a week.

Suchet, the son of former ITV newsreader John Suchet and the nephew of actor David Suchet, has been working for RT since 2009. The offspring of well-known people feature often on RT. Sophie Shevardnadze, who presents Sophie & Co, is the granddaughter of former Georgian president and Soviet foreign minister Eduard Shevardnadze. Tyrel Ventura, who presents Watching the Hawks on RT America, is the son of wrestler-turned-politician Jesse Ventura. His co-host is Oliver Stone’s son Sean.

My note: so this is why Oliver Stone went gentle on Putin in his “documentary”, so that his son could have a job. #Nepotism #FakeNews

RT’s stated mission is to offer an “alternative perspective on major global events”, but the world according to RT is often downright surreal.

Peter Pomerantsev, author of Nothing Is True and Everything Is Possible, about Putin’s Russia, and now a senior visiting fellow in global affairs at the London School of Economics, was in Moscow working in television when Russia Today first started hiring graduates from Britain and the US. “The people were really bright, they were being paid well,” he says. But they soon found they were being ordered to change their copy, or instructed how to cover certain stories to reflect well on the Kremlin. “Everyone had their own moment when they first twigged that this wasn’t like the BBC,” he says. “That, actually, this is being dictated from above.” The coverage of Russia’s war with Georgia in 2008 was a lightbulb moment for many, he says. They quit.

+++++++++++++++

more on Russian bots, trolls:
https://blog.stcloudstate.edu/ims/2017/11/22/bots-trolls-and-fake-news/

+++++++++++++++
more on state propaganda in this IMS blog
https://blog.stcloudstate.edu/ims/2017/11/21/china-of-xi/

China of Xi

Time of Xi



My note: the abbreviation CCTV (China Central Television, http://english.cctv.com/) coincidentally overlaps with CCTV as in closed-circuit television (https://en.wikipedia.org/wiki/Closed-circuit_television), “also known as video surveillance”.

China Central Television (formerly Beijing Television), commonly abbreviated as CCTV, is the predominant state television broadcaster in the People’s Republic of China. CCTV operates a network of 50 channels broadcasting different programmes, provides programming in six languages, and is accessible to more than one billion viewers.[1] Most of its programmes are a mixture of news, documentary, social education, comedy, entertainment, and drama, the majority of which consists of Chinese soap operas and entertainment.[2]

https://en.wikipedia.org/wiki/China_Central_Television

CCTV is one of the official mouthpieces of the Communist Party of China, and is part of what is known in China as the “central three” (中央三台), with the others being China National Radio and China Radio International.

Fake news and CCTV

https://en.wikipedia.org/wiki/China_Central_Television

https://blogs.wsj.com/chinarealtime/2014/03/28/china-targets-fake-news/

http://ascportfolios.org/chinaandmedia/2011/01/31/fake-news-in-the-news/

https://www.huffingtonpost.com/entry/united-states-china-fake-news_us_592494d5e4b00c8df29f88d7

CCTV mentioned positively: http://www.bbc.com/news/world-asia-china-22424129

topics for IM260

proposed topics for IM 260 class

  • Media literacy. Differentiated instruction. Media literacy guide.
    Fake news as part of media literacy. Visual literacy as part of media literacy. Media literacy as part of digital citizenship.
  • Web design / web development
    the roles of HTML5, CSS, JavaScript, PHP, Bootstrap, jQuery, React and other scripting languages and libraries. Heat maps and other usability issues; website content strategy. The Model-View-Controller (MVC) design pattern (see the sketch after this list)
  • Social media for institutional use. Digital curation. Social media algorithms. Etiquette and ethics. Mastodon
    I hosted a four-week LITA webinar in the fall of 2016; I can adapt any information from that webinar for the use of the IM students
  • OER and instructional designer’s assistance to book creators.
    I can cover both the “library part” (“free” OER, copyright issues etc) and the support / creative part of an OER book / textbook
  • Big Data. Data visualization. Large-scale visualization. Text encoding. Analytics, data mining. Unizin. Python and R in academia.
    I can introduce the students to the large idea of Big Data and its importance in view of the upcoming IoT, but also delineate its importance for academia, business, etc. From infographics to heavy-duty visualization (Primo X-Services API, JSON, Flask).
  • Net neutrality, Digital Darwinism, the Internet economy and the role of your profession in such an environment
    I can introduce students to the issues, if they are not familiar with them, and/or lead a discussion on a rather controversial topic
  • Digital assessment. Digital Assessment literacy.
    I can introduce students to tools, how to evaluate and select tools and their pedagogical implications
  • Wikipedia
    a hands-on exercise on working with Wikipedia. After the session, students will be able to create Wikipedia entries, thus gaining first-hand knowledge of Wikipedia’s editorial process and the nature of its information.
  • Effective presentations. Tools, methods, concepts and theories (cognitive load). Presentations in the era of VR, AR and mixed reality. Unity.
    I can facilitate a discussion among experts (your students) on selection of tools and their didactically sound use to convey information. I can supplement the discussion with my own findings and conclusions.
  • eConferencing. Tools and methods
    I can facilitate a discussion among your students on the selection and comparison of tools, and a discussion about their future and their place in an increasingly online learning environment
  • Digital Storytelling. Immersive Storytelling. The Moth. Twine. Transmedia Storytelling
    I am teaching a LIB 490/590 Digital Storytelling class. I can adapt any information from that class to the use of IM students
  • VR, AR, Mixed Reality.
    besides Mark Gill, I can facilitate a discussion that goes beyond hardware and brands and expands on the implications for academia and for corporate education / the corporate world
  • IoT , Arduino, Raspberry PI. Industry 4.0
  • Instructional design. ID2ID
    I can facilitate a discussion based on the Educause suggestions about the profession’s development
  • Microcredentialing in academia and corporate world. Blockchain
  • IT in K12. How to evaluate, prioritize, and select; obsolete trends in 21st-century schools. K12 mobile learning
  • Podcasting: past, present, future. Beautiful Audio Editor.
    a definition of podcasting and delineation of similar activities; advantages and disadvantages.
  • Digital, Blended (Hybrid), Online teaching and learning: facilitation. Methods and techniques. Proctoring. Online students’ expectations. Faculty support. Asynch. Blended Synchronous Learning Environment
  • Gender, race and age in education. Digital divide. Xennials, Millennials and Gen Z. A generational approach to teaching and learning. Young vs. old Millennials. Millennial employees.
  • Privacy, [cyber]security, surveillance. K12 cyberincidents. Hackers.
  • Gaming and gamification. Appsmashing. Gradecraft
  • Lecture capture, course capture.
  • Bibliometrics, altmetrics
  • Technology and cheating, academic dishonesty, plagiarism, copyright.
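
To make the MVC topic in the web design item above concrete, here is a minimal sketch in plain Python; the class names (TaskModel, TaskView, TaskController) and the to-do example are hypothetical, chosen only to show how the three roles separate concerns.

```python
# A minimal, illustrative Model-View-Controller (MVC) sketch in plain Python.
# All names here are hypothetical, for demonstration only.

class TaskModel:
    """Model: owns the data and the rules for changing it."""
    def __init__(self):
        self._tasks = []

    def add_task(self, title):
        self._tasks.append(title)

    def all_tasks(self):
        return list(self._tasks)


class TaskView:
    """View: renders model data; knows nothing about storage logic."""
    def render(self, tasks):
        for i, title in enumerate(tasks, start=1):
            print(f"{i}. {title}")


class TaskController:
    """Controller: turns user actions into model updates and view refreshes."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def create(self, title):
        self.model.add_task(title)
        self.view.render(self.model.all_tasks())


if __name__ == "__main__":
    controller = TaskController(TaskModel(), TaskView())
    controller.create("Draft media literacy guide")
    controller.create("Review heat map results")
```

The same separation underlies many of the web frameworks named in that list item (React on the view side, PHP or Flask applications on the controller/model side).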

social media algorithms

How do algorithms impact our browsing behavior and browsing history?
What is the connection between social media algorithms and fake news?
Are there topic-detection algorithms, just as there are community-detection ones?
How can I change the content returned by a [Google] search? Can I?

Larson, S. (2016, July 8). What is an Algorithm and How Does it Affect You? The Daily Dot. Retrieved from https://www.dailydot.com/debug/what-is-an-algorithm/
Berg, P. (2016, June 30). How Do Social Media Algorithms Affect You | Forge and Smith. Retrieved September 19, 2017, from https://forgeandsmith.com/how-do-social-media-algorithms-affect-you/
Oremus, W., & Chotiner, I. (2016, January 3). Who Controls Your Facebook Feed. Slate. Retrieved from http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html
Lehrman, R. A. (2013, August 11). The new age of algorithms: How it affects the way we live. Christian Science Monitor. Retrieved from https://www.csmonitor.com/USA/Society/2013/0811/The-new-age-of-algorithms-How-it-affects-the-way-we-live
Johnson, C. (2017, March 10). How algorithms affect our way of life. Deseret News. Retrieved from https://www.deseretnews.com/article/865675141/How-algorithms-affect-our-way-of-life.html
Understanding algorithms and their impact on human life goes far beyond basic digital literacy, some experts said.
An example could be the recent outcry over Facebook’s news algorithm, which enhances the so-called “filter bubble” of information.
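To illustrate the filter-bubble concern in code, here is a toy ranking sketch in Python; the posts, the user_affinity profile, and the score() function are all hypothetical and are not Facebook’s actual News Feed algorithm, only a sketch of how engagement-weighted personalization can narrow a feed.

```python
# A toy illustration of how engagement-weighted, personalized ranking can
# narrow what a user sees (a "filter bubble"). This is a hypothetical sketch,
# not any platform's real algorithm.

posts = [
    {"title": "Calm policy explainer", "topic": "politics", "engagement": 120},
    {"title": "Outrage-bait headline", "topic": "politics", "engagement": 950},
    {"title": "Local school board update", "topic": "education", "engagement": 40},
    {"title": "Viral conspiracy meme", "topic": "politics", "engagement": 800},
]

# Hypothetical user profile: past clicks give "politics" a higher affinity.
user_affinity = {"politics": 0.9, "education": 0.2}

def score(post):
    """Rank by raw engagement weighted by the user's topic affinity."""
    return post["engagement"] * user_affinity.get(post["topic"], 0.1)

feed = sorted(posts, key=score, reverse=True)
for post in feed:
    print(f'{score(post):7.1f}  {post["title"]}')
```

High-engagement items that match the user’s existing interests float to the top and crowd out everything else, which is exactly the dynamic the “filter bubble” critique describes.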
personalized search (https://en.wikipedia.org/wiki/Personalized_search)
Kounine, A. (2016, August 24). How your personal data is used in personalization and advertising. Retrieved September 19, 2017, from https://www.tastehit.com/blog/personal-data-in-personalization-and-advertising/
Hotchkiss, G. (2007, March 9). The Pros & Cons Of Personalized Search. Retrieved September 19, 2017, from http://searchengineland.com/the-pros-cons-of-personalized-search-10697
Magid, L. (2012). How (and why) To Turn Off Google’s Personalized Search Results. Forbes. Retrieved from https://www.forbes.com/sites/larrymagid/2012/01/13/how-and-why-to-turn-off-googles-personalized-search-results/#53a30be838f2
Nelson, P. (n.d.). Big Data, Personalization and the No-Search of Tomorrow. Retrieved September 19, 2017, from https://www.searchtechnologies.com/blog/big-data-search-personalization

gender

Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329-346. doi:10.1177/1461444815608807

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d121748152%26site%3dehost-live%26scope%3dsite

community detection algorithms:

Bedi, P., & Sharma, C. (2016). Community detection in social networks. WIREs Data Mining & Knowledge Discovery, 6(3), 115-135.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d114513548%26site%3dehost-live%26scope%3dsite

Cruz, J. D., Bothorel, C., & Poulet, F. (2014). Community detection and visualization in social networks: Integrating structural and semantic information. ACM Transactions on Intelligent Systems & Technology, 5(1), 1-26. doi:10.1145/2542182.2542193

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3daph%26AN%3d95584126%26site%3dehost-live%26scope%3dsite

Bai, X., Yang, P., & Shi, X. (2017). An overlapping community detection algorithm based on density peaks. Neurocomputing, 226, 7-15. doi:10.1016/j.neucom.2016.11.019

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d120321022%26site%3dehost-live%26scope%3dsite
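
As a concrete, minimal illustration of what community detection does (not a reimplementation of the methods in the papers above), the sketch below uses NetworkX’s greedy modularity algorithm on the classic karate club benchmark graph.

```python
# A minimal community detection sketch using NetworkX's greedy modularity
# algorithm (Clauset-Newman-Moore). Illustrative only; the cited papers use
# different and more sophisticated methods.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Zachary's karate club: a small, well-known social network benchmark.
G = nx.karate_club_graph()

communities = greedy_modularity_communities(G)
for i, members in enumerate(communities, start=1):
    print(f"Community {i}: {sorted(members)}")
```

On social media data, the nodes would be accounts and the edges interactions; densely connected clusters often correspond to the like-minded groups discussed in the fake news literature.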

topic-detection algorithms:

Zeng, J., & Zhang, S. (2009). Incorporating topic transition in topic detection and tracking algorithms. Expert Systems with Applications, 36(1), 227-232. doi:10.1016/j.eswa.2007.09.013

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d34892957%26site%3dehost-live%26scope%3dsite

Topic detection and tracking (TDT) algorithms are often based on topic models such as LDA, pLSI (https://en.wikipedia.org/wiki/Probabilistic_latent_semantic_analysis), etc.
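
A minimal topic-modeling sketch with scikit-learn’s LDA implementation is below; the toy documents are hypothetical, and a real TDT pipeline would add the time-aware tracking components this sketch omits (it assumes scikit-learn 1.0+ for get_feature_names_out).

```python
# A minimal LDA topic-modeling sketch with scikit-learn. The toy documents
# are hypothetical; real topic detection and tracking (TDT) systems add
# time-aware components that this sketch leaves out.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "election coverage and fake news spread on social media",
    "botnets amplify disinformation during the election",
    "students learn media literacy and fact checking in class",
    "teachers design media literacy lessons for digital citizenship",
]

# Convert documents to a bag-of-words matrix.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Fit a two-topic LDA model.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top words for each discovered topic.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {topic_idx}: {', '.join(top)}")
```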

Zhou, E., Zhong, N., & Li, Y. (2014). Extracting news blog hot topics based on the W2T methodology. World Wide Web, 17(3), 377-404. doi:10.1007/s11280-013-0207-7

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d94609674%26site%3dehost-live%26scope%3dsite

The W2T (Wisdom Web of Things) methodology considers the information organization and management from the perspective of Web services, which contributes to a deep understanding of online phenomena such as users’ behaviors and comments in e-commerce platforms and online social networks.  (https://link.springer.com/chapter/10.1007/978-3-319-44198-6_10)

ethics of algorithms

Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 3(2), 2053951716679679. https://doi.org/10.1177/2053951716679679

journalism

Malyarov, N. (2016, October 18). Journalism in the age of algorithms, platforms and newsfeeds | News | FIPP.com. Retrieved September 19, 2017, from http://www.fipp.com/news/features/journalism-in-the-age-of-algorithms-platforms-newsfeeds

+++++++++++++++++
https://blog.stcloudstate.edu/ims?s=algorithm
more on algorithms in this IMS blog

see also

social media for anthropology

ANTH 101 with Kelly Branam Macauley

Plamen Miltenoff: http://web.stcloudstate.edu/pmiltenoff/faculty/
relevant classes I teach that might be of interest to you:
http://web.stcloudstate.edu/pmiltenoff/lib290/. If you want to survey the class, here is the FB group page: https://www.facebook.com/groups/LIB290/
and
http://web.stcloudstate.edu/pmiltenoff/lib490/

short link to this presentation: http://bit.ly/lib4anth

Please pull out your smartphones, go to your Internet browser and type: kahoot.it or click on the link: https://play.kahoot.it/

what is social media from an anthropological point of view?

A study, the “Why We Post” project, has just been published by nine anthropologists led by Daniel Miller of University College London. They worked independently for 15 months at locations in Brazil, Britain, Chile, China (one rural and one industrial site), India, Italy, Trinidad and Tobago, and Turkey.

In rural China and Turkey social media were viewed as a distraction from education. But in industrial China and Brazil they were seen to be an educational resource. Such a divide was evident in India, too. There, high-income families regarded them with suspicion but low-income families advocated them as a supplementary source of schooling. In Britain, meanwhile, they were valued not directly as a means of education, but as a way for pupils, parents and teachers to communicate.

How would you answer if this study’s questions were put to you? How do you see social media? Do you see it differently than before?

+++++++++++++++++++++++++++++++++++++

Jordan, K. (2017, January 13). When Social Media Are the News | Anthropology-News [American Anthropological Association]. Retrieved from http://www.anthropology-news.org/index.php/2017/01/13/when-social-media-are-the-news/
On a recent visit in 2015, I found the social media landscape dramatically changed, again. Facebook began actively steering reading practices through changes in 2013 to the News Feed algorithm, which determines content in the site’s central feature. That year, Facebook announced an effort to prioritize “high quality content,” defined as timely, relevant, and trustworthy—and not clickbait, memes, or other viral links. This policy, along with changing practices in sharing news content generally, meant that current events can unfold on and through social media.
How much of your news do you acquire through social media? Do you trust the information you acquire through social media? #FakeNews – have you explored this hashtag? What is your take on fake news?

++++++++++++++++++++++++++++++++++++++

Fournier, S., Quelch, J., & Rietveld, B. (2016, August 17). To Get More Out of Social Media, Think Like an Anthropologist. Retrieved March 17, 2017, from https://hbr.org/2016/08/to-get-more-out-of-social-media-think-like-an-anthropologist 
meaning management:
Anthropologists and the culturally sensitive analysts take complex bits of data and develop a higher-order sense of them. Information and meaning work at cross purposes. In managing meaning, context is everything while in managing information context is error and noise. When we give our social listening projects to information specialists, we lose an appreciation of context and with it the ability to extract the meanings that provide insight for our companies and brands.
Meaning management also involves a deeper appreciation of social listening as a component of a broader meaning-making system, rather than as, simply, a data source to be exploited.
How do you perceive meaning management? Do you see yourself being a professional with the ability to collect, analyze and interpret such data for your company?
++++++++++++++++++++++++++++++++
Kraemer, J. (n.d.). Comparing Worlds through Social Media | Platypus. Retrieved from http://blog.castac.org/2016/04/whywepost/
——————————————-

please use this form to provide your feedback; feel free to fill out only the relevant questions:
http://bit.ly/imseval

dodgy academic research

Death threats, ghost researchers and sock puppets: Inside the weird, wild world of dodgy academic research

https://www.abc.net.au/news/2022-01-31/on-the-trail-of-dodgy-academic-research/100788052

More than 46 of Shadi Riahi’s publications with Dr Nazari have now been retracted for plagiarism, duplication of data and forged authorship.

“People try and fake everything,” said Ivan Oransky, who has spent years researching scientific misconduct on his blog Retraction Watch.

Investigative journalist Brian Deer discovered that Dr Wakefield had multiple undisclosed conflicts of interest and that the study of just 12 children had been rigged.

But the damage had already been done.

Vaccination rates in the United Kingdom hit a low of 80 per cent in the early 2000s, leaving children unprotected from serious diseases. The repercussions are still being felt today, with Dr Wakefield being hailed as a hero by vaccine sceptics.

++++++++++++++
more on peer reviewed fake papers in this IMS blog
https://blog.stcloudstate.edu/ims?s=china+peer+review

media bias chart

https://adfontesmedia.com/
Media Bias Chart
++++++++++++++++
older version
https://blog.stcloudstate.edu/ims/2017/08/13/library-spot-fake-news/
fact checking
https://blog.stcloudstate.edu/ims/2017/03/28/fake-news-resources/

All Sides
https://www.allsides.com/unbiased-balanced-news

The Flip Side
https://www.theflipside.io/

AI tutors

Viewpoint: Can AI tutors help students learn?

The Kyowon Group, an education company in Korea, recently developed a life-like tutor using artificial intelligence for the very first time in the Korean education industry.

Kyowon created its AI tutors for two-way communication (teacher to student and student to teacher) by exchanging questions and answers about the lesson plan as if the two were having an interactive conversation. These AI tutors were able to provide real-time feedback on learning progress and were also able to identify, manage, and customize interactions with students through learning-habits management. In addition, to help motivate student learning, the AI tutors captured students’ emotions through analysis of their strengths and challenges.

While AI is being used in various industries, including education, the technology has come under scrutiny as many ask whether they can trust AI and its legitimacy.

Although there are some meaningful use cases for deepfakes, such as using the technology to bring historical figures of the past to life, deepfake technology is mostly exploited for harm. However, the good news is that groups are working to detect and minimize the damage caused by deepfake videos and other AI technology abuses, including credible standards organizations that are working to ensure trust in AI.

For education, AI tutors will only be adopted and accepted if they are built on innovative real-time conversational AI technology that includes accurate lip and mouth synchronization in addition to video synthesis technology. Using real models, not fake computer-generated ones, is critical as well.

trustworthiness of science

Is Scientific Communication Fit for Purpose?

One such problem is scientific misconduct and fraud, which, it is important to note, is perpetrated by scientists themselves. This category includes scientists who use fraudulent data, inappropriately manipulate images, and otherwise fake experimental results. Publishers have been investing increasingly to block bad contributions at the point of submission through editorial review, and more is almost certainly needed, likely a combination of automated and human review. Another form of misconduct is the failure to disclose conflicts of interest, which, notwithstanding efforts by publishers to strengthen disclosure guidelines, have continued to be disclosed “too little too late.”

Beyond individual misconduct, there are also organized and systematic challenges. We are seeing “organized fraud” and “industrialized cheating” to manipulate the scientific record to advance self-interests. These choreographed efforts include citation malpractice, paper mills, peer review rings, and guest editor frauds. And, even if it does not rise to the level of misconduct, we have seen the use of methods and practices that make substantial portions of at least some fields impossible to reproduce and therefore of dubious validity. Whether individual, organized, or systematic, all these are threats to scientific integrity.
