Tom Dickinson describes four different types of distributed ‘fake news’.
‘Fake news’ is lazy language. Be specific. Do you mean: A) Propaganda B) Disinformation C) Conspiracy theory D) Clickbait
The RAND Corporation, a US think-tank with strong ties to the military-industrial complex, recently looked at the influence of the Russian Propaganda Model and how best to deal with it.
Three factors have been shown to increase the (limited) effectiveness of retractions and refutations: (1) warnings at the time of initial exposure to misinformation, (2) repetition of the retraction or refutation, and (3) corrections that provide an alternative story to help fill the resulting gap in understanding when false ‘facts’ are removed.
Critical thinking requires us to constantly question assumptions, especially our own. To develop these skills, questioning must be encouraged. This runs counter to most schooling and training practices. When do students or employees get to question underlying assumptions of their institutions? If they cannot do this, how can we expect them to challenge various and pervasive types of ‘fake news’?
Online Course | Designing a Collaborative Instructional Technology Support Model
Part 1: March 7, 2018 | 1:00–2:30 p.m. ET
Part 2: March 14, 2018 | 1:00–2:30 p.m. ET
Part 3: March 21, 2018 | 1:00–2:30 p.m. ET
Faculty need a variety of instructional technology support services: instructional design, content development, technology, training, and assessment, to name a few. They don’t want to go to one place for help, find out they’re in the wrong place, and be sent somewhere else, digitally or physically. Staff don’t want to provide help in silos or duplicate what other units are doing.
So, how can academic service providers collaborate to offer the right instructional technology support services, in the right place, at the right time, in the right way? In this course, instructional technologists, instructional designers, librarians, and instructional technology staff will learn to use a tool called the Service Center Canvas that does just that.
Learning Objectives:
During this course, participants will:
Explore the factors that influence how instructional technology support services are offered in higher education
Answer critical questions about how your instructional technology support services should be delivered relative to broader trends and institutional goals
Experiment with ways to prototype new services and/or new ways of delivering them
Identify potential implementation obstacles and ways to address them
NOTE: Participants will be asked to complete assignments between the course segments that support the learning objectives stated above, and will receive feedback and constructive critique from course facilitators on how to improve and shape their work.
Course Facilitators
Elliot Felix, Founder and CEO, brightspot strategy
Felix founded and leads brightspot, a strategy consultancy that reimagines places, rethinks services, and redesigns organizations on university campuses so that people are better connected to a purpose, information, and each other. Felix is an accomplished strategist, facilitator, and sense-maker who has helped transform over 70 colleges and universities.
Adam Griff is a director at brightspot. He helps universities rethink their space, reinvent their service offerings, and redesign their organization to improve the experiences of their faculty, students, and staff, connecting people and processes to create simple and intuitive answers to complex questions. He has led projects with a wide range of higher education institutions including University of Wisconsin–Madison, University of North Carolina at Chapel Hill, and University of California, Berkeley.
Although the margin is narrower for millennials and Gen Z, the numbers in the Wainhouse study show that the personal touch hasn’t been replaced in workplace learning.
What is digital literacy? Do you know how you can foster digital literacy through formal and informal learning opportunities for your library staff and users?
Supporting digital literacy remains an important part of library staff members’ work, but we sometimes struggle to agree on a simple, meaningful definition of the term. In this four-week eCourse, training/learning specialist Paul Signorelli begins by exploring a variety of definitions, focusing on work by a few leading proponents of the need to foster digital literacy among people of all ages and backgrounds. He then surveys a range of digital-literacy resources, including case studies of creative approaches to digital-literacy learning opportunities for library staff and users, and examines digital tools that help encourage further understanding of the topic.
Combine the superfast calculational capacities of Big Compute with the oceans of specific personal information comprising Big Data — and the fertile ground for computational propaganda emerges. That’s how the small AI programs called bots can be unleashed into cyberspace to target and deliver misinformation exactly to the people who will be most vulnerable to it. These messages can be refined over and over again based on how well they perform (again in terms of clicks, likes and so on). Worst of all, all this can be done semiautonomously, allowing the targeted propaganda (like fake news stories or faked images) to spread like viruses through communities most vulnerable to their misinformation.
According to Bolsover and Howard, viewing computational propaganda only from a technical perspective would be a grave mistake. As they explain, seeing it just in terms of variables and algorithms “plays into the hands of those who create it, the platforms that serve it, and the firms that profit from it.”
Computational propaganda is a new thing. People just invented it. And they did so by realizing possibilities emerging from the intersection of new technologies (Big Compute, Big Data) and new behaviors those technologies allowed (social media). But the emphasis on behavior can’t be lost.
People are not machines. We do things for a whole lot of reasons, including emotions of loss, anger, fear and longing. To combat computational propaganda’s potentially dangerous effects on democracy in a digital age, we will need to focus on both its how and its why.
The UN’s children’s agency, Unicef, has launched a futuristic pilot project to utilise the cryptocurrency Ethereum to raise money for Syrian children.
The agency hopes to use the “blockchain” technology associated with the cryptocurrency – the world’s second largest after the controversial Bitcoin – to revolutionise how aid organisations raise money and to increase transparency in their financial transactions.
Blockchain – which emerged as one of the underpinnings of Bitcoin – is a shared record of transactions maintained by a network of computers. It has become a key technology because of its ability to record and keep track of assets or transactions with no need for middlemen.
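The hash-chain idea described above can be illustrated with a short, self-contained sketch. This is a toy model, not the actual Bitcoin or Ethereum protocol; the `Ledger` class, its field names, and the sample transactions are invented for illustration:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    """A toy shared ledger: every block commits to the previous block's hash,
    so altering any recorded transaction breaks each later link in the chain."""

    def __init__(self):
        # A fixed genesis block anchors the chain.
        self.chain = [{"index": 0, "transactions": [], "prev_hash": "0"}]

    def add_block(self, transactions):
        self.chain.append({
            "index": len(self.chain),
            "transactions": transactions,
            "prev_hash": block_hash(self.chain[-1]),
        })

    def is_valid(self):
        # Re-derive every link; an edited block invalidates all blocks after it.
        return all(
            self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )

ledger = Ledger()
ledger.add_block([{"from": "donor", "to": "aid_fund", "amount": 50}])
ledger.add_block([{"from": "aid_fund", "to": "family_1", "amount": 20}])
print(ledger.is_valid())   # the intact chain verifies
ledger.chain[1]["transactions"][0]["amount"] = 5000  # tamper with history
print(ledger.is_valid())   # the edit is detected by a later link
```

Because every participant can recompute the hashes independently, the record is verifiable without a trusted middleman, which is the property aid organisations are drawing on here. Real systems add consensus and replication across many computers, which this sketch omits.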
The World Food Programme (WFP) has used Ethereum to deliver $1.4m in food vouchers to around 10,000 Syrian refugees in Jordan, via iris recognition scanners in camp supermarkets.
Just as in the 1970s, when antidrug campaigns were scoffed at by the very people they were targeting, anti-bullying campaigns are also losing their effectiveness. I got a taste of this firsthand last year when I spoke about sexting, online safety and cyberbullying at an all-school assembly. When a student blurted out an obscenity during the sexting portion, the students went wild and didn’t listen to a thing I said. I was frustrated and discouraged. Later, I offered an iPad mini to the student who produced the best video and poster. Even that got little response.
The fact is, anti-bullying clichés have become a shut-off switch. What we really need to be doing is giving students actual skills to prevent bullying. To get that conversation going, I pose this question to students: “Will you accept the identity that others give you?”
Moscow appears to have moved quickly into the market of cryptocurrencies and cryptography. Gref earlier warned that technologies such as blockchain and currencies such as Bitcoin should not be banned or hindered in their operations.
Russia’s Finance Ministry legalised cryptocurrency trading on January 25 with the Digital Assets Regulation Bill, despite vocal objections from the country’s Central Bank. The bill defined cryptocurrencies and tokens as digital financial assets that are not legal tender in Russia. The Central Bank, however, argued that digital currency trading rules should only be applied to tokens that would attract financial investments.