Understanding what sources to trust is a basic tenet of media literacy education.
Think about how this might play out in communities where the “liberal media” is viewed with disdain as an untrustworthy source of information…or in those where science is seen as contradicting the knowledge of religious people…or where degrees are viewed as a weapon of the elite to justify oppression of working people. Needless to say, not everyone agrees on what makes a trusted source.
Students are also encouraged to reflect on economic and political incentives that might bias reporting. Follow the money, they are told. Now watch what happens when they are given a list of major power players in the East Coast news media whose names are all clearly Jewish. Welcome to an opening for anti-Semitic ideology.
In the United States, we believe that worthy people lift themselves up by their bootstraps. This is our idea of freedom. To take away the power of individuals to control their own destiny is viewed as anti-American by so much of this country. You are your own master.
Children are indoctrinated into this cultural logic early, even as their parents restrict their mobility and limit their access to social situations. But when it comes to information, they are taught that they are the sole proprietors of knowledge. All they have to do is “do the research” for themselves and they will know better than anyone what is real.
Many marginalized groups are justifiably angry about the ways in which their stories have been dismissed by mainstream media for decades. It took five days for major news outlets to cover Ferguson. It took months and a lot of celebrities for journalists to start discussing the Dakota Access Pipeline. But feeling marginalized from news media isn’t just about people of color.
Keep in mind that anti-vaxxers aren’t arguing that vaccinations definitively cause autism. They are arguing that we don’t know. They are arguing that experts are forcing children to be vaccinated against their will, which sounds like oppression. What they want is choice — the choice to not vaccinate. And they want information about the risks of vaccination, which they feel are not being given to them. In essence, they are doing what we taught them to do: questioning information sources and raising doubts about the incentives of those who are pushing a single message. Doubt has become a tool.
Addressing so-called fake news is going to require a lot more than labeling. It’s going to require a cultural change about how we make sense of information, whom we trust, and how we understand our own role in grappling with information. Quick and easy solutions may make the controversy go away, but they won’t address the underlying problems.
boyd, danah. (2014). It’s Complicated: The Social Lives of Networked Teens (1st ed.). New Haven: Yale University Press.
p. 8 networked publics are publics that are restructured by networked technologies. they are both space and imagined community.
p. 11 affordances: persistence, visibility, spreadability, searchability.
p. technological determinism both utopian and dystopian
p. 30 adults misinterpret teens’ online self-expression.
p. 31 taken out of context. Joshua Meyrowitz about Stokely Carmichael.
p. 43 as teens have embraced a plethora of social environments and helped co-create the norms that underpin them, a wide range of practices has emerged. teens have grown sophisticated with how they manage contexts and present themselves in order to be read by their intended audience.
p. 54 privacy. p. 59 Privacy is a complex concept without a clear definition. Supreme Court Justice Brandeis: the right to be let alone, but also ‘measure of the access others have to you through information, attention, and physical proximity.’
control over access and visibility
p. 65 social steganography. hiding messages in plain sight
p. 69 subtweeting. encoding content
p. 70 living with surveillance . Foucault Discipline and Punish
p. 77 addiction. what makes teens obsessed w/ social media.
p. 81 Ivan Goldberg coined the term internet addiction disorder. jokingly
p. 89 the decision to introduce programmed activities and limit unstructured time is not unwarranted; research has shown a correlation between boredom and deviance.
My interview with Myra, a middle-class white fifteen-year-old from Iowa, turned funny and sad when “lack of time” became a verbal trick in response to every question. From learning Czech to track, from orchestra to work in a nursery, she told me that her mother organized “98%” of her daily routine. Myra did not like all of these activities, but her mother thought they were important.
Myra noted that her mother meant well, but she was exhausted and felt socially disconnected because she did not have time to connect with friends outside of class.
p. 100 danger
are sexual predators lurking everywhere?
p. 128 bullying. is social media amplifying meanness and cruelty?
p. 131 defining bullying in a digital era. p. 131 Dan Olweus in the ’70s narrowed bullying to three components: aggression, repetition, and imbalance of power. p. 152 SM has not radically altered the dynamics of bullying, but it has made these dynamics more visible to more people. we must use this visibility not to justify increased punishment, but to help youth who are actually crying out for attention.
p. 153 inequality. can SM resolve social divisions?
p. 176 literacy. are today’s youth digital natives? p. 178 Barlow and Rushkoff p. 179 Prensky. p. 180 youth need new literacies. p. 181 youth must become media literate. when they engage with media–either as consumers or producers–they need to have the skills to ask questions about the construction and dissemination of particular media artifacts. what biases are embedded in the artifact? how did the creator intend for an audience to interpret the artifact, and what are the consequences of that interpretation?
p. 183 the politics of algorithms (see also these IMS blog entries: http://blog.stcloudstate.edu/ims?s=algorithms). Wikipedia and Google are fundamentally different sites. p. 186 Eli Pariser, The Filter Bubble: the personalization algorithms produce social divisions that undermine any ability to create an informed public. As researchers at Harvard’s Berkman Center have shown, search engines like Google shape the quality of information experienced by youth.
p. 192 digital inequality. p. 194 (bottom) 195 Eszter Hargittai: there are significant differences in media literacy and technical skills even within age cohorts. teens’ technological skills are strongly correlated with socio-economic status. Hargittai argues that many youth, far from being digital natives, are quite digitally naive.
p. 195 Dmitry Epstein: when society frames the digital divide as a problem of access, we see government and industry as the parties responsible for addressing the issue. if we frame the digital divide as a skills issue, we place the onus of learning how to manage on individuals and families.
p. 196 beyond digital natives
Palfrey, J., & Gasser, U. (2008). Born Digital: Understanding the First Generation of Digital Natives (1st ed.). New York: Basic Books.
John Palfrey, Urs Gasser: Born Digital
“Digital Natives share a common global culture that is defined not by age, strictly, but by certain attributes and experiences related to how they interact with information technologies, information itself, one another, and other people and institutions. Those who were not ‘born digital’ can be just as connected, if not more so, than their younger counterparts. And not everyone born since, say, 1982, happens to be a digital native.” (see also http://blog.stcloudstate.edu/ims/2018/04/15/no-millennials-gen-z-gen-x/)
p. 197. digital native rhetoric is worse than inaccurate: it is dangerous
many of the media literacy skills needed to be digitally savvy require a level of engagement that goes far beyond what the average teen picks up hanging out with friends on FB or Twitter. Technical skills, such as the ability to build online spaces, require active cultivation, as does understanding why some search queries return some content before others. Young people benefit from learning how to build their own systems versus simply using social media platforms. teens’ social status and position alone do not determine how fluent or informed they are vis-à-vis technology.
Artificial intelligence (AI) and machine learning are no longer fantastical prospects seen only in science fiction. Products like Amazon Echo and Siri have brought AI into many homes.
Kelly Calhoun Williams, an education analyst for the technology research firm Gartner Inc., cautions there is a clear gap between the promise of AI and the reality of AI.
Artificial intelligence is a broad term used to describe any technology that emulates human intelligence, such as by understanding complex information, drawing its own conclusions and engaging in natural dialog with people.
Machine learning is a subset of AI in which the software can learn or adapt like a human can. Essentially, it analyzes huge amounts of data and looks for patterns in order to classify information or make predictions. The addition of a feedback loop allows the software to “learn” as it goes by modifying its approach based on whether the conclusions it draws are right or wrong.
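The feedback loop described above can be sketched in a few lines. This is a minimal illustration, assuming a toy model that predicts whether a quiz score counts as “mastery”; the threshold-nudging rule is purely illustrative and is not any vendor’s actual algorithm.

```python
# Toy feedback loop: predict, check against the true label, adjust.

def predict(score, threshold):
    """Classify a score as mastery (True) or not (False)."""
    return score >= threshold

def update(threshold, score, correct_label, step=1.0):
    """Nudge the threshold whenever the prediction was wrong."""
    if predict(score, threshold) != correct_label:
        # Move toward the mislabeled example so the same mistake fades.
        threshold += step if not correct_label else -step
    return threshold

threshold = 70.0
# (score, teacher-confirmed mastery) pairs stand in for the feedback signal.
feedback = [(68, True), (69, True), (75, True), (60, False)]
for score, label in feedback:
    threshold = update(threshold, score, label)
```

After seeing the feedback, the threshold has drifted toward the labels it was shown; real systems do the same thing with far more data and far richer models.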
AI can process far more information than a human can, and it can perform tasks much faster and with more accuracy. Some curriculum software developers have begun harnessing these capabilities to create programs that can adapt to each student’s unique circumstances.
GoGuardian, a Los Angeles company, uses machine learning technology to improve the accuracy of its cloud-based Internet filtering and monitoring software for Chromebooks. (My note: that smells of Big Brother.) Instead of blocking students’ access to questionable material based on a website’s address or domain name, GoGuardian’s software uses AI to analyze the actual content of a page in real time to determine whether it’s appropriate for students. (my note: privacy)
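The difference between address-based blocking and content-based filtering can be sketched as follows. The domain list, word list, and scoring rule here are illustrative assumptions, not GoGuardian’s actual model, which would use a trained classifier rather than a hand-picked word list.

```python
# Two filtering styles: decide from the address, or decide from the page text.

BLOCKED_DOMAINS = {"badsite.example"}   # hypothetical blocklist
FLAG_WORDS = {"gambling", "violence"}   # stands in for a trained classifier

def blocked_by_domain(url):
    """Old-style filtering: decide from the web address alone."""
    domain = url.split("/")[2]
    return domain in BLOCKED_DOMAINS

def blocked_by_content(page_text, limit=2):
    """Content-aware filtering: score the actual text of the page."""
    words = page_text.lower().split()
    hits = sum(w in FLAG_WORDS for w in words)
    return hits >= limit
```

The trade-off is visible even in this sketch: the second function must read every page a student loads, which is exactly why content-aware filtering raises the privacy questions noted above.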
AI also raises serious privacy concerns. It requires an increased focus not only on data quality and accuracy, but also on the responsible stewardship of this information. “School leaders need to get ready for AI from a policy standpoint,” Calhoun Williams said. For instance: What steps will administrators take to secure student data and ensure the privacy of this information?
Under the Children’s Internet Protection Act (CIPA), any US school that receives federal funding is required to have an internet-safety policy. As school-issued tablets and Chromebook laptops become more commonplace, schools must install technological guardrails to keep their students safe. For some, this simply means blocking inappropriate websites. Others, however, have turned to software companies like Gaggle, Securly, and GoGuardian to surface potentially worrisome communications to school administrators.
Over 50% of teachers say their schools are one-to-one (the industry term for assigning every student a device of their own), according to a 2017 survey from Freckle Education.
But even in an age of student suicides and school shootings, when do security precautions start to infringe on students’ freedoms?
When the Gaggle algorithm surfaces a word or phrase that may be of concern—like a mention of drugs or signs of cyberbullying—the “incident” gets sent to human reviewers before being passed on to the school. Using AI, the software is able to process thousands of student tweets, posts, and status updates to look for signs of harm.
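The surface-then-review workflow described above might look like the sketch below. The phrase list and the Incident record are hypothetical stand-ins; this is illustrative, not Gaggle’s actual algorithm.

```python
# Keyword surfacing with a human-review step before anything reaches the school.
from dataclasses import dataclass

WATCH_PHRASES = {"hurt myself", "buy drugs"}  # hypothetical watch list

@dataclass
class Incident:
    post: str
    matched: str

def scan(posts):
    """Flag posts containing a watch phrase; nothing reaches the school yet."""
    queue = []
    for post in posts:
        for phrase in WATCH_PHRASES:
            if phrase in post.lower():
                queue.append(Incident(post, phrase))
    return queue

def human_review(queue, decisions):
    """Only incidents a human reviewer confirms are passed on to the school.

    decisions maps an incident's index in the queue to the reviewer's call.
    """
    return [inc for i, inc in enumerate(queue) if decisions.get(i, False)]

posts = ["Great game tonight!", "i want to hurt myself"]
queue = scan(posts)                          # one Incident flagged
confirmed = human_review(queue, {0: True})   # reviewer confirms it
```

The human-review stage is the design choice worth noticing: the algorithm only narrows thousands of posts down to a queue, while a person makes the call that actually affects a student.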
SMPs help normalize surveillance from a young age. In the wake of the Cambridge Analytica scandal at Facebook and other recent data breaches from companies like Equifax, we have the opportunity to teach kids the importance of protecting their online data.
In an age of increased school violence, bullying, and depression, schools have an obligation to protect their students. But the protection of kids’ personal information is also a matter of their safety.
#FakeNews is a very timely and controversial issue. In 2-3 min choose your best source on this issue. 1. Mind the prevalence of resources in the 21st century. 2. Mind the necessity to evaluate a) the veracity of your sources b) the quality of your sources (the fact that they are “true” does not mean that they are the best). Be prepared to name your source and defend its quality.
How do you determine your sources? How do you decide the reliability of your sources? Are you sure you can distinguish “good” from “bad?”
Compare this entry https://en.wikipedia.org/wiki/List_of_fake_news_websites
to this entry: https://docs.google.com/document/d/10eA5-mCZLSS4MQY5QGb5ewC3VAL6pLkT53V_81ZyitM/preview to understand the scope
what is social media (examples)? why is it called SM? what makes it so popular?
use SM tools for your research and education:
– Determining your topic. How to?
Digg http://digg.com/, Reddit https://www.reddit.com/ , Quora https://www.quora.com
Facebook, Twitter – hashtags (class assignment 2-3 min to search)
YouTube and Slideshare (class assignment 2-3 min to search)
Flickr, Instagram, Pinterest for visual aids (like YouTube they are media repositories)
Apps like WhatsApp, Facebook, Snapchat, Instagram, Twitter, LinkedIn, Viber
Felix Krause described in 2017 that when a user grants an app access to their camera and microphone, the app could do the following:
Access both the front and the back camera.
Record you at any time the app is in the foreground.
Take pictures and videos without telling you.
Upload the pictures and videos without telling you.
Upload the pictures/videos it takes immediately.
Run real-time face recognition to detect facial features or expressions.
Livestream the camera on to the internet.
Detect if the user is on their phone alone, or watching together with a second person.
Upload random frames of the video stream to your web service and run a proper face recognition software which can find existing photos of you on the internet and create a 3D model based on your face.
For instance, here’s a Find my Phone application which a documentary maker installed on a phone, then let someone steal it. After the person stole it, the original owner spied on every moment of the thief’s life through the phone’s camera and microphone.
Edward Snowden revealed an NSA program called Optic Nerve. The operation was a bulk surveillance program under which they captured webcam images every five minutes from Yahoo users’ video chats and then stored them for future use. It is estimated that between 3% and 11% of the images captured contained “undesirable nudity”.
Hackers can also gain access to your device with extraordinary ease via apps, PDF files, multimedia messages and even emojis.
An application called Metasploit on the ethical hacking platform Kali uses an Adobe Reader 9 (which over 60% of users still use) exploit to open a listener (rootkit) on the user’s computer. You alter the PDF with the program, send the user the malicious file, they open it, and hey presto – you have total control over their device remotely.
Once a user opens this PDF file, the hacker can then:
Install whatever software/app they like on the user’s device.
Use a keylogger to grab all of their passwords.
Steal all documents from the device.
Take pictures and stream videos from their camera.
Capture past or live audio from the microphone.
Upload incriminating images/documents to their PC, and notify the police.
And, if it’s not enough that your phone is tracking you – surveillance cameras in shops and streets are tracking you, too.
You might even be on this website, InSeCam, which allows ordinary people online to watch surveillance cameras free of charge. It even allows you to search cameras by location, city, time zone, device manufacturer, and specify whether you want to see a kitchen, bar, restaurant or bedroom.
cyber security experts say that weaving your personal and professional lives together via a work laptop is risky business — for you and the company. Software technology company Check Point conducted a survey of over 700 IT professionals which revealed that nearly two-thirds of IT pros believed that recent high-profile breaches were caused by employee carelessness.
DON’T: Save personal passwords in your work device keychain.
DON’T: Make off-color jokes on messaging software.
DON’T: Access free public wi-fi while working on sensitive material.
DON’T: Allow friends or non-IT department colleagues to remotely access your work computer.
DON’T: Store personal data.
DON’T: Work on your side hustle while at the office.
Section 702 is the provision that authorizes the spy agencies to monitor some Americans’ communications without a warrant.
The spy agencies are supposed to “minimize” details about people swept up in what they call such “incidental collection,” and they say their practices are regularly vetted by Congress and the Foreign Intelligence Surveillance Court.
The EFF report shows that state and federal laws, as well as industry self-regulation, have failed to keep up with a growing education technology industry.
One-third of all K–12 students in the United States use school-issued devices running software and apps that collect far more information on kids than is necessary.
Resource-poor school districts can receive these tools at deeply discounted prices or for free, as tech companies seek a slice of the $8 billion ed tech industry. But there’s a real, devastating cost — the tracking, cataloging and exploitation of data about children as young as 5 years old.
Our report shows that the surveillance culture begins in grade school, which threatens to normalize the next generation to a digital world in which users hand over data without question in return for free services.
EFF surveyed more than 1,000 stakeholders across the country, including students, parents, teachers and school administrators, and reviewed 152 ed tech privacy policies.
“Spying on Students” provides comprehensive recommendations for parents, teachers, school administrators and tech companies to improve the protection of student privacy. Asking the right questions, negotiating for contracts that limit or ban data collection, offering families the right to opt out, and making digital literacy and privacy part of the school curriculum are just a few of the 70-plus recommendations for protecting student privacy contained in the report.
The 188-page “Challenging Government Hacking In Criminal Cases” report, released by the American Civil Liberties Union on March 30, addresses new amendments to Rule 41 of the Federal Rules of Criminal Procedure, which took effect last December.
Under the changes to criminal procedure rules, feds can remotely search computers in multiple jurisdictions with a single warrant. The rules are touted by law enforcement agencies as a way to streamline 100-year-old rules of criminal procedure.