Roskomnadzor has also exerted pressure on Google to remove certain sites on Russian searches.
Director of National Intelligence Dan Coats told Congress last month that Russia, as well as other foreign actors, will increasingly use cyber operations to “threaten both minds and machines in an expanding number of ways—to steal information, to influence our citizens, or to disrupt critical infrastructure.”
TAILS is an acronym for “The Amnesic Incognito Live System.”
TAILS is a highly secure operating system (and a host of cool applications) designed to be booted from a DVD or USB thumb drive. This not only makes TAILS easy to transport, but also ensures that TAILS can be booted and instantly useful on nearly any PC, Mac, or Chromebook. TAILS is built on Linux, a name you might recognize because it’s a popular, free, and open-source operating system that’s been available since 1991. TAILS, in particular, runs on a variant of Linux known as “Debian,” which became available in 1996.
Third and most importantly, when set up correctly, TAILS helps ensure that all of your communications — email, web browsing, chat, and more — are encrypted, made anonymous, and then routed in such a way that it’s extremely difficult to detect or trace them.
Whonix’s unique approach to offering such well-regarded security is the creative use of two virtual machines (or VMs) running in tandem on one host computer. One of these VMs is known as the Gateway, while the other is known as the Workstation.
Compared to TAILS, Whonix provides only a few free, open-source applications, and those need to be set up fairly extensively. The list includes:
The Tor Browser, for safe web browsing
Firefox, for less secure web browsing
Icedove, for emailing, secured by the Enigmail extension to encrypt and authenticate emails using a well-known and secure protocol called “OpenPGP”
HexChat, for internet chats
VLC, to open and view virtually any video file format
Whether the NYC police angle is true or not (it’s being hotly disputed), Facebook and Google are thinking along lines that follow the whims of the Chinese Government.
SenseTime and Megvii won’t just be worth $5 billion; they will be worth many times that in the future. This is because facial-recognition data harvesting of everything is the future of consumerism and capitalism, and in some places, the central tenet of social order (think Asia).
China has already ‘won’ the trade war, because it’s winning the race to innovation. America doesn’t regulate Amazon, Microsoft, Google, or Facebook properly; that stunts innovation and ethics in technology, and the West is now forced to copy China just to keep up.
Violent deaths in schools have stayed relatively constant over the last 30 years, according to data from the National Center for Education Statistics. But then there’s the emotive reality, which is that every time another event like Sandy Hook or Parkland occurs, many educators and students feel they are in peril when they go to school.
RealNetworks is a Seattle-based software company that was popular in the 1990s for its audio and video streaming services but has since expanded to offer other tools, including SAFR (Secure, Accurate Facial Recognition), its AI-supported facial recognition software.
After installing new security cameras, purchasing a few Apple devices and upgrading the school’s Wi-Fi, St. Therese was looking at a $24,000 technology tab.
The software is programmed to allow authorized users into the building with a smile.
“Facial recognition isn’t a panacea. It is just a tool,” says Collins, who focuses on education privacy issues.
Another part of the problem with tools like SAFR is that they provide a false sense of security.
Microsoft President and Chief Legal Counsel Brad Smith explained that it’s not the first time the search engine has been blocked. “It happens periodically,” he said in an interview with Fox Business News from Davos, Switzerland, on Thursday.
“You know, we operate in China pursuant to some global principles that’s called the Global Network Initiative in terms of how we manage censorship demands and the like,” he said.
Although Bing enjoyed only about 2 percent of China’s search engine market, its banishment was significant in a country known for controlling electronic access to information. With Bing blocked, China’s citizens had even fewer options for finding information on the Internet.
Artificial intelligence could erase many practical advantages of democracy, and erode the ideals of liberty and equality. It will further concentrate power among a small elite if we don’t take steps to stop it.
Ordinary people may not understand artificial intelligence and biotechnology in any detail, but they can sense that the future is passing them by. In 1938 the common man’s condition in the Soviet Union, Germany, or the United States may have been grim, but he was constantly told that he was the most important thing in the world, and that he was the future (provided, of course, that he was an “ordinary man,” rather than, say, a Jew or a woman).
In 2018 the common person feels increasingly irrelevant. Lots of mysterious terms are bandied about excitedly in TED Talks, at government think tanks, and at high-tech conferences—globalization, blockchain, genetic engineering, AI, machine learning—and common people, both men and women, may well suspect that none of these terms is about them.
Fears of machines pushing people out of the job market are, of course, nothing new, and in the past such fears proved to be unfounded. But artificial intelligence is different from the old machines. In the past, machines competed with humans mainly in manual skills. Now they are beginning to compete with us in cognitive skills.
Israel is a leader in the field of surveillance technology, and has created in the occupied West Bank a working prototype for a total-surveillance regime. Already today whenever Palestinians make a phone call, post something on Facebook, or travel from one city to another, they are likely to be monitored by Israeli microphones, cameras, drones, or spy software. Algorithms analyze the gathered data, helping the Israeli security forces pinpoint and neutralize what they consider to be potential threats.
The conflict between democracy and dictatorship is actually a conflict between two different data-processing systems. AI may swing the advantage toward the latter.
As we rely more on Google for answers, our ability to locate information independently diminishes. Already today, “truth” is defined by the top results of a Google search. This process has likewise affected our physical abilities, such as navigating space.
So what should we do?
For starters, we need to place a much higher priority on understanding how the human mind works—particularly how our own wisdom and compassion can be cultivated.
Between the creation of a social rating system and street cameras with facial recognition capabilities, technology reports coming out of China have raised serious concerns for privacy advocates. These concerns are only heightened as Chinese investors turn their attention to the United States education technology space, acquiring companies with millions of public school users.
A particularly notable deal this year centers on Edmodo, a cross between a social networking platform and a learning management system for schools that boasts having upwards of 90 million users. NetDragon, a Chinese gaming company that is building a significant education division, bought Edmodo for a combination of cash and equity valued at $137.5 million earlier this month.
Edmodo began shifting to an advertising model last year, after years of struggling to generate revenue. This has left critics wondering why the Chinese firm chose to acquire Edmodo at such a price; some have gone so far as to call the move a data grab.
As data becomes a tool that governments such as Russia and China could use to influence voting systems or induce citizens into espionage, more legislators are turning their attention to the acquisitions of early-stage technology startups.
NetDragon officials, however, say they have no interest in these types of activities. Their main goal in acquiring United States edtech companies lies in building profitability, says Pep So, NetDragon’s Director of Corporate Development.
In 2015, the firm acquired Promethean, an education technology company that creates interactive displays for schools. NetDragon executives say that the Edmodo acquisition rounds out their education product portfolio—meaning the company will have tools supporting multiple aspects of learning, including: preparation, instructional delivery, homework, assignment grading, communication among parents, students, and teachers, and a content marketplace.
NetDragon’s monetization plan for Edmodo focuses on building out content that gets sold via its platform. Similar to tools like TeachersPayTeachers, So hopes to see users putting up content on the platform’s marketplace, some free and others for a fee (including some virtual reality content), so that the community can buy, sell and review available educational tools.
As far as data privacy is concerned, So notes that NetDragon is still learning what it can and cannot do. He noted that the company will comply with the Children’s Online Privacy Protection Act (COPPA), a federal regulation created to protect the privacy of children online, but says that the rules and regulations surrounding the law are confusing for all actors involved.
Historically, Chinese companies have faced trust and branding issues when moving into the United States market, and the reverse is also true for U.S. companies seeking to expand overseas. Companies have also struggled to learn the rules, regulations and operational procedures in place in other countries.
Understanding what sources to trust is a basic tenet of media literacy education.
Think about how this might play out in communities where the “liberal media” is viewed with disdain as an untrustworthy source of information…or in those where science is seen as contradicting the knowledge of religious people…or where degrees are viewed as a weapon of the elite to justify oppression of working people. Needless to say, not everyone agrees on what makes a trusted source.
Students are also encouraged to reflect on economic and political incentives that might bias reporting. Follow the money, they are told. Now watch what happens when they are given a list of names of major power players in the East Coast news media whose names are all clearly Jewish. Welcome to an opening for anti-Semitic ideology.
In the United States, we believe that worthy people lift themselves up by their bootstraps. This is our idea of freedom. To take away the power of individuals to control their own destiny is viewed as anti-American by so much of this country. You are your own master.
Children are indoctrinated into this cultural logic early, even as their parents restrict their mobility and limit their access to social situations. But when it comes to information, they are taught that they are the sole proprietors of knowledge. All they have to do is “do the research” for themselves and they will know better than anyone what is real.
Many marginalized groups are justifiably angry about the ways in which their stories have been dismissed by mainstream media for decades. It took five days for major news outlets to cover Ferguson. It took months and a lot of celebrities for journalists to start discussing the Dakota Access Pipeline. But feeling marginalized from news media isn’t just about people of color.
Keep in mind that anti-vaxxers aren’t arguing that vaccinations definitively cause autism. They are arguing that we don’t know. They are arguing that experts are forcing children to be vaccinated against their will, which sounds like oppression. What they want is choice — the choice to not vaccinate. And they want information about the risks of vaccination, which they feel are not being given to them. In essence, they are doing what we taught them to do: questioning information sources and raising doubts about the incentives of those who are pushing a single message. Doubt has become a tool.
Addressing so-called fake news is going to require a lot more than labeling. It’s going to require a cultural change about how we make sense of information, whom we trust, and how we understand our own role in grappling with information. Quick and easy solutions may make the controversy go away, but they won’t address the underlying problems.
boyd, danah. (2014). It’s Complicated: The Social Lives of Networked Teens (1st ed.). New Haven: Yale University Press.
p. 8 networked publics are publics that are restructured by networked technologies. they are both space and imagined community.
p. 11 affordances: persistence, visibility, spreadability, searchability.
p. technological determinism both utopian and dystopian
p. 30 adults misinterpret teens online self-expression.
p. 31 taken out of context. Joshua Meyrowitz about Stokely Carmichael.
p. 43 as teens have embraced a plethora of social environment and helped co-create the norms that underpin them, a wide range of practices has emerged. teens have grown sophisticated with how they manage contexts and present themselves in order to be read by their intended audience.
p. 54 privacy. p. 59 Privacy is a complex concept without a clear definition. Supreme Court Justice Brandeis: the right to be let alone, but also ‘a measure of the access others have to you through information, attention, and physical proximity.’
control over access and visibility
p. 65 social steganography. hiding messages in plain sight
p. 69 subtweeting. encoding content
p. 70 living with surveillance. Foucault, Discipline and Punish
p. 77 addiction. what makes teens obsessed w/ social media.
p. 81 Ivan Goldberg coined the term internet addiction disorder. jokingly
p. 89 the decision to introduce programmed activities and limit unstructured time is not unwarranted; research has shown a correlation between boredom and deviance.
My interview with Myra, a middle-class white fifteen-year-old from Iowa, turned funny and sad when “lack of time” became a verbal trick in response to every question. From learning Czech to track, from orchestra to work in a nursery, she told me that her mother organized “98%” of her daily routine. Myra did not like all of these activities, but her mother thought they were important.
Myra noted that her mother meant well, but she was exhausted and felt socially disconnected because she did not have time to connect with friends outside of class.
p. 100 danger
are sexual predators lurking everywhere
p. 128 bullying. is social media amplifying meanness and cruelty.
p. 131 defining bullying in a digital era. Dan Olweus in the 70s narrowed bullying to three components: aggression, repetition, and imbalance of power. p. 152 SM has not radically altered the dynamics of bullying, but it has made these dynamics more visible to more people. we must use this visibility not to justify increased punishment, but to help youth who are actually crying out for attention.
p. 153 inequality. can SM resolve social divisions?
p. 176 literacy. are today’s youth digital natives? p. 178 Barlow and Rushkoff p. 179 Prensky. p. 180 youth need new literacies. p. 181 youth must become media literate. when they engage with media–either as consumers or producers–they need to have the skills to ask questions about the construction and dissemination of particular media artifacts. what biases are embedded in the artifact? how did the creator intend for an audience to interpret the artifact, and what are the consequences of that interpretation.
p. 183 the politics of algorithms (see also these IMS blog entries: http://blog.stcloudstate.edu/ims?s=algorithms). Wikipedia and Google are fundamentally different sites. p. 186 Eli Pariser, The Filter Bubble: the personalization algorithms produce social divisions that undermine any ability to create an informed public. researchers at Harvard’s Berkman Center have shown that search engines like Google shape the quality of information experienced by youth.
p. 192 digital inequality. p. 194 (bottom)-195 Eszter Hargittai: there are significant differences in media literacy and technical skills even within age cohorts. teens’ technological skills are strongly correlated with socio-economic status. Hargittai argues that many youth, far from being digital natives, are quite digitally naive.
p. 195 Dmitry Epstein: when society frames the digital divide as a problem of access, we see government and industry as the responsible parties for addressing the issue. if we frame the digital divide as a skills issue, we place the onus of learning how to manage on individuals and families.
p. 196 beyond digital natives
Palfrey, J., & Gasser, U. (2008). Born Digital: Understanding the First Generation of Digital Natives (1 edition). New York: Basic Books.
John Palfrey, Urs Gasser: Born Digital
“Digital Natives share a common global culture that is defined not by age, strictly, but by certain attributes and experience related to how they interact with information technologies, information itself, one another, and other people and institutions. Those who were not ‘born digital’ can be just as connected, if not more so, than their younger counterparts. And not everyone born since, say, 1982, happens to be a digital native.” (see also http://blog.stcloudstate.edu/ims/2018/04/15/no-millennials-gen-z-gen-x/)
p. 197. digital native rhetoric is worse than inaccurate: it is dangerous
many of the media literacy skills needed to be digitally savvy require a level of engagement that goes far beyond what the average teen picks up hanging out with friends on FB or Twitter. technical skills, such as the ability to build online spaces, require active cultivation: why some search queries return some content before others, and why building your own systems differs from simply using a social media platform. teens’ social status and position alone do not determine how fluent or informed they are vis-à-vis technology.
Artificial intelligence (AI) and machine learning are no longer fantastical prospects seen only in science fiction. Products like Amazon Echo and Siri have brought AI into many homes.
Kelly Calhoun Williams, an education analyst for the technology research firm Gartner Inc., cautions there is a clear gap between the promise of AI and the reality of AI.
Artificial intelligence is a broad term used to describe any technology that emulates human intelligence, such as by understanding complex information, drawing its own conclusions and engaging in natural dialog with people.
Machine learning is a subset of AI in which the software can learn or adapt like a human can. Essentially, it analyzes huge amounts of data and looks for patterns in order to classify information or make predictions. The addition of a feedback loop allows the software to “learn” as it goes by modifying its approach based on whether the conclusions it draws are right or wrong.
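The feedback loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration (a simple perceptron, not any vendor's actual algorithm): the program makes a prediction, compares it to the known answer, and adjusts its weights only when it was wrong.

```python
# Minimal sketch of a machine-learning feedback loop: a perceptron
# that adjusts its weights whenever its prediction turns out wrong.
# (Hypothetical toy example; real systems use far richer models.)

def predict(weights, features):
    """Classify as 1 if the weighted sum is positive, else 0."""
    score = sum(w * x for w, x in zip(weights, features))
    return 1 if score > 0 else 0

def train(samples, labels, rounds=10, lr=0.1):
    """Learn weights from labeled data via error-driven updates."""
    weights = [0.0] * len(samples[0])
    for _ in range(rounds):
        for features, label in zip(samples, labels):
            error = label - predict(weights, features)  # feedback signal
            if error:  # adjust only on mistakes
                weights = [w + lr * error * x
                           for w, x in zip(weights, features)]
    return weights

# Toy data: the label is 1 when the first feature dominates.
samples = [[3, 1], [2, 0], [0, 2], [1, 3]]
labels  = [1, 1, 0, 0]
w = train(samples, labels)
print(predict(w, [4, 1]))  # → 1
print(predict(w, [0, 5]))  # → 0
```

The `error` term is the feedback loop in miniature: right answers leave the model untouched, wrong answers nudge the weights toward the correct classification.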
AI can process far more information than a human can, and it can perform tasks much faster and with more accuracy. Some curriculum software developers have begun harnessing these capabilities to create programs that can adapt to each student’s unique circumstances.
GoGuardian, a Los Angeles company, uses machine learning technology to improve the accuracy of its cloud-based Internet filtering and monitoring software for Chromebooks. (my note: that smacks of Big Brother) Instead of blocking students’ access to questionable material based on a website’s address or domain name, GoGuardian’s software uses AI to analyze the actual content of a page in real time to determine whether it’s appropriate for students. (my note: privacy)
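The difference between the two filtering approaches can be sketched as follows. This is a deliberately naive, hypothetical example (GoGuardian's actual model is proprietary): the old method matches only the domain name, while the content-based method scores the words actually on the page.

```python
# Hypothetical contrast between domain-based and content-based
# filtering. Domain names and term weights are invented for the demo.

BLOCKED_DOMAINS = {"badsite.example"}          # traditional URL blocklist
FLAG_TERMS = {"gambling": 2, "firearms": 3}    # weighted content terms

def url_filter(url):
    """Old approach: block solely by the site's domain name."""
    domain = url.split("/")[2]
    return domain in BLOCKED_DOMAINS

def content_filter(page_text, threshold=3):
    """Newer approach: score the actual words on the page."""
    words = page_text.lower().split()
    score = sum(FLAG_TERMS.get(w, 0) for w in words)
    return score >= threshold

print(url_filter("https://badsite.example/page"))               # → True
print(content_filter("essay on firearms legislation history"))  # → True
print(content_filter("history of the printing press"))          # → False
```

Note that even this toy version shows the trade-off: a bag-of-words scorer would flag a legitimate research essay just as readily as genuinely inappropriate material, which is exactly where real systems add machine learning to judge context.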
AI also raises serious privacy concerns. It requires an increased focus not only on data quality and accuracy, but also on the responsible stewardship of this information. “School leaders need to get ready for AI from a policy standpoint,” Calhoun Williams said. For instance: What steps will administrators take to secure student data and ensure the privacy of this information?
Under the Children’s Internet Protection Act (CIPA), any US school that receives federal funding is required to have an internet-safety policy. As school-issued tablets and Chromebook laptops become more commonplace, schools must install technological guardrails to keep their students safe. For some, this simply means blocking inappropriate websites. Others, however, have turned to software companies like Gaggle, Securly, and GoGuardian to surface potentially worrisome communications to school administrators.
Over 50% of teachers say their schools are one-to-one (the industry term for assigning every student a device of their own), according to a 2017 survey from Freckle Education.
But even in an age of student suicides and school shootings, when do security precautions start to infringe on students’ freedoms?
When the Gaggle algorithm surfaces a word or phrase that may be of concern—like a mention of drugs or signs of cyberbullying—the “incident” gets sent to human reviewers before being passed on to the school. Using AI, the software is able to process thousands of student tweets, posts, and status updates to look for signs of harm.
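The flag-then-review pipeline described above can be sketched like this. It is a hypothetical illustration (Gaggle's real system is proprietary; the watch phrases and class names are invented): a cheap machine pass queues candidate incidents, and nothing reaches the school until a human reviewer confirms it.

```python
# Hypothetical sketch of a flag-and-review pipeline: an automated
# scan surfaces candidate "incidents," which wait in a queue for
# human review before being forwarded to the school.

from dataclasses import dataclass, field

WATCH_PHRASES = ("hurt myself", "buy drugs")  # invented examples

@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)
    confirmed: list = field(default_factory=list)

    def scan(self, author, message):
        """Machine pass: cheap substring match over student text."""
        text = message.lower()
        if any(phrase in text for phrase in WATCH_PHRASES):
            self.pending.append((author, message))

    def review(self, is_real_concern):
        """Human pass: a reviewer confirms or discards each incident."""
        while self.pending:
            incident = self.pending.pop(0)
            if is_real_concern(incident):
                self.confirmed.append(incident)  # forwarded to the school

queue = ReviewQueue()
queue.scan("student_a", "Going to buy drugs after class")  # flagged
queue.scan("student_b", "The war on drugs essay is due")   # not flagged
queue.review(lambda incident: True)
print(len(queue.confirmed))  # → 1
```

The design point is the two-stage structure: the algorithm only nominates, and the human review step sits between detection and any consequence for the student.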
SMPs help normalize surveillance from a young age. In the wake of the Cambridge Analytica scandal at Facebook and other recent data breaches from companies like Equifax, we have the opportunity to teach kids the importance of protecting their online data.
In an age of increased school violence, bullying, and depression, schools have an obligation to protect their students. But the protection of kids’ personal information is also a matter of their safety.