Searching for "algorithms"

the intellectual dark web

Nuance: A Love Story. My affair with the intellectual dark web

Meghan Daum, Aug 24, 2018 https://medium.com/s/greatescape/nuance-a-love-story-ae6a14991059

the standard set of middle-class Democratic Party values: Public safety nets were a force for good, corporate greed was a real threat, civil and reproductive rights were paramount.

I remember how good it felt to stand with my friends in our matching college sweatshirts shouting “never again!” and “my body, my choice!”

(hey, why shouldn’t Sarah Palin call herself a feminist?) brought angry letters from liberals as well as conservatives.

We would all go to the mat for women’s rights, gay rights, or pretty much any rights other than gun rights. We lived, for the most part, in big cities in blue states.

When Barack Obama came into the picture, we loved him with the delirium of crushed-out teenagers, perhaps less for his policies than for being the kind of person who also listens to NPR. We loved Hillary Clinton with the fraught resignation of a daughter’s love for her mother. We loved her even if we didn’t like her. We were liberals, after all. We were family.

Words like “mansplaining” and “gaslighting” were suddenly in heavy rotation, often invoked with such elasticity as to render them nearly meaningless. Similarly, the term “woke,” which originated in black activism, was now being used to draw a bright line between those on the right side of things and those on the wrong side of things.

From the Black Guys on Bloggingheads, YouTube’s algorithms bounced me along a path of similarly unapologetic thought criminals: the neuroscientist Sam Harris and his Waking Up podcast; Christina Hoff Sommers, aka “The Factual Feminist”; the comedian turned YouTube interviewer Dave Rubin; the counter-extremist activist Maajid Nawaz; and a cantankerous and then little-known Canadian psychology professor named Jordan Peterson, who railed against authoritarianism on both the left and right but reserved special disdain for postmodernism, which he believed was eroding rational thought on campuses and elsewhere.

the sudden national obsession with female endangerment on college campuses struck me much the same way it had in the early 1990s: well-intended but ultimately infantilizing to women and essentially unfeminist.

Weinstein and his wife, the evolutionary biologist Heather Heying, who also taught at Evergreen, would eventually leave the school and go on to become core members of the “intellectual dark web.”

Weinstein talked about intellectual “feebleness” in academia and in the media, about the demise of nuance, about still considering himself a progressive despite his feeling that the far left was no better at offering practical solutions to the world’s problems than the far right.

an American Enterprise Institute video of Sommers, the Factual Feminist, in conversation with the scholar and social critic Camille Paglia — “My generation fought for the freedom for women to risk getting raped!” I watched yet another video in which Paglia sat by herself and expounded volcanically about the patriarchal history of art (she was all for it).

the brothers sat down together for a two-hour, 47-minute interview on the Rubin Report,

James Baldwin’s line, “I love America more than any other country in the world, and, exactly for this reason, I insist on the right to criticize her perpetually.”

Jordan Peterson’s 12 Rules for Life: An Antidote to Chaos is a sort of New and Improved Testament for the purpose-lacking young person (often but not always male) for whom tough-love directives like “clean up your room!” go down a lot easier when dispensed with a Jungian, evo-psych panache.

Quillette, a new online magazine that billed itself as “a platform for free thought”

the more honest we are about what we think, the more we’re alone with our thoughts. Just as you can’t fight Trumpism with tribalism, you can’t fight tribalism with a tribe.

media literacy backfire

Did Media Literacy Backfire?

Jan 5, 2017, danah boyd

https://points.datasociety.net/did-media-literacy-backfire-7418c084d88d

Understanding what sources to trust is a basic tenet of media literacy education.

Think about how this might play out in communities where the “liberal media” is viewed with disdain as an untrustworthy source of information…or in those where science is seen as contradicting the knowledge of religious people…or where degrees are viewed as a weapon of the elite to justify oppression of working people. Needless to say, not everyone agrees on what makes a trusted source.

Students are also encouraged to reflect on economic and political incentives that might bias reporting. Follow the money, they are told. Now watch what happens when they are given a list of names of major power players in the East Coast news media whose names are all clearly Jewish. Welcome to an opening for anti-Semitic ideology.

In the United States, we believe that worthy people lift themselves up by their bootstraps. This is our idea of freedom. To take away the power of individuals to control their own destiny is viewed as anti-American by so much of this country. You are your own master.

Children are indoctrinated into this cultural logic early, even as their parents restrict their mobility and limit their access to social situations. But when it comes to information, they are taught that they are the sole proprietors of knowledge. All they have to do is “do the research” for themselves and they will know better than anyone what is real.

Combine this with a deep distrust of media sources.

Many marginalized groups are justifiably angry about the ways in which their stories have been dismissed by mainstream media for decades. It took five days for major news outlets to cover Ferguson. It took months and a lot of celebrities for journalists to start discussing the Dakota Access Pipeline. But feeling marginalized from news media isn’t just about people of color.

Keep in mind that anti-vaxxers aren’t arguing that vaccinations definitively cause autism. They are arguing that we don’t know. They are arguing that experts are forcing children to be vaccinated against their will, which sounds like oppression. What they want is choice — the choice to not vaccinate. And they want information about the risks of vaccination, which they feel are not being given to them. In essence, they are doing what we taught them to do: questioning information sources and raising doubts about the incentives of those who are pushing a single message. Doubt has become a tool.

Addressing so-called fake news is going to require a lot more than labeling. It’s going to require a cultural change about how we make sense of information, whom we trust, and how we understand our own role in grappling with information. Quick and easy solutions may make the controversy go away, but they won’t address the underlying problems.

In the United States, we’re moving towards tribalism (see Fukuyama), and we’re undoing the social fabric of our country through polarization, distrust, and self-segregation.

++++++++++++++++++++++++++++++

boyd, danah. (2014). It’s Complicated: The Social Lives of Networked Teens (1st edition). New Haven: Yale University Press.
p. 8 networked publics are publics that are reconstructed by networked technologies. they are both space and imagined community.
p. 11 affordances: persistence, visibility, spreadability, searchability.
p. technological determinism both utopian and dystopian
p. 30 adults misinterpret teens online self-expression.
p. 31 taken out of context. Joshua Meyrowitz about Stokely Carmichael.
p. 43 as teens have embraced a plethora of social environment and helped co-create the norms that underpin them, a wide range of practices has emerged. teens have grown sophisticated with how they manage contexts and present themselves in order to be read by their intended audience.
p. 54 privacy. p. 59 Privacy is a complex concept without a clear definition. Supreme Court Justice Brandeis: the right to be let alone, but also ‘measure of the access others have to you through information, attention, and physical proximity.’
control over access and visibility
p. 65 social steganography. hiding messages in plain sight
p. 69 subtweeting. encoding content
p. 70 living with surveillance. Foucault, Discipline and Punish
p. 77 addiction. what makes teens obsessed w social media.
p. 81 Ivan Goldberg coined the term internet addiction disorder. jokingly
p. 89 the decision to introduce programmed activities and limit unstructured time is not unwarranted; research has shown a correlation between boredom and deviance.
My interview with Myra, a middle-class white fifteen-year-old from Iowa, turned funny and sad when “lack of time” became a verbal trick in response to every question. From learning Czech to track, from orchestra to work in a nursery, she told me that her mother organized “98%” of her daily routine. Myra did not like all of these activities, but her mother thought they were important.
Myra noted that her mother meant well, but she was exhausted and felt socially disconnected because she did not have time to connect with friends outside of class.
p. 100 danger
are sexual predators lurking everywhere
p. 128 bullying. is social media amplifying meanness and cruelty.
p. 131 defining bullying in a digital era. p. 131 in the 1970s, Dan Olweus narrowed bullying down to three components: aggression, repetition, and imbalance of power. p. 152 SM has not radically altered the dynamics of bullying, but it has made these dynamics more visible to more people. we must use this visibility not to justify increased punishment, but to help youth who are actually crying out for attention.
p. 153 inequality. can SM resolve social divisions?
p. 176 literacy. are today’s youth digital natives? p. 178 Barlow and Rushkoff p. 179 Prensky. p. 180 youth need new literacies. p. 181 youth must become media literate. when they engage with media–either as consumers or producers–they need to have the skills to ask questions about the construction and dissemination of particular media artifacts. what biases are embedded in the artifact? how did the creator intend for an audience to interpret the artifact, and what are the consequences of that interpretation?
p. 183 the politics of algorithms (see also these IMS blog entries https://blog.stcloudstate.edu/ims?s=algorithms) Wikipedia and Google are fundamentally different sites. p. 186 Eli Pariser, The Filter Bubble: the personalization algorithms produce social divisions that undermine any ability to create an informed public. As researchers at Harvard’s Berkman Center have shown, search engines like Google shape the quality of information experienced by youth.
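Pariser’s personalization dynamic is, at its core, a feedback loop: each click raises the weight of its own topic, so early engagements compound into a narrowed feed. The toy below is only an illustrative sketch (the topic labels, smoothing, and weighting scheme are invented; no real platform works this simply):

```python
import random

# Hypothetical personalization loop: rank topics by how often the user
# engaged with them before, then let the user click the top recommendation.
TOPICS = ["politics-left", "politics-right", "science", "sports", "arts"]

def recommend(history, n=3):
    """Sample n topics, weighted toward what the user clicked before."""
    counts = {t: history.count(t) + 1 for t in TOPICS}  # +1 = smoothing
    total = sum(counts.values())
    weights = [counts[t] / total for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

random.seed(1)
history = ["science"]            # a single initial click
for _ in range(200):             # each round, the user clicks the top item
    history.append(recommend(history)[0])

# A rich-get-richer process: the feed drifts toward early engagements.
share = history.count("science") / len(history)
print(f"share of 'science' items after 200 rounds: {share:.0%}")
```

Because every click feeds back into the weights, the loop concentrates the feed on a few topics rather than presenting a balanced sample — which is the mechanism behind the “social divisions” Pariser describes.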
p. 192 digital inequality. p. 194 (bottom) 195 Eszter Hargittai: there are significant differences in media literacy and technical skills even within age cohorts. teens’ technological skills are strongly correlated with socio-economic status. Hargittai argues that many youth, far from being digital natives, are quite digitally naive.
p. 195 Dmitry Epstein: when society frames the digital divide as a problem of access, we see government and industry as the parties responsible for addressing the issue. If the digital divide is framed as a skills issue, we place the onus of learning how to manage on individuals and families.
p. 196 beyond digital natives

Palfrey, J., & Gasser, U. (2008). Born Digital: Understanding the First Generation of Digital Natives (1st edition). New York: Basic Books.

John Palfrey, Urs Gasser: Born Digital
“Digital Natives share a common global culture that is defined not by age, strictly, but by certain attributes and experiences related to how they interact with information technologies, information itself, one another, and other people and institutions. Those who were not ‘born digital’ can be just as connected, if not more so, than their younger counterparts. And not everyone born since, say, 1982, happens to be a digital native.” (see also https://blog.stcloudstate.edu/ims/2018/04/15/no-millennials-gen-z-gen-x/)

p. 197. digital native rhetoric is worse than inaccurate: it is dangerous
many of the media literacy skills needed to be digitally savvy require a level of engagement that goes far beyond what the average teen picks up hanging out with friends on FB or Twitter. Technical skills, such as the ability to build online spaces, require active cultivation: why some search queries return some content before others; why social media push young people to learn how to build their own systems versus simply using social media platforms. teens’ social status and position alone do not determine how fluent or informed they are vis-à-vis technology.
p. 199 Searching for a public on their own

++++++++++++

Daum, M. (2018, August 24). My Affair With the Intellectual Dark Web – Great Escape. Retrieved October 9, 2018, from https://medium.com/s/greatescape/nuance-a-love-story-ae6a14991059

the intellectual dark web

++++++++++++
more on media literacy in this IMS blog
https://blog.stcloudstate.edu/ims?s=media+literacy

fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news

Hoaxy

Interview with the Creators of Hoaxy® from Indiana University

Falsehoods are spread due to biases in the brain, society, and computer algorithms (Ciampaglia & Menczer, 2018). A combined problem is “information overload and limited attention contribute to a degradation of the market’s discriminative power” (Qiu, Oliveira, Shirazi, Flammini, & Menczer, 2017). Falsehoods spread quickly in the US through social media because this has become Americans’ preferred way to read the news (59%) in the 21st century (Mitchell, Gottfried, Barthel, & Shearer, 2016). While a mature critical reader may recognize a hoax disguised as news, there are those who share it intentionally. A 2016 US poll revealed that 23% of American adults had shared misinformation unwittingly or on purpose; respondents in the same poll reported high to moderate confidence in their ability to identify fake news, with only 15% saying they were not very confident (Barthel, Mitchell, & Holcomb, 2016).

Hoaxy® takes it one step further and shows you who is spreading or debunking a hoax or disinformation on Twitter.

+++++++++++++
more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news

Limbic thought and artificial intelligence

Limbic thought and artificial intelligence

September 5, 2018  Siddharth (Sid) Pai

https://www.linkedin.com/pulse/limbic-thought-artificial-intelligence-siddharth-sid-pai/

An AI programme “catastrophically forgets” the learnings from its first set of data and would have to be retrained from scratch with new data. The website futurism.com says a completely new set of algorithms would have to be written for a programme that has mastered face recognition, if it is now also expected to recognize emotions. Data on emotions would have to be manually relabelled and then fed into this completely different algorithm for the altered programme to have any use. The original facial recognition programme would have “catastrophically forgotten” the things it learnt about facial recognition as it takes on new code for recognizing emotions. According to the website, this is because computer programmes cannot understand the underlying logic that they have been coded with.
Irina Higgins, a senior researcher at Google DeepMind, has recently announced that she and her team have begun to crack the code on “catastrophic forgetting”.
As far as I am concerned, this limbic thinking is “catastrophic thinking” which is the only true antipode to AI’s “catastrophic forgetting”. It will be eons before AI thinks with a limbic brain, let alone has consciousness.
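The “catastrophic forgetting” Pai describes can be seen even in a one-parameter model. This sketch is purely illustrative (the two tasks, the learning rate, and the epoch count are invented for the demonstration): a single shared weight is trained on task A, then retrained on task B, and the retraining overwrites everything the model “knew” about task A.

```python
# Toy demonstration of catastrophic forgetting with one shared parameter.

def train(w, data, lr=0.1, epochs=100):
    """Plain gradient descent on squared error for the model y = w * x."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x
            w -= lr * grad
    return w

def loss(w, data):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

task_a = [(x, 2 * x) for x in (-2, -1, 1, 2)]    # task A target: y = 2x
task_b = [(x, -2 * x) for x in (-2, -1, 1, 2)]   # task B target: y = -2x

w = 0.0
w = train(w, task_a)
loss_a_before = loss(w, task_a)   # near zero: task A mastered

w = train(w, task_b)              # sequential training on task B...
loss_a_after = loss(w, task_a)    # ...destroys performance on task A

print(f"task A loss before B: {loss_a_before:.4f}, after B: {loss_a_after:.4f}")
```

The same weight cannot encode both y = 2x and y = -2x, so learning B necessarily erases A. Real networks have millions of weights, but the underlying problem, new gradients overwriting old solutions, is the one DeepMind’s work on forgetting tries to mitigate.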
++++++++++++++++++

Stephen Hawking warns artificial intelligence could end mankind

https://www.bbc.com/news/technology-30290540
++++++++++++++++++++
thank you Sarnath Ramnat (sarnath@stcloudstate.edu) for finding this

An AI Wake-Up Call From Ancient Greece

  https://www.project-syndicate.org/commentary/artificial-intelligence-pandoras-box-by-adrienne-mayor-2018-10

++++++++++++++++++++
more on AI in this IMS blog
https://blog.stcloudstate.edu/ims?s=artifical+intelligence

coding ethics unpredictability

Franken-algorithms: the deadly consequences of unpredictable code

Thu 30 Aug 2018

https://www.theguardian.com/technology/2018/aug/29/coding-algorithms-frankenalgos-program-danger

Between the “dumb” fixed algorithms and true AI lies the problematic halfway house we’ve already entered with scarcely a thought and almost no debate, much less agreement as to aims, ethics, safety, best practice. If the algorithms around us are not yet intelligent, meaning able to independently say “that calculation/course of action doesn’t look right: I’ll do it again”, they are nonetheless starting to learn from their environments. And once an algorithm is learning, we no longer know to any degree of certainty what its rules and parameters are. At which point we can’t be certain of how it will interact with other algorithms, the physical world, or us. Where the “dumb” fixed algorithms – complex, opaque and inured to real time monitoring as they can be – are in principle predictable and interrogable, these ones are not. After a time in the wild, we no longer know what they are: they have the potential to become erratic. We might be tempted to call these “frankenalgos” – though Mary Shelley couldn’t have made this up.

Twenty years ago, George Dyson anticipated much of what is happening today in his classic book Darwin Among the Machines. The problem, he tells me, is that we’re building systems that are beyond our intellectual means to control. We believe that if a system is deterministic (acting according to fixed rules, this being the definition of an algorithm) it is predictable – and that what is predictable can be controlled. Both assumptions turn out to be wrong. “It’s proceeding on its own, in little bits and pieces,” he says. “What I was obsessed with 20 years ago that has completely taken over the world today are multicellular, metazoan digital organisms, the same way we see in biology, where you have all these pieces of code running on people’s iPhones, and collectively it acts like one multicellular organism. There’s this old law called Ashby’s law that says a control system has to be as complex as the system it’s controlling, and we’re running into that at full speed now, with this huge push to build self-driving cars where the software has to have a complete model of everything, and almost by definition we’re not going to understand it. Because any model that we understand is gonna do the thing like run into a fire truck ’cause we forgot to put in the fire truck.”
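The article’s distinction between “dumb” fixed algorithms and learning ones can be made concrete with a minimal contrast (both rules below are invented for illustration): a fixed rule’s behaviour can be read directly off its code, while a learner’s current rule depends on everything it has ingested, so the same query can get different answers over time.

```python
def fixed_rule(x):
    """Deterministic and inspectable: approve iff x >= 10. Always."""
    return x >= 10

class OnlineLearner:
    """Toy learner: 'approve iff above the average of everything seen so far'."""
    def __init__(self):
        self.total, self.count = 0.0, 0

    def observe(self, x):
        self.total += x
        self.count += 1

    def decide(self, x):
        threshold = self.total / self.count if self.count else 0.0
        return x >= threshold

learner = OnlineLearner()
answers = []
for example in [5, 50, 50, 50]:          # the environment it "learns" from
    learner.observe(example)
    answers.append(learner.decide(12))   # the SAME query every time

print(answers)                           # drifts as the data drifts
print(fixed_rule(12), fixed_rule(12))    # the fixed rule never changes
```

Here the drifting “rule” is still trivially recoverable (it is just a running mean), but in a trained model the effective threshold is smeared across millions of parameters, which is exactly the sense in which, “after a time in the wild, we no longer know what they are.”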

Walsh believes this makes it more, not less, important that the public learn about programming, because the more alienated we become from it, the more it seems like magic beyond our ability to affect. When shown the definition of “algorithm” given earlier in this piece, he found it incomplete, commenting: “I would suggest the problem is that algorithm now means any large, complex decision making software system and the larger environment in which it is embedded, which makes them even more unpredictable.” A chilling thought indeed. Accordingly, he believes ethics to be the new frontier in tech, foreseeing “a golden age for philosophy” – a view with which Eugene Spafford of Purdue University, a cybersecurity expert, concurs. Where there are choices to be made, that’s where ethics comes in.

our existing system of tort law, which requires proof of intention or negligence, will need to be rethought. A dog is not held legally responsible for biting you; its owner might be, but only if the dog’s action is thought foreseeable.

model-based programming, in which machines do most of the coding work and are able to test as they go.

As we wait for a technological answer to the problem of soaring algorithmic entanglement, there are precautions we can take. Paul Wilmott, a British expert in quantitative analysis and vocal critic of high frequency trading on the stock market, wryly suggests “learning to shoot, make jam and knit.”

The venerable Association for Computing Machinery has updated its code of ethics along the lines of medicine’s Hippocratic oath, to instruct computing professionals to do no harm and consider the wider impacts of their work.

+++++++++++
more on coding in this IMS blog
https://blog.stcloudstate.edu/ims?s=coding

Fake news materials for Engl 101

English 101 materials for discussion on fake news.

Jamie Heiman.

All materials on #FakeNews in the IMS blog: https://blog.stcloudstate.edu/ims?s=fake+news

this topic is developed in conjunction with digital literacy discussions.

from psychological perspective: https://blog.stcloudstate.edu/ims/2018/03/29/psychology-fake-news/

from legal/ethical perspective: https://blog.stcloudstate.edu/ims/2018/03/26/prison-time-for-fake-news/

definition:
https://blog.stcloudstate.edu/ims/2018/02/18/fake-news-disinformation-propaganda/

mechanics:
https://blog.stcloudstate.edu/ims/2017/11/22/bots-trolls-and-fake-news/

https://blog.stcloudstate.edu/ims/2017/07/15/fake-news-and-video/

https://blog.stcloudstate.edu/ims/2018/04/09/automated-twitter-bots/

https://blog.stcloudstate.edu/ims/2018/03/25/data-misuse/

https://blog.stcloudstate.edu/ims/2018/02/10/bots-big-data-future/

https://blog.stcloudstate.edu/ims/2017/09/19/social-media-algorithms/

exercises in detecting fake news:
(why should we?):

fake news


https://blog.stcloudstate.edu/ims/2016/12/09/immune-to-info-overload/

https://blog.stcloudstate.edu/ims/2017/08/13/library-spot-fake-news/

https://blog.stcloudstate.edu/ims/2016/11/23/fake-news/

https://blog.stcloudstate.edu/ims/2016/12/14/fake-news-2/

https://blog.stcloudstate.edu/ims/2017/06/26/fake-news-real-news/

https://blog.stcloudstate.edu/ims/2017/03/28/fake-news-resources/

https://blog.stcloudstate.edu/ims/2017/03/15/fake-news-bib/

News literacy education (see digital literacy): https://blog.stcloudstate.edu/ims/2018/06/23/digital-forensics-and-news-literacy-education/

https://blog.stcloudstate.edu/ims/2017/07/21/unfiltered-news/

https://blog.stcloudstate.edu/ims/2017/03/13/types-of-misinformation/

Additional ideas and readings:

https://blog.stcloudstate.edu/ims/2017/11/30/rt-hybrid-war/

https://blog.stcloudstate.edu/ims/2017/08/23/nmc-digital-literacy/


future of Internet

Can the Internet be saved?

https://mondediplo.com/outsidein/can-the-internet-be-saved
In 2014 Tim Berners-Lee, inventor of the World Wide Web, proposed an online ‘Magna Carta’ to protect the Internet, as a neutral system, from government and corporate manipulation. He was responding after revelations that British and US spy agencies were carrying out mass surveillance programmes; the Cambridge Analytica scandal makes his proposal as relevant as ever.

Luciano Floridi, professor of Philosophy and Ethics of Information at the Oxford Internet Institute, explains that grey power is not ordinary socio-political or military power. It is not the ability to directly influence others, but rather the power to influence those who influence power. To see grey power, you need only look at the hundreds of high-level instances of revolving-door staffing patterns between Google and European governments and the U.S. Department of State.

And then there is ‘surveillance capitalism’. Shoshana Zuboff, Professor Emerita at Harvard Business School, proposes that surveillance capitalism is ‘a new logic of accumulation’. The incredible evolution of computer processing power, complex algorithms and leaps in data storage capabilities combine to make surveillance capitalism possible. It is the process of accumulation by dispossession of the data that people produce.

The respected security technologist Bruce Schneier recently applied the insights of surveillance capitalism to the Cambridge Analytica/Facebook crisis.

For Schneier, ‘regulation is the only answer.’ He cites the EU’s General Data Protection Regulation coming into effect next month, which stipulates that users must consent to what personal data can be saved and how it is used.

++++++++++++++++++++++
more on the Internet in this IMS blog
https://blog.stcloudstate.edu/ims?s=internet

bots, big data and the future

Computational Propaganda: Bots, Targeting And The Future

February 9, 2018, 11:37 AM ET

https://www.npr.org/sections/13.7/2018/02/09/584514805/computational-propaganda-yeah-that-s-a-thing-now

Combine the superfast calculational capacities of Big Compute with the oceans of specific personal information comprising Big Data — and the fertile ground for computational propaganda emerges. That’s how the small AI programs called bots can be unleashed into cyberspace to target and deliver misinformation exactly to the people who will be most vulnerable to it. These messages can be refined over and over again based on how well they perform (again in terms of clicks, likes and so on). Worst of all, all this can be done semiautonomously, allowing the targeted propaganda (like fake news stories or faked images) to spread like viruses through communities most vulnerable to their misinformation.
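The “refined over and over again based on how well they perform” loop described above is, in essence, a multi-armed bandit. The epsilon-greedy sketch below is illustrative only (the message variants and their click probabilities are invented): it shows how a bot can converge on whichever message its audience responds to, with no human judgment in the loop.

```python
import random

# Simulated click rates per message variant (invented for illustration).
random.seed(0)
variants = {"headline-A": 0.02, "headline-B": 0.05, "headline-C": 0.11}
clicks = {v: 0 for v in variants}
shown = {v: 0 for v in variants}

def pick(eps=0.1):
    """Epsilon-greedy: mostly exploit the best-performing variant so far."""
    if random.random() < eps or not any(shown.values()):
        return random.choice(list(variants))   # explore
    return max(variants, key=lambda v: clicks[v] / shown[v] if shown[v] else 0)

for _ in range(5000):                          # 5000 simulated impressions
    v = pick()
    shown[v] += 1
    if random.random() < variants[v]:          # simulated audience click
        clicks[v] += 1

best = max(variants, key=lambda v: clicks[v] / shown[v] if shown[v] else 0)
print("most-shown variant:", max(shown, key=shown.get), "| est. best:", best)
```

Nothing in the loop knows or cares whether a headline is true; it optimizes clicks alone, which is why the semiautonomous refinement the article describes is so effective at finding the most “vulnerable” framing.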

According to Bolsover and Howard, viewing computational propaganda only from a technical perspective would be a grave mistake. As they explain, seeing it just in terms of variables and algorithms “plays into the hands of those who create it, the platforms that serve it, and the firms that profit from it.”

Computational propaganda is a new thing. People just invented it. And they did so by realizing possibilities emerging from the intersection of new technologies (Big Compute, Big Data) and new behaviors those technologies allowed (social media). But the emphasis on behavior can’t be lost.

People are not machines. We do things for a whole lot of reasons including emotions of loss, anger, fear and longing. To combat computational propaganda’s potentially dangerous effects on democracy in a digital age, we will need to focus on both its how and its why.

++++++++++++++++
more on big data in this IMS blog
https://blog.stcloudstate.edu/ims?s=big+data

more on bots in this IMS blog
https://blog.stcloudstate.edu/ims?s=bot

more on fake news in this IMS blog
https://blog.stcloudstate.edu/ims?s=fake+news

topics for IM260

proposed topics for IM 260 class

  • Media literacy. Differentiated instruction. Media literacy guide.
    Fake news as part of media literacy. Visual literacy as part of media literacy. Media literacy as part of digital citizenship.
  • Web design / web development
    the roles of HTML5, CSS, JavaScript, PHP, Bootstrap, jQuery, React and other scripting languages and libraries. Heat maps and other usability issues; website content strategy. The model-view-controller (MVC) design pattern
  • Social media for institutional use. Digital Curation. Social Media algorithms. Etiquette. Ethics. Mastodon
    I hosted a LITA webinar in the fall of 2016 (four weeks); I can accommodate any information from that webinar for the use of the IM students
  • OER and instructional designer’s assistance to book creators.
    I can cover both the “library part” (“free” OER, copyright issues etc) and the support / creative part of an OER book / textbook
  • Big Data. Data visualization. Large scale visualization. Text encoding. Analytics, data mining. Unizin. Python, R in academia.
    I can introduce the students to the large idea of Big Data and its importance in light of the upcoming IoT, but also departmentalize its importance for academia, business, etc. From infographics to heavy-duty visualization (Primo X-Services API, JSON, Flask).
  • NetNeutrality, Digital Darwinism, Internet economy and the role of your professional in such environment
    I can introduce students to the issues, if not familiar and / or lead a discussion on a rather controversial topic
  • Digital assessment. Digital Assessment literacy.
    I can introduce students to tools, how to evaluate and select tools and their pedagogical implications
  • Wikipedia
    a hands-on exercise on working with Wikipedia. After the session, students will be able to create Wikipedia entries thus knowing intimately the process of Wikipedia and its information.
  • Effective presentations. Tools, methods, concepts and theories (cognitive load). Presentations in the era of VR, AR and mixed reality. Unity.
    I can facilitate a discussion among experts (your students) on selection of tools and their didactically sound use to convey information. I can supplement the discussion with my own findings and conclusions.
  • eConferencing. Tools and methods
    I can facilitate a discussion among your students on selection of tools and comparison. Discussion about their future and their place in an increasingly online learning environment
  • Digital Storytelling. Immersive Storytelling. The Moth. Twine. Transmedia Storytelling
    I am teaching a LIB 490/590 Digital Storytelling class. I can adapt any information from that class to the use of IM students
  • VR, AR, Mixed Reality.
    besides Mark Gill, I can facilitate a discussion, which goes beyond hardware and brands, but expand on the implications for academia and corporate education / world
  • IoT , Arduino, Raspberry PI. Industry 4.0
  • Instructional design. ID2ID
    I can facilitate a discussion based on the Educause suggestions about the profession’s development
  • Microcredentialing in academia and corporate world. Blockchain
  • IT in K12. How to evaluate; prioritize; select. obsolete trends in 21st-century schools. K12 mobile learning
  • Podcasting: past, present, future. Beautiful Audio Editor.
    a definition of podcasting and delineation of similar activities; advantages and disadvantages.
  • Digital, Blended (Hybrid), Online teaching and learning: facilitation. Methods and techniques. Proctoring. Online students’ expectations. Faculty support. Asynch. Blended Synchronous Learning Environment
  • Gender, race and age in education. Digital divide. Xennials, Millennials and Gen Z. generational approach to teaching and learning. Young vs old Millennials. Millennial employees.
  • Privacy, [cyber]security, surveillance. K12 cyberincidents. Hackers.
  • Gaming and gamification. Appsmashing. Gradecraft
  • Lecture capture, course capture.
  • Bibliometrics, altmetrics
  • Technology and cheating, academic dishonesty, plagiarism, copyright.

digital darwinism

We Need New Rules for the Internet Economy

Antitrust laws only go so far when addressing companies that don’t produce any physical goods. It is time to negotiate a new set of rules. Otherwise, our future economy will be dominated by just a few companies.

A DER SPIEGEL Editorial by Armin Mahler, November 03, 2017, 06:12 PM
http://www.spiegel.de/international/business/editorial-time-for-new-rules-for-the-ditigal-economy-a-1176403.html

There are still people out there who think that Amazon is nothing more than an online version of a department store. But it’s much more than that: It is a rapidly growing, global internet giant that is changing the way we shop, conquering more and more markets, using Alexa to suck up our personal data straight out of our living rooms and currently seeking access to our front door keys so it can deliver packages even when nobody’s home.

It wasn’t that long ago that EU efforts to limit the power of Google and Amazon on the European market were decried in the U.S. as protectionism, as an attempt by the Europeans to protect their own inferior digital economy. Now, though, politicians and economists in the U.S. have even begun discussing the prospect of breaking up the internet giants. The mood has shifted.

The digital economy, by contrast, is based on algorithms and its most powerful companies don’t produce any physical products. Customers receive their services free of charge, paying only with their data. The more customers a service provider attracts, the more attractive it becomes to new customers, who then deliver even more data – which is why Google and Facebook need not fear new competition.
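The feedback loop in that paragraph — more customers attract more customers, who deliver more data — is a preferential-attachment process, and a toy simulation (all platform names and starting numbers invented) shows why it tends toward winner-take-all markets even from a near-equal start:

```python
import random

# Each new user joins a platform with probability proportional to its
# current user base, so a small early lead compounds into dominance.
random.seed(42)
users = {"platform-A": 55, "platform-B": 50, "platform-C": 45}

for _ in range(10_000):                  # 10,000 new users arrive one by one
    total = sum(users.values())
    r = random.uniform(0, total)
    for platform, n in users.items():    # weighted pick by current size
        r -= n
        if r <= 0:
            users[platform] += 1         # the big get bigger
            break

leader = max(users, key=users.get)
share = users[leader] / sum(users.values())
print(f"{leader} ends with {share:.0%} of all users")
```

This is why, as the editorial argues, Google and Facebook “need not fear new competition”: a challenger does not just need a better product, it needs to overcome a self-reinforcing data and user-base advantage.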

first of all, the power of a company, and the abuse of that power, must be redefined. We cannot allow a situation in which these extremely large companies can swallow up potential rivals before they can even begin to develop. As such, company acquisitions must be monitored much more strictly than they currently are and, if need be, blocked.

Second, it must be determined who owns the data collected – whether, for example, it should also be made available to competitors or whether consumers should receive more in exchange than simply free internet search results.

Third, those disseminating content cannot be allowed to reject responsibility for that content. Demonstrably false claims and expressions of hate should not be tolerated.

And finally, those who earn lots of money must also pay lots of taxes – and not just back home but in all the countries where they do business.

+++++++++++
more on net neutrality in this IMS blog
https://blog.stcloudstate.edu/ims?s=net+neutrality
