1. Anti-School Shooter Software
4. “The Year of the MOOC” (2012)
6. “Everyone Should Learn to Code”
8. LAUSD’s iPad Initiative (2013)
9. Virtual Charter Schools
10. Google for Education
14. inBloom. The Shared Learning Collaborative (2011)
17. Test Prep
20. Predictive Analytics
22. Automated Essay Grading
25. Peter Thiel
26. Google Glass
32. Common Core State Standards
44. YouTube, the New “Educational TV”
48. The Hour of Code
49. Yik Yak
52. Virtual Reality
57. TurnItIn (and the Cheating Detection Racket) (my note: repeating the same for years: http://blog.stcloudstate.edu/ims?s=turnitin)
64. Alexa at School
65. Apple’s iTextbooks (2011)
67. UC Berkeley Deletes Its Online Lectures (ADA compliance)
72. Chatbot Instructors. IBM Watson “AI” technology (2016)
82. “The End of Library” Stories (and the Software that Seems to Support That)
92. “The Flipped Classroom”
93. 3D Printing
100. The Horizon Report
TWO YEARS AGO, Alison Darcy built a robot to help the depressed. As a clinical research psychologist at Stanford University, she knew that one powerful way to help people suffering from depression or anxiety is cognitive behavioral therapy, or C.B.T. It’s a form of treatment in which a therapist teaches patients simple techniques that help them break negative patterns of thinking.
In a study with 70 young adults, Darcy found that after two weeks of interacting with the bot, the test subjects had lower incidences of depression and anxiety. They were impressed, and even touched, by the software’s attentiveness.
Many users tell Darcy that it’s easier to talk to a bot than to a human; they don’t feel judged.
Darcy argues this is a glimpse of our rapidly arriving future, where talking software is increasingly able to help us manage our emotions. There will be A.I.s that detect our feelings, possibly better than we can. “I think you’ll see robots for weight loss, and robots for being more effective communicators,” she says. It may feel odd at first.
RECENT HISTORY HAS seen a rapid change in at least one human attitude toward machines: We’ve grown accustomed to talking to them. Millions now tell Alexa or Siri or Google Assistant to play music, take memos, put something on their calendar or tell a terrible joke.
One reason botmakers are embracing artificiality is that the Turing Test turns out to be incredibly difficult to pass. Human conversation is full of idioms, metaphors and implied knowledge: Recognizing that the expression “It’s raining cats and dogs” isn’t actually about cats and dogs, for example, surpasses the reach of chatbots.
Conversational bots thus could bring on a new wave of unemployment — or “readjustment,” to use the bloodless term of economics. Service workers, sales agents, telemarketers — it’s not hard to imagine how millions of jobs that require social interaction, whether on the phone or online, could eventually be eliminated by code.
One person who bought a Jibo, a social home robot, was Erin Partridge, an art therapist in Alameda, Calif., who works with the elderly. When she took Jibo on visits, her patients loved it.
For some technology critics, including Sherry Turkle, who does research on the psychology of tech at M.I.T., this raises ethical concerns. “People are hard-wired with sort of Darwinian vulnerabilities, Darwinian buttons,” she told me. “And these Darwinian buttons are pushed by this technology.” That is, programmers are manipulating our emotions when they create objects that inquire after our needs.
The precursor to today’s bots, Joseph Weizenbaum’s ELIZA, was created at M.I.T. in 1966. ELIZA relied on a fairly crude set of scripted prompts and keyword pattern matching, but by simply asking people about their feelings, it drew them into deep conversations.
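The pattern-matching trick ELIZA used can be sketched in a few lines: match a keyword pattern, "reflect" the captured fragment from first to second person, and echo it back inside a question. The rules and reflection table below are hypothetical illustrations, not Weizenbaum's original DOCTOR script.

```python
import re

# Hypothetical ELIZA-style rules: each pairs a regex with a response template.
RULES = [
    (re.compile(r"\bI feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.*)", re.I), "Tell me more about your {0}."),
]

# Swap first and second person so the echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(fragment):
    words = fragment.rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in words)

def respond(utterance):
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    # Content-free fallback keeps the conversation going, much as ELIZA did.
    return "Please, go on."

print(respond("I feel anxious about my exams"))
# Why do you feel anxious about your exams?
```

The illusion of attentiveness comes entirely from the reflection step: the program understands nothing, yet mirroring a person's own words back as a question is enough to draw them out.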