TWO YEARS AGO, Alison Darcy built a robot to help people struggling with depression. As a clinical research psychologist at Stanford University, she knew that one powerful way to help people suffering from depression or anxiety is cognitive behavioral therapy, or C.B.T. It’s a form of treatment in which a therapist teaches patients simple techniques that help them break negative patterns of thinking.
In a study with 70 young adults, Darcy found that after two weeks of interacting with the bot, the test subjects had lower incidences of depression and anxiety. They were impressed, and even touched, by the software’s attentiveness.
Many users tell Darcy that it’s easier to talk to a bot than to a human; they don’t feel judged.
Darcy argues this is a glimpse of our rapidly arriving future, where talking software is increasingly able to help us manage our emotions. There will be A.I.s that detect our feelings, possibly better than we can. “I think you’ll see robots for weight loss, and robots for being more effective communicators,” she says. It may feel odd at first.
RECENT HISTORY HAS seen a rapid change in at least one human attitude toward machines: We’ve grown accustomed to talking to them. Millions now tell Alexa or Siri or Google Assistant to play music, take memos, put something on their calendar or tell a terrible joke.
One reason botmakers are embracing artificiality is that the Turing Test turns out to be incredibly difficult to pass. Human conversation is full of idioms, metaphors and implied knowledge: Recognizing that the expression “It’s raining cats and dogs” isn’t actually about cats and dogs, for example, surpasses the reach of chatbots.
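A toy sketch can make the idiom problem concrete. The rules and replies below are hypothetical, not any real chatbot’s logic; they just show how literal keyword matching takes a figure of speech at face value:

```python
# A toy rule-based bot (hypothetical rules) that matches keywords literally.
# Idioms defeat it because it has no notion of figurative meaning.
RULES = {
    "cats": "Do you have a cat?",
    "dogs": "Dogs are great pets!",
    "weather": "What's the weather like where you are?",
}

def reply(message: str) -> str:
    for keyword, response in RULES.items():
        if keyword in message.lower():
            return response
    return "Tell me more."

# "Raining cats and dogs" just means heavy rain, but the bot sees "cats":
print(reply("It's raining cats and dogs"))  # → "Do you have a cat?"
```

A human hears a remark about the weather; the keyword matcher answers a question about pets. Closing that gap is the part that has proved so hard.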
Conversational bots thus could bring on a new wave of unemployment — or “readjustment,” to use the bloodless term of economics. Service workers, sales agents, telemarketers — it’s not hard to imagine how millions of jobs that require social interaction, whether on the phone or online, could eventually be eliminated by code.
One person who bought a Jibo was Erin Partridge, an art therapist in Alameda, Calif., who works with the elderly. When she took Jibo on visits, her patients loved it.
For some technology critics, including Sherry Turkle, who does research on the psychology of tech at M.I.T., this raises ethical concerns. “People are hard-wired with sort of Darwinian vulnerabilities, Darwinian buttons,” she told me. “And these Darwinian buttons are pushed by this technology.” That is, programmers are manipulating our emotions when they create objects that inquire after our needs.
The precursor to today’s bots, Joseph Weizenbaum’s ELIZA, was created at M.I.T. in 1966. ELIZA was a pretty crude set of prompts, but by simply asking people about their feelings, it drew them into deep conversations.
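Weizenbaum’s program worked by matching user input against a script of patterns and reflecting the user’s own words back as questions. A minimal sketch in that spirit (the patterns here are simplified stand-ins, not ELIZA’s actual script):

```python
import re

# An ELIZA-style responder (sketch): each rule pairs a pattern with a
# template that turns the user's statement back into a question.
PATTERNS = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]

def eliza(message: str) -> str:
    for pattern, template in PATTERNS:
        match = pattern.search(message)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return "Please go on."

print(eliza("I feel lonely"))  # → "Why do you feel lonely?"
print(eliza("I am sad"))       # → "How long have you been sad?"
```

Even this crude reflection technique is enough to keep a conversation going, which is exactly what surprised Weizenbaum about his users.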
It is important to note that bot accounts do not always clearly identify themselves as such in their profiles, and any bot classification system inevitably carries some risk of error. The Botometer system has been documented and validated in an array of academic publications, and researchers from the Center conducted a number of independent validation measures of its results.
Combine the superfast calculational capacities of Big Compute with the oceans of specific personal information comprising Big Data — and the fertile ground for computational propaganda emerges. That’s how the small AI programs called bots can be unleashed into cyberspace to target and deliver misinformation exactly to the people who will be most vulnerable to it. These messages can be refined over and over again based on how well they perform (again in terms of clicks, likes and so on). Worst of all, all this can be done semiautonomously, allowing the targeted propaganda (like fake news stories or faked images) to spread like viruses through communities most vulnerable to their misinformation.
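The refine-on-performance loop described above is, mechanically, just engagement-weighted allocation. The numbers and message names below are hypothetical, and this is a toy model of the feedback mechanism only, not any real platform’s system:

```python
# Toy model of performance-driven message amplification (hypothetical data):
# the next round's reach is allocated in proportion to observed click rates,
# so whichever message "performs" best crowds out the others over time.
messages = {
    "story_a": {"impressions": 1000, "clicks": 10},  # 1% click rate
    "story_b": {"impressions": 1000, "clicks": 50},  # 5% click rate
}

def next_round_budget(messages, total_impressions=10_000):
    """Allocate the next round's impressions by observed click rate."""
    rates = {name: m["clicks"] / m["impressions"] for name, m in messages.items()}
    total_rate = sum(rates.values())
    return {name: int(total_impressions * rate / total_rate)
            for name, rate in rates.items()}

print(next_round_budget(messages))
# story_b's higher click rate earns it the bulk of the next round's reach
```

Iterate this a few rounds and the best-clicking message absorbs nearly all the reach, which is the “spread like viruses” dynamic in miniature.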
According to Bolsover and Howard, viewing computational propaganda only from a technical perspective would be a grave mistake. As they explain, seeing it just in terms of variables and algorithms “plays into the hands of those who create it, the platforms that serve it, and the firms that profit from it.”
Computational propaganda is a new thing. People just invented it. And they did so by realizing possibilities emerging from the intersection of new technologies (Big Compute, Big Data) and new behaviors those technologies allowed (social media). But the emphasis on behavior can’t be lost.
People are not machines. We do things for a whole lot of reasons including emotions of loss, anger, fear and longing. To combat computational propaganda’s potentially dangerous effects on democracy in a digital age, we will need to focus on both its how and its why.
Educators seeking new technology can start by consulting a database of pre-vetted edtech tools, rated based on alignment with both child data privacy laws and the district’s instructional vision. Each entry includes notes about what the software does, how it can be used in the classroom, and the appropriate age level. Kaye is also working on aligning the database to the ISTE Standards so teachers can see at a glance which standards each tool can help them meet.
Every app falls into one of four categories:
Tools the district approves, supports, pays for, and will train teachers to use.
Tools that are approved and can be freely used on an independent basis.
Tools that are approved with stipulations, such as age or parental permission requirements.
Tools that are not approved because they don’t align with the district’s vision or data privacy needs.
Teachers can request to have a tool vetted
Teachers who choose a pre-vetted app from the approved list can start using it right away, without any further action needed. Educators who have a specific tool in mind that hasn’t yet been vetted can submit a request form that asks questions such as:
How does the tool connect to the curriculum?
Will students be consumers or producers when using it?
How easy is it to learn and use?
What are some of the things the teacher plans to do with it?
Session 2: The Digital Age: The Impact and Future Possibilities Offered by Data and Technology
Thank you for registering to participate in the second Reimagining Minnesota State forum. The forums are designed to spark individual reflection and, we hope, to serve as catalysts for discussions in a variety of venues. The forum will be recorded and available for viewing on the Reimagining website.
Below are directions for attending either in person or by live stream.
Catherine Haslag: Is there any research showing student retention in an online class vs. a face-to-face course?
the challenge is not collecting data, but integrating and using it.
silos = “cylinders of excellence.”
technology innovation around advising. iPASS resources.
adaptive learning systems – how students advance through the learning process.
games and simulations (Bryan Mark Gill); voice recognition.
next 3 to 5 years: AR; by 2023, 40% with AR and VR.
AI still controversial: chatbots and voice assistants.
Unizin: 13 founding members to develop a platform; Canvas, instructional services, data for predictive analytics, consistent data standards among institutions.
University Innovation Alliance. Analytics as the linchpin for student success: graduation rates increased; racial gap in graduation. Georgia State.
digital ethics: Mark Gill and Susana Nuccetelli. Susana Nuccetelli brought her students from the Philosophy Dept. to Mark Gill’s SCSU Vizlab last semester to discuss ethics and AI.
assistant vice president for student success and prevention Morgan State U
the importance of training in technology adoption
Dr. Peter Smith, Orkand Endowed Chair and Professor of Innovative Practices in Higher Education at University of Maryland University College
social disruption, national security issue,
Allan Taft, Canadian researcher: 700 hours/year spent learning something, about 14 h/week.
learners deserve recognition
free range learning.
how do we put a value on people from a different background? knowledge discrimination: we value knowledge by where they learned it, rather than by how they learned it and what they can do with it. talent and capacity go unrecognized.
we, the campus, don’t control the forces for the very first time. MIT’s undergrad curriculum is free; what will happen? dynamics at work here: declining student numbers, unhappy legislators. technology has made college more expensive, not less. doing the right thing leads to more disruption. local will be better, if done well. the workplace can become a place for learning.
learning is a social activity. distance learning can feel like sitting in the farthest row of a 300-person Princeton lecture. there is a tool and there are people; it has to have people at the heart.
what will work not only for MN, but for each of the campuses, the personalization.
staying still is death.
what is the role of faculty in vendor selection and discussions about technology? a heat map showed that IT people tested the vendor website the most; faculty and students much less.
Early signs suggest Gen Z workers are more competitive and pragmatic, but also more anxious and reserved, than millennials, the generation of 72 million born from 1981 to 1996, according to executives, managers, generational consultants and multidecade studies of young people. Gen Zers are also the most racially diverse generation in American history.
With the generation of baby boomers retiring and unemployment at historic lows, Gen Z is filling immense gaps in the workforce. Employers, plagued by worker shortages, are trying to adapt.
LinkedIn Corp. and Intuit Inc. have eased requirements that certain hires hold bachelor’s degrees to reach young adults who couldn’t afford college. At campus recruiting events, EY is raffling off computer tablets because competition for top talent is intense.
Companies are reworking training so it replicates YouTube-style videos that appeal to Gen Z workers reared on smartphones.
“They learn new information much more quickly than their predecessors,”
A few years ago Mr. Stewart noticed that Gen Z hires behaved differently than their predecessors. When the company launched a project to support branch managers, millennials excitedly teamed up and worked together. Gen Z workers wanted individual recognition and extra pay.
Much of Gen Z’s socializing takes place via text messages and social media platforms—a shift that has eroded natural interactions and allowed bullying to play out in front of wider audiences.
The flip side of being digital natives is that Gen Z is even more adept with technology than millennials. Natasha Stough, Americas campus recruiting director at EY in Chicago, was wowed by a young hire who created a bot to answer questions on the company’s Facebook careers page.
To lure more Gen Z workers, EY rolled out video technology that allows job candidates to record answers to interview questions and submit them electronically.
LinkedIn, which used to recruit from about a dozen colleges, broadened its efforts to include hundreds of schools and computer coding boot camps to capture a diverse applicant pool that mirrors the changing population.
People confuse gratification with happiness because both feel good at first. However, the good feeling from gratification lasts only for the moment, not as long as happiness does. Gratification is anything from outside yourself that makes you feel good.
Rushing through school to get a diploma without learning anything will not bring you happiness. A diploma is a piece of paper that is outside of yourself.
Happiness is when you build qualities within you, for example, when you build your self-confidence, self-esteem, kindness, honesty, knowledge, etc. These are qualities that develop within yourself. The stronger these qualities become, the happier you will become.