dear Hollywood, I demand you make a Gen Z Manchurian Candidate reboot where all the brainwashing programming/delusive fantasies take place on TikTok https://t.co/sgSekDuK6l
The bottom line: While the Big Tech behemoths of the U.S. are barred from making inroads in China, the inverse doesn’t apply. That could mark an opening front in the ongoing technological and economic war between the two rivals.
New York’s Lockport City School District is using public funds from a Smart Schools bond to help pay for a reported $3.8 million security system that uses facial recognition technology to identify individuals who don’t belong on campus.
the Future of Privacy Forum (FPF), a nonprofit think tank based in Washington, D.C., published an animated video that illustrates the possible harm that surveillance technology can cause to children and the steps schools should take before making any decisions, such as identifying specific goals for the technology and establishing who will have access to the data and for how long.
My note: the same considerations were relayed to the SCSU SOE dean in regard to the purchase of Promethean and its installation in the SOE building without discussion with faculty, who work with technology. This information was also shared with the dean: https://blog.stcloudstate.edu/ims/2018/10/31/students-data-privacy/
starting to hear more about what might be lost when schools focus too much on data. Here are five arguments against the excesses of data-driven instruction.
The National Education Policy Center releases annual reports on commercialization and marketing in public schools. In its most recent report in May, researchers there raised concerns about targeted marketing to students using computers for schoolwork and homework.
In the past few years several states have passed laws banning employers from looking at the credit reports of job applicants.
Similarly, for young people who get in trouble with the law, there is a procedure for sealing juvenile records
Educational transcripts, unlike credit reports or juvenile court records, are currently considered fair game for gatekeepers like colleges and employers. These records, though, are getting much more detailed.
predictive algorithms to better target students’ individual learning needs.
Personalized learning is a lofty aim, however you define it. To truly meet each student where they are, we would have to know their most intimate details, or discover them through their interactions with our digital tools. We would need to track their moods and preferences, their fears and beliefs…perhaps even their memories.
There’s something unsettling about capturing users’ most intimate details. Any prediction model based on historical records risks typecasting the very people it is intended to serve. Even if models can overcome the threat of discrimination, there is still an ethical question to confront – just how much are we entitled to know about students?
We can accept that tutoring algorithms, for all their processing power, are inherently limited in what they can account for. This means steering clear of mythical representations of what such algorithms can achieve. It may even mean giving up on personalization altogether. The alternative is to pack our algorithms to suffocation at the expense of users’ privacy. This approach does not end well.
There is only one way to resolve this trade-off: loop in the educators.
Technology is a branch of moral philosophy, not of science
The process of making technology is design
Design is a branch of moral philosophy, not of science
System design reflects the designer’s values and the cultural content
Andreas Orphanides
Fulbright BOYD
A Bulgarian professor of Byzantine history: anything less than 200 years old is politics, not history
Access, privacy, equity, values for the prof organization ARLD.
Mike Monteiro
This is how bad design makes it out into the world: not due to malicious intent, but with no intent at all
Cody Hanson
Our expertise, our service ethic, and our values remain our greatest strengths. But for us to have the impact we seek in the lives of our users, we must encode our services and our values into the software
Ethical design.
Design interprets the world to create useful objects. Ethical design closes the loop, imagining how those objects will affect the world.
A good science fiction story should be able to predict not the automobile, but the traffic jam. – Frederik Pohl
Victor Papanek The designer’s social and moral judgement must be brought into play long before she begins to design.
We need to fear the consequences of our work more than we love the cleverness of our ideas Mike Monteiro
Analytics
Qual and quant data – librarians love data: usage, ILL, course reserves – QQLM.
IDEO – the goal of design research isn’t to collect data; it is to synthesize information and provide insight and guidance that leads to action.
Google Analytics: the trade-off, besides privacy concerns. sometimes data and analytics are the only thing we can see.
Frank Chimero – remove a person’s humanity and she is just a curiosity, a pinpoint on a map, a line in a list, an entry in a database. a person turns into a granular bit of information.
by designing for yourself or your team, you are potentially building discrimination right into your product Erica Hall.
Search algorithms.
what is relevance. the relevance of the ranking algorithm. for whom (what patron). crummy searches.
reckless associations – made by humans or computers – can do very real harm, especially when they appear in supposedly neutral environments.
Donna Lanclos and Andrew Asher: Ethnography should be core to the business of the library.
technology as information ecology. co-evolve. prepare to start asking questions to see the effect of our design choices.
ethnography of the library: touch-point tours – a student gives a tour to the librarians or draws a map of the library, to give a sense of what spaces they use, what is important. “ethnographish”
Q from the audience: if instructors warn against Google and Wikipedia and steer students to the library and databases, how do you now warn about the perils of database bias? A: put out fires as they arise and, systematically, try to build it into existing initiatives: the bi-annual magazine, as many places as you can.
Date: Wednesday, April 3rd Time: 3:30 PM to 4:15 PM Conference Session: Concurrent Session 3 Streamed session Lead Presenter: Brian Kane (General Design LLC) Track: Research: Designs, Methods, and Findings Location: Juniper A Session Duration: 45min
Brief Abstract: What happens when you apply design thinking to AI? AI presents a fundamental change in the way people interact with machines. By applying design thinking to the way AI is made and used, we can generate an unlimited amount of new ideas for products and experiences that people will love and use.
https://onlinelearningconsortium.org/olc-innovate-2019-session-page/?session=6964&kwds=
Notes from the session:
design thinking: get out from old mental models. new narratives; get out of the sci fi movies.
narrative generators:
we need machines to make mistakes. AI even more than traditional software.
Lessons learned: don’t replace people
Date: Thursday, April 4th Time: 8:45 AM to 9:30 AM Conference Session: Concurrent Session 4 Streamed session Lead Presenter: Matt Crosslin (University of Texas at Arlington LINK Research Lab) Track: Experiential and Life-Long Learning Location: Cottonwood 4-5 Session Duration: 45min
Brief Abstract: How can teachers utilize chatbots and artificial intelligence in ways that won’t remove humans from the education picture? Using tools like Twine and Recast.AI chatbots, this session will focus on how to build adaptive content that allows learners to create their own heutagogical educational pathways based on individual needs.
Date: Thursday, April 4th Time: 9:45 AM to 10:30 AM Conference Session: Concurrent Session 5 Streamed session Lead Presenter: Maikel Alendy (FIU Online) Co-presenter: Sky V. King (FIU Online – Florida International University) Track: Teaching and Learning Practice Location: Cottonwood 4-5 Session Duration: 45min
Brief Abstract: “This is Us” demonstrates how leveraging storytelling in learning engages students to effectively communicate their authentic story, transitioning from consumers to become creators and influencers. Responsibility as a digital citizen, information and digital literacy, online privacy, and strategies with examples using several edtech tools will be reviewed.
Date: Thursday, April 4th Time: 11:15 AM to 12:00 PM Conference Session: Concurrent Session 6 Streamed session Lead Presenter: Kristin Bushong (Arizona State University) Co-presenter: Heather Nebrich (Arizona State University) Track: Effective Tools, Toys and Technologies Location: Juniper C Session Duration: 45min
Brief Abstract: Considering today’s overstimulated lifestyle, how do we engage busy learners to stay on task? Join this session to discover current efforts in implementing ubiquitous educational opportunities through customized interests and personalized learning aspirations, e.g., adaptive math tools, AI support communities, and memory management systems.
Date: Thursday, April 4th Time: 1:15 PM to 2:00 PM Conference Session: Concurrent Session 7 Streamed session Lead Presenter: Katie Linder (Oregon State University) Co-presenter: June Griffin (University of Nebraska-Lincoln) Track: Teaching and Learning Practice Location: Cottonwood 4-5 Session Duration: 45min
Brief Abstract: The concept of High-impact Educational Practices (HIPs) is well-known, but the conversation about transitioning HIPs online is new. In this session, contributors from the edited collection High-Impact Practices in Online Education will share current HIP research, and offer ideas for participants to reflect on regarding implementing HIPs into online environments.
https://www.aacu.org/leap/hips
https://www.aacu.org/sites/default/files/files/LEAP/HIP_tables.pdf
Date: Thursday, April 4th Time: 3:45 PM to 5:00 PM Streamed session Lead Presenter: Manoush Zomorodi (Stable Genius Productions) Track: N/A Location: Adams Ballroom Session Duration: 1hr 15min
Brief Abstract: How can we ensure that students and educators thrive in increasingly digital environments, where change is the only constant? In this keynote, author and journalist Manoush Zomorodi shares her pioneering approach to researching the effects of technology on our behavior. Her unique brand of journalism includes deep-dive investigations into such timely topics as personal privacy, information overload, and the Attention Economy. These interactive multi-media experiments with tens of thousands of podcast listeners will inspire you to think creatively about how we use technology to educate and grow communities.
Friday
Date: Friday, April 5th Time: 8:30 AM to 9:30 AM Streamed session Lead Presenter: Michael Caulfield (Washington State University-Vancouver) Track: N/A Location: Adams Ballroom Position: 2 Session Duration: 60min
Brief Abstract: Years ago, John Lydon (then Johnny Rotten) sang that “anger is an energy.” And he was right, of course. Anger isn’t an emotion, like happiness or sadness. It’s a reaction, a swelling up of a confused urge. I’m a person profoundly uncomfortable with anger, but yet I’ve found in my professional career that often my most impactful work begins in a place of anger: anger against injustice, inequality, lies, or corruption. And often it is that anger that gives me the energy and endurance to make a difference, to move the mountains that need to be moved. In this talk I want to think through our uneasy relationship with anger; how it can be helpful, and how it can destroy us if we’re not careful.
Date: Friday, April 5th Time: 10:45 AM to 11:30 AM Conference Session: Concurrent Session 10 Streamed session Lead Presenter: Laurie Daily (Augustana University) Co-presenter: Sharon Gray (Augustana University) Track: Problems, Processes, and Practices Location: Juniper A Session Duration: 45min
Brief Abstract: The purpose of this session is to explore the implementation of a Community of Practice to support professional development, enhance online course and program development efforts, and to foster community and engagement between and among full and part time faculty.
Date: Friday, April 5th Time: 11:45 AM to 12:30 PM Conference Session: Concurrent Session 11 Streamed session Lead Presenter: Katrina Rainer (Strayer University) Co-presenter: Jennifer M McVay-Dyche (Strayer University) Track: Teaching and Learning Practice Location: Cottonwood 2-3 Session Duration: 45min
Brief Abstract: Learning is more effective and organic when we teach through the art of storytelling. At Strayer University, we are blending the principles of story-driven learning with research-based instructional design practices to create engaging learning experiences. This session will provide you with strategies to strategically infuse stories into any lesson, course, or curriculum.
https://sched.co/JAqk
the type of data: Wikipedia. the dangers of learning from Wikipedia. how individuals can organize to mitigate some of these dangers. Wikidata, algorithms.
IBM Watson is using Wikipedia, with algorithms making sense of it – an AI system.
YouTube conspiracy-theory videos are debunked by linking to Wikipedia.
semantic relatedness, Word2Vec
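The semantic-relatedness idea behind Word2Vec can be illustrated with a toy cosine-similarity computation. The 3-dimensional vectors below are invented for demonstration; real embeddings have hundreds of dimensions learned from large corpora such as Wikipedia.

```python
# Toy illustration of semantic relatedness via cosine similarity,
# the measure typically used with Word2Vec-style embeddings.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: "king" and "queen" point in similar directions,
# "banana" does not.
vectors = {
    "king":   [0.8, 0.6, 0.1],
    "queen":  [0.7, 0.7, 0.2],
    "banana": [0.1, 0.2, 0.9],
}

print(cosine_similarity(vectors["king"], vectors["queen"]))   # close to 1
print(cosine_similarity(vectors["king"], vectors["banana"]))  # much lower
```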
how do the algorithms work: a large body of unstructured text; they pick specific words.
lots of AI learns about the world from Wikipedia. the neutral point of view policy: Wikipedia asks editors to present views as proportionally as possible. Wikipedia biases: 1. gender bias (only 20-30% are women).
ConceptNet. debias along different demographic dimensions.
citations analysis gives also an idea about biases. localness of sources cited in spatial articles. structural biases.
geolocation on Twitter by County. predicting the people living in urban areas. FB wants to push more local news.
danger (biases) #3: Wikipedia search results vs. the Wikipedia knowledge panel.
collective action against tech: Reddit; a boycott of FB and Instagram.
data labor: the primary resource these companies have – posts, images, reviews, etc.
boycott, data strike (data not being made available for algorithms in the future). GDPR in the EU – covering all historical data – is like the CA Consumer Privacy Act. One can do a data strike without a data boycott. general vs. homogeneous (a group with a shared identity) boycott.
the Wikipedia SPAM policy is obstructing new editors, and that hits communities such as women.
how to access at different levels. methods and methodological concerns. ethical concerns, legal concerns,
TweetDeck for advanced Twitter searches. quotes and likes are relevant, but not enough; sometimes a screenshot.
engagement option
social listening platforms: Crimson Hexagon, Parse.ly, Sysomos – not yet academic platforms; tools to set up queries and visualization, but the algorithms, data samples, etc. are difficult to assess. open-source tools (Urbana's Social Media Macroscope: SMILE (Social Media Intelligence and Learning Environment)) to collect data from Twitter and Reddit; within the platform they can query Twitter, create trend analysis, sentiment analysis. VoxGov (subscription service: analyzing political social media).
graduate-level and faculty research: accessing SM large-scale data via web scraping & APIs; Twitter APIs; JavaScript, Python, etc. Gnip Firehose API ($); Web Scraper Chrome plugin (easy tool; Python and R versions exist); Twint (Twitter scraper).
Facepager (open source), if you are not a Python or R coder: structure and download the data sets.
TAGS: archiving to Google Sheets; uses the Twitter API. anything older than 7 days is not available, so harvest every week.
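The weekly-harvest workflow implied by the 7-day search limit can be sketched as follows: each harvest is merged into a running archive, deduplicated on a unique ID. Field names and sample records here are hypothetical illustrations, not output from the Twitter API or TAGS itself.

```python
# Minimal sketch of merging weekly harvests into one archive, assuming
# each harvested record carries a unique "id" field (as tweet data does).
# Sample data is invented for demonstration.

def merge_harvest(archive, new_records):
    """Append only records whose id is not already in the archive."""
    seen = {rec["id"] for rec in archive}
    for rec in new_records:
        if rec["id"] not in seen:
            archive.append(rec)
            seen.add(rec["id"])
    return archive

week1 = [{"id": 1, "text": "first"}, {"id": 2, "text": "second"}]
week2 = [{"id": 2, "text": "second"}, {"id": 3, "text": "third"}]  # overlap

archive = merge_harvest([], week1)
archive = merge_harvest(archive, week2)
print(len(archive))  # 3 unique records across both harvests
```

Overlapping the harvest windows slightly (e.g., harvesting every 5-6 days against a 7-day limit) and deduplicating like this avoids gaps if a run is missed.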
Social Feed Manager (GW University) – Justin Littman, with Stanford. Install on a server, but it allows much more.
legal concerns: copyright (public info, but not beyond what is copyrighted). the fair use argument is strong, but one cannot publish the data; one can analyze it under fair use. contracts supersede copyright (terms of service/use); licensed data through the library.
methods: sampling concerns (Tufekci, 2014: questions for SM research). SM data is a good set for studying SM, but for other fields? not according to her. hashtag studies: self-selection bias. Twitter as a model organism: over-represented in academic studies.
methodological concerns: scope of access – lack of historical data; mechanics of platform and context: retweets are not necessarily endorsements.
ethical concerns: public info, so IRB requires no informed consent. the right to be forgotten. anonymized data is often still traceable.
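The point that anonymized data is often still traceable can be illustrated with a toy re-identification join on quasi-identifiers, the technique behind well-known de-anonymization results such as Latanya Sweeney's. All records below are invented for demonstration.

```python
# Toy illustration: records stripped of names can be re-identified by
# joining on quasi-identifiers (here ZIP code and birth year) against a
# public directory. All data is fabricated.

anonymized = [
    {"zip": "55301", "birth_year": 1998, "gpa": 3.2},
    {"zip": "55302", "birth_year": 1999, "gpa": 2.8},
]

public_directory = [
    {"name": "Student A", "zip": "55301", "birth_year": 1998},
    {"name": "Student B", "zip": "55302", "birth_year": 1999},
]

reidentified = []
for rec in anonymized:
    matches = [p for p in public_directory
               if p["zip"] == rec["zip"] and p["birth_year"] == rec["birth_year"]]
    if len(matches) == 1:  # a unique match defeats the anonymization
        reidentified.append({"name": matches[0]["name"], "gpa": rec["gpa"]})

print(len(reidentified))  # both "anonymous" records linked back to names
```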
table discussion: digital humanities and journalism are interested, but too narrow. tools are still difficult to find and operate. context of the visuals. how to spread it across a variety of majors and classes. controversial events are more likely to be deleted.
takedowns, lies and corrosion: what is a librarian to do? trolls, takedowns.
development kit circulation. familiarity with the Oculus Rift resulted in less reservation. Downturn also.
An experience station. clean up free apps.
question: spherical video, video 360.
safety issues: policies? instructional perspective: curating. WI people: user testing. touch controllers are more intuitive than an Xbox controller. Retail Oculus Rift.
app Sketchfab, a 3D model viewer; .obj or .stl files. Medium, Tilt Brush.
College of Liberal Arts at the U has their VR, 3D print set up.
Penn State (Paul, librarian; kinesiology, anatomy programs), Information Science and Technology. immersive experiences lab for video 360.
CALIPHA part of it is xrlibraries. libraries equal education. content provider LifeLiqe STEM library of AR and VR objects. https://www.lifeliqe.com/
librarians, IT staff, IDs. help faculty with course design, primarily online, master courses. Concordia is GROWING, mostly because of online students.
solve issues (putting out fires, such as “gradebook” problems in BB). Librarians: research and resources experts. Librarians helping with the LMS. Broadening the definition of the library as a support hub.
counting how many times students use electronic library resources or visit in person, and comparing that to how well the students do in their classes and how likely they are to stay in school and earn a degree. And many library leaders are finding a strong correlation, meaning that students who consume more library materials tend to be more successful academically.
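The analysis described above boils down to correlating per-student library-use counts with outcome measures such as GPA. A minimal sketch, with entirely invented numbers (real studies control for many confounders, and correlation is not causation):

```python
# Correlate hypothetical per-student library-use counts with GPAs.
# All numbers are fabricated for illustration.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

library_uses = [2, 5, 1, 8, 4, 7, 0, 6]                  # hypothetical counts
gpa          = [2.4, 3.1, 2.2, 3.6, 2.9, 3.4, 2.0, 3.3]  # hypothetical GPAs

r = pearson_r(library_uses, gpa)
print(round(r, 2))  # a strongly positive r mirrors the correlation reported
```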
carefully tracking how library use compares to other metrics, and it has made changes as a result—like moving the tutoring center and the writing lab into the library. Those moves were designed not only to lure more people into the stacks, but to make seeking help more socially acceptable for students who might have been hesitant.
a partnership between the library, which knows what electronic materials students use, and the technology office, which manages other campus data such as usage of the course-management system. The university is doing a study to see whether library usage there also equates to student success.
The issue of privacy also emerged during a session on libraries and data at the annual Educause conference earlier this month.
violent deaths in schools have stayed relatively constant over the last 30 years, according to data from the National Center for Education Statistics. But then there’s the emotive reality, which is that every time another event like Sandy Hook or Parkland occurs, many educators and students feel they are in peril when they go to school.
RealNetworks, a Seattle-based software company popular in the 1990s for its audio and video streaming services, has since expanded to offer other tools, including SAFR (Secure, Accurate Facial Recognition), its AI-supported facial recognition software.
After installing new security cameras, purchasing a few Apple devices and upgrading the school’s Wi-Fi, St. Therese was looking at a $24,000 technology tab.
The software is programmed to allow authorized users into the building with a smile.
“Facial recognition isn’t a panacea. It is just a tool,” says Collins, who focuses on education privacy issues.
Another part of the problem with tools like SAFR is that they provide a false sense of security.
Educators seeking new technology can start by consulting a database of pre-vetted edtech tools, rated based on alignment with both child data privacy laws and the district’s instructional vision. Each entry includes notes about what the software does, how it can be used in the classroom, and the appropriate age level. Kaye is also working on aligning the database to the ISTE Standards so teachers can see at a glance which standards each tool can help them meet.
Every app falls into one of four categories:
Tools the district approves, supports, pays for, and will train teachers to use.
Tools that are approved and can be freely used on an independent basis.
Tools that are approved with stipulations, such as age or parental permission requirements.
Tools that are not approved because they don’t align with the district’s vision or data privacy needs.
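The four-tier approval scheme above maps naturally onto a small data structure. A minimal sketch; the category names, the sample tool, and its metadata fields are illustrative only, not the district's actual database schema.

```python
# Encode the district's four approval tiers and a sample catalog entry.
# Tool name and metadata are hypothetical.
from enum import Enum

class Approval(Enum):
    SUPPORTED = "approved, paid for, and supported with training"
    APPROVED = "approved for free independent use"
    CONDITIONAL = "approved with stipulations (age or parental permission)"
    NOT_APPROVED = "misaligned with the district's vision or data privacy needs"

catalog = {
    "ExampleQuizApp": {  # hypothetical tool name
        "category": Approval.CONDITIONAL,
        "notes": "Requires parental permission under age 13.",
        "age_level": "grades 6-12",
    },
}

print(catalog["ExampleQuizApp"]["category"].name)  # CONDITIONAL
```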
Teachers can request to have a tool vetted
Teachers who choose a pre-vetted app from the approved list can start using it right away, without any further action needed. Educators who have a specific tool in mind that hasn’t yet been vetted can submit a request form that asks questions such as:
How does the tool connect to the curriculum?
Will students be consumers or producers when using it?
How easy is it to learn and use?
What are some of the things they plan on doing with it?