Got a new open access article out on the ways AI is embedding in education research. Well-funded precision education experts and learning engineers aim to collect psychodata, brain data and biodata as evidence of the embodied substrates of learning. https://t.co/CbdHReXUiz
This article presents an examination of how education research is being remade as an experimental data-intensive science. AI is combining with learning science in new ‘digital laboratories’ where ownership over data, and power and authority over educational knowledge production, are being redistributed to research assemblages of computational machines and scientific expertise.
Research across the sciences, humanities and social sciences is increasingly conducted through digital knowledge machines that are reconfiguring the ways knowledge is generated, circulated and used (Meyer and Schroeder, 2015).
Knowledge infrastructures, such as those of statistical institutes or research-intensive universities, have undergone significant digital transformation with the arrival of data-intensive technologies, with knowledge production now enacted in myriad settings, from academic laboratories and research institutes to commercial research and development studios, think tanks and consultancies. Datafied knowledge infrastructures have become hubs of command and control over the creation, analysis and exchange of data (Bigo et al., 2019).
The combination of AI and learning science into an AILSci research assemblage consists of particular forms of scientific expertise embodied by knowledge actors – individuals and organizations – identified by categories including science of learning, AIED, precision education and learning engineering.
Precision education overtly uses psychological, neurological and genomic data to tailor or personalize learning around the unique needs of the individual (Williamson, 2019). Precision education approaches include cognitive tracking, behavioural monitoring, brain imaging and DNA analysis.
Expert power is therefore claimed by those who can perform big data analyses, especially those able to translate and narrate the data for various audiences. Likewise, expert power in education is now claimed by those who can enact data-intensive science of learning, precision education and learning engineering research and development, and translate AILSci findings into knowledge for application in policy and practitioner settings.
the thinking of a thinking infrastructure is not merely a conscious human cognitive process, but relationally performed across humans and socio-material strata, wherein interconnected technical devices and other forms ‘organize thinking and thought and direct action’.
As an infrastructure for AILSci analyses, these technologies at least partly structure how experts think: they generate new understandings and knowledge about processes of education and learning that are only thinkable and knowable due to the computational machinery of the research enterprise.
Big data-based molecular genetics studies are part of a bioinformatics-led transformation of biomedical sciences based on analysing exceptional volumes of data (Parry and Greenhough, 2018), which has transformed the biological sciences to focus on structured and computable data rather than embodied evidence itself.
Isin and Ruppert (2019) have recently conceptualized an emergent form of power that they characterize as sensory power. Building on Foucault, they note how sovereign power gradually metamorphosed into disciplinary power and biopolitical forms of statistical regulation over bodies and populations. Sensory power marks a shift to practices of data-intensive sensing, and to the quantified tracking, recording and representing of living pulses, movements and sentiments through devices such as wearable fitness monitors, online natural-language processing and behaviour-tracking apps. Davies (2019: 515–20) designates these as ‘techno-somatic real-time sensing’ technologies that capture the ‘rhythms’ and ‘metronomic vitality’ of human bodies, and bring about ‘new cyborg-type assemblages of bodies, codes, screens and machines’ in a ‘constant cybernetic loop of action, feedback and adaptation’.
Techno-somatic modes of neural sensing, using neurotechnologies for brain imaging and neural analysis, are the next frontier in AILSci. Real-time brainwave sensing is being developed and trialled in multiple expert settings.
Many platforms have both qualitative and quantitative capabilities, such as UserZoom and UserTesting.
Tips for Remote Facilitating and Presenting:
Turn on your camera
Enable connection
Create ground rules
Assign homework
Adapt the structure
Tools for Remote Facilitating and Presenting
Presenting UX work: Zoom, GoToMeeting, and Google Hangouts Meet
Generative workshop activities: Google Draw, Microsoft Visio, Sketch, MURAL, and Miro
Evaluative workshop activities: MURAL or Miro. Alternatively, use survey tools such as SurveyMonkey or CrowdSignal, or live polling apps such as Poll Everywhere that you can insert directly into your slides.
Remote Collaboration and Brainstorming
Consider both synchronous and asynchronous methods
OneNote
OneNote is the obvious choice for anyone who is using a Microsoft Surface or other Windows-based tablet. It is also available on iPads and Android tablets. The option to have handwriting converted to text is an outstanding feature.
Google Keep
If you’re a G Suite for Education user, Google Keep is the natural choice. It doesn’t have the handwriting-to-text function that OneNote offers.
Zoho Notebook
Zoho Notebook doesn’t have the name recognition of OneNote or Keep, but it has the most intuitive design and organization options of the three digital notebooks featured here.
The downside to Zoho Notebook is that the handwriting option only appears on the Android and iOS platforms. If the handwriting option worked in the Chrome or Edge web browsers, it would be an even stronger choice.
Early signs suggest Gen Z workers are more competitive and pragmatic, but also more anxious and reserved, than millennials, the generation of 72 million born from 1981 to 1996, according to executives, managers, generational consultants and multidecade studies of young people. Gen Zers are also the most racially diverse generation in American history.
With the generation of baby boomers retiring and unemployment at historic lows, Gen Z is filling immense gaps in the workforce. Employers, plagued by worker shortages, are trying to adapt.
LinkedIn Corp. and Intuit Inc. have eased requirements that certain hires hold bachelor’s degrees to reach young adults who couldn’t afford college. At campus recruiting events, EY is raffling off computer tablets because competition for top talent is intense.
Companies are reworking training so it replicates YouTube-style videos that appeal to Gen Z workers reared on smartphones.
“They learn new information much more quickly than their predecessors,”
A few years ago Mr. Stewart noticed that Gen Z hires behaved differently than their predecessors. When the company launched a project to support branch managers, millennials excitedly teamed up and worked together. Gen Z workers wanted individual recognition and extra pay.
Much of Gen Z’s socializing takes place via text messages and social media platforms—a shift that has eroded natural interactions and allowed bullying to play out in front of wider audiences.
The flip side of being digital natives is that Gen Z is even more adept with technology than millennials. Natasha Stough, Americas campus recruiting director at EY in Chicago, was wowed by a young hire who created a bot to answer questions on the company’s Facebook careers page.
To lure more Gen Z workers, EY rolled out video technology that allows job candidates to record answers to interview questions and submit them electronically.
LinkedIn, which used to recruit from about a dozen colleges, broadened its efforts to include hundreds of schools and computer coding boot camps to capture a diverse applicant pool that mirrors the changing population.
Screencastify is a tool that allows students and educators to personalize their learning experience through sharing their voice via a screen recording. The app is a Chrome extension, meaning the tool is always at the ready whenever you want to capture some magic!
If you are not a Kahoot user yet, please consider: a) Kahoots (quizzes) can be an excellent conversation starter (vs. an assessment tool); b) Kahoots can be modified to your liking (you can change the content).
Here is a screen-sharing capture to give you a taste of the excitement:
Indiana University explores what the classroom of the future should look like by bringing together tech partners and university leaders to share ideas on how to design classrooms that make better use of faculty and student time.
Untether instructors from the room’s podium, allowing them control from anywhere in the room;
Streamline the start of class, including biometric login to the room’s technology, behind-the-scenes routing of course content to room displays, control of lights and automatic attendance taking;
Offer whiteboards that can be captured, routed to different displays in the room and saved for future viewing and editing;
Provide small-group collaboration displays and the ability to easily route content to and from these displays; and
Deliver these features through a simple, user-friendly and reliable room/technology interface.
Key players from Crestron, Google, Sony, Steelcase and Spectrum met with Indiana University faculty, technologists and architects to generate new ideas related to current and emerging technologies. Activities included collaborative brainstorming focusing on these questions:
What else can we do to create the classroom of the future?
What current technology exists to solve these problems?
What could be developed that doesn’t yet exist?
What’s next?
Top five findings:
Screenless and biometric technology will play an important role in the evolution of classrooms in higher education. We plan to research how voice activation and other Internet of Things technologies can streamline the process for faculty and students.
The entire classroom will become a space for student activity and brainstorming; walls, windows, desks and all activities are easily captured to the cloud, allowing conversations to continue outside of class or at the next class meeting.
Technology will be leveraged to provide advanced automation for a variety of tasks, so the faculty member is freed from those duties to focus on teaching.
The technology will become invisible to the process and enhance and customize the experience for the learner.
Virtual assistants could play an important role in providing students with a supported experience throughout their entire campus career.
In September 2015, the then library dean (deans change every 2-3 years) requested that a committee of librarians meet and discuss the remodeling of Miller Center 218. By that time the SCSU CIO was asserting BYOx as a new policy for SCSU. BYOx in essence means the necessity for a stronger (wider) WiFi pipe. Based on that assertion, I, Plamen Miltenoff, insisted on shifting the cost from hardware (computers, laptops) to infrastructure (more WiFi nodes in the room and around it) and on preparing for the upcoming IoT by learning to remodel our syllabi for mobile devices and to use those (students’) mobile devices, rather than squander university money on hardware. At least one faculty member from the committee honestly admitted she had no idea about IoT and, respectively, about the merit of my proposal. Thus, my proposal was completely disregarded by the self-nominated chair of the committee of librarians, who pushed for her idea of replacing the desktops with a cart of laptops (a very 2010 idea, which by 2015 was already passé). As per Kelly (2018) (second article above), the failure of her proposal to the dean to choose laptops over mobile devices is obvious, considering that faculty DO see mobile devices completely replacing desktops and laptops, and that faculty DO NOT see document cameras and overhead projectors as tools that are here to stay.
Here are the notes from September 2015 https://blog.stcloudstate.edu/ims/2015/09/25/mc218-remodel/
As a result, my IoT proposal, now reflected in Johnston (2018) (first article above), did not even formally make it to the dean, hence the necessity of making it available through this blog.
The SCSU library’s thinking regarding the physical remodeling of classrooms is behind the times, and that costs the university money if the room needs to be remodeled again to keep up with the times.
GOTTA CATCH ’EM ALL: EXPLORING POKEMON GO IN SEARCH OF LEARNING ENHANCEMENT OBJECTS
Annamaria Cacchione, Emma Procter-Legg and Sobah Abbas Petersen
Universidad Complutense de Madrid, Facultad de Filologia, Av.da Complutense s/n, 28040 Madrid, Spain; Independent, Abingdon, Oxon, UK; SINTEF Technology and Society, Trondheim, Norway
The Augmented Reality Game, Pokemon Go, took the world by storm in the summer of 2016. City landscapes were decorated with amusing, colourful objects called Pokemon, and the holiday activities were enhanced by catching these wonderful creatures. In light of this, it is inevitable for mobile language learning researchers to reflect on the impact of this game on learning and how it may be leveraged to enhance the design of mobile and ubiquitous technologies for mobile and situated language learning. This paper analyses the game Pokemon Go and the players’ experiences according to a framework developed for evaluating mobile language learning and discusses how Pokemon Go can help to meet some of the challenges faced by earlier research activities.
A comparison between PG and Geocaching will illustrate the evolution of the concept of location-based games, a concept that is very close to that of situated learning, which we have explored in several previous works.
Pokémon Go is a free, location-based augmented reality game developed for mobile devices. Players use GPS on their mobile device to locate, capture, battle, and train virtual creatures (a.k.a. Pokémon), which appear on screen overlaying the image seen through the device’s camera. This makes it seem like the Pokémon are in the same real-world location as the player.
“Put simply, augmented reality is a technology that overlays computer-generated visuals over the real world through a device camera, bringing your surroundings to life and interacting with sensors such as location and heart rate to provide additional information” (Ramirez, 2014).
Apply the evaluation framework developed in 2015 for mobile learning applications (Cacchione, Procter-Legg, Petersen, & Winter, 2015). The framework is composed of a set of factors of different natures (neuroscientific, technological, organisational and pedagogical) and aims to provide a comprehensive account of what plays a major role in ensuring effective learning via mobile devices.