Author Archive

social media teaching and learning

Teaching Crowds: Learning and Social Media by Jon Dron and Terry Anderson. Published by Athabasca University, Canada, September 2014. ISBN: 978-1-927356-81-4 (PDF). 370 pages.

(book review)

https://www.dhakacourier.com.bd/news/Essays/Using-social-media-platforms-in-teaching-learning/1051

Dr. Jon Dron and Professor Terry Anderson of Athabasca University, Canada, attempt to introduce a new model for understanding and exploiting the pedagogical potential of Web-based technologies. Recognizing e-learning/online education as a new model of teaching and learning, the authors show how learners can engage with social media platforms to create an unbounded field of emergent connections.

In chapter 9, ‘Issues and Challenges in Educational Uses of Social Software’, the authors examine the dark side of social software—the ways in which it can undermine or even jeopardize, rather than deepen and extend, the experience of learning. They present a series of overarching issues that warrant consideration by anyone who plans to use social software for learning: privacy, disclosure, and trust; cross-cultural dissonance; the complexities of technology and the digital divide; unpredictable systemic effects; and risks such as mob stupidity and filter bubbles.

+++++++++++++++
more on social media in this IMS blog
https://blog.stcloudstate.edu/ims?s=social+media

misconceptions VR training

https://aixr.org/insights/7-myths-and-misconceptions-about-vr-training/

1. VR Training Is Expensive, Especially At Scale

2. VR Training Requires A Lot Of Space

3. VR Training Is Distracting And Counterproductive

4. VR Training Is Unhygienic

5. VR Training Sessions Are Very Long

6. All VR Training Makes Users Sick

7. VR Training Isn’t Here To Stay

++++++++++++
more on VR training in this IMS blog
https://blog.stcloudstate.edu/ims?s=vr+training

students and edtech

Are College Students Comfortable Using Edtech? Maybe Not

https://www-edsurge-com.cdn.ampproject.org/c/s/www.edsurge.com/amp/news/2021-08-04-are-college-students-comfortable-using-edtech-maybe-not

The survey from the College Innovation Network asked nearly 700 students enrolled at four higher ed institutions to answer questions about what online learning has been like for them during the 2020-21 academic year.

While some students haven’t had full access to computers or the internet, others have discovered that their laptops are too old or too slow to adequately handle the tools they’ve been assigned.

four key ways that people develop self-efficacy

college students were less likely to use and trust edtech tools that they don’t consider relevant, accurate or easy to use.

 

3 types of instructional design

3 types of instructional design, illustrated with an egg-boiling recipe
1. Manual.
“Add salt to the water, boil for 8 minutes, immerse in cold water.” That is all: a simple sequence of steps.
The manual is the simplest, cheapest, and, unfortunately, the most common type of educational program. Rote repetition of steps can produce results, but any deviation from the script causes difficulties.
2. Manual with context.
Now imagine this recipe: “During cooking, the shell may crack and the egg white will leak out. To avoid this, add salt to the water: salt makes the protein coagulate.”
Context is added; it is explained why things must be done in that specific way. This is very important because it gives learners tools for handling real-life situations.
3. Abstraction.
This is context taken to the extreme. For example: “Salt makes protein coagulate. That is why, in the old days, people dressed purulent wounds with bandages soaked in salt water.” Two quite different phenomena are linked through a common abstract principle.
This type is not always appropriate, but it can spark students’ curiosity with unexpected facts and comparisons.

++++++++++++++++++++
more on instructional design this IMS blog
https://blog.stcloudstate.edu/ims?s=instructional+design

Cross Reality (XR)

Ziker, C., Truman, B., & Dodds, H. (2021). Cross Reality (XR): Challenges and Opportunities Across the Spectrum. Innovative Learning Environments in STEM Higher Education, 55–77. https://doi.org/10.1007/978-3-030-58948-6_4
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/

For the purpose of this chapter, Cross Reality or XR refers to technologies and applications that involve combinations of mixed reality (MR), augmented reality (AR), virtual reality (VR), and virtual worlds (VWs). These are technologies that connect computer technology (such as informational overlays) to the physical world for the purposes of augmenting or extending experiences beyond the real. Especially relevant to the definition of XR is the fact that this term encompasses a wide range of options for delivering learning experiences, from minimal technology and episodic experiences to deep immersion and persistent platforms. The preponderance of different terms for slightly different technologies indicates that this is a growth area within the field. Here we provide a few definitions of these technologies.

MR—Mixed reality refers to a blend of technologies used to influence the human perception of an experience. Motion sensors, body tracking, and eye tracking interplay with overlaid technology to give a rich and full version of reality displayed to the user. For example, technology could add sound or additional graphics to an experience in real time. Examples include the Magic Leap One and Microsoft HoloLens 2.0. MR and XR are often used interchangeably.

AR—Augmented reality refers to technology systems that overlay information onto the real world, but the technology might not allow for real-time feedback. As such, AR experiences can move or animate, but they might not interact with changes in depth of view or external light conditions. Currently, AR is considered the first generation of the newer and more interactive MR experiences.

VR—Virtual reality, as a technological product, traces its history to approximately 1960 and tends to encompass user experiences that are visually and auditorily different from the real world. Indeed, the real world is often blocked from interacting with the virtual one. Headsets, headphones, haptics, and haptic clothing might purposely cut off all input except that which is virtual. In general, VR is a widely recognizable term, often found in gaming and workplace training, where learners need to be transported to a different time and place. VR experiences in STEM often consist of virtual labs or short virtual field trips.

VW—Virtual worlds are frequently considered a subset of VR, with the difference that VWs are inherently social and collaborative; VWs frequently contain multiple simultaneous users, while VR is often a solo experience. Another distinction between virtual reality and virtual worlds is the persistence of the virtual space. VR tends to be episodic: the learner is in the virtual experience for a few minutes, and the reality created within the experience ends when the learner’s session ends. VWs are persistent in that the worlds continue to exist on computer servers whether or not there are active avatars within the virtual space (Bell ). This distinction between VR and VW, however, is dissolving. VR experiences can be created to persist for days, and some users have been known to wear headsets for extended periods of time. Additionally, more and more VR experiences are being designed for game play, socialization, or mental relaxation. The IEEE VR 2020 online conference and the Educators in VR International Summit 2020 offered participants opportunities to experience conference presentations in virtual rooms as avatars while interacting with presenters and conference attendees (see Sect. 2.5 for more information).

CVEs—Collaborative virtual environments are communication systems in which multiple interactants share the same three-dimensional digital space despite occupying remote physical locations (Yee and Bailenson ).

Embodiment—Embodiment is defined by Lindgren and Johnson-Glenberg () as the enactment of knowledge and concepts through the activity of our bodies within an MR (mixed reality) and physical environment

https://hyp.is/mBiunvx3EeudElMRwHm5dQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

Human-Centered Design is a philosophy that involves putting human needs, capabilities, and behavior first (Jerald 2018: 15). XR provides the opportunity to experience just-in-time, immersive, experiential learning that uses concrete yet exploratory experiences involving the senses, resulting in lasting memories. Here we discuss opportunities for social applications with XR.

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

XR learner activities are usually created for individual use and may or may not need to be experienced together as a class, at the same time or place, with the instructor. Activities can be designed into instruction with VR headsets, high-resolution screens, smartphones, or other solo technological devices for use inside and outside of the classroom.

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

A ready-to-go mapping between STEM courses and XR, in bullet points!

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

Do we address the challenges in the grant proposal? 

Some learners will be held back from full XR activity by visual, physical, and social conditions such as stroke, vertigo, epilepsy, or age-related slowing of reaction time. It should also be noted that the encompassing nature of VR headsets might create discomfort or danger for any learner, as they can no longer fully see and control their body and body space.
