My note: the LITA publication about Emporia State University (see below) pursues the same goals as the project developed by two SCSU librarians, Susan Hubbs, MLIS, and Plamen Miltenoff, Ph.D., MLIS:
Virtual reality (VR) has emerged as a popular technology for gaming and learning, with its uses for teaching presently being investigated in a variety of educational settings. However, one area where the effect of this technology on students has not been examined in detail is as a tool for new student orientation in colleges and universities. This study investigates this effect using an experimental methodology and the population of new master of library science (MLS) students entering a library and information science (LIS) program. The results indicate that students who received a VR orientation expressed more optimistic views about the technology, saw greater improvement in scores on an assessment of knowledge about their program and chosen profession, and saw a small decrease in program anxiety compared to those who received the same information as standard text-and-links. The majority of students also indicated a willingness to use VR technology for learning for long periods of time (25 minutes or more). The researchers concluded that VR may be a useful tool for increasing student engagement, as described by Game Engagement Theory.
AUTHOR BIOGRAPHY
Brady Lund, Emporia State University
Brady Lund is a doctoral student at Emporia State University’s School of Library and Information Management, where he studies the intersection of information technology and information science, among other topics.
The EDUCAUSE XR (Extended Reality) Community Group Listserv <XR@LISTSERV.EDUCAUSE.EDU>
Greetings to you all! Presently, I am undertaking a master's course in "Instructional Design and Technology," which has two components: coursework and research. For my research, I would like to pursue the field of Augmented Reality (AR) and mobile learning. I am thinking of an idea that could lead to collaboration among students and directly translate into enhanced learning while they use an AR application. However, I am having a problem coming up with an application because I don't have any computing background. This, in turn, is affecting my ability to come up with a good research topic.
I teach gross anatomy and histology to many students of health sciences at Mbarara University, and this is where I feel I could make a contribution to learning anatomy using AR, since almost all students own smartphones. I, therefore, kindly request you to let me know which of the freely available AR app authoring tools could help me in this regard. In addition, I request your suggestions regarding which research area(s) I should pursue in order to come up with a good research topic.
Hoping to hear from you soon.
Grace Muwanga Department of Anatomy Mbarara University Uganda (East Africa)
One limitation with Spark and Snap is that file sizes need to be small.
If you’re interested in creating AR experiences that work directly in a web browser and are up for writing some markup code, look at A-Frame AR https://aframe.io/blog/webxr-ar-module/.
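To give a sense of what that markup code looks like: below is a minimal A-Frame page sketched from A-Frame's public documentation, not taken from the linked post. The release number in the CDN URL and the placeholder box are assumptions; swap in whichever A-Frame release is current and replace the box with a real model (for example, a glTF anatomy model via `<a-gltf-model>`).

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- A-Frame core library from its CDN; the version here is an assumption -->
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
  </head>
  <body>
    <!-- The webxr system asks the browser for AR features where supported -->
    <a-scene webxr="optionalFeatures: hit-test">
      <!-- Placeholder object; a hypothetical anatomy model would go here instead -->
      <a-box position="0 1.2 -2" color="#4CC3D9"></a-box>
    </a-scene>
  </body>
</html>
```

On an AR-capable phone browser, entering the scene's AR mode overlays the content on the camera view; on a desktop the same page falls back to an ordinary 3D scene.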
For finding and hosting 3D models you can look at Sketchfab and Google Poly. I think both have many examples of anatomy.
I’ve been using Roar. They have a $99-a-year license.
++++++++++++
I have recently been experimenting with an AR development tool called Zappar, which I like because the end users do not have to download an app to view the AR content. Codes can be scanned either with the Zappar app or at web.zappar.com.
From a development standpoint, Zappar has an easy-to-use drag-and-drop interface called ZapWorks Designer that will help you build basic AR experiences quickly, but for a more complicated, more interactive use case such as learning anatomy, you will probably need ZapWorks Studio, which has much more of a learning curve. The Hobby (non-commercial) license is free if you are interested in trying it out.
You can check out an AR anatomy mini-lesson with models of the human brain, liver, and heart using ZapWorks here: https://www.zappar.com/campaigns/secrets-human-body/. Even if you choose to go with a different development tool, this example might help nail down ideas for your own project.
Hope this helps,
Brighten
Brighten Jelke Academic Assistant for Virtual Technology Lake Forest College bjelke@lakeforest.edu Office: DO 233 | Phone: 847-735-5168
Presentation 1: Inspiring Faculty (+ Students) with Tales of Immersive Tech (Practitioner Presentation #106)
Authors: Nicholas Smerker
Immersive technologies – 360º video, virtual and augmented realities – are being discussed in many corners of higher education. For an instructor who is familiar with the terms, at least in passing, learning more about why they and their students should care can be challenging, at best. In order to create a fount of inspiration, the IMEX Lab team within Teaching and Learning with Technology at Penn State devised its Get Inspired web resource. Building on a similar repository for making-technology stories at the sister Maker Commons website, the IMEX Lab Get Inspired landing page invites faculty to discover real-world examples of how cutting-edge XR tools are being used every day. In addition to very approachable video content and a short summary calling out why our team chose the story, there are also instructional-designer-developed Assignment Ideas that allow for quick deployment of exercises related to – though not always relying upon – the technologies highlighted in a given Get Inspired story.
Presentation 2: Lessons Learned from Over A Decade of Designing and Teaching Immersive VR in Higher Education Online Courses (Practitioner Presentation #101)
Authors: Eileen O'Connor
This presentation overviews the design and instruction in immersive virtual reality environments created by the author, beginning with Second Life and progressing to open-source venues. It will highlight the diversity of VR environments developed, the challenges that were overcome, and the accomplishments of students who created their own VR environments for K-12, college, and corporate settings. The instruction and design materials created to enable this 100% online master's program accomplishment will be shared; an institute launched in 2018 for emerging technology study will be noted.
Presentation 3: Virtual Reality Student Teaching Experience: A Live, Remote Option for Learning Teaching Skills During Campus Closure and Social Distancing (Practitioner Presentation #110)
Summary: During the Coronavirus pandemic, Ithaca College teacher education majors needed a classroom of students in order to practice teaching and receive feedback, but the campus was closed, and gatherings forbidden. Students were unable to participate in live practice teaching required for their program. We developed a virtual reality pilot project to allow students to experiment in two third-party social VR programs, AltSpaceVR and Rumii. Social VR platforms allow a live, embodied experience that mimics in-person events to give students a more realistic, robust and synchronous teaching practice opportunity. We documented the process and lessons learned to inform, develop and scale next generation efforts.
Sunday, June 21 • 8:00am – 9:00am
Escape the (Class)room games in OpenSim or Second Life (FULL)
https://ilrn2020.sched.com/event/ceKP/escape-the-classroom-games-in-opensim-or-second-life
Pre-registration for this tour is required as places are limited. Joining instructions will be emailed to registrants ahead of the scheduled tour time. The Guided Virtual Adventure tour will take you to EduNation in Second Life to experience an escape room game. For one hour, a group of participants engage in voice communication and try to solve puzzles, riddles, or conundrums and follow clues to eventually escape the space. These scenarios are designed for problem solving and negotiating language and are ideal for language education. They are fun and exciting, and the ticking clock adds to the game play.
Tour guide(s)/leader(s): Heike Philp, let's talk online sprl, Belgium
Presentation 1: Evaluating the impact of multimodal Collaborative Virtual Environments on user’s spatial knowledge and experience of gamified educational tasks (Full Paper #91)
Authors: Ioannis Doumanis and Daphne Economou
>>Access Video Presentation<<
Several research projects in spatial cognition have suggested Virtual Environments (VEs) as an effective way of facilitating mental map development of a physical space. In the study reported in this paper, we evaluated the effectiveness of multimodal real-time interaction in distilling understanding of the VE after completing gamified educational tasks. We also measured the impact of these design elements on the user's experience of educational tasks. The VE used resembles an art gallery and was built using REVERIE (Real and Virtual Engagement In Realistic Immersive Environment), a framework designed to enable multimodal communication on the Web. We compared the impact of REVERIE VG with an educational platform called Edu-Simulation for the same gamified educational tasks. We found that the multimodal VE had no impact on the ability of students to retain a mental model of the virtual space. However, we also found that students thought it was easier to build a mental map of the virtual space in REVERIE VG. This means that using a multimodal CVE in a gamified educational experience does not benefit spatial performance, but it also does not cause distraction. The paper ends with conclusions, future work, and suggestions for improving mental map construction and user experience in multimodal CVEs.
Presentation 2: A case study on student’s perception of the virtual game supported collaborative learning (Full Paper #42)
Authors: Xiuli Huang, Juhou He and Hongyan Wang
>>Access Video Presentation<<
The English education course in China aims to help students establish the English skills to enhance their international competitiveness. However, in traditional English classes, students often lack the linguistic environment to apply the English skills they learned in their textbook. Virtual reality (VR) technology can set up an immersive English language environment and encourage learners to use English by presenting different collaborative communication tasks. In this paper, spherical video-based virtual reality technology was applied to build a linguistic environment, and a collaborative learning strategy was adopted to promote communication. Additionally, a mixed-methods research approach was used to analyze students' achievement in a traditional classroom versus a virtual reality supported collaborative classroom, as well as their perceptions of the two approaches. The experimental results revealed that the virtual reality supported collaborative classroom was able to enhance the students' achievement. Moreover, by analyzing the interviews, students' attitudes towards the virtual reality supported collaborative class were reported, and the use of language learning strategies in that class was represented. These findings could be valuable references for those who intend to create opportunities for students to collaborate and communicate in the target language in their classroom and thereby improve their language skills.
Presentation 1: Reducing Cognitive Load through the Worked Example Effect within a Serious Game Environment (Full Paper #19)
Authors: Bernadette Spieler, Naomi Pfaff and Wolfgang Slany
>>Access Video Presentation<<
Novices often struggle to represent problems mentally; the unfamiliar process can exhaust their cognitive resources, creating frustration that deters them from learning. By improving novices' mental representation of problems, worked examples improve both problem-solving skills and transfer performance. Programming requires both skills. In programming, it is not sufficient to simply understand how Stack Overflow examples work; programmers have to be able to adapt the principles and apply them to their own programs. This paper shows evidence in support of the theory that worked examples are the most efficient mode of instruction for novices. In the present study, 42 students were asked to solve the tutorial The Magic Word, a game created especially for girls with the Catrobat programming environment. While the experimental group was presented with a series of worked examples of code, the control groups were instructed through theoretical text examples. The final task was a transfer question. While the average score was not significantly better in the worked example condition, the fact that participants in this experimental group finished significantly faster than the control group suggests that their overall performance was better than that of their counterparts.
Presentation 2: A literature review of e-government services with gamification elements (Full Paper #56)
Authors: Ruth S. Contreras-Espinosa and Alejandro Blanco-M
>>Access Video Presentation<<
Nowadays several democracies are facing the growing problem of a breach in communication between citizens and their political representatives, resulting in low citizen engagement in political decision-making and public consultations. It is therefore fundamental to generate a constructive relationship between public administration and citizens by addressing their needs. This document contains a useful literature review of the gamification topic and e-government services. It provides a background on those concepts and conducts a selection and analysis of the different applications found. Three research gaps are identified, with potential impact on future studies.
Presentation 1: Connecting User Experience to Learning in an Evaluation of an Immersive, Interactive, Multimodal Augmented Reality Virtual Diorama in a Natural History Museum & the Importance of Story (Full Paper #51)
Authors: Maria Harrington
>>Access Video Presentation<<
Reported are the findings of user experience and learning outcomes from a July 2019 study of an immersive, interactive, multimodal augmented reality (AR) application, used in the context of a museum. The AR Perpetual Garden App is unique in creating an immersive multisensory experience of data. It allowed scientifically naïve visitors to walk into a virtual diorama constructed as a data visualization of a springtime woodland understory, and to interact with multimodal information directly through their senses. The user interface comprised two different AR data visualization scenarios reinforced with data-based ambient bioacoustics, an audio story of the curator's narrative, and interactive access to plant facts. While actual learning and dwell times were the same between the AR app and the control condition, the AR experience received higher ratings on perceived learning. The AR interface design features of "Story" and "Plant Info" showed significant correlations with actual learning outcomes, while "Ease of Use" and "3D Plants" showed significant correlations with perceived learning. As such, designers and developers of AR apps can generalize these findings to inform future designs.
Presentation 2: The Naturalist’s Workshop: Virtual Reality Interaction with a Natural Science Educational Collection (Short Paper #11)
Authors: Colin Patrick Keenan, Cynthia Lincoln, Adam Rogers, Victoria Gerson, Jack Wingo, Mikhael Vasquez-Kool and Richard L. Blanton
>>Access Video Presentation<<
For experiential educators who utilize or maintain physical collections, The Naturalist's Workshop is an exemplar virtual reality platform for interacting with digitized collections in an intuitive and playful way. The Naturalist's Workshop is a purpose-developed application for the Oculus Quest standalone virtual reality headset, for use by museum visitors on the floor of the North Carolina Museum of Natural Sciences under the supervision of a volunteer attendant. Within the application, museum visitors are seated at a virtual desk. Using their hand controllers and head-mounted display, they explore drawers containing botanical specimens and the tools of the trade of a naturalist. While exploring, the participant can receive new information about any specimen by dropping it into a virtual examination tray. 360-degree photography and three-dimensionally scanned specimens are used to allow a user-motivated, immersive experience of botanical metadata such as specimen collection coordinates.
Presentation 3: 360˚ Videos: Entry level Immersive Media for Libraries and Education (Practitioner Presentation #132)
Authors: Diane Michaud
>>Access Video Presentation<<
Within the continuum of XR technologies, 360˚ videos are relatively easy to produce and need only an inexpensive mobile VR viewer to provide a sense of immersion. 360˚ videos present an opportunity to reveal "behind the scenes" spaces that are normally inaccessible to users of academic libraries. This can promote engagement with unique special collections and specific library services. In December 2019, with little previous experience, I led the production of a short 360˚ video tour, a walk-through of our institution's archives. This was a first attempt; there are plans to transform it into a more interactive, user-driven exploration. The beta version successfully generated interest, but the enhanced version will also help prepare uninitiated users for the process of examining unique archival documents and artefacts. This presentation will cover the lessons learned and what we would do differently for our next immersive video production. Additionally, I will propose that the medium of 360˚ video is ideal for many institutions' current or recent predicament of campuses shut down due to the COVID-19 pandemic. Online or immersive 360˚ video can be used for virtual tours of libraries and/or other campus spaces. Virtual tours would retain their value beyond current campus shutdowns, as there will always be prospective students and families who cannot easily make a trip to campus. These virtual tours would provide a welcome alternative, as they eliminate the financial burden of travel and can be taken at any time.
IM 690 lab plan for March 31, online: Virtual Worlds
If at any point you are lost in the virtual worlds, please consider talking/chatting using our IM 690 Zoom link: https://minnstate.zoom.us/j/964455431 or call 320 308 3072.
Readings: Currently, if you go to the SCSU online databases, if they are working at all, don't be surprised when clicking on EBSCOhost Business Source Complete to see an error message; and if you execute a search for "AltSpaceVR" + "education", you will find only a meager 1+ results. Google Scholar, naturally, will yield a much greater number.
So, search and find an article of your interest using Google Scholar. I used “immersive learning” + “education” for my search.
I chose to read this article: https://journal.alt.ac.uk/index.php/rlt/article/view/2347/2657
since it addressed design principles when applying mixed reality in education. Which article did you find, choose, and read, and are you ready to share your analysis of it?
Tuesday, March 31, 5PM lab
As usual, we will meet at this Zoom link: https://minnstate.zoom.us/j/964455431 All of us will be online and we will meet in the Zoom room. Please come 10 min earlier, so we can check our equipment and make sure everything works. Since we will be exploring online virtual worlds, please be prepared for technical issues, especially with microphones.
For this lab, please download and install on your computers the AltSpaceVR (ASVR) software: https://www.microsoft.com/en-us/p/altspacevr/9nvr7mn2fchq?activetab=pivot:overviewtab Please note the impediment that Microsoft has made the 2D mode for PC available only on Windows. If you are a Mac user and don't have a PC available at home, please contact me directly for help.
In addition, please use this link to the video tutorial: https://blog.stcloudstate.edu/ims/2020/03/13/im690-asvr-2d-tutorial/
Please be aware of the MediaSpace issues of the last two weeks, which can result in poor rendering of the video. If issues persist and you still need help downloading and installing the software, contact me directly. Please do your best to have ASVR installed on your computer before the lab starts on Tues, March 31, 5PM, so we can use our lab time for much more fun activities!
Intro to ASVR.
Please watch this 5-minute video anytime you feel a bit lost in ASVR. Please consider the issues with MediaSpace and be patient if the video renders slowly and/or does not play right away. The video is meant to help you learn how to navigate your avatar in ASVR.
For the first 15-20 min of the lab, we will "meet" in ASVR and figure out how to work on our ASVR avatars and how to use the computer keyboard to move, communicate, and gain basic dexterity. We must learn to "make friends" with Mark Gill (ASVR name: MarkGill47), Dr. Park (ASVR name: dhk3600), and Dr. Miltenoff (ASVR name: Plamen), as well as with your class peers, who will be sharing their ASVR contact info in the Zoom chat session. Once we learn these skills, we are ready to explore ASVR.
Mark Gill will “lead” us through several virtual worlds, which you will observe and assess from the point of view of an Instructional Designer and an educator (e.g. how these worlds can accommodate learning; what type of teaching do these virtual worlds offer, etc.)
Eventually, Mark Gill will bring us to the SCSU COSE space, created by him, where he will leave us to discuss.
Discussion in the COSE ASVR room
We will start our discussion with you sharing your analysis of the article you found in Google Scholar for today’s class (see above Readings). How do your findings from the article match your impressions from the tour across virtual worlds in ASVR? How does learning happen?
Final projects
The rest of the time in the lab will be allocated for work on your final projects.
Dr. Park and Dr. Miltenoff will work individually with your groups to assist with ideas and questions regarding your projects.
Technology is rapidly changing how we learn and grow. More and more, tools and platforms that make use of virtual reality (VR), augmented reality (AR), and extended reality (ER)—collectively known as immersive learning technology—are moving from the niche world of Silicon Valley into retail stores, warehouses, factory floors, and classrooms, as well as corporate education and training programs. The value is clear: these immersive learning tools help companies, training providers, and educators train workers better, faster, and more efficiently. Of course, the impact doesn't stop at the bottom line. Immersive learning presents an opportunity to reliably train employees for situations that are expensive to support, challenging to replicate, and even dangerous. And it can be done efficiently, safely, and with better learning outcomes.
1 in every 3 small and mid-size businesses in the U.S. is expected to be piloting a VR employee training program by 2021, seeing their new hires reach full productivity 50% faster as a result.
The worldwide AR and VR market size is forecast to grow nearly 7.7 times between 2018 and 2022.
14 million AR and VR devices are expected to be sold in 2019.
By 2023, enterprise VR hardware and software revenue is expected to jump 587% to $5.5 billion, up from an estimated $800 million in 2018.
Virtual Reality (VR): A computer-generated experience that simulates reality. VR may include visual, auditory, or tactile experiences.
Augmented Reality (AR): A live experience of a physical space, where computer-enhanced visualizations, sounds, or tactile experiences overlay the real-world environment.
Mixed Reality (MR): A blend of virtual experiences and the real world, where virtual and augmented experiences are presented simultaneously.
Extended Reality (ER): An immersive experience involving interactions with the real world, virtual reality, and augmented reality, as well as other machines or computers adding content to the experience.
Soft Skills: Immersive learning technologies can help people develop human skills, such as empathy, customer service, diversity and inclusion, and other areas.
Technical Skills: Immersive learning technologies enable workers to learn through simulated experiences, providing the opportunity for risk-free repetition of complex or dangerous technical tasks.
IM 690 lab plan for March 3, MC 205: Oculus Go and Quest
Readings:
TAM: Technology Acceptance Model
Read Venkatesh and Davis and sum up the importance of their model for instructional designers working with VR technologies and creating materials for users of VR technologies.
UTAUT: using the theory to learn well with VR and to design a good acceptance model for end users: https://blog.stcloudstate.edu/ims/2020/02/20/utaut/
Watch both parts of Victoria Bolotina's presentation at the Global VR conference. How is she applying UTAUT in her research?
Read Bracq et al. (2019); how do they apply UTAUT to their VR nursing training?
joining a space and collaborating and communicating with other users
Assignment: Group work
Find one F2F and one online peer to form a group.
Based on the questions/directions before you started watching the videos:
– How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
– Which models and ideas from the videos you will see seem possible for you to replicate?
Exchange thoughts with your peers and make a plan to create a similar educational product.
Post your writing in the following D2L Discussions thread.
Augmented Reality with HoloLens (watch videos at computer station)
The event requires no registration and is virtual only, free, and open to the public. Platform access is required, so please install one of the above platforms to attend the International Summit. You may attend in 2D on a desktop or laptop computer with a headphone and microphone (USB gaming headset recommended), or with a VR device such as the Oculus Go, Quest, or Rift, the Vive, and other mobile and tethered devices. Please note the specifications and requirements of each platform.
Charlie Fink, author, columnist for Forbes magazine, and Adjunct Faculty member of Chapman University, will be presenting "Setting the Table for the Next Decade in XR," discussing the future of this innovative and immersive technology, at the 2020 Educators in VR International Summit. He will be speaking in AltspaceVR on Tuesday, February 18 at 1:00 PM EST.
This workshop with Dr. Sarah Jones will focus on developing a relevant and new literacy for virtual reality, including the core competencies and skills needed to develop and understand how to become an engaged user of the technology in a meaningful way. The workshop will develop into research for a forthcoming book on Uncovering a Literacy for VR due to be published in 2020.
Sarah is listed as one of the top 15 global influencers within virtual reality. After nearly a decade in television news, Sarah began working in universities focusing on future media, future technology and future education. Sarah holds a PhD in Immersive Storytelling and has published extensively on virtual and augmented reality, whilst continuing to make and create immersive experiences. She has advised the UK Government on Immersive Technologies and delivers keynotes and speaks at conferences across the world on imagining future technology. Sarah is committed to diversifying the media and technology industries and regularly champions initiatives to support this agenda.
Currently there are limited ways to connect 3D VR environments to physical objects in the real world whilst simultaneously conducting communication and collaboration between remote users. Within the context of a solar power plant, the performance metrics of the site are invaluable for environmental engineers who are remotely located. Often two or more remotely located engineers need to communicate and collaborate on solving a problem. If a solar panel component is damaged, the repair often needs to be undertaken on-site, thereby incurring additional expenses. This communication is categorized as inter-cognitive or intra-cognitive: inter-cognitive communication, where information transfer occurs between two cognitive entities with different cognitive capabilities (e.g., between a human and an artificially cognitive system); and intra-cognitive communication, where information transfer occurs between two cognitive entities with equivalent cognitive capabilities (e.g., between two humans) [Baranyi and Csapo, 2010]. Currently, non-VR solutions offer a comprehensive analysis of solar plant data, and a regular PC with a monitor currently has advantages over 3D VR. For example, sensors can be monitored using dedicated software such as EPEVER or via a web browser, as exemplified by the comprehensive service provided by Elseta. But when multiple users are able to collaborate remotely within a three-dimensional virtual simulation, the opportunities for communication, training, and academic education will be profound.
Michael Vallance, Ed.D., is a researcher in the Department of Media Architecture, Future University Hakodate, Japan. He has been involved in educational technology design, implementation, research, and consultancy for over twenty years, working closely with higher education institutes, schools, and media companies in the UK, Singapore, Malaysia, and Japan. His 3D virtual world design and tele-robotics research has been recognized and funded by the UK Prime Minister's Initiative (PMI2) and the Japan Advanced Institute of Science and Technology (JAIST). He has been awarded by the United States Army for his research on collaborative programming of robots in a 3D virtual world.
Augmented reality lenses are popular among young people thanks to Snapchat's invention. Businesses are losing money by not fully using social media targeting young people (14-25). In her presentation, Dominique Wu will show how businesses can generate more leads through Spark AR (Facebook AR/Instagram AR) and Snapchat AR lenses, and how to create strategic Snapchat and Instagram AR campaigns.
Dominique Wu is an XR social media strategist and expert in UX/UI design. She has her own YouTube and Apple Podcasts show called "XReality: Digital Transformation," covering the technology and techniques of incorporating XR and AR into social media, marketing, and integration into enterprise solutions.
Mark Christian, EVP, Strategy and Corporate Development, GIGXR
Mixed reality devices like the HoloLens are transforming education now. Mark Christian will discuss how the technology is not about edge use cases or proofs of concept (POCs), but real, usable products at universities that are transforming the way we teach and learn. Christian will talk about the products of GIGXR, the story of how they were developed, and what the research is saying about their efficacy. It is time to move to adoption of XR technology in education. Learn how one team has made this a reality.
As CEO of forward-thinking virtual reality and software companies, Mark Christian employs asymmetric approaches to rapid, global market adoption, hiring, diversity and revenue. He prides himself on unconventional approaches to building technology companies.
Virtual reality is an effective medium to impart education to students only if it is done right. Whether VR is considered a gimmick or not depends on how the software applications are designed and developed by the developers, not on hardware limitations. I will be giving insight into VR development for educational content specifically designed for students of lower secondary school. I will also provide insights into game development in the Unity3D game engine.
A game developer and VR developer with over 3 years of experience in game development, and the developer of Zombie Shooter, winner of various national awards in the gaming and entertainment category, Avinash Gyawali is the developer of EDVR, an immersive voice-controlled VR experience specially designed for children aged 10-18.
Virtual Reality Technologies for Learning Designers
Margherita Berti
Virtual Reality (VR) is a computer-generated experience that simulates presence in real or imagined environments (Kerrebrock, Brengman, & Willems, 2017). VR promotes contextualized learning, authentic experiences, critical thinking, and problem-solving opportunities. Despite the great potential and popularity of this technology, the latest two installments of the Educause Horizon Report (2018, 2019) have argued that VR remains “elusive” in terms of mainstream adoption. The reasons are varied, including the expense and the lack of empirical evidence for its effectiveness in education. More importantly, examples of successful VR implementations for instructors who lack technical skills are still scarce. Margherita Berti will discuss a range of easy-to-use educational VR tools, examples of VR-based activities, and the learning theories and instructional design principles used in their development.
Margherita Berti is a doctoral candidate in Second Language Acquisition and Teaching (SLAT) and Educational Technology at the University of Arizona. Her research specialization resides at the intersection of virtual reality, the teaching of culture, and curriculum and content development for foreign language education.
Amanda Fox, Creative Director of STEAMPunks/MetaInk Publishing, MetaInk Publishing
There is a barrier between an author and the readers of his or her books: the author's journey ends where the reader's begins. But what if, as an author or trainer, you could use gamification and augmented reality (AR) to interact with and coach your readers as part of their learning journey? Attend this session with Amanda Fox to learn how the book Teachingland leverages augmented reality tools such as Metaverse to connect with readers beyond the text.
Amanda Fox is Creative Director of STEAMPunksEdu and author of Teachingland: A Teacher’s Survival Guide to the Classroom Apocalypse and Zom-Be A Design Thinker. Check her out on the Virtual Reality Podcast, or connect with her on Twitter @AmandaFoxSTEM.
Christian Jonathan Angel Rueda specializes in didactic uses of virtual reality and virtual worlds for learning the fundamentals of design. He will share the development of a course in which students recreate, in a three-dimensional environment, what they have learned in class; demonstrate the works developed throughout the semester, using their knowledge of design fundamentals to present them creatively; and build a final project scenario that connects the scenes in which students showed their work over the semester.
Christian Jonathan Angel Rueda is a research professor at the Autonomous University of Queretaro in Mexico. With a PhD in educational technology, Christian has published several papers on the intersection of education, pedagogy, and three-dimensional immersive digital environments. He is also an edtech, virtual reality, and social media consultant at Eco Onis.
How can we bridge the gap between eLearning and XR? Richard Van Tilborg discusses combining brain insights with new technologies, presenting training and education cases realised with the CoVince platform: journeys that start on your mobile and continue in VR. He will also cover the possibilities of earning from your creations and of having a central distribution place for learning and data.
Richard Van Tilborg works with the CoVince platform, a VR platform offering training and educational programs for central distribution of learning and data. He is an author and speaker focusing on computers and education in virtual reality-based tasks for delivering feedback.
During lab work on Jan 28, we experienced 360-degree videos with Google Cardboard.
let’s take 5-10 min and check out the following videos (select and watch at least three of them)
F2F students, please use Google Cardboard.
Online students, please view on your computer or mobile device if you don’t have goggles at home (you can purchase goggles for $5-7 from second-hand stores such as Goodwill).
Both F2F and online students: here are directions for easily opening the movies on your mobile device:
Copy the URL and email it to yourself.
Open the email on your phone and click on the link
If you have goggles, click on the goggles icon in the lower-right corner and insert the phone into the goggles
Open your D2L course on your phone (you can use the mobile app).
Go to the D2L Content Module with these directions and click on the link.
After the link opens, insert the phone into the goggles to watch the video
Videos: While watching the videos, consider the following objectives:
– Does this particular technology fit within the instructional design (ID) frameworks and theories covered, e.g., PBL, CBL, Activity Theory, the ADDIE Model, TIM, etc. (https://blog.stcloudstate.edu/ims/2020/01/29/im-690-id-theory-and-practice/)? Can you connect the current state, as well as the potential, of this technology with any of these frameworks and theories? For example, how would Google Tour Creator or any of these videos fit into the Analysis – Design – Development – Implementation – Evaluation process? Or how do you envision your Google Tour Creator project or any of these videos fitting into the Entry – Adoption – Adaptation – Infusion – Transformation process?
– How does this particular technology fit within the ID frameworks and theories covered so far?
– Which models and ideas from the videos seem possible for you to replicate?
Find one F2F and one online peer to form a group.
Based on the same questions/directions you considered before watching the videos:
exchange thoughts with your peers and make a plan to create a similar educational product
Evaluate whether the game you watched could be incorporated into the educational process
Assignment: In 10-15 min (mind your peers, since we have only one headset), do your best to evaluate one educational app (e.g., Labster) and one leisure app (a game).
Use the same three questions listed above to evaluate the Lenovo Daydream: how the technology fits within the ID frameworks and theories covered (ADDIE, TIM, etc.), how it fits with the theories covered so far, and which models and ideas you could replicate.