Archive of ‘AR’ category

Apple Glass

‘Apple Glass’ users may be able to manipulate AR images with any real object from r/gadgets

https://appleinsider.com/articles/20/07/30/apple-glass-users-may-be-able-to-manipulate-ar-images-with-any-real-object

With AR, and especially with what Apple refers to as Mixed Reality (MR), it’s great to be able to see a virtual iPad Pro in front of you, but you need to be able to use it. You have to be able to pick up a virtual object and use it; otherwise, AR is no better than a 3D movie.

Apple’s proposed solution is described in “Manipulation of Virtual Objects using a Tracked Physical Object,” a patent application filed in January 2020 but only revealed this week. It suggests truly mixing realities by mapping the virtual object onto an actual object in the real world.

+++++++++++++++
more on Apple Glass in this IMS blog
http://blog.stcloudstate.edu/ims?s=apple+glass

Mixed Reality remote learning platform

https://www.prweb.com/releases/gigxr_announces_new_immersive_learning_system_for_fall_2020_academic_year_with_remote_and_socially_distanced_learning/prweb17183361.htm

GIGXR, Inc., a provider of extended reality (XR) learning systems for instructor-led teaching and training, announced today the availability of its GIG Immersive Learning System for the Fall 2020 Northern Hemisphere academic year. The cloud-based System was created to dramatically enhance learning outcomes while simplifying complex, real-life teaching and training scenarios in medical and nursing schools, higher education, healthcare and hospitals.

++++++++++++++++
more on mixed reality in this IMS blog
http://blog.stcloudstate.edu/ims?s=mixed+reality

Immersive Journalism and Storytelling

My note: Consider these SCSU courses:

LIB 490/590 Digital Storytelling and Virtual Reality: https://web.stcloudstate.edu/pmiltenoff/lib490/

and

IM 690 Virtual and Augmented Reality for Instructional Designers

+++++++++++
more on immersive journalism in this IMS blog
http://blog.stcloudstate.edu/ims?s=immersive+journalism

and storytelling
http://blog.stcloudstate.edu/ims?s=storytelling

research in/about VR

https://account.altvr.com/events/1459609665267564719

Tuesday, June 16, 2020 from 1:00 PM to 2:00 PM (CDT)

This event will be an expert panel considering research in/about VR. The experts on the panel are: Sam Reno, Géraldine Perriguey, Anthony Chaston, PhD, and Evelien Ydo, all of whom have presented for the research track before (biographies below; see the EDVR YouTube channel for their previous presentations). The event will be highly interactive, and the audience is welcome to introduce topics and questions for the panel to discuss. At the end of the event there will be some time to network as well.

The Educators in VR Research Team features researchers from across the spectrum of VR/AR/XR research and development, coming together to share their knowledge, techniques, and research and learn from each other. Join us to discuss the possibilities and potential of research in VR. We host regular meetups and workshops for discussion and learning.

XR anatomy

The EDUCAUSE XR (Extended Reality) Community Group Listserv <XR@LISTSERV.EDUCAUSE.EDU>

Greetings to you all! Presently, I am undertaking a master’s course in “Instruction Design and Technology” which has two components: coursework and research. For my research, I would like to pursue the field of Augmented Reality (AR) and Mobile Learning. I am thinking of an idea that could lead to collaboration among students and directly translate into enhanced learning for students while using an AR application. However, I am having a problem with coming up with an application because I don’t have any computing background. This, in turn, is affecting my ability to come up with a good research topic.

I teach gross anatomy and histology to many students of health sciences at Mbarara University, and this is where I feel I could make a contribution to learning anatomy using AR, since almost all students own smartphones. I therefore kindly request you to let me know which of the freely available AR app authoring tools could help me in this regard. In addition, I request your suggestions regarding which research area(s) I should pursue in order to come up with a good research topic.

Hoping to hear from you soon.

Grace Muwanga, Department of Anatomy, Mbarara University, Uganda (East Africa)

++++++++++++

matthew.macvey@journalism.cuny.edu

Dear Grace, a few augmented reality tools which I’ve found are relatively easy to get started with:

For iOS, iPhone, iPad: https://www.torch.app/ or https://www.adobe.com/products/aero.html

To create AR that will work on social platforms like Facebook and Snapchat (and will work on Android, iOS) try https://sparkar.facebook.com/ar-studio/ or https://lensstudio.snapchat.com/. You’ll want to look at the tutorials for plane tracking or target tracking: https://sparkar.facebook.com/ar-studio/learn/documentation/tracking-people-and-places/effects-in-surroundings/

https://lensstudio.snapchat.com/guides/general/tracking/tracking-modes/

One limitation with Spark and Snap is that file sizes need to be small.

If you’re interested in creating AR experiences that work directly in a web browser and are up for writing some markup code, look at A-Frame AR https://aframe.io/blog/webxr-ar-module/.
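For a sense of scale, a browser-based AR scene in A-Frame can be a single HTML file. A minimal sketch is below — the version number and placeholder object are assumptions for illustration; check the A-Frame release notes for a build that includes the WebXR AR module:

```html
<!-- Minimal A-Frame scene. On AR-capable browsers, the WebXR AR module
     adds an "AR" button that anchors the scene in the real room. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Placeholder object; an anatomy model could instead be
           loaded from a glTF file with <a-gltf-model>. -->
      <a-box position="0 1 -2" color="tomato"></a-box>
    </a-scene>
  </body>
</html>
```

Hosting that file on any web server is enough to test it on a phone — no app install required.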

For finding and hosting 3D models you can look at Sketchfab and Google Poly. I think both have many examples of anatomy.

Best, Matt

+++++++++++

“Beth L. Ritter-Guth” <britter-guth@NORTHAMPTON.EDU>

I’ve been using Roar. They have a $99-a-year license.

++++++++++++

I have recently been experimenting with an AR development tool called Zappar, which I like because the end users do not have to download an app to view the AR content. Codes can be scanned either with the Zappar app or at web.zappar.com.

From a development standpoint, Zappar has an easy-to-use drag-and-drop interface called ZapWorks Designer that will help you build basic AR experiences quickly, but for a more complicated, more interactive use case such as learning anatomy, you will probably need ZapWorks Studio, which has much more of a learning curve. The Hobby (non-commercial) license is free if you are interested in trying it out.

You can check out an AR anatomy mini-lesson with models of the human brain, liver, and heart using ZapWorks here: https://www.zappar.com/campaigns/secrets-human-body/. Even if you choose to go with a different development tool, this example might help nail down ideas for your own project.

Hope this helps,

Brighten

Brighten Jelke, Academic Assistant for Virtual Technology, Lake Forest College, bjelke@lakeforest.edu, Office: DO 233 | Phone: 847-735-5168

http://www.lakeforest.edu/academics/resources/innovationspaces/virtualspace.php

+++++++++++++++++
more on XR in education in this IMS blog
http://blog.stcloudstate.edu/ims?s=xr+education

iLearn2020

YouTube Live stream: https://www.youtube.com/watch?v=DSXLJGhI2D8&feature=youtu.be
and the Discord directions: https://docs.google.com/document/d/1GgI4dfq-iD85yJiyoyPApB33tIkRJRns1cJ8OpHAYno/edit

Modest3D Guided Virtual Adventure – iLRN Conference 2020 – Session 1 (live session): https://youtu.be/GjxTPOFSGEM

https://mediaspace.minnstate.edu/media/Modest+3D/1_28ejh60g

CALL FOR PROPOSALS: GUIDED VIRTUAL ADVENTURE TOURS
at iLRN 2020: 6th International Conference of the Immersive Learning Research Network
Organized in conjunction with Educators in VR
Technically co-sponsored by the IEEE Education Society
June 21-25, 2020, Online
Conference theme: “Vision 20/20: Hindsight, Insight, and Foresight in XR and Immersive Learning”
Conference website: http://immersivelrn.org/ilrn2020
++++++++++++++++++++++++++++++
Wednesday, June 24 • 12:00pm – 1:00pm

 Instruction and Instructional Design

Presentation 1: Inspiring Faculty (+ Students) with Tales of Immersive Tech (Practitioner Presentation #106)

Authors: Nicholas Smerker

Immersive technologies – 360º video, virtual and augmented realities – are being discussed in many corners of higher education. For an instructor who is familiar with the terms, at least in passing, learning more about why they and their students should care can be challenging, at best. In order to create a font of inspiration, the IMEX Lab team within Teaching and Learning with Technology at Penn State devised its Get Inspired web resource. Building on a similar repository for making technology stories at the sister Maker Commons website, the IMEX Lab Get Inspired landing page invites faculty to discover real world examples of how cutting edge XR tools are being used every day. In addition to very approachable video content and a short summary calling out why our team chose the story, there are also instructional designer-developed Assignment Ideas that allow for quick deployment of exercises related to – though not always relying upon – the technologies highlighted in a given Get Inspired story.

Presentation 2: Lessons Learned from Over A Decade of Designing and Teaching Immersive VR in Higher Education Online Courses (Practitioner Presentation #101)

Authors: Eileen Oconnor

This presentation overviews the design and instruction in immersive virtual reality environments created by the author, beginning with Second Life and progressing to open-source venues. It will highlight the diversity of VR environments developed, the challenges that were overcome, and the accomplishments of students who created their own VR environments for K12, college, and corporate settings. The instruction and design materials created to enable this 100% online master’s program will be shared; an institute launched in 2018 for emerging technology study will be noted.

Presentation 3: Virtual Reality Student Teaching Experience: A Live, Remote Option for Learning Teaching Skills During Campus Closure and Social Distancing (Practitioner Presentation #110)

Authors: Becky Lane, Christine Havens-Hafer, Catherine Fiore, Brianna Mutsindashyaka and Lauren Suna

Summary: During the Coronavirus pandemic, Ithaca College teacher education majors needed a classroom of students in order to practice teaching and receive feedback, but the campus was closed, and gatherings forbidden. Students were unable to participate in live practice teaching required for their program. We developed a virtual reality pilot project to allow students to experiment in two third-party social VR programs, AltSpaceVR and Rumii. Social VR platforms allow a live, embodied experience that mimics in-person events to give students a more realistic, robust and synchronous teaching practice opportunity. We documented the process and lessons learned to inform, develop and scale next generation efforts.

++++++++++++++++++++++++++
Sunday, June 21 • 8:00am – 9:00am
Escape the (Class)room games in OpenSim or Second Life [FULL]
https://ilrn2020.sched.com/event/ceKP/escape-the-classroom-games-in-opensim-or-second-life
Pre-registration for this tour is required as places are limited. Joining instructions will be emailed to registrants ahead of the scheduled tour time.
The Guided Virtual Adventure tour will take you to EduNation in Second Life to experience an escape room game. For one hour, a group of participants engages in voice communication and tries to solve puzzles, riddles, or conundrums, following clues to eventually escape the space. These scenarios are designed for problem solving and negotiating language and are ideal for language education. They are fun and exciting, and the ticking clock adds to the game play.
Tour guide(s)/leader(s): Heike Philp, let’s talk online sprl, Belgium

Target audience sector: Informal and/or lifelong learning

Supported devices: Desktop/laptop – Windows, Desktop/laptop – Mac

Platform/environment access: Download from a website and install on a desktop/laptop computer
Official website: http://www.secondlife.com

+++++++++++++++++++

Thursday, June 25 • 9:00am – 10:00am

Games and Gamification II


Presentation 1: Evaluating the impact of multimodal Collaborative Virtual Environments on user’s spatial knowledge and experience of gamified educational tasks (Full Paper #91)

Authors: Ioannis Doumanis and Daphne Economou

>>Access Video Presentation<<

Several research projects in spatial cognition have suggested Virtual Environments (VEs) as an effective way of facilitating mental map development of a physical space. In the study reported in this paper, we evaluated the effectiveness of multimodal real-time interaction in distilling understanding of the VE after completing gamified educational tasks. We also measured the impact of these design elements on the user’s experience of educational tasks. The VE used resembles an art gallery and was built using REVERIE (Real and Virtual Engagement In Realistic Immersive Environment), a framework designed to enable multimodal communication on the Web. We compared the impact of REVERIE VG with an educational platform called Edu-Simulation for the same gamified educational tasks. We found that the multimodal VE had no impact on the ability of students to retain a mental model of the virtual space. However, we also found that students thought it was easier to build a mental map of the virtual space in REVERIE VG. This means that using a multimodal CVE in a gamified educational experience does not benefit spatial performance, but it also does not cause distraction. The paper ends with conclusions, future work, and suggestions for improving mental map construction and user experience in multimodal CVEs.

Presentation 2: A case study on student’s perception of the virtual game supported collaborative learning (Full Paper #42)

Authors: Xiuli Huang, Juhou He and Hongyan Wang

>>Access Video Presentation<<

The English education course in China aims to help students establish the English skills to enhance their international competitiveness. However, in traditional English classes, students often lack the linguistic environment to apply the English skills they learned from their textbook. Virtual reality (VR) technology can set up an immersive English-language environment and promote learners’ use of English by presenting different collaborative communication tasks. In this paper, spherical video-based virtual reality technology was applied to build a linguistic environment, and a collaborative learning strategy was adopted to promote communication. Additionally, a mixed-methods research approach was used to analyze students’ achievement in a traditional classroom versus a virtual reality-supported collaborative classroom, along with their perceptions of the two approaches. The experimental results revealed that the virtual reality-supported collaborative classroom was able to enhance students’ achievement. Moreover, analysis of the interviews reports students’ attitudes towards the virtual reality-supported collaborative class and their use of language learning strategies in it. These findings could be valuable references for those who intend to create opportunities for students to collaborate and communicate in the target language in their classroom and thereby improve their language skills.

+++++++++++++++++++
Thursday, June 25 • 11:00am – 12:00pm

 Games and Gamification III


Presentation 1: Reducing Cognitive Load through the Worked Example Effect within a Serious Game Environment (Full Paper #19)

Authors: Bernadette Spieler, Naomi Pfaff and Wolfgang Slany

>>Access Video Presentation<<

Novices often struggle to represent problems mentally; the unfamiliar process can exhaust their cognitive resources, creating frustration that deters them from learning. By improving novices’ mental representation of problems, worked examples improve both problem-solving skills and transfer performance. Programming requires both skills. In programming, it is not sufficient to simply understand how Stackoverflow examples work; programmers have to be able to adapt the principles and apply them to their own programs. This paper shows evidence in support of the theory that worked examples are the most efficient mode of instruction for novices. In the present study, 42 students were asked to solve the tutorial game The Magic Word, created especially for girls with the Catrobat programming environment. While the experimental group was presented with a series of worked examples of code, the control groups were instructed through theoretical text examples. The final task was a transfer question. While the average score was not significantly better in the worked example condition, the fact that participants in this experimental group finished significantly faster than the control group suggests that their overall performance was better than that of their counterparts.

Presentation 2: A literature review of e-government services with gamification elements (Full Paper #56)

Authors: Ruth S. Contreras-Espinosa and Alejandro Blanco-M

>>Access Video Presentation<<

Nowadays, several democracies are facing a growing communication breach between citizens and their political representatives, resulting in low citizen engagement in political decision making and public consultations. It is therefore fundamental to build a constructive relationship between public administration and citizens by addressing their needs. This document contains a useful literature review of gamification and e-government services. It provides background on those concepts and conducts a selection and analysis of the different applications found. Three lines of research gaps are identified, each with potential impact on future studies.

++++++++++++++++++
Thursday, June 25 • 12:00pm – 1:00pm

 Museums and Libraries


Presentation 1: Connecting User Experience to Learning in an Evaluation of an Immersive, Interactive, Multimodal Augmented Reality Virtual Diorama in a Natural History Museum & the Importance of Story (Full Paper #51)

Authors: Maria Harrington

>>Access Video Presentation<<

Reported are the findings of user experience and learning outcomes from a July 2019 study of an immersive, interactive, multimodal augmented reality (AR) application used in the context of a museum. The AR Perpetual Garden App is unique in creating an immersive multisensory experience of data. It allowed scientifically naïve visitors to walk into a virtual diorama constructed as a data visualization of a springtime woodland understory and interact with multimodal information directly through their senses. The user interface comprises two different AR data visualization scenarios, reinforced with data-based ambient bioacoustics, an audio story of the curator’s narrative, and interactive access to plant facts. While actual learning and dwell times were the same between the AR app and the control condition, the AR experience received higher ratings on perceived learning. The AR interface design features of “Story” and “Plant Info” showed significant correlations with actual learning outcomes, while “Ease of Use” and “3D Plants” showed significant correlations with perceived learning. As such, designers and developers of AR apps can generalize these findings to inform future designs.

Presentation 2: The Naturalist’s Workshop: Virtual Reality Interaction with a Natural Science Educational Collection (Short Paper #11)

Authors: Colin Patrick Keenan, Cynthia Lincoln, Adam Rogers, Victoria Gerson, Jack Wingo, Mikhael Vasquez-Kool and Richard L. Blanton

>>Access Video Presentation<<

For experiential educators who utilize or maintain physical collections, The Naturalist’s Workshop is an exemplar virtual reality platform for interacting with digitized collections in an intuitive and playful way. The Naturalist’s Workshop is a purpose-developed application for the Oculus Quest standalone virtual reality headset, for use by museum visitors on the floor of the North Carolina Museum of Natural Sciences under the supervision of a volunteer attendant. Within the application, museum visitors are seated at a virtual desk. Using their hand controllers and head-mounted display, they explore drawers containing botanical specimens and the tools of the trade of a naturalist. While exploring, participants can receive new information about any specimen by dropping it into a virtual examination tray. 360-degree photography and three-dimensionally scanned specimens allow a user-motivated, immersive experience of botanical metadata such as specimen collection coordinates.

Presentation 3: 360˚ Videos: Entry level Immersive Media for Libraries and Education (Practitioner Presentation #132)

Authors: Diane Michaud

>>Access Video Presentation<<

Within the continuum of XR technologies, 360˚ videos are relatively easy to produce and need only an inexpensive mobile VR viewer to provide a sense of immersion. 360˚ videos present an opportunity to reveal “behind the scenes” spaces that are normally inaccessible to users of academic libraries. This can promote engagement with unique special collections and specific library services. In December 2019, with little previous experience, I led the production of a short 360˚ video tour, a walk-through of our institution’s archives. This was a first attempt; there are plans to transform it into a more interactive, user-driven exploration. The beta version successfully generated interest, but the enhanced version will also help prepare uninitiated users for the process of examining unique archival documents and artefacts. This presentation will cover the lessons learned and what we would do differently for our next immersive video production. Additionally, I will propose that the medium of 360˚ video is ideal for many institutions’ current or recent predicament of campuses shut down due to the COVID-19 pandemic. Online, immersive 360˚ video can be used for virtual tours of libraries and/or other campus spaces. Virtual tours would retain their value beyond current campus shutdowns, as there will always be prospective students and families who cannot easily make a trip to campus. These virtual tours would provide a welcome alternative, as they eliminate the financial burden of travel and can be taken at any time.

++++++++++++++++++

Asynchronous Virtual Field Experiences with 360 Video

Zolfaghari, M., Austin, C. K., Kosko, K. W., & Ferdig, R. E. (2020). Creating Asynchronous Virtual Field Experiences with 360 Video. Journal of Technology and Teacher Education, 28(2), 315–320.
https://www.learntechlib.org/p/216115/
The global COVID-19 pandemic has disrupted normal face-to-face classes across institutions. This has significantly impacted methods courses where preservice teachers (PSTs) practice pedagogy in the field (e.g., in the PreK-12 classroom). In this paper, we describe efforts to adapt an assignment originally situated in a face-to-face school placement into a virtual version. By utilizing multi-perspective 360 video, preliminary results suggest virtual field experiences can provide PSTs with similar experiences for observation-based assignments. Acknowledging that immersive virtual experiences are not a complete replacement for face-to-face field-based experiences, we suggest virtual field assignments can be a useful supplement or a viable alternative during a time of pandemic.

+++++++++++++++++
more on Video 360 in this IMS blog
http://blog.stcloudstate.edu/ims?s=360
and specifically for education:
http://blog.stcloudstate.edu/ims?s=video+360+education
