Searching for "immersive reality"

Cross Reality (XR)

Ziker, C., Truman, B., & Dodds, H. (2021). Cross Reality (XR): Challenges and Opportunities Across the Spectrum. Innovative Learning Environments in STEM Higher Education, 55–77. https://doi.org/10.1007/978-3-030-58948-6_4
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/

For the purpose of this chapter, Cross Reality or XR refers to technologies and applications that involve combinations of mixed reality (MR), augmented reality (AR), virtual reality (VR), and virtual worlds (VWs). These are technologies that connect computer technology (such as informational overlays) to the physical world for the purpose of augmenting or extending experiences beyond the real. Especially relevant to the definition of XR is the fact that the term encompasses a wide range of options for delivering learning experiences, from minimal technology and episodic experiences to deep immersion and persistent platforms. The preponderance of different terms for slightly different technologies indicates that this is a growth area within the field. Here we provide a few definitions of these technologies.

MR—Mixed reality refers to a blend of technologies used to influence the human perception of an experience. Motion sensors, body tracking, and eye tracking interplay with overlaid technology to give a rich and full version of reality displayed to the user. For example, technology could add sound or additional graphics to an experience in real time. Examples include the Magic Leap One and Microsoft HoloLens 2.0. MR and XR are often used interchangeably.

AR—Augmented reality refers to technology systems that overlay information onto the real world, but the technology might not allow for real-time feedback. As such, AR experiences can move or animate, but they might not interact with changes in depth of view or external light conditions. Currently, AR is considered the first generation of the newer and more interactive MR experiences.

VR—Virtual reality, as a technological product, traces its history to approximately 1960 and tends to encompass user experiences that are visually and auditorily different from the real world. Indeed, the real world is often blocked from interacting with the virtual one. Headsets, headphones, haptics, and haptic clothing might purposely cut off all input except that which is virtual. In general, VR is a widely recognizable term, often found in gaming and workplace training, where learners need to be transported to a different time and place. VR experiences in STEM often consist of virtual labs or short virtual field trips.

VW—Virtual worlds are frequently considered a subset of VR, with the difference that VWs are inherently social and collaborative; VWs frequently contain multiple simultaneous users, while VR is often a solo experience. Another distinction between virtual reality and virtual worlds is the persistence of the virtual space. VR tends to be episodic: the learner is in the virtual experience for a few minutes, and the reality created within the experience ends when the learner experience ends. VWs are persistent in that the worlds continue to exist on computer servers whether or not there are active avatars within the virtual space (Bell). This distinction between VR and VW, however, is dissolving. VR experiences can be created to exist for days, and some users have been known to wear headsets for extended periods of time. Additionally, more and more VR experiences are being designed for game play, socialization, or mental relaxation. The IEEE VR 2020 online conference and the Educators in VR International Summit 2020 offered participants opportunities to experience conference presentations in virtual rooms as avatars while interacting with presenters and conference attendees (see Sect. 2.5 for more information).

CVEs—Collaborative virtual environments are communication systems in which multiple interactants share the same three-dimensional digital space despite occupying remote physical locations (Yee and Bailenson).

Embodiment—Embodiment is defined by Lindgren and Johnson-Glenberg as the enactment of knowledge and concepts through the activity of our bodies within an MR (mixed reality) and physical environment.

https://hyp.is/mBiunvx3EeudElMRwHm5dQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

Human-Centered Design is a philosophy that involves putting human needs, capabilities, and behavior first (Jerald 2018: 15). XR provides the opportunity to experience just-in-time immersive, experiential learning that uses concrete yet exploratory experiences involving the senses, resulting in lasting memories. Here we discuss opportunities for social applications with XR.

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

XR learner activities are usually created for individual use and may or may not need to be experienced simultaneously as a class, at the same time or place as the instructor. Activities can be designed into instruction with VR headsets, high-resolution screens, smartphones, or other solo technological devices for use inside and outside of the classroom.

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

A ready-to-go relationship between STEM courses and XR, in bullet points!

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

Do we address the challenges in the grant proposal? 

Some learners will be held back from full XR activity by visual, physical, and social conditions such as stroke, vertigo, epilepsy, or age-related reaction time. It should also be noted that the encompassing nature of VR headsets might create discomfort or danger for some learners, as they can no longer fully see and control their body and body space.

Virtual reality: The gateway to the future

https://www.itp.net/emergent-tech/97949-virtual-reality-the-gateway-to-the-future

Apart from offering a virtually interactive environment, VR also offers a myriad of variations in how and to what extent an environment is explorable.

VR technology can offer fully immersive, non-immersive, and web-based VR experiences. A great example of a non-immersive VR experience is a flight simulator, which allows the user to experience an alternate reality with just a joystick controller and a PC. Non-immersive technologies are commonly used in architecture, industrial design, and archeology through 3D designs, allowing users to create a replica of the real-life environment.

++++++++++++++
more on VR in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+reality

Augmented Reality Apple iPhone 12 and Delta Airlines

Apple’s big iPhone 12 deal with Delta shows a path to AR

The augmented enterprise is the focal point for Apple and AR.

https://www.computerworld.com/article/3616982/apples-big-iphone-12-deal-with-delta-shows-a-path-to-ar.html

Delta Airlines this week dropped the biggest hint of this when it announced deployment of 19,000 iPhone 12s for its in-flight staff.

Delta plans to use iPhones to offer immersive training experiences to staffers, featuring video, photos, and AR. Specifically, it plans to use AR to help cabin staff quickly locate where items are stowed – useful for hand luggage but even more useful when attempting to locate a flight’s worth of desserts for the meal.

+++++++++++++++++++
more on AR in this IMS blog
https://blog.stcloudstate.edu/ims?s=Augmented+reality

immersive and goggles

The tech industry is looking to replace the smartphone — and everybody is waiting to see what Apple comes up with

https://www.cnbc.com/2021/02/20/apple-facebook-microsoft-battle-to-replace-smartphone-with-ar.html

Apple’s working on solving this problem, too, according to a report in Nikkei Asia. The newspaper says that Apple is working with TSMC, its primary processor manufacturer, to develop a new kind of augmented reality display that’s printed directly on wafers, or the base layer for chips.

If Apple does eventually reveal a big leap forward in AR display technology — especially if the technology is developed and owned by Apple instead of a supplier — Apple could find itself with multi-year head-start in augmented reality as it did when the iPhone vaulted it to the head of the smartphone industry.

Apple is also adding hardware to its iPhones that hint at a headset-based future. High-end iPhones released in 2020 include advanced Lidar sensors embedded in their camera.

Microsoft has invested heavily in these kinds of technologies, purchasing AltspaceVR, a social network for virtual reality, in 2018. Before it launched HoloLens, it paid $150 million for intellectual property from a smartglasses pioneer.

Facebook CEO Mark Zuckerberg speaks the most in public about his hopes for augmented reality. Last year, he said, “While I expect phones to still be our primary devices through most of this decade, at some point in the 2020s, we will get breakthrough augmented reality glasses that will redefine our relationship with technology.”

+++++++++
more on immersive in this IMS blog
https://blog.stcloudstate.edu/ims?s=immersive

campus wide infrastructure for immersive

Cabada, E., Kurt, E., & Ward, D. (2021). Constructing a campus-wide infrastructure for virtual reality. College & Undergraduate Libraries, 0(0), 1–24. https://doi.org/10.1080/10691316.2021.1881680

As an interdisciplinary hub, academic libraries are uniquely positioned to serve the full lifecycle of immersive environment needs, from development through archiving of successful projects. As an informal learning environment that is discipline neutral and high traffic, the academic library can serve as a clearinghouse for experimentation and transmission of best practices across colleges.

these foundational questions:
1. What VR infrastructure needs do faculty and researchers have?
2. Where is campus support lagging?
3. What current partnerships exist?
4. What and where is the campus level of interest in VR?
As marketing for workshops and programs can be challenging, particularly for large institutions, data was collected on where workshop participants learned about Step Into VR. The responses show that users learned of the workshops in a variety of ways, with email (41%) the most cited method (Figure 4). These marketing emails were sent through distributed listservs that reached nearly the entire campus population. Facebook was called out specifically and represented the second largest marketing method at 29%, with the library website, friends, instructors, and digital signage representing the remaining marketing channels.
While new needs continue to emerge, the typical categories of consultation support observed include:
• Recommendations on hardware selection, such as choosing the best VR headset for viewing class content
• Guidance on developing VR applications that incorporate domain-specific curricular content
• Support for curricular integration of VR
• Recommendations on 360 capture media and equipment for documenting environments or experiences, such as the GoPro Fusion and Insta360 One X
• Advice on editing workflows, including software for processing and rendering of 360 content
p. 9
While many library patrons understand the basic concepts of recording video on a camera, 360 cameras present a large divergence from this process in several primary ways. The first is that a 360 camera captures every direction at once, so there is no inherent "focus" and no side of a scene that is not recorded. This significantly changes how someone might compose a video recording, and also adds complexity to post-production, including how to orient viewers within a scene. The second area of divergence is that many of these devices, especially the high-end versions, record each lens to a separate data file or memory card, and these files need to be combined, or "stitched," at a later time using software specific to the camera. A final concern is that data files for high-resolution 3D capture can be huge, requiring both large amounts of disk space and high-end processors and graphics cards for detailed editing to occur. For example, the Insta360 Pro 2 has 6 sensors, all capable of recording data at 120 Mbps, for a grand total of 720 Mbps. This translates into 43.2 gigabytes of data for every minute of recording.
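The per-minute figure above is worth sanity-checking, since camera bit rates are quoted in megabits per second while storage is usually counted in bytes. A quick back-of-envelope check, assuming only the quoted 6 × 120 Mbps figures, shows that 720 Mbps over a minute comes to 43.2 gigabits, which is roughly 5.4 gigabytes:

```python
# Back-of-envelope storage estimate for multi-sensor 360 capture.
# Figures are taken from the quoted passage (Insta360 Pro 2:
# 6 sensors at 120 Mbps each). Note that Mbps is megaBITS per
# second, so the per-minute total lands in gigabits, not gigabytes.

SENSORS = 6
RATE_MBPS = 120  # megabits per second, per sensor

total_mbps = SENSORS * RATE_MBPS                 # 720 Mbps aggregate
gigabits_per_minute = total_mbps * 60 / 1000     # 43.2 Gb per minute
gigabytes_per_minute = gigabits_per_minute / 8   # ~5.4 GB per minute

print(total_mbps, gigabits_per_minute, round(gigabytes_per_minute, 1))
```

Either way, even a short multi-lens shoot quickly accumulates tens of gigabytes, which is the practical point being made about disk space and editing hardware.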

Red Cross and Immersive Learning

Virtual Reality & Innovation

https://www.icrc.org/en/what-we-do/virtual-reality

mounting research suggests that gaming in immersive virtual environments can directly affect regions of the brain responsible for memory, spatial orientation, information organization, and fine motor skills.

the ICRC officially established its Virtual Reality Unit (VRU) to delve further into computer-generated environments as a way to educate, communicate and advocate respect for IHL.

By 2017, the VRU had amassed a library of virtual environments for FAS’ IHL training sessions but there was a desire within the VRU, as well as in FAS and ICRC’s Learning & Development, to develop more advanced VR opportunities for a wider audience.

A 2018 report researched global financial investment in XR, and a 2019 meta-analysis consolidated global academic findings that used VR to measure behaviour.

December 2019 … the production of an XR Quick Start Guide in April 2020 which introduces ICRC staff to lessons learned and best practices for initiative development.

++++++++++
more on gaming in this IMS blog
https://blog.stcloudstate.edu/ims?s=gaming
and immersive learning
https://blog.stcloudstate.edu/ims?s=immersive+learning
