Cabada, E., Kurt, E., & Ward, D. (2021). Constructing a campus-wide infrastructure for virtual reality. College & Undergraduate Libraries, 0(0), 1–24. https://doi.org/10.1080/10691316.2021.1881680
As an interdisciplinary hub, academic libraries are uniquely positioned to serve the full lifecycle of immersive environment needs, from development through archiving of successful projects. As an informal learning environment that is discipline neutral and high traffic, the academic library can serve as a clearinghouse for experimentation and transmission of best practices across colleges.
Their planning was guided by these foundational questions:
1. What VR infrastructure needs do faculty and researchers have?
2. Where is campus support lagging?
3. What current partnerships exist?
4. What and where is the campus level of interest in VR?
As marketing for workshops and programs can be challenging, particularly for large institutions, data was collected on where workshop participants learned about Step Into VR. The responses show that users learned of the workshops in a variety of ways, with email (41%) as the most cited method (Figure 4). These marketing emails were sent through distributed listservs that reached nearly the entire campus population. Facebook was called out specifically and represented the second-largest marketing method at 29%, with the library website, friends, instructors, and digital signage representing the remaining marketing channels.
While new needs continue to emerge, the typical categories of consultation support observed include:
• Recommendations on hardware selection, such as choosing the best VR headset for viewing class content
• Guidance on developing VR applications that incorporate domain-specific curricular content
• Support for curricular integration of VR
• Recommendations on 360 capture media and equipment for documenting environments or experiences, such as the GoPro Fusion and Insta360 One X
• Advice on editing workflows, including software for processing and rendering of 360 content
Alex Fogarty, p. 9
While many library patrons understand the basic concepts of recording video on a camera, 360 cameras diverge from this process in several primary ways. The first is that a 360 camera captures every direction at once, so there is no inherent “focus” and no side of a scene that is not recorded. This significantly changes how someone might compose a video recording, and it also adds complexity to post-production, including how to orient viewers within a scene. The second area of divergence is that many of these devices, especially the high-end versions, record each lens to a separate data file or memory card, and these files must be combined, or “stitched,” at a later time using software specific to the camera. A final concern is that data files for high-resolution 3D capture can be huge, requiring both large amounts of disk space and high-end processors and graphics cards for detailed editing to occur. For example, the Insta360 Pro 2 has 6 sensors, each capable of recording at 120 Mbps, for a grand total of 720 Mbps. This translates into 43.2 gigabits (about 5.4 GB) of data for every minute of recording.
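The storage arithmetic above can be sketched in a few lines (sensor count and per-sensor bitrate are the Insta360 Pro 2 figures quoted in the text; decimal units and raw, uncompressed-to-disk capture are assumed):

```python
# Estimate raw storage needs for multi-sensor 360 capture.
SENSORS = 6             # Insta360 Pro 2 lens/sensor count
MBPS_PER_SENSOR = 120   # per-sensor recording bitrate, megabits/s

total_mbps = SENSORS * MBPS_PER_SENSOR             # 720 Mbps aggregate
megabytes_per_sec = total_mbps / 8                 # 90 MB/s
gigabytes_per_min = megabytes_per_sec * 60 / 1000  # 5.4 GB per minute

print(f"{total_mbps} Mbps -> {gigabytes_per_min:.1f} GB per minute of footage")
```

At roughly 5.4 GB per minute, a single half-hour shoot approaches 160 GB before any editing copies are made, which is why the text flags disk space as a primary concern.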
Taqtile had a compelling vision for using the Hololens for digital transformation for industrial frontline workers. The goal was to democratize expertise and make “everyone an expert.”
Taqtile’s content platform is called Manifest. It’s an enterprise platform for knowledge capture and reuse for industrial workers—a tool for structuring the “checklist” items for a particular task. It’s unlike anything we saw in the KM era. Manifest procedures contain instructions, photos, videos, pointers, and the like. If that’s not enough, it can also contact experts in real time—as with the BP Virtual Teamwork system.
Apple’s known interest in this field has so far focused more on augmented reality (AR) than virtual reality (VR), but recent reports point to a mixed-reality device, which would be mostly VR while including some real-world elements.
Lischer-Katz, Z., & Clark, J. (2021). Institutional Factors Shaping XR Technology Accessibility Policy & Practice in Academic Libraries. Survey. The EDUCAUSE XR (Extended Reality) Community Group Listserv <XR@LISTSERV.EDUCAUSE.EDU>. https://uarizona.co1.qualtrics.com/jfe/form/SV_1Ya9id4uCXoktLv
An invitation to participate in a survey is being sent out to those responsible for managing and providing XR technologies in academic libraries. This survey is part of a study titled “Institutional Factors Shaping XR Technology Accessibility Policy & Practice in Academic Libraries.” The principal investigator (PI) is Dr. Zack Lischer-Katz (Assistant Professor, School of Information, University of Arizona) and the co-principal investigator (Co-PI) is Jasmine Clark (Digital Scholarship Librarian, Temple University).
An Institutional Review Board (IRB) responsible for human subjects research at The University of Arizona reviewed this research project and found it to be acceptable, according to applicable state and federal regulations and University policies designed to protect the rights and welfare of participants in research
Please feel free to share this survey widely with colleagues.
Introduction
Over the past five years, many academic libraries have begun systematically integrating innovative technologies, including virtual reality (VR) and other “XR” technologies, into their spaces and services. Even though schools, libraries, and the library profession all stress equitable access to information and technology for all community members, accessibility – understood in terms of the design of spaces, services, and technologies to support users with disabilities – is rarely given sufficient consideration when it comes to the design, implementation, and administration of XR technology programs. Because XR technologies engage the body and multiple senses, they show great potential for providing enhanced means for disabled users to access information resources; however, without accessibility policies in place, the embodied aspects of XR technologies can create new barriers (e.g., chairs and other furniture that cannot be adapted, controllers that cannot be adjusted for different degrees of dexterity, etc.).
Purpose of the study
The purpose of this study is to develop new understanding about the current landscape of accessibility policies and practices for XR technology programs and to understand the barriers to adoption of XR accessibility policies and practices.
The main research objective is to understand what policies and practices are currently in place in academic libraries and their level of development, the existing beliefs and knowledge of library staff and administrators involved with XR technology programs and spaces, and the institutional factors that shape the adoption of accessibility policies for XR technology programs.
The survey will be open from February 1, 2021 to April 30, 2021. More information regarding confidentiality and consent can be found at the beginning of the survey.
Apple (AAPL) is expected to launch its first virtual reality (VR) headset in 2022, which will be a forerunner of its much-anticipated augmented reality (AR) glasses.
Along with VR features like a completely simulated 3D digital environment, the device might include limited AR functionalities.
Apple’s entry will intensify competition in the VR device market, which includes devices such as Facebook’s (FB) Oculus Quest 2, Sony’s (SNE) PlayStation VR, Microsoft’s (MSFT) Windows Mixed Reality, and HTC’s Vive and Vive Pro.
Global spending on AR and VR is expected to reach $72.8 billion in 2024, up from $12 billion in 2020, reflecting a CAGR of 54%.
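The compound-growth arithmetic can be checked with the standard CAGR formula. A minimal sketch, assuming annual compounding over the 2020–2024 window (the quoted 54% presumably reflects the forecaster's own base-year convention, so the exact figure differs):

```python
# Compound annual growth rate implied by the spending forecast.
start_spend = 12.0   # global AR/VR spending in $B, 2020
end_spend = 72.8     # forecast spending in $B, 2024
periods = 2024 - 2020  # four annual compounding periods

cagr = (end_spend / start_spend) ** (1 / periods) - 1
print(f"Implied CAGR over {periods} periods: {cagr:.1%}")
```

The point of the sketch is simply that a sixfold increase over a handful of years implies growth above 50% per year, consistent with the order of magnitude the article quotes.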
Verizon and Unity partner to enable new digital experiences ranging from entertainment applications to enterprise toolkits using 5G, mobile edge compute (MEC) and real-time 3D technology.
5G Ultra Wideband and MEC will be a game changer for real-time 3D entertainment content by offering faster speeds, higher bandwidth, and ultra-low latency for industries like gaming, retail, sports and more.
The companies will also explore how 5G and MEC can enhance real-time 3D enterprise experiences, transforming the way businesses design, build and operate in a real-time economy.
Eye tracking technology – Projects information at the driver’s level of sight based on the driver’s eye position, eliminating potential misalignment of the projected image when the driver moves their head
Advanced optics – Advanced optical design techniques provide an expanded field of view (beyond 10 by 4 degrees) at a virtual image distance of 10 m or greater; detects pedestrians and objects through enhanced low-light and nighttime view; tilted virtual image planes adjust visibility of objects in the driver’s field of view; an embedded camera system allows discreet monitoring of the driver’s eye location.
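To make the field-of-view figure concrete, a short sketch of the apparent size of the virtual image (assuming a flat image centered at the stated 10 m distance and the baseline 10-by-4-degree FOV; the function name is illustrative, not from the source):

```python
import math

def virtual_image_size(fov_deg_w: float, fov_deg_h: float, distance_m: float):
    """Width/height in meters of a flat image at distance_m subtending the given FOV."""
    w = 2 * distance_m * math.tan(math.radians(fov_deg_w / 2))
    h = 2 * distance_m * math.tan(math.radians(fov_deg_h / 2))
    return w, h

w, h = virtual_image_size(10, 4, 10)  # baseline 10x4-degree FOV at 10 m
print(f"Virtual image ~{w:.2f} m x {h:.2f} m")  # about 1.75 m x 0.70 m
```

In other words, even the baseline FOV corresponds to an apparent image nearly two meters wide at the 10 m image distance, which is why expanding beyond it matters for overlaying graphics across multiple traffic lanes.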
AI navigation accuracy – AI-driven AR navigation technology detects the vehicle’s surroundings and renders multi-color 3D navigation graphics that adjust with them, displaying information like lane markers and GPS arrows where turns will occur, as well as sudden changes such as collisions or cyclists in one’s path
Vibration control – Panasonic’s proprietary camera image stability algorithm enables AR icons to lock onto the driving environment regardless of the bumpiness of the road
Real-time situational awareness – Driving environment updates occur in real-time; ADAS, AI, AR environment information updates in less than 300 milliseconds
3D imaging radar – Sensor-captured full 180-degree forward vision up to 90 meters and across approximately three traffic lanes
Compact size – Efficient compact packaging to fit any vehicle configuration
4K resolution – Crisp, bright 4K resolution using advanced laser and holography technology, with static near-field cluster information and far-field image plane for AR graphic overlay