Searching for "augmented reality"

Apple Maps in 3D

Apple Maps introduces new ways to explore major cities in 3D

https://www.apple.com/newsroom/2021/09/apple-maps-introduces-new-ways-to-explore-major-cities-in-3d/

With iOS 15, Apple Maps introduces step-by-step walking guidance in augmented reality. Users can simply raise their iPhone to scan buildings in the area, and Maps generates a highly accurate position to deliver detailed directions that can be viewed in the context of the real world.

iPhone shows step-by-step walking guidance in augmented reality in Apple Maps in iOS 15.

VR collaboration for veterans

https://blogs.va.gov/VAntage/94636/virtual-reality-collaboration-transforming-veteran-health-care/

Virtual reality can be a valuable therapeutic tool for Veterans. Through a previous three-year evaluation of Veterans using Waya Health’s VR tools in inpatient and long-term care settings at the Western North Carolina VA Health Care System (WNCVAHCS) in Asheville, N.C., 84 percent of Veterans reported a reduction in discomfort, 89 percent reported a reduction in stress, 96 percent reported enjoying their experience, and 97 percent said they would recommend it to their peers.

Nreal international expansion

Chinese augmented reality glasses maker Nreal valued at $700 million after fresh funding

https://www.cnbc.com/2021/09/23/chinese-ar-glasses-firm-nreal-raises-100-million-in-new-funding.html

Apple CEO Tim Cook has called AR the “next big thing” and the iPhone maker is reportedly working on a headset. Facebook, Microsoft, Google, and other technology companies are all investing in augmented reality too.

Xu said he welcomes the competition from these titans. “I think the best product will win,” he said.

Nreal has its own operating system called Nebula that runs on its headsets. As with Apple’s iOS on iPhones, developers can make apps for Nebula, which people can then use via Nreal headsets.

Having compelling apps on AR headsets will be key to their success and Nreal is trying to lure developers onto Nebula. The company currently has 8,000 developers on the platform.

++++++++++++++++++++
more on AR in this iMS blog
https://blog.stcloudstate.edu/ims?s=Augmented+reality

Cross Reality (XR)

Ziker, C., Truman, B., & Dodds, H. (2021). Cross Reality (XR): Challenges and Opportunities Across the Spectrum. Innovative Learning Environments in STEM Higher Education, 55–77. https://doi.org/10.1007/978-3-030-58948-6_4
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/

For the purpose of this chapter, Cross Reality or XR refers to technologies and applications that involve combinations of mixed reality (MR), augmented reality (AR), virtual reality (VR), and virtual worlds (VWs). These are technologies that connect computer technology (such as informational overlays) to the physical world for the purposes of augmenting or extending experiences beyond the real. Especially relevant to the definition of XR is the fact that this term encompasses a wide range of options for delivering learning experiences, from minimal technology and episodic experiences to deep immersion and persistent platforms. The preponderance of different terms for slightly different technologies indicates that this is a growth area within the field. Here we provide a few definitions of these technologies.

MR—Mixed reality refers to a blend of technologies used to influence the human perception of an experience. Motion sensors, body tracking, and eye tracking interplay with overlaid technology to give a rich and full version of reality displayed to the user. For example, technology could add sound or additional graphics to an experience in real time. Examples include the Magic Leap One and Microsoft HoloLens 2.0. MR and XR are often used interchangeably.

AR—Augmented reality refers to technology systems that overlay information onto the real world, but the technology might not allow for real-time feedback. As such, AR experiences can move or animate, but they might not interact with changes in depth of view or external light conditions. Currently, AR is considered the first generation of the newer and more interactive MR experiences.

VR—Virtual reality, as a technological product, traces its history to approximately 1960 and tends to encompass user experiences that are visually and auditorily different from the real world. Indeed, the real world is often blocked from interacting with the virtual one. Headsets, headphones, haptics, and haptic clothing might purposely cut off all input except that which is virtual. In general, VR is a widely recognizable term, often found in gaming and workplace training, where learners need to be transported to a different time and place. VR experiences in STEM often consist of virtual labs or short virtual field trips.

VW—Virtual worlds are frequently considered a subset of VR with the difference that VWs are inherently social and collaborative; VWs frequently contain multiple simultaneous users, while VRs are often solo experiences. Another discrimination between virtual reality and virtual worlds is the persistence of the virtual space. VR tends to be episodic, with the learner in the virtual experience for a few minutes and the reality created within the experience ends when the learner experience ends. VWs are persistent in that the worlds continue to exist on computer servers whether or not there are active avatars within the virtual space (Bell ). This discrimination between VR and VW, however, is dissolving. VR experiences can be created to exist for days, and some users have been known to wear headsets for extended periods of time. Additionally, more and more VR experiences are being designed to be for game play, socialization, or mental relaxation. The IEEE VR 2020 online conference and the Educators in VR International Summit 2020 offered participants opportunities to experience conference presentations in virtual rooms as avatars while interacting with presenters and conference attendees (see Sect. 2.5 for more information).

CVEs—Collaborative virtual environments are communication systems in which multiple interactants share the same three-dimensional digital space despite occupying remote physical locations (Yee and Bailenson ).

Embodiment—Embodiment is defined by Lindgren and Johnson-Glenberg () as the enactment of knowledge and concepts through the activity of our bodies within an MR (mixed reality) and physical environment.
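The distinctions the chapter draws between these modalities (real-world overlay for AR/MR, persistence and sociality for VWs) can be made concrete with a small data model. This is an illustrative sketch, not from the chapter; the class and attribute names are my own:

```python
from dataclasses import dataclass

# Illustrative sketch (not from the chapter): each XR modality is
# distinguished along the three axes used in the definitions above.
@dataclass(frozen=True)
class Modality:
    name: str
    overlays_real_world: bool   # AR/MR add information onto reality
    persistent: bool            # VWs keep existing without active users
    inherently_social: bool     # VWs host many simultaneous avatars

AR = Modality("augmented reality", overlays_real_world=True,  persistent=False, inherently_social=False)
MR = Modality("mixed reality",     overlays_real_world=True,  persistent=False, inherently_social=False)
VR = Modality("virtual reality",   overlays_real_world=False, persistent=False, inherently_social=False)
VW = Modality("virtual world",     overlays_real_world=False, persistent=True,  inherently_social=True)

def is_virtual_world(m: Modality) -> bool:
    """Per the definitions above: a VR-style experience that is both persistent and social."""
    return not m.overlays_real_world and m.persistent and m.inherently_social
```

As the chapter notes, this distinction is dissolving in practice: a persistent, social VR experience satisfies the VW predicate even if it is marketed as VR.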

https://hyp.is/mBiunvx3EeudElMRwHm5dQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

Human-Centered Design is a philosophy that involves putting human needs, capabilities, and behavior first (Jerald 2018: 15). XR provides the opportunity to experience just-in-time immersive, experiential learning that uses concrete yet exploratory experiences involving the senses, resulting in lasting memories. Here we discuss opportunities for social applications with XR.

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

XR learner activities are usually created for individual use, which may or may not need to be simultaneously experienced as a class together at the same time or place with the instructor. Activities can be designed into instruction with VR headsets, high-resolution screens, smartphones, or other solo technological devices for use inside and outside of the classroom. 

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

Ready to go relationship between STEM courses and XR. In bullet points! 

 

https://hyp.is/wJSoFPx3Eeu1mAPmeAp2tQ/www.ncbi.nlm.nih.gov/pmc/articles/PMC7948004/ 

Do we address the challenges in the grant proposal? 

some learners will be held back from full XR activity by visual, physical, and social abilities such as stroke, vertigo, epilepsy, or age-related reaction time. It should also be noted that the encompassing nature of VR headsets might create some discomfort or danger for any learners as they can no longer fully see and control their body and body space. 

what is the Metaverse

What Is the Metaverse? Is It Just Virtual Reality, or Something More?

https://www-howtogeek-com.cdn.ampproject.org/c/s/www.howtogeek.com/745807/what-is-the-metaverse-is-it-just-virtual-reality-or-something-more/amp/

Tech CEOs keep talking about “the metaverse.” Mark Zuckerberg insists that Facebook will be seen as a “metaverse company” instead of a social media company. Satya Nadella proclaims Microsoft is creating a “metaverse stack” for the enterprise.

Author Neal Stephenson coined the term “metaverse” in Snow Crash, a dystopian cyberpunk novel published in 1992.

In the novel, the metaverse is a sort of 3D virtual world. It’s not simply a virtual reality game but is a persistent, shared virtual world. Or rather, the metaverse is a whole universe of shared virtual spaces seemingly linked together—you could, essentially, teleport between them.

If you think this all sounds a bit like Ready Player One or a higher-tech version of Second Life, you’re right.

Virtual reality (VR), not augmented reality (AR), was necessary for that kind of vision.

To Zuckerberg and other tech CEOs, the concept of “the metaverse” seems to have more in common with “Web 2.0.” It’s a bunch of new technologies: VR headsets! Presence! Persistent digital worlds!

Microsoft’s vision of the metaverse seems to take the form of rambling, buzzword-heavy talk about “digital twins” and “converging the physical with the digital” with “mixed reality.” Microsoft’s Azure cloud can do it!

Of course, as we learned with Windows 10’s “Mixed Reality” headsets, that term often just means virtual reality to Microsoft. However, it can also mean augmented reality. And, little surprise, Microsoft also has a headset to sell you: the HoloLens.

++++++++++++++++
more on Metaverse in this IMS blog
https://blog.stcloudstate.edu/ims?s=metaverse

more on immersive in this IMS blog
https://blog.stcloudstate.edu/ims?s=immersive

Designing XR into Higher Education

Immersive Learning Environments: Designing XR into Higher Education

Heather Elizabeth Dodds

https://edtechbooks.org/id_highered/immersive_learning_e

The terms ‘extended reality’ or ‘cross reality’ refer to “technologies and applications that involve combinations of mixed reality (MR), augmented reality (AR), virtual reality (VR), and virtual worlds (VWs)” (Ziker, Truman, & Dodds, 2021, p. 56). Immersive learning definitions draw from Milgram and Kishino’s key taxonomy (1994), emphasizing a continuum of experiences that ranges from a computer adding overlays of information to a learner’s reality to a computer experientially transporting a learner to a different place and time by manipulating sight and sound.

VR Design Model

The chapter presents three different design models (see Figure 3): the ADDIE Design Model (Branson, 1978), Design Thinking (Brown & Wyatt, 2010) from user experience (UX), and the 3D Learning Experience Design Model (Kapp & O’Driscoll, 2009).

Serrat (2008) defines storytelling as “the vivid description of ideas, beliefs, personal experiences, and life-lessons through stories or narratives that evoke powerful emotions and insights” (p.1).

The foundational theory for most XR experiences is experiential learning theory. In cases where users create within XR, constructivist learning theory also applies.

XR experiences can include a story arc (See Appendix D), a tutorial of user affordances, intentional user actions, and place the user into first or third person experiences (Spillers, 2020).

+++++++++++++++++
more on immersive in this IMS blog
https://blog.stcloudstate.edu/ims?s=immersive+
more on ID in this IMS blog
https://blog.stcloudstate.edu/ims?s=instructional+design

Haptx moves to Seattle

HaptX raises another $12M for high-tech gloves, relocates HQ back to Seattle

https://www.geekwire.com/2021/haptx-raises-another-12m-high-tech-gloves-relocates-hq-back-seattle/

Founded in 2012 and previously known as AxonVR, the company promises to deliver realistic touch feedback to users reaching out for objects in VR, thanks to microfluidics in the glove system that physically and precisely displace the skin on a user’s hands and fingers.

Virtual and augmented reality have not yet reached mainstream consumers as some predicted but enterprise-focused startups such as HaptX have found traction. Large tech companies such as Facebook and Apple also continue investing in the technology, and investors keep making bets.

++++++++++++++++++++
more on HaptX in this IMS blog
https://blog.stcloudstate.edu/ims?s=haptx

Maya Georgieva Emory Craig XR

The state of XR in higher education

Two experts reveal the state of the art

July 22, 1:00 – 2:00 PM (CDT)

What is happening with virtual and augmented reality in higher education?

This week the Forum will explore that question with two authors of a new report, iLRN’s State of XR 2021. Maya Georgieva and Emory Craig, founders and principals of Digital Bodies, are world experts in Extended Reality. They have also been brilliant and in-demand Forum guests in 2020, 2019, and 2018.

+++++++++++++++++++
more on future trends forum in this IMS blog
https://blog.stcloudstate.edu/ims?s=future+trends+forum

JigSpace Tutorial Educational Technology

JigSpace Puts Together $4.7 Million in Funding to Expand AR Tutorial Technology

https://mobile-ar.reality.news/news/jigspace-puts-together-4-7-million-funding-expand-ar-tutorial-technology-0384775/

Startup JigSpace’s app was among the first to support ARKit and LiDAR for iPhone augmented reality.

“Creating and sharing knowledge in 3D should be simple, useful, and delightful. We’re on a mission to unlock the utility of augmented reality at massive scale and bring interactive 3D experiences into everyday life,” said Zac Duff, co-founder and CEO at JigSpace

Compared to the camera effects from Snapchat and Facebook, mobile AR apps built on ARKit from Apple and ARCore from Google haven’t had quite the impact we expected them to when Apple originally announced ARKit.

+++++++++++++++++++++
more on AR in education in this IMS blog
https://blog.stcloudstate.edu/ims?s=augmented+reality+education

New Elements of Digital Transformation

The New Elements of Digital Transformation

https://sloanreview-mit-edu.cdn.ampproject.org/c/s/sloanreview.mit.edu/article/the-new-elements-of-digital-transformation/amp

2014: “The Nine Elements of Digital Transformation”

It requires that companies become what we call digital masters. Digital masters cultivate two capabilities: digital capability, which enables them to use innovative technologies to improve elements of the business, and leadership capability, which enables them to envision and drive organizational change in systematic and profitable ways. Together, these two capabilities allow a company to transform digital technology into business advantage.

We found that the elements of leadership capability have endured, but new elements of digital capability have come to the fore.

While strong leadership capability is even more essential than ever, its core elements — vision, engagement, and governance — are not fundamentally changed, though they are informed by recent innovations. The elements of digital capability, on the other hand, have been more profoundly altered by the rapid technological advances of recent years.

The New Elements of Digital Capability

Experience design: Customer experience has become the ultimate battleground for many companies and brands.

Customer intelligence: Integrating customer data across silos and understanding customer behavior

Emotional engagement: Emotional connections with customers are as essential as technology in creating compelling customer experiences.

As ever, well-managed operations are essential to converting revenue into profit, but now we’re seeing a shift in the focus of digital transformation in this arena.

Core process automation: Amazon’s distribution centers deliver inventory to workers rather than sending workers to collect inventory. Rio Tinto, an Australian mining company, uses autonomous trucks, trains, and drilling machinery so that it can shift workers to less dangerous tasks, leading to higher productivity and better safety.

Connected and dynamic operations: Thanks to the growing availability of cheap sensors, cloud infrastructure, and machine learning, concepts such as Industry 4.0, digital threads, and digital twins have become a reality. Digital threads connecting machines, models, and processes provide a single source of truth for managing, optimizing, and enhancing processes from requirements definition through maintenance.
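The digital-twin idea described above can be sketched minimally: a twin object mirrors the last known state of a physical asset from its sensor feed, so downstream processes query the mirror rather than the machine itself. This is an illustrative sketch with assumed names, not from the article:

```python
from dataclasses import dataclass, field

# Minimal illustrative digital-twin sketch (names are assumptions, not from
# the article): sensor readings from the physical asset are merged into an
# in-memory model that acts as a "single source of truth" for queries.
@dataclass
class DigitalTwin:
    asset_id: str
    state: dict = field(default_factory=dict)

    def ingest(self, reading: dict) -> None:
        """Merge the latest sensor reading into the mirrored state."""
        self.state.update(reading)

    def query(self, key: str, default=None):
        """Read the mirrored state instead of polling the physical asset."""
        return self.state.get(key, default)

# Simulated sensor feed from a hypothetical drilling machine
twin = DigitalTwin("drill-07")
twin.ingest({"temperature_c": 81.5, "rpm": 1200})
twin.ingest({"temperature_c": 84.0})  # a newer reading overwrites the old one

print(twin.query("temperature_c"))  # → 84.0
```

In a real deployment the mirrored state would be versioned and streamed to cloud infrastructure; the point here is only the pattern of querying a synchronized model rather than the machine.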

Data-driven decision-making: from backward-looking reports to real-time data. Now, connected devices, new machine learning algorithms, smarter experimentation, and plentiful data enable more-informed decisions.

Transforming Employee Experience

Augmentation: Warnings that robots will replace humans have given way to a more nuanced and productive discussion.
Workers in Huntington Ingalls Industries’ shipyard use augmented reality to help build giant complex vessels such as aircraft carriers and submarines. They can “see” where to route wires or pipes or what is behind a wall before they start drilling into it.

Future-readying: providing employees with the skills they need to keep up with the pace of change. In the past few years, this has given rise to new models of managing learning and development in organizations, led by a new kind of chief learning officer, whom we call the transformer CLO

Flexforcing: To respond to fast-paced digital opportunities and threats, companies also need to build agility into their talent sourcing systems. As automation and AI applications take over tasks once performed by humans, some companies are multiskilling employees to make the organization more agile.

Transforming Business Models

three elements supporting business model transformation: digital enhancements, information-based service extensions, and multisided platforms.
