H5P Virtual Tour plugin
+++++++++++++++++++++
https://blog.stcloudstate.edu/ims?s=H5p
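For readers who want to try the plugin themselves, here is a minimal sketch of how a published H5P Virtual Tour (360) activity can be dropped into a web page. The container id and embed URL below are hypothetical placeholders; use the embed code your own H5P installation (h5p.com, WordPress, Moodle, etc.) generates, which typically also includes a small resizer script for responsive sizing.

```typescript
// A minimal sketch (not the plugin's official embed code): place a published
// H5P Virtual Tour (360) activity into a page via an iframe. The container id
// and embed URL are hypothetical placeholders.
function embedH5pTour(containerId: string, embedUrl: string): void {
  const container = document.getElementById(containerId);
  if (!container) {
    throw new Error(`No element with id "${containerId}" on this page`);
  }

  // H5P content is normally delivered as an iframe.
  const frame = document.createElement("iframe");
  frame.src = embedUrl;
  frame.width = "1090";
  frame.height = "640";
  frame.allowFullscreen = true;
  frame.style.border = "none";
  container.appendChild(frame);
}

// Usage (hypothetical URL and id):
embedH5pTour("tour-container", "https://h5p.example.edu/h5p/embed/123");
```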
Digital Literacy for St. Cloud State University
+++++++++++++++++++++
The Digital Library Federation’s recently published #DLFteach Toolkit Volume 2: Lesson Plans on Immersive Pedagogy may be of interest to some of you.
“The #DLFteach Toolkit 2.0 focuses on lesson plans to facilitate disciplinary and interdisciplinary work engaged with 3D technology. As 3D/VR technology becomes relevant to a wide range of scholarly disciplines and teaching contexts, libraries are proving well-suited to coordinating the dissemination and integration of this technology across the curriculum. For our purposes, 3D technology includes, but is not limited to, Augmented Reality (AR) and Virtual Reality (VR) technologies, 3D modeling and scanning software, 3D game engines and WebGL platforms, as well as 3D printers and extruders. While 3D/VR/AR technologies demonstrate real possibilities for collaborative, multidisciplinary learning, they are also fraught with broader concerns prevalent today about digital technologies.”
+++++++++++++++++++++++++++
Dalton, C. (2021). 3D Modeling for Historical Reconstruction. #DLFteach. Retrieved from https://dlfteach.pubpub.org/pub/vol2-dalton-3d-modeling-for-historical-reconstruction
+++++++++++++++++++++++++++
Clark, J. L. (2021). Creating an Equally Effective Alternative Action Plan for Immersive Technologies. #DLFteach. Retrieved from https://dlfteach.pubpub.org/pub/vol2-clark-creating-an-equally-effective-alternative-action-plan
Zoom, Teams, Skype, and FaceTime all became daily fixtures, and many of us quickly became fatigued by seeing our colleagues, students, and far-away loved ones almost exclusively in 2D. Most video conferencing solutions were not designed to be online classrooms. What is missing from the current video platforms that could improve online teaching? Tools to better facilitate student interactions, including enhanced polling and quizzing features, group work tools, and more.
While universities continue to increase in-person and HyFlex courses, hoping to soon see campuses return to normalcy, there is mounting evidence that the increased interest in digital tools for teaching and learning will persist even after the pandemic.
We should move beyond 2D solutions and take advantage of what extended reality (XR) and virtual reality (VR) have to offer us.
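As one concrete, hedged illustration of what moving beyond 2D can look like in an ordinary browser, the sketch below checks for WebXR support and requests an immersive VR session. The button id is a hypothetical placeholder, a real application would also set up a rendering loop, and compiling this as TypeScript assumes WebXR type definitions such as the @types/webxr package are installed.

```typescript
// Minimal WebXR sketch: check for immersive VR support and start a session.
// Assumes WebXR type definitions (e.g. @types/webxr) are available; a real app
// would also create an XRWebGLLayer and a render loop for the session.
async function enterVr(): Promise<void> {
  if (!navigator.xr) {
    console.warn("WebXR is not available in this browser");
    return;
  }
  const supported = await navigator.xr.isSessionSupported("immersive-vr");
  if (!supported) {
    console.warn("Immersive VR sessions are not supported on this device");
    return;
  }
  // requestSession must be triggered by a user gesture, e.g. a button click.
  const session = await navigator.xr.requestSession("immersive-vr");
  session.addEventListener("end", () => console.log("VR session ended"));
  console.log("VR session started");
}

// "enter-vr-button" is a hypothetical element id.
document.getElementById("enter-vr-button")?.addEventListener("click", () => {
  void enterVr();
});
```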
Professor Courtney Cogburn created the 1,000 Cut Journey, an immersive VR research project that allows participants to embody an avatar that experiences various forms of racism. Professor Shantanu Lal has implemented VR headsets for pediatric dentistry patients who become anxious during procedures. At Columbia Engineering, professor Steven Feiner’s Computer Graphics and User Interfaces Lab explores the design and development of 2D and 3D user interfaces for a broad range of applications and devices. Professor Letty Moss-Salentijn is working with Feiner’s lab to create dental training simulations to guide dental students through the process of nerve block injection. Faculty, students and staff at Columbia’s Media Center for Art History have created hundreds of virtual reality panoramas of archaeology projects and fieldwork that are available on the Art Atlas platform.
In spring 2020, a group of Columbia students began to build “LionCraft,” a recreation of Columbia’s Morningside campus in Minecraft. Even though students were spread out around the world, they still found creative and fun ways to run into each other on campus, in an immersive online format.
Created by The Academy—Bank of America’s award-winning onboarding, training, and development organization—the program is part of the company’s commitment to providing the most cutting-edge professional development tools to their employees to ensure they are successful in their roles.
Four hundred employees used the Oculus Go, and according to the report, 97% of the participants left feeling more confident in their abilities.
In a 2019 article published by the Academy of International Extended Reality (AIXR), research showed that success in a job and the ability to grow in that role were based 15% on hard skills and 85% on soft skills.
++++++++++++++++++
More on haptics in this blog
https://blog.stcloudstate.edu/ims?s=haptic
The metaverse (hopefully) won’t be the virtual world of ‘Snow Crash’ or ‘Ready Player One.’ It will likely be something more complex, diverse, and wild.
The metaverse concept clearly means very different things to different people. What exists right now is a series of embryonic digital spaces, such as Facebook’s Horizon, Epic Games’ Fortnite, Roblox’s digital space for gaming and game creation, and the blockchain-based digital world Decentraland–all of which have clear borders, different rules and objectives, and differing rates of growth.
There are different layers of realities that we can all be experiencing, even in the same environment or physical space. We’re already doing that with our phones to a certain extent—passively in a physical environment while mentally in a digital one. But we’ll see more experiences beyond the phone, where our whole bodies are fully engaged, and that’s where the metaverse starts to get interesting—we genuinely begin to explore and live in these alternate realities simultaneously.
Xverse
It will have legacy parts that look and feel like the web today, but it will have new nodes and capabilities that will look and feel like the Ready Player One Oasis (amazing gaming worlds), immersion leaking into our world (like my Magicverse concept), and every imaginable permutation of these. I feel that the Xverse will have gradients of sentience and autonomy, and we will have the emergence of synthetic life (things Sun and Thunder is working on) and a multitude of amazing worlds to explore. Building a world will become something everyone can do (like building a webpage or a blog), and people will be able to share richer parts of their external and inner lives at incredibly high speed across the planet.
Reality will exist on a spectrum ranging from physical to virtual (VR), but a significant chunk of our time will be spent somewhere between those extremes, in some form of augmented reality (AR). Augmented reality will be a normal part of daily life. Virtual companions will provide information, commentary, updates and advice on matters relevant to you at that point in time, including your assets and activities, in both virtual and real spaces.
I think we can all agree our initial dreams of a fully immersive, separate digital world are not only unrealistic, but maybe not what we actually want. So I’ve started defining the metaverse differently to capture the zeitgeist: we’re entering an era where every computer we interact with, big or small, is increasingly world-aware. They can recognize faces, voices, hands, relative and absolute position, velocity, and they can react to this data in a useful way. These contextually aware computers are the path to unlocking ambient computing: where computers fade from the foreground to the background of everyday, useful tools. The metaverse is less of a ‘thing’ and more of a computing era. Contextual computing enables a multitude of new types of interactions and apps: VR sculpting tools and social hangouts, self-driving cars, robotics, smart homes.
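To make the idea of world-aware, contextually reactive computing a little more concrete, here is a toy sketch of how context signals (faces, hand poses, position, velocity) might be modeled and reacted to. Every name, field, and reaction in it is hypothetical and purely illustrative.

```typescript
// Illustrative toy model of a "world-aware" device reacting to context signals.
// All names, fields, and reactions here are hypothetical.
interface ContextSignal {
  faceDetected: boolean;                          // e.g. from a camera-based face detector
  handPose: "open" | "pinch" | "none";            // e.g. from a hand-tracking model
  position: { x: number; y: number; z: number };  // absolute position in metres
  velocity: { x: number; y: number; z: number };  // metres per second
  timestamp: number;                              // milliseconds since start
}

type ContextReaction = (signal: ContextSignal) => void;

class AmbientDevice {
  private reactions: ContextReaction[] = [];

  // Register a reaction to run whenever new context arrives.
  onContext(reaction: ContextReaction): void {
    this.reactions.push(reaction);
  }

  // Called by whatever sensor-fusion layer produces the signals.
  update(signal: ContextSignal): void {
    for (const react of this.reactions) react(signal);
  }
}

// Usage: fade the interface into the background when nobody is looking.
const device = new AmbientDevice();
device.onContext((s) => {
  if (!s.faceDetected) {
    console.log("No face detected: dimming display, holding notifications");
  } else if (s.handPose === "pinch") {
    console.log("Pinch gesture: selecting the highlighted item");
  }
});
device.update({
  faceDetected: false,
  handPose: "none",
  position: { x: 0, y: 0, z: 0 },
  velocity: { x: 0, y: 0, z: 0 },
  timestamp: 0,
});
```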
As carbon is to the organic world, AI will be both the matrix that provides the necessary structural support and the material from which digital representations will be made. Of all the ways in which AI will shape the form of the metaverse, perhaps most essential is the role it will play in the physical-digital interface. Translating human actions into digital input–language, eye movement, hand gestures, locomotion–these are all areas in which AI companies and researchers have already made tremendous progress.
Qualcomm views the metaverse as an ever-present spatial internet complete with personalized digital experiences that spans the physical and virtual worlds, where everything and everyone can communicate and interact seamlessly.
As an active researcher in the security and forensics of VR systems, I think that if the metaverse does come into existence, we should explore and hypothesize the ways it will be misused.
I picture [the metaverse] almost like The Truman Show. Only, instead of walking into a television set, you walk into the internet and can explore any number of different realities
We imagine the metaverse as reality made better, a world infused with magic, stories, and functionality at the intersection of the digital and physical worlds.
Rather than building the “metaverse,” a separate and fully virtual reality that is disconnected from the physical world, we are focused on augmenting reality, not replacing it. We believe AR–or computing overlaid on the world around us–not only has a smoother path to mass adoption but will also be better for the world than a fully virtual one.
In the reality-based metaverse, we will be able to more effectively design products of the future, meet and collaborate with our colleagues far away, and experience any remote place in real-time.
I prefer to think of the metaverse as simply bringing our bodies into the internet.
The metaverse isn’t just VR! Those spaces will connect to AR glasses and to 2D spaces like Instagram. And most importantly, there will be a real sense of continuity where the things you buy are always available to you.
At its core will be a self-contained economy that allows individuals and businesses to create, own or invest in a range of activities and experiences.
The metaverse experience can be altered from the individual’s point of view and shaped or curated by any number of agents—whether human or A.I. In that sense, the metaverse does not have an objective look beyond its backend. In essence, the metaverse, together with our physical locations, forms a spatial continuum.
The AR applications of the metaverse are limitless and it really can become the next great version of the internet.
It seems fair to predict that the actual aesthetic of any given metaverse will be determined by user demand. If users want to exist in a gamified world populated by outrageous avatars and fantastic landscapes, then the metaverse will respond to that demand. Like all things in this world, the metaverse will be market-driven.
+++++++++++++++
More on the metaverse in this blog
https://blog.stcloudstate.edu/ims?s=metaverse
William Gibson’s award-winning 1984 science fiction classic Neuromancer popularized the word cyberspace, a meaningless portmanteau that went viral and eventually became a shorthand expression describing the totality of the online world.
Something similar happened with the word metaverse, coined in Neal Stephenson’s 1992 novel Snow Crash, where it referred to the successor of our two-dimensional Internet. The word resurfaced a short time later in the product road maps of a hundred failed startups and is returning now as the plaything of Big Tech.
The department had implemented a relaxation VR program in its waiting room.
+++++++++++++++++
More on virtual reality and mindfulness in this blog
https://blog.stcloudstate.edu/ims?s=Virtual+reality+mindfulness
https://www.emteqlabs.com/e40000-prize-fund-for-vr-researchers-and-content-creators/
FOA (Family of Apps), FRL (Facebook Reality Labs)