The power of VR goes beyond simply recruiting. The University of Michigan uses the technology as a learning tool: by building a virtual reality “cave,” it allows engineering students to interact with virtual structures as they “come together, buckle and collapse.” Instead of relying on physical models—which tend to be large, expensive, and slow to build—a student using the MIDEN VR cave can fly around a virtual structure to study mechanical connections.
Earlier this week, Apple (NASDAQ: AAPL) acquired augmented reality (AR) lens and glasses company Akonia Holographics, which spawned plenty of speculation about Apple getting serious about AR.
Augmented reality overlays digital information on the real world; it differs from virtual reality (VR), in which the whole environment is simulated. Akonia describes its AR product as “thin, transparent smart glass lenses that display vibrant, full-color, wide field-of-view images.”
“Digital maps have become essential tools of our everyday lives, yet despite their ubiquity, they are still in their infancy. From urban mobility to indoor positioning, from LIDAR to Augmented Reality, advances in technology and new kinds of data are powering innovations in all areas of digital mapping. If you love maps and are passionate about what is possible, you will be in great company.”
eXtended Reality (XR): The New World of Human/Machine Interaction
Wednesday, October 31 | 9:45am – 10:30am MT |
Session Type: Breakout Session
Delivery Format: Interactive Presentation
eXtended reality (XR) technologies present opportunities to advance the higher education mission and prepare students for a new world of human/machine interaction. In this interactive session, we will explore what is being done today and what is possible in four key areas of XR: use, technology, content development, and gamification.
* Identify best-in-class tools and methods available for the design and support of XR in higher ed
* Explain to campus stakeholders the potential of XR to support pedagogy, research, and student success
* Understand the areas of focus of our growing XR community of practice and how you can participate
Kiwi enhances learning experiences by encouraging active participation through AR and social media. A student can use a smartphone or tablet to scan physical textbooks and unlock learning assistance tools—highlighting, note creation and sharing, videos, and AR guides—all features that encourage peer-to-peer learning. (My note: as reported in the discussion at the QQLM conference in Crete on Zois Koukopoulos and Dimitrios Koukopoulos, “Augmented Reality Dissemination and Exploitation Services for Libraries”: http://blog.stcloudstate.edu/ims/2018/05/21/measuring-learning-outcomes-of-new-library-initiatives/)
Street Smarts VR is a startup that is working to provide solutions for a major issue facing America’s communities: conflicts between police officers and citizens.
NYC Media Lab recently collaborated with Bloomberg and the augmented reality startup Lampix on a fellowship program to envision the future of learning in the workplace. Lampix technology looks like it sounds: lamp-like hardware that projects AR content, turning any flat surface into one that can visualize data and present collaborative workflows.
Calling Thunder: The Unsung History of Manhattan
Calling Thunder: The Unsung History of Manhattan, a project that came out of a recent fellowship program with A+E Networks, re-imagines a time before industrialization, when the city we know now was lush with forests, freshwater ponds, and wildlife.
Google, for instance, has made virtual field trips to inaccessible locations easier for history and social studies classes with its Cardboard viewers used in conjunction with the Expeditions app. And technologies like zSpace have expanded opportunities in STEM subjects with virtual interactive dissections, diagrams and experiments.
more on VR in education in this IMS blog
Augmented reality can be described as experiencing the real world with an overlay of additional computer generated content. In contrast, virtual reality immerses a user in an entirely simulated environment, while mixed or merged reality blends real and virtual worlds in ways through which the physical and the digital can interact. AR, VR, and MR offer new opportunities to create a psychological sense of immersive presence in an environment that feels real enough to be viewed, experienced, explored, and manipulated. These technologies have the potential to democratize learning by giving everyone access to immersive experiences that were once restricted to relatively few learners.
In Grinnell College’s Immersive Experiences Lab http://gciel.sites.grinnell.edu/, teams of faculty, staff, and students collaborate on research projects, then use 3D, VR, and MR technologies as a platform to synthesize and present their findings.
In terms of equity, AR, VR, and MR have the potential to democratize learning by giving all learners access to immersive experiences.
There is relatively little research about the most effective ways to use these technologies as instructional tools. Combined, these factors can be disincentives for institutions to invest in the equipment, facilities, and staffing required to support these systems. AR, VR, and MR technologies also raise concerns about personal privacy and data security. Further, at least some of these tools and applications currently fail to meet accessibility standards, and the user experience in some AR, VR, and MR applications can be intensely emotional and even disturbing (my note: but this can also be used for empathy literacy).
immersing users in recreated, remote, or even hypothetical environments as small as a molecule or as large as a universe, allowing learners to experience “reality” from multiple perspectives.
Augmented reality adds computer-generated content as a contextual overlay to the real world. This technology, often powered by devices we already carry, has enormous applications for training and development.
Virtual reality has existed for decades, but technology has finally emerged that makes it truly accessible. VR allows us to put learners in a truly immersive environment, creating entirely new opportunities for training and learning.
AR and VR are just the start of the alternate-reality conversation. There are additional technologies that we can use on their own or as part of a blend with AR and VR to increase the level of immersion in the experiences we create.
new forms of human-computer interaction (HCI) such as augmented reality (AR), virtual reality (VR), and mixed reality (MR).
combining AR/VR/MR with cognitive computing and artificial intelligence (AI) technologies (such as machine learning, deep learning, natural language processing and chatbots).
Some thought-provoking questions include:
Will remote workers be able to be seen and interacted with via their holograms (i.e., attending their meetings virtually)? What would this mean for remote learners?
Will our smartphones increasingly allow us to see information overlaid on the real world? (Think Pokémon Go, but putting that sort of technology into a vast array of different applications, many of which could be educational in nature)
How do/will these new forms of HCI impact how we design our learning spaces?
Will students be able to pick their preferred learning setting (i.e., studying by a brook or stream or in a virtual Starbucks-like atmosphere)?
Will more devices/platforms be developed that combine the power of AI with VR/AR/MR-related experiences? For example, will students be able to issue a verbal question or command to be able to see and experience walking around ancient Rome?
Will there be many new types of learning experiences, like what Microsoft was able to achieve in its collaboration with Case Western Reserve University [OH]? Its HoloLens product transforms the way human anatomy can be taught.
p. 22 Extensive costs for VR design and development drive the need for collaborative efforts.
Case Western Reserve University demonstrates a collaboration with the Cleveland Clinic and Microsoft to create active, multi-dimensional learning using holography.
the development of more affordable high-quality virtual reality solutions.
AR game developed by the Salzburg University of Applied Sciences [Austria] (http://www.fh-salzburg.ac.at/en/) that teaches about sustainability, the environment and living green.
Whether using AR for a gamified course or to acclimate new students to campus, the trend will continue into 2017.
Google Expeditions This virtual reality field trip tool works in conjunction with Google Cardboard and has just been officially released. The app allows teachers to guide students through an exploration of 200 (and growing) historical sites and natural resources in an immersive, three-dimensional experience. The app only works on Android devices and is free.
Flippity This app works in conjunction with Google Sheets and allows teachers to easily make a Jeopardy-style game.
Google Science Journal This Android app allows users to do science experiments with mobile phones. Students can use the phone’s built-in sensors or connect external sensors to collect data, and can also take notes on observations, analyze, and annotate within the app.
Google Cast This simple app solves issues of disparate devices in the classroom. When students download the app, they can easily project from their devices onto the screen at the front of the room. “You don’t have to have specific hardware, you just have to have Wi-Fi.”
Constitute This site hosts a database of constitutions from around the world. Anything digitally available has been aggregated here. It is searchable by topic and will pull out specific excerpts related to search terms like “freedom of speech.”
YouTube A database of YouTube channels by subject to help educators with discoverability (hint: subjects are in tabs along the bottom of the document).
Zygote Body This freemium tool has a lot of functionality in the free version, allowing students to view different parts of human anatomy and dig into how various body systems work.
Pixlr This app has less power than Photoshop, but is free and fairly sophisticated. It works directly with Google accounts, so students can store files there.
Build With Chrome This extension to the Chrome browser lets kids play with digital blocks like Legos. Based on the computer’s IP address, the software assigns users a nearby plot of virtual land on which to build. There’s a Build Academy to learn how to use the various tools within the program, but then students can make whatever they want.
Google CS First Built on Scratch’s programming language, this easy tool gives step-by-step instructions to get started and is great for the hesitant teacher who is just beginning to dip a toe into coding.
Topics: Assistive and adaptive technologies, Augmented reality, Learning spaces, Mobile learning, Tools
the Universal Design for Learning (UDL) framework, which aims to develop expert learners. In addition to removing barriers and making learning accessible to the widest variety of learners possible, UDL addresses many of the metacognitive and self-efficacy skills associated with becoming an expert learner, including:
Executive functions. These cognitive processes include initiation, goal setting, attention, planning and organization.
Comprehension skills. This skillset encompasses knowledge construction, making connections, developing strategies and monitoring understanding.
Engagement principles. These soft skills include coping, focus, resilience, effort, persistence, self-assessment and reflection.
AR apps: There are two types of AR apps: those for experience and those for creation. Experience AR apps, such as Star Walk, are designed to provide the user with an AR experience within a specific content area or context. Creation AR apps, such as BlippAR and Aurasma, allow users to create their own AR experiences.
Posters: To support comprehension and metacognitive skills, images related to classroom topics or posters depicting a process could serve as the trigger image.
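To make the trigger-image idea concrete, here is a toy Python sketch of how an app could map a recognized image to overlay content. Real AR platforms use robust feature matching rather than the simple “average hash” shown here, and all names and URLs below are made-up illustrations, not any actual product’s API.

```python
# Toy sketch only: a tiny "average hash" over an 8x8 grayscale grid
# stands in for real image recognition. Everything here is illustrative.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255) -> 64-bit int hash.
    Each bit is 1 if that pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical registry: trigger-image hash -> AR overlay to launch.
poster = [[10] * 8] * 4 + [[200] * 8] * 4   # a half-dark, half-light poster
OVERLAYS = {average_hash(poster): "https://example.edu/water-cycle-ar.mp4"}

def overlay_for(pixels):
    """Return the overlay registered for this image, or None if unknown."""
    return OVERLAYS.get(average_hash(pixels))
```

The point is only the mapping step: once the device recognizes the trigger image, looking up the associated content is a simple dictionary lookup.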
iBeacons: Beacon technology, such as iBeacon, shares some similarities with QR codes and AR, as it is a way to call up digital content from a specific spot in the physical world. However, unlike QR codes and AR, you do not have to point your device at a code or use a trigger image to call up content with iBeacon. If you have an iBeacon-enabled app, your device will automatically sync when it is near a beacon—a small device that emits a low-power Bluetooth signal—and the beacon then automatically launches digital content, such as a video, audio file, or webpage. Beacon technology is well suited for center-based activities, as you can set up the app to trigger instructions for each center, exemplars of what the finished work will look like, and/or prompts for reflection when the center’s activity has been completed.
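The center-based setup described above boils down to associating each beacon’s identity with the content it should launch. An iBeacon advertises a UUID plus “major” and “minor” numbers; the sketch below shows that dispatch logic in Python under that assumption. The UUID, the room/center interpretation of major/minor, and the content URLs are all hypothetical examples, not values from any real deployment.

```python
# Hypothetical sketch of the dispatch logic an iBeacon-enabled app might
# use. An iBeacon is identified by a UUID plus "major" and "minor"
# numbers; here we treat major as the room and minor as the learning
# center. All identifiers and URLs below are made up for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class BeaconID:
    uuid: str   # region UUID shared by a deployment's beacons
    major: int  # e.g., the classroom
    minor: int  # e.g., the specific learning center

# Map each center's beacon to the content it should trigger.
CONTENT = {
    BeaconID("f7826da6-4fa2-4e98-8024-bc5b71e0893e", 1, 1):
        "https://example.edu/center1/instructions.mp4",
    BeaconID("f7826da6-4fa2-4e98-8024-bc5b71e0893e", 1, 2):
        "https://example.edu/center2/exemplar.html",
}

def on_beacon_detected(beacon: BeaconID) -> Optional[str]:
    """Called when the device comes in range of a beacon; returns the
    content to launch, or None for an unregistered beacon."""
    return CONTENT.get(beacon)
```

On a real device the OS (e.g., Core Location on iOS) delivers the detected UUID/major/minor to the app; the table lookup shown here is the part a teacher effectively configures when assigning content to each center.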
More on QR codes in this IMS blog: