The event requires no registration; it is virtual only, free, and open to the public. Platform access is required, so please install one of the above platforms to attend the International Summit. You may attend in 2D on a desktop or laptop computer with headphones and a microphone (a USB gaming headset is recommended), or with a VR device such as the Oculus Go, Quest, or Rift, the HTC Vive, or other mobile and tethered headsets. Please note the specifications and requirements of each platform.
Charlie Fink, author, columnist for Forbes magazine, and Adjunct Faculty member at Chapman University, will be presenting “Setting the Table for the Next Decade in XR,” a discussion of the future of this innovative and immersive technology, at the 2020 Educators in VR International Summit. He will be speaking in AltspaceVR on Tuesday, February 18 at 1:00 PM EST.
This workshop with Dr. Sarah Jones will focus on developing a new and relevant literacy for virtual reality, including the core competencies and skills needed to become an engaged user of the technology in a meaningful way. The workshop will feed into research for a forthcoming book, Uncovering a Literacy for VR, due to be published in 2020.
Sarah is listed as one of the top 15 global influencers within virtual reality. After nearly a decade in television news, Sarah began working in universities focusing on future media, future technology and future education. Sarah holds a PhD in Immersive Storytelling and has published extensively on virtual and augmented reality, whilst continuing to make and create immersive experiences. She has advised the UK Government on Immersive Technologies and delivers keynotes and speaks at conferences across the world on imagining future technology. Sarah is committed to diversifying the media and technology industries and regularly champions initiatives to support this agenda.
Currently there are limited ways to connect 3D VR environments to physical objects in the real world while simultaneously supporting communication and collaboration between remote users. Within the context of a solar power plant, the performance metrics of the site are invaluable for environmental engineers who are remotely located. Often two or more remotely located engineers need to communicate and collaborate on solving a problem. If a solar panel component is damaged, the repair often needs to be undertaken on-site, thereby incurring additional expenses. This communication falls into two categories, inter-cognitive and intra-cognitive: inter-cognitive communication, where information transfer occurs between two cognitive entities with different cognitive capabilities (e.g., between a human and an artificially cognitive system); and intra-cognitive communication, where information transfer occurs between two cognitive entities with equivalent cognitive capabilities (e.g., between two humans) [Baranyi and Csapo, 2010]. Currently, non-VR solutions offer a comprehensive analysis of solar plant data, and a regular PC with a monitor still has advantages over 3D VR. For example, sensors can be monitored using dedicated software such as EPEVER or via a web browser, as exemplified by the comprehensive service provided by Elseta. But when multiple users are able to collaborate remotely within a three-dimensional virtual simulation, the opportunities for communication, training and academic education will be profound.
Michael Vallance, Ed.D., is a researcher in the Department of Media Architecture, Future University Hakodate, Japan. He has been involved in educational technology design, implementation, research and consultancy for over twenty years, working closely with higher education institutes, schools and media companies in the UK, Singapore, Malaysia and Japan. His 3D virtual world design and tele-robotics research has been recognized and funded by the UK Prime Minister’s Initiative (PMI2) and the Japan Advanced Institute of Science and Technology (JAIST). He has received an award from the United States Army for his research on collaborative robot programming in a 3D virtual world.
Augmented reality lenses are popular among young people thanks to Snapchat’s invention. Businesses lose money when they fail to make full use of social media targeting young people (14-25). In this presentation, Dominique Wu will show how businesses can generate more leads through Spark AR (Facebook AR/Instagram AR) and Snapchat AR Lenses, and how to create strategic Snapchat and Instagram AR campaigns.
Dominique Wu is an XR social media strategist and an expert in UX/UI design. She has her own YouTube and Apple Podcast show called “XReality: Digital Transformation,” covering the technology and techniques of incorporating XR and AR into social media and marketing, and integrating them into enterprise solutions.
Mark Christian, EVP, Strategy and Corporate Development, GIGXR
Mixed reality devices like the HoloLens are transforming education now. Mark Christian will discuss how the technology is no longer about edge use cases or proofs of concept (POCs), but about real, usable products already in universities transforming the way we teach and learn. Christian will talk about the products of GIGXR, the story of how they were developed, and what the research says about their efficacy. It is time to move to adoption of XR technology in education. Learn how one team has made this a reality.
As CEO of forward-thinking virtual reality and software companies, Mark Christian employs asymmetric approaches to rapid, global market adoption, hiring, diversity and revenue. He prides himself on unconventional approaches to building technology companies.
Virtual reality is an effective medium for education only if it is done right. Whether VR is considered a gimmick depends on how the software applications are designed and developed, not on the limitations of the hardware. I will give insights into VR development for educational content specifically designed for students of lower secondary school. I will also provide insights into game development in the Unity3D game engine.
A game and VR developer with over three years of experience and the developer of Zombie Shooter, winner of various national awards in the gaming and entertainment category, Avinash Gyawali is also the developer of EDVR, an immersive voice-controlled VR experience specially designed for children aged 10-18.
Virtual Reality Technologies for Learning Designers
Margherita Berti
Virtual Reality (VR) is a computer-generated experience that simulates presence in real or imagined environments (Kerrebrock, Brengman, & Willems, 2017). VR promotes contextualized learning, authentic experiences, critical thinking, and problem-solving opportunities. Despite the great potential and popularity of this technology, the latest two editions of the Educause Horizon Report (2018, 2019) have argued that VR remains “elusive” in terms of mainstream adoption. The reasons are varied, including the expense and the lack of empirical evidence for its effectiveness in education. More importantly, examples of successful VR implementations for instructors who lack technical skills are still scarce. Margherita Berti will discuss a range of easy-to-use educational VR tools, examples of VR-based activities, and the learning theories and instructional design principles utilized in their development.
Margherita Berti is a doctoral candidate in Second Language Acquisition and Teaching (SLAT) and Educational Technology at the University of Arizona. Her research specialization resides at the intersection of virtual reality, the teaching of culture, and curriculum and content development for foreign language education.
Amanda Fox, Creative Director of STEAMPunks/MetaInk Publishing
There is a barrier between an author and the readers of his or her books. The author’s journey ends, and the reader’s begins. But what if, as an author or trainer, you could use gamification and augmented reality (AR) to interact with and coach your readers as part of their learning journey? Attend this session with Amanda Fox to learn how the book Teachingland leverages augmented reality tools such as Metaverse to connect with readers beyond the text.
Amanda Fox is Creative Director of STEAMPunksEdu and author of Teachingland: A Teacher’s Survival Guide to the Classroom Apocalypse and Zom-Be A Design Thinker. Check her out on the Virtual Reality Podcast, or connect with her on Twitter @AmandaFoxSTEM.
Christian Jonathan Angel Rueda specializes in didactic activities that use virtual reality/virtual worlds to teach the fundamentals of design. He shares the development of a course that included recreating works in the three-dimensional environment using the fundamentals learned in class, a demonstration of all the works developed throughout the semester using the knowledge of design foundations to present them creatively, and a final project class scenario that connected the scenes in which students showed their work throughout the semester.
Christian Jonathan Angel Rueda is a research professor at the Autonomous University of Queretaro in Mexico. With a PhD in educational technology, Christian has published several papers on the intersection of education, pedagogy, and three-dimensional immersive digital environments. He is also an edtech, virtual reality, and social media consultant at Eco Onis.
How can we bridge the gap between eLearning and XR? Richard Van Tilborg discusses combining insights from brain research with new technologies, and presents training and education cases realised with the CoVince platform: journeys that start on your mobile and continue in VR. He will also cover the possibilities to earn from your creations and to have a central distribution place for learning and data.
Richard Van Tilborg works with the CoVince platform, a VR platform offering training and educational programs for central distribution of learning and data. He is an author and speaker focusing on computers and education in virtual reality-based tasks for delivering feedback.
At a session on the umbrella concept of “mixed reality” (abbreviated XR) here Thursday, attendees had some questions for the panel’s VR/AR/XR evangelists: Can these tools help students learn? Can institutions with limited budgets pull off ambitious projects? Can skeptical faculty members be convinced to experiment with unfamiliar technology?
Yale has landed on a “hub model” for project development — instructors propose projects and partner with students with technological capabilities to tap into a centralized pool of equipment and funding. (My note: this is what I suggest in my Chapter 2 of Arnheim, Eliot & Rose (2012) Lib Guides)
Several panelists said they had already been getting started on mixed reality initiatives prior to the infusion of support from Educause and HP, which helped them settle on a direction.
While 3-D printing might seem to lend itself more naturally to the hard sciences, Yale’s humanities departments have cottoned to the technology as a portal to answering tough philosophical questions.
Institutions would be better served forgoing an early investment in hardware and instead gravitating toward free online products like Unity, Organon and You by Sharecare, all of which allow users to create 3-D experiences from their desktop computers.
XR technologies encompass 3D simulations, modeling, and production. This project sought to identify:
current innovative uses of these 3D technologies,
how these uses are currently impacting teaching and learning, and
what this information can tell us about possible future uses for these technologies in higher education.
p. 5 Extended reality (XR) technologies, which encompass virtual reality (VR) and augmented reality (AR), are already having a dramatic impact on pedagogy in higher education. XR is a general term that covers a wide range of technologies along a continuum, with the real world at one end and fully immersive simulations at the other.
p. 6 The Campus of the Future project was an exploratory evaluation of 3D technologies for instruction and research in higher education: VR, AR, 3D scanning, and 3D printing. The project sought to identify interesting and novel uses of 3D technology.
p. 7 HP would provide the hardware, and EDUCAUSE would provide the methodological expertise to conduct an evaluation research project investigating the potential uses of 3D technologies in higher education learning and research.
The institutions that participated in the Campus of the Future project were selected because they were already on the cutting edge of integrating 3D technology into pedagogy. These institutions were therefore not representative, nor were they intended to be representative, of the state of higher education in the United States. These institutions were selected precisely because they already had a set of use cases for 3D technology available for study.
p. 9 At some institutions, the group participating in the project was an academic unit (e.g., the Newhouse School of Communications at Syracuse University; the Graduate School of Education at Harvard University). At these institutions, the 3D technology provided by HP was deployed for use more or less exclusively by students and faculty affiliated with the particular academic unit.
p. 10 definitions
There is not universal agreement on the definitions of these terms or on the scope of these technologies. Also, all of these technologies currently exist in an active marketplace and, as in many rapidly changing markets, there is a tendency for companies to invent neologisms around 3D technology.
A 3D scanner is not a single device but rather a combination of hardware and software. There are generally two pieces of hardware: a laser scanner and a digital camera. The laser scanner bounces laser beams off the surface of an object to determine its shape and contours.
p. 11 definitions
Virtual reality means that the wearer is completely immersed in a computer simulation. Several types of VR headsets are currently available, but all involve a lightweight helmet with a display in front of the eyes (see figure 2). In some cases, this display may simply be a smartphone (e.g., Google Cardboard); in other cases, two displays, one for each eye, are integrated into the headset (e.g., HTC Vive). Most commercially available VR rigs also include handheld controllers that enable the user to interact with the simulation by moving the controllers in space and clicking on finger triggers or buttons.
p. 12 definitions
Augmented reality provides an “overlay” of some type over the real world through the use of a headset or even a smartphone.
In an active technology marketplace, there is a tendency for new terms to be invented rapidly and for existing terms to be used loosely. This is currently happening in the VR and AR market space. The HP VR rig and the HTC Vive unit are marketed as being immersive, meaning that the user is fully immersed in a simulation, that is, virtual reality. Many currently available AR headsets, however, are marketed not as AR but rather as MR (mixed reality). These MR headsets have a display in front of the eyes as well as a pair of front-mounted cameras; they are therefore capable of supporting both VR and AR functionality.
p. 13 Implementation
Technical difficulties.
Technical issues can generally be divided into two broad categories: hardware problems and software problems. There is, of course, a common third category: human error.
p. 15 the technology learning curve
The well-known diffusion of innovations theoretical framework articulates five adopter categories: innovators, early adopters, early majority, late majority, and laggards. Everett M. Rogers, Diffusion of Innovations, 5th ed. (New York: Simon and Schuster, 2003).
It is also likely that staff in the campus IT unit or center for teaching and learning already know who (at least some of) these individuals are, since such faculty members are likely to already have had contact with these campus units.
Students may of course also be innovators and early adopters, and in fact several participating institutions found that some of the most creative uses of 3D technology arose from student projects.
p. 30 Zeynep Tufekci, in her book Twitter and Tear Gas
definition: There is no necessary distinction between AR and VR; indeed, much research on the subject is based on a conception of a “virtuality continuum” from entirely real to entirely virtual, where AR lies somewhere between those ends of the spectrum. Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information Systems, vol. E77-D, no. 12 (1994); Steve Mann, “Through the Glass, Lightly,” IEEE Technology and Society Magazine 31, no. 3 (2012): 10–14.
For the future of 3D technology in higher education to be realized, that technology must become as much a part of higher education as any technology: the learning management system (LMS), the projector, the classroom. New technologies and practices generally enter institutions of higher education as initiatives. Several active learning classroom initiatives are currently under way, for example, as well as a multi-institution open educational resources (OER) degree initiative.
p. 32 Storytelling
Some scholars have argued that all human communication is based on storytelling; certainly advertisers have long recognized that storytelling makes for effective persuasion, and a growing body of research shows that narrative is effective for teaching even topics that are not generally thought of as having a natural story, for example, in the sciences.
p. 33 accessibility
The experience of Gallaudet University highlights one of the most important areas for development in 3D technology: accessibility for users with disabilities.
p. 34 instructional design
For that to be the case, 3D technologies must be incorporated into the instructional design process for building and redesigning courses. And for that to be the case, it is necessary for faculty and instructional designers to be familiar with the capabilities of 3D technologies. And for that to be the case, it may not be necessary but would certainly be helpful for instructional designers to collaborate closely with the staff in campus IT units who support and maintain this hardware.
Every institution of higher education has a slightly different organizational structure, of course, but these two campus units are often siloed. This siloing may lead to considerable friction in conducting the most basic organizational tasks, such as setting up meetings and apportioning responsibilities for shared tasks. Nevertheless, IT units and centers for teaching and learning are almost compelled to collaborate in order to support faculty who want to integrate 3D technology into their teaching. It is necessary to bring the instructional design expertise of a center for teaching and learning to bear on integrating 3D technology into an instructor’s teaching. (My note: and where does this place SCSU?) Therefore, one of the most critical areas in which IT units and centers for teaching and learning can collaborate is in assisting instructors to develop this integration and to develop learning objects that use 3D technology.
p. 35 For 3D technology to really gain traction in higher education, it will need to be easier for instructors to deploy without such a large support team.
p. 35 Sites such as Thingiverse, Sketchfab, and Google Poly are libraries of freely available, user-created 3D models.
ClassVR is a tool that enables the simultaneous delivery of a simulation to multiple headsets, though the simulation itself may still be single-user.
p. 37 data management:
An institutional repository is a collection of an institution’s intellectual output, often consisting of preprint journal articles and conference papers and the data sets behind them. An institutional repository is often maintained by either the library or a partnership between the library and the campus IT unit. An institutional repository therefore has the advantage of the long-term curatorial approach of librarianship combined with the systematic backup management of the IT unit. (My note: leaves me wondering where this puts SCSU)
Sharing data sets is critical for collaboration and increasingly the default for scholarship. Data is as much a product of scholarship as publications, and there is a growing sentiment among scholars that it should therefore be made public.
University libraries have held collections of books and printed material throughout their existence and continue to be perceived as repositories for physical collections. Other non-print specialized collections of interest have been held in various departments on campus, such as Anthropology, Art, and Biology, due to the unique needs of the collections and their usage. With the advent of electronic media, it becomes possible to store these non-print collections in a central place, such as the Library.
The skills needed to curate artifacts from an archeological excavation, biological specimens from various life forms, and sculpture work are very different, making it difficult for smaller university libraries to properly hold, curate, and make available such collections. In addition, faculty in the various departments tend to want those collections near their coursework and research, so they can be readily available to students and researchers. With the expansion of online learning, the need for such availability becomes increasingly pronounced.
With the advent of three-dimensional (3D) scanners, it has become possible for a smaller library to hold digital representations of these collections in an archive that can be curated from the various departments by experts in the discipline. The Library can then make the digital representations available to other researchers, students, and the public through kiosks in the Library or via the Internet. Current methods to scan and store an artifact in 3D still require expertise not often found in a library.
We propose to use existing technology to build an easy-to-use system to scan smaller artifacts in 3D. The project will include the purchase and installation of a workstation in the Library, where the artifact collection can be accessed using a large touch-screen monitor, and a portable, easy-to-use 3D scanning station. Curators of collections from various departments on the St. Cloud State University campus can check out the scanning station, connect it to power and Internet where the collection is located, and scan their collection into the Library’s digital archives, making the collection easily available to students, other researchers and the public.
The project would include assembly of the two workstations previously mentioned and, potentially, development of the robotic scanner. Software would be produced to automate the workflow from the scanner to archiving the digital representation and then making the collection available on the Internet.
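For illustration only, here is a minimal sketch (in Python, with hypothetical folder names and metadata fields; the proposal does not specify an implementation) of the kind of automation described above: watching a drop folder for newly exported scan files and copying each one into an archive directory alongside a small metadata record that a web catalog could later expose.

```python
"""Minimal sketch of the proposed scan-to-archive workflow (hypothetical).

Assumes scans are exported as .obj/.stl/.ply files into SCAN_DROP_DIR;
the real system would also push records to the library's repository.
"""
import json
import shutil
import time
from datetime import datetime, timezone
from pathlib import Path

SCAN_DROP_DIR = Path("scans/incoming")   # hypothetical export folder on the scan cart
ARCHIVE_DIR = Path("scans/archive")      # hypothetical archive folder served to the web
SCAN_EXTENSIONS = {".obj", ".stl", ".ply"}

def archive_new_scans() -> None:
    """Copy any new scan files into the archive with a small metadata record."""
    SCAN_DROP_DIR.mkdir(parents=True, exist_ok=True)
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    for scan in SCAN_DROP_DIR.iterdir():
        if scan.suffix.lower() not in SCAN_EXTENSIONS:
            continue
        target = ARCHIVE_DIR / scan.name
        if target.exists():
            continue  # already archived
        shutil.copy2(scan, target)
        metadata = {
            "file": scan.name,
            "archived_at": datetime.now(timezone.utc).isoformat(),
            "size_bytes": scan.stat().st_size,
            "curator": "TBD",  # to be filled in by the department curator
        }
        target.with_suffix(".json").write_text(json.dumps(metadata, indent=2))

if __name__ == "__main__":
    while True:  # poll the drop folder; a real system might use a file-system watcher instead
        archive_new_scans()
        time.sleep(30)
```

A production version would replace the polling loop and local folders with deposit into the institutional repository mentioned earlier, but the shape of the workflow (capture, record metadata, archive, publish) would be the same.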
This project would be a collaboration between the St. Cloud State University Library (https://www.stcloudstate.edu/library/) and the Visualization Laboratory (https://www.facebook.com/SCSUVizLab/). The project would use the expertise and services of the St. Cloud State Visualization Laboratory. Dr. Plamen Miltenoff, a faculty member with the Library, will coordinate the Library initiatives related to the use of the 3D scanner. Mark Gill, Visualization Engineer, and Dr. Mark Petzold, Associate Professor of Electrical and Computer Engineering, will lead a group of students in developing the software to automate the scanning, storage, and retrieval of the 3D models. The Visualization Lab has already had success in 3D scanning objects for other departments and in creating interactive displays allowing retrieval of various digital content, including 3D scanned objects such as animal skulls and video. A collaboration between the Library, the VizLab and the Center for Teaching and Learning (https://www.stcloudstate.edu/teaching/) will enable campus faculty to overcome technical and financial obstacles. It will promote the VizLab across campus, while sharing its technical resources with the Library and making those resources widely available across campus. Such work across silos will expose the necessity (if any) of standardization and will help faculty embrace stronger collaborative practices as well as spur the process of reproduction of best practices across disciplines.
Budget (hardware and cost):
42” Touch Screen Monitor: $2,200
Monitor Mount: $400
2 Computer Workstations: $5,000
Installation: $500
Cart for Mobile 3D Scanner: $1,000
3D Scanner (either purchase or develop in-house): $2,000
Total: $11,100
The budget covers two computer workstations. One will be installed in the Library as a way to access the digital catalog and will include a 42-inch touch-screen monitor mounted to a wall or stand. This installation will provide students with a more natural way to interact with the models. The second workstation would be mounted on a mobile cart and connected to the 3D scanner. This would allow collection curators from different parts of campus to check out the scanner and scan their collections. The ability to bring the scanner to the collection would increase the likelihood that collections will be scanned into the library collection.
The 3D scanner would either be purchased off the shelf or designed by a student team from the Engineering Department. A solution will be sought that minimizes the amount of training the operator would need. If the scanner is developed in-house, a simple optical scanner such as an Xbox Kinect device and a turntable or robotic arm will be used. Support for the Xbox Kinect is built into Microsoft Visual Studio, making the interface efficient and cost-effective to develop.
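As a rough sketch of the in-house option, the example below uses the open-source Open3D library (an assumption; the proposal does not name a software stack) to turn a point cloud captured by a depth sensor such as the Kinect into a triangle mesh suitable for the digital archive. File names are placeholders.

```python
"""One possible way to turn a captured point cloud into a mesh for the archive.

Uses the open-source Open3D library (an assumption; the proposal does not
specify a software stack). Input/output file names are placeholders.
"""
import open3d as o3d

def point_cloud_to_mesh(input_path: str, output_path: str) -> None:
    # Load the point cloud captured by the scanner (e.g., exported as .ply).
    pcd = o3d.io.read_point_cloud(input_path)
    # Surface reconstruction needs normals; estimate them from neighboring points.
    pcd.estimate_normals()
    # Poisson reconstruction produces a triangle mesh from the oriented points.
    mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=8
    )
    # Save the mesh in a format the Library's kiosk or web viewer can display.
    o3d.io.write_triangle_mesh(output_path, mesh)

if __name__ == "__main__":
    point_cloud_to_mesh("artifact_scan.ply", "artifact_scan_mesh.obj")
```

Whatever tooling is ultimately chosen, the goal is the same: the curator operates the scanner, and the conversion and archiving steps happen without further manual intervention.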
Dr. Miltenoff is part of a workgroup within the academic library, which works with faculty, students and staff on the application of new technologies in education. Dr. Miltenoff’s most recent research with Mark Gill is on the impact of Video 360 on students during library orientation: http://web.stcloudstate.edu/pmiltenoff/bi/
Mark Petzold, Ph.D. mcpetzold@stcloudstate.edu
320-308-4182
Dr. Petzold is an Associate Professor in Electrical and Computer Engineering. His current projects involve visualization of meteorological data in a virtual reality environment and research into student retention issues. He is co-PI on a $5 million NSF S-STEM grant which gives scholarships to low income students and investigates issues around student transitions to college.
Mr. Gill is a Visualization Engineer for the College of Science and Engineering and runs the Visualization Laboratory. He has worked for several major universities as well as Stennis Space Center and Mechdyne, Inc. He holds a Master of Science in Software Engineering.
+++++++++++++
University of Nevada, Reno and Pennsylvania State University campus libraries include collaborative spaces where faculty and students gather to transform virtual ideas into reality.
Maker Commons in the Modern Library
6 Reasons 3D Printers Should Be in Your Library
1. Librarians Know How to Share
2. Librarians Work Well with IT People
3. Librarians Serve Everybody
4. Librarians Can Fill Learning Gaps
5. Librarians Like Student Workers
6. Librarians Are Cross-Discipline
“The #DLFteach Toolkit 2.0 focuses on lesson plans to facilitate disciplinary and interdisciplinary work engaged with 3D technology. As 3D/VR technology becomes relevant to a wide range of scholarly disciplines and teaching context, libraries are proving well-suited to coordinating the dissemination and integration of this technology across the curriculum. For our purposes, 3D technology includes, but is not limited to Augmented Reality (AR) and Virtual Reality (VR) technologies, 3D modeling and scanning software, 3D game engines and WebGL platforms, as well as 3D printers and extruders. While 3D/VR/AR technologies demonstrate real possibilities for collaborative, multidisciplinary learning, they are also fraught with broader concerns prevalent today about digital technologies.”
Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” Some large cities, including San Francisco, have barred police from using facial recognition technology.
But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company, which declined to provide a list.
Facial recognition technology has always been controversial. It makes people nervous about Big Brother. It has a tendency to deliver false matches for certain groups, like people of color. And some facial recognition products used by the police — including Clearview’s — haven’t been vetted by independent experts.
Clearview deployed current and former Republican officials to approach police forces, offering free trials and annual licenses for as little as $2,000. Mr. Schwartz tapped his political connections to help make government officials aware of the tool, according to Mr. Ton-That.
“We have no data to suggest this tool is accurate,” said Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology, who has studied the government’s use of facial recognition. “The larger the database, the larger the risk of misidentification because of the doppelgänger effect. They’re talking about a massive database of random people they’ve found on the internet.”
Part of the problem stems from a lack of oversight. There has been no real public input into adoption of Clearview’s software, and the company’s ability to safeguard data hasn’t been tested in practice. Clearview itself remained highly secretive until late 2019.
The software also appears to explicitly violate policies at Facebook and elsewhere against collecting users’ images en masse.
While there’s underlying code that could theoretically be used for augmented reality glasses that could identify people on the street, Ton-That said there were no plans for such a design.
In May of last year, San Francisco banned facial recognition; the neighboring city of Oakland soon followed, as did Somerville and Brookline in Massachusetts (a statewide ban may follow). In December, San Diego suspended a facial recognition program ahead of a new statewide law declaring it illegal that was about to come into effect. Forty major music festivals pledged not to use the technology, and activists are calling for a nationwide ban. Many Democratic presidential candidates support at least a partial ban on the technology.
facial recognition bans are the wrong way to fight against modern surveillance. Focusing on one particular identification method misconstrues the nature of the surveillance society we’re in the process of building. Ubiquitous mass surveillance is increasingly the norm. In countries like China, a surveillance infrastructure is being built by the government for social control. In countries like the United States, it’s being built by corporations in order to influence our buying behavior, and is incidentally used by the government.
People can be identified at a distance by their heartbeat or by their gait, using a laser-based system. Cameras are so good that they can read fingerprints and iris patterns from meters away. And even without any of these technologies, we can always be identified because our smartphones broadcast unique numbers called MAC addresses.
The data broker industry is almost entirely unregulated; there’s only one law — passed in Vermont in 2018 — that requires data brokers to register and explain in broad terms what kind of data they collect.
Like any augmented reality app, the new AR content in Google Expeditions lets students view and manipulate digital content in a physical world context. The new AR content can be used as components in science, math, geography, history, and art lessons. Some examples of the more than 100 AR tours that you’ll now find in the app include landforms, the skeletal system, dinosaurs, ancient Egypt, the brain, and the Space Race.
To use the AR content available through Google Expeditions, you will need to print marker or trigger sheets that students scan with their phones or tablets. Once scanned, the AR imagery appears on the screen. (You can actually preview some of the imagery without scanning a marker, but the imagery will not be interactive or 3D.) Students don’t need to look through a Cardboard viewer in order to see the AR imagery.