Like any augmented reality app, the new AR content in Google Expeditions lets students view and manipulate digital content in a physical world context. The new AR content can be used as components in science, math, geography, history, and art lessons. Some examples of the more than 100 AR tours that you’ll now find in the app include landforms, the skeletal system, dinosaurs, ancient Egypt, the brain, and the Space Race.
To use the AR content available through Google Expeditions, you will need to print marker or trigger sheets that students scan with their phones or tablets. Once scanned, the AR imagery appears on the screen. (You can actually preview some of the imagery without scanning a marker, but the imagery will not be interactive or 3D.) Students don’t need to look through a Cardboard viewer in order to see the AR imagery.
In his book, “Experience on Demand,” Jeremy Bailenson, the founding director of Stanford University’s Virtual Human Interaction Lab, writes, “No medium, of course, can fully capture the subjective experience of another person, but by richly evoking a real-seeming, first-person experience, virtual reality does seem to promise to offer new, empathy-enhancing qualities.” Bailenson contrasts experiencing virtual reality with reading news accounts and watching documentaries. Those latter activities, he writes, require “a lot of imaginative work,” whereas virtual reality can “convey the feeling” of, say, a refugee camp’s environment, and the “smallness of the living quarters, the size of the camp.”
Caldwell—who used Google Expeditions to deliver a virtual reality experience set in the Holocaust—says that when his students first put on the goggles, they viewed them as a novelty. But within a minute or two, the students became quiet, absorbed in what they were seeing; they realized the “reality of the horror of what was in front of them.” Questions ensued.
Ron Berger, the Chief Academic Officer of EL Education, points to another factor schools should consider. He thinks virtual reality can be a powerful way to introduce kids to situations that require empathy or adopting different perspectives. However, he thinks no one tool or experience will bring results unless it is “nested in a broader framework of a vision and goals and relationships.”
Berger says virtual reality experiences have to be accompanied by work beforehand and follow-up afterwards. Kids, he says, need to be reflective and think critically.
Berger also says immersion experiences like virtual reality should be “embedded in positive” adult and peer relationships. He adds that ideally, there’s also a resulting action where kids do something productive with the information they’ve learned, to help their own growth and to help others. He mentions an example where students interviewed local immigrants and refugees, then wrote the stories they heard. They published the stories in a book, and the profits went to legal fees for local refugees.
Bailenson recommends saving virtual reality for “very special experiences,” keeping it “relatively short” and not getting students dizzy or disoriented. A report Bailenson co-authored for Common Sense Media highlights the research that has—and has not—explored the effects of virtual reality on children. It states that the “potentially negative outcomes of VR include impacts on children’s sensory systems and vision, aggression, and unhealthy amounts of escapism and distraction from the physical world.”
The Brain Science Is In: Students’ Emotional Needs Matter
What the neuro-, cognitive, and behavioral research says about social-emotional learning
• Malleability: Genes are not destiny. Our developing brains are largely shaped by our environments and relationships—a process that continues into adulthood.
• Context: Family, relationships, and lived experiences shape the physiological structure of our brains over time. Healthy amounts of challenge and adversity promote growth, but toxic stress takes a toll on the connections between the hemispheres of our brain.
• Continuum: While we’ve become familiar with the exponential development of the brain in young children, brain development continues throughout life. The explosion of brain growth into adolescence and early adulthood, in particular, demands far more intentional approaches to supporting that development than are common today.
While the big-river scientists work on launching satellites to keep an eye on the world’s giant rivers and lakes, the best monitoring device for these little streams remains people, walking around on the ground looking for streams instead of Pokemon — especially in dry states like this one.
The augmented reality game Pokémon Go took the world by storm in the summer of 2016. City landscapes were decorated with amusing, colourful objects called Pokémon, and holiday activities were enhanced by catching these wonderful creatures. In light of this, it is inevitable for mobile language learning researchers to reflect on the impact of this game on learning and how it may be leveraged to enhance the design of mobile and ubiquitous technologies for mobile and situated language learning. This paper analyses the game Pokémon Go and the players’ experiences according to a framework developed for evaluating mobile language learning and discusses how Pokémon Go can help to meet some of the challenges faced by earlier research activities.
A comparison between Pokémon Go (PG) and Geocaching will illustrate the evolution of the concept of location-based games, a concept that is very close to that of situated learning, which we have explored in several previous works.
Pokémon Go is a free, location-based augmented reality game developed for mobile devices. Players use GPS on their mobile device to locate, capture, battle, and train virtual creatures (a.k.a. Pokémon), which appear on screen overlaying the image seen through the device’s camera. This makes it seem like the Pokémon are in the same real-world location as the player.
“Put simply, augmented reality is a technology that overlays computer generated visuals over the real world through a device camera, bringing your surroundings to life and interacting with sensors such as location and heart rate to provide additional information” (Ramirez, 2014).
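The core location-based mechanic described above — using the player’s GPS position to decide which virtual creatures should appear as camera overlays — can be made concrete with a small sketch. The Python below is a hypothetical illustration, not the game’s actual code; the function names `haversine_m` and `creatures_in_range` and the 70-meter trigger radius are assumptions chosen for the example. It uses the standard haversine formula to compute the great-circle distance between two GPS coordinates.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (latitude, longitude) points."""
    R = 6371000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def creatures_in_range(player, spawns, radius_m=70):
    """Return the spawn points close enough to the player to render as AR overlays.

    player: (lat, lon) of the device's GPS fix
    spawns: list of (lat, lon) positions of virtual creatures
    """
    return [s for s in spawns
            if haversine_m(player[0], player[1], s[0], s[1]) <= radius_m]
```

At the equator, 0.001° of longitude is roughly 111 meters, so a creature at that offset falls outside the assumed 70 m radius while one at 0.0005° (about 56 m) falls inside it — mirroring how a location-based game only surfaces nearby content.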
We apply the evaluation framework developed in 2015 for mobile learning applications (Cacchione, Procter-Legg, Petersen, & Winter, 2015). The framework is composed of a set of factors of different natures — neuroscientific, technological, organisational, and pedagogical — and aims to provide a comprehensive account of what plays a major role in ensuring effective learning via mobile devices.
By creating engaging 360° tours, students are not only learning these new tools for themselves but are also helping local organizations see the possibility of VR for marketing and public relations.
Here are some key takeaways from the projects that we have seen:
Let the students lead: In all of these projects, students are taking the initiative. The institutions are providing the technology, the space, organizational vision, and in some cases, academic credit. At NYU Tandon, students organized the entire conference, doing publicity, registration, catering, and scheduling (see figure 4). They brought in a diverse group of speakers from academic, tech, and startup backgrounds. The event included TED-style spotlights, talks, workshops, and demos.
Don’t compromise on space: Brown University’s Granoff Center for the Creative Arts is designed to encourage cross-discipline collaboration. The Tandon event used the main auditorium and the flagship NYU MakerSpace. Space influences behavior and is crucial in driving collaboration and active participation. In addition, to produce VR and AR/MR experiences students need access to high-end technology and, in some cases, motion-capture studios and 360° cameras.
Create opportunities for social impact: Many of these programs are open to the local community or have been designed to have an impact outside higher education. At Emporia State, students are using VR and 360° video to help local businesses. The Gaspee Affair VR experience at Brown University will become a resource for teaching middle and high school students.
Showcase student work: So often in education, the work students do in a course is seen only by others in the same class. Like the example at Texas A&M, all of these experiences have a connection with their campus or larger community. VR and AR engender a level of excitement that gets students engaged with each other and encourages peer learning. It’s worth seeking out opportunities to bring this work to community events.
More on VR in education in this IMS blog.