Money and gaming are only the first domains to be pushed completely into a new, all-encompassing ‘reality’ of their own.
2021 also popularized Non-Fungible Tokens (NFTs): private property transferred to virtual reality. If, on a first level, a photo or a film keeps a certain relation to the real, an NFT is a second level of dematerialization of our world and of perceived ‘reality’. It simulates, in a virtual world (itself already a simulacrum), the simulacrum of an image/photo/video, and, as if by magic, it acquires ‘value’ and is deemed proprietary.
The pressures of the coming decade bear more and more on the deterioration of what is left of the real world: climate change, pandemics, automation of work, shrinking populations (first in the West, then in the rest of the world), and scarce resources. The work that produces material things was the first to be automated: first in agriculture, then in manufacturing, and now, finally, in services. In such a decaying material world, continuous growth would no longer be possible; with virtual worlds, it still could be.
There is never enough data.
Super-fast internet (5G), ever more data centers, quantum computing, health trackers on human bodies, brain-machine interfaces, the Internet of Things… The goal is for AI to learn how to reproduce material things in a virtual setting.
The COVID-19 pandemic probably accelerated this ‘metaversing’ of reality by many years.
Consider some of the implications of people spending significantly more time in immersive 3D environments that provide alternative “realities” to the physical world.
Tech CEOs keep talking about “the metaverse.” Mark Zuckerberg insists that Facebook will be seen as a “metaverse company” instead of a social media company. Satya Nadella proclaims Microsoft is creating a “metaverse stack” for the enterprise.
Author Neal Stephenson coined the term “metaverse” in Snow Crash, a dystopian cyberpunk novel published in 1992.
In the novel, the metaverse is a sort of 3D virtual world: not simply a virtual reality game, but a persistent, shared space. Or rather, it is a whole universe of shared virtual spaces seemingly linked together; you could, essentially, teleport between them.
If you think this all sounds a bit like Ready Player One or a higher-tech version of Second Life, you’re right.
Virtual reality (VR), not augmented reality (AR), was necessary for that kind of vision.
To Zuckerberg and other tech CEOs, the concept of “the metaverse” seems to have more in common with “Web 2.0.” It’s a bunch of new technologies: VR headsets! Presence! Persistent digital worlds!
Microsoft’s vision of the metaverse seems to take the form of rambling, buzzword-heavy talk about “digital twins” and about “converging the physical with the digital” via “mixed reality.” Microsoft’s Azure cloud can do it!
The 7th International Conference of the Immersive Learning Research Network (iLRN 2021) will be an innovative and interactive virtual gathering for a strengthening global network of researchers and practitioners collaborating to develop the scientific, technical, and applied potential of immersive learning. It is the premier scholarly event focusing on advances in the use of virtual reality (VR), augmented reality (AR), mixed reality (MR), and other extended reality (XR) technologies to support learners across the full span of learning–from K-12 through higher education to work-based, informal, and lifelong learning contexts.
Following the success of iLRN 2020, our first fully online and in-VR conference, this year’s conference will once again be based on the iLRN Virtual Campus, powered by VirBELA, but with a range of activities taking place on various other XR simulation, gaming, and other platforms. Scholars and professionals working from informal and formal education settings as well as those representing diverse industry sectors are invited to participate in the conference, where they may share their research findings, experiences, and insights; network and establish partnerships to envision and shape the future of XR and immersive technologies for learning; and contribute to the emerging scholarly knowledge base on how these technologies can be used to create experiences that educate, engage, and excite learners.
Note: Last year’s iLRN conference drew over 3,600 attendees from across the globe, making the scheduling of sessions a challenge. This year’s conference activities will be spread over a four-week period so as to give attendees more opportunities to participate at times that are conducive to their local time zones.
##### TOPIC AREAS #####
XR and immersive learning in/for:
Serious Games • 3D Collaboration • eSports • AI & Machine Learning • Robotics • Digital Twins • Embodied Pedagogical Agents • Medical & Healthcare Education • Workforce & Industry • Cultural Heritage • Language Learning • K-12 STEM • Higher Ed & Workforce STEM • Museums & Libraries • Informal Learning • Community & Civic Engagement • Special Education • Geosciences • Data Visualization and Analytics • Assessment & Evaluation
##### SUBMISSION STREAMS & CATEGORIES #####
ACADEMIC STREAM (Refereed paper published in proceedings):
– Full (6-8 pages) paper for oral presentation
– Short paper (4-5 pages) for oral presentation
– Work-in-progress paper (2-3 pages) for poster presentation
– Doctoral colloquium paper (2-3 pages)
PRACTITIONER STREAM (Refereed paper published in proceedings):
– Oral presentation
– Poster presentation
– Guided virtual adventures
– Immersive learning project showcase
NONTRADITIONAL SESSION STREAM (1-2 page extended abstract describing session published in proceedings):
– Workshop
– Special session
– Panel session
##### SESSION TYPES & SESSION FORMATS #####
– Oral Presentation: Pre-recorded video + 60-minute live in-world discussion with others presenting on similar/related topics (groupings of presenters into sessions determined by Program Committee)
– Poster Presentation: Live poster session in 3D virtual exhibition hall; pre-recorded video optional
– Doctoral Colloquium: 60-minute live in-world discussion with other doctoral researchers; pre-recorded video optional
– Guided Virtual Adventures: 60-minute small-group guided tours of various social and collaborative XR/immersive environments and platforms
– Immersive Learning Project Showcase: WebXR space to assemble a collection of virtual artifacts, accessible to attendees throughout the conference
– Workshop: 1- or 2-hour live hands-on session
– Special Session: 30- or 60-minute live interactive session held in world; may optionally be linked to one or more papers
– Panel Session: 60-minute live in-world discussion with a self-formed group of 3-5 panelists (including a lead panelist who serves as a moderator)
Please see the conference website for templates and guidelines.
##### PROGRAM TRACKS #####
Papers and proposals may be submitted to one of 10 program tracks, the first nine of which correspond to the iLRN Houses of application, and the tenth of which is intended for papers making knowledge contributions to the learning sciences, computer science, and/or game studies that are not linked to any particular application area:
Track 1. Assessment and Evaluation (A&E)
Track 2. Early Childhood Development & Learning (ECDL)
Track 4. Inclusion, Diversity, Equity, Access, & Social Justice (IDEAS)
Track 5. K-12 STEM Education
Track 6. Language, Culture, & Heritage (LCH)
Track 7. Medical & Healthcare Education (MHE)
Track 8. Nature & Environmental Sciences (NES)
Track 9. Workforce Development & Industry Training (WDIT)
Track 10. Basic Research and Theory in Immersive Learning (not linked to any particular application area)
##### PAPER/PROPOSAL SUBMISSION & REVIEW #####
Papers for the Academic Stream and extended-abstract proposals for the Nontraditional Session Stream must be prepared in standard IEEE double-column US Letter format using Microsoft Word or LaTeX, and will be accepted only via the online submission system, accessible via the conference website (from which guidelines and templates are also available).
Proposals for the Practitioner Stream are to be submitted via an online form, also accessible from the conference website.
A blind peer-review process will be used to evaluate all submissions.
##### IMPORTANT DATES #####
– Main round submission deadline – all submission types welcome: 2021-01-15
– Notification of review outcomes from main submission round: 2021-04-01
– Late round submission deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions only: 2021-04-08
– Camera-ready papers for proceedings due – Full and short papers: 2021-04-15
– Presenter registration deadline – Full and short papers (also deadline for early-bird registration rates): 2021-04-15
– Notification of review outcomes from late submission round: 2021-04-19
– Camera-ready work-in-progress papers and nontraditional session extended abstracts for proceedings due; final practitioner abstracts for conference program due: 2021-05-03
– Deadline for uploading presentation materials (videos, slides for oral presentations, posters for poster presentations): 2021-05-10
– Conference opening: 2021-05-17
– Conference closing: 2021-06-10
*Full and short papers can only be submitted in the main round.
##### PUBLICATION & INDEXING #####
All accepted and registered papers in the Academic Stream that are presented at iLRN 2021 and all extended abstracts describing the Nontraditional Sessions presented at the conference will be published in the conference proceedings and submitted to the IEEE Xplore® digital library.
Content loaded into Xplore is made available by IEEE to its abstracting and indexing partners, including Elsevier (Scopus, Ei Compendex), Clarivate Analytics (CPCI, part of Web of Science), and others, for potential inclusion in their respective databases. In addition, the authors of selected papers may be invited to submit revised and expanded versions of their papers for possible publication in the IEEE Transactions on Learning Technologies (2019 JCR Impact Factor: 2.714), the Journal of Universal Computer Science (2019 JCR Impact Factor: 0.91), or another Scopus- and/or Web of Science-indexed journal, subject to the relevant journal’s regular editorial and peer-review policies and procedures.
##### CONTACT #####
Inquiries regarding the iLRN 2021 conference should be directed to the Conference Secretariat at conference@immersivelrn.org.
What is AR (and how is it different from VR or MR)? https://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/
p. 225
“augmented reality: bringing artificial objects into the real world; these can be as simple as a ‘heads-up display,’ like a speedometer projected onto your car’s windshield, or as complex as a seemingly virtual creature walking across your real-world living room, casting a realistic shadow on the floor” https://blog.stcloudstate.edu/ims/2018/11/07/can-xr-help-students-learn/
p. 12
Augmented reality provides an “overlay” of some type over the real world through the use of a headset or even a smartphone.
There is no necessary distinction between AR and VR; indeed, much research on the subject is based on a conception of a “virtuality continuum” from entirely real to entirely virtual, where AR lies somewhere between those ends of the spectrum. Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays” (IEICE Transactions on Information and Systems, 1994).
Augmented reality superimposes a digital layer on the world around us, often activated by scanning a trigger image or via GPS (think Pokemon Go!). Virtual reality takes users away from the real world, fully immersing students in a digital experience that replaces reality. Mixed reality takes augmented reality a step further by allowing the digital and real worlds to interact and the digital components to change based on the user’s environment.
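To make the AR/VR distinction concrete, here is a minimal sketch using the WebXR Device API in TypeScript: the same browser API exposes an ‘immersive-ar’ mode (a digital overlay on the real surroundings) and an ‘immersive-vr’ mode (a fully replaced environment), which maps neatly onto the virtuality continuum described above. Browser and headset support varies, so treat this as an illustration rather than production code; the ‘hit-test’ feature name comes from the WebXR specification, while the function name and button id are hypothetical.

```typescript
// Minimal sketch: probing for AR vs. VR support with the WebXR Device API.
// In TypeScript, typings for navigator.xr may require the @types/webxr
// package, so we fall back to `any` here.

async function startImmersiveSession(): Promise<void> {
  const xr = (navigator as any).xr;
  if (!xr) {
    console.log("WebXR is not available in this browser.");
    return;
  }

  // 'immersive-ar' keeps the real world visible and overlays digital content;
  // 'immersive-vr' replaces the surroundings entirely (the other end of the
  // Milgram/Kishino virtuality continuum).
  if (await xr.isSessionSupported("immersive-ar")) {
    const session = await xr.requestSession("immersive-ar", {
      optionalFeatures: ["hit-test"], // lets virtual objects anchor to detected real surfaces
    });
    console.log("AR session started:", session);
  } else if (await xr.isSessionSupported("immersive-vr")) {
    const session = await xr.requestSession("immersive-vr");
    console.log("VR session started:", session);
  } else {
    console.log("No immersive XR mode is supported on this device.");
  }
}

// In practice the session request must come from a user gesture, e.g. a button click:
// document.querySelector("#enter-xr")?.addEventListener("click", startImmersiveSession);
```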
Gamifying Library orientation using Metaverse: https://mtvrs.io/GenerousJubilantEeve
(the gateway to the Library orientation project)
The Metaverse experience is delivered through the user’s phone.
Discuss the following statement:
“Low-end AR (Metaverse), like low-end VR (360 degrees), has strong potential to introduce students, faculty, and staff to immersive teaching and learning.”
As part of our involvement with the Extended Reality Community of Practice, InforMedia Services and SCSU VizLab are offering the following workshops / introductions in augmented and virtual reality:
– Wednesday, March 18, 3PM, MC 205 (directions to MC 205: https://youtu.be/jjpLR3FnBLI ) Intro to 360 Video: easy adoption of virtual reality in your classroom
Plamen Miltenoff will lead an exploration of resources, capturing 360 images and videos, and a hands-on session on creating virtual tours with existing and acquired imagery (a minimal code sketch of a browser-based 360 viewer follows this list).
– Wednesday, March 25, 3PM, MC 205 (directions to MC 205: https://youtu.be/jjpLR3FnBLI ) Intro to Augmented Reality
Alan Srock and Mark Gill will demonstrate the use of the Merge Cube and other augmented reality tools in their courses.
Plamen Miltenoff will lead hands-on session on creating basic AR content with Metaverse.
– Wednesday, April 1, 3PM, MC 205 (directions to MC 205: https://youtu.be/jjpLR3FnBLI ) Intro to Virtual Reality: Mark Gill, Alan Srock, and Plamen Miltenoff will demonstrate AltspaceVR and Virbela.
Hands-on session on creating learning spaces in virtual reality.
These sessions will share ready-to-go resources as well as hands-on creation of materials suitable for most disciplines taught on this campus.
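As a taste of how this kind of ‘low-end VR’ works under the hood, the sketch below shows the standard three.js technique for displaying a 360-degree equirectangular photo in the browser: the image is mapped onto the inside of a sphere and the camera sits at its centre. This is an assumption about one possible approach, not the workshops’ actual materials; the file name panorama.jpg is a placeholder.

```typescript
// Minimal 360-degree photo viewer with three.js (npm install three).
// The equirectangular image is textured onto a sphere whose faces are
// flipped inward, so a camera at the centre sees the panorama all around it.
import * as THREE from "three";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1100
);

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1); // invert the sphere so the texture faces inward

const texture = new THREE.TextureLoader().load("panorama.jpg"); // placeholder 360 image
const material = new THREE.MeshBasicMaterial({ map: texture });
scene.add(new THREE.Mesh(geometry, material));

function animate(): void {
  requestAnimationFrame(animate);
  camera.rotation.y += 0.001; // slow auto-pan; swap in OrbitControls for mouse look
  renderer.render(scene, camera);
}
animate();
```

The same pattern extends to 360 video by swapping the TextureLoader for a VideoTexture, which is one reason 360 media is often treated as the lowest-cost entry point into VR for the classroom.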
Thousands of people, from kids to teachers to big brands, are creating all kinds of augmented reality experiences (games, interactive stories, educational curricula, scavenger hunts, RPGs, and much more!).
ProProfs Brain Games provides templates for building interactive crossword puzzles, jigsaw puzzles, word searches, hangman games, and sliding puzzle games. The games you create can be embedded into your blog or shared via email, social media, or any place that you’d typically post a link for students. If you don’t want to take the time to create your own game, you can browse the gallery of games. Most of the games in the gallery can be embedded into your blog.
ClassTools.net offers templates for creating map-based games, word-sorting games, matching games, and many more common game formats.
Purpose Games is a free service for creating and/or playing simple educational games. The service currently lets users create seven types of games: image quizzes, text quizzes, matching games, fill-in-the-blank games, multiple choice games, shape games, and slide games.
TinyTap is a free iPad app and Android app that enables you to create educational games for your students to play on their iPads or Android tablets. Through TinyTap you can create games in which students identify objects and respond by typing, tapping, or speaking. You can create games in which students complete sentences or even complete a diagram by dragging and dropping puzzle pieces.
Wherever I’ve demonstrated it in the last year, people have been intrigued by Metaverse. It’s a free service that essentially lets you create your own educational versions of Pokemon Go. This augmented reality platform has been used by teachers to create digital breakout games, augmented reality scavenger hunts, and virtual tours.
There was a time when Kahoot games could only be played in the classroom and only created on your laptop. That is no longer the case. Challenge mode lets you assign games to your students to play at home or anywhere else on their mobile devices.