Searching for "mixed reality"

Apple AAPL

Apple AAPL is expected to launch its first virtual reality (VR) headset in 2022, which will be a forerunner of its much-anticipated augmented reality (AR) glasses.

Along with VR features such as a fully simulated 3D digital environment, the device might include limited AR functionality.

Apple’s entry will intensify competition in the VR device market, which includes devices such as Facebook’s FB Oculus Quest 2, Sony’s SNE PlayStation VR, Microsoft’s MSFT Windows Mixed Reality and HTC’s Vive and Vive Pro.

Global spending on AR and VR is expected to reach $72.8 billion in 2024, up from $12 billion in 2020, reflecting a CAGR of 54%.

+++++++++++++
more on Apple Glass in this IMS blog
https://blog.stcloudstate.edu/ims?s=apple+glass

iLRN 2021

CALL FOR PAPERS AND PROPOSALS
iLRN 2021: 7th International Conference of the Immersive Learning Research Network
May 17 to June 10, 2021, on iLRN Virtual Campus, powered by Virbela
… and across the Metaverse!
Technically co-sponsored by the IEEE Education Society,
with proceedings to be submitted for inclusion in IEEE Xplore®
Conference theme: “TRANSCEND: Accelerating Learner Engagement in XR across Time, Place, and Imagination”
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Conference website: https://immersivelrn.org/ilrn2021/
PDF version of this CFP available at: https://bit.ly/3qnFYRu
The 7th International Conference of the Immersive Learning Research Network (iLRN 2021) will be an innovative and interactive virtual gathering for a strengthening global network of researchers and practitioners collaborating to develop the scientific, technical, and applied potential of immersive learning. It is the premier scholarly event focusing on advances in the use of virtual reality (VR), augmented reality (AR), mixed reality (MR), and other extended reality (XR) technologies to support learners across the full span of learning, from K-12 through higher education to work-based, informal, and lifelong learning contexts.
Following the success of iLRN 2020, our first fully online and in-VR conference, this year’s conference will once again be based on the iLRN Virtual Campus, powered by VirBELA, but with a range of activities taking place on various other XR simulation, gaming, and other platforms. Scholars and professionals working from informal and formal education settings as well as those representing diverse industry sectors are invited to participate in the conference, where they may share their research findings, experiences, and insights; network and establish partnerships to envision and shape the future of XR and immersive technologies for learning; and contribute to the emerging scholarly knowledge base on how these technologies can be used to create experiences that educate, engage, and excite learners.
Note: Last year’s iLRN conference drew over 3,600 attendees from across the globe, making the scheduling of sessions a challenge. This year’s conference activities will be spread over a four-week period so as to give attendees more opportunities to participate at times that are conducive to their local time zones.
##### TOPIC AREAS #####
XR and immersive learning in/for:
Serious Games • 3D Collaboration • eSports • AI & Machine Learning • Robotics • Digital Twins • Embodied Pedagogical Agents • Medical & Healthcare Education • Workforce & Industry • Cultural Heritage • Language Learning • K-12 STEM • Higher Ed & Workforce STEM  • Museums & Libraries • Informal Learning • Community & Civic Engagement  • Special Education • Geosciences • Data Visualization and Analytics • Assessment & Evaluation
##### SUBMISSION STREAMS & CATEGORIES #####
ACADEMIC STREAM (Refereed paper published in proceedings):
– Full (6-8 pages) paper for oral presentation
– Short paper (4-5 pages) for oral presentation
– Work-in-progress paper (2-3 pages) for poster presentation
– Doctoral colloquium paper (2-3 pages)
PRACTITIONER STREAM (Refereed paper published in proceedings):
– Oral presentation
– Poster presentation
– Guided virtual adventures
– Immersive learning project showcase
NONTRADITIONAL SESSION STREAM (1-2 page extended abstract describing session published in proceedings):
– Workshop
– Special session
– Panel session
##### SESSION TYPES & SESSION FORMATS #####
– Oral Presentation: Pre-recorded video + 60-minute live in-world discussion with others presenting on similar/related topics (groupings of presenters into sessions determined by Program Committee)
– Poster Presentation: Live poster session in 3D virtual exhibition hall; pre-recorded video optional
– Doctoral Colloquium: 60-minute live in-world discussion with other doctoral researchers; pre-recorded video optional
– Guided Virtual Adventures: 60-minute small-group guided tours of various social and collaborative XR/immersive environments and platforms
– Immersive Learning Project Showcase: WebXR space to assemble a collection of virtual artifacts, accessible to attendees throughout the conference
– Workshop: 1- or 2-hour live hands-on session
– Special Session: 30- or 60-minute live interactive session held in world; may optionally be linked to one or more papers
– Panel Session: 60-minute live in-world discussion with a self-formed group of 3-5 panelists (including a lead panelist who serves as a moderator)
Please see the conference website for templates and guidelines.
##### PROGRAM TRACKS #####
Papers and proposals may be submitted to one of 10 program tracks, the first nine of which correspond to the iLRN Houses of application, and the tenth of which is intended for papers making knowledge contributions to the learning sciences, computer science, and/or game studies that are not linked to any particular application area:
Track 1. Assessment and Evaluation (A&E)
Track 2. Early Childhood Development & Learning (ECDL)
Track 3. Galleries, Libraries, Archives, & Museums (GLAM)
Track 4. Inclusion, Diversity, Equity, Access, & Social Justice (IDEAS)
Track 5. K-12 STEM Education
Track 6. Language, Culture, & Heritage (LCH)
Track 7. Medical & Healthcare Education (MHE)
Track 8. Nature & Environmental Sciences (NES)
Track 9. Workforce Development & Industry Training (WDIT)
Track 10. Basic Research and Theory in Immersive Learning (not linked to any particular application area)
##### PAPER/PROPOSAL SUBMISSION & REVIEW #####
Papers for the Academic Stream and extended-abstract proposals for the Nontraditional Session Stream must be prepared in standard IEEE double-column US Letter format using Microsoft Word or LaTeX, and will be accepted only via the online submission system, accessible via the conference website (from which guidelines and templates are also available).
Proposals for the Practitioner Stream are to be submitted via an online form, also accessible from the conference website.
A blind peer-review process will be used to evaluate all submissions.
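For authors working in LaTeX, a minimal skeleton using the standard IEEEtran conference class might look like the following. This is only a sketch; the official templates and guidelines linked from the conference website are authoritative, and the title, author, and bibliography file names below are placeholders.

```latex
% Minimal IEEE double-column US Letter conference skeleton (IEEEtran class).
\documentclass[conference]{IEEEtran}
\usepackage{graphicx}
\usepackage{cite}

\title{Your iLRN 2021 Paper Title}
\author{\IEEEauthorblockN{First Author}
\IEEEauthorblockA{Affiliation \\ email@example.org}}

\begin{document}
\maketitle

\begin{abstract}
One-paragraph abstract.
\end{abstract}

\section{Introduction}
Body text is typeset automatically in the required double-column format.

\bibliographystyle{IEEEtran}
\bibliography{references} % references.bib
\end{document}
```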
##### IMPORTANT DATES #####
– Main round submission deadline – all submission types welcome: 2021-01-15
– Notification of review outcomes from main submission round: 2021-04-01
– Late round submission deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions only: 2021-04-08
– Camera-ready papers for proceedings due – Full and short papers: 2021-04-15
– Presenter registration deadline – Full and short papers (also deadline for early-bird registration rates): 2021-04-15
– Notification of review outcomes from late submission round: 2021-04-19
– Camera-ready work-in-progress papers and nontraditional session extended abstracts for proceedings due; final practitioner abstracts for conference program due: 2021-05-03
– Presenter registration deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions: 2021-05-03
– Deadline for uploading presentation materials (videos, slides for oral presentations, posters for poster presentations): 2021-05-10
– Conference opening: 2021-05-17
– Conference closing: 2021-06-10
*Full and short papers can only be submitted in the main round.
##### PUBLICATION & INDEXING #####
All accepted and registered papers in the Academic Stream that are presented at iLRN 2021 and all extended abstracts describing the Nontraditional Sessions presented at the conference will be published in the conference proceedings and submitted to the IEEE Xplore® digital library.
Content loaded into Xplore is made available by IEEE to its abstracting and indexing partners, including Elsevier (Scopus, EiCompendex), Clarivate Analytics (CPCI–part of Web of Science) and others, for potential inclusion in their respective databases. In addition, the authors of selected papers may be invited to submit revised and expanded versions of their papers for possible publication in the IEEE Transactions on Learning Technologies (2019 JCR Impact Factor: 2.714), the Journal of Universal Computer Science (2019 JCR Impact Factor: 0.91), or another Scopus and/or Web of Science-indexed journal, subject to the relevant journal’s regular editorial and peer-review policies and procedures.
##### CONTACT #####
Inquiries regarding the iLRN 2021 conference should be directed to the Conference Secretariat at conference@immersivelrn.org.
General inquiries about iLRN may be sent to info@immersivelrn.org.

More on Virbela in this IMS blog
https://blog.stcloudstate.edu/ims?s=virbela

Metaverse for XR COP

Discussion on low-end AR (Metaverse)

  1. What is AR (how is it different from VR or MR)
    https://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/
    p. 225
    “Augmented reality: bringing artificial objects into the real world. These can be as simple as a ‘heads-up display,’ like a speedometer projected onto your car’s windshield, or as complex as a seemingly real virtual creature walking across your real-world living room, casting a realistic shadow on the floor.”
    https://blog.stcloudstate.edu/ims/2018/11/07/can-xr-help-students-learn/
    p. 12
    Augmented reality provides an “overlay” of some type over the real world through the use of a headset or even a smartphone.
    There is no necessary distinction between AR and VR; indeed, much research on the subject is based on a conception of a “virtuality continuum” from entirely real to entirely virtual, where AR lies somewhere between those ends of the spectrum. Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays.”

Augmented Reality

 

 

https://blog.stcloudstate.edu/ims/2018/10/17/vr-ar-learning-materials/

Augmented reality superimposes a digital layer on the world around us, often activated by scanning a trigger image or via GPS (think Pokemon Go!). Virtual reality takes users away from the real world, fully immersing students in a digital experience that replaces reality. Mixed reality takes augmented reality a step further by allowing the digital and real worlds to interact and the digital components to change based on the user’s environment.

  1. Low-end and hi-end AR
    1. Hi-end: Hololens, Google Glass, Apple Glass
      1. Unity-driven content
    2. Low-end: Metaverse
  2. What is Metaverse
        1. Metaverse studio
          https://studio.gometa.io/discover/me
        2. Metaverse app
          1. iOS: https://apps.apple.com/us/app/metaverse-experience-browser/id1159155137
          2. Android: https://play.google.com/store/apps/details?id=com.gometa.metaverse&hl=en&gl=US
        3. Gamifying Library orientation using Metaverse:
          https://mtvrs.io/GenerousJubilantEeve
          (the gateway to the Library orientation project)
          Metaverse experience through the user’s phone:

    1. Student projects using Metaverse
      https://im690group.weebly.com/
      https://mtvrs.io/PreviousImpracticalNandu
    2. Behind the scene, or how does it work
      https://studio.gometa.io/discover/me/a0cc4490-85fb-41d8-849b-bf52ac3ecb70
      YouTube materials:
      https://youtu.be/jLRR6fKtfwY
      https://youtu.be/MLeZo7X5rnA
      https://youtu.be/g9kY41OcR0Y
  3. Discussion
    1. Low-end vs hi-end AR
      1. advantages
      2. disadvantages
    2. gamify learning content with Metaverse
      https://youtu.be/2lUrs3mJSHg
    3. Discuss the following statement:
      low-end AR (Metaverse), like low-end VR (360-degree video), has strong potential to introduce students, faculty, and staff to immersive teaching and learning
  4. Alternatives
    1. Merge Cube: https://blog.stcloudstate.edu/ims/2020/10/21/how-to-create-merge-cube/
    2. Aero, GamAR: https://blog.stcloudstate.edu/ims/2020/12/04/augmented-reality-tools/

++++++++++++++++
more on Metaverse in this IMS blog
https://blog.stcloudstate.edu/ims?s=metaverse

XR Bootcamp Microsoft

For details, go here:
https://www.eventbrite.com/e/behind-the-scenes-with-microsoft-vr-in-the-wild-tickets-128181001827

Behind the Scenes: Microsoft’s Principal Researcher Eyal Ofek speaking about technical and social perspectives of XR

About this Event

The XR Bootcamp Open Lecture Series continues with Microsoft’s Principal Researcher Eyal Ofek!

Agenda:

Virtual reality (VR) and augmented reality (AR) pose challenges and opportunities from both a technical and a social perspective. We can now have digital, rather than physical, objects change our understanding of the world around us. It is a unique opportunity to change reality as we sense it.

Microsoft researchers are looking for new possibilities to extend our abilities when we are not bound by our physical limitations, enabling superhuman abilities on the one hand and leveling the playing field for people with physical limitations on the other.

Dr. Ofek will describe efforts to design VR and AR applications that adjust to the user’s uncontrolled environment, enabling continuous use during work and leisure across a large variance of environments. He will also review efforts to extend rendering to new capabilities, such as haptic rendering.

His lecture will be followed by a Q&A session where you can ask all your questions about the topic.

Lead Instructors:

Eyal Ofek is a principal researcher at the Microsoft Research lab in Redmond, WA. His research interests include Augmented Reality (AR)/Virtual Reality (VR), Haptics, interactive projection mapping, and computer vision for human-computer interaction. He is also the Specialty Chief Editor of Frontiers in Virtual Reality, for the area of Haptics and an Assoc. Editor of IEEE Computer Graphics and Application (CG&A).

Prior to joining Microsoft Research, he obtained his Ph.D. at the Hebrew University of Jerusalem and founded several computer graphics companies, including a successful drawing and photo-editing application and a developer of the world’s first time-of-flight video cameras, which were a basis for the HoloLens depth camera.

This event is part of the Global XR Bootcamp event:

The Global XR Bootcamp 2020 will be the biggest community-driven, FREE, online Virtual, Augmented and Mixed Reality event in the world! Join us on YouTube or AltspaceVR for a 24-hour live stream with over 50 high-quality talks, panels and sessions. Meet your fellow XR enthusiasts in our Community Zone, and win amazing prizes, from vouchers to XR hardware.

++++++++++++++++++++
more on XR in this IMS blog
https://blog.stcloudstate.edu/ims?s=xr

Apple Glass

‘Apple Glass’ users may be able to manipulate AR images with any real object from r/gadgets

https://appleinsider.com/articles/20/07/30/apple-glass-users-may-be-able-to-manipulate-ar-images-with-any-real-object

With AR, and especially with what Apple refers to as mixed reality (MR), it’s great to be able to see an iPad Pro in front of you, but you need to be able to use it. You have to be able to pick up a virtual object and use it; otherwise AR is no better than a 3D movie.

Apple’s proposed solution is described in “Manipulation of Virtual Objects using a Tracked Physical Object,” a patent application filed in January 2020 but only revealed this week. It suggests truly mixing realities, in that a virtual object could be mapped onto an actual object in the real world.

+++++++++++++++
more on Apple Glass in this IMS blog
https://blog.stcloudstate.edu/ims?s=apple+glass

ARLearn

Ternier, S., Klemke, R., Kalz, M., Van Ulzen, P., & Specht, M. (in press). ARLearn: augmented reality meets augmented virtuality [Special issue]. Journal of Universal Computer Science – Technology for learning across physical and virtual spaces.

https://www.academia.edu/29464704/ARLearn_augmented_reality_meets_augmented_virtuality

Augmented reality (AR) and AR games offer a unique opportunity to implement this core idea in linking real-world situations and problems with learning support. The theory of situated learning [Lave & Wenger, 90] is grounded on the assumption that learners do not learn via the plain acquisition of knowledge but via active participation in frameworks and social contexts with a specific social engagement structure. Kolb’s learning cycle [Kolb, 84] and the concept of experiential learning discuss …

de Freitas stresses the importance of linking the experiences made in a game, simulation or micro world with their application in real-world practices [de Freitas, 06]. [Brown & Cairns, 04] describe game immersion as a continuum from engagement over engrossment to total immersion.

Despite the huge potential of immersive games to overcome the gap between the real world and the educational context, and the rising market for electronic games [PWC, 10], the use of technology-enhanced immersive games in education is still quite low. The reasons for this are manifold:
● high game development costs meet limited educational budgets [Westera et al., 08]
● predefined games are hard to integrate into the educational process [Klopfer, Osterweil & Salen, 09]
● learner support in online games does not easily scale [Van Rosmalen et al., 08]
● furthermore, game platforms up to now could not easily be integrated with real-world environments.

mixed reality definition

 

augmented reality browsers like Layar and Wikitude

first mashups for Google StreetView (called StreetLearn) and for mobile devices which use the Android Google Maps API (called ARLearn). StreetLearn is intended to provide an augmented virtuality environment on a desktop, while mobile devices are provided with an augmented reality experience through ARLearn. By creating scripts, adding interactive elements and introducing gamification elements, we believe that we can increase the learner’s motivation and provide a richer learning experience linking mobile augmented reality and augmented virtuality.

freely available tools and offers an open REST API. From the end-user point of view, playing games is easy and requires no special knowledge. Creating scripts requires no programming skills, but it still demands some technical background, as scripts are edited in either JSON or XML.
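To give a feel for what editing such a script involves, here is a minimal sketch in Python that builds a game script as JSON. The field names (title, items, dependsOn, etc.) are illustrative assumptions, not the actual ARLearn schema, and the upload endpoint is deliberately omitted.

```python
import json

# Hypothetical ARLearn-style game script; the field names below are
# illustrative assumptions, not the real ARLearn schema.
script = {
    "title": "Campus Heritage Tour",
    "items": [
        {
            "type": "NarratorItem",
            "name": "Welcome",
            "text": "Find the oldest building on campus.",
            "dependsOn": None,  # visible from the start
        },
        {
            "type": "MultipleChoiceQuestion",
            "name": "Quiz 1",
            "text": "In which year was it built?",
            "answers": ["1869", "1905", "1948"],
            "correct": 0,
            "dependsOn": "Welcome",  # unlocked after the narrator item
        },
    ],
}

# Serialize the script for upload via the REST API.
payload = json.dumps(script, indent=2)
print(payload)
```

An author without programming skills would edit text like `payload` directly, which is why the paper notes that some technical background is still required.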

IM 690 lab ASVR

IM 690 Virtual Reality and Augmented Reality. short link: http://bit.ly/IM690lab

IM 690 lab plan for March 31, online:  Virtual Worlds

If at any point you are lost in the virtual worlds, please consider talking/chatting via our IM 690 Zoom link: https://minnstate.zoom.us/j/964455431, or call 320-308-3072

Readings:
Currently, if you go to the SCSU online databases (if they are working at all), don’t be surprised when clicking on EBSCOhost Business Source Complete to see this message:

library error msg

and if you execute a search:
“AltSpaceVR” + “education”, you will find only a meager handful of results.
Google Scholar, naturally, will yield a much greater number.
So, search for and find an article of interest using Google Scholar. I used “immersive learning” + “education” for my search.
I chose to read this article:
https://journal.alt.ac.uk/index.php/rlt/article/view/2347/2657
since it addressed design principles when applying mixed reality in education.
Which article did you find, read, and are ready to share your analysis of?

Tuesday, March 31, 5PM lab

  1. As usual, we will meet at this Zoom link: https://minnstate.zoom.us/j/964455431
    All of us will be online and we will meet in the Zoom room.
    Please come 10 min earlier, so we can check our equipment and make sure everything works. Since we will be exploring online virtual worlds, please be prepared for technical issues, especially with microphones.
  2. For this lab, please download and install on your computers the AltSpaceVR  (ASVR) software:
    https://www.microsoft.com/en-us/p/altspacevr/9nvr7mn2fchq?activetab=pivot:overviewtab
    Please note that Microsoft has made the 2D mode for PC available only for Windows. If you are a Mac user and don’t have a PC available at home, please contact me directly for help.
    In addition, here is a link to the video tutorial:
    https://blog.stcloudstate.edu/ims/2020/03/13/im690-asvr-2d-tutorial/
    Please be aware of the MediaSpace issues of the last two weeks, which can result in poor rendering of the video. If issues persist and you still need help downloading and installing the software, contact me directly for help.
    Please do your best to have ASVR installed on your computer before the lab starts on Tues, March 31, 5PM, so we can use our time during the lab for much more fun activities!
  3. Intro to ASVR.
    Please watch this 5 min video anytime you feel a bit lost in ASVR

    Please bear with the MediaSpace issues and be patient if the video renders slowly and/or does not play right away. The video is meant to help you learn how to navigate your avatar in ASVR.
    For the first 15-20 minutes of the lab, we will “meet” in ASVR and figure out how to work on our ASVR avatars and how to use the computer keyboard to move, communicate, and manage basic dexterity. We must learn to “make friends” with Mark Gill (ASVR name: MarkGill47), Dr. Park (ASVR name: dhk3600) and Dr. Miltenoff (ASVR name: Plamen), as well as with your class peers, who will be sharing their ASVR contact info in the Zoom chat session. Once we learn these skills, we are ready to explore ASVR.
    Mark Gill will “lead” us through several virtual worlds, which you will observe and assess from the point of view of an instructional designer and an educator (e.g., how these worlds can accommodate learning, what types of teaching these virtual worlds offer, etc.).
    Eventually, Mark Gill will bring us to the SCSU COSE space, created by him, where he will leave us to discuss.
  4. Discussion in the COSE ASVR room
    We will start our discussion with you sharing your analysis of the article you found in Google Scholar for today’s class (see above Readings). How do your findings from the article match your impressions from the tour across virtual worlds in ASVR? How does learning happen?
  5. Other platforms for immersive learning
    Following the discussions around your articles, we also will briefly touch on other platforms for immersive learning:
    https://blog.stcloudstate.edu/ims/2020/03/17/vr-after-conferences-cancellations/
  6. Final projects
    The rest of the time in the lab will be allocated to work on your final projects.
    Dr. Park and Dr. Miltenoff will work individually with your groups to assist with ideas and questions regarding your projects.

+++++++++++
Plamen Miltenoff, Ph.D., MLIS
Professor
320-308-3072
pmiltenoff@stcloudstate.edu
http://web.stcloudstate.edu/pmiltenoff/faculty/
schedule a meeting: https://doodle.com/digitalliteracy
find my office: https://youtu.be/QAng6b_FJqs

++++++++++++++++++
more on IM 690 labs in this IMS blog
https://blog.stcloudstate.edu/ims?s=im+690

2020 Immersive Learning Technology


https://www.jff.org/what-we-do/impact-stories/jfflabs-acceleration/2020-immersive-learning-technology/

2020-Immersion-012420 per Mark Gill’s finding

Technology is rapidly changing how we learn and grow. More and more, tools and platforms that make use of virtual reality (VR), augmented reality (AR), and extended reality (ER), collectively known as immersive learning technology, are moving from the niche world of Silicon Valley into retail stores, warehouses, factory floors, and classrooms, as well as corporate education and training programs. The value is clear: these immersive learning tools help companies, training providers, and educators train workers better, faster, and more efficiently. Of course, the impact doesn’t stop at the bottom line. Immersive learning presents an opportunity to reliably train employees for situations that are expensive to support, challenging to replicate, and even dangerous. And it can be done efficiently, safely, and with better learning outcomes.

1 in every 3 small and mid-size businesses in the U.S. is expected to be piloting a VR employee training program by 2021, seeing their new hires reach full productivity 50% faster as a result.

The worldwide AR and VR market size is forecast to grow nearly 7.7 times between 2018 and 2022.

14 million AR and VR devices are expected to be sold in 2019

By 2023, enterprise VR hardware and software revenue is expected to jump 587% to $5.5 billion, up from an estimated $800 million in 2018.

Virtual Reality VR  A computer-generated experience that simulates reality. VR may include visual, auditory, or tactile experiences.

Augmented Reality AR A live experience of a physical space, where computer-enhanced visualizations, sounds, or tactile experiences overlay the real-world environment.

Mixed Reality MR A blend of virtual experiences and the real world, where virtual and augmented experiences are presented simultaneously.

Extended Reality ER  An immersive experience involving interactions with the real world, virtual reality, augmented reality, as well as other machines or computers adding content to the experience.

Soft Skills. Immersive learning technologies can help people develop human skills, such as empathy, customer service, improving diversity and inclusion, and other areas.

Technical Skills.  Immersive learning technologies enable workers to learn through simulated experiences, providing the opportunity for risk-free repetition of complex or dangerous technical tasks.

+++++++++++++
more on immersive learning in this IMS blog
https://blog.stcloudstate.edu/ims?s=immersive+learning

Educators in VR

Info on all presentations: https://account.altvr.com/channels/1182698623012438188

Charlie Fink: Setting the Table for the Next Decade in XR

Translating Training Requirements into Immersive Experience

Virtual Reality Technologies for Learning Designers

Virtual Reality Technologies for Learning Designers Margherita Berti

$$$$$$$$$$$$$$$$$$$$$$

Technology Acceptance and Learning Process Victoria Bolotina part 1

Technology Acceptance and Learning Process Victoria Bolotina part 2

Assessment of Learning Activities in VR Evelien Ydo part 2

++++++++++++++++++++++++++++++++++++++++

VR: So Much More Than a Field Trip Shannon Putman, Graduate Assistant/PhD Student, University of Louisville SPED special education https://account.altvr.com/events/1406092840622096803

++++++++++++++++++++++++++++++

VR and Health Professionals Rob Theriault

+++++++++++++++++++++++

Transform Your History Lessons with AR and VR Michael Fricano II

++++++++++++++++++++++++++++

Transform Your History Lessons with AR and VR Michael Fricano II, Technology Integration Specialist https://www.arvreduhub.com/transform-history

Qlone App for 3D scanning

++++++++++++++++++++++++++++++++++++++

2020 Educators in VR International Summit

The 2020 Educators in VR International Summit is February 17-22. It features over 170 speakers in 150+ events across multiple social and educational platforms including AltspaceVR, ENGAGE, rumii, Mozilla Hubs, and Somnium Space.

The event requires no registration and is virtual only, free, and open to the public. Platform access is required, so please install one of the above platforms to attend the International Summit. You may attend in 2D on a desktop or laptop computer with headphones and a microphone (a USB gaming headset is recommended), or with a VR device such as the Oculus Go, Quest, or Rift, the Vive, and other mobile and tethered devices. Please note the specifications and requirements of each platform.

The majority of our events are on AltspaceVR. AltspaceVR is available for Samsung Gear, the Steam Store for HTC Vive, Windows Mixed Reality, and the Oculus Store for Rift, Go, and Quest users. Download and install the 2D version for use on your Windows desktop computer.

Charlie Fink, author, columnist for Forbes magazine, and Adjunct Faculty member of Chapman University, will be presenting “Setting the Table for the Next Decade in XR,” discussing the future of this innovative and immersive technology, at the 2020 Educators in VR International Summit. He will be speaking in AltspaceVR on Tuesday, February 18 at 1:00 PM EST.

International Summit

Setting the Table for the Next Decade in XR 1PM, Tues, Feb 18 https://account.altvr.com/events/1406089727517393133

Finding a New Literacy for a New Reality 5PM, Tues, Feb 18

https://account.altvr.com/events/1406093036194103494 schedule for new literacy

Finding a New Literacy for a New Reality

Dr. Sarah Jones, Deputy Dean, De Montfort University

This workshop with Dr. Sarah Jones will focus on developing a relevant and new literacy for virtual reality, including the core competencies and skills needed to develop and understand how to become an engaged user of the technology in a meaningful way. The workshop will develop into research for a forthcoming book on Uncovering a Literacy for VR due to be published in 2020.

Sarah is listed as one of the top 15 global influencers within virtual reality. After nearly a decade in television news, Sarah began working in universities focusing on future media, future technology and future education. Sarah holds a PhD in Immersive Storytelling and has published extensively on virtual and augmented reality, whilst continuing to make and create immersive experiences. She has advised the UK Government on Immersive Technologies and delivers keynotes and speaks at conferences across the world on imagining future technology. Sarah is committed to diversifying the media and technology industries and regularly champions initiatives to support this agenda.

Inter-cognitive and Intra-cognitive Communication in Virtual Reality

Inter-cognitive and Intra-cognitive Communication in Virtual Reality

Michael Vallance, Professor, Future University Hakodate

Currently there are limited ways to connect 3D VR environments to physical objects in the real world whilst simultaneously conducting communication and collaboration between remote users. Within the context of a solar power plant, the performance metrics of the site are invaluable for environmental engineers who are remotely located. Often two or more remotely located engineers need to communicate and collaborate to solve a problem. If a solar panel component is damaged, the repair often needs to be undertaken on-site, thereby incurring additional expenses. This communication is classified as inter-cognitive or intra-cognitive: inter-cognitive communication, where information transfer occurs between two cognitive entities with different cognitive capabilities (e.g., between a human and an artificially cognitive system); and intra-cognitive communication, where information transfer occurs between two cognitive entities with equivalent cognitive capabilities (e.g., between two humans) [Baranyi and Csapo, 2010]. Currently, non-VR solutions offer a comprehensive analysis of solar plant data. A regular PC with a monitor currently has advantages over 3D VR. For example, sensors can be monitored using dedicated software such as EPEVER or via a web browser, as exemplified by the comprehensive service provided by Elseta. But when multiple users are able to collaborate remotely within a three-dimensional virtual simulation, the opportunities for communication, training and academic education will be profound.

Michael Vallance Ed.D. is a researcher in the Department of Media Architecture, Future University Hakodate, Japan. He has been involved in educational technology design, implementation, research and consultancy for over twenty years, working closely with Higher Education Institutes, schools and media companies in UK, Singapore, Malaysia and Japan. His 3D virtual world design and tele-robotics research has been recognized and funded by the UK Prime Minister’s Initiative (PMI2) and the Japan Advanced Institute of Science and Technology (JAIST). He has been awarded by the United States Army for his research in collaborating the programming of robots in a 3D Virtual World.

Create Strategic Snapchat & Instagram AR Campaigns

Dominique Wu, CEO/Founder, Hummingbirdsday

Augmented reality lenses are popular among young people thanks to Snapchat's invention. Businesses are losing money by not fully using social media that targets young people (14-25). In this presentation, Dominique Wu will show how businesses can generate more leads through Spark AR (Facebook AR/Instagram AR) and Snapchat AR Lenses, and how to create strategic Snapchat and Instagram AR campaigns.

Dominique Wu is an XR social media strategist and expert in UX/UI design. She has her own YouTube and Apple Podcast show called “XReality: Digital Transformation,” covering the technology and techniques of incorporating XR and AR into social media and marketing and integrating them into enterprise solutions.

Mixed Reality in Classrooms Near You

Mark Christian, EVP, Strategy and Corporate Development, GIGXR

Mixed reality devices like the HoloLens are transforming education now. Mark Christian will discuss how the technology is no longer about edge use cases or proofs of concept, but about real, usable products already at universities transforming the way we teach and learn. Christian will talk about GIGXR's products, the story of how they were developed, and what the research says about their efficacy. It is time to move to adoption of XR technology in education; learn how one team has made this a reality.

As CEO of forward-thinking virtual reality and software companies, Mark Christian employs asymmetric approaches to rapid, global market adoption, hiring, diversity and revenue. He prides himself on unconventional approaches to building technology companies.

Designing Educational Content in VR

Avinash Gyawali, VR Developer, Weaver Studio

Virtual Reality is an effective medium for educating students only if it is done right. Whether VR is considered a gimmick is determined by how developers design and develop the software applications, not by hardware limitations. Avinash Gyawali will give insights into developing VR educational content designed specifically for lower secondary school students, as well as into developing games in the Unity3D game engine.

A game and VR developer with over three years of experience in game development, Avinash Gyawali is the developer of Zombie Shooter, winner of various national awards in the gaming and entertainment category, and of EDVR, an immersive voice-controlled VR experience specially designed for children aged 10-18.

8:00 AM PST Research Virtual Reality Technologies for Learning Designers Margherita Berti ASVR

Virtual Reality Technologies for Learning Designers

Margherita Berti

Virtual Reality (VR) is a computer-generated experience that simulates presence in real or imagined environments (Kerrebrock, Brengman, & Willems, 2017). VR promotes contextualized learning, authentic experiences, critical thinking, and problem-solving opportunities. Despite the great potential and popularity of this technology, the latest two editions of the Educause Horizon Report (2018, 2019) have argued that VR remains “elusive” in terms of mainstream adoption. The reasons are varied, including expense and the lack of empirical evidence for its effectiveness in education. More importantly, examples of successful VR implementations for instructors who lack technical skills are still scarce. Margherita Berti will discuss a range of easy-to-use educational VR tools, examples of VR-based activities, and the learning theories and instructional design principles utilized for their development.

Margherita Berti is a doctoral candidate in Second Language Acquisition and Teaching (SLAT) and Educational Technology at the University of Arizona. Her research specialization resides at the intersection of virtual reality, the teaching of culture, and curriculum and content development for foreign language education.

Wed 11:00 AM PST Special Event Gamifying the Biblioverse with Metaverse Amanda Fox VR Design / Biblioverse / Training & Embodiment ASVR

Gamifying the Biblioverse with Metaverse

Amanda Fox, Creative Director of STEAMPunks/MetaInk Publishing, MetaInk Publishing

There is a barrier between an author and readers of his/her books. The author’s journey ends, and the reader’s begins. But what if as an author/trainer, you could use gamification and augmented reality(AR) to interact and coach your readers as part of their learning journey? Attend this session with Amanda Fox to learn how the book Teachingland leverages augmented reality tools such as Metaverse to connect with readers beyond the text.

Amanda Fox is Creative Director of STEAMPunksEdu and author of Teachingland: A Teacher’s Survival Guide to the Classroom Apocalypse and Zom-Be A Design Thinker. Check her out on the Virtual Reality Podcast, or connect with her on Twitter @AmandaFoxSTEM.

Wed 10:00 AM PST Research Didactic Activity of the Use of VR and Virtual Worlds to Teach Design Fundamentals Christian Jonathan Angel Rueda VR Design / Biblioverse / Training & Embodiment ASVR

Didactic Activity of the Use of VR and Virtual Worlds to Teach Design Fundamentals

Christian Jonathan Angel Rueda, research professor, Autonomous University of Queretaro (Universidad Autónoma de Querétaro)

Christian Jonathan Angel Rueda specializes in didactic activities that use virtual reality and virtual worlds to teach the fundamentals of design. He shares the development of a course that included recreating, in a three-dimensional environment, the fundamentals learned in class; a demonstration of all the works developed throughout the semester, applying design-foundation knowledge to present them creatively; and a final class project scenario that connected the scenes in which students showed their work throughout the semester.

Christian Jonathan Angel Rueda is a research professor at the Autonomous University of Queretaro in Mexico. With a PhD in educational technology, Christian has published several papers on the intersection of education, pedagogy, and three-dimensional immersive digital environments. He is also an edtech, virtual reality, and social media consultant at Eco Onis.

Thu 11:00 AM PST vCoaching Closing the Gap Between eLearning and XR Richard Van Tilborg XR eLearning / Laughter Medicine ASVR

Closing the Gap Between eLearning and XR

Richard Van Tilborg, founder, CoVince

Richard Van Tilborg discusses how to bridge the gap between eLearning and XR by combining brain insights with new technologies. He presents training and education cases realised with the CoVince platform: journeys that start on your mobile device and continue in VR, along with the possibility to earn from your creations and a central distribution place for learning content and data.

Richard Van Tilborg works with the CoVince platform, a VR platform offering training and educational programs with central distribution of learning content and data. He is an author and speaker whose focus includes computers in education and virtual reality-based tasks for delivering feedback.

Thu 12:00 PM PST Research Assessment of Learning Activities in VR Evelien Ydo Technology Acceptance / Learning Assessment / Vaping Prevention ASVR
Thu 6:00 PM PST Down to Basics Copyright and Plagiarism Protections in VR Jonathan Bailey ASVR

Thu 8:00 PM PST Diversity Cyberbullying in VR John Williams, Brennan Hatton, Lorelle VanFossen ASVR

AI and XR and Educational Gaming

AI and Mixed Reality Drive Educational Gaming into ‘Boom Phase’

By Dian Schaffhauser 09/16/19

https://campustechnology.com/articles/2019/09/16/ai-and-mixed-reality-drive-educational-gaming-into-boom-phase.aspx

Artificial intelligence and mixed reality have driven demand for learning games around the world, according to a new report by Metaari, an analyst firm that tracks advanced learning technology. Its five-year forecast predicts that educational gaming will reach $24 billion by 2024, a quadrupling of revenues at a compound annual growth rate of 33 percent.
