
iLRN 2021

CALL FOR PAPERS AND PROPOSALS
iLRN 2021: 7th International Conference of the Immersive Learning Research Network
May 17 to June 10, 2021, on iLRN Virtual Campus, powered by Virbela
… and across the Metaverse!
Technically co-sponsored by the IEEE Education Society,
with proceedings to be submitted for inclusion in IEEE Xplore®
Conference theme: “TRANSCEND: Accelerating Learner Engagement in XR across Time, Place, and Imagination”
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Conference website: https://immersivelrn.org/ilrn2021/
PDF version of this CFP available at: https://bit.ly/3qnFYRu
The 7th International Conference of the Immersive Learning Research Network (iLRN 2021) will be an innovative and interactive virtual gathering for a strengthening global network of researchers and practitioners collaborating to develop the scientific, technical, and applied potential of immersive learning. It is the premier scholarly event focusing on advances in the use of virtual reality (VR), augmented reality (AR), mixed reality (MR), and other extended reality (XR) technologies to support learners across the full span of learning–from K-12 through higher education to work-based, informal, and lifelong learning contexts.
Following the success of iLRN 2020, our first fully online and in-VR conference, this year’s conference will once again be based on the iLRN Virtual Campus, powered by VirBELA, but with a range of activities taking place on various other XR simulation, gaming, and other platforms. Scholars and professionals working from informal and formal education settings as well as those representing diverse industry sectors are invited to participate in the conference, where they may share their research findings, experiences, and insights; network and establish partnerships to envision and shape the future of XR and immersive technologies for learning; and contribute to the emerging scholarly knowledge base on how these technologies can be used to create experiences that educate, engage, and excite learners.
Note: Last year’s iLRN conference drew over 3,600 attendees from across the globe, making the scheduling of sessions a challenge. This year’s conference activities will be spread over a four-week period so as to give attendees more opportunities to participate at times that are conducive to their local time zones.
##### TOPIC AREAS #####
XR and immersive learning in/for:
Serious Games • 3D Collaboration • eSports • AI & Machine Learning • Robotics • Digital Twins • Embodied Pedagogical Agents • Medical & Healthcare Education • Workforce & Industry • Cultural Heritage • Language Learning • K-12 STEM • Higher Ed & Workforce STEM  • Museums & Libraries • Informal Learning • Community & Civic Engagement  • Special Education • Geosciences • Data Visualization and Analytics • Assessment & Evaluation
##### SUBMISSION STREAMS & CATEGORIES #####
ACADEMIC STREAM (Refereed paper published in proceedings):
– Full (6-8 pages) paper for oral presentation
– Short paper (4-5 pages) for oral presentation
– Work-in-progress paper (2-3 pages) for poster presentation
– Doctoral colloquium paper (2-3 pages)
PRACTITIONER STREAM (Refereed paper published in proceedings):
– Oral presentation
– Poster presentation
– Guided virtual adventures
– Immersive learning project showcase
NONTRADITIONAL SESSION STREAM (1-2 page extended abstract describing session published in proceedings):
– Workshop
– Special session
– Panel session
##### SESSION TYPES & SESSION FORMATS #####
– Oral Presentation: Pre-recorded video + 60-minute live in-world discussion with others presenting on similar/related topics (groupings of presenters into sessions determined by Program Committee)
– Poster Presentation: Live poster session in 3D virtual exhibition hall; pre-recorded video optional
– Doctoral Colloquium: 60-minute live in-world discussion with other doctoral researchers; pre-recorded video optional
– Guided Virtual Adventures: 60-minute small-group guided tours of various social and collaborative XR/immersive environments and platforms
– Immersive Learning Project Showcase: WebXR space to assemble a collection of virtual artifacts, accessible to attendees throughout the conference
– Workshop: 1- or 2-hour live hands-on session
– Special Session: 30- or 60-minute live interactive session held in world; may optionally be linked to one or more papers
– Panel Session: 60-minute live in-world discussion with a self-formed group of 3-5 panelists (including a lead panelist who serves as a moderator)
Please see the conference website for templates and guidelines.
##### PROGRAM TRACKS #####
Papers and proposals may be submitted to one of 10 program tracks, the first nine of which correspond to the iLRN Houses of application, and the tenth of which is intended for papers making knowledge contributions to the learning sciences, computer science, and/or game studies that are not linked to any particular application area:
Track 1. Assessment and Evaluation (A&E)
Track 2. Early Childhood Development & Learning (ECDL)
Track 3. Galleries, Libraries, Archives, & Museums (GLAM)
Track 4. Inclusion, Diversity, Equity, Access, & Social Justice (IDEAS)
Track 5. K-12 STEM Education
Track 6. Language, Culture, & Heritage (LCH)
Track 7. Medical & Healthcare Education (MHE)
Track 8. Nature & Environmental Sciences (NES)
Track 9. Workforce Development & Industry Training (WDIT)
Track 10. Basic Research and Theory in Immersive Learning (not linked to any particular application area)
##### PAPER/PROPOSAL SUBMISSION & REVIEW #####
Papers for the Academic Stream and extended-abstract proposals for the Nontraditional Session Stream must be prepared in standard IEEE double-column US Letter format using Microsoft Word or LaTeX, and will be accepted only via the online submission system, accessible via the conference website (from which guidelines and templates are also available).
Proposals for the Practitioner Stream are to be submitted via an online form, also accessible from the conference website.
A blind peer-review process will be used to evaluate all submissions.
##### IMPORTANT DATES #####
– Main round submission deadline – all submission types welcome: 2021-01-15
– Notification of review outcomes from main submission round: 2021-04-01
– Late round submission deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions only: 2021-04-08
– Camera-ready papers for proceedings due – Full and short papers: 2021-04-15
– Presenter registration deadline – Full and short papers (also deadline for early-bird registration rates): 2021-04-15
– Notification of review outcomes from late submission round: 2021-04-19
– Camera-ready work-in-progress papers and nontraditional session extended abstracts for proceedings due; final practitioner abstracts for conference program due: 2021-05-03
– Presenter registration deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions: 2021-05-03
– Deadline for uploading presentation materials (videos, slides for oral presentations, posters for poster presentations): 2021-05-10
– Conference opening: 2021-05-17
– Conference closing: 2021-06-10
*Full and short papers can only be submitted in the main round.
##### PUBLICATION & INDEXING #####
All accepted and registered papers in the Academic Stream that are presented at iLRN 2021 and all extended abstracts describing the Nontraditional Sessions presented at the conference will be published in the conference proceedings and submitted to the IEEE Xplore® digital library.
Content loaded into Xplore is made available by IEEE to its abstracting and indexing partners, including Elsevier (Scopus, Ei Compendex), Clarivate Analytics (CPCI, part of Web of Science), and others, for potential inclusion in their respective databases. In addition, the authors of selected papers may be invited to submit revised and expanded versions of their papers for possible publication in the IEEE Transactions on Learning Technologies (2019 JCR Impact Factor: 2.714), the Journal of Universal Computer Science (2019 JCR Impact Factor: 0.91), or another Scopus- and/or Web of Science-indexed journal, subject to the relevant journal’s regular editorial and peer-review policies and procedures.
##### CONTACT #####
Inquiries regarding the iLRN 2021 conference should be directed to the Conference Secretariat at conference@immersivelrn.org.
General inquiries about iLRN may be sent to info@immersivelrn.org.

More on Virbela in this IMS blog
https://blog.stcloudstate.edu/ims?s=virbela

XR Bootcamp Microsoft

For details, go here:
https://www.eventbrite.com/e/behind-the-scenes-with-microsoft-vr-in-the-wild-tickets-128181001827

Behind the Scenes: Microsoft’s Principal Researcher Eyal Ofek speaking about technical and social perspectives of XR

About this Event

The XR Bootcamp Open Lecture Series continues with Microsoft’s Principal Researcher Eyal Ofek!

Agenda:

Virtual Reality (VR) & Augmented Reality (AR) pose challenges and opportunities from both a technical and a social perspective. Digital, rather than physical, objects can now change our understanding of the world around us. It is a unique opportunity to change reality as we sense it.

Microsoft researchers are looking for new possibilities to extend our abilities when we are not bound by our physical limitations, enabling superhuman abilities on the one hand and leveling the playing field for people with physical limitations on the other.

Dr. Ofek will describe efforts to design VR & AR applications that adjust to the user’s uncontrolled environment, enabling continuous use during work and leisure across a large variance of environments. He will also review efforts to extend rendering to new capabilities such as haptic rendering.

His lecture will be followed by a Q&A session where you can ask all your questions about the topic.

Lead Instructors:

Eyal Ofek is a principal researcher at the Microsoft Research lab in Redmond, WA. His research interests include Augmented Reality (AR)/Virtual Reality (VR), haptics, interactive projection mapping, and computer vision for human-computer interaction. He is also the Specialty Chief Editor of Frontiers in Virtual Reality for the area of Haptics, and an Associate Editor of IEEE Computer Graphics and Applications (CG&A).

Prior to joining Microsoft Research, he obtained his Ph.D. at the Hebrew University of Jerusalem and founded a couple of computer graphics companies, one producing a successful drawing and photo editing application and another developing the world’s first time-of-flight video cameras, which were a basis for the HoloLens depth camera.

This event is part of the Global XR Bootcamp event:

The Global XR Bootcamp 2020 will be the biggest community-driven, FREE, online Virtual, Augmented and Mixed Reality event in the world! Join us on YouTube or AltspaceVR for a 24-hour live stream with over 50 high-quality talks, panels and sessions. Meet your fellow XR enthusiasts in our Community Zone, and win amazing prizes – from vouchers to XR hardware.

++++++++++++++++++++
more on XR in this IMS blog
https://blog.stcloudstate.edu/ims?s=xr

virtual reality definition

This is an excerpt from my 2018 book chapter: https://www.academia.edu/41628237/Chapter_12_VR_AR_and_Video_360_A_Case_Study_Towards_New_Realities_in_Education_by_Plamen_Miltenoff 

Among a myriad of other definitions, Noor (2016) describes Virtual Reality (VR) as “a computer generated environment that can simulate physical presence in places in the real world or imagined worlds. The user wears a headset and through specialized software and sensors is immersed in 360-degree views of simulated worlds” (p. 34).   

Noor, Ahmed. 2016. “The Hololens Revolution.” Mechanical Engineering 138(10):30-35. 

Weiss and colleagues wrote that “Virtual reality typically refers to the use of interactive simulations created with computer hardware and software to present users with opportunities to engage in environments that appear to be and feel similar to real-world objects and events” 

Weiss, P. L., Rand, D., Katz, N., & Kizony, R. (2004). Video capture virtual reality as a flexible and effective rehabilitation tool. Journal of NeuroEngineering and Rehabilitation, 1(1), 12. https://doi.org/10.1186/1743-0003-1-12

Henderson defined virtual reality as a “computer based, interactive, multisensory environment that occurs in real time”  

Rubin (2018, p. 28) defines virtual reality as (1) an artificial environment that is (2) immersive enough to convince you that you are (3) actually inside it.
An “artificial environment” could mean just about anything: a photograph is an artificial environment, a video game is an artificial environment, a Pixar movie is an artificial environment; the only thing that matters is that it is not where you physically are. p. 46: “VR is potentially going to become a direct interface to the subconscious”

p. 225: Virtual reality: the illusion of an all-enveloping artificial world, created by wearing an opaque display in front of your eyes.

From: https://blog.stcloudstate.edu/ims/2018/11/07/can-xr-help-students-learn/ : 
p. 10 “there is not universal agreement on the definitions of these terms or on the scope of these technologies. Also, all of these technologies currently exist in an active marketplace and, as in many rapidly changing markets, there is a tendency for companies to invent neologisms around 3D technology.” p. 11 Virtual reality means that the wearer is completely immersed in a computer simulation.

from: https://blog.stcloudstate.edu/ims/2018/11/07/can-xr-help-students-learn/ 

There is no necessary distinction between AR and VR; indeed, much research
on the subject is based on a conception of a “virtuality continuum” from entirely
real to entirely virtual, where AR lies somewhere between those ends of the
spectrum.  Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information Systems, vol. E77-D, no. 12 (1994); Steve Mann, “Through the Glass, Lightly,” IEEE Technology and Society Magazine 31, no. 3 (2012): 10–14.

++++++++++++++++++++++

Among a myriad of other definitions, Noor (2016) describes Virtual Reality (VR) as “a computer generated environment that can simulate physical presence in places in the real world or imagined worlds. The user wears a headset and through specialized software and sensors is immersed in 360-degree views of simulated worlds” (p. 34).   Weiss and colleagues wrote that “Virtual reality typically refers to the use of interactive simulations created with computer hardware and software to present users with opportunities to engage in environments that appear to be and feel similar to real-world objects and events.”
Rubin takes a rather broad approach ascribing to VR: 1. artificial environment that’s 2. immersive enough to convince you that you are 3. actually inside it. (p. 28) and further asserts “VR is potentially going to become a direct interface to the subconscious” (p. 46). 
Most importantly, as Pomerantz (2018) asserts, “there is not universal agreement on the definitions of these terms or on the scope of these technologies. Also, all of these technologies currently exist in an active marketplace and, as in many rapidly changing markets, there is a tendency for companies to invent neologisms” (p. 10).

Noor, Ahmed. 2016. “The Hololens Revolution.” Mechanical Engineering 138(10):30-35. 

Pomerantz, J. (2018). Learning in Three Dimensions: Report on the EDUCAUSE/HP Campus of the Future Project (Louisville, CO; ECAR Research Report, p. 57). https://library.educause.edu/~/media/files/library/2018/8/ers1805.pdf 

Rubin, P. (2018). Future Presence: How Virtual Reality Is Changing Human Connection, Intimacy, and the Limits of Ordinary Life (Illustrated edition). HarperOne. 

Weiss, P. L., Rand, D., Katz, N., & Kizony, R. (2004). Video capture virtual reality as a flexible and effective rehabilitation tool. Journal of NeuroEngineering and Rehabilitation, 1(1), 12. https://doi.org/10.1186/1743-0003-1-12

360 degree images definition

  • 360-degree video
    https://en.wikipedia.org/wiki/360-degree_video
    360-degree videos, also known as immersive videos[1] or spherical videos,[2] are video recordings in which a view in every direction is recorded at the same time, shot using an omnidirectional camera or a collection of cameras. During playback on a normal flat display, the viewer controls the viewing direction as in a panorama. The video can also be played on displays or projectors arranged in a sphere or some part of a sphere. 360-degree video is an immersive video format consisting of a video, or series of images, mapped to a portion of a sphere that allows viewing in multiple directions from a fixed central point.
    The mapping is usually carried out using equirectangular projection, where the horizontal coordinate is simply longitude, and the vertical coordinate is simply latitude, with no transformation or scaling applied. Other possible projections are Cube Map (that uses the six faces of a cube as the map shape), Equi-Angular Cubemap – EAC (detailed by Google in 2017 to distribute pixels as evenly as possible across the sphere so that the density of information is consistent, regardless of which direction the viewer is looking), and Pyramid (defined by Facebook in 2016).
    This type of video content is typically viewable through a head-mounted display, mobile device, or personal computer and allows for three degrees of freedom (see section 4.2 for an explanation of the concept of degrees of freedom).
    https://xrsi.org/definition/360-degree-video
  • 360 Degree Video is Not Virtual Reality
    https://www.theprimacy.com/blog/360-degree-video-is-not-virtual-reality/
    “In layman’s terms, 360 means it surrounds you. 3D means it has depth, like looking at a landscape, you’ll notice that there are objects closer to you, and objects that are further away. An image can be 360 and not 3D, or 3D and not 360, but keep in mind the distinction.”

  • for a more advanced definition of 360-degree videos, in conjunction with virtual experience (VX) and immersive reality, see
    Engberg, M., & Bolter, J. D. (2020). The aesthetics of reality media. Journal of Visual Culture, 19(1), 81–95. https://doi.org/10.1177/1470412920906264
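The equirectangular mapping described above (horizontal coordinate is simply longitude, vertical coordinate is simply latitude, with no transformation or scaling) can be sketched in a few lines of Python. The frame size and the sample direction below are illustrative, not values from any of the cited sources:

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a unit 3D viewing direction onto an equirectangular frame:
    the horizontal pixel coordinate encodes longitude and the vertical
    coordinate encodes latitude, with no further transformation."""
    lon = math.atan2(x, z)                    # longitude in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, y)))   # latitude in [-pi/2, pi/2]
    u = (lon / (2 * math.pi) + 0.5) * width   # 0 .. width, left to right
    v = (0.5 - lat / math.pi) * height        # 0 at the top (straight up)
    return u, v

# Looking straight ahead lands in the center of a 4096x2048 panorama.
print(direction_to_equirect(0.0, 0.0, 1.0, 4096, 2048))  # (2048.0, 1024.0)
```

This is why the poles of an equirectangular frame look stretched: every direction near straight up maps to the same top row of pixels, which is the uneven pixel density that projections like Equi-Angular Cubemap were designed to reduce.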

in-house-made library counters

LITA listserv exchange on “Raspberry PI Counter for Library Users”

On 7/10/20, 10:05 AM, “Hammer, Erich F” <erich@albany.edu> wrote via lita-l-request@lists.ala.org:

Jason,

I think that is a very interesting project.  If I understand how it works (comparing reference images to live images), it should still work if a “fuzzy” or translucent filter were placed on the lens as a privacy measure, correct? You could even make the fuzzy video publicly accessible to prove to folks that privacy is protected.

If that’s the case, IMHO, it really is a commercially viable idea and it would have a market far beyond libraries.  Open source code and hardware designs and sales of pre-packaged hardware and support.  Time for some crowdsource funding!  🙂

Erich

On Friday, July 10, 2020 at 10:14, Jason Griffey eloquently inscribed:
> I ran a multi-year project to do counting (as well as attention measurement)
> called Measure the Future (http://measurethefuture.net). That project is in
> desperate need of updating… there has been some work done on it at the
> University of OK libraries, but we haven’t seen their code pushed yet. As the
> code stands on GitHub, it isn’t usable… the installation is broken based on
> some underlying dependencies. The Univ of OK code fixes the issue, but it
> hasn’t been pushed yet. But if you want to see the general code and the way we
> approached it, that is all available.
>
> Jason
>
> On Jul 8, 2020, 1:37 PM -0500, Mitchell, James Ray <jmitchell20@una.edu>, wrote:
>> Hi Kun,
>> I don’t know if this will be useful to you or not, but Code4Lib Journal
>> had an article a couple years ago that might be helpful. It’s called
>> “Testing Three Types of Raspberry Pi People Counters.” The link to the
>> article is https://journal.code4lib.org/articles/12947
>> Regards,
>> James

My note:
In 2018, following the university president’s call for ANY possible savings, the library administration was sent a proposal requesting information about the license for the current library counters and proposing to save the license cost by creating an in-house Arduino counter. The blueprints for such a counter had been shared (as per another LITA listserv exchange). An SCSU Physics professor’s agreement to lead the project was secured, as was the opportunity for SCSU Physics students to develop the project as part of their individual study plans. The proposal was never addressed by either middle or upper management.
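For illustration only: the comparison Erich describes (matching live frames against a reference image of the empty room) can be sketched with a few lines of NumPy. The thresholds and frame sizes below are hypothetical, not values taken from Measure the Future or the Code4Lib counters:

```python
import numpy as np

def frame_occupied(reference, live, pixel_thresh=30, area_frac=0.01):
    """Flag a live grayscale frame as occupied when enough of its pixels
    differ from an empty-room reference frame. This is the reference-vs-live
    idea from the thread reduced to its simplest form; a real counter would
    add blurring, blob tracking, and direction detection on top of it."""
    diff = np.abs(live.astype(np.int16) - reference.astype(np.int16))
    changed = int((diff > pixel_thresh).sum())
    return changed > area_frac * reference.size

# Empty room: a flat mid-gray frame.
reference = np.full((120, 160), 128, dtype=np.uint8)

# A visitor enters: a bright blob covers part of the frame.
live = reference.copy()
live[40:90, 60:100] = 250

print(frame_occupied(reference, reference))  # False
print(frame_occupied(reference, live))       # True
```

Because only the thresholded difference matters, this also illustrates Erich's privacy point: the same logic keeps working on a deliberately blurred or low-resolution feed.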

+++++++++++++
more on raspberry pi in this IMS blog
https://blog.stcloudstate.edu/ims?s=raspberry

more on arduino in this IMS blog
https://blog.stcloudstate.edu/ims?s=arduino

Emerging Trends and Impacts of the Internet of Things in Libraries


https://www.igi-global.com/gateway/book/244559

Chapters:

Holland, B. (2020). Emerging Technology and Today’s Libraries. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 1-33). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch001

The purpose of this chapter is to examine emerging technology and today’s libraries. New technologies stand out first and foremost given that they will end up revolutionizing every industry in an age where digital transformation plays a major role. Major trends will define technological disruption. The next generation of communication, core computing, and integration technologies will adopt new architectures. Major technological, economic, and environmental changes have generated interest in smart cities. Sensing technologies have made IoT possible, but also provide the data required for AI algorithms and models, often in real time, to make intelligent business and operational decisions. Smart cities use different types of electronic internet of things (IoT) sensors to collect data and then use these data to manage assets and resources efficiently. This includes data collected from citizens, devices, and assets that are processed and analyzed to monitor and manage schools, libraries, hospitals, and other community services.

Makori, E. O. (2020). Blockchain Applications and Trends That Promote Information Management. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 34-51). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch002
Blockchain revolutionary paradigm is the new and emerging digital innovation that organizations have no choice but to embrace and implement in order to sustain and manage service delivery to the customers. From disruptive to sustaining perspective, blockchain practices have transformed the information management environment with innovative products and services. Blockchain-based applications and innovations provide information management professionals and practitioners with robust and secure opportunities to transform corporate affairs and social responsibilities of organizations through accountability, integrity, and transparency; information governance; data and information security; as well as digital internet of things.
Hahn, J. (2020). Student Engagement and Smart Spaces: Library Browsing and Internet of Things Technology. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 52-70). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch003
The purpose of this chapter is to provide evidence-based findings on student engagement within smart library spaces. The focus of smart libraries includes spaces that are enhanced with the internet of things (IoT) infrastructure and library collection maps accessed through a library-designed mobile application. The analysis herein explored IoT-based browsing within an undergraduate library collection. The open stacks and mobile infrastructure provided several years (2016-2019) of user-generated smart building data on browsing and selecting items in open stacks. The methods of analysis used in this chapter include transactional analysis and data visualization of IoT infrastructure logs. By analyzing server logs from the computing infrastructure that powers the IoT services, it is possible to infer in greater detail than heretofore possible the specifics of the way library collections are a target of undergraduate student engagement.
Treskon, M. (2020). Providing an Environment for Authentic Learning Experiences. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 71-86). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch004
The Loyola Notre Dame Library provides authentic learning environments for undergraduate students by serving as “client” for senior capstone projects. Through the creative application of IoT technologies such as Arduinos and Raspberry Pis in a library setting, the students gain valuable experience working through software design methodology and create software in response to a real-world challenge. Although these proof-of-concept projects could be implemented, the library is primarily interested in furthering the research, teaching, and learning missions of the two universities it supports. Whether the library gets a product that is worth implementing is not a requirement; it is a “bonus.”
Rashid, M., Nazeer, I., Gupta, S. K., & Khanam, Z. (2020). Internet of Things: Architecture, Challenges, and Future Directions. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 87-104). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch005
The internet of things (IoT) is a computing paradigm that has changed our daily livelihood and functioning. IoT focuses on the interconnection of all the sensor-based devices like smart meters, coffee machines, cell phones, etc., enabling these devices to exchange data with each other during human interactions. With easy connectivity among humans and devices, speed of data generation is getting multi-fold, increasing exponentially in volume, and is getting more complex in nature. In this chapter, the authors will outline the architecture of IoT for handling various issues and challenges in real-world problems and will cover various areas where usage of IoT is done in real applications. The authors believe that this chapter will act as a guide for researchers in IoT to create a technical revolution for future generations.
Martin, L. (2020). Cloud Computing, Smart Technology, and Library Automation. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 105-123). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch006
As technology continues to change, the landscape of the work of librarians and libraries continue to adapt and adopt innovations that support their services. Technology also continues to be an essential tool for dissemination, retrieving, storing, and accessing the resources and information. Cloud computing is an essential component employed to carry out these tasks. The concept of cloud computing has long been a tool utilized in libraries. Many libraries use OCLC to catalog and manage resources and share resources, WorldCat, and other library applications that are cloud-based services. Cloud computing services are used in the library automation process. Using cloud-based services can streamline library services, minimize cost, and the need to have designated space for servers, software, or other hardware to perform library operations. Cloud computing systems with the library consolidate, unify, and optimize library operations such as acquisitions, cataloging, circulation, discovery, and retrieval of information.
Owusu-Ansah, S. (2020). Developing a Digital Engagement Strategy for Ghanaian University Libraries: An Exploratory Study. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 124-139). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch007
This study represents a framework that digital libraries can leverage to increase usage and visibility. The adopted qualitative research aims to examine a digital engagement strategy for the libraries in the University of Ghana (UG). Data is collected from participants (digital librarians) who are key stakeholders of digital library service provision in the University of Ghana Library System (UGLS). The chapter reveals that digital library services included rare collections, e-journal, e-databases, e-books, microfilms, e-theses, e-newspapers, and e-past questions. Additionally, the research revealed that the digital library service patronage could be enhanced through outreach programmes, open access, exhibitions, social media, and conferences. Digital librarians recommend that to optimize digital library services, literacy programmes/instructions, social media platforms, IT equipment, software, and website must be deployed. In conclusion, a DES helps UGLS foster new relationships, connect with new audiences, and establish new or improved brand identity.
Nambobi, M., Ssemwogerere, R., & Ramadhan, B. K. (2020). Implementation of Autonomous Library Assistants Using RFID Technology. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 140-150). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch008
This is an interesting time to innovate around disruptive technologies like the internet of things (IoT), machine learning, and blockchain. Autonomous assistants (IoT) are electro-mechanical systems that perform any prescribed task automatically, with no human intervention, through self-learning and adaptation to changing environments. This means that to achieve autonomy, the system has to perceive environments, actuate movement, and perform tasks with a high degree of autonomy, that is, the ability to make its own decisions in a given environment. It is important to note that autonomous IoT using radio frequency identification (RFID) technology is used in educational sectors to boost the research arena, improve customer service, ease book identification, and enable traceability of items in the library. This chapter discusses the role, importance, critical tools, applicability, and challenges of autonomous IoT in the library using RFID technology.
Priya, A., & Sahana, S. K. (2020). Processor Scheduling in High-Performance Computing (HPC) Environment. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 151-179). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch009
Processor scheduling is one of the thrust areas in computer science. Emerging technologies demand enormous amounts of processing for their tasks, from large games and programming software to quantum computing, and many complex real-time problems are now solved with GPU programming. The primary concern of scheduling is to reduce time complexity and manpower. Several traditional techniques exist for processor scheduling, but their performance degrades when huge numbers of tasks must be processed, and most scheduling problems are NP-hard in nature. GPU scheduling is a further complex issue, as a GPU runs thousands of threads in parallel that must be scheduled efficiently. For such large-scale scheduling problems, the performance of state-of-the-art algorithms is poor, whereas evolutionary and genetic algorithms have been observed to perform better on large-scale combinatorial and internet of things (IoT) problems.
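The kind of genetic approach the chapter points to can be sketched briefly. This is an illustrative Python sketch, not the authors' implementation: candidate task-to-processor assignments evolve through selection, one-point crossover, and mutation toward a smaller makespan (the finish time of the busiest processor). Task times and all parameters are invented for the example.

```python
import random

def makespan(assignment, task_times, n_procs):
    # Finish time of the busiest processor under this assignment.
    loads = [0.0] * n_procs
    for task, proc in enumerate(assignment):
        loads[proc] += task_times[task]
    return max(loads)

def evolve(task_times, n_procs, pop_size=30, generations=200, seed=0):
    rng = random.Random(seed)
    n = len(task_times)
    # Population of random task -> processor assignments.
    pop = [[rng.randrange(n_procs) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: makespan(a, task_times, n_procs))
        survivors = pop[: pop_size // 2]          # selection: keep best half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]           # one-point crossover
            if rng.random() < 0.2:                # mutation: reassign one task
                child[rng.randrange(n)] = rng.randrange(n_procs)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda a: makespan(a, task_times, n_procs))

times = [4, 2, 7, 1, 5, 3, 6, 2]
best = evolve(times, n_procs=3)
print(makespan(best, times, 3))
```

The same loop scales to thousands of tasks, which is where the chapter argues evolutionary methods overtake exact and traditional heuristics.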
Kirsch, B. (2020). Virtual Reality in Libraries. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 180-193). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch010
Librarians are beginning to offer virtual reality (VR) services in libraries. This chapter reviews how libraries are currently using virtual reality for both consumption and creation purposes. Virtual reality tools will be compared and contrasted, and recommendations will be given for purchasing and circulating headsets and VR equipment. Google Tour Creator and a smartphone or 360-degree camera can be used to create a virtual tour of the library and other virtual reality content. These new library services will be discussed along with practical advice and best practices for incorporating virtual reality into the library for instructional and entertainment purposes.
Heffernan, K. L., & Chartier, S. (2020). Augmented Reality Gamifies the Library: A Ride Through the Technological Frontier. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 194-210). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch011
Two librarians at a university in New Hampshire attempted to integrate gamification and mobile technologies into the exploration of, and orientation to, the library’s services and resources. From augmented reality to virtual escape rooms and finally an in-house app created by undergraduate, campus-based game design students, the library team learned much about the triumphs and challenges that come with attempting to utilize new technologies to reach users in the 21st century. This chapter is a narrative describing years of attempts, innovation, and iteration, which have led to the library team being on the verge of introducing an app that could revolutionize campus discovery and engagement.
Miltenoff, P. (2020). Video 360 and Augmented Reality: Visualization to Help Educators Enter the Era of eXtended Reality. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 211-225). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch012
The advent of all types of eXtended Reality (XR)—VR, AR, MR—raises serious questions, both technological and pedagogical. The setup of campus services around XR is only the prelude to the more complex and expensive project of creating learning content using XR. In 2018, the authors started a limited proof-of-concept augmented reality (AR) project for a library tour. Building on their previous research and experience creating a virtual reality (VR) library tour, they sought a scalable introduction of XR services and content for the campus community. The AR library tour was intended as a starting matrix for similar services for the entire campus. They also explored the attitudes of students, faculty, and staff toward this new technology and its incorporation in education, as well as its potential and limitations for the creation of a “smart” library.

VR for student orientation

my note: the LITA publication about Emporia State University (see below) pursues the same goals as the project developed by two SCSU librarians, Susan Hubbs, MLIS, and Plamen Miltenoff, Ph.D., MLIS:

This library orientation was an improved version of Plamen Miltenoff’s 2014-2016 research project with numerous national and international publications and presentations: https://web.stcloudstate.edu/pmiltenoff/bi/
E.g.:
Miltenoff, P. (2018). AR, VR, and Video 360: Toward New Realities in Education by Plamen Miltenoff. In J.-P. Van Arnhem, C. Elliott, & M. Rose (Eds.), Augmented and Virtual Reality in Libraries. Retrieved from https://rowman.com/ISBN/9781538102909
https://www.slideshare.net/aidemoreto/video-360-in-the-library
https://www.slideshare.net/aidemoreto/scsu-augmented-reality-library-tour-122152539
https://www.slideshare.net/aidemoreto/vr-library
https://www.slideshare.net/aidemoreto/intro-to-xr-in-libraries-137315988
https://www.slideshare.net/aidemoreto/xr-mission-possible
https://www.slideshare.net/aidemoreto/library-technology-conference-2018
and the upcoming LITA workshops:
http://www.ala.org/lita/virtual-reality-augmented-reality-mixed-reality-and-academic-library

Virtual Reality as a Tool for Student Orientation in Distance Education Programs

A Study of New Library and Information Science Students

ABSTRACT

Virtual reality (VR) has emerged as a popular technology for gaming and learning, with its uses for teaching presently being investigated in a variety of educational settings. However, one area where the effect of this technology on students has not been examined in detail is as a tool for new student orientation in colleges and universities. This study investigates this effect using an experimental methodology and the population of new master of library science (MLS) students entering a library and information science (LIS) program. The results indicate that students who received a VR orientation expressed more optimistic views about the technology, saw greater improvement in scores on an assessment of knowledge about their program and chosen profession, and saw a small decrease in program anxiety compared to those who received the same information as standard text-and-links. The majority of students also indicated a willingness to use VR technology for learning for long periods of time (25 minutes or more). The researchers concluded that VR may be a useful tool for increasing student engagement, as described by Game Engagement Theory.

AUTHOR BIOGRAPHY

Brady Lund, Emporia State University

Brady Lund is a doctoral student at Emporia State University’s School of Library and Information Management, where he studies the intersection of information technology and information science, among other topics.

iLearn2020

YouTube Live stream: https://www.youtube.com/watch?v=DSXLJGhI2D8&feature=youtu.be
and the Discord directions: https://docs.google.com/document/d/1GgI4dfq-iD85yJiyoyPApB33tIkRJRns1cJ8OpHAYno/edit

Modest3D Guided Virtual Adventure – iLRN Conference 2020 – Session 1 (live session): https://youtu.be/GjxTPOFSGEM

https://mediaspace.minnstate.edu/media/Modest+3D/1_28ejh60g

CALL FOR PROPOSALS: GUIDED VIRTUAL ADVENTURE TOURS
at iLRN 2020: 6th International Conference of the Immersive Learning Research Network
Organized in conjunction with Educators in VR
Technically co-sponsored by the IEEE Education Society
June 21-25, 2020, Online
Conference theme: “Vision 20/20: Hindsight, Insight, and Foresight in XR and Immersive Learning”
Conference website: https://nam02.safelinks.protection.outlook.com/?url=http%3A%2F%2Fimmersivelrn.org%2Filrn2020&data=02%7C01%7Cpmiltenoff%40STCLOUDSTATE.EDU%7C7a9997a1d6724744f7d708d7f52d9387%7C5011c7c60ab446ab9ef4fae74a921a7f%7C0%7C0%7C637247448406614239&sdata=Jt%2BFUtP3Vs%2FQi1z9HCk9x8m%2B%2BRjkZ63qrcoZnFiUdaQ%3D&reserved=0
++++++++++++++++++++++++++++++
Wednesday, June 24 • 12:00pm – 1:00pm

 Instruction and Instructional Design

Presentation 1: Inspiring Faculty (+ Students) with Tales of Immersive Tech (Practitioner Presentation #106)

Authors: Nicholas Smerker

Immersive technologies – 360° video, virtual and augmented realities – are being discussed in many corners of higher education. For an instructor who is familiar with the terms, at least in passing, learning more about why they and their students should care can be challenging, at best. In order to create a font of inspiration, the IMEX Lab team within Teaching and Learning with Technology at Penn State devised its Get Inspired web resource. Building on a similar repository for making technology stories at the sister Maker Commons website, the IMEX Lab Get Inspired landing page invites faculty to discover real world examples of how cutting edge XR tools are being used every day. In addition to very approachable video content and a short summary calling out why our team chose the story, there are also instructional designer-developed Assignment Ideas that allow for quick deployment of exercises related to – though not always relying upon – the technologies highlighted in a given Get Inspired story.

Presentation 2: Lessons Learned from Over A Decade of Designing and Teaching Immersive VR in Higher Education Online Courses (Practitioner Presentation #101)

Authors: Eileen Oconnor

This presentation overviews the design and instruction of immersive virtual reality environments created by the author, beginning with Second Life and progressing to open-source venues. It highlights the diversity of VR environments developed, the challenges that were overcome, and the accomplishments of students who created their own VR environments for K-12, college, and corporate settings. The instruction and design materials created to enable this 100% online master’s program will be shared; an institute launched in 2018 for the study of emerging technologies will also be noted.

Presentation 3: Virtual Reality Student Teaching Experience: A Live, Remote Option for Learning Teaching Skills During Campus Closure and Social Distancing (Practitioner Presentation #110)

Authors: Becky Lane, Christine Havens-Hafer, Catherine Fiore, Brianna Mutsindashyaka and Lauren Suna

Summary: During the Coronavirus pandemic, Ithaca College teacher education majors needed a classroom of students in order to practice teaching and receive feedback, but the campus was closed, and gatherings forbidden. Students were unable to participate in live practice teaching required for their program. We developed a virtual reality pilot project to allow students to experiment in two third-party social VR programs, AltSpaceVR and Rumii. Social VR platforms allow a live, embodied experience that mimics in-person events to give students a more realistic, robust and synchronous teaching practice opportunity. We documented the process and lessons learned to inform, develop and scale next generation efforts.

++++++++++++++++++++++++++
Tuesday, June 23 • 5:00pm – 6:00pm
+++++++++++++++++++++++++++
Sunday, June 21 • 8:00am – 9:00am
Escape the (Class)room games in OpenSim or Second Life (FULL)
https://ilrn2020.sched.com/event/ceKP/escape-the-classroom-games-in-opensim-or-second-life
Pre-registration for this tour is required as places are limited. Joining instructions will be emailed to registrants ahead of the scheduled tour time.
The Guided Virtual Adventure tour will take you to EduNation in Second Life to experience an Escape room game. For one hour, a group of participants engage in voice communication and try to solve puzzles, riddles, or conundrums and follow clues to eventually escape the space. These scenarios are designed for problem solving and negotiating language and are ideal for language education. They are fun and exciting, and the ticking clock adds to the game play.
Tour guide(s)/leader(s): Philp Heike, let’s talk online sprl, Belgium

Target audience sector: Informal and/or lifelong learning

Supported devices: Desktop/laptop – Windows, Desktop/laptop – Mac

Platform/environment access: Download from a website and install on a desktop/laptop computer
Official website: http://www.secondlife.com

+++++++++++++++++++

Thursday, June 25 • 9:00am – 10:00am

Games and Gamification II


Presentation 1: Evaluating the impact of multimodal Collaborative Virtual Environments on user’s spatial knowledge and experience of gamified educational tasks (Full Paper #91)

Authors: Ioannis Doumanis and Daphne Economou

>>Access Video Presentation<<

Several research projects in spatial cognition have suggested Virtual Environments (VEs) as an effective way of facilitating mental map development of a physical space. In the study reported in this paper, we evaluated the effectiveness of multimodal real-time interaction in distilling understanding of the VE after completing gamified educational tasks. We also measured the impact of these design elements on the user’s experience of the educational tasks. The VE used resembles an art gallery and was built using REVERIE (Real and Virtual Engagement In Realistic Immersive Environment), a framework designed to enable multimodal communication on the Web. We compared the impact of REVERIE VG with an educational platform called Edu-Simulation for the same gamified educational tasks. We found that the multimodal VE had no impact on the ability of students to retain a mental model of the virtual space. However, we also found that students thought it was easier to build a mental map of the virtual space in REVERIE VG. This means that using a multimodal CVE in a gamified educational experience does not benefit spatial performance, but neither does it cause distraction. The paper ends with conclusions, future work, and suggestions for improving mental map construction and user experience in multimodal CVEs.

Presentation 2: A case study on student’s perception of the virtual game supported collaborative learning (Full Paper #42)

Authors: Xiuli Huang, Juhou He and Hongyan Wang

>>Access Video Presentation<<

The English education course in China aims to help students establish the English skills needed to enhance their international competitiveness. However, in traditional English classes, students often lack the linguistic environment in which to apply the English skills they learned from their textbook. Virtual reality (VR) technology can set up an immersive English-language environment and then prompt learners to use English by presenting different collaborative communication tasks. In this paper, spherical video-based virtual reality technology was applied to build a linguistic environment, and a collaborative learning strategy was adopted to promote communication. Additionally, a mixed-methods research approach was used to compare students’ achievement in a traditional classroom and in a virtual reality supported collaborative classroom, as well as their perceptions of the two approaches. The experimental results revealed that the virtual reality supported collaborative classroom was able to enhance the students’ achievement. Moreover, analysis of the interviews reported students’ attitudes towards the virtual reality supported collaborative class and represented their use of language learning strategies in that class. These findings could be valuable references for those who intend to create opportunities for students to collaborate and communicate in the target language in their classroom and thereby improve their language skills.

++++++++++++++++++
Thursday, June 25 • 11:00am – 12:00pm

 Games and Gamification III


Presentation 1: Reducing Cognitive Load through the Worked Example Effect within a Serious Game Environment (Full Paper #19)

Authors: Bernadette Spieler, Naomi Pfaff and Wolfgang Slany

>>Access Video Presentation<<

Novices often struggle to represent problems mentally; the unfamiliar process can exhaust their cognitive resources, creating frustration that deters them from learning. By improving novices’ mental representation of problems, worked examples improve both problem-solving skills and transfer performance. Programming requires both skills. In programming, it is not sufficient to simply understand how Stackoverflow examples work; programmers have to be able to adapt the principles and apply them to their own programs. This paper shows evidence in support of the theory that worked examples are the most efficient mode of instruction for novices. In the present study, 42 students were asked to solve the tutorial The Magic Word, a game especially for girls created with the Catrobat programming environment. While the experimental group was presented with a series of worked examples of code, the control groups were instructed through theoretical text examples. The final task was a transfer question. While the average score was not significantly better in the worked example condition, the fact that participants in this experimental group finished significantly faster than the control group suggests that their overall performance was better than that of their counterparts.

Presentation 2: A literature review of e-government services with gamification elements (Full Paper #56)

Authors: Ruth S. Contreras-Espinosa and Alejandro Blanco-M

>>Access Video Presentation<<

Nowadays several democracies face a growing breach in communication between citizens and their political representatives, resulting in low citizen engagement in political decision making and public consultations. It is therefore fundamental to build a constructive relationship between the public administration and the citizens by addressing their needs. This document contains a useful literature review of gamification and e-government services. It provides background on both concepts and conducts a selection and analysis of the different applications found. Three lines of research gaps are identified, with potential impact on future studies.

++++++++++++++++++
Thursday, June 25 • 12:00pm – 1:00pm

 Museums and Libraries


Presentation 1: Connecting User Experience to Learning in an Evaluation of an Immersive, Interactive, Multimodal Augmented Reality Virtual Diorama in a Natural History Museum & the Importance of Story (Full Paper #51)

Authors: Maria Harrington

>>Access Video Presentation<<

Reported are the findings on user experience and learning outcomes from a July 2019 study of an immersive, interactive, multimodal augmented reality (AR) application used in the context of a museum. The AR Perpetual Garden App is unique in creating an immersive multisensory experience of data. It allowed scientifically naïve visitors to walk into a virtual diorama constructed as a data visualization of a springtime woodland understory and interact with multimodal information directly through their senses. The user interface comprised two different AR data visualization scenarios reinforced with data-based ambient bioacoustics, an audio story of the curator’s narrative, and interactive access to plant facts. While actual learning and dwell times were the same between the AR app and the control condition, the AR experience received higher ratings on perceived learning. The AR interface design features “Story” and “Plant Info” showed significant correlations with actual learning outcomes, while “Ease of Use” and “3D Plants” showed significant correlations with perceived learning. As such, designers and developers of AR apps can generalize these findings to inform future designs.

Presentation 2: The Naturalist’s Workshop: Virtual Reality Interaction with a Natural Science Educational Collection (Short Paper #11)

Authors: Colin Patrick Keenan, Cynthia Lincoln, Adam Rogers, Victoria Gerson, Jack Wingo, Mikhael Vasquez-Kool and Richard L. Blanton

>>Access Video Presentation<<

For experiential educators who utilize or maintain physical collections, The Naturalist’s Workshop is an exemplar virtual reality platform to interact with digitized collections in an intuitive and playful way. The Naturalist’s Workshop is a purpose-developed application for the Oculus Quest standalone virtual reality headset for use by museum visitors on the floor of the North Carolina Museum of Natural Sciences under the supervision of a volunteer attendant. Within the application, museum visitors are seated at a virtual desk. Using their hand controllers and head-mounted display, they explore drawers containing botanical specimens and tools-of-the-trade of a naturalist. While exploring, the participant can receive new information about any specimen by dropping it into a virtual examination tray. 360-degree photography and three-dimensionally scanned specimens are used to allow user-motivated, immersive experience of botanical meta-data such as specimen collection coordinates.

Presentation 3: 360° Videos: Entry-Level Immersive Media for Libraries and Education (Practitioner Presentation #132)

Authors: Diane Michaud

>>Access Video Presentation<<

Within the continuum of XR technologies, 360° videos are relatively easy to produce and need only an inexpensive mobile VR viewer to provide a sense of immersion. 360° videos present an opportunity to reveal “behind the scenes” spaces that are normally inaccessible to users of academic libraries. This can promote engagement with unique special collections and specific library services. In December 2019, with little previous experience, I led the production of a short 360° video tour, a walk-through of our institution’s archives. This was a first attempt; there are plans to transform it into a more interactive, user-driven exploration. The beta version successfully generated interest, but the enhanced version will also help prepare uninitiated users for the process of examining unique archival documents and artefacts. This presentation will cover the lessons learned and what we would do differently for our next immersive video production. Additionally, I will propose that the medium of 360° video is ideal for many institutions’ current or recent predicament, with campuses shut down due to the COVID-19 pandemic. Immersive 360° video can be used for virtual tours of libraries and/or other campus spaces. Virtual tours would retain their value beyond the current shutdowns, as there will always be prospective students and families who cannot easily make a trip to campus. These virtual tours provide a welcome alternative, as they eliminate the financial burden of travel and can be taken at any time.

++++++++++++++++++

iLRN 2020

CALL FOR PROPOSALS: GUIDED VIRTUAL ADVENTURE TOURS
at iLRN 2020: 6th International Conference of the Immersive Learning Research Network
Organized in conjunction with Educators in VR
Technically co-sponsored by the IEEE Education Society
June 21-25, 2020, Online
Conference theme: “Vision 20/20: Hindsight, Insight, and Foresight in XR and Immersive Learning”
Conference website: https://nam02.safelinks.protection.outlook.com/?url=http%3A%2F%2Fimmersivelrn.org%2Filrn2020&data=02%7C01%7Cpmiltenoff%40STCLOUDSTATE.EDU%7C7a9997a1d6724744f7d708d7f52d9387%7C5011c7c60ab446ab9ef4fae74a921a7f%7C0%7C0%7C637247448406614239&sdata=Jt%2BFUtP3Vs%2FQi1z9HCk9x8m%2B%2BRjkZ63qrcoZnFiUdaQ%3D&reserved=0
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
At our physical iLRN conferences, the first day of the conference (Sunday) is typically devoted to one or more guided social tours of local attractions in which attendees have the opportunity to socialize and get to know one another while immersing themselves in the sights and sounds of the host city and/or region. As this year’s conference will take place entirely online, we are instead offering the opportunity for attendees to sign up for small-group “Guided Virtual Adventure” tours of 50 minutes in duration to various social and collaborative XR/immersive environments and platforms.
Proposals are being sought for prospective Guided Virtual Adventure tour offerings on Sunday, June 21, 2020. Tour destinations may be:
– a third-party XR/immersive platform with which you are familiar (e.g., Altspace, Mozilla Hubs, Minecraft, World of Warcraft, Somnium Space, OrbusVR, Second Life);
– a specific virtual environment that you, your institution/organization, or someone else has developed within a third-party platform;
– a platform that you or your institution/organization has developed and/or specific environments within that platform.
There are no fees involved in offering a Guided Virtual Adventure tour; however, preference will be given to proposals that involve environments/platforms that are freely and openly accessible, and that are associated with nonprofit organizations and educational institutions. Where possible, it is strongly recommended that multiple offerings of the tour are made available throughout the day so as to cater for different time zones in which the 8,000+ iLRN 2020 event attendees will be based.
Companies wishing to offer Guided Virtual Adventure tours involving their commercial products and services may submit proposals for consideration, but the iLRN 2020 Organizing Committee reserves the right to, at its discretion, place limits on the number of tours of platforms/environments of a certain type or that address a particular target audience/application vertical. In doing so, they will prioritize companies that have purchased a sponsorship or exhibition package.
*** IMPORTANT: The Guided Virtual Adventures are intended to be a social activity, and as such, platforms and environments to be toured must support interaction among multiple users. For other types of platform or environment, please consider offering a Workshop (https://nam02.safelinks.protection.outlook.com/?url=https%3A%2F%2Fimmersivelrn.org%2Filrn2020%2Fworkshops%2F&data=02%7C01%7Cpmiltenoff%40STCLOUDSTATE.EDU%7C7a9997a1d6724744f7d708d7f52d9387%7C5011c7c60ab446ab9ef4fae74a921a7f%7C0%7C0%7C637247448406614239&sdata=2rSCHtYBw3116hRmXFowDz8vEJ%2FPE8MjBjPjhuoU%2FKM%3D&reserved=0) instead, and/or participating in the Immersive Learning Project Showcase & Competition (https://nam02.safelinks.protection.outlook.com/?url=https%3A%2F%2Fimmersivelrn.org%2Filrn2020%2Fimmersive-learning-project-showcase%2F&data=02%7C01%7Cpmiltenoff%40STCLOUDSTATE.EDU%7C7a9997a1d6724744f7d708d7f52d9387%7C5011c7c60ab446ab9ef4fae74a921a7f%7C0%7C0%7C637247448406614239&sdata=vldC9NaYxK6cYof9QoBxq9dTjO1Zv%2F9OIcUAdqdT0rs%3D&reserved=0). ***
### Submitting a Proposal ###
Please use this form to propose a Guided Virtual Adventure: https://nam02.safelinks.protection.outlook.com/?url=https%3A%2F%2Fforms.gle%2FP4JTAkb29Lb9L18JA&data=02%7C01%7Cpmiltenoff%40STCLOUDSTATE.EDU%7C7a9997a1d6724744f7d708d7f52d9387%7C5011c7c60ab446ab9ef4fae74a921a7f%7C0%7C0%7C637247448406614239&sdata=P7uRpwfXWvrQWld%2FQV6JI%2FdnP9lYPxV%2BeRq73xsCozE%3D&reserved=0
### Contact ###
Inquiries regarding the Guided Virtual Adventures may be directed to conference@immersivelrn.org.
### Important Dates ###
– Guided Virtual Adventure proposal submission deadline: May 18, 2020
– Notification of proposal review outcomes: May 21, 2020
– Presenter registration deadline: May 25, 2020
– Deadline for providing final participant instructions: June 1, 2020
– Guided Virtual Adventure Day: June 21, 2020
Other upcoming iLRN 2020 deadlines (see conference website for details):
–  Immersive Learning Project Showcase & Competition – expressions of interest to participate due May 14, 2020 (deadline extended, no further extensions will be announced)
– Practitioner Stream oral and poster presentations – 1-2 page proposals, not for publication in proceedings, due May 18, 2020 (will not be extended)
– Workshops, Panel Sessions, and Special Sessions –  2-3 page proposals for publication in proceedings as extended-abstract descriptions of the sessions, due May 18, 2020 (will not be extended)
– Free registration deadline for non-presenter educators and students – May 23, 2020
(Sent to blend-online@listserv.educause.edu)

+++++++++++++++++
more on virtual tours in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+tour

Bloom’s Taxonomy and VR

Please see the recording made through my Quest goggles; EngageVR does NOT allow simultaneous login through the goggles and the PC client

MediaSpace / Kaltura has several shortcomings, which is why I am also offering you a parallel YouTube recording

Please also see my highlights:

+++++++++++++++++++++++++++
Announcement

https://app.engagevr.io/events/ZJa7A/view

Mon, Apr 13th, 2020 at 12:00 PM (CDT)

A chance to join Steve Bambury as he shares his Bloom’s Taxonomy and VR project

Hosted By Steve Bambury

After another break (due to Steve fracturing his arm), the one and only #CPDinVR events are back, with not one but TWO opportunities to join Steve as he shares his Bloom’s Taxonomy and VR project.

Debuted at the GESS Conference in Dubai in February, the presentation recounts the lengthy history of this project, which included contributions from Steven Sato, Alex Johnson and the late, great Chris Long.

This new version will delve deeper into the specific levels of Bloom’s and the types of VR applications which can be used to engage student skills at each level.

There will also be an opportunity for Q+A with Steve and some of the usual #CPDinVR fun and games at the end of the event…

+++++++++++++
more on EngageVR in this IMS blog
https://blog.stcloudstate.edu/ims?s=engagevr
