Searching for "game based learning"

iLRN 2021

CALL FOR PAPERS AND PROPOSALS
iLRN 2021: 7th International Conference of the Immersive Learning Research Network
May 17 to June 10, 2021, on iLRN Virtual Campus, powered by Virbela
… and across the Metaverse!
Technically co-sponsored by the IEEE Education Society,
with proceedings to be submitted for inclusion in IEEE Xplore(r)
Conference theme: “TRANSCEND: Accelerating Learner Engagement in XR across Time, Place, and Imagination”
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Conference website: https://immersivelrn.org/ilrn2021/
PDF version of this CFP available at: https://bit.ly/3qnFYRu
The 7th International Conference of the Immersive Learning Research Network (iLRN 2021) will be an innovative and interactive virtual gathering for a strengthening global network of researchers and practitioners collaborating to develop the scientific, technical, and applied potential of immersive learning. It is the premier scholarly event focusing on advances in the use of virtual reality (VR), augmented reality (AR), mixed reality (MR), and other extended reality (XR) technologies to support learners across the full span of learning–from K-12 through higher education to work-based, informal, and lifelong learning contexts.
Following the success of iLRN 2020, our first fully online and in-VR conference, this year’s conference will once again be based on the iLRN Virtual Campus, powered by VirBELA, but with a range of activities taking place on various other XR simulation, gaming, and other platforms. Scholars and professionals working from informal and formal education settings as well as those representing diverse industry sectors are invited to participate in the conference, where they may share their research findings, experiences, and insights; network and establish partnerships to envision and shape the future of XR and immersive technologies for learning; and contribute to the emerging scholarly knowledge base on how these technologies can be used to create experiences that educate, engage, and excite learners.
Note: Last year’s iLRN conference drew over 3,600 attendees from across the globe, making the scheduling of sessions a challenge. This year’s conference activities will be spread over a four-week period so as to give attendees more opportunities to participate at times that are conducive to their local time zones.
##### TOPIC AREAS #####
XR and immersive learning in/for:
Serious Games • 3D Collaboration • eSports • AI & Machine Learning • Robotics • Digital Twins • Embodied Pedagogical Agents • Medical & Healthcare Education • Workforce & Industry • Cultural Heritage • Language Learning • K-12 STEM • Higher Ed & Workforce STEM  • Museums & Libraries • Informal Learning • Community & Civic Engagement  • Special Education • Geosciences • Data Visualization and Analytics • Assessment & Evaluation
##### SUBMISSION STREAMS & CATEGORIES #####
ACADEMIC STREAM (Refereed paper published in proceedings):
– Full (6-8 pages) paper for oral presentation
– Short paper (4-5 pages) for oral presentation
– Work-in-progress paper (2-3 pages) for poster presentation
– Doctoral colloquium paper (2-3 pages)
PRACTITIONER STREAM (Refereed paper published in proceedings):
– Oral presentation
– Poster presentation
– Guided virtual adventures
– Immersive learning project showcase
NONTRADITIONAL SESSION STREAM (1-2 page extended abstract describing session published in proceedings):
– Workshop
– Special session
– Panel session
##### SESSION TYPES & SESSION FORMATS #####
– Oral Presentation: Pre-recorded video + 60-minute live in-world discussion with others presenting on similar/related topics (groupings of presenters into sessions determined by Program Committee)
– Poster Presentation: Live poster session in 3D virtual exhibition hall; pre-recorded video optional
– Doctoral Colloquium: 60-minute live in-world discussion with other doctoral researchers; pre-recorded video optional
– Guided Virtual Adventures: 60-minute small-group guided tours of various social and collaborative XR/immersive environments and platforms
– Immersive Learning Project Showcase: WebXR space to assemble a collection of virtual artifacts, accessible to attendees throughout the conference
– Workshop: 1- or 2-hour live hands-on session
– Special Session: 30- or 60-minute live interactive session held in world; may optionally be linked to one or more papers
– Panel Session: 60-minute live in-world discussion with a self-formed group of 3-5 panelists (including a lead panelist who serves as a moderator)
Please see the conference website for templates and guidelines.
##### PROGRAM TRACKS #####
Papers and proposals may be submitted to one of 10 program tracks, the first nine of which correspond to the iLRN Houses of application, and the tenth of which is intended for papers making knowledge contributions to the learning sciences, computer science, and/or game studies that are not linked to any particular application area:
Track 1. Assessment and Evaluation (A&E)
Track 2. Early Childhood Development & Learning (ECDL)
Track 3. Galleries, Libraries, Archives, & Museums (GLAM)
Track 4. Inclusion, Diversity, Equity, Access, & Social Justice (IDEAS)
Track 5. K-12 STEM Education
Track 6. Language, Culture, & Heritage (LCH)
Track 7. Medical & Healthcare Education (MHE)
Track 8. Nature & Environmental Sciences (NES)
Track 9. Workforce Development & Industry Training (WDIT)
Track 10. Basic Research and Theory in Immersive Learning (not linked to any particular application area)
##### PAPER/PROPOSAL SUBMISSION & REVIEW #####
Papers for the Academic Stream and extended-abstract proposals for the Nontraditional Session Stream must be prepared in standard IEEE double-column US Letter format using Microsoft Word or LaTeX, and will be accepted only via the online submission system, accessible via the conference website (from which guidelines and templates are also available).
Proposals for the Practitioner Stream are to be submitted via an online form, also accessible from the conference website.
A blind peer-review process will be used to evaluate all submissions.
##### IMPORTANT DATES #####
– Main round submission deadline – all submission types welcome: 2021-01-15
– Notification of review outcomes from main submission round: 2021-04-01
– Late round submission deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions only: 2021-04-08
– Camera-ready papers for proceedings due – Full and short papers: 2021-04-15
– Presenter registration deadline – Full and short papers (also deadline for early-bird registration rates): 2021-04-15
– Notification of review outcomes from late submission round: 2021-04-19
– Camera-ready work-in-progress papers and nontraditional session extended abstracts for proceedings due; final practitioner abstracts for conference program due: 2021-05-03
– Presenter registration deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions: 2021-05-03
– Deadline for uploading presentation materials (videos, slides for oral presentations, posters for poster presentations): 2021-05-10
– Conference opening: 2021-05-17
– Conference closing: 2021-06-10
*Full and short papers can only be submitted in the main round.
##### PUBLICATION & INDEXING #####
All accepted and registered papers in the Academic Stream that are presented at iLRN 2021 and all extended abstracts describing the Nontraditional Sessions presented at the conference will be published in the conference proceedings and submitted to the IEEE Xplore(r) digital library.
Content loaded into Xplore is made available by IEEE to its abstracting and indexing partners, including Elsevier (Scopus, EiCompendex), Clarivate Analytics (CPCI–part of Web of Science) and others, for potential inclusion in their respective databases. In addition, the authors of selected papers may be invited to submit revised and expanded versions of their papers for possible publication in the IEEE Transactions on Learning Technologies (2019 JCR Impact Factor: 2.714), the Journal of Universal Computer Science (2019 JCR Impact Factor: 0.91), or another Scopus and/or Web of Science-indexed journal, subject to the relevant journal’s regular editorial and peer-review policies and procedures.
##### CONTACT #####
Inquiries regarding the iLRN 2021 conference should be directed to the Conference Secretariat at conference@immersivelrn.org.
General inquiries about iLRN may be sent to info@immersivelrn.org.

More on Virbela in this IMS blog
https://blog.stcloudstate.edu/ims?s=virbela

Information Overload Fake News Social Media

Information Overload Helps Fake News Spread, and Social Media Knows It

Understanding how algorithm manipulators exploit our cognitive vulnerabilities empowers us to fight back

https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

a minefield of cognitive biases.

People who behaved in accordance with them—for example, by staying away from the overgrown pond bank where someone said there was a viper—were more likely to survive than those who did not.

Compounding the problem is the proliferation of online information. Viewing and producing blogs, videos, tweets and other units of information called memes has become so cheap and easy that the information marketplace is inundated. My note: folksonomy at its worst.

At the University of Warwick in England and at Indiana University Bloomington’s Observatory on Social Media (OSoMe, pronounced “awesome”), our teams are using cognitive experiments, simulations, data mining and artificial intelligence to comprehend the cognitive vulnerabilities of social media users.
developing analytical and machine-learning aids to fight social media manipulation.

As Nobel Prize–winning economist and psychologist Herbert A. Simon noted, “What information consumes is rather obvious: it consumes the attention of its recipients.”

attention economy

Nodal diagrams representing 3 social media networks show that more memes correlate with higher load and lower quality of information shared

 Our models revealed that even when we want to see and share high-quality information, our inability to view everything in our news feeds inevitably leads us to share things that are partly or completely untrue.
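As a back-of-the-envelope illustration of that finding, here is my own toy sketch in Python (not the OSoMe model code): agents prefer the best meme they can see, but can only inspect the top few items of their feed. Shrinking that attention window drags down the average quality of what gets shared. All parameters are invented for illustration.

```python
import random

def simulate(num_agents=200, steps=5000, attention=3, seed=42):
    """Toy model (illustration only): each agent prefers to reshare the best
    meme it can see, but can only inspect the top `attention` items of its
    feed. Returns the mean quality of everything that gets shared."""
    rng = random.Random(seed)
    feeds = {a: [] for a in range(num_agents)}    # newest-first feed per agent
    shared = []

    for _ in range(steps):
        agent = rng.randrange(num_agents)
        feed = feeds[agent]
        if feed and rng.random() < 0.75:          # usually reshare something seen
            meme = max(feed[:attention])          # best meme within the attention window
        else:
            meme = rng.random()                   # post a new meme of random quality
        shared.append(meme)
        for follower in rng.sample(range(num_agents), 10):
            feeds[follower].insert(0, meme)       # the meme lands on followers' feeds
            del feeds[follower][100:]             # feeds are finite

    return sum(shared) / len(shared)

print(f"attention = 3  -> mean shared quality {simulate(attention=3):.3f}")
print(f"attention = 30 -> mean shared quality {simulate(attention=30):.3f}")
```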

Frederic Bartlett
Cognitive biases greatly worsen the problem.

We now know that our minds do this all the time: they adjust our understanding of new information so that it fits in with what we already know. One consequence of this so-called confirmation bias is that people often seek out, recall and understand information that best confirms what they already believe.
This tendency is extremely difficult to correct.

Making matters worse, search engines and social media platforms provide personalized recommendations based on the vast amounts of data they have about users’ past preferences.

pollution by bots

Nodal diagrams representing 2 social media networks show that when more than 1% of real users follow bots, low-quality information prevails

Social Herding

social groups create a pressure toward conformity so powerful that it can overcome individual preferences, and by amplifying random early differences, it can cause segregated groups to diverge to extremes.

Social media follows a similar dynamic. We confuse popularity with quality and end up copying the behavior we observe.
information is transmitted via “complex contagion”: when we are repeatedly exposed to an idea, typically from many sources, we are more likely to adopt and reshare it.
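A minimal sketch of complex contagion (my own illustration with networkx and a made-up friendship graph, not the researchers' model): a node adopts the idea only after hearing it from at least two distinct neighbors, so the idea saturates a tight cluster but stalls along a chain of single ties.

```python
import networkx as nx

def complex_contagion(g, seeds, threshold=2):
    """Adopt an idea only after hearing it from at least `threshold`
    distinct neighbors (complex contagion), starting from `seeds`."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node in g.nodes:
            if node in adopted:
                continue
            exposures = sum(1 for nb in g.neighbors(node) if nb in adopted)
            if exposures >= threshold:
                adopted.add(node)
                changed = True
    return adopted

# Hypothetical friendship network: a clustered group spreads the idea,
# a chain of single (weak) ties does not.
g = nx.Graph()
g.add_edges_from([(0, 1), (0, 2), (1, 2), (1, 3), (2, 3),   # tight cluster
                  (3, 4), (4, 5), (5, 6)])                   # chain of weak ties
print(sorted(complex_contagion(g, seeds={0, 1}, threshold=2)))  # reaches 2 and 3, stalls at the chain
```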

Twitter users with extreme political views are more likely than moderate users to share information from low credibility sources

In addition to showing us items that conform with our views, social media platforms such as Facebook, Twitter, YouTube and Instagram place popular content at the top of our screens and show us how many people have liked and shared something. Few of us realize that these cues do not provide independent assessments of quality.

programmers who design the algorithms for ranking memes on social media assume that the “wisdom of crowds” will quickly identify high-quality items; they use popularity as a proxy for quality. My note: again, ill-conceived folksonomy.

Echo Chambers
the political echo chambers on Twitter are so extreme that individual users’ political leanings can be predicted with high accuracy: you have the same opinions as the majority of your connections. This chambered structure efficiently spreads information within a community while insulating that community from other groups.
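The "same opinions as the majority of your connections" observation can be illustrated in a few lines of Python (my own sketch with hypothetical accounts, not the researchers' classifier): predict each unlabeled user's leaning by a majority vote over the accounts they follow.

```python
from collections import Counter

# Hypothetical follow graph and known leanings (illustrative data only).
follows = {
    "ana": ["bob", "cem", "dia"],
    "bob": ["ana", "cem"],
    "cem": ["ana", "bob"],
    "dia": ["eli", "fay"],
    "eli": ["dia", "fay"],
    "fay": ["dia", "eli"],
}
known = {"bob": "left", "cem": "left", "eli": "right", "fay": "right"}

def predict(user):
    """Majority vote over the leanings of the accounts `user` follows."""
    votes = Counter(known[v] for v in follows[user] if v in known)
    return votes.most_common(1)[0][0] if votes else "unknown"

for user in ("ana", "dia"):
    print(user, "->", predict(user))   # ana -> left, dia -> right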

socially shared information not only bolsters our biases but also becomes more resilient to correction.

machine-learning algorithms to detect social bots. One of these, Botometer, is a public tool that extracts 1,200 features from a given Twitter account to characterize its profile, friends, social network structure, temporal activity patterns, language and other features. The program compares these characteristics with those of tens of thousands of previously identified bots to give the Twitter account a score for its likely use of automation.
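For anyone who wants to query Botometer themselves, the sketch below follows the usage documented for the botometer Python package as I recall it; check the project's README before relying on it, since keys and endpoints have changed across versions. All credentials here are placeholders.

```python
import botometer

# Placeholder credentials -- substitute your own RapidAPI and Twitter app keys.
rapidapi_key = "YOUR_RAPIDAPI_KEY"
twitter_app_auth = {
    "consumer_key": "YOUR_CONSUMER_KEY",
    "consumer_secret": "YOUR_CONSUMER_SECRET",
}

bom = botometer.Botometer(
    wait_on_ratelimit=True,
    rapidapi_key=rapidapi_key,
    **twitter_app_auth,
)

# Score a single account; the result includes per-category bot scores.
result = bom.check_account("@OSoMe_IU")
print(result["display_scores"])
```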

Some manipulators play both sides of a divide through separate fake news sites and bots, driving political polarization or monetization by ads.
recently uncovered a network of inauthentic accounts on Twitter that were all coordinated by the same entity. Some pretended to be pro-Trump supporters of the Make America Great Again campaign, whereas others posed as Trump “resisters”; all asked for political donations.

a mobile app called Fakey that helps users learn how to spot misinformation. The game simulates a social media news feed, showing actual articles from low- and high-credibility sources. Users must decide what they can or should not share and what to fact-check. Analysis of data from Fakey confirms the prevalence of online social herding: users are more likely to share low-credibility articles when they believe that many other people have shared them.

Hoaxy, shows how any extant meme spreads through Twitter. In this visualization, nodes represent actual Twitter accounts, and links depict how retweets, quotes, mentions and replies propagate the meme from account to account.
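Hoaxy itself is a hosted visualization, but the same kind of diffusion graph can be approximated locally. The sketch below is my own illustration with made-up retweet pairs (not Hoaxy's API): it builds a directed retweet network in networkx and reports the biggest amplifiers of a meme.

```python
import networkx as nx

# Hypothetical (source_account, resharing_account) retweet pairs for one meme.
retweets = [
    ("newsbot_a", "user1"), ("newsbot_a", "user2"), ("user2", "user3"),
    ("user2", "user4"), ("user4", "user5"), ("newsbot_a", "user6"),
]

g = nx.DiGraph()
g.add_edges_from(retweets)

# Out-degree = how many accounts reshared directly from this one.
amplifiers = sorted(g.out_degree(), key=lambda kv: kv[1], reverse=True)
print("top amplifiers:", amplifiers[:3])
print("accounts reached from newsbot_a:", len(nx.descendants(g, "newsbot_a")))
```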

Free communication is not free. By decreasing the cost of information, we have decreased its value and invited its adulteration. 

D2L gamification webinar

Gamification Network: Exploring Gamification through the Octalysis Lens

Mary Nunaley

Karl Kapp, The Gamification of Learning and Instruction

Kevin Werbach & Dan Hunter, For the Win: How Game Thinking Can Revolutionize Your Business

Yu-Kai Chou gamification design. Octalysis.  https://www.gish.com/

8 core drives: 

Meaning

Accomplishment

Empowerment

Ownership

Social Influence: social media, Instagram influencers

Scarcity: scarcity with homework deadlines, coupons at the store

Unpredictability and curiosity: scavenger hunts in courses; be careful when teaching.

Avoidance
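To make the audit concrete, here is a minimal scoring sketch in Python. The drive ratings are invented, and the sum-of-squares aggregation is only my assumption about how an Octalysis-style overall score might be computed; Chou's own tool may use a different formula.

```python
# Hypothetical ratings (0-10) of a course design against the eight core drives.
drives = {
    "Epic Meaning & Calling": 6,
    "Development & Accomplishment": 8,
    "Empowerment of Creativity & Feedback": 5,
    "Ownership & Possession": 4,
    "Social Influence & Relatedness": 7,
    "Scarcity & Impatience": 3,
    "Unpredictability & Curiosity": 6,
    "Loss & Avoidance": 2,
}

# Assumed convention: overall score = sum of squared drive ratings.
octalysis_score = sum(v ** 2 for v in drives.values())
weakest = min(drives, key=drives.get)
print(f"Octalysis-style score: {octalysis_score}")
print(f"Weakest drive to redesign around: {weakest}")
```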

gamification

 

Octalysis – the complete Gamification framework

Webinars

motivation

black hat white hat

 

 

 

+++++++++++++++
https://yukaichou.com/octalysis-tool/

+++++++++++++++

https://island.octalysisprime.com/

+++++++++++++
https://yukaichou.com/

+++++++++++++


++++++++++++++

Actionable Gamification: Beyond Points, Badges, and Leaderboards

++++++++++++
more on gamification in this IMS blog
https://blog.stcloudstate.edu/ims?s=gamification

virtual reality definition

This is an excerpt from my 2018 book chapter: https://www.academia.edu/41628237/Chapter_12_VR_AR_and_Video_360_A_Case_Study_Towards_New_Realities_in_Education_by_Plamen_Miltenoff 

Among a myriad of other definitions, Noor (2016) describes Virtual Reality (VR) as “a computer generated environment that can simulate physical presence in places in the real world or imagined worlds. The user wears a headset and through specialized software and sensors is immersed in 360-degree views of simulated worlds” (p. 34).   

Noor, Ahmed. 2016. “The Hololens Revolution.” Mechanical Engineering 138(10):30-35. 

Weiss and colleagues wrote that “Virtual reality typically refers to the use of interactive simulations created with computer hardware and software to present users with opportunities to engage in environments that appear to be and feel similar to real-world objects and events” 

Weiss, P. L., Rand, D., Katz, N., & Kizony, R. (2004). Video capture virtual reality as a flexible and effective rehabilitation tool. Journal of NeuroEngineering and Rehabilitation, 1(1), 12. https://doi.org/10.1186/1743-0003-1-12

Henderson defined virtual reality as a “computer based, interactive, multisensory environment that occurs in real time”  

Rubin (2018, p. 28) defines virtual reality as (1) an artificial environment that is (2) immersive enough to convince you that you are (3) actually inside it.
An “artificial environment” could mean just about anything: a photograph is an artificial environment, a video game is an artificial environment, a Pixar movie is an artificial environment; the only thing that matters is that it is not where you physically are. On p. 46 he adds that “VR is potentially going to become a direct interface to the subconscious.”

p. 225: Virtual reality: the illusion of an all-enveloping artificial world, created by wearing an opaque display in front of your eyes.

From: https://blog.stcloudstate.edu/ims/2018/11/07/can-xr-help-students-learn/ : 
p. 10 “there is not universal agreement on the definitions of these terms or on the scope of these technologies. Also, all of these technologies currently exist in an active marketplace and, as in many rapidly changing markets, there is a tendency for companies to invent neologisms around 3D technology.” p. 11 Virtual reality means that the wearer is completely immersed in a computer simulation.

from: https://blog.stcloudstate.edu/ims/2018/11/07/can-xr-help-students-learn/ 

There is no necessary distinction between AR and VR; indeed, much research on the subject is based on a conception of a “virtuality continuum” from entirely real to entirely virtual, where AR lies somewhere between those ends of the spectrum. (Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information Systems, vol. E77-D, no. 12 (1994); Steve Mann, “Through the Glass, Lightly,” IEEE Technology and Society Magazine 31, no. 3 (2012): 10–14.)

++++++++++++++++++++++

Among a myriad of other definitions, Noor (2016) describes Virtual Reality (VR) as “a computer generated environment that can simulate physical presence in places in the real world or imagined worlds. The user wears a headset and through specialized software and sensors is immersed in 360-degree views of simulated worlds” (p. 34).   Weiss and colleagues wrote that “Virtual reality typically refers to the use of interactive simulations created with computer hardware and software to present users with opportunities to engage in environments that appear to be and feel similar to real-world objects and events.”
Rubin takes a rather broad approach, ascribing to VR (1) an artificial environment that is (2) immersive enough to convince you that you are (3) actually inside it (p. 28), and further asserts that “VR is potentially going to become a direct interface to the subconscious” (p. 46). 
Most importantly, as Pomerantz (2018) asserts, “there is not universal agreement on the definitions of these terms or on the scope of these technologies. Also, all of these technologies currently exist in an active marketplace and, as in many rapidly changing markets, there is a tendency for companies to invent neologisms.” (p. 10) 

Noor, Ahmed. 2016. “The Hololens Revolution.” Mechanical Engineering 138(10):30-35. 

Pomerantz, J. (2018). Learning in Three Dimensions: Report on the EDUCAUSE/HP Campus of the Future Project (Louisville, CO; ECAR Research Report, p. 57). https://library.educause.edu/~/media/files/library/2018/8/ers1805.pdf 

Rubin, P. (2018). Future Presence: How Virtual Reality Is Changing Human Connection, Intimacy, and the Limits of Ordinary Life (Illustrated edition). HarperOne. 

Weiss, P. L., Rand, D., Katz, N., & Kizony, R. (2004). Video capture virtual reality as a flexible and effective rehabilitation tool. Journal of NeuroEngineering and Rehabilitation, 1(1), 12. https://doi.org/10.1186/1743-0003-1-12

Emerging Trends and Impacts of the Internet of Things in Libraries

Emerging Trends and Impacts of the Internet of Things in Libraries

https://www.igi-global.com/gateway/book/244559

Chapters:

Holland, B. (2020). Emerging Technology and Today’s Libraries. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 1-33). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch001

The purpose of this chapter is to examine emerging technology and today’s libraries. New technology stands out first and foremost given that they will end up revolutionizing every industry in an age where digital transformation plays a major role. Major trends will define technological disruption. The next-gen of communication, core computing, and integration technologies will adopt new architectures. Major technological, economic, and environmental changes have generated interest in smart cities. Sensing technologies have made IoT possible, but also provide the data required for AI algorithms and models, often in real-time, to make intelligent business and operational decisions. Smart cities consume different types of electronic internet of things (IoT) sensors to collect data and then use these data to manage assets and resources efficiently. This includes data collected from citizens, devices, and assets that are processed and analyzed to monitor and manage, schools, libraries, hospitals, and other community services.

Makori, E. O. (2020). Blockchain Applications and Trends That Promote Information Management. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 34-51). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch002
Blockchain revolutionary paradigm is the new and emerging digital innovation that organizations have no choice but to embrace and implement in order to sustain and manage service delivery to the customers. From disruptive to sustaining perspective, blockchain practices have transformed the information management environment with innovative products and services. Blockchain-based applications and innovations provide information management professionals and practitioners with robust and secure opportunities to transform corporate affairs and social responsibilities of organizations through accountability, integrity, and transparency; information governance; data and information security; as well as digital internet of things.
Hahn, J. (2020). Student Engagement and Smart Spaces: Library Browsing and Internet of Things Technology. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 52-70). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch003
The purpose of this chapter is to provide evidence-based findings on student engagement within smart library spaces. The focus of smart libraries includes spaces that are enhanced with the internet of things (IoT) infrastructure and library collection maps accessed through a library-designed mobile application. The analysis herein explored IoT-based browsing within an undergraduate library collection. The open stacks and mobile infrastructure provided several years (2016-2019) of user-generated smart building data on browsing and selecting items in open stacks. The methods of analysis used in this chapter include transactional analysis and data visualization of IoT infrastructure logs. By analyzing server logs from the computing infrastructure that powers the IoT services, it is possible to infer in greater detail than heretofore possible the specifics of the way library collections are a target of undergraduate student engagement.
Treskon, M. (2020). Providing an Environment for Authentic Learning Experiences. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 71-86). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch004
The Loyola Notre Dame Library provides authentic learning environments for undergraduate students by serving as “client” for senior capstone projects. Through the creative application of IoT technologies such as Arduinos and Raspberry Pis in a library setting, the students gain valuable experience working through software design methodology and create software in response to a real-world challenge. Although these proof-of-concept projects could be implemented, the library is primarily interested in furthering the research, teaching, and learning missions of the two universities it supports. Whether the library gets a product that is worth implementing is not a requirement; it is a “bonus.”
Rashid, M., Nazeer, I., Gupta, S. K., & Khanam, Z. (2020). Internet of Things: Architecture, Challenges, and Future Directions. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 87-104). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch005
The internet of things (IoT) is a computing paradigm that has changed our daily livelihood and functioning. IoT focuses on the interconnection of all the sensor-based devices like smart meters, coffee machines, cell phones, etc., enabling these devices to exchange data with each other during human interactions. With easy connectivity among humans and devices, speed of data generation is getting multi-fold, increasing exponentially in volume, and is getting more complex in nature. In this chapter, the authors will outline the architecture of IoT for handling various issues and challenges in real-world problems and will cover various areas where usage of IoT is done in real applications. The authors believe that this chapter will act as a guide for researchers in IoT to create a technical revolution for future generations.
Martin, L. (2020). Cloud Computing, Smart Technology, and Library Automation. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 105-123). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch006
As technology continues to change, the landscape of the work of librarians and libraries continue to adapt and adopt innovations that support their services. Technology also continues to be an essential tool for dissemination, retrieving, storing, and accessing the resources and information. Cloud computing is an essential component employed to carry out these tasks. The concept of cloud computing has long been a tool utilized in libraries. Many libraries use OCLC to catalog and manage resources and share resources, WorldCat, and other library applications that are cloud-based services. Cloud computing services are used in the library automation process. Using cloud-based services can streamline library services, minimize cost, and the need to have designated space for servers, software, or other hardware to perform library operations. Cloud computing systems with the library consolidate, unify, and optimize library operations such as acquisitions, cataloging, circulation, discovery, and retrieval of information.
Owusu-Ansah, S. (2020). Developing a Digital Engagement Strategy for Ghanaian University Libraries: An Exploratory Study. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 124-139). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch007
This study represents a framework that digital libraries can leverage to increase usage and visibility. The adopted qualitative research aims to examine a digital engagement strategy for the libraries in the University of Ghana (UG). Data is collected from participants (digital librarians) who are key stakeholders of digital library service provision in the University of Ghana Library System (UGLS). The chapter reveals that digital library services included rare collections, e-journal, e-databases, e-books, microfilms, e-theses, e-newspapers, and e-past questions. Additionally, the research revealed that the digital library service patronage could be enhanced through outreach programmes, open access, exhibitions, social media, and conferences. Digital librarians recommend that to optimize digital library services, literacy programmes/instructions, social media platforms, IT equipment, software, and website must be deployed. In conclusion, a DES helps UGLS foster new relationships, connect with new audiences, and establish new or improved brand identity.
Nambobi, M., Ssemwogerere, R., & Ramadhan, B. K. (2020). Implementation of Autonomous Library Assistants Using RFID Technology. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 140-150). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch008
This is an interesting time to innovate around disruptive technologies like the internet of things (IoT), machine learning, blockchain. Autonomous assistants (IoT) are the electro-mechanical system that performs any prescribed task automatically with no human intervention through self-learning and adaptation to changing environments. This means that by acknowledging autonomy, the system has to perceive environments, actuate a movement, and perform tasks with a high degree of autonomy. This means the ability to make their own decisions in a given set of the environment. It is important to note that autonomous IoT using radio frequency identification (RFID) technology is used in educational sectors to boost the research the arena, improve customer service, ease book identification and traceability of items in the library. This chapter discusses the role, importance, the critical tools, applicability, and challenges of autonomous IoT in the library using RFID technology.
Priya, A., & Sahana, S. K. (2020). Processor Scheduling in High-Performance Computing (HPC) Environment. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 151-179). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch009
Processor scheduling is one of the thrust areas in the field of computer science. The future technologies use a huge amount of processing for execution of their tasks like huge games, programming software, and in the field of quantum computing. In real-time, many complex problems are solved by GPU programming. The primary concern of scheduling is to reduce the time complexity and manpower. Several traditional techniques exit for processor scheduling. The performance of traditional techniques is reduced when it comes to the huge processing of tasks. Most scheduling problems are NP-hard in nature. Many of the complex problems are recently solved by GPU programming. GPU scheduling is another complex issue as it runs thousands of threads in parallel and needs to be scheduled efficiently. For such large-scale scheduling problems, the performance of state-of-the-art algorithms is very poor. It is observed that evolutionary and genetic-based algorithms exhibit better performance for large-scale combinatorial and internet of things (IoT) problems.
Kirsch, B. (2020). Virtual Reality in Libraries. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 180-193). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch010
Librarians are beginning to offer virtual reality (VR) services in libraries. This chapter reviews how libraries are currently using virtual reality for both consumption and creation purposes. Virtual reality tools will be compared and contrasted, and recommendations will be given for purchasing and circulating headsets and VR equipment. Google Tour Creator and a smartphone or 360-degree camera can be used to create a virtual tour of the library and other virtual reality content. These new library services will be discussed along with practical advice and best practices for incorporating virtual reality into the library for instructional and entertainment purposes.
Heffernan, K. L., & Chartier, S. (2020). Augmented Reality Gamifies the Library: A Ride Through the Technological Frontier. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 194-210). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch011
Two librarians at a University in New Hampshire attempted to integrate gamification and mobile technologies into the exploration of, and orientation to, the library’s services and resources. From augmented reality to virtual escape rooms and finally an in-house app created by undergraduate, campus-based, game design students, the library team learned much about the triumphs and challenges that come with attempting to utilize new technologies to reach users in the 21st century. This chapter is a narrative describing years of various attempts, innovation, and iteration, which have led to the library team being on the verge of introducing an app that could revolutionize campus discovery and engagement.
Miltenoff, P. (2020). Video 360 and Augmented Reality: Visualization to Help Educators Enter the Era of eXtended Reality. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 211-225). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch012
The advent of all types of eXtended Reality (XR)—VR, AR, MR—raises serious questions, both technological and pedagogical. The setup of campus services around XR is only the prelude to the more complex and expensive project of creating learning content using XR. In 2018, the authors started a limited proof-of-concept augmented reality (AR) project for a library tour. Building on their previous research and experience creating a virtual reality (VR) library tour, they sought a scalable introduction of XR services and content for the campus community. The AR library tour aimed to start us toward a matrix for similar services for the entire campus. They also explored the attitudes of students, faculty, and staff toward this new technology and its incorporation in education, as well as its potential and limitations toward the creation of a “smart” library.

iLearn2020

YouTube Live stream: https://www.youtube.com/watch?v=DSXLJGhI2D8&feature=youtu.be
and the Discord directions: https://docs.google.com/document/d/1GgI4dfq-iD85yJiyoyPApB33tIkRJRns1cJ8OpHAYno/edit

Modest3D Guided Virtual Adventure – iLRN Conference 2020 – Session 1: currently, live session: https://youtu.be/GjxTPOFSGEM

https://mediaspace.minnstate.edu/media/Modest+3D/1_28ejh60g

CALL FOR PROPOSALS: GUIDED VIRTUAL ADVENTURE TOURS
at iLRN 2020: 6th International Conference of the Immersive Learning Research Network
Organized in conjunction with Educators in VR
Technically co-sponsored by the IEEE Education Society
June 21-25, 2020, Online
Conference theme: “Vision 20/20: Hindsight, Insight, and Foresight in XR and Immersive Learning”
Conference website: http://immersivelrn.org/ilrn2020
++++++++++++++++++++++++++++++
Wednesday, June 24 • 12:00pm – 1:00pm

 Instruction and Instructional Design

Presentation 1: Inspiring Faculty (+ Students) with Tales of Immersive Tech (Practitioner Presentation #106)

Authors: Nicholas Smerker

Immersive technologies – 360º video, virtual and augmented realities – are being discussed in many corners of higher education. For an instructor who is familiar with the terms, at least in passing, learning more about why they and their students should care can be challenging, at best. In order to create a font of inspiration, the IMEX Lab team within Teaching and Learning with Technology at Penn State devised its Get Inspired web resource. Building on a similar repository for making technology stories at the sister Maker Commons website, the IMEX Lab Get Inspired landing page invites faculty to discover real world examples of how cutting edge XR tools are being used every day. In addition to very approachable video content and a short summary calling out why our team chose the story, there are also instructional designer-developed Assignment Ideas that allow for quick deployment of exercises related to – though not always relying upon – the technologies highlighted in a given Get Inspired story.

Presentation 2: Lessons Learned from Over A Decade of Designing and Teaching Immersive VR in Higher Education Online Courses (Practitioner Presentation #101)

Authors: Eileen Oconnor

This presentation overviews the design and instruction in immersive virtual reality environments created by the author, beginning with Second Life and progressing to open-source venues. It will highlight the diversity of VR environments developed, the challenges that were overcome, and the accomplishments of students who created their own VR environments for K-12, college, and corporate settings. The instruction and design materials created to enable this 100% online master’s program accomplishment will be shared; an institute launched in 2018 for emerging technology study will be noted.

Presentation 3: Virtual Reality Student Teaching Experience: A Live, Remote Option for Learning Teaching Skills During Campus Closure and Social Distancing (Practitioner Presentation #110)

Authors: Becky Lane, Christine Havens-Hafer, Catherine Fiore, Brianna Mutsindashyaka and Lauren Suna

Summary: During the Coronavirus pandemic, Ithaca College teacher education majors needed a classroom of students in order to practice teaching and receive feedback, but the campus was closed, and gatherings forbidden. Students were unable to participate in live practice teaching required for their program. We developed a virtual reality pilot project to allow students to experiment in two third-party social VR programs, AltSpaceVR and Rumii. Social VR platforms allow a live, embodied experience that mimics in-person events to give students a more realistic, robust and synchronous teaching practice opportunity. We documented the process and lessons learned to inform, develop and scale next generation efforts.

++++++++++++++++++++++++++
Tuesday, June 23 • 5:00pm – 6:00pm
+++++++++++++++++++++++++++
Sunday, June 21 • 8:00am – 9:00am
Escape the (Class)room games in OpenSim or Second Life (FULL)
https://ilrn2020.sched.com/event/ceKP/escape-the-classroom-games-in-opensim-or-second-life
Pre-registration for this tour is required as places are limited. Joining instructions will be emailed to registrants ahead of the scheduled tour time. The Guided Virtual Adventure tour will take you to EduNation in Second Life to experience an Escape room game. For one hour, a group of participants engage in voice communication and try to solve puzzles, riddles, or conundrums and follow clues to eventually escape the space. These scenarios are designed for problem solving and negotiating language and are ideal for language education. They are fun and exciting, and the ticking clock adds to game play.
Tour guide(s)/leader(s): Heike Philp, let’s talk online sprl, Belgium

Target audience sector: Informal and/or lifelong learning

Supported devices: Desktop/laptop – Windows, Desktop/laptop – Mac

Platform/environment access: Download from a website and install on a desktop/laptop computer
Official website: http://www.secondlife.com

+++++++++++++++++++

Thursday, June 25 • 9:00am – 10:00am

Games and Gamification II


Presentation 1: Evaluating the impact of multimodal Collaborative Virtual Environments on user’s spatial knowledge and experience of gamified educational tasks (Full Paper #91)

Authors: Ioannis Doumanis and Daphne Economou


Several research projects in spatial cognition have suggested Virtual Environments (VEs) as an effective way of facilitating mental map development of a physical space. In the study reported in this paper, we evaluated the effectiveness of multimodal real-time interaction in distilling understanding of the VE after completing gamified educational tasks. We also measured the impact of these design elements on the user’s experience of educational tasks. The VE used resembles an art gallery and it was built using REVERIE (Real and Virtual Engagement In Realistic Immersive Environment), a framework designed to enable multimodal communication on the Web. We compared the impact of REVERIE VG with an educational platform called Edu-Simulation for the same gamified educational tasks. We found that the multimodal VE had no impact on the ability of students to retain a mental model of the virtual space. However, we also found that students thought that it was easier to build a mental map of the virtual space in REVERIE VG. This means that using a multimodal CVE in a gamified educational experience does not benefit spatial performance, but it also does not cause distraction. The paper ends with conclusions, future work, and suggestions for improving mental map construction and user experience in multimodal CVEs.

Presentation 2: A case study on student’s perception of the virtual game supported collaborative learning (Full Paper #42)

Authors: Xiuli Huang, Juhou He and Hongyan Wang


The English education course in China aims to help students establish the English skills to enhance their international competitiveness. However, in traditional English classes, students often lack the linguistic environment to apply the English skills they learned in their textbook. Virtual reality (VR) technology can set up an immersive English language environment and then promote the learners to use English by presenting different collaborative communication tasks. In this paper, spherical video-based virtual reality technology was applied to build a linguistic environment and a collaborative learning strategy was adopted to promote their communication. Additionally, a mixed-methods research approach was used to analyze students’ achievement between a traditional classroom and a virtual reality supported collaborative classroom and their perception towards the two approaches. The experimental results revealed that the virtual reality supported collaborative classroom was able to enhance the students’ achievement. Moreover, by analyzing the interview, students’ attitudes towards the virtual reality supported collaborative class were reported and the use of language learning strategies in virtual reality supported collaborative class was represented. These findings could be valuable references for those who intend to create opportunities for students to collaborate and communicate in the target language in their classroom and then improve their language skills

!!!!!!!!!!!!!!!!!!!
Thursday, June 25 • 11:00am – 12:00pm

 Games and Gamification III


Presentation 1: Reducing Cognitive Load through the Worked Example Effect within a Serious Game Environment (Full Paper #19)

Authors: Bernadette Spieler, Naomi Pfaff and Wolfgang Slany


Novices often struggle to represent problems mentally; the unfamiliar process can exhaust their cognitive resources, creating frustration that deters them from learning. By improving novices’ mental representation of problems, worked examples improve both problem-solving skills and transfer performance. Programming requires both skills. In programming, it is not sufficient to simply understand how Stackoverflow examples work; programmers have to be able to adapt the principles and apply them to their own programs. This paper shows evidence in support of the theory that worked examples are the most efficient mode of instruction for novices. In the present study, 42 students were asked to solve the tutorial The Magic Word, a game especially for girls created with the Catrobat programming environment. While the experimental group was presented with a series of worked examples of code, the control groups were instructed through theoretical text examples. The final task was a transfer question. While the average score was not significantly better in the worked example condition, the fact that participants in this experimental group finished significantly faster than the control group suggests that their overall performance was better than that of their counterparts.

Presentation 2: A literature review of e-government services with gamification elements (Full Paper #56)

Authors: Ruth S. Contreras-Espinosa and Alejandro Blanco-M


Nowadays several democracies are facing the growing problem of a breach in communication between their citizens and their political representatives, resulting in low citizen engagement in political decision making and public consultations. It is therefore fundamental to build a constructive relationship between the public administration and the citizens by addressing their needs. This document contains a useful literature review of gamification and e-government services: it gives the background of those concepts and conducts a selection and analysis of the different applications found. Three lines of research gaps are identified, with potential impact on future studies.

++++++++++++++++++
Thursday, June 25 • 12:00pm – 1:00pm

 Museums and Libraries


Presentation 1: Connecting User Experience to Learning in an Evaluation of an Immersive, Interactive, Multimodal Augmented Reality Virtual Diorama in a Natural History Museum & the Importance of Story (Full Paper #51)

Authors: Maria Harrington


Reported are the findings of user experience and learning outcomes from a July 2019 study of an immersive, interactive, multimodal augmented reality (AR) application, used in the context of a museum. The AR Perpetual Garden App is unique in creating an immersive multisensory experience of data. It allowed scientifically naïve visitors to walk into a virtual diorama constructed as a data visualization of a springtime woodland understory, and interact with multimodal information directly through their senses. The user interface comprised two different AR data visualization scenarios reinforced with data-based ambient bioacoustics, an audio story of the curator’s narrative, and interactive access to plant facts. While actual learning and dwell times were the same between the AR app and the control condition, the AR experience received higher ratings on perceived learning. The AR interface design features of “Story” and “Plant Info” showed significant correlations with actual learning outcomes, while “Ease of Use” and “3D Plants” showed significant correlations with perceived learning. As such, designers and developers of AR apps can generalize these findings to inform future designs.

Presentation 2: The Naturalist’s Workshop: Virtual Reality Interaction with a Natural Science Educational Collection (Short Paper #11)

Authors: Colin Patrick Keenan, Cynthia Lincoln, Adam Rogers, Victoria Gerson, Jack Wingo, Mikhael Vasquez-Kool and Richard L. Blanton


For experiential educators who utilize or maintain physical collections, The Naturalist’s Workshop is an exemplar virtual reality platform to interact with digitized collections in an intuitive and playful way. The Naturalist’s Workshop is a purpose-developed application for the Oculus Quest standalone virtual reality headset for use by museum visitors on the floor of the North Carolina Museum of Natural Sciences under the supervision of a volunteer attendant. Within the application, museum visitors are seated at a virtual desk. Using their hand controllers and head-mounted display, they explore drawers containing botanical specimens and tools-of-the-trade of a naturalist. While exploring, the participant can receive new information about any specimen by dropping it into a virtual examination tray. 360-degree photography and three-dimensionally scanned specimens are used to allow user-motivated, immersive experience of botanical meta-data such as specimen collection coordinates.

Presentation 3: 360˚ Videos: Entry level Immersive Media for Libraries and Education (Practitioner Presentation #132)

Authors: Diane Michaud


Within the continuum of XR Technologies, 360˚ videos are relatively easy to produce and need only an inexpensive mobile VR viewer to provide a sense of immersion. 360˚ videos present an opportunity to reveal “behind the scenes” spaces that are normally inaccessible to users of academic libraries. This can promote engagement with unique special collections and specific library services. In December 2019, with little previous experience, I led the production of a short 360˚video tour, a walk-through of our institution’s archives. This was a first attempt; there are plans to transform it into a more interactive, user-driven exploration. The beta version successfully generated interest, but the enhanced version will also help prepare uninitiated users for the process of examining unique archival documents and artefacts. This presentation will cover the lessons learned, and what we would do differently for our next immersive video production. Additionally, I will propose that the medium of 360˚ video is ideal for many institutions’ current or recent predicament with campuses shutdown due to the COVID-19 pandemic. Online or immersive 360˚ video can be used for virtual tours of libraries and/or other campus spaces. Virtual tours would retain their value beyond current campus shutdowns as there will always be prospective students and families who cannot easily make a trip to campus. These virtual tours would provide a welcome alternative as they eliminate the financial burden of travel and can be taken at any time.

++++++++++++++++++

iLRN 2020

iLRN 2020: 6th International Conference of the Immersive Learning Research Network
Organized in conjunction with Educators in VR
Technically co-sponsored by the IEEE Education Society
June 21-25, 2020, Now Fully Online and in Virtual Reality!!!
Conference theme: “Vision 20/20: Hindsight, Insight, and Foresight in XR and Immersive Learning”
Conference website: https://immersivelrn.org/ilrn2020/
*** FREE registration for non-presenter EDU attendees until April 19th ***
##### TOPIC AREAS #####
XR and immersive learning in/for:
– Serious Games
– Medical & Healthcare
– Workforce & Industry
– Culture & Language
– K-12
– Museums & Libraries
– Special Education
– Geosciences
– Data Visualization
##### SESSION TYPES/ACTIVITIES #####
– Keynotes and plenaries
– Academic stream presentations (with peer-reviewed proceedings for submission to IEEE Xplore)
– Practitioner stream presentations (no paper required)
– Poster and exhibition sessions
– Workshops
– Panel sessions
– Special sessions
– Immersive Learning Adventures
– Immersive Learning Project Showcase & Competition
– Game Night
– Virtual Awards Banquet & Masquerade Ball
##### INTERESTED IN ATTENDING? #####
For a limited time, free registration is being offered to faculty, students, and staff of educational institutions (including K-12 schools/districts, universities, colleges, museums, and libraries) who wish to attend but will NOT be presenting at the conference or publishing in the proceedings. To take advantage of this offer, you must register by April 19, 2020 using an email address associated with your educational institution:
https://immersivelrn.org/ilrn2020/ilrn-2020-fees-registration/
##### INTERESTED IN PRESENTING? #####
Submissions of Practitioner presentation and poster proposals; proposals for workshops, panel sessions, and special sessions; as well as Academic Work-in-progress papers (for delivery as posters or as part of the doctoral colloquium) are being accepted until the late-round submission deadline of April 19, 2020. See the Call for Papers and Proposals at https://immersivelrn.org/ilrn2020/call-for-papers-proposals/.
Proposals for the Immersive Learning Project Showcase & Competition may be submitted at https://immersivelrn.org/ilrn2020/immersive-learning-project-showcase/ until April 30, 2020.
No further Academic Full and Short paper submissions are being considered at this stage.
##### INTERESTED IN VOLUNTEERING OR REVIEWING? #####
A range of volunteer opportunities are available, including conference internships for undergraduate and graduate students. Some of the roles currently available include session chair/facilitator, moderator, audio-visual/technical support, virtual event greeter/usher, virtual event photographer, virtual event videographer/livestreamer, 2D artist/illustrator, 3D artist/modeler, graphic designer, and general conference intern. For details and to apply for one or more of these roles, please visit https://immersivelrn.org/ilrn2020/volunteer-opportunities
Expressions of interest are also being solicited from scholars and practitioners wishing to join the iLRN 2020 Program Committee to peer review papers and proposals received in the late submission round (closing April 19, 2020). The late-round submissions will be no longer than 3 pages in length, and each Program Committee member will be asked to review no more than two submissions.
##### INTERESTED IN SPONSORING OR EXHIBITING? #####
A number of sponsorship and exhibition opportunities are available for organizations to:
– Meet and interact with key educational stakeholders
– Showcase their products and services
– Connect and collaborate with top researchers / scientists
– Build and strengthen customer / client relationships.
Packages range from US$500 for a basic virtual exhibit booth to US$15,000 for an exclusive Gold Sponsorship.
For information about the packages available, visit http://immersivelrn.org/ilrn2020/sponsorships-and-exhibitions
##### INTERESTED IN JOINING ILRN? (FREE) #####
Basic individual membership of the Immersive Learning Research Network is currently free; you can sign up at https://immersivelrn.org/get-involved/membership/.
Fee-based premium individual memberships and organizational memberships will be introduced in the near future.
##### CONTACT #####
Email: conference@immersivelrn.org
Web: http://immersivelrn.org/ilrn2020

++++++++++++
more about Educators in VR in this IMS blog
https://blog.stcloudstate.edu/ims?s=educators+in+vr

IM 690 VR and AR lab part 2

IM 690 Virtual Reality and Augmented Reality. short link: http://bit.ly/IM690lab

IM 690 lab plan for March 3, MC 205:  Oculus Go and Quest

Readings:

  1. TAM: Technology Acceptance Model
    Read Venkatesh and Davis and sum up the importance of their model for instructional designers working with VR technologies and creating materials for users of VR technologies.
  2. UTAUT: using the theory to learn well with VR and to design a good acceptance model for end users: https://blog.stcloudstate.edu/ims/2020/02/20/utaut/
    Watch both parts of Victoria Bolotina's presentation at the Global VR conference. How does she apply UTAUT in her research?
    Read Bracq et al. (2019): how do they apply UTAUT to their VR nursing training?

Lab work (continue):

Review from last week:
How to shoot and edit 360 videos: Ben Claremont
https://www.youtube.com/channel/UCAjSHLRJcDfhDSu7WRpOu-w
and
https://www.youtube.com/channel/UCUFJyy31hGam1uPZMqcjL_A

  1. Oculus Quest as VR advanced level
    1. Using the controllers
    2. Confirm Guardian
    3. Using the menu

Oculus Quest main menu

    1. Watching 360 video in YouTube
      1. Switch between 2D and 360 VR
    2. Play a game

Climbing


Racketball

Instagram post from Beat Saber (@beatsaber): “Hell yeah, @naysy is the ultimate Beat Saber queen! 💃 #VR #VirtualReality #BeatSaber #PanicAtTheDisco”

Practice interactivity (space station)

    1. Broadcast your experience (Facebook Live)
  1. Additional (advanced) features of Oculus Quest
    1. https://engagevr.io/
    2. https://sidequestvr.com/#/setup-howto

Interactivity: communication and collaborative work in AltspaceVR

https://account.altvr.com/

Setting up your avatar

Joining a space and collaborating and communicating with other users

  1. Assignment: Group work
    1. Find one F2F and one online peer to form a group.
      Based on the questions/directions given before you started watching the videos:
      – Does this particular technology fit in the instructional design (ID) frames and theories covered?
      – How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
      – What models and ideas from the videos seem possible for you to replicate?
      Exchange thoughts with your peers and make a plan to create a similar educational product.
    2. Post your writing in the following D2L Discussions thread.
  2. Augmented Reality with HoloLens (watch videos at the computer station)
    1. Start and turn off; go through the menu

      https://youtu.be/VX3O650comM
    2. Learn gestures and voice commands
  1. Augmented Reality with Merge Cube
    1. 3D apps and software packages and their compatibility with AR
  2. Augmented Reality with a smartphone
  3. Samsung Gear 360 video camera
    1. If all other goggles and devices are busy, please feel welcome to use the camera to practice and/or work toward your final project
    2. Memory card (microSD) and data transfer – does your phone have a memory card compatible with the camera?
    3. Upload 360 images and videos to your YouTube and Facebook accounts
  4. Issues with XR
    1. Ethics
      1. empathy
        Peter Rubin “Future Presence”
        https://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/

+++++++++++++

Enhance your XR instructional design with other tools (a minimal A-Frame example follows this list): https://blog.stcloudstate.edu/ims/2020/02/07/crs-loop/

https://aframe.io/

https://framevr.io/

https://learn.framevr.io/ (free tutorials for learning Frame)

https://hubs.mozilla.com/#/

https://sketchfab.com/ – WebXR technology

https://mixedreality.mozilla.org/hello-webxr/

https://studio.gometa.io/landing
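
For instructors who want to try one of these tools hands-on, A-Frame has the lowest barrier to entry: a WebXR scene is just an HTML page opened in a desktop browser or a headset's browser. Below is a minimal sketch based on A-Frame's standard starter primitives; the library version in the script URL is only illustrative, and the scene contents are placeholders rather than part of any course material above.

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Load the A-Frame library (version shown is illustrative; use a current release) -->
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- <a-scene> sets up the camera, renderer, and the enter-VR button automatically -->
    <a-scene>
      <!-- A few primitives positioned in front of the default camera -->
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

A 360° photo (for example, one shot with the Gear 360 camera used in the lab) could be displayed in the same page by replacing the <a-sky> color with a src attribute pointing to the equirectangular image file (file name of your choosing), turning the page into a simple self-hosted 360 viewer.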

+++++++++++
Plamen Miltenoff, Ph.D., MLIS
Professor
320-308-3072
pmiltenoff@stcloudstate.edu
http://web.stcloudstate.edu/pmiltenoff/faculty/
schedule a meeting: https://doodle.com/digitalliteracy
find my office: https://youtu.be/QAng6b_FJqs

Educators in VR

Info on all presentations: https://account.altvr.com/channels/1182698623012438188

Charlie Fink: Setting the Table for the Next Decade in XR

Translating Training Requirements into Immersive Experience

Virtual Reality Technologies for Learning Designers

Virtual Reality Technologies for Learning Designers Margherita Berti

$$$$$$$$$$$$$$$$$$$$$$

Technology Acceptance and Learning Process Victoria Bolotina part 1

Technology Acceptance and Learning Process Victoria Bolotina part 2

Assessment of Learning Activities in VR Evelien Ydo part 2

++++++++++++++++++++++++++++++++++++++++

VR: So Much More Than a Field Trip, Shannon Putman, Graduate Assistant/PhD Student, University of Louisville (SPED, special education) https://account.altvr.com/events/1406092840622096803

++++++++++++++++++++++++++++++

VR and Health Professionals Rob Theriault

+++++++++++++++++++++++

Transform Your History Lessons with AR and VR Michael Fricano II

++++++++++++++++++++++++++++

Transform Your History Lessons with AR and VR Michael Fricano II, Technology Integration Specialist https://www.arvreduhub.com/transform-history

Qlone App for 3D scanning

++++++++++++++++++++++++++++++++++++++

2020 Educators in VR International Summit

The 2020 Educators in VR International Summit is February 17-22. It features over 170 speakers in 150+ events across multiple social and educational platforms, including AltspaceVR, ENGAGE, rumii, Mozilla Hubs, and Somnium Space.

The event requires no registration; it is virtual only, free, and open to the public. Platform access is required, so please install one of the above platforms to attend the International Summit. You may attend in 2D on a desktop or laptop computer with headphones and a microphone (a USB gaming headset is recommended), or with a VR device such as the Oculus Go, Quest, or Rift, the HTC Vive, and other mobile and tethered devices. Please note the specifications and requirements of each platform.

The majority of our events are on AltspaceVR. AltspaceVR is available for Samsung Gear VR, on the Steam Store for HTC Vive and Windows Mixed Reality, and on the Oculus Store for Rift, Go, and Quest users. Download and install the 2D version for use on your Windows desktop computer.

Charlie Fink, author, columnist for Forbes magazine, and adjunct faculty member at Chapman University, will be presenting “Setting the Table for the Next Decade in XR,” discussing the future of this innovative and immersive technology, at the 2020 Educators in VR International Summit. He will be speaking in AltspaceVR on Tuesday, February 18 at 1:00 PM EST.

International Summit

Setting the Table for the Next Decade in XR 1PM, Tues, Feb 18 https://account.altvr.com/events/1406089727517393133

Finding a New Literacy for a New Reality 5PM, Tues, Feb 18

https://account.altvr.com/events/1406093036194103494 (schedule for the New Literacy session)

Finding a New Literacy for a New Reality

Dr. Sarah Jones, Deputy Dean, De Montfort University

This workshop with Dr. Sarah Jones will focus on developing a relevant and new literacy for virtual reality, including the core competencies and skills needed to develop and understand how to become an engaged user of the technology in a meaningful way. The workshop will develop into research for a forthcoming book on Uncovering a Literacy for VR due to be published in 2020.

Sarah is listed as one of the top 15 global influencers within virtual reality. After nearly a decade in television news, Sarah began working in universities focusing on future media, future technology and future education. Sarah holds a PhD in Immersive Storytelling and has published extensively on virtual and augmented reality, whilst continuing to make and create immersive experiences. She has advised the UK Government on Immersive Technologies and delivers keynotes and speaks at conferences across the world on imagining future technology. Sarah is committed to diversifying the media and technology industries and regularly champions initiatives to support this agenda.

Inter-cognitive and Intra-cognitive Communication in Virtual Reality

Inter-cognitive and Intra-cognitive Communication in Virtual Reality

Michael Vallance, Professor, Future University Hakodate

Currently there are limited ways to connect 3D VR environments to physical objects in the real world whilst simultaneously conducting communication and collaboration between remote users. Within the context of a solar power plant, the performance metrics of the site are invaluable for environmental engineers who are remotely located. Often two or more remotely located engineers need to communicate and collaborate on solving a problem. If a solar panel component is damaged, the repair often needs to be undertaken on-site, thereby incurring additional expenses. This communication is categorized as inter-cognitive or intra-cognitive: inter-cognitive communication, where information transfer occurs between two cognitive entities with different cognitive capabilities (e.g., between a human and an artificially cognitive system); and intra-cognitive communication, where information transfer occurs between two cognitive entities with equivalent cognitive capabilities (e.g., between two humans) [Baranyi and Csapo, 2010]. Currently, non-VR solutions offer a comprehensive analysis of solar plant data, and a regular PC with a monitor has advantages over 3D VR. For example, sensors can be monitored using dedicated software such as EPEVER or via a web browser, as exemplified by the comprehensive service provided by Elseta. But when multiple users are able to collaborate remotely within a three-dimensional virtual simulation, the opportunities for communication, training, and academic education will be profound.

Michael Vallance Ed.D. is a researcher in the Department of Media Architecture, Future University Hakodate, Japan. He has been involved in educational technology design, implementation, research, and consultancy for over twenty years, working closely with higher education institutes, schools, and media companies in the UK, Singapore, Malaysia, and Japan. His 3D virtual world design and tele-robotics research has been recognized and funded by the UK Prime Minister's Initiative (PMI2) and the Japan Advanced Institute of Science and Technology (JAIST). He has been awarded by the United States Army for his research on collaborative programming of robots in a 3D virtual world.

Create Strategic Snapchat & Instagram AR Campaigns

Create Strategic Snapchat & Instagram AR Campaigns

Dominique Wu, CEO/Founder, Hummingbirdsday

Augmented Reality lenses are popular among young people thanks to Snapchat's invention, and businesses lose money by not fully using social media targeting young people (14-25). In this presentation, Dominique Wu will show how businesses can generate more leads through Spark AR (Facebook AR/Instagram AR) and Snapchat AR Lenses, and how to create strategic Snapchat and Instagram AR campaigns.

Dominique Wu is an XR social media strategist and expert in UX/UI design. She has her own YouTube and Apple Podcasts show called “XReality: Digital Transformation,” covering the technology and techniques of incorporating XR and AR into social media and marketing, and integrating them into enterprise solutions.

Mixed Reality in Classrooms Near You

Mixed Reality in Classrooms Near You

Mark Christian, EVP, Strategy and Corporate Development, GIGXR

Mixed reality devices like the HoloLens are transforming education now. Mark Christian will discuss how the technology is not about edge use cases or proofs of concept, but real, usable products already at universities, transforming the way we teach and learn. Christian will talk about the products of GIGXR, the story of how they were developed, and what the research is saying about their efficacy. It is time to move to adoption of XR technology in education; learn how one team has made this a reality.

As CEO of forward-thinking virtual reality and software companies, Mark Christian employs asymmetric approaches to rapid, global market adoption, hiring, diversity and revenue. He prides himself on unconventional approaches to building technology companies.

Designing Educational Content in VR

Designing Educational Content in VR

Avinash Gyawali, VR Developer, Weaver Studio

Virtual reality is an effective medium for education only if it is done right. Whether VR is considered a gimmick or not depends on how the software applications are designed and developed, not on hardware limitations. Avinash Gyawali will give insight into VR development for educational content specifically designed for students of lower secondary school, and will also provide insights into game development in the Unity3D game engine.

A game and VR developer with over 3 years of experience in game development, and developer of Zombie Shooter, winner of various national awards in the gaming and entertainment category, Avinash Gyawali is the developer of EDVR, an immersive, voice-controlled VR experience specially designed for children aged 10-18.

8:00 AM PST Research Virtual Reality Technologies for Learning Designers Margherita Berti ASVR

Virtual Reality Technologies for Learning Designers

Margherita Berti

Virtual Reality (VR) is a computer-generated experience that simulates presence in real or imagined environments (Kerrebrock, Brengman, & Willems, 2017). VR promotes contextualized learning, authentic experiences, critical thinking, and problem-solving opportunities. Despite the great potential and popularity of this technology, the two latest editions of the Educause Horizon Report (2018, 2019) have argued that VR remains “elusive” in terms of mainstream adoption. The reasons are varied, including the expense and the lack of empirical evidence for its effectiveness in education. More importantly, examples of successful VR implementations for instructors who lack technical skills are still scarce. Margherita Berti will discuss a range of easy-to-use educational VR tools, examples of VR-based activities, and the learning theories and instructional design principles utilized in their development.

Margherita Berti is a doctoral candidate in Second Language Acquisition and Teaching (SLAT) and Educational Technology at the University of Arizona. Her research specialization resides at the intersection of virtual reality, the teaching of culture, and curriculum and content development for foreign language education.

Wed 11:00 AM PST Special Event Gamifying the Biblioverse with Metaverse Amanda Fox VR Design / Biblioverse / Training & Embodiment ASVR

Gamifying the Biblioverse with Metaverse

Amanda Fox, Creative Director of STEAMPunks/MetaInk Publishing, MetaInk Publishing

There is a barrier between an author and the readers of his or her books: the author's journey ends, and the reader's begins. But what if, as an author/trainer, you could use gamification and augmented reality (AR) to interact with and coach your readers as part of their learning journey? Attend this session with Amanda Fox to learn how the book Teachingland leverages augmented reality tools such as Metaverse to connect with readers beyond the text.

Amanda Fox, Creative Director of STEAMPunksEdu, is the author of Teachingland: A Teacher's Survival Guide to the Classroom Apocalypse and Zom-Be A Design Thinker. Check her out on the Virtual Reality Podcast, or connect with her on Twitter @AmandaFoxSTEM.

Wed 10:00 AM PST Research Didactic Activity of the Use of VR and Virtual Worlds to Teach Design Fundamentals Christian Jonathan Angel Rueda VR Design / Biblioverse / Training & Embodiment ASVR

Didactic Activity of the Use of VR and Virtual Worlds to Teach Design Fundamentals

Christian Jonathan Angel Rueda, research professor, Autonomous University of Queretaro (Universidad Autónoma de Querétaro)

Christian Jonathan Angel Rueda specializes in the didactic use of virtual reality and virtual worlds to teach the fundamentals of design. He shares the development of a course that includes recreating the fundamentals learned in class in a three-dimensional environment, a demonstration of the works developed throughout the semester that applies design-foundation knowledge creatively, and a final project scenario connecting the scenes in which students showed their work throughout the semester.

Christian Jonathan Angel Rueda is a research professor at the Autonomous University of Queretaro in Mexico. With a PhD in educational technology, Christian has published several papers on the intersection of education, pedagogy, and three-dimensional immersive digital environments. He is also an edtech, virtual reality, and social media consultant at Eco Onis.

Thu 11:00 AM PST vCoaching Closing the Gap Between eLearning and XR Richard Van Tilborg XR eLearning / Laughter Medicine ASVR

Closing the Gap Between eLearning and XR

Richard Van Tilborg, founder, CoVince

How can we bridge the gap between eLearning and XR? Richard Van Tilborg discusses combining brain insights with new technologies, along with training and education cases realised with the CoVince platform: journeys that start on your mobile and continue in VR, with the possibility to earn from your creations and a central distribution place for learning and data.

Richard Van Tilborg works with the CoVince platform, a VR platform offering training and educational programs for central distribution of learning and data. He is an author and speaker focusing on computers and education in virtual reality-based tasks for delivering feedback.

 

Thu 12:00 PM PST Research Assessment of Learning Activities in VR Evelien Ydo Technology Acceptance / Learning Assessment / Vaping Prevention ASVR
Thu 6:00 PM PST Down to Basics Copyright and Plagiarism Protections in VR Jonathan Bailey ASVR

 

Thu 8:00 PM PST Diversity Cyberbullying in VR John Williams, Brennan Hatton, Lorelle VanFossen ASVR

IM 690 VR and AR lab

IM 690 Virtual Reality and Augmented Reality. short link: http://bit.ly/IM690lab

IM 690 lab plan for Feb. 18, MC 205:  Experience VR and AR

What is an “avatar” and why do we need to know how it works?

How does the book (and the movie) “Ready Player One” project the education of the future?

Peter Rubin's “Future Presence” pictures XR beyond education. How would such changes in society and our behavior influence education?

Readings:

Each group selected one article from this selection: https://blog.stcloudstate.edu/ims/2020/02/11/immersive-reality-and-instructional-design/
to discuss the approach of an instructional designer to XR.

Announcements:
https://blog.stcloudstate.edu/ims/2020/02/07/educators-in-vr/

https://blog.stcloudstate.edu/ims/2020/01/30/realities360-conference/

Translating Training Requirements into Immersive Experience

Virtual Reality Technologies for Learning Designers

Virtual Reality Technologies for Learning Designers


Inter-cognitive and Intra-cognitive communication in VR: https://slides.com/michaelvallance/deck-25c189#/

https://www.youtube.com/channel/UCGHRSovY-KvlbJHkYnIC-aA

People with dementia

https://docs.google.com/presentation/d/e/2PACX-1vSVNHSXWlcLzWZXObifZfhrL8SEeYA59IBdatR1kI7Q-Hry20AHtvLVTWQyH3XxBQ/pub?start=false&loop=false&delayms=60000&slide=id.p1

Free resources:
https://blog.stcloudstate.edu/ims?s=free+audio (search also for free sound and free multimedia)

Lab work:

  1. Video 360 as VR entry level
    1. During Lab work on Jan 28, we experienced Video 360 cardboard movies
      let’s take 5-10 min and check out the following videos (select and watch at least three of them)

      1. F2F students, please use Google Cardboard
      2. Online students, please view on your computer or mobile devices if you don't have goggles at your house (you can now purchase goggles for $5-7 from second-hand stores such as Goodwill)
      3. Both F2F and online students: here are directions for how to easily open the movies on your mobile devices:
        1. Copy the URL and email it to yourself.
          Open the email on your phone and click on the link.
          If you have goggles, click on the appropriate icon in the lower right corner and insert the phone into the goggles.
        2. Open your D2L course on your phone (you can use the mobile app).
          Go to the D2L Content module with these directions and click on the link.
          After the link opens, insert the phone into the goggles to watch the video.
      4. Videos:
        While watching the videos, consider the following objectives:
        – Does this particular technology fit in the instructional design (ID) frames and theories covered, e.g., PBL, CBL, Activity Theory, the ADDIE Model, TIM, etc. (https://blog.stcloudstate.edu/ims/2020/01/29/im-690-id-theory-and-practice/)? Can you connect the current state, but also the potential, of this technology with any of these frameworks and theories? For example, how would Google Tour Creator or any of these videos fit into the Analysis – Design – Development – Implementation – Evaluation process? Or how do you envision your Google Tour Creator project or any of these videos fitting into the Entry – Adoption – Adaptation – Infusion – Transformation process?

– How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
– What models and ideas from the videos seem possible for you to replicate?

Assignment: Use Google Cardboard to watch at least three of the following options
YouTube:
Elephants (think how it can be used for education)
https://youtu.be/2bpICIClAIg
Sharks (think how it can be used for education)
https://youtu.be/aQd41nbQM-U
Solar system
https://youtu.be/0ytyMKa8aps
Dementia
https://youtu.be/R-Rcbj_qR4g
Facebook
https://www.facebook.com/EgyptVR/photos/a.1185857428100641/1185856994767351/

From Peter Rubin’s Future Presence: here is a link https://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/ if you want to learn more
Empathy, Chris Milk, https://youtu.be/iXHil1TPxvA
Clouds Over Sidra, https://youtu.be/mUosdCQsMkM

  1. Assignment: Group work
    1. Find one F2F and one online peer to form a group.
      Based on the questions/directions given before you started watching the videos:
      – Does this particular technology fit in the instructional design (ID) frames and theories covered, e.g., PBL, CBL, Activity Theory, the ADDIE Model, TIM, etc. (https://blog.stcloudstate.edu/ims/2020/01/29/im-690-id-theory-and-practice/)? Can you connect the current state, but also the potential, of this technology with any of these frameworks and theories? For example, how would Google Tour Creator or any of these videos fit into the Analysis – Design – Development – Implementation – Evaluation process? Or how do you envision your Google Tour Creator project or any of these videos fitting into the Entry – Adoption – Adaptation – Infusion – Transformation process?
      – How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
      – What models and ideas from the videos seem possible for you to replicate?
      Exchange thoughts with your peers and make a plan to create a similar educational product.
    2. Post your writing in the following D2L Discussions thread: https://stcloudstate.learn.minnstate.edu/d2l/le/4819732/discussions/threads/43483637/View
  1. Lenovo Daydream as VR advanced level
    1. Recording in DayDream
      https://skarredghost.com/2018/08/17/how-to-shoot-cool-screenshots-videos-lenovo-mirage-solo-and-save-them-on-pc/
    2. Using the controller
      https://support.google.com/daydream/answer/7184597?hl=en
    3. Using the menu
    4. Watching 360 video in YouTube
      1. Using keyboard to search
      2. Using voice command to search
    5. Using Labster. https://www.labster.com/
      1. Record how far in the lab you managed to proceed
    6. Playing the games
      1. Evaluate the ability of the game you played to be incorporated into the educational process

Assignment: In 10-15 min (mind your peers, since we have only one headset), do your best to evaluate one educational app (e.g., Labster) and one leisure app (a game).
Use the same questions to evaluate the Lenovo Daydream:
– Does this particular technology fit in the instructional design (ID) frames and theories covered, e.g., PBL, CBL, Activity Theory, the ADDIE Model, TIM, etc. (https://blog.stcloudstate.edu/ims/2020/01/29/im-690-id-theory-and-practice/)? Can you connect the current state, but also the potential, of this technology with any of these frameworks and theories? For example, how would Google Tour Creator or any of these videos fit into the Analysis – Design – Development – Implementation – Evaluation process? Or how do you envision your Google Tour Creator project or any of these videos fitting into the Entry – Adoption – Adaptation – Infusion – Transformation process?
– How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
– What models and ideas from the videos seem possible for you to replicate?

+++++++++++
Plamen Miltenoff, Ph.D., MLIS
Professor
320-308-3072
pmiltenoff@stcloudstate.edu
http://web.stcloudstate.edu/pmiltenoff/faculty/
schedule a meeting: https://doodle.com/digitalliteracy
find my office: https://youtu.be/QAng6b_FJqs
