Searching for "project based"

in-house library counters

LITA listserv exchange on “Raspberry PI Counter for Library Users”

On 7/10/20, 10:05 AM, Hammer, Erich F <erich@albany.edu> (via lita-l-request@lists.ala.org) wrote:

Jason,

I think that is a very interesting project.  If I understand how it works (comparing reference images to live images), it should still work if a “fuzzy” or translucent filter were placed on the lens as a privacy measure, correct? You could even make the fuzzy video publicly accessible to prove to folks that privacy is protected.

If that’s the case, IMHO, it really is a commercially viable idea and it would have a market far beyond libraries.  Open source code and hardware designs and sales of pre-packaged hardware and support.  Time for some crowdsource funding!  🙂

Erich

On Friday, July 10, 2020 at 10:14, Jason Griffey eloquently inscribed:
> I ran a multi-year project to do counting (as well as attention measurement)
> called Measure the Future (http://measurethefuture.net). That project is in
> desperate need of updating… there has been some work done on it at the
> University of OK libraries, but we haven’t seen their code push yet. As the
> code stands on GitHub, it isn’t usable… the installation is broken based on
> some underlying dependencies. The Univ of OK code fixes the issue, but it
> hasn’t been pushed yet. But if you want to see the general code and the way we
> approached it, that is all available.
> Jason
>
> On Jul 8, 2020, 1:37 PM -0500, Mitchell, James Ray <jmitchell20@una.edu> wrote:
>
> Hi Kun,
> I don’t know if this will be useful to you or not, but Code4Lib Journal
> had an article a couple of years ago that might be helpful. It’s called
> “Testing Three Types of Raspberry Pi People Counters.” The link to the
> article is https://journal.code4lib.org/articles/12947
> Regards,
> James

My note:
In 2018, following the university president’s call for ANY possible savings, the library administration was sent a proposal requesting information about the license for the current library counters and proposing to save the license money by building an in-house Arduino counter. The blueprints for such a counter were shared (as per another LITA listserv exchange). An SCSU Physics professor’s agreement to lead the project was secured, as well as the opportunity for SCSU Physics students to develop the project as part of their individual study plans. The proposal was never addressed by either middle or upper management.
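For readers who want to picture how such a counter works, here is a minimal sketch only (not Measure the Future’s actual code, and not the Arduino blueprint from the listserv): a Raspberry Pi with a camera and OpenCV comparing each frame against the previous one and counting rising edges of motion at the gate. The camera index and the motion threshold are assumptions to be tuned on site; because the script only compares successive frames and stores nothing, a blurred or translucent lens filter of the kind Erich suggests would not break the approach.

import cv2

# Minimal frame-differencing gate counter sketch (assumptions: camera index 0,
# thresholds tuned for a narrow doorway). Illustrative only.
cap = cv2.VideoCapture(0)                       # first attached camera
_, prev = cap.read()
prev = cv2.GaussianBlur(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), (21, 21), 0)
count, in_motion = 0, False

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    delta = cv2.threshold(cv2.absdiff(prev, gray), 25, 255, cv2.THRESH_BINARY)[1]
    moving = cv2.countNonZero(delta) > 5000     # "enough" changed pixels = someone in frame
    if moving and not in_motion:                # rising edge = one pass through the gate
        count += 1
        print("patron count:", count)
    in_motion = moving
    prev = gray

cap.release()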

+++++++++++++
more on raspberry pi in this IMS blog
http://blog.stcloudstate.edu/ims?s=raspberry

more on arduino in this IMS blog
http://blog.stcloudstate.edu/ims?s=arduino

Emerging Trends and Impacts of the Internet of Things in Libraries

https://www.igi-global.com/gateway/book/244559

Chapters:

Holland, B. (2020). Emerging Technology and Today’s Libraries. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 1-33). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch001

The purpose of this chapter is to examine emerging technology and today’s libraries. New technology stands out first and foremost given that they will end up revolutionizing every industry in an age where digital transformation plays a major role. Major trends will define technological disruption. The next-gen of communication, core computing, and integration technologies will adopt new architectures. Major technological, economic, and environmental changes have generated interest in smart cities. Sensing technologies have made IoT possible, but also provide the data required for AI algorithms and models, often in real-time, to make intelligent business and operational decisions. Smart cities consume different types of electronic internet of things (IoT) sensors to collect data and then use these data to manage assets and resources efficiently. This includes data collected from citizens, devices, and assets that are processed and analyzed to monitor and manage, schools, libraries, hospitals, and other community services.

Makori, E. O. (2020). Blockchain Applications and Trends That Promote Information Management. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 34-51). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch002
Blockchain revolutionary paradigm is the new and emerging digital innovation that organizations have no choice but to embrace and implement in order to sustain and manage service delivery to the customers. From disruptive to sustaining perspective, blockchain practices have transformed the information management environment with innovative products and services. Blockchain-based applications and innovations provide information management professionals and practitioners with robust and secure opportunities to transform corporate affairs and social responsibilities of organizations through accountability, integrity, and transparency; information governance; data and information security; as well as digital internet of things.
Hahn, J. (2020). Student Engagement and Smart Spaces: Library Browsing and Internet of Things Technology. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 52-70). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch003
The purpose of this chapter is to provide evidence-based findings on student engagement within smart library spaces. The focus of smart libraries includes spaces that are enhanced with the internet of things (IoT) infrastructure and library collection maps accessed through a library-designed mobile application. The analysis herein explored IoT-based browsing within an undergraduate library collection. The open stacks and mobile infrastructure provided several years (2016-2019) of user-generated smart building data on browsing and selecting items in open stacks. The methods of analysis used in this chapter include transactional analysis and data visualization of IoT infrastructure logs. By analyzing server logs from the computing infrastructure that powers the IoT services, it is possible to infer in greater detail than heretofore possible the specifics of the way library collections are a target of undergraduate student engagement.
Treskon, M. (2020). Providing an Environment for Authentic Learning Experiences. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 71-86). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch004
The Loyola Notre Dame Library provides authentic learning environments for undergraduate students by serving as “client” for senior capstone projects. Through the creative application of IoT technologies such as Arduinos and Raspberry Pis in a library setting, the students gain valuable experience working through software design methodology and create software in response to a real-world challenge. Although these proof-of-concept projects could be implemented, the library is primarily interested in furthering the research, teaching, and learning missions of the two universities it supports. Whether the library gets a product that is worth implementing is not a requirement; it is a “bonus.”
Rashid, M., Nazeer, I., Gupta, S. K., & Khanam, Z. (2020). Internet of Things: Architecture, Challenges, and Future Directions. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 87-104). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch005
The internet of things (IoT) is a computing paradigm that has changed our daily livelihood and functioning. IoT focuses on the interconnection of all the sensor-based devices like smart meters, coffee machines, cell phones, etc., enabling these devices to exchange data with each other during human interactions. With easy connectivity among humans and devices, speed of data generation is getting multi-fold, increasing exponentially in volume, and is getting more complex in nature. In this chapter, the authors will outline the architecture of IoT for handling various issues and challenges in real-world problems and will cover various areas where usage of IoT is done in real applications. The authors believe that this chapter will act as a guide for researchers in IoT to create a technical revolution for future generations.
Martin, L. (2020). Cloud Computing, Smart Technology, and Library Automation. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 105-123). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch006
As technology continues to change, the landscape of the work of librarians and libraries continue to adapt and adopt innovations that support their services. Technology also continues to be an essential tool for dissemination, retrieving, storing, and accessing the resources and information. Cloud computing is an essential component employed to carry out these tasks. The concept of cloud computing has long been a tool utilized in libraries. Many libraries use OCLC to catalog and manage resources and share resources, WorldCat, and other library applications that are cloud-based services. Cloud computing services are used in the library automation process. Using cloud-based services can streamline library services, minimize cost, and the need to have designated space for servers, software, or other hardware to perform library operations. Cloud computing systems with the library consolidate, unify, and optimize library operations such as acquisitions, cataloging, circulation, discovery, and retrieval of information.
Owusu-Ansah, S. (2020). Developing a Digital Engagement Strategy for Ghanaian University Libraries: An Exploratory Study. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 124-139). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch007
This study represents a framework that digital libraries can leverage to increase usage and visibility. The adopted qualitative research aims to examine a digital engagement strategy for the libraries in the University of Ghana (UG). Data is collected from participants (digital librarians) who are key stakeholders of digital library service provision in the University of Ghana Library System (UGLS). The chapter reveals that digital library services included rare collections, e-journal, e-databases, e-books, microfilms, e-theses, e-newspapers, and e-past questions. Additionally, the research revealed that the digital library service patronage could be enhanced through outreach programmes, open access, exhibitions, social media, and conferences. Digital librarians recommend that to optimize digital library services, literacy programmes/instructions, social media platforms, IT equipment, software, and website must be deployed. In conclusion, a DES helps UGLS foster new relationships, connect with new audiences, and establish new or improved brand identity.
Nambobi, M., Ssemwogerere, R., & Ramadhan, B. K. (2020). Implementation of Autonomous Library Assistants Using RFID Technology. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 140-150). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch008
This is an interesting time to innovate around disruptive technologies like the internet of things (IoT), machine learning, blockchain. Autonomous assistants (IoT) are the electro-mechanical system that performs any prescribed task automatically with no human intervention through self-learning and adaptation to changing environments. This means that by acknowledging autonomy, the system has to perceive environments, actuate a movement, and perform tasks with a high degree of autonomy. This means the ability to make their own decisions in a given set of the environment. It is important to note that autonomous IoT using radio frequency identification (RFID) technology is used in educational sectors to boost the research the arena, improve customer service, ease book identification and traceability of items in the library. This chapter discusses the role, importance, the critical tools, applicability, and challenges of autonomous IoT in the library using RFID technology.
Priya, A., & Sahana, S. K. (2020). Processor Scheduling in High-Performance Computing (HPC) Environment. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 151-179). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch009
Processor scheduling is one of the thrust areas in the field of computer science. The future technologies use a huge amount of processing for execution of their tasks like huge games, programming software, and in the field of quantum computing. In real-time, many complex problems are solved by GPU programming. The primary concern of scheduling is to reduce the time complexity and manpower. Several traditional techniques exit for processor scheduling. The performance of traditional techniques is reduced when it comes to the huge processing of tasks. Most scheduling problems are NP-hard in nature. Many of the complex problems are recently solved by GPU programming. GPU scheduling is another complex issue as it runs thousands of threads in parallel and needs to be scheduled efficiently. For such large-scale scheduling problems, the performance of state-of-the-art algorithms is very poor. It is observed that evolutionary and genetic-based algorithms exhibit better performance for large-scale combinatorial and internet of things (IoT) problems.
Kirsch, B. (2020). Virtual Reality in Libraries. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 180-193). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch010
Librarians are beginning to offer virtual reality (VR) services in libraries. This chapter reviews how libraries are currently using virtual reality for both consumption and creation purposes. Virtual reality tools will be compared and contrasted, and recommendations will be given for purchasing and circulating headsets and VR equipment. Google Tour Creator and a smartphone or 360-degree camera can be used to create a virtual tour of the library and other virtual reality content. These new library services will be discussed along with practical advice and best practices for incorporating virtual reality into the library for instructional and entertainment purposes.
Heffernan, K. L., & Chartier, S. (2020). Augmented Reality Gamifies the Library: A Ride Through the Technological Frontier. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 194-210). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch011
Two librarians at a University in New Hampshire attempted to integrate gamification and mobile technologies into the exploration of, and orientation to, the library’s services and resources. From augmented reality to virtual escape rooms and finally an in-house app created by undergraduate, campus-based, game design students, the library team learned much about the triumphs and challenges that come with attempting to utilize new technologies to reach users in the 21st century. This chapter is a narrative describing years of various attempts, innovation, and iteration, which have led to the library team being on the verge of introducing an app that could revolutionize campus discovery and engagement.
Miltenoff, P. (2020). Video 360 and Augmented Reality: Visualization to Help Educators Enter the Era of eXtended Reality. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 211-225). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch012
The advent of all types of eXtended Reality (XR)—VR, AR, MR—raises serious questions, both technological and pedagogical. The setup of campus services around XR is only the prelude to the more complex and expensive project of creating learning content using XR. In 2018, the authors started a limited proof-of-concept augmented reality (AR) project for a library tour. Building on their previous research and experience creating a virtual reality (VR) library tour, they sought a scalable introduction of XR services and content for the campus community. The AR library tour aimed to start us toward a matrix for similar services for the entire campus. They also explored the attitudes of students, faculty, and staff toward this new technology and its incorporation in education, as well as its potential and limitations toward the creation of a “smart” library.

iLearn2020

YouTube Live stream: https://www.youtube.com/watch?v=DSXLJGhI2D8&feature=youtu.be
and the Discord directions: https://docs.google.com/document/d/1GgI4dfq-iD85yJiyoyPApB33tIkRJRns1cJ8OpHAYno/edit

Modest3D Guided Virtual Adventure – iLRN Conference 2020 – Session 1: currently, live session: https://youtu.be/GjxTPOFSGEM

https://mediaspace.minnstate.edu/media/Modest+3D/1_28ejh60g

CALL FOR PROPOSALS: GUIDED VIRTUAL ADVENTURE TOURS
at iLRN 2020: 6th International Conference of the Immersive Learning Research Network
Organized in conjunction with Educators in VR
Technically co-sponsored by the IEEE Education Society
June 21-25, 2020, Online
Conference theme: “Vision 20/20: Hindsight, Insight, and Foresight in XR and Immersive Learning”
Conference website: http://immersivelrn.org/ilrn2020
++++++++++++++++++++++++++++++
Wednesday, June 24 • 12:00pm – 1:00pm

 Instruction and Instructional Design

Presentation 1: Inspiring Faculty (+ Students) with Tales of Immersive Tech (Practitioner Presentation #106)

Authors: Nicholas Smerker

Immersive technologies – 360º video, virtual and augmented realities – are being discussed in many corners of higher education. For an instructor who is familiar with the terms, at least in passing, learning more about why they and their students should care can be challenging, at best. In order to create a font of inspiration, the IMEX Lab team within Teaching and Learning with Technology at Penn State devised its Get Inspired web resource. Building on a similar repository for making technology stories at the sister Maker Commons website, the IMEX Lab Get Inspired landing page invites faculty to discover real world examples of how cutting edge XR tools are being used every day. In addition to very approachable video content and a short summary calling out why our team chose the story, there are also instructional designer-developed Assignment Ideas that allow for quick deployment of exercises related to – though not always relying upon – the technologies highlighted in a given Get Inspired story.

Presentation 2: Lessons Learned from Over A Decade of Designing and Teaching Immersive VR in Higher Education Online Courses (Practitioner Presentation #101)

Authors: Eileen Oconnor

This presentation overviews the design and instruction in immersive virtual reality environments created by the author beginning with Second Life and progressing to open source venues. It will highlight the diversity of VR environment developed, the challenges that were overcome, and the accomplishment of students who created their own VR environments for K12, college and corporate settings. The instruction and design materials created to enable this 100% online master’s program accomplishment will be shared; an institute launched in 2018 for emerging technology study will be noted.

Presentation 3: Virtual Reality Student Teaching Experience: A Live, Remote Option for Learning Teaching Skills During Campus Closure and Social Distancing (Practitioner Presentation #110)

Authors: Becky Lane, Christine Havens-Hafer, Catherine Fiore, Brianna Mutsindashyaka and Lauren Suna

Summary: During the Coronavirus pandemic, Ithaca College teacher education majors needed a classroom of students in order to practice teaching and receive feedback, but the campus was closed, and gatherings forbidden. Students were unable to participate in live practice teaching required for their program. We developed a virtual reality pilot project to allow students to experiment in two third-party social VR programs, AltSpaceVR and Rumii. Social VR platforms allow a live, embodied experience that mimics in-person events to give students a more realistic, robust and synchronous teaching practice opportunity. We documented the process and lessons learned to inform, develop and scale next generation efforts.

++++++++++++++++++++++++++
Tuesday, June 23 • 5:00pm – 6:00pm
+++++++++++++++++++++++++++
Sunday, June 21 • 8:00am – 9:00am
Escape the (Class)room games in OpenSim or Second Life [FULL]
https://ilrn2020.sched.com/event/ceKP/escape-the-classroom-games-in-opensim-or-second-life
Pre-registration for this tour is required as places are limited. Joining instructions will be emailed to registrants ahead of the scheduled tour time.
The Guided Virtual Adventure tour will take you to EduNation in Second Life to experience an Escape room game. For one hour, a group of participants engage in voice communication and try to solve puzzles, riddles or conundrums and follow clues to eventually escape the space. These scenarios are designed for problem solving and negotiating language and are ideal for language education. They are fun and exciting, and the ticking clock adds to the game play.
Tour guide(s)/leader(s): Philp Heike, let’s talk online sprl, Belgium

Target audience sector: Informal and/or lifelong learning

Supported devices: Desktop/laptop – Windows, Desktop/laptop – Mac

Platform/environment access: Download from a website and install on a desktop/laptop computer
Official website: http://www.secondlife.com

+++++++++++++++++++

Thursday, June 25 • 9:00am – 10:00am

Games and Gamification II


Presentation 1: Evaluating the impact of multimodal Collaborative Virtual Environments on user’s spatial knowledge and experience of gamified educational tasks (Full Paper #91)

Authors: Ioannis Doumanis and Daphne Economou


Several research projects in spatial cognition have suggested Virtual Environments (VEs) as an effective way of facilitating mental map development of a physical space. In the study reported in this paper, we evaluated the effectiveness of multimodal real-time interaction in distilling understanding of the VE after completing gamified educational tasks. We also measure the impact of these design elements on the user’s experience of educational tasks. The VE used reassembles an art gallery and it was built using REVERIE (Real and Virtual Engagement In Realistic Immersive Environment) a framework designed to enable multimodal communication on the Web. We compared the impact of REVERIE VG with an educational platform called Edu-Simulation for the same gamified educational tasks. We found that the multimodal VE had no impact on the ability of students to retain a mental model of the virtual space. However, we also found that students thought that it was easier to build a mental map of the virtual space in REVERIE VG. This means that using a multimodal CVE in a gamified educational experience does not benefit spatial performance, but also it does not cause distraction. The paper ends with future work and conclusions and suggestions for improving mental map construction and user experience in multimodal CVEs.

Presentation 2: A case study on student’s perception of the virtual game supported collaborative learning (Full Paper #42)

Authors: Xiuli Huang, Juhou He and Hongyan Wang


The English education course in China aims to help students establish the English skills to enhance their international competitiveness. However, in traditional English classes, students often lack the linguistic environment to apply the English skills they learned in their textbook. Virtual reality (VR) technology can set up an immersive English language environment and then promote the learners to use English by presenting different collaborative communication tasks. In this paper, spherical video-based virtual reality technology was applied to build a linguistic environment and a collaborative learning strategy was adopted to promote their communication. Additionally, a mixed-methods research approach was used to analyze students’ achievement between a traditional classroom and a virtual reality supported collaborative classroom and their perception towards the two approaches. The experimental results revealed that the virtual reality supported collaborative classroom was able to enhance the students’ achievement. Moreover, by analyzing the interview, students’ attitudes towards the virtual reality supported collaborative class were reported and the use of language learning strategies in virtual reality supported collaborative class was represented. These findings could be valuable references for those who intend to create opportunities for students to collaborate and communicate in the target language in their classroom and then improve their language skills

!!!!!!!!!!!!!!!!!!!
Thursday, June 25 • 11:00am – 12:00pm

 Games and Gamification III


Presentation 1: Reducing Cognitive Load through the Worked Example Effect within a Serious Game Environment (Full Paper #19)

Authors: Bernadette Spieler, Naomi Pfaff and Wolfgang Slany


Novices often struggle to represent problems mentally; the unfamiliar process can exhaust their cognitive resources, creating frustration that deters them from learning. By improving novices’ mental representation of problems, worked examples improve both problem-solving skills and transfer performance. Programming requires both skills. In programming, it is not sufficient to simply understand how Stackoverflow examples work; programmers have to be able to adapt the principles and apply them to their own programs. This paper shows evidence in support of the theory that worked examples are the most efficient mode of instruction for novices. In the present study, 42 students were asked to solve the tutorial The Magic Word, a game especially for girls created with the Catrobat programming environment. While the experimental group was presented with a series of worked examples of code, the control groups were instructed through theoretical text examples. The final task was a transfer question. While the average score was not significantly better in the worked example condition, the fact that participants in this experimental group finished significantly faster than the control group suggests that their overall performance was better than that of their counterparts.

Presentation 2: A literature review of e-government services with gamification elements (Full Paper #56)

Authors: Ruth S. Contreras-Espinosa and Alejandro Blanco-M


Nowadays several democracies are facing the growing problem of a breach in communication between its citizens and their political representatives, resulting in low citizen’s engagement in the participation of political decision making and on public consultations. Therefore, it is fundamental to generate a constructive relationship between both public administration and the citizens by solving its needs. This document contains a useful literature review of the gamification topic and e-government services. The documents contain a background of those concepts and conduct a selection and analysis of the different applications found. A set of three lines of research gaps are found with a potential impact on future studies.

++++++++++++++++++
Thursday, June 25 • 12:00pm – 1:00pm

 Museums and Libraries


Presentation 1: Connecting User Experience to Learning in an Evaluation of an Immersive, Interactive, Multimodal Augmented Reality Virtual Diorama in a Natural History Museum & the Importance of Story (Full Paper #51)

Authors: Maria Harrington


Reported are the findings of user experience and learning outcomes from a July 2019 study of an immersive, interactive, multimodal augmented reality (AR) application, used in the context of a museum. The AR Perpetual Garden App is unique in creating an immersive multisensory experience of data. It allowed scientifically naïve visitors to walk into a virtual diorama constructed as a data visualization of a springtime woodland understory, and interact with multimodal information directly through their senses. The user interface comprised of two different AR data visualization scenarios reinforced with data based ambient bioacoustics, an audio story of the curator’s narrative, and interactive access to plant facts. While actual learning and dwell times were the same between the AR app and the control condition, the AR experience received higher ratings on perceived learning. The AR interface design features of “Story” and “Plant Info” showed significant correlations with actual learning outcomes, while “Ease of Use” and “3D Plants” showed significant correlations with perceived learning. As such, designers and developers of AR apps can generalize these findings to inform future designs.

Presentation 2: The Naturalist’s Workshop: Virtual Reality Interaction with a Natural Science Educational Collection (Short Paper #11)

Authors: Colin Patrick Keenan, Cynthia Lincoln, Adam Rogers, Victoria Gerson, Jack Wingo, Mikhael Vasquez-Kool and Richard L. Blanton


For experiential educators who utilize or maintain physical collections, The Naturalist’s Workshop is an exemplar virtual reality platform to interact with digitized collections in an intuitive and playful way. The Naturalist’s Workshop is a purpose-developed application for the Oculus Quest standalone virtual reality headset for use by museum visitors on the floor of the North Carolina Museum of Natural Sciences under the supervision of a volunteer attendant. Within the application, museum visitors are seated at a virtual desk. Using their hand controllers and head-mounted display, they explore drawers containing botanical specimens and tools-of-the-trade of a naturalist. While exploring, the participant can receive new information about any specimen by dropping it into a virtual examination tray. 360-degree photography and three-dimensionally scanned specimens are used to allow user-motivated, immersive experience of botanical meta-data such as specimen collection coordinates.

Presentation 3: 360˚ Videos: Entry level Immersive Media for Libraries and Education (Practitioner Presentation #132)

Authors: Diane Michaud


Within the continuum of XR Technologies, 360˚ videos are relatively easy to produce and need only an inexpensive mobile VR viewer to provide a sense of immersion. 360˚ videos present an opportunity to reveal “behind the scenes” spaces that are normally inaccessible to users of academic libraries. This can promote engagement with unique special collections and specific library services. In December 2019, with little previous experience, I led the production of a short 360˚video tour, a walk-through of our institution’s archives. This was a first attempt; there are plans to transform it into a more interactive, user-driven exploration. The beta version successfully generated interest, but the enhanced version will also help prepare uninitiated users for the process of examining unique archival documents and artefacts. This presentation will cover the lessons learned, and what we would do differently for our next immersive video production. Additionally, I will propose that the medium of 360˚ video is ideal for many institutions’ current or recent predicament with campuses shutdown due to the COVID-19 pandemic. Online or immersive 360˚ video can be used for virtual tours of libraries and/or other campus spaces. Virtual tours would retain their value beyond current campus shutdowns as there will always be prospective students and families who cannot easily make a trip to campus. These virtual tours would provide a welcome alternative as they eliminate the financial burden of travel and can be taken at any time.

++++++++++++++++++

bibliographical data analysis nVivo

Bibliographical data analysis with Zotero and nVivo

Bibliographic Analysis for Graduate Students, EDAD 518, Fri/Sat, May 15/16, 2020

This session will not be about qualitative research (QR) only, but rather about a modern 21st century approach toward the analysis of your literature review in Chapter 2.

However, the computational approach toward qualitative research is not much different from the computational approach to your quantitative research; you need to be versed in each of them, so familiarity with nVivo for qualitative research and with SPSS for quantitative research should be pursued by any doctoral student.

Qualitative Research

Here a short presentation on the basics:

http://blog.stcloudstate.edu/ims/2019/03/25/qualitative-analysis-basics/

Further, if you wish to expand your knowledge, on qualitative research (QR) in this IMS blog:

http://blog.stcloudstate.edu/ims?s=qualitative+research

Workshop on computational practices for QR:

http://blog.stcloudstate.edu/ims/2017/04/01/qualitative-method-research/

Here is a library instruction session for your course
http://blog.stcloudstate.edu/ims/2020/01/24/digital-literacy-edad-828/

Once you complete the overview of the resources above, please make sure you have Zotero working on your computer; we will be reviewing the Zotero features before we move to nVivo.

Here are materials on Zotero collected in the IMS blog:
http://blog.stcloudstate.edu/ims?s=zotero

Of those materials, you might want to cover at least:

https://youtu.be/ktLPpGeP9ic

Familiarity with Zotero is a prerequisite for successful work with nVivo, so please, if you are already working with Zotero, try to expand your knowledge using the materials above.

nVivo

http://blog.stcloudstate.edu/ims/2017/01/11/nvivo-shareware/

Please use this link to install nVivo on your computer. Even if we were not in quarantine and you were able to use the licensed nVivo software on campus, you would most probably still use the shareware for the convenience of working on your dissertation from home. The shareware is fully functional on your computer for 14 days, so calculate the time you will be using it and mind the date of installation and your subsequent work.

For the purpose of this workshop, please install nVivo on your computer early morning on Saturday, May 16, so we can work together on nVivo during the day and you can continue using the software for the next two weeks.

Please familiarize yourself with the two articles assigned in the EDAD 815 D2L course content “Practice Research Articles”:

Brosky, D. (2011). Micropolitics in the School: Teacher Leaders’ Use of Political Skill and Influence Tactics. International Journal of Educational Leadership Preparation, 6(1). https://eric.ed.gov/?id=EJ972880

Tooms, A. K., Kretovics, M. A., & Smialek, C. A. (2007). Principals’ perceptions of politics. International Journal of Leadership in Education, 10(1), 89–100. https://doi.org/10.1080/13603120600950901

It is very important to be familiar with the articles when we start working with nVivo.

++++++++++++++++

How to use Zotero

http://blog.stcloudstate.edu/ims/2020/01/27/zotero-workshop/

++++++++++++++++

How to use nVivo for bibliographic analysis

The following guideline is based on this document:

Bibliographical data analysis using Nvivo

whereas the snapshots are replaced with snapshots from nVivo, version 12, which we will be using in our course and for our dissertations.

Concept of bibliographic data

Bibliographic data is an organized collection of references to published literature, including journals, magazine articles, newspaper articles, conference proceedings, reports, and government and legal publications. Bibliographical data is important for writing the literature review of a study. This data is usually saved and organized in reference managers like Mendeley or EndNote. Nvivo provides the option to import bibliographical data from these tools directly: one can import an EndNote library or a Mendeley library into Nvivo. Similar to interview transcripts, one can represent and analyze bibliographical data using Nvivo. To start with bibliographical data representation, this article previews the processing of a literature review in Nvivo.

Importing bibliographical data

Bibliographic data is imported from Mendeley, EndNote, and other such databases or applications that are supported by Nvivo. Bibliographical data here refers to material in the form of articles, journals, or conference proceedings. Common factors among all of these data are the author’s name and the year of publication; therefore, Nvivo imports and arranges these data with the author’s name and year of publication as their titles. The process of importing bibliographical data is presented in the figures below.

[Screenshots: import Zotero data in nVivo; select the appropriate data from the external folder; steps 1–3: create a record in nVivo]
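As an aside (not part of the NVivo workflow itself), if you want to peek at what a Zotero export actually contains before NVivo ingests it, a short Python sketch like the one below reads an RIS file and prints author, year, and title for each reference. The file name is hypothetical; the file comes from Zotero’s File > Export Library option with the RIS format selected.

from collections import defaultdict

def read_ris(path):
    # Yield one dict per reference, keyed by RIS tags (AU, PY, TI, ...).
    record = defaultdict(list)
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            line = line.rstrip("\n")
            if line.startswith("ER  -"):              # end-of-record marker
                yield dict(record)
                record = defaultdict(list)
            elif len(line) > 6 and line[2:6] == "  - ":
                record[line[:2]].append(line[6:])     # tag -> value(s)

for ref in read_ris("my_zotero_export.ris"):          # hypothetical file name
    authors = "; ".join(ref.get("AU", ["unknown"]))
    year = (ref.get("PY") or ref.get("Y1") or ["n.d."])[0]
    title = (ref.get("TI") or ref.get("T1") or [""])[0]
    print(f"{authors} ({year}) {title}")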

 

Coding strategies for literature review

Coding is the process of identifying important parts or patterns in the sources and organizing them into theme nodes. In the case of a literature review, the sources are PDF files, so a literature review in Nvivo requires grouping information from the PDF files into theme nodes. Nodes do not directly create content for the literature review; they simply present ideas to help in framing it. Nodes can be created on the basis of the theme of the study, its results, its major findings, or any other important information in the study. After creating nodes, code the information from each of the articles into its respective nodes.

Nvivo allows coding the articles for preparing a literature review. Articles contain a tremendous amount of text and information in the form of graphs and, more importantly, they come as PDFs. Since Nvivo does not allow editing PDF files, apply manual coding for the literature review. There are two strategies for coding articles in Nvivo:

  1. Code the text of a PDF file into a new node.
  2. Code the text of a PDF file into an existing node.

The procedure of manual coding for a literature review is similar to that for interview transcripts.

Add Node to Cases

The Case Nodes of articles are created as per the author name or year of the publication.

For example: if there are multiple articles by the same author with different information, create a case node with that author’s name and attach all of those articles to it. In the figure below, five articles by the same author, Mr. Toppings, have been selected together and grouped into a case node. Preparing case nodes like this makes it effortless to search information by a given author’s opinion when writing the empirical review in the literature chapter.

Nvivo questions for literature review

Apart from coding on themes, evidence, authors, or opinions in different articles, run different queries based on the aim of the study. Nvivo contains different types of search tools that help to find information in and across different articles. For the purpose of the literature review, this article presents a brief overview of the word frequency search, text search, and coding query in Nvivo.

Word frequency

Word frequency in Nvivo allows searching for different words in the articles. In the case of a literature review, use word frequency to search for a word; this helps to find what different authors have stated about that word in their articles. Run word frequency on all types of sources and limit the list to exclude words that are not useful for writing the literature review.

For example, run the word frequency command with a limit of the 100 most frequent words. This helps in assessing whether any of these words provide new information for the literature review (figure below).

[Screenshots: word frequency query and the saved word frequency query results]
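To make the idea concrete outside of NVivo, the short Python sketch below computes the same kind of word-frequency list over plain-text copies of the two assigned articles; the file names and the stop-word list are assumptions for illustration only, not an NVivo feature.

import re
from collections import Counter

STOPWORDS = {"the", "and", "of", "in", "to", "a", "is", "that", "for", "on", "with"}

counts = Counter()
for path in ["brosky_2011.txt", "tooms_2007.txt"]:    # hypothetical plain-text exports of the PDFs
    with open(path, encoding="utf-8") as handle:
        words = re.findall(r"[a-z']+", handle.read().lower())
        counts.update(w for w in words if w not in STOPWORDS and len(w) > 3)

# print the 100 most frequent words, analogous to limiting the NVivo query to 100 words
for word, freq in counts.most_common(100):
    print(f"{word}\t{freq}")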

Text search

Text search is a more elaborate tool than the word frequency search in Nvivo. It allows Nvivo to search for a particular phrase or expression in the articles. Nvivo also gives the opportunity to make a node out of a text search if a particular word, phrase, or expression is found useful for the literature review.

For example: conduct a text search query to find the word “scaffolding” in the articles. In this case Nvivo will provide all the words, phrases, and expressions related to this word across all the articles (Figures 8 & 9). The difference from the word frequency search is that the text search generates the texts, sentences, and phrases related to the queried word.

[Screenshot: text search query]
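Conceptually, a text search behaves like the following minimal sketch (again outside NVivo, with the same hypothetical file names): it returns the sentences in which the queried term or its close variants appear, rather than just counting occurrences.

import re

PATTERN = re.compile(r"\bscaffold\w*", re.IGNORECASE)   # matches scaffold, scaffolds, scaffolding, ...

for path in ["brosky_2011.txt", "tooms_2007.txt"]:      # hypothetical plain-text exports of the PDFs
    with open(path, encoding="utf-8") as handle:
        text = handle.read()
    for sentence in re.split(r"(?<=[.!?])\s+", text):   # naive sentence split
        if PATTERN.search(sentence):
            print(f"{path}: {sentence.strip()}")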

Coding query

Apart from the text search and word frequency search, Nvivo also provides the option of a coding query. A coding query helps in the literature review to identify the intersection between two nodes. As mentioned previously, nodes contain the information from the articles, and it is possible that two nodes contain a similar set of information. A coding query condenses this information into a two-way table that represents the intersection between the selected nodes.

For example, in the figure below, the researcher has searched for the intersection between three nodes (academic, psychological, and social) on the basis of three attributes (quantitative, qualitative, and mixed research). This coding query is performed to determine which of the selected theme nodes have all types of attributes. The coding matrix in the figure below shows that the academic node has all three types of research attributes (quantitative, qualitative, and mixed), whereas the psychological node has only two (quantitative and mixed).

In this way, a coding query helps researchers to generate intersections between two or more theme nodes. This also simplifies the patterns in the qualitative data for writing the literature review.
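Underneath, a coding matrix is simply a cross-tabulation of theme nodes against attributes. The sketch below reproduces the example above with invented coded segments, using pandas only to illustrate the idea; it is not how NVivo stores or computes its matrices.

import pandas as pd

# Invented coded segments: each row is one coded reference tagged with a node and an attribute.
coded = pd.DataFrame({
    "node": ["academic", "academic", "academic", "psychological", "psychological", "social"],
    "attribute": ["quantitative", "qualitative", "mixed", "quantitative", "mixed", "qualitative"],
})

matrix = pd.crosstab(coded["node"], coded["attribute"])
print(matrix)   # rows = theme nodes, columns = research attributes, cells = counts of coded segments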

+++++++++++++++++++

Please do not hesitate to contact me with questions or suggestions before, during, or after our workshop, and with ANY questions and suggestions you may have about your Chapter 2, particularly about your literature review:

Plamen Miltenoff, Ph.D., MLIS

Professor | 320-308-3072 | pmiltenoff@stcloudstate.edu | http://web.stcloudstate.edu/pmiltenoff/faculty/ | schedule a meeting: https://doodle.com/digitalliteracy | Zoom, Google Hangouts, Skype, FaceTalk, Whatsapp, WeChat, Facebook Messenger are only some of the platforms I can desktopshare with you; if you have your preferable platform, I can meet you also at your preference.

++++++++++++++
more on nVIvo in this IMS blog
http://blog.stcloudstate.edu/ims?s=nvivo

more on Zotero in this IMS blog
http://blog.stcloudstate.edu/ims?s=zotero

iLRN 2020

YouTube Live stream: https://www.youtube.com/watch?v=DSXLJGhI2D8&feature=youtu.be
and the Discord directions: https://docs.google.com/document/d/1GgI4dfq-iD85yJiyoyPApB33tIkRJRns1cJ8OpHAYno/edit

Modest3D Guided Virtual Adventure – iLRN Conference 2020 – Session 1: currently, live session: https://youtu.be/GjxTPOFSGEM

https://mediaspace.minnstate.edu/media/Modest+3D/1_28ejh60g

CALL FOR PROPOSALS: GUIDED VIRTUAL ADVENTURE TOURS
at iLRN 2020: 6th International Conference of the Immersive Learning Research Network
Organized in conjunction with Educators in VR
Technically co-sponsored by the IEEE Education Society
June 21-25, 2020, Online
Conference theme: “Vision 20/20: Hindsight, Insight, and Foresight in XR and Immersive Learning”
Conference website: http://immersivelrn.org/ilrn2020
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
At our physical iLRN conferences, the first day of the conference (Sunday) is typically devoted to one or more guided social tours of local attractions in which attendees have the opportunity to socialize and get to know one another while immersing themselves in the sights and sounds of the host city and/or region. As this year’s conference will take place entirely online, we are instead offering the opportunity for attendees to sign up for small-group “Guided Virtual Adventure” tours of 50 minutes in duration to various social and collaborative XR/immersive environments and platforms.
Proposals are being sought for prospective Guided Virtual Adventure tour offerings on Sunday, June 21, 2020. Tour destinations may be:
– a third-party XR/immersive platform with which you are familiar (e.g., Altspace, Mozilla Hubs, Minecraft, World of Warcraft, Somnium Space, OrbusVR, Second Life);
– a specific virtual environment that you, your institution/organization, or someone else has developed within a third-party platform;
– a platform that you or your institution/organization has developed and/or specific environments within that platform.
There are no fees involved in offering a Guided Virtual Adventure tour; however, preference will be given to proposals that involve environments/platforms that are freely and openly accessible, and that are associated with nonprofit organizations and educational institutions. Where possible, it is strongly recommended that multiple offerings of the tour are made available throughout the day so as to cater for different time zones in which the 8,000+ iLRN 2020 event attendees will be based.
Companies wishing to offer Guided Virtual Adventure tours involving their commercial products and services may submit proposals for consideration, but the iLRN 2020 Organizing Committee reserves the right to, at its discretion, place limits on the number of tours of platforms/environments of a certain type or that address a particular target audience/application vertical. In doing so, they will prioritize companies that have purchased a sponsorship or exhibition package.
*** IMPORTANT: The Guided Virtual Adventures are intended to be a social activity, and as such, platforms and environments to be toured must support interaction among multiple users. For other types of platform or environment, please consider offering a Workshop (https://immersivelrn.org/ilrn2020/workshops/) instead, and/or participating in the Immersive Learning Project Showcase & Competition (https://immersivelrn.org/ilrn2020/immersive-learning-project-showcase/). ***
### Submitting a Proposal ###
Please use this form to propose a Guided Virtual Adventure: https://forms.gle/P4JTAkb29Lb9L18JA
### Contact ###
Inquiries regarding the Guided Virtual Adventures may be directed to conference@immersivelrn.org.
### Important Dates ###
– Guided Virtual Adventure proposal submission deadline: May 18, 2020
– Notification of proposal review outcomes: May 21, 2020
– Presenter registration deadline: May 25, 2020
– Deadline for providing final participant instructions: June 1, 2020
– Guided Virtual Adventure Day: June 21, 2020
Other upcoming iLRN 2020 deadlines (see conference website for details):
–  Immersive Learning Project Showcase & Competition – expressions of interest to participate due May 14, 2020 (deadline extended, no further extensions will be announced)
– Practitioner Stream oral and poster presentations – 1-2 page proposals, not for publication in proceedings, due May 18, 2020 (will not be extended)
– Workshops, Panel Sessions, and Special Sessions –  2-3 page proposals for publication in proceedings as extended-abstract descriptions of the sessions, due May 18, 2020 (will not be extended)
– Free registration deadline for non-presenter educators and students – May 23, 2020
(Sent to blend-online@listserv.educause.edu)

+++++++++++++++++
more on virtual tours in this IMS blog
http://blog.stcloudstate.edu/ims?s=virtual+tour

iLRN 2020

iLRN 2020: 6th International Conference of the Immersive Learning Research Network
Organized in conjunction with Educators in VR
Technically co-sponsored by the IEEE Education Society
June 21-25, 2020, Now Fully Online and in Virtual Reality!!!
Conference theme: “Vision 20/20: Hindsight, Insight, and Foresight in XR and Immersive Learning”
Conference website: https://immersivelrn.org/ilrn2020/
*** FREE registration for non-presenter EDU attendees until April 19th ***
##### TOPIC AREAS #####
XR and immersive learning in/for:
– Serious Games
– Medical & Healthcare
– Workforce & Industry
– Culture & Language
– K-12
– Museums & Libraries
– Special Education
– Geosciences
– Data Visualization
##### SESSION TYPES/ACTIVITIES #####
– Keynotes and plenaries
– Academic stream presentations (with peer-reviewed proceedings for submission to IEEE Xplore)
– Practitioner stream presentations (no paper required)
– Poster and exhibition sessions
– Workshops
– Panel sessions
– Special sessions
– Immersive Learning Adventures
– Immersive Learning Project Showcase & Competition
– Game Night
– Virtual Awards Banquet & Masquerade Ball
##### INTERESTED IN ATTENDING? #####
For a limited time, free registration is being offered to faculty, students, and staff of educational institutions (including K-12 schools/districts, universities, colleges, museums, and libraries) who wish to attend but will NOT be presenting at the conference or publishing in the proceedings. To take advantage of this offer, you must register by April 19, 2020 using an email address associated with your educational institution:
https://immersivelrn.org/ilrn2020/ilrn-2020-fees-registration/
##### INTERESTED IN PRESENTING? #####
Submissions of Practitioner presentation and poster proposals; proposals for workshops, panel sessions, and special sessions; as well as Academic Work-in-progress papers (for delivery as posters or as part of the doctoral colloquium) are being accepted until the late-round submission deadline of April 19, 2020. See the Call for Papers and Proposals at https://immersivelrn.org/ilrn2020/call-for-papers-proposals/.
Proposals for the Immersive Learning Project Showcase & Competition may be submitted at https://immersivelrn.org/ilrn2020/immersive-learning-project-showcase/ until April 30, 2020.
No further Academic Full and Short paper submissions are being considered at this stage.
##### INTERESTED IN VOLUNTEERING OR REVIEWING? #####
A range of volunteer opportunities are available, including conference internships for undergraduate and graduate students. Some of the roles currently available include session chair/facilitator, moderator, audio-visual/technical support, virtual event greeter/usher, virtual event photographer, virtual event videographer/livestreamer, 2D artist/illustrator, 3D artist/modeler, graphic designer, and general conference intern. For details and to apply for one or more of these roles, please visit https://immersivelrn.org/ilrn2020/volunteer-opportunities
Expressions of interest are also being solicited from scholars and practitioners wishing to join the iLRN 2020 Program Committee to peer review papers and proposals received in the late submission round (closing April 19, 2020). The late-round submissions will be no longer than 3 pages in length, and each Program Committee member will be asked to review no more than two submissions.
##### INTERESTED IN SPONSORING OR EXHIBITING? #####
A number of sponsorship and exhibition opportunities are available for organizations to:
– Meet and interact with key educational stakeholders
– Showcase their products and services
– Connect and collaborate with top researchers / scientists
– Build and strengthen customer / client relationships.
Packages range from US$500 for a basic virtual exhibit booth to US$15,000 for an exclusive Gold Sponsorship.
For information about the packages available, visit http://immersivelrn.org/ilrn2020/sponsorships-and-exhibitions
##### INTERESTED IN JOINING ILRN? (FREE) #####
Basic individual membership of the Immersive Learning Research Network is currently free; you can sign up at https://immersivelrn.org/get-involved/membership/.
Fee-based premium individual memberships and organizational memberships will be introduced in the near future.
##### CONTACT #####
Email: conference@immersivelrn.org
Web: http://immersivelrn.org/ilrn2020

++++++++++++
more about Educators in VR in this IMS blog
http://blog.stcloudstate.edu/ims?s=educators+in+vr

Accessible Teaching

https://www.mapping-access.com/blog-1/2020/3/10/accessible-teaching-in-the-time-of-covid-19

ONLINE TEACHING AND ACCESSIBILITY

IF YOUR COURSE IS LECTURE-BASED… (see the link above for more details)

IF YOUR COURSE IS DISCUSSION-BASED…

(see the link above for more details)

IF YOUR COURSE ASSIGNS TESTS, QUIZZES, AND PAPERS…

(see the link above for more details)

IF YOUR COURSE ASSIGNS PROJECTS…

(see the link above for more details)

IF YOUR COURSE ASSIGNS PRESENTATIONS…

(see the link above for more details)

AS YOU PREPARE FOR ONLINE TEACHING:

(see the link above for more details)

USE A QUESTIONNAIRE TO CHECK IN WITH STUDENTS

SAMPLE QUESTIONS:

CHECKING IN ABOUT NEW ACCESSIBILITY NEEDS

CHECK IN ABOUT INTERNET ACCESS AND TECH AVAILABILITY

CHECK IN ABOUT WELL-BEING AND BASIC NEEDS

CONDUCT A SYLLABUS HACK-A-THON OR DESIGN CHARRETTE

BUILD SOLIDARITY, MUTUAL AID, AND FEELINGS OF CONNECTION

SUPPORT DISABILITY JUSTICE

+++++++++++++++++
more on online learning in this IMS blog
http://blog.stcloudstate.edu/ims?s=online+learning

IM 690 VR and AR lab part 2

IM 690 Virtual Reality and Augmented Reality. short link: http://bit.ly/IM690lab

IM 690 lab plan for March 3, MC 205:  Oculus Go and Quest

Readings:

  1. TAM: Technology Acceptance Model
    Read Venkatesh and Davis and sum up the importance of their model for instructional designers who work with VR technologies and create materials for VR users.
  2. UTAUT: using the theory to learn well with VR and to design a good acceptance model for end users: http://blog.stcloudstate.edu/ims/2020/02/20/utaut/
    Watch both parts of Victoria Bolotina's presentation at the Global VR conference. How does she apply UTAUT in her research?
    Read Bracq et al. (2019); how do they apply UTAUT to their VR nursing training? (A minimal scoring sketch follows the readings list.)
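
For readers who want to try a UTAUT-style acceptance check on their own VR pilots, here is a minimal sketch of scoring a Likert survey by construct. The four construct names come from Venkatesh et al.'s UTAUT; the item responses, the 5-point scale, and the sample ratings are hypothetical and only illustrate the mechanics, not a validated instrument.

```typescript
// Minimal sketch: average UTAUT construct scores from a Likert-style survey.
// The construct names follow UTAUT (Venkatesh et al., 2003); the responses,
// 5-point scale, and sample data below are hypothetical.

type Construct =
  | "performanceExpectancy"
  | "effortExpectancy"
  | "socialInfluence"
  | "facilitatingConditions";

// One learner's answers: several 1-5 Likert items per construct.
type SurveyResponse = Record<Construct, number[]>;

function constructMeans(response: SurveyResponse): Record<Construct, number> {
  const means = {} as Record<Construct, number>;
  (Object.keys(response) as Construct[]).forEach((construct) => {
    const items = response[construct];
    const sum = items.reduce((acc, item) => acc + item, 0);
    means[construct] = items.length > 0 ? sum / items.length : NaN;
  });
  return means;
}

// Hypothetical example: one student's ratings after a VR lab session.
const example: SurveyResponse = {
  performanceExpectancy: [4, 5, 4],
  effortExpectancy: [3, 4, 3],
  socialInfluence: [2, 3],
  facilitatingConditions: [4, 4, 5],
};

console.log(constructMeans(example));
// ≈ { performanceExpectancy: 4.33, effortExpectancy: 3.33,
//     socialInfluence: 2.5, facilitatingConditions: 4.33 }
```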

Lab work (continue):

revision from last week:
How to shoot and edit 360 videos: Ben Claremont
https://www.youtube.com/channel/UCAjSHLRJcDfhDSu7WRpOu-w
and
https://www.youtube.com/channel/UCUFJyy31hGam1uPZMqcjL_A

  1. Oculus Quest as VR advanced level
    1. Using the controllers
    2. Confirm the Guardian boundary
    3. Using the menu

Oculus Quest main

    1. Watching 360 video in YouTube
      1. Switch between 2D and 360 VR
        1. Play a game

Climbing


Racketball

Instagram post by Beat Saber (@beatsaber): “Hell yeah, @naysy is the ultimate Beat Saber queen! 💃 #VR #VirtualReality #BeatSaber #PanicAtTheDisco”

Practice interactivity (space station)

    1. Broadcast your experience (Facebook Live)
  1. Additional (advanced) features of Oculus Quest
    1. https://engagevr.io/
    2. https://sidequestvr.com/#/setup-howto

Interactivity: communication and working collaboratively with Altspace VR

https://account.altvr.com/

setting up your avatar

joining a space and collaborating and communicating with other users

  1. Assignment: Group work
    1. Find one F2F and one online peer to form a group.
      Based on the questions/directions posed before you started watching the videos:
      – Does this particular technology fit in the instructional design (ID) frames and theories covered?
      – How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
      – Which models and ideas from the videos you will see seem possible for you to replicate?
      Exchange thoughts with your peers and make a plan to create a similar educational product.
    2. Post your writing in the following D2L Discussions thread
  2. Augmented Reality with HoloLens (watch videos at the computer station)
    1. Start and turn off; go through the menu

      https://youtu.be/VX3O650comM
    2. Learn gestures and voice commands
  1. Augmented Reality with Merge Cube
    1. 3D apps and software packages and their compatibility with AR
  2. Augmented Reality with a phone
  3. Samsung Gear 360 video camera
    1. If all other goggles and devices are busy, please feel welcome to use the camera to practice and/or work toward your final project
    2. SD card and data transfer – does your phone have an SD card compatible with the camera?
    3. Upload 360 images and videos to your YouTube and FB accounts
  4. Issues with XR
    1. Ethics
      1. empathy
        Peter Rubin “Future Presence”
        http://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/

+++++++++++++

Enhance your XR instructional design with other tools: http://blog.stcloudstate.edu/ims/2020/02/07/crs-loop/

https://aframe.io/

https://framevr.io/

https://learn.framevr.io/ (free learning resources for Frame)

https://hubs.mozilla.com/#/

https://sketchfab.com/ WebXR technology (see the detection sketch below)

https://mixedreality.mozilla.org/hello-webxr/

https://studio.gometa.io/landing
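
Several of the tools above (A-Frame, Hello WebXR, Frame, Hubs, Sketchfab's VR viewer) run in the browser on top of the WebXR Device API. Below is a minimal TypeScript sketch, not tied to any of these products, that checks for WebXR support before offering an "Enter VR" option; the button id and fallback wording are hypothetical, and navigator.xr is typed loosely so the snippet stands alone without the official WebXR type definitions.

```typescript
// Minimal sketch: detect WebXR support before offering an "Enter VR" button.
// The "enter-vr" button id and the fallback messages are hypothetical.

const xr = (navigator as any).xr; // typed loosely to avoid extra type packages

async function setupEnterVrButton(): Promise<void> {
  const button = document.getElementById("enter-vr") as HTMLButtonElement | null;
  if (!button) return;

  if (!xr) {
    button.disabled = true;
    button.textContent = "WebXR not available - open the 2D version";
    return;
  }

  // Resolves to true when an immersive-vr session can start on this device.
  const supported: boolean = await xr.isSessionSupported("immersive-vr");
  if (!supported) {
    button.disabled = true;
    button.textContent = "No VR headset detected - open the 2D version";
    return;
  }

  button.addEventListener("click", async () => {
    // requestSession() must be called from a user gesture such as this click.
    const session = await xr.requestSession("immersive-vr");
    console.log("Immersive VR session started", session);
    // A real lesson would hand this session to a renderer (A-Frame, three.js,
    // or the engine behind Frame/Hubs) to draw the scene.
  });
}

setupEnterVrButton();
```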

+++++++++++
Plamen Miltenoff, Ph.D., MLIS
Professor
320-308-3072
pmiltenoff@stcloudstate.edu
http://web.stcloudstate.edu/pmiltenoff/faculty/
schedule a meeting: https://doodle.com/digitalliteracy
find my office: https://youtu.be/QAng6b_FJqs

Educators in VR

Info on all presentations: https://account.altvr.com/channels/1182698623012438188

Charlie Fink: Setting the Table for the Next Decade in XR

Translating Training Requirements into Immersive Experience

Virtual Reality Technologies for Learning Designers

Virtual Reality Technologies for Learning Designers Margherita Berti

$$$$$$$$$$$$$$$$$$$$$$

Technology Acceptance and Learning Process Victoria Bolotina part 1

Technology Acceptance and Learning Process Victoria Bolotina part 2

Assessment of Learning Activities in VR Evelien Ydo part 2

++++++++++++++++++++++++++++++++++++++++

VR: So Much More Than a Field Trip. Shannon Putman, Graduate Assistant/PhD Student, University of Louisville (SPED, special education). https://account.altvr.com/events/1406092840622096803

++++++++++++++++++++++++++++++

VR and Health Professionals Rob Theriault

+++++++++++++++++++++++

Transform Your History Lessons with AR and VR Michael Fricano II

++++++++++++++++++++++++++++

Transform Your History Lessons with AR and VR. Michael Fricano II, Technology Integration Specialist. https://www.arvreduhub.com/transform-history

Qlone App for 3D scanning

++++++++++++++++++++++++++++++++++++++

2020 Educators in VR International Summit

The 2020 Educators in VR International Summit is February 17-22. It features over 170 speakers in 150+ events across multiple social and educational platforms including AltspaceVR, ENGAGE, rumii, Mozilla Hubs, and Somnium Space.

The event requires no registration, and is virtual only, free, and open to the public. Platform access is required, so please install one of the above platforms to attend the International Summit. You may attend in 2D on a desktop or laptop computer with headphones and a microphone (a USB gaming headset is recommended), or with a VR device such as the Oculus Go, Quest, or Rift, the HTC Vive, or other mobile and tethered headsets. Please note the specifications and requirements of each platform.

The majority of our events are on AltspaceVR. AltspaceVR is available for Samsung Gear VR, on the Steam Store for HTC Vive and Windows Mixed Reality, and on the Oculus Store for Rift, Go, and Quest users. Download and install the 2D version for use on your Windows desktop computer.

Charlie Fink, author, columnist for Forbes magazine, and Adjunct Faculty member of Chapman University, will be presenting “Setting the Table for the Next Decade in XR,” discussing the future of this innovative and immersive technology, at the 2020 Educators in VR International Summit. He will be speaking in AltspaceVR on Tuesday, February 18 at 1:00 PM EST /

International Summit

Setting the Table for the Next Decade in XR 1PM, Tues, Feb 18 https://account.altvr.com/events/1406089727517393133

Finding a New Literacy for a New Reality 5PM, Tues, Feb 18

https://account.altvr.com/events/1406093036194103494 schedule for new literacy

Finding a New Literacy for a New Reality

Dr. Sarah Jones, Deputy Dean, De Montfort University

This workshop with Dr. Sarah Jones will focus on developing a relevant and new literacy for virtual reality, including the core competencies and skills needed to develop and understand how to become an engaged user of the technology in a meaningful way. The workshop will develop into research for a forthcoming book on Uncovering a Literacy for VR due to be published in 2020.

Sarah is listed as one of the top 15 global influencers within virtual reality. After nearly a decade in television news, Sarah began working in universities focusing on future media, future technology and future education. Sarah holds a PhD in Immersive Storytelling and has published extensively on virtual and augmented reality, whilst continuing to make and create immersive experiences. She has advised the UK Government on Immersive Technologies and delivers keynotes and speaks at conferences across the world on imagining future technology. Sarah is committed to diversifying the media and technology industries and regularly champions initiatives to support this agenda.

Inter-cognitive and Intra-cognitive Communication in Virtual Reality

Inter-cognitive and Intra-cognitive Communication in Virtual Reality

Michael Vallance, Professor, Future University Hakodate

Currently there are limited ways to connect 3D VR environments to physical objects in the real world while simultaneously supporting communication and collaboration between remote users. Within the context of a solar power plant, the performance metrics of the site are invaluable for environmental engineers who are remotely located. Often two or more remotely located engineers need to communicate and collaborate on solving a problem. If a solar panel component is damaged, the repair often needs to be undertaken on-site, thereby incurring additional expenses. This communication is categorized as inter-cognitive and intra-cognitive communication: inter-cognitive communication, where information transfer occurs between two cognitive entities with different cognitive capabilities (e.g., between a human and an artificially cognitive system); and intra-cognitive communication, where information transfer occurs between two cognitive entities with equivalent cognitive capabilities (e.g., between two humans) [Baranyi and Csapo, 2010]. Currently, non-VR solutions offer a comprehensive analysis of solar plant data, and a regular PC with a monitor still has advantages over 3D VR. For example, sensors can be monitored using dedicated software such as EPEVER or via a web browser, as exemplified by the comprehensive service provided by Elseta. But when multiple users are able to collaborate remotely within a three-dimensional virtual simulation, the opportunities for communication, training, and academic education will be profound.
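
As a purely illustrative sketch of the data flow Vallance describes, the TypeScript snippet below polls a hypothetical REST endpoint for solar panel metrics and relays each reading over a WebSocket to remote collaborators, where a VR (or 2D) dashboard could render it. The endpoint URL, payload shape, and polling interval are assumptions for the sake of the example and are not part of the EPEVER or Elseta products.

```typescript
// Illustrative only: relay solar panel readings to remote collaborators.
// The metrics endpoint, payload shape, and 5-second interval are hypothetical.
// Requires Node 18+ (global fetch) and the "ws" package (npm install ws).
import { WebSocket, WebSocketServer } from "ws";

interface PanelReading {
  panelId: string;
  outputWatts: number;
  temperatureC: number;
  timestamp: string;
}

const wss = new WebSocketServer({ port: 8080 });

async function pollAndBroadcast(): Promise<void> {
  try {
    // Hypothetical REST endpoint exposing the plant's current metrics as JSON.
    const response = await fetch("http://plant.example.com/api/panels/current");
    const readings = (await response.json()) as PanelReading[];

    // Send every reading to all connected clients (e.g. each remote
    // engineer's dashboard), so they discuss the same live data.
    const message = JSON.stringify(readings);
    for (const client of wss.clients) {
      if (client.readyState === WebSocket.OPEN) {
        client.send(message);
      }
    }
  } catch (err) {
    console.error("Polling failed, will retry on the next tick:", err);
  }
}

setInterval(pollAndBroadcast, 5000);
console.log("Relaying panel readings on ws://localhost:8080");
```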

Michael Vallance Ed.D. is a researcher in the Department of Media Architecture, Future University Hakodate, Japan. He has been involved in educational technology design, implementation, research and consultancy for over twenty years, working closely with Higher Education Institutes, schools and media companies in UK, Singapore, Malaysia and Japan. His 3D virtual world design and tele-robotics research has been recognized and funded by the UK Prime Minister’s Initiative (PMI2) and the Japan Advanced Institute of Science and Technology (JAIST). He has been awarded by the United States Army for his research in collaborating the programming of robots in a 3D Virtual World.

Create Strategic Snapchat & Instagram AR Campaigns

Create Strategic Snapchat & Instagram AR Campaigns

Dominique Wu, CEO/Founder, Hummingbirdsday

Augmented Reality Lenses are popular among young people thanks to Snapchat's invention. Businesses are losing money by not fully using social media to target young people (14-25). In this presentation, Dominique Wu will show how businesses can generate more leads through Spark AR (Facebook AR/Instagram AR) and Snapchat AR Lenses, and how to create strategic Snapchat & Instagram AR campaigns.

Dominique Wu is an XR social media strategist and expert in UX/UI design. She has her own YouTube and Apple Podcast show called “XReality: Digital Transformation,” covering the technology and techniques of incorporating XR and AR into social media, marketing, and enterprise solutions.

Mixed Reality in Classrooms Near You

Mixed Reality in Classrooms Near You

Mark Christian, EVP, Strategy and Corporate Development, GIGXR

Mixed Reality devices like the HoloLens are transforming education now. Mark Christian will discuss how the technology is not about edge use cases or proofs of concept, but about real, usable products already at universities, transforming the way we teach and learn. Christian will talk about the products of GIGXR, the story of how they were developed, and what the research says about their efficacy. It is time to move to adoption of XR technology in education. Learn how one team has made this a reality.

As CEO of forward-thinking virtual reality and software companies, Mark Christian employs asymmetric approaches to rapid, global market adoption, hiring, diversity and revenue. He prides himself on unconventional approaches to building technology companies.

Designing Educational Content in VR

Designing Educational Content in VR

Avinash Gyawali, VR Developer, Weaver Studio

Virtual Reality is an effective medium for education only if it is done right. Whether VR is considered a gimmick depends on how the software applications are designed and developed, not on hardware limitations. Avinash Gyawali will give insight into VR development of educational content specifically designed for lower secondary school students, and will also provide insights into game development in the Unity3D game engine.

A game and VR developer with over three years of experience in game development, and developer of Zombie Shooter, winner of various national awards in the gaming and entertainment category, Avinash Gyawali is the developer of EDVR, an immersive voice-controlled VR experience specially designed for children aged 10-18.

8:00 AM PST, Research: Virtual Reality Technologies for Learning Designers, Margherita Berti, ASVR

Virtual Reality Technologies for Learning Designers

Margherita Berti

Virtual Reality (VR) is a computer-generated experience that simulates presence in real or imagined environments (Kerrebrock, Brengman, & Willems, 2017). VR promotes contextualized learning, authentic experiences, critical thinking, and problem-solving opportunities. Despite the great potential and popularity of this technology, the latest two installments of the Educause Horizon Report (2018, 2019) have argued that VR remains “elusive” in terms of mainstream adoption. The reasons are varied, including the expense and the lack of empirical evidence for its effectiveness in education. More importantly, examples of successful VR implementations for instructors who lack technical skills are still scarce. Margherita Berti will discuss a range of easy-to-use educational VR tools, examples of VR-based activities, and the learning theories and instructional design principles utilized for their development.

Margherita Berti is a doctoral candidate in Second Language Acquisition and Teaching (SLAT) and Educational Technology at the University of Arizona. Her research specialization resides at the intersection of virtual reality, the teaching of culture, and curriculum and content development for foreign language education.

Wed 11:00 AM PST, Special Event: Gamifying the Biblioverse with Metaverse, Amanda Fox (VR Design / Biblioverse / Training & Embodiment), ASVR

Gamifying the Biblioverse with Metaverse

Amanda Fox, Creative Director of STEAMPunks/MetaInk Publishing

There is a barrier between an author and the readers of his/her books. The author's journey ends, and the reader's begins. But what if, as an author/trainer, you could use gamification and augmented reality (AR) to interact with and coach your readers as part of their learning journey? Attend this session with Amanda Fox to learn how the book Teachingland leverages augmented reality tools such as Metaverse to connect with readers beyond the text.

Amanda Fox, Creative Director of STEAMPunksEdu, is the author of Teachingland: A Teacher's Survival Guide to the Classroom Apocalypse and Zom-Be A Design Thinker. Check her out on the Virtual Reality Podcast, or connect with her on Twitter @AmandaFoxSTEM.

Wed 10:00 AM PST, Research: Didactic Activity of the Use of VR and Virtual Worlds to Teach Design Fundamentals, Christian Jonathan Angel Rueda (VR Design / Biblioverse / Training & Embodiment), ASVR

Didactic Activity of the Use of VR and Virtual Worlds to Teach Design Fundamentals

Christian Jonathan Angel Rueda, research professor, Autonomous University of Queretaro (Universidad Autónoma de Querétaro)

Christian Jonathan Angel Rueda specializes in the didactic use of virtual reality and virtual worlds to learn the fundamentals of design. He shares the development of a course that includes recreating works in a three-dimensional environment using the fundamentals learned in class, a demonstration of all the works developed throughout the semester that applies the design foundations creatively, and a final project scenario connecting the scenes of the students who showed their work throughout the semester.

Christian Jonathan Angel Rueda is a research professor at the Autonomous University of Queretaro in Mexico. With a PhD in educational technology, Christian has published several papers on the intersection of education, pedagogy, and three-dimensional immersive digital environments. He is also an edtech, virtual reality, and social media consultant at Eco Onis.

Thu 11:00 AM PST, vCoaching: Closing the Gap Between eLearning and XR, Richard Van Tilborg (XR eLearning / Laughter Medicine), ASVR

Closing the Gap Between eLearning and XR

Richard Van Tilborg, founder, CoVince

Richard Van Tilborg discusses how we can bridge the gap between eLearning and XR by combining brain insights with new technologies. He presents training and education cases realised with the CoVince platform: journeys that start on your mobile and continue in VR, the possibilities to earn from your creations, and a central distribution place for learning and data.

Richard Van Tilborg works with the CoVince platform, a VR platform offering training and educational programs with central distribution of learning content and data. He is an author and speaker focusing on computers and education, and on virtual reality-based tasks for delivering feedback.

 

Thu 12:00 PM PST, Research: Assessment of Learning Activities in VR, Evelien Ydo (Technology Acceptance / Learning Assessment / Vaping Prevention), ASVR
Thu 6:00 PM PST, Down to Basics: Copyright and Plagiarism Protections in VR, Jonathan Bailey, ASVR

 

Thu 8:00 PM PST, Diversity: Cyberbullying in VR, John Williams, Brennan Hatton, and Lorelle VanFossen, ASVR

IM 690 VR and AR lab

IM 690 Virtual Reality and Augmented Reality. short link: http://bit.ly/IM690lab

IM 690 lab plan for Feb. 18, MC 205:  Experience VR and AR

What is an “avatar” and why do we need to know how it works?

How does the book (and the movie) “Ready Player One” project the education of the future?

Peter Rubin's “Future Presence” pictures XR beyond education. How would such changes in society and in our behavior influence education?

Readings:

Each group selects one article from this selection: http://blog.stcloudstate.edu/ims/2020/02/11/immersive-reality-and-instructional-design/
to discuss an instructional designer's approach to XR

Announcements:
http://blog.stcloudstate.edu/ims/2020/02/07/educators-in-vr/

http://blog.stcloudstate.edu/ims/2020/01/30/realities360-conference/

Translating Training Requirements into Immersive Experience

Virtual Reality Technologies for Learning Designers


Inter-cognitive and Intra-cognitive communication in VR: https://slides.com/michaelvallance/deck-25c189#/

https://www.youtube.com/channel/UCGHRSovY-KvlbJHkYnIC-aA

People with dementia

https://docs.google.com/presentation/d/e/2PACX-1vSVNHSXWlcLzWZXObifZfhrL8SEeYA59IBdatR1kI7Q-Hry20AHtvLVTWQyH3XxBQ/pub?start=false&loop=false&delayms=60000&slide=id.p1

Free resources:
http://blog.stcloudstate.edu/ims?s=free+audio, free sound, free multimedia

Lab work:

  1. Video 360 as VR entry level
    1. During lab work on Jan 28, we experienced Video 360 cardboard movies;
      let’s take 5-10 min and check out the following videos (select and watch at least three of them)

      1. F2F students, please use Google Cardboard
      2. Online students, please view on your computer or mobile devices if you don’t have goggles at your house (you can now purchase goggles for $5-7 from second-hand stores such as Goodwill)
      3. Both F2F and online students: here are directions on how to easily open the movies on your mobile devices:
        1. Copy the URL and email it to yourself.
          Open the email on your phone and click on the link.
          If you have goggles, click on the appropriate icon in the lower right corner and insert the phone in the goggles.
        2. Open your D2L course on your phone (you can use the mobile app).
          Go to the D2L Content Module with these directions and click on the link.
          After the link opens, insert the phone in the goggles to watch the video.
      4. Videos:
        While watching the videos, consider the following objectives:
        – Does this particular technology fit in the instructional design (ID) frames and theories covered, e.g., PBL, CBL, Activity Theory, the ADDIE Model, TIM, etc. (http://blog.stcloudstate.edu/ims/2020/01/29/im-690-id-theory-and-practice/ )? Can you connect the current state, as well as the potential, of this technology with any of these frameworks and theories, e.g., how would Google Tour Creator or any of these videos fit in the Analysis – Design – Development – Implementation – Evaluation process? Or, how do you envision your Google Tour Creator project or any of these videos fitting in the Entry – Adoption – Adaptation – Infusion – Transformation process?

– How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
– Which models and ideas from the videos you will see seem possible for you to replicate?

Assignment: Use Google Cardboard to watch at least three of the following options
YouTube:
Elephants (think how it can be used for education)
https://youtu.be/2bpICIClAIg
Sharks (think how it can be used for education)
https://youtu.be/aQd41nbQM-U
Solar system
https://youtu.be/0ytyMKa8aps
Dementia
https://youtu.be/R-Rcbj_qR4g
Facebook
https://www.facebook.com/EgyptVR/photos/a.1185857428100641/1185856994767351/

From Peter Rubin’s Future Presence: here is a link http://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/ if you want to learn more
Empathy, Chris Milk, https://youtu.be/iXHil1TPxvA
Clouds Over Sidra, https://youtu.be/mUosdCQsMkM

  1. Assignment: Group work
    1. Find one F2F and one online peer to form a group.
      Based on the questions/directions posed before you started watching the videos:
      – Does this particular technology fit in the instructional design (ID) frames and theories covered, e.g., PBL, CBL, Activity Theory, the ADDIE Model, TIM, etc. (http://blog.stcloudstate.edu/ims/2020/01/29/im-690-id-theory-and-practice/ )? Can you connect the current state, as well as the potential, of this technology with any of these frameworks and theories, e.g., how would Google Tour Creator or any of these videos fit in the Analysis – Design – Development – Implementation – Evaluation process? Or, how do you envision your Google Tour Creator project or any of these videos fitting in the Entry – Adoption – Adaptation – Infusion – Transformation process?
      – How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
      – Which models and ideas from the videos you will see seem possible for you to replicate?
      Exchange thoughts with your peers and make a plan to create a similar educational product.
    2. Post your writing in the following D2L Discussions thread: https://stcloudstate.learn.minnstate.edu/d2l/le/4819732/discussions/threads/43483637/View
  1. Lenovo Daydream as VR advanced level
    1. Recording in Daydream
      https://skarredghost.com/2018/08/17/how-to-shoot-cool-screenshots-videos-lenovo-mirage-solo-and-save-them-on-pc/
    2. Using the controller
      https://support.google.com/daydream/answer/7184597?hl=en
    3. Using the menu
    4. Watching 360 video in YouTube
      1. Using keyboard to search
      2. Using voice command to search
    5. Using Labster. https://www.labster.com/
      1. Record how far in the lab you managed to proceed
    6. Playing the games
      1. Evaluate the ability of the game you played to be incorporated in the educational process

Assignment: In 10-15 min (mind your peers, since we have only one headset), do your best to evaluate one educational app (e.g., Labster) and one leisure app (games).
Use the same questions to evaluate the Lenovo Daydream:
– Does this particular technology fit in the instructional design (ID) frames and theories covered, e.g., PBL, CBL, Activity Theory, the ADDIE Model, TIM, etc. (http://blog.stcloudstate.edu/ims/2020/01/29/im-690-id-theory-and-practice/ )? Can you connect the current state, as well as the potential, of this technology with any of these frameworks and theories, e.g., how would Google Tour Creator or any of these videos fit in the Analysis – Design – Development – Implementation – Evaluation process? Or, how do you envision your Google Tour Creator project or any of these videos fitting in the Entry – Adoption – Adaptation – Infusion – Transformation process?
– How does this particular technology fit in the instructional design (ID) frames and theories covered so far?
– Which models and ideas from the videos you will see seem possible for you to replicate?

+++++++++++
Plamen Miltenoff, Ph.D., MLIS
Professor
320-308-3072
pmiltenoff@stcloudstate.edu
http://web.stcloudstate.edu/pmiltenoff/faculty/
schedule a meeting: https://doodle.com/digitalliteracy
find my office: https://youtu.be/QAng6b_FJqs
