iLRN 2021

CALL FOR PAPERS AND PROPOSALS
iLRN 2021: 7th International Conference of the Immersive Learning Research Network
May 17 to June 10, 2021, on iLRN Virtual Campus, powered by Virbela
… and across the Metaverse!
Technically co-sponsored by the IEEE Education Society,
with proceedings to be submitted for inclusion in IEEE Xplore®
Conference theme: “TRANSCEND: Accelerating Learner Engagement in XR across Time, Place, and Imagination”
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Conference website: https://immersivelrn.org/ilrn2021/
PDF version of this CFP available at: https://bit.ly/3qnFYRu
The 7th International Conference of the Immersive Learning Research Network (iLRN 2021) will be an innovative and interactive virtual gathering for a strengthening global network of researchers and practitioners collaborating to develop the scientific, technical, and applied potential of immersive learning. It is the premier scholarly event focusing on advances in the use of virtual reality (VR), augmented reality (AR), mixed reality (MR), and other extended reality (XR) technologies to support learners across the full span of learning–from K-12 through higher education to work-based, informal, and lifelong learning contexts.
Following the success of iLRN 2020, our first fully online and in-VR conference, this year’s conference will once again be based on the iLRN Virtual Campus, powered by VirBELA, but with a range of activities taking place on various other XR simulation, gaming, and other platforms. Scholars and professionals working from informal and formal education settings as well as those representing diverse industry sectors are invited to participate in the conference, where they may share their research findings, experiences, and insights; network and establish partnerships to envision and shape the future of XR and immersive technologies for learning; and contribute to the emerging scholarly knowledge base on how these technologies can be used to create experiences that educate, engage, and excite learners.
Note: Last year’s iLRN conference drew over 3,600 attendees from across the globe, making the scheduling of sessions a challenge. This year’s conference activities will be spread over a four-week period so as to give attendees more opportunities to participate at times that are conducive to their local time zones.
##### TOPIC AREAS #####
XR and immersive learning in/for:
Serious Games • 3D Collaboration • eSports • AI & Machine Learning • Robotics • Digital Twins • Embodied Pedagogical Agents • Medical & Healthcare Education • Workforce & Industry • Cultural Heritage • Language Learning • K-12 STEM • Higher Ed & Workforce STEM  • Museums & Libraries • Informal Learning • Community & Civic Engagement  • Special Education • Geosciences • Data Visualization and Analytics • Assessment & Evaluation
##### SUBMISSION STREAMS & CATEGORIES #####
ACADEMIC STREAM (Refereed paper published in proceedings):
– Full (6-8 pages) paper for oral presentation
– Short paper (4-5 pages) for oral presentation
– Work-in-progress paper (2-3 pages) for poster presentation
– Doctoral colloquium paper (2-3 pages)
PRACTITIONER STREAM (Refereed paper published in proceedings):
– Oral presentation
– Poster presentation
– Guided virtual adventures
– Immersive learning project showcase
NONTRADITIONAL SESSION STREAM (1-2 page extended abstract describing session published in proceedings):
– Workshop
– Special session
– Panel session
##### SESSION TYPES & SESSION FORMATS #####
– Oral Presentation: Pre-recorded video + 60-minute live in-world discussion with others presenting on similar/related topics (groupings of presenters into sessions determined by Program Committee)
– Poster Presentation: Live poster session in 3D virtual exhibition hall; pre-recorded video optional
– Doctoral Colloquium: 60-minute live in-world discussion with other doctoral researchers; pre-recorded video optional
– Guided Virtual Adventures: 60-minute small-group guided tours of various social and collaborative XR/immersive environments and platforms
– Immersive Learning Project Showcase: WebXR space to assemble a collection of virtual artifacts, accessible to attendees throughout the conference
– Workshop: 1- or 2-hour live hands-on session
– Special Session: 30- or 60-minute live interactive session held in world; may optionally be linked to one or more papers
– Panel Session: 60-minute live in-world discussion with a self-formed group of 3-5 panelists (including a lead panelist who serves as a moderator)
Please see the conference website for templates and guidelines.
##### PROGRAM TRACKS #####
Papers and proposals may be submitted to one of 10 program tracks, the first nine of which correspond to the iLRN Houses of application, and the tenth of which is intended for papers making knowledge contributions to the learning sciences, computer science, and/or game studies that are not linked to any particular application area:
Track 1. Assessment and Evaluation (A&E)
Track 2. Early Childhood Development & Learning (ECDL)
Track 3. Galleries, Libraries, Archives, & Museums (GLAM)
Track 4. Inclusion, Diversity, Equity, Access, & Social Justice (IDEAS)
Track 5. K-12 STEM Education
Track 6. Language, Culture, & Heritage (LCH)
Track 7. Medical & Healthcare Education (MHE)
Track 8. Nature & Environmental Sciences (NES)
Track 9. Workforce Development & Industry Training (WDIT)
Track 10. Basic Research and Theory in Immersive Learning (not linked to any particular application area)
##### PAPER/PROPOSAL SUBMISSION & REVIEW #####
Papers for the Academic Stream and extended-abstract proposals for the Nontraditional Session Stream must be prepared in standard IEEE double-column US Letter format using Microsoft Word or LaTeX, and will be accepted only via the online submission system, accessible via the conference website (from which guidelines and templates are also available).
Proposals for the Practitioner Stream are to be submitted via an online form, also accessible from the conference website.
A blind peer-review process will be used to evaluate all submissions.
##### IMPORTANT DATES #####
– Main round submission deadline – all submission types welcome: 2021-01-15
– Notification of review outcomes from main submission round: 2021-04-01
– Late round submission deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions only: 2021-04-08
– Camera-ready papers for proceedings due – Full and short papers: 2021-04-15
– Presenter registration deadline – Full and short papers (also deadline for early-bird registration rates): 2021-04-15
– Notification of review outcomes from late submission round: 2021-04-19
– Camera-ready work-in-progress papers and nontraditional session extended abstracts for proceedings due; final practitioner abstracts for conference program due: 2021-05-03
– Presenter registration deadline – Work-in-progress papers, practitioner presentations, and nontraditional sessions: 2021-05-03
– Deadline for uploading presentation materials (videos, slides for oral presentations, posters for poster presentations): 2021-05-10
– Conference opening: 2021-05-17
– Conference closing: 2021-06-10
*Full and short papers can only be submitted in the main round.
##### PUBLICATION & INDEXING #####
All accepted and registered papers in the Academic Stream that are presented at iLRN 2021 and all extended abstracts describing the Nontraditional Sessions presented at the conference will be published in the conference proceedings and submitted to the IEEE Xplore® digital library.
Content loaded into Xplore is made available by IEEE to its abstracting and indexing partners, including Elsevier (Scopus, EiCompendex), Clarivate Analytics (CPCI–part of Web of Science) and others, for potential inclusion in their respective databases. In addition, the authors of selected papers may be invited to submit revised and expanded versions of their papers for possible publication in the IEEE Transactions on Learning Technologies (2019 JCR Impact Factor: 2.714), the Journal of Universal Computer Science (2019 JCR Impact Factor: 0.91), or another Scopus and/or Web of Science-indexed journal, subject to the relevant journal’s regular editorial and peer-review policies and procedures.
##### CONTACT #####
Inquiries regarding the iLRN 2021 conference should be directed to the Conference Secretariat at conference@immersivelrn.org.
General inquiries about iLRN may be sent to info@immersivelrn.org.

More on Virbela in this IMS blog
https://blog.stcloudstate.edu/ims?s=virbela

XR Bootcamp Microsoft

For details, go here:
https://www.eventbrite.com/e/behind-the-scenes-with-microsoft-vr-in-the-wild-tickets-128181001827

Behind the Scenes: Microsoft’s Principal Researcher Eyal Ofek speaking about technical and social perspectives of XR

About this Event

The XR Bootcamp Open Lecture Series continues with Microsoft’s Principal Researcher Eyal Ofek!

Agenda:

Virtual Reality (VR) and Augmented Reality (AR) pose challenges and opportunities from both a technical and a social perspective. We can now have digital, rather than physical, objects change our understanding of the world around us. It is a unique opportunity to change reality as we sense it.

Microsoft researchers are looking for new possibilities to extend our abilities when we are not bound by our physical limitations, enabling superhuman abilities on the one hand and leveling the playing field for people with physical limitations on the other.

Dr. Ofek will describe efforts to design VR & AR applications that adjust to the user’s uncontrolled environment, enabling continuous use during work and leisure across a large variance of environments. He will also review efforts to extend rendering to new capabilities such as haptic rendering.

His lecture will be followed by a Q&A session where you can ask all your questions about the topic.

Lead Instructors:

Eyal Ofek is a principal researcher at the Microsoft Research lab in Redmond, WA. His research interests include Augmented Reality (AR)/Virtual Reality (VR), Haptics, interactive projection mapping, and computer vision for human-computer interaction. He is also the Specialty Chief Editor of Frontiers in Virtual Reality, for the area of Haptics and an Assoc. Editor of IEEE Computer Graphics and Application (CG&A).

Prior to joining Microsoft Research, he obtained his Ph.D. at the Hebrew University of Jerusalem and founded a couple of computer graphics companies, including one behind a successful drawing and photo editing application and another that developed the world’s first time-of-flight video cameras, a basis for the HoloLens depth camera.

This event is part of the Global XR Bootcamp event:

The Global XR Bootcamp 2020 will be the biggest community-driven, FREE, online Virtual, Augmented and Mixed Reality event in the world! Join us on YouTube or AltspaceVR for a 24-hour live stream with over 50 high-quality talks, panels and sessions. Meet your fellow XR enthusiasts in our Community Zone, and win amazing prizes – from vouchers to XR hardware.

++++++++++++++++++++
more on XR in this IMS blog
https://blog.stcloudstate.edu/ims?s=xr

Emerging Trends and Impacts of the Internet of Things in Libraries

Emerging Trends and Impacts of the Internet of Things in Libraries

https://www.igi-global.com/gateway/book/244559

Chapters:

Holland, B. (2020). Emerging Technology and Today’s Libraries. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 1-33). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch001

The purpose of this chapter is to examine emerging technology and today’s libraries. New technologies stand out first and foremost given that they will end up revolutionizing every industry in an age where digital transformation plays a major role. Major trends will define technological disruption. The next-gen of communication, core computing, and integration technologies will adopt new architectures. Major technological, economic, and environmental changes have generated interest in smart cities. Sensing technologies have made IoT possible, but also provide the data required for AI algorithms and models, often in real time, to make intelligent business and operational decisions. Smart cities consume different types of electronic internet of things (IoT) sensors to collect data and then use these data to manage assets and resources efficiently. This includes data collected from citizens, devices, and assets that are processed and analyzed to monitor and manage schools, libraries, hospitals, and other community services.

Makori, E. O. (2020). Blockchain Applications and Trends That Promote Information Management. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 34-51). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch002
Blockchain revolutionary paradigm is the new and emerging digital innovation that organizations have no choice but to embrace and implement in order to sustain and manage service delivery to the customers. From disruptive to sustaining perspective, blockchain practices have transformed the information management environment with innovative products and services. Blockchain-based applications and innovations provide information management professionals and practitioners with robust and secure opportunities to transform corporate affairs and social responsibilities of organizations through accountability, integrity, and transparency; information governance; data and information security; as well as digital internet of things.
Hahn, J. (2020). Student Engagement and Smart Spaces: Library Browsing and Internet of Things Technology. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 52-70). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch003
The purpose of this chapter is to provide evidence-based findings on student engagement within smart library spaces. The focus of smart libraries includes spaces that are enhanced with the internet of things (IoT) infrastructure and library collection maps accessed through a library-designed mobile application. The analysis herein explored IoT-based browsing within an undergraduate library collection. The open stacks and mobile infrastructure provided several years (2016-2019) of user-generated smart building data on browsing and selecting items in open stacks. The methods of analysis used in this chapter include transactional analysis and data visualization of IoT infrastructure logs. By analyzing server logs from the computing infrastructure that powers the IoT services, it is possible to infer in greater detail than heretofore possible the specifics of the way library collections are a target of undergraduate student engagement.
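My note: Hahn reports findings rather than code, but as a rough, hypothetical illustration of the kind of log analysis described here, a minimal Python sketch could tally browsing events per call-number range like this (the log format, field names, and file name are my assumptions, not data from the study):

import csv
from collections import Counter

# Tally "browse" events per call-number range from a hypothetical IoT
# server log in CSV form. Columns assumed: timestamp, event, call_range.
def browse_counts(log_path: str) -> Counter:
    counts = Counter()
    with open(log_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("event") == "browse":
                counts[row.get("call_range", "unknown")] += 1
    return counts

if __name__ == "__main__":
    # Print the ten most-browsed call-number ranges.
    for call_range, n in browse_counts("iot_browse_log.csv").most_common(10):
        print(f"{call_range}\t{n}")
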
Treskon, M. (2020). Providing an Environment for Authentic Learning Experiences. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 71-86). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch004
The Loyola Notre Dame Library provides authentic learning environments for undergraduate students by serving as “client” for senior capstone projects. Through the creative application of IoT technologies such as Arduinos and Raspberry Pis in a library setting, the students gain valuable experience working through software design methodology and create software in response to a real-world challenge. Although these proof-of-concept projects could be implemented, the library is primarily interested in furthering the research, teaching, and learning missions of the two universities it supports. Whether the library gets a product that is worth implementing is not a requirement; it is a “bonus.”
Rashid, M., Nazeer, I., Gupta, S. K., & Khanam, Z. (2020). Internet of Things: Architecture, Challenges, and Future Directions. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 87-104). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch005
The internet of things (IoT) is a computing paradigm that has changed our daily livelihood and functioning. IoT focuses on the interconnection of all the sensor-based devices like smart meters, coffee machines, cell phones, etc., enabling these devices to exchange data with each other during human interactions. With easy connectivity among humans and devices, speed of data generation is getting multi-fold, increasing exponentially in volume, and is getting more complex in nature. In this chapter, the authors will outline the architecture of IoT for handling various issues and challenges in real-world problems and will cover various areas where usage of IoT is done in real applications. The authors believe that this chapter will act as a guide for researchers in IoT to create a technical revolution for future generations.
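My note: as a simplified illustration of the device-to-gateway layer that this kind of IoT architecture rests on, here is a minimal Python sketch of an edge device reporting one sensor reading over HTTP; the gateway URL, path, and payload fields are placeholders of my own, not anything prescribed by the chapter.

import json
import time
import urllib.request

# Placeholder ingestion endpoint for a hypothetical gateway service.
GATEWAY_URL = "http://gateway.example.org/ingest/library/room-101/temperature"

def report_reading(value_celsius: float) -> int:
    # Package the reading as JSON and POST it to the gateway.
    payload = json.dumps({"ts": time.time(), "celsius": value_celsius}).encode("utf-8")
    req = urllib.request.Request(
        GATEWAY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # Will only succeed if a real gateway is listening at GATEWAY_URL.
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

if __name__ == "__main__":
    print(report_reading(21.5))

In a fuller architecture the gateway would forward such readings to a message broker or cloud service for storage and analytics; this sketch only covers the first hop.
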
Martin, L. (2020). Cloud Computing, Smart Technology, and Library Automation. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 105-123). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch006
As technology continues to change, the landscape of the work of librarians and libraries continue to adapt and adopt innovations that support their services. Technology also continues to be an essential tool for dissemination, retrieving, storing, and accessing the resources and information. Cloud computing is an essential component employed to carry out these tasks. The concept of cloud computing has long been a tool utilized in libraries. Many libraries use OCLC to catalog and manage resources and share resources, WorldCat, and other library applications that are cloud-based services. Cloud computing services are used in the library automation process. Using cloud-based services can streamline library services, minimize cost, and the need to have designated space for servers, software, or other hardware to perform library operations. Cloud computing systems with the library consolidate, unify, and optimize library operations such as acquisitions, cataloging, circulation, discovery, and retrieval of information.
Owusu-Ansah, S. (2020). Developing a Digital Engagement Strategy for Ghanaian University Libraries: An Exploratory Study. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 124-139). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch007
This study represents a framework that digital libraries can leverage to increase usage and visibility. The adopted qualitative research aims to examine a digital engagement strategy for the libraries in the University of Ghana (UG). Data is collected from participants (digital librarians) who are key stakeholders of digital library service provision in the University of Ghana Library System (UGLS). The chapter reveals that digital library services included rare collections, e-journal, e-databases, e-books, microfilms, e-theses, e-newspapers, and e-past questions. Additionally, the research revealed that the digital library service patronage could be enhanced through outreach programmes, open access, exhibitions, social media, and conferences. Digital librarians recommend that to optimize digital library services, literacy programmes/instructions, social media platforms, IT equipment, software, and website must be deployed. In conclusion, a DES helps UGLS foster new relationships, connect with new audiences, and establish new or improved brand identity.
Nambobi, M., Ssemwogerere, R., & Ramadhan, B. K. (2020). Implementation of Autonomous Library Assistants Using RFID Technology. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 140-150). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch008
This is an interesting time to innovate around disruptive technologies like the internet of things (IoT), machine learning, and blockchain. Autonomous assistants (IoT) are electro-mechanical systems that perform any prescribed task automatically with no human intervention, through self-learning and adaptation to changing environments. This means that by acknowledging autonomy, the system has to perceive environments, actuate a movement, and perform tasks with a high degree of autonomy; in other words, the ability to make its own decisions in a given environment. It is important to note that autonomous IoT using radio frequency identification (RFID) technology is used in educational sectors to boost the research arena, improve customer service, and ease book identification and traceability of items in the library. This chapter discusses the role, importance, critical tools, applicability, and challenges of autonomous IoT in the library using RFID technology.
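My note: to make the inventory idea concrete, here is a minimal, hypothetical Python sketch that compares RFID tags scanned on one shelf against the catalog's expected shelf assignments and flags missing, misplaced, and unknown items; the tag IDs and catalog mapping are invented for illustration.

# Map each RFID tag to the shelf where the catalog expects the item to live.
CATALOG = {
    "tag-001": "Shelf A",
    "tag-002": "Shelf A",
    "tag-003": "Shelf B",
}

def audit_shelf(shelf: str, scanned_tags: set[str]) -> dict[str, list[str]]:
    # Items the catalog expects on this shelf.
    expected = {t for t, s in CATALOG.items() if s == shelf}
    return {
        "missing": sorted(expected - scanned_tags),
        "misplaced": sorted(t for t in scanned_tags if CATALOG.get(t) not in (None, shelf)),
        "unknown": sorted(t for t in scanned_tags if t not in CATALOG),
    }

if __name__ == "__main__":
    # A reader pass over Shelf A picked up these tags.
    print(audit_shelf("Shelf A", {"tag-001", "tag-003", "tag-999"}))
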
Priya, A., & Sahana, S. K. (2020). Processor Scheduling in High-Performance Computing (HPC) Environment. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 151-179). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch009
Processor scheduling is one of the thrust areas in the field of computer science. The future technologies use a huge amount of processing for execution of their tasks like huge games, programming software, and in the field of quantum computing. In real-time, many complex problems are solved by GPU programming. The primary concern of scheduling is to reduce the time complexity and manpower. Several traditional techniques exist for processor scheduling. The performance of traditional techniques is reduced when it comes to the huge processing of tasks. Most scheduling problems are NP-hard in nature. Many of the complex problems are recently solved by GPU programming. GPU scheduling is another complex issue as it runs thousands of threads in parallel and needs to be scheduled efficiently. For such large-scale scheduling problems, the performance of state-of-the-art algorithms is very poor. It is observed that evolutionary and genetic-based algorithms exhibit better performance for large-scale combinatorial and internet of things (IoT) problems.
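My note: the chapter is a survey, but a toy genetic algorithm for one classic objective (assigning tasks to processors so the makespan, i.e., the busiest processor's total time, is minimized) gives a feel for the evolutionary approach it points to. The Python sketch below uses arbitrary illustration values for task durations, population size, and mutation rate; it is not code from the chapter.

import random

TASKS = [4, 7, 2, 9, 5, 3, 8, 6]   # task durations (arbitrary)
N_PROC = 3                          # number of processors

def makespan(assignment):
    # assignment[i] = processor index for task i; fitness is the max load.
    loads = [0] * N_PROC
    for duration, proc in zip(TASKS, assignment):
        loads[proc] += duration
    return max(loads)

def evolve(pop_size=40, generations=200, mutation_rate=0.1):
    pop = [[random.randrange(N_PROC) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                 # lower makespan = fitter
        parents = pop[: pop_size // 2]         # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(TASKS))
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < mutation_rate:
                child[random.randrange(len(TASKS))] = random.randrange(N_PROC)
            children.append(child)
        pop = parents + children
    best = min(pop, key=makespan)
    return best, makespan(best)

if __name__ == "__main__":
    assignment, best = evolve()
    print("assignment:", assignment, "makespan:", best)
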
Kirsch, B. (2020). Virtual Reality in Libraries. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 180-193). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch010
Librarians are beginning to offer virtual reality (VR) services in libraries. This chapter reviews how libraries are currently using virtual reality for both consumption and creation purposes. Virtual reality tools will be compared and contrasted, and recommendations will be given for purchasing and circulating headsets and VR equipment. Google Tour Creator and a smartphone or 360-degree camera can be used to create a virtual tour of the library and other virtual reality content. These new library services will be discussed along with practical advice and best practices for incorporating virtual reality into the library for instructional and entertainment purposes.
Heffernan, K. L., & Chartier, S. (2020). Augmented Reality Gamifies the Library: A Ride Through the Technological Frontier. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 194-210). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch011
Two librarians at a University in New Hampshire attempted to integrate gamification and mobile technologies into the exploration of, and orientation to, the library’s services and resources. From augmented reality to virtual escape rooms and finally an in-house app created by undergraduate, campus-based, game design students, the library team learned much about the triumphs and challenges that come with attempting to utilize new technologies to reach users in the 21st century. This chapter is a narrative describing years of various attempts, innovation, and iteration, which have led to the library team being on the verge of introducing an app that could revolutionize campus discovery and engagement.
Miltenoff, P. (2020). Video 360 and Augmented Reality: Visualization to Help Educators Enter the Era of eXtended Reality. In Holland, B. (Ed.), Emerging Trends and Impacts of the Internet of Things in Libraries (pp. 211-225). IGI Global. https://doi.org/10.4018/978-1-7998-4742-7.ch012
The advent of all types of eXtended Reality (XR)—VR, AR, MR—raises serious questions, both technological and pedagogical. The setup of campus services around XR is only the prelude to the more complex and expensive project of creating learning content using XR. In 2018, the authors started a limited proof-of-concept augmented reality (AR) project for a library tour. Building on their previous research and experience creating a virtual reality (VR) library tour, they sought a scalable introduction of XR services and content for the campus community. The AR library tour aimed to start us toward a matrix for similar services for the entire campus. They also explored the attitudes of students, faculty, and staff toward this new technology and its incorporation in education, as well as its potential and limitations toward the creation of a “smart” library.

IM 690 Gear 360 tutorial

IM 690 Virtual Reality and Augmented Reality

https://stcloudstate.learn.minnstate.edu/d2l/home/4819732

Jan. 21, MC 205 (how to get to the PDR room)

Plan: learn to create, edit, and use still 360-degree images and videos.

#scalability

  1. What is 360-degree video and how does it fit in the Virtual Reality concept?
    https://www.academia.edu/41628237/Chapter_12_VR_AR_and_Video_360_A_Case_Study_Towards_New_Realities_in_Education_by_Plamen_Miltenoff
  2. Video 360: existing materials versus materials we create
    1. how to find existing materials
      https://www.youtube.com/watch?v=nOHM8gnin8Y
      https://www.youtube.com/results?search_query=360+videos+education
    2. how to decide if we need to create materials
      https://poly.google.com/u/1/view/epydAlXlJSw
      https://poly.google.com/u/1/view/elo1OtpgzHP
      https://poly.google.com/view/8HB4l4zGSbv
  3. Tools and apps for Video 360
    1. Cameras:
      1. Samsung Gear 360: https://www.samsung.com/global/galaxy/gear-360/
        1. 2016
        2. 2017
      2. alternatives: https://filmora.wondershare.com/virtual-reality/samsung-gear-360-camera-alternatives.html
      3. Vuze: https://vuze.camera/
        https://youtu.be/peu-OavRcd8 
        Video 360 3D
    2. Samsung Android (Galaxy) phones app
      https://youtu.be/AKhfoJjcZBM?t=66
    3. Editing
      1. Gear 360 Action Director
        https://youtu.be/c2bcz77y3UY
      2. Photoshop CC

https://www.digitaltrends.com/photography/how-to-edit-360-photos-in-photoshop/

https://tonyredhead.com/adobe/360-photoshop-advanced-editing

      3. Premiere CC
        https://youtu.be/8g4DhBEWvak
      4. Others
  4. Issues and solutions
    1. issues connected to Windows and Apple
      https://youtu.be/2Fok2YcyNSw
      (explains all the quirks between the 2016 & 2017 cameras)
    2. issues connected to Gear 360 camera
    3. issues connected to Gear 360 ActionDirector
      in version 2.0, drag and drop, export etc.
      https://youtu.be/c2bcz77y3UY
  5. Upload (a 360-metadata check is sketched after this outline)
    1. local
    2. social media
      1. Facebook
      2. YouTube
        1. resolution
        2. live stream
  6. Viewing, goggles
    1. Google Cardboard
      1. why do we still consider it?
    2. Low-end goggles (examples)
      1. Pansonite 3D VR Headset
      2. Gearsone G1 VR Headset
      3. Utopia 360 VR Headset
      4. TaoTronics 3D VR Headset
      5. Destek V4 VR Headset
    3. Hi-end goggles
      1. Oculus https://www.oculus.com/
        1. Go
        2. Rift
        3. Quest
          1. haptic devices https://youtu.be/6IhQnWb44zk
      2. HTC Vive: https://www.vive.com/us/comparison/
      3. Daydream Lenovo: https://www.lenovo.com/us/en/daydreamvr/
  7. Creating content
    1. Poly Google Tour Creator: https://poly.google.com/creator/tours/
      https://poly.google.com/view/8HB4l4zGSbv
      (turn ambient audio on)
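
My note on uploading: YouTube and Facebook only treat a file as 360 content if it carries spherical (equirectangular) metadata. Gear 360 Action Director exports normally keep it, but other editors can strip it. Below is a crude Python check of my own that simply scans the file's bytes for the usual markers; if nothing is found, re-inject the metadata with a tool such as Google's open-source Spatial Media Metadata Injector before uploading. The file name is a placeholder.

import sys

# Byte markers commonly present in 360 files: "GPano" (360 photos),
# "equirectangular" and "GSpherical" (360 video XMP). A miss means the
# metadata was probably stripped and should be re-injected.
MARKERS = (b"equirectangular", b"GPano", b"GSpherical")

def has_360_markers(path: str) -> bool:
    # Reads the whole file into memory; fine for a quick manual check.
    with open(path, "rb") as f:
        data = f.read()
    return any(marker in data for marker in MARKERS)

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "gear360_export.mp4"
    print(path, "looks 360-tagged" if has_360_markers(path) else "no 360 metadata found")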

Error messages working with Action Director

Gear 360 Action Director error message (screenshot)

NVIDIA error message (screenshot)

More on VR in this IMS blog:
https://blog.stcloudstate.edu/ims/2018/11/01/vendors-for-vr/

https://docs.google.com/document/d/1efFVsOIwxlTO2Qy-onKbG0dgr8qum3onq3bgFkaVfec/edit?usp=sharing

Peter Rubin Future Presence

P. 4 But all that "disruption," as people love to call it, is overlooking the thing that's the most disruptive of them all: the way we relate to each other will never be the same. That's because of something called presence.
Presence is the absolute foundation of virtual reality, and in VR, it’s the absolute foundation of connection-connection with yourself, with an idea, with another human, even connection with artificial intelligence.
p. 28 VR definition
Virtual reality is an 1. artificial environment that’s 2. immersive enough to convince you that you are 3. actually inside it.
1. "Artificial environment" could mean just about anything. A photograph is an artificial environment, a video game is an artificial environment, a Pixar movie is an artificial environment; the only thing that matters is that it's not where you physically are.
p. 44 VR: putting the “it” in “meditation”
my note: it seems Rubin sees the 21st century VR as the equivalent of the drug experimentation in the 1960s US: p. 46 “VR is potentially going to become a direct interface to the subconscious”

p. 74 serious games, Carrie Heeter. p. 49

The default network in the brain in today's society is the wandering mind. We are ruminating about the past, and we are worrying about the future, or maybe even planning for the future; there is some productive thinking. But in general, a wandering mind is an unhappy mind. And that is where we spend all of our waking time: not being aware of everything that we are experiencing in the moment.
Heeter's own meditation practice had already led her to design apps and studies that investigated meditation's ability to calm that wandering mind.
p. 51 Something called interoception. It is a term that has been gaining ground in psychology circles in recent years and basically means awareness of bodily sensations, like my noticing the fact that I was sitting awkwardly or that keeping my elbows on the chair's armrests was making my shoulders hunch slightly. Not surprisingly, mindfulness meditation seems to heighten interoception. And that is exactly how Heeter and Allbritton structured the meditation I am doing on Costa del Sole. First, I connect with the environment; then with my body; then I combine the two. The combination of VR and interoception leads to what she describes as "embodied presence": not only do you feel like you are in the VR environment, but because you have consciously worked to integrate your bodily sensations into VR, it is a fuller, more vivid version of presence.

p. 52 guided meditation VR GMVR

p. 56 VVVR visual voice virtual reality

p. 57

Just as the ill-fated Google Glass immediately stigmatized all its wearers as "glassholes" (a.k.a. "techier-than-thou douche bags who dropped $1,500 to see an email notification appear in front of their face"), so too do some VR headsets still look like face TVs.

p. 61 Hedgehog Love
engineering feelings with social presence. p. 64 remember presence? This is the beginning of social presence. Mindfulness is cool, but making eye contact with Henry is the first step into the future.

p. 65 back in 1992, our friend Carrie Heeter posited that presence (the sensation that you are really there in VR) had three dimensions. There was personal presence, environmental presence, and social presence, which she basically defined as being around other people who register your existence.
p. 66 the idea that emotion can be not a cause, as we so often assume, but a result of behavior
p. 72 In chapter 1, we explained the difference between mobile VR and PC-driven VR. The former is cheaper and easier; all you do is drop your smartphone into a headset, and it provides just about everything you need. Dedicated VR headsets rely on the stronger processors of desktop PCs and game consoles, so they can provide a more robust sense of presence, usually at the cost of being tethered to your computer with cables. (And the cost of actual money: dedicated headset systems run hundreds of dollars, while mobile headsets like Samsung's Gear VR or Google's Daydream View can be had for mere tens of dollars.) There is one other fundamental distinction between mobile VR and high-end VR, though, and that is what you do with your hands, how you input your desires. When VR reemerged in the early 2010s, however, the question of input was open to debate. Actually, more than one debate. p. 73 Video game controllers are basically metaphors. Some, like steering wheels or pilot flight sticks, might look like the thing they're supposed to be, but at their essence they are all just collections of buttons. p. 77 HTC sells small wearable trackers that you can affix to any object, or any body part, to bring it into the Vive's VR.
p. 78 wait a second – you were talking about storytelling.
p. 79 Every Hollywood studio you can imagine (21st Century Fox, Paramount, Warner Bros.) has already invested in virtual reality. They have made VR experiences based on their own movies, like Interstellar or Ghost in the Shell, and they have invested in other VR companies. Hollywood directors like Doug Liman (Edge of Tomorrow) and Robert Stromberg (Maleficent) have taken on VR projects. And the progress is exhilarating. Alejandro González Iñárritu, a four-time Oscar winner whose 2014 movie Birdman won Best Picture and earned him Best Director, received a special achievement Academy Award in 2017 for a VR piece he made. Yet Carne y Arena, which puts viewers inside a harrowing journey from Mexico to the United States, is nothing like a movie, or even a video game.

When it premiered at the Cannes Film Festival in early 2017, it was housed in an airplane hangar; viewers were ushered, barefoot, into a room with a sand-covered floor, where they could watch and interact with other people trying to make it over the border. Arrests, detention centers, dehydration: the extremity of the human condition happening all around you. In its announcement, the Academy of Motion Picture Arts and Sciences called the piece "deeply emotional and physically immersive."

p. 83 empathy versus intimacy. Why good stories need someone else

p. 84 Chris Milk

http://www.thewildernessdowntown.com/

p. 85 empathy vs intimacy: appreciation vs emotion

Both of these words are fuzzy, to say the least. Both have decades of study behind them, but both have also appeared on more magazine covers than just about any words other than possibly "abs."

Empathy: the ability to identify with and understand others, particularly on an emotional level. It involves imagining yourself in the place of another and, therefore, appreciating how they feel.

Intimacy: a complex sphere of 'inmost' relationships with self and others that are not usually minor or incidental (though they may be transitory) and which usually touch the personal world very deeply. They are our closest relationships with friends, family, children, lovers, but they are also the deep and important experiences we have with self.

Empathy necessarily needs to involve other people; intimacy doesn't. Empathy involves emotional understanding; intimacy involves emotion itself. Empathy, at its base, is an act of getting outside yourself: you're projecting yourself into someone else's experience, which means that in some ways you are leaving your own experience behind, other than as a reference point. Intimacy, on the other hand, is at its base an act of feeling: you might be connecting with someone or something else, but you are doing so on the basis of the emotions you feel. p. 86 One type of VR experience perfectly illustrates the surprising gap between empathy and intimacy: live-action VR. p. 87 Unlike CGI-based storytelling, which falls somewhere in between game and movie, live-action VR feels much more like the conventional video forms that we are used to from television and movies. Like those media, people have been using VR to shoot everything from narrative fiction to documentary to sports.

Nonny de la Peña Hunger in Los Angeles at Sundance

p. 89 Clouds over Sidra Chris Milk

p. 90 SXSW south by southwest Austin Texas

p. 92 every single story has only one goal at its base: to make you care. This holds true whether it is a tale told around a campfire at night, one related in a sequence of panels in a comic book, or the dialogue-heavy narrative of a television show. The story might be trying to make you laugh, or just scare you, or to make you feel sad or happy on behalf of one of the characters, but those are all just forms of caring, right? Your emotional investment (the fact that what happens in this tale matters to you) is the fundamental aim of the storyteller.

Storytelling, then, has evolved to find ways to draw you out of yourself, to make you forget that what you are hearing or seeing or reading isn't real. It's only at that point, after all, that our natural capacity for empathy can kick in. p. 93 Meanwhile, technology continues to evolve to detach us from those stories. For one, the frame itself continues to get smaller. Stranger still, this distraction has happened while stories continue to become more and more complex. Narratively, at least, stories are more intricate than they have ever been. p. 94 Now, with VR storytelling, the distracting power of multiple screens has met its match.

p. 101 experiencing our lives- together

What Video 360 cannot do, though, is bring people together inside VR, the way McClure's sing-multicolored-blobs-at-each-other tag-team project VVVR does. That's why even VR filmmaking powerhouses like Within (https://www.with.in/get-the-app) are moving beyond mere documentary and narrative and trying to turn storytelling into a shared experience.

Make no mistake: storytelling has always been a shared experience. Being conscripted into the story, or even being the story.

https://www.linkedin.com/in/jess-engel-96421010/

https://medium.com/@Within/welcome-jess-aea620df0ca9

p. 103 Like so many VR experiences, Life of Us defies many of the ways we describe a story to each other. For one, it feels at once shorter and longer than its actual seven-minute runtime; although it seems to be over in a flash, that flash contains so many details that in retrospect it is as full and vivid as a two-hour movie.

There is another thing, though, that sets Life of Us apart from so many other stories: the fact that not only was I in the story, but someone else was in there with me. And that someone wasn't a filmed character talking to a camera that they knew was recording them, or a video game creature that was programmed to look in 'my' direction, but a real person, a person who saw what I saw, a person who was present for each of those moments and who now is inextricably part of my own, shard-like memory of them.

p. 107 what to do and what to do it with. How social VR is reinventing everything from game night to online harassment.

Facebook Hires Altspace CEO Eric Romo

p. 110 VR isn't even Romo's first bet on the future. When he was finishing up his master's degree in mechanical engineering, a professor emailed him on behalf of two men who were recruiting for a rocket company they were starting. One of those men was Elon Musk, which is how Romo became the 13th employee at SpaceX. Eventually, he started a company focused on solar energy, but when the bottom fell out of the industry, he shut down the company and looked for his next opportunity. Romo spent the next year and a half researching the technology and thinking about what kind of company might make sense in the new VR-enabled world. He had read Snow Crash, but he also knew that our hopes for the VR future could very well end up like the proverbial flying car: defined, and limited, by an expectation that might not match perfectly with what we actually want.

https://www.amazon.com/Snow-Crash-Neal-Stephenson/dp/1491515058

p. 116 Back in the day, trolling just referred to pursuing a provocative argument for kicks. Today, the word is used to describe the actions of anonymous mobs like the one that, for instance, drove actress Leslie Jones off Twitter with an onslaught of racist and sexist abuse. Harassment has become one of the defining characteristics of the internet as we use it today. But with the emergence of VR, our social networks have become, quite literally, embodied.

p. 116 https://medium.com/athena-talks/my-first-virtual-reality-sexual-assault-2330410b62ee 

p. 142 increasing memory function by moving from being a voyeur to physically participating in the virtual activity. Embodied presence (bringing not just your head and your hands, but your body into VR) strengthens memories in a number of ways.

p. 143 At the beginning of 2017, Facebook published some of its internal research about the potential of social VR. The agency Neurons Inc. measured the eye movements, brain activity, and pulses of volunteers who were watching streaming video on smartphones and ultimately discovered that buffering and lag were significantly more stressful than waiting in line at a store, and even slightly more stressful than watching a horror movie.

p. 145 After the VR experience, more than 80% of introverts (as identified by a short survey participants took beforehand) wanted to become friends with the person they had chatted with, as opposed to less than 60% of extroverts.

p. 149 Rec Room Confidential: the anatomy and evolution of VR friendships

p. 165 reach out and touch someone; haptics, tactile presence and making VR physical.

https://www.digicert.com/ 

VOID: Vision of Infinite Dimensions p. 167

p. 169 the 4-D-effects: steam, cool air, moisture,

p. 170 Copresence

About

https://www.researchgate.net/profile/Shanyang_Zhao

https://www.researchgate.net/publication/2532682_Toward_A_Taxonomy_of_Copresence

https://astro.temple.edu/~bzhao001/Taxonomy_Copresence.pdf

p. 171 Zhao laid out two different criteria. The first was whether or not two people are actually in the same place: basically, are they or their stand-ins physically close enough to be able to communicate without any other tools? Two people, Zhao wrote, can either have "physical proximity" or "electronic proximity," the latter being some sort of networked connection. The second criterion was whether each person is corporeally there; in other words, is it their actual flesh-and-blood body? The second condition can have three outcomes: both people can be there corporeally; neither can be there corporeally, instead using some sort of stand-in like an avatar or a robot; or just one of them can be there corporeally, with the other using a stand-in.

"virtual copresence" is when a flesh-and-blood person interacts physically with a representative of a human; if that sounds confusing, a good example is using an ATM, where the ATM is a stand-in for a bank teller

p. 172 "hypervirtual copresence," which involves nonhuman devices that are interacting in the same physical space in a humanlike fashion. Social VR does not quite fit into any of these categories. Zhao refers to this sort of hybrid as a "synthetic environment" and claims that it is a combination of corporeal telecopresence (like Skyping) and virtual telecopresence (like Waze directions, https://www.waze.com/)

p. 172 haptic tactics for tactile aptness

Of the five human senses, a VR headset can currently stimulate only two: vision and hearing. That leaves three others, and while smell and taste may come some day...
P. 174; https://en.wikipedia.org/wiki/Aldous_Huxley Brave New World. tactile “feelies”

p. 175 https://en.wikipedia.org/wiki/A._Michael_Noll, 1971

p. 177 https://www.pcmag.com/review/349966/oculus-touch

p. 178 haptic feedback accessories, gloves, full-body suits; p. 179 ultrasonics, low-frequency sound waves.

p. 186 the dating game: how touch changes intimacy.

p. 187 MIT Presence https://www.mitpressjournals.org/loi/pres

p. 186-190 questionnaire for the VRrelax project

p. 195 XXX-change program: turning porn back into people

p. 221 where we are going, we don't need headsets. Let's get speculative

p. 225 Magic Leap. p. 227 Magic Leap calls its technology "mixed reality," claiming that the three-dimensional virtual objects it brings into your world are far more advanced than the flat, static overlays of augmented reality. In reality, there is no longer any distinction between the two; in fact, there are by now so many terms being used in various ways by various companies that it's probably worth a quick clarification.

definitions

Virtual reality: the illusion of an all-enveloping artificial world, created by wearing an opaque display in front of your eyes.

augmented reality: Bringing artificial objects into the real world. These can be as simple as a "heads-up display," like a speedometer projected onto your car's windshield, or as complex as a seemingly virtual creature walking across your real-world living room, casting a realistic shadow on the floor.

mixed reality: generally speaking, this is synonymous with AR, or at least with the part of AR that brings virtual objects into the real world. However, some people prefer "mixed" because they think "augmented" implies that reality isn't enough.

extended or synthetic reality (XR or SR): all of the above! These are both catch-all terms that encompass the full spectrum of virtual elements in visual settings.

p. 228 https://avegant.com/.

Edward Tang:

p. 231 in ten years, we won't even have smartphones anymore.

p. 229 If VR is a blinking toddler, though, AR/MR is a third-trimester fetus: it may be fully formed but it is not quite ready to be out in the world yet. The headsets are large, the equipment is far more expensive than VR, and in many cases we don't even know what a consumer product looks like.

p. 235 when 2020 is hindsight: what life in 2028 might actually look like.

++++++++++++

Belamire, J. (2016, October 20). My First Virtual Reality Groping. Athena Talks. https://medium.com/athena-talks/my-first-virtual-reality-sexual-assault-2330410b62ee

Splice upload on YouTube

Splice fails to export directly to YouTube

Here is a short screencapture I did on my phone for you:

Here are the snapshots of the step-by-step process

  • To export your Splice project, click in the upper right corner

  • Instead of choosing YouTube, just click on the blue button “Save”

  • Choose a file size to save: a smaller one will do fine

  • Get out of Splice and open the YouTube app

  • Click on the little camera icon to upload your Splice video

  • Choose the Splice exported video and upload

++++++++
more on Splice in this IMS blog
https://blog.stcloudstate.edu/ims?s=splice

GoPro help article on Splice video exports failing
https://gopro.com/help/articles/Solutions_Troubleshooting/Splice-Video-Exports-Fail

Hololens in academic library

Blurred Lines—between virtual reality games, research, and education

http://library.ifla.org/2133/

p. 5 a LibGuide was created that provided a better description of the available software for both the Microsoft Hololens and the HTC Vive and also discussed potential applications for the technology.

Both the HTC Vive and the Hololens were made bookable through the library’s LibCalendar booking system, streamlining the booking process and creating a better user experience.

When the decision was made to bring virtual and augmented reality into the McGill University Library, an important aspect of this project was to develop a collection of related software to be used alongside the technology. In building this software collection a priority was placed on acquiring software that could be demonstrated as having educational value, or that could potentially be used in relation to, or in support of, university courses.

For the Microsoft Hololens, all software was acquired through Microsoft’s Online Store. The store has a number of educationally relevant HoloLens apps available for purchase. The app ARchitect, for example, gives a basic sense of how augmented reality could be used for viewing new building designs. The app Robotics BIW allows user to simulate robotic functions. A select number of apps, such as Land of the Dinosaurs and Boulevard, provide applications for natural history and art. There were a select number of apps related to science, mathematics and medicine, and others with artistic applications. All of the HoloLens applications were free but, compared to what is available for virtual reality, the experiences were much smaller in size and scope.

For the HoloLens, a generic user account was created and shared with the person who booked the HoloLens at the time of their booking. After logging into this account – which could sometimes prove to be a challenge because typing is done using the headset’s gesture controls – the user could select a floating tile which would reveal a list of available software. An unresolved problem was that users would then need to refer to the HoloLens LibGuide for a detailed description of the software, or else choose software based on name alone, and the names were not always helpful.

For the Microsoft HoloLens, the three most popular software programs were Land of the Dinosaurs, Palmyra and Insight Heart. Insight Heart allowed users to view and manipulate a 3D rendering of a high-resolution human heart, Land of the Dinosaurs provided an augmented reality experience featuring 3D renderings of dinosaurs, and Palmyra gave an augmented reality tour of the ancient city of Palmyra.

p. 7 Though many students had ideas for research projects that could make use of the technology, there was no available software that would have allowed them to use augmented reality in the way they wanted. There were no students interested in developing their own software to be used with the technology either.

p. 8 we found that the Microsoft HoloLens received significant use from our patrons; we would recommend the purchase of one only for libraries serving researchers and developers.

++++++++++++

Getting Real in the Library: A Case Study at the University of Florida

Samuel R. Putnam and Sara Russell Gonzalez, Issue 39, 2018-02-05

Getting Real in the Library: A Case Study at the University of Florida

As an alternative, Microsoft offers a Hololens with enterprise options geared toward multiple users for $5000.

The transition from mobile app development to VR/AR technology also reflected the increased investment in VR/AR by some of the largest technology companies in the world. In the past four years, Facebook purchased the virtual reality company Oculus, Apple released the ARKit for developing augmented reality applications on iOS devices, Google developed Google Cardboard as an affordable VR option, and Sony released Playstation VR to accompany their gaming platform, just to name a few notable examples. This increase of VR/AR development was mirrored by a rise in student interest and faculty research in using and creating new VR/AR content at UF.

+++++++++++

Arnhem, J.-P. van, Elliott, C., & Rose, M. (2018). Augmented and Virtual Reality in Libraries. Rowman & Littlefield.
https://books.google.com/books?id=PslaDwAAQBAJ&lpg=PA205&ots=HT7qTY-16o&dq=hololens%20academic%20library&lr&pg=PA214#v=onepage&q=hololens%20academic%20library&f=false
360 degree video in library instruction
+++++++++++++++
Hammady, R., & Ma, M. (2018). Designing Spatial UI as a Solution of the Narrow FOV of Microsoft HoloLens: Prototype of Virtual Museum Guide. In Proceedings of the 4th International AR & VR Conference 2018. Springer. Retrieved from https://eprints.staffs.ac.uk/4799/
‘HoloMuse’ engages users with archaeological artefacts through gesture-based interactions (Pollalis, Fahnbulleh, Tynes, & Shaer, 2017). Another study utilised HoloLens to provide in-situ assistance for users (Blattgerste, Strenge, Renner, Pfeiffer, & Essig, 2017). HoloLens has also been used to provide magnification for low-vision users via a complementary finger-worn camera alongside the HMD (Stearns, DeSouza, Yin, Findlater, & Froehlich, 2017). Even in medical applications, HoloLens has contributed to 3D visualisation using AR techniques (Syed, Zakaria, & Lozanoff, 2017) and provided optimised measurements in medical surgeries (Pratt et al., 2018) (Adabi et al., 2017). Applications of HoloLens extend to visualising prototype designs (DeLaOsa, 2017), and the device has shown its potential in the gaming industry (Volpe, 2015) (Alvarez, 2015) and in engaging cultural visitors with gaming activities (Raptis, Fidas, & Avouris, 2017).
++++++++++++
van Arnhem, J.-P., & Spiller, J. M. (2014). Augmented Reality for Discovery and Instruction. Journal of Web Librarianship, 8(2), 214–230. https://doi.org/10.1080/19322909.2014.904208

+++++++++++

Evaluating the Microsoft HoloLens through an augmented reality assembly application
Proceedings Volume 10197, Degraded Environments: Sensing, Processing, and Display 2017; 101970V (2017) https://doi.org/10.1117/12.2262626
Event: SPIE Defense + Security, 2017, Anaheim, California, United States
To assess the HoloLens’ potential for delivering AR assembly instructions, the cross-platform Unity 3D game engine was used to build a proof of concept application. Features focused upon when building the prototype were: user interfaces, dynamic 3D assembly instructions, and spatially registered content placement. The research showed that while the HoloLens is a promising system, there are still areas that require improvement, such as tracking accuracy, before the device is ready for deployment in a factory assembly setting.
+++++++++++
Pollalis, C., Fahnbulleh, W., Tynes, J., & Shaer, O. (2017). HoloMuse: Enhancing Engagement with Archaeological Artifacts Through Gesture-Based Interaction with Holograms. In Proceedings of the Eleventh International Conference on Tangible, Embedded, and Embodied Interaction (pp. 565–570). New York, NY, USA: ACM. https://doi.org/10.1145/3024969.3025094
https://www.researchgate.net/publication/315472858_HoloMuse_Enhancing_Engagement_with_Archaeological_Artifacts_through_Gesture-Based_Interaction_with_Holograms
++++++++++++++
Gračanin, D., Ciambrone, A., Tasooji, R., & Handosa, M. (2017). Mixed Library — Bridging Real and Virtual Libraries. In S. Lackey & J. Chen (Eds.), Virtual, Augmented and Mixed Reality (pp. 227–238). Springer International Publishing.
We use the Microsoft HoloLens device to augment the user’s experience in the real library and to provide a rich set of affordances for embodied and social interactions. We describe a mixed reality based system, a prototype mixed library, that provides a variety of affordances to support embodied interactions and improve the user experience.

++++++++++++

Dourish, P. (n.d.). Where the Action Is. Retrieved November 23, 2018, from https://mitpress.mit.edu/books/where-action
embodied interactions
Computer science as an engineering discipline has been spectacularly successful. Yet it is also a philosophical enterprise in the way it represents the world and creates and manipulates models of reality, people, and action. In this book, Paul Dourish addresses the philosophical bases of human-computer interaction. He looks at how what he calls “embodied interaction”—an approach to interacting with software systems that emphasizes skilled, engaged practice rather than disembodied rationality—reflects the phenomenological approaches of Martin Heidegger, Ludwig Wittgenstein, and other twentieth-century philosophers. The phenomenological tradition emphasizes the primacy of natural practice over abstract cognition in everyday activity. Dourish shows how this perspective can shed light on the foundational underpinnings of current research on embodied interaction. He looks in particular at how tangible and social approaches to interaction are related, how they can be used to analyze and understand embodied interaction, and how they could affect the design of future interactive systems.

++++++++++

Pollalis, C., Fahnbulleh, W., Tynes, J., & Shaer, O. (2017). HoloMuse: Enhancing Engagement with Archaeological Artifacts Through Gesture-Based Interaction with Holograms. In Proceedings of the Eleventh International Conference on Tangible, Embedded, and Embodied Interaction (pp. 565–570). New York, NY, USA: ACM. https://doi.org/10.1145/3024969.3025094
HoloMuse, an AR application for the HoloLens wearable device, allows users to actively engage with archaeological artifacts from a museum collection: they can pick up, rotate, scale, and alter a hologram of an original archeological artifact using in-air gestures. Users can also curate their own exhibit or customize an existing one by selecting artifacts from a virtual gallery and placing them within the physical world so that they are viewable only using the device. We intend to study the impact of HoloMuse on learning and engagement with college-level art history and archeology students.
++++++++++++

Dugas, Z., & Kerne, A. (2007). Location-Aware Augmented Reality Gaming for Emergency Response Education: Concepts and Development. ResearchGate. Retrieved from https://www.researchgate.net/publication/242295040_Location-Aware_Augmented_Reality_Gaming_for_Emergency_Response_Education_Concepts_and_Development

+++++++++++

Library Spaces II: The IDEA Lab at the Grainger Engineering Library Information Center

https://prism.ucalgary.ca/bitstream/handle/1880/52190/DL5_mischo_IDEA_Lab2.pdf

++++++++++
more on HoloLens in this IMS blog
https://blog.stcloudstate.edu/ims?s=hololens

U of St. Thomas HyFlex model of course delivery

ELI Annual Meeting 2018

https://events.educause.edu/eli/annual-meeting/2018/agenda/the-hyflex-model-of-course-delivery-tribulations-triumphs-and-technology

From: EDUCAUSE Listserv <BLEND-ONLINE@LISTSERV.EDUCAUSE.EDU> on behalf of “Kinsella, John R.” <jrkinsella@STTHOMAS.EDU>
Reply-To: EDUCAUSE Listserv <BLEND-ONLINE@LISTSERV.EDUCAUSE.EDU>
Date: Thursday, November 15, 2018 at 11:43 AM
To: EDUCAUSE Listserv <BLEND-ONLINE@LISTSERV.EDUCAUSE.EDU>
Subject: Re: [BLEND-ONLINE] Flexible Training/Learning Incubation Spaces

We launched our group, STELAR (St. Thomas E-Learning and Research), almost 2 years ago. Part of that launch included a physical space that offers: innovative individual and collaborative group study spaces for students; consultation spaces for faculty and our staff; meeting spaces; a Technology Showcase providing access to leading-edge technology for faculty and students (VR/AR, AI, ML); an Active Learning classroom space used for training and for faculty to experiment; and a video recording space for faculty to create course video objects using a Lightboard, touch panel computer, or just talking to the camera.

We’ve seen exceptional usage among our students for this space, likely in part because we partnered with our library to include our space along with the other learning resources for students in our main library. We have had numerous faculty not only experiment with but then integrate VR/AR and other leading-edge technologies in their classes and research projects. Our classroom is consistently busy for training, class sessions, meetings, etc., and our learning spaces see student use throughout the day and into the evening. In short, our physical space has become an essential and highly visible part of the work we do around providing opportunities, expertise, and technology for the innovation of teaching and learning. (Our tagline: … at the intersection of Pedagogy and Technology)

The reception has been so positive that our space has been used as a model for some new student-focus collaboration spaces around campus.

We have a good deal of information about STELAR as a team on our website: https://www.stthomas.edu/stelar/

It does include some information about our physical space but we’ve also pared that down since our launch.  I’d be happy to connect you with our team if you’d like to learn more about what we’ve done here, where we’ve seen success and ideas that didn’t pan out as we expected.

John Kinsella
Instructional Systems Consultant

ITS – STELAR: St. Thomas E-Learning and Research
(651) 962-7839
jrkinsella@stthomas.edu

24/7 Canvas Support: 1.877.704.2127 or Help button in Canvas course.
Other tech needs, contact: Techdesk@stthomas.edu

Digital Learning Essentials: Students/faculty self-enroll here

More: Blog, Online Showcase, Trainings & Events, & Online Teaching Certificate

can XR help students learn

Giving Classroom Experiences (Like VR) More … Dimension

https://www.insidehighered.com/digital-learning/article/2018/11/02/virtual-reality-other-3-d-tools-enhance-classroom-experiences

at a session on the umbrella concept of “mixed reality” (abbreviated XR) here Thursday, attendees had some questions for the panel’s VR/AR/XR evangelists: Can these tools help students learn? Can institutions with limited budgets pull off ambitious projects? Can skeptical faculty members be convinced to experiment with unfamiliar technology?

All four — one each from Florida International University, Hamilton College, Syracuse University and Yale University — have just finished the first year of a joint research project commissioned by Educause and sponsored by Hewlett-Packard to investigate the potential for immersive technology to supplement and even transform classroom experiences.

“Campus of the Future” report, written by Jeffrey Pomerantz

Yale has landed on a “hub model” for project development — instructors propose projects and partner with students with technological capabilities to tap into a centralized pool of equipment and funding. (My note: this is what I suggest in my Chapter 2 of Arnheim, Eliot & Rose (2012) Lib Guides)

Several panelists said they had already been getting started on mixed reality initiatives prior to the infusion of support from Educause and HP, which helped them settle on a direction.

While 3-D printing might seem to lend itself more naturally to the hard sciences, Yale’s humanities departments have cottoned to the technology as a portal to answering tough philosophical questions.

institutions would be better served forgoing an early investment in hardware and instead gravitating toward free online products like Unity, Organon and You by Sharecare, all of which allow users to create 3-D experiences from their desktop computers.

+++++++++

“Campus of the Future” report, written by Jeffrey Pomerantz

https://library.educause.edu/~/media/files/library/2018/8/ers1805.pdf?la=en

XR technologies encompassing 3D simulations, modeling, and production.

This project sought to identify

  • current innovative uses of these 3D technologies,
  • how these uses are currently impacting teaching and learning, and
  • what this information can tell us about possible future uses for these technologies in higher education.

p. 5 Extended reality (XR) technologies, which encompass virtual reality (VR) and augmented reality (AR), are already having a dramatic impact on pedagogy in higher education. XR is a general term that covers a wide range of technologies along a continuum, with the real world at one end and fully immersive simulations at the other.

p. 6 The Campus of the Future project was an exploratory evaluation of 3D technologies for instruction and research in higher education: VR, AR, 3D scanning, and 3D printing. The project sought to identify interesting and novel uses of 3D technology.

p. 7 HP would provide the hardware, and EDUCAUSE would provide the methodological expertise to conduct an evaluation research project investigating the potential uses of 3D technologies in higher education learning and research.

The institutions that participated in the Campus of the Future project were selected because they were already on the cutting edge of integrating 3D technology into pedagogy. These institutions were therefore not representative, nor were they intended to be representative, of the state of higher education in the United States. These institutions were selected precisely because they already had a set of use cases for 3D technology available for study

p. 9  At some institutions, the group participating in the project was an academic unit (e.g., the Newhouse School of Communications at Syracuse University; the Graduate School of Education at Harvard University). At these institutions, the 3D technology provided by HP was deployed for use more or less exclusively by students and faculty affiliated with the particular academic unit.

p. 10 definitions
there is not universal agreement on the definitions of these terms or on the scope of these technologies. Also, all of these technologies currently exist in an active marketplace and, as in many rapidly changing markets, there is a tendency for companies to invent neologisms around 3D technology.

A 3D scanner is not a single device but rather a combination of hardware and software. There are generally two pieces of hardware: a laser scanner and a digital camera. The laser scanner bounces laser beams off the surface of an object to determine its shape and contours.
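
To make the software half of that pipeline concrete, here is a minimal sketch (my own illustration, not taken from the report) of how the raw readings from such a laser scanner—a firing direction plus a measured range—can be converted into the 3D points that the scanner’s software assembles into a model. The data structure and sample numbers are invented for the example.

```typescript
// Minimal sketch: converting raw laser-scanner readings into 3D points.
// Each reading records the direction the laser was fired (pan/tilt, in radians)
// and the distance at which it hit the object's surface (range, in metres).
interface Reading {
  pan: number;   // horizontal angle
  tilt: number;  // vertical angle
  range: number; // measured distance to the surface
}

type Point3D = [x: number, y: number, z: number];

// Standard spherical-to-Cartesian conversion.
function toPoint({ pan, tilt, range }: Reading): Point3D {
  const x = range * Math.cos(tilt) * Math.cos(pan);
  const y = range * Math.cos(tilt) * Math.sin(pan);
  const z = range * Math.sin(tilt);
  return [x, y, z];
}

// A scan is just many such readings; the scanner's software meshes the resulting
// point cloud, and the digital camera's photos typically supply colour/texture.
const scanReadings: Reading[] = [
  { pan: 0.00, tilt: 0.10, range: 1.52 },
  { pan: 0.01, tilt: 0.10, range: 1.51 },
  { pan: 0.02, tilt: 0.10, range: 1.49 },
];
const pointCloud: Point3D[] = scanReadings.map(toPoint);
console.log(pointCloud);
```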

p. 11 definitions

Virtual reality means that the wearer is completely immersed in a computer simulation. Several types of VR headsets are currently available, but all involve a lightweight helmet with a display in front of the eyes (see figure 2). In some cases, this display may simply be a smartphone (e.g., Google Cardboard); in other cases, two displays—one for each eye—are integrated into the headset (e.g., HTC Vive). Most commercially available VR rigs also include handheld controllers that enable the user to interact with the simulation by moving the controllers in space and clicking on finger triggers or buttons.

p. 12 definitions

Augmented reality provides an “overlay” of some type over the real world through the use of a headset or even a smartphone.

In an active technology marketplace, there is a tendency for new terms to be invented rapidly and for existing terms to be used loosely. This is currently happening in the VR and AR market space. The HP VR rig and the HTC Vive unit are marketed as being immersive, meaning that the user is fully immersed in a simulation—virtual reality. Many currently available AR headsets, however, are marketed not as AR but rather as MR (mixed reality). These MR headsets have a display in front of the eyes as well as a pair of front-mounted cameras; they are therefore capable of supporting both VR and AR functionality.

p. 13 Implementation

Technical difficulties.
Technical issues can generally be divided into two broad categories: hardware problems and software problems. There is, of course, a common third category: human error.

p. 15 the technology learning curve

The well-known diffusion of innovations theoretical framework articulates five adopter categories: innovators, early adopters, early majority, late majority, and laggards. Everett M. Rogers, Diffusion of Innovations, 5th ed. (New York: Simon and Schuster, 2003).

It is also likely that staff in the campus IT unit or center for teaching and learning already know who (at least some of) these individuals are, since such faculty members are likely to already have had contact with these campus units.
Students may of course also be innovators and early adopters, and in fact several participating institutions found that some of the most creative uses of 3D technology arose from student projects.

p. 30  Zeynep Tufekci, in her book Twitter and Tear Gas

definition: There is no necessary distinction between AR and VR; indeed, much research on the subject is based on a conception of a “virtuality continuum” from entirely real to entirely virtual, where AR lies somewhere between those ends of the spectrum. Paul Milgram and Fumio Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Transactions on Information Systems, vol. E77-D, no. 12 (1994); Steve Mann, “Through the Glass, Lightly,” IEEE Technology and Society Magazine 31, no. 3 (2012): 10–14.

For the future of 3D technology in higher education to be realized, that technology must become as much a part of higher education as any technology: the learning management system (LMS), the projector, the classroom. New technologies and practices generally enter institutions of higher education as initiatives. Several active learning classroom initiatives are currently under way,36 for example, as well as a multi-institution open educational resources (OER) degree initiative.37

p. 32 Storytelling

Some scholars have argued that all human communication is based on storytelling;41 certainly advertisers have long recognized that storytelling makes for effective persuasion,42 and a growing body of research shows that narrative is effective for teaching even topics that are not generally thought of as having a natural story, for example, in the sciences.43

p. 33 accessibility

The experience of Gallaudet University highlights one of the most important areas for development in 3D technology: accessibility for users with disabilities.

p. 34 instructional design

For that to be the case, 3D technologies must be incorporated into the instructional design process for building and redesigning courses. And for that to be the case, it is necessary for faculty and instructional designers to be familiar with the capabilities of 3D technologies. And for that to be the case, it may not be necessary but would certainly be helpful for instructional designers to collaborate closely with the staff in campus IT units who support and maintain this hardware.

Every institution of higher education has a slightly different organizational structure, of course, but these two campus units are often siloed. This siloing may lead to considerable friction in conducting the most basic organizational tasks, such as setting up meetings and apportioning responsibilities for shared tasks. Nevertheless, IT units and centers for teaching and learning are almost compelled to collaborate in order to support faculty who want to integrate 3D technology into their teaching. It is necessary to bring the instructional design expertise of a center for teaching and learning to bear on integrating 3D technology into an instructor’s teaching. (My note: and where does this place SCSU?) Therefore, one of the most critical areas in which IT units and centers for teaching and learning can collaborate is in assisting instructors to develop this integration and to develop learning objects that use 3D technology.

p. 35 For 3D technology to really gain traction in higher education, it will need to be easier for instructors to deploy without such a large support team.

p. 35 Sites such as Thingiverse, Sketchfab, and Google Poly are libraries of freely available, user-created 3D models.

ClassVR is a tool that enables the simultaneous delivery of a simulation to multiple headsets, though the simulation itself may still be single-user.
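
The report does not describe how ClassVR implements this, so purely as an illustration of the underlying one-to-many idea, here is a hypothetical sketch in which a teacher’s console pushes the same scene command to every connected headset over WebSockets (Node.js “ws” package); the message format and scene names are invented for the example.

```typescript
import { WebSocketServer, WebSocket } from "ws";

// Hypothetical message pushed from the teacher's console to every headset,
// e.g. "everyone load scene X and look at hotspot Y".
interface SceneCommand {
  scene: string;
  hotspot?: string;
}

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", () => {
  console.log(`Headset connected (${wss.clients.size} total)`);
});

// Broadcast one command to all connected headsets at once.
function broadcast(command: SceneCommand): void {
  const payload = JSON.stringify(command);
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) {
      client.send(payload);
    }
  }
}

// Example: the instructor switches the whole class to the same simulation.
broadcast({ scene: "ancient-rome-forum", hotspot: "temple-of-saturn" });
```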

p. 37 data management:

An institutional repository is a collection of an institution’s intellectual output, often consisting of preprint journal articles and conference papers and the data sets behind them.49 An institutional repository is often maintained by either the library or a partnership between the library and the campus IT unit. An institutional repository therefore has the advantage of the long-term curatorial approach of librarianship combined with the systematic backup management of the IT unit. (My note: leaves me wondering where this puts SCSU)

Sharing data sets is critical for collaboration and increasingly the default for scholarship. Data is as much a product of scholarship as publications, and there is a growing sentiment among scholars that it should therefore be made public.50

++++++++
more on VR in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+reality+definition

Gaming and Gamification for SPED 204

https://catalog.stcloudstate.edu/Catalog/ViewCatalog.aspx?pageid=viewcatalog&topicgroupid=1994&entitytype=CID&entitycode=SPED+204

SPED 204. Program Overview and E-Portfolio

Credits: 1
Department: Special Education
Description: Overview of the programmatic standards for general and special education, how these standards are integrated in special education curriculum, and e-portfolio requirements for documenting acquisition of the above standards.
  1. Gaming and Gamification.

    why Gaming and Gamification? Vygotsky and ZPD (immersive storytelling is a form of creative play)

    from: https://cpb-us-e1.wpmucdn.com/blog.stcloudstate.edu/dist/d/10/files/2015/03/Gaming-and-Gamification-in-academic-and-library-settings-final-draft-1digudu.pdf
    play >>> games >>> serious games >>> Game Based learning >>> Digital Game Based learning
    Games are a type of cooperative learning. Games embody the essence of constructivism, which for students/gamers means constructing their own knowledge while they interact (learn cooperatively). Learning can happen without games, yet games accelerate the process. Games engage. Games, specifically digital ones, relate to the digital natives, those born after 1976–80, who are also known as Generation Y, or Millennials.

    is it generational? Is it a fad? is it counter-pedagogical?

    what is the difference between GBL (Game Based Learning) and DGBL (Digital GBL): share examples, opinions. Is one better / preferable than the other? Why?

    Kahoot game (Yahoo): https://play.kahoot.it/#/k/1412b52c-da28-4507-b658-7dfeedf0864c 
    hands-on assignment (10 min): split in groups and discuss your experience with games; identify your preferred mode (e.g. GBL vs DGBL) and draft a short plan for transitioning your current curriculum to one incorporating games.

    What is gamification? Why gamification, if we have games?
    “Gamification takes game elements (such as points, badges, leaderboards, competition, achievements) and applies them to a non-game setting. It has the potential to turn routine, mundane tasks into refreshing, motivating experiences.” (See the sketch after the Kahoot check below.)

    let’s check our understanding of gamification: https://play.kahoot.it/#/k/542b5b23-acbd-4575-998e-e199ea08b3e7
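
    To make the definition above concrete, here is a minimal, made-up sketch of those three game elements—points, badges, and a leaderboard—applied to an ordinary (non-game) course task; the thresholds and badge names are purely illustrative.

```typescript
interface Learner {
  name: string;
  points: number;
  badges: Set<string>;
}

// Award points for completing an ordinary course task and hand out badges
// when (arbitrary, illustrative) thresholds are crossed.
function completeTask(learner: Learner, taskPoints: number): void {
  learner.points += taskPoints;
  if (learner.badges.size === 0) learner.badges.add("First Steps");
  if (learner.points >= 100) learner.badges.add("Century Club");
}

// Leaderboard: rank learners by points, highest first.
function leaderboard(learners: Learner[]): Learner[] {
  return [...learners].sort((a, b) => b.points - a.points);
}

const cohort: Learner[] = [
  { name: "Ana", points: 0, badges: new Set() },
  { name: "Ben", points: 0, badges: new Set() },
];
completeTask(cohort[0], 60);
completeTask(cohort[0], 50); // crosses 100 points -> earns "Century Club"
completeTask(cohort[1], 40);
console.log(leaderboard(cohort).map((l) => `${l.name}: ${l.points}`));
```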

    hands-on assignment (10 min): split in groups and use your electronic devices: smartphones, tablets, laptops to experience any of the following gamification tools:

    The Future is Now:

    Hands-on assignment (10 min): Experience Oculus Go, Google Cardboard, Samsung Gear 360, Vuze.
    create your own VR (video 360) orientation tours:
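
    As a possible starting point for this exercise, the sketch below shows one common browser-based approach: an equirectangular 360° video (as produced by a Gear 360 or Vuze camera) is mapped onto the inside of a sphere with three.js, with the viewer’s camera at the centre. The file name is a placeholder, and drag/gyroscope controls are omitted to keep the example short.

```typescript
import * as THREE from "three";

// Scene, camera at the centre of the video sphere, and renderer.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1100
);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Equirectangular 360° footage; "campus-tour.mp4" is a placeholder file name.
const video = document.createElement("video");
video.src = "campus-tour.mp4";
video.loop = true;
video.muted = true; // most browsers require muting for autoplay
video.play();

// Map the video onto the inside of a large sphere surrounding the camera.
const texture = new THREE.VideoTexture(video);
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1); // flip the sphere so the texture faces inward
const material = new THREE.MeshBasicMaterial({ map: texture });
scene.add(new THREE.Mesh(geometry, material));

// Render loop; orbit/gaze controls would be added here for an actual tour.
function animate(): void {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();
```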
