multi-user reference support experiences

https://www.emeraldinsight.com/eprint/AU2Q4SJGYQG5YTQ5A9RU/full

Hahn, J. (2018). Virtual reality learning environments: development of multi-user reference support experiences. Information and Learning Sciences (ahead of print). Retrieved from https://www.emeraldinsight.com/eprint/AU2Q4SJGYQG5YTQ5A9RU/full
A case study: an undergraduate senior-projects computer science course collaboration whose aim was to develop textual browsing experiences, among other library reference functionality, within the HTC Vive virtual reality (VR) headset. In this case study, readers are introduced to applied uses of VR in service of library-based learning through the research and development of a VR reading-room app with multi-user support. Within the VR reading-room prototype, users can collaboratively explore the digital collections of HathiTrust, highlight text for further searching and discovery, and receive consultative research support from a reference specialist through VR.
Library staff met with the project team weekly over the 16 weeks of both semesters to first scope out the functionality of the system and vet requirements.
The library research team further hypothesized that incorporating reference-like support in the VR environment could support library learning. There is ample evidence in the library literature that underscores the importance of reference interactions as learning and instructional experiences for university students.
Educational benefits of immersive worlds include a deeper sense of presence when engaging with rare or otherwise inaccessible artifacts. Sequeira and Morgado (2013, p. 2) describe their Virtual Archeology project as using “a blend of techniques and methods employed by historians and archaeologists using computer models for visualizing cultural artefacts and heritage sites”.
The higher-end graphics cards include devices such as the NVIDIA GeForce™ GTX 1060 or AMD Radeon™ RX 480, equivalent or better. The desktop system that was built for this project used the GeForce GTX 1070, which was slightly above the required minimum specifications.

Collaboration: Library as client.

Specific to this course collaboration, computer science students in their final year of study are given the option of several client projects on which to work. The Undergraduate Library has been a collaborator with senior computer science course projects for several years, beginning in 2012-2013 with mobile application design and chat reference software re-engineering (Hahn, 2015). (My note: Mark Gill, this is where and how Mehdi Mekni, you and I can collaborate)

The hurdle the students had the most trouble with was code integration – e.g. combining the various individual software parts toward the end of the semester. The students were also challenged by the public HathiTrust APIs, since the system was developed to call the HathiTrust APIs from within the Unity programming environment, with the API calls written in C#. This was a novel use of the HathiTrust search APIs for the students and a new area for the research team as well.
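The article does not reproduce the project's Unity/C# code, but the shape of such a call can be sketched. The helper below is a hypothetical JavaScript function (the name `hathiTrustBriefUrl` is mine, not from the article) that builds a request URL for the HathiTrust Bibliographic API's brief-volume route; the exact endpoint shape and accepted identifier types should be checked against HathiTrust's current API documentation.

```javascript
// Sketch: construct a HathiTrust Bibliographic API request URL.
// The /api/volumes/brief/<idtype>/<id>.json route returns catalog
// metadata for a volume; identifier types such as "oclc", "isbn",
// "lccn" and "htid" are accepted (hedged — verify against the docs).
function hathiTrustBriefUrl(idType, id) {
  const base = "https://catalog.hathitrust.org/api/volumes/brief";
  return `${base}/${idType}/${encodeURIComponent(id)}.json`;
}

console.log(hathiTrustBriefUrl("oclc", "424023"));
// → https://catalog.hathitrust.org/api/volumes/brief/oclc/424023.json
```

In the reading-room app, the JSON returned from such a URL would be parsed to drive browsing and display; from Unity, the equivalent request would be issued with C#'s web-request classes rather than a browser fetch.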

There are alternatives to Unity C# programming, notably WebVR, an open source specification for VR programming on the open web.

A-Frame has matured into a platform- and device-agnostic programming environment. The WebVR webpage notes that the specification supports HTC Vive, Oculus Rift, Samsung Gear VR, Google Daydream and Google Cardboard (WebVR Rocks, 2018). Open web platforms are consistent with library values and with the educational goal of sharing work; experiences built on them can be foundational for VR learning that runs both in VR headsets and on the open web.
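As a hedged illustration of why the web route is device-agnostic, here is a minimal sketch of 2018-era WebVR feature detection (the browser API A-Frame was built on, since superseded by WebXR). The function name `chooseRenderMode` is illustrative only, not part of any specification.

```javascript
// Sketch of WebVR feature detection (2018-era API). A page can fall
// back to a flat in-page "magic window" view when no headset API is
// available — the same content serves Vive, Rift, Cardboard, or a
// plain browser, which is what makes web delivery device-agnostic.
function chooseRenderMode(nav) {
  if (nav && typeof nav.getVRDisplays === "function") {
    return "webvr";       // a headset-capable browser is present
  }
  return "magic-window";  // plain in-page 3D view
}

// In a browser you would call chooseRenderMode(navigator);
// a stub object here exercises the fallback path.
console.log(chooseRenderMode({}));  // → magic-window
```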

++++++++++++++
more on VR in libraries in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+reality+library

Gaming and Gamification for SPED 204

https://catalog.stcloudstate.edu/Catalog/ViewCatalog.aspx?pageid=viewcatalog&topicgroupid=1994&entitytype=CID&entitycode=SPED+204

SPED 204. Program Overview and E-Portfolio

Credits: 1
Department: Special Education
Description: Overview of the programmatic standards for general and special education, how these standards are integrated in special education curriculum, and e-portfolio requirements for documenting acquisition of the above standards.
  1. Gaming and Gamification.

    why Gaming and Gamification? Vygotsky and ZPD (immersive storytelling is a form of creative play)

    from: https://cpb-us-e1.wpmucdn.com/blog.stcloudstate.edu/dist/d/10/files/2015/03/Gaming-and-Gamification-in-academic-and-library-settings-final-draft-1digudu.pdf
    play >>> games >>> serious games >>> Game Based learning >>>>+ Digital Game Based learning
    “Games are a type of cooperative learning. Games embody the essence of constructivism, which for students/gamers means constructing their own knowledge while they interact (learn cooperatively). Learning can happen without games, yet games accelerate the process. Games engage. Games, specifically digital ones, relate to the digital natives, those born after 1976 – 80, who are also known as Generation Y, or Millennials.”

    Is it generational? Is it a fad? Is it counter-pedagogical?

    What is the difference between GBL (Game Based Learning) and DGBL (Digital GBL)? Share examples and opinions. Is one better / preferable than the other? Why?

    Kahoot game (Yahoo): https://play.kahoot.it/#/k/1412b52c-da28-4507-b658-7dfeedf0864c 
    hands-on assignment (10 min): split into groups and discuss your experience with games; identify your preferred mode (e.g. GBL vs DGBL) and draft a short plan for transitioning your current curriculum to one incorporating games.

    What is gamification? Why gamification, if we have games?
    “Gamification takes game elements (such as points, badges, leaderboards, competition, achievements) and applies them to a non-game setting. It has the potential to turn routine, mundane tasks into refreshing, motivating experiences.”

    let’s check our understanding of gamification: https://play.kahoot.it/#/k/542b5b23-acbd-4575-998e-e199ea08b3e7

    hands-on assignment (10 min): split into groups and use your electronic devices (smartphones, tablets, laptops) to experience any of the following gamification tools:

    The Future is Now:

    Hands-on assignment (10 min): experience Oculus Go, Google Cardboard, Samsung Gear 360, Vuze;
    create your own VR (360-degree video) orientation tours.

Firefox Reality

available for Viveport, Oculus, and Daydream

From the moment you open the browser, you will be presented with immersive experiences that can be enjoyed on a VR headset directly from the Firefox Reality browser. We are working with creators around the world to bring an amazing collection of games, videos, environments, and experiences that can be accessed directly from the home screen.

++++++++++++
more on VR in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+reality+education

coding ethics unpredictability

Franken-algorithms: the deadly consequences of unpredictable code

Thu 30 Aug 2018

https://www.theguardian.com/technology/2018/aug/29/coding-algorithms-frankenalgos-program-danger

Between the “dumb” fixed algorithms and true AI lies the problematic halfway house we’ve already entered with scarcely a thought and almost no debate, much less agreement as to aims, ethics, safety, best practice. If the algorithms around us are not yet intelligent, meaning able to independently say “that calculation/course of action doesn’t look right: I’ll do it again”, they are nonetheless starting to learn from their environments. And once an algorithm is learning, we no longer know to any degree of certainty what its rules and parameters are. At which point we can’t be certain of how it will interact with other algorithms, the physical world, or us. Where the “dumb” fixed algorithms – complex, opaque and inured to real time monitoring as they can be – are in principle predictable and interrogable, these ones are not. After a time in the wild, we no longer know what they are: they have the potential to become erratic. We might be tempted to call these “frankenalgos” – though Mary Shelley couldn’t have made this up.

Twenty years ago, George Dyson anticipated much of what is happening today in his classic book Darwin Among the Machines. The problem, he tells me, is that we’re building systems that are beyond our intellectual means to control. We believe that if a system is deterministic (acting according to fixed rules, this being the definition of an algorithm) it is predictable – and that what is predictable can be controlled. Both assumptions turn out to be wrong. “It’s proceeding on its own, in little bits and pieces,” he says. “What I was obsessed with 20 years ago that has completely taken over the world today are multicellular, metazoan digital organisms, the same way we see in biology, where you have all these pieces of code running on people’s iPhones, and collectively it acts like one multicellular organism. “There’s this old law called Ashby’s law that says a control system has to be as complex as the system it’s controlling, and we’re running into that at full speed now, with this huge push to build self-driving cars where the software has to have a complete model of everything, and almost by definition we’re not going to understand it. Because any model that we understand is gonna do the thing like run into a fire truck ’cause we forgot to put in the fire truck.”

Walsh believes this makes it more, not less, important that the public learn about programming, because the more alienated we become from it, the more it seems like magic beyond our ability to affect. When shown the definition of “algorithm” given earlier in this piece, he found it incomplete, commenting: “I would suggest the problem is that algorithm now means any large, complex decision making software system and the larger environment in which it is embedded, which makes them even more unpredictable.” A chilling thought indeed. Accordingly, he believes ethics to be the new frontier in tech, foreseeing “a golden age for philosophy” – a view with which Eugene Spafford of Purdue University, a cybersecurity expert, concurs. Where there are choices to be made, that’s where ethics comes in.

Our existing system of tort law, which requires proof of intention or negligence, will need to be rethought. A dog is not held legally responsible for biting you; its owner might be, but only if the dog’s action is thought foreseeable.

Model-based programming, in which machines do most of the coding work and are able to test as they go.

As we wait for a technological answer to the problem of soaring algorithmic entanglement, there are precautions we can take. Paul Wilmott, a British expert in quantitative analysis and vocal critic of high-frequency trading on the stock market, wryly suggests “learning to shoot, make jam and knit”.

The venerable Association for Computing Machinery has updated its code of ethics along the lines of medicine’s Hippocratic oath, to instruct computing professionals to do no harm and consider the wider impacts of their work.

+++++++++++
more on coding in this IMS blog
https://blog.stcloudstate.edu/ims?s=coding

Digital Literacy for SPED 405

Digital Literacy for SPED 405. Behavior Theories and Practices in Special Education.

Instructor Mark Markell. mamarkell@stcloudstate.edu Mondays, 5:30 – 8:20 PM. SOE A235

Preliminary Plan for Monday, Sept 10, 5:45 PM to 8 PM

Introduction – who are the students in this class. About myself: http://web.stcloudstate.edu/pmiltenoff/faculty Contact info, “embedded” librarian idea – I am available to help during the semester with research and papers

about 40 min: Intro to the library: http://web.stcloudstate.edu/pmiltenoff/bi/
15 min for a Virtual Reality tour of the Library + quiz on how well they learned the library:
http://bit.ly/VRlib
and a 360-degree video on BYOD.
Play a scavenger hunt IN THE LIBRARY: http://bit.ly/learnlib
The VR (virtual reality) and AR (augmented reality) component; why is it important?
why is this technology brought up to a SPED class?
https://blog.stcloudstate.edu/ims/2015/11/18/immersive-journalism/
autism: https://blog.stcloudstate.edu/ims/2018/09/10/sound-and-brain/
Social emotional learning
https://blog.stcloudstate.edu/ims/2018/05/31/vr-ar-sel-empathy/
(transition to the next topic – digital literacy)

about 50 min:

  1. Digital Literacy

How important is technology in our life? Profession?

https://blog.stcloudstate.edu/ims/2018/08/20/employee-evolution/

Do you think technology overlaps with the broad field of special education? How?
How do you define technology? What falls under “technology?”

What is “digital literacy?” Do we need to be literate in that sense? How does it differ from technology literacy?
https://blog.stcloudstate.edu/ims?s=digital+literacy

Additional readings on “digital literacy”
https://blog.stcloudstate.edu/ims/2017/08/23/nmc-digital-literacy/

Digital Citizenship: https://blog.stcloudstate.edu/ims/2015/10/19/digital-citizenship-info/
Play Kahoot: https://play.kahoot.it/#/k/e844253f-b5dd-4a91-b096-b6ff777e6dd7
Privacy and surveillance: how do these two issues affect your students? Do they affect them more? If so, how?  https://blog.stcloudstate.edu/ims/2018/08/21/ai-tracks-students-writings/

Social Media:
http://web.stcloudstate.edu/pmiltenoff/lib290/. If you want to survey the class, here is the FB group page: https://www.facebook.com/groups/LIB290/

Is Social Media part of digital literacy? Why? How can SM help us become more literate?

Digital Storytelling:
http://web.stcloudstate.edu/pmiltenoff/lib490/

How is digital storytelling essential in digital literacy?

about 50 min:

  1. Fake News and Research

Syllabus: Teaching Media Manipulation: https://datasociety.net/pubs/oh/DataAndSociety_Syllabus-MediaManipulationAndDisinformationOnline.pdf

#FakeNews is a very timely and controversial issue. In 2-3 min choose your best source on this issue. 1. Mind the prevalence of resources in the 21st century. 2. Mind the necessity to evaluate a) the veracity of your sources b) the quality of your sources (the fact that they are “true” does not mean that they are the best). Be prepared to name your source and defend its quality.
How do you determine your sources? How do you decide the reliability of your sources? Are you sure you can distinguish “good” from “bad?”
Compare this entry https://en.wikipedia.org/wiki/List_of_fake_news_websites
to this entry: https://docs.google.com/document/d/10eA5-mCZLSS4MQY5QGb5ewC3VAL6pLkT53V_81ZyitM/preview to understand the scope

Do you know any fact-checking sites? Can you spot sponsored content? Do you understand syndication? What do you understand by “media literacy,” “news literacy,” and “information literacy?”  https://blog.stcloudstate.edu/ims/2017/03/28/fake-news-resources/

Why do we need to explore the “fake news” phenomenon? Do you find it relevant to your professional development?

Let’s watch another video and play this Kahoot: https://play.kahoot.it/#/k/21379a63-b67c-4897-a2cd-66e7d1c83027

So, how do we do academic research? Let’s play another Kahoot: https://play.kahoot.it/#/k/5e09bb66-4d87-44a5-af21-c8f3d7ce23de
If you were to structure this Kahoot, what questions would you ask? What are the main steps in achieving successful research for your paper?

  • Research using social media

What is social media (examples)? Why is it called SM? Why is it so popular? What makes it so popular?

use SM tools for your research and education:

– Determining your topic. How to?
Digg http://digg.com/, Reddit https://www.reddit.com/ , Quora https://www.quora.com
Facebook, Twitter – hashtags (class assignment 2-3 min to search)
LinkedIn Groups
YouTube and Slideshare (class assignment 2-3 min to search)
Flickr, Instagram, Pinterest for visual aids (like YouTube they are media repositories)

Academia.edu (https://www.academia.edu/), a paper-sharing social network that has been informally dubbed “Facebook for academics”: https://www.academia.edu/31942069_Facebook_for_Academics_The_Convergence_of_Self-Branding_and_Social_Media_Logic_on_Academia.edu

ResearchGate: https://www.researchgate.net/

– collecting and managing your resources:
Delicious https://del.icio.us/
Diigo: https://www.diigo.com/
Evernote: evernote.com; OneNote (Microsoft)

blogs and wikis for collecting data and collaborating

– Managing and sharing your information:
Refworks,
Zotero https://www.zotero.org/,
Mendeley, https://www.mendeley.com/

– Testing your work against your peers (globally):

Wikipedia:
First step: using Wikipedia. Second step: contributing to Wikipedia (editing a page). Third step: contributing to Wikipedia (creating a page).  https://www.evernote.com/shard/s101/sh/ef743d1a-4516-47fe-bc5b-408f29a9dcb9/52d79bfa20ee087900764eb6a407ec86

– presenting your information


please use this form to cast your feedback. Please feel free to fill out only the relevant questions:
http://bit.ly/imseval

VWMOOC18

https://docs.google.com/document/d/1dVtIha1-P1t6t7GEFjzGYd8aegU_OVzRQM-BHzxYwNg/edit

VWMOOC18 August 1-31, 2018

Excerpts from the program

Sun.

August 5

12NOON SLT CVL Librarians Networking Forum at Community Virtual Library How can librarians help educators in virtual worlds?

Held at CVL main library SLurl:

http://maps.secondlife.com/secondlife/Cookie/206/219/21

Embedded librarianship holds potential for immersive learning.  Come learn how to promote your virtual world communities and the great work of educators in virtual worlds through networking.  https://communityvirtuallibrary.wordpress.com/

 

Fri. August 10 12pm SLT Dieter Heyne (Edward Tarber) Web Based Virtual Worlds in Education Organizing collaboration for 400 students in a web based virtual learning environment. Setting up a “synthetic” college.

In the VWMOOC HQ: http://maps.secondlife.com/secondlife/Madhupak/113/66/62

Sat. August 11 Noon SLT /

3pm Eastern

Lyr Lobo, Cynthia Calongne

Kae Novak (SL: Kavon Zenovka, WoW: Maskirovka)

Chris Luchs (SL: Abacus Capellini, WoW: Cheerwine)

What Can We Learn from the World of Warcraft? Join us as we host a blended reality session featuring a live stream from the World of Warcraft (WoW) as we explore educational opportunities in a massive multiplayer online roleplaying game (MMORPG). We will have a YouTube live stream, a Discord channel for voice discussion, and an immersive event in WoW. Educators from the International Society for Technology in Education – Games and Simulations Network (ISTE G&SN) will host an immersive event & discuss learning in a multiuser virtual environment (MUVE).

To join us in WoW: visit this site: https://worldofwarcraft.com/en-us/news/3128270

Click Try for Free and download the Blizzard Launcher, which manages the download. You’ll need 52GB for the game. Create an account, select Sisters of Elune realm and create a troll if you are new to WoW and using a Free Trial account.

Location: In the World of Warcraft and for those who do not have the game, over a YouTube Live stream (available that day) and hosted after the event over https://www.youtube.com/user/gamesmooc/videos

 

Friday

August 17

9 am slt Lynne Berrett (Wisdomseeker) Howard Gardner’s “Theory of Multiple Intelligences” explored through an Interactive, Immersive Experience in Second Life Dr. Gardner has proposed 8 different types of intelligence, ranging from Interpersonal to Kinesthetic. Join us to discover your own most innate type. You may be surprised, like many of the teachers who have tried this challenge as part of our whole-brain training program.

http://maps.secondlife.com/secondlife/Inspiration%20Island/48/54/22

Fri. August 17 Noon SLT Mark Childs (Gann McGann) Theatrical performances in virtual worlds This is a summary of various performance-based activities in Second Life and how performance studies can provide an insight into the experience of virtual worlds.

Presented in the VWMOOC HQ: http://maps.secondlife.com/secondlife/Madhupak/113/66/62

ELI Online Event XR

ELI Online Event | eXtended Reality (XR): How AR, VR, and MR Are Extending Learning Opportunities

May 22 and 24, 2018 | 12:00 noon – 3:35 p.m. ET

https://events.educause.edu/eli/focus-sessions/2018/extended-reality-xr-how-ar-vr-and-mr-are-extending-learning-opportunities

https://twitter.com/search?q=%23elifocus #elifocus

https://www.educause.edu/badging

Over the past year, interest in eXtended reality (XR) technologies (such as virtual, augmented, immersive, and mixed reality) has surged. New and more affordable XR technologies, along with voice activation and sophisticated visual display walls, provide promising directions and opportunities to immerse learners in the curriculum, offering deeper and more vivid learning experiences and extending the learning environment. But what’s the curricular reality with respect to these technologies? What is hype and what is substance? Specifically:

  • What practical applications do “XR technologies” have for teaching, learning, and research?
  • How are these technologies being applied to engage learners as consumers and creators of XR experiences?
  • What evidence is there to support XR technologies as effective tools in the learning environment?
  • How can these technologies be integrated into learning spaces?
  • What are the ethical questions we should consider as we explore XR?

digital storytelling tools

Six Good Digital Storytelling Tools In One Place


Timeline JS

The same people who created Timeline JS, Knight Lab at Northwestern University, offer five other tools for creating and publishing digital stories.

Juxtapose JS from Knight Lab is a free tool for making and hosting side-by-side comparisons of images. The tool was designed to help people see before and after views of a location, a building, a person, or anything else that changes appearance over time. Juxtapose JS will let you put the images into a slider frame that you can embed into a webpage where viewers can use the slider to reveal more or less of one of the images.

Storyline JS is one of the newer Knight Lab offerings. This beta product is designed to help students tell stories with data. The basic purpose of Storyline JS is to help students create interactive line graphs. Students add annotations to the data points of their line graphs. Those annotations are used to tell the story of the data represented in the graph.

Scene VR is another beta product from Knight Lab. The purpose of Scene VR is to enable users to stitch together panoramic images and VR images to create an immersive photo story. Stories published through Scene VR can be embedded into websites and viewed on desktop computers as well as on tablets and mobile phones.

If you want to embed audio into a written story, Soundcite JS is the tool for you. Soundcite JS lets you add audio clips to a written story. When Soundcite JS is properly used, a play button appears where you specify in the text. For example, if I wrote a story that included a scene in which a dog barks, I could have “the dog barks at the stranger” be highlighted with a play button that when clicked plays the sound of a dog barking.

StoryMap JS lets you combine elements of timelines and maps to create mapped stories. On StoryMap JS you create slides that are matched to locations on your map. Each slide in your story can include images or videos along with text. As you scroll through your story there are simple transitions between each slide. StoryMap JS integrates with your Google Drive account. To get started with StoryMap JS you have to grant it access to your Google Drive account. StoryMap JS will create a folder in your Google Drive account where all of your storymap projects will be saved. With StoryMap JS connected to your Google Drive account you will be able to pull images from your Google Drive account to use in your StoryMap JS projects. Here’s a good tutorial video made by Jan Serie Center’s Digital Liberal Arts initiative at Macalester College.

+++++++++++++++++
more on digital storytelling in this IMS blog
https://blog.stcloudstate.edu/ims?s=digital+storytelling
