Mar 2018
Digital Literacy for St. Cloud State University
The metaverse (hopefully) won’t be the virtual world of ‘Snow Crash’ or ‘Ready Player One.’ It will likely be something more complex, diverse, and wild.
The metaverse concept clearly means very different things to different people. What exists right now is a series of embryonic digital spaces, such as Facebook’s Horizon, Epic Games’ Fortnite, Roblox’s digital space for gaming and game creation, and the blockchain-based digital world Decentraland, all of which have clear borders, different rules and objectives, and differing rates of growth.
There will be different layers of realities that we can all be experiencing, even in the same environment or physical space. We’re already doing that with our phones to a certain extent—passively in a physical environment while mentally in a digital one. But we’ll see more experiences beyond our phones, where our whole bodies are fully engaged, and that’s where the metaverse starts to get interesting: we genuinely begin to explore and live in these alternate realities simultaneously.
Xverse
It will have legacy parts that look and feel like the web today, but it will have new nodes and capabilities that will look and feel like the Ready Player One Oasis (amazing gaming worlds), immersion leaking into our world (like my Magicverse concept), and every imaginable permutation of these. I feel that the Xverse will have gradients of sentience and autonomy, and we will have the emergence of synthetic life (things Sun and Thunder is working on) and a multitude of amazing worlds to explore. Building a world will become something everyone can do (like building a webpage or a blog), and people will be able to share richer parts of their external and inner lives at incredibly high speed across the planet.
Reality will exist on a spectrum ranging from physical to virtual (VR), but a significant chunk of our time will be spent somewhere between those extremes, in some form of augmented reality (AR). Augmented reality will be a normal part of daily life. Virtual companions will provide information, commentary, updates and advice on matters relevant to you at that point in time, including your assets and activities, in both virtual and real spaces.
I think we can all agree our initial dreams of a fully immersive, separate digital world are not only unrealistic, but maybe not what we actually want. So I’ve started defining the metaverse differently to capture the zeitgeist: we’re entering an era where every computer we interact with, big or small, is increasingly world-aware. They can recognize faces, voices, hands, relative and absolute position, and velocity, and they can react to this data in a useful way. These contextually aware computers are the path to unlocking ambient computing, where computers fade from the foreground to the background of everyday, useful tools. The metaverse is less of a ‘thing’ and more of a computing era. Contextual computing enables a multitude of new types of interactions and apps: VR sculpting tools and social hangouts, self-driving cars, robotics, smart homes.
As carbon is to the organic world, AI will be both the matrix that provides the necessary structural support and the material from which digital representation will be made. Of all the ways in which AI will shape the form of the metaverse, perhaps most essential is the role it will play in the physical-digital interface. Translating human actions into digital input–language, eye movement, hand gestures, locomotion–these are all areas in which AI companies and researchers have already made tremendous progress.
Qualcomm views the metaverse as an ever-present spatial internet complete with personalized digital experiences that spans the physical and virtual worlds, where everything and everyone can communicate and interact seamlessly.
As an active researcher in the security and forensics of VR systems, I believe that, should the metaverse come into existence, we should explore and hypothesize the ways it will be misused.
I picture [the metaverse] almost like The Truman Show. Only, instead of walking into a television set, you walk into the internet and can explore any number of different realities.
We imagine the metaverse as reality made better, a world infused with magic, stories, and functionality at the intersection of the digital and physical worlds.
Rather than building the “metaverse,” a separate and fully virtual reality that is disconnected from the physical world, we are focused on augmenting reality, not replacing it. We believe AR–or computing overlaid on the world around us–has a smoother path to mass adoption, but will also be better for the world than a fully virtual world.
In the reality-based metaverse, we will be able to more effectively design products of the future, meet and collaborate with our colleagues far away, and experience any remote place in real-time.
I prefer to think of the metaverse as simply bringing our bodies into the internet.
The metaverse isn’t just VR! Those spaces will connect to AR glasses and to 2D spaces like Instagram. And most importantly, there will be a real sense of continuity where the things you buy are always available to you.
At its core will be a self-contained economy that allows individuals and businesses to create, own or invest in a range of activities and experiences.
The metaverse experience can be altered from the individual’s point of view and shaped or curated by any number of agents—whether human or A.I. In that sense, the metaverse does not have an objective look beyond its backend. In essence, the metaverse, together with our physical locations, forms a spatial continuum.
The AR applications of the metaverse are limitless and it really can become the next great version of the internet.
It seems fair to predict that the actual aesthetic of any given metaverse will be determined by user demand. If users want to exist in a gamified world populated by outrageous avatars and fantastic landscapes, then the metaverse will respond to that demand. Like all things in this world, the metaverse will be market-driven.
+++++++++++++++
More on metaverse in this blog
https://blog.stcloudstate.edu/ims?s=metaverse
Tech CEOs keep talking about “the metaverse.” Mark Zuckerberg insists that Facebook will be seen as a “metaverse company” instead of a social media company. Satya Nadella proclaims Microsoft is creating a “metaverse stack” for the enterprise.
Author Neal Stephenson coined the term “metaverse” in Snow Crash, a dystopian cyberpunk novel published in 1992.
In the novel, the metaverse is a sort of 3D virtual world. It’s not simply a virtual reality game but is a persistent, shared virtual world. Or rather, the metaverse is a whole universe of shared virtual spaces seemingly linked together—you could, essentially, teleport between them.
If you think this all sounds a bit like Ready Player One or a higher-tech version of Second Life, you’re right.
Virtual reality (VR), not augmented reality (AR), was necessary for that kind of vision.
To Zuckerberg and other tech CEOs, the concept of “the metaverse” seems to have more in common with “Web 2.0.” It’s a bunch of new technologies: VR headsets! Presence! Persistent digital worlds!
Microsoft’s vision of the metaverse seems to take the form of rambling, buzzword-heavy talk about “digital twins” and “converging the physical with the digital” with “mixed reality.” Microsoft’s Azure cloud can do it!
Of course, as we learned with Windows 10’s “Mixed Reality” headsets, that term often just means virtual reality to Microsoft. However, it can also mean augmented reality. And, little surprise, Microsoft also has a headset to sell you: the HoloLens.
++++++++++++++++
more on Metaverse in this IMS blog
https://blog.stcloudstate.edu/ims?s=metaverse
more on immersive in this IMS blog
https://blog.stcloudstate.edu/ims?s=immersive
According to VentureBeat, Epic Games has recently had a successful round of funding, raising $1 billion for its proposed digital reality world.
The metaverse idea of connecting all of Epic Games’ titles is an ingenious way of integrating one game after another, becoming the common ground for all games, each player with their avatar. This closely resembles “Ready Player One’s” OASIS metaverse, which is a digital, virtual, and augmented reality that leads to different online platforms.
+++++++++++++
more on VR in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+reality
IM 690 lab plan for Feb. 18, MC 205: Experience VR and AR
What is an “avatar” and why do we need to know how it works?
How does the book (and the movie) “Ready Player One” project the education of the future?
Peter Rubin’s “Future Presence” pictures XR beyond education. How would such changes in society and our behavior influence education?
Readings:
Each group selected one article from this selection: https://blog.stcloudstate.edu/ims/2020/02/11/immersive-reality-and-instructional-design/
to discuss the approach of an Instructional Designer to XR
Announcements:
https://blog.stcloudstate.edu/ims/2020/02/07/educators-in-vr/
https://blog.stcloudstate.edu/ims/2020/01/30/realities360-conference/
Inter-cognitive and Intra-cognitive communication in VR: https://slides.com/michaelvallance/deck-25c189#/
@EducatorsVR
I’ll be talking about Didactic Activity of the Use of #VR and #VirtualWorlds to Teach Design Fundamentals. I will show the work of my students in @SansarOfficial Join the #EducatorsinVR International Summit today at 10:00 AM PST https://t.co/nLV6orr19i pic.twitter.com/yFQomkD7ER— Dr. Christian J. Angel Rueda (@eco_onis) February 19, 2020
https://www.youtube.com/channel/UCGHRSovY-KvlbJHkYnIC-aA
People with dementia
Free resources:
https://blog.stcloudstate.edu/ims?s=free+audio, free sound, free multimedia
Lab work:
– how does this particular technology fit in the instructional design (ID) frames and theories covered so far?
– which models and ideas from the videos you will see could you replicate yourself?
Assignment: Use Google Cardboard to watch at least three of the following options
YouTube:
Elephants (think how it can be used for education)
https://youtu.be/2bpICIClAIg
Sharks (think how it can be used for education)
https://youtu.be/aQd41nbQM-U
Solar system
https://youtu.be/0ytyMKa8aps
Dementia
https://youtu.be/R-Rcbj_qR4g
Facebook
https://www.facebook.com/EgyptVR/photos/a.1185857428100641/1185856994767351/
From Peter Rubin’s Future Presence: here is a link https://blog.stcloudstate.edu/ims/2019/03/25/peter-rubin-future-presence/ if you want to learn more
Empathy, Chris Milk, https://youtu.be/iXHil1TPxvA
Clouds Over Sidra, https://youtu.be/mUosdCQsMkM
Assignment: In 10-15 min (mind your peers, since we have only one headset), do your best to evaluate one educational app (e.g., Labster) and one leisure app (games).
Use the same questions to evaluate Lenovo DayDream:
– Does this particular technology fit in the instructional design (ID) frames and theories covered, e.g., PBL, CBL, Activity Theory, the ADDIE Model, TIM, etc. (https://blog.stcloudstate.edu/ims/2020/01/29/im-690-id-theory-and-practice/)? Can you connect the current state, but also the potential, of this technology with any of these frameworks and theories? E.g., how would Google Tour Creator or any of these videos fit in the Analysis – Design – Development – Implementation – Evaluation process? Or, how do you envision your Google Tour Creator project or any of these videos fitting in the Entry – Adoption – Adaptation – Infusion – Transformation process?
– how does this particular technology fit in the instructional design (ID) frames and theories covered so far?
– which models and ideas from the videos you will see could you replicate yourself?
+++++++++++
Plamen Miltenoff, Ph.D., MLIS
Professor
320-308-3072
pmiltenoff@stcloudstate.edu
http://web.stcloudstate.edu/pmiltenoff/faculty/
schedule a meeting: https://doodle.com/digitalliteracy
find my office: https://youtu.be/QAng6b_FJqs
Whenever I’m doing a virtual-reality demonstration, I ask for 40 minutes to an hour to get all of the students set up with their headsets, oriented in the virtual space, and then the learning can actually begin. It is not just something where you can throw headsets in a classroom and expect everyone to immediately start the learning objectives that you’re aiming for. You do need to do a little of that work explaining how the technology functions and making sure that everyone has the vision requirements, the hearing requirements, the physical requirements.
+++++++++++
more on Ready Player One in this IMS blog
https://blog.stcloudstate.edu/ims?s=ready+player+one
Teachers can bring VR stories into the classroom in many different ways for meaningful learning experiences. Imagine a scavenger hunt where students narrate a story based on what they find. Or consider using objects they see to identify vocabulary words or recognize letters. Students should have purpose in their viewing and it should directly connect to standards.
Starting with virtual reality, stories in apps such as Google Spotlight Stories and YouTube 360 videos have been popular from the start.
Similar to the new movie, Ready Player One, they provide an intense experience where the viewer feels like they are in the center of the story.
Using a mobile device or tablet, the student can start the story and look around the scene based on their interest, rather than the camera’s focus. New apps such as Baobab VR have continued to appear with more interactions and engagement.
A creative way to have your students create their own virtual stories is using the app Roundme. Upload your 360 image and add directional sound, links, and content. Add portals to walk the viewer into multiple scenes, and then easily share the story by link.
Newer augmented reality apps that work with ARKit have taken another approach to storytelling, such as Augmented Stories and My Hungry Caterpillar. Qurious is a company working on a release that blends gaming, making, and storytelling in one app.
Storyfab turns our students into the directors of the show.
A new AR book, SpyQuest, has moved the immersive experience a big step forward as it helps define the story by bringing the images to life. Through the camera lens on a device, the stories make students the agents in an adventure into the world of espionage. The augmented reality experiences on the images use the accompanying app to scan the scene and provide further insight into the story.
+++++++++++++
more on storytelling in this IMS blog
https://blog.stcloudstate.edu/ims?s=digital+storytelling
more on VR and storytelling in this IMS blog
https://blog.stcloudstate.edu/ims?s=virtual+reality+storytelling
+++++++++++++++
Even with their services in huge demand, some smaller ed tech firms are running out of cash. https://t.co/2TbzxpRAUx (via @JonMarcusBoston)
— The Hechinger Report (@hechingerreport) September 6, 2020
Investment continues to flow to ed tech, with $803 million injected during the first six months of the year, according to the industry news website EdSurge. But half of that went to just six companies, including the celebrity tutorial provider MasterClass, the online learning platform Udemy and the school and college review site Niche.
From the outside, the ed-tech sector may appear as if “there’s a bonanza and it’s like the dot-com boom again and everybody’s printing money,” said Michael Hansen, CEO of the K-12 and higher education digital learning provider Cengage. “That is not the case.”
Even if they want to buy more ed-tech tools, meanwhile, schools and colleges are short on cash. Expenses for measures to deal with Covid-19 are up, while budgets are expected to be down.
Analysts and industry insiders now expect a wave of acquisitions as already-dominant brands like these seek to corner even more of the market by snatching up smaller players that provide services they don’t.
++++++++++++++++
Despite the pandemic, schools still must conform to the Family Educational Rights and Privacy Act (FERPA) and other laws governing student privacy. Districts can disclose information to public health officials, for example, but information can’t be released to the general public without written consent from parents.
The Safely Reopen Schools mobile app is one tool available for automating contact tracing. The idea is that if two mobile phones are close enough to connect via Bluetooth, the phone owners are close enough to transmit the virus. The app includes daily health check-ins and educational notifications, but no personal information is exchanged between the phones, and the app won’t disclose who tested positive.
Colleges are also using apps to help trace and track students’ exposure to coronavirus. In August, 20,000 participants from the University of Alabama at Birmingham were asked to test the GuideSafe mobile app, which will alert them if they’ve been in contact with someone who tested positive for COVID-19. The app determines the proximity of two people through cell phone signal strength. If someone reports they contracted the virus, an alert will be sent to anyone who has been within six feet of them for at least 15 minutes over the previous two weeks.
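The exposure rule described above (within six feet of a reporting person for at least 15 minutes over the previous two weeks) can be sketched in a few lines of code. This is a minimal illustration of the idea, not the actual GuideSafe or Safely Reopen Schools implementation; the class name, field names, and cumulative-minutes interpretation are assumptions for the sketch.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Encounter:
    """Hypothetical record of one Bluetooth proximity event between two phones."""
    distance_ft: float   # estimated from signal strength
    minutes: float       # duration of the encounter
    when: datetime       # when the encounter happened

def should_alert(encounters, reported_at, max_distance_ft=6.0,
                 min_minutes=15.0, window_days=14):
    """Alert if total close-range contact with the reporting person
    meets the threshold within the look-back window."""
    window_start = reported_at - timedelta(days=window_days)
    close_minutes = sum(
        e.minutes for e in encounters
        if e.distance_ft <= max_distance_ft and e.when >= window_start
    )
    return close_minutes >= min_minutes

now = datetime(2020, 8, 31)
log = [
    Encounter(4.0, 10.0, now - timedelta(days=2)),
    Encounter(5.5, 8.0, now - timedelta(days=3)),
    Encounter(3.0, 20.0, now - timedelta(days=30)),  # outside the two-week window
]
print(should_alert(log, reported_at=now))  # → True (10 + 8 minutes ≥ 15)
```

Note that a real app works on anonymized tokens exchanged over Bluetooth rather than identities, which is how the privacy guarantees above are kept.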
Critics of the technology claim these apps aren’t actually capable of contact tracing and could undermine manual efforts to do so.
+++++++++++++++++
more on ed tech in this IMS blog
https://blog.stcloudstate.edu/ims?s=educational+technology
IM 690 lab plan for March 31, online: Virtual Worlds
If at any point you are lost in the virtual worlds, please consider talking/chatting using our IM 690 Zoom link: https://minnstate.zoom.us/j/964455431 or call 320-308-3072
Readings:
Currently, if you go to the SCSU online databases, if they are working at all, don’t be surprised when clicking on EBSCOhost Business Source Complete to see this msg:
and if you execute a search for “AltSpaceVR” + “education”, you will find only a meager one or two results.
Google Scholar, naturally, will yield a much greater number.
So, search and find an article of your interest using Google Scholar. I used “immersive learning” + “education” for my search.
I chose to read this article:
https://journal.alt.ac.uk/index.php/rlt/article/view/2347/2657
since it addressed design principles when applying mixed reality in education.
Which article did you find, choose, and read, and are you ready to share your analysis?
Tuesday, March 31, 5PM lab
++++++++++++++++++
more on IM 690 labs in this IMS blog
https://blog.stcloudstate.edu/ims?s=im+690
Library Instruction delivered by Plamen Miltenoff (pmiltenoff@stcloudstate.edu) and Dr. Kannan Sivaprakasam
Short link to this tutorial: http://bit.ly/chem151
Link to the video tutorial regarding microcredentials (badges)
My name is Plamen Miltenoff (https://web.stcloudstate.edu/pmiltenoff/faculty/) and I am the InforMedia Specialist with the SCSU Library (https://blog.stcloudstate.edu/ims/free-tech-instruction/).
Dr. Sivaprakasam and I are developing a microcredentialing system for your class.
The “library” part has several components:
Collecting two of the required and one of the optional badges lets you earn the superbadge “Mastery of Library Instruction.”
The superbadge brings points toward your final grade.
Once you acquire the badges, Dr. Sivaprakasam will reflect your achievement in D2L Grades.
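The superbadge rule above (two required badges plus one optional badge) can be sketched as a simple check. This is a hypothetical illustration; the badge names, point value, and set-based logic are illustrative assumptions, not the actual course settings.

```python
# Hypothetical badge names -- the real required/optional lists come from the course.
REQUIRED = {"searching", "evaluating", "citing"}
OPTIONAL = {"multimedia", "linkedin"}

def earns_superbadge(earned):
    """Two required badges plus one optional badge earn the superbadge."""
    earned = set(earned)
    return len(earned & REQUIRED) >= 2 and len(earned & OPTIONAL) >= 1

print(earns_superbadge({"searching", "citing", "multimedia"}))  # → True
print(earns_superbadge({"searching", "citing"}))                # → False (no optional badge)
```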
If you are building a LinkedIn portfolio, here are directions to upload your badges in your LinkedIn account using Badgr:
https://community.brightspace.com/s/article/Sharing-Badges-in-Brightspace
Please do remember we are still developing the system, and we will appreciate your questions and feedback; do not hesitate to contact us if you have any.
+++++++++++++++++++++++
LIBRARY INSTRUCTION – Information, Digital and Media Literacy
https://blog.stcloudstate.edu/ims/2020/01/16/fake-news-prevention/
News and Media Literacy (and the lack thereof) is not very different from Information Literacy.
An “information literate” student is able to “locate, evaluate, and effectively use information from diverse sources.” See more About Information Literacy.
How does information literacy help me?
Every day we have questions that need answers. Where do we go? Whom can we trust? How can we find information to help ourselves? How can we help our family and friends? How can we learn about the world and be a better citizen? How can we make our voice heard?
The content of the tutorial is based on the Information Literacy Competency Standards for Higher Education as approved by the Board of Directors of the Association of College and Research Libraries (ACRL).
The standards are:
Standard 1. The information literate student determines the nature and extent of the information needed.
Standard 2. The information literate student accesses needed information effectively and efficiently.
Standard 3. The information literate student evaluates information and its sources critically and incorporates selected information into his or her knowledge base and value system.
Standard 4. The information literate student, individually or as a member of a group, uses information effectively to accomplish a specific purpose.
Standard 5. The information literate student understands many of the economic, legal, and social issues surrounding the use of information and accesses and uses information ethically and legally.
Project Information Literacy
A national, longitudinal research study based at the University of Washington’s iSchool, compiling data on how college students seek and use information.
+++++++++++++++++++++++
Research always starts with a question. But the success of your research also depends on how you formulate that question. If your topic is too broad or too narrow, you may have trouble finding information when you search. When developing your question/topic, consider the following:
Evaluating Web Resources
Don’t get “keyword lock!” Be willing to try a different term as a keyword. If you are having trouble thinking of synonyms, check a thesaurus, dictionary, or reference book for ideas.
Keyword worksheet
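The synonym advice can be made concrete: most library databases accept Boolean queries, so the synonyms for each concept can be combined with OR and the concepts joined with AND. A minimal sketch (the example synonym lists are assumptions for illustration, not library-recommended terms):

```python
def boolean_query(concepts):
    """Build a Boolean search string from a list of concepts,
    where each concept is a list of synonymous keywords."""
    groups = ["(" + " OR ".join(f'"{term}"' for term in terms) + ")"
              for terms in concepts]
    return " AND ".join(groups)

q = boolean_query([
    ["virtual reality", "VR", "immersive learning"],  # concept 1 and its synonyms
    ["education", "instruction"],                     # concept 2 and its synonyms
])
print(q)
# → ("virtual reality" OR "VR" OR "immersive learning") AND ("education" OR "instruction")
```

Pasting a string like this into a database’s advanced search avoids "keyword lock" by trying every synonym at once.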
How to find the SCSU Library Website
SCSU online databases
Locating and Defining a Database
Database Searching Overview:
You can search the SCSU library online databases by choosing:
Simple search
Advanced search
Psychology:
PsycINFO
General Science
ScienceDirect
Arts & Humanities Citation Index