MEL Science aims to release more than 150 lessons covering all the main topics in the K–12 chemistry curriculum. Later this year, MEL Science also plans to add support for other VR platforms, including Google Cardboard and Samsung Gear VR.
Augmented reality adds computer-generated content as a contextual overlay to the real world. This technology, often powered by devices we already carry, has enormous applications for training and development.
Virtual reality has existed for decades, but technology has finally emerged that makes it truly accessible. VR allows us to put learners in a truly immersive environment, creating entirely new opportunities for training and learning.
AR and VR are just the start of the alternate-reality conversation. There are additional technologies that we can use on their own or as part of a blend with AR and VR to increase the level of immersion in the experiences we create.
Ben Ward, Kansas State University
Joelle Pitts, Instructional Design Librarian and Associate Professor, Kansas State University Libraries
Stefan Yates, Instructional Design Librarian and Associate Professor, Kansas State University
Transmedia, unicorns, and marketing, oh my!: The not-quite epic failure of transmedia design efforts in Oz.
Transmedia storytelling efforts, also called Alternate Reality Games, have been designed to intrigue, engage, and even engineer groups of people since the release of The Beast in 2001. A few colleges and universities have employed them to engage their student populations and even teach them a thing or two using narrative game mechanics. Presenters will chronicle a highly successful transmedia design effort at Kansas State University and the subsequent annual efforts to replicate its engagement and enthusiasm. Best practices and not-quite epic failures will be discussed, as will tips (and laments) for marketing to our current student populations.
Glenn Larsen, National Science Foundation
SBIR and Other Funding Sources for Your Game
The National Science Foundation (NSF) awards nearly $190 million annually to startups and small businesses through the Small Business Innovation Research (SBIR)/Small Business Technology Transfer (STTR) program, transforming scientific discovery into products and services with commercial and societal impact. The equity-free funds support research and development (R&D) across almost all areas of science and technology, helping companies de-risk technology for commercial success. The NSF is an independent federal agency with a budget of about $7 billion that supports fundamental research and education across all fields of science and engineering. For more information, visit http://www.nsf.gov/SBIR.
Karen Schrier, Assistant Professor/Director of Games and Emerging Media, Marist College
Design Principles for Knowledge Games
Lisa Castaneda, CEO, foundry10
Mark Suter, Teacher, Bernards Township Schools
How Teachers Can Use VR in the Classroom: Beyond the Novelty
Over the past three years, foundry10, an education research organization, has been studying the potential of Virtual Reality in Education. The research has focused on the implementation, immersion dynamics, and integration of content across the curriculum.
Working with a variety of classroom curricular areas, with students and teachers from 30 schools, we have gathered data as well as anecdotal stories to help illustrate how VR functions in a learning environment. Students from all over the US, Canada, and parts of Europe completed pre/post surveys, and educators participated in extensive qualitative interviews, in order to better understand what it means to learn with virtual reality.
Please join foundry10 CEO Lisa Castaneda and teachers Steve Isaacs and Mark Suter as we share what we have learned about how to effectively utilize VR for classroom learning through content creation (both inside and outside of the virtual world), content consumption, and content integration, as well as how to overcome the obstacles inherent in implementation.
At the core of the platform is Voke’s TrueVR product, which delivers full stereoscopic 3D video that is integrated with augmented content in a 360-degree VR environment. It uses multiple camera angles with zoom capabilities and synchronized DVR, so that viewers can control what they want to watch. Additionally, with TrueVR, content is captured, encoded, synced with scores, metadata and audio and delivered in real time to multiple platforms.
These observations draw on meetings with Chief Learning Officers, talent management leaders, and vendors of next-generation learning tools.
The corporate L&D industry is over $140 billion in size, and it crosses over into the $300 billion marketplace for college degrees, professional development, and secondary education around the world.
Digital learning does not mean learning on your phone; it means “bringing learning to where employees are.” In other words, this new era is not only a shift in tools, it’s a shift toward employee-centric design. Shifting from “instructional design” to “experience design” and using design thinking are key here.
1) The traditional LMS is no longer the center of corporate learning, and it’s starting to go away.
LMS platforms were designed around the traditional content model, using a 17-year-old standard called SCORM. SCORM is a technology developed in the 1980s, originally intended to help companies like Boeing track training records from their CD-ROM-based training programs.
The paradigm we built was focused on the idea of a “course catalog,” an artifact that makes sense for formal education but no longer feels relevant for much of our learning today.
I am not saying the $4 billion LMS market is dead, but the center of action has moved (i.e., their cheese has been moved). Today’s LMS is much more of a compliance management system, serving as a platform for record-keeping, and this function can now be replaced by new technologies.
We have come from a world of CD-ROMs to online courseware (early 2000s), to an explosion of video and instructional content (YouTube and MOOCs in the last five years), to a new world of always-on, machine-curated content of all shapes and sizes. The LMS, which was largely architected in the early 2000s, simply has not kept up effectively.
2) The emergence of the X-API makes everything we do part of learning.
In the days of SCORM (the technology developed by Boeing in the 1980s to track CD-ROM training), we could only really track what you did in a traditional or e-learning course. Today all these other activities are trackable using the X-API (also called Tin Can or the Experience API). So just as Google and Facebook can track your activities on websites, and your browser can track your clicks on your PC or phone, the X-API lets products like a learning record store keep track of all your digital activities at work.
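As a sketch of what an X-API record looks like in practice: each statement is a JSON actor/verb/object triple that a tool POSTs to a learning record store. The example below uses only Python’s standard library; the employee, video URL, and activity name are invented for illustration, while the verb URI comes from ADL’s published xAPI verb vocabulary.

```python
import json

def make_statement(actor_email, verb_id, verb_name, object_id, object_name):
    """Build a minimal xAPI statement: an actor/verb/object triple."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "objectType": "Activity",
            "id": object_id,
            "definition": {"name": {"en-US": object_name}},
        },
    }

# Record that an employee watched an internal how-to video.
# The employee and video URL are invented for illustration.
stmt = make_statement(
    "jane@example.com",
    "http://adlnet.gov/expapi/verbs/experienced",
    "experienced",
    "https://intranet.example.com/videos/onboarding-101",
    "Onboarding 101",
)
payload = json.dumps(stmt)  # this JSON is what gets POSTed to an LRS
```

Because the statement is just JSON, any activity (a video viewed, a document read, a simulation completed) can be recorded in the same shape, which is what makes “everything part of learning.”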
3) As content grows in volume, it is falling into two categories: micro-learning and macro-learning.
4) Work Has Changed, Driving The Need for Continuous Learning
Why is all this micro-learning content so important? Quite simply because the way we work has radically changed. We spend an inordinate amount of time looking for information at work, and we are constantly bombarded by distractions, messages, and emails.
5) Spaced Learning Has Arrived
If we consider the new world of content (micro and macro), how do we build an architecture that teaches people what to use when? Can we make it easier and avoid all this searching?
Neurological research has shown that we don’t learn well through “binge education” like a one-off course. We learn by being exposed to new skills and ideas over time, with spacing and questioning in between. Studies have shown that students who cram for final exams lose much of what they memorized within a few weeks, while students who learn slowly, with continuous reinforcement, can retain skills and knowledge for decades.
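The spacing-and-reinforcement idea can be sketched as an expanding-interval scheduler, loosely in the style of the SM-2 family used by spaced-learning tools; the ease factor and the reset-on-failure rule below are illustrative choices, not taken from any particular vendor.

```python
def next_review(interval_days, recalled, ease=2.5):
    """Expanding-interval scheduler, loosely SM-2 style.

    A successful recall grows the interval by the ease factor;
    a failed recall resets the item to one day so it repeats soon.
    """
    if not recalled:
        return 1
    return max(1, int(interval_days * ease))

# Simulate one item reviewed five times, starting from a 1-day interval.
interval = 1
history = []
for recalled in [True, True, True, False, True]:
    interval = next_review(interval, recalled)
    history.append(interval)

print(history)  # intervals expand until the miss, then restart
```

The design choice matters more than the exact numbers: content resurfaces at growing intervals instead of being “binged” once, which is what the research above argues for.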
6) A New Learning Architecture Has Emerged: With New Vendors To Consider
One of the keys to digital learning is building a new learning architecture. This means using the LMS as a “player” but not the “center,” and looking at a range of new tools and systems to bring content together.
On the upper left is a relatively new breed of vendors, including companies like Degreed, EdCast, Pathgather, Jam, Fuse, and others, that serve as “learning experience” platforms. They aggregate, curate, and add intelligence to content, without specifically storing content or authoring in any way. In a sense they develop a “learning experience,” and they are all modeled after magazine-like interfaces that enable users to browse, read, consume, and rate content.
The second category comprises the “program experience platforms” or “learning delivery systems.” These companies, which include vendors like NovoEd, EdX, Intrepid, Everwise, and many others (including many LMS vendors), help you build a traditional learning “program” in an open and easy way. They offer pathways, chapters, social features, and features for assessment, scoring, and instructor interaction. While many of these features belong in an LMS, these systems are built in a modern cloud architecture, and they are effective for programs like sales training, executive development, onboarding, and more. In many ways you can consider them “open MOOC platforms” that let you build your own MOOCs.
The third category at the top I call “micro-learning platforms” or “adaptive learning platforms.” These are systems that operate more like intelligent, learning-centric content management systems that help you take lots of content, arrange it into micro-learning pathways and programs, and serve it up to learners at just the right time. Qstream, for example, has focused initially on sales training – and clients tell me it is effective at using spaced learning to help salespeople stay up to speed (they are also entering the market for management development). Axonify is a fast-growing vendor that serves many markets, including safety training and compliance training, where people are reminded of important practices on a regular basis, and learning is assessed and tracked. Vendors in this category, again, offer LMS-like functionality, but in a way that tends to be far more useful and modern than traditional LMS systems. And I expect many others to enter this space.
Perhaps the most exciting part of tools today is the growth of AI and machine-learning systems, as well as the huge potential for virtual reality.
7) Traditional Coaching, Training, and Culture of Learning Has Not Gone Away
8) A New Business Model for Learning
The days of spending millions of dollars on learning platforms are starting to come to an end. We do have to make strategic decisions about which vendors to select, but given the rapid and immature state of the market, I would warn against spending too much money on any one vendor at a time. The market has yet to shake out, and many of these vendors could go out of business, be acquired, or simply become irrelevant in 3–5 years.
9) The Impact of Microsoft, Google, Facebook, and Slack Is Coming
The newest versions of Microsoft Teams, Google Hangouts and Google Drive, Workplace by Facebook, Slack, and other enterprise IT products now give employees the opportunity to share content, view videos, and find context-relevant documents in the flow of their daily work.
We can imagine that Microsoft’s acquisition of LinkedIn will result in some integration of Lynda.com content in the flow of work. (Imagine if you are trying to build a spreadsheet and a relevant Lynda course opens up). This is an example of “delivering learning to where people are.”
10) A new set of skills and capabilities in L&D
It’s no longer enough to consider yourself a “trainer” or “instructional designer” by career. While instructional design continues to play a role, we now need L&D to focus on “experience design,” “design thinking,” the development of “employee journey maps,” and much more experimental, data-driven solutions in the flow of work.
Almost all of these companies are now teaching themselves design thinking; they are using MVP (minimum viable product) approaches to new solutions, and they are focusing on understanding and addressing the “employee experience,” rather than just injecting new training programs into the company.
How data is produced, collected, and analyzed; making all kinds of data and information accessible.
Asking good questions and finding good answers, and sharing findings in meaningful ways. This is where digital literacy overshadows information literacy, a fact that the SCSU library does not understand: besides teaching students how to find and evaluate data, I also teach them how to communicate effectively using electronic tools.
Connecting people, tools, and resources and making it easier for everybody; building collaborative, open, and interdisciplinary communities.
Robust data and computational literacies: developing workshops, projects, and events to practice new skills; positioning the library as the interdisciplinary nexus.
What are data? Definition: items of information, facts, traces of content and form. At a higher level, a conceptual discussion about data in terms of social effects: metadata capturing information about the world; social, political, and economic changes. Move away from mystic conceptions of data; there is nothing objective about data.
The emergence of IoT: digital meets physical. Cyber-physical systems; smart objects driven by industry; a proliferation of sensors and devices (smart devices).
What does privacy look like? What does net neutrality mean when everything is IoT? Libraries must restructure: collaborate across institutions on collections of data in open and participatory ways; put IoT in the hands of people who make and break things (she is a makerspace aficionado).
“Make and break things” hackathons: use cheap devices such as the Arduino and Raspberry Pi.
Data literacy programs with higher-level conceptual exploration; libraries empower the campus in data collection. Data science norms: store and share data in existing repositories and even catalogs. There are commercial services to store and connect data, but they are very restrictive, which is why libraries must be involved.
Linked data and dark data
Linked data: draws connections among online data, most of which are locked away. Linked data uses metadata to link related information in ways computers can understand.
Libraries should take advantage of linked data: an opportunity for semantics, natural language processing, etc. If hidden data is relevant to our communities, it is a library’s responsibility to provide it. Community data practitioners.
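As a concrete sketch of what “metadata that links related information” looks like, here is a minimal JSON-LD record built as a plain Python dict. The catalog URI and author identifier below are invented for illustration; the vocabulary (the @context) is schema.org, a real linked-data vocabulary widely used by libraries.

```python
import json

# A minimal JSON-LD record: the book and its author each get a URI (@id),
# so other datasets can point at exactly the same entities, and @type tells
# machines what kind of thing each node is.
record = {
    "@context": "http://schema.org",
    "@id": "https://catalog.example.org/book/42",   # invented catalog URI
    "@type": "Book",
    "name": "The Shock Doctrine",
    "author": {
        "@id": "https://id.example.org/person/naomi-klein",  # invented identifier
        "@type": "Person",
        "name": "Naomi Klein",
    },
}

serialized = json.dumps(record, indent=2)  # what a catalog would publish
```

The point of the sketch: once both nodes carry stable URIs, a computer can merge this record with any other dataset that references the same identifiers, which is the linking the notes above describe.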
Dark data: massive data that cannot be analyzed by relational processing and did not yield significant findings, yet might be valuable for researchers; one person’s trash is another person’s treasure. Preserving data and providing access to information. Collaborate with researchers across disciplines and help decide what is worth keeping, what to discard, and how to study it.
Rich learning experiences working with linked and dark data enable fresh perspectives and teach how to work with data architecture. Data literacy programming.
In the age of Big Data, there is an abundance of free or cheap data sources available to libraries about their users’ behavior across the many components that make up their web presence. Data from vendors, data from Google Analytics or other third-party tracking software, and data from user testing are all things libraries have access to at little or no cost. However, just like many students can become overloaded when they do not know how to navigate the many information sources available to them, many libraries can become overloaded by the continuous stream of data pouring in from these sources. This session will aim to help librarians understand 1) what sorts of data their library already has (or easily could have) access to about how their users use their various web tools, 2) what that data can and cannot tell them, and 3) how to use the datasets they are collecting in a holistic manner to help them make design decisions. The presentation will feature examples from the presenters’ own experience of incorporating user data in decisions related to the design of the Bethel University Libraries’ web presence.
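The “holistic” use of several data streams can be sketched as a simple join of per-page metrics from different sources. Everything below is invented for illustration: the page names, pageview counts, user-testing success rates, and the thresholds are all hypothetical.

```python
# Hypothetical per-page metrics from two sources: analytics pageviews and
# user-testing task-success rates. Joining them flags pages that are heavily
# used but hard to use -- good candidates for a redesign.
pageviews = {"/home": 12000, "/databases": 5400, "/hours": 3100}
task_success = {"/home": 0.95, "/databases": 0.55, "/hours": 0.90}

def redesign_candidates(views, success, min_views=1000, max_success=0.7):
    """Pages with high traffic but low task success."""
    return sorted(
        page for page in views
        if views[page] >= min_views and success.get(page, 1.0) <= max_success
    )

print(redesign_candidates(pageviews, task_success))
```

Neither dataset alone tells this story: analytics shows where users go, user testing shows where they struggle, and only the combination points at what to fix first.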
Lack of fear; changing the mindset.
Deep collaboration, both within and across consortia.
Don’t rely on vendor solutions; change the mindset.
Development = opportunity (versus development as “work”).
PALNI (the Private Academic Library Network of Indiana) serves private higher education.
3D virtual pictures of disaster areas: unlocking digital information so it is accessible to all people who might be interested.
They opened the maps of Kathmandu to the local community, who came up with strategies to recover; democracy in action.
I can’t stop thinking that the keynote speaker’s efforts are a mere follow-up to what Naomi Klein explains in her Shock Doctrine (http://www.naomiklein.org/shock-doctrine): a government or country seeks reasons to destroy another country or area, and then NGOs from the same country go in to remedy the disaster.
A question from a librarian from the U about the use of drones. My note: why did the SCSU library have to give up its drone?
The Douglas County Library model: too resource-intensive to continue.
Marmot Library Network
ILS (integrated library system) shared with other counties; same server for the entire consortium. They have a programmer: VuFind, the open source discovery layer, was customized from community VuFind into VuFind Plus. Instead of using the ILS public access catalog, they are using the VuFind interface.
Califa’s Enki: a public library single-access collection. They purchase ebooks from the publishers and also use the VuFind interface, but it is not integrated with the library catalogs. The Kansas public library system went from OverDrive to VuFind. The California State Library is funding this effort for the time being.
HarperCollins is too cumbersome, which is the reason to avoid working with them.
Security issues: some of the material was being sent over FTP and was immediately moved to SFTP.
Decisions: use internal resources only; if not, Amazon.
A programmer was used for the pilot; the contracted programmers lacked the ability to see the larger picture, so they eventually hired a full-time person instead of outsourcing. RDA-compliant MARC.
ONIX, spreadsheet MARC.
Decision about who to start with: public or academic.
Attempt to keep pricing down.
Their own agreement with the customers, separate from the agreement with the publisher.
Current development: web-based online reading, shared consortial collections, and SIP2 authentication.
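Since SIP2 authentication is named as a current development, here is a minimal sketch of what a SIP2 message involves. SIP2 is a line-oriented protocol of fixed-format messages; the standard error-detection fields are an AY sequence number and an AZ checksum (the two's complement of the summed character values, written as four hex digits). The message code 93 (Login) and the CN/CO field codes come from the SIP2 specification; the credentials below are invented.

```python
def sip2_checksum(msg):
    """SIP2 checksum: sum the character values, take the two's complement
    of the lower 16 bits, and format as four uppercase hex digits."""
    return format((-sum(msg.encode("ascii"))) & 0xFFFF, "04X")

def sip2_login(user, password, seq=0):
    """Build a SIP2 Login (93) message.

    '00' after the 93 selects plain-text UID/PWD algorithms; CN carries the
    login user id and CO the login password. The checksum covers everything
    through the AZ field identifier, and the line ends with a carriage return.
    """
    body = f"9300CN{user}|CO{password}|AY{seq}AZ"
    return body + sip2_checksum(body) + "\r"

# Invented credentials, purely for illustration.
print(repr(sip2_login("circ", "secret", 1)))
```

A receiver validates the message by summing the body together with the checksum value and confirming the lower 16 bits come out to zero, which is what the test of this sketch does.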