A learning management system (LMS) is never the solution to every problem in education. Edtech is just one part of the whole learning ecosystem and student experience.
Therefore, the next generation digital learning environment (NGDLE), as envisioned by EDUCAUSE in 2015 … Looking at the NGDLE requirements from an LMS perspective, I view the NGDLE as being about five areas: interoperability; personalization; analytics, advising, and learning assessment; collaboration; and accessibility and universal design.
Interoperability
Content can easily be exchanged between systems.
Users are able to leverage the tools they love, including discipline-specific apps.
Learning data is available to trusted systems and people who need it.
The learning environment is “future proof” so that it can adapt and extend as the ecosystem evolves.
Personalization
The learning environment reflects individual preferences.
Departments, divisions, and institutions can be autonomous.
Instructors teach the way they want and are not constrained by the software design.
There are clear, individual learning paths.
Students have choice in activity, expression, and engagement.
Analytics, Advising, and Learning Assessment
Learning analytics helps to identify at-risk students, track course progress, and support adaptive learning pathways.
The learning environment enables integrated planning and assessment of student performance.
More data is made available, with greater context around the data.
The learning environment supports platform and data standards.
Collaboration
Individual spaces persist after courses and after graduation.
Learners are encouraged as creators and consumers.
Courses include public and private spaces.
Accessibility and Universal Design
Accessibility is part of the design of the learning experience.
The learning environment enables adaptive learning and supports different types of materials.
Learning design includes measurement rubrics and quality control.
The core analogy used in the NGDLE paper is that each component of the learning environment is a Lego brick:
The days of the LMS as a “walled garden” app that does everything are over.
Today many kinds of amazing learning and collaboration tools (Lego bricks) should be accessible to educators.
We have standards that let these tools (including an LMS) talk to each other. That is, all bricks share some properties that let them fit together.
Students and teachers sign in once to this “ecosystem of bricks.”
The bricks share results and data (see the sketch after this list).
These bricks fit together; they can be interchanged and swapped at will, with confidence that the learning experience will continue uninterrupted.
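To give a concrete sense of what “bricks sharing results and data” can look like, here is a minimal sketch, loosely modeled on the LTI Advantage grade-services pattern, of an external tool posting a learner’s score back to an LMS gradebook. The endpoint URL, token, and payload field names are illustrative assumptions, not any specific vendor’s API.

```python
# Hypothetical sketch: an external tool ("brick") reports a score back to the LMS.
# Endpoint, token, and payload fields are illustrative, loosely modeled on the
# LTI Advantage grade-services idea; consult the actual specification before use.
import datetime
import requests  # third-party HTTP client

LMS_SCORES_ENDPOINT = "https://lms.example.edu/api/lineitems/42/scores"  # assumed URL
ACCESS_TOKEN = "example-oauth2-token"  # obtained via the platform's auth flow

score_payload = {
    "userId": "student-123",           # opaque learner identifier
    "scoreGiven": 87,                  # points the learner earned
    "scoreMaximum": 100,               # maximum points possible
    "activityProgress": "Completed",   # learner finished the activity
    "gradingProgress": "FullyGraded",  # the score is final
    "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
}

response = requests.post(
    LMS_SCORES_ENDPOINT,
    json=score_payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
print(response.status_code)  # e.g., 200/201 if the LMS accepted the score
```

Because every brick speaks the same score format, the LMS can swap one tool for another without losing the gradebook integration.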
Any “next-gen” attempt to completely rework the pedagogical model and introduce a “mash-up of whatever” to fulfil this model would fall victim to the same criticisms levied at the LMS today: there is too little time and training to expect faculty to figure out the nuances of implementation on their own.
The Lego metaphor works only if we’re talking about “old school” Lego design — bricks of two, three, and four-post pieces that neatly fit together. Modern edtech is a lot more like the modern Lego. There are wheels and rocket launchers and belts and all kinds of amazing pieces that work well with each other, but only when they are configured properly. A user cannot simply stick together different pieces and assume they will work harmoniously in creating an environment through which each student can be successful.
As the NGDLE paper states: “Despite the high percentages of LMS adoption, relatively few instructors use its more advanced features — just 41% of faculty surveyed report using the LMS ‘to promote interaction outside the classroom.'”
But this is what the next generation LMS is good at: being a central nervous system — or learning hub — through which a variety of learning activities and tools are used. This is also where the LMS needs to go: bringing together and making sense of all the amazing innovations happening around it. This is much harder to do, perhaps even impossible, if all the pieces involved are just bricks without anything to orchestrate them or to weave them together into a meaningful, personal experience for achieving well-defined learning outcomes.
Making a commitment to build easy, flexible, and smart technology
Working with colleges and universities to remove barriers to adopting new tools in the ecosystem
Standardizing the vetting of accessibility compliance (the Strategic Nonvisual Access Partner Program from the National Federation of the Blind is a great start)
Advancing standards for data exchange while protecting individual privacy
Building integrated components that work with the institutions using them — learning quickly about what is and is not working well and applying those lessons to the next generation of interoperability standards
Letting people use the tools they love [SIC] and providing more ways for nontechnical individuals (including students) to easily integrate new features into learning activities
My note: something just refused to be accepted at SCSU
Technologists are often very focused on the technology, but the reality is that the more deeply and closely we understand the pedagogy and the people in the institutions — students, faculty, instructional support staff, administrators — the better suited we are to actually making the tech work for them.
Under the Hood of a Next Generation Digital Learning Environment in Progress
The challenge is that although 85 percent of faculty use a campus learning management system (LMS),1 a recent Blackboard report found that, out of 70,000 courses across 927 North American institutions, 53 percent of LMS usage was classified as supplemental (content-heavy, low interaction) and 24 percent as complementary (one-way communication via content/announcements/gradebook).2 Only 11 percent were characterized as social, 10 percent as evaluative (heavy use of assessment), and 2 percent as holistic (balanced use of all previous). Our FYE course required innovating beyond the supplemental course-level LMS to create a more holistic cohort-wide NGDLE in order to fully support the teaching, learning, and student success missions of the program. The key design goals for our NGDLE were to:
Create a common platform that could deliver a standard curriculum and achieve parity in all course sections using existing systems and tools and readily available content
Capture, store, and analyze any generated learner data to support learning assessment, continuous program improvement, and research
Develop reports and actionable analytics for administrators, advisors, instructors, and students
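As an illustration of the last two goals, here is a minimal sketch (using pandas, with invented column names and an invented threshold) of turning raw LMS activity events into a simple per-student engagement report that an advisor or instructor could act on.

```python
# Minimal sketch: aggregate raw LMS activity events into per-student metrics.
# Column names ("student_id", "event_type", "minutes") and the engagement
# threshold are illustrative assumptions, not an actual institutional data model.
import pandas as pd

events = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s3", "s3", "s3"],
    "event_type": ["login", "quiz", "login", "login", "discussion", "quiz"],
    "minutes":    [15, 30, 5, 20, 25, 40],
})

report = (
    events.groupby("student_id")
          .agg(total_minutes=("minutes", "sum"),
               activity_count=("event_type", "count"))
          .reset_index()
)

# Flag students whose total engagement falls below an (assumed) threshold.
report["low_engagement"] = report["total_minutes"] < 30
print(report)
```

The same aggregated table can feed dashboards for administrators and advisors or be exported for program-level assessment and research.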
The Internet of Things (IoT), augmented reality, and advancements in online learning have changed the way universities reach prospective students, engage with their current student body, and provide them the resources they need.
The Internet of Things has opened up a whole new world of possibilities in higher education. The increased connectivity between devices and “everyday things” means better data tracking and analytics, and improved communication between student, professor, and institution, often without ever saying a word. IoT is making it easier for students to learn when, how, and where they want, while providing professors support to create a more flexible and connected learning environment.
Virtual and augmented reality technologies have begun to take Higher Ed into the realm of what used to be considered science fiction.
By 2020 more than 50 billion things, ranging from cranes to coffee machines, will be connected to the internet. That means a lot of data will be created — too much data, in fact, to be manageable or to be kept forever affordably.
One by-product of more devices creating more data is that they are speaking lots of different programming languages. Machines are still using languages from the 1970s and 80s as well as the new languages of today. In short, applications need to have data translated for them — by an IoT babelfish, if you will — before they can make sense of the information.
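To make the “IoT babelfish” idea concrete, here is a minimal sketch of translating payloads from two hypothetical devices that report the same reading in different shapes and units into one common record. The device formats, field names, and unit conventions are assumptions for illustration only.

```python
# Minimal sketch of an "IoT babelfish": normalize heterogeneous device payloads
# into a common record. The device formats and field names are invented examples.
from datetime import datetime, timezone

def normalize(raw: dict) -> dict:
    """Translate a device-specific payload into a common schema."""
    if "temp_f" in raw:                      # legacy device reports Fahrenheit
        celsius = (raw["temp_f"] - 32) * 5 / 9
        device_id = raw["id"]
    else:                                    # newer device reports Celsius
        celsius = raw["temperature_c"]
        device_id = raw["device"]["serial"]
    return {
        "device_id": device_id,
        "temperature_c": round(celsius, 2),
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

legacy_payload = {"id": "crane-07", "temp_f": 98.6}
modern_payload = {"device": {"serial": "coffee-12"}, "temperature_c": 21.5}

print(normalize(legacy_payload))
print(normalize(modern_payload))
```

Once everything is translated into one schema, downstream analytics and storage can treat data from cranes and coffee machines alike.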
Then there are analytics and data storage.
Security becomes even more important as there is little human interaction in the flow of data from device to datacentre — so-called machine-to-machine communication.
Shoshana Mayden is a content strategist with the University of Arizona Libraries in Tucson, Arizona. She advises on issues related to website content and contributes to developing a content strategy across multiple channels for the library. She works with content managers and library stakeholders to re-write, re-think and re-organize content for the main library website, as well as develop workflows related to the lifecycle of content. She is also a copy editor for Weave: the Journal of Library User Experience.
Information Architecture: Designing Navigation for Library Websites
Information architecture is an essential component of user-centered design of information spaces, especially websites. Website navigation is a key design device to help users search and browse library websites and information systems. The design of website navigation can be simple or complex, flat or deep. In all cases, website navigation should take into account information architecture (IA) best practices, common user tasks in the library domain, user research, analytics, and information-seeking models.
Laura-Edythe Coleman is a Museum Informaticist: her focus is on the point of convergence for museums, information, people, and technology. Knowing that societies need museums for creating and sustaining cultural memory, she strives to help communities co-create heritage collections with museums. She holds a PhD in Information Science, a Master of Library and Information Science, and a Bachelor of Fine Arts. She can be reached via Twitter (@lauraedythe), her website (http://www.lauraedythe.com), or by email at email@example.com
The report gives detailed data on the use of various bibliometric and altmetric tools such as Google Scholar, Web of Science, Scimago, and Plum Analytics.
The survey covers 20 predominantly research universities in the USA, continental Europe, the UK, Canada, and Australia/New Zealand. Among the survey participants are Carnegie Mellon, Cambridge University, Universitat Politècnica de Catalunya, the University at Albany, the University of Melbourne, Florida State University, the University of Alberta, and Victoria University of Wellington.
ResearcherID provides a solution to the author ambiguity problem within the scholarly research community. Each member is assigned a unique identifier to enable researchers to manage their publication lists, track their times-cited counts and h-index, identify potential collaborators, and avoid author misidentification. In addition, your ResearcherID information integrates with the Web of Science and is ORCID compliant, allowing you to claim and showcase your publications from a single account. Search the registry to find collaborators, review publication lists, and explore how research is used around the world!
Industrial revolutions are momentous events. By most reckonings, there have been only three. The first was triggered in the 1700s by the commercial steam engine and the mechanical loom. The harnessing of electricity and mass production sparked the second, around the start of the 20th century. The computer set the third in motion after World War II.
Henning Kagermann, the head of the German National Academy of Science and Engineering (Acatech), did exactly that in 2011, when he used the term Industrie 4.0 to describe a proposed government-sponsored industrial initiative.
The term Industry 4.0 refers to the combination of several major innovations in digital technology.
These technologies include advanced robotics and artificial intelligence; sophisticated sensors; cloud computing; the Internet of Things; data capture and analytics; digital fabrication (including 3D printing); software-as-a-service and other new marketing models; smartphones and other mobile devices; platforms that use algorithms to direct motor vehicles (including navigation tools, ride-sharing apps, delivery and ride services, and autonomous vehicles); and the embedding of all these elements in an interoperable global value chain, shared by many companies from many countries.
Companies that embrace Industry 4.0 are beginning to track everything they produce from cradle to grave, sending out upgrades for complex products after they are sold (in the same way that software has come to be updated). These companies are learning mass customization: the ability to make products in batches of one as inexpensively as they could make a mass-produced product in the 20th century, while fully tailoring the product to the specifications of the purchaser.
Three aspects of digitization form the heart of an Industry 4.0 approach.
• The full digitization of a company’s operations
• The redesign of products and services
• Closer interaction with customers
Making Industry 4.0 work requires major shifts in organizational practices and structures. These shifts include new forms of IT architecture and data management, new approaches to regulatory and tax compliance, new organizational structures, and — most importantly — a new digitally oriented culture, which must embrace data analytics as a core enterprise capability.
Klaus Schwab put it in his recent book The Fourth Industrial Revolution (World Economic Forum, 2016), “Contrary to the previous industrial revolutions, this one is evolving at an exponential rather than linear pace.… It is not only changing the ‘what’ and the ‘how’ of doing things, but also ‘who’ we are.”
This great integrating force is gaining strength at a time of political fragmentation — when many governments are considering making international trade more difficult. It may indeed become harder to move people and products across some national borders. But Industry 4.0 could overcome those barriers by enabling companies to transfer just their intellectual property, including their software, while letting each nation maintain its own manufacturing networks.
more on the Internet of Things in this IMS blog http://blog.stcloudstate.edu/ims?s=internet+of+things
Because the questionnaire data comprised both Likert scales and open questions, they were analyzed quantitatively and qualitatively. Textual data (open responses) were qualitatively analyzed by coding: each segment (e.g., a group of words) was assigned to a semantic reference category, as systematically and rigorously as possible. For example, “Using an iPad in class really motivates me to learn” was assigned to the category “positive impact on motivation.” The qualitative analysis was performed using an adapted version of the approaches developed by L’Écuyer (1990) and Huberman and Miles (1991, 1994). Thus, we adopted a content analysis approach using QDA Miner software, which is widely used in qualitative research (see Fielding, 2012; Karsenti, Komis, Depover, & Collin, 2011). For the quantitative analysis, we used SPSS 22.0 software to conduct descriptive and inferential statistics. We also conducted inferential statistics to further explore the iPad’s role in teaching and learning, along with its motivational effect. The results will be presented in a subsequent report (Fievez & Karsenti, 2013).
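For readers curious what the coding step can look like when partially automated, here is a minimal sketch of dictionary-based coding in Python. The category names and keyword lists are invented for illustration and are no substitute for the systematic manual coding described above (the study itself used QDA Miner and SPSS).

```python
# Minimal sketch: assign open-response segments to semantic reference categories
# using simple keyword matching. The keyword lists are illustrative assumptions;
# real qualitative coding requires human judgment and iterative refinement.
CATEGORIES = {
    "positive impact on motivation": ["motivates", "motivation", "engaging"],
    "distraction in class": ["distract", "off-task", "games"],
}

def code_segment(segment: str) -> list[str]:
    """Return every category whose keywords appear in the segment."""
    text = segment.lower()
    return [cat for cat, keywords in CATEGORIES.items()
            if any(word in text for word in keywords)]

responses = [
    "Using an iPad in class really motivates me to learn",
    "Sometimes the iPad distracts me with games",
]
for r in responses:
    print(r, "->", code_segment(r))
```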
The 20th-century notion of conducting qualitative research through an oral interview and then processing the results manually triggered, in the second half of the 20th century, [sometimes] condescending attitudes from researchers in the exact sciences.
The reason was the advent of computing power in the second half of the 20th century, which allowed the exact sciences to claim “scientific” and “data-based” results.
One such statistical package, SPSS, is today widely known and considered a magnificent tool for building solid, statistically based argumentation, which further perpetuates the perceived superiority of quantitative over qualitative methods.
At the same time, qualitative researchers continue to lag behind, mostly due to the inertia of their approach to qualitative analysis, which is still often carried out in the olden ways. While there is nothing wrong with the “olden” ways, harnessing computational power can streamline that process and even surface options that the “human eye” sometimes misses.
Below are some suggestions you may consider when you embark on the path of qualitative research.
Palys and Atchison (2012) present a compelling case to bring your qualitative research to the level of the quantitative research by using modern tools for qualitative analysis.
1. The authors correctly promote NVivo as the “jaguar” of qualitative research method tools. Be aware, however, of the existence of other “Geo Metro” tools, which, for your research, might achieve the same result (see the bottom of this blog entry).
Text mining: https://en.wikipedia.org/wiki/Text_mining. Text mining, also referred to as text data mining and roughly equivalent to text analytics, is the process of deriving high-quality information from text. High-quality information is typically derived by devising patterns and trends through means such as statistical pattern learning. Text mining usually involves structuring the input text (usually parsing, along with the addition of some derived linguistic features, the removal of others, and subsequent insertion into a database), deriving patterns within the structured data, and finally evaluating and interpreting the output. https://ischool.syr.edu/infospace/2013/04/23/what-is-text-mining/
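As a toy illustration of the structuring and pattern-derivation steps described above, the following sketch (standard-library Python, with invented sample texts and a tiny stop-word list) tokenizes a few documents and reports the most frequent terms:

```python
# Toy text-mining sketch: tokenize documents, drop stop words, count term
# frequencies. The sample texts and stop-word list are illustrative only.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "is", "of", "and", "to", "in"}

documents = [
    "Text mining is the process of deriving high-quality information from text.",
    "Text mining structures the input text and derives patterns in the data.",
]

tokens = []
for doc in documents:
    words = re.findall(r"[a-z]+", doc.lower())   # crude tokenizer
    tokens.extend(w for w in words if w not in STOP_WORDS)

term_frequencies = Counter(tokens)
print(term_frequencies.most_common(5))  # e.g., [('text', 4), ('mining', 2), ...]
```

Real text-mining pipelines add parsing, derived linguistic features, and database storage on top of this kind of counting, but the basic flow from unstructured text to structured, countable data is the same.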
Qualitative data is descriptive data that cannot be measured in numbers and often includes qualities of appearance such as color, texture, and textual description. Quantitative data is numerical, structured data that can be measured. However, there is often slippage between qualitative and quantitative categories. For example, a photograph might traditionally be considered “qualitative data,” but when you break it down to the level of pixels, it can be measured.
A word of caution: text mining doesn’t generate new facts and is not an end in and of itself. The process is most useful when the data it generates can be further analyzed by a domain expert, who can bring additional knowledge for a more complete picture. Still, text mining creates new relationships and hypotheses for experts to explore further.
Pros and Cons of Computer Assisted Qualitative Data Analysis Software
more on quantitative research:
Asamoah, D. A., Sharda, R., Hassan Zadeh, A., & Kalgotra, P. (2017). Preparing a Data Scientist: A Pedagogic Experience in Designing a Big Data Analytics Course. Decision Sciences Journal of Innovative Education, 15(2), 161–190. https://doi.org/10.1111/dsji.12125
literature on quantitative research:
St. Cloud State University MC Main Collection – 2nd floor
AZ195 .B66 2015
p. 161 Data scholarship in the Humanities
p. 166 When Are Data?
Philip Chen, C. L., & Zhang, C.-Y. (2014). Data-intensive applications, challenges, techniques and technologies: A survey on Big Data. Information Sciences, 275(Supplement C), 314–347. https://doi.org/10.1016/j.ins.2014.01.015
AECT-OTP Webinar: Digital Badges and Micro-Credentials for the Workplace
Time: Mar 27, 2017 1:00 PM Central Time (US and Canada)
Learn how to implement digital badges in learning environments. Digital badges and micro-credentials offer an entirely new way of recognizing achievements, knowledge, skills, experiences, and competencies that can be earned in formal and informal learning environments. They are an opportunity to recognize such achievements through credible organizations that can be integrated in traditional educational programs but can also represent experience in informal contexts or community engagement. Three guiding questions will be discussed in this webinar: (1) digital badges’ impact on learning and assessment, (2) digital badges within instructional design and technological frameworks, and (3) the importance of stakeholders for the implementation of digital badges.
Dirk Ifenthaler is Professor and Chair of Learning, Design and Technology at the University of Mannheim, Germany, and Adjunct Professor at Curtin University, Australia. His previous roles include Professor and Director of the Centre for Research in Digital Learning at Deakin University, Australia; Manager of Applied Research and Learning Analytics at Open Universities Australia; and Professor for Applied Teaching and Learning Research at the University of Potsdam, Germany. He was a 2012 Fulbright Scholar-in-Residence at the Jeannine Rainbolt College of Education at the University of Oklahoma, USA.
Each student learns differently, and assessment is not linear. The learning path can be longer or shorter for different students.
Assessment comes before badges.
What are credentials?
How well can I show my credentials: can I find them, can I translate them? Key elements include the issuer, the earner, the achievement description, and the date issued.
Digital badges have the potential to become an alternative credentialing system, linking directly via metadata to validating evidence of educational achievements.
A digital badge is not an assessment; it is the ability to demonstrate the assessment.
They are a motivational mechanism: they support alternative forms of assessment, credentialize learning, chart learning pathways, and support self-reflection and planning.
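To ground the metadata point, here is a minimal sketch of the kind of badge assertion the notes above describe (issuer, earner, achievement description, date issued, and a link to validating evidence). It is loosely modeled on the Open Badges idea, but the field names and values here are illustrative assumptions rather than the exact specification.

```python
# Minimal sketch of digital-badge metadata, loosely modeled on the Open Badges
# idea. Field names and values are illustrative; consult the actual Open Badges
# specification for the normative JSON-LD structure.
import json

badge_assertion = {
    "issuer": "St. Cloud State University",              # who issued the badge
    "earner": "student-123",                              # who earned it
    "achievement": {
        "name": "Qualitative Data Analysis Basics",
        "description": "Demonstrated coding of open responses with QDA software.",
    },
    "date_issued": "2017-03-27",
    "evidence": "https://example.edu/portfolio/student-123/qda-project",  # validating evidence
}

print(json.dumps(badge_assertion, indent=2))
```

Because the evidence link travels with the badge, a viewer can go beyond the credential itself to the work that earned it.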
Top 10 IT Issues, 2017: Foundations for Student Success
Susan Grajek and the 2016–2017 EDUCAUSE IT Issues Panel, Tuesday, January 17, 2017
http://er.educause.edu/articles/2017/1/top-10-it-issues-2017-foundations-for-student-success
The 2017 EDUCAUSE Top 10 IT Issues are all about student success.
Issue #1: Information Security
Developing a holistic, agile approach to reduce institutional exposure to information security threats
That program should encompass people, process, and technologies:
Develop processes to identify and protect the most sensitive data
Implement technologies to encrypt data and to find and block advanced threats coming from outside the network from any type of device
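As one concrete illustration of "technologies to encrypt data," here is a minimal sketch using the widely used Python cryptography library's Fernet recipe (symmetric, authenticated encryption). Key management, at-rest storage, and threat detection are out of scope here and are the harder institutional problems.

```python
# Minimal sketch: symmetric, authenticated encryption of a sensitive record
# using the "cryptography" package's Fernet recipe. Real deployments hinge on
# key management and access control, which this sketch does not address.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, store and rotate keys in a KMS
fernet = Fernet(key)

sensitive = b"student_id=123; ssn=REDACTED"
token = fernet.encrypt(sensitive)    # ciphertext safe to store at rest
print(token)

restored = fernet.decrypt(token)     # only holders of the key can read it
assert restored == sensitive
```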
Who Outside the IT Department Should Care Most about This Issue?
End-users, to understand how to avoid exposing their credentials
Unit heads, to protect institutional data
Senior leaders, to hold people accountable
Institutional leadership, to endorse, fund, and advocate for good information security
Issue #2: Student Success and Completion
Effectively applying data and predictive analytics to improve student success and completion
Predictive analytics allows us to track trends, discover gaps and inefficiencies, and displace “best guess” scenarios based on implicitly developed stories about students.
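To make the predictive-analytics point tangible, here is a minimal sketch (scikit-learn, with a tiny invented dataset) of fitting a model that estimates a completion-risk probability from two engagement features. Real models would draw on far richer institutional data and require careful validation and ethical review before informing advising.

```python
# Minimal sketch: predict completion risk from engagement features using
# scikit-learn. The features, data, and interpretation are invented for
# illustration; production models need richer data and thorough validation.
from sklearn.linear_model import LogisticRegression

# Features: [logins_per_week, assignments_submitted]; label: 1 = completed course
X = [[1, 2], [2, 3], [8, 9], [7, 10], [0, 1], [9, 8]]
y = [0, 0, 1, 1, 0, 1]

model = LogisticRegression()
model.fit(X, y)

new_student = [[2, 2]]                       # sparse engagement so far
risk = 1 - model.predict_proba(new_student)[0][1]
print(f"Estimated non-completion risk: {risk:.2f}")
```

A flag like this is a starting point for an advising conversation, not a verdict; it displaces guesswork only when it is paired with human judgment.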
Issue #3: Data-Informed Decision Making
Ensuring that business intelligence, reporting, and analytics are relevant, convenient, and used by administrators, faculty, and students
Higher education information systems generate vast amounts of data daily (including the classroom/LMS). This potentially rich source of information is underused. Even though most institutions have created reports, dashboards, and other distillations of data, these are not necessarily useful or used to inform strategic objectives such as student success or institutional efficiency.
Issue #4: Strategic Leadership
Repositioning or reinforcing the role of IT leadership as a strategic partner with institutional leadership
CIOs have two challenges in this regard. The first is getting to the table. Contemporary requirements for IT leaders position them well for strategic leadership.18 Those requirements include expertise in management and business practices, project portfolio management, negotiation, and change leadership. However, business-savvy CIOs can alienate some academics, particularly those opposed to administrators as leaders. Worse, not all CIOs are well-equipped for a position at the executive table.
Issue #5: Sustainable Funding
Developing IT funding models that sustain core services, support innovation, and facilitate growth
Two complications have deepened the IT funding challenge in recent years. The first is that information technology is now incontrovertibly core to the mission and function of colleges and universities. The second complication is that at most institutions, digital investments and technology refreshes have been funded with capital expenditures. Yet IT services and infrastructure are moving outside the institution, generally to the cloud, and cloud funding depends on ongoing expenditures rather than one-time investments.
Issue #6: Data Management and Governance
Improving the management of institutional data through data standards, integration, protection, and governance
Data management and governance is not an IT issue. It requires a broad, top-down approach because all departments need to buy in and agree. All stakeholders (data owners as well as IR, IT, and institutional leaders) must collaboratively develop a common set of data definitions and a common understanding of what data is needed, in what format, and for what purposes. This coordination, or governance, will enable constituents to communicate with confidence about the data (e.g., “the single version of truth”) and the standards (e.g., APLU, IPEDS, CDS) under which it is collected.
Institutions often choose to approach data management from three perspectives: (1) accuracy, (2) usability, and (3) privacy. The IT organization has a role to play in creating and maintaining data warehouses, integrating systems to facilitate data exchange, and maintaining standards for data privacy and security.
Issue #7: Higher Education Affordability
Prioritizing IT investments and resources in the context of increasing demand and limited resources
Uncoordinated, redundant expenditures supplant other needed investments, such as consistent classroom technology or dedicated information security staff. Planning needs to occur at the institutional or departmental level, but it also needs a place to coalesce and be assessed regionally, nationally, and in some cases, globally, because there isn’t enough money to do everything that institutional leaders, faculty, and others want or even need to do. Public systems are making some headway in sharing services, but for the most part, local optimization supersedes collaboration and compromise.
Issue #8: Sustainable Staffing
Ensuring adequate staffing capacity and staff retention as budgets shrink or remain flat and as external competition grows
As institutions become more dependent on their IT organizations, IT organizations are more dependent on the expertise and quality of their workforce. New hires need to be great hires, and great staff need to want to stay. Each new hire can change the culture and effectiveness of the IT organization.
Issue #9: Next-Gen Enterprise IT
Developing and implementing enterprise IT applications, architectures, and sourcing strategies to achieve agility, scalability, cost-effectiveness, and effective analytics
Buildings should outlive alumni; technology shouldn’t. IT leaders are examining core enterprise applications, including ERPs (traditionally, suites of financial, HR, and student information systems) and LMSs, for their ability to meet current and future needs.
Issue #10: Digital Transformation of Learning
Collaborating with faculty and academic leadership to apply technology to teaching and learning in ways that reflect innovations in pedagogy and the institutional mission
According to Michael Feldstein and Phil Hill, personalized learning applies technology to three processes: content (moving content delivery out of the classroom and allowing students to set their pace of learning); tutoring (allowing interactive feedback to both students and faculty); and contact time (enabling faculty to observe students’ work and coach them more).
more on IT in this IMS blog http://blog.stcloudstate.edu/ims?s=information+technology