During a Dec. 8 Future Trends Forum video chat hosted by futurist Bryan Alexander, several liberal arts technology leaders spoke about their efforts to define their colleges’ approach to digital innovation.
As an example of a more promising liberal arts partnership, Eshleman pointed to LACOL, the Liberal Arts Consortium for Online Learning. LACOL comprises nine member institutions: Amherst, Bryn Mawr, Carleton, Haverford, Pomona, Swarthmore, Vassar, Washington and Lee, and Williams. LACOL is an effort to create an experimental framework that supports project work across the nine campuses. There are interesting experiments happening on each campus, and LACOL provides opportunities to use a digital network to take those to a new level, said Elizabeth Evans, LACOL’s director, who joined Eshleman on the Future Trends Forum virtual stage to describe the consortium’s setup.
This involves a multi-campus team of faculty and instructional designers, all organized around a central project, which has its ups and downs, she added.
She said she has learned to keep the focus off of technology initially. She asks faculty members to think about what they have wanted to do around student learning and why. “It is about that first, and technology second,” she stressed, adding that she has moved away from quantitative evaluation of projects and more toward storytelling.
The key campus tech issues are no longer about IT (in the past, e.g., MS versus Apple). IT is the “easy part” of technology on campus. The challenges: people, planning, policy, programs, priorities, silos, egos, and IT entitlements.
How do we make digital learning compelling and safe for the faculty? Provide evidence of impact, support, recognition, and reward for faculty; communicate about the effectiveness of and need for IT resources.
Technology is not a capital cost; it is an operational, recurring cost.
Underlying issues: Can I do this? Why should I do this? What is the evidence of benefit?
The more things change, the more things stay the same. A new equilibrium.
Change the question from “What did you do wrong?” to “How do we do better?” Use data as a resource, not as a weapon. There is a fear of trying because there is no recognition or reward.
Machiavelli: 1. Concentrate your efforts. 2. Pick your issues carefully; know when to fight. 3. Know the history. 4. Build coalitions. 5. Set modest and realistic goals. 6. Leverage the value of data (use it as a resource, not a weapon). 7. Anticipate personnel turnover. 8. Set deadlines for decisions.
We apologize for the short notice, but wanted to make you aware of the following opportunity:
From Ken Graetz at Winona State University:
As part of our Digital Faculty Fellows Program at WSU, Dr. Kenneth C. Green will be speaking this Thursday, March 22nd in Stark 103 Miller Auditorium from 11:30 to 12:30 on “Innovation, Infrastructure, and Digital Learning.” We will be streaming Casey’s talk using Skype Meeting Broadcast and you can join as a guest using the following link: Join the presentation. This will allow you to see and hear his presentation, as well as post moderated questions. By way of a teaser, here is a recent quote from Dr. Green’s blog, DigitalTweed, published by Inside Higher Ed:
“If trustees, presidents, provosts, deans, and department chairs really want to address the fear of trying and foster innovation in instruction, then they have to recognize that infrastructure fosters innovation. And infrastructure, in the context of technology and instruction, involves more than just computer hardware, software, digital projectors in classrooms, learning management systems, and campus web sites. The technology is actually the easy part. The real challenges involve a commitment to research about the impact of innovation in instruction, and recognition and reward for those faculty who would like to pursue innovation in their instructional activities.”
Dr. Green is the founding director of The Campus Computing Project, the largest continuing study of the role of digital learning and information technology in American colleges and universities. Campus Computing is widely cited as a definitive source for data, information, and insight about IT planning and policy issues affecting higher education. Dr. Green also serves as the director, moderator, and co-producer of TO A DEGREE, the postsecondary success podcast of the Bill & Melinda Gates Foundation. He is the author or editor of some 20 books and published research reports and more than 100 articles and commentaries that have appeared in academic journals and professional publications. In 2002, Dr. Green received the first EDUCAUSE Award for Leadership in Public Policy and Practice. The EDUCAUSE award cites his work in creating The Campus Computing Project and recognizes his “prominence in the arena of national and international technology agendas, and the linking of higher education to those agendas.”
Special guest: Steven Bell, Associate Librarian at Temple University Libraries.
On Tuesday, February 27, the #DLNchat community got together to discuss: What Is the Role of Libraries in Digital Learning Innovation?
“it will definitely be a more sustainable initiative if it is collaborative — whether it’s OER, open access journals, etc… if the library wants to go alone it will go fast, but if it goes with others it will go much further.”
The #DLNchat community concurred there are ample opportunities for library-led collaborations in digital learning across campus. “Curation is key.”
OER = “Faculty + Librarians + Digital Media Experts = Engaging Content 4 learners.”
considering exactly that: how to create “librarians on demand” to meet students and faculty in dining halls, coffee shops, study lounges, or wherever they may be conducting their scholarly work.
digital ethics, which I define simply as “doing the right thing at the intersection of technology innovation and accepted social values.”
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, written by Cathy O’Neil in early 2016, continues to be relevant and illuminating. O’Neil’s book revolves around her insight that “algorithms are opinions embedded in code,” in distinct contrast to the belief that algorithms are based on—and produce—indisputable facts.
Safiya Umoja Noble’s book Algorithms of Oppression: How Search Engines Reinforce Racism
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power
PROGRAM FEES: $2,300
STARTS ON: November 28, 2018
2 months, online
6–8 hours per week
A Digital Revolution Is Underway.
In a rapidly expanding digital marketplace, legacy companies without a clear digital transformation strategy are being left behind. How can we stay on top of rapid—and sometimes radical—change? How can we position our organizations to take advantage of new technologies? How can we track and combat the security threats facing all of us as we are swept forward into the future?
Who is this Program for?
Professionals in traditional companies poised to implement strategic change, as well as entrepreneurs seeking to harness the opportunities afforded by new technologies, will learn the fundamentals of digital transformation and secure the necessary tools to navigate their enterprise to a digital platform.
Participants come from a wide range of industries and include C-suite executives, business consultants, corporate attorneys, risk officers, marketing, R&D, and innovation enablers.
Your Learning Journey
This online program takes you through the fundamentals of digital technologies transforming our world today. Led by MIT faculty at the forefront of data science, participants will learn the history and application of transformative technologies such as blockchain, artificial intelligence, cloud computing, IoT, and cybersecurity as well as the implications of employing—or ignoring—digitalization.
The term “digital humanities” can refer to research and instruction that is about information technology or that uses IT. By applying technologies in new ways, the tools and methodologies of digital humanities open new avenues of inquiry and scholarly production. Digital humanities applies computational capabilities to humanistic questions, offering new pathways for scholars to conduct research and to create and publish scholarship. Digital humanities provides promising new channels for learners and will continue to influence the ways in which we think about and evolve technology toward better and more humanistic ends.
As defined by Johanna Drucker and colleagues at UCLA, the digital humanities is “work at the intersection of digital technology and humanities disciplines.” An EDUCAUSE/CNI working group framed the digital humanities as “the application and/or development of digital tools and resources to enable researchers to address questions and perform new types of analyses in the humanities disciplines,” and the NEH Office of Digital Humanities says digital humanities “explore how to harness new technology for humanities research as well as those that study digital culture from a humanistic perspective.” Beyond blending the digital with the humanities, there is an intentionality about combining the two that defines it.
digital humanities can include
creating digital texts or data sets;
cleaning, organizing, and tagging those data sets;
applying computer-based methodologies to analyze them;
and making claims and creating visualizations that explain new findings from those analyses.
Scholars might reflect on
how the digital form of the data is organized,
how analysis is conducted/reproduced, and
how claims visualized in digital form may embody assumptions or biases.
Digital humanities can enrich pedagogy as well, such as when a student uses visualized data to study voter patterns or conducts data-driven analyses of works of literature.
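As a toy illustration of the workflow listed above (create a digital text, clean and tokenize it, analyze it computationally), here is a minimal, hedged word-frequency sketch. The one-line "corpus" and the stopword list are invented stand-ins; a real project would load digitized texts and use richer NLP tooling.

```python
import re
from collections import Counter

def word_frequencies(text, stopwords=frozenset({"the", "a", "of", "and", "to"})):
    """Tokenize a text, drop a small stopword list, and count word frequencies."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t not in stopwords)

# A tiny stand-in "corpus"; a real project would load full digitized texts.
sample = "It was the best of times, it was the worst of times."
freq = word_frequencies(sample)
print(freq.most_common(3))
```

Counts like these are the raw material for the claims and visualizations described above, and also for the critical questions that follow: the stopword list itself is a choice that embodies assumptions about what counts as meaningful.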
Digital humanities usually involves work by teams in collaborative spaces or centers. Team members might include
researchers and faculty from multiple disciplines,
data scientists and preservation experts,
technologists with expertise in critical computing and computing methods, and
undergraduate students.
Although some disciplinary associations, including the Modern Language Association and the American Historical Association, have developed guidelines for evaluating digital projects, many institutions have yet to define how work in digital humanities fits into considerations for tenure and promotion.
Because large projects are often developed with external funding that is not readily replaced by institutional funds when the grant ends, sustainability is a concern. Doing digital humanities well requires access to expertise in methodologies and tools such as GIS, modeling, programming, and data visualization that can be expensive for a single institution to obtain.
Resistance to learning new technologies can be another roadblock, as can the propensity of many humanists to resist working in teams. While some institutions have recognized the need for institutional infrastructure (computation and storage, equipment, software, and expertise), many have not yet incorporated such support into ongoing budgets.
Digital humanities creates opportunities for undergraduate involvement in research, providing students with workplace skills such as data management, visualization, coding, and modeling. It offers new insights into policy-making in areas such as social media and demographics, and new means of engaging with popular culture and understanding past cultures. Evolution in this area will continue to build connections between the humanities and other disciplines, cross-pollinating research and education in areas like medicine and environmental studies. Insights about digital humanities itself will drive innovation in pedagogy and expand our conceptualization of classrooms and labs.
deMaine, S. D., Lemmer, C. A., Keele, B. J., & Alcasid, H. (2015). Using Digital Badges to Enhance Research Instruction in Academic Libraries. In B. L. Eden (Ed.), Enhancing Teaching and Learning in the 21st-Century Academic Library: Successful Innovations That Make a Difference. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2882671
At their best, badges can create a sort of interactive e-resume.
the librarian may be invited into the classroom, or the students may be sent to the library for a single research lesson on databases and search terms, which is not enough for truly high-quality research. A better alternative may be for the professor to require the students to complete a series of badges (designed, implemented, and managed by the librarian) that build thorough research skills and ultimately produce a better paper.
Meta-badges are simply badges that indicate completion of multiple related badges.
Authentication (determining that the badge has not been altered) and validation/verification (checking that the badge has actually been earned and issued by the stated issuer) are major concerns. It is also important, particularly in the academic context, to make sure that the badge does not come to replace the learning it represents. A badge is a symbol that other skills and knowledge exist in this individual’s portfolio of skills and talents. Therefore, badges awarded in the educational context must reflect time and effort and be based on vetted standards, or they will become empty symbols.
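These authentication and validation concerns are what the Open Badges specification addresses. As a hedged sketch (the issuer URLs, salt, and email address are hypothetical), an Open Badges 2.0-style "hosted" assertion with a hashed recipient identity might look roughly like this:

```python
import hashlib

def hashed_recipient(email: str, salt: str) -> dict:
    """Open Badges-style hashed identity: sha256 over email + salt,
    so the assertion can be published without exposing the address."""
    digest = hashlib.sha256((email + salt).encode("utf-8")).hexdigest()
    return {"type": "email", "hashed": True, "salt": salt,
            "identity": "sha256$" + digest}

# Hypothetical badge assertion; all URLs and values are illustrative only.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://library.example.edu/badges/assertions/42",
    "recipient": hashed_recipient("student@example.edu", "deadsea"),
    "badge": "https://library.example.edu/badges/database-research",
    # "hosted" verification: a consumer re-fetches the id URL from the
    # issuer's site to confirm the badge was really issued and unaltered.
    "verification": {"type": "hosted"},
    "issuedOn": "2018-03-01T00:00:00Z",
}
```

The hosted-verification round trip is what makes a displayed badge more than an image: anyone can check it against the issuer's own record.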
Digital credentialing recognizes “learning of many kinds which are acquired beyond formal education institutions ...; it proliferates and disperses authority over what learning to recognize; and it provides a means of translation and commensuration across multiple spheres” (Olneck, 2012, p. 1).
University digital badge projects are rarely a top-down undertaking. Typically, digital badge programs arise from collaborative efforts “of people agitating from the middle” (Raths, 2013).
A learning management system (LMS) is never the solution to every problem in education. Edtech is just one part of the whole learning ecosystem and student experience.
Therefore, the next generation digital learning environment (NGDLE), as envisioned by EDUCAUSE in 2015 … Looking at the NGDLE requirements from an LMS perspective, I view the NGDLE as being about five areas: interoperability; personalization; analytics, advising, and learning assessment; collaboration; and accessibility and universal design.
Interoperability
Content can easily be exchanged between systems.
Users are able to leverage the tools they love, including discipline-specific apps.
Learning data is available to trusted systems and people who need it.
The learning environment is “future proof” so that it can adapt and extend as the ecosystem evolves.
Personalization
The learning environment reflects individual preferences.
Departments, divisions, and institutions can be autonomous.
Instructors teach the way they want and are not constrained by the software design.
There are clear, individual learning paths.
Students have choice in activity, expression, and engagement.
Analytics, Advising, and Learning Assessment
Learning analytics helps to identify at-risk students, course progress, and adaptive learning pathways.
The learning environment enables integrated planning and assessment of student performance.
More data is made available, with greater context around the data.
The learning environment supports platform and data standards.
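Real early-alert systems behind the "identify at-risk students" item above use statistical models over LMS activity data. As a toy, hedged illustration (the thresholds, field names, and students are all invented), a rule-based flag might look like this:

```python
from dataclasses import dataclass

@dataclass
class Activity:
    student: str
    logins_last_week: int
    avg_grade: float       # 0-100 scale
    missed_deadlines: int

def at_risk(a: Activity, min_logins=2, min_grade=70.0, max_missed=1) -> bool:
    """Toy early-alert rule: flag a student when any signal crosses a threshold.
    Production learning-analytics models are statistical, not hard-coded."""
    return (a.logins_last_week < min_logins
            or a.avg_grade < min_grade
            or a.missed_deadlines > max_missed)

roster = [
    Activity("ada", logins_last_week=5, avg_grade=88.0, missed_deadlines=0),
    Activity("ben", logins_last_week=0, avg_grade=91.0, missed_deadlines=0),
]
flags = {a.student: at_risk(a) for a in roster}
```

Even this crude version shows why "greater context around the data" matters: a strong student who simply stopped logging in is a different intervention case than one with failing grades.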
Collaboration
Individual spaces persist after courses and after graduation.
Learners are encouraged as creators and consumers.
Courses include public and private spaces.
Accessibility and Universal Design
Accessibility is part of the design of the learning experience.
The learning environment enables adaptive learning and supports different types of materials.
Learning design includes measurement rubrics and quality control.
The core analogy used in the NGDLE paper is that each component of the learning environment is a Lego brick:
The days of the LMS as a “walled garden” app that does everything are over.
Today many kinds of amazing learning and collaboration tools (Lego bricks) should be accessible to educators.
We have standards that let these tools (including an LMS) talk to each other. That is, all bricks share some properties that let them fit together.
Students and teachers sign in once to this “ecosystem of bricks.”
The bricks share results and data.
These bricks fit together; they can be interchanged and swapped at will, with confidence that the learning experience will continue uninterrupted.
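The claim that the bricks can "talk to each other" rests on shared standards such as IMS LTI, Caliper, and xAPI. As a hedged sketch (the course and learner URLs are illustrative, and this omits the HTTP call to a learning record store), a minimal xAPI-style actor-verb-object statement can be built like this:

```python
import json

def xapi_statement(actor_email: str, verb_id: str, verb_name: str,
                   object_id: str) -> str:
    """Build a minimal xAPI statement: actor, verb, object.
    A shared vocabulary like this is what lets independently built
    tools report learning activity into a common record store."""
    statement = {
        "actor": {"mbox": f"mailto:{actor_email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {"id": object_id, "objectType": "Activity"},
    }
    return json.dumps(statement)

# Hypothetical quiz tool reporting a completion event.
stmt = xapi_statement("learner@example.edu",
                      "http://adlnet.gov/expapi/verbs/completed", "completed",
                      "https://lms.example.edu/courses/101/quiz/3")
```

Because every brick emits the same actor-verb-object shape, the "central nervous system" described later can aggregate activity without knowing each tool's internals.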
Any “next-gen” attempt to completely rework the pedagogical model and introduce a “mash-up of whatever” to fulfill this model would fall victim to the same criticisms levied at the LMS today: there is too little time and training to expect faculty to figure out the nuances of implementation on their own.
The Lego metaphor works only if we’re talking about “old school” Lego design — bricks of two, three, and four-post pieces that neatly fit together. Modern edtech is a lot more like the modern Lego. There are wheels and rocket launchers and belts and all kinds of amazing pieces that work well with each other, but only when they are configured properly. A user cannot simply stick together different pieces and assume they will work harmoniously in creating an environment through which each student can be successful.
As the NGDLE paper states: “Despite the high percentages of LMS adoption, relatively few instructors use its more advanced features — just 41% of faculty surveyed report using the LMS ‘to promote interaction outside the classroom.'”
But this is what the next generation LMS is good at: being a central nervous system — or learning hub — through which a variety of learning activities and tools are used. This is also where the LMS needs to go: bringing together and making sense of all the amazing innovations happening around it. This is much harder to do, perhaps even impossible, if all the pieces involved are just bricks without anything to orchestrate them or to weave them together into a meaningful, personal experience for achieving well-defined learning outcomes.
Making a commitment to build easy, flexible, and smart technology
Working with colleges and universities to remove barriers to adopting new tools in the ecosystem
Standardizing the vetting of accessibility compliance (the Strategic Nonvisual Access Partner Program from the National Federation of the Blind is a great start)
Advancing standards for data exchange while protecting individual privacy
Building integrated components that work with the institutions using them — learning quickly about what is and is not working well and applying those lessons to the next generation of interoperability standards
Letting people use the tools they love [SIC] and providing more ways for nontechnical individuals (including students) to easily integrate new features into learning activities
My note: something just refused to be accepted at SCSU
Technologists are often very focused on the technology, but the reality is that the more deeply and closely we understand the pedagogy and the people in the institutions — students, faculty, instructional support staff, administrators — the better suited we are to actually making the tech work for them.
Under the Hood of a Next Generation Digital Learning Environment in Progress
The challenge is that although 85 percent of faculty use a campus learning management system (LMS),1 a recent Blackboard report found that, out of 70,000 courses across 927 North American institutions, 53 percent of LMS usage was classified as supplemental (content-heavy, low interaction) and 24 percent as complementary (one-way communication via content/announcements/gradebook).2 Only 11 percent were characterized as social, 10 percent as evaluative (heavy use of assessment), and 2 percent as holistic (balanced use of all previous). Our FYE course required innovating beyond the supplemental course-level LMS to create a more holistic cohort-wide NGDLE in order to fully support the teaching, learning, and student success missions of the program. The key design goals for our NGDLE were to:
Create a common platform that could deliver a standard curriculum and achieve parity in all course sections using existing systems and tools and readily available content
Capture, store, and analyze any generated learner data to support learning assessment, continuous program improvement, and research
Develop reports and actionable analytics for administrators, advisors, instructors, and students
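As a toy sketch of the "reports and actionable analytics" goal above (the event format, section names, and students are invented stand-ins for whatever the real data store holds), a per-section completion roll-up for administrators might look like this:

```python
from collections import defaultdict

def section_completion(events):
    """Aggregate (section, student, completed) records into per-section
    completion rates, the kind of roll-up an administrator report shows."""
    enrolled = defaultdict(set)
    done = defaultdict(set)
    for section, student, completed in events:
        enrolled[section].add(student)
        if completed:
            done[section].add(student)
    return {s: len(done[s]) / len(enrolled[s]) for s in enrolled}

# Hypothetical event log; a real NGDLE would pull this from captured learner data.
events = [("FYE-01", "a", True), ("FYE-01", "b", False), ("FYE-02", "c", True)]
rates = section_completion(events)
```

The same aggregation, sliced differently, serves the other audiences named above: per-student for advisors, per-assignment for instructors.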