Additional related question: why not use already existing solutions, as used across the world? Alex's response: open source. Tim: content available across institutions; text banks and other data can be grouped by disciplines. Follow-up question: MLNC, OER Commons. A solution already exists, so why don't we use the existing accumulated work? Answer by Karen: pulling many resources, promoting collaboration between 2- and 4-year institutions. Bigger than just having a repository: a collaborative effort on different levels.
Access to a “sandbox” to test Islandora: who to contact when and how.
Alex's response to "estimated date for faculty upload": approximately August 2018.
Transferability/compatibility: how easy is it to migrate Islandora content to a different platform (e.g. the Minnesota Library Publishing Project), should another platform be chosen as the MN OER platform?
How will this structure ensure that the OER initiative (Islandora in particular) is not "owned" by one branch on campus (e.g. librarians) but is a mutual effort by faculty and staff (e.g. ATT), in terms of access, e.g. access to different admin levels in Islandora?
From the Adobe Connect online attendees:
Barbara Sandarin: Regarding “Admin. Rights,” does this restrict who may upload items?
Maintenance: weeding out old materials
the history of Islandora: who developed it and when. 2009, U of Rhode Island (my note: Islandora actually originated at the University of Prince Edward Island)
Stephen Kelly: how does Islandora integrate video? Microsite solutions.
structure of repository:
Islandora only stores content; the actual creation happens outside the scope of Islandora adoption
how are the individual teams built, and how do they communicate?
open pedagogy: students creating open textbooks; creation of a D2L courseroom. Karen: learning circles. Gary Hunter's form regarding copyright issues, etc.
storage: unlimited for now, but could become an issue if file sizes are large.
Robert Bilyk: Look at OpenStax on how they handle derivative content
Tim: what do we want to be able to search for:
1. title
2. subject
3. format
4. type
5. permission to modify or not
6. keywords
7. author
8. home institution of author
9. peer reviewed
10. author info (advanced feature)
11. Robert Bilyk: assurance of accessibility (tables, images, etc.)
12. course
13. hashtags
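As a sketch of how such a field list might translate into a repository record and a simple faceted search, here is a minimal Python model. The field names and sample records are illustrative only, not Islandora's actual schema:

```python
# A minimal sketch of the searchable-fields wish list as a record structure
# plus a filter function. Field names are illustrative, not Islandora's schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class OERRecord:
    title: str
    subject: str
    format: str            # e.g. "PDF", "EPUB"
    resource_type: str     # e.g. "textbook", "test bank"
    modifiable: bool       # permission to modify or not
    keywords: List[str] = field(default_factory=list)
    author: str = ""
    home_institution: str = ""
    peer_reviewed: bool = False

def search(records, **criteria):
    """Return records whose attributes match every given criterion."""
    return [r for r in records
            if all(getattr(r, k) == v for k, v in criteria.items())]

records = [
    OERRecord("Intro to Statistics", "Mathematics", "PDF", "textbook",
              True, ["statistics"], "J. Doe", "SCSU", True),
    OERRecord("World History Reader", "History", "EPUB", "reader",
              False, ["history"], "A. Smith", "MSU", False),
]
hits = search(records, subject="Mathematics", peer_reviewed=True)
```

In a real repository these facets would live in the search index (e.g. Solr under Islandora) rather than in application code, but the record/facet shape is the same.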
Robert Bilyk: Curriki allows any submission — but their editorial board eventually gets around to review — and then this is indicated
Guardtime – This company is creating “keyless” signature systems using blockchain which is currently used to secure the health records of one million Estonian citizens.
REMME is a decentralized authentication system which aims to replace logins and passwords with SSL certificates stored on a blockchain.
Gem – This startup is working with the Centers for Disease Control to put disease outbreak data onto a blockchain, which it says will increase the effectiveness of disaster relief and response.
SimplyVital Health – Has two health-related blockchain products in development, ConnectingCare which tracks the progress of patients after they leave the hospital, and Health Nexus, which aims to provide decentralized blockchain patient records.
MedRec – An MIT project involving blockchain electronic medical records designed to manage authentication, confidentiality and data sharing.
ABRA – A cryptocurrency wallet which uses the Bitcoin blockchain to hold and track balances stored in different currencies.
Bank Hapoalim – A collaboration between the Israeli bank and Microsoft to create a blockchain system for managing bank guarantees.
Barclays – Barclays has launched a number of blockchain initiatives involving tracking financial transactions, compliance and combating fraud. It states that “Our belief …is that blockchain is a fundamental part of the new operating system for the planet.”
Maersk – The shipping and transport consortium has unveiled plans for a blockchain solution for streamlining marine insurance.
Aeternity – Allows the creation of smart contracts which become active when network consensus agrees that conditions have been met – allowing for automated payments to be made when parties agree that conditions have been met, for example.
Augur – Allows the creation of blockchain-based predictions markets for the trading of derivatives and other financial instruments in a decentralized ecosystem.
Manufacturing and industrial
Provenance – This project aims to provide a blockchain-based provenance record of transparency within supply chains.
Jiocoin – India’s biggest conglomerate, Reliance Industries, has said that it is developing a blockchain-based supply chain logistics platform along with its own cryptocurrency, Jiocoin.
Hijro – Previously known as Fluent, aims to create a blockchain framework for collaborating on prototyping and proof-of-concept.
SKUChain – Another blockchain system for allowing tracking and tracing of goods as they pass through a supply chain.
Blockverify – A blockchain platform which focuses on anti-counterfeit measures, with initial use cases in the diamond, pharmaceuticals and luxury goods markets.
Transactivgrid – A business-led community project based in Brooklyn allowing members to locally produce and sell energy, with the goal of reducing costs involved in energy distribution.
STORJ.io – Distributed and encrypted cloud storage, which allows users to share unused hard drive space.
Dubai – Dubai has set sights on becoming the world’s first blockchain-powered state. In 2016 representatives of 30 government departments formed a committee dedicated to investigating opportunities across health records, shipping, business registration and preventing the spread of conflict diamonds.
Estonia – The Estonian government has partnered with Ericsson on an initiative involving creating a new data center to move public records onto the blockchain.
South Korea – Samsung is creating blockchain solutions for the South Korean government which will be put to use in public safety and transport applications.
Govcoin – The UK Department of Work and Pensions is investigating using blockchain technology to record and administer benefit payments.
Democracy.earth – This is an open-source project aiming to enable the creation of democratically structured organizations, and potentially even states or nations, using blockchain tools.
Followmyvote.com – Allows the creation of secure, transparent voting systems, reducing opportunities for voter fraud and increasing turnout through improved accessibility to democracy.
Bitgive – This service aims to provide greater transparency to charity donations and clearer links between giving and project outcomes. It is working with established charities including Save The Children, The Water Project and Medic Mobile.
OpenBazaar – OpenBazaar is an attempt to build a decentralized market where goods and services can be traded with no middle-man.
Loyyal – This is a blockchain-based universal loyalty framework, which aims to allow consumers to combine and trade loyalty rewards in new ways, and retailers to offer more sophisticated loyalty packages.
Blockpoint.io – Allows retailers to build payment systems around blockchain currencies such as Bitcoin, as well as blockchain derived gift cards and loyalty schemes.
Ubiquity – This startup is creating a blockchain-driven system for tracking the complicated legal process which creates friction and expense in real estate transfer.
Transport and Tourism
IBM Blockchain Solutions – IBM has said it will go public with a number of non-finance related blockchain initiatives with global partners in 2018. This video envisages how efficiencies could be driven in the vehicle leasing industry.
Arcade City – An application which aims to beat Uber at their own game by moving ride sharing and car hiring onto the blockchain.
La’Zooz – A community-owned platform for synchronizing empty seats with passengers in need of a lift in real-time.
Webjet – The online travel portal is developing a blockchain solution to allow stock of empty hotel rooms to be efficiently tracked and traded, with payment fairly routed to the network of middle-men sites involved in filling last-minute vacancies.
Kodak – Kodak recently sent its stock soaring after announcing that it is developing a blockchain system for tracking intellectual property rights and payments to photographers.
Ujomusic – Founded by singer-songwriter Imogen Heap to record and track royalties for musicians, as well as allowing them to create a record of ownership of their work.
It is exciting to see all these developments. Surely not all of them will become successful long-term ventures, but if they indicate one thing, it is the vast potential that blockchain technology offers.
The EDUCAUSE Learning Initiative has just launched its 2018 Key Issues in Teaching and Learning Survey, so vote today: http://www.tinyurl.com/ki2018.
Each year, the ELI surveys the teaching and learning community in order to discover the key issues and themes in teaching and learning. These top issues provide the thematic foundation or basis for all of our conversations, courses, and publications for the coming year. Longitudinally they also provide the way to track the evolving discourse in the teaching and learning space. More information about this annual survey can be found at https://www.educause.edu/eli/initiatives/key-issues-in-teaching-and-learning.
ACADEMIC TRANSFORMATION (Holistic models supporting student success, leadership competencies for academic transformation, partnerships and collaborations across campus, IT transformation, academic transformation that is broad, strategic, and institutional in scope)
ACCESSIBILITY AND UNIVERSAL DESIGN FOR LEARNING (Supporting and educating the academic community in effective practice; intersections with instructional delivery modes; compliance issues)
ADAPTIVE TEACHING AND LEARNING (Digital courseware; adaptive technology; implications for course design and the instructor’s role; adaptive approaches that are not technology-based; integration with LMS; use of data to improve learner outcomes)
COMPETENCY-BASED EDUCATION AND NEW METHODS FOR THE ASSESSMENT OF STUDENT LEARNING (Developing collaborative cultures of assessment that bring together faculty, instructional designers, accreditation coordinators, and technical support personnel, real world experience credit)
DIGITAL AND INFORMATION LITERACIES (Student and faculty literacies; research skills; data discovery, management, and analysis skills; information visualization skills; partnerships for literacy programs; evaluation of student digital competencies; information evaluation)
EVALUATING TECHNOLOGY-BASED INSTRUCTIONAL INNOVATIONS (Tools and methods to gather data; data analysis techniques; qualitative vs. quantitative data; evaluation project design; using findings to change curricular practice; scholarship of teaching and learning; articulating results to stakeholders; just-in-time evaluation of innovations). Here is my bibliographical overview on Big Data (scroll down to "Research literature"): http://blog.stcloudstate.edu/ims/2017/11/07/irdl-proposal/
EVOLUTION OF THE TEACHING AND LEARNING SUPPORT PROFESSION (Professional skills for T&L support; increasing emphasis on instructional design; delineating the skills, knowledge, business acumen, and political savvy for success; role of inter-institutional communities of practices and consortia; career-oriented professional development planning)
FACULTY DEVELOPMENT (Incentivizing faculty innovation; new roles for faculty and those who support them; evidence of impact on student learning/engagement of faculty development programs; faculty development intersections with learning analytics; engagement with student success)
GAMIFICATION OF LEARNING (Gamification designs for course activities; adaptive approaches to gamification; alternate reality games; simulations; technological implementation options for faculty)
INTEGRATED PLANNING AND ADVISING FOR STUDENT SUCCESS (Change management and campus leadership; collaboration across units; integration of technology systems and data; dashboard design; data visualization (here previous blog postings on this issue: http://blog.stcloudstate.edu/ims?s=data+visualization); counseling and coaching advising transformation; student success analytics)
LEARNING ANALYTICS (Leveraging open data standards; privacy and ethics; both faculty and student facing reports; implementing; learning analytics to transform other services; course design implications)
LEARNING SPACE DESIGNS (Makerspaces; funding; faculty development; learning designs across disciplines; supporting integrated campus planning; ROI; accessibility/UDL; rating of classroom designs)
MICRO-CREDENTIALING AND DIGITAL BADGING (Design of badging hierarchies; stackable credentials; certificates; role of open standards; ways to publish digital badges; approaches to meta-data; implications for the transcript; personalized learning transcripts and blockchain technology (here previous blog postings on this issue: http://blog.stcloudstate.edu/ims?s=blockchain))
MOBILE LEARNING (Curricular use of mobile devices (here previous blog postings on this issue))
MULTI-DIMENSIONAL TECHNOLOGIES (Virtual, augmented, mixed, and immersive reality; video walls; integration with learning spaces; scalability, affordability, and accessibility; use of mobile devices; multi-dimensional printing and artifact creation)
NEXT-GENERATION DIGITAL LEARNING ENVIRONMENTS AND LMS SERVICES (Open standards; learning environments architectures (here previous blog postings on this issue: http://blog.stcloudstate.edu/ims/2017/03/28/digital-learning/); social learning environments; customization and personalization; OER integration; intersections with learning modalities such as adaptive, online, etc.; LMS evaluation, integration and support)
ONLINE AND BLENDED TEACHING AND LEARNING (Flipped course models; leveraging MOOCs in online learning; course development models; intersections with analytics; humanization of online courses; student engagement)
OPEN EDUCATION (Resources, textbooks, content; quality and editorial issues; faculty development; intersections with student success/access; analytics; licensing; affordability; business models; accessibility and sustainability)
PRIVACY AND SECURITY (Formulation of policies on privacy and data protection; increased sharing of data via open standards for internal and external purposes; increased use of cloud-based and third party options; education of faculty, students, and administrators)
WORKING WITH EMERGING LEARNING TECHNOLOGY (Scalability and diffusion; effective piloting practices; investments; faculty development; funding; evaluation methods and rubrics; interoperability; data-driven decision-making)
10. The Virtualized Library: A Librarian’s Introduction to Docker and Virtual Machines
This session will introduce two major types of virtualization, virtual machines using tools like VirtualBox and Vagrant, and containers using Docker. The relative strengths and drawbacks of the two approaches will be discussed along with plenty of hands-on time. Though geared towards integrating these tools into a development workflow, the workshop should be useful for anyone interested in creating stable and reproducible computing environments, and examples will focus on library-specific tools like Archivematica and EZPaarse. With virtualization taking a lot of the pain out of installing and distributing software, alleviating many cross-platform issues, and becoming increasingly common in library and industry practices, now is a great time to get your feet wet.
(One three-hour session)
11. Digital Empathy: Creating Safe Spaces Online
User research is often focused on measures of the usability of online spaces. We look at search traffic, run card sorting and usability testing activities, and track how users navigate our spaces. Those results inform design decisions through the lens of information architecture. This is important, but doesn’t encompass everything a user needs in a space.
This workshop will focus on the other component of user experience design and user research: how to create spaces where users feel safe. Users bring their anxieties and stressors with them to our online spaces, but informed design choices can help to ameliorate that stress. This will ultimately lead to a more positive interaction between your institution and your users.
The presenters will discuss the theory behind empathetic design, delve deeply into using ethnographic research methods – including an opportunity for attendees to practice those ethnographic skills with student participants – and finish with the practical application of these results to ongoing and future projects.
(One three-hour session)
14. ARIA Basics: Making Your Web Content Sing Accessibility
(One three-hour session)
18. Learning and Teaching Tech
Tech workshops pose two unique problems: finding skilled instructors for that content, and instructing that content well. Library hosted workshops are often a primary educational resource for solo learners, and many librarians utilize these workshops as a primary outreach platform. Tackling these two issues together often makes the most sense for our limited resources. Whether a programming language or software tool, learning tech to teach tech can be one of the best motivations for learning that tech skill or tool, but equally important is to learn how to teach and present tech well.
This hands-on workshop will guide participants through developing their own learning plan, reviewing essential pedagogy for teaching tech, and crafting a workshop of their choice. Each participant will leave with an actionable learning schedule, a prioritized list of resources to investigate, and an outline of a workshop they would like to teach.
(Two three-hour sessions)
23. Introduction to Omeka S
Omeka S represents a complete rewrite of Omeka Classic (aka the Omeka 2.x series), adhering to our fundamental principles of encouraging use of metadata standards, easy web publishing, and sharing cultural history. New objectives in Omeka S include multisite functionality and increased interaction with other systems. This workshop will compare and contrast Omeka S with Omeka Classic to highlight our emphasis on 1) modern metadata standards, 2) interoperability with other systems including Linked Open Data, 3) use of modern web standards, and 4) web publishing to meet the goals of medium- to large-sized institutions.
In this workshop we will walk through Omeka S Item creation, with emphasis on LoD principles. We will also look at the features of Omeka S that ease metadata input and facilitate project-defined usage and workflows. In accordance with our commitment to interoperability, we will describe how the API for Omeka S can be deployed for data exchange and sharing between many systems. We will also describe how Omeka S promotes multiple site creation from one installation, in the interest of easy publishing with many objects in many contexts, and simplifying the work of IT departments.
(One three-hour session)
24. Getting started with static website generators
Have you been curious about static website generators? Have you been wondering who Jekyll and Hugo are? Then this workshop is for you.
This workshop isn't about setting up a domain name and hosting for your website; it's about the step after that, the actual making of the site. The typical choice for a lot of people would be to use something like WordPress. It's a one-click install on most hosting providers, and there's a gigantic market of plugins and themes available to choose from, depending on the type of site you're trying to build. But not only is WordPress a bit overkill for most websites, it also gives you a dynamically generated site with a lot of moving parts. If you don't keep all of those pieces up to date, they can pose a significant security risk and your site could get hijacked.
In this hands-on workshop, we’ll start by exploring static website generators, their components, some of the different options available, and their benefits and disadvantages. Then, we’ll work on making our own sites, and for those that would like to, get them online with GitHub pages. Familiarity with HTML, git, and command line basics will be helpful but are not required.
(One three-hour session)
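What tools like Jekyll and Hugo automate can be sketched in a few lines of plain Python: turn source content plus a shared template into ready-to-serve HTML files. This toy generator is a sketch only (all names invented for illustration), but it shows the core idea of a static build step with no database or server-side code:

```python
# A toy static site generator: render (title, body) pairs through one shared
# template into static HTML files, the essential job Jekyll and Hugo perform.
from pathlib import Path
from string import Template
import tempfile

PAGE = Template("<html><head><title>$title</title></head>"
                "<body><h1>$title</h1><p>$body</p></body></html>")

def build_site(sources: dict, out_dir: Path) -> list:
    """Render each (title, body) pair to a static HTML file in out_dir."""
    out_dir.mkdir(exist_ok=True)
    pages = []
    for slug, (title, body) in sources.items():
        html = PAGE.substitute(title=title, body=body)
        target = out_dir / f"{slug}.html"
        target.write_text(html)
        pages.append(target)
    return pages

site = build_site({"about": ("About", "A static page, no database needed.")},
                  Path(tempfile.mkdtemp()))
```

Real generators add Markdown parsing, theming, and asset pipelines on top, but the output is the same: plain files you can host anywhere, including GitHub Pages.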
26. Using Digital Media for Research and Instruction
To use digital media effectively in both research and instruction, you need to go beyond just the playback of media files. You need to be able to stream the media, divide that stream into different segments, provide descriptive analysis of each segment, order, re-order and compare different segments from the same or different streams and create web sites that can show the result of your analysis. In this workshop, we will use Omeka and several plugins for working with digital media, to show the potential of video streaming, segmentation and descriptive analysis for research and instruction.
(One three-hour session)
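The segmentation-and-descriptive-analysis workflow described above can be sketched with plain data structures. This Python sketch (fields invented for illustration, no actual media handling) shows the shape of the data that streaming/segmentation plugins manage:

```python
# A sketch of media segmentation: represent a stream as timestamped segments
# with descriptive notes, then reorder them for comparison. Real Omeka plugins
# do this against actual media streams; this stands in with plain data.
from dataclasses import dataclass

@dataclass
class Segment:
    start: float          # seconds into the stream
    end: float
    description: str

stream = [
    Segment(0.0, 42.5, "opening credits"),
    Segment(42.5, 310.0, "interview, part 1"),
    Segment(310.0, 480.0, "archival footage"),
]

# Reorder segments by duration (longest first) for comparison
by_length = sorted(stream, key=lambda s: s.end - s.start, reverse=True)
```

The same structure supports comparing segments across different streams: each segment carries its own timestamps and description, independent of playback.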
28. Spark in the Dark 101 https://zeppelin.apache.org/
This is an introductory session on Apache Spark, a framework for large-scale data processing (https://spark.apache.org/). We will introduce high-level concepts around Spark, including how Spark execution works and its relationship to other technologies for working with Big Data. Following this introduction to the theory and background, we will walk workshop participants through hands-on usage of spark-shell, Zeppelin notebooks, and Spark SQL for processing library data. The workshop will wrap up with use cases and demos for leveraging Spark within cultural heritage institutions and information organizations, connecting the building blocks learned to current projects in the real world.
(One three-hour session)
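To show the shape of the computation that spark-shell and Zeppelin notebooks run at cluster scale, here is the canonical word count sketched in plain Python (no Spark installation required); the rough PySpark equivalent is noted in a comment:

```python
# The canonical word count, in plain Python, to show the flatMap -> map ->
# reduceByKey shape Spark distributes across a cluster.
from collections import Counter
from functools import reduce

lines = ["to be or not to be", "that is the question"]

# flatMap: split each line into words
words = [w for line in lines for w in line.split()]

# map + reduceByKey: count occurrences of each word
counts = reduce(lambda acc, w: acc + Counter({w: 1}), words, Counter())

# In PySpark this is roughly:
#   sc.textFile("lines.txt").flatMap(str.split) \
#     .map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)
```

The point of Spark is that the same three transformations run unchanged whether the input is two lines or two billion, with the framework handling partitioning and shuffling.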
29. Introduction to Spotlight https://github.com/projectblacklight/spotlight http://www.spotlighttechnology.com/4-OpenSource.htm
Spotlight is an open source application that extends the digital library ecosystem by providing a means for institutions to reuse digital content in easy-to-produce, attractive, and scholarly-oriented websites. Librarians, curators, and other content experts can build Spotlight exhibits to showcase digital collections using a self-service workflow for selection, arrangement, curation, and presentation.
This workshop will introduce the main features of Spotlight and present examples of Spotlight-built exhibits from the community of adopters. We’ll also describe the technical requirements for adopting Spotlight and highlight the potential to customize and extend Spotlight’s capabilities for their own needs while contributing to its growth as an open source project.
(One three-hour session)
31. Getting Started Visualizing your IoT Data in Tableau https://www.tableau.com/
The Internet of Things is a rising trend in library research. IoT sensors can be used for space assessment, service design, and environmental monitoring. IoT tools create lots of data that can be overwhelming and hard to interpret. Tableau Public (https://public.tableau.com/en-us/s/) is a data visualization tool that allows you to explore this information quickly and intuitively to find new insights.
This full-day workshop will teach you the basics of building your own IoT sensor using a Raspberry Pi (https://www.raspberrypi.org/) in order to gather, manipulate, and visualize your data.
All are welcome, but some familiarity with Python is recommended.
(Two three-hour sessions)
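The gather-then-visualize pipeline above can be sketched in a few lines of Python: simulate sensor readings (random stand-ins for real Raspberry Pi hardware) and write a CSV file that Tableau Public can ingest directly:

```python
# Simulate Raspberry Pi sensor readings and write them to CSV for Tableau.
# Values are randomly generated stand-ins for a real temperature/humidity sensor.
import csv
import random
from datetime import datetime, timedelta

def collect_readings(n: int, start: datetime) -> list:
    """Simulate n temperature/humidity readings, one per minute."""
    return [{"timestamp": (start + timedelta(minutes=i)).isoformat(),
             "temperature_c": round(random.uniform(18, 26), 2),
             "humidity_pct": round(random.uniform(30, 60), 2)}
            for i in range(n)]

def write_csv(rows: list, path: str) -> None:
    """Write readings with a header row, the layout Tableau expects."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

rows = collect_readings(5, datetime(2018, 1, 1))
write_csv(rows, "iot_readings.csv")
```

With a real sensor you would replace `collect_readings` with calls to the device's GPIO library; the CSV side stays the same.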
32. Enabling Social Media Research and Archiving
Social media data represents a tremendous opportunity for memory institutions of all kinds, be they large academic research libraries or small community archives. Researchers from a broad swath of disciplines have a great deal of interest in working with social media content, but they often lack access to datasets or the technical skills needed to create them. Further, it is clear that social media is already a crucial part of the historical record in areas ranging from events in your local community to national elections. But attempts to build archives of social media data are largely nascent. This workshop will be both an introduction to collecting data from the APIs of social media platforms, as well as a discussion of the roles of libraries and archives in that collecting.
Assuming no prior experience, the workshop will begin with an explanation of how APIs operate. We will then focus specifically on the Twitter API, as Twitter is of significant interest to researchers and hosts an important segment of discourse. Through a combination of hands-on and demos, we will gain experience with a number of tools that support collecting social media data (e.g., Twarc, Social Feed Manager, DocNow, Twurl, and TAGS), as well as tools that enable sharing social media datasets (e.g., Hydrator, TweetSets, and the Tweet ID Catalog).
The workshop will then turn to a discussion of how to build a successful program enabling social media collecting at your institution. This might cover a variety of topics including outreach to campus researchers, collection development strategies, the relationship between social media archiving and web archiving, and how to get involved with the social media archiving community. This discussion will be framed by a focus on ethical considerations of social media data, including privacy and responsible data sharing.
Time permitting, we will provide a sampling of some approaches to social media data analysis, including Twarc Utils and Jupyter Notebooks.
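One practical detail behind the dataset-sharing tools mentioned above (Hydrator, TweetSets, the Tweet ID Catalog): tweet datasets are typically shared as bare tweet IDs, which others then "hydrate" back into full records through the API. A minimal sketch with simulated tweet JSON records (the IDs and text here are invented):

```python
# "Dehydrate" a collection of tweets to a shareable list of IDs, one per line,
# the format hydration tools like Hydrator and twarc consume. The records
# below are simulated stand-ins for data collected from the Twitter API.
collected = [
    {"id_str": "951234567890123456", "text": "example tweet one"},
    {"id_str": "951234567890123457", "text": "example tweet two"},
]

# Keep only the IDs for sharing; the full content is re-fetched on hydration
tweet_ids = [t["id_str"] for t in collected]

with open("tweet_ids.txt", "w") as f:
    f.write("\n".join(tweet_ids))
```

Sharing IDs rather than full records is also one of the ethical practices the workshop discussion covers: deleted or protected tweets simply drop out when someone later hydrates the list.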
Augmented reality can be described as experiencing the real world with an overlay of additional computer generated content. In contrast, virtual reality immerses a user in an entirely simulated environment, while mixed or merged reality blends real and virtual worlds in ways through which the physical and the digital can interact. AR, VR, and MR offer new opportunities to create a psychological sense of immersive presence in an environment that feels real enough to be viewed, experienced, explored, and manipulated. These technologies have the potential to democratize learning by giving everyone access to immersive experiences that were once restricted to relatively few learners.
In Grinnell College’s Immersive Experiences Lab http://gciel.sites.grinnell.edu/, teams of faculty, staff, and students collaborate on research projects, then use 3D, VR, and MR technologies as a platform to synthesize and present their findings.
In terms of equity, AR, VR, and MR have the potential to democratize learning by giving all learners access to immersive experiences.
There is relatively little research about the most effective ways to use these technologies as instructional tools. Combined, these factors can be disincentives for institutions to invest in the equipment, facilities, and staffing that can be required to support these systems. AR, VR, and MR technologies raise concerns about personal privacy and data security. Further, at least some of these tools and applications currently fail to meet accessibility standards. The user experience in some AR, VR, and MR applications can be intensely emotional and even disturbing (my note: but can also be used for empathy literacy),
immersing users in recreated, remote, or even hypothetical environments as small as a molecule or as large as a universe, allowing learners to experience "reality" from multiple perspectives.
According to the email below, library faculty are asked to provide their feedback regarding the qualifications for a possible faculty line at the library.
In the fall of 2013, during a faculty meeting attended by the then library dean and during a discussion of an article provided by the dean, it was established that leading academic libraries in this country are seeking to break the mold of the "library degree" and seek fresh ideas for the reinvention of the academic library by hiring faculty with more diverse (degree-wise) backgrounds.
Is this still the case at the SCSU library? The "democratic" search for the answer to this question does not yield productive results, considering that the majority of the library faculty are "reference" and they "democratically" overturn the votes of those who want to see this library brought up to 21st-century standards, seeking instead more "reference" bodies for duties which were recognized, even by the same reference librarians, as obsolete.
It seems that the majority of the SCSU library faculty are "purists" who resist seeking professionals with a broader background (anything other than library, or even just "reference," skills).
In addition, most of the current SCSU librarians are opposed to a second degree, in the sense of acquiring more qualifications rather than just another diploma. There is a certain attitude of stagnation / intellectual incest, where new ideas are not generated and old ideas are prepped in "new attire" to look innovative and/or 21st-century.
Last but not least, a consistent complaint about workforce shortages (the attrition politics of the university's reorganization contribute to the power of such complaints) fuels the requests for reference librarians; instead of looking for new ideas, new approaches, and new work responsibilities, the library reorganization conversation deteriorates into squabbles for positions among different departments.
Most importantly, the narrow-sightedness of being stuck in traditional work descriptions prevents most of the librarians from seeing potential allies and disruptors. E.g., the insistence on the supremacy of "information literacy" leads SCSU librarians to the erroneous conclusion of the exceptionality of information literacy and to the disregard of multi[meta] literacies, thus depriving the entire campus of necessary 21st-century skills such as visual literacy, media literacy, technology literacy, etc.
Simultaneously, as mentioned above about potential allies and disruptors, the SCSU librarians insist on their "domain," and if they are not capable of leading meta-literacy instruction themselves, they will not allow and/or support others to do so.
Considering the observations above, the following qualifications must be considered:
According to the information in this blog post: http://blog.stcloudstate.edu/ims/2016/06/14/technology-requirements-samples/
for the past year and a half, academic libraries have been hiring specialists with the following qualifications and for the following positions (bolded and/or in red). Here are some highlights:
Positions
Librarian and Instructional Technology Liaison
Library Specialist: Data Visualization & Collections Analytics
Advanced degree required, preferably in education, educational technology, instructional design, or MLS with an emphasis in instruction and assessment.
Data visualization skills
multi [ meta] literacy skills
Data curation, helping students work with data
Experience with website creation and design in a CMS environment and accessibility and compliance issues
Demonstrated a high degree of facility with technologies and systems germane to the 21st-century library; well versed in the issues surrounding scholarly communications and compliance (e.g. author identifiers, data sharing software, repositories, among others)
Provides and develops awareness and knowledge related to digital scholarship and the research lifecycle for librarians and staff.
Experience developing for, and supporting, common open-source library applications such as Omeka, ArchivesSpace, DSpace
Responsibilities Establishing best practices for digital humanities labs, networks, and services
Assessing, evaluating, and peer reviewing DH projects and librarians
Actively promote TIGER or GRIC related activities through social networks and other platforms as needed.
Coordinates the transmission of online workshops through Google Hangouts
Script metadata transformations and digital object processing using BASH, Python, and XSLT
The liaison consults with faculty and students in a wide range of disciplines on best practices for teaching and using data/statistical software tools such as R, SPSS, Stata, and MATLAB.
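The scripting qualification listed above (BASH, Python, and XSLT for metadata transformations) typically means transforms like the following; a minimal Python sketch mapping an invented XML record to Dublin Core-style fields, using only the standard library:

```python
# A minimal example of metadata-transformation scripting: map a simple XML
# record to Dublin Core-style fields. The input schema is invented for
# illustration; real transforms target schemas like MODS or EAD.
import xml.etree.ElementTree as ET

SOURCE = """
<record>
  <name>Annual Report 1998</name>
  <creator>University Archives</creator>
  <year>1998</year>
</record>
"""

FIELD_MAP = {"name": "dc:title", "creator": "dc:creator", "year": "dc:date"}

def to_dublin_core(xml_text: str) -> dict:
    """Crosswalk source elements to Dublin Core field names."""
    root = ET.fromstring(xml_text)
    return {FIELD_MAP[child.tag]: child.text.strip()
            for child in root if child.tag in FIELD_MAP}

dc = to_dublin_core(SOURCE)
```

The same crosswalk is often written as an XSLT stylesheet instead; the Python version is shown here because it is easy to batch over thousands of objects in a processing pipeline.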
In response to the form attached to the Friday, September 29, email regarding St. Cloud State University Library Position Request Form:
Digital Initiatives Librarian
TBD, but generally:
– works with faculty across campus on promoting digital projects and other 21st-century projects. Works with the English Department faculty on positioning the SCSU library as an equal participant in the digital humanities initiatives on campus
Works with the Visualization lab to establish the library as the leading unit on campus in interpretation of big data
Works with academic technology services on promoting library faculty as the leading force in the pedagogical use of academic technologies.
Quantitative data justification
This is a moot requirement for an innovative and useful library position. It can apply to a traditional request, such as another “reference” librarian, but there cannot be a quantitative data justification for an innovative position, as explained to Keith Ewing in 2015: in order to accumulate such data, the position must have been functioning for at least six months.
Qualitative justification: Please provide qualitative explanation that supports need for this position.
Numerous 21st-century academic tendencies are currently scattered across campus and are the subject of political/power battles rather than a venue for campus collaboration and cooperation. Such a position could establish the library as the natural hub for “sandbox” activities across campus. It could redirect digital initiatives away from administrators’ political gains and return the generation and accomplishment of such initiatives to their rightful owners and primary stakeholders: faculty and students.
Currently, no additional facilities or resources are required. Existing facilities and resources, such as the visualization lab and open-source and free applications, can be used to generate momentum for faculty working together toward a common goal, e.g., digital humanities.
Venue
Hotel – Fourside Hotel City Center Vienna, Grieshofgasse 11, A-1120 Wien / Vienna, AUSTRIA
About the Conference
The International Academic Conference in Vienna 2017 is an important international gathering of scholars, educators, and PhD students. IAC-GETL 2017 will take place in conference facilities in Vienna, the tourist, business, and historic center of Austria.
Conference language: English
The conference is organized by the Czech Institute of Academic Education z.s. and the Czech Technical University in Prague.
Conference Topics – Education, Teaching, Learning and E-learning
Education, Teaching and Learning
Distance Education, Higher Education, Effective Teaching Pedagogies, Learning Styles and Learning Outcomes, Emerging Technologies, Educational Management, Engineering and Sciences Research, Competitive Skills, Continuing Education, Transferring Disciplines, Imaginative Education, Language Education, Geographical Education, Health Education, Home Education, Science Education, Secondary Education, Second life Educators, Social Studies Education, Special Education, Learning / Teaching Methodologies and Assessment, Assessment Software Tools, Global Issues In Education and Research, Education, Research and Globalization, Barriers to Learning (ethnicity, age, psychosocial factors, …), Women and Minorities in Science and Technology, Indigenous and Diversity Issues, Intellectual Property Rights and Plagiarism, Pedagogy, Teacher Education, Cross-disciplinary areas of Education, Educational Psychology, Education practice trends and issues, Indigenous Education, Academic Research Projects, Research on Technology in Education, Research Centres, Links between Education and Research, Erasmus and Exchange experiences in universities, Students and Teaching staff Exchange programmes
Educational Technology, Educational Games and Software, ICT Education, E-Learning, Internet technologies, Accessibility to Disabled Users, Animation, 3D, and Web 3D Applications, Mobile Applications and Learning (M-learning), Virtual Learning Environments, Videos for Learning and Educational Multimedia, Web 2.0, Social Networking and Blogs, Wireless Applications, New Trends And Experiences, Other Areas of Education
A learning management system (LMS) is never the solution to every problem in education. Edtech is just one part of the whole learning ecosystem and student experience.
Therefore, the next generation digital learning environment (NGDLE), as envisioned by EDUCAUSE in 2015 … Looking at the NGDLE requirements from an LMS perspective, I view the NGDLE as being about five areas: interoperability; personalization; analytics, advising, and learning assessment; collaboration; accessibility and universal design.
Interoperability
Content can easily be exchanged between systems.
Users are able to leverage the tools they love, including discipline-specific apps.
Learning data is available to trusted systems and people who need it.
The learning environment is “future proof” so that it can adapt and extend as the ecosystem evolves.
Personalization
The learning environment reflects individual preferences.
Departments, divisions, and institutions can be autonomous.
Instructors teach the way they want and are not constrained by the software design.
There are clear, individual learning paths.
Students have choice in activity, expression, and engagement.
Analytics, Advising, and Learning Assessment
Learning analytics helps to identify at-risk students, course progress, and adaptive learning pathways.
The learning environment enables integrated planning and assessment of student performance.
More data is made available, with greater context around the data.
The learning environment supports platform and data standards.
Collaboration
Individual spaces persist after courses and after graduation.
Learners are encouraged as creators and consumers.
Courses include public and private spaces.
Accessibility and Universal Design
Accessibility is part of the design of the learning experience.
The learning environment enables adaptive learning and supports different types of materials.
Learning design includes measurement rubrics and quality control.
The core analogy used in the NGDLE paper is that each component of the learning environment is a Lego brick:
The days of the LMS as a “walled garden” app that does everything are over.
Today many kinds of amazing learning and collaboration tools (Lego bricks) should be accessible to educators.
We have standards that let these tools (including an LMS) talk to each other. That is, all bricks share some properties that let them fit together.
Students and teachers sign in once to this “ecosystem of bricks.”
The bricks share results and data.
These bricks fit together; they can be interchanged and swapped at will, with confidence that the learning experience will continue uninterrupted.
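In practice, the bricks “share results and data” through standards such as LTI and xAPI. As a hedged sketch of what that shared data looks like, the following builds a record in the actor/verb/object shape defined by the Experience API (xAPI) specification; the student email, course URLs, and activity names are all invented examples:

```python
# Sketch of an xAPI-style statement: one learning event that any
# standards-aware "brick" in the ecosystem could emit or consume.
# All identifiers below are made up for illustration.
import json

def make_statement(learner_email, verb_id, verb_name, activity_id, activity_name):
    """Build a minimal xAPI-style statement describing one learning event."""
    return {
        "actor": {"mbox": f"mailto:{learner_email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
            "objectType": "Activity",
        },
    }

stmt = make_statement(
    "student@example.edu",
    "http://adlnet.gov/expapi/verbs/completed",
    "completed",
    "https://lms.example.edu/course/101/quiz/3",
    "Quiz 3",
)
print(json.dumps(stmt, indent=2))
```

Because every tool emits statements in the same shape, a learning record store can aggregate them regardless of which brick produced them, which is exactly the interchangeability the Lego analogy promises.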
Any “next-gen” attempt to completely rework the pedagogical model and introduce a “mash-up of whatever” to fulfil this model would fall victim to the same criticisms levied at the LMS today: there is too little time and training to expect faculty to figure out the nuances of implementation on their own.
The Lego metaphor works only if we’re talking about “old school” Lego design — bricks of two, three, and four-post pieces that neatly fit together. Modern edtech is a lot more like the modern Lego. There are wheels and rocket launchers and belts and all kinds of amazing pieces that work well with each other, but only when they are configured properly. A user cannot simply stick together different pieces and assume they will work harmoniously in creating an environment through which each student can be successful.
As the NGDLE paper states: “Despite the high percentages of LMS adoption, relatively few instructors use its more advanced features — just 41% of faculty surveyed report using the LMS ‘to promote interaction outside the classroom.'”
But this is what the next generation LMS is good at: being a central nervous system — or learning hub — through which a variety of learning activities and tools are used. This is also where the LMS needs to go: bringing together and making sense of all the amazing innovations happening around it. This is much harder to do, perhaps even impossible, if all the pieces involved are just bricks without anything to orchestrate them or to weave them together into a meaningful, personal experience for achieving well-defined learning outcomes.
Making a commitment to build easy, flexible, and smart technology
Working with colleges and universities to remove barriers to adopting new tools in the ecosystem
Standardizing the vetting of accessibility compliance (the Strategic Nonvisual Access Partner Program from the National Federation of the Blind is a great start)
Advancing standards for data exchange while protecting individual privacy
Building integrated components that work with the institutions using them — learning quickly about what is and is not working well and applying those lessons to the next generation of interoperability standards
Letting people use the tools they love [SIC] and providing more ways for nontechnical individuals (including students) to easily integrate new features into learning activities
My note: something just refused to be accepted at SCSU
Technologists are often very focused on the technology, but the reality is that the more deeply and closely we understand the pedagogy and the people in the institutions — students, faculty, instructional support staff, administrators — the better suited we are to actually making the tech work for them.
Under the Hood of a Next Generation Digital Learning Environment in Progress
The challenge is that although 85 percent of faculty use a campus learning management system (LMS),1 a recent Blackboard report found that, out of 70,000 courses across 927 North American institutions, 53 percent of LMS usage was classified as supplemental (content-heavy, low interaction) and 24 percent as complementary (one-way communication via content/announcements/gradebook).2 Only 11 percent were characterized as social, 10 percent as evaluative (heavy use of assessment), and 2 percent as holistic (balanced use of all previous). Our FYE course required innovating beyond the supplemental course-level LMS to create a more holistic cohort-wide NGDLE in order to fully support the teaching, learning, and student success missions of the program. The key design goals for our NGDLE were to:
Create a common platform that could deliver a standard curriculum and achieve parity in all course sections using existing systems and tools and readily available content
Capture, store, and analyze any generated learner data to support learning assessment, continuous program improvement, and research
Develop reports and actionable analytics for administrators, advisors, instructors, and students
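As an illustration of the third goal, stored learner events can be reduced to a simple actionable report for advisors. This is only a sketch: the event log, field names, and threshold below are assumptions for illustration, not the program's actual design.

```python
# Hedged sketch: flag students whose recorded activity falls below a
# threshold. The events and the cutoff are invented for illustration.
from collections import Counter

# Hypothetical event log: (student_id, event_type) pairs from the NGDLE.
events = [
    ("s1", "login"), ("s1", "submit"), ("s1", "login"),
    ("s2", "login"),
    ("s3", "login"), ("s3", "submit"), ("s3", "post"), ("s3", "login"),
]

def at_risk(events, min_events=3):
    """Return students with fewer than min_events recorded events."""
    counts = Counter(student for student, _ in events)
    return sorted(s for s, n in counts.items() if n < min_events)

print(at_risk(events))  # s2 has only one recorded event
```

A real implementation would of course weight event types and time windows rather than raw counts, but the pipeline shape — capture, store, aggregate, flag — is the one the design goals describe.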
These gaps and others “suggest a disconnect,” the report stated, “between the impacts that many administrators perceive and the reality of how digital learning is changing the market.” Open-ended responses suggested that expectations for the impact of digital learning were “set too high” or weren’t being “measured or communicated well.” Another common refrain: there’s inadequate institutional support.
While most administrators told researchers that “faculty are crucial to the success of digital learning initiatives — serving as both a bolster and a barrier to implementation success,” the resources for supporting faculty to implement digital learning are insufficient. Just a quarter of respondents said faculty professional development was implemented “effectively and at scale.” Thirty-five percent said implementation was in progress. And a third (33 percent) reported that faculty professional development was “incomplete, inconsistent, informal and/or optional.”
The report offered recommendations for improving and expanding digital learning adoption. Among the guidance:
Get realistic. While the data suggested that digital learning could improve scheduling flexibility and access, among other benefits, schools need to identify which goals are most important and “clearly articulate how and to what extent its digital learning programs are expected to help.”
Measure impact and broadcast it. Forget about small pilots; go for a scale that will demonstrate impact and then share the findings internally and with other institutions.
Use buying power to influence the market. Connect faculty with vendors for “education, product discovery and feedback.” Insist on accessibility within products, strong integration features, and user-friendliness.
Prepare faculty for success. Make sure there are sufficient resources and incentives to help faculty “buy into the strategy” and follow through on implementation.
Whether you’re flipping your courses, creating videos to help your students understand specific concepts or recording lectures for exam review, these tips can help you optimize your production setup on a tight budget.
1) Speak Into the Microphone
2) Reconsider Whether You Want to be a Talking Head
3) Use Machine Captions as a Starting Point
Record your video and upload it to YouTube. YouTube will apply its machine transcription to the audio as a starting point. Then you can download the captions into your caption editor and improve on the captions from there. Afterward, you can delete the video from YouTube and add it to your institution’s platform.
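Once the machine captions are downloaded (commonly as an SRT file), small scripted fixes can save editing time. The sketch below shifts every SRT timestamp by a fixed offset, which is useful if the start of the recording is trimmed; the caption snippet is a made-up example, not from any actual video.

```python
# Sketch: shift all SRT timestamps by a fixed number of seconds.
# The caption text below is an invented example.
import re
from datetime import timedelta

srt = """1
00:00:05,000 --> 00:00:08,500
Welcome to the lecture.
"""

def _shift(match, offset):
    h, m, s, ms = map(int, match.groups())
    t = timedelta(hours=h, minutes=m, seconds=s, milliseconds=ms) + offset
    total_ms = int(t.total_seconds() * 1000)
    h2, rem = divmod(total_ms, 3_600_000)
    m2, rem = divmod(rem, 60_000)
    s2, ms2 = divmod(rem, 1000)
    return f"{h2:02}:{m2:02}:{s2:02},{ms2:03}"

def shift_srt(text, seconds):
    """Move every HH:MM:SS,mmm timestamp in an SRT file by `seconds`."""
    offset = timedelta(seconds=seconds)
    pattern = re.compile(r"(\d{2}):(\d{2}):(\d{2}),(\d{3})")
    return pattern.sub(lambda m: _shift(m, offset), text)

print(shift_srt(srt, -2))  # captions now start at 00:00:03,000
```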