Understanding and learning outcomes
Students will … students will … students will … students will. (Meantime the students’ will becomes defined for them, or ignored, or crushed.) Each of the above statements assumes a linear, non-paradoxical, cleanly defined world.
For it turns out that two of the words we must never, ever use are “understand” and “appreciate.” These are vague words, we are told. Instead, we must use specific words like “describe,” “formulate,” “evaluate,” “identify,” and so forth.
Modern Learning: Re-Discovering the Transformative Promise of Educational Technology
By Steve Hargadon (@stevehargadon). Survey and Report: modernlearning.com
- When do you believe technology enhances learning, and when do you believe it does not?
- How has technology impacted your own learning?
- Does your school, library, or organization have a specific learning philosophy that guides ed-tech purchases and implementation? If yes, what is that philosophy?
More than 450 responses were received (those whose authors agreed to share their answers publicly can be seen at http://www.modernlearning.com).
For the purposes of this report, “educational technology” (often abbreviated as “ed tech”) is assumed to refer principally to the use of modern electronic computing and other high-tech, mostly Internet-enabled, devices and services in education.
Observation 1: There is general agreement that there are good and pedagogically sound arguments for the implementation and active use of ed tech, and that technology is changing, and will change, education for the better.
Observation 2: There is general agreement that technology is not always beneficial to teaching and learning.
● When it becomes a distraction.
● When there is little or no preparation for it.
● When just used for testing / score tracking.
● When used for consuming and not creating, or just for rote learning.
● When “following the education trends: everyone else is doing it.”
● When the tech is “an end rather than means” (also stated as, “when I don’t have a plan or learning goal…”). We found this very significant, and it is the focus of Observation 6.
● When there is a lack of guidance in how to effectively use new ed tech tools (“when there is no PD”). This is the focus of Observation 4.
● Finally, when it “gets in the way of real time talk / sharing,” forgetting that the tech “cannot mentor, motivate, show beauty, interact fully, give quality attention, [or] contextualize.” Also: “outcomes related to acquiring the skills and attitudes cannot be enhanced by technology.” As mentioned in the introduction, this would be missing the “human factor.” One respondent captured this as follows: “3 reasons tech innovation fails: Misunderstanding Human Motivation, Human Learning, or Human Systems.”
Observation 3: The benefits of ed tech to educator learning are described much more positively, and much less ambiguously, than are the benefits to student learning.
Respondents reported that technology has:
● reduced their isolation by helping them to connect with their peers;
● allowed them to feel part of larger educational movements;
● afforded them opportunities to become contributors.
Observation 4: There is a lack of good professional development for educational technology.
Observation 5: Educational technology is prone to grandiose promises.
Observation 6: Some significant percentage of educational technology purchases do not appear to have a pedagogical basis.
Networked information technology has rendered the words “teacher” and “student” more ambiguous. YouTube tutorials and social-media discussions, just to cite a couple of obvious examples, have made it abundantly clear that at any given moment anyone—regardless of age or background—can be a learner or a teacher, or even both at once.
more on educational technology in this IMS blog
Three lessons from rigorous research on education technology
Hope seen in “personalized” software for math
An August 2017 working paper, “Education Technology: An Evidence-Based Review,” published by the National Bureau of Economic Research, offers clear tables on which technology improves learning and which doesn’t.
1. Computers and internet access alone don’t boost learning
Handing out laptops, providing high-speed internet access or buying most other kinds of hardware doesn’t on its own boost academic outcomes. The research shows that student achievement doesn’t rise when kids are using computers more, and it sometimes decreases.
2. Some math software shows promise
Math programs such as SimCalc and ASSISTments showed promise, and one popular program, DreamBox, showed small gains for students as well. Only one piece of software that taught reading, Intelligent Tutoring for the Structure Strategy (ITSS), showed promise, suggesting that it is possible to create good educational software outside of math, but it’s a lot harder.
One commonality of the software that seems to work is that it somehow “personalizes” instruction. Sometimes students start with a pre-test so the computer can determine what they don’t know and then send each student the right lessons, or a series of worksheet problems, to help fill in the gaps. Other times, the computer ascertains a student’s gaps as they work through problems and make mistakes, giving personalized feedback. Teachers also get data reports to help pinpoint where students are struggling.
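The adaptive loop described here (pre-test, gap detection, lesson assignment, ongoing correction) can be sketched in a few lines of code. This is an illustrative sketch only; the skill names, lesson catalog, and the 0.8 mastery threshold are invented assumptions, not drawn from DreamBox or any product mentioned above.

```python
# Hedged sketch of the "personalization" loop: a pre-test flags skill
# gaps, the system assigns matching lessons, and ongoing mistakes
# re-open a skill for review. All names and thresholds are illustrative.

MASTERY_THRESHOLD = 0.8  # assumed cutoff for "knows this skill"

def gaps_from_pretest(pretest_scores):
    """Return the set of skills a student has not yet mastered."""
    return {skill for skill, score in pretest_scores.items()
            if score < MASTERY_THRESHOLD}

def assign_lessons(gaps, lesson_catalog):
    """Pick one lesson per gap from a skill -> lessons catalog."""
    return [lesson_catalog[skill][0] for skill in sorted(gaps)
            if skill in lesson_catalog]

def update_after_problem(gaps, skill, correct):
    """Ascertain gaps as the student works: a miss re-opens the skill."""
    if correct:
        gaps.discard(skill)
    else:
        gaps.add(skill)
    return gaps

pretest = {"fractions": 0.5, "decimals": 0.9, "ratios": 0.6}
catalog = {"fractions": ["fractions-intro"], "ratios": ["ratios-intro"]}
gaps = gaps_from_pretest(pretest)      # {"fractions", "ratios"}
plan = assign_lessons(gaps, catalog)   # one remedial lesson per gap
gaps = update_after_problem(gaps, "fractions", correct=True)
```

A teacher-facing data report, in this sketch, would simply be the `gaps` sets aggregated across students.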
3. Cheap can be effective
A study in San Francisco found that text messages reminding mothers to read to their preschoolers boosted the children’s literacy scores.
more on educational technology in this IMS blog
THE VALUE OF ACADEMIC LIBRARIES
A Comprehensive Research Review and Report. Megan Oakleaf
Librarians in universities, colleges, and community colleges can establish, assess, and link academic library outcomes to institutional outcomes related to the following areas: student enrollment, student retention and graduation rates, student success, student achievement, student learning, student engagement, faculty research productivity, faculty teaching, service, and overarching institutional quality.
Assessment management systems help higher education educators, including librarians, manage their outcomes, record and maintain data on each outcome, facilitate connections to similar outcomes throughout an institution, and generate reports.
Assessment management systems are helpful for documenting progress toward strategic/organizational goals, but their real strength lies in managing learning outcomes. To determine the impact of library interactions on users, libraries can collect data on how individual users engage with library resources and services.
Increase library impact on student enrollment.
p. 13-14 Improved student retention and graduation rates. High-impact practices include: first-year seminars and experiences, common intellectual experiences, learning communities, writing-intensive courses, collaborative assignments and projects, undergraduate research, diversity/global learning, service learning/community-based learning, internships, capstone courses and projects.
Libraries support students’ ability to do well in internships, secure job placements, earn salaries, gain acceptance to graduate/professional schools, and obtain marketable skills.
Librarians can investigate correlations between student library interactions and their GPA, as well as conduct test item audits of major professional/educational tests to determine correlations between library services or resources and specific test items.
p. 15 Review course content, readings, reserves, and assignments.
Track and increase library contributions to faculty research productivity.
Continue to investigate library impact on faculty grant proposals and funding, a means of generating institutional income. Librarians contribute to faculty grant proposals in a number of ways.
Demonstrate and improve library support of faculty teaching.
p. 20 Internal focus: ROI – library value = perceived benefits / perceived costs
Production of a commodity – value = quantity of commodity produced × price per unit of commodity
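Both p. 20 definitions are simple arithmetic, which a minimal sketch can make concrete. The dollar figures below are invented for illustration only.

```python
# Minimal sketch of the two p. 20 value formulas.
# All numbers are invented for illustration.

def roi_value(perceived_benefits, perceived_costs):
    """Internal focus: library value = perceived benefits / perceived costs."""
    return perceived_benefits / perceived_costs

def commodity_value(quantity, price_per_unit):
    """Commodity view: value = quantity produced x price per unit."""
    return quantity * price_per_unit

print(roi_value(150_000, 100_000))   # 1.5 -> perceived benefits exceed costs
print(commodity_value(2_000, 25.0))  # 50000.0
```

A ratio above 1.0 in the first formula is the usual reading of a positive return on investment.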
p. 21 External focus
a fourth definition of value focuses on library impact on users. It asks, “What is the library trying to achieve? How can librarians tell if they have made a difference?” In universities, colleges, and community colleges, libraries impact learning, teaching, research, and service. A main method for measuring impact is to “observe what the [users] are actually doing and what they are producing as a result”
A fifth definition of value is based on user perceptions of the library in relation to competing alternatives. A related definition is “desired value,” or “what a [user] wants to have happen when interacting with a [library] and/or using a [library’s] product or service” (Flint, Woodruff and Fisher Gardial 2002). Both “impact” and “competing alternatives” approaches to value require libraries to gain new understanding of their users’ goals as well as the results of their interactions with academic libraries.
p. 23 Increasingly, academic library value is linked to service, rather than products. Because information products are generally produced outside of libraries, library value is increasingly invested in service aspects and librarian expertise.
service delivery supported by librarian expertise is an important library value.
p. 25 methodology based only on literature? weak!
p. 26 review and analysis of the literature: the language of the literature is dated (e.g., “educational administrators” vs. “ed leaders”).
Government often sees higher education as unresponsive to these economic demands. Other stakeholder groups—students, parents, communities, employers, and graduate/professional schools—expect higher education to make impacts in ways that are not primarily financial.
Because institutional missions vary (Keeling, et al. 2008, 86; Fraser, McClure and Leahy 2002, 512), the methods by which academic libraries contribute value vary as well. Consequently, each academic library must determine the unique ways in which it contributes to the mission of its institution and use that information to guide planning and decision making (Hernon and Altman, Assessing Service Quality 1998, 31). For example, the University of Minnesota Libraries has rewritten its mission and vision to increase alignment with its overarching institution’s goals and emphasis on strategic engagement (Lougee 2009). Allow institutional missions to guide library assessment.
Assessment vs. Research
In community colleges, colleges, and universities, assessment is about defining the purpose of higher education and determining the nature of quality (Astin 1987).
Academic libraries serve a number of purposes, often to the point of being
Assessment “strives to know…what is” and then uses that information to change the status quo (Keeling, et al. 2008, 28); in contrast, research is designed to test hypotheses. Assessment focuses on observations of change; research is concerned with the degree of correlation or causation among variables (Keeling, et al. 2008, 35). Assessment “virtually always occurs in a political context,” while research “attempts to be apolitical” (Upcraft and Schuh 2002, 19).
p. 31 Assessment seeks to document observations, but research seeks to prove or disprove ideas. Assessors have to complete assessment projects, even when there are significant design flaws (e.g., resource limitations, time limitations, organizational contexts, design limitations, or political contexts); whereas researchers can start over (Upcraft and Schuh 2002, 19). Assessors cannot always attain “perfect” studies, but must make do with “good enough” (Upcraft and Schuh 2002, 18). Of course, assessments should be well planned, be based on clear outcomes (Gorman 2009, 9-10), and use appropriate methods (Keeling, et al. 2008, 39); but they “must be comfortable with saying ‘after’ as well as ‘as a result of’…experiences” (Keeling, et al. 2008, 35).
Two multiple measure approaches are most significant for library assessment: 1) triangulation, “where multiple methods are used to find areas of convergence of data from different methods with an aim of overcoming the biases or limitations of data gathered from any one particular method” (Keeling, et al. 2008, 53), and 2) complementary mixed methods, which “seek to use data from multiple methods to build upon each other by clarifying, enhancing, or illuminating findings between or among methods” (Keeling, et al. 2008, 53).
p. 34 Academic libraries can help higher education institutions retain and graduate students, a keystone part of institutional missions (Mezick 2007, 561), but the challenge lies in determining how libraries can contribute and then document their contribution.
p. 35. Student Engagement: In recent years, academic libraries have been transformed to provide “technology and content ubiquity” as well as individualized support
My Note: I read the “technology and content ubiquity” as digital literacy / metaliteracies, where basic technology instructional sessions (everything that IMS has offered for years) are included, but this library still clings to information literacy only.
p. 37 Student Learning
In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194). This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007). In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194).
p. 38. For librarians, the main content area of student learning is information literacy; however, they are not alone in their interest in student information literacy skills (Oakleaf, Are They Learning? 2011).
My note: Yep. it was. 20 years ago. Metaliteracies is now.
p. 41 surrogates for student learning in Table 3.
p. 42 strategic planning for learning:
According to Kantor, the university library “exists to benefit the students of the educational institution as individuals” (Library as an Information Utility 1976, 101). In contrast, academic libraries tend to assess learning outcomes using groups of students.
p. 45 Assessment Management Systems
Each assessment management system has a slightly different set of capabilities. Some guide outcomes creation, some develop rubrics, and some score student work or support student portfolios. All manage, maintain, and report assessment data.
p. 46 faculty teaching
However, as online collections grow and discovery tools evolve, that role has become less critical (Schonfeld and Housewright 2010; Housewright and Schonfeld, Ithaka’s 2006 Studies of Key Stakeholders 2008, 256). Now, libraries serve as research consultants, project managers, technical support professionals, purchasers, and archivists (Housewright, Themes of Change 2009, 256; Case 2008).
Librarians can count citations of faculty publications (Dominguez 2005)
Tenopir, C. (2012). Beyond usage: measuring library outcomes and value. Library Management, 33(1/2), 5-13.
Three main categories of methods can be used to measure the value of library products and services (Oakleaf, 2010; Tenopir and King, 2007):
- Implicit value. Measuring usage through downloads or usage logs provides an implicit measure of value. It is assumed that because libraries are used, they are of value to the users. Usage of e-resources is relatively easy to measure on an ongoing basis and is especially useful in collection development decisions and comparison of specific journal titles or use across subject disciplines. Such measures, however, do not show purpose, satisfaction, or outcomes of use (or whether what is downloaded is actually read).
- Explicit methods of measuring value include qualitative interview techniques that ask faculty members, students, or others specifically about the value or outcomes attributed to their use of the library collections or services and surveys or interviews that focus on a specific (critical) incident of use.
- Derived values, such as Return on Investment (ROI), use multiple types of data collected on both the returns (benefits) and the library and user costs (investment) to explain value in monetary terms.
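As a toy illustration of the implicit approach above, usage-log aggregation is little more than counting download events per title. The log format below is an invented example; real libraries would typically work from COUNTER-style vendor reports.

```python
# Toy sketch of implicit value measurement: counting e-resource
# downloads per journal title from a usage log.
# The log format is invented for illustration.
from collections import Counter

log_entries = [
    {"title": "Journal A", "action": "download"},
    {"title": "Journal B", "action": "view"},
    {"title": "Journal A", "action": "download"},
]

def downloads_by_title(entries):
    """Tally downloads per title; usage stands in as an implicit proxy for value."""
    counts = Counter()
    for entry in entries:
        if entry["action"] == "download":
            counts[entry["title"]] += 1
    return counts

usage = downloads_by_title(log_entries)
# As the text warns, these counts say nothing about purpose,
# satisfaction, or whether the downloaded item was actually read.
```

The limitation flagged above is visible even in this sketch: "Journal B" was viewed but never downloaded, so it disappears from the tally entirely.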
more on ROI in this IMS blog
International Academic Conference on Global Education, Teaching and Learning in Vienna, Austria 2017 (IAC-GETL in Vienna 2017)
Conference Program Dates
Friday – Saturday, November 24 – 25, 2017
Venue Hotel – Fourside Hotel City Center Vienna
Grieshofgasse 11, A – 1120 Wien / Vienna, AUSTRIA
About the Conference
The International Academic Conference in Vienna 2017 is an important international gathering of scholars, educators, and PhD students. IAC-GETL 2017 will take place in conference facilities located in Vienna, the touristic, business, and historic center of Austria.
Conference language: English
The conference is organized by the Czech Institute of Academic Education z.s. and the Czech Technical University in Prague.
Conference Topics – Education, Teaching, Learning and E-learning
Education, Teaching and Learning
Distance Education, Higher Education, Effective Teaching Pedagogies, Learning Styles and Learning Outcomes, Emerging Technologies, Educational Management, Engineering and Sciences Research, Competitive Skills, Continuing Education, Transferring Disciplines, Imaginative Education, Language Education, Geographical Education, Health Education, Home Education, Science Education, Secondary Education, Second Life Educators, Social Studies Education, Special Education, Learning / Teaching Methodologies and Assessment, Assessment Software Tools, Global Issues In Education and Research, Education, Research and Globalization, Barriers to Learning (ethnicity, age, psychosocial factors, …), Women and Minorities in Science and Technology, Indigenous and Diversity Issues, Intellectual Property Rights and Plagiarism, Pedagogy, Teacher Education, Cross-disciplinary areas of Education, Educational Psychology, Education practice trends and issues, Indigenous Education, Academic Research Projects, Research on Technology in Education, Research Centres, Links between Education and Research, Erasmus and Exchange experiences in universities, Students and Teaching staff Exchange programmes
Educational Technology, Educational Games and Software, ICT Education, E-Learning, Internet technologies, Accessibility to Disabled Users, Animation, 3D, and Web 3D Applications, Mobile Applications and Learning (M-learning), Virtual Learning Environments, Videos for Learning and Educational Multimedia, Web 2.0, Social Networking and Blogs, Wireless Applications, New Trends And Experiences, Other Areas of Education
Critical Factors for Implementing Blended Learning in Higher Education.
Available from: https://www.researchgate.net/publication/318191000_Critical_Factors_for_Implementing_Blended_Learning_in_Higher_Education [accessed Jul 6, 2017].
Definition of Blended learning
Blended learning is in one dimension broadly defined as “the convergence of online and face-to-face education,” as in the study by Watson (2008). At the same time it is important to also include the dimension of technology and media use, as depicted in the multimodal conceptual model in Figure 1 below. This conceptual model was proposed and presented in an article published by Picciano (2009).
[Figure 1: multimodal model spanning online, face-to-face, and hybrid modes]
Several studies argue for the need to focus on pedagogy and learning objectives and not solely on technology (Hoffman, 2006; Garrison & Vaughan, 2008; Alammary et al., 2014; McGee & Reis, 2012; Shand, Glassett Farrelly & Costa, 2016). Other findings in this study are that technology is still a critical issue (So & Brush, 2008; Fleming, Becker & Newton, 2017), not least in developing regions (Al Busaidi & Al-Shihi, 2012; Raphael & Mtebe, 2016), and also the more positive idea of technology as a supporting factor for innovative didactics and instructional design to satisfy the needs of heterogeneous student groups (Picciano, 2009).
- Didactics – pedagogy, instructional design, and the teacher role
- Course outcomes – learning outcomes and learner satisfaction
- Collaboration and social presence
- Course design
- The heritage from technology-enhanced distance courses
- Multimodal overload
- Trends and hypes
Blended learning perspectives
- The university perspective
- The learner perspective
- The teacher perspective
- The global perspective
more on blended learning in this IMS blog
Updating the Next Generation Digital Learning Environment for Better Student Learning Outcomes
Monday, July 3, 2017
a learning management system (LMS) is never the solution to every problem in education. Edtech is just one part of the whole learning ecosystem and student experience.
Therefore, the next generation digital learning environment (NGDLE), as envisioned by EDUCAUSE in 2015 … Looking at the NGDLE requirements from an LMS perspective, I view the NGDLE as being about five areas: interoperability; personalization; analytics, advising, and learning assessment; collaboration; accessibility and universal design.
Interoperability
- Content can easily be exchanged between systems.
- Users are able to leverage the tools they love, including discipline-specific apps.
- Learning data is available to trusted systems and people who need it.
- The learning environment is “future proof” so that it can adapt and extend as the ecosystem evolves.
Personalization
- The learning environment reflects individual preferences.
- Departments, divisions, and institutions can be autonomous.
- Instructors teach the way they want and are not constrained by the software design.
- There are clear, individual learning paths.
- Students have choice in activity, expression, and engagement.
Analytics, Advising, and Learning Assessment
- Learning analytics helps to identify at-risk students, course progress, and adaptive learning pathways.
- The learning environment enables integrated planning and assessment of student performance.
- More data is made available, with greater context around the data.
- The learning environment supports platform and data standards.
Collaboration
- Individual spaces persist after courses and after graduation.
- Learners are encouraged as creators and consumers.
- Courses include public and private spaces.
Accessibility and Universal Design
- Accessibility is part of the design of the learning experience.
- The learning environment enables adaptive learning and supports different types of materials.
- Learning design includes measurement rubrics and quality control.
The core analogy used in the NGDLE paper is that each component of the learning environment is a Lego brick:
- The days of the LMS as a “walled garden” app that does everything are over.
- Today many kinds of amazing learning and collaboration tools (Lego bricks) should be accessible to educators.
- We have standards that let these tools (including an LMS) talk to each other. That is, all bricks share some properties that let them fit together.
- Students and teachers sign in once to this “ecosystem of bricks.”
- The bricks share results and data.
- These bricks fit together; they can be interchanged and swapped at will, with confidence that the learning experience will continue uninterrupted.
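In software terms, the claim that “all bricks share some properties that let them fit together” is a shared interface. The sketch below illustrates the idea with invented class names; it is not an implementation of LTI or any real interoperability standard.

```python
# Sketch of the "Lego brick" idea: tools interoperate because they
# implement one shared interface, so they can be swapped at will.
# All class and method names here are invented for illustration.
from abc import ABC, abstractmethod

class LearningTool(ABC):
    """The shared 'stud pattern' every brick must expose."""

    @abstractmethod
    def launch(self, user_id: str) -> str: ...

    @abstractmethod
    def report_result(self, user_id: str) -> dict: ...

class QuizTool(LearningTool):
    def launch(self, user_id):
        return f"quiz session for {user_id}"
    def report_result(self, user_id):
        return {"user": user_id, "score": 0.9}

class DiscussionTool(LearningTool):
    def launch(self, user_id):
        return f"discussion thread for {user_id}"
    def report_result(self, user_id):
        return {"user": user_id, "posts": 4}

def run_activity(tool: LearningTool, user_id: str) -> dict:
    """The LMS 'hub' sees only the interface, so any brick fits."""
    tool.launch(user_id)
    return tool.report_result(user_id)

# Bricks are interchangeable: the hub code is identical for both tools.
results = [run_activity(t, "student1") for t in (QuizTool(), DiscussionTool())]
```

The single sign-on and shared-data bullets above correspond, in this sketch, to the common `user_id` and the uniform `report_result` payload every brick returns.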
Any “next-gen” attempt to completely rework the pedagogical model and introduce a “mash-up of whatever” to fulfil this model would fall victim to the same criticisms levied at the LMS today: there is too little time and training to expect faculty to figure out the nuances of implementation on their own.
The Lego metaphor works only if we’re talking about “old school” Lego design — bricks of two, three, and four-post pieces that neatly fit together. Modern edtech is a lot more like the modern Lego. There are wheels and rocket launchers and belts and all kinds of amazing pieces that work well with each other, but only when they are configured properly. A user cannot simply stick together different pieces and assume they will work harmoniously in creating an environment through which each student can be successful.
As the NGDLE paper states: “Despite the high percentages of LMS adoption, relatively few instructors use its more advanced features — just 41% of faculty surveyed report using the LMS ‘to promote interaction outside the classroom.'”
But this is what the next generation LMS is good at: being a central nervous system — or learning hub — through which a variety of learning activities and tools are used. This is also where the LMS needs to go: bringing together and making sense of all the amazing innovations happening around it. This is much harder to do, perhaps even impossible, if all the pieces involved are just bricks without anything to orchestrate them or to weave them together into a meaningful, personal experience for achieving well-defined learning outcomes.
- Making a commitment to build easy, flexible, and smart technology
- Working with colleges and universities to remove barriers to adopting new tools in the ecosystem
- Standardizing the vetting of accessibility compliance (the Strategic Nonvisual Access Partner Program from the National Federation of the Blind is a great start)
- Advancing standards for data exchange while protecting individual privacy
- Building integrated components that work with the institutions using them — learning quickly about what is and is not working well and applying those lessons to the next generation of interoperability standards
- Letting people use the tools they love [SIC] and providing more ways for nontechnical individuals (including students) to easily integrate new features into learning activities
My note: something just refused to be accepted at SCSU
Technologists are often very focused on the technology, but the reality is that the more deeply and closely we understand the pedagogy and the people in the institutions — students, faculty, instructional support staff, administrators — the better suited we are to actually making the tech work for them.
Under the Hood of a Next Generation Digital Learning Environment in Progress
Monday, July 31, 2017
The challenge is that although 85 percent of faculty use a campus learning management system (LMS), a recent Blackboard report found that, out of 70,000 courses across 927 North American institutions, 53 percent of LMS usage was classified as supplemental (content-heavy, low interaction) and 24 percent as complementary (one-way communication via content/announcements/gradebook). Only 11 percent were characterized as social, 10 percent as evaluative (heavy use of assessment), and 2 percent as holistic (balanced use of all previous). Our FYE course required innovating beyond the supplemental course-level LMS to create a more holistic cohort-wide NGDLE in order to fully support the teaching, learning, and student success missions of the program. The key design goals for our NGDLE were to:
- Create a common platform that could deliver a standard curriculum and achieve parity in all course sections using existing systems and tools and readily available content
- Capture, store, and analyze any generated learner data to support learning assessment, continuous program improvement, and research
- Develop reports and actionable analytics for administrators, advisors, instructors, and students
more on LMS in this blog
more on learning outcomes in this IMS blog
International Journal of Game-Based Learning (IJGBL)
Editor-in-Chief: Patrick Felicia (Waterford Institute of Technology, Ireland)
Published Quarterly. Est. 2011.
ISSN: 2155-6849|EISSN: 2155-6857|DOI: 10.4018/IJGBL
The International Journal of Game-Based Learning (IJGBL) is devoted to the theoretical and empirical understanding of game-based learning. To achieve this aim, the journal publishes theoretical manuscripts, empirical studies, and literature reviews. The journal publishes this multidisciplinary research from fields that explore the cognitive and psychological aspects that underpin successful educational video games. The target audience of the journal is composed of professionals and researchers working in the fields of educational games development, e-learning, technology-enhanced education, multimedia, educational psychology, and information technology. IJGBL promotes an in-depth understanding of the multiple factors and challenges inherent to the design and integration of Game-Based Learning environments.
- Adaptive games design for Game-Based Learning
- Design of educational games for people with disabilities
- Educational video games and learning management systems
- Game design models and design patterns for Game-Based Learning
- Instructional design for Game-Based Learning
- Integration and deployment of video games in the classroom
- Intelligent tutoring systems and Game-Based Learning
- Learning by designing and developing video games
- Learning styles, behaviors and personalities in educational video games
- Mobile development and augmented reality for Game-Based Learning
- Motivation, audio and emotions in educational video games
- Role of instructors
- Virtual worlds and Game-Based Learning
The mission of the International Journal of Game-Based Learning (IJGBL) is to promote knowledge pertinent to the design of Game-Based Learning environments, and to provide relevant theoretical frameworks and the latest empirical research findings in the field of Game-Based Learning. The main goals of IJGBL are to identify, explain, and improve the interaction between learning outcomes and motivation in video games, and to promote best practices for the integration of video games in instructional settings. The journal is multidisciplinary and addresses cognitive, psychological and emotional aspects of Game-Based Learning. It discusses innovative and cost-effective Game-Based Learning solutions. It also provides students, researchers, instructors, and policymakers with valuable information in Game-Based Learning, and increases their understanding of the process of designing, developing and deploying successful educational games. IJGBL also identifies future directions in this new educational medium.
more on gaming and gamification in this IMS blog: