Dr. Scott Barry Kaufman: When he was young, Kaufman had central auditory processing disorder, which made it hard for him to process verbal information in real time. He was asked to repeat third grade because he was considered a “slow” learner.
Kaufman thinks the traditional IQ test does a good job of measuring general cognitive ability, but says it misses all the ways that ability interacts with engagement. An individual’s goals within the classroom and excitement about a topic affect how he or she pursues learning, none of which is captured by IQ tests. Worse, those tests are often used to filter people in or out of special programs.
FOUR PRACTICES TO CULTIVATE CHILDREN’S CREATIVITY
Some question whether it’s even worth measuring imagination, but Kaufman believes that measurement is important so researchers can see how changing behavior affects creative achievement. He hopes, however, that the measurements are never used as another sorting mechanism.
My note: Kaufman makes a new call in an old debate. The argument over the futility of testing is raging across the United States K-12 system. Over the last several decades, higher education (much like the United States health care system) has been turned into a cash cow. When the goal is profit, good education goes down the drain. Cultivating children’s creativity cannot happen when the foremost goal is to make more money, which inevitably entails spending less cash (and not only on teachers’ salaries).
The authors of a new book challenge what they call the “frantic pace” of contemporary university life.
ACRL e-Learning webcast series: Learning Analytics – Strategies for Optimizing Student Data on Your Campus
This three-part webinar series, co-sponsored by the ACRL Value of Academic Libraries Committee, the Student Learning and Information Committee, and the ACRL Instruction Section, will explore the advantages and opportunities of learning analytics as a tool which uses student data to demonstrate library impact and to identify learning weaknesses. How can librarians initiate learning analytics initiatives on their campuses and contribute to existing collaborations? The first webinar will provide an introduction to learning analytics and an overview of important issues. The second will focus on privacy issues and other ethical considerations as well as responsible practice, and the third will include a panel of librarians who are successfully using learning analytics on their campuses.
Webcast One: Learning Analytics and the Academic Library: The State of the Art and the Art of Connecting the Library with Campus Initiatives
March 29, 2016
Learning analytics are used nationwide to augment student success initiatives as well as bolster other institutional priorities. As a key aspect of educational reform and institutional improvement, learning analytics are essential to defining the value of higher education, and academic librarians can be both of great service to and well served by institutional learning analytics teams. In addition, librarians who seek to demonstrate, articulate, and grow the value of academic libraries should become more aware of how they can dovetail their efforts with institutional learning analytics projects. However, all too often, academic librarians are not asked to be part of initial learning analytics teams on their campuses, despite the benefits of library inclusion in these efforts. Librarians can counteract this trend by being conversant in learning analytics goals, advantages/disadvantages, and challenges as well as aware of existing examples of library successes in learning analytics projects.
Learn about the state of the art in learning analytics in higher education with an emphasis on 1) current models, 2) best practices, 3) ethics, privacy, and other difficult issues. The webcast will also focus on current academic library projects and successes in gaining access to and inclusion in learning analytics initiatives on their campuses. It includes a “short list” of must-read resources as well as a clearly defined list of ways in which librarians can leverage their skills as contributing members of learning analytics teams, suitable for use in advocacy on their campuses.
Open Academic Analytics Initiative
where data comes from:
D2L Degree Compass
Predictive Analytics Reporting (PAR) – was open, but was just bought by Hobsons (https://www.hobsons.com/)
IMS Caliper Enabled Services: a way to connect the library to campus analytics https://www.imsglobal.org/activity/caliperram
students’ opinions of this process
benefits: self-assessment, personal learning, empowerment
analytics and data privacy – students are OK with their data being harvested (only 6% are unhappy)
8 in 10 are interested in a personal dashboard that will help them perform
Big Mother vs. Big Brother: creepy vs. helpful. Tracking within classes is helpful; tracking out of class (where on campus, social media, etc.) is creepy. 87% see having access to their data as positive
recognize metrics, assessment, analytics, data visualization, data literacy, data science, interpretation
INSTRUCTION DEPARTMENT – N.B.
determine who is the key leader: director of institutional research, president, CIO
who does analytics services: institutional research, information technology, a dedicated center
analytics maturity: data-driven decision-making culture; senior leadership commitment; supporting policies (data collection, access, use); data efficacy; investment and resources; staffing; technical infrastructure; information technology interaction
student success maturity: senior leadership commitment; funding of student success efforts; mechanisms for making student success decisions; interdepartmental collaboration; understanding of student success goals; advising and student support ability; policies; information systems
developing learning analytics strategy
understand institutional challenges; identify stakeholders; identify inhibitors/challenges; consider tools; scan the environment and see what others have done; develop a plan; communicate the plan to stakeholders; start small and build
ways librarians can help
identify institutional partners; be the partner; hone relevant learning analytics skills; participate in institutional analytics; identify questions and problems; assess and work to improve institutional culture; volunteer to be early adopters
questions to ask: environmental scanning
do we have a learning analytics system? does our culture support it? are leaders present? what do stakeholders need to know?
questions to ask: Data
questions to ask: Library role
Learning Analytics & the Academic Library: The State of the Art of Connecting the Library with Campus Initiatives
causation versus correlation studies: the speakers claim that it is difficult to establish a causation argument. Institutions try to predict as accurately as possible via correlation, rather than asserting “if you do X, then Y will happen.”
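The correlation-based prediction described above can be sketched with a toy example. The “library visits” feature and all numbers below are purely hypothetical, not from the webcast; they just show how a strong correlation supports prediction without establishing causation:

```python
# Toy illustration of correlation-based prediction in learning analytics.
# The "library visits" feature and all data points are hypothetical.
from statistics import mean

# Hypothetical records: (library visits per term, passed course? 1 = yes)
records = [(2, 0), (5, 0), (8, 1), (12, 1), (15, 1), (3, 0), (9, 1), (11, 1)]
visits = [v for v, _ in records]
passed = [p for _, p in records]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(visits, passed)  # strong positive correlation here

# A simple correlational predictor: flag students at or below the mean
# visit count. It can predict well without showing that visits *cause*
# passing -- the webcast's correlation-versus-causation point.
threshold = mean(visits)
def predict(v):
    return 1 if v > threshold else 0

accuracy = mean(1 if predict(v) == p else 0 for v, p in records)
print(f"r = {r:.2f}, in-sample accuracy = {accuracy:.0%}")
```

A correlation this strong justifies using the feature for early-warning prediction, but, as the speakers note, it says nothing about whether mandating more library visits would cause more students to pass.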
More on analytics in this blog:
Steve Kelman Mar 28, 2016 at 4:38 PM
Ines Mergel’s report from the IBM Center for The Business of Government, called The Social Intranet: Insights on Managing and Sharing Knowledge Internally.
co-locating subject-matter experts
traditional technologies have important limitations
Kelman teaches about the efforts many agencies are making to break down functional stovepipes, which often involve having functional experts spend significant time in offsite cross-agency teams with “functionals” from other specialties. This has a number of advantages, but it creates the risk that employees won’t have anywhere to go to get answers to questions or to refresh their knowledge base. He and his students also discussed communities of practice, which allow employees to ask questions of fellow experts not at the same location.
“social intranet” — a one-stop shop where agency employees can go to find or request information in a number of different ways to help them do their jobs. The social intranet consists of wikis; places for people to go to ask questions or solicit collaboration; publicly available conversation threads; central places for blogs; and opportunities for people to create profiles and (in a professional context) “friend” each other. These elements all appear at one web address, with its common home page, to which an employee can link.
These different features work in different ways. Wikis, for example, allow a group of employees to add knowledge to a text that is then accessible to the whole organization, and which everyone can edit and add to. Other employees can subscribe to the updates. Blogs, on the other hand, allow longer text to provide project updates, comment on industry developments, or introduce new issues more transparently than blast e-mail updates can manage.
Most intranet collaboration platforms do not require an approval chain to publish, which lowers the barriers to quick sharing. And while the personal profiles and “friending” often start by providing occasions for social conversations, the intention is for these to be a gateway to knowledge sharing; Mergel quotes a manager as saying, “The social feeds into the professional.”
educators need to figure out what they need to do. Are you trying to have a conversation? Are you simply trying to transmit information? Or are you, in fact, trying to have students create something?
Answer those pedagogical questions first and then – and only then – will you be able to connect people to the kinds of technologies that can do that thing.
The ‘digital native’ is a generational metaphor. It’s a linguistic metaphor. It’s a ridiculous metaphor. It’s the notion that there is a particular generation of people who are fundamentally unknowable and incomprehensible.
There are policy implications: if your university philosophy is grounded in assumptions around digital natives, education and technology, you’re presupposing you don’t have to teach the students how to use tech for their education, and, furthermore, that it will never be possible to teach the faculty how to use that technology, either on their own behalf or for their students.
A very different paradigm is ‘visitor and resident’. Instead of talking about these essentialised categories of native and immigrant, we should be talking about modes of behaviour because, in fact, some people do an awful lot of stuff with technology in some parts of their lives and then not so much in other parts.
How much of your university practice is behind closed doors? This is traditional, of course, gatekeeping our institutions of higher education, keeping the gates in the walled campuses closed. So much of the pedagogy as well as the content of the university is locked away. That has implications not just for potential students but also from a policy perspective – if part of the problem in higher education policy is of non-university people not understanding the work of the university, being open would have really great potential to mitigate that lack of understanding.
I would like to see our universities modelling themselves more closely on what we should be looking for in society generally: networked, open, transparent, providing the opportunity for people to create things that they wouldn’t create all by themselves.
I understand the rationale for gatekeeping, I just don’t think that there’s as much potential with a gatekept system as there is with an open one.
There are two huge problems with the notion of “student expectations”: firstly, the sense that, with the UK’s new fee model, students’ ideas of what higher education should be now weigh much more heavily in the institutions’ educational planning. Secondly, institutions in part think their role is to make their students “employable” because some politician somewhere has said the university is there to get them jobs.
Students coming into higher education don’t know much about what higher education can be. So if we allow student expectations to set the standard for what we should be doing, we create an amazingly low bar.
The point of any educational system is not to provide citizens with jobs. That’s the role of the economy.
Universities are not vocational
Institutions can approach educational technology in two very different ways. They can have a learning technology division that is basically in charge of acquiring and maintaining educational technology. Or they can provide spaces to develop pedagogy and then think about the role of technology within that pedagogy.
Classified revisions accepted by secret Fisa court affect NSA data involving Americans’ international emails, texts and phone calls
The FBI has quietly revised its privacy rules for searching data involving Americans’ international communications that was collected by the National Security Agency, US officials have confirmed to the Guardian.
Pro Domo Sua: Are We Puppets in a Wired World? Surveillance and privacy revisited…
More on privacy in this IMS blog:
More on surveillance in this IMS blog:
start with the teachers, not with the students
Participating teachers advance through a series of inquiry-based professional development modules. Teachers are awarded a digital badge for the successful completion of each 10-hour module. To accomplish this, they must complete the following steps: 1) study module content, 2) participate in a focused discussion with peers working on the same module, 3) create an original inquiry-based global lesson plan that incorporates new learning, 4) implement the original lesson plan in the classroom, 5) provide evidence of classroom implementation and 6) reflect on and revise the lesson created.
The final product of every module is a tested, global lesson plan that articulates learning objectives, activities, assessments, and resources for each stage of inquiry. Upon completion, teachers may publish finalized lessons in a resource library where they can be accessed by other educators. As designed, the HISD badging system will be a four-year, 16-badge approach that equates to 160 hours of professional learning for teachers.
five key features that, taken together, significantly increase the likelihood that the learning experience for a teacher will lead to results in the classroom for students — which, after all, is the point of professional development:
excellent tutorial on how to blur objects in YouTube videos