A recent post on the LITA listserv is seeking input on libraries maintaining a BYOD-friendly space in some corner of the building:
From: Eng-Ziskin, Susanna M Sent: Monday, May 23, 2016 4:05 PM To: ‘lita-l@lists.ala.org’ <lita-l@lists.ala.org> Subject: Tablet technology & instruction survey
Does your library have an iPad/tablet cart, or a dedicated classroom with mobile devices? Have you been teaching library research sessions using iPads or tablets? We invite you to participate in a study that aims to take a look at how tablets are used in library instruction, and the experiences of those who administer and maintain them. We’re hoping to learn about the experiences of others who also use mobile devices for instruction, as our own have been mixed.
Participation is voluntary and this survey is anonymous. Participants must be at least 18 years of age. If you complete the survey your consent to participate will be assumed. The survey will be available until 7/1/2016.
Last year, MC 218 was supposed to be remodeled. My suggestion, reported to the SCSU library director, to bring MC 218 up to the modern standards of a library in line with LITA’s survey was completely ignored:
Managing projects is the most common task instructional designers undertake during their days, followed by technology and pedagogical training. Their biggest obstacle to success on the job is faculty resistance. The most important expertise they possess as a whole is the ability to learn new technologies, followed by project management and learning science or theory. Their favorite tools to work with are Camtasia and Adobe products; their least-favorite are Blackboard and learning management systems in general.
Consider adding more resources in the area of instructional design. If that isn’t possible, at least consider involving instructional designers “early” and “often” during technology transitions.
“Incentivize” faculty to work with instructional designers “from the get-go” in order to help them learn how to engage with their students and expand class time through the use of online tools.
Technology providers should work closely with instructional designers in the selection of digital tools.
p. 4, graph: median number of instructional designers by type of institution. According to the graph, SCSU should have between 3 and 16 instructional designers.
p. 10: “While a ‘jack-of-all-trades’ can get by in instructional design, the best instructional designers are ‘aces-of-many-trades,’ with authentic experience and training in all aspects of the process.”
p. 12: “Management choose[s] tools that are cheap and never ask[s] about integration or accessibility. Then we spend enormous amounts of time trying to get them to work.”
After surveying more than 4,650 educators, we learned that teachers are essentially trying to do three things with data—each of which technology can dramatically improve:
The U.S. Department of Education has increasingly encouraged and funded states to collect and analyze information about students: grades, state test scores, attendance, behavior, lateness, graduation rates and school climate measures like surveys of student engagement.
The argument in favor of all this is that the more we know about how students are doing, the better we can target instruction and other interventions. And sharing that information with parents and the community at large is crucial. It can motivate big changes.
Consider what might be lost when schools focus too much on data. Here are five arguments against the excesses of data-driven instruction.
The National Education Policy Center releases annual reports on commercialization and marketing in public schools. In its most recent report in May, researchers there raised concerns about targeted marketing to students using computers for schoolwork and homework. Companies like Google pledge not to track the content of schoolwork for the purposes of advertising. But in reality these boundaries can be a lot more porous. For example, a high school student profiled in the NEPC report often consulted commercial programs like dictionary.com and Sparknotes: “Once when she had been looking at shoes, she mentioned, an ad for shoes appeared in the middle of a Sparknotes chapter summary.”
4) Missing What Data Can’t Capture
Computer systems are most comfortable recording and analyzing quantifiable, structured data. The number of absences in a semester, say; or a three-digit score on a multiple-choice test that can be graded by machine, where every question has just one right answer.
5) Exposing Students’ “Permanent Records”
In the past few years several states have passed laws banning employers from looking at the credit reports of job applicants. Employers want people who are reliable and responsible. But privacy advocates argue that a past medical issue or even a bankruptcy shouldn’t unfairly dun a person who needs a fresh start.
There are two types of Lexile measures: a person’s reading ability and the text’s difficulty. Students who are tested against state standards receive a Lexile reader measure from the Kansas Reading Assessment. Books and other texts receive a Lexile text measure from a MetaMetrics software tool called the Lexile Analyzer, which describes the book’s reading demand or complexity. When used together, the two measures are intended to help match a reader with reading material that is at an appropriate difficulty or will at least help give an idea of how well a reader should comprehend text. The reader should encounter some level of difficulty with the text, but not enough to get frustrated. The Lexile reader measure is used to monitor reader progress.
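To make the matching idea concrete, here is a minimal sketch in Python of how a reader measure and text measures might be combined. The targeted band of roughly 100L below to 50L above the reader measure, and the catalog of titles, are illustrative assumptions, not details taken from the description above.

```python
# Minimal sketch: matching a reader's Lexile measure to candidate texts.
# The targeted band (100L below to 50L above the reader measure) is an
# assumption based on common Lexile guidance, not from the article.

def lexile_band(reader_measure: int, below: int = 100, above: int = 50) -> tuple[int, int]:
    """Return the (low, high) range of text measures considered a good match."""
    return reader_measure - below, reader_measure + above

def recommend(reader_measure: int, catalog: dict[str, int]) -> list[str]:
    """Pick titles whose Lexile text measure falls inside the reader's band."""
    low, high = lexile_band(reader_measure)
    return [title for title, text_measure in catalog.items() if low <= text_measure <= high]

# Hypothetical titles and text measures, for illustration only.
catalog = {"Title A": 820, "Title B": 900, "Title C": 1010}
print(recommend(880, catalog))  # band 780-930 -> ['Title A', 'Title B']
```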
My note: is this another way/attempt to replace humans as educators? Or is it a supplemental approach to improve students’ reading abilities?
The light and fluffy version of computer science—which is proliferating as a superficial response to the increased need for coders in the workplace—is a phenomenon I refer to as “pop computing.” While calling on policy makers and education leaders to consider “computer science education for all” is a good thing, the coding culture promoted by Code.org and its library of movie-branded coding apps provides quick experiences of drag-and-drop code entertainment.
Playing with coding apps is not the same as learning to design an app using code. Building an app takes time and requires multi-dimensional learning contexts, pathways and projects.
Computing and computer science are the equivalent of immersing oneself in a deeper study of music—its origins, influences, aesthetics, applications, theories, composition, techniques, variations and meanings. In other words, they provide the actual foundations and experiences that change an individual’s mindset.
As noted by MIT’s Marvin Minsky and Alan Kay, computational innovation and literacy have much in common with music literacy. Just as would-be musicians become proficient by listening, improvising and composing, and not just by playing other people’s compositions, so would-be programmers become proficient by designing prototypes and models that solve real problems, thinking critically and analytically, and collaborating creatively—none of which can be accomplished in one hour of coding. In other words, just as a kid playing Guitar Hero wouldn’t be considered a musician, someone playing with coding apps isn’t exactly a coder or computer scientist.