
Innovation, Infrastructure, and Digital Learning

Notes from the webinar:
What is Digital Learning?

Technology is a metaphor for change; it is also a metaphor for risk.

Technology is a means of uncertainty reduction, made possible by the cause-effect relationship upon which the technology is based.

Technological innovation creates a kind of uncertainty in the minds of potential adopters, even as it represents an opportunity for reduced uncertainty.

The Diffusion of Innovations: https://en.wikipedia.org/wiki/Diffusion_of_innovations

https://web.stanford.edu/class/symbsys205/Diffusion%20of%20Innovations.htm

diffusion of innovations

 

technology is disruptive

  • issues and impacts | response
  • organizational practice and process |  denial, anger
  • individual behaviors and preferences | bargaining
  • visualization: can I see me/us doing that | depression, acceptance

as per Kübler-Ross, On Death and Dying: https://www.amazon.com/Death-Dying-Doctors-Nurses-Families/dp/1476775540

The key campus tech issues are no longer about IT (in the past, e.g., Microsoft versus Apple). IT is the “easy part” of technology on campus. The challenges: people, planning, policy, programs, priorities, silos, egos, and IT entitlements.

How do we make Digital Learning compelling and safe for the faculty? Provide evidence of impact; provide support, recognition, and reward for faculty; communicate about the effectiveness of and need for IT resources.

Technology is not a capital cost; it is an operational, recurring cost.

Visualization:

Underlying issues: Can I do this? Why should I do this? Is there evidence of benefit?

http://www.sonicfoundry.com/wp-content/uploads/2016/01/Green-PlusCaChange-EDUCAUSEReview-Sept2015.pdf

The more things change, the more they stay the same. A new equilibrium.

Change: from “What did you do wrong?” to “How do we do better?” Use data as a resource, not as a weapon. There is a fear of trying, because there is no recognition or reward.

Machiavelli:

  1. Concentrate your efforts.
  2. Pick your issues carefully; know when to fight.
  3. Know the history.
  4. Build coalitions.
  5. Set modest, realistic goals.
  6. Leverage the value of data (use it as a resource, not a weapon).
  7. Anticipate personnel turnover.
  8. Set deadlines for decisions.

Colleagues,

We apologize for the short notice, but wanted to make you aware of the following opportunity.

From Ken Graetz at Winona State University:

As part of our Digital Faculty Fellows Program at WSU, Dr. Kenneth C. Green will be speaking this Thursday, March 22nd in Stark 103 Miller Auditorium from 11:30 to 12:30 on “Innovation, Infrastructure, and Digital Learning.” We will be streaming Casey’s talk using Skype Meeting Broadcast and you can join as a guest using the following link: Join the presentation. This will allow you to see and hear his presentation, as well as post moderated questions. By way of a teaser, here is a recent quote from Dr. Green’s blog, DigitalTweed, published by Inside Higher Ed:

“If trustees, presidents, provosts, deans, and department chairs really want to address the fear of trying and foster innovation in instruction, then they have to recognize that infrastructure fosters innovation.  And infrastructure, in the context of technology and instruction, involves more than just computer hardware, software, digital projectors in classrooms, learning management systems, and campus web sites. The technology is actually the easy part. The real challenges involve a commitment to research about the impact of innovation in instruction, and recognition and reward for those faculty who would like to pursue innovation in their instructional activities.”

Dr. Green is the founding director of The Campus Computing Project, the largest continuing study of the role of digital learning and information technology in American colleges and universities. Campus Computing is widely cited as a definitive source for data, information, and insight about IT planning and policy issues affecting higher education. Dr. Green also serves as the director, moderator, and co-producer of TO A DEGREE, the postsecondary success podcast of the Bill & Melinda Gates Foundation. He is the author or editor of some 20 books and published research reports and more than 100 articles and commentaries that have appeared in academic journals and professional publications. In 2002, Dr. Green received the first EDUCAUSE Award for Leadership in Public Policy and Practice. The EDUCAUSE award cites his work in creating The Campus Computing Project and recognizes his, “prominence in the arena of national and international technology agendas, and the linking of higher education to those agendas.”

Casey’s most recent TO A DEGREE podcasts are available now: Presidential Leadership in Challenging Times and Online’s Bottom Line.

Hope to see some of you online and please forward this invitation to anyone who might be interested.

Ken Graetz, PhD, Director of Teaching, Learning, and Technology Services, Winona State University, 507-429-3270

definitions: online learning

Online learning here is used as a blanket term for all related terms:

  • HyFlex courses – hybrid + flexible
    “hybrid synchronous” or “blended synchronous” courses

    • Definition:
      The HyFlex model gives students the choice to attend class in person or via synchronous remote stream and to make that choice on a daily basis. In other words, unlike online and hybrid models which typically have a fixed course structure for the entire semester, the HyFlex model does not require students to make a choice at the beginning of term and then stick with it whether their choice works for them or not; rather students are able to make different choices each day depending on what works best for them on that day (hence the format is “flexible”) (Miller and Baham, 2018, to be published in the Proceedings of the 10th International Conference on Teaching Statistics).
    • Definition from the Horizon Report, Higher Ed edition, 2014, p. 10: “Integration of Online, Hybrid, and Collaborative Learning”
    • Definition from U of Arizona (https://journals.uair.arizona.edu/index.php/itet/article/view/16464/16485)
      Beatty (2010) defines HyFlex courses to be those that “enable a flexible participation policy for students whereby students may choose to attend face-to-face synchronous class sessions or complete course learning activities online without physically attending class”
  • Online courses
    • Definition
      Goette, W. F., Delello, J. A., Schmitt, A. L., Sullivan, J. R., & Rangel, A. (2017). Comparing Delivery Approaches to Teaching Abnormal Psychology: Investigating Student Perceptions and Learning Outcomes. Psychology Learning and Teaching, 16(3), 336–352. https://doi.org/10.1177/1475725717716624
      p. 2: Online classes are a form of distance learning available completely over the Internet with no F2F interaction between an instructor and students (Helms, 2014).
    • https://www.oswego.edu/human-resources/section-6-instructional-policies-and-procedures
      An online class is a class that is offered 100% through the Internet. Asynchronous courses require no time in a classroom. All assignments, exams, and communication are delivered using a learning management system (LMS). At Oswego, the campus is transitioning from ANGEL  to Blackboard, which will be completed by the Fall 2015 semester.  Fully online courses may also be synchronous. Synchronous online courses require student participation at a specified time using audio/visual software such as Blackboard Collaborate along with the LMS.
    • Web-enhanced courses

Web enhanced learning occurs in a traditional face-to-face (f2f) course when the instructor incorporates web resources into the design and delivery of the course to support student learning. The key difference between Web Enhanced Learning versus other forms of e-learning (online or hybrid courses) is that the internet is used to supplement and support the instruction occurring in the classroom rather than replace it. Web Enhanced Learning may include activities such as: accessing course materials, submitting assignments, participating in discussions, taking quizzes and exams, and/or accessing grades and feedback.

  • Blended/Hybrid Learning
    • Definition

Goette, W. F., Delello, J. A., Schmitt, A. L., Sullivan, J. R., & Rangel, A. (2017). Comparing Delivery Approaches to Teaching Abnormal Psychology: Investigating Student Perceptions and Learning Outcomes. Psychology Learning and Teaching, 16(3), 336–352. https://doi.org/10.1177/1475725717716624
p. 3:

Helms (2014) described blended education as incorporating both online and F2F characteristics into a single course. This definition captures an important confound to comparing course administration formats because otherwise traditional F2F courses may also incorporate aspects of online curriculum. Blended learning may thus encompass F2F classes in which any course content is available online (e.g., recorded lectures or PowerPoints) as well as more traditionally blended courses. Helms recommended the use of “blended” over “hybrid” because these courses combine different but complementary approaches rather than layer opposing methods and formats.

Blended learning can merge the relative strengths of F2F and online education within a flexible course delivery format. As such, this delivery form has a similar potential of online courses to reduce the cost of administration (Bowen et al., 2014) while addressing concerns of quality and achievement gaps that may come from online education. Advantages of blended courses include: convenience and efficiency for the student; promotion of active learning; more effective use of classroom space; and increased class time to spend on higher-level learning activities such as cooperative learning, working with case studies, and discussing big picture concepts and ideas (Ahmed, 2010; Al-Qahtani & Higgins, 2013; Lewis & Harrison, 2012).

Although many definitions of hybrid and blended learning exist, there is a convergence upon three key points: (1) Web-based learning activities are introduced to complement face-to-face work; (2) “seat time” is reduced, though not eliminated altogether; (3) the Web-based and face-to-face components of the course are designed to interact pedagogically to take advantage of the best features of each.
The amount of in-class time in hybrids varies from school to school. Some require that more than 50% be in class; others say more than 50% must be online; still others indicate that 20%–80% must be in class (or online). There is consensus that the time is generally split 50-50, but it depends on the best pedagogy for what the instructor wants to achieve.

Backchannel and CRS (or Audience Response Systems)

More information:

Blended Synchronous Learning project (http://blendsync.org/)

https://journals.uair.arizona.edu/index.php/itet/article/view/16464/16485

https://www.binghamton.edu/academics/provost/faculty-staff-handbook/handbook-vii.html

VII.A.3. Distance Learning Courses
Distance learning courses are indicated in the schedule of classes on BU Brain with an Instructional Method of Online Asynchronous (OA), Online Synchronous (OS), Online Combined (OC), or Online Hybrid (OH). Online Asynchronous courses are those in which the instruction is recorded/stored and then accessed by the students at another time. Online Synchronous courses are those in which students are at locations remote from the instructor and viewing the instruction as it occurs. Online Combined courses are those in which there is a combination of asynchronous and synchronous instruction that occurs over the length of the course. Online Hybrid courses are those in which there is both in-person and online (asynchronous and/or synchronous) instruction that occurs over the length of the course.

Digital Citizenship Symposium

We invite you to join us on Monday, March 12, 2018, in Washington, DC, for the 2018 Global Symposium on Digital Citizenship.
$129. Select CoSN Member or Non-member, change the “0” next to “Symposium on Educating for Digital Citizenship ONLY” to a “1,” click “next,” and complete your registration.
CoSN and UNESCO, in partnership with the Global Education Conference, HP, ClassLink, Participate, Qatar Foundation International, Partnership for 21st Century Learning, ISTE, iEARN-USA, The Stevens Initiative at the Aspen Institute, World Savvy, Wikimedia, TakingITGlobal, the Smithsonian Institution, and Project Tomorrow, are hosting this event to bring together thought leaders from across the world to explore the role of education in ensuring students are responsible digital citizens.

Internet safety has been a concern for policymakers and educators since the moment technology, particularly the Internet, was introduced into classrooms. Increasingly, many school systems are shifting that focus from simply minimizing risk and blocking access to responsible-use policies and strategies that empower the student as a digital citizen. Digital citizenship initiatives also seek to prepare students to live in a world where online hate and radicalization are all too common.

 

Join us for a lively and engaging exploration of the essential digital citizenship skills that students need, including policies and practices in response to the following questions:
  • How can technology be used to improve digital citizenship and to what extent is technology providing new challenges to digital citizenship?
  • How should we access information effectively and evaluate its accuracy?
  • How should we develop the skills to engage with others respectfully and in a sensitive and ethical manner?
  • How should we develop an appropriate balance between instruction and nurturing student behaviors that ensure ICT (Information and communications technology) is used safely and responsibly?

ECAR Study of Undergraduate Students and Information Technology, 2017


  • Students would like their instructors to use more technology in their classes. Technologies that provide students with something (e.g., lecture capture, early-alert systems, LMS, search tools) are more desired than those that require students to give something (e.g., social media, use of their own devices, in-class polling tools). We speculate that sound pedagogy and technology use tied to specific learning outcomes and goals may improve the desirability of the latter.
  • Students reported that faculty are banning or discouraging the use of laptops, tablets, and (especially) smartphones more often than in previous years. Some students reported using their devices (especially their smartphones) for nonclass activities, which might explain the instructor policies they are experiencing. However, they also reported using their devices for productive classroom activities (e.g., taking notes, researching additional sources of information, and instructor-directed activities).

++++++++++++++
more on ECAR studies in this IMS blog
https://blog.stcloudstate.edu/ims?s=ecar

embedded librarian qualifications

Qualifications of the embedded librarian: is there any known case of an academic library employing as an embedded librarian a specialist who holds both an MLIS and a terminal degree in the discipline in which he or she is embedded?

I also think that we need to be more welcoming to people who may not have come through a traditional education program (i.e., the M.L.S.) but who bring critical skills and new perspectives into the library.
The Changing Roles of Academic and Research Libraries – Higher Ed Careers – HigherEdJobs. (2013). Retrieved from https://www.higheredjobs.com/HigherEdCareers/interviews.cfm?ID=632

“Embedded librarian” is understood as a librarian’s presence in traditional classroom environments and/or through the LMS.
Then opinions vary. According to Kvenild (2012), http://www.cclibinstruction.org/wp-content/uploads/2012/02/CCLI2012proceedings_Kvenild.pdf

  1. “Their engagement can be over two or more class sessions, even co-teaching the class in some cases. This model provides in-depth knowledge of student research projects during the research and revision process.” This is for first-year experience students.
  2. Embedding with project teams in Business and STEM programs involves: “in-depth participation in short-term projects, aiding the team in their searches, literature review, grant preparation, data curation, or other specialized information aspects of the project. This level of embedment requires a heavy time commitment during the length of the project, as well as subject expertise and established trust with the research team.”
  3. Embedding in departments as a liaison.
    “They are usually closely affiliated with the department (maybe even more so than with the libraries) and might be paid out of departmental funds. These librarians learn the ways and needs of their patrons in their natural environment. They often work as finders of information, organizers of information, and taxonomy creators. Embedding within departments provides in-depth knowledge of the users of library services, along with potential isolation from other librarians. It involves a high degree of specialization, co-location and shared responsibility”

Best practices, new opportunities (video, screencasts, social media, Adobe Connect), assessment.

Here is also Kvenild’s 2016 article:

Kvenild, C., Tumbleson, B. E., Burke, J. J., & Calkins, K. (2016). Embedded librarianship: questions and answers from librarians in the trenches. Library Hi Tech, 34(2), 8-11.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d124513010%26site%3dehost-live%26scope%3dsite

utilizing technology tools; and providing information literacy and assessment. Technology tools continue to evolve and change, and most librarians can anticipate using multiple learning management systems over time. There is an ongoing need for professional development in online library instruction and assessment.

+++++++++++++++++++++++++++++++

Tumbleson, B. E., & Burke, J. J. (2013). Embedding librarianship in learning management systems: A how-to-do-it manual for librarians. Neal-Schuman, an imprint of the American Library Association.

https://blog.stcloudstate.edu/ims/2015/05/04/lms-and-embedded-librarianship/

Read in red my emphasis on excerpts from that book.

++++++++++++++++++++++++++++++++++

Monroe-Gulick, A., O’Brien, M. S., & White, G. (2013). Librarians as Partners: Moving from Research Supporters to Research Partners. Indianapolis, IN. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2013/papers/GulickOBrienWhite_Librarians.pdf

From Supporter to Partner

++++++++++++++++++++++
Andrews, C. (2014). An Examination of Embedded Librarian Ideas and Practices: A Critical Bibliography.

http://academicworks.cuny.edu/cgi/viewcontent.cgi?article=1000&context=bx_pubs

emphasis is on undergraduate. “a tremendous amount of literature published addressing library/faculty partnerships.”

“There will never be one golden rule when it comes to way in which a librarian networks with faculty on campus.”

++++++++++++++++
Bobish, G. (2011). Participation and Pedagogy: Connecting the Social Web to ACRL Learning Outcomes. Journal of Academic Librarianship, 37(1), 54-63.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3daph%26AN%3d57844282%26site%3dehost-live%26scope%3dsite

https://www.researchgate.net/publication/232382226_Participation_and_Pedagogy_Connecting_the_Social_Web_to_ACRL_Learning_Outcomes

requested through researchgate

++++++++++++++++++++++

Cahoy, E. S., & Schroeder, R. (2012). Embedding Affective Learning Outcomes in Library Instruction. Communications in Information Literacy, 6(1), 73-90.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d84110749%26site%3dehost-live%26scope%3dsite

Attention must be paid to students’ affective, emotional needs throughout the research process. My note: and this is exactly what half of my service comprises. The relatively small amount of research into affective learning, as opposed to cognition, remains true to this day.

p. 78: As the 50-minute one-shot session is still the norm for library research sessions on the majority of campuses, behavioral assessment can be problematic.

++++++++++++++++++++++++++++++

Cha, T., & Hsieh, P. (2009). A Case Study of Faculty Attitudes toward Collaboration with Librarians to Integrate Information Literacy into the Curriculum. (Chinese). Journal of Educational Media & Library Sciences, 46(4), 441-467.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d502982677%26site%3dehost-live%26scope%3dsite

Meanwhile, different attitudes were revealed between teaching higher order thinking skills and lower order thinking skills. Librarian Domain Knowledge, Librarian Professionalism, Curriculum Strategies, and Student Learning were identified as factorial dimensions influencing faculty-librarian collaboration.

two major concerns of “Students Learning” and “Librarian Professionalism” from faculty provide insights that understanding pedagogy, enhancing instructional skills and continuing progress in librarian professionalism will contribute to consolidating partnerships when developing course-specific IL programs.

This proves how right I am to develop http://web.stcloudstate.edu/pmiltenoff/bi/

++++++++++++++++++

Covone, N., & Lamm, M. (2010). Just Be There: Campus, Department, Classroom…and Kitchen? Public Services Quarterly, 6(2/3), 198-207. doi:10.1080/15228959.2010.498768

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d53155456%26site%3dehost-live%26scope%3dsite

p. 199 There is also the concept of the ‘‘blended librarian’’ as described by Bell and Shank (2004) to merge the assets and abilities of a librarian with those of one versed in technology. Academic librarians are obligated and privileged to merge several strengths to meet the needs of their user population. No longer is the traditional passive role acceptable. Bell and Shank (2004) implore academic librarians ‘‘to proactively advance their integration into the teaching and learning process’’ (p. 373).

p. 200 first year experience

++++++++++++++++++++++++++++
Dewey, B. I. (2004). The Embedded Librarian: Strategic Campus Collaborations. Resource Sharing & Information Networks17(1-2), 5-17.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3deric%26AN%3dEJ835947%26site%3dehost-live%26scope%3dsite

p. 6 the imperative for academic librarians to become embedded in the priorities of teaching, learning, and research in truly relevant ways. Embedding as an effective mode of collaboration will be explored through examples relating to the physical and virtual environment. An analysis of current approaches and next steps for the future will be addressed, with the goal of providing food for thought as librarians assess programs and activities in terms of positive collaboration and effectiveness

p. 9  new academic salon,
p. 10 the pervasive campus librarian
The fact that we are generalists and devoted to all disciplines and all sectors of the academic user community gives us a special insight on ways to advance the university and achieve its mission.

This contradicts Shumaker and Talley, who assert that the embedded librarian is NOT a generalist but a specialist.

p. 11 Central administrators, along with the chief academic officer, make critical funding and policy decisions affecting the library

p. 11 librarians and teaching.
In 2011, interim dean Ruth Zietlow “gave up” classes after the messy divorce with CIM. The library faculty polled itself, revealing that a significant number of the faculty do NOT want to teach.

p. 14 influencing campus virtual space
this library’s social media is imploded in its image.

++++++++++++++++++++++++++

Drewes, K., & Hoffman, N. (2010). Academic Embedded Librarianship: An Introduction. Public Services Quarterly, 6(2/3), 75-82. doi:10.1080/15228959.2010.498773

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d53155443%26site%3dehost-live%26scope%3dsite

p. 75 Literature about embedded librarianship is so diverse that the definition of this term, as well as related goals and methods when embedding services and programs, can be difficult to define. What are some characteristics of an embedded program? Is embedding only achieved through an online classroom? How did embedded librarianship first begin in academic libraries?

p. 76 adopted as a term because it is a similar concept to embedded journalism.
Embedded librarian programs often locate librarians involved in the spaces of their users and colleagues, either physically or through technology, in order to become a part of their users’ culture. A librarian’s physical and metaphorical location is often what defines them as embedded.

David Shumaker and Mary Talley (see bottom of this blog entry)

Highly technical tasks, such as creating information architecture, using analytical software, and computer and network systems management were performed by less than 20% of the survey respondents. Shumaker and Talley also report embedded services are often found in tandem with specialized funding. This study also confirms embedded services are not new.

p. 77 history and evolution of the role

p. 79 methods of embedding

In North America, one would be hard-pressed to find a library that does not already electronically embed services into online reference chat, make use of Web 2.0 communication applications such as Twitter and blogs, and embed librarians and collaborators within online classrooms. These are all examples of the embedding process (Ramsay & Kinnie, 2006). The name embedded librarian in this context is a double entendre, as the insertion of widgets and multimedia files into HTML code when designing Web sites is usually called the embedding of the file.
My note: this library is actually one that does not use Twitter and blogs in the hard-core meaning of library service.

++++++++++++++++++++++++++++++

Essinger, C., & Ke, I. (2013). Outreach: What Works? Collaborative Librarianship, 5(1), 52-58.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d87760803%26site%3dehost-live%26scope%3dsite

Recommendations:
The authors distributed their findings at a half-day workshop attended by nearly all liaisons. They made the following recommendations:

  • Personalize outreach.
  • Spend more time marketing and reaching out to departments, even though it might mean having less time for other activities.
  • Find an alternative advocate who can build your reputation through word-of-mouth if your relationship with your assigned department liaison is not fruitful.
  • Seek opportunities to meet department staff in person.
  • As much as possible, administrators should commit to keeping liaison assignments static.

p. 57: Faculty outreach is similar to other types of relationship building: it requires time to establish trust, respect, and appreciation on both sides. Even a liaison’s challenging first two years can, therefore, be viewed as productive because the relationship is developing in the background. This phenomenon also signals to library administrators the benefits of maintaining a stable workforce. Frequent changes in academic assignments and staff changes can lead to a less engaged user population, and also make the outreach assignment much more frustrating.

+++++++++++++++++

Heider, K. L. (2010). Ten Tips for Implementing a Successful Embedded Librarian Program. Public Services Quarterly, 6(2-3), 110-121.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3deric%26AN%3dEJ896199%26site%3dehost-live%26scope%3dsite

embedded librarian program in the university’s College of Education and Educational Technology

p. 112 Make Sure You Have Buy-in from All Stakeholders

Include College/Department Faculty in the Interview Process

Look for the Following Qualities/Qualifications in an Embedded Librarian

Have a Physical Presence in the College/Department a Few Days Each Week

Serve as Bibliographer to the College/Department

Offer Bibliographic Instruction Sessions and Guest Lectures at Main Campus, Branch Campuses, and Centers

Develop Collaborative Programs that Utilize the Library’s Resources for College/Department Improvement

#9 Offer to Teach Credit Courses for the College/Department When Department Faculty Are Not Available

#10: Publish Scholarly Works and Present at Professional Conferences with College/Department Faculty. My note: again, Martin Lo, John Hoover.

+++++++++++++++++++

Hollister, C. V. (2008). Meeting Them where They Are: Library Instruction for Today’s Students in the World Civilizations Course. Public Services Quarterly, 4(1), 15-27.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3deric%26AN%3dEJ877341%26site%3dehost-live%26scope%3dsite

history and library. My note: can you break the silo in the history department? https://blog.stcloudstate.edu/ims/2017/05/01/history-becker/ 

world civilizations course

Faculty come to the world civilizations enterprise from a broad range of academic disciplines and world experiences, which has a significant impact on their interpretations of world history, their selections of course materials, their teaching styles, and their expectations for students. Moreover, faculty teach the course on a rotating basis. So, there is no single model of faculty-librarian collaboration that can be applied from section to section, or even from semester to semester. Faculty have widely differing views on the role of library instruction in their sections of the course, and the extent to which library research is required for coursework. They also differ in terms of their ability or willingness to collaborate with the libraries. As a result, student access to library instruction varies from section to section.

+++++++++++++++++

Kesselman, M. A., & Watstein, S. B. (2009). Creating Opportunities: Embedded Librarians. Journal of Library Administration, 49(3), 383-400.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d502977425%26site%3dehost-live%26scope%3dsite

p. 384 embedded librarians in the blogosphere.
My note: not even close to the local idea of how a blog must be used for library service.

p. 387 definitions

p. 389 clinical librarianship – term from the 1970s.

p. 390 Special librarians and particularly those in corporate settings tend to be more integrated within the company they serve and are often instrumental in cost-related services such as competitive intelligence, scientific, and patent research.

p. 391 Librarians Collaborating With Faculty in Scholarly Communication Activities

My note: this is what I am doing with Martin Lo and used to do with John Hoover. There have been attempts with the sociology department and the IS department.

p. 392 Role of Librarians With Multidisciplinary Collaborations

My note: my work with Mark Gill and Mark Petzhold

p. 393 social media
Again, this library cannot be farther from the true meaning of Web 2.0 collaboration.

p. 396 organizational structures

Three different types of organizational structures are generally recognized—hierarchical, matrix, and flat. We suggest that each of these conventional structures promotes, to some extent, its own brand of silos—silos that inherently pose obstacles to the assumption of new roles and responsibilities. For example, we question whether the hierarchical organization structures that define many of our libraries, with their emphasis on line, lateral staff and functional relationships and the relative ranks of parts and positions or jobs, are flexible enough to support new roles and responsibilities. In contrast, matrix management offers a different type of organizational management model in which people with similar skills are pooled for work assignments. We suggest that, in contrast to hierarchical structures, matrix management allows team members to share information more readily across task boundaries and allows for specialization that can increase depth of knowledge and allow professional development and career progression to be managed. The third organizational structure mentioned—flat or horizontal organizations—refers to an organizational structure with few or no levels of intervening management between staff and managers.

+++++++++++++++++

Kobzina, N. G. (2010). A Faculty—Librarian Partnership: A Unique Opportunity for Course Integration. Journal of Library Administration, 50(4), 293-314.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d502990477%26site%3dehost-live%26scope%3dsite

My LIB 290 is such a class, and I am the only one teaching it online to QM standards.
Can the administration encourage Global Studies to combine efforts with my LIB 290 and offer a campus-wide class?

++++++++++++++++++++

Lange, J., Canuel, R., & Fitzgibbons, M. (2011). Tailoring information literacy instruction and library services for continuing education. Journal of Information Literacy, 5(2), 66-80.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d70044774%26site%3dehost-live%26scope%3dsite

McGill. p. 77 The McGill University Library’s system-wide liaison model emphasises a disciplinary approach, placing the impetus for outreach and service on individual librarians responsible for particular departments and user groups.

 

+++++++++++++++++

McMillen, P., & Fabbi, J. (2010). How to Be an E3 Librarian. Public Services Quarterly, 6(2/3), 174-186. doi:10.1080/15228959.2010.497454

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d53155458%26site%3dehost-live%26scope%3dsite

ILL

++++++++++++++++++++

Meyer, N. J., & Miller, I. R. (2008). The Library as Service-Learning Partner: A Win-Win Collaboration with Students and Faculty. College & Undergraduate Libraries, 15(4), 399-413.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d502937618%26site%3dehost-live%26scope%3dsite

ILL

I did something similar with Keith Christensen in 2012: http://bit.ly/SCSUlibGame; yet again, it was blocked from further consideration.

+++++++++++++++++++++
Niles, P. (2011). Meeting the Needs of the 21st Century Student. Community & Junior College Libraries, 17(2), 47-51.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3deric%26AN%3dEJ947074%26site%3dehost-live%26scope%3dsite

about Millennials

p. 48, my note: the losing battle of convincing Millennials of the value of books.

Librarians need to emphasize that not all information is found on the Web and that the information found there might not be reliable, depending on its source.

p. 49 The latest technology can be used for communication. Two examples of this modernization process are making podcasts of library lectures and using instant messaging to answer reference queries. Students need Reference Librarians to assist them in focusing their research, showing them appropriate sources and how to use those sources. The change is not how the librarians serve the students but how the service is delivered. Instead of coming to the reference desk Millennial students may choose to use e-mail, cell phones to send a text message or use a chat reference service to communicate with the librarian. Students want to have 24/7 access to library resources and librarians.

My note: and yet this library still uses 1990s-style communication; the Facebook page is just an easily edited web page, and the concept of Web 2.0 has not yet arrived to shape the current communication.

p. 50 Librarians should examine how they present library instruction and ensure that students know why it is important. Further, Lancaster and Stillman state that librarians need to “incorporate some computer-based instruction for Millennials as it allows them to go at their own speed and acknowledges their ability to manage information” (2003, 231).
And, once again, on infusing library instruction with technology: http://web.stcloudstate.edu/pmiltenoff/bi/

+++++++++++++++++++

Oakleaf, M., & VanScoy, A. (2010). Instructional Strategies for Digital Reference: Methods to Facilitate Student Learning. Reference & User Services Quarterly, 49(4), 380-390.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d502996462%26site%3dehost-live%26scope%3dsite

constructivism, social constructivism, active learning

They have a graph about metacognition. I wish they had found a place for metaliteracy as well.

p. 383, #5 "Let them drive": this is EXACTLY what I am offering with http://web.stcloudstate.edu/pmiltenoff/bi/; students build their own construct.

p. 386 my work with the doctoral cohorts:

In the current climate of educational accountability, reference librarians should embrace the opportunity to align reference service with the teaching and learning missions of their libraries and overarching institutions

+++++++++++++++++

Rao, S., Cameron, A., & Gaskin-Noel, S. (2009). Embedding General Education Competencies into an Online Information Literacy Course. Journal of Library Administration, 49(1/2), 59-73.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d502963853%26site%3dehost-live%26scope%3dsite

Online programs use a 3-credit junior seminar course (JRSM 301) to assess general education competencies.

p. 60 The 3-credit course titled LISC 260—Using Electronic Resources for Research has existed as a required course for this overseas cohort of students since the fall of 1999. The course was initially developed as a required course to introduce the Mercy College Libraries’ resources to this cohort of overseas students. Full-time librarians teach this course as an overload.

The course lasts for 8 weeks during fall and spring semesters and is divided into eight modules with five quizzes. Summer sessions are shorter; the summer version of the course runs for 6 weeks. There is no midterm exam, final exam, project, or term paper for this course. Sixty percent of the grade is based on the quizzes and assignments and 40% on discussion and class participation.

Each quiz addresses a specific competency. We identified the modules where the five competencies would fit best. A document contains the statements of the five general education competencies: critical thinking, information literacy, quantitative reasoning, critical reading, and writing.

Critical Thinking Competency This competency was placed in the second module covering the topic “Developing Search Strategies” in the second week of the course. In this module, students are required to select a topic and develop logical terminologies and search strings. This task requires a great deal of critical and analytical thinking and therefore lays the groundwork for the other competencies. The quizzes and assignments for this competency involve breaking or narrowing down the topic into subtopics, comparing two topics or ideas, and similar skills. It is hoped that students will be able to adopt Boolean and other search logic in clear and precise ways in their analyses and interpretations of their topic and use the search strategies they develop for continued assignments throughout the rest of the course.

p. 61. Information Literacy Competency The information literacy competency is introduced in the fourth module in the fourth week of the course. As part of the course, students are required to learn about the Mercy College Libraries’ indexes and databases, which this module addresses (“Information Literacy,” n.d.).

Quantitative Reasoning Competency

This seminar course is a library research course with no statistics or mathematics component. Many students enrolled in the course are not mathematics or statistics majors, hence some creativity was needed to evaluate their mathematical and computational skills. Students are given this competency in the fifth module during the fifth week of the course, which deals with subject-specific sources. It was decided that, to assess this competency, a quiz analyzing data obtained in a tabular format from one of the databases subscribed to by the library would fulfill the requirement. Students are given a choice of various countries and related data, and are asked to create some comparative demographic profiles. This approach has worked well because it gives students the opportunity to focus on countries and data that interest them.

 

++++++++++++++++++

Abrizah, A., Inuwa, S., & Afiqah-Izzati, N. (2016). Systematic Literature Review Informing LIS Professionals on Embedding Librarianship Roles. Journal of Academic Librarianship, 42(6), 636-643. doi:10.1016/j.acalib.2016.08.010

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dlxh%26AN%3d119652419%26site%3dehost-live%26scope%3dsite

Requested through ResearchGate.

+++++++++++++++++++
Summey, T. P., & Kane, C. A. (2017). Going Where They Are: Intentionally Embedding Librarians in Courses and Measuring the Impact on Student Learning. Journal of Library & Information Services in Distance Learning, 11(1/2), 158-174. doi:10.1080/1533290X.2016.1229429

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d121839436%26site%3dehost-live%26scope%3dsite

a pilot project utilizing a variety of methods.

p. 158 The concept of embedded librarians is not new, as it has antecedents in the branch librarians of the 1970s and the academic departmental liaisons of the 1980s and 1990s. However, it is a way to proactively reach out to the campus community (Drewes & Hoffman, 2010).

There is not a one-size-fits-all definition for embedded librarianship. As a result, librarians in academic libraries may be embedded in their communities in a variety of ways and at varying levels, from course-integrated instruction to being fully embedded as a member of an academic department.

p. 160 my note: the authors describe the standard use of LMS for embedded librarianship.

p. 163 They managed to fight to ensure their efforts are "credited." Assigning credit for embedded-librarian activities can be a very tough process.

p. 165  assessment

The authors utilized a pre-module and post-module survey to assess the students' performance using library resources. The survey also helped to determine the students' perceived self-efficacy and confidence in using the library, its resources, and services. In addition, the researchers analyzed student responses to discussion questions, studied feedback at the end of the course in the course discussion forum, and conducted interviews with the faculty members teaching the courses.

In another study, researchers analyzed the bibliographies of students in the course to identify what resources they cited in their research projects. More specifically, they analyzed the type and appropriateness of the sources used by the students and their currency, noting how deeply the students delved into their topics. They also looked at the number of references cited. The authors believed that examining the bibliographies provided an incomplete picture because it yielded data on the sources selected by the students but not information on how they retrieved those sources.

p. 171 survey sample

+++++++++++++++++

Wu, L., & Thornton, J. (2017). Experience, Challenges, and Opportunities of Being Fully Embedded in a User Group. Medical Reference Services Quarterly, 36(2), 138-149. doi:10.1080/02763869.2017.1293978

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d122763145%26site%3dehost-live%26scope%3dsite

This is somewhat close to my role with the EDAD.

The Texas A&M University academic health sciences library integrated a librarian into the College of Pharmacy, approximately 250 miles away from the main library. Pre-embedded and post-embedded activities demonstrated the effectiveness and impact of the embedded program.

For this study, the fully embedded librarian is defined as one who is out of the traditional library and into an onsite setting to provide a full range of library services that enable collaboration with researchers or teaching faculty and support student learning. In this model, the embedded librarian is a team member of the RCOP rather than a service provider standing apart. The lines are not blurred as to the kind of services that should be embedded because the embedded librarian is 100% onsite. Very few reports in the literature describe fully embedded librarian models such as this. However, one similar model exists at the Arizona Health Sciences Library (AHSL), which is affiliated with the University of Arizona, where librarians relocated their permanent offices to the colleges of Nursing, Public Health, and Pharmacy. AHSL librarians spent close to 100% of their time in the colleges.

p. 144 The embedded librarian has gained recognition in the college and was appointed by the dean to serve on the Instructional Venues Ad Hoc Committee (IVC).

My note: This is what Tom Hergert and I have been advocating for years: the role of the librarian is not ONLY to find information and teach how to find it. The role of the librarian is to bring the 21st century to the School of Education: information literacy is only a fragment of metaliteracies. Information literacy was a 1990s priority. While it remains an important part of librarians' goals, digital literacy, visual and media literacy, technology literacy, and the pedagogical application of technology are now integral parts of the embedded librarian's work.

p. 145 Challenges and Opportunities

Another challenge involved the librarian's decision-making and effective communication skills, especially when deciding to implement library services or programs. Other challenges included speaking the client group's language and knowing the information needs of each group—faculty, students, staff, postdocs, research assistants, and research scientists—to deliver the right information at the point of need. The following strategies were practiced to overcome these challenges:

  • A positive attitude can increase connectivity, networking, and collaboration beyond a limited space. Proactively seeking opportunities to participate and get involved in library events, instructional programs, training workshops, or committee work shortened the distance between the remote librarian and those on the main campus.
  • As video conferencing tools (e.g., Adobe Connect, WebEx, Skype, Google Hangouts, Zoom) were the primary means for the remote librarian to attend library meetings and teach in library instructional programs, spending some time learning to use these tools and embracing them greatly increased the librarian's capacity to overcome feelings of disconnection.
  • The willingness to travel several times a year to the main campus to meet librarians face-to-face helped in understanding the system and in getting help with issues that seemed complicated and difficult to resolve remotely (e.g., computer issues).
  • Actively listening to the faculty and students during conversations helped in understanding their information needs. This served as the basis for initiating any targeted library services and programs.

Despite the challenges, the embedded librarian was presented with numerous opportunities that a traditional librarian might think impossible or difficult to experience, for example, attending RCOP department meetings or RCOP executive committee meetings to present library resources and services, serving on RCOP committees, co-teaching with faculty in RCOP credit courses, creating and grading assignments counting toward total course credits, and being given access to all RCOP course syllabi in eCampus. (the last is in essence what I am doing right now)

p. 147 Marketing Embedded Library Services

The "What's in It for Me" (WIIFM) principle was a powerful technique to promote embedded library services. The essentials of WIIFM are understanding patron needs and ensuring that the marketing effort or communications address those needs; in other words, always telling patrons what is in it for them when promoting library services and resources. Different venues were used to practice WIIFM:

  • The RCOP faculty email list was an effective way to reach all the faculty. An email message at the beginning of a semester highlighted the embedded librarian's services. During the semester, the librarian communicated with the faculty about specific resources and services addressing their needs, such as measuring their research impact at the time of their annual evaluation, sharing grant funding resources, and promoting MSL's resources related to reuse of images.
  • Library orientations for new students and new faculty allowed the librarian to focus on whom to contact for questions and help, available resources, and ways to access them.
  • Being a guest speaker at the monthly RCOP departmental faculty meetings provided another opportunity for the librarian to promote services and resources.
  • Casual conversations with faculty, students, researchers, and postdocs in the hallway, at staff luncheons, and at RCOP events helped the librarian understand their information needs, which in turn helped initiate MSL service projects and programs.
  • The Facebook private group, created by Instructional Technology & MSL Resources @ Rangel COP, was used to announce MSL resources and services. The group currently has 256 members. The librarian is one of the group administrators and answers student questions related to library and MSL resources. (Social media is my forte.)

p. 148 This model would not have been successful without the strong support from MSL leadership team and the RCOP administration.

The next step would be to conduct a systematic assessment to get feedback from RCOP administrators, faculty, students, staff, postdocs, and research assistants. The integration of the library instructional program into the RCOP curriculum should be included in RCOP final course evaluations. Another future direction might be to conduct a curriculum map to get a better idea of the learning objectives of each course and to identify information literacy instruction needs across the curriculum. The curriculum mapping might also help better structure library instruction delivery to RCOP. Teaching content might be structured more purposefully and logically sequenced across the curriculum to ensure that what students have learned in one course prepares them for the next ones.

+++++++++++++++++++

Blake, L., Ballance, D., Davies, K., Gaines, J. K., Mears, K., Shipman, P., … Burchfield, V. (2016). Patron perception and utilization of an embedded librarian program. Journal of the Medical Library Association, 104(3), 226-230. doi:10.3163/1536-5050.104.3.008

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d116675007%26site%3dehost-live%26scope%3dsite

The overall satisfaction with services was encouraging, but awareness of the embedded program was low, suggesting an overall need for marketing of services.

+++++++++++++++++++

Tumbleson, B. E. (2016). Collaborating in Research: Embedded Librarianship in the Learning Management System. Reference Librarian, 57(3), 224-234. doi:10.1080/02763877.2015.1134376

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d114820440%26site%3dehost-live%26scope%3dsite

+++++++++++++++++++

O’Toole, E., Barham, R., & Monahan, J. (2016). The Impact of Physically Embedded Librarianship on Academic Departments. Portal: Libraries & the Academy, 16(3), 529-556.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d116715636%26site%3dehost-live%26scope%3dsite

++++++++++++++

Agrawal, P., & Kumar, A. (2016). Embedded Librarianship and Academic Setup: Going beyond the library stockades. International Journal of Information Dissemination & Technology, 6(3), 170-173.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dllf%26AN%3d119763981%26site%3dehost-live%26scope%3dsite

India. p. 173: as of today, most users are not able to differentiate between library professionals who hold a bachelor's degree, a master's degree, or a doctorate in the subject. My note: not in my case, and this is my great advantage.

+++++++++++++++++

Madden, H., & Rasmussen, A. M. (2016). Embedded Librarianship: Einbindung von Wissenschafts- und Informationskompetenz in Schreibkurse / Ein US-amerikanisches Konzept [Embedding scholarly and information literacy in writing courses / A U.S. concept]. Bub: Forum Bibliothek und Information, 68(4), 202-205.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dlxh%26AN%3d114671852%26site%3dehost-live%26scope%3dsite

ILL

+++++++++++++++++
Delaney, G., & Bates, J. (2015). Envisioning the Academic Library: A Reflection on Roles, Relevancy and Relationships. New Review of Academic Librarianship, 21(1), 30-51. doi:10.1080/13614533.2014.911194

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d101516816%26site%3dehost-live%26scope%3dsite

overview of the literature on embedded librarianship

+++++++++++++++++++
Freiburger, G., Martin, J. R., & Nuñez, A. V. (2016). An Embedded Librarian Program: Eight Years On. Medical Reference Services Quarterly, 35(4), 388-396. doi:10.1080/02763869.2016.1220756

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d118281342%26site%3dehost-live%26scope%3dsite

close to my role with the doctoral cohorts

++++++++++++++++++++++

Wilson, G. (2015). The Process of Becoming an Embedded Curriculum Librarian in Multiple Health Sciences Programs. Medical Reference Services Quarterly, 34(4), 490-497. doi:10.1080/02763869.2015.1082386

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d110525415%26site%3dehost-live%26scope%3dsite

ILL

+++++++++++++++++++++

Milbourn, A. (2013). A Big Picture Approach: Using Embedded Librarianship to Proactively Address the Need for Visual Literacy Instruction in Higher Education. Art Documentation: Bulletin of the Art Libraries Society of North America, 32(2), 274-283.

http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3daft%26AN%3d92600699%26site%3dehost-live%26scope%3dsite

visual literacy – this is an IMS area, which was de facto crowded out by the omnipotence of "information literacy"

+++++++++++++++++

Talley, M. (2007). Success and the Embedded Librarian. https://www.sla.org/wp-content/uploads/2013/05/Success_and_the_Embedded.pdf

Shumaker, D., Talley, M. Models of Embedded Librarianship: A Research Summary. https://www.sla.org/wp-content/uploads/2013/05/Models_of_Embedded.pdf

Shumaker, D., Talley, M. (2009). Models of Embedded Librarianship. Final Report.  Prepared under the Special Libraries Association Research Grant 2007. https://embeddedlibrarian.files.wordpress.com/2009/08/executivesummarymodels-of-embedded-librarianship.pdf

Shumaker, D. (2013). Embedded Librarianship: Digital World Future? http://www.infotoday.com/CIL2013/session.asp?ID=W30

Models of Embedded Librarianship presentation (final_mt61509) by Mary Talley
slide 8: vision of embedded librarianship:
customer-centric, not library-centric; located in their workplace, not our workplace; focused on small groups, not entire populations; composed of specialists, not generalists; dependent on domain knowledge, not only library skills; aiming at analysis and synthesis, not simply delivery; in context, not out of context; built on trusted advice, not service delivery
all of the above is embodied in my work with the doctoral cohorts
slide 9: why study? because the traditional library service model is in decline
slide 11: broad analytical research on successful implementation is lacking
slide 20: large institutions are more likely to offer specialized services
slide 21: domain knowledge comes through continuous learning, not always through formal degrees.
slide 39: what matters most
slide 40: strong leadership by library managers is critical (I will add here: "by deans of other colleges")
+++++++++++++++++++++
bibliography:

Abrizah, A., Inuwa, S., & Afiqah-Izzati, N. (2016). Systematic Literature Review Informing LIS Professionals on Embedding Librarianship Roles. Journal Of Academic Librarianship42(6), 636-643. doi:10.1016/j.acalib.2016.08.010

Agrawal, P. p., & Kumar, A. (2016). Embedded Librarianship and Academic Setup: Going beyond the library stockades. International Journal Of Information Dissemination & Technology6(3), 170-173.

Andrews, C. R. (2014). CUNY Academic Works An Examination of Embedded Librarian Ideas and Practices: A Critical Bibliography. An Examination of Embedded Librarian Ideas and Practices: A Critical Bibliography. Codex, 3(1), 2150–86. Retrieved from http://academicworks.cuny.edu/bx_pubs

Blake, L., Ballance, D., Davies, K., Gaines, J. K., Mears, K., Shipman, P., & … Burchfield, V. (2016). Patron perception and utilization of an embedded librarian program. Journal Of The Medical Library Association104(3), 226-230. doi:10.3163/1536-5050.104.3.008

Bobish, G. (2011). Participation and Pedagogy: Connecting the Social Web to ACRL Learning Outcomes. Journal Of Academic Librarianship37(1), 54-63.

Cahoy, E. S., & Schroeder, R. (2012). EMBEDDING AFFECTIVE LEARNING OUTCOMES IN LIBRARY INSTRUCTION. Communications In Information Literacy6(1), 73-90.

Cha, T., & Hsieh, P. (2009). A Case Study of Faculty Attitudes toward Collaboration with Librarians to Integrate Information Literacy into the Curriculum. (Chinese). Journal Of Educational Media & Library Sciences46(4), 441-467.

COVONE, N., & LAMM, M. (2010). Just Be There: Campus, Department, Classroom…and Kitchen?. Public Services Quarterly6(2/3), 198-207. doi:10.1080/15228959.2010.498768

Delaney, G., & Bates, J. (2015). Envisioning the Academic Library: A Reflection on Roles, Relevancy and Relationships. New Review Of Academic Librarianship21(1), 30-51. doi:10.1080/13614533.2014.911194

Dewey, B. I. (2004). The Embedded Librarian: Strategic Campus Collaborations. Resource Sharing & Information Networks17(1-2), 5-17.

DREWES, K., & HOFFMAN, N. (2010). Academic Embedded Librarianship: An Introduction. Public Services Quarterly6(2/3), 75-82. doi:10.1080/15228959.2010.498773

Essinger, C. c., & Ke, I. i. (2013). Outreach: What Works?. Collaborative Librarianship5(1), 52-58.

Freiburger, G., Martin, J. R., & Nuñez, A. V. (2016). An Embedded Librarian Program: Eight Years On. Medical Reference Services Quarterly35(4), 388-396. doi:10.1080/02763869.2016.1220756

Heider, K. L. (2010). Ten Tips for Implementing a Successful Embedded Librarian Program. Public Services Quarterly6(2-3), 110-121.

Hollister, C. V. (2008). Meeting Them where They Are: Library Instruction for Today’s Students in the World Civilizations Course. Public Services Quarterly4(1), 15-27.

Kesselman, M. A., & Watstein, S. B. (2009). Creating Opportunities: Embedded Librarians. Journal Of Library Administration49(3), 383-400.

Kobzina, N. G. (2010). A Faculty—Librarian Partnership: A Unique Opportunity for Course Integration. Journal Of Library Administration50(4), 293-314.

Kvenild, C. (n.d.). The Future of Embedded Librarianship: Best Practices and Opportunities. Retrieved from http://www.cclibinstruction.org/wp-content/uploads/2012/02/CCLI2012proceedings_Kvenild.pdf

Lange, J. j., Canuel, R. r., & Fitzgibbons, M. m. (2011). Tailoring information literacy instruction and library services for continuing education. Journal Of Information Literacy5(2), 66-80.

Madden, H., & Rasmussen, A. M. (2016). Embedded Librarianship: Einbindung von Wissenschafts- und Informationskompetenz in Schreibkurse / Ein US-amerikanisches Konzept. Bub: Forum Bibliothek Und Information68(4), 202-205.

MCMILLEN, P., & FABBI, J. (2010). How to Be an E3 Librarian. Public Services Quarterly6(2/3), 174-186. doi:10.1080/15228959.2010.497454

Meyer, N. J., & Miller, I. R. (2008). The Library as Service-Learning Partner: A Win-Win Collaboration with Students and Faculty. College & Undergraduate Libraries15(4), 399-413.

Milbourn, A. (2013). A Big Picture Approach: Using Embedded Librarianship to Proactively Address the Need for Visual Literacy Instruction in Higher Education. Art Documentation: Bulletin Of The Art Libraries Society Of North America32(2), 274-283.

The Changing Roles of Academic and Research Libraries – Higher Ed Careers – HigherEdJobs. (2013). Retrieved from https://www.higheredjobs.com/HigherEdCareers/interviews.cfm?ID=632

Niles, P. (2011). Meeting the Needs of the 21st Century Student. Community & Junior College Libraries17(2), 47-51.

Oakleaf, M., & VanScoy, A. (2010). Instructional Strategies for Digital Reference: Methods to Facilitate Student Learning. Reference & User Services Quarterly49(4), 380-390.

O’Toole, E., Barham, R., & Monahan, J. (2016). The Impact of Physically Embedded Librarianship on Academic Departments. Portal: Libraries & The Academy16(3), 529-556.

Rao, S., Cameron, A., & Gaskin-Noel, S. (2009). Embedding General Education Competencies into an Online Information Literacy Course. Journal Of Library Administration49(1/2), 59-73.

Shumaker, D., Talley, M. Models of Embedded Librarianship: A Research Summary. https://www.sla.org/wp-content/uploads/2013/05/Models_of_Embedded.pdf

Shumaker, D., Talley, M. (2009). Models of Embedded Librarianship. Final Report.  Prepared under the Special Libraries Association Research Grant 2007. https://embeddedlibrarian.files.wordpress.com/2009/08/executivesummarymodels-of-embedded-librarianship.pdf

Shumaker, D. (2013). Embedded Librarianship: Digital World Future? http://www.infotoday.com/CIL2013/session.asp?ID=W30

Summey, T. P., & Kane, C. A. (2017). Going Where They Are: Intentionally Embedding Librarians in Courses and Measuring the Impact on Student Learning. Journal Of Library & Information Services In Distance Learning11(1/2), 158-174. doi:10.1080/1533290X.2016.1229429

Talley, M. (2007). Success and the Embedded Librarian. https://www.sla.org/wp-content/uploads/2013/05/Success_and_the_Embedded.pdf

Tumbleson, B. E., & Burke, J. (John J. . (2013). Embedding librarianship in learning management systems : a how-to-do-it manual for librarians. Retrieved from http://www.worldcat.org/title/embedding-librarianship-in-learning-management-systems-a-how-to-do-it-manual-for-librarians/oclc/836261183

Tumbleson, B. E. (2016). Collaborating in Research: Embedded Librarianship in the Learning Management System. Reference Librarian, 57(3), 224-234. doi:10.1080/02763877.2015.1134376

Wilson, G. (2015). The Process of Becoming an Embedded Curriculum Librarian in Multiple Health Sciences Programs. Medical Reference Services Quarterly, 34(4), 490-497. doi:10.1080/02763869.2015.1082386

Wu, L., & Thornton, J. (2017). Experience, Challenges, and Opportunities of Being Fully Embedded in a User Group. Medical Reference Services Quarterly, 36(2), 138-149. doi:10.1080/02763869.2017.1293978

+++++++++++++++++
more about embedded librarian in this IMS blog
https://blog.stcloudstate.edu/ims?s=embedded+librarian

teacher evaluation

doctoral cohort student’s request for literature: “I am looking for some more resources around the historical context of teacher evaluation.”

pre-existing bibliography:

Allen, J., Gregory, A., Mikami, A. I., Lun, J., Hamre, B., & Pianta, R. (2013). Observations of Effective Teacher-Student Interactions in Secondary School Classrooms: Predicting Student Achievement With the Classroom Assessment Scoring System—Secondary. School Psychology Review, 42(1), 76–98.

Alonzo, A. C. (2011). Learning progressions that support formative assessment practices. Measurement, 9, 124–129. http://doi.org/10.1080/15366367.2011.599629

Baker, B. D., Oluwole, J. O., & Green, P. C. (2013). The legal consequences of mandating high stakes decisions based on low quality information: Teacher evaluation in the Race-to-the-Top era. Education Policy Analysis Archives, 21(5), 1–71. Retrieved from http://epaa.asu.edu/ojs/article/view/1298

Benedict, A. E., Thomas, R. A., Kimerling, J., & Leko, C. (2013). Trends in teacher evaluation. Teaching Exceptional Children, 45(5), 60–68.

Bonavitacola, A. C., Guerrazzi, E., & Hanfelt, P. (2014). Teachers’ perceptions of the impact of the McREL teacher evaluation system on professional growth.

Danielson, C. (2016). Creating communities of practice. Educational Leadership, (May), 18–23.

Darling-Hammond, L., Wise, A. E., & Pease, S. R. (1983). Teacher Evaluation in the Organizational Context: A Review of the Literature. Review of Educational Research, 53(3), 285–328. http://doi.org/10.3102/00346543053003285

Darling-Hammond, L., Jaquith, A., & Hamilton, M. (n.d.). Creating a Comprehensive System for Evaluating and Supporting Effective Teaching.

Derrington, M. L. (n.d.). Changes in Teacher Evaluation: Implications for the Principal’s Work.

Gallagher, H. A. (2004). Vaughn Elementary’s Innovative Teacher Evaluation System: Are Teacher Evaluation Scores Related to Growth in Student Achievement? Peabody Journal of Education, 79(4), 79–107. http://doi.org/10.1207/s15327930pje7904_5

Hallgren, K., James-Burdumy, S., & Perez-Johnson, I. (2014). State requirements for teacher evaluation policies promoted by Race to the Top.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. http://doi.org/10.3102/003465430298487

Hazi, H. M. (n.d.). Legal Challenges to Teacher Evaluation: Pitfalls and Possibilities in the States. http://doi.org/10.1080/00098655.2014.891898

Ingle, W. K., Willis, C., & Fritz, J. (2014). Collective Bargaining Agreement Provisions in the Wake of Ohio Teacher Evaluation System Legislation. Educational Policy. http://doi.org/10.1177/0895904814559249

Marzano, R. J. (2012). The Two Purposes of Teacher Evaluation. Educational Leadership, 70(3), 14–19. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=83173912&site=ehost-live

Moskal, A. C. M., Stein, S. J., & Golding, C. (2016). Can you increase teacher engagement with evaluation simply by improving the evaluation system? Assessment & Evaluation in Higher Education. http://doi.org/10.1080/02602938.2015.1007838

Quinn, A. E. (n.d.). Looking at the bigger picture with Dr. Robert Marzano: Teacher evaluation and development for improved student learning. The Delta Kappa Gamma Bulletin.

Riordan, J., Lacireno-Paquet, N., Shakman, K., Bocala, C., & Chang, C. (2015). Redesigning teacher evaluation: Lessons from a pilot implementation. Retrieved from http://ies.ed.gov/

Taylor, E. S., & Tyler, J. H. (n.d.). Can teacher evaluation improve teaching? Evidence of systematic growth in the effectiveness of midcareer teachers.

Tuytens, M., & Devos, G. (n.d.). The problematic implementation of teacher evaluation policy: School failure or governmental pitfall? http://doi.org/10.1177/1741143213502188

Wong, W. Y., & Moni, K. (2013). Teachers’ perceptions of and responses to student evaluation of teaching: purposes and uses in clinical education. http://doi.org/10.1080/02602938.2013.844222

my list of literature:

Avalos, B., & Assael, J. (2006). Moving from resistance to agreement: The case of the Chilean teacher performance evaluation. International Journal of Educational Research, 45(4-5), 254-266.

Cowen, J. M., & Fowles, J. (2013). Same contract, different day? an analysis of teacher bargaining agreements in Louisville since 1979. Teachers College Record, 115(5)

Flippo, R. F. (2002). Repeating history: Teacher licensure testing in Massachusetts. Journal of Personnel Evaluation in Education, 16(3), 211-29.

Griffin, G. (1997). Teaching as a gendered experience. Journal of Teacher Education, 48(1), 7-18.

Hellawell, D. E. (1992). Structural changes in education in England. International Journal of Educational Reform, 1(4), 356-65.

Hibler, D. W., & Snyder, J. A. (2015). Teaching matters: Observations on teacher evaluations. Schools: Studies in Education, 12(1), 33-47.

Hill, H. C., & Grossman, P. (2013). Learning from teacher observations: Challenges and opportunities posed by new teacher evaluation systems. Harvard Educational Review, 83(2), 371-384.

Hines, L. M. (2007). Return of the thought police?: The history of teacher attitude adjustment. Education Next, 7(2), 58-65.

Kersten, T. A. (2006). Teacher tenure: Illinois school board presidents’ perspectives and suggestions for improvement. Planning and Changing, 37(3-4), 234-257.

Kersten, T. A., & Israel, M. S. (2005). Teacher evaluation: Principals’ insights and suggestions for improvement. Planning and Changing, 36(1-2), 47-67.

Korkmaz, I. (2008). Evaluation of teachers for restructured elementary curriculum (grades 1 to 5). Education, 129(2), 250-258.

Lamb, M. L., & Swick, K. J. (1975). Historical overview of teacher observation. Educational Forum.

Maharaj, S. (2014). Administrators’ views on teacher evaluation: Examining Ontario’s teacher performance appraisal. Canadian Journal of Educational Administration and Policy, (152)

Naba’h, A. A., Al-Omari, H., Ihmeideh, F., & Al-Wa’ily, S. (2009). Teacher education programs in Jordan: A reform plan. Journal of Early Childhood Teacher Education, 30(3), 272-284.

Ornstein, A. C. (1977). Critics and criticism of education. Educational Forum.

Pajak, E., & Arrington, A. (2004). Empowering a profession: Rethinking the roles of administrative evaluation and instructional supervision in improving teacher quality. Yearbook of the National Society for the Study of Education, 103(1), 228-252.

Stamelos, G., & Bartzakli, M. (2013). The effect of a primary school teachers’ trade union on the formation and realisation of policy in Greece: The case of teacher evaluation policy. Policy Futures in Education, 11(5), 575-588.

Stamelos, G., Vassilopoulos, A., & Bartzakli, M. (2012). Understanding the difficulties of implementation of a teachers’ evaluation system in Greek primary education: From national past to European influences. European Educational Research Journal, 11(4), 545-557.

Sullivan, J. P. (2012). A collaborative effort: Peer review and the history of teacher evaluations in Montgomery county, Maryland. Harvard Educational Review, 82(1), 142-152.

Tierney, W. G., & Lechuga, V. M. (2005). Academic freedom in the 21st century. Thought & Action, , 7-22.

Turri, M. (2014). The new Italian agency for the evaluation of the university system (ANVUR): A need for governance or legitimacy? Quality in Higher Education, 20(1), 64-82.

VanPatten, J. J. (1972). Some reflections on accountability. Journal of Thought.

Vijaysimha, I. (2013). Teachers as professionals: Accountable and autonomous? Review of the report of the Justice Verma Commission on Teacher Education, August 2012, Department of School Education and Literacy, Ministry of Human Resource Development, Government of India. Contemporary Education Dialogue, 10(2), 293-299.

Vold, D. J. (1985). The roots of teacher testing in America. Educational Measurement: Issues and Practice, 4(3), 5-7.

Wermke, W., & Höstfält, G. (2014). Contextualizing teacher autonomy in time and space: A model for comparing various forms of governing the teaching profession. Journal of Curriculum Studies, 46(1), 58-80.

Ydesen, C., & Andreasen, K. E. (2014). Accountability practices in the history of Danish primary public education from the 1660s to the present. Education Policy Analysis Archives, 22(120)

polling clickers education

Using a Mobile Solution to Empower Students in and out of the Classroom

Date: Tuesday, February 23, 2016.  Time: 11:00 AM Pacific Standard Time.  Sponsored by: i>clicker

archived webcast here:
http://w.on24.com/r.htm?e=1130110&s=1&k=281F43456B7CB0FF8CFD94D02CBC7DDC&partnerref=theremind
(you need to register; that means sharing personal data, so fill it out cautiously).

REEF Polling is i>clicker’s proprietary mobile solution.
All the other contenders, TopHat, Turning Technologies etc., follow the same scheme.

methodology of the chemistry teacher:
flipped classroom active learning

quizzes: students may not use external resources; graded on accuracy
questions: students may use external sources; graded on participation (the chemistry teacher wants students to be active and not penalized for a wrong answer).

think: students consider the question and submit an answer individually
pair: the instructor shows the results (no answer is given); students form groups to discuss their answers and must agree on an answer
share: students submit an answer individually; the instructor shows the results (the answer is given)

kate.biberdorf@cm.utexas.edu @FunwithChem

follow-up questions, also standard:
1. what to do if students don’t have smartphones, 2. CRS integration with the CMS, 3. data export

More on polling and CRS in the classroom in this blog:

https://blog.stcloudstate.edu/ims/?s=clickers&submit=Search
https://blog.stcloudstate.edu/ims/?s=classroom+response+systems&submit=Search
https://blog.stcloudstate.edu/ims/?s=crs&submit=Search

More on flipped classroom and active learning in this blog:
https://blog.stcloudstate.edu/ims/?s=flipped+classroom&submit=Search
https://blog.stcloudstate.edu/ims/?s=active+learning&submit=Search

clickers and mobile devices

Transform Classroom Dynamics with Clickers and/or Mobile Devices

Getting students more involved in classroom presentations and assessing their interest is always part of an educator’s goal. Student Response Systems (SRS), also called audience response systems or more commonly “clickers,” have been around in university lecture halls in one form or another for more than two decades.

iclicker_CampusTechnologyGameChanger_onlinefinal

7 Free Social Media Tools for Teachers

7 Fantastic Free Social Media Tools for Teachers

http://mashable.com/2010/10/16/free-social-media-tools-for-teachers/

1. EDU 2.0

EDU 2.0 is a lot like online course management systems Blackboard and Moodle, but with a couple of distinct advantages. First, teachers can share their lesson plans, quizzes, videos, experiments and other resources in a shared library that currently hosts more than 15,000 pieces of content. Second, a community section allows teachers and students to network and collaborate with other members who share the same educational interests. And third, everything is hosted in the cloud for free.

2. SymbalooEDU

The popular visual organizing and sharing tool Symbaloo launched its “EDU” version last month. According to the company, 50,000 teachers are already using Symbaloo to organize classroom resources. The new EDU version comes with academic subject-specific resource pages or “webmixes” and top tools like TeacherTube, Slideshare, Google Docs, Flickr and more are fully embeddable. Teachers with a “Free Plus” account can add their school logo and customize the links. The site also allows students to easily share their Symbaloo pages and projects with classmates.

3. Collaborize Classroom

This app gives teachers four discussion format choices. Students can either agree or disagree with a statement, answer a multiple choice question, post responses, or have the choice between adding a new response or voting for someone else’s response. Teachers can add photos or videos to their prompts and all of the discussions take place on one class page.

4. Edublogs

This WordPress-like blogging platform only supports educational content and thus, unlike WordPress, usually isn’t blocked by school filters. Since 2005, it has hosted more than a million blogs from students and teachers.

5. Kidblog

Kidblog is a bit more specific than Edublogs. There are fewer options to adjust the appearance of the main page, and it’s hard to use the platform for anything other than as a system for managing individual class blogs. The homepage serves as a catalog of student blogs on the right with a recent post feed on the left.

Teachers can also control how private they want the blogs to be. They can keep them student-and-teacher only, allow parents to log in with a password, or make them open to the public.

6. Edmodo

Edmodo looks and functions much like Facebook. But unlike Facebook, it’s a controlled environment that teachers can effectively leverage to encourage class engagement. The platform allows teachers and students to share ideas, files and assignments on a communal wall. Teachers can organize different groups of students and monitor them from the same dashboard. Once they’ve organized classes, they can post assignments to the wall and grade them online. They can then archive the class groups and begin new ones.

7. TeacherTube and SchoolTube and YouTube

As the name implies, TeacherTube is YouTube for teachers. It’s a great resource for lesson ideas but videos can also be used during class to supplement a lecture. For instance, you can let Mrs. Burk rap about perimeters if you like her idea but lack the rhyming skills to pull it off yourself. This site also has a crowdsourced stock of documents, audio and photos that can be added to your lesson plans. Unfortunately, every video is preceded by an ad.

SchoolTube is another YouTube alternative. Unlike other video sharing sites, it is not generally blocked by school filters because all of its content is moderated.

The original, generic YouTube also has a bevy of teacher resources, though it’s often blocked in schools. Khan Academy consistently puts out high-quality lessons for every subject, but a general search on any topic usually yields a handful of lesson approaches. Some of the better ones are indexed on WatchKnow.

Do student evaluations measure teaching effectiveness?


Assistant Professor in MIS

Higher Education institutions use course evaluations for a variety of purposes. They factor in retention analysis for adjuncts, tenure approval or rejection for full-time professors, even in salary bonuses and raises. But, are the results of course evaluations an objective measure of high quality scholarship in the classroom?

—————————-

  • Daniel Williams

    Associate Professor of Molecular Biology at Winston-Salem State University

    I feel they measure student satisfaction, more like a customer service survey, than they do teaching effectiveness. Teachers whom students think are easy get higher scores than tough ones, though the students may have learned less from the former.


  • Muvaffak GOZAYDIN

    Founder at Global Digital University


    How can you measure teachers’ effectiveness?
    That is, how much students learn?
    If there is a method to measure how much we learn, I would appreciate learning it.


  • Michael Tomlinson

    Senior Director at TEQSA

    From what I recall, the research indicates that student evaluations have some value as a proxy and rough indicator of teacher effectiveness. We would expect that bad teachers will often get bad ratings, and good teachers will often get good ratings. Ratings for individual teachers should always be put in context, IMHO, for precisely the reasons that Daniel outlines.

    Aggregated ratings for teachers in departments or institutions can even out some of these factors, especially if you combine them with other indicators, such as progress rates. The hardest indicators, however, are drop-out rates and completion rates. When students vote with their feet, this can flag significant problems. We have to bear in mind that students often drop out for personal reasons, but if your college’s drop-out rate is higher than your peers’, this is worth investigating.


  • Rina Sahay

    Technical educator looking for a new opportunity or career direction

    I agree with what Michael says – to a point. Unfortunately, student evaluations have also been used as a venue for disgruntled students, acting alone or in concert – a popularity contest of sorts. Even more unfortunately, college administrations (especially for-profits) tend to rate instructor effectiveness on the basis of student evaluations.

    IMHO, student evaluation questions need to be carefully crafted to be as objective as possible and to eliminate the possibility of responses of an unprofessional nature. To clarify: a question like “Would you recommend this teacher to other students?” has the greatest potential for counter-productivity.


  • Robert Whipple

    Chair, English Department at Creighton University

    No.


  • Dr. Virginia Stead, Ed.D.

    2013-2015 Peter Lang Publishing, Inc. (New York) Founding Book Series Editor: Higher Education Theory, Policy, & Praxis

    This is not a Cartesian question in that the answer is neither yes nor no; it’s not about flipping a coin. One element that may make it more likely that student achievement is a result of teacher effectiveness is the comparison of cumulative or summative student achievement against incoming achievement levels. Another variable is the extent to which individual students are sufficiently resourced (such as having enough food, safety, shelter, sleep, learning materials) to benefit from the teacher’s beneficence.


  • Barbara Celia

    Assistant Clinical Professor at Drexel University

    Depends on how the evaluation tool is developed. However, overall I do not believe they are effective in measuring teacher effectiveness.


  • Sri Yogamalar

    Lecturer at MUSC, Malaysia

    Overall, I think students are the best judge of a teacher’s effective pedagogy methods. Although there may be students with different learning difficulties (as there usually are in a class), their understanding of the concepts/principles and application of the subject matter in exam questions, etc., depends on how the teacher imparts such knowledge in a simplified and easy manner that enhances analytical and critical thinking. Of course, there are also students who give a bad review of a teacher’s teaching out of spite, just because the teacher has reprimanded them in class for being late, for example, or even for being rude. In such a case, it would not be a true reflection of the teacher’s method of teaching. A teacher tries his/her best to educate and inculcate values by imparting the required knowledge and ensuring a two-way teaching-learning process. The students will be the best judge of the success of the efforts undertaken by the teacher, because it is they who are supposed to benefit at the end of the teaching exercise.


  • Paul S Hickman

    Member of the Council of Trustees & Distinguished Mentor at Warnborough College, Ireland & UK

    No! No!


  • Bonnie Fox

    Higher Education Copywriter

    In some cases, I think evaluations (and negative ones in particular) can offer a good perspective on the course, especially if an instructor is willing to review them with an open mind. Of course, there are always the students who nitpick and, as Rina said, use the eval as a chance to vent. But when an entire class complains about how an instructor has handled a course (as I once saw happen with a tutoring student whose fellow classmates were in agreement about the problems in the course), I think it should be taken seriously. But I also agree with Daniel about how evaluations should be viewed like a customer service survey for student satisfaction. Evals are only useful up to a point.

    I definitely agree about the way evaluations are worded, though, to make sure that it’s easier to recognize the useful information and weed out the whining.


  • Pierre HENON

    university teacher (professeur agrégé)

    I am a director of studies, and students in continuing education evaluate teaching effectiveness. Because I am in an ISO process, I must take those measurements into account. It can be very difficult sometimes, because the number of students does not reach the level required for the sample to be valid (in a statistical sense). But in the meantime, I believe in the utility of such measurements. The hard job for me is when I have to discuss with a teacher who is under the required score.


  • Maria Persson

    Senior Tutor – CeTTL – Student Learning & Digital/Technology Coach (U of W – Faculty of Education)

    I’m currently ‘filling in’ as the administrator in our Teaching Development Unit – Appraisals, and I have come to appreciate that the evaluation tool of choice is only that – a tool. How the tool is used in terms of the objective for collecting ‘teaching effectiveness’ information, the question types developed to gain insight, and then how that information is acted upon to inform future teaching and learning will in many ways denote the quality of the teaching itself!

    Student voice is not just about keeping our jobs, ‘bums on seats’ or ‘talking with their feet’ (all part of it of course) but should be about whether or not we really care about learning. Student voice in the form of evaluating teachers’ effectiveness is critically essential if we want our teaching to model learning that affects positive change – Thomas More’s educational utopia comes to mind…


  • David Shallenberger

    Consultant and Professor of International Education

    Alas, I think they are weak indicators of teaching effectiveness, yet they are often used as the most important indicators of the same. And in the pursuit of a high response rate, they are too often given on the last day of class, when they cannot measure anything significant – before the learning has “sunk in.” Ask better questions, and ask them after students have had a chance to reflect on the learning.


  • Cathryn McCormack

    Lecturer (Teaching and Learning), and Belly Dance teacher

    I’m just wrapping up a very large project at my university that looked at policy, processes, systems and the instrument for collecting student feedback (taking a break from writing the report to write this comment). One thing that has struck me very clearly is that we need to reconceptualise SETs. DeVellis, in Scale Development, talks about how a scale generally has higher validity if the respondent is asked to talk about their own experiences.

    Yet here we are asking students to not only comment on, but evaluate their teachers. What we really want students to do in class is concentrate on their learning – not on what the teacher is doing. If they are focusing on what the teacher is doing, then something is not going right. The way we ask now seems even crazier when we consider that the most sophisticated conception of teaching is to help students learn. So why aren’t we asking students about their learning?

    The standard format has something to do with it – it’s extremely difficult to ask interesting questions on learning when the wording must align with a 5 point Likert response scale. Despite our best efforts, I do not believe it is possible to prepare a truly student centred and learning centred questionnaire using this format.

    An alternate format I came across that I really liked (the modified PLEQ: Devlin 2002, An Improved Questionnaire for Gathering Student Perceptions of Teaching and Learning), but no commercial evaluation software (which we are required to purchase) can do it. A few overarching questions set the scene for the nature of the class, but the general question format goes: In [choose from drop-down list] my learning was [helped/hindered] when [fill in the blank] because [fill in the blank]. The drop-down list would include options such as lectures, seminars/tutorials, a private study situation, preparing essays, labs, field trips, etc. After completing one question the student has the option to fill in another … and another … and another … for as long as they want.

    Think about what information we could actually get on student learning if we started asking like this! No teacher ratings, all learning. The only numbers that would emerge would be the #helped and the #hindered.


  • Hans Tilstra

    Senior Coordinator, Learning and Teaching

    Keep in mind “Goodhart’s Law” – When a measure becomes a target, it ceases to be a good measure.

    For example, if youth unemployment figures become the main measure, governments may be tempted to go for the low hanging fruit, the short term (eg. a work for the dole stick to steer unemployed people into study or the army).


  • robert easterbrook

    Education Management Professional

    Nope.


  • John Stanbury

    Professor at Singapore Institute of Management

    I totally agree with most of the comments here. I find student evaluations to be virtually meaningless as measures of a teacher’s effectiveness. They are measures of student perception, NOT of learning. Yet university administrators, e.g. deans and department chairs, persist in using them to evaluate faculty performance in the classroom, to the point where many instructors have had their careers torn apart. It’s an absolute disgrace!! But no one seems to care! That’s the sick thing about it!


  • Simon Young

    Programme Coordinator, Pharmacy

    Satisfaction cannot be simply correlated with teaching quality. The evidence is that students are most “satisfied” with courses that support a surface learning approach – what the student “needs to know” to pass the course. Where material and delivery are challenging, this generates less crowd approval but, conversely, is more likely to be “good teaching,” as this supports deep learning.

    Our challenge is to achieve deep learning and still generate rave satisfaction reviews. If any reader has the magic recipe, I would be pleased to learn of it.


  • Laura Gabiger

    Professor at Johnson & Wales University


    Maybe it is about time we started calling it what it is and got Michelin to develop the star rating system for our universities.

    Nevertheless I appreciate everyone’s thoughtful comments. Muvaffak, I agree with you about the importance and also the difficulty of measuring student learning. Cathryn, thank you for taking a break from your project to give us an overview.

    My story: the best professor and mentor in my life (I spent a total of 21 years as a student in higher education), the professor from whom I learned indispensable and enduring habits of thought that have become more important with each passing year, was one whom the other graduate students in my first term told me–almost unanimously– to avoid at all costs.


  • Dr. Pedro L. Martinez

    Former Provost and Vice Chancellor for Academic Affairs at Winston Salem State University & President of HigherEd SC.

    I am not sure that course evaluations based on one snapshot measure “teacher effectiveness”. For various reasons, some ineffective teachers get good ratings by pandering to the lowest level of intellectual laziness. However, consistently looking at comments and some other measures may yield indicators of teachers who are unprepared, do not provide feedback, do not adhere to a syllabus of record, and do not respect students in general. I think part of that information depends on how the questions are crafted.

    I believe that a self-evaluation of the instructor over the course of a semester could yield invaluable information. Using a camera and other devices, ask the instructor to take snapshots of the teaching/learning in their classroom over a period of time and then ask for a self-evaluation. For the novice teacher, that information could be evaluated by senior faculty to help the junior faculty member improve his/her delivery. Many instructors are experts in their field but lack exposure to different methods of instructional delivery. I would like to see a taxonomy or scale that measures the instructor’s ability, using lecture as the base of instruction and moving up through levels of problem-based learning, service learning, and undergraduate research, by gauging the different pedagogies (pedagogy, andragogy, heutagogy, paragogy, etc.) that engage students in active learning.


  • Steve Charlier

    Assistant Professor at Quinnipiac University

    I wanted to piggyback on Cathryn’s comment above, and align myself with how many of you seem to feel about student evaluations. The quantitative part of student evals is problematic, for all of the reasons mentioned already. But the open-ended feedback that is (usually) a part of student evaluations is where I believe some real value can be gained, both for administrative purposes and for instructor development.

    When allowed to speak freely, what are students saying? Are they lamenting a particular aspect of the course/instructor? Is that one area coloring their response across all questions? These are all important considerations, and provide a much richer source of information for all involved.

    Sadly, the quantitative data is what most folks gravitate to, simply because it’s standardized and “easy”. I don’t believe that student evaluations are a complete waste of time, but I do think that we tend to focus on the wrong information. And, of course, this ignores the issues of timing and participation rates that are probably another conversation altogether!


  • robert easterbrook

    Education Management Professional

    ‘What the Student Does: teaching for enhanced learning’ by John Biggs in Higher Education Research & Development, Vol. 18, No. 1, 1999.

    “The deep approach refers to activities that are appropriate to handling the task so that an appropriate outcome is achieved. The surface approach is therefore to be discouraged, the deep approach encouraged – and that is my working definition of good teaching. Learning is thus a way of interacting with the world. As we learn, our conceptions of phenomena change, and we see the world differently. The acquisition of information in itself does not bring about such a change, but the way we structure that information and think with it does. Thus, education is about conceptual change, not just the acquisition of information.” (p. 60)

    This is the approach higher education is trying to adapt to at the moment, as far as I’m aware.


  • Cindy Kenkel

    Northwest Missouri State University

    My Human Resource students will focus on this issue in a class debate “Should student evaluation data significantly impact faculty tenure and promotion decisions?” One side will argue “yes, it provides credible data that should be one of the most important elements” and the other group will argue against this based on much of what has been said above. They will say student evaluations are basically a popularity contest and faculty may actually be dumbing down their classes in order to get higher ratings.

    To what extent is student data used in faculty tenure and promotion decisions at your institutions?

  • yasir hayat

    Faculty member at institute of management sciences,peshawar

    NO


  • joe othman

    Associate Professor at Institute of Education, IIUM

    Agree with Pierre: when the number of students responding is not what is expected, then what?

  • joe othman

    Associate Professor at Institute of Education, IIUM

    Cindy: it is used in promotion decisions at my university, but only as a small percentage of the total points. Yet this issue is still a thorny one for some faculty.

  • Sonu Sarda

    Lecturer at University of Southern Queensland

    How open are we? Is learning about the delivery of a subject only, or about building soft skills as well? If we as teachers are facilitating learning in a conducive manner, would it not lead to at least an average TE, and thus indicate our teaching effectiveness at the base level? Indeed, a qualitative approach would be far better if we intend to accomplish the actual purpose of TE, i.e., reflection for continual improvement. More and more classrooms are becoming learner-centered, and to accomplish this the learners’ say is vital.
    That some students use these as platforms for personal whims need not concern many, since TEs are averaged out. Last but not least, TEs are like dynamite and must be handled by experts. They are one means of assessing the gaps, if any, between teaching and learning strategies. They must not be used for performance evaluation; if at all, then all the other factors, such as the number of students, absenteeism, and pass rates (indeed HD and D rates) over a period of at least three terms, must also be included alongside.

  • Dvora Perets

    Teaching colleague at Ben Gurion University of the Negev

    I implement a semester-long self-evaluation process in all my mathematics courses. Students get 3 points (out of 100) for anonymously filling in an online questionnaire every week. They rate (1-5) their personal class experience (I was bored - I was fascinated, I understood nothing - I understood everything, the tutorial sessions didn’t - did help, I visited the lecturer’s/TA’s office hours, I spent X hours on self-learning this week). They can also add verbal comments.
    I started it 10 years ago when I built a new special course, to help me “hear” the students (80-100 in each class) and to better adjust myself and the content to my new students. I used to publish a weekly response to the verbal comments, accepting some and rejecting others while making sure to explain and justify any decision of mine.
    Not only did it help me improve my teaching and the course, it turned out that it actually created a very solid perception of me as a caring teacher. I have always been a very caring teacher (some of my colleagues accuse me of being over-caring…), but it seems that “forcing” my students to give feedback throughout the semester kind of “brought it out” into the open.

    I still use semester-long feedback in all my courses, and I consider both quantitative and qualitative responses. It helps me see that the majority of students understand me in class. I ignore those who choose “I understood nothing” - obviously, if they were indeed understanding “nothing” they would not have come to class… (they can choose “I didn’t participate” or “I don’t want to answer”).
    I ignore all verbal comments that aim to “punish” me, and I change things when I think students are right.
    Finally, being a math lecturer for non-major students is extremely hard, both academically and emotionally. Most students are not willing to do what is needed in order to understand the abstract/complicated concepts and processes.
    Only a few (“courageous”) students will attribute their lack of understanding to the fact that they did not attend all classes, or that they weren’t really focused on learning (probably they spend a lot of time on Facebook during class…), or that they didn’t go over class notes at home and come to office hours when they didn’t understand something, etc.
    I am encouraged by the fact that about 2/3 of the students who attend classes state they “understood enough” or above (3-5) all semester long. This is especially important as only 40-50% of the students fill in the formal end-of-semester SE, and I bet you can guess how the majority of them will rate my performance. Students fill in the SE before the final exam, but (again) you can guess how two midterms with about 24% failure rates will influence their evaluation of my teaching.


  • Michael Tomlinson

    Senior Director at TEQSA

    I think it’s important to avoid defensive responses to the question. Most participants have assumed that we are talking about individual teachers being assessed through questionnaires, and I share everyone’s reservations about that. I entirely agree that deep learning is what we need to go for, but given the huge amounts of public money that are poured into our institutions, we need to have some way of evaluating whether what we are doing is effective or whether it isn’t.

    I’m not impressed by institutions that are obsessed only with evaluation by numbers. However, there is some merit in monitoring aggregated statistics over time and detecting statistically significant variations. If average satisfaction rates in Engineering have gone down every year for five years shouldn’t we try and find out why? If satisfaction rates in Architecture have gone up every year for five years wouldn’t it be interesting to know if they have been doing something to bring that about that might be worthwhile? It might turn out to be a statistical artifact, but we need to inquire into it, and bring the same arts of critical inquiry to bear on the evidence that we use in our scholarship and research.

    But I always encourage faculties and institutions to supplement this by actually getting groups of students together and talking to them about their student experience as well. Qualitative responses can be more valuable than quantitative surveys. We might actually learn something!
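    The aggregated-trend monitoring described in this comment can be sketched in a few lines. The yearly satisfaction figures below are hypothetical, and the permutation-based significance test is just one defensible choice, not any institution's actual methodology:

    ```python
    import random
    import statistics

    def trend_slope(values):
        """Ordinary least-squares slope of values against time (years 0, 1, 2, ...)."""
        n = len(values)
        x_mean = (n - 1) / 2
        y_mean = statistics.fmean(values)
        num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(values))
        den = sum((x - x_mean) ** 2 for x in range(n))
        return num / den

    def trend_p_value(values, n_perm=10_000, seed=0):
        """Permutation test: how often does a random reshuffling of the yearly
        averages produce a slope at least as extreme as the observed one?"""
        rng = random.Random(seed)
        observed = abs(trend_slope(values))
        shuffled = list(values)
        hits = 0
        for _ in range(n_perm):
            rng.shuffle(shuffled)
            if abs(trend_slope(shuffled)) >= observed:
                hits += 1
        return hits / n_perm

    # Hypothetical mean satisfaction (1-5 scale) for one faculty over five years
    engineering = [4.1, 3.9, 3.8, 3.6, 3.4]   # steadily falling
    print("slope per year:", trend_slope(engineering))
    print("permutation p-value:", trend_p_value(engineering))
    ```

    With steadily falling yearly means, the slope is negative and the permutation p-value comes out small, which is the kind of signal that would justify the follow-up inquiry the comment calls for.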


  • Aleardo Manacero

    Associate Professor at UNESP – São Paulo State University

    Like everyone here, I also think that these evaluation forms do not truly measure teaching effectiveness. This is quite a hard thing to evaluate, since the effects of learning will be felt several years later, while students are performing their job duties.

    Besides that, some observations made by students are interesting for our own growth. I usually get these through informal talks with the class or even some students.

    In another direction, some of the previous comments address deep/surface learning, basically stating that deep learning is the right way to go. I have to disagree with this for some of the content that has to be taught. In my case (teaching computer science majors) it is important, for example, that every student have a surface knowledge of operating systems design, but those who are going to work as database analysts do not need to know the deep concepts involved (the same is true of database concepts for a network analyst…). So surface learning also has its relevance in professional formation.


  • George Christodoulides

    Senior Consultant and Lecturer at university of nicosia

    The usefulness of student evaluations, like all similar surveys, is closely linked to the particular questions students are asked to answer. There are the objective/factual questions, such as “Does he start class on time?” or “Does he speak clearly?”, and the very personal questions, such as “Does he give fair grades?”. The effectiveness of a teacher could be more appropriately linked to suitably phrased questions, such as “Has he motivated you to learn?” and “How much have you benefited from the course?”. The responses to these questions could also be further assessed by comparing the final grades given in that particular course with the performance of the class in the other courses they have taken during that semester. So, for assessing teacher effectiveness, one needs to ask relevant questions and perform the appropriate evaluations.

  • Laura Gabiger

    Professor at Johnson & Wales University


    Michael has an excellent point that some accountability of institutions and programs is appropriate, and that aggregated data or qualitative results can be useful in assessing whether the teaching in a particular program is accomplishing what it sets out to do. Many outcomes studies are set up to measure the learning in an aggregated way.

    We may want to remember that our present conventions of teaching evaluation had their roots in the 1970s (in California, if I remember correctly), partly as a response to a system in which faculty, both individually and collectively, were accountable to no one. I recall my student days when a professor in a large public research institution would consider it an intrusion and a personal affront to be asked to supply a course syllabus.

    As the air continues to leak out of the USA’s higher education bubble, as the enrollments drop and the number of empty seats rises, it seems inevitable that institutions will feel the pressure to offer anything to make the students perceive their experience as positive. It may be too hard to make learning–often one of the most uncomfortable experiences in life–the priority. Faculty respond defensively because we are continually put in the position of defending ourselves, often by poorly-designed quantitative instruments that address every kind of feel-good hotel concierge aspect of classroom management while overlooking learning.


  • Sethuraman Jambunatha

    Dean (I & E) at Vinayaka Mission

    The evaluation of faculty by the students is welcome. The statistics can be looked at with a certain degree of objectivity. An instructor who is strict with his/her students may be ranked low in spite of being an asset to the department. A ‘free-lance’ teacher with students may be placed higher despite being a poor teacher. At any rate, it is the HoD’s duty to observe the quality of all teachers, and his objective evaluation is final. The parents’ feedback is also to be taken. Actually, teaching is a multi-dimensional task, and student evaluation is just one coordinate.

  • Edwin Herman

    Associate Professor at University of Wisconsin, Stevens Point

    Student evaluations are a terrible tool for measuring teacher effectiveness. They do measure student satisfaction, and to some extent they measure student *perception* of teacher effectiveness. But the effectiveness of a teaching method or of an instructor is poorly correlated with student satisfaction: while there are positive linkages between the two concepts, students are generally MORE satisfied by an easy course that makes them feel good than by a hard course that makes them really think and work (and learn).

    Students like things that are flashy, and things that are easy more than they like things that require a lot of work or things that force them to rethink their core values. Certainly there are students who value a challenge, but even those students may not recognize which teacher gave them a better course.

    Student evaluations can be used effectively to help identify very poor teaching. But they are useless for distinguishing between adequate and good teaching practices.


  • Cesar Granados

    ex Vicerrector Administrativo en Universidad Nacional de San Cristóbal de Huamanga

    César S. Granados
    Retired Professor from The National University of San Cristóbal de Huamanga
    Ayacucho, PERÚ

    Since teaching effectiveness is a function of teacher competencies, an effective teacher is able to use existing competencies to achieve the desired student results; but a student’s performance mainly depends on his or her commitment to achieving those competencies.

  • Steve Kaczmarek

    Professor at Columbus State Community College

    The student evaluations I’ve seen are more like customer satisfaction surveys, and in this respect, there is less helpful information for the instructor to improve his or her craft and instead more feedback about whether or not the student liked the experience. Shouldn’t their learning and/or improving skills be at least as important? I’m not arguing that these concepts are mutually exclusive, but the evaluations are often written to privilege one over the other.

    There are other problems. Using the same evaluation tool for very different kinds of courses (lecture versus workshop, for instance) doesn’t make a lot of sense. Evaluation language is often vague and puzzling in what it rewards (one evaluation form asks “Was the instructor enthusiastic?” Would an instructor bursting with smiles and enthusiasm but who is disorganized and otherwise less effective be privileged over one who is low-key but nonetheless covers the material effectively?). The “halo effect” can distort findings, where, among other things, more attractive instructors can get higher marks.

    Given how many times I’ve heard from students about someone being their favorite instructor because he or she was easy, I question the criteria students may use when evaluating. Instructors are also told that evaluations are for their benefit to improve teaching ability, but then chairs and administrators use them in promotion and hiring decisions.

    I think if the evaluation tool is sound, it can be useful to helping instructors. But, lastly, I think of my own experiences as a student, where I may have disliked or even resented some instructors because they challenged me or pushed me out of my comfort zone to learn new skills or paradigms. I may have evaluated them poorly at the time, only to come to learn a few years later with greater maturity that they not only taught me well, but taught me something invaluable, and perhaps more so than the instructors I liked. In this respect, it would be more fair to those instructors for me to fill out an evaluation a few years later to accurately describe their teaching.

  • Diane Halm

    Adjunct Professor of Writing at Niagara University

    Wow, there are so many valid points raised; so many considerations. In general, I tend to agree with those who believe it gauges student satisfaction more than learning, though there is a correlation between the two. After 13 years as an adjunct at a relatively small, private college, I have found that engagement really is what many students long for. It seems far less about the final grades earned and more about the tools they’ve acquired. It should be mentioned that I teach developmental level composition, and while almost no student earns an A, most feel they have learned much:)


  • Nira Hativa

    Former director, center for the advancement of teaching at Tel Aviv University

    Student ratings of instruction (SRI) do not measure teaching effectiveness but rather student satisfaction with instruction (as some previous comments on this list suggest). However, there is substantial research evidence for the relationship between SRIs and some agreed-upon measures of good teaching and of student learning. This research is summarized in much detail in my recent book:
    Student Ratings of Instruction: A Practical Approach to Designing, Operating, and Reporting (220 pp.) https://www.createspace.com/4065544
    ISBN-13: 978-1481054331


  • robert easterbrook

    Education Management Professional

    Learning is not about what the teacher does, it is about what the learner does.

    Do not confuse the two.

    Learning is what the learner does with what the teacher teaches.

    If you think that learning is all about what the teacher does, then the SRI will mislead and deceive.


  • Sami Samra

    Associate Professor at Notre Dame University – Louaize

    Evaluation, in all its forms, is a complex exercise that needs both knowledge and skill. Further, evaluation can best be achieved through a variety of instruments. We know all of this as teachers. Question is how knowledgeable are our students regarding the teaching/learning process. More, how knowledgeable are our administrators in translating information collected from questionnaires (some of which are validity-questionable) into plausible data-based decisions. I agree that students should have a say in how their courses are being conducted. But to use their feedback, quantitatively, to evaluate university professors… I fear that I must hold a very skeptical stand towards such evaluation.

     

  • Top Contributor

    Quite an interesting topic, and I’m reminded of the ancient proverb, “Parts is not parts.” OK, maybe that was McDonalds. This conversation would make a very thoughtful manuscript.

    Courses is not courses. Which course will be more popular, “Contemporary Music” or “General Chemistry?”

    Search any university using the keywords “really easy course [university].” Those who teach these courses are experts at what they do, and what they do is valuable; however, the workload for the student is minimal.

    The major issues: (1) popularity is inversely proportional to workload; and (2) the composition of the questions appearing on course and professor evaluations (CAPEs).

    “What grade do you expect in this class? Instructor explains course material well? Lectures hold your attention?”

    If Sally gets to listen to Nickleback in class and then next period learn quantum mechanics, which course does one suppose best held her attention?

    A person about to receive a C- in General Chemistry is probably receiving that C- because s/he was never able to understand the material for lack of striving, and probably hates the subject. That person is very likely to have never visited the professor during office hours for help. Logically one might expect low approval ratings from such a scenario.

    A person about to receive an A in General Chemistry is getting that A because s/he worked his/her tail off. S/he was able to comprehend mostly everything the professor said, and most probably liked the course. Even more, s/he probably visited the professor during office hours several times for feedback.

    One might argue that the laws of statistics will work in favor of reality; however, that’s untrue when only 20% of students respond to CAPEs. Those who respond either love the professor or hate the professor. There’s usually no middle ground. Add this to internet anonymity, and the problem is compounded. I am aware of multiple studies conducted by universities indicating a high correlation between written CAPEs and electronic CAPEs; however, I’d like to bring up one point.

    Think of the last time you raised your voice to a customer service rep on the phone. Would you have raised your voice to that rep in person?

    There’s not enough space to comment on all the variables involved in CAPE numerical responses. As of last term I stopped paying attention to the numbers and focused exclusively on the comments. There’s a lot of truth in most of the comments.

    I would like to see the following experiment performed. Take a group of 10,000 students. Record their CAPE responses prior to receiving their final grade. Three weeks later, have them re-CAPE. One year later, have them re-CAPE again. Two years. Three years. Finally, have them re-CAPE after getting a job.

    Many students don’t know what a professor did for them until semesters or years down the road. They’re likely to realize how good of a teacher the professor was by their performance in future courses in the same subject requiring cumulative mastery.

    Do I think student evaluations measure teaching effectiveness? CAPEs is not CAPEs.
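    The 20%-response selection effect described in this comment can be made concrete with a toy simulation. All numbers here are hypothetical, and the model assumes, purely for illustration, that only students with strongly positive or strongly negative experiences bother to respond:

    ```python
    import random
    import statistics

    random.seed(1)

    # Hypothetical "true" satisfaction of 1,000 students on a 1-5 scale,
    # drawn from a distribution centered on a solidly positive 3.8.
    cohort = [min(5.0, max(1.0, random.gauss(3.8, 0.8))) for _ in range(1000)]

    # Assume only students with strong feelings respond: roughly the
    # "love it" and "hate it" tails, around a fifth of the class.
    responders = [s for s in cohort if s < 2.5 or s > 4.5]

    true_mean = statistics.fmean(cohort)
    survey_mean = statistics.fmean(responders)
    response_rate = len(responders) / len(cohort)

    print(f"response rate: {response_rate:.0%}")
    print(f"cohort mean:   {true_mean:.2f}")
    print(f"survey mean:   {survey_mean:.2f}")
    ```

    Even though the responders include both tails, the survey mean lands noticeably away from the cohort mean, and the response distribution is bimodal rather than representative - exactly the "no middle ground" pattern described above.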


  • Anne Gardner

    Senior Lecturer at University of Technology Sydney

    No, of course they don’t.

  • Christa van Staden

    Owner of AREND.co, a professional learning community for educators

    No, it does not. Efficiency in the classroom should be measured by students’ results, the teacher’s attitude towards students, and the quality of their preparation. I worked with a man who told a story about the different hats and learning, and I thought that was a new way of looking at learning. To my utmost shock, my colleague, who sat in because he had to say something, told me that he did it exactly the same way, same jokes, etc., when he took the course five years ago. For real - nothing changed, no new technology, no new insights, no learning happened over a period of five years, nothing? And he is rated very highly - head of a new wing. Who rated him? How? And why did it not affect his teaching at all?

  • Mat Jizat Abdol

    Chief Executive at Institut Sains @ Teknologi Darul Takzim ( INSTEDT)

    If we are looking for quality, we have to get information about our performance in the lecture room. There are six elements normally in practice: 1. The teaching plan of lecture contents; 2. Teaching delivery; 3. Fair and systematic evaluation of students’ work; 4. Whether the teaching follows the semester plan; 5. Whether the lecturer follows the timetable and is always on time for lecture hours; and lastly, 6. The relationship between lecturer and students.

  • orlando mcallister

    Department Head – Communications/Mathematics

    Do we need to be reminded that educators were students at one time or another? So why not have students evaluate the performance of a teacher? After all, the students are contributing to their own investment in what is significant for survival, and in whether it is effective for career development, to attain their full potential as sentient human beings working towards the greater good of humanity; anything else falls short of human progress on a tiny rotating planet cycling through the solar system with destination unknown! Welcome to the ‘Twilight Zone.’

    Would you rather educate a student to make a wise decision to accept 10 gallons of water in a desert? Or accept a $1 million check that further creates mirages and illusory dreams of success?

  • Stephen Robertson

    Lecturer at Edinburgh Napier University

    I think what my students say about me is important. I’m most interested in the comments they make and have used these to pilot other ideas or adjust my approach.

    I’ve had to learn to not beat myself up about a few bad comments or get carried away with a few good ones.

    I also use the assessment results to see if the adjustments made have had the intended impact. I use the VLE logs as well to see how engaged the students are with the materials and what tools they use and when.

    I find the balance keeps me grounded. I want my students to do well and have fun. The dashboard on your car has multiple measures. Why should teaching be different? Like the car I listen for strange noises and look out the window to make sure I’m still on the road.


  • Allan Sheppard

    Lecturer/tutor/PhD student at Griffith University

    I think that most student evaluations are only reaction measures and not true evaluations of learning outcomes or teaching effectiveness - and evaluations are often tainted if the student gets a lower mark than anticipated.
    I think these types of evaluation are only indicative - they should not really be used to measure teacher/teaching effectiveness, and should not be allowed to affect teachers’ careers.
    I note Stephen’s point about multiple measures - unfortunately most evaluations are quick and dirty, and certainly do not provide multiple measures.


  • Allan Sheppard

    Lecturer/tutor/PhD student at Griffith University

    interestingly most student evaluations are anonymous – so the student can say what he/she likes and not have to face scrutiny


  • Olga Kuznetsova

    No, students’evaluations cannot fully measure teaching effectiveness.
    However,for the relationship to be mutually beneficial, you have to accept their judgement on the matter, Unfortunately a Unique teacher for all categories (types) of students does not exist in our dynamic world.


  • Penny Paliadelis

    Professor, Executive Dean, Faculty of Health, Federation University Australia

    Student evaluations are merely popularity contests; they tempt academics to ‘dumb down’ the content in order to be liked and evaluated positively. This is a dangerous and slippery slope that can result in graduates being ill-prepared for the professions and industries they seek to enter.


  • Robson Chiambiro (MBA, MSc, MEd.)

    PRINCE 2 Registered Practitioner at Higher Colleges of Technology

    In my opinion the student-teacher evaluations measure popularity, as others suggested, but the problem is that some of the questions and intentions of assessing are not fulfilled due to the use of the wrong questions. I have never seen in these instruments a question asking students about their expectations of the teacher and the course as such. To me that is more important than asking whether the student likes the teaching style, which students do not know anyway. Teachers who give a test before the evaluation are likely to get lower ratings than those who give tests soon after the evaluation.

  • Chris Garbett

    Principal Lecturer Leeds Metropolitan University

    I agree with other contributors. The evaluations are akin to a satisfaction survey. Personally, if, for example, I stay at an hotel, I only fill in the satisfaction survey if something is wrong. If the service is as I expect, I don’t bother with the survey.

    I also feel that students rate courses or modules on a popularity basis. A module on a course may be enjoyable, or fun, but not necessarily better taught than another module with less entertaining subject matter.

    Unfortunately, everyone seems to think that student evaluations are the main criterion by which to judge a course.


  • Steve Benton

    Senior Research Officer, The IDEA Center

    First of all, it would help if we stop referring to them as “student” or “course” evaluations. Students are not qualified to evaluate. That is what administrators are paid to do. However, students are qualified to provide feedback to instructors and administrators about their perceptions of what occurred in the class and of how much they believe they learned. How can that not be valuable information, especially for developmental purposes about how to teach more effectively? Evaluation is not an event that happens at the end of a course–it is an ongoing process that requires multiple indicators of effectiveness (e.g., student ratings of the course, peer evaluations, administrator evaluations, course design, student products). By triangulating that combination of evidence, administrators and faculty can then make informed judgments and evaluate.


  • Eytan Fichman

    Lecturer at Hanoi Architectural University

    The student/teacher relationship around the subject matter is a ‘triangle.’ The character of the triangle has a lot to do with a student’s reception of the material and of the teacher.

    The Student:
    The well-prepared student and the intrinsically motivated student can more readily thrive in the relationship. If s/he is thriving s/he may be more inclined to rate the teacher highly. The poorly prepared student or the student who requires motivation from ‘outside’ is much less likely to thrive and more likely to rate a teacher poorly.

    The Teacher:
    The well-prepared teacher and the intrinsically motivated teacher can more readily thrive in the relationship. If s/he is thriving students may be more inclined to rate the teacher highly. The poorly prepared teacher or the teacher who requires motivation from ‘outside’ is much less likely to thrive and more likely to achieve poor teacher ratings.

    The Subject Matter:
    The content and form of the subject matter are crucial, especially in their relation to the student and teacher.

  • Daniel Goeckner

    Education Professonal

    Student evaluations do not measure teaching effectiveness. I have been told I walk on water and am the worst teacher ever. The major difference was the level of student participation. The more they participated the better I was.

    What I use them for is a learning tool. I take the comments apart looking for snippets that I can use to improve my teaching.

    I have been involved in a portfolio program for the past two years. One consistent finding is that the better the measured outcomes, the worse the student reviews.

    • Dr. Pedro L. Martinez

      Former Provost and Vice Chancellor for Academic Affairs at Winston Salem State University & President of HigherEd SC.

      Steve,
      Have you ever been part of a tenure or promotion committee evaluation process? In my 35 years of experience, faculty members do not operate in the ideal, smooth, linear trajectory that you have described. On the contrary, committees partition evaluations into categories and look at student course evaluations as the evidence of an instructor’s ability to teach. However, faculty can choose which evaluations they submit and which comments they want to include as part of the record. I have never seen “negative comments” submitted as evidence of “ineffective teaching”. A five-point scale is used, and whenever a rating falls below 3.50, it becomes a great concern for our colleagues!

    • Sethuraman Jambunatha

      Dean (I & E) at Vinayaka Mission

      There are many other ways of assessing faculty through the peer group. There can be a weekly seminar in which faculty members are expected to present, with other faculty members and students as the audience. This measures how much interest a faculty member has in some chosen areas. The Chair (HoD) can talk to some selected students (chosen to represent the highly motivated, the average, and the take-it-easy) and reach a decision for the tenure track. As I said earlier, student evaluations can be one of many aspects. In my own experience, evaluation by other (senior) faculty is many times detrimental to the progress of junior faculty. But one thing is clear: the ‘Chair’ must have some ‘vision’ and transcend discrimination and partisan feelings. In India we say: “(Sar)Panch me Parameshwar rahtha hai”, meaning: in the position of Judge, God dwells (sits). Think of Becket and King Henry II. As archbishop, Thomas Becket was a completely changed person, fully immersed in the divine order. So the Chair is supreme. Student evaluation is just one aspect.


    • Susan Wright

      Assistant Professor at Clarkson University

      Amazing how things work…I’m actually in the process of framing out a research project related to this very question. Does anyone have any suggestions for specific papers I should look at i.e. literature related to the topic?

      With respect to your question, I believe the answer depends on the questions that get asked.

    • Sarah Lowengard

      Researcher, Writer, Editor, Consultant (history, technology, art, sciences)

      I fall on the “no” side too.

      The school-derived questionnaires nearly always ask the wrong questions, for one.

      I’ve always thought students should wait some years (3-20) before providing feedback, because the final day of class is too recent to do a good assessment.

    • Jeremy Wickins

      Open University Coursework Consultant, Research Methods

      I’m quite late to the topic here, and much of what I think has been said by others. There is a difference between the qualitative and quantitative aspects of student evaluations – I am always fascinated to find out what my students (and peers, of course, though that is a different topic) do/do not think I am doing well so I can learn and adapt my teaching. For this reason, I prefer a more continuous student evaluation than the questionnaire at the end of the course – if I need to adapt to a particular group, I need the information sooner rather than later.

      However, the quantitative side means nothing unless it is tied back to hard data on how the students did in their assessments – an unpopular teacher can still be a *good* teacher of the subject at hand! And the subject matter counts a lot – merely teaching an unpopular but compulsory subject (public law, for instance!) tends to make the teacher initially unpopular in the minds of students – a type of shooting the messenger.

      Teaching isn’t a beauty contest – these metrics need to be used in the right way, and combined with other data if they are to say anything about the teaching.

    • Dr. James R. Martin

      Professor Emeritus

      I wrote a paper about this issue a few years ago. Briefly, the thrust of my argument is that student opinions should not be used as the basis for evaluating teaching effectiveness: these aggregated opinions are invalid measures of quality teaching, provide no empirical evidence in this regard, are incomparable across different courses and different faculty members, promote faculty gaming and competition, tend to distract all participants and observers from the learning mission of the university, and ensure the sub-optimization and further decline of the higher education system. Using student opinions to evaluate, compare, and subsequently rank faculty members represents a severe form of a problem Deming referred to as a deadly disease of Western-style management. The theme of the alternative approach is that learning on a program-wide basis should be the primary consideration in the evaluation of teaching effectiveness. Emphasis should shift from student opinion surveys to the development and assessment of program-wide learning outcomes. To achieve this shift, the university performance measurement system needs to be redesigned to motivate faculty members to become part of an integrated learning development and assessment team, rather than a group of independent contractors competing for individual rewards.

      Martin, J. R. 1998. Evaluating faculty based on student opinions: Problems, implications and recommendations from Deming’s theory of management perspective. Issues in Accounting Education (November): 1079-1094. http://maaw.info/ArticleSummaries/ArtSumMartinSet98.htm

    • Joseph Lennox, Ph.D.

      There appears to be general agreement that the answer to the proposed question is “No.”

      The next logical step in the discussion would appear to be, “How would you effectively measure teacher effectiveness?”

      With large enrollment classes, one avenue is here:

      http://www.insidehighered.com/views/2013/10/11/way-produce-more-information-about-instructors-effectiveness-essay

      So, how should teacher effectiveness be measured?

    • Ron Melchers

      Professor of Criminology, University of Ottawa

      To inform this discussion, I would highly recommend this research review done for the Higher Education Quality Council of Ontario. It’s a pretty balanced and well-informed treatment of student course (and teacher) evaluations: http://www.heqco.ca/SiteCollectionDocuments/Student%20Course%20Evaluations_Research,%20Models%20and%20Trends.pdf

    • Ron Melchers

      Professor of Criminology, University of Ottawa

      Just to add my own two cents (two and a half Canadian cents at this point), I think students have much of value to tell us about their experience in our courses and classes, information that we can use to improve their learning and become more effective teachers. They are also able to inform academic administrators of the degree to which teachers fulfill their basic duties and perform the elementary tasks they are assigned. They have far less to tell us about the value of what they’re learning to their future, their professions … and they are perhaps not the best qualified to identify effective learning and teaching techniques and methods. Those sorts of things are better assessed by knowledgeable, expert professional and academic peers.

    • Barbara Celia

      Assistant Clinical Professor at Drexel University

      Thank you, Ron. A great deal of info but worth reading and analyzing.

    • Prof. Ravindra Kumar Raghuvanshi

      Member of Academic committees of some Universities & Retd. Prof., Dept. of Botany, University of Rajasthan, Jaipur.

      The student rating system may not necessarily be a reliable method of assessing teaching effectiveness, because it depends upon each individual’s grasping/understanding power, intelligence, and study habits. A teacher may do his/her job well, but how many students understand it well? That is reflected invariably in the marks they obtain.
