
Privacy & Security in Today’s Library


The virtuality of privacy and security on the modern campus, from Plamen Miltenoff

From: Jodie Borgerding [mailto:Borgerding@amigos.org]
Sent: Wednesday, July 05, 2017 3:07 PM
To: Miltenoff, Plamen <pmiltenoff@stcloudstate.edu>
Cc: Nicole Walsh <WALSH@AMIGOS.ORG>
Subject: Proposal Submission for Privacy & Security Conference

Hi Plamen,

Thank you for your recent presentation proposal for the online conference, Privacy & Security in Today’s Library, presented by Amigos Library Services. Your proposal, The role of the library in teaching with technology unsupported by campus IT: the privacy and security issues of the “third-party,” has been accepted. I just wanted to confirm that you are still available to present on September 21, 2017, and to ask whether you have a time preference for your presentation (11 am, 12 pm, or 2 pm Central). If you are no longer able to participate, please let me know.

Nicole will be in touch with you shortly with additional details and a speaker’s agreement.

Please let me know if you have any questions.

Thanks!
___________________

Jodie Borgerding
Consulting & Education Services Manager
Amigos Library Services
1190 Meramec Station Road, Suite 207 | Ballwin, MO 63021-6902
800-843-8482 x2897 | 972-340-2897 (direct)
http://www.amigos.org | borgerding@amigos.org

+++++++++++++++++

Bio

Dr. Plamen Miltenoff is an Information Specialist and Professor at St. Cloud State University. His education includes several graduate degrees in history and Library and Information Science and terminal degrees in education and psychology.

His professional interests encompass social media, multimedia, Web development and design, gaming and gamification, and learning environments (LEs).

Dr. Miltenoff organized and taught classes such as LIB 290 “Social Media in Global Context” (http://web.stcloudstate.edu/pmiltenoff/lib290/) and LIB 490/590 “Digital Storytelling” (http://web.stcloudstate.edu/pmiltenoff/lib490/) where issues of privacy and security are discussed.

Twitter handle @SCSUtechinstruc

Facebook page: https://www.facebook.com/InforMediaServices/

The virtuality of privacy and security on the modern campus:

The role of the library in teaching with technology unsupported by campus IT: the privacy and security issues of the “third-party software” teaching and learning

Abstract/Summary of Your Proposed Session

The virtualization of reality is rapidly changing all aspects of learning and teaching, from equipment to methodology: just when faculty have finalized their syllabus, they must start anew if they want to keep abreast of content changes and upgrades and to engage a very different student fabric – Millennials.

Mainframes were replaced by microcomputers, microcomputers by smartphones and tablets, hard drives by cloud storage, and now wearables by the IoT. The pace of hardware, software and application upgrades is becoming unbearable for students and faculty. Content and methodology are rendered useless by the speed at which they become obsolete. In such an environment, faculty, students and IT staff can barely devote time and energy to the rapidly increasing vulnerabilities connected with privacy and security.

In an effort to streamline ever-scarcer resources, campus IT “standardizes” campus use of applications. These are the applications that IT chooses to troubleshoot campus-wide, and the applications recommended to faculty and students.

Amid an unprecedented, burgeoning number of applications, particularly for mobile devices, it is difficult to constrain faculty and students to campus-IT-sanctioned applications, especially considering how quickly such applications become obsolete. Faculty and students often “stray” and go with their own choices. Such decisions put faculty and students personally, and the campus institutionally, at risk. A recent post by THE Journal draws attention to the fact that cyberattacks are now shifting from mobile devices to the IoT, while campuses often struggle even to guarantee the cybersecurity of mobile devices. Further, the use of third-party applications may conflict with campus-mandated FERPA policies. Such policies are lengthy and complex for faculty and students to absorb, and are often excessively restrictive toward innovative ways to improve the methodology and pedagogy of teaching and learning. The current procedure by which faculty and students propose new applications is a lengthy and cumbersome bureaucratic process, which often renders the end users’ proposals obsolete by the time the process is vetted.

Where/what is the balance between safeguarding privacy on campus and fostering security without stifling innovation and creativity? Can the library be the campus hub for education about privacy and security, the sandbox for testing and innovation and the body to expedite decision-making?

Abstract

The pace of change in teaching and learning is becoming impossible to sustain: equipment evolves at an accelerated pace, the methodology of teaching and learning cannot catch up with the equipment changes, and on top of that there are constant content updates. With ever-shrinking budgets, faculty, students and IT staff can barely address the issues above, leaving little time and energy to address the growing concerns about privacy and security.

Amid an unprecedented, burgeoning number of applications, particularly for mobile devices, it is difficult to constrain faculty and students to campus-IT-sanctioned applications, especially considering how quickly such applications become obsolete. Faculty and students often “stray” and go with their own choices. Such decisions put faculty and students personally, and the campus institutionally, at risk. A recent post by THE Journal (http://blog.stcloudstate.edu/ims/2017/06/06/cybersecurity-and-students/) draws attention to the fact that cyberattacks are shifting from mobile devices to the IoT, while campuses still struggle to guarantee the cybersecurity of mobile devices. Further, the use of third-party applications may conflict with campus-mandated FERPA policies. Such policies are lengthy and complex for faculty and students to absorb, and are often excessively restrictive toward innovative ways to improve the methodology and pedagogy of teaching and learning. The current procedure by which faculty and students propose new applications is a lengthy and cumbersome bureaucratic process, which often renders the end users’ proposals obsolete by the time the process is vetted.

Where/what is the balance between safeguarding privacy on campus and fostering security without stifling innovation and creativity? Can the library be the campus hub for education about privacy and security, the sandbox for testing and innovation and the body to expedite decision-making?

http://blog.stcloudstate.edu/ims/2017/06/06/cybersecurity-and-students/

Anything else you would like to add

3 take-aways from this session:

  • Discuss and form an opinion about the education-pertinent issues of privacy and security from the broad campus perspective, versus the narrow library one
  • Discuss and form an opinion about the role of the library on campus in terms of the greater issues of privacy and security
  • Re-examine the thin red line of balance between standardization and innovation, and between the need for security and privacy protection

++++++++++++++
presentation:
https://www.slideshare.net/aidemoreto/the-virtuality-of-privacy-and-security-on-the 

chat – slide 4, privacy. Please take 2 min and share your definition of privacy on campus. Does it differ between faculty and students? What are the main characteristics that determine privacy?

chat – slide 5, security. Please take 2 min and share your definition of security on campus regarding electronic activities. Whose responsibility is security? An IT issue [only]?

polls: slide 6, technology unsupported by campus IT – is it worth considering? 1. I am a great believer in my freedom of choice 2. I firmly follow rules, and this applies to the use of computer tools and applications 3. Whatever…

chat – slide 6, why third-party applications? Pros and cons. E.g., pros – familiarity with the third-party tool versus the campus-required one

poll, slide 6, app smashing. App smashing is the ability to combine mobile apps in your teaching process. How do you feel about it? 1. The force is with us 2. Nonsense…

poll, slide 7, third-party apps and the comfort of faculty. How do you see the freedom of using third-party apps? 1. All I want, thank you 2. I would rather follow the rules 3. Indifference is my middle name

poll, slide 8, technology standardization? 1. yes, 2. no, 3. indifferent

chat, slide 9: if the two major issues colliding here – standardization versus third-party use – both have an impact on privacy and security, how would you argue for one or the other?

++++++++++++++++
notes from the conference

Measuring Library Vendor Cyber Security: Seven Easy Questions Every Librarian Can Ask

http://journal.code4lib.org/articles/11413

Bill Walker: http://www.amigos.org/innovating_metadata

+++++++++++++++
more on security in education in this IMS blog
http://blog.stcloudstate.edu/ims?s=security

more on privacy in education in this IMS blog
http://blog.stcloudstate.edu/ims?s=privacy

library user

The Library in the Life of the User. Engaging with People Where They Live and Learn

http://www.oclc.org/content/dam/research/publications/2015/oclcresearch-library-in-life-of-user.pdf
p. 18
Library staff
The roles of librarians change with changes in user needs and demands and the technology employed. A survey conducted for Research Libraries UK found skill gaps in nine key areas in which subject librarians could be supporting researchers’ needs. Even though many librarians may want to hire new staff with these skills, a survey found that the reality for most will be training existing staff.
Definitions of library services will change. We need to grow the ways users can engage with whatever they value from libraries, whether papyrus rolls, maker spaces or data management instruction.
p. 19
What is the Unique Selling Point (USP) of libraries vis-à-vis other information service providers?
p. 21
Librarians should measure the effectiveness of services based on the users’ perceptions of success. Librarians also should move beyond surveys of how library space is being used and should conduct structured observations and interviews with the people using the space. It is not enough to know that the various spaces, whether physical or virtual, are busy. Librarians need to understand when and how the spaces are being used.

p. 33 What is Enough? Satisficing Information Needs

Role theory explains that: “When people occupy social positions their behavior is determined mainly by what is expected of that position rather than by their own individual characteristics” (Abercrombie et al., 1994, p. 360).
Rational choice theory is based on the premise that complex social behavior can be understood in terms of elementary individual actions because individual action is the elementary unit of social life. Rational choice theory posits that individuals choose or prefer what is best to achieve their objectives or pursue their interests, acting in their self-interest (Green, 2002). Stated another way, “When faced with several courses of action, people usually do what they believe is likely to have the best overall outcome” (Scott, 2000).
When individuals satisfice, they compare the benefits of obtaining “more information” against the additional cost and effort of continuing to search (Schmid, 2004)
p. 38
This paper examines the theoretical concepts—role theory, rational choice, and satisficing—by attempting to explain the parameters within which users navigate the complex information-rich environment and determine what and how much information will meet their needs.
p. 39
The information-seeking and -searching research that explicitly addresses the topic of “what is good enough” is scant, though several studies make oblique references to the stopping stage, or to the shifting of directions for want of adequate information. Kraft and Lee (1979, p. 50) propose three stopping rules:
1. The satiation rule, “where the scan is terminated only when the user becomes satiated by finding all the desired number of relevant documents”;
2. The disgust rule, which “allows the scan to be terminated only when the user becomes disgusted by having to examine too many irrelevant documents”; and
3. The combination rule, “which allows the user to be seen as stopping the scan if he/she is satiated by finding the desired number of relevant documents or disgusted by having to examine too many irrelevant documents, whichever comes first.”
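These three stopping rules are mechanical enough to sketch as a small decision function. A minimal illustration – the counters and thresholds below are hypothetical, not values from Kraft and Lee:

```python
def should_stop(relevant_found, irrelevant_seen,
                desired_relevant, disgust_limit, rule="combination"):
    """Sketch of Kraft & Lee's (1979) stopping rules.

    satiation   -- stop once enough relevant documents are found
    disgust     -- stop once too many irrelevant documents are seen
    combination -- stop when either condition is met, whichever first
    """
    satiated = relevant_found >= desired_relevant
    disgusted = irrelevant_seen >= disgust_limit
    if rule == "satiation":
        return satiated
    if rule == "disgust":
        return disgusted
    return satiated or disgusted

# A user who wants 5 relevant documents and tolerates 20 irrelevant ones:
print(should_stop(5, 3, 5, 20, rule="satiation"))     # satiated
print(should_stop(2, 20, 5, 20, rule="combination"))  # disgusted first
print(should_stop(2, 3, 5, 20))                       # keep searching
```

The combination rule is simply the disjunction of the other two, which is why it always stops at least as early as either rule alone.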
p. 42
Ellis characterizes six different types of information activities: starting, chaining, browsing, differentiating, monitoring and extracting. He emphasizes the information-seeking activities, rather than the nature of the problems or criteria used for determining when to stop the information search process. In a subsequent article, Ellis (1997) observes that even in the final stages of writing, individuals may continue the search for information in an attempt to answer unresolved questions or to look for new literature.
p. 43
Undergraduate and graduate students
Situations creating the need to look for information (meeting assignment requirements):
• Writing research reports; and
• Preparing presentations.
Criteria used for stopping the information search (fulfilling assignment requirements):
1. Quantitative criteria:
— Required number of citations was gathered;
— Required number of pages was reached;
— All the research questions were answered; and
— Time available for preparing.
2. Qualitative criteria:
— Accuracy of information;
— Same information repeated in several sources;
— Sufficient information was gathered; and
— Concept understood.
p. 44
Faculty
Situations creating the need to look for information (meeting teaching needs):
• Preparing lectures and presentations;
• Delivering lectures and presentations;
• Designing and conducting workshops;
• Meeting scholarly and research needs; and
• Writing journal articles, books and grant proposals.
Criteria used for stopping the information search (fulfilling teaching needs):
1. Quantitative criteria:
— Time available for preparing lectures and presentations, delivering lectures and presentations, and designing and conducting workshops; and
— Time available for fulfilling scholarly and research needs.
2. Qualitative criteria:
— Every possible synonym and every combination were searched;
— Representative sample of research was identified;
— Current or cutting-edge research was found;
— Same information was repeated;
— Exhaustive collection of information sources was discovered;
— Colleagues’ feedback was addressed;
— Journal reviewers’ comments were addressed; and
— Publisher’s requirements were met.
1. Quantitative criteria for stopping:
— Requirements are met;
— Time constraints are limited; and
— Coverage of material for publication is verified by colleagues or reviewers.
2. Qualitative criteria for stopping:
— Trustworthy information was located;
— A representative sample of sources was gathered;
— Current information was located;
— Cutting-edge material was located;
— Exhaustive search was performed; and
— Exhaustive collection of information sources was discovered.
p. 53

“Screenagers” and Live Chat Reference: Living Up to the Promise

p. 81

Sense-Making and Synchronicity: Information-Seeking Behaviors of Millennials and Baby Boomers

p. 84 Millennials specific generational features pertinent to libraries and information-seeking include the following:

Immediacy. Collaboration. Experiential learning. Visual orientation. Results orientation.  Confidence.
Rushkoff (1996) described the non-linearity of the thinking patterns of those he terms “children of chaos,” coining the term “screenagers” to describe those who grew up surrounded by television and computers (p. 3).
p. 85
Rational choice theory describes a purposive action whereby individuals judge the costs and benefits of achieving a desired goal (Allingham 1999; Cook and Levi 1990; Coleman and Fararo 1992). Humans, as rational actors, are capable of recognizing and desiring a certain outcome, and of taking action to achieve it. This suggests that information seekers rationally evaluate the benefits of information’s usefulness and credibility, versus the costs in time and effort to find and access it.
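The cost-benefit judgment that rational choice theory describes reduces to picking the option with the best net outcome. A toy sketch with hypothetical utility and cost figures (not from the cited studies):

```python
def best_choice(options):
    """Rational-actor model, sketched: pick the action whose
    benefit minus cost is highest. Utilities are hypothetical."""
    return max(options, key=lambda o: o["benefit"] - o["cost"])

# Illustrative information sources an academic seeker might weigh:
sources = [
    {"name": "ask a librarian", "benefit": 8, "cost": 3},        # net 5
    {"name": "quick web search", "benefit": 5, "cost": 1},       # net 4
    {"name": "systematic database search", "benefit": 9, "cost": 6},  # net 3
]
print(best_choice(sources)["name"])  # prints: ask a librarian
```

Note that the "best" option here is not the one with the highest raw benefit: the costly systematic search loses to the librarian consultation once effort is counted, which is exactly the trade-off satisficing describes.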
Role theory offers a person-in-context framework within the information-seeking situation which situates behaviors in the context of a social system (Mead 1934; Marks 1996). Abercrombie, et al. (1994, p. 360) state, “When people occupy social positions their behavior is determined mainly by what is expected of that position rather than by their own individual characteristics.” Thus the roles of information-seekers in the academic environment influence the expectations for performance and outcomes. For example, faculty would be expected to look for information differently than undergraduate students. Faculty members are considered researchers and experts in their disciplines, while undergraduate students are novices and protégés, roles that place them differently within the organizational structure of the academy (Blumer, 2004; Biddle, 1979; Mead, 1934; Marks, 1996; Marks, 1977).

+++++++++++++++++
more on research in this IMS blog
http://blog.stcloudstate.edu/ims?s=research

IM554 discussion on GBL

IM554 discussion on Game Based Learning

Here is the “literature”:
http://blog.stcloudstate.edu/ims/2015/03/19/recommendations-for-games-and-gaming-at-lrs/
this link reflects my recommendations to the SCSU library, based on my research and my publication: http://scsu.mn/1F008Re

Here are also Slideshare shows from conferences’ presentations on the topic:

https://www.slideshare.net/aidemoreto/gamification-and-byox-in-academic-libraries-low-end-practical-approach

https://www.slideshare.net/aidemoreto/gaming-and-gamification-in-academic-and-library-settings

Topic :Gaming and Gamification in Academic Settings

  1. Intro: why is it important to discuss this trend
    1. The fabric of the current K12 and higher ed students: Millennials and Gen Z
    2. The pedagogical theories and namely constructivism
      1. Csikszentmihalyi’s “flow” concept (being in the zone)
      2. Active learning
      3. Sociocultural Theory
      4. Project-Based Learning
    3. The general milieu of increasing technology presence, particularly of gaming environment
    4. The New Media Consortium and the Horizon Report

Discussion: Are the presented reasons sufficient to justify a profound restructure of curricula and learning spaces?

  2. Definition and delineation
    1. Games
    2. Serious Games
    3. Gamification
    4. Game-based learning
    5. Digital game-based learning
    6. Games versus gamification
    7. Simulations, the new technological trends such as human-computer interaction (HCI) such as augmented reality (AR),virtual reality (VR) and mixed reality (MR) (http://blog.stcloudstate.edu/ims/2017/02/22/virtual-augmented-mixed-reality/ )

Discussion: Is there a way to build a simpler but comprehensive structure/definition to encompass the process of gaming and gamification in education?

  3. Gaming and Gamification
    1. Pros
    2. Cons
    3. Debates

Discussion: Which side are you on and why?

  4. Gaming and Gamification and BYOD (or BYOx)
    1. gaming consoles versus gaming over wi-fi
    2. gaming using mobile devices instead of consoles
    3. human-computer interaction (HCI) such as augmented reality (AR),virtual reality (VR) and mixed reality (MR) (http://blog.stcloudstate.edu/ims/2017/02/22/virtual-augmented-mixed-reality/ )

Discussion: do you see a trend suggesting that one or the other will prevail? Convergence?

  5. Gaming in Education
    1. student motivation, student-centered learning, personalized learning
    2. continued practice, clear goals and immediate feedback
    3. project-based learning, Minecraft and SimCity EDU
    4. Gamification of learning versus learning with games
    5. organizations to promote gaming and gamification in education (p. 6 http://scsu.mn/1F008Re)
    6. the “chocolate-covered broccoli” problem

Discussion: why are gaming and gamification not accepted at a higher rate? What are the hurdles to broader, faster acceptance? What do you think you can do to accelerate this process?

  6. Gaming in an academic library
    1. why the academic library? sandbox for experimentation
    2. the connection between digital literacy and gaming and gamification
    3. Gilchrist and Zald’s model for instruction design through assessment
    4. the new type of library instruction:
      in-house versus out-of-the-box games; gamification of the process
      http://web.stcloudstate.edu/pmiltenoff/bi/

Discussion: based on the example (http://web.stcloudstate.edu/pmiltenoff/bi/), how do you see transforming academic library services to meet the demands of 21st century education?

  7. Gaming, gamification and assessment (badges)
    1. inability of current assessments to evaluate games as part of the learning process
    2. “microcredentialing” through digital badges
    3. Mozilla Open Badges and Badgestack
    4. leaderboards

Discussion: How do you see a transition from the traditional assessment to a new and more flexible academic assessment?
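Mozilla Open Badges, mentioned above, represent a badge award as a small verifiable JSON document. A minimal sketch loosely following the Open Badges 2.0 “Assertion” shape – all URLs and identities here are hypothetical, and the structure is abbreviated, not a complete rendering of the spec:

```python
import json

# Hypothetical assertion for a library-instruction badge,
# loosely modeled on the Open Badges 2.0 "Assertion" shape.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://library.example.edu/badges/assertions/42",
    "recipient": {"type": "email",
                  "identity": "student@example.edu",
                  "hashed": False},
    "badge": "https://library.example.edu/badges/info-lit-level-1",
    "verification": {"type": "hosted"},
    "issuedOn": "2017-09-21T00:00:00Z",
}

def looks_like_assertion(doc):
    """Cheap structural check: the top-level fields a consumer
    of a badge assertion would expect are present."""
    required = {"recipient", "badge", "verification", "issuedOn"}
    return required <= doc.keys()

print(looks_like_assertion(assertion))
print(json.dumps(assertion, indent=2)[:80])
```

Because the assertion is hosted at a stable URL, any third party (an employer, another institution) can fetch and verify it – which is what makes badges a candidate for “microcredentialing” outside traditional assessment.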

Computers in Libraries conference

March 28–30 (preconference workshops March 27), Hyatt Regency Crystal City, Arlington, VA
http://conferences.infotoday.com/documents/221/CIL2017-Advance-Program.pdf

W5: Want Media Coverage? Add Press Room to Your Website

9:00 a.m. – 12:00 p.m.

Kathy Dempsey, Editor, Marketing Library Services newsletter Owner, Libraries Are Essential consultancy

Library marketers crave media attention and coverage, but most don’t know how to get it. The first step is having a Press Room as part of your library’s website. This workshop, led by a member of the media who’s also a library marketing expert, shows you how to build a Press Room that works. It includes how your library benefits from having an online Press Room, even if you don’t have a marketing department; where it belongs in your website hierarchy; what content members of the press expect to find there; SEO basics and PR tactics to lead reporters to your online Press Room; why building relationships with the media is vital; how press coverage affects your library’s usage, funding, brand recognition, and position in the community. Help ensure positive coverage by adding this strategic tool to your website.

W8: Video: Hands-On Learning & Practice

9:00 a.m. – 12:00 p.m.

Jennifer E. Burke, President, IntelliCraft Research, LLC

In this half-day workshop, a former advertising executive and trainer of strategic storytelling in marketing shares secrets on how to create video that has an impact on your community. Join her to shoot, edit, and polish a video while gathering tips, techniques, and strategies to create your own video – a medium which grabs communities in exciting new ways!

W10: Implementing an Internet of Things Infrastructure & Apps

9:00 a.m. – 12:00 p.m.

May Chang, Assistant Director, LibraryTechnology, East Carolina University
Mehdi Mohammadi, Graduate Assistant, Western Michigan University

The Internet of Things (IoT) is becoming widespread in academia as well as industry. It refers to connecting smart objects with built-in unique identifiers and sensors so they can communicate with each other autonomously. This enables actionable insights and ultimately makes the environment around us smarter. This workshop looks at how libraries can incorporate the IoT and reviews different aspects of developing an IoT infrastructure and creating your own application. It is based on the four layers of IoT application architecture: the physical layer, the communications layer, the application and services layer, and data analytics. Speakers discuss the potential and challenges of IoT applications, including their footprint (i.e., a high volume of sensory data) and the tools and methods for data analytics. As a case study, they focus on location-aware applications using iBeacons and smartphone sensors to show how all the IoT elements work together. Gain a better understanding of the usefulness of IoT in libraries, learn the main elements and underlying technologies of IoT applications, and learn to distinguish among a wide range of IoT devices, protocols and technologies so you can choose the right ones for your IoT application. Get budget and resource estimates and more. Come with a basic understanding of JavaScript/HTML5/CSS and your laptop for hands-on development practice. An instruction document will be provided for attendees to prepare their systems before the workshop.
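The iBeacon case study rests on ranging: converting a beacon’s received signal strength (RSSI) into an approximate distance. A minimal sketch using the common log-distance path-loss model – the calibrated power and path-loss exponent below are illustrative assumptions, not values from the workshop:

```python
def estimate_distance(rssi, tx_power=-59, n=2.0):
    """Rough beacon distance in meters from an RSSI reading,
    via the log-distance path-loss model.

    tx_power -- calibrated RSSI at 1 m (a typical iBeacon advertises
                something near -59 dBm; treat this as an assumption)
    n        -- path-loss exponent (~2 in free space, higher indoors)
    """
    return 10 ** ((tx_power - rssi) / (10 * n))

print(round(estimate_distance(-59), 1))  # at calibrated power: 1.0 m
print(round(estimate_distance(-79), 1))  # 20 dB weaker: 10.0 m
```

In practice RSSI is noisy, so real location-aware apps smooth readings over time (e.g., a moving average) before estimating distance; this sketch shows only the core conversion.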

W15: Tech Trends for Libraries in 2017 & Beyond

1:00 p.m. – 4:00 p.m.

David Lee King, Digital Services Director, Topeka & Shawnee County Public Library and Publisher, davidleeking.com

Technology has changed the face of libraries and is continuing to change how we work and how we deliver services to customers. This workshop introduces emerging technology trends and shows how those trends are reshaping library services. Examples are provided of how to incorporate these evolving trends into libraries. Attendees learn what trends to look for, find out the difference between a technology trend and a fad, and get ideas on how their library can respond to technology as it emerges.

W16: UX Design for Broader Discovery

1:00 p.m. – 4:00 p.m.

Stephanie Rosso, Principal Web Developer, Hennepin County Library
Amy Luedtke, Senior Librarian, Information Programs and Services, Hennepin County Library
Iain Lowe, BiblioCommons Inc.

While patrons have embraced using online technology to access their public library, most of these interactions are limited to borrowing transactions. If libraries are to be truly relevant in the digital world, we need to nudge patrons out of the well-worn pattern of log-in/transact /log-out and find ways to get them to linger long enough to discover the richness the library has to offer beyond borrowing items, while offering them opportunities to add their own voice to the library’s online community. This workshop explores design patterns and techniques for introducing content to patrons at appropriate moments in their learned workflows. It considers how to encourage patrons to add their voice to the library community and overcome concerns about privacy and security. It shares research and experience from BiblioCommons and Hennepin County Public Library’s efforts and looks at analogs from other domains. Workshop participants will be asked to participate actively in a hands-on session to solve a specific design challenge in teams.

My note: Ha. Even the public library understands that service goes beyond “borrowing items” and must invite “patrons to add their voice.” Only in the academic library does the opinion prevail that librarians are omnipotent, all-knowing lecturing types.

B103: Website Redesign: Techniques & Tools

1:15 p.m. – 2:00 p.m.

Dana Haugh, Web Services Librarian, Stony Brook University
Roy Degler, Associate Professor, Digital Library Services, Digital Resources and Discovery Services, Oklahoma State University
Emily R Mitchell, Librarian / Webmaster, SUNY Oswego

Join three web experts to learn about tips, tools, and techniques for taking the pain out of website redesigns. Haugh provides advice on the visual design of your next site and shows some examples of library web redesigns. Degler takes a look at why many libraries are using popular, free, CSS-based frameworks such as Bootstrap; explains how the grid layout works; and shows how the built-in responsive design layouts can deliver a site that works on desktop, smartphones, and tablets. Often the biggest challenge in redesign isn’t the visual design, content management system or coding. It’s the people and politics. Everyone thinks they know what the library website should look like, but no two people—let alone groups—can ever agree. How do you move ahead with a library redesign when you’re facing conflicting demands from the administration, co-workers, users, and stakeholders? Mitchell tackles this challenge head on and points out the weapons that we have at hand—from data to documentation; and discusses how to wield those weapons effectively to win (almost) any fight in order to build a great website. Grab lots of insights and ideas from this experienced panel.

C102: Digital Literacy & Coding Program Models

11:15 a.m. – 12:00 p.m.

Karen Walker, E-Services & Digital Access Manager, Jacksonville Public Library
Brandy McNeil, Associate Director – Tech Education & Training, The New York Public Library
Steven Deolus, Technology Training Program Coordinator, TechConnect (Technology Training Program Department), The New York Public Library

This session looks at how one library created a technology class and programming model that spans a 21-branch, 844-square-mile library system. It discusses mobile classrooms and how Chromebooks, MacBooks, tablets, and other equipment are used to create “classrooms” throughout the system. It shares how the library is focusing on members and programming for the community, for instance, the development of Spanish language, 50+ and immigrant/refugee programming. It looks at developing new programs and instructors using the 3D model from printer to pens, from tablets to coding, from core expertise to everyone. NYPL speakers discuss how coding is the new black! They discuss how to launch a coding program at your library, how to expand the age range of current coding programs, how to promote events related to your program to gain participants, how to get staff buy-in, how to educate staff, and how to create partnerships with some of the biggest names in the game. The NYPL Tech- Connect program will help you plan out all your needs to take your existing or non-existing coding programs further.

My note: one more proof that digital literacy is not “information literacy dressed in new verbal cloth,” but entails many more topics, skills and knowledge. Information literacy is a 1990s concept. It is time to upgrade to 2016 concepts and recognize that digital literacy requires skills beyond handling information. Moreover, information today is far more complex than the skills being taught: information from social media is more complex than information from news media, and it demands technology skills that handling news media never required.

E104: From Textbook to Activism: Engaging Students in Social Issues They Care About

2:15 p.m. – 3:00 p.m.

Janie Pickett, Head Librarian, Eureka High School, Eureka, Mo.
Anna Gray, Social Studies Teacher, Eureka High School, Eureka, Mo.

A recent collaborative effort between a high school social studies teacher and a school librarian transformed a “same-old” unit on social movements in the 20th century into a dynamic study of effective social activism—and how students can become effective activists. Using both primary and secondary resources, students learned to analyze social issues, then to identify the type of activism that proved effective for those issues. Next, students selected social situations important to them, analyzed the changes they sought to effect, and determined a means of activism to effect that change in practical—and often surprising—ways. The project’s design and implementation is straightforward and replicable. This session provides concrete steps to follow, specific patterns for locating learning resources, and reproducible forms that educators can carry back to their own campuses.

B202: Managing Tech & Innovation

11:45 a.m. – 12:30 p.m.

Jen Baum Sevec, Senior Metadata and Acquisitions Librarian, Library Of Congress
Brett Williams, Systems & Liaison Librarian, University of Toronto Mississauga

Sevec offers leaders at any level the opportunity to up their game by learning current management strategies for technology and innovation. Library leaders and constituents engage in the nearly constant interplay of enabling technology and innovations to explore a wealth of information and greater depth of data in the Information Age. A framework for managing this interplay is provided as well as an understanding of the dynamic lifecycle inherent in technological innovations and constituent demands. Williams provides an introduction to Wardley Value Chain Mapping, an innovative IT planning processes discussed by Simon Wardley on his blog Bits and Pieces. He shares specific examples of how this tool can be used by systems librarians, library administrators, and library IT decision makers.

B203: Finding Your Social Media Voice

1:45 p.m. – 2:30 p.m.

Meghan Kowalski, Head, Preservation, The Catholic University of America
Kirsten Mentzer, Technology Specialist, Northern Virginia Community College’s Medical Education Campus
Alexandra Radocchia Zealand, Web Editor, New Media Developer and Video Producer, Web Team, Arlington Public Library; PLA, VLA, ALA, LLAMA
Lennea R. Bower, Program Specialist, Virtual Services, Montgomery County Public Libraries

This session provides an in-depth look at how to speak in social media. Each institution’s and organization’s social media account has a personality. How you say something is just as important as what you say and why you say it. Your voice on social media says a lot to your followers. If done well, your tone will help to attract and keep an audience. The wrong kind of voice will turn people away. Finding the right voice can be difficult and involves a lot of trial and error. Speakers provide tips for finding the right voice and presenting the best personality for your intended followers. Social media is no longer the “new kid on the block,” and the panel highlights the best ways to communicate: content, being real, tone, and more. They showcase what kinds of tones can be used and how to find the “real voice” for your accounts; why those voices are (or are not) successful for those accounts; and how to make your chosen voice sustainable and consistent across your accounts.

C203: Migrating & Developing With Drupal

1:45 p.m. – 2:30 p.m.

June Yang, Senior Librarian, International Monetary Fund
Linda Venable, Systems Librarian, International Monetary Fund
Elizabeth Zoby, Information Specialist, PAE, National Institute of Corrections (NIC)
Billy Mathews, Web Developer, PAE, National Institute of Corrections (NIC)

Migrating to a new ILS is not easy, and it is even more challenging when faced with a very tight deadline. Presenters share the recent experience of migrating from SirsiDynix Symphony to Alma within 5 months: what worked, what didn’t, lessons learned, and what to prepare in advance of the migration. They also share some insight about post-migration work related to data cleanup, workflow review, etc. Zoby and Mathews share their development of the NIC micro-sites using Drupal, an open-source content management software, to create dynamic websites that make accessing material easy and user-friendly. Instead of having to download and sift through large PDF documents, users can access the content on easily searchable websites which can be edited by authorized users. See how the NIC Information Center is using these sites to help customers and the public access information in innovative ways.

D202: Funding Opps for Digital Library Initiatives

11:45 a.m. – 12:30 p.m.

Trevor Owens, Digital Archivist, Office of Strategic Initiatives, Library Of Congress
Nicole Ferraiolo, Program Officer, Scholarly Resources, Council on Library & Information Resources
Joel Wurl, Senior Program Officer, National Endowment for the Humanities

Discovering and deciphering guidelines for grant programs is a daunting and challenging process. This session provides an opportunity to directly hear from and ask questions about grant opportunities for digital libraries’ initiatives to program officers from different government and private funders. Following brief overviews of the relevant funding opportunities at their organizations, panelists discuss the kinds of projects that best fit their specific programs. Get suggestions on how to develop a competitive proposal and insights on the application and review process. Panelists consider themes and trends from the digital library projects that have received funding, such as digitization, open educational resources, linked data, crowdsourcing, open access publishing, emulation and virtualization, and data visualization. By bringing together representatives from different funders, this session offers a unique opportunity to connect directly with program officers and identify new opportunities and approaches for funding.

A301: Augmented Reality & Learning

10:45 a.m. – 11:30 a.m.

Ashley Todd-Diaz, Head, Special Collections & University Archives, Towson University
Earl Givens, Head, Systems & Technology, Catawba College
Art Gutierrez, Head, Technical Services, Emporia State University
Bethanie O’Dell, Virtual Learning Librarian, Emporia State University

Just when you thought the battle of augmented reality (AR) was over with Pokémon GO, libraries across the nation have been exploring additional AR options in order to meet the needs of the mobile learners walking through their doors. With the use of free AR software, four individuals team up to become the ultimate masters of AR. Hear from a panel of closely networked professionals, each with a unique story of embedding AR into specific library services directed for higher education. These stories range from embedding AR with liaison departments to incorporating AR into information literacy sessions (both online and face-to-face).

A304: Multimodal Learning: From Textbooks to Playlists

2:45 p.m. – 3:30 p.m.

Laurie Burruss, Professor, Pasadena City College

Colleges, universities, and libraries are considering adding video making, or visual literacy, as a core skill. Preparing individuals for a highly visual communication landscape requires critical thinking to offset consumerism as well as multimodal learning and cognitive skills. Researching, creating, and sharing video playlists are important ways to create personalized learning pathways and promote continuous learning. Explore a number of case studies that demonstrate the positive learning outcomes of multimodal learning in academic and corporate settings and discover how to create playlists that can be annotated, edited, and shared across teams.

B304: Raspberry Pi

2:45 p.m. – 3:30 p.m.

David Bennett, Systems Librarian, Robert Morris University

Raspberry Pi is an inexpensive computing tool that is ideal for new projects within libraries. It’s a powerful single board computer that plays high-definition video, yet it’s only the size of a credit card. The Raspberry Pi 3 was released in February of 2016, and the built-in networking options make it an exciting fit for library applications. Learn how Raspberry Pi can be used as a people counter, a dedicated OPAC, a social media tool, and more.
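My note: the people-counter use mentioned above can be sketched in a few lines of Python. This is a minimal sketch under stated assumptions, not code from the session: the debounce counting logic is plain Python and testable off-device, while the commented-out GPIO wiring (BCM pin 17, the RPi.GPIO library, a break-beam or PIR sensor) is an assumption that only applies on an actual Raspberry Pi.

```python
import time

class PeopleCounter:
    """Count sensor pulses (e.g., from a door break-beam),
    ignoring repeat triggers inside a debounce window."""

    def __init__(self, debounce_s=0.5):
        self.debounce_s = debounce_s  # minimum seconds between counted events
        self.count = 0
        self._last_counted = None

    def pulse(self, timestamp=None):
        """Register one sensor trigger; returns the running total."""
        t = time.monotonic() if timestamp is None else timestamp
        if self._last_counted is None or (t - self._last_counted) >= self.debounce_s:
            self.count += 1
            self._last_counted = t
        return self.count

# On an actual Pi, one might wire this to a GPIO interrupt
# (pin number and library choice are hypothetical):
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup(17, GPIO.IN, pull_up_down=GPIO.PUD_UP)
#   counter = PeopleCounter()
#   GPIO.add_event_detect(17, GPIO.FALLING,
#                         callback=lambda ch: counter.pulse())
```

Passing explicit timestamps keeps the logic testable on any machine; on the Pi, the interrupt callback simply calls `pulse()` with the real clock.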

D302: Upping Our “Gamification”: Speaking Millennials’ Language

11:45 a.m. – 12:30 p.m.

David Folmar, Emerging Technology Librarian, Main Branch, Richmond Public Library; Author, Game It Up! Using Gamification to Incentivize Your Library

Be tech-smart and culture-savvy by using game-design thinking and gaming activities to connect with current users in a fun way and draw in new ones. Hear from a library communicator who literally wrote the book on this topic. Online games are incredibly popular; libraries, book apps, and learning institutions are leveraging this to bring in new audiences and engage with existing ones in new ways. Why are they doing this, what is the benefit, and how do you make it work to promote your library? Get the answers here!

D303: Library Story in Video

1:45 p.m. – 2:30 p.m.

Jennifer E. Burke, President, IntelliCraft Research, LLC

Video is a powerful, emotional storytelling medium that plays well in social media, and its use is still fast-growing. Video can spread your library’s story, and you can do it without hiring an expensive pro. A tech-savvy info pro shares basic video facts, along with her favorite tools, tips, and techniques that almost anyone can use for creating short, compelling videos to promote library services, staffers, and resources.

My note: my mouth ran dry repeating this to the SCSU library. In vain. 1. Make a low-cost, social-media-style movie of about 30 sec each week/month. 2. Post it in a prominent place on the library web page. 3. Have a web form harvest info from patrons on the issues reflected in the video. 4. Mirror the video on social media. 5. As the ultimate goal, aim for patrons (students, faculty, staff) to furnish the video footage instead of library staff.
Why is it soooo difficult to comprehend?

E302: Zero to Maker: Invention Literacy & Mobile Memory Lab

11:45 a.m. – 12:30 p.m.

Dominique China, Information Services Librarian, Brampton Library
Colleen Dearborn, Adult Services Librarian, Alsip-Merrionette Park Library, Alsip, Ill.

Invention literacy is not just about understanding how a thing is made or how it works; it is also the ability to use that knowledge to bring one’s own ideas into reality. China gives examples of how one public library is empowering children, teens, and adults to become “invention-literate” through its maker programming. Find out how a combination of high- and low-tech equipment, safe and accessible learning environments, and a unique community partnership is nurturing invention, creative confidence, innovation, and entrepreneurship. Sparked by the CIL 2016 Hawkins and Mears talk about personal digital archiving and the DCPL Memory Lab, Dearborn shares her library’s inexpensive journey to create a mobile memory lab program. She discusses the acquisition of equipment and technology, the demo classes, lesson plans about personal archiving, outreach to other community organizations, and providing classes on related topics, such as saving and uploading images, backing up files and using cloud storage, writing and self-publishing memoirs, conducting oral interviews with veterans and other family members, coding and saving memories on a website, etc. Future plans include digitizing local history documents and a community website with links to these documents, as well as to our patrons’ digitized images, videos, interviews, and memoirs.

+++++++++++++++++++++
more on technology in library in this IMS blog
http://blog.stcloudstate.edu/ims?s=technology+library

IMS Instruction Sessions Spring 2016

Where is MC 205? Per the campus map, Miller Center 205 is on the second floor: head in the direction of the computer lab, right-hand side, past the counter with printers on both sides. Please use this virtual reality direction map to find the room (use Google Chrome and activate the QuickTime plugin).

Please use this link for a PDF copy of the Spring 2016 instruction sessions to print out.

Dreamweaver: 4 Mondays, 10-10:45 AM. Jan 18, 25, Feb 1, 8; location MC 205. Attendee cap: 5

Keywords: web development, web design, Adobe Dreamweaver

Description: Adobe Dreamweaver CC is the default web development tool on campus. In four consecutive weeks, learn the basics of Dreamweaver, web development, web design, and maintaining pages on the Web: site map and site structure; HTML and HTML5 basics; basics of CSS; page properties; text editing; hyperlinks and images; tables; and forms.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Photoshop: 4 Tuesdays, 10-10:45 AM. Jan 19, 26, Feb 2, 9; location MC 205. Attendee cap: 5

Keywords: image processing, image editing, visual literacy, Adobe Photoshop

Description: In four 45-min sessions, learn the basics of image editing and gain a comprehensive understanding of Adobe Photoshop and its essential tools. Design and edit; adjust images for the Internet and for printouts. Learn image formats, compression, and layers; retouching, repairing, and correcting photos.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Social Media in Education: 9:30-10:15 AM. Feb 3, 10, 17, 24; location MC 205. Attendee cap: 15

Keywords: social media, social media in education, social media and learning, social media and teaching, social media and communication, Facebook, Twitter, Instagram, LinkedIn, YouTube, Diigo, Delicious, Evernote, SideVibe, Pinterest, Vine, Snapchat, Google+, Zotero, Mendeley, blogs, wikis, podcasts, visuals, text
Description: In four 45-min sessions, structure your approach to social media and assess how to use it in teaching and learning. What social media is and how to use it. How to distinguish between personal and professional use of social media. Among the 180 most popular social media tools, acquire a robust structure to cluster them and orient yourself quickly and easily about which tools best fit your teaching materials and methods to enable learning and communication with your students. Visuals versus text, and how to combine them for effective communication and teaching. Policies and engagement of students. Expanding and improving your research, and its organization, through social media and networking.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Cheating: what, why, and how to avoid it. Jan 28, 10-10:45 AM; location MC 205. Attendee cap: 15

Keywords: cheating, academic dishonesty, academic integrity, plagiarism.

Description: In 45 minutes we can start a conversation about identifying cheating practices and determining what plagiarism is, considering generational differences and the evolution of the Internet. Identifying “cheating” can provide robust boundaries for understanding students’ behavior and for finding practices and methods to alleviate such behavior, including changes to teaching methods and practices.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

10 basic steps to start with social media. March 16, 11-11:45 AM; location MC 205. Attendee cap: 15

Keywords: social media, social media in education, social media and learning, social media and teaching, social media and communication, Facebook, Twitter, Instagram, LinkedIn, YouTube, Diigo, Delicious, Evernote, SideVibe, Pinterest, Vine, Snapchat, Google+, Zotero, Mendeley, blogs, wikis, podcasts, visuals, text

Description: An introduction to social media and its use for personal and professional purposes. Ideas and scenarios for using different social media tools in education. Hands-on exercises for using social media in teaching.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Games and Gamification in Education. Feb 24, 2-2:45 PM; March 25, 10-10:45 AM; April 14, 2-2:45 PM. MC 205. Attendee cap: 5

Keywords: play, games, serious games, game-based learning, gaming, gamification.

Description: Gaming and gamification is one of the most pronounced trends in education, per the NMC Horizon Report. Besides increasing participation and enthusiasm, it increases learning. An introduction to gaming and gamification: establishing definitions, learning to distinguish gaming from gamification, and learning the basics of both in the teaching process. Hands-on exercises for introducing gaming practices into the teaching and learning process and for gamifying existing syllabi.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Teaching Online. Jan 29, 10-10:45 AM; Feb 18, 2-2:45 PM; March 30, 3-3:45 PM. MC 205. Attendee cap: 5.

Keywords: online teaching, mobile teaching, distance education, distributive learning, hybrid learning, hybrid teaching, blended learning

Description: This 45-min session is aimed at helping you transition your F2F teaching to hybrid and online teaching. Learn about synchronous and asynchronous modes of teaching and communication in order to structure and organize your class materials and methods for better delivery. Hands-on exercises for improving content delivery, class discussions, and communication between instructor and students.
Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Effective Presentations. Jan 28, 2-2:45 PM. MC 205. Attendee cap: 10

Keywords: presentations, PowerPoint, alternatives to PowerPoint, presentation design, presentation essentials, Prezi, SlideShare, LodeStar, Zentation, Zoho, Powtoon, Zaption, Thinglink, Haiku, Kahoot, Storify, EdPuzzle, PollDaddy, Evernote, Mammoth, SideVibe, Paddlet, Remind, Death by PowerPoint, visual literacy, media literacy, digital literacy, visuals
Description: http://blog.stcloudstate.edu/ims/2016/01/07/effective-presentations/ . This 45-minute session is aimed at introducing and orienting faculty, staff, and students to the abundance of alternatives to PowerPoint and at revisiting the basics of a well-tailored presentation. Hands-on exercises for improving the structure and delivery of a presentation as well as the choice of presentation tools.
Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Death by PowerPoint. Feb 26, 10-10:45 PM. MC 205. Attendee cap: 10

Keywords: presentations, PowerPoint, alternatives to PowerPoint, presentation design, presentation essentials, Death by PowerPoint, visual literacy, media literacy, digital literacy, visuals.
Description: http://blog.stcloudstate.edu/ims/2016/01/07/effective-presentations/ . This 45-minute session is aimed at introducing and orienting faculty, staff, and students to the basics of PowerPoint and at revisiting the basics of a well-tailored presentation. Hands-on exercises for improving the structure and delivery of a presentation as well as the choice of presentation tools.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Contemplative Computing, or Disconnect: How to Bring Balance to Your Life by Managing Your Technology Well. Feb 17, 2-2:45 PM. MC 205. Attendee cap: 10

Keywords: disconnect, Sherry Turkle, contemplative computing, meditation, contemplative practices, balance, technology stress

Description: This 45-min session introduces faculty, staff, and students to the idea of regulating the use of technology in a meaningful way. Hands-on exercises and sharing of good practices for balancing the use of technology in daily life.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Videos in the classroom: fast and easy. Jan 28, 10-10:45 PM. MC 205. Attendee cap: 5.

Keywords: video, video editing, video manipulation, visual literacy, digital literacy, MovieMaker, iMovie, Instagram, Vine, YouTube, Kaltura

Description: This 45-min session is an orientation to the resources available for delivering visual materials in the classroom. Hands-on experience with different basic tools on different computer platforms.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Voice Over presentations: solutions. Feb 4, 10-10:45 PM. MC 205. Attendee cap: 5.

Keywords: PowerPoint, VoiceThread, LodeStar, MediaSpace (Kaltura), audio editing, narration

Description: http://blog.stcloudstate.edu/ims/2015/04/28/voice-over-presentation-solutions/ . This 45-min session is a short hands-on introduction to the tools available at MnSCU institutions and to free third-party applications for delivering narration attached to presentations.

Remote participation through desktop sharing at http://scsuconnect.stcloudstate.edu/ims upon registration and specific request

 

Infographics: make your projects, presentations, and research credible through presentable data. Feb 10, 2-2:45 PM; March 29, 10-10:45 AM. MC 205. Attendee cap: 10

Keywords: Piktochart, Infogram, Visual.ly, statistics, visual literacy, digital literacy
Description: http://blog.stcloudstate.edu/ims/2014/04/09/infographics-how-to-create-them/ . This 45-min session is an orientation to the world of infographics. A short introduction to the basics of statistics and their importance in presenting research and ideas. Hands-on exercise using one of three popular infographic tools.

Do student evaluations measure teaching effectiveness?

Assistant Professor in MIS

Higher education institutions use course evaluations for a variety of purposes. They factor into retention analysis for adjuncts, tenure approval or rejection for full-time professors, and even salary bonuses and raises. But are the results of course evaluations an objective measure of high-quality scholarship in the classroom?

—————————-

  • Daniel Williams

    Associate Professor of Molecular Biology at Winston-Salem State University

    I feel they measure student satisfaction, more like a customer service survey, than they do teaching effectiveness. Teachers students think are easy get higher scores than tough ones, though the students may have learned less from the former.

  • Muvaffak GOZAYDIN

    Founder at Global Digital University

    How can you measure teachers’ effectiveness?
    That is, how much students learn?
    If there is a method to measure how much we learn, I would appreciate learning it.

  • Michael Tomlinson

    Senior Director at TEQSA

    From what I recall, the research indicates that student evaluations have some value as a proxy and rough indicator of teacher effectiveness. We would expect that bad teachers will often get bad ratings, and good teachers will often get good ratings. Ratings for individual teachers should always be put in context, IMHO, for precisely the reasons that Daniel outlines.

    Aggregated ratings for teachers in departments or institutions can even out some of these factors, especially if you combine them with other indicators, such as progress rates. The hardest indicators, however, are drop-out rates and completion rates. When students vote with their feet, this can flag significant problems. We have to bear in mind that students often drop out for personal reasons, but if your college’s drop-out rate is higher than your peers’, this is worth investigating.

  • Rina Sahay

    Technical educator looking for a new opportunity or career direction

    I agree with what Michael says – to a point. Unfortunately, student evaluations have also been used as a venue for disgruntled students, acting alone or in concert – a popularity contest of sorts. Even more unfortunately, college administrations (especially for-profits) tend to rate instructor effectiveness on the basis of student evaluations.

    IMHO, student evaluation questions need to be carefully crafted to be as objective as possible and to eliminate the possibility of responses of an unprofessional nature. To clarify: a question like “Would you recommend this teacher to other students?” has the greatest potential for counter-productivity.

  • Robert Whipple

    Chair, English Department at Creighton University

    No.

  • Dr. Virginia Stead, Ed.D.

    2013-2015 Peter Lang Publishing, Inc. (New York) Founding Book Series Editor: Higher Education Theory, Policy, & Praxis

    This is not a Cartesian question in that the answer is neither yes nor no; it’s not about flipping a coin. One element that may make it more likely that student achievement is a result of teacher effectiveness is the comparison of cumulative or summative student achievement against incoming achievement levels. Another variable is the extent to which individual students are sufficiently resourced (such as having enough food, safety, shelter, sleep, learning materials) to benefit from the teacher’s beneficence.

  • Barbara Celia

    Assistant Clinical Professor at Drexel University

    Depends on how the evaluation tool is developed. However, overall I do not believe they are effective in measuring teacher effectiveness.

  • Sri Yogamalar

    Lecturer at MUSC, Malaysia

    Overall, I think students are the best judge of a teacher’s effective pedagogy methods. Although there may be students with different learning difficulties (as there usually is in a class), their understanding of the concepts/principles and application of the subject matter in exam questions, etc. depends on how the teacher imparts such knowledge in a rather simplified and easy manner to enhance analytical and critical thinking in them. Of course, there are students too who give a bad review of a teacher’s teaching mode out of spite just because the said teacher has reprimanded him/her in class for being late, for example, or for even being rude. In such a case, it would not be a true reflection of the teacher’s method of teaching. A teacher tries his/her best to educate and inculcate values by imparting the required knowledge and ensuring a 2-way teaching-learning process. It is the students who will be the best judge to evaluate and assess the success of the efforts undertaken by the teacher because it is they who are supposed to benefit at the end of the teaching exercise.

  • Paul S Hickman

    Member of the Council of Trustees & Distinguished Mentor at Warnborough College, Ireland & UK

    No! No!

  • Bonnie Fox

    Higher Education Copywriter

    In some cases, I think evaluations (and negative ones in particular) can offer a good perspective on the course, especially if an instructor is willing to review them with an open mind. Of course, there are always the students who nitpick and, as Rina said, use the eval as a chance to vent. But when an entire class complains about how an instructor has handled a course (as I once saw happen with a tutoring student whose fellow classmates were in agreement about the problems in the course), I think it should be taken seriously. But I also agree with Daniel about how evaluations should be viewed like a customer service survey for student satisfaction. Evals are only useful up to a point.

    I definitely agree about the way evaluations are worded, though, to make sure that it’s easier to recognize the useful information and weed out the whining.

  • Pierre HENON

    university teacher (professeur agrege)

    I am a director of studies, and students in continuing education evaluate teaching effectiveness. Because I am in an ISO process, I must take those measurements into account. It can be very difficult sometimes because the number of students does not reach the level required for the sample to be valid (in a statistical sense). But in the meantime, I believe in the utility of such measurements. The hard job for me is when I have to discuss with the teacher who is under the required score.

  • Maria Persson

    Senior Tutor – CeTTL – Student Learning & Digital/Technology Coach (U of W – Faculty of Education)

    I’m currently ‘filling in’ as the administrator in our Teaching Development Unit – Appraisals, and I have come to appreciate that the evaluation tool of choice is only that: a tool. How the tool is used in terms of the objective for collecting ‘teaching effectiveness’ information, the question types developed to gain insight, and then how that info is acted upon to inform future teaching and learning will in many ways denote the quality of the teaching itself!

    Student voice is not just about keeping our jobs, ‘bums on seats’ or ‘talking with their feet’ (all part of it of course) but should be about whether or not we really care about learning. Student voice in the form of evaluating teachers’ effectiveness is critically essential if we want our teaching to model learning that affects positive change – Thomas More’s educational utopia comes to mind…

  • David Shallenberger

    Consultant and Professor of International Education

    Alas, I think they are weak indicators of teaching effectiveness, yet they are often used as the most important indicators of the same. And in the pursuit of a high response rate, they are too often given on the last day of class, when they cannot measure anything significant — before the learning has “sunk in.” Ask better questions, and ask them after students have had a chance to reflect on the learning.

  • Cathryn McCormack

    Lecturer (Teaching and Learning), and Belly Dance teacher

    I’m just wrapping up a very large project at my university that looked at policy, processes, systems, and the instrument for collecting student feedback (taking a break from writing the report to write this comment). One thing that has struck me very clearly is that we need to reconceptualise SETs. DeVellis, in Scale Development, talks about how a scale generally has higher validity if the respondent is asked to talk about their own experiences.

    Yet here we are asking students to not only comment on, but evaluate, their teachers. What we really want students to do in class is concentrate on their learning – not on what the teacher is doing. If they are focussing on what the teacher is doing, then something is not going right. The way we ask now seems even crazier when we consider that the most sophisticated conception of teaching is to help students learn. So why aren’t we asking students about their learning?

    The standard format has something to do with it – it’s extremely difficult to ask interesting questions on learning when the wording must align with a 5 point Likert response scale. Despite our best efforts, I do not believe it is possible to prepare a truly student centred and learning centred questionnaire using this format.

    An alternate format I came across that I really liked is the Modified PLEQ (Devlin 2002, An Improved Questionnaire for Gathering Student Perceptions of Teaching and Learning), but no commercial evaluation software (which we are required to purchase) can do it. A few overarching questions set the scene for the nature of the class, but the general question format goes: In [choose from drop-down list] my learning was [helped/hindered] when [fill in the blank] because [fill in the blank]. The drop-down list would include options such as lectures, seminars/tutorials, a private study situation, preparing essays, labs, field trips, etc. After completing one question the student has the option to fill in another … and another … and another … for as long as they want.

    Think about what information we could actually get on student learning if we started asking like this! No teacher ratings, all learning. The only numbers that would emerge would be the #helped and the #hindered.
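    A Devlin-style response set reduces to exactly that pair of counts. As a minimal, hypothetical sketch (every response below is invented, not taken from any real questionnaire), the tally could be computed like this:

```python
from collections import Counter

# Hypothetical PLEQ-style responses: (context, "helped" or "hindered",
# the student's free-text reason). All entries are invented examples.
responses = [
    ("lectures", "helped", "worked examples made the theory concrete"),
    ("labs", "hindered", "instructions assumed software we had not used"),
    ("lectures", "helped", "weekly summaries tied the topics together"),
    ("private study", "helped", "the reading guide pointed to the right chapters"),
    ("labs", "hindered", "too little time to finish the exercises"),
]

# The only numbers that emerge: #helped and #hindered.
tally = Counter(kind for _, kind, _ in responses)
print(tally["helped"], tally["hindered"])  # → 3 2
```

    Everything else in the data stays qualitative: the contexts and reasons are there to be read, not averaged.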

  • Hans Tilstra

    Senior Coordinator, Learning and Teaching

    Keep in mind “Goodhart’s Law” – When a measure becomes a target, it ceases to be a good measure.

    For example, if youth unemployment figures become the main measure, governments may be tempted to go for the low-hanging fruit in the short term (e.g. a work-for-the-dole stick to steer unemployed people into study or the army).

  • robert easterbrook

    Education Management Professional

    Nope.

  • John Stanbury

    Professor at Singapore Institute of Management

    I totally agree with most of the comments here. I find student evaluations to be virtually meaningless as measures of a teacher's effectiveness. They are measures of student perception, NOT of learning. Yet university administrators, e.g. deans and department chairs, persist in using them to evaluate faculty performance in the classroom, to the point where many instructors have had their careers torn apart. It's an absolute disgrace! But no one seems to care! That's the sick thing about it!

  • Simon Young

    Programme Coordinator, Pharmacy

    Satisfaction cannot be simply correlated with teaching quality. The evidence is that students are most “satisfied” with courses that support a surface learning approach – what the student “needs to know” to pass the course. Where material and delivery are challenging, this generates less crowd approval but, conversely, is more likely to be “good teaching”, as this supports deep learning.

    Our challenge is to achieve deep learning and still generate rave satisfaction reviews. If any reader has the magic recipe, I would be pleased to learn of it.

  • Laura Gabiger

    Professor at Johnson & Wales University

    Maybe it is about time we started calling it what it is and got Michelin to develop the star rating system for our universities.

    Nevertheless I appreciate everyone’s thoughtful comments. Muvaffak, I agree with you about the importance and also the difficulty of measuring student learning. Cathryn, thank you for taking a break from your project to give us an overview.

    My story: the best professor and mentor in my life (I spent a total of 21 years as a student in higher education), the professor from whom I learned indispensable and enduring habits of thought that have become more important with each passing year, was one whom the other graduate students in my first term told me, almost unanimously, to avoid at all costs.

  • Dr. Pedro L. Martinez

    Former Provost and Vice Chancellor for Academic Affairs at Winston Salem State University & President of HigherEd SC.

    I am not sure that course evaluations based on one snapshot measure “teacher effectiveness”. For various reasons, some ineffective teachers get good ratings by pandering to the lowest level of intellectual laziness. However, consistently looking at comments and some other measures may yield indicators of teachers who are unprepared, do not provide feedback, do not adhere to a syllabus of record, and do not respect students in general. I think part of that information depends on how the questions are crafted.

    I believe that a self-evaluation by the instructor over the course of a semester could yield invaluable information. Using a camera and other devices, ask the instructor to take snapshots of the teaching/learning in their classroom over a period of time, and then ask for a self-evaluation. For the novice teacher, that information could be reviewed by senior faculty to help the junior faculty member improve his/her delivery. Many instructors are experts in their field but lack exposure to different methods of instructional delivery. I would like to see a taxonomy or scale that measures the instructor's ability, using lecture as the base of instruction and moving up through levels of problem-based learning, service learning, and undergraduate research, gauging the different pedagogies (pedagogy, andragogy, heutagogy, paragogy, etc.) that engage students in active learning.

  • Steve Charlier

    Assistant Professor at Quinnipiac University

    I wanted to piggyback on Cathryn's comment above, and align myself with how many of you seem to feel about student evaluations. The quantitative part of student evals is problematic, for all of the reasons mentioned already. But the open-ended feedback that is (usually) a part of student evaluations is where I believe some real value can be gained, both for administrative purposes and for instructor development.

    When allowed to speak freely, what are students saying? Are they lamenting a particular aspect of the course/instructor? Is that one area coloring their response across all questions? These are all important considerations, and provide a much richer source of information for all involved.

    Sadly, the quantitative data is what most folks gravitate to, simply because it’s standardized and “easy”. I don’t believe that student evaluations are a complete waste of time, but I do think that we tend to focus on the wrong information. And, of course, this ignores the issues of timing and participation rates that are probably another conversation altogether!

  • robert easterbrook

    Education Management Professional

    ‘What the Student Does: teaching for enhanced learning’ by John Biggs in Higher Education Research & Development, Vol. 18, No. 1, 1999.

    “The deep approach refers to activities that are appropriate to handling the task so that an appropriate outcome is achieved. The surface approach is therefore to be discouraged, the deep approach encouraged – and that is my working definition of good teaching. Learning is thus a way of interacting with the world. As we learn, our conceptions of phenomena change, and we see the world differently. The acquisition of information in itself does not bring about such a change, but the way we structure that information and think with it does. Thus, education is about conceptual change, not just the acquisition of information.” (p. 60)

    This is the approach higher education is trying to adapt to at the moment, as far as I'm aware.

  • Cindy Kenkel

    Northwest Missouri State University

    My Human Resource students will focus on this issue in a class debate “Should student evaluation data significantly impact faculty tenure and promotion decisions?” One side will argue “yes, it provides credible data that should be one of the most important elements” and the other group will argue against this based on much of what has been said above. They will say student evaluations are basically a popularity contest and faculty may actually be dumbing down their classes in order to get higher ratings.

    To what extent is student data used in faculty tenure and promotion decisions at your institutions?

  • yasir hayat

    Faculty member at Institute of Management Sciences, Peshawar

    NO

  • joe othman

    Associate Professor at Institute of Education, IIUM

    Agree with Pierre: when the number of students responding is not what is expected, then what?

  • joe othman

    Associate Professor at Institute of Education, IIUM

    Cindy: it is used in promotion decisions at my university, but only as a small percentage of the total points. Yet this issue is still a thorny one for some faculty.

  • Sonu Sarda

    Lecturer at University of Southern Queensland

    How open are we? Is learning about the delivery of a subject only, or about building soft skills as well? If we as teachers are facilitating learning in a conducive manner, would it not lead to at least an average TE, and thus indicate our teaching effectiveness at a base level? Indeed, a qualitative approach would be far better if we intend to accomplish the actual purpose of TE, i.e. reflection for continual improvement. More and more classrooms are becoming learner-centered, and to accomplish this the learners' say is vital.
    Some students using these as platforms for personal whims need not be a concern, since the TEs are averaged out. Last but not least, TEs are like dynamite and must be handled by experts. They are one means of assessing the gaps, if any, between teaching and learning strategies. They must not be used for performance evaluation; if they are at all, then all the other factors, such as the number of students, absenteeism, and pass rates (indeed HD and D rates) over a minimum of three terms, must also be included alongside.

  • Dvora Perets

    Teaching colleague at Ben Gurion University of the Negev

    I implement a semester-long self-evaluation process in all my mathematics courses. Students get 3 points (out of 100) for anonymously filling in an online questionnaire every week. They rate (1-5) their personal class experience (“I was bored” to “I was fascinated”, “I understood nothing” to “I understood everything”, “the tutorial sessions didn't/did help”), note whether they visited the lecturer's/TA's office hours, and report how many hours of self-learning they spent that week. They can also add verbal comments.
    I started it 10 years ago, when I built a new special course, to help me “hear” the students (80-100 in each class) and to better adjust myself and the content to my new students. I used to publish a weekly response to the verbal comments, accepting some and rejecting others, while making sure to explain and justify every decision of mine.
    Not only did it help me improve my teaching and the course, but it turned out that it actually created a very solid perception of me as a caring teacher. I always was a very caring teacher (some of my colleagues accuse me of being over-caring…), but it seems that “forcing” my students to give feedback throughout the semester kind of brought it out into the open.

    I am still using semester-long feedback in all my courses, and I consider both the quantitative and the qualitative responses. It helps me see that the majority of students understand me in class. I ignore those who choose “I understood nothing”: obviously, if they were indeed understanding “nothing” they would not have come to class… (they can choose “I didn't participate” or “I don't want to answer” instead).
    I ignore all verbal comments that aim to “punish” me, and I change things when I think students are right.
    Finally, being a math lecturer for non-major students is extremely hard, both academically and emotionally. Most students are not willing to do what is needed in order to understand the abstract/complicated concepts and processes.
    Only a few (“courageous”) students will attribute their lack of understanding to the fact that they did not attend all classes, or that they weren't really focused on learning (probably they spent a lot of time on Facebook during class…), or that they didn't go over class notes at home and come to office hours when they didn't understand something, etc.
    I am encouraged by the fact that about 2/3 of the students who attend classes report “understood enough” or above (3-5) all semester long. This is especially important, as only 40-50% of the students fill in the formal end-of-semester SE, and I bet you can guess how the majority of them will rate my performance. Students fill in the SE before the final exam, but (again) you can guess how two midterms with about 24% failure rates will influence their evaluation of my teaching.

  • Michael Tomlinson

    Senior Director at TEQSA

    I think it’s important to avoid defensive responses to the question. Most participants have assumed that we are talking about individual teachers being assessed through questionnaires, and I share everyone’s reservations about that. I entirely agree that deep learning is what we need to go for, but given the huge amounts of public money that are poured into our institutions, we need to have some way of evaluating whether what we are doing is effective or whether it isn’t.

    I'm not impressed by institutions that are obsessed only with evaluation by numbers. However, there is some merit in monitoring aggregated statistics over time and detecting statistically significant variations. If average satisfaction rates in Engineering have gone down every year for five years, shouldn't we try to find out why? If satisfaction rates in Architecture have gone up every year for five years, wouldn't it be interesting to know if they have been doing something to bring that about that might be worthwhile? It might turn out to be a statistical artifact, but we need to inquire into it, and bring the same arts of critical inquiry to bear on the evidence that we use in our scholarship and research.

    But I always encourage faculties and institutions to supplement this by actually getting groups of students together and talking to them about their student experience as well. Qualitative responses can be more valuable than quantitative surveys. We might actually learn something!
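    The aggregate monitoring described above needs no heavy statistics to get started. As a rough sketch (programme names and numbers are invented for illustration), one could flag any programme whose mean satisfaction rating has moved in the same direction for five straight years:

```python
def sustained_trend(yearly_means, run=5):
    """Return 'down', 'up', or None for a strictly monotonic run
    over the last `run` yearly mean ratings."""
    tail = yearly_means[-run:]
    if len(tail) < run:
        return None  # not enough history to judge
    if all(a > b for a, b in zip(tail, tail[1:])):
        return "down"
    if all(a < b for a, b in zip(tail, tail[1:])):
        return "up"
    return None

# Invented five-year mean satisfaction ratings per programme.
programmes = {
    "Engineering":  [4.1, 4.0, 3.8, 3.7, 3.5],  # falls every year
    "Architecture": [3.2, 3.4, 3.5, 3.7, 3.9],  # rises every year
    "Business":     [3.8, 3.9, 3.7, 3.8, 3.8],  # no clear trend
}

for name, means in programmes.items():
    flag = sustained_trend(means)
    if flag:
        print(f"{name}: mean satisfaction has gone {flag} five years running")
```

    A real system would also test whether the movement exceeds sampling noise before sounding the alarm, but even a crude flag like this tells you where to go and start asking questions.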

  • Aleardo Manacero

    Associate Professor at UNESP – São Paulo State University

    Like everyone here, I also think that these evaluation forms do not truly measure teaching effectiveness. This is quite a hard thing to evaluate, since the effect of learning will be felt several years later, while students perform their job duties.

    Besides that, some observations made by students are interesting for our own growth. I usually get these through informal talks with the class or even some students.

    In another direction, some of the previous comments address deep/surface learning, basically stating that deep learning is the right way to go. I have to disagree with this for some of the content that has to be taught. In my case (teaching computer science majors) it is important, for example, that every student have surface knowledge of operating systems design, but those who are going to work as database analysts do not need to know the deep concepts involved (and the same is true of database concepts for a network analyst…). So surface learning also has its relevance in professional formation.

  • George Christodoulides

    Senior Consultant and Lecturer at university of nicosia

    The usefulness of student evaluations, like all similar surveys, is closely linked to the particular questions respondents are asked to answer. There are the objective/factual questions, such as “Does he start class on time?” or “Does he speak clearly?”, and the very personal questions, such as “Does he give fair grades?”. The effectiveness of a teacher could be more appropriately linked to suitably phrased questions, such as “Has he motivated you to learn?” and “How much have you benefited from the course?”. The responses to these questions could also be further assessed by comparing the final grades given in that particular course with the performance of the class in the other courses they have taken during that semester. So, for assessing teacher effectiveness, one needs to ask relevant questions and perform the appropriate evaluations.

  • Laura Gabiger

    Professor at Johnson & Wales University

    Michael has an excellent point that some accountability of institutions and programs is appropriate, and that aggregated data or qualitative results can be useful in assessing whether the teaching in a particular program is accomplishing what it sets out to do. Many outcomes studies are set up to measure the learning in an aggregated way.

    We may want to remember that our present conventions of teaching evaluation had their roots in the 1970s (in California, if I remember correctly), partly as a response to a system in which faculty, both individually and collectively, were accountable to no one. I recall my student days when a professor in a large public research institution would consider it an intrusion and a personal affront to be asked to supply a course syllabus.

    As the air continues to leak out of the USA's higher education bubble, as enrollments drop and the number of empty seats rises, it seems inevitable that institutions will feel the pressure to offer anything to make students perceive their experience as positive. It may be too hard to make learning, often one of the most uncomfortable experiences in life, the priority. Faculty respond defensively because we are continually put in the position of defending ourselves, often by poorly designed quantitative instruments that address every kind of feel-good hotel-concierge aspect of classroom management while overlooking learning.

  • Sethuraman Jambunatha

    Dean (I & E) at Vinayaka Mission

    The evaluation of faculty by students is welcome. The statistics can be looked at with a certain degree of objectivity. An instructor strict with his/her students may be ranked low in spite of being an asset to the department; a ‘free-lance’ teacher with students may be placed higher despite being a poor teacher. At any rate, the HoD's duty is to observe the quality of all teachers, and his objective evaluation is final. The parents' feedback is also to be taken. Actually, teaching is a multi-dimensional task, and student evaluation is just one coordinate.

  • Edwin Herman

    Associate Professor at University of Wisconsin, Stevens Point

    Student evaluations are a terrible tool for measuring teacher effectiveness. They do measure student satisfaction, and to some extent they measure student *perception* of teacher effectiveness. But the effectiveness of a teaching method or of an instructor is poorly correlated with student satisfaction: while there are positive linkages between the two concepts, students are generally MORE satisfied by an easy course that makes them feel good than by a hard course that makes them really think and work (and learn).

    Students like things that are flashy, and things that are easy more than they like things that require a lot of work or things that force them to rethink their core values. Certainly there are students who value a challenge, but even those students may not recognize which teacher gave them a better course.

    Student evaluations can be used effectively to help identify very poor teaching. But they are useless for distinguishing between adequate and good teaching practices.

  • Cesar Granados

    Former Administrative Vice-Rector at Universidad Nacional de San Cristóbal de Huamanga

    César S. Granados
    Retired Professor from The National University of San Cristóbal de Huamanga
    Ayacucho, PERÚ

    Since teaching effectiveness is a function of teacher competencies, an effective teacher is able to use the existing competencies to achieve the desired student results; but student performance mainly depends on the student's commitment to achieving competencies.

  • Steve Kaczmarek

    Professor at Columbus State Community College

    The student evaluations I’ve seen are more like customer satisfaction surveys, and in this respect, there is less helpful information for the instructor to improve his or her craft and instead more feedback about whether or not the student liked the experience. Shouldn’t their learning and/or improving skills be at least as important? I’m not arguing that these concepts are mutually exclusive, but the evaluations are often written to privilege one over the other.

    There are other problems. Using the same evaluation tool for very different kinds of courses (lecture versus workshop, for instance) doesn’t make a lot of sense. Evaluation language is often vague and puzzling in what it rewards (one evaluation form asks “Was the instructor enthusiastic?” Would an instructor bursting with smiles and enthusiasm but who is disorganized and otherwise less effective be privileged over one who is low-key but nonetheless covers the material effectively?). The “halo effect” can distort findings, where, among other things, more attractive instructors can get higher marks.

    Given how many times I’ve heard from students about someone being their favorite instructor because he or she was easy, I question the criteria students may use when evaluating. Instructors are also told that evaluations are for their benefit to improve teaching ability, but then chairs and administrators use them in promotion and hiring decisions.

    I think if the evaluation tool is sound, it can be useful to helping instructors. But, lastly, I think of my own experiences as a student, where I may have disliked or even resented some instructors because they challenged me or pushed me out of my comfort zone to learn new skills or paradigms. I may have evaluated them poorly at the time, only to come to learn a few years later with greater maturity that they not only taught me well, but taught me something invaluable, and perhaps more so than the instructors I liked. In this respect, it would be more fair to those instructors for me to fill out an evaluation a few years later to accurately describe their teaching.

  • Diane Halm

    Adjunct Professor of Writing at Niagara University

    Wow, there are so many valid points raised, so many considerations. In general, I tend to agree with those who believe it gauges student satisfaction more than learning, though there is a correlation between the two. After 13 years as an adjunct at a relatively small, private college, I have found that engagement really is what many students long for. It seems far less about the final grades earned and more about the tools they've acquired. It should be mentioned that I teach developmental-level composition, and while almost no student earns an A, most feel they have learned much. :)

  • Nira Hativa

    Former director, center for the advancement of teaching at Tel Aviv University

    Student ratings of instruction (SRI) do not measure teaching effectiveness but rather student satisfaction with instruction (as some previous comments on this list suggest). However, there is substantial research evidence for the relationships between SRIs and some agreed-upon measures of good teaching and of student learning. This research is summarized in much detail in my recent book:
    Student Ratings of Instruction: A Practical Approach to Designing, Operating, and Reporting (220 pp.) https://www.createspace.com/4065544
    ISBN-13:978-1481054331

  • robert easterbrook

    Education Management Professional

    Learning is not about what the teacher does, it is about what the learner does.

    Do not confuse the two.

    Learning is what the learner does with what the teacher teaches.

    If you think that learning is all about what the teacher does, then the SRI will mislead and deceive.

  • Sami Samra

    Associate Professor at Notre Dame University – Louaize

    Evaluation, in all its forms, is a complex exercise that needs both knowledge and skill. Further, evaluation can best be achieved through a variety of instruments. We know all of this as teachers. The question is how knowledgeable our students are regarding the teaching/learning process. More to the point, how knowledgeable are our administrators in translating information collected from questionnaires (some of which are of questionable validity) into plausible data-based decisions? I agree that students should have a say in how their courses are being conducted. But to use their feedback, quantitatively, to evaluate university professors… I fear that I must hold a very skeptical stance towards such evaluation.

  • Top Contributor

    Quite an interesting topic, and I’m reminded of the ancient proverb, “Parts is not parts.” OK, maybe that was McDonalds. This conversation would make a very thoughtful manuscript.

    Courses is not courses. Which course will be more popular, “Contemporary Music” or “General Chemistry?”

    Search any university using the following keywords “really easy course [university].” Those who teach these courses are experts at what they do, and what they do is valuable, however the workload for the student is minimal.

    The major issues: (1) popularity is inversely proportional to workload; and (2) the composition of the questions appearing on course and professor evaluations (CAPEs).

    “What grade do you expect in this class? Instructor explains course material well? Lectures hold your attention?”

    If Sally gets to listen to Nickelback in class and then next period learn quantum mechanics, which course does one suppose best held her attention?

    A person about to receive a C- in General Chemistry is probably receiving that C- because s/he was never able to understand the material for lack of striving, and probably hates the subject. That person is very likely to have never visited the professor during office hours for help. Logically one might expect low approval ratings from such a scenario.

    A person about to receive an A in General Chemistry is getting that A because s/he worked his/her tail off. S/he was able to comprehend mostly everything the professor said, and most probably liked the course. Even more, s/he probably visited the professor during office hours several times for feedback.

    One might argue that the laws of statistics will work in favor of reality; however, that's untrue when only 20% of students respond to CAPEs. Those who respond either love the professor or hate the professor. There's usually no middle ground. Add this to internet anonymity, and the problem is compounded. I am aware of multiple studies conducted by universities indicating a high correlation between written CAPEs and electronic CAPEs; however, I'd like to bring up one point.

    Think of the last time you raised your voice to a customer service rep on the phone. Would you have raised your voice to that rep in person?

    There’s not enough space to comment on all the variables involved in CAPE numerical responses. As of last term I stopped paying attention to the numbers and focused exclusively on the comments. There’s a lot of truth in most of the comments.

    I would like to see the following experiment performed. Take a group of 10,000 students. Record their CAPE responses prior to receiving their final grade. Three weeks later, have them re-CAPE. One year later, have them re-CAPE again. Two years. Three years. Finally, have them re-CAPE after getting a job.

    Many students don’t know what a professor did for them until semesters or years down the road. They’re likely to realize how good of a teacher the professor was by their performance in future courses in the same subject requiring cumulative mastery.

    Do I think student evaluations measure teaching effectiveness? CAPEs is not CAPEs.
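    The low-response worry above is easy to make concrete with a toy simulation (every number below is invented, not drawn from any real survey): if students at the extremes answer far more often than the middle, the respondents stop resembling the class.

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

# Invented class of 100: most students mildly satisfied, a few extreme.
population = [1] * 10 + [3] * 35 + [4] * 35 + [5] * 20

def responds(rating):
    # Assumed response model: extreme opinions reply much more often.
    return random.random() < (0.6 if rating in (1, 5) else 0.1)

respondents = [r for r in population if responds(r)]

rate = len(respondents) / len(population)
extreme_share = sum(r in (1, 5) for r in respondents) / len(respondents)
print(f"response rate {rate:.0%}; ratings of 1 or 5 are {extreme_share:.0%} "
      f"of responses, versus 30% of the class")
```

    Under this assumed model only a minority respond, and the 1s and 5s make up well over half of what anyone reading the evaluations actually sees.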

  • Anne Gardner

    Senior Lecturer at University of Technology Sydney

    No, of course they don’t.

  • Christa van Staden

    Owner of AREND.co, a professional learning community for educators

    No, it does not. Classroom effectiveness should be measured by students' results, the teacher's attitude towards students, and the quality of their preparation. I worked with a man who told a story about the different hats and learning, and I thought that was a new way of looking at learning. To my utmost shock, my colleague, who sat in because he had to say something, told me that the man did it exactly the same, same jokes, etc., when he took the course five years ago. For real: nothing changed, no new technology, no new insights, no learning over a period of five years, nothing? And he is rated very highly: head of a new wing. Who rated him? How? And why did it not affect his teaching at all?

  • Mat Jizat Abdol

    Chief Executive at Institut Sains @ Teknologi Darul Takzim ( INSTEDT)

    If we are looking for quality, we have to get information about our performance in the lecture room. There are six elements normally in practice: 1. a teaching plan of lecture contents; 2. teaching delivery; 3. fair and systematic evaluation of students' work; 4. whether the teaching follows the semester plan; 5. whether the lecturer follows the timetable and is always on time for lecture hours; and lastly, 6. the relationship between lecturer and students.

  • orlando mcallister

    Department Head – Communications/Mathematics

    Do we need to be reminded that educators were students at one time or another? So why not have students evaluate the performance of a teacher? After all, the students are contributing to their own investment in what is significant for survival, and in whether it is effective towards career development to attain their full potential as sentient human beings towards the greater good of humanity; anything else falls short of human progress on a tiny rotating planet cycling through the solar system with destination unknown! Welcome to the “Twilight Zone”.

    Would you rather educate a student to make a wise decision to accept 10 gallons of water in a desert? Or accept a $1 million check that further creates mirages and illusory dreams of success?

  • Stephen Robertson

    Lecturer at Edinburgh Napier University

    I think what my students say about me is important. I’m most interested in the comments they make and have used these to pilot other ideas or adjust my approach.

    I’ve had to learn to not beat myself up about a few bad comments or get carried away with a few good ones.

    I also use the assessment results to see if the adjustments made have had the intended impact. I use the VLE logs as well to see how engaged the students are with the materials and what tools they use and when.

    I find the balance keeps me grounded. I want my students to do well and have fun. The dashboard on your car has multiple measures. Why should teaching be different? Like the car I listen for strange noises and look out the window to make sure I’m still on the road.

  • Allan Sheppard

    Lecturer/tutor/PhD student at Griffith University

    I think that most student evaluations are only reaction measures, not true evaluations of learning outcomes or teaching effectiveness, and evaluations are often tainted if the student gets a lower mark than anticipated.
    I think these types of evaluation are only indicative; they should not really be used to measure teacher/teaching effectiveness, and should not be allowed to affect teachers' careers.
    I note Stephen's point about multiple measures; unfortunately, most evaluations are quick and dirty, and certainly do not provide multiple measures.


  • Allan Sheppard

    Lecturer/tutor/PhD student at Griffith University

    Interestingly, most student evaluations are anonymous – so students can say what they like and never have to face scrutiny.


  • Olga Kuznetsova

    No, students’ evaluations cannot fully measure teaching effectiveness.
    However, for the relationship to be mutually beneficial, you have to accept their judgement on the matter. Unfortunately, a unique teacher suited to all categories (types) of students does not exist in our dynamic world.


  • Penny Paliadelis

    Professor, Executive Dean, Faculty of Health, Federation University Australia

    Student evaluations are merely popularity contests; they tempt academics to ‘dumb down’ the content in order to be liked and evaluated positively. This is a dangerous and slippery slope that can result in graduates being ill-prepared for the professions and industries they seek to enter.


  • Robson Chiambiro (MBA, MSc, MEd.)

    PRINCE 2 Registered Practitioner at Higher Colleges of Technology

    In my opinion, student-teacher evaluations measure popularity, as others have suggested, but the problem is that some of the questions and intentions of the assessment go unfulfilled because the wrong questions are asked. I have never seen an instrument ask students about their expectations of the teacher and the course as such. To me that is more important than asking whether the student likes the teaching style, which students do not really know anyway. Teachers who give a test shortly before the evaluation are likely to get lower ratings than those who give tests soon after it.

  • Chris Garbett

    Principal Lecturer Leeds Metropolitan University

    I agree with other contributors. The evaluations are akin to a satisfaction survey. Personally, if, for example, I stay at a hotel, I only fill in the satisfaction survey if something is wrong. If the service is as I expect, I don’t bother with the survey.

    I feel also that students rate courses or modules on a popularity basis. A module may be enjoyable, or fun, but not necessarily better taught than another covering a less entertaining subject.

    Unfortunately, everyone seems to think that the student evaluations are the main criteria by which to judge a course.


  • Steve Benton

    Senior Research Officer, The IDEA Center

    First of all, it would help if we stop referring to them as “student” or “course” evaluations. Students are not qualified to evaluate. That is what administrators are paid to do. However, students are qualified to provide feedback to instructors and administrators about their perceptions of what occurred in the class and of how much they believe they learned. How can that not be valuable information, especially for developmental purposes about how to teach more effectively? Evaluation is not an event that happens at the end of a course–it is an ongoing process that requires multiple indicators of effectiveness (e.g., student ratings of the course, peer evaluations, administrator evaluations, course design, student products). By triangulating that combination of evidence, administrators and faculty can then make informed judgments and evaluate.


  • Eytan Fichman

    Lecturer at Hanoi Architectural University

    The student / teacher relationship around the subject matter is a ‘triangle.’ The character of the triangle has a lot to do with a student’s reception of the material and of the teacher.

    The Student:
    The well-prepared student and the intrinsically motivated student can more readily thrive in the relationship. If s/he is thriving s/he may be more inclined to rate the teacher highly. The poorly prepared student or the student who requires motivation from ‘outside’ is much less likely to thrive and more likely to rate a teacher poorly.

    The Teacher:
    The well-prepared teacher and the intrinsically motivated teacher can more readily thrive in the relationship. If s/he is thriving students may be more inclined to rate the teacher highly. The poorly prepared teacher or the teacher who requires motivation from ‘outside’ is much less likely to thrive and more likely to achieve poor teacher ratings.

    The Subject Matter:
    The content and form of the subject matter are crucial, especially in their relation to the student and teacher.

  • Daniel Goeckner

    Education Professional

    Student evaluations do not measure teaching effectiveness. I have been told I walk on water and am the worst teacher ever. The major difference was the level of student participation. The more they participated the better I was.

    What I use them for is a learning tool. I take the comments apart looking for snippets that I can use to improve my teaching.

    I have been involved in a portfolio program for the past two years. One consistent finding is that the better the measured outcomes, the worse the student reviews.

    • Dr. Pedro L. Martinez

      Former Provost and Vice Chancellor for Academic Affairs at Winston Salem State University & President of HigherEd SC.

      Steve,
      Have you ever been part of a tenure or promotion committee evaluation process? In my 35 years of experience, faculty members do not operate in that ideal, smooth, linear trajectory you have described. On the contrary, committees partition evaluations into categories and look at student course evaluations as the evidence of an instructor’s ability to teach. However, faculty can choose which evaluations they submit and what comments they want to include as part of the record. I have never seen “negative comments” treated as evidence of “ineffective teaching”. A five-point scale is used, and whenever a score falls below 3.50 it becomes a great concern for our colleagues!

    • Sethuraman Jambunatha

      Dean (I & E) at Vinayaka Mission

      There are many other ways of assessing faculty through the peer group. There can be a weekly seminar at which faculty members are expected to present, with other faculty members and students as the audience. This measures how much interest a faculty member has in some chosen areas. The Chair (HoD) can talk to selected students (chosen to represent the highly motivated, the average, and those who take it easy) and reach a decision for the tenure track. As I said earlier, students’ evaluation can be one of many aspects. In my own experience, evaluation by other (senior) faculty is many times detrimental to the progress of junior faculty. One may ask whether the HoD, as the senior-most, is any different, but one thing is clear: the occupant of the ‘Chair’ has some ‘vision’ and transcends discrimination and partisan feelings. In India we say: “(Sar)Panch me Parameshwar rahtha hai”, meaning: in the position of Judge, God dwells. Think of Becket and King Henry II. As archbishop, Thomas Becket was a completely changed person, fully submerged in the divine order. So the Chair is supreme. Students’ evaluation is just one of many aspects.

    • Susan Wright

      Assistant Professor at Clarkson University

      Amazing how things work…I’m actually in the process of framing out a research project related to this very question. Does anyone have any suggestions for specific papers I should look at i.e. literature related to the topic?

      With respect to your question, I believe the answer depends on the questions that get asked.

    • Sarah Lowengard

      Researcher, Writer, Editor, Consultant (history, technology, art, sciences)

      I fall on the “no” side too.

      The school-derived questionnaires nearly always ask the wrong questions, for one.

      I’ve always thought students should wait some years (3-20) before providing feedback, because the final day of class is too recent to do a good assessment.


    • Jeremy Wickins

      Open University Coursework Consultant, Research Methods

      I’m quite late to the topic here, and much of what I think has been said by others. There is a difference between the qualitative and quantitative aspects of student evaluations – I am always fascinated to find out what my students (and peers, of course, though that is a different topic) do/do not think I am doing well so I can learn and adapt my teaching. For this reason, I prefer a more continuous student evaluation than the questionnaire at the end of the course – if I need to adapt to a particular group, I need the information sooner rather than later.

      However, the quantitative side means nothing unless it is tied back to hard data on how the students did in their assessments – an unpopular teacher can still be a *good* teacher of the subject at hand! And the subject matter counts a lot – merely teaching an unpopular but compulsory subject (public law, for instance!) tends to make the teacher initially unpopular in the minds of students – a type of shooting the messenger.

      Teaching isn’t a beauty contest – these metrics need to be used in the right way, and combined with other data if they are to say anything about the teaching.

    • Dr. James R. Martin

      Professor Emeritus

      I wrote a paper about this issue a few years ago. Briefly, the thrust of my argument is that student opinions should not be used as the basis for evaluating teaching effectiveness because these aggregated opinions are invalid measures of quality teaching, provide no empirical evidence in this regard, are incomparable across different courses and different faculty members, promote faculty gaming and competition, tend to distract all participants and observers from the learning mission of the university, and ensure the sub-optimization and further decline of the higher education system. Using student opinions to evaluate, compare, and subsequently rank faculty members represents a severe form of a problem Deming referred to as a deadly disease of Western-style management. The theme of the alternative approach is that learning on a program-wide basis should be the primary consideration in the evaluation of teaching effectiveness. Emphasis should shift from student opinion surveys to the development and assessment of program-wide learning outcomes. To achieve this shift in emphasis, the university performance measurement system needs to be redesigned to motivate faculty members to become part of an integrated learning development and assessment team, rather than a group of independent contractors competing for individual rewards.

      Martin, J. R. 1998. Evaluating faculty based on student opinions: Problems, implications and recommendations from Deming’s theory of management perspective. Issues in Accounting Education (November): 1079-1094. http://maaw.info/ArticleSummaries/ArtSumMartinSet98.htm


    • Joseph Lennox, Ph.D.

      There appears to be general agreement that the answer to the proposed question is “No.”

      The next logical step in the discussion would appear to be, “How would you effectively measure teacher effectiveness?”

      With large enrollment classes, one avenue is here:

      http://www.insidehighered.com/views/2013/10/11/way-produce-more-information-about-instructors-effectiveness-essay

      So, how should teacher effectiveness be measured?


    • Ron Melchers

      Professor of Criminology, University of Ottawa


      To inform this discussion, I would highly recommend this research review done for the Higher Education Quality Council of Ontario. It’s a pretty balanced and well-informed treatment of student course (and teacher) evaluations: http://www.heqco.ca/SiteCollectionDocuments/Student%20Course%20Evaluations_Research,%20Models%20and%20Trends.pdf


    • Ron Melchers

      Professor of Criminology, University of Ottawa


      Just to add my own two cents (two and a half Canadian cents at this point), I think students have much of value to tell us about their experience in our courses and classes, information that we can use to improve their learning and become more effective teachers. They are also able to inform academic administrators of the degree to which teachers fulfill their basic duties and perform the elementary tasks they are assigned. They have far less to tell us about the value of what they’re learning to their future, their professions … and they are perhaps not the best qualified to identify effective learning and teaching techniques and methods. Those sorts of things are better assessed by knowledgeable, expert professional and academic peers.


    • Barbara Celia

      Assistant Clinical Professor at Drexel University

      Thank you, Ron. A great deal of info but worth reading and analyzing.

    • Prof. Ravindra Kumar Raghuvanshi

      Member of academic committees of some universities & Retd. Prof., Dept. of Botany, University of Rajasthan, Jaipur.

      A student rating system may not necessarily be a reliable method of assessing teaching effectiveness, because it depends upon individual grasping/understanding power, intelligence, and study habits. A teacher may do his/her job well, but how many students understand it well? That is reflected invariably in the marks they obtain.
