Searching for "electronic portfolio"

technology requirements for librarians job samples

also academic technology

Data Visualization Designer and Consultant for the Arts
The University Libraries of Virginia Tech seeks a specialist to join a team offering critical and sophisticated new technology development services that enhance the scholarly and creative expression of faculty and graduate students. This new position will bring relevant computational techniques to enhance the fields of Art and Design at Virginia Tech, and will serve as a visual design consultant to project teams using data visualization methodologies.

The ideal candidates will have demonstrated web development and programming skills, knowledge of digital research methods and tools in Art and Design, and experience managing and interpreting common types of digital data and assets studied in those fields.

The Data Visualization Designer & Digital Consultant for the Arts will not only help researchers in Art and Design fields develop, manage, and sustain digital creative works and digital forms of scholarly expression, but also help researchers across Virginia Tech design effective visual representations of their research. Successful candidates will work collaboratively with other Virginia Tech units, such as the School of Visual Arts; the School of Performing Arts; the Moss Center for the Arts; the Institute for Creativity, Arts, and Technology; and the arts community development initiative VTArtWorks (made possible by the Institute of Museum and Library Services [SP-02-15-0034-15]).


– Investigates and applies existing and emerging technologies that help strengthen the Libraries’ mission to enhance and curate visual representations of data at Virginia Tech.

– Develops and modifies technologies and designs processes that facilitate data visualization/exploration, data and information access, data discovery, data mining, data publishing, data management, and preservation

– Serves as consultant to researchers on data visualization, visual design principles, and related computational tools and methods in the digital arts

– Keeps up with trends in digital research issues, methods, and tools in related disciplines

– Identifies data, digital scholarship, and digital library development referral opportunities; makes connections with research teams across campus

– Participates in teams and working groups and in various data-related projects and initiatives as a result of developments and changes in library services

The James E. Walker Library at Middle Tennessee State University (MTSU) seeks a systems librarian to contribute to the mission of the library through administration and optimization of the library’s various management systems.

This is a 12-month, tenure-track position (#401070) at the rank of assistant/associate professor. Start date for the position is July 1, 2018. All library faculty are expected to meet promotion and tenure standards.



Wake Forest University

Digital Curation Librarian

This position reports to the team director. The successful candidate will collaborate with campus faculty and library colleagues to ensure long-term preservation and accessibility of digital assets, projects, and datasets collected and created by the library, and to support metadata strategies associated with digital scholarship and special collections. The person in this position will engage in national and/or international initiatives and ensure that best practices are followed for curation of digital materials.


Coordinate management of digital repositories, working across teams, including Digital Initiatives & Scholarly Communication, Special Collections & Archives, Technology, and Resource Services, to ensure the sustainability of projects and content
Create and maintain policies and procedures guiding digital preservation practices, including establishing authenticity and integrity workflows for born digital and digitized content
In collaboration with the Digital Collections Librarian, create guidelines and procedures for metadata creation, transformation, remediation, and enhancement
Perform metadata audits of existing digital assets to ensure compliance with standards
Maintain awareness of trends in metadata and resource discovery
Participate in team and library-wide activities; serve on Library, Librarians’ Assembly, and University committees; represent the library in relevant regional, state, and national organizations
Participate in local, regional, or national professional organizations; enrich professional expertise by attending conferences and professional development opportunities, delivering presentations at professional meetings, publishing in professional publications, and serving on professional committees
Perform other duties as assigned
Required Qualifications:

Master’s degree in Library Science from an ALA-accredited program or a master’s degree in a related field
Knowledge of best practices and current digital library standards for the curation of born-digital and digitized content
Knowledge of current trends in data stewardship and data management plans
Experience with preservation workflows for born digital and digitized content
Experience with metadata standards and protocols (such as Dublin Core, Open Archives Initiative-Protocol for Metadata Harvesting (OAI-PMH), METS, MODS, PREMIS)
Demonstrated ability to manage multiple projects, effectively identify and leverage resources, as well as meet deadlines and budgets
Aptitude for complex, analytical work with an attention to detail
Ability to work independently and as part of a team
Excellent communication skills
Strong service orientation
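For context, the metadata standards named in the qualifications above (Dublin Core, OAI-PMH, METS, MODS, PREMIS) are XML record formats. A minimal sketch of an oai_dc Dublin Core record, built in Python with hypothetical placeholder values (the item, title, and dates are invented for illustration), might look like this:

```python
# Minimal sketch of an oai_dc Dublin Core record, the kind of
# metadata a digital curation librarian creates or audits.
# All field values below are hypothetical placeholders.
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
OAI_DC = "http://www.openarchives.org/OAI/2.0/oai_dc/"
ET.register_namespace("dc", DC)
ET.register_namespace("oai_dc", OAI_DC)

record = ET.Element(f"{{{OAI_DC}}}dc")
for element, value in [
    ("title", "Sample Digitized Photograph"),  # hypothetical asset
    ("creator", "Unknown Photographer"),
    ("date", "1948"),
    ("type", "StillImage"),
    ("format", "image/tiff"),
]:
    child = ET.SubElement(record, f"{{{DC}}}{element}")
    child.text = value

xml_text = ET.tostring(record, encoding="unicode")
print(xml_text)
```

A record like this is what an OAI-PMH harvester would retrieve from a repository, and what a metadata audit (as described above) would check for completeness and standards compliance.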
Desired Qualifications:

One to three years of experience with digital preservation or metadata creation in an academic library setting
Experience with developing, using, and preserving research data collections
Familiarity with GIS and data visualization tools
Demonstrated skills with scripting languages and/or tools for data manipulation (e.g. OpenRefine, Python, XSLT, etc.)
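The "tools for data manipulation" named above (OpenRefine, Python, XSLT) are typically used for metadata remediation: normalizing inconsistent values before records are loaded into a repository. A hedged sketch of that kind of cleanup in plain Python, with an invented three-row dataset and hypothetical field names, could look like this:

```python
# Sketch of a typical metadata remediation pass: normalizing
# inconsistent date and subject values, the kind of clustering
# and cleanup OpenRefine is often used for. The rows and field
# names are hypothetical.
import csv
import io
import re

raw = """identifier,date,subject
rec001,1998,Maps
rec002,ca. 1998,maps
rec003,1998-00-00, Maps
"""

def normalize_date(value):
    # Keep only a four-digit year when one is present.
    match = re.search(r"\b(\d{4})\b", value)
    return match.group(1) if match else value.strip()

def normalize_subject(value):
    # Trim whitespace and use title case so "maps" and " Maps" cluster.
    return value.strip().title()

cleaned = []
for row in csv.DictReader(io.StringIO(raw)):
    row["date"] = normalize_date(row["date"])
    row["subject"] = normalize_subject(row["subject"])
    cleaned.append(row)

print(cleaned)
```

After this pass all three rows share the date "1998" and the subject "Maps", which is the point of such remediation: variant spellings collapse into one controlled value.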


Mimi O’Malley is the learning technology translation strategist at Spalding University


JSON and Structured Data
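As a brief illustration of the heading: JSON is the lingua franca for exchanging structured data among the web applications and repository tools these postings describe. A minimal round-trip in Python, using a hypothetical item record, shows the idea:

```python
# Minimal JSON round-trip: a (hypothetical) item record
# serialized to a JSON string and parsed back into a dict.
import json

item = {
    "identifier": "rec001",  # hypothetical repository ID
    "title": "Sample Digitized Photograph",
    "subjects": ["Maps", "Photographs"],
}

text = json.dumps(item, indent=2)  # serialize to a JSON string
roundtrip = json.loads(text)       # parse it back to a dict

print(roundtrip == item)  # → True
```

The structure survives the trip intact, which is why JSON is the usual carrier for metadata moving between APIs, discovery layers, and scripts.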




The redefinition of humanities scholarship has received major attention in higher education over the past few years. The advent of digital humanities has challenged many aspects of academic librarianship. With the acknowledgement that librarians must be a necessary part of this scholarly conversation, the challenges facing subject/liaison librarians, technical services librarians, and library administrators are many. Just some of the issues facing librarians as they reinvent themselves in the digital humanities sphere are: developing a knowledge base of digital tools; establishing best procedures and practices; understanding humanities scholarship; managing data through the research lifecycle; teaching literacies (information, data, visual) beyond the one-shot class; renegotiating the traditional librarian/faculty relationship as ‘service oriented’; and the willingness of library and institutional administrators to allocate scarce resources to digital humanities projects while balancing the mission and priorities of their institutions.


College & Undergraduate Libraries, a peer-reviewed journal published by Taylor & Francis, invites proposals for articles to be published in the fall of 2017. The issue will be co-edited by Kevin Gunn of the Catholic University of America and Jason Paul of St. Olaf College.

The issue will deal with the digital humanities in a very broad sense, with a major focus on their implications for the roles of academic librarians and libraries as well as on librarianship in general. Possible article topics include, but are not limited to, the following themes, issues, challenges, and criticism:

  • Developing the project development mindset in librarians
  • Creating new positions and/or cross-training issues for librarians
  • Librarian as point-of-service agent, ongoing consultant, or embedded project librarian
  • Developing managerial and technological competencies in librarians
  • Administration support (or not) for DH endeavors in libraries
  • Teaching DH with faculty to students (undergraduate and graduate)
  • Helping students work with data
  • Managing the DH products of the data life cycle
  • Issues surrounding humanities data collection development and management
  • Relationships of data curation and digital libraries in DH
  • Issues in curation, preservation, sustainability, and access of DH data, projects, and products
  • Linked data, open access, and libraries
  • Librarian and staff development for non-traditional roles
  • Teaching DH in academic libraries
  • Project collaboration efforts with undergraduates, graduate students, and faculty
  • Data literacy for librarians
  • The lack of diversity of librarians and how it impacts DH development
  • Advocating and supporting DH across the institution
  • Developing institutional repositories for DH
  • Creating DH scholarship from the birth of digital objects
  • Consortial collaborations on DH projects
  • Establishing best practices for DH labs, networks, and services
  • Assessing, evaluating, and peer reviewing DH projects and librarians.

Articles may be theoretical or ideological discussions, case studies, best practices, research studies, and opinion pieces or position papers.

Proposals should consist of an abstract of up to 500 words and up to six keywords describing the article, together with complete author contact information. Articles should be approximately 20 double-spaced pages in length. Please consult the following link for instructions for authors:

Please submit proposals to Kevin Gunn by August 17, 2016; please do not use Scholar One for submitting proposals. First drafts of accepted proposals will be due by February 1, 2017, with the issue being published in the fall of 2017. Feel free to contact the editors with any questions that you may have.

Kevin Gunn, Catholic University of America

Jason Paul, St. Olaf College


The Transformational Initiative for Graduate Education and Research (TIGER) at the General Library of the University of Puerto Rico-Mayaguez (UPRM) seeks an enthusiastic and creative Research Services Librarian to join our recently created Graduate Research and Innovation Center (GRIC).

The Research Services Librarian works to advance the goals and objectives of the Center; leads the creation and successful organization of instructional activities; collaborates to envision and implement scholarly communication services; and assists faculty, postdoctoral researchers, and graduate students in managing the lifecycle of data resulting from all types of projects. This initiative is funded by a five-year grant awarded by the Promoting Postbaccalaureate Opportunities for Hispanic Americans Program (PPOHA), Title V, Part B, of the U.S. Department of Education.

The Research Services Librarian will build relationships and collaborate with GRIC personnel and library liaisons as well as with project students and staff. This is a Librarian I position that will be renewed annually (based upon performance evaluation) for the duration of the project, with a progressive institutionalization commitment starting on October 1, 2016.

The Mayaguez Campus of the University of Puerto Rico is located in the western part of the island. Our library provides a broad array of services, collections and resources for a community of approximately 12,100 students and supports more than 95 academic programs. An overview of the library and the university can be obtained through


  • Master’s degree in library or information science (MLS, MIS, MLIS) from an ALA (American Library Association)-accredited program
  • Fully bilingual in English and Spanish
  • Excellent interpersonal and communication skills and ability to work well with a diverse academic community
  • Experience working in reference and instruction in an academic/research library and strong assessment and user-centered service orientation
  • Demonstrated experience working across organizational boundaries and managing complex stakeholder groups to move projects forward
  • Experience with training, scheduling, and supervising in various settings
  • Ability to work creatively, collaboratively, and effectively on teams and on independent assignments
  • Experience with website creation and design in a CMS environment and with accessibility and compliance issues
  • Strong organizational skills and ability to manage multiple priorities


  • Experience creating and maintaining web-based subject guides and tutorials
  • Demonstrated ability to deliver in-person and online reference services
  • Experience helping researchers with data management planning and understanding of trends and issues related to the research lifecycle, including creation, analysis, preservation, access, and reuse of research data
  • A high degree of facility with technologies and systems germane to the 21st-century library, and familiarity with the issues surrounding scholarly communications and compliance (e.g., author identifiers, data sharing software, repositories, among others)
  • Awareness of emerging trends, best practices, and applicable technologies in academic librarianship
  • Demonstrated experience with one or more metadata and scripting languages (e.g., Dublin Core, XSLT, Java, JavaScript, Python, or PHP)
  • Academic or professional experience in the sciences or other fields utilizing quantitative methodologies
  • Experience conducting data-driven analysis of user needs or user testing
  • Second master’s degree, doctorate, or formal courses leading to a doctoral degree from an accredited university


  1. Manages daily operations, coordinates activities and services related to the GRIC, and contributes to the continuing implementation of TIGER goals and objectives.
  2. Works closely with liaison and teaching librarians to apply emerging technologies in the design, delivery, and maintenance of high-quality subject guides, digital collections, learning objects, online tutorials, workshops, seminars, and mobile and social media interfaces and applications.
  3. Provides support to faculty and graduate students through the integration of digital collections, resources, technologies, and analytical tools with traditional resources and by offering user-centered consultation and specialized services.
  4. Participates in the implementation, promotion, and assessment of the institutional repository and e-science initiatives related to data storage, retrieval practices, processes, and data literacy/management.
  5. Advises and educates the campus community about author’s rights, Creative Commons licenses, copyrighted materials, open access, publishing trends, and other scholarly communication issues.
  6. Develops new services as new needs arise, following trends in scholarly communication, e-humanities, and e-science.
  7. Provides and develops awareness and knowledge related to digital scholarship and the research lifecycle for librarians and staff.
  8. Actively disseminates project outcomes and participates in networking and professional development activities to keep current with emerging practices, technologies, and trends.
  9. Actively promotes TIGER- and GRIC-related activities through social networks and other platforms as needed.
  10. Periodically collects, analyzes, and incorporates relevant statistical data into progress reports as needed (e.g., from Facebook, Twitter, and Springshare).
  11. Actively collaborates with the TIGER Project Assessment Coordinator and the Springshare Administrator to create reports and tools to collect data on user needs.
  12. Coordinates the transmission of online workshops through Google Hangouts Air with the Agricultural Experiment Station Library staff.
  13. Collaborates in the creation of grant and external funding proposals.
  14. Maintains availability and flexibility to work some weeknights and weekends.

SALARY: $45,720.00 per year (12-month appointment).

BENEFITS: University health insurance, 30 days of annual leave, 18 days of sick leave.


Technology Integration and Web Services Librarian

The Ferris Library for Information, Technology and Education (FLITE) at Ferris State University (Big Rapids, Michigan) invites applications for a collaborative and service-oriented Technology Integration and Web Services Librarian. The Technology Integration and Web Services Librarian ensures that library systems and web services support and enhance student learning. Primary responsibilities include management and design of the library website’s architecture, oversight of the technical and administrative aspects of the library management system and other library enterprise applications, and the seamless integration of all library web-based services. Collaborates with other library faculty and staff to provide reliable electronic access to online resources and to improve the accessibility, usability, responsiveness, and overall user experience of the library’s website. Serves as a liaison to other campus units including Information Technology Services. The Technology Integration and Web Services Librarian is a 12-month, tenure-track faculty position based in the Collections & Access Services team and reports to the Assistant Dean for Collections & Access Services.

Required Qualifications: ALA-accredited master’s degree in library or information science by the time of hire. Minimum of two years’ recent experience in administration and configuration of a major enterprise system, such as a library management system. Minimum of two years’ recent experience in designing and managing a large-scale website using HTML5, JavaScript, and CSS. Demonstrated commitment to the principles of accessibility, universal design, and user-centered design methodologies. Recent experience with object-oriented programming and scripting languages used to support a website. Experience working in a Unix/Linux environment. Experience with SQL and maintaining MySQL, PostgreSQL, and/or Oracle databases. Knowledge of website analytics and experience with making data-driven decisions.

For a complete posting or to apply, access the electronic applicant system by logging on to


DIRECTOR OF DIGITAL PROJECTS, MIT Libraries, to direct the development, maintenance, and scaling of software applications and tools designed to dramatically increase access to research collections, improve service capabilities, and expand the library platform.  Will be responsible for leading efforts on a variety of collaborative digital library projects aimed at increasing global access to MIT’s collections and facilitating innovative human and machine uses of a full range of research and teaching objects and metadata; and lead a software development program and develop partnerships with external academic and commercial collaborators to develop tools and platforms with a local and global impact on research, scholarly communications, education, and the preservation of information and ideas.

The MIT Libraries seek to be leaders in the collaborative development of a truly open global network of library repositories and platforms. By employing a dynamic, project-based staffing model and drawing on staff resources from across the Libraries to deliver successful outcomes, they are poised to make immediate progress.

A full description is available at

REQUIRED:  four-year college degree; at least seven years’ professional experience and increasing responsibility with library systems and digital library strategy and development; evidence of broad, in-depth technology and systems knowledge; experience with integrated library systems/library services platforms, discovery technologies, digital repositories, and/or digital preservation services and technologies and demonstrated understanding of the trends and ongoing development of such systems and of emerging technologies in these areas; and experience directly leading and managing projects (i.e., developing proposals; establishing timelines, budgets, and staffing plans; leading day-to-day project work; and delivering on commitments).  Job #13458-S


THE UNIVERSITY OF ALABAMA LIBRARIES  Digital Projects Librarian Position Description

General Summary of Responsibilities

The University of Alabama Libraries seeks an innovative, dynamic, and service-oriented professional for the position of Digital Projects Librarian. Reporting to the Head of Web Services, this position is primarily responsible for development, implementation, and project management of technology projects in a collaborative environment, as well as supporting the development and management of the UA Libraries’ various web interfaces. This position will also act as primary administrator for LibApps and similar cloud-based library application suites.

Primary Duties and Responsibilities

Reporting to the Head of Web Services, the Digital Projects Librarian will manage and extend the University Libraries’ services by planning and implementing a variety of projects for internal and external audiences. The position will also integrate, manage, and extend various software platforms and web-based tools using LAMP technology skills and web programming languages such as PHP, CSS, and JavaScript. The librarian will support tools such as the University Libraries’ website and intranet, will work with an institutional repository instance and digital archives website, and will work with the LibApps suite of library tools, and will modify, implement, and create widgets and small applications for learning tools and other interfaces and APIs. The librarian will interact with a wide range of individuals with differing technological abilities and will be expected to collaborate successfully across departments, will maintain knowledge of current best practices in security for web tools and of library privacy concerns, and will work to identify promising new technologies that can improve services and generate a better user experience. The librarian will also be expected to participate in usability and user experience studies.

Department Information

The Web Services Unit is part of the University Libraries Office of Library Technology and is responsible for the web applications, websites, content, and services that comprise the University Libraries web presence. Among its duties, Web Services manages the University Libraries discovery service application, multiple instances of the WordPress CMS, WordPress blogs, the LibApps suite of library tools, and Omeka, as well as other tools, along with usability and accessibility efforts.



  • Administer the UA suite of LibApps tools (LibGuides, LibCal, LibAnswers, etc.); responsible for implementation of existing guidelines and for maintaining continuity of look, feel, and action;
  • Works as part of a team that is responsible for management and extension of the University Libraries’ various web-based applications and tools (such as WordPress as a CMS and other CMS frameworks, WordPress blogs, custom apps using an AngularJS framework and Bootstrap, Omeka, Drupal);
  • General, project-based web development and UX implementation within the framework of our web site, intranet and student portal;
  • Responsible for creating, modifying and implementing learning-tool solutions, such as Blackboard Learn widgets;
  • Evaluate the use and effectiveness of web applications and other technological services using analytics, usability studies, and other methods;
  • Work to identify and assist in implementing and evaluating promising emerging technologies and social media tools;
  • Provide technical expertise for the use of social media applications and tools;
  • Other duties as assigned.

Required qualifications

  • Master’s degree in Library & Information Sciences from an ALA-accredited program or advanced degree in Instructional Technology or comparable field from an accredited institution;
  • Ability to successfully initiate, track, and manage projects;
  • Demonstrated experience working on digital library projects;
  • Experience administering CMS-type tools and an understanding of web programming work;
  • Familiarity with the Linux and/or Unix command-line;
  • Excellent interpersonal, communication, and customer service skills and the ability to interact effectively with faculty, students, and staff.

Preferred Qualifications

  • One year of experience working in an academic library on large digital projects – either implementation or programming/developing, or both.
  • Demonstrable experience creating course and/or subject guides via LibGuides or a comparable application;
  • Experience developing for libraries using current best practices in writing and implementing multiple scripting or programming languages;
  • Experience with automated development repository environments using Grunt, Bower, GitHub, etc.
  • Experience with an Open Source content management systems such as WordPress;
  • Demonstrated ability to work collaboratively in a large and complex environment;
  • Familiarity with project management and team productivity tools such as Asana, Trello, and Slack;
  • Knowledge of XML and library metadata standards;
  • Knowledge of scripting languages such as XSLT, JavaScript, Python, Perl, and PHP;
  • Familiarity with responsive design methodologies and best practices;
  • Familiarity with agile-design practices;
  • Knowledge of graphic design and image editing software.


The University of Alabama, The Capstone University, is the State of Alabama’s flagship public university and the senior comprehensive doctoral level institution in Alabama. UA enrolls over 37,000 students, is ranked in the top 50 public universities in the United States, and its School of Library and Information Studies is ranked in the top 15 library schools in the country. UA has graduated 15 Rhodes Scholars, 15 Truman Scholars, has had 121 Fulbright Scholars, is one of the leading institutions for National Merit Scholars (150 in 2015), and has 5 Pulitzer Prize winners among its ranks. Under the new leadership of President Stuart Bell, UA has launched a strategic planning process that includes an aggressive research agenda and expansion of graduate education. UA is located in Tuscaloosa, a metropolitan area of 200,000, with a vibrant economy, a moderate climate, and a reputation across the South as an innovative, progressive community with an excellent quality of life. Tuscaloosa provides easy access to mountains, several large cities, and the beautiful Gulf Coast.

The University of Alabama is an equal opportunity employer and is strongly committed to the diversity of our faculty and staff. Applicants from a broad spectrum of people, including members of ethnic minorities and disabled persons, are especially encouraged to apply. The University Libraries homepage may be accessed at

Prior to employment the successful candidate must pass a pre-employment background investigation.

SALARY/BENEFITS: This will be a non-tenure-track, 12-month renewable appointment in up to three-year cycles at the Assistant Professor rank, based on performance, funding, and the needs of the University Libraries. Salary is commensurate with qualifications and experience. Excellent benefits, including professional development support and tuition fee waiver.


Digital Humanities Developer

Columbia University Libraries seeks a collegial, collaborative, and creative Digital Humanities Developer to join our Libraries IT staff. The Digital Humanities Developer will provide technology support for digital humanities-focused projects by evaluating, implementing and managing relevant platforms and applications; the Developer will also analyze, transform and/or convert existing humanities-related data sets for staff, engage in creative prototyping of innovative applications, and provide technology consulting and instructional support for Libraries staff.

This new position, based in the Libraries’ Digital Program Division, will work on a variety of projects, collaborating closely with the Digital Humanities Librarian, the Digital Scholarship Coordinator, other Libraries technology groups, librarians in the Humanities & History division, and project stakeholders. The position will contribute to building out flexible and sustainable technology platforms for the Libraries’ DH programs and will also explore new and innovative DH applications and tools.

Responsibilities include:
– Evaluate, implement and manage web and related software applications and platforms relevant to the digital humanities program
– Analyze, transform and/or convert existing humanities-related data sets for staff, students and faculty as needed
– Engage in creative prototyping and model innovative technology solutions in support of the goals of the Digital Humanities Center
– Provide technology consulting, guidance, and instruction to CUL staff as well as students and faculty as required
– Conduct independent exploration of technology issues and opportunities in the Digital Humanities domain

The successful candidate will have great collaboration and communication skills and a strong interest in developing expertise in the evolving field of digital humanities.

Columbia University is An Equal Opportunity/Affirmative Action employer and strongly encourages individuals of all backgrounds and cultures to consider this position.

-Bachelor’s degree in computer science or a related field, with experience in the humanities, a minimum of 3 years of related work experience, or an equivalent combination of education and experience

Significant experience with UNIX, relational databases (e.g., MySQL, PostgreSQL), and one or more relevant software / scripting languages (e.g., JavaScript, PHP, Python, Ruby/Rails, Perl); experience with modern web standards (HTML5 / CSS / JavaScript); ability to manage software development using revision control software such as SVN and Git/GitHub; strong interpersonal skills and demonstrated ability to work as part of collaborative teams; ability to communicate effectively with faculty, students, and staff, including both technical and non-technical collaborators; commitment to supporting and working in a diverse collegial environment

Advanced degree in computer science or a related field, or an advanced degree in the humanities or related field; experience in one or more of the following areas: natural language processing, text analysis, data-mining, machine learning, spatial information / mapping, data modeling, information visualization, integrating digital media into web applications; experience with XML/XSLT, GIS, SOLR, linked data technologies; experience with platforms used for digital exhibits or archives.


UMass Dartmouth, Assistant/Associate Librarian – Online Services and Digital Applications Librarian, Dartmouth, MA


  • Experience in the design, development, and management of web interfaces, including demonstrated proficiency with HTML, CSS, and web authoring tools.
  • Working knowledge of relevant coding languages such as JavaScript and PHP
  • Ability and willingness to develop workflows and standards related to all aspects of the library’s web presence and services, including related applications.
  • Strong problem solving skills
  • Excellent organizational skills, including the capability for managing a variety of tasks and multiple priorities
  • Demonstrated initiative and proven ability to learn new technologies and adapt to changes in the profession.
  • Understanding of library services and technologies in an academic environment.
  • Strong service orientation and awareness of end user needs as related to library online services and technologies
  • Possesses an understanding of, and a commitment to, usability testing and ongoing assessment of web interfaces
  • Demonstrated ability to thrive in a team environment, working both independently and collaboratively as appropriate.
  • Ability to learn new technical skills quickly and adapt emerging technologies to new domains.
  • Proven ability and willingness to share expertise with colleagues and to articulate technology strategy to non-technical staff and patrons.
  • Must be available to respond to situations and systems maintenance work that will occur during weekends or evenings.
  • Excellent oral, written, and interpersonal communication, including the ability to develop written project documentation, process procedures, reports, etc.


  • Knowledge of Responsive Web Design and W3C Web Usability Guidelines.
  • Experience supporting an Integrated Library System (ILS)/Library Management Platform and/or discovery system such as Ex Libris’s Primo.
  • Experience using web development languages such as PHP, JavaScript, XML, XSLT, and CSS3.
  • Experience with content management systems such as Drupal or WordPress
  • Familiarity with the technical applications and strategies used to enhance the discoverability of library and digital collections.
  • Experience with managing projects, meeting deadlines, and communicating to various stakeholders in an academic library environment.
  • Experience working in a Linux environment.
  • Experience supporting web applications utilizing the LAMP stack (Linux, Apache, MySQL, PHP).


Electronic Resources Librarian

Category: Academic Affairs College: Library Department: Belk Library


The University Libraries at Appalachian State University seeks a responsive and collaborative Electronic Resources Librarian. The Electronic Resources Librarian will ensure a seamless and transparent research environment for students and faculty by managing access to electronic resources. Working collaboratively across library teams, the Electronic Resources Librarian will identify and implement improvements in online content, systems and services. The successful candidate will have strong project management, problem solving, and workflow management skills. The Electronic Resources Librarian is a member of the Resource Acquisition and Management Team.


  • ALA-accredited master’s degree.
  • Excellent communication, presentation, and interpersonal skills.
  • Demonstrated e-resources project and workflow management skills.


  • Experience with integrated library systems (Sierra preferred).
  • Experience with setup and maintenance of knowledge base, OpenURL, and discovery systems (EDS preferred).
  • Experience with proxy setup and maintenance (Innovative’s WAM and/or EZproxy preferred).
  • Knowledge of security standards and protocols such as LDAP, Single Sign-On, and Shibboleth, and data transfer standards and protocols such as IP, FTP, COUNTER, and SUSHI.
  • Advanced skills with office productivity software including MS Office, and Google Apps for Education.
  • Evidence of establishing and maintaining excellent vendor relationships.
  • Demonstrated ability to work collaboratively across library teams.
  • Demonstrated skill in technical trouble-shooting and problem-solving.
  • Demonstrated supervisory skills.
  • Second advanced degree.
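Several of the standards above (OpenURL, knowledge bases, proxying) revolve around link resolution. As an illustration only, here is a minimal Python sketch of building an OpenURL 1.0 (Z39.88-2004) KEV query string for a journal article; the resolver URL and metadata values are hypothetical.

```python
from urllib.parse import urlencode

def build_openurl(resolver_base, **rft):
    """Build an OpenURL 1.0 (Z39.88-2004) KEV query string for a journal article.

    resolver_base is a hypothetical link-resolver endpoint; the rft.* keys
    follow the standard's journal metadata format.
    """
    params = {
        "ctx_ver": "Z39.88-2004",
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
    }
    # Prefix each metadata element with "rft." as the KEV format requires.
    params.update({f"rft.{key}": value for key, value in rft.items()})
    return resolver_base + "?" + urlencode(params)

url = build_openurl(
    "https://resolver.example.edu/openurl",  # hypothetical resolver
    jtitle="Journal of Library Metadata",
    volume="16",
    issue="2",
    spage="100",
)
print(url)
```

An e-resources librarian troubleshooting broken links spends much of the day reading strings like this one, so knowing which key carries which metadata element pays off quickly.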


From: Spencer Lamm
Sent: Thursday, October 13, 2016 12:13 PM
Subject: [lita-l] Jobs: Digital Repository Application Developer, Drexel University Libraries


Drexel University Libraries seeks a collaborative and creative professional to develop solutions for managing digital collections, research data, university records, and digital scholarship. Working primarily with our Islandora implementation, this position will play a key role as the Libraries advance preservation services and public access for a wide array of digital content including books, articles, images, journals, newspapers, audio, video, and datasets.

As a member of the Data & Digital Stewardship division, the digital repository application developer will work in a collaborative, team-based environment alongside other developers, as well as archives, metadata, and data services staff. The position’s primary responsibility will be working in a Linux environment with the Islandora digital repository stack, which includes the Fedora Commons digital asset management layer, Apache Solr, and Drupal. To support the ingestion and exposure of new collections and digital object types the position will extend the repository using tools such as: RDF, SPARQL, and triplestores; the SWORD protocol; and XSLT.

Reporting to the manager, discovery systems, the developer will collaborate with collection managers and stakeholders across campus. In addition, the successful candidate will play an active role in the Islandora and Fedora open source communities, contributing code, participating in working groups and engaging in other activities in support of current and future implementers of these technologies.

Job URL:

Key Responsibilities

  • Enhance, extend, and maintain the Libraries’ Islandora-based digital repository
  • Script metadata transformations and digital object processing using BASH, Python, and XSLT
  • Develop workflows and integrate systems in collaboration with the Libraries’ data infrastructure developer to support the ingestion of university records and research output, including datasets and publications
  • Work with campus collection managers and technology staff to plan and coordinate content migrations
  • Collaborate with team members on the exposure of library and repository data for indexing by search tools and reuse by other applications
  • Ensure adherence of systems to technical, quality assurance, data integrity, and security standards for managing data
  • Document solutions and workflows for internal purposes and also as part of compliance with University legal and privacy requirements
  • As part of the discovery systems team, provide support for library applications and systems
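The metadata-scripting responsibility above is typically a crosswalk between schemas. As a rough illustration (not Drexel's actual workflow), this Python sketch maps a few elements of a hypothetical MODS record to Dublin Core fields; real crosswalks are usually written in XSLT and cover many more elements.

```python
import xml.etree.ElementTree as ET

MODS = "{http://www.loc.gov/mods/v3}"

# A minimal, hypothetical MODS record of the kind batch-ingested into Islandora.
mods_xml = """
<mods xmlns="http://www.loc.gov/mods/v3">
  <titleInfo><title>Campus Newspaper, 1968-03-14</title></titleInfo>
  <name><namePart>Drexel University</namePart></name>
  <originInfo><dateIssued>1968-03-14</dateIssued></originInfo>
</mods>
"""

def mods_to_dc(record):
    """Map a few MODS elements to simple Dublin Core fields."""
    root = ET.fromstring(record)

    def text(path):
        # Return element text, or None when the element is absent.
        el = root.find(path)
        return el.text if el is not None else None

    return {
        "dc:title": text(f"{MODS}titleInfo/{MODS}title"),
        "dc:creator": text(f"{MODS}name/{MODS}namePart"),
        "dc:date": text(f"{MODS}originInfo/{MODS}dateIssued"),
    }

print(mods_to_dc(mods_xml))
```

The same shape of transformation, scaled up and expressed in XSLT, is what sits behind most batch-ingest pipelines for repositories like Islandora.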

Required Qualifications

  • Bachelor’s degree in Information or Computer Sciences or a related field, or an equivalent combination of education and experience
  • 3 years minimum application or systems development experience
  • Experience with scripting languages such as Python and BASH
  • Demonstrated proficiency with a major language such as Java, PHP, Ruby
  • Experience performing data transfers utilizing software library or language APIs
  • Experience with XML, XSLT, XPath, XQuery, and data encoding languages and standards
  • Experience with Linux
  • Commitment to continuously enhancing development skills
  • Strong analytical and problem solving ability
  • Strong oral and written communications skills
  • Demonstrated success in working effectively both independently and within teams
  • Evidence of flexibility and initiative working within a dynamic environment and a diverse matrix organization


Preferred Qualifications

  • Experience in academic, library, or archives environments
  • Experience with the Fedora Commons and Islandora digital asset management systems
  • Working knowledge of Apache, Tomcat, and other delivery servers.
  • Experience with triple stores, SPARQL, RDF
  • Experience with a version-control system such as Git or Subversion.


Interested, qualified applicants may apply at:


Librarian and Instructional Technology Liaison – Data Services (#459)

Date Posted: 10/19/2016  Type/Department: Staff in Library, Information & Technology Services
As a member of a fully blended group of librarians and instructional technologists in the Research & Instructional Support (RIS) department, the Librarian/Library and Instructional Technology Liaison (title dependent on qualifications) will work closely with fellow liaisons in RIS to provide forward-looking library research and instructional technology services to faculty and students, with a special focus on data services. The liaison collaborates broadly across LITS as well as with internal and external partners to support faculty and students participating in the College’s data science curricular initiative and in data-intensive disciplines. The liaison coordinates the development, design, and provision of responsive and flexible data services programming for faculty and students, including data analysis, data storage, data publishing, data management, data visualization, and data preservation. The liaison consults with faculty and students in a wide range of disciplines on best practices for teaching and using data/statistical software tools such as R, SPSS, Stata, and MATLAB.

All liaisons collaborate with faculty to support the design, implementation, and assessment of meaningfully integrating library research and technology skills and tools (including Moodle, the learning management system) into teaching and learning activities; provide library research and instructional technology consultation; effectively design, develop, deliver, and assess seminars, workshops, and other learning opportunities; provide self-motivated leadership in imagining and implementing improvements in teaching and learning effectiveness; serve as liaison to one or more academic departments or programs, supporting pedagogical and content needs in the areas of collection development, library research, and instructional technology decisions; maintain high standards of quality customer service when responding to questions and problems; partner with colleagues across Library, Information, and Technology Services (LITS) to ensure excellence in the provision of services in support of teaching and learning; and actively work to help the RIS team and the College create a welcoming environment in which a diverse population of students, faculty, and staff can thrive.

Evening and weekend work may be necessary. In some circumstances, it may be important to assist during adverse weather and emergency situations to ensure essential services and service points are covered. Performs related duties as assigned.

Qualifications:

  • Advanced degree required, preferably in education, educational technology, instructional design, or MLS with an emphasis in instruction and assessment. Open to other combinations of education and experience such as advanced degree in quantitative academic disciplines with appropriate teaching and outreach experience.
  • 3-5 years’ experience in an academic setting with one or more of the following: teaching, outreach, instructional technology and design support, or research support.
  • Significant experience with statistical/quantitative data analysis using one or more of the following tools: R, SPSS, Stata, or MATLAB.
  • Significant experience with one or more of the following: data storage, data publishing, data management, data visualization, or data preservation.


  • Demonstrated passion for the teaching and learning process, an understanding of a variety of pedagogical approaches, and the ability to develop effective learning experiences.
  • Demonstrated ability to lead projects that include diverse groups of people.
  • A love of learning, the ability to think critically with a dash of ingenuity, the open-mindedness to change your mind, the confidence to admit to not knowing something, and a willingness to learn and move on from mistakes.
  • Attention and care for detail without losing sight of the big picture and our users’ needs.
  • Flexibility to accept, manage, and incorporate change in a fast-paced environment.
  • Excellent oral and written communication, quantitative, organization, and problem-solving skills.
  • The ability to work independently with minimal supervision.
  • Able to maintain a professional and tactful approach in all interactions, ensuring confidentiality and an individual’s right to privacy regarding appropriate information.
  • Enthusiastic service orientation with sensitivity to the needs of users at all skill levels; the ability to convey technical information to a non-technical audience is essential.
  • Ability to travel as needed to participate in consortia and professional meetings and events.


From: Williams, Ginger
Sent: Tuesday, November 22, 2016 8:37 AM
Subject: [lita-l] Job: Library Specialist Data Visualization & Collection Analytics (Texas USA)


Library Specialist: Data Visualization & Collections Analytics


The Albert B. Alkek Library at Texas State University is seeking a Library Specialist: Data Visualization & Collections Analytics. Under the direction of the Head of Acquisitions, this position provides library-wide support for data visualization and collection analytics projects to support data-driven decision making. This position requires a high level of technical expertise and specialized knowledge to gather, manage, and analyze collection data and access rights, then report complex data in easy-to-understand visualizations. The position will include working with print and digital collections owned or leased by the library.


RESPONSIBILITIES: Develop and maintain an analytics strategy for the library. Manage and report usage statistics for electronic resources. Conduct complex holdings comparison analyses utilizing data from the Integrated Library System (ILS), vendors and/or external systems. Produce reports from the ILS on holdings and circulation. Develop strategies to clean and normalize data exported from the ILS and other systems for use in further analysis. Utilize data visualization strategies to report and present analytics. Conduct benchmarking with vendors, peer institutions, and stakeholders. Coordinate record-keeping of current and perpetual access rights for electronic resources and the management of titles in preservation systems such as LOCKSS and PORTICO. Maintain awareness of developments with digital preservation systems and national and international standards for electronic resources. Serve as the primary resource person for questions related to collections analytics and data visualization. Represent department and library-wide needs by participating in various committees. Participate in formulating departmental and unit policies. Pursue professional development activities to improve knowledge, skills, and abilities. Coordinate and/or perform special projects, participate in department & other staff meetings and perform other duties as needed.
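The cleaning-and-normalization work described above usually starts with messy tabular exports. Here is a minimal Python sketch, with entirely hypothetical data: it trims whitespace, normalizes title casing, treats blank counts as zero, and sums usage per title.

```python
import csv
import io
from collections import defaultdict

# Hypothetical export: monthly e-journal usage, with the messy variants
# (stray whitespace, inconsistent casing, blank counts) typical of raw
# ILS or vendor reports.
raw = """title,month,requests
Journal of Data , 2016-01, 40
journal of data,2016-02,35
Maps Quarterly,2016-01,
Maps Quarterly,2016-02,12
"""

def normalize(rows):
    """Clean title strings and sum usage per title across months."""
    totals = defaultdict(int)
    for row in rows:
        title = row["title"].strip().title()         # trim and normalize case
        count = row["requests"].strip()
        totals[title] += int(count) if count else 0  # treat blanks as zero
    return dict(totals)

usage = normalize(csv.DictReader(io.StringIO(raw)))
print(usage)  # {'Journal Of Data': 75, 'Maps Quarterly': 12}
```

A table in this shape is what then feeds a Tableau worksheet or an R/ggplot chart; the cleaning step, not the charting step, is where most of the analyst's time goes.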


Required: Ability to read, analyze, and understand data in a variety of formats; strong written, oral, and interpersonal skills, including ability to work effectively in a team; experience using R, Tableau, BayesiaLab or other data visualization or AI applications, demonstrated by an online portfolio; advanced problem solving, critical thinking, and analytical skills; demonstrated advanced proficiency with Microsoft Excel, including experience using VBA, macros, and formulas; intermediate familiarity with relational databases such as Microsoft Access, including creating relationships, queries, and reports; innovative thinking including the ability to utilize analytics/visualization tools in new, creative, and effective ways.


Preferred:  Bachelor’s degree in quantitative or data visualization field such as Applied Statistics, Data Science, or Business Analytics or certificate in data visualization; familiarity with library collection management standards and tools, such as reporting modules within integrated library systems, COUNTER, SUSHI, PIE-J, LOCKSS, PORTICO, library electronic resource usage statistics, and continuing resources; experience with SQL or other query language.


SALARY AND BENEFITS:  Commensurate with qualifications and experience. Benefits include monthly contribution to health insurance/benefits package and retirement program. No state or local income tax.

BACKGROUND CHECK: Employment with Texas State University is contingent upon the outcome of a criminal history background check.

Texas State’s 38,849 students choose from 98 bachelor’s, 90 master’s and 12 doctoral degree programs offered by the following colleges: Applied Arts, McCoy College of Business Administration, Education, Fine Arts and Communication, Health Professions, Liberal Arts, Science and Engineering, University College and The Graduate College. As an Emerging Research University, Texas State offers opportunities for discovery and innovation to faculty and students.

Application information:

Apply online at

Texas State University is an Equal Opportunity Employer. Texas State, a member of the Texas State University System, is committed to increasing the number of women and minorities in administrative and professional positions.


Assistant Professor
Working Title Assistant Professor – Web Development Librarian #002847
Department Office of the Dean – Hunter Library
Position Summary Hunter Library seeks an enthusiastic, innovative, collaborative, and user-oriented librarian for the position of Web Development and User Experience Librarian. This librarian will research, develop, and assess enhancements to the library’s web presence, designing new sites and applications that improve the user experience in discovering, finding, and accessing library content and services. Responsibilities include providing vision and leadership in designing, developing, and supporting the library website content and integrating it with the larger library web presence, which includes discovery tools, digital collections, and electronic resources; supervising one technology support analyst, as well as staff/student employees engaged in related work, as assigned; monitoring workflow and deadlines; handling day-to-day management, including programming and editorial recommendations, of the library’s web pages and intranet; serving as a member of the library’s web steering committee, an advisory group that includes representatives from across the library; and developing and implementing web applications and tools, particularly for mobile environments. The library values collaboration and broad engagement in library-wide decisions and initiatives. This position reports directly to the Head of Technology, Access, and Special Collections.
Carnegie statement WCU embraces its role as a regionally engaged university and is designated by the Carnegie Foundation for the Advancement of Teaching as a community engaged university. Preference will be given to candidates who can demonstrate a commitment to public engagement through their teaching, service, and scholarship.
Knowledge, Skills, & Abilities Required for this Position Strong leadership skills and ability to lead a web based electronic content management development team; experience in designing, developing, and supporting web sites using HTML, CSS, and JavaScript; familiarity with User Experience Design; basic skills in graphic design; familiarity with usability testing, WAI guidelines, and web analytics; familiarity with mobile platforms, applications, and design; familiarity with responsive design; familiarity with content management systems, intranets, relational databases, and web servers; demonstrated flexibility and initiative; strong commitment to user-centered services and service excellence; strong analytical and problem-solving skills; ability to work effectively with faculty, staff, and students; superior oral and written communication skills; ability to achieve tenure through effective job performance, service, and research.
Minimum Qualifications ALA-accredited master’s degree or international equivalent in library or information science; strong leadership skills and ability to lead a web based electronic content management development team; experience in designing, developing, and supporting web sites using HTML, CSS, and JavaScript; familiarity with User Experience Design; basic skills in graphic design; familiarity with usability testing, WAI guidelines, and web analytics; familiarity with mobile platforms, applications, and design; familiarity with responsive design; familiarity with content management systems, intranets, relational databases, and web servers. Demonstrated flexibility and initiative; strong commitment to user-centered services and service excellence; strong analytical and problem-solving skills; ability to work effectively with faculty, staff, and students; superior oral and written communication skills; ability to achieve tenure through effective job performance, service, and research
Preferred Qualifications Academic library experience; demonstrated skills in User Experience Design; demonstrated experience with usability testing, WAI guidelines, and web analytics; demonstrated experience with mobile platforms, applications, and design; demonstrated experience developing responsive web pages or applications; demonstrated experience with content management systems, relational databases, and web servers; skills or interest in photography; experience with graphic design software; familiarity with a programming environment that includes languages such as ASP.NET, PHP, Python, or Ruby
Position Type Permanent Full-Time

Position: Library Information Analyst


Position summary
The Library Information Analyst coordinates Access & Information Services (AIS) technology assessment activities, working in a 24/5 environment to support the technology needs of customers. This position will analyze and report quantitative and qualitative data gathered from various technology-related services, including the iSpace (library maker space), equipment lending, and all public-facing user technology. Using this data, the incumbent will support strategic planning for improving and operationalizing technology-related services, provide analysis of a wide variety of data to management, and make recommendations for process improvements.

How to apply
See the full job description to learn more and apply online.



Web Development Librarian

The University of Alabama Libraries seeks a talented and energetic professional Web Development Librarian in the Web Technologies and Development unit. Reporting to the Manager of Web Technologies and Development, this position will be responsible for supporting and extending the Libraries’ custom web applications, tools, and web presence. The position will also engage in project work and support new technology initiatives derived from our strategic plan. The position duties will be split among extending and supporting our custom PHP web apps framework, maintaining and enhancing our web site, maintaining and extending our custom Bento search tool, and developing for open-source digital initiatives such as EBSCO’s FOLIO library framework. The position will also support inter-departmental development and troubleshooting using your front-end and back-end skills.

The successful candidate will maintain a knowledge of current best practices in all areas of responsibility with special attention to security. S/he will identify promising new technologies that can positively impact services or generate a better user experience and will be an innovative and entrepreneurial professional who desires to work in a creative, collaborative and respectful environment.

The Web Technologies and Applications department is responsible for the development of such nationally-recognized tools as our Bento search interface and our innovative applications of Ebsco’s EDS tool. The University Libraries emphasizes a culture of continuous learning, professional growth, and diversity through ongoing and regular training, and well-supported professional development.


  • MLS/MLIS degree from an ALA-accredited program
  • Demonstrated ability to work independently, as well as collaboratively with diverse constituencies; comfortable with ambiguity; and effective oral, written, and interpersonal communication
  • Experience (1 year+) developing for LAMP systems / extensive familiarity with PHP and MySQL or other back-end development; e.g., must be able to write SQL queries and PHP code, and show understanding of web application usage of these tools within a Linux and Apache environment
  • Extensive familiarity with front-end development using JavaScript and JavaScript libraries, AJAX, JSON, and HTML5
  • Familiarity with version control systems in a development environment
  • Familiarity with basic UX, iterative design, accessibility standards, and mobile-first design
  • Experience developing within a WordPress environment
  • Ability to problem solve
  • Ability to set and follow through on both individual and team priorities
  • Aptitude for learning new technologies and working in a dynamic environment
  • Demonstrated comfort with an evolving technology landscape
  • A desire to be awesome, and develop awesome projects.
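The back-end expectation above ("must be able to write SQL queries and PHP code") centers on safe database access. Here is a minimal sketch in Python, with SQLite standing in for MySQL and a hypothetical table, showing the parameterized-query pattern that applies equally in PHP (e.g., PDO prepared statements).

```python
import sqlite3

# SQLite standing in for the MySQL half of a LAMP stack; the table and
# rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE guides (id INTEGER PRIMARY KEY, subject TEXT, url TEXT)")
conn.executemany(
    "INSERT INTO guides (subject, url) VALUES (?, ?)",
    [("History", "/guides/history"), ("Biology", "/guides/biology")],
)

def find_guide(subject):
    """Look up a subject guide URL with a parameterized query.

    Placeholders (never string concatenation) are the baseline defense
    against SQL injection that any web-facing library app needs.
    """
    row = conn.execute(
        "SELECT url FROM guides WHERE subject = ?", (subject,)
    ).fetchone()
    return row[0] if row else None

print(find_guide("Biology"))  # → /guides/biology
```

In PHP the same query would be a prepared statement (`$stmt = $pdo->prepare("SELECT url FROM guides WHERE subject = ?")`); the habit, not the language, is what the qualification is testing for.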


  • 1-3 years of programming and development experience in a web environment using LAMP
  • Experience developing for, and supporting, common open-source library applications such as Omeka, ArchivesSpace, and DSpace
  • Experience with Java, Ruby, RAML
  • Familiarity with NoSQL databases
  • Experience interacting with and manipulating REST API data
  • Application or mobile development experience
  • Experience with professional workflows using IDEs, staging servers, Git, and Grunt
  • Familiarity with front-end libraries such as Bootstrap and Angular.js
  • Familiarity with UX methodologies
  • Experience with web security issues, HTTPS, and developing secure applications
  • Experience developing for and within open-source communities


Web Developer/Content Strategist
University Libraries

Desired Qualifications

– Experience working with Drupal or similar CMS.

– Experience working with LibGuides.

– Familiarity with academic libraries.

General Summary: Designs, develops and maintains websites and related applications for the University Libraries. The position also leads a team to develop holistic communication strategies including the creation and maintenance of an intuitive online experience.

– Develops web content strategy for all University Libraries departments. Serves as Manager for CMS website. Leads effort to coordinate website messaging across multiple platforms including Libraries CMS, LibGuides, social media, and other electronic outlets. Leads research, organization, and public relations efforts concerning the development and release of new websites.

– Designs, tests, debugs and deploys websites. Maintains and updates website architecture and content. Ensures website architecture and content meets University standards.

– Collaborates with University staff to define and document website requirements. Gathers and reports usage statistics, errors or other performance statistics to improve information access and further the goals of the University Libraries.

– Works with Libraries Resource Management to incorporate web-related materials and resources from the Integrated Library System into other web platforms. Works with Libraries IT Services to coordinate maintenance of the architecture, functionality, and integrity of University Libraries websites.

Minimum Qualifications

– Bachelor’s degree or higher in a related field from an accredited institution.

– Three years’ relevant experience.

– Strong interpersonal, written and verbal communication skills.

– Experience documenting technical and content standards.

– Skills involving strong attention to detail.

– Supervisory or lead experience.

Academic Technology Specialist

General Description

Under supervision of the Director of Educational Technology, the Academic Technology Specialist will implement complex technical programs and/or projects; perform a range of work in development/programming, communications, technical support, instructional design, and other similar functions to support faculty, staff, and students depending on the needs of the Office of Educational Technology; and provide input to educational technology policy-making decisions.

Key Responsibilities and Activities:

  • Support the implementation of 21st-century technologies, such as ePortfolios, blended/asynchronous courses, mobile learning, and Web 2.0 tools for education;
  • Develop and implement innovative pedagogical applications using the latest computer, mobile and digital media;
  • Develop educational and interactive websites, including interactive learning modules, multimedia presentations, and rich media;
  • Provide one-to-one guidance to faculty in Blackboard, ePortfolios, blended/online learning, mobile learning, and digital media use in the classroom across all disciplines in a professional setting;
  • Support and enhance existing homegrown applications as required;
  • Develop and administer short-term training courses for faculty and students. Provide support for Blackboard, Digication, and WordPress users.
  • Keep abreast of the latest hardware and software developments and adapt them for pedagogical uses across disciplines.


Other Duties

  • Manage multiple projects in a dynamic team-oriented environment;
  • Serve as a liaison between Academic Departments and the Office of Educational Technology, and as a technical resource in all aspects of instructional design, as well as technologies used in the classroom.
Qualifications

  • Bachelor’s degree in Computer Science or a related field, and three years of related work experience; Master’s degree preferred.
  • In-depth experience programming in ASP.NET MVC, PHP, and C#;
  • In-depth experience with lecture capture solutions (e.g., Tegrity, Panopto), TurnItIn, Camtasia, and the Adobe CS Suite;
  • Strong understanding of database design (MySQL, MS SQL);
  • Strong understanding of HTML5, CSS3, HTML, XHTML, XML, JavaScript, AJAX, jQuery, and Internet standards and protocols;
  • Strong teamwork and interpersonal skills;
  • Knowledge of the project development life cycle is a plus;
  • Strong understanding of WordPress Multisite, Kaltura, MediaWiki, and other CMS platforms is a plus;
  • Experience with Windows Mobile, iOS, and other mobile environments/languages is a plus.


Digital Literacies Librarian

Instruction Services Division – Library
University of California, Berkeley Library
Hiring range: Associate Librarian
$65,942 – $81,606 per annum, based on qualifications
This is a full time appointment available starting March 2019.

The University of California, Berkeley seeks a creative, collaborative, and user-oriented colleague as the Digital Literacies Librarian. The person in this role will join a team committed to teaching emerging scholars to approach research with confidence, creativity, and critical insight, empowering them to access, critically evaluate, and use information to create and distribute their own research in a technologically evolving environment. This position also has a liaison role with the School of Information, building collections and supporting research methodologies such as computational text analysis, data visualization, and machine learning.

The Environment

The UC Berkeley Library is an internationally renowned research and teaching facility at one of the nation’s premier public universities. A highly diverse and intellectually rich environment, Berkeley serves a campus community of 30,000 undergraduate students, over 11,000 graduate students, and 1,500 faculty. With a collection of more than 12 million volumes and a collections budget of over $15 million, the Library offers extensive collections in all formats and robust services to connect users with those collections and build their related research skills.

The Instruction Services Division (ISD) is a team of seven librarians and professional staff who provide leadership for all issues related to the Library’s educational role such as student learning, information literacy, first-year and transfer student experience, reference and research services, assessment of teaching and learning, instructor development, and the design of physical and virtual learning environments. We support course-integrated instruction, drop-in workshops, online guides, and individual research. Our work furthers the Library’s involvement in teaching and learning initiatives and emphasizes the opportunities associated with undergraduate education. We cultivate liaison relationships with campus partners and academic programs.

The School of Information (I School) offers: professional master's degrees in information management, data science, and cybersecurity; a doctoral program in Information Management & Systems; and a Graduate Certificate in Information and Communication Technologies and Development. Research areas include: natural language processing, computer-mediated communication, data science, human-computer interaction, information policy, information visualization, privacy, technology for developing regions, and user experience and design.


Reporting to the Head of the Instruction Services Division, the Digital Literacies Librarian will further the Library’s digital literacy initiative (Level Up) by working with colleagues in the Library and engaging with campus partners. This librarian will play a key role in supporting information literacy and emerging research methods across the disciplines, partnering with colleagues who have expertise in these areas (e.g. Data Initiatives Expertise Group, Data and GIS Librarians, Digital Humanities Librarian) and campus partners (e.g. D-Lab, Academic Innovation Studio, Research IT, Research Data Management). Collaborations will be leveraged to identify, implement, and promote entry-level research support services for undergraduate users. This librarian will actively participate in the Library’s reference and instructional services—providing in-person reference, virtual reference, individual research consultations, in-person classes, and the development of online instructional content. This librarian will provide consultation and training to students, faculty, and librarians on using digital tools and techniques to enhance their research and to improve teaching and learning. Serving as a liaison to the I School, this position will establish strong relationships with faculty and graduate students and gain insights into trends in information studies that can be incorporated into the library’s instructional portfolio, with a special focus on undergraduates.

Working with colleagues in ISD and across the Library, the Digital Literacies Librarian will develop innovative programs and services. A key pedagogical tactic is promoting peer-to-peer learning for undergraduates, including administering the Library Undergraduate Fellows program. The Fellows program provides students with training and networking opportunities while helping the Library experiment and pilot service models to best support emerging scholars. New service models are piloted in the Center for Connected Learning (CCL) beta site in Moffitt Library. Currently in the design phase, the CCL is a hub for undergraduates to engage in multidisciplinary, multimodal inquiry and creation. Students learn from peers and experts as they ask, seek, and find answers to their questions in an environment unbound by disciplines or domain expertise. Students discover possibilities for learning and research by experimenting directly with new methods and tools. The space is run in partnership with students, and they are empowered to influence service and space design, structure, and policies. The Digital Literacies Librarian will contribute to this ethos by ensuring that emerging scholars are supported to experiment and be connected to the Library’s wealth of scholarly resources and programs.


Minimum Basic Qualification required at the time of application:

● Bachelor’s degree

Additional Required Qualifications required by start date of position:

● Master’s degree from an ALA accredited institution or equivalent international degree.
● Two or more years' experience providing reference and/or instruction services in an academic or research library.
● Two or more years' experience using digital scholarship methodologies.

Additional Preferred Qualifications:

● Experience applying current developments in information literacy, instructional design, digital initiatives, and assessment.
● Demonstrated understanding of methods and tools related to text mining, web scraping, text and data analysis, and visualization.
● Experience with data visualization principles and tools.
● Demonstrated ability to plan, coordinate, and implement effective programs, complex projects, and services.
● Demonstrated analytical, organizational, problem solving, interpersonal, and communication skills.
● Demonstrated initiative, flexibility, creativity, and ability to work effectively both independently and as a team member.
● Knowledge of the role of the library in supporting the research lifecycle.
● Participation in Digital Humanities Summer Institute (DHSI), ARL Digital Scholarship Institute, Library Carpentry, or other intensive program.
● Experience with or coursework in collection development in an academic or research library.
● Knowledge of licensing issues related to text and data mining.
● Familiarity with data science principles and programming languages such as Python or R.
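As a purely illustrative aside, the entry-level computational text analysis named in these qualifications can start as small as a word-frequency count. A minimal Python sketch (the sample text, function name, and stopword list are all invented for this example):

```python
# Illustrative only: a tiny word-frequency analysis of the kind an
# introductory text-mining workshop might teach. All names here are invented.
from collections import Counter
import re

def word_frequencies(text, stopwords=frozenset({"the", "a", "of", "and"})):
    """Lowercase, tokenize, drop stopwords, and count term frequencies."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t not in stopwords)

sample = "The library supports text mining and the analysis of text."
freq = word_frequencies(sample)
print(freq.most_common(2))  # [('text', 2), ('library', 1)]
```

Real instruction in this area would of course layer proper tokenization, licensing-aware corpus access, and visualization on top of a core like this.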


Making and Innovation Specialist, UNLV University Libraries [R0113536]


The Making and Innovation Specialist collaborates with library and campus colleagues to connect the Lied Library Makerspace with learning and research at the University of Nevada, Las Vegas. This position leads the instructional initiatives of the Makerspace, coordinates curricular and co-curricular outreach, and facilitates individual and group instruction. The incumbent coordinates daily Makerspace operations and supervises a team of student employees who maintain safety standards and provide assistance to users. As a member of the Department of Knowledge Production, this position works jointly with all disciplines to explore the application of technology in learning and research, and prioritizes creating inclusive spaces and experiences for the UNLV community.


This position requires a bachelor's degree from a regionally accredited college or university, and professionals at all stages of their careers are encouraged to apply.



  • Ability to use technology in creative ways to facilitate research and learning.
  • Ability to maintain and troubleshoot digital fabrication technology.
  • Experience with 3D modeling and printing principles including equipment, software, and basic CAD skills.
  • Working knowledge of vector graphic editors and laser cutting or vinyl cutting equipment.
  • Experience with circuitry, Arduino microcontrollers, and Raspberry Pi single-board computers.
  • Coding skills as they apply to circuitry preferred.

Instructional & Organizational

  • Ability to create and maintain policies and instructional materials/guides for Makerspace equipment and services.
  • Managerial skills to hire, train, supervise, and inspire a team of student employees.
  • Excellent oral and written communication skills including the ability to describe relatively complex technical concepts to a non-technical audience.
  • Aptitude for developing and supporting learner-centered instruction for a variety of audiences.
  • Demonstrated capacity and skill to engage students and contribute to student success.
  • Ability to work creatively, collaboratively, and effectively to promote teamwork, diversity, equality, and inclusiveness within the Libraries and the campus.
  • Experience in a relevant academic or public setting preferred.

curation tools

4 Great Curation Tools Created by Teachers for Teachers

April 28, 2016


Edshelf is ‘a socially curated discovery engine of websites, mobile apps, desktop programs, and electronic products for teaching and learning. You can search and filter for specific tools, create shelves of tools you use for various purposes, rate and review tools you’ve used, and receive a newsletter of tools recommended by other educators.’


A free service from nonprofit Common Sense Education designed to help preK-12 educators discover, use, and share the best apps, games, websites, and digital curricula for their students by providing unbiased, rigorous ratings and practical insights from its active community of teachers.

Discover content related to your topics by ‘reviewing your suggestion lists and the topics from other curators’.


A social learning platform that allows teachers to curate and share educational content. Some of the interesting features it provides include: ‘Explore top quality education resources for K-12; create clips from the web, Drive, or Dropbox; use your camera to capture awesome work that you create in and out of the classroom; create whiteboard recordings; create differentiated groups and share content with them; create Personal Learning Portfolios; create Class Portfolios as a teacher and share Assignments with students; provide quality feedback through video, audio, text, badges, or grades; collaborate with other users on eduClipboards for class projects or personal interests.’

Do student evaluations measure teaching effectiveness?

Assistant Professor in MIS

Higher Education institutions use course evaluations for a variety of purposes. They factor into retention analysis for adjuncts, tenure approval or rejection for full-time professors, and even salary bonuses and raises. But are the results of course evaluations an objective measure of high-quality scholarship in the classroom?


  • Daniel Williams

    Associate Professor of Molecular Biology at Winston-Salem State University

    I feel they measure student satisfaction, more like a customer service survey, than they do teaching effectiveness. Teachers whom students think are easy get higher scores than tough ones, even though the students may have learned less from the easier teachers.


  • Muvaffak GOZAYDIN

    Founder at Global Digital University


    How can you measure teachers’ effectiveness? That is, how much do students learn?
    If there is a method to measure how much we learn, I would appreciate learning it.


  • Michael Tomlinson

    Senior Director at TEQSA

    From what I recall, the research indicates that student evaluations have some value as a proxy and rough indicator of teacher effectiveness. We would expect that bad teachers will often get bad ratings, and good teachers will often get good ratings. Ratings for individual teachers should always be put in context, IMHO, for precisely the reasons that Daniel outlines.

    Aggregated ratings for teachers in departments or institutions can even out some of these factors, especially if you combine them with other indicators, such as progress rates. The hardest indicators, however, are drop-out rates and completion rates. When students vote with their feet, this can flag significant problems. We have to bear in mind that students often drop out for personal reasons, but if your college’s drop-out rate is higher than your peers’, this is worth investigating.


  • Rina Sahay

    Technical educator looking for a new opportunity or career direction

    I agree with what Michael says – to a point. Unfortunately, student evaluations have also been used as a venue for disgruntled students, acting alone or in concert – a popularity contest of sorts. Even more unfortunately, college administrations (especially for-profits) tend to rate instructor effectiveness on the basis of student evaluations.

    IMHO, student evaluation questions need to be carefully crafted in order to be as objective as possible in order to eliminate the possibility of responses of an unprofessional nature. To clarify – a question like “Would you recommend this teacher to other students?” has the greatest potential for counter-productivity.


  • Robert Whipple

    Chair, English Department at Creighton University



  • Dr. Virginia Stead, Ed.D.

    2013-2015 Peter Lang Publishing, Inc. (New York) Founding Book Series Editor: Higher Education Theory, Policy, & Praxis

    This is not a Cartesian question in that the answer is neither yes nor no; it’s not about flipping a coin. One element that may make it more likely that student achievement is a result of teacher effectiveness is the comparison of cumulative or summative student achievement against incoming achievement levels. Another variable is the extent to which individual students are sufficiently resourced (such as having enough food, safety, shelter, sleep, learning materials) to benefit from the teacher’s beneficence.


  • Barbara Celia

    Assistant Clinical Professor at Drexel University

    Depends on how the evaluation tool is developed. However, overall I do not believe they are effective in measuring teacher effectiveness.


  • Sri Yogamalar

    Lecturer at MUSC, Malaysia

    Overall, I think students are the best judge of a teacher’s effective pedagogy methods. Although there may be students with different learning difficulties (as there usually are in a class), their understanding of the concepts/principles and application of the subject matter in exam questions, etc. depends on how the teacher imparts such knowledge in a simplified and easy manner to enhance analytical and critical thinking in them. Of course, there are students too who give a bad review of a teacher’s teaching mode out of spite, just because the said teacher has reprimanded them in class for being late, for example, or for even being rude. In such a case, it would not be a true reflection of the teacher’s method of teaching. A teacher tries his/her best to educate and inculcate values by imparting the required knowledge and ensuring a 2-way teaching-learning process. The students will be the best judge to evaluate and assess the success of the efforts undertaken by the teacher, because it is they who are supposed to benefit at the end of the teaching exercise.


  • Paul S Hickman

    Member of the Council of Trustees & Distinguished Mentor at Warnborough College, Ireland & UK

    No! No!


  • Bonnie Fox

    Higher Education Copywriter

    In some cases, I think evaluations (and negative ones in particular) can offer a good perspective on the course, especially if an instructor is willing to review them with an open mind. Of course, there are always the students who nitpick and, as Rina said, use the eval as a chance to vent. But when an entire class complains about how an instructor has handled a course (as I once saw happen with a tutoring student whose fellow classmates were in agreement about the problems in the course), I think it should be taken seriously. But I also agree with Daniel about how evaluations should be viewed like a customer service survey for student satisfaction. Evals are only useful up to a point.

    I definitely agree about the way evaluations are worded, though, to make sure that it’s easier to recognize the useful information and weed out the whining.


  • Pierre HENON

    university teacher (professeur agrege)

    I am a director of studies, and my students in continuing education evaluate teaching effectiveness. Because I am in an ISO process, I must take these measurements into account. It can be very difficult sometimes, because the number of students does not reach the level required for the sample to be valid (in a statistical sense). But in the meantime, I believe in the utility of such measurements. The hard part for me is when I have to discuss with a teacher who is under the required score.


  • Maria Persson

    Senior Tutor – CeTTL – Student Learning & Digital/Technology Coach (U of W – Faculty of Education)

    I’m currently ‘filling in’ as the administrator in our Teaching Development Unit – Appraisals, and I have come to appreciate that the evaluation tool of choice is only that – a tool. How the tool is used in terms of the objective for collecting ‘teaching effectiveness’ information, the question types developed to gain insight, and then how that information is acted upon to inform future teaching and learning will in many ways denote the quality of the teaching itself!

    Student voice is not just about keeping our jobs, ‘bums on seats’ or ‘talking with their feet’ (all part of it of course) but should be about whether or not we really care about learning. Student voice in the form of evaluating teachers’ effectiveness is critically essential if we want our teaching to model learning that affects positive change – Thomas More’s educational utopia comes to mind…


  • David Shallenberger

    Consultant and Professor of International Education

    Alas, I think they are weak indicators of teaching effectiveness, yet they are used often as the most important indicators of the same. And in the pursuit of high response rate, they are too often given the last day of class, when they cannot measure anything significant — before the learning has “sunk in.” Ask better questions, and ask the questions after students have had a chance to reflect on the learning.


  • Cathryn McCormack

    Lecturer (Teaching and Learning), and Belly Dance teacher

    I’m just wrapping up a very large project at my university that looked at policy, processes, systems, and the instrument for collecting student feedback (taking a break from writing the report to write this comment). One thing that has struck me very clearly is that we need to reconceptualise SETs. DeVellis, in Scale Development, talks about how a scale generally has higher validity if the respondent is asked to talk about their own experiences.

    Yet here we are asking students to not only comment on, but evaluate, their teachers. What we really want students to do in class is concentrate on their learning – not on what the teacher is doing. If they are focussing on what the teacher is doing, then something is not going right. The way we ask now seems even crazier when we consider that the most sophisticated conception of teaching is to help students learn. So why aren’t we asking students about their learning?

    The standard format has something to do with it – it’s extremely difficult to ask interesting questions on learning when the wording must align with a 5 point Likert response scale. Despite our best efforts, I do not believe it is possible to prepare a truly student centred and learning centred questionnaire using this format.

    There is an alternate format I came across that I really liked (the Modified PLEQ; Devlin 2002, An Improved Questionnaire for Gathering Student Perceptions of Teaching and Learning), but no commercial evaluation software (which we are required to purchase) can do it. A few overarching questions set the scene for the nature of the class, but the general question format goes: In [choose from drop-down list] my learning was [helped/hindered] when [fill in the blank] because [fill in the blank]. The drop-down list would include options such as lectures, seminars/tutorials, a private study situation, preparing essays, labs, field trips, etc. After completing one question the student has the option to fill in another … and another … and another … for as long as they want.

    Think about what information we could actually get on student learning if we started asking like this! No teacher ratings, all learning. The only numbers that would emerge would be the #helped and the #hindered.

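As an editorial aside, the helped/hindered format Cathryn describes lends itself to very simple tallying. A hypothetical Python sketch (the record layout, field names, and sample responses are invented for illustration, not taken from Devlin's instrument):

```python
# Hypothetical records in the Modified-PLEQ-style format described above:
# (context, helped_or_hindered, what, why). All sample data is invented.
responses = [
    ("lectures", "helped", "worked examples", "I could follow the steps"),
    ("labs", "hindered", "unclear instructions", "I wasted time guessing"),
    ("lectures", "helped", "weekly quizzes", "they forced me to revise"),
]

def tally(responses):
    """Count #helped and #hindered per learning context."""
    counts = {}
    for context, direction, _what, _why in responses:
        bucket = counts.setdefault(context, {"helped": 0, "hindered": 0})
        bucket[direction] += 1
    return counts

print(tally(responses))
# {'lectures': {'helped': 2, 'hindered': 0}, 'labs': {'helped': 0, 'hindered': 1}}
```

The free-text "what" and "because" fields would stay attached to each record for qualitative review; only the helped/hindered counts are aggregated.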

  • Hans Tilstra

    Senior Coordinator, Learning and Teaching

    Keep in mind “Goodhart’s Law” – When a measure becomes a target, it ceases to be a good measure.

    For example, if youth unemployment figures become the main measure, governments may be tempted to go for the low-hanging fruit and the short term (e.g., a work-for-the-dole stick to steer unemployed people into study or the army).


  • robert easterbrook

    Education Management Professional



  • John Stanbury

    Professor at Singapore Institute of Management

    I totally agree with most of the comments here. I find student evaluations to be virtually meaningless as measures of a teacher’s effectiveness. They are measures of student perception, NOT of learning. Yet university administrators (e.g., deans and department chairs) persist in using them to evaluate faculty performance in the classroom, to the point where many instructors have had their careers torn apart. It’s an absolute disgrace! But no one seems to care! That’s the sick thing about it!


  • Simon Young

    Programme Coordinator, Pharmacy

    Satisfaction cannot be simply correlated with teaching quality. The evidence is that students are most “satisfied” with courses that support a surface learning approach – what the student “needs to know” to pass the course. Where material and delivery is challenging, this generates less crowd approval but, conversely, is more likely to be “good teaching” as this supports deep learning.

    Our challenge is to achieve deep learning and still generate rave satisfaction reviews. If any reader has the magic recipe, I would be pleased to learn of it.


  • Laura Gabiger

    Professor at Johnson & Wales University


    Maybe it is about time we started calling it what it is and got Michelin to develop the star rating system for our universities.

    Nevertheless I appreciate everyone’s thoughtful comments. Muvaffak, I agree with you about the importance and also the difficulty of measuring student learning. Cathryn, thank you for taking a break from your project to give us an overview.

    My story: the best professor and mentor in my life (I spent a total of 21 years as a student in higher education), the professor from whom I learned indispensable and enduring habits of thought that have become more important with each passing year, was one whom the other graduate students in my first term told me–almost unanimously– to avoid at all costs.


  • Dr. Pedro L. Martinez

    Former Provost and Vice Chancellor for Academic Affairs at Winston Salem State University & President of HigherEd SC.

    I am not sure that course evaluations based on one snapshot measure “teacher effectiveness”. For various reasons, some ineffective teachers get good ratings by pandering to the lowest level of intellectual laziness. However, consistently looking at comments and some other measures may yield indicators of teachers who are unprepared, do not provide feedback, do not adhere to a syllabus of record, and do not respect students in general. I think part of that information is based on how the questions are crafted.

    I believe that a self-evaluation by the instructor over the period of a semester could yield invaluable information. Using a camera and other devices, ask the instructor to take snapshots of the teaching/learning in the classroom over a period of time, and then ask for a self-evaluation. For the novice teacher, that information could be evaluated by senior faculty to help the junior faculty member improve his/her delivery. Many instructors are experts in their field but lack exposure to different methods of instructional delivery. I would like to see a taxonomy or scale that measures the instructor’s ability, using lecture as the base of instruction and moving up through levels of problem-based learning, service learning, and undergraduate research, by gauging the different pedagogies (pedagogy, andragogy, heutagogy, paragogy, etc.) that engage students in active learning.


  • Steve Charlier

    Assistant Professor at Quinnipiac University

    I wanted to piggyback on Cathryn’s comment above, and align myself with how many of you seem to feel about student evaluations. The quantitative part of student evals is problematic, for all of the reasons mentioned already. But the open-ended feedback that is (usually) a part of student evaluations is where I believe some real value can be gained, both for administrative purposes and for instructor development.

    When allowed to speak freely, what are students saying? Are they lamenting a particular aspect of the course/instructor? Is that one area coloring their response across all questions? These are all important considerations, and provide a much richer source of information for all involved.

    Sadly, the quantitative data is what most folks gravitate to, simply because it’s standardized and “easy”. I don’t believe that student evaluations are a complete waste of time, but I do think that we tend to focus on the wrong information. And, of course, this ignores the issues of timing and participation rates that are probably another conversation altogether!


  • robert easterbrook

    Education Management Professional

    ‘What the Student Does: teaching for enhanced learning’ by John Biggs in Higher Education Research & Development, Vol. 18, No. 1, 1999.

    “The deep approach refers to activities that are appropriate to handling the task so that an appropriate outcome is achieved. The surface approach is therefore to be discouraged, the deep approach encouraged – and that is my working definition of good teaching. Learning is thus a way of interacting with the world. As we learn, our conceptions of phenomena change, and we see the world differently. The acquisition of information in itself does not bring about such a change, but the way we structure that information and think with it does. Thus, education is about conceptual change, not just the acquisition of information.” (p. 60)

    This is the approach higher education is trying to adapt to at the moment, as far as I’m aware.


  • Cindy Kenkel

    Northwest Missouri State University

    My Human Resource students will focus on this issue in a class debate “Should student evaluation data significantly impact faculty tenure and promotion decisions?” One side will argue “yes, it provides credible data that should be one of the most important elements” and the other group will argue against this based on much of what has been said above. They will say student evaluations are basically a popularity contest and faculty may actually be dumbing down their classes in order to get higher ratings.

    To what extent is student data used in faculty tenure and promotion decisions at your institutions?

  • yasir hayat

    Faculty member at institute of management sciences,peshawar


  • joe othman

    Associate Professor at Institute of Education, IIUM

    Agree with Pierre: when the number of students responding is not what is expected, then what?

  • joe othman

    Associate Professor at Institute of Education, IIUM

    Cindy: it is used in promotion decisions at my university, but only as a small percentage of the total points. Yet this issue is still a thorny one for some faculty.

  • Sonu Sarda

    Lecturer at University of Southern Queensland

    How open are we? Is learning about the delivery of a subject only, or about building soft skills as well? If we as teachers are facilitating learning in a conducive manner, would it not lead to at least an average TE, and thus indicate our teaching effectiveness at the base level? Indeed, a qualitative approach would be far better if we intend to accomplish the actual purpose of TE, i.e., reflection for continual improvement. More and more classrooms are becoming learner-centred, and to accomplish this the learners’ say is vital.
    Some students using these as platforms for personal whims need not be a concern, since the TEs are averaged out. Last but not least, TEs are like dynamite and must be handled by experts. They are one means of assessing the gaps, if any, between teaching and learning strategies. They must not be used for performance evaluation; if at all, then all the other factors, such as the number of students, absenteeism, and pass rates (or rather HD and D rates) over a period of at least three terms, must also be included alongside.

  • Dvora Perets

    Teaching colleague at Ben Gurion University of the Negev

    I implement a semester-long self-evaluation process in all my mathematics courses. Students get 3 points (out of 100) for anonymously filling in an online questionnaire every week. They rate (1-5) their personal class experience (I was bored - I was fascinated; I understood nothing - I understood everything; the tutorial sessions didn’t/did help; I visited the lecturer’s/TA’s office hours; I spent X hours on self-learning this week). They can also add verbal comments.
    I started it 10 years ago, when I built a new special course, to help me “hear” the students (80-100 in each class) and to better adjust myself and the content to my new students. I used to publish a weekly response to the verbal comments, accepting some and rejecting others, while making sure to explain and justify any decision of mine.
    Not only did it help me improve my teaching and the course, but it turned out that it actually created a very solid perception of me as a caring teacher. I always was a very caring teacher (some of my colleagues accuse me of being over-caring…), but it seems that “forcing” my students to give feedback throughout the semester kind of brought it out into the open.

    I am still using semester-long feedback in all my courses, and I consider both quantitative and qualitative responses. It helps me see that the majority of students understand me in class. I ignore those who choose “I understood nothing” - obviously, if they were indeed understanding nothing they would not have come to class… (they can choose “I didn’t participate” or “I don’t want to answer”).
    I ignore all verbal comments that aim to “punish” me, and I change things when I think students are right.
    Finally, being a math lecturer for non-major students is extremely hard, both academically and emotionally. Most students are not willing to do what is needed in order to understand the abstract/complicated concepts and processes.
    Only a few (“courageous”) students will attribute their lack of understanding to the fact that they did not attend all classes, or that they weren’t really focused on learning (probably spending a lot of time on Facebook during class…), or that they didn’t go over class notes at home and come to office hours when they didn’t understand something, etc.
    I am encouraged by the fact that about 2/3 of the students who attend classes state they “understood enough” or above (3-5) all semester long. This is especially important as only 40-50% of the students fill in the formal end-of-semester SE, and I bet you can guess how the majority of them will rate my performance. Students fill in the SE before the final exam, but (again) you can guess how 2 midterms with about 24% failure rates will influence their evaluation of my teaching.


  • Michael Tomlinson

    Senior Director at TEQSA

    I think it’s important to avoid defensive responses to the question. Most participants have assumed that we are talking about individual teachers being assessed through questionnaires, and I share everyone’s reservations about that. I entirely agree that deep learning is what we need to go for, but given the huge amounts of public money that are poured into our institutions, we need to have some way of evaluating whether what we are doing is effective or whether it isn’t.

    I’m not impressed by institutions that are obsessed only with evaluation by numbers. However, there is some merit in monitoring aggregated statistics over time and detecting statistically significant variations. If average satisfaction rates in Engineering have gone down every year for five years shouldn’t we try and find out why? If satisfaction rates in Architecture have gone up every year for five years wouldn’t it be interesting to know if they have been doing something to bring that about that might be worthwhile? It might turn out to be a statistical artifact, but we need to inquire into it, and bring the same arts of critical inquiry to bear on the evidence that we use in our scholarship and research.

    But I always encourage faculties and institutions to supplement this by actually getting groups of students together and talking to them about their student experience as well. Qualitative responses can be more valuable than quantitative surveys. We might actually learn something!


  • Aleardo Manacero

    Associate Professor at UNESP – São Paulo State University

    Like everyone here, I also think that these evaluation forms do not truly measure teaching effectiveness. This is quite a hard thing to evaluate, since the effect of learning will be felt several years later, while students are performing their job duties.

    Besides that, some observations made by students are interesting for our own growth. I usually get these through informal talks with the class or even some students.

    In another direction, some of the previous comments address deep/surface learning, basically stating that deep learning is the right way to go. I have to disagree with this for some of the content that has to be taught. In my case (teaching computer science majors) it is important, for example, that every student have surface knowledge of operating systems design, but those who are going to work as database analysts do not need to know the deep concepts involved with that (the same is true of database concepts for a network analyst…). So surface learning also has its relevance in professional training.


  • George Christodoulides

    Senior Consultant and Lecturer at University of Nicosia

    The usefulness of student evaluations, like all similar surveys, is closely linked to the particular questions respondents are asked to answer. There are objective/factual questions such as “Does he start class on time?” or “Does he speak clearly?” and very personal questions such as “Does he give fair grades?” The effectiveness of a teacher could be more appropriately linked to suitably phrased questions, such as “Has he motivated you to learn?” and “How much have you benefited from the course?” The responses to these questions could also be further assessed by comparing the final grades given in that particular course with the performance of the class in the other courses they have taken during that semester. So, for assessing teacher effectiveness, one needs to ask relevant questions and perform the appropriate evaluations.

  • Laura Gabiger

    Professor at Johnson & Wales University


    Michael has an excellent point that some accountability of institutions and programs is appropriate, and that aggregated data or qualitative results can be useful in assessing whether the teaching in a particular program is accomplishing what it sets out to do. Many outcomes studies are set up to measure the learning in an aggregated way.

    We may want to remember that our present conventions of teaching evaluation had their roots in the 1970s (in California, if I remember correctly), partly as a response to a system in which faculty, both individually and collectively, were accountable to no one. I recall my student days when a professor in a large public research institution would consider it an intrusion and a personal affront to be asked to supply a course syllabus.

    As the air continues to leak out of the USA’s higher education bubble, as the enrollments drop and the number of empty seats rises, it seems inevitable that institutions will feel the pressure to offer anything to make the students perceive their experience as positive. It may be too hard to make learning–often one of the most uncomfortable experiences in life–the priority. Faculty respond defensively because we are continually put in the position of defending ourselves, often by poorly-designed quantitative instruments that address every kind of feel-good hotel concierge aspect of classroom management while overlooking learning.


  • Sethuraman Jambunatha

    Dean (I & E) at Vinayaka Mission

    The evaluation of faculty by the students is welcome. The statistics collected can be looked at with a certain degree of objectivity. An instructor strict with his/her students may be ranked low in spite of being an asset to the department; a ‘free-lance’ teacher may be placed higher with students despite being a poor teacher. At any rate, it is the HoD’s duty to observe the quality of all teachers, and his objective evaluation is final. Parents’ feedback should also be taken into account. Actually, teaching is a multi-dimensional task, and student evaluation is just one coordinate.

  • Edwin Herman

    Associate Professor at University of Wisconsin, Stevens Point

    Student evaluations are a terrible tool for measuring teacher effectiveness. They do measure student satisfaction, and to some extent they measure student *perception* of teacher effectiveness. But the effectiveness of a teaching method or of an instructor is poorly correlated with student satisfaction: while there are positive linkages between the two concepts, students are generally MORE satisfied by an easy course that makes them feel good than by a hard course that makes them really think and work (and learn).

    Students like things that are flashy, and things that are easy more than they like things that require a lot of work or things that force them to rethink their core values. Certainly there are students who value a challenge, but even those students may not recognize which teacher gave them a better course.

    Student evaluations can be used effectively to help identify very poor teaching. But they are useless for distinguishing between adequate and good teaching practices.


  • Cesar Granados

    Former Administrative Vice-Rector at Universidad Nacional de San Cristóbal de Huamanga

    César S. Granados
    Retired Professor from The National University of San Cristóbal de Huamanga
    Ayacucho, PERÚ

    Since teaching effectiveness is a function of teacher competencies, an effective teacher is able to use the existing competencies to achieve the desired student results; but a student’s performance depends mainly on his commitment to achieving competencies.

  • Steve Kaczmarek

    Professor at Columbus State Community College

    The student evaluations I’ve seen are more like customer satisfaction surveys, and in this respect, there is less helpful information for the instructor to improve his or her craft and instead more feedback about whether or not the student liked the experience. Shouldn’t their learning and/or improving skills be at least as important? I’m not arguing that these concepts are mutually exclusive, but the evaluations are often written to privilege one over the other.

    There are other problems. Using the same evaluation tool for very different kinds of courses (lecture versus workshop, for instance) doesn’t make a lot of sense. Evaluation language is often vague and puzzling in what it rewards (one evaluation form asks “Was the instructor enthusiastic?” Would an instructor bursting with smiles and enthusiasm but who is disorganized and otherwise less effective be privileged over one who is low-key but nonetheless covers the material effectively?). The “halo effect” can distort findings, where, among other things, more attractive instructors can get higher marks.

    Given how many times I’ve heard from students about someone being their favorite instructor because he or she was easy, I question the criteria students may use when evaluating. Instructors are also told that evaluations are for their benefit to improve teaching ability, but then chairs and administrators use them in promotion and hiring decisions.

    I think if the evaluation tool is sound, it can be useful in helping instructors. But, lastly, I think of my own experiences as a student, where I may have disliked or even resented some instructors because they challenged me or pushed me out of my comfort zone to learn new skills or paradigms. I may have evaluated them poorly at the time, only to learn a few years later, with greater maturity, that they not only taught me well, but taught me something invaluable, perhaps more so than the instructors I liked. In this respect, it would be fairer to those instructors for me to fill out an evaluation a few years later to accurately describe their teaching.

  • Diane Halm

    Adjunct Professor of Writing at Niagara University

    Wow, there are so many valid points raised; so many considerations. In general, I tend to agree with those who believe these evaluations gauge student satisfaction more than learning, though there is a correlation between the two. After 13 years as an adjunct at a relatively small, private college, I have found that engagement really is what many students long for. It seems far less about the final grades earned and more about the tools they’ve acquired. It should be mentioned that I teach developmental-level composition, and while almost no student earns an A, most feel they have learned much. :)


  • Nira Hativa

    Former director, center for the advancement of teaching at Tel Aviv University

    Student ratings of instruction (SRIs) do not measure teaching effectiveness but rather student satisfaction with instruction (as some previous comments on this list suggest). However, there is substantial research evidence for the relationships between SRIs and some agreed-upon measures of good teaching and of student learning. This research is summarized in much detail in my recent book:
    Student Ratings of Instruction: A Practical Approach to Designing, Operating, and Reporting (220 pp.)


  • robert easterbrook

    Education Management Professional

    Learning is not about what the teacher does, it is about what the learner does.

    Do not confuse the two.

    Learning is what the learner does with what the teacher teaches.

    If you think that learning is all about what the teacher does, then the SRI will mislead and deceive.


  • Sami Samra

    Associate Professor at Notre Dame University – Louaize

    Evaluation, in all its forms, is a complex exercise that needs both knowledge and skill. Further, evaluation can best be achieved through a variety of instruments. We know all of this as teachers. The question is how knowledgeable our students are regarding the teaching/learning process. Moreover, how knowledgeable are our administrators in translating information collected from questionnaires (some of which are of questionable validity) into plausible data-based decisions? I agree that students should have a say in how their courses are being conducted. But to use their feedback, quantitatively, to evaluate university professors… I fear that I must hold a very skeptical stance towards such evaluation.


  • Top Contributor

    Quite an interesting topic, and I’m reminded of the ancient proverb, “Parts is not parts.” OK, maybe that was McDonalds. This conversation would make a very thoughtful manuscript.

    Courses is not courses. Which course will be more popular, “Contemporary Music” or “General Chemistry?”

    Search any university using the following keywords “really easy course [university].” Those who teach these courses are experts at what they do, and what they do is valuable, however the workload for the student is minimal.

    The major issues: (1) popularity is inversely proportional to workload; and (2) the composition of the questions appearing on course and professor evaluations (CAPEs).

    “What grade do you expect in this class? Instructor explains course material well? Lectures hold your attention?”

    If Sally gets to listen to Nickelback in one class and then next period learn quantum mechanics, which course does one suppose best held her attention?

    A person about to receive a C- in General Chemistry is probably receiving that C- because s/he was never able to understand the material for lack of striving, and probably hates the subject. That person is very likely to have never visited the professor during office hours for help. Logically one might expect low approval ratings from such a scenario.

    A person about to receive an A in General Chemistry is getting that A because s/he worked his/her tail off. S/he was able to comprehend almost everything the professor said, and most probably liked the course. Even more, s/he probably visited the professor during office hours several times for feedback.

    One might argue that the laws of statistics will work in favor of reality, however that’s untrue when only 20% of students respond to CAPEs. Those who respond either love the professor or hate the professor. There’s usually no middle ground. Add this to internet anonymity, and the problem is compounded. I am aware of multiple studies conducted by universities indicating high correlation between written CAPEs and electronic CAPEs, however I’d like to bring up one point.

    Think of the last time you raised your voice to a customer service rep on the phone. Would you have raised your voice to that rep in person?

    There’s not enough space to comment on all the variables involved in CAPE numerical responses. As of last term I stopped paying attention to the numbers and focused exclusively on the comments. There’s a lot of truth in most of the comments.

    I would like to see the following experiment performed. Take a group of 10,000 students. Record their CAPE responses prior to receiving their final grade. Three weeks later, have them re-CAPE. One year later, have them re-CAPE again. Two years. Three years. Finally, have them re-CAPE after getting a job.

    Many students don’t know what a professor did for them until semesters or years down the road. They’re likely to realize how good a teacher the professor was through their performance in future courses in the same subject requiring cumulative mastery.

    Do I think student evaluations measure teaching effectiveness? CAPEs is not CAPEs.


  • Anne Gardner

    Senior Lecturer at University of Technology Sydney

    No, of course they don’t.

  • Christa van Staden

    Owner of, a professional learning community for educators

    No, they do not. Effectiveness in the classroom should be measured by students’ results, teachers’ attitude towards students, and the quality of their preparation. I worked with a man who told a story about the different hats and learning, and I thought that was a new way of looking at learning. To my utmost shock, a colleague who sat in because he had to say something told me that the man did it exactly the same way, same jokes, etc., when he gave the course five years ago. For real – nothing changed, no new technology, no new insights, no learning happened over a period of five years, nothing? And he is rated very highly – head of a new wing. Who rated him? How? And why did it not affect his teaching at all?

  • Mat Jizat Abdol

    Chief Executive at Institut Sains @ Teknologi Darul Takzim (INSTEDT)

    If we are looking for quality, we have to get information about what happens in the lecture room. There are six elements normally in practice: 1. the teaching plan for lecture contents; 2. teaching delivery; 3. fair and systematic evaluation of students’ work; 4. whether the teaching follows the semester plan; 5. whether the lecturer follows the timetable and is always on time for lecture hours; and lastly, 6. the relationship between lecturer and students.

  • orlando mcallister

    Department Head – Communications/Mathematics

    Do we need to be reminded that educators were students at one time or another? So why not have students evaluate the performance of a teacher? After all, the students are contributing to their own investment in what is significant for survival, and in whether it is effective towards career development – to attain their full potential as sentient human beings working towards the greater good of humanity; anything else falls short of human progress on a tiny rotating planet cycling through the solar system with destination unknown! Welcome to the “Twilight Zone.”

    Would you rather educate a student to make a wise decision to accept 10 gallons of water in a desert? Or accept a $1 million check that further creates mirages and illusory dreams of success?

  • Stephen Robertson

    Lecturer at Edinburgh Napier University

    I think what my students say about me is important. I’m most interested in the comments they make and have used these to pilot other ideas or adjust my approach.

    I’ve had to learn to not beat myself up about a few bad comments or get carried away with a few good ones.

    I also use the assessment results to see if the adjustments made have had the intended impact. I use the VLE logs as well to see how engaged the students are with the materials and what tools they use and when.

    I find the balance keeps me grounded. I want my students to do well and have fun. The dashboard on your car has multiple measures. Why should teaching be different? Like the car I listen for strange noises and look out the window to make sure I’m still on the road.


  • Allan Sheppard

    Lecturer/tutor/PhD student at Griffith University

    I think that most student evaluations are only reaction measures and not true evaluations of learning outcomes or teaching effectiveness – and evaluations are often tainted if the student gets a lower mark than anticipated.
    I think these types of evaluation are only indicative – they should not really be used to measure teacher/teaching effectiveness, and should not be allowed to affect teachers’ careers.
    I note Stephen’s point about multiple measures – unfortunately, most evaluations are quick and dirty, and certainly do not provide multiple measures.


  • Allan Sheppard

    Lecturer/tutor/PhD student at Griffith University

    Interestingly, most student evaluations are anonymous – so students can say what they like and not have to face scrutiny.


  • Olga Kuznetsova

    No, students’ evaluations cannot fully measure teaching effectiveness.
    However, for the relationship to be mutually beneficial, you have to accept their judgement on the matter. Unfortunately, a unique teacher for all categories (types) of students does not exist in our dynamic world.


  • Penny Paliadelis

    Professor, Executive Dean, Faculty of Health, Federation University Australia

    Student evaluations are merely popularity contests; they tempt academics to ‘dumb down’ the content in order to be liked and evaluated positively… this is a dangerous and slippery slope that can result in graduates being ill-prepared for the professions and industries they seek to enter.


  • Robson Chiambiro (MBA, MSc, MEd.)

    PRINCE 2 Registered Practitioner at Higher Colleges of Technology

    In my opinion, student evaluations of teachers measure popularity, as others have suggested, but the problem is that some of the questions and intentions of the assessment are not fulfilled due to the use of wrong questioning. I have never seen in these instruments a question asking students about their expectations of the teacher and the course as such. To me that is more important than asking whether the student likes the teaching style, which students do not know anyway. Teachers who give a test shortly before the evaluation are likely to get lower ratings than those who give tests soon after it.

  • Chris Garbett

    Principal Lecturer at Leeds Metropolitan University

    I agree with other contributors. The evaluations are akin to a satisfaction survey. Personally, if I stay at a hotel, I only fill in the satisfaction survey if something is wrong. If the service is as I expect, I don’t bother with the survey.

    I also feel that students rate courses or modules on a popularity basis. A module on a course may be enjoyable, or fun, but not necessarily better taught than another module with less entertaining subject matter.

    Unfortunately, everyone seems to think that student evaluations are the main criterion by which to judge a course.


  • Steve Benton

    Senior Research Officer, The IDEA Center

    First of all, it would help if we stopped referring to them as “student” or “course” evaluations. Students are not qualified to evaluate; that is what administrators are paid to do. However, students are qualified to provide feedback to instructors and administrators about their perceptions of what occurred in the class and of how much they believe they learned. How can that not be valuable information, especially for developmental purposes, about how to teach more effectively? Evaluation is not an event that happens at the end of a course – it is an ongoing process that requires multiple indicators of effectiveness (e.g., student ratings of the course, peer evaluations, administrator evaluations, course design, student products). By triangulating that combination of evidence, administrators and faculty can make informed judgments and evaluate.


  • Eytan Fichman

    Lecturer at Hanoi Architectural University

    The student/teacher relationship around the subject matter is a ‘triangle.’ The character of the triangle has a lot to do with a student’s reception of the material and the teacher.

    The Student:
    The well-prepared student and the intrinsically motivated student can more readily thrive in the relationship. If s/he is thriving s/he may be more inclined to rate the teacher highly. The poorly prepared student or the student who requires motivation from ‘outside’ is much less likely to thrive and more likely to rate a teacher poorly.

    The Teacher:
    The well-prepared teacher and the intrinsically motivated teacher can more readily thrive in the relationship. If s/he is thriving students may be more inclined to rate the teacher highly. The poorly prepared teacher or the teacher who requires motivation from ‘outside’ is much less likely to thrive and more likely to achieve poor teacher ratings.

    The Subject Matter:
    The content and form of the subject matter are crucial, especially in their relation to the student and teacher.

  • Daniel Goeckner

    Education Professional

    Student evaluations do not measure teaching effectiveness. I have been told I walk on water and am the worst teacher ever. The major difference was the level of student participation. The more they participated the better I was.

    What I use them for is a learning tool. I take the comments apart looking for snippets that I can use to improve my teaching.

    I have been involved in a portfolio program for the past two years. One consistent finding is that the better the measured outcomes, the worse the student reviews.

    • Dr. Pedro L. Martinez

      Former Provost and Vice Chancellor for Academic Affairs at Winston Salem State University & President of HigherEd SC.

      Have you ever been part of a tenure or promotion committee evaluation process? In my 35 years of experience, faculty members do not operate in the ideal, smooth, linear trajectory that you have described. On the contrary, committees partition evaluations into categories and look at student course evaluations as the evidence of an instructor’s ability to teach. However, faculty can choose which evaluations they submit and what comments they want to include as part of the record. I have never seen “negative comments” submitted as evidence of “ineffective teaching”. A five-point scale is used, and whenever a score falls below 3.50 it becomes a great concern for our colleagues!

    • Sethuraman Jambunatha

      Dean (I & E) at Vinayaka Mission

      There are many other ways of assessing faculty, by the peer group. There can be a weekly seminar at which faculty members are expected to present, with other faculty members and students as the audience. This measures how much interest a faculty member has in some chosen areas. The Chair (HoD) can talk to some selected students (chosen to represent the highly motivated, the average, and the take-it-easy) and reach a decision for the tenure track. As I said earlier, students’ evaluation can be one of many aspects. In my own experience, evaluation by other (senior) faculty is many times detrimental to the progress of junior faculty. One may ask whether the HoD is simply the most senior; but one thing is clear: the occupant of the ‘Chair’ should have some ‘vision’ and transcend discrimination and partisan feelings. In India we say: “(Sar)Panch me Parameshwar rahtha hai”, meaning: in the position of Judge, God dwells. Think of Becket and King Henry II. As archbishop, Thomas Becket was a completely changed person, fully submerged in the divine order. So the Chair is supreme. Students’ evaluation is just one aspect.

    • Sethuraman JambunathaSethuraman

      Sethuraman Jambunatha

      Dean (I & E) at Vinayaka Mission

      There are many other ways of asserting the faculty by the peer group. There can be a weekly seminar and faculty members are expected to give a seminar and other faculty members and students are the audience. This measures how much interest a faculty has in some chosen areas. The Chair (HoD) can talk to some selected students (chosen as representing highly motivated/average/take easy) and reach a decision for tenure-track. As I said earlier the students’ evaluation can be one of many aspects. In my own experience other (senior) faculty evaluation is many times detrimental to the progress of junior faculty. But one ask the HoD is the senior most: but one thing is clear, the chair of the ‘Chair’ has some ‘vision’ and transcends discrimination and partisan feelings. In India we call: “(Sar)Panch me Parameshwar rahtha hai”, meaning: On the position of Judge, God dwells (sits). Think of Becket and the King Henry II. As archbishop, Rev. Thomas Becket was a completely changes person fully submerged in divinity order. So the Chair is supremo. Students evaluation is just

    • Sethuraman JambunathaSethuraman

      Sethuraman Jambunatha

      Dean (I & E) at Vinayaka Mission

      There are many other ways of asserting the faculty by the peer group. There can be a weekly seminar and faculty members are expected to give a seminar and other faculty members and students are the audience. This measures how much interest a faculty has in some chosen areas. The Chair (HoD) can talk to some selected students (chosen as representing highly motivated/average/take easy) and reach a decision for tenure-track. As I said earlier the students’ evaluation can be one of many aspects. In my own experience other (senior) faculty evaluation is many times detrimental to the progress of junior faculty. But one ask the HoD is the senior most: but one thing is clear, the chair of the ‘Chair’ has some ‘vision’ and transcends discrimination and partisan feelings. In India we call: “(Sar)Panch me Parameshwar rahtha hai”, meaning: On the position of Judge, God dwells (sits). Think of Becket and the King Henry II. As archbishop, Rev. Thomas Becket was a completely changes person fully submerged in divinity order. So the Chair is supremo. Students evaluation is just

    • Sethuraman JambunathaSethuraman

      Sethuraman Jambunatha

      Dean (I & E) at Vinayaka Mission

      There are many other ways of asserting the faculty by the peer group. There can be a weekly seminar and faculty members are expected to give a seminar and other faculty members and students are the audience. This measures how much interest a faculty has in some chosen areas. The Chair (HoD) can talk to some selected students (chosen as representing highly motivated/average/take easy) and reach a decision for tenure-track. As I said earlier the students’ evaluation can be one of many aspects. In my own experience other (senior) faculty evaluation is many times detrimental to the progress of junior faculty. But one ask the HoD is the senior most: but one thing is clear, the chair of the ‘Chair’ has some ‘vision’ and transcends discrimination and partisan feelings. In India we call: “(Sar)Panch me Parameshwar rahtha hai”, meaning: On the position of Judge, God dwells (sits). Think of Becket and the King Henry II. As archbishop, Rev. Thomas Becket was a completely changes person fully submerged in divinity order. So the Chair is supremo. Students evaluation is just

    • Sethuraman JambunathaSethuraman

      Sethuraman Jambunatha

      Dean (I & E) at Vinayaka Mission

      There are many other ways of asserting the faculty by the peer group. There can be a weekly seminar and faculty members are expected to give a seminar and other faculty members and students are the audience. This measures how much interest a faculty has in some chosen areas. The Chair (HoD) can talk to some selected students (chosen as representing highly motivated/average/take easy) and reach a decision for tenure-track. As I said earlier the students’ evaluation can be one of many aspects. In my own experience other (senior) faculty evaluation is many times detrimental to the progress of junior faculty. But one ask the HoD is the senior most: but one thing is clear, the chair of the ‘Chair’ has some ‘vision’ and transcends discrimination and partisan feelings. In India we call: “(Sar)Panch me Parameshwar rahtha hai”, meaning: On the position of Judge, God dwells (sits). Think of Becket and the King Henry II. As archbishop, Rev. Thomas Becket was a completely changes person fully submerged in divinity order. So the Chair is supremo. Students evaluation is just

    • Sethuraman JambunathaSethuraman

      Sethuraman Jambunatha

      Dean (I & E) at Vinayaka Mission

      There are many other ways of asserting the faculty by the peer group. There can be a weekly seminar and faculty members are expected to give a seminar and other faculty members and students are the audience. This measures how much interest a faculty has in some chosen areas. The Chair (HoD) can talk to some selected students (chosen as representing highly motivated/average/take easy) and reach a decision for tenure-track. As I said earlier the students’ evaluation can be one of many aspects. In my own experience other (senior) faculty evaluation is many times detrimental to the progress of junior faculty. But one ask the HoD is the senior most: but one thing is clear, the chair of the ‘Chair’ has some ‘vision’ and transcends discrimination and partisan feelings. In India we call: “(Sar)Panch me Parameshwar rahtha hai”, meaning: On the position of Judge, God dwells (sits). Think of Becket and the King Henry II. As archbishop, Rev. Thomas Becket was a completely changes person fully submerged in divinity order. So the Chair is supremo. Students evaluation is just

    • Susan Wright

      Assistant Professor at Clarkson University

      Amazing how things work… I’m actually in the process of framing out a research project related to this very question. Does anyone have any suggestions for specific papers I should look at, i.e., literature related to the topic?

      With respect to your question, I believe the answer depends on the questions that get asked.

    • Sarah Lowengard

      Researcher, Writer, Editor, Consultant (history, technology, art, sciences)

      I fall on the “no” side too.

      The school-derived questionnaires nearly always ask the wrong questions, for one.

      I’ve always thought students should wait some years (3-20) before providing feedback, because the final day of class is too recent to do a good assessment.

    • Jeremy Wickins

      Open University Coursework Consultant, Research Methods

      I’m quite late to the topic here, and much of what I think has been said by others. There is a difference between the qualitative and quantitative aspects of student evaluations – I am always fascinated to find out what my students (and peers, of course, though that is a different topic) do/do not think I am doing well so I can learn and adapt my teaching. For this reason, I prefer a more continuous student evaluation than the questionnaire at the end of the course – if I need to adapt to a particular group, I need the information sooner rather than later.

      However, the quantitative side means nothing unless it is tied back to hard data on how the students did in their assessments – an unpopular teacher can still be a *good* teacher of the subject at hand! And the subject matter counts a lot – merely teaching an unpopular but compulsory subject (public law, for instance!) tends to make the teacher initially unpopular in the minds of students – a type of shooting the messenger.

      Teaching isn’t a beauty contest – these metrics need to be used in the right way, and combined with other data if they are to say anything about the teaching.

    • Dr. James R. Martin

      Professor Emeritus

      I wrote a paper about this issue a few years ago. Briefly, the thrust of my argument is that student opinions should not be used as the basis for evaluating teaching effectiveness because these aggregated opinions are invalid measures of quality teaching, provide no empirical evidence in this regard, are incomparable across different courses and different faculty members, promote faculty gaming and competition, tend to distract all participants and observers from the learning mission of the university, and ensure the sub-optimization and further decline of the higher education system. Using student opinions to evaluate, compare and subsequently rank faculty members represents a severe form of a problem Deming referred to as a deadly disease of western style management. The theme of the alternative approach is that learning on a program-wide basis should be the primary consideration in the evaluation of teaching effectiveness. Emphasis should shift from student opinion surveys to the development and assessment of program-wide learning outcomes. To achieve this shift in emphasis, the university performance measurement system needs to be redesigned to motivate faculty members to become part of an integrated learning development and assessment team, rather than a group of independent contractors competing for individual rewards.

      Martin, J. R. 1998. Evaluating faculty based on student opinions: Problems, implications and recommendations from Deming’s theory of management perspective. Issues in Accounting Education (November): 1079-1094.

    • Joseph Lennox, Ph.D.

      There appears to be general agreement that the answer to the proposed question is “No.”

      The next logical step in the discussion would appear to be, “How would you effectively measure teacher effectiveness?”

      With large enrollment classes, one avenue is here:

      So, how should teacher effectiveness be measured?

    • Ron Melchers

      Professor of Criminology, University of Ottawa

      To inform this discussion, I would highly recommend this research review done for the Higher Education Quality Council of Ontario. It’s a pretty balanced and well-informed treatment of student course (and teacher) evaluations: “… Models and Trends.pdf”

    • Ron Melchers

      Professor of Criminology, University of Ottawa

      Just to add my own two cents (two and a half Canadian cents at this point), I think students have much of value to tell us about their experience in our courses and classes, information that we can use to improve their learning and become more effective teachers. They are also able to inform academic administrators of the degree to which teachers fulfill their basic duties and perform the elementary tasks they are assigned. They have far less to tell us about the value of what they’re learning to their future, their professions … and they are perhaps not the best qualified to identify effective learning and teaching techniques and methods. Those sorts of things are better assessed by knowledgeable, expert professional and academic peers.

    • Barbara Celia

      Assistant Clinical Professor at Drexel University

      Thank you, Ron. A great deal of info but worth reading and analyzing.

    • Prof. Ravindra Kumar Raghuvanshi

      Member of Academic committees of some Universities & Retd.Prof.,Dept.of Botany,University of Rajasthan,Jaipur.

      A student rating system may not necessarily be a reliable method to assess teaching effectiveness, because it depends upon individual grasping/understanding power, intelligence, and study tendency. A teacher may do his/her job well, but how many students understand it well? It is reflected invariably in the marks obtained by them.

POD conference 2013, Pittsburgh

The conference program is available in PDF and epub format, so I can have it on my laptop and on my mobile device: this diminishes the necessity to constantly carry and pull out a paper stack.

It is the only conference I know of with 6 AM yoga. A strong spirit in a strong body. LRS & CETL must find space and instructors and offer meditation + yoga opportunities for SCSU students to disconnect.

1:00 – 5:00 PM excursion to Carnegie Mellon: Learning Spaces. LRS has an interest in the Learning Commons.

From the pre-conference workshops, Thurs, Nov 7, 8:30 AM – 12:00 PM:
Linda Shadiow, Connecting Reflection and Growth: Engaging Faculty Stories.
This workshop seems attractive to me, since it coincides with my firm conviction that SCSU faculty must share “best practices” as part of the effort to engage them in learning new technologies.

Kimberly Kenyon et al., Risky Business: Strategic Planning and Your Center.
This workshop might be attractive for Lalita and Mark Vargas, since strategic planning is being considered right now at LRS, and CETL might also benefit from such ideas.

Roundtables, Thurs, Nov 7, 1:30 – 2:45 PM

Measuring the Promise in Learner-Centered Syllabi
Michael Palmer, Laura Alexander, Dorothe Bach, and Adriana Streifer, University of Virginia

Effective Faculty Practices: Student-Centered Pedagogy and Learning Outcomes
Laura Palucki Blake, UCLA

Laura is the assistant director.
A three-time survey of freshmen; faculty are also surveyed every 3 years. These data can be linked: faculty practices and student learning.
Triangulating research findings. Student-centered pedagogy: which teaching practices are effective in promoting student-centered learning practices?
No statistical differences in student learning outcomes between part-time and full-time faculty. The literature says otherwise, but Laura did not find any statistical difference.
Discussions are big; small-group work is big with faculty.
In terms of discussions, there is a huge difference between doing discussion and doing it well.
This is self-reported data, so it can be biased.
There are gender differences: women are more likely to use class discussions; cooperative learning is the same; student presentations the same. The gender differences hold across disciplines, also in STEM fields.
Students’ evaluations of each other’s work. Cooperative learning: it is closer gender-wise.
The more student-centered the pedagogy, the less disengagement from school work.
Understand on a national level what students are exposed to.
Wabash national data.

ePublishing: Emerging Scholarship and the Changing Role of CTLs
Laura Cruz, Andrew Adams, and Robert Crow, Western Carolina University
LORs are in Kentucky.
CETL does at least professional development, resources, eportfolios, LORs, FLCs.
“Teaching Times” at Penn.
Model 2: built around instructional technology. More and more, CETLs merge into a combined comprehensive center; about 9 staff are paid by IT and 11 by the academic center. Because of budget cuts, this is the model predicted since the ’90s. Why not IT? Because after they show how to use it, there remains how to use it effectively. Think beyond technology: technogogy is not the same as technology. Teacher-scholar model: research, service, teaching.
An egallery and other electronic ways to recognize productivity. Stats and survey software does NOT reside with grad studies, but with CETL, so CETL can help faculty from a glimmer of an idea to presentation and publication. Research Support Specialist.
How and where does it fit into faculty development? Neutrality: should CETL be an advocate for institutional, organizational change? Does CETL encourage faculty to take on innovation and risk (changing the culture of higher ed)? Tenure and promotion: do we advocate that epublishing should count, e.g., that a blog will count toward tenure?
A national publication:
We demonstrate that it is good scholarship; the scholarship of teaching will be good teaching.
OER? Open educational resources. Should CETL host and participate in those? Do we participate in creating resources designed to replace textbooks? Caroline has a state-wide grant to support faculty developing learning resources.
Open access is controversial: the right to publish and republish.
40% of all scholarly articles are owned by 3 publishers.
Academic social media and electronic journals.
CETL is the comprehensive center, the hub where people go, so CETL can direct them and bring stakeholders together to make things happen.
The lesson from this session for me is that Lalita and Keith Ewing must work much more closely together.

Evaluating the quality of MOOCs: Is there room for improvement?
Erping Zhu, University of Michigan; Danilo Baylen, University of West Georgia
Reflection on “taking” a MOOC and the seven principles; how to design and teach a MOOC using the seven principles.
MOOCs have a lot of issues; that is not the focus here. The focus is on the instructional design. Both presenters are instructional designers. Danilo is taking a MOOC in library and information science.
Second principle: what is a good graduate education?
About half had completed a course. After the 3rd week the motivation dissipates.
Erping’s experience: the Provost makes quick decisions. The CETL was charged with MOOCs at U of Michigan. “Securing Digital Democracy.”
Danilo is a librarian. His MOOC class had a blog; one gets a certificate at the end. Different from an online class is the badge system that gets you involved in the course. The MOOC instructors also involved grad students to monitor the others. The production team is not usually as transparent as at Coursera. Sustainability. A 10-week module; one needs to do reflections and give feedback to peers. Seven assignments are too many for a full-time professional.

First principle: contact between faculty and student. Not in a MOOC. Video is the only source that provides a sense of connection; the casual comments the instructor makes addressing the students provide this sense. Quick response. Collaboration and cooperation in the MOOC environment can be brought into F2F and campus teaching. Feedback on quizzes was not helpful for improvement, since it is automated; students on the discussion board were the ones who helped. From an instructional design point of view, how can MOOC design be improved?
Group exercise: we were split into groups and rotated sheets among each other, logging responses on 7 sheets of paper. Then each group had to choose the best of the logged responses. The responses will be on the POD site.
eri week resources

Per Keith’s request

“Why Students Avoid Risking Engagement with Innovative Instructional Methods
Donna Ellis, University of Waterloo”

Excerpt From: Professional and Organizational Development Network in Higher Education. “POD Network 2013 Conference Program, Pittsburgh PA 11/7 to 11/10.” iBooks.
This material may be protected by copyright.

A quantitative study. The difficulty of group work. Various questions from the audience: is the time of class (early morning) a reason for increased student disengagement? Students’ perceptions.

The teacher did not explain why the research was being done, and this might have increased the negative perception. Summary of key barriers:

risk of negative consequences

perceived lack of control

contravention of perceived norms.

Fishbein and Ajzen, 2010.

Discussion: how faculty can design and deliver the course to minimize the barriers. Our table thought that there are a lot of unknown parameters to decide, and it is good to hear from the instructor, not only the researcher. How to deal with dysfunctional group-member behaviors? Reflections from the faculty member: how to respond to the data? Some of the barriers frustrated him. Outlines for the assignments were only part of what he had done to mitigate them. What are we asking students on course evaluations? There is a lot more than only negative feedback. The instructor needs more training in conflict resolution and in how to run group work.


CRLT Players

Friday, Nov 8, 10:30 AM – 12:00 PM
William Penn Ballroom
7 into 15

CRLT Players, University of Michigan

It is a burlesque and theater approach to engaging students and faculty in a conversation. 10 plays in 30 min.

Discusses different topics from the plays and seeks solutions as a team. How to deal with international students (the Harvard lady said “safe places” for students); how to deal with technology or the lack of it (I missed the next one writing these notes); and how to reward faculty for innovative things. To encourage innovation, they receive a letter from the provost, and if they fail, it is not used in their annual evaluation.

No videotaping of this performance, because the power is in the conversation. Is there a franchise, like training people to do this? An NSF grant was allowing them to, but now others just pick up the idea. Sell scripts? One can have conversations about strategies for collaborating with the theater department on where to start these short vignettes. If they come to campus and bring a performance, do they also do the follow-up?
Is anger or hostility a reaction during or after these presentations? How to handle it? Hostility can be productive; make sure it is acknowledged as productive. Getting difficult things out there is what the theater is trying to do, at a distance. This is not a morality play.
How do they develop the work? How do they come up with issues? Faculty bring issues, followed by interviews; a draft is created; the theater identifies the problem and addresses the issue. Preview performances with stakeholders who confirm. There are more than sufficient ideas, so the organizers can choose what they see as most pertinent.
Other ways to follow up.
The ECRC committee: I went to their meeting instead of lunch to see if I can participate in next year’s activities. ECRC is the acronym for the tech committee. The web site is one task of this committee. A WordPress site: how the groups work, how forms work, how to connect with people, and most importantly how to start communicating through the web site and cut the listserv. An attempt to centralize all info on the website rather than have it scattered across universities.
What is BRL? Google Apps and Wikipedia as a wiki for another year, until they figure out whether it can be incorporated into the web site. Reconceptualize how to do the work in the process. Two groups in ECRC: the wiki and the web page. And then social media with Amy? An ECRC liaison in every POD committee, to understand how to set up the committee’s web presence. Blackboard Collaborate for meetings, and this is what the liaison explains to committee members.
Designing Online Discussions For Student Engagement And Deep Learning
Friday, Nov 8, 2:15 PM – 3:30 PM, Roundtable
Parkview East
Danilo M. Baylen, University of West Georgia
It must be an asynchronous discussion.
What are the purpose and format of the discussion? Assessment. How is the online discussion supporting the purpose of the curriculum and the students’ learning?
About five discussions per semester altogether; it became part of the class culture.
Format of the assignment:
an asynchronous discussion list; a series of questions or a case study. Is the format a sequence of responses, or does it invite a discussion?
A checklist, which stifles a creative discussion, or just leave it more free?
Purpose: it must be part of the syllabus, and it must be clear.
Meeting learning objectives.
Interactivity: responses to other students. A list of 6 different options for how they can reply. What format the interactivity takes is an important issue, for which there is no textbook.
Assessment: initial postings are critical, since they give an idea of what to work on. How many points as part of the bigger picture? Yet it is the groundwork for the assignment, which gets the most points.
Metacognitive, not evaluative: give students examples from the previous class of what a good discussion is, and explain to students how to evaluate a good discussion entry.
How the question is worded: use the threaded discussion for them to reflect on how they think, rather than only to assess whether they read the chapter. The research about online discussion is very varied.
What is the baseline?
Must an online course be set up and ready before the semester starts, or not?
A reflection for the end of the semester.
Stephen Brookfield’s critical incident questionnaire.
Meet the ISTE and QM standards.
Is the reflection on the content or the process?
Students reflect on their own reflections:
what have you learned about yourself as an online learner? Look for consistencies in both negative and positive reflections.
Connecting and Learning with Integrative ePortfolios: The Teaching Center’s Role
Friday, Nov 8, 3:45 PM – 5:00 PM, Roundtable
Assess critical thinking.
There is a workshop by the presenters’ institutions on how to organize.
More claims than actual evidence, so data is sought.
Main issues:
a programmatic eportfolio. Not student presentation portfolios, but an academic portfolio.
ePortfolio forum.
Look at the image of the green copy:
1. Integration and reflection
2. Social media – in community with other students, faculty, organizations
3. Resume builder
The eportfolio is part of the assessment. A conversation on campus. Some departments use the eportfolio extensively but are not happy. A programmatic academic eportfolio to collect data.
They use the Sakai open portfolio system.
12 departments, and six more in the second year. To speak the same language, they developed a guideline, a conceptual framework (see snapshot of handout).
Curriculum mapping (see the grid on the handout) took much longer than expected.
Faculty were overwhelmed by the quantity of responses from students when filling out the grid.
The role of CETL: the provost at Kevin’s institution charged CETL to do the portfolio gig.
The big argument of the CETL director with the provost is that the portfolio is not only to collect data for assessment and accreditation but to provide a meaningful experience for the students. The EDUCAUSE Horizon report, learning analytics. Scandalous headlines of students suing law schools; bad deductions made on big data. The things that matter for students must be in the portfolio, so they get used to using the portfolio. Pre-advising reflection entries by the students, which shortened the advising sessions: the advisor can see them ahead of time. The advisers will be the focal point; the advising portfolio is becoming…
The portfolio must be used by faculty, not only students.
What’s the buy-in for students? The presentation portfolio serves marketing purposes. Google Sites, so when students leave the institution they can “take” the portfolio with them, as well as go multimedia. Attempts failed because platforms which can be customized were not used. Digital identity. As a CETL director who is not a technology expert, how to learn from the faculty; and that was very…
Documenting and learning with eportfolios.
Faculty demonstrate reflections to students and how to enter them into the portfolio. Using rubrics. Faculty are already using tools, but now connecting them with reflections.
STAR: Situation, Task, Action, Result
Writing skills differ, but even good writers got better at reflection.
How does one polish a portfolio before bringing it to an employer? Students work with career services to polish and proofread.
How much is the university responsible for an individual portfolio? How many levels of proofreading?
Poor student work reflects poor faculty attention.
Groups Inform Pedagogies
Friday, Nov 8, 3:45 PM – 5:00 PM, 35-Minute Research Session A
Carnegie III
Rhett McDaniel and Derek Bruff, Vanderbilt University

Teaching Online and Its Impact on Face-to-Face Teaching
Friday, Nov 8, 3:45 PM – 5:00 PM, 35-Minute Research Session B
Greene & Franklin
Lorna Kearns, University of Pittsburgh

Freedom to Breathe: A Discussion about Prioritizing Your Center’s Work
Andy Goodman and Susan Shadle, Boise State University

Connecting, Risking, and Learning: A Panel Conversation about Social Media
Michelle Rodems, University of Louisville.  Conference C 9:00 AM – 10:15 AM
The use of social media in higher education

Panel of CETL directors and faculty. The guy from Notre Dame uses WordPress the same way I use it: collect questions and, after the 3rd one, create a blog entry and answer the next questions with the URL to the blog entry. NspireD is the name of the blog.

The Ohio State UCAT guy is a Twitter guy. A program coordinator manages WordPress and the web site. They intersect with FB and Twitter. The platforms are integrated, so he did not need to know the technicalities. The graduate consultants are setting it up; the coordinator tried to understand how they mesh together. They can be used as conversation starters or to broadcast and share info. The use of hashtags: how to use them appropriately in Twitter and FB to streamline.

SCSU problem: if we don’t build it, they will not come. A Tim Burton version of Field of Dreams.

Rachel, CETL assistant director at U of Michigan. She is out there personally and likes it. A very static web page; Drupal as a content management system, so the blog is part of the web page. Entries about 2 times a week. One of the staff people is an editor and writes blog posts, vetted by a second CETL staff member. Auto-push from the blog to Twitter. Screencasts for the YouTube channel. Comments on the blog are minimal from faculty and staff. What about students? About 1,000 followers on Twitter. What do the analytics say? Hits on the home page, but no idea how much time is spent reading. People spend more time when using the tags. The blog is a less formal way to share information. A lot of material is recycled in December and August.
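An auto-push from a CMS blog to Twitter is typically wired through the blog’s RSS feed: a small job polls the feed, spots posts it has not announced yet, and hands them to the social media account. As a minimal sketch only (the feed URL, post titles, and the actual posting step are hypothetical placeholders, not the U of Michigan setup), the feed-parsing half can be done with the Python standard library:

```python
# Sketch of an RSS-to-Twitter auto-push, assuming the blog exposes a
# standard RSS 2.0 feed. SAMPLE_FEED stands in for the live feed; a real
# job would fetch it over HTTP and call the Twitter API at the end.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>CETL Blog</title>
  <item><title>Clicker tips</title><link>http://example.edu/blog/clickers</link></item>
  <item><title>Rubric basics</title><link>http://example.edu/blog/rubrics</link></item>
</channel></rss>"""

def new_posts(feed_xml, already_pushed):
    """Return (title, link) pairs for feed items not yet announced."""
    root = ET.fromstring(feed_xml)
    posts = []
    for item in root.iter("item"):
        title = item.findtext("title")
        link = item.findtext("link")
        if link not in already_pushed:  # the link serves as a stable post ID
            posts.append((title, link))
    return posts

def format_tweet(title, link, limit=140):
    """Trim the announcement to the (2013-era) 140-character limit."""
    return f"New on the blog: {title} {link}"[:limit]

if __name__ == "__main__":
    pushed = {"http://example.edu/blog/rubrics"}  # links announced earlier
    for title, link in new_posts(SAMPLE_FEED, pushed):
        print(format_tweet(title, link))  # a real setup would post via the Twitter API
```

In practice a scheduled task (cron, or a service like the then-popular Twitterfeed) does exactly this loop, which is why the panelists could say the platforms “are integrated” without anyone approving individual tweets.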

Does anybody subscribe, and do you promote RSS?

The separate blog for a workshop requires interaction, and that is a success.

For faculty development, U of Michigan is using a blog; they recruited 50 to follow it. A team of 3 used WordPress for a semester, followed by a survey and a focus group. Huge success: between 6 and 30 comments. A community with no other space on campus.

How are you using social media to promote connections? Elevate the voices of others on campus by interviewing faculty. At U of Michigan there was no interest in learning about what other faculty are doing, so they trashed that initiative but started video narrations about faculty who innovate. Videotaped and edited (not high-quality video), tagged and blog-posted; this approach created more connection, because it is not text only.

What have been the obstacles and indeed failures, and what have you learned?

Convincing the administration that CETL can do it, and that it does not have to be the same quality as the web page and the printed material. Changing the mindset. No assessment, since nothing else was working and they were ready for a radical step such as a blog.

Same with Twitter. Taking the risk to experiment with the hashtags. Tweets can’t be approved. It needs time to build an audience; one month will not have an impact. Start with the notion that you are building a repository, not a forum.

One of the panelists has a Google spreadsheet with information on all CETL social media sites. There are resources on how to deal with negative outcomes of using social media. On working with librarians, the Notre Dame guy said: they will give you twenty sources. No, no, no, he said, give me your best three.


At U of Michigan, more grad students do blog guest posts; almost no faculty.

Have you considered giving them more than a guest blog, but no facilitator? Let faculty do a blog post once a semester. It is not moderated, but more like guidance toward how to do a good blog post. The interview-based approach is unique and does not show up somewhere else.

Institutional background is important in these decisions.

How often to refresh the WordPress page? Often one person is the voice, and it takes a lot of journalistic skills. Use the draft option to publish when there are several ideas coming at once.

The mindshift for CETL is to decrease the standards. Make it more informal. A blog post can always be fixed later. To avoid the faculty’s false perception that this is not scholarly, it needs references. So: casual tone + references.

The blog “from students’ perspective” is repurposed.

Risking Together: Cultivating Connection and Learning for Faculty Teaching Online
Michaella Thornton, Christopher Grabau, and Jerod Quinn, Saint Louis University
Oliver 9-11:15 AM

Space Matters! and Is There a Simple Formula to Understand and Improve Student Motivation
Kathleen Kane and Leslie A. Lopez, University of Hawaii at Manoa
Riverboat 9:00 AM – 10:15 AM

The Risks and Rewards of Becoming a Campus Change Agent
Dr. Adrianna Kezar, University of Southern California
William Penn Ballroom 10:30 AM – 12:00 PM

Branch campuses, students abroad, do more with less, competition from for-profit institutions.

Students work more, but this is a good reflection on learning success.

A provost might ask to consolidate professional development opportunities for faculty and students instead of faculty only.

Is the administration genuine, understanding, transparent? Administration is more about persuading than listening. Respect: not assuming that faculty will not accept it. If faculty will make sacrifices, what will faculty see the administration sacrifice on their side? Leading from the middle means a collective vision for the future. Multilevel leadership: top-down efforts don’t work, and bottom-up ones are fragile. Managing up is less preferred than powering up. It is difficult to tell the administration that they miss or misunderstand the technology issue.

Four frames. The goal is multi-frame leadership, very much the same as Jim Collins’ Good to Great: the right people on the bus, rightly trained.

How to build a coalition: different perspectives, acknowledge the inherent conflict.

The Delphi project


It Takes a Campus: Promoting Information Literacy through Collaboration
Karla Fribley and Karen St. Clair, Emerson College
Oakmont 1:45 PM – 3:00 PM

Most of the attendees, and both presenters, were librarians.

The presenters performed a sketch to involve the participants.

Definition: what is information literacy (IL)?

Information literacy: collaborative work with faculty to design student learning outcomes for information literacy.

Guiding principles: backward course design.

Where they see students struggle with research

Survey question to students — what is most difficult for you? — with responses displayed as a Wordle.

self reflection

Curriculum mapping to identify which courses are the strategic ones for instilling non-credit information literacy.

ACRL Assessment in Action.


Risky Business: Supporting Institutional Data Gathering in Faculty Development Centers
Meghan Burke and Tom Pusateri, Kennesaw State University
Oliver 1:45 PM – 3:00 PM Roundtable

Exploring Issues of Perceptual Bias and International Faculty
Shivanthi Anandan, Drexel University.
Heinz 3:15 PM – 4:30 PM Roundtable

Why do we need it, and not only regarding international faculty — see Lisa Wolf-Wendel.

Susan Twombly: pointers for hiring and retention. Performance is both teaching and living. Salary effect — salary issues, not only the pay rate. An FLC of all tenure-track faculty without citizenship: they worry about their tenure. Funding agencies: very few will fund you if you are not a citizen.

Diane Schafer: perceptual biases, graffiti. Cathryn Ross.


Averting Death by PowerPoint! From Killer Professors to Killer Presenters
Christy Price, Dalton State College
Riverboat 4:45 PM – 6:00 PM

How to create effective mini-lectures; a checklist for action planning.

Engage and leave lecturing out. The reason some can't move away from lecture is that they treat lecturing as performance art.

Make lectures mini. How long should "mini" be? About 22 minutes — roughly the age of the listener in minutes.

Emotional appeal, empathy.

Evoke positive emotions with humor. Always use mixed-methods research, since the narrative matters — Berk (2000) and Sousa (2011).

Ethical obligations and emotional appeal.

acknowledge the opposition

enhance memory processing with visuals and multimedia

Use guided practice by minimizing note-taking.

Presentation Zen is a book I need to read.

Enhance memory processing by creating mystery.

address relevance
Death by PowerPoint — Nancy Duarte: "The Secret Structure of Great Talks."

Engage faculty by showing them how their presentation is and how it can be.

process with clickers

Sunday morning session

Vygotsky's zone of proximal development and the flipped mindset. Cool tweets at #pod13.

IDAS process (Baudler, Boyd, Stromle 2013):

I – Identify the issue

D – Debrief the situation

A – Analyze what happened

S – Strategize solutions and opportunities for growth and future success

