Searching for "formative assessment"

Classroom Routines Change

Classroom Routines Must Change. Here’s What Teaching Looks Like Under COVID-19

August 5, 2020

https://www.edweek.org/ew/articles/2020/08/06/classroom-routines-have-to-change-heres-what.html

Class cultures built on collaboration or group project work will change.

The article discusses these priorities and presents ideas for adapting common classroom routines for remote or socially distanced settings:

  • Frequent, meaningful engagement
  • Cognitively demanding work
  • Responding to formative assessment

Adapting Common Classroom Routines in an Online (or Socially Distanced) Environment

  • Introduce yourself to students at the beginning of the year
  • Hold a remote discussion
  • Plan a socially distanced art, music, or physical education lesson
  • Have students think-pair-share

 

free digital tools for student engagement

15 Free Digital Tools to Boost Students’ Engagement Online

A review of digital tools and ideas for teachers to support formative assessment in online classrooms

https://medium.com/the-faculty/digital-tools-for-online-student-engagement-2faafbbd0b44

1. Diigo

2. Evernote

3. Notion

4. Hypothes.is

5. Mural

6. Miro

7. Kahoot

8. Sli.do

9. Factile

10. Wakelet

11. Flipgrid

12. Slack

13. Padlet

14. Zoom

15. BigBlueButton

+++++++++++++
more on engagement in this IMS blog
https://blog.stcloudstate.edu/ims?s=engage

peer to peer curation

Peer-to-Peer Curation Activities Boost Higher-Order Thinking

https://www.kritik.io/resources/peer-to-peer-curation-activities-boost-higher-order-thinking

Most professors we hear from want to assess their students at higher cognitive levels; if current assessments kept students at the lowest level of Bloom’s Taxonomy, they would not feel rewarded as educators.

However, assessment is by far the most labour-intensive part of teaching. Assessment plans and rubrics must be prepped. Test questions must be written. Every student needs a mark, personalized feedback, and a road map for improvement. The larger the class, the more work for the instructor. Add in formative assessments like weekly assignments and exercises that precipitate subtle, ongoing tweaks to the syllabus, and it is easy to see why many faculty opt to stick with what they know: an accumulation of easy-to-grade summative assessments that almost inevitably rely upon memorization and the most basic understanding of concepts.

Curation Activities can be one of the most effective teaching strategies to help students compare what they’re learning in the classroom with real-world examples, and gain insight into how they can relate to each other.

Curation Activities can apply to all disciplines, such as Business, Arts, or Sciences.

When students explain what they’ve learned to other students, they help consolidate and strengthen connections to those concepts while simultaneously engaging in active learning. Find more project ideas here.

By actively engaging with their classmates and applying their own evaluative skills to feedback they’re delivering to their peers, students are developing lifelong critical thinking and creative analysis skills. Additionally, peer assessment is proven to be effective in getting students faster feedback from diverse sources, increases meta-cognition, independence and self-reflection, and improves student learning. These are all important skills that provide value far beyond the classroom. More details on the benefits of peer assessment here.

++++++++++++++
more on curation in this IMS blog
https://blog.stcloudstate.edu/ims?s=curation

Innovative Pedagogy

Rebecca Ferguson
  • Senior lecturer in the Institute of Educational Technology (IET) at The Open University in the UK
  • Senior fellow of the Higher Education Academy
TODAY, Thursday at 1:00 PM CT
JOIN HERE
This Week:
An interactive discussion on the Innovating Pedagogy 2019 report from The Open University
About the Guest
Rebecca is a senior lecturer in the Institute of Educational Technology (IET) at The Open University in the UK and a senior fellow of the Higher Education Academy. Her primary research interests are educational futures and how people learn together online; she supervises doctoral students in both these areas.
Rebecca worked for several years as a researcher and educator on the Schome project, which focuses on educational futures, and was also the research lead on the SocialLearn online learning platform, and learning analytics lead on the Open Science Lab (Outstanding ICT Initiative of the Year: THE Awards 2014). She is currently a pedagogic adviser to the FutureLearn MOOC platform, and evaluation lead on The Open University’s FutureLearn MOOCs. She is an active member of the Society for Learning Analytics Research, and has co-chaired many learning analytics events, including several associated with the Learning Analytics Community Exchange (LACE), a European project funded under Framework 7.
Rebecca’s most recent book, Augmented Education, was published by Palgrave in spring 2014.
++++++++++++++++++++
My notes
innovative assessment is needed for innovative pedagogy.
Analytics: what if I want to know about my learning (from the learner’s perspective)?
Ray Garcelon
How is “stealth assessment” unique compared to formative assessment?
students teaching robots
learning analytics, Rebecca is an authority.
how to assess whether resources are trustworthy; fake news and social media; navigating a post-truth society
how to advance the cause of empathy through technological means
gamification. XR safer environment. digital storytelling and empathy.
poll: learning with robots
digital literacy and importance for curriculum primary, secondary and post secondary level.
digital literacy is changing every year;
drones
Buckingham Shum, S., & Ferguson, R. (2012). Social Learning Analytics. Educational Technology & Society, 15(3), 3–26. https://mnpals-scs.primo.exlibrisgroup.com/discovery/fulldisplay?docid=ericEJ992500&context=PC&vid=01MNPALS_SCS:SCS&search_scope=MyInst_and_CI&tab=Everything&lang=en
Mor, Y., Ferguson, R., & Wasson, B. (2015). Editorial: Learning design, teacher inquiry into student learning and learning analytics: A call for action. British Journal of Educational Technology, 46(2), 221–229. https://doi.org/10.1111/bjet.12273
Ferguson, R. (2014). Learning Analytics: drivers, developments and challenges. TD Tecnologie Didattiche, 22(3), 138–147. https://doi.org/10.17471/2499-4324/183
Hansen, C., Emin, V., Wasson, B., Mor, Y., Rodriguez-Triana, M., Dascalu, M., … Pernin, J. (2013). Towards an Integrated Model of Teacher Inquiry into Student Learning, Learning Design and Learning Analytics. Scaling up Learning for Sustained Impact – Proceedings of EC-TEL 2013, 8095, 605–606. https://doi.org/10.1007/978-3-642-40814-4_73
how to decolonize educational technology: MOOCs coming from the big colonial powers, not from small countries. Video games: many have very colonial perspective
strategies for innovative pedagogies: only certain groups or aspects are taken into account; there is rarely a focus on support by management, scheduling, timetabling, tech support.

+++++++++++
more on future trends in this IMS blog
https://blog.stcloudstate.edu/ims?s=future+trends

seesaw

https://web.seesaw.me/ 

Seesaw saves time on organization and parent communication, makes formative assessment easy, and provides a safe place to teach 21st Century skills.
QR code connection, available across mobile devices

Alternatives:

Google Classroom https://classroom.google.com/

Quizlet https://quizlet.com/

Showbie https://www.showbie.com

alternatives for student portfolios:

https://www.commonsense.org/education/blog/top-11-apps-and-websites-for-student-portfolios

https://www.commonsense.org/education/blog/edtech-quick-take-seesaw-vs-freshgrade

two popular digital-portfolio apps: Seesaw and FreshGrade.

Seesaw digital learning journal.

https://www.symbalooedu.com/

2017 teaching w technology conference

2017 Teaching with Technology Conference

October 6-8 in Baltimore

Forward-thinking educators are finding that technology can enhance their teaching methods, infuse new energy into their courses, and improve student learning.

But the latest cool technology is only cool if you know where, when, why, and how to use it. Join us in Baltimore for the 2017 Teaching with Technology Conference to learn best practices for effectively integrating technology into your courses.

Topics include:

  • Blended and flipped learning
  • Assignments for online discussion
  • Digital tools for formative assessment
  • Online course design and development
  • Active learning
  • Media literacy
  • Copyright issues

Smartphones in the classroom

+++++++++++++++++++++++++++
more on teaching with technology in this IMS blog
https://blog.stcloudstate.edu/ims?s=educational+technology

biometric authentication online ed

Wiklund, M., Mozelius, P., Westing, T., & Norberg, L. (2016). Biometric Belt and Braces for Authentication in Distance Education. Retrieved from https://www.researchgate.net/publication/309548915_Biometric_Belt_and_Braces_for_Authentication_in_Distance_Education
Abstract
There is a need for new techniques to handle the problem in online environments. To achieve zero cheating is hard (or impossible) without repelling not only cheaters but also those students who do not cheat, and a zero-tolerance emphasis would also risk inhibiting students’ intrinsic motivation. Several studies indicate that existing virtual learning environments do not provide the features needed to verify that the intended student is the one taking the online exam.
One approach to preventing student dishonesty is the university code of honour: a set of rules describing what actions are not permitted and the consequences for students taking such actions. Another way of preventing cheating is the use of proctors during written exams. Even while using such codes of honour and proctors, universities have still found many students cheating.
Neutralisation is the phenomenon when a person rationalises his or her dishonest behaviour with arguments like “I can do this because the workload within this course is just too overwhelming” or “I can do this because I have a half-time job on the side which gives me less study time than the other students have”. By doing so the student puts the blame for cheating on external factors rather than on himself, and also protects himself from the blame of others (Haines et al., 1986). This neutralises the behaviour in the sense that the person’s feelings of shame are reduced or even eliminated.
Simply asking participants to read a code of honour when they had the opportunity to cheat reduced dishonesty. Whether one signed the code of honour or just read it also influenced cheating. The Shu et al. (2011) study suggests that opportunity and knowledge of ethical standards are two factors that impact students’ ethical decision about cheating. This is in line with the results of McCabe, Trevino and Butterfield (2001), showing that if students are regularly reminded of the university’s code of honour, they are less likely to cheat.
For an online course setting, Gearhart (2001) suggests that teachers should develop a guideline for “good practices”.
In online examination there are reports of students hiring other persons to increase their scores (Flior & Kowalski, 2010), and there is a need for new, enhanced authentication tools (Ullah, Xiao & Lilley, 2012). For companies and Internet environments the process of authentication is often completed through the use of logon identification with passwords, with the assumption that the password guarantees that the user is authentic (Ramzan, 2007), but logins and passwords can be borrowed (Bailie & Jortberg, 2009). The discussion on how to provide enhanced authentication in online examination has led to many suggested solutions; four of them are:
  • Challenge Questions: with questions based on third-party data
  • Face-to-Face Proctored Exam: with government or institution issued identification
  • Web Video Conference Proctor: audio and video conference proctoring via webcam and screen monitoring service with live, certified proctors
  • Biometrics and Web Video Recording: with unique biometrics combined with the recording of the student in the exam via webcam

An idea for online courses is that assessment should not be only a one-way process where the students get grades and feedback. The examination process should also be a channel for students’ feedback to teachers and course instructors (Mardanian & Mozelius, 2011). New online methods could be combined with traditional assessment in an array of techniques aligned to the learning outcomes (Runyon and Von Holzen, 2005). Examples of summative and formative assessment in an online course could be a mix of:

  • Multiple choice questions (MCQ) tests, automatically corrected in a virtual learning environment
  • Term papers or essays analysed by the course instructors
  • Individual or group assignments posted in digital drop-boxes
  • Oral or written tests conducted in the presence of the instructor or through videoconferences (Dikli, 2003)
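The first item in the list above, automatic correction of MCQ tests, is simple enough to sketch in a few lines. The answer key, question IDs, and grading scheme below are hypothetical illustrations, not taken from the paper.

```python
# Minimal sketch of automatic MCQ scoring, as a virtual learning
# environment might perform it. Answer key and responses are made up.

def score_mcq(answer_key, responses):
    """Return (number correct, percentage score) for one student.
    Unanswered questions simply count as incorrect."""
    correct = sum(
        1 for qid, right in answer_key.items()
        if responses.get(qid) == right
    )
    return correct, round(100 * correct / len(answer_key), 1)

answer_key = {"q1": "b", "q2": "d", "q3": "a", "q4": "c"}
student = {"q1": "b", "q2": "d", "q3": "c"}  # q4 left blank

correct, pct = score_mcq(answer_key, student)
# correct == 2, pct == 50.0
```

Instant feedback of this kind is what makes MCQ banks attractive for formative assessment, even though they sit at the lower levels of Bloom’s Taxonomy.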

The authors’ suggestion is a biometric belt-and-braces model with a combination of scanned facial coordinates and voice recognition, where only a minimum of biometric data has to be stored. Even if the model is based on biometrics with a medium to low grade of uniqueness and permanence, it would be reliable enough for authentication in online courses if two (or more) types of biometrics are combined with the presented dialogue-based examination using an interaction/observation process via web cameras.
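The belt-and-braces idea of combining two weaker biometrics can be sketched as score-level fusion: each matcher produces a confidence score, and a weighted combination decides. The weights and threshold below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of score-level fusion for two biometric matchers
# (facial coordinates + voice recognition). Weights and threshold are
# assumptions for illustration; a real system would calibrate them
# on enrolment data.

FACE_WEIGHT = 0.5   # hypothetical
VOICE_WEIGHT = 0.5  # hypothetical
THRESHOLD = 0.7     # hypothetical acceptance threshold

def authenticate(face_score, voice_score, threshold=THRESHOLD):
    """Fuse two match scores in [0, 1]; accept only if the
    weighted combination clears the threshold."""
    fused = FACE_WEIGHT * face_score + VOICE_WEIGHT * voice_score
    return fused >= threshold

# Two mediocre matchers combined: each alone may be too weak,
# but jointly they can support an accept/reject decision.
print(authenticate(0.8, 0.75))  # both reasonably confident -> True
print(authenticate(0.9, 0.2))   # voice mismatch drags the fusion down -> False
```

The design point matches the abstract: neither biometric needs a high grade of uniqueness on its own, because the fusion (plus the webcam dialogue) carries the authentication decision.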

++++++++++++++++++
more on identification in this IMS blog
https://blog.stcloudstate.edu/ims?s=identification

++++++++++++++++
more on proctoring and detecting cheating:

http://www.wgu.edu/blogpost/innocent-red-flags-caught-by-online-exam-proctors

voices from the other side:
http://infoproc.blogspot.com/2013/04/how-to-cheat-online-exam-proctoring.html

https://campustechnology.com/articles/2016/04/06/how-students-try-to-bamboozle-online-proctors.aspx

http://www.usnews.com/education/online-education/articles/2014/06/17/think-twice-before-cheating-in-online-courses

teaching with technology

Boulder Faculty Teaching with Technology Report
Sarah Wise, Education Researcher; Megan Meyer, Research Assistant. March 8, 2016

http://www.colorado.edu/assett/sites/default/files/attached-files/final-fac-survey-full-report.pdf

Faculty perceive undergraduates to be less proficient with digital literacy skills. One-third think their students do not find or organize digital information very well. The majority (52%) think they lack skill in validating digital information.
My note: for the SCSU librarians, “digital literacy” is a fancy word for information literacy. Digital literacy, as used in this report, is a much broader area that encompasses a much wider set of skills.
Faculty do not prefer to teach online (57%) or in a hybrid format (where some sessions occur online; 32%). One-third of faculty reported no experience with these least popular course types.
My note: pay attention to the questions asked; these are questions I have been raising with Mike Penrod for years, asking to work with faculty on them. Questions that are snubbed by CETL, while the dominance of D2L and MnSCU-mandated tools is established.

Table 5. Do you use these in-class technologies for teaching undergraduates? Which are the Top 3 in-class technologies you would like to learn or use more? (n = 442)

Top 3 use in most of my classes have used in some classes tried, but do not use N/A: no experience
in-class activities, problems (via worksheets, tablets, laptops, simulations, beSocratic, etc.) 52% 33% 30% 6% 30%
in-class question, discussion tools (e.g. Twitter, TodaysMeet, aka “backchannel communication”) 47% 8% 13% 11% 68%
using online resources to find high quality curricular materials 37% 48% 31% 3% 18%
iClickers 24% 23% 16% 9% 52%
other presentation tool (Prezi, Google presentation, Slide Carnival, etc.) 23% 14% 21% 15% 51%
whiteboard / blackboard 20% 58% 23% 6% 14%
Powerpoint or Keynote 20% 74% 16% 4% 5%
document camera / overhead projector 15% 28% 20% 14% 38%

 

Table 6. Do you have undergraduates use these assignment technology tools? Which are your Top 3 assignment technology tools to learn about or use more? (n = 432)

Top 3 use in most of my classes have used in some classes tried, but do not use N/A: no experience using
collaborative reading and discussion tools (e.g. VoiceThread, NB, NotaBene, Highlighter, beSocratic) 43% 3% 10% 10% 77%
collaborative project, writing, editing tools (wikis, PBWorks, Weebly, Google Drive, Dropbox, Zotero) 38% 16% 29% 12% 43%
online practice problems / quizzes with instant feedback 36% 22% 22% 8% 47%
online discussions (D2L, Today’s Meet, etc.) 31% 33% 21% 15% 30%
individual written assignment, presentation and project tools (blogs, assignment submission, Powerpoint, Prezi, Adobe Creative Suite, etc.) 31% 43% 28% 7% 22%
research tools (Chinook, pubMed, Google Scholar, Mendeley, Zotero, Evernote) 30% 33% 32% 8% 27%
online practice (problems, quizzes, simulations, games, CAPA, Pearson Mastering, etc.) 27% 20% 21% 7% 52%
data analysis tools (SPSS, R, Latex, Excel, NVivo, MATLAB, etc.) 24% 9% 23% 6% 62%
readings (online textbooks, articles, e-books) 21% 68% 23% 1% 8%

Table 7. Do you use any of these online tools in your teaching? Which are the Top 3 online tools you would like to learn about or use more? (n = 437)

Top 3 use in most of my classes have used in some classes tried, but do not use N/A: no experience using
videos/animations produced for my course (online lectures, Lecture Capture, Camtasia, Vimeo) 38% 14% 21% 11% 54%
chat-based office hours or meetings (D2L chat, Google Hangouts, texting, tutoring portals, etc.) 36% 4% 9% 10% 76%
simulations, PhET, educational games 27% 7% 17% 6% 70%
videoconferencing-based office hours or meetings (Zoom, Skype, Continuing Education’s Composition hub, etc.) 26% 4% 13% 11% 72%
alternative to D2L (moodle, Google Site, wordpress course website) 23% 11% 10% 13% 66%
D2L course platform 23% 81% 7% 4% 8%
online tutorials and trainings (OIT tutorials, Lynda.com videos) 21% 4% 16% 13% 68%
D2L as a portal to other learning tools (homework websites, videos, simulations, Nota Bene/NB, Voice Thread, etc.) 21% 28% 18% 11% 42%
videos/animations produced elsewhere 19% 40% 36% 2% 22%

In both large and small classes, the most common responses faculty make to digital distraction are to discuss why it is a problem and to limit or ban phones in class.
my note: which completely defies BYOD and turns it into empty talk / lip service.

Quite a number of other faculty (n = 18) reported putting the onus on themselves to plan engaging and busy class sessions to preclude distraction, for example:

“If my students are more interested in their laptops than my course material, I need to make my curriculum more interesting.”

“I have not found this to be a problem. When the teaching and learning are both engaged/engaging, device problems tend to disappear.”

The most common complaint related to students and technology was their lack of common technological skills, including D2L and Google, and the need to take class time to teach these skills (n = 14). Two commented that digital skills in today’s students were lower than those of their students 10 years ago.

Table 9. Which of the following are the most effective types of learning opportunities about teaching, for you? Choose your Top 2-3. (n = 473)

Count           Percentage

meeting 1:1 with an expert 296 63%
hour-long workshop 240 51%
contact an expert on-call (phone, email, etc) 155 33%
faculty learning community (meeting across a semester, e.g. ASSETT’s Hybrid/Online Course Design Seminar) 116 25%
expert hands-on support for course redesign (e.g. OIT’s Academic Design Team) 114 24%
opportunity to apply for grant funding with expert support, for a project I design (e.g. ASSETT’s Development Awards) 97 21%

half-day or day-long workshop 98 21%
other 40 8%
multi-day retreats / institutes 30 6%

Faculty indicated that the best times for them to attend teaching professional development across the year are before and early in the semester, and summer. They were split among all options for meeting across one week, but preferred afternoon sessions to mornings. Only 8% of respondents (n = 40) indicated they would not be likely to attend any professional development session (Table 10).

+++++++++++++++++++++++++++

Teaching Through Technology
http://www.maine.edu/pdf/T4FinalYear1ReportCRE.pdf

Table T1: Faculty beliefs about using digital technologies in teaching

Count Column N%
Technology is a significant barrier to teaching and learning. 1 0.2%
Technology can have a place in teaching, but often detracts from teaching and learning. 76 18.3%
Technology has a place in teaching, and usually enhances the teaching learning process. 233 56.0%
Technology greatly enhances the teaching learning process. 106 25.5%

Table T2: Faculty beliefs about the impact of technology on courses

Count Column N%
Makes a more effective course 302 72.6%
Makes no difference in the effectiveness of a course 42 10.1%
Makes a less effective course 7 1.7%
Has an unknown impact 65 15.6%

Table T3: Faculty use of common technologies (most frequently selected categories shaded)

Once a month or less A few hours a month A few hours a week An hour a day Several hours a day
Count % Count % Count % Count % Count %
Computer 19 4.8% 15 3.8% 46 11.5% 37 9.3% 282 70.7%
Smart Phone 220 60.6% 42 11.6% 32 8.8% 45 12.4% 24 6.6%
Office Software 31 7.8% 19 4.8% 41 10.3% 82 20.6% 226 56.6%
Email 1 0.2% 19 4.6% 53 12.8% 98 23.7% 243 58.7%
Social Networking 243 68.8% 40 11.3% 40 11.3% 23 6.5% 7 2.0%
Video/Sound Media 105 27.6% 96 25.2% 95 24.9% 53 13.9% 32 8.4%

Table T9: One sample t-test for influence of technology on approaches to grading and assessment

Test Value = 50
t df Sig. (2-tailed) Mean Difference 95% Confidence Interval of the Difference
Lower Upper
In class tests and quizzes -4.369 78 .000 -9.74684 -14.1886 -5.3051
Online tests and quizzes 5.624 69 .000 14.77143 9.5313 20.0115
Ungraded assessments 1.176 66 .244 2.17910 -1.5208 5.8790
Formative assessment 5.534 70 .000 9.56338 6.1169 13.0099
Short essays, papers, lab reports, etc. 2.876 70 .005 5.45070 1.6702 9.2312
Extended essays and major projects or performances 1.931 69 .058 3.67143 -.1219 7.4648
Collaborative learning projects .000 73 1.000 .00000 -4.9819 4.9819
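Tables T9 (and T14 below) report one-sample t-tests against a test value of 50; the statistic itself is simple to compute by hand. The sample ratings below are invented for illustration, not data from the report.

```python
# One-sample t statistic, t = (mean - mu0) / (s / sqrt(n)), as used in
# Tables T9 and T14 with test value mu0 = 50. The sample is made up.
import math
import statistics

def one_sample_t(sample, mu0=50):
    """t statistic for H0: population mean == mu0, with df = n - 1."""
    n = len(sample)
    mean = statistics.mean(sample)
    s = statistics.stdev(sample)  # sample (n - 1) standard deviation
    return (mean - mu0) / (s / math.sqrt(n))

ratings = [62, 55, 71, 48, 66, 59]  # hypothetical influence ratings
t = one_sample_t(ratings)
# t is about 3.06 with df = 5; comparing against the t distribution
# yields the two-tailed significance values reported in the tables.
```

Because the scale is centered on 50, a significantly positive t (e.g. "Online tests and quizzes", t = 5.624) means faculty report technology increased that activity, and a negative t (e.g. "In class tests and quizzes", t = -4.369) means it decreased.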

Table T10: Rate the degree to which your role as a faculty member and teacher has changed as a result of increased use of technology

Strongly Disagree Disagree Somewhat Disagree Somewhat Agree Agree Strongly Agree
Count % Count % Count % Count % Count % Count %
shifting from the role of content expert to one of learning facilitator 12 9.2% 22 16.9% 14 10.8% 37 28.5% 29 22.3% 16 12.3%
your primary role is to provide content for students 14 10.9% 13 10.1% 28 21.7% 29 22.5% 25 19.4% 20 15.5%
your identification with your University is increased 23 18.3% 26 20.6% 42 33.3% 20 15.9% 12 9.5% 3 2.4%
you have less ownership of your course content 26 20.2% 39 30.2% 24 18.6% 21 16.3% 14 10.9% 5 3.9%
your role as a teacher is strengthened 13 10.1% 12 9.3% 26 20.2% 37 28.7% 29 22.5% 12 9.3%
your overall control over your course(s) is diminished 23 17.7% 44 33.8% 30 23.1% 20 15.4% 7 5.4% 6 4.6%

Table T14: One sample t-test for influence of technology on faculty time spent on specific teaching activities

Test Value = 50
t df Sig. (2-tailed) Mean Difference 95% Confidence Interval of the Difference
Lower Upper
Lecturing -7.381 88 .000 -12.04494 -15.2879 -8.8020
Preparing course materials 9.246 96 .000 16.85567 13.2370 20.4744
Identifying course materials 8.111 85 .000 13.80233 10.4191 17.1856
Grading / assessing 5.221 87 .000 10.48864 6.4959 14.4813
Course design 12.962 94 .000 21.55789 18.2558 24.8600
Increasing access to materials for all types of learners 8.632 86 .000 16.12644 12.4126 19.8403
Reading student discussion posts 10.102 79 .000 21.98750 17.6553 26.3197
Email to/with students 15.809 93 .000 26.62766 23.2830 29.9724

++++++++++++++++++++++++++

Study of Faculty and Information Technology, 2014

http://net.educause.edu/ir/library/pdf/ers1407/ers1407.pdf

Although the LMS is pervasive in higher education, 15% of faculty said that they do not use the LMS at all. Survey demographics suggest these nonusers are part of the more mature faculty ranks, with a tenure status, more than 10 years of teaching experience, and a full-professor standing. The vast majority of faculty use the LMS to conduct or support their teaching activities, but only three in five LMS users (60%) said it is critical to their teaching. The ways in which faculty typically use the LMS are presented in figure 8.

Pushing out information such as a syllabus or other handout is the most common use of the LMS (58%), which is a basic functionality of the first-generation systems that emerged in the late 1990s, and it remains one of the core features of any LMS. Many institutions preload the LMS with basic course content (58%), up about 12% since 2011, and this base gives instructors a prepopulated platform from which to build their courses. Preloading basic content does not appear to preclude faculty from making the LMS part of their daily digital habit; a small majority of faculty (56%) reported using the LMS daily, and another 37% use it weekly.

+++++++++++++++++++++++++++++

Digital Literacy, Engagement, and Digital Identity Development

https://www.insidehighered.com/blogs/student-affairs-and-technology/digital-literacy-engagement-and-digital-identity-development

++++++++++++++++

more on digital literacy in this IMS blog

https://blog.stcloudstate.edu/ims?s=digital+literacy


teacher evaluation

doctoral cohort student’s request for literature: “I am looking for some more resources around the historical context of teacher evaluation.”

pre-existing bibliography:

Allen, J., Gregory, A., Mikami, A. I., Lun, J., Hamre, B., & Pianta, R. (2013). Observations of Effective Teacher-Student Interactions in Secondary School Classrooms: Predicting Student Achievement With the Classroom Assessment Scoring System—Secondary. School Psychology Review, 42(1), 76–98.

Alonzo, A. C. (2011). Learning progressions that support formative assessment practices. Measurement, 9, 124–129. http://doi.org/10.1080/15366367.2011.599629

Baker, B. D., Oluwole, J. O., & Green, P. C. (2013). The Legal Consequences of Mandating High Stakes Decisions Based on Low Quality Information: Teacher Evaluation in the Race-to-the-Top Era. Education Policy Analysis Archives, 21(5), 1–71. http://doi.org/http://epaa.asu.edu/ojs/article/view/1298

Benedict, A. E., Thomas, R. a., Kimerling, J., & Leko, C. (2013). Trends in Teacher Evaluation. Teaching Exceptional Children. May/Jun2013, 45(5), 60–68.

Bonavitacola, A. C., Guerrazzi, E., & Hanfelt, P. (2014). Teachers’ perceptions of the impact of the McREL teacher evaluation system on professional growth.

Danielson, C. (2016). Creating communities of practice. Educational Leadership, (May), 18–23.

Darling-Hammond, L., Wise, A. E., & Pease, S. R. (1983). Teacher Evaluation in the Organizational Context: A Review of the Literature. Review of Educational Research, 53(3), 285–328. http://doi.org/10.3102/00346543053003285

Darling-Hammond, L., Jaquith, A., & Hamilton, M. (n.d.). Creating a Comprehensive System for Evaluating and Supporting Effective Teaching.

Derrington, M. L. (n.d.). Changes in Teacher Evaluation: Implications for the Principal’s Work.

Gallagher, H. A. (2004). Vaughn Elementary’s Innovative Teacher Evaluation System: Are Teacher Evaluation Scores Related to Growth in Student Achievement? Peabody Journal of Education, 79(4), 79–107. http://doi.org/10.1207/s15327930pje7904_5

Hallgren, K., James-Burdumy, S., & Perez-Johnson, I. (2014). State requirements for teacher evaluation policies promoted by Race to the Top.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. http://doi.org/10.3102/003465430298487

Hazi, H. M. (n.d.). Legal Challenges to Teacher Evaluation: Pitfalls and Possibilities in the States. http://doi.org/10.1080/00098655.2014.891898

Ingle, W. K., Willis, C., & Fritz, J. (2014). Collective Bargaining Agreement Provisions in the Wake of Ohio Teacher Evaluation System Legislation. Educational Policy. http://doi.org/10.1177/0895904814559249

Marzano, R. J. (2012). The Two Purposes of Teacher Evaluation. Educational Leadership, 70(3), 14–19. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=83173912&site=ehost-live

Moskal, A. C. M., Stein, S. J., & Golding, C. (2016). Can you increase teacher engagement with evaluation simply by improving the evaluation system? Assessment & Evaluation in Higher Education. http://doi.org/10.1080/02602938.2015.1007838

Quinn, A. E. (n.d.). Looking at the bigger picture with Dr. Robert Marzano: Teacher evaluation and development for improved student learning. The Delta Kappa Gamma Bulletin.

Riordan, J., Lacireno-Paquet, N., Shakman, K., Bocala, C., & Chang, C. (2015). Redesigning teacher evaluation: Lessons from a pilot implementation. Retrieved from http://ies.ed.gov/

Taylor, E. S., & Tyler, J. H. (n.d.). Can teacher evaluation improve teaching? Evidence of systematic growth in the effectiveness of midcareer teachers.

Tuytens, M., & Devos, G. (n.d.). The problematic implementation of teacher evaluation policy: School failure or governmental pitfall? http://doi.org/10.1177/1741143213502188

Wong, W. Y., & Moni, K. (2013). Teachers’ perceptions of and responses to student evaluation of teaching: purposes and uses in clinical education. http://doi.org/10.1080/02602938.2013.844222

my list of literature:

Avalos, B., & Assael, J. (2006). Moving from resistance to agreement: The case of the Chilean teacher performance evaluation. International Journal of Educational Research, 45(4-5), 254-266.

Cowen, J. M., & Fowles, J. (2013). Same contract, different day? an analysis of teacher bargaining agreements in Louisville since 1979. Teachers College Record, 115(5)

Flippo, R. F. (2002). Repeating history: Teacher licensure testing in Massachusetts. Journal of Personnel Evaluation in Education, 16(3), 211-29.

Griffin, G. (1997). Teaching as a gendered experience. Journal of Teacher Education, 48(1), 7-18.

Hellawell, D. E. (1992). Structural changes in education in England. International Journal of Educational Reform, 1(4), 356-65.

Hibler, D. W., & Snyder, J. A. (2015). Teaching matters: Observations on teacher evaluations. Schools: Studies in Education, 12(1), 33-47.

Hill, H. C., & Grossman, P. (2013). Learning from teacher observations: Challenges and opportunities posed by new teacher evaluation systems. Harvard Educational Review, 83(2), 371-384.

Hines, L. M. (2007). Return of the thought police?: The history of teacher attitude adjustment. Education Next, 7(2), 58-65.

Kersten, T. A. (2006). Teacher tenure: Illinois school board presidents’ perspectives and suggestions for improvement. Planning and Changing, 37(3-4), 234-257.

Kersten, T. A., & Israel, M. S. (2005). Teacher evaluation: Principals’ insights and suggestions for improvement. Planning and Changing, 36(1-2), 47-67.

Korkmaz, I. (2008). Evaluation of teachers for restructured elementary curriculum (grades 1 to 5). Education, 129(2), 250-258.

Lamb, M. L., & Swick, K. J. (1975). Historical overview of teacher observation. Educational Forum.

Maharaj, S. (2014). Administrators’ views on teacher evaluation: Examining Ontario’s teacher performance appraisal. Canadian Journal of Educational Administration and Policy, (152)

Naba’h, A. A., Al-Omari, H., Ihmeideh, F., & Al-Wa’ily, S. (2009). Teacher education programs in Jordan: A reform plan. Journal of Early Childhood Teacher Education, 30(3), 272-284.

Ornstein, A. C. (1977). Critics and criticism of education. Educational Forum.

Pajak, E., & Arrington, A. (2004). Empowering a profession: Rethinking the roles of administrative evaluation and instructional supervision in improving teacher quality. Yearbook of the National Society for the Study of Education, 103(1), 228-252.

Stamelos, G., & Bartzakli, M. (2013). The effect of a primary school teachers' trade union on the formation and realisation of policy in Greece: The case of teacher evaluation policy. Policy Futures in Education, 11(5), 575-588.

Stamelos, G., Vassilopoulos, A., & Bartzakli, M. (2012). Understanding the difficulties of implementation of a teachers' evaluation system in Greek primary education: From national past to European influences. European Educational Research Journal, 11(4), 545-557.

Sullivan, J. P. (2012). A collaborative effort: Peer review and the history of teacher evaluations in Montgomery county, Maryland. Harvard Educational Review, 82(1), 142-152.

Tierney, W. G., & Lechuga, V. M. (2005). Academic freedom in the 21st century. Thought & Action, 7-22.

Turri, M. (2014). The new Italian agency for the evaluation of the university system (ANVUR): A need for governance or legitimacy? Quality in Higher Education, 20(1), 64-82.

VanPatten, J. J. (1972). Some reflections on accountability. Journal of Thought.

Vijaysimha, I. (2013). Teachers as professionals: Accountable and autonomous? Review of the report of the Justice Verma Commission on Teacher Education, August 2012, Department of School Education and Literacy, Ministry of Human Resource Development, Government of India. Contemporary Education Dialogue, 10(2), 293-299.

Vold, D. J. (1985). The roots of teacher testing in America. Educational Measurement: Issues and Practice, 4(3), 5-7.

Wermke, W., & Höstfält, G. (2014). Contextualizing teacher autonomy in time and space: A model for comparing various forms of governing the teaching profession. Journal of Curriculum Studies, 46(1), 58-80.

Ydesen, C., & Andreasen, K. E. (2014). Accountability practices in the history of Danish primary public education from the 1660s to the present. Education Policy Analysis Archives, 22(120)

Greensboro presentation

Please develop a one hour workshop for faculty on using a new (or old but new to them) technology tool. The aim is not to only show the technical operation, but the pedagogical use of the tool helping faculty think about what this might mean in their own teaching.

Short link: http://bit.ly/UNCGpres

Alternatives to the pedagogical use of BYOD

Who: students, faculty and staff
Where: TBD
When: Friday, June 17, 2016. 10-11:30 AM

announcement

5 min introduction of workshop presenter Plamen Miltenoff and workshop participants

5 min plan of the workshop

5 min introduction to the topic:

Outline
In financially sparse times for educational institutions, one viable way to save money is to rethink pedagogy and methodology and adapt them to the burgeoning number of mobile devices (BYOD) owned by students, faculty, and staff.

In 5 min,
we will play a game using Kahoot (https://kahoot.it). Kahoot is an application from Norway that is increasingly popular in K-12 and is gradually gaining momentum in higher ed.

Why Kahoot and not any of the other similar polling apps (AKA formative assessment tools), such as PollEverywhere, PollDaddy etc. (https://blog.stcloudstate.edu/ims/2016/01/13/formative-assessment-tools/)?
1. Kahoot has gained momentum; at least a third of your undergraduates have used it in high school and are familiar with the interface.
2. I personally like Kahoot for the kahoots. 🙂
3. I like its badges as an entry point to gamification. Let me know if you want to work on this topic some other time, and let's schedule work time after this session (https://blog.stcloudstate.edu/ims?s=badges).

In 10-15 min,
let's try to create an account and build our first kahoot (https://getkahoot.com/). You can use any topic; focus on the features that Kahoot provides. Split into groups and help each other; if you feel stuck, please let me know and I will do my best to help you move forward.
Here are two YouTube tutorials: how to create an account and a kahoot quiz (5 min) and how to play a kahoot (3 min): https://blog.stcloudstate.edu/ims/2016/06/13/how-to-kahoot/
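Part of what makes a kahoot feel like a game is that correct answers earn more points the faster they come in. The sketch below is a hypothetical illustration of that speed-based scoring idea, not Kahoot's actual algorithm; the function name, the 1000-point maximum, and the "half points at the deadline" floor are all assumptions for the example.

```python
# Hypothetical sketch of Kahoot-style speed-based scoring:
# a correct answer earns more points the faster it arrives.

def score(correct: bool, response_time: float, time_limit: float,
          max_points: int = 1000) -> int:
    """Award points for one quiz answer, scaled by response speed."""
    if not correct:
        return 0
    # Answering instantly earns max_points; answering right at the
    # deadline still earns half, so slower students are not wiped out.
    speed_factor = 1 - (response_time / time_limit) / 2
    return round(max_points * speed_factor)

# Example round with a 20-second time limit:
answers = [("Ana", True, 3.2), ("Ben", True, 14.0), ("Cal", False, 5.0)]
for name, correct, t in answers:
    print(name, score(correct, t, 20))   # Ana 920, Ben 650, Cal 0
```

Seeing the mechanic spelled out like this can help when discussing with faculty why students race to answer, and whether that incentive fits a given lesson.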

In 5-10 min,

let's display one or two kahoots to the entire audience and think about situations when and where such kahoots can be used for educational purposes.
Let's think about the implications that the use of kahoots on BYOD may trigger in the classroom.

Let's think about the preparation needed for the smooth use of kahoots (is the WiFi in that particular classroom robust enough to handle the activity of 20, or 200, students?).
Let's think about student engagement: what constitutes it? Would a kahoot on their BYOD be sufficient to pique their interest, and if not, what else must be added to the magic elixir?

In 5 min, let's discuss Kahoot's similarities with other educational technologies used in the classroom.

Let's assess the potential of Kahoot:
How does it compare?
How does it transfer?
Is it compatible with Canvas?
