Posts Tagged ‘assessment’
We know that many of you have been interested in exploring Turnitin in the past, so we are excited to bring you an exclusive standardized price and more information on the rollout of Feedback Studio, which replaces the Turnitin you have previously seen. We would like to share some exciting accessibility updates, and show how Feedback Studio can help faculty deliver formative feedback and help students become better writers. Starting today through December 31st, non-integrated Feedback Studio will be $2.50 and integrated Feedback Studio will be $3 for new customers! Confused by the name? Don’t be! Turnitin is new and improved! Check out this video to learn about Feedback Studio!
Meet your exclusive Turnitin Team!
Ariel Ream – Account Executive, Indianapolis email@example.com – 317.650.2795
Juliessa Rivera – Relationship Manager, Oakland firstname.lastname@example.org – 510.764.7698
Juan Valladares – Account Representative, Oakland email@example.com – 510.764.7552
To learn more, please join us for a WebEx on September 21st. We will be offering free 30-day pilots to anyone who attends!
Wednesday, September 21, 2016
11:00 am | Central Daylight Time (Chicago) | 1 hr
Meeting number (access code): 632 474 162
my notes from the webinar
I am prejudiced against TI and I am not hiding it; that does not mean that I am wrong.
For me, Turnitin (TI) is an anti-pedagogical “surfer,” using the hype of “technology” to ride the wave of overworked faculty, who hope to streamline an increasing workload with technology instead of working on pedagogical resolutions to issues that are not that new.
Lo and behold, Juan, the TI presenter, is trying to dazzle me with stuff that stopped dazzling me a long time ago.
WCAG 2.0 AA standards of the W3C and Section 508 of the Rehabilitation Act.
the sales pitch: 79% of students believe in feedback, but only 50%+ receive it. His source is Turnitin surveys from 2012 to 2016, shown in a very, very small font size (ashamed of it?).
It seems to me very much like “massaged” data.
Testimonials: one professor and one student. Ha. The apex of qualitative research…
next sales pitch: Turnitin Feedback Studio, no longer the old Classic. It assesses originality. Drag-and-drop macro-style notes. Pushing rubrics, but we are still fighting for rubrics in D2L. What if we have a large number of adjuncts? Ha, another gem: “I know that you guys are IT folks.” So the IT folks are the Trojan horse to get the faculty on board. put comments on
This presentation is structured dangerously askew: IT people, but no faculty. If faculty were present, they would object that they ARE capable of doing the very thing that is proposed to be automated.
Moreover, why do I have to pay for another expensive piece of software if we have already paid for Microsoft? MS Word can do everything that has been presented so far. Between MS Word and D2L, it becomes redundant.
And why the heck am I interested in middle school and high school?
TI was sued for illegal collection of papers; papers are stored in their database without the consent of the students who wrote them. TI goes to “great length to protect the identity of the students,” but still collects their work [illegally?]
November 10 – 30 day free trial
otherwise, $3 per student. That prompts the question again: between Google, MS Word and D2L (for which we already pay heftily), why pay another exorbitant price?
D2L integration: a version that does not work; LTI.
“small price to pay for such a beauty” – it does not matter how quick and easy the integration is; it is a redundancy that can already be resolved with existing tools, some of which we pay a hefty price for
Quantile Measures for Math Added to Kansas Student Assessments
By Dian Schaffhauser 05/27/16
There are two types of Lexile measures: a person’s reading ability and the text’s difficulty. Students who are tested against state standards receive a Lexile reader measure from the Kansas Reading Assessment. Books and other texts receive a Lexile text measure from a MetaMetrics software tool called the Lexile Analyzer, which describes the book’s reading demand or complexity. When used together, the two measures are intended to help match a reader with reading material that is at an appropriate difficulty or will at least help give an idea of how well a reader should comprehend text. The reader should encounter some level of difficulty with the text, but not enough to get frustrated. The Lexile reader measure is used to monitor reader progress.
My note: is this another way to replace humans as educators? Or is it a supplemental approach to improve students’ reading abilities?
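To make the matching of the two measures concrete, here is a minimal sketch in Python. It assumes the commonly cited guideline of a targeted reading range running from roughly 100L below to 50L above the reader’s measure; the exact band is a MetaMetrics design choice, so treat these numbers as adjustable placeholders.

```python
def lexile_match(reader_measure, text_measure, below=100, above=50):
    """Return True if the text falls in the reader's targeted range.

    Assumes the commonly cited band of ~100L below to ~50L above
    the reader's Lexile measure; the real guideline may differ.
    """
    return reader_measure - below <= text_measure <= reader_measure + above

def recommend(reader_measure, catalog):
    """Filter a catalog of (title, text_measure) pairs to the targeted range."""
    return [title for title, measure in catalog
            if lexile_match(reader_measure, measure)]

# Hypothetical catalog: titles with their Lexile text measures.
catalog = [("Book A", 620), ("Book B", 780), ("Book C", 900)]
print(recommend(800, catalog))  # only Book B falls within 700L-850L
```

The point of the sketch is the pairing: a reader measure and a text measure only become useful when combined into a range that challenges without frustrating.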
Digital Badges in Education: Trends, Issues, and Cases.
In recent years, digital badging systems have become a credible means through which learners can establish portfolios and articulate knowledge and skills for both academic and professional settings. Digital Badges in Education provides the first comprehensive overview of this emerging tool. A digital badge is an online-based visual representation that uses detailed metadata to signify learners’ specific achievements and credentials in a variety of subjects across K-12 classrooms, higher education, and workplace learning. Focusing on learning design, assessment, and concrete cases in various contexts, this book explores the necessary components of badging systems, their functions and value, and the possible problems they face. These twenty-five chapters illustrate a range of successful applications of digital badges to address a broad spectrum of learning challenges and to help readers formulate solutions during the development of their digital badges learning projects.
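The “detailed metadata” behind a badge can be made concrete with a short sketch. The dictionary below loosely follows the Open Badges vocabulary (BadgeClass, Assertion, issuer, criteria, evidence); all names, emails, and URLs are hypothetical placeholders, not a definitive implementation of any badging platform.

```python
import json

# Illustrative badge metadata, loosely following the Open Badges
# vocabulary. Every name and URL here is a hypothetical placeholder.
badge_class = {
    "type": "BadgeClass",
    "name": "Information Literacy: Source Evaluation",
    "description": "Awarded for evaluating sources against stated criteria.",
    "criteria": {"narrative": "Student evaluated five sources in a portfolio."},
    "issuer": {"type": "Issuer", "name": "Example University Library"},
}

assertion = {
    "type": "Assertion",
    "recipient": {"type": "email", "identity": "learner@example.edu"},
    "badge": badge_class,
    "issuedOn": "2016-09-21T00:00:00Z",
    "evidence": "https://example.edu/portfolio/source-eval",
}

print(json.dumps(assertion, indent=2))
```

The design point is that the visual badge image is only the surface; the machine-readable assertion (who earned what, from whom, on what evidence) is what lets badges travel between K-12, higher education, and workplace settings.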
Badges and Leaderboards: Professional Developments for Teachers in K12
Why should I bother earning badges?
issues to consider:
More on badges and gaming in education in this IMS blog:
ACRL e-Learning webcast series: Learning Analytics – Strategies for Optimizing Student Data on Your Campus
This three-part webinar series, co-sponsored by the ACRL Value of Academic Libraries Committee, the Student Learning and Information Committee, and the ACRL Instruction Section, will explore the advantages and opportunities of learning analytics as a tool which uses student data to demonstrate library impact and to identify learning weaknesses. How can librarians initiate learning analytics initiatives on their campuses and contribute to existing collaborations? The first webinar will provide an introduction to learning analytics and an overview of important issues. The second will focus on privacy issues and other ethical considerations as well as responsible practice, and the third will include a panel of librarians who are successfully using learning analytics on their campuses.
Webcast One: Learning Analytics and the Academic Library: The State of the Art and the Art of Connecting the Library with Campus Initiatives
March 29, 2016
Learning analytics are used nationwide to augment student success initiatives as well as bolster other institutional priorities. As a key aspect of educational reform and institutional improvement, learning analytics are essential to defining the value of higher education, and academic librarians can be both of great service to and well served by institutional learning analytics teams. In addition, librarians who seek to demonstrate, articulate, and grow the value of academic libraries should become more aware of how they can dovetail their efforts with institutional learning analytics projects. However, all too often, academic librarians are not asked to be part of initial learning analytics teams on their campuses, despite the benefits of library inclusion in these efforts. Librarians can counteract this trend by being conversant in learning analytics goals, advantages/disadvantages, and challenges as well as aware of existing examples of library successes in learning analytics projects.
Learn about the state of the art in learning analytics in higher education with an emphasis on 1) current models, 2) best practices, 3) ethics, privacy, and other difficult issues. The webcast will also focus on current academic library projects and successes in gaining access to and inclusion in learning analytics initiatives on their campus. Benefit from the inclusion of a “short list” of must-read resources as well as a clearly defined list of ways in which librarians can leverage their skills to be both contributing members of learning analytics teams, suitable for use in advocating on their campuses.
Open Academic Analytics Initiative
where data comes from:
- student information systems (SIS)
- Video streaming and web conferencing
- Co-curricular and extra-curricular involvement
D2L Degree Compass
Predictive Analytics Reporting (PAR) – was open, but just bought by Hobsons (https://www.hobsons.com/)
IMS Caliper Enabled Services: a way to connect the library to campus analytics https://www.imsglobal.org/activity/caliperram
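For a sense of what wiring library activity into campus analytics via Caliper involves, here is a rough sketch of a Caliper-style event payload. The field names approximate the IMS Caliper information model (actor, action, object, eventTime), but the identifiers, URLs, and endpoint are hypothetical, not taken from any real deployment.

```python
import json
from datetime import datetime, timezone

# Rough sketch of a Caliper-style learning event: a student opening a
# library resource. Field names approximate the IMS Caliper model;
# all identifiers below are hypothetical placeholders.
event = {
    "@context": "http://purl.imsglobal.org/ctx/caliper/v1p1",
    "type": "NavigationEvent",
    "actor": {"id": "https://example.edu/users/554433", "type": "Person"},
    "action": "NavigatedTo",
    "object": {
        "id": "https://library.example.edu/guides/info-literacy",
        "type": "WebPage",
        "name": "Information Literacy Guide",
    },
    "eventTime": datetime.now(timezone.utc).isoformat(),
}

# A campus "event store" would receive this serialized as JSON over HTTP.
payload = json.dumps(event)
print(payload[:60] + "...")
```

The value for the library is that its touchpoints (guide views, database searches, instruction sessions) land in the same event store as LMS activity, so they can appear in institution-wide analyses.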
students’ opinions of this process
benefits: self-assessment, personal learning, empowerment
analytics and data privacy – students are OK with harvesting the data (only 6% not happy)
8 in 10 are interested in personal dashboard, which will help them perform
Big Mother vs. Big Brother: creepy vs. helpful. Tracking classes is helpful; tracking out of class (where on campus, social media, etc.) is creepy. 87% see having access to their data as positive
recognize metrics, assessment, analytics, data, visualization, data literacy, data science, interpretation
INSTRUCTION DEPARTMENT – N.B.
determine who is the key leader: director of institutional research, president, CIO
who does analytics services: institutional research, information technology, a dedicated center
analytic maturity: data-driven decision-making culture; senior leadership commitment; supporting policy (data collection, access, use); data efficacy; investment and resources; staffing; technical infrastructure; information technology interaction
student success maturity: senior leadership commitment; funding of student success efforts; mechanisms for making student success decisions; interdepartmental collaboration; understanding of student success goals; advising and student support ability; policies; information systems
developing learning analytics strategy
understand institutional challenges; identify stakeholders; identify inhibitors/challenges; consider tools; scan the environment and see what others have done; develop a plan; communicate the plan to stakeholders; start small and build
ways librarians can help
identify institutional partners; be the partners; hone relevant learning analytics skills; participate in institutional analytics; identify questions and problems; assess and work to improve institutional culture; volunteer to be early adopters;
questions to ask: environmental scanning
do we have a learning analytics system? does our culture support it? are leaders present? what do stakeholders need to know?
questions to ask: Data
questions to ask: Library role
learning analytics & the academic library: the state of the art of connecting the library with campus initiatives
causation versus correlation studies. The speaker claims that it is difficult to establish a causation argument; institutions try to predict as accurately as possible via correlation, rather than claiming “if you do X, then Y will happen.”
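The speaker’s point about prediction via correlation can be made concrete with a small sketch using hypothetical numbers: a strong correlation between library visits and GPA supports prediction, but it does not license the causal claim “visiting the library raises GPA” (motivated students may simply do both).

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: library visits per term vs. GPA for six students.
visits = [2, 5, 8, 12, 15, 20]
gpa = [2.4, 2.8, 3.0, 3.2, 3.5, 3.7]

r = pearson(visits, gpa)
# r is strongly positive here, which justifies using visits as a
# predictor of GPA -- but NOT the claim that visits cause higher GPA.
print(round(r, 2))
```

This is exactly the distinction the speaker draws: institutions act on correlations because they predict well, while a causal “if you do that, then X will happen” claim would require a controlled study.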
More on analytics in this blog:
6 Strategies for Differentiated Instruction in Project-Based Learning
1. Differentiate Through Teams
2. Reflection and Goal Setting
3. Mini-Lessons, Centers, and Resources
4. Voice and Choice in Products
5. Differentiate Through Formative Assessments
6. Balance Teamwork and Individual Work
Ten Teaching Trends from the Innovating Pedagogy Report
The 2015 Innovating Pedagogy Report proposes ten innovations that explore ways of teaching, learning, and assessment for an interactive, engaged world.
Based on the literature regarding games, gaming, gamification, game-based learning, and serious games, several clear trends emerge:
- Gaming and gamification in the sense of game-based learning is about using games and game-like tactics in the education process, for greater engagement and better learning outcomes. However, this is only the first level of such an initiative. The second and higher level is about involving students in the game-building and gamification of the learning process (as per Vygotsky’s Zone of…), thus achieving student-centered and experiential learning.
- When hosting games and gaming in any library, “in-person” or electronic/online games are welcome but not sufficient to fulfill their promise, especially in an academic library. Per (1), an academic library has the responsibility to involve students and guide them in learning how to engage in the building process required in true game-based learning.
- Game-based learning, and gaming and gamification in particular, in educational (academic library) settings must consider mobile devices, and the BYOD movement in particular, as intrinsic parts of the entire process. Approaching the initiative primarily by acquiring online or “in-person” games, or game consoles, has the same limited educational potential as merely hosting games, rather than elevating the students to full guided engagement with game-based learning. If public relations and a raised profile are the main goals for the academic library, such an approach is justified. If the academic library seeks to maximize the value of game-based learning, then the library must consider: a. gaming consoles, b. mobile devices as part of a BYOD initiative and c. cloud-based / social games, such as MineCraft, SimCity etc.
- Design for game-based learning, gaming and gamification in educational (academic library) settings must include multiple forms of assessment and reward, e.g. badges, leaderboards and/or certificates as an intrinsic part of the entire process. Merely hosting games in the academic library cannot guarantee true game-based learning. The academic library, as the forefront of a game-based learning initiative on campus, must work with faculty on understanding and fine tuning badges and similar new forms of assessment and reward, as they effectively implement large scale game-based learning, focused on the students’ learning gains.
Recommendations for LRS
- In regard to LRS, the gaming and gamification process must be organized and led by faculty, including housing and distributing the hardware, software and applications, when needed.
- The attached paper and the respective conclusions, summarized in four points, demand an educational and experiential background that is beyond the limits of the LRS staff. In addition, the LRS staff has clearly admitted that the pedagogical value of gaming and gamification is beyond their interest. This recommendation does not contradict the fact that LRS staff have the opportunity to participate in and contribute to the process; it merely negates the possibility of staff mandating and leading the process, since that would keep the gaming and gamification process at a very rudimentary level.
- The process must further be led by faculty with a terminal degree in education (Ph.D.) and experience in the educational field, since, as shown by the attached paper and its four-point conclusion, the goal is not a public-library type of hosted activity, but rather involving students in a pedagogically sound creative process, with the respective opportunity for assessment and future collaboration with instructors across campus. This recommendation does not contradict the fact that LRS library faculty have the opportunity to participate actively in and contribute to the process. It simply safeguards against restricting the process to the realm of “public-library” hosting activities and thereby failing to elevate it to the needs of an academic campus and to connect with instructors across campus.
- These conclusions adhere to and are derived from the document recommended by the LRS dean, discussed and accepted by LRS faculty in 2013, about new trends and directions in academic libraries, namely diversification of LRS faculty: breaking from the traditional library mold by including faculty from different disciplines with different opinions and ideas.
Please see the link to the PDF file.
Here some opinions from the comments section:
Formative assessments are only good if you use them to alter your teaching or if students use them to adjust their learning. Too often, I’ve seen exit tickets used and nothing done with the results.
Please consider other IMS blog postings on assessment
A Survey of the Electronic Portfolio Market Sector: Analysis and Surprising Trends
FolioTek, Columbia, Missouri, ePortfolio launch in 2001. Sells in U.S. with interest in expanding globally.
Livetext, LaGrange, IL, founded in 1998. New product: Field Experience Module. Smart phone app: iPad, iPhone, Android. Mostly U.S., but expanding in South America and the Middle East. Easy tie-in to accreditation agencies and their standards. Individual accounts. New release start of 2012. Started in K-12, moved focus to higher education, now exploring K-12 once again, starting with teacher education.
RCampus, produced by Reazon Systems, Santa Ana, CA. Software development started in 1999,
Desire2Learn, Kitchener, Ontario also Baltimore, MD, with offices around the world, founded in 1999. Sells worldwide, latest release for the electronic portfolio (ver. 3.5) was in August 2011. Electronic portfolio and the D2L LMS are bundled; each leverages functionalities from the other. ePortfolio moving to hosting service and individual accounts soon.
Digication, Providence, RI and Palo Alto, CA, founded 2002. Is in partnership with Google Apps. Individual accounts; institution keeps assessment data; individual keeps ePortfolio functionality. Through Google Apps: free digital accounts with Digication (no assessment management functions with these accounts). “Three or four clicks and Digication is enabled.” Almost daily updates. Smart phone app: IOS and Android. Contact firstname.lastname@example.org.
Learning Objects, producers of Campus Pack, in Washington, DC, with employees around the world, founded in 2003.
TaskStream, New York City, organized 1998, founded 2000, markets internationally, versions available in a variety of languages. Offers separate platforms, AMS (Accountability Management System) and LAT (Learning Achievement Tools); each is multi-component.
Longsight, based in Ohio with offices in NY, IN, OH, WI, and CA, founded in 1978, a service provider for open source solutions. Supports both the Open Source Portfolio (OSP) and Sakai, within which OSP is embedded.
Chalk & Wire, Ridgeway, Ontario, Canada;
NobleHour, produced by TreeTop Software, in Lakeland, FL, founded in 2011
Sherston, Tag Developments, the assessment division of Sherston Software, Ltd., providers of Red Pen Tool: http://www.maps-ict.com/redpentool.mov, of LiveAssess: http://www.maps-ict.com/liveassess.mov, and of MAPS 3: http://www.maps-ict.com/maps3.mov.
PebblePad from PebbleLearning, in Telford, UK, with office in Australia, founded in 2003. Most popular ePortfolio in the U.K. and Australia,
Symplicity, in Arlington, VA, offers an electronic portfolio (http://www.symplicity.com/reflection) but it is only one among dozens of products that Symplicity offers–all of them are management tools for higher education (see http://www.symplicity.com/products). Good example of separating products to support a single function.
eFolioWorld, technology from Avenet, the Minnesota Colleges and Universities portfolio system,
iWebFolio, from Nuventive. Also known for TracDat, marketed since the 1990s, Nuventive founded 2000.
p. 10 and p. 18 offer questionnaires for assessment
p. 3 questionnaire p. 5