
A Pilot Test of Revision Assistant and What We Learned in the Process

Posted on April 17, 2017

Revision Assistant, part of the Turnitin family, is a comprehensive virtual writing assistant for students that allows them to digitally edit and rewrite documents for any class. Last year, Sammy Spencer, a High School English teacher in Southern California, ran a pilot program using Revision Assistant in her school. Here’s her story:

Last fall, my El Camino Real High School colleagues and I set out to change the way we teach writing. We wanted to redefine effective standards-based instruction and assessment. By the time we finished the pilot test, we had discovered that a technology tool helped us and our students in some unexpected ways. It changed our day-to-day writing instruction practices, gave students more power over their own learning, and, happily, made writing exercises more relevant and applicable for other departments, like social studies.

In 2016-17, I was the new English department chair at ECRCHS, which is a large public charter school in Los Angeles. We are fortunate in that we have a lot of academic freedom, but since this is an accreditation year, we have to be sure we have data to prove we are meeting our learning objectives.

This year, I needed to help our English department implement shifts in writing methods directed by the Common Core State Standards (CCSS). We also needed new pedagogical approaches that would yield data to measure progress. Our literacy coach and English teacher, Heidi Crocker, found a product from Turnitin – Revision Assistant – that used a powerful technology to assess writing and would turn the data it uncovered into feedback that students could apply to their essays immediately. We decided to give it a try.

We took a measured approach and piloted Revision Assistant in August 2016 with a small group of English and History teachers. At around the same time, our administration asked us to align department objectives so that writing instruction reflected the CCSS and Smarter Balanced-style prompts. We needed benchmark assessments that would not only measure student achievement but also drive instruction.

Here’s what we did. First, we used Revision Assistant’s Spot Check assignments to get a snapshot of students’ writing skills before direct instruction began. These instant scores on specific rubric traits gave us baseline pre-assessment data we could use to measure student progress over time, and they allowed our team to mix and match units. From that first Revision Assistant exercise, we also pulled exemplar essays to use in self- and peer-review assignments. The exemplars gave students models and clarified the language and meaning of the rubrics, and the exercise gave us initial essay scores for comparing pre- and post-performance.

My sophomores were the first ones to try out Revision Assistant’s Signal Check and its instant feedback in the school. They dove right into it, happily writing essays and reviewing rubrics to help guide their writing.

This is where the aha moment hit me.


My students and I began comparing papers and different writing approaches by using the rubric scores Revision Assistant automatically provided. We could all immediately see what types of evidence, use of language, organization, etc. led to higher scores. Right away, I recognized that my students had just engaged in the holy grail objective of an ELA teacher: the ability to create self-directed, reflective learners who use inquiry and research to communicate and solve problems. If their writing skills also improved in this process, then that would be icing on the proverbial cake.

In fact, their scores did improve. And, closer to our original goal, the teachers using Revision Assistant in my department ended up with more cohesive writing instruction plans, along with rubric-aligned writing scores that we could confidently say matched CCSS metrics. More importantly, our students were writing more each day. They learned to use Revision Assistant’s machine-generated scores and feedback to take control of their own learning, not the other way around, as some teachers had feared might happen when using technology to support writing instruction. Our students using Revision Assistant wrote more words per assignment, submitted more drafts, and improved their scores on specific, rubric-aligned essay traits more than ever before.

As a natural offshoot of their growth, students have taken over some of the data analysis as well. They compare their own scores with their peers’ scores and with the computer-generated numbers. As they look for incongruities and outliers, they ask questions about the rubric and redirect their own practice with little to no prompting. In essence, students are conducting their own meta-analysis of their data in their English classes.

These have been the positives of using technology to support writing instruction and guide our curriculum team, but there is one limitation, as one of my students pointed out: Revision Assistant, or any technology that evaluates writing, is unable to assess the human writer. This is a fair criticism. But for our teachers, that consistent objectivity is what democratizes writing instruction and makes it equitable. The algorithm always evaluates each essay in the same way, using the same criteria. In this light, our teachers were relieved of the daunting formative assessment process and free to direct their time and energy toward using the assessment data to drive daily instruction.

As we near the end of our pilot, our school is exploring how to integrate Revision Assistant into our classrooms more permanently. For my part, I can easily envision using at least one Spotlight assessment per grade, the mix-and-match curriculum that my English teachers have already modeled out, and the coaching and practice that Signal Check assignments provide. We have always understood the importance of writing well; now we have also learned that standards-aligned formative assessments can drive instruction and empower our young writers with the skills of self-directed learning.

For more on Revision Assistant, visit their website and/or check out the review I did over here at Ask a Tech Teacher.

Author: Sammy Spencer is an English teacher, head of the English department, and an Instructional Coach at El Camino Real Charter High School in Southern California. She is in the process of becoming a National Board Certified teacher and holds a Master’s degree in Ed Tech from Cal Lutheran University. She played Division 1 soccer at the University of Pennsylvania and has visited over 40 countries around the world.
