Monday, October 15, 2012

Measuring Student Growth

Part of my job as an interventionist is to monitor progress to ensure that the interventions I am using are effective. Using my own curriculum has made this process challenging. In an attempt to match instruction with assessment, I had to decide which progress monitoring tool would work best. In past years, our school used DIBELS, and since I was primarily teaching students to blend sounds, the DIBELS Nonsense Word Fluency measure worked well. This year, we no longer use DIBELS or AIMSweb, replacing them with the Fountas & Pinnell benchmarking system. The question became: do I have students complete a fluency passage with text I have taught, or should I have them participate in a "cold read" of a text at a similar level?

Fountas & Pinnell recommend one read-through of a text, followed by a running record with that same text. I decided to go with this approach with slight modifications. There are baskets in my room with leveled readers appropriate for each student in my group. Each week, I give students the option of reading through these books during self-selected reading time, though I do not actually "teach" the book. The story songs that I am using for teaching are slightly above their instructional level; however, they are repetitive in nature, and so far, students have been successful at reading these books independently. To date, I have administered three running records for each student. The first one was with a practiced text, and the second two were at a slightly lower level but unpracticed. Here are the results:

Student 1:  9/19 - Level C (practiced)    97.4% accuracy
           10/1 - Level A (cold read)     96.0%
           10/10 - Level B (cold read)    95.0%

Student 2:  9/19 - Level C (practiced)    No accuracy reported (frustration too high)
            9/28 - Level A (cold read)    96.0%
           10/9 - Level B (cold read)     96.0%

Student 3:  9/19 - Level C (practiced)    98.4%
           10/1 - Level B (cold read)     98.0%
           10/10 - Level C (cold read)    97.0%

Student 4:  9/19 - Level C (practiced)    No accuracy reported (student is distracted)
            9/28 - Level B (cold read)    83.6%
           10/9 - Level C (cold read)     91.0%

Student 5:  9/19 - Level C (practiced)    93.5%
            9/27 - Level B (cold read)    90.0%
           10/10 - Level C (cold read)    89.7%

Student 6:  9/19 - Level C (practiced)   100.0%
            9/27 - Level D (cold read)   100.0%
           10/9 - Level E (cold read)     98.0%

I am excited that students are able to read more difficult text with little drop in accuracy. One student actually made gains with a higher-level text. There was also not a significant difference between the practiced text and the unpracticed text, which suggests that most students are able to transfer what they learned from a slightly higher-level text and apply it to one that is somewhat easier. Although there are other factors that may have impacted the results (such as student motivation, distractions in the classroom, use of picture clues, etc.), the initial results are encouraging. My plan is to administer two more running records over the next two weeks to see if students can maintain growth without losing accuracy.
