RTI Toolkit: A Practical Guide for Schools
RTI: Assessment & Progress-Monitoring
Jim Wright, Presenter
August 2009
Hudson River Teacher Center, Yorktown Heights, NY

Jim Wright
364 Long Road
Tully, NY 13159
Email: [email protected]

Resources from Workshop Available at: http://www.interventioncentral.org/RTIToolkit.php

6 CBM Reading Assessment

Jim Wright, Presenter www.interventioncentral.org 2

‘RTI-Ready’ Literacy Measures: A Guide for Schools

Initial Sound Fluency (Phonemic Awareness)
Time: 3 minutes. Administration: 1:1
Description: The student is shown a collection of 4 pictures, each depicting an object that begins with a different letter sound. The examiner gives the student a letter sound and asks the student to select from the collection the picture of the object that begins with that letter sound. The process is repeated with new sets of pictures for the duration of the monitoring period.
Sources for This Measure
• DIBELS (https://dibels.uoregon.edu/). [Free]. Administration Range: Pre-K through middle of Kindergarten.

Phoneme Segmentation (Phonemic Awareness)
Time: 1 minute. Administration: 1:1
Description: The student is read a list of words containing 2 or more phonemes. For each word, the student is asked to recite all of the phonemes that make up the word.
Sources for This Measure
• DIBELS (https://dibels.uoregon.edu/). [Free]. Administration Range: Middle of Kindergarten through end of Grade 1.
• AimsWeb (http://www.aimsweb.com/). [Pay]. Administration Range: Middle of Kindergarten through middle of Grade 1.
• Easy CBM (http://www.easycbm.com/). [Free]. Administration Range: Kindergarten and Grade 1.

Letter Naming Fluency (Alphabetics)
Time: 1 minute. Administration: 1:1
Description: The student is presented with a list of randomly arranged letters. The student names as many letters as possible.
Sources for This Measure
• Easy CBM (http://www.easycbm.com/). [Free]. Administration Range: Kindergarten and Grade 1.
• DIBELS (https://dibels.uoregon.edu/). [Free]. Administration Range: Beginning of Kindergarten through beginning of Grade 1.
• AimsWeb (http://www.aimsweb.com/). [Pay]. Administration Range: Beginning of Kindergarten through beginning of Grade 1.
• Intervention Central (http://www.rti2.org/rti2/letterNamings). [Free]. Site provides an online application (‘Letter Naming Fluency Probe Generator’) that creates randomly generated sets of uppercase, lowercase, and mixed-case letters in English and Spanish for Letter Naming Fluency assessments.

Letter Sound Fluency (Alphabetics)
Time: 1 minute. Administration: 1:1
Description: The student is presented with a list of randomly arranged letters. The student gives the sounds of as many letters as possible.
Sources for This Measure
• Easy CBM (http://www.easycbm.com/). [Free]. Administration Range: Kindergarten and Grade 1.
• EdCheckup (http://www.edcheckup.com/). [Pay]. Administration Range: Information unavailable.
• AimsWeb (http://www.aimsweb.com/). [Pay]. Administration Range: Middle of Kindergarten through beginning of Grade 1.

Nonsense Word Fluency (Alphabetics)
Time: 1 minute. Administration: 1:1
Description: The student is shown a list of short nonsense words. For each word, the student is to read the word or give the sounds that make up the word.
Sources for This Measure
• DIBELS (https://dibels.uoregon.edu/). [Free]. Administration Range: Middle of Kindergarten through middle of Grade 2.
• AimsWeb (http://www.aimsweb.com/). [Pay]. Administration Range: Middle of Kindergarten through end of Grade 1.

Word Identification Fluency (Alphabetics)
Time: 1 minute. Administration: 1:1
Description: The student is presented with a list of words randomly selected from a larger word list (e.g., Dolch Wordlist). The student reads as many words as possible.
Sources for This Measure
• EdCheckup (http://www.edcheckup.com/). [Pay]. Administration Range: Information unavailable.
• Easy CBM (http://www.easycbm.com/). [Free]. Administration Range: Kindergarten through Grade 3.
• Intervention Central (http://www.interventioncentral.org). [Free]. Site provides an online application (‘CBM List Builder’) that creates randomly generated Word Identification Probes based on the Dolch Wordlist.

Oral Reading Fluency (Fluency With Text)
Time: 1 minute. Administration: 1:1
Description: The student reads aloud from a passage and is scored for fluency and accuracy. Passages are controlled for level of reading difficulty.
Sources for This Measure
• AimsWeb (http://www.aimsweb.com/). [Pay]. Administration Range: Grade 1 through Grade 8.
• DIBELS (https://dibels.uoregon.edu/). [Free]. Administration Range: Middle of Grade 1 through Grade 6.
• iSteep (http://www.isteep.com/). [Pay]. Administration Range: Grade 1 through Grade 5 (progress-monitoring).
• Easy CBM (http://www.easycbm.com/). [Free]. Administration Range: Grade 1 through Grade 8.
• EdCheckup (http://www.edcheckup.com/). [Pay]. Administration Range: Information unavailable.
• Intervention Central (http://www.rti2.org/rti2/oralReadings). [Free]. Site provides an online application that creates an oral reading fluency probe based on text typed in by the user.

Maze (Comprehension)
Time: 1-3 minutes. Administration: Group
Description: The student is given a passage in which every 7th word has been removed. The student reads the passage silently. Each time the student comes to a removed word, the student chooses from among 3 replacement words: the correct word and two distractors. The student circles the replacement word that he or she believes best restores the meaning of the text.
Sources for This Measure
• AimsWeb (http://www.aimsweb.com/). [Pay]. Administration Range: Grade 1 through Grade 8.
• iSteep (http://www.isteep.com/). [Pay]. Administration Range: Grade 1 through Grade 6 (progress-monitoring).
• EdCheckup (http://www.edcheckup.com/). [Pay]. Administration Range: Information unavailable.
• Intervention Central (http://www.rti2.org/rti2/mazes). [Free]. Site provides an online application that creates a maze passage probe based on text typed in by the user.

Evaluate the ‘RTI Readiness’ of Your School’s Academic Measures

Directions. Use the questionnaire below to evaluate the ‘RTI readiness’ of any academic measure. Note that questions on the form are hierarchically organized: if items earlier in the survey are endorsed ‘no’, the measure probably cannot be used for the more advanced applications that appear later in the survey. Use the table Interpreting the Results of This Survey below to identify the appropriate uses for your measure in the RTI problem-solving process.

Name of Measure: _________________________________________________________________________________

(Rate each item below N or Y.)

Background: Validity
1. Content Validity. Does the measure provide meaningful information about the academic skill of interest?
2. Convergent Validity. Does the measure yield results that are generally consistent with other well-regarded tests designed to measure the same academic skill?
3. Predictive Validity. Does the measure predict student success on an important future test, task, or other outcome?

Baseline: Reliability
4. Test-Retest/Alternate-Form Reliability. Does the measure have more than one version or form? If two alternate, functionally equivalent versions of the measure are administered to the student, does the student perform about the same on both?
5. Interrater Reliability.
When two different evaluators observe the same student’s performance and independently use the measure to rate that performance, do they come up with similar ratings?

Benchmarks & Goal-Setting
6. Performance Benchmarks. Does the measure include benchmarks or other performance criteria that indicate typical or expected student performance in the academic skill?
7. Goal-Setting. Does the measure include guidelines for setting specific goals for improvement?

Progress-Monitoring and Instructional Impact
8. Repeated Assessments. Does the measure have sufficient alternative forms to assess the student weekly for at least 20 weeks?
9. Equivalent Alternate Forms. Are the measure’s repeated assessments (alternative forms) equivalent in content and level of difficulty?
10. Sensitive to Short-Term Student Gains. Is the measure sensitive to short-term improvements in student academic performance?
11. Positive Impact on Learning. Does research show that the measure gives teachers information that helps them to make instructional decisions that positively impact student learning?

Interpreting the Results of This Survey of Your Academic Measure:
• YES to Items 1-3: Background. The measure gives valid general information about the student’s academic skills and performance. While not sufficient on its own, the data can be interpreted as part of a larger collection of student data.
• YES to Items 4-5: Baseline. The measure gives reliable results when given by different people and at different times of the day or week. Therefore, the measure can be used to collect a current ‘snapshot’ of the student’s academic skills prior to starting an intervention.
• YES to Items 6-7: Goal-Setting. The measure includes standards (e.g., benchmarks or performance criteria) for ‘typical’ student performance (e.g., at a given grade level) and guidelines for estimating rates of student progress.
Schools can use the measure to assess the gap in performance between a student and grade-level peers, and also to estimate expected rates of student progress during an intervention.
• YES to Items 8-11: Progress Monitoring. The measure has the appropriate qualities to be used to track student progress in response to an intervention.

Comparing Reading Measures for ‘RTI Readiness’

School: _________________________________ Date: _______________
Person(s) Completing Ratings: __________________________________________

Directions: Use this form to compare reading measures in your school (phonemic awareness/alphabetics, fluency with text, vocabulary, comprehension) for qualities of ‘RTI readiness’. Put an ‘X’ in a column if the measure has that measurement quality. (Consult the form Evaluate the ‘RTI Readiness’ of Your School’s Academic Measures for a more detailed description of each measurement quality.) One row is provided per measure, under ‘Name of Measure’; the rating columns are:
• Background: Validity. 1. Content Validity; 2. Convergent Validity; 3. Predictive Validity
• Baseline: Reliability. 4. Test-Retest/Alternate-Form Reliability; 5. Interrater Reliability
• Goal-Setting. 6. Performance Benchmarks; 7. Goal-Setting
• Progress-Monitoring. 8. Repeated Assessments; 9. Equivalent Alternate Forms; 10. Sensitive to Short-Term Student Gains; 11. Positive Impact on Learning

Administration of CBM reading probes

The examiner and the student sit across the table from each other. The examiner hands the student the unnumbered copy of the CBM reading passage. The examiner takes the numbered copy of the passage, shielding it from the student’s view. The examiner says to the student:

“When I say ‘start,’ begin reading aloud at the top of this page. Read across the page [demonstrate by pointing]. Try to read each word. If you come to a word you don’t know, I’ll tell it to you. Be sure to do your best reading. Are there any questions? [Pause] Start.”
The examiner begins the stopwatch when the student says the first word. If the student does not say the initial word within 3 seconds, the examiner says the word and starts the stopwatch. As the student reads along in the text, the examiner records any errors by marking a slash (/) through the incorrectly read word. If the student hesitates for 3 seconds on any word, the examiner says the word and marks it as an error. At the end of 1 minute, the examiner says “Stop” and marks the student’s concluding place in the text with a bracket ( ] ).

Scoring

Reading fluency is calculated by first determining the total words attempted within the timed reading probe and then deducting from that total the number of incorrectly read words. The following scoring rules will aid the instructor in marking the reading probe:

The following are scored as correct:
• Self-corrected words are counted as correct.
• Repetitions are counted as correct.
• Examples of dialectical speech are counted as correct.
• Inserted words are ignored.

The following are scored as errors:
• Mispronunciations are counted as errors.
  Example: Text: “The small gray fox ran to the cover of the trees.” Student: “The smill gray fox ran to the cover of the trees.”
• Substitutions are counted as errors.
  Example: Text: “When she returned to the house, Grandmother called for Franchesca.” Student: “When she returned to the home, Grandmother called for Franchesca.”
• Omissions are counted as errors.
  Example: Text: “Anna could not compete in the last race.” Student: “Anna could not in the last race.”
• Transpositions of word-pairs are counted as 1 error.
  Example: Text: “She looked at the bright, shining face of the sun.” Student: “She looked at the shining bright face of the sun.”
• Words read to the student by the examiner after 3 seconds have gone by are counted as errors.
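The fluency calculation described above (total words attempted minus errors, over a one-minute sample) can be sketched as a short Python helper; the function name and example values are illustrative only, not part of the published CBM materials:

```python
def score_reading_probe(total_words_attempted, errors, minutes=1.0):
    """Score a CBM oral reading fluency probe.

    Correctly read words (CRW) = total words attempted minus errors
    (mispronunciations, substitutions, omissions, words supplied by
    the examiner after 3 seconds; each transposed word-pair adds 1).
    Returns (CRW per minute, percent accuracy).
    """
    crw = total_words_attempted - errors
    crw_per_min = crw / minutes
    pct_accuracy = round(100.0 * crw / total_words_attempted)
    return crw_per_min, pct_accuracy

# A student who attempts 77 words with 3 errors in 1 minute
# reads 74 correct words per minute at 96% accuracy.
print(score_reading_probe(77, 3))  # -> (74.0, 96)
```

The same two numbers (CRW per minute and percent accuracy) are what the record forms later in this handout abbreviate as CRW and %CRW.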
Student Record Form: Curriculum-Based Measurement: Oral Reading Fluency

Student Name: ____________________________ Grade/Classroom: ________________ Rdng Skill Level: _________

Step 1: Conduct a Survey-Level Assessment: Use this section to record the student’s reading rates in progressively more difficult material. (The form provides six blank entry blocks in this format:)

Date: _______   Book/Reading Level: ____________
     TRW      E      CRW    %CRW
A.  ______  ______  ______  ______
B.  ______  ______  ______  ______
C.  ______  ______  ______  ______

Table 1: CBM Oral Reading Fluency Norms (Hasbrouck & Tindal, 2005)

      Start-of-Yr/Fall   Mid-Yr/Winter   End-of-Yr/Spring
GR    25%ile   50%ile    25%ile  50%ile   25%ile  50%ile
1       --       --        12      23       28      53
2       25       51        42      72       61      89
3       44       71        62      92       78     107
4       68       94        87     112       98     123
5       85      110        99     127      109     139
6       98      127       111     140      122     150
7      102      128       109     136      123     150
8      106      133       115     146      124     151

Note: To interpret student reading fluency probe scores for any grade-level text (Shapiro, 2008):
• below the 25th percentile = frustration range
• between the 25th and 50th percentiles = instructional range
• above the 50th percentile = mastery range

Step 2: Compute a Student Reading Goal
1. At what grade or book level will the student be monitored? (Refer to results of Step 1: Survey-Level Assessment): _______________________________________
2. What is the student’s baseline reading rate (# correctly read words per min)? ________ CRW Per Min
3. When is the start date to begin monitoring the student in reading? _____ / _____ / _____
4. When is the end date to stop monitoring the student in reading? _____ / _____ / _____
5. How many instructional weeks are there between the start and end dates? (Round to the nearest week if necessary): _______ Instructional Weeks
6. What do you predict the student’s average increase in correctly read words per minute will be for each instructional week of the monitoring period? (See Table 2): _________ Weekly Increase in CRW Per Min
7. What will the student’s predicted CRW gain in reading fluency be at the end of monitoring? (Multiply Item 5 by Item 6): ______________
8. What will the student’s predicted reading rate be at the end of the monitoring period? (Add Items 2 & 7): ______ CRW Per Min

References

Fuchs, L. S., Fuchs, D., Hamlett, C. L., Walz, L., & Germann, G. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22, 27-48.
Hasbrouck, J., & Tindal, G. (2005). Oral reading fluency: 90 years of measurement. Eugene, OR: Behavioral Research & Teaching. Retrieved from http://www.brtprojects.org/tech_reports.php
Shapiro, E. S. (2008). Best practices in setting progress-monitoring goals for academic skill improvement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 141-157). Bethesda, MD: National Association of School Psychologists.

Student Name: ____________________________________________ Grade/Classroom: ______________________________

Step 3: Collect Baseline Data: Give 3 CBM reading assessments within a one-week period using monitoring-level probes. (Baseline 1-3: the form provides three blank entry blocks.)

Step 4: Complete CBM Progress-Monitoring Weekly or More Frequently: Record the results of regular monitoring of the student’s progress in reading fluency. (The form provides twelve numbered blank entry blocks.)

Table 2: Predictions for Rates of Reading Growth by Grade (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993)

Increase in Correctly Read Words Per Minute for Each Instructional Week:
Grade Level   Realistic Weekly Goal   Ambitious Weekly Goal
Grade 1              2.0                     3.0
Grade 2              1.5                     2.0
Grade 3              1.0                     1.5
Grade 4              0.85                    1.1
Grade 5              0.5                     0.8
Grade 6              0.3                     0.65

CBA Reading Probes: Harcourt Brace Signatures Series Book 4-1 Rare Finds

One hundred years ago in Paris, when theaters and music halls      11
drew traveling players from all over the world, the best place to  23
stay was at the widow Gateau’s, a boardinghouse on English         33
Street. Acrobats, jugglers, actors, and mimes from as far away     43
as Moscow and New York reclined on the widow’s feather             53
mattresses and devoured her kidney stews. Madame Gateau            61
worked hard to make her guests comfortable, and so did her         72
daughter, Mirette. The girl was an expert at washing linens,       82
chopping leeks, paring potatoes, and mopping floors. She was       91
a good listener too. Nothing pleased her more than to overhear     102
the vagabond players tell of their adventures in this town and     113
that along the road.                                               117

Harcourt Brace Signatures Series 1999, Level 4-1 Rare Finds, Mirette on the High Wire, pp.
87
Special Education Department, Syracuse City School District, Syracuse, NY

CBA Reading Probes: Harcourt Brace Signatures Series Book 4-1 Rare Finds

One hundred years ago in Paris, when theaters and music halls drew traveling players from all over the world, the best place to stay was at the widow Gateau’s, a boardinghouse on English Street. Acrobats, jugglers, actors, and mimes from as far away as Moscow and New York reclined on the widow’s feather mattresses and devoured her kidney stews. Madame Gateau worked hard to make her guests comfortable, and so did her daughter, Mirette. The girl was an expert at washing linens, chopping leeks, paring potatoes, and mopping floors. She was a good listener too. Nothing pleased her more than to overhear the vagabond players tell of their adventures in this town and that along the road.

Harcourt Brace Signatures Series 1999, Level 4-1 Rare Finds, Mirette on the High Wire, pp. 87

CBA Reading Probes: Harcourt Brace Signatures Series Book 4-1 Rare Finds

Someone is lost in the woods. He might be hurt, or the weather    13
could turn bad. It is important to find him as fast as possible.  26
But he didn’t follow a trail, and footprints don’t show on the    38
forest floor. What to do? Call in the search and rescue dogs.     50
Dogs have a very fine sense of smell. They can find people lost   63
by following their scents, because each person has his or her     74
own, unique scent. Panda is a Newfoundland dog trained to         84
locate lost people. She and her owner, Susie Foley, know how      95
to search through the woods, under the snow, or in the water.     107

Harcourt Brace Signatures Series 1999, Level 4-1 Rare Finds, Hugger to the Rescue, pp.
143-144

CBA Reading Probes: Harcourt Brace Signatures Series Book 4-1 Rare Finds

In the busy rain forest of Malaysia, a grasshopper leaps into a   12
spray of orchids. Suddenly, one of the “flowers” turns on the     23
grasshopper. An orchid mantis, with wings like petals, grips it   33
tightly. For the grasshopper, there will be no escape. The        43
orchid mantis is a master of camouflage – the art of hiding while 55
in plain sight. Camouflage enables predators like the orchid      64
mantis to hide while they lie in wait for their prey. For other   77
animals, camouflage is a method of protection from their          86
enemies. Animals blend into the background in several ways.       95
Their colors and patterns may match their surroundings.           103

Harcourt Brace Signatures Series 1999, Level 4-1 Rare Finds, Hiding Out, pp. 270

CBM Reading: Graphing Exercise for Jared M.: 4th-Grader

Background. Your RTI Problem-Solving Team has completed a CBM survey-level screening in reading for Jared M., a 4th grader. According to his teacher, Jared reads at the beginning 2nd-grade level. An initial RTI Team meeting is held on Monday, January 20th. At that meeting, an intervention is designed in which Jared will be paired with a volunteer adult tutor for daily 30-minute sessions using the Paired Reading strategy. Your team schedules a follow-up RTI Team meeting for Monday, March 10th, six instructional weeks from the date of the initial meeting.

CBM Practice Items. Attached is a CBM Student Record that contains Jared’s CBM reading data. Complete the practice items below to gain experience in interpreting and charting CBM data.

1. Survey-Level Assessment.
On Jared’s attached CBM Student Record Form, review the Survey-Level assessment results. For each level of CBM probe administered, circle the median Correctly Read Words (CRWs), Errors (E), and Percentage of Correctly Read Words (%CRWs). Consult Table 1 on the Record Form to identify the student’s Mastery, Instructional, and Frustration levels of reading. Note: Use the ‘Mid-Yr/Winter’ norms from Table 1 to interpret Jared’s survey-level assessment results. To interpret Jared’s reading fluency probe scores for any grade-level text (Shapiro, 2008), use these guidelines:
• Scores below the 25th percentile = frustration range
• Scores between the 25th and 50th percentiles = instructional range
• Scores above the 50th percentile = mastery range

2. Set up the graph. At the top of your monitoring graph, fill in these date spans for each of the instructional weeks during which Jared will be monitored:
Baseline: 1/13-1/17
Week 1: 1/20-1/24
Week 2: 1/27-1/31
Week 3: 2/3-2/7
Week 4: 2/10-2/14
Week 5: 2/24-2/28
Week 6: 3/3-3/7
Week 7: 3/10-3/14
Week 8: 3/17-3/21
Week 9: 3/24-3/28
Week 10: 3/31-4/4
Week 11: 4/7-4/11
Week 12: 4/14-4/18

3. Determine & chart the student’s baseline reading rate. On the Record Form, review Jared’s Baseline assessment information.
• Circle the median CRW and E for each of the Baseline observations.
• On the progress-monitoring graph, chart the median CRWs and Es for all 3 observations.
• Of the Baseline values that you charted, disregard the highest and lowest CRWs. The middle CRW should be assumed to be the best estimate of the student’s starting, or baseline, reading rate. Circle this middle (‘Baseline’) data point on your chart.

4. Set a performance goal. To compute Jared’s performance goal in reading:
• Use Table 2 on the Record Form to identify the rate of progress that Jared should make each week in goal-level (3rd-Grade) reading material.
• You will recall that your RTI Team has decided to monitor Jared’s reading for six weeks before holding a follow-up meeting. To compute how much Jared’s reading rate should increase in that time, multiply his expected weekly progress by the number of weeks that he will be monitored.
• Add Jared’s expected reading progress to his baseline reading rate. This combined figure is Jared’s reading goal.

5. Plot the ‘Aim-Line’. To graph a 6-week ‘aim-line’:
• Draw a vertical dividing line (‘start-line’) at the point where the intervention will begin (start of Week 1).
• Draw a second dividing line on the graph (‘end-line’) that marks the conclusion of six weeks of monitoring (end of Week 6).
• On the start-line, mark an ‘X’ at the point that is equal to the value of your circled baseline data point.
• Mark Jared’s reading goal with an ‘X’ at the appropriate spot on the end-line.
• Now draw a straight line between the start-line and end-line ‘X’s. This is your chart’s aim-line.

6. Plot Jared’s progress-monitoring data. Review Jared’s CBM data for the first six weeks of progress-monitoring. Circle the median CRWs and Es and plot them on the chart. What conclusions do you draw from the chart? Based on these data, should the RTI Team recommend changing Jared’s intervention? Keep it in place with no changes? Why?

7. Continue with progress monitoring. Assume that your RTI Team met for the follow-up meeting and decided to keep the current intervention in place because it appears to be effective. The team plans to monitor for another 6 weeks.
• Compute a new baseline for Jared by looking at his most recent 3 CRW data points and circling the median value. Compute how much Jared’s reading rate should increase after 6 additional weeks of intervention and add this amount to his new baseline reading rate. This is Jared’s revised reading goal.
• Set up a new ‘aim-line’: o Draw a vertical dividing line (‘start-line’) at the point where the revised intervention begins (start of week 7). o Draw a second dividing line on the graph (‘end-line’) that marks the conclusion of 6 more weeks of monitoring (end of Week 12). o On the new start-line, mark an ‘X’ at the point that is equal to the value of the circled baseline data point. o Next, mark Jared’s revised reading goal with an ‘X’ at the appropriate spot on the endline. o Now draw a straight line between the start-line and end-line ‘X’s. This is your chart’s revised aim-line. 8. Plot the rest of Jared’s progress-monitoring data. Identify Jared’s ‘median’ reading data for the final 6 weeks of progress-monitoring (see Weeks 7-12 on the Student Record Form). Plot these values on the chart. What conclusions do you draw from the chart? Based on these data, should the RTI Team recommend changing Jared’s intervention? Keep it in place with no changes? Why? Jim Wright, Presenter www.interventioncentral.org 17 References Shapiro, E. S. (2008). Best practices in setting progress-monitoring goals for academic skill improvement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 141157). Bethesda, MD: National Association of School Psychologists. Jim Wright, Presenter www.interventioncentral.org 18 Student Record Form: Curriculum-Based Measurement: Oral Reading Fluency 4/Mrs. Legione Rdng Skill Level: _________ Mid-Gr.2 Jared M. __________________ Grade/Classroom: Gr. Student Name: __________ ________________ Step 1: Conduct a Survey-Level Assessment: Use this section to record the student’s reading rates in progressively more difficult material. Th 12/5 Book/Reading Level: ____________ GR 2-Bk 2-P 1 Date:_______ TRW E CRW %CRW A. __77 ____ __3____ __74 ____ _96 _____ 64 4 60 94 __ ______ ______ ____ B. ______ 99 __ ____ __1____ __78 ____ ____ C. __79 Th 12/5 Book/Reading Level: _GR 3-Bk 1-P1 Date:_______ ___________ TRW E CRW %CRW 103 2 101 A. 
______ ______ ______ __98 ____ 84 3 81 ______ ______ __96 ____ B. ______ 2 __ C. ___78 ___ ____ ___76 ___ __97 ____ 12/5 Book/Reading Level: ____________ GR 3-Bk 2-P1 Date:_Th ______ TRW E CRW %CRW 62 4 58 A. ______ ______ ______ __94 ____ 4 __ ___ ____ ___77 ___ __95 ____ B. ___81 73 3 70 96 ______ ______ ______ C. ______ 12/5 Book/Reading Level: ___________ GR 4-P1 _ Date:_Th ______ TRW E CRW %CRW 58 5 53 91 __ A. ______ ______ ______ ____ 61 _ __5____ ___56 ___ __92 ____ B. _____ 64 6 58 ______ ______ __91 ____ C. ______ Date:_______ Book/Reading Level: ____________ TRW E CRW %CRW D. ______ ______ ______ ______ ______ ______ ______ E. ______ ______ ______ ______ F. ______ Date:_______ Book/Reading Level: ____________ TRW E CRW %CRW G. ______ ______ ______ ______ H. ______ ______ ______ ______ ______ ______ ______ I. ______ Table 1: CBM Oral Reading Fluency Norms (Hasbrouck & Tindal, 2005) Start-of-Yr/Fall Mid-Yr/Winter End-of-Yr/Spring GR 25%ile 50%ile 25%ile 50%ile 25%ile 50%ile 1 --12 13 28 53 2 25 51 42 72 61 89 3 44 71 62 92 78 107 4 68 94 68 94 98 123 5 85 110 99 127 109 139 6 98 127 111 140 122 150 7 102 128 109 136 123 150 8 106 133 115 146 124 151 Note: To interpret student reading fluency probe scores for any grade level text (Shapiro, 2008): • below 25th percentile = frustration range • between 25th - 50th percentiles = instructional range • above the 50th percentile = mastery range Step 2: Compute a Student Reading Goal 1. At what grade or book level will the student be monitored? (Refer to results of Step 1:Survey-Level Assessment) 2. 3. 4. 5. 6. 7. 8. _______________________________________ What is the student’s baseline reading rate (# correctly read words per min)? ________CRW Per Min When is the start date to begin monitoring the student in reading? _____ / _____ / _____ When is the end date to stop monitoring the student in reading? _____ / _____ / _____ How many instructional weeks are there between the start and end dates? 
(Round to the nearest week if necessary): _______ Instructional Weeks
6. What do you predict the student’s average increase in correctly read words per minute will be for each instructional week of the monitoring period? (See Table 2): _________ Weekly Increase in CRW Per Min
7. What will the student’s predicted CRW gain in reading fluency be at the end of monitoring? (Multiply Item 5 by Item 6): ______________
8. What will the student’s predicted reading rate be at the end of the monitoring period? (Add Items 2 & 7): ______ CRW Per Min

References
Fuchs, L. S., Fuchs, D., Hamlett, C. L., Walz, L., & Germann, G. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22, 27-48.
Hasbrouck, J., & Tindal, G. (2005). Oral reading fluency: 90 years of measurement. Eugene, OR: Behavioral Research & Teaching. Retrieved from http://www.brtprojects.org/tech_reports.php
Shapiro, E. S. (2008). Best practices in setting progress-monitoring goals for academic skill improvement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 141-157). Bethesda, MD: National Association of School Psychologists.

Student Name: Jared M.   Grade/Classroom: Gr 4/Mrs. Legione

Step 2: Collect Baseline Data: Give 3 CBM reading assessments within a one-week period using monitoring-level probes.

Baseline 1: Book/Reading Level: Gr 3: Bk 2-P1   Date: Th 12/5
     TRW   E   CRW   %CRW
A.    62   4    58    94
B.    81   4    77    95
C.    73   3    70    96

Baseline 2: Book/Reading Level: Gr 3: Bk 2-P2   Date: M 12/9
A.    56   4    52    93
B.    70   3    67    96
C.    81   4    77    95

Baseline 3: Book/Reading Level: Gr 3: Bk 2-P3   Date: W 12/11
A.    75   3    72    96
B.    71   2    69    97
C.    76   2    74    97

Step 3: Complete CBM Progress-Monitoring Weekly or More Frequently: Record the results of regular monitoring of the student’s progress in reading fluency. (All monitoring probes are drawn from the Gr 3: Bk 2 level.)

1. Gr 3: Bk 2-P4   Date: W 1/22
     TRW   E   CRW   %CRW
A.    89   3    86    97
B.    75   4    71    95
C.    74   4    70    95

2. Gr 3: Bk 2-P5   Date: W 1/29
A.    69   1    68    99
B.   106   3   103    97
C.    79   2    77    97

3. Gr 3: Bk 2-P6   Date: M 2/3
A.    80   3    77    96
B.    78   3    75    96
C.    78   4    74    95

4. Gr 3: Bk 2-P7   Date: Th 2/13
A.    77   2    75    97
B.    79   1    78    99
C.    67   6    61    91

5. Gr 3: Bk 2-P8   Date: Th 2/27
A.    75   1    74    99
B.    80   1    79    99
C.    81   2    79    98

6. Gr 3: Bk 2-P9   Date: F 3/7
A.    72   3    69    96
B.    83   1    82    99
C.    85   1    84    99

7. Gr 3: Bk 2-P10   Date: M 3/10
A.    87   2    85    98
B.    87   3    84    97
C.    82   4    78    95

8. Gr 3: Bk 2-P11   Date: F 3/21
A.    89   1    88    99
B.    86   4    82    95
C.    82   4    78    95

9. Gr 3: Bk 2-P12   Date: T 3/25
     TRW   E   CRW   %CRW
A.    82   3    79    96
B.    70   1    69    99
C.   100   2    98    98

10. Gr 3: Bk 2-P13   Date: W 4/2
A.    89   3    86    97
B.    63   3    60    95
C.    92   3    89    97

11. Gr 3: Bk 2-P14   Date: T 4/8
A.    97   3    94    97
B.   105   6    99    94
C.    89   2    87    98

12. Gr 3: Bk 2-P15   Date: T 4/15
A.    99   4    95    96
B.   103   6    97    94
C.    78   2    76    97

Table 2: Predictions for Rates of Reading Growth by Grade (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993)
Increase in Correctly Read Words Per Minute for Each Instructional Week
Grade Level   Realistic Weekly Goal   Ambitious Weekly Goal
Grade 1              2.0                     3.0
Grade 2              1.5                     2.0
Grade 3              1.0                     1.5
Grade 4              0.85                    1.1
Grade 5              0.5                     0.8
Grade 6              0.3                     0.65

[Progress-monitoring graph form: header blanks for Student, Classrm/Grade, and Monitoring Level; y-axis ‘Correctly Read Words Per Minute’ (0-140); x-axis ‘Instructional Days’ (M-F), grouped into Baseline plus Weeks 1-12. Grade norms from Shapiro, E. S. (1996). Academic skills problems: Direct assessment and intervention (2nd ed.). New York: Guilford. ©2003 Jim Wright]

CBM Math Assessment

RTI-Ready Methods to Monitor Student Academics

Math: Early Math Fluency

Quantity Discrimination Fluency : 1 minute Administration: 1:1
Description: The student is given a sheet with number pairs. For each number pair, the student must name the larger of the two numbers.
Where to get materials:
• AimsWeb http://www.aimsweb.com/
• Intervention Central http://www.interventioncentral.org (Numberfly Early Math Fluency Probe Creator)

Missing Number Fluency : 1 minute Administration: 1:1
Description: The student is given a sheet containing numerous sets of 3 or 4 sequential numbers. In each number series, one of the numbers is missing. The student must name the missing number.
Where to get materials:
• AimsWeb http://www.aimsweb.com/
• Intervention Central http://www.interventioncentral.org (Numberfly Early Math Fluency Probe Creator)

Number Identification Fluency : 1 minute Administration: 1:1
Description: The student is given a sheet with numbers in random order. The student gives the name of each number.
Where to get materials:
• AimsWeb http://www.aimsweb.com/
• Intervention Central http://www.interventioncentral.org (Numberfly Early Math Fluency Probe Creator)

Oral Counting Fluency : 1 minute Administration: 1:1
Description: The student counts aloud, naming as many numbers in sequence as possible, starting from zero or one.
Where to get materials:
• The student does not require materials for this assessment. The examiner can make a sheet with numbers listed sequentially from 0-100 to record those numbers that the student can recite in sequence.
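For schools that want to prototype their own materials before adopting a published generator, the Quantity Discrimination item format above is simple to sketch in code. A minimal, hypothetical Python example follows; the function name and defaults are illustrative, not AimsWeb's or Numberfly's actual implementation:

```python
import random

def quantity_discrimination_items(n_items, low=0, high=20, seed=None):
    """Build Quantity Discrimination items: pairs of distinct numbers drawn
    from a predefined range; the student names the larger of the two."""
    rng = random.Random(seed)
    items = []
    for _ in range(n_items):
        a, b = rng.sample(range(low, high + 1), 2)  # two distinct numbers
        items.append((a, b))
    return items

pairs = quantity_discrimination_items(5, seed=42)
answer_key = [max(pair) for pair in pairs]  # examiner copy: expected responses
```

An examiner copy would simply pair each printed item with the larger number as the expected response.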
Math: Computation

Math Computation Fluency : 2 minutes Administration: Group
Description: The student is given a worksheet with single-skill or mixed-skill math computation problems. The student works independently to complete as many problems as possible. The student receives credit for each correct digit appearing in his or her answer.
Where to get materials:
• AimsWeb http://www.aimsweb.com/
• Intervention Central http://www.interventioncentral.org (Math Worksheet Generator)
• SuperKids http://www.superkids.com/aweb/tools/math/ (This website allows you to create math computation worksheets for more advanced areas such as fractions, percentages, decimals, and more.)

Math: Applied Problems

Math Concepts & Applications : 6-8 minutes Administration: Group
Description: Students are given assessment booklets with a mix of applied problem types appropriate to that grade level. (Assessments are available for grades 2-6.) A mix of applied problems is included in each assessment, sampling the typical math curriculum for the student’s grade (e.g., money skills, time-telling, etc.).
Where to get materials:
• MBSP: Monitoring Basic Skills Progress: Basic Math Kit – Second Edition, developed by Drs. Lynn & Doug Fuchs, Vanderbilt University. Available through Pro-Ed: http://www.proedinc.com/

Math: Vocabulary

Math Vocabulary Probes (Howell, 2008) : 5 minutes Administration: Group
Description: Students are given a math vocabulary probe consisting of 20 vocabulary items. There are two versions commonly used: (1) The sheet contains vocabulary terms on one side and the definitions of those terms—in scrambled order—on the other; the student connects each term to its correct definition. (2) The sheet contains only definitions; the student must read each definition and write the correct corresponding vocabulary term.
Where to get materials:
• Math vocabulary probes are developed by the school.
Teachers create ‘vocabulary pools’ that contain the key vocabulary items to be included in probes. From that larger pool, vocabulary items are randomly sampled to create individual probes.

References
Hosp, M. K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM. New York: Guilford.
Howell, K. W. (2008). Best practices in curriculum-based evaluation and advanced reading. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 397-418). Bethesda, MD: National Association of School Psychologists.

Curriculum-Based Measurement: Administration & Scoring Guidelines for Math Computation

Description
There are 2 types of CBM math probes: single-skill worksheets (those containing like problems) and multiple-skill worksheets (those containing a mix of problems requiring different math operations). Single-skill probes give instructors good information about students’ mastery of particular problem-types, while multiple-skill probes allow the teacher to test children’s math competencies on a range of computational objectives during a single CBM session.

Both types of math probes can be administered either individually or to groups of students. The examiner hands the worksheet(s) out to those students selected for assessment. Next, the examiner reads aloud the directions for the worksheet. Then the signal is given to start, and students proceed to complete as many items as possible within 2 minutes. The examiner collects the worksheets at the end of the assessment for scoring.

Figure 5: A Sampling of Math Computational Goals for Addition, Subtraction, Multiplication, and Division (from Wright, 2002)
Addition
• Two 1-digit numbers: sums to 10
• Two 3-digit numbers: no regrouping
• 1- to 2-digit number plus 1- to 2-digit number: regrouping
Subtraction
• Two 1-digit numbers: 0 to 9
• 2-digit number from a 2-digit number: no regrouping
• 2-digit number from a 2-digit number: regrouping
Multiplication
• Multiplication facts: 0 to 9
• 2-digit number times 1-digit number: no regrouping
• 3-digit number times 1-digit number: regrouping
Division
• Division facts: 0 to 9
• 2-digit number divided by 1-digit number: no remainder
• 2-digit number divided by 1-digit number: remainder
Source: Wright, J. (2002). Curriculum-Based Assessment Math Computation Probe Generator: Multiple-Skill Worksheets in Mixed Skills. Retrieved August 13, 2006, from http://www.lefthandlogic.com/htmdocs/tools/mathprobe/allmult.shtml

Creating a measurement pool for math computational probes
The first task of the instructor in preparing CBM math probes is to define the computational skills to be assessed. Many districts have adopted their own math curriculum that outlines the various computational skills in the order in which they are to be taught. Teachers may also review scope-and-sequence charts that accompany math textbooks when selecting CBM computational objectives. The order in which math computational skills are taught, however, probably does not vary a great deal from district to district. Figure 5 contains sample computation goals for addition, subtraction, multiplication, and division.

Instructors typically are interested in employing CBM to monitor students’ acquisition of skills in which they are presently being instructed.
However, teachers may also want to use CBM as a skills checkup to assess those math objectives that students have been taught in the past or to "preview" a math group's competencies in computational material that will soon be taught.

Preparing CBM Math Probes
After computational objectives have been selected, the instructor is ready to prepare math probes. The teacher may want to create single-skill probes, multiple-skill probes, or both types of CBM math worksheets.

Creating the Single-Skill Math Probe
As the first step in putting together a single-skill math probe, the teacher will select one computational objective as a guide. The measurement pool, then, will consist of randomly constructed problems that conform to the computational objective chosen. For example, the instructor may select the computational objective shown in Figure 6 as the basis for a math probe, then construct a series of problems that match that computational goal.

Figure 6: Example of a single-skill math probe: Three to five 3- and 4-digit numbers: no regrouping
105 + 600 + 293     2031 + 531 + 2322     111 + 717 + 260     634 + 8240 + 203
(Problems appear on the worksheet in columnar format.)

In general, single-skill math probes should contain between 80 and 200 problems, and worksheets should have items on both the front and back of the page. Adequate space should also be left for the student's computations, especially with more complex problems such as long division.
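"Randomly constructed problems that conform to the objective" can be sketched directly in code. Here is a minimal, hypothetical Python sketch for the Figure 6-style objective (three 3- and 4-digit addends, no regrouping); the helper names are illustrative, and rejection sampling is just one of several ways to enforce the no-regrouping constraint:

```python
import random

def no_regrouping(addends):
    """True if adding the numbers column by column never produces a carry."""
    width = max(len(str(n)) for n in addends)
    cols = [str(n).rjust(width, "0") for n in addends]
    return all(sum(int(c[i]) for c in cols) <= 9 for i in range(width))

def single_skill_probe(n_problems, seed=None):
    """Randomly construct addition problems conforming to one objective:
    three 3- and 4-digit numbers, no regrouping (cf. Figure 6)."""
    rng = random.Random(seed)
    problems = []
    while len(problems) < n_problems:
        candidate = [rng.randint(100, 9999) for _ in range(3)]
        if no_regrouping(candidate):  # reject problems violating the objective
            problems.append(candidate)
    return problems

problems = single_skill_probe(5, seed=1)
```

The same pattern works for other objectives: swap in a different constraint check (e.g., "regrouping required") and a different operand range.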
Creating the Multiple-Skill Math Probe
To assemble a multiple-skill math probe, the instructor will first select the range of math operations and problem-types that will make up the probe. The teacher will probably want to consult the district math curriculum, appropriate scope-and-sequence charts, or the computational-goal chart included in this manual when selecting the kinds of problems to include in the multiple-skill probe. Once the computational objectives have been chosen, the teacher can make up a worksheet of mixed math facts conforming to those objectives.

Using our earlier example, the teacher who wishes to estimate the proficiency of his 4th-grade math group may decide to create a multiple-skill CBM probe. He could choose to sample only those problem-types that his students have either mastered or are presently being instructed in. Those skills are listed in Figure 7, with sample problems that might appear on the worksheet of mixed math facts.

Figure 7: Example of a multiple-skill math probe
• Division: 3-digit number divided by 1-digit number: no remainder (e.g., 9/431)
• Subtraction: 2-digit number from a 2-digit number: regrouping (e.g., 20 - 18)
• Multiplication: 3-digit number times 1-digit number: no regrouping (e.g., 113 x 2)
• Addition: Two 3-digit numbers: no regrouping (e.g., 106 + 172, 200 + 600)
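Assembling the mixed worksheet is then just random sampling across the chosen objectives. A short, hypothetical Python sketch (the objective labels and sample problems are illustrative placeholders):

```python
import random

def multiple_skill_probe(pools, n_problems, seed=None):
    """Build a mixed-skill worksheet by repeatedly picking a random
    objective, then a random problem from that objective's pool."""
    rng = random.Random(seed)
    labels = sorted(pools)  # fixed order so results are reproducible per seed
    worksheet = []
    for _ in range(n_problems):
        label = rng.choice(labels)
        worksheet.append((label, rng.choice(pools[label])))
    return worksheet

pools = {
    "subtraction: 2-digit from 2-digit, regrouping": ["20 - 18", "41 - 27"],
    "multiplication: 3-digit times 1-digit, no regrouping": ["113 x 2", "231 x 3"],
}
worksheet = multiple_skill_probe(pools, 6, seed=7)
```

Because each slot is drawn independently, problem-types end up interleaved across the worksheet rather than grouped, which discourages students from "picking out only the easy items."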
Materials needed for giving CBM math probes
• Student copy of CBM math probe (either single- or multiple-skill)
• Stopwatch
• Pencils for students

Administration of CBM math probes
The examiner distributes copies of one or more math probes to all the students in the group. (Note: These probes may also be administered individually.) The examiner says to the students: The sheets on your desk are math facts.

If the students are to complete a single-skill probe, the examiner then says: All the problems are [addition or subtraction or multiplication or division] facts.

If the students are to complete a multiple-skill probe, the examiner then says: There are several types of problems on the sheet. Some are addition, some are subtraction, some are multiplication, and some are division [as appropriate].

The examiner continues: Look at each problem carefully before you answer it. When I say 'start,' turn them over and begin answering the problems. Start on the first problem on the left on the top row [point]. Work across and then go to the next row. If you can't answer the problem, make an 'X' on it and go to the next one. If you finish one side, go to the back. Are there any questions? Say, Start.

The examiner starts the stopwatch. While the students are completing worksheets, the examiner and any other adults assisting in the assessment circulate around the room to ensure that students are working on the correct sheet, that they are completing problems in the correct order (rather than picking out only the easy items), and that they have pencils, etc. After 2 minutes have passed, the examiner says, Stop. CBM math probes are collected for scoring.

Scoring
Traditional approaches to computational assessment usually give credit for the total number of correct answers appearing on a worksheet. If the answer to a problem is found to contain one or more incorrect digits, that problem is marked wrong and receives no credit.
In contrast to this all-or-nothing marking system, CBM assigns credit to each individual correct digit appearing in the solution to a math fact. On the face of it, a math scoring system that awards points according to the number of correct digits may appear unusual, but this alternative approach is grounded in good academic-assessment research and practice. By separately scoring each digit in the answer of a computation problem, the instructor is better able to recognize and give credit for a student's partial math competencies. Scoring computation problems by the digit rather than as a single answer also allows for a more fine-grained analysis of a child's number skills.

Imagine, for instance, that a student was given a CBM math probe consisting of addition problems and produced the work below (incorrect digits appear in boldface and italics in the original figure):

Figure 8: Example of completed problems from a single-skill math probe
105 + 600 + 293 = 988
2031 + 531 + 2322 = 4884
111 + 717 + 260 = 1087
634 + 8240 + 203 = 9077

If the answers in Figure 8 were scored as either correct or wrong, the child would receive a score of 1 correct answer out of 4 possible answers (25 percent). However, when each individual digit is scored, it becomes clear that the student actually correctly computed 12 of 15 possible digits (80 percent). Thus, the CBM procedure of assigning credit to each correct digit demonstrates itself to be quite sensitive to a student's emerging, partial competencies in math computation.
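The digit-by-digit idea can be expressed compactly. Below is a simplified Python sketch that right-aligns the student's answer with the correct one and counts matching digits; it is a sketch of the core comparison only, since full CBM scoring (per the rules that follow) also governs work below the line, place-holder digits, and regrouping marks:

```python
def correct_digits(correct_answer, student_answer):
    """Count digits in the student's answer that match the correct answer,
    comparing place by place starting from the ones column (right-aligned)."""
    correct = str(correct_answer)[::-1]  # reversed: index 0 = ones place
    student = str(student_answer)[::-1]
    return sum(c == s for c, s in zip(correct, student))

full = correct_digits(4884, 4884)    # every digit correct: scores 4
partial = correct_digits(998, 988)   # tens digit wrong: scores 2 of 3
shifted = correct_digits(873, 8730)  # extra 0 shifts place values: scores 0
```

Note how the last example mirrors the place-value rule: an answer of 8730 to a problem whose answer is 873 earns no credit, because every digit sits in the wrong column once the answers are aligned from the ones place.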
The following scoring rules will aid the instructor in marking single- and multiple-skill math probes:

• Individual correct digits are counted as correct. Reversed or rotated digits are not counted as errors unless their change in position makes them appear to be another digit (e.g., 9 and 6).

• Incorrect digits are counted as errors. Digits that appear in the wrong place value, even if otherwise correct, are scored as errors.
Example:
     97
   x  9
   8730
"873" is the correct answer to this problem, but no credit can be given since the addition of the 0 pushes the other digits out of their proper place-value positions.

• The student is given credit for "place-holder" numerals that are included simply to correctly align the problem. As long as the student includes the correct space, credit is given whether or not a "0" has actually been inserted.
Example:
     55
   x 82
    110
   4400
   4510
Since the student correctly placed 0 in the "place-holder" position, it is given credit as a correct digit. Credit would also have been given if the space were reserved but no 0 had been inserted.

• In more complex problems such as advanced multiplication, the student is given credit for all correct numbers that appear below the line.
Example:
     33
   x 28
    264
    660
    924
Credit is given for all work below the line. In this example, the student earns credit for 9 correct digits.

• Credit is not given for any numbers appearing above the line (e.g., numbers marked at the top of number columns to signify regrouping).
Example:
      1
     46
   + 39
     85
Credit is given for the 2 digits below the line. However, the carried "1" above the line does not receive credit.

Reference
Wright, J. (n.d.). Curriculum-based measurement: A manual for teachers.
Retrieved September 23, 2006, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

Appendix D: Computational Goals
COMPUTATIONAL GOALS OF MATH CURRICULUM (ADAPTED FROM SHAPIRO, 1989)
The computational skills listed below are arranged in ascending order of difficulty. Please identify (1) the skills which you have instructed in the classroom, (2) the skills that the student has mastered, and (3) the skills with which the student is currently having difficulty.
MASTERED: Place a check under the M column for the skills which the student has mastered.
INSTRUCTED: Place a check under the I column for the skills which you have instructed.
DIFFICULTY: Place a check under the D column for the skills with which the student is having difficulty.

M  I  D
Grade 1
__ __ __  1. Add two one-digit numbers: sums to 10.
__ __ __  2. Subtract two one-digit numbers: combinations to 10.
Grade 2
__ __ __  3. Add two one-digit numbers: sums 11 to 19.
__ __ __  4. Add a one-digit number to a two-digit number--no regrouping.
__ __ __  5. Add a two-digit number to a two-digit number--no regrouping.
__ __ __  6. Add a three-digit number to a three-digit number--no regrouping.
__ __ __  7. Subtract a one-digit number from a one- or two-digit number: combinations to 18.
__ __ __  8. Subtract a one-digit number from a two-digit number--no regrouping.
__ __ __  9. Subtract a two-digit number from a two-digit number--no regrouping.
__ __ __  10. Subtract a three-digit number from a three-digit number--no regrouping.
__ __ __  11. Multiplication facts--0's, 1's, 2's.
Grade 3
__ __ __  12. Add three or more one-digit numbers.
__ __ __  13. Add three or more two-digit numbers--no regrouping.
__ __ __  14. Add three or more three- and four-digit numbers--no regrouping.
__ __ __  15. Add a one-digit number to a two-digit number with regrouping.
__ __ __  16. Add a two-digit number to a two-digit number with regrouping.
__ __ __  17. Add a two-digit number to a three-digit number with regrouping from the units to the tens column only.
__ __ __  18. Add a two-digit number to a three-digit number with regrouping from the tens to the hundreds column only.
__ __ __  19. Add a two-digit number to a three-digit number with regrouping from the units to the tens column and from the tens to the hundreds column.
__ __ __  20. Add a three-digit number to a three-digit number with regrouping from the units to the tens column only.
__ __ __  21. Add a three-digit number to a three-digit number with regrouping from the tens to the hundreds column only.
__ __ __  22. Add a three-digit number to a three-digit number with regrouping from the units to the tens column and from the tens to the hundreds column.
__ __ __  23. Add a four-digit number to a four-digit number with regrouping in one to three columns.
__ __ __  24. Subtract two four-digit numbers--no regrouping.
__ __ __  25. Subtract a one-digit number from a two-digit number with regrouping.
__ __ __  26. Subtract a two-digit number from a two-digit number with regrouping.
__ __ __  27. Subtract a two-digit number from a three-digit number with regrouping from the units to the tens column only.
__ __ __  28. Subtract a two-digit number from a three-digit number with regrouping from the tens to the hundreds column only.
__ __ __  29. Subtract a two-digit number from a three-digit number with regrouping from the units to the tens column and from the tens to the hundreds column.
__ __ __  30. Subtract a three-digit number from a three-digit number with regrouping from the units to the tens column only.
__ __ __  31. Subtract a three-digit number from a three-digit number with regrouping from the tens to the hundreds column only.
__ __ __  32.
Subtract a three-digit number from a three-digit number with regrouping from the units to the tens column and from the tens to the hundreds column.
__ __ __  33. Multiplication facts--3 to 9.
Grade 4
__ __ __  34. Add a five- or six-digit number to a five- or six-digit number with regrouping in any columns.
__ __ __  35. Add three or more two-digit numbers with regrouping.
__ __ __  36. Add three or more three-digit numbers with regrouping in any columns.
__ __ __  37. Subtract a five- or six-digit number from a five- or six-digit number with regrouping in any columns.
__ __ __  38. Multiply a two-digit number by a one-digit number with no regrouping.
__ __ __  39. Multiply a three-digit number by a one-digit number with no regrouping.
__ __ __  40. Multiply a two-digit number by a one-digit number with regrouping.
__ __ __  41. Multiply a three-digit number by a one-digit number with regrouping.
__ __ __  42. Division facts--0 to 9.
__ __ __  43. Divide a two-digit number by a one-digit number with no remainder.
__ __ __  44. Divide a two-digit number by a one-digit number with remainder.
__ __ __  45. Divide a three-digit number by a one-digit number with remainder.
__ __ __  46. Divide a four-digit number by a one-digit number with remainder.
Grade 5
__ __ __  47. Multiply a two-digit number by a two-digit number with regrouping.
__ __ __  48. Multiply a three-digit number by a two-digit number with regrouping.
__ __ __  49. Multiply a three-digit number by a three-digit number with regrouping.

List of computational goals taken from Shapiro, Edward S. (1989). Academic skills problems: Direct assessment and intervention. New York: Guilford Press.
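When the checklist above feeds probe construction, it helps to hold the goals in machine-readable form so that a measurement pool can be filtered by status. A small, hypothetical Python sketch (the field names and the handful of sampled entries are illustrative, not a complete encoding of Appendix D):

```python
# A few Appendix D goals tagged with a status code:
# "M" = mastered, "I" = currently instructed, "D" = difficulty.
goals = [
    {"num": 1,  "grade": 1, "skill": "Add two one-digit numbers: sums to 10", "status": "M"},
    {"num": 25, "grade": 3, "skill": "Subtract a one-digit number from a two-digit number with regrouping", "status": "I"},
    {"num": 26, "grade": 3, "skill": "Subtract a two-digit number from a two-digit number with regrouping", "status": "I"},
    {"num": 44, "grade": 4, "skill": "Divide a two-digit number by a one-digit number with remainder", "status": "D"},
]

# Measurement pool for a multiple-skill probe: sample only skills the
# students have mastered or are presently being instructed in.
pool = [g for g in goals if g["status"] in ("M", "I")]
```

This mirrors the earlier 4th-grade example, where the teacher restricted the mixed worksheet to mastered and currently-instructed problem-types.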
Curriculum-Based Assessment Mathematics Multiple-Skills Computation Probe: Student Copy
Student: ____________________   Date: ____________________
1. 727,162 + 30,484    2. 42,286 - 29,756    3. 156 x 623    4. 2207 ÷ 52
5. 146,569 + 532,260   6. 33,516 - 21,366    7. 192 x 371    8. 4742 ÷ 43
(Problems appear on the worksheet in columnar/long-division format.)

Curriculum-Based Assessment Mathematics Multiple-Skills Computation Probe: Examiner Copy
(CD = correct digits for the item; the second figure is the cumulative CD total through that item.)

Item 1 (6 CD / 6 CD total). ADDITION: 5- to 6-digit number plus 5- to 6-digit number: regrouping in any column.
 727,162 + 30,484 = 757,646
Item 2 (5 CD / 11 CD total). SUBTRACTION: 5-digit number from 5-digit number: regrouping in any column.
 42,286 - 29,756 = 12,530
Item 3 (17 CD / 28 CD total). MULTIPLICATION: 3-digit number times 3-digit number: regrouping.
 156 x 623 = 97,188 (partial products 468, 312, 936)
Item 4 (15 CD / 43 CD total). DIVISION: 4-digit number divided by 2-digit number: remainder.
 2207 ÷ 52 = 42 r23 (work: 208, 127, 104, remainder 23)
Item 5 (6 CD / 49 CD total). ADDITION: 5- to 6-digit number plus 5- to 6-digit number: regrouping in any column.
 146,569 + 532,260 = 678,829
Item 6 (5 CD / 54 CD total). SUBTRACTION: 5-digit number from 5-digit number: regrouping in any column.
 33,516 - 21,366 = 12,150
Item 7 (18 CD / 72 CD total). MULTIPLICATION: 3-digit number times 3-digit number: regrouping.
 192 x 371 = 71,232 (partial products 192, 1344, 576)
Item 8 (13 CD / 85 CD total). DIVISION: 4-digit number divided by 2-digit number: remainder.
 4742 ÷ 43 = 110 r12 (work: 43, 44, 43, remainder 12)

Early Math Fluency CBM Probe: Quantity Discrimination
This introduction to the Quantity Discrimination probe provides information about the preparation, administration, and scoring of this Early Math CBM measure.
Additionally, it offers brief guidelines for integrating this assessment into a school-wide ‘Response-to-Intervention’ model.

Quantity Discrimination: Description (Clarke & Shinn, 2005; Gersten, Jordan & Flojo, 2005)
The student is given a sheet containing pairs of numbers. In each number pair, one number is larger than the other. The numbers in each pair are selected from within a predefined range (e.g., no lower than 0 and no higher than 20). During a one-minute timed assessment, the student identifies the larger number in each pair, completing as many items as possible while the examiner records any Quantity Discrimination errors.

Quantity Discrimination: Preparation
The following materials are needed to administer Quantity Discrimination (QD) Early Math CBM probes:
• Student and examiner copies of a QD assessment probe. (Note: Customized QD probes can be created conveniently and at no cost using Numberfly, a web-based application. Visit Numberfly at http://www.interventioncentral.org/php/numberfly/numberfly.php.)
• A pencil, pen, or marker
• A stopwatch

Quantity Discrimination: Directions for Administration
1. The examiner sits with the student in a quiet area without distractions, at a table across from the student.
2. The examiner says to the student: “The sheet on your desk has pairs of numbers. In each set, one number is bigger than the other. When I say ‘start,’ tell me the name of the number that is larger in each pair. Start at the top of this page and work across the page [demonstrate by pointing]. Try to figure out the larger number for each example. When you come to the end of a row, go to the next row. Are there any questions? [Pause] Start.”
3. The examiner begins the stopwatch when the student responds aloud to the first item.
If the student hesitates on a Quantity Discrimination item for 3 seconds or longer, the examiner says, “Go to the next one.” (If necessary, the examiner points to the next number as a student prompt.)
4. The examiner marks each Quantity Discrimination error with a slash (/) through the incorrect response item on the examiner form.
5. At the end of one minute, the examiner says, “Stop,” writes a right-bracket symbol ( ] ) on the examiner form after the last item that the student had attempted when the time expired, and then collects the student Quantity Discrimination sheet.

Quantity Discrimination: Scoring Guidelines
Correct QD responses include:
• Quantity Discriminations read correctly
• Quantity Discriminations read incorrectly but corrected by the student within 3 seconds

Incorrect QD responses include:
• The student’s reading the smaller number in the QD number pair
• Correct QD responses given after hesitations of 3 seconds or longer
• The student’s calling out a number other than those that appear in the QD number pair
• Response items skipped by the student

To calculate a Quantity Discrimination fluency score, the examiner:
1. counts up all QD items that the student attempted to answer, and
2. subtracts the number of QD errors from the total number attempted.
3. The resulting figure is the number of correct Quantity Discrimination items completed (the QD fluency score).

Quantity Discrimination Probes as Part of a Response to Intervention Model
• Universal Screening: To proactively identify children who may have deficiencies in the development of foundation math concepts, or ‘number sense’ (Berch, 2003), schools may choose to screen all kindergarten and first grade students using Quantity Discrimination probes. Those screenings would take place in fall, winter, and spring.
Students who fall below the 'cutpoint' of the 35th percentile of the grade norms on the QD task (e.g., Jordan & Hanich, 2003) would be identified as having moderate deficiencies and given additional interventions to build their 'number sense' skills.
• Tier I (Classroom-Based) Interventions: Teachers can create Quantity Discrimination probes and use them independently to track the progress of students who show modest delays in their math foundation skills.
• Tier II (Individualized) Interventions: Students with more extreme academic delays may be referred to a school-based problem-solving team, which will develop more intensive, specialized interventions to target the student's academic deficits (Wright, 2007). Quantity Discrimination probes can be used as one formative measure to track student progress with Tier II interventions to build foundation math skills.

Quantity Discrimination: Measurement Statistics

Test-Retest Reliability Correlations for Quantity Discrimination Probes
  Time Span           Correlation   Reference
  13-week interval    0.85          Clarke & Shinn (2005)
  26-week interval    0.86          Clarke & Shinn (2005)

Predictive Validity Correlations for Quantity Discrimination Probes
  Predictive Validity Measure                                    Correlation   Reference
  Curriculum-Based Measurement Math Computation Fluency          0.67          Clarke & Shinn (2005)
    Probes: Grade 1 Addition & Subtraction (fall administration
    of QD probe; spring administration of math computation probe)
  Woodcock-Johnson Tests of Achievement: Applied Problems        0.79          Clarke & Shinn (2005)
    subtest (fall administration of QD probe; spring
    administration of WJ-ACH subtest)
  Number Knowledge Test                                          0.53          Chard et al. (2005), cited in
                                                                               Gersten, Jordan & Flojo (2005)

References
Chard, D. J., Clarke, B., Baker, S., Otterstedt, J., Braun, D., & Katz, R. (2005).
Using measures of number sense to screen for difficulties in mathematics: Preliminary findings. Assessment for Effective Intervention, 30(2), 3-14.
Clarke, B., & Shinn, M. (2004). A preliminary investigation into the identification and development of early mathematics curriculum-based measurement. School Psychology Review, 33, 234-248.
Gersten, R., Jordan, N. C., & Flojo, J. R. (2005). Early identification and interventions for students with mathematics difficulties. Journal of Learning Disabilities, 38, 293-304.
Jordan, N. C., & Hanich, L. B. (2003). Characteristics of children with moderate mathematics deficiencies: A longitudinal perspective. Learning Disabilities Research and Practice, 18(4), 213-221.
Berch, D. B. (2003). Making sense of number sense: Implications for children with mathematical disabilities. Journal of Learning Disabilities, 38, 333-339.
Wright, J. (2007). The RTI toolkit: A practical guide for schools. Port Chester, NY: National Professional Resources, Inc.

Early Math Fluency CBM Probe: Missing Number

This introduction to the Missing Number probe provides information about the preparation, administration, and scoring of this Early Math CBM measure. Additionally, it offers brief guidelines for integrating this assessment into a school-wide 'Response-to-Intervention' model.

Missing Number: Description (Clarke & Shinn, 2005; Gersten, Jordan & Flojo, 2005)
The student is given a sheet containing multiple number series. Each series consists of 3-4 numbers that appear in sequential order. The numbers in each short series are selected to fall within a predefined range (e.g., no lower than 0 and no higher than 20). In each series, one number is left blank (e.g., '1 2 _ 4'). During a one-minute timed assessment, the student states aloud the missing number in as many response items as possible while the examiner records any Missing Number errors.
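The item format just described can be illustrated with a short generator sketch. This is not the Numberfly tool itself, only a minimal illustration of how Missing Number items might be built; the function name and defaults are assumptions that simply follow the range and series length given in the description above.

```python
import random

def make_mn_item(low=0, high=20, length=3):
    """Build one Missing Number item: a run of consecutive numbers
    within [low, high], with one position blanked out.

    Returns the display string (e.g., '1 2 _') and the answer."""
    start = random.randint(low, high - length + 1)
    series = list(range(start, start + length))
    blank = random.randrange(length)          # which position to hide
    answer = series[blank]
    display = ["_" if i == blank else str(n) for i, n in enumerate(series)]
    return " ".join(display), answer

# Example: build one practice item.
item, answer = make_mn_item()
print(item, "-> missing number:", answer)
```

A real probe sheet would repeat this across many rows; the student says each missing number aloud during the one-minute timing.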
Missing Number: Preparation
The following materials are needed to administer Missing Number (MN) Early Math CBM probes:
• Student and examiner copies of an MN assessment probe. (Note: Customized MN probes can be created conveniently and at no cost with Numberfly, a web-based application available at http://www.interventioncentral.org/php/numberfly/numberfly.php.)
• A pencil, pen, or marker
• A stopwatch

Missing Number: Directions for Administration
1. The examiner sits with the student at a table in a quiet area without distractions, across from the student.
2. The examiner says to the student: "The sheet on your desk has sets of numbers. In each set, a number is missing. When I say 'start,' tell me the name of the number that is missing from each set. Start at the top of this page and work across the page [demonstrate by pointing]. Try to figure out the missing number for each example. When you come to the end of a row, go to the next row. Are there any questions? [Pause] Start."
3. The examiner begins the stopwatch when the student reads the first number aloud. If the student hesitates on a Missing Number item for 3 seconds or longer, the examiner says the correct number aloud and says, "Go to the next one." (If necessary, the examiner points to the next number as a prompt.)
4. The examiner marks each Missing Number error by drawing a slash (/) through the incorrect response item on the examiner form.
5. At the end of one minute, the examiner says, "Stop," writes a right-bracket symbol ( ] ) on the examiner form after the last item that the student attempted when the time expired, and then collects the student's Missing Number sheet.
Missing Number: Scoring Guidelines
Correct MN responses include:
• Missing numbers read correctly
• Missing numbers read incorrectly but corrected by the student within 3 seconds

Incorrect MN responses include:
• Missing numbers read incorrectly
• Missing numbers read correctly after hesitations of 3 seconds or longer
• Response items skipped by the student

To calculate a Missing Number fluency score, the examiner:
1. counts up all MN items that the student attempted to read aloud, and
2. subtracts the number of MN errors from the total number attempted.
3. The resulting figure is the number of correct Missing Number items completed (the MN fluency score).

Missing Number Probes as Part of a Response to Intervention Model
• Universal Screening: To proactively identify children who may have deficiencies in the development of foundation math concepts, or 'number sense' (Berch, 2003), schools may choose to screen all kindergarten and first-grade students with Missing Number probes. Those screenings would take place in fall, winter, and spring. Students who fall below the 'cutpoint' of the 35th percentile of the grade norms on the MN task (e.g., Jordan & Hanich, 2003) would be identified as having moderate deficiencies and given additional interventions to build their 'number sense' skills.
• Tier I (Classroom-Based) Interventions: Teachers can create Missing Number probes and use them independently to track the progress of students who show modest delays in their math foundation skills.
• Tier II (Individualized) Interventions: Students with more extreme academic delays may be referred to a school-based problem-solving team, which will develop more intensive, specialized interventions to target the student's academic deficits (Wright, 2007). Missing Number probes can be used as one formative measure to track student progress with Tier II interventions to build foundation math skills.
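The three-step fluency calculation above (the same arithmetic applies to the Quantity Discrimination and Number Identification measures) can be sketched in code; the function name is illustrative.

```python
def fluency_score(items_attempted: int, errors: int) -> int:
    """Correct items per minute for an Early Math CBM probe:
    the count of items attempted minus the slash-marked errors."""
    if not 0 <= errors <= items_attempted:
        raise ValueError("errors must be between 0 and items attempted")
    return items_attempted - errors

# Example: a student attempts 26 Missing Number items and makes 3 errors.
print(fluency_score(26, 3))  # 23 correct MN items (the MN fluency score)
```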
Missing Number: Measurement Statistics

Test-Retest Reliability Correlations for Missing Number Probes
  Time Span           Correlation   Reference
  13-week interval    0.79          Clarke & Shinn (2005)
  26-week interval    0.81          Clarke & Shinn (2005)

Predictive Validity Correlations for Missing Number Probes
  Predictive Validity Measure                                    Correlation   Reference
  Curriculum-Based Measurement Math Computation Fluency          0.67          Clarke & Shinn (2005)
    Probes: Grade 1 Addition & Subtraction (fall administration
    of MN probe; spring administration of math computation probe)
  Woodcock-Johnson Tests of Achievement: Applied Problems        0.72          Clarke & Shinn (2005)
    subtest (fall administration of MN probe; spring
    administration of WJ-ACH subtest)
  Number Knowledge Test                                          0.61          Chard et al. (2005), cited in
                                                                               Gersten, Jordan & Flojo (2005)

References
Chard, D. J., Clarke, B., Baker, S., Otterstedt, J., Braun, D., & Katz, R. (2005). Using measures of number sense to screen for difficulties in mathematics: Preliminary findings. Assessment for Effective Intervention, 30(2), 3-14.
Clarke, B., & Shinn, M. (2004). A preliminary investigation into the identification and development of early mathematics curriculum-based measurement. School Psychology Review, 33, 234-248.
Gersten, R., Jordan, N. C., & Flojo, J. R. (2005). Early identification and interventions for students with mathematics difficulties. Journal of Learning Disabilities, 38, 293-304.
Jordan, N. C., & Hanich, L. B. (2003). Characteristics of children with moderate mathematics deficiencies: A longitudinal perspective. Learning Disabilities Research and Practice, 18(4), 213-221.
Berch, D. B. (2003). Making sense of number sense: Implications for children with mathematical disabilities. Journal of Learning Disabilities, 38, 333-339.
Wright, J. (2007). The RTI toolkit: A practical guide for schools.
Port Chester, NY: National Professional Resources, Inc.

Early Math Fluency CBM Probe: Number Identification

This introduction to the Number Identification probe provides information about the preparation, administration, and scoring of this Early Math CBM measure. Additionally, it offers brief guidelines for integrating this assessment into a school-wide 'Response-to-Intervention' model.

Number Identification: Description (Clarke & Shinn, 2005; Gersten, Jordan & Flojo, 2005)
The student is given a sheet containing rows of randomly generated numbers (e.g., ranging from 0 to 20). During a one-minute timed assessment, the student reads aloud as many numbers as possible while the examiner records any Number Identification errors.

Number Identification: Preparation
The following materials are needed to administer Number Identification (NID) Early Math CBM probes:
• Student and examiner copies of an NID assessment probe. (Note: Customized NID probes can be created conveniently and at no cost with Numberfly, a web-based application available at http://www.interventioncentral.org/php/numberfly/numberfly.php.)
• A pencil, pen, or marker
• A stopwatch

Number Identification: Directions for Administration
1. The examiner sits with the student at a table in a quiet area without distractions, across from the student.
2. The examiner says to the student: "The sheet on your desk has rows of numbers. When I say 'start,' begin reading the numbers aloud. Start at the top of this page and read across the page [demonstrate by pointing]. Try to read each number. When you come to the end of a row, go to the next row. Are there any questions? [Pause] Start."
3. The examiner begins the stopwatch when the student reads the first number aloud.
If the student hesitates on a number for 3 seconds or longer, the examiner says, "Go to the next one." (If necessary, the examiner points to the next number as a prompt.)
4. The examiner marks each Number Identification error by drawing a slash (/) through the incorrectly read number on the examiner form.
5. At the end of one minute, the examiner says, "Stop," writes a right-bracket symbol ( ] ) on the examiner form at the point in the number series that the student had reached when the time expired, and then collects the student's Number Identification sheet.

Number Identification: Scoring Guidelines
Correct NID responses include:
• Numbers read correctly
• Numbers read incorrectly but corrected by the student within 3 seconds

Incorrect NID responses include:
• Numbers read incorrectly
• Numbers read correctly after hesitations of 3 seconds or longer
• Numbers skipped by the student

To calculate a Number Identification fluency score, the examiner:
1. counts up all numbers that the student attempted to read aloud, and
2. subtracts the number of errors from the total of numbers attempted.
3. The resulting figure is the number of numbers identified correctly (the NID fluency score).

Number Identification Probes as Part of a Response to Intervention Model
• Universal Screening: To proactively identify children who may have deficiencies in the development of foundation math concepts, or 'number sense' (Berch, 2003), schools may choose to screen all kindergarten and first-grade students with Number Identification probes. Those screenings would take place in fall, winter, and spring. Students who fall below the 'cutpoint' of the 35th percentile of the grade norms on the NID task (e.g., Jordan & Hanich, 2003) would be identified as having moderate deficiencies and given additional interventions to build their 'number sense' skills.
• Tier I (Classroom-Based) Interventions: Teachers can create Number Identification probes and use them independently to track the progress of students who show modest delays in their math foundation skills.
• Tier II (Individualized) Interventions: Students with more extreme academic delays may be referred to a school-based problem-solving team, which will develop more intensive, specialized interventions to target the student's academic deficits (Wright, 2007). Number Identification probes can be used as one formative measure to track student progress with Tier II interventions to build foundation math skills.

Number Identification: Measurement Statistics

Test-Retest Reliability Correlations for Number Identification Probes
  Time Span           Correlation   Reference
  13-week interval    0.85          Clarke & Shinn (2005)
  26-week interval    0.76          Clarke & Shinn (2005)

Predictive Validity Correlations for Number Identification Probes
  Predictive Validity Measure                                    Correlation   Reference
  Curriculum-Based Measurement Math Computation Fluency          0.60          Clarke & Shinn (2005)
    Probes: Grade 1 Addition & Subtraction (fall administration
    of NID probe; spring administration of math computation probe)
  Woodcock-Johnson Tests of Achievement: Applied Problems        0.72          Clarke & Shinn (2005)
    subtest (fall administration of NID probe; spring
    administration of WJ-ACH subtest)
  Number Knowledge Test                                          0.58          Chard et al. (2005), cited in
                                                                               Gersten, Jordan & Flojo (2005)

References
Chard, D. J., Clarke, B., Baker, S., Otterstedt, J., Braun, D., & Katz, R. (2005). Using measures of number sense to screen for difficulties in mathematics: Preliminary findings. Assessment for Effective Intervention, 30(2), 3-14.
Clarke, B., & Shinn, M. (2004). A preliminary investigation into the identification and development of early mathematics curriculum-based measurement. School Psychology Review, 33, 234-248.
Gersten, R., Jordan, N. C., & Flojo, J. R. (2005). Early identification and interventions for students with mathematics difficulties. Journal of Learning Disabilities, 38, 293-304.
Jordan, N. C., & Hanich, L. B. (2003). Characteristics of children with moderate mathematics deficiencies: A longitudinal perspective. Learning Disabilities Research and Practice, 18(4), 213-221.
Berch, D. B. (2003). Making sense of number sense: Implications for children with mathematical disabilities. Journal of Learning Disabilities, 38, 333-339.
Wright, J. (2007). The RTI toolkit: A practical guide for schools. Port Chester, NY: National Professional Resources, Inc.

CBM Writing Assessment

Written Expression

Description
CBM writing probes are simple to administer but offer a variety of scoring options. As with math and spelling, writing probes may be given individually or to groups of students. The examiner prepares a lined composition sheet with a story-starter sentence or partial sentence at the top. The student thinks for 1 minute about a possible story to be written from the story-starter, then spends 3 minutes writing the story. The examiner collects the writing sample for scoring. Depending on the preferences of the teacher, the writing probe can be scored in several ways (see below).

Creating a measurement pool for writing probes
Since writing probes are essentially writing opportunities for students, they require minimal advance preparation. The measurement pool for writing probes is a collection of grade-appropriate story-starters, from which the teacher randomly selects one for each CBM writing assessment. Writing texts are often good sources for lists of story-starters; teachers may also choose to write their own.
Preparing CBM writing probes
The teacher selects a story-starter from the measurement pool and places it at the top of a lined composition sheet. The story-starter should avoid wording that encourages students to generate lists. It should also be open-ended, requiring the writer to build a narrative rather than simply to write down a "Yes" or "No" response. The CBM writing probe in Figure 2.9 is a good example of how such a probe might appear. This particular probe was used in a 5th-grade classroom.

Fig. 2.9: Example of a writing probe

  CBM Written Language
  Name_______________________ Grade____ Date_______
  One day, I was out sailing. A storm carried me far out to sea and
  wrecked my boat on a desert island. ______________________________
  __________________________________________________________________
  __________________________________________________________________

Materials needed for giving CBM writing probes
o Student copy of CBM writing probe with story-starter
o Stopwatch
o Pencils for students

Administration of CBM writing probes
The examiner distributes copies of CBM writing probes to all the students in the group. (Note: These probes may also be administered individually.) The examiner says to the students:

  I want you to write a story. I am going to read a sentence to you first, and then I want you to write a short story about what happens. You will have 1 minute to think about the story you will write and then have 3 minutes to write it. Do your best work. If you don't know how to spell a word, you should guess. Are there any questions? For the next minute, think about . . . [insert story-starter].

The examiner starts the stopwatch. At the end of 1 minute, the examiner says, "Start writing." While the students are writing, the examiner and any other adults helping in the assessment circulate around the room.
If students stop writing before the 3-minute timing period has ended, monitors encourage them to continue writing. After 3 additional minutes, the examiner says, "Stop writing." CBM writing probes are collected for scoring.

Scoring
The instructor has several options when scoring CBM writing probes. Student writing samples may be scored according to the (1) number of words written, (2) number of letters written, (3) number of words correctly spelled, or (4) number of writing units placed in correct sequence. Scoring methods differ both in the amount of time that they require of the instructor and in the quality of information that they provide about a student's writing skills. Advantages and potential limitations of each scoring system are presented below.

1. Total words--The examiner counts up and records the total number of words written during the 3-minute writing probe. Misspelled words are included in the tally, although numbers written in numeral form (e.g., 5, 17) are not counted. Calculating total words is the quickest of the scoring methods. A drawback, however, is that it yields only a rough estimate of writing fluency (that is, of how quickly the student can put words on paper) without examining the accuracy of spelling, punctuation, and other writing conventions. The CBM writing sample in Figure 2.10 was written by a 6th-grade student. Using the total-words scoring formula, this sample is found to contain 45 words (including misspellings).

Fig. 2.10: CBM writing sample scored for total words
  I woud drink water from the ocean.....07
  and I woud eat the fruit off of.......08
  the trees. Then I woud bilit a.......07
  house out of trees, and I woud........07
  gather firewood to stay warm. I......06
  woud try and fix my boat in my........08
  spare time. .........................02
  Word total = 45

2. Total letters--The examiner counts up the total number of letters written during the 3-minute probe.
Again, misspelled words are included in the count, but numbers written in numeral form are excluded. Calculating total letters is a reasonably quick operation. When compared to word-total, it also enjoys the advantage of controlling for words of varying length. For example, a student who writes few words but whose written vocabulary tends toward longer words may receive a relatively low score on word-total but a substantially higher score on letter-total. As with word-total, though, the letter-total formula gives only a general idea of writing fluency without examining a student's mastery of writing conventions. When scored according to total letters written, our writing sample is found to contain 154 letters.

Fig. 2.11: CBM writing sample scored for total letters
  I woud drink water from the ocean.....27
  and I woud eat the fruit off of.......24
  the trees. Then I woud bilit a.......23
  house out of trees, and I woud........23
  gather firewood to stay warm. I......25
  woud try and fix my boat in my........23
  spare time. .........................09
  Letter total = 154

3. Correctly spelled words--The examiner counts up only those words in the writing sample that are spelled correctly. Words are considered separately, not within the context of a sentence. When scoring a word according to this approach, a good rule of thumb is to determine whether--in isolation--the word represents a correctly spelled term in English. If it does, the word is included in the tally.

Fig. 2.12: CBM writing sample scored for correctly spelled words
  I woud drink water from the ocean.....06
  and I woud eat the fruit off of.......07
  the trees. Then I woud bilit a.......05
  house out of trees, and I woud........06
  gather firewood to stay warm. I......06
  woud try and fix my boat in my........07
  spare time. .........................02
  Correctly Spelled Words = 39
Assessing the number of correctly spelled words has the advantage of being quick. Also, by examining the accuracy of the student's spelling, this approach monitors to some degree a student's mastery of written language. Our writing sample is found to contain 39 correctly spelled words.

4. Correct writing sequences--When scoring correct writing sequences, the examiner goes beyond the confines of the isolated word to consider units of writing and their relation to one another. Using this approach, the examiner starts at the beginning of the writing sample and looks at each successive pair of writing units (writing sequence). Words are considered separate writing units, as are essential marks of punctuation. To receive credit, writing sequences must be correctly spelled and grammatically correct. The words in each writing sequence must also make sense within the context of the sentence. In effect, the student's writing is judged according to the standards of informal standard American English. A caret (^) is used to mark the presence of a correct writing sequence.

Fig. 2.13: An illustration of selected scoring rules for correct writing sequences

  ^ It^ was^ dark^ .^ Nobody^ could seen the^ trees^ of ^ the forrest.

  - Since the first word is correct, it is marked as a correct writing sequence.
  - Because the period is considered essential punctuation, it is joined with the words before and after it to make 2 correct writing sequences.
  - Grammatical or syntactical errors ("could seen") are not counted.
  - Misspelled words ("forrest") are not counted.

The following scoring rules will aid the instructor in determining correct writing sequences:

Correctly spelled words make up a correct writing sequence (reversed letters are acceptable, so long as they do not lead to a misspelling):
  Example: ^ Is^ that^ a^ red^ car^ ?

Necessary marks of punctuation (excluding commas) are included in correct writing sequences:
  Example: ^ Is^ that^ a^ red^ car^ ?
Syntactically correct words make up a correct writing sequence:
  Example: ^ Is^ that^ a^ red^ car^ ?
           ^ Is^ that^ a^ car red?

Semantically correct words make up a correct writing sequence:
  Example: ^ Is^ that^ a^ red^ car^ ?
           ^ Is^ that^ a read car^ ?

If correct, the initial word of a writing sample is counted as a correct writing sequence:
  Example: ^ Is^ that^ a^ red^ car^ ?

Titles are included in the correct writing sequence count:
  Example: ^ The^ Terrible^ Day

With the exception of dates, numbers written in numeral form are not included in the correct writing sequence count:
  Example: ^ The 14 soldiers^ waited^ in^ the^ cold^ .
           ^ The^ crash^ occurred^ in^ 1976^ .

Not surprisingly, evaluating a writing probe according to correct writing sequences is the most time-consuming of the scoring methods presented here. It is also the scoring approach, however, that yields the most comprehensive information about a student's writing competencies. While further research is needed to clarify the point, it also seems plausible that the correct writing sequence method is the most sensitive to short-term student improvements in writing. Presumably, advances in writing skills in virtually any area (e.g., spelling, punctuation) could quickly register as higher writing sequence scores. Our writing sample is found to contain 37 correct writing sequences.

Fig. 2.14: CBM writing sample scored for correct writing sequences (each correct writing sequence is marked with a caret (^)):
  ^ I woud drink^ water^ from^ the^ ocean...05
  ^ and^ I woud eat^ the^ fruit^ off^ of....06
  ^ the^ trees^ .^ Then^ I woud bilit a.....05
  ^ house^ out^ of^ trees,^ and^ I woud.....06
  gather^ firewood^ to^ stay^ warm^ .^ I....06
  woud try^ and^ fix^ my^ boat^ in^ my......06
  ^ spare^ time^ . ........................03
  Correct Writing Sequences = 37
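The scoring methods above can be sketched in code. This is a minimal sketch, not an official scoring tool: the first function computes the three word-level tallies (the tiny lexicon standing in for a real spelling dictionary is illustrative), and the second applies the pairing logic behind correct writing sequences, assuming the examiner has already judged each writing unit (word or essential punctuation mark) correct or incorrect in context; automating that judgment is beyond a simple script.

```python
import re

def word_level_scores(text, lexicon):
    """Total words, total letters, and correctly spelled words for a
    CBM writing probe. Numerals (e.g., 5, 17) are excluded from all
    tallies; misspellings count toward words and letters."""
    words = re.findall(r"[A-Za-z']+", text)
    total_words = len(words)
    total_letters = sum(len(w.replace("'", "")) for w in words)
    spelled_ok = sum(1 for w in words if w.lower() in lexicon)
    return total_words, total_letters, spelled_ok

def correct_writing_sequences(units):
    """Count carets: one for a correct initial unit, plus one for each
    adjacent pair of units that are both correct in context."""
    score = 1 if units and units[0][1] else 0
    for (_, a_ok), (_, b_ok) in zip(units, units[1:]):
        if a_ok and b_ok:
            score += 1
    return score

# First line of the sample essay; 'woud' is a misspelling.
lexicon = {"i", "drink", "water", "from", "the", "ocean"}
print(word_level_scores("I woud drink water from the ocean.", lexicon))
# (7, 27, 6) -- matching the first-line tallies in Figs. 2.10-2.12

# The Fig. 2.13 sentence, with the examiner's per-unit judgments.
units = [("It", True), ("was", True), ("dark", True), (".", True),
         ("Nobody", True), ("could", True), ("seen", False),
         ("the", True), ("trees", True), ("of", True),
         ("the", True), ("forrest", False), (".", True)]
print(correct_writing_sequences(units))  # 9 carets, as marked in Fig. 2.13
```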
Daily Behavior Report Cards

Daily Behavior Report Cards: A Convenient Behavior-Monitoring Tool

Daily Behavior Report Cards (DBRCs) are behavior rating forms that teachers use to evaluate a student's global behaviors on a daily basis or even more frequently. An advantage of DBRCs is that these rating forms are quick and convenient for the teacher to complete. This section contains daily and weekly versions of a generic DBRC, as well as a progress-monitoring chart to record cumulative DBRC ratings.

Increasing the Reliability of DBRCs. DBRCs rely heavily on teacher judgment and can therefore present a somewhat subjective view of the student's behavior. When a teacher's DBRC ratings are based on subjective opinions, there is a danger that the teacher will apply inconsistent standards from day to day when rating student behaviors. This inconsistency can limit the usefulness of report card data. One way for teachers to make their report card ratings more consistent and objective over time is to develop specific guidelines for rating each behavioral goal. For example, one item in the sample DBRC included in this section states that "The student spoke respectfully and complied with adult requests without argument or complaint." It is up to the teacher to translate so general a goal into a rubric of specific, observable criteria that permits the teacher to rate the student on this item on a 9-point scale. In developing such criteria, the instructor will want to consider:
• taking into account student developmental considerations. For example, "without argument or complaint" may mean "without throwing a tantrum" for a kindergarten student but "without loud, defiant talking-back" for a student in middle school.
• tying Report Card ratings to classroom behavioral norms.
For each behavioral goal, the teacher may want to identify the typical classroom norm for that behavior and assign the classroom norm a specific number rating. The teacher may decide, for instance, that the target student will earn a rating of 7 ('Usually/Always') each day that the student's compliance with adult requests closely matches that of the 'average' child in the classroom.
• developing numerical criteria when appropriate. For some items, the teacher may be able to translate more general Report Card goals into specific numeric ratings. If a DBRC item rates a student's compliance with adult requests, for example, the teacher may decide that the student is eligible to earn a rating of 7 or higher on that item on days when instructional staff had to approach the student no more than once about noncompliance.

Charting Report Card Ratings. Daily Behavior Report Card ratings can be charted over time to provide a visual display of the student's progress toward behavioral goals. The sample DBRC (daily and weekly versions) included in this section has its own progress-monitoring chart, which permits the teacher to graph student behavior for up to 4 school weeks. The instructor simply fills in the bubble each day that matches the numerical rating that he or she assigned to the student for the specific behavioral goal. As multiple points are filled in on the graph, the instructor connects them to create a time-series progress graph. (Figure 1 contains an example of a completed progress-monitoring chart.) When enough data points have been charted, the behavior graph can be used to judge the relative effectiveness of any strategies put in place to improve the student's behavior.

Using DBRCs as a Self-Monitoring Intervention. DBRCs are primarily used as a behavior-monitoring tool. However, teachers may also choose to use DBRCs as part of a student self-monitoring program, in which the student rates their own behaviors each day.
If teachers decide to use student behavior report cards for self-monitoring, they should first identify and demonstrate for the student the behaviors that the student is to monitor, and show the student how to complete the behavior report card. Since it is important that the student learn the teacher's behavioral expectations, the instructor should meet with the student daily, ask the student to rate their own behaviors, and then share the teacher's ratings of those same behaviors. The teacher and student can use this time to discuss any discrepancies in rating between their two forms. (If report card rating points are to be applied toward a student reward program, the teacher might consider allowing points earned on a particular card item to count toward a reward only if the student's ratings fall within a point of the teacher's, to encourage the student to rate accurately.)

Figure 1: Example of a completed DBRC progress-monitoring form. [The figure shows four weekly grids (days M-F) for the item "During instructional periods, the student focused his or her attention on teacher instructions, classroom lessons and assigned work." Each grid's vertical scale runs from 1 (Never/Seldom) through 5 (Sometimes) to 9 (Usually/Always); the daily ratings are filled in and connected as a time-series graph.]

Daily Classroom Behavior Report Card

Student: _______________________  Date: __________________________
Teacher: _______________________  Classroom: _____________________

Directions: Review each of the Behavior Report Card items below. For each item, rate the degree to which the student showed the behavior or met the behavior goal.

During instructional periods, the student focused his or her attention on teacher instructions, classroom lessons and assigned work.
The student interacted with classmates appropriately and respectfully.
• The student completed and turned in his or her assigned class work on time.
• The student spoke respectfully and complied with adult requests without argument or complaint.
Weekly Classroom Behavior Report Card
Student: ________________________________________________
Teacher: __________________ Classroom: _________________
Directions: Review each of the Behavior Report Card items below. For each item, rate the degree to which the student showed the behavior or met the behavior goal.
Behavioral Target (rate each day: M T W Th F):
• During instructional periods, the student focused his or her attention on teacher instructions, classroom lessons and assigned work.
• The student interacted with classmates appropriately and respectfully.
• The student completed and turned in his or her assigned class work on time.
• The student spoke respectfully and complied with adult requests without argument or complaint.
Classroom Behavior Report Card Progress-Monitoring Chart
• During instructional periods, the student focused his or her attention on teacher instructions, classroom lessons and assigned work.
• The student interacted with classmates appropriately and respectfully.
Classroom Behavior Report Card Progress-Monitoring Chart
• The student completed and turned in his or her assigned class work on time.
• The student spoke respectfully and complied with adult requests without argument or complaint.
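The self-monitoring point rule described earlier (a student's rating counts toward a reward only when it falls within one point of the teacher's rating for the same item) can be sketched as a small scoring routine. This is a hypothetical illustration of that rule; the item count and point values are invented.

```python
# Sketch: award reward points for DBRC items only when the student's
# self-rating falls within 1 point of the teacher's rating, to
# encourage accurate self-monitoring. All ratings are hypothetical.

def points_earned(teacher_ratings, student_ratings, tolerance=1):
    """Sum the student's ratings for items where student and teacher
    agree to within `tolerance` points; disagreeing items earn 0."""
    total = 0
    for teacher, student in zip(teacher_ratings, student_ratings):
        if abs(teacher - student) <= tolerance:
            total += student
    return total

# Four report-card items, each rated 1-9 by teacher and student:
teacher = [7, 5, 8, 6]
student = [7, 7, 8, 5]   # item 2 is off by 2 points and earns nothing
print(points_earned(teacher, student))  # prints 20 (7 + 8 + 5)
```

The design choice here mirrors the handout's rationale: tying points to agreement (rather than to the self-rating alone) rewards accuracy, not inflated self-reports.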
A Framework for Developing Special Education 'Eligibility Decision Rules' Under RTI
Using Response to Intervention to Determine Special Education Eligibility: Laying the Foundation
As school districts grow their capacity to provide RTI support to struggling students, they must also develop the decision rules required to determine when students who fail to respond to general-education interventions may need special education support. While existing research gives us only a partial roadmap for what the process will look like for diagnosing Learning Disabilities under RTI, there are sufficient guideposts in place to allow districts to get started immediately in developing their own capacity to use RTI information at special education eligibility meetings. Listed below are factors for districts to consider:
Section 1: Building the Foundation. Before an effective set of decision rules can be developed to determine student eligibility for special education, the school must first put into place these foundation components and procedures.
Ensure Tier 1 (Classroom) Capacity to Carry Out Quality Interventions. The classroom teacher is the 'first responder' available to address emerging student academic concerns. Therefore, general-education teachers should have the capacity to define student academic concerns in specific terms, independently choose and carry out appropriate evidence-based Tier 1 (classroom) interventions, and document student response to those interventions. (NOTE: See the attached form Tier 1 (Classroom) Interventions: Building Your School's Capacity for an 8-step process to promote teacher intervention skills.)
Collect Benchmarking/Universal Screening Data on Key Reading and Math (and Perhaps Other) Academic Skills for Each Grade Level. Benchmarking data are collected on all students at least three times per year (fall, winter, spring).
Measures selected for benchmarking should track student fluency and accuracy in basic academic skills that are key to success at each grade level.
Hold 'Data Meetings' With Each Grade Level. After each benchmarking period (fall, winter, spring), the school organizes data meetings by grade level. The building administrator, classroom teachers, and perhaps other staff (e.g., reading specialist, school psychologist) meet to:
o review student benchmark data.
o discuss how classroom (Tier 1) instruction should be changed to accommodate the student needs revealed in the benchmarking data.
o select students for Tier 2 (supplemental group) instruction/intervention.
Section 2: Creating Special Education Eligibility Decision Rules. Fuchs (2003) has formulated the 'dual discrepancy model', an influential conceptual framework for defining Learning Disabilities under RTI. According to this model, a student qualifies as LD only if (A) there is a significant academic skill gap between the target student and typical peers (discrepancy 1), and (B) the target student fails to make adequate progress to close the skill gap despite appropriate interventions (discrepancy 2). In line with RTI logic, then, the school makes the initial assumption that students with emerging academic concerns have typical abilities and simply require the 'right' instructional strategies to be successful. Your district must develop decision rules that allow you to evaluate data collected during successive intervention trials to identify with confidence those students who are 'non-responders' to Tier 2 and Tier 3 interventions and may require special education services.
Establish the Minimum Number of Intervention Trials Required Prior to a Special Education Referral. Your district should require a sufficient number of intervention trials to definitively rule out instructional variables as possible reasons for student academic delays.
Many districts require that at least three Tier 2 (small-group supplemental) and/or Tier 3 (intensive, highly individualized) intervention trials be attempted before moving forward with a special education evaluation.
Determine the Minimum Timespan for Each Tier 2 or Tier 3 Intervention Trial. An intervention trial should last long enough to show definitively whether it was effective. One expert recommendation (Burns & Gibbons, 2008) is that each academic intervention trial last at least 8 instructional weeks, allowing the school enough time to collect sufficient data to generate a reliable trend line.
Define the Level of Student Academic Delay That Will Qualify as a Significant Skill Discrepancy. Not all students with academic delays require special education services; those with more modest deficits may benefit from general-education supplemental interventions alone. Your district should develop guidelines for determining whether a student's academic skills should be judged as significantly delayed when compared to those of peers:
o If using local Curriculum-Based Measurement norms, set an appropriate 'cutpoint' score (e.g., at the 10th percentile). Any student performing below that cutpoint would be identified as having a significant gap in skills.
o If using reliable national or research norms (e.g., reading fluency norms from Hasbrouck & Tindal, 2004), set an appropriate 'cutpoint' score (e.g., at the 10th percentile). Any student performing below that cutpoint would likewise be identified as having a significant gap in skills.
Define the Rate of Student Progress That Will Qualify as a Significant Discrepancy in Rate of Learning. The question of whether a student has made adequate progress while on intervention is complex.
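The two discrepancies described in this section (a norm-based cutpoint for a significant skill gap, and a trend line computed from progress-monitoring data to judge rate of learning) can be combined into a rough 'dual discrepancy' screen. The sketch below is illustrative only, not a validated eligibility procedure: the percentile method, the norm scores, and the minimum-growth criterion are all assumed numbers.

```python
# Sketch of a 'dual discrepancy' screen (after Fuchs, 2003):
# (1) skill gap: the student scores below a cutpoint (e.g., the 10th
#     percentile) on local norms;
# (2) rate gap: the student's weekly growth, estimated by a
#     least-squares trend line, falls below a minimum growth criterion.
# All numbers below are hypothetical.

def percentile_cutpoint(norm_scores, pct=0.10):
    """Score at the given percentile of the local norm sample."""
    ranked = sorted(norm_scores)
    index = max(0, int(pct * len(ranked)) - 1)
    return ranked[index]

def weekly_slope(scores):
    """Least-squares slope of weekly progress-monitoring scores."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def dual_discrepancy(score, norms, progress, min_growth):
    """True only if BOTH the level gap and the rate gap are present."""
    level_gap = score < percentile_cutpoint(norms)
    rate_gap = weekly_slope(progress) < min_growth
    return level_gap and rate_gap

local_norms = list(range(20, 120))           # 100 classmates' fluency scores
progress = [24, 25, 25, 26, 26, 27, 27, 28]  # 8 weeks of monitoring
print(dual_discrepancy(24, local_norms, progress, min_growth=1.0))  # prints True
```

Note that the screen requires both gaps at once: a low-scoring student who is growing quickly, or a slow-growing student already near grade level, would not flag.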
While each student case must be considered on its own merits, your district can bring consistency to the process of judging the efficacy of interventions by discussing the factors below and ensuring to the maximum degree possible that your district adopts uniform expectations:
1. Define 'grade-level performance'. The goal of academic intervention is to bring student skills to grade level. However, your district may want to specify what is meant by 'grade-level' performance. Local CBM norms or reliable national or research norms can be helpful here. The district can set a cutpoint that marks a minimum threshold for 'typical student performance' (e.g., the 25th percentile or above on local or research norms). Students whose performance is above the cutpoint would fall within the 'reachable, teachable range' and could be adequately instructed by the classroom teacher.
2. Set ambitious but realistic goals for student improvement. When an intervention plan is put into place, the school should predict a rate of student academic improvement that is ambitious but realistic (Hosp, Hosp, & Howell, 2007). During a typical intervention series, a student usually works toward intermediate goals for improvement, and an intermediate goal is reset at a higher level each time the student attains it. The ultimate goal, of course, is to move the student up to grade-level performance (defined above). The school should be able to supply a rationale for how it set goals for the rate of student improvement. For example, a school may use research guidelines on oral reading fluency growth (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993) to set a goal. Or the school may use local norms to compute a weekly goal for improvement by (1) calculating the amount of progress that the student must make to close the gap and reach grade-level performance and (2) dividing that figure by the number of weeks available for intervention.
3.
Decide on a reasonable time horizon to 'catch' the student up with his or her peers. Interventions for students with serious academic delays cannot be successfully completed overnight. It is equally true, though, that interventions cannot stretch on without end if the student fails to make adequate progress. Your district should decide on a reasonable span of time within which a student on intervention should be expected to close the gap and reach grade-level performance (e.g., 12 months). Failure to close that gap within the expected timespan may be partial evidence that the student requires special education support.
4. View student progress-monitoring data in relation to peer norms. When viewed in isolation, student progress-monitoring data tell only part of the story. Even if a student shows modest progress, he or she may still be falling farther and farther behind peers in the academic skill of concern. Your district should evaluate student progress relative to peers. If the skill gap between the student and peers (as determined through repeated school-wide benchmarking) continues to widen despite the school's most intensive intervention efforts, this may be partial evidence that the student requires special education support.
5. Set uniform expectations for how progress-monitoring data are presented at special education eligibility meetings. Your district should adopt guidelines for schools in collecting and presenting student progress-monitoring information at special education eligibility meetings. For example, it is recommended that curriculum-based measurement or similar data be presented as time-series charts. These charts should include trend lines to summarize visually the student's rate of academic growth, as well as a 'goal line' indicating the intermediate or final performance goal toward which the student is working.
References
Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools.
New York: Routledge.
Fuchs, L. (2003). Assessing intervention responsiveness: Conceptual and technical issues. Learning Disabilities Research & Practice, 18(3), 172-186.
Fuchs, L. S., Fuchs, D., Hamlett, C. L., Walz, L., & Germann, G. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22, 27-48.
Hasbrouck, J., & Tindal, G. (2004). Oral reading fluency: 90 years of measurement. Retrieved March 4, 2009, from http://www.brtprojects.org/tech_reports.php
Hosp, M. K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM. New York: Guilford.
Creating an RTI Literacy Program at Tiers 1 & 2 That Is Responsive to the Needs of All Students
To create an RTI literacy program that addresses the needs of all learners, schools must do much more than simply purchase an off-the-shelf instructional product. For example, RTI-ready schools use data to determine the specific learning needs of their student population, ensure that classroom teachers have the skills to instruct students with diverse reading needs, focus their resources to provide the strongest reading instructional support in the early primary grades, and constantly review their instructional practices to weed out those that are less effective. Below are 8 steps that schools can follow to build a strong RTI literacy program:
• Verify that the School's Reading Program is 'Evidence-Based'.
• Use Benchmarking/Universal Screening Data to Verify that the Current Core Reading Program is Appropriate.
• Establish a Breadth of Instructional Expertise in Reading.
• Adopt Efficient Methods of Instructional Delivery and Time Management.
• Mass Resources for Focused Literacy Instruction & Intervention in the Primary Grades.
• Avoid Use of Less Effective Reading Instructional Strategies.
• Adopt Evidence-Based Tier 2 (Supplemental) Reading Interventions for Struggling Students.
• Promote Ongoing Professional Development.
Verify that the School's Reading Program is 'Evidence-Based'. The school has an evidence-based reading program in place for all elementary grades.
• The program is tied to a well-designed literacy curriculum and may consist of one or several commercial reading-instruction products.
• The program is supported by research as being effective.
• Teachers implementing the reading program at their grade level can describe its effective instructional elements.
Use Benchmarking/Universal Screening Data to Verify that the Current Core Reading Program is Appropriate. The school uses benchmarking/universal screening data in literacy to verify that its current reading program can effectively meet the needs of its student population at each grade level in the key areas of phonemic awareness, alphabetics (phonics), vocabulary, reading fluency, and reading comprehension. When evaluating its benchmarking/universal screening data, the school uses the cutpoint of "80% student success" to determine whether its core reading program presently meets the needs of its population:
• In grades K-2, if fewer than 80% of students are successful in attaining benchmark or criterion status on grade-appropriate reading measures of phonemic awareness and alphabetics (phonics), it is recommended that the core reading program at that grade level be patterned after direct instruction, with texts containing highly controlled student vocabulary and explicit instruction to reinforce letter-sound correspondences (Foorman & Torgesen, 2001).
• In grades K-2, if more than 80% of students are successful on a gradewide phonemic awareness and alphabetics screening, the school may choose to adopt a reading program that provides "less direct instruction in sound-spelling patterns embedded in trade books (embedded code)" (Foorman & Torgesen, 2001; p. 205).
'Less direct instruction' can include a greater number of general teacher-led 'guided reading' activities, such as talking with students about the content of a text, providing instruction in 'strategic strategies' to better access the text, and so on (Kosanovich, Ladinsky, Nelson & Torgesen, n.d.). Depending on its findings from the screening data, the school will select a reading program that has more or fewer of the elements of direct instruction. In any classroom, however, core reading instruction should always devote time to building general vocabulary, reading comprehension, writing, and other essential literacy skills (Foorman, Breier, & Fletcher, 2003).
Establish a Breadth of Instructional Expertise in Reading. Teachers are knowledgeable about the causes of reading delays. They understand that the most common explanation for deficiencies in foundation reading skills among students entering kindergarten is that, prior to public school, those delayed students did not have the same exposure to spoken vocabulary, phonemic awareness activities, and print as did their more advanced classmates. Classroom teachers therefore possess the appropriate knowledge and skills to provide Tier 1 literacy instruction to children with delays in reading skills. Specifically, these general-education instructors have the instructional expertise to teach children whose reading skills are up to 2 years below those of their classmates.
Adopt Efficient Methods of Instructional Delivery and Time Management. The teacher uses an appropriate range of efficient instructional-delivery and time-management methods to match student readers to effective learning activities. Examples include:
• reading centers (Kosanovich et al., n.d.)
• using students as peer tutors (e.g.
Mathes et al., 2003)
• incorporating paraprofessionals (Foorman, Breier, & Fletcher, 2003), adult volunteer tutors, or other non-instructional personnel under teacher supervision to review and reinforce student reading skills
• scheduling core literacy instruction at the same time for each grade level to allow students to access reading instruction across classrooms as needed (cf. Burns & Gibbons, 2008).
Mass Resources for Focused Literacy Instruction & Intervention in the Primary Grades. The school organizes its resources to provide the most intensive general-education literacy instruction and intervention support in the early grades (Grades K through 2), because research suggests that student reading deficits can be addressed in these primary grades with far less effort and with better outcomes than for students whose reading deficits are addressed in later grades (Foorman, Breier, & Fletcher, 2003).
Avoid Use of Less Effective Reading Instructional Strategies. Classrooms make minimal use of inefficient instructional reading activities such as Round Robin Reading, which can result in poor modeling of text reading and reduced rates of actual student reading engagement and may also cause emotional distress for poor readers (Ash, Kuhn, & Walpole, 2009; Ivey, 1999). Furthermore, the school has a clear and shared understanding that purposeful, focused reading interventions are required to help struggling readers: the passive strategy of grade retention has not been shown to be an effective means of reading intervention (Foorman, Breier, & Fletcher, 2003).
Adopt Evidence-Based Tier 2 (Supplemental) Reading Interventions for Struggling Students. The school has a range of evidence-based Tier 2 intervention options for those students who fail to respond adequately to classroom literacy instruction alone.
Group-based Tier 2 interventions are capped at 7 students, and all children in those groups share the same general intervention need (Burns & Gibbons, 2008). Tier 2 instruction is more explicit (e.g., contains more direct-instruction elements), more intensive (e.g., more teacher attention), and more supportive (e.g., timely performance feedback, praise, and encouragement) than the reading instruction that all children receive (Foorman & Torgesen, 2001).
Promote Ongoing Professional Development. The school supports teachers with professional development as they implement any reading program (Foorman, Breier, & Fletcher, 2003). Training addresses such key topics as:
• understanding the underlying research, instructional objectives, and components of the program
• managing the classroom during reading activities
• moving at an appropriate instructional pace
• grouping students.
References
Ash, G. E., Kuhn, M. R., & Walpole, S. (2009). Analyzing "inconsistencies" in practice: Teachers' continued use of round robin reading. Reading & Writing Quarterly, 25, 87-103.
Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.
Foorman, B. R., Breier, J. I., & Fletcher, J. M. (2003). Interventions aimed at improving reading success: An evidence-based approach. Developmental Neuropsychology, 24, 613-639.
Foorman, B. R., & Torgesen, J. (2001). Critical elements of classroom and small-group instruction promote reading success in all children. Learning Disabilities Research & Practice, 16, 203-212.
Ivey, G. (1999). A multicase study in the middle school: Complexities among young adolescent readers. Reading Research Quarterly, 34, 172-192.
Kosanovich, M., Ladinsky, K., Nelson, L., & Torgesen, J. (n.d.). Differentiated reading instruction: Small group alternative lesson structures for all students. Florida Center for Reading Research.
Retrieved November 5, 2008, from http://www.fcrr.org/assessment/pdf/smallGroupAlternativeLessonStructures.pdf
Mathes, P. G., Torgesen, J. K., Clancy-Menchetti, J., Santi, K., Nicholas, K., Robinson, C., & Grek, M. (2003). A comparison of teacher-directed versus peer-assisted instruction to struggling first-grade readers. The Elementary School Journal, 103(5), 459-479.
Discussion Activity: What is Your School's Capacity to... Verify that the School's Reading Program is 'Evidence-Based'?
Directions: Read the quote and RTI Literacy goal statement below. Then discuss strengths, limitations, and next steps that your school should follow to pursue this goal.
"Although commercially prepared programs and the subsequent manuals and materials are inviting, they are not necessary. ... A recent review of research suggests that interventions are research based and likely to be successful, if they are correctly targeted and provide explicit instruction in the skill, an appropriate level of challenge, sufficient opportunities to respond to and practice the skill, and immediate feedback on performance...Thus, these [elements] could be used as criteria with which to judge potential [school] interventions." (Burns & Gibbons, 2008; p. 88).
RTI Literacy Goal: Verify that the School's Reading Program is 'Evidence-Based'. The school has an evidence-based reading program in place for all elementary grades.
• The program is tied to a well-designed literacy curriculum and may consist of one or several commercial reading-instruction products.
• The program is supported by research as being effective.
• Teachers implementing the reading program at their grade level can describe its effective instructional elements.
What are strengths in your system that can help you to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
What are possible limitations or challenges within your system that may impede ability to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
What are key 'next steps' that your school should pursue to advance the goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools. New York: Routledge.
Foorman, B. R., & Torgesen, J. (2001). Critical elements of classroom and small-group instruction promote reading success in all children. Learning Disabilities Research & Practice, 16, 203-212.
Discussion Activity: What is Your School's Capacity to... Use Benchmarking/Universal Screening Data to Verify that the Current Core Reading Program is Appropriate?
Directions: Read the quote and RTI Literacy goal statement below. Then discuss strengths, limitations, and next steps that your school should follow to pursue this goal.
"…we want to emphasize that effective interventions for almost all children highly at risk for reading disabilities should contain strongly explicit instruction in the knowledge and skills required for learning to read words accurately and fluently, and that this instruction should be balanced and integrated with explicit instruction in other language and reading skills that are also important for good reading comprehension." (Foorman & Torgesen, 2001; p. 209).
RTI Literacy Goal: Use Benchmarking/Universal Screening Data to Verify that the Current Core Reading Program is Appropriate. The school uses benchmarking/universal screening data in literacy to verify that its current reading program can effectively meet the needs of its student population at each grade level in the key areas of phonemic awareness, alphabetics (phonics), vocabulary, reading fluency, and reading comprehension. When evaluating its benchmarking/universal screening data, the school uses the cutpoint of "80% student success" to determine whether its core reading program presently meets the needs of its population. Depending on its findings from the screening data, the school may adopt a reading program that has more or fewer of the elements of direct instruction.
What are strengths in your system that can help you to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
What are possible limitations or challenges within your system that may impede ability to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
What are key 'next steps' that your school should pursue to advance the goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
Foorman, B. R., & Torgesen, J. (2001). Critical elements of classroom and small-group instruction promote reading success in all children. Learning Disabilities Research & Practice, 16, 203-212.
Discussion Activity: What is Your School's Capacity to... Establish a Breadth of Instructional Expertise in Reading?
Directions: Read the quote and RTI Literacy goal statement below. Then discuss strengths, limitations, and next steps that your school should follow to pursue this goal.
"Whitehurst and Lonigan (1998) recently identified two broad classes of emergent literacy skills that children bring with them to school and that have a substantial impact on how easily they learn to read. One group of skills, referred to as "inside-out" skills, includes phonological awareness and letter knowledge, while the other group, called "outside-in" skills, includes vocabulary and conceptual knowledge. Skills like phonemic awareness and letter knowledge are particularly important predictors of the ease with which children acquire word-reading accuracy and fluency…, while broad oral language facility (vocabulary in particular) becomes critically important to the growth of reading comprehension skills once children learn to read words efficiently…" (Foorman & Torgesen, 2001; p. 207)
RTI Literacy Goal: Establish a Breadth of Instructional Expertise in Reading. Teachers are knowledgeable about the causes of reading delays. They understand that the most common explanation for reading delays for students entering kindergarten is that, prior to public school, those delayed students have not had the same exposure to spoken vocabulary, phonemic awareness activities, and print as their more advanced classmates.
Teachers therefore have the appropriate knowledge and skills to provide Tier 1 literacy instruction to children with delays in reading skills; classroom teachers have the instructional expertise to teach children whose reading skills are up to 2 years below those of their classmates.
What are strengths in your system that can help you to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
What are possible limitations or challenges within your system that may impede ability to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
What are key 'next steps' that your school should pursue to advance the goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
Foorman, B. R., & Torgesen, J. (2001). Critical elements of classroom and small-group instruction promote reading success in all children. Learning Disabilities Research & Practice, 16, 203-212.
Discussion Activity: What is Your School's Capacity to... Adopt Efficient Methods of Instructional Delivery and Time Management?
Directions: Read the quote and RTI Literacy goal statement below. Then discuss strengths, limitations, and next steps that your school should follow to pursue this goal.
"The most effective early intervention is prevention—in the form of differentiated classroom instruction.
Many techniques and programs exist for helping classroom teachers with small-group instruction, such as classwide peer tutoring…and cooperative grouping. But one of the persistent problems of differentiated classroom instruction is how to engage classroom teachers in continuous progress monitoring and translating the results of assessment to differentiated instruction." (Foorman & Moats, 2004; p. 54).
RTI Literacy Goal: Adopt Efficient Methods of Instructional Delivery and Time Management. The teacher uses an appropriate range of efficient instructional-delivery and time-management methods to provide effective reading instruction to a wide range of student readers. Examples include:
• reading centers (Kosanovich et al., n.d.)
• using students as peer tutors (e.g., Mathes et al., 2003)
• incorporating paraprofessionals (Foorman, Breier, & Fletcher, 2003), adult volunteer tutors, or other non-instructional personnel under teacher supervision to review and reinforce student reading skills
• scheduling core literacy instruction at the same time for each grade level to allow students to access reading instruction across classrooms as needed (cf. Burns & Gibbons, 2008).
What are strengths in your system that can help you to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
What are possible limitations or challenges within your system that may impede ability to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
What are key 'next steps' that your school should pursue to advance the goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
Foorman, B. R., & Moats, L. C. (2004). Conditions for sustaining research-based practices in early reading instruction. Remedial & Special Education, 25, 51-60.
Discussion Activity: What is Your School's Capacity to... Mass Resources for Intensive Literacy Instruction & Intervention in the Primary Grades?
Directions: Read the quote and RTI Literacy goal statement below. Then discuss strengths, limitations, and next steps that your school should follow to pursue this goal.
"Risk for reading failure always involves the interaction of a particular set of child characteristics with specific characteristics of the instructional environment. Risk status is not entirely inherent in the child, but always involves a "mismatch" between child characteristics and the instruction that is provided." (Foorman & Torgesen, 2001; p. 206).
RTI Literacy Goal: Mass Resources for Focused Literacy Instruction & Intervention in the Primary Grades. The school organizes its resources to provide the most intensive general-education literacy instruction and intervention support in the early grades (Grades K through 2), because research suggests that student reading deficits can be addressed in these primary grades with far less effort and with better outcomes than for students whose reading deficits are addressed in later grades (Foorman, Breier, & Fletcher, 2003).
What are strengths in your system that can help you to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

What are possible limitations or challenges within your system that may impede your ability to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

What are key ‘next steps’ that your school should pursue to advance the goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

Foorman, B. R., Breier, J. I., & Fletcher, J. M. (2003). Interventions aimed at improving reading success: An evidence-based approach. Developmental Neuropsychology, 24, 613-639.

Discussion Activity: What is Your School’s Capacity to… Avoid Use of Less Effective Reading Instructional Strategies?

Directions: Read the quote and RTI Literacy goal statement below. Then discuss strengths, limitations, and next steps that your school should follow to pursue this goal.

“Children’s status as readers is established early… Torgesen et al. (1997) showed that over 8 of 10 children with severe word reading problems at the end of the first grade performed below the average at the beginning of the third grade. Such evidence supports the view that early reading problems are the result of deficits rather than delay. In other words, the early childhood mantra ‘Just wait; they’ll catch up’ has no empirical basis.” (Foorman, Breier, & Fletcher, 2003; p. 626)

RTI Literacy Goal: Avoidance of Less Effective Reading Instructional Strategies.
Classrooms make minimal use of inefficient instructional reading activities such as Round Robin Reading, which can result in poor modeling of text reading and reduced rates of actual student reading engagement, and may also cause emotional distress for poor readers (Ash, Kuhn, & Walpole, 2009; Ivey, 1999). Furthermore, the school has a clear and shared understanding that purposeful, focused reading interventions are required to help struggling readers: the passive strategy of grade retention has not been shown to be an effective means of reading intervention (Foorman, Breier, & Fletcher, 2003).

What are strengths in your system that can help you to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

What are possible limitations or challenges within your system that may impede your ability to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

What are key ‘next steps’ that your school should pursue to advance the goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

Foorman, B. R., Breier, J. I., & Fletcher, J. M. (2003). Interventions aimed at improving reading success: An evidence-based approach. Developmental Neuropsychology, 24, 613-639.

Discussion Activity: What is Your School’s Capacity to… Adopt Evidence-Based Tier 2 (Supplemental) Reading Interventions for Struggling Students?

Directions: Read the RTI Literacy goal statement below.
Then discuss strengths, limitations, and next steps that your school should follow to pursue this goal.

RTI Literacy Goal: Adopt Evidence-Based Tier 2 (Supplemental) Reading Interventions for Struggling Students. The school has a range of evidence-based Tier 2 intervention options for those students who fail to respond adequately to classroom literacy instruction alone. Group-based Tier 2 interventions are capped at 7 students, and all children in those groups have the same general intervention need (Burns & Gibbons, 2008). Tier 2 instruction is more explicit (e.g., contains more direct-instruction elements), intensive (e.g., more teacher attention), and supportive (e.g., timely performance feedback, praise, and encouragement) than the reading instruction that all children receive (Foorman & Torgesen, 2001).

What are strengths in your system that can help you to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

What are possible limitations or challenges within your system that may impede your ability to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

What are key ‘next steps’ that your school should pursue to advance the goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

Foorman, B. R., & Torgesen, J. (2001). Critical elements of classroom and small-group instruction promote reading success in all children. Learning Disabilities Research & Practice, 16, 203-212.
Discussion Activity: What is Your School’s Capacity to… Promote Ongoing Professional Development?

Directions: Read the RTI Literacy goal statement below. Then discuss strengths, limitations, and next steps that your school should follow to pursue this goal.

RTI Literacy Goal: Promote Ongoing Professional Development. The school supports teachers with professional development as they implement any reading program (Foorman, Breier, & Fletcher, 2003). Training addresses such key topics as:
• understanding the underlying research, instructional objectives, and components of the program
• managing the classroom during reading activities
• moving at an appropriate instructional pace
• grouping students

What are strengths in your system that can help you to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

What are possible limitations or challenges within your system that may impede your ability to advance this goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

What are key ‘next steps’ that your school should pursue to advance the goal?
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________
• __________________________________

Foorman, B. R., Breier, J. I., & Fletcher, J. M. (2003). Interventions aimed at improving reading success: An evidence-based approach. Developmental Neuropsychology, 24, 613-639.
Building Capacity for Evidence-Based Assessment & Instruction Under RTI: Recommended Internet Sites

Here is a range of websites with free resources that support curriculum development, assessment, and instruction.

Recommended Instruction Practices

1A. National Reading Panel Report. The NRP was established by Congress in 1997 “to assess the status of research-based knowledge about reading, including the effectiveness of various approaches to teaching children to read.” The Panel issued a report in 2000 that established a shared vision of effective reading instruction. http://www.nationalreadingpanel.org/

1B. National Math Advisory Panel Report. Issued by a panel of math experts under the direction of the US Department of Education, this recent report includes 45 recommendations and findings on best practices in math instruction. http://www.ed.gov/mathpanel

Progress-Monitoring Resources

2. National Center on Student Progress-Monitoring. Supported by funding from the US Department of Education, this site reviews and analyzes academic assessment tools that track student response to interventions. The ‘Tools’ section of the site compares current assessment products to a set of objective progress-monitoring standards to evaluate their use to support RTI. http://www.studentprogress.org/

3. Easy CBM. The Easy CBM site was developed by Dr. Gerald Tindal and colleagues and provides free progress-monitoring materials and online data-storage and charting services for teachers. Users must enter an email address and minimal personal information to obtain a free log-in account. (NOTE: Math assessment materials will be available on this site soon.) http://www.easycbm.com

4. Intervention Central.
The Intervention Central website has a Reading Probe Generator for Oral Reading Fluency probes and a Math Worksheet Generator that will create CBM-friendly assessments in the basic math operations: addition, subtraction, multiplication, and division. Additionally, the site allows users to create several types of Early Math Fluency probes to track the foundation numeracy skills of children in grades K and 1. All worksheets are computer-generated and unlimited in number. http://www.interventioncentral.org

5. SuperKids. The SuperKids website allows users to create a series of math computation probes, including such areas as fractions, decimals, and percentages. All worksheets are computer-generated, so any number can be created. http://www.superkids.com/aweb/tools/math/

Instructional Resources

6. What Works Clearinghouse. Probably the most influential website for rating commercial products for general instruction and intervention, this site is sponsored by the U.S. Department of Education’s Institute of Education Sciences. The WWC reviews existing research on commercial products to determine whether they show evidence of being effective for instruction or supplemental intervention with school-age populations. http://ies.ed.gov/ncee/wwc

7. Florida Center for Reading Research. The Florida Center for Reading Research (http://www.fcrr.org/) is a great source of general information about early reading instruction. The site has also reviewed a number of reading programs for the primary grades. While the FCRR uses different criteria than the WWC to judge the effectiveness of reading programs, it does review research supporting various reading programs and summarizes the results in report format. Reports include detailed descriptions of the reading programs as well as a listing of their specific strengths and weaknesses. A full listing of reading programs reviewed by the FCRR can be found at: http://www.fcrr.org/FCRRReports/LReports.aspx

8. eNumeracy.com.
Developed by Dr. Scott Methe of East Carolina University, the eNumeracy site has begun to offer research summaries and ideas to “help advance knowledge about how young children develop number sense.” http://www.enumeracy.com

9. Math.com. This website is one of the most popular math instructional sites on the web. It offers ideas for learning and teaching math for teachers, students, and parents. Math topics covered range from basic arithmetic operations to algebra and other advanced math areas. http://math.com/

10. Mathwire. This site offers teachers and parents a wide range of activities and worksheets that “support the constructivist approach to learning mathematics and the NCTM Standards.” http://www.mathwire.com/

11. Algebra Help. This specialized site presents comprehensive tutorials on aspects of algebra. The site is especially suited for student use and can also be useful for math tutors. http://www.algebrahelp.com/