Using curriculum based measures to identify and monitor progress in an adult basic education program, final report
Publisher: University of Pittsburgh
Published At: Pittsburgh, PA
Date Published: 1988
Distributor: Institute for Practice and Research in Education; University of Pittsburgh
Source Address: 5N25 Forbes Quadrangle
Source City/State/Zip: Pittsburgh, PA 15260
Material Type: Contractor Report
Intended Audience: Policy Maker
Physical Media: Print
Subjects: Adult Basic Education; Educational Assessment
This final report of a 310 Project describes how curriculum based measures and procedures were used to monitor reading and writing performance for adults in an adult basic education program. The measure used for reading performance was repeated oral reading, and the measure used for writing was a three minute writing sample. Results suggest that curriculum based measures of reading and writing may be useful in an adult basic education (ABE) program because of their feasibility and reliability in monitoring the performance of adults. They may also serve as a supplement to the standardized measures often used to assess performance of adults.
Curriculum based measures and procedures to monitor reading and writing performance were developed and evaluated with adults (reading from beginning reading level through eighth grade level) in an adult basic education program. The most efficient, reliable, and feasible measure of reading performance was the repeated oral reading procedure (1 minute readings). The most feasible and efficient measure of writing was a fluency procedure (3 minute writing sample). Both measures enabled teachers to chart and monitor progress of adults throughout the program. Teachers reported that the oral reading and writing fluency measures were useful and easy to use. Students were also receptive to the measures as a means of obtaining feedback about their progress. Results of this project suggested that curriculum based measures of reading and writing may be useful in an adult basic education program because of their feasibility and reliability in monitoring the performance of adults. They may also serve as a supplement to the standardized measures often used to assess the performance of adults.
The purpose of this project, funded by the Division of Adult Basic Education (310 of the Adult Education Act), was to develop and test curriculum based procedures and measures to monitor and assess the reading and writing progress of adults in a basic education program. Although there has been a surge of interest in programs for improving the literacy skills of adults in the United States (Harmon, 1985), there are few reliable and valid assessment instruments with which to plan instruction and monitor the quality and impact of these adult literacy programs. The norm-referenced tests that are used to document overall progress (Webster, 1986) are limited for two reasons. First, they tend not to be useful for making instructional decisions. Second, since they generally do not relate to the content of the adult literacy curriculum, they are not sensitive to progress made by adult students in short duration adult education programs. In a survey conducted by the Adult Learning Division of the College Reading Association (Richardson, 1985), improvement of assessment procedures was listed as one of the crucial needs; respondents criticized the use of inappropriate standardized tests and anecdotal records for measuring program success. At the Roundtable Conference held by the Institute for the Study of Adult Literacy at the Pennsylvania State University, one specific need identified was that "a field-tested, adult-oriented diagnostic tool be developed to assess more adequately the adult population" (Proceedings, 1986).
Curriculum based measures are short tasks administered at frequent intervals to permit instructors to determine the effectiveness of their teaching and to make necessary modifications in instruction. Results from research with younger students indicated that the best measure of reading for use in a curriculum based system was the number of words read correctly by a pupil in 1 minute from the curricular materials used in the classroom (Deno, Mirkin & Chiang, 1982; Deno, Marston, Mirkin, Lowry, et al., 1982; Fuchs & Deno, 1981). Current research (Deno, Marston & Mirkin, 1982) also supports the validity of short (3 minute), frequent writing samples. These and other measures were investigated in this project. The project was implemented at the University of Pittsburgh in an adult basic education program, the Pittsburgh Adult Competency Program (PAC), Institute for Practice and Research in Education, September 1988 through June 1989. PAC, a ten-week literacy program funded by the City of Pittsburgh for unemployed adults reading below the eighth grade reading level, was held on campus in university classrooms. The PAC staff consisted of five instructors, one of whom served as supervisor in addition to assuming full instructional responsibilities. Four were language arts teachers and one was a math teacher. The program also supported one full-time job developer. This individual helped with training related to job awareness; however, his major responsibility was to obtain job placements or additional training opportunities for those who completed PAC.
The program had three consecutive cycles in a year (nine weeks of instruction and one week of job search). Students met for a three-hour period each day, Monday through Friday, for a total of forty-five three-hour sessions. For four days a week, students followed a systematic schedule which included instruction in reading, writing, math, and job readiness. One day each week, students had the opportunity to hear speakers from various companies, to visit places of potential employment, or to work in special interest areas (for more information about the PAC program, see Bean & Johnson, 1987).
The results of this project contained in this report should be useful to adult educators interested in curriculum based measures as a means of monitoring reading and writing progress, as well as to those responsible for developing and managing adult basic education programs. The results should be helpful also to educators interested in measurement issues related to curriculum based procedures.
Members of the Research Team were: Rita M. Bean, Director, Adelle Byra, Project Coordinator, Roland Good, Suzanne Lane, and Rhonda Johnson. The PAC teachers were: Louise Hammond, Rhonda Johnson, Arzella McCauley, Martha Weiss, Kent Weaver, and the job developer was Arthur Bailey.
Part I of this report contains five chapters: an introductory chapter and three chapters (II, III, and IV) in which procedures and results pertaining to each of the objectives of the project are described. Chapter V contains the findings, conclusions, and recommendations for the entire project. Part II is an Instructor's Guide that provides specific information about how to develop and use curriculum based procedures in an adult literacy program.
This report has been filed with the Department of Education, Bureau of Vocational and Adult Education, Division of Adult Basic Education, 333 Market Street, Harrisburg, PA 17126-0333, and with Advance, the Pennsylvania Department of Education Resource Center.
FINDINGS, CONCLUSIONS, AND RECOMMENDATIONS
The overall purpose of this project was to develop and test curriculum based measures and procedures with adults enrolled in an adult basic education program. The specific objectives were to: (a) develop and test curriculum based measures and procedures with students in Cycle I; (b) evaluate the curriculum based measures and procedures that were modified as a result of Cycle I findings and used in Cycle II; and (c) implement and test the use of the modified reading and writing measures and procedures in Cycle III, investigating the validity, reliability, and feasibility of these curriculum based measures. The information below summarizes the findings of the entire year's efforts.
1. The most efficient and feasible measure of reading performance was the repeated oral reading procedure (1 minute readings), conducted once a week. This measure enabled teachers to chart and monitor the performance of adults consistently and efficiently. Several different procedures (retellings, vocabulary definitions, 1 minute silent readings) were investigated; however, because of the ease of administering the oral reading procedure, in addition to the evidence that oral reading fluency measures correlate highly with other measures of reading performance (Fuchs, 1986; Fuchs, 1988), the one minute oral readings became the primary procedure used in the program.
2. Alternate form reliability coefficients calculated during Cycle II indicated that performance on narrative material (.94) was more reliable than performance on expository material (.68). We suspect that unfamiliarity with topic and text structure in expository material may have affected consistency in performance on the expository material, since readability was controlled on all passages.
3. Alternate form reliability of the oral reading fluency measures on narrative passages, established by having students read two selections at three different levels, was high (Cycle III). Coefficients were .91 on fourth grade passages, .88 on sixth grade passages, and .92 on eighth grade passages. Results of a repeated measures ANOVA indicated that students performed significantly better (F=89.79, p <.001) on the fourth grade passages. Performance on the sixth and eighth grade passages appeared to be approximately the same.
4. Interrater reliability, calculated using percent agreement between two raters, was also high (number of words read correctly in one minute, 98.8%; total number of words read in one minute, 99.7%).
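The percent agreement statistic reported above is straightforward to compute: count the word-level judgments on which the two raters agree and divide by the total number of judgments. As a present-day illustration (the function name and data below are hypothetical, not part of the project's materials), the calculation can be sketched in Python:

```python
def percent_agreement(rater_a, rater_b):
    """Percent agreement between two raters' word-level judgments.

    Each list holds one correct/incorrect judgment per word read;
    agreement is the share of words the raters judged identically.
    """
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must judge the same words")
    agreements = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * agreements / len(rater_a)

# Hypothetical data: two raters disagree on 1 of 100 words.
a = [True] * 100
b = [True] * 99 + [False]
print(percent_agreement(a, b))  # 99.0
```

A figure such as the 98.8% reported for words read correctly would mean the two raters coded roughly 988 of every 1,000 word judgments identically.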
5. Criterion related validity coefficients of the oral reading fluency measures proved somewhat inconsistent. Correlating performance (scaled scores) on the CAT reading comprehension sub-test, level 18, administered prior to program admission, with the first oral reading fluency measure yielded coefficients of .57 (Cycle I), .24 (Cycle II), and .43 (Cycle III) on fourth grade passages. Correlating performance on the CAT post-test with the last oral reading measure yielded coefficients of .62 (Cycle I), .49 (Cycle II), and .18 (Cycle III). The correlations between the grade equivalent scores on the Woodcock Word Recognition Sub-test (post-test) and final oral reading fluency measures were .41 (Cycle I), .12 (Cycle II), and .41 (Cycle III).
Correlating a teacher estimated reading level at the end of Cycle III for each student with the last oral reading fluency measure yielded a validity coefficient of .60. (The correlation between the post CAT scores and the teacher estimates also yielded a coefficient of .60.)
The inconsistent validity coefficients between the standardized comprehension measures and the CBM's may be caused by several factors: (1) the small sample size; (2) the restricted range in reading performance of these individuals on both the standardized measures and the CBM's; and (3) the fact that these procedures may be measuring different aspects of the reading process. This is certainly an area requiring further study.
6. The aim line, used in the final cycle, proved to be an important and useful part of the CBM procedures. Both teachers and students responded positively to the aim line and commented on its motivational aspect. Teachers felt the aim line could be used effectively to make instructional decisions. The decision to establish an aim line based upon a gain of 3 words per reading, and then to modify the line as the student progressed through the program, proved to be a useful one. (The three-word gain per reading was determined based upon the actual average gain of students per reading sample in the first two cycles.)
7. Teachers found the oral reading fluency measures to be informative and easy to use. The average amount of time spent with each student per testing session (in Cycle III) was four and a half minutes.
8. Students were also receptive to the measures: they generally enjoyed doing the one minute readings and most felt that these one minute readings helped them to improve their reading. Students appreciated seeing their graphs. The two most frequently mentioned negative comments related to the "timing" aspect of the measures and concerns about reading aloud.
9. In Cycle III, seventeen of the nineteen students (89%) evidenced gains on the oral reading measures; the mean gain of the group on the CBM's was 17.0 CWPM. Using the CAT post-test scores, 16 of 19 (84%) of the students in this cycle evidenced gains. The mean gain on the CAT in this cycle was a grade equivalent score of 1.6.
In Cycle II, 13 of 15 (87%) of the students evidenced gains on the CBM's and 9 of 15 (60%) on the CAT post-test. The mean gain of this group on the CBM's was 10.6 CWPM. In this cycle, mean gain on the CAT was a grade equivalent score of .5.
1. The writing fluency procedure (3 minute writing sample) proved to be an efficient and feasible means of monitoring the writing of students in the program. Changes in procedures from Cycle I to Cycle III included: (1) establishment of a one-minute think time for students before actual writing began; (2) refinement and modification of topics so that they were of interest to students and also related to students' experiences and needs.
2. Interrater reliability was extremely high, with a reliability coefficient of .99 between two raters based upon the number of words written for each of the writing prompts.
3. Since no formal measures of writing or grammar were given in this adult literacy program, the only measure of criterion validity calculated was that between the fluency measure (CBM) and a holistic scoring calculated on a final untimed writing sample. The correlation between the final fluency scores (CBM's) and the idea scores obtained on the untimed prompt was .32. The correlation between the final fluency scores and sentence structure on the untimed prompts was .11.
4. The aim line, instituted in the final cycle, was highly valued by both teachers and students. The aim line was increased by two words per writing based upon average gain of students in previous cycles.
5. All teachers indicated that the writing procedure was helpful and that they would continue to use it in the future. However, teachers did indicate that students were not as receptive to the writing task as they were to the reading task.
6. Seventeen of the nineteen students in Cycle III (89%) evidenced gains on the writing fluency measure. Overall, the mean gain for the group in Cycle III was 8.2 words/3 minutes.
Thirteen of the fifteen students in Cycle II (87%) evidenced gain on the writing fluency measures. The mean gain of the group in Cycle II was 12.7 words/3 minutes.
7. The rankings of students' perception of the difficulty of each prompt correlated negatively with the rankings of the writing samples, based upon number of words written. Overall, the results suggested that there was no relationship between the group's perception of the difficulty of each prompt and the group's actual performance.
1. The one minute oral readings, which were administered once a week, were an efficient and reliable means of measuring and monitoring the reading performance of adults in a basic adult literacy program, with adults who read from beginning reading level through eighth grade level. Materials used throughout the project were at the estimated instructional level of the group of students, and results suggested that this procedure of using instructional level material, rather than higher level material, should be continued. It proved to be an important aspect of the procedures and was effective as a means of monitoring student progress and as an indicator of the need to modify instruction.
The oral reading fluency measures were one of several measures used to assess the effectiveness of the program. Students showed evidence of gains on both the CBM measures and the standardized measures; and the CBM measures were particularly sensitive to gains in the short term program. However, at this stage, the CBM's should really only be considered as one indicator of program evaluation.
1. The three minute writing samples, which were administered once a week, were useful for monitoring the learning of students. Topics selected should be those of interest to the students and should be based upon their experiences. An unexpected finding was that the difficulty of the writing task, as perceived by students, related negatively to the number of words written. Overall, there was no relationship between the group's perception of topic difficulty and actual performance.
The writing fluency measures were one of the measures used to assess program effectiveness. Students in this project showed evidence of making gains, although not substantial ones, on the CBM procedure for writing. However, at this stage, the CBM's should really only be considered as one indicator of program evaluation.
Results of this project suggested that CBM measures of reading and writing can be useful in an adult basic education program because of their feasibility and reliability in monitoring the performance of adults. They provided a built-in mechanism for tracking performance. Moreover, they were extremely motivational for adult students, who could track their performance over the course of the program. Finally, the procedures provided a supplement to the current evaluation tool, the standardized tests given as pre-post measures. However, as useful as these CBM measures were, they are only one means of monitoring performance. They should not dictate program emphasis or approaches (i.e., a student who is poor in oral fluency should not simply be given much more practice in oral reading). The teacher must focus on the instructional needs of the students (decoding, vocabulary, or comprehension), with the expectation that growth in any area of reading will be reflected in the oral reading performance of the students.
Results of this project indicated that the curriculum based measures in reading and writing were effective in helping teachers monitor the ongoing progress of the adults. Teachers and students responded favorably to the measures; teachers indicated that they would continue to use the procedures in future programs. Yet these procedures cannot be used unless time is allotted for administration and training is provided for staff as to their use.
TIME FOR ADMINISTRATION
Given that the CBM's are repeated and frequent measures, time had to be allocated within each week of instruction for administration and analysis of results. In this project, the Project Coordinator assisted with the actual administration and interpretation of the measures. Moreover, the Project Coordinator assumed responsibility for organizing the materials that would be used for testing. Any program that implemented CBM's would need to plan carefully so that materials could be selected (perhaps before the program began), and to make decisions about how and when CBM's would be administered.
Moreover, if the program were one in which teachers worked with groups, various management decisions would need to be considered (what other students will do when one student is being tested). The use of an aide would certainly facilitate testing in such a program.
Instructors who are not familiar with CBM's could benefit from staff development sessions so that they have some understanding of the rationale for such measures, ideas for selecting materials, coding oral readings, establishing aim lines, and using the graph to modify instruction. The User's Guide developed as part of this project should be a helpful document in staff training.
1. There is a need for continued research regarding the validity of CBM measures with adult populations, both the reading and writing procedures. A larger and more heterogeneous sample of adults would be helpful in assessing the validity of these measures.
2. There is a continued need to investigate the viability of these and similar procedures with other groups of adult students reading at higher levels and in different types of programs. Now that procedures have been streamlined and some baseline data have been collected, there is much opportunity to implement these procedures in various programs and to share results.
3. Investigations regarding criteria for establishing and modifying the aim line, for both the reading and writing measures, should be conducted. In this project, criteria for establishing the aim line were determined based upon average group gains (an arbitrary figure that provided a starting point). However, research in which various criteria are studied would provide important information to those interested in using curriculum based measures as a means of monitoring individual student progress.
4. Research is needed on whether students can be accurately "placed" into reading materials using results from curriculum based measures. Establishment of norms regarding adult performance on oral reading materials would be helpful in making determinations about placement. At the present time in adult programs, placement decisions based on the results of a comprehensive battery of tests may be too time consuming, thus reducing instructional opportunities. Other alternatives include making placement decisions based on a single standardized test score or using CBM's by themselves, or as a supplement to other forms of testing that are used for placement decisions. However, the validity of using any of these procedures for placement in adult programs needs to be established.
5. The use of CBM's as an indicator of the effectiveness of a program for adults should continue to be studied. Because many adult programs are of short duration, and since standardized test results do not tend to be sensitive to gains in such short periods, CBM gains may prove to be helpful to educators as an alternative or supplement to standardized indicators of effective programming.
6. Continue to investigate other reading procedures for obtaining curriculum based measures. Although the oral reading samples were highly effective, there are several concerns: (1) negative reaction by adults to reading aloud, and (2) need for individual assessment. The use of the computer to obtain reading data might also be investigated.
7. Continue to investigate writing procedures other than fluency for obtaining curriculum based measures, particularly with adults who have more sophisticated writing skills than the adults in this program.
GUIDE TO USING CURRICULUM BASED MEASURES OF READING AND WRITING IN AN ADULT LITERACY PROGRAM
The procedures and materials provided in this user's guide were developed based on results of a 310 project funded by the Pennsylvania Department of Education during the 1988-89 year. These curriculum based measures (CBM's) were implemented in an adult literacy program (PAC) at the University of Pittsburgh.
Adult educators are encouraged to utilize these approaches, to study their effectiveness in monitoring progress of students, and to suggest modifications based upon experiences with adults reading at various levels and with different instructional needs. Although the CBM's were used with groups of students, these procedures would be particularly useful to those working with adults on an individual basis. Although it may take some time to select and organize the materials necessary to utilize these procedures, once materials have been selected, the actual implementation of the procedures is efficient and useful.
The guide is divided into two sections: (1) directions and guidelines for using curriculum based procedures for monitoring progress of students in an adult literacy program; and (2) suggestions/ideas for implementation developed by teachers in the PAC project. Samples of the materials and charting procedures used in the program are in Appendices D to J.
MONITORING PROGRESS OF ADULTS IN READING AND WRITING
The need for more efficient and reliable instruments to determine the ongoing progress of adults has been cited as a high priority for those interested in adult basic education. Curriculum based measures provide one alternative to the standardized tests generally used in most programs.
Curriculum based measures are short tasks administered at frequent intervals to permit instructors to determine the effectiveness of their teaching and to make necessary modifications in instruction. These procedures tend to be sensitive to growth in performance over relatively short durations, an important characteristic for adults who are eager to learn as much as possible in a short amount of time. The two curriculum based tasks described below are: (a) oral reading fluency (number of words read correctly in one minute); (b) writing fluency (number of words written during a three minute writing task).
The procedure used to monitor growth in reading was a series of one minute oral readings administered once a week to students on an individual basis. Specific directions and guidelines for implementing these procedures are described below.
1. *Determine levels at which students will read.* The adults in PAC read materials that were at their instructional levels (the level at which the teacher was working with students). We suggest beginning with materials at that level (to provide initial success and build confidence), and then changing levels, if necessary, as the student progresses through the program. The instructional level can be determined from the results of a standardized test, an informal reading inventory, or on the basis of teacher judgment and diagnostic teaching. Once this level has been determined, obtain passages written at that level.
2. *Selection and preparation of materials.* Although material can be selected from any source, the following guidelines were found to be helpful.
- a. Select materials from those available in the literacy program so that they are of interest to adults. Moreover, there are many high interest, low readability materials that often can be used for these measures. Narrative (story type) materials provided more consistent results than did expository or information type passages in this project.
- b. Select passages from various graded materials; otherwise, calculate the readability of the passage using the Fry readability formula (Fry, 1968).
- c. Randomly select passages of at least 300 words from these materials for the oral reading procedures. Make a copy for teacher coding or prepare a coding format sheet such as the one in Appendix D. This format was used so that teachers could calculate the number of words read easily and efficiently.
- d. Read through the passages and prepare a short purpose statement that will give the student some prior information about what he/she will be reading and help the student relate the text material to his/her prior knowledge. (e.g., This selection is about a man who, when he pulls into a truck stop for coffee, does not see what he expects to see.)
3. *Procedures for administration.* Follow these procedures each time a student reads orally.
Materials Needed: text material for students (marked so that the student knows where to begin, or retyped) (see Appendix E); copy of text material for teacher or format sheet (see Appendix D for sample); stopwatch.
- a. Show each student his/her graph before the student begins reading so that student has a sense of what was accomplished during the previous reading.
- b. Read instructions to students. "I'm going to have you read one selection aloud each week. You will read the selection for one minute only. It is important that you read as quickly and as carefully as you can. I will be timing you so that we can keep track of your reading and to help me plan my lessons better."
- c. Read the prompt that you have developed for the student for that specific passage.
- d. Point to the first word of the selection (on the student's copy) and tell the student to begin reading.
- e. Begin timing as soon as the student begins to read. The student reads for one minute only.
- f. On the teacher copy of the text, put a line through words that the student has difficulty with or omits. (Teachers may also code miscues as to type for more diagnostic information.) (For additional information about coding, see Appendix F.)
- g. If a student spends several seconds on a word and is still unable to decode it, pronounce the word so that the student can continue reading.
- h. When students have completed reading, commend them for their efforts (Good job, Sue!; You really tried hard on that passage, Jim). You may also wish to provide feedback about the student's performance by giving an estimate of what the student's score (CWPM) was.
4. *Graphing the results.* Calculate the number of words read and the number of miscues made. The correct words per minute (CWPM) score is calculated by subtracting the number of miscues from the total number of words read. (For example, a student who read 45 words and substituted 4 words in the text would receive a score of 41 CWPM.) (See Appendix F for suggestions on how to count miscues, and Appendix G for recording student reading data.) The CWPM is placed on a graph (see Appendix H). The graph provides an important means of sharing information with the student and charting progress.
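The CWPM score is simple arithmetic, which is part of what makes the procedure feasible for weekly use. A minimal present-day sketch (the function name is hypothetical), using the report's own example of 45 words read with 4 substitutions:

```python
def correct_words_per_minute(total_words_read, miscues):
    """CWPM: total words read in one minute minus the miscues made."""
    if miscues > total_words_read:
        raise ValueError("miscues cannot exceed words read")
    return total_words_read - miscues

# The report's example: 45 words read, 4 substitutions.
print(correct_words_per_minute(45, 4))  # 41
```

Each weekly CWPM value becomes one data point on the student's graph.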
5. *Determine an aim line.* One of the important parts of the procedure is the establishment of an aim line, or goal, for each student. The aim line provides a standard by which each student can determine whether he/she is making progress in the program. To establish an aim line, first obtain a baseline score for each student by calculating the average of at least the first two readings. Specific procedures are:
- a. Establish baseline levels (e.g., a student read 82 and 85 correct words per minute on the first two readings; the baseline, rounded, was therefore 84 correct words per minute).
- b. Determine the number of one minute readings to be administered during the program. There should be at least one reading a week, and if possible, two.
- c. Determine the final goals. (In order to establish an aim line quickly, given the 10 week program, and based upon the performance of adults in PAC, we calculated our aim line based upon an average gain of 3 words per reading.) Example of calculation:
--- Step 1: Baseline (average of first two readings): (82 + 85) / 2, rounded to 84 CWPM
--- Step 2: Number of readings after baseline readings: 10
--- Step 3: Multiply estimated gain per reading* by number of readings to obtain estimated total gain (3 x 10 = 30)
--- Step 4: Add baseline plus estimated total gain to obtain final goal (84 CWPM + 30 = 114 CWPM).
* One could calculate the estimated gain per reading for each student based upon the average gain of the first three readings (e.g., 82, 85, 83 = +1 estimated gain/reading) and calculate the aim line based upon these data. The critical point is that the aim line can be modified based upon the ongoing performance of the individual student.
- d. Draw the aim line on the graph and share with the student. Explain that this goal is increased by 'x' number of words a reading (thus the slanted aim line), and that it may change as instruction progresses.
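The aim line calculation in step 5 can be sketched as a short present-day helper (the function name is hypothetical; the figures are the report's example of baseline readings of 82 and 85, a gain of 3 words per reading, and 10 readings):

```python
def aim_line(baseline_readings, gain_per_reading, n_readings):
    """Return (baseline, final goal, per-reading aim points).

    baseline_readings: CWPM scores used to establish the baseline
    gain_per_reading: expected gain per reading (3 in this project)
    n_readings: readings administered after the baseline readings
    """
    baseline = round(sum(baseline_readings) / len(baseline_readings))
    goal = baseline + gain_per_reading * n_readings
    # One aim point per scheduled reading; plotted, these form the slanted line.
    points = [baseline + gain_per_reading * i for i in range(1, n_readings + 1)]
    return baseline, goal, points

baseline, goal, points = aim_line([82, 85], 3, 10)
print(baseline, goal)  # 84 114
```

The per-reading points are what get drawn on the graph; recomputing them with a revised gain is how the line is modified as the student progresses.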
6. *Use the aim line for monitoring instruction.* If students are making steady progress toward their goals, the aim line can be left as is, and instruction can proceed. However, if students are not making steady progress toward their weekly goals (two consecutive readings fall below the aim line), several options can be considered.
- a. The aim line, or goal, can be modified. In this case, calculate the average gain (using all of the previous data points) to establish the beginning point of a new aim line.
- b. The teacher could modify or change the instruction for that student.
- c. The level of reading materials could be changed. For example, the student could be asked to read a lower or higher level text.
- d. The teacher may also consider other alternatives, based upon experiences with the student.
7. *Sharing the results.* After the first reading, show students their graphs and explain the aim line. It is important for students to see what they have accomplished and to have a goal. A strategy used in the project was to calculate the CWPM quickly and share this information with the student immediately after the reading, and then to explain that this was an "estimated point" and that the student would see the "actual" score just before the next reading. This immediate feedback was appreciated by the students.
8. *Additional option.* Discuss with students any pattern of incorrect responses, or miscues. In this way, the readings can be used for instructional purposes. Moreover, the adults appreciated immediate feedback about words that they experienced difficulty with during the readings.
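The aim-line arithmetic in Steps 1-4 above, and the two-readings-below-the-line rule from item 6, can be sketched in a few lines of code. This is only a sketch with hypothetical function names; the 3-words-per-reading gain and the 10-reading schedule are the example values from the project.

```python
def reading_aim_line(first_two_readings, readings_after_baseline, gain_per_reading=3):
    """Return (baseline, final goal) in CWPM for a student's aim line."""
    # Baseline: average of the first two 1-minute readings, rounded half up.
    baseline = int(sum(first_two_readings) / len(first_two_readings) + 0.5)
    # Final goal: baseline plus the estimated gain over the remaining readings.
    final_goal = baseline + gain_per_reading * readings_after_baseline
    return baseline, final_goal

def needs_review(cwpm_scores, aim_line_goals):
    """True when the last two readings both fall below the aim line."""
    return len(cwpm_scores) >= 2 and all(
        score < goal for score, goal in zip(cwpm_scores[-2:], aim_line_goals[-2:])
    )

# The example from step c: first readings of 82 and 85 CWPM, 10 more readings.
print(reading_aim_line([82, 85], 10))  # (84, 114)
```

If the goal is modified mid-program (option 6a), the same function can be rerun with the average gain observed so far substituted for `gain_per_reading`.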
The procedure recommended for use is a 3 minute writing sample, administered at least once a week. The writing sample can be administered to small groups of students or on an individual basis. The procedure includes a one minute "think" time during which students are given an opportunity to mentally organize their thoughts. The criterion used to assess growth is fluency or number of words written in three minutes.
1. *Selecting prompts.* Select a number of topics that are of interest to your students and that relate to experiences they may have had. (A list of prompts successful in this project is provided in Appendix I.)
2. *Procedures for administration.* These procedures are used each time a student does a writing sample.
Necessary materials: stopwatch for timing, writing prompt, pen or pencil.
- a. Read the following instructions to student(s): "I am going to have you do some writing. The writing will only take three minutes. I will read the writing task that you are to write about as you follow along on your paper. The goal is to see if having you write in this manner will help you to improve your writing. I would encourage you to try to write as much as you can about the topic. If you want to use a word that you are not sure how to spell, simply leave a blank or put down the first letter and leave the rest blank." Demonstrate for students on a board or a sheet of paper:
--- The boy jumped over the . . . .
--- The boy jumped over the f . . . .
Then continue: "Please do not worry about your spelling. I am most interested in two things: your ideas and how many words you can write in three minutes."
- b. Give the student the writing prompt and read the prompt aloud as students follow.
- c. Give students one minute to think about the prompt. Encourage students to make notes on their papers if they wish. This preparation is done individually by each student.
- d. When told to begin writing, the student writes as much as he/she can in three minutes.
- e. When 3 minutes are over, tell students to stop writing and collect the prompts. You may choose to discuss the topic and to share orally the students' responses to the prompt.
3. *Graph the results.* Calculate the number of words written and place this number on a graph (see Appendix J). Fluency was determined by counting the number of words written; there was no effort to look at correctness of spelling. (In this project, when students left "blanks" for words they did not know how to spell, these were not counted as words written.) Again, the graph provided an important means of sharing information with the student and charting progress.
4. *Determine an aim line.* To establish an aim line, first obtain a baseline score for each student by averaging the results of the first two writings. Specific procedures follow:
- a. Determine baseline level (average number of words written on first two writing samples).
- b. Determine the number of 3 minute writings to be administered during the program. There should be at least one writing sample administered each week.
- c. Determine the final goal. In order to establish an aim line quickly, and based upon the performance of adults in PAC, an aim line was calculated based upon an average gain in writing fluency of 2 words per writing sample. Example of calculation:
--- Step 1: Establish baseline: average of the first two writings ((19 + 22) / 2 ≈ 21 words per 3 minute writing = baseline of 21).
--- Step 2: Number of writings after baseline = 10
--- Step 3: Multiply estimated gain per writing* by the number of writings after baseline to obtain the estimated total gain (2 x 10 = 20).
--- Step 4: Add baseline plus estimated total gain to obtain final goal (21 + 20 = 41)
* One could calculate the estimated gain per writing for each student based upon the average gain of the first three writing samples (e.g., 19, 22, 23 = +4 estimated gain/writing) and calculate the aim line based upon these data. The critical point is that the aim line can be modified based upon the ongoing performance of individual students.
- d. Draw the aim line on the graph and share with the student. Explain that this goal is a starting point and that it may change as instruction progresses.
5. *Use the aim line for monitoring instruction.* If students are making steady progress toward their goal, the aim line can be left as is, and instruction can proceed. However, if students are not making steady progress toward their weekly goals (two consecutive writing measures fall below the aim line), several options can be considered by the teacher.
- a. The aim line, or goal, can be modified. In this case, the average gain of all of the previous data points can be used to establish the beginning point of a new aim line.
- b. The teacher could modify or change the instruction for that student.
- c. The teacher may also consider other alternatives, based upon his/her experiences with the student.
6. *Share the results.* After calculating the writing fluency score, share the graph with the student and explain the aim line. It is important for students to see what they have accomplished.
7. *Additional option.* The writing samples can be used to obtain additional diagnostic information or to provide probes for instruction. As a diagnostic tool, the teacher can observe various spelling or grammatical errors that may be addressed at another time in the program. Instructionally, the prompts can be used to lead into other writing activities related to a specific topic. Moreover, the topics of the prompts can be used for discussion after the writing samples have been completed. Students may share their responses and discuss differences and similarities. Teachers may also, once they have calculated the words/3 minutes, ask students to write more about that specific topic.
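The writing-sample scoring and aim-line steps above can be sketched the same way (hypothetical function names; following the project's convention, blanks left for unknown spellings are not counted as words written):

```python
def words_written(sample_text):
    """Count words in a 3-minute sample; blanks ("___") are not counted."""
    return sum(1 for token in sample_text.split() if any(c.isalnum() for c in token))

def writing_aim_line(first_two_counts, writings_after_baseline, gain_per_writing=2):
    """Return (baseline, final goal) in words per 3-minute sample."""
    # Baseline: average of the first two samples, rounded half up.
    baseline = int(sum(first_two_counts) / len(first_two_counts) + 0.5)
    return baseline, baseline + gain_per_writing * writings_after_baseline

# Example from step c: first samples of 19 and 22 words, 10 more writings.
print(writing_aim_line([19, 22], 10))  # (21, 41)
```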
SUGGESTIONS/IDEAS FOR IMPLEMENTING CURRICULUM BASED PROCEDURES IN AN ADULT LITERACY PROGRAM
The ideas and suggestions below were contributed by the PAC teachers on the basis of their experiences with implementing CBMs with adult students during the 1988-89 school year. The practical nature of their suggestions should be helpful to those interested in using these procedures, either with individuals or with a group.
1. Read directions very carefully before administering materials. This is important so that you are familiar with the materials before you introduce them to students.
2. Prepare materials ahead of time. If possible, put students' names and the date on each paper to save class time when you are actually working with students. Have everything needed (timer, student reading page, response sheets for instructor, graph).
3. Read instructions to the students (which should include the purpose for the CBMs) and the prompt for the passage the student will read.
4. Read the passage to become familiar with it before instructing students to begin. It helps with coding.
5. Show students the goal line prior to each reading. We found the graphs to be extremely motivating with students.
6. Try to relax the student with casual conversation before the reading. This gives the instructor a chance to see each student individually. Then if it is necessary, a longer conference time could be set up at a later date. It's helpful to touch base with each student at a personal level at least once each week.
7. Be flexible with testing schedule - we all have bad days. We had students who would come to class influenced by what had happened at home that morning. In some cases it was necessary to reschedule them to read the following day.
8. Choose high interest reading material.
1. After the first reading, show students their graphs and explain the goal line. It is really important for students to see how they have done and to have a goal. It's especially important that they see the graph right before they read, not a day before.
2. Code as the student reads and then calculate correct words per minute (CWPM). Share this information with students to give them an idea of where their point would be on the graph. Be sure to explain that this is an estimated point and that you will show them next week exactly how they did. This immediate feedback is appreciated by students.
3. Discuss with students their incorrect responses. Often students would want immediate feedback about words they had stumbled over during their reading.
4. If time permits, figure the CWPM point for the graph at the end of each reading so students can see if progress was made.
5. Be open to adjusting the aim line, if necessary. It will defeat the purpose of the goal line if students are continually way above or below it.
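Suggestion 2's correct-words-per-minute figure follows directly from the coding-sheet conventions (substitutions, omissions, and partial insertions count as errors; self-corrections and whole-word insertions do not). A minimal sketch, with hypothetical tallies:

```python
def correct_words_per_minute(words_read, substitutions, omissions, partial_insertions):
    """CWPM for a 1-minute reading: words read minus countable miscues."""
    return words_read - (substitutions + omissions + partial_insertions)

# Hypothetical 1-minute reading: 120 words read, with 4 + 2 + 1 countable miscues.
print(correct_words_per_minute(120, 4, 2, 1))  # 113
```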
SUGGESTIONS FOR IMPLEMENTATION WITH LARGE GROUP OR WHOLE CLASSROOM
1. During reading class the teacher can test five or six students each day while the others are working independently. This could be done at the very beginning of reading class. Show students their graphs before reading new passages each week. This helps motivate students to read more accurately.
2. An aide could tape record student readings and cross off any reading errors that the student might make during the reading. They could also put a small "c" where a self correction has been made. Then the teacher could use the aide's material or do a miscue analysis after the class period. This may save class time but could cause additional work for the teacher. If the teacher wishes to see only the readings and what errors were made, taping the session may be an option.
3. Teachers could team teach with another instructor during reading class. One could be responsible for the reading lessons while the other administered the CBM's. They could alternate this schedule each week, sharing the preparation for CBM's.
4. Plan a lesson in which the students will work in groups or individually after an initial introduction of the lesson. Once the class is working independently and the teacher has circled the room to answer questions, students can be pulled one at a time for administering the CBM's.
1. If presenting a writing prompt to a large class, be sure (if the written prompt is not given to each student) to write it on the board or have it on an overhead. Students need to see and hear the prompt.
2. Develop prompts that will be of interest to students. Be aware of grade level, outside interests, and subject matter.
3. Show students their goal line before they write each week.
4. Remind students each week that you are not checking their spelling and that they can leave a blank if they are unsure of how a word is spelled. Encourage them to attempt spelling all words.
5. Ten minutes should be set aside each week, if possible on the same day, so the students get into the routine of knowing when the writing sample will be administered. This should help ease the apprehension that some feel about timed writings. The time it took to calculate results for the writing samples was usually minimal, depending on how much the students wrote.
6. Writing prompts can be administered to small groups or whole classes. The method chosen will not affect class time instruction because the writing samples can be used as a lesson.
7. The prompt can be used for discussion after the writing sample. For example, the students wanted to discuss what changes they would make to an apartment if they were the landlord. They wanted to see what their classmates had said and they wanted to share their ideas about what a landlord should change and why.
8. Prompts can also be used to lead into other writing activities, either individual or group activities. With the prompt about the apartments and the landlord, students could write letters and work on the appropriate forms for them.
9. These timed writings should only be a part of the whole writing program. We recommend a timed writing should be administered once a week in addition to a more complete writing program. Having students write under a time limit can create pressure and it may take students time to become accustomed to this task. They should learn to do this in addition to other forms of writing.
10. If students are really concerned about misspelled words, they could begin to keep a spelling log.
11. Students could keep a daily journal. Writing enables students to become more attuned to their experiences and emotions on paper. In other words, writing helps them to find and express their "inner voice".
REFERENCES
Bean, R. M., & Johnson, R. (1987). The Pittsburgh Adult Competency Program: An effective literacy programming. *Adult Literacy and Basic Education, 11*(1), 1-12.
Deno, S. L., Marston, D., & Mirkin, P. (1982). Valid measurement procedures for continuous evaluation of written expression. *Exceptional Children, 48,* 368-371.
Deno, S., Marston, D., Mirkin, P., Lowry, L., Sindelar, P., & Jenkins, J. (1982). *The use of standard tasks to measure achievement in reading, spelling, and written expression: A normative and developmental study* (Research Report No. 87). Minneapolis, MN: University of Minnesota, Institute for Research on Learning Disabilities. (ERIC Document Reproduction Service No. ED 227129)
Deno, S. L., Mirkin, P. K., & Chiang, B. (1982). Identifying valid measures of reading. *Exceptional Children, 49,* 36-45.
Fry, E. (1968). A readability formula that saves time. *Journal of Reading, 11,* 513-516, 575-578.
Fuchs, L. S. (1988). *Relations among basic skill measures.* Unpublished manuscript.
Fuchs, L. S. (1986). Monitoring the performance of mildly handicapped students: Review of current practice and research. *Remedial and Special Education, 7,* 5-12.
Fuchs, L., & Deno, S. (1981). *The relationship between curriculum-based mastery measures and standardized achievement tests in reading* (Research Report No. 57). Minneapolis, MN: University of Minnesota, Institute for Research on Learning Disabilities.
Harmon, D. (1985). *Turning illiteracy around: An agenda for national action.* New York: Business Council for Effective Literacy.
Proceedings of the Roundtable Conference (1986, May). Symposium conducted in State College, PA.
Webster, L.P. (1986). A national survey of evaluation procedures in adult basic education. *Proceedings of the 31st annual Convention of the International Reading Association.* 1-5.
-----FORM: APPENDIX D
Self - Correction Tally
INTRODUCTION TO STUDENT: This selection is about a man who, when he pulls into a truck stop for a coffee, does not see what he expects to see.
Whenever I get sleepy at the wheel, I always stop for coffee. (12)
This time, I was driving along in western Texas, and I got sleepy. (25)
I saw a sign that said GAS EAT, so I pulled off the road. It was (41)
long after midnight; I expected a place like most of the rest - (53)
where the coffee tastes like copper and flies never sleep. (63)
What I found was something else. The tables were painted and (74)
clean. They looked as if nobody ever spilled ketchup on them. (85)
The counter was spick-and-span. Even the smell was okay. (96)
The man behind the counter was the only person in the diner. (109)
I judged him to be about forty years old. His hair was just (122)
starting to get gray above the ears. I sat down at the counter (135)
and ordered coffee and apple pie. Right away he got me (146)
feeling sad. (148)
I have a habit: I divide people into two groups - winners and (160)
losers. This guy behind the counter belonged to the group of (171)
people who mean well; they can't do enough for you. But their (183)
eyes have this gentle, faraway look, and they can't win. You (194)
know - with their clean white shirts and their little bow ties? (205)
It makes you sad just to look at them. Take my advice, though. (218)
Don't feel too sad for them. (224)
Level: 4th ........(checked)/............# Wds Read/................%
S/C: .........../ Subs: ............ Insert: (P) ......... (Wd) .......... Omissions:..............
Taken from: *Points,* New Directions in Reading, Houghton-Mifflin Co. (1986), pp. 83-84.
Whenever I get sleepy at the wheel, I always stop for coffee. This time, I was driving along in western Texas, and I got sleepy. I saw a sign that said GAS EAT, so I pulled off the road. It was long after midnight; I expected a place like most of the rest - where the coffee tastes like copper and flies never sleep.
What I found was something else. The tables were painted and clean. They looked as if nobody ever spilled ketchup on them. The counter was spick-and-span. Even the smell was okay. Really.
The man behind the counter was the only person in the diner. I judged him to be about forty years old. His hair was just starting to get gray above the ears. I sat down at the counter and ordered coffee and apple pie. Right away he got me feeling sad.
I have a habit: I divide people into two groups - winners and losers. This guy behind the counter belonged to the group of people who mean well; they can't do enough for you. But their eyes have this gentle, faraway look, and they can't win. You know - with their clean white shirts and their little bow ties? It makes you sad just to look at them. Take my advice, though. Don't feel too sad for them.
APPENDIX F: CODING SHEET
- If a student says a word incorrectly, put a line through the word. Write the word that the student substitutes above the mispronounced word. (example omitted)
- Count each proper name miscue only once; all other substitution miscues get counted each time.(example omitted)
- Circle the word or the part of the word that the student leaves out. (example omitted)
- Each word or word part that is omitted is counted as one miscue. (example omitted)
- If the student adds a word or a part of a word to the text, write in the addition. (example omitted)
- Categorize the insertions as either PARTIAL or WHOLE WORD. (example omitted)
- Count only the PARTIAL insertions in the error count, but record the number of WHOLE WORD insertions. (example omitted)
- For any word that the student corrects, put a small circled 'c' above the word(s). (example omitted)
- When counting miscues, do not count self-corrections as errors; count as a word read correctly.
The SUBSTITUTIONS + OMISSIONS + PARTIAL INSERTIONS = the difference between the NUMBER OF WORDS READ CORRECTLY and the NUMBER OF WORDS READ.
e.g., 30 (checked)/36 words read . . . difference = 6
Therefore, Substitutions + Omissions + Partial Insertions will total 6.
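That bookkeeping identity makes a convenient check when filling in the data summary form. In this sketch the individual tallies are hypothetical, chosen to total the example's difference of 6:

```python
words_read = 36
words_correct = 30                        # the "(checked)" count
difference = words_read - words_correct   # 6

# Hypothetical tallies, which must account for the whole difference:
substitutions, omissions, partial_insertions = 3, 2, 1
assert substitutions + omissions + partial_insertions == difference

# Percent correct for the form's "%" blank:
print(round(words_correct / words_read * 100))  # 83
```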
-----FORM: DATA SUMMARY CYCLE III
...............(checked)/......................# Words Read =.................%
- a) Partial:
- b) Whole Word:
TIME: 1 Minute (60 Seconds)
- a) Words per minute (WPM):
- b) Correct Words per Minute (CWPM):
-----GRAPH: APPENDIX H
(Graph: student weekly reading progress. The vertical scale runs from 20 to 225, indicating Correct Words Per Minute (CWPM); the horizontal scale indicates weeks, with each week showing the Date and the Story read that week, and the grid reporting the Goal and the actual CWPM.)
CYCLE III WRITING PROMPTS
You have just read a "Dear Abby" column. In response to a letter from one of her writers from Pittsburgh, Abby argues that SMOKING SHOULD NOT BE ALLOWED in public places such as hospitals, restaurants, libraries, hallways in public buildings, buses or banks. Tell whether or not you AGREE or DISAGREE. Explain why you agree or disagree.
If you could have a date with ANYONE, who would it be? Tell why you would like to have a date with this "dream person". Tell where you would go or what YOU would do on your date.
As of November 1, you have just become the new landlord of an older apartment building. The building needs to be painted inside and out; the toilets do not work properly; the air conditioners are old and do not work; the heating system works occasionally; and the building's security locks do not work. Your tenants are very unhappy. What TWO things would you repair first? Tell why.
You have been applying for jobs and one day you receive a call from an employer who may be interested in hiring you. As you know, it is very important to make good first impressions at interviews. Therefore, what kinds of things do You think are important in order to help make a good first impression? Explain why.
Do YOU have (or have YOU had) a good friend? Describe WHY your friend is or was special. If you do not have or have not had a good friend, then describe the qualities that YOU would look for in a person with whom YOU might want to become friends.
Tell about the BEST things that have ever happened to you in your life. Then, describe why they were the BEST things ever to have happened to you.
If YOU could be ANYONE for 24 hours . . . who would you choose to be? Then, describe why you chose to be that person. Also, tell what you would do as that person.
-----GRAPH: APPENDIX J
(Graph: student weekly writing progress. The vertical scale runs from 10 to 150, indicating Correct Number of Words Written in Three (3) Minutes; the horizontal scale indicates weeks, with each week showing the Date and the Prompt # used that week, and the grid reporting the Goal and the actual # Words Written.)