A teacher's first responsibility is to provide opportunities for writing and encouragement for students who attempt to write. A teacher's second responsibility is to promote students' success in writing. The teacher does this by carefully monitoring students' writing to assess strengths and weaknesses, teaching specific skills and strategies in response to student needs, and giving careful feedback that will reinforce newly learned skills and correct recurring problems. These responsibilities reveal, upon inspection, that assessment is clearly an integral part of good instruction. In their review of the existing research on effective instruction, Christenson, Ysseldyke, and Thurlow (1989) found that, in addition to other factors, certain instructional conditions were positively correlated with pupil achievement. Assessment, therefore, is an essential component of effective instruction. Airasian (1996) identified three types of classroom assessments. The first he called "sizing-up" assessments, usually done during the first week of school to provide the teacher with quick information about the students at the start of instruction. The second type, instructional assessment, is used for the daily tasks of planning instruction, giving feedback, and monitoring student progress. The third type he referred to as official assessments: the periodic formal functions of assessment for grouping, grading, and reporting. In other words, teachers use assessment for identifying strengths and weaknesses, planning instruction to fit diagnosed needs, evaluating instructional activities, giving feedback, monitoring performance, and reporting progress. Simple curriculum-based methods for assessing written expression can meet all these purposes.

Process, product, and purpose

Curriculum-based assessment must start with an inspection of the curriculum. Many writing curricula are based on a conceptual model that takes into account process, product, and purpose.
This conceptual model, therefore, forms the framework for the simple assessment techniques that follow.

Simple ways to assess the process

The diagnostic uses of assessment (determining the reasons for writing problems and the student's instructional needs) are best met by looking at the process of writing, i.e., the steps students go through and the strategies they use as they work at writing. How much planning does the student do before she writes? Does she have a strategy for organizing ideas? What seem to be the obstacles to getting thoughts down on paper? How does the student attempt to spell words she does not know? Does the student reread what she has written? Does the student talk about or share her work with others as she is writing it? What kinds of changes does the student make to her first draft? To make instructionally relevant observations, the observer must work from a conceptual model of what the writing process should be. Educators have reached little consensus regarding the number of steps in the writing process. Writing experts have proposed as few as two (Elbow, 1981) and as many as nine (Frank, 1979). Englert, Raphael, Anderson, Anthony, and Stevens (1991) provided a model of a five-step writing process using the acronym POWER: Plan, Organize, Write, Edit, and Revise. Each step has its own substeps and strategies that become more sophisticated as students mature as writers, accommodating their style to specific text structures and purposes of writing. Assessment of the writing process can be done through observation of students as they go through the steps of writing. Having students assess their own writing process is also important, for two reasons. First, self-assessment allows students an opportunity to observe and reflect on their own approach, drawing attention to important steps that may be overlooked.
Second, self-assessment following a conceptual model like POWER is a means of internalizing an explicit strategy, allowing opportunities for the student to mentally rehearse the strategy steps. Figure 1 is a format for both self-observation and teacher observation of the writing process following the POWER strategy. Similar self-assessments or observation checklists could be constructed for other conceptual models of the writing process.

Figure 1. Using a five-step conceptual model for student and teacher observation of the writing process
Simple ways to assess the product

An effective writing process should lead to a successful product. A writing product fulfills its communicative intent if it is of appropriate length, is logical and coherent, and has a readable format. It is a pleasure to read if it is composed of well-constructed sentences and a rich variety of words that clearly convey the author's meaning. When various conceptual models of writing are compared side by side (Isaacson, 1984), five product variables seem to emerge: fluency, content, conventions, syntax, and vocabulary. Too often teachers focus their attention primarily on surface features of a student's composition related to the mechanical aspects of writing, or conventions. A balanced assessment should look at all five aspects of a student's writing. The following are simple methods for assessing each product variable. In some instances quantifiable measures are used; in others, qualitative assessments seem more appropriate.

Fluency

The first writing skill a teacher might assess with a beginning writer is fluency: being able to translate one's thoughts into written words. As concepts of print and fine motor skills develop, the student should become more proficient at writing down words and sentences in compositions of gradually increasing length. The developmental route of very young writers involves trying to understand what written language is about as they look at books, become aware of environmental print, and put pencil to paper (Clay, 1982). Then children try to relate their experiences in writing, using invented spelling. As they begin to construct little stories, they explore spelling patterns and develop new language patterns. Clay (1979, 1993) recommends a simple rating scale for emerging writing skills that focuses on language level (from only letters to sentences and paragraphs), message quality, and directional principles (Figure 2).

Figure 2. Rating a child's early attempts at writing (Clay, 1993)
A simple curriculum-based measure of fluency is the total number of words written during a short writing assignment. When fluency is the focus, misspellings, poor word choice, and faulty punctuation are not considered. Attention is directed only to the student's facility in translating thoughts into words. A baseline of at least three writing samples should be collected and the total number of words counted for each. For the purpose of evaluation, this total can be compared with those of proficient writers of the same age or grade level. However, total words may be best used in monitoring the student's progress, comparing performance with his or her own previous fluency. A resulting IEP objective might be written like this: After a group prewriting discussion with the teacher, Daniel will write original narrative compositions of [40] words or more. A rough guideline for setting the criterion can be established from research reported by Deno, Mirkin, and Wesson (1984) and Parker and Tindal (1989).
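Because the total-words measure deliberately ignores spelling and punctuation, it can be computed mechanically. The sketch below is an illustrative classroom helper, not part of the published procedure; the function name and the baseline samples are invented:

```python
def total_words(sample: str) -> int:
    # Fluency counts every word written; misspellings, poor word
    # choice, and faulty punctuation are deliberately ignored.
    return len(sample.split())

# Three invented baseline writing samples.
samples = [
    "I go to the farm with my dog",
    "My horse runs fast in the field",
    "We ride home before it gets dark outside",
]
counts = [total_words(s) for s in samples]
baseline = sum(counts) / len(counts)
print(counts)    # words per sample
print(baseline)  # average across the baseline samples
```

The baseline average can then be compared against the student's later samples, or against typical counts for the grade level, when setting an objective.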
Content

Content is the second factor to consider in the writing product. Content features include the composition's organization, cohesion, accuracy (in expository writing), and originality (in creative writing). General questions the classroom teacher can ask regarding a composition's organization include:
Analytical scales are the best way to lend some objectivity to evaluation of content. One can choose from a general rating scale, appropriate to almost any writing assignment, or one tailored to a specific genre or text structure. Spandel and Culham (1993) developed an analytical trait scoring guide for six aspects of writing, three of which address content: Ideas and content, organization, and voice. (Voice refers to the author's own unique personality, style, and honesty reflected in the writing.) Each of these traits is scored on a five-point scale. For example, organization is scored using the following guidelines:
To promote agreement between raters, each of the guidelines above is further defined by specific criteria (or rubrics). A rating of 3, for example, requires these attributes:
A composition that is somewhat better organized than described by the guidelines for 3 but does not quite fit the descriptors for 5 would receive a rating of 4. Similarly, a rating of 2 falls between the descriptors for 1 and 3. Analytical scoring guidelines such as these are used in many state writing assessments. There are two limitations to scales such as these. First, teachers must spend many hours learning the rubrics and discussing student compositions in order to establish any degree of interrater reliability. Second, these scales may not be sensitive enough to measure growth in students with emerging literacy skills who are unable to achieve a rating above 1 or, at most, 2. For many students, writing instruction begins with smaller units of discourse, such as a paragraph. Welch and Link (1992) recommended an informal paragraph assessment that focuses on each of a paragraph's three parts: topic sentence, supporting sentences, and clincher sentence (Figure 3). Each part can receive a point for its existence, its form (grammatical correctness), and its function (relevance to the topic). Both topic sentence and clincher sentence can earn only one point for each of the three criteria, but up to three supporting sentences can be scored for existence, form, and function. This scale could be used to evaluate almost any kind of paragraph.

Figure 3. Informal assessment of a paragraph composition
Source: Welch, M., & Link, D. P. (1992). Informal assessment of paragraph composition. Intervention in School and Clinic, 27(3), 145-149.
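Since each criterion on the Welch and Link scale is worth one point, the arithmetic is easy to mechanize. The sketch below is illustrative only; the function names and the sample ratings are invented, and the rater still supplies every judgment:

```python
def sentence_score(exists: bool, form_ok: bool, on_topic: bool) -> int:
    # One point each for existence, form (grammatical correctness),
    # and function (relevance to the topic).
    return int(exists) + int(form_ok) + int(on_topic)

def paragraph_score(topic, supporting, clincher) -> int:
    # Topic and clincher sentences each contribute up to 3 points;
    # up to three supporting sentences contribute up to 9 more,
    # for a maximum of 15.
    total = sentence_score(*topic) + sentence_score(*clincher)
    for s in supporting[:3]:  # only three supporting sentences count
        total += sentence_score(*s)
    return total

# Hypothetical ratings for one student paragraph.
score = paragraph_score(
    topic=(True, True, True),           # topic sentence: 3 points
    supporting=[(True, True, True),     # first support: 3 points
                (True, False, True)],   # second support: grammar error
    clincher=(True, True, False),       # clincher drifts off topic
)
print(score)  # total out of a possible 15
```

Totals like this are easy to track across assignments, which suits the scale's use for monitoring progress on paragraph writing.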
Writing instruction for students with special needs also may focus on specific text structures. An example of a structure-specific scale is one that Isaacson (1995) devised for evaluating factual paragraphs written by middle school students (Figure 4). Isaacson's scale reflects the conceptual definition of fact paragraphs taught to the students: (a) a fact paragraph has more than one sentence; (b) the first sentence tells the topic; (c) all other sentences are about the topic; (d) sentences tell facts, not opinions; and (e) the most important information is given first. Judgments of factual accuracy and fact vs. opinion make the scale specific to factual paragraphs.

Figure 4. Analytical scale for factual paragraphs
Harris and Graham (1992) provided another example of a structure-explicit measure for assessing the inclusion and quality of eight story elements in stories written by students with learning disabilities: introduction of the main character, description of the locale, the time in which the story takes place, a precipitating event (or starter event), the goal formulated by the character in response to the starter event, action(s) carried out in an attempt to achieve the goal, the ending result, and the final reaction of the main character to the outcome. Each story element receives a numerical score for its inclusion and quality of development. The validity of the scale was demonstrated by its correlation with Thematic Maturity scores on the Test of Written Language and with holistic ratings of story quality (Graham & Harris, 1986). A resulting IEP objective for content might read: Using a story map, John will plan, write, and revise a story which includes a description of the character, setting, problem or goal, two or more events, and conclusion. (A story map is a planning sheet that prompts students to think about and write down their ideas concerning the character, setting, and other components of a good story before they write.)

Conventions

In order to fulfill the communicative function of writing, the product must be readable. Writers are expected to follow the standard conventions of written English (correct spelling, punctuation, capitalization, and grammar) and to use legible handwriting. Even if the message is communicated, readers tend to be negatively predisposed toward compositions that are not presentable in their form or appearance. Indeed, teachers traditionally have been more strongly influenced by length of paper, spelling, word usage, and appearance than by appropriateness of content or organization (Charney, 1984; Moran, 1982). Counting correct word sequences is one quantitative method of measuring and monitoring students' use of conventions.
Correct word sequences (CWS) are two adjacent, correctly spelled words that are grammatically acceptable within the context of the phrase (Videen, Deno, & Marston, 1982). Capitalization and punctuation also can be considered within the sequence. The proportion of CWS is then the number of correct word sequences divided by the total number of word sequences in the sample.
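Judging whether each word pair is acceptable requires a human rater, but the bookkeeping can be sketched in code. This is an illustrative formulation, not the published scoring procedure: the rater records one True/False judgment per adjacent word pair, and the proportion is correct sequences over total sequences.

```python
def cws_proportion(judgments: list) -> float:
    # judgments: one True/False per adjacent word pair, as marked by
    # a rater (True = both words correctly spelled and the pair
    # grammatically acceptable in its context).
    return sum(judgments) / len(judgments)

# A 7-word sentence has 6 adjacent word pairs; suppose the rater
# flagged the first two pairs (e.g., for stray capitalization).
judgments = [False, False, True, True, True, True]
print(round(cws_proportion(judgments), 2))
```

Tracking this proportion over successive samples gives a simple progress-monitoring line for conventions, alongside the diagnostic notes described next.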
Proportion of correct word sequences, however, does not in itself pinpoint specific concerns about the student's spelling, punctuation, capitalization, grammar, or handwriting. The diagnostic function of assessment will only be met if the teacher also notes the student's strengths and weaknesses, as in Figure 5.

Figure 5. Diagnostic analysis of conventions
Like the other assessments discussed in this article, these methods can be useful for instructional planning. A resulting IEP objective addressing conventions, for example, might read: Using a 4-step editing strategy, Kevin will reread his composition checking for correct capitals, punctuation, spelling, and overall appearance, writing a final draft with 2 or fewer mechanical errors.

Syntax

As discussed previously, a child's early attempts at writing move from writing single words to writing word groups and sentences (Clay, 1993). Beginning writers often produce sentences that follow a repeated subject-verb (S-V) or subject-verb-object (S-V-O) pattern. The composition in Figure 5 was written by a ten-year-old female deaf student. The beginning of the composition reveals this typical repetitious pattern to a certain degree in its first few sentences: "I go… I Ride my Horse… [I] get my Cow… I Leave My cow…" A more mature writer will vary the sentence pattern and combine short S-V and S-V-O sentences into longer, more complex sentences. Powers and Wilgus (1983) examined three parameters of syntactic maturity: (a) variations in the use of sentence patterns, (b) first expansions (six basic sentence patterns formed by the addition of adverbial phrases, infinitives, and object complements, and the formation of simple compound sentences), and (c) transformations that result in relative and subordinate clauses. Adapting Powers and Wilgus's analysis of patterns suggests a simple schema for evaluating the syntactic maturity of a student's writing:
Seldom does a student write sentences at only one level of syntactic maturity. One determines a syntactic level by analyzing all the sentences in the sample and summarizing them according to the type most often used. Occasionally one might characterize a student's syntactic level as being a transitional Level 2/Level 3 or Level 3/Level 4. A resulting IEP objective for syntax might read: Daniel will plan, write, and revise a descriptive paragraph using mature sentences, at least half containing embedded clauses or adverbial phrases.

Vocabulary

The words used in a student's composition can be evaluated according to their uniqueness or maturity. Both quantitative and qualitative methods can be used to evaluate vocabulary. Quantitative methods include calculating the use of unrepeated words in relation to the total number of words, such as Morris and Crump's (1982) corrected type-token ratio. A simpler classroom-based method of looking at vocabulary is simply to note, in two columns, words used repetitiously (over-used words) and the new, mature words the student uses.
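A corrected type-token ratio can be computed quickly for a writing sample. The sketch below uses one common "corrected" formulation (unique words divided by the square root of twice the total words); Morris and Crump's exact formula may differ in detail, and the sample text is invented:

```python
import math

def corrected_type_token_ratio(text: str) -> float:
    # Unique word types divided by the square root of twice the
    # number of tokens; the square root dampens the effect of
    # sample length on the plain type-token ratio.
    words = text.lower().split()
    return len(set(words)) / math.sqrt(2 * len(words))

# 11 tokens, 6 unique words (note the repeated "the", "dog", "ran", "and").
sample = "the dog ran and the dog barked and the cat ran"
print(round(corrected_type_token_ratio(sample), 3))
```

A sample that repeats the same few words heavily will score lower than one of the same length with varied vocabulary, which is the property the measure is after.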
A resulting IEP objective for vocabulary might read: Diana will revise her expository compositions, replacing at least five over-used words (e.g., is) with more interesting action words.

Taking into account the purpose

Being skilled is not just knowing how to perform some action but also knowing when to perform it and how to adapt it to varied circumstances (Resnick & Klopfer, 1989, p. 4). Being a skilled writer requires knowing how to employ the writing process across a range of writing tasks and how to adapt the process to the specific purpose for writing. Instruction often begins with story structures because they represent the genre most familiar to children. Children also use and depend upon narrative as their principal mode of thinking (Moffett, 1983). However, several educators (Hennings, 1982; Sinatra, 1991; Stotsky, 1984) have called for more emphasis on descriptive and expository text structures, which relate more closely to real-life writing tasks. Different purposes for writing call for different text structures. Writing a story calls for a narrative text structure that includes a character, setting, problem, etc. Writing about one's beliefs calls for a persuasive text structure that includes discussion of the problem, a statement of belief, two or three reasons for the belief, facts and examples that support the reasons, etc. Assessment of writing skills, therefore, should take into account a variety of purposes and text structures. Purposes and genres to consider include: personal narrative (my trip to the state fair), story narrative, description, explanation of a process (how to give your dog a bath), factual report, letter, compare-contrast (compare the Allegheny Mountains with the Rocky Mountains), and persuasion.

Summary

Simple curriculum-based assessments can be used to assess the writing process and products of students with learning disabilities, as well as to take purpose into account.
The assessments recommended in this article also adequately fulfill the purposes of assessment discussed at the beginning of the article: identifying strengths and weaknesses, planning instruction to fit diagnosed needs, evaluating instructional activities, giving feedback, monitoring performance, and reporting progress. A teacher might use these methods at the beginning of the year to do a quick sizing-up of student instructional needs. The process checklist in Figure 1 gives the teacher important diagnostic information about the strategies a student does or does not use when writing. A quick assessment of product variables from the first two or three writing assignments also gives the teacher important diagnostic information about skill strengths and weaknesses. The teacher then should use the initial assessment to identify instructional targets. Some students, for example, may plan their compositions fairly well but do little in the way of effective editing. Other students may have creative ideas but need considerable work on conventions. Still others may write stories well but need to learn how to write factual paragraphs. All classroom-based assessment should involve the student. Self-assessment helps students take ownership of their own writing and helps them internalize the strategies they are learning. The teacher's feedback should be given judiciously: generous in the encouragement of ideas and improved skills, but cautious in correction. Corrective feedback should focus only on those few skill targets that have been addressed in instruction. Simple classroom-based methods also can be used to monitor student performance and report progress. Figure 6 is an assessment summary sheet that could be used to give a profile of a student's skills across a variety of writing purposes and genres.
In an assessment portfolio, the summary sheet would be accompanied by representative samples of the student's writing with both the student's and the teacher's evaluations. After an initial assessment of student strengths and weaknesses across fluency, content, conventions, syntax, and vocabulary, the teacher would not necessarily need to monitor all the product factors, just those that focus on the student's greatest challenges and priority instructional objectives.

Figure 6. Assessment summary sheet
In conclusion, ongoing assessment of writing is integral to effective teaching of writing. A teacher cannot make an appropriate instructional match between a student's skills and appropriate tasks without assessment. A teacher cannot ensure a student's success and make necessary adjustments in instruction without engaging in frequent assessment. Careful, thorough assessment of a student's writing requires that the teacher have a sound conceptual model of written expression taking into account process, product, and purpose.