YOKA / UCLA Writing Project Partnership 2016-2017
Spring Reflection Questions
Directions: Please take a moment and respond to these two sets of questions. Please upload this to your YOKA blog when finished. If you have a picture that goes along with it, all the better!
Grounding: Thinking about your instructional sequence, professional development themes this year, and other factors influencing your classrooms...
Part 1: DATA ANALYSIS. Take a moment and look at your Fall and Spring spreadsheets based on the information from your study class.
- What do you notice about the data, especially when comparing Fall to Spring?
- What might this data capture or address in terms of student learning and your curricular goals?
- What might this data not capture or address in terms of student learning and your curricular goals?
- We suggested you use a “high” “medium” “low” rating for scoring student work unless you wanted to use an existing rubric that perhaps your department uses or a rubric from a high stakes assessment. What scoring method would you want to use next year? What would give you the most meaningful information? Please explain.
Part 2: STUDENT WRITING ANALYSIS. Take a moment and look at a student sample that you are particularly proud of, or that you think shows a need that you’d like to address with future Professional Development. If possible, compare the sample with a sample of the student’s writing from early in the year.
- Give a context for the writing assignment, especially given your curricular goals for the year.
- Give a context for the student, especially given their growth, stagnation, or decline throughout the year.
- Describe what is present in the writing, rather than what is missing.
- Describe what might be next instructional steps for this student. Would these instructional steps work for many students in your class? Please elaborate.
- What ideas is this giving you about how we should shape professional development for next year?
- Thinking about this PD sequence this year, what worked? What can we (UCLA Writing Project, YOKA Leadership Team, your department, and yourself) improve on for next year? Also, do you agree with the instructional foci identified by your peers on the Google survey? Click on the following links and scroll to the bottom for ALL RESPONSES, SOME MATH, SOME SPED.
On behalf of the YOKA Leadership Team, the UCLA Writing Project, and all of our students, Norma and Jason want to thank you all for a wonderful year full of hard work, energy, enthusiasm, inquiry, and writing! Happy trails to you, ‘till we meet again!
Part 1: DATA ANALYSIS
1. I notice that my students have high writing skills, which I expected because the group of students I did my data analysis with is a gifted group.
2. This data might capture that students are able to communicate their scientific thinking when a clear structure is provided to them beforehand, for prewriting organization. Some students might still have problems answering the prompts directly, as the content and argument they make don't relate completely to the prompt.
3. What this data might not capture is what each tier really means. How I see it, I labeled students relative to the whole class. Therefore, the ones who are "low" are still able to write complete, complex sentences and structure a paragraph.
4. A 4-point rubric would be helpful, or maybe A-B-C-D-F.
Part 2: STUDENT WRITING ANALYSIS
1. The task I had students complete for the spring was a letter to Trump arguing that he should not ignore climate change. They had to explain the science of climate change and what it means to address it instead of ignoring it.
2. This student produced a piece of writing that is eloquent and made me very proud. She does well academically but is not in the top five of my students, so I was pleasantly surprised by her writing.
3. What is present: clean explanation of what climate change is, solid reasons why climate change can't be ignored, passion for what she is saying, and a clear argument.
4. Next instructional steps for this student would be to use graphic organizers again for organizing information before writing. I think this would be helpful to the whole class, but some students will find it more helpful than others.
5. After writing a prompt, it would help to have time and support for generating graphic organizers that help students visually organize their thinking in ways that make sense and don't confuse them. Or maybe time to share best practices for graphic organizers we've implemented successfully before.
6. I think this writing project was successful. I would appreciate more focus on topics highlighted in the surveys. If this was done again, I would appreciate periodic reminders as it was at times hard to keep track of timelines.
Part 1: DATA ANALYSIS.
In terms of Fall and Spring, I see that about the same number of students are in the Medium category, with a very small percentage in the High category.
I am surprised the majority are in the medium category. I wonder how reflective this is of how students are actually doing. Like do I really feel most of my students are writing in the medium category? I felt like most are still very low.
This data might not show the progress we’ve made in the sheer amount of writing that students are doing. Students who would write one or two sentences are now writing a lot more, but they may still not be writing as much as we’d like.
Pros of using Rubric – we would better be able to separate “strands”. With the rubric we can see if students are scoring 4’s in “writing clearly” and 1’s in “Justifying Solutions”. It would help us better target our lessons (i.e. our kids can write clearly, but they are not grasping the content)
Pros of “high, medium, low” grading – it is more aligned to the high-stakes SBAC math test they will be given. But I noticed that the Performance Task they will be given has a scale of 2, 1, 0 on ONE part of the problem, just 1, 0 on another part, and 2, 1, 0 again on the third part.
I think what we decide for next year will be determined based on our writing goal – do we want students to practice writing/responding to the MATH performance Task questions or will our goal be to make our students better writers overall? (Understanding that yes, they both help students be better)
Part 2: STUDENT WRITING ANALYSIS.
1. I felt like my two writing samples were different from each other, but there was growth in my writing prompts because the second one was more specific to the rubric we planned on using.
2. Present in the writing – students are attempting to answer the main question and they tried creating their own examples.
3. The next step in helping my students is to get them familiar with the rubric and have them understand their goals.
4. Next steps: giving them back their graded samples so that they know where they need to improve, and so that they can see that their score is based on what IS there, and not on what the teacher THINKS they meant to write down.
5. For next year, it’d be great for everyone to be on the same page with the SLO’s (student learner outcomes) so that we can be more intentional with the work that we do.
Thank you, teachers, for participating in this writing teacher action research project. We look forward to seeing how we may respond to your needs to inform the Writing Project for the 17-18 school year. Whether it be writing about behavior concerns, language objectives, writing objectives, development of writing prompts, or essential questions, we want to hear your voice to ensure our PD and department time serves the needs of your department and grade-level teams. We look forward to continuing to look at student work, with writing samples linked, as we move forward.
Part 1: DATA ANALYSIS.
1.
         Fall           Spring
High     3.4%  (N=1)    14.3% (N=2)
Medium   31%   (N=9)    42.9% (N=6)
Low      65.5% (N=19)   42.9% (N=6)
2. What might this data capture or address in terms of student learning and your curricular goals?
Overall, there’s a greater percentage of “Low” students and the smallest percentage of “High” students.
The data is not accurate enough to say what the overall trend is between the Fall and Spring data.
3. What might this data not capture or address in terms of student learning and your curricular goals?
Without knowing the raw number behind each percentage, looking at the percentages alone can misrepresent the High/Medium/Low data.
4. We suggested you use a “high” “medium” “low” rating for scoring student work unless you wanted to use an existing rubric that perhaps your department uses or a rubric from a high stakes assessment. What scoring method would you want to use next year? What would give you the most meaningful information? Please explain.
I would like to co-create a rubric with the Social Studies department, calibrate it, then finalize it so we can use it department-wide. That would allow us to have some important conversations about where we need to meet our students (skill-wise) and also to brainstorm and collaborate on how to do so.
Part 1:
1) When provided with the proper supports and scaffolds, students are improving. We need to actually identify small improvements and progress.
2) Sentence frames and graphic organizers give students the structures needed to write a paragraph.
3) It might not capture whether students are having trouble accessing the content or the writing structures.
4) A H-M-L is a great tool because it is quick and does not take a long time to assess each paper. Unfortunately, it does not explain a lot of the nuances. A 4-3-2-1 Rubric for constructing an explanation might be more useful in developing future practice.
Part 2
1) The students had to construct an explanation for the question "How can you prevent your friends from getting sick?" They had learned about how germs spread (e.g., communicable diseases) and the types of germs (e.g., viruses, bacteria).
2) The student could access the content, but had trouble formulating the thoughts into a coherent paragraph. It is apparent that the student can write a stronger explanation.
3) The student can make a claim that answers the question, and uses relevant data from labs as well as text/discussions to back up their claim. They can also use appropriate reasoning to connect all three parts of their explanation.
4) The student needs more help with transitions - possibly provide more examples of how to transition between thoughts and ideas.
5) Writing maps, sentence frames, WE need to construct explanations, WE need to do our own assignments
6) What worked: using data to inform reflection and future instruction. Improvement: Time to adapt ELA strategies to our departments.
What do you notice about the data, especially when comparing Fall to Spring?
I feel that the data shows that the students made an overall improvement in their writing once they received more scaffolds and practice from fall to spring.
What might this data capture or address in terms of student learning and your curricular goals?
I believe that the data shows that once students have ample practice and the ability to revise their first draft, the students are able to put together much more conclusive and articulate writing responses that merit higher scores.
What might this data not capture or address in terms of student learning and your curricular goals?
I feel that, considering the length of this experiment, the data really only provides a small snapshot of the work that the students are capable of. I feel this experiment should continue and more data should be collected over a longer period of time.
We suggested you use a “high” “medium” “low” rating for scoring student work unless you wanted to use an existing rubric that perhaps your department uses or a rubric from a high stakes assessment. What scoring method would you want to use next year? What would give you the most meaningful information? Please explain.
I would prefer to use the 4-point Claim-Evidence-Reasoning scale from the Next Generation Science Standards, because I feel it provides a much clearer assessment of students' work. Further, this rubric is much more aligned with the work that the students are doing in science class.
1. There are more students performing in the high category in the Spring. Medium students made shifts into the High rating, whereas the low % stayed about the same.
2. Students are getting more familiar with the criteria to score high, medium, or low. They were also building their capability to answer the short written responses on a math exam.
3. The task was one assignment outside of the context of the unit students were learning in the moment. Also it may not capture the guided practice that was provided to scaffold student understanding for EL’s.
4. Next year, I would use the department created 4-point rubric for consistency and vertical alignment. This way the student would be familiar with the grading system from previous grade levels.
1. Students were studying system of equations and the point of intersection. We created a real life application to mimic the concept we had just covered in class.
2. James immigrated to the USA in January and joined our class mid-year. He does not speak any English (ESL 1), but he manages to get work done through translation provided by the teacher. His math skills are at grade level, but his language skills are far below basic.
3. Student was able to answer the questions using sentence frames, word bank, and fill-in the blanks. When asked to create an example of his own, he responded in Chinese.
4. Next steps would be to have students write in paragraph form without supports or with limited supports. When questions are given, there would be no sentence frames or word banks. Instead, the student will craft their own narrative and rationale using their developed writing techniques.
5. I would like to work with the department on a task that can be given across grade levels to track both conceptual mathematics and summative writing skills. The task would be aligned to all grade levels at varying difficulties (i.e., ratio, proportions, slope).
6. The PDs developed smoothly. The tasks were scaffolded and expectations for the writing tasks were clear. Suggestions for next year: meet as departments first semester and have time to develop content-specific tasks; work in focus groups on a one-on-one, content-specific basis; then, second semester, come together as a staff and compare across grade levels.
The data show me that students have grown from one semester to the other. I believe the growth comes from the fact that, later in the school year, they feel more comfortable with the topic, have a sense of mastery, and therefore feel free to express their thoughts and opinions. The data captures how English-only students, EL students, and RFEP students think and process their thoughts in writing. I see mistakes that come from going back and forth between Spanish and English, but what I do see is that in both languages they are doing well. For the upcoming year, if I had to do this again, I would use a 1, 2, 3, 4 scale, which for this project I translated to H, M, L for the Google doc.
The writing context was based on work that the kids have been building on since 7th and even partially 6th grade in our CPM book. The goal is to identify, create, solve, and read (understand graphs of) linear equations of the form y=mx+b. This is a stepping stone for other graphs that will come in later courses. This particular student has shown tremendous growth over time. I can see how this person’s thoughts are growing along with her understanding, and I can see the ease and comfort with which she explains not just her work, but the processes. I think this student could benefit from more challenging questions and more challenging reading/writing assignments that push her to express herself more.
The next step for professional development is how to take the given information and increase the level of difficulty in a way that students benefit from instead of being scared by.
What helped was department time, and the strategies used in class to reach the goal.
DATA ANALYSIS:
1. What I noticed about my data is that at the beginning of the year, I had a majority of students who got a "high" rating and only a handful who got a "low" rating. In the second semester, a lot of students shifted lower. Although many students still got "high" or "medium," a more significant percentage got a "low" rating.
2. The shift in data captures the higher expectations I have for my students. On the first assignment, I rated students based only on the content knowledge that was evident in their writing. In the second phase, however, I graded students' ability to write a solid claim and provide evidence and reasoning. Students were still pretty solid with their content, but the data captures how they still struggle with writing a cohesive conclusion paragraph.
3. This data does not fully capture the connections that students are able to make in terms of content.
4. I would use a similar method of rating students "high," "medium," and "low" and using a rubric. Using a rubric makes it easy to see trends in students' strengths and weaknesses.
STUDENT WRITING ANALYSIS:
1. This writing assignment was asking students to write a conclusion paragraph to a lab investigation. Scaffolding activities included looking at another conclusion paragraph and finding the claim, reasoning and evidence statements, and filling in a graphic organizer where they wrote the claim down first, followed by the reasoning and evidence. The students drafted a response for homework and the response graded was when they rewrote the paragraph for a quiz.
2. This focus student is a strong student overall. She scored high both in Fall and Spring. Student participates a lot in class and does all her work.
3. Student was able to make a claim and provide appropriate evidence. Student was also able to make good connections.
4. Student just needs to work on the clarity of her response. So this student needs to read over her response and revise for clarity.
5.
6.
Part 1:
1. When comparing data from the fall and spring, more students tested within the medium-high range in the spring. By reviewing data from the fall and providing targeted interventions for students in the low range, more of them were able to move up in rubric scoring.
2. This data shows how much support our students require. Because I am an RSP teacher, I am easily able to see where RSP students compare to general education students, and identify the RSP students who require more support in the classroom.
3. The data does not show what makes a student ‘low.’ More comprehensive data that shows the exact deficits a student has in relation to an assignment would assist me in providing targeted interventions for general education and RSP students.
4. Next year I would like to use a rubric that I have made specifically for the assignment with 4 different areas of assessment. By using a detailed rubric, rather than low-medium-high scoring, I would better be able to see the areas of student strength and need. At the moment, I think the 4 areas of the assessment would include grammar/punctuation, tone, supporting information related to main idea, and introduction/conclusion quality, to better understand the precise areas of student need in a given classroom. Additionally, this would help me better support RSP students with purposeful interventions in the classroom.
Part 2:
1. The students were asked to develop an expository writing sample with tone and purpose related to the task. Students were asked to define empathy and provide supporting details to identify why empathy is important inside and outside of the classroom. Students were frontloaded with information about empathy and given examples of how the development of empathy can support student learning.
2. Many student writing samples were lacking tone, purpose, and evidence. With the use of scaffolding and supports, including graphic organizers and sentence frames, students were better able to develop their compositions to reflect their ideas.
3. The writing sample contained student opinions and input that was unique and heartfelt.
4. I think more instruction on the organizational elements of developing writing compositions will support students in relaying their ideas in purposeful and sensible ways.
5. Combining grade-level topics with the instruction of fundamental writing skills.
6. I had a difficult time because I do not have a classroom of my own. I had to schedule time to provide assignments and instruction in other teachers' classes. Because I collaborate well with the English teacher I am scheduled to work with, I was able to find this time and easily implement writing assignments. Moving forward, perhaps coming up with uniform tasks per grade and department would enable teachers to implement writing tasks more easily.
Part I:
1.) The growth is hard to tell since I was looking at the second set of writing differently. I think the second round I had higher expectations, because I was looking for specific things.
2) This data raises the question of how our grading differs from one another, but I think it helps in forming questions so we can collaborate and calibrate our scoring. The goal is to be on the same page about what we expect and how to measure our students' learning.
3) It might not capture their grammar or spelling skills. It also might not capture how the other students would have performed; I would guess they would struggle more with writing.
4) I liked both methods but I was happy that my content grade level partner and I were able to create a rubric to help ourselves articulate what we want to see in students' work, thinking, and really see how to determine what they know and have learned.
Part II:
1) Assignment ~ learning log
2&3) The student showed more knowledge, wrote more, and included class examples. The two learning logs were different in the sense that one was more computational and the other more conceptual, so that might have influenced why students wrote more and more effectively.
4) Peer editing, reading each other's work. Writing more.
5) Loved the time to read student work, time to grade together, and time to calibrate using our rubric.
6)
1. What do you notice about the data, especially when comparing Fall to Spring?
The students completed more work in the spring than in the fall and appeared to do better. This is a special day class and it was nice to see the students writing more.
2. What might this data capture or address in terms of student learning and your curricular goals?
The students understood the writing process more and gained confidence.
3. What might this data not capture or address in terms of student learning and your curricular goals?
This data does not show how the students would work independently, without assistance.
4. We suggested you use a “high” “medium” “low” rating for scoring student work unless you wanted to use an existing rubric that perhaps your department uses or a rubric from a high stakes assessment. What scoring method would you want to use next year? What would give you the most meaningful information? Please explain.
We would continue to use the high, medium, and low method, but would use 1 for low, 2 for medium, and 3 for high. We like this method because it explains what each number represents, using a rubric that the students can understand.
ReplyDeletePart 1: DATA ANALYSIS.
1. In comparing Fall to Spring, my student data shows that fewer students turned in written assignments in the Spring. The Fall written work was generally shorter, involving one paragraph or very short answers, and more students turned it in. However, the lengthier Spring assignments were not well received, and very few students turned in their work.
2. In terms of student learning and curricular goals, this data captures the students' feelings toward the class. The students who like and enjoy the class generally did the assignment, but very few followed the prompt for the Spring semester.
3. The data does not capture the strengths that many students in music possess that are not showcased in their general education classes. Many of the music students that are doing well have low grades in their general classes where they are required to do more writing.
4. I am content with using “high,” “medium,” or “low” as a rubric for my students next year; since they are used to numbers, the generic words seem more neutral.
Part 2: STUDENT WRITING ANALYSIS.
1. Perhaps I wasn’t clear, but very few students addressed the prompt for the Spring. However, most of the work was coherent with what they wanted to write; it just didn’t address my prompt.
2. Among the students who turned in work in both semesters, there was individual growth that unfortunately was not significant enough to change their rubric score.
3. Students generally wrote about how much they enjoy their music class or why they chose to play their instrument. They spoke with honesty when they mentioned challenges in music class. They also had great ideas and examples in their work.
4. My next instructional steps are to incorporate a quick write in all of my classes, which means more reading for me when grading. I want to read all of their work so that they take their written assignments in music seriously. I also need to assign one lengthier written assignment per semester in all my classes.
5. I have no comment.
6. The beginning sequence of the project was great, as it got students writing short sentences in class. This made writing fun for most of the students. I think this is a great program to get students writing or hyped about writing…but in some classes it’s a struggle to get students to write lengthier papers when they are asked to do this in all of their classes.
Question #1
In observing the data in the fall versus the spring, I observed a clear improvement for those students at the medium writing level who met the requirements of the rubric. The data captures the overall writing improvement, but it does not capture the students' analytic abilities with regard to comprehending the writing prompt. I think I prefer the numerical rubric, because it gives room to deal with more specific topics within the rubric.
Question #2
Within the context of the student samples, the first writing prompt dealt with overcoming difficulties, and the second writing prompt dealt more specifically with an excerpt of a text we were reading in class. There was definitely growth in my sample student's writing. The student incorporated quotes more effectively into his writing in the spring than in the fall. I think what was missing in the student's writing was the explanation of the quotes in his essay once he used them. Next year I think it would be great to focus on explaining the quotes we use in our writing. Overall I think the PD caused us to be more reflective of our students' writing.
PART ONE
1) I noticed a slight increase in the high and medium ratings and a decrease in the low ratings.
2) Students have developed the skills and stamina to effectively extract information from several texts and integrate it within the writing task.
3) It didn’t capture how to support those students by highlighting the needs of their writing task at a micro level.
4) I prefer using the same rubrics that the District uses on interim assessments and the rubrics built within the Study Sync platform.
PART TWO
1) This writing assignment is a close read on the validity of teaching history through fiction, built around the fable “The Boy in the Striped Pajamas.” This fable and assignment belong to the In Time of War unit within the 8th Grade StudySync platform.
2) This student struggled with the organizational part in her first close reads at the beginning of the year. My ongoing lessons and support on structure and integrating evidence have really helped the student improve in terms of clear focus, writing and evidence.
3) A clear thesis and effective evidence to support it are present throughout the writing assignment.
4) This student should be exposed to more diverse types of writing assignments that demand different skills. The student should also be exposed to more challenging but exciting texts that really drive the writing process.
5) Writing tasks and grading sessions should gravitate toward the assignments built into the lessons and be reviewed on a more regular basis.
6) I think we should be able to calibrate the rubrics more than once and also be able to adapt them to each writing assignment within our curriculum.
PART 1
1) When comparing data, I noticed that in Fall I had 42.9% score Low, 31.4% score Medium, and 25.7% score High. In Spring, 51.5% scored Low, 45.5% scored Medium, and one person scored High. So while I had more students score Medium this second time around, I also had more students score Low, and my High numbers decreased to one person.
2) This data tells me that more students did well when I provided more sentence frames. The second time around I provided one sentence starter, whereas the first time I provided multiple ones. I also believe the math question the second time around was a little harder to explain, which is why more students did worse.
3) This data does not capture outside factors affecting students.
4) I would like to keep the same scoring method of High Medium Low, though I do see the benefits in adding 1 more category for students who are just not actually answering the prompts at all, even with the support.
PART 2
1) In the first writing assignment, students were supposed to talk about the conversion between percent, decimals, and fractions. In the second one, they talked about how fractions, division, and decimals are related.
2) I can tell that the student used all the sentence starters I provided to write their first reflection and used examples. The second time around, the student still explained everything and provided examples, even though I provided fewer scaffolds.
3) In both writings, I can see complete sentences and examples provided.
4) I think the next instructional step would be to go further into the relationship and making the connections between her examples.
5) I think we need to focus on students really reading the prompt and ensuring that they understand how to begin to answer a prompt without the sentence frames, because during testing, we do not give them sentence starters.
6) I think having students write consistently in our classes can really help get them writing more in general.
Part 1:
1.) Comparing fall and spring data, I noticed that the percentage of high scores significantly increased. The fall data didn't have any students in the high score range; it consisted only of low and medium scores. The spring data showed a 17.2% increase in high scores and 34.5% of students with medium scores, which means the students improved after the UCLA project.
2.) It shows that incorporating mini writing strategies and scaffolds into our writing tasks has significantly improved my students' scores. I realized how important it is to scaffold the work for students. The writing and reading strategies have been vital for their comprehension and have helped them improve their writing conventions.
3.) I don't think this data captured the apathetic students who chose not to do the assignment; they were categorized into the low scores. The data is great for a comprehensive look at students' scores, but it doesn't capture the process. It doesn't capture how the students have slowly and steadily improved.
4.) I like the low/medium/high rating because the data is easier to read. It's simple and right to the point; I can tell immediately that the students have improved. Also, as an English teacher, I found this method of grading the easiest to organize. This data was meaningful to me, but I wish the graph would show individual students and whether their writing scores improved or declined from the fall data.
Part 2:
1.) The fall prompt focused on the characters' struggles and what motivates them to overcome those struggles. The spring prompt focused on what makes a good leader and why people would question a bad leader. The students had to write a 5-paragraph essay consisting of an introduction, three body paragraphs, and a conclusion. They needed to include transitional phrases, a thesis statement, evidence, and explanation of the evidence.
2.) This student showed an increase in his scores. Looking at the fall data, this student received a medium score. After I graded the spring sample, I saw a significant increase: he jumped from a medium score to a high score.
3.) This student was able to include textual evidence and explanations for his claims, use transitional phrases throughout the writing, include a thesis statement in the introduction, include topic sentences for each body paragraph, and produce a well-thought-out opening and closing.
4.) I will continue incorporating all the varieties of writing and reading strategies within my writing process. I will continue to scaffold their work and provide writing outlines or sentence starters if needed. I might need to do more front-loading with comprehension and vocabulary with this group.
5.) It's important to keep learning about different reading and writing strategies, because we don't know in advance which strategies might be effective in our classes.
6.) Focusing on our prompts and rubrics was very helpful. It was helpful to answer our own prompts to see our writing patterns and habits, and it helped the teachers stay reflective on the writing process from beginning to end. For next year, it would be good for teachers to learn more writing and reading strategies to incorporate into our lessons.
What do you notice about the data, especially when comparing Fall to Spring?
I don't think my data is necessarily valid because I gave different prompts to different periods (some periods girls, some periods boys), so the samples are not the same.
What might this data capture or address in terms of student learning and your curricular goals?
Student writing levels vary.
What might this data not capture or address in terms of student learning and your curricular goals?
The difference in grade level, the gender, and whether they are ELL, special ed, gifted, etc.
What scoring method would you want to use next year?
High, Medium, Low is fine
What would give you the most meaningful information? Please explain.
Maybe grading just one paragraph; the essays are too long. I think I would get the same quality of data from one paragraph.
Part 2: STUDENT WRITING ANALYSIS. Take a moment and look at a student sample that you are particularly proud of, or that you think shows a need that you’d like to address with future Professional Development. If possible, compare the sample with a sample of the student’s writing from early in the year.
Transitions. Our kids need to use transitions when writing.
Give a context for the student, especially given their growth, stagnation, or decline throughout the year.
Students wrote their papers in class. I wonder how they would do if these were homework assignments.
Describe what is present in the writing, rather than what is missing.
In my favorite pieces, I see grammatically correct sentences.
Describe what might be next instructional steps for this student.
For the student who needs help with transitions, it might be helpful to have her say what she'd like to write out loud before committing it to paper.
Would these instructional steps work for many students in your class? Please elaborate.
Yes, I think that would work for other students. If they say the sentence out loud they realize it sounds funny without transitions.
What ideas is this giving you about how we should shape professional development for next year?
Next year should be content specific. Meet with us in small groups by department.
Thinking about this PD sequence this year, what worked?
I liked the different strategies we learned at the beginning. The poems and stuff
What can we (UCLA Writing Project, YOKA Leadership Team, your department, and yourself) improve on for next year?
Meet with us in small, content-specific groups.
What do you notice about the data, especially when comparing Fall to Spring?
I noticed that the students scored about the same on both the Fall and Spring assessments. A few students scored lower in the Spring than the Fall, and a couple scored higher. For both assessments, the majority of students earned medium scores.
What might this data capture or address in terms of student learning and your curricular goals?
This data captures that students are progressing in their writing and toward the goal of writing an analytic essay. The fact that most of the students scored the same shows that the students were not regressing in their writing skills.
What might this data not capture or address in terms of student learning and your curricular goals?
This data does not capture the fact that the writing assessment was only one paragraph in the Fall and four paragraphs in the Spring; the writing task became a bit more challenging for the students in the Spring. Although a few students scored lower, most of the students earned medium and high scores and thus met the goals of writing a thesis and finding quotes/evidence to support their claims.
We suggested you use a “high” “medium” “low” rating for scoring student work unless you wanted to use an existing rubric that perhaps your department uses or a rubric from a high stakes assessment. What scoring method would you want to use next year? What would give you the most meaningful information? Please explain.
I would use the social studies district interim assessment rubric, which uses a 4, 3, 2, 1 scale. Most of the teachers in the history department have started to use this rubric, and it would benefit the students to see a consistent rubric that they follow from 6th to 8th grade.
Give a context for the writing assignment, especially given your curricular goals for the year.
The Fall writing assignment was about slavery in the Constitution and the Spring was about the Indian Removal Act. Both were analytic essays where students had to find evidence from primary and secondary sources to support a claim.
Give a context for the student, especially given their growth, stagnation, or decline throughout the year.
This student has great work habits but struggled with comprehension. There were many times when instructions, whether verbal or written, were misunderstood.
Describe what is present in the writing, rather than what is missing.
The student is able to write a thesis and provide quotes. There is a bit of a disconnect between the quotes and the thesis, and the student shows a lack of understanding when explaining the quotes.
Describe what might be next instructional steps for this student. Would these instructional steps work for many students in your class? Please elaborate.
The next step for this student, and for the class, is to give students examples of quotes that support a thesis. I will show models of essays that scored a 4 as opposed to a 3.
What ideas is this giving you about how we should shape professional development for next year?
I would like time to work on a rubric for our department. I would also like strategies for helping students choose quotes that support a thesis.
Part 1
1.) At first glance I noticed that, from the fall semester to the spring semester, there was an increase in the number of students who scored high on the writing task, from 11.8% to 19.4%. The percentage of students who scored medium decreased, and the percentage who scored low was very similar to the fall semester.
2.) Students need to be able to explain their reasoning and critique the reasoning of others using examples and proofs. The writing samples address these standards as they apply to the concept of the distributive property and division.
3.) After further reflection, I take note of the students who transferred into and out of the class during the second semester. The comparison assumes the same students, and that writing skills and strategies were taught consistently.
4.) There is also an assumption in math that the math skills were understood and that students used the time to write about the skill. If a student struggled to understand how to correctly use the math skills, then the writing task was also very challenging to explain and earn a high score on. High/Medium/Low does not distinguish what students understood in the writing versus the math skills.
Part 2
1.) The math department agreed to use 'learning logs' as the writing prompts.
2.) The students in my first period class did change at the semester; however, I practiced writing and learning logs with all of my class periods.
3.) What is present in the math writing is an understanding of vocabulary, examples, and explanation of the math content and skills.
4.) Next come math performance task practice and continued writing practice.
5.) The reflection using student work, the time to analyze and compare, and the collaboration with other teachers were all great, and so much more.
It was interesting to analyze the data, which was taken from a high-performing class. I expected this class to have generally high scores because so many of the students are capable writers and self-motivated, so the scores were what I expected. The students did the best they could to answer the prompt, which I think could have been improved.
However, I have noticed the ways my students in that class have been less motivated, and while I expected high scores overall, I was expecting more "high" grades than I saw.
I used the writing assignment as a short assessment of their knowledge of our colonization unit. The students had difficulty answering the prompt, which tells me the wording needs to be modified. The writing was generally aligned with how students have been performing in their writing skills overall.