Blog Post Category: Evidence

3/9/2017

Reflecting on Our Practice: Collecting Evidence of Student Learning

By: Virginia Ressa

I feel like I really know and understand something new when I’m able to create an analogy that represents the new concept in different terms. There’s a fancy word for that — analogical thinking. This month, I’ve been thinking about how we collect accurate evidence of student learning through an analogy with the changing of seasons.

It’s March, and spring is on its way! As far as I can tell, it seems to be arriving early. How do I know that? Well, I know the vernal equinox is around March 20, because I learned that years ago. But beyond my recollection of the date, I know what spring in Ohio looks and feels like, and I see evidence all around me. I see bulbs starting to sprout up in my garden bed and the grass beginning to green. I haven’t needed my heavy winter coat in a week or so, and my gloves have been forgotten. Have you seen the trees starting to bud? Did you notice how much later the sun is setting?

These are all small pieces of evidence that we take note of as we wait for spring to start. Some of the evidence might be formal, like the meteorologist reporting changes in high and low temperatures. If we slow down and take notice, there is a great deal of informal evidence available, like the changes I see in the plants along the path where I walk my dog. There is also some dubious evidence of spring’s arrival that comes to us via Punxsutawney Phil on Groundhog Day, and misleading evidence in the form of a March snowfall. If we really want to be sure that spring is coming, we can put in place a plan to document the changes we see over time, with the hope that we’ll see those changes come together by the equinox as we expect.
 
Watching students learn and grow is remarkably similar to watching the seasons change. We set goals for what we want them to know and be able to do and then observe their progress toward those goals. As teachers, we know that some of the most valuable evidence is gathered informally by listening to students talk through problems or noticing their use of new vocabulary. We don’t always document this growth, but we see it happening and respond accordingly. Once a new word becomes a regular part of conversation, we might introduce more challenging words as students work toward the learning goal. We also can collect formal evidence — pieces of writing, completed math problems, responses to critical questions — and document student progress using rubrics or other grading methods to record where students are in their learning. We work hard to make sure our students are on track to reach their learning goals in the time we planned, but sometimes they get there faster than we expected and other times it takes longer — just like spring arrives late some years (hopefully not this year!).

In this video from FIP Your School Ohio, Mr. Cline shows how he uses clear learning targets in his Grade 7 math classroom.

Evidence of student learning isn’t always straightforward and accurate. Sometimes we are confounded by unclear evidence delivered by characters like Punxsutawney Phil — which is likely a sign that it’s time to reassess. Maybe a group assignment misleads us into thinking all of our students have mastered a concept. A homework assignment may come back showing evidence of a parent’s understanding rather than the student’s. Our students may become confused during a unit of study, and all of a sudden it’s snowing in March. Each of these pieces of evidence is worth considering and responding to. You might want to have the student complete an individual assignment to double-check his or her understanding. Rather than relying on homework, an in-class activity may give you more accurate evidence. And if it starts to snow, you may need to go back a couple of steps and reteach the content that caused the confusion.

You don’t need a weather station to know spring is coming, and you don’t need lots of formal tests to know your students are learning. Evidence comes in many forms — from informal to formal — you just need to be a careful observer. If you’ve set clear learning targets with your students, you can look for those telltale signs of growth as you work toward the goal. And, remember, if you run into a groundhog or spring snowfall, take the time to reassess to make sure you’re all headed in the right direction.

Virginia Ressa is an education program specialist at the Ohio Department of Education, where she focuses on helping schools and educators meet the needs of diverse learners through professional learning. You can learn more about Virginia by clicking here.

4/27/2017

Reflecting on Our Practice: Collecting Evidence of Student Learning, Part 2

By: Virginia Ressa

When I wrote my blog entry last month, I didn’t plan on writing a follow-up, but assessment is just one of those topics educators can discuss and debate forever. As educators, we each have our own personal theories and experiences that color our assessment practices. Even the language we use when talking about assessment can differ. Our professional lexicon is full of synonyms, maybe euphemisms, for assessment: test, quiz, written response, essay, performance, project, presentation, etc. You get the idea, right? At the essence of all of these tasks — whatever you call them — is the goal of collecting evidence that will tell us where students are in their learning.

I know the phrase “collecting evidence of student learning” is a mouthful; “assessment” and “test” are both a lot shorter and easier to use. So, what’s the difference? Why bother with the long phrase? The difference is in what kind of information we want and how we plan to use it. The term “assessment” carries with it a connotation of a singular event, while the phrase “collecting evidence of student learning” suggests a process of ongoing assessment that seeks to track student growth over time.

The purpose of assessment has traditionally been to measure students’ knowledge and skills against a set of norms or standards. We’ve given these types of assessments in Ohio for many years. Districts and schools use these types of assessments at the local level, too. For example, career-tech students know they’ve got to exhibit proficiency in order to earn their credentials. These types of assessments provide a snapshot of what a student can do at a certain point in time, not unlike your school picture from eighth grade. That photo provides us with important, but limited, information about who you were at that point in time — braces and all.

Each assessment event provides us with a snapshot of student knowledge and skills at a particular time and place. But, can one photograph show us a complete picture? Of course not. Take a minute and think about your high school yearbook photo. It provides a lot of information about you on the day the picture was taken, but the information is limited to that one piece of evidence. We know it doesn’t represent how you changed throughout high school, how you got taller or cut your hair to look like your favorite musician. It doesn’t explain why you chose those glasses and that tie. And, unfortunately, that one picture might not accurately represent you. Maybe you had a bad hair day or the photographer didn’t tell you when to smile. One snapshot can only provide a limited amount of information, and sometimes it may not be reliable.

In our classrooms, it is often more useful to think of assessment with the purpose of measuring student growth. Measuring growth requires us to envision assessment as a series of events over time rather than as singular, isolated events. What if we took a series of photographs over time? How would a series of photos give us different information than the single snapshot? When we have evidence we’ve collected over time, we can compare current performance to past performance to look for change, hopefully in the right direction. We can look for patterns of misconceptions, strengths and weaknesses or opportunities for acceleration — all of which can inform our instruction in order to meet the needs of our students.

If a snapshot is a single assessment event that measures achievement, then a series of snapshots of student learning is a process of assessment that encourages us to measure and respond to growth. Collecting evidence of student learning over time and in different formats provides us with more details about what students know and can do. If we only have that one photo of you in your high school yearbook, we have no way of knowing if the picture is reliable. Why weren’t you smiling? Was your hair always that long? Multiple pieces of evidence help us to identify and account for evidence that isn’t reliable: a misleading test question, homework completed by a parent or maybe just a bad day for that student. Each photo we take can’t possibly tell others all they need to know about us at that point in time, and no single assessment event can tell us enough about student learning to inform our instruction.

Ohio’s Formative Instructional Practices modules include a course on measuring student growth. Click here to access the Department’s Learning Management System, which is free to all Ohio teachers.


