I’m not sure if this is possible, but I’m starting to think it could be. I’ve been incredibly inspired by quite a few teachers who have been playing around with some very uncommon assessment approaches based upon the professional learning we’re doing together.

Take Michele, for example. She’s a teacher in the Kenmore-Town of Tonawanda school district, and she provides reading support to special education students. I’ve written about her before, and I remain compelled by the way she works with data.

She spent part of her spring break reflecting on her learning at the WNY Young Writer’s Studio, and I think I learned more than she might have in the process.

These are some of the questions that have been nudging at one or both of us all year:

  • Do we have to test in order to learn what we need to about students’ strengths and needs?
  • What do numbers really tell us about learners and learning and instruction that inspires both?
  • What are we really interested in measuring anyway, and which approaches best enable this?
  • Why do some educators place more value on quantitative data than qualitative, and how can we inspire change here?
  • What are the limitations of qualitative data? How can we attend to them?
  • What are we learning that we didn’t expect to?

Michele wasn’t satisfied by the information that multiple measures of quantitative data provided about her students. She was really eager to know more about their metacognitive processes, especially those relevant to the questions they asked of the texts they were reading and those they posed to their classmates during class discussions.

Over and over again, Michele seemed to be asking her students, “How do you know what you know?” and “Why did you do things that way?” and “What helped you most?”

She asked a lot of other questions too, but these are the ones that showed up most often in our debriefs, and they prompted some powerful shifts in her practice and in the data she chose to collect.

Michele devoted herself to documenting learning made visible, and her cell phone was her greatest friend. She captured photos of kids at work, evidence of their thinking and learning, and work samples as well. When she printed them, we learned something unexpected and very worthwhile: the photos were small enough to make building a display almost as tidy as a spreadsheet, yet large enough that we could still clearly read what students had written.

Here’s an example of some of her most recent data. Her kids were stoked to be learning about cats and lions, and she was stoked to be learning about how they were able to question, compare, and contrast. In this display, she was able to capture evidence of their thinking as they read, made meaning from their reading, wrote, and reflected. These data were captured over time, and she was able to fit all of this evidence on our table at the same time.

It may be difficult to tell, but these samples are actually very small. This allowed us to look at multiple pieces at the same time, but more importantly, it enabled us to cluster different pieces according to what we saw. Then, we could mix and remix these data, bumping one photo up against another and noticing things we didn’t expect to.

In short, capturing these data provided Michele a far more satisfying view of what her students learned, what they knew, what they were capable of doing… and why. Far more rewarding than any information a spreadsheet could provide, these data took little time to gather, our work with them was efficient, and most importantly: Michele did not have to stop to test in order to learn what she needed to.

I’ll share examples from my work with other teachers in the coming days and weeks, but in the meantime, take a peek at what Terry Heick posted recently. What incredible food for thought for anyone who cares about improving our current state of assessment and data practices in education.

Now more than ever, I’m so grateful for those who propose great solutions to problems that many only complain about.

5 Comments

  1. Michele Cammarata

    Thank you, Angela, for putting into words what I fumblingly explain to you as I look at my students’ work processes. I cannot tell you how excited I was to see this today because I put all the snapshots willy nilly on the table for my group today and they were oh so curious!!! I asked them two things today: What was hard about compare/contrast at the start of the unit? And, tell me what you learned about compare and contrast. I will bring their responses when I come in May!! Next step with them… How can we sort the snapshots to show your learning? Where can we go after this with our learning? I’ll keep you posted! My thanks for your support, insights, and always your questions that help me question and grow!

  2. “How can we sort the snapshots to show your learning?” THAT IS AN INCREDIBLE QUESTION. This work has taught me so much, Michele. So grateful to you. Can’t wait to hear more!

  3. Quoting: “What are we really interested in measuring anyway, and which approaches best enable this?” I don’t know about anyone else, but these are the questions I struggled with from the start of my first class in August 1980 until my last class ended in May 2012! I didn’t read about “#TTOG” until sometime in 2014, and it never occurred to me to try something like that. But after almost 22 years of stress and concern, it’s good to know it is a movement whose time is here!

    • Doesn’t it seem like such common sense, John? Matt Townsley was the first to pique my interest here, and I’ve learned a ton from those who support standards-based grading. Mark Barnes and Starr Sackstein keep pushing my thinking further as well. I know that times are tough in education right now, but there’s much to be excited about too. I share your enthusiasm here for certain. Loving this work!

  4. Pingback: 10 Creative Pre-assessment Ideas You May Not Know - Brilliant or Insane
