When I first began teaching, I was passionate about performance-based assessment, and the first groups of students I taught found themselves engaged in upwards of ten different performance-based assessments each year. My seniors wrote I-Search reports à la Ken Macrorie, and when I taught eighth grade, my students were performing everything from the meaning of their vocabulary words to the scripts they wrote as part of their Cast of Characters projects. These were good times, and I learned an awful lot from the veteran teachers I worked with about developing quality rubrics and helping kids attend to the finer details of their work, reinforcing the expectation that project work was not mere fluff.

I still favor performance-based assessments over all others, but what I've come to realize is that this is something of a shortcoming. Recently, I had the opportunity to watch Rick Stiggins in action, and as he spoke about aligning assessment types to instructional targets, I found myself having (for lack of a better phrase) an AH-HA moment. Stiggins presented the table below to the educators in his audience, and suddenly, I felt as if a missing piece were sliding into place for me.

Performance-based assessment is not always the most appropriate form of assessment, and while I used to assume that comprehension was inherently necessary in order to complete a performance-based assessment successfully, I know now that this isn't true. Sometimes there are more appropriate measures; it simply depends upon what you are measuring.

All of this has me thinking today.

I'm thinking about the assessments on this chart, and I'm wondering where a blog entry might fit as an assessment tool. I'm wondering where we would place a wiki. Can Twitter be used as a means of formative assessment? I think it can. I think that most Web 2.0 tools can be used to inspire formative assessment. They need a place on this map. What do you think? Are they a new assessment type, or do they fit into one of the categories already distinguished here? I know what I'm thinking, but perhaps we could invite others to the table to have this conversation.

What would happen if we spoke with kids about formative assessment? What would happen if we were direct about its purposes, if we told kids why we were studying certain skills formatively, and if we invited them to share their thoughts and ideas about what might work best with us?

Maybe “getting ready for the test” can look like a blog entry. Maybe we can do it in MySpace. Maybe we can keep kids engaged while moving through our agenda as educators. Maybe we CAN do it all.

The table pairs skill types with four assessment methods: selected response, essay, performance task, and personal communication.

Knowledge Mastery
- Selected Response: Multiple choice, true/false, matching, and fill-in-the-blank items can sample mastery of elements of knowledge.
- Essay: Essay exercises can tap understandings of relationships among elements of knowledge.
- Performance Task: Not a good choice for this target.
- Personal Communication: Can ask questions, evaluate answers, and infer mastery, but this is time-consuming.

Reasoning Proficiency
- Selected Response: Can assess understanding of basic patterns of reasoning.
- Essay: Written descriptions of complex problem solutions can provide a window into reasoning proficiency.
- Performance Task: Can watch some students solve some problems and infer reasoning proficiency.
- Personal Communication: Can ask a student to think aloud or ask follow-up questions to probe reasoning.

Performance Skills
- Selected Response and Essay: Can assess mastery of the knowledge prerequisites to skillful performance, but can't rely on these to tap the skill itself.
- Performance Task: Can evaluate the skills as they are being performed.
- Personal Communication: Strong match when the skill is oral communication proficiency; can also assess mastery of knowledge prerequisite to skillful performance.

Ability to Create Products
- Selected Response and Essay: Can assess mastery of the knowledge prerequisites to skillful performance, but can't rely on these to assess the quality of the product itself.
- Performance Task: A strong match; can assess (a) proficiency in carrying out steps in product development and (b) attributes of the product itself.
- Personal Communication: Can probe procedural knowledge and knowledge of the attributes of quality products, but not product quality.

Dispositions
- Selected Response: Selected-response questionnaire items can tap student feelings.
- Essay: Open-ended questionnaire items can probe dispositions.
- Performance Task: Can infer dispositions from behavior and products.
- Personal Communication: Can talk with students about their feelings.

Taken from Stiggins, Richard J. Student-Involved Assessment, 3rd ed. Columbus, Ohio: Merrill Education, 2001.
