By the Atlas PD Team

We all know the steps to the teaching dance: forward into our room, side step to themes and decorations, double back to our plans and vision for our children, and a final flourish as we arrange our desks and check our rosters.

And then, just like one out-of-tune violin ruins a waltz, we receive an email: “Attached are the roster reports from last year’s major assessment. Please review the results and plan accordingly.” Suddenly, our steps become self-conscious, all eyes are on us, and the pressure builds. Students, once our dance partners, transform into our audience and judges. And, to a great extent, we understand: their performance on these exams will stay with them year after year.

The real concern is that the steps to this dance are confusing: they more closely resemble the fluidity of jazz than the precision of tap. We hear conflicting messages about data-driven instruction in the classroom, and, as the song is ever-changing, it’s also hard to know if we’re on beat when all is said and done.

Reflective Questions to Understand Student Data

Many books and educational leaders offer insights on analyzing student data, but an overly prescriptive routine will always leave some students clutching the wall while others dance freely. So how do we let student assessment data speak to us in a way that makes us dance, not run? Ideally, you will have a tool that allows you to easily sort and visualize results, but you can begin to answer these questions with a highlighter and a printed roster report for data-driven instruction in the classroom.

Student Performance

What strengths and weaknesses are students coming to you with? Maybe the results you see are broken down by skills or standards, or maybe all you see in the English results are “Reading” and “Writing.” Either way, identify areas for growth and areas of strength for the class as a whole.

Highlight the top performers; what are their strengths and weaknesses? The skills these students are mastering and struggling with may differ from those of the class as a whole.

Highlight the low performers; what are their strengths and weaknesses? The skills you identify here will likely differ from those of the top performers, but they are often similar to the skills identified in the whole group.

Which students were close, and what were their strengths and weaknesses? Identifying these children may reveal students who have been underestimated. They may carry the label of “not proficient” or “below grade level,” but in reality they need only small adjustments to their instruction for those lessons to click.

Repeat this process for other significant subgroups; how did they perform? Do you have a large ELL or special education population? What about gifted and talented or IB students?
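If your report comes as a spreadsheet rather than paper, the highlighting exercise above can be sketched in a few lines of code. This is a minimal illustration, not a real roster format: the student names, scores, the 70-point proficiency cutoff, and the 5-point “close” band are all hypothetical, and real reports break results down further by skill or standard.

```python
# Hypothetical roster: one overall score (0-100) per student, plus a
# subgroup flag. Substitute your own report's columns and cutoffs.
roster = [
    {"name": "Ana",  "score": 92, "ell": False},
    {"name": "Ben",  "score": 45, "ell": True},
    {"name": "Cara", "score": 67, "ell": False},
    {"name": "Dev",  "score": 71, "ell": True},
    {"name": "Elle", "score": 58, "ell": False},
]

CUTOFF = 70       # proficiency line (assumed for this sketch)
CLOSE_BAND = 5    # "just missed" window below the cutoff (assumed)

# The three highlighter colors: proficient, close, and below.
top   = [s for s in roster if s["score"] >= CUTOFF]
close = [s for s in roster if CUTOFF - CLOSE_BAND <= s["score"] < CUTOFF]
low   = [s for s in roster if s["score"] < CUTOFF - CLOSE_BAND]

# Repeat for any subgroup, e.g. ELL students.
ell = [s for s in roster if s["ell"]]
ell_avg = sum(s["score"] for s in ell) / len(ell)

print("Proficient:", [s["name"] for s in top])
print("Close:", [s["name"] for s in close])
print("Below:", [s["name"] for s in low])
print("ELL subgroup average:", ell_avg)
```

The “close” band is the piece most worth tuning: those are the students a single label would underestimate.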


What did the assessment questions look like? Maybe students struggled with reading informational texts and geometry. When you look at the text, are many of the geometry questions story problems? This could show that students can do the math but can’t decode the question in order to answer it: a literacy problem, not a math one! Understanding the question format can change not only your assessment format but also the focus of your teaching.


What will students need in order to be successful in this unit? At the beginning of each unit, look back at your performance results and compare them with the skills you will be covering. Poor numeracy skills, for example, may make it hard to graph functions, because numeracy is a foundational skill the unit itself won’t revisit. But now that you know this gap exists, you can adjust instruction to fill it.
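The unit-planning comparison above can be made concrete with a tiny sketch. The skill labels here are hypothetical placeholders; substitute the skills or standards from your own roster report and unit plan.

```python
# Skills this unit assumes students already have (hypothetical labels).
unit_prerequisites = {"numeracy", "plotting points", "reading graphs"}

# Class-wide areas for growth pulled from last year's results
# (hypothetical labels).
class_growth_areas = {"numeracy", "informational text"}

# Prerequisites that are also class-wide weaknesses: plan to reteach
# these before, or early in, the unit.
gaps_to_fill = unit_prerequisites & class_growth_areas
print(sorted(gaps_to_fill))
```

The intersection is the short list worth building warm-ups or mini-lessons around; everything else in the growth list can wait for a unit that actually depends on it.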

How can you encourage students to grow throughout the year? Children need goals more than labels. Showing students the skills they need help with, and how they will grow in those skills, is far more powerful than telling them they are or are not proficient.

Data Driven Instruction in the Classroom

Not only can we structure our units around student assessment data for data-driven instruction in the classroom, we can also help students understand the tools assessments use. Much of the work assessment companies are doing centers on creating accessibility tools, otherwise known as accommodations, that work for all students. With these accessibility tools in mind, here are a few final notes as you plan your own classroom assessments:

  • Each assessment currently uses slightly different language for the same accommodations. Check online to find the names of the tools on your assessments. “Text Blocker,” “Masking,” and “Text Mask” are three different names for the same tool. This tool allows students to cover up a part of the text so that it won’t be distracting. Make sure your classroom assessments use language that mirrors that of the major assessments your kids will take; this will help decrease their confusion during the test!
  • Give students practice time with assessment accommodations/accessibility tools. If students get a ruler or highlighter on the big test, give them practice with those tools in your assessments.
  • Most assessments will not have a paper/pencil option, so get technology into students’ hands. Many major assessments stop offering paper/pencil after 3rd grade. Make sure you’re using online testing tools in your classroom so students are familiar with the look and feel of online assessments.