Why Student Data Without Student Thinking Leads to the Wrong Interventions
- Pamela Seda

Walk into almost any school’s data meeting and you will see a familiar scene.
Spreadsheets projected on a screen. Percentages highlighted in red and green. Lists of students labeled “below benchmark.”
Leaders ask questions that matter:
- What does this data tell us about our students?
- Which students are not meeting expectations?
- What should we do about these students?
The intention is good. Principals and instructional leaders genuinely want to support students who are struggling.
But there is a quiet, costly problem hiding underneath many of these conversations.
The data being reviewed often cannot answer the questions leaders are asking. And when we act on incomplete evidence, we design interventions that miss the mark—not because teachers don’t care, but because the data never revealed the actual problem.

The Question Schools Really Want to Answer
When you listen carefully in school data meetings, the real question leaders are wrestling with is this:
How do we best support students who are not meeting grade-level expectations?
It’s exactly the right question. But most of the data we bring to the table cannot answer it.

What Most School Data Actually Tells Us
Think about the data typically reviewed in a school’s data meeting:
- Benchmark scores
- Unit test averages
- Item analysis reports
- Failure rates
These numbers can be useful. They help identify which students are struggling. But they cannot explain why.
Educational researcher Dylan Wiliam has long argued that assessment should be used to reveal how students are thinking so that teachers can adjust instruction accordingly. When assessments only measure correctness, they limit teachers’ ability to understand the learning process that produced the result (Wiliam, 2011).
Without evidence of student thinking, data meetings leave leaders guessing:
- Is the student misunderstanding the concept?
- Did they apply the wrong strategy?
- Are they reasoning correctly but making calculation errors?
- Did they misinterpret the problem entirely?
The spreadsheet cannot answer those questions.
And when leaders don’t have answers to those questions, the next step becomes predictable: more practice, more review, more intervention packets. Sometimes those supports help. Often they don’t—not because teachers aren’t working hard, but because the intervention was designed for the wrong problem.
This is the cost of data without thinking.

The Missing Step in Most Data Conversations
In research, the first step is always the same: define the research question. Only after the question is clear do researchers decide what kind of data to collect.
Some questions require quantitative data—numbers, scores, percentages. Other questions require qualitative data—observations, explanations, written work, artifacts of student thinking. Often the strongest conclusions come from both.
But in many schools, the order gets reversed. Data is collected because the district requires it, the state reports it, or the assessment system generates it automatically. Leaders inherit dashboards full of numbers long before they stop to ask: What question are we actually trying to answer?
In math especially, where student reasoning is rarely visible in a score, this gap between the data we review and the decisions we need to make is particularly costly. A student who gets a problem wrong because she misread the prompt needs something fundamentally different from a student who chose the right strategy but made a procedural error halfway through. The benchmark score treats them identically.

The Evidence Schools Rarely Examine
If we truly want to understand how to support students, we have to examine the one source of evidence that reveals how students are thinking: student work.
Student work shows us things that numbers cannot:
- How students interpret problems
- What strategies they choose
- Where their reasoning breaks down
- How they justify their conclusions
Research on effective professional learning communities consistently shows that teacher teams who examine student work develop a deeper understanding of student thinking and make more informed instructional decisions (DuFour, DuFour, Eaker, & Many, 2010). Similarly, researchers studying formative assessment emphasize that understanding student thinking—rather than simply scoring answers—is essential for improving learning outcomes (Black & Wiliam, 1998).
The thinking that produced the answer is where learning actually lives. And it’s where effective intervention has to begin.

What Changes When Leaders Look at Student Thinking
When leadership teams begin examining student work regularly, the entire conversation changes.
Instead of asking:
Which students failed this assessment?
The conversation becomes:
What are students thinking? What does their work reveal about where understanding breaks down—and what kind of support would actually move them forward?
Leaders begin to notice patterns that numbers alone never reveal. Students may be able to compute but struggle to explain their reasoning. Students may have strong ideas but lack the language to communicate them clearly. Students may be applying a strategy that almost works but breaks down at a critical step.
Now the leadership team can finally answer the question they care about—not with guesses, but with evidence.

A Different Kind of Data Meeting
Imagine a different kind of data meeting.
Teachers bring three pieces of student work: one that demonstrates strong reasoning, one that shows partial understanding, and one where thinking breaks down. Together, the team examines the work—not to assign blame or celebrate scores, but to understand what students are actually doing when they engage with the mathematics.
They ask: What claim is the student making? What evidence do they use? How does their reasoning connect the two?
Patterns emerge. And now the team can plan instruction that responds directly to those patterns—not generic interventions, but targeted support grounded in student thinking.
That’s not a longer meeting. It’s a more powerful one.

The Leadership Shift
School leaders don’t need less data. They need better-aligned data.
Data that helps answer the question that matters most: How do we help students grow as thinkers and problem-solvers?
That shift requires leaders to move beyond conversations focused only on answers and percentages. It requires creating routines where teachers and leaders regularly examine the thinking behind the work students produce. It requires asking not just who struggled, but how they struggled—and what that tells us about what they need next.
When leaders build those routines, the support students receive becomes far more than intervention. It becomes instruction that actually meets students where they are.
And that is what transforms a data meeting into a catalyst for real student growth.
Is Your Leadership System Aligned to the Learning That Matters?
The challenge this post describes — data that identifies who is struggling but can't explain why — often reflects something deeper: a gap between leadership systems and the evidence leaders actually need to make good instructional decisions. If you're wondering where your school or district stands, the Leadership Systems Mini Diagnostic is a free, four-question reflection designed for principals, assistant principals, and district leaders who are responsible for math outcomes. There are no right or wrong answers — only information that can clarify next steps.
References
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7–74.
DuFour, R., DuFour, R., Eaker, R., & Many, T. (2010). Learning by Doing: A Handbook for Professional Learning Communities at Work. Solution Tree.
Wiliam, D. (2011). Embedded Formative Assessment. Solution Tree Press.
