
Adaptive Learning Dead End

Adaptive Learning vs. Modern Formative Assessment Systems:

Why Data-Driven Instruction Is the Key to Student Growth

For more than thirty years, adaptive learning and adaptive testing systems have been heralded as breakthrough innovations in education. From early pioneers like Carnegie Learning’s Cognitive Tutor™ in the late 1990s to today’s i-Ready™, Mathia™, ALEKS™, and NWEA MAP Growth™, adaptivity has been marketed as the ultimate solution: a faster, more precise measurement of student ability, delivered through personalized pathways that “meet students where they are.”

And yet, despite billions in investment and decades of widespread adoption, student achievement in the United States has remained flat, and has even declined. The most recent post-COVID NAEP results showed the steepest drops in reading and math in decades. If adaptive technology really delivered on its promise, we should have seen some positive signal in the data by now. We haven’t.

Teachers know why. Adaptive platforms may be efficient at measurement, but they are rarely transparent, actionable, or connected to the real work of the classroom. In contrast, modern formative assessment platforms, such as Formative.com™ or Classwork.com™, support data-driven instruction: the engine for increasing student achievement that adaptivity has always promised but never truly delivered.

Let’s break down the contrast, drawing from the classroom perspective, the lessons learned during COVID, and thirty years of experience with computer-centric models.

What Adaptive Learning Systems Do Well (and Not So Well)

At their best, adaptive systems serve two main functions:

  1. Efficient measurement. By using item response theory (IRT) or knowledge-space theory, adaptive tests can quickly zero in on a student’s skill level. Instead of 40 fixed items, a MAP Growth test can use 25–30 adaptive questions to place a student with statistical precision.

  2. Personalized pathways. Programs like i-Ready or Mathia use this data to deliver lessons tailored to each learner’s readiness level. In theory, students work at their own pace and close gaps in knowledge while teachers monitor progress from a dashboard.
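To make the measurement side concrete, the adaptive-testing loop described above can be sketched as: estimate ability, pick the unused item closest to that estimate (where a Rasch-model item is most informative), update the estimate from the response, and repeat. This is a toy illustration under simplified assumptions; the update rule and item selection here are not any vendor’s actual algorithm.

```python
import math
import random

def p_correct(ability, difficulty):
    """Rasch model: probability of answering an item correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def adaptive_test(item_bank, answer, n_items=10):
    """Administer n_items, always choosing the unused item whose
    difficulty is closest to the current ability estimate."""
    theta = 0.0          # start at the assumed population mean
    used = set()
    for _ in range(n_items):
        # For the Rasch model, item information peaks where
        # difficulty equals the current ability estimate.
        item = min((i for i in range(len(item_bank)) if i not in used),
                   key=lambda i: abs(item_bank[i] - theta))
        used.add(item)
        correct = answer(item_bank[item])
        # Crude stochastic-approximation update: move theta toward the
        # response, in smaller steps as the test proceeds.
        step = 1.0 / (1 + len(used) * 0.5)
        theta += step if correct else -step
    return theta

# Simulate one student with true ability 1.2 on a 13-item bank.
random.seed(1)
bank = [d / 2.0 for d in range(-6, 7)]   # difficulties from -3.0 to 3.0
true_theta = 1.2
student = lambda d: random.random() < p_correct(true_theta, d)
print(round(adaptive_test(bank, student), 2))
```

This is why an adaptive test can place a student with fewer items than a fixed form: each question is chosen where it carries the most information about that particular student. It is also why the experience can feel punishing, since the loop deliberately keeps serving items the student has roughly a 50% chance of missing.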

From a policymaker or administrator perspective, this sounds ideal. You get efficient testing, comparable growth measures, and reports that look good in district dashboards. For large-scale accountability systems, adaptive technology is convenient.

But here’s the teacher perspective:

  • Opaque data. Teachers are often handed percentile scores, RIT bands, or color-coded levels. These may be statistically elegant, but they don’t translate easily into the next step for data-driven instruction.

  • Poor alignment. Adaptive platforms follow their own internal learning progression. That doesn’t always match a state scope and sequence—or the district curriculum. Teachers often find that adaptive lessons push students into topics weeks or months before they are introduced in class.

  • Student morale. Adaptivity can be discouraging. A student may feel “punished” by getting harder and harder questions until they hit failure. Or they may get pulled into content they’ve never seen before, leaving them frustrated and disengaged.

  • Little instructional value. Perhaps the biggest critique: adaptive results rarely lead to better data-driven instruction the next day. Teachers don’t need more abstract numbers. They need concrete insights into student thinking, tied directly to yesterday’s lesson and tomorrow’s plan.

This disconnect explains why, despite decades of use, adaptive systems have had little measurable impact on student achievement. They are designed for system-level efficiency, not classroom-level instructional improvement.

Lessons from the COVID Era: The Limits of Computer-Centric Models

The pandemic exposed the cracks in adaptive platforms more clearly than ever. When students were sent home and told to “log into i-Ready” or “complete your ALEKS modules,” engagement collapsed. Teachers quickly learned:

  • Computer time is not data-driven instruction. Left alone with adaptive software, many students clicked through, guessed, or disengaged. Completion didn’t equal mastery.

  • Teachers are irreplaceable. The human connection—motivating, reteaching, scaffolding, encouraging—was the missing link. Adaptive systems could deliver problems, but they couldn’t inspire persistence or contextualize mistakes.

  • Equity gaps widened. Adaptive systems assumed consistent internet access, quiet space, and independent learning stamina. Many students, particularly the most vulnerable, lacked those conditions and fell further behind.

  • Daily feedback matters most. The most effective remote teaching wasn’t handing kids a dashboard. It was when teachers collected formative data every day—exit tickets, quick checks, live quizzes—and used it for data-driven instruction immediately.

The COVID years underscored a truth teachers had long known: student learning improves when teachers adapt their instruction based on daily data, not when computers adapt behind the scenes.

The Rise of Modern Formative Assessment Platforms

Formative platforms like Formative.com™ and Classwork.com™ emerged out of this recognition. Instead of focusing on psychometric efficiency or long-term growth models, these systems empower teachers to:

  • Embed assessment into daily instruction. Every piece of classwork (homework, quizzes, warm-ups, exit tickets) becomes input for data-driven instruction.

  • See real-time student thinking. Teachers can view every student’s responses as they work, intervene immediately, and correct misconceptions on the spot.

  • Build longitudinal data naturally. Because every day’s formative work is captured, teachers and administrators can track growth over weeks and months—not just during three benchmark testing windows.

  • Align directly to curriculum. Assessments are teacher- or district-authored, curriculum-aligned, and tied to the standards being taught in the moment. There’s no mismatch between what the computer decides to test and what the class is actually learning.

  • Motivate students through feedback. Students know their teacher sees their work and responds. This makes the feedback loop immediate, meaningful, and confidence-building.
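The daily-capture idea behind these bullets can be sketched as a simple data model: each day’s item-level responses are logged against standards, and mastery per standard falls out of the accumulating log. The names, schema, and threshold-free mastery measure here are illustrative assumptions, not the internals of any particular platform.

```python
from datetime import date

# One row per student response: (day, student, standard, correct?)
log = []

def record(day, student, standard, correct):
    """Append one item-level response from daily classwork."""
    log.append((day, student, standard, correct))

def mastery(student, standard):
    """Fraction correct on a standard across all captured classwork,
    or None if no responses have been logged yet."""
    rows = [c for d, s, st, c in log if s == student and st == standard]
    return sum(rows) / len(rows) if rows else None

# A week of exit tickets becomes longitudinal data automatically.
record(date(2024, 9, 9),  "ana", "NF.1", True)
record(date(2024, 9, 10), "ana", "NF.1", False)
record(date(2024, 9, 11), "ana", "NF.1", True)
print(mastery("ana", "NF.1"))   # 2 of 3 responses correct
```

The point of the sketch is the contrast with adaptive dashboards: because every row is an actual classroom item tied to a standard the teacher chose, the aggregate stays interpretable, and longitudinal growth is a byproduct of daily work rather than a separate testing event.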

Why Formative + Data-Driven Instruction Beats Adaptive

Teachers overwhelmingly prefer formative platforms because they feel instructionally useful. The contrast between the two approaches is clear:

| Feature | Adaptive Systems (i-Ready, Mathia, MAP) | Formative Platforms (Formative.com, Classwork.com) |
| --- | --- | --- |
| Frequency | Periodic (benchmarks, or weekly modules separate from instruction) | Daily, embedded in instruction |
| Feedback Loop | Slow (report data builds over days/weeks) | Immediate (drives data-driven instruction tomorrow) |
| Alignment | Vendor’s learning progression | Teacher’s curriculum and pacing |
| Transparency | Abstract scores, percentiles | Item-level responses, visible misconceptions |
| Ownership | District-driven, dashboard reporting | Teacher-driven, classroom-embedded |
| Impact on Student Motivation | Often discouraging/confusing | Builds confidence through immediate feedback |

Put simply: adaptive learning platforms serve accountability systems; formative platforms serve data-driven instruction in the classroom.

Why Adaptive Systems Haven’t Moved the Needle in 30 Years

The big-picture reason is simple: adaptive systems were designed primarily for measurement, not for data-driven instruction.

  • Policymakers and administrators value comparability and growth models.

  • Vendors optimize for psychometric validity and efficiency.

  • Teachers, however, need concrete insights tied to daily instruction.

Because adaptive systems don’t feed the daily cycle of data-driven instruction, they haven’t meaningfully changed teacher practice—and therefore haven’t meaningfully moved student achievement. By contrast, decades of research on formative assessment show that when teachers use data to adjust instruction daily, student outcomes improve significantly.

The Path Forward: Teacher-Centered, Data-Driven

The lesson of thirty years of adaptive testing, coupled with the wake-up call of COVID shutdowns, is that data-driven instruction led by teachers is what drives real growth. Technology should empower teachers, not replace them. Why? Because students need them.

Modern formative assessment platforms show what’s possible:

  • Daily data to guide tomorrow’s lesson.

  • Longitudinal views built from authentic classroom work.

  • A feedback loop that motivates students and strengthens instruction.

That’s the future of effective edtech: not black-box algorithms, but transparent, teacher-led, data-driven instruction that keeps learning personal, human, and connected to the classroom.

Final Word

Adaptive testing will likely remain entrenched in accountability systems. Districts and states appreciate the efficiency of adaptive measures for benchmarking and growth modeling. But in the classroom, teachers know better: what drives achievement isn’t periodic efficiency, it’s data-driven instruction rooted in daily feedback.

That’s why modern formative assessment platforms are gaining traction. They marry the immediacy of classroom data with the longitudinal tracking administrators need, all while keeping teachers in the driver’s seat.

After three decades of adaptive promises, the verdict is clear: student learning improves not when computers adapt to students, but when teachers adapt their instruction, armed with the right data every single day.

📚 References

AIR. (2022). Differences in 2019-2022 COVID-related NAEP urban district score declines (Grade 4 math and reading). American Institutes for Research. https://www.air.org/resource/differences-2019-2022-covid-related-naep-urban-district-score-declines-grade-4-based

Education Next. (2022). New Nation’s Report Card disappoints — but shouldn’t surprise (learning loss). https://www.educationnext.org/new-nations-report-card-disappoints-but-shouldnt-surprise-learning-loss/

EDUCAUSE Review. (2016). Adaptive learning systems: Surviving the storm. https://er.educause.edu/articles/2016/10/adaptive-learning-systems-surviving-the-storm

Ed Trust. (2022). NAEP results show dismal learning loss due to pandemic: What can be done? The Education Trust. https://edtrust.org/blog/naep-results-show-dismal-learning-loss-due-to-pandemic-what-can-be-done/

Frontiers in Education. (2022). Measuring adaptive teaching in classroom discourse. https://www.frontiersin.org/articles/10.3389/feduc.2022.1041316/full

Frontiers in Education. (2025). Toward an adaptive learning assessment pathway. https://www.frontiersin.org/journals/education/articles/10.3389/feduc.2025.1498233/full

Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Routledge. (Summary: https://en.wikipedia.org/wiki/Visible_learning)

National Center for Education Statistics (NCES). (2022a). NAEP reading: Reading highlights 2022. https://www.nationsreportcard.gov/highlights/reading/2022/

National Center for Education Statistics (NCES). (2022b). Pandemic performance declines across racial and ethnic groups (NAEP blog). https://nces.ed.gov/nationsreportcard/blog/pandemic_performance_declines_across_racial_and_ethnic_groups.aspx

NCBI. (2020). Use of an adaptive e-learning platform as a formative assessment tool: Firecracker. Journal of Medical Education and Curricular Development. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7752734/

Springer Open. (2024). Improving the learning-teaching process through adaptive learning. Smart Learning Environments. https://slejournal.springeropen.com/articles/10.1186/s40561-024-00314-9

The Society of Digital Information and Wireless Communications (SDIWC). (2023). A systematic review on assessment in adaptive learning. International Journal of Advanced Computer Science and Applications. https://thesai.org/Downloads/Volume15No7/Paper_85-A_Systematic_Review_on_Assessment_in_Adaptive_Learning.pdf

USAFacts. (2023). COVID disrupted decades of progress in math and reading. https://usafacts.org/articles/covid-disrupted-decades-of-progress-in-math-and-reading/