'Nation's Report Card' Distracts From Real Concerns For Public Schools

Imagine you're a parent of a seven-year-old who has just come home from school with her end-of-year report card. The report card provides marks for only two subjects, and for children in grade levels different from hers. Furthermore, there's nothing on the report card to indicate how well these children have been progressing throughout the year. There are no teacher comments, like "great participation in class" or "needs to turn in homework on time." And to top it off, the report gives a far harsher assessment of academic performance than reports you've gotten from other sources.

That's just the sort of "report card" that was handed to America yesterday in the form of the National Assessment of Educational Progress. And while the NAEP is all well and good for what it is -- a biennial norm-referenced, diagnostic assessment of fourth and eighth graders in math and reading -- the results of the NAEP invariably get distorted into all kinds of completely unfounded "conclusions" about the state of America's public education system.

'Nation's Report Card' Is Not A Report Card

First off, let's be clear on what the NAEP results that we got yesterday actually entail. As Diane Ravitch explains, there are two different versions of NAEP: 1) the Main NAEP, which we got yesterday, given every other year in grades 4 and 8 to measure national and state achievement in reading and math based on guidelines that change from time to time; and 2) the Long-Term Trend NAEP, given less frequently at ages 9, 13, and 17 to test reading and math on guidelines that have stayed essentially the same since the early 1970s. (There are also occasional NAEPs given in other subjects.) So in other words, be very wary of anyone claiming to identify "long term trends" based on the Main NAEP. This week's release was not the "long term" assessment.

Second, let's keep in mind the NAEP's limits in measuring "achievement." NAEP reports results in terms of the percent of students attaining Advanced, Proficient, Basic, and Below Basic levels. What usually gets reported by the media is the "proficient and above" figure. After all, don't we want all children to be "proficient"? But what does that really mean? The bar for proficiency as defined by NAEP is actually quite high -- in fact, much higher than what most states require, and higher than the standards used by high-performing nations such as Sweden and Singapore.

Third, despite its name, NAEP doesn't really show "progress." Because NAEP is a norm-referenced test, its purpose is comparison -- to see how many children fall above or below a "cut score." Repeated administrations of NAEP provide periodic points of comparison of the percentages of students falling above and below the cut score, but does tracking that variance really show "progress"? Statisticians and researchers worth their salt would say no.

Finally, let's remember that NAEP proficiency levels have defined the targets that all states are to aim for according to the No Child Left Behind legislation. That policy has now been mostly scrapped, or at least significantly changed, because its proficiency goals have been called "unrealistic."

Does this mean that NAEP is useless? Of course not. As a diagnostic tool it certainly has its place. But as the National Center for Fair and Open Testing (FairTest) has concluded, "NAEP is better than many state tests but is still far from the 'gold standard' its proponents claim for it."

[readon2 url="http://ourfuture.org/blog-entry/2011114402/nations-report-card-distracts-real-concerns-public-schools"]Continue reading...[/readon2]