Article

STUDY: Teachers Find No Value in the SAS Education Value-Added Assessment System

A new study published in Education Policy Analysis Archives, titled "Houston, We Have a Problem: Teachers Find No Value in the SAS Education Value-Added Assessment System (EVAAS®)," looks at the use of value-added measures in the real world. Its findings are not shocking, but they continue to be troubling as we enter a high-stakes phase of deployment.

Today, SAS EVAAS® is the most widely used VAM in the country, and North Carolina, Ohio, Pennsylvania and Tennessee use the model state-wide (Collins & Amrein-Beardsley, 2014). Despite widespread popularity of the SAS EVAAS®, however, no research has been done from the perspective of teachers to examine how their practices are impacted by this methodology that professedly identifies effective and ineffective teachers. Even more disconcerting is that districts and states are tying consequences to the data generated from the SAS EVAAS®, entrusting the sophisticated methodologies to produce accurate, consistent, and reliable data, when it remains unknown how the model actually works in practice.

As you can see, the findings here are directly relevant to educators in Ohio. The report looked at a number of factors, including reliability, which once again proves to be anything but.

Reliability
As discussed in related literature (Baker et al., 2010; Corcoran, 2010; EPI, 2010; Otterman, 2010; Schochet & Chiang, 2010) and preliminary studies in SSD (Amrein-Beardsley & Collins, 2012), it was evident that inconsistent SAS EVAAS® scores year-to-year were an issue of concern. According to teachers who participated in this study, reliability as measured by consistent SAS EVAAS® scores year-to-year was, ironically, an inconsistent reality. About half of the responding teachers reported consistent data whereas the other half did not, just like one would expect with the flip of a coin (see also Amrein-Beardsley & Collins, 2012).

Reliability Implications
Unless school districts could prevent teacher mobility and ensure equal, random student assignment, it appears that EVAAS is unable to produce reliable results, at least greater than 50% of the time.

A random number generator isn't an appropriate tool for measuring anything, let alone educator effectiveness that might lead to high-stakes career decisions.
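To make the coin-flip comparison concrete, here is a minimal, purely illustrative simulation (assumed numbers only, not the study's data or the EVAAS methodology): if year-to-year "effective"/"ineffective" labels were handed out at random, roughly half of all teachers would still see consistent ratings from one year to the next, which is about the rate the surveyed teachers reported.

    import random

    # Purely hypothetical sketch: ratings assigned by coin flip, not by any
    # actual value-added model. The teacher count is an arbitrary assumption.
    random.seed(42)
    N_TEACHERS = 10_000

    year_one = [random.choice(["effective", "ineffective"]) for _ in range(N_TEACHERS)]
    year_two = [random.choice(["effective", "ineffective"]) for _ in range(N_TEACHERS)]

    # Count teachers whose label happened to stay the same across both years.
    consistent = sum(a == b for a, b in zip(year_one, year_two))
    print(f"'Consistent' ratings by pure chance: {consistent / N_TEACHERS:.1%}")
    # Prints roughly 50% -- agreement indistinguishable from a coin flip.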

Furthermore, the study found that, despite claims to the contrary, teachers are discovering that the SAS formula for calculating value-added scores is highly dependent upon the student population:

teachers repeatedly identified specific groups of students (e.g., gifted, ELL, transition, special education) that typically demonstrated little to no SAS EVAAS® growth. Other teachers described various teaching scenarios such as teaching back-to-back grade levels or switching grade levels which negatively impacted their SAS EVAAS® scores. Such reports contradict Dr. Sanders’ claim that a teacher in one environment is equally as effective in another (LeClaire, 2011).

In conclusion, the study finds:

The results from this study provide very important information of which not only SSD administrators should be aware, but also any other administrators from districts or states currently using or planning to use a VAM for teacher accountability. Although high-stakes use certainly exacerbates such findings, it is important to consider and understand that unintended consequences will accompany the intended consequences of implementing SAS EVAAS®, or likely any other VAM. Reminiscent of Campbell’s law, the overreliance on value-added assessment data (assumed to have great significance) to make high-stakes decisions risks contamination of the entire educational process, for students, teachers and administrators (Nichols & Berliner, 2007). Accordingly, these findings also strongly validate researchers’ recommendations to not use value-added data for high-stakes consequences (Eckert & Dabrowski, 2010; EPI, 2010; Harris, 2011). While the SAS EVAAS® model’s vulnerability as expressed by the SSD EVAAS®-eligible teachers is certainly compounded by the district’s high-stakes use, the model’s reliability and validity issues combined with teachers’ feedback that the SAS EVAAS® reports do not provide sufficient information to allow for instructional modification or reflection, would make it seem inappropriate at this point to use value-added data for anything.

The full study can be read below.

Houston, We Have a Problem: Teachers Find No Value in the SAS Education Value-Added Assessment System (EVAA...

Top 7 Tips for Improving Public Schools

  • Discourage teacher turnover by downplaying the importance of having money and respect
  • Maybe get some underprepared, overconfident recent college graduates in there to figure things out
  • Federal law that prevents Dylan from raising his hand and wasting everybody’s time with the wrong answer
  • Tattoo grades on foreheads to shame low performers
  • Toss Northrop Grumman another $4.5 billion and see what kind of curriculum it pumps out
  • Whatever you do, don’t change anything about a property-tax-based funding system in which rich schools get richer while poor schools get poorer. That’s working just fine.
  • Cut losses and reallocate funding to nation’s prison system

You might be fooled into thinking these are the latest ideas from Students First; alas, no.

The great charter school rip-off

Last week when former President Bill Clinton meandered onto the topic of charter schools, he mentioned something about an “original bargain” that charters were, according to the reporter for The Huffington Post, “supposed to do a better job of educating students.”

A writer at Salon called the remark “stunning” because it brought to light the fact that the overwhelming majority of charter schools do no better than traditional public schools. Yet, as the Huffington reporter reminded us, charter schools are rarely shuttered for low academic performance.

But what’s most remarkable about what Clinton said is how little his statement resembles the truth about how charters have become a reality in so many American communities.

In a real “bargaining process,” those who bear the consequences of the deal have some say-so on the terms, the deal-makers have to represent themselves honestly (or the deal is off and the negotiating ends), and there are measures in place to ensure everyone involved is held accountable after the deal has been struck.

But that’s not what’s happening in the great charter industry rollout transpiring across the country. Rather than a negotiation over terms, charters are being imposed on communities – either by legislative fiat or well-engineered public policy campaigns. Many charter school operators keep their practices hidden or have been found to be blatantly corrupt. And no one seems to be doing anything to ensure real accountability for these rapidly expanding school operations.

Instead of the “bargain” political leaders may have thought they struck with seemingly well-intentioned charter entrepreneurs, what has transpired instead looks more like a raw deal for millions of students, their families, and their communities. And what political leaders ought to be doing – rather than spouting unfounded platitudes, as Clinton did, about “what works” – is putting the brakes on a deal gone bad, ensuring those most affected by charter school rollouts are brought to the bargaining table, and completely renegotiating the terms for governing these schools.

(Continue reading at Salon.com)

The Testing Camera

This video by author/illustrator Peter H. Reynolds is both a current reality and a cautionary tale about what testing does, or can do, to our children.

The fascinating story about the testing camera raises questions about education in general and about testing in particular.

The Trouble with Having Trouble with the Common Core

2015 is the year when the Common Core rubber finally hits the road, and maybe even goes off the rails. 2014-15 is the first full year of the standards, and this spring will see those standards put to the standardized Common Core test. The results are likely to be ugly. The standards are new and challenging, there has been little time to prepare, and technology infrastructure is nowhere near where it needs to be. These are just some of the reasons we'll be hearing a lot about the impact of the Common Core once the tests are wrapped up and the results are in.

Couple this with a sizable shift to the right in state legislatures across the country, including Ohio, and we're certain to see more efforts to repeal or otherwise change the Common Core.

Andy Smarick at Bellwether Education Partners, a right-wing corporate reform think tank, believes there are serious challenges ahead for the Common Core:

Rather than addressing conservatives’ intellectually serious concerns, too many proponents, time and time again, have antagonized the right. Skeptics have been told their opposition is a “circus,” just “political,” and “not about education,” and that they must be “comfortable with mediocrity,” “paranoid,” and/or “resistant to change.”

Just weeks ago, Secretary Duncan caricatured opponents as “politicians who want to dummy down standards…to make themselves look good.” The reliably liberal NPR just ran a laudatory piece on the professor from “an elite liberal arts college in Vermont” who authored Common Core math. The world’s most influential philanthropist called the substance of what we teach our kids “a technocratic issue”—that is, a matter for technical experts wielding political power—akin to standardizing electric outlets.

All of this inflames, not enervates, the conservative opposition.

The problem with having a problem with the Common Core, however, is what to do instead of it. Going back to old standards isn't an option - that's something everyone agrees on. Developing new standards is costly and time-consuming (as the CCSS have demonstrated), and K-12 education is a ship that doesn't turn easily, or quickly. This leads opponents to offer up all kinds of bizarre solutions. Here's what Ohio tea party legislators dreamed up last year:

Shame on GOP members of the Ohio House Rules and Reference Committee for bowing to partisan pressure and voting out of committee a deeply flawed bill to eliminate Ohio's Common Core educational standards and replace them on an interim basis with old Massachusetts standards. The committee voted 7-2 along party lines Wednesday to approve the controversial plan.
[...]
The bill envisions dumping Common Core next year, switching to pre-2011 Massachusetts standards for the next three academic years and then imposing a new standard that Ohio would develop. The nonpartisan Legislative Service Commission estimates the one-time cost of developing the new standards at up to $15.75 million.

Needless to say, it didn't go anywhere. But a first idea often contains the kernel of your best one, and if this is the best idea opponents of the Common Core have, they are in even deeper trouble than the standards themselves.