[…]

ALEC and the Invisible Schools with Invisible Success

From a report titled Invisible Schools, Invisible Success

Virtual schools are popular because they are profitable. Estimates show that “revenues from the K-12 online learning industry will grow by 43 percent between 2010 and 2015, with revenues reaching $24.4 billion.”

More than 200,000 K-12 students are enrolled in full-time virtual schools across the country; when expanded to all students enrolled in at least one course, the number explodes to 2,000,000. The more children enrolled in virtual schools, the greater the profit for the companies.
[…]
In December 2004, the American Legislative Exchange Council (ALEC) approved the “Virtual Public Schools Act.” That model bill sparked a rush by private companies to embrace virtual schools and virtual learning across the country. Today, there are more than 230 accredited private virtual schools operating nationwide.
[…]
ALEC is closely tied to the virtual school movement, having pushed its “Virtual Public Schools Act” on behalf of corporate members of its board since 2005. The bill was adopted by ALEC through the work of its Education Task Force, made up of corporate lobbyists and conservative legislators. According to the Center for Media and Democracy’s website, ALEC Exposed, two of the three co-chairs of ALEC’s Education Task Force work directly for virtual school companies:

  • Mickey Revenaugh, Co-founder and Senior Vice President of State Relations for Connections Academy, a virtual school company; and
  • Lisa Gillis, Director of Government Affairs and School Development for Insight Schools, part of K12 Inc.

K12 is one of the largest virtual school operators in Ohio. Its Ohio Virtual Academy (OVA) represents about 26% of K12's annual revenues. We've previously demonstrated that virtual schools in Ohio are manufacturing profits at the expense of education, primarily by packing their virtual classrooms. These packed virtual classrooms have a significant effect on students:

–OVA enrolled a total of 18,743 students cumulatively throughout the 2010/2011 school year with 9,593 withdrawing by the end of the year, for an astoundingly high churn rate of 51.1%
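For context, the churn rate quoted above is simply withdrawals as a share of cumulative enrollment: 9,593 ÷ 18,743 ≈ 51 percent, meaning more than half of the students who enrolled at some point during the year had left it by year's end.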

"[…]these cyber schools might as well have a turnstile as their logo for the volume of withdrawals they experience.", noted one researcher.

Highlighting the emphasis K12 puts on profits above education is this leaked email from the CFO of Agora, its school in Pennsylvania:

An April 23, 2010 e-mail from Kevin Corcoran to a host of his colleagues is likely the sort that, in one form or another, millions of Americans deal with regularly during the work day.

Bluntly noting “We have not made the progress we need to in this area,” Corcoran adds that “More than $1[million] in funding” is in the balance.

“Anyone who has not fulfilled their obligation in this area should not be surprised….when it’s time to discuss performance evaluations, bonuses and raises.”
[…]
In the e-mail, Corcoran, who is Agora’s financial chief, was miffed because 81 “IEPs,” short for individualized education programs–basically customized teaching plans for Agora’s growing populace of special education students–hadn’t received the necessary signatures; without them, various school districts would not release reimbursement of $15,000 per pupil (or higher) to Agora, and thus K12, to educate a student populace that have had profound troubles meeting educational expectations.

Corcoran seems more concerned about bonuses and raises than about the fact that students have outstanding IEPs that are not being addressed. This is part of the educational mess ALEC has created and continues to try to create.

Teacher Grades: Pass or Be Fired

We're stealing the headline from this NYT article to bring to your attention a report on the IMPACT rubric for teacher evaluation in Washington, DC. Ohio's new evaluation system, passed in the state budget, draws some of its heritage from IMPACT, so we thought it would be valuable to consider it for a moment.

Emily Strzelecki, a first-year science teacher here, was about as eager for a classroom visit by one of the city’s roving teacher evaluators as she would be to get a tooth drilled. “It really stressed me out because, oh my gosh, I could lose my job,” Ms. Strzelecki said.

Her fears were not unfounded: 165 Washington teachers were fired last year based on a pioneering evaluation system that places significant emphasis on classroom observations; next month, 200 to 600 of the city’s 4,200 educators are expected to get similar bad news, in the nation’s highest rate of dismissal for poor performance.

The evaluation system, known as Impact, is disliked by many unionized teachers but has become a model for many educators. Spurred by President Obama and his $5 billion Race to the Top grant competition, some 20 states, including New York, and thousands of school districts are overhauling the way they grade teachers, and many have sent people to study Impact.

Ohio's new system also involves each teacher receiving two 30-minute in-class observations. Education Sector, a non-profit think tank, recently produced a paper on IMPACT and took a look at some of the ways this new system has affected Washington, DC teachers. We urge you to read the paper in full, below, but we've also pulled out some of the interesting pieces to entice you.

The observations take 30 minutes—usually no more and never any less—and all but one of the administrator visits are unannounced. Based on these observations, teachers are assigned a crucial ranking, from 1 to 4. Combined with other factors, they produce an overall IMPACT score of from 100 to 400, which translates into “highly effective,” “effective,” “minimally effective,” or “ineffective.” A rating of ineffective means the teacher is immediately subject to dismissal; a rating of minimally effective gives him one year to improve or be fired; effective gets him a standard contract raise; and highly effective qualifies him for a bonus and an invitation to a fancy award ceremony at the Kennedy Center.
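To make the scoring mechanics concrete, here is a minimal sketch of how a score-to-rating mapping like this could work. The numeric cutoffs in the sketch are illustrative assumptions only (the excerpt does not give DCPS's actual thresholds), and the function name is ours.

```python
# Minimal sketch of an IMPACT-style score-to-rating mapping.
# NOTE: the numeric cutoffs below are illustrative assumptions, not
# DCPS's actual thresholds, which the excerpt above does not specify.

def impact_rating(score: float) -> tuple[str, str]:
    """Map an overall score (100-400) to a rating and its stated consequence."""
    if not 100 <= score <= 400:
        raise ValueError("overall IMPACT scores run from 100 to 400")
    if score >= 350:   # assumed cutoff
        return "highly effective", "bonus and Kennedy Center award ceremony"
    if score >= 250:   # assumed cutoff
        return "effective", "standard contract raise"
    if score >= 175:   # assumed cutoff
        return "minimally effective", "one year to improve or be fired"
    return "ineffective", "immediately subject to dismissal"

print(impact_rating(320))  # -> ('effective', 'standard contract raise')
```

The point of the sketch is simply how mechanical the stakes are: a handful of points one way or the other moves a teacher between a raise and a dismissal track.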

It is a measure of how weak and meaningless observations used to be that these pop visits can fill teachers, especially the less experienced ones, with the anxiety of a 10th-grader assigned an impromptu essay on this week’s history unit for a letter grade. The stress can show up in two ways—the teacher chokes under the pressure, thereby earning a poor score, or she changes her lesson in a way that can stifle creativity and does not always serve students. Describing these observations, IMPACT detractors use words like “humiliating,” “infantilizing,” “paternalistic,” and “punitive.” “It’s like somebody is always looking over your shoulder,” said a high school teacher who, like most, did not wish to be named publicly for fear of hurting her career.

[…]

“Out of 22 students, I have five non-readers, eight with IEPs [individual educational plans, which are required by federal law for students with disabilities], and no co-teacher,” says the middle school teacher. “The observers don’t know that going in, and there is no way of equalizing those variables.”

[…]

Bill Rope is not young, or particularly bubbly, but he is a respected teacher who sees this unusual relationship from the confident perspective of an older man who went into education after a 30-year career in the foreign service. Rope, who now teaches third grade at Hearst Elementary School in an affluent neighborhood of Northwest D.C., was rated “highly effective” last year and awarded a bonus that he refused to accept in a show of union solidarity.

But a more recent evaluation served to undermine whatever validation the first one may have offered. In the later one, a different master educator gave him an overall score of 2.78—toward the low end of “effective.”

[…]

So how did it all shake out? At the end of IMPACT’s first year, 15 percent of teachers were rated highly effective, 67 percent were judged effective, 16 percent were deemed minimally effective, and 2 percent were rated ineffective and fired.

[…]

Theoretically, a teacher’s value-added score should show a high correlation with his rating from classroom observations. In other words, a teacher who got high marks on performance should also see his students making big gains. And yet DCPS has found the correlation between these two measures to be only modest, with master educators’ evaluations only slightly more aligned with test scores than those of principals.

Impact Report Release