Plan to rate teachers based on test scores is under fire
A centerpiece of Massachusetts’ effort to evaluate the performance of educators is facing mounting opposition from the state’s teacher unions as well as a growing number of school committees and superintendents.
At issue is the state’s edict to measure — based largely on test scores — how much students have learned in a given year.
The opposition is flaring as districts have fallen behind a state deadline to create a “student impact rating,” which would assign a numeric value to test score growth by classroom and school. The rating is intended to determine whether teachers or administrators are effectively boosting student achievement. The requirement — still being implemented — would apply to all educators, including music, art, and gym teachers.
“In theory it sounded like a good idea, but in practice it turned out to be an insurmountable task,” said Glenn Koocher, executive director of the Massachusetts Association of School Committees. “How do you measure a music teacher’s impact on a student’s proficiency in music? How do you measure a guidance counselor’s impact on student achievement?”
Critics contend the data can be skewed by factors beyond a teacher’s control, including highly engaged parents or classrooms with disproportionate numbers of students with disabilities or other learning barriers. The requirement has also created problems in developing assessments for subjects where standardized tests are not given, such as art and gym.
Resistance has escalated in recent weeks. On Thursday, the state’s largest teachers union, the Massachusetts Teachers Association, as well as others successfully lobbied the Senate to approve an amendment to the state budget that would no longer require student impact ratings in job evaluations. A week earlier, the Massachusetts Association of School Committees passed a policy statement urging the state to scrap the student impact ratings.
But some educators see value in the student impact ratings. Mitchell Chester, state commissioner for elementary and secondary education, defended the requirement, which has been more than five years in the making.
“Some teachers are strong, others are not,” he said. “If we are not looking at who is getting strong gains and who is not, we are missing an opportunity to upgrade teaching across the system.”
In Boston, which is moving to meet the requirement, Superintendent Tommy Chang began recruiting teachers and administrators last week for a workgroup to help develop the ratings. That prompted the teachers union on Thursday to e-mail a special news bulletin to its more than 5,000 members, condemning the move as a “harmful policy decision.” Earlier this year the teachers union walked out of three years of talks on the issue.
The backlash in Massachusetts echoes similar debates that have unfolded nationwide.
More than five years ago, the Obama administration encouraged dozens of states to make student academic growth a significant part of evaluations, promising millions of dollars through its Race to the Top education overhaul program to states that adopted the evaluation changes along with a host of other school initiatives.
But since then educators and statisticians have raised questions about the reliability of test growth measures. A 2014 report by the Center for Educational Assessment at the University of Massachusetts Amherst, which examined student growth percentiles, found the “amount of random error was substantial.”
“You might as well flip a coin,” Stephen Sireci, one of the report’s authors and a UMass professor at the Center for Educational Assessment, said in an interview. “Our research indicates that student growth percentiles are unreliable and should not be used in teacher evaluations. We see a lot of students being misclassified at the classroom level.”
Under growing criticism, the federal government dropped the student academic growth requirement this year when Congress enacted the Every Student Succeeds Act. Some states, in turn, have started to reverse course.
But other states like Massachusetts, which received $250 million in Race to the Top money, remain committed. Massachusetts is requiring districts to use at least two measures of student academic growth, including one that the state created several years ago for the MCAS math and English exams.
Damian Betebenner, a senior associate at the National Center for the Improvement of Educational Assessment Inc. in Dover, N.H., who developed the MCAS growth percentiles, said he believes student test scores are a reliable piece of evidence of educator effectiveness.
“It’s relevant, but it has to be balanced with a lot of other evidence as well,” he said. “Unfortunately, the use of student percentiles has turned into a debate for scapegoating teachers for the ills.”
Chester said Massachusetts took a more measured approach to the student impact ratings, deciding against having the rating make up a fixed percentage of an educator’s overall evaluation, as some states did. Instead, evaluators can give less weight to the student impact ratings if they don’t mesh with classroom observations and other evidence.
School districts were supposed to issue the first set of ratings in the 2014-15 school year, but the state delayed implementation by a year as districts struggled to comply. The state then loosened the deadline again and expects about 40 districts will issue ratings at the end of this school year, while other districts are collecting data on some educators.
“Implementation is widespread, but it has not reached every educator in every district yet,” Jacqueline Reis, a spokeswoman for the state education department, said in an e-mail.
Last year, the state temporarily withheld $5.6 million in federal funds from the Boston Public Schools for failing to show any progress toward creating the student impact ratings.
Teacher unions and associations representing school committees and superintendents say implementation has been sluggish.
“Most of our school districts can’t get unions to the table to talk about this,” said Thomas Scott, executive director of the Massachusetts Association of School Superintendents. “It’s really becoming a sticking point, and a lot of superintendents feel this is becoming a distraction.”
Superintendents overwhelmingly support looking at student testing data as part of evaluations, he said, but many think developing a student impact rating goes too far, according to a survey the association conducted two months ago.
Barbara Madeloni, president of the Massachusetts Teachers Association, takes a different view, believing that student standardized test scores have no place in a teacher’s evaluation. She praised the state Senate for taking on the fight to eliminate the student impact ratings.
“I think it’s very powerful that our senators are beginning to listen to what educators have been saying — that corporate education reform accountability measures are not working for our educators or our students,” she said.
It remains unclear what the fate of the Senate amendment will be. The House has previously rejected a similar amendment, which means the issue would have to be resolved in a conference committee as the two sides reconcile their budget proposals in the coming weeks.