Publicizing teacher evaluations draws criticism


Unclear whether new precedent has been set


By Adam Shanks
Staff writer

Dennis Walcott, New York City Department of Education chancellor, says that the teacher data recently made public was “never intended to be released.” Photo by Gazette file.
March 05, 2012
The state's largest teachers' union, New York State United Teachers, has joined the United Federation of Teachers in sharply opposing the publication of evaluation scores for nearly 18,000 New York City teachers.

The city announced more than a year ago that it would be making the ratings public, in response to Freedom of Information Law requests from over a dozen news outlets, and the United Federation of Teachers sued to stop it. After months of litigation, the state Court of Appeals decided not to hear the union's case and the data was released Feb. 24.

It remains to be seen whether this will set a precedent for the accessibility of teacher data, including instructors' names, to the media and the public under the newly proposed teacher evaluation system.

"I don't know why the teachers' [names and scores] wouldn't be made public," said Bob Freeman, executive director of the Committee on Open Government.

NYSUT President Richard Iannuzzi bashed the New York City Department of Education for the release of teachers’ names with their evaluations, saying it was a “betrayal of the essential purpose of teacher evaluations.” Photo by AP.
Freeman believes that, unless there is a specific provision written into the new teacher evaluation law shielding teacher names from FOIL, their data would be accessible to the public.

"As public employees, we're accountable to the public," said Freeman.

Carl Korn, a spokesman for NYSUT, believes the New York City teacher data that was released and the data that would be collected under the new statewide evaluation system are "fundamentally different," because the new methods are more subjective and not based solely on testing.

Under the new teacher review system, the results would become part of "job evaluation personnel files," which have not previously been releasable under FOIL; the New York City data, by contrast, was not used to judge teachers. He did acknowledge, however, that NYSUT's attorneys are examining case law to see how the new agreement would interact with freedom of information laws.

"If a legislative fix is required, we'll work toward that," said Korn.

He stressed the importance of keeping the ratings confidential if the state is to successfully improve its teachers.

"The new evaluation system is designed as a support system," said Korn. "If it is going to be used for a different purpose, it will have a chilling effect."

Korn added that if teacher information were to be requested via FOIL under the new agreement, the teachers union would "fight vigorously to ensure the evaluations remain confidential."

Teachers and their unions have criticized the data released by New York City, saying there is a wide margin of error in the evaluations and that they represented only the city's initial step toward rating teachers.

"This sensationalized release of teacher scores from a New York City pilot project is a betrayal of the essential purpose of evaluations, which is to support all teachers in improving their effectiveness," said Richard Iannuzzi, president of NYSUT. "The decision by the Bloomberg Administration and the New York City Department of Education not to oppose the release of scores by individual teachers is deplorable."

The New York City Department of Education stresses that the system was not used to actually "rate" teachers, but rather to simply collect data. The only manner in which it was used to judge teacher value was in determining tenure in a few cases, and the Department of Education says even then it played only a small role.

Dennis Walcott, chancellor of the New York City Department of Education, agreed that the scores do not fully capture or reflect a teacher's work, saying earlier this month in a letter to the city's teachers and principals that "the data does not tell the whole story of your work as a teacher. Teacher Data Reports were created primarily as a tool to help teachers improve."

The data had been released previously, but without the teachers' identities. It covers the scores of teachers of grades 4-8 in English and math from 2007 to 2010.

The method of scoring was a "value-added" model, which predicts, based on previous scores and other factors such as poverty, what a student's test scores should be at the end of the school year. The actual test results are then compared to the predictions to determine the teacher's impact on his or her students' education, and to rank those teachers against their colleagues.
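
The arithmetic behind that description can be illustrated with a short example. The sketch below is not the Department of Education's actual model, which uses many more factors and a more sophisticated statistical fit; it uses invented scores, a simple least-squares prediction, and ten hypothetical teachers purely to show how predicted scores, residuals and peer rankings fit together.

```python
# Illustrative sketch only: a toy version of the "value-added" idea described above,
# not the Department of Education's actual model. All scores and factors are invented.
import numpy as np

rng = np.random.default_rng(0)

# Invented data: each student has a prior-year score, a poverty indicator,
# a current-year score, and the id of the teacher who taught the student.
n_students = 300
prior = rng.normal(650, 30, n_students)          # prior-year test score
poverty = rng.integers(0, 2, n_students)         # 1 = economically disadvantaged
teacher = rng.integers(0, 10, n_students)        # 10 hypothetical teachers
true_effect = rng.normal(0, 5, 10)               # hidden "teacher effects"
actual = 0.9 * prior - 8 * poverty + true_effect[teacher] + rng.normal(0, 10, n_students)

# Step 1: predict each student's current score from prior score and poverty
# (ordinary least squares; the real model considers many more factors).
X = np.column_stack([np.ones(n_students), prior, poverty])
coef, *_ = np.linalg.lstsq(X, actual, rcond=None)
predicted = X @ coef

# Step 2: a teacher's "value added" is the average amount by which that
# teacher's students beat (or fell short of) their predicted scores.
residual = actual - predicted
value_added = np.array([residual[teacher == t].mean() for t in range(10)])

# Step 3: rank teachers against their colleagues, expressed as a percentile.
percentile = 100.0 * value_added.argsort().argsort() / (len(value_added) - 1)

for t in range(10):
    print(f"teacher {t}: value added {value_added[t]:+6.2f}, percentile {percentile[t]:5.1f}")
```

In this toy example, the percentile column plays the role of the peer ranking the article describes: a teacher whose students consistently beat their predicted scores lands near the top, and one whose students fall short lands near the bottom.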

A 2009 report from the Board on Testing and Assessment of the National Research Council states that the value-added model "should not be used to make operational decisions because such estimates are far too unstable to be considered fair and reliable."

"A student's scores may be affected by many factors other than a teacher — his or her motivation, for example, or the amount of parental support," the Board said in a press release accompanying the report. "Value-added techniques have not yet found a good way to account for these other elements."

In a report released along with the teacher data, the Department of Education stressed that "value-added data was designed to be used as one of multiple measures of teacher effectiveness, not on its own."

"These reports were never intended to be public or to be used in isolation," Walcott said. "Ultimately, each news organization will make its own choices about how to proceed."

Susan Fuhrman, president of Teachers College, Columbia University, adamantly disagreed with the release of teacher names in a statement last week.

"There is no evidence that evaluating teachers solely on the basis of their students' performance on standardized tests improves schools," she said, "and releasing the yearly "rankings" of individual teachers is demeaning and demoralizing."

Fuhrman pointed out problems that could lead to errors and inconsistencies in the value-added model of data collection on teachers.

"Value-added measures also fail to capture…the effect of prior-year teachers on students' test scores, or the different content of many tests from one grade to the next," she continued. "Nor do these measures account either for classrooms with high turnover, which frequently have an adverse impact on students' test scores, or for small classes in which a few students' scores can distort the size of overall gains."
