
New Orleans schools have a comparatively high rate of testing irregularities on standardized tests

One major testing irregularity that the state looks for is an unusual number of wrong-to-right erasures on standardized test answer sheets.

Jessica Williams/The Lens

Looked at from several different angles, New Orleans public schools have a comparatively high percentage of possible cheating on standardized tests, The Lens has found after reviewing the most recent state data available.

Testing experts who reviewed and approved of The Lens’ methodology offered a broadly accepted rationale: Cheating tends to increase when standardized tests are used for rewards and punishments of schools, teachers or students. For a variety of reasons, New Orleans schools have more riding on the outcome of test scores than public schools elsewhere in the state.

“I suspect that when there are pressures around test performance, it’s more likely that there is going to be a pressure to alter test scores,” said Heather Koons, director of consulting and development for the private educational consulting company MetaMetrics.

The Lens analyzed information from the end-of-year tests for 2010-11; some leaders of schools that had serious testing irregularities that year said they’ve since tightened their security. Others, though, have said the problems didn’t prove that anyone cheated.

Though New Orleans schools far outstripped others statewide in instances of possible cheating, the problem was by no means rampant in the city that year. Only six schools, or 7 percent of the city’s 92 campuses, had major incidents that led to test scores being voided. But that’s still much higher than elsewhere.

In reviewing data from the Louisiana Department of Education, The Lens found:

  • New Orleans public schools had a rate more than three times higher than the rest of the state for serious testing irregularities.

  • Charter schools in the state, which are concentrated in New Orleans, had a rate two times higher than traditional public schools statewide.

  • No charter school outside of New Orleans had a serious incident.

  • Jefferson Parish, which has a similarly sized enrollment, had two schools flagged for major violations.

  • There was little difference in the rate of problems between New Orleans charters and New Orleans traditional schools.

Our analysis is based on the same set of state documents, which showed that 33 New Orleans schools had reported testing irregularities over three years, as we reported in July. The most comprehensive statewide information we received was for 2011. The Lens made a request to the state in July for comprehensive 2012 and 2013 data, which is still pending.

Despite the high concentration of charters among New Orleans schools, one testing expert said the higher rates don’t appear to be a charter issue.

“It’s hard to make that claim,” said Koons, whose North Carolina company works with state education departments and testing companies to measure student learning. The company doesn’t have a contract or licensing agreement with Louisiana for assessment.

Rather, she said, it’s a New Orleans issue.

“The fact that there are more irregularities in New Orleans compared to a different district [Jefferson] … is telling,” she said.

What we counted, what we didn’t

We used a conservative method of counting schools with suspected cheating incidents, which means we’re understating the problem. That’s because we counted a school once if it had one serious problem or if it had two dozen. By not taking a deeper look at schools with multiple problems in this analysis, we don’t address possible evidence of a systematic effort to cheat at a particular campus.

All three testing experts agreed that this school-to-school comparison was an acceptable evaluation method.

We defined a major testing irregularity as plagiarism or excessive wrong-to-right answer changes on state standardized tests. Those problems are discovered in a uniform, objective manner by checking every test in the state — and they result in test scores being voided.

Once schools submit tests to the state, a private company, Data Recognition Corp., and Department of Education testing officials analyze them for these two types of irregularities. If written answers on two tests in the same class are too similar, or if a multiple-choice answer sheet shows erasures and corrections at an unusually high rate, those tests are thrown out.

Though there are other ways to game a test, education officials consider these irregularities key indicators of test-security breaches by students or educators.

State officials are also concerned with a wide range of violations called “administrative errors.” The bureaucratic-sounding label belies potentially serious violations in how schools give or prepare for the test that could goose scores. Such errors can include a teacher pointing out the correct answer to a student, or giving instruction on the material during the test. Other examples include teachers opening test booklets in advance of the test and creating study guide materials strikingly similar to that test. State officials say this happened at two New Orleans charter schools in 2010 and 2011.

In some instances, such errors of test administration can lead to scores being voided.

David Berliner, regents’ professor emeritus of education at Arizona State University and co-author of “Collateral Damage: How High-Stakes Testing Corrupts America’s Schools,” contends that some errors in administering the test are “subtle versions of cheating.” A teacher can give kids extra time or leave key instructional materials on the walls uncovered, hoping kids will use them. If discovered, teachers can claim they didn’t know or forgot the rules.

Still, we didn’t count schools that had only administrative errors. That’s because they’re reported either by the school or monitors from the state. The extent to which schools report these problems largely depends on their leaders’ diligence and honesty, testing experts said.

Further, though state monitors visited about 300 of more than 1,400 public schools statewide in 2011, they tended to go where earlier problems had been detected, so they would be more inclined to look for specific errors. Including those in the analysis would have distorted the results by increasing the number of schools with problems.

About 5 percent of New Orleans public schools had such administrative errors in 2011. Statewide, that figure was 4 percent.

And state analysis for erasures doesn’t extend to special populations, such as those learning English as a second language. However, the plagiarism analysis is applied to all tests.

Still, state education officials criticized our methodology.

Department officials said that our analysis doesn’t take into account the size of schools. If School District A has a high school of 1,000 test-takers, and School District B has a high school of only 300 test-takers, it stands to reason that School District A would have a greater chance of having one test score flagged, skewing the results, they said.

To account for a possible size problem, we also compared Jefferson Parish, which had 90 schools in 2011 and a population of about 45,000 students, to Orleans, which had 92 schools in 2011 and a population of nearly 40,000 students.

Still, department officials said a better comparison would be to count rates of voids per 100 tests taken, or a rate of voids by district.

What’s at stake

As more school districts around the country adopt test-based accountability systems, more problems with cheating have surfaced, said Priscilla Wohlstetter, a senior research fellow at the Consortium for Policy Research in Education and distinguished professor at Teachers College at Columbia University. Wohlstetter was one of the experts who reviewed our work.

The threat of school closures and evaluation systems that tie teachers’ jobs to student results only increase the temptation to cheat, she said.

“You kind of don’t know who’s doing it, but it’s got to be more than the kids,” she said.

A higher percentage of schools with testing irregularities in New Orleans could be due to any number of factors, she said.

“Cheating happens when you’ve got low scorers in your population,” and turnaround schools often work with kids who are the furthest behind, she said.

New Orleans’ public-school kids were assuredly that. Before the state took over a majority of sites after Hurricane Katrina, the city’s schools were among the worst in the state – ranked 67th in academic performance out of 68 school districts, state data shows. After the takeover, the Recovery School District began chartering schools to eager charter operators.

Most schools face some repercussions if they don’t perform well on the state report card, which relies heavily on standardized tests.

The city’s charter leaders face extreme pressure to achieve: they are given three to five years to bring about academic growth, or they’ll lose the school. The traditional schools still under parish School Board control risk transfer to the RSD. Those in the RSD’s direct control risk transfer to a charter operator, which could cost school leaders their jobs. Ultimately, any school could be shut down.

Indeed, accountability advocate and former Board of Elementary and Secondary Education member Leslie Jacobs said the state’s accountability system for schools has been successful in closing down the worst-performing schools and getting students to better campuses.

She said Louisiana likely has closed more schools as a result of high-stakes testing than any other state.

Wohlstetter said the city’s charters could be feeling the pressure from reform-minded districts and advocacy organizations that place them on a pedestal: among these groups and others, New Orleans’ charter-dominated system is touted as a solution to the traditional public schools that have failed so many students.

Yet another possibility: Because so many of the city’s charters are relatively new organizations, they may not be familiar enough with Louisiana’s test security procedures, she said.

The state, though, has provided plenty of guidance on how to test properly. The Education Department requires each district to write a test security policy in line with state rules. Districts must assure the state in writing each year that they’ve done so. The department’s test administration manuals, which every proctor receives, are comprehensive, providing guidance on everything from accommodating students with special needs to planning seating arrangements.

Longtime observers of New Orleans schools might also remember that spikes in test scores, driven by a promise of improvement from former Superintendent Morris Holmes in the early 1990s, were identified by an extensive analysis by The Times-Picayune.

A closer look at our findings

New Orleans has a greater percentage of schools with serious problems, compared with the rest of the state. Six New Orleans public schools had problems out of 92, or almost 7 percent. We compared that with all other public schools in the state. Of those, 29 had serious problems, out of 1,386. That works out to 2 percent.
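The comparison above is straightforward rate arithmetic. Purely as an illustration of the math behind the finding (this is not The Lens’ actual analysis code, and the variable names are ours), the reported figures can be reproduced in a few lines of Python:

```python
# Rates of schools flagged for serious testing irregularities,
# using the counts reported in this story (2010-11 test data).
nola_flagged, nola_schools = 6, 92        # New Orleans public schools
other_flagged, other_schools = 29, 1386   # all other public schools in the state

nola_rate = nola_flagged / nola_schools     # about 6.5 percent, rounding to 7
other_rate = other_flagged / other_schools  # about 2.1 percent, rounding to 2

print(f"New Orleans: {nola_rate:.1%}")
print(f"Rest of state: {other_rate:.1%}")
print(f"Ratio: {nola_rate / other_rate:.1f}x")  # more than three times higher
```

The ratio works out to roughly 3.1, consistent with the “more than three times higher” finding listed earlier.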

The six New Orleans schools were:

  • Dr. King Charter School, which had three tests voided for excessive answer changes

  • Dwight D. Eisenhower Academy of Global Studies, which had 10 tests voided for excessive answer changes or plagiarism

  • F.W. Gregory Elementary (now closed), which had seven tests voided for excessive answer changes or plagiarism

  • James M. Singleton Charter School, which had 23 tests voided for excessive answer changes

  • McMain Secondary School, which had two tests voided for plagiarism

  • Thurgood Marshall Early College High School (now merged into Lake Area New Tech Early College High School), which had two tests voided for plagiarism

Charter schools in the state, which are concentrated in New Orleans, had a rate two times higher than traditional public schools statewide. Of the 90 charter schools statewide, four of them – all in New Orleans – had serious problems, or about 4 percent; 33 of the state’s 1,388 other public schools, or about 2 percent, had these issues.

Of those 90 charters, 29 were outside of New Orleans. No charter outside of the city was flagged for plagiarism or excessive erasures.
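As with the citywide comparison, the charter-versus-traditional finding is simple rate arithmetic. As an illustration only, using the counts reported in this story (not any official calculation):

```python
# Statewide rates of serious problems, charters vs. other public schools (2010-11).
charter_flagged, charter_schools = 4, 90   # all four flagged charters were in New Orleans
other_flagged, other_schools = 33, 1388    # other public schools statewide

charter_rate = charter_flagged / charter_schools  # about 4.4 percent
other_rate = other_flagged / other_schools        # about 2.4 percent

print(f"Charters: {charter_rate:.1%}, other public schools: {other_rate:.1%}")
print(f"Charter rate is {charter_rate / other_rate:.1f}x the non-charter rate")
```

That ratio comes to roughly 1.9, which is where the “two times higher” figure comes from.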

Jefferson Parish had two schools with serious problems that year:

  • West Jefferson High School had five tests voided for plagiarism; district spokeswoman Tina Chong said five students were caught using their cell phones to look up answers.

  • And Helen Cox High School had three tests voided for too-similar answers. Although the state-provided data highlights it as a self-reported issue, Chong said district officials voided the tests after the state and its scoring contractor notified the district of similarities on the students’ tests.

We asked testing experts how New Orleans’ irregularities stack up to other school districts nationally, but they said they weren’t aware of research around the percentage of schools in a district that are typically flagged for testing irregularities.

John Fremer, the president of Caveon Consulting Services and a co-founder of leading cheating-detection firm Caveon Test Security, did say that the average number of erasures a student makes on a test in a single subject has been documented – usually, it’s only one.

“Kids don’t do a lot of erasing, even when the teachers tell them they would like them to go back and check some of their work,” Fremer said. Erasures, and even more so wrong-to-right erasures like the ones Louisiana scans tests for, mean “usually there’s educator misbehavior, or the kids are not being monitored.”

Another expert, James Wollack, an educational psychology professor at the University of Wisconsin-Madison, said that while New Orleans’ erasures are “more…than would be expected under the model due to pure chance,” there are also reasonable explanations for high erasure counts, such as students being taught to cross out bubbles that couldn’t possibly be the answer before bubbling in the actual one. But the state’s inspection of exams should easily be able to identify these as sources of the problem, he said.

Schools say they’ve tightened test security

Jacobs, the former state school board member, agreed that when districts create high-stakes accountability systems, “you have created an incentive for some people to cheat.

“But that’s why we put in a very robust system that sends people out to schools during testing, flags schools that have had out-of-the-norm swings in student performance, and built a system where you are comparing results year over year,” she said.

Some educators said they’ve corrected problems after state officials pointed them out. The Orleans Parish School Board oversees McMain, where the state voided two Graduate Exit Exam math tests with similar written answers in summer 2011.

“Greater efforts are in place to ensure students are seated with adequate spacing to minimize plagiarizing during testing,” Interim Superintendent Stan Smith said in an email. Though the state flagged McMain, Smith and School Board staff maintained that the test administrator didn’t observe cheating, and that no students tested admitted to cheating.

Tracie Washington, the attorney for the board that runs Dr. King Charter School, said that the state flagged King for excessive answer changes because students erased a lot, and the proctor didn’t fill out the appropriate paperwork to document it – not because anyone cheated.

Liz Frischhertz, assessments and accountability chief for East Baton Rouge Parish Schools, said her district tries “to run a really tight ship, but there’s some crazy things people do.”

The district had two schools with tests voided for plagiarism in 2011 and a host of self-reported proctoring issues. District officials interviewed students, teachers, test coordinators and the schools’ principals and made them sign off on test security policy.

In her 13-year tenure with the district, “we haven’t had wholesale cheating in a school, the principal making the teacher cheat or anything like that.” Many teachers’ proctoring errors are unintentional, she said, but teachers still get letters of reprimand for each one.

Other issues may go unreported

The state’s test security reports summarize tests voided for erasures and plagiarism, as well as districts’ self-reporting and the results from state test monitoring. But the reports are limited to what the state has discovered or what districts have elected to tell it; other problems go unreported.

State policy asks districts only to report cheating allegations that have been substantiated through a district’s internal investigation, not every claim of cheating. Such was the case at Lafayette Academy in April 2012, when the school’s board investigated a cheating claim, decided it was unfounded, and released no further information about the incident.

The state’s other methods of discovery aside, the flaw in asking a district to self-report only investigations that conclude cheating occurred, rather than every claim, is that those closest to the school act as judge and jury, said the researcher Berliner, particularly with site-based charter management.

“It’s like asking the fox to watch the chicken coop,” he said. “You just don’t squeal on the people you work with.”

Even with the state’s cheating detection methods at work, if gaming the tests is entrenched in a school’s culture, as it was in Atlanta Public Schools, where instructions to cheat allegedly came from the top down, it might be more difficult to catch offenders. Erasure analysis was key in discovering Atlanta’s widespread test improprieties, but “it’s not always foolproof,” said Brad Thiessen, a statistician and mathematics professor at St. Ambrose University in Davenport, Iowa.

For instance, a teacher could tell students to leave some answers blank and bubble them in herself later.

The principal, district test coordinator, and superintendent could all be complicit in the scheme, as they were in Atlanta and El Paso.

Though that type of cheating is always possible, it means a teacher, principal or even superintendent would have to keep cheating for years to come, Frischhertz and others point out, especially with value-added educator evaluations so closely tied to test scores and with the state monitoring unusual changes in test scores.

“As I tell them, if you cheat, with value-added, you are knocking yourself out,” Frischhertz said. “You better keep cheating, or you better keep doing well. It’s not worth it.”

Della Hasselle contributed reporting to this story.

The Lens' donors and partners may be mentioned or have a stake in the stories we cover.
  • scotchirish

    Interesting that “mainstream” media seem to have little or no interest in this.

    A link to the Atlanta _Journal-Constitution_ investigation – it amounts to a bibliography of the AJC’s reporting:

    http://www.ajc.com/s/news/school-test-scores/

  • TimGNO

    “Every system can be gamed.”
    – Life

  • scotchirish

    Not sure of relevance. The article is a little fuzzy on the distinction, but cheating and gaming the system are not synonymous. Gaming the system implies staying within rules but using them to produce a result not intended by the rules. What happened in Atlanta went beyond gaming the system, to fraud.

  • nickelndime

    I know why cheating and/or testing irregularities occur. Last year (THE LENS reported), a teacher at Pierre Capdau of the UNO New Beginnings network received a $43,000 bonus. Now that’s what I call an incentive! She may well have been a gifted teacher (rarer than one may think), but who knows. However, there are many teachers who may not be as gifted as the one at Capdau, and then we have a problem. Mentioned in the current article (i.e., subtle ways of cheating), I would like to know what the OPSB and the State/RSD are doing to monitor and correct the actions of particular charter schools which complete timely (“just in time”) paperwork for 504 accommodations for particular students (who do not have “special education” IEPs, but are considered poor test takers, ESL included) and are thereby using a subtle procedure to a school’s advantage to segregate students into smaller-than-usual testing sessions. Dare I say the words? – fertile ground for “cheating” and/or “slip ups”/”lapses of attention to detail”/”testing irregularities”… Legally, these accommodations should be in place for students throughout the academic year, but are not. For example, the Einstein Board of Directors (in the absence of the CEO) recently stated in a public meeting (as reported by THE LENS) that portable buildings (which the OPSB promised to provide) will be needed for “TESTING.” Clearly, this group is missing the point of what 504 accommodations are supposed to provide for students. Einstein is not the only charter school doing this and it needs to be corrected now because it is what these students should be getting academically, not just at test time. The OPSB (charter deputy superintendent, test coordinator, etc.) and the State need to get their people to look at the bigger picture well before testing occurs. Onsite monitoring during a week of testing is only one aspect. Gaming – cheating – testing irregularities – that is only a part.
It’s got to be a healthy system, not the tail wagging the dog. Proactive, not reactive.

  • scotchirish

    That there were financial incentives imply fraud. This is why many involved in the Atlanta scandal went to jail. Atlanta (and El Paso) are not unique in their susceptibility to this kind of temptation. Another context that may influence behavior is the “gap mania” that pervades public and, at least rhetorically, private education establishments in the US. Results that appear to attenuate “the gap” are not inspected too closely.

  • scotchirish

    Correction: implied fraud in the Atlanta case.

  • scotchirish

    In addition to the “bottom up” deception in Atlanta and other sites, there was at least at one time pervasive “top down” fraud in the formulation and interpretation of tests: http://www.sagepub.com/wrightstudy/articles/Cannell.pdf . The truth has not been well served by large segments of the education testing industry for some time, in that it has been engaged in production of the “Lake Woebegone effect” rather than rigorous assessment and interpretation.

    The rationalization that the high stakes force the alleged “bottom up” irregularities is the same that would excuse, say, bank robbery – “that’s where the money is.”

  • nickelndime

    scotchirish – I love it! I love it! “The rationalization that the high stakes force the alleged ‘bottom up’ irregularities is the same that would excuse, say, bank robbery – ‘that’s where the money is.’ ” Comparable reasoning, “High-Stakes Testing Corrupts America’s Schools.” Testing doesn’t corrupt America’s schools. People do – teachers, administrators who know better. Think about it. The students are not the ones who are cheating because basically they do not know they are doing anything wrong, but the adults sure as hell know what they are doing, which makes it even worse.

  • scotchirish

    I know that this is an obscure venue, but with as many dogs as there are in this fight, it is surprising how little interest there is here, as shown by the paucity of commenters.

    The students have nothing to gain by this cheating, so it is doubtful that they are the instigators.

    Maybe it’s like drunk driving in Louisiana, kind of cute until they call for the body bags.