Summary of Audit Panel Report

We all remember the fiasco last spring over the fifth grade math test. The folks at DPI set the cut score at 28%, practically guaranteeing that every student who took the exam would pass it. Faced with an enormous credibility problem, the DPI and the State Board contracted with an outside audit panel to review exactly what happened and why. The panel was made up of experts from outside the state who oversee, or play an integral role in, their own states' high-stakes testing programs.
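
To see just how low a bar that is, here is a quick back-of-the-envelope sketch of my own, not anything the panel produced. It assumes a 4-option multiple-choice format and an 80-item test; both figures are hypothetical stand-ins, since I don't have the actual item count in front of me. Blind guessing already averages 25%, so a 28% cut asks for barely more than chance, and a kid who genuinely knows even a fifth of the material is all but certain to clear it.

    import math

    NUM_ITEMS = 80                        # hypothetical test length
    NEEDED = math.ceil(0.28 * NUM_ITEMS)  # raw score needed to clear a 28% cut

    def pass_probability(p_correct):
        # Chance of getting at least NEEDED items right when each item is
        # answered correctly with probability p_correct (binomial tail).
        return sum(
            math.comb(NUM_ITEMS, k) * p_correct**k * (1 - p_correct)**(NUM_ITEMS - k)
            for k in range(NEEDED, NUM_ITEMS + 1)
        )

    # A student who truly knows a fraction f of the items and guesses blindly
    # (1 chance in 4) on the rest gets each item right with probability
    # f + (1 - f) * 0.25.
    for f in (0.0, 0.10, 0.20, 0.30):
        p = f + (1 - f) * 0.25
        print(f"knows {f:.0%} of the material -> expected score {p:.0%}, "
              f"chance of clearing the 28% cut: {pass_probability(p):.1%}")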

The audit was released in early December, and several things about it are striking. First, the mission to find out exactly what went wrong was not fulfilled, although the panel hazards what it thinks is a pretty good guess. Second, the finding that the process that produced the fifth grade test and cut scores was indeed flawed was never extrapolated to other tests or test-design processes that may be similarly flawed. Third, the State Board has given the report precious little attention since its release. The Board mentioned it briefly at its January meeting and was to have another meeting this past Friday (Feb. 1), but we have yet to hear what real impact, if any, the report will have.

Here is my summary of the report:

1. There appear to be some legal vulnerabilities for the State/DPI on the issue of testing. The report isn't specific, but it makes the point over and over, emphasizing the need to make the whole program bulletproof from a legal standpoint. I have contended, ever since the release of the NC School Psychologists' Report (which emphasizes that the EOCs/EOGs were validated for measuring a school, a district, or perhaps a classroom, BUT NEVER an individual student), that this leaves the State/DPI wide open for a class-action suit. Reason: federal civil rights law states that tests validated for one purpose may not be willy-nilly used for another purpose if they have a de facto negative impact on a protected group (and of course these tests do have a demonstrable negative impact on minorities). Read the Office for Civil Rights report on the use of high-stakes tests on this very issue.

2. Along the lines of the first point, the panel strongly recommends that the testing folks at DPI automatically win any dispute with the curriculum folks. That is, if Fred, who is a curriculum specialist, suggests a change and Wilma, a testing specialist, sees that all the legal bases may not be covered if Fred's suggestion is implemented, then Wilma gets the veto in order to reduce legal liability. Bottom line - more of the tail wagging the dog.

3. The audit panel found that basically the folks at DPI were "hardworking" and "dedicated" but underpaid and understaffed. I agree, but so what.

4. Panel found that legislators had a poor understanding of the testing process, the requirements for field testing and validation, etc., and thus were too optimistic in their timetables and expectations of the DPI folks. Bottom line - panel asks the legislature to back off and let the "experts" at DPI call the shots. Panel made a similar recommendation to the State Board and chastised the Board for duplicating advisory panels that DPI already had, panels which frequently gave conflicting advice. Recommendation - the State Board should listen to DPI.

5. No clear reason was found for the screw-up in setting the cut scores too low on the 5th grade math test (which was the reason for the audit). The best guess was a combination of overworked folks at DPI and a problem in linking the data/growth formula for the new tests to the old ones. I find this explanation a bit unsatisfactory, and I still maintain that we need to ask why in the hell a test with a 28% cut score ever left Wilmington St. The linking/overwork explanation doesn't explain all of that.

6. Along the same line, the audit panel found that a possible problem in setting the cut scores lay in the fact that students may not have tried all that hard when taking the field test, that they left several questions blank, that field-test questions were embedded in "live" tests in different places, and that this combination may have led DPI to incorrectly gauge how students would do when the new test went "live" (the first sketch after this list shows how that mechanism can drag a cut score down). Well, duh. Which should lead all of us to another set of questions: How valid are all the other tests? Just because there wasn't a fiasco like the one with the fifth grade math test DOES NOT mean that the other tests are ipso facto valid. Haven't we all observed student apathy during field tests? Don't many students "Christmas tree" their answer sheets? Of course they do. Does DPI recalibrate and revalidate the EOCs/EOGs on an ongoing basis, or does it merely rely on the original (flawed) field-test data to set baselines?

7. Panel called for either more money or fewer tests.

8. Panel suggested that tests be scored at the state level rather than the LEA level in order to prevent crazy numbers from being released. In other words, by having the tests scanned and graded in Raleigh, if the cut scores result in too many or too few students passing, the opportunity for "jimmy-jiggering" the numbers exists, and so does the luxury of preventing another wholesale embarrassment. Neither exists if we continue to score at the local level. I really laughed at this suggestion because it proves how fixed the numbers are (there must be a certain % pass and a certain % fail). In fact, Fabrizio said last spring that he "knew" the math test was wrong because he "knew" that any test with a pass rate greater than 85% was flawed! (The second sketch after this list shows what that mindset amounts to.)

9. Panel didn't look too kindly on the recent Fairness in Testing Legislation, which called for a curtailment of field testing. Panel wants much, much more field testing, for further validation and legal protection.

10. Panel called for an ongoing review/audit of the process by an outside group.
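
On point 6, here is a toy simulation of the very mechanism the panel hypothesizes. Every number in it (the ability range, the apathy rate, the test length, the percentile rule for setting the cut) is something I made up purely for illustration, not anything taken from the report; the point is only the direction of the effect. If kids blow off embedded field-test items, the field scores come in low, any cut pegged to that deflated distribution comes in low, and the pass rate balloons once students take the same items for real.

    import random

    random.seed(1)
    NUM_ITEMS = 80       # hypothetical test length
    NUM_STUDENTS = 5000  # hypothetical field-test sample
    APATHY = 0.40        # hypothetical chance a student blows off a field item
    GUESS_P = 0.25       # blind guess on a 4-option item

    def score(ability, trying_hard):
        correct = 0
        for _ in range(NUM_ITEMS):
            if trying_hard or random.random() > APATHY:
                correct += random.random() < ability   # honest attempt
            else:
                correct += random.random() < GUESS_P   # apathetic blind guess
        return correct / NUM_ITEMS

    abilities = [random.uniform(0.40, 0.90) for _ in range(NUM_STUDENTS)]

    # Field test: apathy drags the observed scores down.
    field_scores = sorted(score(a, trying_hard=False) for a in abilities)

    # Illustrative standard-setting rule: peg the cut to the 15th percentile
    # of the FIELD distribution, i.e. expect roughly an 85% pass rate.
    cut = field_scores[int(0.15 * NUM_STUDENTS)]

    # Live test: everyone tries, and far more than 85% clear the deflated cut.
    live_scores = [score(a, trying_hard=True) for a in abilities]
    live_pass = sum(s >= cut for s in live_scores) / NUM_STUDENTS

    print(f"cut score pegged to the field data: {cut:.0%}")
    print(f"pass rate once students actually try: {live_pass:.1%}")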
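
And on point 8, here is a tiny illustration (again with made-up numbers) of what a "we know the right pass rate" mindset amounts to: if the cut is pegged to a fixed percentile of whatever score distribution is in hand, the pass rate comes out at 85% by construction, no matter how much the students actually know.

    import random

    random.seed(2)

    def pass_rate_with_normative_cut(scores, target_fail=0.15):
        ranked = sorted(scores)
        cut = ranked[int(target_fail * len(ranked))]   # cut at the 15th percentile
        return sum(s >= cut for s in scores) / len(scores)

    weak_cohort = [random.gauss(45, 10) for _ in range(10000)]    # average score 45
    strong_cohort = [random.gauss(80, 10) for _ in range(10000)]  # average score 80

    print(f"weak cohort pass rate:   {pass_rate_with_normative_cut(weak_cohort):.1%}")
    print(f"strong cohort pass rate: {pass_rate_with_normative_cut(strong_cohort):.1%}")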

John deVille - Communications Director, North Carolina Citizens for Democratic Schools

Education Week's Summary of Audit Panel Report

Audit Panel Report (47 pages in pdf format)

 

 
