School reformers talk nonstop about using “data” to drive policy, teaching and just about everything else, which, you would think, would require that the data being used be accurate. The following post exposes a troubling problem with the push for “data-driven” everything: bad data. This important piece was written by award-winning Principal Carol Burris of South Side High School in New York, who was named New York’s 2013 High School Principal of the Year by the School Administrators Association of New York State and the National Association of Secondary School Principals, and who was tapped in 2010 as the New York State Outstanding Educator by the School Administrators Association of New York State. Burris has been exposing the botched school reform program in New York for years on this blog, and her earlier posts are worth reading.
By Carol Burris
The New York State Education Department (NYSED) has once again demonstrated its uncanny ability to forge ahead without regard for the facts.
In its zeal to prove there is a crisis of college readiness, combined with its infatuation with big data, NYSED produced reports (SIRS 601-604) to track New York high school graduates’ college enrollment. A few days before the public release of the reports, Deputy Commissioner Ken Wagner sent a memo to districts. He explained that the department had combined school data with that of the National Student Clearinghouse to document which former high school students were enrolled in college and whether they persisted in their studies.
The memo informed superintendents that after the Regents discussed the data, it would be publicly released because it would be of interest to communities.
Our district data coordinator, who is my assistant principal, brought me the SIRS report. It claimed that only 80 percent of our students from the cohort of 2008 (the Class of 2012) were enrolled in college. As soon as I saw the number, I knew it was not correct. Ninety-eight percent of the Class of 2012 told us they were going to college and gave us the name of the college they would attend. Might some have left after one semester, or changed their minds? It’s possible. But I found it difficult to believe that 18 percent had either not enrolled or quickly dropped out.
I asked my assistant principal to drill down to the names in the SIRS report. Not only were the names given; the report also included which colleges and universities the students attended, their race, their special education status, whether or not they received free or reduced-price lunch, and in many cases, their college major. This massive collection of data on graduates made my jaw drop.
And then I looked at the names. The 2012 salutatorian wasn’t on the list. I began a name-by-name comparison of the cohort against the report. The list did not include the names of many former students who were attending private and public colleges and universities, both in and out of state.
I began calling families to verify the report. There were 53 names that did not have a college listing. By 5 p.m. that day, I had spoken with 27 families. In 25 of the 27 cases, the students were thriving in their third year of college. They were at Brown, Bard, Cornell, Bentley, Notre Dame and Wesleyan. One student was at the Naval Academy (which, smartly and ironically, is one of the few schools that does not share data), and another at Tufts. One was at the University of Florida and another at the University of Charleston. What was even more bizarre was that some were in New York State public colleges governed by NYSED: SUNY Buffalo, SUNY Binghamton, SUNY Stony Brook and Queensborough Community College. One student had already graduated from a technical school with a 3.84 GPA. Eighty percent had now become over 90 percent, and over the course of the next few days the percentage would continue to climb. This was no small error.
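For readers who want to check the arithmetic, here is a minimal sketch of how quickly a handful of verified names moves a reported rate. The cohort size is not stated in this post; it is inferred below from the report’s own figures (53 unlisted names at a reported 80 percent enrollment rate implies a cohort of roughly 265 graduates), and because the published percentages are rounded, the numbers will not reconcile exactly.

```python
# Back-of-the-envelope check of the corrected college-enrollment rate.
# Assumption: the cohort size is inferred from the report's figures,
# not stated anywhere in the post.

UNLISTED = 53          # graduates with no college listing on the SIRS report
REPORTED_RATE = 0.80   # enrollment rate claimed by the report

cohort = round(UNLISTED / (1 - REPORTED_RATE))  # roughly 265 graduates (inferred)
listed = cohort - UNLISTED                      # roughly 212 students the report did count

def corrected_rate(verified: int) -> float:
    """Enrollment rate after `verified` unlisted students are confirmed to be in college."""
    return (listed + verified) / cohort

# By 5 p.m. that day, 25 of the 27 families reached had confirmed enrollment.
print(f"After 25 confirmations: {corrected_rate(25):.1%}")  # about 89.4%

# If the remaining calls confirmed enrollment at the same 25-of-27 rate,
# the corrected figure approaches the 98 percent the school had on record.
projected = round(UNLISTED * 25 / 27)
print(f"Projected after all {UNLISTED} calls: {corrected_rate(projected):.1%}")  # about 98.5%
```

The point of the sketch is simply scale: in a cohort this size, every verified name moves the district’s rate by almost 0.4 percentage points, so an undercount of dozens of students is not a rounding glitch.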
When calling, I asked parents whether they had “opted out” of having their son’s or daughter’s college enrollment data collected. They had not. One mom said: “Honestly, if I knew about it, I would have opted out. It is not John King’s[1] business where my son goes to college or what his major is.”
The error was not limited to Rockville Centre. Ken Mitchell, Superintendent of Schools of the South Orangetown Central School District, discovered that 80 of his 2012 graduates who were attending college were not on the list of attendees. On the SIRS report, the New York State Education Department gave his district a 62 percent college-going rate, although the true number was 89 percent.
In an article entitled “Educators livid over college ‘success’ report,” four other districts reported error rates that ranged from 15 percent to 27 percent. Harrison Schools’ Superintendent, Lou Wool, characterized the NYSED report as “irresponsible” and said that it was part of an agenda designed to convince the public “that public schools were failing.”
From APPR scores that do not add up, to a disastrous rollout of the Common Core and its testing, the New York State Education Department has inexplicably gotten a pass on a series of blunders. But there are implications regarding this latest error that go well beyond New York.
Apparently one of the many “holes” in the Clearinghouse data, according to NYSED, is that at some schools, students who do not receive financial aid are excluded (see slide 9). No matter the reasons for the errors, many of our schools’ highest achievers, students who are likely to persist and who do not need remediation, were not reported. Other excluded students are those who attend the military academies, who opt out of data collection, who attend colleges outside of the United States, or who attend colleges that do not share data. And of course there are always errors in matching the databases.
Because of all of the problems described, it is reasonable to question the veracity of many of the national claims regarding college readiness. We have been bombarded by “data” on remediation and the lack of college readiness, and this “data” is used to justify Common Core reform. In some cases, the data is pure exaggeration, as I have written about here and here and here. However, because NYSED created this report, which allows us to “peek inside,” we now know that there are serious problems at the very source. In a news report, Deputy Commissioner Wagner claimed the error rate was 3 percent, because that was the error rate allegedly found by the New York City public schools. That claim does not square, however, with the very large error rates that individual districts are finding.
Flawed reports such as these reflect the mindset of those who are infatuated with data, and who jump to use it when it confirms their belief that schools are not doing a good job. NYSED included the results in a PowerPoint narrative of how unprepared our students are for college. Superintendent Donahue of Byram Hills Schools, referring to NYSED, said that “once again, the confidence that they put in data is misplaced.” That is a generous understatement. What happened in New York should be a cautionary tale for all.
[1] John King is the New York State Commissioner of Education.