Using Education Data to Address Misleading Education Data Claims

Monday, December 18, 2023 by Dorothyjean Cratty

Across this EDDR community of practice (and its previous DATA-COPE incarnation), we hear a lot of concerns about dealing with data mischaracterizations, specifically those made by state and district leadership. Relevant lessons learned and shared in the book series include the Volume I chapters on Politics and Descriptive Statistics and the Volume II chapter on Transparency.

How much these help depends, of course, on your role in the agency as well as the objectives and determination of the mischaracterizing leaders. The Politics chapter explains agency roles and provides strategies, the Transparency chapter offers advice for nudging leaders where possible, and the Descriptive Statistics chapter's basic techniques may help receptive leaders gain more perspective.

This blog post addresses another level of this challenge, one that is growing in the current political environment. (It may be most useful to those EDDR readers working outside of agencies.) We all know it is not uncommon for state or district leadership to present results in an overly favorable light. What this blog illustrates is political leaders with an extreme agenda mischaracterizing state data to claim that students are underserved by public education and harmed by equity initiatives.

The blog content is drawn from a full critique sent to Virginia state legislators, addressing the current administration’s published rationales and affected policies. (Opinions, findings, and conclusions expressed here are solely those of the author; please write with any questions to CrattyEDDR@gmail.com.) The focus here is on data sources and analysis for fact-checking. Data sources are linked and include state annual subgroup school enrollments, test results, graduate counts, AP courses, AP exams, and teacher vacancies, as well as national data on state measures for the SAT, AP, homeschooling, and private school enrollment.



Virginia Education Data Report Critique

In May 2022, the incoming Virginia administration of Governor Youngkin issued a report entitled Our Commitment to Virginians: High Expectations and Excellence for All Students.

The report claims to show declining achievement and widening gaps in Virginia education outcomes relative to national averages. It does so using previous administrations’ publicly reported data, while claiming these data had been hidden from the public. The report selects a handful of seemingly damning Virginia student outcome measures and attempts to attribute them to previous administrations’ equity initiatives and policies that are uncorrelated with those outcomes in both timing and content.

As summarized in the Foreword (p. 3):

This report should create a sense of urgency and importance for all of us. Decisions made at the state level created confusion in Virginia education and downplayed troubling trends. It is noteworthy that the rhetorical emphasis on equity coincided with the widened gaps in student achievement. And now, decisions at the state level must correct those errors and reverse these disturbing trends.

The report narrative claiming these select data points reveal a state in decline is consistently repeated in press releases and interviews by the governor, the secretary of education, and the state superintendent. Its release followed the Governor’s Executive Order One, signed on his first day in office, immediately banning equity initiatives and directing the superintendent to remove state equity resource materials. The ban included state equity audit tools and trainings for measuring and closing achievement gaps, and caused districts to cancel their own efforts.

After banning tools for measuring and closing student outcome gaps, the administration’s report highlighted examples of these gaps and claimed previous administrations caused these by setting low expectations in the name of equity—exacerbating what they claim are overall downward state trends—and hid this from the public (p. 6):

Historically, Virginia students have outperformed their peers on national measures such as the NAEP, AP examinations, and the SAT and ACT college admissions tests. But today tells a different story. Our reputation and overall high average performance masks widening achievement gaps in the Commonwealth’s schools, as well as a recent downward slide in comparison with other states on a range of academic achievement measures. In 2017, downward performance trend lines on the state SOL assessments foreshadowed the declines to come in 2019 on national reading tests for elementary and middle school students. We also have seen this in the performance of our students’ [sic] on AP exams. Since 2015, the percentage of Virginia students earning a passing score has fallen from third to ninth in the nation. And we have widening gaps in achievement, access to opportunities, quality of schools, and college and career readiness across communities.

In truth, the data do not support these claims of widespread trending failure in Virginia or of greater learning loss. (For instance, the percentages of Virginia graduates meeting the SAT benchmarks are consistently higher than the national average: every year, in every subject, for every student group.) The misrepresentations of the data are too extreme to have been anything but intentional, which raises the question: why generate misleading claims to depict a declining state that the data do not support?

Governor Youngkin’s Secretary of Education, Aimee Guidera, explained the reasoning in an interview with the American Enterprise Institute (AEI), which occurred just prior to the report’s release and espoused many of the same claims. Here are excerpts of her response to the first question (from the full transcript online, p. 14):

Question: ...where’s the administration on increasing education choices for children outside the public school system... 

Guidera: So there are a lot of us as you know from our backgrounds that believe deeply in those opportunities but where we are right now in the commonwealth is that we have a whole lot of as I said change management work to do first right of building that awareness and that urgency to do that...

I think that when we start showing people literally how many kids are further behind there’s gonna I hope that there is increasing demand that we cannot wait...  

...that’s something we’re gonna be working on but for right now our priority is going to be getting the data out there...  

This provides important context for understanding why this administration appears to prioritize denigrating education in Virginia over improving it. It is not uncommon for new administrations to blame others for negative results published early in their tenure and to take credit for positive ones. What is unique about the Youngkin administration is that it specifically compiles measures it can cast as negative, even when doing so requires a stretch of both imagination and the truth.

For example, in what appears to be their most oft-repeated claim, they cite the 2021 Advanced Placement (AP) exam results as proof of a broader Virginia failure by comparing them to a single erroneously high number from eight years ago. Why try to paint Virginia public education as failing by claiming Virginia graduates passed far fewer AP exams during the pandemic? 

It also explains why these repeated statements make no mention of efforts to improve the results. (States increase AP passing rates by increasing supports for AP-prerequisite courses and funding for qualified AP subject teachers, using equity data to ensure access across schools and student subgroups, and providing funds to help families pay for the exams.) The AP example is just one of the many negative claims rebutted in this critique, which begins with another of their frequently repeated claims: that the Youngkin administration will bring data transparency that had been missing.



Claim: “Lack of Transparency”

Report Claims: We now see that Virginia education has experienced: Lack of transparency around negative data trends on key student and school success indicators (p. 6). 

The Youngkin report, and this critique, draw on data which have been publicly posted on the VDOE website. The Data and Reports webpage compiles annual downloadable data files with detailed data by school and student subgroup for all Virginia programs and outcomes. Virginia student results on national tests, including the AP and SAT, were posted annually on the News Releases webpage. This searchable library included over ten years of results by student subgroups and relative to US means. The Youngkin administration has since removed these national results for every year prior to 2021.



Claim: “Slipping Measures of College and Career Readiness: AP Scores”

Report Claims: Since 2015, the percentage of Virginia students earning a passing score [on AP exams] has fallen from third to ninth in the nation (p. 6).  [Note: the “2015” report covers 2014 results.]

Report Figures (p. 9) Top 10 AP States 2014 and 2021

April 3, 2023 Secretary Guidera VDOE Press Release of 2022 Results: “Virginia’s 2021-2022 AP results are yet another sad reminder that when previous administrations lowered expectations, Virginia’s children suffered,” Guidera said. “The commonwealth must reverse the declines in the AP scores that have occurred over the last 10 years by restoring rigor and celebrating the achievements of our students. I will be working with Dr. Lisa Coons, our incoming superintendent of public instruction, on creating a strategy to restore the performance of Virginia’s students in advanced courses and expanding opportunities for students to pursue rigorous academic courses.”

Actual AP Data

These claims of falling percentages of Virginia graduates passing AP exams are false. The 2014 passing rate was not 30.0—it was 27.7. This rate was consistent with adjacent years, then increased steadily to 28.8 as of 2019, and remained slightly higher than 2014 (at 27.9) as the pandemic began in 2020. The first real drop occurred amid the 2021 school closures (understandably), when the rate fell to 26.9. The following year, under the Youngkin administration, was the first year passing rates fell below the 2013 rate—to 25.2—and Virginia fell out of the top ten AP-ranked states for the first time.

The graph below shows the annual Virginia rates relative to the US averages. The dark lines are the true rates. The faint lines are the initial estimates released by the College Board using projections of state high school graduates. The rates are corrected one year later when states publish the actual graduate counts. Projections underestimated 2014 and 2015 Virginia graduates, so these rates were revised sharply downward a year later. The previous administration reported these revisions in press releases which the current administration deleted. (Annual AP reports are available from this author.)
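The revision mechanics can be sketched in a few lines of Python. All counts below are hypothetical, chosen only to mirror the direction of the 2014 revision described above; they are not the actual College Board or Virginia figures.

```python
# Sketch of how initial AP passing rates get revised: the denominator is
# the number of high school graduates, first projected and later corrected.
# All counts here are hypothetical, for illustration only.

def passing_rate(passers: int, graduates: int) -> float:
    """Percent of graduates with at least one qualifying AP score."""
    return 100.0 * passers / graduates

passers = 25_000           # hypothetical count of graduates passing an exam
projected_grads = 83_000   # hypothetical initial projection (undercount)
actual_grads = 90_000      # hypothetical actual state-published count

initial = passing_rate(passers, projected_grads)  # inflated first estimate
revised = passing_rate(passers, actual_grads)     # corrected a year later

print(f"initial: {initial:.1f}%  revised: {revised:.1f}%")
```

Because the projection undercounts graduates, the first published rate overstates the true one, which is the pattern behind the 2014 and 2015 revisions described above.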

Figure 1: Percentage of Graduates with Qualifying AP Scores in VA and US 2013-2022

Each year, the administration states it will address this “fall” from 2014. Had they looked, their own data would have revealed steady rates of students taking AP courses and exams, as well as passing exams.

Figure 2: Number of VA Graduates Taking and Passing at Least One AP Exam 2013-2022



Claim: “Learning Loss Worse than Feared: Renaissance STAR Math/Literacy”

Report Claims: Virginia’s math performance dropped seven points more than the national average... English performance in the Commonwealth dropped one point more than the national average (p. 8).

Report Figures: Appendix K Math Learning Loss, and Appendix L Literacy Learning Loss

This claim refers to the Renaissance STAR interim assessments which school districts can purchase for “universal screening, progress monitoring, and goal-setting.” The administration’s report essentially copies a screenshot of the Renaissance webpage reporting fewer tested students reaching benchmarks in winter 2021-2022 versus 2020-2021. Less than 5% of Virginia’s 1.3 million students were tested—a tiny unrepresentative sample that no state education leader should mischaracterize or misunderstand to be an accurate measure of student performance in English and Math over time.

Removed from the report’s version are the positive Student Growth Percentile (SGP) results, which the webpage notes are the only true year-to-year comparison results (and only for the few students tested in both years).

Figure 3: Math/Literacy Median Fall-to-Winter SGP Levels in 2021-2022 Minus 2020-2021



Claim: “Eroding Parent Confidence in VA K-12 Schools: Public School Exodus”

Report Claims: School shutdowns and prolonged virtual instruction... led to an overall lack of confidence in the education system... Home schooling increased 56% in 2020-2021... The exodus of thousands of families from Virginia’s public schools was noted [by a study which] reported that the pandemic accelerated a pre-pandemic trend since 2010 of growth in homeschooling and private education outpacing growth in public school enrollment (p. 13).

There is no “exodus” found in the data on homeschooling or private schooling in Virginia, either over time or relative to other states. From 2010 to 2020 (the last year with all data available), homeschooled students rose slightly to just 2.8% of Virginia’s 1.3 million total public, private, and homeschooled students, while private school enrollment fell to 5.8%. The number of homeschooled students reached 59,638 when schools closed in the pandemic, but has fallen by thousands each year since.

Figure 4: VA K-12 Private and Home School Student Enrollment, Percentages and Counts

Both the Virginia private and home school percentages consistently trail their US averages before, during, and after the pandemic. The National Center for Education Statistics (NCES) reported that US private school enrollment held steady at 9% through the last published numbers in 2020, versus Virginia’s 5.8% that year and its 7.1% all-time high. As noted in the Youngkin report quote above, homeschooled students in Virginia jumped 56% when schools closed in the pandemic. An Associated Press analysis of all states with annual homeschool enrollment counts found a 63% average increase that year. A similar Washington Post analysis found a 2018-2023 increase of 51% for the US versus 31% in Virginia. Finally, every NCES and Census household survey to date has found the national average percentage of homeschooled students to be larger than the percentage homeschooled in Virginia.
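The gap between a dramatic-sounding percentage jump and a small share of total enrollment can be illustrated with a short Python sketch. The counts below are rounded, hypothetical values in the neighborhood of those cited above, not the official Virginia figures.

```python
# Sketch distinguishing a growth rate from a share of total enrollment:
# a 56% jump on a small base still leaves a small share of all students.
# Counts are hypothetical round numbers, not official Virginia figures.

home_before = 38_000               # assumed pre-closure homeschool count
home_after = home_before * 1.56    # a 56% jump, per the report's figure
total_students = 1_300_000         # approx. total public+private+home K-12

growth_pct = 100 * (home_after - home_before) / home_before
share_pct = 100 * home_after / total_students

print(f"growth: {growth_pct:.0f}%  share of all students: {share_pct:.1f}%")
```

Even after the pandemic jump, the homeschool share remains a single-digit percentage of all students, which is why the raw growth rate alone cannot establish an “exodus.”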



Claim: “Low Expectations Led to Declining Achievement: SOL/NAEP Reading”

Report Claims: Reading SOL test scores grades 3 through 8 declined every year from 2017-2019 (p. 4). In 2017, downward performance trend lines on the state SOL assessments foreshadowed the declines to come in 2019 on national reading tests for elementary and middle school students (p. 6). …proficiency cut scores were lowered in mathematics in 2019, followed by reading in 2020. Virginia is the only state to define proficiency on its fourth-grade reading test below the NAEP Basic... (p. 10).

The report highlights a 2017-2019 decline in Virginia statewide Standards of Learning (SOL) reading assessments. Also highlighted throughout the report is a concurrent decline in Virginia reading results on the National Assessment of Educational Progress (NAEP), known as the “nation’s report card.” The SOL decline appears (as black trend lines) in the graphs below: state mean pass rates varied between 79% and 80% from 2015 to 2018 before falling to a low of 78% in 2019.

The report attributes these declines to test proficiency levels set by the Virginia Board of Education: specifically, proficiency cut scores that are lower on the SOL than on NAEP and that were lowered in some SOL revisions. It is unclear, though, why it would take 20 years for the SOL-NAEP scale gap to have an impact, or why reading would decline in 2019 when it was the math assessments that changed that year.

Figure 5: VA SOL Reading Pass Rates 2015-2019 for All Students and Program Subgroups

The right graph also shows the lower pass rates for economically disadvantaged students, English Language Learners (ELL), and students with disabilities. Enrollment increased for each of these groups between 2017 and 2019, and these increases were largest in the early grades, which experienced the largest pass rate declines. For example, the percentage of ELL students in 5th grade went from 6% to 9%, in 4th grade from 8% to 12%, and in 3rd grade to as high as 14% by 2019. (Note: adding newly immigrated students’ results in 2018 contributed to the SOL drop but did not affect NAEP.)
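The composition effect at work here can be illustrated with a small Python sketch. The subgroup pass rates and enrollment shares below are hypothetical; the point is only that a shift in the mix of test takers lowers the overall mean even when every subgroup's own rate is unchanged.

```python
# Sketch of a composition effect: the overall pass rate is an
# enrollment-weighted mean, so growing a lower-scoring subgroup's share
# lowers the mean even with every subgroup's rate held constant.
# All rates and shares below are hypothetical.

def overall_rate(shares: dict, rates: dict) -> float:
    """Enrollment-weighted mean pass rate across subgroups."""
    return sum(shares[g] * rates[g] for g in shares)

rates = {"ELL": 55.0, "non-ELL": 82.0}  # assumed, held constant both years

before = overall_rate({"ELL": 0.08, "non-ELL": 0.92}, rates)  # 2017-style mix
after = overall_rate({"ELL": 0.12, "non-ELL": 0.88}, rates)   # 2019-style mix

print(f"before: {before:.1f}  after: {after:.1f}  change: {after - before:.1f}")
```

With these assumed numbers, a four-point shift in the ELL share alone moves the overall mean down by about a point, the same order of magnitude as the 2017-2019 SOL dip.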

In press releases of SOL and NAEP results, the prior Virginia administration and Board of Education delved into these types of underlying data and discussed targeted funding and resources, including: “…increased funding for reading specialists and the creation of an ‘equity fund’ that would provide an additional $131.9 million in state support for schools serving significant numbers of children in poverty. Distributions from the fund would support school division efforts to recruit and retain experienced and effective teachers and other professional staff in high-poverty schools and provide additional intervention and remediation services for students.” (Since deleted from VDOE site.)

As research continually shows, students in schools with high percentages of ELL or economically disadvantaged students have lower outcome measures on average—and this includes lower averages for students outside of those subgroups as well. These schools need more resources for all of their students—this is why states add subgroup concentration and growth rates into their funding formulas. Higher average percentages of unfilled teacher positions in these schools also increase funds needed.

Figure 6: School Reading Pass Rates by Economically Disadvantaged, ELL, Teacher Vacancies

The figure above shows that the mean reading pass rate in schools with 60% economically disadvantaged students is 9 points lower than in schools with 40% (top-left graph, red line), while the mean pass rate for students outside that subgroup is 6 points lower (green line). The ELL correlation is also consistent, but smaller. Schools struggling with unfilled teacher positions show a roughly one-to-one loss in pass rates, losing one pass-rate point for each percentage point of positions unfilled (bottom-left graph). Finally, unfilled teaching positions are highest in schools with high percentages of economically disadvantaged students, in a one-to-one positive correlation (bottom-right graph). These needs, which increased in the pandemic, were detailed in recent Joint Legislative Audit and Review Commission (JLARC) reports on the VA K-12 Funding Formula and K-12 Teacher Pipeline.
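A "one-to-one" relationship of this kind is typically read off the slope of a least-squares fit. The school-level data below are simulated with an assumed slope of minus one, not the actual JLARC or VDOE figures; the sketch only shows how such a slope is estimated and checked.

```python
# Sketch of estimating a "one-to-one" slope: simulate school-level data
# where pass rates drop one point per point of teacher vacancies, then
# recover the slope with a least-squares line fit. Data are simulated,
# not the actual Virginia school figures.
import numpy as np

rng = np.random.default_rng(0)
vacancy_pct = rng.uniform(0, 15, size=200)                        # assumed
pass_rate = 85.0 - 1.0 * vacancy_pct + rng.normal(0, 2, size=200)  # noisy

slope, intercept = np.polyfit(vacancy_pct, pass_rate, 1)
print(f"fitted slope: {slope:.2f} pass-rate points per vacancy point")
```

With 200 simulated schools the fitted slope lands close to the built-in minus one, which is the pattern a one-to-one loss would produce in the bottom-left graph.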

Unfortunately, even after the pandemic, the Youngkin administration shows little interest in delving into these real issues to address proficiency loss in reading and other subjects. In their recent report to the legislature on ways their administration will “promote excellence and higher student achievement,” they repeat the SOL cut-score claims and propose changing the cut scores. They continue to frame learning loss in terms of the previous administrations’ low expectations, closure policies, and equity initiatives.

In the first full school year under Youngkin, Virginia SOL reading results showed zero growth in pass rates from 2022 to 2023. By contrast, neighboring Maryland, with a similar student population and the same school closure policies as Virginia, increased its 2023 reading pass rates by 3 percentage points. Virginia delayed the release of the test results for the first time, then announced them along with initiatives to address learning loss, several years after the school closures. “When asked at the news conference why the governor waited to launch his ‘playbook’ for recovery until this year, Youngkin said he didn’t know the pandemic would have this lasting effect.”

The Youngkin administration is not data driven, as some claim; it is narrative driven, focused on twisting narrow data points and large pandemic losses into proof of some larger Virginia public education failure while neglecting real public-school needs. Their report’s claims of Virginia failing students through “…significant lowering of expectations, a lack of transparency with data, and weak accountability for results…” (p. 5) could become self-fulfilling under their leadership. Whether intentional or not, this serves their school-choice agenda. As Guidera explains (AEI, p. 10): “…it is really hard to get people to recognize that we need to change when they don’t know that there’s a problem.”
