What 'A Nation At Risk' Got Wrong, And Right, About U.S. Schools
Very few government reports have had the staying power of "A Nation At Risk," which appeared 35 years ago this month and stoked widespread concerns about the quality of American schools.
"The educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a nation and as a people," the authors thundered in one of its best-known passages.
When it appeared in April 1983, the report received widespread coverage on radio and TV. President Reagan joined the co-authors in a series of public hearings around the country.
The report's narrative of failing schools — students being out-competed internationally and declining educational standards — persists, and has become an entrenched part of the debate over education in the U.S.
Prudence Carter, the dean of the Graduate School of Education at the University of California, Berkeley, teaches her students that "A Nation At Risk" was a "pivotal moment" in education policy — the beginning of a "moment of angst" about the state of the nation's schools.
That angst found expression, she says, in the No Child Left Behind law in 2002 and the Race to the Top initiative in 2009 and is still enshrined in federal law today.
Although there has been some progress, "the reason that we continue to mark the anniversary is that [the worry] still rings true," says Michael Petrilli, president of the Thomas B. Fordham Institute. He calls the report "a touchstone"; it's in the mission statement of the institute, which promotes school choice, testing and accountability.
This month, U.S. Education Secretary Betsy DeVos invoked the report's anniversary in remarks to the Reagan Institute Summit on Education, convened for the occasion. "Our nation is still at risk," she concluded.
But what I learned in talking to two of the original authors of "A Nation At Risk" was that they never set out to undertake an objective inquiry into the state of the nation's schools.
They started out already alarmed by what they believed was a decline in education, and looked for facts to fit that narrative.
And while their report is still widely cited, a second official federal government analysis of standardized test scores, produced just seven years later, showed the opposite of what was claimed in "A Nation At Risk." That analysis found, instead, "steady or slightly improving trends" in student achievement.
The looming disaster depicted in "A Nation At Risk," it turns out, was a matter of interpretation.
I interviewed Yvonne Larsen, the vice chair of the commission that wrote the report, for my 2015 book The Test. Here's how she described what happened:
"I was called by [President Reagan's] office. They told us that we were going to have a commission ... to address the challenge that we faced in trying to upgrade America's education to the rigorous education that we had in the past ... We felt the rigor in our schools had diminished. We were concerned. There was a strong feeling that if we continued how we were going, we wouldn't continue to improve."
Gerald Holton, now professor emeritus of physics and the history of science at Harvard University, was another member of the commission. He drafted some of the most alarmist language in the document, including the now-famous line: "If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war."
Like Larsen, he said that he and his co-authors set out to confirm their existing concerns about the state of America's schools.
Back then, "education was not on the front page," he recalls. "It was more inflation and gas lines." But he and his colleagues "knew that trouble was ahead. We knew something had to be done."
In making the case that trouble was ahead, the authors used language that Bruno Manno calls "apocalyptic, almost militaristic" — and he's an admirer of the report. Manno worked in the Department of Education starting in 1986 and now serves as a senior adviser to the Walton Family Foundation's K-12 education initiative. (Walton supports NPR's coverage, including of education.)
"That was done on purpose to capture the attention of the American public," he says.
"A Nation at Risk" cited statistics such as: "The average achievement of high school students on most standardized tests is now lower than 26 years ago when Sputnik was launched," and "[The SAT demonstrates] a virtually unbroken decline from 1963 to 1980. Average verbal scores fell over 50 points and average mathematics scores dropped nearly 40 points."
Those numbers weren't made up. But they weren't the only ones out there.
The report de-emphasized the fact that more students than ever were graduating from high school and attending college, and that top U.S. students led the world in academic achievement.
The Department of Energy — yes, Energy — commissioned a follow-up analysis of test score trends in 1990. It was known as the Sandia Report, after the federally funded Sandia National Laboratories, which produced it.
Its authors were engineers trying to generate economic forecasts, not education authorities with an ax to grind. And they didn't diagnose the same disaster that "A Nation At Risk" did.
"To our surprise, on nearly every measure, we found steady or slightly improving trends," one of the authors, Robert Huelskamp, later wrote.
How could this be? Because of a statistical effect known as Simpson's Paradox.
In the early 1960s, college-going was still rare. It was mostly top students, largely well-off white males, who took standardized tests like the SAT and applied to college.
By the 1980s, college was more available to more people, and more important to getting a good job. Many more people were taking the SATs and applying to colleges. This included more people of color, more low-income students and other historically disadvantaged groups.
So, when you lumped everyone's scores together, as "A Nation At Risk" did, you saw declining average scores from the 1960s to the 1980s.
But, when you broke out test takers by subgroup, as the Sandia Report did, looking at men, women, whites, Hispanics, African-Americans and low-income students separately, you found that most of these groups of students were improving slightly on those tests over that time.
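The composition effect described above can be made concrete with a small sketch. The numbers below are purely hypothetical, chosen only to illustrate Simpson's Paradox: every subgroup's average score rises between the two eras, yet the pooled average falls, because the test-taking pool shifts toward groups that historically scored lower.

```python
# Hypothetical, illustrative numbers only -- not actual SAT data.
# Each tuple: (group label,
#              1960s avg score, 1960s share of test takers,
#              1980s avg score, 1980s share of test takers)
groups = [
    ("historically advantaged",    520, 0.80, 530, 0.50),
    ("historically disadvantaged", 420, 0.20, 440, 0.50),
]

def pooled_average(score_idx, share_idx):
    """Weighted average across groups: sum of (score * share)."""
    return sum(g[score_idx] * g[share_idx] for g in groups)

avg_1960s = pooled_average(1, 2)  # 520*0.80 + 420*0.20 = 500.0
avg_1980s = pooled_average(3, 4)  # 530*0.50 + 440*0.50 = 485.0

# Every subgroup's average improved between the two eras...
assert all(g[3] > g[1] for g in groups)
# ...yet the pooled average still declined, because the
# lower-scoring group grew from 20% to 50% of test takers.
assert avg_1980s < avg_1960s

print(avg_1960s, avg_1980s)  # -> 500.0 485.0
```

Lumping everyone together, as "A Nation At Risk" did, corresponds to comparing the two pooled averages; breaking out subgroups, as the Sandia Report did, corresponds to comparing the per-group scores.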
"The idea that American schools were worse just wasn't true," says James Guthrie, an education professor at Lynn University in Florida. Guthrie published a scholarly article in 2004 titled "A Nation At Risk Revisited: Did 'Wrong' Reasoning Result in 'Right' Results? At What Cost?"
"I looked at it every which way," he says now. The authors in 1983 "were hell-bent on proving that schools were bad. They cooked the books to get what they wanted."
Holton objects to this view. "We put our honor and our lives into this report, and we were not being taken for a ride by anybody," he says. "These were serious people ranging from a Nobel Prize winner [Glenn T. Seaborg] to the head of Bell Telephone Labs [William O. Baker]."
"A Nation At Risk" got the national spotlight.
Diane Ravitch, then a Department of Education official under President George H. W. Bush, wrote an op-ed critical of the Sandia Report headlined "U.S. Schools: The Bad News Is Right."
Ravitch later publicly renounced this position and others, and became a bestselling author and advocate focused on educational equity. When "A Nation At Risk" came out, "I thought, oh boy, this is going to shake everybody up. It's a good thing," she tells NPR.
"Now, I think it sounded an alarm that was misguided, because the schools were not sunk in mediocrity."
That alarm, and the message — "the bad news is right" — has been repeated countless times in the decades since, by philanthropists, business leaders, politicians and other reformers. Ravitch notes that schools may be blamed when times are tough but not necessarily credited when things are going better.
"A Nation at Risk," she says, was "written at a moment when we were in recession. When our economy was booming, nobody said, 'Gosh, we must have really great schools.' "
The habit of criticizing student test scores persists. For example, earlier this month, the Nation's Report Card came out, showing steady scores in most areas and improvement since 2015 in one area out of four tested, eighth grade reading. Headlines called the results "disappointing."
If you look at NAEP trends over the long term, 9- and 13-year-olds scored modestly higher in reading and mathematics in 2012 than they did in the mid-1980s.
When it comes to the SAT, meanwhile, both the overall number and the racial and socioeconomic diversity of people taking the test continue to rise. And the test has changed over time. But according to the College Board, which reports the results in a consistent format, scores have remained fairly steady.
Meanwhile, in more than half of states, public schools are receiving less total state funding than they were a decade ago.
In all but a few states, teachers earn less than what other professionals with a similar level of education are making. Just 11 states direct more money to districts full of impoverished students than to affluent districts in consideration of their greater needs, a figure that has declined by half since the Great Recession.
And, just over 50 percent of public school students are now eligible for free and reduced-price lunch because of their family income.
In the context of declining resources and rising child poverty, maintaining steady or slightly improving test scores over decades could be described with other words besides "flat" and "disappointing" — perhaps "surprising" or "heroic."
But the narrative established by "A Nation At Risk" still seems to be the one that dominates how we think of the data.
Guthrie, for one, thinks that's been, on balance, a good thing, because it brought education to the front and center of the U.S. agenda.
"My view of it in retrospect," he says, "is seldom, maybe never, has a public report been so wrong and done so much good."
Copyright 2021 NPR. To see more, visit https://www.npr.org.