
What exam schools can’t do


For certain sets of Bostonians — the immigrant who hopes his children have it better; the single mother who wishes the same; and the high-earning parents who got married and had kids and decided, despite the suburbs’ pull, to enroll their children in the Boston Public Schools system — these are anxious days. The admittance tests for Boston’s exam schools are next month, Nov. 5, to be exact. Blinking Web pages are dedicated to the date, replete with hyperlinks where a New York preparatory outfit called ERB offers study guides for a mere $15 a pop, guides that carry the matter-of-fact yet calming title of “What to Expect.”

In Boston, as in many American cities, the top public school is an exam school. Acceptance into one of the three elite outfits here — Boston Latin School, Boston Latin Academy, and the John D. O’Bryant School of Mathematics and Science — promises a different life trajectory for the talented middle schooler. An exam school education, the thinking goes, offers a more meaningful, rigorous education, opening the doors of elite universities and rewarding careers thereafter. Miss the cutoff for an exam school, and the fear is that you miss out on the life it promises.


Such fear, however, appears to be misplaced. Two provocative new studies — the first to attempt to isolate the real effects of exam schools on their students — conclude that an exam school education in Boston or New York isn’t any better than the education that other public schools offer, especially for the kids most worried about getting in, whose scores hover near the admittance threshold. Those kids, one study found, end up performing about the same on their SATs whether they get into an exam school or not. And after high school, according to a study of exam schools in New York, exam-school students enroll and graduate from more or less the same universities as comparable students graduating from other public high schools.

“Getting an exam-school education is no better than an alternative,” said Josh Angrist, a professor of economics at MIT and coauthor of one of the studies, “The Elite Illusion: Achievement Effects at Boston and New York Exam Schools,” published this summer by the National Bureau of Economic Research. “Neither Boston nor New York exam schools seem to boost achievement,” Angrist said.


Supporters of exam schools point out that by focusing only on performance, the authors are disregarding many of the auxiliary benefits such schools offer. And to be clear, these studies don’t imply that exam schools are awful, or that the top-tier students don’t benefit from them: The brilliant and deeply curious will almost always be better served by the more rigorous course load and studious environment of an exam school.

But the research does take a bit of the shine off of the exam school mystique. And it suggests that we still don’t know some fundamental things about school performance and what matters the most in education: the teachers, the schools, or just the kids themselves.

Taking a test to gain admittance to a public school dates back at least as far as 1734, when John Lovell, the headmaster of the Boston Latin School, which was by then already 100 years old, informed Bostonians that new students would gain admission after reading a few verses from the Bible. Since then, the tests have grown more demanding.


By sifting out the better students from the rest, the exam school promises a lot, both to students and a city: a competitive free education for the kids who get in, and a way to keep talented students and their families from fleeing the city and its public schools for the suburbs.

That’s certainly true of Boston Latin School. It was established in 1635 and catered then, just as it does today, to the most ambitious students in Boston, with a heavy reliance on the classics and three years of Latin for all entering freshmen. Five students of Boston Latin went on to sign the Declaration of Independence: Benjamin Franklin, Samuel Adams, John Hancock, Robert Treat Paine, and William Hooper. Ralph Waldo Emerson was an alum. So was Boston mayor and US Congressman John “Honey Fitz” Fitzgerald, as was his eventual son-in-law, Joe Kennedy. (To keep this list from growing tedious, simply know that a lot of incandescent thinkers over the last 375 years — actors, novelists, clergymen, scientists, businessmen, doctors, and even an accused Soviet spy — are products of Boston Latin School.)

As such, it is the Ur-institution on which many other schools are based. Boston’s two other exam schools opened in the late 1800s, and Boston Latin is also a model for some of New York’s, like the Brooklyn Latin School, which opened in 2006. All of these exam schools — the three in Boston and six in New York — are prestigious. Boston Latin School, and New York’s Stuyvesant High School, the Bronx High School of Science, and Brooklyn Technical High School have appeared in U.S. News and World Report’s annual ranking of the top high schools in the nation.


As Josh Angrist sees it, however, that doesn’t necessarily mean they’re much better than ho-hum public schools. Angrist wanted to see if the exalted status of exam schools was reflected in their students’ test scores. He and his coauthors — Parag Pathak, also of MIT, and Atila Abdulkadiroglu, of Duke University — “have no personal connection to exam schools,” Angrist said, no children of their own who’ve taken the admittance test. For them, exam schools are simply the perfect lab to study what the academic literature calls the “peer effect,” the idea that smarter classmates can boost kids’ performance.

Angrist and his coauthors gathered the registration and demographic information for Boston and New York public school students; their scores from annual standardized tests; their PSAT and SAT scores; their Advanced Placement scores; and, for the students who applied, their exam school application files. All told, Angrist had 12 years’ worth of data, from 1997 to 2008.

The researchers wanted to study what happens to individual kids — whether the exam school measurably improved their test performance. So they picked a group of kids who shared a comparable starting point: the students who fell on either side of the admission line, those who, by a couple points here or there, either made it into an exam school or did not.

If the exam school “worked,” the kids who made the cut would fare noticeably better than their peers who didn’t. But that’s not what Angrist found. In fact, the exam school kids who just cleared the admittance cutoff fared little better on tests than the ones who just missed it and attended nonexam public schools in Boston or New York. It didn’t matter if it was an MCAS, SAT, or AP test. The two groups’ results resembled each other, and on many tests the differences were small enough to be statistically insignificant, Angrist said. That meant that the so-called peer effect was a “myth,” he said.


Steven Levitt, the University of Chicago economist of “Freakonomics” fame, found similar results to Angrist’s when he studied Chicago’s public schools in 2006. He and two coauthors looked at the students who, through a lottery system, were admitted into the city’s better public schools, versus the ones who were unlucky and attended other public schools. The lottery winners’ academic achievement didn’t surpass that of the lottery losers. “We find little evidence,” Levitt wrote in the study, published in the journal Econometrica, “that winning a lottery provides any systematic benefit across a wide variety of academic measures.” In other words, a student’s own curiosity and ability seemed to matter far more than a school’s curriculum.

Students at Boston Latin left school at the end of the day. (Bill Greene/Globe Staff)

The same seems to hold through college. Harvard economist Roland Fryer, a 2011 winner of a MacArthur Foundation “genius grant,” published a paper in July that looked at exam school results from 2002 through 2009, comparing the kids just above and below the admissions cutoff in New York City. Fryer found that not only did the exam school education have little impact on a marginal student’s SAT score, but there was little evidence that it improved the chances of college enrollment or college graduation. “If anything,” Fryer wrote, “students eligible for exam schools are less likely to have attended or graduated from college by 2009,” when compared to their peers at less prestigious public schools. Students who’d just made it into Brooklyn Tech, in fact, were 2.3 percent less likely to graduate from college than those who hadn’t made the cut, Fryer found.

This is not to say that school doesn’t matter. Classroom achievement, as measured in grades, test performance, and even college admissions, is only one of the whole range of ways that schools shape a student’s life.

Asked about the Angrist study, a Boston Public Schools spokesman, Matthew Wilder, pointed out that it examined only test scores and did not account for these less measurable benefits — the clubs, the theater, the network of supportive alumni that an exam school like Boston Latin offers.

Many parents also see exam schools as a safer and more positive environment — a view borne out by Levitt’s study of the Chicago school lottery, which found that the kids in better schools were arrested less often and faced fewer disciplinary reprimands.

So far, the studies done on exam schools are limited because of the range of students they used for comparison. Fryer, in his study, acknowledged that he was looking only at the “marginal” exam school students; the exam school’s effect may be different for more talented ones. And although the studies don’t address it, it’s also possible that the kids just below the cutoff would end up highly ranked at a lesser school, which could help their college applications.

Kamal Chavda, the Boston Public Schools’ assistant superintendent for research and evaluation, said that by narrowly focusing on students who just made the cut or just didn’t, a researcher misses the point: By and large, exam schools are built to serve the top students, those who score phenomenally well. The marginal students, in fact, can often feel challenged to keep up. “I’m encouraged to see that more of them aren’t getting frustrated and falling dramatically behind,” Chavda said of the results. (He also argued that the success of the kids who missed the cut says good things about the increasingly rigorous course load that all Boston Public Schools high schools offer: “We’ve made it a mandate to expose every student to at least one AP course,” he said.)

Angrist’s study also showed one intriguing exception, a spot in which exam schools clearly did benefit some students more than other public schools. Minority students taking 10th grade English tests performed markedly better in exam schools than their cohorts in other public schools. Though Angrist hesitates to draw a direct connection, this specific exam school finding is in general accord with other studies of his, where he looked at the performance of charter schools like Roxbury Prep that serve the neediest of students, and found that they did indeed help such students’ test scores.

Fryer and Angrist argue that one lesson of the new research may simply be that all children should be given more credit for their academic achievement, wherever it occurs. “These are good kids, with good test scores,” Angrist said.

The only thing we may know about education is that inquisitive kids respond to it. Parents should consider that, Angrist said, when stressing over a promising child’s exam-school test. If she’s smart enough to be in the mix, Angrist said, she’s smart enough, period. It doesn’t matter which school she attends.

Paul Kix is a general editor at ESPN: The Magazine.