How a teacher bombed the SATs
I hold a PhD in English from the University of California, Berkeley. I’ve taught students to write at Emory, Berkeley, and Harvard and picked up three teaching awards along the way. I have published more than two dozen pieces in national publications, including The Atlantic and Vanity Fair.
In May, I bombed the essay portion of the SAT.
Did I mention that I've also been prepping students for the SAT as a teacher and tutor for the Princeton Review for almost 20 years?
This spring, the College Board, which administers the SAT, introduced a new, optional essay section of the test. Almost two-thirds of spring test-takers opted in, even though less than 10 percent of colleges require the essay. The College Board believes the new essay looks "a lot like a typical college writing assignment in which you're asked to analyze a text." It is scored from 1 to 4 by two graders in three areas: reading, analysis, and writing. The scores are added together to generate totals between 2 and 8 in each category.
In May, I took the SAT because it is part of my job to be up on the test. We had to analyze an op-ed by Eric Klinenberg decrying the use of air conditioning. The directions explained, "Your essay should not explain whether you agree with Klinenberg's claims, but rather explain how [he] builds an argument to persuade his audience."
I followed the directions and wrote what I thought was a decent piece, particularly after having completed a three-hour exam. And I didn't bomb everything. I received a 7 in reading, which, according to the College Board, measures how well I understood the passage and used textual evidence. I also got a 7 for my writing score, which measures how "focused, organized, and precise" my essay was, as well as its use of "an appropriate style and tone that varies sentence structure and follows the conventions of standard written English." But when it came to analysis, which demonstrates an "understanding of how [an] author builds an argument," I landed a 4.
A 4. Despite my half-dozen peer-reviewed articles published in academic journals, I scored in the bottom half of the range. According to the score, I will need to do some serious work before I go to college, or maybe I should just major in math (I hit the 99th percentile on the math section, as well as on the evidence-based reading and writing section).
After absorbing the blow to my ego, I was left wondering how I could have done so badly on the essay, particularly after publishing an op-ed in The Wall Street Journal that argued that students who prepped for the exam would simply use a new formula for writing their essays.
It's my own fault that I did not employ the template we teach students to use at Princeton Review, a fact painfully brought home when one of our students let me know he scored an 8 in reading, 7 in analysis, and 7 in writing.
Was my essay really as bad as my graders thought it was? I needed to know, so I contacted two experienced teachers of college writing to get second and third opinions.
My first grader, Kevin Birmingham, not only taught for several years in the Harvard College Writing Program but also won this year's Truman Capote Award for Literary Criticism for "The Most Dangerous Book," a gripping examination of the publication of James Joyce's Ulysses. The second grader, Les Perelman, spent 25 years at MIT directing undergraduate writing programs; he was a strident critic of the old SAT essay but thinks the new assignment represents an improvement.
I gave Perelman and Birmingham three essays, marked simply A, B, and C. Essay A was mine. Essay B was written by a colleague in the test prep industry and received a 7 in reading, an 8 in analysis, and an 8 in writing. Essay C was the aforementioned student's essay (8-7-7). Neither of the graders knew I had written one of the examples. They were provided the prompt and the official scoring rubric and asked not just to score the essays using the rubric but to rank them for their overall quality.
I did not sleep well as I waited to see the results.
Both Birmingham and Perelman ranked mine first out of the group, and my re-score came out to 7-7-7. Birmingham scored and ranked the student essay (C) the lowest of the bunch. Perelman, who graded the essay according to how he expected it to be graded by official scorers, gave it the highest score in the group and ranked it second. Without prompting, he explained in an e-mail, "I scored C, a classic, mechanically produced five-paragraph essay higher than I normally would because standardized testing loves this form because it is easy to get consistent scoring."
In other words, the Princeton Review formula worked. It was my mistake not to use it.
My personal failure, however, matters very little in comparison with the failure of the new SAT essay to distinguish actual writing skills from the ability to employ a template that lends itself to quick grading by ETS employees making $15 an hour to start.
The SAT essay assignment and the five-paragraph format it encourages will likely do students little good after graduation. In a recent survey of K-12 and college teachers, ACT found that college teachers considered the ability to generate ideas the most important skill for their students to possess, twice as important as the ability to analyze texts. The problem with the College Board's new SAT essay, as Perelman said to me, is that it "rewards nonthought and mechanistic writing." Bombing it might not be so bad after all.
It is perhaps too apt that the top of each page provided for the SAT essay reads, "DO NOT WRITE OUTSIDE OF THE BOX." I only hope students do not take this warning too much to heart.