STANDARDIZED TESTING is often a source of stress — for students, teachers, parents, and administrators — and so it has been with Massachusetts' potential transition to a new, computer-based assessment designed for use in states across the nation. The state is currently in the second year of a two-year trial period, after which the Board of Elementary and Secondary Education will decide whether to replace its longstanding MCAS tests with the Partnership for Assessment of Readiness for College and Careers test, also known as PARCC. While all 10th graders will take MCAS this year as a graduation requirement, most districts must choose, by Oct. 1, whether to offer MCAS or PARCC to younger students. So far, more than half have chosen PARCC — including the state's largest district, Boston Public Schools.
The Boston system's decision has raised the expected consternation — from some who oppose the very concept of testing, and others who fear that this particular test will dilute Massachusetts' educational standards. But this isn't a time to panic; it's a time to reserve judgment. There's no way to tell if PARCC will be a good fit for Massachusetts unless, and until, schools try it out.
The MCAS has served Massachusetts well, providing a common yardstick to identify gaps in students' academic preparation and flaws in schools' performance. But Boston school officials say they chose PARCC this year because it aligns with the new Common Core curriculum they've begun to implement in schools. (They attribute a rise in third-grade MCAS reading scores in part to those new standards.) They say the computer-based PARCC test gives them an incentive to increase the use of technology in classrooms. And they say this is a safe year to launch a broad tryout, since the state has guaranteed that all but the most troubled schools will be held harmless for their scores.
There's good reason to give schools a little slack as they implement a complex new testing system. A much smaller PARCC field test in Boston last spring revealed some telling glitches: Testing went smoothly on new Chromebooks, but students who used older desktop computers faced frustrating technical problems. A larger-scale PARCC trial should be used to learn more: Are the time limits appropriate? Is the content sufficiently challenging and connected to the classroom? Can schools afford the technology they need to deliver the test effectively?
The results, when they come in, should be transparent and easy to obtain; one common complaint about the Common Core and PARCC is the speed with which both have been implemented, and the lack of public input in the process. If real problems emerge, the state should be willing to apply the brakes, use its influence to alter the test, and, if need be, revert to the tried-and-true MCAS. But the public also needs to be open-minded — and wait until the results of the year's experiment are in.