Susan Headden had an important article in the September/October issue of the Monthly running down the problem with community colleges’ high-stakes placement tests, which often wrongly assign students to remedial courses, costing them some very valuable time and money.
A good example is what happened to Monica Dekany, who enrolled at Golden West College in Huntington Beach, California, in 2009:
All she had to do, the registrars told her after she paid her fee, was go down a hallway, pick a cubicle, sidle up to a computer terminal, and take a short test. The “Accuplacer,” as the test is called, was no big deal, they said—nothing she could have studied for. It was just so they could see where she was. Dekany took one test in math and another in English, and was “floored,” as she put it, to learn that she had scored at a level that would consign her to remedial classes, reviews of fundamental material for which she would receive no college credit. “It caught me totally off guard,” Dekany says. The other colleges had let her enroll directly in college-level English and literature classes, and as her transcripts clearly showed, she had passed them. But Golden West told her the test results were all that mattered.
Dekany dutifully enrolled in, and paid for, the remedial—or what colleges euphemistically call “developmental”—courses. She knew everything in the English course already; her daughter’s seventh-grade English class was more advanced. Her math course was similarly low level, but it was taught by a sympathetic professor who helped save her from further remedial work. The college had mandated that Dekany take a second remedial math class before being allowed to take Math 100 for college credit, but her professor thought the requirement made no sense—she was clearly ready for college work. So he arranged for her to take Math 100 at Cal State, Long Beach, where he happened to also teach, and there she got an A.
Dekany ended up thriving despite the obstacles thrown up by the Accuplacer, but many students, whose lives are already complicated enough by fitting school in around everything else, do not. And now two new studies out of Columbia’s Community College Research Center buttress the notion that these tests are flawed. One found that “placement exams are more predictive of success in math than in English, and more predictive of who is likely to do well in college-level coursework than of who is likely to fail,” and suggested “[u]tilizing multiple measures to make placement decisions” rather than relying solely on placement tests. The other found that “placement tests do not yield strong predictions of how students will perform in college,” and also that these tests lead to rather frequent placement blunders:
The authors also calculate accuracy rates and four validity metrics for placement tests. They find high “severe” error rates using the placement test cutoffs. The severe error rate for English is 27 to 33 percent; i.e., three out of every ten students are severely misassigned. For math, the severe error rates are lower but still nontrivial. Using high school GPA instead of placement tests reduces the severe error rates by half across both English and math.
In this context, three out of ten is a lot.