Features

September/October 2012

A Note on Methodology: 4-Year Colleges and Universities

By the Editors

The predicted graduation rate measure has been substantially changed since last year’s rankings, based on research by Robert Kelchen, a doctoral student at the University of Wisconsin-Madison and methodologist for this year’s college guide, and Douglas N. Harris, associate professor at Tulane University. While last year’s formula predicted graduation rates using only a college’s percentage of Pell Grant recipients and its average SAT score, this year’s formula includes other characteristics associated with the academic preparation and resources of a college’s students. In addition to the percentage of Pell recipients and the average SAT score, the formula includes the percentage of students receiving student loans, the admit rate, the racial/ethnic and gender makeup of the student body, the number of students (overall and full-time), and institutional characteristics such as type of control (public, private nonprofit, or for-profit) and whether a college is a historically black college or university (HBCU) or primarily residential. We estimated this predicted graduation rate measure in a regression model fit separately for each classification, either using data from a prior year or imputing missing data when necessary. Schools with graduation rates higher than the “average” school with similar characteristics score better than schools that merely match or, worse, undershoot the mark. One school, the California Institute of Technology, had a predicted graduation rate of over 100 percent; we capped its prediction at 100 percent.
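To make the estimation concrete, here is a minimal sketch in Python. Everything in it is illustrative: the numbers are invented, the predictor list is truncated, and ordinary least squares is our reading of “a regression model”; the actual formula uses the full set of characteristics above and is estimated separately for each classification.

    import numpy as np

    # Hypothetical data: one row per college within a single classification.
    # All values are invented for illustration; the real model is estimated
    # on actual institutional data.
    pct_pell    = np.array([0.45, 0.20, 0.60, 0.35, 0.50, 0.15, 0.55, 0.30])
    avg_sat     = np.array([1050, 1350,  980, 1200, 1100, 1400, 1000, 1250])
    pct_loans   = np.array([0.55, 0.30, 0.70, 0.50, 0.60, 0.25, 0.65, 0.45])
    admit_rate  = np.array([0.80, 0.25, 0.90, 0.60, 0.75, 0.20, 0.85, 0.50])
    actual_grad = np.array([52.0, 91.0, 38.0, 68.0, 57.0, 95.0, 44.0, 76.0])

    # Design matrix with an intercept. The full model also includes the
    # racial/ethnic and gender makeup of the student body, enrollment,
    # type of control, HBCU status, and residential status.
    X = np.column_stack([np.ones_like(pct_pell), pct_pell, avg_sat,
                         pct_loans, admit_rate])

    # Fit by ordinary least squares, then predict in-sample.
    beta, *_ = np.linalg.lstsq(X, actual_grad, rcond=None)
    predicted_grad = np.minimum(X @ beta, 100.0)  # cap at 100 percent, as with Caltech

    # A college scores well when its actual rate exceeds its predicted rate.
    performance = actual_grad - predicted_grad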

We then divided the difference between the actual and predicted graduation rates by the net price of attendance, defined as the average price that first-time, full-time students who receive financial aid pay for college after subtracting need-based financial aid. This cost-adjusted graduation rate measure rewards colleges that do a good job of both graduating students and keeping costs low. Two colleges (Berea College and Macon State College) reported negative net prices; we set their net prices to the smallest positive net price reported by any college ($1,255). The two social mobility measures (actual versus predicted percentage of Pell recipients, and cost-adjusted graduation rate performance) were weighted equally.
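The cost adjustment can be sketched the same way, continuing in Python with invented numbers. The final combination step appears only as a comment, since the article specifies the equal weighting but not how the two components are scaled before they are combined:

    import numpy as np

    # Invented values: graduation rate performance (actual minus predicted,
    # in percentage points) and net price (in dollars) for four colleges.
    performance = np.array([4.0, -2.5, 6.0, 1.0])
    net_price   = np.array([9500.0, -300.0, 1255.0, 14000.0])

    # Colleges reporting a non-positive net price (Berea, Macon State) are
    # reset to the smallest positive net price reported by any college.
    net_price = np.where(net_price <= 0, 1255.0, net_price)

    # Cost-adjusted graduation rate performance: percentage points of over-
    # or under-performance per dollar of net price.
    cost_adjusted = performance / net_price

    # The two social mobility components are weighted equally. The article
    # does not say how they are scaled before averaging; standardizing each
    # and taking the mean is one plausible reading:
    # social_mobility = 0.5 * zscore(pell_performance) + 0.5 * zscore(cost_adjusted)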

The Editors can be found on Twitter: @washmonthly.

Comments

  • matt w on August 27, 2012 10:54 AM:

    I see that you're still pro-rating some scores for size and others not, and still counting humanities PhDs differently depending on whether they're awarded by the university or to alumni. And you still haven't uttered a word to justify these arbitrary choices.

    To paraphrase Apocalypse Now:
    "Is my methodology unsound?"
    "I don't see any methodology at all, sir."

  • Tom M on August 29, 2012 3:49 AM:

    Your scores for research expenditures (and science PhDs) aren't scaled for the size of the college. This produces strange effects. If we look at the Claremont group of colleges, we see Harvey Mudd College in 25th place spending $3.21 million, Pomona College in 31st place spending $2.92 million, and Claremont McKenna College in 9th place spending $5.73 million. But these colleges are all part of the same group, sharing facilities. If they merged, the combined institution would spend $11.86 million, putting it in 2nd place. This would not be an actual change in what they are doing, just a change in paperwork. The standings should not be affected by arbitrary changes in organization.

    A more general objection is that you are using these measures to estimate how valuable an institution is to America. By not adjusting for size, you reward mergers. This is the theory of too big to fail.

    An additional problem is the use of Pell grants to judge social mobility. If a school has a student body that has won a lot of scholarships, that would throw off the statistics. It would also be a good idea to take into account legacy admissions and athletic admissions.

  • Christine C. on August 29, 2012 10:03 AM:

    Tom, the larger point you were making is well taken, but as a Claremont 5C alumna, I must correct the example you gave. Harvey Mudd, Pomona and CMC do share some facilities (such as their main library, Honnold/Mudd) but not all (or even most). Pomona and Harvey Mudd each have their own science facilities, for instance. CMC's science facilities (Keck Science Department) are shared with Pitzer and Scripps.

  • Tom M on August 31, 2012 3:03 AM:

    Christine: Thank you for the additional information.

  • Robert Kelchen on August 31, 2012 11:31 AM:

    Hi folks,

    My name is Robert Kelchen and I'm the consulting methodologist for this year's rankings. A few responses to the above comments:

    (1) We do not adjust for institutional size when examining PhDs awarded by national universities. The denominator for this measure is unclear: we may want to divide by the number of PhD students, but that number isn't available. Dividing by overall graduate populations or institutional size produces fuzzy results, at best.

    (2) We do adjust for institutional size when examining the baccalaureate origins of PhD recipients. This gives us an idea of a college's relative commitment to research. Additionally, there is much more variation in size for non-research universities and we have a good denominator to use (number of undergraduates).

    (3) We would love to be able to look at more than the Pell Grant to judge social mobility, but there are severe data limitations. There isn't data on legacy or athletic admissions; that gap could potentially be fixed with a national unit record dataset. It is just now becoming possible to get Pell recipient graduation rates.

    (4) Cases like Claremont are tricky, to be sure. But shared facilities are not at all uncommon. For example, state university systems often share resources and it can be difficult to parse out an institution's effect.

  • WHP on September 05, 2012 9:23 PM:

    The biggest problem with these rankings is the absence of any measure of quality, with the possible exception of the ratio of PhDs to BAs. This is truly unfortunate. Even data about income, which speaks to the value of the degree, would be a step in the right direction.

    I noticed that the data on research expenditures cannot possibly be correct. The institutions ranked "105" with respect to research expenditures include colleges whose faculty are required to do research to attain tenure, alongside colleges staffed by part-time adjuncts with second jobs who do no research and were hired only to teach.

  • Robert Kelchen on September 06, 2012 9:10 AM:

    WHP: we would love to have data on income for a representative sample of students. Sadly, it doesn't exist across a broad swath of universities and is fraught with selection bias when available.

    The research expenditures data come from the Center for Measuring University Performance and the National Science Foundation, which are the most reliable sources available.

  • Suzanne Klonis on September 12, 2012 9:00 AM:

    I found these rankings only by accident, which is unusual, because as the institutional research director at my college, I report most of the numbers that are published about it. I find it a little weird that IR directors were not contacted to verify any of the data used in the rankings. For example, how do you know the percentage of graduates who go on to get PhDs? How would you know this information without contacting the school itself?

  • Robert Kelchen on September 12, 2012 1:19 PM:

    The number of graduates who go on to get PhDs came from the Survey of Earned Doctorates via the National Opinion Research Center. Most colleges don't track all of their alumni who go on to get PhDs.

  • Liz Sanders on September 21, 2012 3:25 PM:

    After reading the comments, and trying to locate details of the methodology, am I correct in concluding that you gather none of the information for the rankings from the schools themselves? Just to clarify.

  • Robert Kelchen on September 24, 2012 11:33 AM:

    Liz, correct. All measures are provided by outside sources. I think we have all of the sources listed, with the exception of ROTC and Peace Corps enrollment, which come directly from those organizations rather than from the colleges.

  • K Gilman on October 04, 2012 11:04 AM:

    Why is every public university in Maryland except Morgan State mentioned?