Here is U.S. News & World Report's response, in a letter received by The Monthly on August 23rd.

To the Editor:

Ever since 1983, when U.S. News & World Report first published its college rankings, the magazine has striven to improve its methodology. That should be evident from the fact that our first college ranking listed 76 colleges based on a reputation survey alone, while 17 years later we have arrived at a model that ranks more than 1,300 colleges using not only a reputation survey but 15 other indicators of educational quality. We continually seek guidance from educators and education experts on how to improve our rankings, and most of the additions and changes we've made over the years have come at the suggestion of outsiders.

The report you are posting from the National Opinion Research Center (NORC), which we commissioned in 1997, was one more attempt by us to assess our system and see whether it could be improved. As with all private companies that contract for such consultant studies, we are free to accept or reject the conclusions of the consultants. In the end, we implemented the bulk of the final recommendations made by NORC. Below is what we did in response to the five final recommendations from NORC. (Your readers may refer to the bulleted recommendations on pages 8 and 9 of the report you are putting up on your website.)

* To obtain empirical ratings of the value of the measures we use, in 1998 we added a section to our reputational survey that asked college officials to rate our 16 indicators. We are sending you separately the results of that survey, but, to put it simply, most of the officials polled rated most of our rankings indicators as good to excellent measures of academic quality.

* We have continued to review our data gathering and programming system, as suggested, to look for anomalies and limitations in the weights. In 1999, we took the advice of a RAND study of our law school rankings and applied a statistical technique known as standardization to the scores in our calculations. The result was a shift in the rankings of a small number of schools.

* As recommended, we introduced more averaging for individual data to smooth out year-to-year volatility in a school's data.

* We don't entirely agree with NORC's recommendation that our methodology remain constant for five to seven years. In keeping with the spirit of their suggestion, we have kept the weights of our indicators constant since 1996. However, we prefer to retain the option to make small changes in the rankings model whenever we feel it will improve the quality of the results.

* Per the NORC recommendation, we maintain an advisory council of admissions deans who meet with us for two days every year. We have added an advisory group of high school guidance counselors, with whom we meet annually, and we hold regular meetings with institutional researchers throughout the year. In addition, we meet with representatives from the 50 to 100 colleges who visit us each year to hear their suggestions.

It should be evident from the actions listed above that we take the suggestions of educators and other experts seriously -- especially consultants whom we hire to assess our system.

Sincerely,

Peter Cary
Special Projects Editor
U.S. News & World Report

