[Are you new to this blog? Great. This is an old post from my WashU blog.]
The Stability of US News Graduate Rankings
As the Director of Graduate Studies at WashU in Fall 2012, I completed the US News survey of political science grad programs. For those of you unfamiliar with the methodology, DGSes and department chairs fill out a survey rating every school on a 1-5 scale and writing in the top ten departments for each subfield.
This is really tough to do. You try it. Take all of the Ph.D. programs in your state and rank them on a 1-5 scale.
The rankings for 2013 literally didn't change from 2009. OK, there are a few more ties in 2013, but the order of the schools is the same.
School       2013  2009
Harvard         1     1
Princeton       2     2
Stanford        2     3
Michigan        4     4
Yale            4     5
Cal             6     6
Columbia        7     7
UCSD            7     8
MIT             8     9
Duke           10     9
UCLA           10    10
Chicago        12    11
UNC            13    13
WashU          13    13
Rochester      15    13
Wisconsin      15    15
NYU            15    17
Ohio State     15    17
What gives? Is it that the reputations of top programs really don't change? Or is the survey so poorly designed that most of us simply recall the past rankings and use them as a heuristic to rank programs?
Actually, there is a methodological reason for this. It turns out that, for the first time ever (and with no explanation), US News averaged the 2008 survey (used for the 2009 rankings) with the 2012 survey to produce the 2013 rankings.
Why? It could be that with only 50 responses they got some odd rankings that didn't make sense, or that a few top programs got panned and, for whatever reason, they decided to average the responses.
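To see why averaging two survey waves freezes the order in place, here's a minimal sketch with made-up scores (the raw survey data aren't public, so the schools and numbers below are purely hypothetical): each program's 2013 score is the mean of its 2008 and 2012 scores, so any single-wave swing gets cut in half.

```python
# Hypothetical peer-assessment scores on the 1-5 scale; NOT the actual survey data.
scores_2008 = {"Alpha U": 4.8, "Beta U": 4.1, "Gamma U": 3.5}
scores_2012 = {"Alpha U": 4.6, "Beta U": 3.8, "Gamma U": 4.0}

# Average the two waves: each school's combined score is the mean of its
# 2008 and 2012 scores.
averaged = {s: (scores_2008[s] + scores_2012[s]) / 2 for s in scores_2008}

# Rank by averaged score, highest first.
ranking = sorted(averaged, key=averaged.get, reverse=True)
print(ranking)
```

In this toy example, Gamma U overtakes Beta U in the 2012 wave (4.0 vs. 3.8), but the averaged scores (3.75 vs. 3.95) keep Beta U ahead, reproducing the 2008 order.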
UPDATE: Someone pointed out that there is already a discussion of this issue at Political Science Rumors. This is what I get for not following the rumor blogs.