I hate the rankings! No, I’m just joking. I don’t hate them, and they are definitely a source of information for a lot of people, I would venture to say MOST people at first. If you’re reading this book, then you’ve definitely come across the rankings from the following publications: Business Week, US News, the Financial Times, and the Economist, just to name a few. I should actually throw the Poets & Quants aggregated ranking into the mix too.

Here’s the thing about the different publications... SOME (I definitely won’t say all), but definitely SOME people put so much weight on the rankings. That is why I am happy to be writing this section of the book, because at a certain point RANKINGS DON’T MATTER as much as people think they do. People mull over the rankings and the different methodologies so much, and past a certain cut-off point it just does not matter. I won’t go into what that cut-off point should be, because different people, alums, current students, and admissions committees, will all say something different.

The key with the rankings is that you need to figure out which methodologies mean the most to you. I refuse to dive deep into the differences between the methodologies because I did not read that much into them when I was applying.

Sure, it was “fun” to look at the rankings the day they were published to see which schools moved up or down one or two spots, or even which schools made much larger moves. But in the end (and I cannot stress this enough): DO NOT BASE YOUR TARGET SCHOOLS ON THESE RANKINGS. I believe this is the reason that many seemingly great students do not get into schools that would have been a better choice for them.

There are much better criteria out there that a serious candidate can use to evaluate each school. I’m not saying don’t look at the rankings at all. I know people will do it anyway, and I’m not going to start a grassroots effort to change the industry; I’ve got better things to do with my time, like helping people navigate the process! That being said, I suggest that people not stop at the topline numbers. IF YOU’RE GOING TO LOOK AT THE RANKINGS, actually decipher each component AND the weight of each component.

For example: Did you know that publication X uses recruiter data from a small sample size to determine the number for ___________? Sure, these are big companies like McKinsey, Bain, Citigroup, etc. Now ask yourself, “Do I even want to work for one of those companies?” If the answer is no, then why would you care how they ranked a school you may be interested in, based on a small survey sent to the recruiters (usually alums) at each school?

Now, that was from the prospective student’s perspective. From the school’s perspective, the rankings DO mean a lot. If a school were to move up in Business Week and then in US News within the same school year, it could expect an increase in applications the following year. The reason for that increase is that so many applicants put so much weight on the rankings when deciding where to apply, as stated before. If the school then experiences an increase in applications the following year, not only does its total applicant pool grow, but it can also be more selective. Selectivity (acceptance rate) is one of the criteria in publication ___________. If the school is more selective, then hopefully it is choosing people who will actually enroll, and the hope is that the applicant yield (# of enrolled students / # of students accepted) will also go up, which is also a criterion in publication ____________. So it’s a never-ending cycle. This is why you hear of schools inflating certain numbers for the rankings: it plays in their favor. Obviously, the aftermath of being caught doesn’t bode well for the school, current students, or alums, but now you may be able to understand why it does occur.
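To make the selectivity and yield arithmetic concrete, here is a quick sketch with completely made-up numbers (none of these figures come from any real school):

```python
# Purely hypothetical figures for illustration -- not any real school's data.
applications = 5000   # total applicants in the pool
accepted = 900        # offers of admission extended
enrolled = 540        # admits who actually matriculate

# Selectivity (acceptance rate): lower looks better in the rankings.
acceptance_rate = accepted / applications

# Applicant yield: higher looks better in the rankings.
yield_rate = enrolled / accepted

print(f"Acceptance rate: {acceptance_rate:.0%}")  # 18%
print(f"Yield: {yield_rate:.0%}")                 # 60%
```

Notice that if a rankings bump pushes applications up while the class size stays fixed, the acceptance rate falls automatically, which is exactly the feedback loop at work here.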