October 30
What Really Matters?

To continue our discussion, let's turn to a recent query asking "what rankings really tell us as opposed to what really matters for a successful well-being." From this writer's perspective, there are certain governing realities with regard to rankings, and college rankings in particular, that should be kept in mind.

Rankings appear to strongly appeal to our desire for “order” and, as such, seem to fulfill two demands of modern society.

  • They are seen as a method for determining value (the "J.D. Power Syndrome," if you will) by a source that sorts through and interprets voluminous amounts of information on something of popular interest; and
  • The general public has a strong need to have an authoritative source tell them what is "best," i.e., "who" or "what" is "Number One". 

In my view, this might be perfectly fine for ranking toaster ovens and Buicks, but the reality remains that colleges and universities are complex institutions with multiple purposes, ones that rely to a great extent on human interactions and individual determination to function at their best, i.e., to fulfill their missions. 

This latter point compels us to acknowledge the content of two recent university job postings and the values interested candidates were encouraged to consider. The first stated,

“Humanity. Justice. Integrity. You know, wild-eyed San Francisco values. Change the world from here” (University of San Francisco, 2014).

The second read,

“Expectations for all Dominican Employees: To support the University's mission of preparing students to pursue truth, to give compassionate service, and to participate in the creation of a more just and humane world” (Dominican University, 2014).

Unfortunately, none of these ideals, all of which arguably "really matter for students and their well-being," translate neatly, if at all, into the current lexicon of college rankings. Rather, students and their families must face the equally unfortunate reality that identifying a list of "Best Colleges" uniquely suited to them cannot be readily done through either ordinal ranking or the sum total of a set of data points.


Joe Prieto, a former college admissions officer, guidance counselor and past member of the Illinois and NACAC Admissions Practices Committees, has been a contributing writer to the NACAC Counselor's Corner since its inception in 2012.

October 24
A New School Year, But the Same Old Issues with Rankings

Hello everyone, and welcome to NACAC's Counselor's Corner. Whether you are a returning or a new reader, it is hoped you will find the content helpful in understanding and, most importantly, properly using the multitude of college rankings publications and services. 

However, before continuing, it should be noted that the term "service," as it relates to college rankings, is used with a very necessary reservation. Why? Looking up "service" at Merriam-Webster Online yields the following among the primary definitions:

(a)   “Contribution to the welfare of others.”

A point often made on this blog is that the connection between “rankings” and “service” is, in the eyes of many admissions and guidance professionals, tenuous at best. U.S. News and World Report (USNWR) initiated the drive to be recognized as the authority on higher educational quality with its first “Best Colleges” issue in 1983. But while USNWR may be given the benefit of the doubt that “service” was a large part of its original intent, few would agree that it is today.

A general explanation for this opinion can be found among the answers to several questions that have been submitted to the Counselor’s Corner. One in particular seeks reasons for a perceived “’love-hate’ relationship that colleges have with the rankings”. 

A past conversation with two long-time friends, both of whom have had lengthy careers as college admissions representatives, provides a good point of departure on this question. Reflecting on the impact of college rankings on their work, one observed (while the other agreed) that, 

“If our [university’s] rank is ‘good’; i.e., higher/better than our main competitors, or if it has improved since the previous year, and/or if the administration is satisfied with our most recent rank, then the pressure is off, at least for the time being. That ranking can then be fully integrated into the institutional marketing plan. Under those circumstances, you could say that we ‘love’ the rankings.”  

Conversely, 

“Even when things remain relatively stable or God forbid, if our ranking slips, the downturn is quickly seen as the first reason why our school is not attracting more applicants and especially those who are among the brightest and wealthiest. Presidents and trustees then issue demands that the admissions office do whatever is necessary to right the ship and get our rank back to where they think it should be. There’s the ‘hate’ with regard to the rankings.”

On a slightly more benign note, my other colleague remarked that an uncomfortable scenario can result when a student, solely on the basis of what is in a college rankings publication, approaches the representative at a college fair and inquires about a program that either is not necessarily the strongest program or, worse, is not offered by the college. "If you tell the student the major they want is not offered, you risk them saying, 'Well, how can you have such a high rank if you don't offer this major?' And there is simply no effective way to explain that (for example) USNWR ranks institutions as a whole and not by the quality of individual programs."

Further evidence of a less than favorable regard for rankings is contained within the findings of a survey conducted by NACAC's U.S. News and World Report Advisory Committee (2012) to assess viewpoints on college rankings. One survey question, responded to by nearly equal numbers of admissions and guidance professionals, asked, 

“Are rankings a helpful resource for students and families interested in college information?” 

Only slightly more than ten percent of admissions officers and less than ten percent of the guidance counselors surveyed were in clear agreement that they are. 

A corresponding question on the survey was, “Do rankings create confusion for students and their families?” 

Nearly 50 percent of all respondents (60 percent of whom were guidance counselors) agreed that "Yes, they do create confusion." In light of this, it is difficult to avoid concluding that the tabulated responses to both queries further underscore the low regard in which college rankings as a whole are held.

In spite of this, Princeton Review, The New York Times, Forbes, Business Insider, Money Magazine, USA Today and still others publish their own rankings. With wide variations in the methodology used to compile them, it is hardly surprising that students and their families are overwhelmed by the mountain of divergent characteristics, data points and quantitative formulae. Unless something has been overlooked here, this hardly connotes the earlier-defined "service." 


Joe Prieto, a former college admissions officer, guidance counselor and past member of the Illinois and NACAC Admissions Practices Committees, has been a contributing writer to the NACAC Counselor's Corner since its inception in 2012. 

October 07
A Quick Look at LinkedIn's New University Rankings

Last week LinkedIn unveiled new university rankings based on career outcomes. Here's a brief overview of what the site includes:

The LinkedIn rankings include a “top 25” list of schools in the United States for each of the following eight career fields: Accounting Professionals, Designers, Finance Professionals, Investment Bankers, Marketers, Media Professionals, Software Developers, and Software Developers at Startups (the rankings also include top 25 lists of schools in Canada and the United Kingdom for five career fields). 

How were the rankings developed? 

From LinkedIn's blog, here's what we know about the methodology: To develop the "top 25" lists, LinkedIn first identified the top companies where professionals in a specific career choose to work (based on LinkedIn data about which companies attract and retain the most employees in that career). Next, LinkedIn looked at where members in that career went to school. Then, for each school, LinkedIn found the percentage of alumni who work at the top companies identified in that particular career field (only members who graduated in the past eight years were considered).
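
Taken at face value, the per-school calculation reduces to a simple percentage. Below is a minimal Python sketch of how such a score might be computed; the eight-year window and the four accounting firms come from the description above, while the function name, data layout, and sample records are purely hypothetical:

    from datetime import date

    # Hypothetical sketch of LinkedIn's per-school score for one career field,
    # based only on the methodology described above. The company set and the
    # member records are illustrative assumptions, not LinkedIn's actual data.
    TOP_COMPANIES = {"Deloitte", "KPMG", "EY", "PwC"}  # full list and size unknown
    WINDOW_YEARS = 8

    def school_score(members, school, today=date(2014, 10, 7)):
        """Percentage of a school's recent graduates in this career field
        who work at one of the identified 'top companies'."""
        recent = [m for m in members
                  if m["school"] == school
                  and today.year - m["grad_year"] <= WINDOW_YEARS]
        if not recent:
            return 0.0
        at_top = sum(1 for m in recent if m["company"] in TOP_COMPANIES)
        return 100.0 * at_top / len(recent)

    members = [
        {"school": "State U", "grad_year": 2010, "company": "Deloitte"},
        {"school": "State U", "grad_year": 2009, "company": "Acme Corp"},
    ]
    print(school_score(members, "State U"))  # 50.0

Even this rough sketch makes plain how sensitive the result is to which companies make the list and to who happens to be a LinkedIn member.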

After reading through the limited explanation above, I was left with several questions. For example:

  • How many "top companies" were identified for each career field? This information would be helpful to know. For example, we can see that at least Deloitte, KPMG, EY, and PwC are identified as "top companies" for accounting professionals. However, we don't know whether, and how many, other companies were considered. Are the university rankings for each career based on employment at only a few companies? And similarly, are specific companies weighted more heavily in the rankings determination? 
  • How does LinkedIn account for the size of companies? Larger companies appear to have an immediate advantage simply because they can employ more professionals.
  • What proportion of alumni in the sample from each school are on LinkedIn? Do the rankings control for schools that do not have an equal proportion of alumni on LinkedIn (i.e., how does LinkedIn compare a school where 60% of alumni are LinkedIn members with a school where only 20% of alumni are represented)?
  • Does LinkedIn account for location? If a company’s headquarters are in one or two major cities, what about professionals who do not live in those areas?
  • What if a college/university isn’t represented on LinkedIn?
  • Does the ranking methodology restrict the rankings to business-related fields? 
These and other questions are important to ask when evaluating a new ranking methodology, in order to really understand the value of the data as well as its potential limitations. LinkedIn's rankings could be helpful for select students who wish to work at the big companies identified in the specific career fields analyzed. But for the majority of students, the rankings do not apply. Visit NACAC's website to learn more about rankings.
September 04
Back to School Post: 5 Things to Know About Rankings

As high school students across the nation go back to school this week, juniors in particular will soon be applying to colleges. For students and parents searching for potential colleges and universities, as well as school counselors advising students throughout their search, NACAC has put together the following reminders to help you read between the lines when using college rankings.

#1: All Rankings Are NOT Created Equal.

Each private ranking publication relies on a formula, or methodology, to determine which colleges and universities it considers the "best." These formulas can be very different for each publication, resulting in very different rankings.

The inconsistencies in college rankings reveal their significant flaws, but they also demonstrate the real purpose of the rankings: some lists work well for some individuals, but not all. Take a look at the methodology of each ranking publication and pay special attention to the weight associated with each criterion. Rankings typically include factors that are easy to measure, widely available, and standardized. Do you agree that standardized test scores are a good measure of quality? Do the rankings provide information on opportunities to engage with faculty and participate in internships? Look for the factors that are important to you. Visit NACAC's rankings website to see a list of links to the methodology for the most popular sources of rankings. 

#2: One Size Does Not Fit All.

Just because a magazine says that a particular college is number one doesn't mean that it would be a good fit for you. "Institutional fit" describes the degree to which a student's unique "profile" matches up with that of particular colleges or universities. This search for an appropriate "fit" is directed by your own interests, abilities, and values. Rankings do not offer this answer. What these lists can do is provide a good starting point for researching different colleges and universities, and even help you discover schools you have not heard of before. But if you want to make the college selection process one that is driven by your priorities (and thereby substantially more meaningful), concentrate on your own academic interests, abilities, and personal values. Resist the temptation to seek only the "premier" brands in higher education. Someone else's top five choices will likely be very different from your own. 

#3: What’s the Difference Between #40 and #48? How to Sift Through the “Noise.”

Researchers examining U.S. News and World Report rankings found that for universities ranked beyond the top 40, differences of +/- 4 (or even greater) should be considered "noise," and that universities ranked in the top 40 should similarly consider differences of +/- 2 as "noise." Moreover, it would be extremely difficult, if not impossible, for a university in the mid-30s to move into the top 20 without a substantive change in its academic reputation score, the category U.S. News weights the most in its rankings (combined weight of 22.5 percent). This score is often based on age-old perceptions that change very little over time. Therefore, it's important not to get too caught up in the "numbers" game when conducting your search. Once a rankings publication selects its methodology, it's unlikely that meaningful changes will occur over a short period of time, or even at all. 
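
To make those thresholds concrete, here is a minimal Python sketch of the rule as stated above; the +/- 2 and +/- 4 bands come from the research, while the function name and the choice to key the band off the better (smaller) of the two ranks are simplifying assumptions:

    def rank_change_is_noise(old_rank, new_rank):
        # "Noise" bands from the research described above:
        # +/- 2 for ranks in the top 40, +/- 4 (or even greater) beyond it.
        threshold = 2 if min(old_rank, new_rank) <= 40 else 4
        return abs(new_rank - old_rank) <= threshold

    print(rank_change_is_noise(35, 37))  # True: within +/- 2 for the top 40
    print(rank_change_is_noise(48, 44))  # True: within +/- 4 beyond the top 40
    print(rank_change_is_noise(30, 25))  # False: larger than the noise band

In other words, a school moving from #48 to #44 has not meaningfully "improved" by this measure, so small year-to-year shifts deserve little weight in your search.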

#4: Don’t Rely on One Source.

When using college rankings, one of the major dangers is taking one source of rankings at "face value," in other words, believing that one source is completely objective and the be-all, end-all determination of a college's quality. Consider multiple sources. Talk to alumni. Visit campus if you can. Discuss options with your school counselor. In addition, here are a couple of trusted sources for finding information and searching college options:

 

  • The U.S. Department of Education's College Navigator is a free consumer information tool designed to help students, parents, high school counselors, and others get information about more than 7,000 colleges from the U.S. Department of Education's database. 
  • The College Scorecard allows you to search colleges by selecting factors that are important in your college search, such as programs or majors offered, location, and enrollment size. You can also use the College Scorecard to find out more about a college's affordability and value so you can make more informed decisions about which college to attend. 

 

#5: BEWARE of Lead Generator Sites.

There are some websites out there that look like other rankings websites and may claim to provide a free and trusted source of information about colleges, but in reality they exist only to get you to enter your personal contact information, which the company can then sell to others for a fee. These websites, known as "lead generators," generally direct students only to schools and programs that pay them and are in no way objective or a good source of information. To learn how to identify a lead generator website, check out NACAC's online resource for more information. 

Visit NACAC's resource on College Rankings or resources for students and parents for more information.

July 30
College Rankings by Money Magazine

Earlier this week Money magazine revealed new college rankings, including a complete list of 665 colleges. The rankings focus on several factors, all weighted differently, within three overarching areas: 

 

  • Quality of education (weighted 33.3%) (i.e., graduation rates with a value-added measure, peer quality based on admitted students' standardized test scores, quality of professors based on ratemyprofessors.com, yield);
  • Affordability (weighted 33.3%) (i.e., net price of a degree, student and parent borrowing, loan default rates); and
  • Outcomes (weighted 33.3%) (i.e., early and mid-career earnings based on payscale.com). 

 

Data were converted into a single score on a five-point scale, and colleges were then ranked accordingly. For a complete look at how Money weighted each factor to rank colleges, view their methodology webpage.
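
As a rough illustration of how equal weighting produces a single rankable score, consider the Python sketch below; the three one-third weights are from Money's description, while the college names and category scores are invented for illustration:

    # The three equal category weights are from Money's description; the
    # per-category scores (on a five-point scale) are invented examples.
    WEIGHTS = {"quality": 1 / 3, "affordability": 1 / 3, "outcomes": 1 / 3}

    def overall_score(scores):
        # Combine per-category scores into the single five-point score
        # that colleges are then ranked by.
        return sum(WEIGHTS[c] * s for c, s in scores.items())

    colleges = {
        "College A": {"quality": 4.2, "affordability": 3.1, "outcomes": 4.5},
        "College B": {"quality": 3.8, "affordability": 4.4, "outcomes": 3.9},
    }
    ranked = sorted(colleges, key=lambda c: overall_score(colleges[c]), reverse=True)
    for rank, name in enumerate(ranked, start=1):
        print(rank, name, round(overall_score(colleges[name]), 2))
    # 1 College B 4.03
    # 2 College A 3.93

Note how a modest edge in a single category can flip the order, which is exactly why the choice and weighting of factors deserve scrutiny.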

As always with rankings, there are several caveats and limitations. What do you think of Money’s ranking methodology?

July 21
Q&A About Rankings for International Students Part 4 of 6

For international students considering a college education in the United States, media rankings of colleges are a leading source of information and are often relied upon when students are exploring their options. The following question is typically encountered by international admissions counselors and independent educational consultants working with international students. The answer may be useful in helping to guide students who want to know more about rankings.

There are so many rankings, including world rankings; I don’t know which ones are reliable. How do I know if some are more reliable than others in helping me do research on possible college options?  

There is no such thing as a reliable college ranking, since each ranking makes assumptions about which features of an institution are most important. College rankings are designed with specific goals in mind. In fact, rankings are not really rankings at all, because they often adjust the data in order to reinforce a pre-ordained order rather than letting the best institutions rise to the top. Moreover, many rankings measure things that affect undergraduates only in the most remote ways, such as the research activity of the faculty or how many faculty members have won a Nobel Prize.

 

Occasionally, looking for college rankings will take you to unscrupulous "lead generation" sites. Lead generation sites are often owned by marketing companies that unaccredited and for-profit colleges hire to find new customers. These sites generally promise to help you find the "right" college or offer access to special lists, rankings, and other information. A good way to tell if a site is a lead generation site is if it asks you to provide your contact information before you can proceed. Giving out your personal contact information may subject you to aggressive sales calls and tactics from non-reputable colleges. These sites often feature colleges that have paid to receive a sponsored place. When looking for colleges, always think about whether a site is offering something truly helpful. Ultimately, lead generation sites are more useful to the colleges that use them than they are to potential students.


July 21
Q&A About Rankings for International Students Part 3 of 6

For international students considering a college education in the United States, media rankings of colleges are a leading source of information and are often relied upon when students are exploring their options. The following question is typically encountered by international admissions counselors and independent educational consultants working with international students. The answer may be useful in helping to guide students who want to know more about rankings.

Given the 4,000 or so colleges in the U.S., I find it hard to know where to begin my research. Should I use rankings to help me narrow my search, and if so how?

There are indeed many colleges in the United States. The first thing you should do is start with yourself. Consider the kind of college experience you want to have and what you hope to accomplish while you are there. Consider what you might like to study, what your favorite things about your current school are, and what size school you want to attend. You should also make sure that a school has a few majors you are interested in and a compelling academic philosophy. These are the considerations to weigh when trying to find a good "fit" between you and an institution. 

 

Rankings tend to correlate well with admission factors and institutional wealth. Many of the highly ranked schools are well thought of…at least in terms of what the rankings measure. However, you should never cross a school off your list because it is not highly ranked. The methodology of rankings tends to produce artificially large differences between institutions.


July 21
New Research Suggests That College Rankings Promote Pre-established Order Among Colleges

University Rankings In Critical Perspective

The Journal of Higher Education, Volume 84, Number 4, July/August 2013

Brian Pusser and Simon Marginson

In University Rankings In Critical Perspective, university rankings are recontextualized within a discourse of power that illuminates their far-flung effects on the globalization of higher education. Brian Pusser and Simon Marginson argue that rankings contribute to a convergence of institutional missions, which can undermine the ability to meet localized educational needs. Moreover, rankings are a tool for powerful state actors to legitimize their standing atop the international prestige hierarchy by sanctioning global competition on terms that are most favorable to hegemonic universities. Overall, Pusser and Marginson conclude that university rankings contribute to a neo-imperial project that solidifies traditional forms of power and unequal competition between nation states.

Pusser and Marginson argue that a critical-theoretical perspective that examines power is useful for the study of higher education. They see states as fulfilling a variety of ends related to the concentration of power for elite actors. Institutions of higher learning exist in a nexus of power relations and are often recruited for nationalistic or state-building agendas. As state agents, universities can see their goals become co-extensive with the interests of moneyed elites by bringing legitimacy to neo-liberal values. Elite research universities, which are consistently found at the top of the rankings, are broadly responsive to the needs of powerful families, professions, and other authorities. Since the most sufficiently resourced institutions are consistently at the top of most rankings, the rankings ultimately serve as a framework by which the hegemony of leading institutions is perpetuated across the postsecondary education landscape. Overall, a critical-power perspective reveals that rankings do not reflect objective ratings of universities so much as they express various normative agendas, typically serving elite actors.

For Pusser and Marginson, rankings contribute to a neo-liberal meta-state project that confers power and prestige on English-speaking nations by engaging universities elsewhere in a contest that favors traditional elites. Given the diversity of countries, contexts, and educational needs around the world, the rankings would not be expected to have such broad appeal. However, rankings have managed exactly this and have contributed to a convergence of educational goals in many disparate contexts. In short, college rankings are an efficient mechanism by which the template of highly selective and research-oriented institutions can be propagated around the world. Using a conception of power informed by Michel Foucault, the authors argue that rankings have socialized, or "disciplined," other universities into behaving in ways that serve the interests of the most powerful agents. 

The disciplining feature of rankings is unfortunate because it undermines the ways in which global institutions could better serve their local contexts. Currently there are no major rankings that award merit for promoting accessibility or other forms of social justice. In the data that Pusser and Marginson collected, they found that none of the highest-ranked medical schools for "Social Mission Score" were also found in U.S. News and World Report's top research category. Only the Morehouse School of Medicine was ranked among the top ninety institutions in the primary care category for 2010. Presumably, there could be a variety of rankings that reward institutions serving in a number of heterogeneous contexts, but the data that are available point strongly toward a convergence of interests most favorable to hegemonic universities. More information on the sociological and institutional effects of college rankings can be found in the report of the NACAC Ad Hoc Committee on US News & World Report rankings, available here.

The full report can be found here.


June 26
Will U.S. News Rank Community Colleges in the Future?

The new Community College Directory published by U.S. News includes general information about 950 community colleges, using data from the Integrated Postsecondary Education Data System (IPEDS) such as tuition/fees, financial aid, enrollment, gender distribution, and student-to-faculty ratios. A user can search schools by name, state, ZIP code, or degree/certificate program. Although the directory currently doesn't rank community colleges by specific factors, U.S. News says it is "studying the possibility of publishing evaluative information on community colleges and their programs in the future."

Read U.S. News' FAQ about the directory here.

June 19
Researchers Conclude Efforts to Change a University's Rankings Are a "waste of time."

In response to rankings such as U.S. News & World Report's, many university administrators take actions such as setting strategic goals, allocating resources, or even tying performance to bonuses in a direct attempt to boost their institution's position in the rankings. However, it's unclear what it would really take to move up in the rankings significantly, or whether it's even possible to do so. Researchers in a new report examined these questions by creating a model that uses the same methodology described by U.S. News.

Gnolek, Falciano, & Kuncl (2014) conclude that "meaningful rank changes for top universities are difficult and would only occur after long-range and extraordinarily expensive changes, not through small adjustments." For example, it would be extremely difficult, if not impossible, for a university in the mid-30s to move into the top 20 without a substantive change in a university's academic reputation score, the category U.S. News weights the most in their rankings (combined weight of 22.5 percent). This score is often based on age-old perceptions that change very little over time. Even if that change occurred, the university would need to spend a very large amount of money to change other factors at the same time. 

Furthermore, in answering what amount of change should simply be considered "noise," the researchers found that for universities with ranks beyond the top 40, differences of +/- 4 (or even greater) should be considered "noise," and that universities with ranks in the top 40 should similarly consider differences of +/- 2 as "noise." 

So what's the advice to campuses? 

"Universities might be best served by focusing their efforts and resources on what they do best, not what is being measured by U.S. News, by making improvements that are in line with their own goals and are centered around the success of their unique student population."

The authors hope their findings will reduce the pressure on universities to constantly respond to small changes in rankings and to spend money on factors aimed exclusively at changing rankings, as well as combat the tendency to manipulate numbers. Indeed, the researchers note that "the focus on wealth, fame, and exclusivity that comes from placing emphasis on U.S. News rankings creates very real issues and highlights some of the inherent problems with rankings themselves."

Listen to one of the authors talk about the findings in a panel recorded here.
