May 22, 2019
The Trouble(s) with College
Inequities in Admissions, Inadequacies in Teaching
By Alfie Kohn
In Annie Hall, we’re reminded of the old joke in which a guest at a resort complains, “The food at this place is really terrible,” prompting her friend to reply, “I know! And such small portions!” Which leads me to propose the collegiate version: “Boy, the education you get at this school is really terrible….And so few people have access to it!”
That’s pretty much the paradoxical pairing of criticisms I’d like to offer in what follows. The crippling debt that young people are forced to accept in order to attend college is far from the only problem with higher education in America. I don’t think it’s an exaggeration to say that the whole institution is profoundly broken: the process for determining who gets to attend as well as what awaits those who do. These arguments aren’t exactly novel, but it may be useful to gather the particulars — along with links to various studies and essays on the subject — in one place.
I. GETTING INTO COLLEGE
The current system of college admissions “exists not to provide social mobility but to prevent it,” journalist Neal Gabler has remarked. True, there have been efforts to create a semblance of racial and ethnic diversity, though apparently more to enrich the experience of the privileged than to provide broader access to those who have historically been excluded. But economically disadvantaged students still comprise a tiny percentage of those admitted to elite institutions. The struggle for cultural diversity on campuses is largely a battle “over what skin color the rich kids should have,” as Walter Benn Michaels drily observed.
As if astronomical tuition weren’t enough of a barrier, consider the practice of offering “early decision” admission. Who can afford to accept a binding offer from a college before financial aid is discussed and scholarship offers from several schools can be compared? The practice of “legacy” admissions, meanwhile — the special consideration elite schools commonly give to children of alumni — has a “tarnished history” and continues to function as a form of affirmative action for the wealthy. (By one estimate, these second-generation applicants to selective colleges are seven hundred percent more likely to get in than those who chose to be born to parents who aren’t alumni.)
But the problem isn’t just with discrete practices like these that could in theory be abolished tomorrow. Rather, it’s the expectation, usually taken for granted, that high school students must be not merely capable of benefiting from a college education but already exceptionally accomplished at the age of eighteen. The effect, of course, is to privilege those whose precocious talent has been nurtured at considerable expense. Thus, when federal prosecutors charged dozens of rich people with an elaborate cheating scheme to get their kids into desirable colleges, we were immediately reminded not to lose sight of “the broader truth: All of American higher education is, in essence, a giant pay-to-play scandal.”
It’s understandable that William Deresiewicz, the author of Excellent Sheep, would propose this rule of thumb: “The more prestigious the school, the more unequal its student body is apt to be” — with the result that elite colleges “actively promote…the movement toward a more unequal society.” But one of the most obnoxious features of college admissions — marketing campaigns — is not limited to prestigious schools. I used to assume that the primary purpose of clever branding and aggressive outreach was to find, and encourage applications from, students who would be a good match for the school and whom the school would be delighted to accept. Yes, I really was that naive. The actual goal, I later learned, is to encourage as many applications as possible from students whom the college intends to reject, thereby increasing its selectivity and making it appear more desirable on national rankings. Hopeful students are basically exploited to improve the school’s reputation.
Another practice employed with increasing frequency by all sorts of colleges is “merit aid”: financial inducements for the students who often need the least help. The idea of offering scholarships based on any criterion other than need — let alone extending them disproportionately to the relatively affluent — is offensive on its face. But tuition rates are insane, so anything that moderates that burden for anyone tends to be welcomed, never mind that the effect is to drain the pool of money that could be offered to low-income students. The point is to allow this college to compete with that one for “desirable” applicants, and the effect — as is often the case with competition in any arena — is to redistribute resources upwards.
While the effect is regressive, some of these disturbing practices, as I’ve noted, have been adopted by colleges across the spectrum — including state universities.1 But let’s face it: The admissions policies of the most elite schools are particularly egregious and they are enabled by anyone, notably prospective applicants and their families, who worships at the altar of prestige. In some neighborhoods, college conversations are focused on a few dozen schools that are extremely difficult to get into — as if these were the only ones worth attending, or indeed, the only ones that existed.2
A bracing bit of perspective, then: Four out of five U.S. colleges accept more than half the students who apply; only about 3 percent are classified as extremely selective (which means they admit fewer than one out of five applicants). And just because a college isn’t particularly difficult to get into doesn’t mean it can’t offer a fine education. The problem is that mentioning their names doesn’t make people swoon with envy or awe. We assume that basking in the radiance of that prestige is worth sacrificing a teenager’s mental health, sleep, friends, curiosity, and sanity. Sadly, some parents would prefer to take the chance of having their child fail (or implode) at a prestigious college rather than flourish at one their friends haven’t heard of.
Not only does prestige fail to justify this all-consuming (and, by definition, usually futile) quest for admission, but it may be that ultraselective schools are actually less desirable in some respects. Much of one’s college experience, including intellectual growth, is a function of the students one learns and perhaps lives with, and those at less prestigious colleges, says Deresiewicz, are “apt to be more interesting, curious, open, and far less entitled and competitive.” In any event, research has shown that the selectivity of the college one attends has absolutely no relationship to the quality of the teaching that takes place there. Or to subsequent job satisfaction. Or to general well-being. Even in crude financial terms, it appears that the selectivity of the college usually doesn’t make much difference to future earnings (once we hold constant characteristics of the students).
We can say a few things with absolute confidence: First, the differences between more- and less-selective institutions simply aren’t as important as most people think. Second, the point should be to find a good match between a student and a college rather than assuming everyone should be straining to get into the same cluster of name-brand schools. And above all, colleges should not be judged on the basis of how many applicants they manage to avoid admitting. Selectivity and quality are two completely different things.
II. GETTING AN EDUCATION IN COLLEGE
Once a student enrolls at an institution of higher learning, how meaningful is the learning that’s likely to occur there?
At some universities, the priority is clearly not learning but sports. One report found that some U.S. universities “spend three to six times as much on each athlete as they do to educate each of their students.” (A college administrator once remarked that he wanted a university that the football team could be proud of.) Even more common is the tendency to privilege — and award tenure primarily based on — research rather than teaching. Many professors seem to regard teaching as a kind of occupational hazard, something they don’t like, aren’t especially good at, and haven’t been helped to learn how to do better. It’s telling, as David Helfand, a long-time academic, observed, that professors tend to talk about their research “opportunities” and their teaching “loads.”
To mention the criteria for awarding tenure, however, is to stumble on a related question: “Who’s doing the teaching?” It turns out that fewer than one-quarter of higher education faculty in America are either tenured or on a tenure track. Instead, colleges increasingly are hiring (and underpaying) part-time adjunct instructors to teach undergraduates: freelancers who scramble to make a living, often by shuttling from one institution to another, typically working under atrocious — even abusive — conditions. This arrangement, which has quietly become the norm in higher education, offers one sort of answer to the question of how much teaching and learning really matter in these institutions. Cut-rate instruction suggests that undergraduates’ tuition is primarily being used to subsidize research.
But regardless of who’s doing the teaching, how well is it being done? There are happy exceptions, of course, but college students are too often regarded as passive receptacles for a stream of information rather than as active participants in intellectual discovery. A typical course is mapped out in advance, which means without attention to the interests and needs of the particular students who take it and without any opportunity for their input. (A week-by-week summary, along with a list of rules and threats, is delivered to students at the beginning in the form of a syllabus, whose tone often resembles “something that might be handed to a prisoner on the first day of incarceration.”) Knowledge is transmitted through lectures, a fact that, as I’ve argued elsewhere, reveals how teaching has been conflated with telling. Research has consistently found lecturing to be of extremely limited effectiveness even for retaining information, much less for acquiring deep understanding. That it continues to be the dominant instructional strategy anyway does not speak well for the state of higher education.
Finally, there is the question of what is being taught, and what that tells us about the apparent purpose of these four years. Consider this fact: Fewer than 10 percent of U.S. college students today major in any of the humanities: literature, languages, philosophy, history, classics, or the fine arts. There has been a marked shift away from liberal arts in favor of narrow career preparation — to the point that, as one academic observer puts it, “we are drifting toward treating college as a trade school.” Many small liberal-arts colleges are shutting their doors, and at some established universities liberal arts degrees are being eliminated. With only a bit of hyperbole, an acquaintance of mine who teaches at Yale describes that institution — and, presumably, others like it — as a “finishing school for finance.” Even at colleges that deliberately don’t offer a degree in business, the most popular major is often economics, and not always because of a deep fascination with the subject itself.
Benjamin Schmidt argued in The Atlantic that this trend doesn’t reflect a “sharp drop in the actual career prospects of humanities majors. Instead, in the wake of the 2008 financial crisis, students seem to have shifted their view of what they should be studying — in a largely misguided effort to enhance their chances on the job market. And something essential is being lost in the process.” His use of the phrase “largely misguided” reflects data showing that a humanities major actually doesn’t amount to a handicap for future financial success. But the more pressing question is whether, even in an uncertain economy, this should be the chief, let alone the sole, criterion for deciding what to study — or for evaluating a college.3 It’s certainly reasonable to want to be employable after graduating, but that’s very different from evaluating an education primarily in economic terms, just as the question “Will I be able to get a decent job if I go here and major in this?” is not the same question as “Will this make me rich?”4
Intellectual exploration for its own sake — learning to think critically, to make connections and distinctions, to entertain questions one has never encountered before — is an extraordinary opportunity. It is a pity that education, as distinct from training (for a career), is highly uncommon in most countries. It is a pity that in the United States, too, it is becoming a luxury for the rich offered at a small and shrinking number of colleges. It is a pity that even many of those who do have access to it are failing to take advantage of what may be the last structured opportunity in their lives to avail themselves of this kind of pursuit.
The problem is multi-layered and deeply rooted: It comprises what colleges teach, how and by whom it’s taught, the relative lack of emphasis on high-quality teaching, and the equally troubling priorities that determine who is admitted to these schools in the first place. To understand such developments requires us to employ the critical tools of sociology, economics, philosophy, and pedagogy, among other disciplines. And it requires that a broad sweep of our population participate in this inquiry, since these issues affect all of us. Alas, the capacity to carry out this sort of investigation is precisely what is at risk when there is reason to question both the quality of higher education and who has access to it.
1. One of the great, quietly unfolding tragedies of higher education in this country is the increasing selectivity of many state schools. Abandoning their traditional mission of providing a solid education at an affordable price to residents, they have elected instead to compete — and reserve more spaces — for the most impressive out-of-state students, who pay substantially higher tuition. This trend also reflects the fact that such universities have become more tuition-dependent, and have had to cut programs, because state legislatures have slashed financial support for education.
2. More generally, this offers a warped impression of the usual experience of postsecondary education. A typical student is actually older than 25, attends a public university or community college, and often does so part-time, while holding down a job and living at home rather than in a dorm.
3. For those of us who thought we had hit bottom with the tendency to judge schools on the basis of their students’ test scores, there is now actually talk of judging them by the extent to which the degrees they confer literally pay off. Apart from the stultifyingly materialist worldview this reflects, its practical effect will be to increase institutional pressure to emphasize preprofessional courses and majors.
4. On the other hand, it is a bit disconcerting if a liberal arts education costing more than a quarter-million dollars primarily equips the graduate to employ schemas of dialectical intersectionality to problematize neoliberalism’s discourse of hegemonic heteronormativity….while fetching your grande latte.