There Are Five Academias
Before we start, let me say that I’ve had a lot of life experiences—I’m 41 years old—and I’ve been in and around educational institutions of all sorts. I have plenty of friends who are graduate students or professors. Nothing in this essay is about, or directed at, any specific individual or institution. Got it? Good. This is a disclaimer I have to make, because I’ve suffered professional consequences in the past when people assumed my writing was about them. I’m not Carly Simon, and this song ain’t.
Instead, this is a preliminary analysis pertaining to one of the most interesting questions of 21st-century civilization: Can academia be saved? We are not talking about ordinary corporate institutions; if a company whose purpose is to maximize shareholder value collapses, no one who is not a shareholder cares. Academia is something civilization needs. And I cannot answer any discrete question about its savability because—here is the core assertion—there is no such animal as academia. It is not even dead, morally compromised, or dysfunctional, because the concept is invested in a model of reality from another time.
In the world we must actually live in, there are five academias. They coexist, and they often need money from each other, but they don’t have very much in common, and they often don’t like each other. We’ll discuss each one, and no attempt will be made to unify these disparate images into a single picture; I could say that I have left it as an exercise, but if we restrict ourselves to what presently is and not what should be, it becomes an impossible one—and I’m not that cruel.
Academia E—Education (and other things…)
Most people, when they think about universities, think about higher education—college, professional school, and academic graduate school. This is an important service—millions of people pay for, at least, the social mobility it confers—but it’s not really academia’s core. It’s one of the academias, yes.
Parents do not render $75,000 per year unto the tuition-industrial complex because they are expecting their children to have the most excellent teachers—they want adequate teachers who are world-class researchers—or, more accurately still, they want their children to have the employment opportunities that come from the social status implications of having been taught by world-class researchers, said researchers being optional. It doesn’t actually matter all that much, financially speaking, if those courses are run entirely by TAs and the professors show up once per semester. It’s still frowned upon when professors are that blatant, but their absence will not interfere with the university system’s discovered socioeconomic purpose of giving rich companies a sorted list of humans—for free, because the costs were paid by the people being sorted. Sadly, universities know this. It is absolutely true that teaching brings in revenue, but the quality of the teaching is irrelevant to the amount of revenue acquired. Education, in our society, is mostly an elaborate euphemism for something else.
This world has eight billion people in it, and some of them are dumb motherfuckers, and one of the ways dumb motherfuckers identify themselves as dumb motherfuckers is with an aphorism: “Those who can, do; those who can’t, teach.” I am being charitable by putting a semicolon in there; those who utter it have never seen one. It takes an immature mind not to recognize that helping others succeed is success. It is only because we are so infected by capitalist brainrot that this needs to be said.
As a computer programmer for more than twenty years, I’ve observed organizations and their dysfunction, and something I’ve learned about institutions is that they have a tendency to subconsciously decide that some tasks are “male work” that keeps the steam in the pipes, while others are “female work” that, while important, doesn’t require the best people. People who invest themselves in helping others succeed are judged to be cowards who refuse to stand alone and make an individual contribution the world can weigh. I shouldn’t have to say that this is a stupid opinion—in the real world, we are interdependent and there is no such thing as an individual contribution. The gendering of work is not always predictable. For example, women are much better suited to sales than men, and yet, since the 1950s (for reasons too numerous to discuss) companies have deemed sales “male work”, as made evident by the use of hunting, sport, and war metaphors. Male work is all about scoring “kills”—defeated competitors, clients pounded into submission in contractual disputes, layoffs efficiently executed. Female work is resented and devalued because the people doing it “get to” help rather than harm people. Consider the taxonomy of useless, socially harmful labor in David Graeber’s Bullshit Jobs—three of the five categories are archetypically male (Goons, for zero-sum violence; Duct-Tapers, to fix ugly problems; Taskmasters, to squeeze workers), while only one (Box-Tickers, to make organizations look pretty) is archetypically female and the last (Flunkies, to make bosses look good) is genderless.
Academia has not descended entirely into this inferno. There are enough positive-sum opportunities in it that the “male work” of research can be pursued without harm to others and, while teaching is not as valued as it should be, researchers do not lose face if they excel in it—as they would in an organization where the male/female separation of work was more rigid. Teaching is still valued—just not enough. The context of an older social contract has been forgotten.
Going into the middle of the twentieth century, teaching was both a service of researchers unto the public, and one of institutions unto researchers. It insulated researchers, whose work was not broadly understood, from any questioning about whether they were earning their keep. Professors had to teach—some enjoyed it; some did not. Research, however, would be subject to no evaluative discipline at all once a six-year trial period, meant to establish basic competency, was completed. This made sense. Then, as now, research relevance cannot be assessed in the short term, which means that it is impossible to evaluate individuals at the level of precision that, morally speaking, a survival-stakes question—“Should this person be allowed to eat?”—requires.
We can’t go further into this without asking a question: What is research?
Academia R—Research, recursively, does a recursion in which, recursively…
No, seriously. What is research? Do medievalists do the same class of work as particle physicists, or do they have entirely different jobs? What is research’s goal? What is the end product? Are papers the important work? Or are papers merely documentation of work? What makes good research? What makes bad research? Why do we care? Is research valuable enough that we should force the public to pay for it? (Answer: Yes, but why?) These are important questions, and there are competing answers to each, but the number of people who can articulate their preferred responses well enough to leave a person confident that survival-stakes moral matters have been properly resolved is… zero. I am not one of those zero people. Still, I will try my best.
We live, as David Graeber observed, in a crab-mentality society where exploited workers rage not at their exploiters but at workers who seem to be exploited less. Therefore, outsiders to academia believe that professors live in comfort while indulging in “the life of the mind.” That there are chemistry professors who get paid to think about new molecules. That there are literature professors who get paid to read old books. That there are music professors who get paid to write songs. The idea that professors buy houses in their first year out of graduate school, and that they easily get tenure after publishing one or two papers, still persists, although it hasn’t been true since the 1970s.
Money is something humans invented to quantify failure—“You owe John 50 bushels for the damage your pig did to his farm, he owes you 40 for killing your pig; therefore, pay him 10, due next harvest”—and it is easy to see that one would envy a life free of commercial concerns. The academic outsiders who cheer on funding cuts really do believe professors live in a pastoral place where the self-appointed enlightened ones kicked out “the dumbs” and made communism actually possible. And…? Well, the prof-haters would be quite happy, I imagine, if they knew what today’s real academia was like. The idealized “life of the mind” doesn’t exist, as we’ll cover in depth when I discuss Academia G. For now, we must understand what went wrong in Academia R.
Research has an evaluation paradox—the best people’s time is better used to do new original work than to evaluate other people’s. This is not true everywhere—most industries and workplaces are not high-performance operations and run just fine with mediocre-plus managers evaluating others’ mediocre contributions. In research, though, it becomes necessary to make evaluation a service obligation that people will be expected to do, but not so much that they are unable to continue their main line of work.
We must address the need to evaluate in two different ways—we have to evaluate research contributions, such as discoveries and scholarly works, and we also have to evaluate people. The need sets are different; the moral problems diverge. One necessitates conservatism—we should be damn sure, before we canonize knowledge, that we have checked it from all angles—while the other is one where, if we’re decent humans—unfortunately, we live under capitalism, so the people running our world are not decent ones—we will agree that erring on the side of mercy is correct.
Academics gripe about peer review. It’s broken, they say. They’re not wrong. The whole system relies on volunteer labor that varies wildly in quality. The purpose of peer review is to validate methodology and argumentation—not inject personal opinions about what should be relevant or what has “novelty”—and this is sometimes forgotten. Still, peer review was designed to be conservative—to be eventually consistent; that is, get the right answer, given time. The problem is that, as a mechanism to evaluate people and whether they have the right to eat—there are now survival stakes—it is too slow and error-prone. An absurd number of academics simply “time out and die” not because their work isn’t good enough, but because delays in the peer review process ensure they are outrun by their own need to eat and pay rent.
There’s a truly awful comic about “publish or perish” in which the stock character of an uncharismatic professor complains about it, and then a white woman puts him in his place by saying private-sector workers have that, too—it’s called “do your job or get fired.” This shows us how inaccurate the outsider’s view of academia is; when they hear complaints about the publish-or-perish culture, they think professors are complaining about having to write papers. To see people who assign term papers suffer under an expectation to write them is taken as delicious irony. “We didn’t like ’em either, buddy.”
Of course, it’s not the writing of papers to which professors object, but a million things after that, some of which are objectively bad for the arts and sciences.
You have N years to make the case that you and your children deserve not to freeze and starve on the streets. None of the decision makers will read your work—that’s what graduate students are for—so your citation count will decide, but you don’t know what the bar will be. Each paper accepted by Journal A will result in 100 citations, but has only a 5% chance of being accepted and—worst of all—you will have to wait 2 out of your N years for a decision. Each paper accepted by Journal B will result in 2 citations, but good work has a 50% chance of acceptance, you will get a result in three months either way, and a rejection will come with useful comments. Where do you submit your work? Do you aim high, and risk career-ending time losses, or do you underplace your work to get it out there, but risk that it will be ignored? Do you break your contribution into several papers (“salami slicing”) and boost your metrics while diversifying your risk? Do you “rent” graduate students out to other professors in exchange for getting your name on 500-citation papers in the pipeline? This is what professors are talking about when they talk about publish-or-perish—the stupid fucking h-index game that results in a flood of hastily-written, lousy papers. No one goes into research wanting to write shitty papers. They are all forced to play this game by terrible decisions made decades ago.
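Since I’m a programmer, here is a minimal sketch of the tradeoff as a toy expected-value model. It uses only the invented numbers above (100 vs. 2 citations, 5% vs. 50% acceptance, two years vs. three months), not real journal statistics, and it ignores resubmission:

```python
# Toy model of the submission dilemma above. Every figure is one of this
# essay's invented numbers, not a real journal statistic.

def expected_citations_per_year(p_accept, citations_if_accepted, years_waiting):
    """Expected citations gained per year of career clock spent waiting
    on a single submission (resubmission after rejection is ignored)."""
    return (p_accept * citations_if_accepted) / years_waiting

journal_a = expected_citations_per_year(0.05, 100, 2.0)   # prestigious, slow
journal_b = expected_citations_per_year(0.50, 2, 0.25)    # modest, fast

print(f"Journal A: {journal_a:.1f} expected citations per year")  # 2.5
print(f"Journal B: {journal_b:.1f} expected citations per year")  # 4.0
```

In expectation, the modest journal wins on throughput; but only the prestigious one can ever produce the single 100-citation hit that clears an unknown bar, so people gamble with years they do not have.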
What were those terrible decisions made decades ago? I’m glad you asked.
As I said, the public image of a professor is decades out of date—“the life of the mind;” tenure at thirty unless you slept with a dean’s daughter; research as a context to travel wherever you want whenever you want—and, when outsiders are told how bad it actually is to be a professor in today’s academia, they’ll often say that this is something academics “did to themselves.” This isn’t actually wrong, if we think in terms of collectives. This is something that previous generations did to the current ones.
During the Cold War, there was so much war machine money flying around, you had to stand sideways not to be hit by it. Tenured academics who disliked teaching decided to just… not do it. They started showing up 15 minutes late to class, then 25, then 35. They delegated more and more work to TAs. If confronted about this behavior, they undermined their colleagues by suggesting their research (“male work”) was the real contribution but that education—social mobility, not learning, is the product—could be done by people with far less skill and experience. If a research rainmaker could bring in a seven- or eight-figure grant, what was the point of “wasting” his time by making him teach students? Administrators acceded. In the Cold War, this worked. People who did not want to teach were told they no longer had to do it. This also led to adjunctification. The wrong people suffered. I generally dislike generational narratives—indeed, there will be a Great Disappointment in the United States when the hated “Boomers” are all gone but everything still sucks, because capitalism is capitalism—but this is a pretty clear-cut case of an older generation knifing the young.
The above should not be taken to mean there should be no “research-only” academic positions. There should be, at least outside of universities. I do not believe that people who disinvest in teaching should be ranked as the toppest-tier of the top-tier academics, as they currently are thanks to their single-minded investment in the gaming of publication metrics, but I also don’t think people lacking the interest and social skills to teach should be locked out of research entirely—society shouldn’t venerate this general attitude, but it should make a place for people who excel in one category of abilities but lack the other.
So long as there was a Soviet Union to hate, professors could pump class sizes to 300, show up unprepared and 20 minutes late, or even decide they weren’t going to teach the billed subject at all—the exam would tell who had read the textbook, and this would cause student anxiety, but grades would be inflated to cover it up—and hold position. The Cold War, notably, ended. And the students who were badly educated did not forget. To professors who openly showed disinterest in teaching, they returned the favor as conservative legislators who cut funding for public universities. If you do not educate, you should expect stupid results—you should expect a stupid society, such as the one we are.
The lesson here is that teaching really matters. Research does too, but research is incredibly difficult to evaluate, even when the evaluation is done by intelligent people, operating in good faith, who have been taught well—and this is why it takes years or decades for contributions to be evaluated fairly in the best conditions. It is impossible for ill-taught people to evaluate research’s merits. How do we improve conditions, so research is properly valued? We encourage researchers to teach—this is what we did, back when academia worked. If you’re not willing to teach undergraduates why literature is relevant, then you should expect your own research on 19th-century women’s literature to be received as irrelevant.
Academia’s situation can be called a competitive sink—a dynamic in which competition for jobs, theorized to select the best people, in fact worsens the quality of work through duplicated and wasted effort. You are possibly, when you increase the competitive temperature, selecting more talented people, but the contest itself is so corrosive that the quality of the output deteriorates. You might be getting “10% better” people, but you are cutting their useful output in half, so you’re down 45 percent (1.1 × 0.5 = 0.55). It is 100 times more competitive to get an academic job than it was back when we had a country, and yet the quality of teaching and research is nowhere near as good.
A small number of professors during the Cold War decided that research was “male work” and that teaching was “female work.” They thought it would improve their daily lives to devalue the latter, so they did. There still are quite a number of fantastic teachers in academia—it feels good to do an important job well, and this keeps a lot of people going, even when they start to find the rest of academia distasteful—but they do so knowing there will be no extrinsic reward for it. There is, after all, no “+10 h-index” power-up that spawns for exceptional teachers—I’ve read all the papers on this, and no such thing exists.
Behavioral sinks are extraordinarily difficult to get out of—this is why they’re called sinks. A person who writes a few excellent papers, instead of mass-producing papers to inflate his own and his colleagues’ citation counts, will be ignored. A postdoc who puts effort into his teaching will be unable to secure a tenure-track job. A small number of professors in the 1960s and 1970s decided to ignore half their job—in the 2020s, all professors suffer for it, because an ill-taught public and ill-taught legislators do not value what academia does.
In the humanities, the result has been total irrelevance. The fate that has befallen the sciences and engineering is… not irrelevance, but evolution into a form antithetical to what academia is supposed to be.
Academia G—Go, go, go, grub all the grants
There is a perversion of language in academia that allows people to claim the job has two halves: teaching and research. No. Most academics do the teaching for free; that is known to most people who understand academia. They also do the research for free. Grant grubbing is what they get paid for. You might think I’m joking, but most professors go unpaid during the summer if they do not succeed at grant grubbing, and it is increasingly common for universities, via the “soft money” loophole, to apply this all year. Young researchers are never told—if they were, there would be 95 percent fewer people entering graduate school—the degree to which the F-word, funding, is a factor in what academia does and who succeeds in it.
That academics do apply for funding is not hidden from the young. It is widely known to be something professors do sometimes, but presented as something only people with exceptional ambitions have to think about. No one is shocked or disgusted by the fact that someone wanting to do a $30 million experiment cannot use a “because I work here” argument as justification for the money being spent. That highly ambitious, expensive research projects require external support is not a moral issue. What undergraduates aren’t told, though, is that the (small) stipend and tuition forgiveness they receive are billed against an advisor’s grants, an issue that creates horrible conflicts of interest… as we’ll discuss, when we get to Academia M. In order to do anything, professors are expected to apply for personal funding, without much institutional support beyond the name… although universities also take a truly massive cut—more on that later, too.
Grant applications are extremely time-consuming and have a success rate of about 18%, but you are dead as an academic if you’re not batting at least .667—preferably, .850. You can survive journal or conference rejections—improve the paper, resubmit—but if your funding applications don’t hit more often than they miss, you will be managed out, even if you have tenure. This is also where the socioeconomic element of academia slides in. To achieve an acceptable success ratio that is roughly four times baseline (0.667 / 0.18 ≈ 3.7) requires an entire village of people who will tell you what to apply for, when to apply for it, how to apply, and precisely what words to say, in what order—this village is not provided; it would be extremely unusual for a university president to take a personal interest in making sure a first-year professor’s grant applications fund. Still, if you do not find a way to get these people to work to make sure your ideas get funded, you have a zero percent chance of thriving in today’s academia. What does this have to do with preexisting socioeconomic status? Well, one skill rich kids learn early on—it’s amazing to watch, and it’s impossible to develop if you’re not born into it—is the capacity to boss people around without them realizing it’s happening to them. If you want to be a fundraiser, you need a certain knack for convincing people (who, in truth, owe you nothing) that your own career is their full-time obligation. If you want the process of applying for funding not to destroy your career—every failed application is time you will never get back—you either need to have this skill or find a backer who does.
Most researchers are good people; they do not want to be dishonest. Still, the role of funding in academia is deliberately underpublished because it is depressing, and also because the truth would undermine the social status of the profession. The image of the professor who gets paid to sit in an office and read or write books is an asset that raises their social standing outside of the academic world, and it’s, sadly, the only thing quite a number of people have. To be clear, I don’t think these revelations about academia should erode the social status of researchers or teachers—most of them are excellent people, and they’re underpaid both (a) in the context of their talent, and (b) especially compared to what they’re expected to put up with.
Let me cover one more of those infuriating, underpublished topics: Overhead. Universities know that professors can’t raise funds without an affiliation, so they charge a truly massive rent for the use of the name—50 to 70 percent, sometimes more, out of every grant the professor gets. And we’ve already covered that the calls made by the university president to make sure a young professor’s grants always fund do not, in practice, happen, so that 50–70 percent really is just graft. In theory, this money is an insurance policy that will fund job security for professors (and their students) when they have dry years. In practice…? I don’t think that exists. The official line about this “overhead” is that it’s not worth getting angry about because the government agencies know it is there and factor it in—if the project will take $1,000,000 to complete, they’ll pay $2,000,000—but this is a terrible argument. First, it means there are fewer grants to go around, which makes it harder for young professors to get them. Second, it means that public funds are being diverted to causes that aren’t teaching and aren’t research. What makes it outrageous is not that this overhead exists—indirect costs are real—but that professors still get billed for everything: equipment, graduate students, travel. Universities double dip. They take the “overhead” haircut and then they also bill professors for students’ stipends and—even if we’re talking about fourth- and fifth-year students who are no longer taking courses—tuition.
Why has no one stood up to fix this? I think we know the answer. Academia is not—properly plural, the academias are not—organized enough; they are too divided along hierarchical as well as functional lines to operate with a coherent will.
Academia M: There might be a middleman for your middleman, and if you’re really good, you get a middleman for your middlemen for middlemen, and if you’re really good you get a…
The general horribleness that funding issues inflict on academia creates conflicts of interest. If an advisor takes a direct financial hit to his grants for every graduate student, he’s going to be focused more on getting funded projects done quickly, so more funds can be secured, than on the training of independent scholars. To be clear, there are a lot of professors who do the right thing in spite of a system that will never reward them. The notoriously toxic labs one hears about on r/PhD are rare, but they do exist, and the grant-grubbing culture is why. Some advisors, realizing they must raise funds to survive, and knowing the power they hold over their students, go rogue and turn into corporate bosses.
The ethical thing to do, if one decides to become a corporate boss, is to go corporate—it also serves one’s personal financial interests, if one must become a top-billing salesman, to be paid like one. Still, people in an intermediate or “corporate-curious” state often use graduate-student labor—publicly funded—for the foundational work of future private-sector startups. Is this… legal? I’m not a lawyer. It might be. It’s clearly shady, but it’s an understandable retaliatory behavior against a corrupt system. Alas, though, it’s not the system that suffers. The system is doing just fine.
One of those ugly facts about research is that a proportion of the people who are there are just there to get jobs somewhere else. They don’t care about the arts and sciences—they’re going to go private at the first opportunity; they’re just smart enough to know that going private without solid paperwork is a way to end up in a “regular” corporate job where one answers to idiots and advancement based on talent is impossible. So they wait. If their advisor wants weekly status slides, 22.5-point font, on progress in the lab, they actually make the slides. They aren’t research-grade and they don’t want to be research-grade, but they’re really good at doing what they’re told. They don’t attend half their classes, because usually they answer to advisors who tell them coursework doesn’t matter. (This is a red flag. You need more than your advisor’s support, for example, to build a dissertation committee. Grades themselves don’t matter, in the sense that if you do excellent work, but fail the last exam, and end up with a B, you’re probably fine. But your reputation does matter, and any advisor who says coursework isn’t important is one to run from.) These “lab rats” will not produce seminal work, but they’re also unlikely to end up in career-limiting conflicts, because they just don’t care enough to get into one. This, for the record, is not limited to lower tiers of the university world; Silicon Valley was built by a pipeline of Stanford professors sending second-rate students to startups and venture-capital firms, with glowing recommendations, because it avoided unpleasant conversations.
This academia, Academia M, is not discussed much at all. It mostly exists to recruit for non-research and sub-research employers. It is not really supposed to live in parallel with graduate education—the business-grade people are supposed to be “encouraged to succeed elsewhere” as juniors or seniors in college, rather than gum up the works by applying to graduate school. Unfortunately, employers overestimate the talent they need—it does not take a doctorate in machine learning to use linear regression on a corporate dataset—and so Academia M is present inside graduate school as well.
Some of Academia M’s denizens truly are mediocrities who belong in private equity or startups, but a lot of them are just unlucky—some of them really are talented. You meet these people when you encounter graduate students and postdocs who have not had a vacation in two years, and who spend ungodly hours on projects they didn’t choose, and for whom being “fired” by an advisor doesn’t mean a slightly delayed graduation due to the search for a new one, but leaving graduate school. They work on the stuff that somebody got funded and that now has to get done. This work doesn’t produce great papers, and it’s usually been stripped of all creative content, so most people who do it eventually decide that, if they’re going to put up with a corporate job, they might as well go out and get an actual corporate job.
This is an ugly discussion, but middlemen, I guess, are important. They get an academia, too.
Academia Z: They get screwed, but we need them.
The last of these is Academia Z. This is where no professor wants to end up. It is nearly impossible, from here, to remain relevant in research. Time is mostly spent on teaching, which isn’t inherently a bad thing, but the teaching is still not valued—elite liberal arts colleges that offer tenure are Academia E, not Academia Z—and so teaching loads tend to be so high, and expectations so unreasonable, that there’s no room to do the teaching well. This is the land of the adjunct who leaves class 15 minutes early because she has another course to teach at a different school on the other side of town. The pay is awful—$3,500 per course is typical. Short-term, contingent contracts are the norm, and basic benefits are usually not provided.
When you land in Academia Z, you’re basically not a professor anymore. If you’re a teacher, you’ll be an adjunct and have to take a 5-5 load to survive. If you’re a researcher, you’ll complete not an elite postdoc but an ordinary one for which the prize is… another postdoc.
Farisa, notably, is Academia Z in the weeks leading up to April 26. The results are usually not as… unignorable, though.
It’s tempting to write Academia Z off entirely. For-profit universities, the epitome of Academia Z, are disliked for good reasons. There are also ethical issues of low teacher autonomy, such as when textbook selection (and don’t get me started on “homework codes”) is out of the instructor’s control, due to graft involving contracts with publishers. Nevertheless, it is mostly the case that people who fall into Academia Z wanted to be—and still want to be—serious teachers and researchers.
Academia Z must exist, because Academia M does; the latter leads to overproduction of PhDs who will not get proper academic jobs, and those who cannot score decent corporate jobs (corporate jobs are numerous; decent ones are extremely rare) must go somewhere. These people legitimately can say they teach at colleges. But they’re… not… really… professors. They are in higher education, sure, but their teaching loads are so high that (a) research is likely a pipe dream, and (b) those who went into it to be teachers are chronically upset because they do not have the support, or resources, or time to do teaching well.
I don’t want to end on a dark note, so I’ll say that Academia Z is not all that bad. These people absolutely deserve better pay and working conditions; it is unacceptable how they are treated. But they also deserve more respect—they probably do more good for society than the other four academias combined, if only because they are accessible to people for whom traditional academia is not. Chris Hedges taught prisoners—the Z-est of Academia Z—and said they were his best students. I believe him. One of the things that it’s easy for a young person to believe is that success on various totem poles is 10% socioeconomic and 90% merit; as you age, you learn that it’s the reverse.
One of the underpublished truths about ancient China is that its exam-based meritocracy—arguably one of the most successful failures, or one of the most failed successes—contributed more to its society indirectly than directly. The noted problem is that it did not always select the best people for leadership opportunities—if ancient poetry becomes an object that must be memorized in order to win a contest, it loses its cultural value as poetry. China did not escape, because no society has escaped, the facts that: power-seeking humans should not be trusted; those skilled at having themselves selected for important roles are usually terrible once in them; and measuring either intellectual capacity or moral diligence is difficult. So why were the exams so crucial to progress in the society? The inner elite, the top-of-the-top, produced next to nothing of use—it never does. The real productive creativity came from the fringe, the people who passed seven out of nine exams but failed the eighth and could not afford to retake it. They became tutors for other aspirants; they were considered by society—and probably considered themselves to be—failures. Yet they were the ones who, using what little money this undesired but survivable landing afforded them, produced real art and real science, probably in obscurity, after work. Higher-placed people generally have a knack for taking ideas that come from below, synthesizing them, and packaging them so they can actually get support, but those who understand humanity and history know that 95 percent of the real progress comes from odd places. Imperial Chinese societies weren’t an exception, and neither is ours.
Conclusion?
One book I greatly enjoyed was Paul Fussell’s Class, a humorous analysis of the U.S. socioeconomic class system that, like most analyses of human behavior, faces the problem of endgame. We structure these tours as stories: Academia E is what undergraduates believe universities are about; Academia R is what graduate students want to believe they are about; Academia G, as the one with all the power, holds the cards—we introduce the monster late, in the third act out of five, an unorthodox move in storytelling. Our fourth and fifth visits are to Academias M and Z, so obscure that most people do not know they exist, although they are everywhere if one knows where to look. And so…? How do we end a story such as this, on a note a reader will not detest? There is no tragic figure; there is also no hero. There can be no glorious exit.
I am sure Fussell was so depressed by his shared journey with the reader that he—otherwise a talented and serious man, and it is not a mark against him that he did so, for I have done the same in other fields of life—invented nonsense: Category X. An “X way out.” It doesn’t exist. Don’t look for it. The topic of social determinism is so embittering that he felt compelled to give the reader a big helping of frozen light, so he posited that those whom we would recognize as “cool”—the academics, the misfits, the artists and intellectuals—had, in some spirit of rebellion, using the power of friendship, exited the class hierarchy. Would that it were so. As Trotsky said, you may have no interest in war, but it takes interest in you. You can forget class, but class won’t forget you.
In Fussell’s case, this name of “X” was later assigned to the generation after the Baby Boomers, and lives on plus-two to denote yet another generation for which we had no good name. Of course, neoliberal slugpeople run the world—it cannot be seriously denied—and this “X” or this “cool” was as short-lived as any real rebellion; it got folded into the broader phenomenon of bourgeois bohemianism—the axis of insufferability.
There is no Academia X. Fussell’s work is fantastic, but his mistake has cost us that letter. There might be a secret sixth academia undiscovered by the analysis here—I can’t rule it out—but there are diminishing returns to the search. The real question, on which I must end because I have no real answer, is far more important, and that question is:
Who’s gonna fix this shit?