
I attended the convocation this week for students graduating from the Bachelor of Arts. My bailiwick! I went because a critical mass of students I’ve taught over the past several years were crossing the stage.
Graduation ceremonies are always, for me, studies in contradictions. There is, for example, something at once august and risible in the pomp and ritual of it all, with the costumes and robes and various staffs and maces borne by the holders of high academic office, the solemn bestowal of degrees both earned and honorary—faintly ridiculous medieval cosplay, but also a gesture to a long tradition and a reminder that education is an endeavour of high seriousness. Then of course there are the addresses delivered by a succession of speakers that range from inspiring to sententious to simply tedious. Tedium, indeed, is a necessary aspect, but tedium imbued with the excitement of triumph and celebration. The endless reading of names, but some of those names are those of friends or family whom you want to cheer—and quite possibly your own name. Or in my case, the names of students, the sight of whom crossing the stage fills me with pride.
The collection of speeches delivered at any given convocation ceremony is like Forrest Gump’s proverbial box of chocolates. Will we be stirred? Moved? Will one or more of the addresses tip over from engaging to overlong, from overlong to endless? How many inspirational clichés will we dab on our mental bingo card? How many surprising nuggets of wisdom will we file away?1 Sitting there, overheating in my academic regalia,2 I often find myself wondering what I would say to address a given graduating class. Sometimes, when we’re in the midst of the army of graduates crossing the stage, I start to compose a speech in my mind.
I didn’t do that this year. Or rather, I didn’t imagine addressing the graduating students. At this point, anything I might say feels redundant: I’ve had so many extraordinary students pass through my classrooms these past few years, any advice or guidance I might offer has already been given. Watching my various students cross the stage, I knew the kids were all right. Anything more I could tell them would be fatuous.
Which isn’t to say I wasn’t mentally composing words. But I was thinking: what would I say not to outgoing graduates but to incoming students? And not just at my institution, but to anybody taking the first steps of the undergraduate adventure?
What would such an address be called? “Commencement” would be entirely appropriate, as it deals with starting out, commencing one’s university degree; but of course we use that very word for speeches given at graduation, emphasizing the ceremony as marking not just the end of something, but the beginning of a new phase of life. So I suppose we could mark all sorts of life’s watershed moments as commencements, but that would get confusing. Best keep the term where it is.
Then I thought … invocation? On one hand, it sounds good because of its aural similarity to convocation. On the other hand, there’s the pesky matter of meaning: an invocation calls upon a higher power in prayer or supplication. Not perfect. On the other other hand,3 anything I would have to say to incoming students is going to venture into making an argument for intangibles—for the value of creativity and curiosity and being open to the experience of the sublime … not a higher power per se, but not not a higher power. Which is close enough! “Invocation” it is.
Also, what’s the point of having a PhD in English if I can’t occasionally bend words to my will?4
So, sing in me Muse …
I suppose I should address you as the “Class of 2029.” That’s standard—we count four years from your start date to when you’re expected to graduate. That probably holds true for a critical mass of you, but not everybody. Some of you will fast-track by taking summer courses; some will take an extra year or more, because sometimes life gets in the way; some of you will decide this isn’t for you and drop out; some of those people will circle back around after time away and try again.
(Also, some of you will do postgraduate work and rack up degrees like you’re going for the high score. Some of you, like me, will never leave the university.)
These are all entirely valid and valuable ways to do higher education. Don’t let anybody tell you otherwise, and don’t feel you have to hew to somebody else’s ideal schedule. Or even your own ideal schedule—as I said, life sometimes gets in the way, and sometimes that means your original plan to study criminology, or biochemistry, or geography, or business administration just wasn’t working for you and you need extra time to find your niche. Even if that niche entails leaving university to learn a trade or found a startup or hop on a tramp steamer.5
But I’m talking about university and the time you spend there. And just to put my cards on the table, I’m talking about university from the perspective of a literature professor, so my apologies if this is a humanities-centric invocation. Well, actually, no apologies—as long as I’m here I’m going to make a case for the vital importance of the humanities, even (or perhaps especially) if you’re studying STEM (science, technology, engineering, and mathematics). We use the word “silo” a lot these days to describe the way contemporary culture, politics, and the economy tend to atomize people and groups, walling them off into enclaves. For all the clichéd talk of “ivory towers,” the modern university was specifically designed to resist the creation of silos. When you’re selecting your courses and wondering why your university is requiring you to take a humanities or social science credit, or to satisfy a language requirement—or, if you’re in the humanities, why you might have to take a representative science6 course—well, that’s a vestige of when you studied everything on the premise that a well-rounded education in all areas better equipped you to contribute to society. Not just economically, but culturally and politically as well. A well-educated populace, the theory went, made us better citizens, and better citizens made for a better society.7
Of course, a lot has changed since then,8 not least being the simple fact that the exponential growth and specialization of knowledge in all fields means that it’s no longer feasible to study a substantial bit of everything. There’s just too much! So the requirement that you take a small number of courses outside your major is, as I said, vestigial—a holdover, a gesture toward the mission of the university as conceived a little over two hundred years ago.
So why still have the requirement? What’s the value in taking courses outside your field when it’s just a sort of symbolic gesture?
Well, one reason is that we’re slow to change in the university. It’s why we still do convocation ceremonies like we’re medieval Oxfordians. But we take the symbolism of graduation regalia seriously because we take symbols seriously, and if we can’t do a well-rounded education like we could in years past, we’re still going to point to it as an ideal.
None of which is to suggest, however, that there’s no value in required courses outside your area of interest. It is not at all uncommon for students to have their world rocked by a class they took because they had to—a class, perhaps, that they had no interest in whatsoever and enrolled in only because it fit their schedule. As Bilbo said to Frodo, it can be a dangerous business stepping out your front door … you never know where that path is going to take you.
As an example from personal experience: at Memorial University, my institution, students are required to take at least one first-year English class. This means I have taught a good number of those classes, which always feature a very diverse cross-section of interests, many if not most of which have nothing to do with English. I won’t lie: those classes can sometimes be a slog, both for the students and for me. And while I have been known to joke that Memorial established this requirement on the principle of the benefits of shared trauma, in truth a lot of my most rewarding teaching experiences are in those classes—classes that often comprise a critical mass of students utterly indifferent to the subject matter at best, or actively hostile and resentful at worst.
But here’s the thing: that critical mass of students is indifferent or resentful or hostile at first. For some that doesn’t change. But many surprise themselves by finding something in the class or in the reading that strikes a nerve or flicks a switch (choose your metaphor). I see it happen. They don’t switch their major, perhaps, but they ask me to recommend other English courses they can take as electives, or ask for reading lists of books similar to those that stirred them. Some make English their minor. And some—typically, those who haven’t yet declared a major—do in fact make English their major.
This is not, to be clear, a universal experience. But you should go into your required courses with an open mind. And if you’re one of those many students who arrive at university uncertain of what you want to major in? Be adventurous—enrol in classes that interest you for reasons you can’t identify. Don’t pass over classes because you have the vague sense they won’t be “useful.” Ignore that voice in your mind that says “but what can you do with that?” or “how will that help me get a job?” Take a chance. You never know what’s going to start a fire in you.
And that brings me to something that isn’t so much advice as a general rule that I try to inculcate in all my students: curiosity is the quality that best predicts how much university benefits you. Curiosity and its kissing cousin, receptivity to the new and unfamiliar. A little secret: the people from whom you’re taking classes are for the most part massive nerds who love nothing more than to nerd out about their areas of specialization. If you follow our example and nerd out about the new stuff you’re learning, you add exponentially to your possibilities here, and you’ll find in your professors eager allies and enablers.
The flip side: there is no greater predictor of a negative, cynical university experience than viewing the process as purely transactional, an investment of time and money designed to yield a decent ROI in the form of a well-paying job.
Let me be clear: making you employable is absolutely one of the university’s critical priorities. But it’s not the only one, not by a long shot. If you make job training your sole consideration, you’ll be closing—one might even say siloing—yourself off from the myriad benefits of doing a university degree. What’s more, if job training is your only concern, the lion’s share of courses you take will appear to you to be pointless. There are two reasons for that: first, the greatest educational benefits of many courses of study are intangible; second, and related to the first, those intangibles do in fact make you eminently employable, but we don’t necessarily spell that out for you.
Or as I like to tell my students: you will never have a job interview in which your prospective employer slides a piece of paper across the table and says, “Well, that was all very impressive and your resume is solid, but what we really need is for you to analyse this sonnet for us.” Understanding conceit and metaphor and the difference between Shakespearean and Petrarchan rhyme schemes, or what the blazon has to tell us about figurations of gender … yeah, these aren’t job skills for anyone but poets and poetry teachers. Except that all the steps we put you through in reading and analysis and building an argument around a central thesis and writing an essay to communicate all that? Those are bankable skills, and in the process we’ve also taken ambles and hikes through a host of poems and novels—or the work of various philosophers, or historical events and eras, or myth and folklore or popular culture, or another of the many areas available to you—that expand your understanding of the world more generally.9 And, perhaps more importantly, your understanding of yourself.
There’s another crucial element to what I just described. Curiosity and receptivity are your starting points; but doing the work is the necessary process. Part of the problem with the increasing tendency to see a university education in transactional or instrumental terms is that it places the emphasis on the end result—the degree that is supposed to buy you access to a well-paying job. When you see your four years (or however long you take) from that perspective, it becomes easier to see the credits that get you the degree as just notches to tally on a transcript—at best some interesting distractions, at worst annoyances to be endured. The reality is that the end result is incidental; the substance of what you do is in the work itself. The work is what matters.
What do I mean by “the work”? Well, let me share my humanities-centric model.
“The work” entails reading the readings, puzzling through them, throwing them against the wall in frustration when the ideas or prose or both are opaque, then picking the book up again and soldiering on; writing furious notes working through the knottier parts, then going to the professor with a bunch of questions; going into the physical library and finding not just the books you were looking for, but also a dozen others sitting adjacent on the shelf, and sitting there on the library floor riffling through tables of contents and introductions and indices, finding stuff you hadn’t been looking for; reading the footnotes and bibliographies and chasing down those references; writing bad essays, rewriting them, writing better essays, talking to the professor more, talking about the course material with fellow students; falling in love with an author or thinker or school of thought, obsessing over ideas, reading more outside of the course material, making lists of books to read over the summer (and actually reading them); then doing it all again the next year and the year after that, and the year after that. Rinse and repeat until you hear your name called and you cross the stage at your convocation.
There are also tests and exams and oral presentations and (shudder) group projects. There are classes you love and classes you hate, classes that you expected to love but didn’t, classes that you dread but end up loving, classes taught by professors who inspire you and classes taught by professors you fantasize about garrotting with a length of piano wire. But all of them, from the most difficult to the easiest, from the most edifying to the most tedious, will have the salutary effect of expanding your mind and your vocabularies in direct proportion to the amount of effort and engagement on your part.
Again: it’s the process. It’s the work that matters. And that brings me to the seven-fingered fuchsia elephant in the room. By which I mean generative A.I.—ChatGPT, or Gemini, or CoPilot, or Grok, or whatever. It’s a problem. Not because it’s a form of cheating (which, make no mistake, it totally is), but because it allows you to bypass the hard stuff—that is, thinking and writing. If education is about the process, if the work is what matters, then pawning off your assignments on a chatbot means you’ll have just spent your four(ish) years and a ton of money on a meaningless piece of paper.
It's like paying somebody to use your fitness tracker.10 You can claim their stats as your own, post their sets and reps and impressive runs to social media and receive all the kudos for being so healthy and active … but you still get winded walking up stairs.
Indiscriminate use of A.I. will appeal to the transactionalists—if all that matters is piling up credits for the degree at the end of the rainbow, why not make use of this new tech? Especially when its advocates loudly declare that anybody who doesn’t get used to using A.I. is going to be left behind. Or as the advertising for the A.I. app Cluely claims: if everybody’s cheating, nobody is.
Yes, they actually say that.11 As a selling point. Except that if everybody’s cheating … everybody’s cheating.
But here’s the thing about the claim that if you don’t get down with A.I. you’ll be left behind: it makes no sense. The whole point is that it’s supposed to be easy, frictionless. It does the hard work for you. It’s not like it requires a skill set like coding, or writing, or thinking. “But you have to be able to refine your prompts!” Sure, and that’s a learning curve measured in hours, not a skill set built over years.
Allow me in my concluding words here to appeal not to your better angels, but to your naked self-interest. Remember what I said about how the intangible benefits of a university degree are inextricable from eminently employable skills? A.I.’s entire game is that it obviates those skills. It circumvents them: it does your intellectual sets and reps and posts imaginary stats while your actual intellect, your creativity, and your curiosity all atrophy.
Which means that if you’re the person who did the work and one day find yourself in a job market surrounded by people who earned their degrees with ChatGPT-generated gruel, you have the advantage. Because you didn’t let your mind atrophy. And when the time comes to lift the heavy thing or sprint up the stairs, you’ll do it without wheezing or herniating a disc. You spent your four(ish) years learning to think, and guess what? You won’t stop—it’s a hard habit to break. You spent your four(ish) years reading deeply and widely, puzzling through complex ideas with your own brain, and learning how best to articulate your thoughts. That matters, and it will matter going forward—because the work matters, because human thought and creativity and curiosity and what Northrop Frye called “the educated imagination” all have profound intrinsic and ineffable value.
And with that, Class of 2029, I invocate you into your university adventure. Go forth, and become good citizens.
NOTES
1. Always allowing for the fact that the clichés and the nuggets can be one and the same, depending on whose ears they fall upon—hearing a cliché for the first time can be revelatory.
2. I wear a generic black robe and a hood provided by my university, Memorial, and not my alma mater’s doctoral gown. I often think it’s time to bite the bullet and just get them already, but the University of Western Ontario’s regalia is … not great. The colours are blue and purple, which would be fine—lovely, even—if the latter were just a few shades darker. As it stands the colour scheme is, even to my unfashionable eye, unpleasant.
3. Yes, that’s three hands. I run a mean game of three card monte.
4. Insert reference to Humpty Dumpty here. You know the one.
5. Is that something people still do? Could I still do that? Asking for a friend.
6. One of my favourite courses was a science requirement: “The Nature and Growth of Scientific Thought,” which, to be fair, was basically a history course—starting with Bronze Age cosmology and working up through the ancient Greek and Roman world, through to Copernicus, Kepler, Galileo, Newton, and all the breakthroughs of the 18th and 19th centuries … ending with Einstein. It was an amazing course and nurtured what has become a lifelong fascination with the Copernican Revolution. And while, yes, it was as much a humanities course as it could be while still technically being about science, it exemplified the ways in which such distinctions aren’t nearly as rigid as we tend to think.
7. This breakdown is a thumbnail sketch of the history offered in the first chapters of Bill Readings’ The University in Ruins (1996). Part of Readings’ thesis is that the modern university has abandoned the mission of creating well-educated citizens and instead substituted the nebulous concept of “excellence” as its primary product, shifting focus to a more purely economically instrumentalist model in which cultural and political literacy is shouldered aside in favour of maximizing graduates’ earning potential. No longer concerned with “citizens,” in other words, so much as with taxpayers.
8. “Then” in this case meaning the late 18th and early 19th century, which is when the modern idea of the university was first essayed in Germany by Wilhelm von Humboldt, based on the model outlined by Immanuel Kant in his book The Conflict of the Faculties (1798). (Again, this is cribbed from Readings. Really cannot recommend that book enough; it was published thirty years ago and is still painfully relevant.)
9. OK, here’s where my humanities bias asserts itself most obviously.
10. Actually a thing that people do, amazingly, something my wife Stephanie talks about in a recent post on her Substack Discon/Net.