Unlike many professionals, educators tend to experience time as more reassuringly cyclical than unsettlingly linear. Along with exasperating heat, August calls forth syllabus season once more: the chance to craft or revise the curricular path ahead.
Alas, for liberal arts faculty like me, the algorithm — and the culture it represents and instills — has become the enemy of our syllabuses. If the syllabus is both a metaphor and a map for what college education is supposed to accomplish — an exercise in broadened, critical thinking — then the algorithm, which is also a metaphor and a map, offers a much narrower view.
Take TikTok, which is just “America’s Funniest Home Videos” with better data.
Peeking over students’ shoulders at their smartphone screens hints at, but underreports, its phenomenal pervasiveness. According to one youth market research firm, about 20 percent of Generation Z say they spend two hours a day on TikTok, with an almost equal share reporting five hours or more. Half characterize their platform habits there as “addiction.”
That triumph — from TikTok’s perspective, at least — was secured because the app does what algorithmic recommendation systems do best: give the user more of what they already like.
A New York Times exposé of internal company documents detailed how TikTok machine-learns based upon likes, comments, and video play-lengths — individually and as mapped against unfathomable volumes of aggregated data — to microtarget and maximize revisits and time spent on-site. Its secret sauce seems to be in delivering content based upon an “interest graph” (i.e., replicable features of the content itself) rather than a “social graph” (feeding you content based upon the prominence and proximity of your network of creators).
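The distinction can be made concrete with a toy sketch. This is not TikTok’s actual code — the tag names, scoring functions, and data shapes below are all hypothetical — but it illustrates how an interest-graph ranker scores content by feature overlap with a user’s engagement history, while a social-graph ranker scores by the creator’s place in the user’s network:

```python
# Illustrative sketch only (not TikTok's implementation): contrasting
# an "interest graph" ranker, which scores a video by how much its
# content features resemble what the user already engaged with, with
# a "social graph" ranker, which scores by who made the video.

from collections import Counter


def interest_score(video_tags, engagement_history):
    """Score a video by overlap between its content tags and the tags
    of videos the user previously liked or watched to completion."""
    liked_tags = Counter(tag for video in engagement_history for tag in video)
    return sum(liked_tags[tag] for tag in video_tags)


def social_score(creator, followed_creators):
    """Score a video by network proximity: 1 if the user follows the
    creator, 0 otherwise. (A real system would weight by closeness.)"""
    return 1 if creator in followed_creators else 0


# A user whose watch history is dominated by cooking content:
history = [{"cooking", "pasta"}, {"cooking", "baking"}, {"travel"}]
followed = {"friend"}

# Candidates: (video id, content tags, creator)
candidates = [
    ("cooking_reel", {"cooking", "pasta"}, "stranger"),
    ("friend_vlog", {"travel", "vlog"}, "friend"),
]

# Interest-graph ranking ignores who made the video entirely: the
# cooking clip from a total stranger wins on feature overlap alone,
# while social-graph ranking would surface the friend's vlog instead.
by_interest = max(candidates, key=lambda c: interest_score(c[1], history))
by_social = max(candidates, key=lambda c: social_score(c[2], followed))
```

The point of the contrast: an interest-graph system can keep feeding the user more of what they already like without any human connection mediating the choice, which is precisely what makes it so effective at maximizing time on-site.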
The problem for syllabus makers is not necessarily having to compete with the quality, or even quantity, of that content. Many a moral panic has been ginned up over the years about youth pop culture obsessions castigated initially as “trash” that later ascend to canon; that’s a judgment of life-phase taste and a war that postmodernism long ago won.
The problem for syllabus makers is the medium, rather than the message, of the algorithm.
An algorithm exists for efficiency and prioritizes prior preoccupations: It’s a straight line of a rabbit-hole information journey. A syllabus expresses expansiveness, indulges digressions — and, ideally, counterfactuals — and intentionally arranges the unexpected: It’s a meandering walkabout around the Walden Pond of the mind.
A syllabus is, I suppose, “social graph”-driven content, curated by an enrolled student’s network connection of one, the professor. And it is adaptable, but not at the speed we’ve been trained by — and come to expect of — algorithmic culture. It therefore feels already out of touch by the time the ink dries in the printer room.
As part of a larger liberal arts curriculum, the syllabus does not do the one thing that an algorithm must, above all, do in order to arrest attention (and thereby reap the parent company billions of dollars): give the user more of what they already like.
A college’s core curriculum is, indeed, fundamentally antithetical to algorithmic values. But that’s a feature, not a bug, in the programming.
“Today, information, entertainment and connection are delivered to us on a conveyor belt, with less effort and exploration required of us than ever before. A retreat from the rituals of discovery comes with a cost,” Senator Chris Murphy recently explained about a bill that would prohibit the collection of children’s data to fuel those effective algorithms. “It’s in the wandering that we learn what we like and what we don’t like. The sweat to get the outcome makes the outcome more fulfilling and satisfying.”
Anecdotally, I sense this when students struggle to zig and zag with contrasting content in a way that an algorithm would never demand of them. Take, as an example, a Media Theory course I teach, in which we survey inherently contradictory ideas about whether the media exerts deterministic influence over our lives or whether we retain empowered agency as audiences.
The course presents a particular theory each week, and if a student finds it compelling and enjoyable, they don’t get more of it; they get the opposite. No logical digital rabbit hole would ever be designed in this fashion — it would repel users. Hence, their puzzlement: “This seems incompatible with what we learned last week.” Indeed! But that doesn’t make it invalid.
It’s not that my students aren’t brilliant and capable of navigating intellectual ambivalence — far from it — but that’s not an exercise their mediated world trains them in.
To be sure, “Why do I need to know this?” is a longtime student gripe, whether they’re encountering the classics, calculus, or communication concepts.
What is new is the environmental conditioning external to our syllabuses: that slipstream of social media content tailored, as TikTok does, to an audience of one — and one alone. That precludes common conversation and therefore community cultivation. As it happens, that’s also what ails American society and democracy.
None of this is to say that algorithms are not absolutely vital for navigating a digital landscape of boundless content: terabytes piling atop terabytes as computing capacity inexorably expands. They are necessary tools, but just as we shape our tools, they in turn shape us, and often toward unreflective ends.
Algorithms imprint their logic on our habits of mind and our ways of being; the syllabus offers an analog technology of resistance.
A central moral and philosophical battle for our AI-aided era is to reaffirm what remains human, even as the machines overtake that which can be intellectually outsourced. Not coincidentally, the flourishing of a well-rounded, conversant human being happens to be precisely the ambition of a liberal arts college education.
Sociobiologist Edward O. Wilson once wrote, “The world henceforth will be run by synthesizers, people able to put together the right information at the right time, think critically about it, and make important choices wisely.” Unless you’re the coder, an algorithm does not teach you how to learn; a syllabus can, nurturing skills like thinking creatively, arguing persuasively, solving problems, and making wise decisions.
To be an educator is to retain an unshakable faith in human curiosity — not least from corners of knowledge we didn’t expect to enchant us. As we draw up fall syllabuses, let that remain the open-minded inspiration against the narrowing pressures of an algorithmic “ideal.”
Michael Serazio is an associate professor of communication at Boston College and the author of the forthcoming book “The Authenticity Industries: Keeping It ‘Real’ in Media, Culture, and Politics.”