If your college-bound teen boldly announces plans to study poetry, philosophy, or art history next fall, don’t panic. All around the country, forward-leaning professors are breathing digital life into the ancient practice of interpreting art and culture. Along the way, they are trying to diminish the parental fear of ending up with a college-grad-turned-barista living in your basement.
At the University of Southern California, for instance, Tracy Fullerton, an award-winning game designer and chaired professor at the School of Cinematic Arts, created "Walden, a Game," with a team of scholars and coders. The gamer world has heaped praise upon the effort. It challenges players to live through a virtual recreation of Henry David Thoreau's life, trying to survive while deepening their understanding of Thoreau's work.
Meanwhile, at the Metropolitan Museum of Art in New York, art-history students from Barnard College snap high-resolution pictures of 18th-century French artifacts held in "period rooms" behind red velvet ropes. Back on campus, the students collect and edit the images, highlighting the ways in which emerging ideas about individualism shaped people's private spaces. It's part of a course called "A Virtual Enlightenment," taught by Anne Higonnet, an inventive professor of art history at Barnard and Columbia University. Through the careful use of digital media, the course examines complex issues at the intersection of class, gender, economics, philosophy, and art.
Projects like these use technology to do what the humanities have always done: contemplate the ways — sometimes disturbing ways — in which human beings experience one another and the world. "If there are ways to embrace technology in teaching, you'd be hard pressed to find a discipline stronger than the humanities," says Sharon Marcus, a former dean of humanities at Columbia who is on leave this year at the Radcliffe Institute for Advanced Study.
The humanities, which lie at the core of the traditional liberal arts education, desperately need a jolt of new energy. College course catalogs are filled with offerings like “The 18th-Century Novel” that no longer draw much of a crowd. But it’s not just that these courses are stale or out of fashion. The periodic fights over whether literature professors deploy too much arcane theory or off-putting jargon aren’t really the issue, either. More fundamentally, in a volatile higher-education marketplace, there is a widespread perception that the humanities are simply out of touch with the needs of a tech-driven workforce.
The result has been a protracted decline. The proportion of students awarded bachelor's degrees in the humanities has dropped to 6 percent from its peak of 17 percent in 1968, according to a recent Hechinger Report. Instead, educational policy makers, business groups, legislatures, and parents have all been pushing students into the STEM disciplines — science, technology, engineering, and math.
Some of the fall-off relates to broader problems within our higher education system. Large public institutions rocked by declining enrollments have been forced to take drastic steps to balance their budgets. Most recently, the University of Wisconsin–Stevens Point announced plans to cut 13 majors, including English and history, and shift dollars toward more "job oriented" fields of study.
Nearly 30 years ago, when I graduated with a BA from a small liberal arts college, things were simpler. Technology was gaining steam, but the English major was still seen as a rigorous training ground for anyone who wanted to be a solid communicator. Now, as a parent of two teenagers on the cusp of applying to college — and as an employer looking for workers who can thrive in an increasingly digital world — I question whether I would recommend the humanities as a major to my kids. And that makes me cringe.
But I'm more upbeat when I hear about efforts to modernize literary scholarship, such as Tufts University's Perseus Project, which has digitized and cataloged countless ancient texts. This sort of large-scale collaboration is a fine example of what we might call the neo-humanities. It bands together small armies of scholars from around the world using state-of-the-art tools to build and analyze massive archives. The approach draws from other fields, such as computer science, to shed new light on old questions.
Just because an academic field is timeless doesn’t mean it should never change. If the humanities are ever to enjoy a true resurgence, it will come as a result of a reinvention that embraces a fresh new take on old disciplines.
* * *
Liberal arts colleges have traditionally been seen as the place where true knowledge — as opposed to, say, Google knowledge — is acquired. It started with Erasmus in the 15th century. This is where the humanities were born, according to Vassar College English professor Robert DeMaria Jr. "Erasmus had a preference for understanding and interpreting the classics through Augustan Latin, and it's this deep appreciation for classical studies that English studies began to emulate in the 19th century, and we still live with it today."
To some extent, this preference for the classics still permeates the whole model for how the humanities are taught. In the literature field, DeMaria points out, there's a whole industry in producing new editions of older works. "It's amazing to me how well this work still pays from publishers like Cambridge University Press." DeMaria maintains that some professors who want to make extra money and stay on the tenure track can be pulled back toward traditional pursuits of reinterpreting the classics — even if they're not that interesting or relevant to today's students.
The pressure on traditional humanities departments comes at a time when philosophy, ethics, arts, and science are converging. Just listen to pretty much any of the recent TED talks about artificial intelligence. Computer scientists such as Stuart Russell at the University of California at Berkeley are grappling with existential issues surrounding how computers and humans will become intertwined in the coming years. "Eventually," Russell has argued, "AI systems will make better decisions than humans." But he also cautions us to be very careful about what we're asking the machines to do. We'll need to develop not just the technical skill to build better technology, but also the moral and ethical reasoning to guide how society deploys it.
In this context, the humanities should be thriving — not receding. Folks like Russell, who are among our greatest minds in computer engineering, benefited from precisely this sort of classical education at places like Oxford and Stanford. Their understanding of computers has advanced the pace of technological change; but it's their understanding of our humanity that allows them to deeply probe and construe the implications of their work for society.
* * *
If technology brings new challenges, it also creates new tools for humanities scholars.
One big step forward might be scholarship in the burgeoning field of digital humanities, a relatively nascent effort to use computers to interpret text. These efforts are still young — a couple of decades, which counts as young in academy time — but they allow students to work with technology and analytics software to find deeper meaning in art and literature. That's a step in the right direction, because such projects bring together a cross section of scholars from STEM and the humanities to learn together.
Consider a recent PhD candidate at the University of Iowa who used data modeling and visualization software to analyze emotion in the works of Cicero. At the heart of the work was an attempt to visualize patterns in the language Cicero would use with various groups of friends. These sorts of efforts not only bring students and researchers closer to the classics, but also offer them practical skills for understanding how to use sophisticated software that might help them in a range of jobs.
Unfortunately, these approaches have come under fire from the ivory tower. In "The Digital-Humanities Bust," a recent Chronicle of Higher Education opinion piece, professor Timothy Brennan claims that "for all its resources, the digital humanities makes a rookie mistake: It confuses more information for more knowledge." Purists like Brennan dismiss these efforts as fruitless, and they fail to grasp the broader implications of bringing the study of the humanities into a more relevant context for the next generation of students.
This line of reasoning represents a broader sentiment among many of those in the humanities who still see their field as the last bastion of true education — in other words, education that transforms individuals by opening their minds to the depths of what it means to be human. These folks tend to cast more practical pursuits, such as science or business, as less impactful than the humanities. When employers or elected officials — or media commentators like me — urge a change of course, the reaction among sitting professors is often defensive.
“There might be an inherent conservatism that’s stronger in the humanities than it is in the sciences,” admits DeMaria.
I know a little bit about declining industries because I’ve spent my entire professional life working on bringing traditional publishers into the digital age. The hardest part is staying true to your heritage without getting stuck in the past. There’s a temptation to keep doing what you’ve always done and justify it as your “unique value,” while viewing newfangled adaptations as fads that only water down your standards. In scholarly pursuits, these attitudes can span generations, as schools that are well stocked with traditional humanities professors might be tempted to preserve old practices when minting new PhDs.
Yet the leaders in the field are embracing new approaches. Marcus offers tough love when she says, “We need humanities professors who aren’t afraid of saying that their classes teach practical skills.” She adds: “It’s elitist to say that practical skills don’t matter to the life of the mind, or that art transcends practical considerations. For some, art matters because it’s disinterested and theoretically outside of market forces, but, as many humanists have shown, that’s a myth.”
Marcus dismisses the kind of skirmishes captured in Chronicle articles as insignificant feuds — the variety you might find in any profession. At the same time, she adds that “while I’m a booster of the humanities, we need to do a better job of helping ourselves.”
* * *
Let's be clear: Helping themselves means arresting the decline and regrowing student interest in their disciplines. That might seem a long way off, but it's not impossible. The rebirth will surely come at the intersection of professors and programmers who meld scholarship with new ways of learning and imparting cultural values.
We have the tools to rejuvenate the study of art, history, and philosophy. But the establishment must have the will to change, or risk losing all of its students. At that point, we all lose. Higonnet, the art professor at Barnard and Columbia, reminds us that "in a democracy, the role of the humanities is to instill a sense of civil society."
There's just too much at stake to let the humanities continue to erode. We're already seeing the impact of a world too easily infected by the crosscurrents of "fake news" and false perceptions forged on Facebook pages with suspect origins. If we want the university to play a crucial role in creating citizens and nimble thinkers across a wide swath of students, the time is ripe for an approach to the humanities that engages everyone in a search for truth and knowledge, but that's rooted in today's world.
Our own humanity might just depend upon it.
Josh Macht is group publisher for Harvard Business Review and head of product innovation for Harvard Business Publishing.