
An AI opera from 1987 reboots for a new generation

At MIT, Tod Machover’s ‘VALIS’ receives its first staged production in over two decades

MIT graduate and production assistant Nina Masuelli holds an instrument she helped create; its sensors transmit her input to an AI program, which converts it into sound. This and other AI-assisted instruments will appear in a new production of Tod Machover's science-fiction opera "VALIS." Lane Turner/Globe Staff

Near the end of Tod Machover’s 1987 opera “VALIS,” a “wild-looking” composer named Mini performs an unusual solo piece on an artificially intelligent instrument. “Mini appears to be sculpting sounds, setting off musical structures with the flick of his hand — he seems to be playing the orchestra of the future,” reads the libretto, which is based on Philip K. Dick’s 1981 novel of the same name.

During the opera’s initial performances starting in 1987, Mini was portrayed by Machover himself, who also devised the concept of “hyperinstruments” for the opera: electronic instruments that can tell what and how someone is playing, and embellish it. “It adds things to the performance, as layers,” said Machover in his office at the Massachusetts Institute of Technology Media Lab. For example, “it could be that I’m playing a monophonic line and it gets orchestrated.”
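To make the "layers" idea concrete, here is a deliberately simple sketch, purely illustrative and not Machover's actual software, of how a hyperinstrument might take a single monophonic line and add orchestration on top of it. The note numbers, layer names, and velocity scaling are invented for the example.

```python
# Illustrative sketch only -- not Machover's hyperinstrument code.
# It mimics the basic idea: listen to a monophonic line (here, MIDI-style
# note numbers and velocities) and add orchestration "layers" on top.

def orchestrate(note, velocity):
    """Return the played note plus hypothetical added layers."""
    return [
        ("solo",   note,      velocity),              # what the performer played
        ("octave", note - 12, int(velocity * 0.8)),   # doubled an octave below
        ("fifth",  note + 7,  int(velocity * 0.6)),   # a quieter fifth above
    ]

# A short monophonic phrase: (MIDI note, velocity) pairs.
phrase = [(60, 96), (62, 90), (64, 100), (67, 110)]

for played_note, played_velocity in phrase:
    for name, n, v in orchestrate(played_note, played_velocity):
        print(f"{name:>6}: note={n:3d} velocity={v}")
```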

So what kind of hyperinstrument did he use to set off music “with the flick of his hand” when “VALIS” premiered at IRCAM, the computer music laboratory in Paris where Machover served as director of musical research in the 1980s? The technology didn’t exist then, so the “orchestra of the future” was prerecorded. “The Mini solo instrument was basically faked,” he said with a grin.

But in next weekend’s live production of “VALIS” at MIT directed by Jay Scheib — the first new production of the opera in over 20 years — artificial intelligence is no longer the stuff of science fiction, and neither is Mini’s instrument.

Composer Tod Machover in the MIT Media Lab. AI-assisted electronic musical instruments created in the Media Lab will be featured in a new production of his science-fiction opera "VALIS." Lane Turner/Globe Staff

Machover, whose shock of gray hair lends him the appearance of the archetypal mad scientist, had been planning to reprise the role of Mini. He’d been working with the students in his group at the Media Lab — Opera of the Future — to develop the concepts for the production, and they’d agreed that operating a hyperinstrument by waving one’s arms in the air was a bit cliché, he said.

Then, when recent MIT graduate Nina Masuelli came in with a prototype device she’d designed, Machover realized she had “an incredible sense” of the effect he’d been trying to create, and offered her the role of Mini. The “orchestra of the future,” it turns out, now looks like a large clear plastic jar filled with Christmas lights.

But this is no wedding tabletop decoration: The lights are actually part of a system of sensors connected to an artificial intelligence program designed by Manaswi Mishra, a PhD student in Media Arts and Sciences and member of Machover’s group. The program pulls from a library of pre-selected sounds while Masuelli manipulates the jar, and Mishra tweaks the sonic output from his laptop.
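Reduced to its bones, that pipeline could be sketched as follows. This is an assumption-laden illustration rather than the Media Lab's code: the sensor readings, sound names, and gain values are hypothetical.

```python
# Illustrative sketch only -- the Media Lab's actual system is not shown here.
# It mirrors the described pipeline: sensor readings from the jar pick sounds
# from a pre-selected library, while a second person scales the output live.

import random

SOUND_LIBRARY = ["bell_cluster", "string_swell", "choir_pad", "glass_shimmer"]  # hypothetical names

def choose_sound(tilt, squeeze):
    """Map two hypothetical sensor values (0.0-1.0) to a library sound and a loudness."""
    index = int(tilt * (len(SOUND_LIBRARY) - 1))   # tilt selects which sound
    loudness = 0.2 + 0.8 * squeeze                 # squeeze controls dynamics
    return SOUND_LIBRARY[index], loudness

def live_tweak(loudness, laptop_gain):
    """A second performer scales the result from a laptop."""
    return min(1.0, loudness * laptop_gain)

# Simulate a few moments of performance with random sensor readings.
for _ in range(4):
    tilt, squeeze = random.random(), random.random()
    sound, loud = choose_sound(tilt, squeeze)
    print(sound, round(live_tweak(loud, laptop_gain=1.2), 2))
```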

The story of “VALIS” follows troubled protagonist Horselover Fat (an alter ego of Dick’s) after he experiences a quasi-religious vision during which his head is overloaded with information through a beam of pink light. Eventually, he meets an angelic artificial intelligence named Sophia, who comforts him with a message of love.

Boston-based, France-born soprano Anne Azéma performing as Sophia in the 1987 world premiere production of Tod Machover's "VALIS" in Paris. Anne Marie Stein

For bass-baritone Davóne Tines, who portrays both Dick the author and his literary alter ego in next weekend’s production, a universal human experience lies at the core of the story. “A person who has received important information contends with how that information changes their life,” he said, noting that the opera holds “a really amazing opportunity to share truth,” which can be “something revelatory” for both the performer and the audience.

France-born, Boston-based soprano Anne Azéma, who created the role of Sophia in the original production, is looking forward to revisiting “VALIS,” this time as an audience member. “You recognize Tod’s writing 15 miles away,” said Azéma, an early music specialist who is now artistic director of the Boston Camerata. When she was working on the opera, the hyperinstruments made her feel “bathed in beauty, and carried with sound,” she said.

For most people, artificial intelligence was an unknown realm when Machover was writing “VALIS,” but now it’s in the headlines daily. “In all my career, I’ve never seen anything change as fast as AI is changing right now, period,” said Machover. “So to figure out how to steer it towards something productive and useful is a really important question right now.”

AI image generators like DALL-E are “creepily good” at “copying things” that already exist, said Machover, but he finds them off-putting. “It’s easy to combine images and words in new ways, but not as easy to create new meaning through doing that,” he wrote in a follow-up email. “Why a cat needs to be cooking in the mountains, or why a Mozart symphony needs an EDM backbeat, has to be carefully chosen and orchestrated, not just thrown together.”

Mishra sees most commercially available compositional AI tools as “black boxes,” where humans can’t interpret or easily influence the inner workings of the system after they input a prompt. “Though it’s sometimes impressive, this doesn’t feel like a musical instrument,” he wrote in a follow-up email. The Media Lab’s work, by contrast, focuses on AI systems “that will allow an individual musician to uniquely compose, perform, and manipulate” sound according to their intention.

Max Addae with an AI-assisted instrument called "VocalCords." Lane Turner/Globe Staff

Thus, the new generation of hyperinstruments: the Mini jar and “VocalCords,” an interface designed by master’s student Max Addae that alters the human voice as the performer pulls on three stretchy strings attached to a device. These technologies dance on the boundary between artificial and human intelligence.

And like any instrument, the hyperinstruments still require a firm human hand, because if you don’t fine-tune the AI, it might spit out something cacophonous. “Like — ” and Machover screeched like the Green Line pulling into Boylston station. “That’s not necessarily what I want!”

VALIS

Sept. 8-10. Massachusetts Institute of Technology Building W97. http://arts.mit.edu


A.Z. Madonna can be reached at az.madonna@globe.com. Follow her @knitandlisten.