More than a decade ago, researchers at Boston College interviewed people from both sides of the Troubles in Northern Ireland, promising each contributor to the Belfast Project that his or her interview recording wouldn’t be released until the contributor died. In the meantime, the tapes would be deposited at the college’s rare books library under lock and key. On the basis of those promises, some people spoke for the first time about painful actions that remain murky in the public eye, including unsolved murders that they’d helped commit or cover up.
When the British government learned of the Belfast Project about 10 years later, it invoked a mutual legal assistance treaty to demand immediate access to some of the tapes. After months of legal wrangling, some of the tapes were turned over, resulting in the arrest in April of Sinn Féin leader Gerry Adams in connection with one of the killings discussed in the interviews. Adams was released, but Northern Ireland officials are now seeking the entire set of interviews — perhaps to balance an inquiry into the Irish Republican Army with investigation of possible crimes by members of the Ulster Volunteer Force as well.
Libraries like Boston College’s are familiar with making promises about the “dark archiving” of materials like these, whether for the papers of a Supreme Court justice, an interview with a soldier ready to give a sustained look at the conduct of war, or the records of the university’s own faculty and students. But just as it has become easier to quietly maintain such records, the reach of the subpoena has also increased. These records are more accessible and searchable than ever, whether for intelligence or law enforcement purposes, or to benefit a party to a divorce or other private lawsuit.
The increasing legal pressure against archives has created anxieties among researchers, librarians, and journalists. They cite the need to protect sources who wish to make a record for posterity; procuring documents and interviews from those sources will be difficult if the fruits are only one subpoena away from disclosure. On the other side are those who simply want to solve awful crimes and make the perpetrators answer for them on the law’s timetable rather than their own.
Are we stuck with either having to destroy our secrets or leave them exposed to near-instant disclosure? It might be possible to split the difference: to develop an ecosystem of contingent cryptography for libraries, companies, governments, and citizens. Instead of using new technologies to preserve for ready discovery material that might in the past never have been stored, or deleting everything as soon as possible, we can develop systems that place sensitive information beyond reach until a specified amount of time has passed, or other conditions are met.
There has been fitful research on “time capsule cryptography,” by which something can be encoded so that not even its creator can access it until a certain amount of time has passed. Such cryptography might depend on the kinds of “proof of work” puzzles — which require vast computing power over an extended period — that undergird the operation of bitcoin and other cryptocurrencies. Cryptocurrencies, whose operations are distributed across a number of computers, use the puzzles to prevent any one entity from taking control of the system.
What works to prevent any one party from subverting a currency could also place some of the data increasingly comprising our lives beyond the reach of a simple subpoena, by forcing the curious to wait a designated period of time before they can see what they want — even if they have legal paperwork that purports to entitle them to it sooner.
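One concrete proposal along these lines is the Rivest–Shamir–Wagner “time-lock puzzle,” in which decryption requires a long chain of inherently sequential squarings that no amount of parallel hardware can shortcut, while the creator, who knows the factorization of the modulus, can prepare the puzzle quickly. A minimal sketch, with toy-sized numbers chosen purely for illustration (real puzzles would use large RSA moduli and an enormous squaring count):

```python
# Sketch of a Rivest-Shamir-Wagner time-lock puzzle.
# Toy parameters for illustration only; not secure as written.

def create_puzzle(message_int, t, p, q):
    """Hide message_int so that recovery needs ~t sequential squarings.

    The creator, knowing p and q, computes the mask with one fast
    modular exponentiation instead of t squarings.
    """
    n = p * q
    phi = (p - 1) * (q - 1)
    a = 2
    e = pow(2, t, phi)              # shortcut via Euler's theorem
    b = pow(a, e, n)                # b = a^(2^t) mod n, computed fast
    ciphertext = (message_int + b) % n
    return n, a, t, ciphertext

def solve_puzzle(n, a, t, ciphertext):
    """Without the factors of n, the solver must square t times in a row."""
    b = a
    for _ in range(t):
        b = (b * b) % n             # inherently sequential work
    return (ciphertext - b) % n

# Toy primes; a real deployment would use primes of 1024 bits or more.
p, q = 1000003, 1000033
puzzle = create_puzzle(42, t=100000, p=p, q=q)
assert solve_puzzle(*puzzle) == 42
```

Tuning `t` to the speed of available hardware is what sets the delay: the solver cannot parallelize the squaring chain, so the puzzle enforces a rough lower bound on wall-clock time rather than on total computing power.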
Even without relying on such complicated technologies, sensitive material can be encrypted using a key that is split into fragments, the way that it can take two simultaneous keys to launch a missile. Imagine key fragments distributed around the world to, say, 10 parties, requiring the cooperation of at least six of them to reassemble the key needed to get the documents. The parties would be instructed only to announce the keys when the original owner’s specified conditions are met. Early disclosure wouldn’t be impossible, but it would require a sustained effort that would only be worth undertaking if the access were a genuine priority, and one justifiable to the authorities of several countries who could each in turn pressure their respective keyholders. That kind of encryption is easy to do.
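The split-key arrangement described above is standard cryptography, usually implemented as Shamir secret sharing: the key becomes the constant term of a random degree-5 polynomial over a prime field, each of the 10 parties holds one point on the curve, and any 6 points determine the polynomial (and hence the key) by interpolation, while 5 or fewer reveal nothing. A minimal sketch, with the 6-of-10 parameters taken from the hypothetical above:

```python
# Sketch of Shamir secret sharing: split a key into 10 shares so that
# any 6 cooperating holders can reconstruct it.

import random

PRIME = 2**127 - 1  # a Mersenne prime, comfortably larger than a 16-byte key

def split_secret(secret, n=10, k=6):
    """Return n points on a random degree-(k-1) polynomial with f(0) = secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789
shares = split_secret(key, n=10, k=6)
assert recover_secret(shares[:6]) == key     # any six shares suffice
assert recover_secret(shares[2:8]) == key
```

The threshold is the policy lever: raising it from 6 toward 10 makes early reassembly harder to coerce, at the cost of making legitimate release fragile if keyholders disappear.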
The original conception of a trust company was as a firm that would solemnly represent the interests of its beneficiaries — which is why a bank worthy enough to be entrusted with one’s savings might also be worth entrusting with decisions about a child’s college fund if the parents became incapacitated. Banks may not be among the most trusted institutions today, but libraries are. And they can together embrace a new generation of encryption technologies to safeguard materials that otherwise will never be created or saved for fear of early discovery.
The Belfast Project is simply a high-profile example of a phenomenon that reaches into the lives of nearly every institution integrated into the digital world — and reaches us, since we are the users of those institutions. Corporations increasingly recognize that whatever they store is discoverable through judicial process — or all too leakable by a disgruntled employee. That’s why any business beyond a mom-and-pop is likely to have a formal document-retention policy for its internal secrets — which is in reality a document-destruction policy, intended to eliminate potential embarrassments and liabilities that lurk in mountains of accrued bits.
It’s more complicated when those businesses are merely custodians of their customers’ data. Google, Facebook, and Microsoft are routinely caught in the middle when, for example, Brazilian authorities demand information about a subscriber and don’t want to use the cumbersome mutual legal assistance treaty process to get it. The Brazilians threaten penalties for holding back information that American law may insist not be disclosed. Or vice versa: The public has been inundated with descriptions of the US government’s mining of digital databases for foreign intelligence, in large part thanks to a leak of the government’s own materials.
Imagine, though, if the records of private firms, government agencies, and individuals from earlier eras came free in a scheduled way, as trustees combined their keys to release them as time passed or other conditions were met. (In the case of Boston College’s promises, it might be that a keyholder would commit to publish its part of a key only upon the announcement of the death of a Belfast Project interviewee.) Subjecting secrets about government intelligence gathering to time capsule accountability by those governments could serve as a trust-restoring measure. Some actions today might reasonably remain secret — but with a guarantee that they will be revealed at a later date certain, even if the government in question feels later regret over entering into the bargain.
The last refuge of privacy cannot be placed solely in law or technology. It must repose in both, and a thoughtful combination of the two can help us thread a path between having all our secrets trivially discoverable and preserving nothing for our later selves for fear of that discovery.
Jonathan Zittrain is a professor of law at Harvard Law School and the Harvard Kennedy School, and professor of computer science at the Harvard School of Engineering and Applied Sciences. He co-founded the university’s Berkman Center for Internet & Society, and directs the Harvard Law School Library.