Remember winter storm Nemo? Last February it bore down on Boston trailed by forecasts for more than 2 feet of snow, triggering a level of weather panic (and excitement) that we hadn’t seen since the Blizzard of ’78. When the governor issued a statewide driving ban, some scoffed, but citizens grudgingly obeyed, leaving the streets to plows and ambulances.
Nemo delivered as promised, and we learned a lot that weekend. We learned who our neighbors were. We learned that you can’t just drive through anything. We learned a storm can shut down even a winter-hardened New England metropolis — and that the metropolis can sort of like it.
In 2013, various kinds of storms hit from everywhere — Syria, the Vatican, a troubled Chechen family in Cambridge — and so did surprising lessons. Who knew that beards had so much power, or that the Hilltop Steakhouse wasn’t forever? Americans learned that the government really was listening to them, whether they liked it or not. We learned about twerking and cronuts, that you can 3D-print an ear, and that you can release successful music videos from space — just ask Commander Chris Hadfield. Boston learned, to its horror, that there is sometimes no way to guess where and when violence will erupt — but also that in the face of that violence, a city of strangers can quickly and movingly weld itself into one.
It’s hard to wrap your head around how much can happen in a year. It’s even harder to rewind our brains to a moment when we hadn’t even imagined learning any of these things. As a way to bid the year goodbye, Ideas asked its staff and contributors to reflect on all the things that we would never have known if not for 2013.
For Celtics fans, the 2013 offseason was as traumatic as it was inevitable. The remaining two pieces of the New Big Three — Paul Pierce and Kevin Garnett — were traded to Brooklyn, while head coach Doc Rivers decamped for sunny LA to coach the high-flying Clippers. Amid the rubble of what had been a beloved championship team, Celtics fans took solace in two facts: One, general manager Danny Ainge replaced Rivers with Brad Stevens, who had developed a reputation as something of a young prodigy as head coach of the Butler Bulldogs. And two, the Celtics would likely finish among the league’s worst teams.
Why would fans want the Cs to lose? It comes down to draft picks: The worse a team, the better its odds of getting a top pick for next season. The 2014 draft is forecast to be one of the most talent-rich in years, meaning the Celtics, were they to prove sufficiently inept, would be just a lucky break away from landing a potential superstar like Kansas’ Andrew Wiggins or Duke’s Jabari Parker.
Except...this team is just a little too good. Despite a recent three-game losing streak, and without star point guard Rajon Rondo, who is still recovering from an ACL injury, the overachieving Celtics are, as of this writing, within a half-game of leading the Atlantic Division.
So, barring an unexpected finish, the Celtics’ chances of landing a high draft pick are razor-thin. By virtue of being sorta decent rather than terrible, the team finds itself stuck in a painful purgatory: good enough to miss out on a top draft pick, but not nearly good enough to hoist banner number 18.
-- Jesse Singal
The first mass audiovisual experiences were collective: People crowded into theaters to cheer, jeer, and laugh at newsreels and movies. Then, with the rise of television and, later, gazillion-channel cable packages, viewing became a private or family activity, with viewers scattered around the country in their own little pods.
Things came full circle in 2013. Thanks to various online platforms, most notably Twitter, every TV event now allows you to join in a real-time stream of observations, reactions, jokes, and discussions, measurable in “tweets per minute” (TPM). “Sharknado,” the intentionally ridiculous TV B-movie that aired on Syfy in July, garnered a peak TPM of about 5,000. The mega-hyped “Breaking Bad” finale in September? It reached 24,000—or 400 tweets a second.
Collective viewership becomes even more powerful when the events are live. After the Marathon bombings, as the news showed police slowly zeroing in on Dzhokhar Tsarnaev’s hiding spot in Watertown, Twitter users filled the information vacuum with speculation, forwarded news reports, and nervous humor. Months later, more joyously, Red Sox Nation lit up the Internet with typed shouts, Instagram photos, and Vine videos after every Big Papi home run during the team’s World Series run. It was, in effect, a massive viewing party—a way to shout “Whoo-hoo!” not just to everyone in the living room or bar, but to the whole world.
Will this new kind of live conversation change what’s on TV? In the coming months, expect both pandering—after a Twitter naming contest, Syfy announced that we can look forward to “Sharknado 2: The Second One”—and more challenging offerings like “Breaking Bad” that draw strength from their passionate and engaged collective audience. Whatever the next generation of TV looks like, though, one thing is clear: Millions of us will watch it together.
-- Jesse Singal
Just a few years ago, people power seemed unstoppable. Arab revolts in 2010 and 2011 overthrew a quartet of tyrants—Zine El Abidine Ben Ali in Tunisia, Hosni Mubarak in Egypt, Ali Abdullah Saleh in Yemen, and Moammar Khadafy in Libya. Vast police states proved no match for crowds armed only with newfound courage. Egyptian youth leader and labor activist Ahmed Maher could have been speaking for the entire region when he declared in January 2011: “The people are no longer afraid. We want a democratic state that respects human rights and allows all its citizens to live in dignity.”
The display was enough to rattle China, which tightened its control of the Internet, local party bosses, and the foreign press corps. Russia’s Vladimir Putin clamped down so hard that he threw the punk band Pussy Riot in prison. In Saudi Arabia, the king spent billions in new handouts to quell revolutionary sentiment, while the dictatorship in Syria went for broke, torturing and shooting the first unarmed demonstrators and escalating to a consuming and murderous civil war.
Popular movements, in 2013, turned out to have less power than it had seemed, while authoritarian states emerged as crafty and resilient. In Egypt, after a brief experiment with an elected civilian president, chanting crowds demanded the installation of a new dictator, General Abdel Fattah el-Sissi; the activist Maher, meanwhile, is now in prison, sentenced to three years for protesting. Regimes in Iran and Syria regained their old swagger, while activists in Saudi Arabia and Algeria watched these outcomes and held back. The one bright spot in the Arab world was Tunisia, where elections have spawned coalition governments and where real change and compromise appear to be in the works.
After the crackdowns, the leaders of Russia and China today appear confident that no Internet activists or civil society groups can break their regimes’ monopoly of power. Egypt’s muscle-flexing new authoritarians, and Syria’s resurgent old ones, convincingly demonstrated that if the wind of people power is still blowing, it is far from an irresistible force.
-- Thanassis Cambanis
Just three years ago, gay marriage was considered such a winning issue for conservatives that Karl Rove could write, in his 2010 book, that it “revealed the nuttiness of the Left, which never saw how persistent America’s traditionalism really was.”
Well, not that persistent. Legal gay marriage moved so fast in 2013 it startled even its supporters: In June, the Supreme Court struck down the Defense of Marriage Act, creating federal marriage benefits for the first time. Eight new states legalized gay marriage, New Mexico and Utah as recently as this month. In November, two male former cadets were married in the first gay wedding at West Point.
There are still many parts of the country that are staunchly opposed to same-sex marriage—places where gay people face not only discrimination but physical danger. But in 2013 it became clear that supporting gay marriage isn’t a deal-killer even for conservatives—and it might even have turned into a smart career move. Dozens of prominent Republicans, including former top Bush advisers as well as William Weld, Jon Huntsman, and Christine Todd Whitman, responded to the Supreme Court case by filing an amicus brief in support of gay marriage. If support for it once looked like nuttiness, it’s now undeniably mainstream.
As for Rove himself, in March the ABC talk show host and former Bill Clinton staffer George Stephanopoulos asked him if he could envision the next GOP candidate fully supporting gay marriage.
“I could,” Rove said.
-- Jesse Singal
Bostonians weren’t always afraid to get in the Charles River. An 1898 map depicts five floating bathhouses on the river, run by the city’s Department of Public Baths and playing host to hundreds of thousands of visitors each year. But even then, the river was intensely polluted by industry and sewage outflow, its banks “a bed of apparently putrescible sludge,” according to an 1892 report by the Cambridge Park Committee.
Conservationists like landscape architect Charles Eliot wanted to reverse that, allowing the unpleasant tidal estuary to truly become an urban pleasure ground. That vision was largely realized over the next century. The damming of the river’s mouth and creation of new riverside parks and paths brought paddling, sailing, biking, and fireworks to the Charles.
But the water itself was slow to catch up. Public swimming was banned in the 1950s. The river’s caustic reputation was cemented by the Standells’ 1966 song “Dirty Water,” its raw guitar riff and chorus (“Well I love that dirty water/ Oh, Boston, you’re my home”) capturing the gritty, downtrodden city of the time.
For six decades, the water was off-limits save for a recent annual race for elite swimmers. But on July 13, a public swim in the Charles River drew more than 140 pioneering bathers. It was a culmination of decades of concerted efforts by groups such as the Charles River Watershed Association and Charles River Conservancy, which have dramatically improved water quality.
So when “Dirty Water” was played in Fenway Park and bars all over the city this fall after the Red Sox’ World Series win, it sounded more like a throwback than an indictment. Like the Red Sox themselves, the Charles may have finally shrugged off its curse. Granted, the river is still plagued by sewage overflow during heavy rains, and swimmers are advised not to touch the toxic bottom. But if current progress continues, perhaps those floating bathhouses could make a comeback.
-- Courtney Humphries
At the beginning of 2013, federal prosecutors had two young men under indictment for computer crimes. Aaron Swartz, known as a thoughtful activist and computer wunderkind, had used MIT’s network to download millions of articles from the academic database JSTOR. Andrew Auernheimer, a hacker and provocateur known as weev, had given around 114,000 iPad users’ e-mail addresses to the website Gawker after they were exposed via a security hole on AT&T’s website. In January, Swartz committed suicide before his trial could start. In March, a jury found weev guilty, and he was sentenced to 41 months in prison.
These may seem like exceptional cases. But this year, they cast a spotlight on a surprising fact: As it’s written, the Computer Fraud and Abuse Act, the powerful computer crime law under which both were prosecuted, is so broad that it could be used to put nearly any one of us in prison, and we violate it in the course of using the Internet every day.
If you disobey a website’s terms of service (say, lie about your height on a dating website) or use a computer in a way you’re not authorized to (check sports scores at work, for instance), you could, in theory, be prosecuted under the CFAA. Prosecutors have gone after people for getting around Ticketmaster’s Captcha check. According to Jennifer Granick of Stanford’s Center for Internet and Society, other activities that could be construed as violations include circumventing time limits on coffee shop Wi-Fi by changing a computer’s network address or deleting browser cookies to access a metered news site.
When the CFAA was first written, few people had computers at home; now we carry them around in our hands. When technology changes this quickly, it can take a while for the law of the land to catch up. The government has responded to criticisms by saying its lawyers only go after actual bad guys. Right now, based on our routine online behavior, our only choice is to trust that’s true.
-- Sarah Laskow
The United States was founded by people so desperate to escape the influence of royalty they risked loneliness and starvation to do it. So it’s always a surprise to be reminded— as we were this summer—that many of us are still passionate royalists at heart.
When Prince William, Duke of Cambridge, and his “commoner” wife, the former Catherine Middleton, had their first child, George, this summer, Americans were by some measures even more delighted than the prince’s own subjects. American news programs stationed reporters outside St. Mary’s Hospital in London for days, and the “Today Show” set up a live “doorcam” to let viewers gaze at it 24/7. (The strangely mesmerizing feed primarily captured passersby walking down the street.) Magazines with royal-themed covers flew off newsstands and drew serious traffic online: The editor-in-chief of Us magazine said its website earned its highest traffic numbers ever on the day the duchess left the hospital. The most bizarre touch came when Barbara Walters conducted an interview with an in-character Middleton impersonator.
Despite the media frenzy, “Kate and Wills” carried themselves with a notable lack of fuss. Middleton’s father snapped their first family portrait, and the couple was said to be going without the traditional team of royal nannies. In Middleton’s first public appearance after giving birth, she wore a plain dress that emphasized her still round belly.
The month before the birth of the new princeling, reality star Kim Kardashian had given birth to her own American royal baby of sorts in Los Angeles. Kardashian showed off photos of the infant’s designer clothes on Instagram, and refused to appear in public until her body was back in bikini shape. Maybe the contrast was what made baby George’s family so appealing: With celebrities like these, royalty look downright down-to-earth.
-- Ruth Graham
When he was elected pope in March, the Argentine Jorge Mario Bergoglio became the first pope in the long history of the church to name himself after St. Francis of Assisi (1181-1226). It was a move that indicated he was assuming Francis’ dual mission: to save the church from corruption, and to orient it towards the poor and downtrodden.
Whereas the papacy had recently been seen as at odds with a changing modern world, the new pope has, in a few months, managed to make the church seem vital and engaged by bringing poverty to the center of Catholic discourse. Indeed, some have called his critiques of capitalism and his outreach to the poor “Marxism.” He is hardly a Marxist; instead, like St. Francis, he sees it as the place of the church to engage with the world as it is.
St. Francis of Assisi lived in his own time of growing economic inequality. The son of a rich merchant and himself a soldier, he believed that Christ had called him to “repair my house, which as you can see, is falling into ruins.” Francis created an order without buildings or material belongings, and questioned the very idea that the church should itself possess material riches. He focused his mission on living among the poor in a state of mystical spirituality—famously, he exchanged his rich clothes with those of a beggar—and on communing with nature. And he insisted that those in his order follow his example.
This was as radical then as it is now. In the 13th century, St. Francis’s message of walking with the poor and eschewing signs of wealth and hierarchy was both threatening and appealing. In 2013, it was clear that these simple old ideas were powerful enough to give the Vatican new direction and to grip the attention of the world.
-- Jacob Soll
He was the last of the Plantagenets, most loathed of British royalty, infamous Shakespearean schemer, defeated finally during the Wars of the Roses—but in death, remarkably, he vanished. Richard III seized the kingship of England in 1483 amid rumors of murder, and his reign lasted just two tumultuous years before he was killed at the Battle of Bosworth Field. Somehow, though, his body was lost. In February, scholars announced that the historical mystery had been solved, that his remains had been identified. And where was the great monarch interred? In Leicester, under a parking lot.
The news signaled the start of an impressive run of car park finds across the United Kingdom. In March, workers in Edinburgh were clearing a parking lot to erect a university building when they uncovered the grave of a medieval knight. Buried with the knight was a sandstone slab, inscribed with a sword and Calvary Cross. More than a dozen skeletons were eventually found nearby, including hints the knight was buried with his wife and child. Then, in May, an entirely different parking lot in Leicester yielded a Roman graveyard, dated to 300 A.D., which included a pagan burial—the skeleton in a fetal position, oriented north-south, the head removed and placed at the feet with offerings for the afterlife—as well as a clearly Christian one. In July, back at the Richard III site, archeologists said they’d unearthed an enigmatic lead coffin within a stone coffin. Finally, in October, far to the north, a Scottish parking lot was discovered to contain the remains of a medieval Norse parliament situated on an 11th-century man-made island constructed under the direction of Thorfinn the Mighty.
If Thorfinn the Mighty’s grand legacy can wind up flattened and utterly disrespected under the Cromartie Memorial car park, then what dim prospects await the unmighty rest of us? That’s one possible lesson, but the discoveries also suggest something more elevating: What has been lost can be found again. Forgotten is not forever.
-- Gareth Cook
To see a meteor streaming across the sky is a rare thing—fast, fleeting, and above all, unexpected. So when dozens of videos of a fireball zooming over Russia, all filmed from behind the windshields of cars, started popping up online last February, the question on the minds of many Americans as they watched the amazing footage was, “How on earth did so many Russians happen to have their cameras on at the exact right moment?”
The answer was simpler, and revealed more about life in Russia, than anyone here could have guessed: The reason there were so many recordings of the meteor was that Russian drivers record everything, all the time, using inexpensive dashboard-mounted cameras. These cameras, Americans quickly came to understand, were not used by Russians out of voyeurism or sentimentality: They were used as protection. Russians, it turns out, live in a state of constant vigilance when they’re driving, prepared at all times for confrontation with their fellow road warriors and corrupt police officers cruising around looking for bribes. The dashcams, in this context, serve as a form of insurance.
The footage recorded on these dashcams is often extraordinary—capturing not just ephemeral wonders like the meteor, but lurid fights between motorists and bloody accidents that get uploaded and annotated by rubbernecking commenters. To read about it in the aftermath of the meteor’s star turn was to realize that, while Americans armed with Facebook and Instagram might go a little crazy when it comes to documenting what food they’re eating for dinner and what bands they’re seeing in concert, on the personal surveillance front, our former Cold War enemy seems to be outdoing us.
-- Leon Neyfakh
In our newly digital world, it seems like we’re swimming in data. We’re swamped with new e-mails, YouTube clips, blog posts, and e-books, while libraries and archives are busily scanning historical documents.
But historians are concerned about a major gap: What happened after humans began producing tons of audio and video culture, and before we figured out that we should preserve it? Much of this key record of the last 150 years is stuck in obsolete or decaying physical formats. Thousands of hours of radio, television, and film were simply never preserved at all. In 2013, some major holes in our self-documentation came to light.
This month, the Library of Congress’ National Film Preservation Board published a study that quantified a long-held suspicion of film researchers: Seventy percent of American silent films made between 1912 and 1930 are gone forever. Historian David Pierce found that survival varied by studio: MGM archived many films and donated others to collectors, while Paramount put no effort into preservation.
Then there was the story, equal parts heartwarming and chilling, that emerged after the death of librarian Marion Stokes, who obsessively recorded local, network, and cable TV news on VHS tapes from 1977 to 2012. Her descendants have given those tapes to the Internet Archive, a nonprofit digital library, which will put them online. The Library of Congress reports that local news archives are especially scanty, with most stations keeping tapes only a week and few repositories collecting them; it’s upsetting to realize just how rare Stokes’ collection will be.
The Library of Congress hopes to carry out assessment surveys on other 20th-century audiovisual media (educational films, animated shorts, newsreels). Meanwhile, historians warn that today’s digital documents—video games, software, websites—could join their earlier counterparts in obsolescence. If we want our great-grandkids to know what 2013 was like, we’ve got our work cut out for us.
-- Rebecca Onion
For centuries, humans have wondered whether life existed outside our Earthly bubble, but for most of that time we had no tools beyond imagination to apply to the question. Eventually we sent probes to other planets, finding them very different from our own and unlikely to support life. In 1992 we used a radio telescope to detect the first exoplanets—orbs circling other suns—but again, they seemed inhospitable.
Then, in 2009, NASA launched the Kepler space telescope, with the purpose of finding other Earths: planets about the size of our own and in a “Goldilocks zone,” not too near or far from their sun. This year, results from Kepler and other telescopes have flooded in.
In January, researchers reported that there is at least one planet for each of the 100 billion or so stars in our Milky Way galaxy—perhaps more, as the estimate was based on only the most common type of star.
In April we saw the announcement of two exoplanets more similar to Earth, in size and sunlight, than any previously discovered. Kepler-62e and Kepler-62f orbit a star (Kepler-62) about 1,200 light-years away. Scientists suspect they have a rocky core, a watery surface, and a thick atmosphere—suitable for an exotic, aquatic ecosystem.
In October, the number of exoplanets we’ve discovered surpassed 1,000. Kepler has found more than 3,000 additional candidates that await confirmation. And many may be habitable. In November, researchers analyzing Kepler data estimated that one in five stars like the sun has planets with sizes and orbits like the Earth’s, which suggests billions of such planets in the galaxy. The nearest is likely only 12 light-years away, orbiting a star visible with the naked eye.
The Kepler spacecraft is currently hobbled, but it’s already produced enough data to keep scientists busy for years. And future instruments may reveal even more—eventually, maybe even someone waving back.
-- Matthew Hutson
Since the dawn of Twitter seven years ago, the social media platform has served as a space for self-organizing communities staking out their own corner of the online universe. Among the most dynamic has been “Black Twitter,” the networks of African-American, African, and Caribbean-American Twitterati that foster conversation about issues flying under the radar of the mainstream media—and, often, elevate those issues to a more prominent plane.
What became clear in 2013 was that Black Twitter has begun to garner results, developing a reputation as a formidable cultural force. Time after time, decision-makers bowed to pressure from their incisive, indignant—if short—missives.
When Juror B-37 in the George Zimmerman trial announced that she planned to write a memoir, outrage on Black Twitter prompted her and her literary agent to make an abrupt about-face. When media mogul Russell Simmons released a parody video called “The Harriet Tubman Sex Tape,” the backlash led him to remove it within days. And though it’s been six years since “Saturday Night Live” had a black female cast member, Black Twitter frustration at the comedy show suddenly boiled to a fever pitch in October, when show comedian Kenan Thompson said producers “just never find ones that are ready.” This month, “SNL” invited about two dozen black women comedians in for auditions, with at least one slated to join the cast in early 2014.
Often, the 140-character exhortations that get these results make use of the acerbic humor that has always suffused black social movements. After accusations emerged that celebrity chef Paula Deen was a rampant racist, Twitter users responded to her request for favorite potluck dishes with suggestions like “My Best Friends are Black-Eyed Peas,” “Uncle Tom’s Cabbage,” and “black beans and white rice...on separate but equal plates.”
Funny? Yes. Effective? Absolutely. Deen was dropped by the Food Network, Walmart, Kmart, and QVC. “Don’t mess with Black Twitter,” writes Daniella Gibbs Léger of the Center for American Progress, “because it will come for you.”
-- Martine Powers
The massacre of 20 children and six educators at Sandy Hook Elementary on Dec. 14, 2012, felt, at the time, like an instantly transformative event—a tragedy so unspeakably horrific and senseless that it would force even the most dedicated supporters of gun rights in Congress to finally permit reform.
But anyone who thought the Newtown shooting would change the minds of such legislators quickly found out it wasn’t true. When a bill proposing expanded background checks and a ban on some assault weapons reached the Senate in April, it was voted down. And eight months later, The New York Times reported that of the 109 state-level gun laws passed since Newtown, 70 actually made it easier to buy guns, not harder. Meanwhile, a poll recently conducted by the Huffington Post showed that instead of causing lawmakers to rethink their positions on gun control, the tragedy at Newtown only “hardened preexisting beliefs.”
Public opinion on gun control hasn’t ended up shifting very much either. For years prior to Newtown, Americans were just about evenly split on whether it was more important to control gun ownership or to protect gun rights. While surveys conducted by the Pew Research Center showed that Newtown did nudge the numbers slightly—immediately after the shooting, more people were in favor of gun control than against it for the first time since Barack Obama took office—they went back down in the subsequent months, meaning the paradigm-shifting effect many expected never materialized.
-- Leon Neyfakh
Presidencies have suffered hiccups or collapses as a result of everything from intrigue over oil reserves (Harding) to attempted illegal wiretapping (Nixon) to extramarital affairs (Clinton, among others). But this fall, President Obama added an ignominious achievement to his legacy: He became the first president to take a hit from a bungled website launch.
During both his campaigns and his five years in office, Obama has built a reputation as a tech-savvy, connected candidate and president. But then came the October launch of the Affordable Care Act’s HealthCare.gov website, the central hub where Americans were supposed to be able to sign up for the federal health care exchange or be directed to their states’ individual systems: It was a terrible, broken mess. Many would-be consumers couldn’t even access the newly offered plans’ monthly rates, let alone enroll. And once the government shutdown concluded, the creaky website became the next rallying cry for angry Republicans, a symbol of big-government dysfunction and broken promises.
Today, the site still has bugs that need squashing, but it is functioning better than it was. On Dec. 20, Obama announced that a million people had enrolled since HealthCare.gov launched. Short of another catastrophe, it seems likely that the site’s troubled birth will, in the long run, land closer to “footnote” than “turning point” in the history books.
Still, there are lessons to be drawn. For one thing, the newest kind of policy challenge—building a national website easily accessible to every US citizen—can be hugely exacerbated by an old one: the outsourcing of important government work to a patchwork of overcharging contractors, many not up to the task.
And secondly: If you want a big tech rollout to go well, you’d better take a hands-on approach—even if you’re the president.
-- Jesse Singal