2014: A reference guide
Black Republicans, liberal high-rises, late-night irony, expired food, and what else we’ll be talking about this year.
Every year, around this time, we humbly confront the uncertainty of what the future has in store. The year behind us is a comfortably known entity; ahead is only the blankness of the calendar.
But even if some surprises have yet to unfold, the truth is we do know quite a lot about what’s going to happen in 2014, and what we’ll be talking about. We know the Winter Olympics will take place in February in the Russian city of Sochi, drawing global attention to the authoritarian reign of Vladimir Putin; we know that Microsoft will install a new CEO and trigger speculation on whether an aging tech powerhouse can jolt itself back to life. We know that historians will be combing through newly declassified Cold War documents related to the Berlin Wall, and that the 25th anniversary of the wall’s collapse will be arriving in November and making lots of fortysomethings feel old. We know pundits and political scientists will be watching the midterm elections; we know that Colorado, where marijuana was legalized on Jan. 1, is going to enjoy a bump in tourism and generally give off some mellow vibes.
To keep Ideas readers briefed for 2014, we present here a sort of pre-encyclopedia of the year to come—a highly selective guide to what you’ll want to know as the year unfolds, and why. That our list will look woefully naive 12 months from now, when extraterrestrials have taken over the government and Apple has announced plans to get into the fashion business, is a risk we’re willing to take.
BASEBALL WITHOUT COLLISIONS

In February, Major League Baseball players will suit up for spring training in preparation for the 2014 season. But if things go as expected, the game they’ll be playing will be a little different from the one they played last year. Thanks to a new rule, runners barreling home from third base will no longer be able to score by slamming into the other team’s catcher and dislodging the ball from his mitt; instead, they’ll have to try to scoot around him without getting tagged.
MLB’s decision to ban home-plate collisions, which will take effect either this year or next, is part of a broader response in professional sports to mounting evidence that repeated contact injuries, especially concussions, can cause irreversible long-term harm. And while the pressure to enact reforms will probably build in 2014, the idea of injury-proofing pro sports has a long history in America. Baseball added batting helmets in the 1950s; the National Hockey League started cracking down on checking from behind in the early 1990s. A century ago, football was so deadly that President Theodore Roosevelt himself had to step in and broker a compromise between those who wanted to ban the sport entirely and those who wanted it to continue. Then, as now, fans and owners worried that new rules would dilute the game, chipping away at the purity and intensity that made it exciting in the first place. Happily, there exists a precedent for the opposite happening, too: One result of the Roosevelt-era overhaul of football was an innovation called the forward pass.
BLACK REPUBLICANS

The Republican Party has long been deeply unpopular with African-Americans. But when 93 percent of black voters pulled the lever for Obama in 2012, GOP leaders became convinced that something had to be done. An internal postmortem released after the election declared that unless the party “gets serious” about reaching out to minority voters, it will become just about impossible to win future elections.
One recommendation of the report was to recruit and support African-Americans who want to run for elected office. This year, at least 14, and probably more, black Republicans will run for seats on Capitol Hill. That number is especially notable considering there’s currently just one black Republican in all of Congress.
That it seems exceptional to have so many African-Americans running on the Republican ticket shows just how dramatically political winds can shift in America. It’s easy to forget now, but the GOP is the party of Lincoln, founded on a platform of containing slavery, and America’s very first black congressmen, swept into office after the Civil War, were all Republicans. Not until the New Deal in the 1930s did black voters start switching their loyalty to the Democrats. It’s one of the great ironies of American history that the Republican Party ended up becoming popular among white voters in the South in part by capitalizing on the long legacy of resentment its very own members triggered more than a century ago. With the upcoming election, it’ll be interesting to see how much support the GOP leadership throws behind its black candidates—and whether these candidates can play a role in helping their party reclaim its old, inclusive image.
THE WORLD CUP IN BRAZIL

June 12 marks the kickoff of soccer’s World Cup, the first time since 1950 that it will take place in Brazil. Leaders there hope it will be not only a local celebration of “the beautiful game,” but also a mark of how far the country has come since the 1980s and ’90s, when it was plagued by political turmoil and hyperinflation. But that may not end up being the lesson the world takes away. As Brazil has rushed to finish its 12 massive stadiums—and prepare for the Rio Olympics in 2016—it has been racked by massive protests. Last June an estimated 1 million demonstrators poured into the streets of the country’s biggest cities to demand the government spend tax dollars on public services instead of the already over-budget World Cup. Protests continued through the summer and fall, and slogans like “There will be no World Cup” were heard all over the country.
What does it take for the world’s most soccer-mad nation to turn against its favorite pastime? Brazil’s economy, one of the fastest growing in the world, has become an extreme manifestation of the inequality that can open up in a developing nation. More than 16 million people live in extreme poverty, on the equivalent of $1.30 per day, while Brazilian executives are the highest paid in the world, according to a 2011 report by the Association of Executive Search Consultants. The general secretary of FIFA, the international soccer body, has called for calm once the event starts this summer, but demonstrations are already scheduled for the tournament’s opening days.
AFTER MIYAZAKI

The beloved Japanese animator Hayao Miyazaki said last year that his newest movie, “The Wind Rises,” will be his last—marking the final curtain for a career that made Americans realize how beautiful, deep, and ambitious Japanese animation can be. Miyazaki’s films include 1997’s “Princess Mononoke,” which made him a global star, and 2001’s “Spirited Away,” the first anime film to win an Oscar. The new film’s American release in February will no doubt trigger a flurry of critical reconsiderations of Miyazaki’s life’s work.
It will also instantly shift attention to a new Miyazaki: his 46-year-old son Goro, who has so far struggled to emerge from the shadow of his father. Goro’s first movie, “Tales From Earthsea,” was greeted with derision by anime fans and critics when it came out in 2006; his successful and acclaimed 2011 follow-up, “From Up on Poppy Hill,” redeemed him. The elder Miyazaki complicated the situation when Goro first started his animation career, by questioning his son’s readiness to direct his own feature film. But he came around in time for “Poppy Hill,” which the pair worked on together. In 2014, the baton will officially pass from father to son; whether the inheritance proves to be a gift or a curse for the younger filmmaker may become clearer with the release of his own next project, scheduled for completion later this year.
EXPIRED FOOD

This year, former Trader Joe’s president Doug Rauch will stage a closely watched experiment in Dorchester when he opens The Daily Table. A nonprofit hybrid between a supermarket and restaurant, the store will charge low prices for healthy perishable food that other grocery stores have taken off their shelves because it’s damaged or expired.
Some neighbors have expressed distaste for the idea, which can sound disturbingly like sending poor people food deemed too risky for everyone else. But behind the concept lurks a strange fact about the American food market: Expiration labels almost never correspond to whether something is safe to eat or not, and most food remains fresh long after it’s supposedly “expired.” A study published last year by the Harvard Food Law and Policy Clinic found that food manufacturers don’t write labels with any standard legal definitions in mind, which is why some of them say “best by,” while others say “sell by” or “use by.” Most consumers assume the dates mean something specific about the food’s freshness, and thus a great deal of perfectly good food gets thrown away. This is already driving the development around the country of so-called salvage stores, which sell food rejected by other grocery stores.
Part of the novelty behind Rauch’s undertaking in Dorchester is that Trader Joe’s, where he worked for more than 30 years, tends to be so aggressively pleasant, clean, and inviting: a paradise for foodies on a budget. Rauch seems to be aiming for a similar vibe with The Daily Table, making it an intriguing case study in the power of branding to overwhelm customers’ most basic intuitions.
HYDROGEN CARS

This spring, the Korean carmaker Hyundai will make a commercial hydrogen-powered car available on the American market for the first time. As futuristic as that sounds, it’s actually the blossoming of an idea that scientists first came up with in the 1800s: the fuel cell, which uses a chemical reaction to produce electricity, like a battery constantly being replenished.
Fuel cells can operate much more efficiently than regular engines, and the hydrogen-powered version in the Hyundai doesn’t emit any pollutants—the only byproduct is water vapor produced as it combines oxygen drawn from the air with the hydrogen in its tank.
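For the chemically curious, the textbook description of the kind of fuel cell used in cars (a sketch of the general principle, not a detail from Hyundai’s spec sheet) goes like this: hydrogen molecules are split into protons and electrons at one electrode, the electrons are routed through the car’s motor as usable current, and everything recombines with oxygen at the other electrode to form water:

```latex
\begin{align*}
\text{anode:}   &\quad 2\,\mathrm{H_2} \;\rightarrow\; 4\,\mathrm{H^+} + 4\,e^- \\
\text{cathode:} &\quad \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \;\rightarrow\; 2\,\mathrm{H_2O} \\
\text{net:}     &\quad 2\,\mathrm{H_2} + \mathrm{O_2} \;\rightarrow\; 2\,\mathrm{H_2O}
\end{align*}
```

The net reaction is simply hydrogen plus oxygen yielding water, which is why the tailpipe emits nothing but vapor.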
So what’s not to like? Critics have pointed out that when you take into account the entire process that goes into preparing hydrogen for use in fuel cell engines—the hydrogen must be extracted from natural gas or water, which consumes extra energy and can generate greenhouse gases—it’s no longer so obvious that they’re truly greener than traditional cars. And although in theory the Hyundai Tucson will be able to run hundreds of miles without drivers having to stop and recharge, for now those miles will need to be in Southern California, home to eight of the 10 hydrogen stations in all of America.
HIGH-RISES: THE PROGRESSIVE CASE
Big real estate developers have historically found themselves at odds with left-wing politicians, particularly in their efforts to build enormous high-rises and fill them with luxury apartments. The progressive argument has always been that regular people end up getting squeezed out unless developers are restrained through government regulation. But in recent years, thinkers on the left have started to shift on this issue, with Matt Yglesias (“The Rent Is Too Damn High”), Ryan Avent (“The Gated City”), and George Mason law professor and zoning specialist David Schleicher saying cities would become more affordable if it were easier for developers to build downtown towers.
In 2014, observers expect the idea of development as a progressive cause to reach a kind of critical mass. Democratic lawmakers in both Washington, D.C., and California have started to clear the way for bigger, easier building, and Bill de Blasio, a pro-development liberal, just ascended to America’s biggest urban bully pulpit as mayor of New York. As America’s cities struggle with income inequality and rising rents, progressives are likely to entertain any possible way to prevent them from becoming the proverbial playgrounds for the rich—even if that means abandoning old political rivalries that once seemed as unmovable as a skyscraper.
IRONY AT NIGHT
Over the last few decades, the late-night TV wars exposed a kind of fault line between traditionalist comics—Johnny Carson, Jay Leno—aiming at an older audience, and hipper, more ironic hosts—David Letterman, Conan O’Brien—going after younger viewers by poking fun at their own genre. In February, Leno hands over “The Tonight Show” to Jimmy Fallon, and the wires get crossed in a whole new way.
Fallon, who currently hosts NBC’s “Late Night” show, is young and Web savvy, with a knack for producing DIY videos that appeal to viewers used to getting their laughs on YouTube. As a host, he’s also almost totally un-ironic, with a sort of eager-beaver earnestness that was recently parodied on his old show, “Saturday Night Live,” by a wide-eyed Justin Timberlake exclaiming, “Oh my gosh, so great, so great!” about anything and everything. It’s notable that Leno’s throne is being passed to a guy like that—as opposed to the arch and self-consciously awkward O’Brien, who was supposed to inherit “The Tonight Show” but didn’t. And it’s a sign that NBC, desperate to capture the 18- to 34-year-old demographic, is betting that America’s young people now see irony as yesterday’s news.
LEGO GOES TO HOLLYWOOD

Just over a decade ago, The Lego Group was on the brink of bankruptcy. But this year, the 82-year-old Danish company, having successfully turned itself around to become the second-biggest toy maker in the world, will enjoy a victory lap with the release of the first-ever Lego movie. Scheduled for release in February, “The Lego Movie” imagines a world built out of Lego that is being threatened by a villain, voiced by Will Ferrell, who is intent on destroying it by gluing all the pieces together. Though the film is intended mainly for kids, the trailer makes it clear it’s going to be quippy and ironic in a way intended to reach Lego’s adult fans as well.
The movie promises to be a high-water mark for Lego’s penetration of geek and design culture, where it has become not just a toy but the object of serious love and obsession for legions of hobbyists. Recent years have seen these armchair engineers use Lego pieces to build everything from a life-size model of an electric guitar to little replicas of edgy contemporary art, like Damien Hirst’s formaldehyde shark. Recent highlights from the myriad blogs devoted to custom Lego creations include a model of an original Apple computer, a 125,000-brick gingerbread house, and a mini-meth lab inspired by “Breaking Bad.” The latest Lego catalog, meanwhile, includes kits for the Sydney Opera House, the United Nations Headquarters, and a Volkswagen T1 Camper Van that, as one online reviewer pointed out, aims for the “forty-year-old-child” demographic.
SOFTWARE PATENTS

Intellectual property law has become a huge bugbear for people who worry about American competitiveness: Patents are crucially important to American business, but they also fuel incessant lawsuits that eat up time and money. The issue has become so technical and convoluted that it’s nearly impossible to convince the public, or Congress, to engage with it.
Enter the Supreme Court, which last year issued a high-profile decision saying that gene sequences cannot be patented. This year the court will take up the even messier world of high-tech software patents when it hears a case known as Alice Corp. vs. CLS Bank International. The case addresses the question of what it means to patent inventions that take the form of computer code—and when an idea is too abstract to be legitimately protected.
There are plenty of people who say software patents should be abolished entirely—New Zealand passed a law to that effect last spring. Under existing rules, meanwhile, a company can take out a patent on something as generic as, say, letting customers use a mouse to purchase something online with a single click—which many critics see as absurd, and which has been a boon to so-called patent troll companies that exist solely to buy patents and then sue other companies for violating them.
When critics look at modern patent law, they see a disastrous irony: A system built to protect innovators is now creating major obstacles to anyone trying to bring new ideas to fruition. A Supreme Court decision on the Alice case could offer a much needed way out of the thicket.
THE NSA DEBATE

Thanks to the disclosures of Edward Snowden, 2013 was the year Americans found out the government was keeping much closer tabs on their communications than they had realized. In 2014, the big question is what, if anything, will be done about it. The gears have recently started turning: First, on Dec. 16, Richard Leon, a federal district judge in Washington, D.C., ruled that the NSA’s bulk collection of phone data was likely a violation of the Fourth Amendment. Then, just a few days later, a panel appointed by President Obama said new restraints should be imposed on the NSA’s surveillance programs. On Dec. 22, Vermont Senator Patrick Leahy told “Meet the Press” that “momentum is building for real reform,” and announced that the Senate Judiciary Committee will hold a hearing Jan. 14 to review the recommendations of the president’s panel.
What’s going to make this debate so fascinating is that the politics of privacy, particularly as it relates to national security, are so quirky: It’s one of the few issues on which each party has a meaningful division of opinion, with some centrists on both sides trusting the government to do the right thing and civil libertarians at both ends horrified at the extent of the surveillance state. It may turn out that the key players in what happens next are companies like Facebook and Google, whose lobbyists in Washington are trying to limit the government’s ability to hoover up user information without interfering with their own ability to collect it.
SHAKESPEARE AT 450

Four hundred fifty years ago in the English town of Stratford-upon-Avon, a mother gave birth to a baby whose words would become the most widely read in the English language. The literary world, never inclined to pass up an anniversary, will be bandying about William Shakespeare’s name even more than usual this year. For starters, the Modern Library will be publishing Shakespeare’s Complete Works, edited from the First Folio of his plays, published in 1623. The Library of America, meanwhile, will be bringing out a massive anthology of American writing, some dating back as far as the Revolution, that was directly inspired by Shakespeare. Theaters around the world will be putting on special productions as well; the Royal Shakespeare Company in Stratford-upon-Avon will be staging what many consider Shakespeare’s first play, “The Two Gentlemen of Verona,” for the first time in almost 50 years, and streaming it to movie theaters around the world. Finally, Shakespeare scholars will gather in Paris for an elaborate weeklong conference in April to celebrate the playwright’s birthday.
Behind all this enthusiasm, though, lies a slightly inconvenient mystery: Nobody knows when, exactly, Shakespeare was born. His official “birthday,” April 23, is only a best guess based on his baptism records. It seems that in 1564, the birth of a not-yet-famous baby boy wasn’t important enough to be recorded—a fact that no amount of posthumous attention can change.
BITCOIN

Since it debuted in 2009, in the wake of the financial crisis, the cryptocurrency known as bitcoin—a form of virtual money not tied to any country or physical means of exchange—has passed through choppy but exciting waters, with the value of a single bitcoin jumping from just 8 cents in the summer of 2010 to $1,137 at the end of this past November, to just half that a week later. At its peak, the combined value of all bitcoins in the world was more than $10 billion.
It has become almost a sport among economists to argue about whether bitcoin has a future as a legitimate currency or is doomed to collapse: Its creator is anonymous, its value is extremely sensitive to media coverage, and its most prominent retail use so far has been the defunct Silk Road online drug market. But 2014 will offer what may be a crucial test: Overstock.com, an online store where one can buy anything from furniture to digital cameras, will become the first major retailer to accept bitcoins as payment. The decision, made by Overstock.com’s libertarian CEO, will give a glimpse of whether bitcoin might really start functioning as an alternative to the US dollar. Are users willing to spend bitcoins to buy something as prosaic as a winter jacket or a set of cufflinks?
At the moment, they aren’t. As commentators have pointed out, bitcoin seems to be suffering from a hoarding problem, with many of the people who own bitcoins choosing to keep them in hopes that the price will skyrocket. Ironically, the jumps in value that make it so attractive to investors may end up preventing bitcoin from becoming anything more than a novelty investment.
WORLD WAR I
June 28, 1914, was the day Archduke Franz Ferdinand was shot dead in Sarajevo. One hundred years later, his assassination—and the devastating war it set off—will be commemorated all over Europe in what promises to be the most elaborate and multifaceted exercise in historical remembrance ever undertaken. The British government alone is spending more than $82 million on events, starting in August with a candlelight vigil in Westminster Abbey, while the BBC will be putting on four years’ worth of programming—“the biggest and most ambitious pan-BBC project ever commissioned,” according to the network—between now and 2018, the 100th anniversary of the war’s end. Stateside, a more muted atmosphere is likely: The fact that the United States didn’t get involved in the conflict until 1917 means the lion’s share of commemorative activity won’t start for another three years.
All this might sound like an exercise in nostalgia, but for historians, World War I is still quite alive in an important way: One hundred years after the war erupted, they’re still arguing about why it happened. Fifty years ago, Barbara Tuchman’s “The Guns of August” put forth the theory that the war was the result of statesmen losing control of events, with each step a seemingly logical response to circumstances—a thesis that John F. Kennedy took to heart when dealing with the Cuban Missile Crisis. But others have since reached different conclusions, with Max Hastings blaming Germany, Sean McMeekin blaming Russia, and Niall Ferguson blaming the British.
As the centennial gets started, expect to hear more argument on this point. The debate is crucial not just because it’s nice to get history right, but because of the important, if disheartening, perspective it offers on war: Even the most massive global conflagrations can arise without our ever really understanding why.