When the Romantic poet Lord Byron angered his wife, Annabella, with his many affairs, she embarked on a course that would change the world. After leaving her husband, Annabella steered their daughter, Ada, away from his airy ways. She hired tutors to provide Ada with a rigorous education in mathematics. Ada embraced her work and connected with scientific minds such as Mary Somerville and Charles Babbage. And she created a vision for computers that drives technology to this day.
Walter Isaacson gives Ada Lovelace a position of prominence in his landmark new work, “The Innovators.” In this often surprising history, Isaacson offers an encyclopedic account of the technological breakthroughs that made modern computers and networks possible: programming, transistors, chips, software, graphics, desktop computers, and the Internet.
Isaacson seems to embrace both sides of the ongoing debate about whether “great men” make history or events proceed according to history’s “inevitable logic.” He tells hundreds of stories of inventors and their discoveries. Claude Shannon, for example, identified the importance of on-off switching for electrical circuits. With IBM, Howard Aiken built the programmable Mark I computer. The programmers of the War Department’s ENIAC developed programs that used conditional branching and subroutines. William Shockley, with John Bardeen and Walter Brattain at Bell Labs, developed the transistor.
While spinning these tales of individual triumph, Isaacson also shows how inevitable these advances were. Every new computer solved problems and created challenges. Speedy processing required more memory. A greater range of operations required separate programs, which could be loaded and unloaded from the machine. Ever-expanding calculating capabilities demanded simple user interfaces. Growing communities of users required vast networks to link them together.
Necessity spurred invention. Computer developers grew frustrated with slow calculation speeds, errors in punch cards, time lost while loading software, clumsy inputting procedures, and more. On a vaster scale, the challenges of World War II, the Cold War, and the space race created urgency to analyze data better and faster.
So what are the essential elements of innovation? To start, innovators bring big egos to develop visions bigger than themselves. Contrary to the myth of the lone inventor, innovation requires teamwork. Those teams need the frick-and-frack leadership of technical geniuses and practical operators (like Apple’s Steve Jobs and Steve Wozniak). Innovators work insanely long hours, stubbornly obsessing over details. And when their ideas falter, they need to “pirouette” and change course.
Isaacson also notes that breakthrough thinkers see the world metaphorically. Einstein famously got his “aha” moment for relativity theory while pondering the sight of two trains, traveling in opposite directions, passing a platform. Computer innovators used the metaphors of fabric looms, phone switchboards, race relays, assembly lines, railroad punchcards, painting brushstrokes, market baskets, and spider webs to visualize the operations of ever more sophisticated computers.
Above all, innovation requires the interplay of different kinds of knowledge — mathematics, quantum physics, chemistry, metallurgy, logic, even writing and aesthetics — in search of a radical vision. “What is imagination?” Lovelace asked in 1841. “It is the Combining faculty. It brings together things, facts, conceptions in new, original, endless, ever-varying combinations.”
Isaacson highlights the need for every variety of thinker and doer to join the creative fray. When Lawrence Summers, then the president of Harvard University, expressed doubt about the capabilities of women in math and the sciences, he was strangely ignorant of recent history. Women played pivotal roles in the development of the computer, particularly in wartime code-breaking and programming the ENIAC. An outside perspective lends insights that insiders miss.
Even the brainiest players miss opportunities. Xerox PARC researchers developed the mouse-driven visual displays that made the modern computer possible; stuck in a lab mindset, they let Steve Jobs steal them. When Google’s Sergey Brin and Larry Page offered to sell their secret search recipe for $1 million, Yahoo and others yawned and said no.
Some decisions close options and opportunities. Hyperlinks, for example, create a one-way connection to outside websites. Tim Berners-Lee, the father of the World Wide Web, had a different vision. He wanted two-way links, requiring approval of the linker and the linked. Such a system, Isaacson says, could have produced a dynamic system of micropayments for content. Instead of cannibalizing writers’ work, the Internet could have created vast new revenues for writers — and led to more substantial content.
Or not. Maybe such tollbooths would have slowed traffic to a trickle. Maybe the Internet needs to ravage creative territory, far and wide, without too much concern for rewarding content producers. Either way, as we debate “net neutrality” and technology’s next stages of development, we need to think hard about these tradeoffs.
Charles Euchner, a case writer and editor at Yale School of Management, is the author of “Nobody Turn Me Around” and “The Big Book of Writing.”