Walter Isaacson’s ‘The Innovators’ explores the teamwork that made the digital revolution
If you go on Amazon and search the phrase “the man who invented,” you’ll get more than 1,800 book results.
But Aspen Institute CEO Walter Isaacson argues in his new book, “The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution,” that the notion that individuals shape culture and innovation is a distortion of history. Throughout the book — which covers the 1830s to today — Isaacson relates stories of how teamwork and collaboration led to breakthroughs from the analytical engine to the electromechanical computer to the personal computer and the Internet.
“Innovation comes from teams more often than from the lightbulb moments of lone geniuses,” Isaacson writes. “This was true of every era of creative ferment.”
“The Innovators” is an ambitious, briskly moving and affecting group biography. It’s more reminiscent in style of “The Wise Men,” his book about the officials who shaped American Cold War policy, than his own lone-genius biographies of Benjamin Franklin, Albert Einstein and Steve Jobs.
The new book — which was long-listed for the National Book Award last month — goes on sale today.
“What I wanted to do was look at real innovators and how they made their breakthroughs,” Isaacson said in July at an Institute forum in Aspen. “A narrative book talking about real people — what succeeded, what failed, what particular talents they had.”
He had been working on the book for 12 years, he said, when he put it aside to write “Steve Jobs,” his acclaimed 2011 biography of the Apple co-founder.
Late in the writing process, he took a cue from the lessons on collaboration he found in the book and crowd-sourced “The Innovators” itself, inviting feedback on excerpts from readers online. He notes in the acknowledgements that 18,200 people read one excerpt in its first week online. (“18,170 more draft readers than I’ve ever had in the past.”)
The surprising hero of the digital age that emerges from the narrative is Ada Lovelace, Lord Byron’s daughter, who in the 1840s conceived of a machine that could weave textiles, compose music or perform other functions according to programmed instructions. Lovelace also embraced what she called “poetical science,” combining the arts championed by her father and the technology emerging from the Industrial Revolution.
Her conception of combining human creativity with computer processing for innovation is the backbone of the breakthroughs chronicled in the rest of the book, from the eve of World War II to today.
Rather than try to answer questions about who invented the microchip or email or the Internet, Isaacson offers a sweeping tale of the evolutionary process and collaboration that made our digital world (though more than a handful of patent lawsuits and fights over credit crop up as a result). The military, universities and private corporations — what Isaacson deems the “military-industrial-academic complex” — get equal credit for technological progress.
The book includes vivid portraits of Bill Gates, Steve Case, Steve Jobs and other Silicon Valley icons, along with computing machine inventor Alan Turing, transistor innovator William Shockley, computer programming developer Grace Hopper, Internet pioneer J.C.R. Licklider and “Whole Earth Catalog” author Stewart Brand, whose Trips Festival helped make computers cool. A group of women tasked with programming computers to calculate missile trajectories during World War II turns out to have helped originate modern computer programming. And yes, Al Gore gets his due.
As Isaacson recounts, Gore spearheaded a congressional study on interconnecting computer-research networks in 1986 and sponsored a series of bills in the early ’90s that opened the Internet to the general public.
Isaacson manages to write mostly in plain English, avoiding geek-speak and keeping his prose from getting gummed up by the endless acronyms and technical terms his subject often calls for. A general reading audience will no doubt get a kick out of some nuggets, like the origin of the term “Silicon Valley” (it came from the title of an Electronic News column by Don Hoefler) or how the simplicity of the instructions for the video game “Pong” led to the computer age’s focus on simple, intuitive user interfaces.
One enduring question that remains open at the end of “The Innovators” is whether artificial intelligence will ever be achieved. Isaacson recounts bold predictions about the coming age of computer intelligence, from Ada Lovelace’s time to Alan Turing’s to 2013, and he is skeptical that it will ever arrive.
“Decade after decade, new waves of experts have claimed that artificial intelligence was on the visible horizon, perhaps only 20 years away,” he writes in his closing chapter. “Yet it has remained a mirage, always about 20 years away.”