(Photography by Danny Lyon, AB’63/University of Chicago Photographic Archive, apf2-03136, Special Collections Research Center, University of Chicago Library)

The informed life

Edward Tenner, AM’67, PhD’72, considers what “an informed life” means in the information age.

In 1972, when I received my PhD from the University of Chicago, an age of information abundance was dawning, or so my contemporaries and I believed. The first reference I have found to an “information abundance problem”—note it was already a problem—was in volume 2 of the Annual Review of Information Science and Technology in 1967. Higher education was still expanding, and the guru Peter Drucker’s prophecies about the future prevalence of “knowledge workers” were ascendant in business. Even an aspiring humanist working in a relatively arcane field like early 19th-century Germany could hope.

I researched my dissertation on the causes of popular revolts in the German states in the early 1830s using file cards with holes and notched edges, sorted with rods like knitting needles: a manual database. A year in Germany showed me how advanced American information abundance was. Books published before the 1960s were listed in massive bound volumes smelling like cigar boxes, with strips of paper ordered by author only. Card catalogs were still high tech. The Xerox photocopier, introduced in 1959, and its offspring the instant copy shop (the original copyright pirate cove) remained uncommon in Europe. And the Kurrentschrift of official archived documents—a slanted, apparently uniform zigzag designed for rapid writing—remains challenging even to educated Germans today, meticulous as it appears once you’ve deciphered it.
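For readers who never handled edge-notched cards, the selection logic is easy to simulate. Here is a minimal sketch in Python, with invented card texts and category codes standing in for my actual notes; it shows how a single rod pass acts as a query, and how successive passes combine criteria.

```python
# A sketch of the edge-notched card "database": each card carries a research
# note and a row of holes along one edge; clipping a hole open into a notch
# encodes a category. Passing a rod through one hole position and lifting the
# deck leaves the notched cards behind -- they drop out as the query result.
from dataclasses import dataclass, field

@dataclass
class Card:
    note: str                                        # the note written on the card
    notches: set[str] = field(default_factory=set)   # categories clipped open

def needle_select(deck: list[Card], category: str) -> list[Card]:
    """One rod pass: cards notched at this position fall out (the matches)."""
    return [card for card in deck if category in card.notches]

# Invented examples, for illustration only.
deck = [
    Card("Hesse riots, autumn 1830", {"Hesse", "riots", "1830"}),
    Card("Saxon constitutional petitions", {"Saxony", "petitions"}),
    Card("Customs-house protests", {"Hesse", "taxation"}),
]

# Successive passes intersect criteria, like ANDed query terms.
matches = needle_select(needle_select(deck, "Hesse"), "riots")
print([card.note for card in matches])  # -> ['Hesse riots, autumn 1830']
```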

In hindsight I realize that in North America, too, the 1970s were still a time of information scarcity. People looked up to a small number of authorities, for better or worse. Newspaper columnists and book reviewers still had formidable clout; their doyen, Walter Lippmann, had retired only in 1967. (Early in the Second World War, a confidential report by Isaiah Berlin to Winston Churchill gave his influence a rare four stars, the same as the whole New York Times.) The Book-of-the-Month Club still promised the wise selections of a panel of distinguished literary judges. Or you could go to your public library and pore through month after month of the Book Review Digest.

The PhD cohort of the early 1970s was the last to climb aboard the tenure-track express. When Kierkegaard wrote that life is lived forward but understood backward, he could have been referring to my dissertation; I later realized I should have been asking an entirely different set of questions, more along the anthropological lines of David Sabean’s Power in the Blood (Cambridge University Press, 1984), but there was no time.

Fortunately, the skills I learned in those creaky catalog volumes and musty files paid off. Among my early jobs was a research assistantship to one of my teachers, William H. McNeill, U-High’34, AB’38, AM’39, during the project that became his best-selling Plagues and Peoples (Anchor, 1977). I also edited a paper by Theodore R. Marmor at the University’s Center for Health Administration Studies at a time when editing could still include photocopying and literally cutting and pasting.

My next job, as a science book acquisitions editor at Princeton University Press, presented a new information challenge. I learned the arcane skills of identifying the small minority of scientists motivated and able to write good books, whether specialized monographs or popular syntheses. Spies—and I have worked with at least one—call these skills tradecraft. You can’t find them in books on publishing, and at least then they were rarely discussed openly. I kept my strategies close to the vest: subscribing to a dozen university bulletins and circling promising lectures; collecting campus telephone books and departmental lists of current grants and research projects. Now universally available, such documents were not then easy for visitors to find on many campuses.

When I decided to leave publishing for independent writing in 1991, my years of coping with scarcity served me well. I had learned shortcuts in working with the few major end-user databases of the late 1980s; these helped me navigate the flood of data that became available a few years later with the World Wide Web and the modern browser. People who have grown up with digital abundance may excel at programming, games, and many other electronic skills, yet (as I discovered when I wrote an article on the subject ten years ago) there are today very few undergraduate power researchers. For older users, scarcity had sharpened our technique; it was like training while wearing weights, or at high altitude.

My writing explored unintended consequences. The timing was fortunate: the new web was complementing and amplifying conventional print publications but not yet competing directly with them. For writers and publishers, the web boom of the late 1990s was the best of both worlds, a cornucopia of advertising supporting conventional products, like the excellent Britannica Yearbooks of Science and the Future series. Sadly, thanks to science, those yearbooks had no future.

Other high-quality publications to which I’ve contributed—Civilization, the Industry Standard, and now the Wilson Quarterly—are also casualties of the rise of free information sources and a mass migration of advertisers from paid to free content. Newspaper industry advertising revenue actually rose to record heights in the first decade of the web, reaching an all-time peak of nearly $50 billion in 2005, only to drop to $22.3 billion by the end of 2012. The great journalist Edwin Diamond, PhB’47, AM’49, presciently paraphrased Tom Lehrer’s “Wernher von Braun” at an Annenberg Washington Program conference on the Internet in 1995: “‘I make them go up, where they come down / Is not my department,’ said Wernher von Braun.”

Book publishing, protected from media buyers’ herd mentality, has been more stable. The problem is that the abundance of titles—300,000 from conventional publishers in the United States alone—has exceeded the market’s ability to absorb them, leading to smaller print runs and higher prices.

Books, along with social media, are thus becoming part of what might be called a loss-leader society, extending from the ever-expanding but still unprofitable Amazon itself to the humblest tweeter. Unpaid or low-paid promotional activities aim at some long-term goal, whether ten-figure market dominance, speaking appearances, or merely a better job.

Am I, then, a fool for welcoming information abundance rather than joining the throng of culture pessimists? There’s actually a lot to be said for ignoring the odds sometimes; if you don’t believe this, see Shelley Taylor’s Positive Illusions: Creative Self-Deception and the Healthy Mind (Basic Books, 1989). As a recent New York Times exhibition review points out, if Christopher Columbus had not relied on Ptolemy’s inaccurate estimate of the earth’s circumference, he probably would never have tried to find a western route to China.

Alarming as some trends in higher education and the media have been since 1972, I’m lucky that I was displaced early and often. Necessity forced me to become the generalist that I was all along, even as I tried to deny it and stake out a specialty. I’ve even been able to write something more interesting in my original field—a study of the biology and culture of the German shepherd dog, still unpublished—than I would have if I had become a professor instead of a science editor. That experience helped inspire me to write a book I’m now completing on positive unintended consequences.

My mantra remains the graffiti famous in Europe in the pivotal years 1968–69: “Be realistic. Demand the impossible.”

Edward Tenner, AM’67, PhD’72, is an independent writer and speaker on technology and culture. His book Why Things Bite Back: Technology and the Revenge of Unintended Consequences (Vintage, 1997) has been an international best seller. His most recent book is Our Own Devices: The Past and Future of Body Technology (Vintage, 2003). Tenner is a visiting scholar in the Rutgers Department of History and an affiliate of the Center for Arts and Cultural Policy Studies of Princeton’s Woodrow Wilson School.