(Illustrations by Federico Jordan)

Free for all

Spring quarter, like any other, offered an encyclopedia of public talks on campus, illuminating topics art historical, zoological, and most everything in between. At 11 of those talks, the Magazine staff were there. Here’s what we learned.

Dessert course

For a couple of hours in May, novelist Jeffrey Eugenides basked in Hyde Park’s version of a celebrity welcome. An eager young crowd filled the Logan Center performance hall for his late afternoon reading. They listened, rapt, and lined up for the author to autograph their copies of his latest book, The Marriage Plot (Farrar, Straus and Giroux, 2011). “You’re obviously much better than my students at Princeton who never come out in the afternoon,” he said. “Don’t put it on Facebook.”

The audience also stayed in their comfy orange seats for his Q&A with creative writing instructor and Booklist senior editor Donna Seaman. Citing Eugenides’s first novel, The Virgin Suicides (Farrar, Straus and Giroux, 1993), she asked: Is empathy essential to fiction? “To write about a range of characters, a range of different people, you’re going to have to admit to yourself that your own mind and ego are not at the center of the world,” he answered. “Writing can make you into a terrible person that no one wants to live with, but one of the things that it can hopefully do is make you a listener attentive to other people’s problems.”

Is the novel of ideas alive and well? Yes, said Eugenides, with a nod to the University. “I tremble with happiness to think of Saul Bellow [X’39] being here. He’s one of my favorite writers and a writer of the novel of ideas as well. It sounds so heavy and awful, … but when you meet a real novel of ideas, there’s nothing that really is more exciting and really feels like it’s teaching you about life and how to live—but not in a way that’s like taking castor oil, but in the opposite way. More like ice cream.”—Elizabeth Station

 

Glimmers of grace

William F. Schulz, AM’74, the Richard and Ann Pozen visiting professor in human rights, marveled in horror at “the sheer creativity of modern torture.”

Brazilian captives were thrown naked into small concrete cells empty except for a boa constrictor. In Afghanistan, the mujahideen tied living prisoners to corpses, leaving them tethered in the sun to rot. Central American soldiers cut open the wombs of pregnant women, tossed the fetuses into the air, and caught them on their bayonets.

Schulz, a former executive director of Amnesty International USA and now president and CEO of the Unitarian Universalist Service Committee, described those atrocities during his May 7 lecture, “Is Human Dignity Inherent? What Torture Has Taught Me.” Although Schulz is an avowed opponent of torture, his answer to his title question was no. “My quarrel is not with the concept of dignity,” he said. “It is with the notion that it is inherent.”

He argued that people must learn to act with dignity and assign that characteristic to each other. As Schulz completed his litany of torture methods—the examples went on—he wondered, “What does this mean for the notion that a torturer, too, is a person of inherent worth and dignity?” The implication was that the perpetrators of such barbarism were not.

Denying dignity as an absolute, Schulz acknowledged, exposes human rights to the perpetual reconsideration of public opinion. But a global consensus expressed in treaties, declarations, and conventions, he insisted, provides a stronger legal and moral basis than appealing to an ethereal quality. “The answer to the question of why torture is wrong is because the world community, struggling as it does to describe the nature of a civilized society, says that it’s wrong.”

Schulz concluded with stories of survivors who reclaimed their lives and even reconciled with their torturers. Those “glimmers of grace” assured him that although dignity might not be inherent, it is resilient.

“If even those who endured the most extreme brutality retained their faith in human dignity, then assigning that to others and protecting it whenever it is under threat, that is surely the highest and the noblest calling.”—Jason Kelly

 

Filthy Romans

I can’t say I expected to laugh much during Elizabeth Clark’s lecture “‘Rome’ in the Nineteenth-Century Protestant Imaginary: American Professors, Ancient ‘Pagans,’ and Early Christianity.” I also wasn’t expecting an assessment of the classical world I had never heard before.

About 25 people (including one of the regular bartenders at Jimmy’s Woodlawn Tap) settled into folding chairs in the crimson-carpeted, portrait-lined Common Room at Swift Hall to hear Clark, a professor of religion and history at Duke University. Before she began, Clark passed out a written list titled “The Professors,” six early-19th-century academics who taught at places like Princeton and Yale and who took a decidedly dim view of Rome. In the 18th century, she explained, the virtues of Rome were seen as similar to those of the new American republic; later, in the late 19th century, “high-minded classical study” was embraced as “an antidote to the materialism of the Gilded Age.”

In between come Clark’s professors, who adopted and expanded on the critical tone of Roman authors such as Livy, Tacitus, and Juvenal. For her research, Clark looked at the academics’ published works, their class notes, their students’ notes, memoirs, letters, sermons, and diaries.

Their writings usually included “perfunctory praise” for Roman accomplishments: a common language, roads, jurisprudence, government. After this throat-clearing, the professors went on the attack. Clark paraphrased Henry Boynton Smith of Union Theological Seminary: “Woman was everywhere debased. Unnatural lust prevailed. The Romans … plunged into the fiercest excesses of gluttony and sensuality.”

The audience snickered. Clark quoted Roswell Hitchcock, also of Union, who put it even more sharply: “‘Every man became a paramour and nearly every woman a harlot.’” We snickered some more.

The alleged moral failings of the Romans provided a convenient excuse for the aspects of early Christianity that 19th-century Protestants didn’t like, Clark said. Asceticism, for example, was seen as a necessary reaction against Rome’s extreme degradation. Clark invoked Hitchcock again: “The age into which Christianity came was most debauched and slimy, reeking with pollution.”

In contrast to the degenerate Romans were the Germanic tribes who eventually overran the Roman Empire. The Teutons, said Clark, were seen as part of God’s providential plan to “reinvigorate a decaying Christendom.”

Too much illicit sex not only made the Romans degenerate—in Hitchcock’s view, it also made them short and effeminate. The Romans “had been dwarfed and enfeebled by their dissolute civilization,” Clark said. “The Teutons were also better psychologically than the unmanly Greco-Roman civilization.” At this point the echo of later German history became a little uncanny. The audience stopped laughing.—Carrie Golus, AB’91, AM’93

 

The real North Korea

“He was an idealist,” said historian Andrei Lankov of the first North Korean leader, Kim Il Sung. But “as the history of the 20th century has shown us many times, idealists very often kill many more people than cynical, pragmatic opportunists.” A Soviet puppet when North Korea was established in 1948, Kim swiftly shed that role and built his own hardline model of Leninist socialism, explained Lankov. The professor at Seoul’s Kookmin University spoke at the Seminary Co-op in April to promote his book The Real North Korea: Life and Politics in the Failed Stalinist Utopia (Oxford University Press, 2013).

Lankov began studying the nation in the mid-1980s, when he lived there as a Soviet exchange student. To further “the great cause of communism,” Kim created a state where money was essentially useless and the government distributed everything from food to socks. The system was painfully inefficient, but North Korea stayed afloat through deft diplomacy, Lankov said. Kim exploited the rivalry between the Soviet Union and China, promising to remain neutral only in exchange for financial support.

In the early 1990s, however, China curbed its subsidies and Soviet funds disappeared as the USSR dissolved. The North Korean people suffered devastating famine, yet Kim Il Sung and his son and successor, Kim Jong Il, kept the country going by continuing their “brilliant” foreign policy, said Lankov, using nuclear threats to squeeze aid from “mortal enemies” like the United States. At the same time, the subsidy cuts spurred growth of a grassroots market economy that has helped North Korea to putter along.

Nonetheless, Lankov believes the government is likely doomed. Chinese-style economic reform is risky since it would necessitate contact with South Korea, whose prosperity would leave North Korean citizens demanding more dramatic change. Information about South Korea is seeping in now—those who cross the poorly controlled Chinese border bring back stories, and many North Koreans watch illegal but widely available South Korean movies on DVD. Power seems to be slipping away from Kim Jong Un already. In conversations with midlevel government officials, Lankov has noted expressions of dissatisfaction with and resentment toward the regime—something unheard of in his early days as a researcher.

If resistance comes, it will not be a velvet revolution, Lankov predicted. Kim loyalists will fight the rebellion, afraid to lose everything. “North Korea is a very sad story,” he concluded, “… a story of idealists who wanted to create a paradise, who ended up creating a hell, who don’t know how to get out of this hell.”—Katherine Muhlenkamp

 

Dinosaur technology

Thirty minutes after his talk was set to begin, UChicago paleontologist Paul Sereno burst through the door at Crerar Library with a jovial apology and a plastic tub full of dinosaur skulls. “PowerPoint problems!” he said, catching his breath. Then, booting up his computer, he launched into a presentation on imaging technology’s revolutionizing effect on the study of dinosaurs. Using CT scans and digital software, researchers can see inside bones, build detailed models with the push of a button, and make long-dead animals stand up and walk. 

Medical and industrial scanners allowed Sereno’s lab to create a prototype Nigersaurus skull—the original specimen was too fragile to cast and mold by hand—and see how the brain fit inside, tilting at an angle that revealed the animal’s strange posture: head to the ground, flat mouth feeding like a vacuum cleaner. “It was a stunning confirmation that this was a Mesozoic lawnmower.”

He described how his research team used visualization and animation software to examine the joints of an early raptor’s digging claws and to scrutinize the massive jaws of the 40-foot-long SuperCroc. In a video clip, the skeletal mouth snapped open and shut while Sereno explained that its top and bottom teeth “didn’t interact”—they sat an inch apart—and were therefore made not for fishing but for “grabbing a dinosaur.” Now, he said, he’s working to figure out how the long-legged crocodilian Araripesuchus, nicknamed DogCroc, moved at a gallop. In part, Sereno plans to do that by mapping scans of its fossil onto footage of a modern Australian crocodile in full gait, melding them “until you see a crocodilian skeleton of 100 million years ago in a step cycle, running. And you can zoom around and look at the joints and see if they’re reasonable.”

Technology is making it possible, he said, to test theories about how long-dead creatures once stood and walked, how they hunted, and what they ate. “You no longer are going to be safe as a paleontologist making a two-dimensional drawing and saying, ‘I think the animal did this.’”

Technology will also shift the methods for building physical models of dinosaurs as sculpting, casting, and molding by hand give way to scans, fabrication machines, and 3D printers. “You take the file and machine it,” said Sereno, who’s working on a machine-made foam model of a 55-foot dinosaur. “No longer the bone-by-bone molding, casting, degrading molds that have gone into the time-honored method of reconstructing a dinosaur.”—Lydialyle Gibson

 

America, more or less

Dean Price, one of the hard-luck strivers who populate George Packer’s new book, predicted the end of the big-box sprawl that had desiccated the North Carolina tobacco country where he grew up. High oil prices and other economic shocks, Price believed, would trigger a return to localized social and financial systems reminiscent of Jefferson’s agrarian ideal. An unwinding.

Packer, a New Yorker staff writer, admired the poetry and hope in that idea and named his book The Unwinding: An Inner History of the New America (Farrar, Straus and Giroux, 2013). But he had a different interpretation. “I thought of the unwinding as what Dean has been living through and what we were seeing all around us,” he said during a May 30 conversation with Northwestern University associate professor Peter Slevin at International House. “Which is to say, old structures that supported life for ordinary people, middle class, working class people, were collapsing.”

For Price and his generation, born around 1960, Packer said, the postwar social contract—fulfilled by corporate citizenship, trade unions, public schools, local newspapers—has been unwinding throughout their adult lives. His book explores “what happens when the contract is gone and the deal is off.”

Youngstown, Ohio, a husk of a city orphaned by the steel industry, represented the worst-case scenario. Another of Packer’s subjects, Tammy Thomas, raised a family there—and later tried to raise Youngstown itself—with dogged resistance to the economic and social decay around her.

Packer’s sympathy for Price, Thomas, the underemployed, and the foreclosed contrasted with his contempt for excess from Silicon Valley to Wall Street and Wal-Mart to Washington. He blamed political climate change on what he called the noxious-gas rhetoric of Newt Gingrich: “He was not an institution builder, he was an institution destroyer.” And Robert Rubin, the former Goldman Sachs executive and treasury secretary who cashed in at a crashing Citigroup, symbolized the collusion of money and power: “Here’s a shining star of the establishment who, in the end, represents institutional failure.”

Those failures were like land mines in ordinary people’s lives, but in Packer’s telling, the victims remained confident about the expansion of equal opportunity. He had a different interpretation of that too. “The circle of inclusion is wider, but in reality I’d say no, more people are falling behind,” Packer said. The Unwinding chronicles modern America’s obstacles to catching up.—Jason Kelly

 

Free zone

Sociologist Saskia Sassen is credited with coining the term “global city” in her 1991 book of the same name. On May 3 the former UChicago professor returned to campus from Columbia University, where she cochairs the Committee on Global Thought, to kick off “Globalization and Mobilities,” a conference on the theory and methods of human movement.

Before beginning any research project, Sassen says, “I need a zone, a space of a certain kind of flexibility, freedom. … I call this zone the ‘zone of before method.’” While in the zone, one of her tactics is “to actively destabilize stabilized meanings.” Rather than reject broad, abstract categories such as economy, polity, family, or border, “I accept their power,” she explains, “but I want to know what they hide.”

When Sassen theorizes about immigration, for example, she remains wary of the term. “You say ‘immigration’ and you have evoked geography, history, suffering. … How can you do the research and theorize this stuff when it’s so chock full of all kinds of meanings and realities?” Most debates about remittances assume that immigrant workers export their wages to poor countries: “They come here, they take our jobs, and then they send our money back home.” But when Sassen focused on the countries that receive remittances rather than on the senders’ countries of origin, she made a surprising discovery. The top ten remittance-receiving countries include five rich countries. “And in fact, something I find absolutely adorable, the United States is in the top 20.”—Elizabeth Station

 

Art emergencies

In an emergency, who is in charge of rescuing a distressed nation’s art and culture, and how does that happen? Richard Kurin, AM’74, PhD’81, the Smithsonian Institution’s under secretary for history, art, and culture, addressed these questions in a May 1 lecture, “Saving Haiti’s Heritage: Cultural Recovery after the Earthquake.” At the Harris School of Public Policy Studies, over a pizza lunch for about 20 hungry student and faculty guests, Kurin expressed his frustration with red tape and previous Smithsonian higher-ups who took a “not my job” attitude toward saving culture after events like Hurricane Katrina or the Iraq War.

To illustrate the hurdles involved, Kurin discussed the aftermath of the 2010 earthquake in Haiti. More is required than a desire to save art: at-risk artifacts must be triaged, physical work space secured, and funds raised, all of which was done when Smithsonian conservators landed in Haiti. (“You’re doing it like it’s a PTA bake sale,” Kurin joked about the initial $276,000 raised by the charity Broadway League to help establish a Cultural Recovery Project in Port-au-Prince.) All this needs to be done quickly too. Kurin compared a scene at Haiti’s national cathedral to the apocalyptic film Mad Max: the cathedral’s stained-glass rose window survived the quake, but scavengers then trashed the glass to claim the lead that held it together.

Kurin called for a version of Doctors Without Borders, but for culture. Then questions like Who is in charge? Who do you call? wouldn’t be up in the air following a crisis. In terms of existing solutions, Kurin doesn’t feel favorably toward UNESCO, which he called a “talk group, rather than a do group,” or toward countries with ministers of culture. “My experience is, a lot of ministers of culture make great declarations, they talk big,” but “nothing happens.”

Kurin envisions a bigger, better-planned role for the Smithsonian after future earthquakes, floods, and wars, both international and domestic. One of his anecdotes showed that it helps to have friends in high places. Kurin said he was able to gain speedier access to the country on behalf of the Smithsonian with the help of Haiti’s then First Lady Elisabeth Préval, once a student of his at Johns Hopkins; the institution directly asked Michelle Obama (who was in the process of donating her inaugural gown to the museum) for help as it stepped in to assist Haiti; and the actor Ben Stiller was motivated by his work in the 2009 film Night at the Museum: Battle of the Smithsonian to donate money to the institution’s efforts.—Claire Zulkey

 

Undead Lenin

In a 1989 Communist Party poster, Vladimir Lenin rose from the dead. Alexei Yurchak, an associate professor of anthropology at the University of California, Berkeley, showed the playful image during a May talk in the Logan Center. Walking out of his mausoleum and carrying a bucket of white paint, the revered leader, Yurchak said, had “cheekily covered the word ‘Lenin’ written on the tomb’s façade in economical Soviet font with his famous signature, ‘With communist greetings, Vladimir Lenin.’” The poster was part of the Soviet Union’s attempt in its final years to accomplish an “almost surreal” task. “The party called for bringing Lenin back to life so he could speak today, in our contemporary language, about our contemporary problems.” Mikhail Gorbachev launched Perestroika in 1985 with the aim of returning to a true understanding of Lenin’s thought. But by 1989 or so, combing through archives to achieve that understanding had been declared futile. Party officials, Yurchak said, decided that the theorist’s words had been distorted through all periods of Soviet history—turned “into a corpus of dead quotes.”

The party concluded that “canonized Lenin” had to go, leaving only “the true authentic Lenin; the pure core.” It sought to discover a Lenin as yet unknown. Some party theoreticians suggested taking Lenin’s original discourse and blending it with the work of other thinkers “whose writings Lenin used in his thinking or would probably use if he had known them”—some as ideologically remote as Locke and Rousseau—to create Lenin’s “new living voice.” It was, Yurchak said, “an unprecedented move.”

At the same time, an obsessive hunt for previously undiscovered aspects of Lenin’s life and death began. Party publications ran long articles scrutinizing his final days, searching for clues to his ultimate design for communism. Weeklies began investigating his family history in response to rumors that he wasn’t purely Russian (Lenin had identified himself as Russian, but he had a grandmother of German and Swedish descent, as well as a Jewish grandfather whose ethnicity Stalin learned of after Lenin’s death and covered up).

In the end, these investigations produced more questions than answers. The status of Lenin as the holder of “unquestionable truth” had been destroyed. He had become a complex figure who would ultimately remain unknown. Canonized Lenin gave the country legitimacy and coherence. Without him, there was no Soviet Union. “Gorbachev called for a thorough rejuvenation of the Soviet system by means of ridding ourselves of the distortions and canonizations of Lenin,” concluded Yurchak. “In fact, it spelled the beginning of the system’s unexpected and spectacular revolution.”—Katherine Muhlenkamp

 

Night sights

If a man sees himself in a dream drinking warm beer, it is bad, for it means suffering will come upon him.

“I pronounce that one every time I go to Britain,” joked Oriental Institute professor Robert Ritner, PhD’87, during a talk on ancient Egyptian dream interpretation and the dangers Egyptians believed could afflict them while they slept. Warning his Breasted Hall audience that much of his talk would be “X-rated”—“because of the nature of dream interpretation and of dreams themselves,” and because the texts were all written by men—Ritner began by explaining that in Egyptian literature, “the dreamer is envisioned as a spectator rather than a participant in the dreams.” To dream, he said, “is to be awake during sleep.”

The interpretations, laid out in papyri dating as far back as 1700 BC, were often paradoxical or punny and began with a common refrain: “If a man sees himself in a dream …” Destroying one’s clothes meant release from all evil; seeing oneself dead meant a long life. Seeing a large cat meant a large harvest; being shod with white sandals meant roaming the earth. Looking into a deep well meant prison.

Ritner’s warning came true too: many of the interpretations concerned dreams about sexual encounters with sons and sisters, wolves, cattle, goats, lions, a female jerboa. (In a late period text offering interpretations for women’s dreams, every single entry was sexual.) And not all those dreams were considered bad; in fact, many of them weren’t. A man who saw himself copulating with his mother, for instance, could take it as a good omen. “It means his clansmen will cleave fast to him,” Ritner said.

To thwart the effects of bad dreams and ward off nighttime attacks by demons or curses, Egyptian texts offered incantations that a sleeper could recite upon waking. People placed clay lamps shaped like serpents—an important animal in the protection from sleep-borne dangers—in every corner of their bedrooms and lit them before bed. “Basically,” Ritner said, “night lights.” Magic knives made from hippopotamus ivory were used to trace a perimeter around women’s and children’s beds. Some of these knives are in the OI’s collection, and “you can see the abrasions from being run across a clay dirt floor for many, many years.” Egypt’s state department produced what were essentially voodoo dolls, Ritner said, to protect the country against wars and slanders from abroad, but also against another enemy: bad dreams. “Every evil dream,” he said, reciting the curse carved into one doll, “every evil sleep.”—Lydialyle Gibson