Gritty and Glittery

From The Los Angeles Review of Books:

THESE ARE DARK DAYS for theater people. Playhouses sit empty. Performers cash unemployment checks. Broadway remains closed and won’t reopen until May 2021 at the earliest. Many industries have found ways to accommodate our apocalyptic new reality, but commercial theater is not among them. There can be no “outdoor dining” equivalent of a $21 million musical.

Here to fill the void — and to remind us of a better time — is Michael Riedel’s Singular Sensation. This juicy, jaunty book is about Broadway in the 1990s, a period of great change that paved the way for the industry’s recent artistic and financial prosperity. Singular Sensation offers less an explanation of present-day abundance, however, than a reminder of all that has been lost. “I never intended the subtitle of this book — The Triumph of Broadway — to be ironic,” Riedel writes in his foreword. Ironic, alas, it is, though the author insists better times are on their way. “There will be a comeback,” he says, “and Broadway is good at comebacks.”

Riedel’s last book, Razzle Dazzle: The Battle for Broadway, tracked the demise and resurgence of New York City, Times Square, and Broadway in the 1970s. Starring two lawyers who took over the Shubert Organization, Broadway’s biggest theater chain, Razzle Dazzle was a relentlessly entertaining piece of cultural history. Riedel, who until his COVID-19 furlough was the much-feared theater columnist at the New York Post, brought his trademark voice — biting, loving — to an epic that was every bit as gritty as it was glittery. He showed how Gerald Schoenfeld and Bernard Jacobs, the hero-lawyers at the center of the book, steered Broadway through fiscal catastrophe and helped deliver New York City into its current affluence. More than just a dishy gossip compendium (though it was that), Razzle Dazzle illustrated just how mutually intertwined the destinies of cities and their arts sectors are.

Singular Sensation has a smaller case to make, and is accordingly a shorter, more narrowly focused book. Much like Razzle Dazzle, it unfolds through a series of show profiles that embody the significant shifts of the era: the decline of the British mega-musical, the reinvigoration of American playwrighting and musical comedy, and the increased corporatization of the producer class. Also like its predecessor, it’s a blast.

We begin with Sunset Boulevard, the musical that effectively ended Andrew Lloyd Webber’s Broadway dominance. By 1993, the year of the show’s American premiere, Lloyd Webber had firmly established his commercial preeminence. The British composer’s productions, which included Cats (1982) and The Phantom of the Opera (1988), delivered unprecedented weekly grosses; Sunset Boulevard, adapted from the 1950 Billy Wilder film, seemed destined to join this run of hits. Production staffers referred to it as “The Female Phantom.”

But there were problems. Stage star Patti LuPone opened the show in London on the understanding that she’d follow it to New York. When Glenn Close won the favor of Lloyd Webber in another pre-Broadway tryout, however, LuPone was fired. On learning the news from a Liz Smith column in the Post, LuPone says she “started screaming […] I had batting practice in my dressing room. I threw a floor lamp out the window.”

Link to the rest at The Los Angeles Review of Books

Please Stop Comparing Things to “1984”

From Electric Lit:

George Orwell’s 1984 is one of those ubiquitous books that you know about just from existing in the world. It’s been referenced in everything from Apple commercials to Bowie albums, and is used across the political spectrum as shorthand for the silencing of free speech and rise of oppression. And no one seems to love referencing the text, published by George Orwell in 1949, more than the conservative far-right in America—which would be ironic if they’d actually read it or understood how close their own beliefs hew to the totalitarianism Orwell warned of.

Following last week’s insurrection at the Capitol, Josh Hawley said it was “Orwellian” for Simon & Schuster to rescind his book deal after he stoked sedition by leading a charge against the election results. Donald Trump, Jr. . . . claimed after his father was kicked off Twitter that “We are living in Orwell’s 1984,” then threw in a reference to Chairman Mao for good measure. . . . (V)oices all over Twitter lamented the “Orwellian” purge of their followers after accounts linked to the violent attack were banned from the platform. It’s enough to make an English teacher’s head spin.

I understand why Orwell’s dystopian novel is so appealing to people who want to decry authoritarianism without actually understanding what it is. It’s the same reason I relied on the text for years in my own classroom. Although we often urge our students to resist easy moralizing, the overt didacticism of 1984 has long been part of its pedagogical appeal. The good guys are good (even if they do take the last piece of chocolate from their starving sister or consider pushing their wife off a cliff that one time). The bad guys are bad. The story is linear and easy to follow; the characters are singularly minded and voice their views in straightforward, snappy dialogue; the symbols are obvious, the kind of thing it’s easy to make a chart about or include on a short answer section of a test. (20 Points: What does the paperweight represent to Winston, and what does it mean when, after it is shattered, he thinks, “How small…how small it always was!”) Such simplicity can be helpful when presenting complicated ideas to young people who are still developing analytical and critical thinking skills. And so, like so many other teachers, I clung to Orwell’s cautionary tale for a long time as a pedagogical tool despite its literary shortcomings.

But when Trump began his rise to political power, I started to notice the dangerous inoculating quality that the text had in my own classroom. Because the dystopia of 1984 was such a simplified, exaggerated caricature, it functioned for my students not as a cautionary tale, but as a comforting kind of proof that we could never get “that bad.” I didn’t take the step to remove the text from my curriculum, but more than in previous years, I began to feel the need to charge the students to consider how things like “doublethink” and Newspeak related to our own political moment. But beyond the intellectual pleasure of the exercise itself (they were more than ready to offer examples of these methodologies across the political spectrum), most students could not bring themselves to consider that the United States could actually sink into the kind of totalitarian control that Oceania experienced. They cited our “freedoms”—speech, press, etc.—as mitigating factors. They trusted norms, even as those norms were being continually tested and broken in real time, the goalposts moving ever closer to political collapse. 

Link to the rest at Electric Lit

  1. PG apologizes for a delayed start to his posting today. Besides blaming Covid (which neither PG nor Mrs. PG nor any PG offspring have caught, but it does tend to weigh on PG’s mind nonetheless), PG had a few surprises earlier in the day that occupied way more time than they should have.
  2. PG reminds one and all that TPV is not a political blog and, given the toxicity of political discussions, debates, shouting contests, etc., in the US during the past few months, PG especially doesn’t want contemporary politics to intrude into the respectful, considerate and interesting environment PG and many other visitors appreciate when they click on a link that leads them here.

1984 was published in 1949, when Joseph Stalin had been ruling the Soviet Union and its people with an iron fist since becoming General Secretary in 1922. In his book, Orwell was clearly referring to something like a Stalinist society, and to some of the tactics of the Communist Party, in a fictional context.

Fortunately, regardless of which of the two major-party candidates had won the most recent presidential election in the United States, referring to either as an Orwellian or potentially Orwellian head of state would be a gross overstatement.

Senator Hawley’s comment about Simon & Schuster acting in an “Orwellian” manner in canceling his book contract was vastly overheated. While PG is fully capable of deploring the behavior of various major and minor US publishers with a variety of insulting adjectives, “Orwellian” is not one he would use.

The author of the OP, a former English teacher turned author, also took a Hawleyesque turn in some parts of the OP, insulting Republicans in passages that PG omitted from his excerpt.

In both the Senator’s and the former English teacher’s expressed opinions, PG observed the arrogance and foolishness of those who believe their education automatically brings them common sense and perspective on almost any contemporary event.

A Caveman Would Never Do CrossFit. Why There’s Nothing Natural About Exercise

Speaking of sitting around, sheltering in place, eating and drinking holiday cheer, etc.

From The Wall Street Journal:

One of the biggest myths about exercise is that it’s natural. If anything, human instincts lean more toward taking a nap. Want to feel bad about skipping a workout? Blame evolution.

Daniel E. Lieberman argues this theory in his new book “Exercised: Why Something We Never Evolved to Do Is Healthy and Rewarding,” which tips some of the fitness world’s most sacred cows. Everyone knows exercise is good for them, yet studies show most people don’t get enough of it. Mr. Lieberman set out to find out why, and the answers, he hopes, will help remove some of the shame people feel about their own inactivity that makes it even harder to get moving.

Mr. Lieberman criticizes people he calls “exercists” who brag about how much they work out and pass judgment on the less fit as unnaturally lazy. Those who take the escalator instead of the stairs are not guilty of the sin of sloth, he writes, but doing what they were evolved to do—saving energy only for what is necessary or recreational.

Other highlights from the book out Jan. 5: People who believe brutal cross-training workouts bring them closer to the brawny body that belonged to their ancient forebears probably are not familiar with research that shows Neanderthals were only slightly more muscular than today’s regular humans. Fitness buffs who think civilization’s pampering has muted our natural strength might not realize that a profoundly inactive couch potato moves more than a typical chimpanzee in a day. As for our natural talents, it bears noting that the average person runs as fast as a hippo.

. . . .

I wonder how people who do high-intensity cardio with weight training will respond to this book. I feel like you’re calling out CrossFit in particular.

I guess I am. I’ve done some CrossFit workouts, they’re great. I’m not anti-CrossFit. But there’s this CrossFit mystique that your inner primal macho ripped hunter-gatherer ancestor is who you were meant to be. If that gets you happy, that’s fine, all power to you, but you don’t have to make the rest of us feel bad for not doing these intense crazy workouts. They’re not necessary.

You get this sense by reading some books or popular articles that those of us who are contaminated by civilization are somehow abnormal because we don’t want to get out of bed and run an ultramarathon or go to the gym and lift 300 pounds. Our ancestors never did that and they would think it’s crazy because they were struggling to survive with limited food.

. . . .

From an evolutionary standpoint, how do you explain people who love to exercise?

It’s not that we don’t have rewards for being physically active. Our brain produces this wonderful mix of chemicals that makes us glad we’ve exercised. The sad part of the equation is that our brain doesn’t create these chemicals to get us to exercise. We’ve turned something normal and simple and basic into a virtue signaling thing.

It’s like people who are intolerant of other people’s weight. Most people in America are now overweight. They’re not overweight because of some fault of their own and they’re struggling, and yet there are people out there who are unacceptably mean to people who are struggling. I think we need to have the same level of compassion toward people who are struggling to exercise. There’s nothing wrong with them.

Do you disagree with those who call sitting the new smoking?

Let’s relax. The chair is not the enemy.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG notes he usually doesn’t include two posts from the same source on the same day and hopes all the lawyers who work for the Wall Street Journal are skiing in New Hampshire or otherwise distracted. However, a great many of those writers and editors who work for news publications that cover traditional publishing seem to be experimenting with how it would feel to be permanently unemployed over this holiday break.

PG won’t name names, but some online publications that find something or several somethings to post about traditional publishing every day are running the same headlines they were before Christmas.

‘Finding My Father’ Review: The Lies He Lived By

From The Wall Street Journal:

Thirty years ago, I became hooked on the work of socio-linguist Deborah Tannen after reading her smart, savvy, relationship-saving (yup, my husband read it, too) bestseller “You Just Don’t Understand: Women and Men in Conversation.” Here, and in Ms. Tannen’s many books since then, she displays an acute ability to decode and explain the hidden messages and assumptions our words unwittingly convey, whether about power, status, a wish for greater connection or its opposite. If there is a common lesson in all her volumes (among them, “I Only Say This Because I Love You” and “You’re the Only One I Can Tell”), it’s the importance of listening—and learning how to tune in to what’s really being said.

Now, in her appealing memoir “Finding My Father: His Century-Long Journey From World War I Warsaw and My Quest to Follow,” Ms. Tannen reveals how she acquired this skill, courtesy of her adored father, Eli Samuel Tannen:

When I was a child, and the family gathered after guests left, my father would comment, “Did you notice when she said . . . ?” and go on to explain what meaning he gleaned from the guest’s tone of voice, intonation, or wording. So I trace to him what became my life’s work: observing and explaining how subtle differences in ways of speaking can lead to frustration and misunderstandings between New Yorkers and Californians, women and men, and people of different ages, regions, or cultures.

Over the course of his long life—he died in 2006, at the age of 97—Eli loved nothing more than to reminisce, especially about his earliest years in Warsaw, where he was born into a large Hasidic family. He left that world behind when he came to America in 1920, but he never stopped missing it. His oft-repeated tales made his wife, Dorothy—Ms. Tannen’s mother—roll her eyes at what she dismissed as tedious stories about dead people. For her part, Dorothy remembered almost nothing about her childhood in what is now Belarus, from which she and her family had escaped in 1923. Why look back on a place that, had they stayed, would have probably cost them their lives in the Holocaust?

But young Deborah loved her father’s tales, many of which resembled scenes ripped from an Isaac Bashevis Singer story. She was captivated by Eli’s detailed memories of the tightly knit Jewish community where he spent his first 12 years. The poverty was visceral in a neighborhood teeming with bow-legged children suffering from rickets, and shoeless beggars wearing clothes made from rags. Yet Eli felt sustained by the sheer liveliness of his grandfather’s multigenerational household, where the boy lived with his widowed mother and only sister, surrounded by aunts, uncles and cousins of all ages.

Many of these relatives had remarkable stories of their own. Aunt Magda, for instance, having early in her life eschewed religion for communism, fought with the Soviets in World War II and afterward became a high-ranking official in the Polish government. Aunt Dora’s brilliance led her to a career as a respected mathematician and physicist who studied with Einstein, became his lover and, even after aborting her pregnancy by him, followed him from Europe to Princeton, N.J.

Eli’s own life took a different course. Despite high aptitude scores at his American public school, he dropped out at 14. He spent the next three decades struggling to make a decent living, first to support his widowed mother and his sister, and then his wife and children. In those years he notched up 68 jobs and occupations, from garment worker to prison guard, while also going to night school, eventually earning a law degree before finally opening his own law office in the 1950s. When he retired from his law practice in the late ’70s, he began writing down and recording his memories on tape, so that Ms. Tannen might someday write a book about him. Here, at last, is that book.

In her overly discursive early chapters, Ms. Tannen herself wonders why it took so long. To be sure, it was a daunting task to sift through the countless letters, notes and personal journals Eli had preserved over the course of his long life. But perhaps there was a deeper obstacle. In reconstructing the substance of her father’s life, the author admits that she was also forced to revise what she thought she knew about her parents, both as individuals and as a couple.

The first family myth to fall was the intimate setting of his grandfather’s Warsaw household. Instead, Eli’s letters reveal Aunt Magda admonishing her nephew for waxing nostalgic about a family warmth that never existed. Unable to rebut a truth he had preferred not to acknowledge, Eli admits to his idealized childhood image and owns up to the lack of family closeness he himself had so painfully observed, experienced—and, it seems, repressed.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Ladies of the Good Dead

From The Paris Review:

My great aunt Cora Mae can’t hear well. She is ninety-eight years old. When the global pandemic reached Michigan, the rehabilitation center where she was staying stopped accepting visitors. There were attempts at FaceTime, but her silence made it clear that for her, we had dwindled into pixelated ghosts. She contracted COVID-19 and has been moved again and again. When my mother calls to check on her every day, she makes sure to explain to hospital staff that my great aunt is almost deaf, that they have to shout in her left ear if they want to be heard.

Cora Mae has a bawdy sense of humor. Most of the time when she speaks, it’s to crack a joke that would make most people blush. She wears leopard print and prefers for her hair to be dyed bright red. I have tried to imagine her in the hospital, attempting to make sense of the suited, masked figures gesticulating at her. She doesn’t know about the pandemic. She doesn’t know why we’ve stopped visiting. All she knows is that she has been kidnapped by what must appear to be astronauts.

The film, The Last Black Man in San Francisco, begins with a little black girl gazing up into the face of a white man wearing a hazmat suit. A street preacher standing on a small box asks: “Why do they have on these suits and we don’t?” He refers to the hazmat men as “George Jetson rejects.” It feels wild to watch the film right now, as governors begin to take their states out of lockdown knowing that black and brown residents will continue to die at unprecedented rates, taking a calculated risk that will look, from the vantage point of history, a lot like genocide. The film’s street preacher sounds obscenely prophetic. “You can’t Google what’s going on right now,” he shouts. “They got plans for us.”

. . . .

Under quarantine in Detroit, my father, a photographer, has been sifting through boxes of slides in his sprawling archive. Each image unleashes a story for him. Last week, he told me about arriving in Sarajevo while covering the Olympics. He stayed with a family of friendly strangers eight years before the war. “I wonder if they survived,” he mutters to an empty room.

When my cousin was a police lieutenant, she told us about getting a call for someone who had died. At first glance, they thought the man had been hoarding newspapers or magazines, but then his daughter explained that he was a composer. The papers in those leaning stacks were original compositions.

When we hear on the news that Detroit is struggling, that people are dying, do we imagine composers? Do we imagine a man who sifts through photographs of Bosnia before the war?

. . . .

Last year, the Detroit Institute of Arts mounted an exhibition called “Detroit Collects,” featuring mostly black collectors of African American art in the city. One room was lined with giant photographs of well-dressed collectors. Immediately I recognized the wry smile of a woman named Dr. Cledie Collins Taylor.

My parents have been telling me about Dr. Taylor for years. “You’re going to love her.” “Her house is a museum.” “She used to live in Italy.” “She loves pasta.” When a friend came to town, we thought we would go to the Detroit Institute of Arts, but my father took us to Dr. Taylor’s house instead. The sun was spilling across the horizon, raspberry sherbet bleeding into orange, and the temperature was in the low teens. A handful of houses along the street had big paintings integrated into the architecture of a porch or a window. We knocked on a security gate and a woman in her nineties welcomed us inside.

“Every February, someone discovers me,” Dr. Taylor joked, nodding at the coincidence of receiving extra attention during black history month. I felt a twinge of embarrassment. Here I was, encountering one of Detroit’s most important artistic matriarchs for the very first time.

. . . .

At Dr. Taylor’s house, we sat in the living room and talked for a while. The impeachment hearings were playing loudly on a television in her bedroom. “A lot of my collection is upstairs, why don’t you go take a look.” We crept carefully through a room that seemed to be fitted with exactly as many books as it could tastefully hold, toward a narrow stairway. Upstairs, canvases leaned against the walls. A figurine stood, stark, in a room off to the left. There was a photo-realistic painting of a black woman with short hair, in profile, wearing a pink turtleneck and sitting on a white couch. A black-and-white print of a man with hair like Little Richard peeked out from behind a stack of frames. In a round canvas, a man with an afro slouched behind a shiny wooden table, seated in front of geometric panes of blue, including a massive window framing a cloudy blue sky—as if Questlove were relaxing to “Kind of Blue” inside a Diebenkorn painting. Ordinary scenes of black life, exquisitely rendered, were scattered across the room, a collection relaxing into itself with a kind of easeful, dusty abundance. We were called back downstairs.

Dr. Taylor walked us through the house, and took us on a tour of the basement. African masks, sculptures, shields, and figurines were pinned to pegboard the way other basements showcase drills and rakes. I placed my hand on the baby bump of a pregnant wooden figure. “Somewhere along the line, the collecting idea just catches you,” Dr. Taylor said, humbly.

“You could tell where the problems were because you’d get a lot of things from that place,” she said of collecting work from Africa. “I realized that people were responding to what they needed; certain things in their family shrines they could part with, just to eat.” She told us about her friend, a playwright who convinced a general not to kill her family during the Biafran War. Dr. Taylor tried to hold items and then give them back when wars had ended, once she realized why they’d become so available, but she struggled to get past corrupt middlemen.

Link to the rest at The Paris Review

The Age of Wood

From The Wall Street Journal:

Human history has traditionally been divided into three “ages,” according to the materials—stone, bronze and finally iron—that our ancient ancestors used to fashion their tools. But until very recently, Roland Ennos reminds us, mankind’s most versatile resource grew on trees. In “The Age of Wood,” he takes a fresh look at the familiar substance, wielding it like a wedge to pry open our past, examine our present and even glimpse our future.

Structurally, wood is a marvel. It is pound for pound as stiff, strong and tough as steel yet relatively easy to shape, such as by splitting and carving. No wonder our prehistoric forebears chose wood (not stone) for their first tools and weapons—pointed sticks for digging tubers and hunting game. The earliest houses, plows, wheels and boats were also made from this accessible, adaptable material. And without firewood for warming themselves, cooking food and keeping predators at bay, Mr. Ennos suggests, our ancestors may never have come down from the trees at all.

Toward the end of the Stone Age, people discovered that if logs were left for a long time in a controlled fire (between 600 and 900 degrees Fahrenheit), they would be reduced to charred chunks that burned hotter than regular wood. Using these lumps of pure carbon—charcoal—artisans were able to fire clay into harder, more waterproof pottery and to fuse sand and ashes to create the first glass. Then, around 5,000 B.C., craftsmen in Eastern Europe and the Near East began to purify copper ore by burning it in charcoal fires and pouring it into ceramic molds to form chisels, ax heads and other tools. Some 1,500 years later, they learned that mixing a little tin with the copper produced a sharper, stronger blade, which could cut through trees twice as fast as stone. The Bronze Age had begun.

Far from reducing mankind’s reliance on wood, Mr. Ennos notes, copper and bronze spurred demand by making it easier to convert logs into houses, furniture, fences and myriad other items. With new metal tools, carpenters also built the first wheels and plank ships, launching an era of unprecedented trade, travel and cultural exchange. In the Americas, by contrast, metallurgy didn’t develop, the plank ship was unknown, and the only wheels were made of clay and fitted to children’s toys. “The result,” Mr. Ennos writes, “was to give the people of the Old World a massive lead in logistics, one that five thousand years later was to help them discover the New World and subdue its people.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

America Is Running Out of Nurses

Nothing to do with the writing biz, but of interest in the middle of Covid. This is a story about traveling nurses in the US, essentially nurses who work for a limited period of time in a hospital as contract labor, usually to cover vacations or temporary staffing shortfalls.

From The New Yorker:

Gary Solesbee has an understated manner and an easy Texas drawl; he has been a nurse for twenty-seven years. Last year, he and his wife, who is also a nurse, decided to travel. It would be a chance to explore the country and to spend time with their adult children, who were scattered in different states. In December, they started an assignment in Webster, Texas. In March, they left for Walla Walla, Washington, where their son lives, and where, shortly after they arrived, coronavirus patients began to trickle in. Over the summer, when cases exploded in the Southwest, they signed on with an academic hospital in Albuquerque, New Mexico. “When we started travel-nursing, we had no idea we’d be doing covid nursing,” Solesbee told me. “But that’s pretty much what it’s become.” Solesbee, who now works in a step-down unit—the rung between a regular medical floor and the I.C.U.—is one of several hundred travelling nurses his current hospital system, in Albuquerque, has hired to contend with the surge. Solesbee’s hospital, like many around the country, has had to refashion many floors into covid-19 units. “It’s weird to go to an orthopedics floor and see it transformed for covid care,” Solesbee said. “Then you go to gynecology, it’s half covid. Outpatient services, all covid. Everywhere—it’s covid, covid, covid.” Last month, the New Mexico health department opened another hospital nearby to house recovering coronavirus patients who had no place to go.

. . . .

“It’s disheartening that much of the public still isn’t taking this seriously,” Solesbee said. “I want to tell them, ‘Come to work with me one day. See what it’s like.’ ” Caring for coronavirus patients is exhausting, but it’s their intense isolation, not clinical complexity, that bothers Solesbee most. “That’s the worst thing,” he said. “They can’t have family or visitors come in. We’re the only contact they have, and we can’t be with them as much as usual. We’re supposed to bundle care, limit exposure, do what you need to do and get out.” Still, there are moments when compassion trumps protocol. Recently, an elderly patient’s blood-oxygen levels dipped to dangerous levels despite maximum support. After declining a ventilator, he prepared to die. A nurse sat in the room with the patient, calling each of his family members on FaceTime, one by one, so they could say their goodbyes. “Sometimes, it’s the only thing we can do for people,” Solesbee told me. “But, in some ways, it’s the most important thing.” The other night, Solesbee sat with a dying man who had no family. With no one to call, he simply held the man’s hand as the patient’s breathing grew ragged and intermittent. Finally, it stopped. Solesbee stood up. Another patient needed his help.

Link to the rest at The New Yorker


Unsinkable

From The Wall Street Journal:

On Jan. 24, 1944, the destroyer USS Plunkett found herself offshore Anzio, Italy, supporting the embattled Allied beachhead there. The ship had had an eventful year, participating in the invasions of North Africa and Sicily. Like all destroyers, Plunkett was fast and well-armed for her size. She had many roles, including shepherding other ships and patrolling, but destruction was her natural job. On this day in the Tyrrhenian Sea, the men aboard were disappointed that some much-anticipated shelling of German positions ashore had been canceled.

At around 5:15 p.m., the ship’s doctor spotted a couple of German bombers, Dorniers, in the early-evening sky, fairly high up. The Anzio roadstead was thick with Allied ships, but the Germans owned the skies. “Looks like a little business coming up,” said the doctor almost to himself. “I think I’ll go down to the wardroom.” As James Sullivan tells us in his stirring “Unsinkable: Five Men and the Indomitable Run of the USS Plunkett,” the destroyer’s wardroom—the officers’ mess, at sea—did double duty in battle, serving as the dressing station for the wounded and dying. It would prove to be a busy place that evening.

Soon after calling his crew to battle stations, Plunkett’s captain, Ed Burke—one of the five men that Mr. Sullivan follows from the 1940s through the ends of their lives—could see the glide-bombs from the far-off Dorniers floating his way. What followed over the next 25 minutes may have been the most intense aerial attack on a single ship of any navy in World War II.

Soon the Dorniers were joined by dive bombers shrieking in, while torpedo planes came in “low and slow” to drop their sinister payloads that barely moved faster through the water than Plunkett herself. Capt. Burke successfully zigged toward the glide bombs, zagged to run parallel to a torpedo, dodged dive bombs that were landing as close as 20 yards away, and through it all, whenever he could, put the ship broadside on to the planes so his guns would have a chance. Eventually Plunkett was ablaze, but her guns kept firing with coordination and accuracy until the Luftwaffe had to call off the attack. Plunkett’s “bag” that day was impossible to verify, but it likely represented a quarter or more of the force that attacked her. The ship’s casualty list would include, apart from the scores of wounded, 29 men forever “missing in action.” These numbered more than the dead whose bodies could be identified. Plunkett managed to limp back to Palermo, saw action in Normandy on D-Day and ended her war in the Pacific.

. . . .

Certain stories we need to tell regardless of their size. One of Mr. Sullivan’s achievements is to remind us why. “Unsinkable,” a fine narrative in its own right, is also a reflection on the nature of storytelling itself, as well as a valuable and entertaining contribution to the record. It is good to learn the history of the American destroyer, with its origins in the response to the torpedo warfare that began on the Roanoke River in 1864, or to learn how the depth-charges and sonar worked on a vessel of the Gleaves class 80 years later. To make such details compelling reading is an accomplishment. More significantly, Mr. Sullivan takes pains to illuminate and honor a lost world.

He pores over a photograph of a Ship’s Party in March 1943 at the Hotel St. George in Brooklyn. There are the “USO girls, recruited for the party and minded by chaperones.” There is the sailor who looks 14 years old, the towering perms and the crisp uniforms with white carnations, and, “if you look closely enough,” the sound of Benny Goodman on the clarinet.

Mr. Sullivan reads the hundred letters that made their way to and from Plunkett between teenagers Jim Feltz and Betty Kneemiller, who met when Jim was sweeping the floor at Mr. Siegal’s five-and-dime store back home in Overland, Mo. “I still call you mine,” she writes at one point, “but I’m not as definite on that being the truth.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The First Christmas Meal

From The Paris Review:

These days, British and American Christmases are by and large the same hodgepodge of tradition, with relatively minor variations. This Christmas Eve, for example, when millions of American kids put out cookies and milk for Santa, children in Britain will lay out the more adult combination of mince pies and brandy for the old man many of them know as Father Christmas. For the last hundred years or so, Father Christmas has been indistinguishable from the American character of Santa Claus; two interchangeable names for the same white-bearded pensioner garbed in Coca-Cola red, delivering presents in the dead of night. But the two characters have very different roots. Saint Nicholas, the patron saint of children, was given his role of nocturnal gift-giver in the medieval Netherlands. Father Christmas, however, was no holy man, but a personification of Dionysian fun: dancing, eating, late-night drinking—and the subversion of societal norms.

The earliest recognizable iteration of Father Christmas probably came in 1616 when, referring to himself as “Captain Christmas,” he appeared as the main character in Ben Jonson’s Christmas, His Masque, performed at the royal court that festive season. Nattily dressed and rotund from indulgence, he embodied Christmas as an openhearted festival of feasts and frolics. But by the time he appeared on the front cover of John Taylor’s pamphlet The Vindication of Christmas, in 1652, Father Christmas had grown skinny, mournful, and lonely, depressed by the grim fate that had befallen the most magical time of year. The days of carol singing and merrymaking were over; for the past several years Christmas across Britain had been officially canceled. The island was living through a so-called Puritan Revolution, in which the most radical changes to daily life were being attempted. Even the institution of monarchy had been discarded. As a ballad of the time put it, this was “the world turned upside down.”

The prohibitions on Christmas dining would have particularly aggrieved Robert May. One of the most skilled chefs in the land, the English-born, French-trained May cooked Christmas dinners fit for a king—a doubly unwelcome skill in a time of republicanism and puritanism. May connected the medieval traditions of English country cooking with the early innovations of urban French gastronomy, and was at the height of his powers when the Puritan Revolution took effect. During those years, he compiled The Accomplisht Cook, an English cookbook of distinction and importance that was eventually published in 1660. In more than a thousand recipes, May recorded not only the tastes and textures of a culinary tradition, but a cultural world that he feared was being obliterated—including the Christmas dinner, an evocative sensory experience that links the holiday of four centuries ago with that of today.

. . . .

The young May’s experiences abroad hint at the changes occurring in English food culture of the time, especially among the social elite. During the late Tudor and Stuart eras, numerous foodstuffs, including potatoes, tea, coffee, chocolate, and tobacco, arrived from the Americas and established themselves as staples of the national diet. The Accomplisht Cook is replete with non-English influences, giving us a vivid idea of what new fashions entered his kitchen in the early 1600s. May drew heavily from Spanish and Italian recipes, and his book includes thirty-five dishes for eggs that he took from the pioneering French chef François Pierre La Varenne. Despite this, May’s food was quintessentially English. The Accomplisht Cook laments that French chefs “have bewitcht some of the Gallants of our Nation with Epigram Dishes” at the expense of the sturdy traditions of English cooking. The Englishness of May’s approach is palpable in his suggestions for Christmas dinner, dominated by roast meats and featuring a mince pie. Today’s mince pies—a Christmas institution in Britain and Ireland—are filled with a sickly-sweet concoction of dried fruit, fortified wine, mixed spices, and mounds of brown sugar, but before the Victorian era they also contained meat. May suggests numerous cuts of beef (including tongue, buttock, and intestine) or hare, turkey, and mutton, among others. In his recipes for a veal-based mince pie, he recommends mixing it with more familiar ingredients such as dates, orange peel, nutmeg, and cinnamon, flavors that are still powerfully evocative of what many of us would consider a “traditional” Christmas.

May’s bill of fare for Christmas Day is huge: forty dishes split across two courses, with additional oysters and fruit. Partly this reflects the nature of May’s experience in the service of some of the wealthiest people in the country, and partly the Stuart approach to dining. The diaries of May’s contemporary Samuel Pepys detail the meat-heavy, gut-busting dinners he hosted each year on the anniversary of his kidney stone operation (that the procedure worked and didn’t kill him was, in the seventeenth century, truly a cause for celebration).

Link to the rest at The Paris Review

Survival Strategies for Unsupervised Children

From Electric Lit:

We’re called the Crazy 9, but there are not always nine of us. We were nine before la policía took Tuki. We called him Tuki because he loved to dance all weird. Every time he heard the tuki-tuki of electronic music, he flailed his arms and raised his knees like some sort of strange bird. Tuki was funny but a little mean. I miss him, but not too much.

I feared we would be seven soon. Ramoncito hadn’t been feeling well, throwing up everywhere. He smelled really bad because he pooped his pants the other day and hadn’t been able to find new ones, so we didn’t like to stand next to him. Or sometimes we made fun of him and yelled, “Ramoncito, pupusito!” and everyone laughed and laughed and laughed, but inside I wasn’t laughing too hard; inside I felt bad. When the others were asleep, I pinched my nose with my finger and thumb and went to Ramoncito. I used to bring him something to eat too, but the last two times he threw up right after, so I didn’t bring him food anymore—why waste it, is what I say—but I still asked, “How are you feeling, Ramoncito?” and “Is there anything I can do, Ramoncito?” My voice sounded funny because of the nose pinch, and sometimes he smiled. Before, he would talk to me a little, but now he didn’t talk much. He could still walk around and go with us on our missions, but he was very slow. His eyes were sleepy all the time, and they looked like they were sinking into his skull. But we also laughed at him because he’s the youngest, only seven and a half, and everyone always gives the youngest a hard time. I was the youngest before Ramoncito came along, but even if Ramoncito didn’t last much longer, the others wouldn’t treat me like the youngest because I was the one that found the knife, and I’m the best at using it.

. . . .

Here is what the Crazy 9 love.

We love our name, and we won’t change it, even if we are really eight, or seven—we love it because it sounds crazy and because we scrawl it all over the place—when we find spray cans, or markers, or pens.

We love the knife. We found it one night after running away from the lady who wouldn’t give us any money, so we pushed her and took her purse. As we gathered to inspect our loot on the banks of the Güaire River, I pulled it from a secret pocket, shiny and dangerous. We love to take turns and unfold the blade from its wooden handle and scream, “Give me all your money!” but we are just practicing. I carry the knife most of the time because I found it, but also because I can throw it at a tree and almost always get it to stick, and I can also throw it in the air and almost always catch it by the handle without cutting my hand.

We love Pollos Arturos, it’s everyone’s favorite, but we almost never get to have any, because if the guard sees us he screams and chases us away—but sometimes we will beg and someone will give us a wing. One time Ramoncito got a leg, but that was before he was throwing up. He got a leg because the youngest always does the best begging. But we have rules in the Crazy 9, so we didn’t take the leg away from Ramoncito. He ate it all by himself.

We love going to the protests. We don’t go to the front too much because that’s where the police fight the protesters—the protesters wear their T-shirts tight around their faces, or they make gas masks out of junk, or they wear bicycle helmets and carry wooden and zinc shields with the colors of the flag painted on them; they throw mostly rocks at the police, but sometimes they shoot fireworks at them. One of them holds the cohetón parallel to the ground—aimed straight at the line of men in their green uniforms and their plastic shields and their big shotguns—while another lights the fuse. They only let it go when the whistling is loud, and we think they might be holding on to it for too long, long enough for it to explode in their hands, but then we see it fly like a comet straight into the green and plastic wall of soldiers that stands down the road. We always cheer when we see that.

Sometimes we stand next to them and yell at the police. We wrap our T-shirts around our faces and scream “¡Viva Venezuela!” and “¡Abajo Maduro!” and jump and throw rocks. It’s fun, except for when the tear gas comes and we have to run away or else cough and cough and cry and cry. But we mostly stay at the back of the protests because we can beg or steal better. Because the women are there, or the older men, or the cowards that don’t want to fight in the front, like us. The begging is good at the protests. The lady will see us and tell her friend in the white shirt and the baseball cap with the yellow, blue, and red of the flag, “Our country is gone, isn’t it? Poor child. I swear, chama, I don’t remember it ever being this bad!” That’s the moment when I try them, and most of the time I get a few bolivares. But we have rules in the Crazy 9, so we always share the money we get from begging or stealing.

We love each other. We say “Crazy 9 forever!” and exchange manly hugs. I love that feeling you get when you hug someone and you mean it. But it also makes me remember things I don’t like remembering, so let’s not talk about that.

Link to the rest at Electric Lit

Ravenna

From The Wall Street Journal:

In late antiquity, even before the fall of the western Roman empire, all roads began to lead away from Rome. Emperors abandoned the old capital and moved closer to the frontiers. In the east, they resided at Nicomedia, Thessalonike and eventually Constantinople, Constantine the Great’s “New Rome” on the Bosporus, founded in A.D. 330. Trier was their preferred headquarters in Gaul, and Milan their city of choice in northern Italy. In the fifth century, as the empire began to collapse and was invaded by barbarian armies who threatened Rome’s food supply, ordinary people also moved away from the former capital, whose population gradually declined. In the summer of 402, with Goths crossing the Alps, the western emperor Honorius moved his court to Ravenna, a naval base on the Adriatic. It had an excellent harbor from which he could (and did) receive reinforcements from his brother at Constantinople. The place was surrounded by swamps and rivers, making it famously difficult to besiege, and its “undeveloped urban plan . . . allowed the imperial court to impose its presence.”

Judith Herrin’s recent book is a sweeping and engrossing history of Ravenna from the moment Honorius took up residence there, through the thriving period of Gothic rule (493-540), and culminating in the two centuries (540-751) when the city was a western outpost of the eastern Roman empire. In that last phase, the city was the capital of a territory called the exarchate, whose commander, the exarch, was sent from Constantinople. Ms. Herrin, a professor emerita of classics at King’s College London, is a past master at writing histories of the early medieval world that combine Eastern and Western perspectives and show why we should not study the one without the other. “Ravenna” offers an accessible narrative that brings to life the men and women who created the city during this period and who fashioned its hybrid Christian culture of Latin, Greek and Gothic elements. The narrative is periodically elevated by discussions of the city’s most famous attractions and its glorious churches, brilliantly illustrated in the book’s 62 color plates. It is also enlivened by recurring digressions on daily life in the city at each phase in its history, insofar as that is revealed by documentary papyri containing wills, donations and contracts that fortuitously survive. These local perspectives are complemented by a global outlook: Ms. Herrin’s argument is that Ravenna passed its unique hybrid culture on to the imperial centers all around it, to Rome, Constantinople, and later to Aachen, the capital of Charlemagne in the north. This made the city the “crucible of Europe.”

Ravenna, a jewel in the midst of a marsh, was a place of paradox. It was, to allude to a collection of Ms. Herrin’s previous studies, simultaneously metropolis and margin. In the era of the late Roman emperors (fifth century), the city hosted the court but had a relatively small population compared to Rome and was always looking toward developments at Rome and Constantinople. It was, at first, an administrative center with no identity, a kind of Brasilia that was only slowly built up by the court in its own imperial Christian image. The place had no prestige past of its own, either classical or Christian. Under the rule of Theoderic the Great (493-526), Ravenna was briefly the capital of a power structure that stretched from Spain to the Balkans, but Theoderic still had to maneuver carefully between Rome and Constantinople. In theory, he was ruling the former on behalf of the latter, and his followers, who were barbarians (Goths) and heretics (Arians) in the eyes of the Romans, were an occupying force that was stationed in the north and away from the old capital. The Gothic experiment did not last once the eastern emperor Justinian decided to end it in the 530s. Under Byzantine rule, Ravenna was the capital of the exarchate, and gradually developed its own civic traditions and identity, but still it received its governing class from abroad, and people at the real center of power, Constantinople, saw it as a remote outpost.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG says it was not always a lot of fun to live in a city on the Eastern shore of Italy, on the Adriatic Sea. The rulers in Constantinople (now Istanbul) ranged from completely enlightened to pretty nasty. It was the eastern capital of the Roman Empire, then the capital of the Roman Empire when Rome fell on hard times, then the capital of the Byzantine Empire. For a long time, it was an extremely wealthy city and a place where many trade routes converged.

At the intersection of Europe and Asia, Constantinople was fought over and/or controlled by the Persians, the Ottomans and the Crusaders. The adjective “byzantine” was an apt description of life in Constantinople for a good part of its existence.

When PG and Mrs. PG visited there a few years ago, PG found it to be an extraordinary and fascinating place, an amazing combination of ancient and modern, Muslim and Christian, east and west.

A couple of photos of Ravenna:

Basilica of Sant’Apollinare Nuovo, Ravenna, Italy via Wikimedia Commons
Apse mosaic in basilica of San Vitale, Ravenna, Italy. Built 547. via Wikimedia Commons

7 Books to Get You Through Unemployment

From Book Riot:

It’s no exaggeration to say that 2020 has been one of the toughest years in living memory. The COVID-19 pandemic has had a devastating medical impact, and a knock-on economic impact that the world will be feeling for a while to come. Many people have lost their day jobs as a result of the crisis – this Rioter included. In troubled times, I turn to books, and have found that there are a huge number of books to help you through unemployment. Here are some of the books that I’ve found useful, reassuring, or inspiring while I’ve been unemployed.

The Unemployment Guide: How a Setback Can Launch Your Career by Melissa Fleury

Fleury draws on her own experience of being unexpectedly laid off to create this guide on how to turn the shock of unemployment into a plan to work on changing your life. Initially, I was skeptical (disaster rarely actually feels like opportunity), but Fleury’s optimistic, no-nonsense style helps you see the best in the worst situation.

. . . .

Smashing It: Working Class Artists on Life, Art and Making It Happen

The working world has never been overly accessible to marginalised people, particularly those with multiple marginalisations, and the current economic situation has made things that much harder. This collection of essays and short pieces by 31 working-class artists, including Riz Ahmed, Maxine Peake, and Bridget Minamore, is a fascinating and inspirational read for anyone who’s been knocked back by 2020.

Link to the rest at Book Riot

The Ghosts of Cambridge

From The Los Angeles Review of Books:

IN THE 1960s, the Simulmatics Corporation sold a big idea: whirling IBM computers (the kind James Bond villains had) churning through your personal data in order to predict and influence your decision-making.

Simulmatics did several rather consequential things. It helped propel JFK to the presidency. It modernized the New York Times’s election reporting. It modeled the social effects of pandemics for the Department of Health. It launched a boardgame called Ghetto for teaching high school students about social advancement. It advised state police forces on predicting riots. It helped blue-chip companies sell more dog food, coffee, and cigarettes. And it developed the US Army’s counterinsurgency operations in Vietnam.

But then the company flopped and disappeared. You’ve never heard of Simulmatics. The name sounds like an awful line of protein milkshakes, or a one-room cryogenics firm hidden away in a suburban shopping center. Think again.

Jill Lepore, the award-winning Harvard historian and New Yorker staff writer, retraces its incredible rise and fall in If Then: How the Simulmatics Corporation Invented the Future. She tells the story of how data science went digital in the Cold War, turning research in behavioral psychology into big business. If Then reads as a kind of prehistory to Cambridge Analytica, the private consultancy that imploded in 2018 after revelations of its dubious use of Facebook data to elect Trump and win Brexit. Exhuming Simulmatics from the dustbin of history also recasts our own strange moment as a mystery story: Why did the company that “invented the future” fail? And why did we forget it ever existed?

. . . .

Simulmatics’s first project in 1960 was the People Machine, a system for converting demographic data into voter prediction for John F. Kennedy’s surprisingly close election against Nixon. Pool christened the People Machine a “kind of Manhattan Project gamble in politics.” He lured prestigious colleagues by offering them bigger salaries and, even more alluringly, with the promise of significantly upscaling their experiments beyond the lab. In an unguarded moment, Pool would excitedly exclaim to historian Fritz Stern that the Vietnam War “is the greatest social science laboratory we have ever had!”

The Manhattan Project imagery stuck. Harold Lasswell, doyen of communications theory, called the People Machine the “A-bomb of the social sciences” (he meant it positively), as did writer Mary McCarthy (she didn’t). Of the behaviorists’ increasing role in government policy, McCarthy presciently wondered: “[C]onceivably you can outlaw the Bomb, but what about the Brain?”

Link to the rest at The Los Angeles Review of Books

Changing America

From Changing America:

Many Americans are watching more television in quarantine and lockdown during the coronavirus pandemic, but the plethora of content is often niche. Still, in a year of polarizing politics, “The Queen’s Gambit” has united viewers around the world.

Seven episodes were enough to garner the show a 100 percent rating on Rotten Tomatoes and accolades from major critics and publications. A more surprising measure of the show’s success, however, reveals just how much one show can change American culture.

“The idea that a streaming television series can have an impact on product sales is not a new one, but we are finally able to view it through the data,” said Juli Lennett, toys industry advisor for NPD, in a statement. “The sales of chess books and chess sets, which had previously been flat or declining for years, turned sharply upward as the popular new series gained viewers.”

. . . .

Sales of chess sets rose by 87 percent in the United States while sales of chess books jumped by 603 percent, according to U.S. Retail Tracking Service data from NPD, which showed that week-over-week sales had been relatively flat for 13 weeks before the show debuted. Google searches for “chess” and “how to play chess” hit a nine-year peak, according to Netflix, and the number of new players on Chess.com quintupled.

Link to the rest at Changing America

Haynes Manuals Will Stop Publishing New Print Workshop Guides

From Road and Track:

Haynes Manuals, the workshop manuals producer that makes detailed maintenance manuals on pretty much every relevant car on the planet, confirmed today that it will no longer publish any of its new workshop manuals in print. Manuals published by the company from here on out will only be available in digital form.

The company clarified on Twitter it will continue to print its massive back catalog of existing manuals, so if it already has one on your car, you’re still in luck if you want a physical copy.

Haynes, based in the U.K. and founded in 1960, has been a lifesaver for enthusiasts around the world looking to do the work on their own cars. Instead of dropping a bunch of money on a dealer or repair shop, you could pick up a Haynes manual and have all the info you need to get any job done.

. . . .

“We are currently in the process of creating an exciting and comprehensive new automotive maintenance and repair product that will cover around 95 percent of car makes and models—an increase of around 40 percent over our current Workshop Manual coverage.”

Link to the rest at Road and Track and thanks to DM for the tip.

PG still remembers the shock he felt the first time he saw an auto mechanic wearing disposable gloves.

Prior to that, one could discern how much experience an auto mechanic had by looking at his hands. (They were all “hims” in times of old.)

Special soap was available for mechanics, farmers, factory workers, etc., who got their hands really dirty. It would remove a lot of dirt and grease, but that sort of work not only got your hands dirty, it often built up formidable calluses on those hands. When those calluses dried and cracked, dirt and grease built up in the cracks and that was very hard, sometimes impossible, to remove.

In those occupations, the condition of a man’s hands was like a calling card and could command respect among his fellow laborers. If you were a newbie, you couldn’t bluff your way up the status ladder.

Peak Brain: The Metaphors of Neuroscience

From The Los Angeles Review of Books:

UP UNTIL 2013, I carried a digital camera wherever I went: to the archive or the bar, and on the occasional trip. I took thousands of photos, sorting through them at odd intervals to post on social media or send to my mom. There was a rhythm to my relationship with the camera: I pointed, I shot, I uploaded, and (usually) I forgot. But that rhythm fell apart once I got an iPhone. Like many other tasks, taking and keeping photos didn’t just get easier — it fundamentally changed. If the move from film to digital lowered the bar for what was photographable, then the camera phone wiped that bar out entirely. The images seemed just as good, but they were no longer mindful records of my days so much as mindless drafts of myself. I was taking more photos and thinking less about each. It was almost as if the camera, now a part of my phone, had become an autonomous point-and-shoot, sort-and-post extension of myself. Or maybe I had become a part of it.

But this isn’t a story about the good old days. Instead, it’s about how tools like my camera and iPhone shift how we understand ourselves. They do so at a personal level, when we use them so much that we feel naked without them. But technologies have also shaped the sciences of mind and brain by anchoring conceptual metaphors. As far back as the telegraph, our tools for capturing, storing, and communicating information have served as analogies for the brain’s hidden processes. Such metaphors are enabling: they do work, as it were, as convenient shorthands that then spur scientific research and medical treatments. But metaphors also constrain our research and treatments, making some things visible while shielding others from view. In other words, metaphors have a politics. The question is: What kind of politics do they have?

. . . .

The role, if not the politics, of technological metaphors in neuroscience is the subject of Matthew Cobb’s new book, The Idea of the Brain: The Past and Future of Neuroscience. It proceeds from a simple idea that Cobb attributes to the 17th-century anatomist Nicolaus Steno. “The brain being indeed a machine,” Steno reasoned, “we must not hope to find its artifice through other ways than […] to dismantle it piece by piece and to consider what these can do separately and together.” Brains have been many things since then, as Cobb shows: voltaic piles and power grids, tiny factories and immense circuits, willful robots and programmable computers. The history of neuroscience can be read as a history of such metaphors. Whatever the tool, we find a way to see ourselves in it.

. . . .

After a brief tour of the ancient world, Cobb describes brains being dissected in early modern Europe, prodded and shocked in the 19th century, and injected and imaged in the 20th. Without falling prey to determinism or teleology, he maps metaphorical flows between neuroscience and technology. New machines do not cause theories to change, he cautions. Rather, causal arrows fly both ways. Human computers preceded digital ones, after all, and machine learning has been inspired by our own. Metaphors have slipped back and forth: our brains are special “hardware” even as my iPhone acts as “my brain.” Cobb’s history reveals something deep: the complex, codependent relationships we develop with our favorite tools do more than alter how we think. They become how we think. And we can’t do it without them!

. . . .

The year I abandoned my Nikon, it popped up in a surprising place: cognitive science. That year, Joshua D. Greene published Moral Tribes, a work of philosophy that draws on neuroscience to explore why and how we make moral judgments. According to Greene, we make them using two different modes — not unlike a digital camera. “The human brain,” he writes, “is like a dual-mode camera with both automatic settings and a manual mode.” Sometimes, the analogy goes, you want to optimize your exposure time and shutter speed for specific light conditions — say, when faced with a big life decision. Other times, probably most of the time, tinkering with the settings is just too much of a hassle. You don’t want to build a pro-and-con list every time you order at a restaurant, just like you don’t want to adjust the aperture manually for each selfie you take.

Greene’s “point-and-shoot morality” is an example of dual-process theory, made famous by Daniel Kahneman’s Thinking, Fast and Slow. This improbable best seller summarized decades of work, much of it by Kahneman and his longtime collaborator, the late Amos Tversky. Adopting terms proposed elsewhere, Kahneman organizes mental function into two “systems,” corresponding to Greene’s automatic and manual modes. “System 1” is fast and, often, involuntary; “System 2” is slower and more deliberate. All animals have some version of the first, while the second system is limited almost entirely to humans. We may be machines, Kahneman and Greene acknowledge, but we are reflective machines — or at least we can be.

. . . .

Mental life, on this view, is a constant, often subconscious assessment of one’s surroundings for threats or opportunities. And of course, Kahneman has a metaphor for that:

Is anything new going on? Is there a threat? […] You can think of a cockpit, with a set of dials that indicate the current values of each of these essential variables. The assessments are carried out automatically by System 1, and one of their functions is to determine whether extra effort is required from System 2.

But this raises an important issue: how helpful is the cockpit metaphor, given how little most of us know about operating airplanes? And what does it suggest about how we imagine ourselves, our capacities and purposes, and how we interact with one another? The same might be asked of the computer, or of Greene’s camera: what do we gain by framing our mental and moral lives in these technological terms, and what do we lose — in scientific or ethical terms?

Link to the rest at The Los Angeles Review of Books

PG started thinking about how his brain works, but quickly became bored. He’s not certain exactly what that might mean.

Scourge of the Elites

From The Wall Street Journal:

Thorstein Veblen may be the most important American thinker most Americans have never heard of. A prolific economist at the turn of the 20th century, Veblen earned the admiration of his academic peers for his groundbreaking work on the mysteries of inequality, while his searing observations about the “conspicuous consumption” and “predatory” habits of the wealthy won him an audience far beyond the ivory tower. Veblen’s epic life ended in despair—a final note urged that no biographical account or intellectual tribute ever be paid to him—and yet long after his passing, writers including John Dos Passos, Carson McCullers and John Kenneth Galbraith persistently refused to honor his dying wish.

About 40 years ago, however, Veblen fell precipitously out of fashion. To the small clique of enthusiasts who remain, he is understood as a social critic or philosopher—fine things, to be sure, but a dead-end legacy for an economist in an age in which economics still calls the tune for public policy.

And so it is that an insightful new Veblen biography comes to us from Charles Camic, a sociologist at Northwestern University who proves himself a capable guide down the tumultuous currents of late-19th-century ideas. Mr. Camic’s Veblen is an intellectual flamethrower, torching every school of thought in sight, from the classicism of Adam Smith to the communism of Karl Marx, attempting to clear the ground for a new kind of science.

Economists, Veblen argued, were doing economics all wrong. They should have been studying the development and decline of economic institutions. Instead, they were devoting themselves to empty abstractions about consumption, production and productivity that could not be verified by data or experience.

Veblen cut a new course for the discipline by composing ambitious treatises on the origin of inequality. In “The Theory of the Leisure Class” (1899) and “The Theory of Business Enterprise” (1904), he argued that the rich and the modern corporation—examples of what he vaguely defined as “institutions”—were parasitic elements that leeched wealth from more productive segments of society. The ostentatious rich, Veblen maintained, were not evidence of productive abundance but proof that modern industry was making society poorer.

Veblen made his case with the acid wit of an outsider granted ill-considered admittance to an inner sanctum. Raised by Norwegian immigrants in the hinterlands of the Midwest, Veblen wedged himself into university life in the mid-1870s, just as the academy was beginning to assume an aura of high prestige. Studying first at Carleton College, he moved on to Johns Hopkins, Yale, Cornell and the University of Chicago, impressing professors who had earned their stripes studying in the grand universities of Europe.

The children of privilege Veblen encountered in higher learning were very different from the immigrant farmers of his youth. He grew up in townships where English was a second language and, in some cases, rarely spoken at all. Though his own family and many of his neighbors did quite well, even the most prosperous farmers of Minnesota were considered backwater curiosities by the industrial elite of Baltimore and New Haven.

The true significance of “pecuniary success,” Veblen came to believe, was not its purchasing power but the social rank it conferred. For wealth to serve this ultimate hierarchical purpose, it had to be diligently displayed. And the most prestigious of all social signals was idleness. The surest mark of “good breeding,” he wrote in “The Theory of the Leisure Class,” was the performance of “a substantial and patent waste of time.” Working people, after all, had to work.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG believes that Veblen was the first to introduce the term “conspicuous consumption,” in The Theory of the Leisure Class.

The Pilgrims in Flesh and Spirit

From The Wall Street Journal:

If we want to encounter the first Thanksgiving as the Pilgrims experienced it in the fall of 1621, “we must put aside our images of great wide Puritan collars and large buckles (never worn by the godly of early Plymouth, who rejected all jewelry), roast turkey (absent from the earliest record of the event), and cranberry sauce (quite unknown in the early colony).” So writes Martyn Whittock in “Mayflower Lives,” an incisive series of biographical and historical essays on the Mayflower’s passengers.

All we know about the original Thanksgiving, Mr. Whittock explains, comes to us from 115 words in a document known as Mourt’s Relation, written to persuade London investors not to give up on the colony. We know that the original feast lasted three days and was attended by the Pilgrims and about 90 Native Americans from the Wampanoag tribe. It featured a meal consisting of venison, supplied by the natives, and fowl, supplied by the English. That’s it.

. . . .

Myths aside, the striking quality about the Mayflower’s trans-Atlantic journey—and this emerges beautifully in Mr. Whittock’s narrative—is just how archetypally American the whole affair was. The settlement’s mission, like the nation it began, needed both “saints” and “strangers,” to borrow the term historians use to distinguish the Puritan separatists from the rest of the Mayflower’s passengers: fervent Calvinists seeking freedom to worship God as they believed right but also economic migrants and seafarers seeking profit or adventure.

A comparison with the colony of Jamestown, founded in 1607, makes the point nicely. Jamestown was the project of merchants in search of a quick payday, and it was carried out by rugged soldiers and well-to-do settlers unaccustomed to hard work. Within a few years, Jamestown’s inhabitants had mostly either died or fled. The godly among the Plymouth settlers, by contrast, were transfixed by a vision of the New Jerusalem. In the minds of Pilgrims such as William Bradford and William Brewster—the latter had studied at Peterhouse, Cambridge, for a time a hotbed of Puritanism—the cause of Reformed Protestantism was dying in Europe. The New World beckoned, and no measure of physical hardship could divert them from achieving their end.

But a band of theologians and pious congregants, however dedicated, wasn’t likely to survive privation, disease and hostile natives. They needed what a former age would have called men of action, which is why they hired Myles Standish, a diminutive but accomplished soldier. Standish knew how to hunt, how to fix the settlement’s position in order to make it less vulnerable to attack, and what to do about deadly threats. In March 1623, hearing that a group of warriors from the Massachusetts tribes had determined to wipe out a nearby English settlement, Wessagusset, Standish set up a meeting with the Massachusetts sachem or chief. At the meeting, Standish and his men waited for an opportune moment and killed their adversaries with knives.

Standish has been judged critically for this act of premeditated bloodshed, and not altogether unfairly. But the Plymouth Colony, as Mr. Whittock rightly notes, would likely not have survived without the aid of such a man. Here and elsewhere, Mr. Whittock, an English author of several books on European history, is refreshingly reluctant to judge his subjects harshly. Gone are the usual snide remarks about the Puritans’ narrowness and grimness. He notes, for example, that the Puritans regarded monogamous sex as a “joyful expression of love,” an outlook their non-separating Anglican neighbors regarded as “very edgy, if not downright alarming.” With an infection wiping out half the colony in its first year, the little Calvinist company’s edginess probably kept it from oblivion.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Noble Volunteers

From The Wall Street Journal:

Few stereotypes from the American Revolution are as well-developed as that of the soldier who fought for Britain. The caricature is ubiquitous: A one-dimensional “lobster,” the “bloody-back” regular, motivated by selfishness, who fought for the love of money. He was heartless and cowardly; a pawn of the king, inept on the battlefield because of his old European ways. Here was a convenient foil for the resourceful Patriot, who fought valiantly on the winning side and for all the right reasons—family, farms and freedom. In “Noble Volunteers,” Don Hagist invites us to peer beneath the red coat. What do we find?

One central insight is that “there was no ‘typical’ British soldier.” British regulars encompassed “such a range of nationalities, ages, skills, and socioeconomic backgrounds” that we are better off “appreciating how they were different rather than how they were the same.” What, for instance, motivated them to enlist? The reasons were as many as the men who joined, with neither unemployment nor impoverishment ranking high on the list. Most were between the ages of 20 and 25, but little else united them. Some sought new careers. Others to escape overbearing mothers, or wives. Others still were moved by wanderlust or boredom. Mr. Hagist is skeptical of accounts, such as Sylvia Frey’s “The British Soldier in America,” that draw conclusions about soldiers’ motives from quantitative data. Too much was idiosyncratic, a mystery.

Mr. Hagist concentrates on the particular. We follow the British soldiers in America from Boston in 1773, before hostilities break out, to Yorktown in 1781. But it is not the battlefield that is most intriguing here; it is instead Mr. Hagist’s wealth of detail about all other aspects of a British soldier’s life. Recruitment in Britain (and elsewhere); monthslong transport in private vessels across the Atlantic, its trials and wonders (“flying fish, sharks, sea turtles, seals, and icebergs”); soldiers’ wages, within and without the army; literacy rates; training exercises; living arrangements in barracks, huts, wigwams and encampments; what they wore, ate and drank; the diseases they contracted; their desertions; “the plunder problem” (“the army’s Achilles’ heel,” says Mr. Hagist, because of its effect on the “hearts and minds” of the local populace); soldiers’ prizes, promotions and demotions; drafts and impressments; punishments and courts-martial; entertainments; religious dispositions; injuries, imprisonments and, occasionally, deaths; and, for some, their postwar lives. It is all here.

Every reader is sure to learn something, and in the process will come upon a favorite among the British soldiers. One of Mr. Hagist’s is Roger Lamb, whom he wrote about previously in “British Soldiers, American War” (2012). Lamb, from a middle-class Dublin family, enlisted with the 9th Regiment of Foot in 1773, at the age of 17, having lost all his money gambling. In America he saw heated action in two major campaigns; was captured twice; and, twice escaping, rejoined the British army each time. Returning to England in 1784—and discharged (from the 23rd Regiment) but “denied a pension because he had served only twelve years and had no disability”—he became a schoolteacher and published author, living until 1830.

Or, take William Crawford. An “ardent disposition for adventure” led him to join the 20th Regiment knowing that meant war in America. Captured, he was interned at Saratoga, N.Y., and marched for months throughout the north, then south to Virginia. Escaping, he was recaptured and jailed. Not to be so easily outdone, he befriended the jailer’s daughter hoping she would release him. Things didn’t go quite as he planned. “She forged a marriage certificate, spirited him out of jail, and presented him to townspeople as her husband.” Crawford accepted his fate. Others also remained in America, many with land grants. Still, most soldiers’ lives were not as well documented, and many ended in much darker places.

. . . .

[T]hinking historically about the war is difficult. It requires us not only to forget how events turned out, but also to recapture very particular moments from the participants’ perspectives. “Standing sentry on a storm-swept shoreline in the middle of a winter night, fending off a rising fever while fearful of imminent attack by assailants unseen, may have been one man’s most difficult hours of an eight-year war,” writes Mr. Hagist, “but histories focused on pivotal campaigns are unkind to such personal experiences, trivializing or entirely overlooking most of the hardships endured by most of the soldiers.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG is not an expert on military history, but he has read enough books about people fighting in wars to conclude that, for those actually and physically engaged on the front lines of battle, the experience is far more random than the one described in the memoirs of generals and admirals, or in the books historians write from copies of the orders, after-action reports and official histories, interviews with those same generals and admirals, and the personal letters and memoirs the commanders left behind.

PG is not aware of any general in the 20th century’s wars who died instantly because he was inattentive or caught off guard for a split second.

One of PG’s neighbors from many years ago had served as a Captain in the Army in Vietnam. During his service, he was engaged in extremely close-quarters and intense fighting on more than one occasion.

Like many veterans who have experienced close, person-to-person fights where lives were in immediate peril, PG’s neighbor did not spend a lot of time talking about his Vietnam fighting experiences.

However, on one occasion, he told PG that the only reason he was alive to converse about the war was because an enemy soldier had failed to clean his gun.

In the thick jungle where he and his men were actively engaged in a firefight, an enemy soldier sneaked through the undergrowth and emerged about five yards behind the Captain, pointing his AK-47 directly at him. The Captain, focused on the fight in front of him, heard something behind him, looked back and saw the soldier pointing the gun directly at him, pulling the trigger. However, the rifle did not fire, and the Captain was able to turn and fire his own weapon in time to kill the Viet Cong soldier.

After the fighting ceased, the Captain examined the enemy soldier and his gun.

The gun was fully loaded and ready to fire, but it had jammed because the soldier hadn’t cleaned out the powder residue left over from one or more earlier fights in which he had evidently fired it heavily.

After the battle was over, while a few of his men watched, the Captain picked up the AK-47, cleared the jam, examined the firing chamber, then took out his knife and scraped away some of the built-up gunpowder residue he found. He closed the action, and when he pulled the trigger the gun fired instantly.

He said it was a good object lesson for his men about the importance of maintaining their weapons carefully.

PG doesn’t remember General Pershing or Generals Eisenhower or Montgomery describing any similar experience in the accounts of their wars.

Long Live Work!

From The Paris Review:

A Bulgarian grocery store opened for business in my Amsterdam neighborhood. On the inside of the plate-glass window they hung a Bulgarian flag, making the store highly visible from the outside, but dark inside. They sell overpriced Bulgarian groceries. And the same can be said of almost all the ethnic markets. First come the migrants, and after them—the markets. After a time the ethnic food markets disappear, but the migrants? Do they stick around? The number of Bulgarians in the Netherlands is clearly on the rise; two Bulgarian markets have opened recently in my neighborhood alone.

And as to those with a “Balkan tooth,” they have famously deep pockets as far as food is concerned; they’ll happily shell out a euro or two extra to satisfy gourmandish nostalgia. The markets sell Bulgarian wine, frozen kebapcheta and meat patties, cheese pastries (banitsas), pickled peppers and cucumbers, kyopolou, pindjur, lyutenitsa, and sweets that look as if they’ve come from a package for aid to the malnourished: they are all beyond their shelf dates.

The store is poorly tended and a mess, customers are always tripping over cardboard boxes. Next to the cash register sits a young man who doesn’t budge, more dead than alive, it’s as if he has sworn on his patron saint that nobody will ever extract a word from him. The young woman at the cash register is teen-magazine cute. She has a short skirt, long straight blond hair, a good tan. Her tan comes from her liquid foundation; her cunning radiates like the liquid powder. She files her nails, and next to her stands a small bottle of bright red nail polish.

The scene fills me with joy. She grins slyly. I buy lyutenitsa, Bulgarian (Turkish, Greek, Macedonian, Serbian) cheese, and three large-size Bulgarian tomatoes. Dovizhdane. Довиждане.

. . . .

The division into those who work and those who do not—the hardworking and the indolent, the diligent and the ne’er-do-wells, the earnest and the couch potatoes—is hardly new, but over the last few years it has become the basic media-ideological matrix around which revolve the freethinkers of the general public. Joining the category of the indolent, ne’er-do-wells, and malingerers are the ranks of the jobless (for whom the employed claim they are simply incompetents and bumblers), along with the grumblers, indignants, and the groups defined by their country, geography, and ethnicity (Greeks, Spaniards, Romanians, Bulgarians, Serbs, Bosnians—all shiftless riffraff!), anticapitalistic elements, hooligans, vandals, terrorists, and Islamic fundamentalists.

In response to the question of how to become a multimillionaire, one of the wealthiest Russian oligarchs replied, “Don’t you forget, I work seventeen hours a day!” The very same answer is given by criminals, thieves, politicians, porn stars, war profiteers, celebs, mass murderers, and other similar deplorables. They all say seventeen hours a day, my career, and my job with such brash confidence, not a twitch to be seen.

On Meet the Russians, a TV show broadcast by Fox, young, prosperous Russians, many of them born, themselves, into money, fashion models, fashion and entertainment industry moguls, pop stars, club owners, and the like, all use the following phrases: I deserve this; everything I have, I’ve earned; my time is money; I work 24/7; I never give up.

. . . .

The native armed with bow and arrow, railway line, village, town, may the country thrive and grow, long live, long live work. These are the lyrics of a song that was sung during the Socialist period, when workers’ rights were much greater than they are today.

I confess I never made sense of these verses, perhaps because I didn’t try. What possible connection could there be between a native armed with bow and arrow and railway lines, villages, and towns, unless the lyrics are an anticipatory tweet about the eons of history of the human race: in other words, thanks to the appeal of hard work, natives traded in their bows and arrows for railways, villages, and towns.

Or, perhaps, it’s the other way around: without the redeeming balm of work, those same natives would have to return to the age of bows and arrows, while weeds would engulf the railway lines, villages, and towns. Although the everyday life of socialism in ex-Yugoslavia was like a hedonistic parody of the everyday life in other communist countries, Yugoslavs shared with them a packet of the same values, a set of common symbols, and their imaginary.

And at the center, at least as far as symbols and the imaginary go, was work. Work was what persuaded the native armed with bow and arrow to evolve from the ape, and the “peasant and worker” and “honest intellectuals” evolved thereafter from the native.

“The workers, peasants, and honest intellectuals” were the pillars, in the socialist imaginary, of a robust socialist society and were cast in a powerful positive light, especially because the honest intellectuals were separated from dishonest intellectuals just as the wheat is winnowed from the chaff.

The “bureaucracy” was the necessary evil, the “bureaucracy” flourished, while feeding, parasite-like, on the people. In any case, the word “work” was heard everywhere: in the news shorts that played before films in Yugoslav movie theaters, in the images of eye-catching, sweaty, workers’ muscles, in my elementary school primers where the occupations were unambiguous (male miners, female nurses, male blacksmiths, female backhoe-operators, male construction workers, female teachers, male engineers, female tram-drivers), in the movies, and in the First-of-May parades—pagan-like rites, honoring the god of labor as tons of sacrificial steel, coal, wheat, books were rolled out.

The heroes of the day were the record-breakers, the men and women who went above and beyond the norm. The heroes of today are pop stars, Marko Perković Thompson and Severina, and the many clowns who surround them.

. . . .

Today the vistas I see are post-Yugoslav. Perhaps the view is better in the postcommunist countries like Poland, Romania, Bulgaria, Hungary … I hope representatives of other postcommunist countries don’t hold against me my geopolitically narrow focus. Everything I’ve said refers only to little Croatia, little Serbia, little Bosnia, little Macedonia … And this crumb of badness in the sea of postcommunist goodness can easily be ignored, can it not? Although to be honest, research from 2007 shows that fewer than half of the Germans living in what used to be East Germany were pleased with the current market economy, and nearly half of them desired a return to socialism. As a return to the previous order is now unimaginable, the lethargic East German grumblers have been given a consolation prize, a little nostalgic souvenir, a MasterCard and on it the face of Karl Marx, designed and issued by a bank in the city known today as Chemnitz, though earlier it was called Karl-Marx-Stadt.

. . . .

In Russian fairy tales, Ivan the Simple earns his happy ending and wins the kingdom and the queen. Does he do this by working seventeen hours a day? No he does not. He does this thanks to his cunning and his powerful helpers: a horse able to traverse miles and miles at lightning speed, a magic shirt that makes him invincible, a fish that grants his wishes, Baba Yaga who gives him sly advice, and powerful hawks and falcons for brothers-in-law. Even our hero—Ivanushka, grimy, ugly, slobbering Ivanushka Zapechny, he who is the least acceptable, who lounges all the livelong day by the tile stove—even he, such as he is, wins the kingdom and the princess without breaking a sweat. Our modern fairy tale about the seventeen-hour workday has been cooked up as consolation for the losers. Who are the majority, of course.

Link to the rest at The Paris Review (PG added a few paragraph breaks.)

PG recalls one of the impressions of a college friend who studied and extensively traveled all over what was then known as the Union of Soviet Socialist Republics – “Anything that is not at least 75 years old looks like it was put up cheap.”

On a brighter note, PG is greatly enjoying reading Catherine the Great: Portrait of a Woman. It is the second volume of quite a nice four-volume set by Robert K. Massie, The Romanovs. The series starts with Peter the Great (Pyotr Alekseevich), born on February 8, 1672, and crowned as Tsar at the age of ten, and ends with The Romanovs: The Final Chapter, which, according to the book’s description, begins at the end of the Romanov line in Siberia with “the infamous cellar room where the last tsar and his family had been murdered” on the night of 16–17 July 1918.

PG seldom provides book recommendations on TPV, but will say he is greatly enjoying this history. While by no means any sort of historian, PG has read quite a number of well-written accounts of various times past and Catherine the Great would be up towards the top of PG’s “best of” list.

FYI, he doesn’t think you necessarily need to read the books in sequence. Each one is very good at setting its current subject in her/his times and place in history.

PG notes that Amazon lists used hardcovers for very reasonable prices. That said, “Catherine” is 656 pages and, since PG does a great deal of reading in bed, he is happy to have his featherweight Kindle Paperwhite on his chest instead of the hardcover.

A note on PG’s experience with his Paperwhite – He doesn’t mind the ad-supported version because, at least in his, he almost never notices the ads. While PG also has an iPad, he much prefers his Paperwhite for book-length fiction – much lighter, longer battery life and, at least for PG, a much better screen for reading text than the iPad (or the Fire or any of the android tablets PG has owned in the past).

PG is not terribly familiar with the differences between the earlier and current generations of Paperwhites. His is a first-generation model and, if it stopped working, he would probably buy another identical one if it were still available.

How the Spanish flu changed the world

From the World Economic Forum:

A couple of years ago, journalist Laura Spinney could hardly believe how little people thought about the Spanish flu pandemic, which swept the globe in three deadly waves between 1918 and 1919.

So she wrote a book – Pale Rider: The Spanish Flu of 1918 and How It Changed the World – to bring the tragedy that claimed 50 million lives back into our consciousness.

. . . .

“It seemed to me there was this huge hole in our collective memory about the worst disaster of the 20th Century. It’s definitely not remembered in the same way as the two world wars – there is some different way we remember pandemics.

“One of the ways I tried to explain it in my book was that, to me, that pandemic is remembered individually as millions of discrete tragedies, not in a history-book sense of something that happened collectively to humanity.”

. . . .

We think it infected about 500 million people – so one in three people in the world alive at that time, and it killed 50 million of them. The death toll could have been even higher because there was a big problem with under-reporting at the time. They didn’t have a reliable diagnostic test.

. . . .

Pandemic flu is much worse than seasonal flu, and we think there have been 15 flu pandemics in the past 500 years. Every seasonal flu started out as a pandemic flu, which was much more virulent because it was new in the human population. Gradually over time, it evolved to become more benign and to live in a more harmonious relationship with humanity.

There are lots of theories for why the Spanish flu was so virulent and they’re not mutually exclusive. Some of them have to do with the inherent biology of that virus, and some of them with the state of the world at the time. That pandemic obviously emerged when the world was at war; there were extraordinary circumstances. Lots of people were on the move, not only troops, but also civilians: refugees and displaced persons. And there was a lot of hunger.

All of these factors may have fed into the virulence of the virus. There was definitely something very abnormal about 1918. If you think about the five flu pandemics we’ve had since the 1890s, none of them has killed more than about 4 million people maximum, whereas we think Spanish flu killed 50 million.

. . . .

There were no commercial aeroplanes, so the fastest way you could get around was by ship or by train. Henry Ford had invented his Model T motor car, but they were still the preserve of the rich, as were telephones. And illiteracy was much higher than it is now, which had an impact because the main way that news was transmitted was by newspapers. In illiterate populations news travelled much more slowly and was often distorted.

. . . .

In the short term, there was a jump in life expectancy, because a lot of people who were very ill with, for example, TB, which was a massive killer at that time, were purged from the population. They were probably the first to die of the Spanish flu because they were already in a weakened state. The people who were ill died and the people who were left behind were healthier.

There was also a baby boom in the 1920s, which has always been put down to the war and the men returning from the front. But there is an argument that the flu could have contributed because it left behind a smaller, healthier population that was able to reproduce in higher numbers. Norway, for example, had a baby boom even though it was neutral in the war.

Among those very vulnerable to the Spanish flu were the 20 to 40-year-olds. Normally flu is most dangerous to young children and to the very old, but in 1918, bizarrely, it was this middle age group. There wasn’t much of a social welfare net, even in wealthy countries, so lots of dependents were left without any means of support because the breadwinners were taken out by the flu.

. . . .

One of the great tragedies of 1918 is that those dependents simply vanish into the cracks of history. We don’t really know what happened to them, but we get the occasional glimpse: from a study in Sweden, for example, we know that a lot of old people moved into workhouses and a lot of the children became vagrants.

Men were more vulnerable than women overall globally, though there were regional variations. Pregnant women were particularly vulnerable and had miscarriages in frighteningly high numbers because, to fight the virus, the body took resources away from the womb and the growing foetus. Some of those babies survived and we know now there’s a lifelong effect called foetal programming. That generation was physically and cognitively slightly reduced. They were more likely to suffer from heart attacks and to go to prison – and came of age just in time to go and fight in the Second World War.

. . . .

In many Western countries, there was a turning away from science after the pandemic because people were disillusioned with it. From the 1920s, for example, in America, alternative medicine took off in a big way and spread around the world.

But at the same time, in countries that had not really embraced the scientific method, you see the opposite effect. So China becomes a little bit more scientific after the pandemic. There’s a move to better disease surveillance, better public health, more organized collection of healthcare data, because they saw that to prevent future pandemics they needed to turn towards science.

. . . .

The Spanish flu was democratic on one level. It could infect anyone: British Prime Minister David Lloyd George came down with the flu then, just as Boris Johnson has had COVID-19 today. Nobody is, in theory, spared.

If you look at the population level though, there’s a very clear disparity and basically the poorest, the most vulnerable, the ones with the least good access to healthcare, the ones who work the longest hours, who live in the most crowded accommodation, and so on, are more at risk.

But in 1918, it was a time of eugenics-type thinking and it was perceived that those people who were more prone to the flu were constitutionally somehow inferior, that it was somehow their fault.

. . . .

The dates of the waves were dependent on where you were in the world. They came later in the Southern hemisphere, which meant Australia had the luxury of seeing this thing approach in space and time from the north, and took advantage of that to put in place maritime quarantine.

It managed to keep out the lethal second wave in October 1918, which is one of the rare exceptions of public health measures really working that year. But they lifted it too soon and the third wave of infection of early 1919 came into the country and killed 12,000 Australians. But it would have been much, much worse if they had not put the quarantine in place when they did.

Link to the rest at the World Economic Forum

First Principles

From The Wall Street Journal:

First Principles: What America’s Founders Learned from the Greeks and Romans and How That Shaped Our Country

The subject of Thomas Ricks’s extraordinarily timely book is, in his words, “what our first four presidents learned, where they learned it, who they learned it from, and what they did with that knowledge.”

. . . .

John Adams attended Harvard; Thomas Jefferson, William and Mary; James Madison, the College of New Jersey (subsequently renamed Princeton). Of the first four presidents only George Washington had not received a university education: he spoke no foreign or ancient languages and was not much of a reader. Yet even he was steeped in the classicism of the Enlightenment era, and as he matured into his role as the father of his country he came to be seen as the personification of ancient Roman virtue—his country’s Cato, its Fabius, its Cincinnatus.

“Virtue” had a somewhat different meaning in the 18th century than it does today: in Mr. Ricks’s brief formulation, “it meant putting the common good before one’s own interests,” and looked specifically back to ancient exemplars like Cato, Cicero and Socrates. Adams modeled himself on Cicero as Washington did on Cato. Montesquieu, the Enlightenment theorist who had a greater influence on the founders than any other, famously stated in his “Spirit of Laws” (1748) that virtue was the one indispensable quality in a republic. Washington and Adams, at any rate, heartily agreed with him.

Jefferson brought the architecture of ancient Rome to our shores: Monticello, the University of Virginia campus and, finally, the distinctly Roman look given to Washington, D.C. “Almost single-handedly,” wrote the historian Gordon Wood, “he became responsible for making America’s public buildings resemble Roman temples.”

But as Mr. Ricks proves, Jefferson was always “more Greek than Roman, more Epicurean than Ciceronian.” Indeed, he openly admitted to being an Epicurean, a philosophy he called the “most rational system remaining of the philosophy of the ancients,” and Mr. Ricks points out that his replacement of John Locke’s “life, liberty, and estate” (that is, property) with “life, liberty, and the pursuit of happiness” indicates a specifically Epicurean outlook.

Madison, who more than any other founder was responsible for the shape that the U.S. Constitution would finally take, immersed himself in the history of ancient republics and confederations to see what good ideas they could bring to ours. The Roman Republic, which lasted almost five centuries, was of particular interest, but so too were the various Greek confederations, such as the Amphictyonic League, in which the states had the same number of votes (like our Senate today), and the Lycian confederacy, which had proportional votes (like our House of Representatives). Twenty-three of the 85 Federalist Papers cite classical authorities; interestingly, they are more often Greek than Roman.

But Madison took a crucial step to lead the country away from the most important classical precept: he decided that public virtue couldn’t be counted on, and looked for an alternative. The failure of the Articles of Confederation had made it painfully obvious that self-interest usually trumps disinterested virtue. “The present System,” complained Madison, “neither has nor deserves advocates; and if some very strong props are not applied will quickly tumble to the ground. No money is paid into the public Treasury; no respect is paid to the federal authority.”

Now Madison took inspiration from Enlightenment ideas, most memorably formulated in Bernard Mandeville’s “Fable of the Bees” and Adam Smith’s “Wealth of Nations,” that private vices might, when taken together, positively benefit the public.

In Federalist 10 he attacked the classical republican idea that the pursuit of self-interest necessarily violates the public trust: “The causes of faction cannot be removed . . . relief is only to be sought in the means of controlling its effects.” This must be done by involving “the spirit of party and faction in the necessary and ordinary operations of government.”

Here Madison departed from Montesquieu by claiming that a large republic would be more durable than a small one; the more individual interests in play, he claimed, the smaller the chance that any one will prevail. (Of course he could not have dreamed of the possibilities opened by mass communications and social media!) Washington still thought the new republic could not exist without public virtue, and said as much in his Farewell Address; but, writes Mr. Ricks, that was “old think.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Lockdown named word of the year by Collins Dictionary

From The Guardian:

Lockdown, the noun that has come to define so many lives across the world in 2020, has been named word of the year by Collins Dictionary.

Lockdown is defined by Collins as “the imposition of stringent restrictions on travel, social interaction, and access to public spaces”, and its usage has boomed over the last year. The 4.5bn-word Collins Corpus, which contains written material from websites, books and newspapers, as well as spoken material from radio, television and conversations, registered a 6,000% increase in its usage. In 2019, there were 4,000 recorded instances of lockdown being used. In 2020, this had soared to more than a quarter of a million.
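The arithmetic behind the quoted figures is easy to check (a quick sketch; the 2020 count is given only as "more than a quarter of a million", so 250,000 is used as the floor):

```python
# Percent increase implied by the Collins Corpus figures quoted above:
# roughly 4,000 instances of "lockdown" in 2019, at least 250,000 in 2020.
before, after = 4_000, 250_000
increase_pct = (after - before) / before * 100
print(f"{increase_pct:,.0f}%")  # → 6,150%, consistent with "a 6,000% increase"
```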

“Language is a reflection of the world around us and 2020 has been dominated by the global pandemic,” says Collins language content consultant Helen Newstead. “We have chosen lockdown as our word of the year because it encapsulates the shared experience of billions of people who have had to restrict their daily lives in order to contain the virus. Lockdown has affected the way we work, study, shop, and socialise. With many countries entering a second lockdown, it is not a word of the year to celebrate but it is, perhaps, one that sums up the year for most of the world.”

Other pandemic-related words such as coronavirus, social distancing, self-isolate and furlough were on the dictionary’s list of the top 10 words. So was the term key worker. According to Collins, key worker saw a 60-fold increase in usage over the last year, which reflects “the importance attributed this year to professions considered to be essential to society”.

Link to the rest at The Guardian

What It Means to Be Human

From The Wall Street Journal:

Bioethics has always been enmeshed in controversy. Arising out of gross abuses of the rights of human subjects in mid-20th-century scientific research, the field has grown to take on a variety of thorny challenges at the intersection of morality and biomedicine—from embryo research and organ markets to artificial reproduction and physician-assisted suicide.

But the most profound bioethical disputes actually lie beneath these headline-grabbing controversies, deep in the soil of moral philosophy and anthropology. To think clearly about how to protect the human person, we need an idea of what the human person is and why that matters. Our society often takes for granted a set of answers to questions about the meaning of personhood, and those answers tend to emphasize choice and independence as measures of human dignity and worth. The bias toward autonomy goes well beyond academic bioethics and, indeed, prevails throughout our culture. A critical examination of the moral suppositions underlying contemporary bioethics might shed light on much more of our common life than our engagement with biology and medicine.

Such an ambitious examination has now been taken up by O. Carter Snead in “What It Means to Be Human.” The result is a rare achievement: a rigorous academic book that is also accessible, engaging and wise.

. . . .

Mr. Snead’s subject is “public bioethics,” by which he means not the work of advising particular patients or clients facing difficult decisions but the work of setting out laws, rules and policies regarding the uses of biotechnology and medicine. He begins by drawing out the often unstated assumptions beneath such frameworks. “American law and policy concerning bioethical matters,” Mr. Snead writes, “are currently animated by a vision of the person as atomized, solitary, and defined essentially by his capacity to formulate and pursue future plans of his own invention.”

By putting decision making at the center of its understanding of the moral life, this view treats cognition and rational will as the essence of our humanity and radically plays down unchosen obligations. More important, it implicitly treats those who depend most heavily on others because they are unable to make choices—the mentally impaired, dementia patients, those suffering extreme pain, children in the womb, and others—as diminished in worth. Even when bioethics does try to protect such people, it struggles to understand just how their lives are worth living.

What this view misses, Mr. Snead argues, is the significance of our embodiment. “Human beings do not live as mere atomized wills,” he writes, “and there is more to life than self-invention and the unencumbered pursuit of destinies of our own devising. The truth is that persons are embodied beings, with all the natural limits and great gifts this entails.”

This simple fact has far-reaching implications. “Our embodiment situates us in a particular relationship to one another, from which emerge obligations to come to the aid of vulnerable others, including especially the disabled, the elderly, and children.” Our power to choose recedes into the background when our lives are viewed this way, and our embeddedness in webs of mutual regard comes to the fore. Properly understood, bioethics should seek to emphasize not ways of breaking relations of dependence but ways of helping us see what our common humanity requires of us.

. . . .

Mr. Snead doesn’t emphasize the religious foundations of this truth, and he maintains a welcoming and inviting, even conciliatory, tone toward the progressive bioethicists whom he is criticizing. He knows they mean well but thinks they are caught up in the expressive individualism of our culture in ways that keep them from grappling with the full meaning of the questions their field sets out to address. The book speaks their language: It is technical at times, especially when considering in detail the law surrounding abortion, assisted reproduction and end-of-life care. But in the end, it addresses far more than professional controversies.

“What It Means to Be Human” may have its greatest impact outside public bioethics. That field is now intensely politicized, and stubbornly resistant to criticism. It is likely to remain in the business of constructing sophistic permission structures justifying a dehumanizing but convenient disregard for the weak and vulnerable in the all-atoning name of choice. Dissenters from this orthodoxy, like Mr. Snead, often defy easy political and professional classification. Their work is rooted in deeper philosophical soil and therefore tends to grow beyond the bounds of bioethics.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG wishes the Harvard University Press had not overpriced the ebook.

Too Much Information – Understanding What You Don’t Want to Know

From MIT Press:

How much information is too much? Do we need to know how many calories are in the giant vat of popcorn that we bought on our way into the movie theater? Do we want to know if we are genetically predisposed to a certain disease? Can we do anything useful with next week’s weather forecast for Paris if we are not in Paris? In Too Much Information, Cass Sunstein examines the effects of information on our lives. Policymakers emphasize “the right to know,” but Sunstein takes a different perspective, arguing that the focus should be on human well-being and what information contributes to it. Government should require companies, employers, hospitals, and others to disclose information not because of a general “right to know” but when the information in question would significantly improve people’s lives.

Sunstein argues that the information on warnings and mandatory labels is often confusing or irrelevant, yielding no benefit. He finds that people avoid information if they think it will make them sad (and seek information they think will make them happy). Our information avoidance and information seeking are notably heterogeneous—some of us do want to know the popcorn calorie count, others do not. Of course, says Sunstein, we are better off with stop signs, warnings on prescription drugs, and reminders about payment due dates. But sometimes less is more. What we need is more clarity about what information is actually doing or achieving.

Link to the rest at MIT Press

PG was drawn to the OP for two reasons:

  1. He had previously read several short works from the author, Cass Sunstein, and enjoyed them.
  2. At the end of the worst presidential campaign season in the history of the United States, PG (along with a great many other people) is sick and tired of hearing about the two candidates, their appearances, their statements, their goals, their relatives, their assistants, their past, their futures, etc., etc., etc., etc.

In other words, PG is suffering from too much information about people, opinions and events that tend to disgust him.

PG is looking forward to the day, hopefully not too far in the future, when he doesn’t come across a single mention of either candidate or anything associated with them.

Reading Patients, Writing Care

We turned her every couple of hours in the end, though somehow the procedure seemed more incessant than that. At times, it felt like a peculiarly brutal routine to inflict upon someone under your watch. But there was no room for compromise in the instruction we’d been told to follow: pressure sores can be deadly. And to have any chance of preventing them, we had to subject my grandmother to regular, distressing turns, which couldn’t be done fluently due to the effort involved; turns that demolished whatever quantum of peace only morphine could supply her in repose.

Before long, the task inevitably acquired a regimental punctuality. Yet it remained too intimate ever to be entirely functional. Nor did it become any easier with practice. By design, the whole process is rarely seamless. One hasty move can be torturous. Equally, though, overcautiousness carries its own perils: repositioning someone in slow motion prolongs the risk of aggravating existing abrasions. However tightly we policed our complacencies, there was always room for agony; and however inescapable such pain is, we weren’t about to absolve ourselves of the additional suffering we alone seemed to be inflicting. If it appeared as though we were destined to fail, this was hardly an acceptable compensation. The constant glare of anticipatory grief leaves the labor of care bleached of self-forgiveness.

The house in which my mother had been born and where she now once again lived—on account of poverty, not out of choice—became the place where she would see her own mother die. This symmetry was a privilege amid formidable sadness: “Most people want to die at home,” observes dementia campaigner and novelist Nicci Gerrard, yet “most die in hospital.” And while the majority of terminally ill people “want to be with family,” too “often they are alone with strangers.” How fortunate we were to be bucking that trend.

It is caregiving’s emotional and physical contours that are illuminated throughout Rachel Clarke’s Dear Life: A Doctor’s Story of Love and Loss. Although the book centers on the remarkable work of professional hospice staff—who ensure that people who don’t spend their final hours at home are at least surrounded by dignity, calm, even consolation—Clarke’s vision of care’s complex entwinements of torment and fulfillment is unconfined to specialist practitioners. As such, she reads distinct end-of-life experiences in medical settings for what they reveal about our common sentiments toward illness and dying; sentiments that imbue countless, apparently unexceptional, yet affectively multifaceted acts of caregiving that take place outside clinical environments too.

Link to the rest at Public Books

Lost in a Gallup

From The Wall Street Journal:

Griping about polling goes back a long time, even to the days before George Gallup published the first random-sample opinion poll in October 1935—as many years away from us in 2020 as that first poll was from the Compromise of 1850. And truth to tell, it doesn’t seem intuitively obvious that the responses of a randomly chosen group of 800 people should come reasonably close, in 19 cases out of 20, to those you’d get if you could interview everyone in a nation of 209 million adults. Even sharp math students don’t always know much about statistics and probability. So the griping goes on.
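The "reasonably close, in 19 cases out of 20" claim is the standard 95% sampling margin of error. A minimal sketch of the textbook formula, using the sample size quoted above (everything else is standard statistics, not from the book):

```python
import math

# 95% margin of error for a simple random sample of size n,
# using the worst case p = 0.5 ("19 cases out of 20" = 95% confidence).
def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

print(f"n=800 -> +/-{margin_of_error(800) * 100:.1f} points")  # ≈ ±3.5 points
```

Note that the population size never appears in the formula: for a population of 209 million the finite-population correction is negligible, which is why 800 respondents suffice whether the nation has two million adults or two hundred million.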

Some of it reflects a misunderstanding of what polling is. It’s not prediction: Polls are a snapshot taken at a point in time, not a movie preview of what you’ll see later. That fundamental point is often lost or at least misplaced by W. Joseph Campbell in “Lost in a Gallup: Polling Failure in U.S. Presidential Elections,” an otherwise fast-moving narrative history of some attempts to gauge public opinion amid electoral politics. “Election polls are not always accurate prophecies,” Mr. Campbell writes early on. He notes that “polling failures tend to produce broadly similar effects—surprise, anger, bewilderment and frustration at their failing to provide the American public with accurate clues about the most consequential of all U.S. elections.” Surprise, anger, bewilderment, frustration: This sounds like the response to the result of the 2016 election in the city where Mr. Campbell teaches, Washington (which voted 91% for Hillary Clinton and 4% for Donald Trump).

But Mr. Campbell’s gaze goes far beyond the Beltway and back further in history than the astonishing election night four years ago. He is well aware that the national polls in 2016 were close to the results; the pre-election Real Clear Politics average showed Hillary Clinton ahead by 3.3%, close to her 2.1% plurality in the popular vote. Polls in some states were further off. Still, Nate Silver’s FiveThirtyEight gave Donald Trump a 29% chance of winning, and 29% chances happen about one-third of the time. Mr. Campbell quotes RCP’s Sean Trende saying, rightly, that 2016 “wasn’t a failure of the polls. . . . It was a failure of punditry.”

The subject of “Lost in a Gallup” is not so much election polling as its effects on political journalism. Mr. Campbell, a prolific author and a communications professor at American University, admits up front that he is not concerned with “jargon and the opaque methodological arcana that pollsters and polling experts are keen to invoke.” The book is a history of mistakes and overcompensating for mistakes. Polling pioneers Gallup, Elmo Roper and Archibald Crossley, after bragging how closely the past three elections matched their poll numbers, all showed Thomas Dewey leading Harry Truman in 1948. Having got that wrong, they fudged their results to project a close race in 1952. Wrong again!

. . . .

Mr. Campbell devotes much attention, justifiably, to the 1980 election. For months, polls showed a close race between incumbent Jimmy Carter and elderly (age 69) challenger Ronald Reagan. But when the exit polls—invented by polling innovator Warren Mitofsky, also the inventor of random digit-dialing phone interviewing—showed Reagan well ahead, NBC projected his victory, to almost everyone’s astonishment.

But were the polls actually wrong? The author quotes the Carter and Reagan pollsters, Patrick Caddell and Richard Wirthlin, saying that opinion shifted strongly to Reagan after the candidates’ single debate seven days before the election and after Mr. Carter’s return to Washington the next weekend to tend to the Iran hostage crisis. Both pollsters told me the same thing back in the 1980s. Their story makes sense. Reagan’s “are you better off than you were four years ago?” debate line (stolen, though no one then realized it, from Franklin Roosevelt’s 1934 pre-election fireside chat) worked in his favor, and Mr. Carter’s job rating, buoyed upward all year by his efforts to free the hostages, was liable to collapse when he failed.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Reading literary versus popular fiction promotes different socio-cognitive processes, study suggests

From PsyPost:

A study published in PLOS One suggests that the type of fiction a person reads affects their social cognition in different ways. Specifically, literary fiction was associated with increased attributional complexity and accuracy in predicting social attitudes, while popular fiction was linked to increased egocentric bias.

“We learn a lot about ourselves, interpersonal relations, how institutions work, etc., from fiction. In other words, fiction impacts what we think about the world. But in my research, I am interested in the ways in which fiction shapes how we think,” explained study author Emanuele Castano of the University of Trento and the National Research Council in Italy.

. . . .

“We distinguished between literary (e.g. Don DeLillo, Jonathan Franzen, Alice Munro) and popular fiction (e.g. Dan Brown, Tom Clancy, Jackie Collins), and showed that it is by reading literary fiction that you enhance your mindreading abilities — you are better at inferring and representing what other people think, feel, their intentions, etc.”

Link to the rest at PsyPost

PG wonders, at least in the current (and likely post-election) political environment in the US, if more mindreading purporting to discover other people’s (perhaps hidden) thoughts and motives is a good idea or not.

HC signs book for stressed women

From The Bookseller:

HarperCollins has acquired a guide to help women beat stress, Stressilient, by clinical psychologist Dr Sam Akbar.

World English language rights for Stressilient: How to Beat Stress and Build Resilience were acquired by PR and publishing director Michelle Kane at Fourth Estate from Claudia Young at Greene & Heaton. Publication is scheduled for spring 2022.

In the book, Dr Akbar will draw on her own professional expertise–with over 10 years’ experience as a clinical psychologist–providing “sensitive guidance and practical tools for women who are looking to feel calmer, less stressed and more resilient to life’s challenges”.

Kane said: “While life affirming insta-quotes might provide a quick fix, now, more than ever, we need the voices of experts to help us deal with our mental wellbeing in the long term. Sam’s professional experience positions her as a real voice we can trust in and this essential little book will provide tools that the reader will use for life!”

Link to the rest at The Bookseller

Stalin: Passage to Revolution

The information card on “I. V. Stalin”, from the files of the Imperial police in Saint Petersburg, 1911
via Wikipedia

From The Wall Street Journal:

Not surprisingly, Joseph Stalin has been the subject of many biographical studies, in recent years in particular, when formerly closed Soviet archives became open to students of history. Decades before, Leon Trotsky, Isaac Deutscher, Adam Ulam and Robert Tucker, to name a handful of prominent authors, wrote hefty volumes on Stalin’s life, attempting to tell the story with limited information. Their work has been surpassed by another generation of scholars, led by Dmitri Volkogonov, Robert Service, Oleg Khlevniuk and Stephen Kotkin. They have plumbed the archives and benefited from a host of memoirs that have deepened our understanding of a murderous dictator whose legacy, nearly 70 years after his death, still haunts the countries he once ruled.

Ronald Grigor Suny’s “Stalin: Passage to Revolution” is a worthy contribution to this continuing enterprise. “The telling of Stalin’s life has always been more than biography,” Mr. Suny writes. “There is wonder at the achievement—the son of a Georgian cobbler ascending the heights of world power, the architect of an industrial revolution and the destruction of millions of the people he ruled, the leader of the state that stopped the bloody expansion of fascism.” It is the story of how the Romanov dynasty, convinced of its own divine right to rule the Russian Empire, confronted “a newly emerging social class” of industrial workers, a clash that “exploded into violence, bloodshed, and eventually revolution.” Reading Mr. Suny’s chronicle, one can’t help recalling John F. Kennedy’s remark, in a 1962 speech, that “those who make peaceful revolution impossible will make violent revolution inevitable.”

. . . .

Mr. Suny’s focus is Stalin’s early decades, from his birth and education to the eve of revolution in 1917. Born in 1878 in the Georgian town of Gori, on the southern periphery of the Russian Empire, Ioseb Jughashvili, as he was christened, was raised in a poor family. His father scratched out a living as a cobbler; his mother was a religious woman who worked as a seamstress. The couple had lost their first two sons in infancy, driving his father to become “violent, erratic, and drunk,” Mr. Suny says, and to abandon the family. Convinced of Joseph’s abilities, his mother worked to gain his admission to a seminary so that he could become a priest.

Using his access to archives in Georgia, Mr. Suny describes the milieu in which the young Joseph grew up—the children’s games he enjoyed and the literature and myths that animated his imagination. It was at the seminary in the Georgian capital of Tiflis that the teenage Joseph confronted the obstinacy of his teachers, who denigrated Georgian culture and insisted on the primacy of Russian language and history. Life at the seminary, Mr. Suny writes, was “colorless and monotonous . . . , a strict routine designed to inculcate obedience and deference.” It proved to be as much a “crucible for revolutionaries as for priests” and pushed “an intelligent but still quite ordinary adolescent into opposition.” At the seminary, Joseph “came to socialism through reading and the fellowship of classmates.”

. . . .

Stalin, known as Koba to his comrades, made a name for himself as a party organizer in the Caucasus, among miners and oil workers. Here confrontations with czarist officials were violent and bloody, marked by heists and assassinations.

Stalin closely studied the works of Marx and, not least, the writings of Lenin before he met the Bolshevik leader in 1905, an encounter that began a close and fateful association. Mr. Suny’s close study of these years uncovers the traits of suspicion and intrigue that came to define Stalin in power. Koba, he writes, “was not above using dubious means against comrades with whom he disagreed,” lying about them behind their backs to compromise their standing. In his encounters with Mensheviks, he indulged in anti-Semitic insults, knowing that there were more Jews among them than among the Bolsheviks he favored.

Mr. Suny’s account of the tensions between Bolsheviks and Mensheviks is spirited and compelling, especially when he describes these ostensible allies splitting into “antagonistic cultures,” each demonizing the other over their motives, making reconciliation ever less likely. Lenin is often at the center of this story, engaging in vicious polemics against his ideological adversaries. 

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The Alignment Problem

From The Wall Street Journal:

In the mid-1990s, a group of software developers applied the latest computer learning to tackle a problem that emergency-room doctors were routinely facing: which of the patients who showed up with pneumonia should be admitted and which could be sent home to recover there? An algorithm analyzed more than 15,000 patients and came up with a series of predictions intended to optimize patient survival. There was, however, an oddity—the computer concluded that asthmatics with pneumonia were low-risk and could be treated as outpatients. The programmers were skeptical.

Their doubts proved correct. As clinicians later explained, when asthmatics show up to an emergency room with pneumonia, they are considered so high-risk that they tend to be triaged immediately to more intensive care. It was this policy that accounted for their lower-than-expected mortality, the outcome that the computer was trying to optimize. The algorithm, in other words, provided the wrong recommendation, but it was doing exactly what it had been programmed to do.
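The confound the clinicians describe is easy to reproduce in a toy simulation (all numbers below are invented for illustration; this is not the actual 1990s system or its data):

```python
import random

random.seed(0)

# Asthmatic pneumonia patients are genuinely HIGHER risk here, but a
# hidden hospital policy routes them straight to intensive care, which
# drives their *observed* mortality below that of non-asthmatics.
def simulate_patient(asthmatic: bool) -> bool:
    base_risk = 0.25 if asthmatic else 0.10  # true underlying risk
    gets_icu = asthmatic                     # the hidden triage policy
    risk = base_risk * (0.2 if gets_icu else 1.0)
    return random.random() < risk            # True = patient died

patients = [(a, simulate_patient(a)) for a in (True, False) for _ in range(10_000)]

for group in (True, False):
    deaths = sum(died for a, died in patients if a == group)
    label = "asthmatic" if group else "non-asthmatic"
    print(f"{label}: observed mortality {deaths / 10_000:.1%}")
```

A model trained only on (asthmatic, died) pairs sees roughly 5% versus 10% mortality and "correctly" flags asthmatics as low-risk outpatients: it optimizes the recorded outcome faithfully, while the policy that produced that outcome appears nowhere in its training data.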

The disconnect between intention and results—between what mathematician Norbert Wiener described as “the purpose put into the machine” and “the purpose we really desire”—defines the essence of “the alignment problem.” Brian Christian, an accomplished technology writer, offers a nuanced and captivating exploration of this white-hot topic, giving us along the way a survey of the state of machine learning and of the challenges it faces.

The alignment problem, Mr. Christian notes, is as old as the earliest attempts to persuade machines to reason, but recent advances in data-capture and computational power have given it a new prominence. To show the limits of even the most sophisticated algorithms, he describes what happened when a vast database of human language was harvested from published books and the internet. It enabled the mathematical analysis of language—facilitating dramatically improved word translations and creating opportunities to express linguistic relationships as simple arithmetical expressions. Type in “King-Man+Woman” and you got “Queen.” But if you tried “Doctor-Man+Woman,” out popped “Nurse.” “Shopkeeper-Man+Woman” produced “Housewife.” Here the math reflected, and risked perpetuating, historical sexism in language use. Another misalignment example: When an algorithm was trained on a data set of millions of labeled images, it was able to sort photos into categories as fine-grained as “Graduation”—yet classified people of color as “Gorillas.” This problem was rooted in deficiencies in the data set on which the model was trained. In both cases, the programmers had failed to recognize, much less seriously consider, the shortcomings of their models.
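The analogy arithmetic the review describes can be sketched with hand-made toy vectors (a real system learns its vectors from the text corpus and uses hundreds of dimensions; the four dimensions below are invented for illustration):

```python
import math

# Toy 4-dimensional "embeddings" — dimensions might loosely be read as
# royalty, maleness, femaleness, and ordinariness.
emb = {
    "king":  [0.9, 0.8, 0.1, 0.0],
    "queen": [0.9, 0.1, 0.8, 0.0],
    "man":   [0.1, 0.9, 0.1, 0.2],
    "woman": [0.1, 0.1, 0.9, 0.2],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# "King - Man + Woman", then find the nearest other word by cosine similarity.
target = [k - m + w for k, m, w in zip(emb["king"], emb["man"], emb["woman"])]
best = max((w for w in emb if w != "king"), key=lambda w: cosine(emb[w], target))
print(best)  # → queen
```

The biased completions the review cites ("Doctor-Man+Woman" yielding "Nurse") arise the same way: the arithmetic is neutral, but the learned vectors encode whatever regularities, including historical sexism, the corpus contains.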

We are attracted, Mr. Christian observes, to the idea “that society can be made more consistent, more accurate, and more fair by replacing idiosyncratic human judgment with numerical models.” But we may be expecting too much of our software. A computer program intended to guide parole decisions, for example, delivered guidance that distilled and arguably propagated underlying racial inequalities. Is this the algorithm’s fault, or ours?

To answer this question and others, Mr. Christian devotes much of “The Alignment Problem” to the challenges of teaching computers to do what we want them to do. A computer seeking to maximize its score through trial and error, for example, can quickly figure out shoot-’em-up videogames like “Space Invaders” but struggles with Indiana Jones-style adventure games like “Montezuma’s Revenge,” where rewards are sparse and you need to swing across a pit and climb a ladder before you start to score. Human gamers are instinctively driven to explore and figure out what’s behind the next door, but the computer wasn’t—until a “curiosity” incentive was provided.
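One simple form of the "curiosity" incentive mentioned above is a count-based novelty bonus added to the game score, so rarely visited states look rewarding even before any points are earned. The schedule below is an illustrative sketch, not the specific method discussed in the book:

```python
import math
from collections import defaultdict

visit_counts = defaultdict(int)

def shaped_reward(state, game_score: float, beta: float = 0.1) -> float:
    """Game score plus a bonus that decays as a state becomes familiar."""
    visit_counts[state] += 1
    novelty_bonus = beta / math.sqrt(visit_counts[state])
    return game_score + novelty_bonus

# In a sparse game the score is 0.0 almost everywhere, but a new room
# still pays out more than a familiar one:
print(shaped_reward("new_room", 0.0))    # first visit:  bonus 0.1
print(shaped_reward("start_room", 0.0))  # first visit:  bonus 0.1
print(shaped_reward("start_room", 0.0))  # repeat visit: bonus ≈ 0.071
```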

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

When PG was in high school, The Mother of PG, aka Mom, made PG take a typing class. Learning how to type and type quickly might have been the most useful thing PG learned in high school.

PG earned money in college by typing papers for other students who couldn’t type. He charged a high per-page rate and collected it because he specialized in typing for procrastinators. If you finished your rough draft at midnight, PG would show up with his portable typewriter and turn it into something your professor would accept at 8:00 am the next morning.

PG kept typing through law school, typing all his law school exams and whatever parts of the bar exam could be typed.

When PG was a baby lawyer, he had a client who was also working with a fancy law firm in Los Angeles. He went over to the fancy law firm on occasion to meet with the fancy lawyers who worked there (He rode up the elevator to the law firm’s offices with Marlon Brando one time and Kareem Abdul-Jabbar another time. Kareem looked a lot less dissipated than Marlon.)

The fancy law firm had the first word-processing computers PG had ever seen. The firm had eight of these computers and they were operated by the fastest and most accurate typists PG has ever seen. The machines and operators were in their own glass-walled room and at least a couple of typists were on duty 24 hours a day. (PG was there at midnight to pick up a rush project, and one of the typists delivered the finished contract to him on the spot.) PG just checked and each of the computerized word processors cost over $180,000 in 2020 dollars.

PG was the first lawyer he knew who bought a personal computer for his law office. Fortunately, personal computers could also be used for playing videogames, so the price had come way, way, way, way down from $180,000.

Because he could still type fast, PG learned how a word processing program worked. Plus a bunch of other programs. He quickly started using his PC for legal work. Why type a document you used for a lot of different clients over and over when you could just type it once for Client A, save a copy, then use the copy as the basis for Clients B-Z?

PCs were evolving quickly, so when a more powerful PC was released, PG bought one, moved his prior PC to his secretary’s desk and showed her how to use the word processing program.

Since PG always hired the smartest secretaries he could find, within a couple of weeks, she was better with the word processor than PG was.

For a variety of different reasons, PG started doing a lot of divorces for people who didn’t have a lot of money (the local Legal Aid office thought he did a good job and sent a lot of clients his way).

In order to make money doing divorces for people who didn’t have much (Legal Aid never had enough money, so it didn’t pay much for a divorce either), PG built a computer program so he could do the paperwork necessary for a divorce very quickly.

The wife’s name, the husband’s name, the kids’ names and ages, the year and make of the rusted-out pickup, the TV, sofa, etc., were the same from start to finish, so why not type them into a computer program once, then build standard legal forms that would pull the same information into all the various documents the state legislature, in its infinite wisdom, had said were necessary to end a marriage?
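The idea PG describes — collect the facts of a case once, then pour them into every required form — is what programmers call template merging. A minimal modern sketch (the field names and form wording below are hypothetical, not Splitsville’s actual forms):

```python
from string import Template

# The facts of one case, typed exactly once.
case_facts = {
    "wife": "Jane Doe",
    "husband": "John Doe",
    "vehicle": "1972 Ford pickup",
}

# Standard forms, each referencing the same named fields.
petition = Template(
    "$wife, Petitioner, and $husband, Respondent. "
    "Marital property includes one $vehicle."
)
summons = Template("TO: $husband. $wife has filed for dissolution of marriage.")

# Every form reuses the same facts; nothing is retyped.
for form in (petition, summons):
    print(form.substitute(case_facts))
```

Add a new state-mandated form and it simply draws on the same dictionary of facts, which is why a program like this makes a low-fee divorce practice economical.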

PG has meandered for too long, but to conclude quickly, he ended up building a commercial divorce computer program he named “Splitsville” and sold it to about 20% of the attorneys in the state where he was practicing at the time.

(In the United States, the laws governing divorce, AKA Dissolution of Marriage, vary from state to state, so Splitsville couldn’t cross state lines. Even though the fundamental human and property issues are the same any time a marriage is ended, PG suspects there are enough idiots in any state legislature to shout down anyone who says, “Why don’t we just do it the way Alabama does instead of concocting a divorce law of our own?”)

Which means PG doesn’t have enough knowledge to build artificial intelligence programs as described in the OP, but he does have an intuitive grasp of how to persuade computers to do things you would like them to accomplish. PG and computers seem to understand each other at a visceral level even though PG is less like a computer than a whole lot of smart people he knows. It’s sort of a Yin/Yang thing.

His liberal-arts assessment of the problem described in the OP is that the computer scientists in the OP haven’t figured out how to ask the ultra-computer for the answers they would like it to provide. A computer can do smart things and dumb things very quickly, but useful output requires understanding what you really want it to do, then figuring out how to explain the job to the computer.

But, undoubtedly, PG is missing something entirely and is totally off-base.

The Alignment Problem may be a good description of both the computer issue described in the book and of PG himself.

Tecumseh and the Prophet

From The Wall Street Journal:

In 1808, on the Wabash River—just downstream from where the Tippecanoe River flows into it—a new settlement was being built, in what is now northwestern Indiana. You could hear trees being cut down to construct houses and a 5,000-square-foot meeting house. Women were planting corn, beans and pumpkins. Founded by the Shawnee brothers Tecumseh and Tenskwatawa, this was Prophetstown.

The brothers’ houses were close to each other on Prophetstown’s southwestern edge, from which they could see the wide Wabash flowing through the prairie. And they could see pilgrims coming and going, visiting this place of hope in a dark time. A vast diversity of Native peoples—Wyandots, Ottawas, Lenapes (Delawares), Miamis, Potawatomis, Sauks—would pilgrimage to this multiethnic religious community, some staying and some returning home to spread the brothers’ universal Nativist message. As Tenskwatawa explained: The “Master of Life had taken pity on his red children,” who had been pushed around so long by white men. He “wished to save them from destruction” if they would cast aside “wealth and ornaments,” whisky and other trappings of “evil and unclean” white Americans and band together against those who “have taken your lands, which were not made for them.”

In “Tecumseh and the Prophet: The Shawnee Brothers Who Defied a Nation,” Peter Cozzens tells the intertwined history of the brothers Tecumseh and Tenskwatawa and makes the important argument that, without Tenskwatawa—who was known as “the prophet” for his spiritual visions and prophecies—“there would have been no Tecumseh.”

In most biographies and popular versions of this history, the famous warrior Tecumseh, who led a pan-Native force against the United States in the War of 1812, stands alone—exactly the opposite of his mission in life. As Mr. Cozzens shows, the brothers sought to bring together all Native Americans under Tenskwatawa’s teachings, persuading them to cast aside their political, cultural and religious differences to become one mighty race.

While the Master of Life spoke to the prophet Tenskwatawa, Tecumseh traveled throughout the eastern half of North America to spread the word, from Creek and Choctaw towns in the deep South to the Iroquois (Haudenosaunee) nations in the Northeast, and across the Mississippi River to the Quapaws, the powerful Osages and bands of the brothers’ own Shawnee people who had already moved west. Everywhere, Tecumseh preached Tenskwatawa’s prophecies and readied men for battle against the United States.

. . . .

The book’s sharply drawn characters go beyond the central figures of Tecumseh and Tenskwatawa. One of their greatest influences was their older brother Cheeseekau, killed in 1792 fighting against Tennessee settlements alongside Cherokees and Creeks. Their Shawnee opponent, Chief Black Hoof, believed Tenskwatawa’s call for pan-Indian resistance, instead of Shawnee-directed diplomacy, was madness. The great Miami war leader Little Turtle defeated U.S. forces in the 1790s, but by the time of Tenskwatawa’s movement he believed that compromise with the United States was the only path. As governor of the Indiana Territory, William Henry Harrison was impressed by Tecumseh’s rhetorical and martial skills and frightened by his popularity. Harrison later would win the U.S. presidency as “Old Tippecanoe,” famed for defeating Tecumseh.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG has read a bit of American history, but this was completely new to him.

However, he did a bit of quick research and found the following images:

Tenskwatawa, “The Prophet” painted in 1830 by George Catlin via Wikipedia
Frieze of the Rotunda of the United States Capitol “Death of Tecumseh” – Tecumseh is shown being fatally shot by Colonel Johnson at the Battle of the Thames in Upper Canada during the War of 1812. With Tecumseh’s death, the momentum and power of the Indian confederacy were broken. via Wikipedia, public domain, US Government work
Bronze reproduction of the figurehead of the USS Delaware, located at the US Naval Academy. Originally Tamanend, chief of the Delawares, the statue is now commonly called Tecumseh. It was designed by William Luke (1790-1839). Photo by employees of the U.S. Naval Academy, public domain, via Wikimedia Commons

Cary Grant: A Brilliant Disguise

From The Wall Street Journal:

H.L. Mencken was doubtful that Shakespeare wrote the plays assigned to him because there is substantial evidence that he acted in them, which is an amusing way of saying that actors are not notable for searing intelligence. Their intelligence and much else about famous movie actors was nicely kept under cover during the years, from the 1930s through the early 1960s, of the studio system in Hollywood. The men who ran the great studios—MGM, Fox, Warner Bros., Paramount—knew that the people went to the movies above all to see their favorite actors, and so the actors had to be protected from showing themselves the coarse, ignorant, foolish beings they often were. The studio bosses did this by controlling the interviews their actors gave, restraining them from making political statements, hiding anything peculiar about their sex lives. Actors were where the money was, the vehicles in which the movie business drove all the way to the bank.

One reads about the off-screen lives of actors at the peril of never again being able to enjoy in quite the same innocent way the movies they made.

. . . .

I began Scott Eyman’s biography of Cary Grant with some trepidation. In his movies Cary Grant was the embodiment of suavity, the master of savoir faire, elegant, witty, in every way winning. He was dazzlingly but somehow inoffensively (to men) handsome, for in most of his movies he won over women not by his good looks but by his bumbling yet invincible charm. Would Cary Grant, too, in so-called real life, turn out to be a jerk, a creep, a monster, another disappointment? I, for one, distinctly preferred not.

Cary Grant was born Archibald Alexander Leach in 1904 in Bristol, England, to an alcoholic working-class father (he was a tailor’s presser) and a mother who spent more than 20 years in a mental institution. In Mr. Eyman’s account, Grant, an only child largely ignored by his parents, “would spend the rest of his life coping with the damage inflicted on him during these years,” harassed all his days by unreasonable fear and uncertainty.

The young Archie Leach left school at 14—actually, he was kicked out—and found succor in Bristol’s music halls, the English version of our vaudeville, with a touch of bawdiness added. He soon acquired low-level work among some of the performers and not long after joined a troupe of tumblers, with whom he did acrobatics, stilts-walking and pantomime. The troupe traveled to America, where it played second- and third-line theaters, and when it returned to England the young Archie Leach chose not to return with it.

He found a place acting in B-minus movies in New York, then traveled out to Hollywood, where he gradually found parts in better movies. In 1931 he had his name changed to Cary Grant—or, as Mr. Eyman puts it, “the matchless specimen of masculine charm known as Cary Grant.” A friend of Grant’s once told him, “I always wanted to be Cary Grant,” to which he replied, “So did I.” The subtitle of “Cary Grant” is “A Brilliant Disguise.”

. . . .

What was disguised underneath Grant’s nonchalant aristocratic facade, according to Mr. Eyman, “was a personality of nearly perpetual anxiety.” Grant was a man who had no fewer than five marriages (he remarked late in life that he was a better judge of scripts than wives), spent much of his life in therapy, once attempted suicide, and claimed LSD (which he had taken under supervision more than 100 times) to be a wonder drug that quieted the rumblings in his soul and becalmed him by revealing his true self to him.

Whatever the rich complications in his personal life, Cary Grant was never less than keen about cultivating his professional life. He was sedulous about his personal appearance. He worked daily on his perfect tan. His clothes were, beyond impeccable, perfection. Never rumpled, even when chased by an airplane through a farm field or climbing Mount Rushmore, he was often on Ten Best-Dressed Men lists, and the other nine men, whoever they were, must all have felt themselves more than a touch shabby compared with him. “I consider him not only the most beautiful but the most beautifully dressed man in the world,” said Edith Head, the fabled Hollywood costume designer.

Over his 40-year career, Grant made 73 movies. 

. . . .

Romantic comedy was Cary Grant’s specialty. “Grant was to romantic comedy,” Mr. Eyman writes, “what Fred Astaire was to dance—he made something extremely difficult look easy.” Grant recognized that the key to comedy was in timing, and his own timing, first learned on the English music-hall stage, was consummate. He knew his strengths and limitations and kept his ambition in bounds. William Wilkerson III, son of the founder of the Hollywood Reporter, noted that Grant “was one of the few English actors who had no desire to play Shakespeare.” He avoided glum parts generally, sensing, correctly, that movie audiences had no interest in seeing him, in a wife-beater undershirt, screaming “Stella!”

Grant understood that a key to success for an actor in Hollywood was to work with the best directors. For the most part, he was able to arrange to do so. He worked in films directed by Leo McCarey, Howard Hawks, George Stevens, George Cukor and Alfred Hitchcock. Given his popularity at the box office, he had, as Mr. Eyman writes, “first crack at nearly every script that didn’t involve a cattle drive or space aliens.”

Equally careful about female co-stars, Grant played in movies with Katharine Hepburn, Irene Dunne, Audrey Hepburn, Grace Kelly and Ingrid Bergman. He especially admired Bergman. “Grant found that he liked Ingrid Bergman a great deal,” Mr. Eyman notes. “She was beautiful, but lots of actresses are beautiful. What made Bergman special was her indifference to her looks, her clothes, to everything except her art.” With Bergman he made “Notorious,” “the high-water mark,” according to Mr. Eyman, “of the Hitchcock-Grant collaborations.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Hellacious California!

From The Los Angeles Review of Books:

NINETEENTH-CENTURY AMERICAN CRITIC Hinton Rowan Helper left a lasting impression on how Californian culture is viewed to this day through one mordant comment:

I will say, that I have seen purer liquors, better segars [cigars], finer tobacco, truer guns and pistols, larger dirks and bowie knives, and prettier courtezans here, than in any other place I have ever visited; and it is my unbiased opinion that California can and does furnish the best bad things that are obtainable in America.

Gary Noy draws on Helper’s gleeful sentiment for the title of his book Hellacious California!: Tales of Rascality! Revelry! Dissipation! and Depravity! and the Birth of the Golden State, sharing the view that California’s origin story is a combination of greatness and immorality. The book teems with bittersweet compounds of 19th-century nefariousness, including — but not limited to — gambling, knife fights, the demon drink, con artistry, and prostitution.

. . . .

Gracious dining and gluttony were also at their peak right after the Civil War, with residents binging on jackass rabbit and codfish. I respect Noy’s ability to evenly weigh the temptations of the era. There are the “bad things” that affect the self (e.g., demon drink, gambling, tobacco), and those that affect others (e.g., divorce, knife fights, sex slavery). There is heavy content on Old California’s call for political change and the depth of the unhappiness with elected leadership. Most social issues stemmed from political corruption, especially corruption brought by the railroads. Nineteenth-century state government was also not big on quality law enforcement. Instead, San Francisco local citizens formed their own vigilance committees. Miners and local townspeople created their own form of justice.

In the same way that many civilians helped one another, others tried to harm each other. Many people belonging to the lower class scammed and tried to “eat the rich.” Wealthy individuals spent incredible amounts of money on luxurious things they did not necessarily value. Arabella Huntington, widow of Central Pacific Railroad founder Collis P. Huntington, stepped into her carriage after attending an art gallery. Soon after, a gallery employee chased the carriage to let Mrs. Huntington know she had forgotten her handbag, which contained “eleven pearl necklaces valued at more than $3.5 million, the equivalent of $108 million today.”

Link to the rest at The Los Angeles Review of Books

Having lived in California a long time ago and having close relatives and friends who still live there, PG can assure one and all that the California you will find today has changed from the California described in the OP.

In some respects.

And in some places.

California was and is a big place with lots of variations in climate, people and cultures.

San Francisco is not Fresno. Los Angeles is not Barstow. Quite a number of residents of each of these four cities are vociferously happy that they don’t live in one of the other three cities mentioned.

California includes both Hollywood and Death Valley (parts of which are shared with Nevada).

In the last half of the 19th century, a great portion of California qualified as nearly or completely uninhabited mountains and deserts that would have been described as useless and dangerous wastelands at the time. If California felt too settled, you could always go east to Nevada (which has places a bit more welcoming than Death Valley) or Arizona for more alone time.

The first transcontinental railroad was begun in 1863, while the Civil War was still being fought; it started in Council Bluffs, Iowa, and ended in Oakland, California, in 1869.

Prior to that time, if you wished to travel from one coast of the United States to the other, you either took a miserable, long, dirty and dangerous horse-powered trip across the United States or, if you had more money, you took a ship that landed in either Nicaragua or Panama, crossed one of those countries on foot or by horse, hoping to avoid catching any tropical diseases, then boarded a ship on the other side and completed your journey to the opposite coast of the US.

Both the land and the sea routes included significant dangers to life and health.

Some people became very rich in both the East and the West from their involvement in building the railroad. Others didn’t.

Some people in the East and West got very rich by financing the construction of the railroad and others lost their shirts, banks and fortunes.

Most US government politicians and employees received bribes for their services in picking the route and funding the construction of the railroad. State politicians sometimes participated in the bribing and at other times collected bribes. There were competing bribers who promoted one route over another because they owned a lot of land on one prospective route or another.

All this is to say that California, its residents and elected officials participated in the disorganized and corrupt parts of building the railroad, but residents and elected officials in other parts of the country did the same.

California residents and residents of other states also organized and performed the incredible engineering and construction feats necessary to build a railroad across vast uninhabited deserts and high, little-known mountains.

Imported Chinese laborers were also essential to the construction. While crossing the Sierra Nevada mountains, parts of which were snow-covered all or much of the year, some of the Chinese dug tunnels into the deep snow along the route and built snow caverns in which to eat and sleep, sheltered from the freezing winds that blew almost constantly. Such shelter was necessary for survival because death by freezing was a real danger to workers of all nationalities.

For visitors to The Passive Voice from outside the United States, the transcontinental railroad was a bit over 1,900 miles (over 3,000 km), longer than the distance from London to Moscow. (Yes, the Trans-Siberian Railway is longer.)

The book describing this great effort, which PG read a few months ago and greatly enjoyed, is Nothing Like It In the World: The Men Who Built the Transcontinental Railroad 1863-1869 by Stephen E. Ambrose. If you’re interested in more detail, PG highly recommends it.

The switch to coal changed everything in Britain

From The Wall Street Journal:

The grimy furnaces and coal-stained cheeks of Dickensian Britain seem like an indelible birthright, but it wasn’t always so. As Ruth Goodman writes in “The Domestic Revolution,” Britons had for ages burned wood as well as peat and other plant fuels to heat their homes and cook their food. Then, in the late 16th century, London switched to coal.

This revolutionary change was carried out by ordinary families, the “ ‘hidden people’ of history,” as Ms. Goodman calls them. They switched to coal for the most prosaic of reasons—personal comfort, convenience, a small savings.

Yet the “big switch” set in motion a series of large transformations. Thousands of Britons found new work as miners and as merchant seafarers. The island’s fabled heathland, site of all those chest-throbbing novels, faded and disappeared as woodland, no longer needed for fuel, was given over to agriculture. To vent sulfurous coal fumes, chimneys sprouted all over London, prompting homeowners to build more spacious layouts and second and third stories.

Since coal fires required a different sort of cookware, investment poured into brass and iron, hastening the development of pig iron—hastening, that is, the onset of the Industrial Revolution. Wall tapestries came down (in a coal-fired home, they quickly stained) and were replaced by smoother, washable surfaces and paint. There was a bull market in soap.

Not least, British cooking, which Ms. Goodman stoutly defends, was forced to adapt. Stirring a pot precariously dangled over a row of coals was difficult. Thick, starchy fare gave way to boiled puddings and kidney pies, which the author forgivingly describes as “democratic.” Thanks to the pleasing effects of roasting on an open grate, Ms. Goodman maintains, coal even led to the “modern British love affair” with toast. The new energy source touched every corner of life.

. . . .

Whatever the causes, the changeover to coal happened quickly. When Elizabeth ascended the throne, in 1558, London homes burned wood. A generation later, the increasingly crowded city was importing 27,000 metric tons of coal per year. By roughly the time of Elizabeth’s death, in 1603, imports had soared to 144,000 tons.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Lover, Mother, Soviet Spy

From The Wall Street Journal:

Though little known to the broader world, Ursula Kuczynski, born in 1907 into a wealthy, cultured Jewish family in Berlin, was one of the most successful spies of the 20th century. She worked for the GRU—Soviet Military Intelligence—in China, Eastern Europe, Switzerland and, most damagingly, Britain. Beginning in the early 1930s as the provider of a safe house for spies to meet, she was soon trained as a radio operator, courier and liaison between communist underground activists and Moscow. Eventually she ran her own network of Soviet spies in Nazi Germany and, after war broke out, in Britain. There she served as a courier for Klaus Fuchs, the émigré physicist who later worked at Los Alamos and betrayed its secrets to the Soviet Union.

Kuczynski—whose code name was “Sonya”—never endured prison or torture or the other gruesome fates that befell many of her comrades, though she was pursued by the security services of the Chinese Nationalists, the Japanese, the Nazis and various British governments. She spent most of her 20-odd years as a spy living in fairly comfortable surroundings with her children, posing as a middle-class foreigner and friendly neighbor. Only when Fuchs himself was arrested in 1950, and she faced the possibility of exposure and arrest, did she board a plane to return to East Germany. She then transformed herself into a novelist and published, under a pseudonym, an autobiography that resulted in a triumphant book tour in Britain, the country that had given her refuge and that she had betrayed.

With “Agent Sonya,” Ben Macintyre, the author of several popular works about espionage, has written a lively account of Kuczynski’s remarkable career. He has been aided by the cooperation of her family and by his research in the British and (in a limited way) Russian archives. Inevitably, as the reader should keep in mind, much of Kuczynski’s life is filtered through her autobiography, which was written in East Germany under the scrutiny of censors by a woman whose survival depended on lying about many of her activities. Her account of Stalin’s purges of the GRU, for example, is limited to the statement that, “unfortunately, comrades in leading positions changed frequently at that time.” While government files and private letters offer a partial reality check, the GRU archives, the best source for knowing how far-reaching her career was, remain inaccessible.

Kuczynski was an early rebel, Mr. Macintyre tells us, participating in communist demonstrations in Berlin at age 17. Her father, a demographer, and her brother, an economist, had connections to many government officials throughout Europe and the United States, and both later fed her information for transmission to the U.S.S.R. Jurgen, her brother, led the underground Communist Party in Britain during World War II and was the first to put her in touch with Fuchs.

Her entry into espionage came in Shanghai, where she was living in 1930 with her husband, Rudi Hamburger, an architect. Appalled by the poverty and brutality of the city, and repulsed by the racism and luxurious lifestyle of the Western community there, she was recruited by Agnes Smedley, the American journalist, and Richard Sorge, the legendary Soviet spy. Kuczynski and Sorge (a compulsive womanizer) ended up having a passionate affair. Mr. Macintyre observes that she was “intoxicated by the thrill of her own destiny, the entwining of danger and domesticity, living one life in public and another in deepest secrecy.”

. . . .

Fearing deportation from Switzerland to Germany in 1940, she concluded that marriage to a British citizen would enable her to obtain a British passport. She arranged to divorce Rudi and married Len Beurton, a veteran of the International Brigades in Spain.

One of the great mysteries of Kuczynski’s career is how she managed to avoid detection by British authorities. 

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Smith lands Instagrammer’s guide to planning for Ebury

From The Bookseller:

Ebury Press editorial director Emma Smith has acquired Happy Planning, a “practical guide for those who like to prep” from Charlotte Plain, a.k.a. Instagrammer Princess Planning.

Smith bought world all-language rights to the title directly from the author.

Plain is the person behind Instagram account and website Princess Planning, where she sells diaries, planners and stationery which aim to help organise and inspire positivity. Happy Planning will give readers the tools they need to plan every aspect of their life, from the weekly shop and daily meal prep to big occasions like weddings, parties and holidays.

The publisher explains: “Planning is about taking away last-minute panic pressure, gaining control and helping you to be the best version of yourself. Charlotte’s everyday approach has been so successful that she launched a business off the back of it, and is now sharing all of her practical and positive know-how in this book. As well as her planning mantras and toolkit, each section of the book is dedicated to an area of life that benefits from planning and is packed with personal learning experiences, planning methods, tips and tricks, practical guidance and interactive elements. It’s simple, positive and practical planning that will lead to a healthier happier you.”

Smith added: “We all need a good dose of practical positive planning in our lives (now, more than ever), so we are incredibly excited to be publishing an Instagram star.”

Link to the rest at The Bookseller

PG couldn’t resist visiting Princess Planning Ltd. on Instagram (192K followers). He found several Instagram star gems:

Weight loss is never just about losing the weight.

You have to lose the habits that got you there in the first place and replace them with better ones

Princess.Planning Ltd.

if 2020 was a chocolate it would be a turkish delight

Princess.Planning Ltd.

What would we do without traditional publishers to act as curators of culture?

‘Not Made by Slaves’ Review: Marketing to Abolitionists

From The Wall Street Journal:

As Ecclesiastes reminds us, there is nothing new under the sun. Demands for institutional divestiture of morally suspect assets, boycotts of goods from Goya beans to “blood diamonds,” and movements to promote ethically sourced consumer products from dolphin-safe tuna to fair-trade coffee, all have a long pedigree. Among the earliest expressions of American national identity was the 1765 nonimportation agreement among Boston merchants in response to British revenue acts, supported by boycotts of British goods by local households. And, starting in the late 18th century, antislavery activists in the Atlantic world urged consumers to refuse to buy products made with slave labor—primarily cotton cloth and sugar—to hasten the end of the international slave trade and ultimately slavery itself. Conscientious consumers flocked to free-labor goods and to such virtue-signaling items as emblazoned sugar bowls assuring guests that the content was “East India Sugar not made by Slaves.”

In “Not Made by Slaves,” Bronwen Everill, a lecturer in history at the University of Cambridge, terms this movement “ethical capitalism” and places it in the context of the 18th-century global consumer revolution that put luxury goods in the hands of the many. The book offers an important contribution by emphasizing West Africa’s role in the trade network that linked producers, merchants and consumers around the world. Just as Western economies traded for tropical luxuries such as tea, coffee and sugar, sophisticated African markets exchanged a highly valued commodity—unfree labor—for French wines, East Indian cottons and British firearms. It was this complex global trading community that the abolitionists sought to reform and that they disrupted in ways both foreseen and unforeseen.

Ms. Everill’s account rests on a chain of related events. In the late 1700s, opposition to the slave trade grew as the world came to appreciate the horrors of the Middle Passage between Africa and America. In their efforts to suppress the trade, antislavery activists asked Atlantic consumers to boycott goods associated with slave labor. But for the boycott to be effective, consumers needed to be certain the goods they bought were ethically produced. Antislavery traders thus began to source free-labor goods and, in early branding efforts, to identify them with labels such as “made by escaped slaves,” spawning an ethical-goods economy. In a parallel development, West African Islamic jihadists attacked local consumption of luxury goods and the international slave trade that supported those tastes, eventually banning the sale of slaves to non-Islamic traders and organizing boycotts of European goods such as tobacco and alcohol.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

When the Revolution Left Kate Millett Behind

From Public Books:

In March 1979, the white American feminist Kate Millett landed in Tehran, in the wake of one of the most significant revolutions of the 20th century. Just weeks earlier, the Shah—the monarch of Iran—had been overthrown. Millett arrived with a suitcase of recording equipment and her partner, filmmaker Sophie Kier. While there, Millett methodically recorded her whispered reflections on everything around her: the cups of tea with her hosts, the hours stuck in traffic, and the International Women’s Day celebration, which exploded into major protests against Ayatollah Khomeini’s new mandatory veiling laws.

Millett’s whispers were the raw material for her own Going to Iran (1982), but they have been newly transcribed and examined by Negar Mottahedeh in her new book, Whisper Tapes: Kate Millett in Iran. With the same recordings, Mottahedeh does something in Whisper Tapes that Millett never could. She listens closely to the women speaking, yelling, and demonstrating in Farsi around Millett, centering their voices in a radically new and vital account of the revolution.

By exploring the complexities of what Millett couldn’t hear, Whisper Tapes also reveals the narrowness of her white feminism and her lack of reciprocity. Yet, there is no need to “cancel” Kate Millett (who profoundly contributed to both feminist and literary theory, not least in her pathbreaking 1970 work, Sexual Politics). Instead, it is necessary to explore her particular brand of white, Western feminism critically, asking what Millett’s brief time in Iran might offer contemporary understandings of feminist solidarity.

. . . .

This paradox—between the book’s centering and decentering of its subject—mirrors a wider paradox: the tension between the alleged universalism of Millett’s feminism and the increasingly particular way in which she pronounces it. We might read Millett’s paradox against and alongside a revolutionary slogan pulsing throughout Mottahedeh’s book—one that Millett only “provisionally understood”: “Azadi, na sharghist, na gharbist, jahanist,” or “Freedom is neither eastern nor western, it is planetary.”

. . . .

Mottahedeh identifies the conceptual problem of misunderstanding—of a white ally who just doesn’t get it—largely as a problem of mistranslation. In an elegant anecdote, Millett is given some chaghaleh badoom, which she says on the tape are “beans,” while the women around her suggest—in English—that they are “walnuts.” In fact, neither translation is adequate, Mottahedeh tells us; the best approximation is “green, unripe almonds.”

A thirsty Millett struggles to comprehend going “through a whole revolution and not being able to have a glass of wine after it’s all over?” She declares Iran “joyless,” not understanding that there are so many versions of the good life, so many ways to be joyful. Millett can’t quite grasp why the revolution happened alongside, and so also included, men. “It’s important to ignore men,” she advises a demonstrator. “He is never gonna listen. Why waste your time?”

Millett’s unfamiliarity with Iran allows us, with the benefit of hindsight, to laugh at her presence as an awkward white woman. But this is not really the problem with Millett’s white feminism. What white feminism means, at least in the context of Whisper Tapes, is that Millett considers patriarchy to be the primary organizing structure in women’s lives, globally. This is despite the interactions she has with women who explain otherwise.

Millett reads Iranian women’s heterogeneous experiences of religion, demonstration, and revolution through this lens, and only this lens. It is this focus on patriarchy that allows her to quickly diagnose the women of Iran as being behind white American women on the path of liberation; the path that she herself, through Sexual Politics and her work in the women’s liberation movement, helped to pave. Millett’s white feminism means that she applies the logic and schedule of US women’s liberation to the Iranian revolutionary moment.

Mottahedeh’s careful treatment of Millett reveals that “white feminism” is not just a scolding charge. Instead, Millett’s white feminism is a generative and persistent world view that creates particular behaviors, blinkers, and blinds, while simultaneously proclaiming to be a universalist politics that speaks for all women. It means that Millett’s “ambitions and preoccupations are elsewhere.” She is always waiting for the moment of a radical global women’s uprising. She is “out of sync with what is right in front of her,” be it green-shelled, unripe almonds in their crinkled paper bag, or men’s crucial place alongside women in the ongoing Iranian revolution.

Kate Millett certainly does not understand that she is imposing a presumed universality steeped in the specificity of the American context. Indeed, this is just one of the things that she does not get.

Link to the rest at Public Books

No Time But the Present

From Harper’s Magazine:

From Breaking Bread with the Dead: A Reader’s Guide to a More Tranquil Mind, which was published last month by Penguin Press.

Navigating life in the internet age is a lot like doing battlefield triage. There are days we can’t even put gas in our cars without being assaulted by advertisements blared at ear-rattling volume. And so we learn to be ruthless in deciding how to deploy our attention. We only have so much of it, and often the decision of whether or not to “pay” it must be made in an instant. To avoid madness we must learn to reject appeals for our time, and reject them without hesitation or pity.

Add to this problem of information overload what the sociologist Hartmut Rosa calls “social acceleration,” the widespread belief that “the ‘tempo of life’ has increased, and with it stress, hecticness, and lack of time.” Rosa points out that our everyday experience of this acceleration has a weirdly contradictory character. On the one hand, we feel that everything is moving so fast, but we simultaneously feel trapped in our social structures and patterns of life, imprisoned, deprived of meaningful choice. Think of the college student who takes classes to prepare for a job that might not exist in a decade. To her, there doesn’t seem any escaping the need for professional self-presentation; but there also doesn’t seem to be any reliable means of knowing what form that self-presentation should take. You can’t stop playing the game, but its rules keep changing. There’s no time to think about anything other than the Now, and the not-Now increasingly takes on the character of an unwelcome and, in its otherness, even befouling imposition.

William James famously commented that “the baby, assailed by eyes, ears, nose, skin, and entrails at once, feels it all as one great blooming, buzzing confusion.” But this is the experience of everyone whose temporal bandwidth is narrowed to this instant.

What do I mean by “temporal bandwidth”? I take that phrase from one of the most infuriatingly complex and inaccessible twentieth-century novels, Thomas Pynchon’s Gravity’s Rainbow. Fortunately, you don’t have to read the novel to grasp the essential point that one of its characters makes:

“Temporal bandwidth” is the width of your present, your now. . . . The more you dwell in the past and in the future, the thicker your bandwidth, the more solid your persona. But the narrower your sense of Now, the more tenuous you are. It may get to where you’re having trouble remembering what you were doing five minutes ago.

Increasing our temporal bandwidth helps us address the condition of frenetic standstill by slowing us down and at the same time giving us more freedom of movement. It is a balm for agitated souls.

Link to the rest at Harper’s Magazine

The Innovation Delusion

From The Wall Street Journal:

“Do you ever get the feeling that everyone around you worships the wrong gods?” So ask Lee Vinsel and Andrew L. Russell in the first pages of “The Innovation Delusion.” They are consumed by this question, convinced that America has been seduced by the false charms of innovation, causing us to chase novelty and pursue disruption while neglecting maintenance and infrastructure in both the public and private sectors. We end up discounting the value of “the ordinary work that keeps our world going.” Anyone compelled to “ideate” at a corporate breakout session can surely relate.

Agitated by Walter Isaacson’s triumphalist portraits in “The Innovators” (2014), Messrs. Vinsel and Russell, scholars of the history of technology, became increasingly troubled by what they saw as a broad cultural emphasis on “the shiny and new.” They started to wonder why no one ever celebrates the “bureaucrats, standards engineers, and introverts” who manage to keep established systems running smoothly. We live in an inverted world, they say, where “our society’s charlatans have been cast as its heroes, and the real heroes have been forgotten.”

In this dystopian view, we’ve mistaken novelty for progress and, in the desperate pursuit of growth, confused true innovation—creating things that work—with fraudulent “innovation-speak.” The result is, as the authors put it, an “unholy marriage of Silicon Valley’s conceit with the worst of Wall Street’s sociopathy.” Champions of change—like the late Harvard professor and father of disruptive innovation, Clay Christensen, and the influential thinkers at IDEO, the Palo Alto, Calif., design firm—have garnered hefty consultant fees while offering, the authors contend, little of true substance in return. Despite the frenetic pursuit of innovation stoked by the fear of missing out, “we should resist the notion that anyone on this planet knows how to increase the rate and quality of innovation.”

Privileging innovation, the authors note, costs us all. Localities find it far easier to attract federal funding for new infrastructure projects than to secure support for maintaining what already exists. And the funding for new development typically comes without the resources for downstream maintenance, saddling municipalities with unmanageable future obligations. Better for communities first to fix what’s broken, Messrs. Vinsel and Russell argue, and practice preventive maintenance. In any case, resources should be focused on what matters: Transit riders, one survey revealed, care most about service frequency and travel time, not power outlets and Wi-Fi.

The authors’ most emphatic recommendations involve talent—and our perception of it. When we overvalue innovation, they say, we forget that the vast majority of engineers will wind up maintaining existing systems, not coming up with the next Facebook. While we revere and reward data scientists and algorithm developers, we overlook the humble IT workers who keep our networks humming. Many students who might find “more joy, meaning, and pleasure” working in maintenance roles are shunted toward innovation careers sure to make them miserable. A rebalancing of our priorities is in order, Messrs. Vinsel and Russell contend.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Nonfiction, and the Past as ‘a Foreign Country’

From Publishing Perspectives:

Nonfiction publishing is often perceived by the outside world as somehow more predictable and less risky than fiction.

It’s certainly the case that betting on untried novelists is very high-risk. It’s even the case that lashing out large advances on established writers carries a degree of risk, should the writer have a declining fan base or say something to upset a special-interest group.

While some areas of nonfiction appear to be risk-free, it’s remarkable how often that’s only an appearance. The reality is that nonfiction is just as risky as fiction, just as hard to predict, and just as affected by vogue, trends, and demographics.

Per the opening line of L. P. Hartley’s 1953 novel The Go-Between, “The past is a foreign country: they do things differently there.” But perhaps we can learn something from the past while we look to the future and try to discern how the publishing landscape might look once the clouds of the COVID-19 coronavirus pandemic have cleared.

. . . .

The first lesson comes from two of my ancestors.

Garment manufacturing in Clapham Square, London, 1910. Image by Toby Charkin, provided by Richard Charkin

One owned a wedding dress factory in Margaret Street, which was then the clothing district of London. The other oversaw the manufacture of underwear and dresses in Clapham Square for the city’s burgeoning middle class. The business principle that guided them both was, “The one who makes the most money is the one who is first out of a fashion craze—not the first in.”

The same might be true in nonfiction publishing.

. . . .

The first book craze I can remember was for illustrated nostalgia. Edith Holden’s The Country Diary of an Edwardian Lady . . . was the front-runner, followed by Flora Thompson’s Lark Rise to Candleford. They sold in the hundreds of thousands by satisfying a desire to enjoy what appeared to be a simpler and more perfect world.

Of course, publishers being the owners of excellent rear-view mirrors, they piled into the genre until the profit was eviscerated.

And there was the computer books boom of the 1980s: Fortran, Cobol, and all that.

. . . .

Right now, there seems to be an unquenchable thirst in the English-speaking world for books on politics, political scandal, and racial inequality.

. . . .

The overdue realization that health really matters will ensure that governments will prioritize the funding of primary and secondary health care, public health, and communication with the general public.

I suspect that there will be growing demand in digital and print for reliable and comprehensible information, formerly known as popular medicine.

In addition, we’ll see renewed research activity into all aspects of infectious disease and epidemiology with consequent growth in high-level open-access research and review publications. The distinction between general public, professional, and research information about health will narrow as more people want to know more about their own health—and as more professionals understand the need to communicate outside their own specialist communities.

. . . .

In this age of uncertainty, people will turn to books about happiness, de-stressing, self-awareness, empathy, and human interaction. “Mind, body, spirit” will emerge as a major genre, challenging even the dominance in British bookshops of celebrity-led cookery books.

Link to the rest at Publishing Perspectives

‘Twilight of the Gods’ Review: A Blood-Soaked Peace

From The Wall Street Journal:

A tale-telling axiom holds that complex narratives—whether from a writer’s quill, the pulpit or a Hollywood storyboard—are best broken into threes. From Sophocles to Coppola, the trilogy has thrived as a means to carve an enormous meal into manageable courses.

World War II, history’s most complex bloodbath, often seems to require such treatment, and over the decades the war’s two billion individual stories have been compiled into dozens of memorable (and not-so-memorable) three-volume sets. The best known of recent threepeats is Rick Atkinson’s “Liberation Trilogy,” a brilliant study of the U.S. Army in Europe and the Mediterranean. James Holland (“Normandy ’44”) has released two of his three volumes on the Anglo-American war against Germany, and for the hard-core history geek, David M. Glantz offers a three-part deep dive into the Stalingrad campaign. Novelist James Jones (“From Here to Eternity”) drew the Pacific War’s thin red line through three volumes, while respected historian Richard B. Frank (“Tower of Skulls”) recently launched his first of three volumes on the Asian-Pacific struggle. It remains to be seen whether Mr. Frank’s series will rise to the level of Ian W. Toll’s Pacific War trilogy, now capped by “Twilight of the Gods.”

Mr. Toll, who has spent his literary career chronicling the U.S. Navy, built a solid foundation for the war’s final act in the first two volumes. The opening work, “Pacific Crucible” (2011), spanned the Navy’s disaster at Pearl Harbor to its redemption at Midway. The second installment, “The Conquering Tide” (2015), spotlighted America’s hard-won education in amphibious landings, from the six-month charnel house of Guadalcanal to the red-tinged tides of Guam. In “Twilight of the Gods,” he carries the reader through the war’s violent death rattles, spanning Peleliu to Okinawa.

The Pacific War’s complexity—and brutality—resist detailed depiction. The 8,800-mile American odyssey from Pearl Harbor to Tokyo Bay was dominated by saltwater, airstrips and islands few had heard of before 1941. Chinese, Dutch, Australians, Indians, Filipinos, British, Burmese and New Zealanders played major supporting roles in a conflict we often think of today as “U.S. versus Japan.” Setting the table of personalities, objectives, resources and innovative weapons systems is an immense job for any historian.

. . . .

As his narrative rolls through the Philippine Sea, Peleliu, the Philippine islands, Luzon, Iwo Jima and Okinawa, Mr. Toll introduces the reader to America’s battle captains of the waves. Adm. Raymond Spruance, commander of the Fifth Fleet, was an eccentric thinker who delegated nearly everything to his subordinates. “Spruance did not fit the conventional mold of a wartime fleet commander,” Mr. Toll writes. “He was aloof, introverted, and monkish. . . . On an average day at sea, Spruance paced for three to four hours around the forecastle of the Indianapolis while dressed in a garish Hawaiian floral-print bathing suit, no shirt, white socks, and his regulation black leather shoes.” Yet, he continues, Spruance’s “insistence upon delegating authority down the line of command tended to bring out the best in subordinates.” Because Spruance’s résumé included spectacular victories at Midway and the Philippine Sea, Roosevelt would tolerate eccentricities.

Third Fleet’s Adm. William Halsey, nicknamed “Bull” by the press, jumps off the pages as an instantly likeable, Pattonesque leader whose reputation was cemented with his victory at Leyte Gulf, one of the largest naval battles in history. “He was a profane, rowdy, fun-loving four-star admiral who laughed at jokes at his own expense and fired provocative verbal salvos against the enemy.” His rapport with the press would yank him out of trouble on more than one occasion and propel him to the rank of five-star fleet admiral.

. . . .

Mr. Toll’s interest in the evolution of weaponry dots the pages of “Twilight of the Gods.” The big Essex-class aircraft carriers and their unruly children, Hellcat fighter-bombers, play critical roles, as does the ultimate piece of the war’s power game: the atomic bomb. Doppler radars, proximity fuses, air-dropped mines and napalm raise the curtain on modern warfare. Carrier combat, no longer the “whites of the eyes” affair of 1941, morphed into a long-range campaign in which, Mr. Toll notes, “often the crews of the ships did not even lay eyes on a hostile plane.”

Yet on the ground Marines laid eyes on many enemies, human and natural. On Peleliu, a wasteland Mr. Toll compares to J.R.R. Tolkien’s Mordor, “clouds of large greenish-blue flies fed off the unburied dead and tormented the living. Sudden torrential rainstorms came in the late afternoon, and sometimes at night. There was no escape from the relentless artillery and mortar barrages.” Worse horrors faced the doomed enemy: “When the guns paused, the marines could hear wounded and dying Japanese crying out in the night. Often they cried for their mothers, as did dying men of all races.” Of the cave-dwelling Japanese defenders of Iwo Jima, he writes: “The noise and blast concussions took a steady toll on their nerves, and many were reduced to a catatonic stupor. Their subterranean world grew steadily more fetid and unlivable. There was no way to bury the dead, so the living simply laid them out on the ground and stepped around them. The stench was unspeakable.”

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

World War II in the Pacific theater covered almost unimaginable distances, particularly for the world of the early and mid-1940s.

The OP mentions the distance from Pearl Harbor, Hawaii, to Tokyo as 8,800 miles (not in a straight line, but presumably following the locations of the major battles in sequence). For comparison’s sake, the distance from London to Moscow is approximately 1,500 air miles. London to Shanghai via a direct flight is approximately 5,700 air miles. New York to Tokyo via a direct flight is approximately 6,700 air miles.

All through World War II, the only way to transport large numbers of soldiers or any significant amount of military equipment from the US to the site where needed was via ship. Most of the ships used for this purpose were a variant of the Liberty ship.

A Liberty ship cruised at about 11 knots (a bit less than 13 miles per hour). The distance between San Francisco and Honolulu is about 2,400 miles. The trip took more than a week via Liberty ship. The distance between Honolulu and Manila is about 5,300 miles. That trip took about 2.5 weeks. The potential for enemy submarine or air attacks that required evasive maneuvers added even more time.
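For readers who like to check the arithmetic, the travel times above follow directly from the quoted cruising speed. A minimal back-of-the-envelope sketch (the 11-knot speed and the mileages are from the text; the knot-to-mile conversion is the standard one):

```python
# Back-of-the-envelope check of the Liberty ship travel times quoted above.
KNOT_TO_MPH = 1.151            # one knot is about 1.15 statute miles per hour
cruise_mph = 11 * KNOT_TO_MPH  # ~12.7 mph, i.e., "a bit less than 13"

def days_at_sea(miles: float, mph: float = cruise_mph) -> float:
    """Days needed to cover `miles` at a constant cruising speed."""
    return miles / mph / 24

print(round(days_at_sea(2400), 1))  # San Francisco -> Honolulu: ~7.9 days
print(round(days_at_sea(5300), 1))  # Honolulu -> Manila: ~17.4 days, about 2.5 weeks
```

The results match PG’s figures: a bit more than a week to Honolulu, and roughly two and a half weeks onward to Manila, before allowing anything for evasive maneuvering.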

The same distances were also covered while delivering massive amounts of military equipment, supplies, ammunition, food, etc., etc., etc., necessary for the Marines, troops and sailors to live and wage war on the ground, at sea and in the air.

Since virtually all of the Pacific war was waged from island to island, everything and everyone had to be put back on ships, taken to the location for the next battle and unloaded once again. This process was repeated many times.

The Conspiracy on Pushkin Street: The Costs of Humor in the USSR

From The Los Angeles Review of Books:

“IF THEY COME for me, I won’t give you up. I won’t tell them what happened in this room.” Vasily Babansky let out a sigh and locked eyes with the four young men around him. It was February 1940 and 18-year-old Vasily had become increasingly sure that the NKVD was closing in on him.

The silence hung thickly in the air, so at odds with the laughter they usually shared here. The five students were gathered together in their usual haunt — one of the dormitories at the Zoological Institute in Stavropolsky District, southwestern Russia. The door was locked, as it always was when they wanted to speak freely, but now the bolt seemed woefully inadequate. If the NKVD was coming for them, all they could rely on was silence and their loyalty to one another.

Silence would be a problem, though. They’d have to tell the NKVD something if they were arrested; Stalin’s secret police didn’t take “No” for an answer. Aleksandr Mitrofanov proposed they should tell the truth, but not the whole truth — they would come clean about anything they’d said or done in front of witnesses at the Institute, “but keep quiet about what went on in our room,” recalled another of the students, Pavel Gubanov.

They all solemnly agreed, and then Mitrofanov rushed off to find the poem he’d written criticizing the Soviet regime. He was proud of his work, and the group had hoped to make anonymous copies and spread them across campus. Instead, after relocking the door behind him, he would ritualistically read the poem aloud one last time to his comrades, then set the paper alight and watch the flames consume his words.

It would be another 11 months before the NKVD descended, but when they did, the lives of these young men would be torn apart. Despite their earnest pact not to inform on each other, in the end they had little choice. The NKVD has gone down in history for its brutality and willingness to extract confessions by any means necessary. All five would break their vow of silence as the interrogators raked through the ashes of their lives at the Institute. Aleksandr Mitrofanov, Vasily Babansky, Mikhail Penkov, and Pavel Gubanov would all be sentenced for the crime of “anti-Soviet agitation” and for being part of a “counterrevolutionary organization” that, the authorities were sure, was actively plotting the downfall of the Soviet regime. Mitrofanov and Babansky received 10 years, Penkov eight, and Gubanov seven. The fifth man, Damir Naguchev, was for some reason treated with a touch more leniency: he received “only” three years for failing to denounce his comrades.

Locked doors, burnt evidence, and a plan for resisting interrogation: at first glance, it certainly sounds like conspiracy was afoot at the Institute. But if we take a closer look at the evidence left behind in the formerly secret Soviet archives, the fate of these five teenagers reveals a very different story. A story of how, under Stalin, a poem, a few jokes, and five open minds could spell disaster.

Link to the rest at The Los Angeles Review of Books

As PG reviewed the OP, he thought of a novel he is reading that is set, in part, in the all-female 588th Night Bomber Regiment of the Soviet Red Army Air Forces. One of the characters has to flee her regiment because her father has been convicted and executed for anti-Soviet, anti-Stalin outbursts, and all of his family members are to be arrested.

Back to the 588th regiment. This group, which flew all of its bombing missions at night, was called the Nachthexen, or “night witches,” by its targets in the Wehrmacht because the whooshing noise the wooden planes made as they dived into their attacks resembled that of a sweeping broom. It was the only noise the planes made, because the pilots were instructed to idle their engines at altitude before beginning the glide that carried their bombs down onto the German troops.

The antiquated bombers, 1920s bi-plane crop-dusters that had been used as trainers before being repurposed for night bombing, were effectively invisible to German radar and infrared defense systems. They were unarmored, built of plywood with canvas stretched over it, and most carried no guns for defense: machine guns and ammunition would have been too heavy to carry in addition to the single bomb attached under each wing. Parachutes were also too heavy to carry.

These planes had a top speed of 94 mph and a cruising speed of 68 mph. The most common German fighter the Night Witches faced in battle was the Messerschmitt 109, which had a top speed of 385 mph. The bombers’ maximum speed was below the stall speed of the German fighters, which meant, ironically, that the German pilots could not slow down enough to stay behind the wooden planes, making them very hard to target.

The Night Witches continued their attacks through three winters: 1942-43, 1943-44, and 1944-45. Their planes had open cockpits and no insulation, exposing their pilots and navigators to almost unimaginably bitter cold. During those Russian winters, the planes became so cold that touching them with bare skin could tear it off.

The Night Witches ended the war as the most highly decorated female unit in the Soviet Air Force.

On average, each pilot/navigator crew flew about 800 missions. For comparison, United States heavy bomber crews were obligated to fly between 25 and 35 combat missions.

(In fairness, since the Night Witches were stationed at airfields so close to the front lines, the flight time spent on each of their missions was much shorter. Many US crews spent far more time in the air because their missions involved much longer flights to reach their targets. On the other hand, the Night Witches were under direct enemy fire far more frequently, sometimes flying 8 missions in a single night.)

A Place at the Table

From Publishers Weekly:

Chef, restaurateur, and TV personality Marcus Samuelsson began working on his latest cookbook, The Rise (Voracious, Nov.), three years ago. A celebration of Black cooking, the book brings together chefs, food writers, and activists to share their stories and recipes, and emphasizes the diversity of the Black American experience. “There wouldn’t be American food without the contributions of Black people,” Samuelsson says. “[This book] is an opportunity to give authorship and recognition.”

The Rise arrives at a moment of racial reckoning in the U.S. more broadly, and in food media specifically. In May, cookbook author and Instagram star Alison Roman was placed on temporary hiatus from her New York Times column after mocking the achievements of Marie Kondo and fellow cookbook author Chrissy Teigen, both women of color. Weeks later, Adam Rapoport resigned from his position as editor-in-chief of Bon Appétit after a 2004 photo of him in brownface surfaced, which in turn opened up a public discussion about pay inequity in the magazine’s test kitchen. Subsequently, four on-screen personalities of color declined to participate in the brand’s popular video series, and the magazine’s only two Black editorial staff members quit.

“This moment is important; the world is watching,” says Samuelsson, who on August 17 was named Bon Appétit’s first brand advisor. “To be able to uplift Black stories of craftsmanship is important. I feel honored and privileged.”

Link to the rest at Publishers Weekly

The 22-Year-Old Blogger Behind Protests in Belarus

More under the category, “There are worse things than Covid.”

From The Atlantic:

In the videos posted last Sunday from Belarus, thousands of people can be seen streaming into the center of Minsk, walking up the broad avenues, gathering in a park. In smaller cities and even little towns—Brest, Gomel, Khotsimsk, Molodechno, Shklov—they are walking down main streets, meeting in squares, singing pop songs and folk songs. They are remarkably peaceful, and remarkably united. Many of them are carrying a flag, though not the country’s formal flag, the red and green flag used in the Soviet era. Instead, they carry a red-white-red striped flag, a banner first used in 1918 and long associated with Belarusian independence.

It was a marvelous feat of coordination: Just as in Hong Kong a few months ago, the crowds knew when to arrive and where to go. They knew what they were marching for: Many people carried posters with slogans like leave—directed at the Belarus dictator/president, Alexander Lukashenko—or freedom for political prisoners! or free elections! They carried the flag, or they wore red and white clothes, or they drove cars festooned with red and white balloons.

And yet, at most of these marches, few leaders were visible; no one ascended a stage or delivered a speech into a microphone. The opposition presidential candidate, Sviatlana Tsikhanouskaya, who probably won the contested election held on August 9, fled the country last week. How did everyone know exactly what to do? The answer, improbably, is a 22-year-old blogger named Stsiapan Sviatlou, who lives outside the country and runs a channel called Nexta Live on the encrypted messaging app Telegram.

On Sunday morning, Nexta—the word means “somebody”—posted a red and white invitation to the march. “Ring the doorbells of your neighbors, call your friends and relatives, write to your colleagues,” the message instructed them: “We are going EXCLUSIVELY peacefully to the center of town to hold the authorities to account.” The invitation also contained a list of demands: the immediate freeing of political prisoners, the resignation of Lukashenko, the indictment of those responsible for a shocking series of political murders.

People went to the Minsk march, and to dozens of smaller marches across the country, because they saw that message. On subsequent days, many went on strike because they saw another message on that channel and on channels like it. Over the past 10 days, people all across Belarus have marched, protested, carried red and white flags and banners, and gathered at factories and outside prisons because they trust what they read on Nexta. They trust Nexta even though Sviatlou is only 22 years old, even though he is an amateur blogger, and even though he is outside the country.

Or to put it more accurately, they trust Nexta because Sviatlou is only 22, and because he is an amateur who lives outside the country. In Belarus, the government is a kind of presidential monarchy with no checks, no balances, and no rule of law. State media are grotesquely biased: Memo98, a media-monitoring group, reckons that Belarus state television devoted 97 percent of all political news programming to Lukashenko in May and June, with only 30 seconds devoted to opposition presidential candidates. Political leaders in Belarus are routinely repressed, and their voices are muffled: Tsikhanouskaya was running for president because her husband, Siarhei Tsikhanouski, was arrested before he could start his own presidential campaign. Other candidates and politicians were also arrested, along with their staff. Some are still in prison. Human-rights groups have evidence of torture.

. . . .

Paradoxically, the Lukashenko regime is also the source of his unusual power. By suppressing all other sources of information, it has given him unprecedented influence. This also has its downsides. One member of the tiny but determined community of independent journalists in Belarus—I am leaving him unnamed because he remains in Minsk—pointed out that the administrators of Telegram channels outside the country (Sviatlou is one of several) have no way to check whether what they are publishing is true, and no way to coordinate what they are doing with anyone else. Although he does communicate with other channel administrators, as well as with coordinators in Minsk, mistakes are sometimes made. A couple of days ago, crosscurrents of information nearly led one group of opposition protesters into a public brawl with another.

Link to the rest at The Atlantic

Belarusian Writers Stand By Their People

From Publishing Perspectives:

As the Belarusian election protests expand, opposition leader Svetlana Tikhanovskaya has spoken from her exile in Lithuania today (August 21). She is urging widening strikes, asking citizens not to be “fooled by intimidation,” as reported by the BBC.

Days before the now-disputed election of August 9, PEN International issued a joint statement, writing for the 24 PEN centers that stand in various parts of the world. That statement of concern is focused on what the Belarus PEN Center’s Sviatlana Aleksievič has said were two dozen political prisoners whose freedom of speech the center believed was being suppressed.

“Among them are bloggers and journalists, patrons of culture,” Aleksievič wrote, “those who in 2020 awakened the Belarusian society, and for the first time in 26 years create serious competition for the authoritarian regime of Aliaksandr Lukašenka [longtime Belarusian president Alexander Lukashenko].

“Today, Belarusian writers stand by their people as the story is being written in the streets and squares, not at desks.”

. . . .

Indeed, this morning (August 21), The Economist (which does not allow its writers bylines) has released a story describing, as other media have reported, how “prisoners were forced to kneel with their hands behind their backs for hours in overcrowded cells. Men and women were stripped, beaten, and raped with truncheons.”

“The repression was ostentatious,” The Economist piece continues. “Some victims were paraded on state television. By August 19, at least four people had been killed. The aim was both to terrorize citizens and to bind the regime’s officers by having them commit atrocities together, a tactic used by dictators and mafiosi to prevent defections.”

Link to the rest at Publishing Perspectives

Alert visitors will have noted that there are worse things than Covid.

The Medieval University Monopoly

From History Today (not really to do with books, but PG found it interesting):

In June 1686, a small family – a clergyman, his wife, and their daughter – disembarked from a ship at the docks of Boston, Massachusetts. They had just finished a long journey of a month or more across the Atlantic, escaping from England. The clergyman, a scholarly 60-year-old named Charles Morton, was fleeing prosecution. His crime? Teaching students – or, more specifically, teaching students in north London.

From 1334 onwards, graduates of Oxford and Cambridge were required to swear an oath that they would not give lectures outside these two English universities. It was a prohibition occasioned by the secession in 1333 of men from Oxford to the little Lincolnshire town of Stamford. They were escaping the violence and chaos which often attended medieval university life – the frequent battles between students, and between students and other communities within the town – the same conditions, in fact, which had led an earlier generation of scholars to up sticks and leave Oxford for Cambridge. But their action now threatened both universities, and so the Stamford experiment had to be suppressed. The sheriff of Lincoln, the lord chancellor, even the king, Edward III, were all called into play and the result became known as the ‘Stamford Oath’; an oath which Oxford and Cambridge graduates continued to swear until 1827.

It is true to say that Charles Morton was unusually unlucky in being prosecuted for breaking this oath by establishing his own academy at Newington Green in London. His evident success in recruiting numerous and impressive students, like Daniel Defoe, was part of the problem, as were his staunchly Presbyterian religious beliefs and his radical, republican political views. But the depressing effect of the Stamford Oath was undeniable and its symbolism inescapable. Repeated at each graduation and reinforced by successive revisions of both universities’ statutes, it made their determination to preserve a duopoly in higher learning absolutely plain.

This was in sharp contrast to the European experience. Just as Oxford and Cambridge were establishing and policing their unique right to produce graduates, ever-growing numbers of universities were being founded across the Continent. In the 14th century new institutions appeared in towns from Pisa to Prague, from Kraków to Cahors. In the years that followed, the gap in numbers between English universities and those on the Continent grew even greater, with over 100 founded or refounded in Europe after 1500. Oxford and Cambridge remained the only universities in England. Indeed, even as Morton’s teaching career began in the mid-17th century, universities were springing up in such unlikely places as the small towns of Prešov in Slovakia and Nijmegen in the Netherlands. The English experience was also very unlike that of the Scots, who acquired five universities between 1451, when Glasgow opened, and 1582, when Edinburgh was established.

. . . .

In the first place, there is the question of why it was that Oxford and Cambridge were so keen to suppress other universities. Secondly, there is the question of how they succeeded. Finally, just as importantly, and perhaps even more interestingly, there is the question of what changed to make them reverse this position so comprehensively in the years after 1827.

In some respects, the question of why Oxbridge was so jealous of its status seems the easiest to answer. In the most general terms, it makes sense for the providers of an exclusive product – a university degree, say – to take action to preserve their exclusivity. Universities were originally little more than a sort of trade guild, a separate group of masters and their students, who controlled admission, regulated quality and negotiated with the local authorities. Just as butchers and bakers sought to restrict the supply of their skills, so masters within the university hoped to protect their distinctive rights. These privileges were threatened by rivals. Oxford and Cambridge continued to act like guilds long after they lost or forgot their origins. Thus it was that even in the 17th century they fought off attempts by places as various as Carlisle and London, Ripon and Shrewsbury to establish their own institutes of higher learning. Thus it was that they crushed the nascent Durham University in 1660. And thus it was that they pursued poor Charles Morton.

. . . .

The answer is control. Just as the two universities wanted to control the supply of teachers and students, so the English Church and state wanted to control the universities. Universities could be – indeed, were – the source of dangerous heresies, where people learnt to think the wrong things. Oxford gave birth to the reforming, proto-Protestant Lollard movement in the 14th century. Cambridge was home to an alarming nest of evangelicals – humanist-inspired converts to church reform like the martyrs Robert Barnes (c.1495-1540) and Thomas Bilney (1495-1531) – 200 years later. With only two universities it was easier to control theological debate and even to use one of the institutions to oversee the other. It is no coincidence that the Cambridge-educated bishops Hugh Latimer and Nicholas Ridley, together with the Cambridge-educated archbishop Thomas Cranmer, were sent to loyalist, Catholic Oxford to be tried and burnt in the 1550s.

Link to the rest at History Today

Talking Back to Cookbooks

From The Wall Street Journal:

On a scorching hot day last week, I decided to make a cooling salad of roasted figs and onions with mint and green leaves, a recipe that caught my eye in the lovely new cookbook “Falastin” by Sami Tamimi and Tara Wigley. After I started, I realized that I had only half as many fresh figs as I needed. I also didn’t have the radicchio or walnuts or goat’s cheese that the recipe stipulated.

In the past, I might have anxiously rushed to the store to get exactly the “right” ingredients. But this is 2020, and new rules apply. I doubled up on onions to make up for the missing figs, subbed in feta for the goat’s cheese, used lettuce instead of radicchio and toasted cashews in place of the walnuts. The rest of the recipe—the dressing, the cooking times—I followed to the letter. It may not have been quite what the authors intended, but I put a Post-it Note in my copy of “Falastin” saying that it was still one of the best salads I’ve made all summer.

When we finally resurface from this pandemic, one of the many things that will have changed is our relationship with recipes. Through necessity, we have been forced to become more experimental cooks and start talking back to our cookbooks. This is a good thing, if you ask me.

For years, many of us tortured ourselves with the idea that recipes were stone-carved commandments issued from on high by godlike chefs. But a recipe is more like a never-ending kitchen conversation between writer and cook than a one-way lecture. Recipes were originally designed to help people remember how to cook something rather than to give them exact blueprints. When something in a recipe doesn’t work for you, for whatever reason, you are free to say so and make it your own.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

All Tomorrow’s Warnings

From Public Books:

Lamenting the shortsightedness of environmental policy—in 1971—U Thant, Secretary-General of the UN, deployed a by-now familiar move from the playbook of ecological advocacy. He looked to the future:

When we watch the sun go down, evening after evening, through the smog across the poisoned waters of our native earth, we must ask ourselves seriously whether we really wish some future universal historian on another planet to say about us: “With all their genius and with all their skill, they ran out of foresight and air and food and water and ideas,” or “They went on playing politics until their world collapsed around them.”

Despite its familiarity, U Thant’s statement here is rhetorically complicated. And it continues to inform efforts to tackle climate change to this day. Rather than castigating the present from some abstract future perspective, contemporary environmental defenders often follow U Thant’s lead, grounding their judgments in a singular figure—“some future universal historian,” geologist, or brother from another planet—who sifts through our future remains with fascination and disbelief. The question, then as now, is simple: Will we collectively close our eyes to the future dangers barreling toward us?

But such a question leads inevitably to a second, perhaps even more pressing question: How can we create a scientifically informed history of the future? This question has galvanized a slew of contemporary writers, filmmakers, and activists, who, echoing U Thant’s warning, are turning to speculative nonfiction, a genre that strives to document the years ahead. The vogue for histories of tomorrow is driven primarily by climate breakdown and the Anthropocene. Such anticipatory histories seek to counter a disastrous temporal parochialism unequal to the demands of a warmer, more insecure world. Nonfictional forays into the future tend, on the one hand, to warn us of coming disasters and, on the other, to urge us to take action today.

In a spirit of anticipatory memory, writers, artists, and activists encourage us to own the future by inhabiting it in sample form. They encourage us to feel our way forward into the emergent worlds that our current actions are precipitating. They encourage us to break out of our temporal silos and—from our diverse Anthropocene positions—face the challenges that shadow the path ahead.

In the Anthropocene, Clive Hamilton observes, “the present is drenched with the future.” Despite that, powerful economic, technological, and neurological forces intensify our present bias, severing current actions from future fallout. The neoliberal fantasy of infinite short-term growth, the digital splintering of attention spans, and the rewiring of our brains for restless interruption: all favor dissociation. The average American, after all, checks their phone 150 times a day. A succession of staccato inputs now threatens to crowd out futures of remote concern—futures that seem immaterial, in both senses of the term.

Speculative nonfiction has no innate politics. After all, Big Oil has invested heavily in creating documentaries set in the future that present the companies’ energy trajectories in a glowing light. That said, it is progressives who, recognizing that the trend lines all point toward a warmer, less stable climate, have been most insistently adventurous in experimenting with this futuristic documentary form. Again and again, progressives have conscripted speculative nonfiction as an ally against short-term extractive economics, digital dispersion, political prevarication, and ethical inertia.

Link to the rest at Public Books

As he read the OP, PG reflected that readers generally have few problems accepting science fiction as fiction in part because it is set in the future and includes elements that do not exist (or do not exist in the form depicted) at the present time.