A Love Affair with Peru

From Women Writers, Women’s Books:

I was trekking the Inca trail, and the narrow dirt path followed alongside the torrential Apurímac River. The switchbacks were jagged, cutting through wild grasses, boulders, and twisted trees. The Salkantay mountain range loomed nearby, and snow-capped peaks and dramatic ridges reminded me of the vastness of the Andes. Soon the trail took our small group of family and friends to a higher elevation, and the river looked like a small snake below. The trail had widened, and the group of chatty and enthusiastic trekkers was at a fair distance from me.

Earlier that day, we had all witnessed a condor flying overhead. The huge bird glided through a stretch of the canyon, against a backdrop of cloudless blue and sun-lit, golden cliffs. It took our breath away. Walking now in silence, thinking back on the majestic condor, I said a prayer for my book. I had finished one of many drafts, and though it was far from the final manuscript, a sense of wellbeing and positivity rested in me. I felt sure my book would be published someday. What I didn't know was that it was still only the beginning of an endeavor that would take over a decade to achieve.

I had arrived in Peru fifteen years earlier, at the young age of twenty-two, well-traveled, fiercely independent, and recklessly adventurous, having already traveled around the globe, trekked the Himalayas, and worked with an environmental brigade in Nicaragua during the tail end of the Iran-Contra War. It was 1989 when I arrived; red zones marred the countryside, and two terrorist groups, Sendero Luminoso and the MRTA, wreaked havoc.

The economy was in shambles, and there were lines down blocks to buy staples like rice and cooking oil. Yet, the beauty and resiliency of the people captured my heart and imagination in a way I couldn’t have predicted. I was to study at La Católica University in Lima for one year, but to the surprise of everyone back home, I stayed beyond my year of studies to marry and start a family.  

My decision to stay involved a man, the father of my children, whom I have since divorced after a twenty-five-year marriage. What I understand now is that at the time, and at that young age, I was falling in love with and marrying not a person but a country, a culture, and an extended family. The family was involved in a small silk production enterprise in the Andes. I witnessed the life cycle of silkworms, from worm to moth, and how silk is produced from the cocoons. This large, boisterous, complicated Peruvian family was my love, and the life and stories they shared with me became a love affair.

Since that time, I have often asked myself: did I always want to be a writer, or did living in Peru spark an impulse to explore life, people, and the human condition in that deep and mysterious way that writers do? Perhaps it was because my first daughter was born there, so far from my native homeland, that I rooted myself so deeply in the people and culture. It was now my daughter's birthplace, her homeland, and I wanted desperately to share it with her.

Link to the rest at Women Writers, Women’s Books

Churchill & Son

From The Wall Street Journal:

When Randolph Churchill was a young boy in the 1920s, his father Winston stayed up with him one night, talking with him into the late hours. At 1.30 a.m., Winston—then in the political wilderness and not yet the man who would come to be seen as one of the greatest Britons in history—turned to his son and said: “You know, my dear boy, I think I have talked to you more in these holidays than my father talked to me in the whole of his life.”

Josh Ireland writes of this episode very early in “Churchill & Son,” his account of the relationship between Winston Churchill and his only male heir, and a reader would have to be ice-cold of heart not to pause to take a sorrowful breath. Winston Churchill had had a loveless childhood. He was ignored by his glamorous American-born mother and, most crushingly, cold-shouldered by his father, who barely spared his son a second glance.

Although Winston pined for his mother’s attention, he worshiped his father, Lord Randolph Churchill, a volatile and brilliant iconoclast who was appointed chancellor of the exchequer at the age of 36. But Winston’s filial adoration was not returned. Instead, when Lord Randolph did pay Winston any attention, it was to scorch him with put-downs and unfatherly contempt. As Mr. Ireland, a British journalist, notes wryly: The “best thing Lord Randolph ever did for Winston was to die young”—at age 45.

Winston would write a biography of his father, published in 1906, 11 years after Lord Randolph’s death. It is a book of many qualities, none of which includes, says Mr. Ireland, “the detachment of a professional historian.” As a cousin of Winston observed at the time: “Few fathers have done less for their sons. Few sons have done more for their fathers.” Reading Mr. Ireland’s book, it is tempting to conclude that the inverse of that judgment applies to Winston and his own son. Few fathers did more for their sons than Winston. Few sons have done less for their fathers than Randolph.

Winston was determined, writes Mr. Ireland, “that his son would not suffer the same neglect that had blighted his own childhood.” If his own father had poured scorn on him—describing him in a letter as “a mere social wastrel” destined for “a shabby unhappy & futile existence”—Winston constantly encouraged Randolph. In Mr. Ireland’s words, he “praised him, told him that the future was his to seize.” In the jargon of our times, Winston can be said to have overcompensated for his own desolate childhood by lavishing love on Randolph.

Did he give Randolph too much love? And was that love corrosive? “Winston was obsessed with his son,” Mr. Ireland says, and was “never more himself than in Randolph’s company.” As Randolph grew from boy to man, father and son spent so much time together, absorbed in conversation, “that they had come to inhabit the same mental space.” And when they communed, they shut out the rest of the world—including Clementine (Winston’s wife, Randolph’s mother) and Randolph’s three sisters.

As a child, the cherubic Randolph got more attention from his mother than his sisters did. “Clementine even breastfed him,” Mr. Ireland tells us. But as Randolph became as much a companion for Winston as he was a son, his mother began to resent him. She felt that Winston indulged Randolph to excess, failing to check his rudeness at table and his misbehavior in society. Winston was “consumed by his own sense of destiny” (in Mr. Ireland’s words), and Randolph was the “incarnation of his dynastic obsession.” So Winston placed him on a pedestal—one from which Randolph was wont to spit at the world, or even urinate upon it, as he did once on the heads of his father and David Lloyd George—the prime minister—from a bedroom window at the Churchill country home. Lloyd George thought it was a passing shower.

Yet the more Clementine criticized Randolph, the stronger Winston’s love for him seemed to become. Perversely, she blamed Randolph for Winston’s failure to discipline his own son. She felt that she was vying with Randolph for Winston’s attention. Ever the devoted wife, she had sacrificed her own needs to care for Winston’s many whims. But instead of paying her the attention she expected, Winston was, Mr. Ireland says, “infatuated by his glorious, golden, chaotic son.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The Light of Days

Comrades from the pioneer training commune in Białystok, 1938.

From The Wall Street Journal:

They were nicknamed the "ghetto girls," but the label does not do justice to the defiant, mostly forgotten Eastern European Jewish women in their teens and 20s who, acting in resistance to the Nazis, undertook one impossible mission after another to disrupt the machinery of the Holocaust and save as many Jews as they could.

Now, in her well-researched and riveting chronicle “The Light of Days,” Judy Batalion brings these unsung heroines to the forefront. She has recovered their stories from diaries and memoirs written variously in Yiddish, Polish and Hebrew, some composed during the war (one in prison, on toilet paper, then hidden beneath floorboards), others afterward, still more recorded in oral histories. This group portrait forcefully counters the myth of Jewish passivity, at once documenting the breadth and extent of Jewish activism throughout the ghettos—armed resistance groups operated in more than 90 of them, according to Ms. Batalion—and underlining in particular the crucial roles women played in the fight to survive. Indeed, several of the women whose stories Ms. Batalion tells also helped lead the most significant act of anti-Nazi Jewish resistance, the 1943 Warsaw Ghetto uprising, which is recounted here in brutal detail.

The tasks and responsibilities these female fighters took on were as myriad as the false Christian identities they adopted to avoid capture, their disguises so successful that one was even hired as a translator by the Gestapo in Grodno. But mostly they traveled, seemingly nonstop, to surrounding Polish towns and in and out of the barricaded ghettos that they managed, through bribes and stealth, to penetrate. In the ghettos the Nazis not only segregated Jews from Aryan society but also prevented evidence of the massive deprivations and punishments Jews suffered there from leaking to the world outside.

This Nazi-imposed isolation made the female couriers all the more welcome when they arrived, living proof that those locked inside the walls were not forgotten. During their visits, the couriers acted as “human radios,” carrying greetings from other ghettos, bringing warnings of forthcoming deportations to the death camps, and serving as liaisons coordinating the efforts of ghetto resistance cells with those of armed partisan groups in the forests. They also took on the grim responsibility of reporting the latest massacres and other atrocities against the Jews. The eyewitness testimonies they conveyed were harrowing. But rather than spread hopelessness among the ghetto population, the couriers often did the opposite, breeding greater determination to resist, to leave a legacy of action and defiance rather than submissiveness. As one ghetto slogan declared, “It is better to be shot in the ghetto than to die in Treblinka!”

Skilled black marketers, they also smuggled in food to supplement the ghettos’ ever-dwindling food rations; medical supplies to fight typhus and the other diseases that ran rampant amid appallingly cramped, broken-down living conditions; and as many rifles, pistols, bullets, grenades and bomb-building components as possible, to spark an uprising.

Behind all these operations lay a deftness and aptitude for creating and maintaining resistance webs and networks both within and among different ghettos, as well as with sympathetic Aryans throughout Poland. That is why they were also often described as kashariyot, the Hebrew word for “connectors.” It was through these links that they set up hiding places for Jewish children outside the ghetto, found safe houses to conceal resistance fighters, provided forged papers and plotted escape routes to Palestine, even facilitated prison breaks. Nor did they hesitate to take up arms themselves, leaving Nazi troops so surprised to see women wielding guns and grenades that one startled SS commander was left to wonder if they were “devils or goddesses.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

While the women who did fight endured a great many terrible travails, those who didn’t may have experienced worse.

Image: Stroop Report, Warsaw Ghetto Uprising (retouched version, with artifacts and scratches removed). Photographer unknown; Franz Konrad confessed to taking some of the photographs, and the rest were probably taken by photographers from Propaganda Kompanie nr 689. Uploaded by the United States Holocaust Memorial Museum. Public Domain, https://commons.wikimedia.org/w/index.php?curid=17223940

Using Novel Writing Techniques in Your Memoir

From Writers in the Storm:

I’ve spent much of our Covid year learning about, editing, and writing my own memoir. Memoir is a form I think every writer should try to tackle at least once. Everyone has a story to tell. The exercise of writing a memoir can sharpen our memories and force us to write outside our comfort zones—always good practice for a writer at any level. If you want to craft a memoir that is truly a page-turner, you can and should use many of your fiction writing tricks.

First Things First: What a Memoir Is and Is Not

It is important to know what a memoir is and is not. A memoir is not your autobiography. A memoir is a slice of your life at a particular time, in a particular place. It is literally your memories put to paper. Some memoirs cover a year in a person’s life. Some memoirs cover several years. Think in terms of a season of your life, rather than a finite block of days on the calendar.

Many new memoirists hamstring themselves by feeling they need to tell their entire life stories, nose to tail, David Copperfield-style. You do not. A memoir focuses on a theme, on a particular red thread that has wound through your life thus far. It is not a full accounting of all your sins and wins!

A memoir is not a journal entry, even though it is your story. You must write it so that a reader can benefit from it. There must be a compelling reason to keep them turning the pages, such as a lesson they can learn or inspiration for them to find. Memoir can feel navel-gazey in the writing process, but it should never feel navel-gazey on the page. (Yes, I know this is daunting! But persevere.)

What holds a memoir together is a story—your story.

Remember as you write each page that you are telling that story, not making a police report. You can change names to protect people’s privacy. And since you are working from memory, the story will have your slant—don’t feel you have to get every single angle on it. If you ask your family about the picnic you had that one day in 1972, you will get a different story from each member about that day, told from their perspective. Somewhere in the middle lies the truth.

Discover what your truth is and use your memoir to tell it.

An Inciting Incident: You Need One

Telling us about the time you went to the market after work and ran into a friend you hadn't seen since high school and you exchanged pleasantries with them is not a gripping inciting incident. Telling us about the time you went to the market after work, ran into a friend you hadn't seen since high school, and found out they needed a kidney is a start. Deciding to see if you were a match to help them because of that one time in school when they saved you from being assaulted by a teacher? That is a gripping inciting incident.

Don’t invent something that isn’t true, but when you sit down to comb through the sand of your life, you are searching for the pearl that you will hand to your readers. Think of the unusual things. If you don’t think there are any of those pearls, think again. Everyone has a story.

Once I sat in a hotel bar on a business trip and met seven different travelers, from seven different age groups, seven different places, seven different walks of life. Each and every one of them had a compelling story. You do, too. And if you write it well, people will want to read it.

Build Characters

Many new memoirists neglect to see that what they are crafting are characters (who just happen to be real people). You are the “main character” of your memoir.

This is tough for many writers. Do we ever really see ourselves completely objectively? Probably not. But we must do our best. Use the same techniques to craft interesting characters in your memoir that you do in your fiction writing. Make a list of who will appear on the stage of your memoir, and sketch them out, just as you would the players in your novel.

Link to the rest at Writers in the Storm

The Mystery of Harriet Cole

From Atlas Obscura:

If “Harriet” could hear, she might pick up the sound of ping-pong balls skittering across a table. If she could smell, she might detect a range of lunches being reheated in a nearby microwave. If her eyes could see, she might let them wander a busted Pac-Man machine, a TV, and a campus bookstore, decorated with a swooping, celebratory paper chain, like an elementary-school version of DNA’s double helix. She might even catch a glimpse of herself in a camera lens or an observer’s glassy eyeballs. People often stop to stare.

On a sweaty Saturday, before social distancing was the law of the land, a group of visitors gathered at Drexel University’s medical campus in Northwest Philadelphia to meet “Harriet.” The preamble to this encounter was a display case holding several unusual and meticulously prepared medical specimens, long used as teaching tools. Like “Harriet,” each had been created in the late 19th century by a star anatomist, Rufus Weaver. Now, behind glass, between the cadaver lab and a bookstore, a segment of intestine and a piece of a spinal cord sit in stillness. A dissected eyeball floats ethereally in century-old liquid, its separated parts looking like a tiny jellyfish, a bit of brittle plastic, a mushroom cap.

The visitors shuffled through the door and into the otherwise empty student center. They huddled on the low-pile carpet, nondescript in the style of a suburban office park, and peered at more of Weaver’s dissection work, which occupied a glass-fronted case. They surveyed a sinewy hand, ropey and purplish. Two skulls and necks. Then, “Harriet.”

Reactions rippled.


“Oh, wow.”

Quietly, “Poor Harriet.”

. . . .

“I’ve been meaning to find her,” said Malaya Fletcher, an epidemiologist in Washington, D.C., specializing in infectious disease. Fletcher remembered learning about the dissection in her high school biology class, and the story had stuck with her. “It’s just awesome,” she said. “You almost don’t believe it’s real.” The group crowded in close, lofting their cell phones above each other’s heads. They bobbed and weaved their raised hands, trying to take pictures without capturing their own flushed faces reflected in the glass.

“Harriet” is a network of fibers fastened to a black board in a case pushed up against a wall. At the top, there appears to be a brain, plump and brown, and a pair of eyes. Scan your own eyes down and you’ll encounter an intricate system of skinny, brittle cords, pulled taut and painted startlingly, artificially white. The outline is recognizably human—there’s the impression of hands and feet, the hint of a pelvis, the suggestion of a rib cage—but it is slightly fantastical, too. The way the cords loop at the hands and feet, it almost appears as if the figure has fins. Elsewhere, the fibers look shaggy, like chewed wire, as if electricity is shooting from the margins of the body.

This is a human medical specimen, in the spirit of an articulated skeleton. But unlike that familiar sight, it represents the nervous system, a part of the body’s machinery that most people have trouble even imagining. Some who stand before “Harriet” wiggle their fingers and toes, as if trying to map the fibers onto their own bodies and make the sight somehow less abstract.

Neighboring the display is a label that identifies the specimen as “Harriet Cole” and explains that she was a Black woman who worked as a maid or scrubwoman in a university laboratory at Hahnemann Medical College, died in the late 1800s, and donated her body to the medical school. Her nervous system, the story goes, was dissected by Weaver, then preserved and mounted as a teaching tool and masterpiece of medical specimen preparation.

Before the preparation wound up at this campus, more than a decade ago, it traveled to Chicago for the 1893 World’s Fair, where it won a blue ribbon. It starred in a multi-page feature in LIFE magazine and took up residence in academic textbooks. But before all of that—before the nerves were naked—the fibers animated and stimulated a body. In 2012, the university’s press office characterized the nerve donor as the school’s “longest-serving employee.”

. . . .

Researchers such as Herbison and McNaughton are neither anatomists nor ethicists: They didn’t elect to procure, dissect, and display a body, though they inherited the finished product. As caretakers of this object, they have accepted the mission of poking around in the historical record, cleaving fact from fiction, trying to piece together a fuller story of “Harriet Cole” in spite of official records that often omit women and people of color.

. . . .

Committed to resurfacing stories of women lost, warped, or overlooked in the archives, McNaughton, Herbison, and other collaborators, including medical historian Brandon Zimmerman, are trying to pin down specifics about “Harriet.” They’re wondering, more than 130 years later, how to describe the dazzling, jarring preparation, stripped of skin and pulled away from the bone. Whose body is this, and what would it mean if one of the university’s oldest fixtures never knew that she would spend her afterlife on display?

. . . .

At Hahnemann, Weaver was appointed custodian of the university’s anatomical museum in 1880, and busied himself assembling an anatomical wunderkammer with no rival. Gone were papier-mâché models and “musty,” dried-out specimens. Weaver filled the light-flooded, third-floor space with hundreds of new medical displays, many of which he prepared himself. His trove included bladder calculi, sections of healthy and diseased brains, and an entire uterus, partly consumed by a tumor and opened to reveal a six-month-old fetus. The anatomist imagined these—and the museum’s hundreds of other objects—as teaching tools instead of “mere ‘curiosities,’” according to an announcement circulated in the mid-1880s. Among the assortment, there was Weaver, described in 1902 by a reporter from The North American as a “little professor” brimming with “energy, originality, and vim,” “as cheerful and bright as a May morning,” and prone to speaking of his collection of “beautiful tumor[s]” with tenderness and awe. (“Here is a lung,” the reporter quoted him saying. “Isn’t that the handsomest thing that you ever saw?”) In one 19th-century photograph, Weaver poses next to a fresh cadaver, its chest pried open, while limbs dangle around it like cuts of meat in a butcher’s shop. The anatomist’s own bearing was stick-straight—perhaps an occupational hazard of standing above so many spinal columns.

. . . .

But these guides stop well short of how Weaver pulled his masterpiece off. The earliest description of Weaver’s work on the nervous system comes courtesy of Thomas, who described the process in an 1889 edition of The Hahnemannian Monthly, the school’s journal. But Thomas’s is a hazy picture, long on the basics of dissection and short on clarity about how Weaver managed to preserve delicate nerve structures while chipping or sawing bone apart. This must have been finicky work: The spinal cord—a hardy nerve bundle—is roughly as wide as your thumb. We don’t have the complete ingredients Weaver mingled in his preservatives, a full inventory of the tools he enlisted, or a meticulous record of which parts of the process proved surprisingly straightforward or especially thorny or vexing. We don’t have a precise timeline, either. As Thomas tells it, dissection began on April 9 and concluded by June, with mounting complete by September; years later, van Baun reported that the dissection alone took nearly seven months, and then it required “seventy days of unceasing, laborious, skilled work and supreme patience to get the specimen on the board,” for a total of “nine months of gruelling [sic] contest.”

Weaver is said to have spent up to 10 hours a day in his humid office, and reportedly spent two weeks just tussling with the bottom of the skull. Once “all the little branching strands … were laid bare,” The North American noted, Weaver attempted to keep them supple by swaddling them in alcohol-soaked gauze or wads of cotton, which needed frequent changing, and he covered the flimsy strands with rubber. He retrieved nearly everything but sacrificed the intercostal nerves, which run along the ribs and proved too difficult to wrangle. Weaver reportedly excised the brain but held on to the outer membrane, called the dura mater, and plumped it up with “curled hair” stuffing, stitched it closed, and returned it to the display. To showcase the optic nerves, Weaver left the corpse’s eyes in place and distended them “with a hard injection,” Thomas wrote.

Mounting the specimen—as Weaver later recalled to The North American—was far more “wearisome and exacting” than the dissection itself. Weaver apparently tacked the nerves in place with 1,800 pins, and then fixed every filament with a coat of lead paint. (Many of those pins were later removed, Thomas wrote, once the shellacked nerves dried and held their position.) In all, Weaver reportedly spent several months laboring over the body, with a break for a summer vacation. The ultimate result, Thomas wrote, was “perfectly clean and free from all extraneous tissues and smooth as threads of silk.”

. . . .

Some laypeople argued that a living patient was better off being treated by someone who had seen the body’s inner contents up close. In 1882, The Christian Recorder—the newspaper of the African Methodist Episcopal Church—endorsed dissection, suggesting that it would be foolish for anyone to seek treatment “at the hands of a man who had not gone through the mysteries of the dissecting room.” Still, even those who supported the notion of dissection typically did not want to entertain the thought of it happening to anyone they loved. The anonymous author of that article in The Christian Recorder skewered grave robbing on moral grounds and suggested that doctors be offered the bodies of executed murderers and anyone who died by suicide.

The few people who expressly permitted, or even beseeched, doctors to cut into them after death tended overwhelmingly to be white, wealthy, and accomplished men. By 1889, the new American Anthropometric Society, headquartered in Philadelphia, began compiling the brains of physicians and public intellectuals who embraced the ideas of phrenology, which correlated intellectual feats with cranial attributes. These donors were keen to join the organization’s “brain club” as a way to further the field while also valorizing themselves.

And in the medical realm, consent was slippery. William Osler, a founding professor of Johns Hopkins Hospital, was known to solicit family approval before giving cadavers to his students—but he was also famously dogged in his pursuit of that permission, and in a 2018 article in the journal Clinical Anatomy, Wright, the University of Calgary pathologist, notes that “autopsy consent and organ retention abuse was not uncommon in late-19th century Philadelphia.” In a 2007 Academic Medicine article about the uptick in body bequeathal in 20th-century America, Ann Garment, then a medical student at New York University, and three coauthors note that turn-of-the-century body donation was uncommon enough to make the news when it happened. The New York Times picked up the tale of Thomas Orne, a wealthy Maryland horse dealer who pledged his body to Johns Hopkins in 1899. In 1912, 200 New York City physicians also vowed to donate their bodies for dissection in an effort to erode the stigma around it.

. . . .

“I am aware that there have been men, [the philosopher Jeremy] Bentham for instance, who have voluntarily willed their bodies to be dissected, but they have been extremely few,” Sozinsky recounted in 1879. Opting in was far from commonplace. “The ‘Harriet Cole’ story, if correct, is likely very unusual,” Wright notes. If a flesh-and-blood Black woman named Harriet Cole consented to her own dissection more than 130 years ago, she would have had very little company.

Link to the rest at Atlas Obscura


The Zoologist’s Guide to the Galaxy

From The Wall Street Journal:

There are many ways of being alone. You can be alone in a room, a house, even a crowd. You can be really alone in a wilderness, or really, really alone in the universe. It’s that last loneliness, existential and cosmic, that astronomers and specialists in astrobiology have in mind when they ask “Are we alone?”

In his book “The Zoologist’s Guide to the Galaxy,” Arik Kershenbaum, a zoologist and lecturer at Girton College, University of Cambridge, takes a novel and rewarding approach to this question. He is not too concerned about the evidence for or against the existence of extraterrestrial life; rather, he is interested in hypothesizing about what forms it might take, given what we know about conditions on other worlds. Instead of Enrico Fermi’s famous question, “But where is everybody?,” Mr. Kershenbaum asks: “What would everybody be like?”

There are good reasons to think that we may not be alone. Humans and other earthlings exist, so life itself—for all its seemingly unique characteristics—isn’t altogether unimaginable. And there are many other planets, almost certainly in the hundreds of millions, perhaps billions, including a very large number that appear to be roughly similar to Earth. Have we any basis to presume our planetary life forms are so special?

In 2017 astronomers were perplexed by an object first spotted by the telescope on Mount Haleakala, Hawaii. It had many traits that appeared to distinguish it from the other objects that occasionally enter our planetary neighborhood: unusual shape, rotation, and speed. It was dubbed “Oumuamua,” the Hawaiian word for “scout,” and although most scientists doubt that it was a scout from another civilization, at least one highly regarded astronomer thinks it was.

As befits a good biologist, Mr. Kershenbaum presents insights informed by what we know about the process of evolution by natural selection. He argues that, although the details will necessarily vary from one exoplanet to another—whether life might be based, say, on silicon, or whether gravity will be stronger or weaker than on Earth—life most likely will be subject to the basic principles of variation, selective retention and reproduction. Whatever the specific planetary environment, some sort of evolutionary mechanisms could very well be inevitable. If so, there should be interplanetary commonalities when it comes to biology—just as there appear to be shared patterns of chemistry, physics and mathematics that apply to the rest of the universe’s inanimate objects, from subatomic particles to black holes.

“A zoologist observing a newly discovered continent from afar,” Mr. Kershenbaum writes, “would be buzzing with ideas about what kind of creatures might live there. Those ideas wouldn’t be wild speculations, but keenly reasoned hypotheses based on the huge diversity of animals we already know, and how each animal’s adaptations are well suited for the life they live: how they eat, sleep, find mates and build their dens. The more we know about how animals have adapted to the old world, the better we can speculate about the new.”

. . . .

Mr. Kershenbaum proceeds to argue, persuasively, that “we have enough of a diversity of adaptations here on Earth to give us at least potential mechanisms that seem appropriate solutions even on worlds almost unimaginably different from ours.”

That may lead the reader to conclude that extraterrestrial creatures, however exotic, will resemble their Earthbound counterparts in recognizable ways. They may have long, short, flexible or jointed appendages, but nevertheless would have some sort of protuberant structures used for locomotion or manipulation. They might have big, little, single, multiple, round, slitted or geometric eyes, but in any case would need some devices for apprehending what we call visible energy. Recall the justly beloved cantina scene in the first “Star Wars” movie, in which the diverse denizens were all, in some way, “animal.”

Mr. Kershenbaum doesn’t go that far, sidestepping the temptation to make assumptions by concerning himself with how aliens would behave rather than how they would appear; that is, their functions rather than their forms. Thus, when he hypothesizes about alien language, he focuses on the presumed universal payoff of communication, without speculating about, say, dialects of Klingon, à la “Star Trek.”

“If alien animals use sound for their alarm calls, their screams will probably be very much like ours,” he writes. “Don’t believe it if they say ‘no one can hear you scream’—screams evolved to be heard, and to be disturbing. Even if aliens don’t use sound, it’s likely that alien alarm calls will be similarly chaotic in whatever medium they do use. They will have whatever properties are characteristic of the alien signal-production organ when you jump out from behind a rock and give the alien a fright. ‘Scary’ is going to be similar on every planet.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

A Lullaby in the Desert: One Woman’s Fight for Freedom

From Self-Publishing Review:

What if by questioning injustice and standing up for the oppressed, your words were met with threats, captivity, and execution? Would you still stand up?

Imagine being born without rights. From bicycle bans and compulsory clothing to mandatory beliefs, what’s worse than being born in a society where your gender alone is a crime? Millions of women are held captive, whether behind bars or behind barriers, for what they believe, what they wear, and what they say. They are suffering at this very moment. Some, like Susan, decided they wouldn’t take being held in the grip of a society’s invisible hands any longer. Some, like Susan, decided to stand up despite the possibility of paying with their lives.

A Lullaby in the Desert isn’t just Susan’s story; it’s the chorus of millions of women, their voices carrying forcefully over the empty sands. Their silent melody can be heard from Iran to Syria, from Indonesia to Morocco. Indeed, their voices ring all over the world. Slavery as we read about it in the history books may be fading into the past, but another kind of slavery lives in the present and threatens to persist into the future if we choose to ignore it.

Some use fear as a weapon to keep others down, forcing entire societies into silence. In some countries, those in power would prefer to destroy the identities of millions of innocent people so long as their grip on power remains intact.

What they don’t know is that fear won’t stop someone who has nothing to lose. In A Lullaby in the Desert, Susan finds herself homeless, penniless, and alone in Iraq, a country on the brink of disaster. When standing on the edge of the abyss, Susan stepped forward, just like the other refugees beside her taking this journey to the point of no return. They all had the same goal: freedom.

Freedom is their fundamental right, their dream, their destination. Like so many others, Susan’s freedom was stolen from her, the shackles thrown over her, covering her body, pushing her down. For Susan, the forces of evil and slavery could be easily seen in the black flags of the Islamic State of Iraq and al-Sham, which some call ISIS, covering her life in a shadow. However, for millions of women, those dark forces are not so obvious, but they are deadly nonetheless.

. . . .

For a long time, I wondered how I could speak for those who could not, for those who had already died, for those who were still enslaved. When the idea first entered my mind, I had to take a step back. Even the thought of telling the world of our plight made me shudder as I remembered my own trauma that began from my earliest days. I remembered the nine-year-old girls sold for fifty dollars in the street to marry strange old men, I remembered a singer assassinated for speaking up about people’s rights, I remembered seeing a woman shot in the head because she wanted to be free. Shame on me if I remained silent.

When I close my eyes I feel no pain because I cannot see anything around me. But my beliefs remain, my story remains. I had to stand in front of my trauma, confront it, release it, because I didn’t choose this life but this is what I know.

When I decided to write Lullaby, one thing pushed me forward: the pain. Pain may stop some, may slow some down, may force some down a different path. For me, I allowed it to open my eyes. 

Link to the rest at Self-Publishing Review

Little Platoons

From The Wall Street Journal:

Shortly after the Industrial Revolution began plucking workers from their ancestral villages and installing them in factory towns, a certain bargain was struck. The family would need to be mobile and smaller now—just mom, dad and the kids, most likely—but it would be sacrosanct, a haven in the heartless world of urban anonymity and mechanized production. If public life was to be marked by fierce competition and creative destruction, at least in the family home you would be free, safe, independent.

In “Little Platoons: A Defense of Family in a Competitive Age,” Matt Feeney outlines a troubling deviation from this bargain, a growing incursion of market forces into the haven of the family home. Mr. Feeney’s compact and compellingly argued book, which grew out of a 2016 article he wrote for the New Yorker, takes its title from Edmund Burke’s “Reflections on the Revolution in France.” There, counseling loyalty to one’s closest community, Burke writes that “to love the little platoon we belong to in society, is the first principle (the germ as it were) of public affections. It is the first link in the series by which we proceed towards a love to our country, and to mankind.” Mr. Feeney suggests that our little platoons are being diverted from this function and transformed from schools of public affection to weapons of public competition.

Mr. Feeney points to several breach points. Tech gadgets isolate us and prey on the idiosyncrasies of our brains. All-consuming youth sports fashion not just soccer players but entire soccer families. In the ambitious, competitive environments that Mr. Feeney describes, year-round sports clubs and camps promise not joyful play or healthy exertion but “development” and preparation for advancement to “the next level”—where the good, choiceworthy thing is always a few hard steps away. If there is a terminus to this process, it is admission to a good college, which is, for many of the parents Mr. Feeney describes, the all-encompassing goal of child-rearing.

As a result, the most powerful and insidious interlopers in Mr. Feeney’s story turn out to be elite college-admissions officers. These distant commissars quietly communicate a vision of the 18-year-old who will be worthy of passing beneath their ivied arches, and “eager, anxious, ambitious kids,” the author tells us, upon “hearing of the latest behavioral and character traits favored by admissions people, will do their best to affect or adopt these traits.”

Admissions officers exercise increasing power to shape the lives of both the children and their families. Their preferences hold so much weight that, more than being merely instrumental, they create the “vague assumption, largely unquestioned, that a central ethical duty of American teenagers is to make themselves legible to a bureaucratic process and morally agreeable to its vain and blinkered personnel.”

A hypercompetitive marketplace like this is driven by fear, Mr. Feeney tells us, and, “if, thanks to this fear, people come to believe that a successful life requires passage through a super-selective college, and if such colleges use the leverage this gives them to require sheepish levels of agreeableness in their successful applicants, then agreeable, sheepish college students are what you’re going to get.”

. . . .

He tells us of studies showing that attending a second-tier college isn’t nearly as detrimental to one’s earning potential as most people would believe. Referring to the work of the economists Stacy Dale and Alan Krueger, Mr. Feeney writes that “selective colleges don’t turn kids into bigger earners. They choose the kids who are more likely to be bigger earners.” Much of what the college-admissions process does is filter the extremely talented from the very talented. If your child is one or the other, chances are their professional performance will reflect it.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG will opine that, other than, perhaps, helping a recent college graduate to obtain his/her first job out of college, attending a “name” college is probably not that important for the life success of 99.99% of the populace of the United States. (PG is not inclined to associate with the 0.01% that are not included in his statistic following a few encounters with such people.) (Uncharacteristically, PG won’t opine on what works and doesn’t in other nations.)

PG attended a “name” college (not Ivy League) and so did Mrs. PG.

For PG, he suspects that his college may have helped him overcome a very non-commercial major to get his first job. (Although the person who decided to extend him his first job offer also asked for PG’s SAT scores, which would have been the same regardless of where he had attended college.)

After that, PG doesn’t recall ever being asked about his formal education by anyone.

It also didn’t take long for PG to encounter very intelligent and competent people who had graduated from colleges he had never heard of or who had not graduated from college at all.

And everyone knows idiots who attended “name” colleges.

PG suspects that one of the main drivers of the hypercompetitive marketplace identified in the OP is insecure parents who want to drop college names when speaking about their children. PG has known a few parents of that sort and has not observed that behavior to be terribly beneficial for their children.

When Brains Dream

From The Wall Street Journal:

Sometime last August, five months into the pandemic lockdown, I had my first mask dream—I was in a crowded place and was the only person wearing a mask; it filled me with panic and I woke suddenly, scared out of my wits. Well into middle age, I still have nightmares that I’m back in college during finals week, and that there’s an entire course I forgot to attend.

Dreaming is a universal human experience; although there are some themes that run through them, dreams are also unique to the dreamer and can provoke calm, wonderment and fear. Animals have them, too, although we’re less certain about the content. About once a month, my wife and I find Madeleine, our 8-pound cairn terrier, doing battle with some unseen adversary as she whimpers and runs in her sleep; we wake her gently and comfort her because the whole thing seems so distressing to her. It sure seems as though she’s dreaming.

Antonio Zadra and Robert Stickgold, two of the world’s leading researchers in the science of sleep and dreams, have written a remarkable account of what we know and don’t know about this mysterious thing that happens during the night. The promise of “When Brains Dream” is to address four questions: “What are dreams? Where do they come from? What do they mean? And what are they for?” In a masterly narrative, the authors answer these and many more questions with solid scientific research and a flair for captivating storytelling.

Speaking of evolution across species, they note that “for it to have been maintained across half a billion years of evolution, sleep must serve functions critical to our survival.” One of those functions is cellular housekeeping. Sleep deprivation leads, for example, to impairments of insulin signaling; after being allowed only four hours of sleep for five nights, “otherwise healthy college students begin to look prediabetic.” Sleep also clears unwanted waste products from the brain, including β-amyloid, which is a prime suspect in Alzheimer’s disease.

Attempts to understand and interpret dreams must be older than history—dreams and their meaning play parts in religious traditions from Tibetan Buddhism to the Old Testament, classical philosophy to Freud and Jung. To many, dreams are prophecies, implanted in our brains by God or angels; to others, they exist to encode our memories of the previous day; to others, they are simply random neural firings. To still others, they are the products of fourth-dimensional beings (such as the ones Clifford Pickover so eloquently describes in his book “Surfing Through Hyperspace”).

The weight of the evidence supports a more elaborate, nuanced and wondrous version of the memory-encoding hypothesis. Messrs. Zadra and Stickgold have designed a conceptual model they call Nextup (“Network Exploration to Understand Possibilities”), using it to describe the progression of dreams throughout the four sleep stages and their different functions. They debunk the common myth that we dream only during REM sleep and show that, in fact, we are typically dreaming throughout the night, including in non-REM sleep states. They tie all of this into the brain’s “default mode network,” in which our minds are wandering and, often, problem-solving. When we’re awake, our brains are so busy attending to the environment that we tend to favor linear connections and thinking; when we allow ourselves to daydream, we solve problems that have distant, novel or nonlinear solutions.

. . . .

By the time we reach REM sleep, later in the night, our brains have entered a superelaborate and vivid version of that default mode network, where dreaming “extracts new knowledge from existing memories through the discovery and strengthening of previously unexplored weak associations. Typically, the brain starts with some new memory, encoded that day . . . and searches for other, weakly associated memories. . . . The brain then combines the memories into a dream narrative that explores associations the brain would never normally consider.”

During dreaming, then, “the brain is searching . . . digging for hidden treasures in places” it would be unaware of when we’re awake. This, in part, explains why some dreams seem to have such a bizarre, otherworldly quality. Could it be that the content or effectiveness of REM sleep among intelligent people differs from that of others? We don’t yet know. The future may see brain-training games that allow our default mode and our REM sleep to create remote associations more often.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The Challenge of Writing Humor in Dark Times

From Publishers Weekly:

The two of us blinked at each other. We had just swapped edits for chapter one of our latest coauthored book, You Are Here: A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape, and there was a problem. This chapter focused on the Satanic panics of the 1980s and ’90s, which we argued marked the beginning of the network crisis: a period of intensifying polarization and information disorder now central to our politics. For the second time, we’d rejected each other’s suggestions. Whitney was being too somber for Ryan, and Ryan was being too slapstick for Whitney. Clearly, we needed to have a conversation.

So we opened negotiations. Front and center was the question, on a scale of one to 10, how funny should You Are Here be? Ryan said seven. Whitney said two. We’d developed the one-to-10 scale while working on our previous book, The Ambivalent Internet; it was a way of posing questions, soliciting answers, and brokering compromises with minimal ego bruising. This was how we made decisions about everything from language choice to argument structure. But this was the first time we disagreed about jokes.

Previously, humor had characterized our work. A running editorial goal as we researched, drafted, and edited The Ambivalent Internet was to make each other laugh. We even opened its second chapter with a story about our own raucous laughter as we prepared to give an aggressively absurdist, meme-heavy academic presentation about the book.

But that was in 2015, the very tail end—for us, anyway—of the “lol nothing matters” myopia that had characterized our early work on internet culture. How we felt about humor at the outset of the project and how we felt on the day of our final submission—not incidentally, the day after the 2016 election—shifted considerably. Laughter still came easily in 2015. By the time the book was published in 2017, we’d stopped laughing—because all the dangers of all that laughter and that “lol nothing matters” mentality had grown painfully clear.

Whitney tackled these dangers in a follow-up project to The Ambivalent Internet exploring journalistic amplification of far-right memes. But in 2018, when we started writing You Are Here, we hadn’t fully dealt with our own laughter.

What began as a discussion about the appropriate number of Satan jokes in chapter one quickly broadened out to everything we’d overlooked—as scholars, as authors, as people—because so many of us had been so busy laughing for so long. The real question became one of how we were going to position ourselves within the crisis we were describing. Were we going to separate ourselves from it, and in that emotional distance crack jokes? Or would we place ourselves within it and take responsibility for our part?

These questions hit especially close to home as we reflected on internet culture. As we drafted The Ambivalent Internet, white supremacists had already adopted the trollish winking, ironic racism, and weaponized laughter that was so common online. But too many people didn’t notice, because all the jokes and all the memes looked the same as they ever did. Stealthily, smirkingly, white supremacists used these “jokes” to push the Overton window—the range of discourse politically acceptable to the mainstream—further and further to the far right. We couldn’t gloss over this history in You Are Here; we couldn’t fall back on easy laughter. We needed to address the political and ethical consequences of internet humor directly. So we devoted a chapter of the book to how fetishistic online laughter, our own very much included, had accelerated the network crisis.

Link to the rest at Publishers Weekly

‘Francis Bacon’ Review: Furious Beauty

From The Wall Street Journal:

In the spring of 1949, the press baron Viscount Rothermere gave a ball that defied Britain’s postwar decline. The men wore white tie, the women their family jewels. The Queen Mother was there, and so was the royal of the moment, Princess Margaret. Late that night, the princess, giddy with Champagne, took the mic from Noël Coward and delivered her party trick. Her Royal Highness began to sing, off-key and out of time. The revelers loyally cheered and called for more.

Margaret was just beginning to mutilate “Let’s Do It” when a “prolonged and thunderous booing” emerged from the crowd. The band stopped, and the princess reddened and rushed from the room. “Who did that?” Lady Caroline Blackwood asked the man at her side. “It was that dreadful man, Francis Bacon,” he fumed. “He calls himself a painter but he does the most frightful paintings.”

More than a decade before the end of the “Lady Chatterley” ban and the Beatles’ first LP, the iconoclasm of Francis Bacon announced a new era in British life. By his death in Madrid in 1992, Bacon was a social and artistic icon, his battered face the image of painterly tradition. Openly gay, he was the king of Soho, a nocturnal drinker and cruiser, his motto “Champagne for my real friends, real pain for my sham friends.” By day, alone in his studio, he was the exposer of torn flesh and gaping mouths, of the secret shames and spiritual collapse that, in the postwar decades when French thinkers ruled in concert with French painters, made him the only truly English existentialist.

With “Revelations,” Mark Stevens and Annalyn Swan enter a biographical field as crowded as the Colony Room on a Friday night. Almost all the revelations are already revealed: the Baconian literature includes testimonies from Bacon’s critical patron David Sylvester; his fellow artist and slummer Lucian Freud; his drinking friend Dan Farson; and his assistant Michael Peppiatt. Mr. Stevens and Ms. Swan might, like Bacon’s friends, share a tendency to confuse the man with the art—like Oscar Wilde, Bacon was his own best work—but they bring a sober eye and an organizing mind to Bacon’s “gilded gutter life.” As in their acclaimed “de Kooning,” the authors frame their subject and his work as a portrait of the age.

. . . .

Erratically educated, Bacon “wasn’t the slightest bit interested in art” until 1930.

. . . .

Almost entirely “self-taught and untouched,” Bacon turned to painting. A critic mocked his first publicly exhibited portrait as “a tiny piece of red mouse-cheese on the end of a stick for head,” and he was judged “insufficiently surreal” to be included in London’s International Surrealist Exhibition of 1936. He and Nanny Lightfoot survived by holding illegal roulette parties.

. . . .

The war made Bacon. The revelatory evils of Nazism, the bombing and the newsreels of the death camps all forced the public to acknowledge the horror and the moral vacuum from which it had emerged. The critics, too, realized that Bacon had “found the animal in the man and the man in the animal.” A magpie for quotations and influence, Bacon liked to quote Aeschylus: “The reek of human blood smiles out at me.”

From then on, Bacon was famous and rich, unless he had lost at the tables. The authors excel at illustrating his formation—Bacon destroyed almost all his early work—his manipulation of his image and value, and his helpless gambling in the power games of love. He believed in beauty and tragedy, and he got and gave both.

Link to the rest at The Wall Street Journal

PG includes a portion of one of three panes in a triptych for those unfamiliar with Bacon’s work. PG expects that the virtues of Bacon’s artistic sensibility may be a learned taste for some.

Excerpt from “Three Studies for Portrait of George Dyer (on Light Ground),” a painting by Francis Bacon

Lesya Ukrainka’s Revisionist Mythmaking

From The Los Angeles Review of Books:

February 25, 2021, marks the 150th birthday of the modernist poet at the top of the Ukrainian literary canon, Lesya Ukrainka (Larysa Kosach, 1871–1913). Having chosen, at the age of 13, the pen name “Ukrainian woman,” she went on to reinvent what it meant both to be a Ukrainian and a woman.

. . . .

“I am quite well aware that this is impudence,” she admitted with a sense of delicious irony in a letter to a friend, interlarding her mock-confessional Ukrainian with German words and quotes from Alexander Pushkin’s Eugene Onegin, “yet ’tis ‘has been pronounced on high’ that I must mit Todesverachtung throw myself into the maze of global themes […], which my countrymen, except two or three brave souls, dare not enter.”

As a modernist, she broke with literary tradition in two significant ways. First of all, she rejected a provincializing paradigm imposed upon Ukrainian culture by the Russian Empire. During her time, the only acceptable image of the colonized people was that of ignorant peasants, and stir Ukrainka’s fancy it did not. A polyglot in command of nine European languages, she populated her poetic dramas with archetypal characters from classical mythology, Scripture, medieval legends, and Romantic poetry. Twining Ukrainian anticolonial subtext and European cultural context, Ukrainka also undermined the masculinist underpinnings of some familiar plots. A turn-of-the-century writer in a ruffled-collar blouse, she revised the key myths of Western culture from a woman’s point of view, venturing into literary territory later to be explored by second-wave feminists.

. . . .

Ukrainka’s poetic drama Stone Host (1912) became the first story of Don Juan in European letters written by a woman. Tirso de Molina, Molière, E. T. A. Hoffmann, Lord Byron, and Alexander Pushkin were among her predecessors. Ukrainka’s version transforms the fabled libertine, the great Romantic sinner and seducer into his supposed conquest’s plaything. Donna Anna is the unmistakable New Woman of the fin de siècle, albeit dressed in Spanish courtly garb. Confused by her rationality, Ukrainka’s Don Juan cries out, “You are indeed stone, without soul or heart,” only to hear in response, “Though not without good sense, you must admit.” Don Juan agrees to sacrifice his freedom and become Donna Anna’s sword in the fight for the throne. Donna Anna’s manipulative power compensates for her overall powerlessness within a male-dominated society, which can silence her no longer. Ukrainka’s heroines seize the right to tell their stories.

Link to the rest at The Los Angeles Review of Books

PG doesn’t wish to rain on the triumphant parade of Ukrainka’s heroines, but must point out that Joseph Stalin did a pretty thorough job of crushing millions of Ukrainian women and men during the 1932–33 Ukrainian famine (the Holodomor, “to kill by starvation,” or Terror-Famine).

Powerlessness is not always gender-related.

Starved peasants on a street in Kharkiv, 1933. In Famine in the Soviet Ukraine, 1932–1933: a memorial exhibition, Widener Library, Harvard University. Cambridge, Mass.: Harvard College Library: Distributed by Harvard University Press, 1986. Procyk, Oksana. Heretz, Leonid. Mace, James E. (James Earnest). ISBN: 0674294262. Page 35. Initially published in Muss Russland Hungern? [Must Russia Starve?], published by Wilhelm Braumüller, Wien [Vienna] 1935.

A Worse Place Than Hell

From The Wall Street Journal:

“The real war will never get in the books.” Walt Whitman’s well-known prediction has not prevented thousands of writers, including Whitman himself, from trying to put the Civil War between covers. Many kinds of chronicles have been written—military histories, political studies, overviews of society or culture, portraits of leading figures. One especially striking way of bringing the war alive is to convey it from the standpoint of the unexalted individual. That is the choice John Matteson makes in “A Worse Place Than Hell,” a moving group portrait that uses the Battle of Fredericksburg, in late 1862, as the focal point for the story of five participants in the Civil War, four Northerners and one Southerner.

The battle that Mr. Matteson highlights has attracted a lot of scrutiny over the years, most notably in Francis Augustín O’Reilly’s “The Fredericksburg Campaign” (2003) and George C. Rable’s “Fredericksburg! Fredericksburg!” (2002). These books give details of the fateful encounter near the Rappahannock River on Dec. 13, 1862, in which the Army of the Potomac under Ambrose E. Burnside met resounding defeat at the hands of Robert E. Lee’s Army of Northern Virginia. The futile assaults by waves of Union soldiers on Confederate troops, who were protected by a stone wall on Marye’s Heights, have become a fixture of Civil War lore. On that grim winter day, the Union suffered more than 12,000 casualties, compared with some 5,300 on the Confederate side. President Lincoln put a positive spin on the battle by praising the surviving Union soldiers for their bravery. Privately, however, he confessed that the battle had left him in “a worse place than hell.”

Although Mr. Matteson uses Lincoln’s phrase for his title, he doesn’t dwell on the hellish aspects of the war. Instead he concentrates on personal and cultural transformation. The people he follows were profoundly changed by the war, he tells us; all of them “confronted war and struggled to redeem themselves within it.” Oliver Wendell Holmes Jr., the son of a famous Boston physician and author, entered the war as an idealistic man and emerged from it hard-bitten and skeptical, leading him to seek direction in a legal career. The Rev. Arthur Fuller, the brother of the women’s rights champion Margaret Fuller, served as a chaplain in a Massachusetts regiment but at Fredericksburg traded his ministerial role for a military one, taking up a gun in a burst of patriotism and losing his life to Confederate bullets. The budding author Louisa May Alcott, hoping to contribute to the Northern cause, became a volunteer nurse in a Washington war hospital, an experience that fed into her popular book “Hospital Sketches” and later provided the emotional background for “Little Women,” a fictionalized portrayal of the Civil War’s toll on her Concord, Mass., family.

As for Walt Whitman, he was writing poems and newspaper stories in Brooklyn and hobnobbing with bohemians when he heard that his brother George had been wounded at Fredericksburg. He traveled first to Washington and then south to the environs of the battlefield in search of his brother, whose wound, as it turned out, was not serious. Walt stayed on for several years in Washington, taking on minor government jobs while serving as a volunteer nurse in war hospitals, setting the stage for his later role as the major poet and memoirist of the war. Two of Whitman’s poems about Lincoln, “When Lilacs Last in the Dooryard Bloom’d” and “O Captain! My Captain!,” are timeless eulogies of America’s greatest president, and his writings about the war, in poetry and prose, are at once crisply realistic and emotionally resonant. George Whitman, Walt’s brother, ended up serving in many Civil War battles and thus provides, in Mr. Matteson’s narrative, a kind of moving lens on the war as it unfolded on the battlefield.

In addition to these Northerners, Mr. Matteson describes the dashing John Pelham, a Confederate artillery officer who exhibited unusual courage. At Fredericksburg, partly hidden by a dip in the land, Pelham coolly supervised the firing of a cannon that was protected by its very proximity to Union troops: Their return volleys mainly went over the heads of the rebels. Pelham’s death at the Battle of Kelly’s Ford, three months after Fredericksburg, becomes in Mr. Matteson’s handling a dramatic, hopeless flourish of Confederate chivalry. Pelham charged forward on a horse like a blond god of war before being felled by an enemy shell fragment. The loss of Pelham was a blow for Confederate morale. Mr. Matteson writes: “No individual in the Confederate Army had seemed more invincible than Pelham. His risks had never been punished, and his audacity had been continually rewarded. If he could fall, so, too, might the army he left behind.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG notes the scale of death of the American Civil War: 620,000 deaths in the Civil War vs. 644,000 US deaths in all other conflicts in the history of the nation combined.

The Civil War killed over 2% of the total US population at the time. In a distant second place, World War II killed 0.39% of the US population.

For every three soldiers killed in battle, five more died of disease. No record was kept of those who were psychologically damaged, but not killed, in the war.

Recruitment on both sides was very local, and records of how many enlisted from various counties and states, and who they were, were scanty or nonexistent. Neither army had systems in place to accurately record deaths or to notify the families of the dead and wounded during the war.

Because families and communities went to war together and served together, a single battle could devastate the communities and families whose sons served together.

As just one example, in the Battle of Gettysburg, the 26th North Carolina, composed of men from seven counties in the western part of the state, faced the 24th Michigan. The North Carolinians suffered 714 casualties out of 800 men. The Michiganders lost 362 out of 496 men.

Nearly the entire student body of Ole Miss (The University of Mississippi) – 135 out of 139 – enlisted in Company A of the 11th Mississippi. Company A, also known as the "University Greys," suffered 100% casualties in Pickett's Charge.

It is estimated that one in three Southern households lost at least one family member in the war. Of the soldiers who survived, one in thirteen returned home missing one or more limbs, which made them unemployable in most parts of the country.

PG obtained much of this detailed information from Civil War Casualties.

The Johnstown Flood

A locomotive whistle was a matter of some personal importance to a railroad engineer. It was tuned and worked (even “played”) according to his own personal choosing. The whistle was part of the make-up of the man; he was known for it as much as he was known for the engine he drove. And aside from its utilitarian functions, it could also be an instrument of no little amusement. Many an engineer could get a simple tune out of his whistle, and for those less musical it could be used to aggravate a cranky preacher in the middle of his Sunday sermon or to signal hello through the night to a wife or lady friend. But there was no horseplay about tying down the cord. A locomotive whistle going without letup meant one thing on the railroad, and to everyone who lived near the railroad. It meant there was something very wrong.
The whistle of John Hess’ engine had been going now for maybe five minutes at most. It was not on long, but it was the only warning anyone was to hear, and nearly everyone in East Conemaugh heard it and understood almost instantly what it meant.

David McCullough, The Johnstown Flood


From The Wall Street Journal:

Do we really need another Mafioso-in-the-family memoir? I mean, seriously, we’ve had books that could be called Mafia Wife, Mafia Dad, Mafia Son, Mafia Stepdaughter, Mafia Uncle, Mafia Dachshund, Mafia Goldfish—okay, well, I made up a couple of those, but you get the point. When Al Capone’s purported grandson publishes a memoir, and he has, I think it’s safe to say we’ve reached saturation.

Which is why I was surprised how thoroughly I enjoyed Russell Shorto’s “Smalltime: A Story of My Family and the Mob.” Even more so once I realized a more accurate subtitle for the book would be “Searching for Grandpa: Second-in-Command of the Johnstown (Pa.) Mob.” In other words, this is not Mafia history that will send Geraldo Rivera scrambling to open a Shorto family safe anytime soon.

And that, oddly, is part of the book’s charm. The author of well-received histories of Amsterdam and New York City, Mr. Shorto has produced something that feels altogether fresh, a street-level portrait of how his late grandfather helped build what amounted to a Mafia small business—or businesses, actually, everything from the numbers and rigged card and dice games (Grandpa’s specialty) to pool halls, a cigar store, bars, bowling alleys and pinball arcades. There’s a murder mystery here—there has to be, right?—but make no mistake, this is a spry little book about small business.

As Mr. Shorto tells it, he had only the vaguest notion of his namesake grandfather Russell “Russ” Shorto’s career until an elderly cousin buttonholed him and urged him to write a book. Mr. Shorto is reluctant—“not my thing,” he avers—but soon finds himself in a Johnstown Panera Bread, surrounded by a gang of ancient, white-haired wise guys dying to tell him about the old days. Grandpa Russ, it seems, had a long run as a mid-level Mafia bureaucrat, running a sports book and crooked card games among other things, until his drinking got out of control and the law finally came calling.

For Mr. Shorto, the challenge is Grandpa Russ’s personality, or lack of one. He was a quiet man and, despite all the Panera chats, remains a cipher for much of the book. The story opens up once Mr. Shorto goes in search of the public Russ, tracing his family from its Sicilian roots and cataloging his newspaper clippings and arrest and FBI records. What emerges is the gritty tale of a talented card-and-dice cheat who gets his break in the late ’30s when a buttoned-down Mafioso named Joseph “Little Joe” Regino, who made his bones in Philadelphia, marries into Russ’s family and opens a Mafia franchise in Johnstown.

This was industrial-age Pennsylvania, and postwar Johnstown was a city of steel factories, whose workers quickly cottoned to the backroom gambling and after-hours places Russ and Regino opened. Russ’s masterstroke was something they called the “G.I. Bank,” a thrumming numbers operation that proved a cash machine. They invested the profits in a dozen local businesses and paid off the mayor and cops, while allowing them to make periodic “raids” to sate the newspapers. A handful of foot soldiers would get pinched, a few hundred dollars in fines would be paid, and they would do it all again the next year.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)