Minds Without Brains?

From Commonweal:

In the view of many scientists, Artificial Intelligence (AI) isn’t living up to the hype of its proponents. We don’t yet have safe driverless cars—and we’re not likely to in the near future. Nor are robots about to take on all our domestic drudgery so that we can devote more time to leisure. On the brighter side, robots are also not about to take over the world and turn humans into slaves the way they do in the movies.

Nevertheless, there is real cause for concern about the impact AI is already having on us. As Gary Marcus and Ernest Davis write in their book, Rebooting AI: Building Artificial Intelligence We Can Trust, “the AI we have now simply can’t be trusted.” In their view, the more authority we prematurely turn over to current machine systems, the more worried we should be. “Some glitches are mild, like an Alexa that randomly giggles (or wakes you in the middle of the night, as happened to one of us), or an iPhone that autocorrects what was meant as ‘Happy Birthday, dear Theodore’ into ‘Happy Birthday, dead Theodore,’” they write. “But others—like algorithms that promote fake news or bias against job applicants—can be serious problems.”

Marcus and Davis cite a report by the AI Now Institute detailing AI problems in many different domains, including Medicaid-eligibility determination, jail-term sentencing, and teacher evaluations:

Flash crashes on Wall Street have caused temporary stock market drops, and there have been frightening privacy invasions (like the time an Alexa recorded a conversation and inadvertently sent it to a random person on the owner’s contact list); and multiple automobile crashes, some fatal. We wouldn’t be surprised to see a major AI-driven malfunction in an electrical grid. If this occurs in the heat of summer or the dead of winter, a large number of people could die.

The computer scientist Jaron Lanier has cited the darker aspects of AI as it has been exploited by social-media giants like Facebook and Google, where he used to work. In Lanier’s view, AI-driven social-media platforms promote factionalism and division among users, as starkly demonstrated in the 2016 and 2020 elections, when Russian hackers created fake social-media accounts to drive American voters toward Donald Trump. As Lanier writes in his book, Ten Arguments for Deleting Your Social Media Accounts Right Now, AI-driven social media are designed to commandeer the user’s attention and invade her privacy, to overwhelm her with content that has not been fact-checked or vetted. In fact, Lanier concludes, it is designed to “turn people into assholes.”

As Brooklyn College professor of law and Commonweal contributor Frank Pasquale points out in his book, The Black Box Society: The Secret Algorithms That Control Money and Information, the loss of individual privacy is also alarming. And while powerful businesses, financial institutions, and government agencies hide their actions behind nondisclosure agreements, “proprietary methods,” and gag rules, the lives of ordinary consumers are increasingly open books to them. “Everything we do online is recorded,” Pasquale writes:

The only questions left are to whom the data will be available, and for how long. Anonymizing software may shield us for a little while, but who knows whether trying to hide isn’t itself the ultimate red flag for watchful authorities? Surveillance cameras, data brokers, sensor networks, and “supercookies” record how fast we drive, what pills we take, what books we read, what websites we visit. The law, so aggressively protective of secrecy in the world of commerce, is increasingly silent when it comes to the privacy of persons.

Meanwhile, as Lanier notes, these big tech companies are publicly committed to an extravagant AI “race” that they often prioritize above all else. Lanier thinks this race is insane. “We forget that AI is a story we computer scientists made up to help us get funding once upon a time, back when we depended on grants from government agencies. It was pragmatic theater. But now AI has become a fiction that has overtaken its authors.”

In Marcus and Davis’s view, the entire field needs to refocus its energy on making AI more responsive to common sense. And to do this will require a complete rethinking of how we program machines.

“The ability to conceive of one’s own intent and then use it as a piece of evidence in causal reasoning is a level of self-awareness (if not consciousness) that no machine I know of has achieved,” writes Judea Pearl, a leading AI proponent who has spent his entire career researching machine intelligence. “I would like to be able to lead a machine into temptation and have it say, ‘No.’” In Pearl’s view, current computers don’t really constitute artificial intelligence. They simply constitute the ground level of what can and likely will lead to true artificial intelligence. Having an app that makes your life much easier is not the same thing as having a conversation with a machine that can reason and respond to you like another human being.

In The Book of Why: The New Science of Cause and Effect, co-written with Dana Mackenzie, Pearl lays out the challenges that need to be met in order to produce machines that can think for themselves. Current AI systems can scan for regularities and patterns in swaths of data faster than any human. They can be taught to beat champion chess and Go players. According to an article in Science, there is now a computer that can even beat humans at multiplayer games of poker. But these are all narrowly defined tasks; they do not require what Pearl means by thinking for oneself. In his view, machines that use data have yet to learn how to “play” with it. To think for themselves, they would need to be able to determine how to make use of data to answer causal questions. Even more crucially, they would need to learn how to ask counterfactual questions about how the same data could be used differently. In short, they would have to learn to ask a question that comes naturally to every three-year-old child: “Why?”

“To me, a strong AI should be a machine that can reflect on its actions and learn from past mistakes. It should be able to understand the statement ‘I should have acted differently,’ whether it is told as much by a human or arrives at that conclusion itself.” Pearl builds his approach around what he calls a three-level “Ladder of Causation,” at the pinnacle of which stand humans, the only species able to think in truly causal terms, to posit counterfactuals (“What would have happened if…?”).

But then a further question arises: Would such artificial intelligence be conscious the way we are? Or would it simply be a more advanced form of “smart” machine that exists purely to serve humans? There is reason for skepticism. As philosopher David Chalmers told Prashanth Ramakrishna in a New York Times interview in 2019, intelligence does not necessarily imply subjective consciousness:

Intelligence is a matter of the behavioral capacities of these systems: what they can do, what outputs they can produce given their inputs. When it comes to intelligence, the central question is, given some problems and goals, can you come up with the right means to your ends? If you can, that is the hallmark of intelligence. Consciousness is more a matter of subjective experience. You and I have intelligence, but we also have subjectivity; it feels like something on the inside when we have experiences. That subjectivity—consciousness—is what makes our lives meaningful. It’s also what gives us moral standing as human beings.

In Chalmers’s view, trying to prove that machines have achieved consciousness would not be easy. “Maybe an A.I. system that could describe its own conscious states to me, saying, ‘I’m feeling pain right now. I’m having this experience of hurt or happiness or sadness’ would count for more. Maybe what would count for the most is [its] feeling some puzzlement at its mental state: ‘I know objectively that I’m just a collection of silicon circuits, but from the inside I feel like so much more.’”

Link to the rest at Commonweal

Censorship Competition Heats Up

From The Wall Street Journal:

By now it is clear that wokeness is a contagious malady. Amazon.com made headlines in February when it suddenly delisted Ryan Anderson’s book “When Harry Became Sally: Responding to the Transgender Moment,” a thoughtful, humane and deeply researched investigation of a controverted subject of public debate.

As the publisher of that 2018 bestseller, I was taken aback by reports that Mr. Anderson’s book was unavailable at “the world’s largest bookstore.” At first, I wondered whether there was some mistake.

But no. It was a deliberate act of censorship. Moreover, Amazon’s motto, like the Earl of Strafford’s, was “Thorough.” They didn’t just stop selling the book. They pushed it into the digital oubliette, erasing all trace of it from the Amazon website. They did the same thing at their subsidiaries Audible, which sells audiobooks, and AbeBooks, which sells secondhand books.

Now it turns out that Bookshop.org, which bills itself a scrappy alternative to the Bezos Behemoth, is up to the same game. A couple of weeks ago, a reader alerted us that Mr. Anderson’s book had gone missing from the Bookshop.org website.

The organization never responded to our queries. But on Friday we learned from our distributor that Bookshop had deep-sixed the book. “We did remove this title based on our policies,” Bookshop wrote to our distributor—without, however, explaining what those “policies” might be. “We had multiple complaints and concerns from customers, affiliates, and employees about the title.”

Perhaps other customers, affiliates and employees expressed “complaints and concerns” about Heather Mac Donald’s “The War on Cops,” another Encounter bestseller. That book has also been disappeared from the Bookshop website.

. . . .

I couldn’t help but note that at least one of my own books, “Tenured Radicals,” is missing in action there. Apparently there were no “complaints and concerns” about Adolf Hitler’s “Mein Kampf,” however. That book is available in a variety of editions, as are the anti-Semitic lucubrations of Louis Farrakhan and many other similarly unedifying effusions.

Underdogs make for good copy, so it was no surprise that Bookshop was hailed as a brave upstart, a feisty David to the Goliath of Amazon. “Bookshop.org hopes to play Rebel Alliance to Amazon’s Empire,” ran the headline of a valentine in the Chicago Tribune.

Bookshop turns out to be little more than another minion for the Emperor of Wokeness. For the past couple of weeks, the first item advertised on its home page is that bible of antiwhite woke sermonizing, “How to Be an Anti-Racist.” Many readers, I’d wager, would have “complaints and concerns” about that screed. But that doesn’t mean that Bookshop should stop selling it. Nor would it, regardless of how many complained.

The move to squash Mr. Anderson’s book is the vanguard of a larger effort to silence debate and impose ideological conformity on any contentious issue in which the commissars of woke culture have made an investment. It has nothing to do with principle and everything to do with power.

Amazon and now Bookshop have sided firmly with the bullies. Doubtless there will be more interdictions, delistings and suppressions. They can do it, so they will do it.

One of the more tiresome canards from the courtiers is that entities like Amazon and Bookshop are private companies and therefore that they can choose to sell, or not sell, whatever they want.

This is true, but also irrelevant. What we are witnessing are not the prerogatives of the free market but the clashings of a culture war. Those clashings may adopt, as camouflage, the rhetoric of free enterprise, but their end is control and obliteration of opposing points of view.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Lest any visitors to TPV should have any doubts, PG is concerned about viewpoint discrimination on the part of Amazon.

He acknowledges that, as a private business, Amazon has the right to choose what products it will and will not sell, but this decision drops the company into the middle of a political controversy that it needn’t have joined.

Amazon is a very large target for those across the political spectrum and a serious antitrust investigation of the company’s activities and policies could substantially harm its business.

More than one giant US company has been hamstrung and permanently impaired by a lengthy antitrust probe. Classic examples are AT&T, Kodak and Standard Oil.

Most recently, Microsoft was involved in a lengthy antitrust suit.

Bill Gates later said that the antitrust suit prevented Microsoft from completing development on Windows Mobile, its cell phone operating system (which left the field open to Apple and Android). Apple’s annual revenue is now about twice as large as Microsoft’s.

Gates also cited the stress of the antitrust suit as a contributing factor in his decision to step down from the leadership of Microsoft in 2000. PG is not alone in believing that Microsoft has not been the same company since Gates left.

There has been a growing sentiment in the United States that the big technology companies such as Amazon, Apple, Google and Facebook have become too large and powerful.

Amazon CEO Jeff Bezos had what was widely regarded as a poor showing in his videoconference testimony before the House Antitrust Committee last summer. He recently declined an invitation to testify before a Senate committee investigating “The Income and Wealth Inequality Crisis in America.”

PG notes that TPV is not a political blog and requests that comments not devolve into political name-calling. He is concerned about Amazon’s future primarily because it is the only significant marketplace where indie authors can publish their books on an equal basis with books from traditional publishers and Amazon provides a very large portion of the royalties that indie authors earn from their books.

How to ‘Update’ Beliefs

From The Wall Street Journal:

Metaphors carry us from one idea that is difficult to understand to another that is easier to grasp. Many subjects are notoriously incomprehensible, making the use of metaphors essential.

The Newtonian “mechanical universe” metaphor, for example, transfers us from the difficult idea of gravity and the spooky notion of action-at-a-distance to the more understandable “clockwork” of gears and wheels. Enlightenment thinkers used the mechanical metaphor to explain everything from the human body (with its levers and pulleys of joints, tendons and muscles) to political systems (the king as the sun, his subjects as encircling planets) and even economies: François Quesnay modeled the French economy after the human body, likening the flow of money through a nation to blood coursing through a body’s veins; he compared ruinous government policies to diseases that impeded economic health, and therefore recommended laissez-faire.

The workings of the human mind are especially enigmatic, so scientists have long invoked metaphors such as hydraulic mechanisms, electrical wires, logic circuits, computer networks, software programs and information workspaces to help explain what’s going on. In “The Scout Mindset: Why Some People See Things Clearly and Others Don’t,” Julia Galef, a co-founder of the Center for Applied Rationality and host of the popular podcast “Rationally Speaking,” uses a military metaphor of scouts and soldiers.

According to Ms. Galef’s divide, the soldier mindset leads us to defend our beliefs against outside threats, seek out evidence to support our beliefs, ignore or rationalize away counterevidence and resist admitting we’re wrong—as that feels like defeat. The scout mindset, by contrast, seeks to discover what is true through evidence, and reasons toward conclusions that lead to a more accurate map of reality—“the motivation to see things as they are,” Ms. Galef explains, “not as you wish they were.”

The differences between these two mindsets are striking and, Ms. Galef argues, explain how thinking goes right or wrong. Soldiers rationalize, deny, deceive and self-deceive, and engage in motivated reasoning and wishful thinking to win the battle of beliefs. “We talk about our beliefs as if they’re military positions, or even fortresses, built to resist attack,” the author writes. This soldier mindset leads us to defend against people who might “ ‘poke holes in’ our logic,” “shoot down” our beliefs or confront us with a “ ‘knock-down’ argument,” all of which may leave our beliefs “undermined,” “weakened” or even “destroyed.” Soldiers thus become “entrenched” in their beliefs, resisting “surrender” to an opposing position.

When our beliefs are true, of course, this can be effective. The problem is that almost all reasoning and decision-making happens under uncertainty, so the soldier mindset can easily lead to a perpetuation of error. In seeking truth—that is, an accurate map of reality regardless of which belief is right—scouts engage in more open-minded discovery, objectivity and intellectual honesty. “I was wrong” and “I changed my mind” become virtues instead of vices.

Soldier-types are more likely to believe that changing one’s mind is a sign of weakness, or that it is important to persevere in beliefs even when evidence is brought to bear against them. Scouts are more likely to take into consideration evidence that goes against their own beliefs, or think it may be more useful to pay attention to those who disagree with them than to those who agree.

Scouts, Ms. Galef explains, “revise their opinions incrementally over time, which makes it easier to be open to evidence against their beliefs.” They also “view errors as opportunities to hone their skill at getting things right, which makes the experience of realizing ‘I was wrong’ feel valuable, rather than just painful.” In fact, the author suggests, we should drop the whole “wrong” confession and instead describe the process as “updating”—a reference to Bayesian reasoning, in which we revise our estimations of the probability of something’s being true after gaining new information about it. “An update is routine. Low-key. It’s the opposite of an overwrought confession of sin,” Ms. Galef continues. “An update makes something better or more current without implying that its previous form was a failure.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)
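
For readers curious what a Bayesian “update” looks like in practice, here is a minimal sketch of Bayes’ rule in Python. The function and the numbers are illustrative assumptions only, not anything drawn from Ms. Galef’s book; they simply show how a new piece of evidence revises a prior belief rather than demolishing it.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    # Posterior probability that a claim is true, given a prior belief and
    # how likely the observed evidence is under each hypothesis (Bayes' rule).
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# Hypothetical inputs: start 70% confident, then observe evidence that is
# twice as likely if the belief is false as if it is true.
posterior = bayes_update(prior=0.70, likelihood_if_true=0.2, likelihood_if_false=0.4)
print(f"Updated confidence: {posterior:.0%}")  # about 54%

With those made-up inputs, a 70 percent belief drops to roughly 54 percent: an incremental revision, in Ms. Galef’s terms, rather than an “overwrought confession of sin.”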

‘The Quick Fix’ Review: A Bias Toward Easy Answers

From The Wall Street Journal:

Most of us like to think of ourselves as enlightened, thoughtful observers of the world around us, skeptical of irrational claims, crazy ideas and silly theories. It is only other people, members of eccentric subcultures in far-off places, who are susceptible to such foolishness. It is a flattering self-portrait. But is it true?

In “The Quick Fix: Why Fad Psychology Can’t Cure Our Social Ills,” Jesse Singal, a contributing writer at New York magazine, chronicles several dubious enthusiasms that permeate our culture. Along the way, he tries to show why they are so widespread. His focus is on “the allure of fad psychology,” as he puts it, and on the ways in which “both individuals and institutions can do a better job of resisting it.”

We all remember the self-esteem programs that beguiled grade-school educators in the 1980s and 1990s. The idea was that, by handing out more prizes and encouraging self-affirming rhetoric, young people would do better in their studies and in life generally. But, as Mr. Singal notes, self-esteem failed to “ ‘unlock the gates’ of success.” Nor did it help to reduce—as promised—crime, teen pregnancy and a host of other social ills.

Then there was power-posing for women in the workplace: the claim that, by adopting assertive positions (legs astride, hands on hips) for two minutes before, say, going into a job interview, or while giving a presentation, a new confidence will be engendered as well as an improved status among otherwise dismissive men. A TED talk by an originator of power-posing and its chief evangelist, a Harvard psychologist, garnered 61 million views. Sheryl Sandberg of Facebook was a fan. But it turned out that standing like Wonder Woman didn’t give women the promised testosterone boost and confidence they sought.

A MacArthur Fellowship-winning social psychologist at the University of Pennsylvania championed a mental trait called “grit” (aka stick-to-itiveness). Teaching “grit” became a wildly popular way to build character or boost grades in school-aged children across the country. It didn’t deliver. As Mr. Singal notes, established concepts such as conscientiousness and IQ were far better at predicting performance.

Eventually the psychologists who created the Implicit Association Test (IAT), which purports to measure unconscious bias, conceded that it had severe measurement problems. Among other things, it turned out that the IAT had notoriously low reliability, meaning that a subject could score “prejudiced” one day but not the next. And the test lacked predictive power or, as the creators acknowledged, was “problematic to use to classify persons as likely to engage in discrimination.” Nonetheless, the IAT has a vast reach. Hundreds of thousands, perhaps millions, of employees of corporations, foundations, universities, government agencies and police departments have taken the IAT—and have been told of the biases they possess but do not feel. After the killing of George Floyd, the popularity of the IAT exploded, despite the fact that it can’t predict the behavior that creates a racially unjust society.

What is the allure of these interventions? Humans will instinctively respond to a novel and simple—but not too alien—story about a subject of great social concern. What is more, fads are based on behavioral science conducted by researchers at esteemed institutions. Some of their colleagues grasp the exaggeration of their claims, but, as Mr. Singal writes, “it’s unrealistic to expect the average human resources manager or school principal or other institutional decision-maker to possess such skill and knowledge.”

On the supply side, psychologists have incentives to promote simple rather than complex theories. In a competitive academic field, a sexy press release can get one noticed. Even if fad originators were sincere at first, and most appear to have been, they often become too personally invested in what they are promoting. As Mr. Singal notes, they are “able to charge higher speaking fees, pursue lucrative consulting jobs, secure book deals, and enjoy the perks of minor celebrity.”

Academic journals, too, are keen to publish supposedly newsworthy findings. Under such conditions, it’s easy to see why a psychologist would be reluctant to re-examine her too-good-to-be-true results when doubts—her own and those of colleagues—begin to nag.

Each chapter of “The Quick Fix” presents accessible explanations of the research that was eventually shown to be “half-baked,” as Mr. Singal puts it. The problems, he shows, often derive from dodgy statistical analysis or faulty experimental design. Researchers, for instance, might use various statistical tests until one shows a sought-for result, or they might submit only positive results to a journal for publication, holding the negative ones back, a practice known as “file-drawering.” Mr. Singal also traces the social and political currents that helped propel certain trends.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

High Conflict

From The Wall Street Journal:

Amanda Ripley, a journalist whose first book, “The Unthinkable,” was about how people survive disasters, has covered “all manner of human misery.” Her latest book, “High Conflict: Why We Get Trapped and How We Get Out,” is prompted by misery of the American political kind. After the 2016 election, Ms. Ripley reflects, journalists who cared about telling the truth in all its complexity were preaching to a “shrinking choir of partisans.” Those who still read the news searched it for weapons to use against enemies. It “felt like curiosity was dead.”

Curiosity is a casualty of “high conflict,” a term that Ms. Ripley uses to describe our bitter politics and much else. We need conflict because human beings, limited in experience, biased but needing to act, are natural partisans. We find ourselves in conflict with other partisans. But under the best circumstances, that conflict, even when “stressful and heated,” keeps us “open to the reality that none of us has all the answers.” In “healthy conflict,” we defend what we hold dear but understand what others do, and, even when we don’t revise our views, find a way to work with them. In contrast, high conflict imagines an “us,” whose ideas must prevail, and a “them,” whose books must burn. It appears to clarify matters by narrowing vision.

. . . .

Our culture and values, Ms. Ripley argues, can also draw us into high conflict. We all experience humiliation, but a member of Curtis’s gang learned to perceive small slights as humiliations that required a forceful response. What humiliates and how one responds to humiliation, she argues, are “socially informed,” sometimes by “conflict entrepreneurs,” bad actors who “exploit high conflict for their own ends.”

. . . .

Ms. Ripley has more to offer than Baha’i wisdom when she turns to how people escape from high conflict. The most important insight of this part of the book is that you can’t beat high conflict by scolding it, however high-mindedly. Curtis Toler takes a step back from the Stones because he is a parent as well as a gang leader. He maintains his distance because he is offered another way to matter, working with those most likely to perpetrate or become victims of violence. Mark Lynas, the environmental activist, permits himself to see his mistakes only when he meets scientists whose “dedication to empirical evidence over ideology” he comes to admire. He sees a way to matter, and to continue to pursue the aims he cares about.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG notes that all experienced authors know conflict is a highly useful (perhaps almost necessary) element in successful fiction.

‘Rebellion, Rascals and Revenue’ Review: The Taxman Cometh, Again

From The Wall Street Journal:

Wherever they stand in their annual tussle with the American income-tax system, vexed readers in what is dolefully called tax season may agree with former Treasury Secretary Paul O’Neill when he declared that “our tax code is an abomination.”

If it’s any consolation, your taxes might have been both abominable and even sillier than they already are. Over the centuries, rulers have imposed levies on beards, livestock flatulence and even urine (valued in ancient Rome for its ammonia). In 1795, Britain imposed an annual tax of one guinea on the right to apply fragrant powders to smelly wigs. Since pigtails were common, those taxpayers became “guinea-pigs.”

Such are the tax-free dividends on offer in “Rebellion, Rascals and Revenue,” an erudite yet good-humored history of taxation with a particular focus on Britain and its tax-allergic offspring, the United States. The authors, economists Michael Keen and Joel Slemrod, demonstrate at surprisingly engaging length that, “when it comes to designing and implementing taxes, our ancestors were addressing fundamentally the same problems that we struggle with today.”

Among those problems are the search for fairness, the appearance of which is necessary for a tax to gain public acceptance; the inevitable metastasizing of a tax code’s complexity; the burden of administration, particularly when the task is intrusive (an English tax on hearths was resented because inspectors had to come into the home and count them); and the iron law of unintended consequences, which haunts public policy generally and taxation in particular. In Britain from 1697 to 1851, a tax on windows—not a bad proxy for affluence in those days—made work for carpenters and masons hired to close them up. The resulting loss of light and air exemplifies the so-called excess burden of taxation beyond the sum of money levied. So does your accountant’s tax-preparation fee.

The problem of “tax incidence”—figuring out who actually pays a tax, regardless of who writes the check—is especially fraught. The Earned Income Tax Credit, for example, is a reverse tax that aims to reduce poverty while encouraging work. But for every dollar that single mothers get from the EITC—at least according to one estimate—employers of low-skill labor capture 73 cents. The EITC, after all, encourages low-skill workers to enter the labor force, increasing the labor supply and presumably driving down workers’ wages.

. . . .

The Rosetta Stone, the authors note, “describes a tax break given to the temple priests of ancient Egypt.” That taxpayers should have some say in taxation was laid out (though not fully settled) in the Magna Carta. Later struggles over this question played a role in the English Civil War, the American War of Independence (remember “taxation without representation?”), and the French Revolution. “It was the ‘long nineteenth century,’ from 1789 to 1914,” the authors report, “that finally saw the emergence in the West of a stable, adequate, and broadly consensual tax structure.”

One of the book’s many insights is that taxes and war have always gone hand in hand, enabling not just each other but the social changes that often follow. “The world wars, and especially the second one,” the authors note, “created both the machinery that made the welfare state possible and the political environment that ensured it would become reality.”

Since many readers have just filed their income taxes, a word on the history of this levy may be in order. Britain’s first genuine income tax was introduced in 1799 to pay for the French and Napoleonic wars. America’s was put in place by the North to pay for the Civil War (the rate hit 10% in 1864). Eliminated in 1872, an income tax was soon back on the political agenda because of discontent with the tariffs and state and local levies that predominated in its absence.

Ultimately a constitutional amendment was required, and in 1913 a federal income tax became law. 

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

A Love Affair with Peru

From Women Writers, Women’s Books:

I was trekking the Inca trail, and the narrow dirt path followed alongside the torrential Apurímac River. The switchbacks were jagged, cutting through wild grasses, boulders, and twisted trees. The Salkantay mountain range loomed nearby, and snow-capped peaks and dramatic ridges reminded me of the vastness of the Andes. Soon the trail took our small group of family and friends to a higher elevation, and the river looked like a small snake below. The trail had widened, and the group of chatty and enthusiastic trekkers was at a fair distance from me.

Earlier that day, we had all witnessed a condor flying overhead. The huge bird glided through a stretch of the canyon, against a backdrop of cloudless blue and sun-lit, golden cliffs. It took our breath away. Walking now in silence, thinking back on the majestic condor, I said a prayer for my book. I had finished one of many drafts, and though it was far from being the final manuscript, a sense of wellbeing and positivity rested in me. I felt sure my book would be published someday. What I didn’t know was that it was still only the beginning of an endeavor that would take over a decade to achieve.

I had arrived in Peru fifteen years earlier, at the young age of twenty-two, well-traveled, fiercely independent, and recklessly adventurous–having already traveled around the globe, trekked the Himalayas, and worked with an environmental brigade in Nicaragua during the tail end of the Iran-Contra War. When I arrived, it was 1989, and red zones marred the countryside and two terrorist groups, Sendero Luminoso and the MRTA, wreaked havoc.

The economy was in shambles, and there were lines down blocks to buy staples like rice and cooking oil. Yet, the beauty and resiliency of the people captured my heart and imagination in a way I couldn’t have predicted. I was to study at La Católica University in Lima for one year, but to the surprise of everyone back home, I stayed beyond my year of studies to marry and start a family.  

My decision to stay involved a man, the father of my children, whom, after a twenty-five-year marriage, I have since divorced. What I understand now is that at the time, and at that young age, I was falling in love with and marrying not a person but a country, a culture, and an extended family. The family was involved in a small silk production enterprise in the Andes. I witnessed the life cycle of silkworms, from worm to moth, and how the silk is produced from the cocoons. This large, boisterous, complicated Peruvian family was my love, and the life and stories they shared with me became a love affair.

Since that time, I have often asked myself: did I always want to be a writer, or did living in Peru spark an impulse to explore life, people, and the human condition in that deep and mysterious way that writers do? Perhaps it was because my first daughter was birthed there, so far away from my native homeland, that I so deeply rooted myself to the people and culture. It was now my daughter’s birthplace, her homeland, and I wanted desperately to share it with her.

Link to the rest at Women Writers, Women’s Books

Churchill & Son

From The Wall Street Journal:

When Randolph Churchill was a young boy in the 1920s, his father Winston stayed up with him one night, talking with him into the late hours. At 1.30 a.m., Winston—then in the political wilderness and not yet the man who would come to be seen as one of the greatest Britons in history—turned to his son and said: “You know, my dear boy, I think I have talked to you more in these holidays than my father talked to me in the whole of his life.”

Josh Ireland writes of this episode very early in “Churchill & Son,” his account of the relationship between Winston Churchill and his only male heir, and a reader would have to be ice-cold of heart not to pause to take a sorrowful breath. Winston Churchill had had a loveless childhood. He was ignored by his glamorous American-born mother and, most crushingly, cold-shouldered by his father, who barely spared his son a second glance.

Although Winston pined for his mother’s attention, he worshiped his father, Lord Randolph Churchill, a volatile and brilliant iconoclast who was appointed chancellor of the exchequer at the age of 36. But Winston’s filial adoration was not returned. Instead, when Lord Randolph did pay Winston any attention, it was to scorch him with put-downs and unfatherly contempt. As Mr. Ireland, a British journalist, notes wryly: The “best thing Lord Randolph ever did for Winston was to die young”—at age 45.

Winston would write a biography of his father, published in 1906, 11 years after Lord Randolph’s death. It is a book of many qualities, none of which includes, says Mr. Ireland, “the detachment of a professional historian.” As a cousin of Winston observed at the time: “Few fathers have done less for their sons. Few sons have done more for their fathers.” Reading Mr. Ireland’s book, it is tempting to conclude that the inverse of that judgment applies to Winston and his own son. Few fathers did more for their sons than Winston. Few sons have done less for their fathers than Randolph.

Winston was determined, writes Mr. Ireland, “that his son would not suffer the same neglect that had blighted his own childhood.” If his own father had poured scorn on him—describing him in a letter as “a mere social wastrel” destined for “a shabby unhappy & futile existence”—Winston constantly encouraged Randolph. In Mr. Ireland’s words, he “praised him, told him that the future was his to seize.” In the jargon of our times, Winston can be said to have overcompensated for his own desolate childhood by lavishing love on Randolph.

Did he give Randolph too much love? And was that love corrosive? “Winston was obsessed with his son,” Mr. Ireland says, and was “never more himself than in Randolph’s company.” As Randolph grew from boy to man, father and son spent so much time together, absorbed in conversation, “that they had come to inhabit the same mental space.” And when they communed, they shut out the rest of the world—including Clementine (Winston’s wife, Randolph’s mother) and Randolph’s three sisters.

As a child, the cherubic Randolph got more attention from his mother than his sisters did. “Clementine even breastfed him,” Mr. Ireland tells us. But as Randolph became as much a companion for Winston as he was a son, his mother began to resent him. She felt that Winston indulged Randolph to excess, failing to check his rudeness at table and his misbehavior in society. Winston was “consumed by his own sense of destiny” (in Mr. Ireland’s words), and Randolph was the “incarnation of his dynastic obsession.” So Winston placed him on a pedestal—one from which Randolph was wont to spit at the world, or even urinate upon it, as he did once on the heads of his father and David Lloyd George—the prime minister—from a bedroom window at the Churchill country home. Lloyd George thought it was a passing shower.

Yet the more Clementine criticized Randolph, the stronger Winston’s love for him seemed to become. Perversely, she blamed Randolph for Winston’s failure to discipline his own son. She felt that she was vying with Randolph for Winston’s attention. Ever the devoted wife, she had sacrificed her own needs to care for Winston’s many whims. But instead of paying her the attention she expected, Winston was, Mr. Ireland says, “infatuated by his glorious, golden, chaotic son.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The Light of Days

Comrades from the pioneer training commune in Białystok, 1938.
PHOTO: COURTESY OF GHETTO FIGHTERS’ HOUSE MUSEUM

From The Wall Street Journal:

They were nicknamed the “ghetto girls” but the label does not do justice to the defiant, mostly forgotten Eastern European Jewish women in their teens and 20s who, acting in resistance to the Nazis, undertook one mission impossible after another to disrupt the machinery of the Holocaust and save as many Jews as they could.

Now, in her well-researched and riveting chronicle “The Light of Days,” Judy Batalion brings these unsung heroines to the forefront. She has recovered their stories from diaries and memoirs written variously in Yiddish, Polish and Hebrew, some composed during the war (one in prison, on toilet paper, then hidden beneath floorboards), others afterward, still more recorded in oral histories. This group portrait forcefully counters the myth of Jewish passivity, at once documenting the breadth and extent of Jewish activism throughout the ghettos—armed resistance groups operated in more than 90 of them, according to Ms. Batalion—and underlining in particular the crucial roles women played in the fight to survive. Indeed, several of the women whose stories Ms. Batalion tells also helped lead the most significant act of anti-Nazi Jewish resistance, the 1943 Warsaw Ghetto uprising, which is recounted here in brutal detail.

The tasks and responsibilities these female fighters took on were as myriad as the false Christian identities they adopted to avoid capture, their disguises so successful that one was even hired as a translator by the Gestapo in Grodno. But mostly they traveled, seemingly nonstop, to surrounding Polish towns and in and out of the barricaded ghettos that they managed, through bribes and stealth, to penetrate. In the ghettos the Nazis not only segregated Jews from Aryan society but also prevented evidence of the massive deprivations and punishments Jews suffered there from leaking to the world outside.

This Nazi-imposed isolation made the female couriers all the more welcome when they arrived, living proof that those locked inside the walls were not forgotten. During their visits, the couriers acted as “human radios,” carrying greetings from other ghettos, bringing warnings of forthcoming deportations to the death camps, and serving as liaisons coordinating the efforts of ghetto resistance cells with those of armed partisan groups in the forests. They also took on the grim responsibility of reporting the latest massacres and other atrocities against the Jews. The eyewitness testimonies they conveyed were harrowing. But rather than spread hopelessness among the ghetto population, the couriers often did the opposite, breeding greater determination to resist, to leave a legacy of action and defiance rather than submissiveness. As one ghetto slogan declared, “It is better to be shot in the ghetto than to die in Treblinka!”

Skilled black marketers, they also smuggled in food to supplement the ghettos’ ever-dwindling food rations; medical supplies to fight typhus and the other diseases that ran rampant amid appallingly cramped, broken-down living conditions; and as many rifles, pistols, bullets, grenades and bomb-building components as possible, to spark an uprising.

Behind all these operations lay a deftness and aptitude for creating and maintaining resistance webs and networks both within and among different ghettos, as well as with sympathetic Aryans throughout Poland. That is why they were also often described as kashariyot, the Hebrew word for “connectors.” It was through these links that they set up hiding places for Jewish children outside the ghetto, found safe houses to conceal resistance fighters, provided forged papers and plotted escape routes to Palestine, even facilitated prison breaks. Nor did they hesitate to take up arms themselves, leaving Nazi troops so surprised to see women wielding guns and grenades that one startled SS commander was left to wonder if they were “devils or goddesses.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

While the women who did fight endured a great many terrible travails, those who didn’t may have experienced worse.

Photo: Warsaw Ghetto Uprising, 1943, from the Stroop Report (photographer unknown). Public domain, via Wikimedia Commons: https://commons.wikimedia.org/w/index.php?curid=17223940

Using Novel Writing Techniques in Your Memoir

From Writers in the Storm:

I’ve spent much of our Covid year learning about, editing, and writing my own memoir. Memoir is a form I think every writer should try to tackle at least once. Everyone has a story to tell. The exercise of writing a memoir can sharpen our memories and force us to write outside our comfort zones—always good practice for a writer at any level. If you want to craft a memoir that is truly a page-turner, you can and should use many of your fiction writing tricks.

First Things First: What a Memoir Is and Is Not

It is important to know what a memoir is and is not. A memoir is not your autobiography. A memoir is a slice of your life at a particular time, in a particular place. It is literally your memories put to paper. Some memoirs cover a year in a person’s life. Some memoirs cover several years. Think in terms of a season of your life, rather than a finite block of days on the calendar.

Many new memoirists hamstring themselves by feeling they need to tell their entire life stories, nose to tail, David Copperfield-style. You do not. A memoir focuses on a theme, on a particular red thread that has wound through your life thus far. It is a not a full accounting of all your sins and wins!

A memoir is not a journal entry, even though it is your story. You must write it so that a reader can benefit from it. There must be a compelling reason to keep them turning the pages, such as a lesson they can learn or inspiration for them to find. Memoir can feel navel-gazey in the writing process, but it should never feel navel-gazey on the page. (Yes, I know this is daunting! But persevere.)

What holds a memoir together is a story—your story.

Remember as you write each page that you are telling that story, not making a police report. You can change names to protect people’s privacy. And since you are working from memory, the story will have your slant—don’t feel you have to get every single angle on it. If you ask your family about the picnic you had that one day in 1972, you will get a different story from each member about that day, told from their perspective. Somewhere in the middle lies the truth.

Discover what your truth is and use your memoir to tell it.

An Inciting Incident: You Need One

Telling us about the time you went to the market after work and ran into a friend you hadn’t seen since high school and you exchanged pleasantries with them is not a gripping inciting incident. Telling us about the time you went to the market after work, ran into a friend you hadn’t seen since high school, and found out they needed a kidney is a start. Deciding to see if you were a match to help them because of that one time in school when they saved you from being assaulted by a teacher? That is a gripping inciting incident.

Don’t invent something that isn’t true, but when you sit down to comb through the sand of your life, you are searching for the pearl that you will hand to your readers. Think of the unusual things. If you don’t think there are any of those pearls, think again. Everyone has a story.

Once I sat in a hotel bar on a business trip and met seven different travelers, from seven different age groups, seven different places, seven different walks of life. Each and every one of them had a compelling story. You do, too. And if you write it well, people will want to read it.

Build Characters

Many new memoirists neglect to see that what they are crafting are characters (who just happen to be real people). You are the “main character” of your memoir.

This is tough for many writers. Do we ever really see ourselves completely objectively? Probably not. But we must do our best. Use the same techniques to craft interesting characters in your memoir that you do in your fiction writing. Make a list of who will appear on the stage of your memoir, and sketch them out, just as you would the players in your novel.

Link to the rest at Writers in the Storm

The Mystery of Harriet Cole

From Atlas Obscura:

If “Harriet” could hear, she might pick up the sound of ping-pong balls skittering across a table. If she could smell, she might detect a range of lunches being reheated in a nearby microwave. If her eyes could see, she might let them wander a busted Pac-Man machine, a TV, and a campus bookstore, decorated with a swooping, celebratory paper chain, like an elementary-school version of DNA’s double helix. She might even catch a glimpse of herself in a camera lens or an observer’s glassy eyeballs. People often stop to stare.

On a sweaty Saturday, before social distancing was the law of the land, a group of visitors gathered at Drexel University’s medical campus in Northwest Philadelphia to meet “Harriet.” The preamble to this encounter was a display case holding several unusual and meticulously prepared medical specimens, long used as teaching tools. Like “Harriet,” each had been created in the late 19th century by a star anatomist, Rufus Weaver. Now, behind glass, between the cadaver lab and a bookstore, a segment of intestine and a piece of a spinal cord sit in stillness. A dissected eyeball floats ethereally in century-old liquid, its separated parts looking like a tiny jellyfish, a bit of brittle plastic, a mushroom cap.

The visitors shuffled through the door and into the otherwise empty student center. They huddled on the low-pile carpet, nondescript in the style of a suburban office park, and peered at more of Weaver’s dissection work, which occupied a glass-fronted case. They surveyed a sinewy hand, ropey and purplish. Two skulls and necks. Then, “Harriet.”

Reactions rippled.

“Oh.”

“Oh, wow.”

Quietly, “Poor Harriet.”

. . . .

“I’ve been meaning to find her,” said Malaya Fletcher, an epidemiologist in Washington, D.C., specializing in infectious disease. Fletcher remembered learning about the dissection in her high school biology class, and the story had stuck with her. “It’s just awesome,” she said. “You almost don’t believe it’s real.” The group crowded in close, lofting their cell phones above each other’s heads. They bobbed and weaved their raised hands, trying to take pictures without capturing their own flushed faces reflected in the glass.

“Harriet” is a network of fibers fastened to a black board in a case pushed up against a wall. At the top, there appears to be a brain, plump and brown, and a pair of eyes. Scan your own eyes down and you’ll encounter an intricate system of skinny, brittle cords, pulled taut and painted startlingly, artificially white. The outline is recognizably human—there’s the impression of hands and feet, the hint of a pelvis, the suggestion of a rib cage—but it is slightly fantastical, too. The way the cords loop at the hands and feet, it almost appears as if the figure has fins. Elsewhere, the fibers look shaggy, like chewed wire, as if electricity is shooting from the margins of the body.

This is a human medical specimen, in the spirit of an articulated skeleton. But unlike that familiar sight, it represents the nervous system, a part of the body’s machinery that most people have trouble even imagining. Some who stand before “Harriet” wiggle their fingers and toes, as if trying to map the fibers onto their own bodies and make the sight somehow less abstract.

Neighboring the display is a label that identifies the specimen as “Harriet Cole” and explains that she was a Black woman who worked as a maid or scrubwoman in a university laboratory at Hahnemann Medical College, died in the late 1800s, and donated her body to the medical school. Her nervous system, the story goes, was dissected by Weaver, then preserved and mounted as a teaching tool and masterpiece of medical specimen preparation.

Before the preparation wound up at this campus, more than a decade ago, it traveled to Chicago for the 1893 World’s Fair, where it won a blue ribbon. It starred in a multi-page feature in LIFE magazine and took up residence in academic textbooks. But before all of that—before the nerves were naked—the fibers animated and stimulated a body. In 2012, the university’s press office characterized the nerve donor as the school’s “longest-serving employee.”

. . . .

Researchers such as Herbison and McNaughton are neither anatomists nor ethicists: They didn’t elect to procure, dissect, and display a body, though they inherited the finished product. As caretakers of this object, they have accepted the mission of poking around in the historical record, cleaving fact from fiction, trying to piece together a fuller story of “Harriet Cole” in spite of official records that often omit women and people of color.

. . . .

Committed to resurfacing stories of women lost, warped, or overlooked in the archives, McNaughton, Herbison, and other collaborators, including medical historian Brandon Zimmerman, are trying to pin down specifics about “Harriet.” They’re wondering, more than 130 years later, how to describe the dazzling, jarring preparation, stripped of skin and pulled away from the bone. Whose body this is, and what would it mean if one of the university’s oldest fixtures never knew that she would spend her afterlife on display?

. . . .

At Hahnemann, Weaver was appointed custodian of the university’s anatomical museum in 1880, and busied himself assembling an anatomical wunderkammer with no rival. Gone were papier-mâché models and “musty,” dried-out specimens. Weaver filled the light-flooded, third-floor space with hundreds of new medical displays, many of which he prepared himself. His trove included bladder calculi, sections of healthy and diseased brains, and an entire uterus, partly consumed by a tumor and opened to reveal a six-month-old fetus. The anatomist imagined these—and the museum’s hundreds of other objects—as teaching tools instead of “mere ‘curiosities,’” according to an announcement circulated in the mid-1880s. Among the assortment, there was Weaver, described in 1902 by a reporter from The North American as a “little professor” brimming with “energy, originality, and vim,” “as cheerful and bright as a May morning,” and prone to speaking of his collection of “beautiful tumor[s]” with tenderness and awe. (“Here is a lung,” the reporter quoted him saying. “Isn’t that the handsomest thing that you ever saw?”) In one 19th-century photograph, Weaver poses next to a fresh cadaver, its chest pried open, while limbs dangle around it like cuts of meat in a butcher’s shop. The anatomist’s own bearing was stick-straight—perhaps an occupational hazard of standing above so many spinal columns.

. . . .

But these guides stop well short of how Weaver pulled his masterpiece off. The earliest description of Weaver’s work on the nervous system comes courtesy of Thomas, who described the process in an 1889 edition of The Hahnemannian Monthly, the school’s journal. But Thomas’s is a hazy picture, long on the basics of dissection and short on clarity about how Weaver managed to preserve delicate nerve structures while chipping or sawing bone apart. This must have been finicky work: The spinal cord—a hardy nerve bundle—is roughly as wide as your thumb. We don’t have the complete ingredients Weaver mingled in his preservatives, a full inventory of the tools he enlisted, or a meticulous record of which parts of the process proved surprisingly straightforward or especially thorny or vexing. We don’t have a precise timeline, either. As Thomas tells it, dissection began on April 9 and concluded by June, with mounting complete by September; years later, van Baun reported that the dissection alone took nearly seven months, and then it required “seventy days of unceasing, laborious, skilled work and supreme patience to get the specimen on the board,” for a total of “nine months of gruelling [sic] contest.”

Weaver is said to have spent up to 10 hours a day in his humid office, and reportedly spent two weeks just tussling with the bottom of the skull. Once “all the little branching strands … were laid bare,” The North American noted, Weaver attempted to keep them supple by swaddling them in alcohol-soaked gauze or wads of cotton, which needed frequent changing, and he covered the flimsy strands with rubber. He retrieved nearly everything but sacrificed the intercostal nerves, which run along the ribs and proved too difficult to wrangle. Weaver reportedly excised the brain but held on to the outer membrane, called the dura mater, and plumped it up with “curled hair” stuffing, stitched it closed, and returned it to the display. To showcase the optic nerves, Weaver left the corpse’s eyes in place and distended them “with a hard injection,” Thomas wrote.

Mounting the specimen—as Weaver later recalled to The North American—was far more “wearisome and exacting” than the dissection itself. Weaver apparently tacked the nerves in place with 1,800 pins, and then fixed every filament with a coat of lead paint. (Many of those pins were later removed, Thomas wrote, once the shellacked nerves dried and held their position.) In all, Weaver reportedly spent several months laboring over the body, with a break for a summer vacation. The ultimate result, Thomas wrote, was “perfectly clean and free from all extraneous tissues and smooth as threads of silk.”

. . . .

Some laypeople argued that a living patient was better off being treated by someone who had seen the body’s inner contents up close. In 1882, The Christian Recorder—the newspaper of the African Methodist Episcopal Church—endorsed dissection, suggesting that it would be foolish for anyone to seek treatment “at the hands of a man who had not gone through the mysteries of the dissecting room.” Still, even those who supported the notion of dissection typically did not want to entertain the thought of it happening to anyone they loved. The anonymous author of that article in The Christian Recorder skewered grave robbing on moral grounds and suggested that doctors be offered the bodies of executed murderers and anyone who died by suicide.

The few people who expressly permitted, or even beseeched, doctors to cut into them after death tended overwhelmingly to be white, wealthy, and accomplished men. By 1889, the new American Anthropometric Society, headquartered in Philadelphia, began compiling the brains of physicians and public intellectuals who embraced the ideas of phrenology, which correlated intellectual feats with cranial attributes. These donors were keen to join the organization’s “brain club” as a way to further the field while also valorizing themselves.

And in the medical realm, consent was slippery. William Osler, a founding professor of Johns Hopkins Hospital, was known to solicit family approval before giving cadavers to his students—but he was also famously dogged in his pursuit of that permission, and in a 2018 article in the journal Clinical Anatomy, Wright, the University of Calgary pathologist, notes that “autopsy consent and organ retention abuse was not uncommon in late-19th century Philadelphia.” In a 2007 Academic Medicine article about the uptick in body bequeathal in 20th-century America, Ann Garment, then a medical student at New York University, and three coauthors note that turn-of-the-century body donation was uncommon enough to make the news when it happened. The New York Times picked up the tale of Thomas Orne, a wealthy Maryland horse dealer who pledged his body to Johns Hopkins in 1899. In 1912, 200 New York City physicians also vowed to donate their bodies for dissection in an effort to erode the stigma around it.

. . . .

“I am aware that there have been men, [the philosopher Jeremy] Bentham for instance, who have voluntarily willed their bodies to be dissected, but they have been extremely few,” Sozinsky recounted in 1879. Opting in was far from commonplace. “The ‘Harriet Cole’ story, if correct, is likely very unusual,” Wright notes. If a flesh-and-blood Black woman named Harriet Cole consented to her own dissection more than 130 years ago, she would have had very little company.

Link to the rest at Atlas Obscura

Weaver, photographed with “Harriet” in 1918. COURTESY LEGACY CENTER ARCHIVES, DREXEL UNIVERSITY COLLEGE OF MEDICINE, PHILADELPHIA.

The Zoologist’s Guide to the Galaxy

From The Wall Street Journal:

There are many ways of being alone. You can be alone in a room, a house, even a crowd. You can be really alone in a wilderness, or really, really alone in the universe. It’s that last, existential and cosmic loneliness that astronomers and specialists in astrobiology have in mind when they ask “Are we alone?”

In his book “The Zoologist’s Guide to the Galaxy,” Arik Kershenbaum, a zoologist and lecturer at Girton College, University of Cambridge, takes a novel and rewarding approach to this question. He is not too concerned about the evidence for or against the existence of extraterrestrial life; rather, he is interested in hypothesizing about what forms it might take, given what we know about conditions on other worlds. Instead of Enrico Fermi’s famous question, “But where is everybody?” Mr. Kershenbaum asks: “What would everybody be like?”

There are good reasons to think that we may not be alone. Humans and other earthlings exist, so life itself—for all its seemingly unique characteristics—isn’t altogether unimaginable. And there are many other planets, almost certainly in the hundreds of millions, perhaps billions, including a very large number that appear to be roughly similar to Earth. Have we any basis to presume our planetary life forms are so special?

In 2017 astronomers were perplexed by an object first spotted by the telescope on Mount Haleakala, Hawaii. It had many traits that appeared to distinguish it from other objects that occasionally enter our planetary neighborhood: unusual shape, rotation and speed. It was dubbed “Oumuamua,” the Hawaiian word for “scout,” and although most scientists doubt that it was a scout from another civilization, at least one highly regarded astronomer thinks it was.

As befits a good biologist, Mr. Kershenbaum presents insights informed by what we know about the process of evolution by natural selection. He argues that, although the details will necessarily vary from one exoplanet to another—whether life might be based, say, on silicon, or whether gravity will be stronger or weaker than on Earth—life most likely will be subject to the basic principles of variation, selective retention and reproduction. Whatever the specific planetary environment, some sort of evolutionary mechanism could very well be inevitable. If so, there should be interplanetary commonalities when it comes to biology—just as there appear to be shared patterns of chemistry, physics and mathematics that apply to the rest of the universe’s inanimate objects, from subatomic particles to black holes.

“A zoologist observing a newly discovered continent from afar,” Mr. Kershenbaum writes, “would be buzzing with ideas about what kind of creatures might live there. Those ideas wouldn’t be wild speculations, but keenly reasoned hypotheses based on the huge diversity of animals we already know, and how each animal’s adaptations are well suited for the life they live: how they eat, sleep, find mates and build their dens. The more we know about how animals have adapted to the old world, the better we can speculate about the new.”

. . . .

Mr. Kershenbaum proceeds to argue, persuasively, that “we have enough of a diversity of adaptations here on Earth to give us at least potential mechanisms that seem appropriate solutions even on worlds almost unimaginably different from ours.”

That may lead the reader to conclude that extraterrestrial creatures, however exotic, will resemble their Earthbound counterparts in recognizable ways. They may have long, short, flexible or jointed appendages, but nevertheless would have some sort of protuberant structures used for locomotion or manipulation. They might have big, little, single, multiple, round, slitted or geometric eyes, but in any case would need some devices for apprehending what we call visible energy. Recall the justly beloved cantina scene in the first “Star Wars” movie, in which the diverse denizens were all, in some way, “animal.”

Mr. Kershenbaum doesn’t go that far, sidestepping the temptation to make assumptions by concerning himself with how aliens would behave rather than how they would appear; that is, their functions rather than their forms. Thus, when he hypothesizes about alien language, he focuses on the presumed universal payoff of communication, without speculating about, say, dialects of Klingon, à la “Star Trek.”

“If alien animals use sound for their alarm calls, their screams will probably be very much like ours,” he writes. “Don’t believe it if they say ‘no one can hear you scream’—screams evolved to be heard, and to be disturbing. Even if aliens don’t use sound, it’s likely that alien alarm calls will be similarly chaotic in whatever medium they do use. They will have whatever properties are characteristic of the alien signal-production organ when you jump out from behind a rock and give the alien a fright. ‘Scary’ is going to be similar on every planet.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

A Lullaby in the Desert: One Woman’s Fight for Freedom

From Self-Publishing Review:

What if by questioning injustice and standing up for the oppressed, your words were met with threats, captivity, and execution? Would you still stand up?

Imagine being born without rights. From bicycle bans and compulsory clothing to mandatory beliefs, what’s worse than being born in a society where your gender alone is a crime? Millions of women are held captive, whether behind bars or behind barriers, for what they believe, what they wear, and what they say. They are suffering at this very moment. Some, like Susan, decided they wouldn’t take being held in the grip of a society’s invisible hands any longer. Some, like Susan, decided to stand up despite the possibility of paying with their lives.

A Lullaby in the Desert isn’t just Susan’s story; it’s the chorus of millions of women, their voices carrying forcefully over the empty sands. Their silent melody can be heard from Iran to Syria, from Indonesia to Morocco. Indeed, their voices ring all over the world. Slavery as we read about it in the history books may be fading into the past, but another kind of slavery lives in the present and threatens to persist into the future if we choose to ignore it.

Some use fear as a weapon to keep others down, forcing entire societies into silence. In some countries, those in power would prefer to destroy the identities of millions of innocent people so long as their grip on power remains intact.

What they don’t know is that fear won’t stop someone who has nothing to lose. In A Lullaby in the Desert, Susan finds herself homeless, penniless, and alone in Iraq, a country on the brink of disaster. When standing on the edge of the abyss, Susan stepped forward, just like the other refugees beside her taking this journey to the point of no return. They all had the same goal: freedom.

Freedom is their fundamental right, their dream, their destination. Like so many others, Susan’s freedom was stolen from her, the shackles thrown over her, covering her body, pushing her down. For Susan, the forces of evil and slavery could be easily seen in the black flags of the Islamic State of Iraq and al-Sham, which some call ISIS, covering her life in a shadow. However, for millions of women, those dark forces are not so obvious, but they are deadly nonetheless.

. . . .

For a long time, I wondered how I could speak for those who could not, for those who had already died, for those who were still enslaved. When the idea first entered my mind, I had to take a step back. Even the thought of telling the world of our plight made me shudder as I remembered my own trauma that began from my earliest days. I remembered the nine-year-old girls sold for fifty dollars in the street to marry strange old men, I remembered a singer assassinated for speaking up about people’s rights, I remembered seeing a woman shot in the head because she wanted to be free. Shame on me if I remained silent.

When I close my eyes I feel no pain because I cannot see anything around me. But my beliefs remain, my story remains. I had to stand in front of my trauma, confront it, release it, because I didn’t choose this life but this is what I know.

When I decided to write Lullaby, one thing pushed me forward: the pain. Pain may stop some, may slow some down, may force some down a different path. For me, I allowed it to open my eyes. 

Link to the rest at Self-Publishing Review

Little Platoons

From The Wall Street Journal:

Shortly after the Industrial Revolution began plucking workers from their ancestral villages and installing them in factory towns, a certain bargain was struck. The family would need to be mobile and smaller now—just mom, dad and the kids, most likely—but it would be sacrosanct, a haven in the heartless world of urban anonymity and mechanized production. If public life was to be marked by fierce competition and creative destruction, at least in the family home you would be free, safe, independent.

In “Little Platoons: A Defense of Family in a Competitive Age,” Matt Feeney outlines a troubling deviation from this bargain, a growing incursion of market forces into the haven of the family home. Mr. Feeney’s compact and compellingly argued book, which grew out of a 2016 article he wrote for the New Yorker, takes its title from Edmund Burke’s “Reflections on the Revolution in France.” There, counseling loyalty to one’s closest community, Burke writes that “to love the little platoon we belong to in society, is the first principle (the germ as it were) of public affections. It is the first link in the series by which we proceed towards a love to our country, and to mankind.” Mr. Feeney suggests that our little platoons are being diverted from this function and transformed from schools of public affection to weapons of public competition.

Mr. Feeney points to several breach points. Tech gadgets isolate us and prey on the idiosyncrasies of our brains. All-consuming youth sports fashion not just soccer players but entire soccer families. In the ambitious, competitive environments that Mr. Feeney describes, year-round sports clubs and camps promise not joyful play or healthy exertion but “development” and preparation for advancement to “the next level”—where the good, choiceworthy thing is always a few hard steps away. If there is a terminus to this process, it is admission to a good college, which is, for many of the parents Mr. Feeney describes, the all-encompassing goal of child-rearing.

As a result, the most powerful and insidious interlopers in Mr. Feeney’s story turn out to be elite college-admissions officers. These distant commissars quietly communicate a vision of the 18-year-old who will be worthy of passing beneath their ivied arches, and “eager, anxious, ambitious kids,” the author tells us, upon “hearing of the latest behavioral and character traits favored by admissions people, will do their best to affect or adopt these traits.”

Admissions officers exercise increasing power to shape the lives of both the children and their families. Their preferences hold so much weight that, more than being merely instrumental, they create the “vague assumption, largely unquestioned, that a central ethical duty of American teenagers is to make themselves legible to a bureaucratic process and morally agreeable to its vain and blinkered personnel.”

A hypercompetitive marketplace like this is driven by fear, Mr. Feeney tells us, and, “if, thanks to this fear, people come to believe that a successful life requires passage through a super-selective college, and if such colleges use the leverage this gives them to require sheepish levels of agreeableness in their successful applicants, then agreeable, sheepish college students are what you’re going to get.”

. . . .

He tells us of studies showing that attending a second-tier college isn’t nearly as detrimental to one’s earning potential as most people would believe. Referring to the work of the economists Stacy Dale and Alan Krueger, Mr. Feeney writes that “selective colleges don’t turn kids into bigger earners. They choose the kids who are more likely to be bigger earners.” Much of what the college-admissions process does is filter the extremely talented from the very talented. If your child is one or the other, chances are their professional performance will reflect it.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG will opine that, other than, perhaps, helping a recent college graduate to obtain his/her first job out of college, attending a “name” college is probably not that important for the life success of 99.99% of the populace of the United States. (PG is not inclined to associate with the .01% that are not included in his statistic following a few encounters with such people.) (Uncharacteristically, PG won’t opine on what works and doesn’t in other nations.)

PG attended a “name” college (not Ivy League) and so did Mrs. PG.

For PG, he suspects that his college may have helped him overcome a very non-commercial major to get his first job. (Although the person who decided to extend him his first job offer also asked for PG’s SAT scores, which would have been the same regardless of where he had attended college.)

After that, PG doesn’t recall ever being asked about his formal education by anyone.

It also didn’t take long for PG to encounter very intelligent and competent people who had graduated from colleges he had never heard of or who had not graduated from college at all.

And everyone knows idiots who attended “name” colleges.

PG suspects that one of the main drivers of the hypercompetitive marketplace identified in the OP is insecure parents who want to drop college names when speaking about their children. PG has known a few parents of that sort and has not observed that behavior to be terribly beneficial for their children.

When Brains Dream

From The Wall Street Journal:

Sometime last August, five months into the pandemic lockdown, I had my first mask dream—I was in a crowded place and was the only person wearing a mask; it filled me with panic and I woke suddenly, scared out of my wits. Well into middle age, I still have nightmares that I’m back in college during finals week, and that there’s an entire course I forgot to attend.

Dreaming is a universal human experience; although there are some themes that run through them, dreams are also unique to the dreamer and can provoke calm, wonderment and fear. Animals have them, too, although we’re less certain about the content. About once a month, my wife and I find Madeleine, our 8-pound cairn terrier, doing battle with some unseen adversary as she whimpers and runs in her sleep; we wake her gently and comfort her because the whole thing seems so distressing to her. It sure seems as though she’s dreaming.

Antonio Zadra and Robert Stickgold, two of the world’s leading researchers in the science of sleep and dreams, have written a remarkable account of what we know and don’t know about this mysterious thing that happens during the night. The promise of “When Brains Dream” is to address four questions: “What are dreams? Where do they come from? What do they mean? And what are they for?” In a masterly narrative, the authors answer these and many more questions with solid scientific research and a flair for captivating storytelling.

Speaking of evolution across species, they note that “for it to have been maintained across half a billion years of evolution, sleep must serve functions critical to our survival.” One of those functions is cellular housekeeping. Sleep deprivation leads, for example, to impairments of insulin signaling; after being allowed only four hours of sleep for five nights, “otherwise healthy college students begin to look prediabetic.” Sleep also clears unwanted waste products from the brain, including β-amyloid, which is a prime suspect in Alzheimer’s disease.

Attempts to understand and interpret dreams must be older than history—dreams and their meaning play parts in religious traditions from Tibetan Buddhism to the Old Testament, classical philosophy to Freud and Jung. To many, dreams are prophecies, implanted in our brains by God or angels; to others, they exist to encode our memories of the previous day; to others, they are simply random neural firings. To still others, they are the products of fourth-dimensional beings (such as the ones Clifford Pickover so eloquently describes in his book “Surfing Through Hyperspace”).

The weight of the evidence supports a more elaborate, nuanced and wondrous version of the memory-encoding hypothesis. Messrs. Zadra and Stickgold have designed a conceptual model they call Nextup (“Network Exploration to Understand Possibilities”), using it to describe the progression of dreams throughout the four sleep stages and their different functions. They debunk the common myth that we only dream during REM sleep and show that, in fact, we are typically dreaming throughout the night and in non-REM sleep states. They tie all of this into the brain’s “default mode network,” in which our minds are wandering and, often, problem-solving. When we’re awake, our brains are so busy attending to the environment that we tend to favor linear connections and thinking; when we allow ourselves to daydream, we solve problems that have distant, novel or nonlinear solutions.

. . . .

By the time we reach REM sleep, later in the night, our brains have entered a superelaborate and vivid version of that default mode network, where dreaming “extracts new knowledge from existing memories through the discovery and strengthening of previously unexplored weak associations. Typically, the brain starts with some new memory, encoded that day . . . and searches for other, weakly associated memories. . . . The brain then combines the memories into a dream narrative that explores associations the brain would never normally consider.”

During dreaming, then, “the brain is searching . . . digging for hidden treasures in places” it would be unaware of when we’re awake. This, in part, explains why some dreams seem to have such a bizarre, otherworldly quality. Could it be that the content or effectiveness of REM sleep among intelligent people differs from that of others? We don’t yet know. The future may see brain-training games that allow our default mode and our REM sleep to create remote associations more often.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The Challenge of Writing Humor in Dark Times

From Publishers Weekly:

The two of us blinked at each other. We had just swapped edits for chapter one of our latest coauthored book, You Are Here: A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape, and there was a problem. This chapter focused on the Satanic panics of the 1980s and ’90s, which we argued marked the beginning of the network crisis: a period of intensifying polarization and information disorder now central to our politics. For the second time, we’d rejected each other’s suggestions. Whitney was being too somber for Ryan, and Ryan was being too slapstick for Whitney. Clearly, we needed to have a conversation.

So we opened negotiations. Front and center was the question: on a scale of one to 10, how funny should You Are Here be? Ryan said seven. Whitney said two. We’d developed the one-to-10 scale while working on our previous book, The Ambivalent Internet; it was a way of posing questions, soliciting answers, and brokering compromises with minimal ego bruising. This was how we made decisions about everything from language choice to argument structure. But this was the first time we disagreed about jokes.

Previously, humor had characterized our work. A running editorial goal as we researched, drafted, and edited The Ambivalent Internet was to make each other laugh. We even opened its second chapter with a story about our own raucous laughter as we prepared to give an aggressively absurdist, meme-heavy academic presentation about the book.

But that was in 2015, the very tail end—for us, anyway—of the “lol nothing matters” myopia that had characterized our early work on internet culture. How we felt about humor at the outset of the project and how we felt on the day of our final submission—not incidentally, the day after the 2016 election—shifted considerably. Laughter still came easily in 2015. By the time the book was published in 2017, we’d stopped laughing—because all the dangers of all that laughter and that “lol nothing matters” mentality had grown painfully clear.

Whitney tackled these dangers in a follow-up project to The Ambivalent Internet exploring journalistic amplification of far-right memes. But in 2018, when we started writing You Are Here, we hadn’t fully dealt with our own laughter.

What began as a discussion about the appropriate number of Satan jokes in chapter one quickly broadened out to everything we’d overlooked—as scholars, as authors, as people—because so many of us had been so busy laughing for so long. The real question became one of how we were going to position ourselves within the crisis we were describing. Were we going to separate ourselves from it, and in that emotional distance crack jokes? Or would we place ourselves within it and take responsibility for our part?

These questions hit especially close to home as we reflected on internet culture. As we drafted The Ambivalent Internet, white supremacists had already adopted the trollish winking, ironic racism, and weaponized laughter that was so common online. But too many people didn’t notice, because all the jokes and all the memes looked the same as they ever did. Stealthily, smirkingly, white supremacists used these “jokes” to push the Overton window—the range of discourse politically acceptable to the mainstream—further and further to the far right. We couldn’t gloss over this history in You Are Here; we couldn’t fall back on easy laughter. We needed to address the political and ethical consequences of internet humor directly. So we devoted a chapter of the book to how fetishistic online laughter, our own very much included, had accelerated the network crisis.

Link to the rest at Publishers Weekly

‘Francis Bacon’ Review: Furious Beauty

From The Wall Street Journal:

In the spring of 1949, the press baron Viscount Rothermere gave a ball that defied Britain’s postwar decline. The men wore white tie, the women their family jewels. The Queen Mother was there, and so was the royal of the moment, Princess Margaret. Late that night, the princess, giddy with Champagne, took the mic from Noël Coward and delivered her party trick. Her Royal Highness began to sing, off-key and out of time. The revelers loyally cheered and called for more.

Margaret was just beginning to mutilate “Let’s Do It” when a “prolonged and thunderous booing” emerged from the crowd. The band stopped, and the princess reddened and rushed from the room. “Who did that?” Lady Caroline Blackwood asked the man at her side. “It was that dreadful man, Francis Bacon,” he fumed. “He calls himself a painter but he does the most frightful paintings.”

More than a decade before the end of the “Lady Chatterley” ban and the Beatles’ first LP, the iconoclasm of Francis Bacon announced a new era in British life. By his death in Madrid in 1992, Bacon was a social and artistic icon, his battered face the image of painterly tradition. Openly gay, he was the king of Soho, a nocturnal drinker and cruiser, his motto “Champagne for my real friends, real pain for my sham friends.” By day, alone in his studio, he was the exposer of torn flesh and gaping mouths, of the secret shames and spiritual collapse that, in the postwar decades when French thinkers ruled in concert with French painters, made him the only truly English existentialist.

With “Revelations,” Mark Stevens and Annalyn Swan enter a biographical field as crowded as the Colony Room on a Friday night. Almost all the revelations are already revealed: the Baconian literature includes testimonies from Bacon’s critical patron David Sylvester; his fellow artist and slummer Lucian Freud; his drinking friend Dan Farson; and his assistant Michael Peppiatt. Mr. Stevens and Ms. Swan might, like Bacon’s friends, share a tendency to confuse the man with the art—like Oscar Wilde, Bacon was his own best work—but they bring a sober eye and an organizing mind to Bacon’s “gilded gutter life.” As in their acclaimed “de Kooning,” the authors frame their subject and his work as a portrait of the age.

. . . .

Erratically educated, Bacon “wasn’t the slightest bit interested in art” until 1930.

. . . .

Almost entirely “self-taught and untouched,” Bacon turned to painting. A critic mocked his first publicly exhibited portrait as “a tiny piece of red mouse-cheese on the end of a stick for head,” and he was judged “insufficiently surreal” to be included in London’s International Surrealist Exhibition of 1936. He and Nanny Lightfoot survived by holding illegal roulette parties.

. . . .

The war made Bacon. The revelatory evils of Nazism, the bombing and the newsreels of the death camps all forced the public to acknowledge the horror and the moral vacuum from which it had emerged. The critics, too, realized that Bacon had “found the animal in the man and the man in the animal.” A magpie for quotations and influence, Bacon liked to quote Aeschylus: “The reek of human blood smiles out at me.”

From then on, Bacon was famous and rich, unless he had lost at the tables. The authors excel at illustrating his formation—Bacon destroyed almost all his early work—his manipulation of his image and value, and his helpless gambling in the power games of love. He believed in beauty and tragedy, and he got and gave both.

Link to the rest at The Wall Street Journal

PG includes a portion of one of the three panels in a triptych for those unfamiliar with Bacon’s work. PG expects that the virtues of Bacon’s artistic sensibility may be an acquired taste for some.

Excerpt from “Three Studies for Portrait of George Dyer (on Light Ground),” a painting by Francis Bacon

Lesya Ukrainka’s Revisionist Mythmaking

From The Los Angeles Review of Books:

February 25, 2021, marks the 150th birthday of the modernist poet at the top of the Ukrainian literary canon, Lesya Ukrainka (Larysa Kosach, 1871–1913). Having chosen, at the age of 13, the pen name “Ukrainian woman,” she went on to reinvent what it meant both to be a Ukrainian and a woman.

. . . .

“I am quite well aware that this is impudence,” she admitted with a sense of delicious irony in a letter to a friend, interlarding her mock-confessional Ukrainian with German words and quotes from Alexander Pushkin’s Eugene Onegin, “yet ’tis ‘has been pronounced on high’ that I must mit Todesverachtung throw myself into the maze of global themes […], which my countrymen, except two or three brave souls, dare not enter.”

As a modernist, she broke with literary tradition in two significant ways. First of all, she rejected a provincializing paradigm imposed upon Ukrainian culture by the Russian Empire. During her time, the only acceptable image of the colonized people was that of ignorant peasants, and stir Ukrainka’s fancy it did not. A polyglot in command of nine European languages, she populated her poetic dramas with archetypal characters from classical mythology, Scripture, medieval legends, and Romantic poetry. Twining Ukrainian anticolonial subtext and European cultural context, Ukrainka also undermined the masculinist underpinnings of some familiar plots. A turn-of-the-century writer in a ruffled-collar blouse, she revised the key myths of Western culture from a woman’s point of view, venturing into literary territory later to be explored by second-wave feminists.

. . . .

Ukrainka’s poetic drama Stone Host (1912) became the first story of Don Juan in European letters written by a woman. Tirso de Molina, Molière, E. T. A. Hoffmann, Lord Byron, and Alexander Pushkin were among her predecessors. Ukrainka’s version transforms the fabled libertine, the great Romantic sinner and seducer into his supposed conquest’s plaything. Donna Anna is the unmistakable New Woman of the fin de siècle, albeit dressed in Spanish courtly garb. Confused by her rationality, Ukrainka’s Don Juan cries out, “You are indeed stone, without soul or heart,” only to hear in response, “Though not without good sense, you must admit.” Don Juan agrees to sacrifice his freedom and become Donna Anna’s sword in the fight for the throne. Donna Anna’s manipulative power compensates for her overall powerlessness within a male-dominated society, which can silence her no longer. Ukrainka’s heroines seize the right to tell their stories.

Link to the rest at The Los Angeles Review of Books

PG doesn’t wish to rain on the triumphant parade of Ukrainka’s heroines, but must point out that Joseph Stalin did a pretty thorough job of crushing millions of Ukrainian women and men during the 1932-33 Ukrainian famine (The Holodomor, “to kill by starvation” or Terror-Famine).

Powerlessness is not always gender-related.

Starved peasants on a street in Kharkiv, 1933. In Famine in the Soviet Ukraine, 1932–1933: a memorial exhibition, Widener Library, Harvard University. Cambridge, Mass.: Harvard College Library: Distributed by Harvard University Press, 1986. Procyk, Oksana. Heretz, Leonid. Mace, James E. (James Earnest). ISBN: 0674294262. Page 35. Initially published in Muss Russland Hungern? [Must Russia Starve?], published by Wilhelm Braumüller, Wien [Vienna] 1935.

A Worse Place Than Hell

From The Wall Street Journal:

“The real war will never get in the books.” Walt Whitman’s well-known prediction has not prevented thousands of writers, including Whitman himself, from trying to put the Civil War between covers. Many kinds of chronicles have been written—military histories, political studies, overviews of society or culture, portraits of leading figures. One especially striking way of bringing the war alive is to convey it from the standpoint of the unexalted individual. That is the choice John Matteson makes in “A Worse Place Than Hell,” a moving group portrait that uses the Battle of Fredericksburg, in late 1862, as the focal point for the story of five participants in the Civil War, four Northerners and one Southerner.

The battle that Mr. Matteson highlights has attracted a lot of scrutiny over the years, most notably in Francis Augustín O’Reilly’s “The Fredericksburg Campaign” (2003) and George C. Rable’s “Fredericksburg! Fredericksburg!” (2002). These books give details of the fateful encounter near the Rappahannock River on Dec. 13, 1862, in which the Army of the Potomac under Ambrose E. Burnside met resounding defeat at the hands of Robert E. Lee’s Army of Northern Virginia. The futile assaults by waves of Union soldiers on Confederate troops, who were protected by a stone wall on Marye’s Heights, have become a fixture of Civil War lore. On that grim winter day, the Union suffered more than 12,000 casualties, compared with some 5,300 on the Confederate side. President Lincoln put a positive spin on the battle by praising the surviving Union soldiers for their bravery. Privately, however, he confessed that the battle had left him in “a worse place than hell.”

Although Mr. Matteson uses Lincoln’s phrase for his title, he doesn’t dwell on the hellish aspects of the war. Instead he concentrates on personal and cultural transformation. The people he follows were profoundly changed by the war, he tells us; all of them “confronted war and struggled to redeem themselves within it.” Oliver Wendell Holmes Jr., the son of a famous Boston physician and author, entered the war as an idealistic man and emerged from it hard-bitten and skeptical, leading him to seek direction in a legal career. The Rev. Arthur Fuller, the brother of the women’s rights champion Margaret Fuller, served as a chaplain in a Massachusetts regiment but at Fredericksburg traded his ministerial role for a military one, taking up a gun in a burst of patriotism and losing his life to Confederate bullets. The budding author Louisa May Alcott, hoping to contribute to the Northern cause, became a volunteer nurse in a Washington war hospital, an experience that fed into her popular book “Hospital Sketches” and later provided the emotional background for “Little Women,” a fictionalized portrayal of the Civil War’s toll on her Concord, Mass., family.

As for Walt Whitman, he was writing poems and newspaper stories in Brooklyn and hobnobbing with bohemians when he heard that his brother George had been wounded at Fredericksburg. He traveled first to Washington and then south to the environs of the battlefield in search of his brother, whose wound, as it turned out, was not serious. Walt stayed on for several years in Washington, taking on minor government jobs while serving as a volunteer nurse in war hospitals, setting the stage for his later role as the major poet and memoirist of the war. Two of Whitman’s poems about Lincoln, “When Lilacs Last in the Dooryard Bloom’d” and “O Captain! My Captain!,” are timeless eulogies of America’s greatest president, and his writings about the war, in poetry and prose, are at once crisply realistic and emotionally resonant. George Whitman, Walt’s brother, ended up serving in many Civil War battles and thus provides, in Mr. Matteson’s narrative, a kind of moving lens on the war as it unfolded on the battlefield.

In addition to these Northerners, Mr. Matteson describes the dashing John Pelham, a Confederate artillery officer who exhibited unusual courage. At Fredericksburg, partly hidden by a dip in the land, Pelham coolly supervised the firing of a cannon that was protected by its very proximity to Union troops: Their return volleys mainly went over the heads of the rebels. Pelham’s death at the Battle of Kelly’s Ford, three months after Fredericksburg, becomes in Mr. Matteson’s handling a dramatic, hopeless flourish of Confederate chivalry. Pelham charged forward on a horse like a blond god of war before being felled by an enemy shell fragment. The loss of Pelham was a blow for Confederate morale. Mr. Matteson writes: “No individual in the Confederate Army had seemed more invincible than Pelham. His risks had never been punished, and his audacity had been continually rewarded. If he could fall, so, too, might the army he left behind.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG notes the scale of death in the American Civil War – 620,000 deaths in the Civil War vs. 644,000 US deaths in all other conflicts in the history of the nation.

The Civil War killed over 2% of the total US population at the time. In a distant second place, World War II killed 0.39% of the US population.

For every three soldiers killed in battle, five more died of disease. No record was kept of those who were psychologically damaged, but not killed, in the war.

Recruitment on both sides was very local and either no records or very scanty records were kept of the number who enlisted from various counties and states and who they were. Neither army had systems in place to accurately record deaths or notify the families of the deceased or wounded during the war.

Because families and communities went to war together and served together, a single battle could devastate the communities and families whose sons served together.

As just one example, in the Battle of Gettysburg, the 26th North Carolina, composed of men from seven counties in the western part of the state, faced the 24th Michigan. The North Carolinians suffered 714 casualties out of 800 men. The Michiganders lost 362 out of 496 men.

Nearly the entire student body of Ole Miss (The University of Mississippi) – 135 out of 139 – enlisted in Company A of the 11th Mississippi. Company A, also known as the “University Greys,” suffered 100% casualties in Pickett’s Charge.

It is estimated that one in three Southern households lost at least one family member in the war. Of those who survived the war, one in thirteen veterans returned home missing one or more limbs, making them unemployable in most parts of the country. 

PG obtained much of this detailed information from Civil War Casualties.

The Johnstown Flood

A locomotive whistle was a matter of some personal importance to a railroad engineer. It was tuned and worked (even “played”) according to his own personal choosing. The whistle was part of the make-up of the man; he was known for it as much as he was known for the engine he drove. And aside from its utilitarian functions, it could also be an instrument of no little amusement. Many an engineer could get a simple tune out of his whistle, and for those less musical it could be used to aggravate a cranky preacher in the middle of his Sunday sermon or to signal hello through the night to a wife or lady friend. But there was no horseplay about tying down the cord. A locomotive whistle going without letup meant one thing on the railroad, and to everyone who lived near the railroad. It meant there was something very wrong.

The whistle of John Hess’ engine had been going now for maybe five minutes at most. It was not on long, but it was the only warning anyone was to hear, and nearly everyone in East Conemaugh heard it and understood almost instantly what it meant.

David McCullough, The Johnstown Flood

Smalltime

From The Wall Street Journal:

Do we really need another Mafioso-in-the-family memoir? I mean, seriously, we’ve had books that could be called Mafia Wife, Mafia Dad, Mafia Son, Mafia Stepdaughter, Mafia Uncle, Mafia Dachshund, Mafia Goldfish—okay, well, I made up a couple of those, but you get the point. When Al Capone’s purported grandson publishes a memoir, and he has, I think it’s safe to say we’ve reached saturation.

Which is why I was surprised how thoroughly I enjoyed Russell Shorto’s “Smalltime: A Story of My Family and the Mob.” Even more so once I realized a more accurate subtitle for the book would be “Searching for Grandpa: Second-in-Command of the Johnstown (Pa.) Mob.” In other words, this is not Mafia history that will send Geraldo Rivera scrambling to open a Shorto family safe anytime soon.

And that, oddly, is part of the book’s charm. The author of well-received histories of Amsterdam and New York City, Mr. Shorto has produced something that feels altogether fresh, a street-level portrait of how his late grandfather helped build what amounted to a Mafia small business—or businesses, actually, everything from the numbers and rigged card and dice games (Grandpa’s specialty) to pool halls, a cigar store, bars, bowling alleys and pinball arcades. There’s a murder mystery here—there has to be, right?—but make no mistake, this is a spry little book about small business.

As Mr. Shorto tells it, he had only the vaguest notion of his namesake grandfather Russell “Russ” Shorto’s career until an elderly cousin buttonholed him and urged him to write a book. Mr. Shorto is reluctant—“not my thing,” he avers—but soon finds himself in a Johnstown Panera Bread, surrounded by a gang of ancient, white-haired wise guys dying to tell him about the old days. Grandpa Russ, it seems, had a long run as a mid-level Mafia bureaucrat, running a sports book and crooked card games among other things, until his drinking got out of control and the law finally came calling.

For Mr. Shorto, the challenge is Grandpa Russ’s personality, or lack of one. He was a quiet man and, despite all the Panera chats, remains a cipher for much of the book. The story opens up once Mr. Shorto goes in search of the public Russ, tracing his family from its Sicilian roots and cataloging his newspaper clippings and arrest and FBI records. What emerges is the gritty tale of a talented card-and-dice cheat who gets his break in the late ’30s when a buttoned-down Mafioso named Joseph “Little Joe” Regino, who made his bones in Philadelphia, marries into Russ’s family and opens a Mafia franchise in Johnstown.

This was industrial-age Pennsylvania, and postwar Johnstown was a city of steel factories, whose workers quickly cottoned to the backroom gambling and after-hours places Russ and Regino opened. Russ’s masterstroke was something they called the “G.I. Bank,” a thrumming numbers operation that proved a cash machine. They invested the profits in a dozen local businesses and paid off the mayor and cops, while allowing them to make periodic “raids” to sate the newspapers. A handful of foot soldiers would get pinched, a few hundred dollars in fines would be paid, and they would do it all again the next year.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Reading My Way Through a Pandemic with Post-Apocalyptic Literature

From The Literary Hub:

I have long loved post-apocalyptic fiction. It’s a fascination that goes back to my teen years. One of my earliest end-of-civilization reads was Lucifer’s Hammer by Larry Niven and Jerry Pournelle, in which a comet is hurtling toward Earth, poised to destroy civilization but not all humans, at least not the ingenious ones. One of my favorite scenes in that book involves a scientist double-bagging books and sinking them in a septic tank. And not just any books: books that will help rebuild civilization, like manuals on the internal combustion engine. It is an enticingly clever plan I’ve been meaning to replicate ever since.

Instead, I’ve settled for keeping a massive supply of water in my basement. I am both intrigued by the idea of being that prepared and conscious of not slipping into obsession, so I have several lifetimes’ supply of bandages (including some that claim to staunch battle wound bleeding) but no food rations. Because that would be a step too far.

My fascination is such that my best friend is my best friend due to it. When my first book came out, I was at a book festival, manning a table lonelier than the sole survivor of a nuclear attack, when I turned around to survey my neighbors. At the table behind me, a pleasant, sweet-faced soccer-mom type displayed an array of meticulously arranged books with a delightful light blue cover adorned with a smattering of birds. I squinted and made out the title: Pandemic. The birds represented the vector for the disease that strikes in her book. I loved her immediately.

This past spring, I was scared, but all we had to do was get through a couple of weeks of “flattening the curve,” I told myself. I bought too much frozen food and prepared to ride it out, like the heroine in one of my post-apocalyptic novels.

. . . .

As two weeks became a month, which bled into a stunted summer, I turned my attention to what has gotten me through so many of the difficult times of my life: books. At first, I couldn’t focus, so I took a familiar route: read something so compulsively page-turning that I couldn’t help but be led along. I turned to my brother, also a writer, for a recommendation.

“I’ve been thinking a lot about Salem’s Lot by Stephen King,” he said. “It’s about vampires, not disease, but it reminds me of the pandemic. Like how everyone’s in denial and thinking nothing bad could ever happen there, but by the time anyone catches on to what’s going on, the town is overrun.”

It did the trick, showing me something about my world while carrying me away from it. After that, I revisited some old favorites: Earth Abides; Alas, Babylon; World War Z. Each story destroyed the fragile fabric of modern life in its own way, but all presented me with the question that makes post-apocalyptic fiction so alluring: if the niceties and nuances, the comforts and conventions of modern living were suddenly stripped away, who would we be, really?

No book made me wonder this more than The Postman (a fantastic book not to be judged by the unfortunate movie adaptation starring that wrecker of post-apocalyptic books, Kevin Costner). In it, a humble civil servant finds meaning post-end-of-civilization through carrying out the seemingly simple task of delivering the mail. There is an evil warlord, and somewhere along the way we find out what the warlord used to be in the before times. Our times. He was an insurance salesman. I imagined a warlord somewhere in me too, beneath the couch potato with a penchant for avoiding the planks and 15-minute high-intensity interval training video I keep promising myself I’ll do.

But the pandemic taught me that civilization-wide upheaval is often not as splashy as it is in books. Instead of warlords, it’s harried moms at Target that’ll get you, snatching the last of the frozen mozzarella sticks with a look equal parts ruthlessness and apology in their worried eyes.

Link to the rest at The Literary Hub

As PG was reading the OP, a phrase arose from the grimy depths of the dank sub-basement of his mind: “The word is not the thing.”

PG recalled the phrase being tossed about during his college years, shortly before the Russian Revolution. It was from a semantics class and simply meant that when you talk or write about a loaf of bread, such talk or writing is something quite different from an actual loaf of bread.

Moving up a notch, the term, “Russian Revolution”, is something quite different than what happened in Russia between 1917 and 1923. For one thing, “Russian Revolution” didn’t kill anyone while the actual Russian Revolution resulted in the deaths of a whole bunch of people.

PG did a little research and discovered the following, more lengthy quote:

The words are maps, and the map is not the territory. The map is static; the territory constantly flows. Words are always about the past or the unborn future, never about the living present. The present is ever too quick for them; by the time words are out, it is gone.

The author of the quote was a Polish scholar named Alfred Habdank Skarbek Korzybski (no, PG has no idea how it is pronounced) who is often credited with developing a field of academic inquiry called general semantics. (Korzybski viewed General Semantics as something different than Semantics, but, again, PG can’t help you there.)

Korzybski was born to a wealthy aristocratic family in Warsaw, served in the Russian Army during World War I until he was wounded and, somehow, made his way to the United States during the latter part of the war and started writing books and giving lectures.

Evidently, Korzybski was a fun teacher, as illustrated by the following anecdote.

One day, Korzybski was giving a lecture to a group of students, and he interrupted the lesson suddenly in order to retrieve a packet of biscuits, wrapped in white paper, from his briefcase. He muttered that he just had to eat something, and he asked the students on the seats in the front row if they would also like a biscuit. A few students took a biscuit. “Nice biscuit, don’t you think,” said Korzybski, while he took a second one. The students were chewing vigorously. Then he tore the white paper from the biscuits, in order to reveal the original packaging. On it was a big picture of a dog’s head and the words “Dog Cookies.” The students looked at the package, and were shocked. Two of them wanted to vomit, put their hands in front of their mouths, and ran out of the lecture hall to the toilet. “You see,” Korzybski remarked, “I have just demonstrated that people don’t just eat food, but also words, and that the taste of the former is often outdone by the taste of the latter.”

The author William Burroughs attended one of Korzybski’s workshops, and Robert A. Heinlein named a character after him in his 1940 short story “Blowups Happen.”

Back to the OP, of course a book about a pandemic and an actual pandemic are two entirely different things and experiencing one is unlike experiencing the other no matter how real the book seems.

Finally, a short video of Korzybski himself, explaining a bit about General Semantics.

Religion and the Rise of Capitalism

From The Wall Street Journal:

The biggest cheerleaders for the Enlightenment in our era—Steven Pinker, for example—often imagine it as having occurred in a religious vacuum. They forget, or appear to forget, that Isaac Newton was animated by his Christian belief that God had created an intelligible universe; and they sidestep the Christian foundations of political rights and other key Enlightenment principles.

The story of modern economics, told from this perspective, would point out that Adam Smith, whose treatise “The Wealth of Nations” was acclaimed from almost the moment it appeared in 1776, had little interest in religion and that his mentor and friend David Hume was a fierce skeptic. Smith, resisting the tides of his time, showed that specialization of labor and voluntary exchange are the keys to prosperity, providing benefits for all even if workers, employers and other market participants are motivated mainly by self-interest.

Harvard economist Benjamin Friedman tells a very different story in “Religion and the Rise of Capitalism.” In the decades before Smith’s masterwork, several French theologian-philosophers known as Jansenists (Catholics heavily influenced by St. Augustine) mused about the difficulty of distinguishing self-interested actions from genuine charity. Bernard Mandeville’s satire “The Fable of the Bees” showed vile behavior producing socially beneficial consequences. These ideas were in the “cultural soil,” Mr. Friedman argues. He even suggests that Smith’s benign view of human nature may have been influenced by meals he shared with enlightened clerics in his dining club, the Select Society.

The book’s title is misleading—Mr. Friedman’s narrative is about the evolution of economic thought, not capitalism. He alternates between theological debates and developments in economic thought. On economics he is compelling, on theology disappointingly tendentious. Mr. Friedman unselfconsciously presents as fact a host of skeptical—and highly debatable—claims about Christianity and biblical texts. More important, he relies on a caricatured version of Calvinism, especially the New Testament-based doctrine that God predestines some to be saved, to set up his central claim: The weakening of traditional Calvinism, he contends, spurred a more optimistic conception of human potential, which helped to inspire key innovations in economic thought. 

After Adam Smith, Mr. Friedman shifts his attention to America, where the links between theology and economic thought are more direct. Several of the best-selling treatises in the early 19th century were written by minister-scholars. Rev. John McVickar, professor of moral philosophy at Columbia, and Rev. Francis Wayland, president of Brown, both saw the magic of market pricing as a recipe for prosperity. Mr. Friedman attributes the optimism in American economic thought in this era—which stood in contrast to Thomas Robert Malthus’s gloomy predictions that population growth would cause terrible suffering—to the abundance of land and opportunity, and to the rejection of Calvinism. 

. . . .

As industrialization and urbanization increased, theologians wrestled with the issues of income inequality. Mr. Friedman characterizes one response as a Gospel of Wealth—exemplified by the famous preacher Henry Ward Beecher and by Andrew Carnegie, who defended the accumulation of wealth on religious grounds. He juxtaposes this with the Social Gospel movement that emerged late in the century. Social Gospelers like Washington Gladden and Walter Rauschenbusch called for public ownership of utilities and emphasized social rather than individual sin. Although the Social Gospelers sought social reform, they were optimistic about the nation’s future. Their optimism reflected a theological perspective called postmillennialism. Whereas premillennialists believe Jesus will return before the start of a thousand-year period described in the Book of Revelation and therefore tend to doubt we can cure society’s ills, postmillennialists interpret the same book as predicting a thousand-year golden era prior to Jesus’ return and look forward to a new era of equity and security.

. . . .

By the early 20th century, the story peters out. Economics “has become ever more mature as a discipline,” Mr. Friedman reflects, “and therefore progressively more insulated at the basic conceptual level from influences originating outside the field.” This is a very convenient end date for Mr. Friedman’s thesis, because theology took a much darker turn after the horrors of World War I. The leading theologian of the mid-20th century was Reinhold Niebuhr—unmentioned in this book—who dismissed the Social Gospel movement as an exercise in naiveté. Because sin is pervasive, Niebuhr argued, justice “is a balance of competing wills and interests, and must therefore worst anyone who does not participate in the balance.” Mr. Friedman succinctly and accurately describes the Keynesian revolution during the Depression—John Maynard Keynes’s contention that markets will not adjust effectively if there is inadequate demand for goods and services. “Introducing the concept of aggregate demand,” Mr. Friedman writes, “opened the way for an entirely new dimension of economics: macroeconomics, meaning analysis of the behavior of entire economies.” But religion plays no part, because economics is, in his word, mature.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG doesn’t know of any regular commenters who may be tempted, but he’s not excited about receiving any ad hoc attacks on religion in general or particular religions in the comments. People have differing views on the subject.

One of PG’s longest and best discussions of religion in general was one he held some years ago with a fellow attorney and friend who was, and likely still is, an Orthodox Jew who was very serious about his beliefs and, in PG’s non-expert assessment, very knowledgeable about his religion.

PG had and has different religious views than his friend but found a great deal of personal benefit from hearing his friend talk about his religious beliefs and how he applied them in his daily life.

In PG’s observation, many who comment on the impact of religion on society in the United States tend to either overestimate or underestimate its impact.

Again, in his observation, lumping a variety of Christian religions together avoids addressing the differences between such religions and assumes some commonality of influence between religious sects with differing beliefs.

The impact of various religions on the daily behavior of their adherents also varies among Christian churches. Some churches teach doctrines that are easier or harder to follow than others. Members of some churches may view co-religionists who do not adhere to major or minor church doctrines differently than members of other churches view such adherence or non-adherence.

In addition to lacking expertise on more than a few Christian religions, PG claims no expertise of any worth concerning non-Christian religions. PG also notes that, as a lifelong American, he has only the sketchiest knowledge of how religions, Christian or otherwise, are regarded by those living elsewhere.

PG will venture to opine that religions in their many and various forms have had and continue to have a significant collective impact on human thought and behavior throughout the world.

He also opines that strong systems of values and belief, whether they call themselves religions or not, have had, and continue to have, a significant impact on human thought and behavior.

Mao, Lenin, and Stalin were each crusaders for their own beliefs.

Gritty and Glittery

From The Los Angeles Review of Books:

THESE ARE DARK DAYS for theater people. Playhouses sit empty. Performers cash unemployment checks. Broadway remains closed and won’t reopen until May 2021 at the earliest. Many industries have found ways to accommodate our apocalyptic new reality, but commercial theater is not among them. There can be no “outdoor dining” equivalent of a $21 million musical.

Here to fill the void — and to remind us of a better time — is Michael Riedel’s Singular Sensation. This juicy, jaunty book is about Broadway in the 1990s, a period of great change that paved the way for the industry’s recent artistic and financial prosperity. Singular Sensation offers less an explanation of present-day abundance, however, than a reminder of all that has been lost. “I never intended the subtitle of this book — The Triumph of Broadway — to be ironic,” Riedel writes in his foreword. Ironic, alas, it is, though the author insists better times are on their way. “There will be a comeback,” he says, “and Broadway is good at comebacks.”

Riedel’s last book, Razzle Dazzle: The Battle for Broadway, tracked the demise and resurgence of New York City, Times Square, and Broadway in the 1970s. Starring two lawyers who took over the Shubert Organization, Broadway’s biggest theater chain, Razzle Dazzle was a relentlessly entertaining piece of cultural history. Riedel, who until his COVID-19 furlough was the much-feared theater columnist at the New York Post, brought his trademark voice — biting, loving — to an epic that was every bit as gritty as it was glittery. He showed how Gerald Schoenfeld and Bernard Jacobs, the hero-lawyers at the center of the book, steered Broadway through fiscal catastrophe and helped deliver New York City into its current affluence. More than just a dishy gossip compendium (though it was that), Razzle Dazzle illustrated just how mutually intertwined the destinies of cities and their arts sectors are.

Singular Sensation has a smaller case to make, and is accordingly a shorter, more narrowly focused book. Much like Razzle Dazzle, it unfolds through a series of show profiles that embody the significant shifts of the era: the decline of the British mega-musical, the reinvigoration of American playwriting and musical comedy, and the increased corporatization of the producer class. Also like its predecessor, it’s a blast.

We begin with Sunset Boulevard, the musical that effectively ended Andrew Lloyd Webber’s Broadway dominance. By 1993, the year of the show’s American premiere, Lloyd Webber had firmly established his commercial preeminence. The British composer’s productions, which included Cats (1982) and The Phantom of the Opera (1988), delivered unprecedented weekly grosses; Sunset Boulevard, adapted from the 1950 Billy Wilder film, seemed destined to join this run of hits. Production staffers referred to it as “The Female Phantom.”

But there were problems. Stage star Patti LuPone opened the show in London on the understanding that she’d follow it to New York. When Glenn Close won the favor of Lloyd Webber in another pre-Broadway tryout, however, LuPone was fired. On learning the news from a Liz Smith column in the Post, LuPone says she “started screaming […] I had batting practice in my dressing room. I threw a floor lamp out the window.”

Link to the rest at The Los Angeles Review of Books

Please Stop Comparing Things to “1984”

From Electric Lit:

George Orwell’s 1984 is one of those ubiquitous books that you know about just from existing in the world. It’s been referenced in everything from Apple commercials to Bowie albums, and is used across the political spectrum as shorthand for the silencing of free speech and rise of oppression. And no one seems to love referencing the text, published by George Orwell in 1949, more than the conservative far-right in America—which would be ironic if they’d actually read it or understood how close their own beliefs hew to the totalitarianism Orwell warned of.

Following last week’s insurrection at the Capitol, Josh Hawley said it was “Orwellian” for Simon & Schuster to rescind his book deal after he stoked sedition by leading a charge against the election results. Donald Trump, Jr. . . . claimed after his father was kicked off Twitter that “We are living in Orwell’s 1984,” then threw in a reference to Chairman Mao for good measure. . . . (V)oices all over Twitter lamented the “Orwellian” purge of their followers after accounts linked to the violent attack were banned from the platform. It’s enough to make an English teacher’s head spin.

I understand why Orwell’s dystopian novel is so appealing to people who want to decry authoritarianism without actually understanding what it is. It’s the same reason I relied on the text for years in my own classroom. Although we often urge our students to resist easy moralizing, the overt didacticism of 1984 has long been part of its pedagogical appeal. The good guys are good (even if they do take the last piece of chocolate from their starving sister or consider pushing their wife off a cliff that one time). The bad guys are bad. The story is linear and easy to follow; the characters are singularly-minded and voice their views in straightforward, snappy dialogue; the symbols are obvious, the kind of thing it’s easy to make a chart about or include on a short answer section of a test. (20 Points: What does the paperweight represent to Winston, and what does it mean when, after it is shattered, he thinks, “How small…how small it always was!”) Such simplicity can be helpful when presenting complicated ideas to young people who are still developing analytical and critical thinking skills. And so, like so many other teachers, I clung to Orwell’s cautionary tale for a long time as a pedagogical tool despite its literary shortcomings.

But when Trump began his rise to political power, I started to notice the dangerous inoculating quality that the text had in my own classroom. Because the dystopia of 1984 was such a simplified, exaggerated caricature, it functioned for my students not as a cautionary tale, but as a comforting kind of proof that we could never get “that bad.” I didn’t take the step to remove the text from my curriculum, but more than in previous years, I began to feel the need to charge the students to consider how things like “doublethink” and Newspeak related to our own political moment. But beyond the intellectual pleasure of the exercise itself (they were more than ready to offer examples of these methodologies across the political spectrum), most students could not bring themselves to consider that the United States could actually sink into the kind of totalitarian control that Oceania experienced. They cited our “freedoms”—speech, press, etc.—as mitigating factors. They trusted norms, even as those norms were being continually tested and broken in real time, the goalposts moving ever closer to political collapse. 

Link to the rest at Electric Lit

  1. PG apologizes for a delayed start to his posting today. Besides blaming Covid (which neither PG nor Mrs. PG nor any PG offspring has caught, but it does tend to weigh on PG’s mind nonetheless), PG had a few surprises earlier in the day that occupied way more time than they should have.
  2. PG reminds one and all that TPV is not a political blog and, given the toxicity of political discussions, debates, shouting contests, etc., in the US during the past few months, PG especially doesn’t want contemporary politics to intrude into the respectful, considerate and interesting environment PG and many other visitors appreciate when they click on a link that leads them here.

1984 was published in 1949, when Joseph Stalin had been ruling the Soviet Union and its people with an iron fist since he became General Secretary in 1922. Orwell was clearly referring to something like a Stalinist society and some of the tactics of the Communist Party in the fictional context of his book.

Fortunately, regardless of which of the two major-party candidates had won the most recent presidential election in the United States, referring to either as an Orwellian or potentially Orwellian head of state would be a gross overstatement.

Senator Hawley’s comment about Simon & Schuster acting in an “Orwellian” manner in canceling his book contract was vastly overheated. While PG is fully capable of deploring the behavior of various major and minor US publishers with a variety of insulting adjectives, “Orwellian” is not one he would use.

The author of the OP, a former English teacher turned author, also took a Hawleyesque turn in some parts of the OP, insulting Republicans in passages that PG omitted from his excerpt.

In both the Senator’s and the former English teacher’s expressed opinions, PG observed the arrogance and foolishness of those who believe their education automatically brings them common sense and perspective on almost any contemporary event.

A Caveman Would Never Do CrossFit. Why There’s Nothing Natural About Exercise

Speaking of sitting around, sheltering in place, eating and drinking holiday cheer, etc.

From The Wall Street Journal:

One of the biggest myths about exercise is that it’s natural. If anything, human instincts lean more toward taking a nap. Want to feel bad about skipping a workout? Blame evolution.

Daniel E. Lieberman argues this theory in his new book “Exercised: Why Something We Never Evolved to Do Is Healthy and Rewarding,” which tips some of the fitness world’s most sacred cows. Everyone knows exercise is good for them, yet studies show most people don’t get enough of it. Mr. Lieberman set out to find out why, and the answers, he hopes, will help remove some of the shame people feel about their own inactivity that makes it even harder to get moving.

Mr. Lieberman criticizes people he calls “exercists” who brag about how much they work out and pass judgment on the less fit as unnaturally lazy. Those who take the escalator instead of the stairs are not guilty of the sin of sloth, he writes, but doing what they were evolved to do—saving energy only for what is necessary or recreational.

Other highlights from the book out Jan. 5: People who believe brutal cross-training workouts bring them closer to the brawny body that belonged to their ancient forebears probably are not familiar with research that shows Neanderthals were only slightly more muscular than today’s regular humans. Fitness buffs who think civilization’s pampering has muted our natural strength might not realize that a profoundly inactive couch potato moves more than a typical chimpanzee in a day. As for our natural talents, it bears noting that the average person runs as fast as a hippo.

. . . .

I wonder how people who do high-intensity cardio with weight training will respond to this book. I feel like you’re calling out CrossFit in particular.

I guess I am. I’ve done some CrossFit workouts, they’re great. I’m not anti-CrossFit. But there’s this CrossFit mystique that your inner primal macho ripped hunter-gatherer ancestor is who you were meant to be. If that gets you happy, that’s fine, all power to you, but you don’t have to make the rest of us feel bad for not doing these intense crazy workouts. They’re not necessary.

You get this sense by reading some books or popular articles that those of us who are contaminated by civilization are somehow abnormal because we don’t want to get out of bed and run an ultramarathon or go to the gym and lift 300 pounds. Our ancestors never did that and they would think it’s crazy because they were struggling to survive with limited food.

. . . .

From an evolutionary standpoint, how do you explain people who love to exercise?

It’s not that we don’t have rewards for being physically active. Our brain produces this wonderful mix of chemicals that makes us glad we’ve exercised. The sad part of the equation is that our brain doesn’t create these chemicals to get us to exercise. We’ve turned something normal and simple and basic into a virtue signaling thing.

It’s like people who are intolerant of other people’s weight. Most people in America are now overweight. They’re not overweight because of some fault of their own, and they’re struggling, and yet there are people out there who are unacceptably mean to people who are struggling. I think we need to have the same level of compassion toward people who are struggling to exercise. There’s nothing wrong with them.

Do you disagree with those who call sitting the new smoking?

Let’s relax. The chair is not the enemy.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG notes he usually doesn’t include two posts from the same source on the same day and hopes all the lawyers who work for the Wall Street Journal are skiing in New Hampshire or otherwise distracted. However, a great many of those writers and editors who work for news publications that cover traditional publishing seem to be experimenting with how it would feel to be permanently unemployed over this holiday break.

PG won’t name names, but some online publications that find something or several somethings to post about traditional publishing every day are running the same headlines they were before Christmas.

‘Finding My Father’ Review: The Lies He Lived By

From The Wall Street Journal:

Thirty years ago, I became hooked on the work of socio-linguist Deborah Tannen after reading her smart, savvy, relationship-saving (yup, my husband read it, too) bestseller “You Just Don’t Understand: Women and Men in Conversation.” Here, and in Ms. Tannen’s many books since then, she displays an acute ability to decode and explain the hidden messages and assumptions our words unwittingly convey, whether about power, status, a wish for greater connection or its opposite. If there is a common lesson in all her volumes (among them, “I Only Say This Because I Love You” and “You’re the Only One I Can Tell”), it’s the importance of listening—and learning how to tune in to what’s really being said.

Now, in her appealing memoir “Finding My Father: His Century-Long Journey From World War I Warsaw and My Quest to Follow,” Ms. Tannen reveals how she acquired this skill, courtesy of her adored father, Eli Samuel Tannen:

When I was a child, and the family gathered after guests left, my father would comment, “Did you notice when she said . . . ?” and go on to explain what meaning he gleaned from the guest’s tone of voice, intonation, or wording. So I trace to him what became my life’s work: observing and explaining how subtle differences in ways of speaking can lead to frustration and misunderstandings between New Yorkers and Californians, women and men, and people of different ages, regions, or cultures.

Over the course of his long life—he died in 2006, at the age of 97—Eli loved nothing more than to reminisce, especially about his earliest years in Warsaw, where he was born into a large Hasidic family. He left that world behind when he came to America in 1920, but he never stopped missing it. His oft-repeated tales made his wife, Dorothy—Ms. Tannen’s mother—roll her eyes at what she dismissed as tedious stories about dead people. For her part, Dorothy remembered almost nothing about her childhood in what is now Belarus, from which she and her family had escaped in 1923. Why look back on a place that, had they stayed, would have probably cost them their lives in the Holocaust?

But young Deborah loved her father’s tales, many of which resembled scenes ripped from an Isaac Bashevis Singer story. She was captivated by Eli’s detailed memories of the tightly knit Jewish community where he spent his first 12 years. The poverty was visceral in a neighborhood teeming with bow-legged children suffering from rickets, and shoeless beggars wearing clothes made from rags. Yet Eli felt sustained by the sheer liveliness of his grandfather’s multigenerational household, where the boy lived with his widowed mother and only sister, surrounded by aunts, uncles and cousins of all ages.

Many of these relatives had remarkable stories of their own. Aunt Magda, for instance, having early in her life eschewed religion for communism, fought with the Soviets in World War II and afterward became a high-ranking official in the Polish government. Aunt Dora’s brilliance led her to a career as a respected mathematician and physicist who studied with Einstein, became his lover and, even after aborting her pregnancy by him, followed him from Europe to Princeton, N.J.

Eli’s own life took a different course. Despite high aptitude scores at his American public school, he dropped out at 14. He spent the next three decades struggling to make a decent living, first to support his widowed mother and his sister, and then his wife and children. In those years he notched up 68 jobs and occupations, from garment worker to prison guard, while also going to night school, eventually earning a law degree before finally opening his own law office in the 1950s. When he retired from his law practice in the late ’70s, he began writing down and recording his memories on tape, so that Ms. Tannen might someday write a book about him. Here, at last, is that book.

In her overly discursive early chapters, Ms. Tannen herself wonders why it took so long. To be sure, it was a daunting task to sift through the countless letters, notes and personal journals Eli had preserved over the course of his long life. But perhaps there was a deeper obstacle. In reconstructing the substance of her father’s life, the author admits that she was also forced to revise what she thought she knew about her parents, both as individuals and as a couple.

The first family myth to fall was the intimate setting of his grandfather’s Warsaw household. Instead, Eli’s letters reveal Aunt Magda admonishing her nephew for waxing nostalgic about a family warmth that never existed. Unable to rebut a truth he had preferred not to acknowledge, Eli admits to his idealized childhood image and owns up to the lack of family closeness he himself had so painfully observed, experienced—and, it seems, repressed.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Ladies of the Good Dead

From The Paris Review:

My great aunt Cora Mae can’t hear well. She is ninety-eight years old. When the global pandemic reached Michigan, the rehabilitation center where she was staying stopped accepting visitors. There were attempts at FaceTime, but her silence made it clear that for her, we had dwindled into pixelated ghosts. She contracted COVID-19 and has been moved again and again. When my mother calls to check on her every day, she makes sure to explain to hospital staff that my great aunt is almost deaf, that they have to shout in her left ear if they want to be heard.

Cora Mae has a bawdy sense of humor. Most of the time when she speaks, it’s to crack a joke that would make most people blush. She wears leopard print and prefers for her hair to be dyed bright red. I have tried to imagine her in the hospital, attempting to make sense of the suited, masked figures gesticulating at her. She doesn’t know about the pandemic. She doesn’t know why we’ve stopped visiting. All she knows is that she has been kidnapped by what must appear to be astronauts.

The film, The Last Black Man in San Francisco, begins with a little black girl gazing up into the face of a white man wearing a hazmat suit. A street preacher standing on a small box asks: “Why do they have on these suits and we don’t?” He refers to the hazmat men as “George Jetson rejects.” It feels wild to watch the film right now, as governors begin to take their states out of lockdown knowing that black and brown residents will continue to die at unprecedented rates, taking a calculated risk that will look, from the vantage point of history, a lot like genocide. The film’s street preacher sounds obscenely prophetic. “You can’t Google what’s going on right now,” he shouts. “They got plans for us.”

. . . .

Under quarantine in Detroit, my father, a photographer, has been sifting through boxes of slides in his sprawling archive. Each image unleashes a story for him. Last week, he told me about arriving in Sarajevo while covering the Olympics. He stayed with a family of friendly strangers eight years before the war. “I wonder if they survived,” he mutters to an empty room.

When my cousin was a police lieutenant, she told us about getting a call for someone who had died. At first glance, they thought the man had been hoarding newspapers or magazines, but then his daughter explained that he was a composer. The papers in those leaning stacks were original compositions.

When we hear on the news that Detroit is struggling, that people are dying, do we imagine composers? Do we imagine a man who sifts through photographs of Bosnia before the war?

. . . .

Last year, the Detroit Institute of Arts mounted an exhibition called “Detroit Collects,” featuring mostly black collectors of African American art in the city. One room was lined with giant photographs of well-dressed collectors. Immediately I recognized the wry smile of a woman named Dr. Cledie Collins Taylor.

My parents have been telling me about Dr. Taylor for years. “You’re going to love her.” “Her house is a museum.” “She used to live in Italy.” “She loves pasta.” When a friend came to town, we thought we would go to the Detroit Institute of Arts, but my father took us to Dr. Taylor’s house instead. The sun was spilling across the horizon, raspberry sherbet bleeding into orange, and the temperature was in the low teens. A handful of houses along the street had big paintings integrated into the architecture of a porch or a window. We knocked on a security gate and a woman in her nineties welcomed us inside.

“Every February, someone discovers me,” Dr. Taylor joked, nodding at the coincidence of receiving extra attention during black history month. I felt a twinge of embarrassment. Here I was, encountering one of Detroit’s most important artistic matriarchs for the very first time.

. . . .

At Dr. Taylor’s house, we sat in the living room and talked for a while. The impeachment hearings were playing loudly on a television in her bedroom. “A lot of my collection is upstairs, why don’t you go take a look.” We crept carefully through a room that seemed to be fitted with exactly as many books as it could tastefully hold, toward a narrow stairway. Upstairs, canvases leaned against the walls. A figurine stood, stark, in a room off to the left. There was a photo-realistic painting of a black woman with short hair, in profile, wearing a pink turtleneck and sitting on a white couch. A black-and-white print of a man with hair like Little Richard peeked out from behind a stack of frames. In a round canvas, a man with an afro slouched behind a shiny wooden table, seated in front of geometric panes of blue, including a massive window framing a cloudy blue sky—as if Questlove were relaxing to “Kind of Blue” inside a Diebenkorn painting. Ordinary scenes of black life, exquisitely rendered, were scattered across the room, a collection relaxing into itself with a kind of easeful, dusty abundance. We were called back downstairs.

Dr. Taylor walked us through the house, and took us on a tour of the basement. African masks, sculptures, shields, and figurines were pinned to pegboard the way other basements showcase drills and rakes. I placed my hand on the baby bump of a pregnant wooden figure. “Somewhere along the line, the collecting idea just catches you,” Dr. Taylor said, humbly.

“You could tell where the problems were because you’d get a lot of things from that place,” she said of collecting work from Africa. “I realized that people were responding to what they needed; certain things in their family shrines they could part with, just to eat.” She told us about her friend, a playwright who convinced a general not to kill her family during the Biafran War. Dr. Taylor tried to hold items and then give them back when wars had ended, once she realized why they’d become so available, but she struggled to get past corrupt middlemen.

Link to the rest at The Paris Review

The Age of Wood

From The Wall Street Journal:

Human history has traditionally been divided into three “ages,” according to the materials—stone, bronze and finally iron—that our ancient ancestors used to fashion their tools. But until very recently, Roland Ennos reminds us, mankind’s most versatile resource grew on trees. In “The Age of Wood,” he takes a fresh look at the familiar substance, wielding it like a wedge to pry open our past, examine our present and even glimpse our future.

Structurally, wood is a marvel. It is pound for pound as stiff, strong and tough as steel yet relatively easy to shape, such as by splitting and carving. No wonder our prehistoric forebears chose wood (not stone) for their first tools and weapons—pointed sticks for digging tubers and hunting game. The earliest houses, plows, wheels and boats were also made from this accessible, adaptable material. And without firewood for warming themselves, cooking food and keeping predators at bay, Mr. Ennos suggests, our ancestors may never have come down from the trees at all.

Toward the end of the Stone Age, people discovered that if logs were left for a long time in a controlled fire (between 600 and 900 degrees Fahrenheit), they would be reduced to charred chunks that burned hotter than regular wood. Using these lumps of pure carbon—charcoal—artisans were able to fire clay into harder, more waterproof pottery and to fuse sand and ashes to create the first glass. Then, around 5,000 B.C., craftsmen in Eastern Europe and the Near East began to purify copper ore by burning it in charcoal fires and pouring it into ceramic molds to form chisels, ax heads and other tools. Some 1,500 years later, they learned that mixing a little tin with the copper produced a sharper, stronger blade, which could cut through trees twice as fast as stone. The Bronze Age had begun.

Far from reducing mankind’s reliance on wood, Mr. Ennos notes, copper and bronze spurred demand by making it easier to convert logs into houses, furniture, fences and myriad other items. With new metal tools, carpenters also built the first wheels and plank ships, launching an era of unprecedented trade, travel and cultural exchange. In the Americas, by contrast, metallurgy didn’t develop, the plank ship was unknown, and the only wheels were made of clay and fitted to children’s toys. “The result,” Mr. Ennos writes, “was to give the people of the Old World a massive lead in logistics, one that five thousand years later was to help them discover the New World and subdue its people.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

America Is Running Out of Nurses

Nothing to do with the writing biz, but of interest in the middle of Covid. This is a story about traveling nurses in the US, essentially nurses who work for a limited period of time in a hospital as contract labor, usually to cover vacations or temporary staffing shortfalls.

From The New Yorker:

Gary Solesbee has an understated manner and an easy Texas drawl; he has been a nurse for twenty-seven years. Last year, he and his wife, who is also a nurse, decided to travel. It would be a chance to explore the country and to spend time with their adult children, who were scattered in different states. In December, they started an assignment in Webster, Texas. In March, they left for Walla Walla, Washington, where their son lives, and where, shortly after they arrived, coronavirus patients began to trickle in. Over the summer, when cases exploded in the Southwest, they signed on with an academic hospital in Albuquerque, New Mexico. “When we started travel-nursing, we had no idea we’d be doing covid nursing,” Solesbee told me. “But that’s pretty much what it’s become.” Solesbee, who now works in a step-down unit—the rung between a regular medical floor and the I.C.U.—is one of several hundred travelling nurses his current hospital system, in Albuquerque, has hired to contend with the surge. Solesbee’s hospital, like many around the country, has had to refashion many floors into covid-19 units. “It’s weird to go to an orthopedics floor and see it transformed for covid care,” Solesbee said. “Then you go to gynecology, it’s half covid. Outpatient services, all covid. Everywhere—it’s covidcovidcovid.” Last month, the New Mexico health department opened another hospital nearby to house recovering coronavirus patients who had no place to go.

. . . .

“It’s disheartening that much of the public still isn’t taking this seriously,” Solesbee said. “I want to tell them, ‘Come to work with me one day. See what it’s like.’ ” Caring for coronavirus patients is exhausting, but it’s their intense isolation, not clinical complexity, that bothers Solesbee most. “That’s the worst thing,” he said. “They can’t have family or visitors come in. We’re the only contact they have, and we can’t be with them as much as usual. We’re supposed to bundle care, limit exposure, do what you need to do and get out.” Still, there are moments when compassion trumps protocol. Recently, an elderly patient’s blood-oxygen levels dipped to dangerous levels despite maximum support. After declining a ventilator, he prepared to die. A nurse sat in the room with the patient, calling each of his family members on FaceTime, one by one, so they could say their goodbyes. “Sometimes, it’s the only thing we can do for people,” Solesbee told me. “But, in some ways, it’s the most important thing.” The other night, Solesbee sat with a dying man who had no family. With no one to call, he simply held the man’s hand as the patient’s breathing grew ragged and intermittent. Finally, it stopped. Solesbee stood up. Another patient needed his help.

Link to the rest at The New Yorker

Unsinkable

From The Wall Street Journal:

On Jan. 24, 1944, the destroyer USS Plunkett found herself offshore Anzio, Italy, supporting the embattled Allied beachhead there. The ship had had an eventful year, participating in the invasions of North Africa and Sicily. Like all destroyers, Plunkett was fast and well-armed for her size. She had many roles, including shepherding other ships and patrolling, but destruction was her natural job. On this day in the Tyrrhenian Sea, the men aboard were disappointed that some much-anticipated shelling of German positions ashore had been canceled.

At around 5:15 p.m., the ship’s doctor spotted a couple of German bombers, Dorniers, in the early-evening sky, fairly high up. The Anzio roadstead was thick with Allied ships, but the Germans owned the skies. “Looks like a little business coming up,” said the doctor almost to himself. “I think I’ll go down to the wardroom.” As James Sullivan tells us in his stirring “Unsinkable: Five Men and the Indomitable Run of the USS Plunkett,” the destroyer’s wardroom—the officers’ mess, at sea—did double duty in battle, serving as the dressing station for the wounded and dying. It would prove to be a busy place that evening.

Soon after calling his crew to battle stations, Plunkett’s captain, Ed Burke—one of the five men that Mr. Sullivan follows from the 1940s through the ends of their lives—could see the glide bombs from the far-off Dorniers floating his way. What followed over the next 25 minutes may have been the most intense aerial attack on a single ship of any navy in World War II.

Soon the Dorniers were joined by dive bombers shrieking in, while torpedo planes came in “low and slow” to drop their sinister payloads that barely moved faster through the water than Plunkett herself. Capt. Burke successfully zigged toward the glide bombs, zagged to run parallel to a torpedo, dodged dive bombs that were landing as close as 20 yards away, and through it all, whenever he could, put the ship broadside on to the planes so his guns would have a chance. Eventually Plunkett was ablaze, but her guns kept firing with coordination and accuracy until the Luftwaffe had to call off the attack. Plunkett’s “bag” that day was impossible to verify, but it likely represented a quarter or more of the force that attacked her. The ship’s casualty list would include, apart from the scores of wounded, 29 men forever “missing in action.” These numbered more than the dead whose bodies could be identified. Plunkett managed to limp back to Palermo, saw action in Normandy on D-Day and ended her war in the Pacific.

. . . .

Certain stories we need to tell regardless of their size. One of Mr. Sullivan’s achievements is to remind us why. “Unsinkable,” a fine narrative in its own right, is also a reflection on the nature of storytelling itself, as well as a valuable and entertaining contribution to the record. It is good to learn the history of the American destroyer, with its origins in the response to the torpedo warfare that began on the Roanoke River in 1864, or to learn how the depth-charges and sonar worked on a vessel of the Gleaves class 80 years later. To make such details compelling reading is an accomplishment. More significantly, Mr. Sullivan takes pains to illuminate and honor a lost world.

He pores over a photograph of a Ship’s Party in March 1943 at the Hotel St. George in Brooklyn. There are the “USO girls, recruited for the party and minded by chaperones.” There is the sailor who looks 14 years old, the towering perms and the crisp uniforms with white carnations, and, “if you look closely enough,” the sound of Benny Goodman on the clarinet.

Mr. Sullivan reads the hundred letters that made their way to and from Plunkett between teenagers Jim Feltz and Betty Kneemiller, who met when Jim was sweeping the floor at Mr. Siegal’s five-and-dime store back home in Overland, Mo. “I still call you mine,” she writes at one point, “but I’m not as definite on that being the truth.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The First Christmas Meal

From The Paris Review:

These days, British and American Christmases are by and large the same hodgepodge of tradition, with relatively minor variations. This Christmas Eve, for example, when millions of American kids put out cookies and milk for Santa, children in Britain will lay out the more adult combination of mince pies and brandy for the old man many of them know as Father Christmas. For the last hundred years or so, Father Christmas has been indistinguishable from the American character of Santa Claus; two interchangeable names for the same white-bearded pensioner garbed in Coca-Cola red, delivering presents in the dead of night. But the two characters have very different roots. Saint Nicholas, the patron saint of children, was given his role of nocturnal gift-giver in medieval Netherlands. Father Christmas, however, was no holy man, but a personification of Dionysian fun: dancing, eating, late-night drinking—and the subversion of societal norms.

The earliest recognizable iteration of Father Christmas probably came in 1616 when, referring to himself as “Captain Christmas,” he appeared as the main character in Ben Jonson’s Christmas, His Masque, performed at the royal court that festive season. Nattily dressed and rotund from indulgence, he embodied Christmas as an openhearted festival of feasts and frolics. But by the time he appeared on the front cover of John Taylor’s pamphlet The Vindication of Christmas, in 1652, Father Christmas had grown skinny, mournful, and lonely, depressed by the grim fate that had befallen the most magical time of year. The days of carol singing and merrymaking were over; for the past several years Christmas across Britain had been officially canceled. The island was living through a so-called Puritan Revolution, in which the most radical changes to daily life were being attempted. Even the institution of monarchy had been discarded. As a ballad of the time put it, this was “the world turned upside down.”

The prohibitions on Christmas dining would have particularly aggrieved Robert May. One of the most skilled chefs in the land, the English-born, French-trained chef cooked Christmas dinners fit for a king—a doubly unwelcome skill in a time of republicanism and puritanism. May connected the medieval traditions of English country cooking with the early innovations of urban French gastronomy, and was at the height of his powers when the Puritan Revolution took effect. During those years, he compiled The Accomplisht Cook, an English cookbook of distinction and importance that was eventually published in 1660. In more than a thousand recipes, May recorded not only the tastes and textures of a culinary tradition, but a cultural world that he feared was being obliterated—including the Christmas dinner, an evocative sensory experience that links the holiday of four centuries ago with that of today.

. . . .

The young May’s experiences abroad hint at the changes occurring in English food culture of the time, especially among the social elite. During the late Tudor and Stuart eras, numerous foodstuffs, including potatoes, tea, coffee, chocolate, and tobacco, arrived from the Americas and established themselves as staples of the national diet. The Accomplisht Cook is replete with non-English influences, giving us a vivid idea of what new fashions entered his kitchen in the early 1600s. May drew heavily from Spanish and Italian recipes, and his book includes thirty-five dishes for eggs that he took from the pioneering French chef François Pierre La Varenne. Despite this, May’s food was quintessentially English. The Accomplisht Cook laments that French chefs “have bewitcht some of the Gallants of our Nation with Epigram Dishes” in favor of the sturdy traditions of English cooking. The Englishness of May’s approach is palpable in his suggestions for Christmas dinner, dominated by roast meats and featuring a mince pie. Today’s mince pies—a Christmas institution in Britain and Ireland—are filled with a sickly-sweet concoction of dried fruit, fortified wine, mixed spices, and mounds of brown sugar, but before the Victorian era they also contained meat. May suggests numerous cuts of beef (including tongue, buttock, and intestine) or hare, turkey, and mutton, among others. In his recipes for a veal-based mince pie, he recommends mixing it with more familiar ingredients such as dates, orange peel, nutmeg, and cinnamon, flavors that are still powerfully evocative of what many of us would consider a “traditional” Christmas.

May’s bill of fare for Christmas Day is huge: forty dishes split across two courses, with additional oysters and fruit. Partly this reflects the nature of May’s experience in the service of some of the wealthiest people in the country, and partly the Stuart approach to dining. The diaries of May’s contemporary Samuel Pepys detail the meat-heavy, gut-busting dinners he hosted each year on the anniversary of his kidney stone operation (that the procedure worked and didn’t kill him was, in the seventeenth century, truly a cause for celebration).

Link to the rest at The Paris Review

Survival Strategies for Unsupervised Children

From Electric Lit:

We’re called the Crazy 9, but there are not always nine of us. We were nine before la policía took Tuki. We called him Tuki because he loved to dance all weird. Every time he heard the tuki-tuki of electronic music, he flailed his arms and raised his knees like some sort of strange bird. Tuki was funny but a little mean. I miss him, but not too much.

I feared we would be seven soon. Ramoncito hadn’t been feeling well, throwing up everywhere. He smelled really bad because he pooped his pants the other day and hadn’t been able to find new ones, so we didn’t like to stand next to him. Or sometimes we made fun of him and yelled, “Ramoncito, pupusito!” and everyone laughed and laughed and laughed, but inside I wasn’t laughing too hard; inside I felt bad. When the others were asleep, I pinched my nose with my finger and thumb and went to Ramoncito. I used to bring him something to eat too, but the last two times he threw up right after, so I didn’t bring him food anymore—why waste it, is what I say—but I still asked, “How are you feeling, Ramoncito?” and “Is there anything I can do, Ramoncito?” My voice sounded funny because of the nose pinch, and sometimes he smiled. Before, he would talk to me a little, but now he didn’t talk much. He could still walk around and go with us on our missions, but he was very slow. His eyes were sleepy all the time, and they looked like they were sinking into his skull. But we also laughed at him because he’s the youngest, only seven and a half, and everyone always gives the youngest a hard time. I was the youngest before Ramoncito came along, but even if Ramoncito didn’t last much longer, the others wouldn’t treat me like the youngest because I was the one that found the knife, and I’m the best at using it.

. . . .

Here is what the Crazy 9 love.

We love our name, and we won’t change it, even if we are really eight, or seven—we love it because it sounds crazy and because we scrawl it all over the place—when we find spray cans, or markers, or pens.

We love the knife. We found it one night after running away from the lady who wouldn’t give us any money, so we pushed her and took her purse. As we gathered to inspect our loot on the banks of the Güaire River, I pulled it from a secret pocket, shiny and dangerous. We love to take turns and unfold the blade from its wooden handle and scream, “Give me all your money!” but we are just practicing. I carry the knife most of the time because I found it, but also because I can throw it at a tree and almost always get it to stick, and I can also throw it in the air and almost always catch it by the handle without cutting my hand.

We love Pollos Arturos, it’s everyone’s favorite, but we almost never get to have any, because if the guard sees us he screams and chases us away—but sometimes we will beg and someone will give us a wing. One time Ramoncito got a leg, but that was before he was throwing up. He got a leg because the youngest always does the best begging. But we have rules in the Crazy 9, so we didn’t take the leg away from Ramoncito. He ate it all by himself.

We love going to the protests. We don’t go to the front too much because that’s where the police fight the protesters—the protesters wear their T-shirts tight around their faces, or they make gas masks out of junk, or they wear bicycle helmets and carry wooden and zinc shields with the colors of the flag painted on them; they throw mostly rocks at the police, but sometimes they shoot fireworks at them. One of them holds the cohetón parallel to the ground—aimed straight at the line of men in their green uniforms and their plastic shields and their big shotguns—while another lights the fuse. They only let it go when the whistling is loud, and we think they might be holding on to it for too long, long enough for it to explode in their hands, but then we see it fly like a comet straight into the green and plastic wall of soldiers that stands down the road. We always cheer when we see that.

Sometimes we stand next to them and yell at the police. We wrap our T-shirts around our faces and scream “¡Viva Venezuela!” and “¡Abajo Maduro!” and jump and throw rocks. It’s fun, except for when the tear gas comes and we have to run away or else cough and cough and cry and cry. But we mostly stay at the back of the protests because we can beg or steal better. Because the women are there, or the older men, or the cowards that don’t want to fight in the front, like us. The begging is good at the protests. The lady will see us and tell her friend in the white shirt and the baseball cap with the yellow, blue, and red of the flag, “Our country is gone, isn’t it? Poor child. I swear, chama, I don’t remember it ever being this bad!” That’s the moment when I try them, and most of the time I get a few bolivares. But we have rules in the Crazy 9, so we always share the money we get from begging or stealing.

We love each other. We say “Crazy 9 forever!” and exchange manly hugs. I love that feeling you get when you hug someone and you mean it. But it also makes me remember things I don’t like remembering, so let’s not talk about that.

Link to the rest at Electric Lit

Ravenna

From The Wall Street Journal:

In late antiquity, even before the fall of the western Roman empire, all roads began to lead away from Rome. Emperors abandoned the old capital and moved closer to the frontiers. In the east, they resided at Nicomedia, Thessalonike and eventually Constantinople, Constantine the Great’s “New Rome” on the Bosporus, founded in A.D. 330. Trier was their preferred headquarters in Gaul, and Milan their city of choice in northern Italy. In the fifth century, as the empire began to collapse and was invaded by barbarian armies who threatened Rome’s food supply, ordinary people also moved away from the former capital, whose population gradually declined. In the summer of 402, with Goths crossing the Alps, the western emperor Honorius moved his court to Ravenna, a naval base on the Adriatic. It had an excellent harbor from which he could (and did) receive reinforcements from his brother at Constantinople. The place was surrounded by swamps and rivers, making it famously difficult to besiege, and its “undeveloped urban plan . . . allowed the imperial court to impose its presence.”

Judith Herrin’s recent book contains a sweeping and engrossing history of Ravenna from the moment Honorius took up residence there, through the thriving period of Gothic rule (493-540), and culminating in the two centuries (540-751) when the city was a western outpost of the eastern Roman empire. In that last phase, the city was the capital of a territory called the exarchate, whose commander, the exarch, was sent from Constantinople. Ms. Herrin, a professor emerita of classics at King’s College London, is a past master at writing histories of the early medieval world that combine Eastern and Western perspectives and show why we should not study the one without the other. “Ravenna” offers an accessible narrative that brings to life the men and women who created the city during this period and who fashioned its hybrid Christian culture of Latin, Greek and Gothic elements. The narrative is periodically elevated by discussions of the city’s most famous attractions and its glorious churches, brilliantly illustrated in the book’s 62 color plates. It is also enlivened by recurring digressions on daily life in the city at each phase in its history, insofar as that is revealed by documentary papyri containing wills, donations and contracts that fortuitously survive. These local perspectives are complemented by a global outlook: Ms. Herrin’s argument is that Ravenna passed its unique hybrid culture on to the imperial centers all around it, to Rome, Constantinople, and later to Aachen, the capital of Charlemagne in the north. This made the city the “crucible of Europe.”

Ravenna, a jewel in the midst of a marsh, was a place of paradox. It was, to allude to a collection of Ms. Herrin’s previous studies, simultaneously both metropolis and margin. In the era of the late Roman emperors (fifth century), the city hosted the court but had a relatively small population compared to Rome and was always looking toward developments at Rome and Constantinople. It was, at first, an administrative center with no identity, a kind of Brasilia that was only slowly built up by the court in its own imperial Christian image. The place had no prestige past of its own, either classical or Christian. Under the rule of Theoderic the Great (493-526), Ravenna was briefly the capital of a power structure that stretched from Spain to the Balkans, but Theoderic still had to maneuver carefully between Rome and Constantinople. In theory, he was ruling the former on behalf of the latter, and his followers, who were barbarians (Goths) and heretics (Arians) in the eyes of the Romans, were an occupying force that was stationed in the north, away from the old capital. The Gothic experiment did not last once the eastern emperor Justinian decided to end it in the 530s. Under Byzantine rule, Ravenna was the capital of the exarchate, and gradually developed its own civic traditions and identity, but still it received its governing class from abroad, and people at the real center of power, Constantinople, saw it as a remote outpost.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG says it was not always a lot of fun to live in a city on the eastern shore of Italy, on the Adriatic Sea. The rulers in Constantinople (now Istanbul) ranged from completely enlightened to pretty nasty. Constantinople was the eastern capital of the Roman Empire, then the capital of the entire empire when Rome fell on hard times, then the capital of the Byzantine Empire. For a long time, it was an extremely wealthy city and a place where many trade routes converged.

At the intersection of Europe and Asia, Constantinople was fought over and/or controlled by the Persians, the Ottomans and the Crusaders. The adjective “byzantine” was an apt description of life in Constantinople for a good part of its existence.

When PG and Mrs. PG visited there a few years ago, PG found it to be an extraordinary and fascinating place, an amazing combination of ancient and modern, Muslim and Christian, east and west.

A couple of photos of Ravenna:

Basilica of Sant’Apollinare Nuovo, Ravenna, Italy (via Wikimedia Commons)
Apse mosaic in the Basilica of San Vitale, Ravenna, Italy, built 547 (via Wikimedia Commons)

7 Books to Get You Through Unemployment

From Book Riot:

It’s no exaggeration to say that 2020 has been one of the toughest years in living memory. The COVID-19 pandemic has had a devastating medical impact, and a knock-on economic impact that the world will be feeling for a while to come. Many people have lost their day jobs as a result of the crisis – this Rioter included. In troubled times, I turn to books, and have found that there are a huge number of books to help you through unemployment. Here are some of the books that I’ve found useful, reassuring, or inspiring while I’ve been unemployed.

The Unemployment Guide: How a Setback Can Launch Your Career by Melissa Fleury

Fleury draws on her own experience of being unexpectedly laid off to create this guide on how to turn the shock of unemployment into a plan to work on changing your life. Initially, I was skeptical (disaster rarely actually feels like opportunity), but Fleury’s optimistic, no-nonsense style helps you see the best in the worst situation.

. . . .

Smashing It: Working Class Artists on Life, Art and Making It Happen

The working world has never been overly accessible to marginalised people, particularly those with multiple marginalisations, and the current economic situation has made things that much harder. This collection of essays and short pieces by 31 working-class artists, including Riz Ahmed, Maxine Peake, and Bridget Minamore, is a fascinating and inspirational read for anyone who’s been knocked back by 2020.

Link to the rest at Book Riot

The Ghosts of Cambridge

From The Los Angeles Review of Books:

IN THE 1960s, the Simulmatics Corporation sold a big idea: whirling IBM computers (the kind James Bond villains had) churning through your personal data in order to predict and influence your decision-making.

Simulmatics did several rather consequential things. It helped propel JFK to the presidency. It modernized the New York Times’s election reporting. It modeled the social effects of pandemics for the Department of Health. It launched a boardgame called Ghetto for teaching high school students about social advancement. It advised state police forces on predicting riots. It helped blue-chip companies sell more dog food, coffee, and cigarettes. And it developed the US Army’s counterinsurgency operations in Vietnam.

But then the company flopped and disappeared. You’ve never heard of Simulmatics. The name sounds like an awful line of protein milkshakes, or a one-room cryogenics firm hidden away in a suburban shopping center. Think again.

Jill Lepore, the award-winning Harvard historian and New Yorker staff writer, retraces its incredible rise and fall in If Then: How the Simulmatics Corporation Invented the Future. She tells the story of how data science went digital in the Cold War, turning research in behavioral psychology into big business. If Then reads as a kind of prehistory to Cambridge Analytica, the private consultancy that imploded in 2018 after revelations of its dubious use of Facebook data to elect Trump and win Brexit. Exhuming Simulmatics from the dustbin of history also recasts our own strange moment as a mystery story: Why did the company that “invented the future” fail? And why did we forget it ever existed?

. . . .

Simulmatics’s first project in 1960 was the People Machine, a system for converting demographic data into voter prediction for John F. Kennedy’s surprisingly close election against Nixon. Pool christened the People Machine a “kind of Manhattan Project gamble in politics.” He lured prestigious colleagues by offering them bigger salaries and, even more alluringly, with the promise of significantly upscaling their experiments beyond the lab. In an unguarded moment, Pool would excitedly exclaim to historian Fritz Stern that the Vietnam War “is the greatest social science laboratory we have ever had!”

The Manhattan Project imagery stuck. Harold Lasswell, doyen of communications theory, called the People Machine the “A-bomb of the social sciences” (he meant it positively), as did writer Mary McCarthy (she didn’t). Of the behaviorists’ increasing role in government policy, McCarthy presciently wondered: “[C]onceivably you can outlaw the Bomb, but what about the Brain?”

Link to the rest at The Los Angeles Review of Books

Changing America

From Changing America:

Many Americans are watching more television in quarantine and lockdown during the coronavirus pandemic, but the plethora of content is often niche. Still, in a year of polarizing politics, “The Queen’s Gambit” has united viewers around the world.

Seven episodes were enough to garner the show a 100 percent rating on Rotten Tomatoes and accolades from major critics and publications. A more surprising measure of the show’s success, however, shows just how much one show can change American culture.

“The idea that a streaming television series can have an impact on product sales is not a new one, but we are finally able to view it through the data,” said Juli Lennett, toys industry advisor for NPD, in a statement. “The sales of chess books and chess sets, which had previously been flat or declining for years, turned sharply upward as the popular new series gained viewers.”

. . . .

Sales of chess sets rose by 87 percent in the United States while sales of chess books jumped by 603 percent, according to U.S. Retail Tracking Service data from NPD, which showed that week-over-week sales had been relatively flat for 13 weeks before the show debuted. Google searches for “chess” and “how to play chess” hit a nine-year peak, according to Netflix, and the number of new players on Chess.com quintupled. 

Link to the rest at Changing America

Haynes Manuals Will Stop Publishing New Print Workshop Guides

From Road and Track:

Haynes Manuals, the workshop manuals producer that makes detailed maintenance manuals on pretty much every relevant car on the planet, confirmed today that it will no longer publish new workshop manuals in print. Manuals published by the company from here on out will only be available in digital form.

The company clarified on Twitter it will continue to print its massive back catalog of existing manuals, so if it already has one on your car, you’re still in luck if you want a physical copy.

Haynes, based in the U.K. and founded in 1960, has been a lifesaver for enthusiasts around the world looking to do the work on their own cars. Instead of dropping a bunch of money on a dealer or repair shop, you could pick up a Haynes manual and have all the info you need to get any job done.

. . . .

“We are currently in the process of creating an exciting and comprehensive new automotive maintenance and repair product that will cover around 95 percent of car makes and models—an increase of around 40 percent over our current Workshop Manual coverage.”

Link to the rest at Road and Track and thanks to DM for the tip.

PG still remembers the shock he felt the first time he saw an auto mechanic wearing disposable gloves.

Prior to that, one could discern how much experience an auto mechanic had by looking at his hands. (They were all “hims” in times of old.)

Special soap was available for mechanics, farmers, factory workers, etc., who got their hands really dirty. It would remove a lot of dirt and grease, but that sort of work not only got your hands dirty, it often built up formidable calluses on those hands. When those calluses dried and cracked, dirt and grease built up in the cracks and that was very hard, sometimes impossible, to remove.

In those occupations, the condition of a man’s hands was like a calling card and could command respect among his fellow laborers. If you were a newbie, you couldn’t bluff your way up the status ladder.

Peak Brain: The Metaphors of Neuroscience

From The Los Angeles Review of Books:

UP UNTIL 2013, I carried a digital camera wherever I went: to the archive or the bar, and on the occasional trip. I took thousands of photos, sorting through them at odd intervals to post on social media or send to my mom. There was a rhythm to my relationship with the camera: I pointed, I shot, I uploaded, and (usually) I forgot. But that rhythm fell apart once I got an iPhone. Like many other tasks, taking and keeping photos didn’t just get easier — it fundamentally changed. If the move from film to digital lowered the bar for what was photographable, then the camera phone wiped that bar out entirely. The images seemed just as good, but they were no longer mindful records of my days so much as mindless drafts of myself. I was taking more photos and thinking less about each. It was almost as if the camera, now a part of my phone, had become an autonomous point-and-shoot, sort-and-post extension of myself. Or maybe I had become a part of it.

But this isn’t a story about the good old days. Instead, it’s about how tools like my camera and iPhone shift how we understand ourselves. They do so at a personal level, when we use them so much that we feel naked without them. But technologies have also shaped the sciences of mind and brain by anchoring conceptual metaphors. As far back as the telegraph, our tools for capturing, storing, and communicating information have served as analogies for the brain’s hidden processes. Such metaphors are enabling: they do work, as it were, as convenient shorthands that then spur scientific research and medical treatments. But metaphors also constrain our research and treatments, making some things visible while shielding others from view. In other words, metaphors have a politics. The question is: What kind of politics do they have?

. . . .

The role, if not the politics, of technological metaphors in neuroscience is the subject of Matthew Cobb’s new book, The Idea of the Brain: The Past and Future of Neuroscience. It proceeds from a simple idea that Cobb attributes to the 17th-century anatomist Nicolaus Steno. “The brain being indeed a machine,” Steno reasoned, “we must not hope to find its artifice through other ways than […] to dismantle it piece by piece and to consider what these can do separately and together.” Brains have been many things since then, as Cobb shows: voltaic piles and power grids, tiny factories and immense circuits, willful robots and programmable computers. The history of neuroscience can be read as a history of such metaphors. Whatever the tool, we find a way to see ourselves in it.

. . . .

After a brief tour of the ancient world, Cobb describes brains being dissected in early modern Europe, prodded and shocked in the 19th century, and injected and imaged in the 20th. Without falling prey to determinism or teleology, he maps metaphorical flows between neuroscience and technology. New machines do not cause theories to change, he cautions. Rather, causal arrows fly both ways. Human computers preceded digital ones, after all, and machine learning has been inspired by our own. Metaphors have slipped back and forth: our brains are special “hardware” even as my iPhone acts as “my brain.” Cobb’s history reveals something deep: the complex, codependent relationships we develop with our favorite tools do more than alter how we think. They become how we think. And we can’t do it without them!

. . . .

The year I abandoned my Nikon, it popped up in a surprising place: cognitive science. That year, Joshua D. Greene published Moral Tribes, a work of philosophy that draws on neuroscience to explore why and how we make moral judgments. According to Greene, we make them using two different modes — not unlike a digital camera. “The human brain,” he writes, “is like a dual-mode camera with both automatic settings and a manual mode.” Sometimes, the analogy goes, you want to optimize your exposure time and shutter speed for specific light conditions — say, when faced with a big life decision. Other times, probably most of the time, tinkering with the settings is just too much of a hassle. You don’t want to build a pro-and-con list every time you order at a restaurant, just like you don’t want to adjust the aperture manually for each selfie you take.

Greene’s “point-and-shoot morality” is an example of dual-process theory, made famous by Daniel Kahneman’s Thinking, Fast and Slow. This improbable best seller summarized decades of work, much of it by Kahneman and his longtime collaborator, the late Amos Tversky. Adopting terms proposed elsewhere, Kahneman organizes mental function into two “systems,” corresponding to Greene’s automatic and manual modes. “System 1” is fast and, often, involuntary; “System 2” is slower and more deliberate. All animals have some version of the first, while the second system is limited almost entirely to humans. We may be machines, Kahneman and Greene acknowledge, but we are reflective machines — or at least we can be.

. . . .

Mental life, on this view, is a constant, often subconscious assessment of one’s surroundings for threats or opportunities. And of course, Kahneman has a metaphor for that:

Is anything new going on? Is there a threat? […] You can think of a cockpit, with a set of dials that indicate the current values of each of these essential variables. The assessments are carried out automatically by System 1, and one of their functions is to determine whether extra effort is required from System 2.

But this raises an important issue: how helpful is the cockpit metaphor, given how little most of us know about operating airplanes? And what does it suggest about how we imagine ourselves, our capacities and purposes, and how we interact with one another? The same might be asked of the computer, or of Greene’s camera: what do we gain by framing our mental and moral lives in these technological terms, and what do we lose — in scientific or ethical terms?

Link to the rest at The Los Angeles Review of Books

PG started thinking about how his brain works, but quickly became bored. He’s not certain exactly what that might mean.

Scourge of the Elites

From The Wall Street Journal:

Thorstein Veblen may be the most important American thinker most Americans have never heard of. A prolific economist at the turn of the 20th century, Veblen produced groundbreaking work on the mysteries of inequality that earned him the admiration of his academic peers, while his searing observations about the “conspicuous consumption” and “predatory” habits of the wealthy won him an audience far beyond the ivory tower. Veblen’s epic life ended in despair—a final note urged that no biographical account or intellectual tribute ever be paid to him—and yet long after his passing, writers including John Dos Passos, Carson McCullers and John Kenneth Galbraith persistently refused to honor his dying wish.

About 40 years ago, however, Veblen fell precipitously out of fashion. To the small clique of enthusiasts who remain, he is understood as a social critic or philosopher—fine things, to be sure, but a dead-end legacy for an economist in an age in which economics still calls the tune for public policy.

And so it is that an insightful new Veblen biography comes to us from Charles Camic, a sociologist at Northwestern University who proves himself a capable guide down the tumultuous currents of late-19th-century ideas. Mr. Camic’s Veblen is an intellectual flamethrower, torching every school of thought in sight, from the classicism of Adam Smith to the communism of Karl Marx, attempting to clear the ground for a new kind of science.

Economists, Veblen argued, were doing economics all wrong. They should have been studying the development and decline of economic institutions. Instead, they were devoting themselves to empty abstractions about consumption, production and productivity that could not be verified by data or experience.

Veblen cut a new course for the discipline by composing ambitious treatises on the origin of inequality. In “The Theory of the Leisure Class” (1899) and “The Theory of Business Enterprise” (1904), he argued that the rich and the modern corporation—examples of what he vaguely defined as “institutions”—were parasitic elements that leeched wealth from more productive segments of society. The ostentatious rich, Veblen maintained, were not evidence of productive abundance but proof that modern industry was making society poorer.

Veblen made his case with the acid wit of an outsider granted ill-considered admittance to an inner sanctum. Raised by Norwegian immigrants in the hinterlands of the Midwest, Veblen wedged himself into university life in the mid-1870s, just as the academy was beginning to assume an aura of high prestige. Studying first at Carleton College, he moved on to Johns Hopkins, Yale, Cornell and the University of Chicago, impressing professors who had earned their stripes studying in the grand universities of Europe.

The children of privilege Veblen encountered in higher learning were very different from the immigrant farmers of his youth. He grew up in townships where English was a second language and, in some cases, rarely spoken at all. Though his own family and many of his neighbors did quite well, even the most prosperous farmers of Minnesota were considered backwater curiosities by the industrial elite of Baltimore and New Haven.

The true significance of “pecuniary success,” Veblen came to believe, was not its purchasing power but the social rank it conferred. For wealth to serve this ultimate hierarchical purpose, it had to be diligently displayed. And the most prestigious of all social signals was idleness. The surest mark of “good breeding,” he wrote in “The Theory of the Leisure Class,” was the performance of “a substantial and patent waste of time.” Working people, after all, had to work.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG believes that Veblen introduced the term “conspicuous consumption” in The Theory of the Leisure Class.

The Pilgrims in Flesh and Spirit

From The Wall Street Journal:

If we want to encounter the first Thanksgiving as the Pilgrims experienced it in the fall of 1621, “we must put aside our images of great wide Puritan collars and large buckles (never worn by the godly of early Plymouth, who rejected all jewelry), roast turkey (absent from the earliest record of the event), and cranberry sauce (quite unknown in the early colony).” So writes Martyn Whittock in “Mayflower Lives,” an incisive series of biographical and historical essays on the Mayflower’s passengers.

All we know about the original Thanksgiving, Mr. Whittock explains, comes to us from 115 words in a document known as Mourt’s Relation, written to persuade London investors not to give up on the colony. We know that the original feast lasted three days and was attended by the Pilgrims and about 90 Native Americans from the Wampanoag tribe. It featured a meal consisting of venison, supplied by the natives, and fowl, supplied by the English. That’s it.

. . . .

Myths aside, the striking quality about the Mayflower’s trans-Atlantic journey—and this emerges beautifully in Mr. Whittock’s narrative—is just how archetypally American the whole affair was. The settlement’s mission, like the nation it began, needed both “saints” and “strangers,” to borrow the term historians use to distinguish the Puritan separatists from the rest of the Mayflower’s passengers: fervent Calvinists seeking freedom to worship God as they believed right but also economic migrants and seafarers seeking profit or adventure.

A comparison with the colony of Jamestown, founded in 1607, makes the point nicely. Jamestown was the project of merchants in search of a quick payday, and it was carried out by rugged soldiers and well-to-do settlers unaccustomed to hard work. Within a few years, Jamestown’s inhabitants had mostly either died or fled. The godly among the Plymouth settlers, by contrast, were transfixed by a vision of the New Jerusalem. In the minds of Pilgrims such as William Bradford and William Brewster—the latter had studied at Peterhouse, Cambridge, for a time a hotbed of Puritanism—the cause of Reformed Protestantism was dying in Europe. The New World beckoned, and no measure of physical hardship could divert them from achieving their end.

But a band of theologians and pious congregants, however dedicated, wasn’t likely to survive privation, disease and hostile natives. They needed what a former age would have called men of action, which is why they hired Myles Standish, a diminutive but accomplished soldier. Standish knew how to hunt, how to fix the settlement’s position in order to make it less vulnerable to attack, and what to do about deadly threats. In March 1623, hearing that a group of warriors from the Massachusetts tribes had determined to wipe out a nearby English settlement, Wessagusset, Standish set up a meeting with the Massachusetts sachem or chief. At the meeting, Standish and his men waited for an opportune moment and killed their adversaries with knives.

Standish has been judged critically for this act of premeditated bloodshed, and not altogether unfairly. But the Plymouth Colony, as Mr. Whittock rightly notes, would likely not have survived without the aid of such a man. Here and elsewhere, Mr. Whittock, an English author of several books on European history, is refreshingly reluctant to judge his subjects harshly. Gone are the usual snide remarks about the Puritans’ narrowness and grimness. He notes, for example, that the Puritans regarded monogamous sex as a “joyful expression of love,” an outlook their non-separating Anglican neighbors regarded as “very edgy, if not downright alarming.” With an infection wiping out half the colony in its first year, the little Calvinist company’s edginess probably kept it from oblivion.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Noble Volunteers

From The Wall Street Journal:

Few stereotypes from the American Revolution are as well-developed as that of the soldier who fought for Britain. The caricature is ubiquitous: A one-dimensional “lobster,” the “bloody-back” regular, motivated by selfishness, who fought for the love of money. He was heartless and cowardly; a pawn of the king, inept on the battlefield because of his old European ways. Here was a convenient foil for the resourceful Patriot, who fought valiantly on the winning side and for all the right reasons—family, farms and freedom. In “Noble Volunteers,” Don Hagist invites us to peer beneath the red coat. What do we find?

One central insight is that “there was no ‘typical’ British soldier.” British regulars encompassed “such a range of nationalities, ages, skills, and socioeconomic backgrounds” that we are better off “appreciating how they were different rather than how they were the same.” What, for instance, motivated them to enlist? The reasons were as many as the men who joined, with neither unemployment nor impoverishment ranking high on the list. Most were between the ages of 20 and 25, but little else united them. Some sought new careers. Others to escape overbearing mothers, or wives. Others still were moved by wanderlust or boredom. Mr. Hagist is skeptical of accounts, such as Sylvia Frey’s “The British Soldier in America,” that draw conclusions about soldiers’ motives from quantitative data. Too much was idiosyncratic, a mystery.

Mr. Hagist concentrates on the particular. We follow the British soldiers in America from Boston in 1773, before hostilities break out, to Yorktown in 1781. But it is not the battlefield that is most intriguing here; it is instead Mr. Hagist’s wealth of detail about all other aspects of a British soldier’s life. Recruitment in Britain (and elsewhere); monthslong transport in private vessels across the Atlantic, its trials and wonders (“flying fish, sharks, sea turtles, seals, and icebergs”); soldiers’ wages, within and without the army; literacy rates; training exercises; living arrangements in barracks, huts, wigwams and encampments; what they wore, ate and drank; the diseases they contracted; their desertions; “the plunder problem” (“the army’s Achilles’ heel,” says Mr. Hagist, because of its effect on the “hearts and minds” of the local populace); soldiers’ prizes, promotions and demotions; drafts and impressments; punishments and courts-martial; entertainments; religious dispositions; injuries, imprisonments and, occasionally, deaths; and, for some, their postwar lives. It is all here.

Every reader is sure to learn something, and in the process will come upon a favorite among the British soldiers. One of Mr. Hagist’s is Roger Lamb, whom he wrote about previously in “British Soldiers, American War” (2012). Lamb, from a middle-class Dublin family, enlisted with the 9th Regiment of Foot in 1773, at the age of 17, having lost all his money gambling. In America he saw heated action in two major campaigns; was captured twice; and, twice escaping, rejoined the British army each time. Returning to England in 1784—and discharged (from the 23rd Regiment) but “denied a pension because he had served only twelve years and had no disability”—he became a schoolteacher and published author, living until 1830.

Or, take William Crawford. An “ardent disposition for adventure” led him to join the 20th Regiment knowing that meant war in America. Captured, he was interned at Saratoga, N.Y., and marched for months throughout the north, then south to Virginia. Escaping, he was recaptured and jailed. Not to be so easily outdone, he befriended the jailer’s daughter hoping she would release him. Things didn’t go quite as he planned. “She forged a marriage certificate, spirited him out of jail, and presented him to townspeople as her husband.” Crawford accepted his fate. Others also remained in America, many with land grants. Still, most soldiers’ lives were not as well documented, and many ended in much darker places.

. . . .

[T]hinking historically about the war is difficult. It requires us not only to forget how events turned out, but also to recapture very particular moments from the participants’ perspectives. “Standing sentry on a storm-swept shoreline in the middle of a winter night, fending off a rising fever while fearful of imminent attack by assailants unseen, may have been one man’s most difficult hours of an eight-year war,” writes Mr. Hagist, “but histories focused on pivotal campaigns are unkind to such personal experiences, trivializing or entirely overlooking most of the hardships endured by most of the soldiers.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG is not an expert on military history, but he has read enough books about people fighting in war to conclude that, for those actually and physically engaged on the front lines of battle, the experience is far more random than it appears in the memoirs of generals and admirals, or in books written by historians working from copies of orders, after-action reports, official histories, interviews with those same generals and admirals, and the personal letters and memoirs the commanders wrote about their wars.

PG is not aware of any general in the wars of the 20th century who died instantly because he was inattentive or caught off guard for a split second.

One of PG’s neighbors from many years ago had served as a Captain in the Army in Vietnam. During his service, he was engaged in extremely close-quarters and intense fighting on more than one occasion.

Like many veterans who have experienced close, person-to-person fights where lives were in immediate peril, PG’s neighbor did not spend a lot of time talking about his Vietnam fighting experiences.

However, on one occasion, he told PG that the only reason he was alive to talk about the war at all was that an enemy soldier had failed to clean his gun.

During a firefight in thick jungle, an enemy soldier slipped through the undergrowth and emerged about five yards behind the Captain, pointing his AK-47 directly at him. The Captain, focused on the fight in front of him, heard something behind him, looked back, and saw the soldier pulling the trigger. The rifle did not fire, and the Captain was able to turn and fire his own weapon in time to kill the Viet Cong soldier.

After the fighting ceased, the Captain examined the enemy soldier and his gun.

The gun was fully loaded and ready to fire, but it had jammed because the soldier hadn’t cleaned out the powder residue left behind by earlier fights in which he had evidently fired it heavily.

After the battle was over, while a few of his men were watching, the Captain picked up the AK-47, cleared the jam, examined the firing chamber, then took out his knife and scraped away some of the built-up gunpowder residue he found. When he closed the action and pulled the trigger, the gun fired instantly.

He said it was a good object lesson for his men about the importance of maintaining their weapons carefully.

PG doesn’t remember General Pershing or Generals Eisenhower or Montgomery describing any similar experience in the accounts of their wars.

Long Live Work!

From The Paris Review:

A Bulgarian grocery store opened for business in my Amsterdam neighborhood. On the inside of the plate-glass window they hung a Bulgarian flag, making the store highly visible from the outside, but dark inside. They sell overpriced Bulgarian groceries. And the same can be said of almost all the ethnic markets. First come the migrants, and after them—the markets. After a time the ethnic food markets disappear, but the migrants? Do they stick around? The number of Bulgarians in the Netherlands is clearly on the rise; two Bulgarian markets have opened recently in my neighborhood alone.

And as to those with a “Balkan tooth,” they have famously deep pockets as far as food is concerned; they’ll happily shell out a euro or two extra to satisfy gourmandish nostalgia. The markets sell Bulgarian wine, frozen kebapcheta and meat patties, cheese pastries (banitsas), pickled peppers and cucumbers, kyopolou, pindjur, lyutenitsa, and sweets that look as if they’ve come from a package for aid to the malnourished: they are all beyond their shelf dates.

The store is poorly tended and a mess, customers are always tripping over cardboard boxes. Next to the cash register sits a young man who doesn’t budge, more dead than alive, it’s as if he has sworn on his patron saint that nobody will ever extract a word from him. The young woman at the cash register is teen-magazine cute. She has a short skirt, long straight blond hair, a good tan. Her tan comes from her liquid foundation; her cunning radiates like the liquid powder. She files her nails, and next to her stands a small bottle of bright red nail polish.

The scene fills me with joy. She grins slyly. I buy lyutenitsa, Bulgarian (Turkish, Greek, Macedonian, Serbian) cheese, and three large-size Bulgarian tomatoes. Dovizhdane. Довиждане.

. . . .

The division into those who work and those who do not—the hardworking and the indolent, the diligent and the ne’er-do-wells, the earnest and the couch potatoes—is hardly new, but over the last few years it has become the basic media-ideological matrix around which revolve the freethinkers of the general public. Joining the category of the indolent, ne’er-do-wells, and malingerers are the ranks of the jobless (who, the employed claim, are simply incompetents and bumblers), along with the grumblers, indignants, and the groups defined by their country, geography, and ethnicity (Greeks, Spaniards, Romanians, Bulgarians, Serbs, Bosnians—all shiftless riffraff!), anticapitalistic elements, hooligans, vandals, terrorists, and Islamic fundamentalists.

In response to the question of how to become a multimillionaire, one of the wealthiest Russian oligarchs replied, “Don’t you forget, I work seventeen hours a day!” The very same answer is given by criminals, thieves, politicians, porn stars, war profiteers, celebs, mass murderers, and other similar deplorables. They all say seventeen hours a day, my career, and my job with such brash confidence, not a twitch to be seen.

On Meet the Russians, a TV show broadcast by Fox, young, prosperous Russians, many of them born, themselves, into money, fashion models, fashion and entertainment industry moguls, pop stars, club owners, and the like, all use the following phrases: I deserve this; everything I have, I’ve earned; my time is money; I work 24/7; I never give up.

. . . .

The native armed with bow and arrow, railway line, village, town, may the country thrive and grow, long live, long live work. These are the lyrics of a song that was sung during the Socialist period, when workers’ rights were much greater than they are today.

I confess I never made sense of these verses, perhaps because I didn’t try. What possible connection could there be between a native armed with bow and arrow and railway lines, villages, and towns, unless the lyrics are an anticipatory tweet about the eons of history of the human race: in other words, thanks to the appeal of hard work, natives traded in their bows and arrows for railways, villages, and towns.

Or, perhaps, it’s the other way around: without the redeeming balm of work, those same natives would have to return to the age of bows and arrows, while weeds would engulf the railway lines, villages, and towns. Although the everyday life of socialism in ex-Yugoslavia was like a hedonistic parody of the everyday life in other communist countries, Yugoslavs shared with them a packet of the same values, a set of common symbols, and their imaginary.

And at the center, at least as far as symbols and the imaginary go, was work. Work was what persuaded the native armed with bow and arrow to evolve from the ape, and the “peasant and worker” and “honest intellectuals” evolved thereafter from the native.

“The workers, peasants, and honest intellectuals” were the pillars, in the socialist imaginary, of a robust socialist society and were cast in a powerful positive light, especially because the honest intellectuals were separated from dishonest intellectuals just as the wheat is winnowed from the chaff.

The “bureaucracy” was the necessary evil, the “bureaucracy” flourished, while feeding, parasite-like, on the people. In any case, the word “work” was heard everywhere: in the news shorts that played before films in Yugoslav movie theaters, in the images of eye-catching, sweaty, workers’ muscles, in my elementary school primers where the occupations were unambiguous (male miners, female nurses, male blacksmiths, female backhoe-operators, male construction workers, female teachers, male engineers, female tram-drivers), in the movies, and in the First-of-May parades—pagan-like rites, honoring the god of labor as tons of sacrificial steel, coal, wheat, books were rolled out.

The heroes of the day were the record-breakers, the men and women who went above and beyond the norm. The heroes of today are pop stars, Marko Perković Thompson and Severina, and the many clowns who surround them.

. . . .

Today the vistas I see are post-Yugoslav. Perhaps the view is better in the postcommunist countries like Poland, Romania, Bulgaria, Hungary … I hope representatives of other postcommunist countries don’t hold against me my geopolitically narrow focus. Everything I’ve said refers only to little Croatia, little Serbia, little Bosnia, little Macedonia … And this crumb of badness in the sea of postcommunist goodness can easily be ignored, can it not? Although to be honest, research from 2007 shows that fewer than half of the Germans living in what used to be East Germany were pleased with the current market economy, and nearly half of them desired a return to socialism. As a return to the previous order is now unimaginable, the lethargic East German grumblers have been given a consolation prize, a little nostalgic souvenir, a MasterCard and on it the face of Karl Marx, designed and issued by a bank in the city known today as Chemnitz, though earlier it was called Karl-Marx-Stadt.

. . . .

In Russian fairy tales, Ivan the Simple earns his happy ending and wins the kingdom and the queen. Does he do this by working seventeen hours a day? No he does not. He does this thanks to his cunning and his powerful helpers: a horse able to traverse miles and miles at lightning speed, a magic shirt that makes him invincible, a fish that grants his wishes, Baba Yaga who gives him sly advice, and powerful hawks and falcons for brothers-in-law. Even our hero—Ivanushka, grimy, ugly, slobbering Ivanushka Zapechny, he who is the least acceptable, who lounges all the livelong day by the tile stove—even he, such as he is, wins the kingdom and the princess without breaking a sweat. Our modern fairy tale about the seventeen-hour workday has been cooked up as consolation for the losers. Who are the majority, of course.

Link to the rest at The Paris Review (PG added a few paragraph breaks.)

PG recalls one of the impressions of a college friend who studied and traveled extensively throughout what was then known as the Union of Soviet Socialist Republics – “Anything that is not at least 75 years old looks like it was put up cheap.”

On a brighter note, PG is greatly enjoying reading Catherine the Great: Portrait of a Woman. It is the second volume of quite a nice four-volume set by Robert K. Massie, The Romanovs. The series starts with Peter the Great (Pyotr Alekseevich), born in 1672 and crowned as Tsar at the age of ten, and ends with The Romanovs: The Final Chapter, which, according to the book’s description, begins at the end of the Romanov line in Siberia with “the infamous cellar room where the last tsar and his family had been murdered” on the night of 16–17 July 1918.

PG seldom provides book recommendations on TPV, but will say he is greatly enjoying this history. While by no means any sort of historian, PG has read quite a number of well-written accounts of various times past, and Catherine the Great would be up towards the top of his “best of” list.

FYI, he doesn’t think you necessarily need to read the books in sequence. Each one is very good at setting its subject in his or her time and place in history.

PG notes that Amazon lists used hardcovers at very reasonable prices. That said, “Catherine” is 656 pages and, since PG does a great deal of reading in bed, he is happy to have his featherweight Kindle Paperwhite on his chest instead of the hardcover.

A note on PG’s experience with his Paperwhite – He doesn’t mind the ad-supported version because, at least on his, he almost never notices the ads. While PG also has an iPad, he much prefers his Paperwhite for book-length fiction – much lighter, longer battery life and, at least for PG, a much better screen for reading text than the iPad (or the Fire or any of the Android tablets PG has owned in the past).

PG is not terribly familiar with the differences between the earlier generations and current generations of Paperwhites. His is a first-generation model and, if it stopped working, he would probably buy another identical one if it were available.

How the Spanish flu changed the world

From the World Economic Forum:

A couple of years ago, journalist Laura Spinney could hardly believe how little people thought about the Spanish flu pandemic, which swept the globe in three deadly waves between 1918 and 1919.

So she wrote a book – Pale Rider: The Spanish Flu of 1918 and How It Changed the World – to bring the tragedy that claimed 50 million lives back into our consciousness.

. . . .

“It seemed to me there was this huge hole in our collective memory about the worst disaster of the 20th Century. It’s definitely not remembered in the same way as the two world wars – there is some different way we remember pandemics.

One of the ways I tried to explain it in my book was that, to me, that pandemic is remembered individually as millions of discrete tragedies, not in a history book sense of something that happened collectively to humanity.”

. . . .

We think it infected about 500 million people – so one in three people in the world alive at that time, and it killed 50 million of them. The death toll could have been even higher because there was a big problem with under-reporting at the time. They didn’t have a reliable diagnostic test.

. . . .

Pandemic flu is much worse than seasonal flu, and we think there have been 15 flu pandemics in the past 500 years. Every seasonal flu started out as a pandemic flu, which was much more virulent because it was new in the human population. Gradually over time, it evolved to become more benign and to live in a more harmonious relationship with humanity.

There are lots of theories for why the Spanish flu was so virulent and they’re not mutually exclusive. Some of them have to do with the inherent biology of that virus, and some of them with the state of the world at the time. That pandemic obviously emerged when the world was at war; there were extraordinary circumstances. Lots of people were on the move, not only troops, but also civilians: refugees and displaced persons. And there was a lot of hunger.

All of these factors may have fed into the virulence of the virus. There was definitely something very abnormal about 1918. If you think about the five flu pandemics we’ve had since the 1890s, none of them has killed more than about 4 million people maximum, whereas we think Spanish flu killed 50 million.

. . . .

There were no commercial aeroplanes, so the fastest way you could get around was by ship or by train. Henry Ford had invented his Model T motor car, but they were still the preserve of the rich, as were telephones. And illiteracy was much higher than it is now, which had an impact because the main way that news was transmitted was by newspapers. In illiterate populations news travelled much more slowly and was often distorted.

. . . .

In the short term, there was a jump in life expectancy, because a lot of people who were very ill with, for example, TB, which was a massive killer at that time, were purged from the population. They were probably the first to die of the Spanish flu because they were already in a weakened state. The people who were ill died and the people who were left behind were healthier.

There was also a baby boom in the 1920s, which has always been put down to the war and the men returning from the front. But there is an argument that the flu could have contributed because it left behind a smaller, healthier population that was able to reproduce in higher numbers. Norway, for example, had a baby boom even though it was neutral in the war.

Among those very vulnerable to the Spanish flu were the 20 to 40-year-olds. Normally flu is most dangerous to young children and to the very old, but in 1918, bizarrely, it was this middle age group. There wasn’t much of a social welfare net, even in wealthy countries, so lots of dependents were left without any means of support because the breadwinners were taken out by the flu.

. . . .

One of the great tragedies of 1918 is that those dependents just vanish into the cracks of history. We don’t really know what happened to them, but we get the occasional glimpse: from a study in Sweden, for example, we know that a lot of old people moved into workhouses and a lot of the children became vagrants.

Men were more vulnerable than women overall globally, though there were regional variations. Pregnant women were particularly vulnerable and had miscarriages at frighteningly high numbers because, to fight the virus, the body took resources away from the womb and the growing foetus. Some of those babies survived and we know now there’s a lifelong effect called foetal programming. That generation was physically and cognitively slightly reduced. They were more likely to suffer from heart attacks and to go to prison – and came of age just in time to go and fight in the Second World War.

. . . .

In many Western countries, there was a turning away from science after the pandemic because people were disillusioned with it. From the 1920s, for example, in America, alternative medicine took off in a big way and spread around the world.

But at the same time, in countries that had not really embraced the scientific method, you see the opposite effect. So China becomes a little bit more scientific after the pandemic. There’s a move to better disease surveillance, better public health, more organized collection of healthcare data, because they saw that to prevent future pandemics they needed to turn towards science.

. . . .

The Spanish flu was democratic on one level. It could infect anyone: British Prime Minister David Lloyd George came down with the flu then, just as Boris Johnson has had COVID-19 today. Nobody is, in theory, spared.

If you look at the population level though, there’s a very clear disparity and basically the poorest, the most vulnerable, the ones with the least good access to healthcare, the ones who work the longest hours, who live in the most crowded accommodation, and so on, are more at risk.

But in 1918, it was a time of eugenics-type thinking and it was perceived that those people who were more prone to the flu were constitutionally somehow inferior, that it was somehow their fault.

. . . .

The dates of the waves were dependent on where you were in the world. They came later in the Southern hemisphere, which meant Australia had the luxury of seeing this thing approach in space and time from the north, and took advantage of that to put in place maritime quarantine.

It managed to keep out the lethal second wave in October 1918, which is one of the rare exceptions of public health measures really working that year. But they lifted it too soon and the third wave of infection of early 1919 came into the country and killed 12,000 Australians. But it would have been much, much worse if they had not put the quarantine in place when they did.

Link to the rest at the World Economic Forum

First Principles

From The Wall Street Journal:

First Principles: What America’s Founders Learned from the Greeks and Romans and How That Shaped Our Country

The subject of Thomas Ricks’s extraordinarily timely book is, in his words, “what our first four presidents learned, where they learned it, who they learned it from, and what they did with that knowledge.”

. . . .

John Adams attended Harvard; Thomas Jefferson, William and Mary; James Madison, the College of New Jersey (subsequently renamed Princeton). Of the first four presidents only George Washington had not received a university education: he spoke no foreign or ancient languages and was not much of a reader. Yet even he was steeped in the classicism of the Enlightenment era, and as he matured into his role as the father of his country he came to be seen as the personification of ancient Roman virtue—his country’s Cato, its Fabius, its Cincinnatus.

“Virtue” had a somewhat different meaning in the 18th century than it does today: in Mr. Ricks’s brief formulation, “it meant putting the common good before one’s own interests,” and looked specifically back to ancient exemplars like Cato, Cicero and Socrates. Adams modeled himself on Cicero as Washington did on Cato. Montesquieu, the Enlightenment theorist who had a greater influence on the founders than any other, famously stated in his “Spirit of Laws” (1748) that virtue was the one indispensable quality in a republic. Washington and Adams, at any rate, heartily agreed with him.

Jefferson brought the architecture of ancient Rome to our shores: Monticello, the University of Virginia campus and, finally, the distinctly Roman look given to Washington, D.C. “Almost single-handedly,” wrote the historian Gordon Wood, “he became responsible for making America’s public buildings resemble Roman temples.”

But as Mr. Ricks proves, Jefferson was always “more Greek than Roman, more Epicurean than Ciceronian.” Indeed, he openly admitted to being an Epicurean, a philosophy he called the “most rational system remaining of the philosophy of the ancients,” and Mr. Ricks points out that his replacement of John Locke’s “life, liberty, and estate” (that is, property) with “life, liberty, and the pursuit of happiness” indicates a specifically Epicurean outlook.

Madison, who more than any other founder was responsible for the shape that the U.S. Constitution would finally take, immersed himself in the history of ancient republics and confederations to see what good ideas they could bring to ours. The Roman Republic, which lasted almost five centuries, was of particular interest, but so too were the various Greek confederations, such as the Amphictyonic League, in which the states had the same number of votes (like our Senate today), and the Lycian confederacy, which had proportional votes (like our House of Representatives). Twenty-three of the 85 Federalist Papers cite classical authorities; interestingly, they are more often Greek than Roman.

But Madison took a crucial step to lead the country away from the most important classical precept: he decided that public virtue couldn’t be counted on, and looked for an alternative. The failure of the Articles of Confederation had made it painfully obvious that self-interest usually trumps disinterested virtue. “The present System,” complained Madison, “neither has nor deserves advocates; and if some very strong props are not applied will quickly tumble to the ground. No money is paid into the public Treasury; no respect is paid to the federal authority.”

Now Madison took inspiration from Enlightenment ideas, most memorably formulated in Bernard Mandeville’s “Fable of the Bees” and Adam Smith’s “Wealth of Nations,” that private vices might, when taken together, positively benefit the public.

In Federalist 10 he attacked the classical republican idea that the pursuit of self-interest necessarily violates the public trust: “The causes of faction cannot be removed . . . relief is only to be sought in the means of controlling its effects.” This must be done by involving “the spirit of party and faction in the necessary and ordinary operations of government.”

Here Madison departed from Montesquieu by claiming that a large republic would be more durable than a small one; the more individual interests in play, he claimed, the smaller the chance that any one will prevail. (Of course he could not have dreamed of the possibilities opened by mass communications and social media!) Washington still thought the new republic could not exist without public virtue, and said as much in his Farewell Address; but, writes Mr. Ricks, that was “old think.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Lockdown named word of the year by Collins Dictionary

From The Guardian:

Lockdown, the noun that has come to define so many lives across the world in 2020, has been named word of the year by Collins Dictionary.

Lockdown is defined by Collins as “the imposition of stringent restrictions on travel, social interaction, and access to public spaces”, and its usage has boomed over the last year. The 4.5bn-word Collins Corpus, which contains written material from websites, books and newspapers, as well as spoken material from radio, television and conversations, registered a 6,000% increase in its usage. In 2019, there were 4,000 recorded instances of lockdown being used. In 2020, this had soared to more than a quarter of a million.

“Language is a reflection of the world around us and 2020 has been dominated by the global pandemic,” says Collins language content consultant Helen Newstead. “We have chosen lockdown as our word of the year because it encapsulates the shared experience of billions of people who have had to restrict their daily lives in order to contain the virus. Lockdown has affected the way we work, study, shop, and socialise. With many countries entering a second lockdown, it is not a word of the year to celebrate but it is, perhaps, one that sums up the year for most of the world.”

Other pandemic-related words such as coronavirus, social distancing, self-isolate and furlough were on the dictionary’s list of the top 10 words. So was the term key worker. According to Collins, key worker saw a 60-fold increase in usage over the last year, which reflects “the importance attributed this year to professions considered to be essential to society”.

Link to the rest at The Guardian
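
As a quick check of the Collins figures quoted above: a jump from roughly 4,000 recorded uses in 2019 to “more than a quarter of a million” in 2020 does work out to the reported increase of about 6,000%. The sketch below shows the percent-increase arithmetic; the counts are the rounded numbers reported by Collins, and the code is only an illustration.

```python
# Percent increase in recorded uses of "lockdown", using the rounded
# Collins Corpus figures quoted above.
count_2019 = 4_000
count_2020 = 250_000  # "more than a quarter of a million"

pct_increase = (count_2020 - count_2019) / count_2019 * 100
print(f"{pct_increase:,.0f}% increase")  # -> 6,150%, reported as roughly 6,000%
```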

What It Means to Be Human

From The Wall Street Journal:

Bioethics has always been enmeshed in controversy. Arising out of gross abuses of the rights of human subjects in mid-20th-century scientific research, the field has grown to take on a variety of thorny challenges at the intersection of morality and biomedicine—from embryo research and organ markets to artificial reproduction and physician-assisted suicide.

But the most profound bioethical disputes actually lie beneath these headline-grabbing controversies, deep in the soil of moral philosophy and anthropology. To think clearly about how to protect the human person, we need an idea of what the human person is and why that matters. Our society often takes for granted a set of answers to questions about the meaning of personhood, and those answers tend to emphasize choice and independence as measures of human dignity and worth. The bias toward autonomy goes well beyond academic bioethics and, indeed, prevails throughout our culture. A critical examination of the moral suppositions underlying contemporary bioethics might shed light on much more of our common life than our engagement with biology and medicine.

Such an ambitious examination has now been taken up by O. Carter Snead in “What It Means to Be Human.” The result is a rare achievement: a rigorous academic book that is also accessible, engaging and wise.

. . . .

Mr. Snead’s subject is “public bioethics,” by which he means not the work of advising particular patients or clients facing difficult decisions but the work of setting out laws, rules and policies regarding the uses of biotechnology and medicine. He begins by drawing out the often unstated assumptions beneath such frameworks. “American law and policy concerning bioethical matters,” Mr. Snead writes, “are currently animated by a vision of the person as atomized, solitary, and defined essentially by his capacity to formulate and pursue future plans of his own invention.”

By putting decision making at the center of its understanding of the moral life, this view treats cognition and rational will as the essence of our humanity and radically plays down unchosen obligations. More important, it implicitly treats those who depend most heavily on others because they are unable to make choices—the mentally impaired, dementia patients, those suffering extreme pain, children in the womb, and others—as diminished in worth. Even when bioethics does try to protect such people, it struggles to understand just how their lives are worth living.

What this view misses, Mr. Snead argues, is the significance of our embodiment. “Human beings do not live as mere atomized wills,” he writes, “and there is more to life than self-invention and the unencumbered pursuit of destinies of our own devising. The truth is that persons are embodied beings, with all the natural limits and great gifts this entails.”

This simple fact has far-reaching implications. “Our embodiment situates us in a particular relationship to one another, from which emerge obligations to come to the aid of vulnerable others, including especially the disabled, the elderly, and children.” Our power to choose recedes into the background when our lives are viewed this way, and our embeddedness in webs of mutual regard comes to the fore. Properly understood, bioethics should seek to emphasize not ways of breaking relations of dependence but ways of helping us see what our common humanity requires of us.

. . . .

Mr. Snead doesn’t emphasize the religious foundations of this truth, and he maintains a welcoming and inviting, even conciliatory, tone toward the progressive bioethicists whom he is criticizing. He knows they mean well but thinks they are caught up in the expressive individualism of our culture in ways that keep them from grappling with the full meaning of the questions their field sets out to address. The book speaks their language: It is technical at times, especially when considering in detail the law surrounding abortion, assisted reproduction and end-of-life care. But in the end, it addresses far more than professional controversies.

“What It Means to Be Human” may have its greatest impact outside public bioethics. That field is now intensely politicized, and stubbornly resistant to criticism. It is likely to remain in the business of constructing sophistic permission structures justifying a dehumanizing but convenient disregard for the weak and vulnerable in the all-atoning name of choice. Dissenters from this orthodoxy, like Mr. Snead, often defy easy political and professional classification. Their work is rooted in deeper philosophical soil and therefore tends to grow beyond the bounds of bioethics.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG wishes the Harvard University Press had not overpriced the ebook.

Too Much Information – Understanding What You Don’t Want to Know

From MIT Press:

How much information is too much? Do we need to know how many calories are in the giant vat of popcorn that we bought on our way into the movie theater? Do we want to know if we are genetically predisposed to a certain disease? Can we do anything useful with next week’s weather forecast for Paris if we are not in Paris? In Too Much Information, Cass Sunstein examines the effects of information on our lives. Policymakers emphasize “the right to know,” but Sunstein takes a different perspective, arguing that the focus should be on human well-being and what information contributes to it. Government should require companies, employers, hospitals, and others to disclose information not because of a general “right to know” but when the information in question would significantly improve people’s lives.

Sunstein argues that the information on warnings and mandatory labels is often confusing or irrelevant, yielding no benefit. He finds that people avoid information if they think it will make them sad (and seek information they think will make them happy). Our information avoidance and information seeking are notably heterogeneous—some of us do want to know the popcorn calorie count, others do not. Of course, says Sunstein, we are better off with stop signs, warnings on prescription drugs, and reminders about payment due dates. But sometimes less is more. What we need is more clarity about what information is actually doing or achieving.

Link to the rest at MIT Press

PG was drawn to the OP for two reasons:

  1. He had previously read several short works from the author, Cass Sunstein, and enjoyed them.
  2. At the end of the worst presidential campaign season in the history of the United States, PG (along with a great many other people) is sick and tired of hearing about the two candidates, their appearances, their statements, their goals, their relatives, their assistants, their pasts, their futures, etc., etc., etc., etc.

In other words, PG is suffering from too much information about people, opinions and events that tend to disgust him.

PG is looking forward to the day, hopefully not too far in the future, when he doesn’t come across a single mention of either candidate or anything associated with them.

Reading Patients, Writing Care

From Public Books:

We turned her every couple of hours in the end, though somehow the procedure seemed more incessant than that. At times, it felt like a peculiarly brutal routine to inflict upon someone under your watch. But there was no room for compromise in the instruction we’d been told to follow: pressure sores can be deadly. And to have any chance of preventing them, we had to subject my grandmother to regular, distressing turns, which couldn’t be done fluently due to the effort involved; turns that demolished whatever quantum of peace only morphine could supply her in repose.

Before long, the task inevitably acquired a regimental punctuality. Yet it remained too intimate ever to be entirely functional. Nor did it become any easier with practice. By design, the whole process is rarely seamless. One hasty move can be torturous. Equally, though, overcautiousness carries its own perils: repositioning someone in slow motion prolongs the risk of aggravating existing abrasions. However tightly we policed our complacencies, there was always room for agony; and however inescapable such pain is, we weren’t about to absolve ourselves of the additional suffering we alone seemed to be inflicting. If it appeared as though we were destined to fail, this was hardly an acceptable compensation. The constant glare of anticipatory grief leaves the labor of care bleached of self-forgiveness.

The house in which my mother had been born and where she now once again lived—on account of poverty, not out of choice—became the place where she would see her own mother die. This symmetry was a privilege amid formidable sadness: “Most people want to die at home,” observes dementia campaigner and novelist Nicci Gerrard, yet “most die in hospital.” And while the majority of terminally ill people “want to be with family,” too “often they are alone with strangers.” How fortunate we were to be bucking that trend.

It is caregiving’s emotional and physical contours that are illuminated throughout Rachel Clarke’s Dear Life: A Doctor’s Story of Love and Loss. Although the book centers on the remarkable work of professional hospice staff—who ensure that people who don’t spend their final hours at home are at least surrounded by dignity, calm, even consolation—Clarke’s vision of care’s complex entwinements of torment and fulfillment is unconfined to specialist practitioners. As such, she reads distinct end-of-life experiences in medical settings for what they reveal about our common sentiments toward illness and dying; sentiments that imbue countless, apparently unexceptional, yet affectively multifaceted acts of caregiving that take place outside clinical environments too.

Link to the rest at Public Books

Lost in a Gallup

From The Wall Street Journal:

Griping about polling goes back a long time, even to the days before George Gallup published the first random-sample opinion poll in October 1935—as many years away from us in 2020 as that first poll was from the Compromise of 1850. And truth to tell, it doesn’t seem intuitively obvious that the responses of a randomly chosen group of 800 people should come reasonably close, in 19 cases out of 20, to those you’d get if you could interview everyone in a nation of 209 million adults. Even sharp math students don’t always know much about statistics and probability. So the griping goes on.
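
The "19 cases out of 20" standard is simply the familiar 95% confidence level, and the arithmetic behind it is short: for a proportion near 50%, the worst-case margin of error of a simple random sample of n respondents is roughly 1.96 × √(0.25/n). As a minimal sketch (not from the WSJ review, and assuming a textbook simple random sample rather than the weighted samples real pollsters use), the Python calculation below shows that for the 800 respondents mentioned above this works out to about ±3.5 percentage points:

    import math

    n = 800    # sample size cited in the passage above
    p = 0.5    # worst-case proportion: p * (1 - p) is largest at 50%
    z = 1.96   # z-score for a 95% confidence level ("19 cases out of 20")

    # Standard worst-case margin of error for a simple random sample
    margin_of_error = z * math.sqrt(p * (1 - p) / n)
    print(f"Worst-case 95% margin of error for n={n}: about ±{margin_of_error:.1%}")
    # prints roughly ±3.5%, which is why a sample of 800 can stand in for millions
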

Some of it reflects a misunderstanding of what polling is. It’s not prediction: Polls are a snapshot taken at a point in time, not a movie preview of what you’ll see later. That fundamental point is often lost or at least misplaced by W. Joseph Campbell in “Lost in a Gallup: Polling Failure in U.S. Presidential Elections,” an otherwise fast-moving narrative history of some attempts to gauge public opinion amid electoral politics. “Election polls are not always accurate prophecies,” Mr. Campbell writes early on. He notes that “polling failures tend to produce broadly similar effects—surprise, anger, bewilderment and frustration at their failing to provide the American public with accurate clues about the most consequential of all U.S. elections.” Surprise, anger, bewilderment, frustration: This sounds like the response to the result of the 2016 election in the city where Mr. Campbell teaches, Washington (which voted 91% for Hillary Clinton and 4% for Donald Trump).

But Mr. Campbell’s gaze goes far beyond the Beltway and back further in history than the astonishing election night four years ago. He is well aware that the national polls in 2016 were close to the results; the pre-election Real Clear Politics average showed Hillary Clinton ahead by 3.3%, close to her 2.1% plurality in the popular vote. Polls in some states were further off. Still, Nate Silver’s FiveThirtyEight gave Donald Trump a 29% chance of winning, and 29% chances happen about one-third of the time. Mr. Campbell quotes RCP’s Sean Trende saying, rightly, that 2016 “wasn’t a failure of the polls. . . . It was a failure of punditry.”

The subject of “Lost in a Gallup” is not so much election polling as its effects on political journalism. Mr. Campbell, a prolific author and a communications professor at American University, admits up front that he is not concerned with “jargon and the opaque methodological arcana that pollsters and polling experts are keen to invoke.” The book is a history of mistakes and overcompensating for mistakes. Polling pioneers Gallup, Elmo Roper and Archibald Crossley, after bragging about how closely the past three elections matched their poll numbers, all showed Thomas Dewey leading Harry Truman in 1948. Having got that wrong, they fudged their results to project a close race in 1952. Wrong again!

. . . .

Mr. Campbell devotes much attention, justifiably, to the 1980 election. For months, polls showed a close race between incumbent Jimmy Carter and elderly (age 69) challenger Ronald Reagan. But when the exit polls—invented by Warren Mitofsky, the polling innovator who also devised random digit-dialing phone interviewing—showed Reagan well ahead, NBC projected his victory, to almost everyone’s astonishment.

But were the polls actually wrong? The author quotes the Carter and Reagan pollsters, Patrick Caddell and Richard Wirthlin, saying that opinion shifted strongly to Reagan after the candidates’ single debate seven days before the election and after Mr. Carter’s return to Washington the next weekend to tend to the Iran hostage crisis. Both pollsters told me the same thing back in the 1980s. Their story makes sense. Reagan’s “are you better off than you were four years ago?” debate line (stolen, though no one then realized it, from Franklin Roosevelt’s 1934 pre-election fireside chat) worked in his favor, and Mr. Carter’s job rating, buoyed upward all year by his efforts to free the hostages, was liable to collapse when he failed.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)