A Body Made of Glass

From The Wall Street Journal:

The first time it happened, Caroline Crampton was on a bus. Rain lashed the windows as the breath of weary commuters fogged the glass. The hum of casual conversation drifted along the aisle, mixed with fragments of phone calls and the grinding of tires. Then, without warning, it all stopped. Ms. Crampton had gone suddenly, inexplicably deaf, except for a high-pitched whine. Seconds later, her vision dissolved “into a prismatic sphere of light.” The bus and its occupants, the windows and fog, were all subsumed by a “disco ball turned inside out.” Effectively blind and deaf, she stumbled off the bus. And then, almost as suddenly as it began, the attack was over. The anxiety, however, had only just begun—along with many, many medical tests.

“I am in the belly of the machine and it is singing,” writes Ms. Crampton in “A Body Made of Glass.” A self-proclaimed treatment seeker, she recalls her episode on the bus as she is being scanned inside an MRI machine. Most of us have come to rely on science to some degree or other, but Ms. Crampton, who wrote “The Way to the Sea” (2019), describes herself as “a supplicant” before technology, “begging it for knowledge,” asking it to turn her transparent through the miracle of medicine and magnets.

The MRI transforms her from “a lump of meat on a slab” into detailed images and structured data. “For fractions of seconds,” she explains, her body becomes as glass. She expects to be found ill—she is always half convinced that she is dying of a yet-undiagnosed disease—but like so many times before, even the MRI comes up empty. There’s simply nothing to see. Her strange symptoms have no detectable source, and in medicine both modern and historic, “illness without cause” is summarily dismissed as hypochondria.

From the Greek hupo, meaning under, and khondros, meaning cartilage (the cartilage of the breastbone), “hypochondria” once denoted ailments of the torso before evolving in the 15th century into its current connotation. In effect, “hypochondria” is a body-word used to describe a mind-illness, “at once visceral and figurative, just like the condition that it describes.” Part memoir, part medical history, “A Body Made of Glass” provides an intimate, honest, willingly vulnerable exploration into a very sticky question: When it comes to health and sickness, what is real and what is imaginary? More importantly, who decides?

Ms. Crampton’s troubled relationship with illness began at the age of 17, when she was diagnosed with Hodgkin’s lymphoma. After months of treatment, she was told she was free of the cancer, only to have it return by the time she was 18. The experience left her, understandably, skeptical of the notion of a cure. With extraordinary candor, she walks us through her ever-present fear: Isn’t the cancer still there, lurking in her blood and tissue? Against a disease that’s often invisible until it’s too late, hypervigilance doesn’t seem foolish. And yet, she writes, “it feels at times like having cancer for real was the training I went through so that I could have a dozen other illnesses in my imagination.”

In prose that brings forth the most visceral aspects of hypochondriac dread—the constant sensation of feeling prodded and poked, the imagined spiral into debility and death—Ms. Crampton re-creates the sensation of being alien to an ungovernable body. She suspects there is nothing to find, that the screaming panic will subside, that she’ll return to her office chair and get back to work. But she doubts. And as her historiography makes plain, she and others like her may have good reason.

The story of hypochondria—a proper fear of illness and injury that runs amok and out of proportion—begins in ancient Egypt and Babylon but, like so many of our evolutionary quirks, goes back further still. “Illness is a story we tell about ourselves,” Ms. Crampton explains. Each of us looks for underlying patterns in our health as a means of “staving off the yawning blackness of the unknown.” Notable figures—John Donne, Howard Hughes, James Madison, Blaise Pascal, Marcel Proust, Tennessee Williams—struggled with hypochondriasis, as did the French king Charles VI. Gripped by mania during a military campaign, he became the first documented case of someone who believed his body was made of glass and would “shatter on contact.”

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

A Secret Letter to the KGB Turned A Lost Family History Into a Novel

From Electric Lit:

Journalist Sasha Vasilyuk’s debut novel Your Presence Is Mandatory is a poignant look at the reverberating effects of war through the story of a Ukrainian World War II veteran’s struggle to hide a damaging secret for the sake of his family.

Vasilyuk’s book begins with death—the first chapter finds a family at the grave of main character Yefim Shulman in Donetsk, Ukraine, paying their last respects. Shortly afterwards, his wife finds a letter in his belongings addressed to the KGB, a confession that leads the family to reconsider the man they thought they knew. The novel then takes the reader back to Yefim as a young soldier in Stalin’s army stationed in Lithuania in 1941, shortly before Germany launched its invasion of the Soviet Union. Yefim’s experience as a soldier left him with a secret he was so afraid to reveal, even to his own family, that he took it to his grave.

The book skips between Yefim’s experiences serving in Stalin’s army and the remainder of his post-war life in Ukraine, even extending seven years after his death to the beginning of Russia’s occupation of Crimea and the start of war in Donetsk. Your Presence Is Mandatory is a timely look at survival that will make you question how wars, both past and present, shape future generations.

I interviewed author Sasha Vasilyuk over the phone about the discovery of her grandfather’s letter to the KGB that changed her family’s narrative about who he was.

. . . .

Katya Suvorova: Sasha, you’ve talked about how, after your grandfather’s passing, your grandmother and aunt found a real-life secret letter that your grandfather wrote to the KGB that totally upended your family narrative about who he was. How did this letter inspire Your Presence Is Mandatory?

Sasha Vasilyuk: My grandfather was a Jewish Ukrainian World War II vet, but he didn’t ever talk about the war. From the few things that I and the rest of my family knew, we thought of him as a war hero, because he survived from the first day of the war until the last day four years later. Given that WWII killed 27 million Soviet people, this made him seem like a brave and lucky soldier. But his letter, which was addressed to the KGB and written back in the 1980s, revealed a very different story. Imagine thinking of your grandpa as a star of Inglourious Basterds, where Jewish soldiers take revenge on the Nazis, and finding out he was more like The Pianist. The letter was a shock to my family, but I immediately thought: this is a novel. I wasn’t just interested in how he survived WWII, but also in why he’d kept it a secret his whole life. Interestingly, it took my grandma several months to tell me about the letter because she too wanted to keep his secret a secret.

KS: Why do you think your grandmother hid the letter from you? 

SV: So the Soviet government punished and shamed those who survived the war in non-heroic ways. That shaming culture was so strong that even after the USSR fell apart, people who’d internalized that shame continued to feel it. I think my grandpa, who inspired the main character Yefim, didn’t tell us what really happened to him during the war first to protect us from the government and later because he was ashamed. When my grandmother and my aunt discovered his letter, they also felt ashamed. At least at first. 

KS: So do you think they finally accepted that he was a victim and that’s what brought them to tell you?

SV: I think they realized their shame stemmed from decades of propaganda and of living under a regime of fear. And maybe they saw that hiding one’s past makes it easy for future generations—like me—to not know your family history, or even your national history.

KS: I was reminded frequently while reading your novel of the parallels between passages describing the destruction and occupation of Ukraine by Nazi Germany in World War II and contemporary news reports of the Russian invasion of Ukraine. With your family being from the Donbas, how did your personal experience with Russian occupation affect your characters?

SV: After my grandmother found the letter, I didn’t sit down to write this novel for the next 10 years, primarily because I couldn’t imagine writing about World War II. It felt entirely too daunting. I felt like I couldn’t imagine what it was like to survive a war, even if I’ve seen the movies, like we all have, and read other books. As somebody who was trained as a journalist, I couldn’t write about war until war broke out in my family’s town in the Donbas. This was in 2014 and I visited in 2016 when it was supposed to be safer. 

There, I heard shelling. I saw bullet holes on every surface. I saw the way people scurried about and I experienced the fear of war. And only then did I feel like I could portray those feelings in my characters with any, you know, realism.

As far as how it affected my characters, what I was surprised by was how the war changed how my family identified themselves. They shifted from this sort of general Soviet identity, where we’re all brothers, toward a more nationalistic identity that very clearly distinguished Ukrainians from Russians. Now that we have a full-scale invasion, this shift in identity has really taken over all of Ukraine. There have been so many essays on this subject, and so many people in Ukraine talk about how they’ve been perceiving themselves very differently because of the war. So when writing my characters, I thought about how war changes our identity and our relationship to home, to the state, to the enemy. Those things were all interesting to me.

Link to the rest at Electric Lit

In Praise of Episodic Enthusiasm

From Writer Unboxed:

 Greek on the Rosetta Stone. By 1814 he had completely translated the inscription. You read that right: he decided to tap away at it and a year later he’d cracked it wide open. He’s rightly credited with deciphering a language that I as a student called Demonic because of its impenetrability.

Young’s contributions to the study of hieroglyphs had the potential to be equally transformative, but there’s the catch: “potential.” Young was a polymath, a brilliant scholar who never committed to one passion. He practiced medicine. He created a formula for understanding blood flow. He contributed the term “Indo-European languages” to linguistics. He’s considered the founder of physiological optics. Einstein praised his work because he freaking developed the wave theory of light.

As you might imagine, all of this work didn’t leave him a lot of time to study hieroglyphs. He came, he saw, he went off to conquer something else.

The stories of these two men were on my mind last month when I led a tour group around Egypt. Nearly every ancient site we visited was covered in hieroglyphic writing, and thanks to their breakthroughs, I could translate quite a bit of it—but not all.

Not all, because like Young I’ve scattered the seeds of my time across multiple terrains rather than planting them in one field. I have spent years of my life learning to excavate archaeological sites, teach, do linguistic analysis, play the piano and clarinet, and sing. I have baked bread and made jam and quieted a crying baby, sometimes simultaneously. I’ve practiced the basics of knitting and crocheting and tapestry weaving. I’ve run a marathon and given birth three times without pain medications. Not to brag, but sometimes I can even do a yoga flow without resorting to child’s pose in the middle. My scattershot years weren’t wasted; I just didn’t “give myself up entirely” to one thing, as Champollion did.

I know I’m not alone. Many writers have other careers prior to writing, or simultaneously with it. You can probably name your own favorite authors who did X and Y before writing, and usually those experiences were what they drew upon to create their unique voices and worldviews. Most of the time I’m heartened by these examples, but in the dark nights of the tortured writerly soul, I fret: Is greatness—or even competency—a realistic goal for someone who has fiddled away so many years on other projects? If it takes ten thousand hours to get decently good at something, who’s got that kind of time, after spending a thousand hours here and another thousand there and a few thousand more binge-watching police procedurals?

If you and I have missed the chance to be the Champollion of fiction writing, whatever that might look like, can Thomas Young provide a good alternative? He was less celebrated in any one field, but darn good at a lot of them. He did also have the leisure of an inherited fortune and a brilliance of a kind most of us can’t fathom—so what I really want to explore is whether mere mortals can still create a meaningful life’s work after doing other things.

Link to the rest at Writer Unboxed

Hieroglyphs on the Temple of Kom Ombo, Egypt

Coptic – Parchment fragment. Verso. Book of Jeremiah. White Monastery, Sohag (Egypt), tenth century

The Vortex

From The Economist:

The book is on the minds and lips of presidents. Recently Gustavo Petro, Colombia’s leader, praised “La Vorágine” (“The Vortex”), a novella by José Eustasio Rivera, for having words that “still shine like stars” and showing how “the destruction of the jungle fills human beings with nothing but hatred”. Mr Petro and Luiz Inácio Lula da Silva, Brazil’s president, both spoke about “The Vortex” at the International Book Fair of Bogotá. This year’s biblio-bonanza is celebrating the relationship between literature and nature, as well as the centenary of the publication of “The Vortex”, which was written in April 1924.

Rivera told his story through Arturo Cova, an equivocal narrator who seduces a woman, Alicia, in Bogotá and then flees with her to los llanos in the east. Greed and cruelty drive the people they encounter, who desire only to “steal rubber and hunt Indians”.

“The Vortex” evokes the region’s colonial history, when conquistadors pillaged jungles and slaughtered inhabitants in pursuit of riches. It also excoriates the abuses that the rubber industry inflicted on indigenous people serving as indentured workers in the Amazon, the world’s largest rainforest. As the forest is despoiled and a tug-of-war plays out between those wanting to prioritise environmental protection over economic growth, the book feels modern and timely.

It can also be read as pioneering eco-literature. Rivera’s vivid, poetic prose transforms the jungle into a living being, “a green hell” that fights back against its persistent invaders. Thundering rapids drown men; ants as poisonous as scorpions prey on human flesh. At one point Cova hears a tree’s vengeful thoughts.

Link to the rest at The Economist

How to protect an endangered language

From The Economist:

Of the world’s 7,000-odd languages, almost half are expected to disappear by the end of the 21st century. Two culprits are usually considered responsible for this decline. The first is colonialism: when great powers conquered countries, they imposed their language in government and schools and relegated local ones (or banned them outright). The second is capitalism. As countries grow and industrialise, people move to cities for work. They increasingly find themselves speaking the bigger language used in the workplace rather than the smaller one used at home.

English, as the most dominant language in the history of the world, often stands as a symbol of homogenisation and the steamrolling of smaller cultures. So it may come as a surprise that the most linguistically diverse spot on Earth spans a few square miles in New York. Ross Perlin’s new book, “Language City”, is the story of what he has learned as the co-founder of the Endangered Language Alliance, a non-profit organisation that has managed to identify some 700 languages spoken in New York, a number vastly greater than the 100 or so listed in America’s official census.

Mr Perlin profiles speakers of six languages. Each tongue is threatened by different, larger neighbours. (English is by no means the only linguistic juggernaut.) Seke, from Nepal, is squeezed by Nepali and Tibetan. Wakhi, from Central Asia, sits between Chinese, Persian and Russian; its speakers also usually speak Tajik with others from their home country.

Nahuatl—though not a tiny language, as it is spoken by more than 1.6m indigenous Mexicans—is giving way to Spanish. N’Ko, a sort of alphabet-cum-written-standard meant to serve several closely related Manding languages of west Africa, must compete with French, the language of prestige in the region. Yiddish is losing out to English in New York and to Hebrew in Israel. As the language of secular Ashkenazi Jews it is nearing extinction (though it is flourishing among the ultra-Orthodox).

The people Mr Perlin meets are multilingual by necessity. Together they speak more than 30 languages; each person has “to move nimbly from one linguistic ecology to another”, he writes. They refuse to stop using their cherished language—despite incentives to do so—in order to preserve something of the associated culture.

The death of languages often follows the same pattern. Conquest and colonisation lead to poverty, and sometimes an internalised shame. As a result, parents often choose to raise their children in a bigger language for their own economic benefit. Whether a language disappears altogether is determined by the next generation: many assimilate and their language is lost for good. But sometimes they may try to reverse the decline.

Can outsiders aid preservation? Many speakers of small languages treat them as a kind of sacred or scarce good that outsiders do harm to by learning and documenting; they do not think of their languages as objects of scientific curiosity. So those trying to help, including Mr Perlin, are learning to tread carefully. (In the book he describes an initially wary encounter with the last known native speaker of Lenape, New York’s own indigenous language.)

Tim Brookes, a British writer and the executive director of the Endangered Alphabets Project, another non-profit group, describes his own approach in his recent book, “Writing Beyond Writing”. He makes a persuasive case that linguists have long neglected writing systems in their well-intentioned push to give dignity to spoken as well as written languages. Linguists have tended to ignore the wonderful and hugely varied scripts that are threatened by behemoths including the Latin, Arabic, Devanagari and Chinese systems. As well as research and advocacy, Mr Brookes makes beautiful wood carvings in the scripts he describes. Like Mr Perlin, he is careful always to put the native users of a language at the heart of the story. The field has no time for white-saviour narratives anymore.

Julia Sallabank, a linguist at the School of Oriental and African Studies at the University of London, has described how experts have historically approached languages in danger of extinction. In decades past a Western linguist would show up, learn as much as possible, then publish the results back home. In time, academics came to assist the language community by producing grammar books, dictionaries and recordings for speakers to use and pass down. Next came collaboration. Scholars and activists would sit down together to work out exactly what the group needed for the language to thrive.

Link to the rest at The Economist

“A Theory of America”: Mythmaking With Richard Slotkin

From Public Books:

Over a stunning career, which has lasted more than half a century, the historian Richard Slotkin has devoted himself to documenting the stories we tell ourselves about nation, violence, inclusion, and exclusion. From his trilogy on the place of guns in American culture—starting with Regeneration Through Violence in 1973—Slotkin has defined the study of American mythmaking. Over these same 50 years, he has witnessed the massive transformations of the late 20th century and the uneasy opening of the 21st. In his new epic, A Great Disorder, Slotkin uses foundational myths like the founding, the lost cause, the frontier, and the good war to explore how such stories shape the limits and possibilities of our current-day political imaginaries.

Kathleen Belew (KB): Let’s start by queuing up the four big myths you talk about and why you felt it was important to bring them together in one story now, having addressed some of these themes in your earlier work.

Richard Slotkin (RS): Over the longer term, I’ve been thinking about the theory of national myth and the way in which national myths are a crucial part of the culture that holds nation states together.

National myths are developed through long-term usage in every medium of cultural expression: histories, school textbooks, newspapers, advertisements, sermons, political speeches, popular fiction, movies. They are the form in which we remember our history. But they are also, and most critically, the means through which we turn history into an instrument of political power. In any major crisis, one of our cultural reflexes is to scan our memory archives, our lexicon of myths, for analogies that will help us interpret the crisis, and precedents on which to model a successful or even “heroic” response.

Four mythologies have been central to the development of American nationality. The myth of the frontier is our oldest myth, tracing the origin of our society to the settler states of the colonial period and its phenomenal growth to the exploitation of abundant natural resources. The myth of the founding deals with the establishment of national independence and constitutional government. The myth of the Civil War arose from the existential crisis that overtook the nation in the 1860s, over slavery and Southern secession. This myth has three significant variants: the liberation myth, centered on Lincoln and emancipation; the reconciliation myth, which emphasizes the postwar coming together of whites from North and South; and the lost cause myth, which sanctifies the Confederate cause and the postwar struggle to restore white supremacy. Finally, the myth of the good war emerged in the 1940s, as the nation for the first time embraced its racial and ethnic diversity, to unite its people in a struggle for the Free World.

My past work had focused on the myth of the frontier, which was really the earliest and the most basic myth. It deals with national character and race. It deals with economic development. But then I had neglected other myths that have equal or similar power in shaping the way in which we think about our nationality specifically, that is, who counts as an American and what it is that the political structure of America is supposed to do. It seemed to me that the Civil War was certainly one of the things that was most critical to talk about. So, I wrote about the Civil War as well.

What really crystallized the idea for the book, though, was the demonstrations in Charlottesville. I realized that the Civil War was very much alive and that the banners that people were carrying on both sides were really like the headlines from myths. Behind each banner was a version of what the United States was supposed to be. If you looked at the counterdemonstrators, the ones who were opposing Unite the Right, they had standard American flags, flags expressing Black Power, rainbow flags—the flags of liberalism and leftism in a sense.

On the other side, you had flags with the Confederate stars and bars, the Gadsden flag—the yellow flag with the rattlesnake that says Don’t Tread on Me, which has come to represent a gun rights flag—and right-wing paramilitary flags. It seemed to me that what we were getting there was a war of symbols, and that behind the symbols were stories, and that each of the stories amounted to a different version of what the United States was about. That’s why the book had to bring all four together.

To be clear, nobody really sits down and composes each myth, though Hollywood has at times come close to doing so.

Rather, they emerge from the rationalization of a historical crisis. They almost always have an ambivalence, a contradiction built in.

For example, there’s a white side and a Native side to the frontier story. There’s a Union side and a Confederate side, and a Black side and a white side to the Civil War story. Those ambiguities or contradictions remain embedded in the story.

Or take the lost cause. I could see somebody, a populist but not necessarily right-wing person, thinking of the Confederacy through the lens of, “That’s what you do. You rebel against the established order, when the established order gets too oppressive.” The myth makes itself available for that thing. And that’s why myths retain their power—they can serve a number of different purposes and play both sides of a contradiction.

KB: As someone who has written about this over several chapters in what we would call the culture wars, do you see this as more of a continuity across the years you’ve been writing? Or is today really different in some tactile way?

RS: Both of those things are true. Certainly, these wars are continuous. If you follow any one of the stories—the story of the founding, the story of the frontier, the story of the lost cause and then the liberation myth of the Civil War, the good war myth—they run pretty much throughout the period, with periods of greater and lesser intensity of usage.

But starting in 2000, the Civil War explicitly became a live analogy, with people saying, “We’re in a civil war.” You saw that analogy being made not only on the right in the American Conservative, but also in Sean Wilentz’s writings about how our contemporary moment resembles the 1850s. So the Civil War was very alive in mainstream culture even before Charlottesville.

With Charlottesville, what happened is what was implicit suddenly became the front and center drama. We’re now actually fighting about the legacy of Robert E. Lee. We’re actually arguing with the president about the legacy of Robert E. Lee. It turned out that in order to defend Robert E. Lee, Trump could reach back and compare him to George Washington. Washington was a slave holder too. Now, all of a sudden, the founding of the nation is involved.

What the modern gun rights movement has done is to make the Second Amendment the center of their myth of the founding, in which the right to bear arms—and not the legal protections of the Bill of Rights—is “the palladium of our liberties” because it enables citizens to resist a tyrannical government. The original “palladium doctrine” was put forward by Supreme Court Justice Story in 1833; but it held that the potential for resistance was to be held by “well regulated” state militias. But the modern movement has asserted this as an individual right and used it to justify the threat or use of armed force to resist the government. NRA spokesman Fred Romero says it directly: “The Second Amendment is there as a balance of power. It is literally a loaded gun in the hands of the people held to the heads of government.” And that power can be used to check the ordinary operations of government. As the antitax activist Grover Norquist said, “Once [the government] get our guns, they don’t have to argue with you about taxes anymore.”1 The logic of this Second Amendment myth leads straight to the attack on Congress on January 6, 2021.

The past becomes infused into modern life and politics.

KB: Do you think that originalism in that context is more a legal theory or a retelling of a cultural myth? Does originalism have the uptake or purchase that it has because it has that story power? Or is it just a legal doctrine?

RS: It’s the story’s power that gives it appeal beyond the narrow circle of legal specialists. I would argue that the legal specialists thought themselves into the mystique of the founding. They have fetishized the founding as a way of undoing the world of precedent that has been developed since the founding or since major amendments were passed.

So it is definitely a fetish, and you can see it most clearly in Clarence Thomas’s opinion in Bruen where he says that you can’t interpret the Second Amendment in any way other than the way in which it would’ve been interpreted in 1791. In a sense, it is patently absurd.

First of all, even as a historian, you can’t figure out authoritatively how the amendment would have been interpreted back then. You can’t truly be authoritative about what the common state of opinion was about that. Second, and more importantly, we’re not in the same world.


KB: One of the unresolved tensions in teaching US history that comes up over and over for me is the conflicting mythos argument articulated by Jefferson and Hamilton, that asks: Is violence by the mob justified because it seeks to restrain the tyrannical state or is state violence justified because it seeks to restrain the revolutionary anarchist mob? In so many ways, and especially while studying lynching or vigilante groups, it seems to me that we collectively never resolved this question at all.

I’m wondering how much you think there are tensions like that, tensions that exist in one of these stories or crossover between several of these stories?

RS: You can examine the question if you contrast the two halves of the founding myth, the Declaration of Independence and the Constitution. The Declaration of Independence is about the right of revolution. It’s a moral statement that people always have the right of revolution. The Constitution, on the other hand, doesn’t acknowledge a right of revolution. That’s really the core, because that’s the fundamental question about government. At some point, government may become tyrannical, and a revolution may be needed to overthrow it. In a practical sense, the point of contradiction that seems to me most meaningful is the Civil War version. I talk about this when I talk about Lincoln and Lincoln’s response to Southern secession.

The Southern states used their militia to resist what they said was a tyrannical government. In response, Lincoln says, okay, you have the right of revolution. Any people, any civil community has the right of revolution. But there are two questions, morally: Why are you rebelling? Are you rebelling to establish freedom or slavery? And Lincoln thinks the answer to that is clear, which is the latter. The South disagrees. But Lincoln’s other question is, okay, you have the right of revolution. Does the government have the right to suppress you? And if so, on what moral basis? Clearly, Lincoln argues, the government has a legal basis to suppress the South’s rebellion. The Constitution says you can suppress an insurrection. The moral basis of this right, Lincoln argues, is free elections—if you have a free election, that’s the essence of the republican state. If you overthrow a free election, if you substitute bullets for ballots, that’s the end of the republic. It’s the end of republicanism. And therefore, to defend the principle of free government, it’s necessary to repress the Southern revolution. That’s the way the reasoning actually works out. That’s the story that justifies saying no to this revolution.

Link to the rest at Public Books

The author of the book is a history professor at one of the so-called “Little Ivies,” expensive small colleges located in New England and the northeastern United States, the same region where the Ivy League universities are sited. The students at the Little Ivies tend to come from wealthy families and fall into contemporary “privileged” status.

PG hasn’t read anything written by Dr. Slotkin, but what’s an American history professor going to publish when all the history professors who came before him have already worked over all the interesting and useful American history topics?

He can discover widely accepted myths that his many predecessors have failed to discern. Many of these myths have undoubtedly been published by less enlightened and less perceptive historians who lacked the intelligence to see all those myths sitting right in front of their noses.

Earlier historians may have personally seen these events in the wild or interviewed older individuals who witnessed them, yet they failed to recognize what the diligent professor has now discovered: that the events are nothing more than American mythology.

PG’s gaze was caught by the Professor’s statement that

“The Declaration of Independence is about the right of revolution. It’s a moral statement that people always have the right of revolution. The Constitution, on the other hand, doesn’t acknowledge a right of revolution.”

PG is merely a humble recovering lawyer, not a professor of American history with tenure, but, for him, the Declaration of Independence and the Constitution were written for two very different purposes:

  1. The Declaration of Independence was written to give notice to one and all that the thirteen colonies would no longer be subject to British rule and refused to be governed by British law and/or its King or the designated representatives of either. Its purpose was to start a revolution.
  2. The Constitution was written and approved well after the Revolutionary War was concluded. Its purpose was to set the rules for an entirely new and different form of government, one without kings, hereditary nobility, etc.
  3. The Constitution was also written to delineate what the leaders of the new nation had the power and authority to do, and what powers the individual states retained to be exercised under the laws the states would write and approve. Each state would organize its own separate government with officers and representatives elected by the people living in the state.

The Civil War was necessary to resolve the right of revolution issue. States and their citizens, while exercising broad areas of independence, were still part of the United States and subject to the laws passed by the democratically elected representatives from all states in Congress, enforced by the nation’s designated officials, and supervised by the President.

America is uniquely ill-suited to handle a falling population

From The Economist:

Cairo, a town at the southern tip of Illinois founded in the early 19th century, was given that name because it was expected to grow into a huge metropolis. Located at the confluence of the Mississippi and Ohio rivers, it was the transport hub of a region that became known as “Little Egypt” because of its huge deltaic plains where farmers could grow anything.

Today, however, the name is redolent of lost civilisations. To walk around is a strange experience. Turreted Victorian houses gently crumble, being reclaimed by the weeds. What was once downtown resembles an abandoned film set. Cairo has no petrol stations, no pharmacies and no hospitals. It has gone from six schools to two, both half-empty. “When I was growing up in the 1970s, we had two grocery stores, we had two gas stations. You know, a lot of businesses were still open,” says Toya Wilson, who runs the city’s still operating and beautiful Victorian library. One modest grocery store remains, but it is run at a loss by a charity and, when your correspondent visited, was deathly quiet, with many bare shelves.

Cairo is on its way to becoming America’s newest ghost town. Its population, having peaked above 15,000 in the 1920s, had fallen to just 1,700 people by the 2020 census. Alexander County, Illinois, of which it is the capital, lost a third of its people in the decade to 2020, making it the fastest-shrinking place in America.

Link to the rest at The Economist

Unfortunately, Cairo (pronounced like the Karo in Karo Syrup) is not an isolated case.

PG grew up in rural Colorado and rural Minnesota. Every place PG lived during his K-12 years has a smaller, usually much smaller, population than it had when PG lived there. The elementary schools and the high school where PG was a student have been demolished and not replaced. Today, the students who live where PG lived are bused to school, riding at least 30 minutes each way.

The American Midwest contains some of the most fertile soil in the world, but farming finances are becoming worse and worse for family-owned farms. They’re simply too small and undercapitalized to support a family these days. So the owners, often quite old, are selling out to investment banks and other large financial enterprises, typically located on the coasts of the United States.

Somebody needs to operate the farms, so the large financial owners hire professional farm managers to handle that task. Because of their size and financial strength, the large banks and financial enterprises have access to financing on far better terms than an individual farmer can obtain, and they can purchase expensive new farm equipment that is far more sophisticated and efficient than anything a small farmer can afford.

Large finance enterprises hire a professional farm manager to handle all their holdings. The farm manager then hires a few middle managers to handle the hiring and supervision of low-cost labor, importing such workers from other countries to do the day-to-day work.

Laborers are likely organized in work groups that are moved about regularly so they can do the manual work of operating a large number of former family farms. Basically, farming is run like a factory, and nobody has roots in the many small farm towns that previously survived by providing the goods, services and education needed by family farmers, their spouses and their children.

‘Sociopath’ and ‘Borderline’ Review: Understanding Personality Disorders

From The Wall Street Journal:

“I’m a liar. I’m a thief. I’m emotionally shallow. I’m mostly immune to remorse and guilt. I’m highly manipulative. I don’t care what other people think.” Thus opens “Sociopath: A Memoir,” by Patric (short for Patricia) Gagne, psychologist, former therapist, happily married mother of two—and, she would say, an “advocate” for others like her.

In grade school, Ms. Gagne tells us, she jammed a pencil into the head of another little girl and took money from the collection tray at church. As a college undergraduate, she drove stolen cars around Los Angeles. “Sociopath” is the story of Ms. Gagne’s obsessional effort to understand these impulses—to understand herself. Her divorced parents were loving, and her home was nice, defying the stereotypical origin story of family dysfunction. In the author’s case, the problem seemed innate. “I was simply different,” she writes, and it “often felt like a life sentence in emotional solitary confinement.”

As a teen, Ms. Gagne conducted little experiments on herself: “Wouldn’t it make more sense to engage in smaller acts of ‘bad’ behavior more frequently,” she surmised, “than larger acts less frequently?” She concluded that apathy was the culprit—a lack of engagement and interest that led her to misbehave and thus elicit an experience of colorful emotion. So she put herself “on a diet,” she writes. “I did exactly what I needed to give myself necessary ‘jolts’ of feeling. I never took it any further—even when I was tempted, which was often. I scheduled my mischief like I would have a doctor’s prescription.”

Eventually she understood that people like her—people, that is, who are sociopathic—“just had a harder time with feelings. We act out to fill a void.” Today she knows how much she has to lose if she acts on desires to violate social norms or harm people. The guiltless possibility of doing harm to others points to another key aspect of her condition: an inability to imagine the experience of others.

In a way, Ms. Gagne was lucky. Many people with her condition—most are men—would have landed in prison for committing some of the same trespasses. And few recover or cope as well as she has. Her life, in aerial view, has followed a fairly standard trajectory of education, employment and then doctoral training to become a therapist.

. . . .

But “Sociopath” does present an arresting story of a person who had to build an intelligible moral code from scratch—psychiatrists, apparently, were of little help. Ms. Gagne says that she wrote the book for the other estimated 15 million sociopaths in America. “Who has empathy and compassion for them?” she once asked her husband. She does, and she wants to “allow people like me to see themselves in healthy, everyday situations, and provide the single thing I knew they needed most: hope.” If she can truly help others like herself, then she will have accomplished what the psychiatric profession has largely failed to do.

. . . .

While borderline pathology and sociopathy differ—sociopaths suffer emotional numbness whereas borderlines are often flooded with inchoate anxiety and rage—psychiatry regards both conditions as very difficult to treat.

Borderlines are manipulative people who are apt to violate personal and professional boundaries. They experience panicky feelings of emptiness and engage in “splitting” (judging others as all good or all bad, appraisals that can change over the course of a day); they commit impulsive, often self-destructive acts when faced with overwhelming emotion and harbor an erratic sense of self.

To be fair, most borderlines don’t harm animals. It is just that kind of operatic portrayal, in fact, that Alexander Kriss decries in “Borderline: The Biography of a Personality Disorder,” a well-researched and compelling account of an often baffling condition. As an assistant clinical professor of psychology at Fordham University, Mr. Kriss seems to have immense empathy for borderline patients, putting him in a select group of therapists.

In “Borderline,” he charts the six-year (and counting) treatment of Ana, a young woman who—true to form—introduced herself to him out of the blue, via email, by demanding: “Call me ASAP.” Chapters alternate between sessions with Ana and scholarly excursions into the history of the concept of borderline. Experts, from the ancients to Freud to today’s trauma gurus, have tried to explain the pathology. The answers have ranged from hysteria and disruptions of the pre-Oedipal stage to parental neglect, biologically driven mood and impulse dysregulation, and child abuse. At its core, the problem seems to be one of continuity—in emotional control, identity and relatedness. Borderlines are overwhelmingly women who have been abused in childhood, and their clinical prognosis is guarded. Although most people do age out of the condition eventually, their mature years can be marked by depression, drug abuse and rocky relationships.

Thus far, Ana has made progress. Her treatment, Mr. Kriss says, is a continuing story “of how one moves from chaos to stability; from a black-and-white worldview to a more complex one; from a life defined by desperation to one defined by a sense of who we are.” It is clear from Mr. Kriss’s chronicle that Ana has made these moves but also backtracked at times. Mr. Kriss acknowledges that he must make concerted efforts to handle the emotions that Ana’s provocations stir within him.

But what of other people with character pathology who do not have the financial means to afford several sessions a week? (Mr. Kriss treats Ana for a steeply reduced fee.) Today a shorter-term therapy conducted in a group-therapy format, called Dialectical Behavior Therapy, is the most common treatment for borderline personality disorder. It helps people regulate strong emotions by, for example, getting them to think about whether a particular emotion is justified by circumstances and by improving their communication skills so that they can defuse tense situations.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

After 1177 B.C.

From The Wall Street Journal:

The Bronze Age in the Near East and Aegean region began around 3000 B.C. It peaked in the 1500s B.C. and ended gradually, then suddenly, in the Late Bronze Age collapse of the late 1200s and early 1100s B.C. The Egyptian empire went into terminal decline. The Hittite empire fell, and the Assyrian and Babylonian empires faltered. The Mycenaeans, Minoans and Canaanites disappeared from the record. Mainland Greece’s population halved and its palaces collapsed, along with literacy and even the concept of a supreme ruler. Welcome to the age of ignorance, war and poverty that Hesiod called a “race of iron.”

Modern historians attributed the collapse to the Dorian Greek invaders whom the Egyptians called the Sea Peoples; a horde of them sailed into Egypt in 1177 B.C. These categories now seem to be a retrospective mirage. Various peoples came to the Eastern Mediterranean over a period of centuries, some by land, some in peace, and many driven by drought, but none understood themselves as Sea Peoples or Greeks. The crisis at Egypt’s southern border, the wave of tomb robberies attributed to “foreigners” and the partisan paralysis of Egypt’s administrators were consequences as much as causes of Egypt’s demise.

In his 2014 book, “1177 B.C.: The Year Civilization Collapsed,” Eric H. Cline attributed the Late Bronze Age collapse to a “systemic failure with both a domino and a multiplier effect.” Mr. Cline, a professor of classics and anthropology at George Washington University, applied complexity theory and systems analysis to the transition from the Bronze Age to the Iron Age. He detected a cascade effect of unpredictable interactions and nonlinear outcomes, often amplified by the sophistication and interdependence of imperial economies throughout the Mediterranean world. The Sea Peoples’ invasion of Egypt in 1177 B.C. remains a pivotal event, a proof of decline akin to the barbarian sack of Rome in A.D. 476. But the shift from the Bronze Age to the Iron Age was a “rolling” process. The mighty Bronze Age kingdoms and empires took decades to crumble, and the city-states of the early Iron Age took decades to cohere.

History is one sequel after another, and in “After 1177 B.C.,” Mr. Cline describes what happened next. These were, Hesiod wrote, times of “sore trouble,” but “even these shall have some good mingled with their evils.” Late Bronze Age societies adapted and transformed, or they faced eclipse and extinction. What did the dawning Iron Age do for us? Monotheism, coinage, innovations in iron-working, the Greek alphabet, the polis (city-state), the origins of democracy in Athens and the nation-state in Jerusalem, and, as Mr. Cline’s expert, ingenious and endlessly fascinating book shows, an ancient lesson in the lately rediscovered virtue of “resilience.”

Ramses III of Egypt saw off the Sea Peoples in 1177 B.C., but two decades later he was murdered, his throat cut in a “harem conspiracy” led by one of his wives and her son. His predecessor Ramses II became the “Ozymandias” of Percy Shelley’s poem, a “colossal Wreck” in the sand. Egypt went the same way through food shortages, palace intrigues, political schisms and pressure on the southern border. Neither adapting nor transforming, the Bronze Age superpower suffered a “rapid decline” in its standing and stability.

Around the same time the Hittite empire in Anatolia (modern Turkey) shattered. Attacked from the west by Sea Peoples and from the east by Assyria, the empire broke into as many as 15 small “Neo-Hittite” states populated by a “host of political entities and various ethnicities.” Hittite culture survived—the Israelite king David falls in love with Bathsheba, the wife of Uriah the Hittite—but the Hittite capital of Hattusa was leveled and the Hittites’ cuneiform language, a Bronze Age lingua franca, fell into disuse.

The Babylonian and Assyrian empires survived drought, famine and plague, and eventually revived in a new form. Babylonia’s population collapsed and Assyria’s record-keeping seems to have stopped in the mid-1100s B.C., but their “Neo-Babylonian” and “Neo-Assyrian” successor empires retained cultural continuity, governmental capacity and military strength. Nebuchadnezzar I of Babylonia defeated his Elamite neighbors so severely, Mr. Cline writes, that they left “no written records and little archaeological evidence” for centuries. When writing resumed at Nineveh in Assyria, it recorded victories over the Aramaeans, a nomadic people who, displaced by drought, were raiding Assyrian cities. Their language, Aramaic, would become the lingua franca of the Near East in the Iron Age.

The Mycenaean and Minoan societies of mainland Greece and Crete ended by the late 1100s B.C. Contact with the Near East dwindled, writing stopped and only “survivors or squatters,” Mr. Cline writes, lived amid the mainland’s ruined palaces. The Bronze Age was remembered in Homer’s oral legend, and Greek civilization took centuries to rebound. As everywhere, the collapse of large sociopolitical units gave space to city-states. The polis, which the Greeks would export across the region, was born from this power vacuum. The cities of Bronze Age Canaan were reborn as a mishmash of Phoenician cities, an Israelite state in the southern hill country, and, after a Sea People called the Peleset had annexed Canaan’s coastal strip, a league of Philistine cities.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Steven Levitt and John Donohue defend a finding made famous by “Freakonomics”

From The Economist:

More than two decades have passed since we published an academic paper linking the legalisation of abortion to the enormous decline in American crime since the 1990s. The underlying theory is straightforward. Children who are unwanted at birth are at risk of a range of adverse life outcomes and commit much more crime later in life. Legalised abortion greatly reduced the number of unwanted births. Consequently, legalised abortion will reduce crime, albeit with substantial lags.

Our paper created much controversy, which was further stoked by a chapter on the topic in the best-selling book “Freakonomics”, written by one of us with Stephen Dubner, published in 2005. For many, it was more important to spin a political response to our hypothesis than to evaluate whether it was correct.

The data available at the time strongly supported our hypothesis. We showed, for instance, that crime began falling sooner in the five states that legalised abortion in advance of Roe v Wade, the US Supreme Court decision that made abortion available legally nationwide. We documented that crime in states with high and low abortion rates followed nearly identical trends for many years, then suddenly and persistently diverged only after the birth cohorts exposed to legal abortion reached the age at which they would commit crime. Consistent with our theory, arrest data, which reveal the age of the offender, show that the declines in crime were concentrated among those born after abortion became legal.

These data patterns do not, of course, make an open-and-shut case. No randomised experiment has been conducted on this topic. We also didn’t help our own case by mislabelling the set of control variables included in one of the specifications in one of our tables—a regrettable mistake (later corrected) which led some people to dismiss all of the paper’s findings. One of the most vocal and consistently sceptical voices arguing against our findings has been The Economist, which has written about our study on three occasions (most recently last month), each time taking a critical and dismissive stance.

We concede that reasonable people could disagree about how convincing the findings were in our initial paper. The analysis was retrospective, and there is always the concern that researchers have cherry-picked their findings, or that perhaps it was just pure coincidence that the patterns emerged.

There is, however, something unique about our hypothesis, which allows a second test of the theory that is far closer to the ideal of the scientific method. There is a long lag between abortions being performed and the affected cohort reaching the age at which crime is committed. Thus, we could already, at the time of our first academic paper in 2001, make strong predictions about what should happen to future crime. Indeed, at the end of that paper, we made the following prediction: “When a steady state is reached roughly 20 years from now, the impact of abortion will be roughly twice as great as the impact felt so far. Our results suggest that all else equal, legalised abortion will account for persistent declines of 1% a year in crime over the next two decades.”

It is rare—almost unprecedented—in academic economics to be able to make a testable prediction and then to go back and actually test it decades later. That’s what we did in a paper published in 2020. Our methodological approach was straightforward: mimic the specifications reported in our original paper, but limit the time period to the years that were out of sample, ie, those after our original data ended.

The results provided stunning corroboration of our predictions. For each of the seven different analyses we had presented in the initial 2001 paper, the results for the next two decades of data were at least as strong as the results in our initial dataset, and in most specifications even stronger. This included what the main critics of our 2001 paper called the “crucial” test, showing that the abortion rate at the time of any birth cohort negatively correlated with the age-specific arrest rate for that cohort years later as it moved through ages 15–24, while perfectly controlling for whatever other factors were influencing crime in a given state and a given year. We would argue that, short of a randomised experiment, this is some of the most compelling evidence one could present.

The magnitude of the implied impacts we are talking about is huge. If you look over the entire sample, violent-crime rates fell by 62.2% in high-abortion states whereas they rose by 3.1% in low-abortion states.

Though there is not complete acceptance of our hypothesis among academics, all agree that if our paper is not correct, then there is no viable explanation for the enormous drop in crime in America that started in the early 1990s. Indeed, there is not even an arguable theory to supplant the abortion-crime link. Exposure to lead in the environment might, perhaps, be the next best hypothesis. But as we showed in our 2020 paper, when one controls for both environmental lead and abortion, the coefficient on abortion remains large while the coefficient on environmental lead is greatly reduced and loses statistical significance.

It seems fair to ask why so many observers remain sceptical of the hypothesis, in spite of strong supporting evidence, the absence of any academic contradiction of the findings of our 2020 paper, and support for the abortion-crime link from international evidence. Not surprisingly, those whose livelihoods came from fighting crime during the great crime drop were not keen to conclude that their approaches—whether more police, more incarceration or particular social programmes—however important they might have been, were not the dominant factor in the decline. Another possible explanation is that people across the political spectrum were uncomfortable with our conclusions. Many people would prefer that our hypothesis not be true—perhaps not recognising that the core finding is that when women can control their fertility the life outcomes of their children are greatly enhanced.

Link to the rest at The Economist

Chess Teaches the Power of Sacrifice

From The Wall Street Journal:

The act of sacrifice holds an elevated and sometimes sacred place in societies across the globe. While sacrifices may be rare in a person’s daily life, they happen as a matter of course in a large number of chess games. Many positions cannot be won or saved without something of value being given away, from a lowly pawn all the way up to the mighty queen. Certain types of sacrifices happen so frequently that to an experienced player they might be considered routine, almost boring, and it often takes an unusual sort of sacrifice to quicken the pulse of jaded grandmasters who have seen tens of thousands of them in their lifetimes.

In the introduction to his classic book “The Art of Sacrifice in Chess,” player Rudolf Spielmann wrote, “The beauty of a game of chess is usually appraised, and with good reason, according to the sacrifices it contains. On principle we incline to rate a sacrificial game more highly than a positional game. Instinctively we place the moral value above the scientific.”

It is this “moral value” that separates some sacrifices from others. Spielmann draws a clear distinction between what he calls a “sham sacrifice” and a real one. A sham sacrifice is one where a player can easily see that the piece being given up will return concrete benefits that can be clearly calculated. Any player would be happy to part with their queen if they see that they can checkmate the opponent’s king within a couple of moves.

However, in the case of a real sacrifice, giving away a piece offers gains that are neither immediate nor tangible. The return on investment might be controlling more space, creating an assailable weakness in the opponent’s position, or having more pieces in the critical sector of attack.

In chess, we call these intangibles “compensation.” Having enough compensation for a sacrificed piece is a judgment call based on knowledge of similar situations or a refined intuitive feel based on thousands of games played. Of course, compensation doesn’t guarantee that you will win the game, and if these intangible advantages don’t pan out then that extra material you gifted to the opponent could come roaring back to overwhelm your smaller army.

Life is filled with examples of sham sacrifices versus real ones. When someone takes out a college loan, there is a reasonable assumption that, through future earnings, they will be able not only to pay off the loan, but also to earn more money on top of it. This assumption may not work out, but it has been executed so many times with success that many students feel safe taking on that debt. The fact that it may take years, or in some cases even decades, to see the sacrifice pay off doesn’t change the sense of confidence most young people and their families have when investing in education.

In contrast, real sacrifices promise no guarantee of a concrete return. My mother made an incalculably real sacrifice when she made the painful decision to leave my brother, sister, and me in Jamaica, where we’re from, to head to the U.S. in search of a better life. I was only two years old when she left. It would take her 10 long years to gain citizenship and be able to sponsor us to join her in this land of opportunity. She could not have known how those 10 years would play out and the infinite number of possible challenges we might all have to overcome.

In fact, the very first day after she arrived in the U.S., Dr. Martin Luther King Jr. was shot and killed in Memphis, setting off riots all around the country. The way she tells it, she was in shock that her dream began in such a devastating fashion. But she understood that this wasn’t just about her emotions and fears; she had three young kids, being taken care of by her mother, who were relying on her to push on. And push on she did, with courage and determination and a sense of purpose, and a decade later, she accomplished the task that she had set her mind to so many years before, and finally we were able to reunite as one family.

Her sacrifice came with unanticipated results. While she had dreamed that we would all get a college degree (we did), she assumed that we would end up in traditional professions with guaranteed pension plans. She could not have foreseen that I would end up making my living from chess, that my brother’s martial arts passion would lead to his becoming a three-time kickboxing champion, or that her baby girl would leave the world of business behind to win six world titles in boxing.

It did not have to turn out that way. It did because she was willing to stomach the key aspect of making real sacrifices: the willingness to take risks. For a chess player, risk is as much intuited as it is calculated. Due to the inherent complexity of the game, it is virtually impossible to assess with certainty whether a risky move will pay off in the end. It’s up to the player to decide if sufficient conditions have been met to take the chance on a risky move. Those conditions may be an aggressive, attacking posture, dominant pieces, weaknesses in your opponent’s position, time pressure, or the stress of the competitive situation. All these could add up to a certain degree of confidence in the chance of a positive result.

When it comes to risk, grandmasters are not a monolithic group. Depending on their personalities, top players have different levels of risk tolerance. On one hand, you’ll find the swashbuckling, dynamic attacking personality types like Alexander Alekhine, Mikhail Tal, and Rashid Nezhmetdinov, who will take risks without much hesitation. On the other side of the spectrum are more conservative players such as José Raúl Capablanca, Tigran Petrosian and Wesley So. Tolerance for risk is very personal.

What we do know, however, is that the famous saying “No risk, no reward” is true in many cases. A skilled adversary is normally able to handle solid, conservative play and therefore able to rob us of opportunities that may be inherent in our position. As Magnus Carlsen put it: “Not being willing to take risks is an extremely risky strategy.”

To be comfortable with risk is to be comfortable with uncertainty.

Link to the rest at The Wall Street Journal

Captain Cook’s Final, Fatal Voyage

From The Economist:

Until recently Captain James Cook was not a particularly controversial figure. But in January a statue of the 18th-century British sailor and explorer was toppled in Melbourne and the words “The colony will fall” spray-painted on the plinth. In Hawaii an obelisk in Cook’s memory has been splattered with red paint and the message “You are on native land.” Cook has joined Edward Colston, Robert Clive and Cecil Rhodes as a focal point for anti-colonialist ire.

In fact Cook was neither a slave trader nor much of an imperialist. He was, first and foremost, a brilliant navigator and cartographer. Acting under Admiralty orders, he undertook three pioneering voyages in the Pacific between 1768 and 1779. His mapmaking transformed Europeans’ knowledge of the world’s largest ocean.

An excellent new book draws on Cook’s letters and notebooks to tell the story of his third and final trip. Cook was almost 50 when he set off on HMS Resolution in July 1776. Among the crew he took were William Bligh (later captain of the Bounty before the mutiny in 1789) and Mai, a Tahitian prince noted for being painted by Sir Joshua Reynolds. Cook had secret instructions from the Admiralty not only to claim new territory for Britain, but to search for a north-west passage via the Bering Strait (a task even someone with his navigational experience found impossible).

The author, Hampton Sides, focuses on Cook’s return to Australia and New Zealand—countries the explorer had first encountered almost a decade earlier—his discovery of the Society Islands (today part of French Polynesia) and his time in Hawaii. It was there, in February 1779, that he was killed after a botched attempt to kidnap a local chief in response to the theft of a longboat.

Cook was a man of his times. He believed Europe would have a civilising influence on many benighted folk in the Pacific. He was distinctly cruel in meting out punishments, to his own crew as well as to any indigenous people who opposed him.

Yet Cook also admired many of the people and places he encountered in the South Pacific. Unlike the Spanish, he had no interest in religious conversion. He tried hard to stop his men from spreading venereal disease. For the most part, his land claims were aimed not at promoting a British empire but forestalling grabs by Britain’s rivals, France and Spain.

Link to the rest at The Economist

Writers With ADHD: Strategies for Navigating the Writing Process

From Writers Helping Writers:

Earlier this year, I received an email from Bret Wieseler, requesting, “I would love to see a post about writers with ADHD. If you’ve never struggled with it yourself, maybe you know someone who has and can share their thoughts, methods, management strategies, etc. You offer such great insight into the many aspects of being a writer. I’m sure some of your readers, like myself, who struggle with ADHD would appreciate any advice you could offer.”

I immediately knew who to call on, and I am excited to share a guest post today from a writer who has been a part of my own journey almost from the very beginning. Johne Cook and I met on an online writing forum over 15 years ago, and he remains one of my favorite people to have entered my life in this journey. I have long admired his pragmatism, his insight, and his general cool in the face of the Internet’s insanity. To this day, I will often ask myself, “What would Johne do here?”

He has always been open about his experience as a writer with ADHD—both the challenges and his solutions for overcoming them. Today, I’m excited to have the opportunity to let him share his experience, tips, and resources with you.

Discovery

I wish I knew then what I know now.

For my first 45 years, I thought I was broken: I was a daydreamer, I couldn’t focus on things everyone else thought were important, I fidgeted when I should have been focusing, and I focused intently on the wrong things when people wanted my attention elsewhere.

It’s not like there weren’t clues. I excelled as part of an award-winning marching band in high school where marching in unison was expected, but it was like I was out of step with society.

I had difficulties with organization, time management, and sustaining attention in non-stimulating environments.

I couldn’t make important decisions to save my life. I kept putting things off. I had health problems, money problems, interpersonal problems.

I waited until the 11th hour to begin anything important, and things frequently fell through the cracks.

When I was young, what I wanted most was to be “normal.” But the older I got, the more I believed that was never my reality or calling.

Everything changed the day I heard a piece on NPR called “Adult ADHD in the Workplace.” As they discussed what ADHD was and shared six basic questions, I realized I checked five of the six boxes. They shared a link to a website, and I double-checked my results when I got home.

And then I met with a doctor and confirmed the diagnosis. My entire identity changed.

When I tried two different medications that gave me additional focus at the expense of my creativity (and some other small side effects), I sensed, for the first time, that my creativity was somehow tied to my condition. I valued my ability to sling words, see patterns, and make intuitive leaps that others around me couldn’t.

Because I valued my creativity, I ultimately handled my ADHD through other means that I’ll talk about below.

I realized I could either run from my ADHD or embrace it.

I decided to lean into it.

Communication

Knowing is half the battle. Knowing this about myself (and knowing that I was special, not broken) changed the way I saw everything.

I started by talking to my wife Linda and my family about what I was like and gradually increased my communication to include my boss and peers at work.

For some of them, what I told them was no surprise, and my biggest pleasant shock was how cool everyone was about it.

Finally, when appropriate, I shared about my ADHD with people I met out in the world. Letting people know what I was like set expectations and minimized confusion.

Once I had that handled, I moved on to the fun stuff.

ADHD as a Superpower

If attention deficit is the disorder, attention hyper-focus is my superpower.*

During the pandemic, Linda and I watched an interrupted season of The Amazing Race, mostly for Penn and Kim Holderness from YouTube’s The Holderness Family. It was only while watching the show that we learned that Penn was very ADHD. They referred to his ADHD as a superpower, and I saw with my own eyes how his ADHD helped him with pattern recognition, creative outside-the-box thinking, and hyper-focus during challenges.

And watching Penn at work on the show changed how I viewed my own ADHD.

In short, when managed effectively and embraced for its positive attributes, ADHD can empower writers to harness their inner strengths and achieve success in various domains of life.

Understanding ADHD in the Writing Process

People with ADHD exhibit different symptoms such as difficulty maintaining attention, hyperactivity, or impulsive behavior. For writers, these symptoms can manifest as challenges in organizing thoughts, staying on task, and completing projects.

However, it’s also associated with high levels of creativity, the ability to make unique connections, and a propensity for innovative thinking.

Challenges Faced by Writers With ADHD

(The following challenges are common but not universal.)

  • Distraction: Writing progress can be derailed by the lure of new ideas, social media, or even minor environmental changes.
  • Difficulty Organizing Thoughts: It can be daunting to translate a whirlwind of thoughts into coherent, structured writing.
  • Procrastination: Delaying writing tasks in favor of more immediately rewarding activities.
  • Impulsivity: Starting new projects without finishing current ones can lead to a cycle of uncompleted works.

Despite these challenges, many writers with ADHD have developed strategies to thrive.

Strategies and Tools for Writing with ADHD

I decided against medication. Once I took medication off the table, I began leaning harder on software tools to become more organized and to remind myself of important things.

Turning ADHD challenges into advantages requires a combination of personal strategies, environmental adjustments, and technology.

Linda and I are a team—she knows to prompt me to use my tech to capture ideas or thoughts in the moment, and I’ve become better at tracking my ideas by noting them in my phone or on my calendar.

Today, there are more tools available than ever.

Here are several approaches:

1. Structuring the Writing Environment

Minimize Distractions: Create a writing space with minimal visual and auditory distractions. Tools like noise-canceling headphones or apps that play white noise can help.

Establish Routines: Having a set writing schedule can provide structure and make it easier to start writing sessions.

2. Breaking Down Tasks

Use Lists and Outlines: Breaking writing projects into smaller, manageable tasks can make them less daunting. Outlining can also help organize thoughts before diving into writing.

Set Small Goals: Focus on short, achievable objectives, such as writing a certain number of words daily, to build momentum.

3. Leveraging Technology

Calendars: Google Calendar or Fantastical (macOS only) free up my mind and keep me up-to-date.

Writing Software: Applications like Scrivener or Google Docs offer features to organize ideas, research, and drafts in one place.

Time Management Apps: Pomodoro timers or task management apps like Trello can help manage time and keep track of progress.

Pocket: A social bookmarking service for storing, sharing, and discovering web bookmarks.

SnagIt: A screenshot app on my computer where I capture and store screenshots in folders for later use. Also does optical character recognition (OCR) on text strings, allowing me to replicate URLs with copy/paste.

Note-taking Apps:

  • Apple Notes—my second mind that I can access from any of my Internet-connected devices.
  • Notion—a beefier app for more sophisticated note-taking.

4. Embracing the Creative Process

Allow for Free Writing: Set aside time to write without worrying about coherence or structure. This can help capture creative ideas without the pressure of perfection.

Develop a System for Capturing Ideas: Use note-taking apps or carry a notebook to jot down ideas as they come, regardless of the time and place.

5. Seeking Support

Writing Groups: Joining a writing group or participating in writing challenges can provide accountability and motivation.

Professional Help: For some, working with a coach or therapist specializing in ADHD can offer personalized strategies and support.

Success Stories: Writers With ADHD

Many successful writers have ADHD and have spoken about how it affects their creative process. Writers emphasize the importance of embracing their non-linear thinking, and view it not as a hindrance, but as a source of creativity and originality:

Agatha Christie: The “Queen of Crime” was known for her prolific output and intricate plots. Some speculate that her energetic writing style and ability to focus intensely on details could be signs of ADHD.

. . . .

John Irving: The author of The World According to Garp was diagnosed with ADHD as an adult and has spoken about how his condition has both helped and hindered his writing process.

Link to the rest at Writers Helping Writers


An Utterly Misleading Book About Rural America

From The Atlantic:

Rage is the subject of a new book by the political scientist Tom Schaller and the journalist Paul Waldman. White Rural Rage, specifically. In 255 pages, the authors chart the racism, homophobia, xenophobia, violent predilections, and vulnerability to authoritarianism that they claim make white rural voters a unique “threat to American democracy.” White Rural Rage is a screed lobbed at a familiar target of elite liberal ire. Despite this, or perhaps because of it, the authors appeared on Morning Joe, the book inspired an approving column from The New York Times’ Paul Krugman, and its thesis has been a topic of discussion on podcasts from MSNBC’s Chuck Todd and the right-wing firebrand Charlie Kirk. The book has become a New York Times best seller.

. . . .

It has also kindled an academic controversy. In the weeks since its publication, a trio of reviews by political scientists have accused Schaller and Waldman of committing what amounts to academic malpractice, alleging that the authors used shoddy methodologies, misinterpreted data, and distorted studies to substantiate their allegations about white rural Americans. I spoke with more than 20 scholars in the tight-knit rural-studies community, most of them cited in White Rural Rage or thanked in the acknowledgments, and they left me convinced that the book is poorly researched and intellectually dishonest.

White Rural Rage illustrates how willing many members of the U.S. media and the public are to believe, and ultimately launder, abusive accusations against an economically disadvantaged group of people that would provoke sympathy if its members had different skin color and voting habits. That this book was able to make it to print—and onto the best-seller list—before anyone noticed that it has significant errors is a testament to how little powerful people think of white rural Americans. As someone who is from the kind of place the authors demonize—a place that is “rural” in the pejorative, rather than literal, sense—I find White Rural Rage personally offensive. I was so frustrated by its indulgence of familiar stereotypes that I aired several intemperate critiques of the book and its authors on social media. But when I dug deeper, I found that the problems with White Rural Rage extend beyond its anti-rural prejudice. As an academic and a writer, I find Schaller and Waldman’s misuse of other scholars’ research indefensible.

After fact-checking many of the book’s claims and citations, I found a pattern: Most of the problems occur in sections of the book that try to prove that white rural Americans are especially likely to commit or express support for political violence. By bending the facts to fit their chosen scapegoat, Schaller and Waldman not only trade on long-standing stereotypes about dangerous rural people. They mislead the public about the all-too-real threats to our democracy today. As serious scholarship has shown—including some of the very scholarship Schaller and Waldman cite, only to contort it—the right-wing rage we need to worry about is not coming from deep-red rural areas. It is coming from cities and suburbs.

The most obvious problem with White Rural Rage is its refusal to define rural. In a note in the back of the book, the authors write, “What constitutes ‘rural’ and who qualifies as a rural American … depends on who you ask.” Fair enough. The rural-studies scholars I spoke with agreed that there are a variety of competing definitions. But rather than tell us what definition they used, Schaller and Waldman confess that they settled on no definition at all: “We remained agnostic throughout our research and writing by merely reporting the categories and definitions that each pollster, scholar, or researcher used.” In other words, they relied on studies that used different definitions of rural, a decision that conveniently lets them pick and choose whatever research fits their narrative. This is what the scholars I interviewed objected to—they emphasized that the existence of multiple definitions of rural is not an excuse to decline to pick one. “This book amounts to a poor amalgamation of disparate literatures designed to fit a preordained narrative,” Cameron Wimpy, a political scientist at Arkansas State University, told me. It would be like undertaking a book-length study demonizing Irish people, refusing to define what you mean by Irish, and then drawing on studies of native Irish in Ireland, non-Irish immigrants to Ireland, Irish Americans, people who took a 23andMe DNA test that showed Irish ancestry, and Bostonians who get drunk on Saint Patrick’s Day to build your argument about the singular danger of “the Irish.” It’s preposterous.

The authors write that they were “at the mercy of the choices made by the researchers who collected, sorted, classified, and tabulated their results.” But reading between the lines, the authors’ working definition of rural often seems to be “a not-so-nice place where white people live,” irrespective of whether that place is a tiny hamlet or a small city. Some of the most jaw-dropping instances of this come when the authors discuss what they would have you believe is rural America’s bigoted assault on local libraries. “The American Library Association tracked 1,269 efforts to ban books in libraries in 2022,” Schaller and Waldman note. “Many of these efforts occurred in rural areas, where libraries have become a target of controversy over books with LGBTQ+ themes or discussions of racism.” The authors detail attacks on a number of libraries: in Llano, Texas; Ashtabula County, Ohio; Craighead County, Arkansas; Maury County, Tennessee; Boundary County, Idaho; and Jamestown, Michigan.

But half of these locations—Craighead County, Maury County, and Jamestown—do not seem to qualify as rural. What the authors call “rural Jamestown, Michigan,” scores a 1 out of 10 on one of the most popular metrics, the RUCA, used to measure rurality (1 being most urban), and is a quick commute away from the city of Grand Rapids.

That Schaller and Waldman so artfully dodged defining what they mean by rural is a shame for a host of reasons, not the least of which is that the question of who is rural is complex and fascinating. Scholars in rural studies make a distinction between subjective rural identity and objective rural residence—in other words, seeing yourself as rural versus living in a place that is geographically rural according to metrics like RUCA. The thing is, rural identity and rural residence are very, very different. Though Schaller and Waldman mention this distinction briefly in their authors’ note, they do not meaningfully explore it. One political scientist I spoke with, Utah Valley University’s Zoe Nemerever, recently co-authored a paper comparing rural self-identification to residence and found a stunning result: “A minority of respondents who described their neighborhood as rural actually live in an area considered rural.” Her study found that 72 percent of people—at minimum—who saw themselves as living in a rural place did not live in a rural place at all.

It turns out I am one of those people. I grew up in Mechanicsburg, Pennsylvania, an 88 percent white enclave in the southward center of the state. Eighteen minutes and nine miles to the east, you hit the capital city of Harrisburg, which has the best used bookstore in the tristate area. Nineteen minutes and 13 miles away to the west, you hit the game lands, where I spent my teenage years playing hooky and hunting in thick, hard-green mountains. Mechanicsburg feels urban, suburban, and rural all at once. There are strip malls and car dealerships. There are trailer parks and farms with beat-to-hell farmhouses. There are nice suburban neighborhoods with McMansions. My high school had a Future Farmers of America chapter and gave us the first day of deer season off. The final week of my senior year, a kid unballed his fist in the parking lot to show me a bag of heroin. Another wore bow ties and ended up at Harvard.

What do you call a place like that? It was both nice and not-nice. Somewhere and nowhere. Once in college, a professor made a wry joke: Describing a fictional town in a story, he quipped, “It’s the kind of place you see a sign for on the highway, but no one is actually from there.” He paused, racking his brain for an example. “Like Mechanicsburg, Pennsylvania.”

I tend to think of myself as having a comparatively “rural” identity for a variety of reasons: because Mechanicsburg was more rural when I was growing up. Because both sides of my family are from deeply rural places: Mathias, West Virginia (where 100 percent of the county population is rural), and Huntingdon, Pennsylvania (74 percent rural). Because, since the age of 10, I have spent nearly all my free time hunting or fishing, mostly in unambiguously rural areas that are a short drive from where I live. Because people like that professor tend to view my hometown as a place that is so irrelevant, it barely exists. So when Nemerever looked up data on Mechanicsburg and told me it had a RUCA score of 1 and was considered metropolitan—like Schaller and Waldman’s erroneous library examples—I was genuinely surprised. I’d made the same mistake about my own hometown that Schaller and Waldman had about Jamestown, Michigan.

Scholars who study rural identity say that common misperceptions like this are why defining rural is so important. “Researchers should be highly conscious of what ‘rural’ means when they want to measure relevant social, psychological, and political correlates,” a study of “non-rural rural identifiers” by Kristin Lunz Trujillo, a political-science professor at the University of South Carolina, warns. “Rurality can be a social identity that includes a broad group categorization, even including people who do not currently live in a rural area.”

Schaller and Waldman might have understood these nuances—and not repeatedly misidentified rural areas—if they’d meaningfully consulted members of the rural-studies community. In a portion of their acknowledgments section, the authors thank researchers and journalists in the field who “directed our attention to findings of relevance for our inquiry.” I contacted all 10 of these people, hoping to better understand what kind of input Schaller and Waldman sought from subject-matter experts. One said he was satisfied with the way his work had been acknowledged, and another did not respond to my message. Seven reported only a few cursory email exchanges with the authors about the subject of the book and were surprised to find that they had been thanked at all.

Although it is not unusual for authors to thank people they do not know or with whom they corresponded only briefly, it is quite telling that not a single person I spoke with in rural studies—with the exception of the Wilmington College rural historian Keith Orejel, who said he was disappointed that his feedback did not seem to influence the book—said these men sought out their expertise in a serious way, circulated drafts of the book, or simply ran its controversial argument by them in detail.

. . . .

Arlie Hochschild, a celebrated sociologist and the author of Strangers in Their Own Land and a forthcoming book on Appalachia, struck a plaintive note in an email to me about White Rural Rage: “When I think of those I’ve come to know in Pike County, Kentucky—part of the nation’s whitest and second poorest congressional district—I imagine that many would not see themselves in this portrait.” She added that these Kentuckians would no doubt “feel stereotyped by books that talk of ‘rural white rage,’ by people who otherwise claim to honor ‘diversity.’”

Link to the rest at The Atlantic

PG grew up in rural and very rural areas. (He remembers the name of every student in his class in grades 1-6 and can recite them on demand. [No, he was not home-schooled.] He was the valedictorian of his high school graduating class of 22. Out of those 22, only two graduated from college.)

PG was happy to move to a close-in suburb of Chicago to go to college. He used some of his leisure time to ride public transportation to explore all different sorts of neighborhoods in the city, including one in which all the signs were in Polish and another in which the signs were in Greek. After he graduated, he worked in Chicago for several years. During this period, Chicago was the largest Polish city in the world—more Poles lived in Chicago than in Warsaw.

As far as White Rural Rage is concerned, PG remembered that he was required to read a book titled How to Lie with Statistics because his first job out of college involved analyzing a lot of numbers.

From the reading PG did to understand White Rural Rage, it sounded like the authors of the book cherry-picked their statistics to fit their desired conclusions—opinions first, numbers later. And, of course, the book’s publisher was Random House, most of whose management regard New Jersey as terra incognita.

Hong Kong’s Forbidden Apple

From The Wall Street Journal:

The Chinese Communist Party has already crushed freedom in Hong Kong. Now it’s beating a dead horse.

As directors of Next Digital, we saw three years ago the jackboot effect of the security laws Beijing has imposed on Hong Kong. The company published Apple Daily, a lively independent newspaper founded in 1995 by businessman Jimmy Lai. With no judicial process, Hong Kong’s security secretary froze the company’s bank accounts, forcing the paper to close. Mr. Lai was already in jail and is now on trial on national-security charges.

Now the Communist Party is going after Apple Daily’s readers. On June 24, 2021, the paper’s last day of publication, people lined up for hours to show their support by buying a copy. More than a million were sold in a city of fewer than eight million. Under the latest national-security law, which took effect last month, possession of “seditious publications” is a crime. A Hong Kong resident could go to prison for having a keepsake copy of Apple Daily at home.

Think we’re exaggerating? On March 10 the Global Times, a Communist Party propaganda organ, published what it billed as “a rebuttal to Western media hype targeting the law.” The “rebuttal” described Mr. Lai as a “modern-day traitor” and Apple Daily as “the secessionist tabloid, depicted by Western politicians and media as the so-called defender of freedom of speech.”

During what passed for a debate over the bill in Hong Kong’s Legislative Council, lawmaker Peter Koon asked an official to clarify the provision. Apple Daily “is certainly seditious,” Mr. Koon allowed. “But what if some people intend to keep a record of such a bad newspaper and had two copies at home? Would that be counted as possessing seditious publications?”

Security Secretary Chris Tang said that it would depend. The Global Times “rebuttal” quotes him: “For example, ‘I’ve placed it [seditious item] there for a long time, I didn’t know it was still there, the purpose wasn’t to incite, I didn’t know about its existence,’ that could constitute a reasonable excuse.” Feel better? The law also authorizes the police to use “reasonable force” to remove or destroy seditious publications.

. . . .

But more nonmedia companies are likely to find the risks of doing business in Hong Kong too great. The new law expands on the old one by incorporating China’s criminalizing of “state secrets.” That can include not only journalism but also the kind of information-gathering routinely done in a global financial center by industry and company analysts, investors, consultants, lawyers and accountants.

Link to the rest at The Wall Street Journal

PG says this is a sobering story likely to spur sales of outbound airline tickets from Hong Kong. Such activity will move a great many intelligent and talented people out of Hong Kong for long-term stays outside of China.

Perhaps Chinese leaders believe the nation has a surplus of such individuals within its 1.4 billion population, but PG thinks they’re mistaken.

Could your marriage survive a shipwreck?

From The Economist:

When the captain of a Korean tuna-fishing ship first caught sight of Maurice and Maralyn Bailey bobbing in the Pacific Ocean in June 1973, he could not work out what they were. About four months earlier, while they were trying to sail from England to New Zealand, their boat had crashed into a sperm whale. The couple was stranded in a tiny inflatable life-raft tied to a flimsy dinghy, subsisting on rainwater, turtles whose throats they had slit, birds they strangled and sharks they suffocated. Unable to stand and wearing clothes that had disintegrated, their skeletons looked ready to burst through their skin.

Miraculously, they were alive. In “Maurice and Maralyn”, Sophie Elmhirst . . . draws on her own reporting, Maralyn’s diaries, the memoirs that the two published and a trove of news clips to tell the tale of this couple’s survival—and the love story that bubbled alongside.

The pair had met a decade earlier at a car rally in Derby, England. Maurice was older and flew planes, but was lonely, awkward and felt that he had “so much to wade through before he could do anything”. Maralyn was chatty and brave and, unlike Maurice, “seemed to know instinctively how to do things”. They married, bought a home and then sold it, using the money to buy a yacht. Their dream was to become explorers.

They found their adventure. But despite the seafaring and adrenaline, in “Maurice and Maralyn” it is love itself, “a terrifying fluke”, which makes life extraordinary. Two people choose and are chosen, “and, most unlikely of all, these choices must happen at roughly the same time”, Ms Elmhirst writes.

Link to the rest at The Economist

‘God’s Ghostwriters’ Review: The Bible’s Hidden Contributors

From The Wall Street Journal:

‘The stupid, the lowborn, the gullible; slaves, women, and children.” For the second-century pagan writer Celsus, it was easy to sneer at the adherents of the new Christian faith as a basket of deplorables. Still, insults often contain a grain of truth. In his point-by-point rebuttal of Celsus’ anti-Christian polemic a century or so later, the theologian Origen doesn’t dispute this particular charge. Yes, the lowborn, the uneducated, the marginalized were indeed at the core of the Christian mission: That was the point. Today, most theologians would accept that Celsus was right to foreground the crucial role of women in shaping the early church. In “God’s Ghostwriters,” Candida Moss attempts to make a similar case for the role of enslaved people. It is hard to imagine a reader who wouldn’t find this a thrilling, if at times infuriating, book.

Ms. Moss, a professor of theology at the University of Birmingham, is the author of several spiky and provocative revisionist studies of the early church. In “The Myth of Persecution” (2013), she argues that early Christian martyrdom was an overwhelmingly fictional phenomenon; the magnificent “Divine Bodies” (2019) is an exploration of the concept of bodily resurrection. In “God’s Ghostwriters,” she sets out to recover the contributions made by enslaved men and women to the development of the church in (roughly) the first two centuries after Christ.

In fact, “God’s Ghostwriters” is by far the best account we have of the roles played by enslaved people in supporting the high literary culture of the ancient world more broadly. Famously, Pliny the Elder died in A.D. 79 while composing an eyewitness account of the eruption of Mount Vesuvius; less famously, Ms. Moss conjectures, at least one enslaved shorthand writer must have perished at his side (no self-respecting Roman carried his own notebook). Throughout antiquity, every stage of literary composition, dissemination and reception was facilitated by enslaved letter-carriers, copyists and readers. As Ms. Moss reminds us, even reading a book generally meant listening to an enslaved person, who was himself reading from a scroll copied out by another enslaved person.

“God’s Ghostwriters” makes a more radical and specific claim: that enslaved people were integral to the formation of the New Testament. Ms. Moss’s key concept is that of co-authorship. When an author dictated his or her ideas to an enslaved scribe, the scribe, she argues, was much more than an animate dictaphone: “Their interpretative work,” she argues, “gave shape to the thoughts and words of the speaker and made them an indispensable part of the compositional process.”

In a mundane sense, that is obviously true. The tricky question is how far we can legitimately stretch this idea of giving “shape.” Ms. Moss is admirably keen to engage in a project of “ethical reading that is reparative as it listens to neglected voices . . . to read with erased collaborators and to attend to invisible actors.” The trouble with invisible actors is precisely their invisibility. How can we tell if we are “attending” to the muffled voices of enslaved scribes, or simply imagining them? When does “ethical reading” tip into wishful thinking?

Take the epistles of Paul. Several of his letters (Colossians, Ephesians, Philippians, Philemon) were demonstrably written from prison. The vivid narrative of Acts shows that he was imprisoned multiple times—at Philippi, Caesarea and Rome. Many scholars think, for good reasons, that the “prison epistles” were in fact written during a further imprisonment at Ephesus.

Ms. Moss begins by imagining what Paul’s hypothetical Ephesian prison might have been like: “damp, moldy, and bitterly cold . . . almost entirely dark, with the only natural light in the room entering through a small lunate opening close to the ceiling.” How could Paul have written his epistles in this ghastly place? “We might imagine,” Ms. Moss argues, “that someone—perhaps the secretary of one of Paul’s wealthier followers, or perhaps a street-corner scribe hired for the day—squatted next to the window with stylus in hand and wax tablet balanced on his thigh, ready to take dictation.”

All well and good, but once this hypothetical street-corner scribe finished noting down the text of Philippians (say), what then? “He read the letter back to Paul, but there may not have been an opportunity for the prisoner to review the final draft for errors or ambiguities. The secretary, therefore, had considerable influence over the text. . . . Many scribes were used to improving the style of their customers.” Ms. Moss reminds us that Paul himself claimed to be no great shakes as a public speaker: “It is impossible to prove that Paul’s secretaries came up with the turns of phrase, rhetorical flourishes, or intellectual arguments for which the Pauline epistles are known, but there are hints that they might have.”

No one can disprove any link in this fragile chain of hypotheticals. Yes, in theory, Philippians might have been dictated through a prison window to an enslaved scribe—though a spoilsport might note that no moldy dungeons featured in Paul’s imprisonments at Caesarea and Rome: Both were rather sociable and comfortable spells of house arrest. Yes, in theory, an enslaved scribe might have inserted words or phrases or “intellectual arguments” of his own. Yes, after his release from prison, Paul might never have gotten around to removing the enslaved scribe’s surreptitious contributions to the text of Philippians. Or then again . . .

Where I really started fidgeting was when Ms. Moss went in search of specific passages contributed by “might-have” scribes of this kind. Few Pauline images are better known than the haunting passage of 1 Thessalonians 5, which likens the coming of the Day of the Lord to a thief in the night: “Therefore let us not sleep, as do others; but let us watch and be sober.” Ms. Moss, quite correctly, notes that night watchmen in antiquity were often enslaved people. How did Paul—or, as Ms. Moss would have it, “Paul and his collaborator”—come up with the beautiful image of the watcher in the night? “Arguably,” she suggests, “the idea comes from an enslaved scribe who themselves may have spent some exhausting nights awake.”

Link to the rest at The Wall Street Journal

Ian Fleming

From The Wall Street Journal:

From the first arresting moment in Nicholas Shakespeare’s biography “Ian Fleming: The Complete Man” it is clear that we are in good hands. At a hastily arranged funeral in a village church, Fleming’s widow arrives late, accompanying his coffin, causing the ceremony to be restarted and thereby demonstrating that, “as in life, so in death, a strong woman had played a defining role.” Eager to learn more, we gladly enter a monumental edifice of a book that at first glance seems somewhat daunting.

What lies ahead could, after all, be the literary equivalent of a country-house tour that winds through room after room of arcane objects, past portraits of the rich and reprehensible. For Fleming’s life, though relatively short (he died in 1964 at the age of 56), was crammed not only with stuff—the handmade cigarettes, the gold-plated typewriter—but also with personages. He knew everybody, from Winston Churchill and JFK to Claudette Colbert and Truman Capote, not to mention a host of military men and secret agents. Depending on whom you believe, Fleming also played a vital undercover role in World War II as well as the Cold War. Then he created James Bond and became an industry himself.

Fleming was “the son of wealth, but the grandson of poverty,” as Mr. Shakespeare tells it, his grandfather Robert having come from nothing to become, by 1928, a merchant banker controlling “maybe a trillion pounds” in today’s money. And Robert Fleming is wonderfully described here in all his canniness and thrift, keeping silver in one trouser pocket and pennies in the other for fear of overtipping. Fleming’s childhood was a fairly typical one of social privilege and emotional deprivation, shadowed by the tragedy of his father’s death when Ian was 9 and presided over by a willful, narcissistic mother. His peerless father, Val, having been killed in World War I (Churchill penned his obituary) and his older brother, Peter, being a famous explorer and writer, young Ian had more than one legend against which to measure himself.

Educated (and sadistically flogged) at Dunford and then, at the age of 13, at Eton, the academically lazy but athletically talented boy was molded to enter not only elite British society but also the shadowy world of espionage. “It was a spy network already in the making,” Mr. Shakespeare writes of Eton, “a class of English men raised to rule the Empire . . . all known to one another from boyhood.” Even in his Moscow exile, the disgraced “Cambridge Five” spy Guy Burgess still wore his old school tie.

Fleming attended Sandhurst military academy but left prematurely in 1927, having become ill with gonorrhea. Dispatched to an academy-sanatorium in Germany, he considered the enlightened theories of the Viennese psychoanalyst Alfred Adler and dallied with women. (“His general taste,” a friend observed some years later, “was for tarts who looked like nice girls.”) A love of literature was also engendered, even though Fleming was being officially trained for a career in the British foreign service. By 1930 the unruly youth had a temporary job at the League of Nations in Geneva, where “he went to work at 8.30 a.m., walking around an old dog that lay on the steps at the entrance.” Drowsy Europe too lies on the threshold of disaster.

In 1931, however, thanks to his mother’s social connections, Fleming traded diplomacy for journalism, initially working at Reuters, where his early assignments included “sport, motor-racing, business, obituaries, and politics.” The next career step was both inevitable and timely. In 1939 Fleming was recruited to be the new assistant to the head of the Admiralty’s Naval Intelligence Division, and his role in confounding both Nazi and Soviet intelligence networks emerges here as vital. Mr. Shakespeare finds intrigue of all kinds to untangle—personal and political, domestic and international—when his subject becomes first an espionage professional and later a novelist courted by the likes of John F. Kennedy, who turned to Fleming for assassination tips.

. . . .

Bond’s first outing, “Casino Royale,” was published on April 15, 1953, and though reviews were positive (“Ian Fleming has discovered the secret of narrative art. The reader has to go on reading”), sales were slim. “My profits from Casino will just about keep Ann in asparagus over Coronation week,” Fleming groused. Further volumes followed, but it was the Suez Crisis of 1956 that, in Mr. Shakespeare’s words, “saved Bond.” When the ailing British Prime Minister Anthony Eden decided to convalesce at Goldeneye, Fleming’s Jamaican estate, it created a sensation. Book sales of the series soared and the fictional spy’s future was assured. “Peter Pan with a gun,” as Mr. Shakespeare calls him, would never grow old. Whether airborne or underwater, trading blows or banter, the suave Bond was forever Britain as it wished itself to be.

Much of the factual detail of Fleming’s life has been examined by previous biographers, notably John Pearson (“The Life of Ian Fleming,” 1966) and Andrew Lycett (“Ian Fleming,” 1995), whose work and assistance Mr. Shakespeare acknowledges. He also lists “other excellent, if partial accounts,” including Ben Macintyre’s 2008 “For Your Eyes Only.” Given these previous exhumations, Mr. Shakespeare was cautious about conducting another. When invited to do so by the Fleming estate, however, he was gratified to unearth a fresh specimen. Not, he writes, the “prickly, self-centred bounder” he imagined but “another, more luminous person.” A Fleming of many contradictions consequently emerges: loving yet cruel, arrogant yet insecure, spiteful yet generous.

In the end he could afford to be; success made Fleming rich. In Mr. Shakespeare’s astute opinion, the inimitable Bond also retrieved for his creator “the epoch in which he had thrived, young, single and free,” while repairing the damage inflicted on the British psyche by the 1951 defection of Burgess and fellow spy Donald Maclean.

Link to the rest at The Wall Street Journal

I Love You, Maradona

From The Paris Review:

While reading Maradona’s autobiography this past winter, I found that every few pages I would whisper or write in the margins, “I love you, Maradona.” Sadness crept up on me as I turned to the last chapter, and it intensified to heartbreak when I read its first lines: “They say I can’t keep quiet, that I talk about everything, and it’s true. They say I fell out with the Pope. It’s true.” I was devastated to be leaving Maradona’s world and returning to the ordinary one, where nobody ever picks a fight with the Pope. 

I started reading El Diego: The Autobiography of the World’s Greatest Footballer, ghostwritten by Marcela Mora y Araujo, on the basis of a recommendation by an editor I have liked working with. He said reading it was the most fun he’d had with a book. I came to El Diego with basically no knowledge of Maradona or even of soccer. I would have said I hated soccer actually. I hate the buzzing noise the crowds make on the TV. But from the very first page I found Maradona’s voice so addictive and original that reading El Diego felt like falling in love. 

Maradona’s skirmish with the Pope goes the way of much else in the book. Because of his extraordinary talents and global fame, Maradona is invited to the Vatican with his family. The Pope gives each of them a rosary to say, and he tells Maradona that he has been given a special one. Maradona checks with his mother and discovers that they have the same rosary. He goes back to confront the Pope and is outraged when the Pope pats him on the back and carries on walking. 

“Total lack of respect!” Maradona fumes. “It’s why I’ve got angry with so many people: because they are two-faced, because they say one thing here and then another thing there, because they’d stab you in the back, because they lie. If I were to talk about all the people I’ve fallen out with over the years, I’d need one of those encyclopedias, there would be volumes.” 

Whether it be FIFA, money-hungry managers, angry fans, the Mafia, drug tests, or the tabloids, Maradona never takes anything lying down. He stews and stews, and this fuels him to play better and better soccer. There is an Argentinean word Maradona uses for this: bronca. Mora y Araujo explains, in an introduction in which she lovingly details the difficulties of putting Maradona’s unique voice down on the page, that this basically means “fury, hatred, resentment, bitter discontent.” But the difficulty of choosing a translation led her to simply leave bronca, and many of Maradona’s other favorite catchphrases, as they were. The result is a narrative voice which is totally distinct, and an overall energy out of sync with the pristine, restrained public image most celebrities seek to cultivate, especially in the social media age.

Maradona first learned to play soccer on the streets of Villa Fiorito, the extremely poor city on the outskirts of Buenos Aires where he was born. He played all day in the blazing heat and then when the sun went down too. Early on in the book he says: “When I hear someone going on about how in such and such a stadium there’s no light, I think: I played in the dark, you son of a bitch!”

I still don’t know enough about soccer to verify my impression that he is one of the greatest soccer players ever. But Wikipedia asserts this too. El Diego tells the story of his extraordinary rise through the world of small, local kids’ clubs to a glittering career which involved World Cups (one of which he captained Argentina for), a transformative stint for the Italian team Napoli, setting the world record for transfer fees twice, scoring a famous handball goal against England, and lots of other things I don’t really understand properly but felt enormously gripped by.   

It’s an incredible life story, shadowed by, as well as his constant fights, a cocaine habit and a string of extramarital dalliances. But mostly I was gripped by the way he tells it. At one point, when FIFA bans him from a match, he says: “My legs had been cut off, my soul had been destroyed.”

I started El Diego in the airport, on my way back to Belfast for Christmas. A young man on my flight pointed at my book and asked me what I was reading. (I discovered over the next few weeks that reading the book in public places was a magnet for men.) I showed him the cover.  

He said: Oh yeah I thought it said Maradona. You like football? 

I said: Oh no I don’t know anything about football. 

He said: Why are you reading it then? 

At the time I told him it was because I wanted to read something different from what I usually read. If he’d asked me the same thing when I finished it, I would have said it’s not really about football. It’s about being in love. It’s about the little guy against the big guy, I would have declared. And believing in something. And respect. It’s about having a sense of who you are.

Link to the rest at The Paris Review

Write Like a Man

From The Wall Street Journal:

Many years ago, at a dinner party attended by some of the ex-radicals turned Cold Warriors known as the New York intellectuals, the table talk turned to denigrating writers reputed to be soft on communism and praising the “hard” anticommunists who were fighting for democracy and freedom. Before the guests could become too complacent, however, the literary critic Diana Trilling stood up and declared: “None of you men are hard enough for me!”

In “Write Like a Man,” Ronnie A. Grinberg recounts this scene to illustrate how members of this “testosterone-driven literary circle,” as she calls it, “came to espouse a secular Jewish machismo” as they reinvented both themselves and liberalism to meet the exigencies of Cold War politics. When old anxieties were magnified by new ideological challenges, Ms. Grinberg writes, “a masculinity centered on strength, toughness, and virility” became a defining feature of New York intellectual life.

Ms. Grinberg, a history professor at the University of Oklahoma, briefly takes up American Jewish novelists who were eager to defy stereotypes—of the timid schlemiel or bookish Talmudic scholar—and to overcome what Norman Mailer called “the softness of a man early accustomed to motherlove.” But her real focus is on a group of intellectuals—men and women both—who “prized verbal combativeness, polemical aggression, and an unflinching style of argumentation.”

The principal figure here is Diana Trilling, a brilliant essayist and the wife of the celebrated cultural critic Lionel Trilling. Diana, “the more abrasive of the two,” Ms. Grinberg writes, balanced her husband’s checkbook and deftly edited his drafts. But when she offered similar editorial help to various male friends, they took it (as she herself reported) “as an assault on their masculinity.” According to the novelist and memoirist Ann Birstein, many men in this milieu “feared losing their manhood to literary women.” She noted that “reviews of my books still referred to me in parenthesis as Mrs. Alfred Kazin, as if that were a career in itself.”

Like other “literary wives”—Zelda Fitzgerald, Veza Canetti and, in this book, the essayist Pearl Kazin Bell come to mind—Diana struggled to emerge from the shadow of her husband’s reputation and establish herself in her own right. “I wanted as much for him as he wanted for himself,” she said of Lionel, the first Jew granted tenure at Columbia University’s English department, “and more than I wanted for myself.”

By the time of his death in 1975, she had published a single collection of essays. But along with other doyennes who held their own in this crowd—including the political theorist Hannah Arendt and the historian Gertrude Himmelfarb—Diana showed herself to be a vigorous and productive thinker. She earned a place at the table with her devotion to what she called “the life of significant contention” and to what Ms. Grinberg calls the determination to “write like a man.”

In her criticism and cultural portraiture, Diana Trilling delivered sharp polemical thrusts aimed at all sorts of targets, including male writers from her own milieu. She said that Saul Bellow’s debut novel, “Dangling Man” (1944), was among the “small novels of sterility.” She boasted that “a Viennese novelist, a refugee from Nazi Austria, was said to have remarked that he had lost his country, his home, his language, but that he had at least one good fortune: he has not been reviewed by me.” Such sharpness could cut both ways. On reading her attack on the student takeover of Columbia in 1968, the poet Robert Lowell (a fan of the students, to say the least) dismissed her as “some housekeeping goddess of reason.”

The agitation of the 1960s brought other intellectual polemicists to the fore, not least Norman Podhoretz (longtime editor of Commentary) and Midge Decter, who argued that the left needed to do some housekeeping of its own—by rejecting a political outlook that favored appeasement and opposed the muscular exercise of American power. Together the couple helped define what came to be called “neoconservatism.” Decter in particular found herself at odds with the ascendant women’s liberation movement and the ways it undermined traditional standards—moral and aesthetic both. In a chapter called “the first lady of neoconservatism,” Ms. Grinberg argues that this antipathy was at the very heart of a sensibility Decter shared with other women in this group, including Arendt, Himmelfarb and Susan Sontag. “Despite the sexism they encountered,” Ms. Grinberg writes, they “disparaged feminists.” They expected real writers to address “serious topics with masculine drive and ruthlessness. . . . In their view, feminists did not meet this standard.”

Ms. Grinberg rounds out the group portrait with Irving Howe, the anthologizer of Yiddish literature, author of the magisterial “World of Our Fathers” (1976), and presiding sage of Dissent, the magazine he launched in 1954. Unlike others in this cohort who found themselves “mugged by reality” and moving from left to right, Howe kept his socialist allegiances. But like the neoconservatives, he had little patience for either the strident “desperadoes” of the New Left or the grievances of the feminists. In a piece Decter commissioned him to write for Harper’s, Howe skewered the feminist Kate Millett as a “figment of the Zeitgeist” and scorned her bestseller, “Sexual Politics” (1970), as “intellectual goulash.”

In his essays “The Lost Young Intellectual” and “The New York Intellectuals: A Chronicle and a Critique,” Howe recognized that, despite the internecine quarrels carried out in the pages of Partisan Review, Commentary and Dissent, these Americanized writers striving to “make it” shared two states of mind. First, profound guilt over the helplessness that they or their families felt as they watched, from safe perches on the banks of the Hudson, the destruction of Europe’s Jews during the Shoah. Second, filial impiety: They often saw their immigrant fathers as failed breadwinners, as men who were, Ms. Grinberg writes, “in their own sons’ eyes, emasculated.” Howe’s father, an immigrant to the Bronx from the Russian Pale of Settlement, had gone bankrupt during the Depression. For others, the failure might be less literal, but the judgment seemed broadly to apply.

Link to the rest at The Wall Street Journal

A Love That Endured Life’s March Madness

From The Wall Street Journal:

In 1956 the Ohio State University basketball team moved out of the drafty old Ohio Fairgrounds Coliseum and into the newly constructed St. John Arena. As a 9-year-old, I was excited to get a ticket to one of the daytime games.

I have no memory of who Ohio State was playing that afternoon, but there is one thing I have never forgotten.

The captain of the Buckeyes, Gene Millard, would bring the ball down the court and for the whole game there was a pretty young woman sitting in the stands near my elementary-school buddies and me, shouting enthusiastically. “Geno!” “Geno!” “Geno!” She never stopped. He was the entire focus of her attention.

I thought at the time that it must be so great to be out on the court and have someone like you so much.

I didn’t know it that day, but she was his wife. Gene Millard and Sally Settle had met while in eighth grade in Dayton, Ohio, and married while college sophomores.

After Mr. Millard’s senior season it was five years before I had reason to think about him again—when he arrived at our town’s high school as the new history teacher and basketball coach.

We could tell immediately what a good guy he was—unfailingly friendly, thoroughly unfull of himself, available to talk with anyone, athlete or not. From the moment he and Sally came to our community—Bexley, Ohio—they were a solid part of it. They raised their children there, regularly attended neighborhood events, were devoted members of the Bexley United Methodist Church.

The one constant was that they were always Sally and Gene. They came as a package deal—you couldn’t think about one without thinking about the other. Years and then generations passed, and they were no longer the young hotshot on the Ohio State hardwood and the adoring young wife in the stands. They had become something more important: the soul of the community, its quintessence. That kind of thing can happen in a small American town. You could move away, but every time you thought of home, you thought of Sally and Gene.

Gene finally retired as the new century began, but the Millards remained in town. Speaking about Sally, Lee Caryer—one of the students who was there when she and Gene first arrived—said: “No matter how you met her, she remembered your name and cared about your life.”

In the wider world of basketball, March Madness is under way. When the college tournament ends the CBS television network will, as always, play the song “One Shining Moment” to celebrate the new national champions.

Yet there are championships in this life, and then there are championships. Gene and Sally Millard had decades of shining moments together—68 years of marriage. She died in January at 88. Gene, 89 and without Sally for the first time since eighth grade, is living with one of his sons.

Link to the rest at The Wall Street Journal

A new book rebukes the “luxury beliefs” of America’s upper class

From The Economist:

While applying to Yale University (pictured) in 2014, Rob Henderson visited New Haven for the first time. He stayed with a friend of a friend, whose cat was called Learned Claw (an obscure, pretentious allusion to an American judge of the mid-20th century who went by the name of Learned Hand). Mr Henderson did not get the reference. When he arrived at Yale more cultural mysteries awaited. Everyone raved about “The West Wing”, a television show he had never watched, and “Hamilton”, a musical he could not afford to see.

More Yale students come from families in the top 1% of income than from the bottom 60%. Mr Henderson was among the less-affluent minority. He had been removed from his drug-addicted mother when he was three years old and lived with nine different foster families before his eighth birthday. Scared, insecure and angry, he soon began to drink, take drugs and get into fights.

At the age of 17, as his peers started going to prison, he signed up to the armed forces on a whim. Mr Henderson thrived in the structured, disciplined system and spent seven years in the US Air Force. It became clear he was highly intelligent, so he was encouraged to apply to college through the GI Bill. (He has recently finished a doctorate at Cambridge University.)

“Troubled” is the compelling story of his chaotic childhood, his time at Yale and what it all made him think about divisions in America. As a result of his experience, Mr Henderson has coined the concept of “luxury beliefs”, which he describes as “a set of beliefs that confer status on the upper class at very little cost, while inflicting costs upon the lower classes”.

In the past, people displayed their membership of the upper class either by doing things “like golf or beagling” that no working person would have time to do, or through their material accoutrements. But today, leisure time and luxury goods are more accessible to everyone, so it has become harder for the elites to separate themselves from the hoi polloi. Their solution? “The affluent have decoupled social status from goods and reattached it to beliefs.”

Mr Henderson gives the example of support for defunding the police. The idea gained traction in the wake of George Floyd’s murder in 2020 and has been championed by many affluent people. However, it is an unpopular policy among poor people—exactly those whom the well-meaning college kids say they are trying to help—and leads to higher homicide rates.

“Troubled” is more than a fascinating memoir, as it analyses the controversial belief systems that have gripped American universities. But it does so without being an angry culture-war screed. Mr Henderson makes no statement of political affiliation. Lots of what he writes is simply common sense. It is what much of middle America believes.

Mr Henderson exposes the stupidity of what now passes for orthodoxy, such as the way the luxury-belief class claims that the unhappiness associated with substance abuse or obesity, for instance, “primarily stems from the negative social judgments they elicit, rather than the behaviours and choices themselves”. The well-off “validate and affirm the behaviours, decisions, and attitudes of marginalised and deprived kids” in a way “that they would never accept for themselves or their own children”. One classmate argues that monogamy is “outdated” but admits that she was raised by two parents and intends to have a monogamous marriage herself.

Link to the rest at The Economist

Waiting to Be Arrested at Night review – the Uyghurs’ fight for survival in a society where repression is routine

From The Guardian:

A group of Uyghur friends are having a late-night chat. “I wish the Chinese would just conquer the world,” one says suddenly. “Why do you say that?” another asks, surprised. “The world doesn’t care what happens to us,” the first man replies. “Since we can’t have freedom anyway, let the whole world taste subjugation. Then we would all be the same. We wouldn’t be alone in our suffering.”

It is an understandable outburst of bitterness. The Uyghurs are a Muslim minority who live mainly in China’s north-western Xinjiang region. They have long faced discrimination and persecution. Since 2016, the repression has greatly intensified, with mass detention, forced sterilisation and abortion, the separation of thousands of children from their parents, and the razing of thousands of mosques. Yet support for Uyghurs has been equivocal, not least from Muslim-majority countries, many of which are outraged by the burning of a Qur’an in Sweden but remain silent about the detention of more than 1 million Uyghurs in Xinjiang, for fear of upsetting Beijing.

Tahir Hamut Izgil’s Waiting to Be Arrested at Night, which recounts that conversation, is not, however, a bitter book. It is suffused, rather, by a deep sense of sadness, and of despondency even amid hope. “Yet our words could undo nothing here,/even the things we brought to be”, as one of Izgil’s poems laments.

A poet and film-maker, Izgil is famed for bringing a modernist sensibility to Uyghur poetry. He did not set out to be a political activist. The very fact of being a Uyghur, though, in a country that seeks to erase Uyghur existence, both culturally and physically, turns everyday life into a political act. And for a poet living in a culture within which “verse is woven into daily life”, writing is necessarily also an act of witness and of resistance.

Despite the subtitle of the book – “A Uyghur Poet’s Memoir of China’s Genocide” – there are no depictions here of genocide, or of torture, or even of violence. We know all these things are happening, but off-page. Izgil’s memoir is a story about how to survive in, and to negotiate one’s way through, a society in which repression has become routine, and the power of the state is unfettered. The book’s restraint is also its strength. The tension in the narrative flows from the dread captured in the title – the dread of waiting to be arrested, to be vanished into detention, a dread no Uyghur can escape.

Beijing’s strategy has been, over the past decade, to cut Uyghurs off from the rest of the world and from one another, too. When censorship and surveillance made it impossible to link to the internet beyond the Chinese firewall, many Uyghurs took to keeping in touch with the outside world through shortwave radios. Until, that is, the government banned the sale of such radios and organised mass raids into people’s homes to confiscate them. “We suddenly found ourselves living like frogs at the bottom of a well,” Izgil observes.

Beijing seeks to cut off Uyghurs from their past and their traditions, too. Qur’ans are seized and history books banned, including many previously authorised by the state. Even personal names become part of the assault on Uyghur culture. Beijing’s list of prohibited names tells Uyghurs what they cannot call their children. Some names are apparently too “Muslim” – Aisha, Fatima, Saifuddin; others, such as Arafat, too political. When the list was first introduced, newspapers carried announcements such as: “My son’s birth name was Arafat Ablikim. From now on he will be known as Bekhtiyar Ablikim.”

The greatest dread is of the physical repression wreaked upon Uyghurs: mass detentions, torture, violence. We get a glimpse of the horror when Izgil and his wife, Marhaba, attend a police station to have their biometric details collected—fingerprints, blood samples, facial scans. Along a basement corridor, they see a cell fitted out with iron restraints and a notorious “tiger chair”, used to force detainees into agonising stress positions. On the floor are bloodstains.

People start disappearing, first in small numbers, eventually up to 1 million. They are taken to “study centres” – the code for mass detention camps – though nobody knows which one. “They simply vanished,” Izgil writes.

The police knocked on the door when “your name was on the list”. There was, though, “no way to know if or when your name would show up on the list. We all lived within this frightening uncertainty.” It spawned a climate in which people feared one another as much as they feared the authorities.

Link to the rest at The Guardian

A Well-Contained Life

From The Paris Review:

What can’t be contained? Not much. We are given the resources, mental or physical, to contain our emotions and our belongings. Failing to do so often registers as weakness. 

The smallest container you can buy at the Container Store is a rectangular crystal-clear plastic box available in orange, purple, and green. It can contain one AA or two AAA batteries, half a handful of Tic Tacs, or a folded-up tissue. The largest container you can buy at the Container Store is a four-tiered metal shelving unit. It can contain other containers.

Containers mediate us and our stuff. They create boundaries and allow our items to exist multiple feet above the ground. Most spaces are divided by containers. These containers might then be divided by additional containers. Containers form a scaffold, or an architecture. They make walls scalable and underbeds reachable. They allow you to put something down and know where it is the next time you want to pick it up. 

One of the best ways to understand containers is to imagine a world without them. We would have piles. Bracelets, creams, stick-shaped kitchen items, fruit. Small things would get lost under big ones. Or, an alternative: a line of items that snakes through an apartment or house, up and down stairs and spiraling into the center of the room. When you want to find something, you simply walk along the line of items, confronting each individual thing. 

We use containers to solve the problem of stuff. At the Container Store, containers solve other problems, too—problems we didn’t know we had. A Parking Guide provides you with a mat on which to park. A RollDown Egg Dispenser rolls your eggs. A Stackable Sweater Drawer creates a sweater-only space for sweaters. A Cheese Keeper keeps your cheese, and a Small Cube Sleeve serves as a sleeve for your cube. 

There are plenty of analogies to choose from when describing a body, but one rather insufficient one is a container for our organs, blood, souls. The problem with this analogy is that our bodies are more than just containers. We can’t untangle our bodily experience from the feeling of existing in the world. In this case, the container is not a neutral scaffold. 

Is this true at the Container Store, too? Many containers certainly try to be as neutral as possible, made from clear, thin acrylic, or a neutral-tone rattan. They sell a promise: Once you use me to sort your trouser socks, you won’t even know I’m there. The Container Store refers to their products as “organizational solutions”—a way of dealing with something rather than a thing to deal with. The thing itself is little more than the solution it offers.

And then you come to the hampers and think, If the hamper was simply a solution to the problem of storing dirty laundry, why must I choose whether I want it in plastic, canvas, or bamboo? And then you spot the Small Scalloped Edge Faux Rattan Bin and think, What’s keeping me from buying the Small Scalloped Edge Faux Rattan Bin, even if I didn’t have anything to put in it? After spending a certain amount of time in the Container Store, the containers that are meant to organize, divide, and store look less like solutions and more like stuff. Following this line of thinking is a great way to leave empty-handed. 

The Container Store isn’t safe for the problem-less and adequately organized. The Container Store is better suited for the overflowing, the misplaced, and those lacking sectioned parts. Most of us do have a problem that the Small Scalloped Edge Faux Rattan Bin could fix. Only the bravest would buy it, place it on the table, and wait for it to find the problem for itself. 

Link to the rest at The Paris Review

As any visitor who has spent much time on TPV knows, PG includes excerpts with links back to the original piece/post. The only exception to this rule is when he posts poems. For PG, a well-written poem is a lovely whole from start to finish.

PG found the OP to be such an excellent short essay that he couldn’t bring himself to excise any portion of it. So that The Paris Review doesn’t suffer a dearth of page views as a result of PG’s violation of TPV protocols, click HERE to check it out. If you haven’t visited The Paris Review based on PG’s previous links, you’ll find lots of interesting writing about subjects you’re not likely to encounter elsewhere.

One additional location that will interest many visitors to TPV is the publication’s Back Issues Section. The Paris Review was first published in 1953. While you can purchase back issues, you can also browse online tables of contents and see excerpts from each issue without becoming a subscriber. The first issue – Spring 1953 – included E. M. Forster on the Art of Fiction; William Styron’s Letter to an Editor; stories by Peter Matthiessen, Terry Southern, and Eugene Walter; and poems by Robert Bly, Donald Hall, and George Steiner.

The Back Issues Section also includes an Author Index that lists the authors and, in some instances, the subjects of every interview, story, poem, essay, and portfolio published – over 5,000 – with a hyperlink to a short synopsis of each work. One example is Reel to Reel, about Louis Armstrong:

In a typical year, Louis Armstrong spent more than three hundred days on the road, bringing his music to audiences around the world. He always traveled with a steamer trunk designed to house two reel-to-reel tape decks and a turntable, and he carried a stash of music for his own listening pleasure, to while away the hours he spent in hotels and dressing rooms before and after each gig. Tapes being less fragile than LPs, and possessing longer recording capacity, he ultimately transferred much of his collection to seven-inch reels. He also made mix tapes of his favorite tunes. He liked musicians who prized melody, and his selections range from Glenn Miller to Jelly Roll Morton to Tchaikovsky. Occasionally he added commentary over the music or played along, and he made copies of his own recordings, to which (unlike many musicians) he enjoyed listening. But often he would just turn on the recorder to capture everyday conversations, whether he was hanging out at home in his living room with his wife Lucille, telling jokes backstage with band members, being interviewed by reporters, or entertaining fans.

Here’s another excerpt from Backyard Bird Diary written by Amy Tan:

September 16, 2017
While watching hummingbirds buzz around me, I recalled a fantasy every child has: that I could win the trust of wild animals and they would willingly come to me. I imagined tiny avian helicopters dining on my palm. To lure them, I bought Lilliputian hummingbird feeders, four for $10. Hope came cheap enough, but I was also realistic. It might take months to gain a hummingbird’s interest in the feeder and for it to lose its fear of me.
Yesterday, I set a little feeder on the rail near the regular hummingbird feeders on the patio and then sat at a table about ten feet away. Within minutes, a hummingbird came to inspect, a male with a flashing red head. He hovered, gave a cursory glance, and then left. At least he noticed it. A good beginning. Then he returned, inspected it again from different angles, and left. The third time, he did a little dance around the feeder, approached, and stuck his bill in the hole and drank. I was astonished. That was fast. Other hummingbirds came, and they did their usual territorial display of chasing each other off before the victor returned. Throughout the day, I noticed that the hummingbirds seemed to prefer the little feeder over the larger one. Why was that? Because it was new and they had to take turns in claiming it?
Today, at 1:30 P.M., I sat at the patio table again. It was quiet. I called the songbirds. Each day I pair my own whistled birdsong with tidbits of food to encourage them to come. In about two minutes, I heard the raspy chitter and squeak of the titmouse and chickadee. They sounded excited to find peanuts.

Making sense of the gulf between young men and women

From The Economist:

Men and women have different experiences, so you would expect them to have different worldviews. Nonetheless, the growing gulf between young men and women in developed countries is striking. Polling data from 20 such countries shows that, whereas two decades ago there was little difference between the share of men and women aged 18-29 who described themselves as liberal rather than conservative, the gap has grown to 25 percentage points. Young men also seem more anti-feminist than older men, bucking the trend for each generation to be more liberal than its predecessor. Polls from 27 European countries found that men under 30 were more likely than those over 65 to agree that “advancing women’s and girls’ rights has gone too far because it threatens men’s and boys’ opportunities”. Similar results can be found in Britain, South Korea and China. Young women were likely to believe the opposite.

Unpicking what is going on is not simple. A good place to start is to note that young women are soaring ahead of their male peers academically. In the European Union fully 46% of them earn degrees, versus 35% of young men, a gap that has doubled since 2002. One consequence is that young women are more likely than men to spend their early adulthood in a cocoon of campus liberalism. Meanwhile, boys outnumber girls at the bottom end of the scholastic scale. Across rich countries, 28% of them fail to learn to read to a basic level. That is true of only 18% of girls.

Another big change is that, to varying degrees across the developed world, immense progress has been made in reducing the barriers to women having successful careers. College-educated men are still thriving, too—often as one half of a double-high-income heterosexual couple. Many men welcome these advances and argue for more. However, those among their less-educated brothers who are struggling in the workplace and the dating market are more likely to be resentful, and to blame women for their loss of relative status. And young women, by and large, are glad of past progress but are keenly aware that real threats and unfairness remain, from male violence to the difficulty of juggling careers and children. In short, most young women and worryingly large numbers of young men complain that society is biased against their own sex.

. . . .

There is no easy solution to any of this. But clearly, more should be done to help boys lagging behind at school to do better. Some policies that might work without harming their female classmates include hiring more male teachers (who are exceptionally scarce at primary schools in rich countries), and allowing boys to start school a year later than girls, to reflect the fact that they mature later.

Link to the rest at The Economist

Harvard Probe Finds Honesty Researcher Engaged in Scientific Misconduct

From The Wall Street Journal:

A Harvard University probe into prominent researcher Francesca Gino found that her work contained manipulated data and recommended that she be fired, according to a voluminous court filing that offers a rare behind-the-scenes look at research misconduct investigations.

It is a key document at the center of a continuing legal fight involving Gino, a behavioral scientist who in August sued the university and a trio of data bloggers for $25 million.

The case has captivated researchers and the public alike as Gino, known for her research into the reasons people lie and cheat, has defended herself against allegations that her work contains falsified data.

The investigative report had remained secret until this week, when the judge in the case granted Harvard’s request to file the document, with some personal details redacted, as an exhibit.

The investigative committee that produced the nearly 1,300-page document included three Harvard Business School professors tapped by HBS dean Srikant Datar to examine accusations about Gino’s work.

They concluded after a monthslong probe conducted in 2022 and 2023 that Gino “engaged in multiple instances of research misconduct” in the four papers they examined. They recommended that the university audit Gino’s other experimental work, request retractions of three of the papers (the fourth had already been retracted at the time they reviewed it), and place Gino on unpaid leave while taking steps to terminate her employment.

“The Investigation Committee believes that the severity of the research misconduct that Professor Gino has committed calls for appropriately severe institutional action,” the report states.

HBS declined to comment.

The investigative report offers a rare look at the ins and outs of a research misconduct investigation, a process whose documents and conclusions are often kept secret.

Dorothy Bishop, a psychologist at the University of Oxford whose work has drawn attention to research problems in psychology, praised the disclosure. “Along with many other scientists, I have been concerned that institutions are generally very weak at handling investigations of misconduct and they tend to brush things under the carpet,” Bishop said. “It is refreshing to see such full and open reporting in this case.”

Harvard started looking into Gino’s work in October 2021 after a group of behavioral scientists who write about statistical methods on their blog Data Colada complained to the university. They had analyzed four papers co-written by Gino and said data in them appeared falsified. 

. . . .

Academic Misconduct

An initial inquiry conducted by two HBS faculty included an examination of the data sets from Gino’s computers and records, and her written responses to the allegations. The faculty members concluded that a full investigation was warranted, and Datar agreed.

In the course of the full investigation, the two faculty who ran the initial inquiry plus a third HBS faculty member interviewed Gino and witnesses who worked with her or co-wrote the papers. They gathered documents including data files, correspondence and various drafts of the submitted manuscripts. And they commissioned an outside firm to conduct a forensic analysis of the data files.

The committee concluded that in the various studies, Gino edited observations in ways that made the results fit hypotheses.

When asked by the committee about work culture at the lab, several witnesses said they didn’t feel pressured to obtain results. “I never had any indication that she was pressuring people to get results. And she never pressured me to get results,” one witness said.

According to the documents, Gino suggested that most of the problems highlighted in her work could have been the result of honest error, made by herself or research assistants who frequently worked on the data. The investigative committee rejected that explanation because Gino didn’t give evidence that explained “major anomalies and discrepancies.”

Gino also argued that other people might have tampered with her data, possibly with “malicious intent,” but the investigative committee also rejected that possibility. “Although we acknowledge that the theory of a malicious actor might be remotely possible, we do not find it plausible,” the group wrote.

Link to the rest at The Wall Street Journal

American Flannel

From The Wall Street Journal:

Many years ago, during a reporting trip to Copenhagen, I met an economist for the Danish labor federation. I had just visited a company that was transferring production of hearing aids abroad, and I asked him about this move. To my surprise, the economist was entirely in favor. “We want to be a wealthy country,” he said, “and we can’t be a wealthy country with low-wage jobs.”

That conversation came back to me as I read a pair of books about people committed to reviving U.S. garment manufacturing. Both Steven Kurutz’s “American Flannel” and Rachel Slade’s “Making It in America” follow entrepreneurs who have dared to produce American-made apparel at a moment when the domestic supply chains for such products barely exist. Both books are interesting to read. The human stories are moving, the founders’ determination admirable. But neither book finally provides a convincing answer to a question that lurks in the background: Should we even want apparel manufacturing to rebound in the U.S.?

Mr. Kurutz introduces Bayard Winthrop, a descendant of the first governor of the Massachusetts Bay Colony. He grew up in Greenwich, Conn., and spent two years on Wall Street before deciding to seek more meaningful work. After stints selling snowshoe bindings and marketing messenger bags, he created American Giant in 2011 to manufacture clothing in the U.S. Labor costs precluded products that would have required extensive sewing. Instead, Mr. Winthrop settled on cotton hoodies with heavier fabric and better construction than most others. A favorable news article brought a flood of orders, enabling American Giant to buy sewing plants and inspiring Mr. Winthrop to attempt a far more complicated product: the flannel shirts to which the title of Mr. Kurutz’s book alludes.

For guidance, Mr. Winthrop turned to James McKinnon, the head of Cotswold Industries, a family-owned textile company. “Flannel is an art and an art form,” Mr. McKinnon explains. He introduced Mr. Winthrop to small companies that had survived the textile industry’s contraction and that might have the skills and equipment to dye yarn and finish cotton fabric the way Mr. Winthrop wanted it. Mr. Kurutz’s on-the-scene reporting provides a ground-level view of what it means to reassemble a domestic supply chain for flannel, colorfully illustrating why “reshoring” is so complicated a task.

Mr. Kurutz’s other main character is Gina Locklear, who grew up around her family’s sock mill in Fort Payne in Alabama’s DeKalb County. At the start of the 1990s, the author tells us, “there was hardly a man or woman in all of DeKalb County who didn’t have a connection to the hosiery business.” But as imports gained ground, the self-proclaimed “sock capital of the world” lost its kick. Many mills closed. In 2008 Ms. Locklear used spare capacity in her parents’ factory to start Zkano, a company that knits high-fashion socks from organic cotton spun and dyed domestically. The socks sold well. The challenge, she found, was finding workers. As Mr. Kurutz writes: “You could create new business. You couldn’t invent someone with thirty years of experience in the hosiery industry.”

Rachel Slade presents a similar story in “Making It in America,” featuring Ben Waxman, a former union official and political consultant who decided in 2015 to start a business with his then-girlfriend (and now wife), Whitney Reynolds. They mortgaged their home in Maine to fund American Roots. “Together, they would bring apparel manufacturing back to America,” Ms. Slade writes. “They would be uncompromising in their commitment to domestic sourcing and the welfare of their employees.” They began with a hoodie designed to fit large men doing physical jobs in harsh weather, a product whose manufacture required 54 operations on six different kinds of sewing machines. At $80, it was too expensive for the retail market, but Mr. Waxman sold it to labor unions that were delighted to offer members a garment made by union workers in the U.S.

I have no criticism of the individuals Mr. Kurutz and Ms. Slade profile. If these entrepreneurs can make a profit and create jobs by producing clothing in the U.S., congratulations are in order. They are well-intentioned people committed to doing right, and it’s hard not to admire them. But it’s disingenuous to pretend that a handful of mom-and-pop companies sewing hoodies and socks point the way to the revival of manufacturing in the U.S. Apparel is fundamentally different from most other manufacturing sectors; its factories still rely heavily on sewing machines, usually operated by women stitching hem after hem or attaching collar after collar. The U.S. has no great productivity advantage when it comes to making hoodies.

The authors present gauzy portraits of the industrial past. Fort Payne, Mr. Kurutz writes, was “a Silicon Valley for socks” where the annual Hosiery Week “bonded the community together” before the U.S. government lowered barriers to sock imports. That’s not exactly true: Silicon Valley is notorious for ample paychecks, Fort Payne’s sock mills less so. In 1993, the year before apparel and textile makers received what Mr. Kurutz calls a “death blow” from the North American Free Trade Agreement, the 5,478 hosiery workers in DeKalb County were paid, on average, 11% less than the average private-sector employee in the same county. In North Carolina, where American Giant does its sewing, the gap was even wider: The average private-sector worker was paid $22,000 in 1993, the average apparel worker $14,649. People took jobs in garment factories not for bonding but because that was all they could find.

The American garment industry was in decline for decades before Nafta. That it survived as long as it did was due to strict import quotas and high tariffs on both textiles and apparel, fought for by an unholy alliance between garment-workers’ unions and virulently antiunion textile manufacturers. The result was that U.S. consumers were forced to pay high prices for linens and clothes to keep those industries alive.

Link to the rest at The Wall Street Journal

A very long time ago, PG practiced law in a small town located in a part of the United States not known for its wealth.

One of his clients was a small garment manufacturing business owned and operated by a husband and wife. As was typical for the industry, the business employed mostly women and paid minimum wage or something close to it.

The jobs offered provided important income for families, including single mothers, but nobody expected to get rich working there. Had the garment manufacturer closed its doors, the result would likely have been an increased number of families relying on welfare payments for their survival.

In some circles, low-wage jobs are regarded with disdain, but they can play an important economic role in small communities as a first job or an emergency job in the event of a change in family structure or injury to a working member of a family.

In PG’s observations during this time period, while government welfare was sometimes necessary for families with no ability to support themselves, having a job-holder in the family made a positive overall improvement to the family’s long-term stability and happiness.

Classical Culture and White Nationalism

From BookBrowse:

The hands of history have reshaped the Greek past for centuries, sculpting it into an idealized version credited with birthing a myriad of ideas and concepts, notably identity. Certain contemporary political currents claim that Hellenic identity was what we would today consider white, although Greece was a multiethnic society that did not have our modern concepts of race.

Groups promoting racist ideology have pushed the interpretation that the apparent lack of color and ornamentation in Greco-Roman classical sculpture, which is in fact due to the erosion of pigments over time, is indicative of a more advanced and sophisticated culture resulting from the supposed superiority of white Europeans. As Lauren Markham writes in A Map of Future Ruins, “classical iconography continues to be a touchstone of white supremacy today, building off the myth that ancient Greece is the taproot of so-called Western culture.”

. . . .

The former president has also drawn on classical imagery. In 2020, a draft of an executive order titled “Make Federal Buildings Beautiful Again” was leaked. It sought to establish neoclassical architecture as the preferred style for federal buildings. The draft argues that in designing Washington D.C. buildings, the founding fathers embraced the classical models of “democratic Athens” and “republican Rome” because they symbolized “self-governing ideals.”

. . . .

Classicists are pushing against this misuse of antiquity’s ideas and symbols. According to Curtis Dozier, founder of Pharos and assistant professor of Greek and Roman studies at Vassar College, these views “all depend on the widespread assumption that Greco-Roman antiquity is admirable, foundational and refined. Thus any presentation that promotes uncritical admiration for the ancient world, by presenting it as a source of ‘timeless’ models and wisdom, has the potential to be complicit in white supremacy.”

As classics and black world studies professor Denise McCoskey explains, the “idea that the Greeks and Romans identified as ‘white’ with other people in modern-day Europe is just a complete misreading of ancient evidence.”

Link to the rest at BookBrowse

PG reminds one and all that he doesn’t necessarily agree with items he posts on TPV.

He also asks that any comments to this or other posts avoid any disparagement of anyone on the basis of race or any other characteristic that is inherent to an individual’s personhood.

Trust Your Intuition: The Writer’s Sixth Sense

From Writers in the Storm:

Just as martial artists trust their instincts, writers must trust their intuition. If something doesn’t feel right, don’t dismiss it. Whether it’s a subtle discomfort or a gut feeling, your intuition is a valuable tool for detecting potential threats. Trust it and take appropriate action to protect your safety.

Writing in public places, such as coffee shops and libraries, offers a unique blend of inspiration and potential challenges. As both a martial artist and an author, I find that combining creativity with personal safety comes naturally. However, for others, safety may not be a major consideration. Drawing from my experience as a black belt and self-defense seminar instructor, I offer these tips for writers to balance safety and creativity.

The Art of Location Selection: Choose Well-Lit and Crowded Spots

Just as a martial artist assesses their environment for safety, writers should be discerning about their chosen writing spaces. Select well-lit and populated areas where the flow of people ensures a reasonable level of security. Avoid secluded corners or dimly lit spots that might pose safety risks. Your writing sanctuary should inspire creativity without compromising your well-being.

Strategic Positioning: Sit Facing Entrances for Enhanced Awareness

In martial arts, practitioners learn the significance of positioning themselves for optimal defense. Similarly, when writing in public places, sit facing entrances and exits. This strategic placement not only allows for a clear view of your surroundings, but also enhances situational awareness. Observing who enters and exits establishes a mental map of the immediate environment, helping you to focus on your writing without neglecting your safety.

Engage and Disengage: Knowing When to Look Up

Immersing yourself in your writing is crucial, but so is periodically disengaging to assess your surroundings. Establish a rhythm—write for a set period, then take a moment to look up and scan your environment. It’s a dance between creativity and vigilance, ensuring you remain connected to both your work and the world around you. Designate breaks in your writing session to focus solely on your surroundings. Use these moments to reorient yourself and ensure your safety protocols are intact.

Make a habit of being mindful of those around you and any unusual behavior. Trust your instincts—if something feels off, it probably is. Being mindful of your surroundings helps protect your creative flow from unexpected disruptions.

Guarding the Arsenal: Keep Valuables Secure

Martial artists safeguard their weapons, and for writers, the laptop or tablet is a formidable tool. Be mindful of your belongings—keep your laptop, bags, and personal items within reach. Avoid leaving them unattended, as distraction can provide an opportunity for opportunistic individuals. By maintaining control over your possessions, you safeguard both your creative work and personal safety.

Digital Fortifications: Use Lock Screen Features and VPNs

Just as martial artists fortify their defenses, writers should fortify their digital presence. Enable lock screen features on your devices to protect your work and personal information. Use strong passwords or biometric authentication for an added layer of security. When working on public Wi-Fi, avoid accessing sensitive financial or personal information. Consider using a virtual private network (VPN) for added security, ensuring that your digital activities remain shielded from potential threats.

Strategic Alliances: The Buddy System for Writers

In martial arts, strength often lies in alliances. Likewise, writers can benefit from the buddy system. If possible, work with a writing partner or a friend when venturing into public spaces. Having someone by your side not only deters potential threats but also provides a safety net, allowing you to immerse yourself in your writing without undue worry.

. . . .

Link to the rest at Writers in the Storm

The corny over-extension of martial arts wisdom would normally have caused PG to pass the OP by.

However, he was concerned about authors, perhaps mostly female authors, having problems finding safe public spaces in which to write.

In ancient college times, PG would often walk to the library to write if things were a little chaotic around the apartment he shared, checking only on how much longer the library would remain open. He relied on library staff to kick him out at closing time. He would then walk home, without a second thought, seldom seeing anyone else on his way.

Thinking back, he almost never saw any female students during his close-the-library trips. To be fair, the campus was regarded as quite safe, even after dark.

But in those ancient times, a gentleman or would-be gentleman or guy who didn’t want to be excoriated by persons of both genders (there were only two back then), would always walk his date to the door of her dormitory, sorority, apartment, etc., and wait until the door locked behind her before leaving, regardless of what substances he had taken into his body during the preceding hours.

Why it’s hard to write a good book about the tech world

From The Economist:

When people ask Michael Moritz, a former journalist and prominent tech investor, what book they should read to understand Silicon Valley, he always recommends two. “They are not about Silicon Valley, but they have everything to do with Silicon Valley,” he says.

One is “The Studio” (1969) by John Gregory Dunne, an American writer who spent a year inside 20th Century Fox watching films get made and executives try to balance creativity with profit-seeking. The other, “Swimming Across” (2001) by Andy Grove, the former boss of Intel, a chipmaker, is a memoir about surviving the Holocaust. It shows how adversity can engender grit, which every entrepreneur needs.

That Sir Michael does not suggest a book squarely about the tech business says a lot. Silicon Valley has produced some of the world’s most gargantuan companies, but it has not inspired many written accounts with a long shelf life. Wall Street, on the other hand, claims a small canon that has stood the test of time, from chronicles of meltdowns (“Too Big to Fail”), to corporate greed (“Barbarians at the Gate”) to a fictionalised account (“The Bonfire of the Vanities”) that popularised the term “masters of the universe”.

Why not the masters of Silicon Valley? Part of the problem is access, as is often the case when writing about the powerful. Tech executives may let their guards down at Burning Man, but they have been painstakingly trained by public-relations staff to not get burned by writers. This has been the case for a while. When John Battelle was writing “The Search” (2005), about online quests for information, he spent over a year asking to interview Google’s co-founder, Larry Page. The firm tried to impose conditions, such as the right to read the manuscript in advance and add a footnote and possible rebuttal to every mention of Google. He declined. Google ended up granting the interview anyway.

Journalists who manage to finagle access can feel they owe a company and its executives and, in turn, write meek and sympathetic accounts rather than penetrating prose. Or they cannot break in—or do not even try—and write their book from a distance, without an insider’s insights.

Two new books demonstrate how hard it is to write well about Silicon Valley. “Filterworld” is an outsider’s account of the Valley’s impact, which reads as if it were entirely reported and written in a coffee shop in Brooklyn. The book laments how “culture is stuck and plagued by sameness” and blames Silicon Valley’s algorithms, “the technological spectre haunting our own era of the early 21st century”.

This is the sort of tirade against tech that has spread as widely as Silicon Valley’s apps. It is not wrong, but nor is it insightful. The author, Kyle Chayka, who is a journalist for the New Yorker, never reconciles the tension between the cultural “sameness” he decries and the personalisation everyone experiences, with online users possessing individual feeds and living in separate informational bubbles. Nor is this a wholly new phenomenon. People have been complaining about globalisation eroding local culture since “recorded civilisation” began, the author concedes. In 1890 Gabriel Tarde, a French sociologist, lamented the “persistent sameness in hotel fare and service, in household furniture, in clothes and jewellery, in theatrical notices and in the volumes in shop windows” that spread with the passenger train.

“Burn Book” is a better, though imperfect, read. Kara Swisher, a veteran chronicler of Silicon Valley, is both an insider and an outsider. She has attended baby showers for tech billionaires’ offspring, and even hosted Google’s top brass for a sleepover at her mother’s apartment. But she has a distaste for the Valley’s “look-at-me narcissists, who never met an idea that they did not try to take credit for”.

In delicious detail, she offers her verdict on the techies who have become household names, such as Facebook’s founder: “As sweat poured down Mark Zuckerberg’s pasty and rounded face, I wondered if he was going to keel over right there at my feet.” (That was in 2010, before he had gone through media-training galore.) Much as Truman Capote, an American writer, was willing to skewer the socialite swans of New York, Ms Swisher delights in prodding some of her subjects to make readers smile and squirm, such as media mogul Rupert Murdoch (“Uncle Satan”) and Amazon’s Jeff Bezos (“a frenetic mongoose” with “a genuinely infectious maniacal laugh”).

Link to the rest at The Economist

Catastrophe Ethics

From The Wall Street Journal:

It’s getting harder and harder to be a decent person. You wake up hankering for a coffee. But hold on! Before you order one, better make sure that a fair-trade producer supplied the beans. And then: a drop of milk? Cows have such a huge carbon footprint. Almond milk? Growing almonds requires copious quantities of water. Soy? Don’t even think about it.

Time to walk to work. That’s how you’ve been getting there since learning, last week, that your electric car’s cobalt comes from mines that engage in unacceptable labor practices. Now you’ve arrived but—can you believe it?—a co-worker just made a deplorable comment about the presidential campaign. Cut ties? A busy day of discussion ensues.

At last you’re home for the evening. Perhaps watch a comedian on your favorite streaming service? Not till you’ve checked whether he’s uttered something offensive in the past 15 years.

Modern life, Travis Rieder declares in “Catastrophe Ethics,” is “morally exhausting.” Everything we do “seems to matter,” he notes, and yet “nothing we do seems to matter.” The term “catastrophe” might seem to apply more to climate change than offensive comedians, but Mr. Rieder is speaking generally of collective problems that lie beyond the capacity of any of us to affect individually. They’re catastrophic in that they involve large social matters—the comedian, say, might be contributing to public prejudices by ridiculing a particular group—even though our own role in affecting them is vanishingly small. You’re not going to stop climate change on your own—you are, after all, one person in a global population of eight billion. Nor will the comedian you cancel even notice. What to do?

The great moral theories, Mr. Rieder tells us, are of little help. His prime target is utilitarianism, which holds that the right thing to do is whatever will maximize benefits and minimize costs for all concerned. Such counsel is useless, though, when our individual actions will neither yield any measurable benefit nor reduce any perceptible cost.

Other doctrines are explored as well. “Deontology” argues that we should not treat other human beings manipulatively, simply as means to our own ends. But its prohibitions seem better suited to acts like lying or promise-breaking, Mr. Rieder notes, than buying coffee or watching a comedian.

Then there is virtue ethics, which advises us to cultivate morally good character traits like temperance or moderation. Because the development of such traits takes place over time, though, it can’t really tell us whether our taking a joyride in our Hummer next Tuesday is right or wrong, since it’s unlikely to affect our character one way or another. Virtue ethics is not, Mr. Rieder concludes, particularly action-guiding.

Mr. Rieder, a bioethicist at Johns Hopkins, advises that the best course is simply to follow our own sense of personal “integrity,” an idea he derives from the philosopher Bernard Williams. For example, you might drink only fair-trade coffee because the proper treatment of workers is central to your sense of right and wrong, but you’re OK with listening to a comedian who offends particular groups. I, on the other hand, might cancel the comedian because his humor crosses some non-negotiable lines in my moral core, but I don’t get particularly worked up over where my coffee comes from.

We can’t, Mr. Rieder says, do everything. But we can be a person of integrity, as long as we “walk the walk” of our deepest values. There is a gentle wisdom here, reminiscent of the rabbinical saying: “You are not obliged to complete the work of the world, but neither are you free to desist from it.” Even so, Mr. Rieder might be too quick to dismiss utilitarianism and too sanguine about personal integrity.

Link to the rest at The Wall Street Journal

Tackling the TikTok Threat

From The Wall Street Journal:

The House on Wednesday is expected to vote on a bill that would give popular social-media app TikTok an ultimatum: Break up with the Chinese Communist Party (CCP), or break up with the U.S. It didn’t need to come to this, but Beijing and TikTok’s Chinese owner, ByteDance, left Washington with no choice.

Congress has spent years debating how to mitigate the national-security risks of TikTok’s Chinese ownership that have grown with the site’s popularity. About 150 million Americans use TikTok, and the app is a top source of news and search for Generation Z.

Donald Trump tried in 2020 to force ByteDance to divest TikTok, but his executive order was blocked in court, partly because the President lacked clear authority from Congress. Legislation by Wisconsin Republican Mike Gallagher and Illinois Democrat Raja Krishnamoorthi aims to overcome the legal obstacles.

Their bill would ban TikTok from app stores and web-hosting services in the U.S. unless the app is divested from ByteDance. It also establishes a process by which the President can prohibit other social-media apps that are “controlled by a foreign adversary.” The bill is narrowly tailored while giving the President tools to combat future threats.

Banning TikTok should be a last resort, but ByteDance and Beijing have demonstrated that they can’t be trusted. Reams of evidence show how the Chinese government can use the platform for cyber-espionage and political influence campaigns in the U.S.

Numerous reports have found that posts about Uyghur forced labor in Xinjiang province, the Tiananmen Square massacre, Hong Kong protests, Tibet and other politically sensitive content in China are suppressed on TikTok. A December study by the Network Contagion Research Institute found significant disparities between hashtags on Instagram and TikTok. The site also appears to amplify content that sows discord and ignorance in America. Pro-Hamas videos trend more than pro-Israel ones. Videos promoting Osama bin Laden’s 2002 “letter to America” went viral on TikTok last autumn.

How has TikTok responded to allegations that its algorithms are controlled by the Chinese government? In January it restricted researcher access to its hashtag data to make it harder to study. “Some individuals and organizations have misused the Center’s search function to draw inaccurate conclusions, so we are changing some of the features to ensure it is used for its intended purpose,” a TikTok spokesperson said.

Yet TikTok can’t explain why posts that are divisive in America go viral, while those that are sensitive for the CCP get few views. TikTok tried to ameliorate concerns about CCP wizards behind the screen with its Project Texas, which houses American user data on Oracle servers and gives the U.S. software company access to its algorithms.

But TikTok’s algorithms are still controlled by ByteDance engineers in China. The Journal reported in January that TikTok executives have said internally that they sometimes need to share protected U.S. data with ByteDance to train the algorithms and keep problematic content off the site. Like protests for democracy in Hong Kong?

TikTok’s other major security risk is cyber-espionage. The app vacuums up sensitive American user information, including searches, browsing histories and locations. This data can and does flow back to China. “Everything is seen in China,” a TikTok official said in a leaked internal recording reported by BuzzFeed.

ByteDance employees tried to uncover internal leakers by spying on American journalists. After this surveillance was reported, ByteDance blamed “misconduct of certain individuals” who were no longer employed. But there’s nothing to stop CCP puppets in ByteDance back-offices from spying on Americans.

Meta ignited a firestorm several years ago when it was found to have given British consulting firm Cambridge Analytica access to user personal data. Political campaigns used the data to target ads. TikTok’s privacy risks and malign political influence are more disturbing since it answers to Beijing.

Xi Jinping has eviscerated any distinction between the government and private companies. ByteDance employs hundreds of employees who previously worked at state-owned media outlets. A former head of engineering in ByteDance’s U.S. offices has alleged that the Communist Party “had a special office or unit” in the company “sometimes referred to as the ‘Committee.’”

. . . .

In any case, the House bill doesn’t restrict First Amendment rights. It regulates national security. It also has ample precedent since U.S. law restricts foreign ownership of broadcast stations. The Committee on Foreign Investment in the United States forced the Chinese owners of Grindr, the gay dating app, to give up control of the company.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

PG thinks the WSJ opinion writers are going more than a bit overboard in their fears and, particularly, their “solution” to the problem they’ve overthought.

China is going to continue to engage in Communist behavior regardless of what happens with TikTok. If “everything is seen in China,” then China is spending a huge amount of human time looking at TikTok videos.

“TikTok can’t explain why posts that are divisive in America go viral.”

Posts that are divisive in America frequently go viral without any help from foreign nations. It’s a feature of democratic societies, not a bug.

People disagree on political issues face-to-face, by snail mail, by email, via newspaper editorial pages and, especially, online. Look at what was printed in old-fashioned newspapers run by William Randolph Hearst and Joseph Pulitzer when there were far fewer sources of news available to regular folk than is the case today.

When a case containing dismembered human remains surfaced in New York’s East River in June of 1897, the publisher of the New York Journal—a young, devil-may-care millionaire named William Randolph Hearst—decided that his newspaper would “scoop” the city’s police department by solving this heinous crime. Pulling out all the stops, Hearst launched more than a journalistic murder investigation; his newspaper’s active intervention in the city’s daily life, especially its underside, marked the birth of the Yellow Press.

Most notable among Hearst’s competitors was New York City’s The World, owned and managed by a European Jewish immigrant named Joseph Pulitzer. These two papers and others exploited the scandal, corruption, and crime among the city’s most influential citizens, and its most desperate inhabitants.

PG claims no deep expertise about what goes on in the uncountable number of online discussion groups, but suspects there are copious numbers of people from all sorts of places who subscribe to the belief that Joe Biden is being controlled by alien invaders, just like Donald Trump was before him.

How Pseudo-Intellectualism Ruined Journalism

From Persuasion:

I was sitting across from the professor as she went over my latest piece. This was 1986, Columbia School of Journalism, Reporting and Writing I, the program’s core course. At one point, in response to what I don’t recall, I said, “That doesn’t bode well for me.” I could have been referring to a lot of things; there were so many, in my time in journalism school, that did not bode well for me. One was the next set of words that came out of her mouth. “‘Bode?’” she said. “I haven’t heard anyone bode anything in a long time.” Another was her comment, on a previous piece, about my use of “agglomerate.” She had circled it and written, “No such word.”

But the most important was the intellectual climate of the school as a whole, in that it did not have one. We were not there to think. We were there to learn a set of skills. One of them, ironically, was asking questions, just not about the profession itself: its premises, its procedures, its canon of ethics. I know, because from time to time I tried, and it didn’t go well. This was trade school, not liberal arts school. When a teacher said something, you were supposed to write it down, not argue.

The main thing that I learned in journalism school was that I didn’t belong in journalism school. The other thing I learned was that journalists were deeply anti-intellectual. They were suspicious of ideas; they regarded theories as pretentious; they recoiled at big words (or had never heard of them). For a long time, I had contempt for the profession on that score. In recent years, though, this has yielded to a measure of respect. For notice that I didn’t say that journalists are anti-intellectual. I said they were. Now they’re something else: pseudo-intellectual. And that is much worse.

The shift reflects the transformation of journalists’ social position. This phenomenon is familiar. Journalism used to be a working-class profession. I think of Jimmy Breslin and Pete Hamill, icons of the New York tabloids, the working people’s papers, in the second half of the twentieth century. Breslin’s father was a drunk and a piano player who went out for rolls one day and never came back. His mother was a teacher and civil servant. Hamill’s father was a grocery clerk and factory worker; his mother, a domestic, a nurse’s aide, a cashier. Breslin went to Long Island University but dropped out after two years. Hamill left school at fifteen to apprentice as a sheet metal worker, enlisted in the Navy, and took some art school classes later on (he hoped to be a cartoonist). Both were Irish Catholic: Hamill from Brooklyn, Breslin from Queens, long before those boroughs were discovered by the hipsters and the condo creeps.

Coming up working-class, you develop a certain relationship to facticity. Your parents work with their hands, with things, or on intimate, sometimes bodily terms with other people. Your environment is raw and rough—asphalt, plain talk, stains and smells—not cushioned and sweetened. You imbibe a respect for the concrete, the tangible, that which can be known through direct experience, and a corresponding contempt for euphemism and cant. You develop a gut and a bullshit detector, acquire a suspicion of experts who operate at a remove from reality, which means academics in particular. Hence the recognition, in figures like Breslin and Hamill, that the world is chaotic, full of paradox, that people evade our understanding. Hence their sense of curiosity and irony and wonder. At the source of their moral commitments, they had not rules but instincts, a feeling for the difference between right and wrong. For the masses, they felt not pity but solidarity, since they were of them.

That was the profession’s ethos—skeptical, demotic—and you didn’t have to grow up working class (or be a man) to absorb it. Molly Ivins, Nora Ephron, Cokie Roberts, Maureen Dowd, Mara Liasson, even Joan Didion and Janet Malcolm, in their own ways: all had it or have it. But none of them was born more recently than 1955. In the last few decades, journalists have turned into a very different kind of animal. “Now we’re not only a college-dominated profession,” wrote David Brooks not long ago, citing a study that found that more than half of writers at The New York Times attended one of the 29 most selective institutions in the country; “we’re an elite-college-dominated profession.”

. . . .

A couple of years ago, writing in The Chronicle of Higher Education, an Ivy League professor said the quiet part out loud. “Not all of our students will be original thinkers,” she wrote, “nor should they all be. A world of original thinkers, all thinking wholly inimitable thoughts, could never get anything done. For that we need unoriginal thinkers, hordes of them, cloning ideas by the score and broadcasting them to every corner of our virtual world. What better device for idea-cloning than the cliché?” She meant academic clichés, having mentioned “performativity,” “normativity,” “genderqueer,” and others. “[W]e should instead strive to send our students forth—and ourselves too—armed with clichés for political change.”

And that’s exactly what has happened, nowhere more so than in journalism. The progressive academic ideology has become the intellectual framework of the field, or at least of its most visible and influential parts: The New York Times, The Washington Post, NPR, et al. More to the point, the field now has an intellectual framework, one that journalists seek, top-down, to impose on the world, on the stories they report. The practice travels under the name of “moral clarity”—as if moral clarity were anything, in this world, besides a very rare commodity (I would love to know what Didion thought of the concept), and as if the phrase meant anything other, in this context, than tailoring the evidence to fit one’s pre-existing beliefs. Facts are now subordinated to the “narrative,” a revealing word: first, because it comes from academia (it is one of those clichés); second, because it’s almost always misused, a particle of garbled theory cloned and memed (as the professor would have wanted). When journalists say “narrative,” they mean “idea.” And it is always an idea they’ve received from someone else.

They think they’re thinking, but they’re wrong. They think that thinking means applying ideas, in the sense that you’d apply an ointment. What it actually means is examining them, reworking them, without fear, without cease. They believe that they are skeptical. In fact, they’re alternately cynical and gullible: cynical toward the other side and gullible toward their own (that they see themselves as being on a side is part of the problem, of course). That is why they’re helpless before the assertions of like-minded activists and academics or of acceptably credentialed experts—incapable of challenging their premises or putting pressure on their arguments. For those who lie outside their mental world, who haven’t taken the courses and acquired the jargon, they feel not kinship but, depending on the former’s demographic category, condescension or contempt.

Few students, at any time, come out of college fully equipped to think. 

Link to the rest at Persuasion

Oppenheimer Couldn’t Run a Hamburger Stand. How Did He Run a Secret Lab?

From The Wall Street Journal:

When he was named the director of the Manhattan Project’s Los Alamos Laboratory, J. Robert Oppenheimer was an improbable choice for the most important job in America. 

At the time, he was a 38-year-old theoretical physicist who had never managed anything more than a dozen graduate students, much less an operation with the fate of the world at stake. Leslie Groves, the Army general who hired him, said he received “no support, only opposition” for his decision. One close friend who would later win a Nobel Prize called Oppenheimer “absolutely the most unlikely choice” to run a secret lab that would build the atomic bomb.  

“He couldn’t run a hamburger stand,” said another colleague. 

So how did he transform into one of the most effective and consequential leaders in history? 

This weekend, “Oppenheimer” is expected to dominate the Oscars. But even watching a three-hour movie from a painstakingly meticulous auteur like Christopher Nolan isn’t enough to understand what made Oppenheimer tick. If you really want to get inside his mind, you have to read two Pulitzer Prize-winning books, “American Prometheus” by Kai Bird and the late Martin J. Sherwin and “The Making of the Atomic Bomb” by Richard Rhodes. (PG notes that the Rhodes book is available with Kindle Unlimited.)

. . . .

Oppenheimer the recruiter

Before he could build the bomb, Oppenheimer had to build something else with the potential to blow up in his face: a team. 

Los Alamos hadn’t yet been selected as the site of his secret lab when Oppenheimer began hunting for talent. Once he identified scientists and decided to hire them, he did whatever it took to get them. When the physicist Richard Feynman turned him down because his wife was sick with tuberculosis, for example, Oppenheimer found a sanatorium close enough to Los Alamos that he could visit on the weekends. 

It’s a revealing story not because Feynman was a star but the opposite: The future Nobel winner was still merely a graduate student, “not anybody famous at all,” as he put it, and yet Oppenheimer still went above and beyond. 

. . . .

Oppenheimer the communicator

Once he got them, Oppenheimer knew how to get the best work out of his scientists. In his new book, Charles Duhigg writes about “supercommunicators,” people who are “capable of saying exactly the right thing, breaking through to almost anyone, figuring out how to connect in even the most unlikely circumstances.” Oppenheimer, as it turns out, was a supercommunicator. 

Others in Los Alamos were better physicists, chemists and engineers. But what he could do better than anybody there—and maybe better than anybody on the planet—was take scientists with different perspectives and bring them to a consensus. 

“He would stand at the back of the room and listen as everyone argued,” Bird said. “Then he would step forward at just the right moment, summarize the salient points that everyone had been making that were in common and point the way forward.” 

“He would walk in, quickly grasp what the problem was and almost always suggest some leads to a solution,” Rhodes said. 

. . . .

But what set him apart from the other geniuses at Los Alamos was his broad knowledge and breadth of interests, which allowed him to make connections across disciplines and see what others in the room couldn’t. They were specialists. He was a generalist. They were singularly focused on their narrow fields of research. He was curious about philosophy, literature, poetry and the Bhagavad Gita. “He was a good scientist precisely because he was also a humanist,” Bird says.

Groves was so impressed by Oppenheimer’s range of interests that he once declared: “Oppenheimer knows everything.” He also could explain everything he knew without condescending, another trait that distinguished him from other eminently qualified scientists who interviewed for the job. 

“He was able to speak in plain English,” Bird said. 

. . . .

Oppenheimer the collaborator

The scientists were willing to drop everything in their lives to work around the clock in the middle of nowhere. What they were not willing to do was wear a military uniform. 

Oppenheimer himself was so allergic to hierarchy that he objected to making a basic organizational chart. He was intense but informal, someone who commanded respect without demanding it, and the biggest difference between Oppenheimer and Army generals was how they believed teams should operate. 

The military relied on compartmentalization. He insisted on collaboration. 

By demanding a flatter structure, Oppenheimer might as well have asked the Army if everyone in Los Alamos could have a mullet. In fact, when Groves learned that Oppenheimer was in favor of instituting a weekly colloquium for hundreds of scientists, he tried to shut it down. Oppenheimer prevailed. He understood the value of gathering people from different parts of a project in the same place, encouraging them to discuss their work and combine their ideas.

“Very often a problem discussed in one of these meetings would intrigue a scientist in a completely different branch of the laboratory,” the physicist Hans Bethe once wrote, “and he would come up with unexpected solutions.”

The meetings also improved morale at Los Alamos, providing a weekly reminder that everyone on the Manhattan Project had a role to play. Oppenheimer was right to fight for their existence. 

“He won the loyalty of people inside the fence,” Bird says. “They could see that he was protecting them, allowing them to collaborate and talk freely, which was necessary to the success of the project.”

They worked six days a week, but Oppenheimer made sure they weren’t only working. On their off days, there was horseback riding, mountain climbing, skiing, hiking and some of the geekiest basketball games of all time. When a local theater group staged a performance of “Arsenic and Old Lace,” Oppenheimer brought the house down with his surprise cameo as a corpse. And he was especially famous for his parties, where Oppenheimer paired his deadly gin martinis with his favorite toast: “To the confusion of our enemies!” 

Link to the rest at The Wall Street Journal

Larry Summers on What Went Wrong on Campus

From Persuasion:

Larry Summers is an economist, the Charles W. Eliot University Professor and director of the Mossavar-Rahmani Center for Business and Government at Harvard Kennedy School, and a member of the board of directors of OpenAI. Summers is the former President of Harvard University, the former Secretary of the Treasury under Bill Clinton, and was a director of the National Economic Council under Barack Obama.

In this week’s conversation, Yascha Mounk and Larry Summers discuss how universities can re-commit to pursuing truth and protecting academic freedom, and how current economic indicators contrast with how many people actually experience the economy.

. . . .

Yascha Mounk: The last few months have been rather eventful at Harvard University. Tell us your view of what has happened and why it matters.

Larry Summers: It’s been a very difficult time. I think what universities do is as important as the work of any other institution in our society, in terms of training young people and preparing them for careers of leadership, and in terms of developing new ideas that set the tone for the cultural, the political, the policy debates that go forward.

Paul Samuelson famously said that if he could write the economics textbooks, he didn’t care who got to serve as finance minister. So I think what happens in universities is immensely important. And I think there is a widespread sense—and it is, I think, unfortunately, with considerable validity—that many of our leading universities have lost their way; that values that one associated as central to universities—excellence, truth, integrity, opportunity—have come to seem like secondary values relative to the pursuit of certain concepts of social justice, the veneration of certain concepts of identity, the primacy of feeling over analysis, and the elevation of subjective perspective. And that has led to clashes within universities and, more importantly, an enormous estrangement between universities and the broader society.

When the president of Harvard is a figure on a Saturday Night Live skit, when three presidents of universities combine to produce the most watched congressional hearing film clip in history, when applications to Harvard fall in a several-month period by more than they’ve ever fallen before, when alumni are widely repudiating their alma mater, when they’re the subject of as many legal investigations as the Boeing company, you have a real crisis in higher education. And I think it’s been a long time coming because of those changes in values that I was describing.

Mounk: Tell us a little bit more about the nature of the conflict here. What is the conception of the university that has historically guided it, and how is it that those values have changed over the last ten years?

Summers: I think the values that animated me to spend my life in universities were values of excellence in thought, in pursuit of truth. We’re never going to find some ultimate perfect truth, but through argument, analysis, discussion, and study we can get closer to truth. And a world that is better understood is a world that is made better. And I think, increasingly, all you have to do is read the rhetoric of commencement speeches. It’s no longer what we talk about. We talk about how we should have analysis, we should have discussion, but the result of that is that we will each have more respect for each other’s point of view, as if all points of view are equally good and there’s a kind of arbitrariness to a conception of truth. That’s a kind of return to pre-Enlightenment values and I think very much a step backward. I thought of the goal of the way universities manage themselves as being the creation of an ever larger circle of opportunity in support of as much merit and as much excellence as possible.

I spoke in my inaugural address about how, a century before, Harvard had been a place where New England gentlemen taught other New England gentlemen. And today it was so much better because it reached to every corner of the nation, every subgroup within the population, every part of the world. It did that as a vehicle for providing opportunity and excellence for those who could make the greatest contribution. But again, we’ve moved away from that to an idea of identity essentialism, the supposition that somehow the conditions of your birth determine your views on intellectual questions, whether it’s interpretations of quantum theory or Shakespeare. And so that, instead, our purpose is not to bring together the greatest minds, but is back to some idea around multiplicity of perspective with perspective being identified with identity. We used to venerate and celebrate excellence. Now, at Harvard, and Harvard is not atypical of leading universities, 70 to 75% of the grades are in the A-range. Why should the institutions that are most celebrating of excellence have only one grade for everyone in the top half of the class, but nine different grades that are applied to students in the lower half of the class? That is a step away from celebrating and venerating excellence.

We celebrate particular ideas in ways that are very problematic, and we are reluctant to come to judgment: What started all the controversy at Harvard, and it has many different strands, was on October 7, when 34 student groups at Harvard, speaking as a coalition of Harvard students, condemned Israel as being responsible for the Hamas attacks. Those statements were reported in places where literally billions of people read them. And based on some inexplicable theory, the Harvard administration and the Harvard corporation (the Trustees of the University) could not find it within themselves to disassociate the university from those comments. I have no doubt that if similar comments had been made of a racist variety, there would have been no delay in the strongest possible disassociation of the university. But because Israel demonization is the fashion in certain parts of the social justice-proclaiming left, there was a reluctance to reach any kind of judgment, even about the most morally problematic statements.

It is not that the university was slow to comment on George Floyd. It is not that the university was slow to comment when some within it wanted to host a “black mass.” It is not that the university has been slow when social scientists have wanted to speculate about group differences. So I think that this combination—the veneration of a particular concept of social justice, the active disrespect for excellence, the celebration of identity rather than the pursuit of opportunity, and the rejection of truth—has made these institutions problematic in the impact they have on those who pass through them and in whatever influence they have on the broader society, and has left them estranged from that society. And I think any kind of private institution has to find a social contract in which it can operate with the broader society. And the fact that the ways in which great universities have acted have so enabled the Elise Stefaniks, the Bill Ackmans, and the Christopher Rufos speaks to how dangerously they have been governed.

I come from a left of center tradition. And I’m not far left of center, but surely left of center. And I’ve always been acutely aware, in thinking of universities, that Ronald Reagan got his political start by condemning and running against what was happening at Berkeley in the mid-1960s. And that the tradition of then-Governor Brown—who had inaugurated this wonderful idea of free college education for anyone who had a B-average in a California high school—got completely blown away in a tide of fury about “welfare Cadillacs.” But what brought that tide to prominence was a general revulsion at what had been going on at Berkeley that Ronald Reagan rode to his political career. And so it seems to me that universities that fail to govern themselves effectively are at immense peril to themselves and to the broader progressive values that they hold.

Mounk: How dangerous do you think this moment is, not just to the reputation of universities but to their actual ability to function as core institutions of the United States? I’m a little torn on this. On the one hand, you can make the case that even the most affluent and insulated universities like Harvard need federal funding for the research that they undertake, and to finance a lot of the student loans that their undergraduates take out. On the other hand, Harvard has an endowment of, what, $50 billion? And it does continue to have real support in the population. Where would you place yourself on the worry scale about the worst-case scenarios here?

Summers: I think one would find for any Ivy League school that the federal government was ten times as large a donor, at least, as any other donor. And I think it’s fair to say that the universities have thumbed their nose at what is by far their largest donor. And they’re certainly not prepared to take that casual and cavalier attitude towards much smaller individual donors because of what they think the consequences would be. I think it’s fine to stand strongly against a set of people who in many ways are riding this horse, but wish the process of thought and wish academic freedom ill. The problem is not that Harvard has worked itself into a war with Elise Stefanik. The problem is that it got itself condemned from the White House press briefing room of the Biden administration, that it finds itself subject to investigation from the Department of Education of the Biden administration, that the attacks on it are coming in a bipartisan way.

I think one of the aspects of how this has happened is that while on the one hand we think of intellectual communities as being the most broad-minded of communities, on the other hand they are actually among the most narrow, insular and inward-looking in the way they evaluate themselves and in the way they think of the necessary decision making. There’s an old story about when Pat Moynihan had decided to leave the UN and called the Dean of Harvard to say he would be returning. He said he’d let the president know and the Dean of Harvard assumed he was referring to the President of Harvard rather than the President of the United States. And that bespeaks a kind of attitude that I think is very problematic. 

Link to the rest at Persuasion

Compulsively Trying to Please People Who Never Liked Me

From Electric Lit:

Having been her editor for a few years now, this is not the first time I’ve been asked to say something coherent about Jessi Jezewska Stevens’ fiction. The curious thing is that every time the question comes, I go rummaging through my critical cupboard in search of the right tools for getting at her work and find myself defaulting, again and again, to the bludgeon over the scalpel. Though no one’s work could be less a blunt object, there’s something about Stevens’ writing that tempts me toward the grand statement. It makes me want to issue unpleasant and no doubt blinkered generalizations about the State of Contemporary Fiction. I want to hold Stevens’ work up as an antidote to this or that literary malaise. My tone becomes oracular, apocalyptic, even when the register of the text in question—as in “A New Book of Grotesques” from her collection Ghost Pains—would seem to be anything but.

Partly to blame are her two novels, The Exhibition of Persephone Q and The Visitors. Each, in its way, tackles a doomsday: Persephone, the early 2000s tipping point between pre- and post-digital notions of personhood; The Visitors more literally, with the financial crisis precipitating counterfactual collapse in New York circa 2011. But there’s more to it than Armageddon by association.

If I were to tell you that Stevens often writes about quasi-narcissistic women enjoying lives of temporal comfort while suffering from a comedic inability to act with either decisiveness or effectiveness, you would, I suspect, yawn in my face. Stevens and everyone else, you might say. Yet her fiction perches upon and pries open little fissures in our prejudices about what fiction “ought” to be doing at this moment in history. She manages, with rigor and strangeness, to make the old wounds hurt, to make our shallowness feel dangerous again. Her stories remind me that there’s a great difference between a self-regarding writer whose project is the anatomizing of her own anomie and a writer whose project is to interrogate the anomie of self-regard. It’s the difference between diminishing our artform until it takes on the proportions and vocabulary of quotidian pettiness and approaching that pettiness with all the great and varied tools to which our artform has claim. Or maybe it’s just the difference between complaint and diagnosis.

“Can you believe the mistakes I was already making?” asks the glib narrator of “A New Book of Grotesques.”

Link to the rest at Electric Lit

The Dictator’s Best Friend

From The New Statesman:

“Writers under despots,” says Simon Ings, “may have to take instruction, but they’re rarely out of a job.” Every regime requires a story to validate it, and a regime lacking the authority of tradition needs one most urgently. Songs and slogans; heroes and martyrs; legends of the past, visions of the future: every would-be dictator makes use of them, and so every dictatorship seeks out literary celebrities who can inspire an uprising and then be flattered or coaxed or bought or terrified into celebrating the brave new world.

This book describes the relationships between four such authors and the political leaders whose causes – with varying degrees of willingness – they defined or promoted.

Ings begins with a failure. General Boulanger’s populist-militarist image, waving his hat from the back of his black horse while calling for a war of revenge against Prussia, proved compelling to a variety of interest groups in late-19th-century France. Maurice Barrès, admired nationalist and nostalgic author, put his talents at Boulanger’s service. But though Boulanger was charismatic personally, he was weak strategically. His programme (essentially “Make France Great Again”) was vacuous. “The less Boulanger said,” writes Ings, “the better he did.” When the moment came for him to seize power he drew back. Meanwhile, Barrès, as Ings candidly admits, was a “grouch”.

Ings makes the best of the poor material the pair offer him by beginning in 1900 with a panoptic view of the Exposition Universelle, which he describes with gusto. Barrès, who detested universalism, makes his way gloomily through it to attend a dinner given by Action Française – an organisation with which he shared a quasi-mystical reverence for patrie, rootedness and the heroic dead.

Boulanger took his own life but Barrès, his admirer, lived on, his patriotism mutating nastily into anti-Semitism. For all their shortcomings, the two of them provide a handy vehicle to carry Ings’s ideas about celebrity and its political uses, about “the religious instinct of crowds” and the power of popular sentiment.

. . . .

Ings’s next pairing is more dynamic. The author is Gabriele D’Annunzio – poet, playwright, serial seducer, aviator, war-mongering orator and, briefly, small-scale dictator. The large-scale dictator is Benito Mussolini.

D’Annunzio declared that when he temporarily laid aside “scribbling” for violent political action he was working in a new art form whose material was human lives. He seized the Croatian port city of Fiume (now Rijeka) in 1919, and made himself its “Duce”, using it as the setting for a 15-month-long piece of spectacular street-theatre. Parades, marching bands, anthems belted out by volunteers with piratical hair-dos and black uniforms – all in celebration of a greater Italy, of soldiers as sacrificial victims offered up on the altar of the patria, and of D’Annunzio himself. Mussolini took note. He later called D’Annunzio the “John the Baptist of Fascism”. As Ings writes, “All Mussolini’s ritual, symbolism, mystique and style can be tied back to D’Annunzio.”

The young Mussolini was a dogged and voracious autodidact. Ings praises his early journalistic essays as “deeply thought-out” and summarises the texts he was reading – by Georges Sorel, Gustave Le Bon, Roberto Michels. Lightly skimming where Mussolini dug deep, Ings gives his readers a concise round-up of the intellectual ground in which the 20th-century dictatorships took root. He has a talent for succinct statements so well turned that they immediately ring true. His summings-up are forceful. He can make sense of syndicalism (something many historians struggle to do) and explain how attractive it seemed to early-20th-century thinkers from each end of the political spectrum, and why it lent itself so conveniently to totalitarianism.

. . . .

His Russian “engineer of human souls” (the phrase is Stalin’s) is Maxim Gorky. Ings opens this section in comic mode with Gorky’s visit to New York in 1906. Mark Twain has laid on mass meetings and grand dinners in his honour. Americans who, as Ings tartly comments, “can’t tell one Russian revolutionary from another” are delighted to applaud him. When trouble comes it has nothing to do with politics: it arises from the clash between revolutionary/bohemian sexual mores and American prudishness. Gorky, travelling with the actress Maria Andreyeva, to whom he is not married, causes scandal. He may deliver stirring speeches about how the “black blood-soaked wings of death” hover over his fatherland. He can recite Poe’s “The Raven” in Russian, thrilling auditors with his “deep musical voice”. But the grandeur of his mission and his manner are repeatedly undercut by farce. The pair of sinful lovers are turned out of hotels, and passed from one host’s spare room to another “like unexploded ordnance”.

They move on to London, where they meet Lenin, in exile like them. Lenin checks their bed sheets for damp – “We need to take good care of you” – and congratulates Gorky on having written a “useful” book (the novel Mother, which Gorky himself considers “really bad” but which lends itself to use as propaganda). So begins an association between the party and its tame author that will last uneasily for 30 years. To the Bolsheviks, Gorky is an asset as unreliable as he is valuable. To Gorky, the Soviet Union is a paymaster that allows him to practise the fascinating work of “god-building” – creating a faith for those who had abolished God. But at the time of the October Revolution he wrote that what was coming was “a long bloody anarchy, and, after it, a not less bloody and dark reaction”. Some 20 years later, Romain Rolland, watching him being treated as a literary lion, thought that Gorky was more like a sad old performing bear.

Link to the rest at The New Statesman

A Memoirist Who Told Everything and Repented Nothing

From The New Yorker:

When she died at a hundred and one in January of 2019, Diana Athill had publicly chronicled both ends of her long life in a series of nine memoirs. The first of these, “Instead of a Letter,” was published in 1963 and recently rereleased in the U.S. as part of the NYRB Classics series; it recounts her jolly, upper-class English childhood on the family estate of Ditchingham, in Norfolk. The last book that she wrote, “Alive, Alive Oh!,” came together in her “darling little room” at the Mary Feilding Guild, in Highgate, London, a garden-set home for the elderly; it’s a high-spirited, recalcitrant account of “waiting to die” at ninety-six.

Athill was the sort of character who ought to have seen her obituaries before she went. First, because she would have bewitchingly written off any high praise—the New York Times noted “her luminous prose, gimlet social acuity and ability to convey a profound sense of place”—with her brand of droll humor. (She refused burial at the Highgate Cemetery because of the cost: “I think being dead is an expensive business.”) And, second, because she would have enjoyed the evidence of how much her reputation had grown; she’d worked behind the scenes for meagre wages and little adulation as one of the century’s great editors. In 1952, she became a co-founding director of the publishing house André Deutsch, and, until her retirement, in 1992, shepherded the likes of Philip Roth, John Updike, and Jean Rhys to publication. Athill wrote seven of her memoirs after leaving her nine-to-five, but, until that relatively late turn toward autobiographical mania, she knew her place. “We must always remember that we are only midwives—if we want praise for progeny we must give birth to our own,” she writes, in “Stet: An Editor’s Life.” We might not have known her had she not brought forth her own romping and exuberant litter.

Critics frequently used the terms “frank” or “candid” to describe Athill’s memoirs. But Athill doesn’t write as if no one is watching; she writes as if she’d never even imagined someone might watch, and therefore doesn’t have a scruple to hold on to. To describe honesty as her hallmark isn’t quite enough: that’s the least we can ask of our memoirists. What she is marvellous at is admitting, sans self-recrimination. In the early twenty-first century, the memoir has turned into a confessional, in a nearly religious sense. Writers go there to seek redemption, and to chart their evolution from naïve to knowing: no narrative is more marketable than metamorphosis. Athill doesn’t treat her foibles and losses—of love, of money, of caste, of certainty—as traumas, events that would define her life as troubled and scarring. Instead, she makes the case that being kicked out of Eden is good for the soul.

“I am glad that I have not inherited money or possessions,” Athill writes, striking a defiant note in “Instead of a Letter.” Inheritance was never her due, though as a child she once counted the bodies that stood between her and the palatial Ditchingham estate. “It appeared that at least twelve people, seven of them my contemporaries, would have to die before I would have a claim, and I hardly thought I ought to pray for this however much I would have liked to.” Ditchingham belonged to her mother’s parents, who offered it out as the extended family’s seasonal home, where they spent long summers and holidays throughout her early life. The thousand-acre estate with a twenty-bedroom, fully staffed house granted the family security in their Englishness, as members of an élite and unquestionable class. Athill stresses that the experience of growing up with such surety turned Ditchingham into a cocoon, a secure location from which to launch a life, but also a place she would inevitably leave. “There I used to be,” she opines, “as snug and as smug as anyone.” From an early age, she knew that adulthood would exist elsewhere.

Athill’s joy in Ditchingham, the children’s after-tea appearance in front of the grownups in the drawing room and the horsemen wandering across the fields, is the bright marrow of her writing: it suffuses her later life, and her prose, with bubbling, fresh oxygen. But, in “Instead of a Letter,” she writes as if she’s relieved that she got away from the estate and its inhabitants. “Like anyone else they had their charms,” she writes of her family, but “physically, intellectually, and morally, they were no more than middling.” Yet they thought themselves superior beings: “Smugness is too small a word for what it feels like from inside. From inside, it feels like moral and aesthetic rightness; from inside, it is people like me, who question it, who look stupid, ugly, and pitiful.”

Hence her happiness that she didn’t inherit: staying on at Ditchingham for a lifetime might have trapped her in the same small, closed life. Her childhood remained blissful to her as she aged because it lived on in her memory but didn’t define her future. “Never to have broken through its smothering folds would have been, I have always thought, extremely depressing,” she writes. “But on the other hand, not to have enjoyed a childhood wrapped warmly in those folds—that would be a sad loss.” Cousins were saddled with managing the finances of an upkeep-heavy country pile, whereas she, the oldest child of a fourth daughter, absorbed the bliss of the place but not the narrowness.

Ditchingham wasn’t the only inheritance that Athill would forgo. When Athill was thirteen, her mother told her that they’d “lost” their money, but what she meant was that they’d spent it all. “My parents felt they were living austerely because we ourselves looked after our ponies and they had not kept on their own hunters,” Athill writes, dryly. She recounts her mother telling her that “the really bloody thing about being poor is that if you leave something on the floor when you go out, you know that it will still be there when you get back.” Along with her two younger siblings, the family had been living in a well-staffed, six-bedroom house in Hertfordshire since her father had retired from the Army. Financially, they fell out a window but landed on a mattress—Athill’s grandparents rented them Manor Farm, a house on the estate, for cheap. A governess cost too much, so Athill was sent to Runton, a girls’ boarding school on the North Sea, and then up to St. Mary’s College at Oxford, in 1936.

When Athill was twenty-two, her future disintegrated again. She’d been engaged for two years when her fiancé, a Royal Air Force pilot named Tony Irvine, was deployed to Egypt. Then his letters suddenly stopped. She discovered in rapid succession that he’d married someone else while abroad and then been killed in action. “A long, flat unhappiness” set in, her sense of her own value collapsed, and her twenties were filled with broken-off relationships with incompatible men. “By the time I had reached my thirties,” she writes, toward the end of “Instead of a Letter,” “I was convinced that I lacked some vital quality necessary to inspire love.” At age ninety-nine, she explained in an interview, “there was a basic, underlying sense of failure—and it came from the very simple thing of having been brought up expecting to get married.”

“How did I get this way?” is one of memoir’s primary questions. Typical culprits are poverty or abandonment, sometimes a remarkable, indelible catastrophe. Cheryl Strayed’s mother died when Strayed, the author of “Wild,” was in college: she calls it her “genesis story.” Dani Shapiro, the author of five memoirs, starts her autobiographical path in “Slow Motion” with the story of her parents’ tragic car accident. Even Joan Didion reached new heights of cultural resonance with “The Year of Magical Thinking,” her memoir of the year following her husband’s death. The modern memoir is the proving ground for our national obsession with trauma, a place to gawk at whoever comes through the emotional meat grinder with the good sense and talent to finesse their damage into a redemption song.

Link to the rest at The New Yorker

Is everything you assumed about the Middle Ages wrong?

From The Economist:

“In public, your bottom should emit no secret winds past your thighs. It disgraces you if other people notice any of your smelly filth.” This useful bit of advice for young courtiers in the early 13th century appears in “The Book of the Civilised Man”, a poem by Daniel of Beccles. It is the first English guide to manners.

Ian Mortimer, a historian, argues that this and other popular works of advice that began appearing around the same time represent something important: a growing sense of social self-awareness, self-evaluation and self-control. Why then? Probably because of the revival in the 12th century of glass mirrors, which had disappeared from Europe after the fall of Rome. The mirror made it possible for men and women to see themselves as others did. It confirmed their individuality and inspired a greater sense of autonomy and potential. By 1500 mirrors were cheap, and their impact had spread through society.

Mr Mortimer sets out to show that the medieval period, from 1000 to 1600, is profoundly misunderstood. It was not a backward and unchanging time marked by violence, ignorance and superstition. Instead, huge steps in social and economic progress were made, and the foundations of the modern world were laid.

The misapprehension came about because people’s notion of progress is so bound up with scientific and technological developments that came later, particularly with the industrial and digital revolutions. The author recounts one claim he has heard: that a contemporary schoolchild (armed with her iPhone) knows more about the world than did the greatest scientist of the 16th century.

Never mind that astronomers such as Copernicus and Galileo knew much more about the stars than most children do today. Could a modern architect (without his computer) build a stone spire like Lincoln Cathedral’s, which is 160 metres (525 feet) tall and was completed by 1311? Between 1000 and 1300 the height of the London skyline quintupled, whereas between 1300 and the completion of the 72-storey Shard in 2010, it only doubled. Inventions, including gunpowder, the magnetic compass and the printing press, all found their way from China to transform war, navigation and literacy.

This led to many “expanding horizons” for Europeans. Travel was one. In the 11th century no European had any idea what lay to the east of Jerusalem or south of the Sahara. By 1600 there had been several circumnavigations of the globe.

Law and order was another frontier. Thanks to the arrival of paper from China in the 12th century and the advent of the printing press in the 1430s, document-creation and record-keeping, which are fundamental to administration, surged. Between 1000 and 1600 the number of words written and printed in England went from about 1m a year to around 100bn. In England, a centralised legal and criminal-justice system evolved rapidly from the 12th century. Violent deaths declined from around 23 per 100,000 in the 1300s to seven per 100,000 in the late 16th century.

Link to the rest at The Economist

Late Middle English

It was during the 14th century that a different dialect (known as the East-Midlands) began to develop around the London area.

Geoffrey Chaucer, a writer we have come to identify as the Father of English Literature[5] and author of the widely renowned Canterbury Tales, was often heralded as the greatest poet of that particular time. It was through his various works that the English language was more or less “approved” alongside French and Latin, though he continued to write some of his characters’ speech in the northern dialects.

It was during the mid-1400s that the Chancery English standard was brought about. The story goes that the clerks working for the Chancery in London were fluent in both French and Latin. It was their job to prepare official court documents, and prior to the 1430s both of those languages were mainly used by royalty, the church, and wealthy Britons. After this date, the clerks started using a dialect with written forms such as the following:

gaf (gave) not yaf (Chaucer’s East Midland dialect)
such not swich
theyre (their) not hir [6]

As you can see, the above is starting to look more like the present-day English language we know. These clerks held enormous influence over the language of official communication, which ultimately shaped the foundations of Early Modern English.

Early Modern English

The changes in the English language during this period occurred from the 15th to the mid-17th century, and signified not only changes in pronunciation, vocabulary, and grammar but also the start of the English Renaissance.

The English Renaissance has much quieter foundations than its pan-European cousin, the Italian Renaissance, and sprouted during the end of the 15th century. It was associated with the rebirth of societal and cultural movements, and while slow to gather steam during the initial phases, it reached its heights of glory during the Elizabethan Age.

It was William Caxton’s introduction of the printing press to England in 1476 that allowed Early Modern English to become mainstream, something we as English learners should be grateful for! The printing press was key in standardizing the English language, not least through the distribution of the English Bible.

Caxton’s publishing of Thomas Malory’s Le Morte d’Arthur (The Death of Arthur) is regarded as print material’s first bestseller. Malory’s retelling, in his own words, of various tales surrounding the legendary King Arthur and the Knights of the Round Table proved so popular that it indirectly ensured Early Modern English was here to stay.

It was during Henry VIII’s reign that English commoners were finally able to read the Bible in a language they understood, which, in its own way, helped spread the language of the common folk.

The end of the 16th century brought about the first complete English translation of the Catholic Bible, and though it didn’t make a marked impact, it played an important role in the continued development of the English language, especially among the English-speaking Catholic population worldwide.

The end of the 16th and start of the 17th century would see the writings of actor and playwright, William Shakespeare, take the world by storm.

Why was Shakespeare’s influence important during those times? Shakespeare started writing during a time when the English language was undergoing serious changes due to contact with other nations through war, colonisation, and the like. These changes were further cemented by Shakespeare and other emerging playwrights, who found their ideas could not be expressed through the English language then in circulation. Thus words and phrases “adopted” from other languages were modified and added to the English language, creating a richer experience for all concerned.

It was during the early 17th century that we saw the establishment of the first successful English colony in what was called the New World. Jamestown, Virginia, also saw the dawn of American English, with English colonizers adopting indigenous words and adding them to the language.

The constant influx of new blood due to voluntary and involuntary (i.e. slavery) migration during the 17th, 18th, and 19th centuries meant that a variety of English dialects sprang to life, shaped by West African, Native American, Spanish, and European influences.

Meanwhile, back home, the English Civil War, starting in the mid-17th century, brought with it political mayhem and social instability. At the same time, England’s puritanical streak had taken off after the execution of Charles I. Censorship was a given, and after the Parliamentarian victory in the war, Puritans promoted an austere lifestyle in reaction to what they viewed as the excesses of the previous regime[7]. England would undergo little more than a decade of Puritan leadership before the crowning of Charles II. His rule, effectively the return of the Stuart monarchy, brought about the Restoration period, which saw the rise of poetry, philosophical writing, and much more.

It was during this age that literary classics, like those of John Milton’s Paradise Lost, were published, and are considered relevant to this age!

Late Modern English

The Industrial Revolution and the Rise of the British Empire during the 18th, 19th and early 20th-century saw the expansion of the English language.

The advances and discoveries in science and technology during the Industrial Revolution saw a need for new words, phrases, and concepts to describe these ideas and inventions. Due to the nature of these works, scientists and scholars created words using Greek and Latin roots e.g. bacteria, histology, nuclear, biology. You may be shocked to read that these words were created but one can learn a multitude of new facts through English language courses as you are doing now!

Colonialism brought with it a double-edged sword. It can be said that the nations under the British Empire’s rule saw the introduction of the English language as a way for them to learn, engage, and hopefully, benefit from “overseas” influence. While scientific and technological discoveries were some of the benefits that could be shared, colonial Britain saw this as a way to not only teach their language but impart their culture and traditions upon societies they deemed as backward, especially those in Africa and Asia.

The idea may have backfired as the English language walked away with a large number of foreign words that have now become part and parcel of the English language e.g. shampoo, candy, cot and many others originated in India!

English in the 21st Century

If one endevours to study various English language courses taught today, we would find almost no immediate similarities between Modern English and Old English. English grammar has become exceedingly refined (even though smartphone messaging have made a mockery of the English language itself) where perfect living examples would be that of the current British Royal Family. This has given many an idea that speaking proper English is a touch snooty and high-handed. Before you scoff, think about what you have just read. The basic history and development of a language that literally spawned from the embers of wars fought between ferocious civilisations. Imagine everything that our descendants went through, their trials and tribulations, their willingness to give up everything in order to achieve freedom of speech and expression.

Getting a Grip on Gravity Aboard a Ship

From The Wall Street Journal:

There’s an endless stream of advice about things that will stop you sleeping well: bright screens before bedtime, too much caffeine, the wrong pillows and so on. Let me add one to the list. It’s almost impossible to get any decent sleep while you’re constantly sliding across the bedsheets down to the end of the bed, and then a few seconds later sliding back the other way.

This sounds extreme, but it’s been a regular feature of my life over the past month. I spent some of November and all of December on a scientific research ship in the middle of the Labrador Sea, as part of an international project to measure stormy seas and how they help the ocean breathe. It was a great opportunity from a scientific point of view, but it came with a built-in challenge—the middle of the ocean during a big storm is a very difficult place to work, because gravity becomes a fickle trickster.

Gravity is incredibly useful, most of all because it’s reliable. We put things on shelves knowing that they’re going to stay there. Faucets point downward because the water will come out and keep going down. But if you need a ship to go in specific directions for scientific reasons, it can’t sit in the most stable orientation amid the wind and the waves. The ship will roll much more than normal and gravity effectively becomes changeable.

You find yourself chasing the water around the shower cubicle because the direction that it’s falling relative to you keeps shifting. Every corridor has handrails that you need to hold onto as you zigzag along. The default assumption is that everything will move, so you have to strap everything down as soon as you move it and if you don’t, it certainly won’t be there when you get back. After one particularly violent night, I got down to my lab to find that a large, heavy instruction manual that I left on a shelf had been thrown onto the top of the printer 3 yards away.

But there is one exception to all this motion. There’s a type of nonslip rubbery mesh table mat that I’ve never seen outside a ship, and anything that’s placed on that will just stay put. The night that the manual took flight, I had left a laptop sitting on a grippy tabletop mat, not strapped down, and I really should have been punished for that mistake by finding the laptop smashed in a corner. But it hadn’t moved.

My savior was friction, and this type of mat is exceptional at creating it. Surfaces are all rough at tiny scales, and so when one object is placed on top of another they only really touch at a small number of tiny points. It’s like one mountain range being placed upside down on top of another one, so that only the biggest peaks actually poke into the other surface, and those points take all the weight.

But the soft rubber is easily squashed, increasing the contact area as the mat molds itself to the underside of the object on top. That makes it much harder for the upper object to slide. When there’s a sideways force, the top of the rubber can also move slightly sideways without breaking, absorbing the force within the mat and reducing the sideways push on the object. That all reduces the sensitivity of the object to the exact direction of gravity. As long as gravity is still mostly downward, the mat will stop things sliding around. This is a game-changer at sea, because “mostly downward” is as reliable as gravity gets.
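The "mostly downward" claim can be checked with a back-of-the-envelope model. Under simple Coulomb static friction, an object on a tilted surface stays put as long as tan θ ≤ μ, where θ is the effective tilt and μ the coefficient of static friction. The coefficients below are illustrative guesses, not measurements from the article:

```python
import math

def stays_put(tilt_deg: float, mu: float) -> bool:
    """Coulomb static friction: on a surface tilted by tilt_deg,
    the tangential pull is g*sin(theta) and the maximum static
    friction force is mu * g*cos(theta), so the object stays put
    while tan(theta) <= mu."""
    return math.tan(math.radians(tilt_deg)) <= mu

# Illustrative coefficients (guesses, not measured values):
shelf_mu = 0.3  # smooth laminate shelf
mat_mu = 1.5    # high-grip rubbery mesh mat

for roll in (10, 25, 40):  # ship roll angle, degrees
    print(f"{roll:>2} deg: shelf={stays_put(roll, shelf_mu)}, "
          f"mat={stays_put(roll, mat_mu)}")
```

With these guesses, a modest roll already sends things sliding off a bare shelf, while the grippy mat holds on through much steeper tilts, which matches the surviving laptop.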

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

English Is A Germanic Language

From Rosetta Stone:

German is widely considered among the easier languages for native English speakers to pick up. That’s because these languages are true linguistic siblings—originating from the exact same mother tongue. In fact, eighty of the hundred most used words in English are of Germanic origin. These most basic, common words in English and German derive from the same roots, making them amazingly similar. That’s why an English word like “friend” is “freund” in German. Plus, there are an incredible number of German and English words that aren’t simply related, but identical: arm, hand, kindergarten, angst, bitter, and many more.

Generally, if you’re an English speaker with no exposure to other languages, here are some of the most challenging: Mandarin Chinese, Arabic, Icelandic, Thai, Vietnamese, Finnish, Japanese, and Korean.

Link to the rest at Rosetta Stone

From Oxford International:

History of the English language

Charles Laurence Barber comments, “The loss and weakening of unstressed syllables at the ends of words destroyed many of the distinctive inflections of Old English.”

Similarly, John McWhorter points out that while the Norsemen and their English counterparts were able to comprehend one another in a manner of speaking, the Norsemen’s inability to pronounce the endings of various words ultimately resulted in the loss of inflectional endings.

Many of you may be forgiven for thinking that studying an English language course consists of little more than English grammar. While English grammar does play a part when taking courses to improve your English overall, it is but a small part of a curriculum in which one becomes immersed in a history that was partly influenced by myths, battles, and legends on one hand, and the everyday workings of its various social classes on the other.

According to the Encyclopedia Britannica, the English language itself really took off with the invasion of Britain during the 5th century. Three Germanic tribes, the Jutes, Saxons and Angles, were seeking new lands to conquer, and crossed over from the North Sea. It must be noted that the English language we know and study through various English language courses today had yet to be created, as the inhabitants of Britain spoke various dialects of the Celtic language.

During the invasion, the native Britons were driven north and west into lands we now refer to as Scotland, Ireland, and Wales. The words England and English originated from the Old English word Engla-land, literally meaning “the land of the Angles”, where they spoke Englisc.

Old English (5th to 11th Century)

Albert Baugh, a notable English professor at the University of Pennsylvania notes amongst his published works that around 85% of Old English is no longer in use; however, surviving elements form the basis of the Modern English language today.

Old English can be further subdivided into the following:

  • Prehistoric or Primitive (5th to 7th Century) – literature or documentation referencing this period is scarce, aside from limited examples of Anglo-Saxon runes;
  • Early Old English (7th to 10th Century) – this period contains some of the earliest documented evidence of the English language, showcasing notable authors and poets like Cynewulf and Aldhelm who were leading figures in the world of Anglo-Saxon literature.
  • Late Old English (10th to 11th Century) – can be considered the final phase of the Old English language, brought to a close by the Norman invasion of England. This period ended with the consequential evolution of the English language towards Early Middle English.

Early Middle English

It was during this period that the English language, and more specifically, English grammar, started evolving with particular attention to syntax. Syntax is “the arrangement of words and phrases to create well-formed sentences in a language,” and we find that while the British government and its wealthy citizens Anglicised the language, Norman and French influences remained dominant until the 14th century.

An interesting fact to note is that this period is credited with the loss of case endings, which ultimately resulted in inflection markers being replaced by more complex features of the language. Case endings are “a suffix on an inflected noun, pronoun, or adjective that indicates its grammatical function.”

Late Middle English

It was during the 14th century that a different dialect (known as the East-Midlands) began to develop around the London area.

Geoffrey Chaucer, a writer we have come to identify as the Father of English Literature and author of the widely renowned Canterbury Tales, was often heralded as the greatest poet of that particular time. It was through his various works that the English language was more or less “approved” alongside French and Latin, though he continued to write some of his characters in northern dialects.

It was during the mid-1400s that the Chancery English standard was brought about. The story goes that the clerks working for the Chancery in London were fluent in both French and Latin. It was their job to prepare official court documents, and prior to the 1430s both the aforementioned languages were mainly used by royalty, the church, and wealthy Britons. After this date, the clerks started using a dialect that read as follows:

  • gaf (gave) not yaf (Chaucer’s East Midland dialect)
  • such not swich
  • theyre (their) not hir [6]

As you can see, the above is starting to look more like the present-day English language we know. If one thinks about it, these clerks held enormous influence over the manner of official communication, which ultimately shaped the foundations of Early Modern English.

Early Modern English

The changes in the English language during this period occurred from the 15th to mid-17th Century, and signified not only a change in pronunciation, vocabulary or grammar itself but also the start of the English Renaissance.

The English Renaissance has much quieter foundations than its pan-European cousin, the Italian Renaissance, and sprouted during the end of the 15th century. It was associated with the rebirth of societal and cultural movements, and while slow to gather steam during the initial phases, it reached the heights of glory during the Elizabethan Age.

It was William Caxton’s introduction of the printing press to England that allowed Early Modern English to become mainstream, something we as English learners should be grateful for! The printing press was key in standardizing the English language through distribution of the English Bible.

Caxton’s publishing of Thomas Malory’s Le Morte d’Arthur (the Death of Arthur) is regarded as print material’s first bestseller. Malory’s retelling, in his own words, of various tales surrounding the legendary King Arthur and the Knights of the Round Table, and their ensuing popularity, indirectly ensured that Early Modern English was here to stay.

It was during Henry VIII’s reign that English commoners were finally able to read the Bible in a language they understood, which, to its own degree, helped spread the dialect of the common folk.

The end of the 16th century brought about the first complete translation of the Catholic Bible, and though it didn’t make a marked impact, it played an important role in the continued development of the English language, especially among the English-speaking Catholic population worldwide.

The end of the 16th and start of the 17th century would see the writings of actor and playwright William Shakespeare take the world by storm.

Why was Shakespeare’s influence important during those times? Shakespeare started writing during a time when the English language was undergoing serious changes due to contact with other nations through war, colonisation, and the like. These changes were further cemented through Shakespeare and other emerging playwrights who found their ideas could not be expressed through the English language then in circulation. Thus, words and phrases “adopted” from other languages were modified and added to the English language, creating a richer experience for all concerned.

It was during the early 17th century that we saw the establishment of the first successful English colony in what was called the New World. Jamestown, Virginia, also saw the dawn of American English, with English colonizers adopting indigenous words and adding them to the English language.

The constant influx of new blood due to voluntary and involuntary (i.e. the slave trade) migration during the 17th, 18th and 19th centuries meant a variety of English dialects had sprung to life; these included West African, Native American, Spanish and European influences.

Meanwhile, back home, the English Civil War, starting mid-17th century, brought with it political mayhem and social instability. At the same time, England’s puritanical streak had taken off after the execution of Charles I. Censorship was a given, and after the Parliamentarian victory during the War, Puritans promoted an austere lifestyle in reaction to what they viewed as excesses by the previous regime. England would undergo little more than a decade under Puritan leadership before the crowning of Charles II. His rule, effectively the return of the Stuart Monarchy, would bring about the Restoration period which saw the rise of poetry, philosophical writing, and much more.

It was during this age that literary classics like John Milton’s Paradise Lost were published, works that remain relevant to this day!

Late Modern English

The Industrial Revolution and the Rise of the British Empire during the 18th, 19th and early 20th-century saw the expansion of the English language.

The advances and discoveries in science and technology during the Industrial Revolution saw a need for new words, phrases, and concepts to describe these ideas and inventions. Due to the nature of these works, scientists and scholars created words using Greek and Latin roots, e.g. bacteria, histology, nuclear, biology. You may be surprised to read that these words were deliberately created, but one can learn a multitude of new facts through English language courses such as the one you are taking now!

Colonialism brought with it a double-edged sword. It can be said that the nations under the British Empire’s rule saw the introduction of the English language as a way for them to learn, engage with, and hopefully benefit from “overseas” influence. While scientific and technological discoveries were some of the benefits that could be shared, colonial Britain saw this as a way not only to teach its language but also to impart its culture and traditions upon societies it deemed backward, especially those in Africa and Asia.

The idea may have backfired, as the English language walked away with a large number of foreign words that have now become part and parcel of the English language; shampoo, candy, cot and many others originated in India!

English in the 21st Century

If one endeavours to study the various English language courses taught today, one would find almost no immediate similarities between Modern English and Old English. English grammar has become exceedingly refined (even though smartphone messaging has made a mockery of the English language itself), with a perfect living example being the current British Royal Family. This has given many the idea that speaking proper English is a touch snooty and high-handed. Before you scoff, think about what you have just read: the basic history and development of a language that spawned from the embers of wars fought between ferocious civilisations. Imagine everything that our ancestors went through, their trials and tribulations, their willingness to give up everything in order to achieve freedom of speech and expression.

Link to the rest at Oxford International

What is Syntax and Why is it Important to Understand Language?

From Akorbi:

What is syntax?

Syntax is a term used by linguists to describe a set of principles and rules that govern sentence structure and word order in a particular language. In English, the general rule of syntax follows the subject-verb-object rule. The subject refers to the person or thing (a noun) performing the action, the verb describes the action being taken, and the object (another noun) refers to what is being acted upon (if anything).

More than 85% of the world’s languages put the subject first in a sentence, making it understood who or what performs the action. Many of the rest of the languages put the verb first, followed by the subject and the object.

Syntax may also include descriptive words such as adjectives and adverbs that add descriptions to nouns and verbs. Prepositions, like “to” and “above,” communicate the direction or placement of an object or subject.

Examples of Syntax in Various Languages

Every sentence in English breaks down to “subject-verb-object” no matter how long you make it.

“John cautiously drives the red car in the snow” shortens to “John drives the car.”

Spanish follows the same basic structure, except the noun and adjective are inverted. “Juan conduce con cautela el coche rojo en la nieve.” The phrase “coche rojo” means “red car” in English, but in Spanish it reads as “car red.”

However, in both sentences it’s understood that the car is red as opposed to anything else because the word “red” is adjacent to the word “car.” That’s because of syntax rules that govern Spanish and English.
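That adjacency rule can be made concrete with a toy function. This is purely illustrative, not a real parser, and the function name and language codes are hypothetical labels chosen for the sketch:

```python
def noun_phrase(noun: str, adjective: str, lang: str) -> str:
    """Toy word-order rule: English puts the adjective before the
    noun, Spanish typically after it."""
    if lang == "en":
        return f"{adjective} {noun}"
    if lang == "es":
        return f"{noun} {adjective}"
    raise ValueError(f"no word-order rule for: {lang}")

print(noun_phrase("car", "red", "en"))     # red car
print(noun_phrase("coche", "rojo", "es"))  # coche rojo
```

Either way, the adjective stays adjacent to its noun, which is what lets a listener in both languages know which word it modifies.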

. . . .

Modifiers

Modifiers, such as adjectives and adverbs, should be close to the noun or verb that they modify in the sentence. The relationship between a modifier and its referent can be clarified with commas or punctuation to ensure the correct meaning is communicated.

“The deft driver swerved his car to the right to avoid an accident just in time.”

You could change this sentence to, “The car’s driver swerved deftly to the right just in time to avoid an accident.”

Both sentences are grammatically and syntactically correct. However, the second sentence is clearer: he swerved just in time to avoid an accident.

Turning Syntax on Its Head

Have you ever heard Yoda from Star Wars speak? He turns syntax on its head by putting the subject towards the end of the sentence instead of the beginning.

“When 900 years old you reach, look as good you will not.”

Ordinarily, you would say to someone, “You won’t look this good when you reach 900 years old.”

Link to the rest at Akorbi

Discussion: Language

From MIT Open Courseware – Introduction to Psychology:

Session Overview

How do I move meaning from my mind to yours? 

Discussion

Language is just incredible – think about how easy it is for us, as babies, to learn our native language effortlessly, and yet how hard it is, once we’ve already learned a language, to learn another.

I think about this every time I see a Chinese baby speaking perfect Mandarin. I have a master’s degree in linguistics and I’ve been trying to learn Mandarin for 10 years, but I’m just awful at it. And language is just spectacularly complicated in terms of our capacity for explaining things. There are sentences that you utter, or that friends utter, that have never been uttered before in the history of the human language, and that will probably never be uttered again, and yet they’re perfectly understandable.

We can talk about language in terms of signals for communication, that is, the perception and production of speech and sign, or what you have to do to move the message from one place to another. We can also talk about language as structures for information, or what the rules are that govern how the basic building blocks of language go together to convey meaning – including rules for sounds, words, sentences, and discourse.

In psychology, some of the big questions about language have to do with language acquisition (both as babies and as adults); the brain bases of speech and language; and communication and language disorders, such as aphasia and dyslexia. Language is a huge topic, but we’ll hit some of the highlights here.

Demonstration

Phonology is the structure of the sounds that make up the words of a language. Phonemes are the building blocks of speech sounds. Phonemes aren’t large enough units of language to convey meaning all by themselves, but they do distinguish one word from another. For example, bit and hit differ by one phoneme. English has about 45 phonemes altogether.
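The bit/hit example is what linguists call a minimal pair. One can sketch the test in code by counting mismatched positions, with the caveat that this compares letters rather than true phonemes (English spelling and sound often diverge, so treat it as a rough stand-in):

```python
def is_minimal_pair(a: str, b: str) -> bool:
    """True when two equal-length words differ in exactly one
    position: a letter-based stand-in for a phonemic minimal pair."""
    if len(a) != len(b):
        return False
    return sum(x != y for x, y in zip(a, b)) == 1

print(is_minimal_pair("bit", "hit"))  # True
print(is_minimal_pair("bit", "bat"))  # True
print(is_minimal_pair("bit", "hat"))  # False: two positions differ
```

A real phonological analysis would operate on transcriptions (e.g. IPA) instead of spelling, since pairs like "though"/"through" break the letter-for-sound assumption.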

But think about two things. One, there’s a lot of sounds that we can make (whistles, coughs, snorts, etc.) that aren’t linguistic. Two, there’s incredible variety in how the people around us pronounce the “same” sound. Think about people speaking with different accents, or how you sound when you have a cold. How does the brain handle this?

Listen to Tyler describe and demonstrate the phenomenon of categorical perception: (Includes recorded demonstrations of “bad/bat” and “slash/splash” courtesy of UCLA Phonetics Lab, used with permission. The original recordings, and many others like them, are available at Peter Ladefoged’s website Vowels and Consonants.)

Demonstration

However, we don’t rely on our ears alone to determine what we’re hearing. The McGurk effect is a famous example of how visual cues impact our perception of speech sounds. For this demonstration, you will play a single, five-second video clip three times, with different instructions each time.

First, play the video with your eyes closed. Make note of what the man is saying.

(PG Comment: This is a very short video. On PG’s computer, at the close of this video, YouTube starts another that doesn’t seem to be related.)

Second, play the video again with your eyes open. Now what is he saying?

Third, mute the sound on the video and just watch his mouth move. What does it look like he’s saying?

What do you think is happening in this situation? Why?

. . . .

Demonstration

Consider the following joke:

A woman is taking a shower when her doorbell rings. She yells, “Who’s there?” and a man answers, “Blind man.” Being a charitable person, she runs out of the shower naked and opens the door. The man says, “Where should I put these blinds, lady?”

The use of the word ‘blind’ in this joke relies upon the particularities of English semantics and syntax. Semantics refers to the meaning of a word or a sentence; syntax refers to the rules for combining words into sentences. The word ‘blind’ has several meanings (it can be an adjective or a noun), and the one that comes to mind first for most listeners is ‘visually impaired,’ so “blind man” is at first understood as adjective + noun. However, the rules of English syntax allow us to interpret noun + noun phrases such as ‘ice cream man’ not as ‘a man made of ice cream’ but ‘a man who sells ice cream.’ It’s not until the punchline is delivered that you realize that it was a different meaning of ‘blind’ all along.

Link to the rest at MIT Open Courseware

20 Rare Languages Still Spoken Today

From Universal Translation Services:

Languages:

Culture is the most personal thing society came up with. It defines the people of a community and regulates their everyday life. There are many aspects of culture, but language is the most important. Figuring out humans’ first language is impossible, but we know some ancient tongues. We can also figure out which is the least spoken language. But languages die, too, even if they were quite famous at some point. Latin is a good example of this because it was a mighty tongue once, but today, it does not have a single native speaker. Experts are sure half of the seven thousand languages spoken today will become extinct within a hundred years.

Here are the top 20 most spoken languages in the world:

  1. English
  2. Mandarin
  3. Hindi
  4. Spanish
  5. French
  6. Standard Arabic
  7. Bengali
  8. Russian
  9. Portuguese
  10. Indonesian
  11. Urdu
  12. Standard German
  13. Japanese
  14. Swahili
  15. Marathi
  16. Telugu
  17. Western Punjabi
  18. Wu Chinese
  19. Tamil
  20. Turkish


What Are All the Languages in the World?

Language defines our cultural identity. More than seven thousand languages are spoken in the world. Some have fewer than a thousand speakers and risk becoming extinct. At the same time, others have millions of speakers. English and Mandarin are two vernaculars with over a billion speakers. Spanish is one of the most widely spoken languages.

. . . .

The Rarest Language in the World

Kaixana is an unknown language because it only has one speaker left today. Kaixana has never been very popular. But it had 200 speakers in the past. However, that number has been reduced to a single speaker today. Learning it is a complicated task, since little is known about its vocabulary.

. . . .

What is the Oldest Language Still Spoken Today?

The oldest language that is still spoken today is Tamil. It has been around for at least 5,000 years. It is spoken in India by more than 60 million speakers. Other old languages that are still spoken in the world today with little change are Greek, Hebrew, Egyptian, and Farsi. Sanskrit is another language that has been around for over 3,000 years, but it is only spoken by Hindu priests nowadays.

Link to the rest at Universal Translation Services

Language City

From The Wall Street Journal:

Words are coined and money talks, but the link between linguistics and economics is more than metaphor. Most notably, as Ross Perlin writes in his superb “Language City: The Fight to Preserve Endangered Mother Tongues in New York,” the English-speaking peoples’ geopolitical dominance has made their language the “reserve currency of communication.” But economic power can pull as well as push, luring immigrants and refugees to the metropole with the prospect of better lives, or at least better pay. So it is, Mr. Perlin writes, that New York City is now home to some 700 languages—there are roughly 7,000 extant—making it the “most linguistically diverse city in the history of the world.”

Linguistic variety is “often seen as a problem, the curse of Babel,” but for a linguist, New York City is a riotous collection of living specimens—a “greenhouse, not a graveyard.” Many of its languages have only thousands or hundreds of speakers worldwide. Mr. Perlin, who has a doctorate in linguistics, helps run the Endangered Language Alliance, which works to document such minority tongues. “Language City” centers on six: Yiddish; Seke, from Nepal; Wakhi, from the borderlands where Afghanistan, Tajikistan, Pakistan and China meet; N’ko, a script invented in 1949 to standardize a family of West African languages; Nahuatl, from Mexico and Central America; and Lenape, the language of New York’s first inhabitants.

The heart of “Language City” is portraits of individual New York-based speakers. Mr. Perlin writes about their work as well as his, capturing the grind of immigrant life with empathy, balance and wit. (Linguistic fieldwork, which involves coaxing people to talk while you note every nuance of their speech, is excellent training for journalism.) “If the country was rich we would never leave,” says Husniya, a Wakhi speaker from bleak post-Soviet Tajikistan. But she savors the city’s entrepreneurial energy: “New York opened my eyes. It shapes you to be a human being, not dividing based on religion, face, or race, or anything.”

. . . .

Mr. Perlin can set a scene with quick, sure strokes. “Boris, sipping from a slim-waisted glass of Turkish tea on Emmons Avenue, insists that he is not the last major Yiddish writer,” he writes of an aging Brooklynite born in Bessarabia. “This village has elevators. We step into one of them,” begins a chapter on Rasmina, a young speaker of Seke. Her language originates in a cluster of five villages in upland Nepal and is spoken by some 700 people in the world. At one point or another 100 of them have lived in a “vertical village” in Brooklyn—a six-story apartment building where a now-forgotten Nepali established a foothold decades ago.

Rasmina visits a class Mr. Perlin teaches at Columbia so that his students can practice field techniques. Like most of the world’s languages, Seke is barely recorded on paper; there’s no dictionary or grammar guide. In fact, Mr. Perlin notes later, as he travels with Rasmina back to Nepal, most languages are documented only “once, if at all.” In class, his students ask Rasmina the words in Seke for a range of objects and concepts, such as body parts and family relationships, drawn from a set of putatively universal vocabulary items called a Swadesh list. Even these are tricky for English speakers: Seke, like Russian and other languages, treats limbs as integral units, with a word that means “leg-foot” and one that means “arm-hand.”

Wonderfully rich, “Language City” is in part an introduction to the diverse ways different languages work. Seke and other “evidential” languages, for example, have different grammatical forms to indicate how the speaker knows what she’s asserting—whether from observation or inference, hearsay or hunch. Other languages syntactically “tag the speaker’s surprise at unexpected information” or have a special temporal marking “just for things happening today.”

It is also a brief survey of U.S. immigration, full of piquant detail about its tortuous history. Ellis Island, contrary to legend, was known for linguistic sensitivity, with translators capable of handling 20 languages—though premium-class passengers could skip immigration checks altogether. It’s the kind of book where even the notes are pinpoint portraits. Did you know that when Andy Warhol met Pope John Paul II (in 1980), they spoke Ruthenian? “Though Warhol famously said ‘I come from nowhere,’ ” Mr. Perlin deadpans, “his family in fact came from Mikova in what is now Slovakia.”

Why do we need so many languages, though? There’s something that seems inefficient about slicing one world into 7,000 pieces; there’s a reason that developing a universal language long seemed like the easiest route to utopia. It took surprisingly long to realize that sharing a language is no cure for conflict. The concerted push to preserve endangered tongues is only a few decades old, and the benefits it claims are more subtle than those of the biodiversity movement that helped inspire it. Exotic species have yielded many valuable medicines, but we’re not going to discover a treatment for cancer in an unfamiliar grammar. A central dogma of modern linguistics is that all languages are functionally equal: “Any society can run in any language,” as Mr. Perlin puts it.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Russian spies are back—and more dangerous than ever

From The Economist:

It is unusual for spymasters to taunt their rivals openly. But last month Bill Burns, the director of the CIA, could not resist observing that the war in Ukraine had been a boon for his agency. “The undercurrent of disaffection [among Russians] is creating a once-in-a-generation recruiting opportunity for the CIA,” he wrote in Foreign Affairs. “We’re not letting it go to waste.” The remark might well have touched a nerve in Russia’s “Special Services”, as the country describes its intelligence agencies. Russian spies botched preparations for the war and were then expelled from Europe en masse. But evidence gathered by the Royal United Services Institute (RUSI), a think-tank in London, and published exclusively by The Economist today, shows that they are learning from their errors, adjusting their tradecraft and embarking on a new phase of political warfare against the West.

The past few years were torrid for Russian spies. In 2020 operatives from the FSB, Russia’s security service, botched the poisoning of Alexei Navalny, the recently deceased opposition activist. He mocked them for spreading Novichok on his underwear. Then the FSB gave the Kremlin a rosy view of how the war would go, exaggerating Ukraine’s internal weaknesses. It failed to prevent Western agencies from stealing and publicising Russia’s plans to invade Ukraine. And it was unwilling or unable to halt a brief mutiny by Yevgeny Prigozhin, the leader of the Wagner mercenary group, last year. The SVR, Russia’s foreign intelligence agency, saw its presence in Europe eviscerated, with some 600 officers expelled from embassies across the continent. At least eight “illegals”—intelligence officers operating without diplomatic cover, often posing as non-Russians—were exposed.

The study by RUSI, written by Jack Watling and Nick Reynolds, a pair of the organisation’s analysts, and Oleksandr Danylyuk, a former adviser to both Ukraine’s defence minister and foreign intelligence chief, draws on documents “obtained from the Russian Special Services” and on interviews with “relevant official bodies”—presumably intelligence agencies—in Ukraine and Europe. In late 2022, the study says, Russia realised that it needed more honest reporting from its agencies. It put Sergei Kiriyenko, the Kremlin’s deputy chief of staff, in charge of “committees of special influence”. These co-ordinate operations against the West and then measure their effect.

That personnel change appears to have produced more coherent propaganda campaigns. In Moldova, for instance, a once-scattershot disinformation effort against the country’s bid for European Union membership grew more consistent and focused last year. It tied the accession bid to the president personally, all the while blaming her for Moldova’s economic woes. Campaigns aimed at undermining European support for Ukraine have also picked up. In January German experts published details of bots spreading hundreds of thousands of German-language posts a day from a network of 50,000 accounts over a single month on X (Twitter as was). On February 12th France exposed a large network of Russian sites spreading disinformation in France, Germany and Poland.

Meanwhile the GRU, Russia’s military intelligence agency, has also been re-evaluating its tradecraft. In recent years its Unit 29155—which had attempted to assassinate Sergei Skripal, a former GRU officer, in Salisbury, Britain in 2018—saw many of its personnel, activities and facilities exposed by Bellingcat. The investigative group draws on publicly available information and leaked Russian databases for its exposés. The GRU concluded that its personnel were leaving too many digital breadcrumbs, in particular by carrying their mobile phones to and from sensitive sites associated with Russian intelligence. It also realised that the expulsion of Russian intelligence officers in Europe had made it harder to mount operations and control agents abroad—one reason why the invasion of Ukraine went awry.

The result was wholesale reform, which began in 2020 but sped up after the war began. General Andrei Averyanov, the head of Unit 29155, was, despite his litany of cock-ups, promoted to deputy head of the GRU and established a new “Service for Special Activities”. Unit 29155’s personnel—once exemplified by Alexander Mishkin and Anatoly Chepiga, Mr Skripal’s hapless poisoners, who insisted that they had travelled to Salisbury to see its cathedral’s famous spire—no longer carry their personal or work phones to its facility, using landlines instead. Training is done in a variety of safe houses rather than onsite. Whereas half of personnel once came from the Spetsnaz, Russia’s special forces, most new recruits no longer have military experience, making it harder for Western security services to identify them through old photographs or leaked databases.

A separate branch of the Service for Special Activities, Unit 54654, is designed to build a network of illegals operating under what Russia calls “full legalisation”—the ability to pass muster even under close scrutiny from a foreign spy agency. It recruits contractors through front companies, keeping their names and details out of government records, and embeds its officers in ministries unrelated to defence or in private companies. The GRU has also targeted foreign students studying at Russian universities, paying stipends to students from the Balkans, Africa and elsewhere in the developing world.

For another example of how Russian spies have turned disaster into opportunity, consider the case of the Wagner Group, a series of front companies overseen by Mr Prigozhin. Wagner initially served as a deniable arm of Russian influence, providing muscle and firepower to local autocrats in Syria, Libya and several African countries. In June 2023 Mr Prigozhin, angered by the mismanagement of the war by Russia’s defence minister and army chief, marched on Moscow. The mutiny was halted; two months later Mr Prigozhin was killed when his plane exploded midair.

Russia’s special services quickly divided Mr Prigozhin’s sprawling military-criminal enterprise among themselves. The FSB would keep domestic businesses, and the SVR the media arms, such as the troll farms which interfered in America’s presidential election in 2016. The GRU got the foreign military bits, split into a Volunteer Corps for Ukraine and an Expeditionary Corps, managed by General Averyanov, for the rest of the world. The latter missed its target of recruiting 20,000 troops by the end of last year, says RUSI, though its strength is “steadily rising”. There have been hiccups: Mr Prigozhin’s son, who mystifyingly remains alive and at liberty, offered Wagner troops to the Rosgvardia, Russia’s national guard, prompting a bidding war between the guard and the GRU, according to the authors.
. . . .

Mission Possible

Russian intelligence, though bruised, is firmly back on its feet after its recent humiliations. In recent weeks the Insider, a Riga-based investigative website, has published a series of stories documenting Russian espionage and influence across Europe. They include details of how a GRU officer in Brussels continues to provide European equipment to Russian arms-makers, and the revelation that a top aide in the Bundestag and a Latvian member of the European Parliament were both Russian agents, the latter for perhaps more than 20 years.

“It’s not as bad for them as we think it is,” says Andrei Soldatov, an investigative journalist, who reckons that the Russian services are “back with a vengeance” and increasingly inventive. Vladimir Putin, Russia’s president, and once a (mediocre) KGB officer, is “trying to restore the glory of Stalin’s formidable secret service”, explains Mr Soldatov. He points to a case in April 2023 when Artem Uss, a Russian businessman arrested in Milan on suspicion of smuggling American military technology to Russia, was spirited back to Russia with the help of a Serbian criminal gang—a common intermediary for the Russian services.

In the past, says Mr Soldatov, the FSB, SVR and GRU had a clearer division of labour. No longer. All three agencies have been particularly active in recruiting among the flood of exiles who left Russia after the war began. It is easy to hide agents in a large group and simple to threaten those with family still in Russia. Germany is of particular concern: the many Russians who have moved there, partly because Baltic countries have grown more hostile to Russian émigrés, form a potential recruiting pool for Russian intelligence agencies.

Moreover, Russian cyber-activity goes from strength to strength. In December America and Britain issued public warnings over “Star Blizzard”, an elite FSB hacking group which has been targeting Nato countries for years. The following month Microsoft said that “Cosy Bear”, a group linked to the SVR, had penetrated email accounts belonging to some of the company’s most senior executives. That came on top of a sophisticated GRU cyber-attack against Ukraine’s power grid, causing a power outage apparently co-ordinated with Russian missile strikes in the same city.

The renewal of Russia’s intelligence apparatus comes at a crucial moment in east-west competition. An annual report by Norway’s intelligence service, published on February 12th, warned that, in Ukraine, Russia was “seizing the initiative and gaining the upper hand militarily”. Estonia’s equivalent report, released a day later, said that the Kremlin was “anticipating a possible conflict with NATO within the next decade”.

Link to the rest at The Economist

PG understands that Russia is trying to harm the U.S. and any other nations that are allied with the U.S. and such activities need to be taken seriously. However, the OP reminded him of Spy vs. Spy during the Cold War.

Strong Passions

From The Wall Street Journal:

Peter Strong’s marriage to his young wife, Mary, had not been particularly happy over the previous few months. Now they were dealing with the death of their 14-month-old daughter. Peter, a bon vivant living off inherited wealth, was anxious to rekindle the romance with Mary. Instead, his wife’s response shocked him. She pulled away, sobbing: “Oh, forgive me, forgive me.” Then she confessed that for the previous two years she had been having an affair with his widowed younger brother, Edward.

In “Strong Passions: A Scandalous Divorce in Old New York,” Barbara Weisberg describes a case from the 1860s that has all the elements of a soap opera—powerful families, a tearful confession, adultery, abortion and the fate of two innocent little girls.

Peter and Mary would never again share a bed, but at first there was no talk of a divorce. “Whatever a couple’s miseries,” Ms. Weisberg writes, in mid-19th-century America “legally ending a marriage was viewed as an abhorrent act, especially among members of the Strongs’ social class.” In New York, divorce was especially difficult because the courts accepted only one cause—adultery—and it would have humiliated both Peter, who had been cuckolded by his own brother, and Mary, who would be branded a “fallen woman.” A wife divorced for adultery would have had to surrender custody of her children. By all accounts, Peter and Mary were equally devoted to their daughters, 7-year-old Mamie and 2½-year-old Allie.

Perhaps the Strongs might have managed a quiet separation, during which they would jointly raise their daughters. But then Mary admitted that she was pregnant, and that the paternity was uncertain. Peter was enraged. At this point, either Mary had a miscarriage or Peter procured an abortion. Relations between the couple deteriorated rapidly, and two years after Mary’s confession, Peter filed for divorce. Members of Manhattan’s upper crust were appalled. Mary retaliated by bolting, taking Allie with her and disappearing completely. Meanwhile, Peter was indicted for manslaughter in the death of his wife’s unborn child.

Ms. Weisberg, a former television producer and the author of “Talking to the Dead” (2004), reconstructs the events that led up to the Strong divorce almost entirely from a few court documents plus the newspaper articles that covered the subsequent court cases. There are no personal papers remaining from any of those involved, except for a few laconic journal entries by Peter’s cousin, the diarist George Templeton Strong. This means the author had no access to the unfiltered voices of the main actors in this family saga, and she has struggled to bring them to life. Nonfiction accounts of long-dead individuals always require a degree of speculation, but “Strong Passions” has more than its share of words like “probably,” “perhaps,” “likely” and “undoubtedly.”

Instead, Ms. Weisberg devotes two-thirds of her book to the overlapping narratives heard in court from an extraordinary number and range of witnesses—“a governess, a detective, a judge’s daughter, an undertaker, an abortionist’s spouse, a laundress and Teddy Roosevelt’s uncle.” She writes that “the series of dramatic incidents that precipitated the divorce suit were clouded by a divergence in bitterly contested versions of what had occurred.” Was Mary a victim or instigator of her affair? Did she admit guilt or deny the accusations against her? Was Peter a brutal villain or a gentle and put-upon husband? Did he have an affair with the abortionist? How many of the witnesses were bribed?

Despite the dramas described in court, Peter was acquitted of the criminal charges for lack of evidence, and the jury in the civil-law divorce case deliberated for 45 hours but remained deadlocked. There was therefore no divorce. George Templeton Strong declared that “the public is sick of this horrible case.” He went on to observe that some commentators found neither Peter nor Mary “particularly admirable,” and as such the two seemed “so well matched” it would be “a pity to divorce them.”

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

The Literary Memoir

Non-fiction, and in particular the literary memoir, the stylised recollection of personal experience, is often as much about character and story and emotion as fiction is.

Chimamanda Ngozi Adichie