Humanly Possible

From The Wall Street Journal:

Humanism was born in Renaissance Italy as an approach to reading Roman literature. It later turned into an Enlightenment philosophy for reorganizing society along rational lines, especially in France. No one called it “humanism” in English until the 19th century. Our humanism is a Euro-American ideology, and its keynotes are progress, liberal individualism, agnosticism or atheism, and trusting the science.

The British historian Sarah Bakewell has previously deciphered the complexities of Montaigne (“How to Live”) and given a droll and chatty account of the Existentialists (“At the Existentialist Café”). Her latest book, “Humanly Possible,” traces the abstract ideal of humanism through the lives of its exponents and the hopes of its adherents, from Petrarch’s Florence to present-day Glasgow, where the Humanists International group recently issued a “Declaration of Modern Humanism.” A book of big and bold ideas, “Humanly Possible” is humane in approach and, more important, readable and worth reading, whether you agree with it or not.

For the Romans, Ms. Bakewell writes, the word humanitas meant being human, “but with added overtones of being refined, knowledgeable, articulate, generous, and well mannered.” Her first humanists are the umanisti of 14th-century Italy, literary scholars specializing in studia humanitatis, or human studies, rather than Christian theology, for instance. Not that religious concerns were ignored: Dante may have been a “cosmic visionary,” but his cosmos was Christian, so much so that he invented a Hell for his enemies and had to leave the pagan Virgil in Limbo.

After Dante came Francesco Petrarca (Petrarch) and Giovanni Boccaccio. Petrarch (1304-1374), a poet and scholar, relished the peripatetic “literary life.” He was a bloodhound in the library, hunting down fragments of Livy’s Roman history, and discovering Cicero’s “Pro Archia,” in which the Roman statesman argued that the Greek poet Archias merited citizenship for his “human and literary studies.” His biggest find was three of Cicero’s letters in the cathedral library of Verona. These showed the private Cicero, the “informal writer and friend who reflected on human dilemmas and emotions,” and mixed observations on current affairs with anecdotes from philosophy and literature. When Petrarch emulated Cicero in his letters, he revived the voice of humanitas.

Link to the rest at The Wall Street Journal

PG notes that, once again, Big Publishing (Penguin in this case) got a positive review in the largest-circulation newspaper in the United States and has posted the book for sale on Amazon, but has not enabled Look Inside, so the millions of ardent readers cannot examine the book in more detail to decide whether to buy it.

Modern intelligent and cultured readers have a huge offering of interesting writing about nearly any subject available online. At least some of these potential customers are like PG, flitting from flower to flower. PG has already read the WSJ review and looked up the book on Amazon.

Between today and the release date, March 28, PG will have scanned hundreds of online pages, read at least two hundred pages of ebooks and forgotten all about the nice review the publisher scored in The Wall Street Journal.

PG is on the verge of emailing the WSJ editors to request that they not publish a book review until the book itself is on sale. His email may or may not have any impact on the WSJ review policies at all, but PG will have let off a bit of steam.

19th-century marketing and promotion are still with us in traditional publishing.

The Angel Makers of Nagyrev

From The Economist:

When the people of Nagyrev had a problem, they went to Auntie Suzy. Though she had no formal training, she was the Hungarian village’s appointed midwife and the closest thing it had to a resident doctor. Men used her homemade tinctures for relief from the aches and pains they sustained toiling in the fields. Women, too, turned to her for help, and not just with the delivery of their babies. Alongside rubs and salves, Auntie Suzy produced another concoction: arsenic, made from boiling flypaper in vinegar.

Some women used the solution out of desperation—to avoid having another mouth to feed or to get rid of a violent spouse or relative. Others, however, dispensed it to deal with less urgent personal inconveniences. One woman had tired of her clingy husband, so fed him contaminated duck soup. Another, weary of her adult son, mixed some of the elixir into his goulash; later, when she suspected her third husband of having an affair, she reprised her technique. The arsenic was an open secret. If a woman complained of her partner’s behaviour, a friend would suggest a visit to the midwife.

Patti McCracken’s new book, “The Angel Makers”, is a detailed account of the killing spree in Nagyrev and other nearby villages in the early 20th century. It took prosecutors several years to grasp its scale. Eventually 29 women and two men were put on trial in 1929 for the murders of 42 men; 16 women were convicted. Scores of bodies were exhumed and examined for traces of arsenic. Some think as many as 300 people could have been killed. “The boldness and utter callousness with which they carried on their criminal activities seems to have been equalled only by the stupidity of the men who were their victims,” the New York Times reported.

The author weaves in character sketches that suggest the perpetrators’ various motives. Her portrait of Auntie Suzy, a buxom woman fond of her pipe and brandy, is particularly evocative. When questioned by the police about a pattern of infant deaths, she described her role in benevolent terms: she helped poor people with family planning. In fact, she was motivated by money and status.

She plundered goods from clients’ homes and charged exorbitant fees for her potions. (From Maria, the woman who killed her son and husband, she hoped to extract a house.) Occasionally Auntie Suzy or one of her helpers would bump someone off unprompted. A baby was killed without the mother’s say-so. A war veteran was dispatched so that Auntie Suzy’s son might marry his wealthy widow.

Ms McCracken also lays out the context in which these misdeeds took place. She describes regional customs and the effects on the village of the first world war and the fall of the Austro-Hungarian empire. 

Link to the rest at The Economist

Fool Me Once

From The Wall Street Journal:

You know the saying: “You can’t cheat an honest man.” It assumes that those who are the target of a cheat asked for it. The investment fund with the suspiciously high returns, the inside racing tip, the stock trade that’s a “sure thing”—the investors knew, or should have known, that something was off. The victims thought they were in on the con, and it turned out they were the marks.

The truth is that honest people can be and are cheated all the time. In “Fool Me Once: Scams, Stories, and Secrets From the Trillion-Dollar Fraud Industry,” Kelly Richmond Pope explores how this happens. Small-business owners are blindsided when their bookkeeper turns out to have been cooking the books; multinational financial institutions find executives funneling out cash; the government of a small city discovers that an official has been skimming from its accounts for decades. Ms. Pope, a professor of forensic accounting at DePaul University, even includes a class of “accidental” perpetrator for those who start out as inadvertent beneficiaries of a mistake but who, rather than confessing, decide to lean into the opportunity.

As Ms. Pope argues: “We all have to understand the cycle of fraud because at one point or another you’re either impacted by it, you did it, or you were in a position to expose it.” She has spent time with fraudsters, interviewing the incarcerated and paroled, to understand why they made the choices they did. She has also spoken to victims, examining how they became vulnerable, and whistleblowers, who reported their co-workers, often at high personal cost.

Some of Ms. Pope’s observations make intuitive sense, like the conclusion that community groups and churches are among the softest of targets: they’re typically run by volunteers, hold lots of cash and apply very little oversight. But any organization can be vulnerable. Most of us ignore the red flags or don’t even know what to look for. We don’t want to treat our friends and co-workers with suspicion, and keeping a beady eye feels unnatural. That’s why the perp is always the “trusted” employee. (The person everyone thinks is kind of shady is rarely given access to the bank accounts.) In my experience, systems designed to prevent fraud probably make scams more possible: Forcing users to change their passwords every 30 days results in passwords written on Post-it notes and found in obvious locations. Convoluted swipe-card systems end up with doors propped open.

A threat can come from anywhere, given how much of our lives is online. Maybe you’ve seen the Facebook message saying your grandson has been mugged and could you please wire some money. I’ve received emails, purportedly from my boss, asking me to send him some gift cards. Ms. Pope notes how victims often don’t report crimes like this out of shame and embarrassment. The thief didn’t lift their wallet—they willingly handed it over.

But this isn’t a “protect grandma from getting scammed” book. It’s a study of who does it, whom they do it to and who reports it. Ms. Pope’s profiles of fraudsters include the typical corporate swindlers, but there’s also one shocking case of a pharmacist who diluted cancer medicine for profit. (My gut instinct is to call this a homicide rather than fraud.) The diversity of the people in this book reminds us that almost anyone could perpetrate a fraud—or be the victim of one.

“I’m fixated on how people cheat,” Ms. Pope writes. The prevailing answer seems to be by taking advantage of opportunity and impunity. An ATM technician grabs a few handfuls of cash. An insurance-account manager creates fake transactions. If a novice cheater gets away with it the first time, he’ll probably keep doing it.

The biggest fraudster in this book is Rita Crundwell, who stole more than $50 million from the city coffers as the comptroller of Dixon, Ill. She was caught when a colleague looked at some accounts while Crundwell was out of the office. According to Ms. Pope, a high proportion of fraud comes to light this way. We depend on these human monitoring systems as much as any password setup or background check.

Link to the rest at The Wall Street Journal

Decoding the Secrets of Mary, Queen of Scots’ Letters

From Culture.org:

A team of researchers, George Lasry, Norbert Biermann, and Satoshi Tomokiyo, has successfully deciphered 57 encrypted letters written by Mary, Queen of Scots, dating from 1578 to 1584.

This discovery is being hailed as the most significant find regarding Mary in over a century. The letters were found in the French National Library, catalogued as Italian texts from the first half of the 16th century.

Dr. John Guy, a fellow in history at the University of Cambridge and author of a 2004 biography of Mary, Queen of Scots, called the findings a “literary and historical sensation.”

The decryption project involved a combination of manual research and computerized cryptanalysis, which identified the plaintext language as French, not Italian as previously assumed.

. . . .

Mary, Queen of Scots, used more than 100 different ciphers in her correspondence.

Mary’s cipher system often masked individual letters with a single symbol; however, to bolster security, she employed homophones, allowing several symbols to signify frequently used letters.

Additionally, she concealed common words by utilizing symbols designated for months, locations, and names of individuals.

Lastly, to further obscure the content, she incorporated red herrings or “nulls” that knowledgeable recipients would disregard.
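
PG thought some visitors might enjoy seeing the mechanics made concrete. Below is a minimal sketch in Python of how a homophonic substitution cipher with codewords and nulls can work. The symbol tables are invented for illustration only; they are not Mary’s actual cipher alphabet, and her real ciphers used far larger symbol sets.

import random

# Toy homophonic substitution table. These symbol assignments are
# invented for illustration -- not Mary's actual cipher alphabet.
HOMOPHONES = {
    "t": ["21", "43"],
    "e": ["12", "34", "56"],  # frequent letters get several homophones
    "l": ["88"],
    "h": ["77"],
}

# Single symbols that stand in for whole common words (names, places, months).
CODEWORDS = {"walsingham": "[W]"}

# Meaningless "nulls" that a knowledgeable recipient simply discards.
NULLS = ["00", "11"]

def encrypt(plaintext):
    tokens = []
    for word in plaintext.lower().split():
        if word in CODEWORDS:
            tokens.append(CODEWORDS[word])  # mask the whole word
        else:
            for ch in word:
                # Choose one of the letter's homophones at random, which
                # flattens the letter-frequency profile of the ciphertext.
                tokens.append(random.choice(HOMOPHONES[ch]))
        if random.random() < 0.3:
            tokens.append(random.choice(NULLS))  # sprinkle in a red herring
    return tokens

def decrypt(tokens):
    reverse = {sym: ch for ch, syms in HOMOPHONES.items() for sym in syms}
    reverse.update({sym: word for word, sym in CODEWORDS.items()})
    # Drop the nulls and map every remaining symbol back. Like the real
    # ciphers, this toy version does not preserve word boundaries.
    return "".join(reverse[t] for t in tokens if t not in NULLS)

ciphertext = encrypt("tell walsingham")
print(ciphertext, "->", decrypt(ciphertext))

Because several symbols can stand for the same letter, simple frequency counting fails against even this toy version, which hints at why the research team needed computerized cryptanalysis alongside manual work to break the real letters.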

. . . .

The decrypted letters reveal a mix of political discussions and personal complaints, reflecting Mary’s shifting strategies during her imprisonment.

She often wrote about her efforts to negotiate her release and her willingness to relinquish her claims to the English throne.

The letters also reveal her distrust of Sir Francis Walsingham and the Puritan faction at the English court.

Mary’s deteriorating personal circumstances, including financial difficulties and recurrent bouts of physical and mental illness, are also evident in her correspondence.

The letters provide valuable insight into how she maintained connections with her supporters despite the intense surveillance during her captivity.

. . . .

The newly deciphered letters have confirmed the long-held suspicion of a mole within the French embassy who successfully passed letters to the English.

The survival of both ciphered letters and contemporary plaintext copies in English archives indicates the mole’s success throughout 1584.

. . . .

According to Dr. Guy, these new documents show Mary as a shrewd and attentive analyst of international affairs and will occupy historians of Britain and Europe and students of the French language and early modern ciphering techniques for years to come.

Link to the rest at Culture.org

Why Marie Antoinette’s Reputation Changes With Each Generation

From The Smithsonian Magazine:

Approximately 230 years after Marie Antoinette’s execution by guillotine at the hands of revolutionaries, the French queen remains one of history’s most recognizable royals. Depicted alternatively as a materialistic, self-absorbed young woman who ignored her people’s suffering; a more benign figure who was simply out of her depth; and a feminist scapegoat for men’s mistakes, she continues to captivate in large part because of her tragic fate.

“[Marie Antoinette] has no official power. She’s just the wife of the king of France, and yet she’s put to death,” says Catriona Seth, a historian and literary scholar at the University of Oxford. “It seems like an almost gratuitous action on the part of the revolutionaries. … [If] they had sent her back to Austria or put her in a convent,” she would be far less famous.

Marie Antoinette’s exploits at the glittering court of Versailles, coupled with her dramatic fall from grace during the French Revolution, have inspired numerous silver screen adaptations, from a 1938 film starring Norma Shearer to Sofia Coppola’s sympathetic 2006 biopic. But “Marie Antoinette,” a new series premiering in the United States on March 19, is the first major English-language television show to tell the queen’s story. Much like Marie Antoinette herself, it’s proving controversial, with biographer Évelyne Lever deeming the production a “grotesque caricature” and a “litany of historic aberrations.”

Here’s what you need to know ahead of the series’ debut on PBS.

Is “Marie Antoinette” based on a true story?

Yes, but with extensive dramatic license. Created by British screenwriter Deborah Davis, who co-wrote the 2018 period drama The Favourite, “Marie Antoinette” originally premiered in Europe in 2022. Featuring Emilia Schüle as the queen and Louis Cunningham as her hapless husband, Louis XVI, the show’s first season (one of three planned installments) covers roughly 1770 to 1781, beginning with Marie Antoinette’s journey to France and ending with the birth of her first son. In between these milestones, she struggles to win the affection of both her husband and her subjects, all while navigating the competing interests of her birth kingdom of Austria and her new home.

In keeping with the recent period drama trend of presenting historical figures and settings through a thoroughly modern lens (see “Bridgerton,” “The Great” and “The Serpent Queen”), “Marie Antoinette” offers a feminist take on the queen’s life. As Schüle told Variety last October, Marie Antoinette was a “rebel” who was “modern, emancipated, and fought for equality and for her personal freedom.”

Link to the rest at The Smithsonian Magazine

Orwell, Camus and truth

From The Critic:

A war still raged in Europe, but the enemy were firmly in retreat. The occupation of Paris had been broken, and France was free, and so were the cafés of the Boulevard St Germain. No longer did the waiters have to serve coffee to SS officers.

One afternoon in April 1945, a dishevelled Englishman walked into one such café. He was a war correspondent for the Observer — fond of shag-tobacco and Indian tea. His pen-name was George Orwell. 

Orwell was meeting Albert Camus, the distinguished writer and intellectual. But even so, I always imagine Orwell taking a seat indoors, among the pale, ornate woodwork, and feeling slightly out of place. Les Deux Magots, and the Café de Flore opposite, were frequented by a kind of intellectual of which Orwell often disapproved. That is, philosopher-types with communist sympathies: the likes of Jean-Paul Sartre, Simone de Beauvoir and Maurice Merleau-Ponty.

Orwell sat and waited, and waited, for Camus to arrive. He never turned up: he was laid up with an exacerbation of tuberculosis. They would never get the chance to meet again, and Orwell would die five years later, having lost his own battle with the same disease.

My admiration for both of these eminent writers developed in isolation from one another — but I have always unconsciously identified them as the same sort of writer, and indeed, the same sort of person. There are various superficial similarities: the TB diagnosis that prevented both of them from joining the armed forces, the foreign birth, the rampant womanising, the shared hatred of fascism and suspicion of communism. Much more importantly, they seemed to share the same outlook. Both of these writers took the view that truthfulness was more important than ideological allegiance and metaphysics, that the facts should be derived from the real world, rather than the world of ideas. They were similar stylistically too: both wrote candidly, clearly and prolifically.

Camus seemed to have shared my view. He said as much in a letter to his mistress, Maria Casarès, on the day of Orwell’s death in 1950.

Some bad news: George Orwell is dead. You don’t know him. A very talented English writer, with exactly the same experience as me (although ten years older) and exactly the same ideas. He fought tuberculosis for years. He was one of the very few men with whom I shared something.

For Camus to say that another writer had “exactly the same ideas”, and was “one of the very few men with whom I shared something” was no small thing. 

No correspondence between the two authors seems to exist. In fact, when I searched for personal links between them there was little to go on. But although my hunt for biographical evidence of a relationship was fruitless, the time I have since spent reading and comparing their work yields some rather more intriguing connections. 

Orwell’s best-known novel is undoubtedly Nineteen Eighty-Four. What’s remarkable about this novel — above virtually all other novels in English — is the number of words and expressions it has bequeathed to the English-speaking world. Perhaps this was Orwell’s greatest gift to mankind: an entire language through which to talk about the coming age of state-sponsored surveillance, fake news and post-truth politics in which we now live. When someone says a policy or a government’s behaviour is “Orwellian”, people know precisely what is meant.

One phrase from Nineteen Eighty-Four should be familiar to us all, even to those who might not have actually read the novel: 

There comes a time in history when the man who dares to say that two and two make four is punished with death.

Except of course, these words are not Orwell’s at all. This is a quote from Albert Camus’ novel La Peste, which was published two years before Nineteen Eighty-Four, in 1947. Of course, the formulation of two and two making five has a history that predates both Orwell and Camus, but Orwell used a very similar version of it as far back as 1939, in a review of a book by Bertrand Russell in Adelphi:

It is quite possible that we are descending into an age in which two plus two will make five when the Leader says so

The similarity between these lines is patent. Is it possible that Camus got the idea from Orwell’s article? Yes, but such things are nearly impossible to prove. Still, it is not important whether Camus was taking influence from Orwell’s writing (although an interesting possibility). What’s important about this example is that it exposes common ground. These quotes embody a foundational principle that united their work: a shared anxiety over the fragility of truth.  

. . . .

Both Camus and Orwell are rightly credited with being “antitotalitarian” writers. And yet their reasons for being so are not wholly political. They were antitotalitarian not just because they opposed totalitarian regimes, but because they both understood that the totalitarian mindset requires you to accept that truth comes from ideology. If the ideas say something is true, it becomes true, and is true. For Fascists and Communists, ideology is not merely a set of values or beliefs, but a cohesive explanation of the past, present and future of mankind. This is what Camus referred to in The Rebel as the desire “to make the earth a kingdom where man is God”. Orwell and Camus both understood the dangers of such thinking, and sought to repudiate it in their work.

Link to the rest at The Critic

Are science and religion fated to be adversaries?

From The Economist:

In the late 19th century two books on science and religion were published within a decade of each other. In “The Creed of Science” William Graham tried to reconcile new scientific ideas with faith. In 1881 Charles Darwin, by then an agnostic, told him: “You have expressed my inward conviction, though far more vividly and clearly than I could have done, that the Universe is not the result of chance.”

The other book made a much bigger splash. “History of the Conflict Between Religion and Science” by John William Draper was one of the first post-Darwinian tomes to advance the view that—as its title suggests—science and religion are strongly antithetical. Promoted hard by its publisher, the book went through 50 printings in America and 24 in Britain and was translated into at least ten languages. Draper’s bestseller told a story of antagonism that, ever since, has been the mainstream way to see this relationship.

In “Magisteria”, his illuminating new book, Nicholas Spencer claims that this framing, more recently espoused by Richard Dawkins and others, is misleading. For centuries, he says, science and religion have been “endlessly and fascinatingly entangled”. Even (or especially) those readers inclined to disagree with him will find his narrative refreshing.

Mr Spencer works at Theos, a religious think-tank in London, and is one of Britain’s most astute observers of religious affairs. Some conflict between science and religion is understandable, he argues, but not inevitable. He offers an engaging tour of the intersection of religious and scientific history: from ancient science in which “the divine was everywhere”, to the Abbasid caliphate in Baghdad in the ninth century and Maimonides, an illustrious Jewish thinker of the 12th—and onwards, eventually, to artificial intelligence. Now and again he launches salvoes against ideologues on both sides.

“Medieval science” is not an oxymoron, he writes. Nor is religious rationalism. In the 11th century Berengar of Tours held that “it is by his reason that man resembles God.” As religious dissent spread following the Reformation, Mr Spencer says, theology helped incubate modern science through the propagation of doubt about institutions and the cracking open of orthodoxies. For their part, an emergent tribe of naturalists strove, chisel and hammer in hand, to show that creation pointed towards a creator. Exploration of nature was itself a form of worship.

Mr Spencer insightfully revisits the dust-ups involving Galileo, Darwin and John Scopes (prosecuted in Tennessee in 1925 for teaching evolution). He traces the interaction of the two disciplines in often fascinating detail. Many pioneering scientists lived in times of religious and political strife and found in “natural philosophy”, as pre-modern science was known, a “ministry of reconciliation”. Thomas Sprat, dean of Westminster and biographer of the Royal Society, opined in 1667 that, in their experiments, men “may agree, or dissent, without faction, or fierceness”. That was not always true, as Isaac Newton’s spats with his peers showed. Still, says Mr Spencer, by supplying an arena for calmer debate that was beyond clerical control, “Science saved religion from itself.”

The roll call of scientists who were people of faith runs from Michael Faraday and James Clerk Maxwell to Gregor Mendel and Georges Lemaître, a Belgian priest who, on the basis of mathematical calculations, first proposed that the universe was expanding and therefore had a beginning. In 1933 Lemaître made what, for Mr Spencer, is a key observation: “Neither St Paul nor Moses had the slightest idea of relativity.” The writers of the Bible could see into “the question of salvation. On other questions they were as wise or as ignorant as their generation.” In other words, science and religion are not different attempts to do the same thing. Lemaître warned the pope against drawing any theological conclusions from his work on the cosmos.

Link to the rest at The Economist

The Battle for Your Brain

From The Wall Street Journal:

The fantastical events and strange worlds that our minds concoct as we sleep—our dreams—have long been understood as a mysterious force of creativity, emotional expression and even subconscious desire. But did you know they are a potentially lucrative site for marketing beer?

In “The Battle for Your Brain,” Nita Farahany explores a new era of neurotechnology in which ever more sophisticated devices, for all sorts of reasons, are attempting to discover exactly what we’re thinking and why. The possibilities are both practical and utopian, thrilling and disturbing.

Neurotech, as Ms. Farahany notes, promises a future where drivers never fall asleep at the wheel because their devices alert them to their fatigue; where people who suffer from conditions like epilepsy can be warned of an impending seizure; and where people with neural implants can move objects using only the power of their thoughts.

But there is more to it than that, of course. Ms. Farahany, a professor of philosophy and law at Duke University, takes readers on a tour of companies creating devices—headsets, electrode-enabled earbuds and hats—for tracking the signals that our brains emit. The goal is to decode the signals with software, turning the data into information about everything from our real-time emotions to our unconscious urges. Dream researchers have been approached by companies—including Xbox and Coors—eager to use their findings to pursue “dream incubation” marketing: that is, to use sleep sensor technology to monitor the times when, during sleep, your brain is most suggestible to prompts, such as the brand of beer you should prefer when awake.

In one experiment that Ms. Farahany describes, researchers were able to “steal” information from the brains of videogamers using a neural interface. The researchers “inserted subliminal images into the game and probed the players’ unconscious brains for reaction to stimuli—like postal addresses, bank details, or human faces.” By measuring the gamers’ responses, researchers were able to figure out one gamer’s PIN code for a credit card, no doubt opening up new vistas for future brain hackers.

In the here-and-now, however, wearable neurotech is already being used by employers to monitor their employees, enabling a far more granular level of surveillance than was possible before. Ms. Farahany argues that the power of new surveillance tools requires clearer rules about the technologies that serve a public interest—say, by monitoring brain fatigue in long-haul truckers—and those that invade a worker’s privacy, such as mandatory earbuds that measure mood and attention in the guise of promoting “wellness.”

When it comes to governments’ use of such tools, Ms. Farahany warns that a world where consumers embrace wearable neurotech is also one that could allow law enforcement and government agencies to harvest personal data—indeed, our very thoughts. Brain-computer interfaces currently under development by Meta and Elon Musk’s Neuralink, among others, promise to translate the activities of neurons into speech, effectively reading our minds. Should the government have access to those thoughts for the purposes of preventing crime? Are our thoughts considered part of our bodies, or can they be treated as something else?

Likewise, how much mental manipulation should we allow? We are already assaulted by constant advertising online that attempts to guide us to click away from whatever we are reading to purchase products. Is there a point beyond which such prompting and nudging should not go? Ms. Farahany quotes a dream researcher concerned that a lack of regulation might mean a future in which we “become instruments of passive, unconscious overnight advertising, with or without our permission.”

Link to the rest at The Wall Street Journal

BBC crisis escalates as players and stars rally behind soccer host Gary Lineker

From National Public Radio:

The BBC was forced to scrap much of its weekend sports programming as the network scrambled to stem an escalating crisis over its suspension of soccer host Gary Lineker for comments criticizing the British government’s new asylum policy.

As a growing number of English Premier League players and BBC presenters rallied to Lineker’s support and refused to appear on the airwaves on Saturday, Britain’s national broadcaster faced allegations of political bias and suppressing free speech, as well as praise from some Conservative politicians.

The broadcaster said it would air only “limited sport programming” this weekend after hosts of many of its popular sports shows declined to appear, in solidarity with Lineker. The former England captain was suspended from “Match of the Day,” a popular soccer highlights show, over a Twitter post that compared lawmakers’ language about migrants to that used in Nazi Germany.

British Prime Minister Rishi Sunak made his first comments on the storm, saying: “Gary Lineker was a great footballer and is a talented presenter. I hope that the current situation between Gary Lineker and the BBC can be resolved in a timely manner, but it is rightly a matter for them, not the government.”

Instead of blanket coverage on Saturday of the most popular league in the world, the BBC had no preview shows on radio or TV and no early evening summary of the final scores of Premier League games. Lunchtime TV program “Football Focus” was replaced with a rerun episode of antiques show “Bargain Hunt,” while early evening “Final Score” was swapped for “The Repair Shop.”

Soccer fans tuning in for “Match of the Day” — the late-night program that has been a British institution for 60 years — will be getting a 20-minute show instead of one typically lasting around an hour and a half. There will be no commentary on the matches and no studio punditry from some of the most high-profile stars in the British game who have chosen to support Lineker and not work.

There will not be any post-match player interviews, either. The Professional Footballers’ Association said some players wanted to boycott the show, and as a result “players involved in today’s games will not be asked to participate in interviews with ‘Match of The Day.'”

Link to the rest at National Public Radio

Americans may play on soccer teams in the US and elsewhere, but a great many of us don’t really understand the intense popularity of The Beautiful Game abroad.

That said, PG has always viewed the powers that be who control the BBC as more than a little poncey from time to time. Perhaps it’s because BBC programs in the US run primarily on educational channels, usually non-profits, and more often than not associated with a local college or university.

Plus, there’s no US analog to the British television license fee that Brits must pay to watch or record television on any channel. This means that the stations that carry BBC programs in the US tend to interrupt them with breaks to ask for money “to support good programming such as the show you’ve just been watching for ten minutes since our last pledge break,” sounding more than a little like poncey beggars as well.

Of course, in more than a few US universities, the annual salaries paid to the football coach and the basketball coach would fund the university’s public television activities for several years.

Inspiration for The Guidance Groove

From Women Writers, Women’s Books:

My inspiration for writing The Guidance Groove grew partially from my conversations with the undergraduate students I have the privilege of teaching and learning from as a conservation biology professor at the University of California, San Diego. The young people who attend UCSD are amazing—bright, motivated, hard-working, and the best of the best in a myriad of ways. However, so many of them come to my office hours speaking of their imposter syndrome, uncertainty, unhappiness, and fears. I noticed that the underlying themes of their stories were not that different from those I heard from people in other areas of my life, and I sought to understand why so many outstanding, brilliant, and shiny people had self-doubt, lacked contentment, and were unsettled. 

Publishing research papers is the currency for advancement within academia, so translating complicated scientific findings into simpler stories through writing and teaching has been part of my professional life for 20 years, making the use of words my most comfortable form of expression. Thus, I started writing down what I observed and experienced from my students and others and that process helped me discover potential reasons why we humans move through life with less than ideal levels of ease and contentment. Before long, a draft of The Guidance Groove was born.

The Guidance Groove is my first book and, even though the subject is wholly different from my research, throughout the process of its creation, I drew heavily from my scientific writing experience. As with a science paper, the book is logical, succinct, organized, and easy to flip through to find the parts that are most meaningful for the reader. Much like the figures and tables in a research article, the stories used to illustrate my ideas are contained within boxes, making it simple for readers to find the examples that will help them better understand why we adhere to what I call the Unproductive Grooves of inadequacy, obligation, scarcity, and unworthiness and what it feels like to be stuck in those grooves and escape them. 

The logical progression I describe above was invaluable for the creation of my book, but the writing of and the inspiration for The Guidance Groove has another component that is less tangible, more difficult to explain, and not particularly logical. The process involves finding, paying attention to, trusting, and translating the voice that comes from somewhere that urges us writers to string words together in the hopes that we can relay the message of that voice into something meaningful, useful, and wholly authentic for another human to experience. That is the unknown magic of artistic creation. 

I have known this authentic voice for my whole life, and I long ago learned to pay strong attention when I hear its whispers, murmurs, shouts, and calls. For my writing, I listen to and transcribe the wisdom from that voice. To hear the voice better, I consciously quiet the untrue thought patterns in my mind. Those falsehoods that were long ago programmed into me by my upbringing, society, and my own choices to believe the made-up stories that comprise the bulk of my thoughts. I let go of what my mind tells me I “should” do, say, or be, and instead invite my mysterious and wholly authentic voice to be louder, clearer, and more distinct.

Link to the rest at Women Writers, Women’s Books

PG is not in a position to judge whether today’s college students are more angsty than generations before or not.

He tries to avoid any old geezer attitudes that amount to “You think you have it tough, you have no idea what tough is if you didn’t go through the experiences I did when I was your age.”

He suspects that the days of old when college students just had fun and learned have never really existed for a significant portion of college students of any era. The golden glow observed through a rear-view mirror of distant pasts is probably self-generated rather than the way things actually felt during that period of being grown up without attaining true maturity.

Silicon Valley Bank Closed by Regulators, FDIC Takes Control

From The Wall Street Journal:

Silicon Valley Bank collapsed Friday in the second-biggest bank failure in U.S. history after a run on deposits doomed the tech-focused lender’s plans to raise fresh capital.

The Federal Deposit Insurance Corp. said it has taken control of the bank via a new entity it created called the Deposit Insurance National Bank of Santa Clara. All of the bank’s deposits have been transferred to the new bank, the regulator said.

Insured depositors will have access to their funds by Monday morning, the FDIC said. Depositors with funds exceeding insurance caps will get receivership certificates for their uninsured balances, meaning businesses with big deposits stuck at the bank are unlikely to get their money out soon.

The bank is the 16th largest in the U.S., with some $209 billion in assets as of Dec. 31, according to the Federal Reserve. It is by far the biggest bank to fail since the near collapse of the financial system in 2008, second only to the crisis-era collapse of Washington Mutual Inc.

The bank’s parent company, SVB Financial Group, was racing to find a buyer after scrapping a planned $2.25 billion share sale Friday morning. Regulators weren’t willing to wait. The California Department of Financial Protection and Innovation closed the bank Friday within hours and put it under the control of the FDIC.

Customers tried to withdraw $42 billion—about a quarter of the bank’s total deposits—on Thursday alone, the California regulator said in a filing Friday. The flood of withdrawals destroyed the bank’s finances; at close of business Thursday, it had a negative cash balance of nearly $1 billion and couldn’t cover its outgoing payments at the Fed, according to the filing.

The bank was in sound financial condition on Wednesday, the regulator said. A day later, it was insolvent.

SVB’s troubles have dragged down a wide swath of the industry. Investors dumped the shares of banks big and small on Thursday, shaving $52 billion off the value of the four largest U.S. banks alone. The megabanks recovered Friday but many of their smaller peers continued to plunge. Several were halted for volatility.

Link to the rest at The Wall Street Journal

The reason this is of note to visitors to TPV is that a great many Northern California tech companies were holding all or a lot of their money in this bank.

Making the next payroll could be a real problem for a whole lot of tech companies.

The FDIC (Federal Deposit Insurance Corporation) covers $250,000 of a customer’s loss when a bank fails, but some tech companies had billions of dollars of deposits in the bank. It’s estimated that roughly 95% of the bank’s deposits were uninsured, according to filings with the SEC.

We Are Electric

From The Wall Street Journal:

The science writer Sally Adee begins “We Are Electric” in a bullish mood, arguing that it’s time for researchers to focus on the electrome—the “electrical dimensions and properties of cells, the tissues they collaborate to form, and the electrical forces that are turning out to be involved in every aspect of life.” Once the secrets of the electrome are unlocked, Ms. Adee claims, we “should all be programmable at the level of the cell.”

The story begins during the Enlightenment, with the dispute between Luigi Galvani and Alessandro Volta over “animal electricity.” Ms. Adee takes us back to 1780, when Galvani set up a home laboratory complete with Leyden jars, electrostatic generators and a host of frogs cut into various grisly configurations. The author describes how a series of experiments with static electricity, lightning and brass hooks convinced Galvani that “the body is animated by a kind of electricity,” and how Volta—keen to “cement his reputation as a brilliant theorist”—attacked Galvani’s theory and buried it with a “world-changing instrument: the battery.” Despite Galvani’s elegant dissections, most electricians “didn’t care about a theory as long as it yielded a tool that helped them do better science,” Ms. Adee suggests. So when Volta demonstrated a device that for the first time produced a steady electric current, it was enough to win the argument, handing the field of electricity in living creatures over to quacks and charlatans for nearly a century.

The broad outlines of this tale, where bioelectrical pioneers struggle to gain recognition for their work but wind up “sidelined” by the scientific establishment, are repeated as Ms. Adee traces the study of bioelectricity over the next 250 years. The inventor of the electroencephalogram, Hans Berger, killed himself in 1941, in part over his despair at the ridicule he endured after introducing his machine in Germany in the 1920s. After Alan Hodgkin and Andrew Huxley discovered in 1952 that neurons fire by swapping sodium and potassium ions, James Watson and Francis Crick “stole the show” with their discovery of DNA, leaving bioelectricity “sidelined by a ‘bigger’ discovery once again.” Despite an experiment that in 2007 helped a man with a crushed spine walk again, Richard Borgens’s innovative oscillating field stimulator, we are told, was “blocked at every turn.”

. . . .

Ms. Adee looks forward to a future in which implants are made of organic material and dispense ions instead of electrons, allowing them to speak to the body “in its own language.” But some studies have proved challenging to replicate, and she admits that treatments are “an extremely long way from your doctor’s consulting room.” Understanding the human electrome well enough that we can manipulate it precisely will require huge trials to establish how these technologies interact with our bioelectricity, Ms. Adee continues, which raises the question: “Who is going to let you open their brain to get that data?”

. . . .

Ms. Adee writes as a reporter, but also an enthusiast who “ended up buying a brain stimulator” herself. It was through her experience with one such wearable device at a U.S. military training facility—where her brain was electrically stimulated from outside her skull, turning her from a novice marksman into a sharpshooter within hours—that her interest in the field was sparked. For the next few days, she writes, “life was so much easier. Who knew you could just, like, do stuff without first performing the elaborate dance of psychological self-recrimination?”

Link to the rest at The Wall Street Journal

PG wants a brain stimulator. He just searched for brain stimulator device and found lots of products on Amazon, including TENS muscle stimulators plus a lot of fishy-looking devices (including some that claimed to include “safety features”) that may or may not work the way the one described in the WSJ article seemed to.

Slavery in the Americas: Separating Fact from Fiction

From The Mises Institute:

The history of transatlantic slavery is riddled with fables and errors. Erroneous claims have been propagated in the media because history is currently perceived as a political project that must justify present sensibilities. History has become so politicized that rigorous research is unable to disabuse activists of inaccuracies. Due to the rampant politicization of academia, noted scholars are usually cajoled into apologizing for defending historical standards.

After chiding fellow scholars for projecting modern sensibilities onto historical realities, historian James H. Sweet was shamed into penning an apology. Sweet was ruthlessly demeaned by his colleagues for noting the fallacy of using the narratives of identity politics to interpret historical events. Because academics are so willing to genuflect to unhinged mobs, propaganda is becoming history, and instead of digesting hard historical truths, many are fed fabrications.

Link to the rest at The Mises Institute

Is History History?

PG trigger warning: PG will include excerpts from the original article that is the subject of this post.

If you go to the OP, you will see that the original piece now has a groveling apology from the author, who evidently is the President of the American Historical Association, apologizing for “the harm that it has caused” and for the OP foreclosing “this conversation for many members, causing harm to colleagues, the discipline, and the Association.” Further down the introductory apology, the author characterizes the piece as “my ham-fisted attempt at provocation” and invites anyone who would like an additional apology to contact him directly.

The author of the OP ends his apology by writing:

“I sincerely regret the way I have alienated some of my Black colleagues and friends. I am deeply sorry. In my clumsy efforts to draw attention to methodological flaws in teleological presentism, I left the impression that questions posed from absence, grief, memory, and resilience somehow matter less than those posed from positions of power. This absolutely is not true. It wasn’t my intention to leave that impression, but my provocation completely missed the mark.

Once again, I apologize for the damage I have caused to my fellow historians, the discipline, and the AHA. I hope to redeem myself in future conversations with you all. I’m listening and learning.”

From the original pre-groveling article in Perspectives on History, published by the American Historical Association:

Twenty years ago, in these pages, Lynn Hunt argued “against presentism.” She lamented historians’ declining interest in topics prior to the 20th century, as well as our increasing tendency to interpret the past through the lens of the present. Hunt warned that this rising presentism threatened to “put us out of business as historians.” If history was little more than “short-term . . . identity politics defined by present concerns,” wouldn’t students be better served by taking degrees in sociology, political science, or ethnic studies instead?

The discipline did not heed Hunt’s warning. From 2003 to 2013, the number of PhDs awarded to students working on topics post-1800, across all fields, rose 18 percent. Meanwhile, those working on pre-1800 topics declined by 4 percent. During this time, the Wall Street meltdown was followed by plummeting undergraduate enrollments in history courses and increased professional interest in the history of contemporary socioeconomic topics. Then came Obama, and Twitter, and Trump. As the discipline has become more focused on the 20th and 21st centuries, historical analyses are contained within an increasingly constrained temporality. Our interpretations of the recent past collapse into the familiar terms of contemporary debates, leaving little room for the innovative, counterintuitive interpretations.

This trend toward presentism is not confined to historians of the recent past; the entire discipline is lurching in this direction, including a shrinking minority working in premodern fields. If we don’t read the past through the prism of contemporary social justice issues—race, gender, sexuality, nationalism, capitalism—are we doing history that matters? This new history often ignores the values and mores of people in their own times, as well as change over time, neutralizing the expertise that separates historians from those in other disciplines. The allure of political relevance, facilitated by social and other media, encourages a predictable sameness of the present in the past. This sameness is ahistorical, a proposition that might be acceptable if it produced positive political results. But it doesn’t.

In many places, history suffuses everyday life as presentism; America is no exception. We suffer from an overabundance of history, not as method or analysis, but as anachronistic data points for the articulation of competing politics. The consequences of this new history are everywhere. I traveled to Ghana for two months this summer to research and write, and my first assignment was a critical response to The 1619 Project: A New Origin Story for a forthcoming forum in the American Historical Review. Whether or not historians believe that there is anything new in the New York Times project created by Nikole Hannah-Jones, The 1619 Project is a best-selling book that sits at the center of current controversies over how to teach American history. As journalism, the project is powerful and effective, but is it history?

This new history often ignores the values and mores of people in their own times.

When I first read the newspaper series that preceded the book, I thought of it as a synthesis of a tradition of Black nationalist historiography dating to the 19th century with Ta-Nehisi Coates’s recent call for reparations. The project spoke to the political moment, but I never thought of it primarily as a work of history. Ironically, it was professional historians’ engagement with the work that seemed to lend it historical legitimacy. Then the Pulitzer Center, in partnership with the Times, developed a secondary school curriculum around the project. Local school boards protested characterizations of Washington, Jefferson, and Madison as unpatriotic owners of “forced labor camps.” Conservative lawmakers decided that if this was the history of slavery being taught in schools, the topic shouldn’t be taught at all. For them, challenging the Founders’ position as timeless tribunes of liberty was “racially divisive.” At each of these junctures, history was a zero-sum game of heroes and villains viewed through the prism of contemporary racial identity. It was not an analysis of people’s ideas in their own time, nor a process of change over time.

In Ghana, I traveled to Elmina for a wedding. A small seaside fishing village, Elmina was home to one of the largest Atlantic slave-trading depots in West Africa. The morning after the wedding, a small group of us met for breakfast at the hotel. As we waited for several members of our party to show up, a group of African Americans began trickling into the breakfast bar. By the time they all gathered, more than a dozen members of the same family—three generations deep—pulled together the restaurant’s tables to dine. Sitting on the table in front of one of the elders was a dog-eared copy of The 1619 Project.

. . . .

Later that afternoon, my family and I toured Elmina Castle alongside several Ghanaians, a Dane, and a Jamaican family. Our guide gave a well-rehearsed tour geared toward African Americans. American influence was everywhere, from memorial plaques to wreaths and flowers left on the floors of the castle’s dungeons. Arguably, Elmina Castle is now as much an African American shrine as a Ghanaian archaeological or historical site. As I reflected on breakfast earlier that morning, I could only imagine the affirmation and bonding experienced by the large African American family—through the memorialization of ancestors lost to slavery at Elmina Castle, but also through the story of African American resilience, redemption, and the demand for reparations in The 1619 Project.

Yet as a historian of Africa and the African diaspora, I am troubled by the historical erasures and narrow politics that these narratives convey. Less than one percent of the Africans passing through Elmina arrived in North America. The vast majority went to Brazil and the Caribbean. Should the guide’s story differ for a tour with no African Americans? Likewise, would The 1619 Project tell a different history if it took into consideration that the shipboard kin of Jamestown’s “20. and odd” Africans also went to Mexico, Jamaica, and Bermuda? These are questions of historical interpretation, but present-day political ones follow: Do efforts to claim a usable African American past reify elements of American hegemony and exceptionalism such narratives aim to dismantle?

The Elmina tour guide claimed that “Ghanaians” sent their “servants” into chattel slavery unknowingly. The guide made no reference to warfare or Indigenous slavery, histories that interrupt assumptions of ancestral connection between modern-day Ghanaians and visitors from the diaspora. Similarly, the forthcoming film The Woman King seems to suggest that Dahomey’s female warriors and King Ghezo fought the European slave trade. In fact, they promoted it. Historically accurate rendering of Asante or Dahomean greed and enslavement apparently contradicts modern-day political imperatives.

Hollywood need not adhere to historians’ methods any more than journalists or tour guides, but bad history yields bad politics. The erasure of slave-trading African empires in the name of political unity is uncomfortably like right-wing conservative attempts to erase slavery from school curricula in the United States, also in the name of unity. These interpretations are two sides of the same coin. If history is only those stories from the past that confirm current political positions, all manner of political hacks can claim historical expertise.

This is not history; it is dilettantism.

Too many Americans have become accustomed to the idea of history as an evidentiary grab bag to articulate their political positions, a trend that can be seen in recent US Supreme Court decisions.

. . . .

Professional historians would do well to pay attention to Breyer’s admonition. The present has been creeping up on our discipline for a long time. Doing history with integrity requires us to interpret elements of the past not through the optics of the present but within the worlds of our historical actors. Historical questions often emanate out of present concerns, but the past interrupts, challenges, and contradicts the present in unpredictable ways. History is not a heuristic tool for the articulation of an ideal imagined future. Rather, it is a way to study the messy, uneven process of change over time. When we foreshorten or shape history to justify rather than inform contemporary political positions, we not only undermine the discipline but threaten its very integrity.

Link to the rest at Perspectives on History, published by the American Historical Association

PG is not an eminent historian, nor is he a member of the American Historical Association, but the original article excerpted above is consistent with other historical accounts of the slave trade that he has read. In short, the trade in African slaves relied upon the active cooperation of Africans themselves, who captured and enslaved their fellow Africans in preparation for selling them to the slave traders that would carry them across the ocean to the New World.

Additionally, in the New World at the time of active slave trading across the Atlantic, a significant number of captives were sold into slavery in nations and colonies other than the United States. The British government forbade slavery in Great Britain, but slave trading was practiced in more than one British colony during this period, and some British citizens made a great deal of money from such activities. The descendants of African slaves can be found today across Brazil and elsewhere in South America.

As PG has mentioned before, The 1619 Project is better understood as a 21st-century political polemic than as an accurate history of slavery in the United States and elsewhere in the New World.

Unfortunately, slavery has been widespread during different time periods in many, many places around the world. It was present during the classical period in ancient Greece and during the height of the Roman Empire. There was also slavery in Ancient Israel.

Three stories of collusion during the second world war

From The Economist:

When nations are licking the wounds of war and occupation, they tell and retell the stories of people who resisted heroically. Equally strong is the instinct to anathematise the accursed characters who colluded with the foe. Focusing on both extremes can be a way for ordinary folk to set aside their own behaviour, which was often somewhere in the middle. Yet even seemingly egregious collaborations can have complex motives and results.

That, broadly, is the theme that holds together these three stories of the second world war, told in intricate but fascinating detail by Ian Buruma, a prolific Dutch-born chronicler of modern times. All three of his subjects are elusive, tantalising targets because they were serial myth-makers and encouraged others to weave fantastical tales around them, leaving questions hanging in the air long after their lifetimes. All three grew up in contested environments where the ability to manipulate narratives seemed indispensable.

The one certain thing is that they co-operated with the Axis powers. Felix Kersten was the masseur and confidant of Heinrich Himmler, commander of the SS. Born in tsarist Estonia to Baltic Germans, he was naturalised in Finland and had to negotiate the complex inter-war contests over that country’s future. Friedrich Weinreb came from a modest Jewish family which in 1915 sought security in the sophistication of Vienna—but felt despised by the city’s more prosperous Jews, as well as threatened by the anti-Semitism that was already rising in central Europe. His family settled in the Netherlands.

The third subject, mostly known by her adopted name of Kawashima Yoshiko, was the daughter of a princely Chinese family. Cut adrift by the dynasty’s overthrow in 1912, she moved to Japan and found succour where she could. First she married a Mongolian; later she offered services, sexual and strategic, to a Japanese officer while herself keeping a Japanese woman in servitude. With a penchant for male uniform, she at one point commanded an equestrian army of ruffians for Manchuria’s Japanese occupiers. In Japan’s propaganda, she was a new Joan of Arc.

All three tried to turn vulnerability into power. As the only person who could ease Himmler’s aches and pains, Kersten later claimed that he used this cosy relationship to ward off some horrific possibilities—such as a plan to deport eastwards the entire Dutch population in 1941. As the book shows, the Nazis never had any such intention. But some assertions he made in self-defence have greater standing: for example, that by arranging a meeting between Himmler and a member of the World Jewish Congress in 1945, he saved the lives of many Jews still in Nazi captivity.

Weinreb’s deception was grosser. During the occupation of the Netherlands he took money from thousands of Jews by claiming, falsely, that he could use high-level German contacts to guarantee their escape. He would later maintain that he had kept people’s hopes alive as liberation loomed; an official investigation found his self-justifying arguments to be nonsense.

Yoshiko was captured by the nationalist Chinese government and executed in 1948. Yet by her own peculiar lights, she was not a traitor. Instead her service to the Japanese occupiers of Manchuria was an element in a wider, well-choreographed initiative to restore, at least partly, the fallen Chinese dynasty, which included the re-coronation of the ousted Emperor Puyi, albeit as a Japanese puppet.

Link to the rest at The Economist

Children’s Rights Roundup: ‘Buy Ukrainian Book Rights’

From Publishing Perspectives:

As Publishing Perspectives readers following our Rights Roundups know, the Federation of European Publishers in Brussels today (February 24) is marking the anniversary of Vladimir Putin’s unprovoked and savage invasion of Ukraine with a statement of support for the Voices of Children Foundation, which has published a book, War Through the Voices of Children. Proceeds of all sales go to special programs in psychological and emotional support for kids . . .

. . . .

As festive as the moment normally might be on the approach to Elena Pasoli’s Bologna Children’s Book Fair in its 60th edition, our world industry can do no less than to observe the fact that one year later, the greatest land war in Europe since World War II is going into advanced stages of combat severity and unthinkable casualties; now officially classified crimes against humanity; deepening military complexity and obligations; and—thankfully—intensifying international resolve against Putin and the Russian nation’s unspeakable aggression against the democratic sovereign state of Ukraine.

It’s meaningful to us today that the first of four calls to action described just below is “Buy rights to Ukrainian books.” We’re particularly mindful of this because in all the many book deals submitted to us for today’s Rights Roundup, not one title was a book originally written in Ukrainian or by a Ukrainian author.

The International Publishers Association (IPA) in Geneva is making a joint statement today with the federation, laying out the two organizations’ concerns about the damage to book publishing and education for Ukrainians under Putin’s violence, which opened on February 24, 2022.

From the IPA, we have these figures:

  • The number of publishers operating in Ukraine has dropped from 1,053 in 2021 to 563 in 2022
  • Educational publishers have been unable to print textbooks for pupils
  • UNICEF reports that more than 2,600 schools have been damaged, affecting 5.3 million children
  • UNESCO has verified damage to 241 cultural sites including museums and libraries

The United States’ secretary of state, Antony Blinken, told the United Nations’ Security Council this morning (February 24) that in the last year, Russian atrocities have included the bombing of more than 2,600 schools and the abduction of at least 6,000 Ukrainian children for relocation to Russia—“some as young as four months old.”

Link to the rest at Publishing Perspectives

Amazon has a donkey meat problem

Not actually to do with books or publishing, but PG couldn’t resist the title.

From Ars Technica:

When Cindy first tried the Artemisia Anti-Hemorrhage Formula dietary supplements that she purchased on Amazon, she had no reason to suspect that she was eating donkey. A California native and lifelong vegetarian, she assumed that the world’s largest online retailer had vetted the bottle’s claims of being made from “100 percent pure, natural herbs.” But while reading the back of the bottle, she noticed an ingredient she hadn’t seen before: “gelatina nigra.” She googled it, and what she found made her stomach turn.

Every year, millions of donkeys are slaughtered and skinned to make the so-called gelatina nigra found in Cindy’s dietary supplement. More commonly called “ejiao” or “donkey-hide gelatin,” the animal product is made from donkey skin. It’s in such high demand due to its alleged health benefits that it’s decimating the global donkey population and has led to increasingly brutal treatment of the animals, according to a 2019 report by the Donkey Sanctuary, an advocacy organization. A video the organization obtained shows workers in Tanzania bludgeoning donkeys with hammers to meet their slaughter quotas. “It’s not herbal. It’s literally made with donkeys,” says Cindy, who asked to go by only her first name for privacy reasons. “Why would Amazon sell something that cruel?”

While some retailers like Walmart and eBay have committed to drop products that contain ejiao, edible items containing this ingredient are widely for sale on Amazon in spite of multiple petitions asking that it stop selling them. A legal complaint filed in California last week by the law firm Evans & Page on behalf of the Center for Contemporary Equine Studies, a nonprofit, claims Amazon’s continued sale of these donkey-based products is more than distasteful—it may be illegal.

Link to the rest at Ars Technica

The Sensitive Question of Sensitivity Readers

From Publishers Weekly:

Under book publishing’s trending best practices, historical authenticity can be secondary to appeasing people’s sensitivities. I’m qualified to say this based on my recent experience as a literary agent on behalf of a client.

The events in question began happily: my client received a Big Five contract for a book about his time as a Marine sniper during the Vietnam War, when he was 17. The original manuscript (written with the assistance of a coauthor) told his story in the context of its time and place, including florid verbatim language and descriptions that wouldn’t be appropriate in other settings, then or now. Historical authenticity and truthfulness were the author’s priorities.

The manuscript passed the publisher’s editorial and legal protocols with relatively few revisions, and no additional hurdles were expected. In fairness, the editor’s good news email included a brief statement that the manuscript still needed to pass a so-called sensitivity read, but we weren’t told what that was or given any reason for concern. I had never heard of it and didn’t give it a second thought. Instead, I asked the editor to request the second advance payment due upon acceptance for publication. But my assumptions were wrong.

I’ve since learned that sensitivity reads are a recent and potentially powerful layer of scrutiny some books are subjected to. Evidently, they have been in use by some children’s publishers for several years. I don’t know which adult publishers may have adopted them, whether they are uniformly structured and empowered, or whether any written mission statements or guidelines exist. I can only write about my experience.

If properly conceived and used, sensitivity reads can be beneficial for all stakeholders, especially authors. Any manuscript can be potentially infected with inadvertently offensive content that serves no meaningful purpose. For instance, I represent many older backlist titles that contain language unacceptable by current standards but that, when written, seemed innocent. We make an effort to discover and rewrite those segments without distorting the (often deceased) author’s meaning. The key is trying to remain as true as possible to the author’s original intent.

Under the threat of having his book deal terminated, my client was forced to meaningfully modify his manuscript to accommodate a five-page document full of subjective complaints about how the Vietnam War was fought by the author and his co-combatants, the unfiltered descriptions of his horrific experiences, and the unsavory language used by the mostly very young men who were there on behalf of their country. The sensitivity review was written by one person. This person was hired by the publisher, and no information about their qualifications, or who might have reviewed their review, was provided. No appeals or rebuttals were allowed. My author reluctantly complied in full.

I actually agree with many of the sensitivity reader’s sentiments. Everything about that war was appalling. But why sanitize it? It should be depicted exactly as it happened. Following the publisher’s logic would be tantamount to transforming the Mỹ Lai Massacre into a misunderstanding with unpleasant consequences that shouldn’t be discussed because it’s too upsetting for some people.

I felt the publisher endowed the sensitivity reader’s report with the unilateral power to censor my client’s book, which raises serious questions. How are sensitivity readers recruited and what qualifies them? Are their personal views and experiences taken into account? More problematically, how can a person’s feelings qualify as objective or open-minded? How is it possible to oppose a person’s feelings without at least partially invalidating them? Should the need for accuracy be enmeshed with feelings? What outcomes are publishers looking for?

Link to the rest at Publishers Weekly

PG wonders if anyone has been dinged by a sensitivity reader on KDP.

Being far out of any sensitivity loop, PG did some quick and dirty research. He discovered that a huge number of traditional news publications and television networks went crazy over the insensitivity of a three-second GIF online advertisement for Dove.


If Nietzsche Were a Narwhal 

From The Guardian:

Friedrich Nietzsche claimed that humankind was “a fantastic animal that has to fulfil one more condition of existence than any other animal”: we have to know why we exist. Justin Gregg, a researcher into animal behaviour and cognition, agrees, describing humankind as “the why specialists” of the natural world. Our need to know the reasons behind the things we see and feel distinguishes us from other animals, who make effective decisions without ever asking why the world is as it is.

Evidence of this unique aspect of our intelligence first appeared 44,000 years ago in cave paintings of half-human, half-animal figures, supernatural beings that suggest we were asking religious questions: “Why am I here? And why do I have to die?” Twenty thousand years later, we began planting crops, revealing an awareness of cause and effect – an understanding of how seeds germinate and what to do to keep them alive. Ever since then, our constant questioning of natural phenomena has led to great discoveries, from astronomy to evolution.

But rather than being our crowning glory as a species, is it possible that human intelligence is in fact a liability, the source of our existential angst and increasingly apparent talent for self-destruction? This is the question Gregg sets out to answer in his entertaining and original book.

The delightfully absurd title stems from his claim that the 19th-century German philosopher, who had depression and eventually dementia, was “the quintessential example of how too much profundity can literally break your brain”. The “soul-tortured Nietzsche”, who sought meaning in suffering, is an example of how, as a species, we are simply too smart for our own good. By contrast, the narwhal (“one of my favourite marine animals”) demonstrates the fact that, from an evolutionary perspective, intelligence and complex thought are often a hindrance: “The absurdity of a narwhal experiencing an existential crisis is the key to understanding everything that is wrong about human thinking, and everything that is right about animal thinking.”

In search of evidence to support this theory, Gregg explores the nature of intelligence. Although non-human animals may have simpler minds than us, they are no less successful in their own way than we are, and do far less harm to their fellow beings: “The Earth is bursting with animal species that have hit on solutions for how to live a good life in ways that put the human species to shame.”

Link to the rest at The Guardian

The Enlightenment as reading project

From The Critic:

“What is Enlightenment?” Kant’s 1784 answer to the question, that the Enlightenment “dares to know”, is famous and paradoxical. Enlightenment, he maintains, requires freedom of the press, but only an authoritarian regime can allow unchecked debate: thus the Prussia of Frederick the Great is the only state where people can “dare to know”.

Above all, the freedom that matters to Kant is the freedom to question religious authorities on matters of faith. Kant’s Enlightenment thus strengthens secular authority in order to create the space required for a critique of religious authority. Enlightenment is inseparable from despotism.

Gary Kates’s important book offers a new answer to Kant’s question by interrogating a dozen 18th-century bestsellers, from Fénelon to Smith. The importance of the book can best be conveyed by comparing it to the work of two scholars who made their reputations in the 1970s: Robert Darnton and Quentin Skinner. Darnton mined the records of the Société typographique de Neuchâtel (STN) to understand the trade in illegal books from the inside. Out of this he wrote a fine book, The Forbidden Best-Sellers of pre-Revolutionary France (1995).

. . . .

I remember, many years ago, sitting mystified through a seminar where the leading lights of English philosophy gathered in Oxford to discuss John Rawls’s newly published Theory of Justice. There was radical disagreement as to what Rawls meant to say. The answer seemed to me (a young Skinnerian) obvious: why not phone him up and ask him?

Kates isn’t interested in getting Rousseau on the phone. Perhaps the most remarkable achievement of this book is to bring out how readings of these bestsellers differed between radicals and conservatives, Christians and pagans, shifting over time. Some of these books were obviously written to be open to divergent interpretations (Montesquieu’s Spirit of the Laws, for example), but in the case of Letters of a Peruvian Woman publishers happily published revised texts giving the story the ending that readers wanted.

The result of Kates’s trinitarian approach is that his book is not a history of ideas, nor book history, nor cultural history, nor a study in reception. It is, in parts, all of these, but much more than the sum of its parts. After all, authors and publishers are also readers; and readers, as they write letters and compile commonplace books, are also authors, not to mention the source of publishers’ profits. Only a trinitarian approach can grasp the complexity of the book as written, printed and read. I hope no future historian of ideas will write about a book printed before the Industrial Revolution without asking how many copies were printed, how much they cost and who actually owned them.

I think Kates’s fundamental perception, that the Enlightenment can be thought of as a programme of reading a set of books that writers and readers had in common, is sound. The figure who emerges as the most important representative of the Enlightenment on this approach is Montesquieu: his Persian Letters are presented as the model for Voltaire’s Philosophical Letters, Richardson’s Pamela and Graffigny’s Letters of a Peruvian Woman, whilst The Spirit of the Laws is taken to lie at the origin of Smith’s Wealth of Nations, Rousseau’s Emile and Raynal’s History of the Two Indies. This is not Kant’s Enlightenment: none of these authors favour despotism.

Link to the rest at The Critic

The Mistress of Slang

From The Chronicle of Higher Education:

In 2020, on Perry Street in Manhattan’s West Village, there lived a woman named Madeline Kripke, and her books. Kripke was 76, and she had been collecting dictionaries, and books about dictionaries, most of her life, almost since her parents gave her Webster’s Collegiate when she was 10.

Kripke was not a collector like you or I would be. Dictionaries lined not only the shelves she had specially built for them but every surface in her sizable two-bedroom apartment. Drawers were pulled out to make more surfaces on which to stack books, which also lay atop the refrigerator and on her bed. Books stood in towers along the floor, with narrow passageways to ease through. “It’s the biggest collection of dictionaries, period,” said the lexicographer Jesse Sheidlower, author of The F-Word, a history of that verb. Sheidlower is one of a cohort of lexicographers who knew Kripke and used her books, and her knowledge, to inspire their own work. Of her collection, “it’s better than what’s in the Bodleian and the NYPL combined,” he said, referring to libraries at the University of Oxford and in New York City.

Kripke wasn’t only a collector. She read dictionaries and compared them. She knew what her 20,000 volumes contained, and she loved sharing that with people who cared about what she knew. Along with her apartment, she had at least two Manhattan warehouses, each with “more stuff in it than probably any slang collection anywhere else in the country,” said Tom Dalzell, co-editor of The New Partridge Dictionary of Slang and Unconventional English. She had a nose for finding obscure titles and dictionary memorabilia, like correspondence between two Merriam brothers about how to buy the rights to a dictionary from the estate of a guy named Webster. And she was a good businesswoman: Rare-book collectors would be interested in something and approach Sotheby’s, and “Madeline would have it before anyone knew it was there,” said Sheidlower.

Link to the rest at The Chronicle of Higher Education

I Learned I Was Pregnant Right After Publishing An Essay About Not Having Kids

From Electric Lit:

Last December, with some hesitation, I posted a personal essay I’d written for Racquet Magazine on Facebook, Twitter, and Instagram. The piece examined why Serena’s retirement from professional tennis, in order to have another child, had prompted an existential crisis for me. Serena and I are both 41, and her sadness around the word “retirement” echoed my own sadness around the word “motherhood.” While I came to no firm conclusions, I ended the essay suggesting that my husband and I would likely not have children, given my age and our ambivalence, despite family and social pressures to reproduce. 

One week after posting the article, I found out I was pregnant. 

I had initially written the piece in early August, when Alejandro and I were still in the will-we-won’t-we throes of removing my IUD and playing pregnancy roulette. Writing the piece felt like finding solid ground after trudging through a steaming, buggy, couples-therapy swamp. After finishing it, I knew I would be okay without kids, even if a twinge of sadness remained. Still, in September, that very twinge led me to remove the IUD and roll the dice. We decided we would try for six months, a year at most, and then pat ourselves on the backs. 

When I posted the article, a week before Christmas, I was smarting from all the photos of young families on Facebook, and what I knew would be the inevitable round of questions about my childbearing desires at holiday parties. I captioned each post as a semi-manifesto. On Instagram, I wrote “As a childless woman at 41, I’m constantly fielding inappropriate questions from total strangers about having kids. It was healing to find my truth on the page instead of stammering something at a cocktail party.” On Facebook: “There’s a lot of pressure on women to not just have kids but to unequivocally want to be a mother. In this time of holiday cards and families posing together, I’m sending love out to those women who are on the fence and whose photos might look way different.” And on Twitter, I leaned even more provocative: “When a woman feels new life stirring inside, does it always have to be a baby?”

I felt like I had made some sort of declaration, that I had finally side-stepped the pitying looks from mothers when I said I didn’t have kids. But what had I declared, exactly? By claiming temporary childlessness (which is so often treated as temporary insanity), I had simply admitted that I didn’t know what I wanted, but I was tired of feeling ashamed. In the days after posting, I basked in the glow of my friends’ praise and congratulations, for another creature I had birthed: my essay. I made plans with an acquaintance I met in Spanish lessons to grab a drink in the new year and talk more about the subject. She, too, had huge doubts about having kids, even though she was ten years younger, and had been relieved to find companionship in my essay.

But holding the positive pregnancy test in the bathroom a few days later, even before going down to tell Alejandro, among the waves of excitement, fear, dread, and joy, I felt that old stand-by, shame. I had just gone on the record as (probably) not having kids. Now I was switching sides? And, indeed, the first person we told, Ale’s sister, after shrieking and congratulating him, asked: But what about the article?

What about the article? 

Link to the rest at Electric Lit

Outsmart Your Brain

From The Wall Street Journal:

Much has been made of ChatGPT, the artificial-intelligence algorithm, and its potential to disrupt education. We have already seen that it can write college essays and take graduate exams with alarming aptitude. At the very least, we’ll need new guards against students who rely on ChatGPT to cheat. One might even ask why we should bother to continue teaching pupils to remember and write things at all, given this new tool. But, for the foreseeable future, the human brain will have its advantages, and we still need practice loading our minds with content and extracting their insights. School will survive.

And while AI has its fancy feats, the old noggin still has a few tricks up its skull, if we can suss them out. Unfortunately, formal education offers little by way of a user’s guide. Many students flail around as they search for their own ways to organize knowledge and get through the curriculum—or not. Fortunately, Daniel Willingham, a psychologist at the University of Virginia, helps fill that gap with his charming and practical new book, “Outsmart Your Brain: Why Learning Is Hard and How You Can Make It Easy.”

There’s something counterintuitive about the notion that the brain needs a user’s guide. What could the mind know better than itself? The fact that even those who study cognition need such a guide is all the more surprising. But there’s knowing the science, there’s knowing how to apply it—and then there’s knowing how to get yourself to apply it. It’s not a straight shot. As a college student of cognitive neuroscience, I still relied all too often on all-nighters, an established recipe for anxiety and underperformance. In his personable manner, Mr. Willingham recalls: “Until graduate school, my time management system was a mix of writing things on my hand, apologies, and excuses.”

His book is aimed most directly at high-school and college students—people in the world of lectures, exams and lab projects. But much of the advice also applies more widely to anyone who must learn by listening or reading, work in groups, assess what they know, or get anything done at all. It’s a course on learning how to learn and on metacognition in general—how to think about how you think.

Mr. Willingham structures his book cogently. There are 14 chapters covering how to understand a lecture, read a book, take and organize notes, study for a test, take a test, learn from a test performance and so on. Each has several tersely articulated tips, 94 in all, along with bolded text, bullet points and “in a sentence” takeaways. You can skip around. Chapters also include advice for instructors.

Some tips may seem obvious, others less so. But even the obvious-seeming ones merit close study. Consider the tip, which in this case acts as a metatip, “Be Clear About What It Means to ‘Know’ Something.” When someone tells you something and it makes sense, Mr. Willingham explains, that doesn’t mean you “know” it. You still have to explain it and put it to use.

Of the less obvious variety, Mr. Willingham counsels that although rereading notes and books is among the most common strategies for studying—it quickly renders the material familiar and therefore feels productive—it is among the least effective. Better to prepare study questions and try to answer them. Memory retrieval feels hard and useless at first, like trying and failing to do a push-up, but that’s precisely why it works. You must build new muscle.

Link to the rest at The Wall Street Journal

Of Age

From The Wall Street Journal:

Before her abolitionist novel “Uncle Tom’s Cabin” appeared as a serial in 1851, Harriet Beecher Stowe made her young children sob by reading aloud the passage that describes Uncle Tom being beaten to death. Today we might be shocked that a mother would risk traumatizing her tots with such upsetting stuff, but Stowe was a woman of the early 19th century, and back then people believed that children benefited from exposure to grisly material. The realm between childhood and adulthood was hazier: Children were expected to pull their weight, boys worked alongside men, and though parents were concerned for their children’s physical well-being, they seem not to have worried overmuch about emotional harm.

Thus in the antebellum years parents didn’t think it wrong for juvenile books and school primers to juxtapose wholesome moral messages with depictions of graphic violence. Once the Civil War broke out, books for even very small readers might show gory details. An abecedary titled “The Union ABC,” for instance, featured bloody battle scenes and a hanged man (T is for Traitor), while the letter Y stood for a Youth bound for soldiering. Preparing boys for battle, such material “steeped them in martial exploits,” Frances M. Clarke and Rebecca Jo Plant write in “Of Age,” a book that explores the phenomenon of mass youth enlistment during the Civil War.

Ms. Clarke, who teaches history at the University of Sydney in Australia, and Ms. Plant, a history professor at the University of California, San Diego, make important claims in this excellent account, which is refreshingly clear of agonized caution and formulaic wokishness. According to their research, historians have been wrong to think that boys and youths constituted a mere 1.6% of the Union army. “When soldiers’ reported ages can be checked against census records and other sources,” the authors explain, “it becomes clear that military records mask an epidemic of lying.” By their reckoning, the proportion is more like 10%, amounting to some 200,000 minors. Most were 16- or 17-year-olds, but a substantial subset of roughly 40,000 boys joined at 15 or younger. Young men served in the Confederacy as well, but there was intense resistance in the rebel states to the enlistment of boys under 18, lest the “seed corn” of the next generation of white males be depleted.

Youngsters who ran off to join the army without permission often became drummers or musicians, freeing adult men for active service while fulfilling valuable functions. In those days, the authors remind us, “almost all actions that soldiers performed took place to the sound of a drum, fife, or bugle, and every regiment was accompanied by musicians.” Musicians buoyed morale and conveyed battle commands. More important, they “led armies on the march, determining the speed and, therefore, the distance that a regiment could travel.”

The Civil War drummer boy was a trope even in his time, the winsome hero of lithographs and ballads, but he was not, Ms. Clarke and Ms. Plant reveal, a figure equally lionized on either side of the conflict. The valiant boy-soldier was primarily a figure of the North. “Pure of heart and ardently committed to the Union war effort, he led troops into battle and died without regret, exhibiting the kind of stalwart patriotism that adults were supposed to emulate.” The symbol of a martyred child, borrowed from the republican iconography of the French Revolution, didn’t suit the purposes of the Confederacy. “At the core of Confederate nationalism were notions of tradition, bloodline, and vertical order,” the authors write. “Confederate leaders emphasized a vision of family life and social order in which slaves, wives, and children existed in an organic hierarchy overseen by benevolent patriarchs.” The authors dryly add: “Celebrating the agency of the young undermined this agenda.”

Parents in the North didn’t necessarily celebrate the agency of the young either, but for different reasons. Mothers and fathers loved their boys, of course, but also relied on their labor, which by law they owned until a boy turned 21. While young males had done militia duty since Revolutionary times, antebellum Americans were still largely hostile to the notion of a standing army and were aggrieved to have their sons in it. Worst of all, once a boy lied his way into the service, parents found it hard to get him out again.

The army was supposed to release minors but often wouldn’t: They were too numerous and too useful. Agitated parents clogged state and local court dockets with their appeals, consuming so much official attention that, according to Ms. Clarke and Ms. Plant, “they were one of the main grounds for the nationwide suspension of habeas corpus.” For the armchair historian, this last claim is perhaps the book’s most startling. Most of us associate the suspension of this ancient legal right with President Lincoln’s desire to suppress both rebellious speech and his political enemies. But, we learn, “most habeas petitioners in the loyal states were actually parents seeking the discharge of underage enlistees.”

Link to the rest at The Wall Street Journal

If you search in the Library of Congress photo collection, you’ll find lots of photos and other images of Civil War drummer boys.

Taylor, young drummer boy for 78th Colored Troops (USCT) Infantry

Can Literary Scholars Transcend Their Training?

From The Chronicle of Higher Education:

Every semester, thousands of American literary scholars concoct new interpretations of works of literature and new arguments about literary studies itself. They assume, pretend, hope, or dream that their words carry the revolutionary force of radical policy reform. They believe that literary studies done right — like defunding the police or dismantling systemic racism — shall topple what needs toppling. Their criticism will help overthrow the ideological status quo of proto-fascist neoliberal states like the United States. It’s a curious overestimation of muscle for a discipline whose landmarks include Don Quixote and Madame Bovary — novels about people who confuse books with life.

This argument (minus the Cervantes and Flaubert) is the damning linchpin of John Guillory’s new state-of-the-field collection of essays, Professing Criticism. Over the last few decades, Guillory, an English professor at New York University, observes, “the discipline and its institutional structures, especially the curriculum,” have been reimagined as something they’re not — “as surrogates for the social totality.” Battle a book, the thinking goes, and you battle the truth the book reflects; “the curriculum becomes the site of a proxy war.” Since literature professors constitute their own best audience, the echo chambers roar with system-dismantling interventions that dismantle nothing so definitely as the discipline itself. We “must settle for the declaration rather than the realization” of our “critical motives,” Guillory observes. These motives are “a kind of imaginary fiat, imputing to even the most recondite scholarship the capacity to function as a criticism of society, an Archimedean lever.” Archimedes without a place to stand is a freak with a monstrous prop wobbling high above his head. That, in fact, is what we look like on campus.

To those in the field — and to those who read The Chronicle — this argument will be familiar. The novelty is that Guillory is a senior scholar and major figure in the field with impeccable left-wing bona fides — and that he offers a historically profound account of the straits. Guillory surveys trends going back to the Greeks and does so with a particular focus on the last four centuries. Reading Professing Criticism is like taking a familiar hike with an 18-foot-tall friend who sees not only the hills but also the hills beyond them, and the ones beyond those.

Guillory describes our delusions in language borrowed not from literature but from social theory. (This accords with his established practice of handling the profession sociologically; his Cultural Capital, from 1993, helped direct a generation of graduate students to the work of Pierre Bourdieu.) To master something, Nietzsche argued, is to deform yourself in its direction. In modern academe, groups of people tighten the rules by which they deform themselves, which makes it even worse. Thorstein Veblen called the phenomenon “trained incapacity,” and John Dewey, “occupational psychosis.” “Professional deformation,” Guillory writes, “is an unavoidable byproduct of the assertion of that autonomy enabling the cultivation of professional expertise to begin with and that insulates such expertise to some extent from the tyranny of the market and from the draconian intervention of the political system.” Professors achieve power internal to the university by cutting themselves off from the external world.

The problem for literary studies is that throughout its institutionalization it has never ceded its dreams of external sway. Before there was a discipline, there was a reading public, and that public remains the ghost clientele of today’s professors. The 17th century gave birth to the influential man (and occasional woman) who made a living by commenting in fine prose on everything under the sun. The 18th century refined the type, and the 19th century vaunted it. What began in the court of Louis XIV as highbrow gossip got written down by John Dryden as serious literary criticism, broadened into taste-cultivating generalizing by Addison and Steele, heaved to the summits of philosophy by Coleridge and Carlyle, and resolved into politics by Matthew Arnold. It still dazzles the ambition of lots of graduate students and professors. We cherish the notion that our literary opinions could carry the force of fact.

Unfortunately, literary opinions carry such force only for people who believe in literature. The old lineage, of course, did believe, and so did its original audience. Twentieth-century scions like T.S. Eliot and the New Critics were believers too, and they helped conceive modern English departments. But these forebears of the discipline were largely conservative. And the graduate students and professors dazzled by them today are not conservative. They are reading Pierre Bourdieu.

Guillory shares the politics of the scholars he chastises. But Professing Criticism is full of arguments that would sound conservative coming from anybody else. Translate them into firmer, simpler language, and you would sound like the enemy — like William Bennett or Allan Bloom. Guillory sees that literary scholars today can’t make a convincing external case for what they do. Rather, they are justified by a faith that only their own ranks share. The future of the discipline cannot possibly lie in its longstanding consensus that “taste” and “judgment” and “standards” are simply the heartless weapons of a mystified right or the silly pretensions of snobs out of step with history. It might even lie in taste and judgment and standards.

The past, for Guillory, is not simply a maelstrom of benighted terror (though it is that, too) but the place where the best practices of people who love language thrived. Since the rise of industrialism, language use has narrowed cripplingly, and scholars can’t regain literary power without regaining intellectual breadth.

Link to the rest at The Chronicle of Higher Education

PG is reminded of a 1765 quote from Samuel Johnson:

It is not easy to discover from what cause the acrimony of a scholiast can naturally proceed. The subjects to be discussed by him are of very small importance; they involve neither property nor liberty; nor favour the interest of sect or party. The various readings of copies, and different interpretations of a passage, seem to be questions that might exercise the wit, without engaging the passions.

But whether it be, that small things make mean men proud, and vanity catches small occasions; or that all contrariety of opinion, even in those that can defend it no longer, makes proud men angry; there is often found in commentaries a spontaneous strain of invective and contempt, more eager and venomous than is vented by the most furious controvertist in politicks against those whom he is hired to defame.

PG Note: A “Scholiast” is a commentator on ancient or classical literature. In other words, a scholar.

Surveillance and The Loneliness of the Long-distance Trucker

From The New Yorker:

In 2011, Karen Levy, a doctoral candidate in Princeton’s sociology department, spent the summer as a research intern at Intel’s offices near Portland, Oregon. Her official remit was fuzzy and open-ended, but the company had at one point emphasized its resolve to find use cases for its chips in vehicles. Levy hadn’t thought much about vehicles per se, but her mixed academic background—she was also trained as a lawyer—predisposed her to reflect on situations that dramatized the peculiar relationship between formal codes (the realm of the law) and practical expediency (the realm of the ethnographer). The road, it occurred to her, was the site of our most common and thoroughgoing encounter with rules; it was also the scene of our most routine and matter-of-fact disregard for them. Take, as an example, jaywalking. It remains technically criminal in many places, but the enforcement of the prohibition is typically neither expected nor desired. Levy’s work is often about the wiggle room that makes social life possible. As she put it to me recently, “What do we really mean when we say a rule is a rule? When do we not mean it?”

While in Oregon, Levy happened to hear an NPR segment about new restrictions on the wiggle room afforded to long-haul truckers. Since the nineteen-thirties, truckers had been reasonably encumbered by restrictions on the number of hours they were allowed to work. These regulations relied upon self-reports manually inscribed in paper logbooks, which truckers were obligated to provide upon inspection. These logbooks, however, were easily falsified; at the end of the day, or at the end of a trip, the trucker retrofitted his journey to accommodate the law. This was an open secret: truckers called them coloring books, or even swindle sheets. Road safety, however, was a real issue. For decades, regulators had debated the introduction of electronic logs—tamperproof devices, hardwired to trucks’ engines, that could digitally track the time truckers spent behind the wheel. Truckers were, to put it gently, resistant to the idea. Long-haul trucking is not a good job (it’s poorly paid, lonely, bad for your health, and dangerous), but at the very least it was compensated by access to mythological status: truckers, as captains of their own ships, enjoyed the freedom and romance of the open road. Trucking was a vocation for the stubborn. By 2012, a federal mandate was a fait accompli, and, even if the trappings of autonomy had always been more symbolic than material, the deployment of digital trackers was received as a status insult.

Later that week, Levy took public transit to Jubitz, a “nice, big truck stop” near the Washington border, to see what it felt like to strike up unsolicited conversations with truckers and get the lay of the land. Levy, whose prose and conversation are starry with exclamatory asides, told me, “I went up to people at the bar, and it was really fun! Truckers turned out to be really forthcoming—they have lots of stories nobody asks them to tell. These days, we talk about ‘essential workers’ all the time, but nobody likes them or thinks positively of them—despite the fact that, as they like to say, ‘if you bought it, we brought it.’ ” When she returned to Princeton that fall, she told her adviser, Paul DiMaggio, that she’d become enmeshed in the tribulations of truckers. DiMaggio is extremely well regarded as a sociologist—his landmark 1983 article “The Iron Cage Revisited,” on the bureaucratization of the professions, is one of the field’s all-time most cited papers—but, in a previous life, he had been an aspiring songwriter on the Nashville scene, and frequented honky-tonks in the nineteen-seventies. He not only supported the project but promptly set her up with a trucker playlist—including Dick Curless’s “A Tombstone Every Mile” and Dave Dudley’s “Six Days on the Road.” (Many classics of the genre have an air of dark prophecy; among Levy’s favorites is Ronnie Milsap’s “Prisoner of the Highway.”)

Levy went on to visit truckers in eleven states: “The nice thing about truckers is you can find them anywhere, and if one place isn’t great you can go down the road to the next truck stop and see who’s there.” Levy grew up not far from Indianapolis, and at first she looked for men in Colts jerseys; as an invitation to expound on their expertise, she sometimes asked them how they’d get from, say, Portland to West Lafayette, Indiana, which they could invariably answer off the top of their heads. Her initial encounters did not go all that well. She told me, “I was an idiot. I literally didn’t understand what people were saying—what words were coming out of their mouths. There’s all of this lingo—‘reefer,’ ‘chicken coop,’ ‘reset your seventy.’ I went home and bought a CB slang dictionary on eBay, and learned that a ‘reefer’ is a refrigerated truck, a ‘chicken coop’ is an inspection station, and ‘resetting your seventy’ means restarting your weekly time clock with a thirty-four-hour break.” She continued, “My conversations were not that useful at first except that it was all interesting, and then, of course, you pick it up—subscribing to all these newsletters, reading the trade press, and now, more than eleven years later, I still read that stuff. I listen to ‘Road Dog Trucking,’ a satellite-radio channel that hosts call-in shows for trucking professionals.” In the past few months, those shows have invited her to appear as a guest.

Levy’s splendid new book, “Data Driven: Truckers, Technology, and the New Workplace Surveillance,” is a rigorous and surprisingly entertaining ethnographic portrait of a profession in transition. Although truckers have always been technologically savvy subjects—they were early adopters of such new technologies as CB radio—they now had to grow accustomed to life as its object. When she began her field work, electronic logging devices—E.L.D.s—were a looming threat on the horizon. In 2017, they became a legal requirement, but their industrial applications have gone well beyond the basic federal mandate. Trucking companies realized that they could enhance these devices to do things such as track fuel efficiency in real time. In one sense, this was an old story: strict managerial oversight in the service of productive rationalization was a hallmark of the Industrial Revolution. In another, however, the extension of such scrutiny to the fundamentally antinomian culture of trucking was a relevant novelty. With the pandemic, remote workplace surveillance of the otherwise aloof has become an increasingly common intrusion. Truckers, as she said in an interview with the trucker show “Land Line Now,” were “the canaries in the coal mine.”

Link to the rest at The New Yorker

Some people believe labor-saving technological change is bad for the workers

Some people believe labor-saving technological change is bad for the workers because it throws them out of work. This is the Luddite fallacy, one of the silliest ideas to ever come along in the long tradition of silly ideas in economics. Seeing why it’s silly is a good way to illustrate further Solow’s logic.

The original Luddites were hosiery and lace workers in Nottingham, England, in 1811. They smashed knitting machines that embodied new labor-saving technology as a protest against unemployment (theirs), publicizing their actions in circulars mysteriously signed “King Ludd.” Smashing machines was understandable protection of self-interest for the hosiery workers. They had skills specific to the old technology and knew their skills would not be worth much with the new technology. English government officials, after careful study, addressed the Luddites’ concern by hanging fourteen of them in January 1813.

The intellectual silliness came later, when some thinkers generalized the Luddites’ plight into the Luddite fallacy: that an economy-wide technical breakthrough enabling production of the same amount of goods with fewer workers will result in an economy with – fewer workers. Somehow it never occurs to believers in Luddism that there’s another alternative: produce more goods with the same number of workers. Labor-saving technology is another term for output-per-worker-increasing technology. All of the incentives of a market economy point toward increasing investment and output rather than decreasing employment; otherwise some extremely dumb factory owners are forgoing profit opportunities. With more output for the same number of workers, there is more income for each worker.

Of course, there could very well be some unemployment of workers who know only the old technology – like the original Luddites – and this unemployment will be excruciating to its victims. But workers as a whole are better off with more powerful output-producing technology available to them. Luddites confuse the shift of employment from old to new technologies with an overall decline in employment. The former happens; the latter doesn’t. Economies experiencing technical progress, like Germany, the United Kingdom, and the United States, do not show any long-run trend toward increasing unemployment; they do show a long-run trend toward increasing income per worker.

Solow’s logic had made clear that labor-saving technical advance was the only way that output per worker could keep increasing in the long run. The neo-Luddites, with unintentional irony, denigrate the only way that workers’ incomes can keep increasing in the long run: labor-saving technological progress.

The Luddite fallacy is very much alive today. Just check out such a respectable document as the annual Human Development Report of the United Nations Development Program. The 1996 Human Development Report frets about “jobless growth” in many countries. The authors say “jobless growth” happens whenever the rate of employment growth is not as high as the rate of output growth, which leads to “very low incomes” for millions of workers. The 1993 Human Development Report expressed the same concern about this “problem” of jobless growth, which was especially severe in developing countries between 1960 and 1973: “GDP growth rates were fairly high, but employment growth rates were less than half this.” Similarly, a study of Vietnam in 2000 lamented the slow growth of manufacturing employment relative to manufacturing output. The authors of all these reports forget that having GDP rise faster than employment is called growth of income per worker, which happens to be the only way that workers’ “very low incomes” can increase.

William Easterly, The Elusive Quest for Growth: Economists’ Adventures and Misadventures in the Tropics
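
The arithmetic behind Easterly’s point is easy to check with illustrative numbers: if GDP grows at 6 percent a year while employment grows at 2.5 percent, output per worker—and with it income per worker—rises by roughly 1.06 / 1.025 − 1, or about 3.4 percent a year. What the UN reports label “jobless growth” is, by that arithmetic, just another name for growth in income per worker.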

Finding Yourself in Prince Harry’s Memoir

From Publishers Weekly:

Though I’ve been called a Jewish princess by disgruntled ex-boyfriends, on the surface I have nothing in common with the British Christian prince now residing in California. Yet as a Manhattan memoirist who writes provocative books my parents detest, I’m completely overidentifying with Harry. Except for his $20 million advance; interviews with Oprah, Strahan, Colbert, and Cooper; the record-breaking millions of copies sold; and the frozen body part, the Duke of Sussex and I are kindred spirits and soulmates, uncannily connected. After all, feeling inferior to a close sibling, fleeing the family business, and selling a randy, revealing memoir to Random House that scandalizes relatives was my story first.

Twenty years ago, when my tell-all Five Men Who Broke My Heart debuted from our mutual publisher, I was the loudmouthed rebel who partied too much, moved to the big city for a creative career, married a sympathetic partner, and became a shrinkaholic who’d overanalyze everything publicly. Okay, my Bubbe Yetta’s Fort Lauderdale bungalow wasn’t as big as his Gan-Gan’s Balmoral Castle. But our unrest led us both to booze, pot, cocaine, and magic mushrooms. I lost my virginity outside, too, to an older paramour. And just like Harry, going into therapy to unravel it all put me on a different planet than my inner circle.

In my own childhood, my staid, stiff-upper-lip (similarly balding) brother pleased everyone by following in my father’s footsteps. He dutifully stayed in our hometown, kept family laundry private, and became a physician like Dad, as he was supposed to. As kids, my brother once let his pet white rat out in my bedroom at four in the morning, while my other sibling took pictures of my reaction. As they trashed my politics and Oprah-inspired psychobabble, I felt like I was the victim of extreme insensitivity.

. . . .

While they labeled my racy nonfiction “fiction,” my shrink had me tell them, “You can be proud of my success selling a book without liking the book.”

. . . .

My folks eventually forgave me and we convened for birthday and Bar Mitzvah celebrations. Yet after escaping, I had the same conundrum as Harry, in which critics claimed the clan I couldn’t wait to escape were my best characters.

Aside from being jealous that a rich ginger with other job prospects is muscling out full-time scribes who have no connections with his chronicle of the five royals who broke his heart, I’m just wild about Harry. He’s revitalizing my favorite industry and oft-maligned genre. Having also collaborated on other people’s stories, I’m even obsessively identified with his ghostwriter. Though I admit The Tender Bar and Agassi’s autobiography sold a few more copies than my low four-figure sales.

It’s easy to see why Spare resonates. The world watched his pa cheat on his beautiful naive mum with his secret mistress and divorce Diana, leaving her unprotected. At 12, Harry was traumatized by her loss. Making matters worse, Pa married Harry’s dangerous stepmum, elevating her to queen consort. It’s dramatic enough for several more seasons of The Crown, which Haz charmingly admitted to watching.

I’m thrilled that Harry debunked the Anglo fairy tale, avenged his mum’s mistreatment, reminded everyone how illicit affairs can haunt children, and tattled that Willy was a bully. Bestselling memoirist Anne Lamott said, “You own everything that happened to you. Tell your stories. If people wanted you to write warmly about them, they should have behaved better.” Joan Didion added, “Writers are always selling someone out.”

Meanwhile, envious authors—from Piers Morgan (infamous for his sexism, xenophobia, running of fake war photos, and perpetuating of phone hacking scandals) to Patti Davis (who has made a cottage industry out of depicting her famous parents for five decades)—have questioned why the prince wrote such a candid, catty, confessional book.

To me it’s obvious. First, remember that the Obamas, Clintons, Bushes, Carters, and Trumps, as well as Fergie, made more money from book royalties than all their official work put together.

Link to the rest at Publishers Weekly

England’s 17th century was a ferment of ideas and revolution

From The Economist:

Writing an accessible history of Britain in the turbulent 17th century, as Jonathan Healey sets out to do in “The Blazing World”, is a noble aim. Starting with the seeds of one revolution and ending with a second, the period teems with ideas about what it means to be a citizen as opposed to a subject, and about how God should be worshipped. By the end of it, a modern concept of the state was emerging. Yet even in Britain it is neglected.

At the turn of the century nine out of ten people lived in the countryside. A tiny minority were literate. Famine and plague were regular scourges; a rising population and stagnant economy spelt misery. Fear of witchcraft was common, as were executions for petty crimes. Fornication could land you in court. By 1700 trade had replaced farming as the mainstay of a burgeoning market economy. Relief from extreme poverty was mandated by Parliament. Towns were hubs of commerce and culture; religious dissent was accepted by a relatively tolerant Anglican elite. Rich Protestant England was a force in Europe.

A continuous thread runs from the accession of England’s first Stuart king, James I, in 1603, to the dynasty’s fall in the so-called Glorious Revolution of 1688-89. Yet historians often balk at telling the tumultuous, ideologically charged story in one go. Often it is divided into three chunks. First come increasing resistance to absolutism and religious intolerance, civil war, the parliamentary army’s victory, the execution of Charles I, and the establishment of the Protectorate under Oliver Cromwell. Next, the monarchy’s restoration under Charles II; finally, the disastrous reign of James II and invitation to William of Orange to take his place and establish a proto-constitutional monarchy.

Mr Healey takes on the whole saga in under 500 pages. It begins when the first James acceded to the throne and was struck by England’s apparent wealth compared with his native Scotland. But the monarchy itself was chronically broke, a condition made worse by the excesses of his court. Tension ensued between his need for cash (he even sold peerages) and Parliament’s traditional control over the royal purse strings. In an era in which absolutism had become the norm across Europe—but was a relatively newfangled notion in England—such restraints were seen first by James, then by his son Charles, as a direct challenge to the “divine right of kings”. All the dynasty’s calamities sprang from this clash.

A far less canny operator than his father, the priggish Charles I was widely seen to be trampling over ancient English liberties. Initially, this was a bigger cause for rebellion than the novel idea of government by and for the people: that developed later, particularly in the ranks of the radicalised parliamentary army. When the civil war began in 1642, nobody thought it would lead to the decapitation of a king and advent of a republic (albeit a short-lived one).

Wily and pragmatic as well as louche, Charles II may have been the only Stuart to see that public opinion, fed by the proliferating news-sheets and pamphlets, could confer or deny legitimacy.

Link to the rest at The Economist

Knopf Press Report Card
Strategy by Random House
No Look Inside Feature: F
Posting the Book Almost Three Months Prior to Release: F
Pricing: F
Likelihood of Significant Commercial Success: F
Final Grade: F

Armada

From The Wall Street Journal:

On Sept. 21, 1588, a savage storm lashed the western coast of Ireland. As the gale intensified, three bulky ships ran aground on the sands of Streedagh Strand, north of Sligo. Part of a formidable “Armada” sent by King Philip II of Spain to conquer the kingdom of his archenemy, Queen Elizabeth I of England, the armed merchantmen had already made a remarkable odyssey. The mission had taken them from Portugal, through the English Channel in a running fight with the nimble and well-armed warships of the “Virgin Queen,” and then around Scotland on a hazardous homeward passage. Now pulverized by the unrelenting Atlantic surf, the stricken vessels broke apart, with the loss of more than 1,000 lives.

In “Armada: The Spanish Enterprise and England’s Deliverance in 1588,” Colin Martin and Geoffrey Parker trace the genesis, fate and legacy of a venture that is remembered as a disaster but that, in their estimation, came close to achieving its objective. The authors first met in 1973, and in the half-century since have maintained a fruitful academic collaboration. In a revised and expanded version of a book first published in 1988, the two deliver what will surely become the definitive account of what the Spanish called “the Enterprise of England.”

Mr. Parker, a professor of history at Ohio State University, draws upon his unrivaled mastery of the extensive documentary sources. Mr. Martin, a retired reader in maritime archaeology at the University of St. Andrews in Scotland, deploys knowledge he has gained in directing the exploration of three Armada wrecks. Distinguished by incisive analysis, “Armada” fuses the complementary skills of the historian and the underwater archaeologist, exploiting the latest discoveries from the archives and seabed alike to help explain why the endeavor ultimately failed.

The original proposal for the “Enterprise of England,” drawn up in 1586 by its designated commander, the experienced marquis of Santa Cruz, envisaged a single amphibious task force. Philip, an inveterate “micromanager,” could not resist meddling with the plan, making it dependent upon close cooperation between the fleet and an entirely separate army. The authors show how this made the mission much more complicated.

Philip was adamant that the Armada should sail up the English Channel and rendezvous in the narrow Straits of Dover with the Spanish “Army of Flanders,” which would be stationed in the Netherlands. Whatever the provocation, the Armada was to save its strength until positioned to escort almost 30,000 veterans, packed aboard specially prepared barges, to a beachhead in Kent.

The invasion would be commanded by the king’s nephew, Alexander Farnese, duke of Parma. Supplied and reinforced by the Armada, Parma’s army was to push inland against London, its flank braced by ships probing the Thames estuary. The king’s strategic vision may have been compromised by his religious piety; the extremely devout Philip was confident that God’s favor would overcome all difficulties.

The Armada’s departure was delayed by the logistical challenge of assembling and supplying such a vast undertaking. In 1587, Francis Drake hampered Spanish preparations by torching stores stockpiled at Cadiz in a pre-emptive strike that he described as “singeing the King of Spain’s beard.” When the marquis of Santa Cruz succumbed to typhus, the role of organizer was assumed by the duke of Medina Sidonia, despite his reluctance to accept what he regarded as a poisoned chalice. The duke’s administrative ability put the Armada on an even keel: By the time it eventually left Iberia, the revitalized force mustered 130 ships carrying 27,000 men. Significantly, though, two-thirds of them were soldiers with scant experience of life afloat.

Link to the rest at The Wall Street Journal

UPDATE: The first comment on this post mentioned that this book may not be quite so new as the WSJ led PG to believe.

PG searched Amazon with the following query: “the armada garrett mattingly” and found Mr. Mattingly’s name on several earlier Armada editions without a co-author, dating back to 1959.

PG then clicked on the link for the co-author, Geoffrey Parker. Again, the link didn’t go to an author page, but rather to a collection of books. The first was Operations Management for Dummies, for which Mr. Parker was the third of three co-authors. The second listing was for The Cambridge Illustrated History of Warfare (Cambridge Illustrated Histories), for which Mr. Parker was the editor. The Armada book in the OP was far down the list for Mr. Parker.

The author links seem a bit dodgy and PG found some other Geoffrey Parkers that may or may not have been the Geoffrey Parker in the OP.

PG hereby offers to show anyone from the Yale University Press how to set up a proper author page on Amazon. Even though he’s not a Yalie, he will not charge a fee to the university for what should be a telephone call lasting about ten minutes or a maximum of three emails.

Publisher: Yale University Press

Yale University Press Report Card
No Look Inside Feature: F
Hardcover Only: F
Pricing: F
Likelihood of Significant Commercial Success: F
Final Grade: F

Trump Threatens to Sue Former Prosecutor, S&S over Forthcoming Tell-All

From Publishers Weekly:

Embattled former president Donald Trump is threatening to sue publisher Simon & Schuster and author and former New York criminal prosecutor Mark Pomerantz over the forthcoming publication of People vs. Donald Trump: An Inside Account.

According to S&S press materials, Pomerantz, who investigated Donald Trump and the Trump Organization, purports to explain in his book why Trump should be prosecuted for financial crimes—and why he believes that prosecution hasn’t yet happened. Pomerantz resigned last February, after Manhattan DA Alvin Bragg reportedly put the brakes on an imminent criminal prosecution of the former president.

But in a letter this week, shared with PW, Trump attorney Joe Tacopina warns Pomerantz and S&S officials against publishing a book that repeats allegedly “false” and “defamatory” statements.

“I strongly admonish you to take these next words seriously: If you publish such a book and continue making defamatory statements against my client, my office will aggressively pursue all legal remedies against you and your book publisher, Simon & Schuster,” Tacopina writes. “Trust me, I will zealously use every possible legal resource to punish you and your publisher for the incredible financial harm that you have caused my clients to suffer.”

In a statement issued late Monday evening, Pomerantz dismissed Trump’s threat to sue.

“If the former president should sue me, I will defend that litigation,” Pomerantz said, in a statement issued through S&S. “I stand by the statements I have made previously, and those contained in my forthcoming book.”

The threat marks the latest in a series of attempts by Trump to stop publication of books that criticize him—attempts that so far have only served to sell books. In 2018, while president, Trump’s attorneys sent a cease-and-desist letter to publisher Henry Holt and author Michael Wolff over his book Fire and Fury: Inside the Trump White House. The media attention propelled the book to the top of the bestseller list.

More recently, Trump unsuccessfully sued Simon & Schuster and his niece, author Mary Trump, in New York state court seeking to block publication of her memoir Too Much and Never Enough: How My Family Created the World’s Most Dangerous Man. The book would go on to sell more than a million copies.

Link to the rest at Publishers Weekly

Decoding brain waves to identify the music we are hearing

From Medical Xpress:

A new technique for monitoring brain waves can identify the music someone is hearing.

Researchers at the University of Essex hope the project could lead to help for people with severe communication disabilities, such as stroke sufferers or those with locked-in syndrome, by decoding the language signals within their brains through non-invasive techniques.

Dr. Ian Daly from Essex’s School of Computer Science and Electronic Engineering, who led the research, said, “This method has many potential applications. We have shown we can decode music, which suggests that we may one day be able to decode language from the brain.”

Essex scientists wanted to find a less invasive way of decoding acoustic information from signals in the brain to identify and reconstruct a piece of music someone was listening to.

While there have been successful previous studies monitoring and reconstructing acoustic information from brain waves, many have used more invasive methods such as electrocorticography (ECoG), which involves placing electrodes inside the skull to monitor the actual surface of the brain.

The research, published in the journal Scientific Reports, used a combination of two non-invasive methods—fMRI, which measures blood flow through the entire brain, and electroencephalogram (EEG), which measures what is happening in the brain in real time—to monitor a person’s brain activity while they are listening to a piece of music. Using a deep learning neural network model, the data was translated to reconstruct and identify the piece of music.
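
The OP gives only a high-level description of the decoding step. For readers who want a concrete picture, here is a minimal sketch of an EEG-to-music classifier of the general kind described. The channel count, clip length, number of candidate pieces, and architecture are all invented for illustration; none of these details comes from the Essex study.

```python
# A toy sketch, NOT the Essex group's actual model: EEG recordings in,
# identity of the music clip out. All shapes and sizes are hypothetical.
import torch
import torch.nn as nn

N_CHANNELS, N_SAMPLES, N_SONGS = 64, 512, 36   # assumed EEG layout / classes

model = nn.Sequential(
    nn.Flatten(),                              # (batch, 64, 512) -> (batch, 32768)
    nn.Linear(N_CHANNELS * N_SAMPLES, 256),
    nn.ReLU(),
    nn.Linear(256, N_SONGS),                   # one logit per candidate piece
)

eeg = torch.randn(8, N_CHANNELS, N_SAMPLES)    # fake batch of 8 recordings
logits = model(eeg)
print(logits.argmax(dim=1))                    # predicted piece for each recording
```

A real decoder would fuse EEG with fMRI-derived features and use a far more careful architecture, but the shape of the problem, brain signals mapped to a score over candidate pieces, is the same.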

Music is a complex acoustic signal, sharing many similarities with natural language, so the model could potentially be adapted to translate speech. The eventual goal of this strand of research would be to translate thought, which could offer an important aid in the future for people who struggle to communicate, such as those with locked-in syndrome.

Dr. Daly added, “One application is brain-computer interfacing (BCI), which provides a communication channel directly between the brain and a computer. Obviously, this is a long way off but eventually we hope that if we can successfully decode language, we can use this to build communication aids, which is another important step towards the ultimate aim of BCI research and could one day provide a lifeline for people with severe communication disabilities.”

Link to the rest at Medical Xpress

PG says human brain/computer interfaces will continue to develop in many different ways, most of them good.

To Warn or Not to Warn: The Controversy Around Trigger Warnings in Literature

From Writer Unboxed:

My publisher engaged a sensitivity reader to evaluate the portrayal of a neurodiverse character in my summer 2023 release (The Beauty of Rain). I eagerly anticipated the feedback, and the reader’s notes on that aspect of the manuscript were ultimately helpful and unsurprising. Conversely, her recommendation that I add trigger warnings about suicidal ideation and prescription drug abuse did momentarily throw me.

Most everyone knows that a trigger warning is essentially a statement cautioning a consumer/reader that the content may be disturbing or induce a traumatic response. Although these labels are not as commonplace in publishing as they are in film, television, and music, in recent years they’ve begun to appear on a book’s digital detail page, its back jacket, or in an author’s note. The big argument in favor of such labels is that they give a reader the choice to avoid a book that contains material said reader might find harmful or that could unwittingly force them to revisit past trauma.

While I consider myself to be a compassionate person who would never purposely cause someone harm, my initial reaction was to reject the suggestion. Trust me, I know that sounds awful, but I worried that the warnings somewhat mischaracterized the tone and themes in my work. After all, if A Man Called Ove had included a suicidal ideation warning, many people might have missed out on an extremely life-affirming story. I discussed my concern with my agent and editor, both of whom also expressed doubts about the necessity of the warnings.

Coincidentally, around that same time I was doom-scrolling on Twitter and came across a New Yorker article from 2021 entitled “What if trigger warnings don’t work?” That piece discusses studies conducted on the effectiveness of content warnings in academia, where their use is on the rise. The data suggests that such warnings not only don’t work, but that they may inflict more harm by causing additional stress and reinforcing the idea that a trauma is central to a survivor’s identity (which is the opposite of PTSD therapy goals). On Twitter and in an Authors Guild discussion thread on this topic, more than one licensed therapist concurred with the article’s conclusions and believed trigger warnings had no meaningful effect.

You might think this data cemented my decision, but it merely piqued my interest in the topic. What better excuse to procrastinate writing my next book than to dive down the rabbit hole of articles and blog posts about the pros and cons of trigger warnings in literature?

It did not take long to identify some other commonly debated pitfalls, which include:

  • Spoilers: One simplistic and popular complaint is that a content warning may give away a plot twist and thus spoil the story for every reader, which is especially frustrating for those who didn’t want the warning. This camp argues that, prior to purchase, a sensitive reader can visit websites such as Book Trigger Warnings or Trigger Warning Database to verify whether a particular book contains personally troubling content without forcing the author to ruin the surprise or twist for every potential reader who picks up the book.
  • Trigger Identification: There are as many different triggers as there are readers, making it a practical impossibility to adequately warn every potential reader about every potential trigger. Similarly, readers with comparable experiences might have different reactions and preferences (for example, I was raised in a violent home but did not want or need a warning before reading The Great Alone). We can certainly group some content into broad common categories like domestic abuse, addiction, rape, etc., but what about a reader who might be traumatized by something more obscure (like a color or a setting)? It seems ambitious if not impossible to imagine one could create a list of all possible triggers. If we can’t screen every scenario, is it fair to screen any?
  • Genre expectations: In dark romance, for example, it is almost guaranteed that there will be some level of violence and crime (such as kidnapping the heroine or dubious consent). The same could be said of crime novels and thrillers (graphic violence, rape, murder, mental health matters). Should authors and publishers need to take additional steps to prepare a reader for something that is essentially foundational to that genre?
  • Censorship: Some teachers, librarians, and publishing professionals argue that content labels are a form of censorship, and that the line between labels and trigger warnings is thin. They worry an overreach or abuse of these labels could result in many books being segregated onto separate shelves. For example, YA books often tackle an array of topics from fatphobia to date rape. If the use of multiple warnings persists and leads to segregated shelving, those books might become less visible and accessible to the general public who might otherwise benefit from exploring those topics. This slippery slope could also ultimately affect what stories publishers choose to invest in and distribute, which would be bad for both authors and readers.

In my opinion, some of these arguments hold more weight than others. I haven’t had an epiphany when it comes to their efficacy, nor am I convinced that there is a clear right answer to this complicated question. That said, my research journey helped me focus on the decision I had to make and its effect on my writing goals. I write stories because I want to emotionally connect with others. Would I prefer to have as many readers as possible give my story a chance? Yes. But do I want to sell my books to everyone at any cost, including the potential emotional torment of another? No, of course not.

Link to the rest at Writer Unboxed

PG wonders how humankind was able to evolve from pond scum into its present form without trigger warnings.

Somehow, the ancient Egyptians managed to build an amazing civilization without trigger warnings. (PG doesn’t read hieroglyphics, so he can’t be certain, but he doesn’t think he’s ever seen a hieroglyphic that looked like it might be a trigger warning.)

The Greeks and Romans built amazing civilizations without trigger warnings. He’s not aware of any Latin text that translates to: “This scroll contains references to alcohol consumption, violence using fantasy magic, and panic attacks.”

Nor did the great artists and writers of the Renaissance ever include trigger warnings.

Pieter Bruegel didn’t have trigger warnings for any of his paintings.

Nor did Leonardo da Vinci.

Nor did Stephen Crane:

At times he regarded the wounded soldiers in an envious way. He conceived persons with torn bodies to be peculiarly happy. He wished that he, too, had a wound, a red badge of courage.

Nor did Siegfried Sassoon:

Do you remember the dark months you held the sector at Mametz
The nights you watched and wired and dug and piled sandbags on parapets?
Do you remember the rats; and the stench
Of corpses rotting in front of the front-line trench–
And dawn coming, dirty-white, and chill with a hopeless rain?
Do you ever stop and ask, ‘Is it all going to happen again?’

Do you remember that hour of din before the attack–
And the anger, the blind compassion that seized and shook you then
As you peered at the doomed and haggard faces of your men?
Do you remember the stretcher-cases lurching back
With dying eyes and lolling heads–those ashen-grey
Masks of the lads who once were keen and kind and gay?

How Janet Malcolm Created Her Own Personal Archive

From The Literary Hub:

Stepping into Janet Malcolm’s home overlooking Gramercy Park was like entering an alternate version of New York City, the kind one might have read about in a childhood chapter book. The ceilings were double height, the lighting was warm and soft. The art adorning the walls was attractive, but the pièce de résistance was Malcolm’s library.

Books covered the walls of her soaring living room, with a wooden ladder tucked in the corner to offer easy access. The collection was organized by genre: photography, biography, criticism. Books by friends got their own shelf by the door, perhaps to ensure that good company was never far beyond reach.

This was where Malcolm and I met for the first time when, in the waning days of the summer of 2019, she invited me over for tea. I had just completed my own research project on her life and work. Surrounded by Malcolm’s home library, we discussed my time in her papers.

At one point, Malcolm got up to grab a photograph of her elementary school class at the top of the Empire State Building. She had been telling me about a series of essays she was starting to tinker with, short reflections on old pictures. She was not sure what she wanted to do with them but her editor thought they would make a good collection. In this photograph, the children of P.S. 82 were smiley and windswept, no older than nine or ten. She asked me if I could pick her out of the lineup. Of course I could: small frame, unmistakable grin, third from the left in the front. This is the kind of intimacy built by time spent in an archive.

. . . .

Still Pictures: On Photography and Memory is Malcolm’s final book-length work, published posthumously in January 2023. On the surface, these essays are a radical departure from everything else Malcolm wrote over the course of her career: they concern people, places, and items that populated her younger life, rather than subjects from which she could purport to maintain some degree of journalistic remove.

But the choice to root her recollections in printed images was a calculated one. By starting with her own archive, Malcolm created the opportunity to write from a vantage point more akin to that of her earlier work, to keep her readers ever at arm’s length.

Prior to her death at 86 in 2021, Malcolm had long been a towering figure in American journalism. She had earned a reputation for penning biting criticism and novelistic reporting. Whole issues of the New Yorker were devoted to her deep dives. But despite her robust literary credentials, she was wary of becoming a celebrity in her own right. For much of her storied career, Malcolm seemed to shun any work that might resemble autobiography, or really expose her true self to her audience at all.

After Jeffrey Masson, Sanskrit scholar, psychoanalyst, and the subject of In the Freud Archives, sued Malcolm for libel in 1984, tarnishing her image even though she won the suit, she retreated almost entirely from the public eye. She did not pose for photos. She hardly ever made appearances: for a rare public event in the spring of 2012, she insisted on reading aloud from a pre-written and edited script rather than speak off-the-cuff. She postured in her writing as someone terrifyingly and unapproachably sharp, never missing an opportunity to remind her readers of “the fiction—on which all autobiographical writing is poised—that the person writing and the person being written about are a single seamless entity.”

Her work over the final decade of her life, though, tells a different story. In 2010, she published her first piece hinting at a change of heart, a short essay for the New York Review of Books titled “Thoughts on Autobiography from an Abandoned Autobiography.” At the time, any intimation of a memoir in the works would have taken devoted readers by surprise. There, she’d remarked, “I cannot write about myself as I write about the people I have written about as a journalist. [These people] have posed for me and I have drawn their portraits. No one is dictating to me or posing for me now.” But in the years following the publication of this essay she did find a way to write about herself more directly. The people smiling at the camera in her personal photographs became the ones sitting for her last set of portraits.

It’s hardly coincidental that Malcolm was organizing her own archive around the same time that she began to explore writing about her life. In 2013, she sent her first shipment—59 boxes of assorted detritus—up to New Haven, where they would live at Yale’s Beinecke Rare Book and Manuscript Library. To say that Malcolm organized her papers would be a stretch. In some cases, she simply emptied the contents of her filing cabinets into cardboard boxes. But elsewhere, she annotated letters and folders, leaving Easter eggs and reminders for any future researcher that she was thinking carefully about what to include in her archive and, more importantly, what to leave out.

In 2020, she sent a second installment of materials, bringing the total number of boxes in her archive to around a hundred. This project punctuated the final decade of her career. As she built this collection during these years, her aversion to writing about herself, or engaging with her own legacy at all, slowly gave way to something closer to ambivalence.

Malcolm published the first of what would become the Still Pictures essays, “Six Glimpses of the Past,” in the New Yorker in the fall of 2018. A sequence of short reflections on family photos, the piece began with a snapshot of the writer as a little girl wearing t-strap sandals and a polka-dotted bucket hat and beaming at someone beyond the frame. But Malcolm was careful to remind her readers that just because she was sharing this photograph didn’t mean she was getting personal.

Regarding the picture, Malcolm wrote, “I say ‘my’ age, but I don’t think of the child as me. No feeling of identification stirs as I look at her round face and thin arms and her incongruously assertive pose.” The little girl posing in the picture and the woman writing about her were hardly the same person at all. This was the same perspective she employed to write about a family therapy session from the other side of a one-way mirror, or Sylvia Plath through five other biographies of her, only this time there was no denying that she was much closer to the content. The material was brand new, and the perspective was quintessentially Malcolm.

Link to the rest at The Literary Hub

Largest-ever study of journal editors highlights ‘self-publication’ and gender gap

From Nature:

The gender gap among senior journal editors is bigger than many people thought, and some editors publish a surprising number of their own papers in the journals that they edit, finds the first study to look at these issues over time across multiple disciplines.

“Although we expected women to be under-represented, we certainly didn’t expect the percentage of women on editorial boards to be as low as 14% for editors and 8% for editors-in-chief,” says co-author Bedoor AlShebli, a data scientist at New York University (NYU) Abu Dhabi. By comparison, women account for 26% of all scientific authors (see ‘Gender gap’).

AlShebli and her colleagues analysed the gender and publication habits of more than 80,000 editors at 1,100 Elsevier journals across 15 disciplines and five decades. The work was published on 16 January in Nature Human Behaviour.

The team found clear evidence of systemic, and persistent, gender inequality in editorial boards across all research disciplines except sociology, says co-author Talal Rahwan, a computer scientist at NYU Abu Dhabi. Although career length and the attrition of women from academia explain the gap among editors, they could not account for the gap among editors-in-chief. “This suggests that other factors, such as bias, might be at play,” Rahwan says. Over the past 40 years, the gap between the proportion of women in science and the proportion of female editors has remained mostly stable.

Paper trail

The rates of self-publication — editors publishing their own research in journals that they edit — also raised some eyebrows among the authors, says AlShebli. The study found that one-quarter of all editors published at least 10% of their papers in journals that they edit. For some, the rates were even higher: 12% of editors publish at least one-fifth, and 6% publish at least one-third, of their papers in their own journals (see ‘Self-publication’). A small number published as much as two-thirds of their career output in their own journals. There was no significant difference in the rates of self-publication between male and female editors, but the increase in this rate that occurred immediately after becoming editor was higher for men than for women.
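
The self-publication figures come down to simple per-editor bookkeeping: the share of an editor’s own papers that appeared in journals they edit. A minimal sketch of that calculation, with invented editors, journals, and papers standing in for the Elsevier data:

```python
# Hedged sketch of the self-publication rate described in the OP.
# All names, journals, and papers below are hypothetical.
from collections import defaultdict

papers = [  # (author, journal the paper appeared in)
    ("alice", "J. Neuro"), ("alice", "J. Neuro"), ("alice", "Brain Res."),
    ("bob", "Physica"), ("bob", "Nature"),
]
edits = {"alice": {"J. Neuro"}, "bob": {"Physica"}}  # journals each edits

counts = defaultdict(lambda: [0, 0])        # editor -> [own-journal, total]
for author, journal in papers:
    counts[author][1] += 1
    if journal in edits.get(author, set()):
        counts[author][0] += 1

for editor, (own, total) in counts.items():
    print(f"{editor}: {own / total:.0%} of papers in own journal")
# alice: 67%, bob: 50% -- both far above the study's 10% threshold
```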

. . . .

Laura Dormer, editor-in-chief of the journal Learned Publishing, says that it isn’t surprising, or necessarily a problem, that editors often publish in their own journals, especially in niche fields. “Generally, researchers will aim to publish in the most impactful journal they can — in terms of actual journal impact factor, and in terms of reaching their peers,” she says. “However, it’s important for journals to have ethical procedures in place that preclude editors from being involved in the handling of their work at any stage of the submission process, and this should be more transparent.”

The gender gap among editors is a trickier problem, but one that is an important focus for many publishers, says Dormer. A variety of approaches are being taken. Dormer suggests that journals should recruit more early-career researchers as board members, because these scientists tend to be more diverse as a group. “This will be beneficial for their own career development, and beneficial for the journal in terms of widening its scope,” she says.

Link to the rest at Nature

Indigenous Authors Have More to Share Than Trauma Narratives

From Publishers Weekly:

When I think about what is discussed when talking about Indigenous literature, I often think about what isn’t out there yet or what there isn’t enough of. What do we need more of? What type of stories will benefit us and those who are learning about us?

Last spring, we at Heyday welcomed some of our oldest friends and biggest supporters at our annual Heyday in L.A. event. Among them were two of the most enthusiastic California Native people I’ve come across when it comes to Indigenous literature: a Native bookshop owner and an American Indian literature professor. I had the privilege to engage in a very spirited discussion with the two of them, during which I asked them, “Do you know of any novels by Native authors that are not centered on trauma?”

They both pondered on this question. It was the only time in the conversation where there was silence. The three of us could not come up with an answer; there was not one book we could think of that didn’t deal with the ugly things that colonization brought to us, such as death, disease, grief, poverty, childhood trauma, sexual assault or abuse, drug use, alcoholism, missing and murdered Indigenous peoples, mental illness, or a long list of other things that plague many of our tribal communities.

What I find most upsetting about this is that there is a real possibility that a lot of our own literature is unwittingly perpetuating the narrative that tribal people are tragic, but there is much more to us than this. We are funny. We are resilient. We are smart. We are innovative. We are thriving. We have generosity and kindness in our communities. Many of us are very delightful. Where are these delightful stories of ours? And do people want to read about them? Well, regardless of whether people do, I believe these books should exist in the world.

In much of my work with the Heyday Berkeley Roundhouse, which publishes books by California’s Indigenous peoples and the quarterly magazine News from Native California, I’ve found that there is a lot that tribal people are not asked about in terms of their lived experiences: stories about connection, growth, and triumph. And we are so lucky when authors come to us with these stories.

When our current editorial manager brought me the submitted manuscript for An Indian Among los Indígenas by Karuk author Ursula Pike, I couldn’t have been more excited. It’s a travel memoir about Pike’s experience in the Peace Corps while serving Indigenous communities in Bolivia. It was fantastic to see the book address the parallel experiences of Indigenous peoples in the Northern and Southern Hemispheres, as well as the differences. The story provoked a lot of introspection in me. I remember thinking that there are plenty of Eat, Pray, Love–type memoirs out there that explore self-discovery, but not many that center Indigenous perspectives.

Link to the rest at Publishers Weekly

A Place for Fire

From The Paris Review:

We were still in Colorado when we booked a first appointment with a realtor in Rhode Island. In the hour before our video call, my husband suggested we make a list of must-have and nice-to-have features in a house. He wrote “3 BR” in the must-have column on a page in his notebook, because we each wanted our own office, then leaned back in his chair. “Built-in bookshelves would be nice,” he said. We’ve always wanted built-in bookshelves. We didn’t yet know we were going to run out of space in the shipping container we’d rented and would have to throw out all the shelves we owned. “A fireplace,” he added thoughtfully. I went into my strident mode, a part of my bad personality that for some reason I cannot change. “A fireplace isn’t optional!” I said, taking the pen and writing “fireplace” in the must-have column. “I’m not going to buy a house without a fireplace.”

We’d spent eleven years in Denver, all in the same apartment, not because we liked the apartment so much, but because every year, when our lease renewal came up, we never felt much like moving. We had moved out there from Boston with eighty or ninety boxes of books, and we didn’t want to pack them up again. We kept hitting that snooze button. Finally John convinced me to move back to New England—he was born in Connecticut, and he never stopped missing it, the trees and the stone walls and all that. What pushed us over was the housing market, which was more reasonable in Providence than in Denver. John kept showing me listings for adorable Colonials with mortgage payments not much higher than our rent. They looked cozy, and I thought I could be happy in New England if we had a little house to settle down in—one last move for us and for the books—if we could cozy up together on a couch and read by the fire.

We drove across the country at the end of March 2022, arriving in John’s hometown in early April—an old mill town in Southeastern Connecticut, an hour from Providence. Our plan was to stay with his mother for a few months. This had a dual purpose. We’d save money on rent and recoup the costs of moving while we looked for a permanent place to live. We could also help Linda with some things around the house, and keep her company—John’s father had died the previous fall. We felt useful, helping her clean out the basement, which had flooded the previous summer, and manage the yard, and so did she—on nights when we had to work late, Linda made dinner.

It’s strange to return. I lived in Boston in my twenties, and now I’m in my forties. One weekend in April we visited friends in Cambridge, then stopped in Harvard Square to buy Linda a Mother’s Day present. There was still a bitter chill in the wind that morning, and as we drove around looking for a spot to leave the car, we kept passing places where I remembered being cold. Once I slipped on some ice coming out of a bar on Mass Ave. It must have been 2007. There was frozen, jagged snow all over the sidewalks, and I tore my jeans and scraped up my knees and the palms of my hands. A couple days later I got food poisoning—it was particularly miserable, vomiting while down on my wounded knees.

During the spring and into summer, whenever anybody asked me how the house hunt was going, I’d make the same unfunny joke. We’re facing two problems, I’d say, and they’re related. The places we like, we can’t afford, and the places we can afford, we don’t like. During the nine or so months between our decision to move and the actual move, housing prices had gone up something like 25 percent. We had told our realtor the absolute top of our range. A week or so later, he asked for a reminder of that figure, quoting back a number fifty thousand dollars higher than the one we’d given him. I didn’t know how to respond. The places in our price range lacked our must-have features, to say nothing of nice ones. I felt like a fool.

When I was twelve or so, my parents converted their wood-burning fireplace to gas. The idea was that it would be so much easier to light and extinguish that we’d use it more often. But the fireplace lost almost all of its appeal. It no longer gave off any real heat, and it didn’t smell delicious—it didn’t smell like anything—and worst of all, it didn’t crackle. I love the sound of a wood fire, and I got through many a winter in that Denver apartment by burning a special kind of candle with a crackling wooden wick, and by playing ASMR white noise videos on YouTube with names like “Cozy Reading Nook Ambience” and, my favorite, “Crackling Campfire on the Windy Tundra of Norway.” My family’s new gas fireplace offered no drama. As Jun’ichirō Tanizaki once wrote of electric heaters, “without the red glow of the coals, the whole mood of winter is lost.” After the conversion we only lit a fire once a year, on Christmas, and in a perfunctory fashion. In Providence, I thought we might have to settle for a gas fireplace. But most houses we looked at had no fireplace at all. And with interest rates increasing, we couldn’t afford those houses either.

Link to the rest at The Paris Review

Progress Through Experiment

From The Wall Street Journal:

By the end of the 19th century, physicists thought they had things pretty well figured out. Everything in the universe was made of various types of “atoms,” which they believed were the smallest possible units of matter. Newton’s laws of motion, combined with the new science of electromagnetism, could predict how these immutable atoms would behave, whether they formed the mass of a planet or the parts in an electric motor. “Now only the details were left to explore,” writes Suzie Sheehy in her absorbing scientific history, “The Matter of Everything: How Curiosity, Physics and Improbable Experiments Changed the World.”

. . . .

By the 1960s, particle physics had grown from tabletop experiments to industrial-scale operations. Instead of a handful of known sub-atomic particles there would soon be more than 100. And to find the next ones, researchers needed bigger accelerators, more power, more people—and more money. Eventually, the cost of the largest particle accelerators became too much for any single country to bear.

This is where Suzie Sheehy enters the story. As a young physics student, she worked briefly at CERN, the multinational research center near Geneva, Switzerland. At the time, CERN was finishing construction of the Large Hadron Collider. The LHC is the largest particle accelerator ever built and the biggest science experiment in history, involving some 10,000 scientists and technicians and a total investment of more than $10 billion.

In 2012 Ms. Sheehy watched a live-feed as CERN project leaders made a long-awaited announcement: They had confirmed the existence of the Higgs boson, a particle predicted four decades earlier by the British physicist Peter Higgs. The discovery resolved some of the biggest quandaries in today’s Standard Model of particle physics. “The camera zoomed in on eighty-two-year-old Peter Higgs as a tear rolled down his cheek,” she recalls.

Does this mean the mysteries of physics are nearly answered? Is there anything left to discover? Ms. Sheehy argues that—despite all the discoveries of the past 125 years—our universe remains full of enigmas. Every day, she writes, physicists like her find reasons to go to their labs “looking for something that makes us go ‘hmm . . . that’s strange.’”

Link to the rest at The Wall Street Journal

The Good Life

From The Wall Street Journal:

What constitutes a life well-lived? What are the ingredients for lasting happiness? In their captivating book “The Good Life: Lessons From the World’s Longest Scientific Study of Happiness,” the psychiatrist Robert Waldinger and the clinical psychologist Marc Schulz convey key lessons that arise from studying the lifetimes of hundreds of individuals across the 20th and 21st centuries. The major lesson is the overriding importance of positive interpersonal relationships throughout the lifespan.

Dr. Waldinger teaches at Harvard Medical School; Mr. Schulz at Bryn Mawr. They are the current directors of the Harvard Study of Adult Development, an investigation now in its 85th year of data collection. The study began as two independent longitudinal projects, one comprising 268 Harvard sophomores deemed likely to flourish later in life and the other consisting of 456 14-year-old boys growing up in Boston’s most disadvantaged neighborhoods. The purpose of both studies, long since merged, was to identify predictors of health, happiness and flourishing in young adulthood and beyond.

At the outset, the Harvard investigators interviewed participants and their parents and conducted a medical examination of each participant. Although most original members are now deceased, their wives and offspring have been recruited as additional participants. The current protocol asks members to complete a comprehensive questionnaire every two years, authorize disclosure of medical records every five years and agree to a face-to-face interview every 15 years. Questions span every aspect of their lives such as family, employment, mental and physical health, and their views on life, politics and religion. Each assessment point provides a comprehensive snapshot of the participant’s life, and, taken together, the data furnish rich portraits of lives as they unfold over time.

“The Good Life” is not a comprehensive scholarly exposition of the Harvard study. Rather, the authors aim to provide practical wisdom regarding the pursuit of happiness arising from their project. Social fitness, as the authors put it, is the key to mental health, physical health and longevity. Developing skills that enable one to cultivate and maintain positive connections to other people is at least as important as proper nutrition, physical exercise, adequate sleep and the avoidance of harmful habits such as smoking. Yet it is easy to take these relationships for granted in today’s individualistic and hypercompetitive societies.

An important part of social fitness is cognitive flexibility, exemplified by the capacity to see the world through another person’s eyes, to express empathic understanding and to attend fully to others. The authors note how technologies like smartphones and social media seize our attention and keep us from attending to loved ones, to their detriment and ours. They also emphasize the importance of positive relationships in the workplace as well as the upticks in one’s mood prompted by brief, positive interactions with strangers.

Learning interpersonal and emotional skills has become increasingly important in view of the epidemic of loneliness accelerated by the Covid-19 pandemic and the attendant remote work and schooling. Loneliness is not solitude; it is perceived social isolation. People suffering the pain of loneliness are experiencing less positive social contact than they would like. Extroverts favor more social interaction and introverts favor less. But voluntary hermits are rare.

Throughout the book, Dr. Waldinger and Mr. Schulz insightfully narrate episodes from the struggles, failures and triumphs of their participants. One section introduces a man called Neal, whose mother struggled with alcoholism and who found himself helping a daughter who did the same. “Can I get your professional opinion?” Neal at one point asked an interviewer. “Is there anything more I can do for her? Do you think I’ve done something wrong?” The authors illustrate their theme of social connection by praising the way Neal and his wife dealt with their daughter: “Sometimes they had to step back, sometimes they had to step in. But they never turned away.” Elsewhere, they show how children growing up in seriously troubled families can nevertheless flourish if they have at least one positive relationship with an adult, such as a teacher or coach.

Link to the rest at The Wall Street Journal

A Canadian writer visits Chinese restaurants around the world

From The Economist:

Many people in Outlook, a tiny prairie town in Saskatchewan, hoped Noisy Jim would run for mayor, but Jim didn’t want the bother. In Port of Spain, the capital of Trinidad and Tobago, Maurice wanders through an empty school, reminiscing about his childhood love of learning. And late one morning in Istanbul, Fatima and Dawood, now getting on in years, sit across from each other at a table—she is peeling beans, he is “methodically numbering and stamping a receipt book, page by page”.

What links Jim, Maurice and Fatima is that they all ran Chinese restaurants. In his new book, “Have You Eaten Yet?”, Cheuk Kwan tells their stories, along with those of Chinese restaurateurs from 13 other cities outside China, from Tromso, north of the Arctic Circle in Norway, to Toamasina, on the east coast of Madagascar. The result is a charming (if sometimes also meandering) book that weaves its profiles together into an extended meditation on identity, belonging and a sense of home.

Like the restaurateurs he meets, Mr Kwan is a multilingual wanderer: born in Hong Kong, brought up there and in Singapore and Japan, he worked in America and Saudi Arabia before settling in Toronto. He has a fluid, plural identity: “My speech and mannerisms change with the environment: Singaporean-accented English, Hong Kong-Cantonese loudness, Japanese quiet deference and straight-talking American mojo.”

His diasporic life gives him an instant connection with his subjects, which he exploits to his readers’ benefit. Food, he explains, “is just an entry point”; although he is a discerning and enthusiastic eater, his real interest is in the people behind the stoves. “As I travelled the world meeting with far-flung members of the Chinese diaspora, one question always came to mind: Are we defined by our nationality or by our ethnicity?”

The answer, of course, is both. Some of the people he speaks with talk wistfully about wanting to be buried in China; others are more circumspect. Mr Kwan asks Johnny Chi, the head waiter at Ling’s Pavilion in Mumbai, whether “he is ambivalent about his identity”. Mr Chi says no: “Wherever you are born, that’s your land.” He thinks of himself as Indian, “except when I look at myself in the mirror. I say, ‘Oh no, I’m not.’”

Mr Chi expresses a sense of not-belonging. But most of Mr Kwan’s subjects, like many diasporic people, instead have a sense of multiple belonging. At a braai (a barbecue, and a staple of South African culinary identity) in Cape Town, Francis Liang, raised in the Eastern Cape, describes himself as “Chinese…a Chinese South African, definitely I am a South African”. Mai and Dao Wong, whose father ran a restaurant in Haifa, both served in the Israeli army, and Mai, says her friend, “has an Israeli temper”.

Link to the rest at The Economist

PG apologizes for the Free Preview link not working – blame the publisher, Pegasus Books (Berkeley and Oakland). Look Inside does work on Amazon, although in a slightly clunky manner.

The Siege of Loyalty House

From The Wall Street Journal:

On Oct. 14, 1645, at the height of the English Civil War between Charles I and his defiant Parliament, Gen. Oliver Cromwell unleashed a brigade of the New Model Army against the most notorious remaining royalist outpost. For more than two years, Basing House—a sprawling estate in Hampshire, 55 miles southwest of London—had been held for the king despite repeated efforts to subdue its garrison. Now, at last, the strongpoint was stormed, and all inside were slaughtered or captured.

In “The Siege of Loyalty House,” Jessie Childs tells the compelling story of a place that acquired a mystique far beyond its strategic significance, mounting a staunch resistance justifying the sobriquet recalled in her title. Underpinned by meticulous research, this finely crafted narrative unfolds in evocative and often poetic language, transporting readers back to a “terrifying, electrifying time” and breathing fresh life into the men and women who endured it.

Ms. Childs, a historian whose previous works have focused on the earlier Tudor period, shows how the hardships of enforced confinement revealed the best and worst of Basing’s defenders. It was a “garrison of all the talents,” she writes, and included individuals whose backgrounds as artists, scientists and merchants open vistas into a tumultuous age. The fight for Basing House becomes a prism to view the English Revolution, a much-debated episode encompassing the execution of Charles I in 1649 and the monarchy’s replacement by a decade-long republic. For bewildered witnesses, it truly was a world “turned upside down.”

Highlighting key flashpoints, Ms. Childs traces the gradual polarization of loyalties and the inexorable slide into war. The stubborn, duplicitous Charles Stuart believed in his divine right to rule as he pleased, but Parliament refused to bend to his will, suspecting him of seeking to revive the elaborate “papist” rituals of the pre-Reformation Catholic Church. As tensions escalated in London, the king’s swaggering Cavalier supporters confronted gatherings of Roundheads (a mocking reference to the city’s short-haired apprentices who favored Parliament).

Ms. Childs charts the Civil War’s unspooling tragedy with insight and compassion. Shocked by the opening clash at Edgehill, in October 1642, Parliament’s commander in chief, Robert Devereux, Earl of Essex, was unable to compose the customary postbattle report. After experiencing the carnage, the earl’s “mind and body shut down.”

Link to the rest at The Wall Street Journal

Innovation in Science Is on the Decline and We’re Not Sure Why

Not exactly about books, but innovation and creativity are what makes the book world (and many other worlds) worth following.

From Science Alert:

The rate of ground-breaking scientific discoveries and technological innovation is slowing down despite an ever-growing amount of knowledge, according to an analysis released Wednesday of millions of research papers and patents.

While previous research has shown downturns in individual disciplines, the study is the first that “emphatically, convincingly documents this decline of disruptiveness across all major fields of science and technology,” lead author Michael Park told AFP.

Park, a doctoral student at the University of Minnesota’s Carlson School of Management, called disruptive discoveries those that “break away from existing ideas” and “push the whole scientific field into new territory.”

The researchers gave a “disruptiveness score” to 45 million scientific papers dating from 1945 to 2010, and to 3.9 million US-based patents from 1976 to 2010.

From the start of those time ranges, research papers and patents have been increasingly likely to consolidate or build upon previous knowledge, according to results published in the journal Nature.

The ranking was based on how the papers were cited in other studies five years after publication, assuming that the more disruptive the research was, the less its predecessors would be cited.
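
That one-sentence description corresponds to a citation-overlap measure (the literature behind the study calls this style of score a “CD index”): later papers that cite the new work while ignoring its predecessors push the score toward +1, while papers that cite both push it toward -1. Here is a minimal sketch, with a handful of hypothetical papers standing in for the 45 million analyzed:

```python
# Sketch of a CD-index-style disruptiveness score. The paper sets are
# hypothetical, not data from the study.

def disruptiveness(citers_of_focal: set, citers_of_refs: set) -> float:
    """Score in [-1, 1]: +1 = fully disruptive, -1 = fully consolidating."""
    only_focal = citers_of_focal - citers_of_refs  # cite focal, skip its refs
    both = citers_of_focal & citers_of_refs        # cite focal and its refs
    only_refs = citers_of_refs - citers_of_focal   # bypass the focal paper
    n = len(only_focal) + len(both) + len(only_refs)
    return 0.0 if n == 0 else (len(only_focal) - len(both)) / n

score = disruptiveness(
    citers_of_focal={"P1", "P2", "P3"},  # P1 and P2 ignore the predecessors
    citers_of_refs={"P3", "P4", "P5"},   # P3 cites both; P4 and P5 bypass
)
print(score)  # (2 - 1) / 5 = 0.2 -> mildly disruptive
```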

The biggest decrease in disruptive research came in physical sciences such as physics and chemistry.

“The nature of research is shifting” as incremental innovations become more common, senior study author Russell Funk said.

Burden of knowledge

One theory for the decline is that all the “low-hanging fruit” of science has already been plucked.

If that were the case, disruptiveness in various scientific fields would have fallen at different speeds, Park said.

But instead “the declines are pretty consistent in their speeds and timing across all major fields,” Park said, indicating that the low-hanging fruit theory is not likely to be the culprit.

Instead, the researchers pointed to what has been dubbed “the burden of research,” which suggests there is now so much that scientists must learn to master a particular field they have little time left to push boundaries.

Link to the rest at Science Alert

The question in the OP is one of the few about which PG doesn’t have an opinion.

That said, PG hasn’t noticed any lack of innovation in the various categories of knowledge to which he pays attention.

The Huxleys

From The Wall Street Journal:

Thomas Huxley thought “the smallest fact is a window through which the Infinite may be seen.” His ideas moved from minute particulars to universal propositions. In 1869 he coined the word “agnostic” to denote a method of thinking that required empirical data rather than biblical revelation to accept the existence of God. At the burial of Thomas’s eldest child, Noel, who died of scarlet fever at the age of 4, a clergyman read at the graveside from 1 Corinthians: “If the dead rise not again, let us eat and drink, for tomorrow we die.” Thomas stood aghast at this insensitivity. “Why,” he said, “the very apes know better, and if you shoot their young, the poor brutes grieve their grief out, and do not immediately seek distraction in a gorge.” The child’s funeral intensified his doubts about Christianity. For Thomas, writes Alison Bashford in her book “The Huxleys,” “science fully and richly took the place of religion.” He handed to his descendants a creed that trusted provable and knowable facts, and rejected credulity, myth and superstition. The Huxley tenets entailed wrestling with conscience, psychological stresses, spiritual anxiety, and anguish.

Ms. Bashford, a historian of medicine and professor at the University of New South Wales in Australia, has written a daring and joyously intelligent book on Huxley, his family and their immense legacy. Her focus is on two eminent scientific thinkers, Thomas (1825-1895) and his grandson Julian Huxley (1887-1975), but she does not provide a conventionally structured family biography following a neat chronological sequence. The abundance and diversity of her material makes that impossible. Thomas was a pioneer of comparative anatomy, a zealous force in educational reform and a scientific celebrity whose book “Evidence as to Man’s Place in Nature” was a landmark argument for human evolution. His descendants became investigators and thinkers who worked with high intensity to clarify some of the boldest, most contentious ideas of the 19th and 20th centuries. They cannot be packed into tight parcels.

Ms. Bashford has partitioned her book into thematic essays. She depicts the family culture of the Huxleys, their dynastic intelligence and literary prowess, their masterful capacity to synthesize scientific knowledge, and their driving conviction that the pursuit of pure truth was a social responsibility which upheld the sanctity of human nature. She gives due attention to other members of the dynasty: the novelists Mary Ward and Aldous Huxley, Julian’s younger half-brother Andrew (a biophysicist and Nobel laureate), Julian’s distinguished sons Anthony (a botanist) and Francis (an anthropologist). She discusses the study by the Huxleys of the human species in the remote past, and their work to better, as they hoped, the social conditions of the future. In the 1950s Julian coined a word to describe the transcendent possibilities and higher destiny of a scientifically enlightened humanity, “transhumanism”.

Generations of the Huxley family suffered from crippling depression. Thomas Huxley described his depressions as “paroxysms of internal pain”, and “deserts of the most wearisome ennui.” His father, a failed schoolmaster, was a melancholic who died in a mental asylum; his afflicted brother James lived in self-enforced seclusion for over 40 years; his daughter was treated for depression by the famous neurologist Jean-Martin Charcot. Julian, who had an acute episode of depression in 1912-13, described himself in 1917 as a “would-be suicide” sunk in “hopeless despair.” He underwent a variety of treatments, including electroshock. His brilliant brother Trevenen, who spoke of being “lost in a pit an enormous way behind my eyes”, hanged himself in 1914. In Ms. Bashford’s telling, the story of how systematic and concentrated work helped other Huxleys, especially Julian, to manage the disarray of depressive illness is a heroic one.

. . . .

Living creatures, however, delighted Julian best. In Oxford during the 1920s he fed Mexican salamanders called axolotls with thyroid of ox, which transformed them in a few weeks from amphibians to terrestrial, air-breathing creatures—a Frankenstein’s monster moment, says Ms. Bashford, when life was transformed. Later he made a loving study of orangutans and chimpanzees. After becoming director of the London Zoo, and establishing it on a new footing of fame, he became devoted to the primates there. One of them, known to generations of English children as Guy the Gorilla, he loved perhaps as much as anyone in his life.

Thomas Huxley was driven by the need to understand what distinguishes humans from other species, and indeed to identify humankind’s place in the natural world. He was not convinced by Charles Darwin’s notion of natural selection as the leading evolutionary force. He believed that new forms sometimes appeared, in their full perfection, by sudden incidents of mutation: a theory known as saltation. Nevertheless he was known as “Darwin’s bulldog” for his defense of the great evolutionist.

Link to the rest at The Wall Street Journal

Lies Diet Books Tell

From Book Riot:

Content Warning: This article discusses weight loss, disordered eating, and the lie that being smaller makes you more worthy. Use caution and don’t forget you are already good.

I am a 37-year-old fat woman who spent decades of my life trying to shrink. I know diets. As a child I bopped along with Richard Simmons on his Sweatin’ to the Oldies VHS. I used smaller plates to simulate portion control. When I became a teenager, I decided to try recording everything I ate. I also learned that I enjoyed jogging and felt triumphant when I finished a short run in my neighborhood. These habits alone were actually great for my teenage self, but they started an emotional journey that would be torture.

You see, I lost a little weight, and the response felt like I had saved a child from a burning building. People went wild. A man I loved like an uncle told me my dead father would be so proud that I slimmed down. More hauntingly, friends and extended family members let me know how gross and off-putting they had found my old body. People I loved very much let me know that I was more lovable when I was smaller, and therefore more attractive. I started restricting my eating even more, living off Lean Cuisine meals and 100-calorie snack packs. (It was 2006; that was the height of healthy eating.) Getting smaller became very, very important.

I got married at a young age and immediately started working full time as a teacher. I had less free time to focus on going to the gym and food tracking. This is when my weight started to shift to a higher number, and I got desperate. Diet culture had its hooks in deep at this point, and I entered a cycle of having a huge binge period before starting a new diet. Completely separated from the basic habits that had made me feel good in the first place, I tried everything on the market. I have used Weight Watchers, the 21 Day Fix, Beachbody powders, My Fitness Pal, carb-free diets, Whole 30, the Special K diet (two bowls of Special K a day and a sensible dinner), and more. I have tried tricks like chewing gum to keep me from snacking, snapping my wrist with a rubber band when I reach for food, and pouring water over my meal after I’ve eaten half to make sure I wouldn’t eat anymore. It was disordered and it heavily messed with my head.

I gave up diets about eight years ago. Books are what saved me. I’m much larger, happier, and have a better relationship with my body than ever. I’ve learned what diet culture is and what it does. Diet culture (and the diet books that hold it up) spews lies daily. Some are easily debunked, and others I am still detangling. Some of these lies are so insidious we accept them as fact without any thought. The point is, we need to talk about it.

. . . .

LIE: HEALTHY FOOD AND EXERCISE ARE ONLY WORTH THE AMOUNT OF WEIGHT THEY CAN MAKE YOU LOSE.

I’m still deep in my journey of reconnecting with myself and the foods, habits, and movement that make me feel good. It’s not small or simple. In diet culture, certain foods are vilified and exercise is exalted as the most virtuous thing you can do. Once you realize that shrinking is not a worthy life goal and remove yourself from diet culture, it’s easy to reject exercise and any food that was considered “good.” Getting to a spot where you nourish yourself with nutrient-dense foods and move for your mental health is so hard when you’ve been taught those things are only worth it if they make you smaller.

LIE: LOSING WEIGHT IS A FEAT OF DISCIPLINE AND MEANS YOU ARE IN CONTROL.

This one is so, so damaging, because it’s really easy to believe. It’s also dangerous to people in larger bodies, because society believes the opposite (fat means sloppy, letting go, lazy) without a single analytical thought. The truth is, all humans genetically have much less control over the size of their bodies than we would like to believe. It’s also laughable to assume that people suffering from disordered eating are in control. To be strict with diets and follow food rules that are based on shrinking, you basically have to shut down your connection with your body and the hunger signals that should vary throughout a day, week, and month. Whenever I’m tempted by the siren call of going back to my dieting days (“Maybe I really was much healthier before…”) I am reminded of a time when I was at my smallest, being praised left and right, and was caught with a spoon in a bag of sugar I had frantically dug out of the back of my mom’s pantry, because I was having an intense craving and snapped. I was not in control.

Link to the rest at Book Riot

PG can’t say he wasn’t warned before he started reading the OP. He made it through without being triggered.

Fit Nation

From The Wall Street Journal:

When fitness guru Jack LaLanne opened a gym in Oakland, Calif., in the 1930s, he had to hire a blacksmith to build the equipment: fitness machines did not yet exist. As Natalia Mehlman Petrzela writes in “Fit Nation,” in the late 19th and early 20th centuries, exercising, except in the context of organized sports, was “marginal, and even suspicious,” and being fat was taken by many as “a positive sign of affluence.” Being lean or muscular was likely the result of manual labor, hardly something to which the middle classes aspired. Weightlifting was for the circus or the effeminate; ladies didn’t perspire, much less sweat. “Fit Nation” is the story of how all that changed.

There were always outliers: bodybuilders and diet advocates like Bernarr Macfadden (publisher of Physical Culture magazine), exercise proponents like Charles Atlas (a former “97-pound weakling” whose mail-order regimen promised “everlasting health and strength” in 15 minutes a day). But they were viewed from a distance; it would be the postwar prosperity of the 1950s that shifted the public attitude to fitness, and created a consumer base with the time and money to pursue it. From the White House, John F. Kennedy promoted a glamorous outdoor athleticism, while pioneers like LaLanne brought fitness into the home via television. No longer a niche pursuit, fitness grew to be a standard part of a middle-class lifestyle (even if many of us, it must be said, observe it in the breach). Between sometime in the 1960s and the mid-1980s, working out went from eccentric interest to “social imperative.” Today, we all know we should be exercising, especially when we see someone else out running.

Ms. Petrzela, who teaches exercise classes when she’s not lecturing on cultural history at the New School, offers an informative overview of this transformation. In 1961, she tells us, 24 percent of American adults were regular exercisers, but “that number jumped to 50 percent by 1968, and 59 percent in 1984.” Those dramatic rises were driven by inventive fitness styles, catering to different consumers: Jazzercise for the housewife; Nautilus gyms for the yuppie; jogging for pretty well anyone.

Other cultural and political trends converged. Jogging, for instance, got a boost thanks to an increase in urban marathons and fun runs from the 1970s onward. Ms. Petrzela suggests that these races became popular because they were cheap for cities to host and helped to promote “an image of fiscal and personal health that offset depictions of urban crime and decay.” Local governments are happy to promote fitness as a social good and community-building activity.

Celebrity health-boosters also modeled fitness for different audiences: Arnold Schwarzenegger with the extreme masculine imagery of the film “Pumping Iron” (1977), Jane Fonda with a “Workout Book” offering her embodied ideal of Hollywood fitness (1981). The consumerist 1980s also made fitness another thing to buy—and linked fitness increasingly to appearance. Beauty standards for women began to shift from thin to toned, and classes proliferated to focus on the looks that would result, from “Buns of Steel” to flat stomachs to sculpted upper arms. Women resolutely pulled on their legwarmers and started stepping to the beat. The speed with which the industry grew is staggering: “By 1984, Jazzercise was the second largest franchise business in the country,” Ms. Petrzela reports, “just after Domino’s Pizza.”

Link to the rest at The Wall Street Journal

The Ghost at the Feast

From The Wall Street Journal:

From the beginning of the 20th century until the Japanese attack on Pearl Harbor, international balances of power were more weirdly skewed, more out of joint, than at any time before or since. The 19th-century world was emphatically multipolar, even if the British Empire held outsize influence in some areas outside Europe. The decades after 1945 were dominated by two nuclear powers—America and the Soviet Union. But the years before World War II cannot be summarized simply, or explained in a sentence or two. The awkward fact was that even then one of the Great Powers, the United States of America, possessed the economic and productive heft of two or three of its rivals combined; on occasion, its gross domestic product alone was nearly equal to that of all the rest put together.

Yet for all that time, except for President Woodrow Wilson’s brief appearance at the center of the world stage in 1918-19, the American giant kept to itself, focused upon its domestic affairs, maintained a minuscule army and refused an international leadership role, to the puzzlement of most foreign observers. It was an economic force, all right, but it usually declined membership in international bodies or security arrangements. In 1919 the experienced British diplomat Harold Nicolson had called it “the ghost at all our feasts.”

This curious and lopsided diplomatic story has attracted the attention of columnist and scholar Robert Kagan, who takes Nicolson’s term for the title of his new study. “The Ghost at the Feast: America and the Collapse of World Order, 1900-1941” is rather different from Mr. Kagan’s previous works, including his best-known one, “Dangerous Nation: America’s Place in the World From Its Earliest Days to the Dawn of the Twentieth Century.” This latest is a professional historian’s product through and through, sharply focused on its period and supported by amazingly detailed endnotes, plus a huge bibliography. Mr. Kagan’s account is probably the most comprehensive, and most impressive, recent analysis we have of how Americans regarded the outside world and their country’s place in it during those four critical decades.

“The Ghost at the Feast” is neither strictly a diplomatic history nor an analysis of the unfolding of American grand strategy (if it ever had one) in those unusual years. Mr. Kagan recounts presidential decision-making and official actions in great detail, yet offers even greater analysis of the swirls of U.S. public opinion, the arguments of the press and pundits, the evidence in Gallup polls, and the ever-important actions of senators and congressmen. Only when all these elements are taken together, he argues, can one see the way the American mind was going. And most of the time that “mind” was for staying out of the world’s troubles and tending to its own garden, especially as international crises multiplied during the 1930s.

This attitude was not just a benign sort of inertia that regarded the outcome of a ballgame in Cincinnati as more important than the fate of Czechoslovakia; there were also very strong public strains of antisemitism, fears of Communism, a loathing of Wall Street and a deep suspicion of British and French imperialism: Mr. Kagan’s chapter on “Kristallnacht and Its Effect on American Policy” is particularly powerful. Yet America was not completely isolationist, and its leaders did not intend to be taken for granted. If the nation didn’t play a larger role in world affairs, that was because it felt it didn’t need to. It was bigger than anyone else; it couldn’t be intimidated, and if America were to be compelled to take military action abroad, no one—isolationist or interventionist—imagined that it could be defeated. America was unique and America was unpredictable. Everyone is wary of the ghost at the feast.

Link to the rest at The Wall Street Journal

PG thinks that the OP, as he reads it, gives too little consideration to the ongoing emotional and financial effects of the stock market crash of 1929 and the Great Depression that followed.

Not long after many Americans learned the wisdom of depositing their money in a bank for safekeeping, 9,000 banks failed, taking with them $7 billion in depositors’ assets. There was no source of reimbursement for deposits that were lost.

PG has spoken with enough people who experienced the Depression to understand how badly many parts of the US economy were damaged. He also understands the emotional impact of suddenly losing all your money while banks, desperate to remain solvent, foreclosed on loans as fast as possible, only to discover that the security for those loans was worth a small fraction of its value of a few years earlier.

With that sort of pall hanging over a great many parts of the country and the voters who lived there, any politician proposing to spend a lot of money to help out with a dispute between nations halfway around the world would have a realistic fear of losing his job as well. When every working-age member of a family had to take whatever pay was offered in the local economy, nobody wanted to send a healthy 18- or 19-year-old family member across an ocean to settle disputes between kings and dictators.

PG has often speculated about how history might have been different if the Japanese hadn’t decided to attack the American naval base at Pearl Harbor. Before that shocking event, concerns about military disputes in Asia were even further from the collective mind of the general population than what was happening in Europe.

In May 1940, a Gallup poll found that only 7 percent of Americans believed the United States should declare war on Germany.

Seventeen months later, when Japan attacked Pearl Harbor and sank a great many US Navy warships, US public opinion quickly changed.

One day after the Japanese attack, President Roosevelt gave a speech to a joint session of the US Senate and House which included what would become the most-remembered statement of his time in office.

“Yesterday, December 7, 1941—a date which will live in infamy—the United States of America was suddenly and deliberately attacked by the naval and air forces of the Empire of Japan.”

One week following the Japanese attack, only 7 percent of Americans wanted the country to stay out of war. Only one member of the US House of Representatives voted against the declaration of war that followed.

The First Romantics

From Aeon:

In September 1798, one day after their poem collection Lyrical Ballads was published, the poets Samuel Taylor Coleridge and William Wordsworth sailed from Yarmouth, on the Norfolk coast, to Hamburg in the far north of the German states. Coleridge had spent the previous few months preparing for what he called ‘my German expedition’. The realisation of the scheme, he explained to a friend, was of the highest importance to ‘my intellectual utility; and of course to my moral happiness’. He wanted to master the German language and meet the thinkers and writers who lived in Jena, a small university town, southwest of Berlin. On Thomas Poole’s advice, his motto had been: ‘Speak nothing but German. Live with Germans. Read in German. Think in German.’

After a few days in Hamburg, Coleridge realised he didn’t have enough money to travel the 300 miles south to Jena and Weimar, and instead he spent almost five months in nearby Ratzeburg, then studied for several months in Göttingen. He soon spoke German. Though he deemed his pronunciation ‘hideous’, his knowledge of the language was so good that he would later translate Friedrich Schiller’s drama Wallenstein (1800) and Goethe’s Faust (1808). Those 10 months in Germany marked a turning point in Coleridge’s life. He had left England as a poet but returned with the mind of a philosopher – and a trunk full of philosophical books. ‘No man was ever yet a great poet,’ Coleridge later wrote, ‘without being at the same time a profound philosopher.’ Though Coleridge never made it to Jena, the ideas that came out of this small town were vitally important for his thinking – from Johann Gottlieb Fichte’s philosophy of the self to Friedrich Schelling’s ideas on the unity of mind and nature. ‘There is no doubt,’ one of his friends later said, ‘that Coleridge’s mind is much more German than English.’

Few in the English-speaking world will have heard of this little German town, but what happened in Jena in the last decade of the 18th century has shaped us. The Jena group’s emphasis on individual experience, their description of nature as a living organism, their insistence that art was the unifying bond between mind and the external world, and their concept of the unity of humankind and nature became popular themes in the works of artists, writers, poets and musicians across Europe and the United States. They were the first to proclaim these ideas, which rippled out into the wider world, influencing not only the English Romantics but also American writers such as Henry David Thoreau, Ralph Waldo Emerson and Walt Whitman. Many learned German to understand the works of the young Romantics in Jena in the original; others studied translations or read books about them. They were all fascinated by what Emerson called ‘this strange genial poetic comprehensive philosophy’. In the decades that followed, the Jena Set’s works were read in Italy, Russia, France, Spain, Denmark and Poland. Everybody was suffering from ‘Germanomania’, as Adam Mickiewicz, one of Poland’s leading poets, said. ‘If we cannot be original,’ Maurycy Mochnacki, one of the founders of Polish Romanticism, wrote, ‘we better imitate the great Romantic poetry of the Germans and decisively reject French models.’

This was not a fashionable craze, but a profound shift in thinking, away from Isaac Newton’s mechanistic model of nature. Despite what many people might think today, the young Romantics didn’t turn against the sciences or reason, but lamented what Coleridge described as the absence of ‘connective powers of the understanding’. The focus on rational thought and empiricism in the Enlightenment, the friends in Jena believed, had robbed nature of awe and wonder. Since the late 17th century, scientists had tried to erase anything subjective, irrational and emotional from their disciplines and methods. Everything had to be measurable, repeatable and classifiable. Many of those who were inspired by the ideas coming out of Jena felt that they lived in a world ruled by division and fragmentation – they bemoaned the loss of unity. The problem, they believed, lay with Cartesian philosophers who had divided the world into mind and matter, or the Linnaean thinking that had turned the understanding of nature into a narrow practice of collecting and classification. Coleridge called these philosophers the ‘Little-ists’. This ‘philosophy of mechanism’, he wrote to Wordsworth, ‘strikes Death’. Thinkers, poets and writers in the US and across Europe were enthralled by the ideas that developed in Jena, which fought the increasing materialism and mechanical clanking of the world.

So, what was going on in Jena? And why was Coleridge so keen to visit this small town in the Duchy of Saxe-Weimar that had become a ‘Kingdom of Philosophy’? Jena looked unassuming and, with around 4,500 inhabitants, it was decidedly small. It was compact and square within its crumbling medieval town walls, and it took less than 10 minutes on foot to cross. At its centre was an open market square, and its cobbled streets were lined with houses of different heights and styles. There was a university, a library with 50,000 books, bookbinders, printers, a botanical garden and plenty of shops. Students rushed through the streets to their lectures or discussed the latest philosophical ideas in the town’s many taverns. Tucked into a wide valley and surrounded by gentle hills and fields, Jena was lovingly called ‘little Switzerland’ by the Swiss students.

Back in the 18th century, Jena and its university had been part of the Electorate of Saxony but, because of complicated inheritance rules, the state had been divided up and the university was nominally controlled by no fewer than four different Saxon dukes. In practice, it meant that no one was really in charge, allowing professors to teach and explore revolutionary ideas. ‘Here we have complete freedom to think, to teach and to write,’ one professor said. Censorship was less strict compared with elsewhere, and the scope of subjects that could be taught was broad. ‘The professors in Jena are almost entirely independent,’ Jena’s most famous inhabitant, the playwright Friedrich Schiller, explained. Thinkers, writers and poets in trouble with the authorities in their home states came to Jena, drawn by the openness and relative freedoms. Schiller himself had arrived after he had been arrested for his revolutionary play The Robbers (1781) in his home state, the Duchy of Württemberg.

On a lucky day at the end of the 18th century, you might have seen more famous writers, poets and philosophers in Jena’s streets than in a larger city in an entire century. There was the tall, gaunt-looking Schiller (who could only write with a drawer full of rotten apples in his desk), the stubborn philosopher Fichte, who put the self at the centre of his work, and the young scientist Alexander von Humboldt – the first to predict harmful human-induced climate change. The brilliant Schlegel brothers, Friedrich and August Wilhelm, both of them writers and critics with pens as sharp as the French guillotines, lived in Jena, as did the young philosopher Friedrich Schelling, who redefined the relationship between the individual and nature, and G W F Hegel, who would become one of the most influential philosophers in the Western world.

Link to the rest at Aeon

How 2022 became the year of the fragmented-identity novel

From The Los Angeles Times:

The tail end of 2022 has been marked by a worrying sense that the center really isn’t holding. Last month’s U.N. climate conference convened to determine to what degree we might comfortably continue to cook the planet. Election denialism, especially in my home state of Arizona, entrenched itself as a small but no longer negligible branch of political discourse. We could take our grievances to Twitter about such things — or could we? Elon Musk was in charge, not only letting loose the trolls but tweeting as troll-in-chief.

Throw in some overall social and cultural atomization, and it’s coming to feel like we’ve become rhetorically unstuck in time. Fiction is usually a lagging indicator of global crises — Iraq war novels didn’t arrive till years after the war began. But much of the prominent fiction of 2022 met the moment and captured this fragmentation, thick with code-switching, style-shifting and cacophonies of anxious narration. The omniscient, singular authorial voice in literary fiction has become ever more antiquated — still valuable, but more like an exotic, bespoke retreat than literature’s mainland. Call it Franzen Island.

Better befitting our times are a constellation of mosaics — maybe the Egan Archipelago. Jennifer Egan, who set a template for this brand of multi-voiced fiction with her 2009 novel, “A Visit From the Goon Squad,” revisited and updated that sensibility this year with a sequel, “The Candy House,” that despaired of what internet algorithms were doing to our identities. The novel sprays literary gambits like a fallen power line throwing sparks: Here a narrative in tweets, there in emails, here a satire of literary tropes, there a spoof of Hollywood. Egan was writing as if to defend fiction against what the internet was doing to it. A messy online reality — one that spelled “an existential threat to fiction,” she wrote — demanded a stew of styles.

Egan wasn’t alone. In his brilliant second novel, “Trust,” Hernan Diaz cut reality into pieces, telling the story of an early 20th century investor through fiction, memoir and diary to show how each form, on its own, is untrustworthy; every story is self-serving, but put enough of them together and you might get at the truth. The atomizing pressures—rapacious capitalism, entrenched sexism—were systemic, but this was no eon-spanning epic. The narratives were compressed, intimate and particular.

This approach manifested itself poignantly in Namwali Serpell’s second novel, “The Furrows: An Elegy.” A woman mourning her brother’s sudden death switches tones and perspectives to either grasp or escape her complicity in the incident. Her status as a character morphs, as if to suggest that inhabiting someone else’s identity might bring us closer to our own. And in Jonathan Escoffery’s novel-in-stories, “If I Survive You,” the lead character is a Florida-born man of Jamaican heritage whose identity is as fragmentary as the book itself: In the Midwest he’s Black, but in Miami and Jamaica subject to more specific assessments. A chorus of voices seems to consume and splinter him: “You’re brown, but not that kind, and not that kind, and not that kind.”

The style of 2022 — intimate chaos? — was no more specifically American than the chaos of the real world. NoViolet Bulawayo’s second novel, “Glory,” is “Animal Farm” for the age of social media, rooted in the animal residents of an authoritarian African nation staging a collective protest against its leaders. In Mithu Sanyal’s “Identitti,” social media contempt serves as a character in itself, debating the intentions of its protagonist, a Rachel Dolezal-like white academic who performs as a person of color. The novel satirizes the audacity of its heroine, but it also wants to suggest that who we are is increasingly constructed — both internally and externally. (“Oh, so it’s okay to transcend your gender, but a category as obviously made-up as race should be more fixed and inflexible than sex?” she says — a provocation, but Sanyal wrestles seriously with the question.)

Link to the rest at The Los Angeles Times

Perhaps PG is biased from having lived in Southern California for several years, but some commentators writing in The Los Angeles Times always seemed to be aware that no one treated them with the same respect as writers in the other Times, the one in New York.

New York Times writers seemed to receive almost automatic attention and consideration from not only the rest of the nation, but by quite a few well-educated residents of Southern California as well.

The LA Times never managed to gain any lasting credibility beyond a relatively small group of people living in West LA. The huge Latino population didn’t care what the Times had to say, nor did the rapidly growing Orange County populace or anybody in Riverside or San Bernardino.

In 2000, The Times, along with the San Diego paper, was purchased by a Chicago-based newspaper conglomerate then known as The Tribune Company. This organization was later persuaded by some branding geniuses to change its name to Tronc.

Yes, public reaction to Tronc in Chicago and everywhere else was uniformly derogatory.

But the problems in LA were not yet over. Tronc wanted a new look for the Times, fired a bunch of seasoned executives, and hired a wonder boy to lead the Times to greater glory, “a visionary and innovative leader” with prior stops at AltaVista and Yahoo.

It turned out that the new guy was a great salesman and visionary leader who was also a frat guy who assessed the “hotness” and bodies of female subordinates. Of course, he was sued for sexual harassment by several of his female employees.

Tronc finally sold its California papers and a bunch of other stuff to a Silicon Valley billionaire who got rid of the stupid name.

Democracy dies behind paywalls

From What’s New in Publishing:

We all surf dozens – even hundreds – of sites and sources, yet none of us can afford to subscribe to everything we’d want to consume. But today we are being forced to subscribe wherever we go – cutting off everyone from access to diverse and crucial information. This is dangerous – here is why.

One of the most important taglines in recent history may be the Washington Post’s “Democracy Dies in Darkness.” To me, it is a mission statement about journalism’s role in society – to inform, to explain, to educate. It implies that if you don’t have access to information, you are simply in the dark. Today, in 2022, it looks like democracy now dies behind paywalls.

This morning, I wanted to read about how a new “anti-white bigotry” movement is gaining momentum – but then I hit a paywall. So I tried to access an article about Twitter dissolving its Trust and Safety Council, but hit the paywall again. A story about Moderna’s mRNA Cancer Vaccine looked very interesting, but could I read it? Not without a subscription.

And by the way, none of these articles were part of my subscription to the New York Times – so I could only read about them somewhere else. This happens to all of us all of the time.

The Internet gave rise to a massive paradigm shift – moving from the paperboy delivering news on our doorsteps to us now accessing any news we want to with a click on our phone. The way content is consumed has fundamentally changed – from a ‘push’ model to what can now be considered ‘pull.’ We live our lives online, looking for content ourselves — the stories we like, the news we’re interested in, jumping directly to the content we want.

The rise of paywall apartheid

In their search for revenue sources to compensate for dropping advertising, publishers are increasingly making quality journalism accessible only to paying customers. It’s understandable from a business perspective. If you have great-quality content behind the paywall, it attracts users and converts some of them into subscribers – that’s especially the case with exclusive content that explains and interprets important societal, political, and cultural events. But by focusing on subscriptions as their primary revenue source, publishers are making this journalism accessible only to paying subscribers, and that’s where the danger lies.

After all, what are we going to do if we simply don’t want to pay for a subscription (whether because we can’t afford it, we simply have too many subscriptions, it’s a nuisance to sign up, etc.)? It is a pain.

Because the nature of publishing has changed from push to pull, if a reader cannot afford a subscription to access a publisher’s content, they will end up going somewhere else, where the content is available to them easily, or for free, or they bail entirely.

According to a recent NRG and Toolkits report, 53% of U.S. consumers say they attempt to bypass paywalls on publishers’ websites when they encounter them. 66% of respondents say that paywalls make them dislike the website or publication, with 40% saying they instead search for the content on a different website. And this can have potentially disastrous results as it is precisely those freely accessible sites that often have incorrect, biased, or outright false perspectives and narratives.

Extreme political organizations would never have gained such popularity if their discussions and opinions were locked behind a paywall; because their content was free, it drew people in much more easily than paid versions did. In cases like this, by only allowing access to content via a subscription, publishers are potentially funneling people in the direction from which they actually want to protect them – into the arms of free content that may inspire them to think entirely differently.

The problem is real for everyone – even Elon Musk suffers from it. Here’s what he had to say in a recent interview (prior to his acquisition of Twitter):

I don’t want to get a subscription to the Philadelphia Inquirer, but I’m sure they have, every other day, a good article that I’d like to read. But I don’t want, like, 12,000 subscriptions… I understand that all these publications want to maximize their subscribers… But there’s a huge number of people who are never going to subscribe to that publication.

Elon Musk

73% of US consumers don’t subscribe to digital publications

According to data just released in the NRG and Toolkits report, 73% of US consumers don’t currently subscribe to digital publications at all. In other words, most people just won’t subscribe. This means we run the risk of being trapped in information bubbles, fed by search algorithms and the limitations of the content we are able to access. If we don’t subscribe to a publication, we don’t get access, which in turn means we have less access to diverse content. Instead, we turn to content that is free – and often more biased – and as the search algorithms learn and reinforce our consumption habits, the information bubble becomes ever more solid.

I think we’d all agree that everyone would benefit from more access to better, quality information. Better content allows for a better culture of discussion and debate, maybe even leading to more balanced political decision-making. And that is not only a good thing in its own right, but it also leads to potentially less extremist points of view.

Limiting the pool of content accessible to readers to just a few pieces can have dangerous effects on civil society and its understanding of democracy. The fewer people who have access to high-quality information from diverse sources, the easier it is for populist or even extreme platforms to pursue their business without counterarguments, and the greater the likelihood that half-truths and fake news will be believed.

Link to the rest at What’s New in Publishing

PG is annoyed by paywalls, but doesn’t think they’re a threat to democracy.

If he really wants to read about a news item that’s behind a paywall, usually a Google search will lead him to another recent and relevant article about the topic, whether it’s a factual or opinion piece.

The most common major news publications that are paywalled are what used to be called newspapers or magazines. Some even print their publications. On paper. Which causes lovely trees to be cut down in a forest somewhere. Every issue results in more trees being cut down, which are then trucked to a factory that uses more energy to change them into pulp and then paper. Which has to be shipped to a big printing factory, unloaded into a warehouse, then moved from the warehouse to where the giant energy-consuming printers are located. To be turned into physical newspapers, which are then trucked or moved by other energy-consuming transportation means to a whole bunch of different locations where people actually read news on paper.

The electronic alternative to all this industrial age news technology is to use a bunch of organized electrons to send the information and stories all over the world with a small fraction of the energy it takes to use dead trees to transport information.

From an objective point of view, which way is the best method for rapid dissemination of useful and accurate information?

PG notes that there is no guarantee that information behind a paywall is more accurate than information that is not.

The large majority of “quality” traditional newspapers have always charged people who wanted to read them, so there’s always been a paywall on their content.

On more than one occasion, the printed stories were not accurate, or favored one view over an opposing view, and included information, accurate or not so accurate, that reflected the “narrative” the owners/editors wanted the public to hear and adopt as its own.

In newspaper days of yore, owners/publishers/editors thought nothing of promoting movements, contemporary issues, political parties, views of various types of commerce, etc., etc. In the glory days of American newspapers, a reader could choose a newspaper that closely mirrored her/his opinions about politics and many other topics. The internet didn’t invent that sort of thing.

A New Ice Age

From The Signal:

Since Russia first invaded Ukraine on February 24, Vladimir Putin has spoken of the attack as part of a civilizational conflict with the West—like the Cold War—while he and his military leaders continue to threaten the use of nuclear weapons. Today, the Russia of Putin looks far weaker than the empire of Stalin and Brezhnev ever did, with Moscow having failed to achieve almost all its goals in Ukraine—and many of the Soviet Union’s old satellite states now NATO members. At the same time, Russia has become much closer to China than it was in the communist era, as Xi Jinping pursues his declared ambitions to counter the global power of the United States. The U.S. and the EU have meanwhile moved to break off economic relations with Russia and halt the development of Chinese tech industries. From the end of World War II through the fall of the Berlin Wall, the world was fundamentally split into two hostile blocs. Is it happening again?

To Lucan Way, it is. Way is a professor of political science at the University of Toronto and the author of three books on authoritarianism. Much of the globe, he says, has been dividing into two camps, democratic and authoritarian—“the free and the unfree”—and the conflict between them is deepening. But the nature and contours of this new division are different. Now, the two sides aren’t fighting over an ideology, as the democratic and communist blocs of the Cold War were. Also, there are regionally powerful countries today that can challenge the goals of the democrats or the authoritarians—or cooperate with either. In the absence of an organizing ideological dimension, and with the presence of other powerful actors, Way sees the new era of global conflict becoming more chaotic—and ultimately more unpredictable—than the Cold War ever was.

———

Michael Bluhm: Putin has framed the Ukraine war as a proxy conflict with the West and NATO. Even after the fall of the Soviet Union, he says, NATO has never stopped aggressively expanding its borders and threatening Moscow.

But now the war seems to have made Russia much weaker—even weaker than the Soviet Union was toward the end of its history. What has Putin done to Russia’s position in the world here?

Lucan Way: First of all, Russia is much richer today than it was in the Soviet era. Until very recently, it had a dynamic market economy.

The main difference between the eras is that, during the Cold War, the Soviet Union had a universal ideology that could be applied throughout the world: communism. It was the Soviets’ counterpart to the also-universal liberal-democratic ideology of America and its allies.

The Cold War involved a distinctive, global competition for influence. Every local conflict around the world became infused with this broader great-power conflict—in no small part because of these competing ideologies.

Today liberal democracy has had its setbacks around the world, but there’s still a global liberal-democratic ideology. There isn’t a global authoritarian ideology. There are just a lot of parochial nationalists with authoritarian playbooks. The movements supporting them don’t speak to any global ideology. In that sense, Russia’s been reduced from a country with global ambitions, based on a global idea, to a corrupt dictatorship based on the power of a single person.

Bluhm: The historian Tim Sayle said in March that the invasion of Ukraine had brought a period of newfound unity and shared moral clarity to the West.

How do you see this today—with persistent inflation and rising energy costs triggering protests across so many European countries?

Way: I was impressed early on by the remarkable unity we were seeing in Europe. I did feel somewhat apprehensive about the possibility of European fatigue with the war—just because most conflicts that initially spark widespread outrage tend to get normalized fairly quickly; we’ve seen this in Afghanistan and elsewhere. But polls in Europe still show overwhelming support for Ukraine, now nearly 10 months after the invasion. The resilience of the sentiment has surprised me.

Back in March, French President Emmanuel Macron and German Chancellor Olaf Scholz were arguing that the West should negotiate with Putin. But they’ve completely abandoned that rhetoric—and the biggest reason is the simplest: Russia has just behaved so atrociously that Europe is more unified now than it was at the outset of the war.

Also in March, there was a conflict between the United States, on the one hand, and Macron and Scholz. The U.S. was saying, We have to fight the Russians, give Ukraine weapons, and win the war. Macron and Scholz were saying, No, we have to reach out. There has to be a negotiated solution. But Putin has shown such a complete unwillingness to compromise or adjust his fundamental goals, despite multiple military losses, that he effectively ended that debate.

And then, since March we’ve seen a tremendous decoupling between Europe’s economy and Russia’s. In January, the European Union relied on Russia for more than 40 percent of its natural gas. Now that number’s down to 17 percent.

To be fair, there are elements of Ukraine fatigue, but there are also now structural reasons for long-term unity in Europe against Russia. Until recently, Russia had deep ties with the European elite. The former German chancellor Gerhard Schroeder was even on the board of Gazprom, the main Russian gas provider, until the invasion of Ukraine. Those days are over. Putin has completely broken these ties. And there’s no way Europe’s going to re-establish them—at least in the medium term.

We’re seeing a starker divide between the free and the unfree in the world today than we’ve ever seen before. During the Cold War, you had communist and anti-communist blocs. The communist world was completely autocratic, but the anti-communist world included many military dictatorships, which were propped up by democratic countries precisely because these dictatorships were Cold-War allies. Today, with Hungary and Poland, we do have autocracies on the Western side, but this is much more exceptional now than it was then.

China is much more powerful than the Soviet Union was. It has both a very powerful military and an extremely powerful economy. The Soviet Union was quite poor, but it had massive conventional forces in Europe, and that was its main source of power. Beijing has many more resources to challenge the United States with. But there are two problems: One, China doesn’t have a globalizing ideology. Two, it’s much more integrated into the global economy than the Soviet Union was—so it’s much more constrained in its behavior. There are some Cold War elements in play today, then, but there are other elements that balance out the conflict on account of mutual dependence.

In the Cold War, militaries in many non-communist countries were inherently anti-communist, because they knew they’d be the first to suffer in a communist takeover. But militaries in developing countries today are often more comfortable with a country like China—which doesn’t make human-rights demands and doesn’t interfere in domestic governance to the same extent that the West does. And this could make them more open to an alliance with Beijing.

Link to the rest at The Signal

Not PG’s usual beat, but certainly a developing situation that has occupied a great deal of PG’s attention, as it has for many others in the West.