‘The Two-Parent Privilege’ Review: Where Have All the Good Men Gone?

From The Wall Street Journal:

For families with young children, morning routines can resemble an assembly line: Make breakfast. Remind the kids to brush their teeth. Negotiate which snacks to include in their backpacks. Remind them again to brush their teeth. Look for shoes. Head out the door. Head back in the door to get the stray backpacks.

In our household, when one parent is out of town, the process seems to intensify and can feel like the “I Love Lucy” episode in which Lucy takes a job wrapping chocolates. Quickly overwhelmed by the speed of the conveyor belt, she starts shoving chocolates anywhere they’ll fit, and concludes, “I think we’re fighting a losing game.”

Over the past 50 years, the number of one-parent households in America has increased dramatically. In 2019, 57% of U.S. children lived with two parents, down from 80% in 1980. Is the rise of single-parent households an emblem of empowerment or a sign of dwindling support for children?

Discussions of parenting can be fraught, dominated by feelings over facts, and too often tinged with judgment rather than support. The problem is, in part, that there has been limited accessible evidence on the causal effect of household logistics on children’s outcomes.

Enter “The Two-Parent Privilege: How Americans Stopped Getting Married and Started Falling Behind,” Melissa Kearney’s clear-eyed look at the economic impact of having a second parent at home. Ms. Kearney is an economist at the University of Maryland; her topics of research range from the social impact of the MTV show “16 and Pregnant” to the recent Covid baby bust. As she notes, “discomfort and hesitancy have stifled public conversation on a critically important topic that has sweeping implications not just for the well-being of American children and families but for the country’s well-being.”

Ms. Kearney’s objective is twofold: first, to offer a data-driven overview of the rise and impact of single parenting; second, to propose strategies to enable more kids to live in stable households.

When it comes to the economic well-being of children, she argues, having two parents really is better than one—on average. Consider the conclusion of a 2004 paper, “Is Making Divorce Easier Bad for Children? The Long-Run Implications of Unilateral Divorce,” by the economist Jonathan Gruber. “As a result of the increased incidence of parental divorce,” Ms. Kearney tells us, “children wound up having lower levels of education, lower levels of income, and more marital churn themselves (both more marriages and more separations), as compared to similarly situated children who did not live in places where unilateral divorce laws were in effect.” Moreover, Ms. Kearney notes that children living with a stepparent also tend to have worse behavioral outcomes than those whose birth parents remained married.

While divorce is common, the spike in the number of single-parent households is mainly driven by an increase in the share of mothers who have never married—particularly among those who are less educated. In 2019, 60% of children whose mothers had a high-school degree (but less than a four-year college degree) lived with both parents, “a huge drop from the 83% who did in 1980” and low relative to the roughly 84% of children of college-educated mothers who lived with both parents in 2019. The author also notes significant gaps in family structure according to race: In 2019, 38% of black children lived with married parents, compared with 77% of white children and 88% of Asian children.

What is driving these changes? Among other factors, Ms. Kearney refers to the lack of “marriageable men,” pointing to a 2019 paper by the economists David Autor, David Dorn and Gordon Hanson, “When Work Disappears: Manufacturing Decline and the Falling Marriage Market Value of Young Men.” The paper analyzes the effect of drops in income for less-educated men, driven by increased international competition in manufacturing, and finds, Ms. Kearney tells us, that “the trade-induced reduction in men’s relative earnings led to lower levels of marriage and a higher share of unmarried mothers. It also led to an increase in the share of children living in single-mother households with below-poverty levels of income.” Reintroducing economic opportunities (for instance, through fracking booms) doesn’t seem to reverse this trend—suggesting an interplay between economic shocks and evolving social norms.

Link to the rest at The Wall Street Journal

Is It Worthwhile to Write My Memoir, Especially If a Publishing Deal Is Unlikely?

From Jane Friedman:

Question

In the eighth decade of my life and after having three books traditionally published—a travel memoir 50 years ago and two novels more recently—I am pondering the wisdom of writing a very personal memoir.

What has moved me most to think about this is the #MeToo movement: I was the victim of date rape while working as a civilian employee on an American army base in France from 1963 to 1964. While my time in France was indeed a wonderful one, a dream come true, tarnished only by this one incident, I sometimes reflect on the high percentage of women who have suffered sexual abuse, many while serving in the military. My immediate superior advised me not to report the incident, with the very real threat that the perpetrator (an officer) most likely would not be punished and that it would likely mean the loss of my job.

The memoir I am thinking of and which I have partially written is about much more than this incident; it is also about the loss of innocence and the excitement of discovering a foreign culture. It includes the story of my first true romance, an interracial affair. I was the “innocent” white girl in love with an African American enlisted man—two “no-no’s,” for I was told during my training that it was absolutely not advised to date enlisted men, but only officers, “men of a higher caliber.” Race was not mentioned but was implied by the times and by several other statements. These experiences, in addition to the opportunity I had to develop wonderful lifelong friendships with several French citizens, prompt me to want to share them in a memoir. I would like to know if this is worth my writing; would it be received well, or would you offer a caveat, to help me avoid what may be well-worn subject matter?

—Memoirist with a Dilemma

P.S. I would love to have a traditional publisher if I do finish this memoir, but in today’s world, I think it is highly unlikely I would find one interested in an octogenarian author.


Dear Memoirist with a Dilemma,

Oh my goodness, there are so many layers to this question!

I think I want to start by saying that even if #MeToo feels like it’s run its course, even if it feels like the publishing world is tired of women’s stories about rape, or maybe just tired of women’s stories or memoirs, period…I assure you, the market is not oversaturated with memoirs by women in their eighth decade.

Which, as you know, doesn’t mean there’s an easy path ahead of you. The publishing world may not be receptive to a memoir like this for any number of reasons—some of which might be valid and some of which are utter bullshit. Your age might be one of those reasons, but it’s not the only one. Publishing is a highly uncertain field with few guarantees, and the market for memoirs can be particularly uncertain.

As it happens, I’m writing this response on Labor Day, so in answering your question about the value of writing a memoir—and about the worth of writing—I do first want to acknowledge writing (and art-making, generally) as a form of labor that, like any labor, should be fairly compensated, monetarily.

That said, for better and worse, many artistic and writing projects fall largely outside the realm of capitalism. Recently, I was listening to one of the first episodes of the “Wiser Than Me” podcast, hosted by Julia Louis-Dreyfus; it’s an interview with Isabel Allende (who didn’t start writing novels until 40), who channeled Elizabeth Gilbert giving advice to young writers—which you are not, but maybe this is actually just decent advice for any writer: “Don’t expect your writing to give you fame or money, right? Because you love the process, right? And that’s the whole point, love the process.”

Which is just to say that, if you’re asking whether writing this memoir is likely to justify your time and energy, financially—well, unfortunately, that’s probably a very short response letter. It’s almost certainly not.

But that doesn’t mean you shouldn’t write it, or that writing this memoir would be unwise, in some way, or unworthy of your time and energy. The answer, here, lies in the why: Why do you want to write this memoir?

Do you love the process? Do you think you’ll feel better about the world on the average day when you’ve sat down to work on this book than on a day when you haven’t? Do you enjoy writing more than you don’t enjoy it?

If your answers to those questions are enthusiastically positive, then that’s reason enough to write.

There might be other, even more significant reasons to dive fully into this project. Writing a memoir isn’t therapeutic, per se, but writing and rewriting our personal stories can be a rewarding process, one that’s often full of (good) surprises.

In this case, you’re talking about revisiting experiences—including an assault—after 60 years; the opportunity to reshape your story and to reconsider what you make of it might be incredibly meaningful. Indeed, it sounds like you’re already doing this to some extent, inspired in part by the #MeToo movement and other people’s sharing of their stories. One of the reasons #MeToo took off was that it defused and transformed a particular kind of shame and loneliness an awful lot of women had been sitting with for too long. Perhaps you, too, have been feeling that way.

Does revisiting this time and your experiences—the many good ones as well as the bad one—and considering them from fresh and maybe unexpected angles sound appealing and useful? Again, if your answer here is an enthusiastic yes: what are you waiting for?

(This might be an unpopular opinion, but for what it’s worth, I think it’s also completely valid to say, “Nah, I don’t need to relive all that.” But I think you wouldn’t have written in with this question if that were how you felt about it.)

Ultimately, both of those reasons are sort of personal and maybe even a little self-centered. And so what if they are? After all, as Mary Oliver put it in “The Summer Day” (which she wrote at age 62), “What is it you plan to do with your one wild and precious life?” You really don’t have to please anyone but yourself.

But I also understand that writing a memoir solely for the pleasure of it might not feel entirely satisfactory, either. We want our stories to make connections, and to matter to someone, right?

Link to the rest at Jane Friedman

The Band of Debunkers Busting Bad Scientists

From The Wall Street Journal:

An award-winning Harvard Business School professor and researcher spent years exploring the reasons people lie and cheat. A trio of behavioral scientists examining a handful of her academic papers concluded her own findings were drawn from falsified data.

It was a routine takedown for the three scientists—Joe Simmons, Leif Nelson and Uri Simonsohn—who have gained academic renown for debunking published studies built on faulty or fraudulent data. They use tips, number crunching and gut instincts to uncover deception. Over the past decade, they have come to their own finding: Numbers don’t lie but people do. 

“Once you see the pattern across many different papers, it becomes like a one in quadrillion chance that there’s some benign explanation,” said Simmons, a professor at the Wharton School of the University of Pennsylvania and a member of the trio who report their work on a blog called Data Colada. 
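As a back-of-the-envelope illustration of how such a pattern compounds (our arithmetic, not the article’s): if the anomalies in each paper had, say, a one-in-a-hundred-thousand chance of arising innocently, and the papers are independent, the odds of a benign explanation multiply:

$$P(\text{all benign}) \;=\; \prod_{i=1}^{n} p_i \;\approx\; \left(10^{-5}\right)^{3} \;=\; 10^{-15},$$

which is roughly one chance in a quadrillion across just three such papers. The per-paper figure is hypothetical; the point is the multiplication.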

Simmons and his two colleagues are among a growing number of scientists in various fields around the world who moonlight as data detectives, sifting through studies published in scholarly journals for evidence of fraud. 

At least 5,500 faulty papers were retracted in 2022, compared with 119 in 2002, according to Retraction Watch, a website that keeps a tally. The jump largely reflects the investigative work of the Data Colada scientists and many other academic volunteers, said Dr. Ivan Oransky, the site’s co-founder. Their discoveries have led to embarrassing retractions, upended careers and retaliatory lawsuits. 

Neuroscientist Marc Tessier-Lavigne stepped down last month as president of Stanford University, following years of criticism about data in his published studies. Posts on PubPeer, a website where scientists dissect published studies, triggered scrutiny by the Stanford Daily. A university investigation followed, and three studies he co-wrote were retracted.

Stanford concluded that although Tessier-Lavigne didn’t personally engage in research misconduct or know about misconduct by others, he “failed to decisively and forthrightly correct mistakes in the scientific record.” Tessier-Lavigne, who remains on the faculty, declined to comment.

The hunt for misleading studies is more than academic. Flawed social-science research can lead to faulty corporate decisions about consumer behavior or misguided government rules and policies. Errant medical research risks harm to patients. Researchers in all fields can waste years and millions of dollars in grants trying to advance what turn out to be fraudulent findings.

The data detectives hope their work will keep science honest, at a time when the public’s faith in science is ebbing. The pressure to publish papers—which can yield jobs, grants, speaking engagements and seats on corporate advisory boards—pushes researchers to chase unique and interesting findings, sometimes at the expense of truth, according to Simmons and others.

“It drives me crazy that slow, good, careful science—if you do that stuff, if you do science that way, it means you publish less,” Simmons said. “Obviously, if you fake your data, you can get anything to work.”

The journal Nature this month alerted readers to questions raised about an article on the discovery of a room-temperature superconductor—a profound and far-reaching scientific finding, if true. Physicists who examined the work said the data didn’t add up. University of Rochester physicist Ranga Dias, who led the research, didn’t respond to a request for comment but has defended his work. Another paper he co-wrote was retracted in August after an investigation suggested some measurements had been fabricated or falsified. An earlier paper from Dias was retracted last year. The university is looking closely at more of his work.

Experts who examine suspect data in published studies count every retraction or correction of a faulty paper as a victory for scientific integrity and transparency. “If you think about bringing down a wall, you go brick by brick,” said Ben Mol, a physician and researcher at Monash University in Australia. He investigates clinical trials in obstetrics and gynecology. His alerts have prompted journals to retract some 100 papers, with investigations ongoing into about 70 others.

Among those looking into other scientists’ work are Elisabeth Bik, a former microbiologist who specializes in spotting manipulated photographs in molecular biology experiments, and Jennifer Byrne, a cancer researcher at the University of Sydney who helped develop software to screen papers for faulty DNA sequences that would indicate the experiments couldn’t have worked.

“If you take the sleuths out of the equation,” Oransky said, “it’s very difficult to see how most of these retractions would have happened.”

The origins of Data Colada stretch back to Princeton University in 1999. Simmons and Nelson, fellow grad-school students, played in a cover band called Gibson 5000 and a softball team called the Psychoplasmatics. Nelson and Simonsohn got to know each other in 2007, when they were faculty members in the business school at the University of California, San Diego.

The trio became friends and, in 2011, published their first joint paper, “False-Positive Psychology.” It included a satirical experiment that used accepted research methods to demonstrate that people who listened to the Beatles song “When I’m Sixty-Four” grew younger. They wanted to show how research standards could accommodate absurd findings. “They’re kind of legendary for that,” said Yoel Inbar, a psychologist at the University of Toronto Scarborough. The study became the most cited paper in the journal Psychological Science.

When the trio launched Data Colada in 2013, it became a site to air ideas about the benefits and pitfalls of statistical tools and data analyses. “The whole goal was to get a few readers and to not embarrass ourselves,” Simmons said. Over time, he said, “We have accidentally trained ourselves to see fraud.”

They co-wrote an article published in 2014 that coined the now-common academic term “p-hacking,” which describes cherry-picking data or analyses to make insignificant results look statistically credible. Their early work contributed to a shift in research methods, including the practice of sharing data so other scientists can try to replicate published work.
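To make the term concrete, here is a minimal sketch (ours, not Data Colada’s) of the pattern p-hacking describes: run many arbitrary analyses of pure noise and report only the most flattering p-value.

```python
# A minimal, hypothetical illustration of p-hacking: test 20 arbitrary
# "subgroups" of pure noise and keep only the best-looking result.
import math
import random
import statistics

def two_sample_p(a, b):
    """Crude two-sample test p-value via the normal approximation."""
    se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    z = abs(statistics.mean(a) - statistics.mean(b)) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))  # two-sided

random.seed(1)
p_values = []
for _ in range(20):  # 20 arbitrary analyses; no real effect exists
    group_a = [random.gauss(0, 1) for _ in range(30)]
    group_b = [random.gauss(0, 1) for _ in range(30)]
    p_values.append(two_sample_p(group_a, group_b))

# Reporting only the minimum makes noise look like a finding.
print(f"best of 20 analyses: p = {min(p_values):.3f}")
```

With 20 tries on data containing no real effect, the chance that at least one comparison comes out “significant” at p < 0.05 is roughly 1 − 0.95^20 ≈ 64%, which is why selective reporting is so corrosive.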

“The three of them have done an amazing job of developing new methodologies to interrogate the credibility of research,” said Brian Nosek, executive director of the Center for Open Science, a nonprofit based in Charlottesville, Va., which advocates for reliable research.

Nelson, who teaches at the Haas School of Business at the University of California, Berkeley, is described by his partners as the big-picture guy, able to zoom out of the weeds and see the broad perspective.

Simonsohn is the technical whiz, at ease with opaque statistical techniques. “It is nothing short of a superpower,” Nelson said. Simonsohn was the first to learn how to spot the fingerprints of fraud in data sets.

Working together, Simonsohn said, “feels a lot like having a computer with three core processors working in parallel.”

The men first eyeball the data to see if they make sense in the context of the research. The first study Simonsohn examined for faulty data on the blog was obvious. Participants were asked to rate an experience on a scale from zero through 10, yet the data set inexplicably had negative numbers.

Another red flag is an improbable claim—say, a study claiming a runner could sprint 100 yards in half a second. Such findings always get a second look. “You immediately know, no way,” said Simonsohn, who teaches at the Esade Business School in Barcelona, Spain. Another telltale sign is perfect data in small data sets. Real-world data is chaotic and random.
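As an illustration of this kind of first-pass screening (a hypothetical sketch, not the trio’s actual tooling), simple programmatic checks can flag both red flags described above: impossible values and implausibly tidy data.

```python
# Hypothetical first-pass sanity checks on survey responses that were
# supposed to be ratings on a 0-10 scale.
import statistics

def screen_ratings(values, lo=0, hi=10):
    """Return a list of red flags: out-of-range values, implausibly tidy data."""
    flags = []
    out_of_range = [v for v in values if not lo <= v <= hi]
    if out_of_range:  # e.g., negative numbers on a 0-10 scale
        flags.append(f"{len(out_of_range)} values outside [{lo}, {hi}]")
    if len(values) > 1 and statistics.stdev(values) < 0.5:
        flags.append("suspiciously low variance for real-world responses")
    return flags

print(screen_ratings([7, 8, -2, 5, 11, 6]))   # flags the range violations
print(screen_ratings([5, 5, 5, 5, 5, 5, 5]))  # flags data that is too perfect
```

Checks like these only raise questions, not answers; deciding between innocent error and fraud takes the kind of judgment the article describes next.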

Any one of those can trigger an examination of a paper’s underlying data. “Is it just an innocent error? Is it p-hacking?” Simmons said. “We never rush to say fraud.”

. . . .

Bad data goes undetected in academic journals largely because the publications rely on volunteer experts to ensure the quality of published work, not to detect fraud. Journals don’t have the expertise or personnel to examine underlying data for errors or deliberate manipulation, said Holden Thorp, editor in chief of the Science family of journals. 

. . . .

Thorp said he talks to Bik and other debunkers, noting that universities and other journal editors should do the same. “Nobody loves to hear from her,” he said. “But she’s usually right.”

The data sleuths have pushed journals to pay more attention to correcting the record, he said. Most have hired people to review allegations of bad data. Springer Nature, which publishes Nature and some 3,000 other journals, has a team of 20 research staffers, said Chris Graf, the company’s research integrity director, twice as many as when he took over in 2021.

Retraction Watch, which together with the research organization Crossref keeps a log of some 50,000 papers discredited over the past century, estimated that, as of 2022, about eight papers had been retracted for every 10,000 published studies.

Bik and others said it can take months or years for journals to resolve complaints about suspect studies. Of nearly 800 papers that Bik reported to 40 journals in 2014 and 2015 for running misleading images, only a third had been corrected or retracted five years later, she said.

Link to the rest at The Wall Street Journal

Bartleby and Me

From The Wall Street Journal:

Gay Talese and Frank Sinatra have enjoyed a rich, symbiotic relationship, one that has long outlasted the singer, who died at 82 a quarter-century ago. Back in 1965, Mr. Talese trailed Sinatra around Las Vegas and Hollywood for a profile for Esquire magazine. At his peak after a triumphant comeback, Sinatra brushed off the writer’s pleas for an interview, but Mr. Talese produced a piece anyway. The result, “Frank Sinatra Has a Cold,” became one of the most celebrated magazine articles from the golden age of the slicks—and an enduring testament to Sinatra’s talent and fame.

Along with Joan Didion, Norman Mailer, Tom Wolfe and others, Mr. Talese has been acclaimed as a virtuoso of the novelistic New Journalism. Now 91, he has published a short and charming second memoir, “Bartleby and Me: Reflections of an Old Scrivener.” Once again, Sinatra takes center stage. But there’s more, especially the author’s take on the kind of journalism he’s practiced for seven decades, starting as a copy boy at the New York Times in 1953.

Mr. Talese takes his inspiration—and his title—from “Bartleby, the Scrivener,” Herman Melville’s 1853 short story about an inconsequential law clerk. “Growing up in a small town on the Jersey Shore in the late 1940s, I dreamed of someday working for a great newspaper,” Mr. Talese writes. “But I did not necessarily want to write news. . . . I wanted to specialize in writing about nobodies.”

His first published piece, carried without a byline on the Times’s editorial page, was about a “nobody” who operated the illuminated ribbon sign that announced the latest news around a lower floor of the old Times Tower in Times Square—a Bartleby for the age of Ike.

Thankfully for magazine journalism, Mr. Talese eventually overcame his original preoccupation, but before he did so he chronicled alley cats, bus drivers, ferry-boat captains, dress-mannequin designers, even those who pushed the three-wheeled rolling chairs along Atlantic City’s boardwalk. After two years of military service at Fort Knox—during which he contributed pieces to the Louisville Courier-Journal—he returned to the Times as a sports writer. (As a college correspondent for the Times in the late ’50s, I sometimes squatted at an empty desk near his in the uncrowded sports department.)

In 1965 Mr. Talese left the paper to join Esquire, then in its glory days under the brilliant editor Harold Hayes. The young writer promptly sold Hayes on a series of profiles of figures at the Times, both obscure and heralded, starting with Alden Whitman. Whitman had revolutionized obituaries at the paper by conducting long premortem interviews with Harry Truman, Pablo Picasso and other luminaries. The lauded “Mr. Bad News” piece helped lay the groundwork for “The Kingdom and the Power,” Mr. Talese’s 1969 book about the Times—his first bestseller.

Bartleby’s murmurous response to the world was “I prefer not to,” while Sinatra famously belted out “I did it my way.” Still, the young Talese was drawn to him.

Fully a third of “Bartleby and Me” is a reconstruction of Mr. Talese’s frustrated pursuit of Sinatra—from his first glimpse of his lonely subject nursing a Jack Daniel’s at the bar of the Hollywood hangout The Daisy, to watching him pick a fight with a young writer because Sinatra didn’t like his boots, and at a recording session after an earlier one was aborted because the crooner had the sniffles. Sinatra genially blows off Mr. Talese’s requests to talk, so the writer interviews Sinatra’s entourage, including his sort-of-look-alike stand-in, as well as the little old lady who totes around his hairpieces, and his daughter Nancy. Mr. Talese even describes how he took his Sinatra notes on cut-down laundered-shirt cardboards.

The 14,000-word cover story ran in the April 1966 issue, was later published as a short book and, on the 70th anniversary of Esquire, was voted by its editors and staff the best piece ever to run in the magazine.

Link to the rest at The Wall Street Journal

How Did He Get Away With It?

From The City Journal:

Long before the rest of us were talking about blue and red America, Tom Wolfe not only recognized the cultural divide; he bridged it. When he began his career in the 1960s, the liberal establishment was more dominant and even smugger than it is today. There were no pesky voices on cable television or the web to challenge the Eastern elites’ hold on the national media. Then along came Wolfe, a lone voice celebrating the hinterland’s culture, mercilessly skewering the pretensions and dogmas of New York’s intelligentsia—and somehow triumphing.

How did he get away with it? The most entertaining analysis opens in theaters this weekend in New York and next weekend in Los Angeles and Toronto. The documentary, Radical Wolfe, is a superb chronicle of his life and career, told through footage of Wolfe (who died in 2018 at the age of 88) expounding in his famous white suits. It features the actor Jon Hamm reading from Wolfe’s work along with interviews with his friends and enemies, his daughter, Alexandra Wolfe, and his fans, including Christopher Buckley, Niall Ferguson, Gay Talese, and Peter Thiel. Director Richard Dewey draws on the insights and research of Michael Lewis, who pored through the archive of Wolfe’s letters and papers for a 2015 article in Vanity Fair, “How Tom Wolfe Became . . . Tom Wolfe.”

Wolfe grew up in Richmond, Virginia, and remained true to his roots when he went north. In private, he remained the quiet, courtly Southern gentleman, the perpetual outsider gently bemused by the Yankees’ tribal beliefs and customs. “He was a contradictory character,” Talese observes in the film. “Such a polite person, such a well-mannered person. With a pen in his hand, he could be a terrorist.”

After getting a Ph.D. in American Studies at Yale, Wolfe worked as a fairly conventional reporter and feature writer at the Springfield Union in Massachusetts, the Washington Post, and the New York Herald Tribune. His breakthrough came during the New York newspaper strike of 1962–63. Needing money to pay his bills, he took an assignment from Esquire to write about custom-car culture in southern California, which fascinated him but left him with a severe case of writer’s block, as related in the film by Wolfe and Byron Dobell, his editor at Esquire.

With the deadline looming and a color photo spread already printed for the upcoming issue, Wolfe told his editor that he just couldn’t write the piece. Dobell told him to type up his notes so another writer could put some words next to the photo spread. Wolfe began typing that evening, stayed up all night, and delivered a 49-page letter to Dobell the next morning. The editor’s reaction: “It’s a masterpiece. This is unbelievable. We’d never seen anything like this. I struck out the ‘Dear Byron’ and struck out the parting words, and we ran it.” The headline was appropriately Wolfean: “There Goes (Varoom! Varoom!) That Kandy-Kolored (Thphhhhhh!) Tangerine-Flake Streamline Baby (Rahghhh!) Around the Bend (Brummmmmmmmmmmmmmm) . . .”

Wolfe went on writing in his inimitable voice for the Herald Tribune Sunday supplement, which would be reincarnated as the independent New York magazine. His prose style—exclamation points, ellipses, long sentences, and streams of consciousness—appalled the high priests of the literary world, particularly when he turned it against their temple, the New Yorker. Other writers in the 1960s dreamed of being published in the magazine, but Wolfe wrote a two-part series savaging it as a moribund institution. The first piece in the series was titled “Tiny Mummies! The True Story of the Ruler of 43rd Street’s Land of the Walking Dead!” The second article, “Lost in the Whichy Thicket,” mocked its plodding articles, with their cluttered subordinate clauses and understated pseudo-British tone. He dismissed its fiction as a “laughingstock” that kept the magazine in business by serving as filler between pages of luxury ads aimed at suburban women:

Usually the stories are by women, and they recall their childhoods or domestic animals they have owned. Often they are by men, however, and they meditate over their wives and their little children with what used to be called “inchoate longings” for something else. The scene is some vague exurb or country place or summer place, something of the sort, cast in the mental atmosphere of tea cozies, fringe shawls, Morris chairs, glowing coals, wooden porches, frost on the pump handle, Papa out back in the wood bin, leaves falling, buds opening, bird-watcher types of birds, tufted grackles and things, singing, hearts rising and falling, but not far—in short, a great lily-of-the-valley vat full of what Lenin called “bourgeois sentimentality.”

The empire struck back. The novelist J.D. Salinger emerged from seclusion to declare that the Herald Tribune would “likely never again stand for anything either respect-worthy or honorable” after Wolfe’s “inaccurate and sub-collegiate and gleeful and unrelievedly poisonous” attack on the New Yorker. There were more denunciations from the writers E.B. White, Ved Mehta, Muriel Spark, Murray Kempton, and from the syndicated columnists Joseph Alsop and Walter Lippmann.

. . . .

Literary critics sneered at his work, but his nonfiction books and novels were best-sellers that changed the national conversation. His coinages entered common usage—the astronauts’ “Right Stuff,” Wall Street’s “Masters of the Universe,” Park Avenue’s “Social X-Rays.” He identified lowbrow “statuspheres” across America and made heroes out of the stock-car racer Junior Johnson and the fighter ace and test pilot Chuck Yeager. While doomsaying journalists and intellectuals were decrying American culture and modern technology, he declared that we were experiencing a “happiness explosion” and explained, “It’s only really Eng. Lit. intellectuals and Krishna groovies who try to despise the machine in America. The idea that we’re trapped by machines is a 19th-century romanticism invented by marvelous old frauds like Thoreau and William Morris.”

Link to the rest at The City Journal

PG was a huge fan of Wolfe’s writing style a very long time ago. The OP made him realize he should reread some of Wolfe’s books.

The British empire peaked 100 years ago this month

From The Economist:

The British empire was and is many things to many people: a civilising endeavour, a bringer of peace, an exploitative force or a project based on white supremacy. Arguments exist for each characterisation. But there is one thing that the British empire is not: completely over.

It lives on in court cases, including one brought in 2019 by indigenous people of the Chagos Islands, whom the British colonial government forcibly relocated between 1965 and 1973 (with American support). It exists in the loyalties of the 15 Commonwealth “realms”, including Australia and Canada, of which King Charles III is monarch and head of state. And it lives on in the demographic make-up of Britain, where one in five people is Asian, black or mixed race. (A similar share of cabinet ministers, including the prime minister, are children of immigrants from the former empire.) As the old saying goes, “We are here because you were there.”

Two new books consider the “here” and “there”. “One Fine Day” is a sprawling account of the British empire by Matthew Parker, a historian. It travels like the never-setting imperial sun across Asia, Africa and outposts of the “new world” in the Caribbean. The book’s organising principle is a day—September 29th 1923—when the British empire reached its maximum territorial extent. The portrait is achieved with a wide-angled lens, but the choice of a single day also brings focus.

Mr Parker’s approach is to find the most interesting currents in the empire’s various corners in September 1923 and to tell them through little-remembered colonial administrators and prominent locals. For example, in what was then Malaya (modern-day Malaysia and its surrounds) readers meet Hugh Clifford, who learnt Malay and fell in love with the country and its people. He was self-aware enough to wonder whether “the boot of the white man” had stamped out the best parts of local culture. Yet Clifford was also responsible, at the age of just 22, for adding 15,000 square miles of territory to the empire and described Malays as “the cattle of mankind”.

In colonies across continents, elites were disillusioned with the obvious hypocrisy of foreign rulers, while foot-soldiers such as George Orwell found themselves uneasy with the violence of colonial rule. What emerges is a picture of an empire straining under the weight of its own contradictions. The British thought of their role as an enlightened one: stopping tribal warfare and introducing modern health care and education. Yet they brought forced labour and colonial massacres, racist rules, and substandard health care and education. Rather than stating this baldly, Mr Parker makes the point through copious examples and meticulous research. He appears to have read the front page of every newspaper published in the empire on that day.

. . . .

“Imperial Island” by Charlotte Lydia Riley, a historian at the University of Southampton, is half the length and better organised. Starting with the contributions of the empire’s troops in the second world war and the meagre thanks (or even acknowledgment) given to them afterwards, she runs through headline events of post-war British history.

Yet to call this an imperial history is misleading. The book reads more like a history of race relations in modern Britain, and the links to empire often feel forced. Fundraising for a famine in Ethiopia reveals, in Ms Riley’s telling, a guilt-ridden imperial hangover, as do children’s books about India and cookbooks with dishes from around the world. A map of countries where an overseas volunteer organisation operates is—what else?—a throwback to the British empire’s pink map.

This is a shame, because a book that lived up to the promise of Ms Riley’s title would have been revealing and important. The legacy of colonialism, like the empire itself, is riddled with contradictions. It is impossible to attempt to understand Britain today without wrestling with ambiguities. Yes, children of immigrants in Britain carried out the tube bombings in 2005, sparking a national reckoning over homegrown extremism, as Ms Riley describes over several pages; but another child of immigrants, Rishi Sunak, ascended to the highest echelons of government and is not mentioned by Ms Riley. The Brexit campaign to leave the European Union was based on the paradoxical promises of keeping foreigners out while opening up to the former empire. It deserves more careful examination than the meagre four pages Ms Riley devotes to it.

Link to the rest at The Economist

The importance of handwriting is becoming better understood

From The Economist:

Two and a half millennia ago, Socrates complained that writing would harm students. With a way to store ideas permanently and externally, they would no longer need to memorise. It is tempting to dismiss him as an old man complaining about change. Socrates did not have a stack of peer-reviewed science to make his case about the usefulness of learning concepts by heart.

Today a different debate is raging about the dangers of another technology—computers—and the typing people do on them. As primary-school pupils and PhD hopefuls return for a new school year in the northern hemisphere, many will do so with a greater-than-ever reliance on computers to take notes and write papers. Some parents of younger students are dismayed that their children are not just encouraged but required to tote laptops to class. University professors complain of rampant distraction in classrooms, with students reading and messaging instead of listening to lectures.

A line of research shows the benefits of an “innovation” that predates computers: handwriting. Studies have found that writing on paper can improve everything from recalling a random series of words to imparting a better conceptual grasp of complicated ideas.

For learning material by rote, from the shapes of letters to the quirks of English spelling, the benefits of using a pen or pencil lie in how the motor and sensory memory of putting words on paper reinforces that material. The arrangement of squiggles on a page feeds into visual memory: people might remember a word they wrote down in French class as being at the bottom-left on a page, par exemple.

One of the best-demonstrated advantages of writing by hand seems to be in superior note-taking. In a study from 2014 by Pam Mueller and Danny Oppenheimer, students who typed wrote down almost twice as many words and more passages verbatim from lectures, suggesting they were not understanding so much as rapidly copying the material.

Handwriting—which takes longer for nearly all university-level students—forces note-takers to synthesise ideas into their own words. This aids conceptual understanding at the moment of writing. But those taking notes by hand also perform better on tests when students are later able to study from their notes. The effect even persisted when the students who typed were explicitly instructed to rephrase the material in their own words. The instruction was “completely ineffective” at reducing verbatim note-taking, the researchers note: they did not understand the material so much as parrot it.

Many studies have confirmed handwriting’s benefits, and policymakers have taken note. Though America’s “Common Core” curriculum from 2010 does not require handwriting instruction past first grade (roughly age six), about half the states since then have mandated more teaching of it, thanks to campaigning by researchers and handwriting supporters. In Sweden there is a push for more handwriting and printed books and fewer devices. England’s national curriculum already prescribes teaching the rudiments of cursive by age seven.

However, several school systems in America have gone so far as to ban most laptops. This is too extreme. Some students have disabilities that make handwriting especially hard. Nearly all will eventually need typing skills. And typing can improve the quality of writing: being able to get ideas down quickly, before they are forgotten, can obviously be beneficial. So can slowing down the speed of typing, says Dr Oppenheimer.

Virginia Berninger, emeritus professor of psychology at the University of Washington, is a longtime advocate of handwriting. But she is not a purist; she says there are research-tested benefits for “manuscript” print-style writing, for cursive (which allows greater speed) but also for typing (which is good practice for composing passages). Since students spend more time on devices as they age, she argues for occasional “tuning up” of handwriting in later school years.

Link to the rest at The Economist

PG admits that he has been using a keyboard to create letters, words and paragraphs for so long, his handwriting has degenerated to the point that he doubts anyone else could read it. And, sometimes PG has trouble reading what he has hand-written a day or two previously.

The Hallmarks of a Bad Argument

From Jane Friedman:

We need more debate.

I’m a firm believer that one of the biggest issues in our society—especially politically—is that people who disagree spend a lot less time talking to each other than they should.

Earlier in June, I wrote about how the two major political candidates are dodging debates. The next week I wrote about how a well-known scientist (or someone like him) should actually engage Robert F. Kennedy Jr. on his views about vaccines.

In both cases, I received a lot of pushback. There are, simply put, many millions of Americans who believe that some minority views—whether it’s that the 2020 election was stolen or vaccines can be dangerous or climate change is going to imminently destroy the planet or there are more than two genders—are not worth discussing with the many people who hold those viewpoints. Many of these same people believe influential voices are not worth airing and are too dangerous to be on podcasts or in public libraries or in front of our children.

On the whole, I think this is wrongheaded. I’ve written a lot about why. But something I hadn’t considered is that people are skeptical about the value of debate because there are so many dishonest ways to have a debate. People aren’t so much afraid of a good idea losing to a bad idea; they are afraid that, because of bad-faith tactics, reasonable people will be fooled into believing the bad idea.

Because of that, I thought it would be a good idea to talk about all the dishonest ways of making arguments.

The nature of this job means not only that I get to immerse myself in politics, data, punditry, and research, but that I get a chance to develop a keen understanding of how people argue—for better, and for worse.

Let me give you an example.

Recently, we covered Donald Trump’s fourth indictment, when a grand jury in Fulton County, Georgia, indicted the former president and 18 others over allegations of a sprawling conspiracy to overturn Joe Biden’s election victory in the state. As usual, we got some feedback and criticism from our readers—which we welcome. A couple of people asked why Hillary Clinton isn’t also getting indicted, since she also has disputed that she lost a fair and open election.

This, of course, got me talking about the differences in these cases. Clinton conceded the election the night it was called; Trump didn’t. Clinton’s supporters didn’t swarm the Capitol hoping to overturn the results; Trump’s supporters did, and he—as president—stayed silent while they did.

Then we started having a conversation about what Hillary Clinton did do. She did say that the election was illegitimate and that Russia tampered with it, and she continues to say so. She did use a private email server…

And now the topic of conversation has changed, from “did Trump deserve to be indicted” to “should Hillary Clinton have been indicted?”

This is an example of “whataboutism,” where the person you’re talking to or arguing with asks you about a different but similar circumstance, and in doing so changes the subject.

Whataboutism

This is probably the argument style I get from readers the most often. There is a good chance you are familiar with it. This argument is usually signaled by the phrase, “What about…?” For instance, anytime I write about Hunter Biden’s shady business deals, someone writes in and says, “What about the Trump children?” My answer is usually, “They also have some shady deals.”

The curse of whataboutism is that we can often do it forever. If you want to talk about White House nepotism, it’d take weeks (or years) to properly adjudicate all the instances in American history, and it would get us nowhere but to excuse the behavior of our own team. That is, of course, typically how this tactic is employed. Liberals aren’t invoking Jared Kushner to make the case that profiting off your family’s time in the White House is okay, they are doing it to excuse the sins of their preferred president’s kid—to make the case that it isn’t that bad, isn’t uncommon, or isn’t worth addressing until the other person gets held accountable first.

Now, there are times when this kind of argument is useful, and sometimes even enlightening. If we are truly asking where the line for prosecutable conduct is for a presidential candidate, it’s useful to find precedent and see where it is being applied inconsistently. If we’re asking “is the government consistent,” comparing Clinton, Trump, Biden, Nixon—it’s all on the table. The same is true if we’re asking about the bias of media organizations, and seeing if they cover similar stories differently, if the subject of the story is the major element that’s changing.

But if I write a story that says your favorite political candidate answered a question in a very poor way, and you respond by saying, “Well, this other politician said something bad, too—I think even worse. What about that statement?” That wouldn’t be helpful, or enlightening.

Furthermore, context is important. If I’m writing about Hunter Biden’s business deals I may reference how other similar situations were addressed or spoken about in the past. But when the topic of discussion is whether one person’s behavior was bad, saying that someone else did something bad does nothing to address the subject at hand. It just changes the subject.

Link to the rest at Jane Friedman

The OP is more political than PG’s usual selections. He requests that any disagreements in the comments not involve any personal attacks against anyone, candidate or not.

Blenheim’s £5M Gold Toilet Heist: 7 Suspects Await Legal Outcome

From Culture.org:

The Case That Flushed Four Years Down the Drain

Ladies and gentlemen, grab your metaphorical plungers because we’re about to dive into the long and winding pipe of the infamous £5 million golden toilet heist at Blenheim Palace. For four years, this crime has mystified investigators, and its audacity has shocked the art world. Now, finally, we might be on the brink of flushing out the truth.

Back in 2019, Italian artist Maurizio Cattelan unveiled an art piece that, well, dazzled in the literal sense. It was an 18-carat gold toilet entitled ‘America,’ initially exhibited at the Guggenheim Museum in New York. Let’s be honest: this toilet was no ordinary John; it was a symbol of opulence, of irony, and it attracted a whopping 100,000 people in New York eager to, ahem, experience it.

The golden spectacle was then relocated to Blenheim Palace in Oxfordshire, strategically placed in a chamber opposite the room where the British Bulldog himself, Winston Churchill, was born. But before anyone could say “seat’s taken,” it was stolen in a high-stakes heist on September 14, 2019, just a day after its grand UK unveiling.

The Hurdles and Whirlpools of a Baffling Investigation

Despite the passage of four years and the arrest of seven suspects—six men aged between 36 and 68, and one 38-year-old woman—the investigative waters have been murky. Not a single charge has been filed. Until now, that is. It seems the cogs of justice are finally turning.

The Thames Valley Police recently submitted a comprehensive file of evidence to the Crown Prosecution Service (CPS), the organization responsible for pulling the flush, so to speak, on any charges. This move significantly raises the possibility that the seven suspects could soon find themselves in deep water.

Maurizio Cattelan, the mastermind behind the £4.8 million toilet—yes, let’s not forget the £200,000 difference—initially took the theft with a grain of artistic humor. “Who’s so stupid to steal a toilet? ‘America’ was the one percent for the 99 percent,” he mused. But as everyone knows, stealing art is no laughing matter, especially when it’s a toilet that takes on the American Dream, as the Palace’s chief executive Dominic Hare pointed out. The theft didn’t just rob a stately home; it flushed a cultural commentary down the drain.

What Happened to the Golden Throne?

As curious as it sounds, investigators believe the golden toilet was melted down and transformed into jewelry. While not confirmed, this adds another layer of irony to the story—turning an art piece designed for the “99 percent” into an elite object once again, only this time in the form of necklaces and rings.

Link to the rest at Culture.org

PG says the author should have limited herself to fewer garderobe puns.

History, Fast and Slow

From The Chronicle of Higher Education:

Historians! Put down your tools! Your labors are at an end. The scientists have finally solved history, turning it from a jumble of haphazard facts (“just one damn thing after another”), into something measurable, testable and — most importantly — predictable.

That, in short, is the message of Peter Turchin’s provocative new book, End Times: Elites, Counter-Elites, and the Path of Political Disintegration. A theoretical biologist by training, Turchin came to the study of history late, after decades spent developing mathematical models for interactions between predators and prey. At some point he realized those same models could be applied to the boom-and-bust cycles governing the fortunes of states and empires.
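The review doesn’t spell those models out, but the textbook predator-prey system in that tradition is the Lotka-Volterra pair of differential equations, whose solutions oscillate in endless boom-and-bust cycles (that Turchin’s models took exactly this form is our assumption, not the reviewer’s claim):

$$\frac{dx}{dt} = \alpha x - \beta x y, \qquad \frac{dy}{dt} = \delta x y - \gamma y,$$

where $x$ is the prey population, $y$ the predator population, and $\alpha, \beta, \gamma, \delta$ are positive rate constants. Prey booms feed predator booms, which crash the prey, which starves the predators, and the cycle repeats.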

Blessed with tenure, Turchin thus made a risky midcareer move to a different field. Or rather, with the historical sociologist Jack Goldstone, he co-founded a brand new one, which they dubbed cliodynamics. This is meant to be a “new science of history” that exposes the hidden processes driving political instability in “all complex societies.” End Times is at once a primer on this “flourishing” new field, which has attracted considerable attention in recent years, and a direct application of it to the political landscape of the United States as it appeared in the last quarter of 2022.

Turchin promises at the outset of the book that, by treating history itself as “Big Data,” he and his collaborators can explain why everything that happened in the past happened, and tell us, with reasonable certainty, what will happen next. Tackling the past, present, and future is a big job. As a consequence, the pace of End Times is brisk, the arguments flashy, and the conclusions, for the most part, unsatisfying.

The overall thesis of Turchin’s book is disarmingly simple. History is shaped by interaction between the elites and the masses. When these two groups are in equilibrium, harmony reigns. But when too many people are vying for elite status all at once, things go out of whack, and instability becomes inevitable. At this point, there are only two real ways to bring the system back in line. The first is to turn off the “wealth pump,” Turchin’s term for whatever economic mechanism — technological change, tax policy, or even agricultural overpopulation in the case of medieval France — is at work in a given moment enriching the elites and depressing the relative wages of the masses. The second is for the elites to physically eliminate one another in a revolution or civil war.

Elite overproduction is thus for Turchin what class conflict was for Marx and asabiyyah, or group cohesion, was for Ibn Khaldun: It is the engine driving history forward. However, over the course of End Times, Turchin never pins down what exactly defines an elite, or whether they come in different or competing forms. Are cultural elites, for instance, distinct from economic elites, or political elites from religious elites? His explanation that elites are “power holders” is not so much clarifying as circular. For Turchin, elites — whether they are English barons, Russian serf owners, or Southern planters — are a natural fact, like gravity or the rain.

Some elites are more dangerous than others, however. Unemployed degree holders, according to Turchin, have been responsible for most social upheavals reaching back to the Revolutions of 1848. The law, a magnet for the politically ambitious, has been a particularly hazardous profession; as Turchin notes, Robespierre, Lenin, Castro, Lincoln, and Gandhi were all lawyers. In his telling, elite colleges have become factories for the creation of counter-elites, waiting to destabilize existing institutions and usurp the role of existing elites. The most dangerous of these by far is Yale Law School, that “forge of revolutionary cadres,” having produced both left-wingers like Chesa Boudin and right-wingers like Stewart Rhodes, the leader of the Oath Keepers.

Even if the remark about “revolutionary cadres” is made in jest, it is one of many moments in End Times that leads a reader to question the objectivity of Turchin’s take on American politics. Is the Republican Party really in the process of becoming a “true revolutionary party”? Maybe if you see Steve Bannon as its Lenin. Did “The Establishment” run a “counterinsurgency campaign” to get Trump out of office in 2020? Only if you think the election was rigged.

Some of the problem comes from his sources. Turchin intersperses the book with interviews with voters drawn from different rungs on the American social ladder. There is Steve, a blue-collar guy who thinks liberal elites are driving America into the ground; Kathryn, a Beltway 1-percenter who reads Steven Pinker to reassure herself that life has never been better than it is right now; and Jane, an Upper East Side Trotskyite hoping to bring the system down from the inside — which is why she’s a student at Yale Law.

The problem is that all of these figures are fictional, products of Turchin’s limited political imagination. Shorn of real-life interlocutors, his account of contemporary politics feels like the generic product of an ideological echo-chamber. This has an unfortunate effect on End Times’s predictions. Some of Turchin’s forecasts have already been disproved. Will Tucker Carlson be the “crystallization nucleus” for the formation of a genuinely insurgent, anti-establishmentarian Republican Party? Even a year ago, when End Times was published, that seemed like a stretch, but after Carlson’s firing by Fox News this past April, it seems about equal to my hopes of becoming starting quarterback for the Steelers.

Other predictions in End Times are more ominous, and even harder to credit. Turchin’s historical models predict that America will go through a spasm of political violence in the 2020s bad enough to thin the herd of elite aspirants and thus restore political cohesion, only for the violence to recur in 50 to 60 years. Only if wages can be brought up in the near term can this recurrence of violence be avoided. However, even if this hypothetical New New Deal were to be implemented, it would not prevent a major internal crisis sometime later this decade.

Is America really on the cusp of a second Civil War? Perhaps, perhaps not, but no concrete piece of evidence presented in End Times would lead you to think so. Turchin has valuable things to say about rising inequality in the United States. But the connection between elite enrichment and popular rebellion is neither reliable, nor predictable — least of all in democracies.

In the absence of clearly drawn historical mechanisms, we have to trust Turchin’s models. But he never lets us see under the hood. For an approach to history that prides itself so much on quantitative rigor, cliodynamics — at least as presented in this book — seems strikingly low on actual data. Not a single graph or chart graces the pages of End Times. A pair of appendices do promise to explain the detailed workings of the databases and computations underlying Turchin’s predictions and pronouncements. But while this annex features a variety of things, including ruminations on Tolstoy, scenes from a science-fiction novel, and a rather charming thought experiment about social scientists orbiting Alpha Centauri, it does not clarify the inner workings of cliodynamics. All it offers are generalities along the lines of “one death is a tragedy … but a thousand deaths give us data” without ever explaining what it is that makes such data useful.

. . . .

Turchin is singularly ungenerous to professional historians. He rarely cites the work of scholars in the field, preferring to rely on his own summaries of major events or Wikipedia. Instead of crediting major historiographical concepts, such as the Military Revolution of the 17th century, to their original articulators, he refers readers to his own (often, forthcoming) books. At one point, Turchin even expresses his regret that so much precious historical knowledge is trapped in books and articles and — worst of all — in the “heads of individual scholars,” and fantasizes about a future in which spiderbots could automate the process of learning and harvest information directly from experts’ brains. One gets the feeling reading End Times that Turchin would like to do away with the messy business of human analysis and judgment entirely: all of the things that make history, from his perspective, such a frustratingly inefficient discipline.

Link to the rest at The Chronicle of Higher Education

The Education of Henry Adams

From The Wall Street Journal:

A foundational text for every American Studies program, and one of the most original books published by an American author, “The Education of Henry Adams (An Autobiography)” ought to be read by educated citizens twice: first for its insight into contemporary history and then for what it reveals about the nature of education. Written in 1905, privately printed the next year, and published in 1918, the book owes its importance to its innovative form and prophetic content. A work startlingly idiosyncratic and unprecedented in the genre, its only forebear was Benjamin Franklin’s “Autobiography” (1793), which Adams called a “model of self-teaching.”

Adams (1838-1918) would demonstrate that writing autobiography was a fusion of record and reflection. In a radical decision, he wrote about himself in the third person, a detachment that allowed him to make arbitrary additions or deletions. The best known of the latter was in the center of the book, where he eliminated mention of many years in his life. In 1885 his wife, Marian “Clover” Hooper Adams, had died by suicide. The shock led Adams to suppress any record of the event and its aftermath—and, for that matter, any reference at all to his spouse. He destroyed his diaries of the period and all correspondence between them. He resumed his narrative only after several years of silence. His life’s education would be the only compensation for his loss.

Adams was born in Boston, a direct descendant of two presidents and son of a diplomat. He would end up building a house across Lafayette Square from the White House, and becoming a dominant intellectual figure in American life.

Much of the first half of the book is about his disappointment with, and the failure of, his education. Of his early schooling, he recalled, “In any and all its forms the boy detested school.” From the beginning he felt his learning was inadequate. Of Harvard, which he entered in 1854, he concluded, “The education he had received bore little relation to the education he needed.”

There followed travel to Europe’s major cities of learning, but in Berlin “the experience was hazardous,” and in Rome he found “no education.” In England in 1868, he observed that the typical educated statesman “did not know what he was talking about.”

Back in Washington, Adams noted that the capital was “a poor place for education.” He summarized that “he had been mistaken at every point of his education.” But this also turned out to be a moment of revelation and advance, expressed in some of his memorable sentences: “He knew he knew nothing” and “He knew enough to be ignorant.” “Then he asked questions. . . . Probably this was education.” For Adams, even self-examination was painful: “He knew no tragedy so heartrending as introspection.”

In fact, he had learned the primacy of self-teaching—that it was participatory, not receptive; dynamic rather than passive. The second half of his book would engage with the great issues of his time as he envisioned them, and often these were startlingly prophetic.

Visits to the two major expositions of his era helped to reshape his thinking. The 1893 World’s Columbian Exposition in Chicago gave him an image of American power, and the 1900 Universal Exposition in Paris presented him with the new dynamos on display—symbols, he found, of the energy of the modern age. This led him to conceive of a dynamic theory of history, one that was in constant motion and in which all facts were relative. At last, “he was getting an education. . . . The new forces would educate.”

Having previously written a book about the architecture of Mont Saint-Michel and Chartres and the power of the Virgin as the embodiment of 13th-century unity, he could contrast the present as a study of 20th-century multiplicity—the absence of absolutes. The language of science explained the manifestations of modernism, as he took note of the new research on radium, the X-ray and the atom. The idea of magnetism in particular required an understanding of relationships, rather than one fixed position.

Adams was an elegant literary stylist, and he frequently crafted sentences of paired phrases, joined by a central semicolon. One of the most famous in “The Education” was “In plain words, Chaos was the law of nature; Order was the dream of man.” On the one hand, this was like the structure of a Gothic arch, two parts leaning together with a keystone at its apex. But it also conveyed a contemporary notion of balanced opposites.

Politically he was prescient in his comments on Russia, Germany and China. He wrote, “The vast force of inertia known as China was to be united with the huge bulk of Russia in a single mass which no amount of new force could henceforward deflect.” In turn, the difficult step was to “bring Germany into the combine.”

Link to the rest at The Wall Street Journal

The Opening of the Protestant Mind

From The Wall Street Journal:

“Evangelicals,” or born-again Protestants—Christians who believe in converting non-Christians to their faith—haven’t had a lot of great press of late. The mainstream media all but blamed them for the 2016 election of Donald Trump. Going further back are evangelical ties to the Moral Majority and the religious right. Evangelicals in both politics and religion have a reputation for intolerance. They may have earned it: In 2017, the Pew Research Center found that evangelicals, more than any other Christian group, viewed Hindus, Buddhists, Mormons, atheists and Muslims unfavorably.

Mark Valeri’s “The Opening of the Protestant Mind” isn’t about 21st-century America, but his exploration of born-again Protestantism’s historical roots upends assumptions about religious conversion. Instead of making Christians intolerant, coming to faith by conversion historically went hand in hand with reasonableness, civility and religious toleration. Most readers would likely assume the opposite: If you believe other people need to convert to your faith from theirs, chances are you won’t give them much of a hearing. In Mr. Valeri’s interpretation, though, a figure like Jonathan Edwards, famous for preaching “Sinners in the Hands of an Angry God,” may have been among the most liberal and progressive thinkers in the Anglophone world.

Mr. Valeri’s narrative of Anglo-American Protestantism between 1650 and 1750 is not as oxymoronic as it initially sounds. The Protestant outlook that prevailed in English society at the time of Charles I’s execution in 1649 assumed that the “health of the state depended on a . . . religious confession” to supply social coherence. For the rest of the 17th century, when English writers (including American colonists) encountered non-Christians, they saw “illegitimate, dangerous, and demonic” religions.

But after the Glorious Revolution in 1688, in which the threat of a Catholic monarch was decisively ended, English Protestants began to distinguish “loyalty to the kingdom” from “conformity to any one creed.” English writers—some zealous Protestants, others philosophically inclined—“minimized theological orthodoxy” as a requirement for social standing. Not only did these authors discover “republican ideas of toleration and moral virtue” in other religions; they also revised Christianity. Conversion became the path to faith not by submission to dogma but by persuasion. This shift aided the ascendant Whigs in governing a diverse religious constituency. It also prompted Protestants to regard conversion (and awakenings) as the mark of true faith.

Mr. Valeri, a professor of religion and politics at Washington University in St. Louis, refers to a variety of authors well known and obscure. In the case of John Locke’s “Letter Concerning Toleration” (1689), the key to civil liberty was separating the purpose of the state from that of the church. If religious controversy threatened public order, the state should intervene. Otherwise government should leave religious groups to themselves. Locke’s outlook extended to Native Americans: “If they believe that they please God and are saved by the rites of their forefathers, they should be left to themselves.” Jonathan Edwards, who for a time served as a missionary to Native Americans, echoed Locke. Although an advocate for the First Great Awakening, Edwards regarded the Mohawks and English people as spiritual equals because they shared the same sinful human nature. For that reason, Edwards considered acculturating Native Americans to Anglo-American conventions unimportant compared with converting them through persuasive preaching.

The 1733 English publication of “The Ceremonies and Religious Customs of the Various Nations of the Known World” is Mr. Valeri’s best example. Compiled by two French Protestants, Bernard Picart and Jean Frédéric Bernard, this popular book reinforced the ideal of conversion. Especially appealing to Anglo-American Protestants was the French catalogers’ contention that ceremonial religion represented an illegitimate “alliance between priests and secular rulers who persecuted religious dissenters.” Ritualized Christianity went hand-in-hand with imperial ambition and produced “uncivil, unreasonable, and coercive” religion.

Link to the rest at The Wall Street Journal

PG notes that there were periods in history (including the times discussed in the book described in the OP) when religion had the power to impel men and women to many different actions.

The Crusades were a series of religious wars between Christians and Muslims that started primarily to secure control of holy sites considered sacred by both groups. In all, eight major Crusade expeditions—varying in size, strength and degree of success—occurred between 1096 and 1291.

The Muslim conquests were a military expansion on an unprecedented scale, beginning in the lifetime of Muhammad and spanning the centuries down to the Ottoman wars in Europe.

PG compares the general power of religion to cause humans to expend substantial amounts of money and blood in ancient times with the far more subtle influences of religion in the 21st Century. PG is certainly pleased that there have been no major religious wars lately, but he also wonders whether religion, taken as a whole, occupies a smaller place in the minds of humanity in today’s world.

PG asks that comments to this post not insult any particular religion or its adherents.

Lots of people mourn when famous writers and musicians die. Why?

From The Economist:

After Alexander Pushkin was shot in a duel in 1837, crowds of mourners formed in St Petersburg. Russia’s nervy authorities moved his funeral service and mustered 60,000 troops. When the wagon bearing the poet’s body reached Pskov province, where he was to be interred, devotees tried to unharness the horses and pull it themselves.

The death of Rudolph Valentino, a silent-movie idol, in 1926 set off similarly fervid lamentation. Mounted police restrained the fans who mobbed the funeral parlour in New York where he lay on view (several reportedly killed themselves). In 1975 some of the millions of Egyptians who paid their respects to Um Kalthoum, a megastar singer, took hold of her coffin and shouldered it for hours through the streets of Cairo.

Today’s celebrity obsequies tend to be less fanatical, and largely digital rather than in-person. But they are passionate all the same. In the past few months, grief has coursed around the internet for Martin Amis, Cormac McCarthy, Tina Turner and, most recently, Jimmy Buffett. If you stop to think about it, many such outpourings for writers, actors and musicians are odd, even irrational.

Unlike other kinds of grief, this one does not stem from personal intimacy. If you ever interacted with a cherished author, it was probably during a book tour when, caffeinated to the eyeballs, she signed your copy of her novel and misspelled your name. Maybe you delude yourself that you once locked eyes with a frontman hero during a gig and that he smiled only for you. But you didn’t really know them, and they certainly didn’t know you.

Nor would you always have liked them if you had. Their books or songs may be touching and wise, but (in the parlance of criticism) it is a biographical fallacy to assume that the work reflects an artist’s life or beliefs. Your favourites may indeed have been lovely people; or perhaps, beneath their curated images, they were spiky money-grubbers, consumed by rivalry, or solipsists who drove their families nuts. Rarely do you know for sure.

Though the artists are gone, meanwhile, the art you prize is not. Death does not delete it—on the contrary, curiosity and nostalgia often drive up sales. (David Bowie’s only number-one album in America was “Blackstar”, released days before he died in 2016.) The dead, it is true, write no more books and record no songs. Philip Roth will never set a novel in the era of Donald Trump; you will never hear another operatic Meat Loaf ballad. The cold reality, however, is that many artists’ best work was done long before their demise.

The sorrow makes more sense when a star dies young or violently. Had she not perished at 27, like Jimi Hendrix and Janis Joplin, who knows what music Amy Winehouse would have added to her small, exquisite oeuvre? Sinéad O’Connor, another casualty of 2023, lived a troubled life that ended too soon. Buddy Holly (killed in a plane crash), Amedeo Modigliani (dead of tubercular meningitis at 35), Wilfred Owen (slain in action a week before the armistice in 1918): such premature and cruel exits are tragic.

Objectively, though, the death of a long-lived and fulfilled artist is far from the saddest item in an average day’s headlines. And whereas most mortals sink into oblivion, laureates live on in their output, which Horace, a Roman poet, called a “monument more lasting than bronze”. The standard reasons for mourning don’t apply. Why, then, are these losses felt so widely and keenly?

One interpretation is that the departed celebrities are merely the messengers. The real news is death itself, which comes for everyone, immortal or impervious as some may seem. If the reaper calls for Prince, with all his talent and verve, he will certainly knock for you. As Jim Morrison sang before he, too, died at 27: “No one here gets out alive.”

Part of your past—the years in which the mute musician was the soundtrack, the silenced writer your ally—can seem to fade away with them. Just as plausibly, the grief can be seen as a transmuted form of gratitude for the solidarity and joy they supplied. On your behalf, they undertook to make sense of the world and distil beauty from the muck of life.

Link to the rest at The Economist

How Obscenity Laws Nearly Stopped Nabokov’s Lolita from Being Published

From Literary Hub:

Lolita was originally published as a limited edition in France in September 1955. The book was released by Maurice Girodias of Olympia Press, a company known for specializing in pornography, but which had also built a reputation for publishing challenging literary titles. The first print run had been 5,000 copies and the book received little attention.

Over the next few months, a handful of copies of Lolita were smuggled into England. A single copy made its way into the hands of Graham Greene, who reviewed it favorably in the Sunday Times. This was followed soon after by a scathing article in the Sunday Express, which denounced the book as “sheer unrestrained pornography.” The British Home Office then ordered that all copies entering the country be seized.

When he first read Lolita, George Weidenfeld knew immediately he wanted to publish it. His partner, Nigel Nicolson, was not so sure. Nor were Nigel’s parents. His father Harold Nicolson “hated” the book and said it would be “universally condemned,” while his mother Vita Sackville-West “saw no literary merit in it at all.” As for Nigel himself, he told George that he was not convinced they should proceed with publication.

George was unrelenting. He took legal advice, however, and learned that publication in England would be extremely hazardous. Under the law as it then stood, they would likely lose the case, which would result in huge losses. Any copies that had been printed would have to be pulped, not to mention the enormous editorial, marketing, publicity and legal expenses incurred up to that point. Such an outcome would be calamitous for Weidenfeld & Nicolson, placing its future in serious jeopardy.

As luck would have it, the lawyers said, the Labour politician Roy Jenkins was right then guiding a new obscenity bill through Parliament. Under this new law, if the government blocked publication and the case went to court, then the publisher would be able to argue the literary merits of the book by calling authors, academics and reviewers to testify. If this bill was enacted, then just maybe, George might have a chance. The effort would still pose an enormous risk but, for the publisher, it might be worth it.

In the decades leading up to the proposed publication of Lolita, governments had frequently cited “obscenity” as the reason for preventing controversial books being published. In the United States, the Federal Anti-Obscenity Act of 1873 had been used to ban Chaucer’s The Canterbury Tales, John Cleland’s Fanny Hill, Boccaccio’s The Decameron and Voltaire’s Candide. In Britain, the legal test for obscenity derived from an 1868 case known as Regina v Hicklin, in which a judge ruled that obscene material tended “to deprave and corrupt those whose minds are open to such immoral influence.”

In 1928 the British government had relied on the Hicklin case to ban Marguerite Radclyffe Hall’s lesbian novel The Well of Loneliness. Opposition to the book was whipped up by the media, particularly the Sunday Express whose editor wrote, “I would rather give a healthy boy or a healthy girl a phial of prussic acid than this novel.” That same year, D. H. Lawrence’s Lady Chatterley’s Lover was also deemed to violate the obscenity laws and was commercially published in only an expurgated version. Six years later, the publisher Boriswood was prosecuted for obscene libel and severely fined for releasing Boy, a sexually explicit novel by James Hanley.

Over the summer of 1958, with the lawyers saying that the new bill had a good chance of passing through Parliament, and with Nigel Nicolson’s tenuous and nervous agreement, George reached out to the author, Vladimir Nabokov, in New York and asked for permission to publish Lolita in the United Kingdom and across the Commonwealth. By the end of November, they had reached an agreement on the general terms and Nabokov wrote to George saying “an English edition of Lolita is nearing signature.”

In his reply to the author in New York, George said, “May I take this opportunity of telling you how inspired and moved my colleagues and I feel by the book and how determined we are to see that it launches with dignity and success in this country.” Publication of Lolita in Great Britain seemed a little closer, but then, by the year’s end, George’s plans started to unravel.

When word began circulating in the press that Weidenfeld & Nicolson intended to release Lolita, Nigel’s political colleagues pressed him to change course. At one point the Conservative chief whip (and later prime minister) Ted Heath begged him to cancel publication. Nigel asked him if he had read the book. Heath said he had. “Did you think it obscene?” Nigel asked. “As a matter of fact I thought it very boring,” Heath replied. “If it is boring it cannot be obscene,” Nigel said, which he later admitted was not a very good argument.

A few days later, the Attorney-General, Reginald Manningham-Buller (called “Bullying Manner” behind his back), stopped Nigel in a dark corridor in the bowels of the House of Commons. “If you publish Lolita you will be in the dock,” he said, jabbing a finger at him. “Even after the Obscenity Bill has been passed?” asked Nigel. “That won’t make any difference,” responded the country’s top lawyer. “The book is thoroughly obscene. I’ve given you a clear warning.”

On 16 December 1958, a week before Christmas Eve, Roy Jenkins’ new Obscenity Bill was debated in the House of Commons. Midway through the proceedings, Nigel stood up to speak. First, he acknowledged that he had an interest in the matter as a director of the firm Weidenfeld & Nicolson, which was planning to publish Lolita. Then he moved on to the substance of his speech. “The question could be asked,” he declared, “Is an obscene work of art a contradiction in terms? I would answer the question by saying, no, it is not. It is quite possible for a work of art to be obscene.” He then went on to say that the book had already been published in America, where over 250,000 copies had been sold.

Lolita had also been published in France and Italy. “The question arose whether it should be published in England. That was the question which my colleagues and I had to answer,” he continued.

Lolita deals with a perversion. It describes the love of a middle-aged man for a girl of twelve. If this perversion had been depicted in such a way as to suggest to any reader of middle age or, for that matter, any little girl—could she understand it—that the practices were pleasant and could lead to happiness, I should have had no hesitation in advising my colleagues that we ought not to publish this book. But, in fact, Lolita has a built-in condemnation of what it describes. It leads to utter misery, suicide, prison, murder and great unhappiness, both to the man and to the little girl whom he seduces.

At this point, Emrys Hughes, a Welsh Labour MP, rebel and general troublemaker, tried to interrupt, but Nigel brushed him aside and moved on to his conclusion. “I asked myself whether the loss to literature in this country through the non-publication of Lolita was greater than the risk which one ran of offending certain people by its publication.” Pausing to take a breath, he then said, “In the end, I came to the conclusion that it was probably right to publish this book.” Nigel had for the first time publicly declared his support for the publication of Lolita.

Link to the rest at Literary Hub

Slanting the History of Handwriting

From Public Books:

Years ago, I wrote my signature on a piece of white paper, scanned it, and inserted it as a picture at the bottom of my digital letterhead. It’s a perfect example of what Richard Grusin has called the “premediated” sign. Others in academia sign their letters by typing out their names in cursive fonts. Whether Zapf, Apple Chancery, or Lucida Calligraphy, the important thing is that the font gestures to cursive, which has become the avatar of handwritten-ness in digital media today. We make these insertions not because we need to signal our authenticating presence but because formal letters are a genre of writing, with certain expectations regarding not only content but also appearance. A formal letter should conclude with the writer’s name inscribed to look a particular way, whether it’s a picture of a signature or a digital simulacrum of one.

All of which is to say, whatever writing is today, it is not self-evident. In the introduction to the new edited volume Handwriting in Early America: A Media History, Mark Alan Mattes suggests that we can come to grips with what writing is by triangulating between inscription, the people inscribing, and the systems of communication in which their inscriptions circulated. The 16 essays in the collection bear out the expansive potentials of this framework, not only by truly taking on the contingency of writing itself but also by revealing how the same kinds of writing can do radically different cultural work.

For example, almost every essay in this rich volume finds a counterpart or mirror image of itself, underscoring just how relative and relational the meaning of every kind of inscription is. A poem on penmanship quoted and copied by a teacher into an African American girl’s friendship album endorses the value of “polite culture” as a means of advancing in the antebellum Black elite.  A different friendship album, owned by Omaha activist Susette La Flesche, also features an array of handwritten quotations, but they document a tense ethics of obligation between writers and recipient—both are impelled to act in accord with an assimilationist vision of Indigenous self-determination.

While this volume underscores the benefits of historicist thinking about writing, Joyce Kinkead’s A Writing Studies Primer attempts to short-circuit that project by taking the opposite approach: condensing 5,000 years of writing technology around the world into a single, unbroken thread. After visiting museums, libraries, and paper-making firms in the US, Europe, India, Japan, Nepal, and South Korea, Kinkead, a professor of English with a focus on writing studies, synthesized her knowledge and experiences into a book that covers a vast range of topics, from the origins of writing, writing systems, implements, and supports to the history of the book and the printing press, punctuation and calligraphy, ancient epistles, and social media. Each of its 16 chapters concludes with prompts for leading class discussion, hands-on exercises, and a short reading from a source such as the New York Times.

While many of the essays in Handwriting in Early America hinge on Foucault’s idea that writing is a technology of the self—the process by which the individual is formed through various mechanisms of social replication—A Writing Studies Primer is a contemporary example of what this theory describes. And not always for the good. The book leans heavily on ethnographic methods that are almost indistinguishable from the Western gaze. The college student reader—presumably American—is advised in the first chapter to avoid getting “lost in a history that crosses so much time and space” by writing their own biography of themselves as a writer. The student’s story then gives way to Kinkead’s, and the Grand Tour of writing on offer measures all material forms and genres against the yardstick of Euro-American writing norms today—norms that, for example, assume handwriting stopped having a history after the advent of print. But writing by hand did not simply continue to “advance” until it inevitably began to erode; its meanings and the cultural work it performed varied. They still do.

. . . .

Nineteenth-century writing exercises were meant to unite the individual body with pen, ink, paper, and prescribed word, thereby fostering the growth of national subjects. A young boy from Massachusetts, for example, practiced his personal hand by rehearsing over and over again the words “Independence now and independence forever,” the announcement Daniel Webster imagined John Adams to have made upon signing the Declaration of Independence. I am reminded of the stock phrase I see from time to time sprinkled in the margins of medieval manuscripts by readers trying their sharpened pens or simply enjoying the scratch of an inky nib on parchment: “ego sum bonus puer quem deus amat.” I am a good boy whom God loves. Surely some of the boys or men who wrote that were at times naughty, but what is a jingle if not aspirational? As Danielle Skeehan remarks on 16th-to-19th-century English copybooks, “authors often draw connections between alphabetic literacy, the literate subject, discipline, and imperial ambition.” The legacy of alphabetic literacy’s facilitation of empire is a long one, still being written, albeit now in corporate blog posts and emailed memos to vendors on the other end of a supply chain thousands of miles away.

A Writing Studies Primer attempts to supplement and enhance the necessarily instrumental nature of a handbook for composition courses by cultivating students’ awareness of writing as a culturally determined act. This is great. But, teeming with factual errors and underpinned by a triumphalist Eurocentrism, it only embraces the surface relativism of liberal values, which ultimately needs history to be quaint so that the surface relativisms of modernity can emerge as modernity’s greatest distinction. From the volume we learn that books lacked page numbers, chapter headings, and indexes until the 16th century. False. “Islam prohibits images of people in art.” Demonstrably not true. Parchment is of lower quality than vellum. Incomprehensible. The printing press in Europe made scribes “irrelevant.” Incorrect. The entire output of medieval European book production equaled 30,000 volumes. Perplexing. Gutenberg had to hide his work on the printing press for fear of being accused of “dark forces or magic.” I am at a loss.

Link to the rest at Public Books

PG notes that the publisher of Handwriting in Early America, University of Massachusetts Press, failed to make Look Inside available.

Ukraine Renews Its Independence

From The Wall Street Journal:

The average age of the Verkhovna Rada, Ukraine’s 450-seat parliament, is 41. Only three of the elected representatives are older than 60, while 17 were under 30 at the time of their election. This means that when Ukraine declared its independence, many of us were essentially children, and some weren’t yet born. What do we remember from Aug. 24, 1991?

I was 6. My memories of that day are of something profoundly significant. People didn’t go to work; they gathered in the city center, on what is now Hrushevsky Street, greeting each other in an atmosphere of incredible joy and uplift.

Now, in the 10th year of Russia’s war against Ukraine and 18 months into its full-scale phase, my thoughts drift back to the Verkhovna Rada elected in 1990, before independence. Its composition was diverse and varied. There weren’t many professional politicians. There were only Ukrainian patriots and Communists.

Everyone had an agenda. Some aspired for greater autonomy within the Soviet Union. Some defended the Ukrainian language. Some were building their careers with an eye toward Moscow. All etched their names in Ukraine’s history when they accomplished what our ancestors had dreamt of for centuries and what society demanded at that moment—independence.

On Dec. 1, 1991, the Ukrainian people overwhelmingly affirmed their desire for independence in a referendum with 84% turnout. In the Crimean peninsula, more than 54% voted in favor of independence. In the Donetsk, Luhansk, Kharkiv, and Odesa regions, support was over 80%. Today’s Russian propaganda conveniently forgets these numbers, insisting in its narrative that Ukraine and Ukrainians don’t exist.

Historians often joke that people living through major historical events don’t realize how significant those times are. There’s some truth to that. When the current Verkhovna Rada was elected in 2019, the primary demand of the Ukrainian people was a renewal of political authority. No one could have imagined the challenges we would face in less than three years: working during a full-scale war, making pivotal decisions, defending the nation’s sovereignty, and upholding the rights of Ukrainians to exist.

Like all Ukrainians, I will never forget Feb. 24, 2022, the day Russian troops invaded. By 7 a.m., a full-scale war had been raging for two hours. Russian forces were advancing in the Sumy, Kharkiv, Chernihiv, Zhytomyr, Luhansk and Donetsk regions, and from the direction of Crimea. From Belarus, they were moving toward the Kyiv region and the capital city itself. Cities like Odesa, Kherson, Kharkiv, Zhytomyr, Mykolaiv, Zaporizhzhia, Dnipro and Kyiv, along with their surrounding areas, were under missile attack.

In Kyiv, lines formed at petrol stations, railways and ATMs—but even longer queues formed outside military recruitment offices. Tens of thousands of men and women were eager to take up arms to defend their homes, their loved ones, and their country against the invader. Ukrainians enlisted en masse in territorial defense units. Those ready to fight were given weapons. In Kyiv alone authorities distributed 20,000 rifles on Feb. 24.

. . . .

Ukraine surprised the world, the enemy and even itself. We have managed to unite, support each other, and rally around what’s crucial: our nation, our freedom, and the future of our children.

History is made by ordinary people. They become heroes, and the future depends on them. This isn’t the first time Ukraine has had to fight for its right to exist. We must win. Each and every one of us knows what we are fighting for.

Link to the rest at The Wall Street Journal

Not exactly about books, but Ukraine is a terrific story. PG fervently hopes for a happy ending.

‘Empire of the Sum’ Review: It All Adds Up

From The Wall Street Journal:

In 1976, Steve Wozniak sold his HP-65 programmable calculator for $500 to start a computer company with Steve Jobs. It wasn’t a huge sacrifice. As a calculator engineer at Hewlett-Packard, he knew that the HP-67 was on its way and, with his employee discount, he could buy one for $370. His more highly prized gadget was the HP-35—the world’s premier scientific calculator and his inspiration for going to work at HP in the first place.

The HP-35 was a technological wonder. Until its appearance in 1972, pocket calculators performed only addition, subtraction, multiplication and division. The $395 HP-35 packed advanced functions like logarithms, sines and cosines into a relatively affordable, user-friendly package. Suddenly a computer’s worth of power could fit into anyone’s pocket. In “Empire of the Sum: The Rise and Reign of the Pocket Calculator,” Keith Houston surveys the engineering advances that led to that moment and the human drive to solve equations faster and smarter.

Mr. Houston, whose previous books explored punctuation and symbols (“Shady Characters,” 2013) and the history of books (“The Book,” 2016), begins by looking back to when humans started counting using the tools at our immediate disposal: our fingers. The earliest archaeological evidence of counting is a baboon fibula incised with notches indicating the number of whatever its owner wanted to track. That 42,000-year-old tally stick, discovered in 1973 in a cave near South Africa’s border with Swaziland, “marks the point at which we began to delegate our memories to external devices,” Mr. Houston writes.

As civilizations progressed, they moved on from anatomical calculators, assigning numerical values to objects. The Sumerians, for instance, developed tokens whose varied shapes and sizes corresponded to smaller or larger quantities. But the biggest leap forward was the abacus, the first purpose-built calculator, invented in Mesopotamia or Greece between three and five millennia ago. The abacus made solving complicated equations possible, but getting results still required mental gymnastics.

Some shortcuts finally arrived in the 17th century courtesy of John Napier, a Scottish landowner, astrologer and mathematician. His invention: logarithms, a quick means of multiplying or dividing numbers through addition and subtraction. Not long after he published his revelation in 1614, logarithms became the basis for at least two important physical tools. Edmund Gunter placed them on a wooden rule to help seafarers with navigational calculations. Then William Oughtred produced his easier-to-use linear sliding rules, which, with a few later modifications, became the trusty and enduring slide rule.
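To make Napier's trick concrete: because log(a × b) = log(a) + log(b), a hard multiplication can be reduced to one easy addition plus a pair of table lookups, which is exactly what a slide rule does with physical lengths. Here is a minimal Python sketch of the idea; the function name is illustrative, not anything from Mr. Houston's book:

```python
import math

# Napier's insight: log(a * b) = log(a) + log(b), so a hard multiplication
# becomes an easy addition in "log space", plus lookups to get in and out.
def multiply_via_logs(a: float, b: float) -> float:
    # A slide rule adds two lengths proportional to log(a) and log(b);
    # a 17th-century reader would add two values from a printed log table
    # and then look up the antilogarithm of the sum.
    return math.exp(math.log(a) + math.log(b))

print(multiply_via_logs(123.0, 456.0))  # ~56088.0 (up to floating-point error)
print(123.0 * 456.0)                    # 56088.0, for comparison
```

A reader in 1620 would have used a printed table in place of math.log and math.exp, but the arithmetic is the same.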

As some mathematicians sought to simplify equations, others tried to automate them. Blaise Pascal was the first, in 1645, to build and sell a mechanical adding machine. Using gears similar to those of a clock, his Pascaline was an elegant metal box featuring windows for displaying a running total. Unfortunately, it didn’t do much more than add whole numbers, and its cost made it inaccessible to most. Over the next 200 years, more machines with greater functionality were introduced, the most important of which was Charles Xavier Thomas’s mid-19th-century arithmometer. None, however, would be as convenient or portable as a slide rule—until the Curta, a pepper mill-like mechanical gadget designed by the Austrian engineer Curt Herzstark during World War II.

In a book that’s long on technical details and short on compelling anecdotes, Mr. Houston’s profile of Herzstark is a notable highlight. As a salesman for his family’s factory manufacturing unwieldy calculators, Herzstark heard his customers’ calls for a truly portable machine. Not long after Herzstark hatched the idea for one, however, German troops annexed Austria. As the son of a Jewish father and Christian mother, Herzstark was sent to Buchenwald. There he supervised a factory of inmates fabricating rocket parts and repairing looted calculating machines. As Herzstark later recounted, his manager urged him to pursue his Curta side project, promising: “If it is really worth something, then we will give it to the Fuhrer as a present after we win the war. Then, surely, you will be made an Aryan.” When Buchenwald was liberated in April 1945, Herzstark took his blueprints with him and eventually produced the Curta. It was a palm-size engineering marvel but a commercial failure.

. . . .

The year the P101 launched, Texas Instruments decided to enter the calculator market with its big innovation, the microchip. The company had industrial and military customers for its chips, but to expand demand it needed to sell them to consumers. The idea: an electronic calculator that could fit in one’s pocket. The resulting prototype didn’t quite live up to expectations—it was the size of a paperback and weighed 3 pounds—but it laid the groundwork for the smaller, lighter gadgets to come, notably Busicom’s first truly pocketable calculator in 1971 and Hewlett-Packard’s HP-35 in 1972. The era of calculator mania had begun, and throughout the late ’70s and ’80s calculators were everywhere, as standalones and as combinations with other electronics, including clock radios, digital watches and even synthesizers.

The pocket calculator’s heyday would be brief compared with that of the slide rule it replaced. Even scientific calculators grew cheaper and profit margins waned. HP, despite Mr. Wozniak’s pleas to build a personal computer, refused to take the risk, only to see calculators absorbed into PCs, palmtops and, finally, smartphones. The pocket calculator sublimed, becoming “everywhere and nowhere at once,” Mr. Houston writes. “The calculator is dead; long live the calculator.”

Link to the rest at The Wall Street Journal

AI could make it less necessary to learn foreign languages

From The Economist:

On holiday, many will find themselves in places where they do not speak the language. Once upon a time, they might have carried a phrasebook. The rise of English has made that less necessary. But most people—at least seven of the world’s eight billion—still do not speak English. That leaves options like pantomime, a willingness to be surprised by what arrives at dinner—or, increasingly, technology.

More and more people are using simple, free tools, not only to decode text but also to speak. With these apps’ conversation mode, you talk into a phone and a spoken translation is heard moments later; the app can also listen for another language and produce a translation in yours.

You may still get a surprise or two. Google Translate may be the best-known name in machine translation, but it often blunders. Take “my wife is gluten-free,” the kind of thing you might say at a restaurant abroad. In French or Italian, Google Translate renders this as “my wife is without gluten”—true to the words rather than the meaning. DeepL, a rival, does better, offering various options, most of them along the correct lines.

The best tool may not be a translation app at all. Though not marketed for the purpose, ChatGPT, a generative AI system that churns out prose according to users’ prompts, is multilingual. Rather than entering an exact text to translate, users can tell ChatGPT to “write a message in Spanish to a waiter that my wife and I would like the tasting menu, but that she is gluten-free, so we would like substitutions for anything that has gluten.” And out pops a perfect paragraph, including the way Spanish-speakers actually say “my wife is gluten-free”: mi esposa es celíaca. It is a paraphrase rather than a translation, more like having a native-speaking dinner companion than an automated interpreter.
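For readers who would rather script this sort of request than type it into a chat window, here is a minimal sketch using the OpenAI Python SDK; the model name and client setup are assumptions for illustration, not details from the article:

```python
# A minimal sketch, assuming the OpenAI Python SDK (v1+) and an OPENAI_API_KEY
# set in the environment. The model name below is an assumption for
# illustration, not something named in the article.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Write a message in Spanish to a waiter that my wife and I would like "
    "the tasting menu, but that she is gluten-free, so we would like "
    "substitutions for anything that has gluten."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model would do
    messages=[{"role": "user", "content": prompt}],
)

# Prints a composed paragraph rather than a word-for-word translation.
print(response.choices[0].message.content)
```

The point of the sketch is the same paraphrase-not-translation behaviour the article describes: the model composes a suitable message rather than rendering a fixed source text word for word.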

Travel has long been a motivator for study—unless people start to feel AI tools offer a good-enough service. Some are concerned that apps are turning language acquisition into a dwindling pursuit. Douglas Hofstadter, a polyglot and polymath writer, has argued that something profound will vanish when people talk through machines. He describes giving a halting, difficult speech in Mandarin, which required a lot of work but offered a sense of accomplishment at the end. Who would boast of taking a helicopter to the top of Mount Everest?

Others are less worried. Most people do not move abroad or have the kind of sustained contact with a foreign culture that requires them to put in the work to become fluent. Nor do most people learn languages for the purpose of humanising themselves or training their brains. On their holiday, they just want a beer and the spaghetti carbonara without incident (and sometimes without gluten).

As AI translation becomes an even more popular labour-saving tool, people will split into two groups. There will be those who want to stretch their minds, immerse themselves in other cultures or force their thinking into new pathways. This lot will still take on language study, often aided by technology. Others will look at learning a new language with a mix of admiration and puzzlement, as they might with extreme endurance sports: “Good for you, if that’s your thing, but a bit painful for my taste.”

This is largely an Anglophone problem, since native English-speakers miss out on the benefits of language-learning most acutely. In many countries, including Britain and America, schools’ and universities’ foreign-language departments have been closing. (The British government recently devoted a modest fund to trying to get more secondary-school pupils to study foreign languages.) In the rest of the rich world, there is one thriving language that people still study: English. And in poorer countries, many people are multilingual as a matter of course; Africans and Indians learn languages because they are surrounded by them.

Link to the rest at The Economist

Generation Gaps in the Workplace

From BookBrowse:

Walk into any office and you’ll likely find a mix of people at different points of their lives: Baby boomers, Generation Xers, millennials. And the presence of Generation Z continues to grow.

Iona, the main character in Clare Pooley’s Iona Iverson’s Rules for Commuting, often experiences people judging her competencies based on her age. She’s on the older side, some feel she’s past her prime, and she tries desperately to prove them wrong. But what do generational identities say about our capabilities as workers? To tackle this question, we’ll first have a look at the impact our generational differences have on us in the workplace, and then delve into the truth of the issue.

How do generational differences affect us in the workplace?

It isn’t hard to notice the differences between one age group and another: music, communication methods and even values. These differences can manifest themselves in a negative way in the workplace and cause us to argue, turning a productive, efficient working environment into one of lowered engagement. Soon enough, this can become frustrating, and we may have a tendency to blame our generational differences for it, especially if we already hold biases towards one another based on our ages.

The truth

Though some employees may think that they are simply unable to work with a person who isn’t in the same stage of life as them, or that some generations are less reliable, our distinctions in work patterns and capabilities aren’t as accentuated as all that. Research has indicated that the correlation between our generational upbringing and the way we act in and experience the workforce is close to zero, meaning there is little difference in attitudes towards work between generations.

. . . .

Megan Gerhardt, a management professor at Miami University, has researched the impacts of generational differences in the work field. “Many of the generational conversations in the news today rely on false stereotypes and clickbait headlines, rather than taking the time to understand the important differences that are a part of our generational identities,” she claims.

Link to the rest at BookBrowse

Never Enough

From The Wall Street Journal:

A hardworking teenager—let’s call her Amanda—excels at school. She’s a pianist, a varsity athlete, an honor student and the president of the debate club. She gets early acceptance to an elite university, lands the right summer internships, and, after graduation, secures the job of her dreams. Amanda has run the race; she has hit the mark; she has lived up to her potential and fulfilled the ambitions of her parents. Unfortunately, she’s also a mess. For years, despite the accolades, Amanda has felt “utterly vacant inside,” as Jennifer Breheny Wallace puts it in “Never Enough,” a timely exploration of adolescent achievement culture.

At once a description of an insidious problem and a call to arms, “Never Enough” is full of people like Amanda: young men and women whose personal accomplishments and affluent, ultra-supportive environments might be expected to guarantee them blissful satisfaction but have instead produced anxiety, loneliness and a feeling that life lacks meaning.

These effects are the unintended consequences of social norms that have come to prevail in many upper-middle- and upper-class families. The desire that young people succeed has morphed into something more like a demand. Writes Ms. Wallace: “Our children are living under a tyranny of metrics.” Numerous rueful parents testify in these pages, as do social scientists, educators and the author herself, a married journalist who writes with disarming candor of her own moments of overreach in the raising of her three children.

Why have things turned “toxic” for high-flying kids? Ms. Wallace sees widespread status anxiety fueled by a post-1980s stalling of social mobility and a rise in economic precariousness. Competition has intensified for spots in exclusive colleges and universities. So even mothers and fathers who try not to push, who try to be mellow and undemanding—dwelling less on grades and more on effort, for instance—can feel the pressure and pass it on to their children.

“As parents, we listen and chauffeur and chaperone and coach and cheer and help with homework and attend games and even practices,” Ms. Wallace writes of the “intensive parenting” that has become ubiquitous in the moneyed strata of American life. This style of raising children may be well-meaning, but it can suffocate and dispirit its young recipients. With every hour scheduled and every interest maximized for the purposes of a future résumé, children’s lives, observes the author, “become high-budget productions meant to attract the attention of admissions officers, scholarship committees, and football recruiters, not unique and imperfect stories just beginning to unfold.”

There’s nothing wrong with the pursuit of personal excellence, of course. Without the tug of ambition or the spur of competition, few among us would bother to distinguish ourselves, and the cause of human flourishing would go unserved. In the context of childhood, the problem arises when excellence is pursued as monomania. Focused on their AP textbooks and musical instruments and free of domestic chores (adolescence is “their time to be selfish,” says one unreconstructed mother), young people become sleek, siloed missiles aimed at the Ivy League. Once the rocket has delivered its payload—well, then what? As students like Amanda have found, the rewards of a self-oriented childhood spent sprinting toward high achievement may not be rewarding at all.

Link to the rest at The Wall Street Journal

PG notes that it doesn’t take many years after graduation for an individual to meet one or more complete idiots who graduated from an Ivy League or similar “selective” college/university.

The likelihood of meeting such an individual increases significantly if one meets children from wealthy families who attended prestigious institutions. It’s not the done thing to ask them how large a donation mom and dad made to the selective institution to ensure little Suzy’s or little Johnny’s admission.

It is also the case that such an individual will meet graduates of one or more blue-collar colleges/universities who are brilliant.

Long ago, one of PG’s offspring had been admitted to more than one prestigious university. This offspring decided, almost literally on the last day, to apply to another university that was entirely respectable but not regarded as prestigious. This offspring was admitted to this institution very quickly, in a matter of days.

It was the perfect choice for this individual and they flourished during their years of college and afterwards.

Let Them Eat Pedagogy

From Public Books:

If you, like me, have participated in the #AcademicTwitter rite of passage that is bemoaning the lack of pedagogical training in graduate education, The New College Classroom is just the book for you. Authors Cathy Davidson and Christina Katopodis aim to fill this yawning gap with “a practical book dedicated to a lofty mission, a step-by-step ‘how to’ for transformation.” Even the most well-intentioned professors, they explain, who believe in “discussion” and equity can get things wrong.

This should be my favorite book of 2022! Not just because it’s about pedagogy and equity; because it’s coauthored by two practitioners, both (white) women, one a senior scholar held in high regard across US higher ed studies, the other an early career scholar who adjuncted in graduate school, recently completed her doctorate, and took a tenure-track position. The authors are more than qualified through their research and teaching experience to write this text. Moreover, the book is incredibly readable and easy to digest. Every chapter is broken into manageable chunks, with neat subheadings like “What to Do When Nobody Does the Homework” and “How Do I Address Racial and Other Forms of Discrimination?” It also feels timely, having been written remotely during COVID-19, when the always-ongoing crises of teaching exploded in ways that made all academic laborers’ challenges more obviously similar than different. And in a practice that seems to put communitarian principles into action, Davidson and Katopodis “wrote every word together,” meeting online twice a week to discuss ideas, practice the strategies they would recommend, and get “students’ feedback on what was most effective for their learning.”

This is the book that pedagogy Twitter has been crying out for. So why does it give me the ick?

Because The New College Classroom is not concerned with the material conditions that produce the crisis academics have to navigate today. Despite its romantic visions of “transformation” it is, ultimately, a guide for coping with the status quo. It offers no help for demanding something better, nor for creating alternatives ourselves.

Changing myself and my classroom might help me renew my one-year contract with the university, but it cannot prepare me to demand an alternative to the contract as the basis of my employment. Instead of mystifying “pedagogy” as some pure way of thinking and being in the world, instead of lamenting that we weren’t trained in this special field of study, perhaps we should recognize that to speak of pedagogy is to speak of labor. And that academic labor is not exceptional.

Certainly, the authors made a heroic effort to ground their pedagogical practices in research. Their introduction cites “an exhaustive study of twelve thousand classrooms” that showed that even instructors who believe they are “conducting a seminar or discussion class” fill 89 percent of class time with lecturing. The book illustrates what truly equitable participation can look like and advocates for active and participatory work that puts students in charge of their learning.

The first part of the book, “Changing Ourselves,” analyzes the classroom practices we have inherited and makes the argument for turning away from these modes of study to center “active learning,” which makes the student both “the agent and the source of the learning” rather than a passive recipient of facts from an authority figure. The second part of the book, “Changing Our Classrooms,” offers “practical, easy-to-follow methods for every part of teaching a class.” The authors explain the principles of active learning and distill them into “grab-and-go activities” that an instructor can pick up and implement immediately in any class (think: think-pair-share or entry and exit tickets). Together, these sections are “an invitation to change—ourselves, our classrooms, our society.”

There isn’t a third part on “Changing Our Society,” but the conclusion is titled “Changing the World” and ends with the provocation that “we are the people we’ve been waiting for,” a slogan otherwise popularized by climate justice activists. It might sound ostentatious to speak of higher education in the same way that we speak of the climate crisis, but within higher education circles, it is commonly held that academic labor is facing an existential quagmire (the apocalypse du jour is generative artificial intelligence). A whole field of critical university studies has grown around this crisis. And of course, so have books helping faculty navigate their lot in this crisis (including texts offering “hacks” for the academic job market; ones identifying the challenges faced by minoritized faculty members; ones Putting the Humanities PhD to Work; even practical guides for leaving academia).

Link to the rest at Public Books

There’s nothing like flattering your intended audience with “we are the people we’ve been waiting for.” Self-satisfaction should be one of the seven deadly sins.

Author Dmitry Glukhovsky Sentenced to Prison by Moscow

From Publishing Perspectives:

Fortunately, the author and journalist Dmitry Glukhovsky was not in Russia on Monday (August 7) when a Moscow court found him guilty on a charge of spreading false information about Russia’s armed forces. He has been sentenced to eight years in prison.

Today (August 9), in response to our inquiry, Glukhovsky’s German public relations agent, Dorle Kopetzky at the Weissundblau agency, says that the writer left Moscow shortly before Vladimir Putin began his assault on Ukraine in February 2022, “and did not return after he called the war what it is.”

Glukhovsky, who joined us onstage at Frankfurter Buchmesse (October 18 to 22) in 2018 for a Publishing Perspectives Talk interview, has rarely been complimentary to the Putin administration, and many of his works were openly defiant.

“He has been critical towards the regime all these years now,” Kopetzky says, “and has fortified his efforts in exile.”

The Associated Press account of Glukhovsky’s sentencing points out that he is “the latest artist to be handed a prison term in a relentless crackdown on dissent in Russia,” referencing the May 5 pre-trial detention for theater director Zhenya Berkovich and playwright Svetlana Petriychuk.

Most prominently, of course, on Friday (August 4), the Russian opposition leader Alexei Navalny, already imprisoned, was convicted on charges of extremism and sentenced to 19 years in prison. That event prompted the editorial board of the Wall Street Journal to write, “The world hardly needs another reminder of the true nature of Vladimir Putin’s Russian state.”

A Reuters write-up in February spoke to Glukhovsky from an undisclosed location, and confirmed that prosecutors in Russia were “proceeding with a case against exiled science fiction writer Dmitry Glukhovsky, accused of publishing ‘false information’ about Russian atrocities in the Ukraine war.” As early as June of 2022, Reuters had reported that Glukhovsky was on a Russian interior ministry wanted list, the author having used encrypted communication services to call out the Kremlin’s “special military operation” as a euphemism for Putin’s land-grab.

Glukhovsky, in a 2018 pre-Frankfurt interview with Publishing Perspectives, described the “wonderful times” of the current post-Soviet era for writers willing to see “an epoch of not only post-truth but also post-ethic.”

“These are really the times,” he said, “when all a writer needs to do is sit down and focus carefully on the dubious reality unfolding around him. What’s the point of writing a dystopian fiction nowadays,” he asked, “when the reality is exceeding your wildest fantasies?”

. . . .

Having worked in film, video game, and television development, Glukhovsky has particularly broad range as a storyteller. Since the release of his debut trilogy Metro 2033, he has cultivated a loyal international following, propelling his writings into translation and publishing deals around the world.

Kopetzky describes his latest two-volume “Outpost” series as being set “in a Russia isolated from the West and ruled by a new czar from Moscow.” In the books, “a disease in Russia turns people into man-eating zombies after they hear a special combination of words, a ‘somewhat pandemic neuro-lingual infection.’”

Link to the rest at Publishing Perspectives

PG says Russia has experienced a huge brain drain as a result of the invasion of Ukraine. A large portion of the nation’s young, highly educated, and talented people left the country in the weeks following the outbreak of the war to live in Eastern Europe and points beyond. PG thinks most will never return to Russia. They certainly will not return if Putin, or someone who emulates Putin, is the nation’s ruler.

Not far into the war, Russia instituted a program that took convicted criminals out of the nation’s prisons with the promise of a full pardon if they agreed to fight on the front lines for a period of time, often one to two years. These convicts have been used for roles such as leading charges against dug-in Ukrainian troops armed with machine guns and artillery.

Such charges define the term “cannon fodder,” and the convicts have been killed and severely wounded in large numbers. Needless to say, regular Russian soldiers have priority for the treatment of their wounds, and the convicts are left to treat themselves or each other as best they can.

Russia had a shrinking population before the invasion and the death and crippling of so many young Russian men will certainly accelerate the population decline. Russian ex-pats are unlikely to bring their families back to Russia in the aftermath of the war, regardless of how it ends.

An old saying goes, “The future belongs to those who show up.” Fewer and fewer Russians are going to show up for Russia’s future.

The Death of Public School

From The Wall Street Journal:

What is a public school? Is it an institution that is paid for by the public? One staffed by government employees? One that teaches a publicly approved curriculum? One that educates a broad swath of the public’s children? In the view of Cara Fitzpatrick, the author of “The Death of Public School,” it possesses all of these qualities, and properly so. That more than a few parents don’t agree—or have become disenchanted with the idea of public schools altogether—is a source of concern for her.

For Ms. Fitzpatrick, a veteran reporter and an editor at the education site Chalkbeat, the public school’s death—or at least its decline—is attributable mostly to conservatives, who have, as her subtitle has it, “won the war over education in America.” They have advanced their attack, as she sees it, by supporting school choice, school vouchers and charter schools.

Ms. Fitzpatrick’s not-very-sympathetic history of these alternative policies and institutions starts with the segregationists of the 1950s. She shows how the people who opposed racial integration in the civil-rights era lined up behind the idea that parents should get to choose their children’s school—with the goal, in that era, of avoiding black children. She notes that within four years of Brown v. Board of Education—the 1954 Supreme Court ruling that desegregated schools—some Southern states had “taken steps to abandon public schools if necessary” and “created grants that students could use to pay tuition at private schools.” Ms. Fitzpatrick says that such moves were “the first steps toward creating a school voucher system” similar to the one that Milton Friedman notably proposed in 1955.

Though Ms. Fitzpatrick doesn’t accuse Friedman of sharing the motives of the Southerners—she quotes him saying that he despised segregation and racial prejudice—she says that his views on vouchers seem “either naive or willfully ignorant of the racial oppression in the South.” Friedman had argued that a private system—with vouchers helping families pay for tuition—would offer a range of voluntary associations: exclusively white, exclusively black and mixed schools. Rather than force parents to send their children to one kind or another, he said, people opposed to racial prejudice should “try to persuade others of their views.” If they were successful, “the mixed schools will grow at the expense of the non-mixed, and a gradual transition will take place.”

Whether such an idea was naive is hard to say; it certainly wasn’t ignorant or baleful, as Ms. Fitzpatrick seems to imply. At the time, there were high-quality institutions devoted exclusively to black students, notably the Rosenwald schools set up in the South by the philanthropist Julius Rosenwald in the first decades of the 20th century. If black parents had been given the vouchers needed to support them, such schools might well have continued to flourish.

Of course, people have supported school choice for more than one reason. Some have favored limited government, feeling that, when it comes to the education of children, the state should yield to the preferences of parents. Some have favored more resources for religious schools. Virgil Blum, a Jesuit professor at Marquette University, wrote an article in the late 1950s calling for “educational benefits without enforced conformity,” by which he meant that children shouldn’t be denied access to either public or private schools for lack of funds. In part, he wanted to make the system of Catholic schools an even more robust alternative to public schools, which were sometimes run by Protestants intolerant of religious dissent.

How good is the education that results from school choice or vouchers—or from charter schools, which function outside government strictures and teacher-union rules while receiving public funds? Ms. Fitzpatrick says that the jury is still out. She cites various studies, with mixed conclusions. She doesn’t mention the astronomically improved test scores of students who have attended Success Academy, a network of New York City charter schools founded in 2006 and now serving more than 20,000 students in grades K-12.

Debates over education can make for strange bedfellows. Ms. Fitzpatrick tells the story of the alliance in the 1990s between the Milwaukee-based black Democratic legislator Polly Williams, who favored private-school vouchers, and Wisconsin’s white Republican Gov. Tommy Thompson. After helping to implement the country’s first voucher program, Williams turned on Thompson because he wanted vouchers to be universal and she wanted them to go only to the poor.

Ms. Fitzpatrick seems to feel that Democratic-leaning black parents are getting played by white, Republican-leaning advocates who push for alternatives to public schools. Such advocates say they care about the best interests of inner-city children—whose public schools are often miserable—but Ms. Fitzpatrick implies that they really want to engage in religious indoctrination (with church-defined curriculums) or to make money by ultimately privatizing public schools. She uncharitably describes Clint Bolick—who argued many of the first school-voucher cases—as “never one to miss a good public relations opportunity.”

For someone who seems inclined to question the motives of those who favor school choice, Ms. Fitzpatrick doesn’t seem much interested in why others oppose it. Democrats from Bill Clinton to Barack Obama at first favored charter schools and school choice but backed away when teachers unions threatened to withdraw their political support.

Link to the rest at The Wall Street Journal

PG attended public schools exclusively for all twelve years of his pre-college education. Some of the public schools were very good, some were mediocre and some were really terrible.

He suggests that teachers are the determining factor in whether students have good or poor educational experiences. When he had good teachers, public school was a wonderful learning experience. When his teachers were poor, public school was often pure torture.

PG thinks that the politicization of public schools and public school systems is almost certain to result in poorer educational experiences for students.

Transformative Agreements: ‘An Essential Stepping Stone’

From Publishing Perspectives:

As Publishing Perspectives readers know, the academic and scholarly world’s march toward open-access models hasn’t moved as quickly as many would like. The late-June release from Europe’s Coalition S open-access initiative, known as “Plan S,” was plainly presented as a disappointment.

Closer to the ground, if you will, however, there are parties gamely announcing progress and achievements, among them the London-based 182-year-old Royal Society of Chemistry (the URL of which, yes, looks like that of the Royal Shakespeare Company).

In its media messaging today (August 11), the society—which has an international membership of more than 50,000—is focusing on what may be to some a surprising number of transformative agreements in North America, 46 all told. They are:

  • 2018: One agreement (the society’s first in the United States, with Massachusetts Institute of Technology)
  • 2019-21: Three agreements (all in the United States)
  • 2022: Seven agreements (all in the United States)
  • 2023: 35 agreements (21 in the United States, one in Mexico, and 13 in Canada)

In August 2022 the Biden administration announced its controversial requirement that, by the end of 2025, all taxpayer-funded research must be made freely available to the public as soon as a final peer-reviewed manuscript of a report is published. The chemistry society in England now mentions this as one of the factors, along with its own plans, that have accelerated its agreements.

“On the back of the US government’s open-access mandate and our own open-access commitments,” the society reports, “the number of deals has grown rapidly within the region every year, with 2023 seeing 28 new deals, including our first agreements with partners in Canada and Mexico.”

. . . .

Sara Bosshart is the Royal Society of Chemistry’s head of open access, and she’s quoted today, saying, “We were very excited last year to announce that we aim to make all of our fully society-owned journals open access within the next five years. Open access is at the core of our mission to help the chemical sciences make the world a better place and by making our 44 society-owned journals free-to-read, we’ll be providing unrestricted global access to all of the cutting-edge research we publish.

“A key priority for our transition,” Bosshart says, “is to ensure that our international author base continues to have the same ability to publish in our journals. For this reason, we’re planning to spend the next two years working with our world partners, institutions, and community to develop new open-access models that function at an institutional level, rather than relying solely on author publication charges.

“Transformative agreements are an essential stepping stone in our [progress] toward 100-percent open access as they form the basis for future open-access agreements and allow us to transition gradually from subscriptions to open access. They also strengthen the relationships we have with our United States institutional partners and create a forum for conversation and collaboration toward a joint open-access future.

“Our end goal is an open-access future that ensures that everyone, everywhere, has the same potential to access and contribute to the latest discoveries in the chemical sciences and beyond—and we’re looking forward to working collectively with our community to achieve this vision.”

Link to the rest at Publishing Perspectives

Interest in George Orwell and his dystopian fiction is high

From The Economist:

Few writers have achieved the cult status of George Orwell. He is so much a part of the collective imagination that John Rodden, an Orwell scholar, goes so far as to call him “the most important writer who ever lived”. He was not the best writer of his time, explains Mr Rodden, author of several books on the writer’s “afterlife”, but his universal recognition, continuous publication and repeated spikes in popularity are “an unprecedented phenomenon rivalled only by Shakespeare himself”.

Orwell’s most celebrated novel, “1984”, tells the story of Winston Smith, an everyman who embarks on a love affair in defiance of the surveillance state led by Big Brother, the supreme leader, whom some believe Orwell modelled after Josef Stalin. With “telescreens” that snoop on citizens, “thought police” enforcers and a system called “doublethink” in which both everything and the opposite are equally true, Orwell’s fiction has been prescient, invoked to describe the ills of nearly every age. “1984” has repeatedly topped English-language bestseller lists, including in 1954 (the year the BBC did a television adaptation), 1984 and 2003 (the centenary of Orwell’s birth). Political events bring Orwell new readers, including Donald Trump’s inauguration in 2017 and his fomenting of the riot at America’s Capitol in 2021. After Russia’s invasion of Ukraine, “1984” became the most-downloaded electronic fiction book in Russia last year.

. . . .

Angst over totalitarianism, the manipulation of truth and the spread of surveillance technologies has hardly abated. Today’s world is increasingly Orwellian, argues Jean Seaton, director of the Orwell Foundation: consider social-media pile-ons, analogous to the “Two Minutes Hate” the novel’s characters spew at enemies of the state. The dangers Orwell flagged are as easily used by the left to bludgeon autocrats as by the right to denounce the left’s punishment of “wrongthink”.

It is thus no surprise that in 2023, with fears of autocracy and culture wars at fever pitch, the man who wrote so deftly about dark subjects is back in the spotlight. Films in production include a new documentary on Orwell’s life and an animated “Animal Farm”; a Russian-language “1984” was recently released. At least three Orwell books have been published in the past year, grappling with subjects including Orwell’s relationship with Russia.

Two more books are forthcoming, which look more closely at the women in Orwell’s life and work. For all his prescience and scrutiny of tyranny, Orwell was blind to another sort of repression: towards women. Along with a new biography by D.J. Taylor, a British historian, the books draw on letters discovered in the past 20 years between the writer and various paramours, as well as some written by Orwell’s first wife, Eileen, to her best friend. The picture that emerges is disheartening—but hardly unusual for a man of Orwell’s time.

Eileen O’Shaughnessy was married to Orwell from 1936 until her death in 1945. One of the first women to attend Oxford University, she was brilliant and witty but abandoned a master’s degree in psychology to wed Orwell. Their life was one of hardship. Eileen struggled to make a remote, unheated cottage a home, nursing the tubercular Orwell back to health while typing up and advising him on his work. She was often the main breadwinner.

Anna Funder’s “Wifedom” offers bleak details, including the day Eileen cleaned a blocked toilet, standing knee-deep in excrement, when Orwell appeared at a window to ask, “Teatime, don’t you think?” His wife dedicated her life to helping Orwell “fulfil his destiny”, one friend wrote, to the point of fatally ignoring her own health. Meanwhile Orwell was conducting numerous affairs. Before and after they married, her husband was a “sexual opportunist” who pounced on women who came his way, Mr Taylor writes.

. . . .

For decades feminists have called out the sexism of Orwell’s depiction of Winston Smith’s lover, Julia, who is presented as a nymphomaniac and honeypot trap, leading to their crushing by the state. “With Julia, everything came back to her own sexuality,” Orwell wrote. Whether the writer was himself a misogynist or simply satirising a group of sexist men, the author’s estate had long felt there was something missing in the story. They sought a writer who might give a new dimension to the tale, writing a spin-off of “1984” for the 21st century. “The only way to approach it was from a feminist perspective, because the whole regime was so horrifyingly misogynist,” says Bill Hamilton, Orwell’s literary executor.

Sandra Newman’s “Julia” offers a female character with a rich inner life. Her Julia is a survivor, more subversive than Winston, adroit at evading control, finding a kind of liberty in “sexcrime”. “She imagined freedom as exuberance, a clumsy romping,” Ms Newman writes. If Julia entraps Winston, it is because she too has been coerced and victimised. A twisty ending in keeping with the original makes this an enjoyable read even for those unfamiliar with “1984”.

Link to the rest at The Economist

PG says, shame on Knopf and Mariner Books for getting a high-profile review while not having Look Inside working and having no ebook preorders on Zon.

80%-90% of the people who read The Economist will have forgotten about these books by the time they come out in a couple of weeks (for Wifedom) and over two months (for Julia).

The Perfection Trap

From The Wall Street Journal:

There comes a moment in every job interview when the applicant will be asked to name his or her greatest weakness. “Well, I’d have to say it’s my perfectionism” is the smart answer, a humblebrag that is pretty short on humility. These days—as Thomas Curran writes in “The Perfection Trap: Embracing the Power of Good Enough”—this “weakness” is a strength. It assures a prospective employer of your commitment to the highest standards, “counted in hours of relentless striving, untold personal sacrifices, and heaps of self-imposed pressure.”

Perfectionism was not always held in such high regard; it was once the stuff of horror. Nathaniel Hawthorne’s 1843 cautionary tale “The Birth-Mark” tells of a scientist who becomes fixated on his beautiful wife’s single blemish—a birthmark. She internalizes his revulsion and wants to remove the defect “at whatever risk.” The scientist at last concocts a remedy and his wife gulps it down. Fine, except it kills her. The lesson? Perfection is a form of madness, one best avoided.

Mr. Curran, an associate professor of psychology at the London School of Economics, writes that perfectionism “seems to be the defining characteristic of our time.” Our rise-and-grind work lives are animated by the notion that “if you’re slacking, slowing down or, worse, taking a moment to simply think about what all the relentless grinding is even for, then you’re going to be left behind.”

I expected “The Perfection Trap” to attack the self-defeating behavior of perfectionists—the author counts himself as one—with a predictable hodgepodge of mindfulness, leavened with a little cognitive behavioral therapy and maybe a phone app thrown in for good measure. Instead, Mr. Curran has produced a manifesto damning our economic system for creating and maintaining a warped set of values that drive perfectionism, values we have internalized without examination. There’s no easy fix, he warns. The task calls for the kind of deep introspection that is both hard and unpopular; we must confront “our most basic assumptions about what’s ‘great’ and ‘good’ in modern society.”

The case of Lance Armstrong exemplifies the dynamic. The cyclist admitted to doping his way to seven Tour de France victories, but wasn’t sorry for it—it was simply, in his mind, leveling the playing field. “The culture was what it was,” he told Oprah Winfrey. “We were all grown men, we all made our choices.” But step back for a moment, Mr. Curran tells us, and see how Mr. Armstrong is attempting to normalize irrational and shocking behavior. The arms race that he willingly took part in risked every cyclist’s health but didn’t make any single rider more likely to succeed. It paid off for Mr. Armstrong, but not everyone was so lucky: the gamble cost some cyclists their health, others their lives.

“The same destructive arms race is playing out in wider culture right now,” Mr. Curran writes. We subjugate our own well-being to that of the economy, which “needs to grow far more than we need to feel content. Perfectionism is just the collateral damage.” Mr. Curran is explicit. To him, healthy perfectionism—a hedge used by those of us seeking to exclude ourselves from his critique—is an oxymoron.

. . . .

The lifeblood of social media and advertising is unhappiness. Every day, some two billion of us—a quarter of the earth’s population—log in to Facebook or Instagram and measure the ways our lives are lacking compared to the photoshopped images of impossibly well-turned-out people living their fabulous lives. “Isn’t this what Instagram is mostly about?” an Instagram employee asked rhetorically in a leaked memo. “The (very photogenic) life of the top 0.1%?” It is. And children pay an especially high price when they feel like they’re falling short. “We make body image issues worse for one in three teen girls,” according to a slide from a leaked Facebook presentation.

Even more disturbing was a leaked chart indicating that 6% of teens in the U.S. and 13% of teens in Britain said “spending time on Instagram was one reason why they felt like they wanted to kill themselves.” The company has no incentive to remedy this. “Facebook (now Meta),” which owns Instagram, “has increased its advertising revenues exponentially since 2009, to almost $115 billion today,” Mr. Curran reminds us. It didn’t do that by telling us we’re fine the way we are. “Their algorithms can even pinpoint moments when young people ‘need a confidence boost.’ ”

Link to the rest at The Wall Street Journal

Decide Where You’re Standing in Time as You Write Your Memoir

From Jane Friedman:

Two temporal elements—the time frame of the story and where you are standing in time while you tell your tale—are central to the idea of structure in memoir.

But they are tricky to determine because you are still living the life you are writing about in your memoir, and you existed at all the points in time throughout the story you are telling. It’s easy to think that you are just you and the story is just the story, and to believe that you don’t have to make any decisions around time the way a novelist does. But if you neglect to make conscious choices about time, you risk getting tangled up and writing a convoluted story.

The first decision: choose a time frame for your story

What time period will your story cover? Don’t think about flashbacks to your younger self or memories of times gone by; all stories have that kind of backstory and it doesn’t count when answering this question.

Also don’t think about whether you are going to write your story chronologically or present the story in some other way (such as backward or in a braid); these are questions about form that get sorted out later.

For now, just think about what is known as “story present”: the primary period of time that the reader will experience as they are reading the story.

Here are some examples of story present from well-known memoirs:

  • The several weeks I spent hiking the Pacific Crest Trail (Wild by Cheryl Strayed)
  • The year I planted my first garden (Second Nature by Michael Pollan)
  • The three years I was in an abusive relationship (In the Dream House by Carmen Maria Machado)
  • The three years consumed by the trial of my rapist (Know My Name by Chanel Miller)
  • The four years when I was a dominatrix (Whip Smart by Melissa Febos)
  • My childhood in Ireland (Angela’s Ashes by Frank McCourt)
  • The 18 years following the accidental death of my high school classmate (Half a Life by Darin Strauss)
  • The 30-something years of my marriage (Hourglass by Dani Shapiro)
  • My whole life (I Am, I Am, I Am by Maggie O’Farrell)

If you find yourself considering a time period that covers your whole life or a big chunk of time like the last two examples in my list, make sure that you actually need to include the entire period of time to effectively tell your story.

Dani Shapiro’s Hourglass doesn’t cover her whole life, but it covers many decades. That’s because her topic is itself time—the way it moves and flows in a long marriage, the impact it has on the relationship. Even her title cues us into this truth: She is making a point about the passage of time. The time frame she uses fits her point.

Maggie O’Farrell’s memoir, I Am, I Am, I Am: 17 Brushes with Death, starts in her childhood and covers the entirety of her life up until the moment she is writing the book. It is a beautiful and effective story. But note that the intention of this book is not to tell her life story; it’s to discuss the specific ways that she is mortal and the reality that we are all mortal, and to remind us that every moment is a gift. She imposes a concept onto the material—a form or structure that unifies and organizes it so that it’s not just a bunch of things that happened but a very specific and highly curated progression of things that happened. The story spans her whole life, but she chooses to tell only 17 stories. The time frame she uses fits her point as well.

The second decision: where you are standing in time as you tell your tale

While you are thinking about the time frame of your story, you also must decide where you are standing in time when you tell your story. There are two logical choices:

1. Narrating the story as you look back on it

The first option is that the story has already happened, and you are looking back on it with the knowledge and wisdom gained from having lived through those events. You are standing in time at a fixed moment that is not “today” (because today is always changing). It’s a specific day when the story has happened in the past and the future life you are living has not yet happened. This choice has to do with what we call authorial distance, or how far from the story the narrator is standing. In fiction, a first-person point of view often feels closer to the story than a third-person point of view. In memoir, if you are telling the story from a specific day that is just after the events that unfolded, you will be closer to the story than if you were telling the story from a specific day three decades later.

I wrote my breast cancer memoir just months after my treatment had ended and the friend who inspired me to get an (unusually early) mammogram had died. Her recent death was the reason for the story and part of the framing of it. She died young and I did not; the point I was making was about getting an early glimpse at the random and miraculous nature of life—a lesson that most people don’t really metabolize until they are much older. I wanted to preserve the close distance to the events of the story. If I told that story now, I would be telling it with the wisdom of having lived well into middle age—a very different distance from the story and a very different perspective.

I once worked with a client who had been a college admissions officer at an elite private high school. The pressure of the work, the outrageous expectations of the kids and parents, and the whole weight of the dog-eat-dog competitive culture contributed to him having a nervous breakdown. He wrote a memoir in which he answered college application questions from the perspective of a wounded and reflective adult. It was brilliant, and part of its brilliance was the wink and nod of doing something in his forties that so many people do at age seventeen.

We are talking here about authorial distance related to time, but there is also the concept of authorial distance related to self-awareness. I know that sounds a little like an Escher staircase circling back on itself—and it kind of is. The narrator of a memoir (the “you” who is standing at a certain moment in time) has some level of self-awareness about the events and what they mean. One of the reasons that coming-of-age stories are so beloved is that, by definition, the narrator is awakening to themselves and the world for the first time. There is very little distance (temporal or emotional) between who they were and who they became and there is a purity and poignancy to that transformation. It’s as if they are awakening to the very concept of self-awareness.

It is entirely possible for an adult to write a memoir and not bring much self-awareness to what they are writing about; it’s unfortunately quite common. A narrator who is simply reciting what happened—“this happened to me and then this happened and then this other thing happened”—is not exhibiting self-awareness about their life. They are not stepping back emotionally from it, so they don’t have any perspective to offer no matter how far away they are from it in time. They are just telling us what happened. These kinds of stories tend to feel flat and self-absorbed. They make no room for the reader. They don’t offer any sort of reflection or meaning-making, don’t offer any emotional resonance, and don’t ultimately give us the transformation experience we are looking for when we turn to memoir.

Laurel Braitman, author of What Looks Like Bravery, explains it like this: “I tell this to my students now: You can only write at the speed of your own self-awareness. You do not want the reader to have a realization or insight about your life that you haven’t had already or they will lose respect for you.”

If you are telling your story as you are looking back on it, make a clear decision about exactly where you are standing in time and make sure you have enough self-awareness to guide the reader with authority through the events you are recounting.

2. Narrating the story as it unfolds

The second logical option in terms of where the narrator stands in time is to tell the story as though you are experiencing it for the first time. There is no temporal distance from the events you are writing about. You narrate the story as the story unfolds, which means that you narrate it without the knowledge of how it all turned out. In this kind of story, the self-awareness that is necessary for an effective memoir is unfolding as the story unfolds as well.

I wrote a memoir about getting married and the structure of it was a “countdown” to the wedding. In this format, the concept is that events were unfolding as I was living them. This wasn’t technically true—I wrote the book after the wedding had taken place—but I had taken extensive notes and was able to preserve the concept of not knowing how people would behave or how I would feel. (This book embarrasses me now—the whole idea of it. I was 25 when I wrote it, so what can I say? I am grateful for its role in my career and here it is being useful as an example.)

In her memoir The Year of Magical Thinking, Joan Didion wrote about the year after her husband dropped dead at the dinner table, and about the difficulty the human mind has in grasping that kind of catastrophic change. In the first pages of the book, she writes, “It is now, as I begin to write this, the afternoon of October 4, 2004. Nine months and five days ago, at approximately nine o’clock….” What she is doing here is signaling to us that there is not a whole lot of authorial distance or self-awareness to what she is sharing. She is figuring it all out—this tendency to think magical thoughts about the dead not really being dead—as she writes. But the key thing is that she knows that she is figuring it out, and she invites us into the process. She has self-awareness about her own lack of self-awareness. She is not just telling us about the dinner and the table and the call to 911.

In Bomb Shelter, Mary Laura Philpott places her narrator self at a point in time when the story is still unfolding; she has perspective and self-awareness, but those elements are still clearly in flux. The New York Times reviewer Judith Warner called this out in her rave review of the book. Warner said, “I want to say something negative about this book. To be this positive is, I fear, to sound like a nitwit. So, to nitpick: There’s some unevenness to the quality of the sentences in the final chapter. But there’s no fun in pointing that out; Philpott already knows. ‘I’m telling this story now in present tense,’ she writes. ‘I’m still in it, not yet able to shape it from the future’s perspective.’” Like Didion, Philpott was well aware of the choice she made around narration and time, and those choices perfectly serve her story.

Link to the rest at Jane Friedman

PG notes that on the date he posted this item, the book written by the author of the OP had not been published yet. You can find the book here for preorder. Here’s a link to a broken home page for the publisher, Tree Farm Books.

PG was about to comment on a publisher that can’t even keep a home page operating at a time when a large number of readers, including PG, spend time online gathering information about books they might be interested in reading. Not to mention that the profit margins for selling ebooks are much larger than for books on paper.

PG mentions again that, in his opinion, it’s stupid to get a blurb for a book or, in this case, have the author of a book write a detailed online article about the subject of the book when the book isn’t available on Amazon yet. In the case of the book shown below, the publisher, Tree Farm Books, hadn’t even made the book eligible for preorders on Zon.

But PG’s opinions could be entirely wrong.

Life as It Is, Blinkers Removed

From The Wall Street Journal:

As a young man, Christian Madsbjerg was poking around a library and came across a passage by Aristotle that struck him as ridiculous. In “Physics,” Aristotle explained that objects drop earthward because they want to return to the place where they belong. As Mr. Madsbjerg re-read the lines, he felt his derision give way to dizziness. “In [Aristotle’s] world, objects fell to the ground because they were homesick,” Mr. Madsbjerg writes in “Look: How to Pay Attention in a Distracted World,” a provocative, sometimes elusive work of inquiry and instruction. That a philosopher two millennia ago knew nothing of gravity was not in itself surprising; what moved Mr. Madsbjerg was the realization that Aristotle must have really believed what he’d written.

“I have since encountered that vertigo of coalescing insight many, many times,” writes the Danish-born entrepreneur, who has built a career on the pursuit of perceptual breakthroughs. In “Look,” Mr. Madsbjerg attempts to impart the wisdom he has acquired from art and philosophy and from the practical experience of running a corporate consultancy and teaching a class on “human observation” at the New School in New York. His book is full of intriguing goodies: anecdotes and precepts originating in a wide array of sources, as well as summaries of the work of gestalt theorists and practitioners of phenomenology, a discipline he defines as “the study of how the human world works and everything that gives our life meaning.”

The breadth and vagueness of that definition bespeak the book’s enthusiastic overreach. There’s a lot here, and a lot of it makes sense, but there are moments when the argument is so diffuse as to feel precarious. At times Mr. Madsbjerg teeters on the edge of banality, even incoherence. Yet he never tips over, for there is unmistakably truth in what he’s getting at. And, to be fair, the distinction between apparent understanding and deep understanding is difficult to draw. It is difficult even to describe. Mr. Madsbjerg does a heroic job of seeking to capture the experience of sudden insight.

The “richest reality,” he argues, is reached not by thinking but by looking—by which he means broadly using one’s perceptual apparatus. This dictum sounds simple to follow, but it is not. Looking takes time. Looking requires silencing the chattering mind. Looking means not only letting the eye follow the shiny bouncing object but also taking in the vast realm of context around the object and noticing what is not shiny and not bouncing. Perhaps most challengingly, it means surrendering the idea that we are necessarily seeing what we think we see. For, as Mr. Madsbjerg explains, “perception happens inside us; we change what we see to reflect who and where we are in the world.”

A rose may be a rose may be a rose, to mangle Gertrude Stein, but a rose is going to have very different meanings for a botanist, an interior designer or a love-struck suitor. When we glance at a rose—or any other thing—we see it amid the connotations it holds for us. This is the idea of gestalt, the recognizable “wholeness” of things. To take another example, few of us would stop to assess the color, size, shape and composition of a dining table and a set of chairs before drawing conclusions about their nature and purpose. The same is true with the gestalt of a picnic lunch or a museum gallery or a crowded bus stop: It’s easy to register a scene without noticing any of the details.

This capacity to make rapid sense of what we see equips us to get on with daily life, but it can also be an obstacle to clarity, fitting us with blinkers and causing us to miss important signals. “Look” is full of stories of people who have struggled to see the forest for the trees, or the trees for the forest, or the reason the forest is growing where it is and not some other place. One anecdote tells of an executive at an electronics company who was so intent on developing sophisticated big-screen televisions that he missed the cultural shift toward TV-viewing on laptops and smartphones. Another relates how the city-born biographer Robert Caro had to spend long hours in the open-skied isolation of rural Texas before he felt he could begin to understand his subject, Lyndon Johnson.

Looking through an ideological lens can also be blinding, as Mr. Madsbjerg knows firsthand, having grown up in a Marxist milieu on a Danish island in the Baltic Sea. “For a teenager, Communism was a totalizing, energetic, and angry faith,” he writes. “I was enraptured by fervor—furious about the class inequities in the world.” Only when his revolutionary ardor faded was he able to see how the ideology had distorted his perception of the world. The fall of the Berlin Wall became, in his telling, the impetus to begin a quest to understand how fashions in thinking advance and retreat; why people change their minds; and whether it is possible to anticipate social change, to “feel signs of a storm approaching.”

Link to the rest at The Wall Street Journal

The World Behind the World

From The Wall Street Journal:

Imagine that you could shrink yourself small enough to be able to travel inside a brain. Hiking up the hippocampus or sailing around the cerebellum would no doubt be an awesome experience and well worth the trip. But nowhere would you be able to experience what the person with the brain was thinking, hearing or feeling. The buzzing hive of conscious life would be, to you, just a collection of cells.

The limits of such a fantastic voyage point to two seemingly irreconcilable ways of viewing ourselves: as biological matter and as self-aware mind. The contrast has stumped philosophers and scientists for centuries and is sometimes framed in terms of the objective and subjective, the external and internal. Neuroscientist Erik Hoel, taking up these two realms of human self-perception, calls them the extrinsic and intrinsic perspectives. In “The World Behind the World,” he tries to explain why they exist, how we might try to bring them under a common understanding and why we might never fully succeed. On top of that, he attempts to rebut the argument, ever more frequent these days, that the firings of neurons—not thoughts and desires—are the only real causes of human action. As if that weren’t enough, he tries to save the idea that we have free will too—all in slightly more than 200 pages. His project is, he admits, “impossible in scope,” but he still has the chutzpah to give it a go.

Mr. Hoel begins by challenging the idea that the dual perspective—inside and outside—has been a perennial, unchanging feature of human experience. He argues instead that the intrinsic perspective took a while to come into its own. Since the time of Homer, he says, our ability to describe and express our inner world has developed considerably. The first stirrings can be seen in ancient Greek writers like Simonides, who developed the “memory palace” method of memorization, and Euripides, who gave voice to the inner lives of his characters. This perspective languished during medieval times, Mr. Hoel says, but was given new life in the Renaissance by writers like Cervantes and developed yet more with the rise of the novel in the 18th century.

The argument is speculative and the timeline too neat, but the key point is well-made: Consciousness may be a human universal, but our desire and ability to articulate its qualities and recognize it explicitly as a distinctive perspective are human variables, not any more innate than our ability to study the world scientifically. And indeed, Mr. Hoel says, the rise of science, too, depended on our learning a perspective—the extrinsic one. He gives an even faster whistle-stop tour of this process of development.


But then, in perhaps the book’s most interesting chapter, Mr. Hoel changes tack and argues that the achievements of neuroscience—the quintessential mode for extrinsic self-examination—fall far short of what the hype and headlines suggest. We have been promised so much, as if the close study of the brain’s collection of cells will yield a complete explanation for consciousness. The truth is, Mr. Hoel says, that mapping the brain’s neuronal activity has provided us with very little predictive information. You cannot, for example, reliably tell if someone is depressed by looking at a brain scan. The brain is simply too malleable, too fluid in its structures and too complex for scans to pinpoint what does what at the neuronal level and what effects will arise from any particular pattern of neural activity.

But there is a deeper problem. Neuroscience has assumed that it can proceed purely by adopting the extrinsic perspective. However, as our hypothetical miniaturized journey around the brain shows, this perspective has severe limits—almost by definition, it excludes consciousness itself. Neuroscientists carry on as though this exclusion doesn’t matter. They tend to see consciousness, Mr. Hoel writes, as “some sort of minimal subsystem of the brain, possessing no information, almost useless. The steam from an engine.” Mr. Hoel argues that “our very survival as an organism” depends on consciousness and its ability to track reality accurately—or, as he puts it, on “the stream of consciousness being constantly veridical and richly informative.” Nothing in the brain makes sense, he adds, “except in the light of consciousness.”

At this point, Mr. Hoel has deftly set the stage for another whirlwind tour, this one surveying the theories of consciousness that have come out of both psychology and neuroscience. But here the going gets tougher. To mix metaphors, the reader faces a steep learning curve across terrain that is slippery at the best of times. It doesn’t help that Mr. Hoel starts to use logic truth tables as though they were as easy to read as bar charts.

. . . .

You cannot, for example, by analyzing the movement of individual atoms in the atmosphere, tell whether or not it’s going to rain or blow a hurricane. You have to look at a larger, more-inclusive scale, the one of air-pressure systems and cloud formation. Because only events at this level can explain and predict weather events, we rightly say they are their causes. Similarly, patterns of individual neurons firing don’t tell you how a person is going to act. But what someone thinks and feels may do so. These states of mind, Mr. Hoel says, are rightly called causes of action: They have “agency” in a way that mere cells do not.

As if he hasn’t taken on enough already, Mr. Hoel finishes by tackling the thorny problem of free will. The ambition of this task is suggested by his conclusion: “Having free will means being an agent that is causally emergent at the relevant level of description, for whom recent internal states are causally more relevant than distant past states, and who is computationally irreducible.” This is a perfectly reasonable suggestion, but it requires more than the 13 pages it is given to make its meaning clear, let alone to build a case for it.

Link to the rest at The Wall Street Journal

Theoderic the Great

From The Wall Street Journal:

If there were a Roman version of “1066 and All That,” the satirical romp through English history, the year 476 would surely be one of those suspiciously bold lines in our collective historical imagination. It was then that Romulus Augustulus, the last Roman emperor in the west, was deposed. On one side of his 10-month reign lay Antiquity. On the other, the Middle Ages.

Where does that leave Theoderic the Great, the Ostrogothic king who reigned in Italy from 493 until his death in 526? Under the rule of this Gothic-speaking warrior, the Colosseum still rang with the roar of spectators, crisp mountain water still streamed through the aqueducts, and giants of Latin literature, like Cassiodorus and Boethius, still served in the senate.

Hans-Ulrich Wiemer’s “Theoderic the Great: King of Goths, Ruler of Romans” is a monumental exploration of the life and times of this remarkable leader. It is the most important treatment of its subject since Wilhelm Ensslin’s 1947 biography, and since Mr. Wiemer’s book (here in John Noël Dillon’s fluid English translation) surpasses its predecessor in breadth and sophistication, the author can claim the laurel of having written the best profile of Theoderic we have.

The story of Theoderic is epic and improbable. He was born in 453 or 454 in the ever-contested Danubian borderlands, probably in what is now the east of Austria, to an elite Gothic warrior and a mother of obscure background. The Gothic tribe to which Theoderic belonged had just emerged, following the recent death of Attila, from a long spell of domination by the Huns. In 461, the boy Theoderic was shipped to Constantinople as insurance for a treaty. He spent almost a decade, his formative youth, in the great metropolitan capital of the Roman Empire.

Theoderic’s power derived less from his distinguished ancestry or the Gothic respect for royal legitimacy, Mr. Wiemer emphasizes, than from his success as a warrior. As an upstart prince, he killed the Sarmatian King Babai with his own hands. As a commander at the head of a fearsome Gothic army, he proved a fickle ally for the eastern Roman Empire, whose emperors were hardly models of loyalty themselves. In the early 480s, he was named commander-in-chief by the Romans. Within a few years, he was besieging Constantinople.

If his career had ended there, Theoderic’s name would belong among the distinguished mercenary warlords of the troubled fifth century. But fortune favors the bold, and Theoderic had even grander ambitions. In 488, he set off with some 100,000 followers—men, women and children—in an armed wagon train on an uncertain journey from the banks of the Danube (in what is now Bulgaria) to Italy. Their goals were to unseat Odoacer—the deposer of Romulus Augustulus—and to find for themselves a permanent home. Theoderic cornered Odoacer and his forces in the major stronghold of Ravenna, and the two signed a treaty by which they were meant to share power. The treaty lasted all of about 10 days, before Theoderic personally clove his rival in two (“with a single sword stroke,” Mr. Wiemer tells us, “slicing him apart from collarbone to hip”). From such sanguinary beginnings emerged a generation of peace in Italy.

What makes Mr. Wiemer’s survey so rich is his mastery of recent research on the twilight of antiquity. Theoderic’s reign cuts to the heart of virtually every great debate among scholars of this period. Were his Ostrogoths an essentially Germanic tribe, or is ethnicity a fiction ever reconfigured by contingent power dynamics?

For Mr. Wiemer, a professor of ancient history at the University of Erlangen–Nuremberg, Ostrogoths were a “community of violence” whose material basis was war and plunder. But the author recognizes that the masses who followed Theoderic on his Italian adventure were a people of shared history and culture, setting them apart from the natives in Italy and drawing them closer to other groups, such as the Visigoths who had settled in Spain and Gaul.

Mr. Wiemer is convincing on the main lines of Theoderic’s domestic and foreign policy. At home, Theoderic pursued functional specialization between the Goths and the Romans. The former were warriors (if also landowners), the latter civilians. A two-track government reflected this essential division of labor. Theoderic sought complementarity, not fusion.

Abroad, he sought legitimacy from the eastern Roman capital, along with stability in the post-Roman west. By means of strategic treaties and an astonishing network of marriage alliances among the Vandals, Visigoths, Franks, Burgundians and others, Theoderic emerged as the most powerful ruler west of Constantinople. Thanks to opportunistic expansion, he came to control wide swathes of the Balkans, much of southern Gaul and (nominally) the Iberian Peninsula. In the early sixth century, it would not have been obvious that the Frankish kingdom would prove more enduring and consequential.

Link to the rest at The Wall Street Journal

Coin depicting Flavius Theodoricus (Theodoric the Great), Roman vassal and king of the Ostrogoths. Only a single coin with this design is known; it is in the collection of the Italian numismatist Francesco Gnecchi, displayed in Palazzo Massimo, Rome. Source: Wikimedia Commons

Mausoleum of Theoderic, built in 520 AD by Theoderic the Great as his future tomb, Ravenna, Italy. Source: Wikimedia Commons

A pilgrimage to the mecca of mediumship

Not exactly to do with books, but certainly to do with genres.

From The Economist:

The gathering is like an auction crossed with a game of Mad Libs. Up steps a medium, mic in hand. Visions appear to her: the first, she says, is an old woman with a spinal deformity who liked caring for kids. The medium asks whether anyone recognises the spirit; a hand shoots up. “Know that she’s proud of you and wants you to treat yourself more!” Applause, then a second apparition, whom someone else recognises.

Such demonstrations of public mediumship are a staple of summers at Lily Dale, a hamlet in western New York founded in 1879. About 250 residents live in gingerbread-trimmed houses in the gated community. All are adherents of spiritualism, a 19th-century movement that got its start 200km away, in Hydesville, New York. Spiritualists believe that the living can contact and glean insights from the dead; mediums are the conduit. At its peak spiritualism had somewhere between 4m and 11m followers in America, including Abraham Lincoln’s wife, who hosted séances at the White House.

Each year more than 20,000 people visit Lily Dale, the oldest and largest spiritualist centre in America. Others are Cassadaga, in Florida, and Camp Chesterfield, in Indiana. Only 34 certified mediums, having passed a test, are allowed to charge for their services on the grounds. Private “readings” cost up to $140 for half an hour.

The mediums operate a guild of sorts. To get inducted one must perform well in trial readings. “We’re like the Harvard of mediumship,” says Sharon Klingler, who earned her credentials a decade ago. Candidates are scored on how many spirits they conjure and the accuracy of the evidence—mannerisms, symptoms, jobs, hometowns—that they give. “People who come here expect someone who’s been vetted,” says Kris Seastedt, vice-president of the Mediums League.

There are dos and don’ts of the trade. Don’t diagnose or prescribe: that could bring legal trouble. Brenda Reading, a newly certified medium, won’t inform women if she thinks that they have breast cancer, though she will ask if they have put off their mammogram. (Nothing stops her from telling your correspondent that he seems a bit low-energy. She recommends vitamin supplements.)

Clients want assurances that their loved ones are at peace. Best to keep the message uplifting; no one likes to relive tragedy. And the more specific the evidence, the better. It’s not enough to say that dead grandma baked. Everyone had a grandma who baked. Helpfully, most visitors to Lily Dale want to believe. The occasional sceptic can be a spoiler. So can people who get too dependent and expect to be told exactly what to do by their dearly departed. Ms Klingler sometimes asks needy clients to stop calling. She reminds them that they have free will, and suggests meditation.

Link to the rest at The Economist

Gadgets and Gizmos That Inspired Adam Smith

From Reason:

Pocket gadgets were all the rage in Adam Smith’s day. Their popularity inspired one of the most paradoxical, charming, and insightful passages in his work.

The best known are watches. A pocket timepiece was an 18th century man’s must-have fashion accessory, its presence indicated by a ribbon or bright steel chain hanging from the owner’s waist, bedecked with seals and a watch key. Contemporary art depicts not just affluent people but sailors and farm workers sporting watch chains. One sailor even wears two. “It had been the pride of my life, ever since pride commenced, to wear a watch,” wrote a journeyman stocking maker about acquiring his first in 1747.

Laborers could buy watches secondhand and pawn them when they needed cash. A favorite target for pickpockets, “watches were consistently the most valuable item of apparel stolen from working men in the eighteenth century,” writes historian John Styles, who analyzed records from several English jurisdictions.

But timepieces were hardly the only gizmos stuffing 18th century pockets, especially among the well-to-do. At a coffeehouse, a gentleman might pull out a silver nutmeg grater to add spice to his drink or a pocket globe to make a geographical point. The scientifically inclined might carry a simple microscope, known as a flea glass, to examine flowers and insects while strolling through gardens or fields. He could gaze through a pocket telescope and then, with a few twists, convert it into a mini-microscope. He could improve his observations with a pocket tripod or camera obscura and could pencil notes in a pocket diary or on an erasable sheet of ivory. (Not content with a single sheet, Thomas Jefferson carried ivory pocket notebooks.)

The coolest of all pocket gadgets were what antiquarians call etuis and Smith referred to as “tweezer cases.” A typical 18th century etui looks like a slightly oversized cigarette lighter covered in shagreen, a textured rawhide made from shark or ray skin. The lid opens up to reveal an assortment of miniature tools, each fitting into an appropriately shaped slot. Today’s crossword puzzle clues often describe etuis as sewing or needle cases, but that was only one of many varieties. An etui might contain drawing instruments—a compass, ruler, pencil, and set of pen nibs. It could hold surgeon’s tools or tiny perfume bottles. Many offered a tool set handy for travelers: a tiny knife, two-pronged fork, and snuff spoon; scissors, tweezers, a razor, and an earwax scraper; a pencil holder and pen nib; perhaps a ruler or bodkin. The cap of a cylindrical etui might separate into a spyglass.

All these “toys,” as they were called, kept early manufacturers busy, especially in the British metal-working capital of Birmingham. A 1767 directory listed some 100 Birmingham toy makers, producing everything from buttons and buckles to tweezers and toothpick cases. “For Cheapness, Beauty and Elegance no Place in the world can vie with them,” the directory declared. Like Smith’s famous pin factory, these preindustrial plants depended on hand tools and the division of labor, not automated machinery.

Ingenious and ostensibly useful, pocket gadgets and other toys epitomized a new culture of consumption that also included tea, tobacco, gin, and printed cotton fabrics. These items were neither the traditional indulgences of the rich nor the necessities of life. Few people needed a pocket watch, let alone a flea glass or an etui. But these gadgets were fashionable, and they tempted buyers from a wide range of incomes.

A fool “cannot withstand the charms of a toyshop; snuff-boxes, watches, heads of canes, etc., are his destruction,” the Earl of Chesterfield warned his son in a 1749 letter. He returned to the subject the following year. “There is another sort of expense that I will not allow, only because it is a silly one,” he wrote. “I mean the fooling away your money in baubles at toy shops. Have one handsome snuff-box (if you take snuff), and one handsome sword; but then no more pretty and very useless things.” A fortune, Chesterfield cautioned, could quickly disappear through impulse purchases.

In The Theory of Moral Sentiments, first published in 1759, Smith examined what made these objects so enticing. Pocket gadgets claimed to have practical functions, but these “trinkets of frivolous utility” struck Smith as more trouble than they were worth. He deemed their appeal less practical than aesthetic and imaginative.

“What pleases these lovers of toys is not so much the utility,” Smith wrote, “as the aptness of the machines which are fitted to promote it. All their pockets are stuffed with little conveniences. They contrive new pockets, unknown in the clothes of other people, in order to carry a greater number.” Toys embodied aptness, “the beauty of order, of art and contrivance.” They were ingenious and precise. They were cool. And they weren’t the only objects of desire with these qualities.

The same pattern applied, Smith argued, to the idea of wealth. He portrayed the ambitious son of a poor man, who imagines that servants, coaches, and a large mansion would make his life run smoothly. Pursuing a glamorous vision of wealth and convenience, he experiences anxiety, hardship, and fatigue. Finally, in old age, “he begins at last to find that wealth and greatness are mere trinkets of frivolous utility, no more adapted for procuring ease of body or tranquillity of mind than the tweezer-cases of the lover of toys.”

Yet Smith didn’t condemn the aspiring poor man or deride the lover of toys. He depicted them with sympathetic bemusement, recognizing their foibles as both common and paradoxically productive. We evaluate such desires as irrational only when we’re sick or depressed, he suggested. In a good mood, we care less about the practical costs and benefits than about the joys provided by “the order, the regular and harmonious movement of the system….The pleasures of wealth and greatness, when considered in this complex view, strike the imagination as something grand and beautiful and noble, of which the attainment is well worth all the toil and anxiety.”

Besides, Smith suggested, pursuing the false promise of tranquility and convenience had social benefits. It was nothing less than the source of civilization itself: “It is this which first prompted them to cultivate the ground, to build houses, to found cities and commonwealths, and to invent and improve all the sciences and arts, which ennoble and embellish human life; which have entirely changed the whole face of the globe, have turned the rude forests of nature into agreeable and fertile plains, and made the trackless and barren ocean a new fund of subsistence, and the great high road of communication to the different nations of the earth.”

Then Smith gave his analysis a twist. The same aesthetic impulse that draws people to ingenious trinkets and leads them to pursue wealth and greatness, he argued, also inspires projects for public improvements, from roads and canals to constitutional reforms. However worthwhile one’s preferred policies might be for public welfare, their benefits—like those of a pocket globe—are secondary to the beauty of the system.

“The perfection of police, the extension of trade and manufactures, are noble and magnificent objects,” he wrote. “The contemplation of them pleases us, and we are interested in whatever can tend to advance them. They make part of the great system of government, and the wheels of the political machine seem to move with more harmony and ease by means of them. We take pleasure in beholding the perfection of so beautiful and grand a system, and we are uneasy till we remove any obstruction that can in the least disturb or encumber the regularity of its motions.” Only the least self-aware policy wonk can fail to see the truth in Smith’s claim.

Here, however, the separation of means and end can be more serious than in the case of a trinket of frivolous utility. Buying a gadget you don’t need because you like the way it works doesn’t hurt anyone but you. Enacting policies because they sound cool can hurt the public they’re supposed to benefit. “All constitutions of government,” Smith reminded readers, “are valued only in proportion as they tend to promote the happiness of those who live under them. This is their sole use and end.” Elsewhere in The Theory of Moral Sentiments, Smith criticized the “man of system” who imposed his ideal order, heedless of the wishes of those he governed.

Link to the rest at Reason

In Defense of Independent Opinion Journalism

A reminder that PG doesn’t always agree with items he posts and usually avoids pieces with strong political position-taking. On occasion, he breaks from his usual pattern.

PG will observe that, at least in the United States, there are periods characterized by those who hold differing political opinions or values speaking past each other, frequently employing straw-man arguments.

PG does continue to urge commenters to be respectful of those with differing opinions, even when those opinions seem wrong in some way. PG will also note that the large majority of those who choose to leave comments here are intelligent individuals, and he suggests that the differing opinions appearing in the comments come from intelligent adults, not idiots.

(PG just checked and found that there have been 328,174 comments left on TPV in discussions of the various posts PG has made over several years. The number of comments PG has deleted for going beyond the limits of respectful dissent can be counted on his hands plus a couple of toes.)

From New York Magazine:

A couple decades ago, liberals began to see the structural asymmetry in the news media as one of the major problems in American politics. The Republican Party had an unapologetically partisan media apparatus — anchored by Fox News, founded in 1996 — that it used to promote its message. Democrats lacked anything similar. Even worse, the mainstream media had become highly sensitive to charges of liberal bias and habitually treated Republican-promoted narratives, however superficial or farcical, as inherently newsworthy. The conservative media was slavishly partisan, and the “liberal” media was filled with stories about how Al Gore was seen as a pathological liar, or John Kerry an effete flip-flopper.

Two phrases came into circulation that expressed this frustration. One was working the refs, which was borrowed from the sports world to describe how Republicans pushed reporters and editors rightward with nonstop complaints of bias.

The second was hack gap, which described the imbalance in professional ethos between left and right. Liberal pundits tended to see themselves as journalists rather than activists. They were expected to advance original arguments rather than echo a common message, and the rewards of career advancement generally went to those willing to criticize Democrats and fellow progressives. Conservative pundits usually came out of the conservative movement, saw themselves as working toward an ideological project, and operated with the tight discipline of a movement. Democrats would face swift internal criticism if they fudged the truth or violated any ethical norm, while Republicans, as long as they remained faithful to conservative doctrine, could count on support from their chorus no matter what they did.

Over time, these critiques have exerted a profound effect on the news media. The mainstream media has moved distinctly to the left, and its once-universal practice of covering every factual debate merely by alternating quotes from opposing parties while treating the truth as unknowable has become rarer.

Progressive opinion journalism has changed even more dramatically. Breaking from the pack to question a shared belief on the left is no longer a prized trait; it is now possible to build a career unswervingly affirming progressive movement stances. On the whole, the profession has changed for the better. The internet has opened up far more voices on the left, in every way. There are more writers, from more perspectives, bringing more expertise, and more of them are not white men. The absurdity of the 1990s world, in which the ideological spectrum of mainstream thought ended at the center left, needed to die. My work as a liberal writer is far more interesting today than it was when I began. Liberalism as a whole benefits from a strong critique from the left as well as the right.

As a political matter, the conservative-messaging apparatus no longer operates without any parallel opposition. The asymmetric structure of the media 15 or 20 years ago, which shaped Republicans into a party free to violate norms while Democrats felt constrained to follow them, is giving way to a more balanced system. After years of asking why liberals lacked their own version of Fox News, we can now see something like it, cobbled together from websites and cable-news programming.

At the same time, the downsides of this new media world have become increasingly obvious. Along with their partisan messaging system, progressives are constructing a counterpart to the information bubble in which conservatives have long resided. Where it was once rare to encounter some pseudo-fact circulating among the left, it is now routine to find people believing that Michael Brown was shot with his hands up, that the lab-leak hypothesis is a debunked conspiracy theory, or that Republicans are routinely banning instruction about racism.

In 2010, libertarian writer Julian Sanchez described the sealed universe of conservative thinking as “epistemic closure” — any source that refuted conservative claims was automatically deemed untrustworthy. One can now discern on the left at least the embryonic formation of a similar alt-universe, in which any inconvenient challenge is reflexively dismissed as “bothsidesing,” “concern trolling,” some form of bigotry, or any other of an ever-expanding list of buzzwords used to delineate wrong-think.

. . . .

Independence should be understood as a set of habits that can be practiced by writers from the breadth of the ideological spectrum. It does not mean having an “independent” identity in the partisan voting sense, or having a moderate personal politics. Independent-opinion journalism can be produced by writers occupying perspectives located between the two parties, outside of or orthogonal to them, or squarely within them.

Independence encourages (though hardly guarantees; we are all fallible) certain kinds of mental hygiene: Trying to imagine every situation if the partisan identities were reversed, conceding that people whose political commitments you generally oppose sometimes have correct or sympathetic points, testing your own arguments for logical and historical consistency. Would I oppose this tactic currently being used by the opposing party if my own party used it? Would I defend this tactic being used by my party if the opposing party used it?

An activist’s job is to promote (or, in some cases, prevent) political change. This is a completely honorable profession. But the job of moving public opinion toward the position you desire involves shading some truths and omitting others. Both forms of argument—the journalist’s and the activist’s—can be persuasive and articulate, but one is designed for edification and the other to advance political ends.

Think of the difference between a professor analyzing a legal question and a lawyer advocating for a client. The former has a point of view but is using argument for the sake of promoting deeper understanding for their readers. The latter is using whatever facts are most helpful to their client.

If you consider the metaphor working the refs, the distinction between independent-opinion journalism and political activism becomes perfectly clear. The phrase describes the way many coaches berate referees, in the belief that they will force those officials to call the game in a more favorable way. The coach may be biased enough to genuinely believe everything he screams at the refs, and the fans of his team may see the refs the same way the coach does. But a coach who’s working the refs is not setting out to give fans a fair assessment of the officials. His goal is to win the game.

Many of the writers engaging in public critiques of the mainstream media, either from the left or from the right, are working the refs. To the extent you rely on ref-workers as sources of political information, you are putting your brain in the hands of people who aren’t principally interested in enlightening you. They may want you to be informed about stories that encourage you to support their political coalition. They don’t, however, want to inform you about stories that undermine it. They are working you.

Link to the rest at New York Magazine

Thinking With Your Hands

From The Wall Street Journal:

Snobs of Northern Europe have long prided themselves, among other marks of imagined distinction, on their stillness in speech. The gesticulating Italian is a stubborn stereotype, but some drew the boundary even farther north. “A Frenchman, in telling a story that was not of the least consequence to him or to anyone else, will use a thousand gestures and contortions of his face,” Adam Smith said in a lecture in the 1760s, his hands presumably visible and steady. Even when it’s not wielded as a cudgel of nationalism, gesture is still often considered a garish ornament to rational discourse—or a cheap substitute for action, as when we dismiss something as a “political gesture.”

But it’s a mistake to ignore gesture, Susan Goldin-Meadow writes in “Thinking With Your Hands: The Surprising Science Behind How Gestures Shape Our Thoughts.” Far “more than just hand waving,” it is an “undercurrent of conversation” that expresses emotion, conveys information and aids cognition.

Ms. Goldin-Meadow is a scientist—a developmental psychologist at the University of Chicago—and “Thinking With Your Hands” is a book of science exposition, something like a lecture from a good professor. She doesn’t swaddle the facts in phony narrative or make excessive claims for their world-shaking import. She summarizes results from the literature and her own extensive research; generously cites predecessors and collaborators; and frankly admits when more work is needed. There are occasional lumps of jargon, banal formulations (“Moral education is an important topic these days because it prepares children to be fully informed and thoughtful citizens”) and overlapping accounts of the same studies. But the subject is fascinating.

Ms. Goldin-Meadow turns first to “co-speech” gestures—those we make (and make up) as we speak. Unlike “emblems”—the repertoire of culturally specific hand signs such as the thumbs-up, the “OK” circle or the ear-to-ear throat slit—they have no fixed form. They also serve a wider range of functions than emblems, not only communicating meaning to one’s listeners but also supporting our own cognition. People talking on the phone gesture, she points out, as do the congenitally blind, even when talking to other blind people.

One of her studies found that gesturing seemed to reduce the amount of mental work it took to explain the solution to a math problem. Effort was measured by asking the subjects to simultaneously recite a series of letters from memory, with more letters recited suggesting that less effort was required for the math-explanation task. (A small pleasure of “Thinking With Your Hands” is the inferential ingenuity on display in the experimental designs.) Another found that adults who gestured were better able to recount events in videos they had watched weeks earlier than those who didn’t.

Gesturing can also help to spatialize abstractions, making them more tractable for discussion. Children in one study who moved their hands while considering a moral dilemma, seemingly assigning conflicting positions to distinct spaces in front of them, appeared to be better at assimilating multiple points of view. In another experiment, children were taught the meaning of a made-up word with one specific toy used to demonstrate it. Compared with those who didn’t, the children who gestured were quicker to “generalize beyond the particulars of the learning situation” and extend the word’s application to other cases.

An expert in child development, Ms. Goldin-Meadow is especially focused on gesture’s role in education. Taking gesture seriously by noticing and encouraging it, she insists, would benefit both teachers and students. Children learning to solve certain simple equations, it turns out, often verbally describe using an unsuccessful problem-solving strategy while gesturing in a way that indicates a different, effective approach (making V shapes that group certain numbers to be added together, for example). Those who exhibit these manual-verbal mismatches, Ms. Goldin-Meadow has found, are usually the closest to achieving a breakthrough in their understanding. And students whose teachers used such mismatches in their lessons performed better than others, suggesting that gesture offers a rich channel of additional information.

Interestingly, the effect doesn’t seem to come from simply presenting two different strategies. Teachers who described two approaches verbally didn’t achieve the same boost in their classes’ learning. There’s something distinctive, Ms. Goldin-Meadow writes, about the combination of words and movement unfolding in time. (The rate is relatively stable; English speakers tend to produce one gesture per grammatical clause.) In fact, she writes, the integration of sound and gesture is a “hallmark” of humans, used even by pre-adolescent children but not by apes.

Gesture throws indirect light on the nature of human language, Ms. Goldin-Meadow argues, drawing on research into the hand signs devised by deaf children born to hearing parents or otherwise deprived of established sign language. Such “homesign” shows the same sort of organization as spoken languages do, breaking events down into discrete components (signs, words) that are then assembled into an ordered string. “It is our minds,” Ms. Goldin-Meadow concludes, “and not the handed-down languages, that provide structure” for our thoughts. Language is deep enough in our brains that even a child can invent it from scratch. By contrast, she notes, children don’t seem to invent the concept of exact numbers (as opposed to approximations) greater than five or so on their own.

Link to the rest at The Wall Street Journal

Inside the Secretive Russian Security Force That Targets Americans

From The Wall Street Journal:

For years, a small group of American officials watched with mounting concern as a clandestine unit of Russia’s Federal Security Service covertly tracked high-profile Americans in the country, broke into their rooms to plant recording devices, recruited informants from the U.S. Embassy’s clerical staff and sent young women to coax Marines posted to Moscow to spill secrets. 

On March 29, that unit, the Department for Counterintelligence Operations, or DKRO, led the arrest of Wall Street Journal reporter Evan Gershkovich, according to U.S. and other Western diplomats, intelligence officers and former Russian operatives. DKRO, which is virtually unknown outside a small circle of Russia specialists and intelligence officers, also helped detain two other Americans in Russia, former Marines Paul Whelan and Trevor Reed, these people said.

The secretive group is believed by these officials to be responsible for a string of strange incidents that blurred the lines between spycraft and harassment, including the mysterious death of a U.S. diplomat’s dog, the trailing of an ambassador’s young children and flat tires on embassy vehicles. 

The DKRO’s role in the detention of at least three Americans, which hasn’t been previously reported, shows its importance to Russia under Vladimir Putin, a former KGB lieutenant colonel who led the Federal Security Service, or FSB, before rising to the presidency. The unit intensified its operations in recent years as the conflict between Moscow and Washington worsened. 

As with most clandestine activity carried out by covert operatives, it is impossible to know for certain whether DKRO is behind every such incident. The unit makes no public statements. But officials from the U.S. and its closest allies said that DKRO frequently wants its targets to know their homes are being monitored and their movements followed, and that its operatives regularly leave a calling card: a burnt cigarette on a toilet seat. They also have left feces in unflushed toilets at diplomats’ homes and in the suitcase of a senior official visiting from Washington, these people said.

The DKRO is the counterintelligence arm of the FSB responsible for monitoring foreigners in Russia, with its first section, or DKRO-1, the subdivision responsible for Americans and Canadians.

“The DKRO never misses an opportunity if it presents itself against the U.S., the main enemy,” said Andrei Soldatov, a Russian security analyst who has spent years studying the unit. “They are the crème-de-la-crème of the FSB.”

. . . .

This article is based on dozens of interviews with senior diplomats and security officials in Europe and the U.S., Americans previously jailed in Russia and their families, and independent Russian journalists and security analysts who have fled the country. Information also was drawn from public court proceedings and leaked DKRO memos, which were authenticated by former Russian intelligence officers and their Western counterparts. Gershkovich’s lawyers in Russia declined to comment.

“They’re very, very smart on the America target. They’ve been doing this a long time. They know us extremely well,” said Dan Hoffman, a former Central Intelligence Agency station chief in Moscow, about DKRO. “They do their job extremely well, they’re ruthless about doing their job, and they’re not constrained by any resources.”

. . . .

On March 29, DKRO officers led an operation, hailed by the FSB as a success, that made Gershkovich, 31 years old, the first American reporter held on espionage charges in Russia since the Cold War, according to current and former officials and intelligence officers in the U.S. and its closest allies, as well as a former Russian intelligence officer familiar with the situation.

The Journal has vehemently denied the charge. The Biden administration has said that Gershkovich, who was detained during a reporting trip and was accredited to work as a journalist by Russia’s foreign ministry, has been “wrongfully detained.” Friday is his 100th day in captivity.

Putin received video briefings before and after the arrest from Vladislav Menshchikov, head of the FSB’s counterintelligence service, which oversees DKRO, according to Western officials and a former Russian security officer. During those briefings, Putin asked for details about the operation to detain Gershkovich.

DKRO also led the operation to arrest Whelan, in what U.S. officials, the former Marine’s lawyers and his family have said was an entrapment ploy involving a thumb-drive. The U.S. also considers him wrongfully detained.

When Moscow police held Reed, another former Marine, after a drunken night with friends, then claimed he had assaulted a policeman, officers from DKRO took over the case, according to the U.S. officials and Reed. Reed denied the assault and has said Russian law enforcement provided no credible evidence it had taken place. He was given a nine-year sentence, and eventually swapped for a Russian pilot in U.S. custody.

U.S. officials blame DKRO for cutting the power to the residence of current U.S. Ambassador to Moscow Lynne Tracy the night after her first meeting with Russian officials in January, and for trailing an embassy official’s car with a low-flying helicopter. U.S. diplomats routinely come home to find bookcases shifted around and jewelry missing, for which they have blamed DKRO officers.

More recently, a Russian drone followed a diplomat’s wife as she drove back to the embassy, unaware that the roof of her car had been defaced with tape in the shape of the letter Z, a Russian pro-war symbol. U.S. officials say they believe the group was behind that. U.S. officials strongly believe that the Russian police posted around Washington’s embassy in Moscow are DKRO officers in disguise.

American diplomats posted to Russia receive special training to avoid DKRO and other officers from the FSB and are given a set of guidelines informally known as “Moscow Rules.” It was updated recently to reflect the security services’ increasingly aggressive posture. One important rule, say the officials who helped craft it: “There are no coincidences.”

In May, the spy agency arrested a former U.S. consulate employee, Robert Shonov, and charged him with collaboration on a confidential basis with a foreign state or international or foreign organization. At the time of his arrest, the Russian national was working as a contractor to summarize newspaper articles for the State Department, which called the arrangement legal and the allegations against him “wholly without merit.” Like Gershkovich, Shonov is now in Moscow’s Lefortovo prison.

. . . .

“Today, the FSB is incredibly powerful and unaccountable,” said Boris Bondarev, a Russian diplomat who resigned and went into hiding shortly after the invasion of Ukraine. “Anyone can designate someone else as a foreign spy in order to get promoted. If you are an FSB officer and you want a quick promotion, you find some spies.”

DKRO officers occupy a privileged position within the security services and Russian society. The unit’s predecessor was the so-called American Department of the KGB, formed in 1983 by a hero of Putin’s, Yuri Andropov, the longtime security chief who became Soviet leader.

. . . .

The unit’s officers are well-paid by Russian standards, receiving bonuses for successful operations, access to low-cost mortgages, stipends for unemployed spouses, preferential access to beachside resort towns and medical care at FSB clinics that are among Russia’s best.

The FSB emerged after the collapse of the Soviet Union subject to little legislative or judicial scrutiny. Since the February 2022 invasion of Ukraine, its official duty to expunge spies and dissidents has given it such expansive control over many aspects of Russian life that some security analysts now call Russia a counterintelligence state. In one of his final articles before his arrest, Gershkovich and his colleagues reported—citing a former Russian intelligence officer and a person close to the defense ministry—that the spy agency had largely planned the invasion and was filtering updates from the front lines, roles usually reserved for the military.

In April, Russia passed new treason legislation that further empowered the FSB to squelch criticism of the war. In May, the spy agency, using wartime powers, said it would start to search homes without a court’s approval.

Putin has publicly berated his spy agencies several times since late 2022, after his so-called special military operation fell short of his expectations. Around that time, U.S. officials noticed an uptick in aggressive actions toward the few Americans still in Russia.

. . . .

“You need to significantly improve your work,” Putin told FSB leaders in a December speech marking Security Agency Worker’s Day, a Russian holiday. “It is necessary to put a firm stop to the activities of foreign special services, and to promptly identify traitors, spies and diversionists.”

He repeated the admonishment during a visit to Lubyanka, the FSB headquarters, a month before Gershkovich’s arrest. 

Putin spokesman Dmitry Peskov in April denied that Putin had a role in authorizing the arrest. “It is not the president’s prerogative. The security services do that,” he said. “They are doing their job.”

Putin likes to be personally briefed on the FSB’s surveillance of Western reporters, said U.S. and former Russian officials. Leaked FSB documents from previous surveillance cases against foreign reporters show agency leaders along the chain of command adding penciled notes in the margins of formal memos, so that higher-ups can erase any comments that might upset the president. 

DKRO memos often begin with greetings punctuated by exclamation marks to indicate urgency and militaristic formality—a common style in the Kremlin bureaucracy—followed by meticulous notes about the movements of Westerners in Russia and the locals they meet.

“We ask you to identify an employee of the Ministry of Internal Affairs at his place of employment, interrogate him about the goals and nature of his relations with the British, and as a result, draw a conclusion,” read one 2006 memo reviewed by the Journal. 

The FSB oversees espionage trials, which are conducted in secret by specialist investigators and judges. During Putin’s 23 years in power, no espionage trial is known to have ended in acquittal.

Link to the rest at The Wall Street Journal

PG notes that the lives of 20th- and 21st-century dictators have often ended prematurely.

For those who manage to hang on and direct the affairs of their nations for more than a brief period, dictatorship tends to impoverish many of their people and produces an economy that substantially lags those of nations with non-dictatorial political structures.

Populations that live under dictatorships seldom produce world-class technological innovations or other types of creativity. Persistent anxiety and uncertainty about one’s standing with the extensive government agencies assigned to control the populace and root out enemies of the regime shrivel the creative impulses of all but a minuscule percentage of the population.

Leaders who gain and hold their positions through thuggery snuff out creativity and economic dynamism among their people and inevitably fall behind nations with a stable tradition of democratically elected leaders.

Missed America – Attacking the right without asking about the left.

From The Hedgehog Review:

One day early in the pandemic, when schools and colleges first went online, my undergraduate students and I had just finished discussing an essay on the rise and decline of the innovative and powerful Comanche empire. I logged off and walked downstairs, where my elementary school-aged child was sitting at the dining table. “What did you learn in school today?” I asked, as I always do. He recounted to me—not in these exact words, of course—that North America had been an Edenic paradise before the Europeans arrived. I was shocked. This was the racist myth of the noble savage repackaged by the antiracist left. In reality, Native Americans did not need Europeans to introduce them to warfare, imperialism, slavery, or violence. This does not diminish the significant impact European pathogens and ambitions had on Native American polities. But to teach such distortive myths about the past? That’s the kind of thing historians should be upset about.

So imagine my surprise when I opened Princeton historians Kevin Kruse and Julian Zelizer’s new edited volume on contemporary historical myths and found no essay—not a single one!—that challenged myths that came from the left. The editors acknowledge “bipartisan” myths, but with a few exceptions, such as David A. Bell’s essay on American exceptionalism and Akhil Reed Amar’s on the Founding, the contributors focus on myths from the right. On their own, many of the essays, written by some of our best historians, are insightful. Collectively, they reveal the challenges facing the historical profession—my profession. When it becomes an axiom that truth comes from the left and lies from the right, something is amiss. When all the bad things America did are true, but none of the good things, something is definitely amiss.

The editors attribute the spread of “right-wing myths”—they do not distinguish analytically between myths and lies—to two causes: the “conservative media ecosystem” and the “devolution of the Republican Party’s commitment to truth.” So far, so good. Many prominent Republicans embraced and propagated former president Donald Trump’s lies about a stolen election. Even after it was revealed that Fox News commentators, including Tucker Carlson, knew that they were lying to their viewers, Speaker of the House Kevin McCarthy entrusted Carlson with exclusive access to video footage of the January 6 storming of the Capitol. Such brazen disregard for truth threatens the basic norms and principles of American democracy.

Or does it? To believe that Republican lies threaten our democracy, you also have to believe that our basic norms and principles are worth defending. But why sustain something as corrupt as American democracy? In her contribution to Myth America, Kathleen Belew, professor of history at Northwestern University, condemns those who proclaim that the events of January 6 do not reflect who we are as a country. She argues instead that this is “exactly who we are,” and a careful examination of the white nationalism and violence in our history will prove it. Belew simply inverts the story: White nationalists—including the Ku Klux Klan—embody the true America. Any story suggesting that we Americans are something better, or even that we have ideals that should inspire us to be better, is naive and false.

Like Belew, other contributors to Myth America write in absolutes. There is little that is tentative in this volume. The United States is an empire. American exceptionalism is a lie. The United States is xenophobic. There is no complexity. The world is divided into right and wrong, true and false, left and right. There is a lot of either/or but not much both/and. We find few good people doing bad things, much less flawed people achieving good things. There is almost no engagement with competing scholarly perspectives.

There are many missed opportunities. For example, in his essay on American exceptionalism, David Bell notes in passing that “the more progressive that Americans are in their politics, the more likely they are to see America as exceptional, if at all, in large part because of the harm it has done: the treatment of indigenous peoples, slavery, US foreign policy in the twentieth century, and contemporary inequality and racism.” Why leave this as an aside? Why not devote some space in the volume—especially given our public controversies over how we should teach American history—to the dangers posed by exceptionalist narratives from the left?

Myth America’s editors rightly worry that our divisions over history are caused by “unmooring our debates from some shared understanding of the facts.” They argue that “this shift has been driven by the rise of a new generation of amateur historians who, lacking any training in the field, or familiarity with its norms, have felt freer to write a history that begins with its conclusions and works backwards to find—or invent, if need be—some sort of evidence that will seem to support it.” They condemn the “cottage industry on the right” with “partisan authors producing a partisan version of the past to please partisan audiences.” I think that they are referring to the best-selling publications of journalists such as Bill O’Reilly, but might the same words also apply to other amateur historians such as journalist Nikole Hannah-Jones, the primary editor of and contributor to The 1619 Project?

Myth America does not engage with The 1619 Project because, the editors write, there has already been much public debate, but the American Historical Review, the nation’s most exclusive and influential journal of academic history, disagrees. In its December 2022 issue the AHR published an eighty-two-page forum involving nineteen of the country’s most prominent historians. In it, only a few contributors offer truly critical assessments. Speaking truth to power is hard, and the power that most shapes historians’ careers is our reputation among our colleagues. Do “we” want to risk sounding like “them”? Whose side are you on? Instead, many contributors to the forum take the easy path, accusing The 1619 Project of not being radical enough. For instance, it did not fully account for the displacement of indigenous people. Worse, they argue, Hannah-Jones has not freed herself from American exceptionalism because she hopes that the sacrifices made by generations of black Americans might be redemptive and let Americans, in Hannah-Jones’s words, “finally, live up to the magnificent ideals upon which we were founded.”

It is unusual for historians to be so gentle (just read the book review section of any journal or sit in on one of our graduate seminars). Perhaps Cornell historian Sandra Greene, in her contribution to the forum, explained why: “The publication of The 1619 Project is so important, despite its flaws,” which include “factual errors” and “several chapters [that] simplify to the point of distortion.” Important how? Politically. For Greene, The 1619 Project is “a necessary book” and its flaws “should not be used as an excuse to deny the reality that slavery and racism have influenced every aspect of US history.” But one need not deny slavery’s and racism’s historical significance to ask whether historians should defend public narratives that simplify to the point of distortion—what Kruse and Zelizer call myths.

Fortunately, most Americans have a much more nuanced understanding of American history than professional historians (or their most vocal right-wing opponents). Most Americans recognize that the past is complicated. According to an American Historical Association poll, 78 percent of Democrats and 74 percent of Republicans agree that students should learn painful history even when it makes them uncomfortable. According to another recent poll by the organization More in Common, 95 percent of Democrats and 91 percent of Republicans agree that Americans “have made incredible achievements and ugly errors.” Indeed, despite what one learns in the headlines, 87 percent of Democrats believe Washington and Lincoln should be admired for their role in American history and 83 percent of Republicans agree that all American students should learn about slavery, Jim Crow, and segregation. In other words, we Americans know that we have much to atone for in our past, but also much to celebrate. Americans understand that we contain multitudes. It should give historians pause when the common sense of ordinary American people shows more appreciation for historical complexity than trained experts.

After reading Myth America and the AHR forum, one can understand why Republicans have become so distrustful of professors and have proposed dangerous policies that threaten academic freedom, such as weakening tenure or banning entire fields of study. We historians would like to say that it’s because we speak truth to power, but perhaps the truth is that we are afraid to do so when it endangers our reputations or politics. I’m nervous just writing this review.

Link to the rest at The Hedgehog Review

Bogie and Bacall

From The Wall Street Journal:

Hollywood is famous for successfully pairing acting couples, some “married” on screen (Greer Garson and Walter Pidgeon), some musical (Fred Astaire and Ginger Rogers), and some who became involved both off-screen and on (Katharine Hepburn and Spencer Tracy). The gold standard of the on-screen romance that becomes an off-screen love affair is the one that contains a good lesson on how to whistle: Humphrey Bogart and Lauren Bacall, who first starred together in 1944’s “To Have and Have Not.” Their relationship was unexpected and unlikely but ultimately enduring and finally legendary, which is why nearly 80 years later William J. Mann has published “Bogie and Bacall: The Surprising Story of Hollywood’s Greatest Love Affair.”

The author of several books on film, including a well-researched biography of gay silent film star William Haines, Mr. Mann clearly states his purpose regarding Bogart and Bacall: “to trace myths back to their origins and to draw connections between what was said at the start of their careers and what was said later.” He puts the famous couple under an informed scrutiny, giving the full background of both stars before they met and questioning everything they did after. He pins down every rumor or error connected to their histories: Bogart’s naval service (he saw no action), the origin of his famous lip scar, the legend of Bacall’s discovery and arrival in Hollywood, their encounters with the House Un-American Activities Committee, his part in the original Rat Pack, her infatuation with presidential candidate Adlai Stevenson, and so on. He was a child of wealth, a heavy drinker, “a drifter and idler” who bungled into acting. She was the only child of an impoverished single mother, and she knew from the beginning that she wanted to be a star.

Humphrey DeForest Bogart (original nickname: Hump) was born on Christmas Day in 1899. “I got cheated out of a birthday,” he always crabbed. His father was a New York society doctor and his mother a successful commercial artist who used him as a baby model. Bogart claimed, “There was no affection in my family, ever.” In his youth, he made a mess of everything, including boarding school, early jobs and his World War I service. After Bogart’s discharge in 1919, the Broadway producer William A. Brady (father of his best friend) took pity on the hapless 20-year-old and gave Bogart a nonacting job. From that day forward, Bogart never left show business. Mr. Mann calls Brady “the most influential figure of Humphrey’s early life.”

. . . .

Bogart’s stardom—born of such films as “High Sierra” and “The Maltese Falcon” in 1941 and “Casablanca” in 1942—was hard-earned, but it never deserted him after it arrived. Today Bogart ranks as the American Film Institute’s No. 1 most popular actor in film history. (Bacall is No. 20 on the women’s list.)

Her road to fame was smoother. Born in the Bronx in 1924 as Betty Joan Perske, she was determined to become successful and moved quickly. Mr. Mann shrewdly points out that “she wondered when . . . not if” success would arrive, already impatient by age 13. In 1941, at 17, she started modeling. Mr. Mann describes her as “savvy, confident, and resourceful,” adding that she was also “a fawning young woman who was drawn to older men and had already proven her ability to charm them.”

After Bacall posed for an issue of Harper’s Bazaar, director Howard Hawks invited her to Hollywood to take a screen test. He was not impressed: “She had a high nasal voice and no training whatsoever.” In October 1943 he took her to the set of “Passage to Marseille” and introduced her to Bogart. He was 44, 5-foot-8, a big star and married. She was 19, 5-foot-9, a nobody and single. They said hello and shook hands. Bacall said: “There was no clap of thunder, no lightning bolt.” Not yet.

. . . .

“To Have and Have Not” began filming in March 1944. Nobody expected much from it, but the atmosphere on the set began to crackle. What happened can be seen on the screen. Bacall’s lack of experience and minimum of talent are overcome by her casual confidence, unusual looks and an insolent, slightly sullen manner. She’s fresh and different, and what Bogart sees in her is in his eyes and his amused little smile.

It became a short story: They met in 1943, filmed in 1944, married in 1945 (after Bogart’s divorce) and remained together until Bogart’s death of esophageal cancer on Jan. 14, 1957. (“Goodbye kid,” he said to her.) He died with one Oscar (for 1951’s “The African Queen”), 75 films, two children with Bacall and a marriage that had lasted nearly 12 years. (In terms of Hollywood unions, that’s a lifetime.)

Link to the rest at The Wall Street Journal

Governments are using culture to spur economic regeneration

From The Economist:

St John’s, a district in central Manchester, has long reflected the city’s ambitions. Thanks to its proximity to the River Irwell, the site became a hub for the booming cotton and timber trades during the Industrial Revolution. After the second world war, as the city’s economy turned towards services, Britain’s first purpose-built television studios were set up there. (Most of the studios were closed or relocated in 2013.) Now the area is undergoing yet another transformation. On June 30th a multi-use arts venue on the banks of the river—costing £211m ($268m) and spanning more than 140,000 square feet—welcomed its first visitors.

The building, initially called Factory International but recently rebranded as Aviva Studios, was announced in 2014 by George Osborne, then the chancellor of the exchequer. It is mostly funded by the government and Manchester City Council and is the biggest investment in a cultural project in Britain since Tate Modern in 2000.

Mr Osborne saw Factory International as part of his “northern powerhouse” policy, which aimed to boost the economies of places such as Manchester and Newcastle, and to shift jobs, investment and influence away from the south-east of England. According to recent data from the Office for National Statistics, London’s gross value added (GVA, a measure of output) is around 10% higher than the total of 11 other “core” cities, including Manchester. Mr Osborne has since left politics—he is now chairman of the British Museum—but talk of “levelling up” continues.

The venue was designed by Ellen van Loon of OMA, an esteemed architecture firm, and its façade evokes Manchester’s mishmash of period buildings. Inside, a vast warehouse space can be configured in various ways; thanks to high-tech acoustic walls, different events can take place simultaneously. It will be the permanent home of the Manchester International Festival, a biennial event founded in 2007.

The venue can host enormous installations. The inaugural exhibition, “You, Me and the Balloons”, is the largest-ever show by Yayoi Kusama, a blockbuster Japanese artist (see picture). For the full launch in October, Danny Boyle, a Mancunian film-maker, has created an immersive production inspired by “The Matrix”.

Aviva Studios’s aims are grandiose, too: executives say it will contribute £1.1bn in GVA in the next decade and directly and indirectly create more than 1,500 jobs. The Factory Academy, established in 2018, provides people with the technical skills needed by the venue and by the arts sector at large. “Yes, we’ve built a really exciting international arts venue that people across the world will travel to see,” says Bev Craig, the leader of Manchester City Council, but “that’s only half the story…It’s purposeful growth, not just any old growth.”

Manchester is not alone in betting on culture as a catalyst of regeneration. In America, Jersey City hopes to become a “destination for the arts” and is footing the bill for Centre Pompidou’s first North American outpost, due to open in 2026. In 2021 Abu Dhabi confirmed it would spend $6bn on its creative industries over five years in an attempt to diversify from oil; Muhammad bin Salman hopes to turn Al Ula into Saudi Arabia’s capital of culture. China has opened scores of new museums in recent years, as has South Korea.

. . . .

The idea of state support for culture can be a controversial one, as the well-off are most likely to participate in the arts; detractors would often rather see the money spent on hospitals or schools. Yet evidence suggests that a vibrant culture scene brings several benefits. Research by Centre for Cities, a think-tank, found that proximity to recreation facilities is important to 25- to 34-year-olds and influences their decisions about where to live.

Culture has an impact on well-being, too. Researchers at University College London analysed a series of longitudinal studies conducted between 2017 and 2022 in America and Britain. Controlling for income, education and other demographic factors, they found that enjoying the arts was good for your health. Whether you are reading a book or going to the opera, you are guarding against depression, dementia and chronic pain.

In the 19th century, as Americans and Europeans moved to the city, governments built public institutions to enrich people’s lives. Some economists have long sensed that this kind of investment can pay dividends. In Britain the Arts Council was established in the wake of the second world war to distribute government money across England, Scotland and Wales. John Maynard Keynes, a lover of opera and ballet, was its first chairman. “At last the public exchequer has recognised the support and encouragement of the civilising arts of life as a part of their duty,” he said.

. . . .

A multitude of factors make a city appealing to tourists as well as migrants. The Economist Intelligence Unit, a sister company of The Economist, calculates a city’s “liveability” according to five broad categories: culture and environment, education, infrastructure, health care and stability. Abu Dhabi welcomed an outpost of the Louvre in 2017; in recent years, its score has risen thanks to improvements in public services. Yet it still ranks only slightly above the average rating globally.

The authorities in Manchester know there is no single formula for growth. In 2022 it was announced that the city would receive £1bn to improve its public-transport system; it is also focusing on attracting technology companies and startups. A wide cultural ecosystem has been established over the past 25 years, with MediaCity, a BBC outpost, the National Football Museum and HOME, an arts complex.

“This building is a manifestation of Manchester investing in creative industries as part of its future,” says John McGrath, the artistic director and chief executive of Factory International (the moniker was retained for the organisation that runs Aviva Studios and the arts festival). That name hints at the city’s particular cultural heritage: between 1978 and 1992 Factory Records, a music label, championed bands including Happy Mondays, Joy Division and New Order.

Link to the rest at The Economist

‘Life, Liberty, and the Pursuit of Happiness’ Review: America’s British Creed

From The Wall Street Journal:

The youth of the American republic is one of its oldest traditions. Its unique origins will always make it younger than any other nation. Yet the United States is also the world’s oldest democracy. Britain in the time of George III was a liberal monarchy, but it democratized only by degrees in the 19th century. France was neither liberal nor democratic before the revolution of 1789, and the French are now on their fifth republic. The American ideal of democratic self-governance looks ever more exceptional as it creaks toward its 250th birthday.

Britain has a kind of old-fashioned pseudo-constitution: an accumulation of legal precedent and patchwork legislation, standing on unwritten assumptions and topped by a hollow crown. Americans were the first to spell out their social contract and specify the rights of individuals in plain English. But what did the magic words of the Declaration of Independence—“life, liberty, and the pursuit of happiness”—mean to their authors?

History is best written by the losers. In “Life, Liberty, and the Pursuit of Happiness: Britain and the American Dream,” Peter Moore, a historian who teaches at Oxford, shows how Britain exported its highest ideals to the Americans who rejected it.

Mr. Moore breaks the American creed into three sections and examines each in context. “Life” explores how Benjamin Franklin embodied colonial intellectual potential in the 1740s, and how he developed in London in the 1750s and 1760s. “Liberty” shows how the London rabblerouser John Wilkes catalyzed the politics of liberty in the 1760s, and why he resonated so loudly in the Colonies. “Happiness” explains what the Enlightenment blend of action and emotion meant in England in the early 1770s, and how Americans understood it on the cusp of their reinvention.

Bible reading made colonial Americans perhaps the most literate population on the planet, but the “life” of the American mind was rooted in London. In 1740, Philadelphia was the Colonies’ leading city, with a modern street grid and a handy location on the post road between Boston and Charleston, but its population of 10,000 was half that of Bristol in England. London’s coffee-house culture, and periodicals such as Addison and Steele’s short-lived Spectator, were the templates for Benjamin Franklin’s self-improving “Junto” book club, his Pennsylvania Gazette, and the Almanack that he published under the pseudonym Richard Saunders.

All American roads led to London, and back. A London printer, William Strahan, supplied British news for the Pennsylvania Gazette. Strahan’s protégé, David Hall, emigrated to Philadelphia and worked in Franklin’s print shop. In 1747, Franklin retired from trade, passed the shop to Hall, and commissioned his “coming-out” portrait as a gentleman. Franklin’s scientific studies were not just an expression of practical polymathy. England’s aristocracy of the mind were fascinated by science. When Franklin went to London in the 1750s, his electrical speculations were his calling card.

Meanwhile in London, Strahan was printing Samuel Johnson’s “Dictionary” in installments. Johnson was writing his own one-man periodical, the Rambler. Franklin launched Johnson in America, publishing excerpts in “Poor Richard’s Almanack.” Though Strahan linked the leading minds of American and British letters, Franklin and Johnson’s “division of perspectives” anticipated the parting of imperial ways. Franklin presented himself carefully, playing the “Gentleman in Philadelphia” for his London correspondents, just as he would later play the noble savage for Parisian admirers during the American revolution. Johnson was a tic-ridden social bumbler. Franklin was irreligious but believed in progress. Johnson, a prayerful Anglican, thought that “all change is of itself an evil.”

Mr. Moore describes their differences in the 1750s as “liberalism against conservatism,” but neither of those terms existed in those happy days before everyone had an “ideology.” The only word that made the king and his ministers “sit up and think hard about America,” Mr. Moore writes, was “France,” and that made the colonists want “more of Britain than less of it.” The Seven Years’ War (1756-1763) brought London and the colonists together, but the subsequent tax burden demonstrated how unequal the relationship was. Americans began to sour on the distant mother country, especially after George III and his ministers tried to ruin John Wilkes.

Link to the rest at The Wall Street Journal

Ukrainian Writer and Activist Victoria Amelina

From Publishing Perspectives:

As Publishing Perspectives readers will recall, when Ukraine’s Victoria Amelina gave us her thoughts on the slain children’s author and illustrator Volodymyr Vakulenko, she said, “Everyone in Ukraine, including Ukrainian writers, keeps losing their loved ones.”

Now, Amelina herself has been lost. She died on Saturday (July 1) of injuries sustained in the June 27 Russian missile strike on the pizza restaurant in the eastern city of Kramatorsk. Victoria Amelina was 37.

In the spring, she had made the trip to Lillehammer to be at the World Expression Forum, WEXFO, on May 22 and accept the International Publishers Association’s 2023 IPA Prix Voltaire Special Award for Vakulenko. One of the things she told Publishing Perspectives about the slain children’s author was that “Vakulenko believed we are to make history. He always responded to the challenges of his time.”

Today (July 3), the IPA’s offices in Geneva have reported that the Prix Voltaire Special Award now honors Amelina as well as Vakulenko. In a tweet on May 28, Amelina announced that she’d delivered the IPA’s special Prix Voltaire to Vakulenko’s mother.

. . . .

Iryna Baturevych at Ukraine’s publishing-industry news medium Chytomo writes to us, “We are shocked. [Amelina] has a little son, almost the same age as my son. He will be 12 in July. Victoria was courageous.”

As Chytomo’s article notes, Amelina was working with a watchdog organization called Truth Hounds, which monitors and documents details of potential war crimes.

As reported today (July 3) by CNN’s Svitlana Vlasova, Claudia Rebaza, Sahar Akbarzai, and Florencia Trucco, Amelina has become the 13th person now known to have died from the attack on Kramatorsk, which is close to the front lines in the Donetsk province. The attack was timed to a particularly busy moment when the Ria Lounge near Vasyl Stus Street was crowded with evening diners. At least 61 people are reported to have been wounded when what analysts say was a Russian short-range ballistic missile called an Iskander hit the restaurant.

In the BBC’s write-up, George Wright reports that Amelina’s first English-language nonfiction book, War and Justice Diary: Looking at Women Looking at War, is expected to be published, although no time frame for that release is mentioned.

Link to the rest at Publishing Perspectives

The Supreme Court Declares Independence

From The Wall Street Journal:

Fourth of July celebrations arrive with special resonance for conservatives this year. It is clear now that we are in the throes of a full-scale American cultural counterrevolution, propelled by rising popular opposition to the coercive orthodoxies of a hegemonic left and enforced by a string of impeccable decisions from a Supreme Court intent on reviving the spirit of 1776. Spectacular displays of pyrotechnics from an endangered establishment, increasingly hysterical at the dawning realization of its imminent overthrow, are as entertaining—and ultimately harmless—as any you will witness in the night skies this holiday.

. . . .

The popular rebellion against the cultural left that has seized so many of the institutions of American life has enjoyed mixed success at the political level. But since that hegemony was established in large part through half a century of grotesque judicial overreach, it was likely to be truly overturned only by a judiciary that would finally move to substitute restraint for activism.

With last week’s timely, pre-Independence Day succession of decisions by the court we can see better than ever that the counterrevolution is advancing apace. In Biden v. Nebraska, Students for Fair Admissions v. Harvard, and 303 Creative v. Elenis, a solid majority struck solid blows for the principles and values that helped create the U.S. in the first place.

In the three cases—respectively over the Biden administration’s student debt cancellation plan, racial preferences in college applications, and free-speech protections in commercial interactions—the judgments rescinded the usurpation by an expansive executive of the power of the purse, restored the principle of merit over group membership as a key determinant of individual opportunity, and reaffirmed a citizen’s right not to be compelled to endorse ideas with which she disagrees.

Note the common prefix in the principal verb in each of those subclauses, “re-.” One meaning is “back” or “backward.” But this is no reactionary backlash to the inevitable march of modernity, as most of the media, with predictable and prejudicial alarmism, calls it. These decisions, along with other critical rulings in this and the preceding court term, represent the necessary undoing of successive judicially authorized derogations of the most defining American principles—fairness, equality, freedom, the proper exercise and distribution of government powers.

The counterrevolution has been achieved through painstaking efforts by conservatives to recruit, develop and advance jurists of reliably originalist disposition, as well as through presidential nomination—by Donald Trump, especially—of well-qualified justices who wouldn’t, for a change, acquiesce to the left’s dominance.

But reading last week’s decisions, I would argue that the most important force working in this revolution’s favor—perhaps as it was in 1776—is the sheer intellectual weight of the argument.

Contrast the various majority opinions last week with the left’s dissents. The opinions of Chief Justice John Roberts and Justice Neil Gorsuch, writing for the majority in the three cases, are characterized by taut writing, unimpeachable logic, close adherence to argument, legal principles, facts and evidence, and detailed interpretation of existing case and statutory law. The fulcrum of the legal argument is—curiously enough—on the law as it was written, as it must be applied, not on some larger political or social objective.

With the exception of Justice Elena Kagan, the court’s members from the left write and speak with the apparently unchallengeable conviction that their role is not to apply the law but rather to make it, in the pursuit of some desired higher outcome—one that happens to conform to their ideological priors rather than to any constitutionally mandated principle or process.

In the process they adduce not legal reasoning, but political rhetoric of the crudest character and most clichéd language.

Justice Ketanji Brown Jackson, dissenting from the racial-preference ruling, tells us that “deeming race irrelevant in law does not make it so in life,” and that race still matters to the “lived experiences” of Americans. Justice Sonia Sotomayor misrepresents the decision in 303 Creative—which was, remember, about whether a business can be required to engage in a form of speech that violates its owner’s conscience—by saying the “symbolic effect of the decision is to mark gays and lesbians for second-class status.”

The left’s response to the reversal of its long success in making the court a second legislative branch of government is also telling. Instead of accepting, as limited-government, originalist conservatives did, the need for a long campaign to undo the hegemony of the rival philosophy, they want to short-circuit the process. This means protecting or restoring their authority by making radical institutional changes to the court or, failing that, by delegitimizing it, using a friendly media to impeach the reputation of justices they oppose with spurious allegations of impropriety.

Like almost all rebellions, the current American cultural counterrevolution is certain to face further and intensified resistance from the institutions and people whose dominance it threatens.

Link to the rest at The Wall Street Journal

Not necessarily to do with books, but a legal earthquake for US colleges and universities.

Why right-wing Europeans are flocking to an English thinker

From The Economist:

Customers at the Scruton cafe in Budapest don’t turn up for its chicken and buttered cauliflower. Nor for its decor. Instead they come for the peculiar contents on show. Scattered around the room are a writing table, a blue-and-white china tea set and a collection of vinyl records (classical) all shipped from England. Odd goods are also on sale. Spend 4,900 forints (about £11) and you can pick up a T-shirt emblazoned with “Conservatism is more an instinct than an idea”. A plaster bust of a tousle-haired, middle-aged man costs twice that. The figure depicted was a political thinker, Roger Scruton.

It’s peculiar to see a cafe in Budapest devoted to Scruton, a conservative Englishman who died in 2020. Stranger yet, two more such cafes exist in the Hungarian capital. In each are scattered “Scrutopia”: various items donated by his widow from his flat in London and farm in Wiltshire. Most striking is the sight of a saddle and riding crop. Scruton was not born to hunt, but took it up with enthusiasm in middle age, perhaps because the hobby chimed with his political philosophy and love of tradition. The author of over 50 books, he wrote about values of community, reciprocal obligations, courtliness and kingship, and more. These all, he believed, were embodied in the hunt.

In Britain, Scruton’s ideas have gained some traction. As prime minister, Theresa May appointed him to lead official efforts to rewrite planning rules to ensure new constructions went up along traditionalist lines—a process dubbed “building beautiful”. (It fell apart.) A summer school for avid Scruton disciples in Britain takes place each June. And his ideas still have a strong pull on the New Right of the Conservative Party, which tends to avoid taking detailed policy positions other than being in favour of tradition. Whereas Thatcherism involved ideas of borderless free-market capitalism, which went together with smashing up old ways of doing things, for Scrutonites there is much to be said for harking back to old ways. Sir Jacob Rees-Mogg, who was a senior figure in government until recently, and Suella Braverman, the home secretary, are two Scruton-enthusiasts.

Where Scruton stirs up the strongest interest, however, is among those on the right fringe of continental European politics—the most notable adherent is Hungary’s authoritarian leader, Viktor Orban. Scruton’s ideas of the home and his talk of the rituals of a nation seem to resonate especially strongly. Mr Orban can employ such ideas to try to justify his hostility to immigrants and to international institutions such as the European Union (even as it simultaneously subsidises his country’s economic growth). Some right-leaning Swedes are fond of Scruton, too. Italy’s prime minister, Giorgia Meloni, is another fan. She likes to quote one saying of his: “it is always right to keep things as they are, in case worse things are proposed”.

Link to the rest at The Economist

PG had never heard of Scruton, but he is sometimes drawn to authors who collect a lot of negative attention, especially if they write non-fiction.

The Individualists

From The Wall Street Journal:

Most politically attuned Americans will have some idea of what libertarianism is. Some think of it as an embrace of “business” or “capitalism” in all its forms; others as the anything-goes morality of 1960s radicalism. Still others, coming nearer the truth, will know libertarians as the people who want government “out of the bedroom and the boardroom,” as the slogan has it. In fact, as we are reminded by John Tomasi and Matt Zwolinski in “The Individualists,” libertarianism is a bit of all these things. It is a wildly diverse and inveterately fractious political tradition whose adherents have taken opposite sides on nearly every important political question.

The book documents libertarian thought, from its origins in the second half of the 19th century until now, on an assortment of topics, including markets, poverty, racial justice and the international order. The authors themselves claim the libertarian label and write clearly and charitably about all factions of the philosophy.

What is libertarianism, anyway? The easy answer: an approach to politics that seeks to minimize state coercion and maximize individual liberty. That generalization, though, covers a multitude of disputes. Most libertarians support legal abortion, for example, and some oppose it, in both cases for reasons of individual sanctity. In the absence of any easy formulation for what all libertarians think, Messrs. Tomasi and Zwolinski propose six “markers”: property rights, individualism, free markets, skepticism of authority, negative liberties, and a belief that people are best left to order themselves spontaneously. Libertarians, the authors contend, keep all six principles in view at the same time. 

They divide libertarianism into three historical eras, each responding to particular threats to liberty. The “primordial”-era libertarians—Frédéric Bastiat (1801-1850) in France, Herbert Spencer (1820-1903) in Britain—formed their ideas in opposition to socialism. In 19th-century America, the great threat to liberty wasn’t socialism but slavery. Early American libertarians like the journalist Lysander Spooner (1808-1887) saw slavery “primarily through the lenses of authority and property rather than of race,” the authors write. “Libertarians condemned slavery as an unjust usurpation of individual sovereignty and a denial of the individual’s rightful entitlement to the fruits of their labor.” There was wisdom in understanding chattel slavery as theft—a definable crime—rather than as a form of the more nebulous sin of racism.

During the Cold War—the authors’ second era—libertarians tended to align with conservatives in opposition to central planning and devoted their attention mostly to economic subjects. It was an uneasy alliance. Libertarians and conservatives were both anticommunist on questions of personal liberty and economics, but the right favored military buildup and libertarians hated militarism. The two camps clashed on crime, drug legalization, abortion, obscenity on the airwaves and more. Already in 1969 the libertarian economist Murray Rothbard (1926-1995) urged his readers “to go, to split, to leave the conservative movement where it belongs.”

The alliance largely dissolved in the years after the fall of the Berlin Wall. Libertarians, quite as much as conservatives and liberals, experienced a crisis of identity. With socialism discredited everywhere, what held libertarianism together? In the 21st century, the movement in the U.S. has consisted of an assortment of competing, often disputatious intellectual cadres: anarchists, anarcho-capitalists, paleo-libertarians (right-wing), “liberaltarians” (left-wing) and many others.

Link to the rest at The Wall Street Journal

First Page Critique: Defining the Scope of Your Memoir

From Jane Friedman:

A summary of the work being critiqued

Title: When Did You Know?
Genre: memoir

Emily was adopted at birth, and I was privileged to be present when she was born. When she first emerged, I wondered if something was wrong; she didn’t cry right away. You’re worrying too much. But at six months old, she didn’t respond with recognition when I picked her up from day care, a flat affect. Later Emily had a choking episode that might have started with a seizure. This started a series of medical tests revealing a chromosomal abnormality, a seizure disorder, and autism spectrum disorder. By this time, we had also adopted her biological sister, Madison, also at birth, an energetic blond girl who later was diagnosed with ADHD.

The memoir shares the pains and joys of parenting these girls, addressing topics such as spirituality and autism, nutrition and weight issues, fatigue, behavior management, sibling rivalry, friendships and sexuality. This book includes resources and ideas of what helped me navigate the challenges of autism parenting.

First page of When Did You Know?

“Did you know she had autism when you adopted her?”

“No, I just thought she slept a lot. We were able to take her to movies and she’d sleep through them,” I said. People would look at us as we lugged her carrier into the cinema but she’d be silent all the way through.

But even from the day she was born I wondered. I also wondered about Amber, her biological mother. I eventually found out a lot about Emily, our daughter with autism, and Amber, her biological mother.

“The baby’s coming, push Amber,” the doctor said at the foot of the bed. He was surrounded by nurses and medical students in blue scrubs.

“Come look,” he told me. He knew I was the adoptive mother.

I stood behind him as a blueish, brownish dome emerged.

“Push one more time, Amber,” they said.

I stood back. Amber grunted and cried out. That must hurt so much. The baby whooshed into the doctor’s gloved hands. He held her up.

“Is there only one?” Amber asked.

“Yes,” he answered.

Earlier while I waited with Amber during labor, she mentioned that a lady in the grocery store had commented it looked like she was having twins. I was surprised she thought twins were a real possibility and that no one put her fears to rest. I paused and wondered about Amber. Just like when we first met her a few days earlier.

Continue reading the first pages, with color coding and comments by editor Hattie Fletcher.

Dear Julia,

Thanks for sharing your work and the first pages of your memoir about parenting—certainly an important story for you and one that’s potentially incredibly useful to other parents who might find themselves in a similar situation.

My first big-picture observation is that it feels a bit hard to see the shape of the story from the materials you’ve submitted. That is, your summary describes the book as a memoir that “shares the pains and joys of parenting these girls …” but that’s potentially a pretty big and abstract story. Do you plan to focus on their very early years, or the time up until they’re 18 and (perhaps) leaving the nest? Do you want to focus primarily on a specific aspect or aspects of parenting—perhaps on your experiences seeking help from the medical community, or your challenges finding a peaceful rhythm as a family? Obviously, readers don’t want to start a book already knowing the ending (at least, not usually), but in the book pitch you want to define the scope of the story more specifically.

There’s no firm rule for how much time/story a memoir can cover. A quick look at some memoirs about parenting will show you many different approaches: Anne Lamott’s classic Operating Instructions is a journey through the first year of her son’s life; Mary Louise Kelly’s It. Goes. So. Fast. is framed around her son’s last year of high school. (A one-year narrative can make a tidy frame, indeed.) But many writers tackle longer arcs: Ron Suskind’s Life, Animated covers almost two decades of his family’s efforts to use Disney movies to help his autistic son engage as fully as possible with them and the rest of the world.

Once you have established the scope of your story and your overall narrative arc, then you can think about where to first enter the story and begin to introduce your characters/family. Maybe you’re telling the story of your young daughters from Emily’s birth to the day when (I’m making stuff up now), at the ages of twelve and fourteen, both girls climbed on a bus together for a week-long group wilderness adventure. Or maybe you’re telling the story from the day of Emily’s choking episode to her high school graduation lunch. Your goal, essentially, is to find a satisfying narrative arc that will take readers on a journey with you and that will provide—even if there’s a lot of mess along the way—some degree of resolution of a central question or tension.

That being the goal, I’d question whether Emily’s birth is the most effective starting point. Birth has, of course, the obvious advantage of being a very clear beginning—Day One! On the other hand, a purely chronological organization of material can sometimes feel tedious on the page. (First, Emily was born. … In her first week … When she was two months old … And then, when she was one…)

In fact, you’re already sort of building in a bit of that framework, by starting—even if only for a few lines—with a fast-forward in order to flash back to the birth. So, for a next draft, I would be inclined to start somewhere a little farther along. And then, after you’ve established some of the conflict/tension of the overall narrative, you could jump back to the time of Emily’s birth (and even before that, it seems), and look at it through the lens of the information you later learned.

Regardless of where in the larger work the birth scene ends up, when you do get there, it might be helpful to consider the pacing of the scene. There’s a balance, usually, between spending too much time in one scene (which, like a chronological organization to a book, can become tedious) and weaving in other elements, such as reflection, description, character development, etc. It seems to me that your first pages currently bounce around quite a bit between several scenes/times, in a way that feels a bit disorienting and jumpy. Focusing more on individual times might make it easier for readers to follow the story, and also open up a little space for you to go into more detail. I’ve color-coded your first pages to show the jumps in time visually. Until the long stretch at the end, a couple days after the birth, you can see there’s a jump every few paragraphs, more or less. It might make sense to consider grouping some of the color sections into bigger sections, whether or not in chronological order.

Link to the rest at Jane Friedman

‘The World’ Review: History Branched Out

From The Wall Street Journal:

You know you are in for quite a ride when an enormous tome of world history begins: “I am Enheduanna, let me speak to you!” An Akkadian high priestess of the moon god Nanna from the third millennium B.C., Enheduanna had been seized by raiders, likely raped, and then miraculously saved and restored to power. According to Simon Sebag Montefiore, Enheduanna was “the first woman whose words we can hear, the first named author, male or female, the first victim of sexual abuse who wrote about her experiences.”

This is not the history you learned in school.

“The World” tells the story of humanity through families, be they large or small, powerful or weak, rich or poor. It is a book for people who want to read about people. There’s little attention paid to impersonal forces. Readers interested in isms—feudalism, imperialism, capitalism, etc.—won’t find these subjects explicitly discussed in this book. Rather, the author addresses the faceless structures of human existence by writing about who advocated for and implemented them, and who benefited from or suffered under them. “The World” pulsates with the hundreds of human stories Mr. Montefiore brings to life in vivid, convincing fashion. Enheduanna has about a page dedicated to her life; other individuals have a bit more; some, a single paragraph. This is history as collective biography, a journey across almost two million years, from the appearance of Homo erectus in east Africa to the rise of Xi Jinping’s China.

Mr. Montefiore has been working up to this ambitious project over his career, from biographies of Prince Grigory Potemkin and Stalin, to a fine study of the Romanov dynasty, to a 3,000-year history of Jerusalem. That book was aptly subtitled “a biography” of the city, for Mr. Montefiore is a biographer at heart. Combining literary flair with keen insight into human psychology, he can evoke a person with a few choice words—“porcine tyrant,” his description of Belarus’s current leader, Alexander Lukashenko, nails its subject.

Among the many strengths of “The World” is its truly global perspective. This is an unabashedly multicultural history that refuses to privilege any particular perspective, be it geographic, cultural or ethnic. Africa warrants as much consideration as Europe, Asia as the Americas. Nor does the book forsake the lives of the common folk for kings and queens, tycoons and presidents. The focus on families allows for light to shine on women, children and others often ignored in our master narratives. In a tactic typical of his original approach, the author opts not to tell the familiar story of Sally Hemings and Thomas Jefferson, but instead focuses on Sally’s mother, Betty, a child of rape with a white father, a certain Capt. Hemings, who later tried to buy her and once tried to kidnap her.

. . . .

For the very powerful, family has often meant dynasty. For many others, it has meant intimacy, care and love. But enslaved persons, throughout human history, have been denied such essential goods. “Slavery shattered families,” Mr. Montefiore writes, “it was an anti-family institution.” Where enslaved families did exist, in Roman households or Islamic harems, for example, they “encompassed coercion without choice, and often outright rape.” His account in “The World” pays significant attention to such injustices and acknowledges the victims of the past. Mr. Montefiore tells us that among the first names ever recorded in writing were the enslaved persons En-pap X and Sukkalgir in Uruk roughly five thousand years ago.

At the same time, history shows that, along with being nests of succor, families can be “webs of struggle and cruelty.” The Chinese statesman Han Fei Tzu warned the monarch in the third century B.C., “Calamity will come to you from those you love.” This truth recognizes no constraints of class, power or wealth.

The relentless chronological march of Mr. Montefiore’s book is leavened, and given an aspect of suspense, by his habit of picking up the family stories of significant individuals long before they take center stage. The Kennedys are introduced through patriarch Joseph Kennedy, swanning about in 1920s Hollywood and cashing out before the crash. Barack Obama’s story begins with his father, a Kenyan economics student whose political patron, Tom Mboya, had met Sen. John F. Kennedy at Hyannisport, Mass., and made the case for foreign scholarships. “Mboya chose Obama, who left for Hawaii; Kennedy won the presidential election.” Donald Trump has long been “the personification of American illusion,” Mr. Montefiore remarks: a “bombastic bazooka of complex inferiority.” But he begins keeping an eye on the “quintessential American story” of the Trump family as far back as the 1880s.

The author is equally on target with his account of Vladimir Putin. He highlights Mr. Putin’s rapid transformation from an awkward, clumsy leader into a murderous authoritarian, notorious for his “gangsterish swagger.” Mr. Montefiore rightly rejects the argument that the West and NATO are to blame for Russia’s invasion of Ukraine. If the war amounts to a needless tragedy, it is not unexpected. After all, such acts of mass violence are nothing new. After decades of peace, as he puts it, “normal disorder has been resumed.”

Link to the rest at The Wall Street Journal

Soldiers Don’t Go Mad

From The Wall Street Journal:

Two of England’s finest poets of World War I—Siegfried Sassoon and Wilfred Owen—met in a mental hospital in Scotland in 1917. Craiglockhart War Hospital, near Edinburgh, is the subject of “Soldiers Don’t Go Mad,” Charles Glass’s brisk, rewarding account of the innovative doctors and their “neurasthenic” patients who suffered unprecedented psychological distress (and in unprecedented numbers) on the Western Front. By 1915, the second year of the war, over half a million officers and enlisted personnel were admitted to medical wards for “deafness, deaf-mutism, blindness, stammering, palsies, spasms, paraplegia, acute insomnia, and melancholia”—hallmarks of what at the time doctors termed “shell shock” or, as it has become known, post-traumatic stress disorder (PTSD). Modern warfare overwhelmed countless young soldiers: “For the first time in history, millions of men faced high-velocity bullets, artillery with previously unimaginable explosive power, modern mortar shells, aerial bombardment, poison gas, and flamethrowers designed to burn them alive,” as Mr. Glass, a former chief Middle East correspondent for ABC News, recounts.

Craiglockhart, originally known as the Edinburgh Hydropathic or “The Hydro,” opened as a hospital in October 1916 for “officers only.” Its château-like main building, elaborate gardens and sweeping lawns were more elite health club than mental ward. Low-impact activities available to patients—“carpentry, photography, debating, music, and writing”—may well have confirmed the suspicions of “most senior officers, including many Medical Corps physicians, [who] regarded shell shock as nothing other than malingering or cowardice that demanded not treatment, but punishment.” To the pioneering physicians at Craiglockhart, however, the damage that trench warfare inflicted on the psyche was painfully real, often giving rise to a soldier’s “trembling limbs, halting voice, and confused memory.”

Wilfred Owen, 24, had exhibited these very symptoms in France, after surviving the blast of a trench mortar shell and spending several days unconscious, sprawled amid the remains of a fellow officer. The Army Medical Board declared Second Lt. Owen unfit for duty and consigned him to Craiglockhart for treatment. His physician there, Arthur Brock, had developed a work-based approach to recovery he called “ergotherapy,” as a counter to the popular rest cure of “massage, isolation, and a milk diet.” Brock fostered activity and community, and, when Owen expressed an interest in literature, he “encouraged him to write poetry, essays, and articles” as part of his therapy. Owen took over editorship of the Hydra, the hospital’s literary journal, in which some of the most memorable poems of the war appeared, including those of his newest acquaintance, the 30-year-old Second Lt. Siegfried Sassoon.

Sassoon took a different route to Craiglockhart than Owen, with army politics playing a role as much as mental health. Sassoon had acquired the nickname “Mad Jack” for his forays into that part of the battlefield known as No Man’s Land. Enraged at the death of his training-camp roommate, David Cuthbert Thomas (“little Tommy”), Sassoon charged the enemy line for 18 days, with what some suspected was a death wish: “They say I am trying to get myself killed. Am I? I don’t know.”

But Sassoon’s raids on No Man’s Land—brave, unhinged or both—did not precipitate his review by the medical board. That followed, instead, from the outspoken officer’s criticism of the “political errors and insincerities for which the fighting men are being sacrificed.” Following a medical furlough, occasioned by a bullet wound to the shoulder, Sassoon refused to return to duty. Rather than court-martial the dissenter, who had been awarded the Military Cross for “conspicuous gallantry,” the board sent him to Craiglockhart. Was he in fact suffering from PTSD? His friend and fellow poet Robert Graves came to think so, though Sassoon’s doctor, the renowned psychiatrist W.H.R. Rivers, demurred. “What have I got then?” Sassoon asked Rivers, to which Rivers laughingly replied, “Well, you appear to be suffering from an anti-war complex.”

Rivers was “a polymath with notable achievements in neurology, clinical psychiatry, medical research, anthropology, and linguistics,” and—even more than Sassoon and Owen—he is the protagonist of Mr. Glass’s account of Craiglockhart. He founded England’s first psychology laboratories in London and Cambridge, where he was a fellow of Saint John’s College. (A dynamic portrait of Rivers may also be found in Kay Redfield Jamison’s recent “Fires in the Dark: Healing the Unquiet Mind,” in which Rivers’s fascination with religion, ritual and myth is shown to have contributed importantly to his treatment of mental illness.) But the physician was also a soldier, and his brief was not only to cure patients but also to fortify them for return to combat, often with predictably dire outcomes. Though Rivers was devoted to Sassoon, who came to see him as his “father confessor,” Sassoon’s pacifism put Rivers in a difficult position.

Sassoon had been at Craiglockhart for three weeks before Owen worked up the courage to introduce himself. By way of entrée, he brought several copies of Sassoon’s collection “The Old Huntsman and Other Poems” for him to sign. They talked for a half hour, during which Owen expressed his admiration, and Sassoon concluded that he “had taken an instinctive liking to him and felt that I could talk freely.” Though both were homosexual, the two men came from completely different worlds. The aristocratic Sassoon, whom Owen described as “very tall and stately, with a fine firm chisel’d . . . head,” was educated at an upper-crust “public” school and Cambridge. Owen, the son of a railway inspector, attended a local “comprehensive” school and missed the first-class honors necessary for a scholarship to University College London.

Link to the rest at The Wall Street Journal

How Books Are Used to Perpetuate the Prison Industrial Complex

From Book Riot:

The prison industrial complex is a term you may have heard if you’ve looked into abolitionist thinking or learned about the contexts around social movements like Black Lives Matter. It’s a term that, as defined by abolition group Critical Resistance, describes “overlapping interests of government and industry that use surveillance, policing, and imprisonment as solutions to economic, social and political problems.” While the concept of the prison industrial complex covers a huge number of social structures, you’ll most frequently hear it used in discussions about mass incarceration, for-profit prisons, and how criminalising and imprisoning people benefits the rich and powerful (particularly politicians and CEOs) while doing nothing to tackle or prevent harm that takes place within society at large.

The term “prison industrial complex” was coined by activists as well as incarcerated people and their families. And a number of academics and writers have provided powerful critiques of the prison industrial complex — Angela Y. Davis with Are Prisons Obsolete?, Michelle Alexander with The New Jim Crow: Mass Incarceration in the Age of Colorblindness, Elizabeth Hinton with From the War on Poverty to the War on Crime: The Making of Mass Incarceration in America, and many less formalised writings from abolitionist groups and activists.

. . . .

There has been a great deal of writing and literature about the prison industrial complex — but are books one of the structures supporting the system itself?

. . . .

In my earlier article on copaganda in crime books, I explored how literature, particularly the detective genre, has often bolstered a pro-police social agenda. The same is true for the prison industrial complex, and pro-carceral justice structures. In detective fiction, prison for the villain is often a major part of a happy ending — although in many crime books, the conclusion involves the murderer dying in a showdown with the hero, rather than being incarcerated. Literature often portrays prison as largely unproblematic, the only issue being when innocent people are imprisoned. While miscarriages of justice are enormously harmful, the prison system also enacts enormous and disproportionate violence on people who have committed the crimes they are accused of; however, social attitudes to this are very different, with the refrain “if you can’t do the time, don’t do the crime” commonly thrown back at people who argue against carceral abuses.

. . . .

Throughout the majority of literature, prisons are represented as frightening but necessary institutions that largely keep the public safe, only harmful when people are incarcerated for a crime they didn’t commit. There is little focus on, or care for, the harm that prisons cause to people who have committed crimes, or their harm to society as a whole. The area of literature that has paid the greatest amount of attention to the harms of the prison industrial complex is the prison memoir, but, once again, the most successful examples of these are books by political prisoners, or people incarcerated because of racism or other forms of bigotry, such as Nelson Mandela or the Exonerated Five. These stories are incredibly important, but they also show how incarcerated people are generally only portrayed as sympathetic if the person is “not a real criminal,” but someone who has been done wrong by the system, instead of considering the possibility that the system of imprisonment is itself wrong. “Real criminals” don’t get the same kind of sympathy or humanisation in literature, and it is rarely suggested that the prison industrial complex itself may be the true villain.

Link to the rest at Book Riot

PG does not characterize U.S. prisons as ideal surroundings for anyone, including prison employees or convicts, but that doesn’t mean that prisons are a terrible idea or that anyone has found a better way of protecting people who don’t commit crimes from people who do.

When you’re living in a relatively safe world, where the chances of being injured or killed by anything other than old age or terminal illness are small, prison can seem cruel and overly strict. However, no one is more relieved than the victim of a serious crime to hear that the person who committed it has been locked up in prison so she/he can’t commit another crime for a good long while or retaliate against the victim for reporting the crime.

PG has visited people in prison in two roles – as a hired attorney and as a representative of the prisoner’s religion, e.g., pastoral visits.

He has yet to meet anyone who is incarcerated who PG didn’t think was a danger to his/her community and the perpetrator of one or more acts that harmed other people.

The primary purpose of a prison is to deter everyone from violating the law and to punish those who have done so. A secondary purpose is to protect society in general from a repetition of the bad acts of an individual who, absent punishment, is likely to repeat those bad acts to the detriment of others, either individually or collectively.

“The Prison Industrial Complex” is a manipulative, Marxist-style term designed to shape perceptions and emotions to the detriment of the underlying purpose of prisons – to protect those who don’t commit crimes from the actions of those who do. (See also, Military Industrial Complex.)

Absent prior convictions, non-violent criminals tend to receive punishments lighter than prison: probation, supervised or unsupervised, granted either after they’ve served a period of incarceration or, often, after some time in jail prior to the sentencing hearing. Probation usually involves meeting with a state or federal probation officer on a regular basis and a condition that the probationer not commit another crime while on probation.

Depending on the jurisdiction, supervised probation may require weekly, monthly or sometimes bi-monthly visits with a probation officer to discuss what the probationer has been doing – whether he/she is attending school, going to work or meeting some other condition the sentencing judge may impose, such as attending Alcoholics Anonymous meetings weekly, spending some time helping a local charitable organization, etc.

Some prisoners finish their high-school diplomas or college degrees while incarcerated. Prison wardens are generally quite pleased with someone who does this sort of thing, and it is a plus if the prisoner seeks early release for good behavior.

PG has yet to visit anyone in prison who does not acknowledge that he/she deserves to be there.

Russians join the resistance with pirate radio and Tinder

From The Economist:

In a bedsit in Vologda, a Russian city 500 miles north of Moscow, a man sat at a desk surrounded by recording equipment. In his early 60s, tall and thin with long grey hair, glasses and a moustache, he looked like an ageing rock star making a new album. His name was Vladimir Rumyantsev. He lived alone, and his day job was as a stoker, tending a furnace in a factory boiler-room. In the evenings he was the DJ of his own pirate-radio station, broadcasting anti-war diatribes against Putin’s “special military operation” in Ukraine.

Rumyantsev set up the station before the war as a hobby. Radio Vovan (a play on a nickname for Vladimir) mainly broadcast music from the Soviet era that he found in online archives. He said he needed a break from oppressive state propaganda. “Some kind of ‘patriotic’ hysteria started on the airwaves, and as the sole occupant of the flat I voted unanimously to ban the broadcasting of federal TV and radio channels in my home. Well, I had to create something of my own to replace it,” he wrote to me.

. . . .

In December Rumyantsev was found guilty of spreading “deliberately false information”, in the words of the judge at Vologda city court, that Russian soldiers in Ukraine were “robbing, killing and raping civilians, destroying hospitals, maternity hospitals, schools and kindergartens”. He was given the option of three years of house arrest or three years in prison. He chose prison.

I learned about Rumyantsev when a friend texted me a news article about the case. He knew I would appreciate the coincidence. A couple of months earlier, at the beginning of the war, I had finally published a novel I had been working on for ten years. Called “Radio Martyn”, it’s set in a nameless totalitarian country, ruled by hatred and fear. The novel is about a secret resistance movement, led by a pirate-radio station fighting propaganda. It was meant as a futuristic dystopia, only life has a habit of imitating art.

I struck up a correspondence with Rumyantsev, using his lawyer as an intermediary. His story gave me some hope. It helps destroy the myth that everyone in the country supports the war, that society in Russia is a monolith – a frozen piece of sxxx. The truth is that every day people in Russia, at great risk to their safety and livelihoods, are trying to stop Putin or help his victims. Their desire to fight this monumental injustice is so strong that even knowing they could lose their jobs, be thrown in prison, tortured or even killed, does not deter them. After all, such threats seem insignificant compared with the horrors their state is perpetrating on the people of Ukraine.

. . . .

Rumyantsev was born in Vologda in 1961. His mother worked in an ice-cream factory and his father in a factory that repaired rolling stock. He and his brother, Sergei, grew up in a flat that his family shared with several others. Such poverty was ordinary: “Life was just like most people’s,” said Rumyantsev.

He had very poor eyesight; when he was 14 he was sent to a state-run boarding school for disabled children. The teachers, many of whom had served in the second world war, shocked him by impugning the Russian war effort and Stalin, who had died eight years before Rumyantsev was born. They taught him to think critically: that the truth was more complicated than the official line.

. . . .

As a schoolboy in the 1970s, Rumyantsev had assembled a receiver with the help of a physics textbook and a children’s electronics set (“my first experience of radio hooliganism”). He was able to pick up foreign stations that were banned in the Soviet Union, including the Voice of America and the BBC. He noticed how these so-called “enemy voices” dared to ask questions, unlike their Soviet counterparts.

Once he left school, Rumyantsev worked as a locksmith in factories and briefly as a trolleybus conductor. An expert in local history, he also used to give tours of old Vologda. Rumyantsev, described by his lawyer as a “man of the proletariat”, demolishes one of Putin’s favourite propaganda myths: that only “bearded intellectuals” are against the war.

. . . .

I am not crazy. I know that in a country of nearly 150m people, zombified by two decades of propaganda, there are many people who support Putin. But the state, through its actions in Ukraine, is pushing more and more ordinary citizens to join the resistance. Every day since February 24th 2022, Russians across the country have been protesting against the war.

. . . .

They send money to Ukraine’s army. They hack Russian websites. They help young men escape the draft. One banner at a protest read, “The sixth commandment: thou shalt not kill”. They post leaflets through letterboxes and under windscreen wipers.

They graffiti “No to war” on walls. They scrawl slogans on banknotes. They weave green ribbons into their hair, hang them on fences and in the metro (the colour you get when you mix the yellow and blue of Ukraine’s flag being symbolic of resistance). They make figurines out of plasticine, attach an anti-war message to them and leave them in public places. They hang posters in their windows and on their balconies, on bridges and in shop windows.

They set up Tinder accounts for Putin: the profiles say things like “looking for someone who will love me after all the atrocities”, followed by information about the massacre of civilians by Russian troops in Bucha.

Link to the rest at The Economist

Virginia Woolf’s Forgotten Diary

From The Paris Review:

On August 3, 1917, Virginia Woolf wrote in her diary for the first time in two years—a small notebook, roughly the size of the palm of her hand. It was a Friday, the start of the bank holiday, and she had traveled from London to Asheham, her rented house in rural Sussex, with her husband, Leonard. For the first time in days, it had stopped raining, and so she “walked out from Lewes.” There were “men mending the wall & roof” of the house, and Will, the gardener, had “dug up the bed in front, leaving only one dahlia.” Finally, “bees in attic chimney.”

It is a stilted beginning, and yet with each entry, her diary gains in confidence. Soon, Woolf establishes a pattern. First, she notes the weather, and her walk—to the post, or to fetch the milk, or up onto the Downs. There, she takes down the number of mushrooms she finds—“almost a record find,” or “enough for a dish”—and of the insects she has seen: “3 perfect peacock butterflies, 1 silver washed frit; besides innumerable blues feeding on dung.” She notices butterflies in particular: painted ladies, clouded yellows, fritillaries, blues. She is blasé in her records of nature’s more gruesome sights—“the spine & red legs of a bird, just devoured by a hawk,” or a “chicken in a parcel, found dead in the nettles, head wrung off.” There is human violence, too. From the tops of the Downs, she listens to the guns as they sound from France, and watches German prisoners at work in the fields, who use “a great brown jug for their tea.” Home again, and she reports any visitors, or whether she has done gardening or reading or sewing. Lastly, she makes a note about rationing, taking stock of the larder: “eggs 2/9 doz. From Mrs Attfield,” or “sausages here come in.”

Though Woolf, then thirty-five, shared the lease of Asheham with her sister, the painter Vanessa Bell (who went there for weekend parties), for her, the house had always been a place for convalescence. Following her marriage to Leonard in 1912, she entered a long tunnel of illness—a series of breakdowns during which she refused to eat, talked wildly, and attempted suicide. She spent long periods at a nursing home in Twickenham before being brought to Asheham with a nurse to recover. At the house, Leonard presided over a strict routine, in which Virginia was permitted to write letters—“only to the end of the page, Mrs Woolf,” as she reported to her friend Margaret Llewelyn Davies—and to take short walks “in a kind of nightgown.” She had been too ill to pay much attention to the publication of her first novel, The Voyage Out, in 1915, or to take notice of the war. “Its very like living at the bottom of the sea being here,” she wrote to a friend in early 1914, as Bloomsbury scattered. “One sometimes hears rumours of what is going on overhead.”

In the writing about Woolf’s life, the wartime summers at Asheham tend to be disregarded. They are quickly overtaken by her time in London, the emergence of the Hogarth Press, and the radical new direction she took in her work, when her first novels—awkward set-pieces of Edwardian realism—would give way to the experimentalism of Jacob’s Room and Mrs. Dalloway. And yet during these summers, Woolf was at a threshold in her life and work. Her small diary is the most detailed account we have of her days during the summers of 1917 and 1918, when she was walking, reading, recovering, looking. It is a bridge between two periods in her work and also between illness and health, writing and not writing, looking and feeling. Unpacking each entry, we can see the richness of her daily life, the quiet repetition of her activities and pleasures. There is no shortage of drama: a puncture to her bicycle, a biting dog, the question of whether there will be enough sugar for jam. She rarely uses the unruly “I,” although occasionally we glimpse her, planting a bulb or leaving her mackintosh in a hedge. Mostly she records things she can see or hear or touch. Having been ill, she is nurturing a convalescent quality of attention, using her diary’s economical form, its domestic subject matter, to tether herself to the world. “Happiness is,” she writes later, in 1925, “to have a little string onto which things will attach themselves.” At Asheham, she strings one paragraph after another; a way of watching the days accrue. And as she recovers, things attach themselves: bicycles, rubber boots, dahlias, eggs.

. . . .

Between 1915 and her death in 1941, Woolf filled almost thirty notebooks with diary entries, beginning, at first, with a fairly self-conscious account of her daily life which developed, from Asheham onward, into an extraordinary, continuous record of form and feeling. Her diary was the place where she practiced writing—or would “do my scales,” as she described it in 1924—and in which her novels shaped themselves: the “escapade” of Orlando written at the height of her feelings for Vita Sackville-West (“I want to kick up my heels & be off”); the “playpoem” of The Waves, that “abstract mystical eyeless book,” which began life one summer’s evening in Sussex as “The Moths.” 

Link to the rest at The Paris Review

Dominant languages can spread even without coercion

From The Economist:

Never think the world is in decline. A recent book, “Speak Not” by James Griffiths, looks at the bad old days when it was seen as acceptable to impose a culture on others through force. The author tells the stories of Welsh and Hawaiian—languages driven to the brink of death or irrelevance before being saved by determined activists.

Americans fomented a coup in Hawaii that led to its eventual annexation. Missionaries built schools and fervently discouraged local customs like the hula, a performance in honour of ancestors that the Americans considered lascivious. Oppression of culture and of the language went hand in hand: by the late 20th century the only fluent Hawaiian-speakers were worryingly old. But activists fought to expand teaching of it, and eventually brought Hawaiian into many schools. The number of speakers is now growing. Even some of the state’s many citizens of other ethnicities find it fashionable to learn a bit.

Welsh survived centuries of union with England largely because of Wales’s relative isolation and poverty. But in the 19th century British authorities stepped up efforts to impose English; schoolchildren had to wear a token of shame (the “Welsh Not”) if they spoke their native language, the kind of tactic seen in language oppression around the world.

Again, activists fought back. In 1936 three of them set fires at an air-force training ground built despite local opposition. The perpetrators turned themselves in, then refused to speak any language but Welsh at their first trial. It ended in a mistrial; their second resulted in a conviction, but on their release nine months later the arsonists were feted as heroes. They had lit a fire under Welsh-language nationalism, which in later decades would not only halt the decline in Welsh-speakers, but reverse it. Today the right to speak Welsh at trial (and in many other contexts) is guaranteed.

Mr Griffiths’s book ends with a sadder tale. Though Mandarin is the world’s most-spoken native language, China still has hundreds of millions of native speakers of other Chinese languages such as Cantonese (often misleadingly called “dialects”), as well as non-Han languages like those used in Inner Mongolia and Tibet. Evidently regarding this variety as unbefitting for a country on the rise, the authorities have redoubled their efforts to get everyone speaking Mandarin—for instance by cutting down Cantonese television and resettling Han Chinese in Tibet, part of a wider bid to dilute its culture. A regime indifferent to the tut-tutting of outsiders can go even further than American and British colonialists.

But English spreads by less coercive means, too. Rosemary Salomone’s new book, “The Rise of English”, tells the tale of a language that has gone from strength to strength after the demise of Britain’s empire and perhaps also of America’s global dominance. These two forces gave English an impetus, but once momentum takes hold of a language, whether of growth or decline, it tends to continue. Everyone wants to speak a language used by lots of other influential people.

The triumph of English led to the death of many languages (notably indigenous ones in America, Canada and Australia), but elsewhere it has merely humbled them. Ms Salomone looks at the Netherlands, where English fever has led to its explosion in universities. Entire graduate and even undergraduate curriculums are in English. Students submit essays on Dutch poets in English.

Link to the rest at The Economist