The Education of Henry Adams

From The Wall Street Journal:

A foundational text for every American Studies program, and one of the most original books published by an American author, “The Education of Henry Adams (An Autobiography)” ought to be read by educated citizens twice: first for its insight into contemporary history, and then for what it reveals about the nature of education. Written in 1905, privately printed in 1907, and published in 1918, its importance lies in its innovative form and prophetic content. Startlingly idiosyncratic and unprecedented in the genre, the work had only one forebear, Benjamin Franklin’s “Autobiography” (1793), which Adams called a “model of self-teaching.”

Adams (1838-1918) would demonstrate that writing autobiography was a fusion of record and reflection. In a radical decision, he wrote about himself in the third person, a detachment that freed him to make arbitrary additions or deletions. The best known of the latter comes at the center of the book, where he eliminated mention of many years of his life. In 1885 his wife, Marian “Clover” Hooper Adams, had died by suicide. The shock led Adams to suppress any record of the event and its aftermath—and, for that matter, any reference at all to his spouse. He destroyed his diaries of the period and all correspondence between them. He resumed his narrative only after several years of silence. His life’s education would be the only compensation for his loss.

Adams was born in Boston, a direct descendant of two presidents and son of a diplomat. He would end up building a house across Lafayette Square from the White House, and becoming a dominant intellectual figure in American life.

Much of the first half of the book is about his disappointment with, and the failure of, his education. Of his early schooling, he recalled, “In any and all its forms the boy detested school.” From the beginning he felt his learning was inadequate. Of Harvard, which he entered in 1854, he concluded, “The education he had received bore little relation to the education he needed.”

There followed travel to Europe’s major cities of learning, but in Berlin “the experience was hazardous,” and in Rome he found “no education.” In England in 1868, he observed that the typical educated statesman “did not know what he was talking about.”

Back in Washington, Adams noted that the capital was “a poor place for education.” He summarized that “he had been mistaken at every point of his education.” But this also turned out to be a moment of revelation and advance, expressed in some of his memorable sentences: “He knew he knew nothing” and “He knew enough to be ignorant.” “Then he asked questions. . . . Probably this was education.” For Adams, even self-examination was painful: “He knew no tragedy so heartrending as introspection.”

In fact, he had learned the primacy of self-teaching—that it was participatory, not receptive; dynamic rather than passive. The second half of his book engages the great issues of his time as he envisioned them, often with startling prescience.

Visits to the two major expositions of his era helped to reshape his thinking. The 1893 World’s Columbian Exposition in Chicago gave him an image of American power, and the 1900 Universal Exposition in Paris presented him with the new dynamos on display—symbols, he found, of the energy of the modern age. This led him to conceive of a dynamic theory of history, in which everything was in constant motion and all facts were relative. At last, “he was getting an education. . . . The new forces would educate.”

Having previously written a book about the architecture of Mont Saint-Michel and Chartres and the power of the Virgin as the embodiment of 13th-century unity, he could now cast the present as a study of 20th-century multiplicity—the absence of absolutes. The language of science explained the manifestations of modernism, as he took note of the new research on radium, the X-ray and the atom. The idea of magnetism in particular required an understanding of relationships, rather than one fixed position.

Adams was an elegant literary stylist, and he frequently crafted sentences of paired phrases, joined by a central semicolon. One of the most famous in “The Education” was “In plain words, Chaos was the law of nature; Order was the dream of man.” In one sense, this was like the structure of a Gothic arch, two parts leaning together with a keystone at the apex. But it also conveyed a contemporary notion of balanced opposites.

Politically he was prescient in his comments on Russia, Germany and China. He wrote, “The vast force of inertia known as China was to be united with the huge bulk of Russia in a single mass which no amount of new force could henceforward deflect.” In turn, the difficult step was to “bring Germany into the combine.”

Link to the rest at The Wall Street Journal

The Opening of the Protestant Mind

From The Wall Street Journal:

“Evangelicals,” or born-again Protestants—Christians who believe in converting non-Christians to their faith—haven’t had a lot of great press of late. The mainstream media all but blamed them for the 2016 election of Donald Trump. Going further back are evangelical ties to the Moral Majority and the religious right. Evangelicals in both politics and religion have a reputation for intolerance. They may have earned it: In 2017, the Pew Research Center found that evangelicals, more than any other Christian group, viewed Hindus, Buddhists, Mormons, atheists and Muslims unfavorably.

Mark Valeri’s “The Opening of the Protestant Mind” isn’t about 21st-century America, but his exploration of born-again Protestantism’s historical roots upends assumptions about religious conversion. Instead of making Christians intolerant, coming to faith by conversion historically went hand in hand with reasonableness, civility and religious toleration. Most readers would likely assume the opposite: If you believe other people need to convert to your faith from theirs, chances are you won’t give them much of a hearing. In Mr. Valeri’s interpretation, though, a figure like Jonathan Edwards, famous for preaching “Sinners in the Hands of an Angry God,” may have been among the most liberal and progressive thinkers in the Anglophone world.

Mr. Valeri’s narrative of Anglo-American Protestantism between 1650 and 1750 is not as oxymoronic as it initially sounds. The Protestant outlook that prevailed in English society at the time of Charles I’s execution in 1649 assumed that the “health of the state depended on a . . . religious confession” to supply social coherence. For the rest of the 17th century, when English writers (including American colonists) encountered non-Christians, they saw “illegitimate, dangerous, and demonic” religions.

But after the Glorious Revolution in 1688, in which the threat of a Catholic monarch was decisively ended, English Protestants began to distinguish “loyalty to the kingdom” from “conformity to any one creed.” English writers—some zealous Protestants, others philosophically inclined—“minimized theological orthodoxy” as a requirement for social standing. Not only did these authors discover “republican ideas of toleration and moral virtue” in other religions; they also revised Christianity. Conversion became the path to faith not by submission to dogma but by persuasion. This shift aided the ascendant Whigs in governing a diverse religious constituency. It also prompted Protestants to regard conversion (and awakenings) as the mark of true faith.

Mr. Valeri, a professor of religion and politics at Washington University in St. Louis, refers to a variety of authors, well known and obscure. In the case of John Locke’s “Letter Concerning Toleration” (1689), the key to civil liberty was separating the purpose of the state from that of the church. If religious controversy threatened public order, the state should intervene. Otherwise government should leave religious groups to themselves. Locke’s outlook extended to Native Americans: “If they believe that they please God and are saved by the rites of their forefathers, they should be left to themselves.” Jonathan Edwards, who for a time served as a missionary to Native Americans, echoed Locke. Although an advocate for the First Great Awakening, Edwards regarded the Mohawks and English people as spiritual equals because they shared the same sinful human nature. For that reason, Edwards thought acculturating Native Americans to Anglo-American conventions unimportant compared with converting them through persuasive preaching.

The 1733 English publication of “The Ceremonies and Religious Customs of the Various Nations of the Known World” is Mr. Valeri’s best example. Compiled by two French Protestants, Bernard Picart and Jean Frédéric Bernard, this popular book reinforced the ideal of conversion. Especially appealing to Anglo-American Protestants was the French catalogers’ contention that ceremonial religion represented an illegitimate “alliance between priests and secular rulers who persecuted religious dissenters.” Ritualized Christianity went hand in hand with imperial ambition and produced “uncivil, unreasonable, and coercive” religion.

Link to the rest at The Wall Street Journal

PG notes that there were periods in history (including the times discussed in the book described in the OP) when religion had the power to impel men and women to many different actions.

The Crusades were a series of religious wars between Christians and Muslims that started primarily to secure control of holy sites considered sacred by both groups. In all, eight major Crusade expeditions—varying in size, strength and degree of success—occurred between 1096 and 1291.

The Muslim conquests were a military expansion on an unprecedented scale, beginning in the lifetime of Muhammad and spanning the centuries down to the Ottoman wars in Europe.

PG compares the general power of religion in earlier times to move humans to expend substantial amounts of money and blood with the far more subtle influences of religion in the 21st Century. PG is certainly pleased that there have been no major religious wars lately, but he also wonders if religion, taken as a whole, occupies a smaller place in the minds of humanity, generally speaking, in today’s world.

PG asks that comments to this post not insult any particular religion or its adherents.

How Obscenity Laws Nearly Stopped Nabokov’s Lolita from Being Published

From Literary Hub:

Lolita was originally published as a limited edition in France in September 1955. The book was released by Maurice Girodias of Olympia Press, a company known for specializing in pornography, but which had also built a reputation for publishing challenging literary titles. The first print run had been 5,000 copies and the book received little attention.

Over the next few months, a handful of copies of Lolita were smuggled into England. A single copy made its way into the hands of Graham Greene, who reviewed it favorably in the Sunday Times. This was followed soon after by a scathing article in the Sunday Express, which denounced the book as “sheer unrestrained pornography.” Shortly after, the British Home Office ordered that all copies entering the country should be seized.

When he first read Lolita, George Weidenfeld knew immediately he wanted to publish it. His partner, Nigel Nicolson, was not so sure. Nor were Nigel’s parents. His father Harold Nicolson “hated” the book and said it would be “universally condemned,” while his mother Vita Sackville-West “saw no literary merit in it at all.” As for Nigel himself, he told George that he was not convinced they should proceed with publication.

George was unrelenting. He took legal advice, however, and learned that publication in England would be extremely hazardous. Under the law as it then stood, they would likely lose any court case, which would result in huge losses. Any copies that had been printed would have to be pulped, not to mention the enormous editorial, marketing, publicity and legal expenses incurred up to that point. Such an outcome would be calamitous for Weidenfeld & Nicolson, placing its future in serious jeopardy.

As luck would have it, the lawyers said, the Labour politician Roy Jenkins was right then guiding a new obscenity bill through Parliament. Under this new law, if the government blocked publication and the case went to court, then the publisher would be able to argue the literary merits of the book by calling authors, academics and reviewers to testify. If this bill was enacted, then just maybe, George might have a chance. The effort would still pose an enormous risk but, for the publisher, it might be worth it.

In the decades leading up to the proposed publication of Lolita, governments had frequently cited “obscenity” as the reason for preventing controversial books being published. In the United States, the Federal Anti-Obscenity Act of 1873 had been used to ban Chaucer’s The Canterbury Tales, John Cleland’s Fanny Hill, Boccaccio’s The Decameron and Voltaire’s Candide. In Britain, the legal test for obscenity derived from an 1868 case known as Regina v Hicklin, in which a judge ruled that obscene material tended “to deprave and corrupt those whose minds are open to such immoral influence.”

In 1928 the British government had relied on the Hicklin case to ban Marguerite Radclyffe Hall’s lesbian novel The Well of Loneliness. Opposition to the book was whipped up by the media, particularly the Sunday Express whose editor wrote, “I would rather give a healthy boy or a healthy girl a phial of prussic acid than this novel.” That same year, D. H. Lawrence’s Lady Chatterley’s Lover was also deemed to violate the obscenity laws and was commercially published in only an expurgated version. Six years later, the publisher Boriswood was prosecuted for obscene libel and severely fined for releasing Boy, a sexually explicit novel by James Hanley.

Over the summer of 1958, with the lawyers saying that the new bill had a good chance of passing through Parliament, and with Nigel Nicolson’s tentative, nervous agreement, George reached out to the author, Vladimir Nabokov, in New York and asked for permission to publish Lolita in the United Kingdom and across the Commonwealth. By the end of November, they had reached an agreement on the general terms and Nabokov wrote to George saying “an English edition of Lolita is nearing signature.”

In his reply to the author in New York, George said, “May I take this opportunity of telling you how inspired and moved my colleagues and I feel by the book and how determined we are to see that it launches with dignity and success in this country.” Publication of Lolita in Great Britain seemed a little closer, but then, by the year’s end, George’s plans started to unravel.

When word began circulating in the press that Weidenfeld & Nicolson intended to release Lolita, Nigel’s political colleagues pressed him to change course. At one point the Conservative chief whip (and later prime minister) Ted Heath begged him to cancel publication. Nigel asked him if he had read the book. Heath said he had. “Did you think it obscene?” Nigel asked. “As a matter of fact I thought it very boring,” Heath replied. “If it is boring it cannot be obscene,” Nigel said, which he later admitted was not a very good argument.

A few days later, the Attorney-General, Reginald Manningham-Buller (called “Bullying Manner” behind his back), stopped Nigel in a dark corridor in the bowels of the House of Commons. “If you publish Lolita you will be in the dock,” he said, jabbing a finger at him. “Even after the Obscenity Bill has been passed?” asked Nigel. “That won’t make any difference,” responded the country’s top lawyer. “The book is thoroughly obscene. I’ve given you a clear warning.”

On 16 December 1958, a week before Christmas Eve, Roy Jenkins’ new Obscenity Bill was debated in the House of Commons. Midway through the proceedings, Nigel stood up to speak. First, he acknowledged that he had an interest in the matter as a director of the firm Weidenfeld & Nicolson, which was planning to publish Lolita. Then he moved on to the substance of his speech. “The question could be asked,” he declared, “Is an obscene work of art a contradiction in terms? I would answer the question by saying, no, it is not. It is quite possible for a work of art to be obscene.” He then went on to say that the book had already been published in America, where over 250,000 copies had been sold.

Lolita had also been published in France and Italy. “The question arose whether it should be published in England. That was the question which my colleagues and I had to answer,” he continued.

Lolita deals with a perversion. It describes the love of a middle-aged man for a girl of twelve. If this perversion had been depicted in such a way as to suggest to any reader of middle age or, for that matter, any little girl—could she understand it—that the practices were pleasant and could lead to happiness, I should have had no hesitation in advising my colleagues that we ought not to publish this book. But, in fact, Lolita has a built-in condemnation of what it describes. It leads to utter misery, suicide, prison, murder and great unhappiness, both to the man and to the little girl whom he seduces.

At this point, Emrys Hughes, a Welsh Labour MP, rebel and general troublemaker, tried to interrupt, but Nigel brushed him aside and moved on to his conclusion. “I asked myself whether the loss to literature in this country through the non-publication of Lolita was greater than the risk which one ran of offending certain people by its publication.” Pausing to take a breath, he then said, “In the end, I came to the conclusion that it was probably right to publish this book.” Nigel had for the first time publicly declared his support for the publication of Lolita.

Link to the rest at Literary Hub

Slanting the History of Handwriting

From Public Books:

Years ago, I wrote my signature on a piece of white paper, scanned it, and inserted it as a picture at the bottom of my digital letterhead. It’s a perfect example of what Richard Grusin has called the “premediated” sign. Others in academia sign their letters by typing out their names in cursive fonts. Whether Zapf, Apple Chancery, or Lucida Calligraphy, the important thing is that the font gestures to cursive, which has become the avatar of handwritten-ness in digital media today. We make these insertions not because we need to signal our authenticating presence but because formal letters are a genre of writing, with certain expectations regarding not only content but also appearance. A formal letter should conclude with the writer’s name inscribed to look a particular way, whether it’s a picture of a signature or a digital simulacrum of one.

All of which is to say, whatever writing is today, it is not self-evident. In the introduction to the new edited volume Handwriting in Early America: A Media History, Mark Alan Mattes suggests that we can come to grips with what writing is by triangulating between inscription, the people inscribing, and the systems of communication in which their inscriptions circulated. The 16 essays in the collection bear out the expansive potentials of this framework, not only by truly taking on the contingency of writing itself but also by revealing how the same kinds of writing can do radically different cultural work.

For example, almost every essay in this rich volume finds a counterpart or mirror image of itself, underscoring just how relative and relational the meaning of every kind of inscription is. A poem on penmanship quoted and copied by a teacher into an African American girl’s friendship album endorses the value of “polite culture” as a means of advancing in the antebellum Black elite. A different friendship album, owned by Omaha activist Susette La Flesche, also features an array of handwritten quotations, but they document a tense ethics of obligation between writers and recipient—both are impelled to act in accord with an assimilationist vision of Indigenous self-determination.

While this volume underscores the benefits of historicist thinking about writing, Joyce Kinkead’s A Writing Studies Primer attempts to short-circuit that project by taking the opposite approach: condensing 5,000 years of writing technology around the world into a single, unbroken thread. After visiting museums, libraries, and paper-making firms in the US, Europe, India, Japan, Nepal, and South Korea, Kinkead, a professor of English with a focus on writing studies, synthesized her knowledge and experiences into a book that covers a vast range of topics, from the origins of writing, writing systems, implements, and supports to the history of the book and the printing press, punctuation and calligraphy, ancient epistles, and social media. Each of its 16 chapters concludes with prompts for leading class discussion, hands-on exercises, and a short reading from a source such as the New York Times.

While many of the essays in Handwriting in Early America hinge on Foucault’s idea that writing is a technology of the self—the process by which the individual is formed through various mechanisms of social replication—A Writing Studies Primer is a contemporary example of what this theory describes. And not always for the good. The book leans heavily on ethnographic methods that are almost indistinguishable from the Western gaze. The college student reader—presumably American—is advised in the first chapter to avoid getting “lost in a history that crosses so much time and space” by writing their own biography as a writer. The student’s story then gives way to Kinkead’s, and the Grand Tour of writing on offer measures all material forms and genres against the yardstick of Euro-American writing norms today—norms that, for example, assume handwriting stopped having a history after the advent of print. But writing by hand did not simply continue to “advance” until it inevitably began to erode; its meanings and the cultural work it performed varied. They still do.

. . . .

Nineteenth-century writing exercises were meant to unite the individual body with pen, ink, paper, and prescribed word, thereby fostering the growth of national subjects. A young boy from Massachusetts, for example, practiced his personal hand by rehearsing over and over again the words “Independence now and independence forever,” the announcement Daniel Webster imagined John Adams to have made upon signing the Declaration of Independence. I am reminded of the stock phrase I see from time to time sprinkled in the margins of medieval manuscripts by readers trying their sharpened pens or simply enjoying the scratch of an inky nib on parchment: “ego sum bonus puer quem deus amat.” I am a good boy whom God loves. Surely some of the boys or men who wrote that were at times naughty, but what is a jingle if not aspirational? As Danielle Skeehan remarks on 16th-to-19th-century English copybooks, “authors often draw connections between alphabetic literacy, the literate subject, discipline, and imperial ambition.” The legacy of alphabetic literacy’s facilitation of empire is a long one, still being written, albeit now in corporate blog posts and emailed memos to vendors on the other end of a supply chain thousands of miles away.

A Writing Studies Primer attempts to supplement and enhance the necessarily instrumental nature of a handbook for composition courses by cultivating students’ awareness of writing as a culturally determined act. This is great. But, teeming with factual errors and underpinned by a triumphalist Eurocentrism, it only embraces the surface relativism of liberal values, which ultimately needs history to be quaint so that the surface relativisms of modernity can emerge as modernity’s greatest distinction. From the volume we learn that books lacked page numbers, chapter headings, and indexes until the 16th century. False. “Islam prohibits images of people in art.” Demonstrably not true. Parchment is of lower quality than vellum. Incomprehensible. The printing press in Europe made scribes “irrelevant.” Incorrect. The entire output of medieval European book production equaled 30,000 volumes. Perplexing. Gutenberg had to hide his work on the printing press for fear of being accused of “dark forces or magic.” I am at a loss.

Link to the rest at Public Books

PG notes that the publisher of Handwriting in Early America, University of Massachusetts Press, failed to make Look Inside available.

Ukraine Renews Its Independence

From The Wall Street Journal:

The average age of the Verkhovna Rada, Ukraine’s 450-seat parliament, is 41. Only three of the elected representatives are older than 60, while 17 were under 30 at the time of their election. This means that when Ukraine declared its independence, many of us were essentially children, and some weren’t yet born. What do we remember from Aug. 24, 1991?

I was 6. My memories of that day are of something profoundly significant. People didn’t go to work; they gathered in the city center, on what is now Hrushevsky Street, greeting each other in an atmosphere of incredible joy and uplift.

Now, in the 10th year of Russia’s war against Ukraine and 18 months into its full-scale phase, my thoughts drift back to the Verkhovna Rada elected in 1990, before independence. Its composition was varied: there weren’t many professional politicians, only Ukrainian patriots and Communists.

Everyone had an agenda. Some aspired to greater autonomy within the Soviet Union. Some defended the Ukrainian language. Some were building their careers with an eye toward Moscow. All etched their names in Ukraine’s history when they accomplished what our ancestors had dreamt of for centuries and what society demanded at that moment—independence.

On Dec. 1, 1991, the Ukrainian people overwhelmingly affirmed their desire for independence in a referendum with 84% turnout. In the Crimean peninsula, more than 54% voted in favor of independence. In the Donetsk, Luhansk, Kharkiv, and Odesa regions, support was over 80%. Today’s Russian propaganda conveniently forgets these numbers, insisting in its narrative that Ukraine and Ukrainians don’t exist.

Historians often joke that people living through major historical events don’t realize how significant those times are. There’s some truth to that. When the current Verkhovna Rada was elected in 2019, the primary demand of the Ukrainian people was a renewal of political authority. No one could have imagined the challenges we would face in less than three years: working during a full-scale war, making pivotal decisions, defending the nation’s sovereignty, and upholding the rights of Ukrainians to exist.

Like all Ukrainians, I will never forget Feb. 24, 2022, the day Russian troops invaded. By 7 a.m., a full-scale war had been raging for two hours. Russian forces were advancing in the Sumy, Kharkiv, Chernihiv, Zhytomyr, Luhansk and Donetsk regions, and from the direction of Crimea. From Belarus, they were moving toward the Kyiv region and the capital city itself. Cities like Odesa, Kherson, Kharkiv, Zhytomyr, Mykolaiv, Zaporizhzhia, Dnipro and Kyiv, along with their surrounding areas, were under missile attack.

In Kyiv, lines formed at petrol stations, railways and ATMs—but even longer queues formed outside military recruitment offices. Tens of thousands of men and women were eager to take up arms to defend their homes, their loved ones, and their country against the invader. Ukrainians enlisted en masse in territorial defense units. Those ready to fight were given weapons. In Kyiv alone, authorities distributed 20,000 rifles on Feb. 24.

. . . .

Ukraine surprised the world, the enemy and even itself. We have managed to unite, support each other, and rally around what’s crucial: our nation, our freedom, and the future of our children.

History is made by ordinary people. They become heroes, and the future depends on them. This isn’t the first time Ukraine has had to fight for its right to exist. We must win. Each and every one of us knows what we are fighting for.

Link to the rest at The Wall Street Journal

Not exactly about books, but Ukraine is a terrific story. PG fervently hopes for a happy ending.

‘Empire of the Sum’ Review: It All Adds Up

From The Wall Street Journal:

In 1976, Steve Wozniak sold his HP-65 programmable calculator for $500 to start a computer company with Steve Jobs. It wasn’t a huge sacrifice. As a calculator engineer at Hewlett-Packard, he knew that the HP-67 was on its way and, with his employee discount, he could buy one for $370. His more highly prized gadget was the HP-35—the world’s premier scientific calculator and his inspiration for going to work at HP in the first place.

The HP-35 was a technological wonder. Until its appearance in 1972, pocket calculators performed only addition, subtraction, multiplication and division. The $395 HP-35 packed advanced functions like logarithms, sines and cosines into a relatively affordable, user-friendly package. Suddenly a computer’s worth of power could fit into anyone’s pocket. In “Empire of the Sum: The Rise and Reign of the Pocket Calculator,” Keith Houston surveys the engineering advances that led to that moment and the human drive to solve equations faster and smarter.

Mr. Houston, whose previous books explored punctuation and symbols (“Shady Characters,” 2013) and the history of books (“The Book,” 2016), begins by looking back to when humans started counting using the tools at our immediate disposal: our fingers. The earliest archaeological evidence of counting is a baboon fibula incised with notches indicating the number of whatever its owner wanted to track. That 42,000-year-old tally stick, discovered in 1973 in a cave near South Africa’s border with Swaziland, “marks the point at which we began to delegate our memories to external devices,” Mr. Houston writes.

As civilizations progressed, they moved on from anatomical calculators, assigning numerical values to objects. The Sumerians, for instance, developed tokens whose varied shapes and sizes corresponded to smaller or larger quantities. But the biggest leap forward was the abacus, the first purpose-built calculator, invented in Mesopotamia or Greece between three and five millennia ago. The abacus made solving complicated equations possible, but getting results still required mental gymnastics.

Some shortcuts finally arrived in the 17th century courtesy of John Napier, a Scottish landowner, astrologer and mathematician. His invention: logarithms, a quick means of multiplying or dividing numbers through addition and subtraction. Not long after he published his revelation in 1614, logarithms became the basis for at least two important physical tools. Edmund Gunter placed them on a wooden rule to help seafarers with navigational calculations. Then William Oughtred produced his easier-to-use linear sliding rules, which, with a few later modifications, became the trusty and enduring slide rule.
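To see the principle at work (an illustration of ours, not drawn from the review): because log(a × b) = log a + log b, a multiplication can be traded for an addition. Using base-2 logarithms, 8 × 16 becomes log₂ 8 + log₂ 16 = 3 + 4 = 7, and 2⁷ = 128 is the product. A slide rule performs the same trick physically, adding two logarithmic lengths end to end.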

As some mathematicians sought to simplify equations, others tried to automate them. Blaise Pascal was the first, in 1645, to build and sell a mechanical adding machine. Using gears similar to those of a clock, his Pascaline was an elegant metal box featuring windows for displaying a running total. Unfortunately, it didn’t do much more than add whole numbers, and its cost made it inaccessible to most. Over the next 200 years, more machines with greater functionality were introduced, the most important of which was Charles Xavier Thomas’s mid-19th-century arithmometer. None, however, would be as convenient or portable as a slide rule—until the Curta, a pepper mill-like mechanical gadget designed by the Austrian engineer Curt Herzstark during World War II.

In a book that’s long on technical details and short on compelling anecdotes, Mr. Houston’s profile of Herzstark is a notable highlight. As a salesman for his family’s factory manufacturing unwieldy calculators, Herzstark heard his customers’ calls for a truly portable machine. Not long after Herzstark hatched the idea for one, however, German troops annexed Austria. As the son of a Jewish father and Christian mother, Herzstark was sent to Buchenwald. There he supervised a factory of inmates fabricating rocket parts and repairing looted calculating machines. As Herzstark later recounted, his manager urged him to pursue his Curta side project, promising: “If it is really worth something, then we will give it to the Führer as a present after we win the war. Then, surely, you will be made an Aryan.” When Buchenwald was liberated in April 1945, Herzstark took his blueprints with him and eventually produced the Curta. It was a palm-size engineering marvel but a commercial failure.

. . . .

The year the P101 launched, Texas Instruments decided to enter the calculator market with its big innovation, the microchip. The company had industrial and military customers for its chips, but to expand demand it needed to sell them to consumers. The idea: an electronic calculator that could fit in one’s pocket. The resulting prototype didn’t quite live up to expectations—it was the size of a paperback and weighed 3 pounds—but it laid the groundwork for the smaller, lighter gadgets to come, notably Busicom’s first truly pocketable calculator in 1971 and Hewlett-Packard’s HP-35 in 1972. The era of calculator mania had begun, and throughout the late ’70s and ’80s calculators were everywhere, as standalones and as combinations with other electronics, including clock radios, digital watches and even synthesizers.

The pocket calculator’s heyday would be brief compared with that of the slide rule it replaced. Even scientific calculators grew cheaper and profit margins waned. HP, despite Mr. Wozniak’s pleas to build a personal computer, refused to take the risk, only to see calculators absorbed into PCs, palmtops and, finally, smartphones. The pocket calculator sublimed, becoming “everywhere and nowhere at once,” Mr. Houston writes. “The calculator is dead; long live the calculator.”

Link to the rest at The Wall Street Journal

Generation Gaps in the Workplace

From BookBrowse:

Walk into any office and you’ll likely find a mix of people at different points of their lives: Baby boomers, Generation Xers, millennials. And the presence of Generation Z continues to grow.

Iona, the main character in Clare Pooley’s Iona Iverson’s Rules for Commuting, often experiences people judging her competencies based on her age. She’s on the older side, some feel she’s past her prime, and she tries desperately to prove them wrong. But what do generational identities say about our capabilities as workers? To tackle this question, we’ll first have a look at the impact our generational differences have on us in the workplace, and then delve into the truth of the issue.

How do generational differences affect us in the workplace?

It isn’t hard to notice the differences between one age group and another: music, communication methods and even values. These differences can manifest themselves in negative ways in the workplace and cause us to argue, turning productive, efficient work into a situation of lowered engagement. Soon enough, this can become frustrating, and we may have a tendency to blame our generational differences for it, especially if we already hold biases towards one another based on our ages.

The truth

Though some employees may think that they are simply unable to work with a person who isn’t in the same stage of life as them, or that some generations are less reliable, the differences in our work patterns and capabilities aren’t as pronounced as all that. Research has indicated that the correlation between our generational upbringing and the way we act in and experience the workforce is close to zero, meaning there is little difference in attitudes towards work between generations.

. . . .

Megan Gerhardt, a management professor at Miami University, has researched the impacts of generational differences in the work field. “Many of the generational conversations in the news today rely on false stereotypes and clickbait headlines, rather than taking the time to understand the important differences that are a part of our generational identities,” she claims.

Link to the rest at BookBrowse

Never Enough

From The Wall Street Journal:

A hardworking teenager—let’s call her Amanda—excels at school. She’s a pianist, a varsity athlete, an honor student and the president of the debate club. She gets early acceptance to an elite university, lands the right summer internships, and, after graduation, secures the job of her dreams. Amanda has run the race; she has hit the mark; she has lived up to her potential and fulfilled the ambitions of her parents. Unfortunately, she’s also a mess. For years, despite the accolades, Amanda has felt “utterly vacant inside,” as Jennifer Breheny Wallace puts it in “Never Enough,” a timely exploration of adolescent achievement culture.

At once a description of an insidious problem and a call to arms, “Never Enough” is full of people like Amanda: young men and women whose personal accomplishments and affluent, ultra-supportive environments might be expected to guarantee them blissful satisfaction but have instead produced anxiety, loneliness and a feeling that life lacks meaning.

These effects are the unintended consequences of social norms that have come to prevail in many upper-middle- and upper-class families. The desire that young people succeed has morphed into something more like a demand. Writes Ms. Wallace: “Our children are living under a tyranny of metrics.” Numerous rueful parents testify in these pages, as do social scientists, educators and the author herself, a married journalist who writes with disarming candor of her own moments of overreach in the raising of her three children.

Why have things turned “toxic” for high-flying kids? Ms. Wallace sees widespread status anxiety fueled by a post-1980s stalling of social mobility and a rise in economic precariousness. Competition has intensified for spots in exclusive colleges and universities. So even mothers and fathers who try not to push, who try to be mellow and undemanding—dwelling less on grades and more on effort, for instance—can feel the pressure and pass it on to their children.

“As parents, we listen and chauffeur and chaperone and coach and cheer and help with homework and attend games and even practices,” Ms. Wallace writes of the “intensive parenting” that has become ubiquitous in the moneyed strata of American life. This style of raising children may be well-meaning, but it can suffocate and dispirit its young recipients. With every hour scheduled and every interest maximized for the purposes of a future résumé, children’s lives, observes the author, “become high-budget productions meant to attract the attention of admissions officers, scholarship committees, and football recruiters, not unique and imperfect stories just beginning to unfold.”

There’s nothing wrong with the pursuit of personal excellence, of course. Without the tug of ambition or the spur of competition, few among us would bother to distinguish ourselves, and the cause of human flourishing would go unserved. In the context of childhood, the problem arises when excellence is pursued as monomania. Focused on their AP textbooks and musical instruments and free of domestic chores (adolescence is “their time to be selfish,” says one unreconstructed mother), young people become sleek, siloed missiles aimed at the Ivy League. Once the rocket has delivered its payload—well, then what? As students like Amanda have found, the rewards of a self-oriented childhood spent sprinting toward high achievement may not be rewarding at all.

Link to the rest at The Wall Street Journal

PG notes that it doesn’t take many years after graduation for an individual to meet one or more complete idiots who graduated from an Ivy League or similar “selective” college/university.

The likelihood of meeting such an individual increases significantly if one meets children from wealthy families who attended prestigious institutions. It’s not the done thing to ask them how large a donation mom and dad made to the selective institution to ensure little Suzy’s or little Johnny’s admission.

It is also the case that such an individual will meet brilliant graduates of one or more blue-collar colleges/universities.

Long ago, one of PG’s offspring had been admitted to more than one prestigious university. This offspring decided, almost literally on the last day, to apply to another university that was entirely respectable but not regarded as prestigious. The offspring was admitted to this institution very quickly, in a matter of days.

It was the perfect choice for this individual and they flourished during their years of college and afterwards.

Let Them Eat Pedagogy

From Public Books:

If you, like me, have participated in the #AcademicTwitter rite of passage that is bemoaning the lack of pedagogical training in graduate education, The New College Classroom is just the book for you. Authors Cathy Davidson and Christina Katopodis aim to fill this yawning gap with “a practical book dedicated to a lofty mission, a step-by-step ‘how to’ for transformation.” Even the most well-intentioned professors, they explain, who believe in “discussion” and equity can get things wrong.

This should be my favorite book of 2022! Not just because it’s about pedagogy and equity, but because it’s coauthored by two practitioners, both (white) women, one a senior scholar held in high regard across US higher ed studies, the other an early career scholar who adjuncted in graduate school, recently completed her doctorate, and took a tenure-track position. The authors are more than qualified through their research and teaching experience to write this text. Moreover, the book is incredibly readable and easy to digest. Every chapter is broken into manageable chunks, with neat subheadings like “What to Do When Nobody Does the Homework” and “How Do I Address Racial and Other Forms of Discrimination?” It also feels timely, having been written remotely during COVID-19, when the always-ongoing crises of teaching exploded in ways that made all academic laborers’ challenges more obviously similar than different. And in a practice that seems to put communitarian principles into action, Davidson and Katopodis “wrote every word together,” meeting online twice a week to discuss ideas, practice the strategies they would recommend, and get “students’ feedback on what was most effective for their learning.”

This is the book that pedagogy Twitter has been crying out for. So why does it give me the ick?

Because The New College Classroom is not concerned with the material conditions that produce the crisis academics have to navigate today. Despite its romantic visions of “transformation” it is, ultimately, a guide for coping with the status quo. It offers no help for demanding something better, nor for creating alternatives ourselves.

Changing myself and my classroom might help me renew my one-year contract with the university, but it cannot prepare me to demand an alternative to the contract as the basis of my employment. Instead of mystifying “pedagogy” as some pure way of thinking and being in the world, instead of lamenting that we weren’t trained in this special field of study, perhaps we should recognize that to speak of pedagogy is to speak of labor. And that academic labor is not exceptional.

Certainly, the authors made a heroic effort to ground their pedagogical practices in research. Their introduction cites “an exhaustive study of twelve thousand classrooms” that showed that even instructors who believe they are “conducting a seminar or discussion class” fill 89 percent of class time with lecturing. The book illustrates what truly equitable participation can look like and advocates for active and participatory work that puts students in charge of their learning.

The first part of the book, “Changing Ourselves,” analyzes the classroom practices we have inherited and makes the argument for turning away from these modes of study to center “active learning,” which makes the student both “the agent and the source of the learning” rather than a passive recipient of facts from an authority figure. The second part of the book, “Changing Our Classrooms,” offers “practical, easy-to-follow methods for every part of teaching a class.” The authors explain the principles of active learning and distill them into “grab-and-go activities” that an instructor can pick up and implement immediately in any class (think: think-pair-share or entry and exit tickets). Together, these sections are “an invitation to change—ourselves, our classrooms, our society.”

There isn’t a third part on “Changing Our Society,” but the conclusion is titled “Changing the World” and ends with the provocation that “we are the people we’ve been waiting for,” a slogan otherwise popularized by climate justice activists. It might sound ostentatious to speak of higher education in the same way that we speak of the climate crisis, but within higher education circles, it is commonly held that academic labor is facing an existential quagmire (the apocalypse du jour is generative artificial intelligence). A whole field of critical university studies has grown around this crisis. And of course, so have books helping faculty navigate their lot in this crisis (including texts offering “hacks” for the academic job market; ones identifying the challenges faced by minoritized faculty members; ones Putting the Humanities PhD to Work; even practical guides for leaving academia).

Link to the rest at Public Books

There’s nothing like flattering your intended audience with “we are the people we’ve been waiting for.” Self-satisfaction should be one of the seven deadly sins.

Author Dmitry Glukhovsky Sentenced to Prison by Moscow

From Publishing Perspectives:

Fortunately, the author and journalist Dmitry Glukhovsky was not in Russia on Monday (August 7) when a Moscow court found him guilty on a charge of spreading false information about Russia’s armed forces. He has been sentenced to eight years in prison.

Today (August 9), in response to our inquiry, Glukhovsky’s German public relations agent, Dorle Kopetzky at the Weissundblau agency, says that the writer left Moscow shortly before Vladimir Putin began his assault on Ukraine in February 2022, “and did not return after he called the war what it is.”

Glukhovsky, who joined us onstage at Frankfurter Buchmesse in 2018 for a Publishing Perspectives Talk interview, has rarely been complimentary to the Putin administration, and many of his works were openly defiant.

“He has been critical towards the regime all these years now,” Kopetzky says, “and has fortified his efforts in exile.”

The Associated Press account of Glukhovsky’s sentencing points out that he is “the latest artist to be handed a prison term in a relentless crackdown on dissent in Russia,” referencing the May 5 pre-trial detention for theater director Zhenya Berkovich and playwright Svetlana Petriychuk.

Most prominently, of course, on Friday (August 4), the Russian opposition leader Alexei Navalny, already imprisoned, was convicted on charges of extremism and sentenced to 19 years in prison. That event prompted the editorial board of the Wall Street Journal to write, “The world hardly needs another reminder of the true nature of Vladimir Putin’s Russian state.”

A Reuters write-up in February spoke to Glukhovsky from an undisclosed location, and confirmed that prosecutors in Russia were “proceeding with a case against exiled science fiction writer Dmitry Glukhovsky, accused of publishing ‘false information’ about Russian atrocities in the Ukraine war.” As early as June of 2022, Reuters had reported that Glukhovsky was on a Russian interior ministry wanted list after the author, on encrypted communication services, had called out the Kremlin’s “special military operation” as a euphemism for Putin’s land grab.

Glukhovsky, in a 2018 pre-Frankfurt interview with Publishing Perspectives, described the “wonderful times” of the current post-Soviet era for writers willing to see “an epoch of not only post-truth but also post-ethic.”

“These are really the times,” he said, “when all a writer needs to do is sit down and focus carefully on the dubious reality unfolding around him. What’s the point of writing a dystopian fiction nowadays,” he asked, “when the reality is exceeding your wildest fantasies?”

. . . .

Having worked in film, video game, and television development, Glukhovsky has particularly broad potency as a storyteller, and since the release of Metro 2033, the first book of his debut trilogy, he has cultivated a loyal international following, propelling his writings into broad international translation and publishing deals.

Kopetzky describes his latest two-volume “Outpost” series as being set “in a Russia isolated from the West and ruled by a new czar from Moscow.” In the books, “a disease in Russia turns people into man-eating zombies after they hear a special combination of words, a ‘somewhat pandemic neuro-lingual infection.’”

Link to the rest at Publishing Perspectives

PG says Russia has experienced a huge brain drain as a result of the invasion of Ukraine. A great portion of the nation’s young, highly educated, and talented people left the country to live in Eastern Europe and points beyond during the weeks following the outbreak of the war. PG thinks most will never return to Russia. They will certainly not return if Putin or someone who emulates Putin is the nation’s ruler.

Not far into the war, Russia instituted a program to take convicted criminals out of the nation’s prisons with a promise of a full pardon if the convicts agreed to fight on the front lines for a period of time, often one to two years. These convicts have been used for roles such as leading charges toward dug-in Ukrainian troops armed with machine guns and artillery.

Such charges define the term “cannon fodder,” and the convicts have been killed and severely wounded in large numbers. Needless to say, regular Russian soldiers have priority for the treatment of their wounds, and the convicts are left to treat themselves or each other as best they can.

Russia had a shrinking population before the invasion and the death and crippling of so many young Russian men will certainly accelerate the population decline. Russian ex-pats are unlikely to bring their families back to Russia in the aftermath of the war, regardless of how it ends.

An old saying goes, “The future belongs to those who show up.” Fewer and fewer Russians are going to show up for Russia’s future.

The Death of Public School

From The Wall Street Journal:

What is a public school? Is it an institution that is paid for by the public? One staffed by government employees? One that teaches a publicly approved curriculum? One that educates a broad swath of the public’s children? In the view of Cara Fitzpatrick, the author of “The Death of Public School,” it possesses all of these qualities, and properly so. That more than a few parents don’t agree—or have become disenchanted with the idea of public schools altogether—is a source of concern for her.

For Ms. Fitzpatrick, a veteran reporter and an editor at the education site Chalkbeat, the public school’s death—or at least its decline—is attributable mostly to conservatives, who have, as her subtitle has it, “won the war over education in America.” They have advanced their attack, as she sees it, by supporting school choice, school vouchers and charter schools.

Ms. Fitzpatrick’s not-very-sympathetic history of these alternative policies and institutions starts with the segregationists of the 1950s. She shows how the people who opposed racial integration in the civil-rights era lined up behind the idea that parents should get to choose their children’s school—with the goal, in that era, of avoiding black children. She notes that within four years of Brown v. Board of Education—the 1954 Supreme Court ruling that desegregated schools—some Southern states had “taken steps to abandon public schools if necessary” and “created grants that students could use to pay tuition at private schools.” Ms. Fitzpatrick says that such moves were “the first steps toward creating a school voucher system” similar to the one that Milton Friedman notably proposed in 1955.

Though Ms. Fitzpatrick doesn’t accuse Friedman of sharing the motives of the Southerners—she quotes him saying that he despised segregation and racial prejudice—she says that his views on vouchers seem “either naive or willfully ignorant of the racial oppression in the South.” Friedman had argued that a private system—with vouchers helping families pay for tuition—would offer a range of voluntary associations: exclusively white, exclusively black and mixed schools. Rather than force parents to send their children to one kind or another, he said, people opposed to racial prejudice should “try to persuade others of their views.” If they were successful, “the mixed schools will grow at the expense of the non-mixed, and a gradual transition will take place.”

Whether such an idea was naive is hard to say; it certainly wasn’t ignorant or baleful, as Ms. Fitzpatrick seems to imply. At the time, there were high-quality institutions devoted exclusively to black students, notably the Rosenwald schools set up in the South by the philanthropist Julius Rosenwald in the first decades of the 20th century. If black parents had been given the vouchers needed to support them, such schools might well have continued to flourish.

Of course, people have supported school choice for more than one reason. Some have favored limited government, feeling that, when it comes to the education of children, the state should yield to the preferences of parents. Some have favored more resources for religious schools. Virgil Blum, a Jesuit professor at Marquette University, wrote an article in the late 1950s calling for “educational benefits without enforced conformity,” by which he meant that children shouldn’t be denied access to either public or private schools for lack of funds. In part, he wanted to make the system of Catholic schools an even more robust alternative to public schools, which were sometimes run by Protestants intolerant of religious dissent.

How good is the education that results from school choice or vouchers—or from charter schools, which function outside government strictures and teacher-union rules while receiving public funds? Ms. Fitzpatrick says that the jury is still out. She cites various studies, with mixed conclusions. She doesn’t mention the astronomically improved test scores of students who have attended Success Academy, a network of New York City charter schools founded in 2006 and now serving more than 20,000 students in grades K-12.

Debates over education can make for strange bedfellows. Ms. Fitzpatrick tells the story of the alliance in the 1990s between the Milwaukee-based black Democratic legislator Polly Williams, who favored private-school vouchers, and Wisconsin’s white Republican Gov. Tommy Thompson. After helping to implement the country’s first voucher program, Williams turned on Thompson because he wanted vouchers to be universal and she wanted them to go only to the poor.

Ms. Fitzpatrick seems to feel that Democratic-leaning black parents are getting played by white, Republican-leaning advocates who push for alternatives to public schools. Such advocates say they care about the best interests of inner-city children—whose public schools are often miserable—but Ms. Fitzpatrick implies that they really want to engage in religious indoctrination (with church-defined curriculums) or to make money by ultimately privatizing public schools. She uncharitably describes Clint Bolick—who argued many of the first school-voucher cases—as “never one to miss a good public relations opportunity.”

For someone who seems inclined to question the motives of those who favor school choice, Ms. Fitzpatrick doesn’t seem much interested in why others oppose it. Democrats from Bill Clinton to Barack Obama at first favored charter schools and school choice but backed away when teachers unions threatened to withdraw their political support.

Link to the rest at The Wall Street Journal

PG attended public schools exclusively for all twelve years of his pre-college education. Some of the public schools were very good, some were mediocre and some were really terrible.

He suggests that teachers are the determining factor in whether students have good or poor educational experiences. When he had good teachers, public school was a wonderful learning experience. When his teachers were poor, public school was often pure torture.

PG thinks that the politicization of public schools and public school systems is almost certain to result in poorer educational experiences for students.

Transformative Agreements: ‘An Essential Stepping Stone’

From Publishing Perspectives:

As Publishing Perspectives readers know, the academic and scholarly world’s march toward open-access models hasn’t moved as quickly as many would like. The late-June release from Europe’s Coalition S open-access initiative, known as “Plan S,” was plainly presented as a disappointment.

Closer to the ground, if you will, however, there are parties gamely announcing progress and achievements, among them the London-based 182-year-old Royal Society of Chemistry (the URL of which, yes, looks like that of the Royal Shakespeare Company).

In its media messaging today (August 11), the society—which has an international membership of more than 50,000—is focusing on what may be to some a surprising number of transformative agreements in North America, 46 all told. They are:

  • 2018: One agreement (the society’s first in the United States, with Massachusetts Institute of Technology)
  • 2019-21: Three agreements (all in the United States)
  • 2022: Seven agreements (all in the United States)
  • 2023: 35 agreements (21 in the United States, one in Mexico, and 13 in Canada)

The Biden administration in August of 2022 announced its controversial requirement that by the end of 2025 all taxpayer-funded research will have to be made freely available to the public as soon as a final peer-reviewed manuscript of a report is published. The chemistry society in England is now mentioning this as one of the factors that has accelerated its agreements, along with the society’s own plans.

“On the back of the US government’s open-access mandate and our own open-access commitments,” the society reports, “the number of deals has grown rapidly within the region every year, with 2023 seeing 28 new deals, including our first agreements with partners in Canada and Mexico.”

. . . .

Sara Bosshart is the Royal Society of Chemistry’s head of open access, and she’s quoted today, saying, “We were very excited last year to announce that we aim to make all of our fully society-owned journals open access within the next five years. Open access is at the core of our mission to help the chemical sciences make the world a better place and by making our 44 society-owned journals free-to-read, we’ll be providing unrestricted global access to all of the cutting-edge research we publish.

“A key priority for our transition,” Bosshart says, “is to ensure that our international author base continues to have the same ability to publish in our journals. For this reason, we’re planning to spend the next two years working with our world partners, institutions, and community to develop new open-access models that function at an institutional level, rather than relying solely on author publication charges.

“Transformative agreements are an essential stepping stone in our [progress] toward 100-percent open access as they form the basis for future open-access agreements and allow us to transition gradually from subscriptions to open access. They also strengthen the relationships we have with our United States institutional partners and create a forum for conversation and collaboration toward a joint open-access future.

“Our end goal is an open-access future that ensures that everyone, everywhere, has the same potential to access and contribute to the latest discoveries in the chemical sciences and beyond—and we’re looking forward to working collectively with our community to achieve this vision.”

Link to the rest at Publishing Perspectives

The Perfection Trap

From The Wall Street Journal:

There comes a moment in every job interview when the applicant will be asked to name his or her greatest weakness. “Well, I’d have to say it’s my perfectionism” is the smart answer, a humblebrag that is pretty short on humility. These days—as Thomas Curran writes in “The Perfection Trap: Embracing the Power of Good Enough”—this “weakness” is a strength. It assures a prospective employer of your commitment to the highest standards, “counted in hours of relentless striving, untold personal sacrifices, and heaps of self-imposed pressure.”

Perfectionism was not always held in such high regard; it was once the stuff of horror. Nathaniel Hawthorne’s 1843 cautionary tale “The Birth-Mark” tells of a scientist who becomes fixated on his beautiful wife’s single blemish—a birthmark. She internalizes his revulsion and wants to remove the defect “at whatever risk.” The scientist at last concocts a remedy and his wife gulps it down. Fine, except it kills her. The lesson? Perfection is a form of madness, one best avoided.

Mr. Curran, an associate professor of psychology at the London School of Economics, writes that perfectionism “seems to be the defining characteristic of our time.” Our rise-and-grind work lives are animated by the notion that “if you’re slacking, slowing down or, worse, taking a moment to simply think about what all the relentless grinding is even for, then you’re going to be left behind.”

I expected “The Perfection Trap” to attack the self-defeating behavior of perfectionists—the author counts himself as one—with a predictable hodgepodge of mindfulness, leavened with a little cognitive behavioral therapy and maybe a phone app thrown in for good measure. Instead, Mr. Curran has produced a manifesto damning our economic system for creating and maintaining a warped set of values that drive perfectionism, values we have internalized without examination. There’s no easy fix, he warns. The task calls for the kind of deep introspection that is both hard and unpopular; we must confront “our most basic assumptions about what’s ‘great’ and ‘good’ in modern society.”

The case of Lance Armstrong exemplifies the dynamic. The cyclist admitted to doping his way to seven Tour de France victories, but wasn’t sorry for it—it was simply, in his mind, leveling the playing field. “The culture was what it was,” he told Oprah Winfrey. “We were all grown men, we all made our choices.” But step back for a moment, Mr. Curran tells us, and see how Mr. Armstrong is attempting to normalize irrational and shocking behavior. The arms race that he willingly took part in risked every cyclist’s health but didn’t make any single rider more likely to succeed. It paid off for Mr. Armstrong, but not everyone was so lucky: the race cost some cyclists their health, and others their lives.

“The same destructive arms race is playing out in wider culture right now,” Mr. Curran writes. We subjugate our own well-being to that of the economy, which “needs to grow far more than we need to feel content. Perfectionism is just the collateral damage.” Mr. Curran is explicit. To him, healthy perfectionism—a hedge used by those of us seeking to exclude ourselves from his critique—is an oxymoron.

. . . .

The lifeblood of social media and advertising is unhappiness. Every day, some two billion of us—a quarter of the earth’s population—log in to Facebook or Instagram and measure the ways our lives are lacking compared to the photoshopped images of impossibly well-turned-out people living their fabulous lives. “Isn’t this what Instagram is mostly about?” an Instagram employee asked rhetorically in a leaked memo. “The (very photogenic) life of the top 0.1%?” It is. And children pay an especially high price when they feel like they’re falling short. “We make body image issues worse for one in three teen girls,” according to a slide from a leaked Facebook presentation.

Even more disturbing was a leaked chart indicating that 6% of teens in the U.S. and 13% of teens in Britain said “spending time on Instagram was one reason why they felt like they wanted to kill themselves.” The company has no incentive to remedy this. “Facebook (now Meta),” which owns Instagram, “has increased its advertising revenues exponentially since 2009, to almost $115 billion today,” Mr. Curran reminds us. It didn’t do that by telling us we’re fine the way we are. “Their algorithms can even pinpoint moments when young people ‘need a confidence boost.’ ”

Link to the rest at The Wall Street Journal

Decide Where You’re Standing in Time as You Write Your Memoir

From Jane Friedman:

Two temporal elements—the time frame of the story and where you are standing in time while you tell your tale—are central to the idea of structure in memoir.

But they are tricky to determine because you are still living the life you are writing about in your memoir, and you existed at all the points in time throughout the story you are telling. It’s easy to think that you are just you and the story is just the story, and to believe that you don’t have to make any decisions around time the way a novelist does. But if you neglect to make conscious choices about time, you risk getting tangled up and writing a convoluted story.

The first decision: choose a time frame for your story

What time period will your story cover? Don’t think about flashbacks to your younger self or memories of times gone by; all stories have that kind of backstory and it doesn’t count when answering this question.

Also don’t think about whether you are going to write your story chronologically or present the story in some other way (such as backward or in a braid); these are questions about form that get sorted out later.

For now, just think about what is known as “story present”: the primary period of time that the reader will experience as they are reading the story.

Here are some examples of story present from well-known memoirs:

  • The several weeks I spent hiking the Pacific Crest Trail (Wild by Cheryl Strayed)
  • The year I planted my first garden (Second Nature by Michael Pollan)
  • The three years I was in an abusive relationship (In the Dream House by Carmen Maria Machado)
  • The three years consumed by the trial of my rapist (Know My Name by Chanel Miller)
  • The four years when I was a dominatrix (Whip Smart by Melissa Febos)
  • My childhood in Ireland (Angela’s Ashes by Frank McCourt)
  • The 18 years following the accidental death of my high school classmate (Half a Life by Darin Strauss)
  • The 30-something years of my marriage (Hourglass by Dani Shapiro)
  • My whole life (I Am, I Am, I Am by Maggie O’Farrell)

If you find yourself considering a time period that covers your whole life or a big chunk of time like the last two examples in my list, make sure that you actually need to include the entire period of time to effectively tell your story.

Dani Shapiro’s Hourglass doesn’t cover her whole life, but it covers many decades. That’s because her topic is itself time—the way it moves and flows in a long marriage, the impact it has on the relationship. Even her title cues us into this truth: She is making a point about the passage of time. The time frame she uses fits her point.

Maggie O’Farrell’s memoir, I Am, I Am, I Am: 17 Brushes with Death, starts in her childhood and covers the entirety of her life up until the moment she is writing the book. It is a beautiful and effective story. But note that the intention of this book is not to tell her life story; it’s to discuss the specific ways that she is mortal and the reality that we are all mortal, and to remind us that every moment is a gift. She imposes a concept onto the material—a form or structure to unify or organize the material so that it’s not just a bunch of things that happened but a very specific and highly curated progression of things that happened. The story presents her whole life, but she only chooses to tell 17 stories. The time frame she uses fits her point as well.

The second decision: where you are standing in time as you tell your tale

While you are thinking about the time frame of your story, you also must decide where you are standing in time when you tell your story. There are two logical choices:

1. Narrating the story as you look back on it

The first option is that the story has already happened, and you are looking back on it with the knowledge and wisdom gained from having lived through those events. You are standing in time at a fixed moment that is not “today” (because today is always changing). It’s a specific day when the story has happened in the past and the future life you are living has not yet happened. This choice has to do with what we call authorial distance, or how far from the story the narrator is standing. In fiction, a first-person point of view often feels closer to the story than a third-person point of view. In memoir, if you are telling the story from a specific day that is just after the events that unfolded, you will be closer to the story than if you were telling the story from a specific day three decades later.

I wrote my breast cancer memoir just months after my treatment had ended and the friend who inspired me to get an (unusually early) mammogram had died. Her recent death was the reason for the story and part of the framing of it. She died young and I did not; the point I was making was about getting an early glimpse at the random and miraculous nature of life—a lesson that most people don’t really metabolize until they are much older. I wanted to preserve the close distance to the events of the story. If I told that story now, I would be telling it with the wisdom of having lived well into middle age—a very different distance from the story and a very different perspective.

I once worked with a client who had been a college admissions officer at an elite private high school. The pressure of the work, the outrageous expectations of the kids and parents, and the whole weight of the dog-eat-dog competitive culture contributed to him having a nervous breakdown. He wrote a memoir in which he answered college application questions from the perspective of a wounded and reflective adult. It was brilliant, and part of its brilliance was the wink and nod of doing something in his forties that so many people do at age seventeen.

We are talking here about authorial distance related to time, but there is also the concept of authorial distance related to self-awareness. I know that sounds a little like an Escher staircase circling back on itself—and it kind of is. The narrator of a memoir (the “you” who is standing at a certain moment in time) has some level of self-awareness about the events and what they mean. One of the reasons that coming-of-age stories are so beloved is that, by definition, the narrator is awakening to themselves and the world for the first time. There is very little distance (temporal or emotional) between who they were and who they became and there is a purity and poignancy to that transformation. It’s as if they are awakening to the very concept of self-awareness.

It is entirely possible for an adult to write a memoir and not bring much self-awareness to what they are writing about; it’s unfortunately quite common. A narrator who is simply reciting what happened—“this happened to me and then this happened and then this other thing happened”— is not exhibiting self-awareness about their life. They are not stepping back emotionally from it, so they don’t have any perspective to offer no matter how far away they are from it in time. They are just telling us what happened. These kinds of stories tend to feel flat and self-absorbed. They make no room for the reader. They don’t offer any sort of reflection or meaning-making, don’t offer any emotional resonance, and don’t ultimately give us the transformation experience we are looking for when we turn to memoir.

Laurel Braitman, author of What Looks Like Bravery, explains it like this: “I tell this to my students now: You can only write at the speed of your own self-awareness. You do not want the reader to have a realization or insight about your life that you haven’t had already or they will lose respect for you.”

If you are telling your story as you are looking back on it, make a clear decision about exactly where you are standing in time and make sure you have enough self-awareness to guide the reader with authority through the events you are recounting.

2. Narrating the story as it unfolds

The second logical option in terms of where the narrator stands in time is to tell the story as though you are experiencing it for the first time. There is no temporal distance from the events you are writing about. You narrate the story as the story unfolds, which means that you narrate it without the knowledge of how it all turned out. In this kind of story, the self-awareness that is necessary for an effective memoir is unfolding as the story unfolds as well.

I wrote a memoir about getting married and the structure of it was a “countdown” to the wedding. In this format, the concept is that events were unfolding as I was living them. This wasn’t technically true—I wrote the book after the wedding had taken place—but I had taken extensive notes and was able to preserve the concept of not knowing how people would behave or how I would feel. (This book embarrasses me now—the whole idea of it. I was 25 when I wrote it, so what can I say? I am grateful for its role in my career and here it is being useful as an example.)

In Joan Didion’s memoir, The Year of Magical Thinking, she wrote about the year after her husband dropped dead at the dinner table, and of the difficulty that the human mind has grasping that kind of catastrophic change. In the first pages of the book, she writes, “It is now, as I begin to write this, the afternoon of October 4, 2004. Nine months and five days ago, at approximately nine o’clock….” What she is doing here is signaling to us that there is not a whole lot of authorial distance or self-awareness to what she is sharing. She is figuring it all out—this tendency to think magical thoughts about the dead not really being dead—as she writes. But the key thing is that she knows that she is figuring it out, and she invites us into the process. She has self-awareness about her own lack of self-awareness. She is not just telling us about the dinner and the table and the call to 911.

In Bomb Shelter, Mary Laura Philpott places her narrator self at a point in time when the story is still unfolding; she has perspective and self-awareness, but those elements are still clearly in flux. The New York Times reviewer Judith Warner called this out in her rave review of the book. Warner said, “I want to say something negative about this book. To be this positive is, I fear, to sound like a nitwit. So, to nitpick: There’s some unevenness to the quality of the sentences in the final chapter. But there’s no fun in pointing that out; Philpott already knows. ‘I’m telling this story now in present tense,’ she writes. ‘I’m still in it, not yet able to shape it from the future’s perspective.’” Like Didion, Philpott was well aware of the choice she made around narration and time, and those choices perfectly serve her story.

Link to the rest at Jane Friedman

PG notes that on the date he posted this item, the book written by the author of the OP had not been published yet. You can find the book here for preorder. Here’s a link to a broken home page for the publisher, Tree Farm Books.

PG was about to comment on a publisher that can’t even keep a home page operating at a time when a large number of readers, including PG, go online to gather information about books they might be interested in reading. Not to mention that the profit margins for selling ebooks are much larger than for books on paper.

PG mentions again that, in his opinion, it’s stupid to get a blurb for a book or, in this case, have the author of a book write a detailed online article about the subject of the book when the book isn’t available on Amazon yet. In the case of the book shown below, the publisher, Tree Farm Books, hadn’t even made the book eligible for preorders on Zon.

But PG’s opinions could be entirely wrong.

Life as It Is, Blinkers Removed

From The Wall Street Journal:

As a young man, Christian Madsbjerg was poking around a library and came across a passage by Aristotle that struck him as ridiculous. In “Physics,” Aristotle explained that objects drop earthward because they want to return to the place where they belong. As Mr. Madsbjerg re-read the lines, he felt his derision give way to dizziness. “In [Aristotle’s] world, objects fell to the ground because they were homesick,” Mr. Madsbjerg writes in “Look: How to Pay Attention in a Distracted World,” a provocative, sometimes elusive work of inquiry and instruction. That a philosopher two millennia ago knew nothing of gravity was not in itself surprising; what moved Mr. Madsbjerg was the realization that Aristotle must have really believed what he’d written.

“I have since encountered that vertigo of coalescing insight many, many times,” writes the Danish-born entrepreneur, who has built a career on the pursuit of perceptual breakthroughs. In “Look,” Mr. Madsbjerg attempts to impart the wisdom he has acquired from art and philosophy and from the practical experience of running a corporate consultancy and teaching a class on “human observation” at the New School in New York. His book is full of intriguing goodies: anecdotes and precepts originating in a wide array of sources, as well as summaries of the work of gestalt theorists and practitioners of phenomenology, a discipline he defines as “the study of how the human world works and everything that gives our life meaning.”

The breadth and vagueness of that definition bespeak the book’s enthusiastic overreach. There’s a lot here, and a lot of it makes sense, but there are moments when the argument is so diffuse as to feel precarious. At times Mr. Madsbjerg teeters on the edge of banality, even incoherence. Yet he never tips over, for there is unmistakably truth in what he’s getting at. And, to be fair, the distinction between apparent understanding and deep understanding is difficult to draw. It is difficult even to describe. Mr. Madsbjerg does a heroic job of seeking to capture the experience of sudden insight.

The “richest reality,” he argues, is reached not by thinking but by looking—by which he means broadly using one’s perceptual apparatus. This dictum sounds simple to follow, but it is not. Looking takes time. Looking requires silencing the chattering mind. Looking means not only letting the eye follow the shiny bouncing object but also taking in the vast realm of context around the object and noticing what is not shiny and not bouncing. Perhaps most challengingly, it means surrendering the idea that we are necessarily seeing what we think we see. For, as Mr. Madsbjerg explains, “perception happens inside us; we change what we see to reflect who and where we are in the world.”

A rose may be a rose may be a rose, to mangle Gertrude Stein, but a rose is going to have very different meanings for a botanist, an interior designer or a love-struck suitor. When we glance at a rose—or any other thing—we see it amid the connotations it holds for us. This is the idea of gestalt, the recognizable “wholeness” of things. To take another example, few of us would stop to assess the color, size, shape and composition of a dining table and a set of chairs before drawing conclusions about their nature and purpose. The same is true with the gestalt of a picnic lunch or a museum gallery or a crowded bus stop: It’s easy to register a scene without noticing any of the details.

This capacity to make rapid sense of what we see equips us to get on with daily life, but it can also be an obstacle to clarity, fitting us with blinkers and causing us to miss important signals. “Look” is full of stories of people who have struggled to see the forest for the trees, or the trees for the forest, or the reason the forest is growing where it is and not some other place. One anecdote tells of an executive at an electronics company who was so intent on developing sophisticated big-screen televisions that he missed the cultural shift toward TV-viewing on laptops and smartphones. Another relates how the city-born biographer Robert Caro had to spend long hours in the open-skied isolation of rural Texas before he felt he could begin to understand his subject, Lyndon Johnson.

Looking through an ideological lens can also be blinding, as Mr. Madsbjerg knows firsthand, having grown up in a Marxist milieu on a Danish island in the Baltic Sea. “For a teenager, Communism was a totalizing, energetic, and angry faith,” he writes. “I was enraptured by fervor—furious about the class inequities in the world.” Only when his revolutionary ardor faded was he able to see how the ideology had distorted his perception of the world. The fall of the Berlin Wall became, in his telling, the impetus to begin a quest to understand how fashions in thinking advance and retreat; why people change their minds; and whether it is possible to anticipate social change, to “feel signs of a storm approaching.”

Link to the rest at The Wall Street Journal

The World Behind the World

From The Wall Street Journal:

Imagine that you could shrink yourself small enough to be able to travel inside a brain. Hiking up the hippocampus or sailing around the cerebellum would no doubt be an awesome experience and well worth the trip. But nowhere would you be able to experience what the person with the brain was thinking, hearing or feeling. The buzzing hive of conscious life would be, to you, just a collection of cells.

The limits of such a fantastic voyage point to two seemingly irreconcilable ways of viewing ourselves: as biological matter and as self-aware mind. The contrast has stumped philosophers and scientists for centuries and is sometimes framed in terms of the objective and subjective, the external and internal. Neuroscientist Erik Hoel, taking up these two realms of human self-perception, calls them the extrinsic and intrinsic perspectives. In “The World Behind the World,” he tries to explain why they exist, how we might try to bring them under a common understanding and why we might never fully succeed. On top of that, he attempts to rebut the argument, ever more frequent these days, that the firings of neurons—not thoughts and desires—are the only real causes of human action. As if that weren’t enough, he tries to save the idea that we have free will too—all in slightly more than 200 pages. His project is, he admits, “impossible in scope,” but he still has the chutzpah to give it a go.

Mr. Hoel begins by challenging the idea that the dual perspective—inside and outside—has been a perennial, unchanging feature of human experience. He argues instead that the intrinsic perspective took a while to come into its own. Since the time of Homer, he says, our ability to describe and express our inner world has developed considerably. The first stirrings can be seen in ancient Greek writers like Simonides, who developed the “memory palace” method of memorization, and Euripides, who gave voice to the inner lives of his characters. This perspective languished during medieval times, Mr. Hoel says, but was given new life in the Renaissance by writers like Cervantes and developed yet more with the rise of the novel in the 18th century.

The argument is speculative and the timeline too neat, but the key point is well-made: Consciousness may be a human universal, but our desire and ability to articulate its qualities and recognize it explicitly as a distinctive perspective are human variables, not any more innate than our ability to study the world scientifically. And indeed, Mr. Hoel says, the rise of science, too, depended on our learning a perspective—the extrinsic one. He gives an even faster whistle-stop tour of this process of development.

But then, in perhaps the book’s most interesting chapter, Mr. Hoel changes tack and argues that the achievements of neuroscience—the quintessential mode for extrinsic self-examination—fall far short of what the hype and headlines suggest. We have been promised so much, as if the close study of the brain’s collection of cells will yield a complete explanation for consciousness. The truth is, Mr. Hoel says, that mapping the brain’s neuronal activity has provided us with very little predictive information. You cannot, for example, reliably tell if someone is depressed by looking at a brain scan. The brain is simply too malleable, too fluid in its structures and too complex for scans to pinpoint what does what at the neuronal level and what effects will arise from any particular pattern of neural activity.

But there is a deeper problem. Neuroscience has assumed that it can proceed purely by adopting the extrinsic perspective. However, as our hypothetical miniaturized journey around the brain shows, this perspective has severe limits—almost by definition, it excludes consciousness itself. Neuroscientists carry on as though this exclusion doesn’t matter. They tend to see consciousness, Mr. Hoel writes, as “some sort of minimal subsystem of the brain, possessing no information, almost useless. The steam from an engine.” Mr. Hoel argues that “our very survival as an organism” depends on consciousness and its ability to track reality accurately—or, as he puts it, on “the stream of consciousness being constantly veridical and richly informative.” Nothing in the brain makes sense, he adds, “except in the light of consciousness.”

At this point, Mr. Hoel has deftly set the stage for another whirlwind tour, this one surveying the theories of consciousness that have come out of both psychology and neuroscience. But here the going gets tougher. To mix metaphors, the reader faces a steep learning curve across terrain that is slippery at the best of times. It doesn’t help that Mr. Hoel starts to use logic truth tables as though they were as easy to read as bar charts.

. . . .

You cannot, for example, by analyzing the movement of individual atoms in the atmosphere, tell whether or not it’s going to rain or blow a hurricane. You have to look at a larger, more-inclusive scale, the one of air-pressure systems and cloud formation. Because only events at this level can explain and predict weather events, we rightly say they are their causes. Similarly, patterns of individual neurons firing don’t tell you how a person is going to act. But what someone thinks and feels may do so. These states of mind, Mr. Hoel says, are rightly called causes of action: They have “agency” in a way that mere cells do not.

As if he hasn’t taken on enough already, Mr. Hoel finishes by tackling the thorny problem of free will. The ambition of this task is suggested by his conclusion: “Having free will means being an agent that is causally emergent at the relevant level of description, for whom recent internal states are causally more relevant than distant past states, and who is computationally irreducible.” This is a perfectly reasonable suggestion, but it requires more than the 13 pages it is given to make its meaning clear, let alone to build a case for it.

Link to the rest at The Wall Street Journal

Theoderic the Great

From The Wall Street Journal:

If there were a Roman version of “1066 and All That,” the satirical romp through English history, the year 476 would surely be one of those suspiciously bold lines in our collective historical imagination. It was then that Romulus Augustulus, the last Roman emperor, was deposed in the west. On one side of his 10-month reign lay Antiquity. On the other, the Middle Ages.

Where does that leave Theoderic the Great, the Ostrogothic king who reigned in Italy from 493 until his death in 526? Under the rule of this Gothic-speaking warrior, the Colosseum still rang with the roar of spectators, crisp mountain water still streamed through the aqueducts, and giants of Latin literature, like Cassiodorus and Boethius, still served in the senate.

Hans-Ulrich Wiemer’s “Theoderic the Great: King of Goths, Ruler of Romans” is a monumental exploration of the life and times of this remarkable leader. It is the most important treatment of its subject since Wilhelm Ensslin’s 1947 biography, and since Mr. Wiemer’s book (here in John Noël Dillon’s fluid English translation) surpasses its predecessor in breadth and sophistication, the author can claim the laurel of having written the best profile of Theoderic we have.

The story of Theoderic is epic and improbable. He was born in 453 or 454 in the ever-contested Danubian borderlands, probably in what is now the east of Austria, to an elite Gothic warrior and a mother of obscure background. The Gothic tribe to which Theoderic belonged had just emerged, following the recent death of Attila, from a long spell of domination by the Huns. In 461, the boy Theoderic was shipped to Constantinople as insurance for a treaty. He spent almost a decade, his formative youth, in the great metropolitan capital of the Roman Empire.

Theoderic’s power derived less from his distinguished ancestry or the Gothic respect for royal legitimacy, Mr. Wiemer emphasizes, than from his success as a warrior. As an upstart prince, he killed the Sarmatian King Babai with his own hands. As a commander at the head of a fearsome Gothic army, he proved a fickle ally for the eastern Roman Empire, whose emperors were hardly models of loyalty themselves. In the early 480s, he was named commander-in-chief by the Romans. Within a few years, he was besieging Constantinople.

If his career had ended there, Theoderic’s name would belong among the distinguished mercenary warlords of the troubled fifth century. But fortune favors the bold, and Theoderic had even grander ambitions. In 488, he set off with some 100,000 followers—men, women and children—in an armed wagon train on an uncertain journey from the banks of the Danube (in what is now Bulgaria) to Italy. Their goals were to unseat Odoacer—the deposer of Romulus Augustulus—and to find for themselves a permanent home. Theoderic cornered Odoacer and his forces in the major stronghold of Ravenna, and the two signed a treaty by which they were meant to share power. The treaty lasted all of about 10 days, before Theoderic personally clove his rival in two (“with a single sword stroke,” Mr. Wiemer tells us, “slicing him apart from collarbone to hip”). From such sanguinary beginnings emerged a generation of peace in Italy.

What makes Mr. Wiemer’s survey so rich is his mastery of recent research on the twilight of antiquity. Theoderic’s reign cuts to the heart of virtually every great debate among scholars of this period. Were his Ostrogoths an essentially Germanic tribe, or is ethnicity a fiction ever reconfigured by contingent power dynamics?

For Mr. Wiemer, a professor of ancient history at the University of Erlangen–Nuremberg, Ostrogoths were a “community of violence” whose material basis was war and plunder. But the author recognizes that the masses who followed Theoderic on his Italian adventure were a people of shared history and culture, setting them apart from the natives in Italy and drawing them closer to other groups, such as the Visigoths who had settled in Spain and Gaul.

Mr. Wiemer is convincing on the main lines of Theoderic’s domestic and foreign policy. At home, Theoderic pursued functional specialization between the Goths and the Romans. The former were warriors (if also landowners), the latter civilians. A two-track government reflected this essential division of labor. Theoderic sought complementarity, not fusion.

Abroad, he sought legitimacy from the eastern Roman capital, along with stability in the post-Roman west. By means of strategic treaties and an astonishing network of marriage alliances among the Vandals, Visigoths, Franks, Burgundians and others, Theoderic emerged as the most powerful ruler west of Constantinople. Thanks to opportunistic expansion, he came to control wide swathes of the Balkans, much of southern Gaul and (nominally) the Iberian Peninsula. In the early sixth century, it would not have been obvious that the Frankish kingdom would prove more enduring and consequential.

Link to the rest at The Wall Street Journal

Coin depicting Flavius Theodoricus (Theodoric the Great). Roman Vassal and King of the Ostrogoths. Only a single coin with this design is known; it is in the collection of the Italian numismatist Francesco Gnecchi, displayed in Palazzo Massimo, Rome. Source: Wikimedia Commons

Mausoleum of Theoderic, built in 520 AD by Theoderic the Great as his future tomb, Ravenna, Italy. Source: Wikimedia Commons

Gadgets and Gizmos That Inspired Adam Smith

From Reason:

Pocket gadgets were all the rage in Adam Smith’s day. Their popularity inspired one of the most paradoxical, charming, and insightful passages in his work.

The best known are watches. A pocket timepiece was an 18th century man’s must-have fashion accessory, its presence indicated by a ribbon or bright steel chain hanging from the owner’s waist, bedecked with seals and a watch key. Contemporary art depicts not just affluent people but sailors and farm workers sporting watch chains. One sailor even wears two. “It had been the pride of my life, ever since pride commenced, to wear a watch,” wrote a journeyman stocking maker about acquiring his first in 1747.

Laborers could buy watches secondhand and pawn them when they needed cash. A favorite target for pickpockets, “watches were consistently the most valuable item of apparel stolen from working men in the eighteenth century,” writes historian John Styles, who analyzed records from several English jurisdictions.

But timepieces were hardly the only gizmos stuffing 18th century pockets, especially among the well-to-do. At a coffeehouse, a gentleman might pull out a silver nutmeg grater to add spice to his drink or a pocket globe to make a geographical point. The scientifically inclined might carry a simple microscope, known as a flea glass, to examine flowers and insects while strolling through gardens or fields. He could gaze through a pocket telescope and then, with a few twists, convert it into a mini-microscope. He could improve his observations with a pocket tripod or camera obscura and could pencil notes in a pocket diary or on an erasable sheet of ivory. (Not content with a single sheet, Thomas Jefferson carried ivory pocket notebooks.)

The coolest of all pocket gadgets were what antiquarians call etuis and Smith referred to as “tweezer cases.” A typical 18th century etui looks like a slightly oversized cigarette lighter covered in shagreen, a textured rawhide made from shark or ray skin. The lid opens up to reveal an assortment of miniature tools, each fitting into an appropriately shaped slot. Today’s crossword puzzle clues often describe etuis as sewing or needle cases, but that was only one of many varieties. An etui might contain drawing instruments—a compass, ruler, pencil, and set of pen nibs. It could hold surgeon’s tools or tiny perfume bottles. Many offered a tool set handy for travelers: a tiny knife, two-pronged fork, and snuff spoon; scissors, tweezers, a razor, and an earwax scraper; a pencil holder and pen nib; perhaps a ruler or bodkin. The cap of a cylindrical etui might separate into a spyglass.

All these “toys,” as they were called, kept early manufacturers busy, especially in the British metal-working capital of Birmingham. A 1767 directory listed some 100 Birmingham toy makers, producing everything from buttons and buckles to tweezers and toothpick cases. “For Cheapness, Beauty and Elegance no Place in the world can vie with them,” the directory declared. Like Smith’s famous pin factory, these preindustrial plants depended on hand tools and the division of labor, not automated machinery.

Ingenious and ostensibly useful, pocket gadgets and other toys epitomized a new culture of consumption that also included tea, tobacco, gin, and printed cotton fabrics. These items were neither the traditional indulgences of the rich nor the necessities of life. Few people needed a pocket watch, let alone a flea glass or an etui. But these gadgets were fashionable, and they tempted buyers from a wide range of incomes.

A fool “cannot withstand the charms of a toyshop; snuff-boxes, watches, heads of canes, etc., are his destruction,” the Earl of Chesterfield warned his son in a 1749 letter. He returned to the subject the following year. “There is another sort of expense that I will not allow, only because it is a silly one,” he wrote. “I mean the fooling away your money in baubles at toy shops. Have one handsome snuff-box (if you take snuff), and one handsome sword; but then no more pretty and very useless things.” A fortune, Chesterfield cautioned, could quickly disappear through impulse purchases.

In The Theory of Moral Sentiments, first published in 1759, Smith examined what made these objects so enticing. Pocket gadgets claimed to have practical functions, but these “trinkets of frivolous utility” struck Smith as more trouble than they were worth. He deemed their appeal less practical than aesthetic and imaginative.

“What pleases these lovers of toys is not so much the utility,” Smith wrote, “as the aptness of the machines which are fitted to promote it. All their pockets are stuffed with little conveniences. They contrive new pockets, unknown in the clothes of other people, in order to carry a greater number.” Toys embodied aptness, “the beauty of order, of art and contrivance.” They were ingenious and precise. They were cool. And they weren’t the only objects of desire with these qualities.

The same pattern applied, Smith argued, to the idea of wealth. He portrayed the ambitious son of a poor man, who imagines that servants, coaches, and a large mansion would make his life run smoothly. Pursuing a glamorous vision of wealth and convenience, he experiences anxiety, hardship, and fatigue. Finally, in old age, “he begins at last to find that wealth and greatness are mere trinkets of frivolous utility, no more adapted for procuring ease of body or tranquillity of mind than the tweezer-cases of the lover of toys.”

Yet Smith didn’t condemn the aspiring poor man or deride the lover of toys. He depicted them with sympathetic bemusement, recognizing their foibles as both common and paradoxically productive. We evaluate such desires as irrational only when we’re sick or depressed, he suggested. In a good mood, we care less about the practical costs and benefits than about the joys provided by “the order, the regular and harmonious movement of the system….The pleasures of wealth and greatness, when considered in this complex view, strike the imagination as something grand and beautiful and noble, of which the attainment is well worth all the toil and anxiety.”

Besides, Smith suggested, pursuing the false promise of tranquility and convenience had social benefits. It was nothing less than the source of civilization itself: “It is this which first prompted them to cultivate the ground, to build houses, to found cities and commonwealths, and to invent and improve all the sciences and arts, which ennoble and embellish human life; which have entirely changed the whole face of the globe, have turned the rude forests of nature into agreeable and fertile plains, and made the trackless and barren ocean a new fund of subsistence, and the great high road of communication to the different nations of the earth.”

Then Smith gave his analysis a twist. The same aesthetic impulse that draws people to ingenious trinkets and leads them to pursue wealth and greatness, he argued, also inspires projects for public improvements, from roads and canals to constitutional reforms. However worthwhile one’s preferred policies might be for public welfare, their benefits—like those of a pocket globe—are secondary to the beauty of the system.

“The perfection of police, the extension of trade and manufactures, are noble and magnificent objects,” he wrote. “The contemplation of them pleases us, and we are interested in whatever can tend to advance them. They make part of the great system of government, and the wheels of the political machine seem to move with more harmony and ease by means of them. We take pleasure in beholding the perfection of so beautiful and grand a system, and we are uneasy till we remove any obstruction that can in the least disturb or encumber the regularity of its motions.” Only the least self-aware policy wonk can fail to see the truth in Smith’s claim.

Here, however, the separation of means and end can be more serious than in the case of a trinket of frivolous utility. Buying a gadget you don’t need because you like the way it works doesn’t hurt anyone but you. Enacting policies because they sound cool can hurt the public they’re supposed to benefit. “All constitutions of government,” Smith reminded readers, “are valued only in proportion as they tend to promote the happiness of those who live under them. This is their sole use and end.” Elsewhere in The Theory of Moral Sentiments, Smith criticized the “man of system” who imposed his ideal order, heedless of the wishes of those he governed.

Link to the rest at Reason

In Defense of Independent Opinion Journalism

A reminder that PG doesn’t always agree with items he posts and usually avoids pieces with strong political position-taking. On occasion, he breaks from his usual pattern.

PG will observe that, at least in the United States, there are periods characterized by those who hold differing political opinions or values speaking past each other, frequently relying on straw-man arguments.

PG does continue to urge commenters to be respectful of those with differing opinions even if those opinions seem wrong in some way. PG will also note that the large majority of those who choose to leave comments here are intelligent adults, and he suggests that differing opinions appearing in the comments be treated as coming from intelligent adults, not idiots.

(PG just checked and found that there have been 328,174 comments left on TPV in discussions of the various posts PG has made over several years. The number of comments PG has deleted for going beyond the limits of respectful dissent can be counted on his hands plus a couple of toes.)

From New York Magazine:

A couple decades ago, liberals began to see the structural asymmetry in the news media as one of the major problems in American politics. The Republican Party had an unapologetically partisan media apparatus — anchored by Fox News, founded in 1996 — that it used to promote its message. Democrats lacked anything similar. Even worse, the mainstream media had become highly sensitive to charges of liberal bias and habitually treated Republican-promoted narratives, however superficial or farcical, as inherently newsworthy. The conservative media was slavishly partisan, and the “liberal” media was filled with stories about how Al Gore was seen as a pathological liar, or John Kerry an effete flip-flopper.

Two phrases came into circulation that expressed this frustration. One was working the refs, which was borrowed from the sports world to describe how Republicans pushed reporters and editors rightward with nonstop complaints of bias.

The second was hack gap, which described the imbalance in professional ethos between left and right. Liberal pundits tended to see themselves as journalists rather than activists. They were expected to advance original arguments rather than echo a common message, and the rewards of career advancement generally went to those willing to criticize Democrats and fellow progressives. Conservative pundits usually came out of the conservative movement, saw themselves as working toward an ideological project, and operated with the tight discipline of a movement. Democrats would face swift internal criticism if they fudged the truth or violated any ethical norm, while Republicans, as long as they remained faithful to conservative doctrine, could count on the support from their chorus no matter what they did.

Over time, these critiques have exerted a profound effect on the news media. The mainstream media has moved distinctly to the left, and its once-universal practice of covering every factual debate merely by alternating quotes from opposing parties while treating the truth as unknowable has become rarer.

Progressive opinion journalism has changed even more dramatically. Breaking from the pack to question a shared belief on the left is no longer a prized trait; it is now possible to build a career unswervingly affirming progressive movement stances. On the whole, the profession has changed for the better. The internet has opened up far more voices on the left, in every way. There are more writers from more perspectives and bringing more expertise, and more of them are not white men. The absurdity of the 1990s world in which the ideological spectrum of mainstream thought ended at the center left needed to die. My work as a liberal writer is far more interesting today than it was when I began. Liberalism as a whole benefits from a strong critique from the left as well as the right.

As a political matter, the conservative-messaging apparatus no longer operates without any parallel opposition. The asymmetric structure of the media 15 or 20 years ago, which shaped Republicans into a party free to violate norms while Democrats felt constrained to follow them, is giving way to a more balanced system. After years of asking why liberals lacked their own version of Fox News, we can now see something like it, cobbled together from websites and cable-news programming.

At the same time, the downsides of this new media world have become increasingly obvious. Along with their partisan messaging system, progressives are constructing a counterpart to the information bubble in which conservatives have long resided. Where it was once rare to encounter some pseudo-fact circulating among the left, it is now routine to find people believing that Michael Brown was shot with his hands up, that lab-leak is a debunked conspiracy theory, or that Republicans are routinely banning instruction about racism.

In 2010, libertarian writer Julian Sanchez described the sealed universe of conservative thinking as “epistemic closure” — any source that refuted conservative claims was automatically deemed untrustworthy. One can now discern on the left at least the embryonic formation of a similar alt-universe, in which any inconvenient challenge is reflexively dismissed as “bothsidesing,” “concern trolling,” some form of bigotry, or any other of an ever-expanding list of buzzwords used to delineate wrong-think.

. . . .

Independence should be understood as a set of habits that can be practiced by writers from the breadth of the ideological spectrum. It does not mean having an “independent” identity in the partisan voting sense, or having a moderate personal politics. Independent-opinion journalism can be produced by writers occupying perspectives located between the two parties, outside of or orthogonal to them, or squarely within them.

Independence encourages (though hardly guarantees; we are all fallible) certain kinds of mental hygiene: Trying to imagine every situation if the partisan identities were reversed, conceding that people whose political commitments you generally oppose sometimes have correct or sympathetic points, testing your own arguments for logical and historical consistency. Would I oppose this tactic currently being used by the opposing party if my own party used it? Would I defend this tactic being used by my party if the opposing party used it?

An activist’s job is to promote (or, in some cases, prevent) political change. This is a completely honorable profession. But the job of moving public opinion toward the position you desire involves shading some truths and omitting others. Both forms of argument can be persuasive and articulate, but one is designed for edification, and the other is designed to advance political ends.

Think of the difference between a professor analyzing a legal question and a lawyer advocating for a client. The former has a point of view but is using argument for the sake of promoting deeper understanding for their readers. The latter is using whatever facts are most helpful to their client.

If you consider the metaphor working the refs, the distinction between independent-opinion journalism and political activism becomes perfectly clear. The phrase describes the way many coaches berate referees, in the belief that they will force those officials to call the game in a more favorable way. The coach may be biased enough to genuinely believe everything he screams at the refs, and the fans of his team may see the refs the same way the coach does. But a coach who’s working the refs is not setting out to give fans a fair assessment of the officials. His goal is to win the game.

Many of the writers engaging in public critiques of the mainstream media, either from the left or from the right, are working the refs. To the extent you rely on ref-workers as sources of political information, you are putting your brain in the hands of people who aren’t principally interested in enlightening you. They may want you to be informed about stories that encourage you to support their political coalition. They don’t, however, want to inform you about stories that undermine it. They are working you.

Link to the rest at New York Magazine

Thinking With Your Hands

From The Wall Street Journal:

Snobs of Northern Europe have long prided themselves, among other marks of imagined distinction, on their stillness in speech. The gesticulating Italian is a stubborn stereotype, but some drew the boundary even farther north. “A Frenchman, in telling a story that was not of the least consequence to him or to anyone else, will use a thousand gestures and contortions of his face,” Adam Smith said in a lecture in the 1760s, his hands presumably visible and steady. Even when it’s not wielded as a cudgel of nationalism, gesture is still often considered a garish ornament to rational discourse—or a cheap substitute for action, as when we dismiss something as a “political gesture.”

But it’s a mistake to ignore gesture, Susan Goldin-Meadow writes in “Thinking With Your Hands: The Surprising Science Behind How Gestures Shape Our Thoughts.” Far “more than just hand waving,” it is an “undercurrent of conversation” that expresses emotion, conveys information and aids cognition.

Ms. Goldin-Meadow is a scientist—a developmental psychologist at the University of Chicago—and “Thinking With Your Hands” is a book of science exposition, something like a lecture from a good professor. She doesn’t swaddle the facts in phony narrative or make excessive claims for their world-shaking import. She summarizes results from the literature and her own extensive research; generously cites predecessors and collaborators; and frankly admits when more work is needed. There are occasional lumps of jargon, banal formulations (“Moral education is an important topic these days because it prepares children to be fully informed and thoughtful citizens”) and overlapping accounts of the same studies. But the subject is fascinating.

Ms. Goldin-Meadow turns first to “co-speech” gestures—those we make (and make up) as we speak. Unlike “emblems”—the repertoire of culturally specific hand signs such as the thumbs-up, the “OK” circle or the ear-to-ear throat slit—they have no fixed form. They also serve a wider range of functions than emblems, not only communicating meaning to one’s listeners but also supporting our own cognition. People talking on the phone gesture, she points out, as do the congenitally blind, even when talking to other blind people.

One of her studies found that gesturing seemed to reduce the amount of mental work it took to explain the solution to a math problem. Effort was measured by asking the subjects to simultaneously recite a series of letters from memory, with more letters recited suggesting that less effort was required for the math-explanation task. (A small pleasure of “Thinking With Your Hands” is the inferential ingenuity on display in the experimental designs.) Another found that adults who gestured were better able to recount events in videos they had watched weeks earlier than those who didn’t.

Gesturing can also help to spatialize abstractions, making them more tractable for discussion. Children in one study who moved their hands while considering a moral dilemma, seemingly assigning conflicting positions to distinct spaces in front of them, appeared to be better at assimilating multiple points of view. In another experiment, children were taught the meaning of a made-up word with one specific toy used to demonstrate it. Compared with those who didn’t, the children who gestured were quicker to “generalize beyond the particulars of the learning situation” and extend the word’s application to other cases.

An expert in child development, Ms. Goldin-Meadow is especially focused on gesture’s role in education. Taking gesture seriously by noticing and encouraging it, she insists, would benefit both teachers and students. Children learning to solve certain simple equations, it turns out, often verbally describe using an unsuccessful problem-solving strategy while gesturing in a way that indicates a different, effective approach (making V shapes that group certain numbers to be added together, for example). Those who exhibit these manual-verbal mismatches, Ms. Goldin-Meadow has found, are usually the closest to achieving a breakthrough in their understanding. And students whose teachers used such mismatches in their lessons performed better than others, suggesting that gesture offers a rich channel of additional information.

Interestingly, the effect doesn’t seem to come from simply presenting two different strategies. Teachers who described two approaches verbally didn’t achieve the same boost in their classes’ learning. There’s something distinctive, Ms. Goldin-Meadow writes, about the combination of words and movement unfolding in time. (The rate is relatively stable; English speakers tend to produce one gesture per grammatical clause.) In fact, she writes, the integration of sound and gesture is a “hallmark” of humans, used even by pre-adolescent children but not by apes.

Gesture throws indirect light on the nature of human language, Ms. Goldin-Meadow argues, drawing on research into the hand signs devised by deaf children born to hearing parents or otherwise deprived of established sign language. Such “homesign” shows the same sort of organization as spoken languages do, breaking events down into discrete components (signs, words) that are then assembled into an ordered string. “It is our minds,” Ms. Goldin-Meadow concludes, “and not the handed-down languages, that provide structure” for our thoughts. Language is deep enough in our brains that even a child can invent it from scratch. By contrast, she notes, children don’t seem to invent the concept of exact numbers (as opposed to approximations) greater than five or so on their own.

Link to the rest at The Wall Street Journal

Inside the Secretive Russian Security Force That Targets Americans

From The Wall Street Journal:

For years, a small group of American officials watched with mounting concern as a clandestine unit of Russia’s Federal Security Service covertly tracked high-profile Americans in the country, broke into their rooms to plant recording devices, recruited informants from the U.S. Embassy’s clerical staff and sent young women to coax Marines posted to Moscow to spill secrets. 

On March 29, that unit, the Department for Counterintelligence Operations, or DKRO, led the arrest of Wall Street Journal reporter Evan Gershkovich, according to U.S. and other Western diplomats, intelligence officers and former Russian operatives. DKRO, which is virtually unknown outside a small circle of Russia specialists and intelligence officers, also helped detain two other Americans in Russia, former Marines Paul Whelan and Trevor Reed, these people said.

The secretive group is believed by these officials to be responsible for a string of strange incidents that blurred the lines between spycraft and harassment, including the mysterious death of a U.S. diplomat’s dog, the trailing of an ambassador’s young children and flat tires on embassy vehicles. 

The DKRO’s role in the detention of at least three Americans, which hasn’t been previously reported, shows its importance to Russia under Vladimir Putin, a former KGB lieutenant colonel who led the Federal Security Service, or FSB, before rising to the presidency. The unit intensified its operations in recent years as the conflict between Moscow and Washington worsened. 

As with most clandestine activity carried out by covert operatives, it is impossible to know for certain whether DKRO is behind every such incident. The unit makes no public statements. But officials from the U.S. and its closest allies said that DKRO frequently wants its targets to know their homes are being monitored and their movements followed, and that its operatives regularly leave a calling card: a burnt cigarette on a toilet seat. They also have left feces in unflushed toilets at diplomats’ homes and in the suitcase of a senior official visiting from Washington, these people said.

The DKRO is the counterintelligence arm of the FSB responsible for monitoring foreigners in Russia, with its first section, or DKRO-1, the subdivision responsible for Americans and Canadians.

“The DKRO never misses an opportunity if it presents itself against the U.S., the main enemy,” said Andrei Soldatov, a Russian security analyst who has spent years studying the unit. “They are the crème-de-la-crème of the FSB.”

. . . .

This article is based on dozens of interviews with senior diplomats and security officials in Europe and the U.S., Americans previously jailed in Russia and their families, and independent Russian journalists and security analysts who have fled the country. Information also was drawn from public court proceedings and leaked DKRO memos, which were authenticated by former Russian intelligence officers and their Western counterparts. Gershkovich’s lawyers in Russia declined to comment.

“They’re very, very smart on the America target. They’ve been doing this a long time. They know us extremely well,” said Dan Hoffman, a former Central Intelligence Agency station chief in Moscow, about DKRO. “They do their job extremely well, they’re ruthless about doing their job, and they’re not constrained by any resources.”

. . . .

On March 29, DKRO officers led an operation, hailed by the FSB as a success, that made Gershkovich, 31 years old, the first American reporter held on espionage charges in Russia since the Cold War, according to current and former officials and intelligence officers in the U.S. and its closest allies, as well as a former Russian intelligence officer familiar with the situation.

The Journal has vehemently denied the charge. The Biden administration has said that Gershkovich, who was detained during a reporting trip and was accredited to work as a journalist by Russia’s foreign ministry, has been “wrongfully detained.” Friday is his 100th day in captivity.

Putin received video briefings before and after the arrest from Vladislav Menshchikov, head of the FSB’s counterintelligence service, which oversees DKRO, according to Western officials and a former Russian security officer. During the meeting, Putin asked for details about the operation to detain Gershkovich.

DKRO also led the operation to arrest Whelan, in what U.S. officials, the former Marine’s lawyers and his family have said was an entrapment ploy involving a thumb-drive. The U.S. also considers him wrongfully detained.

When Moscow police held Reed, another former Marine, after a drunken night with friends, then claimed he had assaulted a policeman, officers from DKRO took over the case, according to the U.S. officials and Reed. Reed denied the assault and has said Russian law enforcement provided no credible evidence it had taken place. He was given a nine-year sentence, and eventually swapped for a Russian pilot in U.S. custody.

U.S. officials blame DKRO for cutting the power to the residence of current U.S. Ambassador to Moscow Lynne Tracy the night after her first meeting with Russian officials in January, and for trailing an embassy official’s car with a low-flying helicopter. U.S. diplomats routinely come home to find bookcases shifted around and jewelry missing, for which they have blamed DKRO officers.

More recently, a Russian drone followed a diplomat’s wife as she drove back to the embassy, unaware that the roof of her car had been defaced with tape in the shape of the letter Z, a Russian pro-war symbol. U.S. officials say they believe the group was behind that. U.S. officials strongly believe that the Russian police posted around Washington’s embassy in Moscow are DKRO officers in disguise.

American diplomats posted to Russia receive special training to avoid DKRO and other officers from the FSB and are given a set of guidelines informally known as “Moscow Rules.” The guidelines were updated recently to reflect the security services’ increasingly aggressive posture. One important rule, say the officials who helped craft them: “There are no coincidences.”

In May, the spy agency arrested a former U.S. consulate employee, Robert Shonov, and charged him with collaboration on a confidential basis with a foreign state or international or foreign organization. At the time of his arrest, the Russian national was working as a contractor to summarize newspaper articles for the State Department, which called the arrangement legal and the allegations against him “wholly without merit.” Like Gershkovich, Shonov is now in Moscow’s Lefortovo prison.

. . . .

“Today, the FSB is incredibly powerful and unaccountable,” said Boris Bondarev, a Russian diplomat who resigned and went into hiding shortly after the invasion of Ukraine. “Anyone can designate someone else as a foreign spy in order to get promoted. If you are an FSB officer and you want a quick promotion, you find some spies.”

DKRO officers occupy a privileged position within the security services and Russian society. Its predecessor was the so-called American Department of the KGB, formed in 1983 by a hero of Putin, Yuri Andropov, the longtime security chief who became Soviet leader.

. . . .

The unit’s officers are well-paid by Russian standards, receiving bonuses for successful operations, access to low-cost mortgages, stipends for unemployed spouses, preferential access to beachside resort towns and medical care at FSB clinics that are among Russia’s best.

The FSB emerged after the collapse of the Soviet Union subject to little legislative or judicial scrutiny. Since the February 2022 invasion of Ukraine, its official duty to expunge spies and dissidents has given it such expansive control over many aspects of Russian life that some security analysts now call Russia a counterintelligence state. In one of his final articles before his arrest, Gershkovich and colleagues, citing a former Russian intelligence officer and a person close to the defense ministry, reported that the spy agency had been the invasion’s main planner and was filtering updates from the front lines—roles usually reserved for the military.

In April, Russia passed new treason legislation that further empowered the FSB to squelch criticism of the war. In May, the spy agency, using wartime powers, said it would start to search homes without a court’s approval.

Putin has publicly berated his spy agencies several times since late 2022, after his so-called special military operation fell short of his expectations. Around that time, U.S. officials noticed an uptick in aggressive actions toward the few Americans still in Russia.

. . . .

“You need to significantly improve your work,” Putin told FSB leaders in a December speech to mark Security Agency Worker’s Day, a Russian holiday. “It is necessary to put a firm stop to the activities of foreign special services, and to promptly identify traitors, spies and diversionists.”

He repeated the admonishment during a visit to Lubyanka, the FSB headquarters, a month before Gershkovich’s arrest. 

Putin spokesman Dmitry Peskov in April denied that Putin had a role in authorizing the arrest. “It is not the president’s prerogative. The security services do that,” he said. “They are doing their job.”

Putin likes to be personally briefed on the FSB’s surveillance of Western reporters, said U.S. and former Russian officials. Leaked FSB documents from previous surveillance cases against foreign reporters show agency leaders along the chain of command adding penciled notes in the margins of formal memos, so that higher-ups can erase any comments that might upset the president. 

DKRO memos often begin with greetings punctuated by exclamation marks to indicate urgency and militaristic formality—a common style in the Kremlin bureaucracy—followed by meticulous notes about the movements of Westerners in Russia and the locals they meet.

“We ask you to identify an employee of the Ministry of Internal Affairs at his place of employment, interrogate him about the goals and nature of his relations with the British, and as a result, draw a conclusion,” read one 2006 memo reviewed by the Journal. 

The FSB has oversight of espionage trials, which are conducted in secret using specialist investigators and judges. During Putin’s 23 years in power, no espionage trial is known to have ended in acquittal.

Link to the rest at The Wall Street Journal

PG notes that the lives of 20th- and 21st-century dictators have often ended in premature death.

For those who manage to hang on and direct their nations’ affairs for more than a brief period, dictatorship tends to impoverish many of their people and leaves the economy substantially lagging those of nations with non-dictatorial political structures.

Populations living under dictatorships seldom produce world-class technological innovation or other kinds of creativity. Persistent anxiety and uncertainty about one’s standing with the extensive government agencies assigned to controlling the populace and rooting out enemies of the state shrivel the creative impulses of all but a minuscule percentage of the larger population.

Leaders who gain and hold their positions through thuggery snuff out creativity and economic dynamism among their people and inevitably fall behind nations with a stable tradition of democratically elected leaders.

Missed America – Attacking the right without asking about the left.

From The Hedgehog Review:

One day early in the pandemic, when schools and colleges first went online, my undergraduate students and I had just finished discussing an essay on the rise and decline of the innovative and powerful Comanche empire. I logged off and walked downstairs, where my elementary school-aged child was sitting at the dining table. “What did you learn in school today?” I asked, as I always do. He recounted to me—not in these exact words, of course—that North America had been an Edenic paradise before the Europeans arrived. I was shocked. This was the racist myth of the noble savage repackaged by the antiracist left. In reality, Native Americans did not need Europeans to introduce them to warfare, imperialism, slavery, or violence. This does not diminish the significant impact European pathogens and ambitions had on Native American polities. But to teach such distortive myths about the past? That’s the kind of thing historians should be upset about.

So imagine my surprise when I opened Princeton historians Kevin Kruse and Julian Zelizer’s new edited volume on contemporary historical myths and found no essay—not a single one!—that challenged myths that came from the left. The editors acknowledge “bipartisan” myths, but with a few exceptions, such as David A. Bell’s essay on American exceptionalism and Akhil Reed Amar’s on the Founding, the contributors focus on myths from the right. On their own, many of the essays, written by some of our best historians, are insightful. Collectively, they reveal the challenges facing the historical profession—my profession. When it becomes an axiom that truth comes from the left and lies from the right, something is amiss. When all the bad things America did are true, but none of the good things, something is definitely amiss.

The editors attribute the spread of “right-wing myths”—they do not distinguish analytically between myths and lies—to two causes: the “conservative media ecosystem” and the “devolution of the Republican Party’s commitment to truth.” So far, so good. Many prominent Republicans embraced and propagated former president Donald Trump’s lies about a stolen election. Even after it was revealed that Fox News commentators, including Tucker Carlson, knew that they were lying to their viewers, Speaker of the House Kevin McCarthy entrusted Carlson with exclusive access to video footage of the January 6 storming of the Capitol. Such brazen disregard for truth threatens the basic norms and principles of American democracy.

Or does it? To believe that Republican lies threaten our democracy, you also have to believe that our basic norms and principles are worth defending. But why sustain something as corrupt as American democracy? In her contribution to Myth America, Kathleen Belew, professor of history at Northwestern University, condemns those who proclaim that the events of January 6 do not reflect who we are as a country. She argues instead that this is “exactly who we are,” and a careful examination of the white nationalism and violence in our history will prove it. Belew simply inverts the story: White nationalists—including the Ku Klux Klan—embody the true America. Any story suggesting that we Americans are something better, or even that we have ideals that should inspire us to be better, is naive and false.

Like Belew, other contributors to Myth America write in absolutes. There is little that is tentative in this volume. The United States is an empire. American exceptionalism is a lie. The United States is xenophobic. There is no complexity. The world is divided into right and wrong, true and false, left and right. There is a lot of either/or but not much both/and. We find few good people doing bad things, much less flawed people achieving good things. There is almost no engagement with competing scholarly perspectives.

There are many missed opportunities. For example, in his essay on American exceptionalism, David Bell notes in passing that “the more progressive that Americans are in their politics, the more likely they are to see America as exceptional, if at all, in large part because of the harm it has done: the treatment of indigenous peoples, slavery, US foreign policy in the twentieth century, and contemporary inequality and racism.” Why leave this as an aside? Why not devote some space in the volume—especially given our public controversies over how we should teach American history—to the dangers posed by exceptionalist narratives from the left?

Myth America’s editors rightly worry that our divisions over history are caused by “unmooring our debates from some shared understanding of the facts.” They argue that “this shift has been driven by the rise of a new generation of amateur historians who, lacking any training in the field, or familiarity with its norms, have felt freer to write a history that begins with its conclusions and works backwards to find—or invent, if need be—some sort of evidence that will seem to support it.” They condemn the “cottage industry on the right” with “partisan authors producing a partisan version of the past to please partisan audiences.” I think that they are referring to the best-selling publications of journalists such as Bill O’Reilly, but might the same words also apply to other amateur historians such as journalist Nikole Hannah-Jones, the primary editor of and contributor to The 1619 Project?

Myth America does not engage with The 1619 Project because, the editors write, there has already been much public debate, but the American Historical Review, the nation’s most exclusive and influential journal of academic history, disagrees. In its December 2022 issue the AHR published an eighty-two-page forum involving nineteen of the country’s most prominent historians. In it, only a few contributors offer truly critical assessments. Speaking truth to power is hard, and the power that most shapes historians’ careers is our reputation among our colleagues. Do “we” want to risk sounding like “them”? Whose side are you on? Instead, many contributors to the forum take the easy path, accusing The 1619 Project of not being radical enough. For instance, it did not fully account for the displacement of indigenous people. Worse, they argue, Hannah-Jones has not freed herself from American exceptionalism because she hopes that the sacrifices made by generations of black Americans might be redemptive and let Americans, in Hannah-Jones’s words, “finally, live up to the magnificent ideals upon which we were founded.”

It is unusual for historians to be so gentle (just read the book review section of any journal or sit in on one of our graduate seminars). Perhaps Cornell historian Sandra Greene, in her contribution to the forum, explained why: “The publication of The 1619 Project is so important, despite its flaws,” which include “factual errors” and “several chapters [that] simplify to the point of distortion.” Important how? Politically. For Greene, The 1619 Project is “a necessary book” and its flaws “should not be used as an excuse to deny the reality that slavery and racism have influenced every aspect of US history.” But one need not deny slavery’s and racism’s historical significance to ask whether historians should defend public narratives that simplify to the point of distortion—what Kruse and Zelizer call myths.

Fortunately, most Americans have a much more nuanced understanding of American history than professional historians (or their most vocal right-wing opponents). Most Americans recognize that the past is complicated. According to an American Historical Association poll, 78 percent of Democrats and 74 percent of Republicans agree that students should learn painful history even when it makes them uncomfortable. According to another recent poll by the organization More in Common, 95 percent of Democrats and 91 percent of Republicans agree that Americans “have made incredible achievements and ugly errors.” Indeed, despite what one learns in the headlines, 87 percent of Democrats believe Washington and Lincoln should be admired for their role in American history and 83 percent of Republicans agree that all American students should learn about slavery, Jim Crow, and segregation. In other words, we Americans know that we have much to atone for in our past, but also much to celebrate. Americans understand that we contain multitudes. It should give historians pause when the common sense of ordinary American people shows more appreciation for historical complexity than trained experts do.

After reading Myth America and the AHR forum, one can understand why Republicans have become so distrustful of professors and have proposed dangerous policies that threaten academic freedom, such as weakening tenure or banning entire fields of study. We historians would like to say that it’s because we speak truth to power, but perhaps the truth is that we are afraid to do so when it endangers our reputations or politics. I’m nervous just writing this review.

Link to the rest at The Hedgehog Review

Bogie and Bacall

From The Wall Street Journal:

Hollywood is famous for successfully pairing acting couples, some “married” on screen (Greer Garson and Walter Pidgeon), some musical (Fred Astaire and Ginger Rogers), and some who became involved both off-screen and on (Katharine Hepburn and Spencer Tracy). The gold standard of the on-screen romance that becomes an off-screen love affair is the one that contains a good lesson on how to whistle: Humphrey Bogart and Lauren Bacall, who first starred together in 1944’s “To Have and Have Not.” Their relationship was unexpected and unlikely but ultimately enduring and finally legendary, which is why nearly 80 years later William J. Mann has published “Bogie and Bacall: The Surprising Story of Hollywood’s Greatest Love Affair.”

The author of several books on film, including a well-researched biography of gay silent film star William Haines, Mr. Mann clearly states his purpose regarding Bogart and Bacall: “to trace myths back to their origins and to draw connections between what was said at the start of their careers and what was said later.” He puts the famous couple under an informed scrutiny, giving the full background of both stars before they met and questioning everything they did after. He pins down every rumor or error connected to their histories: Bogart’s naval service (he saw no action), the origin of his famous lip scar, the legend of Bacall’s discovery and arrival in Hollywood, their encounters with the House Un-American Activities Committee, his part in the original Rat Pack, her infatuation with presidential candidate Adlai Stevenson, and so on. He was a child of wealth, a heavy drinker, “a drifter and idler” who bungled into acting. She was the only child of an impoverished single mother, and she knew from the beginning that she wanted to be a star.

Humphrey DeForest Bogart (original nickname: Hump) was born on Christmas Day in 1899. “I got cheated out of a birthday,” he always crabbed. His father was a New York society doctor and his mother a successful commercial artist who used him as a baby model. Bogart claimed, “There was no affection in my family, ever.” In his youth, he made a mess of everything, including boarding school, early jobs and his World War I service. After Bogart’s discharge in 1919, the Broadway producer William A. Brady (father of his best friend) took pity on the hapless 20-year-old and gave Bogart a nonacting job. From that day forward, Bogart never left show business. Mr. Mann calls Brady “the most influential figure of Humphrey’s early life.”

. . . .

Bogart’s stardom was hard-earned, but it never deserted him after it arrived, born of such films as “High Sierra” and “The Maltese Falcon” in 1941 and “Casablanca” in 1942. Today Bogart ranks as the American Film Institute’s No. 1 most popular actor in film history. (Bacall is No. 20 on the women’s list.)

Her road to fame was smoother. Born in the Bronx in 1924 as Betty Joan Perske, she was determined to become successful and moved quickly. Mr. Mann shrewdly points out that “she wondered when . . . not if” success would arrive, already impatient by age 13. In 1941, at 17, she started modeling. Mr. Mann describes her as “savvy, confident, and resourceful,” adding she was also “a fawning young woman who was drawn to older men and had already proven her ability to charm them.”

After Bacall posed for an issue of Harper’s Bazaar, director Howard Hawks invited her to Hollywood to take a screen test. He was not impressed: “She had a high nasal voice and no training whatsoever.” In October 1943 he took her to the set of “Passage to Marseille” and introduced Bogart. He was 44, 5-foot-8, a big star and married. She was 19, 5-foot-9, a nobody and single. They said hello and shook hands. Bacall said: “There was no clap of thunder, no lightning bolt.” Not yet.

. . . .

“To Have and Have Not” began filming in March 1944. Nobody expected much from it, but the atmosphere on the set began to crackle. What happened can be seen on the screen. Bacall’s lack of experience and minimum of talent are overcome by her casual confidence, unusual looks and an insolent, slightly sullen manner. She’s fresh and different, and what Bogart sees in her is in his eyes and his amused little smile.

It became a short story: They met in 1943, filmed in 1944, married in 1945 (after Bogart’s divorce) and remained together until Bogart’s death of esophageal cancer on Jan. 14, 1957. (“Goodbye kid,” he said to her.) He died with one Oscar (for 1951’s “The African Queen”), 75 films, two children with Bacall and a marriage that had lasted nearly 12 years. (In terms of Hollywood unions, that’s a lifetime.)

Link to the rest at The Wall Street Journal

‘Life, Liberty, and the Pursuit of Happiness’ Review: America’s British Creed

From The Wall Street Journal:

The youth of the American republic is one of its oldest traditions. Its unique origins will always make it younger than any other nation. Yet the United States is also the world’s oldest democracy. Britain in the time of George III was a liberal monarchy, but Britain democratized only by degrees in the 19th century. France was neither liberal nor democratic before the revolution of 1789, and the French are now on their fifth republic. The American ideal of democratic self-governance looks ever more exceptional as it creaks toward its 250th birthday.

Britain has a kind of old-fashioned pseudo-constitution: an accumulation of legal precedent and patchwork legislation, standing on unwritten assumptions and topped by a hollow crown. Americans were the first to spell out their social contract and specify the rights of individuals in plain English. But what did the magic words of the Declaration of Independence—“life, liberty, and the pursuit of happiness”—mean to their authors?

History is best written by the losers. In “Life, Liberty, and the Pursuit of Happiness: Britain and the American Dream,” Peter Moore, a historian who teaches at Oxford, shows how Britain exported its highest ideals to the Americans who rejected it.

Mr. Moore breaks the American creed into three sections and examines each in context. “Life” explores how Benjamin Franklin embodied colonial intellectual potential in the 1740s, and how he developed in London in the 1750s and 1760s. “Liberty” shows how the London rabble-rouser John Wilkes catalyzed the politics of liberty in the 1760s, and why he resonated so loudly in the Colonies. “Happiness” explains what the Enlightenment blend of action and emotion meant in England in the early 1770s, and how Americans understood it on the cusp of their reinvention.

Bible reading made colonial Americans perhaps the most literate population on the planet, but the “life” of the American mind was rooted in London. In 1740, Philadelphia was the Colonies’ leading city, with a modern street grid and a handy location on the post road between Boston and Charleston, but its population of 10,000 was half that of Bristol in England. London’s coffee-house culture, and periodicals such as Addison and Steele’s short-lived Spectator, were the templates for Benjamin Franklin’s self-improving “Junto” book club, his Pennsylvania Gazette, and the Almanack that he published under the pseudonym Richard Saunders.

All American roads led to London, and back. A London printer, William Strahan, supplied British news for the Pennsylvania Gazette. Strahan’s protégé, David Hall, emigrated to Philadelphia and worked in Franklin’s print shop. In 1747, Franklin retired from trade, passed the shop to Hall, and commissioned his “coming-out” portrait as a gentleman. Franklin’s scientific studies were not just an expression of practical polymathy. England’s aristocracy of the mind were fascinated by science. When Franklin went to London in the 1750s, his electrical speculations were his calling card.

Meanwhile in London, Strahan was printing Samuel Johnson’s “Dictionary” in installments. Johnson was writing his own one-man periodical, the Rambler. Franklin launched Johnson in America, publishing excerpts in “Poor Richard’s Almanack.” Though Strahan linked the leading minds of American and British letters, Franklin and Johnson’s “division of perspectives” anticipated the parting of imperial ways. Franklin presented himself carefully, playing the “Gentleman in Philadelphia” for his London correspondents, just as he would later play the noble savage for Parisian admirers during the American revolution. Johnson was a tic-ridden social bumbler. Franklin was irreligious but believed in progress. Johnson, a prayerful Anglican, thought that “all change is of itself an evil.”

Mr. Moore describes their differences in the 1750s as “liberalism against conservatism,” but neither of those terms existed in those happy days before everyone had an “ideology.” The only word that made the king and his ministers “sit up and think hard about America,” Mr. Moore writes, was “France,” and that made the colonists want “more of Britain than less of it.” The Seven Years’ War (1756-1763) brought London and the colonists together, but the subsequent tax burden demonstrated how unequal the relationship was. Americans began to sour on the distant mother country, especially after George III and his ministers tried to ruin John Wilkes.

Link to the rest at The Wall Street Journal

Ukrainian Writer and Activist Victoria Amelina

From Publishing Perspectives:

As Publishing Perspectives readers will recall, when Ukraine’s Victoria Amelina gave us her thoughts on the slain children’s author and illustrator Volodymyr Vakulenko, she said, “Everyone in Ukraine, including Ukrainian writers, keeps losing their loved ones.”

Now, Amelina herself has been lost. She died on Saturday (July 1) of injuries sustained in the June 27 Russian missile strike on the pizza restaurant in the eastern city of Kramatorsk. Victoria Amelina was 37.

In the spring, she had made the trip to Lillehammer to be at the World Expression Forum, WEXFO, on May 22 and accept the International Publishers Association‘s 2023 IPA Prix Voltaire Special Award for Vakulenko. One of the things she told Publishing Perspectives about the slain children’s author was that “Vakulenko believed we are to make history. He always responded to the challenges of his time.”

Today (July 3), the IPA’s offices in Geneva have reported that the Prix Voltaire Special Award now honors Amelina as well as Vakulenko. In a tweet on May 28, Amelina announced that she’d delivered the IPA’s special Prix Voltaire to Vakulenko’s mother.

. . . .

Iryna Baturevych at Ukraine’s publishing-industry news medium Chytomo writes to us, “We are shocked. [Amelina] has a little son, almost the same age as my son. He will be 12 in July. Victoria was courageous.”

As Chytomo’s article notes, Amelina was working with a watchdog organization called Truth Hounds, which monitors and documents details of potential war crimes.

Reported today (July 3) by CNN’s Svitlana Vlasova, Claudia Rebaza, Sahar Akbarzai, and Florencia Trucco, Amelina has become the 13th person now known to have died from that attack in Kramatorsk, which is close to the front lines in the Donetsk province. The attack was timed to a particularly busy moment, when the Ria Lounge near Vasyl Stus Street was crowded with evening diners. At least 61 people are reported to have been wounded when what analysts say was a Russian short-range ballistic missile called an Iskander hit the restaurant.

In the BBC’s write-up, George Wright reports that Amelina’s first English-language nonfiction book, War and Justice Diary: Looking at Women Looking at War, is expected to be published, although no time frame for that release is mentioned.

Link to the rest at Publishing Perspectives

The Supreme Court Declares Independence

From The Wall Street Journal:

Fourth of July celebrations arrive with special resonance for conservatives this year. It is clear now that we are in the throes of a full-scale American cultural counterrevolution, propelled by rising popular opposition to the coercive orthodoxies of a hegemonic left and enforced by a string of impeccable decisions from a Supreme Court intent on reviving the spirit of 1776. Spectacular displays of pyrotechnics from an endangered establishment, increasingly hysterical at the dawning realization of its imminent overthrow, are as entertaining—and ultimately harmless—as any you will witness in the night skies this holiday.

. . . .

The popular rebellion against the cultural left that has seized so many of the institutions of American life has enjoyed mixed success at the political level. But since that hegemony was established in large part through half a century of grotesque judicial overreach, it was likely to be truly overturned only by a judiciary that would finally move to substitute restraint for activism.

With last week’s timely, pre-Independence Day succession of decisions by the court we can see better than ever that the counterrevolution is advancing apace. In Biden v. Nebraska, Students for Fair Admissions v. Harvard, and 303 Creative v. Elenis, a solid majority struck solid blows for the principles and values that helped create the U.S. in the first place.

In the three cases—respectively over the Biden administration’s student debt cancellation plan, racial preferences in college applications, and free-speech protections in commercial interactions—the judgments rescinded the usurpation by an expansive executive of the power of the purse, restored the principle of merit over group membership as a key determinant of individual opportunity, and reaffirmed a citizen’s right not to be compelled to endorse ideas with which she disagrees.

Note the common prefix in the principal verb in each of those subclauses, “re-.” One meaning is “back” or “backward.” But this is no reactionary backlash to the inevitable march of modernity, as most of the media, with predictable and prejudicial alarmism, calls it. These decisions, along with other critical rulings in this and the preceding court term, represent the necessary undoing of successive judicially authorized derogations of the most defining American principles—fairness, equality, freedom, the proper exercise and distribution of government powers.

The counterrevolution has been achieved through painstaking efforts by conservatives to recruit, develop and advance jurists of reliably originalist disposition, as well as through presidential nomination—by Donald Trump, especially—of well-qualified justices who wouldn’t, for a change, acquiesce to the left’s dominance.

But reading last week’s decisions, I would argue that the most important force working in this revolution’s favor—perhaps as it was in 1776—is the sheer intellectual weight of the argument.

Contrast the various majority opinions last week with the left’s dissents. The opinions of Chief Justice John Roberts and Justice Neil Gorsuch, writing for the majority in the three cases, are characterized by taut writing, unimpeachable logic, close adherence to argument, legal principles, facts and evidence, and detailed interpretation of existing case and statutory law. The fulcrum of the legal argument is—curiously enough—on the law as it was written, as it must be applied, not on some larger political or social objective.

With the exception of Justice Elena Kagan, the court’s members from the left write and speak with the apparently unchallengeable conviction that their role is not to apply the law but rather to make it, in the pursuit of some desired higher outcome—one that happens to conform to their ideological priors rather than to any constitutionally mandated principle or process.

In the process they adduce not legal reasoning, but political rhetoric of the crudest character and most clichéd language.

Justice Ketanji Brown Jackson, dissenting from the racial-preference ruling, tells us that “deeming race irrelevant in law does not make it so in life,” and that race still matters to the “lived experiences” of Americans. Justice Sonia Sotomayor misrepresents the decision in 303 Creative—which was, remember, about whether a business can be required to engage in a form of speech that violates its owner’s conscience—by saying the “symbolic effect of the decision is to mark gays and lesbians for second-class status.”

The left’s response to the reversal of its long success in making the court a second legislative branch of government is also telling. Instead of accepting, as limited-government, originalist conservatives did, the need for a long campaign to undo the hegemony of the rival philosophy, they want to short-circuit the process. This means protecting or restoring their authority by making radical institutional changes to the court or, failing that, by delegitimizing it, using a friendly media to impeach the reputation of justices they oppose with spurious allegations of impropriety.

Like almost all rebellions, the current American cultural counterrevolution is certain to face further and intensified resistance from the institutions and people whose dominance it threatens.

Link to the rest at The Wall Street Journal

Not necessarily to do with books, but a legal earthquake for US colleges and universities.

The Individualists

From The Wall Street Journal:

Most politically attuned Americans will have some idea of what libertarianism is. Some think of it as an embrace of “business” or “capitalism” in all its forms; others as the anything-goes morality of 1960s radicalism. Still others, coming nearer the truth, will know libertarians as the people who want government “out of the bedroom and the boardroom,” as the slogan has it. In fact, as we are reminded by John Tomasi and Matt Zwolinski in “The Individualists,” libertarianism is a bit of all these things. It is a wildly diverse and inveterately fractious political tradition whose adherents have taken opposite sides on nearly every important political question.

The book documents libertarian thought, from its origins in the second half of the 19th century until now, on an assortment of topics, including markets, poverty, racial justice and the international order. The authors themselves claim the libertarian label and write clearly and charitably about all factions of the philosophy.

What is libertarianism, anyway? The easy answer: an approach to politics that seeks to minimize state coercion and maximize individual liberty. That generalization, though, covers a multitude of disputes. Most libertarians support legal abortion, for example, and some oppose it, in both cases for reasons of individual sanctity. In the absence of any easy formulation for what all libertarians think, Messrs. Tomasi and Zwolinski propose six “markers”: property rights, individualism, free markets, skepticism of authority, negative liberties, and a belief that people are best left to order themselves spontaneously. Libertarians, the authors contend, keep all six principles in view at the same time. 

They divide libertarianism into three historical eras, each responding to particular threats to liberty. The “primordial”-era libertarians—Frédéric Bastiat (1801-1850) in France, Herbert Spencer (1820-1903) in Britain—formed their ideas in opposition to socialism. In 19th-century America, the great threat to liberty wasn’t socialism but slavery. Early American libertarians like the journalist Lysander Spooner (1808-1887) saw slavery “primarily through the lenses of authority and property rather than of race,” the authors write. “Libertarians condemned slavery as an unjust usurpation of individual sovereignty and a denial of the individual’s rightful entitlement to the fruits of their labor.” There was wisdom in understanding chattel slavery as theft—a definable crime—rather than as a form of the more nebulous sin of racism.

During the Cold War—the authors’ second era—libertarians tended to align with conservatives in opposition to central planning and devoted their attention mostly to economic subjects. It was an uneasy alliance. Libertarians and conservatives were both anticommunist on questions of personal liberty and economics, but the right favored military buildup and libertarians hated militarism. The two camps clashed on crime, drug legalization, abortion, obscenity on the airwaves and more. Already in 1969 the libertarian economist Murray Rothbard (1926-1995) urged his readers “to go, to split, to leave the conservative movement where it belongs.”

The alliance largely dissolved in the years after the fall of the Berlin Wall. Libertarians, quite as much as conservatives and liberals, experienced a crisis of identity. With socialism discredited everywhere, what held libertarianism together? In the 21st century, the movement in the U.S. has consisted of an assortment of competing, often disputatious intellectual cadres: anarchists, anarcho-capitalists, paleo-libertarians (right-wing), “liberaltarians” (left-wing) and many others.

Link to the rest at The Wall Street Journal

First Page Critique: Defining the Scope of Your Memoir

From Jane Friedman:

A summary of the work being critiqued

Title: When Did You Know?
Genre: memoir

Emily was adopted at birth, and I was privileged to be present when she was born. When she first emerged, I wondered if something was wrong, she didn’t cry right away. You’re worrying too much. But at six months old, she didn’t respond with recognition when I picked her up from day care, a flat affect. Later Emily had a choking episode that might have started with a seizure. This started a series of medical tests revealing a chromosomal abnormality, a seizure disorder, and autism spectrum disorder. By this time, we had also adopted her biological sister, Madison, also at birth, an energetic blond girl who later was diagnosed with ADHD.

The memoir shares the pains and joys of parenting these girls, addressing topics such as spirituality and autism, nutrition and weight issues, fatigue, behavior management, sibling rivalry, friendships and sexuality. This book includes resources and ideas of what helped me navigate the challenges of autism parenting.

First page of When Did You Know?

“Did you know she had autism when you adopted her?”

“No, I just thought she slept a lot. We were able to take her to movies and she’d sleep through them,” I said. People would look at us as we lugged her carrier into the cinema but she’d be silent all the way through.

But even from the day she was born I wondered. I also wondered about Amber, her biological mother. I eventually found out a lot about Emily, our daughter with autism and Amber, her biological mother.

“The baby’s coming, push Amber,” the doctor said at the foot of the bed. He was surrounded by nurses and medical students in blue scrubs.

“Come look,” he told me. He knew I was the adoptive mother.

I stood behind him as a blueish, brownish dome emerged.

“Push one more time, Amber,” they said.

I stood back. Amber grunted and cried out. That must hurt so much. The baby whooshed into the doctor’s gloved hands. He held her up.

“Is there only one?” Amber asked.

“Yes,” he answered.

Earlier while I waited with Amber during labor, she mentioned that a lady in the grocery store had commented it looked like she was having twins. I was surprised she thought twins were a real possibility and that no one put her fears to rest. I paused and wondered about Amber. Just like when we first met her a few days earlier.

Continue reading the first pages, with color coding and comments by editor Hattie Fletcher.

Dear Julia,

Thanks for sharing your work and the first pages of your memoir about parenting—certainly an important story for you and one that’s potentially incredibly useful to other parents who might find themselves in a similar situation.

My first big-picture observation is that it feels a bit hard to see the shape of the story from the materials you’ve submitted. That is, your summary describes the book as a memoir that “shares the pains and joys of parenting these girls …” but that’s potentially a pretty big and abstract story. Do you plan to focus on their very early years, or the time up until they’re 18 and (perhaps) leaving the nest? Do you want to focus primarily on a specific aspect or aspects of parenting—perhaps on your experiences seeking help from the medical community, or your challenges finding a peaceful rhythm as a family? Obviously, readers don’t want to start a book already knowing the ending (at least, not usually), but in the book pitch you want to define the scope of the story more specifically.

There’s no firm rule for how much time/story a memoir can cover. A quick look at some memoirs about parenting will show you many different approaches: Anne Lamott’s classic Operating Instructions is a journey through the first year of her son’s life; Mary Louise Kelly’s It. Goes. So. Fast. is framed around her son’s last year of high school. (A one-year narrative can make a tidy frame, indeed.) But many writers tackle longer arcs: Ron Suskind’s Life, Animated covers almost two decades of his family’s efforts to use Disney movies to help his autistic son engage as fully as possible with them and the rest of the world.

Once you have established the scope of your story and your overall narrative arc, then you can think about where to first enter the story and begin to introduce your characters/family. Maybe you’re telling the story of your young daughters from Emily’s birth to the day when (I’m making stuff up now), at the ages of twelve and fourteen, both girls climbed on a bus together for a week-long group wilderness adventure. Or maybe you’re telling the story from the day of Emily’s choking episode to her high school graduation lunch. Your goal, essentially, is to find a satisfying narrative arc that will take readers on a journey with you and that will provide—even if there’s a lot of mess along the way—some degree of resolution of a central question or tension.

That being the goal, I’d question whether Emily’s birth is the most effective starting point. Birth has, of course, the obvious advantage of being a very clear beginning—Day One! On the other hand, a purely chronological organization of material can sometimes feel tedious on the page. (First, Emily was born. … In her first week … When she was two months old … And then, when she was one…)

In fact, you’re already sort of building in a bit of that framework, by starting—even if only for a few lines—with a fast-forward in order to flash back to the birth. So, for a next draft, I would be inclined to start somewhere a little farther along. And then, after you’ve established some of the conflict/tension of the overall narrative, you could jump back to the time of Emily’s birth (and even before that, it seems), and look at it through the lens of the information you later learned.

Regardless of where in the larger work the birth scene ends up, when you do get there, it might be helpful to consider the pacing of the scene. There’s a balance, usually, between spending too much time in one scene (which, like a chronological organization to a book, can become tedious) and weaving in other elements, such as reflection, description, character development, etc. It seems to me that your first pages currently bounce around quite a bit between several scenes/times, in a way that feels a bit disorienting and jumpy. Focusing more on individual times might make it easier for readers to follow the story, and also open up a little space for you to go into more detail. I’ve color-coded your first pages to show the jumps in time visually. Until the long stretch at the end, a couple days after the birth, you can see there’s a jump every few paragraphs, more or less. It might make sense to consider grouping some of the color sections into bigger sections, whether or not in chronological order.

Link to the rest at Jane Friedman

‘The World’ Review: History Branched Out

From The Wall Street Journal:

You know you are in for quite a ride when an enormous tome of world history begins: “I am Enheduanna, let me speak to you!” An Akkadian high priestess of the moon god Nanna from the third millennium B.C., Enheduanna had been seized by raiders, likely raped, and then miraculously saved and restored to power. According to Simon Sebag Montefiore, Enheduanna was “the first woman whose words we can hear, the first named author, male or female, the first victim of sexual abuse who wrote about her experiences.”

This is not the history you learned in school.

“The World” tells the story of humanity through families, be they large or small, powerful or weak, rich or poor. It is a book for people who want to read about people. There’s little attention paid to impersonal forces. Readers interested in isms—feudalism, imperialism, capitalism, etc.—won’t find these subjects explicitly discussed in this book. Rather, the author addresses the faceless structures of human existence by writing about who advocated for and implemented them, and who benefited from or suffered under them. “The World” pulsates with the hundreds of human stories Mr. Montefiore brings to life in vivid, convincing fashion. Enheduanna has about a page dedicated to her life; other individuals have a bit more; some, a single paragraph. This is history as collective biography, a journey across almost two million years, from the appearance of Homo erectus in east Africa to the rise of Xi Jinping’s China.

Mr. Montefiore has been working up to this ambitious project over his career, from biographies of Prince Grigory Potemkin and Stalin, to a fine study of the Romanov dynasty, to a 3,000-year history of Jerusalem. That book was aptly subtitled “a biography” of the city, for Mr. Montefiore is a biographer at heart. Combining literary flair with keen insight into human psychology, he can evoke a person with a few choice words—“porcine tyrant,” his description of Belarus’s current leader, Alexander Lukashenko, nails its subject.

Among the many strengths of “The World” is its truly global perspective. This is an unabashedly multicultural history that refuses to privilege any particular perspective, be it geographic, cultural or ethnic. Africa warrants as much consideration as Europe, Asia as the Americas. Nor does the book forsake the lives of the common folk for kings and queens, tycoons and presidents. The focus on families allows for light to shine on women, children and others often ignored in our master narratives. In a tactic typical of his original approach, the author opts not to tell the familiar story of Sally Hemings and Thomas Jefferson, but instead focuses on Sally’s mother, Betty, a child of rape with a white father, a certain Capt. Hemings, who later tried to buy her and once tried to kidnap her.

. . . .

For the very powerful, family has often meant dynasty. For many others, it has meant intimacy, care and love. But enslaved persons, throughout human history, have been denied such essential goods. “Slavery shattered families,” Mr. Montefiore writes, “it was an anti-family institution.” Where enslaved families did exist, in Roman households or Islamic harems, for example, they “encompassed coercion without choice, and often outright rape.” His account in “The World” pays significant attention to such injustices and acknowledges the victims of the past. Mr. Montefiore tells us that among the first names ever recorded in writing were the enslaved persons En-pap X and Sukkalgir in Uruk roughly five thousand years ago.

At the same time, history shows that, along with being nests of succor, families can be “webs of struggle and cruelty.” The Chinese statesman Han Fei Tzu warned the monarch in the third century B.C., “Calamity will come to you from those you love.” This truth recognizes no constraints of class, power or wealth.

The relentless chronological march of Mr. Montefiore’s book is leavened, and given an aspect of suspense, by his habit of picking up the family stories of significant individuals long before they take center stage. The Kennedys are introduced through patriarch Joseph Kennedy, swanning about in 1920s Hollywood and cashing out before the crash. Barack Obama’s story begins with his father, a Kenyan economics student whose political patron, Tom Mboya, had met Sen. John F. Kennedy at Hyannisport, Mass., and made the case for foreign scholarships. “Mboya chose Obama, who left for Hawaii; Kennedy won the presidential election.” Donald Trump has long been “the personification of American illusion,” Mr. Montefiore remarks: a “bombastic bazooka of complex inferiority.” But he begins keeping an eye on the “quintessential American story” of the Trump family as far back as the 1880s.

The author is equally on target with his account of Vladimir Putin. He highlights Mr. Putin’s rapid transformation from an awkward, clumsy leader into a murderous authoritarian, notorious for his “gangsterish swagger.” Mr. Montefiore rightly rejects the argument that the West and NATO are to blame for Russia’s invasion of Ukraine. If the war amounts to a needless tragedy, it is not unexpected. After all, such acts of mass violence are nothing new. After decades of peace, as he puts it, “normal disorder has been resumed.”

Link to the rest at The Wall Street Journal

Soldiers Don’t Go Mad

From The Wall Street Journal:

Two of England’s finest poets of World War I—Siegfried Sassoon and Wilfred Owen—met in a mental hospital in Scotland in 1917. Craiglockhart War Hospital, near Edinburgh, is the subject of “Soldiers Don’t Go Mad,” Charles Glass’s brisk, rewarding account of the innovative doctors and their “neurasthenic” patients who suffered unprecedented psychological distress (and in unprecedented numbers) on the Western Front. By 1915, the second year of the war, over half a million officers and enlisted personnel were admitted to medical wards for “deafness, deaf-mutism, blindness, stammering, palsies, spasms, paraplegia, acute insomnia, and melancholia”—hallmarks of what at the time doctors termed “shell shock” or, as it has become known, post-traumatic stress disorder (PTSD). Modern warfare overwhelmed countless young soldiers: “For the first time in history, millions of men faced high-velocity bullets, artillery with previously unimaginable explosive power, modern mortar shells, aerial bombardment, poison gas, and flamethrowers designed to burn them alive,” as Mr. Glass, a former chief Middle East correspondent for ABC News, recounts.

Craiglockhart, originally known as the Edinburgh Hydropathic or “The Hydro,” opened as a hospital in October 1916 for “officers only.” Its château-like main building, elaborate gardens and sweeping lawns were more elite health club than mental ward. Low-impact activities available to patients—“carpentry, photography, debating, music, and writing”—may well have confirmed the suspicions of “most senior officers, including many Medical Corps physicians, [who] regarded shell shock as nothing other than malingering or cowardice that demanded not treatment, but punishment.” To the pioneering physicians at Craiglockhart, however, the damage that trench warfare inflicted on the psyche was painfully real, often giving rise to a soldier’s “trembling limbs, halting voice, and confused memory.”

Wilfred Owen, 24, had exhibited these very symptoms in France, after surviving the blast of a trench mortar shell and spending several days unconscious, sprawled amid the remains of a fellow officer. The Army Medical Board declared Second Lt. Owen unfit for duty and consigned him to Craiglockhart for treatment. His physician there, Arthur Brock, had developed a work-based approach to recovery he called “ergotherapy,” as a counter to the popular rest cure of “massage, isolation, and a milk diet.” Brock fostered activity and community, and, when Owen expressed an interest in literature, he “encouraged him to write poetry, essays, and articles” as part of his therapy. Owen took over editorship of the Hydra, the hospital’s literary journal, in which some of the most memorable poems of the war appeared, including those of his newest acquaintance, the 30-year-old Second Lt. Siegfried Sassoon.

Sassoon took a different route to Craiglockhart than Owen, with army politics playing a role as much as mental health. Sassoon had acquired the nickname “Mad Jack” for his forays into that part of the battlefield known as No Man’s Land. Enraged at the death of his training-camp roommate, David Cuthbert Thomas (“little Tommy”), Sassoon charged the enemy line for 18 days, with what some suspected was a death wish: “They say I am trying to get myself killed. Am I? I don’t know.”

But Sassoon’s raids on No Man’s Land—brave, unhinged or both—did not precipitate his review by the medical board. That followed, instead, from the outspoken officer’s criticism of the “political errors and insincerities for which the fighting men are being sacrificed.” Following a medical furlough, occasioned by a bullet wound to the shoulder, Sassoon refused to return to duty. Rather than court-martial the dissenter, who had been awarded the Military Cross for “conspicuous gallantry,” the board sent him to Craiglockhart. Was he in fact suffering from PTSD? His friend and fellow poet Robert Graves came to think so, though Sassoon’s doctor, the renowned psychiatrist W.H.R. Rivers, demurred. “What have I got then?” Sassoon asked Rivers, who laughingly replied, “Well, you appear to be suffering from an anti-war complex.”

Rivers was “a polymath with notable achievements in neurology, clinical psychiatry, medical research, anthropology, and linguistics,” and—even more than Sassoon and Owen—he is the protagonist of Mr. Glass’s account of Craiglockhart. He founded England’s first psychology laboratories in London and Cambridge, where he was a fellow of Saint John’s College. (A dynamic portrait of Rivers may also be found in Kay Redfield Jamison’s recent “Fires in the Dark: Healing the Unquiet Mind,” in which Rivers’s fascination with religion, ritual and myth is shown to have contributed importantly to his treatment of mental illness.) But the physician was also a soldier, and his brief was not only to cure patients but also to fortify them for return to combat, often with predictably dire outcomes. Though Rivers was devoted to Sassoon, who came to see him as his “father confessor,” Sassoon’s pacifism put Rivers in a difficult position.

Sassoon had been at Craiglockhart for three weeks before Owen worked up the courage to introduce himself. By way of entrée, he brought several copies of Sassoon’s collection “The Old Huntsman and Other Poems” for him to sign. They talked for a half hour, during which Owen expressed his admiration, and Sassoon concluded that he “had taken an instinctive liking to him and felt that I could talk freely.” Though both were homosexual, the two men came from completely different worlds. The aristocratic Sassoon, whom Owen described as “very tall and stately, with a fine firm chisel’d . . . head,” was educated at an upper-crust “public” school and Cambridge. Owen, the son of a railway inspector, attended a local “comprehensive” school and missed the first-class honors necessary for a scholarship to University College London.

Link to the rest at The Wall Street Journal

How Books Are Used to Perpetuate the Prison Industrial Complex

From Book Riot:

The prison industrial complex is a term you may have heard if you’ve looked into abolitionist thinking or learned about the contexts around social movements like Black Lives Matter. It’s a term that, as defined by abolition group Critical Resistance, describes “overlapping interests of government and industry that use surveillance, policing, and imprisonment as solutions to economic, social and political problems.” While the concept of the prison industrial complex covers a huge number of social structures, you’ll most frequently hear it used in discussions about mass incarceration, for-profit prisons, and how criminalising and imprisoning people benefits the rich and powerful (particularly politicians and CEOs) while doing nothing to tackle or prevent harm that takes place within society at large.

The term “prison industrial complex” was coined by activists as well as incarcerated people and their families. And a number of academics and writers have provided powerful critiques of the prison industrial complex — Angela Y. Davis with Are Prisons Obsolete?, Michelle Alexander with The New Jim Crow: Mass Incarceration in the Age of Colorblindness, Elizabeth Hinton with From the War on Poverty to the War on Crime: The Making of Mass Incarceration in America, and many less formalised writings from abolitionist groups and activists.

. . . .

There has been a great deal of writing and literature about the prison industrial complex — but are books one of the structures supporting the system itself?

. . . .

In my earlier article on copaganda in crime books, I explored how literature, particularly the detective genre, has often bolstered a pro-police social agenda. The same is true for the prison industrial complex, and pro-carceral justice structures. In detective fiction, prison for the villain is often a major part of a happy ending — although in many crime books, the conclusion involves the murderer dying in a showdown with the hero, rather than being incarcerated. Literature often portrays prison as largely unproblematic, the only issue being when innocent people are imprisoned. While miscarriages of justice are enormously harmful, the prison system also enacts enormous and disproportionate violence on people who have committed the crimes they are accused of; however, social attitudes to this are very different, with the refrain “if you can’t do the time, don’t do the crime” commonly thrown back at people who argue against carceral abuses.

. . . .

Throughout the majority of literature, prisons are represented as frightening but necessary institutions that largely keep the public safe, only harmful when people are incarcerated for a crime they didn’t commit. There is little focus on, or care for, the harm that prisons cause to people who have committed crimes, or their harm to society as a whole. The area of literature that has paid the greatest amount of attention to the harms of the prison industrial complex is the prison memoir, but, once again, the most successful examples of these are books by political prisoners, or people incarcerated because of racism or other forms of bigotry, such as Nelson Mandela or the Exonerated Five. These stories are incredibly important, but they also show how incarcerated people are generally only portrayed as sympathetic if the person is “not a real criminal,” but someone who has been done wrong by the system, instead of considering the possibility that the system of imprisonment is itself wrong. “Real criminals” don’t get the same kind of sympathy or humanisation in literature, and it is rarely suggested that the prison industrial complex itself may be the true villain.

Link to the rest at Book Riot

While PG does not characterize U.S. prisons as ideal surroundings for anyone, including prison employees or convicts, that doesn’t mean that prisons are a terrible idea or that anyone has found a better way of protecting people who don’t commit crimes from people who do.

When you’re living in a relatively safe world, where the chances of being injured or killed by anything other than old age or terminal illness are low, prison can seem cruel and overly strict. However, no one is more relieved than the victim of a serious crime to hear that the perpetrator has been locked up in prison so she/he can’t commit another crime for a good long while or retaliate against the victim for reporting the crime.

PG has visited people in prison in two roles – as a hired attorney and as a representative of the prisoner’s religion, e.g., making pastoral visits.

He has yet to meet anyone who is incarcerated whom PG didn’t think was a danger to his/her community and the perpetrator of one or more acts that harmed other people.

The primary purpose of a prison is to deter everyone from violating the law and to punish those who have done so. A secondary purpose is to protect society in general from a repetition of the bad acts of an individual who, absent punishment, is likely to repeat those bad acts to the detriment of others, either individually or collectively.

The Prison Industrial Complex is a manipulative, Marxist-style term designed to skew perceptions and emotions to the detriment of the underlying purpose of prisons – protecting those who don’t commit crimes from the actions of those who do. (See also, Military Industrial Complex)

Absent prior convictions, non-violent criminals tend to receive punishments lighter than prison: probation, supervised or unsupervised, either after they’ve served a period of incarceration or, often, with credit for time spent in jail prior to the hearing. Probation usually involves meeting with a state or federal probation officer on a regular basis and a condition that the probationer not commit another crime during probation.

Depending on the jurisdiction, supervised probation may require weekly, monthly or, sometimes, bi-monthly visits with a probation officer to discuss what the probationer has been doing – whether he/she is attending school or going to work – plus any other condition the sentencing judge may impose: attending Alcoholics Anonymous meetings weekly, spending some time helping a local charitable organization, etc.

Some prisoners finish their high school diplomas or college degrees while incarcerated. Prison wardens are generally quite pleased with someone who does this sort of thing, and it is a plus when the prisoner seeks early release for good behavior.

PG has yet to visit anyone in prison who does not acknowledge that he/she deserves to be there.

Virginia Woolf’s Forgotten Diary

From The Paris Review:

On August 3, 1917, Virginia Woolf wrote in her diary for the first time in two years—a small notebook, roughly the size of the palm of her hand. It was a Friday, the start of the bank holiday, and she had traveled from London to Asheham, her rented house in rural Sussex, with her husband, Leonard. For the first time in days, it had stopped raining, and so she “walked out from Lewes.” There were “men mending the wall & roof” of the house, and Will, the gardener, had “dug up the bed in front, leaving only one dahlia.” Finally, “bees in attic chimney.”

It is a stilted beginning, and yet with each entry, her diary gains in confidence. Soon, Woolf establishes a pattern. First, she notes the weather, and her walk—to the post, or to fetch the milk, or up onto the Downs. There, she takes down the number of mushrooms she finds—“almost a record find,” or “enough for a dish”—and of the insects she has seen: “3 perfect peacock butterflies, 1 silver washed frit; besides innumerable blues feeding on dung.” She notices butterflies in particular: painted ladies, clouded yellows, fritillaries, blues. She is blasé in her records of nature’s more gruesome sights—“the spine & red legs of a bird, just devoured by a hawk,” or a “chicken in a parcel, found dead in the nettles, head wrung off.” There is human violence, too. From the tops of the Downs, she listens to the guns as they sound from France, and watches German prisoners at work in the fields, who use “a great brown jug for their tea.” Home again, and she reports any visitors, or whether she has done gardening or reading or sewing. Lastly, she makes a note about rationing, taking stock of the larder: “eggs 2/9 doz. From Mrs Attfield,” or “sausages here come in.”

Though Woolf, then thirty-five, shared the lease of Asheham with her sister, the painter Vanessa Bell (who went there for weekend parties), for her, the house had always been a place for convalescence. Following her marriage to Leonard in 1912, she entered a long tunnel of illness—a series of breakdowns during which she refused to eat, talked wildly, and attempted suicide. She spent long periods at a nursing home in Twickenham before being brought to Asheham with a nurse to recover. At the house, Leonard presided over a strict routine, in which Virginia was permitted to write letters—“only to the end of the page, Mrs Woolf,” as she reported to her friend Margaret Llewelyn Davies—and to take short walks “in a kind of nightgown.” She had been too ill to pay much attention to the publication of her first novel, The Voyage Out, in 1915, or to take notice of the war. “Its very like living at the bottom of the sea being here,” she wrote to a friend in early 1914, as Bloomsbury scattered. “One sometimes hears rumours of what is going on overhead.”

In the writing about Woolf’s life, the wartime summers at Asheham tend to be disregarded. They are quickly overtaken by her time in London, the emergence of the Hogarth Press, and the radical new direction she took in her work, when her first novels—awkward set-pieces of Edwardian realism—would give way to the experimentalism of Jacob’s Room and Mrs. Dalloway. And yet during these summers, Woolf was at a threshold in her life and work. Her small diary is the most detailed account we have of her days during the summers of 1917 and 1918, when she was walking, reading, recovering, looking. It is a bridge between two periods in her work and also between illness and health, writing and not writing, looking and feeling. Unpacking each entry, we can see the richness of her daily life, the quiet repetition of her activities and pleasures. There is no shortage of drama: a puncture to her bicycle, a biting dog, the question of whether there will be enough sugar for jam. She rarely uses the unruly “I,” although occasionally we glimpse her, planting a bulb or leaving her mackintosh in a hedge. Mostly she records things she can see or hear or touch. Having been ill, she is nurturing a convalescent quality of attention, using her diary’s economical form, its domestic subject matter, to tether herself to the world. “Happiness is,” she writes later, in 1925, “to have a little string onto which things will attach themselves.” At Asheham, she strings one paragraph after another; a way of watching the days accrue. And as she recovers, things attach themselves: bicycles, rubber boots, dahlias, eggs.

. . . .

Between 1915 and her death in 1941, Woolf filled almost thirty notebooks with diary entries, beginning, at first, with a fairly self-conscious account of her daily life which developed, from Asheham onward, into an extraordinary, continuous record of form and feeling. Her diary was the place where she practiced writing—or would “do my scales,” as she described it in 1924—and in which her novels shaped themselves: the “escapade” of Orlando written at the height of her feelings for Vita Sackville-West (“I want to kick up my heels & be off”); the “playpoem” of The Waves, that “abstract mystical eyeless book,” which began life one summer’s evening in Sussex as “The Moths.” 

Link to the rest at The Paris Review

Vitality Rages Against the Machine

From The Wall Street Journal:

“All history is the history of longing,” Jackson Lears has written. In four major works, ranging across subjects as varied as advertising, gambling and the search for grace, his gaze has turned to one form of longing in particular: our desire to escape the iron cage. That metaphor, as used by Max Weber, illustrates our modern predicament. Progress has left us mired in bureaucracy, disenchanted by science and severed from the nonhuman world.

According to Mr. Lears, the period between the Civil War and World War I was when the cage snapped shut. “Rebirth of a Nation” (2009), his study of American life in those years, traced how the modern corporation organized everybody’s lives for maximum productivity, depriving us of our full, messy humanity. Our longing to regain our freedom peeks through all of Mr. Lears’s books, sometimes in unexpected places. The craps table might be unsavory and slot machines addictive, but in “Something for Nothing” (2003), his book about the culture of chance, Mr. Lears suggested that they are also useful tools for rebelling against “the modern utopian fantasy of the systematically productive life.” Bucking the system this way, bashing against the bars of our cage, satisfies a deep human craving, though not one that tourists think they’re going to Las Vegas to assuage.

Now, in his fifth book, this note of longing at last finds full voice. In “Animal Spirits: The American Pursuit of Vitality From Camp Meeting to Wall Street,” Mr. Lears offers a direct challenge to what he calls “the official conceptions of what America was (and is) all about”—which, to again evoke Weber, is the perspective of the guy holding the key to the cage. According to this positivist view, we are efficient workers doing efficient jobs in a world where nothing enchanted or spooky or even particularly irrational is going on. Mr. Lears unfolds, in abundant detail, another tradition. For generations, a few bold souls have thought, or preached, or simply lived out a defiant conviction: There is no cage.

. . . .

The huge-and-elusive-ness of Mr. Lears’s subject arrives before you even open the book. The phrase “animal spirits” is fiendishly tricky: an unstable compound of two unstable words. “Animal,” in this context, originally meant “animated,” as though a person received a spark or push from some invisible power. But in the last few centuries, the word has developed a different usage, referring to nonhuman creatures. “Spirit,” for its part, originally meant “breath” or “respiration.” But it, too, has taken on a new meaning, shedding its physiological origins to connote something ethereal: a soul.

Link to the rest at The Wall Street Journal

What Is Juneteenth? And How Did It Become a Federal Holiday?

For visitors from outside the United States: today is a new U.S. national holiday, Juneteenth.

From The Wall Street Journal:

Juneteenth is an annual holiday observing the end of slavery in the U.S. It marks the day (June 19, 1865) when news of emancipation reached people in the deepest parts of the former Confederacy in Galveston, Texas.

In 2021, it became the first new federal holiday created by Congress in nearly four decades. The bipartisan legislation was signed into law by President Biden on June 17, giving Juneteenth the same status as Memorial Day, Veterans Day, Martin Luther King Jr. Day and other federal holidays.

Celebrated for decades through family gatherings and events such as parades and public readings of the Emancipation Proclamation, the holiday received more national attention in recent years—in particular after the global protests sparked in 2020 by the deaths of George Floyd, Breonna Taylor, Ahmaud Arbery and Rayshard Brooks, as well as a national conversation to rethink policing in America. Amid calls for racial equity, more companies, including Nike, Twitter and Spotify Technology, moved to observe the holiday.

. . . .

The holiday, also known as Emancipation Day, Black Independence Day or Jubilee Day, recognizes the date when news of emancipation finally reached Galveston, on June 19, 1865.

Nearly two months after the end of the Civil War, Maj. Gen. Gordon Granger, along with more than 1,800 federal troops, arrived to take control of the state, confirming the freedom of the last remaining slaves in the deepest parts of the South.

Although the Emancipation Proclamation—an executive order declaring that “all persons held as slaves” would be free—was signed by President Abraham Lincoln in 1863, and Gen. Robert E. Lee’s surrender in Appomattox, Va., marked the end of the Civil War in April 1865, news spread slowly and often met resistance from plantation owners.

The 13th Amendment, enshrining a ban on slavery into the Constitution, was ratified in December 1865. In pockets of the country, however, enslavement of African-Americans continued for several years.

. . . .

Juneteenth is the first federal holiday to be created by Congress since 1983, when lawmakers designated the third Monday in January as Martin Luther King Jr. Day, in honor of the slain civil-rights leader.

Texas was the first state, in 1980, to declare Juneteenth a holiday. All 50 states, as well as the District of Columbia, now acknowledge or observe Juneteenth, according to the Congressional Research Service.

. . . .

The new federal holiday took effect immediately. Because the first observance fell on a Saturday, most federal employees were given the preceding Friday, June 18, 2021, off.

Many states scrambled that year to give some of their public employees the day off, and employers from Goldman Sachs and Bank of America to Stanley Black & Decker were among the organizations around the U.S. that quickly rolled out new holiday policies for workers, with some allowing people to take Friday off with mere hours of notice. Some universities, like Ohio State, canceled classes.

. . . .

Stock markets, which close for many but not all federal holidays, remained open on the first official federally recognized Juneteenth but closed in 2022. The New York Stock Exchange and Nasdaq will be closed on Monday in recognition of Juneteenth. Markets will reopen Tuesday.

Most banks will be closed, since the Federal Reserve observes Juneteenth as a holiday. The U.S. Postal Service will also close in observance of the holiday. Most stores, including grocery stores, gyms and other consumer-facing retailers, will be open. 

. . . .

Hundreds of official events take place across the U.S. and the world in celebration of Juneteenth. When the announcement of freedom finally reached Galveston in 1865, newly liberated African-Americans celebrated with prayer, dance and community feasts. The earliest observances of the holiday presented an occasion to bring together family members and recognize Black freedom by reading passages from the Emancipation Proclamation and holding religious services.

After the Covid-19 pandemic limited festivities in 2020, major Juneteenth celebrations, parades and festivals were set for 2021.

Many cities and municipalities will hold Juneteenth events again this year. The Atlanta Parade & Music Festival runs from Friday to Sunday at the city’s Centennial Olympic Park. In Washington, D.C., the National Museum of African American History and Culture has planned presentations, stories and exhibitions highlighting issues and the cause for celebrations around the holiday.

Link to the rest at The Wall Street Journal

“Power and Progress”: Technology and the New Leviathan

From The Wall Street Journal:

Daron Acemoglu and Simon Johnson have written a long, eloquent book arguing that technological progress is a decidedly mixed bag. They believe that the power of the state can and should be used to select the best of the goodies from the bag. The state, they argue, can do a better job than the market of selecting technologies and making investments to implement them.

Mr. Acemoglu is a prolific economist and a shoo-in for the Nobel Prize; his MIT colleague Mr. Johnson is an economist and professor of management. In “Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity” they claim that the billions of daily decisions by you and me—to venture on a new purchase or a new job or a new idea—do not “automatically” turn out optimally for ourselves or society. In particular, poor workers are not always helped by new technology. The invisible hand of human creativity and innovation, in the authors’ analysis, requires the wise guidance of the state.

This is a perspective many voters increasingly agree with—and politicians from Elizabeth Warren to Marco Rubio. We are children, bad children (viewed from the right) or sad children (viewed from the left). Bad or sad, as children we need to be taken in hand. Messrs. Acemoglu and Johnson warmly admire the U.S. Progressive Movement of the late 19th century as a model for their statism: experts taking child-citizens in hand.

The authors begin with the questionable assertion that the most prevalent attitude toward technology today is a heedless optimism. “Every day,” they write, “we hear . . . that we are heading relentlessly toward a better world, thanks to unprecedented advances in technology.” Their chapters then skip briskly through history—from the agricultural revolution of the neolithic era, to the industrial revolution of the 19th century, to the Western postwar economic expansion of the 20th century—seeking to show how at each turn new innovations tended to empower certain sections of society at the expense of others. The “power” that concerns them, in other words, is private power.

Since the 1920s, economists from John Maynard Keynes to Paul Samuelson to Joseph Stiglitz have been claiming, with increasing self-assurance though with surprisingly little evidence beyond the blackboard, that (1) private arrangements work poorly, (2) the state knows better, and (3) we therefore need more state. Messrs. Acemoglu and Johnson have long believed in this anti-liberal syllogism. Statism recommends a growing Leviathan, as Mr. Acemoglu argued equally eloquently in “Why Nations Fail,” a 2012 book with James Robinson.

We need, in other words, the legislation currently being pushed by left and right to try again the policies of antitrust, trade protection, minimum wage and, above all, subsidy for certain technologies. Messrs. Acemoglu and Johnson are especially eager to regulate digital technologies such as artificial intelligence. “Technology should be steered in a direction that best uses a workforce’s skills,” they write, “and education should . . . adapt to new skill requirements.” How the administrators of the Economic Development Administration at the Department of Commerce would know the new direction to steer, or the new skills required, remains a sacred mystery.

Choosing a path for a society and its economy is not the only role of Leviathan; distributing economic justice is equally important. “Government subsidies for developing more socially beneficial technologies,” the authors declare, “are one of the most powerful means of redirecting technology in a market economy.” Messrs. Acemoglu and Johnson regard the private economy as an inequality machine.

In former times, they write, “shared benefits appeared only when landowning and religious elites were not dominant enough to impose their vision and extract all the surplus from new technologies.” Today we need the state to use its powers “to induce the private sector to move away from excessive automation and surveillance, and toward more worker-friendly technologies.” Fear of surveillance is a major theme of the book; therefore “antitrust should be considered as a complementary tool to the more fundamental aim of redirecting technology away from automation, surveillance, data collection, and digital advertising.”

Link to the rest at The Wall Street Journal

PG says that, absent draconian government controls (which would snuff out modern economies), there have always been and will always be inequalities in wealth and power.

Government is never populated by saints, angels, or true prophets, just by the same type of people who occupy the management tiers of capitalism, small and large.

One difference is that a company, even a very large one, cannot force individuals to do what it wants. Capitalism, in its relatively unfettered form, must always work hard to attract customers lest another group of more intelligent people find a better way to win those customers over.

Governments can be, and regularly are, persuaded through various methods, legal and illegal, to favor one group of people or another, to create ingroups and outgroups. Ingroups support those at the heights (or depths) of current governments. Outgroups are the ones who want to change what governments are doing by kicking out the incumbents and replacing them with people who will make decisions differently.

If a significant portion of humankind were composed solely of saints and angels, such people would be suited to run a society from the top down, treating each individual equally and fairly and ensuring that individuals would be treated with unconditional kindness and equity.

Unfortunately, only a relatively small percentage of the general population even comes close to being saints and angels, and a large share of such people don’t aspire to be rulers or politicians, or the leaders of large business organizations.

In the OP, “government subsidies for developing more socially beneficial technologies” sounds like something straight out of the faculty lounge of a typical large university. Who decides what a “socially beneficial technology” is, especially when it is in its nascent stage? What happens if there are two candidates to provide a “socially beneficial technology,” and each wants to do this differently?

Do we really want to have a group of government officials deciding the Windows vs. Mac type of question? Which of the two is more socially beneficial?

The solution is to let individuals decide which one they prefer for whatever reason is most important to them right now. The faculty lounge solution will constantly be screwed up in some way for a whole bunch of people.

A Considerable Aura: On Adam Shatz’s “Writers and Missionaries”

From The Los Angeles Review of Books:

THE NEW COLLECTION from London Review of Books editor Adam Shatz, jointly published by the LRB and Verso, comprises mostly essays previously published in that magazine. And despite its title—Writers and Missionaries: Essays on the Radical Imagination—it is less the missionary function of writing than the value of that periodical that the book most clearly illuminates. The LRB was founded in 1979 when a lockout at The Times put The Times Literary Supplement on indefinite hiatus. Frank Kermode, who had written for The New York Review of Books since its founding in 1963, published an article in The Observer calling for something to stop the gap: and in due turn, the NYRB launched its London counterpart, with NYRB editor Karl Miller at the helm. But more essential still than Miller was his deputy editor, Mary-Kay Wilmers, who would later serve as the magazine’s editor for almost 30 years. From an immensely wealthy family, Wilmers had a childhood so cosmopolitan that for a long time, she’s said, she was more comfortable in French than in English.

Her worldliness was essential to the LRB’s direction, and so, in a very real sense, has been her wealth. The paper has never made a profit, and has been sustained by the Wilmers family trust, managed by Wilmers’s billionaire banker brother until his death in 2017. By January 2010, the LRB was £27 million in debt to the Wilmers trust, as The Times revealed that month; nevertheless, the trust appeared to have “no intention of […] seeking repayment of the loan in the near future.” As such, the LRB has always had money, and the freedom that it confers.

How has it used that freedom? Most obviously, it has no need for clickbait; where even historically respectable literary reviews have often lapsed into political alarmism and frenetic, occasionally pathetic, trend-chasing, the LRB continues to publish multithousand-word essays on real tennis, selfhood in medieval literature, and Victorian pets. It appeals to the idea that one ought to be interested and slightly conversant in, say—to take the most recent issue—the Revolutions of 1848, 16th-century Swiss social history, Epictetus, contemporary curation, and postindependence Nigeria. Scope aside, the pieces are also distinctly more opinionated than those in most book reviews: in one notorious incident of the last decade, the German English poet, critic, and translator Michael Hofmann called beloved Jewish novelist Stefan Zweig—who died by suicide in 1942 after fleeing the Holocaust—“the Pepsi of Austrian writing,” and the suicide note itself boring. Some called Hofmann’s piece offensive; no one, I think, called it boring.

And on a substantial level, the paper’s odd financial position has also conferred a certain political freedom. In the words of frequent contributor Alan Bennett—whom Wilmers has known since her Oxford days—“the LRB has maintained a consistently radical stance on politics and social affairs.” This brings us, happily, back to the title of Adam Shatz’s collection.

To be consistently radical has meant neither to be the most radical nor the most consistent. The LRB is certainly left-of-center, and has suffered the expected blowback for this—recall the uproar over Pankaj Mishra’s review of Niall Ferguson’s Civilization: The West and the Rest (2011), which led Ferguson to threaten a libel suit. But as is evident from the very first page of Shatz’s collection, it is not quite as in thrall to the suffocating pressures of conformity as its peers. This makes, sometimes, for a distinctly skeptical, irreverent attitude toward contemporary politics. Shatz’s introduction consists of a concise and quite touching defense of biographical criticism—and criticism itself—which appropriately begins with biography. In 1990, he writes, he arrived at Columbia University “with a vague and yet passionately held idea of becoming a writer.” Yet he soon learned, in his courses on French thought, about Roland Barthes’s 1967 essay “The Death of the Author,” and Michel Foucault’s own disavowal of authorship shortly thereafter. At this point, a small footnote states: “Today, of course, the idea that an author’s identity is irrelevant to an understanding of their work would be unfathomable to young people who are increasingly taught that an author’s identity is almost all that matters; but back then it exerted a considerable aura, at least in critical theory.”

. . . .

And throughout, the book is infused with life—to read it is almost an antidote to the cynicism that indeed does develop from too many book reviews obviously written as favors, or strategic plays on the part of the reviewer. Some of this can certainly be attributed to Shatz’s subjects being largely historical. The book is roughly divided into four sections: “Native Sons,” on Arab figures; “Equal in Paris,” on Black writers in Paris; “Signs Taken for Wonders,” on French figures from Claude Lévi-Strauss to Michel Houellebecq; and “Lessons of Darkness,” about World War II and more French individuals. And yet, only three of his subjects are living: Kamel Daoud, Houellebecq, and, at the time of writing, Fouad Ajami. The criticism, then, is almost entirely biographical, and at some remove (even with Houellebecq—about whom, in arguing that his 2015 novel Submission is not Islamophobic, Shatz has woven an almost mythical psychobiography).

And yet, what is his argument? Nonfiction books, in their quest for a cognizable marketing hook, often concoct a “thesis” in a somewhat limiting, formulaic way. It would be hard to say what the thesis of Shatz’s collection is, or even of one individual essay. As is common with LRB pieces, it is very hard (perhaps impossible) to detect where they shift between common opinion on the subject, an argument made by the reviewed text, and an argument made by Shatz himself; this was my experience even in cases where I had closely studied the authors in question.

But if the collection doesn’t contain a thesis, per se, it certainly displays some consistent tendencies. Politically, all the essays operate under the basic assumption that while “the Right” is hateful, wrong, and so on, “the Left” is certainly far from perfect, and it’s essentially good to poke fun at one’s idols and ideals (indeed, his criticisms of giants like Jacques Lacan and Jacques Derrida are virtuosic). And crucially, Shatz’s book adopts the basic assumption that philosophy, theory, and literature are inextricable from the lives and times of their authors, and that any interesting critical reaction is one that combines the two. In other words, it is basically “historicist.” Yet there is a third factor that, while never explicated, is unmistakable: Shatz, while mostly eschewing the “I,” makes little attempt at impartiality.

Link to the rest at The Los Angeles Review of Books

The OP caused PG to wonder whether there is any future for the classic, printed-on-paper literary magazine.

The New Makers of Modern Strategy

From The Wall Street Journal:

A group of generals is called a “glitter”; a group of historians an “argumentation.” There is no colorful group noun for academic analysts of strategy. Perhaps, like owls, they form a “college.” In “The New Makers of Modern Strategy,” Hal Brands, a professor of strategy at the Johns Hopkins University’s School of Advanced International Studies, gathers a college of 45 such experts. All are wise after the facts of their field, and each attempts the historian’s equivalent of the owl’s neck rotation—a sweep that, taking in past and present, looks to the future.

Strategists must prepare for the next war. “The New Makers of Modern Strategy” is the third collection to bear this title. The first was published in 1943, when applied strategy was a matter of life and death and the future of democratic states uncertain. “If strategic studies was a child of hot war,” Mr. Brands writes, “it matured during the Cold War.” The second “Makers of Modern Strategy,” published in 1986, absorbed the “nuclear revolution,” redefined the “relationship between force and diplomacy,” emphasized resources and the long haul, and examined the challenges of the Cold War’s hotter regions, such as irregular warfare and counterinsurgency.

This third edition, “The New Makers of Modern Strategy,” returns to those topics, and adds AI, drones, cyberwarfare and other developments altering the familiar patterns of conflict. America’s post-Cold War “holiday” from history is over, Mr. Brands writes, and great-power political competition is back. China is challenging the U.S. for hegemony, and other “revisionist actors” such as Russia and Iran seek to alter the international order. Strategies of “hybrid” warfare, which mixes civilian and military methods, expand the “gray zone” between peace and war. This collection is a handbook for a crowded, unstable world in which America’s leaders need “strategic discipline and insight.”

Lawrence Freedman opens the first section, “Foundations and Founders,” with a lucid definition of terms. Next, Walter Russell Mead examines the legacies of Thucydides and Polybius: “they integrated strategy, the art of winning wars, and statecraft, the art of building and leading states.” Toshi Yoshihara then pivots to Asia, with an essay on Sun Tzu’s “Art of War.” Sun Tzu, he writes, cannot be identified as “a historical figure in a specific time and setting,” and “The Art of War” was “not written by one author in a single act.” Sun Tzu is the Homer of Chinese strategy. Mr. Yoshihara surveys Chinese civilization and explains its military ethos, suggesting an underlying universal logic. Even so, it seems impossible to translate Sun Tzu’s elusive elixir of shi (described with kinetic metaphors about water, diving hawks and rolling boulders) into the American anthropology of “strategic culture.”

Machiavelli, the founder of modern political thought, also wrote an “Art of War.” Matthew Kroenig asks whether Machiavelli fits best into a “Western tradition” that seeks to use “overwhelming force on the enemy’s center of gravity in a decisive battle of annihilation,” or into the “Eastern tradition” epitomized by Sun Tzu, which “prioritizes deception and winning without fighting.” Mr. Kroenig concludes that Machiavelli’s amoral realism “may lean East.”

But the Western tradition includes such deceivers as Odysseus’ Trojan Horse and Joshua’s Hebrew spies. It also includes Machiavelli’s “full-throated defense of democracy” in “The Discourses on Livy”—a proposal to revive the Roman republic’s strategic culture in the modern state. That state-building meant harnessing military means to political ends: If this integration seems obvious to us, that is because modern Western strategy rediscovered the Roman way through the trials of war, and also its errors.

Samuel Huntington, summarizing the ideas of Clausewitz, once wrote that war aspires to be an “autonomous science” but functions as a “subordinate science.” Strategy is too important to be left to soldiers. It is also too complex to be left to politicians. Mr. Brands’s second and third sections examine the professionalization of the military in the age of early modern state-building, the integration of economic and political strategies, and the testing of these full-spectrum theories after 1914.

Veteran armchair strategists will know Napoleon Bonaparte, Alfred Thayer Mahan and J.F.C. Fuller, but they may be caught off-balance by Priya Satia writing about Gandhi’s “consummately coercive” passive aggression and S.C.M. Paine on Mao Zedong’s strategy of “nested war.” Ms. Paine’s account of how Mao fought three wars at once—nesting a Chinese civil war within a war with Japan, and the war with Japan within World War II—suggests parallels to the emerging Western understanding of strategy as multilevel and multidimensional. It also shows that Mao, like Eisenhower, was a largely unsung genius of grand strategy.

Link to the rest at The Wall Street Journal

Comb-Over No More: Why Men’s Hair Transplants Are Flourishing

Nothing to do with books, but many of the visitors to TPV are male.

From The Wall Street Journal:

JAMIE CONNORS’S hairline started betraying him in his mid-20s. Though the Brooklyn video editor, now 36, had flirted with the idea of getting a hair transplant, his follicle fallout never seemed bad enough to warrant such measures. And, besides, how could he take enough time off work to recover away from colleagues’ prying eyes?

That all changed during the pandemic. In early 2021, a photo of Mr. Connors made him realize his scalp situation was getting dire. And with no socializing on the books and WFH in full force, he could recuperate covertly on his sofa without wasting sick days. He booked a consultation with Dr. Benjamin Paul, a Manhattan hair surgeon, who recommended follicular unit extraction (FUE), a procedure that involves harvesting individual follicles from the back of the head and painstakingly replanting them in the front or the crown. It takes between four and eight hours and costs $11,000 to $15,000. “I figured I am going to do this right,” said Mr. Connors, who got his new hair several weeks after his initial consultation.

He is among a wave of men who capitalized on Covid-era downtime to acquire fresh shags. Hair transplants—perhaps the male equivalent of women’s pandemic face-lifts—are enjoying a healthy growth spurt. Upon returning to the office, you might find that your colleague has replaced his unconvincing comb-over with a mane that would make Jason Momoa jealous.

“We have seen a big increase in men seeking transplants in the last two years or so,” said Dr. Gary Linkov, a New York plastic surgeon. And many men, he added, are jetting off to medical-tourism spots like Turkey, Portugal and Panama for bushier new ’dos. According to market researcher Fortune Business Insights, the global hair-transplant market is predicted to reach $43.13 billion by 2026, up from $5.94 billion in 2018, a sevenfold surge.

Thinning locks have triggered male anxiety for eons. The Ancient Egyptians painted sparse crowns with a paste concocted from dates, donkey hooves and dog paws in a (charmingly doomed) effort to boost growth. In the 1990s, drugs such as Finasteride and Rogaine started providing hope; more recently, trendy startups like Hims and Nutrafol have launched pills and sprays with purported follicle-enhancing powers.

. . . .

Although transplants have been performed commercially since the ’50s, they were generally considered too extreme by most men. Various factors are shifting that perception. The pandemic was unkind to hairlines: According to the American Academy of Dermatology, Covid—and pandemic-related stress—caused hair shedding in some folks. And any thinning was magnified by the new Zoom routine: It’s harder to ignore those naked temples when you’re staring at yourself all day, said Dr. Paul Jarrod Frank, a New York cosmetic dermatologist.

The stigma associated with transplants—and other cosmetic procedures for men—is also lessening, said Dr. Frank. In 2020, actor Cheyenne Jackson documented his transplant journey on Instagram; designer Marc Jacobs has also discussed his reinvigorated thatch in recent years. But not everyone is happy to open up. Some transplant recipients would only speak to us anonymously. And Dr. Frank referenced A-listers who have seemingly enjoyed miraculous hair recoveries lately but have offered no explanation. Nonetheless, he said, regular guys are becoming inspired by “the results these celebrities are getting.”

Link to the rest at The Wall Street Journal

The Big Business of the British Empire

From The Wall Street Journal:

Those of us who went to school before our past was rewritten as a catalog of the White Man’s crimes were taught that empire—with all its vices and virtues—was built by monarchs and statesmen. In “Empire, Incorporated,” Philip J. Stern tells us that this picture, while not inaccurate, is quite incomplete.

British colonialism in particular, Mr. Stern says, was conceived by investors, creditors, entrepreneurs and, lest we forget, parvenus and embezzlers. This cast of men-on-the-make flourished alongside sovereigns and their ministers and produced what Mr. Stern calls “venture colonialism”—a form of overseas expansion that was driven by a belief that “the public business of empire was and had always been best done by private enterprise.”

The history of British colonialism is really the history of the joint-stock corporation. A novel strategy in the mid-16th century, this form of enterprise procured capital from an array of investors with ownership shares or profit-sharing and created a single legal entity, often granted special privileges. Among much else, the joint-stock corporation made undertakings on a global scale newly possible.

But what made joint stocks so appealing also made them “unsettling,” Mr. Stern writes, and a faction in Parliament worried about the clout of these companies in far-flung places. There was, no doubt, a class component to the apprehension. The original investors in the East India Co.—the pre-eminent joint stock, from its founding in 1600 to well into the 19th century—were all merchants, with only one aristocrat among them.

By the 1620s, the East India Co.’s demographics had changed, its successes attracting a posher class of investor. But the company ingeniously—and audaciously—refused to admit the king himself, “fearing the loss of independence it would entail,” Mr. Stern explains. The public justification for this apparent irreverence was that it was unseemly for the king to enter into commercial partnerships with his subjects.

“Empire, Incorporated” tells us how the joint-stock corporation shaped British colonialism. As a narrative, the author says, “it is like a novel that places an originally supporting character in the center of the story,” elevating fortune-hunters into fabled men. Robert Clive, that most infamous of the nabobs enriched by the East India Co. in the mid-18th century, and Sir Humphrey Gilbert, a mass killer of rebellious Irishmen who took possession of Newfoundland for the crown in 1583, were both VCs, or venture colonialists. As were Thomas Smythe, the first governor of the East India Co. and later treasurer of the Virginia Co., whose tobacco yielded great riches; and Sir Thomas Roe, who went as the ambassador of James I to the court of the Mughal Emperor Jahangir even as his true purpose was to secure a trade monopoly in India.

England’s “portfolio colonialism” came into existence through royal charters, by which the sovereign doled out juicy commercial advantages to those who petitioned for them. These plums ranged from exemptions from duties and taxes to the prerogative to claim territory overseas in the name of the crown (as Gilbert did in Newfoundland). The terms could be audacious, Mr. Stern observes, allowing companies to run all sorts of enterprises over “ill-defined geographic spaces insouciantly superimposed over indigenous sovereignty.” Breathtaking claims to territory or jurisdiction resulted in assertions of rights to “sacrosanct” private property that were enforceable in British courts. The charters redrew the maps of the world.

The first such charter was granted to the Muscovy Co. in 1555. Mr. Stern writes that the company effectively became “the English government over Anglo-Russian commerce” and, as a conduit of relations between England and Russia, exercised “de facto command over Anglo-Russian diplomacy.”

Link to the rest at The Wall Street Journal

The Secret History And Strange Future Of Charisma

From Noema:

In 1929, one of Germany’s national newspapers ran a picture story featuring globally influential people who, the headline proclaimed, “have become legends.” It included the former U.S. President Woodrow Wilson, the Russian revolutionary Vladimir Lenin and India’s anti-colonialist leader Mahatma Gandhi. Alongside them was a picture of a long-since-forgotten German poet. His name was Stefan George, but to those under his influence he was known as “Master.”

George was 61 years old that year, had no fixed abode and very little was known of his personal life and past. But that didn’t matter to his followers; to them he was something more than human: “a cosmic ego,” “a mind brooding upon its own being.” Against the backdrop of Weimar Germany — traumatized by postwar humiliation and the collapse of faith in traditional political and cultural institutions — George preached an alternate reality through books of poetry. His words swam in oceans of irrationalism: of pagan gods, ancient destinies and a “spiritual empire” he called “Secret Germany” bubbling beneath the surface of normal life. In essence, George dreamed of that terribly persistent political fantasy: a future inspired by the past. He wanted to make Germany great again.

George dazzled Germans on all sides of the political spectrum (although many, with regret, would later distance themselves). Walter Benjamin loitered for hours around the parks of Heidelberg that he knew the poet frequented, hoping to catch sight of him. “I am converting to Stefan George,” wrote a young Bertolt Brecht in his diary. The economist Kurt Singer declared in a letter to the philosopher Martin Buber: “No man today embodies the divine more purely and creatively than George.”

Max Weber, one of the founding fathers of sociology, met Stefan George in 1910 and immediately became curious. He didn’t buy George’s message — he felt he served “other gods” — but was fascinated by the bizarre hold he seemed to have over his followers. At a conference in Frankfurt, he described the “cult” that was growing around him as a “modern religious sect” that was united by what he described as “artistic world feelings.” In June that year, he wrote a letter to one of his students in which he described George as having “the traits of true greatness with others that almost verge on the grotesque,” and rekindled a particularly rare word to capture what he was witnessing: charisma.

At the time, charisma was an obscure religious concept used mostly in the depths of Christian theology. It had featured almost 2,000 years earlier in the New Testament writings of Paul to describe figures like Jesus and Moses who’d been imbued with God’s power or grace. Paul had borrowed it from the Ancient Greek word “charis,” which more generally denoted someone blessed with the gift of grace. Weber thought charisma shouldn’t be restricted to the early days of Christianity, but rather was a concept that explained a far wider social phenomenon, and he would use it more than a thousand times in his writings. He saw charisma echoing throughout culture and politics, past and present, and especially loudly in the life of Stefan George.

I knew: This man is doing me violence — but I was no longer strong enough. I kissed the hand he offered and with choking voice uttered: ‘Master, what shall I do?’

— Ernst Glöckner

It certainly helped that George was striking to look at: eerily tall with pale blueish-white skin and a strong, bony face. His sunken eyes held deep blue irises and his hair, a big white mop, was always combed backward. He often dressed in long priest-like frock coats, and not one photo ever shows him smiling. At dimly lit and exclusive readings, he recited his poems in a chant-like style with a deep and commanding voice. He despised the democracy of Weimar Germany, cursed the rationality and soullessness of modernity and blamed capitalism for the destruction of social and private life. Instead, years before Adolf Hitler and the Nazis came to power, he foresaw a violent reckoning that would result in the rise of a messianic “fuhrer” and a “new reich.”

Many were immediately entranced by George, others unnerved. As the Notre Dame historian Robert Norton described in his book “Secret Germany,” Ernst Bertram was left haunted by their meeting — “a werewolf!” he wrote. Bertram’s partner, Ernst Glöckner, on the other hand, described his first encounter with George as “terrible, indescribable, blissful, vile … with many fine shivers of happiness, with as many glances into an infinite abyss.” Reflecting on how he was overcome by George’s force of personality, Glöckner wrote: “I knew: This man is doing me violence — but I was no longer strong enough. I kissed the hand he offered and with choking voice uttered: ‘Master, what shall I do?’”

As German democracy began to crumble under the pressure of rebellions and hyperinflation, George’s prophecy increased in potency. He became a craze among the educated youth, and a select few were chosen to join his inner circle of “disciples.” The George-Kreis or George Circle, as it came to be known, included eminent writers, poets and historians like Friedrich Gundolf, Ernst Kantorowicz, Max Kommerell, Ernst Morwitz and Friedrich Wolters; aristocrats like brothers Berthold, Alexander and Claus von Stauffenberg; and the pharmaceutical tycoon Robert Boehringer. These were some of the country’s most intellectually gifted young men. They were always young men, and attractive too — partly due to George’s misogynistic views, his homosexuality and his valorization of the male-bonding culture of Ancient Greece. 

Between 1916 and 1934, the George Circle published 18 books, many of which became national bestsellers. Most of them were carefully selected historical biographies of Germanic figures like Kaiser Frederick II, Goethe, Nietzsche and Leibniz, as well as others that George believed were part of the same spiritual empire: Shakespeare, Napoleon and Caesar. The books ditched the usual objectivity of historical biographies of the era in favor of scintillating depictions and ideological mythmaking. Their not-so-secret intention was to sculpt the future by peddling a revision of Germany’s history as one in which salvation and meaning were delivered to the people by the actions of heroic individuals.

In 1928, he published his final book of poetry, “Das Neue Reich” (“The New Reich”), and its vision established him as a kind of oracle for the German far right. Hitler and Heinrich Himmler pored over George Circle books, and Hermann Göring gave one as a present to Benito Mussolini. At book burnings, George’s work was cited as an example of literature worth holding onto; there was even talk of making him a poet laureate.

Link to the rest at Noema

‘The Forgotten Girls’ Review: The Friend Who Was Left Behind

From The Wall Street Journal:

When journalist Monica Potts came across research detailing a striking drop in life expectancy among the least-educated white Americans, she returned to her depressed rural hometown of Clinton, Ark., to investigate. She was especially interested in understanding the rise in midlife deaths among white women, but her focus promptly narrowed to one woman in particular: her childhood best friend, Darci Brawner.

In the years after high school, Ms. Potts had attended Bryn Mawr College, earned a master’s degree in journalism from Columbia University, and established a successful career as a reporter. (She currently covers politics for the website FiveThirtyEight.) Ms. Brawner, on the other hand, had failed to complete high school and developed addictions to methamphetamines and methadone, cycling in and out of prison and rehab while her mother and stepfather raised her two children.

The question driving Ms. Potts’s clear-eyed and tender debut, “The Forgotten Girls,” is why, given that she and her friend were both smart, ambitious and hell-bent on escaping their blighted hometown, one succeeded and the other failed. The author hopes that determining how their paths diverged will illuminate the drop in life expectancy, which is widely attributed to suicide, drug overdoses and alcohol-related fatalities—the so-called “deaths of despair” identified by economists Anne Case and Angus Deaton after their 2015 study of rising mortality. While Ms. Brawner does not belong to that sad demographic—hers is rather a life of despair—Ms. Potts wonders whether her friend’s repeated attempts to turn her life around are “just delaying the inevitable.”

Ms. Potts is well-positioned to explain her insular birthplace to outsiders. She retains strong ties to the community in the foothills of the Ozarks, and she didn’t merely helicopter in to report the book: She left the Washington, D.C., suburbs to move back permanently in 2017. People there trust her and speak to her candidly. They include not only Ms. Brawner, but other friends from the ’80s and ’90s, along with their parents and former teachers.

As its subtitle, “A Memoir of Friendship and Lost Promise in Rural America,” suggests, this book is as much the author’s story as a piece of reportage. Ms. Potts reconstructs events with the help of her teenage journals (she’s granted access to Ms. Brawner’s as well), and she considers ways she might have failed her friend. “When I began my investigation into what was happening to women like Darci,” she writes, “I didn’t realize how personal and emotional this journey would become.”

While “The Forgotten Girls” glancingly addresses larger forces like the disappearance of solid blue-collar jobs in Clinton and the desertion of the downtown business district, the narrative is rooted in the two women’s experiences. Readers might feel that the solution to the riddle of why one girl grew up to succeed and one to fail is hiding in plain sight: Ms. Potts, though poor like Ms. Brawner, came from a relatively stable family that prioritized education.

Though the author hesitates to assign blame to Ms. Brawner’s mother and stepfather, their parenting appears lax by any definition. Ms. Brawner’s home had a den with its own entrance to the street, and she and her older brother turned it into, in the author’s words, “a twenty-four-hour teenage clubhouse, complete with alcohol and, later, drugs.” Meth, in fact, was more easily accessible than alcohol.

In a town without much to do, and where smart kids were unlikely to be challenged by their schoolwork, Ms. Brawner began drinking and taking drugs in her early teens; she became sexually active at 14 and essentially had a live-in boyfriend by 16. Ms. Potts is critical of the town’s dominant evangelical churches, which stressed trust in God’s plan above all else. This, she argues, encouraged people to surrender control of their own lives. Ms. Brawner’s mother seemed to be a mere spectator to her daughter’s slow derailment. When Ms. Potts asks her how she handles problems, she replies, “Oh you know, I just give it up to God.”

Link to the rest at The Wall Street Journal

This story resonated with PG because he grew up under somewhat similar circumstances. Fortunately, there were no drug problems, but there were more than a few problems with alcoholism.

PG graduated as the valedictorian of a class of 22 students; his high school girlfriend was the salutatorian. The two of them were the only ones in the class who graduated from college. The #3 graduate went to nursing school for a couple of years. Most of the others didn’t apply to college, and the two or three who did dropped out well before completing a degree.

The schools in that tiny town closed for good a few years after PG graduated, and the students from the town and adjoining farmland were bused to a larger town nearby. The town’s population is now about two-thirds of what it was when PG left for good. Based on what PG has been able to determine from online research, there are a lot of abandoned houses in the town. The house where PG lived was torn down a few years after he left.

Some parts of rural America are thriving due to improvements in agricultural practices, but a great many of the towns are in stasis.

PG’s younger brother was a successful real estate broker in Iowa prior to his premature death. Iowa farmland is among the most productive in the world, thanks to a combination of very good soil and a climate well suited to growing corn, soybeans and similar midwestern staples.

PG’s brother recounted a conversation he frequently had with older farmers who had spent their entire lives farming. When he asked them what would happen to the farm once they were unable to handle the hard work of fertilizing, planting, cultivating, harvesting, etc., they would often respond that their children would take over the farm, which had remained in the family for 75-100 years.

“Well,” PG’s brother would reply with something like, “is your daughter who’s an ophthalmologist in Los Angeles going to come back to take over the farm, or will it be your son who’s a stockbroker in Manhattan?”

The farm had provided a good life for the family, but the kids had found something that interested them more than farming.

On one such occasion, PG’s brother sold a large family farm to Goldman Sachs, the New York financial firm, which recognized the asset value of the farmland and hired a farm manager to oversee its extensive Iowa holdings. The land was unlikely ever to revert to a family farm.

Generations

From The Wall Street Journal:

“Every generation blames the one before,” sang Mike + the Mechanics in their chart-topping 1988 song “The Living Years.” There’s a certain truth to that. It’s also true that every generation can’t help blaming the one that comes after. They’re lazy. They don’t know how to dress. They speak in strange slang. They’ve never heard of groups like Mike + the Mechanics.

In America, the generations seem to be engaged in a low-intensity forever war: Baby Boomers vs. the Silent Generation, Gen X vs. Gen Z, Millennials vs. Everybody. Jean Twenge, a psychology professor at San Diego State University, wants to broker a truce. She has made it her mission to spread peace and understanding among cohorts she likens to “squabbling siblings.” In books like “iGen” (2017) and “Generation Me” (2006), she has tracked the development of generational gaps and tried her best to bridge them. “The more we understand the perspective of different generations,” she writes, “the easier it is to see we’re all in this together.”

Are we though? An old theory has it that each generation adopts its group characteristics by way of the shared experience of “major events at impressionable ages,” as Ms. Twenge puts it in “Generations,” her latest book. The privation of the Great Depression and the national sacrifice of World War II instilled in the Silents, born in 1925-45, an urge to live stable, frugal lives. The idealism of the Baby Boomers (1946-64) was the product of the 1960s youth culture and the era-defining achievements of the civil-rights movement. The end of the Cold War gave Gen X (1965-79) its insouciant self-confidence. The 9/11 attacks and the financial collapse of 2007-09 shaped the fatalism common to Millennials (1980-94). The personality of Gen Z (1995-2012) isn’t fully developed yet, but the pandemic and digital media loom over everything they do.

Ms. Twenge doesn’t buy this theory. “History is not just a series of events,” she writes. “It’s also the ebb and flow of a culture and all that entails: technology, attitudes, beliefs, behavioral norms, diversity, prejudice, time use, education, family size, divorce.” She has her own theory: Technological change is the main driver of generational differences. Unlike wars, pandemics and economic cycles, she notes, “technological change is linear.” It moves toward ever more sophistication and convenience. It has the power to change things completely, making our lives “strikingly different from the lives of those in decades past.”

By “technology” she doesn’t just mean microchips and satellites. She means everything from air-conditioning to sanitation to birth control to architecture. The progressive development of technology shapes us, she writes, primarily by nudging us toward a greater degree of self-reliance—Ms. Twenge calls it “individualism”—and a “slower life trajectory.” Every generation has had the privilege of living “longer lives with less drudgery” than the lives of their parents and grandparents.

The “slow life” thesis may do more to explain the friction between the generations than anything else. Baby Boomers and Gen X’ers have a hard time understanding why Millennials and Gen Z’ers have so enthusiastically refused to grow up. Ms. Twenge’s explanation is that technology has arranged it so that they don’t have to. Young people can put off education, career, marriage and child-rearing in ways their parents and grandparents couldn’t. Labor-saving devices and longer life spans have given Millennials and Gen Z’ers the “priceless gift of time.” Why so many of them choose to use it watching cat videos and filming themselves dancing is one of life’s great riddles.

Link to the rest at The Wall Street Journal

An Extraordinary Mission to Find an American WWII Bomber Crew at the Bottom of the Pacific

Not necessarily about books and writing, but an interesting story about the credo of those who fight our wars.

Tennyson crew photo taken during stateside training, fall 1943. The pilot, 1st Lt. Herbert G. Tennyson, is second from left in the front row. The bombardier, 2nd Lt. Thomas V. Kelly Jr., is in the front row on the far right. Photographer S/Sgt. John W. Emmer, who wasn’t part of the crew but was assigned to their aircraft to document the mission, isn’t pictured. Credit: Courtesy of the Kelly family.

From The Wall Street Journal:

In the spring of 1944, Second Lt. Thomas V. Kelly Jr.’s mother received a one-page letter at her home in Livermore, Calif., informing her that her son had been killed in action. His plane had been hit by antiaircraft fire and disintegrated midair during a mission in New Guinea, his commander wrote.

“Unfortunately this is the only information we can furnish,” the letter read. 

Lt. Kelly and 10 other men were aboard the B-24 bomber when it was downed over the Pacific Ocean. Nearly eight decades later, remains recovered from a remote seabed more than 200 feet deep have arrived in the U.S., the result of an extraordinary effort by relatives, scientists and the American military.

Their journey spanned 10 years. Elite Navy divers lived inside a pressurized cabin for weeks so they could stay underwater longer and work at greater depths. About 250 tons of equipment was carried to the site in 17 shipping containers.

More than 81,500 Americans remain missing from past conflicts, according to the Defense POW/MIA Accounting Agency, part of the U.S. Defense Department that is tasked with finding them. About 75% were last seen somewhere in the Indo-Pacific, and most of them perished during World War II. 

Lt. Kelly and most of his fellow airmen were in their 20s when they died. Their bomber, with the words “Heaven Can Wait” painted across its nose, lay shattered on the ocean floor for decades. An unlikely spark ignited a quest to find it.  

A photo of the B-24 Bomber Heaven Can Wait provided by the family of Col. Harry Bullis. Heaven Can Wait was struck by anti-aircraft fire and crashed during a mission in Papua New Guinea on March 11, 1944, with 11 men on board. Credit: Courtesy of the family of Col. Harry Bullis.

A Name

On Memorial Day weekend in 2013, Scott Althaus reflected on a fuzzy childhood memory. When he was about 10 years old, his mother took him to visit the family plot at a cemetery in Livermore. He couldn’t remember the name etched on a small gray stone that they went to see, but he did recall the shape of an airplane carved beneath it.

Mr. Althaus, now 56 years old, is a political scientist at the University of Illinois in Urbana-Champaign, where he specializes in public perceptions of war. While he knew he had two relatives who died in World War II, he didn’t know much about them.

He asked his mother if she knew their names, and she emailed him a photograph of Lt. Kelly, her first cousin. Surviving relatives said he grew up on the family ranch in Livermore and dreamed of becoming a cowboy. After his death at age 21, his mother kept his boyhood bedroom almost as he left it, but with a folded American flag on top of the dresser.

Mr. Althaus did some more research. Lt. Kelly enlisted in the U.S. Army Air Forces in August 1942, trained as a bombardier and was assigned to what became known as the Tennyson Crew, named after its pilot, First Lt. Herbert G. Tennyson.

. . . .

On Dec. 7, 1943, they landed on an island north of Australia where Allied forces were fighting the Japanese, in what is now the nation of Papua New Guinea.

Mr. Althaus gathered as many details as he could about the 11 men who died. Ten of them were the plane’s crew. One was a photographer assigned to document their mission. At least two were married.

The Tennyson Crew was part of the 90th Bombardment Group, known as the “Jolly Rogers.” Fortuitously, a unit historian named Wiley O. Woods had meticulously documented their deployment in a book and a 64-volume set of records housed in a library at the University of Memphis.

In 2015, Mr. Althaus recruited four relatives to travel to Memphis. They spent two days scouring the archives and returned with nearly 800 photographs of documents—diary entries, maps, official army records.

“We learned so many things about that final mission that we never, ever thought we could learn,” Mr. Althaus said. He stitched together a chronology.

Heaven Can Wait took off from an airfield called Nadzab on March 11, 1944, loaded with eight 1,000-pound bombs. It was part of a group assigned to weaken Japanese antiaircraft batteries at several positions along the coast of the island of New Guinea.  

They came under fire as they neared their first target of Boram airfield and dropped some of their bombs. The Tennyson Crew was one of three that were assigned a second target on the island. They peeled off and veered toward an area called Awar Point at the northern edge of Hansa Bay. Visibility was good as they approached. Then came the sudden roar of enemy fire. 

The crew of another aircraft had a clear view of what happened next. Heaven Can Wait was struck near its middle and caught fire. Three men were seen falling, but none of their parachutes opened. Part of the tail broke off, and the plane, with the men still aboard, plunged into the water.

One eyewitness, identified as “Red” Tonder, said in a 1992 account that just before Heaven Can Wait went down, its co-pilot looked over and gave a final salute.

. . . .

In mid-2017, Eric Terrill and Andrew Pietruszka were planning an expedition to Papua New Guinea when they received a pivotal email from Mr. Althaus. The two work for a nonprofit, now called Project Recover, which was founded 30 years ago to find and repatriate Americans missing in action. 

Mr. Althaus had sent over a 32-page report that was essentially a treasure map leading them to the wreckage. He had compiled the serial numbers of the aircraft’s engines and weapons, as well as landmark features of the local terrain that helped narrow down the search field.

Heaven Can Wait was likely resting a quarter of a mile from Awar Point, his report said. The location is so remote that, at the time, it wasn’t searchable using Google Maps.

Messrs. Terrill and Pietruszka (the latter an underwater archaeologist who used to work for the Defense POW/MIA Accounting Agency) rounded up a team of about a dozen experts and set off in October 2017 to find it. They scoured the seabed with underwater robots, working 16-hour days for almost two weeks.

. . . .

Two days before their mission was due to end, one of the robots equipped with sonar sensors hovered over an area about a mile east of where Mr. Althaus predicted the plane would be. It detected jagged shapes that looked like they could be part of an engine and wing.

The next day, they sent down a remotely operated vehicle outfitted with a high-definition camera for a closer look. In May 2018, they announced that Heaven Can Wait had been found. They had located the plane’s tail assembly, fuselage and wing debris—and identified probable crew positions.

. . . .

There were setbacks: An initial dive team sent to sweep for unexploded ordnance and survey the crash had to be pulled out when a nearby volcano erupted in December 2018. The Covid-19 pandemic delayed them again in 2020 and 2021. This year, it was go-time.

Members of the Navy Experimental Diving Unit trained for two months at a base on the Florida panhandle. For the first time, they would deploy what they call SAT FADS—the Saturation Fly-Away Diving System, developed by the U.S. military.

The system uses a custom-fitted ship that houses a small airtight habitat. It connects to a capsule called a diving bell, which is tethered to the ship and transports divers to the seabed. Once they enter the pressurized habitat, they stay inside for weeks.

The divers said the environment distorts their senses. The sound of teeth brushing seems to echo in their heads. Certain foods like fish and bread suddenly smell and taste repulsive. They breathe a mixture of helium and oxygen that makes them talk like Mickey Mouse.

Unable to use electronic devices inside, they passed the time mostly by reading. One diver devoured the book “A Higher Call,” the true story of a German fighter ace who guided a damaged American aircraft out of enemy airspace during World War II.

Eventually, someone had the idea to use a bed sheet as a movie screen. Crew outside the sealed habitat propped a projector in front of their tiny porthole so they could watch the “John Wick” series. The soundtrack crackled through the vessel’s built-in communications system.

. . . .

Each diver carried about 80 pounds of gear, with extra weights strapped to their waists and ankles to ground them to the seabed. They maneuvered slowly through strong currents.

“It felt like you were walking into a windstorm at times,” said Lt. Cmdr. Daniel Kinney, the diving officer in charge.

The wreckage was well preserved, with different parts of the plane visible. Six divers alternated on four-hour shifts in teams of three, using a hose that gently vacuumed up whatever they could find. Large pieces were placed in baskets and reeled to the surface.

Each night, the mission’s lead scientist, Greg Stratton, hunched over trays to examine their haul with a jeweler’s loupe in search of bits of bones and teeth.

After five weeks of diving, they came ashore with hundreds of pieces of evidence. The Defense POW/MIA Accounting Agency said the haul includes “osseous material,” which could be bone.

Link to the rest at The Wall Street Journal

The Soviet Century

From The Wall Street Journal:

We have had reason of late to think anew about the Soviet Union and the legacy of the Cold War—the fighting in Ukraine reverberates with the ruthless geopolitics of an earlier era. In “The Soviet Century,” Karl Schlögel takes us on a tour of the Soviet past in all its materiality, a tour that puts on display, as he puts it, the “archaeology of a lost world.”

He begins with a visit to the vast outdoor flea market at Moscow’s Izmailovsky Park, where, as the Soviet Union began to collapse in the early 1990s, “an entire world-historical era was being sold off on the cheap.” A modern-day refuse heap, the bazaar showcased the offerings of hundreds of individual households eager to turn their once-cherished tchotchkes into much-needed cash.

It wasn’t only in Moscow that such a selling-off was attempted. Haphazard bazaars, Mr. Schlögel says, sprang up across the country. Using blankets or folding tables, people displayed samovars, cups and saucers, Red Army hats, insignia pins, captured German military uniforms, pennants, Communist Party membership cards—“the debris and the fragments of the world of objects belonging to the empire that has ceased to exist,” as he writes. Anything that might attract a buyer.

Mr. Schlögel doesn’t mention the avoska—a “just in case” knitted bag—that Soviet citizens routinely carried with them on the chance they would happen upon tomatoes or melons for sale on a street corner (something I used to see for myself on my visits to the Soviet Union in the 1980s). He does note that when urban dwellers lined up for goods—not only at bazaars but at the entrances to subway stations, where people sold loaves of bread and articles of clothing—they often did so without knowing what everyone else was waiting for and simply assumed it would be something they needed.

For Mr. Schlögel, an esteemed historian based in Frankfurt, such improvised markets are an emblem of a broader theme. His focus is not on the foreign relations or domestic crises of Soviet rule but on outward appearances: the look, the smell, the sounds of everyday life. Based on decades of research and an intimate knowledge of history and culture, “The Soviet Century” is a fascinating chronicle of a not-so-distant era.

Among much else, we learn about life in a typical communal apartment, where several families had to share a space divided into single rooms, one for each multigenerational family. As late as the 1970s, 40% of Moscow’s population “enjoyed” such accommodations, with all the inevitable tensions, petty disputes and invasions of privacy. We learn about the system of doorbells: “Ring once for Occupant A, twice for Occupant B and so forth.” And about the lavatory as a semipublic space. “A toilet for over thirty people . . . was not untypical,” Mr. Schlögel writes. A gallery of toilet seats would hang on the lavatory wall.

Other stories in “The Soviet Century” (translated from the German by Rodney Livingstone) capture unique and surprising moments in cultural history. Who would have guessed that the original formula for the Soviet perfume Red Moscow—developed before the Revolution but introduced to the public in 1927—led to the creation of Chanel No. 5? Or that when the special archive of banned books and periodicals was finally made available to researchers during the time of Mikhail Gorbachev’s reforms in 1987, it included “300,000 book titles, 560,000 journals, and a million newspapers”? Of course, the Soviet century included bannings of another sort. As Mr. Schlögel reminds us, more than 200 “philosophers, writers, university teachers, and agronomists” were personally chosen by Lenin and banished to Western Europe in 1922. Others were simply shot.

Link to the rest at The Wall Street Journal

PG suggests that, while some details have changed, Putin’s Russia still seems a lot like Stalin’s Russia.

Americans Abroad

From The Paris Review:

By the time I saw Nixon in China during its 2011 run at the Metropolitan Opera, it had become a classic, if not an entirely undisputed one. It had made it to the Met, at least, with its composer, John Adams, conducting, and James Maddalena, who originated the role of Nixon in the 1987 premiere at the Houston Grand Opera, back at it, now nearly the age Nixon was when he made the trip. A friend of mine, with theatrical élan, bought out a box for a group of us and encouraged formal dress, as if we were in a nineteenth-century novel. He showed up in a tux. I don’t remember my outfit, but I’d be surprised, knowing myself, if I managed anything more presentable than a mildly rumpled off-the-rack suit. At the time, I was working as an assistant to a magazine editor who regularly attended the opera, in full formal dress, with a pair of its major donors, fitting in an elaborate meal on the Grand Tier during intermission. My handling of his invitations gave me a surprising proprietary sense about the place. I didn’t feel that I belonged, of course, but at least I had a narrow help’s-eye view of its workings. In the upper deck, and even in our box, my friends and I had the sense of superiority that comes from being broke and artistic among the rich and, presumably, untalented.

Not that I had any major insight into the opera at the time, this one specifically or the art form more generally. I’d sat in the cheap seats on a few occasions, trying to rouse myself awake for the end of Tristan und Isolde, once, with a Wagner-loving girlfriend. I’d even stood in the back row of the orchestra for Leoš Janáček’s From the House of the Dead, feeling obligated as a Dostoyevsky loyalist to bear witness. (All I remember is a general brownness and a grim, monochromatic score. It was, after all, a Czech opera about a Russian prison camp.) I did, however, have an abiding interest, bordering on mania, in the pathos of conservative politics, and only a person who has lost interest in the world could fail to be interested in Richard Nixon. The friend who had arranged this outing was, among other things, a news junkie and former Republican, and his relationship to the former president was characterized, like the opera’s relationship to its subject, by a complicated mix of irony and enthusiasm. Dramatic renderings of Nixon tend toward the sweaty and profane (as in Robert Altman’s Secret Honor) or the broadly comic (Philip Roth’s novel Our Gang, or the 1999 film Dick, starring a young Kirsten Dunst and Michelle Williams, an overlooked gem surely due for reappraisal). But Adams’s monumental, hypnotically Glassian score and Alice Goodman’s dense postmodern libretto invest Nixon with a weird if inarticulate dignity that he rarely displayed in life. The striving and paranoia are tamped down, replaced with yearning naïveté and statesmanship.

Though the opera remains true to the publicly known contours of the actual trip, Nixon in China’s Dick and Pat are as much stand-ins for Americans Abroad—hopeful, a bit bumbling, but fundamentally decent, albeit with the power of the world’s wealthiest country at their back—as they are representations of real people. (Nixon, it’s worth noting, was still alive when the opera premiered, and was invited to the opening. A few years later his representative said that he didn’t attend because he “has never liked to see himself on television or other media, and has no interest in opera.” Okay!) In his arias, Nixon delivers a garbled mix of clichés, non sequiturs, impressionistic memories, and Ashberyian koans, the most famous of which, “News has a kind of mystery,” is repeated in dizzying variations soon after Nixon descends from his Boeing 707 and shakes hands with Chou En-lai (as the Chinese premier’s name is unorthodoxly rendered in the libretto). The song lodges in one’s brain immediately—I’ve been continually exclaiming “News! News! News!” at my four-month-old son—and serves as a kind of motto or benediction for the entire work, simultaneously insistent and ambiguous. It’s the exclamation of a man who is marveling at the mythmaking apparatus that he has been an active beneficiary of and that will ultimately destroy him. It’s the vagueness that makes it transcendent, a half-formed thought one might jot down in a notebook and turn over in one’s head for days. A kind of mystery?

There’s an emptiness at the core of Nixon in China that is appropriate, given that it’s about political pageantry, the kind of nonevents that Joan Didion identified as the stock-in-trade of modern politics in her 1988 essay “Insider Baseball.” One of opera’s chief methods is to turn private emotions into grand spectacle, to give voice to feelings that could never be as beautifully expressed as they are in a duet between two doomed lovers. Nixon in China turns superficial spectacle into another spectacle, a copy of a copy. There is action—Nixon meets with Mao; Nixon and Chou deliver toasts at a banquet; Pat goes on an official sightseeing tour—but there is little dramatic movement. Even when we do get insight into the “private” Nixon, Pat, Mao, et al., in quieter scenes that take place behind closed doors, what is revealed is not fundamentally different from what is presented publicly. Adams’s score ebbs and flows, churning on and on, threatening, but never tipping over into, catharsis. The work steadily resists resolution—it ends with an extended coda, taking up the entire third act, in which the characters prepare for bed.

Link to the rest at The Paris Review

The Last Secret of the Secret Annex

From The Wall Street Journal:

After two years in hiding in Amsterdam, 15-year-old Anne Frank was arrested in August 1944, along with her sister, mother, father and four other Jews. All but Anne’s father, Otto Frank, perished in Nazi concentration camps, along with three-quarters—more than 100,000—of the Netherlands’ Jewish population. Anne’s adolescent diary, first published in 1947, has since become one of the most celebrated and poignant artifacts of the Holocaust. A flood of literature on the Frank family and the Dutch people who helped them survive has followed. Among the nagging questions that remain: Who betrayed the Franks and the others in hiding with them?

“The Last Secret of the Secret Annex” is both a fascinating attempt to unlock this mystery and a case study in how Holocaust trauma can ripple through the generations. It comes from the Belgian journalist Jeroen De Bruyn, who confesses a lifelong obsession with Anne’s story, and Joop van Wijk-Voskuijl, whose mother, Elisabeth “Bep” Voskuijl, was, in her early 20s, the youngest of the Franks’ Dutch “helpers.” The authors met when Mr. De Bruyn was just 15, and eventually became partners in the enterprise.

Narrated in Mr. van Wijk-Voskuijl’s voice, “The Last Secret of the Secret Annex” updates and expands an earlier book by the duo, published in 2015 in the Netherlands, and self-published in the United States three years later as “Anne Frank: The Untold Story.” The current volume details the courage of the narrator’s mother, who foraged for food for those in hiding, and his maternal grandfather, Johan, who built the revolving bookcase that concealed the “annex” in which the Frank family lived. It also takes withering aim at the multiyear “cold case” investigation chronicled in Rosemary Sullivan’s 2022 book “The Betrayal of Anne Frank.”

Led by former FBI special agent Vince Pankoke, that inquiry—in which the authors cooperated—concluded that the culprit was likely the notary Arnold van den Bergh, a member of Amsterdam’s Jewish Council. Citing an anonymous accusation and other evidence, it posited that he traded addresses of Jews in hiding to the Gestapo in exchange for his family’s survival. Dutch scholars found that scenario far-fetched, and their criticisms led to the Sullivan book’s withdrawal from circulation in the Netherlands.

Messrs. De Bruyn and van Wijk-Voskuijl propose a different possible informant: Mr. van Wijk-Voskuijl’s maternal aunt, Bep’s younger sister Nelly. During the Occupation, the then-teenage girl was, in the authors’ words, “seduced by everything German.” High-spirited and combative, Nelly had Nazi boyfriends and worked for the German military. Two survivors of that period—another of Bep’s sisters, Diny, and Bep’s wartime fiancé, Bertus Hulsman—attested that Nelly knew her relatives were helping Jews in hiding. Both recalled her angrily saying “Just go to your Jews!”—or words to that effect—to other family members.

Link to the rest at The Wall Street Journal

Knowing What We Know: How Information Was Born

From The Wall Street Journal:

Knowledge is power, Francis Bacon wrote in 1597, using a quill and the Elizabethans’ distinctive “secretary hand.” Thomas Hobbes, who started out as Bacon’s secretary, agreed: Scientia potentia est, Hobbes wrote in the 1668 edition of “Leviathan.” Generations of spymasters, dictators and tax inspectors concurred, and so, as the rubble of the Humanities confirms, did the French theorist Michel Foucault. Yet knowledge is no longer power.

Today digital information is power. The quantity of information debases its value: The printed newspaper is dematerializing before our eyes. The smartphone offers more than a different physical experience from its predecessors, the tablet, scroll, manuscript and printed book. It carries the entire history of information. Writing, Socrates warns in Plato’s “Phaedrus,” “will implant forgetfulness.” If we “cease to exercise memory,” we will be “calling things to remembrance no longer from within themselves, but by means of external marks.” When we outsource the storage of information, we outsource our knowledge of the world and ourselves.

Philosophers agonize over how knowledge is made. Historians are more interested in its circulation and application. In “Knowing What We Know,” Simon Winchester dispenses with the technicalities. Mr. Winchester, a prolific author whose bestsellers include “The Meaning of Everything,” considers knowledge as per the Oxford English Dictionary, meaning no. 4b: “The apprehension of fact or truth with the mind; clear and certain perception of fact or truth; the state or condition of knowing fact or truth.” With his typical fluency and range, Mr. Winchester then traces the intertwined evolution of knowledge, society and the individual, from ancient illiteracy to the wisdom of the hour, artificial intelligence.

The first transmissions of knowledge, Mr. Winchester writes, were “oral or pictorial.” As current indigenous practice shows, the collective cultural inheritance and identity of the tribe is transmitted by “knowledge keepers,” usually “designated elders or specially skilled custodians.” The oldest surviving written transmission, a “small tablet of sunbaked red clay” found recently in what is now Iraq, dates to around 3100 B.C. In the Sumerian city of Uruk, a man named Kushim, who “appears to have been an accountant,” issued a receipt in a Mesopotamian warehouse for a delivery of barley. He had created a piece of movable information. Anyone who could read it was educated: able to acquire information, able to pass it along. As scarcity ensured value, the invention of writing devalued knowledge. It also lowered the tone. When people started to write as they thought, Mr. Winchester argues, they aired the “more vulgar aspects of society.”

Mr. Winchester is adroit at arranging information in pursuit of knowledge, and he has an eye for the anecdote. The familiar prehistory of the Latin alphabet is here, but he emphasizes the simultaneous making of a cross-civilizational consensus on education and its methods.

Innate human curiosity is the engine of knowledge, but the engine runs on two fuels, experience and facts taken on trust. Mr. Winchester’s own experiential curiosity was triggered at the age of two by a wasp sting. For his brain to develop into “some kind of mental context-cabinet,” he needed a mental filing system. Facts and memorization were emphasized in the imperial-minded curricula of ancient Sumeria, Confucian China and Mr. Winchester’s schools in England. The Chinese examination system ran for 1,316 years, until 1905. Mao revived the idea of early testing to identify a future elite in 1952, and the annual gaokao exams remain “an ordeal of the first magnitude,” requiring proof that “one’s degree of acquired knowledge is both immense and of the highest quality.” The American schoolroom may be a kinder place, but the rest of the world thinks that the SAT is “ridiculously easy.”

Link to the rest at The Wall Street Journal

Reddit, Tell Me Where I Went Wrong

From Electric Lit:

My neighbor (32F) is not speaking to me (44M) because I made some repairs to her home while she was out of town. These were mostly exterior and relatively minor (clearing debris, replacing deck boards, adding a utility sink, installing a rain cap), but I did climb onto her roof. She says I was out of line by not asking permission and that she no longer trusts my judgment.

We live two streets away from each other in a small neighborhood of old houses. We have been friends for a year and hooking up for about three months. I would like more, but she is a relatively new widow and single parent to a four-year-old boy and doesn’t have the capacity right now. She is seriously my ideal woman, though, and I am willing to wait. I am not the most attractive guy and never thought I’d interest a person of her caliber. We’ve gone out a few times when her mom was watching her son or if there was a “Parents’ Night Out” at his daycare, but mostly it’s a couple hours together after her son goes to sleep. She’s invited me along with a larger group to go hiking a couple times, and we get each other’s mail and water each other’s plants if the other person is out of town.  

I bought a house in this neighborhood after my divorce because it was close to my job and to my ex-wife’s house (we share custody of two teenagers), but a lot of people move here because it is one of the few affordable city neighborhoods in a good school district. Then they realize that because the houses are all extremely old, repairing them is a hassle. You think about yanking down the wallpaper somebody painted over only to discover lead paint, or try to replace a door and realize you’ll have to get one custom made. I’m an engineer and can get into this kind of stuff, but a lot of people don’t. My neighbor told me on more than one occasion that her house stressed her out. She could handle the yard work and minor repairs and outsource the truly big projects, but then there were all of these things in between. Installing a utility sink felt impossible when you had a full-time job and a young child and no spouse, but were you really going to pay someone to do that? “You don’t have to pay me,” I’d tell her. “Get the sink, and I’ll put it in,” but she wouldn’t let me. I figured it was about her son and his father, about not wanting him to see anyone step into that kind of role, and so I dropped it.

The night before she went out of town, we were on her porch drinking beers and watching for the fox that lives in the overgrown lot across the street. Her son had gone to bed about thirty minutes before and was still sleeping lightly. We couldn’t go upstairs yet and so we got to talk. Work, TV shows, a book she almost loved whose ending felt contrived, my daughter’s failing grade in chemistry that brought me and my ex-wife to a moment of real collaboration. We had a fan going to ward off the mosquitos, and the sunset was just beginning to brighten the edges of the summer sky. When the dog walkers passed, we’d wave, and this gave me a good feeling, all of these people seeing me with her. It felt like being claimed.

“This is nice,” I said.

“What?”

“Being with you. I’m glad we don’t sneak around.”

She made a face. “Why would we do that?”

Her voice had a slight edge to it, and I knew I had to tread lightly. I couldn’t imply she was risking her reputation or trusting a person she barely knew to behave well if whatever it was we had ended.

“That first night you slept with me I was so happy,” I said. “I told myself, she has a kid and we’re neighbors. She isn’t going to hook up with me unless she thinks it could really be something.”

She took a long drink of her beer and seemed to consider her response. I was hoping she would say I was right, but she just shrugged. “We’re both adults. You never struck me as a lunatic.”

Link to the rest at Electric Lit

Fatherland

From The Wall Street Journal:

When he was 28, Burkhard Bilger learned a jarring family secret: Shortly after World War II, his grandfather spent two years in jail while on trial as an accused Nazi war criminal.

The revelation shocked Mr. Bilger. His parents, who moved to the U.S. from Germany in 1962, seldom spoke about their Nazi-era upbringing. Mr. Bilger, who was born in Oklahoma a year and a half after his parents’ emigration, similarly avoided calling attention to a heritage that could give pause to new acquaintances. “To be German, it seemed, was always to be one part Nazi,” he writes. “In my case, that part was my grandfather.” Rather than dwell on the past, for the most part he avoided it. Then the past found him.

In 2005, a package arrived from one of Mr. Bilger’s aunts in Germany containing a shoebox filled with letters dating from around World War II. Mostly handwritten, some in an old-fashioned German script difficult for contemporary readers to decipher, the documents propelled Mr. Bilger into a yearslong journey to make sense of how his grandfather, a reserved and seemingly upstanding schoolteacher, had entangled himself and his family in the rise and fall of Hitler’s Third Reich.

The result is Mr. Bilger’s resolutely unflinching and ultimately illuminating book “Fatherland: A Memoir of War, Conscience, and Family Secrets.” In the course of his quest, Mr. Bilger, a staff writer at the New Yorker, interviewed far-flung family members as well as his grandfather’s long-lost neighbors, and scoured government archives in both Germany and France. As he pieces together the memories and documentary evidence, Mr. Bilger makes palpable the tension he feels between the wish to forget the past, in all its discomforting details, and the desire to understand behavior that might be easier to erase from memory than to confront and try to take in, much less forgive.

He begins by wondering how his grandfather Karl Gönner could have been both the father his mother loved and “the monster history suggested.” She was still a schoolgirl when her father finally returned from the war, and she remained too fearful ever to ask him if he was guilty of the crimes of which he was accused. What if Mr. Bilger discovered, now, that the answer was yes?

An authentic reckoning with his grandfather’s past demanded that he find out. Mr. Bilger charts his family’s history, generation by generation, back to the 18th century. Gönner himself had provided the roadmap in his personal “ancestry passport”—the official document laying out his “pure” Aryan genealogy over the centuries, as required for his membership in the Nazi Party as well as for his government-regulated teaching job.

Like his ancestors before him, Gönner was born in the Black Forest village of Herzogenweiler, founded in 1721 by a successful consortium of glassblowers. By the time of Gönner’s birth in 1899, though, the glass business had collapsed and the once-flourishing village had become derelict.

Religious and bookish by nature, Gönner set his sights on the priesthood as his best route to an education and a career away from poverty. Then came World War I. Drafted in 1917, Gönner arrived at the Western Front in time to join the German army’s battered retreat. In late September 1918, beaten down by hunger and the muck of the trenches, his troop arrived at Meuse-Argonne, the site of one of the war’s final and most brutal battles. A land mine blasted Gönner unconscious, its shards piercing his right eye, arm and thigh. Upon his release from the hospital several months later, Mr. Bilger writes, Gönner “came home hobbled and half blind, with a sense that never left him that the world was a shattered thing, in need of radical repair.”

Yet he was told he was lucky. After all, his brother Josef, who had been killed in Flanders, never returned from the war at all. But what kind of a life could Gönner have back in the impoverished villages of the Black Forest? The war cost him his religious faith and replaced his right eye with a sightless glass prosthetic. Eventually he married and started a family, became a teacher and, in 1933, joined the Nazi Party.

Link to the rest at The Wall Street Journal

Why Beethoven

From The Wall Street Journal:

In 2010 the British music critic Norman Lebrecht published “Why Mahler?” The book was an attempt to explain why the symphonies of Gustav Mahler (1860-1911) inspired both hatred and adoration in his lifetime, were forgotten for a half-century after his death, and now dominate the concert repertoire. “Why Beethoven”—no question mark—explains why no such thing happened in the case of that composer.

“Johann Sebastian Bach’s oratorios lay untouched for a hundred years,” Mr. Lebrecht writes. “The operas of Handel were hardly seen for two centuries. Mozart, popular as his operas may have been, had his symphonies and concertos used as kindling. . . . Schubert’s piano sonatas gathered dust for generations. Schumann’s symphonies were discarded, as were several Verdi operas. Beethoven, alone among classical and romantic composers, was embraced first to last, his time to ours. Why is that?”

I take Mr. Lebrecht’s point that Beethoven’s greatness has never been disputed by serious people, but his comparisons are trite. Bach and Handel lived a century before Beethoven, and their music had to endure the 1760s and ’70s, when European musical authorities were fools and there was no concertgoing public in the 19th-century sense. Schumann’s symphonies are very fine but the loss of them would not amount to a civilizational tragedy. And maybe Schubert’s sonatas gathered dust, but his songs did not, whereas Beethoven’s songs might be forgotten at no great cost.

The putative aim of the book, in any case—this is also true of the book on Mahler—is a conceit, a framing device that allows Mr. Lebrecht to write as he likes about Beethoven and his works. In this case Mr. Lebrecht has written 100 chapters on 100 compositions by Beethoven. These chapters treat not only the well-known works—the symphonies, the violin concerto, the string quartets, the piano sonatas, the cello and violin sonatas, “Fidelio,” “Missa Solemnis”—but also many works casual listeners will not have heard: various songs, the horn sonata, the sextet for two clarinets, two horns and two bassoons, and so on. In most of these chapters, he judges the merits of assorted recordings of the works.

Mr. Lebrecht’s reflections are as predictable as Beethoven’s music: which is to say, not at all. Some are autobiographical. The chapter on the sixth symphony, the “Pastoral”—possibly one of the most beautiful symphonic works ever written—is titled “Hell on Earth.” Huh? Mr. Lebrecht could not enjoy this work for many years, he tells us, because his cruel and abusive German stepmother would force him to listen to a recording of it with her, over and over. She idolized the German conductor Bruno Walter and took young Norman to concerts at the Royal Festival Hall. But her behavior to him was so appalling that for years after leaving home he stopped listening to music altogether. She urged him to call her “Mother” and hit him when he refused, and forced him to go on long Sunday-morning treks across the countryside.

Was she so terrible? He says yes, and who are we to doubt it? “Her voice squeaked like an unoiled gate,” he writes. “Her cooking was tasteless, her outlook joyless, her rare smiles menacing. She got me a piano teacher and hit me if I skimped on practice. She hit me for many other sins, and for none.” Still, I wonder. Mr. Lebrecht has made a brilliant career in music criticism. His stepmother had him listen to Bruno Walter recordings and took him to concerts as a teenager. Hitting aside, it’s not obvious that he owes her nothing more than the vitriol recorded here.

Link to the rest at The Wall Street Journal