Stalin: Passage to Revolution

The information card on “I. V. Stalin”, from the files of the Imperial police in Saint Petersburg, 1911
via Wikipedia

From The Wall Street Journal:

Not surprisingly, Joseph Stalin has been the subject of many biographical studies, in recent years in particular, when formerly closed Soviet archives became open to students of history. Decades before, Leon Trotsky, Isaac Deutscher, Adam Ulam and Robert Tucker, to name a handful of prominent authors, wrote hefty volumes on Stalin’s life, attempting to tell the story with limited information. Their work has been surpassed by another generation of scholars, led by Dmitri Volkogonov, Robert Service, Oleg Khlevniuk and Stephen Kotkin. They have plumbed the archives and benefited from a host of memoirs that have deepened our understanding of a murderous dictator whose legacy, nearly 70 years after his death, still haunts the countries he once ruled.

Ronald Grigor Suny’s “Stalin: Passage to Revolution” is a worthy contribution to this continuing enterprise. “The telling of Stalin’s life has always been more than biography,” Mr. Suny writes. “There is wonder at the achievement—the son of a Georgian cobbler ascending the heights of world power, the architect of an industrial revolution and the destruction of millions of the people he ruled, the leader of the state that stopped the bloody expansion of fascism.” It is the story of how the Romanov dynasty, convinced of its own divine right to rule the Russian Empire, confronted “a newly emerging social class” of industrial workers, a clash that “exploded into violence, bloodshed, and eventually revolution.” Reading Mr. Suny’s chronicle, one can’t help recalling John F. Kennedy’s remark, in a 1962 speech, that “those who make peaceful revolution impossible will make violent revolution inevitable.”

. . . .

Mr. Suny’s focus is Stalin’s early decades, from his birth and education to the eve of revolution in 1917. Born in 1878 in the Georgian town of Gori, on the southern periphery of the Russian Empire, Ioseb Jughashvili, as he was christened, was raised in a poor family. His father scratched out a living as a cobbler; his mother was a religious woman who worked as a seamstress. The couple had lost their first two sons in infancy, driving his father to become “violent, erratic, and drunk,” Mr. Suny says, and to abandon the family. Convinced of Joseph’s abilities, his mother worked to gain his admission to a seminary so that he could become a priest.

Using his access to archives in Georgia, Mr. Suny describes the milieu in which the young Joseph grew up—the children’s games he enjoyed and the literature and myths that animated his imagination. It was at the seminary in the Georgian capital of Tiflis that the teenage Joseph confronted the obstinacy of his teachers, who denigrated Georgian culture and insisted on the primacy of Russian language and history. Life at the seminary, Mr. Suny writes, was “colorless and monotonous . . . , a strict routine designed to inculcate obedience and deference.” It proved to be as much a “crucible for revolutionaries as for priests” and pushed “an intelligent but still quite ordinary adolescent into opposition.” At the seminary, Joseph “came to socialism through reading and the fellowship of classmates.”

. . . .

Stalin, known as Koba to his comrades, made a name for himself as a party organizer in the Caucasus, among miners and oil workers. Here confrontations with czarist officials were violent and bloody, marked by heists and assassinations.

Stalin closely studied the works of Marx and, not least, the writings of Lenin before he met the Bolshevik leader in 1905, an encounter that began a close and fateful association. Mr. Suny’s close study of these years uncovers the traits of suspicion and intrigue that came to define Stalin in power. Koba, he writes, “was not above using dubious means against comrades with whom he disagreed,” lying about them behind their backs to compromise their standing. In his encounters with Mensheviks, he indulged in anti-Semitic insults, knowing that there were more Jews among them than among the Bolsheviks he favored.

Mr. Suny’s account of the tensions between Bolsheviks and Mensheviks is spirited and compelling, especially when he describes these ostensible allies splitting into “antagonistic cultures,” each demonizing the other over their motives, making reconciliation ever less likely. Lenin is often at the center of this story, engaging in vicious polemics against his ideological adversaries. 

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The Alignment Problem

From The Wall Street Journal:

In the mid-1990s, a group of software developers applied the latest computer learning to tackle a problem that emergency-room doctors were routinely facing: which of the patients who showed up with pneumonia should be admitted and which could be sent home to recover there? An algorithm analyzed more than 15,000 patients and came up with a series of predictions intended to optimize patient survival. There was, however, an oddity—the computer concluded that asthmatics with pneumonia were low-risk and could be treated as outpatients. The programmers were skeptical.

Their doubts proved correct. As clinicians later explained, when asthmatics show up to an emergency room with pneumonia, they are considered so high-risk that they tend to be triaged immediately to more intensive care. It was this policy that accounted for their lower-than-expected mortality, the outcome that the computer was trying to optimize. The algorithm, in other words, provided the wrong recommendation, but it was doing exactly what it had been programmed to do.

The disconnect between intention and results—between what mathematician Norbert Wiener described as “the purpose put into the machine” and “the purpose we really desire”—defines the essence of “the alignment problem.” Brian Christian, an accomplished technology writer, offers a nuanced and captivating exploration of this white-hot topic, giving us along the way a survey of the state of machine learning and of the challenges it faces.

The alignment problem, Mr. Christian notes, is as old as the earliest attempts to persuade machines to reason, but recent advances in data-capture and computational power have given it a new prominence. To show the limits of even the most sophisticated algorithms, he describes what happened when a vast database of human language was harvested from published books and the internet. It enabled the mathematical analysis of language—facilitating dramatically improved word translations and creating opportunities to express linguistic relationships as simple arithmetical expressions. Type in “King-Man+Woman” and you got “Queen.” But if you tried “Doctor-Man+Woman,” out popped “Nurse.” “Shopkeeper-Man+Woman” produced “Housewife.” Here the math reflected, and risked perpetuating, historical sexism in language use. Another misalignment example: When an algorithm was trained on a data set of millions of labeled images, it was able to sort photos into categories as fine-grained as “Graduation”—yet classified people of color as “Gorillas.” This problem was rooted in deficiencies in the data set on which the model was trained. In both cases, the programmers had failed to recognize, much less seriously consider, the shortcomings of their models.
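The word-vector arithmetic the review describes can be sketched with toy numbers. In the following minimal illustration, the vectors and vocabulary are made up for the example, not taken from any trained model; real systems such as word2vec learn hundreds of dimensions from text, but the arithmetic is the same:

```python
from math import sqrt

# Toy 3-dimensional "embeddings" with hypothetical values.
vecs = {
    "king":  (0.9, 0.8, 0.1),
    "man":   (0.5, 0.1, 0.1),
    "woman": (0.5, 0.1, 0.9),
    "queen": (0.9, 0.8, 0.9),
    "apple": (0.1, 0.9, 0.4),
}

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

# "King - Man + Woman": subtract and add the vectors component-wise.
query = [k - m + w for k, m, w in zip(vecs["king"], vecs["man"], vecs["woman"])]

# The "answer" is the vocabulary word whose vector lies closest to the query.
best = max(vecs, key=lambda word: cosine(query, vecs[word]))
print(best)  # -> queen
```

The sexist analogies the review cites arise the same way: the model faithfully returns whatever word the training text placed nearest the query, biases included.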

We are attracted, Mr. Christian observes, to the idea “that society can be made more consistent, more accurate, and more fair by replacing idiosyncratic human judgment with numerical models.” But we may be expecting too much of our software. A computer program intended to guide parole decisions, for example, delivered guidance that distilled and arguably propagated underlying racial inequalities. Is this the algorithm’s fault, or ours?

To answer this question and others, Mr. Christian devotes much of “The Alignment Problem” to the challenges of teaching computers to do what we want them to do. A computer seeking to maximize its score through trial and error, for example, can quickly figure out shoot-’em-up videogames like “Space Invaders” but struggles with Indiana Jones-style adventure games like “Montezuma’s Revenge,” where rewards are sparse and you need to swing across a pit and climb a ladder before you start to score. Human gamers are instinctively driven to explore and figure out what’s behind the next door, but the computer isn’t—until a “curiosity” incentive is provided.
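One common way to implement such a curiosity incentive is a count-based exploration bonus: the game’s own score is topped up with a small extra reward that shrinks each time a state is revisited. This is a generic sketch of that idea, not the specific method described in the book; the state names and bonus scale are invented for illustration:

```python
from collections import defaultdict
from math import sqrt

visit_counts = defaultdict(int)

def shaped_reward(state, extrinsic_reward, beta=0.1):
    """Game score plus a novelty bonus that decays with each revisit."""
    visit_counts[state] += 1
    curiosity_bonus = beta / sqrt(visit_counts[state])
    return extrinsic_reward + curiosity_bonus

# In a sparse-reward game the score is 0 almost everywhere, so the
# bonus alone rewards reaching a never-before-seen room.
first_visit = shaped_reward("room_behind_the_pit", 0.0)
revisit = shaped_reward("room_behind_the_pit", 0.0)
print(first_visit, revisit)  # the first visit pays more than the revisit
```

Because novel states pay out and familiar ones don’t, a score-maximizing agent is nudged toward exactly the door-opening behavior human players do instinctively.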

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

When PG was in high school, The Mother of PG, aka Mom, made PG take a typing class. Learning how to type, and to type quickly, might have been the most useful thing PG learned in high school.

PG earned money in college by typing papers for other students who couldn’t type. He charged a high per-page rate and collected it because he specialized in typing for procrastinators. If you finished your rough draft at midnight, PG would show up with his portable typewriter and turn it into something your professor would accept at 8:00 am the next morning.

PG kept typing through law school, typing all of his law school exams and whatever parts of the bar exam could be typed.

When PG was a baby lawyer, he had a client who was also working with a fancy law firm in Los Angeles. He went over to the fancy law firm on occasion to meet with the fancy lawyers who worked there. (He rode up the elevator to the law firm’s offices with Marlon Brando one time and Kareem Abdul-Jabbar another time. Kareem looked a lot less dissipated than Marlon.)

The fancy law firm had the first word-processing computers PG had ever seen. The firm had eight of these computers, operated by the fastest and most accurate typists PG has ever seen. The machines and their operators were in their own glass-walled room, and at least a couple of typists were on duty 24 hours a day. (PG once arrived at midnight to pick up a rush project, and one of them handed him the finished contract on the spot.) PG just checked, and each of the computerized word processors cost over $180,000 in 2020 dollars.

PG was the first lawyer he knew who bought a personal computer for his law office. Fortunately, personal computers could also be used for playing videogames, so the price had come way, way, way, way down from $180,000.

Because he could still type fast, PG learned how a word processing program worked. Plus a bunch of other programs. He quickly started using his PC for legal work. Why type a document you used for a lot of different clients over and over when you could just type it once for Client A, save a copy, then use the copy as the basis for Clients B-Z?

PCs were evolving quickly, so when a more powerful PC was released, PG bought one, moved his prior PC to his secretary’s desk, and showed her how to use the word processing program.

Since PG always hired the smartest secretaries he could find, within a couple of weeks, she was better with the word processor than PG was.

For a variety of different reasons, PG started doing a lot of divorces for people who didn’t have a lot of money (the local Legal Aid office thought he did a good job and sent a lot of clients his way).

In order to make money doing divorces for people who didn’t have much (Legal Aid never had enough money, so it didn’t pay much for a divorce either), PG built a computer program so he could do the paperwork necessary for a divorce very quickly.

The wife’s name, the husband’s name, the kids’ names and ages, the year and make of the rusted-out pickup, the TV, sofa, etc., were the same from start to finish, so why not type them into a computer program once, then build standard legal forms that would use the same information for all the various forms the state legislature, in its infinite wisdom, had said were necessary to end a marriage?
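The scheme PG describes is essentially a mail merge: enter the case facts once, then pour them into every form the state requires. Here is a minimal sketch of that approach in Python; the field names and form language are hypothetical, not taken from Splitsville:

```python
from string import Template

# Case facts entered once (hypothetical field names and values).
case = {
    "wife": "Jane Doe",
    "husband": "John Doe",
    "children": "Sam (7) and Alex (4)",
    "vehicle": "1972 Ford pickup",
}

# Each required form is a template that reuses the same facts.
petition = Template(
    "Petitioner $wife and Respondent $husband ask the court to "
    "dissolve their marriage. Minor children: $children."
)
property_schedule = Template(
    "Marital property includes one $vehicle, a television, and a sofa."
)

# Generate every form from the single set of facts.
for form in (petition, property_schedule):
    print(form.substitute(case))
```

Typing the facts once and substituting them into every form is what turns a multi-hour retyping job into a few minutes of data entry.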

PG has meandered for too long, but to conclude quickly, he ended up building a commercial divorce computer program he named “Splitsville” and sold it to about 20% of the attorneys in the state where he was practicing at the time.

(In the United States, the laws governing divorce AKA Dissolution of Marriage vary from state-to-state, so Splitsville couldn’t cross state lines. Even though the fundamental human and property issues are the same any time a marriage is ended, PG suspects there are enough idiots in any state legislature to shout down anyone who says, “Why don’t we just do it the way Alabama does instead of concocting a divorce law of our own?”)

Which means PG doesn’t have enough knowledge to build artificial intelligence programs as described in the OP, but he does have an intuitive grasp of how to persuade computers to do things you would like them to accomplish. PG and computers seem to understand each other at a visceral level even though PG is less like a computer than a whole lot of smart people he knows. It’s sort of a Yin/Yang thing.

His liberal-arts assessment of the problem described in the OP is that the computer scientists in the OP haven’t figured out how to ask the ultra-computer for the answers they would like it to provide. A computer can do smart things and dumb things very quickly, but useful output requires understanding what you really want it to do, then figuring out how to explain the job to the computer.

But, undoubtedly, PG is missing something entirely and is totally off-base.

The Alignment Problem may be a good description of both the computer issue described in the book and of PG himself.

Tecumseh and the Prophet

From The Wall Street Journal:

In 1808, on the Wabash River—just downstream from where the Tippecanoe River flows into it—a new settlement was being built, in what is now northwestern Indiana. You could hear trees being cut down to construct houses and a 5,000-square-foot meeting house. Women were planting corn, beans and pumpkins. Founded by the Shawnee brothers Tecumseh and Tenskwatawa, this was Prophetstown.

The brothers’ houses were close to each other on Prophetstown’s southwestern edge, from which they could see the wide Wabash flowing through the prairie. And they could see pilgrims coming and going, visiting this place of hope in a dark time. A vast diversity of Native peoples—Wyandots, Ottawas, Lenapes (Delawares), Miamis, Potawatomis, Sauks—would pilgrimage to this multiethnic religious community, some staying and some returning home to spread the brothers’ universal Nativist message. As Tenskwatawa explained: The “Master of Life had taken pity on his red children,” who had been pushed around so long by white men. He “wished to save them from destruction” if they would cast aside “wealth and ornaments,” whisky and other trappings of “evil and unclean” white Americans and band together against those who “have taken your lands, which were not made for them.”

In “Tecumseh and the Prophet: The Shawnee Brothers Who Defied a Nation,” Peter Cozzens tells the intertwined history of the brothers Tecumseh and Tenskwatawa and makes the important argument that, without Tenskwatawa—who was known as “the prophet” for his spiritual visions and prophecies—“there would have been no Tecumseh.”

In most biographies and popular versions of this history, the famous warrior Tecumseh, who led a pan-Native force against the United States in the War of 1812, stands alone—exactly the opposite of his mission in life. As Mr. Cozzens shows, the brothers sought to bring together all Native Americans under Tenskwatawa’s teachings, persuading them to cast aside their political, cultural and religious differences to become one mighty race.

While the Master of Life spoke to the prophet Tenskwatawa, Tecumseh traveled throughout the eastern half of North America to spread the word, from Creek and Choctaw towns in the deep South to the Iroquois (Haudenosaunee) nations in the Northeast, and across the Mississippi River to the Quapaws, the powerful Osages and bands of the brothers’ own Shawnee people who had already moved west. Everywhere, Tecumseh preached Tenskwatawa’s prophecies and readied men for battle against the United States.

. . . .

The book’s sharply drawn characters go beyond the central figures of Tecumseh and Tenskwatawa. One of their greatest influences was their older brother Cheeseekau, killed in 1792 fighting against Tennessee settlements alongside Cherokees and Creeks. Their Shawnee opponent, Chief Black Hoof, believed Tenskwatawa’s call for pan-Indian resistance, instead of Shawnee-directed diplomacy, was madness. The great Miami war leader Little Turtle defeated U.S. forces in the 1790s, but by the time of Tenskwatawa’s movement he believed that compromise with the United States was the only path. As governor of the Indiana Territory, William Henry Harrison was impressed by Tecumseh’s rhetorical and martial skills and frightened by his popularity. Harrison later would win the U.S. presidency as “Old Tippecanoe,” famed for defeating Tecumseh.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

PG has read a bit of American history, but this was completely new to him.

However, he did a bit of quick research and turned up the following images:

Tenskwatawa, “The Prophet” painted in 1830 by George Catlin via Wikipedia
Frieze of the Rotunda of the United States Capitol “Death of Tecumseh” – Tecumseh is shown being fatally shot by Colonel Johnson at the Battle of the Thames in Upper Canada during the War of 1812. With Tecumseh’s death, however, the momentum and power of the Indian confederacy was broken. via Wikipedia, public domain, US Government work
Bronze reproduction of the figurehead of the USS Delaware, located at the US Naval Academy. Originally Tamanend, chief of the Delawares, the statue is now commonly called Tecumseh. It was designed by William Luke (1790-1839).
Photo by Employees of the U.S. Naval Academy, Public domain, via Wikimedia Commons

Cary Grant: A Brilliant Disguise

From The Wall Street Journal:

H.L. Mencken was doubtful that Shakespeare wrote the plays assigned to him because there is substantial evidence that he acted in them, which is an amusing way of saying that actors are not notable for searing intelligence. Their intelligence and much else about famous movie actors was nicely kept under cover during the years, from the 1930s through the early 1960s, of the studio system in Hollywood. The men who ran the great studios—MGM, Fox, Warner Bros., Paramount—knew that the people went to the movies above all to see their favorite actors, and so the actors had to be protected from showing themselves the coarse, ignorant, foolish beings they often were. The studio bosses did this by controlling the interviews their actors gave, restraining them from making political statements, hiding anything peculiar about their sex lives. Actors were where the money was, the vehicles in which the movie business drove all the way to the bank.

One reads about the off-screen lives of actors at the peril of never again being able to enjoy in quite the same innocent way the movies they made.

. . . .

I began Scott Eyman’s biography of Cary Grant with some trepidation. In his movies Cary Grant was the embodiment of suavity, the master of savoir faire, elegant, witty, in every way winning. He was dazzlingly but somehow inoffensively (to men) handsome, for in most of his movies he won over women not by his good looks but by his bumbling yet invincible charm. Would Cary Grant, too, in so-called real life, turn out to be a jerk, a creep, a monster, another disappointment? I, for one, distinctly preferred not.

Cary Grant was born Archibald Alexander Leach in 1904 in Bristol, England, to an alcoholic working-class father (he was a tailor’s presser) and a mother who spent more than 20 years in a mental institution. In Mr. Eyman’s account, Grant, an only child largely ignored by his parents, “would spend the rest of his life coping with the damage inflicted on him during these years,” harassed all his days by unreasonable fear and uncertainty.

The young Archie Leach left school at 14—actually, he was kicked out—and found succor in Bristol’s music halls, the English version of our vaudeville, with a touch of bawdiness added. He soon acquired low-level work among some of the performers and not long after joined a troupe of tumblers, with whom he did acrobatics, stilts-walking and pantomime. The troupe traveled to America, where it played second- and third-line theaters, and when it returned to England the young Archie Leach chose not to return with it.

He found a place acting in B-minus movies in New York, then traveled out to Hollywood, where he gradually found parts in better movies. In 1931 he had his name changed to Cary Grant—or, as Mr. Eyman puts it, “the matchless specimen of masculine charm known as Cary Grant.” A friend of Grant’s once told him, “I always wanted to be Cary Grant,” to which he replied, “So did I.” The subtitle of “Cary Grant” is “A Brilliant Disguise.”

. . . .

What was disguised underneath Grant’s nonchalant aristocratic facade, according to Mr. Eyman, “was a personality of nearly perpetual anxiety.” Grant was a man who had no fewer than five marriages (he remarked late in life that he was a better judge of scripts than wives), spent much of his life in therapy, once attempted suicide, and claimed LSD (which he had taken under supervision more than 100 times) to be a wonder drug that quieted the rumblings in his soul and becalmed him by revealing his true self to him.

Whatever the rich complications in his personal life, Cary Grant was never less than keen about cultivating his professional life. He was sedulous about his personal appearance. He worked daily on his perfect tan. His clothes were, beyond impeccable, perfection. Never rumpled, even when chased by an airplane through a farm field or climbing Mount Rushmore, he was often on Ten Best-Dressed Men lists, and the other nine men, whoever they were, must all have felt themselves more than a touch shabby compared with him. “I consider him not only the most beautiful but the most beautifully dressed man in the world,” said Edith Head, the fabled Hollywood costume designer.

Over his 40-year career, Grant made 73 movies. 

. . . .

Romantic comedy was Cary Grant’s specialty. “Grant was to romantic comedy,” Mr. Eyman writes, “what Fred Astaire was to dance—he made something extremely difficult look easy.” Grant recognized that the key to comedy was in timing, and his own timing, first learned on the English music-hall stage, was consummate. He knew his strengths and limitations and kept his ambition in bounds. William Wilkerson III, son of the founder of the Hollywood Reporter, noted that Grant “was one of the few English actors who had no desire to play Shakespeare.” He avoided glum parts generally, sensing, correctly, that movie audiences had no interest in seeing him, in a wife-beater undershirt, screaming “Stella!”

Grant understood that a key to success for an actor in Hollywood was to work with the best directors. For the most part, he was able to arrange to do so. He worked in films directed by Leo McCarey, Howard Hawks, George Stevens, George Cukor and Alfred Hitchcock. Given his popularity at the box office, he had, as Mr. Eyman writes, “first crack at nearly every script that didn’t involve a cattle drive or space aliens.”

Equally careful about female co-stars, Grant played in movies with Katharine Hepburn, Irene Dunne, Audrey Hepburn, Grace Kelly and Ingrid Bergman. He especially admired Bergman. “Grant found that he liked Ingrid Bergman a great deal,” Mr. Eyman notes. “She was beautiful, but lots of actresses are beautiful. What made Bergman special was her indifference to her looks, her clothes, to everything except her art.” With Bergman he made “Notorious,” “the high-water mark,” according to Mr. Eyman, “of the Hitchcock-Grant collaborations.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Hellacious California!

From The Los Angeles Review of Books:

Nineteenth-century American critic Hinton Rowan Helper left a lasting impression on how Californian culture is viewed to this day through one mordant comment:

I will say, that I have seen purer liquors, better segars [cigars], finer tobacco, truer guns and pistols, larger dirks and bowie knives, and prettier courtezans here, than in any other place I have ever visited; and it is my unbiased opinion that California can and does furnish the best bad things that are obtainable in America.

Gary Noy draws on Helper’s gleeful sentiment for the title of his book Hellacious California!: Tales of Rascality! Revelry! Dissipation! and Depravity! and the Birth of the Golden State, sharing the view that California’s origin story is a combination of greatness and immorality. The book teems with bittersweet compounds of 19th-century nefariousness, including — but not limited to — gambling, knife fights, the demon drink, con artistry, and prostitution.

. . . .

Gracious dining and gluttony were also at their peak right after the Civil War, with residents bingeing on jackass rabbit and codfish. I respect Noy’s ability to evenly weigh the temptations of the era. There are the “bad things” that affect the self (e.g., demon drink, gambling, tobacco), and those that affect others (e.g., divorce, knife fights, sex slavery). There is heavy coverage of Old California’s calls for political change and the depth of unhappiness with elected leadership. Most social issues stemmed from political corruption, especially corruption brought by the railroads. Nineteenth-century state government was also not big on quality law enforcement. Instead, local San Francisco citizens formed their own vigilance committees, and miners and townspeople created their own forms of justice.

In the same way that many civilians helped one another, others tried to harm each other. Many people belonging to the lower-class scammed and tried to “eat the rich.” Wealthy individuals spent incredible amounts of money on luxurious things they did not necessarily value. Arabella Huntington, widow of Central Pacific Railroad founder Collis P. Huntington, stepped into her carriage after attending an art gallery. Soon after, a gallery employee chased the carriage to let Mrs. Huntington know she had forgotten her handbag, which contained “eleven pearl necklaces valued at more than $3.5 million, the equivalent of $108 million today.”

Link to the rest at The Los Angeles Review of Books

Having lived in California a long time ago and having close relatives and friends who still live there, PG can assure one and all that the California you will find today has changed from the California described in the OP.

In some respects.

And in some places.

California was and is a big place with lots of variations in climate, people and cultures.

San Francisco is not Fresno. Los Angeles is not Barstow. Quite a number of residents of each of these four cities are vociferously happy that they don’t live in one of the other three cities mentioned.

California includes both Hollywood and Death Valley (parts of which are shared with Nevada).

In the last half of the 19th century, a great portion of California qualified as nearly or completely uninhabited mountains and deserts that would have been described as useless and dangerous wastelands at the time. If California felt too settled, you could always go east to Nevada (which has places a bit more welcoming than Death Valley) or Arizona for more alone time.

The first transcontinental railroad was started in 1863, while the Civil War was still being fought, beginning in Council Bluffs, Iowa, and ending in Oakland, California, in 1869.

Prior to that time, if you wished to travel from one coast of the United States to the other, you either took a miserable, long, dirty and dangerous horse-powered trip across the United States or, if you had more money, you took a ship that landed in either Nicaragua or Panama, crossed one of those countries on foot or by horse, hoping to avoid catching any tropical diseases, then boarded a ship on the other side and completed your journey to the opposite coast of the US.

Either the land or the sea route included significant dangers to life and/or health.

Some people became very rich in both the East and the West from their involvement in building the railroad. Others didn’t.

Some people in the East and West got very rich by financing the construction of the railroad and others lost their shirts, banks and fortunes.

Most US government politicians and employees received bribes for their services in picking the route and funding the construction of the railroad. State politicians sometimes participated in the bribing and at other times collected bribes. There were competing bribers who promoted one route over another because they owned a lot of land on one prospective route or another.

All this is to say that California, its residents and elected officials participated in the disorganized and corrupt parts of building the railroad, but residents and elected officials in other parts of the country did the same.

California residents and residents of other states also organized and performed the incredible engineering and construction feats necessary to build a railroad across vast uninhabited deserts and high, little-known mountains.

Imported Chinese laborers were also essential to the construction. During the crossing of the Sierra Nevada mountains, some parts of which were snow-covered all year and others snow-covered much of the year, some of the Chinese dug tunnels into the deep snow along the route and built snow caverns in which to eat and sleep under the snow to avoid the freezing winds that blew almost constantly. Such shelter was necessary for their survival because death by freezing was a real danger to workers of all nationalities.

For visitors to The Passive Voice from outside the United States, the transcontinental railroad was a bit over 1,900 miles (over 3,000 km), longer than the distance from London to Moscow. (Yes, the Trans-Siberian Railway is longer.)

The book that describes this great effort that PG read a few months ago and greatly enjoyed is Nothing Like It In the World: The Men Who Built the Transcontinental Railroad 1863-1869 by Stephen E. Ambrose. If you’re interested in more detail, PG highly recommends this book.

The switch to coal changed everything in Britain

From The Wall Street Journal:

The grimy furnaces and coal-stained cheeks of Dickensian Britain seem like an indelible birthright, but it wasn’t always so. As Ruth Goodman writes in “The Domestic Revolution,” Britons had for ages burned wood as well as peat and other plant fuels to heat their homes and cook their food. Then, in the late 16th century, London switched to coal.

This revolutionary change was carried out by ordinary families, the “ ‘hidden people’ of history,” as Ms. Goodman calls them. They switched to coal for the most prosaic of reasons—personal comfort, convenience, a small savings.

Yet the “big switch” set in motion a series of large transformations. Thousands of Britons found new work as miners and as merchant seafarers. The island’s fabled heathland, site of all those chest-throbbing novels, faded and disappeared as woodland, no longer needed for fuel, was given over to agriculture. To vent sulfurous coal fumes, chimneys sprouted all over London, prompting homeowners to build more spacious layouts and second and third stories.

Since coal fires required a different sort of cookware, investment poured into brass and iron, hastening the development of pig iron—hastening, that is, the onset of the Industrial Revolution. Wall tapestries came down (in a coal-fired home, they quickly stained) and were replaced by smoother, washable surfaces and paint. There was a bull market in soap.

Not least, British cooking, which Ms. Goodman stoutly defends, was forced to adapt. Stirring a pot precariously dangled over a row of coals was difficult. Thick, starchy fare gave way to boiled puddings and kidney pies, which the author forgivingly describes as “democratic.” Thanks to the pleasing effects of roasting on an open grate, Ms. Goodman maintains, coal even led to the “modern British love affair” with toast. The new energy source touched every corner of life.

. . . .

Whatever the causes, the changeover to coal happened quickly. When Elizabeth ascended the throne, in 1558, London homes burned wood. A generation later, the increasingly crowded city was importing 27,000 metric tons of coal per year. By roughly the time of Elizabeth’s death, in 1603, imports had soared to 144,000 tons.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Lover, Mother, Soviet Spy

From The Wall Street Journal:

Though little known to the broader world, Ursula Kuczynski, born in 1907 into a wealthy, cultured Jewish family in Berlin, was one of the most successful spies of the 20th century. She worked for the GRU—Soviet Military Intelligence—in China, Eastern Europe, Switzerland and, most damagingly, Britain. Beginning in the early 1930s as the provider of a safe house for spies to meet, she was soon trained as a radio operator, courier and liaison between communist underground activists and Moscow. Eventually she ran her own network of Soviet spies in Nazi Germany and, after war broke out, in Britain. There she served as a courier for Klaus Fuchs, the émigré physicist who later worked at Los Alamos and betrayed its secrets to the Soviet Union.

Kuczynski—whose code name was “Sonya”—never endured prison or torture or the other gruesome fates that befell many of her comrades, though she was pursued by the security services of the Chinese Nationalists, the Japanese, the Nazis and various British governments. She spent most of her 20-odd years as a spy living in fairly comfortable surroundings with her children, posing as a middle-class foreigner and friendly neighbor. Only when Fuchs himself was arrested in 1950, and she faced the possibility of exposure and arrest, did she board a plane to return to East Germany. She then transformed herself into a novelist and published, under a pseudonym, an autobiography that resulted in a triumphant book tour in Britain, the country that had given her refuge and that she had betrayed.

With “Agent Sonya,” Ben Macintyre, the author of several popular works about espionage, has written a lively account of Kuczynski’s remarkable career. He has been aided by the cooperation of her family and by his research in the British and (in a limited way) Russian archives. Inevitably, as the reader should keep in mind, much of Kuczynski’s life is filtered through her autobiography, which was written in East Germany under the scrutiny of censors by a woman whose survival depended on lying about many of her activities. Her account of Stalin’s purges of the GRU, for example, is limited to the statement that, “unfortunately, comrades in leading positions changed frequently at that time.” While government files and private letters offer a partial reality check, the GRU archives remain inaccessible, denying us the best source for judging just how far-reaching her career was.

Kuczynski was an early rebel, Mr. Macintyre tells us, participating in communist demonstrations in Berlin at age 17. Her father, a demographer, and her brother, an economist, had connections to many government officials throughout Europe and the United States, and both later fed her information for transmission to the U.S.S.R. Jurgen, her brother, led the underground Communist Party in Britain during World War II and was the first to put her in touch with Fuchs.

Her entry into espionage came in Shanghai, where she was living in 1930 with her husband, Rudi Hamburger, an architect. Appalled by the poverty and brutality of the city, and repulsed by the racism and luxurious lifestyle of the Western community there, she was recruited by Agnes Smedley, the American journalist, and Richard Sorge, the legendary Soviet spy. Kuczynski and Sorge (a compulsive womanizer) ended up having a passionate affair. Mr. Macintyre observes that she was “intoxicated by the thrill of her own destiny, the entwining of danger and domesticity, living one life in public and another in deepest secrecy.”

. . . .

Fearing deportation from Switzerland to Germany in 1940, she concluded that marriage to a British citizen would enable her to obtain a British passport. She arranged to divorce Rudi and married Len Beurton, a veteran of the International Brigades in Spain.

One of the great mysteries of Kuczynski’s career is how she managed to avoid detection by British authorities. 

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Smith lands Instagrammer’s guide to planning for Ebury

From The Bookseller:

Ebury Press editorial director Emma Smith has acquired Happy Planning, a “practical guide for those who like to prep” from Charlotte Plain, a.k.a. Instagrammer Princess Planning.

Smith bought world all-language rights to the title directly from the author.

Plain is the person behind the Instagram account and website Princess Planning, where she sells diaries, planners and stationery that aim to help organise and inspire positivity. Happy Planning will give readers the tools they need to plan every aspect of their life, from the weekly shop and daily meal prep to big occasions like weddings, parties and holidays.

The publisher explains: “Planning is about taking away last-minute panic pressure, gaining control and helping you to be the best version of yourself. Charlotte’s everyday approach has been so successful that she launched a business off the back of it, and is now sharing all of her practical and positive know-how in this book. As well as her planning mantras and toolkit, each section of the book is dedicated to an area of life that benefits from planning and is packed with personal learning experiences, planning methods, tips and tricks, practical guidance and interactive elements. It’s simple, positive and practical planning that will lead to a healthier happier you.”

Smith added: “We all need a good dose of practical positive planning in our lives (now, more than ever), so we are incredibly excited to be publishing an Instagram star”

Link to the rest at The Bookseller

PG couldn’t resist visiting Princess Planning Ltd. on Instagram (192K followers). He found several Instagram star gems:

Weight loss is never just about losing the weight.

You have to lose the habits that got you there in the first place and replace them with better ones

Princess.Planning Ltd.

if 2020 was a chocolate it would be a turkish delight

Princess.Planning Ltd.

What would we do without traditional publishers to act as curators of culture?

‘Not Made by Slaves’ Review: Marketing to Abolitionists

From The Wall Street Journal:

As Ecclesiastes reminds us, there is nothing new under the sun. Demands for institutional divestiture of morally suspect assets, boycotts of goods from Goya beans to “blood diamonds,” and movements to promote ethically sourced consumer products from dolphin-safe tuna to fair-trade coffee, all have a long pedigree. Among the earliest expressions of American national identity was the 1765 nonimportation agreement among Boston merchants in response to British revenue acts, supported by boycotts of British goods by local households. And, starting in the late 18th century, antislavery activists in the Atlantic world urged consumers to refuse to buy products made with slave labor—primarily cotton cloth and sugar—to hasten the end of the international slave trade and ultimately slavery itself. Conscientious consumers flocked to free-labor goods and to such virtue-signaling items as emblazoned sugar bowls assuring guests that the content was “East India Sugar not made by Slaves.”

In “Not Made by Slaves,” Bronwen Everill, a lecturer in history at the University of Cambridge, terms this movement “ethical capitalism” and places it in the context of the 18th-century global consumer revolution that put luxury goods in the hands of the many. The book offers an important contribution by emphasizing West Africa’s role in the trade network that linked producers, merchants and consumers around the world. Just as Western economies traded for tropical luxuries such as tea, coffee and sugar, sophisticated African markets exchanged a highly valued commodity—unfree labor—for French wines, East Indian cottons and British firearms. It was this complex global trading community that the abolitionists sought to reform and that they disrupted in ways both foreseen and unforeseen.

Ms. Everill’s account rests on a chain of related events. In the late 1700s, opposition to the slave trade grew as the world came to appreciate the horrors of the Middle Passage between Africa and America. In their efforts to suppress the trade, antislavery activists asked Atlantic consumers to boycott goods associated with slave labor. But for the boycott to be effective, consumers needed to be certain the goods they bought were ethically produced. Antislavery traders thus began to source free-labor goods and, in early branding efforts, to identify them with labels such as “made by escaped slaves,” spawning an ethical-goods economy. In a parallel development, West African Islamic jihadists attacked local consumption of luxury goods and the international slave trade that supported those tastes, eventually banning the sale of slaves to non-Islamic traders and organizing boycotts of European goods such as tobacco and alcohol.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

When the Revolution Left Kate Millett Behind

From Public Books:

In March 1979, the white American feminist Kate Millett landed in Tehran, in the wake of one of the most significant revolutions of the 20th century. Just weeks earlier, the Shah—the monarch of Iran—had been overthrown. Millett arrived with a suitcase of recording equipment and her partner, filmmaker Sophie Kier. While there, Millett methodically recorded her whispered reflections on everything around her: the cups of tea with her hosts, the hours stuck in traffic, and the International Women’s Day celebration, which exploded into major protests against Ayatollah Khomeini’s new mandatory veiling laws.

Millett’s whispers were the raw material for her own Going to Iran (1982), but they have been newly transcribed and examined by Negar Mottahedeh in her new book, Whisper Tapes: Kate Millett in Iran. With the same recordings, Mottahedeh does something in Whisper Tapes that Millett never could. She listens closely to the women speaking, yelling, and demonstrating in Farsi around Millett, centering their voices in a radically new and vital account of the revolution.

By exploring the complexities of what Millett couldn’t hear, Whisper Tapes also reveals the narrowness of her white feminism and her lack of reciprocity. Yet, there is no need to “cancel” Kate Millett (who profoundly contributed to both feminist and literary theory, not least in her pathbreaking 1970 work, Sexual Politics). Instead, it is necessary to explore her particular brand of white, Western feminism critically, asking what Millett’s brief time in Iran might offer contemporary understandings of feminist solidarity.

. . . .

This paradox—between the book’s centering and decentering of its subject—mirrors a wider paradox: the tension between the alleged universalism of Millett’s feminism and the increasingly particular way in which she pronounces it. We might read Millett’s paradox against and alongside a revolutionary slogan pulsing throughout Mottahedeh’s book—one that Millett only “provisionally understood”: “Azadi, na sharghist, na gharbist, jahanist,” or “Freedom is neither eastern nor western, it is planetary.”

. . . .

Mottahedeh identifies the conceptual problem of misunderstanding—of a white ally who just doesn’t get it—largely as a problem of mistranslation. In an elegant anecdote, Millett is given some chaghaleh badoom, which she says on the tape are “beans,” while the women around her suggest—in English—that they are “walnuts.” In fact, neither translation is adequate, Mottahedeh tells us; the best approximation is “green, unripe almonds.”

A thirsty Millett struggles to comprehend going “through a whole revolution and not being able to have a glass of wine after it’s all over?” She declares Iran “joyless,” not understanding that there are so many versions of the good life, so many ways to be joyful. Millett can’t quite grasp why the revolution happened alongside, and so also included, men. “It’s important to ignore men,” she advises a demonstrator. “He is never gonna listen. Why waste your time?”

Millett’s unfamiliarity with Iran allows us, with the benefit of hindsight, to laugh at her presence as an awkward white woman. But this is not really the problem with Millett’s white feminism. What white feminism means, at least in the context of Whisper Tapes, is that Millett considers patriarchy to be the primary organizing structure in women’s lives, globally. This is despite the interactions she has with women who explain otherwise.

Millett reads Iranian women’s heterogeneous experiences of religion, demonstration, and revolution through this lens, and only this lens. It is this focus on patriarchy that allows her to quickly diagnose the women of Iran as being behind white American women on the path of liberation; the path that she herself, through Sexual Politics and her work in the women’s liberation movement, helped to pave. Millett’s white feminism means that she applies the logic and schedule of US women’s liberation to the Iranian revolutionary moment.

Mottahedeh’s careful treatment of Millett reveals that “white feminism” is not just a scolding charge. Instead, Millett’s white feminism is a generative and persistent world view that creates particular behaviors, blinkers, and blinds, while simultaneously proclaiming to be a universalist politics that speaks for all women. It means that Millett’s “ambitions and preoccupations are elsewhere.” She is always waiting for the moment of a radical global women’s uprising. She is “out of sync with what is right in front of her,” be it green-shelled, unripe almonds in their crinkled paper bag, or men’s crucial place alongside women in the ongoing Iranian revolution.

Kate Millett certainly does not understand that she is imposing a presumed universality steeped in the specificity of the American context. Indeed, this is just one of the things that she does not get.

Link to the rest at Public Books

No Time But the Present

From Harper’s Magazine:

From Breaking Bread with the Dead: A Reader’s Guide to a More Tranquil Mind, which was published last month by Penguin Press.

Navigating life in the internet age is a lot like doing battlefield triage. There are days we can’t even put gas in our cars without being assaulted by advertisements blared at ear-rattling volume. And so we learn to be ruthless in deciding how to deploy our attention. We only have so much of it, and often the decision of whether or not to “pay” it must be made in an instant. To avoid madness we must learn to reject appeals for our time, and reject them without hesitation or pity.

Add to this problem of information overload what the sociologist Hartmut Rosa calls “social acceleration,” the widespread belief that “the ‘tempo of life’ has increased, and with it stress, hecticness, and lack of time.” Rosa points out that our everyday experience of this acceleration has a weirdly contradictory character. On the one hand, we feel that everything is moving so fast, but we simultaneously feel trapped in our social structures and patterns of life, imprisoned, deprived of meaningful choice. Think of the college student who takes classes to prepare for a job that might not exist in a decade. To her, there doesn’t seem any escaping the need for professional self-presentation; but there also doesn’t seem to be any reliable means of knowing what form that self-presentation should take. You can’t stop playing the game, but its rules keep changing. There’s no time to think about anything other than the Now, and the not-Now increasingly takes on the character of an unwelcome and, in its otherness, even befouling imposition.

William James famously commented that “the baby, assailed by eyes, ears, nose, skin, and entrails at once, feels it all as one great blooming, buzzing confusion.” But this is the experience of everyone whose temporal bandwidth is narrowed to this instant.

What do I mean by “temporal bandwidth”? I take that phrase from one of the most infuriatingly complex and inaccessible twentieth-century novels, Thomas Pynchon’s Gravity’s Rainbow. Fortunately, you don’t have to read the novel to grasp the essential point that one of its characters makes:

“Temporal bandwidth” is the width of your present, your now. . . . The more you dwell in the past and in the future, the thicker your bandwidth, the more solid your persona. But the narrower your sense of Now, the more tenuous you are. It may get to where you’re having trouble remembering what you were doing five minutes ago.

Increasing our temporal bandwidth helps us address the condition of frenetic standstill by slowing us down and at the same time giving us more freedom of movement. It is a balm for agitated souls.

Link to the rest at Harper’s Magazine

The Innovation Delusion

From The Wall Street Journal:

“Do you ever get the feeling that everyone around you worships the wrong gods?” So ask Lee Vinsel and Andrew L. Russell in the first pages of “The Innovation Delusion.” They are consumed by this question, convinced that America has been seduced by the false charms of innovation, causing us to chase novelty and pursue disruption while neglecting maintenance and infrastructure in both the public and private sectors. We end up discounting the value of “the ordinary work that keeps our world going.” Anyone compelled to “ideate” at a corporate breakout session can surely relate.

Agitated by Walter Isaacson’s triumphalist portraits in “The Innovators” (2014), Messrs. Vinsel and Russell, scholars of the history of technology, became increasingly troubled by what they saw as a broad cultural emphasis on “the shiny and new.” They started to wonder why no one ever celebrates the “bureaucrats, standards engineers, and introverts” who manage to keep established systems running smoothly. We live in an inverted world, they say, where “our society’s charlatans have been cast as its heroes, and the real heroes have been forgotten.”

In this dystopian view, we’ve mistaken novelty for progress and, in the desperate pursuit of growth, confused true innovation—creating things that work—with fraudulent “innovation-speak.” The result is, as the authors put it, an “unholy marriage of Silicon Valley’s conceit with the worst of Wall Street’s sociopathy.” Champions of change—like the late Harvard professor and father of disruptive innovation, Clay Christensen, and the influential thinkers at IDEO, the Palo Alto, Calif., design firm—have garnered hefty consultant fees while offering, the authors contend, little of true substance in return. Despite the frenetic pursuit of innovation stoked by the fear of missing out, “we should resist the notion that anyone on this planet knows how to increase the rate and quality of innovation.”

Privileging innovation, the authors note, costs us all. Localities find it far easier to attract federal funding for new infrastructure projects than to secure support for maintaining what already exists. And the funding for new development typically comes without the resources for downstream maintenance, saddling municipalities with unmanageable future obligations. Better for communities first to fix what’s broken, Messrs. Vinsel and Russell argue, and practice preventive maintenance. In any case, resources should be focused on what matters: Transit riders, one survey revealed, care most about service frequency and travel time, not power outlets and Wi-Fi.

The authors’ most emphatic recommendations involve talent—and our perception of it. When we overvalue innovation, they say, we forget that the vast majority of engineers will wind up maintaining existing systems, not coming up with the next Facebook. While we revere and reward data scientists and algorithm developers, we overlook the humble IT workers who keep our networks humming. Many students who might find “more joy, meaning, and pleasure” working in maintenance roles are shunted toward innovation careers sure to make them miserable. A rebalancing of our priorities is in order, Messrs. Vinsel and Russell contend.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Nonfiction, and the Past as ‘a Foreign Country’

From Publishing Perspectives:

Nonfiction publishing is often perceived by the outside world as somehow more predictable and less risky than fiction.

It’s certainly the case that betting on untried novelists is very high-risk. It’s even the case that lashing out large advances on established writers carries a degree of risk, should the writer have a declining fan base or say something to upset a special-interest group.

While some areas of nonfiction appear to be risk-free, it’s remarkable how often that’s only an appearance. The reality is that nonfiction is just as risky as fiction, just as hard to predict, and just as affected by vogue, trends, and demographics.

Per the opening line of LP Hartley’s 1953 The Go-Between, “The past is a foreign country: they do things differently there.” But perhaps we can learn something from the past while we look to the future and try to discern how the publishing landscape might be once the clouds of the coronavirus COVID-19 pandemic have cleared.

. . . .

The first lesson comes from two of my ancestors.

Garment manufacturing in Clapham Square, London, 1910. Image by Toby Charkin, provided by Richard Charkin

One owned a wedding dress factory in Margaret Street, which was then the clothing district of London. The other oversaw the manufacture of underwear and dresses in Clapham Square for the city’s burgeoning middle class. The business principle that guided them both was, “The one who makes the most money is the one who is first out of a fashion craze—not the first in.”

The same might be true in nonfiction publishing.

. . . .

The first book craze I can remember was for illustrated nostalgia. Edith Holden’s The Country Diary of an Edwardian Lady . . . was the front-runner, followed by Flora Thompson’s Lark Rise to Candleford. They sold in their hundreds of thousands by satisfying a desire to enjoy what appeared to be a simpler and more perfect world.

Of course, publishers being the owners of excellent rear-view mirrors, they piled into the genre until the profit was eviscerated.

And there was the computer books boom of the 1980s: Fortran, Cobol, and all that.

. . . .

Right now, there seems to be an unquenchable thirst in the English-speaking world for books on politics, political scandal, and racial inequality.

. . . .

The overdue realization that health really matters will ensure that governments will prioritize the funding of primary and secondary health care, public health, and communication with the general public.

I suspect that there will be growing demand in digital and print for reliable and comprehensible information, formerly known as popular medicine.

In addition, we’ll see renewed research activity into all aspects of infectious disease and epidemiology with consequent growth in high-level open-access research and review publications. The distinction between general public, professional, and research information about health will narrow as more people want to know more about their own health—and as more professionals understand the need to communicate outside their own specialist communities.

. . . .

In this age of uncertainty, people will turn to books about happiness, de-stressing, self-awareness, empathy, and human interaction. “Mind, body, spirit” will emerge as a major genre, challenging even the dominance in British bookshops of celebrity-led cookery books.

Link to the rest at Publishing Perspectives

‘Twilight of the Gods’ Review: A Blood-Soaked Peace

From The Wall Street Journal:

A tale-telling axiom holds that complex narratives—whether from a writer’s quill, the pulpit or a Hollywood storyboard—are best broken into threes. From Sophocles to Coppola, the trilogy has thrived as a means to carve an enormous meal into manageable courses.

World War II, history’s most complex bloodbath, often seems to require such treatment, and over the decades the war’s two billion individual stories have been compiled into dozens of memorable (and not-so-memorable) three-volume sets. The best known of recent threepeats is Rick Atkinson’s “Liberation Trilogy,” a brilliant study of the U.S. Army in Europe and the Mediterranean. James Holland (“Normandy ’44”) has released two of his three volumes on the Anglo-American war against Germany, and for the hard-core history geek, David M. Glantz offers a three-part deep dive into the Stalingrad campaign. Novelist James Jones (“From Here to Eternity”) drew the Pacific War’s thin red line through three volumes, while respected historian Richard B. Frank (“Tower of Skulls”) recently launched his first of three volumes on the Asian-Pacific struggle. It remains to be seen whether Mr. Frank’s series will rise to the level of Ian W. Toll’s Pacific War trilogy, now capped by “Twilight of the Gods.”

Mr. Toll, who has spent his literary career chronicling the U.S. Navy, built a solid foundation for the war’s final act in the first two volumes. The opening work, “Pacific Crucible” (2011), spanned the Navy’s disaster at Pearl Harbor to its redemption at Midway. The second installment, “The Conquering Tide” (2015), spotlighted America’s hard-won education in amphibious landings, from the six-month charnel house of Guadalcanal to the red-tinged tides of Guam. In “Twilight of the Gods,” he carries the reader through the war’s violent death rattles, spanning Peleliu to Okinawa.

The Pacific War’s complexity—and brutality—resist detailed depiction. The 8,800-mile American odyssey from Pearl Harbor to Tokyo Bay was dominated by saltwater, airstrips and islands few had heard of before 1941. Chinese, Dutch, Australians, Indians, Filipinos, British, Burmese and New Zealanders played major supporting roles in a conflict we often think of today as “U.S. versus Japan.” Setting the table of personalities, objectives, resources and innovative weapons systems is an immense job for any historian.

. . . .

As his narrative rolls through the Philippine Sea, Peleliu, the Philippine islands, Luzon, Iwo Jima and Okinawa, Mr. Toll introduces the reader to America’s battle captains of the waves. Adm. Raymond Spruance, commander of the Fifth Fleet, was an eccentric thinker who delegated nearly everything to his subordinates. “Spruance did not fit the conventional mold of a wartime fleet commander,” Mr. Toll writes. “He was aloof, introverted, and monkish. . . . On an average day at sea, Spruance paced for three to four hours around the forecastle of the Indianapolis while dressed in a garish Hawaiian floral-print bathing suit, no shirt, white socks, and his regulation black leather shoes.” Yet, he continues, Spruance’s “insistence upon delegating authority down the line of command tended to bring out the best in subordinates.” Because Spruance’s résumé included spectacular victories at Midway and the Philippine Sea, Roosevelt would tolerate eccentricities.

Third Fleet’s Adm. William Halsey, nicknamed “Bull” by the press, jumps off the pages as an instantly likeable, Pattonesque leader whose reputation was cemented with his victory at Leyte Gulf, one of the largest naval battles in history. “He was a profane, rowdy, fun-loving four-star admiral who laughed at jokes at his own expense and fired provocative verbal salvos against the enemy.” His rapport with the press would yank him out of trouble on more than one occasion and propel him to the rank of five-star fleet admiral.

. . . .

Mr. Toll’s interest in the evolution of weaponry dots the pages of “Twilight of the Gods.” The big Essex-class aircraft carriers and their unruly children, Hellcat fighter-bombers, play critical roles, as does the ultimate piece of the war’s power game: the atomic bomb. Doppler radars, proximity fuses, air-dropped mines and napalm raise the curtain on modern warfare. Carrier combat, no longer the “whites of the eyes” affair of 1941, morphed into a long-range campaign in which, Mr. Toll notes, “often the crews of the ships did not even lay eyes on a hostile plane.”

Yet on the ground Marines laid eyes on many enemies, human and natural. On Peleliu, a wasteland Mr. Toll compares to J.R.R. Tolkien’s Mordor, “clouds of large greenish-blue flies fed off the unburied dead and tormented the living. Sudden torrential rainstorms came in the late afternoon, and sometimes at night. There was no escape from the relentless artillery and mortar barrages.” Worse horrors faced the doomed enemy: “When the guns paused, the marines could hear wounded and dying Japanese crying out in the night. Often they cried for their mothers, as did dying men of all races.” Of the cave-dwelling Japanese defenders of Iwo Jima, he writes: “The noise and blast concussions took a steady toll on their nerves, and many were reduced to a catatonic stupor. Their subterranean world grew steadily more fetid and unlivable. There was no way to bury the dead, so the living simply laid them out on the ground and stepped around them. The stench was unspeakable.”

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

World War II in the Pacific theater covered almost unimaginable distances, particularly for the world of the early and mid-1940s.

The OP mentions the distance from Pearl Harbor, Hawaii, to Tokyo as 8,800 miles (not in a straight line, presumably following the sequence of major battle locations). For comparison’s sake, the distance from London to Moscow is approximately 1,500 air miles. London to Shanghai via a direct flight is approximately 5,700 air miles. New York to Tokyo via a direct flight is approximately 6,700 air miles.

All through World War II, the only way to transport large numbers of soldiers or any significant amount of military equipment from the US to the site where needed was via ship. Most of the ships used for this purpose were a variant of the Liberty ship.

A Liberty ship cruised at about 11 knots (a bit less than 13 miles per hour). The distance between San Francisco and Honolulu is about 2,400 miles. The trip took more than a week via Liberty ship. The distance between Honolulu and Manila is about 5,300 miles. That trip took about 2.5 weeks. The potential for enemy submarine or air attacks that required evasive maneuvers added even more time.
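For readers who like to check the arithmetic, here is a quick back-of-envelope sketch in Python. It assumes a constant 11-knot cruising speed with no evasive zigzagging, and treats the quoted distances as statute miles; actual wartime passages would have been slower.

```python
# Rough transit-time estimates for a Liberty ship cruising at 11 knots.
KNOT_TO_MPH = 1.15078  # 1 knot = 1.15078 statute miles per hour

def transit_days(distance_miles, speed_knots=11):
    """Days needed to cover distance_miles at a constant cruising speed."""
    speed_mph = speed_knots * KNOT_TO_MPH
    return distance_miles / speed_mph / 24

# San Francisco to Honolulu, about 2,400 miles: roughly 7.9 days,
# matching "more than a week" in the text.
print(round(transit_days(2400), 1))

# Honolulu to Manila, about 5,300 miles: roughly 17.4 days,
# matching "about 2.5 weeks."
print(round(transit_days(5300) / 7, 1))
```

At 11 knots (about 12.7 mph), the numbers PG quotes work out almost exactly.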

The same distances were also covered while delivering massive amounts of military equipment, supplies, ammunition, food, etc., etc., etc., necessary for the Marines, troops and sailors to live and wage war on the ground, at sea and in the air.

Since virtually all of the Pacific war was waged from island to island, everything and everyone had to be put back on ships, taken to the location for the next battle and unloaded once again. This process was repeated many times.

The Conspiracy on Pushkin Street: The Costs of Humor in the USSR

From The Los Angeles Review of Books:

“IF THEY COME for me, I won’t give you up. I won’t tell them what happened in this room.” Vasily Babansky let out a sigh and locked eyes with the four young men around him. It was February 1940 and 18-year-old Vasily had become increasingly sure that the NKVD was closing in on him.

The silence hung thickly in the air, so at odds with the laughter they usually shared here. The five students were gathered together in their usual haunt — one of the dormitories at the Zoological Institute in Stavropolsky District, southwestern Russia. The door was locked, as it always was when they wanted to speak freely, but now the bolt seemed woefully inadequate. If the NKVD was coming for them, all they could rely on was silence and their loyalty to one another.

Silence would be a problem, though. They’d have to tell the NKVD something if they were arrested; Stalin’s secret police didn’t take “No” for an answer. Aleksandr Mitrofanov proposed they should tell the truth, but not the whole truth — they would come clean about anything they’d said or done in front of witnesses at the Institute, “but keep quiet about what went on in our room,” recalled another of the students, Pavel Gubanov.

They all solemnly agreed, and then Mitrofanov rushed off to find the poem he’d written criticizing the Soviet regime. He was proud of his work, and the group had hoped to make anonymous copies and spread them across campus. Instead, after relocking the door behind him, he would ritualistically read the poem aloud one last time to his comrades, then set the paper alight and watch the flames consume his words.

It would be another 11 months before the NKVD descended, but when they did, the lives of these young men would be torn apart. Despite their earnest pact not to inform on each other, in the end they had little choice. The NKVD has gone down in history for its brutality and willingness to extract confessions by any means necessary. All five would break their vow of silence as the interrogators raked through the ashes of their lives at the Institute. Aleksandr Mitrofanov, Vasily Babansky, Mikhail Penkov, and Pavel Gubanov would all be sentenced for the crime of “anti-Soviet agitation” and for being part of a “counterrevolutionary organization” that, the authorities were sure, was actively plotting the downfall of the Soviet regime. Mitrofanov and Babansky received 10 years, Penkov eight, and Gubanov seven. The fifth man, Damir Naguchev, was for some reason treated with a touch more leniency: he received “only” three years for failing to denounce his comrades.

Locked doors, burnt evidence, and a plan for resisting interrogation: at first glance, it certainly sounds like conspiracy was afoot at the Institute. But if we take a closer look at the evidence left behind in the formerly secret Soviet archives, the fate of these five teenagers reveals a very different story. A story of how, under Stalin, a poem, a few jokes, and five open minds could spell disaster.

Link to the rest at The Los Angeles Review of Books

As PG reviewed the OP, he thought of a novel he is reading that is set, in part, in the all-female 588th Night Bomber Regiment of the Soviet Red Army Air Forces. One of the characters has to flee her regiment because her father has been convicted and executed for anti-Soviet, anti-Stalin outbursts and all of his family members are to be arrested.

Back to the 588th regiment. This group, which flew all of its bombing missions at night, was called the Nachthexen, or "night witches," by its targets in the Wehrmacht because the whooshing noise its wooden planes made as they dived into their attacks resembled that of a sweeping broom. There was no other noise because the pilots were instructed to idle their engines at altitude before beginning the glide that carried their bombs down onto the German troops.

The antiquated bombers, 1920s biplane crop-dusters that had been used as trainers before being repurposed for night bombing, were effectively invisible to German radar and infrared defense systems. They were unarmored, built of plywood with canvas stretched over it, and most carried no defensive guns: machine guns and ammunition would have been too heavy to carry in addition to the single bomb attached under each wing. Parachutes were also too heavy to carry.

These planes had a top speed of 94 mph and a cruising speed of 68 mph. The most common German fighter the Night Witches faced in battle was the Messerschmitt Bf 109, which had a top speed of 385 mph. The bombers' maximum speed was slower than the German planes' stall speed, which meant, ironically, that these wooden planes could out-turn the enemy, making them hard to target.

The Night Witches continued their attacks through three winters: 1942-43, 1943-44 and 1944-45. Their planes had open cockpits and no insulation, exposing pilots and navigators to almost unimaginably bitter cold. During those Russian winters, the planes became so cold that merely touching them would rip off bare skin.

The Night Witches ended the war as the most highly decorated female unit in the Soviet Air Force.

On average, each pilot/navigator crew flew about 800 missions. For comparison, United States heavy bomber crews were obligated to fly between 25 and 35 combat missions.

(In fairness, since the Night Witches were stationed at airfields close to the front lines, the flight time for each of their missions was much shorter. Many US crews spent far more time in the air because their missions involved much longer flights to reach their targets. On the other hand, the Night Witches were under direct enemy fire far more frequently, sometimes flying eight missions in a single night.)

A Place at the Table

From Publishers Weekly:

Chef, restaurateur, and TV personality Marcus Samuelsson began working on his latest cookbook, The Rise (Voracious, Nov.), three years ago. A celebration of Black cooking, the book brings together chefs, food writers, and activists to share their stories and recipes, and emphasizes the diversity of the Black American experience. “There wouldn’t be American food without the contributions of Black people,” Samuelsson says. “[This book] is an opportunity to give authorship and recognition.”

The Rise arrives at a moment of racial reckoning in the U.S. more broadly, and in food media specifically. In May, cookbook author and Instagram star Alison Roman was placed on temporary hiatus from her New York Times column after mocking the achievements of Marie Kondo and fellow cookbook author Chrissy Teigen, both women of color. Weeks later, Adam Rapoport resigned from his position as editor-in-chief of Bon Appétit after a 2004 photo of him in brownface surfaced, which in turn opened up a public discussion about pay inequity in the magazine’s test kitchen. Subsequently, four on-screen personalities of color declined to participate in the brand’s popular video series, and the magazine’s only two Black editorial staff members quit.

“This moment is important; the world is watching,” says Samuelsson, who on August 17 was named Bon Appétit’s first brand advisor. “To be able to uplift Black stories of craftsmanship is important. I feel honored and privileged.”

Link to the rest at Publishers Weekly

The 22-Year-Old Blogger Behind Protests in Belarus

More under the category, “There are worse things than Covid.”

From The Atlantic:

In the videos posted last Sunday from Belarus, thousands of people can be seen streaming into the center of Minsk, walking up the broad avenues, gathering in a park. In smaller cities and even little towns—Brest, Gomel, Khotsimsk, Molodechno, Shklov—they are walking down main streets, meeting in squares, singing pop songs and folk songs. They are remarkably peaceful, and remarkably united. Many of them are carrying a flag, though not the country’s formal flag, the red and green flag used in the Soviet era. Instead, they carry a red-white-red striped flag, a banner first used in 1918 and long associated with Belarusian independence.

It was a marvelous feat of coordination: Just as in Hong Kong a few months ago, the crowds knew when to arrive and where to go. They knew what they were marching for: Many people carried posters with slogans like leave—directed at the Belarus dictator/president, Alexander Lukashenko—or freedom for political prisoners! or free elections! They carried the flag, or they wore red and white clothes, or they drove cars festooned with red and white balloons.

And yet, at most of these marches, few leaders were visible; no one ascended a stage or delivered a speech into a microphone. The opposition presidential candidate, Sviatlana Tsikhanouskaya, who probably won the contested election held on August 9, fled the country last week. How did everyone know exactly what to do? The answer, improbably, is a 22-year-old blogger named Stsiapan Sviatlou, who lives outside the country and runs a channel called Nexta Live on the encrypted messaging app Telegram.

On Sunday morning, Nexta—the word means “somebody”—posted a red and white invitation to the march. “Ring the doorbells of your neighbors, call your friends and relatives, write to your colleagues,” the message instructed them: “We are going EXCLUSIVELY peacefully to the center of town to hold the authorities to account.” The invitation also contained a list of demands: the immediate freeing of political prisoners, the resignation of Lukashenko, the indictment of those responsible for a shocking series of political murders.

People went to the Minsk march, and to dozens of smaller marches across the country, because they saw that message. On subsequent days, many went on strike because they saw another message on that channel and on channels like it. Over the past 10 days, people all across Belarus have marched, protested, carried red and white flags and banners, and gathered at factories and outside prisons because they trust what they read on Nexta. They trust Nexta even though Sviatlou is only 22 years old, even though he is an amateur blogger, and even though he is outside the country.

Or to put it more accurately, they trust Nexta because Sviatlou is only 22, and because he is an amateur who lives outside the country. In Belarus, the government is a kind of presidential monarchy with no checks, no balances, and no rule of law. State media are grotesquely biased: Memo98, a media-monitoring group, reckons that Belarus state television devoted 97 percent of all political news programming to Lukashenko in May and June, with only 30 seconds devoted to opposition presidential candidates. Political leaders in Belarus are routinely repressed, and their voices are muffled: Tsikhanouskaya was running for president because her husband, Siarhei Tsikhanouski, was arrested before he could start his own presidential campaign. Other candidates and politicians were also arrested, along with their staff. Some are still in prison. Human-rights groups have evidence of torture.

. . . .

Paradoxically, the Lukashenko regime is also the source of Sviatlou's unusual power. By suppressing all other sources of information, it has given him unprecedented influence. This also has its downsides. One member of the tiny but determined community of independent journalists in Belarus—I am leaving him unnamed because he remains in Minsk—pointed out that the administrators of Telegram channels outside the country (Sviatlou is one of several) have no way to check whether what they are publishing is true, and no way to coordinate what they are doing with anyone else. Although he does communicate with other channel administrators, as well as with coordinators in Minsk, mistakes are sometimes made. A couple of days ago, crosscurrents of information nearly led one group of opposition protesters into a public brawl with another.

Link to the rest at The Atlantic

Belarusian Writers Stand By Their People

From Publishing Perspectives:

As the Belarusian election protests expand, opposition leader Svetlana Tikhanovskaya has spoken from her exile in Lithuania today (August 21). She is urging widening strikes, asking citizens not to be “fooled by intimidation,” as reported by the BBC.

Days before the now-disputed election of August 9, PEN International issued a joint statement, writing for the 24 PEN centers that stand in various parts of the world. That statement of concern is focused on what the Belarus PEN Center's Sviatlana Aleksievič has said were two dozen political prisoners whose freedom of speech the center believed was being suppressed.

“Among them are bloggers and journalists, patrons of culture,” Aleksievič wrote, “those who in 2020 awakened the Belarusian society, and for the first time in 26 years create serious competition for the authoritarian regime of Aliaksandr Lukašenka [longtime Belarusian president Alexander Lukashenko].”

“Today, Belarusian writers stand by their people as the story is being written in the streets and squares, not at desks.”

. . . .

Indeed, this morning (August 21), The Economist (which does not allow its writers bylines) has released a story describing, as other media have reported, how “prisoners were forced to kneel with their hands behind their backs for hours in overcrowded cells. Men and women were stripped, beaten, and raped with truncheons.”

“The repression was ostentatious,” The Economist piece continues. “Some victims were paraded on state television. By August 19, at least four people had been killed. The aim was both to terrorize citizens and to bind the regime’s officers by having them commit atrocities together, a tactic used by dictators and mafiosi to prevent defections.”

Link to the rest at Publishing Perspectives

Alert visitors will have noted that there are worse things than Covid.

The Medieval University Monopoly

From History Today (not really to do with books, but PG found it interesting):

In June 1686, a small family – a clergyman, his wife, and their daughter – disembarked from a ship at the docks of Boston, Massachusetts. They had just finished a long journey of a month or more across the Atlantic, escaping from England. The clergyman, a scholarly 60-year-old named Charles Morton, was fleeing prosecution. His crime? Teaching students – or, more specifically, teaching students in north London.

From 1334 onwards, graduates of Oxford and Cambridge were required to swear an oath that they would not give lectures outside these two English universities. It was a prohibition occasioned by the secession in 1333 of men from Oxford to the little Lincolnshire town of Stamford. They were escaping the violence and chaos which often attended medieval university life – the frequent battles between students, and between students and other communities within the town – the same conditions, in fact, which had led an earlier generation of scholars to up sticks and leave Oxford for Cambridge. But their action now threatened both universities, and so the Stamford experiment had to be suppressed. The sheriff of Lincoln, the lord chancellor, even the king, Edward III, were all called into play and the result became known as the ‘Stamford Oath’; an oath which Oxford and Cambridge graduates continued to swear until 1827.

It is true to say that Charles Morton was unusually unlucky in being prosecuted for breaking this oath by establishing his own academy at Newington Green in London. His evident success in recruiting numerous and impressive students, like Daniel Defoe, was part of the problem, as were his staunchly Presbyterian religious beliefs and his radical, republican political views. But the depressing effect of the Stamford Oath was undeniable and its symbolism inescapable. Repeated at each graduation and reinforced by successive revisions of both universities’ statutes, it made their determination to preserve a duopoly in higher learning absolutely plain.

This was in sharp contrast to the European experience. Just as Oxford and Cambridge were establishing and policing their unique right to produce graduates, ever growing numbers of universities were being founded across the Continent. In the 14th century new institutions appeared in towns from Pisa to Prague; from Kraków to Cahors. In the years that followed, the gap in numbers between English universities and those on the Continent grew even greater, with over 100 founded or refounded in Europe after 1500. Oxford and Cambridge remained the only universities in England. Indeed, even as Morton’s teaching career began in the mid-17th century, universities were springing up in such unlikely places as the small towns of Prešov in Slovakia and Nijmegen in the Netherlands. The English experience was also very unlike that of the Scots, who acquired five universities between 1451, when Glasgow opened, and 1582, when Edinburgh was established.

. . . .

In the first place, there is the question of why it was that Oxford and Cambridge were so keen to suppress other universities. Secondly, there is the question of how they succeeded. Finally, just as importantly, and perhaps even more interestingly, there is the question of what changed to make them reverse this position so comprehensively in the years after 1827.

In some respects, the question of why Oxbridge was so jealous of its status seems the easiest to answer. In the most general terms, it makes sense for the providers of an exclusive product – a university degree, say – to take action to preserve their exclusivity. Universities were originally little more than a sort of trade guild, a separate group of masters and their students, who controlled admission, regulated quality and negotiated with the local authorities. Just as butchers and bakers sought to restrict the supply of their skills, so masters within the university hoped to protect their distinctive rights. These privileges were threatened by rivals. Oxford and Cambridge continued to act like guilds long after they lost or forgot their origins. Thus it was that even in the 17th century they fought off attempts by places as various as Carlisle and London, Ripon and Shrewsbury to establish their own institutes of higher learning. Thus it was that they crushed the nascent Durham University in 1660. And thus it was that they pursued poor Charles Morton.

. . . .

The answer is control. Just as the two universities wanted to control the supply of teachers and students, so the English Church and state wanted to control the universities. Universities could be – indeed, were – the source of dangerous heresies, where people learnt to think the wrong things. Oxford gave birth to the reforming, proto-Protestant Lollard movement in the 14th century. Cambridge was home to an alarming nest of evangelicals – humanist-inspired converts to church reform like the martyrs Robert Barnes (c.1495-1540) and Thomas Bilney (1495-1531) – 200 years later. With only two universities it was easier to control theological debate and even to use one of the institutions to oversee the other. It is no coincidence that the Cambridge-educated bishops Hugh Latimer and Nicholas Ridley, together with the Cambridge-educated archbishop Thomas Cranmer, were sent to loyalist, Catholic Oxford to be tried and burnt in the 1550s.


Link to the rest at History Today

Talking Back to Cookbooks

From The Wall Street Journal:

On a scorching hot day last week, I decided to make a cooling salad of roasted figs and onions with mint and green leaves, a recipe that caught my eye in the lovely new cookbook “Falastin” by Sami Tamimi and Tara Wigley. After I started, I realized that I had only half as many fresh figs as I needed. I also didn’t have the radicchio or walnuts or goat’s cheese that the recipe stipulated.

In the past, I might have anxiously rushed to the store to get exactly the “right” ingredients. But this is 2020, and new rules apply. I doubled up on onions to make up for the missing figs, subbed in feta for the goat’s cheese, used lettuce instead of radicchio and toasted cashews in place of the walnuts. The rest of the recipe—the dressing, the cooking times—I followed to the letter. It may not have been quite what the authors intended, but I put a Post-it Note in my copy of “Falastin” saying that it was still one of the best salads I’ve made all summer.

When we finally resurface from this pandemic, one of the many things that will have changed is our relationship with recipes. Through necessity, we have been forced to become more experimental cooks and start talking back to our cookbooks. This is a good thing, if you ask me.

For years, many of us tortured ourselves with the idea that recipes were stone-carved commandments issued from on high by godlike chefs. But a recipe is more like a never-ending kitchen conversation between writer and cook than a one-way lecture. Recipes were originally designed to help people remember how to cook something rather than to give them exact blueprints. When something in a recipe doesn’t work for you, for whatever reason, you are free to say so and make it your own.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

All Tomorrow’s Warnings

From Public Books:

Lamenting the shortsightedness of environmental policy—in 1971—U Thant, Secretary-General of the UN, deployed a by-now familiar move from the playbook of ecological advocacy. He looked to the future:

When we watch the sun go down, evening after evening, through the smog across the poisoned waters of our native earth, we must ask ourselves seriously whether we really wish some future universal historian on another planet to say about us: “With all their genius and with all their skill, they ran out of foresight and air and food and water and ideas,” or “They went on playing politics until their world collapsed around them.”

Despite its familiarity, U Thant’s statement here is rhetorically complicated. And it continues to inform efforts to tackle climate change to this day. Rather than castigating the present from some abstract future perspective, contemporary environmental defenders often follow U Thant’s lead, grounding their judgments in a singular figure—“some future universal historian,” geologist, or brother from another planet—who sifts through our future remains with fascination and disbelief. The question, then as now, is simple: Will we collectively close our eyes to the future dangers barreling toward us?

But such a question leads inevitably to a second, perhaps even more pressing question: How can we create a scientifically informed history of the future? This question has galvanized a slew of contemporary writers, filmmakers, and activists, who, echoing U Thant’s warning, are turning to speculative nonfiction, a genre that strives to document the years ahead. The vogue for histories of tomorrow is driven primarily by climate breakdown and the Anthropocene. Such anticipatory histories seek to counter a disastrous temporal parochialism unequal to the demands of the warmer, more insecure world. Nonfictional forays into the future, on the one hand, tend to warn us of coming disasters, and on the other, urge us to take action today.

In a spirit of anticipatory memory, writers, artists, and activists encourage us to own the future by inhabiting it in sample form. They encourage us to feel our way forward into the emergent worlds that our current actions are precipitating. They encourage us to break out of our temporal silos and—from our diverse Anthropocene positions—face the challenges that shadow the path ahead.

In the Anthropocene, Clive Hamilton observes, “the present is drenched with the future.” Despite that, powerful economic, technological, and neurological forces intensify our present bias, severing current actions from future fallout. The neoliberal fantasy of infinite short-term growth, the digital splintering of attention spans, and the rewiring of our brains for restless interruption: all favor dissociation. The average American, after all, checks their phone 150 times a day. A succession of staccato inputs now threatens to crowd out futures of remote concern—futures that seem immaterial, in both senses of the term.

Speculative nonfiction has no innate politics. After all, Big Oil has invested heavily in creating documentaries set in the future that present the companies’ energy trajectories in a glowing light. That said, it is progressives who, recognizing that the trend lines all point toward a warmer, less stable climate, have been most insistently adventurous in experimenting with this futuristic documentary form. Again and again, progressives have conscripted speculative nonfiction as an ally against short-term extractive economics, digital dispersion, political prevarication, and ethical inertia.

Link to the rest at Public Books

As he read the OP, PG reflected that readers generally have few problems accepting science fiction as fiction in part because it is set in the future and includes elements that do not exist (or do not exist in the form depicted) at the present time.

Suicide Missions

From The Los Angeles Review of Books:

YES, YOU HAVE HEARD this story before: we face a serious problem, which is likely to become much worse if we do not take serious measures to stop it now. But the immediate measures we need to take are pretty painful — not as painful as what sufferers in the future will experience, but those future sufferers are not necessarily us. They may be people we care about, our children or grandchildren, but, even so, their future distress feels less real than actual, albeit lesser, distress happening right now to us (especially to me). Why sacrifice our well-being for their better-being? Economists call this “having a steep discount rate,” the sinister twin of compound interest: we value things in the future less the further out they are. The economists’ language has the clinical asepsis of much of their lexicon and does not quite convey how inevitable, even fated, the intrinsic reaction is.

If you are reading this in the summer of 2020, you do not have to reach far for an example: social distancing. It is inconvenient on a personal level and ruinous on the scale of the economy, but if one adheres to the restrictions then the coronavirus could be controllable: fewer deaths, a functioning health-care infrastructure, time bought to develop plans to restore economic activity without devastating public health. All that good stuff only happens to future people — in this case, to future us in six months — if we grit our teeth and forgo haircuts now. You can evaluate for yourself how well that’s been going.

These sorts of problems yoke the present with the future. However, they also necessarily tie the present to the past, since the past sets the conditions of our present, propelling the trajectory we now have to alter. The complex interactions of the three time frames conflate two distinct issues: how we know what we know about what has been, is, and will be happening; and how we act to solve the problem — a question of knowledge and a question of practice. Each “how” is in turn linked with a “who”: these sorts of challenges can only be tackled with massive coordination, so specific individuals must either take the responsibility of leadership or assume the responsibility of its abdication.

The commonness of these problems does not make them any less frustrating. Debates over the second issue of what to do — which is typically where one starts in an emergency — devolve more or less rapidly to the question of knowledge, because that seems easier to get a handle on. It also does not require anyone to assume present pain. Meanwhile the present turns into the future, and the usurious loan we unwittingly took out will eventually come due.

. . . .

It has been over 50 years since a human first walked on the surface of the Moon. Although there are some noises about returning — and many more about going to Mars — nobody, except possibly China, is seriously contemplating it. A feat that required tremendous ingenuity and courage, it also, of course, required oodles of money. And so we no longer go because of the cash. NASA consumed a mind-blowing 4.41 percent of the federal budget in 1966; in 2019, it was below half a percent.

In his engagingly readable Spacefarers, science writer Christopher Wanjek briefly relates what happened after the Moon landing: once his name was safely embossed on a lunar plaque, President Nixon promptly cancelled the Apollo program. It was time to economize. Enter the Space Shuttle.

. . . .

[Wanjek] wants humans to be out there, and there does not stop at the ISS, about 250 miles up. Indeed, he thinks the returns on investment in the ISS are meager. There have been no exciting indications that terrestrial industrial processes would be better performed in orbit, and much of the science is trivial. From decades of ISS work (and Russian research on the Mir space station beforehand), the most important thing we have learned about microgravity — either in deep space or in free fall — “is to get out of microgravity as quickly as possible.”

Some of the most compelling passages in this book designed to get you excited about space are about how unpleasant space is. All sorts of bad stuff happens to your circulation, your bones, your muscles, and your eyeballs. The radiation is sure to kill you once you get beyond Earth’s magnetic field unless you have lots of shielding. Solar weather is a big deal.

Wanjek is most thorough about the challenges of Mars, but most plausible when it comes to the Moon and asteroids. He has a knack for explaining the practical details of how one might possibly overcome them: how to mine water from the Moon’s regolith and then split it to release oxygen; how to 3-D print radiation shielding from Moon dust; how to manage the temperature extremes; how to treat toxic chemicals that would otherwise frustrate Martian agriculture; and so on. He also works through how to simulate gravity through rotating segments of a spacecraft — without it, the months to Mars will render the passengers so weak they won’t be able to stand even in the 38 percent of Earth’s gravity on the red planet. He devotes a great deal of attention to the “tyranny of the rocket equation”: how to transport material out of the Earth’s gravity well, and how a stable infrastructure could make it less ruinously expensive.

. . . .

Wanjek posits two reasons why we go to space: “war and profits.” During the Cold War, competition between the Soviets and the Americans kept space rockets flying in lieu of nuclear ones. If we want to send crewed missions farther afield than the ISS, then — absent a reprise “Sputnik moment” such as a Chinese Moon landing — it has to be about making money. He discusses space tourism, mining asteroids, and even colonization, though he is cautious about the last until we know whether human physiology can fertilize an embryo, birth a baby, and raise a child to adulthood under conditions of .38 (Mars) or .166 (Moon) of Earth’s gravity. (It’ll never happen in microgravity.) State leadership across the globe has been weak on this front, so the mantle is being seized by private corporations. It will be the anticipation of astronomical profits — the pun fits with Wanjek’s charming proclivity for dad jokes — that will get us to space for good.

Link to the rest at The Los Angeles Review of Books

The Spy Who Read Me

From Public Books:

“Spying and fiction are not entirely different processes,” says historian of British espionage Ben Macintyre, in a conversation with master of spy fiction and former intelligence officer John le Carré. “You try to create an artificial world. And the better and more realistic and more emotionally believable you can make that world, as either a spy or a novelist, the better you are going to be at it.” Yet, Charles McCarry, who was a deep-cover operative for the CIA, and the author of the Paul Christopher novels, doesn’t see continuity between spying and fiction but, instead, between the secret and everyday worlds: “The fact of the matter is, the secret world is too much like the ordinary world to be altogether entertaining. The elements of tradecraft that thrill us in books—cover stories, clandestine meetings, dead drops, telephone codes and so on—are techniques familiar to anyone who has ever covered a big story for a newspaper, negotiated a big contract against serious competition or conducted a clandestine love affair.” Fiction and spying can look like each other, and spying and everyday life can look like each other.

What to make, then, of the new glut of women writing about spying—both in fiction and in memoir? On April 28, 2020, Jung H. Pak—a history PhD who spent 10 years working for the Central Intelligence Agency—published Becoming Kim Jong Un: A Former CIA Officer’s Insights into North Korea’s Enigmatic Young Dictator. Pak writes as a former intelligence officer, and as a woman former intelligence officer, about a uniquely powerful, brutal, and secretive male leader. Pak’s biography of Kim was published in an obviously fascinating and enormously consequential context: active, ongoing speculation about his health. It also emerged into a specific literary context: innovative writing by women about the work of intelligence. Intelligence work by women is at the heart of new novels and memoirs about women intelligence officers. Books by Lara Prescott and Amaryllis Fox (not to mention books by Kate Atkinson, Lauren Wilkinson, Nada Bakos, and Tracy Walder) show women serving as spies, writing about serving as spies, and, in doing so, interlacing writing and spying.

Recently, women writing about spying—in memoir and fiction—has moved in two directions. In the past, fiction about espionage synched up with the intelligence concerns and capabilities of its day. But, today, fiction about espionage also sets its stories in the past. I focus here on The Secrets We Kept, by Prescott, a novel published late last year but set during the Cold War.

Meanwhile, a new crop of memoirs takes readers inside the lives of women intelligence officers who served in the relatively recent past, whose service is shaped not by the Cold War but by 9/11 and its aftermath. Fox’s Life Undercover is one such memoir and is central to what follows.

The women writers tell women’s stories of writing and spying. The experiences of practicing tradecraft aren’t precisely McCarry’s, and that’s worth discussing. Most importantly, they chronicle what spying looks like and what the everyday looks like, and, meaningfully, insist on their overlap.

For these women, paradoxically, the practice of being a spy and being an ordinary woman are not dissimilar. “I’m neck-deep in a game of make-believe,” laments Fox, “and the game is so convincing, I have no idea when it began. Or the ‘I’ who is playing it.” Sound familiar?

Link to the rest at Public Books

The mother load

From The Guardian:

Never in my life had I been so high.

I’d just given a reading in Amsterdam after which the gracious hosts of the evening took me out for drinks. Three young women asked me questions about sex and love and desire as though I were an expert and it was nice but I was tired and unused to being considered an expert in anything but panic.

I thanked the hosts and slipped out. I’d always wanted to visit Amsterdam and I had only two nights. I wanted to walk the streets alone. I wanted to walk across the bridges and look at the waving water and look inside the windows of the closed shops. I wanted to find the loveliest cafe and mark it for the morning. I wanted to eat bitterballen and wash them down with stroopwaffel. And I wanted to get high.

The streets were dark with rain. I found a deli. It wasn’t one of the coffeeshops with the meticulously bagged furry sativa. This was just a deli, cartons of milk, packs of gum. Before leaving I bought one large plastic tub of marijuana brownies. It seemed wasteful not to, and the man assured me I absolutely could take the brownies on my flight to Romania early the next morning. OK yes why not yes yes is OK yes. He was equal parts aloof and confident and not understanding what I was saying. So it felt right.

In the hour that followed I held the joint with one hand and a broken umbrella with the other. I walked and smoked and the cherry kept going out on the joint and I didn’t have a lighter and so twice I stopped to ask strangers for a light and tried to balance the umbrella and the joint and the unwieldy weight of my embarrassment. I got so high that I didn’t feel panic about my imminent flight. I got so high that I didn’t get lost. I found my pretty hotel but had gotten so high that I forgot my four-year-old daughter was sleeping in a room upstairs.

Hang on now. Her father was in the room with her. But I almost forgot I was a mother. But that’s not it. I forgot enough about my panic that I wasn’t acting like the neurotic mother that I am. I rarely drink, and when I do, I don’t drink much, so getting high (so high) felt like a real breach. I got so high that I didn’t care that I got so high.

To some (or many!) I’m sure I would be considered in that moment (or many!) a bad mother. I know it for a fact because I spoke to hundreds of women for my book – many of them mothers – and they all had at some point been called “bad”. Many of them believed it to the extent that they felt they weren’t good enough for their children.

One of the women I spoke to was a talented musician. She told me that the only one of her singles that underperformed told the story of a bad mother. It was one of her favourite songs, but she had to stop singing it at concerts because she would receive death threats on Twitter. One listener threatened to kidnap her child, because she was too bad a mother to keep her.

Link to the rest at The Guardian

Publishing the full Spectrum

From The Bookseller:

For a long time, I felt like I had been failed by publishing. After a 2015 diagnosis of Asperger’s Syndrome – now Autism Spectrum Disorder (ASD) – I set out to learn more about my new ‘label’, and what it meant to me. Recommendations included looking to TV, because characters such as Sheldon Cooper in “The Big Bang Theory” were ostensibly ‘good’ representation. I couldn’t relate. Frustrated, I turned to books, expecting someone, somewhere, to have written about my experience. There was very little that was supportive, or even relevant, to me.

It’s good to see that this is changing at long last, although publishing still has a long way to go to plug the gap. Despite Autism Spectrum Disorder being exactly that – a spectrum! – there remains a lack of nuance in books that touch on the varied experiences of people with ASD.

Take the recent backlashes around books about Autism. To Siri With Love – Judith Newman’s recent memoir about her Autistic son – may have been met with huge praise, but Autistic individuals shot back. Accusations of eugenics and ableism abounded – Kaelan Rhywiol summarised the objections in a piece for Bustle – alongside a ‘Twitterstorm’, complete with the hashtag #BoycottToSiri. The author responded that she had not written the book for an Autistic audience.

And this year, my Instagram feed flooded with petitions calling for the removal of I Wish My Kids Had Cancer by Michael Alan – a book that appeared to equate Autism with cancer. Enough said.

. . . .

There are also a lot of books about parenting – but they are written by parents not on the spectrum. Spectrum Women: Autism & Parenting is out next month – and, so far, has been seen as a ‘revelation’. Why? Because it is written by people on the Autistic spectrum! As the saying goes, ‘nothing about us, without us’ – and this should apply to books about parenting Autistic children. It’s good to have books that are almost like textbooks – but they are not necessarily the real, lived experience of being on the spectrum. They miss the colour, the humanity. And that, I think, could often be said when someone not on the spectrum writes about being Autistic. 

. . . .

Stim: An Autistic Anthology was released earlier this year. Edited by Lizzie Huxley-Jones, this book was notable for giving free rein to the Autistic contributors. Essays, art, even fiction – not necessarily about Autism! – made this book a stand-out tome in its niche. It’s refreshing to read, offering a range of non-neurotypical perspectives.

Illustrator Megan Rhiannon has also released Existing Autistic – a self-published, illustrated book that contains information about functioning labels, sensory overload, and more. It has been received with thunderous applause – Rhiannon has needed to restock it at least once since its release.

Link to the rest at The Bookseller

Veritas
From The Wall Street Journal:

“‘Hotwife’ Pornographer Gulls Harvard Prof With ‘Wife of Jesus’ Hoax.” The headlines could have been worse for Karen King, the Hollis Professor of Divinity at Harvard University. But not much worse.

The first line of act I of “Veritas,” Ariel Sabar’s mesmerizing five-act real-life melodrama, is “Dr. Karen Leigh King had reached the summit of her field as a dazzling interpreter of condemned scripture.” We join Ms. King at the apex of her career, her September 2012 unveiling of the “Gospel of Jesus’ Wife” at the International Congress of Coptic Studies, held a stone’s throw from the Vatican in Rome. Speaking to three dozen colleagues, Ms. King describes the tiny papyrus fragment that had come into her possession, lingering over its fateful line 4: “Jesus said to them, ‘My wife . . .’ ”

This little snippet, Ms. King claimed, “leads us to . . . completely re-evaluate the way in which Christianity looks at sexuality and at marriage.” Ms. King considered calling the bit of papyrus the “Mary Fragment” but chose to call it a “Gospel”—“something that will stick,” she later explained. From some 30 Coptic words spread across eight discontinuous lines, Mr. Sabar writes, Ms. King had “alchemized . . . [the] case for a thoroughgoing Gospel of Jesus’s Wife.”

A married Jesus would turn the Catholic Church on its head. The papyrus hinted at a wife named Mary, presumed to be Mary Magdalene, painted as a prostitute by Pope Gregory the Great in the sixth century. The New Testament, however, never mentions a marriage, other than in references to the Church or holy Jerusalem as Christ’s spiritual bride. Christ’s purported bachelorhood undergirds the Catholic doctrine of priestly celibacy. If the papyrus accurately described a wife of Christ, “this means that the whole Catholic claim of a celibate priesthood based on Jesus’s celibacy has no historical foundation,” noted Ms. King, a feminist scholar and expert on the apocryphal, second-century Gospel of Mary.

The papyrus presented problems from the start. Before the Rome event, two of the three anonymous peer reviewers retained by the Harvard Theological Review suggested Ms. King’s fragment might be a fake—although none of the scholars assembled in Rome knew that. Ms. King’s reviewers examined only a digital photograph of the “Gospel,” and “something felt off,” Mr. Sabar reports. One expert said the script “looked like twenty-first-century handwriting.” On closer inspection, small imperfections manifested themselves: missing characters and the “grammatical monstrosity” of an impossible double conjugation.

Brown University Egyptologist Leo Depuydt called the papyrus’s grammar a “colossal double blunder,” arguing that its creator was less likely to have been “a very incompetent ancient scribe” than “a modern author who might have benefited from one more semester of Coptic.” Ms. King, who, Mr. Sabar reminds us, taught Coptic at Harvard, “had somehow failed to spot most of the text’s grammatical irregularities.”

The besieged Ms. King fought back. Nineteen months after the Rome reveal, the Theological Review published her article defending the fragment’s authenticity, backstopped by testing carried out at Harvard, Columbia and MIT. Harvard issued a triumphant press release: “Testing Indicates ‘Gospel of Jesus’s Wife’ Papyrus Fragment to Be Ancient.” Ms. King and the as-yet-unidentified owner of the fragment exchanged a sigh of relief. The lab tests, he emailed her, served as “the ultimate confirmation for me that we’ve been right all along, confirming again what has been obvious from day one.”

. . . .

Mr. Sabar doesn’t name the purported papyrus pusher until page 162, and the man’s bona fides seem quite unusual indeed. A former student in Egyptology at Berlin’s Free University, Walter Fritz briefly directed the Stasi Museum, housed in the former headquarters of the notorious East German secret police, before moving to Florida, becoming a pornographer, and carving out a name for himself and his wife in the state’s “vibrant swingers’ community.” Mr. Fritz is the proverbial man of many parts; one wonders why a prodigious researcher like Ms. King didn’t perform a few more Google searches or place some phone calls before dynamiting 2,000 years of patriarchal tradition on the basis of his sketchy offering.

As the reader moves through acts III and IV, Mr. Sabar continues to tantalize us. It is curious, we learn, that Ms. King had urged the Theological Review to scotch a dissenting article by Mr. Depuydt that was printed in the 2014 issue devoted to the papyrus. It is equally curious that the Columbia and MIT “authentications” of the fragment were performed by scholars with “close personal ties” to Ms. King and to one of her key allies. The MIT man was the son of a family friend and “an expert in explosives detection.” The ink analyst from Columbia “had no experience with ancient objects.” Oops.

. . . .

“[Ms. King’s] ideological commitments were choreographing her practice of history,” Mr. Sabar writes. “The story came first; the dates managed after. The narrative before the evidence; the news conference before the scientific analysis.”

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Longest-running conservation journal goes Open Access

From The Bookseller:

Oryx, the international journal of conservation published by Cambridge University Press, is to become Open Access from January next year, in a move made possible by a grant from The Rufford Foundation.

From January 2021, the journal – the world’s longest-running conservation journal – will be free to anyone with an internet connection. Past content dating as far back as 1950 will be made freely available, as will all new research, which will be published Open Access from next year. Meanwhile, unfunded authors will benefit from a new APC (article processing charge) waiver policy, also thanks to The Rufford Foundation, which is dedicated to nature conservation.

CUP publishes the journal on behalf of wildlife conservation charity Fauna & Flora International, and it is billed as the “go-to publication for anyone interested in biodiversity conservation, conservation policy and related social, economic and political issues”.

Editor Dr Martin Fisher, who has overseen Oryx for almost 20 years, said: “This is the most significant development in the journal’s eminent history. Thanks to the support of the Rufford Foundation and Cambridge University Press’ commitment to Open Access publishing, the research published in Oryx will be freely accessible to all readers, no matter where they live or work.”

Link to the rest at The Bookseller

Iron Empires

From The Wall Street Journal:

The public image of the robber barons has always been a barometer of how America thinks about wealth. Were they financiers or swindlers? Builders or monopolists? In the Progressive Era, the muckraker Ida Tarbell cast John D. Rockefeller as a ruthless monopolist, and Matthew Josephson’s compelling but one-sided Depression-era tome, “The Robber Barons,” scorched the lot of them. In recent decades serious biographers have reappraised the turn-of-the-century moguls and found more to like. Could the wheel be poised to turn again? With inequality considered a public enemy, a reappraisal might be ripe.

In “Iron Empires: Robber Barons, Railroads, and the Making of Modern America,” Michael Hiltzik pokes among the ghostly bones of tycoons past but doesn’t generally offer a new interpretation. Mr. Hiltzik, a journalist with the Los Angeles Times, presents a colorful cast in conventional terms. Once again we hear that the railroad barons made money by watering stock (meaning that they inflated it, as in watering cattle), swindling rivals, buying judges and milking decrepit properties.

Daniel Drew was vaunted for his ability to manipulate shares. He played with the float of the Erie Railroad “with the ease of a child inflating and deflating a toy balloon,” as Mr. Hiltzik puts it. For unalloyed greed, it was hard to top George Pullman, the sleeping-car magnate, who set up a “beautiful” company town but gouged his underpaid workers on rent. When a depression set in, Pullman cut wages 30%; somehow the workers’ rent was unaffected, while corporate dividends rose.

Labor unions objected, but they were weak. The most serious threat emanated from the barons themselves, who recklessly overbuilt. The finagling Jay Gould ruined the Union Pacific, the Civil War-era trunk line chartered and subsidized by Congress. Talk about a fiduciary! Gould thought nothing of (a) destroying the company with an expensive acquisition while (b) personally wheeling and dealing in the stock while also (c) serving on the board. The result was redundant tracks, ruinous rate-cutting and repeated waves of bankruptcy. Mr. Hiltzik likens the barons to rival duchies in Napoleonic Europe, jousting and plotting but incapable of asserting order.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

PG is reading The Transcontinental Railroad, which covers a time a bit earlier than the books mentioned in the OP, and enjoying it greatly. The politics of the period after the Civil War were corrupt and rough-and-tumble, and it was amazing when anything involving the federal government actually worked out.

Jamaica: Anatomy of an Uprising

From The Wall Street Journal:

Just after Christmas in 1831, the British Empire’s wealthiest island exploded. “Five weeks of burning, looting, crop destruction, courts-martial, on-the-spot executions, severed heads mounted atop poles, and outright human hunting for sport . . . shook slaveholding Jamaica to its foundations.” So writes Tom Zoellner, a professor at Chapman University, in “Island on Fire,” a pounding narrative of events that led to the end of slavery in the British colonies. “Soon the hills were on fire, each spiky leaf of sugar like a small torch or match head. Millions of yellow, flaming pinpricks spread in all directions in the velvety Caribbean night.”

Hundreds of slaves, having been pushed beyond endurance, attacked hated overseers and their masters’ property. “We have worked enough already, and will work no more,” striking laborers told a pair of plantation owners. “The life we live is too bad; it is the life of a dog.” In all, 145 estate houses were destroyed and many others severely damaged. Mr. Zoellner’s vigorous, fast-paced account brings to life a varied gallery of participants, black, white and “colored”— the then-standard designation for quasi-free people of mixed race.

Among these figures are Richard Barrett, one of the island’s richest sugar growers and a relative of the poet Elizabeth Barrett Browning, who passed for a moderate in the island’s reactionary society; the remarkable, precariously positioned “colored” newspaper editor Edward Jordon, who had only gained full civil rights the previous year; and the revolt’s tragic central figure, an enslaved Baptist deacon named Samuel Sharpe. An apparently gifted speaker, Sharpe preached the equality of man based on the teachings of the Bible. He also believed inaccurate rumors that the king had already declared slaves free but that their masters were keeping the news a secret. In response, Sharpe surreptitiously planned a peaceful work stoppage. He may have ultimately hoped for the establishment of an independent republic similar to the one that had come into being a generation earlier in Haiti. Whatever his intentions, the stoppage quickly spiraled beyond his control and into full rebellion.

The uprising was soon over, having been weakened by its poor organization and thwarted by the failure of the island’s 300,000 slaves to rise en masse. It was also overwhelmed by the firepower of British troops. Few whites were killed, but the colonial elite’s confidence in its ability to defend itself was deeply shaken. Hundreds of enslaved men and women were killed in battle or summarily executed, some simply because they had attended a Baptist meeting. The exact number is unknown.

The revolt failed to improve conditions for the enslaved in Jamaica, but it crucially wounded the institution of slavery itself. Mr. Zoellner acknowledges that it was only one factor in the ending of slavery, along with surging abolitionism in Britain, an increasingly muscular reform movement in Parliament, and the falling price of sugar, the island’s only export crop. But the revolt, he says, “sent an unambiguous message to London that slavery was no longer sustainable—not economically, not militarily, and not morally.”

The challenge to slavery in Jamaica and the rest of Britain’s Caribbean possessions had been a long time coming. As Trevor Burnard, a professor at the University of Hull, amply shows in his expansive and scholarly “Jamaica in the Age of Revolution,” colonial Jamaica was characterized by extreme systemic violence against enslaved people. It was also ruled over by a dissolute planter class obsessed with short-term profits that made it cheaper to work slaves to death and buy new ones than to sustain them into their later, less productive years.

Link to the rest at The Wall Street Journal

PG has read more than one article about slavery that has described the practice as a “uniquely American” or “peculiar” institution found only or almost-only in the United States.

This is, of course, not correct. Egypt, Babylon, Greece, and Rome each had large numbers of slaves. A great many Christians were enslaved during the Ottoman invasions of Europe. White slaves were common in Europe from the Dark Ages to the Middle Ages. China formally abolished slavery in 1909.

Serfs in feudal Europe were not personal property that could be bought and sold; rather, they were attached to the land. If a landowner sold a piece of land, the serfs living on it went with it. They were obligated to give the landowner a substantial portion of the fruits of their labor and could be compelled to cultivate other land of his that was not occupied by serfs.

Russian serfdom was even more rigid.

From JSTOR Daily:

[Peter] Kolchin writes that the Russian nobles “invented many of the same kinds of racial arguments to defend serfdom that American slave-owners used to justify” slavery. Some nobles went so far as to say they had white bones, while the serfs had black bones. Kolchin calls this an “essentially racial argument in defense of serfdom, even though no racial distinction divided lord and peasant.”

Then there was the aristocratic paternalism of the arguments that bondage was a humane institution in comparison to the precariousness of the free labor market. Both Russians and Americans argued that their systems of bondage resulted in a superior society.

Kolchin quotes American slave-advocates who argued that the race of slaves was actually immaterial. Absent Africans, these defenders of American slavery said whites would do just as well as blacks. Because planters needed the support of non-slaveholding whites, however, such arguments never dominated the defense of slavery.

Link to the rest at JSTOR Daily

PG intends none of this to be any sort of excuse for or defense of slavery in any form or fashion. It is always and everywhere a despicable evil. However, unfortunately, while it has been an American evil, it has also been a British, Russian, Chinese, Arabian, etc., etc., evil.

A Guidebook Editor’s Dilemma

From Publishers Weekly:

Since the world stopped traveling several months ago, I’ve been trying to find the way forward for my guidebooks. The future of travel is anything but clear: countries are tightening their borders, and some U.S. states that have reduced their coronavirus infection rates are restricting visitors from other states where the pandemic continues to wreak havoc.

I’m editor-in-chief for North America of 111 Places That You Must Not Miss, a guidebook series for locals and experienced travelers. Our books always sell best in the cities and regions they cover, and in many ways, our approach is well suited to a moment when people are planning staycations and local getaways.

But virtually all of our retail outlets were vastly diminished for several months. Bookstores, gift shops, and museum shops were closed due to the pandemic, and major e-commerce sites deprioritized books in order to direct resources toward shipping hand sanitizer, surface cleaners, and other crucial supplies.

My colleagues and I knew that our sales reports were going to be grim. But the difference between our figures for Q1 and Q2 2020 and those from the first half of last year is shocking nonetheless. When everyone went home in mid-March, we had several books either just released or on their way to our warehouse. It broke my heart to see talented, enthusiastic writers and photographers miss out on the once-in-a-lifetime experience of celebrating their first books and signing their first autographs. Normally, a book release is a time for parties with friends and family, book talks, TV and radio interviews, and a wonderful sense of accomplishment. But those events have been postponed.

Where our books once might have garnered good media placements, in recent months our press releases have yielded more out-of-office messages than interview requests. Many journalists have gotten sick or been laid off or furloughed, just like those in so many other professions.

Link to the rest at Publishers Weekly

Political Books: PEN America’s Suzanne Nossel ‘Dares To Speak’

From Publishing Perspectives:

With a release date Tuesday (July 28), Dare To Speak: Defending Free Speech for All is by Suzanne Nossel—currently the CEO of PEN America and a previous COO of Human Rights Watch and executive director of Amnesty International USA.

Nossel enters this politically charged summer’s lineup in the right sector: nonfiction, and focused on the underlying issue behind “cancel culture.”

. . . .

In essence on the broader scale, however, much of the debate of the day revolves around what some perceive to be an “intolerant climate that has set in on all sides,” as it was described in the July 7 open letter published by Harper’s Magazine. Many leading authors, including Margaret Atwood, Anne Applebaum, JK Rowling, Salman Rushdie, and Khaled Khalifa, signed the letter, which was led by Thomas Chatterton Williams.

While praising the recent “powerful protests for racial and social justice,” the letter warned against letting “resistance harden into its own brand of dogma or coercion—which right-wing demagogues are already exploiting.” Those who signed the piece pointed to “greater risk aversion among writers, artists, and journalists who fear for their livelihoods if they depart from the consensus, or even lack sufficient zeal in agreement.” And that’s self-censorship, a real and present danger.

Another letter then followed at The Objective, this time from writers who cast the first piece’s signatories as a kind of establishment class of well-paid pundits who risk little in expressing their views with elite authority. “Under the guise of free speech and free exchange of ideas,” reads the second letter about the first, “the letter appears to be asking for unrestricted freedom to espouse their points of view free from consequence or criticism.

“There are only so many outlets, and while these individuals have the ability to write in them, they have no intention of sharing that space or acknowledging their role in perpetuating a culture of fear and silence among writers who, for the most part, do not look like the majority of the signatories.”

It’s a thorny discussion, with “cancel culture” veering close to “political correctness” for some—and not for others—and one that Nossel seems to approach in her book, described as being about “a time when free speech is often pitted against other progressive axioms—namely diversity and equality.”

Needless to say, the speed and force with which social media polarization can gather around any comment, image, or concept is its own dilemma, one that often hustles engaged users past what should have been a period of review, thought, assessment, and decision, depositing everyone in a heap of knee-jerk responses to only partially understood points.

If anything, the industry and culture of books is in a good position to demonstrate through its own output the values of the indispensable requirement of the freedom to publish. And however important it is to speed up traditional publishing amid the pace of contemporary debate, the inherent requirements of book preparation lie on the side of clearer thought and patient retort.

. . . .

In her forthcoming book, Nossel, according to Harper’s promotional material, “warns against the increasingly fashionable embrace of expanded government and corporate controls over speech, warning that such strictures can reinforce the marginalization of lesser-heard voices. She argues that creating an open market of ideas demands aggressive steps to remedy exclusion and ensure equal participation.”

. . . .

And when it comes to authoritarianism and its dangers, Anne Applebaum’s just-released (July 21) Twilight of Democracy: The Seductive Lure of Authoritarianism is so clear in its picture of a fast evolution of otherwise liberal thinkers into “closet authoritarians” that she’s able to look at federal police actions during protests in cities like Portland and describe them as “performative authoritarianism”—a kind of pageant of tyrannical intervention, staged to gain favor with a political base.

What Applebaum wants to tell you is, “Given the right conditions, any society can turn against democracy.”

. . . .

“The rules that will govern speech in the 21st century are being written right now, formally and informally. European countries are experimenting with new constraints on speech, some of which would be unconstitutional in the United States, and others of which may warrant close scrutiny.

“Almost daily, social media companies roll out new guidelines and rule changes governing their platforms. Young people are forging new norms for discussing race, sex, and gender identity.

“Those who remain silent in the face of these debates cede the ground to those with the most extreme views and most self-serving motivations.”

. . . .

As more political releases line up in coronavirus-delayed release dates this year, watch for more sales action among them—and watch for the question of how comfortably they and their authors and publishers share the shelves of late summer. These questions of free expression and actual tolerance of it are headed for more stress tests, and soon.

Link to the rest at Publishing Perspectives

PG didn’t plan for this post to appear next to the one about Cuba he put up earlier, but the juxtaposition of the two is a warning to one and all about the crucial importance of free speech, and about what happens when any individual, group or government punishes those whose opinions differ from those currently “approved,” so that no one is able to speak the truth as he/she sees it.

Note: As mentioned, the book discussed is not available for preview until July 28, 2020, when it will be released. However, it is available for pre-order on Amazon and, if you are reading this post on July 28 or thereafter, it will be immediately available, and the Preview button below should work (although this function is controlled by the book’s publisher).

The Cubans

From The Wall Street Journal:

In 1976, Cary, a young Afro-Cuban woman, sailed out of Havana Harbor with 2,000 fellow students bound for the Soviet Union. They were thrilled to be going to the land that led the global communist movement, where they would earn university degrees that would help them build the Cuba of their dreams. But instead of discovering the thriving revolutionary society she’d heard about from regime officials in Cuba, Cary found herself surrounded by politically apathetic Russian classmates. They laughed at her for wanting to go to the May Day parade—the International Workers’ Day celebration. She realized: “These Russians don’t think the way we do.”

By the end of Anthony DePalma’s remarkable book “The Cubans: Ordinary Lives in Extraordinary Times,” Cary’s faith in the Cuban Revolution and the classless, race-blind system it promised has vanished. She’s gone from a rising star in the Communist Party—she returned from the Soviet Union as an economist and rose to become vice minister of light industry with her own car and driver—to a struggling business owner hunting for needles and thread. Cuba, as the author explains, permits some private businesses to operate, but restricts their growth and the accumulation of wealth. Mr. DePalma gives us an unforgettable analogy that sums up Cary’s plight. Cuba, he says, “is toying with capitalism the way a tiger plays with its prey: tapping it lightly one minute, squeezing the life out of it the next.”

Cary, who has resigned from her state post and redirected her analytical skills toward dressmaking, is one of the intrepid Cubans who opened up to Mr. DePalma, a veteran foreign correspondent, as he set out to capture “a more profound truth” about their country. The author accomplishes this by taking us behind the romantic veil that hides the day-to-day experiences of ordinary Cubans. Their voices are rarely heard, he believes, outside of Cuba and especially inside, where the government represses independent journalists and jails people who criticize it on social media.

Mr. DePalma plunges us into the lives of a diverse group of Cubans living in Guanabacoa, a 500-year-old township across the harbor from Havana. Some pray in Catholic churches. Some follow Santería. His five primary subjects are men and women in their 50s, 60s and 70s, busy with work, caring for their aging parents, and helping their grown children and young grandchildren. Lili remains a dedicated communist even after the government refuses to help her care for her dying father. His dementia and wanderings drive her to keep him in a locked closet, where he spends the last months of his life. Jorge, now 73 and living in the United States, still fights for justice for the 14 members of his family who drowned in 1994 when they tried to escape Cuba on a tugboat: Witnesses said the Cuban government rammed and sank the boat. Cuba denied any responsibility, but Amnesty International’s investigation, in 1997, concluded that the 37 men, women and children who died were “victims of extrajudicial execution.”

Younger family members become part of the narrative, offering their own stories as the elders share theirs with an astounding—maybe even dangerous—level of candor. Mr. DePalma walks with them through Guanabacoa, sits in their kitchens and workshops, and shows us what he calls “the gritty 3-D reality most Cubans live with—broken streets, collapsing buildings, more garbage than flowers. Hot. Smelly. Noisy. Raw.” As we learn about the Cubans’ triumphs and failures over the six decades since the revolution, Mr. DePalma weaves in the major events in Cuba’s history—from the wars of independence with Spain in the 1800s, to the recent ascent of Miguel Díaz-Canel to the presidency. One Cuban describes Mr. Díaz-Canel—the first non-Castro to lead post-revolutionary Cuba—with a popular expression: same dog, different collar.

. . . .

Some of the most harrowing stories take place during the so-called special period of the ’90s, when the Soviet Union fell and its subsidies to Cuba vanished. Cubans’ struggle for food and consumer goods went from difficult to desperate. They were forced to shred blankets, season and fry the material, and stuff it into bread for sandwiches. They pilfered industrial chemicals from factories and concocted household soaps and detergents. Women made hair dye with the black paste inside batteries. 

. . . .

The neighborhood committees that the Castro government installed on almost every block in 1960 still serve as the government’s watchdogs. Denunciations are a permanent threat, since many Cubans, in order to survive, end up breaking laws restricting private sales. And finding allies can’t be easy when, Mr. DePalma writes, “everybody in Cuba, at one time or another, suspected nearly everyone else of being an informer.”

But the biggest obstacle for would-be protesters might be time. The daily quest for food and basic supplies—from eggs to bedsheets—seems to demand every ounce of the families’ energy and creativity. Mr. DePalma believes that Cubans are “cursed by their own greatest strength—their indomitable adaptability.” Their inventive resilience has a downside. And it may be why Cuba is embargo-proof: “People who can turn a plastic soda bottle into a gas tank for a motorcycle . . . see the world differently from other more conventional societies.”

Link to the rest at The Wall Street Journal (Sorry, PG hasn’t figured out a way around the paywall)

Children’s Chorus

From Voices from Chernobyl:

Alyosha Belskiy, 9; Anya Bogush, 10; Natasha Dvoretskaya, 16; Lena Zhudro, 15; Yura Zhuk, 15; Olya Zvonak, 10; Snezhana Zinevich, 16; Ira Kudryacheva, 14; Ylya Kasko, 11; Vanya Kovarov, 12; Vadim Karsnosolnyshko, 9; Vasya Mikulich, 15; Anton Nashivankin, 14; Marat Tatartsev, 16; Yulia Taraskina, 15; Katya Shevchuk, 15; Boris Shkirmankov, 16.

There was a black cloud, and hard rain. The puddles were yellow and green, like someone had poured paint into them. They said it was dust from the flowers. Grandma made us stay in the cellar. She got down on her knees and prayed. And she taught us, too. “Pray! It’s the end of the world. It’s God’s punishment for our sins.” My brother was eight and I was six. We started remembering our sins. He broke the glass can with the raspberry jam, and I didn’t tell my mom that I’d got my new dress caught on a fence and it ripped. I hid it in the closet.


Soldiers came for us in cars. I thought the war had started. They were saying these things: “deactivation,” “isotopes.” One soldier was chasing after a cat. The dosimeter was working on the cat like an automatic: click, click. A boy and a girl were chasing the cat, too. The boy was all right, but the girl kept crying, “I won’t give him up!” She was yelling: “Run away, run little girl!” But the soldier had a big plastic bag.


I heard – the adults were talking – Grandma was crying – since the year I was born [1986], there haven’t been any boys or girls born in our village. I’m the only one. The doctors said I couldn’t be born. But my mom ran away from the hospital and hid at Grandma’s. So I was born at Grandma’s. I heard them talking about it.

I don’t have a brother or sister. I want one.

Tell me, lady, how could it be that I wouldn’t be born? Where would I be? High in the sky? On another planet?


The sparrows disappeared from our town in the first year after the accident. They were lying around everywhere – in the yards, on the asphalt. They’d be raked up and taken away in the containers with the leaves. They didn’t let people burn the leaves that year, because they were radioactive, so they buried the leaves.

The sparrows came back two years later. We were so happy, we were calling to each other: “I saw a sparrow yesterday! They’re back.”

The May bugs also disappeared, and they haven’t come back. Maybe they’ll come back in a hundred years or a thousand. That’s what our teacher says. I won’t see them.


September first, the first day of school, and there wasn’t a single flower. The flowers were radioactive. Before the beginning of the year, the people working weren’t masons, like before, but soldiers. They mowed the flowers, took off the earth and took it away somewhere in cars with trailers.

In a year they evacuated all of us and buried the village. My father’s a cab driver, he drove there and told us about it. First they’d tear a big pit in the ground, five meters deep. Then the firemen would come up and use their hoses to wash the house from its roof to its foundation, so that no radioactive dust gets kicked up. They wash the windows, the roof, the door, all of it. Then a crane drags the house from its spot and puts it down into the pit. There’s dolls and books and cans all scattered around. The excavator picks them up. Then it covers everything with sand and clay, leveling it. And then instead of a village, you have an empty field. They sowed our land with corn. Our house is lying there, and our school and our village council office. My plants are there and two albums of stamps, I was hoping to bring them with me. Also I had a bike.


I’m twelve years old and I’m an invalid. The mailman brings two pension checks to our house – for me and my granddad. When the girls in my class found out that I had cancer of the blood, they were afraid to sit next to me. They didn’t want to touch me.

The doctors said that I got sick because my father worked at Chernobyl. And after that I was born. I love my father.


They came for my father at night. I didn’t hear how he got picked, I was asleep. In the morning I saw my mother was crying. She said, “Papa’s in Chernobyl now.”

We waited for him like he was at the war.

He came back and started going to the factory again. He didn’t tell us anything. At school I bragged to everyone that my father just came back from Chernobyl, that he was a liquidator, and the liquidators were the ones who helped clean up after the accident. They were heroes. All the boys were jealous.

A year later he got sick.

We walked around in the hospital courtyard – this was after his second operation – and that was the first time he told me about Chernobyl.

They worked pretty close to the reactor. It was quiet and peaceful and pretty, he said. And as they’re working, things are happening. The gardens are blooming. For who? The people have left the villages. They “cleaned” the things that needed to be left behind. They took off the topsoil that had been contaminated by cesium and strontium, and they washed the roofs. The next day everything would be “clicking” on the dosimeters again.

“In parting they shook our hands and gave us certificates of gratitude for our self-sacrifice.” He talked and talked. The last time he came back from the hospital, he said: “If I stay alive, no more physics or chemistry for me. I’ll leave the factory. I’ll become a shepherd.” My mom and I are alone now. I won’t go to the technical institute, even though she wants me to. That’s where my dad went.


I used to write poems. I was in love with a girl. In fifth grade. In seventh grade I found out about death.

I read in Garcia Lorca: “the cry’s black root.” I began to learn how to fly. I don’t like playing that game, but what can you do?

I had a friend, Andrei. They did two operations on him and then sent him home. Six months later he was supposed to get a third operation. He hanged himself from his belt, in an empty classroom, when everyone else had gone to gym class. The doctors had said he wasn’t allowed to run or jump.

Yulia, Katya, Vadim, Oksana, Oleg, and now Andrei. “We’ll die, and then we’ll become science,” Andrei used to say. “We’ll die and everyone will forget us,” Katya said. “When I die, don’t bury me at the cemetery, I’m afraid of the cemetery, there are only dead people and crows there,” said Oksana. “Bury me in the field.” Yulia used to just cry. The whole sky is alive for me now when I look at it, because they’re all there.

Link to the rest at Voices from Chernobyl, by Svetlana Alexievich, winner of the Nobel Prize in Literature, 2015

The Necessity of Awe

From Aeon:

When a scientific paradigm breaks down, scientists need to make a leap into the unknown. These are moments of revolution, as identified by Thomas Kuhn in the 1960s, when the scientists’ worldview becomes untenable and the agreed-upon and accepted truths of a particular discipline are radically called into question. Beloved theories are revealed to have been built upon sand. Explanations that held up for hundreds of years are now dismissed. A particular and productive way of looking at the world turns out to be erroneous in its essentials. The great scientific revolutions – such as those instigated by Copernicus, Galileo, Newton, Lavoisier, Einstein and Wegener – are times of great uncertainty, when cool, disinterested reason alone doesn’t help scientists move forward because so many of their usual assumptions about how their scientific discipline is done turn out to be flawed. So they need to make a leap, not knowing where they will land. But how?

To explain how scientists are able to make this leap, the philosopher of science Bas van Fraassen in The Empirical Stance (2002) drew on Jean-Paul Sartre’s Sketch for a Theory of the Emotions (1939). Sartre was dissatisfied with the major mid-20th-century theories about emotions (especially those by William James and Sigmund Freud) that treated emotions as mere passive states. You might fall in love, or be gripped with jealousy. It seemed that emotions happened to you without any agency on your part. Sartre, by contrast, held that emotions are things that we do. They have a purpose, and they are intentional. For example, when we get angry, we do so to seek a solution, to resolve a tense situation. Sartre wrote:

When the paths before us become too difficult, or when we cannot see our way, we can no longer put up with such an exacting and difficult world. All ways are barred and nevertheless we must act. So then we try to change the world.

The world that Sartre referred to is the world of our subjective experience. It is the world of our needs, our wants, our fears and our hopes. In his view, emotions transform the world like magic. A magical act, such as voodoo, alters the attitude of the practitioner to the world. Magical spells and incantations don’t change the physical environment, but they change our world, by shifting our desires and hopes. Similarly, emotions change our outlook and how we engage with the world. Take Sartre’s example of sour grapes: seeing that the grapes are unreachable, you decide, ‘they are too sour anyway’. Though you didn’t change the chemical property of the grapes in any way, the world has become a bit more bearable. Anticipating contemporary ideas about embodied cognition, Sartre speculated that physical actions help us to produce emotions. We clench our fists in anger. We weep in sadness.

Applying this idea to scientific practice, Van Fraassen argues that scientists draw on their emotions when dealing with new, bewildering ideas, especially those that sprout up during scientific revolutions. If the paradigm is faltering, scientists need to change the way they view the world – and this requires that they change themselves. Scientists need to transform both who they are and what they know. Only once scientists themselves are transformed in this way can they accept a theory that they originally thought outlandish or ridiculous.

There are a few problems with this theory. Van Fraassen doesn’t specify which emotions can help scientists. Would it be sufficient to be intrigued or excited by a new theory, or to feel curiosity? Would anger at the failure of the old paradigm do the job? And it’s not clear how scientists can use emotions to change their minds. Sartre seems at times to assume that we have our emotions under direct voluntary control. But this appears implausible, on the face of it. Surely not all our emotions are under our direct control?

One way to salvage the Sartre and Van Fraassen account is to propose that emotions are under our indirect control. We can’t control our emotions directly, but we can engage in practices that, over time, help to shape how we emotionally respond to a variety of situations. And as for which emotion most helps scientists, I have a particular one in mind: awe.

In their classic account of awe, the psychologists Dacher Keltner and Jonathan Haidt characterise awe as a spiritual, moral and aesthetic emotion. In their view, all clear cases of awe have the following two components: an experience of vastness, and a need for cognitive accommodation of this vastness. You might feel awe for things that are physically large, but also for ideas that are conceptually vast. For example, at the end of the first edition of his Origin of Species (1859), Charles Darwin expressed awe for his theory of natural selection:

There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

The need for cognitive accommodation makes you aware that there is a lot you don’t know. You feel small, insignificant and part of something bigger. In this way, awe is a self-transcendent emotion because it focuses our attention away from ourselves and toward our environment. It is also an epistemic emotion, because it makes us aware of gaps in our knowledge. We can feel overwhelmed looking at the night sky, deeply aware that there is so much we don’t know about the Universe. In one recent study, participants listed nature as their most common elicitor of awe, followed by scientific theories, works of art and the achievements of human cooperation.

The philosopher Adam Morton speculates that epistemic emotions play a crucial role in scientific practice. Imagine a scientist who knows the latest research techniques, and who is intelligent and analytical. If she lacks curiosity, awe and other epistemic emotions, she won’t have the drive to become a good scientist, who can change her mind on the basis of evidence, explore new hypotheses or pay attention to unexpected results. As Van Fraassen argued, to change the field or accept radical changes in it, you need to alter your outlook on the world. Awe can do this. It focuses attention away from yourself and makes you think outside of your usual thought patterns.

Link to the rest at Aeon

I have a history of making decisions very quickly about men

“I have a history of making decisions very quickly about men. I have always fallen in love fast and without measuring risks. I have a tendency not only to see the best in everyone, but to assume that everyone is emotionally capable of reaching his highest potential. I have fallen in love more times than I care to count with the highest potential of a man, rather than with the man himself, and I have hung on to the relationship for a long time (sometimes far too long) waiting for the man to ascend to his own greatness. Many times in romance I have been a victim of my own optimism.”

Elizabeth Gilbert, Eat Pray Love

How To Get Divorced And Learn To Surf In Your 40s

From The Huffington Post:

Newly divorced and edging further into her 40s, Diane Cardwell’s life hadn’t turned out as planned. So one day, out on a reporting trip for The New York Times in Montauk, a beach town at the eastern end of Long Island in New York, Cardwell seized the opportunity when a cottage rental and surfing lesson presented itself.

The result was the first step on a long journey that took her from a version of her life she’d always imagined — the house, the career, the husband — to another version in which she found joy and love again, complete with a new community of friends and a house in Rockaway, a surf town — yes, a surf town — in New York City.

In the spirit of “Wild” by Cheryl Strayed and “Eat, Pray, Love” by Elizabeth Gilbert, Cardwell sets out on a journey to learn to surf and comes to the end of the road a changed woman, transformed by her experience. The result is her memoir, “Rockaway: Surfing Headlong Into a New Life.”

. . . .

Tell me about the seed of this memoir. What inspired you to write it?

I wrote an essay for Vogue after Sandy, and I just got such a tremendous response from it. I wrote the essay during the first, I don’t know, three or four days after the storm. I was in a fugue state. I didn’t know how I was going to help replace my utilities that were gone. And when the editor called and he was like, “We can pay you,” I was like, “That’ll buy a new boiler.” So I ended up doing the story and I was actually very grateful to do it.

So that was the first inkling that maybe I had a subject that might interest people. I mean, I literally had strangers coming up to me on the beach like, “Oh my God, it’s so great to see you back in the water.” And, “I love that story.” And then a couple of years later, I think in 2015, I did a story for the style section that actually yielded the subtitle of the book; the headline was “Surfing Headlong Into a New Life.” That was the piece that showed me that I had an arc, a beginning, a middle, and an end — and even a happy ending where I’m finally able to surf, at least reasonably decently on good days. And I was in love and had this life that I couldn’t have imagined even three years later. So that was how I got to writing the book.

. . . .

Surfing is not something you just pick up and know how to do. It’s a huge undertaking. It boils down to practicing it over and over. There is a scene in “Rockaway” where you and your friend undertake a month of practicing every day: He practices writing and you practice surfing. Could you talk about that?

Right. So I think, well, with surfing specifically, you stand on the beach and you watch good surfers in the water and it looks incredibly easy — and I’m talking about normal-sized surf, not those enormous skyscraper-high waves — but then when you try to do it, it’s actually really hard. And I think some people, especially if they start much younger, it can come really quickly. Their bodies are more flexible, they pick up new things. But surfing isn’t really like that and was really, really not like that for me. I mean, I think of myself as the anti-natural.

I just could not do it for so long, but I loved it. And so I knew that the only way to get even a reasonable competence was going to be to practice. And that month was pivotal for me in a way, because I had this friend who wanted to try to write; I was trying to surf. He said, “What if you were to surf every day? And what if we were both to keep each other honest, and report back?” That kind of accountability really helped me stick to it, because I didn’t feel like getting in the water every day.

I would get up and say, “Oh, it’s cold.” Or, “I’m tired.” Or, “I just want to sit out back in the garden and have a beer.” But it became a thing that I had to do. And so the day was organized around “When am I going to surf?” I think that that kind of practice, and also accountability, is important in anything that you want to pursue in a serious way. And that doesn’t mean that it’s ever going to be something that’s your career or something that you do to the exclusion of other things. But if you want to do it seriously and get the benefits of doing something seriously, then it just takes attention.

Link to the rest at The Huffington Post

A Woman’s Life in Letters

From The Wall Street Journal:

‘To be a woman is to grow up and leave for another household.” So begins “The Great Learning for Women,” an 18th-century Japanese primer attributed to the neo-Confucian scholar Kaibara Ekken. As generation after generation of girls sat down to study, learning the texts and skills that would prepare them for adulthood, this was among the first and most fundamental precepts: They would grow older and, in time, depart their family home to marry into another. Biology and society admitted no alternate possibility.

In the snowy town of Ishigami at the beginning of the 19th century, a girl named Tsuneno likely read this text. The oldest daughter of a Buddhist priest, Tsuneno was expected to be disciplined, skilled and quiet; to marry the man her parents selected from within her family’s social network and raise another generation of devout sons and obedient daughters.

As Amy Stanley, a professor of history at Northwestern University, recounts in her absorbing new book, “Stranger in the Shogun’s City: A Japanese Woman and Her World,” Tsuneno would try, more than once, to fulfill the mandate set out by her family and society. By the age of 35 she had been married and divorced three times, each union shorter than the previous one and none yielding any children. Quarreling with her older brother and desperate to avoid a fourth arranged marriage, Tsuneno left—not for another husband or household, but for the bustling city of Edo (now Tokyo).

With her departure, Tsuneno changed course in dramatic fashion and initiated a furious exchange of correspondence to explain her decisions and persuade her relatives to support them. That trove of letters, carefully preserved by Tsuneno’s family, eventually became part of the Niigata Prefectural Archives. Ms. Stanley read Tsuneno’s words online and followed them to the archive, painstakingly deciphering one messily handwritten document after another until she could assemble the events of Tsuneno’s life. The resulting book is a compelling story, traced with meticulous detail and told with exquisite sympathy.

Other parts of the globe were deep into the Age of Revolution when Tsuneno was born in 1804, but peace had reigned over Tokugawa Japan for nearly two centuries. While Japan was not the “closed empire” others have depicted, Ms. Stanley writes, “it was a sheltered place, inaccessible to most foreigners and at a remove from global markets.” Ishigami, about two weeks’ walk northwest from Edo when the roads were passable, sat even further removed. As Tsuneno and her seven siblings grew up surrounded by the cyclical rhythms of religious ritual, it must have seemed unthinkable that anything would ever change.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Haldane: The Forgotten Statesman Who Shaped Britain and Canada

From The Wall Street Journal:

Current debates about liberalism—especially about whether a free society can thrive alongside ever more urgent calls for government action—bring to mind an episode from the Edwardian era. It was then that the Liberal Party in Britain underwent a kind of identity crisis over policy and principle. Richard Burdon Haldane, later Viscount Haldane, stands out as a key figure in that story. Haldane (1856-1928) was “a picture of the well-fed but poorly slept” lawyer, John Campbell says, a man who combined professional success and public service. Now largely forgotten, Haldane embodied a political type that is familiar to Americans in the high-minded figures of the Progressive Era and in their descendants today, who possess an almost missionary zeal for human betterment.

In “Haldane: The Forgotten Statesman Who Shaped Britain and Canada,” Mr. Campbell, an investment banker with childhood ties to Haldane’s family, makes a persuasive case for his subject’s importance and, along the way, touches on larger questions of culture and governance. The book’s structure—less chronological than propelled by themes and causes—may challenge readers new to the story, but its wealth of detail and insightful character sketches will reward the effort.

As Mr. Campbell shows, Haldane’s family heritage adumbrated his public role. His English mother was descended from an eminent Tory jurist and lord chancellor (a position Haldane would himself occupy). On his father’s side, his grandfather and great-uncle both retired from military service to promote the evangelical movement in Scotland. Though strict Calvinism was at the center of the family’s Scottish home, Haldane lost his faith as a teenager. Over time, and perhaps without realizing the change, he transformed it into a secular commitment to reform and social progress.

. . . .

After practicing law in London and making a good deal of money, he was elected to Parliament in 1885—at a pivotal moment, as Mr. Campbell shows. The Liberal Party had dominated British politics since 1830, but tensions among its members were growing even before William Gladstone split the party in 1886 over Irish Home Rule.

Mr. Campbell describes Haldane as a living embodiment of such tensions—between an older liberalism of laissez-faire economics and limited government and a new kind, which responded to the rising spirit of socialism and organized labor. (It was around this time that Marx’s ideas were being popularized in England.) The young politician sought a rationally organized state along German lines, what his friend Beatrice Webb called “a deliberately organized society.” Haldane’s liberalism went beyond the ideals of an earlier Liberal Party, which had sought to minimize the state’s checks on individual action. He preferred to follow Wilhelm von Humboldt’s idea that government intervention, especially in education, helped citizens cultivate themselves. He thought property owed a debt to society for guaranteeing the wealth it earned. While this new liberalism inspired Haldane and his colleagues to press for costly social reforms, it drove others toward the Conservative Party.

. . . .

Early in his career, Haldane helped establish the London School of Economics, and he guided the University of London toward its becoming a true teaching institution. He advised colleges in provincial cities to extend their access to a wider range of students and social classes, and he drew on German models to improve technical education.

Oddly, though, it is in military matters that Haldane’s legacy is most notable. When, in 1905, a Liberal prime minister—Henry Campbell-Bannerman—reached an impasse with his party’s grandees, he offered Haldane the War Office, hoping that “Schopenhauer,” as he called him, could manage what was viewed as the cabinet’s most thankless job. The post gave Haldane plenty to reform. He restructured the army to cut costs and created an expeditionary force that could be quickly sent abroad with reserves at home to reinforce it. He also created a General Staff to facilitate planning. These changes helped Britain stop the German invasion of France in 1914 and led Sir Douglas Haig to call Haldane “the greatest secretary of war England ever had.”

A career capstone came with Haldane’s elevation to lord chancellor in 1912—the head of Britain’s legal system. As a longtime member of the judicial committee of the privy council, which heard appeals from the empire’s dominions, he played a “leading role” in shaping the development of Canadian law, according to Mr. Campbell. For a time, his knowledge of Germany, and fluency in the language, gave him a liaison role. After Wilhelm II asked him to join a meeting with the kaiser’s ministers, he joked about Haldane’s being the only Englishman to sit in a German cabinet. But tensions with Berlin made his position difficult. On a visit to London, the kaiser invited himself to lunch at Haldane’s home, intensifying doubt about his loyalty—“doubt that would, in time of war, bury Haldane’s reputation and political career,” Mr. Campbell writes.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

What Happened to Those Who Signed the Declaration of Independence? Part 2

This is a continuation of a prior post you can find here.

Oliver Wolcott

A delegate from Connecticut, Oliver Wolcott served as captain and then major general in the state militia. In 1776, he was appointed to lead 14 regiments in defense of New York City. He also commanded thousands of men in the Battle of Saratoga. Wolcott worked tirelessly to recruit for the Connecticut militia, which, like the army in general, was sorely lacking in numbers within its ranks.

William Whipple

William Whipple of New Hampshire served as brigadier general in the state militia. He fought against Gen. Burgoyne at the battles of Stillwater and Saratoga (commonly pointed to as the turning point for Americans in the war) in 1777. The following year, Whipple participated in the retaking of Rhode Island.

Thomas McKean

Thomas McKean of Delaware served as colonel in the Delaware state militia. Once McKean was appointed to the office of President of Delaware in 1777, he was targeted by the British (the British captured John McKinley, the previous president). He had to move his family on five occasions because of raids by both the British and local Indian tribes.

Francis Lewis

Francis Lewis of New York signed the declaration on August 2, 1776. Although Lewis was present when independence was declared a month earlier, the New York delegation had not yet received permission from the state's legislature to sign the document. A few months after he affixed his signature, British troops destroyed Lewis's Long Island estate. They took his wife and put her in prison, where she was tortured on a regular basis. Under the direction of George Washington, she was finally returned in a prisoner exchange two years later.

Benjamin Franklin

Known as the sage of Philadelphia, Benjamin Franklin of Pennsylvania was the oldest of the signers of the declaration. Prior to setting sail for France in late 1776 to ask the French for assistance in the war, Franklin gave his entire fortune to Congress to help fund the war.

John Hart

Hessian mercenaries plundered signer John Hart’s 400-acre farm outside of Hopewell, New Jersey. Prior to his farm being captured, Hart was forced to leave his family because of advancing British troops. During his absence, his wife died, and his children were sent to live with neighbors.

William Ellery

The estate of William Ellery of Rhode Island was burned down during the British occupation of Newport. Ellery served in the Second Continental Congress until the British left Newport, which they had held for three years. He then returned home to salvage what was left of his property.

Joseph Hewes

With his fortunes built on trade, Joseph Hewes of North Carolina was a vigorous proponent of the decision of the First Continental Congress to cut off all imports and exports with the British. This of course had the effect of drying up his wealth. Interestingly, Hewes also renounced his Quaker religion in order to support the war.

James Smith

A delegate from Pennsylvania, James Smith served in the Pennsylvania militia as captain, colonel, and then as brigadier general. He was one of the first to raise men for the possibility of defending his home state, a duty he took up beginning as early as 1774.

Benjamin Harrison

Benjamin Harrison of Virginia, whose son and grandson both served as U.S. presidents, complained in a letter to Gov. William Livingston of New Jersey that his debts had accumulated substantially because of the “ravages” and “plunderings” of the British.

William Floyd

While William Floyd of New York served as a delegate in the Second Continental Congress, the British sacked his estate, forcing his family to flee. Though they made it safely to Connecticut, his family was left without a home for the duration of the war.

William Hooper

William Hooper of North Carolina outlasted British raiders who were looking to capture him and his family. In 1782, he and his family fled Wilmington after it fell to the British. Though much of his property was destroyed, he and his family were reunited at the conclusion of the war.

Lyman Hall

The British destroyed the home and plantation of Lyman Hall of Georgia. Luckily, his family escaped before the British arrived and moved north to be with him.

Link to the rest at The Daily Signal (a publication of The Heritage Foundation)

What Happened to Those Who Signed the Declaration of Independence? Part 1

From The Daily Signal:

Thomas Heyward Jr., Edward Rutledge, and Arthur Middleton

Thomas Heyward Jr. of South Carolina was a signer of both the declaration and the Articles of Confederation. Heyward drew the ire of the British when, as a circuit court judge, he presided over the trial of several loyalists who were found guilty of treason. The prisoners were summarily executed in full view of British troops. In 1779, he joined the South Carolina militia as a captain of artillery.

Heyward’s compatriot in the South Carolina delegation, Edward Rutledge, also served in the state militia. At age 26, Rutledge was the youngest signer of the Declaration of Independence. After returning home from attending the Second Continental Congress in 1777, he joined the militia as captain of an artillery battalion.

Both Heyward and Rutledge aided their country in the battle at Port Royal Island, where they helped Gen. Moultrie defeat British Maj. William Gardiner and his troops.

Arthur Middleton, the last of the South Carolina delegation who served in the militia, took up arms against the British during the siege of Charleston in 1780. His fellow signers, Heyward and Rutledge, fought in that battle as well.

Upon the surrender of Charleston, all three men were captured by the British and were sent to a prison in St. Augustine, Florida, which was reserved for people the British thought were particularly dangerous. They were held there for almost a year before being released. En route to Philadelphia for a prisoner exchange in July 1781, Heyward almost drowned. He survived his fall overboard by clinging to the ship's rudder until he could be rescued.

During the British occupation of Charleston, Commandant Nisbet Balfour ordered the seizure of many estates in Charleston, including those owned by Heyward and Middleton.

During his imprisonment, Heyward’s wife died at home, and his estate and property were heavily damaged. Rutledge’s estate was left intact, but his family had to sell many of their belongings in order to make the trip to Philadelphia to reunite with him after his release. Middleton’s estate was left relatively untouched, but his collection of rare paintings was destroyed during the British occupation of his home.

Thomas Nelson Jr.

Thomas Nelson Jr. of the Commonwealth of Virginia was appointed brigadier general and commander-in-chief of the Virginia militia by Gov. Patrick Henry in August 1777, when a full-scale British invasion of the state was thought imminent. Nelson was able to muster only a few hundred men to defend Virginia, but the British decided to attack Philadelphia instead.

Nelson inherited a vast family fortune, much of which he used to support the American effort. He personally paid for the return journey home of 70 troops he had led to meet the British in Philadelphia during the summer of 1778. In the spring of 1780, Nelson signed his name to a loan for $2 million that was needed to purchase provisions for the French fleet that was coming to America’s aid in the war.

As governor of Virginia during the Battle of Yorktown, Nelson ordered American troops to fire upon his own mansion, which had been commandeered by Gen. Cornwallis and his men.

Richard Stockton

Richard Stockton, a member of the New Jersey delegation, had his estate commandeered by the British for use as a headquarters. As they left, British troops burned all his personal effects, including his library, private papers, furniture, and clothes.

Though Stockton was in hiding at the time, he ultimately did not escape capture; a traitor led the British to his position in November 1776. He was held captive in Amboy, New Jersey, and was then sent to New York City where he was imprisoned in a jail reserved for common criminals. Incensed by his treatment, Congress worked with British Gen. William Howe to obtain his release.

George Walton

Because of his small stature, George Walton was thought to be the youngest of the signers of the declaration (he was actually in his mid-30s). He hailed from Georgia and served as colonel in the first regiment of the state militia in 1778. During the siege of Savannah, a cannonball broke Walton's leg, which led to his being captured. He was held captive for nine months and was released in the early fall of 1779 in a prisoner exchange for a British navy captain.

At the same time Walton was held prisoner, his wife Dorothy was captured by the British. She was imprisoned on an island in the West Indies and was eventually freed after a prisoner exchange. During the Waltons’ confinement, the British ransacked their home.

George Clymer

British troops destroyed the home of George Clymer of Pennsylvania in September 1777 when they captured Philadelphia. Though his home was outside the city, it lay directly in the path of the British march, and American loyalists pointed out to the British the homes belonging to patriots, Clymer's estate among them.

Clymer also contributed to the war monetarily. He converted his entire fortune into continental currency, a risky move considering the likelihood that the currency would be rendered worthless, and he urged wealthy friends to contribute to the American cause.

Robert Morris

A delegate from Pennsylvania, Robert Morris helped ensure Washington's victory at Yorktown by using his own credit to obtain the supplies necessary to defeat the British. He spent more than $1 million (not adjusted for inflation) of his own money to accomplish this.

While serving as superintendent of finance of the United States, Morris regularly used his own financial resources to obtain much needed supplies. Using his own funds, for example, he purchased one thousand barrels of flour for Washington’s men in late spring of 1778.

Lewis Morris

Lewis Morris of New York served as a major general in the state militia. Morris devoted himself to recruiting men for the militia and to keeping up its supplies, a constant problem. For almost the entire length of the war, the British occupied his home, Morrisania, and used it as their headquarters. This forced Morris to live off his close friends and associates until the war ended in 1783.

John Hancock

John Hancock of Massachusetts, the man with the largest signature on the declaration, served in the militia as major general in 1778. Hancock was put in command of approximately 6,000 men during the Rhode Island campaign. That campaign was ultimately unsuccessful because the French failed to carry out their end of the bargain.

Caesar Rodney

Caesar Rodney served in the Delaware militia as well, attaining the rank of brigadier general. Rodney famously rode on horseback straight from Dover to Philadelphia to cast his vote in favor of declaring independence (the Delaware delegation was split). He was with his men in the field during the brutal winter of 1776, helped quash an uprising in Delaware (there were a large number of loyalists within the state), and assisted in George Washington's effort to defend Philadelphia from being taken by the British.

Carter Braxton

Carter Braxton of the Virginia delegation accumulated massive personal debts helping the American effort in the war. He loaned 10,000 pounds sterling to Congress, which was never repaid. He also spent much of his wealth outfitting American ships so that they could carry more cargo. Due to the British capturing some of his vessels and others being lost out on the high seas, he suffered great financial calamity. These accumulated losses left him bankrupt by war’s end.

Link to the rest at The Daily Signal (a publication of The Heritage Foundation)

Part 2 of this list will be posted tomorrow (July 4)

The World

From The Wall Street Journal:

Richard Haass is a prolific author on international affairs, served as a foreign-policy official in the Reagan and both Bush administrations, and is now president of the Council on Foreign Relations. He is, in short, a high-ranking member of American foreign policy’s clerisy. As if to emphasize the point, he relates that the inspiration for his book “The World: A Brief Introduction” began with a day of fishing in Nantucket, where he spoke with a student from Stanford who confessed that he had taken few courses in economics, politics or history. Otherwise educated young people today, Mr. Haass concludes, “are essentially uninformed about the world they are entering.” He hopes to change this state of affairs with “The World.”

What Mr. Haass has written, alas, is a series of dry primers about the world’s regions and their problems. The book is rife with soporific statements with which it would be difficult to disagree: “Economic problems within Europe have been ever more significant. As a result, the Continent has had low rates of growth.” The assumption seems to be that the young have disengaged from the world because they lack access to information. But engagement has fallen even as the internet has made access to information effortless.

Mr. Haass is among the most respected foreign-policy experts in the world and is fully capable of proposing bold ideas that would put American strategy on a more sustainable path. That “The World” offers mostly uncontroversial data points rather than fresh analysis helps to explain why two (and in some respects three) consecutive U.S. administrations have often rejected the dominant views of foreign-policy experts.

The useful parts of the book mostly come in the opening section, which briskly relays the “essential history” of international affairs. The Treaty of Westphalia in 1648 established the nation-state as the basic political unit in Europe. Webs of alliances and the rise of nationalism set the stage for World War I—and trade ties were not enough to prevent it. This context is important because contemporary debates about international relations often proceed as if history started with World War II.  

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)

An exploration of ‘How Innovation Works’

From The Washington Post:

Innovation, Matt Ridley tells us at the start of his new treatise on the subject, “is the most important fact about the modern world, but one of the least well understood.” Even as it functions as a powerful engine of prosperity — the accelerant of human progress — innovation remains the “great puzzle” that baffles technologists, economists and social scientists alike. In many respects, Ridley is on to something. After decades of careful study, we’re still not entirely sure about innovation’s causes or how it can best be nurtured. Is innovation dependent on a lone genius, or is it more a product of grinding teamwork? Does it occur like a thunderclap, or does it take years or even decades to coalesce? Is it usually situated in cities, or in well-equipped labs in office parks?

We can’t even agree on its definition. Generally speaking, an innovation is more than an idea and more than an invention. Yet beyond that, things get confusing. We live in a moment when we’re barraged by new stuff every day — new phones, new foods, new surgical techniques. In the pandemic, we’re confronted, too, with new medical tests and pharmaceutical treatments. But which of these are true innovations and which are novel variations on old products? And while we’re at this game, is innovation limited to just technology, or might we include new additions to our culture, like a radical work of literature, art or film?

Unfortunately, no one happens to be policing the innovation space to say what it is and is not. Mostly we have to allow for judgment calls and an open mind. As an occasional writer on the subject, I tend to define innovation simply, but also flexibly: a new product or process that has both impact and scale. Usually, too, an innovation is something that helps us do something we already do, but in a way that’s better or cheaper. Artificial light is an excellent case study. Over time we’ve moved from candles, to whale oil and kerosene lamps, to incandescent and fluorescent bulbs, and now to LEDs. Or, as another example, we might look to one of the great accomplishments of the 20th century, the Haber-Bosch process to make synthetic fertilizer, as a leap that changed the potential of agricultural production. On the other hand, we can regard the Juicero press — a recent Silicon Valley-backed idea that promised to “disrupt” the juice market and burned up more than $100 million in the process — as a fake or failed innovation. And still, this leaves us plenty of room for disagreement about what falls between these extremes and why.

Ridley enters into this messy arena with the intent of organizing the intellectual clutter. The first half of his book, “How Innovation Works: And Why It Flourishes in Freedom,” takes us on a tour through some highlights in the history of innovation. We visit with the early developers of the steam engine, witness the events leading to the Wright brothers’ first flight at Kitty Hawk, N.C., and hear about the industrialization of the Haber-Bosch fertilizer process. There are likewise forays back to the early days of automobiles and computing, the development of smallpox vaccines and clean drinking water, and stories that trace the origins of the Green Revolution in agriculture, which alleviated famine for more than 1 billion people. For dedicated science readers, Ridley’s lessons may have a glancing and derivative feel. He knits together stories many of us have probably heard before — say, through the renditions of writers like Steven Johnson, Charles Mann or Walter Isaacson — but somehow misses the opportunity to enliven these sketches with a sense of wonder and surprise. More seriously, he skirts the opportunity to footnote his summarizations, leaving only a skeletal guide to sources in his back pages.

What becomes clear, though, is that Ridley is focused less on exploring the pageant of history than on fashioning a new belief system. I don’t necessarily mean this as a critique; in fact, the second half of his book — where he looks closely, chapter by chapter, at the factors that shaped the innovations he’s spent his first 200 pages describing — is more polemical in its approach but often more engaging, even as one might disagree with a narrative direction that arises from what I would characterize as the libertarian right. 

Link to the rest at The Washington Post

May heaven protect the unsuspecting Washington Post reader from any political attitudes not consistent with the paper’s editorial page.

The Human Factor

From The Wall Street Journal:

In “The President, the Pope, and the Prime Minister” (2006), the journalist John O’Sullivan asserted that the Cold War had been won by Ronald Reagan, John Paul II and Margaret Thatcher. “Without Reagan,” he stated, “no perestroika or glasnost either.” This belief, according to Archie Brown, emeritus politics professor at Oxford University, is nothing less than “specious.” In “The Human Factor,” Mr. Brown gives most of the credit for the Cold War’s end to Mikhail Gorbachev, whom he presents as almost a pacifist who voluntarily wound up the Soviet Union, albeit with a little assistance from Thatcher. So who is right?

The title of Mr. Brown’s last book, “The Myth of the Strong Leader” (2014), suggests that he might have a philosophical problem with the Great Man and Woman theory of history, and he certainly underplays the role of John Paul II during the last decade of the Cold War. The pope’s call for spiritual renewal and for freedom, not least for his native Poland, stirred the hearts of millions, but he rates only five anodyne sentences in 400 pages.

Mr. Brown was awarded a British honor in 2005 “for services to UK-Russian relations.” One Russian in particular—Mr. Gorbachev—gets lauded in the current work for his “bold leadership,” “new ideas,” “formidable powers of persuasion,” “embrace of democratization,” “emphasis on freedom of choice” and so on. At best, Reagan, George Shultz, George H.W. Bush and the others are praised for their “constructive engagement.” At worst, Reagan is criticized for introducing “complications” to an already begun process of Russian collapse.

At no point does Mr. Brown acknowledge that the primary reason that Mr. Gorbachev liberalized the Soviet Union was that Reagan, Thatcher and other Western leaders forced him to, by keeping Western defenses strong and mercilessly exposing the moral bankruptcy—and looming economic bankruptcy too—of what Reagan accurately called Russia’s “evil empire.”

For Mr. Brown, Reagan lacked sophistication, and his style was all wrong for high-minded diplomacy. It was a familiar critique at the time, though one would think that, with the end of the Cold War, it had lost its plausibility. Still, Mr. Brown hopes to revive it. “In his speeches, at every stage of his career,” Mr. Brown complains of Reagan, “he used stories and ‘quotations’ that came from very unreliable sources or from the recesses of his own mind, often drawing on films he had acted in or seen. . . . For Reagan, whether they were actually true or not appeared less important than the part they played in his narrative.”

A president who told unreliable jokes and unverifiable stories! Lincoln fits the description, as do a dozen other U.S. presidents. A folksy informality and a raconteur's skill are generally thought to be assets in politics.

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)

PG notes that TPV is a blog focused on the contemporary business of writing, not politics. He will also note that since much of the publishing world, indie and traditional appears to be sheltering in place, he sometimes casts his net a bit wider than he might absent the publishing commentary drought.

(Yes, PG does recognize a sort of mixed metaphor in the “casting his net” and “drought” combination.)


If a book has been in print for forty years, I can expect it to be in print for another forty years. But, and that is the main difference, if it survives another decade, then it will be expected to be in print another fifty years. This, simply, as a rule, tells you why things that have been around for a long time are not “aging” like persons, but “aging” in reverse. Every year that passes without extinction doubles the additional life expectancy. This is an indicator of some robustness. The robustness of an item is proportional to its life!

Nassim Taleb, Antifragile


From The Wall Street Journal:

On April 27, 1939, the British government announced plans to conscript young men for military training. It was a dramatic departure: Never previously in its modern history had the nation conscripted men for the military in time of peace. As the prime minister, Neville Chamberlain, explained to the public, however, with countries all over Europe preparing for battle, and everyone fearing a war might start at any moment, “no one can pretend that this is peacetime in any sense in which the term could fairly be used.”

This liminal period, starting with the sighs of relief at the signing of the Munich Agreement in September 1938, is the subject of Frederick Taylor’s “1939: A People’s History of the Coming of the Second World War.” Mr. Taylor, whose previous works about the period include “Coventry: November 14, 1940” and “Dresden: Tuesday, February 13, 1945,” charts the escalating tensions as Hitler’s brinkmanship pushed Europe to the edge of war, and the insidious onset of a “wartime” mood across Europe, even before German forces invaded Poland. The book concerns the United Kingdom and Germany, and it intersperses clear explanations of the decisions being taken by statesmen with the way these were experienced by “ordinary” people in both countries.

Rich in social and cultural details that bring the era to life, “1939” makes use of a range of eyewitness testimony and contemporary assessments of public opinion, which together illuminate the variety of individual experience within a historic moment in international affairs. Discussions of the ways that new forms of entertainment, such as television and cheap holiday camps, appeared in Germany and in Britain illuminate both the similarities among European experiences and the stark cultural and political differences. Though each chapter deals with a month, Mr. Taylor dives back into the 1930s to explain the back story of that final year of “peace.”

But Mr. Taylor’s inverted commas on “ordinary” are necessary. The figures to whose testimony Mr. Taylor returns throughout the book are German: the journalist (and later anti-Nazi resister) Ruth Andreas-Friedrich and the well-connected novelist and screenwriter Erich Ebermayer. Their diary accounts provide the self-scrutinizing outsiders’ view of the mainstream that, for the British part of his story, comes from the more numerous contributors to the social research project Mass-Observation, the surviving archives of which are such a boon for historians of this period.

. . . .

Mr. Taylor [keeps] up the momentum of a much-told story—the coming of the European war—while conveying a powerful sense of what it felt like to watch the precipice approach.

For some, the drop had already begun. Matching up the dynamics of genocide and war, Mr. Taylor explains how ordinary Germans carried on as attacks on Jews became part of national and civic life. The author is very good at showing the fear and horror produced by escalating Nazi violence, as well as the bizarre dualities that resulted as everyday routines continued around them. Walking to church or the cinema over the smashed glass from shop windows and through the smoke from burning synagogues, gentile Germans managed not to feel that their world was disintegrating around them. Even Britons who got past the casual anti-Semitism typical of the age to offer aid to Jewish refugees, meanwhile, remained remarkably convinced that decent Germans would one day reject Nazi brutality.

What worried everyone was the onset of another world war, when the last one was fresh in memory. Mr. Taylor quotes one report from a local Nazi party official about popular reactions to the invasion of Poland in the Westphalian city of Bielefeld: The last great war, the document observed, had “returned remarkably vividly to people’s memories, its misery, its four-year duration, its two million German fallen. No enthusiasm for war at all.” That the German people acquiesced speaks not only to the power of Nazi propaganda, which used modern means to tap into deeper strands of European anti-Semitism, but also to the degree to which life was already militarized by September 1939. For all the horror at the slaughter a generation before, mobilizing to fight was something that this state—and this society—knew how to do.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Pencil Leaners

From Public Books:

Between 1935 and 1939, the Federal Writers’ Project (FWP)—an initiative funded by the Works Progress Administration under the New Deal—provided employment for some 6,000 jobless writers [in the United States]. Today, as stunned authors in Australia and around the world come to terms with the economic consequences of the coronavirus pandemic, that experiment deserves reconsideration. As the ABC recently noted, Australian writers—who earn, on average, less than $13,000 directly from their work each year—will be affected on multiple levels: by the cancellation of festivals, talks, and other paying gigs; by the closure of bookshops; by redundancies and cuts in publishing houses; and by job losses in the related industries (from academia to hospitality) through which they supplement their incomes.

It was the American New Deal more than anything else that legitimated the kind of stimulus packages again being discussed in Australia not just for the arts but across the economy. When Franklin D. Roosevelt took office, the crisis of the Great Depression forced him, despite his own fiscal conservatism, to rush through various rescue measures of a now-familiar nature. The US government guaranteed bank loans to prevent further financial collapses; it encouraged industrial cartels to control prices and production levels; it purchased unsold crops from farmers; and through the Civil Works Administration, the Federal Emergency Relief Administration, and, eventually, the Works Progress Administration it sought to create jobs.

Recent calls for postpandemic bailouts for artists in general or writers implicitly evoke that legacy.

. . . .

Obviously, the publishing scene today—dominated by vast multinationals, for whom books are merely part of a broader engagement with the “entertainment industry”—differs greatly from the more small-scale milieu of the 1930s. Even so, it’s still worth noting how contemporary thinking about funding literature differs from the Federal Writers’ Project in several important ways.

Most importantly, the job schemes of the 1930s as a whole, including the Writers’ Project, emerged from intense class struggles in a way that today’s plans do not.

In her history of the Works Progress Administration, Nancy E. Rose writes:

Starting in early 1930, unemployed councils, organized by the Communist Party, began to lead hunger marches to demand more relief. On March 6, 1932, which was proclaimed International Unemployment Day, hunger marches took place throughout the country. … In general, cities with strong Unemployed Councils provided better relief.

Agitation by the unemployed coincided with intensified industrial disputation. By 1934, some 1.5 million workers were on strike and FDR went to the polls the following year in the midst of a massive wave of industrial action, in which the newly formed Congress of Industrial Organizations played an important role. Those titanic clashes paved the way for the Second New Deal, under which the most significant reforms (including the WPA) were implemented.

Crucially, writers themselves fought, through explicitly political groups like the Writers’ Union and [before that] the Unemployed Writers’ Association, for the program from which they benefited. In 1934, the UWA’s secretary Robert Whitcomb explained:

The unemployed writers of New York City do not intend to continue under the semi-starvation conditions meted out to them. If the government does not intend to formulate some policy regarding the class of intellectual known as a writer … then the writer must organize and conduct a fight to better his condition.

The following year, with something like a quarter of the entire publishing industry out of work, the two organizations launched a widely publicized picket of the New York Port Authority, in which their members carried signs reading: “Children Need Books. Writers Need Bread. We Demand Projects.”

. . . .

The authors employed by the FWP included many who went on to conventional success, people like Nelson Algren, Saul Bellow, Arna Bontemps, Malcolm Cowley, Ralph Ellison, Zora Neale Hurston, Claude McKay, Kenneth Patchen, Philip Rahv, Kenneth Rexroth, Harold Rosenberg, Studs Terkel, Margaret Walker, Richard Wright, Frank Yerby, and others. As David A. Taylor notes in Soul of a People, his history of the FWP, “four of the first ten winners of the National Book Award in fiction and one in poetry came from this emergency relief project.”

. . . .

Thus, even though the program did actively recruit some literary stars, the author Anzia Yezierska, who’d previously worked in Hollywood, experienced enlisting in the New York FWP as a kind of proletarianization. “There was,” she wrote later, “a hectic camaraderie among us, though we were as ill-assorted as a crowd on a subway express, spinster poetesses, pulp specialists, youngsters … veteran newspapermen, art-for-art’s-sake literati, clerks and typists … people of all ages, all nationalities, all degrees of education, tossed together in a strange fellowship of necessity.”

Not everyone approved of this camaraderie—W. H. Auden dismissed it as “absurd”; one of the project’s own directors complained that “all the misfits and maniacs on relief have been dumped here.”

. . . .

The FWP faced especial hostility and ridicule, with one editorialist complaining that literary “pencil leaners” would join the “shovel leaners” of the WPA. Again, the authorities stressed the project’s utility, with its remit described in an official announcement as the

employment of writers, editors, historians, research workers, art critics, architects, archaeologists, map draftsmen, geologists, and other professional workers for the preparation of an American Guide and the accumulation of new research material on matters of local, historical, art and scientific interest in the United States; preparation of a complete encyclopedia of government functions and periodical publications in Washington; and the preparation of a limited number of special studies in the arts, history, economics, sociology, etc., by qualified writers on relief.

It duly enlisted its staff to labor on perhaps a thousand volumes, including 50 state and territorial guides, 30 city guides and 20 regional guides. David Taylor describes these texts, composed by a dazzling group of writers, as “a multifaceted look at America by Americans, assembled during one of the greatest crises in the country.”

Many writers resented their tasks (at one point, Yezierska was sent to catalog the trees in Central Park); many worked on their own manuscripts on the side.

. . . .

In books like Gumbo Ya-Ya: A Collection of Louisiana Folk Tales, Bibliography of Chicago Negroes, and Drums and Shadows: Survival Studies among the Georgia Coastal Negroes, FWP employees collected the folklore that Zora Neale Hurston described as “the boiled-down juice of human living.” They interviewed people who had been enslaved, generating an astonishing assemblage of reminiscences. It’s thanks to the FWP that we have a small number of audio clips in which we can hear the actual voices of the survivors of slavery explaining what was done to them.

Alfred Kazin described how, in the late 1930s:

Whole divisions of writers now fell upon the face of America with a devotion that was baffled rather than shrill, and an insistence to know and to love what it knew that seemed unprecedented. Never before did a nation seem so hungry for news of itself.

Link to the rest at Public Books

Women’s Ways of Aging

From Public Books:

As the coronavirus pandemic continues to rage, it intensifies fears of aging and debility that characterize our culture of fitness and drive our aspirations to bodily invincibility. The stigma of aging affects women differentially. While feminists have touted the achievements of older women and insisted that the later years can be the best, we now find ourselves on the other side of an increasingly solid barrier between a “younger” population and an “elderly,” “older,” or “old” one. Those of us who are age 65 or older are the most vulnerable and at risk, both in need of extra protection and most likely to lose out in the triage battle for hospital beds and ventilators. At the same time, our vulnerability to the virus makes it impossible for many of us in this age cohort to participate in the historic street protests we are condemned to witness from afar.

This is therefore a good moment to assess our experiences of aging, and to face our own attitudes more squarely. Rather than battling an ageist and sexist media by insisting that older women can do and be more than ever before by working and playing harder, might we instead focus on care and interdependence, accepting rather than disavowing bodily, emotional, and social vulnerabilities? Rather than celebrating individual victories against aging and mortality, we might embrace a communal ethos of mutuality to which the old have a great deal to contribute.

In proclaiming older women’s powers, the titles of two recent books give a clear sense of their tone and mission: No Stopping Us Now: The Adventures of Older Women in American History, by journalist Gail Collins, and In Our Prime: How Older Women Are Reinventing the Road Ahead, by communications and media scholar Susan J. Douglas. Indignant about the blatant disparagement of older women that characterizes our moment, Collins and Douglas take a celebratory, if not outright triumphalist, tone. Both search for greater social importance and acceptance of older women in earlier historical periods and find examples of their unrelenting energy and productivity today. Both books encourage all women to fight against gendered ageism. They call for forms of cultural recognition that would better represent what their authors see as older women’s mostly positive experiences of aging.

. . . .

In a whirlwind journey through United States history, from the colonial period to today, No Stopping Us Now traces changes in opportunities for and attitudes toward older women. With spirit and energy, Collins leads us through the lives of numerous, mostly well-known older women who wielded considerable influence at different historical moments. Although the book touches upon larger economic arguments about shifting social roles available to mature women—brought about by the need for their products in colonial times, for example, or the opportunities for widows to run their husbands’ farms or businesses—Collins is more interested in how individual women were able to circumvent prejudices and taboos, and thereby thrive in their later years. Collins’s story is one not so much of steady progress as it is of a series of gains and losses, advances and declines—a story that leads to what she sees as today’s open future of increased possibility.

Thanks to Collins, one certainly gets a sense of women’s energy and activity, which is hard to reconcile with popular attitudes of gendered ageism, then and now. She paints vivid portraits, for example, by following the writing, publishing, and public-speaking “adventures” of 19th-century luminaries like Sarah Josepha Hale, who continued writing until she was 89; Elizabeth Cady Stanton, who urged middle-class women to start a whole new life in their 50s; Catharine Beecher, who took courses at Cornell in her 70s; and Jane Addams, who advocated a postponement of old age.

Notably, historians studying American women have analyzed the feminist strategies these and lesser-known women used to advance their work: by seemingly conforming to set gender roles, even as they radically subverted them. Collins, meanwhile, is content to tell these stories chronologically, ending with encouraging contemporary examples that range from Ruth Bader Ginsburg and Nancy Pelosi to Gloria Steinem and Helen Mirren. She does fold these individual white women into a broad historical sweep that also includes exceptional African American figures like Sojourner Truth, Harriet Tubman, Frances Harper, and 98-year-old National Park Service ranger Betty Reid Soskin. Yet she only mentions—without analyzing in any depth—how gendered prejudices are structurally inflected by racial, economic, and other social inequalities.

Link to the rest at Public Books

Anti-Semitism and the Intellectuals

From The Wall Street Journal:

George Eliot was at the peak of her renown in 1874 when John Blackwood, her publisher, learned that she was at work on “Daniel Deronda,” a new novel. As a literary man, he was in thrall to her genius. As a businessman with an instinct for the market, he valued her passionately dedicated readership. But an early look at portions of her manuscript astonished and appalled him: Too much of it was steeped in sympathetic evocations of Jews, Judaism and what was beginning to be known as Zionism.

All this off-putting alien erudition struck him as certain to be more than merely unpopular. It was personally tasteless, it went against the grain of English sensibility, it was an offense to the reigning political temperament. It was, in our notorious idiom, politically incorrect. Blackwood was unquestionably a member of England’s gentlemanly intellectual elite. In recoiling from Eliot’s theme, he showed himself to be that historically commonplace figure: an intellectual anti-Semite.

Anti-Semitism is generally thought of as brutish, the mentality of mobs, the work of the ignorant, the poorly schooled, the gutter roughnecks, the torch carriers. But these are only the servants, not the savants, of anti-Semitism. Mobs execute, intellectuals promulgate. Thugs have furies, intellectuals have causes.

The Inquisition was the brainchild not of illiterates, but of the most lettered and lofty prelates. Goebbels had a degree in philology. Hitler fancied himself a painter and doubtless knew something of Dürer and da Vinci. Pogroms aroused the murderous rampage of peasants, but they were instigated by the cream of Russian officialdom. The hounding and ultimate expulsion of Jewish students from German universities was abetted by the violence of their Aryan classmates, but it was the rectors who decreed that only full-blooded Germans could occupy the front seats. Martin Heidegger, the celebrated philosopher of being and non-being, was quick to join the Nazi Party, and as himself a rector promptly oversaw the summary ejection of Jewish colleagues.

Stupid mobs are spurred by clever goaders: The book burners were inspired by the temperamentally bookish—who else could know which books to burn? Even invidious folk myths have intellectual roots, as when early biblical linguists mistranslated as horns the rays of light emanating from Moses’ brow.

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)

The Woman Who Cracked the Anxiety Code

From The Wall Street Journal:

As we have been reminded of late, there is an astonishing complexity—and at times fragility—to our mental and physical health, and we owe a debt to the legions of scientists whose insights and discoveries, over the years, have improved our chances of well-being. Alas, too many of them are unknown to us. One name that was once broadly known has fallen into lamentable obscurity—that of Claire Weekes, an Australian doctor who did ground-breaking work on one of the great scourges of humanity. With Judith Hoare’s “The Woman Who Cracked the Anxiety Code,” we have a chance to learn about Weekes’s varied life and, as important, become reacquainted with her work.

Decades before her death in 1990 at the age of 87, Weekes had been a global sensation, reaching millions of people through her books—“transfusions of hope,” she called them. One of the original self-helpers, she believed that sufferers could master themselves without the aid of professionals, and the strategies she gave them were firmly grounded in the biology of anxiety.

Weekes didn’t plan on medicine as a career, Ms. Hoare tells us. In 1928, at the age of 25, she began graduate studies in zoology in London on a prestigious fellowship. When her beloved mentor died of a stroke, she developed severe heart palpitations. Doctors misdiagnosed her condition as tuberculosis and sent her to a sanatorium. There she fell into a general state of fear. Six months later, doctors retracted their diagnosis, and Weekes, now nearly incapacitated by stress, resumed her research.

The turning point came when she confided in a friend, a World War I veteran, that she suffered from a frenzied heartbeat. “Far from being surprised or concerned,” Ms. Hoare writes, “he shrugged,” saying: “Those are only the symptoms of nerves.” He told Weekes, in Ms. Hoare’s paraphrase, that “her heart continued to race because she was frightened of it. It was programmed by her fear. This made immediate sense.”

The explanation was deceptively profound, going straight to the core of the mind-body connection. 

. . . .

Weekes had hypothesized a “first fear and second fear” process. The first is a reflex—and the problem in many anxiety disorders is that the reflex is set off for no obvious reason. The second is the conscious feeling of fear. Relief of suffering, for her, came when she learned to quell the “fear of the first fear,” thereby short-circuiting the cycle that was set in motion by the original, unbidden rush of panic: the pounding heart. According to Ms. Hoare, Weekes “immediately grasped the point that she needed to stop fighting the fear.” She had cracked the code.

But this insight would not reach the public for another 30 years. After becoming the first woman to be awarded the degree of Doctor of Science at Sydney University, Weekes conducted research in endocrinology and neurology. Eventually she sought a more pragmatic occupation and enrolled in medical school at age 38. During her work as a general practitioner, she felt special sympathy for her anxious patients and began to counsel them to do as she herself had done: “float past” panic, give bodily sensations and fearful thoughts no power. One of her patients asked for written advice. Her pages to him became “Self Help for Your Nerves,” published in 1962, when Weekes was 59; the book rocketed up the bestseller lists in the U.S. and the U.K. As Ms. Hoare shows, Weekes’s contributions to human welfare live on in mindfulness training and forms of behavioral therapy, sometimes combined with medication. Contemporary neuroscience has vindicated her theory.

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)

How to Fill a Yawning Gap

From The Wall Street Journal:

Is boredom really all that interesting? Thanks perhaps to the subject’s dreary durability, it has generated a considerable literature over the years. Alberto Moravia wrote an engaging novel called “Boredom,” and psychologists, philosophers and classicists have also had their say.

“Out of My Skull,” the latest work on this strangely alluring topic, has an exciting title, but nothing about the book is wild or crazy. James Danckert and John D. Eastwood, a pair of psychologists in Canada, know an awful lot about the subject (Mr. Eastwood even runs a Boredom Lab at York University), and they examine it methodically. “In our view, being bored is quite fascinating, and maybe, just maybe, it might even be helpful,” they write, echoing predecessors who find boredom salutary. “Boredom is a call to action, a signal to become more engaged. It is a push toward more meaningful and satisfying actions. It forces you to ask a consequential question: What should I do?”

A taxonomy of boredom, if it’s to avoid exemplifying what it describes, ought to be simple. So let’s just say that boredom is of two kinds. The first is better known to us as ennui, and the democratization of this once-rarefied feeling is one of civilization’s triumphs. At first the preserve of aristocrats and later taken up by intellectuals, nowadays it is available to affluent citizens everywhere. Our endless search for palliatives in the face of this affliction underpins the consumer economy.

The other kind of boredom is the version that most of us get paid for. Commentators on boredom usually genuflect briefly toward factory workers, nannies and other hard-working members of the hoi polloi whose tasks can be mind-numbing. But such people live with a version of boredom that intellectuals find, well, boring. So the focus is usually on the self-important existential variety.

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)