An exploration of ‘How Innovation Works’

From The Washington Post:

Innovation, Matt Ridley tells us at the start of his new treatise on the subject, “is the most important fact about the modern world, but one of the least well understood.” Even as it functions as a powerful engine of prosperity — the accelerant of human progress — innovation remains the “great puzzle” that baffles technologists, economists and social scientists alike. In many respects, Ridley is on to something. After decades of careful study, we’re still not entirely sure about innovation’s causes or how it can best be nurtured. Is innovation dependent on a lone genius, or is it more a product of grinding teamwork? Does it occur like a thunderclap, or does it take years or even decades to coalesce? Is it usually situated in cities, or in well-equipped labs in office parks?

We can’t even agree on its definition. Generally speaking, an innovation is more than an idea and more than an invention. Yet beyond that, things get confusing. We live in a moment when we’re barraged by new stuff every day — new phones, new foods, new surgical techniques. In the pandemic, we’re confronted, too, with new medical tests and pharmaceutical treatments. But which of these are true innovations and which are novel variations on old products? And while we’re at this game, is innovation limited to just technology, or might we include new additions to our culture, like a radical work of literature, art or film?

Unfortunately, no one happens to be policing the innovation space to say what it is and is not. Mostly we have to allow for judgment calls and an open mind. As an occasional writer on the subject, I tend to define innovation simply, but also flexibly: a new product or process that has both impact and scale. Usually, too, an innovation is something that helps us do something we already do, but in a way that’s better or cheaper. Artificial light is an excellent case study. Over time we’ve moved from candles, to whale oil and kerosene lamps, to incandescent and fluorescent bulbs, and now to LEDs. Or, as another example, we might look to one of the great accomplishments of the 20th century, the Haber-Bosch process to make synthetic fertilizer, as a leap that changed the potential of agricultural production. On the other hand, we can regard the Juicero press — a recent Silicon Valley-backed idea that promised to “disrupt” the juice market and burned up more than $100 million in the process — as a fake or failed innovation. And still, this leaves us plenty of room for disagreement about what falls between these extremes and why.

Ridley enters into this messy arena with the intent of organizing the intellectual clutter. The first half of his book, “How Innovation Works: And Why It Flourishes in Freedom,” takes us on a tour through some highlights in the history of innovation. We visit with the early developers of the steam engine, witness the events leading to the Wright brothers’ first flight at Kitty Hawk, N.C., and hear about the industrialization of the Haber-Bosch fertilizer process. There are likewise forays back to the early days of automobiles and computing, the development of smallpox vaccines and clean drinking water, and stories that trace the origins of the Green Revolution in agriculture, which alleviated famine for more than 1 billion people. For dedicated science readers, Ridley’s lessons may have a glancing and derivative feel. He knits together stories many of us have probably heard before — say, through the renditions of writers like Steven Johnson, Charles Mann or Walter Isaacson — but somehow misses the opportunity to enliven these sketches with a sense of wonder and surprise. More seriously, he skirts the opportunity to footnote his summarizations, leaving only a skeletal guide to sources in his back pages.

What becomes clear, though, is that Ridley is focused less on exploring the pageant of history than on fashioning a new belief system. I don’t necessarily mean this as a critique; in fact, the second half of his book — where he looks closely, chapter by chapter, at the factors that shaped the innovations he’s spent his first 200 pages describing — is more polemical in its approach but often more engaging, even as one might disagree with a narrative direction that arises from what I would characterize as the libertarian right. 

Link to the rest at The Washington Post

May heaven protect the unsuspecting Washington Post reader from any political attitudes not consistent with the paper’s editorial page.

The Human Factor

From The Wall Street Journal:

In “The President, the Pope, and the Prime Minister” (2006), the journalist John O’Sullivan asserted that the Cold War had been won by Ronald Reagan, John Paul II and Margaret Thatcher. “Without Reagan,” he stated, “no perestroika or glasnost either.” This belief, according to Archie Brown, emeritus politics professor at Oxford University, is nothing less than “specious.” In “The Human Factor,” Mr. Brown gives most of the credit for the Cold War’s end to Mikhail Gorbachev, whom he presents as almost a pacifist who voluntarily wound up the Soviet Union, albeit with a little assistance from Thatcher. So who is right?

The title of Mr. Brown’s last book, “The Myth of the Strong Leader” (2014), suggests that he might have a philosophical problem with the Great Man and Woman theory of history, and he certainly underplays the role of John Paul II during the last decade of the Cold War. The pope’s call for spiritual renewal and for freedom, not least for his native Poland, stirred the hearts of millions, but he rates only five anodyne sentences in 400 pages.

Mr. Brown was awarded a British honor in 2005 “for services to UK-Russian relations.” One Russian in particular—Mr. Gorbachev—gets lauded in the current work for his “bold leadership,” “new ideas,” “formidable powers of persuasion,” “embrace of democratization,” “emphasis on freedom of choice” and so on. At best, Reagan, George Shultz, George H.W. Bush and the others are praised for their “constructive engagement.” At worst, Reagan is criticized for introducing “complications” to an already begun process of Russian collapse.

At no point does Mr. Brown acknowledge that the primary reason that Mr. Gorbachev liberalized the Soviet Union was that Reagan, Thatcher and other Western leaders forced him to, by keeping Western defenses strong and mercilessly exposing the moral bankruptcy—and looming economic bankruptcy too—of what Reagan accurately called Russia’s “evil empire.”

For Mr. Brown, Reagan lacked sophistication, and his style was all wrong for high-minded diplomacy. It was a familiar critique at the time, though one would think that, with the end of the Cold War, it had lost its plausibility. Still, Mr. Brown hopes to revive it. “In his speeches, at every stage of his career,” Mr. Brown complains of Reagan, “he used stories and ‘quotations’ that came from very unreliable sources or from the recesses of his own mind, often drawing on films he had acted in or seen. . . . For Reagan, whether they were actually true or not appeared less important than the part they played in his narrative.”

A president who told unreliable jokes and unverifiable stories! Lincoln fits the description, as do a dozen other U.S. presidents. Showing a folksy informality and raconteur skill is thought to be an asset in politics.

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)

PG notes that TPV is a blog focused on the contemporary business of writing, not politics. He will also note that, since much of the publishing world, indie and traditional, appears to be sheltering in place, he sometimes casts his net a bit wider than he would absent the publishing commentary drought.

(Yes, PG does recognize a sort of mixed metaphor in the “casting his net” and “drought” combination.)

Antifragile

If a book has been in print for forty years, I can expect it to be in print for another forty years. But, and that is the main difference, if it survives another decade, then it will be expected to be in print another fifty years. This, simply, as a rule, tells you why things that have been around for a long time are not “aging” like persons, but “aging” in reverse. Every year that passes without extinction doubles the additional life expectancy. This is an indicator of some robustness. The robustness of an item is proportional to its life!

Nassim Taleb, Antifragile
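
For readers who like their aphorisms with a little arithmetic, Taleb’s rule (often called the “Lindy effect”) has a tidy probabilistic reading. As an illustrative sketch only, assuming (the assumption is this blog’s, not the book’s) that a title’s total time in print T follows a power law:

$$ P(T > t) = \left(\frac{t_0}{t}\right)^{\alpha} \quad\Longrightarrow\quad \mathbb{E}[\,T - t \mid T > t\,] = \frac{t}{\alpha - 1}, \qquad \alpha > 1. $$

With $\alpha = 2$, the expected additional time in print equals the time already survived: forty years in print predicts another forty, and a title that makes it to fifty is expected to last roughly another fifty, which is just the pattern the quoted passage describes.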

1939

From The Wall Street Journal:

On April 27, 1939, the British government announced plans to conscript young men for military training. It was a dramatic departure: Never previously in its modern history had the nation conscripted men for the military in time of peace. As the prime minister, Neville Chamberlain, explained to the public, however, with countries all over Europe preparing for battle, and everyone fearing a war might start at any moment, “no one can pretend that this is peacetime in any sense in which the term could fairly be used.”

This liminal period, starting with the sighs of relief at the signing of the Munich Agreement in September 1938, is the subject of Frederick Taylor’s “1939: A People’s History of the Coming of the Second World War.” Mr. Taylor, whose previous works about the period include “Coventry: November 14, 1940” and “Dresden: Tuesday, February 13, 1945,” charts the escalating tensions as Hitler’s brinkmanship pushed Europe to the edge of war, and the insidious onset of a “wartime” mood across Europe, even before German forces invaded Poland. The book concerns the United Kingdom and Germany, and it intersperses clear explanations of the decisions being taken by statesmen with the way these were experienced by “ordinary” people in both countries.

Rich in social and cultural details that bring the era to life, “1939” makes use of a range of eyewitness testimony and contemporary assessments of public opinion, which together illuminate the variety of individual experience within a historic moment in international affairs. Discussions of the ways that new forms of entertainment, such as television and cheap holiday camps, appeared in Germany and in Britain illuminate both the similarities among European experiences and the stark cultural and political differences. Though each chapter deals with a month, Mr. Taylor dives back into the 1930s to explain the back story of that final year of “peace.”

But Mr. Taylor’s inverted commas on “ordinary” are necessary. The figures to whose testimony Mr. Taylor returns throughout the book are German: the journalist (and later anti-Nazi resister) Ruth Andreas-Friedrich and the well-connected novelist and screenwriter Erich Ebermayer. Their diary accounts provide the self-scrutinizing outsiders’ view of the mainstream that, for the British part of his story, comes from the more numerous contributors to the social research project Mass-Observation, the surviving archives of which are such a boon for historians of this period.

. . . .

Mr. Taylor [keeps] up the momentum of a much-told story—the coming of the European war—while conveying a powerful sense of what it felt like to watch the precipice approach.

For some, the drop had already begun. Matching up the dynamics of genocide and war, Mr. Taylor explains how ordinary Germans carried on as attacks on Jews became part of national and civic life. The author is very good at showing the fear and horror produced by escalating Nazi violence, as well as the bizarre dualities that resulted as everyday routines continued around them. Walking to church or the cinema over the smashed glass from shop windows and through the smoke from burning synagogues, gentile Germans managed not to feel that their world was disintegrating around them. Even Britons who got past the casual anti-Semitism typical of the age to offer aid to Jewish refugees, meanwhile, remained remarkably convinced that decent Germans would one day reject Nazi brutality.

What worried everyone was the onset of another world war, when the last one was fresh in memory. Mr. Taylor quotes one report from a local Nazi party official about popular reactions to the invasion of Poland in the Westphalian city of Bielefeld: The last great war, the document observed, had “returned remarkably vividly to people’s memories, its misery, its four-year duration, its two million German fallen. No enthusiasm for war at all.” That the German people acquiesced speaks not only to the power of Nazi propaganda, which used modern means to tap into deeper strands of European anti-Semitism, but also to the degree to which life was already militarized by September 1939. For all the horror at the slaughter a generation before, mobilizing to fight was something that this state—and this society—knew how to do.

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Pencil Leaners

From Public Books:

Between 1935 and 1939, the Federal Writers’ Project (FWP)—an initiative funded by the Works Progress Administration under the New Deal—provided employment for some 6,000 jobless writers [in the United States]. Today, as stunned authors in Australia and around the world come to terms with the economic consequences of the coronavirus pandemic, that experiment deserves reconsideration. As the ABC recently noted, Australian writers—who earn, on average, less than $13,000 directly from their work each year—will be affected on multiple levels: by the cancellation of festivals, talks, and other paying gigs; by the closure of bookshops; by redundancies and cuts in publishing houses; and by job losses in the related industries (from academia to hospitality) through which they supplement their incomes.

It was the American New Deal more than anything else that legitimated the kind of stimulus packages again being discussed in Australia not just for the arts but across the economy. When Franklin D. Roosevelt took office, the crisis of the Great Depression forced him, despite his own fiscal conservatism, to rush through various rescue measures of a now-familiar nature. The US government guaranteed bank loans to prevent further financial collapses; it encouraged industrial cartels to control prices and production levels; it purchased unsold crops from farmers; and through the Civil Works Administration, the Federal Emergency Relief Administration, and, eventually, the Works Progress Administration, it sought to create jobs.

Recent calls for postpandemic bailouts for artists in general or writers implicitly evoke that legacy.

. . . .

Obviously, the publishing scene today—dominated by vast multinationals, for whom books are merely part of a broader engagement with the “entertainment industry”—differs greatly from the more small-scale milieu of the 1930s. Even so, it’s still worth noting how contemporary thinking about funding literature differs from the Federal Writers’ Project in several important ways.

Most importantly, the job schemes of the 1930s as a whole, including the Writers’ Project, emerged from intense class struggles in a way that today’s plans do not.

In her history of the Works Progress Administration, Nancy E. Rose writes:

Starting in early 1930, unemployed councils, organized by the Communist Party, began to lead hunger marches to demand more relief. On March 6, 1932, which was proclaimed International Unemployment Day, hunger marches took place throughout the country. … In general, cities with strong Unemployed Councils provided better relief.

Agitation by the unemployed coincided with intensified industrial disputation. By 1934, some 1.5 million workers were on strike and FDR went to the polls the following year in the midst of a massive wave of industrial action, in which the newly formed Congress of Industrial Organizations played an important role. Those titanic clashes paved the way for the Second New Deal, under which the most significant reforms (including the WPA) were implemented.

Crucially, writers themselves fought, through explicitly political groups like the Writers’ Union and [before that] the Unemployed Writers’ Association, for the program from which they benefited. In 1934, the UWA’s secretary Robert Whitcomb explained:

The unemployed writers of New York City do not intend to continue under the semi-starvation conditions meted out to them. If the government does not intend to formulate some policy regarding the class of intellectual known as a writer … then the writer must organize and conduct a fight to better his condition.

The following year, with something like a quarter of the entire publishing industry out of work, the two organizations launched a widely publicized picket of the New York Port Authority, in which their members carried signs reading: “Children Need Books. Writers Need Bread. We Demand Projects.”

. . . .

The authors employed by the FWP included many who went on to conventional success, people like Nelson Algren, Saul Bellow, Arna Bontemps, Malcolm Cowley, Ralph Ellison, Zora Neale Hurston, Claude McKay, Kenneth Patchen, Philip Rahv, Kenneth Rexroth, Harold Rosenberg, Studs Terkel, Margaret Walker, Richard Wright, Frank Yerby, and others. As David A. Taylor notes in Soul of a People, his history of the FWP, “four of the first ten winners of the National Book Award in fiction and one in poetry came from this emergency relief project.”

. . . .

Thus, even though the program did actively recruit some literary stars, the author Anzia Yezierska, who’d previously worked in Hollywood, experienced enlisting in the New York FWP as a kind of proletarianization. “There was,” she wrote later, “a hectic camaraderie among us, though we were as ill-assorted as a crowd on a subway express, spinster poetesses, pulp specialists, youngsters … veteran newspapermen, art-for-art’s-sake literati, clerks and typists … people of all ages, all nationalities, all degrees of education, tossed together in a strange fellowship of necessity.”

Not everyone approved of this camaraderie—W. H. Auden dismissed it as “absurd”; one of the project’s own directors complained that “all the misfits and maniacs on relief have been dumped here.”

. . . .

The FWP faced especial hostility and ridicule, with one editorialist complaining that it meant that literary “pencil leaners” would join the “shovel leaners” of the WPA. Again, the authorities stressed the project’s utility, with its remit described in an official announcement as the

employment of writers, editors, historians, research workers, art critics, architects, archaeologists, map draftsmen, geologists, and other professional workers for the preparation of an American Guide and the accumulation of new research material on matters of local, historical, art and scientific interest in the United States; preparation of a complete encyclopedia of government functions and periodical publications in Washington; and the preparation of a limited number of special studies in the arts, history, economics, sociology, etc., by qualified writers on relief.

It duly enlisted its staff to labor on perhaps a thousand volumes, including 50 state and territorial guides, 30 city guides and 20 regional guides. David Taylor describes these texts, composed by a dazzling group of writers, as “a multifaceted look at America by Americans, assembled during one of the greatest crises in the country.”

Many writers resented their tasks (at one point, Yezierska was sent to catalog the trees in Central Park); many worked on their own manuscripts on the side.

. . . .

In books like Gumbo Ya-Ya: A Collection of Louisiana Folk Tales, Bibliography of Chicago Negroes, and Drums and Shadows: Survival Studies among the Georgia Coastal Negroes, FWP employees collected the folklore that Zora Neale Hurston described as “the boiled-down juice of human living.” They interviewed people who had been enslaved, generating an astonishing assemblage of reminiscences. It’s thanks to the FWP that we have a small number of audio clips in which we can hear the actual voices of the survivors of slavery explaining what was done to them.

Alfred Kazin described how, in the late 1930s:

Whole divisions of writers now fell upon the face of America with a devotion that was baffled rather than shrill, and an insistence to know and to love what it knew that seemed unprecedented. Never before did a nation seem so hungry for news of itself.

Link to the rest at Public Books

Women’s Ways of Aging

From Public Books:

As the coronavirus pandemic continues to rage, it intensifies fears of aging and debility that characterize our culture of fitness and drive our aspirations to bodily invincibility. The stigma of aging affects women differentially. While feminists have touted the achievements of older women and insisted that the later years can be the best, we now find ourselves on the other side of an increasingly solid barrier between a “younger” population and an “elderly,” “older,” or “old” one. Those of us who are age 65 or older are the most vulnerable and at risk, both in need of extra protection and most likely to lose out in the triage battle for hospital beds and ventilators. At the same time, our vulnerability to the virus makes it impossible for many of us in this age cohort to participate in the historic street protests we are condemned to witness from afar.

This is therefore a good moment to assess our experiences of aging, and to face our own attitudes more squarely. Rather than battling an ageist and sexist media by insisting that older women can do and be more than ever before by working and playing harder, might we instead focus on care and interdependence, accepting rather than disavowing bodily, emotional, and social vulnerabilities? Rather than celebrating individual victories against aging and mortality, we might embrace a communal ethos of mutuality to which the old have a great deal to contribute.

In proclaiming older women’s powers, the titles of two recent books give a clear sense of their tone and mission: No Stopping Us Now: The Adventures of Older Women in American History, by journalist Gail Collins, and In Our Prime: How Older Women Are Reinventing the Road Ahead, by communications and media scholar Susan J. Douglas. Indignant about the blatant disparagement of older women that characterizes our moment, Collins and Douglas take a celebratory, if not outright triumphalist, tone. Both search for greater social importance and acceptance of older women in earlier historical periods and find examples of their unrelenting energy and productivity today. Both books encourage all women to fight against gendered ageism. They call for forms of cultural recognition that would better represent what their authors see as older women’s mostly positive experiences of aging.

. . . .

In a whirlwind journey through United States history, from the colonial period to today, No Stopping Us Now traces changes in opportunities for and attitudes toward older women. With spirit and energy, Collins leads us through the lives of numerous, mostly well-known older women who wielded considerable influence at different historical moments. Although the book touches upon larger economic arguments about shifting social roles available to mature women—brought about by the need for their products in colonial times, for example, or the opportunities for widows to run their husbands’ farms or businesses—Collins is more interested in how individual women were able to circumvent prejudices and taboos, and thereby thrive in their later years. Collins’s story is one not so much of steady progress as it is of a series of gains and losses, advances and declines—a story that leads to what she sees as today’s open future of increased possibility.

Thanks to Collins, one certainly gets a sense of women’s energy and activity, which is hard to reconcile with popular attitudes of gendered ageism, then and now. She paints vivid portraits, for example, by following the writing, publishing, and public-speaking “adventures” of 19th-century luminaries like Sarah Josepha Hale, who continued writing until she was 89; Elizabeth Cady Stanton, who urged middle-class women to start a whole new life in their 50s; Catharine Beecher, who took courses at Cornell in her 70s; and Jane Addams, who advocated a postponement of old age.

Notably, historians studying American women have analyzed the feminist strategies these and lesser-known women used to advance their work: by seemingly conforming to set gender roles, even as they radically subverted them. Collins, meanwhile, is content to tell these stories chronologically, ending with encouraging contemporary examples that range from Ruth Bader Ginsburg and Nancy Pelosi to Gloria Steinem and Helen Mirren. She does fold these individual white women into a broad historical sweep that also includes exceptional African American figures like Sojourner Truth, Harriet Tubman, Frances Harper, and 98-year-old National Park Service ranger Betty Reid Soskin. Yet she only mentions—without analyzing in any depth—how gendered prejudices are structurally inflected by racial, economic, and other social inequalities.

Link to the rest at Public Books

Anti-Semitism and the Intellectuals

From The Wall Street Journal:

George Eliot was at the peak of her renown in 1874 when John Blackwood, her publisher, learned that she was at work on “Daniel Deronda,” a new novel. As a literary man, he was in thrall to her genius. As a businessman with an instinct for the market, he valued her passionately dedicated readership. But an early look at portions of her manuscript astonished and appalled him: Too much of it was steeped in sympathetic evocations of Jews, Judaism and what was beginning to be known as Zionism.

All this off-putting alien erudition struck him as certain to be more than merely unpopular. It was personally tasteless, it went against the grain of English sensibility, it was an offense to the reigning political temperament. It was, in our notorious idiom, politically incorrect. Blackwood was unquestionably a member of England’s gentlemanly intellectual elite. In recoiling from Eliot’s theme, he showed himself to be that historically commonplace figure: an intellectual anti-Semite.

Anti-Semitism is generally thought of as brutish, the mentality of mobs, the work of the ignorant, the poorly schooled, the gutter roughnecks, the torch carriers. But these are only the servants, not the savants, of anti-Semitism. Mobs execute, intellectuals promulgate. Thugs have furies, intellectuals have causes.

The Inquisition was the brainchild not of illiterates, but of the most lettered and lofty prelates. Goebbels had a degree in philology. Hitler fancied himself a painter and doubtless knew something of Dürer and da Vinci. Pogroms aroused the murderous rampage of peasants, but they were instigated by the cream of Russian officialdom. The hounding and ultimate expulsion of Jewish students from German universities was abetted by the violence of their Aryan classmates, but it was the rectors who decreed that only full-blooded Germans could occupy the front seats. Martin Heidegger, the celebrated philosopher of being and non-being, was quick to join the Nazi Party, and as himself a rector promptly oversaw the summary ejection of Jewish colleagues.

Stupid mobs are spurred by clever goaders: The book burners were inspired by the temperamentally bookish—who else could know which books to burn? Even invidious folk myths have intellectual roots, as when early biblical linguists mistranslated as horns the rays of light emanating from Moses’ brow.

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)

The Woman Who Cracked the Anxiety Code

From The Wall Street Journal:

As we have been reminded of late, there is an astonishing complexity—and at times fragility—to our mental and physical health, and we owe a debt to the legions of scientists whose insights and discoveries, over the years, have improved our chances of well-being. Alas, too many of them are unknown to us. One name that was once broadly known has fallen into lamentable obscurity—that of Claire Weekes, an Australian doctor who did ground-breaking work on one of the great scourges of humanity. With Judith Hoare’s “The Woman Who Cracked the Anxiety Code,” we have a chance to learn about Weekes’s varied life and, as important, become reacquainted with her work.

Decades before her death in 1990 at the age of 87, Weekes had been a global sensation, reaching millions of people through her books—“transfusions of hope,” she called them. One of the original self-helpers, she believed that sufferers could master themselves without the aid of professionals, and the strategies she gave them were firmly grounded in the biology of anxiety.

Weekes didn’t plan on medicine as a career, Ms. Hoare tells us. In 1928, at the age of 25, she began graduate studies in zoology in London on a prestigious fellowship. When her beloved mentor died of a stroke, she developed severe heart palpitations. Doctors misinterpreted her condition as tuberculosis and sent her to a sanatorium. There she fell into a general state of fear. Six months later, doctors retracted their diagnosis, and Weekes, now nearly incapacitated by stress, resumed her research.

The turning point came when she confided in a friend, a World War I veteran, that she suffered from a frenzied heartbeat. “Far from being surprised or concerned,” Ms. Hoare writes, “he shrugged,” saying: “Those are only the symptoms of nerves.” He told Weekes, in Ms. Hoare’s paraphrase, that “her heart continued to race because she was frightened of it. It was programmed by her fear. This made immediate sense.”

The explanation was deceptively profound, going straight to the core of the mind-body connection. 

. . . .

Weekes had hypothesized a “first fear and second fear” process. The first is a reflex—and the problem in many anxiety disorders is that the reflex is set off for no obvious reason. The second is the conscious feeling of fear. Relief of suffering, for her, came when she learned to quell the “fear of the first fear,” thereby short-circuiting the cycle that was set in motion by the original, unbidden rush of panic: the pounding heart. According to Ms. Hoare, Weekes “immediately grasped the point that she needed to stop fighting the fear.” She had cracked the code.

But this insight would not reach the public for another 30 years. After becoming the first woman to be awarded the degree of Doctor of Science at Sydney University, Weekes conducted research in endocrinology and neurology. Eventually she sought a more pragmatic occupation and enrolled in medical school at age 38. During her work as a general practitioner, she felt special sympathy for her anxious patients and began to counsel them to do as she herself had done: “float past” panic, give bodily sensations and fearful thoughts no power. One of her patients asked for written advice. Her pages to him became “Self Help for Your Nerves,” published in 1962, when Weekes was 59; the book rocketed up the bestseller lists in the U.S. and the U.K. As Ms. Hoare shows, Weekes’s contributions to human welfare live on in mindfulness training and forms of behavioral therapy, sometimes combined with medication. Contemporary neuroscience has vindicated her theory.

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)

How to Fill a Yawning Gap

From The Wall Street Journal:

Is boredom really all that interesting? Thanks perhaps to the subject’s dreary durability, it has generated a considerable literature over the years. Alberto Moravia wrote an engaging novel called “Boredom,” and psychologists, philosophers and classicists have also had their say.

“Out of My Skull,” the latest work on this strangely alluring topic, has an exciting title, but nothing about the book is wild or crazy. James Danckert and John D. Eastwood, a pair of psychologists in Canada, know an awful lot about the subject (Mr. Eastwood even runs a Boredom Lab at York University), and they examine it methodically. “In our view, being bored is quite fascinating, and maybe, just maybe, it might even be helpful,” they write, echoing predecessors who find boredom salutary. “Boredom is a call to action, a signal to become more engaged. It is a push toward more meaningful and satisfying actions. It forces you to ask a consequential question: What should I do?”

A taxonomy of boredom, if it’s to avoid exemplifying what it describes, ought to be simple. So let’s just say that boredom is of two kinds. The first is better known to us as ennui, and the democratization of this once-rarefied feeling is one of civilization’s triumphs. At first the preserve of aristocrats and later taken up by intellectuals, nowadays it is available to affluent citizens everywhere. Our endless search for palliatives in the face of this affliction underpins the consumer economy.

The other kind of boredom is the version that most of us get paid for. Commentators on boredom usually genuflect briefly toward factory workers, nannies and other hard-working members of the hoi polloi whose tasks can be mind-numbing. But such people live with a version of boredom that intellectuals find, well, boring. So the focus is usually on the self-important existential variety.

Link to the rest at The Wall Street Journal (sorry if you run into a paywall)

Bread Winner

From The Wall Street Journal:

For Jack Lawson, “ten hours a day in the dark prison below really meant freedom for me.” At age 12, this Northern England boy began full-time work down the local mine. His life underwent a transformation; there would be “no more drudgery at home.” Jack’s wages lifted him head and shoulders above his younger siblings and separated him in fundamental ways from the world of women. He received better food, clothing and considerably more social standing and respect within the family. He had become a breadwinner.

Rooted in firsthand accounts of life in the Victorian era, Emma Griffin’s “Bread Winner” is a compelling re-evaluation of the Victorian economy. Ms. Griffin, a professor at the University of East Anglia, investigates the personal relationships and family dynamics of around 700 working-class households from the 19th century, charting the challenges people faced and the choices they made. Their lives are revealed as unique personal voyages caught within broader currents.

“I didn’t mind going out to work,” wrote a woman named Bessie Wallis. “It was just that girls were so very inferior to boys. They were the breadwinners and they came first. They could always get work in one of the mines, starting off as a pony boy then working themselves up to rope-runners and trammers for the actual coal-hewers. Girls were nobodies. They could only go into domestic service.”

Putting the domestic back into the economy, Ms. Griffin addresses a longstanding imbalance in our understanding of Victorian life. By investigating how money and resources moved around the working-class family, she makes huge strides toward answering the disconcerting question of why an increasingly affluent country continued to fail to feed its children. There was, her account makes clear, a disappointingly long lag between the development of an industrialized lifestyle in Britain and the spread of its benefits throughout the population.

. . . .

In preindustrial times, both men and women had faced a fairly set course in life on the edge of subsistence. During the Victorian era, their fortunes rapidly diverged. Many of the best-paid roles within the newly industrialized economy were designated as exclusively male. Those designated as female were very low paid (well below subsistence level). Thus developed the “breadwinner wage” model—the idea being that a man needed to support a family upon his earnings but a woman needed only pin money, her basic needs having been provided by father or husband.

Ms. Griffin’s groundbreaking research tracks the effects of this philosophy through personal autobiographical accounts. Working-class men gained power and personal freedom from the new opportunities and broader horizons. Working-class women, by contrast, faced the same old narrow set of options. This new pattern of gender divergence was most pronounced in urban situations, where the higher male wages were largely to be had, and was attended by a significant rise in family breakdown.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

In the dim caverns of his memory, PG remembers reading that the survival and eventual prosperity of married settlers who journeyed to the American West generally required the hard work of both spouses, with wives actively and physically participating in income-producing farming and ranching to a far greater extent than the typical breadwinner economic model would suggest.

In the American West economic model, children were also expected to provide productive labor in the process of obtaining food and income from the farm or ranch.

When PG was a wee lad, he remembers helping to herd livestock and run to get tools from the shed as required by his parents.

His mother was definitely involved in the agricultural and livestock activities on a regular basis. She and PG were both chased by an angry cow as they were trying to give some medication to her calf. PG was 7 years old at the time and discovered a running speed he didn’t realize he possessed as he headed for the fence.

When PG was 11 years old, he learned to operate a Caterpillar D6 (sometimes known as the most important piece of military equipment used in the Pacific theater of World War II) and thought he was the coolest kid around driving it to help his father on the farm.

Unfortunately, PG doesn’t have any photos of himself operating the D6, but here are a few to give anyone who is still interested a sense of the size of the machine. PG remembers that it required about a three-step climb from the ground to the seat.

[Photo: a military Caterpillar D6]
[Photo: a civilian D6]
[Photo: a Caterpillar D4 (a little smaller than a D6) practicing landings in preparation for service in the Pacific Theater]
[Photo: another D4 in the Army’s Fort Leonard Wood Combat Engineer’s Museum]

Rivers of Power

From The Wall Street Journal:

It’s hard to imagine a world without rivers. The continents would be higher, colder and more rugged, and we humans might still be hugging the coastlines. Our iconic cities, situated along rivers, would not have been built. Global trade and travel might never have developed. Even so, rivers’ crucial role in shaping civilization is “grandly underappreciated,” according to Laurence C. Smith, professor of earth, environmental and planetary sciences at Brown University. In his important new book “Rivers of Power,” he surveys mankind’s long, shifting relationship with our rivers, ranging from prehistory to the present and embracing nearly every region of the world.

Rain started falling on Earth at least 4 billion years ago. Merging into streams and then rivers, the water launched its eternal assault on the continents, grinding them down and carrying them grain by grain toward the sea. The rivers, over their tortuous course, occasionally slowed and dropped some of their silt, forming tangled deltas and wide valley plains. Perhaps as recently as 12,000 years ago, nomadic peoples in the Mideast and Asia settled these valleys and began to plant crops such as wheat, barley and rice.

The valley soil was fertile, and early farmers learned to divert river water for irrigation, increasing their harvests and producing surpluses of grain. Starting about 4,000 B.C., they built the world’s first great cities, in present-day Iraq, Egypt, India, Pakistan and China. As these societies grew wealthier and more populous, they also became more complex, supporting a ruling class, traders, philosophers and engineers. In fact, these civilizations (the Egyptian, Sumerian, Harappan and Chinese) were so utterly dependent on their rivers (the Nile, Tigris and Euphrates, Indus and Ghaggar-Hakra, and Yangtze and Yellow) that they have been dubbed “hydraulic societies.”

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

Farm Girl, A Wisconsin Memoir

From The New York Times:

My grandfather died many years ago, but I still remember his stories of growing up in the Texas Hill Country in the early 20th century, walking two miles each way to a one-room schoolhouse and doing chores that were, to me, unfathomable: making laundry soap out of lard and lye, plucking chickens, hauling water from the well. I thought of him often as I read “Farm Girl,” Carlson’s spare, charming memoir of her Depression-era childhood.

Carlson grew up on her parents’ farm outside Plum City in western Wisconsin, where she was born in 1926. (Family lore has it that the doctor who delivered her exclaimed: “Well, this is a nice, big one! Nine or 10 pounds.”) She and her three siblings roamed through “80 acres of beautiful, rich, fertile Wisconsin cropland, pasture and woodlot” while their parents shielded them from the worst economic woes of the period. Her memories, mostly rosy, are punctuated by descriptions of the era’s terrible droughts. “We could hear the cattle bawling as they searched the dry pastures for a bit of grass,” she recalls, and “we saw the leaves on the stunted corn plants in the sun-baked fields curl to conserve moisture.”

“Farm Girl” isn’t chronological. It’s split into two sections — one on her family, the other on the seasonal rhythms that define life on a farm — and divided into thematic chapters, some as short as two pages: “The Party Line Telephone,” “Butternuts and Maple Sugar Candy,” “Sunday Dinner,” “Long Underwear” (“nothing, nothing separated the farm kids from the town kids like the dreaded long underwear … the scourge of Wisconsin winters”).

Link to the rest at The New York Times

For those who have never lived in a place with extremely cold winters, long underwear is an important winterwear component. If the electricity goes out or the school bus becomes stranded during a cold snap, it can become very important indeed.

That said, long underwear is seldom regarded as a fashion-forward piece of clothing, except perhaps among the old guys sitting around a hot wood stove at the local grain elevator, spinning stories about the winters of their childhoods, when winters were really something and, when you woke up, crawled out from under five or six blankets, and your bare feet hit the linoleum floor, you got dressed in a big hurry, then went out to help your father thaw out the water pump because Mom couldn’t make oatmeal without water and refused to use melted snow because who knew what might have been done on that exact spot by some creature or another.

Girl Decoded

From The Wall Street Journal:

What if the disinterested machines that surround us and encroach on every aspect of our lives were sensitive to our emotional states? Imagine fridges reprimanding us for our furtive late-night snacks. Or cars decelerating when we are anxious, or preventing us from driving when we are distracted. Consider laptops offering gentle words of consolation or praise, or washing machines groaning with indignation and wristwatches chastising us for our misdemeanors and lack of attention.

In “Girl Decoded,” Rana el Kaliouby’s compelling vision of an emotionally imbued future for artificial intelligence, indifferent machines are elevated into magnificent humanlike creations. While lacking—for now—the authentic emotions of their human counterparts, emotionally enhanced automatons might nevertheless do a perfectly good job of imitating them.

Such devices, in addition to invigorating human-machine relations, have the potential to convey emotional awareness to people—such as those with autism—who struggle to navigate routine emotions. They may also help track emotional states, predict depressive crises and detect the loss of emotional expression that often accompanies diseases like Parkinson’s. Marketing companies could engage them to evaluate reactions to new products. Had Shakespeare’s Othello possessed such a device, he might have been better equipped to understand Desdemona’s intentions. But how might such an imagined world of machine-facilitated emotional enlightenment be brought to fruition?

Ms. el Kaliouby’s brilliance is demonstrated in the simplicity of her solution. While earning her doctorate at Cambridge University, she learned the importance of nonverbal information as she communicated with her geographically distant family back home. She suspected the intricate facial muscles that enable us to grimace, smile, laugh and frown might provide a conduit into the lexicon of human emotions. Once a range of expressions was defined, these could be incorporated into the anatomical structures of emotionally enabled automatons.

Earlier archivists of the anatomy of emotions, such as Charles Bell in “Essays on the Anatomy of Expression in Painting” (1806) and Charles Darwin in “The Expression of the Emotions in Man and Animals” (1872), established the foundations of the science of emotions. Darwin’s unique use of photographic representations was itself rooted in artistic exposition, perhaps influenced by the drawings of the Renaissance painter Giovanni Agostino da Lodi, whose early-16th-century “A Man With Eyes Shut Tight” documented a dystonic facial expression in a remarkable level of detail.

[Image: A Man With Eyes Shut Tight, by Giovanni Agostino da Lodi]

Inspired by Rosalind Picard’s seminal book “Affective Computing” (1997)—which emphasized the importance of emotions to intelligence, rational decision-making, perception and learning, and reimagined our relationship with machines—Ms. el Kaliouby set out to construct a “mind-reading machine” or “emotion decoder” based on the deciphering of facial features. Given the potential universality of emotions, such a device would need to be relevant to all ethnic groups and cultures.

[Note: PG couldn’t find the book. The link to Affective Computing goes to a 1995 paper published as M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 321.]

A fortuitous encounter with Simon Baron-Cohen, a leading autism expert, led Ms. el Kaliouby to his unique archive of videos that captured people displaying a wide range of emotions. With the help of sophisticated machine-learning algorithms, and later innovations while she was a research scientist at the MIT Media Lab, Ms. el Kaliouby’s machines eventually learned to recognize a rudimentary “emotional palette” encompassing six different human emotional categories.
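
The book stays out of the technical weeds, but a six-way classifier of the general sort described can be sketched in a few lines of Python. To be clear, this is a generic illustration rather than Ms. el Kaliouby’s actual pipeline: it assumes the hard part (detecting a face and extracting numeric features such as facial action-unit intensities) has already been done, borrows the classic “Ekman six” as labels, and substitutes synthetic data so the example runs end to end.

    # A generic sketch of six-way facial-emotion classification -- NOT the
    # MIT/Affectiva system described in the book. Feature extraction
    # (face detection, action-unit measurement) is assumed to have
    # happened already; the feature vectors here are synthetic.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # The classic "Ekman six" -- an assumption; the book says only that
    # six emotional categories were recognized.
    EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

    rng = np.random.default_rng(0)
    n_per_class, n_features = 200, 17  # e.g., 17 action-unit intensities (assumed)

    # Synthetic stand-in for labeled video frames: each emotion gets its
    # own mean activation pattern plus noise.
    X = np.vstack([rng.normal(loc=i, scale=2.0, size=(n_per_class, n_features))
                   for i in range(len(EMOTIONS))])
    y = np.repeat(np.arange(len(EMOTIONS)), n_per_class)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    # Scale the features, then fit a multiclass SVM -- one plausible
    # choice among many for a small labeled corpus.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_train, y_train)

    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
    print("one frame's prediction:", EMOTIONS[clf.predict(X_test[:1])[0]])

What the sketch leaves out is precisely where the real work lay: locating and tracking faces in video, and training on data diverse enough to keep the classifier relevant across ethnic groups and cultures, as noted above.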

Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)

The reference to dystonia in the OP, in connection with the image of the man with his eyes shut, sent PG down a dystonia rabbit hole.

From The National Institute of Neurological Disorders and Stroke:

What is dystonia?

Dystonia is a disorder characterized by involuntary muscle contractions that cause slow repetitive movements or abnormal postures. The movements may be painful, and some individuals with dystonia may have a tremor or other neurologic features. There are several different forms of dystonia that may affect only one muscle, groups of muscles, or muscles throughout the body. Some forms of dystonia are genetic but the cause for the majority of cases is not known.

What are the symptoms?

Dystonia can affect many different parts of the body, and the symptoms are different depending upon the form of dystonia. Early symptoms may include a foot cramp or a tendency for one foot to turn or drag—either sporadically or after running or walking some distance—or a worsening in handwriting after writing several lines.  In other instances, the neck may turn or pull involuntarily, especially when the person is tired or under stress. Sometimes both eyes might blink rapidly and uncontrollably; other times, spasms will cause the eyes to close.  Symptoms may also include tremor or difficulties speaking.  In some cases, dystonia can affect only one specific action, while allowing others to occur unimpeded.  For example, a musician may have dystonia when using her hand to play an instrument, but not when using the same hand to type.  The initial symptoms can be very mild and may be noticeable only after prolonged exertion, stress, or fatigue.  Over a period of time, the symptoms may become more noticeable or widespread; sometimes, however, there is little or no progression. Dystonia typically is not associated with problems thinking or understanding, but depression and anxiety may be present.

Following is another example of dystonic facial expression depicted in art. According to Google Translate, the meaning of hargneux includes surly, aggressive, fractious, angry, snappish, crabbed, waspish or shrewish. A snarling dog is sometimes called a hargneux.

Chien hargneux a toujours les oreilles déchirées translates to “A snarling dog always has torn ears.”

[Image: Le hargneux (Soult), by Honoré Daumier]

According to PG’s quick and dirty research, Soult refers to French Marshal General Jean-de-Dieu Soult (1769-1851), described as Napoleon’s most able Marshal. Soult appears to carry a secondary meaning of a stern or aggressive appearance. PG is happy to have any of his errors corrected in the comments.