Harvard Probe Finds Honesty Researcher Engaged in Scientific Misconduct

From The Wall Street Journal:

A Harvard University probe into prominent researcher Francesca Gino found that her work contained manipulated data and recommended that she be fired, according to a voluminous court filing that offers a rare behind-the-scenes look at research misconduct investigations.

It is a key document at the center of a continuing legal fight involving Gino, a behavioral scientist who in August sued the university and a trio of data bloggers for $25 million.

The case has captivated researchers and the public alike as Gino, known for her research into the reasons people lie and cheat, has defended herself against allegations that her work contains falsified data.

The investigative report had remained secret until this week, when the judge in the case granted Harvard’s request to file the document, with some personal details redacted, as an exhibit.

The investigative committee that produced the nearly 1,300-page document included three Harvard Business School professors tapped by HBS dean Srikant Datar to examine accusations about Gino’s work.

They concluded after a monthslong probe conducted in 2022 and 2023 that Gino “engaged in multiple instances of research misconduct” in the four papers they examined. They recommended that the university audit Gino’s other experimental work, request retractions of three of the papers (the fourth had already been retracted at the time they reviewed it), and place Gino on unpaid leave while taking steps to terminate her employment.

“The Investigation Committee believes that the severity of the research misconduct that Professor Gino has committed calls for appropriately severe institutional action,” the report states.

HBS declined to comment.

The investigative report offers a rare look at the ins and outs of a research misconduct investigation, a process whose documents and conclusions are often kept secret.

Dorothy Bishop, a psychologist at the University of Oxford whose work has drawn attention to research problems in psychology, praised the disclosure. “Along with many other scientists, I have been concerned that institutions are generally very weak at handling investigations of misconduct and they tend to brush things under the carpet,” Bishop said. “It is refreshing to see such full and open reporting in this case.”

Harvard started looking into Gino’s work in October 2021 after a group of behavioral scientists who write about statistical methods on their blog Data Colada complained to the university. They had analyzed four papers co-written by Gino and said data in them appeared falsified. 

. . . .

An initial inquiry conducted by two HBS faculty included an examination of the data sets from Gino’s computers and records, and her written responses to the allegations. The faculty members concluded that a full investigation was warranted, and Datar agreed.

In the course of the full investigation, the two faculty who ran the initial inquiry plus a third HBS faculty member interviewed Gino and witnesses who worked with her or co-wrote the papers. They gathered documents including data files, correspondence and various drafts of the submitted manuscripts. And they commissioned an outside firm to conduct a forensic analysis of the data files.

The committee concluded that in the various studies, Gino edited observations in ways that made the results fit hypotheses.

When asked by the committee about work culture at the lab, several witnesses said they didn’t feel pressured to obtain results. “I never had any indication that she was pressuring people to get results. And she never pressured me to get results,” one witness said.

According to the documents, Gino suggested that most of the problems highlighted in her work could have been the result of honest error, made by herself or by the research assistants who frequently worked on the data. The investigative committee rejected that explanation because Gino didn’t provide evidence that explained the “major anomalies and discrepancies.”

Gino also argued that other people might have tampered with her data, possibly with “malicious intent,” but the investigative committee also rejected that possibility. “Although we acknowledge that the theory of a malicious actor might be remotely possible, we do not find it plausible,” the group wrote.

Link to the rest at The Wall Street Journal

American Flannel

From The Wall Street Journal:

Many years ago, during a reporting trip to Copenhagen, I met an economist for the Danish labor federation. I had just visited a company that was transferring production of hearing aids abroad, and I asked him about this move. To my surprise, the economist was entirely in favor. “We want to be a wealthy country,” he said, “and we can’t be a wealthy country with low-wage jobs.”

That conversation came back to me as I read a pair of books about people committed to reviving U.S. garment manufacturing. Both Steven Kurutz’s “American Flannel” and Rachel Slade’s “Making It in America” follow entrepreneurs who have dared to produce American-made apparel at a moment when the domestic supply chains for such products barely exist. Both books are interesting to read. The human stories are moving, the founders’ determination admirable. But neither book finally provides a convincing answer to a question that lurks in the background: Should we even want apparel manufacturing to rebound in the U.S.?

Mr. Kurutz introduces Bayard Winthrop, a descendant of the first governor of the Massachusetts Bay Colony. He grew up in Greenwich, Conn., and spent two years on Wall Street before deciding to seek more meaningful work. After stints selling snowshoe bindings and marketing messenger bags, he created American Giant in 2011 to manufacture clothing in the U.S. Labor costs precluded products that would have required extensive sewing. Instead, Mr. Winthrop settled on cotton hoodies with heavier fabric and better construction than most others. A favorable news article brought a flood of orders, enabling American Giant to buy sewing plants and inspiring Mr. Winthrop to attempt a far more complicated product: the flannel shirts to which the title of Mr. Kurutz’s book alludes.

For guidance, Mr. Winthrop turned to James McKinnon, the head of Cotswold Industries, a family-owned textile company. “Flannel is an art and an art form,” Mr. McKinnon explains. He introduced Mr. Winthrop to small companies that had survived the textile industry’s contraction and that might have the skills and equipment to dye yarn and finish cotton fabric the way Mr. Winthrop wanted it. Mr. Kurutz’s on-the-scene reporting provides a ground-level view of what it means to reassemble a domestic supply chain for flannel, colorfully illustrating why “reshoring” is so complicated a task.

Mr. Kurutz’s other main character is Gina Locklear, who grew up around her family’s sock mill in Fort Payne in Alabama’s DeKalb County. At the start of the 1990s, the author tells us, “there was hardly a man or woman in all of DeKalb County who didn’t have a connection to the hosiery business.” But as imports gained ground, the self-proclaimed “sock capital of the world” lost its kick. Many mills closed. In 2008 Ms. Locklear used spare capacity in her parents’ factory to start Zkano, a company that knits high-fashion socks from organic cotton spun and dyed domestically. The socks sold well. The challenge, she found, was finding workers. As Mr. Kurutz writes: “You could create new business. You couldn’t invent someone with thirty years of experience in the hosiery industry.”

Rachel Slade presents a similar story in “Making It in America,” featuring Ben Waxman, a former union official and political consultant who decided in 2015 to start a business with his then-girlfriend (and now wife), Whitney Reynolds. They mortgaged their home in Maine to fund American Roots. “Together, they would bring apparel manufacturing back to America,” Ms. Slade writes. “They would be uncompromising in their commitment to domestic sourcing and the welfare of their employees.” They began with a hoodie designed to fit large men doing physical jobs in harsh weather, a product whose manufacture required 54 operations on six different kinds of sewing machines. At $80, it was too expensive for the retail market, but Mr. Waxman sold it to labor unions that were delighted to offer members a garment made by union workers in the U.S.

I have no criticism of the individuals Mr. Kurutz and Ms. Slade profile. If these entrepreneurs can make a profit and create jobs by producing clothing in the U.S., congratulations are in order. They are well-intentioned people committed to doing right, and it’s hard not to admire them. But it’s disingenuous to pretend that a handful of mom-and-pop companies sewing hoodies and socks point the way to the revival of manufacturing in the U.S. Apparel is fundamentally different from most other manufacturing sectors; its factories still rely heavily on sewing machines, usually operated by women stitching hem after hem or attaching collar after collar. The U.S. has no great productivity advantage when it comes to making hoodies.

The authors present gauzy portraits of the industrial past. Fort Payne, Mr. Kurutz writes, was “a Silicon Valley for socks” where the annual Hosiery Week “bonded the community together” before the U.S. government lowered barriers to sock imports. That’s not exactly true: Silicon Valley is notorious for ample paychecks, Fort Payne’s sock mills less so. In 1993, the year before apparel and textile makers received what Mr. Kurutz calls a “death blow” from the North American Free Trade Agreement, the 5,478 hosiery workers in DeKalb County were paid, on average, 11% less than the average private-sector employee in the same county. In North Carolina, where American Giant does its sewing, the gap was even wider: The average private-sector worker was paid $22,000 in 1993, the average apparel worker $14,649. People took jobs in garment factories not for bonding but because that was all they could find.

The American garment industry was in decline for decades before Nafta. That it survived as long as it did was due to strict import quotas and high tariffs on both textiles and apparel, fought for by an unholy alliance between garment-workers’ unions and virulently antiunion textile manufacturers. The result was that U.S. consumers were forced to pay high prices for linens and clothes to keep those industries alive.

Link to the rest at The Wall Street Journal

A very long time ago, PG practiced law in a small town located in a part of the United States not known for its wealth.

One of his clients was a small garment manufacturing business owned and operated by a husband and wife. As might be expected, the business employed mostly women and paid minimum wage or something close to it.

The jobs offered provided important income for families, including single mothers, but nobody expected to get rich working there. Had the garment manufacturer closed its doors, the result would likely have been an increased number of families relying on welfare payments for their survival.

In some circles, low-wage jobs are regarded with disdain, but they can play an important economic role in small communities as a first job or an emergency job in the event of a change in family structure or injury to a working member of a family.

In PG’s observations during this time period, while government welfare was sometimes necessary for families with no ability to support themselves, having a job-holder in the family produced a positive overall improvement in the family’s long-term stability and happiness.

Classical Culture and White Nationalism

From BookBrowse:

The hands of history have reshaped the Greek past for centuries, sculpting it into an idealized version credited with birthing a myriad of ideas and concepts, notably identity. Certain contemporary political currents claim that Hellenic identity was what we would today consider white, although Greece was a multiethnic society that did not have our modern concepts of race.

Groups promoting racist ideology have pushed the interpretation that the apparent lack of color and ornamentation in Greco-Roman classical sculpture, which is in fact due to the erosion of pigments over time, is indicative of a more advanced and sophisticated culture resulting from the supposed superiority of white Europeans. As Lauren Markham writes in A Map of Future Ruins, “classical iconography continues to be a touchstone of white supremacy today, building off the myth that ancient Greece is the taproot of so-called Western culture.”

. . . .

The former president has also drawn on classical imagery. In 2020, a draft of an executive order titled “Make Federal Buildings Beautiful Again” was leaked. It sought to establish neoclassical architecture as the preferred style for federal buildings. The draft argues that in designing the buildings of Washington, D.C., the founding fathers embraced the classical models of “democratic Athens” and “republican Rome” because they symbolized “self-governing ideals.”

. . . .

Classicists are pushing back against this misuse of antiquity’s ideas and symbols. According to Curtis Dozier, founder of Pharos and assistant professor of Greek and Roman studies at Vassar College, these views “all depend on the widespread assumption that Greco-Roman antiquity is admirable, foundational and refined. Thus any presentation that promotes uncritical admiration for the ancient world, by presenting it as a source of ‘timeless’ models and wisdom, has the potential to be complicit in white supremacy.”

As classics and black world studies professor Denise McCoskey explains, the “idea that the Greeks and Romans identified as ‘white’ with other people in modern-day Europe is just a complete misreading of ancient evidence.”

Link to the rest at BookBrowse

PG reminds one and all that he doesn’t necessarily agree with items he posts on TPV.

He also asks that any comments to this or other posts avoid any disparagement of anyone on the basis of race or any other characteristic that is inherent to an individual’s personhood.

Trust Your Intuition: The Writer’s Sixth Sense

From Writers in the Storm:

Writing in public places, such as coffee shops and libraries, offers a unique blend of inspiration and potential challenges. As both a martial artist and an author, I find that combining creativity with personal safety comes naturally. However, for others, safety may not be a major consideration. Drawing from my experience as a black belt and self-defense seminar instructor, I offer these tips for writers to balance safety and creativity.

The Art of Location Selection: Choose Well-Lit and Crowded Spots

Just as a martial artist assesses their environment for safety, writers should be discerning about their chosen writing spaces. Select well-lit and populated areas where the flow of people ensures a reasonable level of security. Avoid secluded corners or dimly lit spots that might pose safety risks. Your writing sanctuary should inspire creativity without compromising your well-being.

Strategic Positioning: Sit Facing Entrances for Enhanced Awareness

In martial arts, practitioners learn the significance of positioning themselves for optimal defense. Similarly, when writing in public places, sit facing entrances and exits. This strategic placement not only allows for a clear view of your surroundings, but also enhances situational awareness. Observing who enters and exits establishes a mental map of the immediate environment, helping you to focus on your writing without neglecting your safety.

Engage and Disengage: Knowing When to Look Up

Immersing yourself in your writing is crucial, but so is periodically disengaging to assess your surroundings. Establish a rhythm—write for a set period, then take a moment to look up and scan your environment. It’s a dance between creativity and vigilance, ensuring you remain connected to both your work and the world around you. Designate breaks in your writing session to focus solely on your surroundings. Use these moments to reorient yourself and ensure your safety protocols are intact.

Make a habit of being mindful of those around you and any unusual behavior. Trust your instincts—if something feels off, it probably is. Being mindful of your surroundings helps protect your creative flow from unexpected disruptions.

Guarding the Arsenal: Keep Valuables Secure

Martial artists safeguard their weapons, and for writers, the laptop or tablet is a formidable tool. Be mindful of your belongings—keep your laptop, bags, and personal items within reach. Avoid leaving them unattended, as a moment’s distraction can create an opening for opportunistic individuals. By maintaining control over your possessions, you safeguard both your creative work and personal safety.

Digital Fortifications: Use Lock Screen Features and VPNs

Just as martial artists fortify their defenses, writers should fortify their digital presence. Enable lock screen features on your devices to protect your work and personal information. Use strong passwords or biometric authentication for an added layer of security. When working on public Wi-Fi, avoid accessing sensitive financial or personal information. Consider using a virtual private network (VPN) for added security, ensuring that your digital activities remain shielded from potential threats.
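
To make the “shielded” part concrete, here is a minimal Python sketch (an illustration of the idea, not something from the OP; “example.com” is a placeholder) that confirms a site negotiates an encrypted, certificate-verified connection before you type anything sensitive on public Wi-Fi:

    import socket
    import ssl

    def check_tls(host: str, port: int = 443) -> None:
        # create_default_context() verifies the certificate chain and the
        # hostname, so a completed handshake means the connection is both
        # authenticated and encrypted.
        context = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                cipher_name = tls.cipher()[0]  # negotiated cipher suite
                print(f"{host}: {tls.version()} with {cipher_name}")

    # Placeholder host; substitute the site you are about to use.
    check_tls("example.com")

On a hostile hotspot presenting a forged certificate, the handshake raises ssl.SSLCertVerificationError instead of printing anything, which is exactly the warning you want.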

Strategic Alliances: The Buddy System for Writers

In martial arts, strength often lies in alliances. Likewise, writers can benefit from the buddy system. If possible, work with a writing partner or a friend when venturing into public spaces. Having someone by your side not only deters potential threats but also provides a safety net, allowing you to immerse yourself in your writing without undue worry.

. . . .

Trust Your Intuition: The Writer’s Sixth Sense

Just as martial artists trust their instincts, writers must trust their intuition. If something doesn’t feel right, don’t dismiss it. Whether it’s a subtle discomfort or a gut feeling, your intuition is a valuable tool for detecting potential threats. Trust it and take appropriate action to protect your safety.

Link to the rest at Writers in the Storm

The corny over-extension of martial arts wisdom would normally have caused PG to pass the OP by.

However, he was concerned about authors, perhaps mostly female authors, having problems finding safe public spaces in which to write.

In ancient college times, PG would often walk to the library to write if things were a little chaotic around the apartment he shared, checking only how much longer the library would remain open. He relied on library staff to kick him out at closing time. He would then walk home, without a second thought, seldom seeing anyone else on his way.

Thinking back, he almost never saw any female students during his close-the-library trips. To be fair, the campus was regarded as quite safe, even after dark.

But in those ancient times, a gentleman or would-be gentleman or guy who didn’t want to be excoriated by persons of both genders (there were only two back then), would always walk his date to the door of her dormitory, sorority, apartment, etc., and wait until the door locked behind her before leaving, regardless of what substances he had taken into his body during the preceding hours.

Why it’s hard to write a good book about the tech world

From The Economist:

When people ask Michael Moritz, a former journalist and prominent tech investor, what book they should read to understand Silicon Valley, he always recommends two. “They are not about Silicon Valley, but they have everything to do with Silicon Valley,” he says.

One is “The Studio” (1969) by John Gregory Dunne, an American writer who spent a year inside 20th Century Fox watching films get made and executives try to balance creativity with profit-seeking. The other, “Swimming Across” (2001) by Andy Grove, the former boss of Intel, a chipmaker, is a memoir about surviving the Holocaust. It shows how adversity can engender grit, which every entrepreneur needs.

That Sir Michael does not suggest a book squarely about the tech business says a lot. Silicon Valley has produced some of the world’s most gargantuan companies, but it has not inspired many written accounts with a long shelf life. Wall Street, on the other hand, claims a small canon that has stood the test of time, from chronicles of meltdowns (“Too Big to Fail”) to corporate greed (“Barbarians at the Gate”) to a fictionalised account (“The Bonfire of the Vanities”) that popularised the term “masters of the universe”.

Why not the masters of Silicon Valley? Part of the problem is access, as is often the case when writing about the powerful. Tech executives may let their guards down at Burning Man, but they have been painstakingly trained by public-relations staff to not get burned by writers. This has been the case for a while. When John Battelle was writing “The Search” (2005), about online quests for information, he spent over a year asking to interview Google’s co-founder, Larry Page. The firm tried to impose conditions, such as the right to read the manuscript in advance and add a footnote and possible rebuttal to every mention of Google. He declined. Google ended up granting the interview anyway.

Journalists who manage to finagle access can feel they owe a company and its executives and, in turn, write meek and sympathetic accounts rather than penetrating prose. Or they cannot break in—or do not even try—and write their book from a distance, without an insider’s insights.

Two new books demonstrate how hard it is to write well about Silicon Valley. “Filterworld” is an outsider’s account of the Valley’s impact, which reads as if it was entirely reported and written in a coffee shop in Brooklyn. The book laments how “culture is stuck and plagued by sameness” and blames Silicon Valley’s algorithms, “the technological spectre haunting our own era of the early 21st century”.

This is the sort of tirade against tech that has spread as widely as Silicon Valley’s apps. It is not wrong, but nor is it insightful. The author, Kyle Chayka, who is a journalist for the New Yorker, never reconciles the tension between the cultural “sameness” he decries and the personalisation everyone experiences, with online users possessing individual feeds and living in separate informational bubbles. Nor is this a wholly new phenomenon. People have been complaining about globalisation eroding local culture since “recorded civilisation” began, the author concedes. In 1890 Gabriel Tarde, a French sociologist, lamented the “persistent sameness in hotel fare and service, in household furniture, in clothes and jewellery, in theatrical notices and in the volumes in shop windows” that spread with the passenger train.

“Burn Book” is a better, though imperfect, read. Kara Swisher, a veteran chronicler of Silicon Valley, is both an insider and an outsider. She has attended baby showers for tech billionaires’ offspring, and even hosted Google’s top brass for a sleepover at her mother’s apartment. But she has a distaste for the Valley’s “look-at-me narcissists, who never met an idea that they did not try to take credit for”.

In delicious detail, she offers her verdict on the techies who have become household names, such as Facebook’s founder: “As sweat poured down Mark Zuckerberg’s pasty and rounded face, I wondered if he was going to keel over right there at my feet.” (That was in 2010, before he had gone through media-training galore.) Much as Truman Capote, an American writer, was willing to skewer the socialite swans of New York, Ms Swisher delights in prodding some of her subjects to make readers smile and squirm, such as media mogul Rupert Murdoch (“Uncle Satan”) and Amazon’s Jeff Bezos (“a frenetic mongoose” with “a genuinely infectious maniacal laugh”).

Link to the rest at The Economist

Catastrophe Ethics

From The Wall Street Journal:

It’s getting harder and harder to be a decent person. You wake up hankering for a coffee. But hold on! Before you order one, better make sure that a fair-trade producer supplied the beans. And then: a drop of milk? Cows have such a huge carbon footprint. Almond milk? Growing almonds requires copious quantities of water. Soy? Don’t even think about it.

Time to walk to work. That’s how you’ve been getting there since learning, last week, that your electric car’s cobalt comes from mines that engage in unacceptable labor practices. Now you’ve arrived but—can you believe it?—a co-worker just made a deplorable comment about the presidential campaign. Cut ties? A busy day of discussion ensues.

At last you’re home for the evening. Perhaps watch a comedian on your favorite streaming service? Not till you’ve checked whether he’s uttered something offensive in the past 15 years.

Modern life, Travis Rieder declares in “Catastrophe Ethics,” is “morally exhausting.” Everything we do “seems to matter,” he notes, and yet “nothing we do seems to matter.” The term “catastrophe” might seem to apply more to climate change than offensive comedians, but Mr. Rieder is speaking generally of collective problems that lie beyond the capacity of any of us to affect individually. They’re catastrophic in that they involve large social matters—the comedian, say, might be contributing to public prejudices by ridiculing a particular group—even though our own role in affecting them is vanishingly small. You’re not going to stop climate change on your own—you are, after all, one person in a global population of eight billion. Nor will the comedian you cancel even notice. What to do?

The great moral theories, Mr. Rieder tells us, are of little help. His prime target is utilitarianism, which holds that the right thing to do is whatever will maximize benefits and minimize costs for all concerned. Such counsel is useless, though, when our individual actions will neither yield any measurable benefit nor reduce any perceptible cost.

Other doctrines are explored as well. “Deontology” argues that we should not treat other human beings manipulatively, simply as means to our own ends. But its prohibitions seem better suited to acts like lying or promise-breaking, Mr. Rieder notes, than buying coffee or watching a comedian.

Then there is virtue ethics, which advises us to cultivate morally good character traits like temperance or moderation. Because the development of such traits takes place over time, though, it can’t really tell us whether our taking a joyride in our Hummer next Tuesday is right or wrong, since it’s unlikely to affect our character one way or another. Virtue ethics is not, Mr. Rieder concludes, particularly action-guiding.

Mr. Rieder, a bioethicist at Johns Hopkins, advises that the best course is simply to follow our own sense of personal “integrity,” an idea he derives from the philosopher Bernard Williams. For example, you might drink only fair-trade coffee because the proper treatment of workers is central to your sense of right and wrong, but you’re OK with listening to a comedian who offends particular groups. I, on the other hand, might cancel the comedian because his humor crosses some non-negotiable lines in my moral core, but I don’t get particularly worked up over where my coffee comes from.

We can’t, Mr. Rieder says, do everything. But we can be a person of integrity, as long as we “walk the walk” of our deepest values. There is a gentle wisdom here, reminiscent of the rabbinical saying: “You are not obliged to complete the work of the world, but neither are you free to desist from it.” Even so, Mr. Rieder might be too quick to dismiss utilitarianism and too sanguine about personal integrity.

Link to the rest at The Wall Street Journal

Tackling the TikTok Threat

From The Wall Street Journal:

The House on Wednesday is expected to vote on a bill that would give popular social-media app TikTok an ultimatum: Break up with the Chinese Communist Party (CCP), or break up with the U.S. It didn’t need to come to this, but Beijing and TikTok’s Chinese owner, ByteDance, left Washington with no choice.

Congress has spent years debating how to mitigate the national-security risks of TikTok’s Chinese ownership that have grown with the site’s popularity. About 150 million Americans use TikTok, and the app is a top source of news and search for Generation Z.

Donald Trump tried in 2020 to force ByteDance to divest TikTok, but his executive order was blocked in court, partly because the President lacked clear authority from Congress. Legislation by Wisconsin Republican Mike Gallagher and Illinois Democrat Raja Krishnamoorthi aims to overcome the legal obstacles.

Their bill would ban TikTok from app stores and web-hosting services in the U.S. if the company doesn’t divest from ByteDance. It also establishes a process by which the President can prohibit other social-media apps that are “controlled by a foreign adversary.” The bill is narrowly tailored while giving the President tools to combat future threats.

Banning TikTok should be a last resort, but ByteDance and Beijing have demonstrated that they can’t be trusted. Reams of evidence show how the Chinese government can use the platform for cyber-espionage and political influence campaigns in the U.S.

Numerous reports have found that posts about Uyghur forced labor in Xinjiang, the Tiananmen Square massacre, Hong Kong protests, Tibet and other politically sensitive content in China are suppressed on TikTok. A December study by the Network Contagion Research Institute found significant disparities between hashtags on Instagram and TikTok. The site also appears to amplify content that sows discord and ignorance in America. Pro-Hamas videos trend more than pro-Israel ones. Videos promoting Osama bin Laden’s 2002 “letter to America” went viral on TikTok last autumn.

How has TikTok responded to allegations that its algorithms are controlled by the Chinese government? In January it restricted researcher access to its hashtag data to make it harder to study. “Some individuals and organizations have misused the Center’s search function to draw inaccurate conclusions, so we are changing some of the features to ensure it is used for its intended purpose,” a TikTok spokesperson said.

Yet TikTok can’t explain why posts that are divisive in America go viral, while those that are sensitive for the CCP get few views. TikTok tried to allay concerns about CCP wizards behind the screen with its Project Texas, which houses American user data on Oracle servers and gives the U.S. software company access to its algorithms.

But TikTok’s algorithms are still controlled by ByteDance engineers in China. The Journal reported in January that TikTok executives have said internally that they sometimes need to share protected U.S. data with ByteDance to train the algorithms and keep problematic content off the site. Like protests for democracy in Hong Kong?

TikTok’s other major security risk is cyber-espionage. The app vacuums up sensitive American user information, including searches, browsing histories and locations. This data can and does flow back to China. “Everything is seen in China,” a TikTok official said in a leaked internal recording reported by BuzzFeed.

ByteDance employees tried to uncover internal leakers by spying on American journalists. After this surveillance was reported, ByteDance blamed “misconduct of certain individuals” who were no longer employed. But there’s nothing to stop CCP puppets in ByteDance back-offices from spying on Americans.

Meta ignited a firestorm several years ago when it was found to have given British consulting firm Cambridge Analytica access to user personal data. Political campaigns used the data to target ads. TikTok’s privacy risks and malign political influence are more disturbing since it answers to Beijing.

Xi Jinping has eviscerated any distinction between the government and private companies. ByteDance employs hundreds of employees who previously worked at state-owned media outlets. A former head of engineering in ByteDance’s U.S. offices has alleged that the Communist Party “had a special office or unit” in the company “sometimes referred to as the ‘Committee.’”

. . . .

In any case, the House bill doesn’t restrict First Amendment rights. It regulates national security. It also has ample precedent since U.S. law restricts foreign ownership of broadcast stations. The Committee on Foreign Investment in the United States forced the Chinese owners of Grindr, the gay dating app, to give up control of the company.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

PG thinks the WSJ opinion writers are going more than a bit overboard in their fears and, particularly, their “solution” to the problem they’ve overthought.

China is going to continue to engage in Communist behavior regardless of what happens with TikTok. If “everything is seen in China,” then China is spending a huge amount of human time looking at TikTok videos.

“TikTok can’t explain why posts that are divisive in America go viral.”

Posts that are divisive in America frequently go viral without any help from foreign nations. It’s a feature of democratic societies, not a bug.

People disagree on political issues face-to-face, by snail mail, by email, via newspaper editorial pages and, especially, online. Look at what was printed in old-fashioned newspapers run by William Randolph Hearst and Joseph Pulitzer when there were far fewer sources of news available to regular folk than is the case today.

When a case containing dismembered human remains surfaced in New York’s East River in June of 1897, the publisher of the New York Journal–a young, devil-may-care millionaire named William Randolph Hearst–decided that his newspaper would “scoop” the city’s police department by solving this heinous crime. Pulling out all the stops, Hearst launched more than a journalistic murder investigation; his newspaper’s active intervention in the city’s daily life, especially its underside, marked the birth of the Yellow Press.

Most notable among Hearst’s competitors was New York City’s The World, owned and managed by a European Jewish immigrant named Joseph Pulitzer. These two papers and others exploited the scandal, corruption, and crime among the city’s most influential citizens, and its most desperate inhabitants.

PG claims no deep expertise about what goes on in the uncountable number of online discussion groups, but suspects there are copious numbers of people from all sorts of places who subscribe to the belief that Joe Biden is being controlled by alien invaders, just like Donald Trump was before him.

How Pseudo-Intellectualism Ruined Journalism

From Persuasion:

I was sitting across from the professor as she went over my latest piece. This was 1986, Columbia School of Journalism, Reporting and Writing I, the program’s core course. At one point, in response to what I don’t recall, I said, “That doesn’t bode well for me.” I could have been referring to a lot of things; there were so many, in my time in journalism school, that did not bode well for me. One was the next set of words that came out of her mouth. “‘Bode?’” she said. “I haven’t heard anyone bode anything in a long time.” Another was her comment, on a previous piece, about my use of “agglomerate.” She had circled it and written, “No such word.”

But the most important was the intellectual climate of the school as a whole, in that it did not have one. We were not there to think. We were there to learn a set of skills. One of them, ironically, was asking questions, just not about the profession itself: its premises, its procedures, its canon of ethics. I know, because from time to time I tried, and it didn’t go well. This was trade school, not liberal arts school. When a teacher said something, you were supposed to write it down, not argue.

The main thing that I learned in journalism school was that I didn’t belong in journalism school. The other thing I learned was that journalists were deeply anti-intellectual. They were suspicious of ideas; they regarded theories as pretentious; they recoiled at big words (or had never heard of them). For a long time, I had contempt for the profession on that score. In recent years, though, this has yielded to a measure of respect. For notice that I didn’t say that journalists are anti-intellectual. I said they were. Now they’re something else: pseudo-intellectual. And that is much worse.

The shift reflects the transformation of journalists’ social position. This phenomenon is familiar. Journalism used to be a working-class profession. I think of Jimmy Breslin and Pete Hamill, icons of the New York tabloids, the working people’s papers, in the second half of the twentieth century. Breslin’s father was a drunk and a piano player who went out for rolls one day and never came back. His mother was a teacher and civil servant. Hamill’s father was a grocery clerk and factory worker; his mother, a domestic, a nurse’s aide, a cashier. Breslin went to Long Island University but dropped out after two years. Hamill left school at fifteen to apprentice as a sheet metal worker, enlisted in the Navy, and took some art school classes later on (he hoped to be a cartoonist). Both were Irish Catholic: Hamill from Brooklyn, Breslin from Queens, long before those boroughs were discovered by the hipsters and the condo creeps.

Coming up working-class, you develop a certain relationship to facticity. Your parents work with their hands, with things, or on intimate, sometimes bodily terms with other people. Your environment is raw and rough—asphalt, plain talk, stains and smells—not cushioned and sweetened. You imbibe a respect for the concrete, the tangible, that which can be known through direct experience, and a corresponding contempt for euphemism and cant. You develop a gut and a bullshit detector, acquire a suspicion of experts who operate at a remove from reality, which means academics in particular. Hence the recognition, in figures like Breslin and Hamill, that the world is chaotic, full of paradox, that people evade our understanding. Hence their sense of curiosity and irony and wonder. At the source of their moral commitments, they had not rules but instincts, a feeling for the difference between right and wrong. For the masses, they felt not pity but solidarity, since they were of them.

That was the profession’s ethos—skeptical, demotic—and you didn’t have to grow up working class (or be a man) to absorb it. Molly Ivins, Nora Ephron, Cokie Roberts, Maureen Dowd, Mara Liasson, even Joan Didion and Janet Malcolm, in their own ways: all had it or have it. But none of them was born more recently than 1955. In the last few decades, journalists have turned into a very different kind of animal. “Now we’re not only a college-dominated profession,” wrote David Brooks not long ago, citing a study that found that more than half of writers at The New York Times attended one of the 29 most selective institutions in the country; “we’re an elite-college-dominated profession.”

. . . .

A couple of years ago, writing in The Chronicle of Higher Education, an Ivy League professor said the quiet part out loud. “Not all of our students will be original thinkers,” she wrote, “nor should they all be. A world of original thinkers, all thinking wholly inimitable thoughts, could never get anything done. For that we need unoriginal thinkers, hordes of them, cloning ideas by the score and broadcasting them to every corner of our virtual world. What better device for idea-cloning than the cliché?” She meant academic clichés, having mentioned “performativity,” “normativity,” “genderqueer,” and others. “[W]e should instead strive to send our students forth—and ourselves too—armed with clichés for political change.”

And that’s exactly what has happened, nowhere more so than in journalism. The progressive academic ideology has become the intellectual framework of the field, or at least of its most visible and influential parts: The New York Times, The Washington Post, NPR, et al. More to the point, the field now has an intellectual framework, one that journalists seek, top-down, to impose on the world, on the stories they report. The practice travels under the name of “moral clarity”—as if moral clarity were anything, in this world, besides a very rare commodity (I would love to know what Didion thought of the concept), and as if the phrase meant anything other, in this context, than tailoring the evidence to fit one’s pre-existing beliefs. Facts are now subordinated to the “narrative,” a revealing word: first, because it comes from academia (it is one of those clichés); second, because it’s almost always misused, a particle of garbled theory cloned and memed (as the professor would have wanted). When journalists say “narrative,” they mean “idea.” And it is always an idea they’ve received from someone else.

They think they’re thinking, but they’re wrong. They think that thinking means applying ideas, in the sense that you’d apply an ointment. What it actually means is examining them, reworking them, without fear, without cease. They believe that they are skeptical. In fact, they’re alternately cynical and gullible: cynical toward the other side and gullible toward their own (that they see themselves as being on a side is part of the problem, of course). That is why they’re helpless before the assertions of like-minded activists and academics or of acceptably credentialed experts—incapable of challenging their premises or putting pressure on their arguments. For those who lie outside their mental world, who haven’t taken the courses and acquired the jargon, they feel not kinship but, depending on the outsider’s demographic category, condescension or contempt.

Few students, at any time, come out of college fully equipped to think. 

Link to the rest at Persuasion

Oppenheimer Couldn’t Run a Hamburger Stand. How Did He Run a Secret Lab?

From The Wall Street Journal:

When he was named the director of the Manhattan Project’s Los Alamos Laboratory, J. Robert Oppenheimer was an improbable choice for the most important job in America. 

At the time, he was a 38-year-old theoretical physicist who had never managed anything more than a dozen graduate students, much less an operation with the fate of the world at stake. Leslie Groves, the Army general who hired him, said he received “no support, only opposition” for his decision. One close friend who would later win a Nobel Prize called Oppenheimer “absolutely the most unlikely choice” to run a secret lab that would build the atomic bomb.  

“He couldn’t run a hamburger stand,” said another colleague. 

So how did he transform into one of the most effective and consequential leaders in history? 

This weekend, “Oppenheimer” is expected to dominate the Oscars. But even watching a three-hour movie from a painstakingly meticulous auteur like Christopher Nolan isn’t enough to understand what made Oppenheimer tick. If you really want to get inside his mind, you have to read two Pulitzer Prize-winning books, “American Prometheus” by Kai Bird and the late Martin J. Sherwin and “The Making of the Atomic Bomb” by Richard Rhodes. (PG notes that the Rhodes book is available with Kindle Unlimited.)

. . . .

Oppenheimer the recruiter

Before he could build the bomb, Oppenheimer had to build something else with the potential to blow up in his face: a team. 

Los Alamos hadn’t yet been selected as the site of his secret lab when Oppenheimer began hunting for talent. Once he identified scientists and decided to hire them, he did whatever it took to get them. When the physicist Richard Feynman turned him down because his wife was sick with tuberculosis, for example, Oppenheimer found a sanatorium close enough to Los Alamos that Feynman could visit her on the weekends.

It’s a revealing story not because Feynman was a star but the opposite: The future Nobel winner was still merely a graduate student, “not anybody famous at all,” as he put it, and yet Oppenheimer still went above and beyond. 

. . . .

Oppenheimer the communicator

Once he got them, Oppenheimer knew how to get the best work out of his scientists. In his new book, Charles Duhigg writes about “supercommunicators,” people who are “capable of saying exactly the right thing, breaking through to almost anyone, figuring out how to connect in even the most unlikely circumstances.” Oppenheimer, as it turns out, was a supercommunicator. 

Others in Los Alamos were better physicists, chemists and engineers. But what he could do better than anybody there—and maybe better than anybody on the planet—was take scientists with different perspectives and bring them to a consensus. 

“He would stand at the back of the room and listen as everyone argued,” Bird said. “Then he would step forward at just the right moment, summarize the salient points that everyone had been making that were in common and point the way forward.” 

“He would walk in, quickly grasp what the problem was and almost always suggest some leads to a solution,” Rhodes said. 

. . . .

But what set him apart from the other geniuses at Los Alamos was his broad knowledge and breadth of interests, which allowed him to make connections across disciplines and see what others in the room couldn’t. They were specialists. He was a generalist. They were singularly focused on their narrow fields of research. He was curious about philosophy, literature, poetry and the Bhagavad Gita. “He was a good scientist precisely because he was also a humanist,” Bird says.

Groves was so impressed by Oppenheimer’s range of interests that he once declared: “Oppenheimer knows everything.” He also could explain everything he knew without condescending, another trait that distinguished him from other eminently qualified scientists who interviewed for the job. 

“He was able to speak in plain English,” Bird said. 

. . . .

Oppenheimer the collaborator

The scientists were willing to drop everything in their lives to work around the clock in the middle of nowhere. What they were not willing to do was wear a military uniform. 

Oppenheimer himself was so allergic to hierarchy that he objected to making a basic organizational chart. He was intense but informal, someone who commanded respect without demanding it, and the biggest difference between Oppenheimer and Army generals was how they believed teams should operate. 

The military relied on compartmentalization. He insisted on collaboration. 

By demanding a flatter structure, Oppenheimer might as well have asked the Army if everyone in Los Alamos could have a mullet. In fact, when Groves learned that Oppenheimer was in favor of instituting a weekly colloquium for hundreds of scientists, he tried to shut it down. Oppenheimer prevailed. He understood the value of gathering people from different parts of a project in the same place, encouraging them to discuss their work and combine their ideas.

“Very often a problem discussed in one of these meetings would intrigue a scientist in a completely different branch of the laboratory,” the physicist Hans Bethe once wrote, “and he would come up with unexpected solutions.”

The meetings also improved morale at Los Alamos, providing a weekly reminder that everyone on the Manhattan Project had a role to play. Oppenheimer was right to fight for their existence. 

“He won the loyalty of people inside the fence,” Bird says. “They could see that he was protecting them, allowing them to collaborate and talk freely, which was necessary to the success of the project.”

They worked six days a week, but Oppenheimer made sure they weren’t only working. On their off days, there was horseback riding, mountain climbing, skiing, hiking and some of the geekiest basketball games of all time. When a local theater group staged a performance of “Arsenic and Old Lace,” Oppenheimer brought the house down with his surprise cameo as a corpse. And he was especially famous for his parties, where Oppenheimer paired his deadly gin martinis with his favorite toast: “To the confusion of our enemies!” 

Link to the rest at The Wall Street Journal

Larry Summers on What Went Wrong on Campus

From Persuasion:

Larry Summers is an economist, the Charles W. Eliot University Professor and director of the Mossavar-Rahmani Center for Business and Government at Harvard Kennedy School, and a member of the board of directors of OpenAI. Summers is the former President of Harvard University, the former Secretary of the Treasury under Bill Clinton, and was a director of the National Economic Council under Barack Obama.

In this week’s conversation, Yascha Mounk and Larry Summers discuss how universities can re-commit to pursuing truth and protecting academic freedom, and how current economic indicators contrast with the way many people actually experience the economy.

. . . .

Yascha Mounk: The last few months have been rather eventful at Harvard University. Tell us your view of what has happened and why it matters.

Larry Summers: It’s been a very difficult time. I think what universities do is as important as the work of any other institution in our society, in terms of training young people and preparing them for careers of leadership, and in terms of developing new ideas that set the tone for the cultural, the political, the policy debates that go forward.

Paul Samuelson famously said that if he were allowed to write the economics textbooks, he didn’t care who got to serve as the finance ministers going forward. So I think what happens in universities is immensely important. And I think there is a widespread sense—and it is, I think, unfortunately, with considerable validity—that many of our leading universities have lost their way; that values that one associated as central to universities—excellence, truth, integrity, opportunity—have come to seem like secondary values relative to the pursuit of certain concepts of social justice, the veneration of certain concepts of identity, the primacy of feeling over analysis, and the elevation of subjective perspective. And that has led to clashes within universities and, more importantly, an enormous estrangement between universities and the broader society.

When the president of Harvard is a figure on a Saturday Night Live skit, when three presidents of universities combine to produce the most watched congressional hearing film clip in history, when applications to Harvard fall in a several-month period by more than they’ve ever fallen before, when alumni are widely repudiating their alma mater, when they’re the subject of as many legal investigations as the Boeing company, you have a real crisis in higher education. And I think it’s been a long time coming because of those changes in values that I was describing.

Mounk: Tell us a little bit more about the nature of the conflict here. What is the conception of the university that has historically guided it, and how is it that those values have changed over the last ten years?

Summers: I think the values that animated me to spend my life in universities were values of excellence in thought, in pursuit of truth. We’re never going to find some ultimate perfect truth, but through argument, analysis, discussion, and study we can get closer to truth. And a world that is better understood is a world that is made better. And I think, increasingly, all you have to do is read the rhetoric of commencement speeches. It’s no longer what we talk about. We talk about how we should have analysis, we should have discussion, but the result of that is that we will each have more respect for each other’s point of view, as if all points of view are equally good and there’s a kind of arbitrariness to a conception of truth. That’s a kind of return to pre-Enlightenment values and I think very much a step backward. I thought of the goal of the way universities manage themselves as being the creation of an ever larger circle of opportunity in support of as much merit and as much excellence as possible.

I spoke in my inaugural address about how, a century before, Harvard had been a place where New England gentlemen taught other New England gentlemen. And today it was so much better because it reached to every corner of the nation, every subgroup within the population, every part of the world. It did that as a vehicle for providing opportunity and excellence for those who could make the greatest contribution. But again, we’ve moved away from that to an idea of identity essentialism, the supposition that somehow the conditions of your birth determine your views on intellectual questions, whether it’s interpretations of quantum theory or Shakespeare. And so, instead, our purpose is not to bring together the greatest minds, but is back to some idea around multiplicity of perspective, with perspective being identified with identity. We used to venerate and celebrate excellence. Now, at Harvard, and Harvard is not atypical of leading universities, 70 to 75% of the grades are in the A-range. Why should the institutions that most celebrate excellence have only one grade for everyone in the top half of the class, but nine different grades that are applied to students in the lower half of the class? That is a step away from celebrating and venerating excellence.

We celebrate particular ideas in ways that are very problematic, and we are reluctant to come to judgment: What started all the controversy at Harvard, and it has many different strands, was on October 7, when 34 student groups at Harvard, speaking as a coalition of Harvard students, condemned Israel as being responsible for the Hamas attacks. That statement from the 34 student groups was reported in places where literally billions of people read it. And based on some inexplicable theory, the Harvard administration and the Harvard corporation (the Trustees of the University) could not find it within themselves to disassociate the university from those comments. I have no doubt that if similar comments had been made of a racist variety, there would have been no delay in the strongest possible disassociation of the university. But because Israel demonization is the fashion in certain parts of the social justice-proclaiming left, there was a reluctance to reach any kind of judgment, even about the most morally problematic statements.

It is not that the university was slow to comment on George Floyd. It is not that the university was slow to comment when some within it wanted to host a “black mass.” It is not that the university has been slow when social scientists have wanted to speculate about group differences. So I think that this combination—the veneration of a particular concept of social justice, the disrespect for excellence, the celebration of identity rather than the pursuit of opportunity, and the rejection of truth—has made these institutions problematic in the impact they have on those who pass through them and in whatever influence they have on the broader society, and has left them estranged from that society. And I think for any kind of private institution, it has to find a social contract in which it can operate with the broader society. And the fact that the ways in which great universities have acted have so enabled the Elise Stefaniks, the Bill Ackmans, and the Christopher Rufos speaks to how dangerously they have been governed.

I come from a left of center tradition. And I’m not far left of center, but surely left of center. And I’ve always been acutely aware, in thinking of universities, that Ronald Reagan got his political start by condemning and running against what was happening at Berkeley in the mid-1960s. And that the tradition of then-Governor Brown—who had inaugurated this wonderful idea of free college education for anyone who had a B-average in a California high school—got completely blown away in a tide of fury about “welfare Cadillacs.” But what brought that tide to prominence was a general revulsion at what had been going on at Berkeley that Ronald Reagan rode to his political career. And so it seems to me that universities that fail to govern themselves effectively are at immense peril to themselves and to the broader progressive values that they hold.

Mounk: How dangerous do you think this moment is, not just to the reputation of universities but to their actual ability to function as core institutions of the United States? I’m a little torn on this. On the one hand, you can make the case that even the most affluent and insulated universities like Harvard need federal funding for the research they undertake, and to finance a lot of the student loans that their undergraduates take out. On the other hand, Harvard has an endowment of, what, $50 billion? And it does continue to have real support in the population. Where would you place yourself on the worry scale about the worst-case scenarios here?

Summers: I think one would find for any Ivy League school that the federal government was ten times as large a donor, at least, as any other donor. And I think it’s fair to say that the universities have thumbed their nose at what is by far their largest donor. And they’re certainly not prepared to take that casual and cavalier attitude towards much smaller individual donors because of what they think the consequences would be. I think it’s fine to stand strongly against a set of people who in many ways are riding this horse, but wish the process of thought and wish academic freedom ill. The problem is not that Harvard has worked itself into a war with Elise Stefanik. The problem is that it got itself condemned from the White House press briefing room of the Biden administration, that it finds itself subject to investigation from the Department of Education of the Biden administration, that the attacks on it are coming in a bipartisan way.

I think one of the aspects of how this has happened is that while on the one hand we think of intellectual communities as being the most broad-minded of communities, on the other hand they are actually among the most narrow, insular and inward-looking in the way they evaluate themselves and in the way they think of the necessary decision making. There’s an old story about when Pat Moynihan had decided to leave the UN and called the Dean of Harvard to say he would be returning. He said he’d let the president know and the Dean of Harvard assumed he was referring to the President of Harvard rather than the President of the United States. And that bespeaks a kind of attitude that I think is very problematic. 

Link to the rest at Persuasion

Compulsively Trying to Please People Who Never Liked Me

From Electric Lit:

Having been her editor for a few years now, this is not the first time I’ve been asked to say something coherent about Jessi Jezewska Stevens’ fiction. The curious thing is that every time the question comes, I go rummaging through my critical cupboard in search of the right tools for getting at her work and find myself defaulting, again and again, to the bludgeon over the scalpel. Though no one’s work could be less a blunt object, there’s something about Stevens’ writing that tempts me toward the grand statement. It makes me want to issue unpleasant and no doubt blinkered generalizations about the State of Contemporary Fiction. I want to hold Stevens’ work up as an antidote to this or that literary malaise. My tone becomes oracular, apocalyptic, even when the register of the text in question—as in “A New Book of Grotesques” from her collection Ghost Pains—would seem to be anything but.

Partly to blame are her two novels, The Exhibition of Persephone Q and The Visitors. Each, in its way, tackles a doomsday: Persephone, the early 2000s tipping point between pre- and post-digital notions of personhood; The Visitors more literally, with the financial crisis precipitating counterfactual collapse in New York circa 2011. But there’s more to it than Armageddon by association.

If I were to tell you that Stevens often writes about quasi-narcissistic women enjoying lives of temporal comfort while suffering from a comedic inability to act with either decisiveness or effectiveness, you would, I suspect, yawn in my face. Stevens and everyone else, you might say. Yet her fiction perches upon and pries open little fissures in our prejudices about what fiction “ought” to be doing at this moment in history. She manages, with rigor and strangeness, to make the old wounds hurt, to make our shallowness feel dangerous again. Her stories remind me that there’s a great difference between a self-regarding writer whose project is the anatomizing of her own anomie and a writer whose project is to interrogate the anomie of self-regard. It’s the difference between diminishing our artform until it takes on the proportions and vocabulary of quotidian pettiness and approaching that pettiness with all the great and varied tools to which our artform has claim. Or maybe it’s just the difference between complaint and diagnosis.

“Can you believe the mistakes I was already making?” asks the glib narrator of “A New Book of Grotesques.”

Link to the rest at Electric Lit

The Dictator’s Best Friend

From The New Statesman:

“Writers under despots,” says Simon Ings, “may have to take instruction, but they’re rarely out of a job.” Every regime requires a story to validate it, and a regime lacking the authority of tradition needs one most urgently. Songs and slogans; heroes and martyrs; legends of the past, visions of the future: every would-be dictator makes use of them, and so every dictatorship seeks out literary celebrities who can inspire an uprising and then be flattered or coaxed or bought or terrified into celebrating the brave new world.

This book describes the relationships between four such authors and the political leaders whose causes – with varying degrees of willingness – they defined or promoted.

Ings begins with a failure. General Boulanger’s populist-militarist image, waving his hat from the back of his black horse while calling for a war of revenge against Prussia, proved compelling to a variety of interest groups in late-19th-century France. Maurice Barrès, admired nationalist and nostalgic author, put his talents at Boulanger’s service. But though Boulanger was charismatic personally, he was weak strategically. His programme (essentially “Make France Great Again”) was vacuous. “The less Boulanger said,” writes Ings, “the better he did.” When the moment came for him to seize power he drew back. Meanwhile, Barrès, as Ings candidly admits, was a “grouch”.

Ings makes the best of the poor material the pair offer him by beginning in 1900 with a panoptic view of the Exposition Universelle, which he describes with gusto. Barrès, who detested universalism, makes his way gloomily through it to attend a dinner given by Action Française – an organisation with which he shared a quasi-mystical reverence for patrie, rootedness and the heroic dead.

Boulanger took his own life but Barrès, his admirer, lived on, his patriotism mutating nastily into anti-Semitism. For all their shortcomings, the two of them provide a handy vehicle to carry Ings’s ideas about celebrity and its political uses, about “the religious instinct of crowds” and the power of popular sentiment.

. . . .

Ings’s next pairing is more dynamic. The author is Gabriele D’Annunzio – poet, playwright, serial seducer, aviator, war-mongering orator and, briefly, small-scale dictator. The large-scale dictator is Benito Mussolini.

D’Annunzio declared that when he temporarily laid aside “scribbling” for violent political action he was working in a new art form whose material was human lives. He seized the Croatian port city of Fiume (now Rijeka) in 1919, and made himself its “Duce”, using it as the setting for a 15-month-long piece of spectacular street-theatre. Parades, marching bands, anthems belted out by volunteers with piratical hair-dos and black uniforms – all in celebration of a greater Italy, of soldiers as sacrificial victims offered up on the altar of the patria, and of D’Annunzio himself. Mussolini took note. He later called D’Annunzio the “John the Baptist of Fascism”. As Ings writes, “All Mussolini’s ritual, symbolism, mystique and style can be tied back to D’Annunzio.”

The young Mussolini was a dogged and voracious autodidact. Ings praises his early journalistic essays as “deeply thought-out” and summarises the texts he was reading – by Georges Sorel, Gustave Le Bon, Roberto Michels. Lightly skimming where Mussolini dug deep, Ings gives his readers a concise round-up of the intellectual ground in which the 20th-century dictatorships took root. He has a talent for succinct statements so well turned that they immediately ring true. His summings-up are forceful. He can make sense of syndicalism (something many historians struggle to do) and explain how attractive it seemed to early-20th-century thinkers from each end of the political spectrum, and why it lent itself so conveniently to totalitarianism.

. . . .

His Russian “engineer of human souls” (the phrase is Stalin’s) is Maxim Gorky. Ings opens this section in comic mode with Gorky’s visit to New York in 1906. Mark Twain has laid on mass meetings and grand dinners in his honour. Americans who, as Ings tartly comments, “can’t tell one Russian revolutionary from another” are delighted to applaud him. When trouble comes it has nothing to do with politics: it arises from the clash between revolutionary/bohemian sexual mores and American prudishness. Gorky, travelling with the actress Maria Andreyeva, to whom he is not married, causes scandal. He may deliver stirring speeches about how the “black blood-soaked wings of death” hover over his fatherland. He can recite Poe’s “The Raven” in Russian, thrilling auditors with his “deep musical voice”. But the grandeur of his mission and his manner are repeatedly undercut by farce. The pair of sinful lovers are turned out of hotels, and passed from one host’s spare room to another “like unexploded ordnance”.

They move on to London, where they meet Lenin, in exile like them. Lenin checks their bed sheets for damp – “We need to take good care of you” – and congratulates Gorky on having written a “useful” book (the novel Mother, which Gorky himself considers “really bad” but which lends itself to use as propaganda). So begins an association between the party and its tame author that will last uneasily for 30 years. To the Bolsheviks, Gorky is an asset as unreliable as he is valuable. To Gorky, the Soviet Union is a paymaster that allows him to practise the fascinating work of “god-building” – creating a faith for those who had abolished God. But at the time of the October Revolution he wrote that what was coming was “a long bloody anarchy, and, after it, a not less bloody and dark reaction”. Some 20 years later, Romain Rolland, watching him being treated as a literary lion, thought that Gorky was more like a sad old performing bear.

Link to the rest at The New Statesman

A Memoirist Who Told Everything and Repented Nothing

From The New Yorker:

When she died at a hundred and one in January of 2019, Diana Athill had publicly chronicled both ends of her long life in a series of nine memoirs. The first of these, “Instead of a Letter,” was published in 1963 and recently rereleased in the U.S. as part of the NYRB Classics series; it recounts her jolly, upper-class English childhood on the family estate of Ditchingham, in Norfolk. The last book that she wrote, “Alive, Alive Oh!,” came together in her “darling little room” at the Mary Feilding Guild, in Highgate, London, a garden-set home for the elderly; it’s a high-spirited, recalcitrant account of “waiting to die” at ninety-six.

Athill was the sort of character who ought to have seen her obituaries before she went. First, because she would have bewitchingly written off any high praise—the New York Times noted “her luminous prose, gimlet social acuity and ability to convey a profound sense of place”—with her brand of droll humor. (She refused burial at the Highgate Cemetery because of the cost: “I think being dead is an expensive business.”) And, second, because she would have enjoyed the evidence of how much her reputation had emerged; she’d worked behind the scenes for meagre wages and little adulation as one of the century’s great editors. In 1952, she became a co-founding director of the publishing house André Deutsch, and, until her retirement, in 1992, shepherded the likes of Philip Roth, John Updike, and Jean Rhys to publication. Athill wrote seven of her memoirs after leaving her nine-to-five, but, until that relatively late turn toward autobiographical mania, she knew her place. “We must always remember that we are only midwives—if we want praise for progeny we must give birth to our own,” she writes, in “Stet: An Editor’s Life.” We might not have known her had she not brought forth her own romping and exuberant litter.

Critics frequently used the terms “frank” or “candid” to describe Athill’s memoirs. But Athill doesn’t write as if no one is watching; she writes as if she’d never even imagined someone might watch, and therefore doesn’t have a scruple to hold on to. To describe honesty as her hallmark isn’t quite enough: that’s the least we can ask of our memoirists. What she is marvellous at is admitting, sans self-recrimination. In the early twenty-first century, the memoir has turned into a confessional, in a nearly religious sense. Writers go there to seek redemption, and to chart their evolution from naïve to knowing: no narrative is more marketable than metamorphosis. Athill doesn’t treat her foibles and losses—of love, of money, of caste, of certainty—as traumas, events that would define her life as troubled and scarring. Instead, she makes the case that being kicked out of Eden is good for the soul.

“I am glad that I have not inherited money or possessions,” Athill writes, striking a defiant note in “Instead of a Letter.” Inheritance was never her due, though as a child she once counted the bodies that stood between her and the palatial Ditchingham estate. “It appeared that at least twelve people, seven of them my contemporaries, would have to die before I would have a claim, and I hardly thought I ought to pray for this however much I would have liked to.” Ditchingham belonged to her mother’s parents, who offered it out as the extended family’s seasonal home, where they spent long summers and holidays throughout her early life. The thousand-acre estate with a twenty-bedroom, fully staffed house granted the family security in their Englishness, as members of an élite and unquestionable class. Athill stresses that the experience of growing up with such surety turned Ditchingham into a cocoon, a secure location from which to launch a life, but also a place she would inevitably leave. “There I used to be,” she opines, “as snug and as smug as anyone.” From an early age, she knew that adulthood would exist elsewhere.

Athill’s joy in Ditchingham, the children’s after-tea appearance in front of the grownups in the drawing room and the horsemen wandering across the fields, is the bright marrow of her writing: it suffuses her later life, and her prose, with bubbling, fresh oxygen. But, in “Instead of a Letter,” she writes as if she’s relieved that she got away from the estate and its inhabitants. “Like anyone else they had their charms,” she writes of her family, but “physically, intellectually, and morally, they were no more than middling.” Yet they thought themselves superior beings: “Smugness is too small a word for what it feels like from inside. From inside, it feels like moral and aesthetic rightness; from inside, it is people like me, who question it, who look stupid, ugly, and pitiful.”

Hence her happiness that she didn’t inherit: staying on at Ditchingham for a lifetime might have trapped her in the same small, closed life. Her childhood remained blissful to her as she aged because it lived on in her memory but didn’t define her future. “Never to have broken through its smothering folds would have been, I have always thought, extremely depressing,” she writes. “But on the other hand, not to have enjoyed a childhood wrapped warmly in those folds—that would be a sad loss.” Cousins were saddled with managing the finances of an upkeep-heavy country pile, whereas she, the oldest child of a fourth daughter, absorbed the bliss of the place but not the narrowness.

Ditchingham wasn’t the only inheritance that Athill would forgo. At thirteen, her mother told her that they’d “lost” their money, but what she meant was that they’d spent it all. “My parents felt they were living austerely because we ourselves looked after our ponies and they had not kept on their own hunters,” Athill writes, dryly. She recounts her mother telling her that “the really bloody thing about being poor is that if you leave something on the floor when you go out, you know that it will still be there when you get back.” Along with her two younger siblings, the family had been living in a well-staffed, six-bedroom house in Hertfordshire since her father had retired from the Army. Financially, they fell out a window but landed on a mattress—Athill’s grandparents rented them Manor Farm, a house on the estate, for cheap. A governess cost too much, so Athill was sent to Runton, a girls’ boarding school on the North Sea, and then up to St. Mary’s College at Oxford, in 1936.

When Athill was twenty-two, her future disintegrated again. She’d been engaged for two years when her fiancé, a Royal Air Force pilot named Tony Irvine, was deployed to Egypt. Then his letters suddenly stopped. She discovered in rapid succession that he’d married someone else while abroad and then been killed in action. “A long, flat unhappiness” set in, her sense of her own value collapsed, and her twenties were filled with broken-off relationships with incompatible men. “By the time I had reached my thirties,” she writes, toward the end of “Instead of a Letter,” “I was convinced that I lacked some vital quality necessary to inspire love.” At age ninety-nine, she explained in an interview, “there was a basic, underlying sense of failure—and it came from the very simple thing of having been brought up expecting to get married.”

“How did I get this way?” is one of memoir’s primary questions. Typical culprits are poverty or abandonment, sometimes a remarkable, indelible catastrophe. Cheryl Strayed’s mother died when Strayed, the author of “Wild,” was in college: she calls it her “genesis story.” Dani Shapiro, the author of five memoirs, starts her autobiographical path in “Slow Motion” with the story of her parents’ tragic car accident. Even Joan Didion reached new heights of cultural resonance with “The Year of Magical Thinking,” her memoir of the year following her husband’s death. The modern memoir is the proving ground for our national obsession with trauma, a place to gawk at whoever comes through the emotional meat grinder with the good sense and talent to finesse their damage into a redemption song.

Link to the rest at The New Yorker

Ditchingham Hall (photo: Stephen Richards, CC BY-SA 2.0, via Wikipedia: https://en.wikipedia.org/wiki/Ditchingham_Hall)

Is everything you assumed about the Middle Ages wrong?

From The Economist:

“In public, your bottom should emit no secret winds past your thighs. It disgraces you if other people notice any of your smelly filth.” This useful bit of advice for young courtiers in the early 13th century appears in “The Book of the Civilised Man”, a poem by Daniel of Beccles. It is the first English guide to manners.

Ian Mortimer, a historian, argues that this and other popular works of advice that began appearing around the same time represent something important: a growing sense of social self-awareness, self-evaluation and self-control. Why then? Probably because of the revival in the 12th century of glass mirrors, which had disappeared from Europe after the fall of Rome. The mirror made it possible for men and women to see themselves as others did. It confirmed their individuality and inspired a greater sense of autonomy and potential. By 1500 mirrors were cheap, and their impact had spread through society.

Mr Mortimer sets out to show that the medieval period, from 1000 to 1600, is profoundly misunderstood. It was not a backward and unchanging time marked by violence, ignorance and superstition. Instead, huge steps in social and economic progress were made, and the foundations of the modern world were laid.

The misapprehension came about because people’s notion of progress is so bound up with scientific and technological developments that came later, particularly with the industrial and digital revolutions. The author recounts one claim he has heard: that a contemporary schoolchild (armed with her iPhone) knows more about the world than did the greatest scientist of the 16th century.

Never mind that astronomers such as Copernicus and Galileo knew much more about the stars than most children do today. Could a modern architect (without his computer) build a stone spire like Lincoln Cathedral’s, which is 160 metres (525 feet) tall and was completed by 1311? Between 1000 and 1300 the height of the London skyline quintupled, whereas between 1300 and the completion of the 72-storey Shard in 2010, it only doubled. Inventions, including gunpowder, the magnetic compass and the printing press, all found their way from China to transform war, navigation and literacy.

This led to many “expanding horizons” for Europeans. Travel was one. In the 11th century no European had any idea what lay to the east of Jerusalem or south of the Sahara. By 1600 there had been several circumnavigations of the globe.

Law and order was another frontier. Thanks to the arrival of paper from China in the 12th century and the advent of the printing press in the 1430s, document-creation and record-keeping, which are fundamental to administration, surged. Between 1000 and 1600 the number of words written and printed in England went from about 1m a year to around 100bn. In England, a centralised legal and criminal-justice system evolved rapidly from the 12th century. Violent deaths declined from around 23 per 100,000 in the 1300s to seven per 100,000 in the late 16th century.

Link to the rest at The Economist

Getting a Grip on Gravity Aboard a Ship

From The Wall Street Journal:

There’s an endless stream of advice about things that will stop you sleeping well: bright screens before bedtime, too much caffeine, the wrong pillows and so on. Let me add one to the list. It’s almost impossible to get any decent sleep while you’re constantly sliding across the bedsheets down to the end of the bed, and then a few seconds later sliding back the other way.

This sounds extreme, but it’s been a regular feature of my life over the past month. I spent some of November and all of December on a scientific research ship in the middle of the Labrador Sea, as part of an international project to measure stormy seas and how they help the ocean breathe. It was a great opportunity from a scientific point of view, but it came with a built-in challenge—the middle of the ocean during a big storm is a very difficult place to work, because gravity becomes a fickle trickster.

Gravity is incredibly useful, most of all because it’s reliable. We put things on shelves knowing that they’re going to stay there. Faucets point downward because the water will come out and keep going down. But if you need a ship to go in specific directions for scientific reasons, it can’t sit in the most stable orientation amid the wind and the waves. The ship will roll much more than normal and gravity effectively becomes changeable.

You find yourself chasing the water around the shower cubicle because the direction that it’s falling relative to you keeps shifting. Every corridor has handrails that you need to hold onto as you zigzag along. The default assumption is that everything will move, so you have to strap everything down as soon as you move it, and if you don’t, it certainly won’t be there when you get back. After one particularly violent night, I got down to my lab to find that a large, heavy instruction manual that I had left on a shelf had been thrown onto the top of the printer 3 yards away.

But there is one exception to all this motion. There’s a type of nonslip rubbery mesh table mat that I’ve never seen outside a ship, and anything that’s placed on that will just stay put. The night that the manual took flight, I had left a laptop sitting on a grippy tabletop mat, not strapped down, and I really should have been punished for that mistake by finding the laptop smashed in a corner. But it hadn’t moved.

My savior was friction, and this type of mat is exceptional at creating it. Surfaces are all rough at tiny scales, and so when one object is placed on top of another they only really touch at a small number of tiny points. It’s like one mountain range being placed upside down on top of another one, so that only the biggest peaks actually poke into the other surface, and those points take all the weight.

But the soft rubber is easily squashed, increasing the contact area as the mat molds itself to the underside of the object on top. That makes it much harder for the upper object to slide. When there’s a sideways force, the top of the rubber can also move slightly sideways without breaking, absorbing the force within the mat and reducing the sideways push on the object. That all reduces the sensitivity of the object to the exact direction of gravity. As long as gravity is still mostly downward, the mat will stop things sliding around. This is a game-changer at sea, because “mostly downward” is as reliable as gravity gets.
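
A back-of-the-envelope check (mine, not the article’s; the friction numbers are illustrative assumptions): an object resting on a deck tilted by angle θ stays put as long as the component of gravity along the deck does not exceed the maximum static friction. With mass m, gravitational acceleration g, and static friction coefficient μ_s, the mass cancels:

\[ mg\sin\theta \le \mu_s\, mg\cos\theta \quad\Longrightarrow\quad \tan\theta \le \mu_s \]

For a hard object on a smooth desk, an assumed μ_s of about 0.3 means sliding starts at a roll of only about 17 degrees, while a rubbery mat with an effective μ_s above 1 would hold through rolls past 45 degrees. That is why “mostly downward” is all the mat needs.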

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

English Is A Germanic Language

From Rosetta Stone:

German is widely considered among the easier languages for native English speakers to pick up. That’s because these languages are true linguistic siblings—originating from the exact same mother tongue. In fact, eighty of the hundred most used words in English are of Germanic origin. These most basic, common words in English and German derive from the same roots, making them amazingly similar. That’s why an English word like “friend” is “Freund” in German. Plus, there are an incredible number of German and English words that aren’t simply related, but identical: arm, hand, kindergarten, angst, bitter, and many more.

Generally, if you’re an English speaker with no exposure to other languages, here are some of the most challenging languages to learn: Mandarin Chinese, Arabic, Icelandic, Thai, Vietnamese, Finnish, Japanese, and Korean.

Link to the rest at Rosetta Stone

From Oxford International:

History of the English language

Charles Laurence Barber comments, “The loss and weakening of unstressed syllables at the ends of words destroyed many of the distinctive inflections of Old English.”

Similarly, John McWhorter points out that while the Norsemen and their English counterparts were able to comprehend one another in a manner of speaking, the Norsemen’s inability to pronounce the endings of various words ultimately resulted in the loss of inflectional endings.

Many of you would be forgiven for thinking that studying an English language course consists of English grammar more than anything else. While English grammar does play a part when taking courses to improve your English overall, it is but a small part of the curriculum, in which one becomes immersed in a history that was partly influenced by myths, battles, and legends on one hand, and the everyday workings of its various social classes on the other.

According to the Encyclopedia Britannica, the English language itself really took off with the invasion of Britain during the 5th century. Three Germanic tribes, the Jutes, Saxons, and Angles, were seeking new lands to conquer, and crossed over from the North Sea. It must be noted that the English language we know and study through various English language courses today had yet to be created, as the inhabitants of Britain spoke various dialects of the Celtic language.

During the invasion, the native Britons were driven north and west into lands we now refer to as Scotland, Ireland, and Wales. The words England and English originated from the Old English word Engla-land, literally meaning “the land of the Angles,” where they spoke Englisc.

Old English (5th to 11th Century)

Albert Baugh, a notable English professor at the University of Pennsylvania, notes amongst his published works that around 85% of Old English is no longer in use; however, surviving elements form the basis of the Modern English language today.

Old English can be further subdivided into the following:

  • Prehistoric or Primitive (5th to 7th Century) – almost no literature or documentation survives from this period aside from limited examples of Anglo-Saxon runes;
  • Early Old English (7th to 10th Century) – this period contains some of the earliest documented evidence of the English language, showcasing notable authors and poets like Cynewulf and Aldhelm, who were leading figures in the world of Anglo-Saxon literature;
  • Late Old English (10th to 11th Century) – can be considered the final phase of the Old English language, brought to a close by the Norman invasion of England. The period ended with the consequential evolution of the English language towards Early Middle English.

Early Middle English

It was during this period that the English language, and more specifically, English grammar, started evolving with particular attention to syntax. Syntax is “the arrangement of words and phrases to create well-formed sentences in a language,” and we find that while the British government and its wealthy citizens Anglicised the language, Norman French remained the dominant language until the 14th century.

An interesting fact to note is that this period is credited with the loss of case endings, which ultimately resulted in inflection markers being replaced by more complex features of the language. Case endings are “a suffix on an inflected noun, pronoun, or adjective that indicates its grammatical function.”

Late Middle English

It was during the 14th century that a different dialect (known as the East-Midlands) began to develop around the London area.

Geoffrey Chaucer, a writer we have come to identify as the Father of English Literature and author of the widely renowned Canterbury Tales, is often heralded as the greatest poet of that particular time. It was through his various works that the English language was more or less “approved” alongside French and Latin, though he continued to write some of his characters in the northern dialects.

It was during the mid-1400s that the Chancery English standard was brought about. The story goes that the clerks working for the Chancery in London were fluent in both French and Latin. It was their job to prepare official court documents, and prior to the 1430s both of those languages were mainly used by royalty, the church, and wealthy Britons. After this date, the clerks started using a dialect that sounded as follows:

  • gaf (gave) not yaf (Chaucer’s East Midland dialect)
  • such not swich
  • theyre (their) not hir [6]

As you can see, the above is starting to sound more like the present-day English we know. If one thinks about it, these clerks held enormous influence over the manner of official communication, which ultimately shaped the foundations of Early Modern English.

Early Modern English

The changes in the English language during this period occurred from the 15th to mid-17th Century, and signified not only a change in pronunciation, vocabulary or grammar itself but also the start of the English Renaissance.

The English Renaissance has much quieter foundations than its pan-European cousin, the Italian Renaissance, and sprouted during the end of the 15th century. It was associated with the rebirth of societal and cultural movements, and while slow to gather steam during the initial phases, it celebrated the heights of glory during the Elizabethan Age.

It was William Caxton’s introduction of an early printing press that allowed Early Modern English to become mainstream, something we as English learners should be grateful for! The printing press was key in standardizing the English language through distribution of the English Bible.

Caxton’s publishing of Thomas Malory’s Le Morte d’Arthur (The Death of Arthur) is regarded as print material’s first bestseller. Malory’s retelling, in his own words, of various tales surrounding the legendary King Arthur and the Knights of the Round Table, and the ensuing popularity, indirectly ensured that Early Modern English was here to stay.

It was during Henry VIII’s reign that English commoners were finally able to read the Bible in a language they understood, which, to its own degree, helped spread the dialect of the common folk.

The end of the 16th century brought about the first complete translation of the Catholic Bible, and though it didn’t make a marked impact, it played an important role in the continued development of the English language, especially among the English-speaking Catholic population worldwide.

The end of the 16th and start of the 17th century would see the writings of the actor and playwright William Shakespeare take the world by storm.

Why was Shakespeare’s influence important during those times? Shakespeare started writing during a time when the English language was undergoing serious changes due to contact with other nations through war, colonisation, and the like. These changes were further cemented through Shakespeare and other emerging playwrights, who found their ideas could not be expressed through the English language then in circulation. Thus, words and phrases “adopted” from other languages were modified and added to the English language, creating a richer experience for all concerned.

It was during the early 17th century that we saw the establishment of the first successful English colony in what was called the New World: Jamestown, Virginia. The colony also saw the dawn of American English, with English colonizers adopting indigenous words and adding them to the English language.

The constant influx of new blood due to voluntary and involuntary migration (i.e., the slave trade) during the 17th, 18th, and 19th centuries meant a variety of English dialects had sprung to life, including West African, Native American, Spanish, and European influences.

Meanwhile, back home, the English Civil War, starting mid-17th century, brought with it political mayhem and social instability. At the same time, England’s puritanical streak had taken off after the execution of Charles I. Censorship was a given, and after the Parliamentarian victory during the War, Puritans promoted an austere lifestyle in reaction to what they viewed as excesses by the previous regime. England would undergo little more than a decade under Puritan leadership before the crowning of Charles II. His rule, effectively the return of the Stuart Monarchy, would bring about the Restoration period which saw the rise of poetry, philosophical writing, and much more.

It was during this age that literary classics like John Milton’s Paradise Lost were published, works still considered relevant today!

Late Modern English

The Industrial Revolution and the rise of the British Empire during the 18th, 19th, and early 20th centuries saw the expansion of the English language.

The advances and discoveries in science and technology during the Industrial Revolution saw a need for new words, phrases, and concepts to describe these ideas and inventions. Due to the nature of these works, scientists and scholars created words using Greek and Latin roots, e.g., bacteria, histology, nuclear, and biology. You may be shocked to read that these words were deliberately created, but one can learn a multitude of new facts through English language courses, as you are doing now!

Colonialism brought with it a double-edged sword. It can be said that the nations under the British Empire’s rule saw the introduction of the English language as a way for them to learn, engage with, and, hopefully, benefit from “overseas” influence. While scientific and technological discoveries were some of the benefits that could be shared, colonial Britain saw this as a way not only to teach its language but to impart its culture and traditions upon societies it deemed backward, especially those in Africa and Asia.

The idea may have backfired, as the English language walked away with a large number of foreign words that have now become part and parcel of the language: shampoo, candy, cot, and many others originated in India!

English in the 21st Century

If one endeavours to study the various English language courses taught today, one finds almost no immediate similarities between Modern English and Old English. English grammar has become exceedingly refined (even though smartphone messaging has made a mockery of the English language itself), and perfect living examples would be the current British Royal Family. This has given many the idea that speaking proper English is a touch snooty and high-handed. Before you scoff, think about what you have just read: the basic history and development of a language that literally spawned from the embers of wars fought between ferocious civilisations. Imagine everything that our ancestors went through, their trials and tribulations, their willingness to give up everything in order to achieve freedom of speech and expression.

Link to the rest at Oxford International

What is Syntax and Why is it Important to Understand Language?

From Akorbi:

What is syntax?

Syntax is a term used by linguists to describe a set of principles and rules that govern sentence structure and word order in a particular language. In English, the general rule of syntax follows the subject-verb-object rule. The subject refers to the person or thing (a noun) performing the action, the verb describes the action being taken, and the object (another noun) refers to what is being acted upon (if anything).

More than 85% of the world’s languages put the subject first in a sentence, making it understood who or what performs the action. Many of the rest of the languages put the verb first, followed by the subject and the object.

Syntax may also include descriptive words such as adjectives and adverbs that add descriptions to nouns and verbs. Prepositions, like “to” and “above,” communicate the direction or placement of an object or subject.

Examples of Syntax in Various Languages

Every sentence in English breaks down to “subject-verb-object” no matter how long you make it.

“John cautiously drives the red car in the snow” shortens to “John drives the car.”

Spanish follows the same basic structure, except the noun and adjective are inverted. “Juan conduce con cautela el coche rojo en la nieve.” The phrase “coche rojo” means “red car” in English, but in Spanish it reads as “car red.”

However, in both sentences it’s understood that the car is red as opposed to anything else because the word “red” is adjacent to the word “car.” That’s because of syntax rules that govern Spanish and English.

. . . .

Modifiers

Modifiers, such as adjectives and adverbs, should be close to the noun or verb that they modify in the sentence. The relationship between a modifier and its referent can be clarified with commas or punctuation to ensure the correct meaning is communicated.

“The deft driver swerved his car to the right to avoid an accident just in time.”

You could change this sentence to, “The car’s driver swerved deftly to the right just in time to avoid an accident.”

Both sentences are grammatically and syntactically correct. However, the second sentence is more clear. He swerved just in time to avoid an accident.

Turning Syntax on Its Head

Have you ever heard Yoda from Star Wars speak? He turns syntax on its head by putting the subject towards the end of the sentence instead of the beginning.

“When 900 years old you reach, look as good you will not.”

Ordinarily, you say to someone, “You won’t look this good when you reach 900 years old.”
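
To make the word-order point concrete, here is a minimal sketch (mine, not Akorbi’s) that treats a sentence as a (subject, verb, object) triple and prints it in standard English order and in a simplified, object-fronted “Yoda” order. The example sentence and the object-subject-verb rule are illustrative assumptions, not a formal grammar.

# Toy sketch: word order as a permutation of (subject, verb, object).
# The object-fronted "Yoda" order below is a simplification for
# illustration, not a formal grammar of the character's speech.

def svo(subject: str, verb: str, obj: str) -> str:
    """Standard English order: subject-verb-object."""
    return f"{subject} {verb} {obj}."

def yoda(subject: str, verb: str, obj: str) -> str:
    """Object-fronted order: object first, then subject and verb."""
    fronted = obj[0].upper() + obj[1:]
    return f"{fronted}, {subject} {verb}."

print(svo("John", "drives", "the car"))   # John drives the car.
print(yoda("John", "drives", "the car"))  # The car, John drives.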

Link to the rest at Akorbi

Discussion: Language

From MIT Open Courseware – Introduction to Psychology:

Session Overview

How do I move meaning from my mind to yours? 

Discussion

Language is just incredible – think about how easy it is for us, as babies, to learn our native language effortlessly, and yet how hard it is, once we’ve already learned a language, to learn another.

I think about this every time I see a Chinese baby speaking perfect Mandarin. I have a master’s degree in linguistics and I’ve been trying to learn Mandarin for 10 years, but I’m just awful at it. And language is just spectacularly complicated in terms of our capacity for explaining things. There are sentences that you utter, or that friends utter, that have never been uttered before in the history of the human language, and that will probably never be uttered again, and yet they’re perfectly understandable.

We can talk about language in terms of signals for communication, that is, the perception and production of speech and sign, or what you have to do to move the message from one place to another. We can also talk about language as structures for information, or what the rules are that govern how the basic building blocks of language go together to convey meaning – including rules for sounds, words, sentences, and discourse.

In psychology, some of the big questions about language have to do with language acquisition (both as babies and as adults); the brain bases of speech and language; and communication and language disorders, such as aphasia and dyslexia. Language is a huge topic, but we’ll hit some of the highlights here.

Demonstration

Phonology is the structure of the sounds that make up the words of a language. Phonemes are the building blocks of speech sounds. Phonemes aren’t large enough units of language to convey meaning all by themselves, but they do distinguish one word from another. For example, bit and hit differ by one phoneme. English has about 45 phonemes altogether.
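
As a concrete illustration of minimal pairs, here is a short sketch (mine, not part of the course). Written letters only approximate phonemes, so the letter-by-letter comparison below is a rough stand-in for a real phonemic one.

# Toy sketch: finding "minimal pairs" in a word list by spelling.
# Letters only approximate phonemes -- "bit"/"hit" happens to line up
# with spelling, but English spelling often doesn't.

def is_minimal_pair(a: str, b: str) -> bool:
    """True if two equal-length words differ in exactly one position."""
    if len(a) != len(b):
        return False
    return sum(x != y for x, y in zip(a, b)) == 1

words = ["bit", "hit", "bat", "hat"]
pairs = [(a, b) for i, a in enumerate(words)
         for b in words[i + 1:] if is_minimal_pair(a, b)]
print(pairs)  # [('bit', 'hit'), ('bit', 'bat'), ('hit', 'hat'), ('bat', 'hat')]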

But think about two things. One, there’s a lot of sounds that we can make (whistles, coughs, snorts, etc.) that aren’t linguistic. Two, there’s incredible variety in how the people around us pronounce the “same” sound. Think about people speaking with different accents, or how you sound when you have a cold. How does the brain handle this?

Listen to Tyler describe and demonstrate the phenomenon of categorical perception: (Includes recorded demonstrations of “bad/bat” and “slash/splash” courtesy of UCLA Phonetics Lab, used with permission. The original recordings, and many others like them, are available at Peter Ladefoged’s website Vowels and Consonants.)

Demonstration

However, we don’t rely on our ears alone to determine what we’re hearing. The McGurk effect is a famous example of how visual cues impact our perception of speech sounds. For this demonstration, you will play a single, five-second video clip three times, with different instructions each time.

First, play the video with your eyes closed. Make note of what the man is saying.

(PG Comment: This is a very short video. On PG’s computer, at the close of this video, YouTube starts another that doesn’t seem to be related.)

Second, play the video again with your eyes open. Now what is he saying?

Third, mute the sound on the video and just watch his mouth move. What does it look like he’s saying?

What do you think is happening in this situation? Why?

. . . .

Demonstration

Consider the following joke:

A woman is taking a shower when her doorbell rings. She yells, “Who’s there?” and a man answers, “Blind man.” Being a charitable person, she runs out of the shower naked and opens the door. The man says, “Where should I put these blinds, lady?”

The use of the word ‘blind’ in this joke relies upon the particularities of English semantics and syntax. Semantics refers to the meaning of a word or a sentence; syntax refers to the rules for combining words into sentences. The word ‘blind’ has several meanings (it can be an adjective or a noun), and the one that comes to mind first for most listeners is ‘visually impaired,’ so “blind man” is at first understood as adjective + noun. However, the rules of English syntax allow us to interpret noun + noun phrases such as ‘ice cream man’ not as ‘a man made of ice cream’ but ‘a man who sells ice cream.’ It’s not until the punchline is delivered that you realize that it was a different meaning of ‘blind’ all along.
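
To spell out the two readings side by side, here is a tiny sketch (mine, not MIT’s) in which “blind” carries two parts of speech, each yielding a different parse of “blind man.” The lexicon and the glosses are illustrative assumptions.

# Toy sketch: the two parses behind the joke. "blind" can be an
# adjective (ADJ + N, like "tall man") or a noun (N + N, like
# "ice cream man"), and each parse carries a different meaning.

LEXICON = {"blind": ["ADJ", "N"], "tall": ["ADJ"], "ice cream": ["N"]}

def parses(modifier: str, head: str = "man"):
    """Enumerate the structural readings of a two-word noun phrase."""
    for pos in LEXICON.get(modifier, []):
        if pos == "ADJ":
            yield f"ADJ+N: a {head} who is {modifier}"
        else:  # N + N compounds read like "ice cream man"
            yield f"N+N: a {head} who deals in {modifier}s"

for reading in parses("blind"):
    print(reading)
# ADJ+N: a man who is blind
# N+N: a man who deals in blinds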

Link to the rest at MIT Open Courseware

20 Rare Languages Still Spoken Today

From Universal Translation Services:

Languages:

Culture is the most personal thing society came up with. It defines the people of a community and regulates their everyday life. There are many aspects of culture, but language is the most important. Figuring out humans’ first language is impossible, but we know some ancient tongues. We can also figure out which is the least spoken language. But languages die, too, even if they were quite famous at some point. Latin is a good example of this because it was a mighty tongue once, but today, it does not have a single native speaker. Experts are sure half of the seven thousand languages spoken today will become extinct within a hundred years.

Here are the top 20 most spoken languages in the world:

  1. English
  2. Mandarin
  3. Hindi
  4. Spanish
  5. French
  6. Standard Arabic
  7. Bengali
  8. Russian
  9. Portuguese
  10. Indonesian
  11. Urdu
  12. Standard German
  13. Japanese
  14. Swahili
  15. Marathi
  16. Telugu
  17. Western Punjabi
  18. Wu Chinese
  19. Tamil
  20. Turkish


What Are All the Languages in the World?

Language defines our cultural identity. More than seven thousand languages are spoken in the world. Some have fewer than a thousand speakers and risk becoming extinct, while others have millions. English and Mandarin each have over a billion speakers, and Spanish is also among the most widely spoken languages.

. . . .

The Rarest Language in the World

Kaixana is a virtually unknown language because it has only one speaker left today. Kaixana was never widespread: it had about 200 speakers in the past, but that number has dwindled to one. Learning it is a complicated task, since little is known about its vocabulary.

. . . .

What is the Oldest Language Still Spoken Today?

The oldest language still spoken today is Tamil. It has been around for at least 5000 years and is spoken in India by more than 60 million people. Other old languages still spoken in the world today with little change are Greek, Hebrew, Egyptian, and Farsi. Sanskrit is another language that has been around for over 3000 years, but nowadays it is spoken only by Hindu priests.

Link to the rest at Universal Translation Services

Language City

From The Wall Street Journal:

Words are coined and money talks, but the link between linguistics and economics is more than metaphor. Most notably, as Ross Perlin writes in his superb “Language City: The Fight to Preserve Endangered Mother Tongues in New York,” the English-speaking peoples’ geopolitical dominance has made their language the “reserve currency of communication.” But economic power can pull as well as push, luring immigrants and refugees to the metropole with the prospect of better lives, or at least better pay. So it is, Mr. Perlin writes, that New York City is now home to some 700 languages—there are roughly 7,000 extant—making it the “most linguistically diverse city in the history of the world.”

Linguistic variety is “often seen as a problem, the curse of Babel,” but for a linguist, New York City is a riotous collection of living specimens—a “greenhouse, not a graveyard.” Many of its languages have only thousands or hundreds of speakers worldwide. Mr. Perlin, who has a doctorate in linguistics, helps run the Endangered Language Alliance, which works to document such minority tongues. “Language City” centers on six: Yiddish; Seke, from Nepal; Wakhi, from the borderlands where Afghanistan, Tajikistan, Pakistan and China meet; N’ko, a script invented in 1949 to standardize a family of West African languages; Nahuatl, from Mexico and Central America; and Lenape, the language of New York’s first inhabitants.

The heart of “Language City” is portraits of individual New York-based speakers. Mr. Perlin writes about their work as well as his, capturing the grind of immigrant life with empathy, balance and wit. (Linguistic fieldwork, which involves coaxing people to talk while you note every nuance of their speech, is excellent training for journalism.) “If the country was rich we would never leave,” says Husniya, a Wakhi speaker from bleak post-Soviet Tajikistan. But she savors the city’s entrepreneurial energy: “New York opened my eyes. It shapes you to be a human being, not dividing based on religion, face, or race, or anything.”

. . . .

Mr. Perlin can set a scene with quick, sure strokes. “Boris, sipping from a slim-waisted glass of Turkish tea on Emmons Avenue, insists that he is not the last major Yiddish writer,” he writes of an aging Brooklynite born in Bessarabia. “This village has elevators. We step into one of them,” begins a chapter on Rasmina, a young speaker of Seke. Her language originates in a cluster of five villages in upland Nepal and is spoken by some 700 people in the world. At one point or another 100 of them have lived in a “vertical village” in Brooklyn—a six-story apartment building where a now-forgotten Nepali established a foothold decades ago.

Rasmina visits a class Mr. Perlin teaches at Columbia so that his students can practice field techniques. Like most of the world’s languages, Seke is barely recorded on paper; there’s no dictionary or grammar guide. In fact, Mr. Perlin notes later, as he travels with Rasmina back to Nepal, most languages are documented only “once, if at all.” In class, his students ask Rasmina the words in Seke for a range of objects and concepts, such as body parts and family relationships, drawn from a set of putatively universal vocabulary items called a Swadesh list. Even these are tricky for English speakers: Seke, like Russian and other languages, treats limbs as integral units, with a word that means “leg-foot” and one that means “arm-hand.”

Wonderfully rich, “Language City” is in part an introduction to the diverse ways different languages work. Seke and other “evidential” languages, for example, have different grammatical forms to indicate how the speaker knows what she’s asserting—whether from observation or inference, hearsay or hunch. Other languages syntactically “tag the speaker’s surprise at unexpected information” or have a special temporal marking “just for things happening today.”

It is also a brief survey of U.S. immigration, full of piquant detail about its tortuous history. Ellis Island, contrary to legend, was known for linguistic sensitivity, with translators capable of handling 20 languages—though premium-class passengers could skip immigration checks altogether. It’s the kind of book where even the notes are pinpoint portraits. Did you know that when Andy Warhol met Pope John Paul II (in 1980), they spoke Ruthenian? “Though Warhol famously said ‘I come from nowhere,’ ” Mr. Perlin deadpans, “his family in fact came from Mikova in what is now Slovakia.”

Why do we need so many languages, though? There’s something that seems inefficient about slicing one world into 7,000 pieces; there’s a reason that developing a universal language long seemed like the easiest route to utopia. It took surprisingly long to realize that sharing a language is no cure for conflict. The concerted push to preserve endangered tongues is only a few decades old, and the benefits it claims are more subtle than those of the biodiversity movement that helped inspire it. Exotic species have yielded many valuable medicines, but we’re not going to discover a treatment for cancer in an unfamiliar grammar. A central dogma of modern linguistics is that all languages are functionally equal: “Any society can run in any language,” as Mr. Perlin puts it.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Russian spies are back—and more dangerous than ever

From The Economist:

It is unusual for spymasters to taunt their rivals openly. But last month Bill Burns, the director of the CIA, could not resist observing that the war in Ukraine had been a boon for his agency. “The undercurrent of disaffection [among Russians] is creating a once-in-a-generation recruiting opportunity for the CIA,” he wrote in Foreign Affairs. “We’re not letting it go to waste.” The remark might well have touched a nerve in Russia’s “Special Services”, as the country describes its intelligence agencies. Russian spies botched preparations for the war and were then expelled from Europe en masse. But evidence gathered by the Royal United Services Institute (RUSI), a think-tank in London, and published exclusively by The Economist today, shows that they are learning from their errors, adjusting their tradecraft and embarking on a new phase of political warfare against the West.

The past few years were torrid for Russian spies. In 2020 operatives from the FSB, Russia’s security service, botched the poisoning of Alexei Navalny, the recently deceased opposition activist. He mocked them for spreading Novichok on his underwear. Then the FSB gave the Kremlin a rosy view of how the war would go, exaggerating Ukraine’s internal weaknesses. It failed to prevent Western agencies from stealing and publicising Russia’s plans to invade Ukraine. And it was unwilling or unable to halt a brief mutiny by Yevgeny Prigozhin, the leader of the Wagner mercenary group, last year. The SVR, Russia’s foreign intelligence agency, saw its presence in Europe eviscerated, with some 600 officers expelled from embassies across the continent. At least eight “illegals”—intelligence officers operating without diplomatic cover, often posing as non-Russians—were exposed.

The study by RUSI, written by Jack Watling and Nick Reynolds, a pair of the organisation’s analysts, and Oleksandr Danylyuk, a former adviser to both Ukraine’s defence minister and foreign intelligence chief, draws on documents “obtained from the Russian Special Services” and on interviews with “relevant official bodies”—presumably intelligence agencies—in Ukraine and Europe. In late 2022, the study says, Russia realised that it needed more honest reporting from its agencies. It put Sergei Kiriyenko, the Kremlin’s deputy chief of staff, in charge of “committees of special influence”. These co-ordinate operations against the West and then measure their effect.

That personnel change appears to have produced more coherent propaganda campaigns. In Moldova, for instance, a once-scattershot disinformation effort against the country’s bid for European Union membership grew more consistent and focused last year. It tied the accession bid to the president personally, all the while blaming her for Moldova’s economic woes. Campaigns aimed at undermining European support for Ukraine have also picked up. In January German experts published details of bots spreading hundreds of thousands of German-language posts a day from a network of 50,000 accounts over a single month on X (Twitter as was). On February 12th France exposed a large network of Russian sites spreading disinformation in France, Germany and Poland.

Meanwhile the GRU, Russia’s military intelligence agency, has also been re-evaluating its tradecraft. In recent years its Unit 29155—which had attempted to assassinate Sergei Skripal, a former GRU officer, in Salisbury, Britain in 2018—saw many of its personnel, activities and facilities exposed by Bellingcat. The investigative group draws on publicly available information and leaked Russian databases for its exposés.

The GRU concluded that its personnel were leaving too many digital breadcrumbs, in particular by carrying their mobile phones to and from sensitive sites associated with Russian intelligence. It also realised that the expulsion of Russian intelligence officers in Europe had made it harder to mount operations and control agents abroad—one reason why the invasion of Ukraine went awry.

The result was wholesale reform, which began in 2020 but sped up after the war began. General Andrei Averyanov, the head of Unit 29155, was, despite his litany of cock-ups, promoted to deputy head of the GRU and established a new “Service for Special Activities”. Unit 29155’s personnel—once exemplified by Alexander Mishkin and Anatoly Chepiga, Mr Skripal’s hapless poisoners, who insisted that they had travelled to Salisbury to see its cathedral’s famous spire—no longer carry their personal or work phones to its facility, using landlines instead. Training is done in a variety of safe houses rather than onsite. Whereas half of personnel once came from the Spetsnaz, Russia’s special forces, most new recruits no longer have military experience, making it harder for Western security services to identify them through old photographs or leaked databases.

A separate branch of the Service for Special Activities, Unit 54654, is designed to build a network of illegals operating under what Russia calls “full legalisation”—the ability to pass muster even under close scrutiny from a foreign spy agency. It recruits contractors through front companies, keeping their names and details out of government records, and embeds its officers in ministries unrelated to defence or in private companies. The GRU has also targeted foreign students studying at Russian universities, paying stipends to students from the Balkans, Africa and elsewhere in the developing world.

For another example of how Russian spies have turned disaster into opportunity, consider the case of the Wagner Group, a series of front companies overseen by Mr Prigozhin. Wagner initially served as a deniable arm of Russian influence, providing muscle and firepower to local autocrats in Syria, Libya and elsewhere in Africa. In June 2023 Mr Prigozhin, angered by the mismanagement of the war by Russia’s defence minister and army chief, marched on Moscow. The mutiny was halted; two months later Mr Prigozhin was killed when his plane exploded midair.

Russia’s special services quickly divided Mr Prigozhin’s sprawling military-criminal enterprise among themselves. The FSB would keep domestic businesses, and the SVR the media arms, such as the troll farms which interfered in America’s presidential election in 2016. The GRU got the foreign military bits, split into a Volunteer Corps for Ukraine and an Expeditionary Corps, managed by General Averyanov, for the rest of the world. The latter missed its target of recruiting 20,000 troops by the end of last year, says RUSI, though its strength is “steadily rising”. There have been hiccups: Mr Prigozhin’s son, who mystifyingly remains alive and at liberty, offered Wagner troops to the Rosgvardia, Russia’s national guard, prompting a bidding war between the guard and the GRU, according to the authors.

. . . .

Mission Possible

Russian intelligence, though bruised, is firmly back on its feet after its recent humiliations. In recent weeks the Insider, a Riga-based investigative website, has published a series of stories documenting Russian espionage and influence across Europe. They include details of how a GRU officer in Brussels continues to provide European equipment to Russian arms-makers, and the revelation that a top aide in the Bundestag and a Latvian member of the European Parliament were both Russian agents, the latter for perhaps more than 20 years.

“It’s not as bad for them as we think it is,” says Andrei Soldatov, an investigative journalist, who reckons that the Russian services are “back with a vengeance” and increasingly inventive. Vladimir Putin, Russia’s president, and once a (mediocre) KGB officer, is “trying to restore the glory of Stalin’s formidable secret service”, explains Mr Soldatov. He points to a case in April 2023 when Artem Uss, a Russian businessman arrested in Milan on suspicion of smuggling American military technology to Russia, was spirited back to Russia with the help of a Serbian criminal gang—a common intermediary for the Russian services.

In the past, says Mr Soldatov, the FSB, SVR and GRU had a clearer division of labour. No longer. All three agencies have been particularly active in recruiting among the flood of exiles who left Russia after the war. It is easy to hide agents in a large group and simple to threaten those with family still in Russia. Germany is of particular concern, given that the many Russians who have moved there could make up a recruiting pool for Russian spy bodies. The flood of new arrivals is thanks in part to Baltic countries having grown more hostile to Russian emigres.

Moreover, Russian cyber-activity goes from strength to strength. In December America and Britain issued public warnings over “Star Blizzard”, an elite FSB hacking group which has been targeting Nato countries for years. The following month Microsoft said that “Cosy Bear”, a group linked to the SVR, had penetrated email accounts belonging to some of the company’s most senior executives. That came on top of a sophisticated GRU cyber-attack against Ukraine’s power grid, causing a power outage apparently co-ordinated with Russian missile strikes in the same city.

The renewal of Russia’s intelligence apparatus comes at a crucial moment in east-west competition. An annual report by Norway’s intelligence service, published on February 12th, warned that, in Ukraine, Russia was “seizing the initiative and gaining the upper hand militarily”. Estonia’s equivalent report, released a day later, said that the Kremlin was “anticipating a possible conflict with NATO within the next decade”.

Link to the rest at The Economist

PG understands that Russia is trying to harm the U.S. and any other nations that are allied with the U.S., and such activities need to be taken seriously. However, the OP reminded him of Spy vs. Spy during the Cold War.

Strong Passions

From The Wall Street Journal:

Peter Strong’s marriage to his young wife, Mary, has not been particularly happy over the previous few months. Now they are dealing with the death of their 14-month-old daughter. Peter, a bon vivant living off inherited wealth, is anxious to rekindle the romance with Mary. Instead, his wife’s response shocks him. She pulls away, sobbing, “Oh, forgive me, forgive me.” Then she confesses that for the previous two years she has been having an affair with his widowed younger brother, Edward.

In “Strong Passions: A Scandalous Divorce in Old New York,” Barbara Weisberg describes a case from the 1860s that has all the elements of a soap opera—powerful families, a tearful confession, adultery, abortion and the fate of two innocent little girls.

Peter and Mary would never again share a bed, but at first there was no talk of a divorce. “Whatever a couple’s miseries,” Ms. Weisberg writes, in mid-19th-century America “legally ending a marriage was viewed as an abhorrent act, especially among members of the Strongs’ social class.” In New York, divorce was especially difficult because the courts accepted only one cause—adultery—and it would have humiliated both Peter, who had been cuckolded by his own brother, and Mary, who would be branded a “fallen woman.” A wife divorced for adultery would have had to surrender custody of her children. By all accounts, Peter and Mary were equally devoted to their daughters, 7-year-old Mamie and 2½-year-old Allie.

Perhaps the Strongs might have managed a quiet separation, during which they would jointly raise their daughters. But then Mary admitted that she was pregnant, and that the paternity was uncertain. Peter was enraged. At this point, either Mary had a miscarriage or Peter secured an abortion. Relations between the couple deteriorated rapidly, and two years after Mary’s confession, Peter filed for divorce. Members of Manhattan’s upper crust were appalled. Mary retaliated by bolting, taking Allie with her and disappearing completely. Meanwhile, Peter was indicted for manslaughter over the death of his wife’s unborn child.

Ms. Weisberg, a former television producer and the author of “Talking to the Dead” (2004), reconstructs the events that led up to the Strong divorce almost entirely from a few court documents plus the newspaper articles that covered the subsequent court cases. There are no personal papers remaining from any of those involved, except for a few laconic journal entries by Peter’s cousin, the diarist George Templeton Strong. This means the author had no access to the unfiltered voices of the main actors in this family saga, and she has struggled to bring them to life. Nonfiction accounts of long-dead individuals always require a degree of speculation, but “Strong Passions” has more than its share of words like “probably,” “perhaps,” “likely” and “undoubtedly.”

Instead, Ms. Weisberg devotes two-thirds of her book to the overlapping narratives heard in court from an extraordinary number and range of witnesses—“a governess, a detective, a judge’s daughter, an undertaker, an abortionist’s spouse, a laundress and Teddy Roosevelt’s uncle.” She writes that “the series of dramatic incidents that precipitated the divorce suit were clouded by a divergence in bitterly contested versions of what had occurred.” Was Mary a victim or instigator of her affair? Did she admit guilt or deny the accusations against her? Was Peter a brutal villain or a gentle and put-upon husband? Did he have an affair with the abortionist? How many of the witnesses were bribed?

Despite the dramas described in court, Peter was acquitted of the criminal charges for lack of evidence, and the jury in the civil-law divorce case deliberated for 45 hours but remained deadlocked. There was therefore no divorce. George Templeton Strong declared that “the public is sick of this horrible case.” He went on to observe that some commentators found neither Peter nor Mary “particularly admirable,” and as such the two seemed “so well matched” it would be “a pity to divorce them.”

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

The Literary Memoir

Non-fiction, and in particular the literary memoir, the stylised recollection of personal experience, is often as much about character and story and emotion as fiction is.

Chimamanda Ngozi Adichie

The Pursuit of Happiness

From The Wall Street Journal:

Jeffrey Rosen spent the Covid lockdowns of 2020 reading various treatises by Xenophon, Seneca, Cicero, John Locke and David Hume. He was led to these and other works by a list compiled by Thomas Jefferson in 1771 in which the future president advised a friend on the books with which a good private library ought to be stocked.

For a year, Mr. Rosen writes in “The Pursuit of Happiness,” “I got up every morning before sunrise, read a selection from his list, and found myself taking notes on the reading in sonnet form, so that I could easily remember the daily lesson.” Later, he says, he discovered that important figures in Founding-era America, including Benjamin Franklin and Alexander Hamilton, similarly took notes on their reading in verse form.

All this reading and metrical note taking, Mr. Rosen tells us, changed his understanding of that famous phrase in the Declaration of Independence, “the pursuit of happiness.” Today, he says, “we think of happiness as the pursuit of pleasure. But classical and Enlightenment thinkers defined happiness as the pursuit of virtue—as being good, rather than feeling good.”

Jefferson might have been expected to follow Locke, whom he had definitely read, in naming life, liberty and “property” as the chief things to which man has rights. Instead he wrote “pursuit of happiness.” Until roughly the middle of the past century, historians tended to view the phrase as a bit of rhetorical fluff. More recent historians mostly acknowledge that “happiness” had a deeper and nobler meaning in 1776 than it would have two centuries later.

Garry Wills, in his brilliant study of the Declaration, “Inventing America” (1978), traces Jefferson’s words—“pursuit,” “happiness” and many others—to their sources in the writings of British philosophers of the 17th and 18th centuries, including Locke. More recently, University of Missouri law professor Carli Conklin has concluded that, for Jefferson and the Founders, the right to pursue happiness meant something like the freedom to align one’s life with the laws of nature.

Mr. Rosen, the president of the National Constitution Center in Philadelphia, refers hardly at all to the long and tangled debate over the meaning of Jefferson’s phrase and cites primary sources almost exclusively. His book’s overarching argument holds that, for the Founders—he concentrates on Jefferson, John Adams, Franklin and George Mason—the pursuit of happiness lay in the ancient creed of Stoicism.

The Stoics, recall—Epictetus, Seneca and Marcus Aurelius, among others—believed that the good life consisted in self-mastery and the cultivation of virtue. The Founders, in Mr. Rosen’s view, derived their understanding of “happiness,” and therefore of the purpose of political liberty, from the Stoics, in some cases directly, in others via their readings of Enlightenment philosophers such as Locke and Hume. “The Founders,” he writes, “believed that the pursuit of happiness regards freedom not as boundless liberty to do whatever feels good in the moment but as bounded liberty to make wise choices that will help us best develop our capacities and talents over the course of our lives.”

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Why Some Are More Equal Than Others

From Literary Review:

The Remigia cave, about eighty miles north of Valencia, features paintings dating from around 6500 BC. Some depict bands of archers hunting ibex; others appear to show executions. These are the ones tourists come for. But the most significant image is the least dramatic. Fourteen individuals gather closely together, watching a lone figure departing from the group. It appears to be an ostracism – a social death, not a physical one.

The hunter-gatherer tribes of that era were perhaps the most equal communities in human history. But this egalitarianism was strictly bounded. Individuals who were not part of the tribe or who broke its norms were cast out or killed. Inclusion required exclusion.

In a famous essay, the economist and philosopher Amartya Sen pointed out that we are all in favour of equality. We just disagree about whether we mean equality of money, or power, or respect, or legal standing, or whatever. The question is ‘equality of what?’ But there is an even deeper question than this: ‘equality of whom?’ Where is the line between those considered as equals and those who are not – between the fourteen and the one?

This is the question animating Equality, a landmark work of intellectual history by Dartmouth historian Darrin McMahon. ‘Time and again we have seen controversies play out over equality’s “substance” and the degree to which it could admit of difference,’ McMahon writes. ‘Did equality imply common religious or national belonging? Was it delimited by sex, title, or race? Or did it free up individuals to make claims on the collective regardless of the fortunes of their birth?’

It is easy to invoke equality without facing its limits. Contra John Lennon, it is actually very hard to imagine a world with no countries. ‘For all the high-minded talk of “global equality” in recent times’, McMahon writes, ‘its contours have most often been imagined from within the walls of nation-states, where equality extends only to those who share a passport and more often than not a place of birth.’

McMahon has set himself an almost impossible task: to analyse humanity’s most powerful and contested idea throughout history and across the globe. Most attempts at total histories of ideas fail. Depth is sacrificed to achieve breadth, the reader is marched along too strict a chronological path or the author gets stuck in an etymological quagmire. But McMahon succeeds. This book is deeply researched, tightly argued and sparklingly written. It ought to be read by anyone interested in equality, and also anyone interested in people, history, God, politics, religion, nationalism, war or love.

The book is structured around what McMahon calls ‘figures’ of equality, a term he uses in the rhetorical sense of a ‘figure of speech’. These figures are explored in roughly chronological order, from ‘Reversal’, the overturning by hunter-gatherers of the dominance of our ape ancestors, all the way to ‘Dream’, the invoking by 20th-century reformers such as Martin Luther King of a new concept of equality founded on universal brotherhood. The only downside of this approach is that it involves a degree of repetition.

There’s no romanticisation in these pages. Not only did hunter-gatherers kill or expel in order to maintain order, they also formed hierarchies. Or rather, hierarchies formed them. McMahon insists that hierarchies are everywhere in human history, just as they exist in every primate community. Human beings ‘cannot live without hierarchies’, he writes, since ‘status is part of the air we breathe’.

One of the big advantages of human hierarchies is their diversity: there’s more than one way to be top dog. McMahon writes that ‘unlike animals, we regularly inhabit multiple hierarchies at once, with the result that a low-status individual in one environment, say a janitor at a corporation, may be a high-status individual, the captain of the company softball team, in another’. This insight is not developed, but it is critical. One way to square equality with hierarchies is to scramble them, not only over generations but also over the course of an average day. In other words, you defang hierarchies not by denying them but by multiplying them.

Link to the rest at Literary Review

Bonus Book About Equality

The Gettysburg Address

From Abraham Lincoln Online:

On June 1, 1865, Senator Charles Sumner referred to the most famous speech ever given by President Abraham Lincoln. In his eulogy on the slain president, he called the Gettysburg Address a “monumental act.” He said Lincoln was mistaken that “the world will little note, nor long remember what we say here.” Rather, the Bostonian remarked, “The world noted at once what he said, and will never cease to remember it. The battle itself was less important than the speech.”

There are five known copies of the speech in Lincoln’s handwriting, each with a slightly different text, and named for the people who first received them: Nicolay, Hay, Everett, Bancroft and Bliss. Two copies apparently were written before delivering the speech, one of which probably was the reading copy. The remaining ones were produced months later for soldier benefit events. Despite widely circulated stories to the contrary, the president did not dash off a copy aboard a train to Gettysburg. Lincoln carefully prepared his major speeches in advance; his steady, even script in every manuscript is consistent with a firm writing surface, not the notoriously bumpy Civil War-era trains. Additional versions of the speech appeared in newspapers of the era, feeding modern-day confusion about the authoritative text.

Bliss Copy

Ever since Lincoln wrote it in 1864, this version has been the most often reproduced, notably on the walls of the Lincoln Memorial in Washington. It is named after Colonel Alexander Bliss, stepson of historian George Bancroft. Bancroft asked President Lincoln for a copy to use as a fundraiser for soldiers (see “Bancroft Copy” below). However, because Lincoln wrote on both sides of the paper, the speech could not be reprinted, so Lincoln made another copy at Bliss’s request. It is the last known copy written by Lincoln and the only one signed and dated by him. Today it is on display at the Lincoln Room of the White House.

Four score and seven years ago our fathers brought forth on this continent, a new nation, conceived in Liberty, and dedicated to the proposition that all men are created equal.

Now we are engaged in a great civil war, testing whether that nation, or any nation so conceived and so dedicated, can long endure. We are met on a great battle-field of that war. We have come to dedicate a portion of that field, as a final resting place for those who here gave their lives that that nation might live. It is altogether fitting and proper that we should do this.

But, in a larger sense, we can not dedicate — we can not consecrate — we can not hallow — this ground. The brave men, living and dead, who struggled here, have consecrated it, far above our poor power to add or detract. The world will little note, nor long remember what we say here, but it can never forget what they did here. It is for us the living, rather, to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced. It is rather for us to be here dedicated to the great task remaining before us — that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion — that we here highly resolve that these dead shall not have died in vain — that this nation, under God, shall have a new birth of freedom — and that government of the people, by the people, for the people, shall not perish from the earth.

Abraham Lincoln
November 19, 1863

Nicolay Copy

Named for John G. Nicolay, President Lincoln’s personal secretary, this is considered the “first draft” of the speech, begun in Washington on White House stationery. The second page is written on different paper stock, indicating it was finished in Gettysburg before the cemetery dedication began. Lincoln gave this draft to Nicolay, who went to Gettysburg with Lincoln and witnessed the speech. The Library of Congress owns this manuscript.

Four score and seven years ago our fathers brought forth, upon this continent, a new nation, conceived in liberty, and dedicated to the proposition that all men are created equal.

Now we are engaged in a great civil war, testing whether that nation, or any nation so conceived, and so dedicated, can long endure. We are met on a great battle field of that war. We come to dedicate a portion of it, as a final resting place for those who died here, that the nation might live. This we may, in all propriety do.

But, in a larger sense, we can not dedicate we can not consecrate we can not hallow, this ground The brave men, living and dead, who struggled here, have hallowed it, far above our poor power to add or detract. The world will little note, nor long remember what we say here; while it can never forget what they did here.

It is rather for us, the living, we here be dedicated to the great task remaining before us that, from these honored dead we take increased devotion to that cause for which they here, gave the last full measure of devotion that we here highly resolve these dead shall not have died in vain; that the nation, shall have a new birth of freedom, and that government of the people, by the people, for the people, shall not perish from the earth.

Hay Copy

Believed to be the second draft of the speech, President Lincoln gave this copy to John Hay, a White House assistant. Hay accompanied Lincoln to Gettysburg and briefly referred to the speech in his diary: “the President, in a fine, free way, with more grace than is his wont, said his half dozen words of consecration.” The Hay copy, which includes Lincoln’s handwritten changes, also is owned by the Library of Congress.

Four score and seven years ago our fathers brought forth, upon this continent, a new nation, conceived in Liberty, and dedicated to the proposition that all men are created equal.

Now we are engaged in a great civil war, testing whether that nation, or any nation so conceived, and so dedicated, can long endure. We are met here on a great battlefield of that war. We have come to dedicate a portion of it, as a final resting place for those who here gave their lives that that nation might live. It is altogether fitting and proper that we should do this.

But in a larger sense, we can not dedicate we can not consecrate we can not hallow this ground. The brave men, living and dead, who struggled here, have consecrated it far above our poor power to add or detract. The world will little note, nor long remember, what we say here, but can never forget what they did here.

It is for us, the living, rather to be dedicated here to the unfinished work which they have, thus far, so nobly carried on. It is rather for us to be here dedicated to the great task remaining before us that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion that we here highly resolve that these dead shall not have died in vain; that this nation shall have a new birth of freedom; and that this government of the people, by the people, for the people, shall not perish from the earth.

Link to the rest at Abraham Lincoln Online

There are two more versions of the speech in the OP.

Here are a few quotes from public speeches discussing slavery by Abraham Lincoln before he became President, also from Abraham Lincoln Online.

Slavery is founded in the selfishness of man’s nature — opposition to it is in his love of justice. These principles are an eternal antagonism; and when brought into collision so fiercely, as slavery extension brings them, shocks, and throes, and convulsions must ceaselessly follow. Repeal the Missouri Compromise — repeal all compromises — repeal the declaration of independence — repeal all past history, you still can not repeal human nature. It still will be the abundance of man’s heart, that slavery extension is wrong; and out of the abundance of his heart, his mouth will continue to speak.
–October 16, 1854 Speech at Peoria

I believe this Government cannot endure, permanently half slave and half free. I do not expect the Union to be dissolved — I do not expect the house to fall — but I do expect it will cease to be divided.
–June 16, 1858 House Divided Speech

He [Stephen Douglas] is blowing out the moral lights around us, when he contends that whoever wants slaves has a right to hold them; that he is penetrating, so far as lies in his power, the human soul, and eradicating the light of reason and the love of liberty, when he is in every possible way preparing the public mind, by his vast influence, for making the institution of slavery perpetual and national.
–October 7, 1858 Lincoln-Douglas Debate at Galesburg, Illinois

That is the real issue. That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent. It is the eternal struggle between these two principles — right and wrong — throughout the world. They are the two principles that have stood face to face from the beginning of time, and will ever continue to struggle. The one is the common right of humanity and the other the divine right of kings. It is the same principle in whatever shape it develops itself. It is the same spirit that says, “You work and toil and earn bread, and I’ll eat it.” No matter in what shape it comes, whether from the mouth of a king who seeks to bestride the people of his own nation and live by the fruit of their labor, or from one race of men as an apology for enslaving another race, it is the same tyrannical principle.
— October 15, 1858 Debate at Alton, Illinois

Sister-in-law’s letters provide insights into Charles Dickens’ life and legacy

From The Guardian:

A collection of letters from one of Charles Dickens’ most valued confidantes will go on display in London for the first time this week to mark 212 years since the literary giant’s birth.

The letters were written by Dickens’ sister-in-law turned housekeeper and executor Georgina Hogarth, who was instrumental in preserving the author’s legacy after his death in 1870.

Acquired by the Charles Dickens Museum, the correspondence offers an illuminating insight into a woman who “had a huge impact both within Dickens’ life and after it”, said Emma Harper, a curator at the museum.

The extensive correspondence – a small portion of which will go on public display – was written by Hogarth to journalist Charles Kent between 1867 and 1898. It discusses Hogarth’s “unbearable” grief after Dickens’ death as well as her great admiration for the man.

“We’re hoping the letters provide some insight into Dickens’ personal life and Hogarth’s role in it,” said Harper, who is in the process of deciphering the 120-strong collection at a rate of around six letters a day. “It’s also just lovely to have Georgina’s own words because women, at this point in time, were not recorded quite so well.”

Hogarth lived with Dickens for 28 years and bore witness to some of the most significant events in his private life – namely, his separation from her sister Catherine, whom he left for 18-year-old actor Ellen Ternan in 1858.

Alongside biographer John Forster, she was an executor of Dickens’ will, and was responsible for distributing his estate. She also co-edited three volumes of letters with the author’s daughter Mamie, whom she had been involved in raising, alongside the nine other Dickens children.

“For us the key thing is to find out more about Georgina’s life and her relationship with Dickens, exploring their friendship and the role of women within the household of this very famous writer,” added Harper. “The letters will add a great deal to research about her.”

Writing to Kent on Dickens’ birthday on 7 February 1871 – the first after his death – Georgina expressed that life without her companion was “very very hard to bear” and that she wished to die. She caveated: “I know he is happy and blessed and indeed I do not think I would recall him to this dark world if I could.”

Link to the rest at The Guardian

How the ghostwriter of Biden’s memoirs ended up in the center of a classified documents probe

From The Associated Press:

President Joe Biden worked so closely with the ghostwriter with whom he is accused of sharing classified secrets that he once declared that he’d trust the author with his life.

Mark Zwonitzer worked with Biden on two memoirs, 2007’s “Promises to Keep” and “Promise Me, Dad,” which was published 10 years later. According to a report released Thursday by special counsel Robert Hur, Biden was sloppy in his handling of classified documents found at his home and former office, and shared classified information contained in some of them with Zwonitzer while the two were working on Biden’s second book.

Hur’s report says no criminal charges are warranted against Biden. It says his office considered charging Zwonitzer with obstruction of justice because, once he learned of the documents investigation, the ghostwriter destroyed recordings of interviews he had conducted with Biden while they worked on the second memoir. But Hur also said Zwonitzer offered “plausible, innocent reasons” for having done so and subsequently cooperated with investigators, meaning the evidence against him was likely “insufficient to obtain a conviction.”

Hours after the report was released, Biden addressed reporters at the White House and spoke to what he had shared with Zwonitzer, saying, “I did not share classified information,” adding that he didn’t do so “with my ghostwriter. Guarantee you I did not.”

Zwonitzer did not immediately return messages seeking comment.

Years earlier, in an interview Biden conducted as part of the Audible audiobook version of “Promise Me, Dad,” Biden called Zwonitzer a “great, great guy” and said, “I trust him with my life.”

He added that Zwonitzer “helped me organize; that was his great asset to me.”

It may feel like less of one now.

Hur’s report says Biden saved notebooks from his time as vice president that contained classified information and used them to help Zwonitzer put together his memoir — sometimes reading from them verbatim for more than an hour at a time. Biden did that, the report says, despite being aware, from having once suggested that Zwonitzer could be hired as historian for the Office of the Vice President, that the author did not have a security clearance.

The report details that one of the boxes recovered by federal investigators was labeled “mark Z,” and that in one recorded conversation with Zwonitzer in 2017, Biden said that he’d “just found all the classified stuff downstairs” of a home he was then renting in Virginia.

Biden spoke to that incident Thursday night and, when pressed that Hur’s report suggested he had read classified documents to his ghostwriter, responded, “It did not say that.”

. . . .

Biden added, “it was not classified information in that document. That was not classified.”

Though the report concludes that the finished, published version of “Promise Me, Dad” did not contain classified information, it says that Zwonitzer deleted recordings he made during his previous conversations with Biden after he learned about the special counsel’s probe.

But it also says that Zwonitzer offered explanations for his deletions and made available transcripts of the recordings. Additionally, he gave investigators his notes and the computer and external hard drive from which the recordings were removed, which allowed authorities to recover most of what had been deleted.

Link to the rest at The Associated Press

No, TPV is not going to turn into a political blog. PG notes this story involves a ghostwriter who was working on a book with Pres. Biden.

Against the Current

From TLS:

On July 4, 1845, a man from Concord, Massachusetts, declared his own independence and went into the woods nearby. On the shore of a pond there, Henry David Thoreau built a small wooden cabin, which he would call home for two years, two months and two days. From this base he began a philosophical project of “deliberate” living, intending to “earn [a] living by the labor of my hands only”. Though an ostensibly radical undertaking, this experiment was not a break with his past, but the logical culmination of years of searching and groping. Since graduating from Harvard in 1837 Thoreau had tried out many ways of earning his keep, and fortunately proved competent in almost everything he set his mind to. Asked once to describe his professional situation, he responded: “I don’t know whether mine is a profession, or a trade, or what not … I am a schoolmaster, a private tutor, a surveyor, a gardener, a farmer, a painter (I mean a house-painter), a carpenter, a mason, a day-laborer, a pencil-maker, a glass-paper-maker, a writer, and sometimes a poetaster”.

From this position, with any number of routes before him, yet none decided on, Thoreau was particularly well placed to consider questions about the nature, purpose and fundamental meaning of work. Yet he was also a born contrarian, a natural dissenter, with a knack for swimming against the current (his friend and mentor Ralph Waldo Emerson spoke of him as “spiced throughout with rebellion”), and when finally he emerged from the woods he was set not on a trade or career, but on life as a communal gadfly – a professional pain in the neck. “I do not propose to write an ode to dejection”, he writes in Walden (1854), “but to brag as lustily as a chanticleer in the morning, standing on his roost, if only to wake my neighbors up.” His self-imposed seclusion had allowed him to see his outsiderness anew, to understand it from within, to become of a piece with it.

This was a time of unprecedented change in American history. In a generation the country had gone from a motley collection of states, lagging the European powers, to a key player on the world stage. It was a sharp and swift upheaval, resulting not only in a dramatic depredation of the natural environment, but also in a dangerous straining of the country’s social fabric and a remaking of the American collective psyche. Thoreau had already seen the effects in 1843, when he visited New York City, which was then in the vanguard of the great transformation. The rapid technological advancements, the piling up of wealth, the relentless drive to prosperity, the general acceleration of life – such markers of progress may, he worried, end up killing the humanity in us. “I walked through New York yesterday – and met no real and living person”, he wrote in his diary. The future may have seemed radiant to some, but Thoreau was not impressed: “I am ashamed of my eyes that behold it. It is a thousand times meaner than I could have imagined. It is something to hate – that’s the advantage it will be to me”. That meanness would in time follow him back to Massachusetts. In Walden he protests the arrival of the railway in Concord, a stone’s throw from his cabin: “What an infinite bustle! I am awakened almost every night by the panting of the locomotive. It interrupts my dreams. There is no sabbath”.

At a moment when everything in America seemed to be accelerating, Thoreau, always true to form, came up with a counterproposal: slow down, do as little as you need to. Nothing, ideally. “Why should we be in such desperate haste to succeed, and in such desperate enterprises? If a man does not keep pace with his companions, perhaps it is because he hears a different drummer. Let him step to the music which he hears, however measured or far away.” Thoreau wanted not only to bring back Sabbath to a world that seemed to have lost it, but also to re-signify it. “The order of things should be somewhat reversed”, he had said a few years before, in his Harvard commencement speech. The “seventh should be man’s day of toil … and the other six his Sabbath of the affections of the soul”.

. . . .

Three recent books give us a sense of how the chanticleer of Concord keeps us awake today. In Thoreau’s Axe: Distraction and discipline in American culture, Caleb Smith uses Thoreau as the starting point for a wider discussion of attention and wakefulness in nineteenth-century America. Our concerns with distraction and dwindling mental focus, Smith argues, are nothing new. They were prefigured, centuries ago, by important public conversations – an “attention revival”, Smith calls them – sparked in America by the arrival of new economic systems and technologies, which threatened to dismantle traditional forms of life. Smith discusses twenty-eight short texts on attention by religious authors, fiction writers, social reformers and spiritual seekers. He examines Herman Melville’s Moby-Dick (1851), for example. We see here an Ishmael who, because of his “opium-like listlessness”, proves to be the most incompetent of masthead watchmen. “Over the course of three years at sea, he fails to call out a single whale.” Ishmael suffers chronically from a form of distraction that places him among the “romantic, melancholy, and absent-minded young men, disgusted with the carking cares of earth”, a condition to which we can relate only too well. “Today, in our age of new media and chronic attention deficit”, Smith writes, such “passages from the nineteenth century have a strange resonance.” His book is “a salvage operation”. In that century’s “ways of valuing and practicing attention” he hopes to find resources for “living through the present”.

Smith’s book has the merit of showing a meaningful continuity not only between our time and Thoreau’s, but also between Thoreau and like-minded thinkers of his century. It places his work in the broader tradition of “spiritual exercises”, developed over centuries by philosophers and religious thinkers, designed to “detach people’s minds from the passions and drama of everyday social life so they can focus on higher, more enduring realities”. As Smith sees it, whether Thoreau was conscious of it or not, he was “reworking an older asceticism”. Just as Christian penitents strove to master their flesh and discipline their lives, so Thoreau acted on himself to become more wakeful – or “mindful”, as we would say today. In this reading his famous walks in the woods were no ordinary perambulations, but opportunities to practise what he called “the discipline of looking always at what is to be seen”.

Link to the rest at TLS

A Terribly Serious Adventure

From The New York Times:

When setting out to write “A Terribly Serious Adventure: Philosophy and War at Oxford, 1900-1960,” Nikhil Krishnan certainly had his work cut out for him. How to generate excitement for a “much-maligned” philosophical tradition that hinges on finicky distinctions in language? Whose main figures were mostly well-to-do white men, routinely caricatured — and not always unfairly — for being suspicious of foreign ideas and imperiously, insufferably smug?

Krishnan, a philosopher at Cambridge, confesses up front that he, too, felt frustrated and resentful when he first encountered “linguistic” or “analytic” philosophy as a student at Oxford. He had wanted to study philosophy because he associated it with mysterious qualities like “depth” and “vision.” He consequently assumed that philosophical writing had to be densely “allusive”; after all, it was getting at something “ineffable.” But his undergraduate tutor, responding to Krishnan’s muddled excuse for some muddled writing, would have none of it. “On the contrary, these sorts of things are entirely and eminently effable,” the tutor said. “And I should be very grateful if you’d try to eff a few of them for your essay next week.”

“A Terribly Serious Adventure” is lively storytelling as sly “redescription”: an attempt to recast the history of philosophy at Oxford in the mid-20th century by conveying not only what made it influential in its time but also what might make it vital in ours. The philosophers in this book were preoccupied with questions of language — though Krishnan says that to call what they practiced the “linguistic turn” is to obscure continuities with what came before and what came after. Still, Gilbert Ryle, one of the book’s central figures, believed that the philosophy he was doing marked some sort of break from a tradition that was full of woolly speculation about reality and truth. He joked that being appointed the chair in metaphysics — as Ryle was in 1945 — was like being named a chair in tropical diseases: “The holder was committed to eliminating his subject.”

As one of the mainstays in this book, Ryle keeps showing up as others come and go. Born in 1900, he became a fixture at Oxford, asking successive generations a version of the question that was posed to him as a student: “Now, Ryle, what exactly do you mean by …?” This insistence on clarification was foundational to his approach. He liked to use verbal puzzles constructed around ordinary examples: someone buying gloves, a circus seal performing tricks, a confectioner baking a cake. He argued against the “fatalist doctrine” by giving the example of a mountaineer in the path of an avalanche. The fatalist’s doomsaying misuses the language of inevitability. The unlucky mountaineer is doomed in one (immediate) sense but not in another: “The avalanche is practically unavoidable, but it is not logically inevitable.”

Language is full of expressions that Ryle called “systematically misleading.” Philosophers, he warned, could be seduced by imprecision. In the 1920s, having recognized Martin Heidegger as a “thinker of real importance,” Ryle nevertheless worried there was something in Heidegger’s writing style that suggested his school of phenomenology was “heading for bankruptcy and disaster and will end in either self-ruinous Subjectivism or in a windy mysticism” — in other words, metaphysics. Heidegger, of course, would join the Nazi Party in 1933.

Krishnan’s book is teeming with Oxford characters: A.J. Ayer, J.L. Austin, Peter Strawson and Isaiah Berlin, among others. There are cameos by Ludwig Wittgenstein and Theodor Adorno, who worked with Ryle on a dissertation about the phenomenologist Edmund Husserl. Krishnan also dedicates part of the book to Elizabeth Anscombe, Philippa Foot, Mary Midgley and Iris Murdoch — four women who met at Oxford and became important figures in moral philosophy. The linguistic analysis that reigned supreme at Oxford was limited and limiting, they argued, though they allowed that a careful parsing of language still had its place. “Bad writing,” Murdoch said, “is almost always full of the fumes of personality.”

Krishnan himself is so skillful at explicating the arguments of others that at various points it seems as if he must be stating his own position. But no — he mostly hangs back, elucidating a variety of ideas with the respect he thinks they deserve. His own example lays bare the distinction between the critical essay and the hatchet job. The Oxford-trained philosopher-turned-anthropologist Ernest Gellner heaped scorn on his mentors in a 1959 slash-and-burn polemic and derided their work as “rubbish.” Gellner’s salvo wasn’t an attempt at debate; “what it sought was the humiliation and destruction of the enemy,” Krishnan writes. “It was entertaining, especially if one had nothing at stake.”

This, as it happens, was a common charge against Ryle and his colleagues: that their approach was “superciliously apolitical,” as one reviewer of Gellner’s book put it, fixated on picayune verbal puzzles, with nothing at stake. But Krishnan urges us to see things another way. Superficially “flippant examples” about a foreign visitor to a university or a game of cricket could build up “to a more subversive point,” he writes. Verbal puzzles can get us to think more deeply and precisely about how language can warp or clarify our presuppositions; envisioning a game of cricket is less likely than a political example to get our hackles up.

“Conversation, rather than mere speech, was the thing,” Krishnan writes. And one-on-one tutorials — as opposed to enormous lectures — were essential. Students weren’t supposed to learn what to think, “but how.” He writes movingly of Austin’s widow, Jean, who continued to teach at Oxford after Austin died in 1960. “Finding her students altogether too quick to dismiss the philosophers they read as stupid, she enjoins humility and generosity: Read them charitably, don’t overestimate your own ability to refute what you’re only beginning to understand.”

Link to the rest at The New York Times

The Counterfeit Countess

From The Wall Street Journal:

The remarkable story of Janina Mehlberg almost didn’t see the light of day. A Holocaust survivor and a mathematics professor in Chicago, Mehlberg stood out for making her way in an academic field dominated by men. But while teaching her students and giving conference papers, she was privately writing an account of her life’s most extraordinary episode: her daring impersonation of a Polish aristocrat in World War II, a deception that allowed her to aid Poles who had been imprisoned by the Nazis. She kept it all secret, but her husband, a survivor and distinguished philosopher of science in his own right, preserved and translated the memoir after her death in 1969.

The manuscript found its way to a historian in Florida, but at that time, there wasn’t enough interest in Holocaust stories for publication. Later the historian Elizabeth B. White received a copy of the memoir and then joined with her colleague Joanna Sliwa to bring the story to readers around the world. Stitching the pieces together, the pair collaborated with an international community of researchers to corroborate the memoir’s key elements. In “The Counterfeit Countess,” Ms. White and Ms. Sliwa put Mehlberg’s accomplishments in the context of the awful events of war, genocide, collaboration—all to properly frame the heroism of a woman whose decision to risk her own life saved uncounted others.

Janina Mehlberg was born Pepi Spinner in 1905 in the “comfortable elegance” of an assimilated Jewish family in the town of Zurawno, then in Poland and now part of Ukraine. Although a fragile child, Pepi became a star student, eventually heading to the university in Lwów (now Lviv), where she studied philosophy and mathematics with some of Europe’s leading thinkers. There she met her future husband and fellow-scholar Henry Mehlberg. By the early 1930s they were married and had both found teaching positions in Lwów.

The city was occupied by Soviet forces in 1939 and then by Germany in 1941. The arrival of Nazi rule in Lwów almost immediately touched off a spree of extreme violence against the city’s Jewish population; massacres of Jews by local militias began even before deportations to death camps were set in motion.

With the help of a family friend, Pepi Mehlberg and her husband fled Lwów for Lublin, a city ravaged by the war and near the site of the Majdanek death camp. On the journey, she realized that an audacious masquerade—leaving behind her identity as a Jewish professor to become the “Countess Janina Suchodolska”—would give her the means to help others. Once she became convinced “that she would not survive the war,” write Ms. White and Ms. Sliwa, “fear’s grip on her began to loosen. The problem she needed to solve, she realized, was not how to survive, but how to live what remained of her life.”

Mehlberg’s solution was to save as many people as possible, and she took extraordinary risks to see her mission through. Yet the authors emphasize that she was rigorous and unsentimental. She had fixed her goal of saving lives and then took only appropriate risks to accomplish her task. Allowing herself to be overcome by feeling, she knew, could get her and many others killed.

The Majdanek camp held Polish prisoners forced into slave labor, Russian prisoners of war, and Jews who would be murdered, either shot at close range or poisoned by gas. The bodies of the dead, ranging in age from infants to the elderly, were incinerated in the crematoria or buried in pits dug by the camp’s inmates anticipating their own murders. As “the Countess,” Mehlberg served as the head of the Polish Main Welfare Council, visiting the camp regularly. The haughty, demanding countess negotiated ways to bring soup, bread, medicine—and hope—to a great many Polish prisoners. Betraying little emotion, this hidden Jew became a sort of patron saint by appearing again and again to witness their suffering and alleviate it as best she could. “Janina’s story is unique,” the authors assert. “She was a Jew who rescued non-Jews in the midst of the largest murder operation of the Holocaust.”

“The Counterfeit Countess,” too, is unsentimental. The writing is matter-of-fact; the authors include data about the numbers of meals served, the details of negotiations with Nazi officers, the changes in camp conditions as the war unfolded. Mehlberg recognized that the Germans were making trade-offs within their sick paradigm of racial superiority. Would it be more efficient to murder Poles or starve them while they worked? She persuaded Nazi higher-ups to let her organization provide thousands of tons of food to prisoners so that they could do the work that would feed the Nazi war machine. German commanders decided it served their interests to allow “the Countess” to continue providing food and medicine to enslaved workers.

Meanwhile, Mehlberg worked with the Polish resistance to coordinate efforts to undermine the Nazi regime, especially as the Germans began losing the war. In the eyes of those Polish patriots who knew of her courage, she was a hero. Yet when the war was over, Mehlberg (like many Jews who had taken on false identities) recognized that it was still too dangerous to reveal who she really was. The saddest pages of this often sad book describe the antisemitic violence that swept through Poland after the Nazis were defeated. As the Soviet Union imposed Stalinist orthodoxy on Eastern Europe, it was unsafe to be a Polish patriot or a Jew, or to be known to think freely about anything. Some poor souls—currently being tarred as colonizers by blinkered progressives—fled to Palestine. Mehlberg and her husband managed to settle in North America.

Link to the rest at The Wall Street Journal

Why Is Music Journalism Collapsing?

From The Honest Broker:

“This feels like the end of music reviews,” complained a depressed critic last night.

That gloomy prediction is a response to the demolition of Pitchfork, a leading music media outlet for the last 25 years. Parent company boss Anna Wintour sent out the bad news in an employee memo yesterday.

It’s more than just layoffs—the news is much worse than that. That’s because parent company Condé Nast also announced that Pitchfork will get merged into GQ.

Swallowed up by GQ? Is this some cruel joke?

Ah, for writers it’s all too familiar. In music media, disappearing jobs are now more common than backstage passes.

Just a few weeks ago, Bandcamp laid off 58 (out of 120) employees—including about “half of its core editorial staff.”

And Bandcamp was considered a more profitable, stable employer than most media outlets. Epic Games, its parent company before the recent sale (to Songtradr) and subsequent layoffs, will generate almost a billion dollars in income this year—but they clearly don’t want to waste that cash on music journalism.

Why is everybody hating on music writers?

Many people assume it’s just the same story as elsewhere in legacy media. And I’ve written about that myself—predicting that 2024 will see more implosions of this sort.

Sure, that’s part of the story.

But there’s a larger problem with the music economy that nobody wants to talk about. The layoffs aren’t just happening among lowly record reviewers—but everywhere in the music business.

Universal Music announced layoffs two days ago.

YouTube announced layoffs yesterday.

SoundCloud announced last week that the company is up for sale—after two rounds of layoffs during the last 18 months.

Spotify announced layoffs five weeks ago.

That same week, Tidal announced layoffs.

A few weeks earlier, Amazon Music laid off employees on three continents.

Meanwhile, almost every music streaming platform is trying to force through price increases (as predicted here). This is an admission that they don’t expect much growth from new users—so they need to squeeze old ones as hard as possible.

As you can see, the problem is more than just music writers—something is rotten at a deeper level.

What’s the real cause of the crisis? Let’s examine it, step by step:

  1. The dominant music companies decided that they could live comfortably off old music and passive listeners. Launching new artists was too hard—much better to keep playing the old songs over and over.
  2. So major labels (and investment groups) started investing huge sums into acquiring old song publishing catalogs.
  3. Meanwhile streaming platforms encouraged passive listening—so people don’t even know the names of songs or artists.
  4. The ideal situation was switching listeners to AI-generated tracks, which could be owned by the streaming platform—so no royalties are ever paid to musicians.
  5. These strategies have worked. Streaming fans don’t pay much attention to new music anymore.

I’ve warned about each of these—but we are now seeing the long-term results.

Link to the rest at The Honest Broker

PG notes that a lot of different parts of journalism have been collapsing for a long time.

In New York City and its environs, The New York Times is #3 by circulation. Newsday, which primarily covers Long Island, is #2. The Wall Street Journal, a paper that actually publishes a conservative editorial from time to time(!), has been #1 for a long time.

Newsday!!! A tabloid suburban newspaper. The ultimate snub for Manhattanites! Before you know it, a paper that covers farm news (New York Farmer) will have a larger circulation than The New York Times.

Infectious Generosity

From The Wall Street Journal:

In this unpleasant cultural moment, there’s no shortage of thinkers with ideas to cure what ails us. In “The Soul of Civility,” Alexandra Hudson argued that a broader spirit of respect and tolerance is the saving tonic. In “The Canceling of the American Mind,” Greg Lukianoff and Rikki Schlott made the case that free speech supplies the needful corrective. Now comes Chris Anderson to advocate the radical throwing-wide of arms, hearts and wallets as a means of ensuring a better future for all. In “Infectious Generosity,” Mr. Anderson describes with boyish enthusiasm and embarrassing naiveté his vision of how generosity could save the world. He imagines a future “lit up by maximum philanthropy,” in which human needs are gratified to such an extent that human beings cease to be handicapped by what cynics might call human nature.

“Let’s be generous in a way that gives people goosebumps,” writes the excitable Mr. Anderson, a British expatriate who has run the nonprofit speech-sponsoring TED organization for the past 23 years. “You don’t have to be rich or a creative genius. If you can adopt a generous mindset, seek to understand people you disagree with, and write words that are kind instead of cruel, you can help turn the tide.”

Mr. Anderson makes the case that beneficent acts need no longer go unknown. Good deeds and largess can go viral online, thus establishing as social norms the doing of good deeds and the giving of largess. Media executives can emphasize heart-warming stories. By such means, the internet can be a force-multiplier, making generosity contagious and causing people to crowdfund, spread kindness and share artistic work free.

To the author, generosity is “arguably the best idea humans have ever embraced.” As to why one ought to place others before oneself, Mr. Anderson makes no transcendental claims, though as a former Christian he respects the role that religion plays in helping men and women to behave unselfishly—partly because they think God is watching and partly because, in most faith traditions, they are called to be charitable and give alms.

Money and the spreading of it play a big part in “Infectious Generosity,” which is dotted with Liana Finck’s humorous drawings. Mr. Anderson believes that everyone, from billionaires downward, should give more than they do. He favors an examination of conscience—“Am I being generous with my money? Is my carbon footprint fully offset?”—and a secular sacrifice to replicate the Christian and Jewish practice of tithing (giving away 10% of one’s income) or the Muslim obligation of zakat (giving 2.5% of one’s wealth to charity).

Were all persons to follow suit, Mr. Anderson promises, the result would be a pot of philanthropic cash he puts at between $3.5 trillion and $10 trillion. Almost giddily, he lists the outcomes that such a flood of money could produce, from ending hunger and establishing a universal basic income “for the entire world” to making “meat alternatives as cheap and tasty as the real thing” by, in part, giving “meat alternatives the same advertising budget as the meat industry.”

The inclusion of meat substitutes on Mr. Anderson’s list of grandiose possibilities points to the fundamental silliness of “Infectious Generosity.” Meat substitutes, like heart-warming news operations, have not managed to entice the public. If people really wanted synthetic meat, they’d buy it; if they wanted media that feature only good news, they’d pay for them.

There are more serious problems with the premise of Mr. Anderson’s ecstatic vision. People differ on what is desirable, and desires conflict. Davos Man’s meatless cause is a cattle rancher’s anathema. A cash handout designed to uplift may just as easily corrupt. Do-gooders often fail to take into account the law of unintended consequences. For instance, the mosquito nets blanketing Africa to fight malaria are also causing ecological damage, because people are using them for fishing. Charlatans, too, are always ready to take advantage of donors with soft hearts and deep pockets.

Link to the rest at The Wall Street Journal

Optimal

From The Wall Street Journal:

What makes for a good work day? We might fantasize about achieving a state of “flow”—so enamored with work that time seems to stand still. But for an average day at the office, that’s asking a bit much. Better, say Daniel Goleman and Cary Cherniss, to aim for “a very satisfying day, one where we feel we did well in ways that matter to us, were in a mood that facilitates what we did, and felt ready to take on whatever challenges came along.”

In “Optimal: How to Sustain Personal and Organizational Excellence Every Day,” Messrs. Goleman and Cherniss tout the concept of emotional intelligence as the key to entering this state. Emotional intelligence involves self-mastery and empathy, social skills and stress resilience—virtues that a cursory glance at social media reveals to be more rare than one might hope. Still, they can be found outside the internet, not least in a successful workplace. The authors, both psychologists, marshal what they’ve learned through years of studying emotional intelligence to argue that its skill set offers “a crucial advantage in today’s tough business climate.” At many top companies, nearly everyone has a high IQ, so effectiveness is determined by something else—namely, the ability to manage one’s emotions and inspire others to want to do their best.

All this is unexceptionable and rather familiar by now. So the authors spend much of “Optimal” insisting that this worldview is more radical than it seems, casting it in opposition to the supposed conventional wisdom that classroom intelligence is all that matters: “The standard view . . . assumes that what makes you successful in school and academics is all you need for success in your career.”

The authors are most convincing when they show how emotional intelligence does, practically, pay off for people and organizations. For instance, one financial company found that advisers “had difficulty talking with clients about buying life insurance—preparing for your death, or even thinking about it, can be awkward or even off-putting.” But the advisers who went through emotional competence training, learning how to have difficult conversations, generated more sales than those who didn’t.

The authors tell the stories of several leaders whose wise self-control and empathy kept situations in check—such as the district manager for a large food-services company who responded to an (unsubstantiated) workplace-safety complaint from a disgruntled employee by trying to figure out why she was so upset. The manager was able to help that employee through the family health crisis that had precipitated the desire to lash out.

The chief executive of an architecture and engineering company was emotionally intelligent enough not to give a pep talk when she addressed employees after a round of layoffs. Instead she talked about her own sadness, presumably at having to lose good workers and disrupt their lives, and gave people space to speak about their concerns before discussing the next steps. “While there was lingering anxiety,” the authors write, “gradually the collective mood shifted toward enthusiasm about the company’s future—an emotional contagion built on reality; facing the sad facts, but then looking beyond.”

Folks who are not leading organizations can use emotional intelligence to improve their own situations. Messrs. Goleman and Cherniss share the wonderful story of Govan Brown, a New York City bus driver who aimed to transfer his own “ebullient mood” to his passengers—taking care of his flock, as he saw it, and ensuring that everyone got off the bus happier than when they got on.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

She Talked Like a Millionaire, Slept in a Parking Garage and Fooled Nearly Everybody

From The Wall Street Journal:

University of Florida officials went back and forth with documentary filmmaker Jo Franklin over details for a planned gala in Franklin’s honor at the Four Seasons Hotel in Washington, D.C.

Franklin had pledged $2 million to her alma mater, and requested her guest list for the party include the entire staff of the PBS NewsHour. A day before the gala, school officials learned her seven-figure check had bounced. They boarded their flight to Washington, hoping to straighten everything out.

The next day, they found out Franklin hadn’t arrived at the Four Seasons, and the credit card number she gave the hotel wasn’t working. A person who identified as Franklin’s assistant emailed to say Franklin had broken her foot and couldn’t make it to Washington. University workers began phoning guests to say the gala was canceled.

The school’s esteemed graduate, once a journalist and documentary filmmaker specializing in the Middle East, emerged as a troubled and gifted fabulist. The $2 million gift was an illusion, one in a yearslong string of fantasies concocted by Franklin, who tumbled from a life of apparent success to homelessness. For years, she persuaded many around her that she was living the high life. Her family knew better.

“She is very ill and we need to have her put into a medical treatment facility of some type before she harms other people and herself,” her younger brother, George Franklin, wrote to family members days after they learned of the 2014 gala fiasco. Jo Franklin was 68 years old at the time and estranged from her daughter and siblings. 

In the years that followed, Franklin sometimes spent nights in a South Florida hotel parking garage. She was arrested a few times, once for allegedly stealing $11.98 worth of wine. Franklin also befriended a group of regulars at a local Starbucks who were impressed with her professional background and insights. “She was on point when it came to the political world, what’s going on in the world,” said Stephen Sussman, a Starbucks friend. Franklin made a point of her success. She mentioned having a driver and a home on affluent Jupiter Island, Fla. She said she stayed at a local hotel only to be closer to her job, which included working with government officials regarding the Saudis.

Over time, her image dissolved. Franklin’s friends noticed that she wore the same clothes. Her sandals had holes. She said she didn’t carry a cellphone because the Saudis were tracking her.

Lost in a surge of mental illness cases and a record-high homeless population are a growing number of Americans who can’t fully care for themselves but aren’t easily diverted into treatment, either on their own or involuntarily. Masses of homeless people around the U.S. have fueled aggressive efforts to push more of the mentally ill into care.

Franklin’s family worried for years about her mental stability and, like many others, were frustrated because they saw no easy path to getting her help. They didn’t know if she had ever been diagnosed or treated.

“My hope is just there was a way, even if she didn’t want it, to be forced to sit down with a mental health professional and figure out, ‘What is there to do here?’ ” said Franklin’s 38-year-old son, Hugh Trout, who last saw his mother nearly a decade ago.

George Franklin said his sister “wasn’t ever going to admit she had a problem.” He and the Starbucks friends had a plan to get Jo Franklin off the streets. It, too, involved a tall tale.

This account of Jo Franklin’s life is based on public records, emails and interviews with family, friends, former colleagues and associates familiar with her professional highs as well as her steep, slow-motion fall.

Josephine Anne Franklin was born in Chicago on July 31, 1946, the second oldest of four children in an upper-middle-class family. The family moved to Tampa in the mid-1960s, and Jo graduated from the University of Florida in 1968. Her semester abroad in Lebanon kindled a lifelong fascination with the Middle East. She married a surgeon and had two children, Ashley Trout in 1981 and Hugh Trout four years later.

Franklin’s résumé included work as a producer for the MacNeil/Lehrer Report on PBS. She later made a series of documentaries for PBS on Saudi Arabia and the Middle East, as well as a series on the international space race. She often appeared on location in her films and was known professionally by her married name, Jo Franklin-Trout.

Her 1989 documentary “Days of Rage: The Young Palestinians” aired on PBS and drew barbs from critics who saw a pro-Palestinian slant and a lack of Israeli viewpoints. She said at the time that she wanted to showcase a rarely seen perspective.

Franklin spent the early part of the 1990s writing a novel, “The Wing of the Falcon,” billed as a love story and thriller set during the Persian Gulf War. The self-published novel sold poorly after its 1995 release, said Alan Gadney, whose company Franklin hired to produce and market the book, which Franklin had hoped would be made into a movie. A New York Times review acknowledged her Middle East expertise but said, “what she cannot do is write.”

Gadney said he later sued Franklin for roughly $25,000 she owed the company. “We went after her but she disappeared,” he said.

Franklin moved to California, and her family said they didn’t know how she supported herself during the roughly two decades she lived there. She split from her husband and, in 1996, a judge in their divorce case noted that Franklin had little taxable income from 1990 through 1992 and hadn’t filed a personal tax return since then. Yet she leased a Jaguar XJ6, the judge said, and had accumulated more than $150,000 in debt. 

He denied Franklin’s alimony request and awarded custody of her two children to Franklin’s ex-husband, who lived in Washington. “Ms. Franklin is a person with many highly marketable skills,” the judge wrote.

Jerome Parks, an East Coast friend, kept in touch after she moved to California. “She was always talking about good things that were going to happen,” he said, “but they just never seemed to happen.”

Franklin’s family believed she became lost in her fantasies around the time of her divorce. Her daughter, Ashley Trout, who was sent to spend summers with her mother in California as a teenager, said Franklin lived beyond her means, focusing more on her image than productive work. Ashley said they had a combative relationship and that she often challenged her mother about lying and overspending.

When confronted, “she would just monologue for 30 minutes,” said Ashley, now 42. “Just a torrent, a firehose.” Her brother, Hugh, said, “When anyone started to tamper with that fantasyland, it would get very, very dark.”

In 2004, Ashley was taken to the hospital after a rock-climbing fall in Japan. Franklin called the hospital and said she was flying there on Colin Powell’s jet. “I get my mom on the phone and I tell her, listen, here’s the deal, there’s no jet,” Ashley said. “You don’t have access to Colin Powell’s jet.” 

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)

Mental Maps of the Founders

A plan of my farm on Little Huntg. Creek & Potomk. R., George Washington (Library of Congress)

From The Wall Street Journal:

On July 1, 1776, as the Declaration of Independence was about to be published, its author complained to a friend that it was painful, at such a fraught moment, to be in Philadelphia, “300 miles from one’s country.”

Thomas Jefferson’s sentiment, expressed casually in a private letter, reveals an arresting fact—namely, that the new nation did not yet exist even in the mind of the man who announced its birth. This inchoate quality was obvious in the sense that the 13 former colonies, united only in their bid for independence, didn’t create a genuine national government for more than a decade. But Jefferson’s remark hit on something more elemental, the geopolitical space that a citizen intuitively recognizes as “one’s country.”

Michael Barone’s “Mental Maps of the Founders” focuses on this spatial sense among key members of the revolutionary generation. By “mental maps,” Mr. Barone intends more than a regional affinity. He means a “geographical orientation”—the perspective that place confers. He argues that, for the Founders, it shaped “what the new nation they hoped they were creating would look like and be like.”

Mr. Barone, a distinguished journalist and political analyst, develops his theme through a series of six biographical portraits. He argues, for example, that the crucial event in George Washington’s life was his military campaign to oust the French from the Ohio country in 1754. A skirmish during that adventure began the Seven Years’ War, a struggle for empire waged across Europe, the Americas and Asia. Though Horace Walpole, writing in his diary in London, didn’t know Washington by name, he was referring to him when he wrote that “a volley fired by a young Virginian in the backwoods of America set the world on fire.” Washington’s youthful involvement in winning the Northwest established, for him, a lifelong sense of its strategic importance for any predominant power in North America. Almost by reflex, Mr. Barone writes, Washington “looked west by northwest from the front door of his beloved Mount Vernon to the continental nation he did more than anyone else to establish.”

The original 13 states could be easily divided into distinctive regional cultures: Puritan New England, with its homogeneous population and communal religious foundations; the pluralistic middle states, established by Quakers, the Dutch and Catholics, pragmatically preoccupied with commerce rather than political abstractions; and the Deep South, set apart by its planter elite and an overwhelming dependence on slave labor. Virginia, the largest state and the home of three of the Founders in Mr. Barone’s survey—Jefferson, Washington and James Madison—didn’t wholly belong to any of these regional blocs but possessed elements of each.

The other three figures in Mr. Barone’s study—Benjamin Franklin, Albert Gallatin and Alexander Hamilton—were immigrants to the regions in which they settled. Hamilton was born in the West Indies and settled in New York. Today we recognize his origins as those of an immigrant. But Franklin also left one British colony (Massachusetts) and settled in another (Pennsylvania). And both men were born British subjects. Not so Gallatin, the Treasury secretary under both Jefferson and Madison, who came to America from the French-speaking part of Switzerland. Mr. Barone argues that these figures’ experiences as successful outsiders within their communities gave them a broader national outlook than that of their more insular countrymen.

The Founders’ geographical visions, as Mr. Barone shows, informed their sense of how America’s distinctive regional cultures might fuse into a common whole. Hamilton, inspired by the trade and credit networks that made the small sugar islands of his birth into engines of imperial wealth, favored a strong central government backed by a fiscal system that earned the loyalty of elites in all sections. Jefferson, dreaming of an egalitarian republic from his elevated place in Virginia’s planter aristocracy, believed in the proliferation of independent yeoman farmers united by an instinctive love of liberty. Madison, characteristically, added a more realistic version of Jefferson’s slightly utopian ideas. Pluralistic expansion, he felt, would prevent any one region from dominating the rest. The problem was not, in his view, too many regional subcultures but too few. These philosophical differences are well known, but Mr. Barone persuasively describes the place-derived assumptions underlying them.

Link to the rest at The Wall Street Journal

Lawrence of Arabia Review: Dreams of Empire

From The Wall Street Journal:

The British explorer Ranulph Fiennes is the first person to have crossed Antarctica on foot and the only living person to have circumnavigated the planet by its poles. In 2000 he fell through the ice while walking solo to the North Pole, leaving the fingertips of his left hand severely frostbitten. When he got home, Mr. Fiennes, a veteran of the elite Special Air Service, trimmed the fingertips with a saw, saving himself £6,000 in medical expenses and, he says, considerable pain.

In 2003 Mr. Fiennes had a heart attack and a bypass operation, then completed seven marathons on seven continents in seven days. In 2009 he reached the summit of Mount Everest on the third attempt. His other feats of endurance include writing more than 30 books, including biographies of the polar pioneers Ernest Shackleton and Robert Falcon Scott, handbooks for ultrarunners and travelers with weak hearts, and the perhaps inevitable family history “Mad Dogs and Englishmen.”

Mr. Fiennes (pronounced “Fines”) is a classic English type, the diffident hero and driven adventurer. He is the square peg who inspires irregular soldiers in inhospitable places. He crosses deserts, forests and frozen wastes, facing down danger and the limits of human endurance, death included.

The rarest such figure, combining all these characteristics of imperial legend with lasting historical significance, was T.E. Lawrence (1888-1935). Dubbed Lawrence of Arabia in his lifetime and immortalized twice over, once by himself in his 1926 memoir “Seven Pillars of Wisdom” and again by Peter O’Toole in the 1962 movie “Lawrence of Arabia,” Lawrence played a crucial but thwarted role in the shaping of the modern Middle East.

In 1916 Britain was at war with Germany’s Ottoman Turkish allies. Lawrence, an archaeologist turned intelligence officer, helped organize the Arab tribes of the Hejaz (today’s western Saudi Arabia) into a guerrilla army and led the Arab Revolt that, in October 1918, displaced the Turks from Damascus. The revolt raised hopes for a unified, self-determining Arab nation, but Lawrence’s political masters and their French allies connived at frustrating that ambition. After the war, the British and French took over from the Turks and created new borders and nations. The events of that era still complicate today’s local and global politics.

Mr. Fiennes’s “Lawrence of Arabia: My Journey in Search of T.E. Lawrence” is a casually elegant biography and an expert reflection on the kind of irregular warfare that Lawrence pioneered and Mr. Fiennes experienced as a young officer fighting a Marxist insurgency in the mountains of Oman in the late 1960s. Lawrence’s example, Mr. Fiennes writes, preceded him and “often inspired me to victory in life-or-death situations.” But Lawrence was also his companion in facing “impossible military and political odds, as well as confronting personal scars.”

The second of five sons, Lawrence grew up in a villa in Oxford. Fascinated by military history, especially the Crusades and Napoleon Bonaparte, he was pushed by his mother to achieve greatness. Lawrence’s motives in risking his life, Mr. Fiennes writes, were not just “his attachment to the Arabs and his hatred for the Ottomans.” He was not who he claimed to be or who he wanted to be. “Lawrence” was an assumed name. His father, an Irish baronet named Thomas Chapman, had eloped with the family maid. When Chapman’s wife refused to grant a divorce, he and his mistress adopted a new name and pretended to be married. Lawrence’s mother saddled him with the “burden that he was special” and a mission to “redeem the family.” Raised a strict Christian, Lawrence was 10 years old when he discovered his illegitimacy.

He became a feted student at Oxford, then cultivated the English romance with tribal life while digging at ancient Carchemish, north of Aleppo, Syria, from 1911 to 1914. “I am Arab in habits and slip in talking from English to French to Arabic unnoticing,” he wrote home. When war broke out in 1914, the British army needed Arabic speakers with local knowledge. Lawrence’s old Oxford tutor, now an intelligence officer, summoned him to Cairo as a mapmaker and adviser. But Lawrence already had a plan to redraw the map by unifying the Arab tribes against the Turks. “I want to pull them all together,” he wrote in 1915, “and roll up Syria by way of the Hedjaz” in the name of Hussein Ibn Ali, the Emir of Mecca.

Meanwhile Hussein and his four sons secretly planned their own revolt, but needed weapons and support. The British could not spare soldiers, but they sent Hussein money and antique rifles, as well as a promise from Henry McMahon, the British high commissioner in Egypt, to recognize Hussein’s rule from Aleppo to Aden if the revolt succeeded.

In July 1916, Hussein’s followers expelled the Turkish garrison from Mecca but, lacking heavy weapons, they were repulsed at the ports of Jidda, Yenbo and Rabegh. In October, with the revolt stalled and British confidence faltering, Lawrence sailed from Egypt for Jidda. From there, he disguised himself in Arab robes, crossed roughly 100 miles of desert by camel through sandstorms, bearing the heat and weeping saddle sores, met Hussein’s son Feisal at the rebel camp and launched his legend.

Lawrence said he preferred “the Arab untouched” to the “perfectly hopeless vulgarity of the half-Europeanized Arab.” In Feisal he found his ideal partner, tall and white-robed, with a “strange still watchfulness” and an “almost regal” bearing that reminded Lawrence of Richard the Lionheart and the young Napoleon. Returning to Cairo, Lawrence secured explosives, the support of Royal Navy ships and British advisers “to train Arab bands,” and a supply of gold.

The Turks controlled the Red Sea ports of the Hejaz via their new Damascus-Medina railroad. Lawrence soon saw that the Arabs could not match well-drilled Turkish troops and their German-supplied artillery. Inland, however, Lawrence believed that the tribesmen’s mobility, small numbers and familiarity with “one of the most trying countries in the world for civilized warfare” made them “the most elusive enemy an army ever had.” Lawrence convinced Feisal to adopt an indirect strategy: disrupt and pin down the Turks by sabotaging the railroad line, then bypass and isolate the Turkish garrisons at Medina and the Red Sea ports and push northward to Syria.

In November 1917, Ottoman troops captured Lawrence, disguised as a peasant, as he spied out the railroad junction of Deraa. He was brutally beaten and raped by the Turkish governor and his men. The shame he felt after that episode was only multiplied at the end of the war: After he entered Damascus on Oct. 1, 1918, in a Rolls Royce, he had to watch as the British commander Edmund Allenby informed Feisal that Britain and France did not intend to honor McMahon’s promises of a unified Arab kingdom.

Damaged by his experiences in Arabia and disenchanted by the political aftermath, Lawrence became a celebrity when “Seven Pillars of Wisdom” appeared in 1926, then did his best to disappear, enlisting in the Royal Air Force under an assumed name. He died in a motorbike accident in 1935.

Link to the rest at The Wall Street Journal

Social media’s online diarists have a long lineage – Who are personal journals written for?

From The Economist:

The tales embrace the mundane and the seismic, from being dumped by a boyfriend before the school prom to the sudden death of a parent. The tone ranges from cheesy to heartbreaking. The storytellers are “journal influencers”, mostly young women reading their teenage diaries to audiences online.

Some videos are mingled with other content, merging pre-teen dreams with make-up tips; others are simple shrines to past selves. One influencer, Carrie Walker (pictured), draws 1.2m views for a half-hour read on YouTube; the shorter content on TikTok’s #diarytok tag has reached 54m. And sharing secrets presents commercial opportunity: selling notebooks and pens on Amazon; auctioning copies of diaries on eBay.

Many people think about writing a diary, especially at New Year. Some start. Some even keep it up. But why write, and for whom? Whether a novice facing a blank page or a seasoned scribbler with years of good meals and gossip in irregular notebooks, almost any diarist has asked themselves that question.

Sally Bayley of the University of Oxford, author of “The Private Life of the Diary”, regards sharing on social media as the antithesis of diary-keeping. The journal is “an attempt to be honest with yourself”. It is “an internal territory, which you are mapping onto the page”, inseparable from privacy. Even Sylvia Plath, a “theatrical individual”, Dr Bayley notes, wrote a diary in order to “generate a voice in private”.

Yet diaries have also long been shared, if more discreetly than on TikTok. Keeping a journal rose in popularity in the 19th century, especially among women. According to Cynthia Huff, an academic specialist in Victorian culture, diary-sharing then was “extremely common”.

Diaries were read aloud, sent to friends or left open for visitors to peruse. “That distinction between public and private really doesn’t hold at all,” says Professor Huff. Some diaries served practical uses, sharing advice on self-improvement, pregnancy or childbirth. British women in the colonies often sent diaries back home. They were “creating an extended family through these diaries” and fostering an ocean-spanning sense of Englishness.

Many journal videos also create a sense of community. They share stories of isolation: of suffering homophobia, struggles with body image or early romantic obsessions. They poke fun at the distorted expectations of youth and the disappointments of adulthood, with the ear of sympathetic strangers.

Some diary-sharers go further. At Queer Diary, a series of events across Britain begun in 2020 by Beth Watson, a performer, LGBTQ adults read their old diaries to a live audience. The drama, confusion and mayhem of teenage life are performed to a sympathetic crowd. The celebration, Ms Watson says, is as important as the reflection.

The symbiosis of secrecy and celebration was perhaps best understood by Anaïs Nin, a 20th-century French-born American whose diary was an unapologetic exercise in self-creation. “I am in my Journal, and in my Journal only, nowhere else. Nothing shows on the outside. Perhaps I do not exist except as a fantastic character in this story.”

Nin’s mix of fantasy and truth included an illegal abortion, extramarital affairs and, most notoriously, an incestuous relationship with her father. Her assertions of confidentiality—“you won’t say anything, will you”; “only my Journal knows it”—treat the reader as the sole listener.

And yet, of course, Nin published her journal. Its scandalous content won her fame that her fiction had not. Her confessional texts penetrated the thin veil between public and private. The diaries are a masterclass in broadcast secrecy, a megaphoned whisper.

“We write to taste life twice,” Nin wrote, “in the moment and in retrospection.” 

Link to the rest at The Economist

‘A Book of Noises’ Review: Sound and Sensibility

From The Wall Street Journal:

All science is either physics or stamp collecting, the physicist Ernest Rutherford supposedly said, distinguishing what he saw as the pursuit of deep principles from the accumulation of endless instances. Even those who find this smug quip tiresome—it doesn’t help that it’s probably apocryphal—would concede that Caspar Henderson’s “A Book of Noises: Notes on the Auraculous” has a definite postal air. There’s physics here, certainly, but also bits of zoology, biology, anthropology, linguistics and geology, as well as music and literature.

Which is not to say that the book isn’t interesting or sometimes stimulating. Philately will get you somewhere. Mr. Henderson, a British journalist who is remarkably cheerful for someone who writes about environmental problems and human rights as well as science, explores in 48 short essays how sound, construed broadly, shapes the universe and the life in it, from the still-detectable “ripples” of the Big Bang to the song of nightingales.

As in Mr. Henderson’s previous books, “The Book of Barely Imagined Beings” (2012) and “A New Map of Wonders” (2017), the keynote is wonder—a secular awe at the variety and complexity of the world. “Glory be to Creation for wonky things,” one chapter begins, in a play on Gerard Manley Hopkins’s poem “Pied Beauty.” Hopkins was praising “all things counter, original, spare, strange” (and the Christian God), while Mr. Henderson is talking about asymmetrical animals, such as bottom-lying flatfish that “stare up with a second eye that has yanked itself all the way round their skulls,” or certain species of owls, whose uneven ears sharpen their ability to locate sounds. To this advantage, Mr. Henderson writes, they add a beak shaped to minimize sound reflection and ears whose acuity doesn’t diminish with age. “Go, owls!” he concludes brightly.

Also praised in the “Biophony” section are amphibians (“What a piece of work is a frog. How noble in breathing, which it does through its skin”), elephants (they use their feet to “hear” vibrations in the ground) and bat echolocation (“the more one considers it, the more amazing it is”). The entries in “Cosmophony” include one on “sound in space” that contemplates how your voice would sound in the thin, cold atmosphere of Mars—quiet—and one on the ancient connections between astronomy and music. Mr. Henderson also explains “sonification,” in which scientific data is turned into sound. Though mostly an artistic experiment or public-outreach gimmick, sonification has been used for research, harnessing our ears’ ability to discern subtle changes in signals. It even allowed one astrophysicist, Wanda Díaz-Merced, to continue her career after going blind.

From “Geophony” we learn that the Northern Lights can sound like crackles, sizzles, rustles or bangs, though the idea that they sounded like anything at all was long dismissed. It takes particular atmospheric conditions to make them audible, and even then they’re barely loud enough to hear over the chatter of other tourists. Another entry, by contrast, discusses the loudest sound in recorded history (the 1883 eruption of the volcano Krakatoa, whose reverberations looped the Earth for five days) and one of the loudest ever on Earth (the Chicxulub asteroid, death knell for the dinosaurs).

Link to the rest at The Wall Street Journal

How to Be Multiple and Twinkind

From The Wall Street Journal:

Twins have long sparked fascination, awe, fear and superstition. They appear in literature both high and low—from Shakespeare and Charles Dickens to the Bobbsey Twins and the Sweet Valley High young-adult books. In mythology, their stories are linked to the origins of nations (Romulus and Remus; Esau and Jacob) and the power of the gods (Castor and Pollux; Apollo and Artemis). Venerated in some cultures and feared in others, twins have served as emblems of affinity and rivalry and, for scientists and philosophers, as case studies in the search to understand the power of nature over nurture.

Today twins remain relatively rare—only 1 in 60 people is a twin in the U.S. and Europe—although the rate of twin births has risen precipitously since the 1980s, thanks mostly to in-vitro fertilization. The relative rarity of twins invites speculation about how their experience is distinct from that of singletons—and what that distinctiveness can teach us about personhood.

In “How to Be Multiple,” Helena de Bres, a philosophy professor at Wellesley College and herself an identical twin, draws on her own experience as a way to explore mind-body boundaries and the nature of individualism and autonomy. “What makes a person themself rather than someone else?” she asks. “Can one person be spread across two bodies? Might two people be contained in one?”

This approach works some of the time. It is true that the resemblances between twins—both physical and psychological—can seem uncanny, as if they are a single identity. Twins can express similar or identical tastes in food, living situations and even sexual preferences, as Ms. de Bres notes, suggesting bonds far stronger than those that exist among ordinary siblings.

. . . .

Ms. de Bres also argues that twins offer evidence of the so-called extended-mind thesis, which posits that, given our reliance on tools and close interaction with the physical world, some cognition can be understood as entwined with our material surroundings—extended. As an identical twin, Ms. de Bres describes in fascinating detail the ways in which her twin’s thoughts and her own—and their physical experiences—seem to merge. She notes the “experience of merger I sometimes have with [her twin] Julia: the sense that I’m thinking via her mind.” She describes the twin contemporary artists Tim and Greg Hildebrandt, who work together on pieces of art and sign only their last name, since, as Tim puts it, “it’s our painting, not Greg’s or mine.”

. . . .

More interesting is her claim that twins prompt a kind of unacknowledged existential dread in singletons because people “raised in Western cultures” are “deeply attached to being discrete individuals with sharply distinct physical, mental, and emotional barriers between ourselves and other humans.” Thus twinhood offers an alternative to the radical individualism ascendant in Western culture. Twins “and other freaks” encourage “social interdependence” and “model more communal ways of living.”

. . . .

Ms. de Bres notes that “much of the experience of twinhood is determined not by twinship itself but the response of non-twins to it.” As a mother of fraternal twin boys, I know too well that the parents of twins have pet peeves about how their children are treated by the world. I wince whenever someone refers to my sons as “the twins,” as if they always act in concert like the Hanna-Barbera cartoon Wonder Twins rather than as the very different individuals they are. In “Twinkind,” William Viney captures the full range of nontwin response, among much else, aiming to explore what he calls “the singular significance of twins.” His book is a visual feast, replete with reproductions of twins in paintings, sculptures, medical drawings and advertisements. Like Ms. de Bres, Mr. Viney is a twin, but his focus is less on his personal experience than on how people in different cultures and eras have understood twinhood.

“Transfigured as gods and monsters, spirits and animal familiars, and viewed as phantoms, freaks and clones,” Mr. Viney writes, “twins pass through a hall of unpredictable mirrors.” He describes, for instance, twins as symbols of both power and mystery in Egyptian, Greek, Aztec and Indo-European mythology, as well as the ways in which cultures change their views of twins over time. In the 18th and 19th centuries, the Oyo Yorubans (in present-day Nigeria) saw twins as dangerous—a “supernatural curse”—but by the early 20th century began to see them as markers of good fortune and “treated them with care and reverence as a deity in miniature.”

. . . .

Mr. Viney acknowledges the dangers of reading too much into twinhood. “One reason that a twin history is difficult to portray in one image or a few hundred . . . is that each twin is what US anthropologist Roy Wagner might have called a ‘fractal person’—neither singular nor plural.” Perhaps our fascination with twins is the result of that enduring, mysterious fact: As singletons, we can never really understand what it means to be multiple.

Link to the rest at The Wall Street Journal

Of course, Princeton University Press can’t be bothered with a free preview of Twinkind the way everyone else does on Amazon because ebooks and previews are for proles.

You can see a sort of queer preview of Twinkind by clicking Here, then cursoring down instead of clicking on each page the way Amazon and all non-troglodytes do.

Putting Ourselves Back in the Equation

From The Wall Street Journal:

Popular science often feels like a kind of voyeurism. Those who can’t manage the real thing are given a thrilling glimpse of its intrigue and excitement but kept at a distance. So a book that tackles the mind-boggling triad of physics, consciousness and artificial intelligence might be expected to provide little more than intellectual titillation. The science journalist George Musser even says at its end that “many physicists and neuroscientists are just as perplexed as the rest of us.”

But Mr. Musser knows that the point of popular science is not for the reader to understand everything fully but to get a sense of what’s at stake, what kinds of answers are being offered to difficult questions, and why it all matters. One could not ask more of “Putting Ourselves Back in the Equation”—on all three counts it delivers.

The central puzzle of the book is what the contemporary philosopher William Seager has called a “ticking time bomb”: the intellectual division between the objective and the subjective. Natural science was able to make such great strides after the Middle Ages only because it left the analysis of thoughts and feelings to poets and philosophers, focusing instead on measurable observables. The strategy worked a treat until it hit two brick walls.

The first is the nature of consciousness. Modern neuroscience, at first, stuck to examining the brain events that corresponded to conscious experiences: the “neural correlates of consciousness.” But at a certain point it became clear that such a focus left out a good deal. How is it possible that mushy cells give rise to sensations, emotions and perceptions? The science of mind had to ignore precisely what it was supposed to explain because a purely objective account of consciousness cannot encompass its subjective character.

And then—a second and related problem—physicists discovered that they couldn’t leave conscious minds out of their equations. A central tenet of quantum theory is that observers change what they observe. This is embarrassing. Physics is meant to describe the mind-independent world. But its best description ended up having minds—with their particular points of view—at its center. So for physics to be anything like complete, it has to find a way to kick minds out again or account for what makes them conscious and why they should affect physical matter.

Mr. Musser provides a chatty and informal overview of the many ways in which physicists have been trying to rise to these challenges. He speaks to many of the leading scientists in the field, trying a bit too hard to make them seem like regular folks so that we don’t feel intimidated. A bigger challenge, for the reader, is that he introduces us to so many theories that it’s difficult to judge which should be taken most seriously and which lean toward the cranky. Given that even the most well-evidenced theories in physics sound crazy, our intuitions are no guide.

But by the end a number of general insights shine through. The central one is that we have to think of both physics and consciousness in terms of networks and relations. You can’t find consciousness in single neurons, no matter how hard you look. The reductive approach, which seeks to break down phenomena to their smallest parts, doesn’t work for everything. The clearest evidence of the limits of reductionism is quantum entanglement, or “spooky action at a distance,” the title phrase of Mr. Musser’s previous book. This is the phenomenon by which two particles appear to affect each other even though they are too far apart for any information to pass between them without exceeding the speed of light, a physical impossibility. No explanation of this oddity is possible if we focus reductively on the particles as discrete entities. Instead we have to see them as interrelated.

Consciousness, too, seems to depend upon patterns of interconnectedness. For a while now, researchers in artificial intelligence have realized that we can get nothing close to human reasoning from computers that follow only linear processes. AI took off when scientists started to create neural networks, in which processes are conducted in parallel, mimicking the brain’s capacity to run different processes at the same time in its many parts.

This insight led to what is currently the hottest theory in consciousness studies, integrated information theory, which holds that consciousness is essentially the result of information being kept in whole systems rather than in parts. Adherents even quantify the degree of this integration with the Greek letter phi, which, says Mr. Musser, “represents the amount of information that is held collectively in the network rather than stored in its individual elements.” The higher the value of phi, the more conscious the system is.

By the end of “Putting Ourselves Back in the Equation,” Carlo Rovelli emerges as the physicist who is on the hottest trail. For Mr. Rovelli, there are no independent, objective facts. Truth is always a matter of relations. We understand that New York is west of London but east of San Francisco. Mr. Rovelli argues that all physical properties are like this: Nothing is anything until it is related to something else. It’s an old idea, found in ancient Greek, Chinese and Indian philosophy, and scientists are discovering it anew.

Link to the rest at The Wall Street Journal

Witness to a Prosecution

From The Wall Street Journal:

In the popular perception of the typical white-collar case, a judicious government prosecutes a mendacious executive on a mountain of incontrovertible evidence. Think Bernie Madoff or Sam Bankman-Fried. Then there’s Michael Milken, the former “junk bond king” from the infamous “decade of greed.” If there were a Mount Rushmore of white-collar crime, all three men might have a place.

Thanks to Richard Sandler, however, you can now scratch one of those names off that list. In “Witness to a Prosecution,” Mr. Sandler, a childhood friend who was Mr. Milken’s personal lawyer at the time, walks the reader through Mr. Milken’s 30-plus-year legal odyssey, beginning in 1986 with the federal government’s investigation, followed by his indictment, plea bargain, and prison term, right through to his pardon by President Donald Trump in 2020. The author tells a convincing and concerning story of how the government targeted a largely innocent man and, when presented with proof of that innocence, refused to turn away from a bad case.

I have always been more than a bit skeptical about Mr. Sandler’s underlying thesis—and the thesis of many of Mr. Milken’s supporters on this page. After all, Mr. Milken served nearly two years in jail, pleaded guilty to six felonies and paid a large fortune to settle with the government.

I have also read books, chief among them James B. Stewart’s “Den of Thieves” (1991), that seem to make the case for Mr. Milken’s culpability—the methods he employed as head of Drexel Burnham Lambert’s high-yield department, the alleged epicenter of the destructive “leveraged buyout” mania of the 1980s that cratered companies and led to mass unemployment; his alliances with smarmy corporate raiders; his supposed insider trading with the notorious arbitrageur Ivan Boesky. The list goes on.

After reading Mr. Sandler’s account, I no longer believe in Mr. Milken’s guilt, and neither should you. The author argues that most of what we know about Mr. Milken’s misdeeds is grossly exaggerated, if not downright wrong. What the government was able to prove in the court of law, as opposed to the court of public opinion, were mere regulatory infractions: “aiding and abetting” a client’s failure to file an accurate stock-ownership form with the SEC, a violation of broker-dealer reporting requirements, assisting with the filing of a false tax return. There was no insider-trading charge involving Mr. Boesky or anyone else, because the feds couldn’t prove one.

The witnesses against Mr. Milken, among them Mr. Boesky, led investigators on a wild-goose chase that turned up relatively little. One key piece of evidence linking the two men: a $5.3 million payment to Drexel from Mr. Boesky for what turned out to be routine corporate finance work that the feds thought looked shady.

When you digest the reality of the case against Mr. Milken, you find that much of it was nonsense. As Mr. Sandler puts it: “The nature of prosecution and the technicality and uniqueness of the regulatory violations . . . certainly never would have been pursued had Michael not been so successful in disrupting the traditional way business was done on Wall Street.”

That gets us to why Mr. Milken was prosecuted so viciously. The lead prosecutor on the case, Rudy Giuliani, was the U.S. Attorney for the Southern District of New York. It’s hard to square the current Mr. Giuliani, fighting to keep his law license while being enmeshed in Mr. Trump’s election-denying imbroglio, with the man who was then the nation’s foremost crime fighter, taking on mobsters, corrupt politicians and those targeted as unscrupulous Wall Street financiers.

Mr. Giuliani’s ambition for political office—he would later become mayor of New York City—made Mr. Milken an enticing target, Mr. Sandler tells us. The author suggests that Mr. Giuliani made up for his weak legal case by crafting an image of the defendant as an evil bankster and feeding it through leaks to an all-too-compliant media. “Michael Milken became the subject of intensive newspaper articles, press leaks, rumors, and innuendo for years before he was charged with anything,” the author writes. “I am sure Giuliani and his team of prosecutors believed that Mike would succumb to the pressure and quickly settle and cooperate and implicate others. When this did not happen, the prosecutors became more committed to using their immense power to pressure Michael and try to win at all costs.”

Link to the rest at The Wall Street Journal

AAP’s September StatShot: US Book Market Up 0.8 Percent YTD

From Publishing Perspectives:

In its September 2023 StatShot report, released this morning (December 12), the Association of American Publishers (AAP) reports that total revenues across all categories were flat compared with September 2022, at US$1.4 billion.

The American market’s year-to-date revenues, the AAP reports, were up 0.8 percent at US$9.4 billion for the first nine months of the year.

As Katy Hershberger notes at Publishers Lunch today, children’s books continued to gain in September, up 5.2 percent over the same month in 2022, with sales this year reaching $272.8 million.

Publishing Perspectives readers know that the AAP’s numbers reflect reported revenue for tracked categories, including trade (consumer books), higher education course materials, and professional publishing.

. . . .

Trade Book Revenues

Year-Over-Year Numbers
Trade revenues were down 0.4 percent in September over the same month last year, at $905.9 million.

In print formats:

  • Hardback revenues were up 7.2 percent, coming in at $379 million
  • Paperbacks were down 4.9 percent, with $299.1 million in revenue
  • Mass market was down 39.5 percent to $11.3 million
  • Special bindings were up 11.8 percent, with $27.1 million in revenue

In digital formats:

  • Ebook revenues were down 1.8 percent for the month as compared to September 2022, for a total of $85.2 million
  • The closely monitored digital audio format was up 3.2 percent over September 2022, coming in at $69.9 million in revenue
  • Physical audio was down 24.4 percent, coming in at $1.2 million

Link to the rest at Publishing Perspectives

PG notes that the Association of American Publishers includes far more publishers than the large trade fiction and non-fiction publishers in New York City, the ones that the New York Times uses for its best-seller lists.

The AAP stats include educational publishers that provide textbooks for all levels of education in the US, as well as religious publishers and professional publishers serving the business, medical, legal, technical, and scientific markets.

Amsterdam’s Elsevier: Research and Real-World Impact

From Publishing Perspectives:

As we work to recap some of the relevant material released to the news media near the end of our publication year, we look now at two significant research reports from Elsevier, one on research evaluation and the other on real-world impact—both increasingly pressing interests in the world of academic publication.

In the 30-page report “Back to Earth: Landing Real-World Impact in Research Evaluation,” Elsevier surveyed 400 academic leaders, funders, and researchers in seven countries about real-world impact as part of academic evaluation. Key findings include:

  • Sixty-six percent of respondents say academia has a moral responsibility to incorporate real-world impact into standard research evaluation​
  • Seventy percent say they are passionate about research that has a positive real-world impact
  • Fifty-three percent say a more holistic approach to evaluation would improve research cost-effectiveness
  • Fifty-one percent of respondents identified at least one serious problem with current methods of research evaluation
  • In terms of barriers to change, 56 percent of those surveyed cited a “lack of common frameworks or methodologies,” while 48 percent cited a “lack of consensus on what constitutes impact”

It’s interesting to note some of the culture-to-culture differences in this report over how important it is for research “to aim for real-world impact.” Particularly during the coronavirus COVID-19 pandemic, there could hardly have been a time when the world’s need for the most sophisticated, committed, and efficient research was so obvious.

Nevertheless, the report’s data indicate that respondents on this point came in on the affirmative side (yes, research should aim for real-world impact) at rates ranging from a high of 93 percent in the United Kingdom to a low of 64 percent in the Netherlands, the home of the Elsevier report.

Another very interesting point in this report compares the views of funders with those of researchers.

While the funders surveyed seem to agree with researchers that more holistic approaches are important, the funders were more likely than the researchers to say that the current system creates vested interests.

And it’s the researchers who said they were more passionate than the funders about having “real-world impact as researchers and academic leaders.”

Topping the list of barriers funders cited to a more holistic form of research assessment was a lack of resources, at 53 percent, tied with a lack of consensus on what actually constitutes impact, also at 53 percent.

Also cited heavily was the lack of a common framework or methodology for holistic assessment of research’s impact, at 49 percent. Another tie came next: two further barriers, “achieving sufficient alignment between different actors” and “complexity,” were each named by 38 percent of respondents.

Link to the rest at Publishing Perspectives

The Quest for Cather

From The American Scholar:

Willa Cather loathed biographers, professors, and autograph fiends. After her war novel, One of Ours, won the Pulitzer in 1923, she decided to cull the herd. “This is not a case for the Federal Bureau of Investigation,” she told one researcher. Burn my letters and manuscripts, she begged her friends. Hollywood filmed a loose adaptation of A Lost Lady, starring Barbara Stanwyck, in 1934, and Cather soon forbade any further screen, radio, and television versions of her work. No direct quotations from surviving correspondence, she ordered libraries, and for decades a family trust enforced her commands.

Archival scholars managed to undermine what her major biographer James Woodress called “the traps, pitfalls and barricades she placed in the biographer’s path,” even as literary critics reveled in trench warfare over Cather’s sexuality. In 2018, her letters finally entered the public domain, allowing Benjamin Taylor to create the first post-ban life of Cather for general readers.

Chasing Bright Medusas is timed for the 150th anniversary of Cather’s birth in Virginia. The title alludes to her 1920 story collection on art’s perils, Youth and the Bright Medusa. (“It is strange to come at last to write with calm enjoyment,” she told a college friend. “But Lord—what a lot of life one uses up chasing ‘bright Medusas,’ doesn’t one?”) Soon she urged modern writers to toss the furniture of naturalism out the window, making room for atmosphere and emotion. What she wanted was the unfurnished novel, or “novel démeublé,” as she called it in a 1922 essay. Paraphrasing Dumas, she posited that “to make a drama, all you need is one passion, and four walls.”

Chasing Bright Medusas is an appreciation, a fan’s notes, a life démeublé. Taylor’s love of Cather’s sublime prose is evident and endearing, but in his telling of her life, context is sometimes defenestrated, too. When Taylor sets an idealistic Cather against cynical younger male rivals, we learn of Ernest Hemingway’s mockery but not of William Faulkner’s declaration that the greatest American novelists were Herman Melville, Theodore Dreiser, Hemingway, and Cather. Taylor rightly notes that Cather’s First World War novel differs from Hemingway’s merely in tone, not understanding; A Farewell to Arms and One of Ours, he writes, “ask to be read side by side.”

. . . .

Not much Dark Cather surfaces in Bright Medusas—a pity, for she was a genius of horror. My Ántonia brims with bizarre ways to die (thrown to ravening wolves, suicide by threshing machine); carp devour a little girl in Shadows on the Rock; and One of Ours rivals Cormac McCarthy for mutilation and gore. Cather repeatedly changed her name and lied about her birthdate, as a professional time traveler must, on the page and in life. She saw her mother weep for a Confederate brother lost at Manassas, rode the Plains six years after Little Bighorn, was the first woman to receive an honorary degree from severely masculinist Princeton, and though in failing health, spent World War II writing back to GIs who read her work in Armed Services Editions paperbacks, especially the ones who picked up Death Comes for the Archbishop, assuming it was a murder mystery. She died the same spring that Elton John and Kareem Abdul-Jabbar and David Letterman were born. She is one of ours.

Link to the rest at The American Scholar

How to Interpret the Constitution

From The Wall Street Journal:

It is a testament to our nation’s commitment to the rule of law that, more than two centuries after its ratification, Americans still argue about the Constitution. And as often as we argue about the outcomes of controversial hot-button constitutional cases, we argue about the methodologies that lead judges to make their rulings.

Today originalism—the idea that constitutional meaning should be considered as being fixed at the time of enactment—is the dominant judicial philosophy, thanks in part to decades of persuasive arguments put forward by conservative and libertarian lawyers and scholars. But there are different flavors of originalism, corresponding to different understandings of “original meaning”—the framers’ intent, a provision’s “public meaning,” or its expected application, to name a few—and various liberal lawyers and politicians propose their own interpretative methods, originalist or otherwise.

Cass Sunstein, a Harvard Law School professor, has written “How to Interpret the Constitution,” a clear, accessible survey that discusses originalist interpretations alongside their competitors. Among those nonoriginalist approaches are John Hart Ely’s argument for democracy-enforcing judicial review, Ronald Dworkin’s moral reading of the Constitution and James Bradley Thayer’s advocacy of extreme judicial restraint. Those are all varieties of what has been called “living constitutionalism”; they all allow that the Constitution may evolve in meaning without being amended.

Mr. Sunstein argues repeatedly that the Constitution “does not contain the instructions for its own interpretation.” To some degree, this is true: Though statutes often include definitions in their wording, the Constitution, for the most part, does not. For example, the first words of the First Amendment read: “Congress shall make no law respecting an establishment of religion.” None of the words “respecting,” “establishment,” or “religion” is set out with a clear definition, and the Supreme Court has entertained many possible meanings for each over the course of American history.

There is also no explicit constitutional command, in the text of the Constitution or in the works of the Founders, that those tasked with interpreting it must follow any particular method, either originalist or living-constitutionalist. “The idea of interpretation is capacious,” Mr. Sunstein writes. He therefore proposes his own first principle for choosing among methods: “Any particular approach to the Constitution must be defended on the ground that it makes the relevant constitutional order better rather than worse.”

Originalists propose that we resolve constitutional ambiguities by unearthing the law’s true and unchanged meaning. Mr. Sunstein, by contrast, proposes that judges and other constitutional interpreters rely on their “firm intuitions” to determine which constitutional rules are desirable and then see what theories might yield those rules. To do so, the author borrows from John Rawls, the giant of 20th-century liberal political theory, to endorse a methodology of “reflective equilibrium,” in which “our moral and political judgments line up with one another, do not contradict each other, and support one another.”

“In deciding how to interpret the Constitution,” Mr. Sunstein writes, “we have to think about what we are most firmly committed to, and about what we are least likely to be willing to give up.” He reveals how he would apply this methodology in his own case by listing his 10 “fixed points,” or constitutional outcomes that resonate with his own sense of rightness and justice. They are “clearly correct” propositions in the author’s view and include the contentions that “the Constitution does not forbid maximum hour and minimum wage laws” and that “the First Amendment should be understood to give very broad protection to political speech.” Of course, you might believe exactly the opposite. That, to Mr. Sunstein, is equally legitimate. One begins to wonder at this point how much “interpretation” exactly is going on.

Consider Mr. Sunstein’s claim that judges and justices should interpret laws in a manner that improves the constitutional order. Why shouldn’t we just allow legislators, who figure nowhere in Mr. Sunstein’s philosophy, to make legitimate changes to legislation when needed? We have mechanisms for improving our nation’s laws, and we have one for improving our Constitution. The Republicans who revamped our constitutional system in the aftermath of the Civil War by devising the Reconstruction Amendments—banning slavery, guaranteeing the equal protection of the law and enforcing individual rights against the states—understood that they couldn’t simply project their moral and political views onto the unamended law. They had to change the Constitution.

Like most nonoriginalists, Mr. Sunstein evades the key insight that gives originalism its appeal. It begins with a phrase from the Constitution that refutes Mr. Sunstein’s premise that the document doesn’t contain instructions for its own interpretation. “This Constitution,” it proclaims, “shall be the supreme law of the land.” The Constitution is a legal document, even if its provisions are sometimes more ambiguous at first glance than we would want a law to be. And laws have the crucial characteristic sometimes known as “fixation”: They remain unchanged until changed by authorized means. Constitutional interpretation must be constrained by this basic principle of legal reasoning.

Link to the rest at The Wall Street Journal (Sorry if you encounter a paywall)