I’d just given a reading in Amsterdam after which the gracious hosts of the evening took me out for drinks. Three young women asked me questions about sex and love and desire as though I were an expert and it was nice but I was tired and unused to being considered an expert in anything but panic.
I thanked the hosts and slipped out. I’d always wanted to visit Amsterdam and I had only two nights. I wanted to walk the streets alone. I wanted to walk across the bridges and look at the waving water and look inside the windows of the closed shops. I wanted to find the loveliest cafe and mark it for the morning. I wanted to eat bitterballen and wash them down with stroopwafel. And I wanted to get high.
The streets were dark with rain. I found a deli. It wasn’t one of the coffeeshops with the meticulously bagged furry sativa. This was just a deli, cartons of milk, packs of gum. Before leaving I bought one large plastic tub of marijuana brownies. It seemed wasteful not to, and the man assured me I absolutely could take the brownies on my flight to Romania early the next morning. OK yes why not yes yes is OK yes. He was equal parts aloof and confident and not understanding what I was saying. So it felt right.
In the hour that followed I held the joint with one hand and a broken umbrella with the other. I walked and smoked and the cherry kept going out on the joint and I didn’t have a lighter and so twice I stopped to ask strangers for a light and tried to balance the umbrella and the joint and the unwieldy weight of my embarrassment. I got so high that I didn’t feel panic about my imminent flight. I got so high that I didn’t get lost. I found my pretty hotel but had gotten so high that I forgot my four-year-old daughter was sleeping in a room upstairs.
Hang on now. Her father was in the room with her. But I almost forgot I was a mother. But that’s not it. I forgot enough about my panic that I wasn’t acting like the neurotic mother that I am. I rarely drink and when I do, I don’t drink much. So that getting high (so high) felt like a real breach. I got so high that I didn’t care that I got so high.
To some (or many!) I’m sure I would be considered in that moment (or many!) a bad mother. I know it for a fact because I spoke to hundreds of women for my book – many of them mothers – and they all had at some point been called “bad”. Many of them believed it to the extent that they felt they weren’t good enough for their children.
One of the women I spoke to was a talented musician. She told me that the only one of her singles that underperformed told the story of a bad mother. It was one of her favourite songs, but she had to stop singing it at concerts because she would receive death threats on Twitter. One listener threatened to kidnap her child, because she was too bad a mother to keep her.
For a long time, I felt like I had been failed by publishing. After a diagnosis of Asperger’s Syndrome – now Autism Spectrum Disorder (or ASD) – in 2015, I set out to learn more about my new ‘label’, and what it meant to me. Recommendations included looking to TV, because characters such as Sheldon Cooper in “The Big Bang Theory” were ostensibly ‘good’ representation. I couldn’t relate. Frustrated, I turned to books, expecting someone, somewhere, to have written about my experience. There was very little that was supportive, or even relevant, to me.
It’s good to see that this is changing at long last, although publishing still has a long way to go to plug the gap. Despite Autism Spectrum Disorder being exactly that – a spectrum! – there remains a lack of nuance in books that touch on the varied experiences of people with ASD.
Take the recent backlashes around books about Autism. To Siri With Love – Judith Newman’s recent memoir about her Autistic son – may have been met with huge praise, but Autistic individuals shot back. Accusations of eugenics and ableism abounded – Kaelan Rhywiol summarised the objections in a piece for Bustle – along with a ‘Twitterstorm’, complete with the hashtag #BoycottToSiri. The author responded that she had not written the book for an Autistic audience.
And this year, my Instagram feed flooded with petitions calling for the removal of I Wish My Kids Had Cancer by Michael Alan – a book that appeared to equate Autism with cancer. Enough said.
. . . .
There are also a lot of books about parenting – but they are written by parents not on the spectrum. Spectrum Women: Autism & Parenting is out next month – and, so far, has been seen as a ‘revelation’. Why? Because it is written by people on the Autistic spectrum! As the saying goes, ‘nothing about us, without us’ – and this should apply to books about parenting Autistic children. It’s good to have books that are almost like textbooks – but they are not necessarily the real, lived experience of being on the spectrum. They miss the colour, the humanity. And that, I think, could often be said when someone not on the spectrum writes about being Autistic.
. . . .
Stim: An Autistic Anthology was released earlier this year. Edited by Lizzie Huxley-Jones, this book was notable for giving free rein to the Autistic contributors. Essays, art, even fiction – not necessarily about Autism! – made this book a stand-out tome in its niche. It’s refreshing to read, offering a range of non-neurotypical perspectives.
Illustrator Megan Rhiannon has also released Existing Autistic – a self-published, illustrated book that contains information about functioning labels, sensory overload, and more. It has been received with thunderous applause – Rhiannon has had to restock it at least once since its release.
“‘Hotwife’ Pornographer Gulls Harvard Prof With ‘Wife of Jesus’ Hoax.” The headlines could have been worse for Karen King, the Hollis Professor of Divinity at Harvard University. But not much worse.
The first line of act I of “Veritas,” Ariel Sabar’s mesmerizing five-act real-life melodrama, is “Dr. Karen Leigh King had reached the summit of her field as a dazzling interpreter of condemned scripture.” We join Ms. King at the apex of her career, her September 2012 unveiling of the “Gospel of Jesus’ Wife” at the International Congress of Coptic Studies, held a stone’s throw from the Vatican in Rome. Speaking to three dozen colleagues, Ms. King describes the tiny papyrus fragment that had come into her possession, lingering over its fateful line 4: “Jesus said to them, ‘My wife . . .’ ”
This little snippet, Ms. King claimed, “leads us to . . . completely re-evaluate the way in which Christianity looks at sexuality and at marriage.” Ms. King considered calling the bit of papyrus the “Mary Fragment” but chose to call it a “Gospel”—“something that will stick,” she later explained. From some 30 Coptic words spread across eight discontinuous lines, Mr. Sabar writes, Ms. King had “alchemized . . . [the] case for a thoroughgoing Gospel of Jesus’s Wife.”
A married Jesus would turn the Catholic Church on its head. The papyrus hinted at a wife named Mary, presumed to be Mary Magdalene, painted as a prostitute by Pope Gregory the Great in the sixth century. The New Testament, however, never mentions a marriage, other than in references to the Church or holy Jerusalem as Christ’s spiritual bride. Christ’s purported bachelorhood undergirds the Catholic doctrine of priestly celibacy. If the papyrus accurately described a wife of Christ, “this means that the whole Catholic claim of a celibate priesthood based on Jesus’s celibacy has no historical foundation,” noted Ms. King, a feminist scholar and expert on the apocryphal, second-century Gospel of Mary.
The papyrus presented problems from the start. Before the Rome event, two of the three anonymous peer reviewers retained by the Harvard Theological Review suggested Ms. King’s fragment might be a fake—although none of the scholars assembled in Rome knew that. Ms. King’s reviewers examined only a digital photograph of the “Gospel,” and “something felt off,” Mr. Sabar reports. One expert said the script “looked like twenty-first-century handwriting.” On closer inspection, small imperfections manifested themselves: missing characters and the “grammatical monstrosity” of an impossible double conjugation.
Brown University Egyptologist Leo Depuydt called the papyrus’s grammar a “colossal double blunder,” arguing that its creator was less likely to have been “a very incompetent ancient scribe” than “a modern author who might have benefited from one more semester of Coptic.” Ms. King, who, Mr. Sabar reminds us, taught Coptic at Harvard, “had somehow failed to spot most of the text’s grammatical irregularities.”
The besieged Ms. King fought back. Nineteen months after the Rome reveal, the Theological Review published her article defending the fragment’s authenticity, backstopped by testing carried out at Harvard, Columbia and MIT. Harvard issued a triumphant press release: “Testing Indicates ‘Gospel of Jesus’s Wife’ Papyrus Fragment to Be Ancient.” Ms. King and the as-yet-unidentified owner of the fragment exchanged a sigh of relief. The lab tests, he emailed her, served as “the ultimate confirmation for me that we’ve been right all along, confirming again what has been obvious from day one.”
. . . .
Mr. Sabar doesn’t name the purported papyrus pusher until page 162, and the man’s bona fides seem quite unusual indeed. A former student in Egyptology at Berlin’s Free University, Walter Fritz briefly directed the Stasi Museum, housed in the former headquarters of the notorious East German secret police, before moving to Florida, becoming a pornographer, and carving out a name for himself and his wife in the state’s “vibrant swingers’ community.” Mr. Fritz is the proverbial man of many parts; one wonders why a prodigious researcher like Ms. King didn’t perform a few more Google searches or place some phone calls before dynamiting 2,000 years of patriarchal tradition on the basis of his sketchy offering.
As the reader moves through acts III and IV, Mr. Sabar continues to tantalize us. It is curious, we learn, that Ms. King had urged the Theological Review to scotch a dissenting article by Mr. Depuydt that was printed in the 2014 issue devoted to the papyrus. It is equally curious that the Columbia and MIT “authentications” of the fragment were performed by scholars with “close personal ties” to Ms. King and to one of her key allies. The MIT man was the son of a family friend and “an expert in explosives detection.” The ink analyst from Columbia “had no experience with ancient objects.” Oops.
. . . .
“[Ms. King’s] ideological commitments were choreographing her practice of history,” Mr. Sabar writes. “The story came first; the dates managed after. The narrative before the evidence; the news conference before the scientific analysis.”
Oryx, the international journal of conservation published by Cambridge University Press, is to become Open Access from January next year, in a move made possible by a grant from The Rufford Foundation.
From January 2021, the journal – the world’s longest-running conservation journal – will be free to anyone with an internet connection. Past content dating as far back as 1950 will be made freely available, as will all new research, which will be published Open Access from next year. Meanwhile, unfunded authors will benefit from a new APC (article processing charge) waiver policy, also thanks to The Rufford Foundation, which is dedicated to nature conservation.
CUP publishes the journal on behalf of wildlife conservation charity Fauna & Flora International, and it is billed as the “go-to publication for anyone interested in biodiversity conservation, conservation policy and related social, economic and political issues”.
Editor Dr Martin Fisher, who has overseen Oryx for almost 20 years, said: “This is the most significant development in the journal’s eminent history. Thanks to the support of the Rufford Foundation and Cambridge University Press’ commitment to Open Access publishing, the research published in Oryx will be freely accessible to all readers, no matter where they live or work.”
The public image of the robber barons has always been a barometer of how America thinks about wealth. Were they financiers or swindlers? Builders or monopolists? In the Progressive Era, the muckraker Ida Tarbell cast John D. Rockefeller as a ruthless monopolist, and Matthew Josephson’s compelling but one-sided Depression-era tome, “The Robber Barons,” scorched the lot of them. In recent decades serious biographers have reappraised the turn-of-the-century moguls and found more to like. Could the wheel be poised to turn again? With inequality considered a public enemy, the time might be ripe for a reappraisal.
In “Iron Empires: Robber Barons, Railroads, and the Making of Modern America,” Michael Hiltzik pokes among the ghostly bones of tycoons past but doesn’t generally offer a new interpretation. Mr. Hiltzik, a journalist with the Los Angeles Times, presents a colorful cast in conventional terms. Once again we hear that the railroad barons made money by watering stock (meaning that they inflated it, as in watering cattle), swindling rivals, buying judges and milking decrepit properties.
Daniel Drew was vaunted for his ability to manipulate shares. He played with the float of the Erie Railroad “with the ease of a child inflating and deflating a toy balloon,” as Mr. Hiltzik puts it. For unalloyed greed, it was hard to top George Pullman, the sleeping-car magnate, who set up a “beautiful” company town but gouged his underpaid workers on rent. When a depression set in, Pullman cut wages 30%; somehow the workers’ rent was unaffected, while corporate dividends rose.
Labor unions objected, but they were weak. The most serious threat emanated from the barons themselves, who recklessly overbuilt. The finagling Jay Gould ruined the Union Pacific, the Civil War-era trunk line chartered and subsidized by Congress. Talk about a fiduciary! Gould thought nothing of (a) destroying the company with an expensive acquisition while (b) personally wheeling and dealing in the stock while also (c) serving on the board. The result was redundant tracks, ruinous rate-cutting and repeated waves of bankruptcy. Mr. Hiltzik likens the barons to rival duchies in Napoleonic Europe, jousting and plotting but incapable of asserting order.
PG is reading The Transcontinental Railroad, which covers a time a bit earlier than the books mentioned in the OP, and enjoying it greatly. The politics of the period after the Civil War were corrupt and rough-and-tumble, and it was amazing when anything involving the federal government actually worked out.
Just after Christmas in 1831, the British Empire’s wealthiest island exploded. “Five weeks of burning, looting, crop destruction, courts-martial, on-the-spot executions, severed heads mounted atop poles, and outright human hunting for sport . . . shook slaveholding Jamaica to its foundations.” So writes Tom Zoellner, a professor at Chapman University, in “Island on Fire,” a pounding narrative of events that led to the end of slavery in the British colonies. “Soon the hills were on fire, each spiky leaf of sugar like a small torch or match head. Millions of yellow, flaming pinpricks spread in all directions in the velvety Caribbean night.”
Hundreds of slaves, having been pushed beyond endurance, attacked hated overseers and their masters’ property. “We have worked enough already, and will work no more,” striking laborers told a pair of plantation owners. “The life we live is too bad; it is the life of a dog.” In all, 145 estate houses were destroyed and many others severely damaged. Mr. Zoellner’s vigorous, fast-paced account brings to life a varied gallery of participants, black, white and “colored”— the then-standard designation for quasi-free people of mixed race.
Among these figures are Richard Barrett, one of the island’s richest sugar growers and a relative of the poet Elizabeth Barrett Browning, who passed for a moderate in the island’s reactionary society; the remarkable, precariously positioned “colored” newspaper editor Edward Jordon, who had only gained full civil rights the previous year; and the revolt’s tragic central figure, an enslaved Baptist deacon named Samuel Sharpe. An apparently gifted speaker, Sharpe preached the equality of man based on the teachings of the Bible. He also believed inaccurate rumors that the king had already declared slaves free but that their masters were keeping the news a secret. In response, Sharpe surreptitiously planned a peaceful work stoppage. He may have ultimately hoped for the establishment of an independent republic similar to the one that had come into being a generation earlier in Haiti. Whatever his intentions, the stoppage quickly spiraled beyond his control and into full rebellion.
The uprising was soon over, having been weakened by its poor organization and thwarted by the failure of the island’s 300,000 slaves to rise en masse. It was also overwhelmed by the firepower of British troops. Few whites were killed, but the colonial elite’s confidence in its ability to defend itself was deeply shaken. Hundreds of enslaved men and women were killed in battle or summarily executed, some simply because they had attended a Baptist meeting. The exact number is unknown.
The revolt failed to improve conditions for the enslaved in Jamaica, but it crucially wounded the institution of slavery itself. Mr. Zoellner acknowledges that it was only one factor in the ending of slavery, along with surging abolitionism in Britain, an increasingly muscular reform movement in Parliament, and the falling price of sugar, the island’s only export crop. But the revolt, he says, “sent an unambiguous message to London that slavery was no longer sustainable—not economically, not militarily, and not morally.”
The challenge to slavery in Jamaica and the rest of Britain’s Caribbean possessions had been a long time coming. As Trevor Burnard, a professor at the University of Hull, amply shows in his expansive and scholarly “Jamaica in the Age of Revolution,” colonial Jamaica was characterized by extreme systemic violence against enslaved people. It was also ruled over by a dissolute planter class obsessed with short-term profits that made it cheaper to work slaves to death and buy new ones than to sustain them into their later, less productive years.
PG has read more than one article about slavery that has described the practice as a “uniquely American” or “peculiar” institution found only or almost-only in the United States.
This is, of course, not correct. Egypt, Babylon, Greece, and Rome each had large numbers of slaves. A great many Christians were enslaved during the Ottoman invasions of Europe. White slaves were common in Europe from the Dark Ages to the Middle Ages. China formally abolished slavery in 1909.
Serfs in feudal Europe were not personal property that could be bought and sold; rather, they were attached to the land. If a landowner sold a piece of land, the serfs living on it went with it. They were obligated to give a substantial portion of the fruits of their labors to the landowner and could be compelled to cultivate other land of his that was not occupied by serfs.
Russian serfdom was even more rigid.
From JSTOR Daily:
[Peter] Kolchin writes that the Russian nobles “invented many of the same kinds of racial arguments to defend serfdom that American slave-owners used to justify” slavery. Some nobles went so far as to say they had white bones, while the serfs had black bones. Kolchin calls this an “essentially racial argument in defense of serfdom, even though no racial distinction divided lord and peasant.”
Then there was the aristocratic paternalism of the arguments that bondage was a humane institution in comparison to the precariousness of the free labor market. Both Russians and Americans argued that their systems of bondage resulted in a superior society.
Kolchin quotes American slave-advocates who argued that the race of slaves was actually immaterial. Absent Africans, these defenders of American slavery said whites would do just as well as blacks. Because planters needed the support of non-slaveholding whites, however, such arguments never dominated the defense of slavery.
PG intends none of this to be any sort of excuse for or defense of slavery in any form or fashion. It is always and everywhere a despicable evil. However, unfortunately, while it has been an American evil, it has also been a British, Russian, Chinese, Arabian, etc., evil.
Since the world stopped traveling several months ago, I’ve been trying to find the way forward for my guidebooks. The future of travel is anything but clear: countries are tightening their borders, and some U.S. states that have reduced their coronavirus infection rates are restricting visitors from other states where the pandemic continues to wreak havoc.
I’m editor-in-chief for North America of 111 Places That You Must Not Miss, a guidebook series for locals and experienced travelers. Our books always sell best in the cities and regions they cover, and in many ways, our approach is well suited to a moment when people are planning staycations and local getaways.
But virtually all of our retail outlets were vastly diminished for several months. Bookstores, gift shops, and museum shops were closed due to the pandemic, and major e-commerce sites deprioritized books in order to direct resources toward shipping hand sanitizer, surface cleaners, and other crucial supplies.
My colleagues and I knew that our sales reports were going to be grim. But the difference between our figures for Q1 and Q2 2020 and those from the first half of last year is shocking nonetheless. When everyone went home in mid-March, we had several books either just released or on their way to our warehouse. It broke my heart to see talented, enthusiastic writers and photographers miss out on the once-in-a-lifetime experience of celebrating their first books and signing their first autographs. Normally, a book release is a time for parties with friends and family, book talks, TV and radio interviews, and a wonderful sense of accomplishment. But those events have been postponed.
Where our books once might have garnered good media placements, in recent months our press releases have yielded more out-of-office messages than interview requests. Many journalists have gotten sick or been laid off or furloughed, just like those in so many other professions.
With a release date Tuesday (July 28), Dare To Speak: Defending Free Speech for All is by Suzanne Nossel—currently the CEO of PEN America and previously COO of Human Rights Watch and executive director of Amnesty International USA.
Nossel enters this politically charged summer’s lineup in the right sector: nonfiction, and focused on the underlying issue behind “cancel culture.”
. . . .
In essence on the broader scale, however, much of the debate of the day revolves around what some perceive to be an “intolerant climate that has set in on all sides,” as it was described in the July 7 open letter published by Harper’s Magazine. Many leading authors, including Margaret Atwood, Anne Applebaum, JK Rowling, Salman Rushdie, and Khaled Khalifa, signed the letter, which was led by Thomas Chatterton Williams.
While praising the recent “powerful protests for racial and social justice,” the letter warned against letting “resistance harden into its own brand of dogma or coercion—which right-wing demagogues are already exploiting.” Those who signed the piece pointed to “greater risk aversion among writers, artists, and journalists who fear for their livelihoods if they depart from the consensus, or even lack sufficient zeal in agreement.” And that’s self-censorship, a real and present danger.
Another letter then followed at The Objective, this time from writers who cast the first piece’s signatories as a kind of establishment class of well-paid pundits who risk little in expressing their views with elite authority. “Under the guise of free speech and free exchange of ideas,” reads the second letter about the first, “the letter appears to be asking for unrestricted freedom to espouse their points of view free from consequence or criticism.
“There are only so many outlets, and while these individuals have the ability to write in them, they have no intention of sharing that space or acknowledging their role in perpetuating a culture of fear and silence among writers who, for the most part, do not look like the majority of the signatories.”
It’s a thorny discussion, with “cancel culture” veering close to “political correctness” for some—and not for others—and one that Nossel seems to approach in her book, described as being about “a time when free speech is often pitted against other progressive axioms—namely diversity and equality.”
Needless to say, the speed and force with which social media polarization can gather around any comment, image, or concept is its own dilemma, one that often hustles engaged users past what should have been a period of review, thought, assessment, and decision, and deposits everyone in a heap of knee-jerk responses to only partially understood points.
If anything, the industry and culture of books is in a good position to demonstrate through its own output the value of the freedom to publish. And however important it is to speed up traditional publishing amid the pace of contemporary debate, the inherent requirements of book preparation lie on the side of clearer thought and patient retort.
. . . .
In her forthcoming book, Nossel, according to Harper’s promotional material, “warns against the increasingly fashionable embrace of expanded government and corporate controls over speech, warning that such strictures can reinforce the marginalization of lesser-heard voices. She argues that creating an open market of ideas demands aggressive steps to remedy exclusion and ensure equal participation.”
. . . .
And when it comes to authoritarianism and its dangers, Anne Applebaum’s just-released (July 21) Twilight of Democracy: The Seductive Lure of Authoritarianism is so clear in its picture of a fast evolution of otherwise liberal thinkers into “closet authoritarians” that she’s able to look at federal police actions during protests in cities like Portland and describe them as “performative authoritarianism”—a kind of pageant of tyrannical intervention, staged to gain favor with a political base.
What Applebaum wants to tell you is, “Given the right conditions, any society can turn against democracy.”
. . . .
“The rules that will govern speech in the 21st century are being written right now, formally and informally. European countries are experimenting with new constraints on speech, some of which would be unconstitutional in the United States, and others of which may warrant close scrutiny.
“Almost daily, social media companies roll out new guidelines and rule changes governing their platforms. Young people are forging new norms for discussing race, sex, and gender identity.
“Those who remain silent in the face of these debates cede the ground to those with the most extreme views and most self-serving motivations.”
. . . .
As more political releases line up in coronavirus-delayed release dates this year, watch for more sales action among them—and watch for the question of how comfortably they and their authors and publishers share the shelves of late summer. These questions of free expression and actual tolerance of it are headed for more stress tests, and soon.
PG didn’t plan for this post to appear next to the one about Cuba he put up earlier, but the juxtaposition of the two is a warning to one and all about the crucial importance of free speech—and about what happens when any individual, group or government punishes those whose opinions differ from those currently “approved,” so that no one is able to speak the truth as he/she sees it.
Note: As mentioned, the book discussed is not available for preview until July 28, 2020, when it will be released. However, it is available for pre-order on Amazon and, if you are reading this post on July 28 or thereafter, it will be immediately available and the Preview button below should work (although this function is controlled by the book’s publisher).
In 1976, Cary, a young Afro-Cuban woman, sailed out of Havana Harbor with 2,000 fellow students bound for the Soviet Union. They were thrilled to be going to the land that led the global communist movement, where they would earn university degrees that would help them build the Cuba of their dreams. But instead of discovering the thriving revolutionary society she’d heard about from regime officials in Cuba, Cary found herself surrounded by politically apathetic Russian classmates. They laughed at her for wanting to go to the May Day parade—the International Workers’ Day celebration. She realized: “These Russians don’t think the way we do.”
By the end of Anthony DePalma’s remarkable book “The Cubans: Ordinary Lives in Extraordinary Times,” Cary’s faith in the Cuban Revolution and the classless, race-blind system it promised has vanished. She’s gone from a rising star in the Communist Party—she returned from the Soviet Union as an economist and rose to become vice minister of light industry with her own car and driver—to a struggling business owner hunting for needles and thread. Cuba, as the author explains, permits some private businesses to operate, but restricts their growth and the accumulation of wealth. Mr. DePalma gives us an unforgettable analogy that sums up Cary’s plight. Cuba, he says, “is toying with capitalism the way a tiger plays with its prey: tapping it lightly one minute, squeezing the life out of it the next.”
Cary, who has resigned from her state post and redirected her analytical skills toward dressmaking, is one of the intrepid Cubans who opened up to Mr. DePalma, a veteran foreign correspondent, as he set out to capture “a more profound truth” about their country. The author accomplishes this by taking us behind the romantic veil that hides the day-to-day experiences of ordinary Cubans. Their voices are rarely heard, he believes, outside of Cuba and especially inside, where the government represses independent journalists and jails people who criticize it on social media.
Mr. DePalma plunges us into the lives of a diverse group of Cubans living in Guanabacoa, a 500-year-old township across the harbor from Havana. Some pray in Catholic churches. Some follow Santería. His five primary subjects are men and women in their 50s, 60s and 70s, busy with work, caring for their aging parents, and helping their grown children and young grandchildren. Lili remains a dedicated communist even after the government refuses to help her care for her dying father. His dementia and wanderings drive her to keep him in a locked closet, where he spends the last months of his life. Jorge, now 73 and living in the United States, still fights for justice for the 14 members of his family who drowned in 1994 when they tried to escape Cuba on a tugboat: Witnesses said the Cuban government rammed and sank the boat. Cuba denied any responsibility, but Amnesty International’s investigation, in 1997, concluded that the 37 men, women and children who died were “victims of extrajudicial execution.”
Younger family members become part of the narrative, offering their own stories as the elders share theirs with an astounding—maybe even dangerous—level of candor. Mr. DePalma walks with them through Guanabacoa, sits in their kitchens and workshops, and shows us what he calls “the gritty 3-D reality most Cubans live with—broken streets, collapsing buildings, more garbage than flowers. Hot. Smelly. Noisy. Raw.” As we learn about the Cubans’ triumphs and failures over the six decades since the revolution, Mr. DePalma weaves in the major events in Cuba’s history—from the wars of independence with Spain in the 1800s, to the recent ascent of Miguel Díaz-Canel to the presidency. One Cuban describes Mr. Díaz-Canel—the first non-Castro to lead post-revolutionary Cuba—with a popular expression: same dog, different collar.
. . . .
Some of the most harrowing stories take place during the so-called special period of the ’90s, when the Soviet Union fell and its subsidies to Cuba vanished. Cubans’ struggle for food and consumer goods went from difficult to desperate. They were forced to shred blankets, season and fry the material, and stuff it into bread for sandwiches. They pilfered industrial chemicals from factories and concocted household soaps and detergents. Women made hair dye with the black paste inside batteries.
. . . .
The neighborhood committees that the Castro government installed on almost every block in 1960 still serve as the government’s watchdogs. Denunciations are a permanent threat, since many Cubans, in order to survive, end up breaking laws restricting private sales. And finding allies can’t be easy when, Mr. DePalma writes, “everybody in Cuba, at one time or another, suspected nearly everyone else of being an informer.”
But the biggest obstacle for would-be protesters might be time. The daily quest for food and basic supplies—from eggs to bedsheets—seems to demand every ounce of the families’ energy and creativity. Mr. DePalma believes that Cubans are “cursed by their own greatest strength—their indomitable adaptability.” Their inventive resilience has a downside. And it may be why Cuba is embargo-proof: “People who can turn a plastic soda bottle into a gas tank for a motorcycle . . . see the world differently from other more conventional societies.”
There was a black cloud, and hard rain. The puddles were yellow and green, like someone had poured paint into them. They said it was dust from the flowers. Grandma made us stay in the cellar. She got down on her knees and prayed. And she taught us, too. “Pray! It’s the end of the world. It’s God’s punishment for our sins.” My brother was eight and I was six. We started remembering our sins. He broke the glass can with the raspberry jam, and I didn’t tell my mom that I’d got my new dress caught on a fence and it ripped. I hid it in the closet.
Soldiers came for us in cars. I thought the war had started. They were saying these things: “deactivation,” “isotopes.” One soldier was chasing after a cat. The dosimeter was working on the cat like an automatic: click, click. A boy and a girl were chasing the cat, too. The boy was all right, but the girl kept crying, “I won’t give him up!” She was yelling: “Run away, run little girl!” But the soldier had a big plastic bag.
I heard – the adults were talking – Grandma was crying – since the year I was born, there haven't been any boys or girls born in our village. I'm the only one. The doctors said I couldn't be born. But my mom ran away from the hospital and hid at Grandma's. So I was born at Grandma's. I heard them talking about it.
I don’t have a brother or sister. I want one.
Tell me, lady, how could it be that I wouldn’t be born? Where would I be? High in the sky? On another planet?
The sparrows disappeared from our town in the first year after the accident. They were lying around everywhere – in the yards, on the asphalt. They’d be raked up and taken away in the containers with the leaves. They didn’t let people burn the leaves that year, because they were radioactive, so they buried the leaves.
The sparrows came back two years later. We were so happy, we were calling to each other: “I saw a sparrow yesterday! They’re back.”
The May bugs also disappeared, and they haven’t come back. Maybe they’ll come back in a hundred years or a thousand. That’s what our teacher says. I won’t see them.
September first, the first day of school, and there wasn’t a single flower. The flowers were radioactive. Before the beginning of the year, the people working weren’t masons, like before, but soldiers. They mowed the flowers, took off the earth and took it away somewhere in cars with trailers.
In a year they evacuated all of us and buried the village. My father's a cab driver, he drove there and told us about it. First they'd dig a big pit in the ground, five meters deep. Then the firemen would come up and use their hoses to wash the house from its roof to its foundation, so that no radioactive dust gets kicked up. They wash the windows, the roof, the door, all of it. Then a crane drags the house from its spot and puts it down into the pit. There's dolls and books and cans all scattered around. The excavator picks them up. Then it covers everything with sand and clay, leveling it. And then instead of a village, you have an empty field. They sowed our land with corn. Our house is lying there, and our school and our village council office. My plants are there and two albums of stamps, I was hoping to bring them with me. Also I had a bike.
I’m twelve years old and I’m an invalid. The mailman brings two pension checks to our house – for me and my granddad. When the girls in my class found out that I had cancer of the blood, they were afraid to sit next to me. They didn’t want to touch me.
The doctors said that I got sick because my father worked at Chernobyl. And after that I was born. I love my father.
They came for my father at night. I didn't hear how he got picked up, I was asleep. In the morning I saw my mother was crying. She said, "Papa's in Chernobyl now."
We waited for him like he was at the war.
He came back and started going to the factory again. He didn’t tell us anything. At school I bragged to everyone that my father just came back from Chernobyl, that he was a liquidator, and the liquidators were the ones who helped clean up after the accident. They were heroes. All the boys were jealous.
A year later he got sick.
We walked around in the hospital courtyard – this was after his second operation – and that was the first time he told me about Chernobyl.
They worked pretty close to the reactor. It was quiet and peaceful and pretty, he said. And as they’re working, things are happening. The gardens are blooming. For who? The people have left the villages. They “cleaned” the things that needed to be left behind. They took off the topsoil that had been contaminated by cesium and strontium, and they washed the roofs. The next day everything would be “clicking” on the dosimeters again.
“In parting they shook our hands and gave us certificates of gratitude for our self-sacrifice.” He talked and talked. The last time he came back from the hospital, he said: “If I stay alive, no more physics or chemistry for me. I’ll leave the factory. I’ll become a shepherd.” My mom and I are alone now. I won’t go to the technical institute, even though she wants me to. That’s where my dad went.
I used to write poems. I was in love with a girl. In fifth grade. In seventh grade I found out about death.
I read in Garcia Lorca: “the cry’s black root.” I began to learn how to fly. I don’t like playing that game, but what can you do?
I had a friend, Andrei. They did two operations on him and then sent him home. Six months later he was supposed to get a third operation. He hanged himself from his belt, in an empty classroom, when everyone else had gone to gym class. The doctors had said he wasn't allowed to run or jump.
Yulia, Katya, Vadim, Oksana, Oleg, and now Andrei. “We’ll die, and then we’ll become science,” Andrei used to say. “We’ll die and everyone will forget us,” Katya said. “When I die, don’t bury me at the cemetery, I’m afraid of the cemetery, there are only dead people and crows there,” said Oksana. “Bury me in the field.” Yulia used to just cry. The whole sky is alive for me now when I look at it, because they’re all there.
When a scientific paradigm breaks down, scientists need to make a leap into the unknown. These are moments of revolution, as identified by Thomas Kuhn in the 1960s, when the scientists' worldview becomes untenable and the agreed-upon and accepted truths of a particular discipline are radically called into question. Beloved theories are revealed to have been built upon sand. Explanations that held up for hundreds of years are now dismissed. A particular and productive way of looking at the world turns out to be erroneous in its essentials. The great scientific revolutions – such as those instigated by Copernicus, Galileo, Newton, Lavoisier, Einstein and Wegener – are times of great uncertainty, when cool, disinterested reason alone doesn't help scientists move forward because so many of their usual assumptions about how their scientific discipline is done turn out to be flawed. So they need to make a leap, not knowing where they will land. But how?
To explain how scientists are able to make this leap, the philosopher of science Bas van Fraassen in The Empirical Stance (2002) drew on Jean-Paul Sartre’s Sketch for a Theory of the Emotions (1939). Sartre was dissatisfied with the major mid-20th-century theories about emotions (especially those by William James and Sigmund Freud) that treated emotions as mere passive states. You might fall in love, or be gripped with jealousy. It seemed that emotions happened to you without any agency on your part. Sartre, by contrast, held that emotions are things that we do. They have a purpose, and they are intentional. For example, when we get angry, we do so to seek a solution, to resolve a tense situation. Sartre wrote:
When the paths before us become too difficult, or when we cannot see our way, we can no longer put up with such an exacting and difficult world. All ways are barred and nevertheless we must act. So then we try to change the world.
The world that Sartre referred to is the world of our subjective experience. It is the world of our needs, our wants, our fears and our hopes. In his view, emotions transform the world like magic. A magical act, such as voodoo, alters the attitude of the practitioner to the world. Magical spells and incantations don’t change the physical environment, but they change our world, by shifting our desires and hopes. Similarly, emotions change our outlook and how we engage with the world. Take Sartre’s example of sour grapes: seeing that the grapes are unreachable, you decide, ‘they are too sour anyway’. Though you didn’t change the chemical property of the grapes in any way, the world has become a bit more bearable. Anticipating contemporary ideas about embodied cognition, Sartre speculated that physical actions help us to produce emotions. We clench our fists in anger. We weep in sadness.
Applying this idea to scientific practice, Van Fraassen argues that scientists draw on their emotions when dealing with new, bewildering ideas, especially those that sprout up during scientific revolutions. If the paradigm is faltering, scientists need to change the way they view the world – and this requires that they change themselves. Scientists need to transform both who they are and what they know. Only once scientists themselves are transformed in this way can they accept a theory that they originally thought outlandish or ridiculous.
There are a few problems with this theory. Van Fraassen doesn’t specify which emotions can help scientists. Would it be sufficient to be intrigued or excited by a new theory, or to feel curiosity? Would anger at the failure of the old paradigm do the job? And it’s not clear how scientists can use emotions to change their minds. Sartre seems at times to assume that we have our emotions under direct voluntary control. But this appears implausible, on the face of it. Surely not all our emotions are under our direct control?
One way to salvage the Sartre and Van Fraassen account is to propose that emotions are under our indirect control. We can’t control our emotions directly, but we can engage in practices that, over time, help to shape how we emotionally respond to a variety of situations. And as for which emotion most helps scientists, I have a particular one in mind: awe.
In their classic account of awe, the psychologists Dacher Keltner and Jonathan Haidt characterise awe as a spiritual, moral and aesthetic emotion. In their view, all clear cases of awe have the following two components: an experience of vastness, and a need for cognitive accommodation of this vastness. You might feel awe for things that are physically large, but also for ideas that are conceptually vast. For example, at the end of the first edition of his Origin of Species (1859), Charles Darwin expressed awe for his theory of natural selection:
There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.
The need for cognitive accommodation makes you aware that there is a lot you don’t know. You feel small, insignificant and part of something bigger. In this way, awe is a self-transcendent emotion because it focuses our attention away from ourselves and toward our environment. It is also an epistemic emotion, because it makes us aware of gaps in our knowledge. We can feel overwhelmed looking at the night sky, deeply aware that there is so much we don’t know about the Universe. In one recent study, participants listed nature as their most common elicitor of awe, followed by scientific theories, works of art and the achievements of human cooperation.
The philosopher Adam Morton speculates that epistemic emotions play a crucial role in scientific practice. Imagine a scientist who knows the latest research techniques, and who is intelligent and analytical. If she lacks curiosity, awe and other epistemic emotions, she won’t have the drive to become a good scientist, who can change her mind on the basis of evidence, explore new hypotheses or pay attention to unexpected results. As Van Fraassen argued, to change the field or accept radical changes in it, you need to alter your outlook on the world. Awe can do this. It focuses attention away from yourself and makes you think outside of your usual thought patterns.
I have a history of making decisions very quickly about men. I have always fallen in love fast and without measuring risks. I have a tendency not only to see the best in everyone, but to assume that everyone is emotionally capable of reaching his highest potential. I have fallen in love more times than I care to count with the highest potential of a man, rather than with the man himself, and I have hung on to the relationship for a long time (sometimes far too long) waiting for the man to ascend to his own greatness. Many times in romance I have been a victim of my own optimism.
Newly divorced and edging further into her 40s, Diane Cardwell's life hadn't turned out as planned. So one day, out on a reporting trip for The New York Times in Montauk, a beach town at the eastern end of Long Island in New York, Cardwell seized the opportunity when a cottage rental and a surfing lesson presented themselves.
The result was the first step on a long journey that took her from a version of her life she’d always imagined — the house, the career, the husband — to another version in which she found joy and love again, complete with a new community of friends and a house in Rockaway, a surf town — yes, a surf town — in New York City.
Tell me about the seed of this memoir. What inspired you to write it?
I wrote an essay for Vogue after Sandy, and I just got such a tremendous response from it. I wrote the essay during the first, I don’t know, three or four days after the storm. I was in a fugue state. I didn’t know how I was going to help replace my utilities that were gone. And when the editor called and he was like, “We can pay you,” I was like, “That’ll buy a new boiler.” So I ended up doing the story and I was actually very grateful to do it.
So that was the first inkling that maybe I had a subject that might interest people. I mean, I literally had strangers coming up to me on the beach like, "Oh my God, it's so great to see you back in the water." And, "I love that story." And then a couple of years later, I think in 2015, I did a story for the style section that actually yielded the subtitle of the book; the headline was "Surfing Headlong into a New Life." That was the piece that showed me that I had an arc, a beginning, a middle, and an end — and even a happy ending where I'm finally able to surf, at least reasonably decently on good days. And I was in love and had this life that I couldn't have imagined even three years later. So that was how I got to writing the book.
. . . .
Surfing is not something you just pick up and know how to do. It’s a huge undertaking. It boils down to practicing it over and over. There is a scene in “Rockaway” where you and your friend undertake a month of practicing every day: He practices writing and you practice surfing. Could you talk about that?
Right. So I think, well, with surfing specifically, you stand on the beach and you watch good surfers in the water and it looks incredibly easy — and I’m talking about normal-sized surf, not those enormous skyscraper-high waves — but then when you try to do it, it’s actually really hard. And I think some people, especially if they start much younger, it can come really quickly. Their bodies are more flexible, they pick up new things. But surfing isn’t really like that and was really, really not like that for me. I mean, I think of myself as the anti-natural.
I just could not do it for so long, but I loved it. And so I knew that the only way to get even a reasonable competence was going to be to practice. And that month was pivotal for me in a way, because I had this friend who wanted to try to write; I was trying to surf. He said, “What if you were to surf every day? And what if we were both to keep each other honest, and report back?” That kind of accountability really helped me stick to it, because I didn’t feel like getting in the water every day.
I would get up and say, “Oh, it’s cold.” Or, “I’m tired.” Or, “I just want to sit out back in the garden and have a beer.” But it became a thing that I had to do. And so the day was organized around “When am I going to surf?” I think that that kind of practice, and also accountability, is important in anything that you want to pursue in a serious way. And that doesn’t mean that it’s ever going to be something that’s your career or something that you do to the exclusion of other things. But if you want to do it seriously and get the benefits of doing something seriously, then it just takes attention.
'To be a woman is to grow up and leave for another household." So begins "The Great Learning for Women," an 18th-century Japanese primer attributed to the neo-Confucian scholar Kaibara Ekken. As generation after generation of girls sat down to study, learning the texts and skills that would prepare them for adulthood, this was among the first and most fundamental precepts: They would grow older and, in time, depart their family home to marry into another. Biology and society admitted no alternate possibility.
In the snowy town of Ishigami at the beginning of the 19th century, a girl named Tsuneno likely read this text. The oldest daughter of a Buddhist priest, Tsuneno was expected to be disciplined, skilled and quiet; to marry the man her parents selected from within her family’s social network and raise another generation of devout sons and obedient daughters.
As Amy Stanley, a professor of history at Northwestern University, recounts in her absorbing new book, “Stranger in the Shogun’s City: A Japanese Woman and Her World,” Tsuneno would try, more than once, to fulfill the mandate set out by her family and society. By the age of 35 she had been married and divorced three times, each union shorter than the previous one and none yielding any children. Quarreling with her older brother and desperate to avoid a fourth arranged marriage, Tsuneno left—not for another husband or household, but for the bustling city of Edo (now Tokyo).
With her departure, Tsuneno changed course in dramatic fashion and initiated a furious exchange of correspondence to explain her decisions and persuade her relatives to support them. That trove of letters, carefully preserved by Tsuneno’s family, eventually became part of the Niigata Prefectural Archives. Ms. Stanley read Tsuneno’s words online and followed them to the archive, painstakingly deciphering one messily handwritten document after another until she could assemble the events of Tsuneno’s life. The resulting book is a compelling story, traced with meticulous detail and told with exquisite sympathy.
Other parts of the globe were deep into the Age of Revolution when Tsuneno was born in 1804, but peace had reigned over Tokugawa Japan for nearly two centuries. While Japan was not the “closed empire” others have depicted, Ms. Stanley writes, “it was a sheltered place, inaccessible to most foreigners and at a remove from global markets.” Ishigami, about two weeks’ walk northwest from Edo when the roads were passable, sat even further removed. As Tsuneno and her seven siblings grew up surrounded by the cyclical rhythms of religious ritual, it must have seemed unthinkable that anything would ever change.
Current debates about liberalism—especially about whether a free society can thrive alongside ever more urgent calls for government action—bring to mind an episode from the Edwardian era. It was then that the Liberal Party in Britain underwent a kind of identity crisis over policy and principle. Richard Burdon Haldane, later Viscount Haldane, stands out as a key figure in that story. Haldane (1856-1928) was “a picture of the well-fed but poorly slept” lawyer, John Campbell says, a man who combined professional success and public service. Now largely forgotten, Haldane embodied a political type that is familiar to Americans in the high-minded figures of the Progressive Era and in their descendants today, who possess an almost missionary zeal for human betterment.
In “Haldane: The Forgotten Statesman Who Shaped Britain and Canada,” Mr. Campbell, an investment banker with childhood ties to Haldane’s family, makes a persuasive case for his subject’s importance and, along the way, touches on larger questions of culture and governance. The book’s structure—less chronological than propelled by themes and causes—may challenge readers new to the story, but its wealth of detail and insightful character sketches will reward the effort.
As Mr. Campbell shows, Haldane’s family heritage adumbrated his public role. His English mother was descended from an eminent Tory jurist and lord chancellor (a position Haldane would himself occupy). On his father’s side, his grandfather and great-uncle both retired from military service to promote the evangelical movement in Scotland. Though strict Calvinism was at the center of the family’s Scottish home, Haldane lost his faith as a teenager. Over time, and perhaps without realizing the change, he transformed it into a secular commitment to reform and social progress.
. . . .
After practicing law in London and making a good deal of money, he was elected to Parliament in 1885—at a pivotal moment, as Mr. Campbell shows. The Liberal Party had dominated British politics since 1830, but tensions among its members were growing even before William Gladstone split the party in 1886 over Irish Home Rule.
Mr. Campbell describes Haldane as a living embodiment of such tensions—between an older liberalism of laissez-faire economics and limited government and a new kind, which responded to the rising spirit of socialism and organized labor. (It was around this time that Marx’s ideas were being popularized in England.) The young politician sought a rationally organized state along German lines, what his friend Beatrice Webb called “a deliberately organized society.” Haldane’s liberalism went beyond the ideals of an earlier Liberal Party, which had sought to minimize the state’s checks on individual action. He preferred to follow Wilhelm von Humboldt’s idea that government intervention, especially in education, helped citizens cultivate themselves. He thought property owed a debt to society for guaranteeing the wealth it earned. While this new liberalism inspired Haldane and his colleagues to press for costly social reforms, it drove others toward the Conservative Party.
. . . .
Early in his career, Haldane helped establish the London School of Economics, and he guided the University of London toward becoming a true teaching institution. He advised colleges in provincial cities to extend their access to a wider range of students and social classes, and he drew on German models to improve technical education.
Oddly, though, it is in military matters that Haldane's legacy is most notable. When, in 1905, a Liberal prime minister—Henry Campbell-Bannerman—reached an impasse with his party's grandees, he offered Haldane the War Office, hoping that "Schopenhauer," as he called him, could manage what was viewed as the cabinet's most thankless job. The post gave Haldane plenty to reform. He restructured the army to cut costs and created an expeditionary force that could be quickly sent abroad with reserves at home to reinforce it. He also created a General Staff to facilitate planning. These changes helped Britain stop the German invasion of France in 1914 and led Sir Douglas Haig to call Haldane "the greatest secretary of war England ever had."
A career capstone came with Haldane's elevation to lord chancellor in 1912—the head of Britain's legal system. As a longtime member of the judicial committee of the privy council, which heard appeals from the empire's dominions, he played a "leading role" in shaping the development of Canadian law, according to Mr. Campbell. For a time, Haldane's knowledge of Germany, and fluency in the language, gave him a liaison role. After Wilhelm II asked him to join a meeting with the kaiser's ministers, the kaiser joked that Haldane was the only Englishman ever to sit in a German cabinet. But tensions with Berlin made his position difficult. On a visit to London, the kaiser invited himself to lunch at Haldane's home, intensifying doubt about Haldane's loyalty—"doubt that would, in time of war, bury Haldane's reputation and political career," Mr. Campbell writes.
This is a continuation of a prior post you can find here.
A delegate from Connecticut, Oliver Wolcott served as captain and then major general in the state militia. In 1776, he was appointed to lead 14 regiments in defense of New York City. He also commanded thousands of men in the Battle of Saratoga. Wolcott worked tirelessly to recruit for the Connecticut militia, which, like the army in general, was sorely lacking in numbers within its ranks.
William Whipple of New Hampshire served as brigadier general in the state militia. He fought against Gen. Burgoyne at the battles of Stillwater and Saratoga (commonly pointed to as the turning point for Americans in the war) in 1777. The following year, Whipple participated in the retaking of Rhode Island.
Thomas McKean of Delaware served as colonel in the Delaware state militia. Once McKean was appointed to the office of President of Delaware in 1777, he was targeted by the British (the British captured John McKinley, the previous president). He had to move his family on five occasions because of raids by both the British and local Indian tribes.
Francis Lewis of New York signed the declaration on August 2, 1776. Although he was present when independence was declared a month earlier, the New York delegation had not yet received permission from the state's legislature to sign the document. A few months after he affixed his signature to the declaration, British troops destroyed Lewis's Long Island estate. They took Lewis's wife and put her in prison, where she was tortured on a regular basis. Under the direction of George Washington, she was finally returned in a prisoner exchange two years later.
Known as the sage of Philadelphia, Benjamin Franklin of Pennsylvania was the oldest of the signers of the declaration. Prior to setting sail for France in late 1776 to ask the French for assistance in the war, Franklin gave his entire fortune to Congress to help fund the war.
Hessian mercenaries plundered signer John Hart’s 400-acre farm outside of Hopewell, New Jersey. Prior to his farm being captured, Hart was forced to leave his family because of advancing British troops. During his absence, his wife died, and his children were sent to live with neighbors.
The estate of William Ellery of Rhode Island was burned down during the British occupation of Newport, Rhode Island. Ellery served in the Second Continental Congress until the British left Newport, which they held for three years. He returned home in order to salvage what was left of his property.
With his fortunes built on trade, Joseph Hewes of North Carolina was a vigorous proponent of the decision of the First Continental Congress to cut off all imports and exports with the British. This of course had the effect of drying up his wealth. Interestingly, Hewes also renounced his Quaker religion in order to support the war.
A delegate from Pennsylvania, James Smith served in the Pennsylvania militia as captain, colonel, and then as brigadier general. He was one of the first to raise men for the possibility of defending his home state, a duty he took up beginning as early as 1774.
Benjamin Harrison of Virginia, whose son and grandson both served as U.S. presidents, complained in a letter to Gov. William Livingston of New Jersey that his debts had accumulated substantially because of the “ravages” and “plunderings” of the British.
While William Floyd of New York served as a delegate in the Second Continental Congress, the British sacked his estate, forcing his family to flee. Though they made it safely to Connecticut, his family was left without a home for the duration of the war.
William Hooper of North Carolina outlasted British raiders who were looking to capture him and his family. In 1782, he and his family fled Wilmington after it fell to the British. Though much of his property was destroyed, he and his family were reunited at the conclusion of the war.
The British destroyed the home and plantation of Lyman Hall of Georgia. Luckily, his family escaped before the British arrived and moved up North to be with him.
Thomas Heyward Jr., Edward Rutledge, and Arthur Middleton
Thomas Heyward Jr. of South Carolina was a signer of both the declaration and the Articles of Confederation. Heyward drew the ire of the British when, as a circuit court judge, he presided over the trial of several loyalists who were found guilty of treason. The prisoners were summarily executed in full view of British troops. In 1779, he joined the South Carolina militia as a captain of artillery.
Heyward’s compatriot in the South Carolina delegation, Edward Rutledge, also served in the state militia. At age 26, Rutledge was the youngest signer of the Declaration of Independence. After returning home from attending the Second Continental Congress in 1777, he joined the militia as captain of an artillery battalion.
Both Heyward and Rutledge aided their country in the battle at Port Royal Island, where they helped Gen. Moultrie defeat British Maj. William Gardiner and his troops.
Arthur Middleton, the last of the South Carolina delegation who served in the militia, took up arms against the British during the siege of Charleston in 1780. His fellow signers, Heyward and Rutledge, fought in that battle as well.
Upon the surrender of Charleston, all three men were captured by the British and were sent to a prison in St. Augustine, Florida, which was reserved for people the British thought were particularly dangerous. They were held there for almost a year before being released. En route to Philadelphia for a prisoner exchange in July 1781, Heyward almost drowned. He survived his fall overboard by clinging to the ship's rudder until he could be rescued.
During the British occupation of Charleston, Commandant Nisbet Balfour ordered the seizure of many estates in Charleston, including those owned by Heyward and Middleton.
During his imprisonment, Heyward’s wife died at home, and his estate and property were heavily damaged. Rutledge’s estate was left intact, but his family had to sell many of their belongings in order to make the trip to Philadelphia to reunite with him after his release. Middleton’s estate was left relatively untouched, but his collection of rare paintings was destroyed during the British occupation of his home.
Thomas Nelson Jr.
Thomas Nelson Jr. of the Commonwealth of Virginia was appointed to the position of brigadier general and commander-in-chief of the Virginia militia by Gov. Patrick Henry in August 1777. At that time it was thought that the British would be making a full-scale invasion of the state. Nelson was able to muster only a few hundred men to defend Virginia, but the British instead decided to attack Philadelphia.
Nelson inherited a vast family fortune, much of which he used to support the American effort. He personally paid for the return journey home of 70 troops he had led to meet the British in Philadelphia during the summer of 1778. In the spring of 1780, Nelson signed his name to a loan for $2 million that was needed to purchase provisions for the French fleet that was coming to America’s aid in the war.
As then-governor of Virginia, during the Battle of Yorktown he ordered American troops to fire upon his mansion, which had been commandeered by Gen. Cornwallis and his men.
A member of the New Jersey delegation, Richard Stockton, had his estate commandeered by the British for use as a headquarters. As they left, British troops burned all his personal effects—including his library, private papers, furniture, and clothes.
Though Stockton was in hiding at the time, he ultimately did not escape capture; a traitor led the British to his position in November 1776. He was held captive in Amboy, New Jersey, and was then sent to New York City where he was imprisoned in a jail reserved for common criminals. Incensed by his treatment, Congress worked with British Gen. William Howe to obtain his release.
Because of his small build and stature, George Walton was thought to be the youngest of the signers of the declaration (he was actually in his mid-30s). He hailed from Georgia and served as colonel in the first regiment of the state militia in 1778. During the siege of Savannah, a cannonball broke Walton’s leg, which led to his being captured. He was held captive for nine months and was released in the early fall of 1779 in a prisoner exchange for a British navy captain.
At the same time Walton was held prisoner, his wife Dorothy was captured by the British. She was imprisoned on an island in the West Indies and was eventually freed after a prisoner exchange. During the Waltons’ confinement, the British ransacked their home.
British troops destroyed the home of George Clymer of Pennsylvania in September 1777 when they captured Philadelphia. Though his home was outside of the city, it was right in the middle of the path of the British march. American loyalists pointed out to the British the homes belonging to patriots, Clymer’s estate among them.
Clymer also contributed to the war monetarily. He converted his entire fortune into continental currency, a risky move considering the likelihood that the currency would be rendered worthless. He also urged wealthy friends to contribute to the American cause.
A delegate from Pennsylvania, Robert Morris helped ensure Washington’s victory at Yorktown by using his own credit to obtain the supplies necessary to defeat the British. He spent more than $1 million (not adjusted for inflation) of his own money to accomplish this.
While serving as superintendent of finance of the United States, Morris regularly used his own financial resources to obtain much-needed supplies. Using his own funds, for example, he purchased one thousand barrels of flour for Washington’s men in late spring of 1778.
Lewis Morris of New York served as a major general in the state militia. Morris devoted himself to recruiting men for the militia and to keeping it supplied, a constant problem. For almost the entire length of the war, the British occupied his home, Morrisania, and used it as their headquarters. This forced Morris to live off of his close friends and associates until the war ended in 1783.
John Hancock of Massachusetts, the man with the largest signature on the declaration, served in the militia as major general in 1778. Hancock was put in command of approximately 6,000 men during the Rhode Island campaign. That campaign was ultimately unsuccessful because the French failed to carry out their end of the bargain.
Caesar Rodney served in the Delaware militia as well, attaining the rank of brigadier general. Rodney famously rode on horseback straight from Dover to Philadelphia to cast his vote in favor of declaring independence (the Delaware delegation was split). He was with his men in the field during the brutal winter of 1776, helped quash an uprising in Delaware (there were a large number of loyalists within the state), and helped in George Washington’s effort to defend Philadelphia from being taken by the British.
Carter Braxton of the Virginia delegation accumulated massive personal debts helping the American effort in the war. He loaned 10,000 pounds sterling to Congress, which was never repaid. He also spent much of his wealth outfitting American ships so that they could carry more cargo. Some of his vessels were captured by the British and others were lost at sea; the accumulated losses left him bankrupt by war’s end.
Richard Haass is a prolific author on international affairs, served as a foreign-policy official in the Reagan and both Bush administrations, and is now president of the Council on Foreign Relations. He is, in short, a high-ranking member of American foreign policy’s clerisy. As if to emphasize the point, he relates that the inspiration for his book “The World: A Brief Introduction” began with a day of fishing in Nantucket, where he spoke with a student from Stanford who confessed that he had taken few courses in economics, politics or history. Otherwise educated young people today, Mr. Haass concludes, “are essentially uninformed about the world they are entering.” He hopes to change this state of affairs with “The World.”
What Mr. Haass has written, alas, is a series of dry primers about the world’s regions and their problems. The book is rife with soporific statements with which it would be difficult to disagree: “Economic problems within Europe have been ever more significant. As a result, the Continent has had low rates of growth.” The assumption seems to be that the young have disengaged from the world because they lack access to information. But engagement has fallen even as the internet has made access to information effortless.
Mr. Haass is among the most respected foreign-policy experts in the world and is fully capable of proposing bold ideas that would put American strategy on a more sustainable path. That “The World” offers mostly uncontroversial data points rather than fresh analysis helps to explain why two (and in some respects three) consecutive U.S. administrations have often rejected the dominant views of foreign-policy experts.
The useful parts of the book mostly come in the opening section, which briskly relays the “essential history” of international affairs. The Treaty of Westphalia in 1648 established the nation-state as the basic political unit in Europe. Webs of alliances and the rise of nationalism set the stage for World War I—and trade ties were not enough to prevent it. This context is important because contemporary debates about international relations often proceed as if history started with World War II.
Innovation, Matt Ridley tells us at the start of his new treatise on the subject, “is the most important fact about the modern world, but one of the least well understood.” Even as it functions as a powerful engine of prosperity — the accelerant of human progress — innovation remains the “great puzzle” that baffles technologists, economists and social scientists alike. In many respects, Ridley is on to something. After decades of careful study, we’re still not entirely sure about innovation’s causes or how it can best be nurtured. Is innovation dependent on a lone genius, or is it more a product of grinding teamwork? Does it occur like a thunderclap, or does it take years or even decades to coalesce? Is it usually situated in cities, or in well-equipped labs in office parks?
We can’t even agree on its definition. Generally speaking, an innovation is more than an idea and more than an invention. Yet beyond that, things get confusing. We live in a moment when we’re barraged by new stuff every day — new phones, new foods, new surgical techniques. In the pandemic, we’re confronted, too, with new medical tests and pharmaceutical treatments. But which of these are true innovations and which are novel variations on old products? And while we’re at this game, is innovation limited to just technology, or might we include new additions to our culture, like a radical work of literature, art or film?
Unfortunately, no one happens to be policing the innovation space to say what it is and is not. Mostly we have to allow for judgment calls and an open mind. As an occasional writer on the subject, I tend to define innovation simply, but also flexibly: a new product or process that has both impact and scale. Usually, too, an innovation is something that helps us do something we already do, but in a way that’s better or cheaper. Artificial light is an excellent case study. Over time we’ve moved from candles, to whale oil and kerosene lamps, to incandescent and fluorescent bulbs, and now to LEDs. Or, as another example, we might look to one of the great accomplishments of the 20th century, the Haber-Bosch process to make synthetic fertilizer, as a leap that changed the potential of agricultural production. On the other hand, we can regard the Juicero press — a recent Silicon Valley-backed idea that promised to “disrupt” the juice market and burned up more than $100 million in the process — as a fake or failed innovation. And still, this leaves us plenty of room for disagreement about what falls between these extremes and why.
Ridley enters into this messy arena with the intent of organizing the intellectual clutter. The first half of his book, “How Innovation Works: And Why It Flourishes in Freedom,” takes us on a tour through some highlights in the history of innovation. We visit with the early developers of the steam engine, witness the events leading to the Wright brothers’ first flight at Kitty Hawk, N.C., and hear about the industrialization of the Haber-Bosch fertilizer process. There are likewise forays back to the early days of automobiles and computing, the development of smallpox vaccines and clean drinking water, and stories that trace the origins of the Green Revolution in agriculture, which alleviated famine for more than 1 billion people. For dedicated science readers, Ridley’s lessons may have a glancing and derivative feel. He knits together stories many of us have probably heard before — say, through the renditions of writers like Steven Johnson, Charles Mann or Walter Isaacson — but somehow misses the opportunity to enliven these sketches with a sense of wonder and surprise. More seriously, he skirts the opportunity to footnote his summarizations, leaving only a skeletal guide to sources in his back pages.
What becomes clear, though, is that Ridley is focused less on exploring the pageant of history than on fashioning a new belief system. I don’t necessarily mean this as a critique; in fact, the second half of his book — where he looks closely, chapter by chapter, at the factors that shaped the innovations he’s spent his first 200 pages describing — is more polemical in its approach but often more engaging, even as one might disagree with a narrative direction that arises from what I would characterize as the libertarian right.
In “The President, the Pope, and the Prime Minister” (2006), the journalist John O’Sullivan asserted that the Cold War had been won by Ronald Reagan, John Paul II and Margaret Thatcher. “Without Reagan,” he stated, “no perestroika or glasnost either.” This belief, according to Archie Brown, emeritus politics professor at Oxford University, is nothing less than “specious.” In “The Human Factor,” Mr. Brown gives most of the credit for the Cold War’s end to Mikhail Gorbachev, whom he presents as almost a pacifist who voluntarily wound up the Soviet Union, albeit with a little assistance from Thatcher. So who is right?
The title of Mr. Brown’s last book, “The Myth of the Strong Leader” (2014), suggests that he might have a philosophical problem with the Great Man and Woman theory of history, and he certainly underplays the role of John Paul II during the last decade of the Cold War. The pope’s call for spiritual renewal and for freedom, not least for his native Poland, stirred the hearts of millions, but he rates only five anodyne sentences in 400 pages.
Mr. Brown was awarded a British honor in 2005 “for services to UK-Russian relations.” One Russian in particular—Mr. Gorbachev—gets lauded in the current work for his “bold leadership,” “new ideas,” “formidable powers of persuasion,” “embrace of democratization,” “emphasis on freedom of choice” and so on. At best, Reagan, George Shultz, George H.W. Bush and the others are praised for their “constructive engagement.” At worst, Reagan is criticized for introducing “complications” to an already begun process of Russian collapse.
At no point does Mr. Brown acknowledge that the primary reason that Mr. Gorbachev liberalized the Soviet Union was that Reagan, Thatcher and other Western leaders forced him to, by keeping Western defenses strong and mercilessly exposing the moral bankruptcy—and looming economic bankruptcy too—of what Reagan accurately called Russia’s “evil empire.”
For Mr. Brown, Reagan lacked sophistication, and his style was all wrong for high-minded diplomacy. It was a familiar critique at the time, though one would think that, with the end of the Cold War, it had lost its plausibility. Still, Mr. Brown hopes to revive it. “In his speeches, at every stage of his career,” Mr. Brown complains of Reagan, “he used stories and ‘quotations’ that came from very unreliable sources or from the recesses of his own mind, often drawing on films he had acted in or seen. . . . For Reagan, whether they were actually true or not appeared less important than the part they played in his narrative.”
A president who told unreliable jokes and unverifiable stories! Lincoln fits the description, as do a dozen other U.S. presidents. A folksy informality and a raconteur’s skill are generally thought to be assets in politics.
PG notes that TPV is a blog focused on the contemporary business of writing, not politics. He will also note that since much of the publishing world, indie and traditional, appears to be sheltering in place, he sometimes casts his net a bit wider than he might absent the publishing commentary drought.
(Yes, PG does recognize a sort of mixed metaphor in the “casting his net” and “drought” combination.)
If a book has been in print for forty years, I can expect it to be in print for another forty years. But, and that is the main difference, if it survives another decade, then it will be expected to be in print another fifty years. This, simply, as a rule, tells you why things that have been around for a long time are not “aging” like persons, but “aging” in reverse. Every year that passes without extinction doubles the additional life expectancy. This is an indicator of some robustness. The robustness of an item is proportional to its life!
On April 27, 1939, the British government announced plans to conscript young men for military training. It was a dramatic departure: Never previously in its modern history had the nation conscripted men for the military in time of peace. As the prime minister, Neville Chamberlain, explained to the public, however, with countries all over Europe preparing for battle, and everyone fearing a war might start at any moment, “no one can pretend that this is peacetime in any sense in which the term could fairly be used.”
This liminal period, starting with the sighs of relief at the signing of the Munich Agreement in September 1938, is the subject of Frederick Taylor’s “1939: A People’s History of the Coming of the Second World War.” Mr. Taylor, whose previous works about the period include “Coventry: November 14, 1940” and “Dresden: Tuesday, February 13, 1945,” charts the escalating tensions as Hitler’s brinkmanship pushed Europe to the edge of war, and the insidious onset of a “wartime” mood across Europe, even before German forces invaded Poland. The book concerns the United Kingdom and Germany, and it intersperses clear explanations of the decisions being taken by statesmen with the way these were experienced by “ordinary” people in both countries.
Rich in social and cultural details that bring the era to life, “1939” makes use of a range of eyewitness testimony and contemporary assessments of public opinion, which together illuminate the variety of individual experience within a historic moment in international affairs. Discussions of the ways that new forms of entertainment, such as television and cheap holiday camps, appeared in Germany and in Britain illuminate both the similarities among European experiences and the stark cultural and political differences. Though each chapter deals with a month, Mr. Taylor dives back into the 1930s to explain the back story of that final year of “peace.”
But Mr. Taylor’s inverted commas on “ordinary” are necessary. The figures to whose testimony Mr. Taylor returns throughout the book are German: the journalist (and later anti-Nazi resister) Ruth Andreas-Friedrich and the well-connected novelist and screenwriter Erich Ebermayer. Their diary accounts provide the self-scrutinizing outsiders’ view of the mainstream that, for the British part of his story, comes from the more numerous contributors to the social research project Mass-Observation, the surviving archives of which are such a boon for historians of this period.
. . . .
Mr. Taylor [keeps] up the momentum of a much-told story—the coming of the European war—while conveying a powerful sense of what it felt like to watch the precipice approach.
For some, the drop had already begun. Matching up the dynamics of genocide and war, Mr. Taylor explains how ordinary Germans carried on as attacks on Jews became part of national and civic life. The author is very good at showing the fear and horror produced by escalating Nazi violence, as well as the bizarre dualities that resulted as everyday routines continued around them. Walking to church or the cinema over the smashed glass from shop windows and through the smoke from burning synagogues, gentile Germans managed not to feel that their world was disintegrating around them. Even Britons who got past the casual anti-Semitism typical of the age to offer aid to Jewish refugees, meanwhile, remained remarkably convinced that decent Germans would one day reject Nazi brutality.
What worried everyone was the onset of another world war, when the last one was fresh in memory. Mr. Taylor quotes one report from a local Nazi party official about popular reactions to the invasion of Poland in the Westphalian city of Bielefeld: The last great war, the document observed, had “returned remarkably vividly to people’s memories, its misery, its four-year duration, its two million German fallen. No enthusiasm for war at all.” That the German people acquiesced speaks not only to the power of Nazi propaganda, which used modern means to tap into deeper strands of European anti-Semitism, but also to the degree to which life was already militarized by September 1939. For all the horror at the slaughter a generation before, mobilizing to fight was something that this state—and this society—knew how to do.
Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)
Between 1935 and 1939, the Federal Writers’ Project (FWP)—an initiative funded by the Works Progress Administration under the New Deal—provided employment for some 6,000 jobless writers [in the United States]. Today, as stunned authors in Australia and around the world come to terms with the economic consequences of the coronavirus pandemic, that experiment deserves reconsideration. As the ABC recently noted, Australian writers—who earn, on average, less than $13,000 directly from their work each year—will be affected on multiple levels: by the cancellation of festivals, talks, and other paying gigs; by the closure of bookshops; by redundancies and cuts in publishing houses; and by job losses in the related industries (from academia to hospitality) through which they supplement their incomes.
It was the American New Deal more than anything else that legitimated the kind of stimulus packages again being discussed in Australia not just for the arts but across the economy. When Franklin D. Roosevelt took office, the crisis of the Great Depression forced him, despite his own fiscal conservatism, to rush through various rescue measures of a now-familiar nature. The US government guaranteed bank loans to prevent further financial collapses; it encouraged industrial cartels to control prices and production levels; it purchased unsold crops from farmers; and through the Civil Works Administration, the Federal Emergency Relief Administration, and, eventually, the Works Progress Administration it sought to create jobs.
Recent calls for postpandemic bailouts for artists in general or writers implicitly evoke that legacy.
. . . .
Obviously, the publishing scene today—dominated by vast multinationals, for whom books are merely part of a broader engagement with the “entertainment industry”—differs greatly from the more small-scale milieu of the 1930s. Even so, it’s still worth noting how contemporary thinking about funding literature differs from the Federal Writers’ Project in several important ways.
Most importantly, the job schemes of the 1930s as a whole, including the Writers’ Project, emerged from intense class struggles in a way that today’s plans do not.
In her history of the Works Progress Administration, Nancy E. Rose writes:
Starting in early 1930, unemployed councils, organized by the Communist Party, began to lead hunger marches to demand more relief. On March 6, 1932, which was proclaimed International Unemployment Day, hunger marches took place throughout the country. … In general, cities with strong Unemployed Councils provided better relief.
Agitation by the unemployed coincided with intensified industrial disputation. By 1934, some 1.5 million workers were on strike and FDR went to the polls the following year in the midst of a massive wave of industrial action, in which the newly formed Congress of Industrial Organizations played an important role. Those titanic clashes paved the way for the Second New Deal, under which the most significant reforms (including the WPA) were implemented.
Crucially, writers themselves fought, through explicitly political groups like the Writers’ Union and [before that] the Unemployed Writers’ Association, for the program from which they benefited. In 1934, the UWA’s secretary Robert Whitcomb explained:
The unemployed writers of New York City do not intend to continue under the semi-starvation conditions meted out to them. If the government does not intend to formulate some policy regarding the class of intellectual known as a writer … then the writer must organize and conduct a fight to better his condition.
The following year, with something like a quarter of the entire publishing industry out of work, the two organizations launched a widely publicized picket of the New York Port Authority, in which their members carried signs reading: “Children Need Books. Writers Need Bread. We Demand Projects.”
. . . .
The authors employed by the FWP included many who went on to conventional success, among them Nelson Algren, Saul Bellow, Arna Bontemps, Malcolm Cowley, Ralph Ellison, Zora Neale Hurston, Claude McKay, Kenneth Patchen, Philip Rahv, Kenneth Rexroth, Harold Rosenberg, Studs Terkel, Margaret Walker, Richard Wright, and Frank Yerby. As David A. Taylor notes in Soul of a People, his history of the FWP, “four of the first ten winners of the National Book Award in fiction and one in poetry came from this emergency relief project.”
. . . .
Thus, even though the program did actively recruit some literary stars, the author Anzia Yezierska, who’d previously worked in Hollywood, experienced enlisting in the New York FWP as a kind of proletarianization. “There was,” she wrote later, “a hectic camaraderie among us, though we were as ill-assorted as a crowd on a subway express, spinster poetesses, pulp specialists, youngsters … veteran newspapermen, art-for-art’s-sake literati, clerks and typists … people of all ages, all nationalities, all degrees of education, tossed together in a strange fellowship of necessity.”
Not everyone approved of this camaraderie—W. H. Auden dismissed it as “absurd”; one of the project’s own directors complained that “all the misfits and maniacs on relief have been dumped here.”
. . . .
The FWP faced especial hostility and ridicule, with one editorialist complaining that it meant that literary “pencil leaners” would join the “shovel leaners” of the WPA. Again, the authorities stressed the project’s utility, with its remit described in an official announcement as the
employment of writers, editors, historians, research workers, art critics, architects, archaeologists, map draftsmen, geologists, and other professional workers for the preparation of an American Guide and the accumulation of new research material on matters of local, historical, art and scientific interest in the United States; preparation of a complete encyclopedia of government functions and periodical publications in Washington; and the preparation of a limited number of special studies in the arts, history, economics, sociology, etc., by qualified writers on relief.
It duly enlisted its staff to labor on perhaps a thousand volumes, including 50 state and territorial guides, 30 city guides and 20 regional guides. David Taylor describes these texts, composed by a dazzling group of writers, as “a multifaceted look at America by Americans, assembled during one of the greatest crises in the country.”
Many writers resented their tasks (at one point, Yezierska was sent to catalog the trees in Central Park); many worked on their own manuscripts on the side.
. . . .
In books like Gumbo Ya-Ya: A Collection of Louisiana Folk Tales, Bibliography of Chicago Negroes, and Drums and Shadows: Survival Studies among the Georgia Coastal Negroes, FWP employees collected the folklore that Zora Neale Hurston described as “the boiled-down juice of human living.” They interviewed people who had been enslaved, generating an astonishing assemblage of reminiscences. It’s thanks to the FWP that we have a small number of audio clips in which we can hear the actual voices of the survivors of slavery explaining what was done to them.
Alfred Kazin described how, in the late 1930s:
Whole divisions of writers now fell upon the face of America with a devotion that was baffled rather than shrill, and an insistence to know and to love what it knew that seemed unprecedented. Never before did a nation seem so hungry for news of itself.
As the coronavirus pandemic continues to rage, it intensifies fears of aging and debility that characterize our culture of fitness and drive our aspirations to bodily invincibility. The stigma of aging affects women differentially. While feminists have touted the achievements of older women and insisted that the later years can be the best, we now find ourselves on the other side of an increasingly solid barrier between a “younger” population and an “elderly,” “older,” or “old” one. Those of us who are age 65 or older are the most vulnerable and at risk, both in need of extra protection and most likely to lose out in the triage battle for hospital beds and ventilators. At the same time, our vulnerability to the virus makes it impossible for many of us in this age cohort to participate in the historic street protests we are condemned to witness from afar.
This is therefore a good moment to assess our experiences of aging, and to face our own attitudes more squarely. Rather than battling an ageist and sexist media by insisting that older women can do and be more than ever before by working and playing harder, might we instead focus on care and interdependence, accepting rather than disavowing bodily, emotional, and social vulnerabilities? Rather than celebrating individual victories against aging and mortality, we might embrace a communal ethos of mutuality to which the old have a great deal to contribute.
In proclaiming older women’s powers, the titles of two recent books give a clear sense of their tone and mission: No Stopping Us Now: The Adventures of Older Women in American History, by journalist Gail Collins, and In Our Prime: How Older Women Are Reinventing the Road Ahead, by communications and media scholar Susan J. Douglas. Indignant about the blatant disparagement of older women that characterizes our moment, Collins and Douglas take a celebratory, if not outright triumphalist, tone. Both search for greater social importance and acceptance of older women in earlier historical periods and find examples of their unrelenting energy and productivity today. Both books encourage all women to fight against gendered ageism. They call for forms of cultural recognition that would better represent what their authors see as older women’s mostly positive experiences of aging.
. . . .
In a whirlwind journey through United States history, from the colonial period to today, No Stopping Us Now traces changes in opportunities for and attitudes toward older women. With spirit and energy, Collins leads us through the lives of numerous, mostly well-known older women who wielded considerable influence at different historical moments. Although the book touches upon larger economic arguments about shifting social roles available to mature women—brought about by the need for their products in colonial times, for example, or the opportunities for widows to run their husbands’ farms or businesses—Collins is more interested in how individual women were able to circumvent prejudices and taboos, and thereby thrive in their later years. Collins’s story is one not so much of steady progress as it is of a series of gains and losses, advances and declines—a story that leads to what she sees as today’s open future of increased possibility.
Thanks to Collins, one certainly gets a sense of women’s energy and activity, which is hard to reconcile with popular attitudes of gendered ageism, then and now. She paints vivid portraits, for example, by following the writing, publishing, and public-speaking “adventures” of 19th-century luminaries like Sarah Josepha Hale, who continued writing until she was 89; Elizabeth Cady Stanton, who urged middle-class women to start a whole new life in their 50s; Catharine Beecher, who took courses at Cornell in her 70s; and Jane Addams, who advocated a postponement of old age.
Notably, historians studying American women have analyzed the feminist strategies these and lesser-known women used to advance their work: by seemingly conforming to set gender roles, even as they radically subverted them. Collins, meanwhile, is content to tell these stories chronologically, ending with encouraging contemporary examples that range from Ruth Bader Ginsburg and Nancy Pelosi to Gloria Steinem and Helen Mirren. She does fold these individual white women into a broad historical sweep that also includes exceptional African American figures like Sojourner Truth, Harriet Tubman, Frances Harper, and 98-year-old National Park Service ranger Betty Reid Soskin. Yet she only mentions—without analyzing in any depth—how gendered prejudices are structurally inflected by racial, economic, and other social inequalities.
George Eliot was at the peak of her renown in 1874 when John Blackwood, her publisher, learned that she was at work on “Daniel Deronda,” a new novel. As a literary man, he was in thrall to her genius. As a businessman with an instinct for the market, he valued her passionately dedicated readership. But an early look at portions of her manuscript astonished and appalled him: Too much of it was steeped in sympathetic evocations of Jews, Judaism and what was beginning to be known as Zionism.
All this off-putting alien erudition struck him as certain to be more than merely unpopular. It was personally tasteless, it went against the grain of English sensibility, it was an offense to the reigning political temperament. It was, in our notorious idiom, politically incorrect. Blackwood was unquestionably a member of England’s gentlemanly intellectual elite. In recoiling from Eliot’s theme, he showed himself to be that historically commonplace figure: an intellectual anti-Semite.
Anti-Semitism is generally thought of as brutish, the mentality of mobs, the work of the ignorant, the poorly schooled, the gutter roughnecks, the torch carriers. But these are only the servants, not the savants, of anti-Semitism. Mobs execute, intellectuals promulgate. Thugs have furies, intellectuals have causes.
The Inquisition was the brainchild not of illiterates, but of the most lettered and lofty prelates. Goebbels had a degree in philology. Hitler fancied himself a painter and doubtless knew something of Dürer and da Vinci. Pogroms aroused the murderous rampage of peasants, but they were instigated by the cream of Russian officialdom. The hounding and ultimate expulsion of Jewish students from German universities was abetted by the violence of their Aryan classmates, but it was the rectors who decreed that only full-blooded Germans could occupy the front seats. Martin Heidegger, the celebrated philosopher of being and non-being, was quick to join the Nazi Party, and, as a rector himself, promptly oversaw the summary ejection of Jewish colleagues.
Stupid mobs are spurred by clever goaders: The book burners were inspired by the temperamentally bookish—who else could know which books to burn? Even invidious folk myths have intellectual roots, as when early biblical linguists mistranslated as horns the rays of light emanating from Moses’ brow.
As we have been reminded of late, there is an astonishing complexity—and at times fragility—to our mental and physical health, and we owe a debt to the legions of scientists whose insights and discoveries, over the years, have improved our chances of well-being. Alas, too many of them are unknown to us. One name that was once broadly known has fallen into lamentable obscurity—that of Claire Weekes, an Australian doctor who did ground-breaking work on one of the great scourges of humanity. With Judith Hoare’s “The Woman Who Cracked the Anxiety Code,” we have a chance to learn about Weekes’s varied life and, as important, become reacquainted with her work.
Decades before her death in 1990 at the age of 87, Weekes had been a global sensation, reaching millions of people through her books—“transfusions of hope,” she called them. One of the original self-helpers, she believed that sufferers could master themselves without the aid of professionals, and the strategies she gave them were firmly grounded in the biology of anxiety.
Weekes didn’t plan on medicine as a career, Ms. Hoare tells us. In 1928, at the age of 25, she began graduate studies in zoology in London on a prestigious fellowship. When her beloved mentor died of a stroke, she developed severe heart palpitations. Doctors misinterpreted her condition as tuberculosis and sent her to a sanatorium. There she fell into a general state of fear. Six months later, doctors retracted their diagnosis, and Weekes, now nearly incapacitated by stress, resumed her research.
The turning point came when she confided in a friend, a World War I veteran, that she suffered from a frenzied heartbeat. “Far from being surprised or concerned,” Ms. Hoare writes, “he shrugged,” saying: “Those are only the symptoms of nerves.” He told Weekes, in Ms. Hoare’s paraphrase, that “her heart continued to race because she was frightened of it. It was programmed by her fear. This made immediate sense.”
The explanation was deceptively profound, going straight to the core of the mind-body connection.
. . . .
Weekes had hypothesized a “first fear and second fear” process. The first is a reflex—and the problem in many anxiety disorders is that the reflex is set off for no obvious reason. The second is the conscious feeling of fear. Relief of suffering, for her, came when she learned to quell the “fear of the first fear,” thereby short-circuiting the cycle that was set in motion by the original, unbidden rush of panic: the pounding heart. According to Ms. Hoare, Weekes “immediately grasped the point that she needed to stop fighting the fear.” She had cracked the code.
But this insight would not reach the public for another 30 years. After becoming the first woman to be awarded the degree of Doctor of Science at Sydney University, Weekes conducted research in endocrinology and neurology. Eventually she sought a more pragmatic occupation and enrolled in medical school at age 38. During her work as a general practitioner, she felt special sympathy for her anxious patients and began to counsel them to do as she herself had done: “float past” panic, give bodily sensations and fearful thoughts no power. One of her patients asked for written advice. Her pages to him became “Self Help for Your Nerves,” published in 1962, when Weekes was 59; the book rocketed up the bestseller lists in the U.S. and the U.K. As Ms. Hoare shows, Weekes’s contributions to human welfare live on in mindfulness training and forms of behavioral therapy, sometimes combined with medication. Contemporary neuroscience has vindicated her theory.
Is boredom really all that interesting? Thanks perhaps to the subject’s dreary durability, it has generated a considerable literature over the years. Alberto Moravia wrote an engaging novel called “Boredom,” and psychologists, philosophers and classicists have also had their say.
“Out of My Skull,” the latest work on this strangely alluring topic, has an exciting title, but nothing about the book is wild or crazy. James Danckert and John D. Eastwood, a pair of psychologists in Canada, know an awful lot about the subject (Mr. Eastwood even runs a Boredom Lab at York University), and they examine it methodically. “In our view, being bored is quite fascinating, and maybe, just maybe, it might even be helpful,” they write, echoing predecessors who find boredom salutary. “Boredom is a call to action, a signal to become more engaged. It is a push toward more meaningful and satisfying actions. It forces you to ask a consequential question: What should I do?”
A taxonomy of boredom, if it’s to avoid exemplifying what it describes, ought to be simple. So let’s just say that boredom is of two kinds. The first is better known to us as ennui, and the democratization of this once-rarefied feeling is one of civilization’s triumphs. At first the preserve of aristocrats and later taken up by intellectuals, nowadays it is available to affluent citizens everywhere. Our endless search for palliatives in the face of this affliction underpins the consumer economy.
The other kind of boredom is the version that most of us get paid for. Commentators on boredom usually genuflect briefly toward factory workers, nannies and other hard-working members of the hoi polloi whose tasks can be mind-numbing. But such people live with a version of boredom that intellectuals find, well, boring. So the focus is usually on the self-important existential variety.
For Jack Lawson, “ten hours a day in the dark prison below really meant freedom for me.” At age 12, this Northern England boy began full-time work down the local mine. His life underwent a transformation; there would be “no more drudgery at home.” Jack’s wages lifted him head and shoulders above his younger siblings and separated him in fundamental ways from the world of women. He received better food, clothing and considerably more social standing and respect within the family. He had become a breadwinner.
Rooted in firsthand accounts of life in the Victorian era, Emma Griffin’s “Bread Winner” is a compelling re-evaluation of the Victorian economy. Ms. Griffin, a professor at the University of East Anglia, investigates the personal relationships and family dynamics of around 700 working-class households from the 19th century, charting the challenges people faced and the choices they made. Their lives are revealed as unique personal voyages caught within broader currents.
“I didn’t mind going out to work,” wrote a woman named Bessie Wallis. “It was just that girls were so very inferior to boys. They were the breadwinners and they came first. They could always get work in one of the mines, starting off as a pony boy then working themselves up to rope-runners and trammers for the actual coal-hewers. Girls were nobodies. They could only go into domestic service.”
Putting the domestic back into the economy, Ms. Griffin addresses a longstanding imbalance in our understanding of Victorian life. By investigating how money and resources moved around the working-class family, she makes huge strides toward answering the disconcerting question of why an increasingly affluent country continued to fail to feed its children. There was, her account makes clear, a disappointingly long lag between the development of an industrialized lifestyle in Britain and the spread of its benefits throughout the population.
. . . .
In preindustrial times, both men and women had faced a fairly set course in life on the edge of subsistence. During the Victorian era, their fortunes rapidly diverged. Many of the best-paid roles within the newly industrialized economy were designated as exclusively male. Those designated as female were very low paid (well below subsistence level). Thus developed the “breadwinner wage” model—the idea being that a man needed to support a family upon his earnings but a woman needed only pin money, her basic needs having been provided by father or husband.
Ms. Griffin’s groundbreaking research tracks the effects of this philosophy through personal autobiographical accounts. Working-class men gained power and personal freedom from the new opportunities and broader horizons. Working-class women, by contrast, faced the same old narrow set of options. This new pattern of gender divergence was most pronounced in urban situations, where the higher male wages were largely to be had, and was attended by a significant rise in family breakdown.
In the dim caverns of his memory, PG remembers reading that the survival and eventual prosperity of married settlers who journeyed to the American West generally required the hard work, and often the income-producing farming and ranching labor, of both spouses to a far greater extent than the typical breadwinner economic model would suggest.
In the economic model of the American West, children were also expected to provide productive labor in the process of obtaining food and income from the farm or ranch.
When PG was a wee lad, he remembers helping to herd livestock and running to fetch tools from the shed as required by his parents.
His mother was definitely involved in the agricultural and livestock activities on a regular basis. She and PG were both chased by an angry cow as they were trying to give some medication to her calf. PG was 7 years old at the time and discovered a running speed he didn’t realize he possessed as he headed for the fence.
When PG was 11 years old, he learned to operate a Caterpillar D6 (sometimes known as the most important piece of military equipment used in the Pacific theater of World War II) and thought he was the coolest kid around driving it to help his father on the farm.
Unfortunately, PG doesn’t have any photos of himself operating the D6, but here are a few to give anyone who is still interested a sense of the size of the machine. PG remembers that it required about a three-step climb from the ground to the seat.
It’s hard to imagine a world without rivers. The continents would be higher, colder and more rugged, and we humans might still be hugging the coastlines. Our iconic cities, situated along rivers, would not have been built. Global trade and travel might never have developed. Even so, rivers’ crucial role in shaping civilization is “grandly underappreciated,” according to Laurence C. Smith, professor of earth, environmental and planetary sciences at Brown University. In his important new book “Rivers of Power,” he surveys mankind’s long, shifting relationship with our rivers, ranging from prehistory to the present and embracing nearly every region of the world.
Rain started falling on Earth at least 4 billion years ago. Merging into streams and then rivers, the water launched its eternal assault on the continents, grinding them down and carrying them grain by grain toward the sea. The rivers, over their tortuous course, occasionally slowed and dropped some of their silt, forming tangled deltas and wide valley plains. Perhaps as recently as 12,000 years ago, nomadic peoples in the Mideast and Asia settled these valleys and began to plant crops such as wheat, barley and rice.
The valley soil was fertile, and early farmers learned to divert river water for irrigation, increasing their harvests and producing surpluses of grain. Starting about 4,000 B.C., they built the world’s first great cities, in present-day Iraq, Egypt, India, Pakistan and China. As these societies grew wealthier and more populous, they also became more complex, supporting a ruling class, traders, philosophers and engineers. In fact, these civilizations (the Egyptian, Sumerian, Harappan and Chinese) were so utterly dependent on their rivers (the Nile, Tigris and Euphrates, Indus and Ghaggar-Hakra, and Yangtze and Yellow) that they have been dubbed “hydraulic societies.”
Link to the rest at The Wall Street Journal (PG apologizes for the paywall, but hasn’t figured out a way around it.)
My grandfather died many years ago, but I still remember his stories of growing up in the Texas Hill Country in the early 20th century, walking two miles each way to a one-room schoolhouse and doing chores that were, to me, unfathomable: making laundry soap out of lard and lye, plucking chickens, hauling water from the well. I thought of him often as I read “Farm Girl,” Carlson’s spare, charming memoir of her Depression-era childhood.
Carlson grew up on her parents’ farm outside Plum City in western Wisconsin, where she was born in 1926. (Family lore has it that the doctor who delivered her exclaimed: “Well, this is a nice, big one! Nine or 10 pounds.”) She and her three siblings roamed through “80 acres of beautiful, rich, fertile Wisconsin cropland, pasture and woodlot” while their parents shielded them from the worst economic woes of the period. Her memories, mostly rosy, are punctuated by descriptions of the era’s terrible droughts. “We could hear the cattle bawling as they searched the dry pastures for a bit of grass,” she recalls, and “we saw the leaves on the stunted corn plants in the sun-baked fields curl to conserve moisture.”
“Farm Girl” isn’t chronological. It’s split into two sections — one on her family, the other on the seasonal rhythms that define life on a farm — and divided into thematic chapters, some as short as two pages: “The Party Line Telephone,” “Butternuts and Maple Sugar Candy,” “Sunday Dinner,” “Long Underwear” (“nothing, nothing separated the farm kids from the town kids like the dreaded long underwear … the scourge of Wisconsin winters”).
For those who have never lived in a place with extremely cold winters, long underwear is an essential component of winterwear. If the electricity goes out or the school bus becomes stranded during a cold snap, long underwear can become very important indeed.
That said, long underwear is seldom regarded as a fashion-forward piece of clothing, except among the old guys sitting around a hot wood stove at the local grain elevator, spinning stories about the winters of their childhoods when winters were really something and, when you woke up, crawled out from under five or six blankets and your bare feet hit the linoleum floor, you got dressed in a big hurry, then went out to help your father thaw out the water pump because Mom couldn’t make oatmeal without water and refused to use melted snow because who knew what might have been done on that exact spot by some creature or another.