Food writers can be divided into two branches: sensualists and moralists

17 November 2018

From The Wall Street Journal:

You can tell a lot about a person by how they order at Waffle House, but I would suggest that the thing that separates the sheep from the goats isn’t what they regularly order, but rather that they have an order at all.

I don’t think it matters, in other words, if you like your hash browns the way I do—scattered (on the grill), peppered (with jalapenos), smothered (with onions) and covered (with melted cheese). But you should like them some particular way, and you should know how you like them. In the midst of his criss-crossing of culinary America in “Buttermilk Graffiti,” chef Edward Lee stops into a Waffle House and orders “a bowl of chili and hash browns, smothered and covered.” He’s in Prattville, Ala., where “Valerie works the griddle station and belts out rock-and-roll classics all the while, changing the lyrics to make waffle and egg references.”

In these passages Mr. Lee, who runs acclaimed restaurants in Louisville, Ky., National Harbor, Md., and Washington, D.C., is reflecting on the intimacy and social comfort he’s found in this Southern chain restaurant. He contrasts it to a rather awkward series of adventures in Korean restaurants in Montgomery, Ala., of which there are surprisingly many. Oddly, he found, not many of the Koreans running these restaurants wanted to talk to him about the food they were cooking, and though the food was familiar—Mr. Lee’s parents immigrated from Korea—he didn’t feel quite at home. These sorts of questions are the crux of “Buttermilk Graffiti”: Where are we comfortable? How can assimilation and tradition coexist? Why is there such excellent Cambodian food in Lowell, Mass.? How can it be that there are more Korean restaurants per capita in Montgomery than in Manhattan?

. . . .

By that time Jane and Michael Stern’s “Roadfood” had been out a few years; Calvin Trillin had published “American Fried” a few years before that. Those books laid out the pattern for future American food travelogues. What Mr. Trillin did was to insist that good-old down-home stuff was where the fun was. He was thumbing his nose at the fancy-schmancy, insisting that we all avoid “Le Maison de la Casa House” and any place of which the chamber of commerce is proud. In a culinary world that was, by all reports, drenched in thick continental sauces, this was a good message and needed to be spread. But what I experienced in Carlisle, Pa.—and what Edward Lee tracks intrepidly throughout the country—is something different. It’s fresh, comfortable excellence—sometimes a frisson that the newly arrived bring to the table, sometimes a tradition that is just one step outside of the mainstream. In Miami, it is the flavors of Cuban exiles, and in Dearborn, Mich., it is the mint tea with which Mr. Lee breaks his fast at sundown during Ramadan.

Link to the rest at The Wall Street Journal

‘Terrible times are coming’: the Holocaust diary that lay unread for 70 years

15 November 2018

From The Guardian:

Seventy years after its final entry was written, the diary of Polish teenager Renia Spiegel, who has drawn comparisons to Anne Frank for her moving account of life as a Jew during the Nazi occupation of Poland and who was shot on the streets days after her 18th birthday, appeared in English this week for the first time.

Running to almost 700 pages, Spiegel’s diary begins in January 1939, when she was 15, and ends on the last day of her life, 30 July 1942, when she was executed by German soldiers. The last lines in the journal are written by her boyfriend, Zygmunt Schwarzer, who ended it with his account of her death and that of his parents: “Three shots! Three lives lost! All I can hear are shots, shots.”

Spiegel’s mother, Róża, and her younger sister, Ariana, survived the war and moved to America. In the 1950s, Schwarzer, who had survived several concentration camps, located them and gave them Renia’s diary. Neither Róża nor Ariana could bring themselves to read it and so the diary lay “dormant”, said Ariana’s daughter Alexandra Bellak, until she decided to send it for translation.

“I realised how important it was, not just for me to learn more about my past but also for the world to see how meaningful this diary was,” said Bellak. “It was very moving. It is heart-breaking and heart-wrenching because you know how the story ends, but also her writing is so beautiful. She’s so mature and thoughtful and introspective. You get a sense of a young woman who’s going through puberty, falling in love with her first boyfriend, having little spats with her sister. You see how intelligent she is – she references philosophers and classical musicians and composers, it’s pretty amazing. With the rise of anti-Semitism and nationalism, here and abroad, it’s more timely than ever.”

. . . .

“Why did I decide to start a diary today? Has something important happened?” Spiegel writes in the Smithsonian’s translation, by Anna Blasiak and Marta Dziurosz. “No! I just want a friend. Somebody I can talk to about my everyday worries and joys. Somebody who will feel what I feel, believe what I say and never reveal my secrets.”

The diary shows a girl with typical teenage obsessions, whether it is her schoolmates (“the next girl in our row is Belka — fat and stocky like 300 devils”), teachers or first kisses. Filled with poetry, it also reveals a girl desperately missing her mother: at the time, Renia and Ariana were living in Przemyśl with their grandparents. As war breaks out, Renia describes the creation of the ghetto to which they are confined, and the deportation of members of her community. On 26 June 1941, she writes of “horrible days in the basement”; a few days later, she tells her diary of how she will have to start wearing a white armband. “To you I will always remain the same Renia, but to others I’ll become someone inferior: a girl wearing a white armband with a blue star. I will be a Jude.”

“Remember this day; remember it well,” she wrote on 15 July 1942. “You will tell generations to come. Since 8 o’clock today we have been shut away in the ghetto. I live here now. The world is separated from me and I’m separated from the world.” Leaving the ghetto without a pass, she writes, is “punishable by death”.

Link to the rest at The Guardian and thanks to Karen for the tip.

The Great War Begins

7 November 2018

For those not familiar with it, The Khan Academy is a free online educational institution that provides courses on a wide range of subjects for students of all ages. It’s a great place to learn how to code in various computer languages.

However, it also includes a variety of history courses. Below is a screenshot from a six-minute tutorial/introduction to one of their history courses discussing the beginning of World War I.

Here is a link to the tutorial.

The Books That Saved My Life in Prison

6 November 2018

From Medium:

From where I grew up on Division Avenue in Washington, D.C., it was eight blocks to the East Capitol Library. The route was through the Lincoln Heights Houses, a notorious public housing project, but I walked it by myself at least once a week. This was the late 1980s; bullet casings and baggies littered the ground. Crack never came in vials in my neighborhood.

At East Capitol, I escaped that reality. I traveled the world: China, England, Africa. I listened in wonder to stories of the great ancient library at Alexandria in Egypt, which in my mind was a hundred stories tall and where all the sailors came to find maps of the world. I checked out children’s versions of classics and read them at night, wrapped in my sheet on the floor. There were so many drive-bys, I was afraid a stray bullet might come through the window and hit me if I slept up high in my bed.

I thought, The world is so big. It’s full of ideas and people. I can go anywhere. I can do anything. I can be anyone I want.

It didn’t happen. At 17, I killed a man in a confrontation. I was sentenced to life in prison with no hope of parole. I sat on my bunk that day, in my solitary confinement cell, and cried, because my life was over.

About a year later, I noticed an inmate named Steve. He was a lifer like me, in since 15, but he read books every day, all day. One day, I asked what he was reading. He showed me a book on computer coding. I laughed in his face. We didn’t even have computers at Patuxent in those days. Who was he fooling?

But I admired his persistence, and I admired his optimism. Steve didn’t care what other people thought. He didn’t care about the odds. Despite his impossible sentence, he was determined to make something of his life.

So I started reading. Not computer coding! I read about entrepreneurs like Mark Cuban. I read about historical figures like Frederick Douglass, Leonardo da Vinci, and Napoleon. I loved Napoleon, because he was an outsider like me. Nobody wanted him. But he went to a library and learned everything there was to know about military tactics so he’d be ready when they couldn’t deny him anymore.

. . . .

I read self-help books, too, like Leil Lowndes’ How to Talk to Anyone. Other prisoners laughed at me: “What, Chris, you don’t know how to talk?”

I said, “I’m improving myself, inside and out. I’m improving my body and my mind. You should too. Just because you’re in here doesn’t mean you can’t do great things.”

Steve became more than my best friend. He became my brother. We got cells across from each other and exchanged books through the bars. My family wrote me off. My mom said, “You got life. What’s the point?” and stopped answering my calls. Steve’s parents took me in. They offered to spend $50 a month on things I needed. It was the first spending money I ever had. I asked them to buy me books. I kept the best ones on a shelf in my cell. The rest I loaned out to other inmates or gave to the prison library.

Eventually, I got my GED. Then Steve and I convinced the prison to start a college program. In college, we had full access to the prison library. I lived in that library. I lived for that library. It took me everywhere: deep into outer space, deep into history, and even deeper into myself. I’ve probably spent more than 10,000 hours in the Patuxent library. I’m practically a librarian!

. . . .

I didn’t just live for that library. I lived because of that library. The Patuxent prison library saved me from crushing despair. It saved hundreds of other guys, too.

. . . .

People often ask about my favorite book. It’s 500 Spanish Verbs, a book I carried with me every day for more than four years while I taught myself the language. (I’m fluent in Italian, too.) For me, that book is a symbol of my hard work and commitment.

Link to the rest at Medium

History for a Post-Fact America

27 October 2018

From The New York Review of Books:

What was America? The question is nearly as old as the republic itself. In 1789, the year George Washington began his first term, the South Carolina doctor and statesman David Ramsay set out to understand the new nation by looking to its short past. America’s histories at the time were local, stories of states or scattered tales of colonial lore; nations were tied together by bloodline, or religion, or ancestral soil. “The Americans knew but little of one another,” Ramsay wrote, delivering an accounting that both presented his contemporaries as a single people, despite their differences, and tossed aside the assumptions of what would be needed to hold them together. “When the war began, the Americans were a mass of husbandmen, merchants, mechanics and fishermen; but the necessities of the country gave a spring to the active powers of the inhabitants, and set them on thinking, speaking and acting in a line far beyond that to which they had been accustomed.” The Constitution had just been ratified at the time of Ramsay’s writing, the first system of national government submitted to its people for approval. “A vast expansion of the human mind speedily followed,” he wrote. It hashed out the nation as a set of principles. America was an idea. America was an argument.

The question has animated American history ever since. “For the last half century,” the historian and essayist Jill Lepore told an interviewer in 2011, academic historians have been trying “to write an integrated history of the United States, a history both black and white, a history that weaves together political history and social history, the history of presidents and the history of slavery.”

. . . .

Lepore has surmised . . . that too much historical writing—and perhaps too much nonfiction in general—proceeds without many of the qualities that readers recognize as essential to experience: “humor, and art, and passion, and love, and tenderness, and sex… and fear, and terror, and the sublime, and cruelty.” Things that she calls “organic to the period, and yet lost to us.” Lepore’s training as a historian, she’s said, tried to teach her that these things did not contain worthy explanations. In graduate school her interest in them “looked like a liability, and I took note.”

. . . .

For Lepore, history is essentially a writing problem: how we know what we know (or think we do), how different forms and genres transmit different kinds of signals, what it might mean to encounter a gap between the evidence and the truth. Her work has confronted the tension between what a reader needs to know for a story to work and the limits of what can be known, and what makes the difference between a person and a character. When she wrote a historical novel, set in colonial Boston and employing the dialect of the time, her co-author called it “a different way of knowing and telling the past.” After publication, they began receiving etymology queries from the Oxford English Dictionary.

Lepore has called history “the anti-novel” and “the novel’s twin.” Both history and the novel took the forms we recognize today over roughly the same period, emerging as ways of making sense of the world during a time of great changes, and over the past two centuries or so they have followed parallel paths. Between them was often the boundary of the self: history happens out in the world, growing out of inquiries into its documents, registers, and other records, while novels present an experience of it.

Link to the rest at The New York Review of Books

The Growing Crisis in Research

17 October 2018

From Plagiarism Today:

Last week, two of the largest academic publishers filed a lawsuit against the social networking site ResearchGate, saying that the site is not doing enough to discourage the pirating of academic papers that they hold the copyright to.

It is their second lawsuit against ResearchGate; the first was filed in Germany last year. That case is ongoing.

Meanwhile, China has been working for more than five months on creating a blacklist of “poor quality” journals that their scientists should not submit to. Once the list is complete and implemented, research published in those journals will not count toward a scientist’s promotion prospects or grant funding.

While these two stories might seem completely separate, they are actually both symptoms of a growing crisis in research. It’s a crisis with complicated origins and no easy resolution, but it’s also a problem that strikes at the core of how we share the latest scientific knowledge.

However, the crisis can be summed up like this: reviewing and publishing research costs money, and no one is really sure how best to pay for it.

. . . .

Frustratingly, there’s no simple beginning to the problem. Though, as with many things in our modern world, it’d be easy to blame it on the internet, the truth is that many of the dominos were in place and falling long before the internet even existed.

The internet certainly contributed, but can’t be pinned as the cause.

Instead, the cause can be traced to four distinct contributing factors.

  1. Pressure to Publish: Researchers have felt growing pressure to publish. The environment is often described as “publish or perish,” as publication is required to maintain one’s position, seek promotion, or secure funding.
  2. Limited Publication Space: Though publishers have increased the number of journals available, the numbers haven’t risen as quickly as the number of submissions, making competition for the top journal spots especially intense.
  3. Increased Costs of Subscription: At a time where academic libraries either have stagnant or shrinking budgets, the cost of subscribing to even the most noted journals is increasing, causing many to reduce the number of subscriptions they keep.
  4. The Ease of Piracy: The internet has made it easy to share academic research broadly, with or without approval from the copyright holder. Though research was not at the forefront of the early piracy battles, it’s become the subject of a growing piracy landscape, one dominated by Sci-Hub but also compounded by stories like the ResearchGate one.

The issue is that it costs money to publish an academic journal and, whether publishers are for-profit or non-profit, they have to recoup those costs. However, as 2008 research showed, the budgets of academic libraries, the primary consumers of such journals, have either shrunk or remained flat. This has resulted in many universities scaling back their subscriptions.

. . . .

Open access is a fairly straightforward concept: when an article is published, it should be easily and freely accessible to everyone. Usually, such articles are published under a Creative Commons or similar license, meaning that users are free to copy, share, and distribute the research as they please.

The idea began in the early 1990s and expanded rapidly in the 2000s with the launch of PLOS One, the largest and best-known open access journal.

The benefits of open access are obvious. There is no paywall or barrier between a research paper and those who might use it. Anyone can read or build off of open access research at any time. This is especially positive in cases where research is government-funded but might otherwise be hidden away from public consumption.

For researchers, the benefits are also obvious. Studies have found that open access works are cited more regularly, which increases both the impact of their research and their reputation in the academic community.

However, where traditional journals charge for access to a work, open access journals have to recoup their costs elsewhere. They do this one of two ways:

  1. Charging Article Processing Fees: The journal charges the author, either when an article is submitted or after it is accepted. Approximately 28% of open access journals use this model, including PLOS One. At PLOS journals, those fees range from $1,595 to $3,000, depending on the specific journal.
  2. Subsidized: Other journals charge neither article processing fees nor access fees; instead, they rely on a direct subsidy from a university, laboratory, or other research entity, or adopt a different business model, such as advertising or selling reprints, to cover costs.

While many journals use both approaches successfully and ethically, each can also create problems.

Article processing fees, for example, have led to the rise of predatory journals. Though the issue of journals publishing fake science is as old as research itself, article processing fees have turned it into a business model. No longer having to fight for subscribers, many journals will simply publish anything for a fee, even if it’s nonsense.

There have been many attempts to stop predatory journals, or at least make scientists aware of them. However, many are still caught unaware and, due to the aforementioned “publish or perish” environment, some publish in such journals willingly.

This is why China is working on its list of low-quality journals.

Link to the rest at Plagiarism Today

Epidemics as Entertainment

16 October 2018

From JSTOR:

Plagues are a staple of modern-day popular culture. There are the actual news cycles dominated by the disease du jour—Ebola, H1N1, SARS. Then there’s an entire genre of film based on mysterious bugs that wipe out populations, or turn people into the murderous undead. A popular online game, Pandemic, has you deliberately infecting as many countries as possible.

You would think that our fascination with epidemics might be a natural human reaction to a threat, and yet, our specific fear of certain diseases has little to do with their actual degree of deadliness. We pay much less attention to hypertension, respiratory diseases, and diabetes, which according to the Centers for Disease Control and Prevention are among the deadliest conditions in the United States.

Scholar Nancy Tomes set out to interrogate what exactly makes an illness newsworthy. She writes that although “historians and literary critics have produced many useful studies of individual diseases, they have only rarely explored the larger process of competition for attention that operated in making one disease seem a more compelling subject than another.”

Tomes revisited the literature and culture of the early twentieth century, interrogating past patterns of mortality and press coverage to determine what factors might contribute to the public’s fervor around a particular disease. What she found was a complex interplay of factors: political agendas, scientific advancement, and media interest fueled fear more than a straightforward correlation between “danger” and “publicity.”

From “parrot fever” (a respiratory infection caused by an avian-borne strain of chlamydia) to “undulant fever” (also known as brucellosis, caused by a bacteria in raw milk), Tomes investigates which diseases were seen as newsworthy, which were best suited to plays and entertainment that captured the public’s imagination, and which could be safely assigned to “the other” and therefore titillate as well as deepen social divides for political reasons. None of these on their own create a public pandemic panic—but together, they’re a powerful vehicle that can distract us from what’s really dangerous.

. . . .

For example, we are familiar with tuberculosis from its prominent place in the art and culture of the early twentieth century. Given its prevalence in popular culture, we may assume it was decimating populations. According to Tomes, however, T.B.’s notoriety is not a fully accurate representation of the peril it caused. While T.B. was indeed a fatal and contagious disease, other equally terrible afflictions were not awarded the same level of attention. Sexually transmitted diseases, for example, were on the rise and taking a toll on the population at the time, but were not seen as fit for public discussion.

“T.B. became the ‘master disease’ of early-twentieth-century reformers and editors not because it was on the rise but because it served other compelling agendas,” writes Tomes. It bolstered the then-new germ theory of disease. Although it hadn’t previously been considered a communicable disease, when the medical profession proved its contagious nature, germ theory took hold in the public’s mind.

. . . .

“Themes of cross-class infection and romance were staples of… disease melodramas. Sick workers infected the beloved wives and children of greedy businessmen and slum lords; innocent young women and noble doctors served as instruments of their salvation,” writes Tomes. She also notes that politically, “T.B. served well as a vehicle for pushing a wide range of societal reforms aimed at easing the dislocations of urbanization and industrialization.”

Link to the rest at JSTOR

Matters of Tolerance

14 October 2018

From The New York Review of Books:

Scientists and engineers recognize an elusive but profound difference between precision and accuracy. The two qualities often go hand in hand, of course, but precision involves an ideal of meticulousness and consistency, while accuracy implies real-world truth. When a sharpshooter fires at a target, if the bullets strike close together—clustered, rather than spread out—that is precise shooting. But the shots are only accurate if they hit the bull’s eye. A clock is precise when it marks the seconds exactly and unvaryingly but may still be inaccurate if it shows the wrong time. Perversely, we sometimes value precision at the expense of accuracy.
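The sharpshooter analogy maps neatly onto two familiar statistics: the spread of the shots (precision) and the offset of their center from the bull’s eye (accuracy). A brief sketch, with made-up numbers rather than anything from the book or the review:

```python
import statistics

def describe_shots(shots, bullseye=0.0):
    """Summarize 1-D shot positions relative to a target.

    Low spread -> precise (the shots cluster tightly).
    Low bias   -> accurate (the cluster centers on the bull's eye).
    """
    bias = statistics.mean(shots) - bullseye  # accuracy: offset of the center from the target
    spread = statistics.pstdev(shots)         # precision: how tightly the shots cluster
    return bias, spread

# Tightly clustered but off-target: precise, not accurate.
precise_not_accurate = [2.01, 1.99, 2.00, 2.02]

# Centered on the target but scattered: accurate, not precise.
accurate_not_precise = [-1.5, 1.4, 0.2, -0.1]
```

The first group would score well on precision and poorly on accuracy; the second, the reverse. The same split is why a clock can tick flawlessly and still show the wrong time.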

Simon Winchester, whose The Perfectionists ventures a history of this abstract concept, offers another way of looking at the distinction: a Rolls-Royce automobile, the 1984 Camargue model. In the course of a story filled with wonderful machines of every type, Winchester reveals himself to be something of a Rolls-Royce fanboy, but he declares this one to have been an ugly behemoth:

While the engineers had lovingly made yet another model of a car that enjoyed great precision in every aspect of its manufacture, those who had commissioned and designed and marketed and sold it had no feel for the accuracy of their decisions.

Winchester is a longtime journalist turned author, a meticulous researcher and catholic thinker who has written superb books about The Oxford English Dictionary, the Krakatoa eruption, the birth of modern geology, and (separately) the Atlantic and Pacific Oceans. Compared with topics like those, precision may seem an odd choice. What does it mean to write a history of so abstract a concept? Where does it even begin?

First Winchester needs to convince us that precision is a thing. It is, he tells us, a component of machines, and for that matter “an essential component of the modern world,…invisible, hidden in plain sight.” Besides being a component, it is a “phenomenon” that has transformed human society. We take it for granted, like the air we breathe, though we are suckers for precision snow tires and precision beard trimmers and we aspire to precision medicine and precision tattoo removal. It is “an essential aspect of modernity that makes the modern possible,” Winchester writes:

Precision is an integral, unchallenged, and seemingly essential component of our modern social, mercantile, scientific, mechanical, and intellectual landscapes. It pervades our lives entirely, comprehensively, wholly.

Which of the sciences are the most precise? Biology is messy, a science of divergence and variation, of creatures in all shapes and sizes. “Astronomical precision” is an oxymoron, astronomy being full of approximations and guesses piled atop one another—although the instruments of astronomy are tools of increasing and, lately, astounding precision. Mathematical precision trumps astronomical precision; mathematics is precise by definition. Winchester is not exploring the world of abstractions, though, but the real world, where people make things. His father was a precision engineer who turned metal into the most perfect machinery possible. Wood is nice but imprecise. The story of precision begins with metal.

. . . .

In Scotland, James Watt was designing a new engine to pump water by means of the power of steam. In England, John “Iron-Mad” Wilkinson was improving the manufacture of cannons, which were prone to exploding, with notorious consequences for the sailors manning the gun decks of the navy’s ships. Rather than casting cannons as hollow tubes, Wilkinson invented a machine that took solid blocks of iron and bored cylindrical holes into them: straight and precise, one after another, each cannon identical to the last. His boring machine, which he patented, made him a rich man.

Watt, meanwhile, had patented his steam engine, a giant machine, tall as a house, at its heart a four-foot-wide cylinder in which blasts of steam forced a piston up and down. His first engines were hugely powerful and yet frustratingly inefficient. They leaked. Steam gushed everywhere. Winchester, a master of detail, lists the ways the inventor tried to plug the gaps between cylinder and piston: rubber, linseed oil–soaked leather, paste of soaked paper and flour, corkboard shims, and half-dried horse dung—until finally John Wilkinson came along. He wanted a Watt engine to power one of his bellows. He saw the problem and had the solution ready-made. He could bore steam-engine cylinders from solid iron just as he had naval cannons, and on a larger scale. He made a massive boring tool of ultrahard iron and, with huge iron rods and iron sleighs and chains and blocks and “searing heat and grinding din,” achieved a cylinder, four feet in diameter, which as Watt later wrote “does not err the thickness of an old shilling at any part.”

By “an old shilling” he meant a tenth of an inch, which is a reminder that measurement itself—the science and the terminology—was in its infancy. An engineer today would say a tolerance of 0.1 inches.

Link to the rest at The New York Review of Books
