How ‘Sherlock of the library’ cracked the case of Shakespeare’s identity

From The Guardian:

Deep in the Folger Library, in Washington DC, Heather Wolfe says that studying Shakespeare makes an ideal preparation for the onset of Trump’s America. You can see her point: Shakespeare would have revelled in the mad excesses, the sinister vanities and the pervasive stench of cronyism and corruption surrounding the president-elect as America makes the painful transition from Barack Obama.

Dr Wolfe is a willowy, bright-eyed manuscript scholar, a paleographer specialising in Elizabethan England who in certain moods of candour might put you in mind of Portia or perhaps Cordelia. She’s also a Shakespeare detective who, last year, made the career-defining discovery that is going to transform our understanding of Shakespeare’s biography. In the simplest terms, Wolfe delivered the coup de grace to the wild-eyed army of conspiracy theorists, including Vanessa Redgrave and Derek Jacobi, who contest the authenticity, even the existence, of the playwright known to contemporaries as Master Will Shakespeare.

Wolfe is an accidental sleuth. Her scholar’s passion is as much for old manuscripts as for the obscurities surrounding our national poet. Project Dustbunny, for example, one of her initiatives at the Folger Shakespeare Library, has made some extraordinary discoveries based on microscopic fragments of hair and skin accumulated in the crevices and gutters of 17th-century books.

. . . .

Before Wolfe arrived on the scene, all that scholars could be certain about was that a man named Shaxpere, Shaxberd or Shakespear was born in Stratford in 1564, and that he was an actor whose name is printed in the collected edition of his work published in 1623. We also know that he married Anne Hathaway, and died in 1616, according to legend, on his birthday, St George’s Day. The so-called “Stratfordian” case for Shakespeare rested on these, and a few other facts, but basically, that was it.

. . . .

Wolfe’s appetite for manuscript corroboration has led her into many dusty corners of the Elizabethan archives. It was this research instinct that first led her to reopen the file on the coat of arms granted to Shakespeare’s father, the small-town glover, in 1596.

John Shakespeare, from Stratford-upon-Avon, was ambitious to rise in the world. He was certainly not the first Englishman keen to put his origins as a provincial tradesman behind him. Among his contemporaries in Stratford, he was a figure of fun for his social climbing. English class snobbery has a long pedigree. His son, who would continue the quest for official recognition after his father’s death, also attracted metropolitan disdain as “an upstart crow beautified with our feathers”. In 1601, after his father’s death, Shakespeare the upstart returned to the college of arms to renew the family application for a coat of arms. He had made a small fortune in the theatre, and was buying property in and around Stratford. Now he set out to consolidate his reputation as a “Gentleman”. Under the rules that governed life at the court of Elizabeth I, only the Queen’s heralds could grant this wish.

A much-reproduced sketch for a coat of arms crystallised Shakespeare’s hopes for legitimacy in the antique jargon of heraldry: “On a Bend Sables, a Speare of the first steeled argent. And for his Crest, a falcon, his winges displayed Argent, supporting a Speare Gould …” The needy applicant also attached a motto: Non Sanz Droit (“Not Without Right”). All this, and much more, is buried in the archives of the college of arms in London.

Wolfe’s fascination with Shakespeare’s quest for a family crest grew out of her immersion in the manners and customs of late Elizabethan England, in particular the College of Heralds. These court officials were required to administer the complex rituals governing the lives of the knights, barons and earls surrounding Queen Elizabeth.

An adjunct to the court, the College of Heralds was not exempt from its own secret feuds. In 1602, the internecine rivalry between Sir William Dethick, the Garter King of Arms, and another herald, Ralph Brooke, burst into the open when Brooke released a list of 23 “mean persons” whose applications for crests (he claimed) had been wrongfully preferred by Dethick. When “Shakespeare the Player” found himself on this list, his campaign for social advancement seemed in jeopardy. A bitter row broke out at court between two factions. Shakespeare himself became an object of ridicule. Another rival, Ben Jonson, in his satire Every Man out of his Humour, poked fun at him as a rustic buffoon who pays £30 for a ridiculous coat of arms with the humiliating motto “Not Without Mustard”.

It’s at this point in the story that Wolfe discovered “the smoking gun”. In the Brooke-Dethick feud, it becomes clear that “Shakespeare, Gent. from Stratford” and “Shakespeare the Player” are the same man. In other words, “the man from Stratford” is indeed the playwright. Crucially, in the long-running “authorship” debate, this has been a fiercely contested point. But Wolfe’s research nails any lingering ambiguity in which the Shakespeare deniers can take refuge.

Link to the rest at The Guardian and thanks to Matthew for the tip.

A Brief Survey of the Great American Novel(s)

From Literary Hub:

On this date in 1868, novelist John William DeForest coined the now inescapable term “the great American novel” in the title of an essay in The Nation. Now, don’t forget that in 1868, just a few years after the end of the Civil War, “America” was still an uncertain concept for many—though actually, in 2017 we might assert the same thing, which should give you a hint as to why the term “great American novel” is so problematic.

At the time of his writing, DeForest claimed that the Great American Novel, which he defined as “the picture of the ordinary emotions and manners of American existence,” had not yet been achieved, though he thought he could spot it on the horizon—he noted that Uncle Tom’s Cabin was “the nearest approach to the desired phenomenon.” (He also pooh-poohed both Hawthorne’s The Scarlet Letter and Cooper’s The Last of the Mohicans, which is why, though others have dubbed them GANs, they don’t appear below.)

In the nearly 150 years since the essay was written, the argument over the Great American Novel—what it is, what it should be, do we have one, do we need one, why so many white men—has gone on and on. As A.O. Scott memorably put it, “the Great American Novel, while also a hybrid (crossbred of romance and reportage, high philosophy and low gossip, wishful thinking and hard-nosed skepticism), may be more like the yeti or the Loch Ness monster—or Sasquatch, if we want to keep things homegrown. It is, in other words, a creature that quite a few people—not all of them certifiably crazy, some of them bearing impressive documentation—claim to have seen.”

. . . .

F. Scott Fitzgerald, The Great Gatsby

Gatsby’s magic emanates not only from its powerhouse poetic style—in which ordinary American language becomes unearthly—but from the authority with which it nails who we want to be as Americans. Not who we are; who we want to be. It’s that wanting that runs through every page of Gatsby, making it our Greatest American Novel. But it’s also our easiest Great American Novel to underrate: too short; too tempting to misread as just a love story gone wrong; too mired in the Roaring Twenties and all that jazz.

–Maureen Corrigan, So We Read On: How The Great Gatsby Came to Be and Why It Endures, 2014

. . . .

Mark Twain, The Adventures of Huckleberry Finn

There was no sense [upon its publication] that a great American novel had landed on the literary world of 1885. The critical climate could hardly anticipate T. S. Eliot and Ernest Hemingway’s encomiums 50 years later. In the preface to an English edition, Eliot would speak of “a masterpiece. … Twain’s genius is completely realized,” and Ernest went further. In “Green Hills of Africa,” after disposing of Emerson, Hawthorne and Thoreau, and paying off Henry James and Stephen Crane with a friendly nod, he proceeded to declare, “All modern American literature comes from one book by Mark Twain called Huckleberry Finn. … It’s the best book we’ve had. All American writing comes from that. There was nothing before. There has been nothing as good since.” … What else is greatness but the indestructible wealth it leaves in the mind’s recollection after hope has soured and passions are spent? It is always the hope of democracy that our wealth will be there to spend again, and the ongoing treasure of Huckleberry Finn is that it frees us to think of democracy and its sublime, terrifying premise: let the passions and cupidities and dreams and kinks and ideals and greed and hopes and foul corruptions of all men and women have their day and the world will still be better off, for there is more good than bad in the sum of us and our workings. Mark Twain, whole embodiment of that democratic human, understood the premise in every turn of his pen, and how he tested it, how he twisted and tantalized and tested it until we are weak all over again with our love for the idea.

–Norman Mailer, The New York Times, 1984

. . . .

Saul Bellow, The Adventures of Augie March

The Adventures of Augie March is the Great American Novel. Search no further. All the trails went cold forty-two years ago. The quest did what quests very rarely do: it ended. … Augie March, finally, is the Great American Novel because of its fantastic inclusiveness, its pluralism, its qualmless promiscuity. In these pages the highest and lowest mingle and hobnob in the vast democracy of Bellow’s prose. Everything is in here, the crushed and the exalted, and all the notches in between, from the kitchen stiff… to the American eagle.

–Martin Amis, The Atlantic Monthly, 1995

Link to the rest at Literary Hub

A Peek Inside the Strange World of Fake Academia

From The New York Times:

The caller ID on my office telephone said the number was from Las Vegas, but when I picked up the receiver I heard what sounded like a busy overseas call center in the background. The operator, “John,” asked if I would be interested in attending the 15th World Cardiology and Angiology Conference in Philadelphia next month.

“Do I have to be a doctor?” I said, because I’m not one. I got the call because 20 minutes earlier I had entered my phone number into a website run by a Hyderabad, India, company called OMICS International.

“You can have the student rate,” the man replied. With a 20 percent discount, it would be $599. The conference was in just a few weeks, I pointed out — would that be enough time for the academic paper I would be submitting to be properly reviewed? (Again, I know nothing about cardiology.) It would be approved on an “expedited basis” within 24 hours, he replied, and he asked which credit card I would like to use.

If it seems that I was about to be taken, that’s because I was. OMICS International is a leader in the growing business of academic publication fraud. It has created scores of “journals” that mimic the look and feel of traditional scholarly publications, but without the integrity. This year the Federal Trade Commission formally charged OMICS with “deceiving academics and researchers about the nature of its publications and hiding publication fees ranging from hundreds to thousands of dollars.”

. . . .

OMICS is also in the less well-known business of what might be called conference fraud, which is what led to the call from John. Both schemes exploit a fundamental weakness of modern higher education: Academics need to publish in order to advance professionally, get better jobs or secure tenure. Even within the halls of respectable academia, the difference between legitimate and fake publications and conferences is far blurrier than scholars would like to admit.

. . . .

In October, a New Zealand college professor submitted a paper to the OMICS-sponsored “International Conference on Atomic and Nuclear Physics,” which was held last month at the Hilton Atlanta Airport. It was written using the autocomplete feature on his iPhone, which produced an abstract that begins as follows: “Atomic Physics and I shall not have the same problem with a separate section for a very long long way. Nuclear weapons will not have to come out the same day after a long time of the year he added the two sides will have the two leaders to take the same way to bring up to their long ways of the same as they will have been a good place for a good time at home the united front and she is a great place for a good time.”

The paper was accepted within three hours.

An OMICS employee who identified himself as Sam Dsouza said conference papers are reviewed by its “experts” within 24 hours of submission. He couldn’t provide a list of its reviewers or their credentials.

Link to the rest at The New York Times

Is Poetry Making a Comeback?

From BookRiot:

High school-me wanted to become a poet. So did college-me. Then a professor pointed out that to become a poet, one accepts that very few people, and often nobody at all, will read your work. Poverty and a small audience were the burden of the genre. So, my poet days were over.

Which is ironic, because that decision in college did not actually make me reject poetry. In fact, in some ways, it made poetry more inviting, more delicious–the thing I read and wrote when the rest of the world’s expectations were put on hold. And now, sometimes, I want to hold people down and force them to read poetry.

. . . .

But then again, maybe my professor was wrong. Maybe in a traditionalist sense, yes, many chapbooks aren’t read. Most poets go into the world only to be read by other poets. What’s important here is to note that a rejection of online audiences by traditional poets maintains the isolation of poetry. It seems that 2016 was the year for online poetry access to crack open wide.

Link to the rest at BookRiot

Not My Sherlock

From Literary Hub:

Sherlock Holmes holds the distinction of being the literary character most frequently portrayed on screen. This, necessarily, has resulted in countless reinventions—the most recent of which is the version of Holmes who appears in “The Six Thatchers,” the first episode of the new season of the BBC’s Sherlock.

The problem with any reinvention of a beloved character—superhero, spy, Victorian detective—is that character’s fans, and the weight of those fans’ expectations. We demand a certain number of fangirl callouts. We get upset if the character’s idiosyncrasies aren’t name-checked, if their catchphrases are left unemployed. Their signature hats not at least nodded to. I’m not immune: I have my own wish list, too.

What I want from a Sherlock Holmes adaptation is Holmes and Watson solving crimes together.

In “The Six Thatchers,” our titular detective delivers a rapid-fire monologue to the person in the armchair across from him. “You see, but you don’t observe,” he chides, accusing his partner of “romantic whimsy” while he himself runs on logic and reason. Right as we’re in the thick of his rhetoric, he bends to pick up his partner’s stuffed toy. This is the joke: that the speech is delivered to one Rosie Watson, Dr. Watson’s infant daughter.

How much do I wish it had been Dr. Watson instead.

But this is almost always the joke in Sherlock, that we’re given the Holmesiana we want in the archest possible way. The majority of that speech is drawn from Sir Arthur Conan Doyle’s excellent “A Scandal in Bohemia,” the first of the Holmes short stories to be published in the Strand, where it’s delivered to the actual Watson. Toby, the dog that Sherlock borrows in “The Six Thatchers” to trail a suspect through London, sits on the pavement and refuses to move. Maybe that’s because he’s been taken from The Sign of Four, the novel which first introduces Mary Morstan. As is the Agra treasure, which in the show is no chest of gold but instead a memory stick containing Mary’s sordid past. After Sherlock recovers it in “The Six Thatchers,” we watch Mary flee her husband and infant daughter to finally put her past to rest. It’s a strong impulse, but delivered here in a five-minute montage of her donning disguises and fetching hidden passports only to find Sherlock Holmes having beaten her to her end point. Where the two of them have a good laugh about it. The steps we’ve followed are meaningless ones.

As the episode progresses, each character gets a chance to wink broadly at the camera. Then the Doyle stories are stuck back into the blender, and the plot hurries on.

Link to the rest at Literary Hub

Here’s looking at you

Although this is January 2, it is a holiday in the US because January 1 fell on a Sunday. It is also the last widely recognized non-working holiday for a while.

So far as PG is able to ascertain from online activity, the entire publishing and self-publishing industry is off duty. PG will probably go off duty pretty soon.

So, he’s posting some items that interest him, even though he might consider them a bit far afield on a more normal day.

From Aeon:

About 25 minutes into the action film Iron Man 2 (2010), there is an explosive sequence in the middle of an auto race through the streets of Monaco. The scene is a technical tour de force, with explosions, cars flipping and fire everywhere, all in front of thousands of panicked race spectators. At a 2014 event at the Academy of Motion Picture Arts and Sciences, the film’s director Jon Favreau got to see the eye movements of audience members who watched the clip. He told us he was thrilled – and relieved – to see that everyone was watching the actors Robert Downey Jr and Mickey Rourke, particularly their faces and hands, and that nobody was looking at the crowd – because the crowd was all computer-generated, and if you look closely they don’t look all that real. As long as you don’t look closely, Favreau (who was also an executive producer) could go a little cheap on these effects and save the money for where it would really count.

This phenomenon – the audience’s eyes moving in unison – is characteristic of film viewing. It is not typical of real-world vision. Rather, filmmakers use editing, framing and other techniques to tightly control where we look. Over 125 years, the global filmmaking community has been engaged in an informal science of vision, conducting a large number of trial-and-error experiments on human perception. The results are not to be found in any neuroscience or psychology textbook, though you can find some in books on cinematography and film editing, and in academic papers analysing individual films. Other insights are there in the films themselves, waiting to be described. In recent years, professional scientists have started to mine this rich, informal database, and some of what we have learned is startling.

To understand how the eyes are affected by movies, you need to know a bit about how they work outside the theatre. When we are just living our lives, our eyes jump from one location to another two or three times per second, taking in some things and skipping over others. Those jumps are called saccades. (Our eyes also make smooth tracking movements, say when we are following a bird in the sky or a car on the road, but those are somewhat rare.) Why do we do this? Because our brains are trying to build a reasonably complete representation of what is happening using a camera – the eye – that has high resolution only in a narrow window. If any visual detail is important for our understanding of the scene, we need to point our eyes at it to encode it.

The way people use eye movements to explore a scene has a consistent rhythm that involves switching between a rapid exploratory mode and a slower information-extraction mode. Suppose you check into a resort, open a window, and look out on a gorgeous beach. First, your eyes will rapidly scan the scene, making large movements to fix on objects throughout the field of what you can see. Your brain is building up a representation of what is there in the scene – establishing the major axes of the environment, localising landmarks within that space, categorising the objects. Then, you will transition to a slower, more deliberate mode of seeing. In this mode, your eyes will linger on each object for longer, and your eye movements will be smaller and more deliberate. Now, your brain is filling in details about each object. Given enough time, this phase will peter out. At this point, you might turn to another window and start all over again, or engage in a completely different activity – writing a postcard or unpacking.

. . . .

In these initial studies, Hochberg and Brooks documented the switch from an exploratory phase lasting a few seconds at most to an information-extraction phase. The duration of each exploratory phase depended on the complexity of the slide being viewed. When presented with more complex pictures, people explored for longer before they settled down. And when they were given a choice between looking at more complex and less complex images, they spent more time looking at the more complex images. Later researchers investigated which visual features specifically draw the eyes, finding that viewers tend to look at parts of an image with edges, with a lot of contrast between light and dark, with a lot of texture, and with junctures such as corners.

. . . .

One difference between real-world scenes and film is that movies move. How does this change what people look at? In a recent experiment, Parag Mital, Tim Smith, Robin Hill and John Henderson from the University of Edinburgh recorded eye movements from a few dozen people while they watched a grab-bag of videos, including ads, documentaries, trailers, news and music videos. A number of effects carried over from looking at still pictures. People still look at places with a lot of contrast, and at corners. However, with moving pictures, new effects dominate: viewers look at things that are moving, and at things that are going from light to dark or from dark to light. This makes good ecological sense: things that are changing are more likely to be relevant for guiding your actions than things that are just sitting there. In particular, the eyes follow new motion that could reveal something that you need to deal with in a hurry – an object falling or an animal on the move.

Motion onsets are known to powerfully capture attention, even more quickly than the eyes can move. For example, when we first see Edward Scissorhands in the 1990 Tim Burton film of the same name, he is attempting to hide in shadow in a complex scene. It is the involuntary movement of his scissors that gives him away, attracting the viewer’s eye at the same moment it attracts the eye of Peg the Avon Lady.

. . . .

People also tend to blink at cuts. This might be a response to the sudden change in brightness that can occur at a cut. Or, it might reflect that our brains are taking the visual change as a sign to take a very quick break, and using the opportunity to wet our eyes. Last but not least, when we watch edited movies we make more eye movements. Whereas eye movements happen two to three times a second when we are looking at the real world, they occur at a rate of about four per second when we watch unedited videos; add in editing, and the rate increases further to about five per second.

Link to the rest at Aeon

Fridges and washing machines could be vital witnesses in murder plots

From The Telegraph:

High-tech washing machines and fridges will soon be used by detectives gathering evidence from crime scenes, experts have forecast.

The advent of ‘the internet of things’, in which more devices are connected together in a world of ‘smart working’, could in future provide important clues for the police.

Detectives are currently being trained to look for gadgets and white goods which could provide a ‘digital footprint’ of victims or criminals.

Mark Stokes, the head of the digital, cyber and communications forensics unit at the Metropolitan Police, told The Times: “Wireless cameras within a device, such as a fridge, may record the movement of owners and suspects.”

. . . .

The new Samsung Family Hub Fridge has cameras that carry a live feed of its contents, so shoppers can tell what they need when they are out at the shop. The dates and times that people log on to the fridge could therefore provide alibis, or prove that people were not where they said they were.

Mr Stokes said detectives of the future would carry a ‘digital forensics toolkit’ which would allow them to analyse microchips and download data at the scene, rather than removing devices for testing.

Link to the rest at The Telegraph and thanks to Dan for the tip. Dan says, “Nothing to do with publishing, except new story ideas. Not only Alexa sits on the cutting edge of crime investigation.”
