Writing Advice

Our Brains Tell Stories so We Can Live

9 August 2019

From Nautilus:

We are all storytellers; we make sense out of the world by telling stories. And science is a great source of stories. Not so, you might argue. Science is an objective collection and interpretation of data. I completely agree. At the level of the study of purely physical phenomena, science is the only reliable method for establishing the facts of the world.

But when we use data of the physical world to explain phenomena that cannot be reduced to physical facts, or when we extend incomplete data to draw general conclusions, we are telling stories. Knowing the atomic weight of carbon and oxygen cannot tell us what life is. There are no naked facts that completely explain why animals sacrifice themselves for the good of their kin, why we fall in love, the meaning and purpose of existence, or why we kill each other.

Science is not at fault. On the contrary, science can save us from false stories. It is an irreplaceable means of understanding our world. But despite the verities of science, many of our most important questions compel us to tell stories that venture beyond the facts. For all of the sophisticated methodologies in science, we have not moved beyond the story as the primary way that we make sense of our lives.

To see where science and story meet, let’s take a look at how story is created in the brain. Let’s begin with an utterly simple example of a story, offered by E. M. Forster in his classic book on writing, Aspects of the Novel: “The king died and then the queen died.” It is nearly impossible to read this juxtaposition of events without wondering why the queen died. Even with a minimum of description, the construction of the sentence makes us guess at a pattern. Why would the author mention both events in the same sentence if he didn’t mean to imply a causal relationship?

Once a relationship has been suggested, we feel obliged to come up with an explanation. This makes us turn to what we know, to our storehouse of facts. It is general knowledge that a spouse can die of grief. Did the queen then die of heartbreak? This possibility draws on the science of human behavior, which competes with other, more traditional narratives. A high school student who has been studying Hamlet, for instance, might read the story as a microsynopsis of the play.

. . . .

The pleasurable feeling that our explanation is the right one—ranging from a modest sense of familiarity to the powerful and sublime “a-ha!”—is meted out by the same reward system in the brain integral to drug, alcohol, and gambling addictions. The reward system extends from the limbic area of the brain, vital to the expression of emotion, to the prefrontal cortex, critical to executive thought. Though still imperfectly understood, it is generally thought that the reward system plays a central role in the promotion and reinforcement of learning. Key to the system, and found primarily within its brain cells, is dopamine, a neurotransmitter that carries and modulates signals among brain cells. Studies consistently show that feeling rewarded is accompanied by a rise in dopamine levels.

. . . .

Critical to understanding how stories spark the brain’s reward system is the theory known as pattern recognition—the brain’s way of piecing together a number of separate components of an image into a coherent picture. The first time you see a lion, for instance, you have to figure out what you’re seeing. At least 30 separate areas of the brain’s visual cortex pitch in, each processing an aspect of the overall image—from the detection of motion and edges, to the register of color and facial features. Collectively they form an overall image of a lion.

Each subsequent exposure to a lion enhances your neural circuitry; the connections among processing regions become more robust and efficient. (This theory, based on the research of Canadian psychologist Donald O. Hebb, a pioneer in studying how people learn, is often stated as “cells that fire together wire together.”) Soon, less input is necessary to recognize the lion. A fleeting glimpse of a partial picture is sufficient for recognition, which occurs via positive feedback from your reward system. Yes, you are assured by your brain, that is a lion.

. . . .

Science is in the business of making up stories called hypotheses and testing them, then trying its best to make up better ones. Thought-experiments can be compared to storytelling exercises using well-known characters. What would Sherlock Holmes do if he found a body suspended in a tree with a note strapped to its ankle? What would a light ray being bounced between two mirrors look like to an observer sitting on a train? Once done with their story, scientists go to the lab to test it; writers call editors to see if they will buy it.

People and science are like bread and butter. We are hardwired to need stories; science has storytelling buried deep in its nature. But there is also a problem. We can get our dopamine reward, and walk away with a story in hand, before science has finished testing it. This problem is exacerbated by the fact that the brain, hungry for its pattern-matching dopamine reward, overlooks contradictory or conflicting information whenever possible. A fundamental prerequisite for pattern recognition is the ability to quickly distinguish between similar but not identical inputs. Not being able to pigeonhole an event or idea makes it much more difficult for the brain to label and store it as a discrete memory. Neat and tidy promotes learning; loose ends lead to the “yes, but” of indecision and inability to draw a precise conclusion.

Link to the rest at Nautilus

Why You Should Have (at Least) Two Careers

8 August 2019

From The Harvard Business Review:

It’s not uncommon to meet a lawyer who’d like to work in renewable energy, or an app developer who’d like to write a novel, or an editor who fantasizes about becoming a landscape designer. Maybe you also dream about switching to a career that’s drastically different from your current job. But in my experience, it’s rare for such people to actually make the leap. The costs of switching seem too high, and the possibility of success seems too remote.

But the answer isn’t to plug away in your current job, unfulfilled and slowly burning out. I think the answer is to do both. Two careers are better than one. And by committing to two careers, you will produce benefits for both.

In my case, I have four vocations: I’m a corporate strategist at a Fortune 500 company, US Navy Reserve officer, author of several books, and record producer. The two questions that people ask me most frequently are “How much do you sleep?” and “How do you find time to do it all?” (my answers: “plenty” and “I make the time”). Yet these “process” questions don’t get to the heart of my reasons and motivations. Instead, a more revealing query would be, “Why do you have multiple careers?” Quite simply, working many jobs makes me happier and leaves me more fulfilled. It also helps me perform better at each job. Here’s how.

. . . .

My corporate job paycheck subsidizes my record producing career. With no track record as a producer, nobody was going to pay me to produce his or her music, and it wasn’t money that motivated me to become a producer in the first place — it was my passion for jazz and classical music. Therefore, I volunteered so that I could gain experience in this new industry. My day job not only afforded me the capital to make albums, but it taught me the skills to succeed as a producer. A good producer should be someone who knows how to create a vision, recruit personnel, establish a timeline, raise money, and deliver products.

. . . .

At the same time, I typically invite my corporate clients to recording sessions. For someone who works at an office all day, it’s exciting to go “behind-the-scenes” and interact with singers, musicians, and other creative professionals.

. . . .

[O]ne of my clients wanted to understand what Chinese citizens were saying to each other. Because I am an author, I have gotten to know other writers, so I reached out to my friend who was a journalist at a periodical that monitors chatter in China. Not restricted by the compliance department of a bank, he was able to give an unbridled perspective to my client, who was most appreciative.

Link to the rest at The Harvard Business Review

PG didn’t notice this when it first appeared in 2017, but thinks it may be interesting for many indie authors.

The Creative Compulsions of OCD

7 August 2019

From The Paris Review:

Here is my morning routine: when I get out of bed, my feet must touch the edge of the rug, one at a time, while I softly vocalize two magic words that are best described as puffing and plosive sounds. If my feet don’t touch correctly, or if I don’t say the words right, I get back in bed and try again. Once I have properly performed this initial procedure, I again tap my left foot on the carpet while vocalizing the first magic word, and then—while holding my breath and without moving my mouth or tongue one millimeter during the duration—I silently incant a phrase that is far too nonsensical and embarrassing to share publicly, then tap my right foot while vocalizing the second magic word.

This can take anywhere from ten seconds, if I’m lucky, to two or three minutes. Once executed to my satisfaction, I am able to go downstairs, unplug my phone and perform roughly the same procedure on it, with my thumbs instead of my feet, and then I am allowed to use my phone. Likewise, the refrigerator door when I’m making coffee. Likewise, the edges of my laptop when I power it on. With these routines completed, I can start my day, open a Word document, and begin writing.

I realize this sounds bad, but it’s a compromise I’ve reached after decades of managing my obsessive-compulsive disorder. I’ve gone cold turkey before, renouncing all habits and tics, but they eventually creep back in. A therapist once described OCD behaviors as a “blob,” which felt apt—whatever part of it you press down on, another part bulges back up. These little routines are, in a sense, a deal I make with myself, so I don’t have to perform random routines all day long. Not doing them is not an option. If I don’t do them, the world will end.

I can’t remember exactly when it began. As is true of so many disorders, medical literature generally links OCD with the onset of adolescence, and this tracks with my earliest OCD memories: missing the bus to middle school because I had to touch mailboxes and the curb in a certain sequence; playing songs on my cassette deck over and over in order to pause on an exact word or chord; staying up in my teenage basement lair, flicking the lights on and off in patterns that, if my parents had noticed, would have looked like some Morse code call for help, which in a way, it was.

The most vivid memory I have from this era is typing out the final draft of an English paper over and over. I’d written and revised it first by longhand in a notebook, in anticipation of the Sisyphean task to come. The rule I’d set for myself—or rather, the rule that had mercilessly evolved over the course of the school year—was that if I made one error typing, I had to erase the whole thing and start over. When I typoed in the final paragraph, I deleted all four pages and took a break to cry. I finally pecked the last period when the sun was coming up.

. . . .

One of my earliest memories is sitting on the carpeted stairs of our home, suddenly aware of my heartbeat. I was perhaps four, and I was convinced that I was having a heart attack. I listened to it thudding away in my chest and expected to die with each little rush. When I was eight or nine, I became terrified by and obsessed with a Time Magazine article about something called AIDS.

. . . .

The vague motivating threat for not touching a curb or counting to a certain number was always an imaginary illness—ironic given the very real illness I actually had: although it took me decades to realize it, by thirteen, I was suffering from a full-blown mental health crisis.

. . . .

Controlling a sentence—controlling this sentence, as I type—is for me the best, most pleasurable work there is. I build the paragraph, tagged by its thematic first word: control. In crafting this sentence, this paragraph, this essay, I get to be both architect and construction worker, and both jobs offer equally pleasing aspects of control. The former involves creative design and abstract thought; the latter brings the visceral, simultaneously logical and intuitive pleasure of finding the right word, moving it around, putting it in just the right place. Having written that sentence, I know I must reverse myself and concede that the idea of there being “just the right place” is illusory—that even this work is, in its essence, as arbitrary as anything else. This is true, but nonetheless as I write, I shut out the world, other responsibilities, Twitter, the news, everything.

Link to the rest at The Paris Review

An Author Heads to the Stage

3 August 2019

From Publishing Perspectives:

As I bundled up my 225-page memoir manuscript and mailed it to editor Jane Rosenman, I hoped she would reveal the magic formula for transforming my pages into a book. I’d received glowing rejections but still no takers for my story, The Inheritance, about how, six weeks after my mother died, I discovered that she had disinherited me, and my quest to understand why.

Although Rosenman found much to praise, some aspects of my story still weren’t working, including a whiff of bitterness on the page. Yet who wouldn’t be bitter after being blindsided from beyond the grave? But the problem with bitterness, I later discovered, is that it lacks drama.

As I was revising the manuscript, I received an invitation to perform a 10-minute story with Portland Story Theater in Oregon, where I live. When I walked onto the stage, into the pressure cooker of live performance, something happened: my bitterness transformed into humor, and I discovered a liveliness and emotional depth that had not been as evident on the page.

Was I onto something that could help me crack open my story? To find out, I enrolled in a solo performance class with Seth Barrish at New York City’s Barrow Group Theatre, whom I then hired to help me craft a performance of my story. With script in hand, I secured a director—Lauren Bloom Hanover—and performed the 50-minute, one-person show, retitled Firstborn, at Performance Works Northwest in Portland, as part of the Fertile Ground Festival. My minitour culminated with my off-Broadway performance at the United Solo Theatre Festival last October, where Jane was in the audience.

. . . .

By telling my story on stage, I found not only its through line but also its beating heart. Writing for performance also gave me more to work with than just the words. Now I had my body, voice, lighting, and music, plus props and images. Also, I could take shortcuts: a transition could be made with a turn of my body or a look to the audience. As Jane said when I spoke with her afterward, the demands of performance helped me get to the “nub of the story.”

Link to the rest at Publishing Perspectives


‘Close’ Proximity, ‘End’ Result, and More Redundant Words to Delete from Your Writing

1 August 2019

From Medium:

There’s a lot of deleting in copyediting, not just of the “very”s and “rather”s and “quite”s and excrescent “that”s with which we all encase our prose like so much Bubble Wrap and packing peanuts, but of restatements of information — “as estab’d,” one politely jots in the margin.

Much repetition, though, comes under the more elementary heading of Two Words Where One Will Do, and here’s a collection of easily disposed of redundancies. Some of these may strike you as obvious — though their obviousness doesn’t stop them from showing up constantly. Others are a little more arcane — the sorts of things you could likely get away with without anyone’s noticing — but they’re snippable nonetheless.

In either case, for those moments when you’re contemplating that either you or your prose could stand to go on a diet and your prose seems the easier target, here’s a good place to start.

(The bits in italics are the bits you can dispose of.)

  • ABM missile
    ABM = anti-ballistic missile.
  • absolutely certain, absolute certainty, absolutely essential
  • added bonus
  • advance planning, advance warning
  • all-time record
    As well, one doesn’t set a “new record.” One merely sets a record.

. . . .

  • exact same 
    To be sure, “exact same” is redundant. To be sure, I still say it and write it.
  • fall down
    What are you going to do, fall up?
  • fellow countryman
  • fetch back 
    To fetch something is not merely to go get it but to go get it and return with it to the starting place. Ask a dog.
  • few in number
  • fiction novel 
    Appalling. A novel is a work of fiction. That’s why it’s called a novel. That said, “nonfiction novel” is not the oxymoron it might at first seem. The term refers to the genre pioneered — though not, as is occasionally averred, invented — by Truman Capote with In Cold Blood, that of the work of nonfiction written novelistically. Lately one encounters people referring to any full-length book, even a work of nonfiction, as a novel. That has to stop.
  • final outcome
  • follow after
  • free gift
    A classic of the redundancy genre, much beloved of retailers and advertisers.
  • from whence
    Whence means “from where,” which makes “from whence” pretty damn redundant. Still, the phrase has a lot of history, including, from the King James Version of the Bible, “I will lift up mine eyes unto the hills, from whence cometh my help.” So I suppose you can write “from whence” if you’re also talking about thine eyes and the place your help is comething from.
    For a dazzling (and purposeful) use of “from whence,” consider Frank Loesser’s Guys and Dolls lyric “Take back your mink / to from whence it came” — gorgeously appropriate for the tawdry nightclub number in which it’s sung.
  • frontispiece illustration
    A frontispiece is an illustration immediately preceding, and generally facing, a book’s title page.
  • full gamut
    A gamut is the full range or scope of something, so the word needs no modifier. Ditto “complete range,” “broad spectrum,” “full extent,” and their cousins.

Link to the rest at Medium

The Problem with Sarcasm

1 August 2019

From Seth’s Blog:

“Well, that was super helpful.”

Was it? Or are you trying to be sarcastic?

Because if it was helpful, you could simply write, “thank you, that was helpful.”

On the other hand, if you’re trying to express disappointment or displeasure, you could write, “I’m disappointed that you weren’t able to contribute more here. We were really looking forward to your input.”

The problem with sarcasm is that the level of displeasure is hidden. You might come across as snarky when you don’t mean to, or the snarkiness you were sending might not land.

My new rule of thumb is to always assume goodwill and ignore any perceived sarcasm. Call it a Type II sarcasm-detection error.

It’s hard to imagine a situation where sarcasm is the most effective way to make your point.

Link to the rest at Seth’s Blog

The Birth of the Semicolon

1 August 2019

From The Paris Review:

The semicolon was born in Venice in 1494. It was meant to signify a pause of a length somewhere between that of the comma and that of the colon, and this heritage was reflected in its form, which combines half of each of those marks. It was born into a time period of writerly experimentation and invention, a time when there were no punctuation rules, and readers created and discarded novel punctuation marks regularly. Texts (both handwritten and printed) record the testing-out and tinkering-with of punctuation by the fifteenth-century literati known as the Italian humanists. The humanists put a premium on eloquence and excellence in writing, and they called for the study and retranscription of Greek and Roman classical texts as a way to effect a “cultural rebirth” after the gloomy Middle Ages. In the service of these two goals, humanists published new writing and revised, repunctuated, and reprinted classical texts.

One of these humanists, Aldus Manutius, was the matchmaker who paired up comma and colon to create the semicolon. Manutius was a printer and publisher, and the first literary Latin text he issued was De Aetna, by his contemporary Pietro Bembo. De Aetna was an essay, written in dialogue form, about climbing volcanic Mount Etna in Italy. On its pages lay a new hybrid mark, specially cut for this text by the Bolognese type designer Francesco Griffo: the semicolon (and Griffo dreamed up a nice plump version) is sprinkled here and there throughout the text, conspiring with colons, commas, and parentheses to aid readers.

. . . .

The semicolon had successfully colonized the letter cases of the best presses in Europe, but other newborn punctuation marks were not so lucky. The humanists tried out a lot of new punctuation ideas, but most of those marks had short life spans. Some of the printed texts that appeared in the centuries surrounding the semicolon’s birth look as though they are written partially in secret code: they are filled with mysterious dots, dashes, swoops, and curlicues. There were marks for the minutest distinctions and the most specific occasions. For instance, there was once a punctus percontativus, or rhetorical question mark, which was a mirror-image version of the question mark. Why did the semicolon survive and thrive when other marks did not? Probably because it was useful. Readers, writers, and printers found that the semicolon was worth the trouble to insert. The rhetorical question mark, on the other hand, faltered and then fizzled out completely. This isn’t too surprising: does anyone really need a special punctuation mark to know when a question is rhetorical?

. . . .

In humanist times, just as in our own, hand-wringing sages forecast a literary apocalypse precipitated by too-casual attitudes about punctuation. “It is not concealed from you how great a shortage there is of intelligent scribes in these times,” wrote one French humanist to another,

and above all in transcribing those things which observe style to any degree; in which unless points and marks of distinctions, by which the style flows through the cola, commata, and periodi, are separated with more attentive diligence, that which is written is confused and barbarous … Which carelessness, in my opinion, has occurred chiefly since we have for a long time lacked eloquence, in which these things are necessary: the ancient manner of handwriting, therefore, in which the scribes of books (antiquarii) were gradually writing a perfect and correctly formed script with precise punctuation (certa distinctione) of clausulae and with notes of accentuation, has perished together with the art of expression (dictatu).

The entire art of expression—dead, because careless writers just couldn’t hack it when it came to punctuation.

Link to the rest at The Paris Review

What People Actually Say Before They Die

31 July 2019

From The Atlantic:

Mort Felix liked to say that his name, when read as two Latin words, meant “happy death.” When he was sick with the flu, he used to jokingly remind his wife, Susan, that he wanted Beethoven’s “Ode to Joy” played at his deathbed. But when his life’s end arrived at the age of 77, he lay in his study in his Berkeley, California, home, his body besieged by cancer and his consciousness cradled in morphine, uninterested in music and refusing food as he dwindled away over three weeks in 2012. “Enough,” he told Susan. “Thank you, and I love you, and enough.” When she came downstairs the next morning, she found Felix dead.

During those three weeks, Felix had talked. He was a clinical psychologist who had also spent a lifetime writing poetry, and though his end-of-life speech often didn’t make sense, it seemed to draw from his attention to language. “There’s so much so in sorrow,” he said at one point. “Let me down from here,” he said at another. “I’ve lost my modality.” To the surprise of his family members, the lifelong atheist also began hallucinating angels and complaining about the crowded room—even though no one was there.

Felix’s 53-year-old daughter, Lisa Smartt, kept track of his utterances, writing them down as she sat at his bedside in those final days. Smartt majored in linguistics at UC Berkeley in the 1980s and built a career teaching adults to read and write. Transcribing Felix’s ramblings was a sort of coping mechanism for her, she says. Something of a poet herself (as a child, she sold poems, three for a penny, like other children sold lemonade), she appreciated his unmoored syntax and surreal imagery. Smartt also wondered whether her notes had any scientific value, and eventually she wrote a book, Words on the Threshold, published in early 2017, about the linguistic patterns in 2,000 utterances from 181 dying people, including her father.

. . . .

To assess people’s “mental condition just before death,” MacDonald mined last-word anthologies, the only linguistic corpus then available, dividing people into 10 occupational categories (statesmen, philosophers, poets, etc.) and coding their last words as sarcastic, jocose, contented, and so forth. MacDonald found that military men had the “relatively highest number of requests, directions, or admonitions,” while philosophers (who included mathematicians and educators) had the most “questions, answers, and exclamations.” The religious and royalty used the most words to express contentment or discontentment, while the artists and scientists used the fewest.

MacDonald’s work “seems to be the only attempt to evaluate last words by quantifying them, and the results are curious,” wrote the German scholar Karl Guthke in his book Last Words, on Western culture’s long fascination with them. Mainly, MacDonald’s work shows that we need better data about verbal and nonverbal abilities at the end of life. One point that Guthke makes repeatedly is that last words, as anthologized in multiple languages since the 17th century, are artifacts of an era’s concerns and fascinations about death, not “historical facts of documentary status.” They can tell us little about a dying person’s actual ability to communicate.

. . . .

At the end of life, Keeley says, the majority of interactions will be nonverbal as the body shuts down and the person lacks the physical strength, and often even the lung capacity, for long utterances. “People will whisper, and they’ll be brief, single words—that’s all they have energy for,” Keeley said. Medications limit communication. So do dry mouth and lack of dentures. She also noted that family members often take advantage of a patient’s comatose state to speak their piece, when the dying person cannot interrupt or object.

. . . .

We have a rich picture of the beginnings of language, thanks to decades of scientific research with children, infants, and even babies in the womb. But if you wanted to know how language ends in the dying, there’s next to nothing to look up, only firsthand knowledge gained painfully.

Link to the rest at The Atlantic

Last words have a unique standing in the legal world.

From The Legal Information Institute, Cornell Law School:

Definition

A dying declaration is a statement made by a declarant, who is unavailable to testify in court (typically because of the declarant’s death), who made the statement under a belief of certain or impending death. The statement must also relate to what the declarant believed to be the cause or circumstances of the declarant’s impending death.

Overview

An out-of-court statement is referred to as hearsay. A dying declaration is a type of hearsay. However, unlike regular hearsay, a dying declaration is admissible in court. As such, a dying declaration is an exception to the hearsay rule.

Other general rules of admissibility also apply, such as the requirement that the declaration must be based on the declarant’s actual knowledge.

Link to the rest at The Legal Information Institute

Needless to say, dying declarations are not commonly observed in most litigation. The theory behind the admissibility of such a statement is:

1) Of course, the person who made the statement is not available to come to court and testify in person; and

2) A person at the end of his/her life is unlikely to tell a lie because she/he is about to face judgment in the world beyond.

PG is unaware of any studies which have explored whether dying declarations are, in fact, accurate and true. The judge or jury trying a case in which a dying declaration is admitted may give such evidence the weight they believe it should have in coming to a final decision about the case.

