Writing Advice

Chekhov’s Gun: The Importance of Follow-Through in Fiction

6 October 2019

From Anne R. Allen’s Blog:

Anton Chekhov, the Russian playwright, also wrote short stories, essays and instructions for young writers. Probably his most famous writerly advice is this admonition:

“If in the first act you have hung a pistol on the wall, then in the following one it should be fired. Otherwise don’t put it there.”

In other words, remove everything that has no relevance to the story. If chapter one says your mild-mannered reporter heroine won a bunch of trophies for archery which she displays prominently alongside her handmade Mongolian horse longbow, she’d better darn well shoot an arrow before the story is done.

. . . .

Yeah, but what if that longbow is there to show us what her apartment looks like? It’s good to show her décor, because it gives an insight into her character, right?

It depends. Yes, we do want to use details to set tone and give depth to our characters. Ruth Harris told us all about that in her post on using details to create memorable characters.

But the key is how you stress those details when you first present them. If there’s a whole paragraph about those archery trophies, or the characters have a conversation about the Mongolian horse longbow, you gotta shoot some arrows. But if there’s just a cursory mention, “her apartment walls were decorated with an odd assortment of personal trophies and exotic weapons,” then you can leave them on the wall.

. . . .

Wait just a goldern minute, sez you. I write mysteries. Mysteries need to have irrelevant clues and red herrings. Otherwise the story will be over before chapter seven.

This is true. But mystery writers need to manage their red herrings. If the deceased met his demise via arrow, probably shot by a Mongolian horse longbow, then Missy Mild-Mannered Reporter is going to look like a very viable suspect to the local constabulary.

But of course she didn’t do it because she’s our hero, so the longbow and the trophies are red herrings.

But they still need to be “fired.” Maybe not like Chekhov’s gun, but they need to come back into the story and be reckoned with. Like maybe the real killer visited her apartment earlier when delivering pizza, then broke in to “borrow” the longbow in order to make Missy look like the murderous archer.

Link to the rest at Anne R. Allen’s Blog

The difference between creative and critical voice

3 October 2019

From Kristine Kathryn Rusch:

We spend a lot of time at workshops discussing the difference between creative voice and critical voice. When I teach, I really want to nurture the creative voice. But I do need to give guidance, and some of that sounds critical. Because we’ve all been through decades of schooling, we also hear the voice of the teacher as “the authority.” I do what I can to mitigate that, in that I don’t want my students to think my voice is the correct one, particularly when it comes to their vision.

I might have misunderstood their vision. I might not know exactly what they’re trying to do. Or, in some ways, worse, I might not like the kind of fiction they’re writing, and that seeps out in some of my comments.

So I’m constantly thinking about the difference between creative and critical voice during workshop weeks. I’m also monitoring myself, because if I’m not careful, teaching can make me too critical of myself.

I thought I had escaped that part. I love teaching romance. Even though it’s the hardest genre to write well, it’s also the happiest. A happily ever after ending is an essential part of the genre. And so, instead of looking to the worst of humanity, when we write romance, we write about the best.

. . . .

The next morning, I woke up with my own voice in my head, repeating a line from the Kickstarter script I wrote: Because I couldn’t help myself, I wrote two of the longer stories, and an entire novel.

And then I stomped around the condo, because I realized that—dang it—the critical voice had been there all along.

You see, I’ve been whining that 2019 isn’t as good a writing year as I want. I wanted to get to a big project that I’ve been looking forward to, but I need to finish a few things first. And I’m nowhere near finishing those things.

I also had to drop a lot of work because we had a crisis at WMG when Allyson Longueira had emergency brain surgery. We’ve had some tough years the past few years. I was the emergency in 2018; Allyson in 2019. I’m hoping that 2020 is much, much better.

So I lost writing time. A lot of it, as I took on other projects that needed finishing.

. . . .

But that sentence from the Kickstarter script—Because I couldn’t help myself, I wrote two of the longer stories, and an entire novel—kept coming up in my brain all day Friday. I noodled over that sentence.

Because I had been telling myself, severely and somewhat angrily, that I haven’t been doing enough. I haven’t been writing enough. Not enough new words.

Even though I did the two novellas and that novel from March to August, while doing other things. And I finished some big projects in January and February in prep for the even bigger project that I haven’t gotten to.

None of that counts the work I’ve done here, on this business blog, because that’s nonfiction, and I don’t count nonfiction. Just like I don’t count editing, because none of that is new words of fiction, which is all I do count.

And yes, I’ve had to take some time away, but Holy Carpal Tunnel, Batman, I have been doing a lot of fiction writing just the same. I had thought of it as things that either got in the way (some promised fiction for anthologies/other people’s projects), the stories for the Holiday Spectacular, and the novel that I had started, thinking it was a novella.

If I total my words, I’m down a bit from pre-2016 levels, but not much. And I’m better than I was in 2018 by a long shot.

So the critical voice, for me, had moved from what’s wrong with the fiction to what’s wrong with production. And it had been lashing me, hard, in ways that I would never allow an actual person to do.

Link to the rest at Kristine Kathryn Rusch

Here’s a link to Kris Rusch’s books. If you like the thoughts Kris shares, you can show your appreciation by checking out her books.

Novelist Cormac McCarthy’s tips on how to write a great science paper

28 September 2019

From Nature:

For the past two decades, Cormac McCarthy — whose ten novels include The Road, No Country for Old Men, and Blood Meridian — has provided extensive editing to numerous faculty members and postdocs at the Santa Fe Institute (SFI) in New Mexico. He has helped to edit works by scientists such as Harvard University’s first tenured female theoretical physicist, Lisa Randall, and physicist Geoffrey West, who authored the popular-science book Scale.

Van Savage, a theoretical biologist and ecologist, first met McCarthy in 2000, and they overlapped at the SFI for about four years while Savage was a graduate student and then a postdoc. Savage has received invaluable editing advice from McCarthy on several science papers published over the past 20 years. While on sabbatical at the SFI during the winter of 2018, Savage had lively weekly lunches with McCarthy. They worked to condense McCarthy’s advice to its most essential points so that it could be shared with everyone. These pieces of advice were combined with thoughts from evolutionary biologist Pamela Yeh and are presented here. McCarthy’s most important tip is to keep it simple while telling a coherent, compelling story. The following are more of McCarthy’s words of wisdom, as told by Savage and Yeh.

  • Use minimalism to achieve clarity. While you are writing, ask yourself: is it possible to preserve my original message without that punctuation mark, that word, that sentence, that paragraph or that section? Remove extra words or commas whenever you can.
  • Decide on your paper’s theme and two or three points you want every reader to remember. This theme and these points form the single thread that runs through your piece. The words, sentences, paragraphs and sections are the needlework that holds it together. If something isn’t needed to help the reader to understand the main theme, omit it.
  • Limit each paragraph to a single message. A single sentence can be a paragraph. Each paragraph should explore that message by first asking a question and then progressing to an idea, and sometimes to an answer. It’s also perfectly fine to raise questions in a paragraph and leave them unanswered.

. . . .

  • Don’t over-elaborate. Only use an adjective if it’s relevant. Your paper is not a dialogue with the readers’ potential questions, so don’t go overboard anticipating them. Don’t say the same thing in three different ways in any single section. Don’t say both ‘elucidate’ and ‘elaborate’. Just choose one, or you risk that your readers will give up.

    And don’t worry too much about readers who want to find a way to argue about every tangential point and list all possible qualifications for every statement. Just enjoy writing.

  • With regard to grammar, spoken language and common sense are generally better guides for a first draft than rule books. It’s more important to be understood than it is to form a grammatically perfect sentence.

Link to the rest at Nature

The Universe in a Sentence: On Aphorisms

18 September 2019

From The Millions:

“A fragment ought to be entirely isolated from the surrounding world like a little work of art and complete in itself like a hedgehog.”
Friedrich Schlegel, Athenaeum Fragments (1798)

“I dream of immense cosmologies, sagas, and epics all reduced to the dimensions of an epigram.”
Italo Calvino, Six Memos for the Next Millennium (1988)

From its first capital letter to the final period, an aphorism is not a string of words but rather a manifesto, a treatise, a monograph, a jeremiad, a sermon, a disputation, a symposium. An aphorism is not a sentence, but rather a microcosm unto itself; an entrance through which a reader may walk into a room the dimensions of which even the author may not know. Our most economic and poetic of prose forms, the aphorism does not feign argumentative completism like the philosophical tome, nor does it compel certainty as does the commandment—the form is cagey, playful, and mysterious. To either find an aphorism in the wild, or to peruse examples in a collection that mounts them like butterflies nimbly held in place with push-pin on Styrofoam, is to have a literary-naturalist’s eye for the remarkable, for the marvelous, for the wondrous. And yet there has been, at least until recently, a strange critical lacuna as concerns aphoristic significance. Scholar Gary Morson writes in The Long and Short of It: From Aphorism to Novel that though they “constitute the shortest [of] literary genres, they rarely attract serious study. Universities give courses on the novel, epic, and lyric…But I know of no course on…proverbs, wise sayings, witticisms and maxims.”

An example of literary malpractice, for to consider an aphorism is to imbibe the purest distillation of a mind contemplating itself. In an aphorism every letter and word counts; every comma and semicolon is an invitation for the reader to discover the sacred contours of her own thought. Perhaps answering Morson’s observation, critic Andrew Hui writes in his new study A Theory of the Aphorism: From Confucius to Twitter that the form is “Opposed to the babble of the foolish, the redundancy of bureaucrats, the silence of mystics, in the aphorism nothing is superfluous, every word bears weight.” An aphorism isn’t a sentence—it’s an earthquake captured in a bottle. It isn’t merely a proverb, a quotation, an epigraph, or an epitaph; it’s fire and lightning circumscribed by the rules of syntax and grammar, where rhetoric itself becomes the very stuff of thought. “An aphorism,” Friedrich Nietzsche aphoristically wrote, “is an audacity.”

. . . .

[A]phorism is rife in the pre-Socratic philosophy that remains, from Heraclitus’s celebrated observation that “You can’t step into the same river twice” to Parmenides’s exactly opposite contention that “It is indifferent to me where I am to begin, for there shall I return again.” Thus is identified one of the most difficult qualities of the form—that it’s possible to say conflicting things and that by virtue of how you say them you’ll still sound wise. A dangerous form, the aphorism, for it can confuse rhetoric for knowledge. Yet perhaps that’s too limiting a perspective, and maybe it’s better to think of the chain of aphorisms as a great and confusing conversation; a game in which both truth and its opposite can still be true.

Link to the rest at The Millions

PG did some quick hunting for aphorisms and discovered the following:

  • There are some secrets which do not permit themselves to be told.
    Edgar Allan Poe
  • Who would venture upon the journey of life, if compelled to begin it at the end?
Françoise d’Aubigné, Marquise de Maintenon
  • There are no solved problems; there are only problems that are more or less solved.
Jules Henri Poincaré
Life isn’t hard to manage when you’ve nothing to lose.
    Ernest Hemingway
  • It takes a woman twenty years to make a man of her son, and another woman twenty minutes to make a fool of him.
    Helen Rowland
  • In school, every period ends with a bell. Every sentence ends with a period. Every crime ends with a sentence.
    Steven Wright


Our Brains Tell Stories so We Can Live

9 August 2019

From Nautilus:

We are all storytellers; we make sense out of the world by telling stories. And science is a great source of stories. Not so, you might argue. Science is an objective collection and interpretation of data. I completely agree. At the level of the study of purely physical phenomena, science is the only reliable method for establishing the facts of the world.

But when we use data of the physical world to explain phenomena that cannot be reduced to physical facts, or when we extend incomplete data to draw general conclusions, we are telling stories. Knowing the atomic weight of carbon and oxygen cannot tell us what life is. There are no naked facts that completely explain why animals sacrifice themselves for the good of their kin, why we fall in love, the meaning and purpose of existence, or why we kill each other.

Science is not at fault. On the contrary, science can save us from false stories. It is an irreplaceable means of understanding our world. But despite the verities of science, many of our most important questions compel us to tell stories that venture beyond the facts. For all of the sophisticated methodologies in science, we have not moved beyond the story as the primary way that we make sense of our lives.

To see where science and story meet, let’s take a look at how story is created in the brain. Let’s begin with an utterly simple example of a story, offered by E. M. Forster in his classic book on writing, Aspects of the Novel: “The king died and then the queen died.” It is nearly impossible to read this juxtaposition of events without wondering why the queen died. Even with a minimum of description, the construction of the sentence makes us guess at a pattern. Why would the author mention both events in the same sentence if he didn’t mean to imply a causal relationship?

Once a relationship has been suggested, we feel obliged to come up with an explanation. This makes us turn to what we know, to our storehouse of facts. It is general knowledge that a spouse can die of grief. Did the queen then die of heartbreak? This possibility draws on the science of human behavior, which competes with other, more traditional narratives. A high school student who has been studying Hamlet, for instance, might read the story as a microsynopsis of the play.

. . . .

The pleasurable feeling that our explanation is the right one—ranging from a modest sense of familiarity to the powerful and sublime “a-ha!”—is meted out by the same reward system in the brain integral to drug, alcohol, and gambling addictions. The reward system extends from the limbic area of the brain, vital to the expression of emotion, to the prefrontal cortex, critical to executive thought. Though still imperfectly understood, it is generally thought that the reward system plays a central role in the promotion and reinforcement of learning. Key to the system, and found primarily within its brain cells, is dopamine, a neurotransmitter that carries and modulates signals among brain cells. Studies consistently show that feeling rewarded is accompanied by a rise in dopamine levels.

. . . .

Critical to understanding how stories spark the brain’s reward system is the theory known as pattern recognition—the brain’s way of piecing together a number of separate components of an image into a coherent picture. The first time you see a lion, for instance, you have to figure out what you’re seeing. At least 30 separate areas of the brain’s visual cortex pitch in, each processing an aspect of the overall image—from the detection of motion and edges, to the register of color and facial features. Collectively they form an overall image of a lion.

Each subsequent exposure to a lion enhances your neural circuitry; the connections among processing regions become more robust and efficient. (This theory, based on the research of Canadian psychologist Donald O. Hebb, a pioneer in studying how people learn, is often stated as “cells that fire together wire together.”) Soon, less input is necessary to recognize the lion. A fleeting glimpse of a partial picture is sufficient for recognition, which occurs via positive feedback from your reward system. Yes, you are assured by your brain, that is a lion.

. . . .

Science is in the business of making up stories called hypotheses and testing them, then trying its best to make up better ones. Thought-experiments can be compared to storytelling exercises using well-known characters. What would Sherlock Holmes do if he found a body suspended in a tree with a note strapped to its ankle? What would a light ray being bounced between two mirrors look like to an observer sitting on a train? Once done with their story, scientists go to the lab to test it; writers call editors to see if they will buy it.

People and science are like bread and butter. We are hardwired to need stories; science has storytelling buried deep in its nature. But there is also a problem. We can get our dopamine reward, and walk away with a story in hand, before science has finished testing it. This problem is exacerbated by the fact that the brain, hungry for its pattern-matching dopamine reward, overlooks contradictory or conflicting information whenever possible. A fundamental prerequisite for pattern recognition is the ability to quickly distinguish between similar but not identical inputs. Not being able to pigeonhole an event or idea makes it much more difficult for the brain to label and store it as a discrete memory. Neat and tidy promotes learning; loose ends lead to the “yes, but” of indecision and inability to draw a precise conclusion.

Link to the rest at Nautilus

Why You Should Have (at Least) Two Careers

8 August 2019

From The Harvard Business Review:

It’s not uncommon to meet a lawyer who’d like to work in renewable energy, or an app developer who’d like to write a novel, or an editor who fantasizes about becoming a landscape designer. Maybe you also dream about switching to a career that’s drastically different from your current job. But in my experience, it’s rare for such people to actually make the leap. The costs of switching seem too high, and the possibility of success seems too remote.

But the answer isn’t to plug away in your current job, unfulfilled and slowly burning out. I think the answer is to do both. Two careers are better than one. And by committing to two careers, you will produce benefits for both.

In my case, I have four vocations: I’m a corporate strategist at a Fortune 500 company, US Navy Reserve officer, author of several books, and record producer. The two questions that people ask me most frequently are “How much do you sleep?” and “How do you find time to do it all?” (my answers: “plenty” and “I make the time”). Yet these “process” questions don’t get to the heart of my reasons and motivations. Instead, a more revealing query would be, “Why do you have multiple careers?” Quite simply, working many jobs makes me happier and leaves me more fulfilled. It also helps me perform better at each job. Here’s how.

. . . .

My corporate job paycheck subsidizes my record producing career. With no track record as a producer, nobody was going to pay me to produce his or her music, and it wasn’t money that motivated me to become a producer in the first place — it was my passion for jazz and classical music. Therefore, I volunteered so that I could gain experience in this new industry. My day job not only afforded me the capital to make albums, but it taught me the skills to succeed as a producer. A good producer should be someone who knows how to create a vision, recruit personnel, establish a timeline, raise money, and deliver products.

. . . .

[A]t the same time, I typically invite my corporate clients to recording sessions. For someone who works at an office all day, it’s exciting to go “behind-the-scenes” and interact with singers, musicians, and other creative professionals.

. . . .

[O]ne of my clients wanted to understand what Chinese citizens were saying to each other. Because I am an author, I have gotten to know other writers, so I reached out to my friend who was a journalist at a periodical that monitors chatter in China. Not restricted by the compliance department of a bank, he was able to give an unbridled perspective to my client, who was most appreciative.

Link to the rest at The Harvard Business Review

PG didn’t notice this when it first appeared in 2017, but thinks it may be interesting for many indie authors.

The Creative Compulsions of OCD

7 August 2019

From The Paris Review:

Here is my morning routine: when I get out of bed, my feet must touch the edge of the rug, one at a time, while I softly vocalize two magic words that are best described as puffing and plosive sounds. If my feet don’t touch correctly, or if I don’t say the words right, I get back in bed and try again. Once I have properly performed this initial procedure, I again tap my left foot on the carpet while vocalizing the first magic word, and then—while holding my breath and without moving my mouth or tongue one millimeter during the duration—I silently incant a phrase that is far too nonsensical and embarrassing to share publicly, then tap my right foot while vocalizing the second magic word.

This can take anywhere from ten seconds, if I’m lucky, to two or three minutes. Once executed to my satisfaction, I am able to go downstairs, unplug my phone and perform roughly the same procedure on it, with my thumbs instead of my feet, and then I am allowed to use my phone. Likewise, the refrigerator door when I’m making coffee. Likewise, the edges of my laptop when I power it on. With these routines completed, I can start my day, open a Word document, and begin writing.

I realize this sounds bad, but it’s a compromise I’ve reached after decades of managing my obsessive-compulsive disorder. I’ve gone cold turkey before, renouncing all habits and tics, but they eventually creep back in. A therapist once described OCD behaviors as a “blob,” which felt apt—whatever part of it you press down on, another part bulges back up. These little routines are, in a sense, a deal I make with myself, so I don’t have to perform random routines all day long. Not doing them is not an option. If I don’t do them, the world will end.

I can’t remember exactly when it began. As is true of so many disorders, medical literature generally links OCD with the onset of adolescence, and this tracks with my earliest OCD memories: missing the bus to middle school because I had to touch mailboxes and the curb in a certain sequence; playing songs on my cassette deck over and over in order to pause on an exact word or chord; staying up in my teenage basement lair, flicking the lights on and off in patterns that, if my parents had noticed, would have looked like some Morse code call for help, which in a way, it was.

The most vivid memory I have from this era is typing out the final draft of an English paper over and over. I’d written and revised it first by longhand in a notebook, in anticipation of the Sisyphean task to come. The rule I’d set for myself—or rather, the rule that had mercilessly evolved over the course of the school year—was that if I made one error typing, I had to erase the whole thing and start over. When I typoed in the final paragraph, I deleted all four pages and took a break to cry. I finally pecked the last period when the sun was coming up.

. . . .

One of my earliest memories is sitting on the carpeted stairs of our home, suddenly aware of my heartbeat. I was perhaps four, and I was convinced that I was having a heart attack. I listened to it thudding away in my chest and expected to die with each little rush. When I was eight or nine, I became terrified by and obsessed with a Time Magazine article about something called AIDS.

. . . .

The vague motivating threat for not touching a curb or counting to a certain number was always an imaginary illness—ironic given the very real illness I actually had: although it took me decades to realize it, by thirteen, I was suffering from a full-blown mental health crisis.

. . . .

Controlling a sentence—controlling this sentence, as I type—is for me the best, most pleasurable work there is. I build the paragraph, tagged by its thematic first word: control. In crafting this sentence, this paragraph, this essay, I get to be both architect and construction worker, and both jobs offer equally pleasing aspects of control. The former involves creative design and abstract thought; the latter brings the visceral, simultaneously logical and intuitive pleasure of finding the right word, moving it around, putting it in just the right place. Having written that sentence, I know I must reverse myself and concede that the idea of there being “just the right place” is illusory—that even this work is, in its essence, as arbitrary as anything else. This is true, but nonetheless as I write, I shut out the world, other responsibilities, Twitter, the news, everything.

Link to the rest at The Paris Review

An Author Heads to the Stage

3 August 2019

From Publishing Perspectives:

As I bundled up my 225-page memoir manuscript and mailed it to editor Jane Rosenman, I hoped she would reveal the magic formula for transforming my pages into a book. I’d received glowing rejections but still no takers for my story, The Inheritance, about how, six weeks after my mother died, I discovered that she had disinherited me, and my quest to understand why.

Although Rosenman found much to praise, some aspects of my story still weren’t working, including a whiff of bitterness on the page. Yet who wouldn’t be bitter after being blindsided from beyond the grave? But the problem with bitterness, I later discovered, is that it lacks drama.

As I was revising the manuscript, I received an invitation to perform a 10-minute story with Portland Story Theater in Oregon, where I live. When I walked onto the stage, into the pressure cooker of live performance, something happened: my bitterness transformed into humor, and I discovered a liveliness and emotional depth that had not been as evident on the page.

Was I onto something that could help me crack open my story? To find out, I enrolled in a solo performance class with Seth Barrish at New York City’s Barrow Group Theatre, whom I then hired to help me craft a performance of my story. With script in hand, I secured a director—Lauren Bloom Hanover—and performed the 50-minute, one-person show, retitled Firstborn, at Performance Works Northwest in Portland, as part of the Fertile Ground Festival. My minitour culminated with my off-Broadway performance at the United Solo Theatre Festival last October, where Jane was in the audience.

. . . .

By telling my story on stage, I found not only its through line but also its beating heart. Writing for performance also gave me more to work with than just the words. Now I had my body, voice, lighting, and music, plus props and images. Also, I could take shortcuts: a transition could be made with a turn of my body or a look to the audience. As Jane said when I spoke with her afterward, the demands of performance helped me get to the “nub of the story.”

Link to the rest at Publishing Perspectives

