Conspiracy Theories

Given the nature of the post that will appear immediately adjacent to this one – “The Silurian Hypothesis” – PG discovered that Wikipedia has a page devoted to Conspiracy Theories that could surely contain some of the best writing prompts ever for authors writing in particular genres:

Aviation

Numerous conspiracy theories pertain to air travel and aircraft. Incidents such as the 1955 bombing of the Kashmir Princess, the 1985 Arrow Air Flight 1285 crash, the 1986 Mozambican Tupolev Tu-134 crash, the 1987 Helderberg Disaster, the 1988 bombing of Pan Am Flight 103 and the 1994 Mull of Kintyre helicopter crash as well as various aircraft technologies and alleged sightings, have all spawned theories of foul play which deviate from official verdicts.[3]

Black helicopters

This conspiracy theory emerged in the U.S. in the 1960s. It was originally promoted by the John Birch Society, which asserted that a United Nations force would soon arrive in black helicopters to bring the U.S. under UN control.[4] The theory re-emerged in the 1990s, during the presidency of Bill Clinton, and has been promoted by talk show host Glenn Beck.[5][6] A similar theory concerning so-called “phantom helicopters” appeared in the UK in the 1970s.[7]

Chemtrails

A high-flying jet’s engines leaving a condensation trail (contrail)

Also known as SLAP (Secret Large-scale Atmospheric Program), this theory alleges that water condensation trails (“contrails”) from aircraft consist of chemical or biological agents, or contain a supposedly toxic mix of aluminum, strontium, and barium,[8] under secret government policies. An estimated 17% of people globally believe the theory to be true or partly true. In 2016, the Carnegie Institution for Science published the first-ever peer-reviewed study of the chemtrail theory; 76 out of 77 participating atmospheric chemists and geochemists stated that they had seen no evidence to support the chemtrail theory, or stated that chemtrail theorists rely on poor sampling.[9][10]

Korean Air Lines Flight 007

The destruction of Korean Air Lines Flight 007 by Soviet jets in 1983 has long drawn the interest of conspiracy theorists. The theories range from allegations of a planned espionage mission, to a US government cover-up, to the consumption of the passengers’ remains by giant crabs.

Malaysia Airlines Flight MH370

The disappearance of Malaysia Airlines Flight 370 in Southeast Asia in March 2014 has prompted many theories. One theory suggests that the plane was hidden away and reintroduced as Flight MH17 later the same year in order to be shot down over Ukraine for political purposes. Prolific American conspiracy theorist James H. Fetzer has placed responsibility for the disappearance with Israeli Prime Minister Benjamin Netanyahu.[11] Other theories involve allegations that a certain autopilot technology was secretly fitted to the aircraft.[12]

Malaysia Airlines Flight MH17

Malaysia Airlines Flight 17 was shot down over Ukraine in July 2014. This event has spawned numerous alternative theories. These variously include allegations that it was secretly Flight MH370, that the plane was actually shot down by the Ukrainian Air Force to frame Russia, that it was part of a conspiracy to conceal the “truth” about HIV (seven disease specialists were on board), or that the Illuminati or Israel was responsible.[11][13]

. . . .

Espionage

Israeli animal spying

Conspiracy theories exist alleging that Israel uses animals to conduct espionage or to attack people. These are often associated with conspiracy theories about Zionism. Matters of interest to theorists include a series of shark attacks in Egypt in 2010, Hezbollah’s accusations of the use of “spying” eagles,[73] and the 2011 capture of a griffon vulture carrying an Israeli-labeled satellite tracking device.[74]

Harold Wilson

Numerous persons, including former MI5 officer Peter Wright and Soviet defector Anatoliy Golitsyn, have alleged that British Prime Minister Harold Wilson was secretly a KGB spy. Historian Christopher Andrew has lamented that a number of people have been “seduced by Golitsyn’s fantasies”.[75][76][77]

Malala Yousafzai

Conspiracy theories concerning Malala Yousafzai are widespread in Pakistan, elements of which originate from a 2013 satirical piece in Dawn. These theories variously allege that she is a Western spy, or that her attempted murder by the Taliban in 2012 was a secret operation to further discredit the Taliban, and was organized by her father and the CIA and carried out by actor Robert de Niro disguised as an Uzbek homeopath.[78][79][80][81]

Link to the rest at List of Conspiracy Theories – Wikipedia

The Silurian Hypothesis

From The Paris Review:

When I was eleven, we lived in an English Tudor on Bluff Road in Glencoe, Illinois. One day, three strange men (two young, one old) knocked on the door. Their last name was Frank. They said they’d lived in this house before us, not for weeks but decades. For twenty years, this had been their house. They’d grown up here. Though I knew the house was old, it never occurred to me until then that someone else had lived in these rooms, that even my own room was not entirely my own. The youngest of the men, whose room would become mine, showed me the place on a brick wall hidden by ivy where he’d carved his name. “Bobby Frank, 1972.” It had been there all along. And I never even knew it.

That is the condition of the human race: we have woken to life with no idea how we got here, where that is or what happened before. Nor do we think much about it. Not because we are incurious, but because we do not know how much we don’t know.

What is a conspiracy?

It’s a truth that’s been kept from us. It can be a secret but it can also be the answer to a question we’ve not yet asked.

Modern humans have been around for about 200,000 years, but life has existed on this planet for 3.5 billion. That leaves 3,499,800,000 pre-human years unaccounted for—more than enough time for the rise and fall of not one but several pre-human industrial civilizations. Same screen, different show. Same field, different team. An alien race with alien technology, alien vehicles, alien folklore, and alien fears, beneath the familiar sky. There’d be no evidence of such bygone civilizations, built objects and industry lasting no more than a few hundred thousand years. After a few million, with plate tectonics at work, what is on the surface, including the earth itself, will be at the bottom of the sea and the bottom will have become the mountain peaks. The oldest place on the earth’s surface—a stretch of Israel’s Negev Desert—is just over a million years old, nothing on a geological clock.

The result of this is one of my favorite conspiracy theories, though it’s not a conspiracy in the conventional sense, a conspiracy usually being a secret kept by a nefarious elite. In this case, the secret, which belongs to the earth itself, has been kept from all of humanity, which believes it has done the only real thinking and the only real building on this planet, as it once believed the earth was at the center of the universe.

Called the Silurian Hypothesis, the theory was proposed in 2018 by Gavin Schmidt, a climate modeler at NASA’s Goddard Institute, and Adam Frank, an astrophysicist at the University of Rochester. Schmidt had been studying distant planets for hints of climate change, “hyperthermals,” the sort of quick temperature rises that might indicate the moment a civilization industrialized. It would suggest the presence of a species advanced enough to turn on the lights. Such a jump, perhaps resulting from a release of carbon, might be the only evidence that any race, including our own, will leave behind. Not the pyramids, not the skyscrapers, not Styrofoam, not Shakespeare—in the end, we will be known only by a change in the rock that marked the start of the Anthropocene.

Link to the rest at The Paris Review

William Gibson Builds A Bomb

From National Public Radio:

William Gibson does not write novels, he makes bombs.

Careful, meticulous, clockwork explosives on long timers. Their first lines are their cores — dangerous, unstable reactant mass so packed with story-specific detail that every word seems carved out of TNT. The lines that follow are loops of brittle wire wrapped around them.

Once, he made bombs that exploded. Upended genre and convention, exploded expectations. The early ones were messy and violent and lit such gorgeous fires. Now, though, he does something different. Somewhere a couple decades ago, he hit on a plot architecture that worked for him — this weird kind of thing that is all build-up and no boom — and he has stuck to it ever since. Now, William Gibson makes bombs that don’t explode. Bombs that are art objects. Not inert. Still goddamn dangerous. But contained.

You can hear them tick. You don’t even have to listen that close. His language (half Appalachian economy, half leather-jacket poet of neon and decay) is all about friction and the gray spaces where disparate ideas intersect. His game is living in those spaces, checking out the view, telling us about it.

Agency, that’s his newest. It’s a prequel/sequel (requel?) to his last book, The Peripheral, which dealt, concurrently, with a medium-future London after a slow-motion apocalypse called “The Jackpot,” and a near-future now where a bunch of American war veterans, grifters, video game playtesters and a friendly robot were trying to stop an even worse future from occurring. It was a time travel story, but done in a way that only Gibson could: Almost believably, in a way that hewed harshly to its own internal logic, and felt both hopeful and catastrophic at the same time.

Link to the rest at National Public Radio

Ten Things You (Probably) Didn’t Know About C.S. Lewis

From The Millions:

C.S. Lewis gained acclaim as a children’s author for his classic series The Chronicles of Narnia. He also gained acclaim for his popular apologetics, including such works as Mere Christianity and The Screwtape Letters. What is more, he gained acclaim as a science fiction writer for his Ransom Trilogy. Furthermore, he gained acclaim for his scholarly work in Medieval and Renaissance literature with The Allegory of Love and A Preface to Paradise Lost. Many writers have their fleeting moment of fame before their books become yesterday’s child—all the rage and then has-been. Remarkably, Lewis’s books in all of these areas have remained in print for 70, 80, and 90 years. Over the years, the print runs have grown.

. . . .

1. Lewis was not English. He was Irish. Because of his long association with Oxford University, and later with Cambridge, many people assume he was English. When he first went to school in England as a boy, he had a strong Irish accent. Both the students and the headmaster made fun of young Lewis, and he hated the English in turn. It would be many years before he overcame his prejudice against the English.

. . . .

4. Lewis gave away the royalties from his books. Though he had only a modest salary as a tutor at Magdalen College, Lewis set up a charitable trust to give away whatever money he received from his books. Having given away his royalties when he first began this practice, he was startled to learn that the government still expected him to pay taxes on the money he had earned!

5. Lewis never expected to make any money from his books. He was sure they would all be out of print by the time he died. He advised one of his innumerable correspondents that a first edition of The Screwtape Letters would not be worth anything since it would be a used book, and recommended not paying more than half the original price. First editions now sell for over $1,200.

6. Lewis was instrumental in Tolkien’s writing of The Lord of the Rings. Soon after they became friends in the 1920s, J. R. R. Tolkien began showing Lewis snatches of a massive myth he was creating about Middle-earth. When he finally began writing his “new Hobbit” that became The Lord of the Rings, he suffered from bouts of writer’s block that could last for several years at a time. Lewis provided the encouragement and the prodding that Tolkien needed to get through these dry spells.

. . . .

10. Lewis was very athletic. Even though he hated team sports throughout his life, Lewis was addicted to vigorous exercise. He loved to take 10-, 15-, and 20-mile rapid tromps across the countryside, but especially over rugged hills and mountains. He loved to ride a bicycle all over Oxfordshire. He loved to swim in cold streams and ponds. He loved to row a boat. He kept up a vigorous regimen until World War II interrupted his life with all of the new duties and obligations he accepted to do his bit for the war effort.

Link to the rest at The Millions

Sci-Fi Set in the 2020s Predicted a Dim Decade for Humanity

From BookBub:

Science fiction has always had a complicated relationship with the future: sci-fi is all about looking forward to the wondrous things that mankind will achieve — Flying cars! Personal jetpacks! Venusian vacations! — but a bright and happy future is kind of…boring. Even when you imagine a post-scarcity future like the one in Star Trek, you have to throw in a bit of nuclear holocaust and the Neutral Zone to spice things up.

Now that we’re firmly entrenched in the 21st century (which for a long time was shorthand for ‘the future’ in sci-fi), it’s fascinating to look at all the stories set in this particular decade to see how past SF masters thought things were going to go. One thing is abundantly clear: No matter how bad you think the decade is going to be, sci-fi writers think the 2020s are going to be worse.

. . . .

The horrifying, dystopian, and extinction-level scenarios imagined in sci-fi set in the 2020s are impressive. There’s the quiet desperation depicted in The Children of Men—the critically acclaimed, influential, and unexpected sci-fi novel from master crime writer P.D. James—which imagined the last human children being born in 1995, leading to a world swamped by apathy and suicide in 2021. On the other end of the spectrum, you have a 2020 like the one in the film Reign of Fire, where we’re all battling literal dragons in the ashen remnants of society.

. . . .

In-between, you have just about every kind of misery. In Stephen King’s prescient novel The Running Man, 2025 finds the United States on the brink of economic collapse, with desperate citizens driven to appear on deadly reality-TV shows. (Although maybe it doesn’t matter since Ray Bradbury’s classic short story There Will Come Soft Rains tells us that by 2026, the world will be a nuclear blast zone anyway.) The Running Man is one of King’s most underrated novels, weaving themes of economic inequality decades before the issue was mainstream.

. . . .

[A]pocalypse and dystopia are just more fun. What would you rather be doing, flying around the world with a jetpack because everyone is rich and healthy? Or hunting down replicants in a Blade Runner version of Los Angeles that resembles… well, today’s actual Los Angeles if we’re being honest? Here’s another take: Which is more interesting, going to your job every day in a stable if imperfect society? Or firing up the artillery and battling real, actual dragons? The latter, obviously, which is why sci-fi always goes to the dragons, the evil AIs, and violently sociopathic clones, usually accompanied by a society that’s so far gone that no one bothers with things like jobs anymore.

Link to the rest at BookBub

PG is trying out an Amazon embed function to see how it works (or doesn’t) for visitors to THE PASSIVE VOICE.

Does AI judge your personality?

Perhaps a writing prompt. Among many other subjects, PG has always been fascinated by books and stories about AI, but this one generates a bit less optimism.

From ArchyW:

AirBnB wants to know if you have a “Machiavellian” personality before you rent a house on the beach.

The company may be using software to judge if you are reliable enough to rent a house based on what you post on Facebook, Twitter and Instagram.

They will turn the systems loose on social networks, run the algorithms, and get results. For people at the other end, there will be no transparency, no knowledge of what is happening, and no appeals process.

The company holds a patent on technology designed to rate the “personalities” of potential guests by analyzing their activity on social networks, in order to decide whether they are risky guests who might damage a host’s house.

The end product of the technology is to assign each AirBnB guest a “reliability score.” According to reports, this will be based not only on social media activity, but also on other data found online, including blog posts and legal records.

The technology was developed by Trooly, which AirBnB acquired three years ago. Trooly created an artificial intelligence tool designed to “predict reliable relationships and interactions,” using social networks as a data source.

The software builds the score from perceived “personality traits” it identifies, including some you could predict – conscientiousness, openness, extraversion, agreeableness – and some stranger ones – “narcissism” and “Machiavellianism,” for example. (Interestingly, the software also looks for involvement in civil litigation, suggesting that now or in the future the company could ban people based on a prediction that they are more likely to sue.)
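
To make the mechanics concrete, here is a minimal sketch of how a trait-weighted “reliability score” might be computed, assuming the system first turns someone’s posts into per-trait estimates. The trait names follow the Big Five and “dark triad” traits mentioned above; the weights, the rescaling, the function names, and the example numbers are all invented for illustration — neither Trooly nor AirBnB has published an actual model.

```python
# Hypothetical sketch of a trait-weighted "reliability score" of the kind
# described above. All weights and numbers are invented for illustration;
# the real Trooly/AirBnB model, if one is in use, has never been published.

TRAIT_WEIGHTS = {
    "conscientiousness": 0.30,   # "predictable" traits raise the score
    "openness": 0.10,
    "extraversion": 0.05,
    "agreeableness": 0.25,
    "narcissism": -0.35,         # "dark triad" traits lower it
    "machiavellianism": -0.45,
}

def reliability_score(traits):
    """Combine per-trait estimates (each in [0, 1]) into a score in [0, 1]."""
    raw = sum(w * traits.get(name, 0.0) for name, w in TRAIT_WEIGHTS.items())
    # Rescale from the weighted sum's possible range into [0, 1].
    lowest = sum(w for w in TRAIT_WEIGHTS.values() if w < 0)
    highest = sum(w for w in TRAIT_WEIGHTS.values() if w > 0)
    return (raw - lowest) / (highest - lowest)

guest = {"conscientiousness": 0.8, "agreeableness": 0.7, "machiavellianism": 0.6}
print(round(reliability_score(guest), 2))  # 0.63
```

The point of the sketch is how little machinery is involved: once per-trait numbers exist, the score is just a weighted sum, so every data-quality problem raised below flows straight through to the output.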

AirBnB has not said whether they use the software or not.

If you are surprised, shocked or unhappy by this news, then you are like most people, who are unaware of the enormous and rapidly growing practice of judging people (customers, citizens, employees and students) with AI applied to social media.

AirBnB is not the only organization that scans social networks to judge personality or predict behavior. Others include the Department of Homeland Security, employers, school districts, police departments, the CIA and insurance companies.

Some estimates say that up to half of all university admissions officers use AI-based social media monitoring tools as part of the candidate selection process.

Human resources departments and hiring managers also increasingly use AI-based social monitoring before making hires.

. . . .

There is only one problem.

AI-based social media monitoring is not that smart

. . . .

The question is not whether the AI applied to data collection works. It surely does. The question is whether social networks reveal truths about users. I am questioning the quality of the data.

For example, scanning someone’s Instagram account may “reveal” that they are fabulously rich, traveling the world enjoying champagne and caviar. The truth may be that they are broke, stressed-out influencers who trade social exposure for hotel rooms and meals in restaurants, where they take highly manipulated photos created exclusively to build a reputation. Some people use social networks to deliberately create a false image of themselves.

A Twitter account can show a user as a prominent, constructive and productive member of society, but a second anonymous account unknown to social media monitoring systems would reveal that person as a sociopathic troll who just wants to watch the world burn. People have multiple social media accounts for different aspects of their personalities. And some of them are anonymous.

. . . .

For example, using profanity online can reduce a person’s reliability score, based on the assumption that rude language indicates a lack of ethics or morality. But recent research suggests the opposite: potty-mouthed people may, on average, be more trustworthy, as well as more intelligent, more honest and more capable, professionally. Do we trust that Silicon Valley software companies know or care about the subtleties and complexities of human personality?

. . . .

There is also a generational divide. Younger people are statistically less likely to post publicly, preferring private messaging and social interaction in small groups. Is AI-based social media monitoring fundamentally ageist?

Women are more likely than men to post personal information (information about themselves) on social networks, while men are more likely than women to post impersonal information. Posting about personal matters can be more revealing of personality. Is AI-based social media monitoring fundamentally sexist?

Link to the rest at ArchyW

How William Gibson Keeps His Science Fiction Real

From The New Yorker:

Suppose you’ve been asked to write a science-fiction story. You might start by contemplating the future. You could research anticipated developments in science, technology, and society and ask how they will play out. Telepresence, mind-uploading, an aging population: an elderly couple live far from their daughter and grandchildren; one day, the pair knock on her door as robots. They’ve uploaded their minds to a cloud-based data bank and can now visit telepresently, forever. A philosophical question arises: What is a family when it never ends? A story flowers where prospective trends meet.

This method is quite common in science fiction. It’s not the one employed by William Gibson, the writer who, for four decades, has imagined the near future more convincingly than anyone else. Gibson doesn’t have a name for his method; he knows only that it isn’t about prediction. It proceeds, instead, from a deep engagement with the present. When Gibson was starting to write, in the late nineteen-seventies, he watched kids playing games in video arcades and noticed how they ducked and twisted, as though they were on the other side of the screen. The Sony Walkman had just been introduced, so he bought one; he lived in Vancouver, and when he explored the city at night, listening to Joy Division, he felt as though the music were being transmitted directly into his brain, where it could merge with his perceptions of skyscrapers and slums. His wife, Deborah, was a graduate student in linguistics who taught E.S.L. He listened to her young Japanese students talk about Vancouver as though it were a backwater; Tokyo must really be something, he thought. He remembered a weeping ambulance driver in a bar, saying, “She flatlined.” On a legal pad, Gibson tried inventing words to describe the space behind the screen; he crossed out “infospace” and “dataspace” before coming up with “cyberspace.” He didn’t know what it might be, but it sounded cool, like something a person might explore even though it was dangerous.

Gibson first used the word “cyberspace” in 1981, in a short story called “Burning Chrome.” He worked out the idea more fully in his first novel, “Neuromancer,” published in 1984, when he was thirty-six. Set in the mid-twenty-first century, “Neuromancer” follows a heist that unfolds partly in physical space and partly in “the matrix”—an online realm. “The matrix has its roots in primitive arcade games,” the novel explains, “in early graphics programs and military experimentation with cranial jacks.” By “jacking in” to the matrix, a “console cowboy” can use his “deck” to enter a new world:

Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation. . . . A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding.

. . . .

Most science fiction takes place in a world in which “the future” has definitively arrived; the locomotive filmed by the Lumière brothers has finally burst through the screen. But in “Neuromancer” there was only a continuous arrival—an ongoing, alarming present. “Things aren’t different. Things are things,” an A.I. reports, after achieving a new level of consciousness. “You can’t let the little pricks generation-gap you,” one protagonist tells another, after an unnerving encounter with a teen-ager. In its uncertain sense of temporality—are we living in the future, or not?—“Neuromancer” was science fiction for the modern age. The novel’s influence has increased with time, establishing Gibson as an authority on the world to come.

Link to the rest at The New Yorker