The Future Of Prompt Engineering


From Paul DelSignore:

Understanding how to write a good prompt will help you get the output you are looking for.

While there are some good UI tools that can write prompts for you, the ability to change, fine-tune and craft your own prompts is a skill that will serve you well. There’s even a term for that skill: it’s sometimes referred to as “prompt crafting” or “prompt engineering.”

Of course it’s entirely possible to get some amazing results without following any guidelines at all. I’ve seen some beautiful images rendered from just a simple word or phrase. However, if you want consistency and the ability to improve your output, you will need to learn how AI responds to language patterns.

The AI artists I follow on community forums and Discord channels have mastered this skill, and studying how they write their prompts has helped me write better prompts myself.

What I would like to do in this article is show you the thought process I use when I am writing a prompt. I am also keeping this agnostic to any specific AI art tool: while the syntax might differ between tools, the writing approach is largely the same. For the examples below, I will be showing art generated with Midjourney.

. . . .

Crafting Your Prompt

I like to think of the anatomy of the prompt in four distinct groupings, in a specific order (note that the order affects how the AI prioritizes the output).

  1. Content type
  2. Description
  3. Style
  4. Composition

Let’s take a look at each of them in the process of writing out a prompt.

1. Content type

When you approach creating a piece of artwork, the first thing to think about is the type of artwork you want to achieve: is it a photograph, drawing, sketch, or 3D render?

So the prompt would start with…

A photograph of...

Link to the rest at Paul DelSignore on Medium
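(As a quick illustration of that four-part ordering, a prompt might read something like: A photograph of an old lighthouse on a rocky coastline, in the style of Ansel Adams, wide-angle shot with the horizon low in the frame. The first phrase is the content type, the second the description, the third the style, and the last the composition.)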

In ancient times, PG learned the craft/art of searching legal resources for attorneys, primarily Lexis with a bit of Westlaw thrown in. One thing he liked about both systems was that he could find exactly what he was looking for without extraneous search results. Of course, this cost a lot of money if you didn’t have complimentary accounts from each company, as PG did for several years.

Prior to Google dominating web search, there were other web search engines. Does anyone remember AltaVista, which was acquired by Yahoo?

When Google showed up, PG learned how to use the various Google search commands to help find the sort of thing he was looking for without seeing a thousand different things that were only sort of what he was looking for: search social media, search hashtags, exclude words from your search, etc., etc. (There are lots of places online that will show you how to use Google’s various search commands; see, for example, Google’s Refine Web Searches page.)

There are more search operators than Google includes in the link above. He found some sites that claim to include all of Google’s search operators – there are at least 40, perhaps more. Here’s a link to a non-Google site that claims to list all of Big G’s search operators.
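By way of example, a query like

site:medium.com "prompt engineering" -midjourney

limits the results to a single site, requires the exact phrase, and drops any page that mentions Midjourney, while something like

intitle:"search operators" filetype:pdf

returns only PDF documents with that phrase in the title. Neither query is anything special; they just show how a few operators can be stacked together in one search.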

PG’s major gripe against Google is that the search engine always wants to show you something. With the classic versions of legal search engines, if PG searched for something and it didn’t exist, Lexis would tell him that nothing existed that met his query.

PG will have to experiment with a combination of Google search operators to see if Big G ever admits that it is stunned.

Back to the OP, PG hasn’t figured out exactly what the various publicly-available AI systems are looking for in their user inputs. But he’s enjoying his experiments.

4 thoughts on “The Future Of Prompt Engineering”

  1. So “prompt fu” is the next skill to learn?

    Maybe. Maybe not.

    I expect the GPT tools themselves to lean into their conversational abilities to craft the proper prompts instead of expecting humans to figure them out. Exploring the current state of GPT creation tools can be fun, but the way things are going, a lot of those skills will be outdated in six months.

    One thing prompt fu ignores is that computer science has an entire wing devoted to human usability and user interfaces. Crafting prompts is the CLI equivalent of single-user software. More sophisticated front ends are coming. Just one example is the Stability plug-in for Photoshop:

    https://www.windowscentral.com/software-apps/this-photoshop-plugin-lets-you-add-ai-generated-elements-or-edits-within-seconds

    “As shown by @mrgreen on Twitter, “Stable Diffusion’s Photoshop plugin lets you create, edit, and iterate AI images from within Photoshop.” The attached video demonstrates how typing in a quick prompt and making vague brush strokes on a document can lead the plugin to generate artwork that matches.

    “For instance, opening a new document, typing “mountain range” into the text prompt box, and clicking Generate leads to several mountain range options appearing for users to choose from. Photoshop users can also use the Stability AI plugin to perform edits or add complete details to an initial prompt in just a few seconds. The video @mrgreen shared demonstrates this by adding a flying red dragon over the mountain range image. Instead of taking hours, this was done simply by drawing quick red brush strokes in the sky, selecting the area around the red strokes, typing “red dragon” into the Prompt box, clicking Generate, and selecting one of the options that appeared.”

    In a few months, this will be considered clunky.

    Adobe themselves are working on their own GPT-based model for their suite of products, called Firefly. It’s currently in a limited-access beta.

    Current tools like ChatGPT and DALL-E are the equivalent of BASIC or COBOL: they expect programmers at the human end.

    Worth remembering: the current weakness of these models in open-ended use is context. The next generation of apps will narrow the focus and provide context to better infer what the operator wants. (That is what MS evangelists are talking about when they say their Prometheus module grounds GPT-4 for use in Bing.)

    As the GPT tech spreads out into creative and non-creative productivity tools, how well they manage human interaction is what will determine the long-term winners and losers. And the winners will be the ones that are the most flexible while being the least demanding of the user. As in: not everybody needs to learn COBOL or VBA to use Excel.
