Working With AI

From The Wall Street Journal:

In August, first prize in the digital-art category of the Colorado State Fair’s fine-art competition went to a man who used artificial intelligence (AI) to generate his submission, “Théâtre d’Opéra Spatial.” He supplied the AI, a program called Midjourney, with only a “prompt”—a textual description of what he wanted. Systems like Midjourney and the similar DALL-E 2 have led to a new role in our AI age: “prompt engineer.” Such people can even sell their textual wares in an online market called PromptBase.

Midjourney and DALL-E 2 emerged too late to be included in “Working With AI: Real Stories of Human-Machine Collaboration,” by Thomas Davenport and Steven Miller, information-systems professors at Babson College and Singapore Management University, respectively. But the authors note other novel titles: chief automation officer; senior manager of content systems; architect, ethical AI practice. As AI’s influence expands, its borders with the work world gain complexity. Next up: deputy regional manager of AI-prompt quality and security assurance.

The bulk of “Working With AI” comprises 29 case studies in which corporate teams integrate automation into a workflow. Each chapter ends on three or four “lessons we learned.” For each study, one or both authors typically interview not only a worker interacting directly with the AI but also the worker’s supervisor, the manager who decided to adopt the technology, the software’s developer and the company’s customers. Though they include some statistics on, say, time saved, the reports are largely qualitative.

The book is aimed at managers, consultants and students planning their careers. As none of the above, I still appreciated the accessible narratives as a diverse survey of how current technologies can expand the range of human capabilities. Some of the applications came to resemble each other, but the mild level of bland business-speak, like “stakeholder” and “buy-in,” was positively tolerable.

Early cases lean toward desk-ridden workers. One system helps financial advisers at Morgan Stanley personalize investment ideas for their clients. Another helps fundraisers at Arkansas State University target potential donors and drafts emails for them. Others suggest life-insurance premiums to underwriters at MassMutual, or help forecast sales for Kroger. In all cases, humans have the final say. And in many cases, the systems provide explanations for their outputs, listing, for example, the variables that most heavily influenced a decision.

Later cases breach the office walls. One system predicts which field activities will be most dangerous to Southern California Edison workers, and recommends precautions. Another highlights neighborhoods where crime is likely to occur and recommends that police officers patrol the area. (The latter, a form of predictive policing, has raised concerns about biased algorithms. The vendor says they’ve implemented countermeasures, but the book doesn’t elaborate.)

The benefit in most cases is increased efficiency. AI relieves employees of boring and time-consuming work, freeing them to address other tasks, such as strategic thinking or client interactions. The authors spend less time discussing ways in which machines might perform with more accuracy than humans, though they do point to Stitch Fix, where algorithms assist stylists in clothing recommendations. The company’s director of data science notes that it’s usually best not to override the AI, whose choices tend to be superior. While algorithms can be biased, so too can humans. Stitch Fix’s styling supervisor said the software nudges stylists away from their own preferences and toward those of the clients.

Many readers’ first question might be: Will AI take my job? Or: Can I replace my expensive employees with AI? The short answer from the authors is: In the near future, no. Wealthy countries are actually experiencing a long-term labor shortage. And there are still many things AI (often) can’t do, such as understand context, deal with dynamic settings, create a coherent story, coordinate people, frame a problem and know when to use AI.

The authors include an oft-quoted comment from the radiologist Keith Dreyer: “The only radiologists who will lose their jobs to AI will be those who refuse to work with AI.” The authors elaborate: “If you’re a human reading this book—and we suspect you are—that means you need to shift your focus from worrying about being replaced by a machine to worrying about whether you can add value to a job that you like where a smart machine is your collaborator. Adding value can mean checking on the machine’s work to make sure it was done well, making improvements to the machine’s logic or decisions, interpreting the machine’s results for other humans, or performing those tasks that the machine can’t or shouldn’t do for some reason.”

Link to the rest at The Wall Street Journal

4 thoughts on “Working With AI”

  1. 1- “Wealthy countries are actually experiencing a long-term labor shortage. ”

    It’s not just rich countries.
    All the urbanized countries are losing population at Black Plague rates.
    Bulgaria and all of Eastern Europe are really hurting demographically.

    Germany and Italy are right behind, as are Japan and South Korea.
    Russia was even worse before the war, and Putin is decimating the males of fathering age, both as casualties and as refugees. (300,000 have left since the draft was announced, on top of the 300,000 flagged as cannon fodder and the 100,000 casualties to date from the war.)

    But the worst of all is China, which because of its One Child policy is on pace to halve its population by 2050. Its worker base peaked 15 years ago, and the country is running millions of workers short of what its economy needs to keep running.

    2- In contrast, the US is running 250,000 *trained* workers short as the last boomers have moved up their retirement because of the pandemic. The shortage will remain until the next generation (the millennials' kids) hits working age, circa 2040. There will also be a parallel shortage of investment liquidity as the retiring boomers move their savings from higher-risk investments to safer ones like Treasury bonds. All part of the crisis of the '20s.

    AI and advanced automation will minimize the economic damage (as long as wealth taxes and “tax the robots” remain harebrained ideas), but the social issues will only get worse because the demand for unskilled labor and liberal-arts types will decline. (“Those are not the droids you’re looking for.”) The US demography is *on average* fairly healthy, and actually good in certain (cough*red*cough) states and because of legal migrants. Dreams of “equity” will remain dreams, in good part because of “AI”.

    “AI” art is more of a novelty in the greater scheme of things, but the underlying tech will be very useful in the productive areas: real-time translation, semantic analysis, way better dictation and TTS systems, just for starters. In engineering, systems analysis, and architecture. In 3D printing. But the biggest gains are likely going to be in the sciences.

    A recent example is software that predicts, by the million, the folded shapes that proteins produced by specific genes might take. That doesn’t mean the predicted folded shapes are guaranteed correct, but the outputs will point researchers in the right direction. And knowing those shapes will lead to new drugs and treatments, especially when combined with mRNA tech.

    Bottom line, “AI” art is cute and amusing but the tech itself is serious stuff arriving just as it is needed. It is here to stay.

  2. Millions of people don’t realize none of this stuff is new. How many remember a time before digital calculators and word processors? A time with no smartphones, or even cell phones? Days when million-dollar computers were the size of refrigerators and had 4K of RAM (IBM 1401)? Paper maps instead of GPS?

    All that was as much a takeover by AI as what people talk about today. It’s fully integrated into our lives and work. The takeover depends on what one defines as the normal stuff.

    • Absolutely.
      The fundamental fact everybody (willfully) ignores is that we have been living not in an industrial civilization but a technological one since the ’80s or so. It started in the mid-’50s, but it really took off in 1978. (Blame Shockley and Fairchild and Intel.)

      All the talk about “stagnant wages” is because people who aren’t integrating into the new economy (and they are legion) are fungible. They are trying to prosper, skill-less, in a marketplace of skills. Not going to happen.
      No amount of handwaving and cursing “capitalism” is going to change anything, barring a total collapse of all civilizations à la Venezuela. And if it comes to that, the unskilled will be the first casualties.

      They fear and vilify the very things that keep them *alive*.
      Not rational.
      But then, Homo sapiens sapiens ain’t rational anyway.

      In that, “AI” art is a good example: it takes actual skill and creativity to manipulate those art tools, but the result is economically viable. It may produce the equivalent of velvet tigers or poker-playing dogs, but where there is money there is survival.

      “Starving artist” painters, take note.

      BTW, looking at the evolution of computing, today’s art-generation tools are operating at the level of a ’50s mainframe, where every single task had to be manually programmed at the hardware level. Then came John Backus’s FORTRAN and Grace Hopper’s COBOL, and BASIC followed, followed by VAXes and PCs and, yes, smartphones. Each evolution spread the technology more and more by making it easier to use. Most folks don’t even realize they have a vintage supercomputer in their pocket.

      These prompt-driven “AI” tools will soon be superseded by “AI” fed millions of prompts and natural-language processors that will produce quality images out of a simple English-language ( 😉 ) sentence like “Give me a beach scene of a Tahiti twilight at 6PM in midsummer.”

      (Because “Tahiti is magical”.) 😉

      And a while after that, all the code will be reduced to a cellphone app running locally.

      Welcome to the 21st century.

      • And very few people understand that today’s computers are simply expanded versions of those old 4K machines. (Yes, that means memory was 4,000 bytes.) Data was on punched cards.

        The register sizes have increased. The number of registers has increased. The number of cores has increased. The number of machines has increased. But it’s the same basic thing: registers. It will be that way until we jump to the quantums.

        Now get the hell off my lawn.
