Dating apps need women. Advertisers need diversity. AI companies offer a solution: Fake people


From The Washington Post:

Artificial intelligence start-ups are selling images of computer-generated faces that look like the real thing, offering companies a chance to create imaginary models and “increase diversity” in their ads without needing human beings.

One firm is offering to sell diverse photos for marketing brochures and has already signed up clients, including a dating app that intends to use the images in a chatbot. Another company says it’s moving past AI-generated headshots and into the generation of full, fake human bodies as early as this month.

. . . .

But AI experts worry that the fakes will empower a new generation of scammers, bots and spies, who could use the photos to build imaginary online personas, mask bias in hiring and damage efforts to bring diversity to industries. The fact that such software now has a business model could also fuel a greater erosion of trust across an Internet already under assault by disinformation campaigns, “deepfake” videos and other deceptive techniques.

Elana Zeide, a fellow in artificial intelligence, law and policy at the University of California at Los Angeles’s law school, said the technology “showcases how little power and knowledge users have in terms of the reality of what they see online.”

“There’s no objective reality to compare these photos against,” she said. “We’re used to physical worlds with sensory input … but with this, we don’t have any instinctive or taught responses on how to detect what’s real and what isn’t. It’s exhausting.”

. . . .

Icons8, an Argentina-based design firm that sells digital illustrations and stock photos, launched its online business Generated.photos last month, offering “worry-free, diverse models on-demand using AI.”

The site allows anyone to filter fake photos based on age (from “Infant” to “Elderly”), ethnicity (including “White,” “Latino,” “Asian” and “Black”) and emotion (“Joy,” “Neutral,” “Surprise”), as well as gender, eye color and hair length. The system, however, shows a number of odd gaps and biases: For instance, the only available skin color for infants is white.

. . . .

Companies infamously have embarrassed themselves through haphazard diversity-boosting attempts, Photoshopping a black man into an all-white crowd, as the University of Wisconsin-Madison did on an undergraduate booklet, or superimposing women into group photos of men.

But while the AI start-ups boast a simple fix — offering companies the illusion of diversity, without working with a diverse set of people — their systems have a crucial flaw: They mimic only the likenesses they’ve already seen. Valerie Emanuel, a Los Angeles-based co-founder of the talent agency Role Models Management, said she worried that these kinds of fake photos could turn the medium into a monoculture, in which most faces look the same.

“We want to create more diversity and show unique faces in advertising going forward,” Emanuel said. “This is homogenizing one look.”

Icons8 created its faces first by taking tens of thousands of photos of about 70 models in studios around the world, said Ivan Braun, the company’s founder. Braun’s colleagues — who work remotely across the United States, Italy, Israel, Russia and Ukraine — then spent several months preparing a database, cleaning the images, labeling data and organizing the photos to the computer’s precise specifications.

With those images at the ready, engineers then used an AI system known as StyleGAN to output a flood of new photos, generating 1 million images in a single day. His team then selected the 100,000 most convincing images, which were made available for public use. More will be generated in the coming months.
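The generate-then-curate workflow described above (sample a huge batch from the model, then keep only the most convincing fraction) can be sketched in miniature. This is purely illustrative: the latent sampling stands in for StyleGAN's z-space inputs, and `realism_score` is a hypothetical stand-in for Icons8's selection step, whose actual criteria the article does not describe. Real pipelines score the rendered images, not the latent vectors.

```python
import random

def generate_candidates(n, dim=512, seed=0):
    """Sample n latent vectors, standing in for StyleGAN's z-space inputs."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n)]

def realism_score(latent):
    """Hypothetical stand-in for curation; a real pipeline would score
    the rendered image (by humans or a classifier), not the latent."""
    return -sum(x * x for x in latent)  # toy: prefer latents near the mean

def curate(candidates, keep_fraction=0.1):
    """Keep the best-scoring fraction, mirroring the 1,000,000-generated,
    100,000-kept ratio reported in the article."""
    ranked = sorted(candidates, key=realism_score, reverse=True)
    return ranked[: max(1, int(len(ranked) * keep_fraction))]

batch = generate_candidates(1000)
kept = curate(batch)          # 10% survive, matching the article's ratio
print(len(batch), len(kept))  # prints: 1000 100
```

The design point is simply that generation is cheap once the model is trained, so quality control shifts from producing images to filtering them.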

The company, Braun said, signed three clients in its first week: an American university, a dating app and a human-resources planning firm. Braun declined to name the clients.

Link to the rest at The Washington Post

PG went to Generated Photos to look for faces.

First, Elderly:

Photo by Generated Photos

Elderly Latina

Photo by Generated Photos

Surprised Adult Asian

Photo by Generated Photos

Visitors will come to their own conclusions, but if PG were using this type of image for anything commercial, he would make it very small.

And when you look at a group of these photos, they can seem a little creepy:

Photo by Generated Photos

11 thoughts on “Dating apps need women. Advertisers need diversity. AI companies offer a solution: Fake people”

  1. It could be handy for authors, too: what if your cover needs the image of a blonde woman in a turquoise-and-silver sari squatting to put her hand in the Ganges? It’s going to be tricky to find a photo to use to get the cover artist started, or to create your own cover. It would be expensive and complicated to get the right photo taken.

    But if they can actually make it lifelike at a reasonable cost, I’d consider buying.

    • That scenario is why I end up paying for custom art. Maybe it’s my generation, but those CGI covers indies are using now just scream “ReBoot,” which was a ’90s-era cartoon about a brother and sister living inside a video game. I always think the CGI covers are signaling the “people stuck inside video games” genre (whatever that’s called), but it never turns out to be the case.

      The idea does seem promising, though. I think I’ll wait for the 2.0 version before I even consider using them for books.

  2. I think I’m missing something here: what is stopping the companies that need diverse models from getting human models? Is there a shortage? I stopped reading at the part where they came to the monster faces, because I don’t need the nightmare fuel before bed 🙂 The Emilia Clarke version of stock photos is far less creepy.

    Seriously, there are modeling agencies that have online databases where you can pick models from all over the world. Through no special effort, I have known women and children who model. I saw one acquaintance in a magazine ad, which was serendipitous because I was talking about her just then, and could point her out. What’s the disconnect?

    I’m also not getting the part about the dating app. Are there women using this app or not? Are any of them good looking? Why can’t the company ask the good looking people to sign a release to use their pictures for promotional purposes? Or give a call out to recruit local models for the job? To me, the real question is, where are they searching for models, that they’re not able to find black, Latino, or Asian humans, and have to resort to CGI? Are these agencies not in America?

    And I’m also curious what happens when some man picks what he thinks is a woman, only to have it turn out to be CGI. Austin Powers voice: That’s a machine, baby! I would hope the company is smart enough to not allow those pictures to be used in the database. Especially since for some reason, the computer doesn’t always understand where mouths are supposed to be located on faces. What model were they using?

    • J – You are correct that there are lots of existing modeling agencies that will sell rights to use photos of models with whom they have contracts (presumably).

      My speculation is that the AI model sites may be less expensive to use and, presumably, there is no live person to complain about how the image is used, claim that appropriate written permission was not granted to the photographer, etc.

      A second topic of speculation is that the AI will likely improve with time, so a customer can order a photo of two Asian women talking and laughing with an African and a Latino man. Somewhere, there will be film versions of the still AI photos as well.

      • Eh, for the last paragraph / possibility, it seems like they could easily accomplish this objective if they just go to a city. I was under the impression a lot of agencies are in New York City, which is why I’m having trouble grasping how they’re running into the diversity problem.

        The first possibility, that the humans are concerned about image rights and so on, seems more likely. I’m reminded that models stopped being the focus of magazine covers after Linda Evangelista’s infamous “I don’t get out of bed for less than $10,000 a day,” and publishers realized that actresses would get out of bed for free. So to speak. Plus, no AI will get into trouble for a tweet. And AI models don’t have beds to get out of 🙂

  3. Each one of these artificial ‘intelligence’ (ie, algorithmic) creations displaces the people it was based on.

    In this case, models have their features used and homogenized – so that users no longer have to pay fees to the models! Users will pay the company that produces these fake models, and the company will then become richer. For a niche use, the user will guarantee that its pictures are actual humans photographed by a (hehe) digital camera… and not those fake ones.

    In the same way, Uber and Lyft drivers will be replaced by robot/software drivers for cars with humans in them, because all their data about everything from pickup to price is in a huge database. Being rich might earn you an actual human driver. A status symbol, even if only on your own estate.

    Humans think this is going to make things cheaper – it might; safer – maybe (or it will slow things down when anything unsafe might happen and cause major gridlock); and available – if you can afford it.

    The Industrial Revolution copied the patterns made by human weavers, replicated them on punch cards, and replaced said humans.

    I wonder if AI will avoid war or go all out with it.

    • Humans supplied the values that went into replicating those patterns onto punch cards.

      In all the talk of AI, I rarely see where the standards AI will maximize or minimize come from. Maybe it will go to war against lawn mowers? Or maximize alfalfa production?

      I can use the exact same program to build a neural net to initiate trades based on stochastics or the spread on a 10/40 exponential average. Simple example, but I have supplied the standard.

    • I suspect the full range of outcomes has yet to be understood, A.

      On the likely legal front, have the models whose photos are fed into the AI creator given their consent for such a use? What if an AI-created face looks exactly like a real person and that person experiences reputational harm due to the use to which the AI face is put? What if an AI-created face looks exactly like a famous actor and is used in a motion picture?

      Given human nature, I suspect it may be only a matter of time before someone uses an actual photo of a real person, tweaks it a bit, uses it commercially and claims it was created as the result of an AI process performed somewhere in Russia.

      • Suppose Brad Pitt looks just like me? Do I have any recourse against him for using his pic without making any reference to me? Against someone else using his pic with his permission and no reference to me?

        How about his recourse against me if I use my pic without any reference to him?

Comments are closed.