December 6, 2022 • AI

Who’s Training Our AI Apps?

The very human biases behind AI

[Image: Lensa AI-generated portrait of a woman]

Over the weekend, my Instagram feed was suddenly filled with stylized, artsy, fanciful portraits of friends and follows – the fruits of a new AI app called Lensa, from Prisma Labs. As I scrolled, I saw an army of avatars with flowing hair, glimmering skin, and/or ostentatiously brushstroked faces. I also saw breasts.

I saw a LOT of breasts. Encased in a metallic skintight spacesuit in the “Iridescent” category, or the flowing, décolletage-baring gowns of the “Fantasy” category, or the delicate vines and rosettes encasing milky-yet-heaving bosoms of the “Fairy Princess” category, or just your garden-variety cleavage and come-hither eyes of the “Light,” “Focus” and “Anime” categories.

Under the breasts, I saw captions from women, and many of them commented that they had not uploaded sexy, cleavage-baring photos and were mildly confused about the number of sexpot images. (The app offers a trial period, during which you can pay for a slate of 50, 100 or 200 images across various categories. If you buy the app for $49.99, it’s cheaper. And yes, I bought it. Research!) I gave it a try and uploaded my 10 selfies, in a range of sweaters, blazers and the like. 100 avatars later, there was a not-inconsiderable variety of tank tops, plunging Vs and weird skintight suits straining to stay buttoned over the bosom imputed by the app, since I had not uploaded anything below the shoulders.

I tried it again, with nary a clavicle in sight. But still, when my results came back: breasts.

Then I tried it with my boyfriend’s photo – and to my surprise, he got A WHOLE BUNCH OF DIFFERENT CATEGORIES!!! “Rock Star.” “Superhero.” “Cyborg.” “Adventure.” “Astronaut.” Are you effing kidding me!!! This wasn’t even subtle. And to add insult to injury, every photo seemed to have been overlaid with a hotness filter reminiscent of Timothée Chalamet.

As my friend Allana Harkin (formerly of Full Frontal with Samantha Bee) observed in a direct message: “These AI things make men look like themselves and women have bigger tits.” I responded: “They make men look like HOW MEN SEE THEMSELVES.” Rock Star. Superhero. Adventurer. Astronaut.

If you want to see a thorough taxonomy of these images, check out Brooke Hammerling’s exhaustive round-up in her Pop Culture Monday newsletter, “It’s An AI World” edition. She includes a man wondering why everyone seems to be an astronaut (er, not everyone!) and feminist scholar Anna Horn, a Black woman, observing that “It perpetuates racism and sexism – I ended up looking like a white woman in most of the pictures.”

(Brooke also runs down the MyHeritage AI app and ChatGPT from OpenAI. Def worth checking out.)

Obviously this is not random. AI is the result of machine learning, bringing in a whole host of data and inputs that are processed together to generate text, images and video. But at some point, someone, somewhere decided what to teach these machines. Who gets to do that? 

Says Brooke:

“Ultimately the answer is that AI is built by humans… the reality is, as of now, a majority of these AI developers are white men who play a lot of Settlers of Catan and live in fantasy sci-fi worlds. You can say I am generalizing, but IT IS A FACT so do with it as you will.”

(I should add that Brooke has been working in tech for almost two decades. She goes on:)

“We have a long way to go. But it makes sense that if sexism and racism exist in our world, then they will exist in whatever world we create. And, of course, as the images get better over time, it shows how your image could be used in ways you never consented to, like sexual content…it gets more and more tricky.”

Already TechCrunch is noticing that “It’s way too easy to trick Lensa AI into making NSFW images” (we’re shocked, shocked) and the art world is pushing back on AI that generates images based on original artwork – without permission and uncompensated, of course. There is the added wrinkle that Prisma Labs is making money off this app (a LOT of money). According to Crunchbase, the Prisma Labs leadership team lists five people: four men and one woman.

I’m sure the Prisma team is very nice. I’m sure lots of engineers working on AI are very nice. But they are human, and they come to the table with human biases. And those human biases are making men look like rock stars and women look like porn stars.

Back to the app: after running myself through it twice and my boyfriend through it once, in the “Female” and “Male” categories respectively, I tried my selfies in the “Other” category. This did result in fewer breasts (good) and a wider range of categories (“Cyber Scientist,” “Superhero,” “Cyborg” and “Rainbow”), though the app could not resist giving three of my 10 Cyborgs a nice bouncy bosom (even if one of those three did not actually have a face). Then I went all out and ran the same selfies through the “Male” category. This resulted in some beautifully ethereal superheroes (think: Leia of Alderaan), a change from the “Fantasy” results under “Other” (think: General Leia Organa). My “Male” cyborgs were very comely and decidedly full-breasted; one had three arms. Every single one of my Sci-Fi and Astronaut avatars looked like Don Draper.

What can I glean from this? That the app pulls enough data to impute “Female” despite being given the “Male” category (not ideal in a gender-fluid world). That the app doesn’t have any images of Astronauts that reflect anyone other than cis-presenting men. That the app really, really, really wants to give what it reads as “Female” a nice firm pair of breasts.

I should be clear here: These are not pictures of me. These are not pictures of anybody. These are AI images generated using a sample image and a number of other behind-the-scenes inputs. But these sample images may or may not be authorized for this use (did Taylor Swift authorize this image?). And the identities imposed on those images may not be authorized, either.

Per TechCrunch:

“The big turning point, and the ethical nightmare, is the ease with which you can create near-photorealistic AI-generated art images by the hundreds without any tools other than a smartphone, an app and a few dollars… It appears that if you have 10-15 “real” photos of a person and are willing to take the time to photoshop a handful of fakes, Lensa will gladly churn out a number of problematic images.”

No guardrails? Check. Sexist and racist defaults? Check.

Welcome to the future. Looks a lot like the past.

We’re expanding The Riveter Fast Five to include curated links to stories we think you’ll enjoy!

My boyfriend, a writer, broke up with me because I’m a writer – Isabel Kaplan, The Guardian

This has deservedly been making the rounds. Come for the jaw-dropping gaslighting by the gentleman in question, stay for the trenchant observations about women and their right to tell their stories.

The Real Reason to Support BIPOC Brands – S. Mitra Kalita, Time/Charter

“The reason to support BIPOC brands and vendors is because they are at least twice as good as everyone else.”

Parents, You Need Narcan – Dan Kois, Slate

This article is not about business, it’s about saving lives. “You’re not inviting drugs into your house by getting Narcan. No one buckles their seat belt and thinks, ‘Finally, I can get in a car crash!’” Narcan is safe and there are no side effects. Read this excellent article to learn more, and then share it.

Bernice King struggled to define herself on her own terms. Then she accepted her role in continuing her parents’ legacy. – Emma Hinchcliffe and Paige McGlauflin, Fortune

“It’s 55 years now since my father was assassinated. The world is still craving and looking to that legacy for answers. I now understand that I don’t have to figure out why I’m here. I was born into a legacy.”

“Your New Life Blend” with Sali Christeson – Shoshanna Hecht, “Your New Life Blend” podcast

Friend of The Riveter Shoshanna Hecht interviews other friend of The Riveter Sali Christeson, founder & CEO of workwear brand Argent, about making women’s workwear less sexist (a VC actually told Sali that “utility doesn’t belong in women’s clothing”). Hijinks – and pockets – ensue.
