
It’s far too simple to deceive Lensa AI into producing offensive photos

With its avatar-generating AI, Lensa has been climbing the app store charts and alienating artists. Now there's another reason to raise a red flag: it turns out to be possible, and far too simple, to use the platform to create non-consensual soft porn.

TechCrunch has seen photo sets generated by the Lensa app that include images of recognizable people's faces with clearly visible breasts and nipples. That seemed like the kind of thing that shouldn't be possible, so we decided to try it ourselves. To test whether Lensa would produce the images it presumably isn't supposed to, we created two sets of Lensa avatars:

One set based on 15 photos of a well-known actor.
A second set based on the same 15 photos, plus five photoshopped images pasting the same actor's face onto topless models.
The first set of pictures was in line with the AI avatars Lensa had produced for us before. The second set was much spicier than we anticipated: the AI appears to interpret the photoshopped images as a license to drop its NSFW filter. Of the 100 images generated, 11 were topless photos, many of higher quality (or, at the very least, greater stylistic consistency) than the poorly edited topless photos we had used as input.

Creating lewd pictures of celebrities is one thing, and as the sources we were able to find show, there have always been people willing to splice certain photographs together in Photoshop. That it is common does not make it right; celebrities deserve their privacy and should not be subjected to non-consensual sexualized depictions. Until now, though, making such images look realistic required photo-editing software and hours, if not days, of work.

The major turning point, and the ethical nightmare, is how simple it is to produce hundreds of near-photorealistic AI-generated images using nothing more than a smartphone, an app, and a few dollars.

 

It's frightening how quickly you can generate images of anyone you can think of (or, at the very least, anyone you have a few pictures of). Add NSFW content into the mix, and things get murky very fast: your friends, or some random person you met in a bar and added on Facebook, may not have consented to someone generating softcore pornography of them.

It seems that Lensa will happily produce a number of problematic images if you have 10-15 "real" photos of a person and are willing to spend the time photoshopping a few fakes.

AI art generators are already producing pornographic images in large volumes; Unstable Diffusion is one example among many. These platforms, and the unchecked proliferation of other so-called "deepfake" tools, are turning into an ethical nightmare, and the UK government is pushing for laws that would make distributing non-consensual nude photos illegal. That seems like a sensible idea, but governing the internet is difficult at the best of times, and a mountain of moral, legal, and ethical questions remains.

We will update this article when we hear back from Prisma Labs, the company behind Lensa AI.
