AI can now create fake selfies for your Tinder profile

AI-generated images of Motherboard editor Emanuel Maiberg

Images generated by PhotoAI

The AI image-generating craze has entered its next phase of absurdity: creating fake profile pictures that make you look good on dating apps and social media.

For $19, a service called PhotoAI will use 12-20 of your mediocre, poorly lit selfies to generate a batch of fake photos tailored to your style or platform of choice. The results demonstrate an AI trend that regularly seems to jump the shark: a “LinkedIn” package will generate photos of you wearing a business suit or professional outfit, while the “Tinder” setting promises to make you “the best you’ve ever seen”, which apparently means turning you into an algorithmically enhanced dude with sunglasses.

There are also options that generate artistic Polaroids, photoshop you into memes, or create hyper-stylized portraits that copy the aesthetics of popular artists. After you submit your photos, the site promises to return results within 12 hours. The AI model used to generate your photos is deleted after seven days, according to the site’s privacy policy.

Screenshot from PhotoAI showing generated photos of a man with sunglasses

Screenshot of PhotoAI website

Motherboard tested PhotoAI’s service by uploading 12 photographs of Motherboard editor Emanuel Maiberg. We chose the “Tinder” package, and less than four hours later we received a link to a gallery of 78 images, including some that would work well for Tinder and LinkedIn profiles. Like many AI-generated images, some look impressive at first glance but aren’t entirely convincing if you examine them the way any reasonable person would examine a Tinder profile picture. In other cases, they are alternately hilarious and horrifying.

In these pictures, Emanuel really looks like he has some great productivity hacks to post on his LinkedIn profile, but if you look closely, you’ll see that his mouth isn’t quite right (each picture has a few small flaws like this):

Motherboard editor Emanuel Maiberg wearing a business suit in an AI-generated photo

In these images, apparently from the Tinder package, PhotoAI gave Emanuel a leather jacket, sunglasses, and hints of a smoldering facial expression. Again, none of them are entirely believable if you look for more than a second.

A stylized AI-generated image of Emanuel wearing a brown shirt and blue tie
Motherboard editor Emanuel Maiberg wearing a white shirt and sunglasses in an AI-generated photo

According to creator Sébastien Lhomme, PhotoAI works by generating a “fine-tuned” model from the photos a user submits. The results are then filtered through a second, smaller model that applies the chosen style, and finally fed into Stable Diffusion, which is publicly available. In other words, you’re not paying for fancy proprietary AI technology, but for a service that simply feeds your photos into a pre-existing AI image generator. Similar services have appeared in recent months which, for a fee, use AI to generate text prompts…which can then, of course, be used to generate photos with AI.
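For the curious, a rough sketch of that general approach (fine-tune a model on a handful of selfies, then prompt it with a style-specific text prompt) might look like the following, using the open-source diffusers library. This is not PhotoAI’s actual code: the local model path, the “sks person” subject token, and the prompt wording are assumptions for illustration only.

# A rough sketch, not PhotoAI's actual pipeline. Assumes a Stable Diffusion
# model has already been fine-tuned on 12-20 user selfies (for example with
# DreamBooth) and saved locally to the hypothetical path below.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "./finetuned-selfie-model",   # hypothetical path to the fine-tuned weights
    torch_dtype=torch.float16,
).to("cuda")

# A style-specific prompt stands in for the second "style" model Lhomme describes.
prompt = (
    "professional headshot of sks person wearing a business suit, "
    "studio lighting, sharp focus"
)

images = pipe(prompt, num_inference_steps=30, num_images_per_prompt=4).images
for i, image in enumerate(images):
    image.save(f"linkedin_{i}.png")

(The “sks” placeholder is a common convention in DreamBooth-style fine-tuning for referring to the specific person the model was trained on.)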

Absurdity aside, the rise of AI image generators has sparked controversy in recent months. Some working artists complained that tools like DALL-E and Midjourney effectively duplicated their styles after using their artwork as training material without permission. So while generating AI selfies can be silly and fun, it’s still unclear where and when they can be used, legally or morally speaking.

It’s also unclear whether AI-generated selfies violate the dating apps’ terms of service, which have rules against impersonation and misrepresentation. But right now, nothing seems to be stopping someone from trying to catfish people on Tinder using flattering AI-generated photos.

Lhomme says he is not responsible for how people use the photos generated by his service. He points out that even without the help of AI, people have long used Photoshop or hired freelancers to edit their photos, though perhaps not with such instant results.

“The technology is so new that the use cases it solves over the next few months/years will inevitably lead to some interesting questions about legality and morality,” Lhomme told Motherboard. “Things will be unclear for some time, and I think collaboration will need to occur between all parties involved to decide what are the best rules and responsibilities for each to ensure the ethical use of these technologies.”

PhotoAI’s terms of use state that users are “not allowed to upload photos of other people” or “upload nude or pornographic photos”. In practice, however, there is no foolproof way to enforce these kinds of rules. OpenAI, the creator of DALL-E, has built filters that automatically reject certain types of images, for example. But dedicated Stable Diffusion users have found ways to generate weird porn and other NSFW artwork by simply hosting the AI model on their own servers, with no filters.

The great potential for lawsuits and copyright infringements has made some platforms leery of AI-generated images. While some photo sites have explicitly banned the sale of photos created by tools like DALL-E, others have embraced them—Shutterstock recently announced it would partner with OpenAI to offer generated images on the site.

Nonetheless, Lhomme suggests that AI-generated photos will eventually become ubiquitous. After all, he notes, selfies are already digitally manipulated by image-adjustment software that runs on smartphone cameras, and most people are none the wiser.

“Of course, the technology isn’t there yet, but it will be soon,” Lhomme said. “And once AI-generated photos can no longer be distinguished from ‘real’ photos, the question of whether they are real or not will become moot.”
