Artificial intelligence

Your face and images helped create ChatGPT and Lensa. Is it right?


This article is a preview of The Tech Friend newsletter. Register here to receive it in your inbox every Tuesday and Friday.

Of course, that drunken selfie you posted on Instagram could be personally embarrassing. Now imagine that the selfie is also the driving fuel of an artificial intelligence system that helps put an innocent person in jail.

Welcome to the era of artificial intelligence. What you do with your face, your home security videos, your words, and the photos from your friend’s art show isn’t just about you. Almost entirely without your real consent, information that you post online or that is posted about you is being used to train AI software. These technologies could let a stranger identify you on sight or generate custom art at your request.

Good or bad, these AI systems are built with pieces of you. What are the rules of the road now that you’re helping bring AI to life and can’t foresee the outcomes?

I bring this up because lately a bunch of people have been trying out remarkable AI technologies that are built on all the information we’ve put out into the world.

My colleague Tatum Hunter spent some time reviewing Lensa, an app that turns a handful of selfies you provide into artistic portraits. And people have been using the new ChatGPT chatbot to generate silly poems or professional emails that look like they were written by a human. These AI technologies could be extremely useful, but they also come with a host of thorny ethical issues.

Tatum reported that Lensa’s portrait “magic” derives from the styles of artists whose work was included in a giant database used to train image-generating computers. The artists didn’t give their permission for this, and they aren’t paid. In other words, your fun portraits are built on work effectively ripped off from artists. ChatGPT learned to imitate humans by analyzing your recipes, social media posts, product reviews, and other text from everyone on the internet.

Beyond these two technologies, your birthday party photos on Facebook helped train the Clearview AI facial recognition software that police departments use in criminal investigations.

Being part of the collective construction of all these AI systems may feel unfair or unsettling to you. But it is happening.

I asked a few AI experts to help me sketch out some guidelines for the new reality that anything you post could become AI data fuel. Technology has outpaced our ethics and laws. And it’s not fair to put the burden on you of imagining whether your Pinterest board could one day be used to teach killer AI robots or put your sister out of work.

“While it is entirely good individual practice to limit digital sharing in all cases where you do not or cannot know the afterlife of your data, this will not have a major impact on corporate and government misuse of data,” said Emily Tucker, executive director of the Center on Privacy & Technology at Georgetown Law. Tucker said people need to organize to demand privacy rules and other restrictions that would prevent our data from being stored and used in ways we can’t imagine.

“We have almost no legal privacy protections in this country, and powerful institutions have exploited this for so long that we’ve started to act like it’s normal,” Tucker said. “It’s not normal and it’s not right.”

Mat Dryhurst and Holly Herndon, artists in Berlin, helped start a project that lets people see whether their artwork or personal photos appear in popular databases used to train AI systems. Dryhurst told me that some AI organizations, including LAION, whose huge image collection was used to generate Lensa portraits, will let people flag their personal images to have them removed from computer training datasets. (The website is called Have I Been Trained?)

Dryhurst said he is excited about the potential of AI for artists like him. But he also advocated a different permission model for what you upload. Imagine, he said, if you could upload a selfie to Instagram and say yes or no to that photo being used for future AI training.

Maybe that sounds like a utopian fantasy. You’ve gotten used to the feeling that once you upload digital material about yourself or your loved ones, you lose control of what happens next. Dryhurst told me that with publicly available AI systems like Stable Diffusion and ChatGPT, which are getting a lot of attention but remain flawed, now is the perfect moment to rethink what real personal consent should look like in the age of AI. And he said some influential AI organizations are open to it, too.

Hany Farid, a professor of computer science at the University of California at Berkeley, told me that individuals, government officials, many technology executives, journalists, and educators like him are far more aware than they were a few years ago of the potential positive and negative consequences of emerging technologies like AI. The hard part, he said, is knowing what to do to effectively limit the harms and maximize the benefits.

“We laid out the issues,” Farid said. “We don’t know how to fix them.”

For more, watch Tatum discuss the ethical implications of Lensa’s AI portrait images:

Your iPhone automatically saves many things on your phone to Apple’s iCloud backups, including your photos and iMessage group chats. Apple said this week that it will begin giving iPhone owners the option to fully encrypt those iCloud backups so that no one else, including Apple, can access your information.

Encryption technology is controversial because it hides information from good guys and bad guys alike. End-to-end encryption prevents scammers from eavesdropping on your video call or stealing your cloud-saved medical records. But the technology can also shield the activity of terrorists, child abusers, and other criminals.
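To make the idea concrete, here is a toy sketch of the principle behind end-to-end encryption. This is a simplified illustration, not Apple’s actual protocol: the point is that the key never leaves your device, so whoever stores the ciphertext (a cloud provider, for example) cannot read it.

```python
# Toy illustration of the end-to-end encryption principle (NOT a real
# protocol): the server only ever sees ciphertext; the key stays with you.
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad-style XOR; encrypting and decrypting are the same step."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"cloud-saved medical records"
key = secrets.token_bytes(len(message))   # generated and kept on your device

ciphertext = xor_cipher(message, key)     # this is all the cloud ever stores
assert xor_cipher(ciphertext, key) == message  # only the key recovers the data
```

Real systems like iCloud’s encrypted backups use vetted ciphers and key-management schemes rather than a one-time pad, but the trust model is the same: without the key, the stored bytes are unreadable.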

Starting this year, Apple will let you decide for yourself whether you want to encrypt your iPhone’s saved backups. If you’re privacy conscious, you can turn on this feature now.

You must first sign up for the Apple Beta Software Program, which gives you access to test versions of the company’s upcoming operating systems while Apple is still tinkering with them. After signing up, you need to download and install the test software on all of your Apple devices. You will then have the option to turn on fully encrypted iCloud backups.

One downside: you might run into glitches from using operating system software that isn’t yet ready for release on all iPhones or Macs.

Also read Heather Kelly’s advice on how to keep your texts as private as possible.

Brag about YOUR small victory! Tell us about an app, gadget, or tech tip that made your day a little better. We may feature your tips in a future edition of The Tech Friend.

