Talking to AI Chatbots Is Not Private

You may be chatting with a bot, but a human will likely read the conversation at some point.

Installing a "AI" plugin that would automatically answer new AIM messages (old) when I was not available happened when I was in college in 2004 (I'm old). This simple plugin reacted automatically to messages by looking at my chat past. For example, if someone asked, "How are you?" the bot would answer with a recent answer I gave to that question. Probably guess what happened next: this plugin took about two days to tell my friend what I said badly about them again. After learning something about AI privacy (and friendship), I deleted it.

Twenty years later, the core privacy concern about AI hasn't changed: anything you say to an AI chatbot could be read by a human, and possibly repeated.

Be careful what you tell AI chatbots

Writing for ZDNet, Jack Wallen notes that the privacy statement for Google's Gemini (formerly Bard) makes clear that all chat data is retained for three years and that humans routinely review it. The statement also says outright that you shouldn't use the service for anything private. In its words:

Don't enter anything you wouldn't want a human reviewer to see or Google to use. For example, don't enter confidential information or any data you wouldn't want Google to use to improve its products, services, and machine-learning technologies.

That's Google's plain-English way of saying that humans can read your chats, and that whatever they find may be used to improve its AI products.

So does this mean Gemini will repeat the private things you type into the chat box, the way my ill-fated AIM bot did? No, and the page does say that human reviewers remove personal information like phone numbers and email addresses. But the ChatGPT leak late last year, in which a security researcher managed to extract training data, shows that anything a large language model can see could, in theory, eventually get out.

And all of this assumes the companies behind your apps are acting in good faith. Google and OpenAI both state in their privacy policies that they don't sell personal information. But as Thomas Germain reported for Gizmodo, some AI "girlfriend" apps coax people into sharing intimate details and then sell that data. From the article:

You've heard of data problems before, but according to Mozilla, AI girlfriends violate your privacy in "disturbing new ways." CrushOn.AI, for example, collects detailed information on topics like sexual health, medication use, and gender-affirming care. 90% of the apps may sell or share user data for targeted ads and other purposes, and more than half won't let you delete the data they collect.

In short: your chat data could leak, and some AI companies are actively collecting and selling private data.

The main takeaway is that you should never share anything private with a large language model. That includes obvious things like addresses, phone numbers, and Social Security numbers, but also anything else you'd rather not see leak eventually. These apps simply aren't built for private information.
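If you want a crude safety net of your own, a few regular expressions can catch the most obvious identifiers before a message ever leaves your machine. This is a minimal sketch under my own assumptions (the patterns and names are illustrative, not a real PII filter), and plenty of private information will slip past it:

import re

# Minimal sketch of a client-side scrubber for obvious identifiers.
# These are illustrative patterns; real PII detection is much harder.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(text):
    """Replace obvious identifiers with labeled placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

print(scrub("Call me at 555-867-5309 or email jenny@example.com."))
# -> Call me at [phone removed] or email [email removed].

Even with a filter like this, the safest habit is the one above: treat everything you type into a chatbot as something a stranger might read.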
