How Secure are Your Conversations with AI Chatbots - Laxman Baral Blog
How Secure are Your Conversations with AI Chatbots

Although Artificial Intelligence (AI) has a fairly long history, applications built on it are now being developed at a rapid pace. In the short time since OpenAI's ChatGPT appeared, AI tools have been launched so quickly that it is hard to even keep track of their names.

More recently, such tools have begun to touch basic aspects of human life, and conversation is one example. Some tools have started to take the place of friends and family as conversation partners. Beyond general inquiries, conversations with these tools have grown deeply personal. There are even reports of people who have turned to talking with AI tools because they cannot find anyone who understands them.

This raises some questions: are conversations with such AI tools safe? Are they private? And even if they are safe, to what degree? The rapid development of chatbots in particular has pushed these questions to the surface.

Because remember, these are chatbots. Machines, not people. And those conversations are stored somewhere, with a company or a person behind them.

Journalist Justin Pot of Lifehacker has shared a story about this. In his college days (2004), he once used a simple chatbot whose job was to answer messages when he was away from AOL Instant Messenger (AIM), the popular online messaging platform of the time. Yet he found himself nearly drawn into a personal conversation with the bot.

Private conversations in AI chatbots

Another technology journalist, Jack Wallen, notes that Gemini's (formerly Bard's) privacy policy states that conversations with the bot are stored for three years and that Google employees regularly review the data.

The privacy policy also states that the bot will not share your private conversations. At the same time, "Google is clearly saying it will use your conversations to improve its AI models," he says.

The same policy also says that employees work to remove private data such as phone numbers and email addresses, because such tools are always at some risk. ChatGPT, for instance, suffered its first data leak at the end of last year.

These companies have also stated policies of not selling users' personal information. But another tech journalist, Thomas Germain, has written that entertainment-focused chatbots, such as AI "girlfriends," encourage users to share private information.

What you need to understand is that chatbot conversations are data, and data does not just leak; it can also be sold. Thousands of companies now collect and sell private information.

So it is imperative to be cautious when talking to new and unfamiliar chatbots. Be especially strict about private conversations and about handing over sensitive information such as phone numbers, addresses, banking details, and government documents. To a large extent, keeping your conversations with AI chatbots safe is your own responsibility.
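One practical precaution along these lines is to strip sensitive details from a message before it is ever sent to a chatbot. The sketch below is a minimal, hypothetical illustration using simple regular expressions; the patterns and the `redact` helper are my own assumptions for the example, and real PII detection is far more involved than this.

```python
import re

# Illustrative patterns only; real-world PII detection needs far more care
# (international phone formats, card checksums, names, addresses, etc.).
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def redact(message: str) -> str:
    """Replace likely PII with placeholder tags before sending to a chatbot."""
    for label, pattern in PII_PATTERNS.items():
        message = pattern.sub(f"[{label} REDACTED]", message)
    return message

print(redact("Reach me at jane@example.com or +1 555-123-4567."))
# → Reach me at [EMAIL REDACTED] or [PHONE REDACTED].
```

Even a crude filter like this reduces what a chatbot provider, or anyone who later obtains its logs, can learn about you, which is exactly the risk the leaks and data sales described above create.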
