Thursday, May 18, 2023

ChatGPT Users Need to Be Careful About What They Say

Using ChatGPT has become a popular choice for many users, but you should still avoid saying certain things to it if you don’t want your words to become additional data input. Anything you say to ChatGPT, Bard or even Bing AI could end up being used to generate further answers down the line, and that could cause more problems in the long run.

With that in mind, it is worth noting that Samsung engineers recently got embroiled in a scandal when they tried to use ChatGPT to debug certain bits of code. This wasn’t the first time a Samsung employee got in trouble for using ChatGPT either, with another employee being held accountable after trying to create a summary from content that contained trade secrets.

Given all of this, it’s best to avoid discussing any work related topics with ChatGPT, since that information could end up being stored. Any text sent to ChatGPT could be used in an answer that someone else generates, which means that fiction writers and journalists in particular need to be more careful about what they share.


Now, some might say that you can easily delete the chat history from ChatGPT, but this might not always solve the problem. ChatGPT users can click the three dots next to their answers to delete them, and Bing has a gear icon that contains a clear history setting as well.

However, if you wait too long before deleting this history, it might already be too late. The data may have already been used to generate similar answers, so if you want to avoid having your inputs utilized without your consent, you need to delete the answers as soon as you can rather than putting it off.

What’s more, even if you ask ChatGPT or any other AI chatbot whether your inputs will be incorporated into other results, the response you receive might be rather evasive. Initially, the chatbots will claim that your answers are personalized and won’t have any bearing on the answers other people obtain.

Press the question further, though, and it becomes apparent that there is a bit of overlap. This goes to show that ChatGPT and other programs like it might be facing a crisis of trust right now.

It will be interesting to see how this impacts usage and adoption rates down the line. Users may be less inclined to provide any type of private information to ChatGPT in the long run, and that could diminish the value of this chatbot and potentially inhibit the growth of the industry. Many are calling for more transparency surrounding these practices, although if the current trend persists it seems unlikely that the companies behind these forms of AI will take any action.

by Zia Muhammad via Digital Information World
