Friday, January 24, 2025

This AI Chatbot GhostGPT Is Helping Hackers Steal More Than Ever Before!

According to researchers at Abnormal Security, cybercriminals are selling a malicious generative AI chatbot that assists with activities like writing phishing emails and creating malware. The chatbot, named GhostGPT, is being sold on Telegram. The researchers say GhostGPT most likely uses a wrapper to connect to a jailbroken version of ChatGPT or an open-source large language model, which ensures that all responses delivered to customers are uncensored.

This new malicious model resembles WormGPT, an AI chatbot that circulated in 2023 and helped cybercriminals carry out business email compromise (BEC) attacks. Other variants of this kind of malicious chatbot include EscapeGPT and WolfGPT. The Abnormal Security researchers say GhostGPT has drawn considerable attention, which shows growing interest among cybercriminals in AI tools that can support a wide range of malicious activities.

GhostGPT can be used even by cybercriminals with little technical expertise to carry out successful attacks, as it is easily accessible and can be bought on Telegram. Because it is offered as a Telegram bot, there is no need for buyers to jailbreak ChatGPT themselves. GhostGPT also claims not to log any activity, which lets criminals carry out illegal operations without leaving a trace. Cybercriminals can use it for tasks such as writing convincing phishing emails, coding, BEC scams, and malware creation. The researchers tested GhostGPT by asking it to create a DocuSign phishing email, and it quickly produced a convincing template, showing that the chatbot is efficient and responds fast.


Read next:

• Global IT Spending to Reach $5.61 Trillion in 2025, Driven by AI Investments

• AI Models Struggle with Historical Accuracy, GPT-4 Turbo Only Scores 46%
by Arooj Ahmed via Digital Information World
