Security researchers have unveiled GhostGPT, a malicious chatbot sold on cybercrime forums that creates malware and assists in scams such as business email compromise (BEC). It likely connects to a jailbroken version of ChatGPT, allowing it to bypass the model's safeguards in service of criminal interests. GhostGPT operates through Telegram, which spares buyers from installing suspicious software, and its sellers claim it does not log user activity. Although it is nominally pitched as having cybersecurity applications, its marketing makes clear that it is intended for criminal use.
GhostGPT is marketed for a range of malicious activities, including coding, malware creation, and exploit development. It can also be used to write convincing emails for BEC scams.
The chatbot also offers fast processing speeds, which is useful in time-pressured attack campaigns: ransomware operators, for example, must move quickly once inside a target system, before defenses are strengthened.