Stolen ChatGPT accounts for sale on the dark web

The app for OpenAI’s ChatGPT on a smartphone screen in Oslo, on 12 July, 2023 (Getty Images)

Hundreds of thousands of stolen login credentials for ChatGPT are being listed for sale on dark web markets, security researchers have warned.

Cyber security firm Flare discovered over 200,000 OpenAI logins on the dark web – a section of the internet unreachable through conventional web browsers – offering criminals a way to access users’ accounts or simply to use the premium version of the AI tool for free.

The Independent has reached out to OpenAI for further information and comment. The AI firm previously defended its security practices after a smaller batch of credentials was discovered online.


“OpenAI maintains industry best practices for authenticating and authorising users to services including ChatGPT,” a spokesperson said last month. “We encourage our users to use strong passwords and install only verified and trusted software to personal computers.”

The listings come amid a surge in interest in generative artificial intelligence from malicious actors, with discussions about ChatGPT and other AI chatbots flooding criminal forums.

Research published in March found that the number of new posts about ChatGPT on the dark web grew seven-fold between January and February this year.

Security firm NordVPN described the exploitation of ChatGPT as “the dark web’s hottest topic”, with cyber criminals seeking to “weaponise” the technology.

Among the topics under discussion were how to create malware with ChatGPT and ways to hack the AI tool to make it carry out cyber attacks.

Earlier this month, researchers discovered WormGPT, a ChatGPT-style AI tool with “no ethical boundaries or limitations”.

The AI tool WormGPT features similar functionality to ChatGPT, without any of the restrictions (iStock/ The Independent)

It was described as ChatGPT’s “evil twin”, allowing hackers to perform attacks on a never-before-seen scale.

“ChatGPT has carried out certain measures to limit nefarious use of its application but it was inevitable that a competitor platform would soon take advantage of using technology for illicit gain,” Jake Moore, an advisor at the cyber security firm ESET, told The Independent.

“AI chat tools create a powerful tool but we are wandering into the next phase which casts a dark cloud over the technology as a whole.”