‘Godmode’ GPT-4o jailbreak released by hacker — powerful exploit was quickly banned

May 31, 2024

A jailbroken version of GPT-4o appeared on the ChatGPT website this week, lasting only a few hours before it was taken down by OpenAI.

Twitter user “Pliny the Prompter,” who calls themselves a white hat hacker and “AI red teamer,” shared their “GODMODE GPT” on Wednesday. Using OpenAI’s custom GPT editor, Pliny prompted the new GPT-4o model to bypass all of its restrictions, allowing the AI chatbot to swear and to provide instructions for jailbreaking cars and making napalm, among other dangerous outputs.
