ASCII art elicits harmful responses from 5 major AI chatbots

(Figure caption: Some ASCII art of our favorite visual cliché for a hacker. Credit: Getty Images)

Researchers have discovered a new way to hack AI assistants using a surprisingly old-school method: ASCII art. It turns out that chat-based large language models such as GPT-4 get so distracted trying to …