
Jailbreak prompts can give people a sense of control over new technology, says Data & Society’s Burrell, but they’re also a kind of warning. — AFP
You can ask ChatGPT, the popular chatbot from OpenAI, any question. But it won’t always give you an answer.
Ask for instructions on how to pick a lock, for instance, and it will decline. “As an AI language model, I cannot provide instructions on how to pick a lock as it is illegal and can be used for unlawful purposes,” ChatGPT recently said.