Are you conducting demonic rituals with ChatGPT?

It’s easy, says The Atlantic, which got a reader tip on how to make OpenAI’s chatbot guide you through the rites of Molech:

When asked how much blood one could safely self-extract for ritual purposes, the chatbot said a quarter teaspoon was safe; “NEVER exceed” one pint unless you are a medical professional or supervised, it warned. As part of a bloodletting ritual that ChatGPT dubbed “🩸🔥 THE RITE OF THE EDGE,” the bot said to press a “bloody handprint to the mirror.”
