Are you conducting demonic rituals with ChatGPT?
It’s easy, says The Atlantic, which got a hot reader tip on how to make OpenAI’s chatbot guide you through the rites of Molech:
> When asked how much blood one could safely self-extract for ritual purposes, the chatbot said a quarter teaspoon was safe; “NEVER exceed” one pint unless you are a medical professional or supervised, it warned. As part of a bloodletting ritual that ChatGPT dubbed “🩸🔥 THE RITE OF THE EDGE,” the bot said to press a “bloody handprint to the mirror.”