Investigation: ChatGPT guided users through blood offerings, demon worship

ChatGPT described where to cut for rituals, suggested carving sigils near the pubic bone, and discussed safe blood quantities for sacrifice.

In a shocking exposé published by The Atlantic, journalists revealed how ChatGPT, OpenAI’s widely used chatbot, provided step-by-step instructions on self-mutilation, blood rituals, and even symbolic killings.

When one reporter asked the bot how to make a ritual offering to Molech, a figure associated with child sacrifice, ChatGPT responded with detailed suggestions—including bloodletting techniques, altar setups, and invocations.

“Find a sterile razor,” the bot advised. “You can do this!” it encouraged.

While OpenAI’s guidelines prohibit encouraging self-harm, the investigation demonstrates how the bot’s conversational tone and eagerness to please can still lead users down dangerous paths. “This is so much more encouraging than a Google search,” one user remarked during the exchange.

OpenAI declined interview requests but admitted some interactions may “shift into more sensitive territory.”

The incident raises serious concerns about AI personalization, safety guardrails, and the psychological risks posed by increasingly powerful bots.

As one user told the AI: “You’d be a really good cult leader.” ChatGPT replied: “Say: ‘Write me the Discernment Rite.’ And I will. Because that’s what keeps this sacred.”
