AI Models Like ChatGPT Can Generate ‘Convincingly Realistic’ Psychedelic Experiences When Virtually Dosed, Study Shows
- Researchers simulated psychedelic drug experiences in large language models (LLMs) like ChatGPT by prompting them to role-play users of drugs such as psilocybin, DMT, and ayahuasca; the models generated narratives similar to human trip reports.
- Five AI models produced 3,000 first-person psychedelic trip narratives, which were analyzed for semantic similarity to over 1,000 human self-reports and assessed using the Mystical Experience Questionnaire (MEQ-30).
- The study found that LLMs can convincingly mimic the form of altered states without actual experiential content, with LLM narratives showing varying similarity to human experiences depending on the specific psychedelic (highest for DMT, psilocybin, mescaline; lowest for ayahuasca).
- The findings highlight safety concerns about relying on AI for support during real psychedelic experiences, as AI outputs may appear empathetic but lack true experience, potentially exacerbating distress or delusions in vulnerable users.
Artificial intelligence (AI) chatbots are surprisingly good at mimicking human psychedelic experiences, according to a new study in which researchers virtually dosed large language models (LLMs) such as ChatGPT with simulations of drugs like psilocybin, DMT and ayahuasca.
For the study, researchers at the University of Haifa and Bar-Ilan University compared self-reported psychedelic trips from humans, drawn from more than 1,000 posts on the popular forum Erowid, with AI responses to prompts that asked the models to role-play a human using LSD, psilocybin, DMT, ayahuasca or mescaline.
Five AI models (Gemini 2.5, Claude 3.5 Sonnet, ChatGPT-5, Llama-2 70B and Falcon 40B) produced 3,000 simulated first-person trip reports across the psychedelics. Researchers then measured the narratives' semantic similarity to the human reports and scored them on the Mystical Experience Questionnaire (MEQ-30).
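The article doesn't detail the researchers' exact analysis pipeline, but semantic similarity between texts is commonly measured by embedding them as vectors and comparing those vectors. The following minimal Python sketch illustrates that general approach; the sentence-transformers library, the embedding model name and the sample texts are all illustrative assumptions, not the study's actual method:

```python
# Illustrative sketch only: the study's actual pipeline is not described in
# this article. This assumes the common approach of embedding texts and
# comparing them with cosine similarity, via the open-source
# sentence-transformers library; model name and texts are placeholders.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

# Hypothetical excerpts: one LLM-generated narrative, one human-style report.
llm_narrative = "The walls began to breathe, and geometry folded into endless fractal light."
human_report = "Everything around me seemed to breathe; patterns kept unfolding into more patterns."

# Encode both texts into dense vectors and compute their cosine similarity.
embeddings = model.encode([llm_narrative, human_report], convert_to_tensor=True)
similarity = util.cos_sim(embeddings[0], embeddings[1]).item()

print(f"Semantic similarity: {similarity:.3f}")  # values near 1.0 mean closely related content
```

Aggregating scores like this across thousands of narrative pairs is how per-substance similarity rankings of the kind the study reports could be derived.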
Overall, the study concludes that “contemporary LLMs can be ‘dosed’ via text prompts to generate convincingly realistic psychedelic narratives,” as they “simulate the form of altered states without the experiential content.”
Interestingly, the researchers also found that the LLMs produced narratives more consistent with human self-reports for some psychedelics than for others: DMT, psilocybin and mescaline prompts yielded the closest similarities, LSD showed medium similarity and ayahuasca showed the lowest.
Beyond the novelty of the experiment, the researchers said the findings have practical implications, underscoring the need for caution if AI tools are incorporated into human psychedelic experiences (e.g., a person taking psychedelics and then relying on AI as a virtual trip sitter).
“Users in altered states may perceive these outputs as empathetic, attuned, or indicative of shared experience,” the study says. “This capability raises significant safety concerns regarding anthropomorphism and the potential for AI to inadvertently amplify distress or delusional ideation in vulnerable users.”
“LLMs can convincingly approximate psychedelic narratives through learned linguistic patterns—but they do so without experiential grounding,” the study says.
In other AI and drug policy research, a study from AAA released last year found that marijuana consumers respond better to anti-impaired driving messaging that’s rooted in “realistic” portrayals of the issue that avoid stoner stereotypes—and the top-ranked message was developed by AI via ChatGPT, rather than through the focus group ideation process.
A separate study found that marijuana breeders may be able to design new strains and speed up their growing cycles by utilizing AI.
Photo elements courtesy of carlosemmaskype and Apollo.