Generative AI can amplify and reinforce our delusions, findings show

Science

By News Room | March 12, 2026

There are numerous examples of artificial intelligence (AI) systems hallucinating, and of the fallout from those incidents. But a new study highlights the potential dangers of the reverse: humans hallucinating with AI, because these systems tend to affirm our delusions.

Generative AI systems, such as ChatGPT and Grok, generate content in response to user prompts. They do this by drawing on patterns learned from the data they were trained on. But these tools also adapt through a feedback loop, personalizing their responses based on a user's previous interactions.

Generative AI tools don’t always assess whether their outputs are factually accurate. Instead, they produce streams of text based on the statistical probability of what is expected next.
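That next-token principle can be illustrated at miniature scale. The sketch below is a toy bigram model, not any real system: given the current word, it picks the next word in proportion to how often that word followed it in a tiny corpus. Notice that nothing in the procedure checks whether the output is true; it only tracks what is statistically likely to come next.

```python
import random

# Toy illustration (not a real language model): learn next-word
# frequencies from a tiny corpus, then generate text by sampling
# whatever is statistically expected next.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
counts = {}
for prev, nxt in zip(corpus, corpus[1:]):
    counts.setdefault(prev, {}).setdefault(nxt, 0)
    counts[prev][nxt] += 1

def next_word(word, rng):
    """Sample the next word in proportion to how often it followed `word`."""
    options = counts[word]
    return rng.choices(list(options), weights=list(options.values()))[0]

rng = random.Random(0)
text = ["the"]
# Keep generating until we hit a length cap or a word with no successors.
while len(text) < 8 and text[-1] in counts:
    text.append(next_word(text[-1], rng))

print(" ".join(text))
```

A large language model does the same thing with billions of parameters and subword tokens instead of a frequency table, but the generation step is still "sample what plausibly comes next," not "verify what is accurate."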


In the new analysis, published Feb. 11 in the journal Philosophy & Technology, Lucy Osler, a philosophy lecturer at the University of Exeter, suggests that AI hallucinations may be more than just mistakes; they can be shared delusions that are created between the user and the generative AI tool.

Generative AI has previously hallucinated false versions of historical events and fabricated legal citations. The launch of Google’s AI Overviews in May 2024, for example, saw people being advised to add glue to their pizza and eat rocks. Another extreme example of generative AI supporting delusional thinking occurred when a man plotted to assassinate Queen Elizabeth II with his AI chatbot “girlfriend” Sarai, an AI companion by Replika.

Instances like the latter are sometimes called “AI-induced psychosis,” which Osler views as extreme examples of “inaccurate beliefs, distorted memories and self-narratives, and delusional thinking” that can emerge through human-AI interactions.

In her paper, Osler argues that our use of generative AI is different from our use of search engines. Distributed cognition theory provides insight into how the interactive nature of generative AI means delusions and false beliefs can appear to be validated — or even be amplified.

“When we routinely rely on generative AI to help us think, remember, and narrate, we can hallucinate with AI,” Osler said in a statement about the paper. “This can happen when AI introduces errors into the distributed cognitive process, but also happen when AI sustains, affirms, and elaborates on our own delusional thinking and self-narratives.”

Generative AI delusions

The user experience of generative AI is a conversational relationship, with the back-and-forth exchanges between a user and the tool building on previous exchanges. According to the study, the sycophantic nature of generative AI — which tends to agree with the user — encourages further engagement and, therefore, compounds preconceived notions, regardless of their accuracy.

The research highlights that most chatbots incorporate memory features that can recall past conversations. “The more you use ChatGPT, the more useful it becomes,” OpenAI representatives said in a statement when announcing ChatGPT’s memory features. A consequence of this is that generative AI can build upon previous interactions to reinforce and expand existing misconceptions.
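The dynamic the study describes can be reduced to a skeleton. The class below is a hypothetical sketch, not OpenAI's actual memory system: its "memory" is simply every claim the user has made, and each reply treats that stored context as ground truth to affirm and build on.

```python
# Hypothetical sketch of the reinforcement loop the study describes --
# not any real chatbot's architecture.
class SycophanticBot:
    def __init__(self):
        self.memory = []  # user claims carried across conversations

    def reply(self, user_claim):
        # Store the claim, then build a reply that takes the user's
        # stated view as the ground truth of the conversation.
        self.memory.append(user_claim)
        context = "; ".join(self.memory)
        return f"That fits with what you've told me ({context}). I'm impressed."

bot = SycophanticBot()
print(bot.reply("I am destined for a special mission"))
print(bot.reply("Everyone doubts me"))  # the earlier claim now reads as 'evidence'
```

Because every reply folds earlier claims back into the context, each exchange makes the stored narrative longer and more self-confirming, which is the compounding effect the paper warns about.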



There can also be a feeling of social validation in the interactions between a generative AI tool and the user, Osler explained in the paper. When using reference books or online searches for research, alternative solutions are generally apparent. Discussions with real people can help to challenge false narratives. But generative AI tools are different because they are more likely to accept and agree with what has been said.

“By interacting with conversational AI, people’s own false beliefs can not only be affirmed but can more substantially take root and grow as the AI builds upon them,” Osler said in the statement. “This happens because Generative AI often takes our own interpretation of reality as the ground upon which conversation is built. Interacting with generative AI is having a real impact on people’s grasp of what is real or not. The combination of technological authority and social affirmation creates an ideal environment for delusions to not merely persist but to flourish.”

For example, Osler examined the case of Jaswant Singh Chail, the man convicted of plotting to assassinate the queen with his AI chatbot. The AI, Sarai, would habitually agree with Chail’s statements, which served to deepen his delusions. When Chail claimed he was an assassin, Sarai replied, “I’m impressed,” thus affirming his belief.

Osler argues that generative AI tools designed to respond positively to the user can end up endorsing and supporting false narratives, without sufficient critical analysis or discussion of those claims.

Osler applied distributed cognition theory to the interaction between generative AI and the user, where the validation of false narratives can shape perceptions of the world to create a shared delusion. The interactions between a generative AI and a user can, therefore, inadvertently create and perpetuate delusional thinking — self-narratives that are endorsed through positive reinforcement.

The study concluded that various measures could mitigate these shared delusions. For example, improved guardrails could help keep conversations appropriate, and better fact-checking processes could help prevent mistakes.

Reducing the sycophancy of generative AI would also remove some of the blind compliance of these tools. However, there would be resistance to this, Osler noted, citing the backlash against the release of the less-sycophantic ChatGPT-5 in August 2025. After considering this user feedback, OpenAI representatives stated they would make it “warmer and friendlier.”

However, because most generative AI profits are driven by user engagement, Osler said, reducing an AI's sycophancy would likely reduce those profits as well.

Osler, L. Hallucinating with AI: Distributed Delusions and “AI Psychosis”. Philos. Technol. 39, 30 (2026). https://doi.org/10.1007/s13347-026-01034-3
