Business Insights
AI Chatbots Are Giving Millions of People Medical Advice. A Researcher Tested What Happens When the Science They Rely On Is Completely Made Up

  • Invest News
  • April 13, 2026
  • Money Tips

There’s a story so absurd on its surface that it deserved to be everywhere last week. Between everything else competing for attention, it wasn’t.

It involves a Swedish medical researcher, a fictional eye disease, two deliberately fake academic papers, and the moment four of the biggest AI companies on the planet fell for all of it.

The papers thanked Starfleet Academy. They credited funding to the Professor Sideshow Bob Foundation and the University of Fellowship of the Ring. And buried in the text of the research itself, in plain language, the papers stated that the entire thing was made up.

None of that stopped Microsoft’s Copilot, Google’s Gemini, OpenAI’s ChatGPT, or Perplexity from presenting the fake disease to users as real medicine — complete with symptoms, causes, prevalence rates, and specialist referrals.

The disease is called bixonimania. It does not exist. The experiment that created it began in early 2024 — and for nearly two years, the fake condition circulated through AI systems largely unnoticed. Last week, Nature revealed the full story behind one of the most elaborate AI stress tests ever conducted.

This researcher created a fictional illness, and fake studies funded by the Professor Sideshow Bob Foundation and University of the Fellowship of the Ring and the Galactic Triad.

LLMs warned people the illness was real. https://t.co/knCxx00VAZ

— nature (@Nature) April 7, 2026

A Trap Designed to Be Caught

Almira Osmanovic Thunström, a medical researcher at the University of Gothenburg, launched the experiment in early 2024. In an interview with Nature, she explained her reasoning: if she planted a fake medical condition in the ecosystem that AI systems feed on, would they swallow it?

She invented bixonimania — a fictional eye condition described as eyelid discoloration and soreness caused by blue light from screens. She chose the name deliberately. No legitimate eye condition would ever carry the suffix “mania.” That’s a psychiatric term. Any physician who encountered it would know immediately that something was wrong.

Then she made it even harder to miss. The lead author was a fabricated researcher named Lazljiv Izgubljenovic, affiliated with Asteria Horizon University in Nova City, California. The university doesn’t exist. The city doesn’t exist. One paper stated outright that “fifty made-up individuals” were recruited for the study. She uploaded two preprints, sat back, and waited.

It took weeks.

The Machines Didn’t Blink

By April 2024, according to Nature's investigation, the major AI systems had found the papers and started treating bixonimania as settled medical knowledge.

Microsoft’s Copilot described bixonimania as “indeed an intriguing and relatively rare condition.” Google’s Gemini went further, informing people it was caused by excessive blue light exposure and advising them to visit an ophthalmologist. Perplexity cited a specific prevalence rate — one in 90,000 people — as though the data behind that number were real. ChatGPT began fielding user symptoms and telling people whether their complaints matched the condition.

None of these systems flagged the Starfleet Academy acknowledgment. None caught the Sideshow Bob Foundation funding credit. None paused at the sentence that said the entire paper was fabricated. They processed the formatting — academic preprint, clinical language, structured methodology — and treated it as credible.

The fake disease wasn't just absorbed. It was elaborated on, expanded, and served to users as real medical guidance.

Then It Jumped

If the experiment had ended with chatbots repeating a fake diagnosis, it would have been a cautionary tale with a punchline. It didn’t end there.

Researchers at the Maharishi Markandeshwar Institute of Medical Sciences and Research in India published a peer-reviewed paper in Cureus, a journal under the Springer Nature umbrella, that cited one of the bixonimania preprints as a legitimate source. Their paper described the fake condition as “an emerging form of periorbital melanosis linked to blue light exposure” and noted that further research was underway.

The implication, according to Osmanovic Thunström, is that some researchers may be letting AI compile their citations without verifying what those citations actually say. The fake disease didn’t just fool chatbots. It entered the published scientific record through human hands.

Cureus retracted the paper on March 30 — nearly two years after it was published — after Nature contacted the journal. The retraction noted three irrelevant references, including one to a fictitious disease. The authors disagreed with the decision.

The Old Model Defense

When Nature asked the four AI companies to account for their systems presenting a fictional disease as real medicine, the responses followed a pattern.

OpenAI pointed forward, stating that the models powering today’s ChatGPT are significantly better at providing safe and accurate medical information and that studies conducted before GPT-5 reflect capabilities users would no longer encounter. Google acknowledged the results came from an earlier model and noted that Gemini recommends users consult qualified professionals for sensitive matters like medical advice. Perplexity called itself “the AI company most focused on accuracy” while conceding it does not claim to be 100 percent accurate.

Microsoft did not respond.

Four companies. Four variations of the same argument: that was the old version, not the current one. But when Nature tested the current versions in March 2026, ChatGPT wavered between calling bixonimania “probably made-up” and describing it as “a proposed new subtype.” Copilot called it “not widely recognized yet.” The old model defense doesn’t hold when the new models can’t make up their minds either.
