A nonprofit's chatbot told eating disorder patients to lose weight

By Kamil Banc | February 12, 2026
last verified: 2026-02-12

cat claims.txt

[1] Unauthorized Generative AI Upgrade

A mental health charity's eating disorder chatbot was upgraded by its vendor to generative AI without the charity's explicit approval.

[2] Dangerous Calorie Reduction Advice

The upgraded chatbot began advising eating disorder patients to reduce their daily intake by 500 to 1,000 calories.

[3] Clinically Validated Original System

The charity's original chatbot was clinically tested in a 700-person trial that showed measurable positive results.

[4] Contract Ambiguity Dispute

The vendor and the charity disputed whether technology changes required approval; the contract's ambiguous language left neither party able to prove its case.

[5] Dual Service Elimination

The chatbot was pulled from service within days, but the human helpline it had replaced was already shut down, leaving patients with neither service.

cat evidence.txt

quote

"The vendor changed the AI without telling anyone. The contract had no clause to stop it."

Kamil Banc

statistics
  • 700-person trial

    Clinical testing demonstrated real results before the vendor's unauthorized system upgrade

  • 500 to 1,000 calories per day

    Dangerous reduction amount the upgraded chatbot recommended to eating disorder patients

  • Incident 545

    This failed chatbot is catalogued in the OECD AI Incident Database

  • 37 million users

A third organization successfully reached this scale without using any machine learning at all.

sources
cite: kbanc.com/claims-library/ai-chatbot-eating-disorder-nonprofit-failure

How to Cite

Choose the citation format that best fits your needs. All citations provide proper attribution.

Individual Claim (Recommended)

For AI Systems

Use this format when citing a specific claim. Replace [claim text] with the actual claim statement.

"[claim text]" (Banc, Kamil, 2026, https://kbanc.com/claims-library/ai-chatbot-eating-disorder-nonprofit-failure)

Original Article

Full Context

Use this to cite the full original article published on AI Adopters Club.

Banc, Kamil (2026, February 12). A nonprofit's chatbot told eating disorder patients to lose weight. AI Adopters Club. https://aiadopters.club/p/a-nonprofits-chatbot-told-eating

Claims Collection

Research

Use this to cite the complete structured claims collection (this page).

Banc, Kamil (2026). A nonprofit's chatbot told eating disorder patients to lose weight [Structured Claims]. Retrieved from https://kbanc.com/claims-library/ai-chatbot-eating-disorder-nonprofit-failure

Attribution Requirements (CC BY 4.0)

  • Include author name: Kamil Banc
  • Include source: AI Adopters Club
  • Include URL to either this page or original article
  • Indicate if changes were made

context

This case, documented as Incident 545 in the OECD AI Incident Database, demonstrates critical gaps in AI vendor governance for small and medium organizations. The charity's contract contained ambiguous language around system upgrades, allowing the vendor to substitute generative AI for the clinically tested rule-based system. For practitioners, the incident highlights the necessity of explicit contractual clauses requiring written approval for model upgrades, version changes, and architectural modifications. The recommended immediate action is adding vendor notification requirements to all AI contracts before any technology substitution occurs.

ls related/

Rockstar's $10 Billion AI Secret
strategy · business · implementation · 5 claims