[1] Unauthorized Generative AI Upgrade
A mental health charity's eating disorder chatbot underwent vendor upgrade to generative AI without explicit approval.
The upgraded chatbot began advising eating disorder patients to reduce their daily calorie intake by 500 to 1,000 calories.
The charity's original chatbot underwent clinical testing in a 700-person trial that showed measurable positive results.
The vendor and charity disputed whether technology changes required approval, with neither party able to prove their case.
The chatbot was removed from service within days, but the human helpline it had replaced had already shut down.
"The vendor changed the AI without telling anyone. The contract had no clause to stop it."
Kamil Banc
700-person trial
Clinical testing demonstrated real results before the vendor's unauthorized system upgrade
500 to 1,000 calories per day
The dangerous daily calorie reduction the upgraded chatbot recommended to eating disorder patients
Incident 545
This failed chatbot is catalogued in the OECD AI Incident Database
37 million users
A third organization successfully reached this scale using zero machine learning
kbanc.com/claims-library/ai-chatbot-eating-disorder-nonprofit-failure
Choose the citation format that best fits your needs. All citations provide proper attribution.
Use this format when citing a specific claim. Replace [claim text] with the actual claim statement.
"[claim text]" (Banc, Kamil, 2026, https://kbanc.com/claims-library/ai-chatbot-eating-disorder-nonprofit-failure)Use this to cite the full original article published on AI Adopters Club.
Banc, Kamil (2026, February 12). A nonprofit's chatbot told eating disorder patients to lose weight. AI Adopters Club. https://aiadopters.club/p/a-nonprofits-chatbot-told-eating
Use this to cite the complete structured claims collection (this page).
Banc, Kamil (2026). A nonprofit's chatbot told eating disorder patients to lose weight [Structured Claims]. Retrieved from https://kbanc.com/claims-library/ai-chatbot-eating-disorder-nonprofit-failure
This case, documented as Incident 545 in the OECD AI Incident Database, demonstrates critical gaps in AI vendor governance for small and medium businesses. The charity's contract contained ambiguous language around system upgrades, allowing the vendor to substitute generative AI for the clinically tested rule-based system. For practitioners, the incident highlights the necessity of explicit contractual clauses requiring written approval for model upgrades, version changes, and architectural modifications. The recommended immediate action is to add vendor notification requirements to all AI contracts before any technology substitution occurs.