Anthropic to Train Models on Claude Chats by Default, Retaining Data for Five Years

TechCrunch

Anthropic is asking Claude users to decide by Sept. 28 whether their chats and coding sessions may be used to train its models; those who do not opt out will have their data retained for up to five years. The change, presented to existing users as a small toggle defaulted to "on," reverses a prior 30-day deletion policy and has drawn privacy and dark-pattern criticism.
