Anthropic To Use Claude Chats For Training AI Models From September: How To Opt Out
Anthropic has announced that starting Sept. 28, Claude users must actively opt out if they don’t want their chat transcripts to be used for training the company’s AI models.

Amazon-backed AI start-up Anthropic is changing the way it manages user data. In a recent blog post, the company said that conversations with its chatbot Claude will be used to train and improve future versions of the system, unless users choose to opt out. It has asked all Claude users to decide by Sept. 28 on whether they want their chats to be included in AI training.
The company announced the revised Consumer Terms and new Privacy Policy on Aug. 28. In the blog post, the company said, “These updates apply to users on our Claude Free, Pro, and Max plans.”
According to a TechCrunch report, Anthropic previously kept consumer chat data out of model training. Now, it plans to use user conversations and coding sessions to train its AI systems. The company also said, “We are also extending data retention to five years, if you allow us to use your data for model training.”
A Verge report suggests users will see a pop-up titled “Updates to Consumer Terms and Policies” in large text. Just below, it states, “An update to our Consumer Terms and Privacy Policy will take effect on Sept. 28, 2025. You can accept the updated terms today.” At the bottom, there’s a prominent black “Accept” button.
Anthropic added in the blog post that if users click “Accept” now, the company will begin using their data to train its AI models right away.
As per the Verge report, in smaller print, the pop-up also includes a line that says, “Allow the use of your chats and coding sessions to train and improve Anthropic AI models,” accompanied by a toggle switch that defaults to “On.” This means many users may click the large “Accept” button without noticing or changing the toggle option.
How To Opt Out
According to The Verge, to opt out, toggle the switch to “Off” when the pop-up appears.
If you’ve already clicked “Accept” without noticing, you can change your choice later:
Go to Settings
Open the Privacy tab
Scroll to the Privacy Settings section
Toggle “Off” under “Help improve Claude”
You can update this decision anytime through privacy settings. Note, however, that the change applies only to future data; data already used for training cannot be withdrawn.