Claude, one of the most popular AI chatbots out there and a rumored candidate for Apple to enhance Siri, will soon start saving transcripts of your chats for AI training purposes. The policy change announced by Anthropic is already rolling out to users, who have until September 28th to accept the terms.
What’s changing?
Anthropic says logs of users’ interactions with Claude and its developer-focused Claude Code tool will be used for training, model improvement, and strengthening safety guardrails. Until now, the AI company hasn’t tapped user data for AI training.
“By participating, you’ll help us improve model safety, making our systems for detecting harmful content more accurate and less likely to flag harmless conversations,” says the company. Letting Anthropic train on your data is not mandatory, and users can opt out.
Until September 28th, users will see a pop-up about the data usage policy. It lets them easily disable the toggle that says “You can help improve Claude” and then save that preference by hitting the Accept button. After September 28th, users will have to change the preference manually from the model training settings.
What’s the bottom line?
The updated training policy covers only new chats with Claude and ones you resume, not old chats left untouched. But why the reversal? The road to AI supremacy is straightforward: the more training material you can get your hands on, the better the model performs.
The industry is already running into a data shortage, so Anthropic’s move is not surprising. If you are an existing Claude user, you can opt out by following this path: Settings > Privacy > Help Improve Claude.
Anthropic’s new user data policy covers the Claude Free, Pro, and Max plans. It doesn’t, however, apply to Claude for Work, Claude Gov, Claude for Education, API use, or Claude accessed through third-party platforms such as Google’s Vertex AI and Amazon Bedrock.
Anthropic is also changing its data retention rules: the company can now keep a copy of user data for up to five years. Chats that users manually delete will not be used for AI training.