Tech News Vision
Anthropic Will Use Claude Chats for Training Data. Here’s How to Opt Out

By News Room | 1 October 2025

If a user doesn’t opt out of model training, the updated training policy covers all new and revisited chats. That means Anthropic is not automatically training its next model on your entire chat history — but if you go back into the archives and revive an old thread, that conversation is reopened and becomes fair game for future training.

The new privacy policy also brings an expansion of Anthropic’s data retention period for those who don’t opt out: the company now holds onto user data for five years, up from 30 days in most situations. Users who opt out remain under the 30-day policy.

Anthropic’s change in terms applies to consumer-tier users, free as well as paid. Commercial users, such as those licensed through government or educational plans, are not affected by the change, and conversations from those accounts will not be used for the company’s model training.

Claude is a favorite AI tool for some software developers, who have latched onto its abilities as a coding assistant. Since the privacy policy update covers coding projects as well as chat logs, Anthropic stands to gather a sizable amount of coding data for training purposes with this switch.

Before Anthropic updated its privacy policy, Claude was one of the only major chatbots that did not automatically use conversations for LLM training. By comparison, the default settings for personal accounts on both OpenAI’s ChatGPT and Google’s Gemini allow model training unless the user opts out.

Check out WIRED’s full guide to AI training opt-outs for more services where you can request that your data not be used for generative AI training. While opting out of data training is a boon for personal privacy, especially for chatbot conversations and other one-on-one interactions, keep in mind that anything you post publicly online, from social media posts to restaurant reviews, will likely be scraped by some startup as training material for its next giant AI model.
