How to Prevent Your Data from Being Used to Train AI

On its help pages, OpenAI says people who use ChatGPT on the web without an account should navigate to Settings and turn off Improve the model for everyone. If you have an account and are logged in through a web browser, select ChatGPT, Settings, Data Controls, and then turn off Chat history and training. If you’re using ChatGPT’s mobile apps, go to Settings, choose Data Controls, and turn off Chat history and training. These settings will not sync across different browsers or devices, according to OpenAI’s support pages, so you will need to make the change everywhere you use ChatGPT.

OpenAI is about much more than just ChatGPT. For its Dall-E 3 image generator, the startup has a form that allows you to request that images be removed from “future training data sets.” It asks for your name, email address, whether you own the image rights or are getting in touch on behalf of a company, details of the image, and any uploads of the image itself. OpenAI also says that if you have a “large amount” of images hosted online that you want removed from training data, it may be “more efficient” to add GPTBot to the robots.txt file of the website where the images are hosted.

Traditionally, a website’s robots.txt file – a simple text file that usually sits at websitename.com/robots.txt – is used to tell search engines and others whether they can include your pages in their results. It can now also be used to tell AI crawlers not to scrape what you have published – and AI companies have said they will honor this arrangement.
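For example, blocking OpenAI’s crawler from an entire site takes just two lines in robots.txt. A minimal sketch, using the GPTBot user agent that OpenAI documents for this purpose (other companies’ crawlers go by their own names and would need their own entries):

User-agent: GPTBot
Disallow: /

The Disallow: / line tells the crawler to skip every page on the site; a narrower path, such as Disallow: /images/, would block only that folder.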

Perplexity

Perplexity is a startup that uses AI to help you search the web and find answers to questions. As with all of the other software on this list, you are automatically opted in to having your interactions and data used to further train Perplexity’s AI. Turn this off by clicking your account name, scrolling down to the Account section, and turning off the AI Data Retention toggle.

Quora

Quora says it “currently” does not use people’s answers to questions, posts, or comments to train AI. It has not sold any user data for AI training either, a spokesperson says. However, the company says it will offer opt-out settings if that changes in the future. To do so, visit its Settings page, click Privacy, and turn off the option that allows large language models to be trained on your content. Despite this opt-out, some Quora posts may still be used for training LLMs: if you reply to a machine-generated answer, the company’s help pages say, those replies may be used for AI training. It also points out that third parties could simply scrape its content anyway.

Rev

Rev, a voice transcription service that uses both human freelancers and AI to transcribe audio, says it uses data on an “ongoing” and “anonymous” basis to train its AI systems. Even if you delete your account, it will keep training its AI on that information.

Kendell Kelton, head of brand and corporate communications at Rev, says the company has the “largest and most diverse data set of voices,” made up of more than 6.5 million hours of voice recordings. Kelton says Rev does not sell user data to any third parties. The company’s terms of service say the data will be used for training and that customers can opt out. People can object to the use of their data by sending an email to [email protected], its help pages say.

Slack

All those random Slack messages at work could also be used by the company to train its models. “Slack has been using machine learning in its product for many years. This includes platform-level machine learning models for things like channel and emoji recommendations,” says Jackie Rocca, a vice president of product at Slack who focuses on AI.

Even though the company doesn’t use customer data to train a large language model for its Slack AI product, Slack may use your interactions to improve the software’s machine learning capabilities. “To develop AI/ML models, our systems analyze customer data (e.g. messages, content, and files) submitted to Slack,” its privacy page says. Similar to Adobe, there’s not much you can do on an individual level to opt out if you’re using a corporate account.
