Gravité Blog
Tip of the Week: Stop LinkedIn from Using Your Data to Train AI
Artificial intelligence is everywhere, and it is making data more valuable than ever, because AI platforms rely heavily on data to function effectively. Many platforms and services collect data from their users to fuel these algorithms. LinkedIn was recently found to do this, by default, without properly informing its users or updating its terms of service.
Fortunately, you can easily prevent LinkedIn from using your data for AI model generation.
How to Stop LinkedIn from Using Your Data for AI Training
Unfortunately, this process won’t reverse any AI training that’s already taken place.
- Open LinkedIn and go to Settings & Privacy.
- Click on Data privacy.
- Toggle off the option labeled Data for Generative AI Improvement.
Alternatively, LinkedIn offers a form where users can request that the processing of their data be limited.
Remember, though, that this setting will only prevent your data from being used for training if you don’t engage with AI tools on the platform. Any data you provide while using these tools will still be used.
Why is LinkedIn Collecting My Data?
According to LinkedIn, AI is used for various purposes, with a focus on supporting generative AI functions like its writing assistant. LinkedIn emphasizes that AI can help users create posts and messages.
You can find more details by clicking the Learn more link under the Data for Generative AI Improvement setting.
However, a closer look reveals that some of this data is shared with LinkedIn’s partners and affiliates, including Microsoft and its Azure OpenAI service. This means your data might have been extracted from LinkedIn’s platform without your direct knowledge or consent.
Unfortunately, any data that has already been used for training is likely beyond your control, which makes limiting future collection all the more important. We highly recommend following the steps above, and calling Gravité today at 1300 008 123 for help managing your data and securing your business’s digital resources.