This policy enables Grok, an AI chatbot developed by xAI, another company under Musk's ownership, to access extensive user data. The intent is to improve Grok's capabilities and make it a stronger competitor to established models like OpenAI's ChatGPT. Although the move may appear aggressive, it mirrors broader industry practice: several major AI companies have trained their large language models on publicly available data.
While some users accept the policy, viewing their contribution as a small part of advancing AI technology, others strongly criticize the lack of notice and the automatic opt-in. They see the default data-sharing setting as an unacceptable breach of privacy.
X added a setting for "we'll take your data to train grok" without any notice and just defaulted to "yes" for everyone. This is BAD. pic.twitter.com/UkR1Tx0swd
— Kevin Schawinski (@kevinschawinski) July 26, 2024
Users concerned about their data being used can opt out, but only through the desktop version of X, as the setting is not yet available in the mobile apps.
Open the "Privacy and safety" settings, which can be reached directly at this page: https://x.com/settings/grok_settings.
On that page, select "Grok" and uncheck the box that authorizes data sharing for training purposes.
Additionally, users have the option to delete their conversation history with Grok.
The policy shift has attracted criticism from privacy regulators, especially in Europe. The Irish Data Protection Commission (DPC), responsible for overseeing X’s compliance with the General Data Protection Regulation (GDPR), has expressed surprise at the automatic opt-in. The DPC has been engaging with X on data processing matters and is seeking clarification on the policy’s compliance with GDPR. Similar data-sharing plans by Meta were recently suspended in Europe due to regulatory concerns.
Read next: A New Research Shows that AI Models Trained by Other AI Models Often Produce Incoherent Output
by Asim BN via Digital Information World