Slack’s recent controversies surrounding its machine-learning practices have raised serious questions about user privacy and data protection.
The revelation that the company trains its models on user messages, files, and other content without explicit consent has sparked widespread concern among users and privacy advocates.
The issue came to light when Corey Quinn, an executive at the Duckbill Group, uncovered the policy buried within Slack’s Privacy Principles and shared his discovery on social media.
The policy describes a practice in which Slack’s systems analyze various forms of customer data, including messages and other content sent through the platform, as well as additional information outlined in the company’s privacy policy and customer agreements.
What’s particularly troubling about this practice is that it operates on an opt-out basis, meaning users’ private data is automatically included in the training process unless they explicitly request exclusion from the training data set.
To make matters worse, users cannot opt out by themselves; instead, they must rely on their organization’s Slack administrator to initiate the process, creating an additional layer of complexity and inconvenience.
In response to mounting concerns, Slack attempted to address the issue with a blog post clarifying how a customer’s data is used. The company claimed that user data is not used to train its generative AI products but is fed to machine learning models for tasks such as channel and emoji recommendations and search results.
However, this explanation failed to assuage the privacy concerns the discovery raised: users remained skeptical about how much of their data Slack’s models can access and whether the privacy safeguards in place are adequate.
The convoluted opt-out process further exacerbates the situation, burdening users with navigating administrative channels and actively requesting exclusion from data-training activities. This approach shifts the responsibility for safeguarding data onto users rather than placing the onus on the company to obtain explicit consent before using personal information for training purposes.
Moreover, inconsistencies in Slack’s privacy policies have added to the confusion and skepticism surrounding the company’s data practices. While one section claims that Slack cannot access the underlying content when developing AI and ML models, other policies appear to contradict this assertion, leading to uncertainty among users about the true extent of data access and usage.
In a statement to Firstpost, Slack said that the company “is not scanning message content to train AI models,” and it published a blog post to “clarify our practices and policies regarding customer data.”
In the same statement, Slack maintained that its policies and practices have not changed, and that it has “updated the language in our privacy principles to clarify that Slack does not use customer data to train LLMs.”
Slack’s marketing of its premium generative AI tools as not using user data for training purposes has further muddied the waters. While these tools may adhere to strict privacy standards, the implication that all user data is immune from AI training processes is misleading, given the company’s practices with other machine-learning models.
Slack also claimed that its other intelligent features (as opposed to the generative AI offered through Slack AI) “analyze user behavior data, but their models do not access the message content.” Examples of such behavioral signals include the timestamp of the last message sent in a channel, which can help Slack recommend channels to archive; the number of interactions between two users, which feeds into the suggestions shown when a user starts a new conversation; and the degree of word overlap between channel names, which helps Slack gauge a channel’s relevance to a user. A rough sketch of what metadata-only signals like these might look like follows below.
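To make the distinction between behavioral metadata and message content concrete, here is a minimal, hypothetical sketch of the three kinds of signals described above. It is not Slack’s code or API; the function names, thresholds, and data structures are invented for illustration, and note that none of them ever reads message text.

```python
# Illustrative sketch only: toy versions of the behavioral signals described
# above (stale-channel hints, contact suggestions, channel-name relevance).
# All names and thresholds are hypothetical and do not reflect Slack's
# actual implementation; message content is never inspected.
from datetime import datetime, timedelta


def is_archive_candidate(last_message_at: datetime, now: datetime,
                         idle_days: int = 90) -> bool:
    """Flag a channel as stale using only the timestamp of its last message."""
    return (now - last_message_at) > timedelta(days=idle_days)


def rank_contacts(interaction_counts: dict[str, int], top_n: int = 3) -> list[str]:
    """Suggest people to message next, ranked by prior interaction counts."""
    return sorted(interaction_counts, key=interaction_counts.get, reverse=True)[:top_n]


def name_overlap(channel_a: str, channel_b: str) -> int:
    """Count words shared between two channel names as a crude relevance signal."""
    return len(set(channel_a.lower().split("-")) & set(channel_b.lower().split("-")))


if __name__ == "__main__":
    now = datetime(2024, 5, 20)
    print(is_archive_candidate(datetime(2023, 11, 1), now))       # True: idle > 90 days
    print(rank_contacts({"alice": 42, "bob": 7, "carol": 19}))    # ['alice', 'carol', 'bob']
    print(name_overlap("team-design-reviews", "design-reviews"))  # 2 shared words
```

The point of the sketch is simply that signals like these can be computed from timestamps, counts, and channel names alone, which is the distinction Slack draws between “user behavior data” and message content.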
Ultimately, Slack’s handling of user data and its communication regarding privacy practices have raised significant concerns about transparency, consent, and data protection. As users increasingly prioritize privacy and data security, it is imperative for companies like Slack to address these issues promptly and transparently to maintain trust and accountability within their user communities.