It’s the Digital Security Training team at Freedom of the Press Foundation (FPF), with security news that keeps you, your sources, and your devices safe. If someone has shared this newsletter with you, please subscribe here.
In the news
Over the past week, Slack published a blog post defending its privacy practices following widespread criticism over its use of customer data to train its global AI models. At the moment, organizations are required to opt out to prevent their messages, content, and files from being mined to develop Slack’s AI. The company has updated its privacy principles to clarify that it does not develop “generative models” (e.g., chatbots) using customer data, suggesting it instead uses such data to train “non-generative” models “for features such as emoji and channel recommendations.” Further adding to the confusion, Slack’s messaging is inconsistent. Its AI landing page reads, “Your data is your data. We don't use it to train Slack AI.” Yet the company still explicitly requires organizations to opt out to prevent their data from being used to train its global models. Read more here.
What you can do
- If your team uses Slack and you don’t want your data used to develop Slack’s AI models, you will need your workspace “owners” to contact Slack directly at feedback@slack.com. Include your organization’s Slack URL (e.g., myexample.slack.com) and use the subject line “Slack global model opt-out request.”
- While we know many media organizations use Slack, this is a good reminder that Slack messages are not end-to-end encrypted: they are readable by the company and should therefore not be considered private. Read my colleague Kunal Mehta’s breakdown of the privacy properties of Slack.
Our team is always ready to assist journalists with digital security concerns. Reach out here, and stay safe and secure out there.
Best,
Martin