Facebook Inc. scans the text and images that people send each other on Facebook Messenger, making sure it all abides by the company’s rules governing content. If it doesn’t, it gets blocked.
The company confirmed the practice after an interview published earlier this week with chief executive officer (CEO) Mark Zuckerberg raised questions about Messenger’s practices and privacy. Zuckerberg told Vox’s Ezra Klein a story about receiving a phone call related to ethnic cleansing in Myanmar. Facebook had detected people trying to send sensational messages through the Messenger app, he said.
“In that case, our systems detect what’s going on,” Zuckerberg said. “We stop those messages from going through.”
Some people reacted with concern on Twitter: Was Facebook reading messages more generally? Facebook has been under scrutiny in recent weeks over how it handles users’ private data, and the revelation struck a nerve. Messenger doesn’t use the data from the scanned messages for advertising, the company said, but the policy may extend beyond what Messenger users expect.
The company told Bloomberg that while Messenger conversations are private, Facebook scans them and uses the same tools to prevent abuse there that it does on the social network more generally. All content must abide by the same “community standards.” People can report posts or messages for violating those standards, which would prompt a review by the company’s “community operations” team. Automated tools can also do the work.
“For example, on Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery or when you send a link, we scan it for malware or viruses,” a Facebook Messenger spokeswoman said in a statement. “Facebook designed these automated tools so we can rapidly stop abusive behavior on our platform.”
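Facebook has not said how these scanners are built, but the general shape of hash-based matching is easy to sketch. The Python below is a hypothetical illustration: the blocklist contents and function names are invented, and it uses an exact cryptographic hash, whereas production photo-matching systems (such as Microsoft’s PhotoDNA, widely used for this purpose) rely on perceptual hashes that still match after resizing or re-encoding.

```python
import hashlib
from urllib.parse import urlparse

# Hypothetical blocklists. In production these would be populated from
# industry hash databases (for known abusive imagery) and URL threat feeds.
KNOWN_BAD_IMAGE_HASHES = {
    "9f2b5c0e",  # placeholder digest, not a real entry
}
KNOWN_MALICIOUS_HOSTS = {"malware.example"}


def image_matches_known_bad(image_bytes: bytes) -> bool:
    """Hash the image and look it up in the blocklist.

    A cryptographic hash only catches byte-identical copies; real systems
    use perceptual hashing so altered copies of a known image still match.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_IMAGE_HASHES


def link_is_malicious(url: str) -> bool:
    """Check a link's host against a (hypothetical) threat feed."""
    host = urlparse(url).hostname or ""
    return host in KNOWN_MALICIOUS_HOSTS
```

The lookup itself is cheap, which is part of what makes checking every photo and link at Messenger’s scale practical.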
Messenger used to be part of the main Facebook app before it was spun off into a separate application in 2014. Facebook’s other major chat app, WhatsApp, uses end-to-end encryption, so not even WhatsApp can see its users’ messages. That has made it more secure for users, and more difficult for law enforcement seeking information in investigations.
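The contrast can be illustrated in a few lines. The Python sketch below uses the PyNaCl library’s public-key Box as a simplified stand-in; WhatsApp actually runs the Signal protocol, which adds key ratcheting and forward secrecy that this example omits. The point it shows is structural: private keys stay on users’ devices, so a relaying server handles only ciphertext it cannot read or scan.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys
# never leave the device, so the relaying server cannot decrypt.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the *public* keys are exchanged, via the server or a key directory.
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts for Bob using her private key and his public key.
sending_box = Box(alice_private, bob_public)
ciphertext = sending_box.encrypt(b"meet at noon")

# The server sees and stores only this opaque ciphertext blob.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_public)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```

Under this design, the kind of server-side content scanning Messenger performs is not possible, which is exactly the trade-off at issue.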
Facebook is on the defensive after revelations that private information from about 50 million users wound up in the hands of political ad-data firm Cambridge Analytica without their consent. Zuckerberg has agreed to testify before the House next week and is holding a conference call on Wednesday afternoon to discuss changes to Facebook privacy policies.
The company is working to make its privacy policies clearer, but gaps remain between what it says users have agreed to and what users believe they actually agreed to.
The Messenger scanning systems “are very similar to those that other internet companies use today,” the company said.

Bloomberg