Privacy campaigners around the world have urged Apple to abandon plans to automatically scan message content for images of child sex abuse material (CSAM) and to prevent children from viewing objectionable content.

Although the 90 signatories of a letter coordinated by the Center for Democracy & Technology (CDT) acknowledged Apple’s positive intentions, they are concerned that the technology built to monitor messages and notify authorities of abuse would undermine the principles of end-to-end encryption.