Apple: CSAM Image-Detection Backdoor ‘Narrow’ in Scope

Apple provided additional design and security details this week about the planned rollout of a feature aimed at detecting child sexual abuse material (CSAM) images stored in iCloud Photos. Privacy groups such as the Electronic Frontier Foundation warned that the flagging process effectively narrows the definition of end-to-end encryption to allow client-side access, which, they argued, means Apple is building a backdoor into its data storage.
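
The "client-side access" at issue refers to the device checking photos against a database of known-image hashes before they are encrypted and uploaded. The sketch below is purely illustrative: the type and function names are hypothetical, and it substitutes a plain SHA-256 digest and an in-memory set for Apple's actual NeuralHash perceptual hash and private set intersection protocol.

```swift
import CryptoKit
import Foundation

// Illustrative stand-in for on-device matching. Apple's real design uses a
// perceptual hash (NeuralHash), a blinded hash database, and a private set
// intersection protocol; none of that is reproduced here.
struct KnownHashDatabase {
    let hashes: Set<Data>  // digests of known flagged images (placeholder data)

    func matches(imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        return hashes.contains(Data(digest))
    }
}

// Before upload, the client evaluates the photo locally and attaches the
// result as metadata (roughly what Apple calls a "safety voucher").
func prepareForUpload(imageData: Data, database: KnownHashDatabase) -> (payload: Data, flagged: Bool) {
    let flagged = database.matches(imageData: imageData)
    return (imageData, flagged)
}
```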

Read full article on Threat Post
