Apple says its CSAM scan code can be verified by researchers. Corellium starts throwing out dollar bills

Last week, Apple essentially invited security researchers to probe its forthcoming technology that’s supposed to help thwart the spread of known child sexual abuse material (CSAM). In an attempt to clear up what it characterized as misunderstandings about its controversial plan to analyze iCloud-bound photos for this awful material, the Cupertino giant described [PDF] the systems, protections, and mechanisms involved.

Read the full article on The Register
