Apple to scan iPhones for child sexual abuse material

  • Published August 6, 2021

New versions of Apple's iOS and iPadOS, to be released later this year, will include a new cryptographic feature as part of the company's campaign to limit the spread of child sexual abuse material (CSAM).

The new system will work by comparing hashes of images against a database of hashes of known CSAM compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations. If a match is found, a human reviewer will assess the image and report it to law enforcement.
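The matching flow described above can be sketched in a few lines. This is a hypothetical illustration only: Apple's actual system uses a perceptual hash ("NeuralHash") computed on-device, whereas this sketch substitutes a plain SHA-256 digest purely to show the compare-against-known-database step; the function names and the sample database are invented for the example.

```python
import hashlib

# Stand-in for the NCMEC-derived database of known-CSAM hashes.
# (Invented sample data; the real database holds perceptual hashes.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Digest of the image bytes; a stand-in for a perceptual hash,
    which would also match slightly altered copies of an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(image_bytes: bytes) -> bool:
    """Return True if the image matches the known-hash database,
    i.e. it would be queued for human review."""
    return image_hash(image_bytes) in KNOWN_HASHES

flag_for_review(b"known-flagged-image-bytes")  # matches the database
flag_for_review(b"ordinary-holiday-photo")     # no match
```

A cryptographic hash like SHA-256 only matches byte-identical files; the point of a perceptual hash in the real system is that resized or re-encoded copies of a known image still produce a matching fingerprint.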

Apple has expressed confidence in the new system, adding that it has an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account". The company also says the new system will offer "significant" privacy benefits compared with existing techniques.

However, experts have expressed concern that governments could use the system to spy on citizens by expanding it to flag prohibited content or political speech.

