Apple to begin reporting Child Sexual Abuse Material (CSAM)

Ranson Burkette & Tony Anscombe | Season 4 Episode 10

Show notes

Apple recently announced that it will begin reporting Child Sexual Abuse Material (CSAM) to law enforcement with the iOS 15 update. The new system aims to identify known CSAM images using a process called hashing, which turns images into numbers that can be compared against a database of known material. On this episode, we discuss how Apple's new system will work and how this bold step in combating child sexual abuse is being received by privacy-conscious users around the world.
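Apple's actual system uses a proprietary perceptual hash called NeuralHash, which is far more sophisticated than anything shown here. Purely as an illustration of the general idea of "turning images into numbers," here is a minimal average-hash sketch in Python; the pixel data and function names are made up for this example:

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255).

    Returns an integer hash with one bit per pixel, set when that
    pixel is brighter than the image's mean brightness. Visually
    similar images produce similar (often identical) bit patterns.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests similar images."""
    return bin(h1 ^ h2).count("1")

# Two nearly identical 4x4 "images" (hypothetical grayscale data):
# the second has slight pixel-level noise, as a re-saved copy might.
img_a = [[200, 200, 50, 50]] * 2 + [[50, 50, 200, 200]] * 2
img_b = [[198, 201, 52, 49]] * 2 + [[51, 48, 202, 199]] * 2

ha, hb = average_hash(img_a), average_hash(img_b)
print(hamming_distance(ha, hb))  # small distance despite changed pixel values
```

Unlike a cryptographic hash, where changing one pixel changes the whole digest, a perceptual hash like this tolerates small edits, which is what makes it useful for matching images against a database of known material.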

Links:

Apple to combat Child Sexual Abuse Material: www.cnbc.com/2021/08/05/apple-will-report-child-sexual-abuse-images-on-icloud-to-law.html

National Center for Missing & Exploited Children (NCMEC): missingkids.org

Internet Watch Foundation (IWF): iwf.org.uk