Apple Starts to Scan Users' Photos


Apple announced it would start scanning users' photos under the guise of fighting child abuse. "The new feature scans iCloud Photos images to find child sexual abuse material, or CSAM, and reports it to Apple moderators — who can pass it on to the National Center for Missing and Exploited Children, or NCMEC."

From Episode 270
For the full episode (video) and bonus episodes, go to http://patreon.com/thedickshow and https://thedickshow.com/

#iPhone #Privacy #TheDickShow
