Cyber Blurbs: Apple to Scan iPhones For Child Sex Abuse

In this week’s Cyber Blurbs Roundup, we take a look at Apple’s sweeping decision to scan iPhones for child abuse imagery, as well as Zoom’s costly class-action settlement.


Apple to Scan US iPhones for Child Abuse Imagery

Last Thursday, Apple announced plans to roll out technologies intended to combat child abuse imagery on iPhones in the United States. 

Part of a future iOS 15 update, the technology will scan images on an iPhone as they are uploaded to iCloud, checking whether they match known child sexual abuse material (CSAM) in the National Center for Missing and Exploited Children (NCMEC) database. If the system detects a match between image hashes, the image is then reviewed by a human. 

The scanning applies only to images being uploaded to iCloud; images stored solely on the iPhone itself are not scanned.
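To make the hash-matching idea concrete, here is a deliberately simplified sketch. Apple’s technical summary describes a perceptual hash (NeuralHash) and a private set intersection protocol rather than plain cryptographic hash lookups, so this is only an illustration of the general "compare against a database of known hashes" approach; the database contents and function names below are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of known CSAM hashes provided by NCMEC.
# Apple's real system uses perceptual hashes (NeuralHash), not SHA-256 of raw bytes.
KNOWN_HASH_DATABASE = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # example entry
}

def image_hash(path: Path) -> str:
    """Hash the raw image bytes (a simplified stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    """Return True if this image's hash appears in the known-hash database."""
    return image_hash(path) in KNOWN_HASH_DATABASE
```

In Apple’s design, only images queued for iCloud Photos upload would be checked this way, and a single match does not by itself trigger a report, as the threshold mechanism quoted below explains.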

“Apple’s method of detecting known CSAM is designed with user privacy in mind,” the company wrote in a blog post. “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”

“Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”
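The "threshold secret sharing" idea can be illustrated with a toy Shamir-style scheme. This is not Apple’s actual construction, just a sketch of the underlying principle: the secret needed to read the safety vouchers is split so it can only be reconstructed once a threshold number of matching shares exists.

```python
import random

PRIME = 2**127 - 1  # large prime field for a toy demo

def split_secret(secret: int, threshold: int, num_shares: int):
    """Split `secret` into points on a random polynomial of degree threshold-1."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Recover the secret via Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Each matching image contributes one share; nothing is readable below the threshold.
shares = split_secret(secret=123456789, threshold=3, num_shares=10)
print(reconstruct(shares[:3]) == 123456789)  # True: threshold met, secret recovered
print(reconstruct(shares[:2]) == 123456789)  # False: too few shares reveal nothing useful
```

The design choice this models is the one Apple describes: the contents of individual safety vouchers stay unreadable until an account crosses the match threshold, at which point human review becomes possible.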

The company also announced plans for a tool that will warn parents when their children aged 12 or younger share or receive sexually explicit images via iMessage. 

A technical summary of Apple’s new policies can be found here.

The policies, while well-intentioned, have some privacy and security experts concerned. 

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” the Electronic Frontier Foundation wrote in a post. “That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

Some are also concerned that governments may pressure Apple to expand its photo-scanning technology to look for more than just CSAM. The company published an FAQ to address this issue.

“Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC (National Center for Missing and Exploited Children) and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.”

News of the new technology comes just one year after Apple announced its unconventional but widely appreciated approach to user privacy. Alongside iOS 14, the company introduced App Tracking Transparency, a policy requiring mobile applications to obtain user consent before tracking their activity across other apps. Privacy advocates celebrated the move. 


Zoom to Pay $85M Over Lying About End-to-End Encryption

Zoom is set to pay $85 million to settle a class-action lawsuit in which the company was accused of lying about offering end-to-end encryption. Users will be eligible to receive $15 or $25 each, whether or not they had a paid account. 

Zoom has "agreed to over a dozen major changes to its practices, designed to improve meeting security, bolster privacy disclosures, and safeguard consumer data," according to court documents (via ArsTechnica). 

The lawsuit stems from Zoom’s definition of E2EE, which doesn’t appear to follow industry standards.

"The definition of end-to-end encryption is not up for interpretation in the industry," the complaint read. "Zoom's misrepresentations are a stark contrast to other videoconferencing services, such as Apple's FaceTime, which have undertaken the more challenging task of implementing true E2E encryption for a multiple party call."

News of the settlement comes nine months after the company settled a separate case with the Federal Trade Commission, which accused Zoom of misrepresenting its privacy offerings. 

"Zoom did not provide end-to-end encryption for any Zoom Meeting that was conducted outside of Zoom's 'Connecter' product (which are hosted on a customer's own servers), because Zoom's servers—including some located in China—maintain the cryptographic keys that would allow Zoom to access the content of its customers' Zoom Meetings,"

Zoom’s popularity soared last year, largely due to the COVID-19 pandemic. It became the go-to video conferencing platform for many newly remote businesses, both small and large, more than quadrupling its annual revenue from $622.7 million (2019) to $2.7 billion (2020). The company is on pace to surpass its 2020 numbers, hitting nearly a billion dollars in revenue during Q1 2021 (February–April) alone. 

Its success came in spite of serious concerns from security experts, most notably over "Zoombombing," in which uninvited attendees could join and disrupt otherwise private calls. 

