New Apple Security May Lead to Problems

Last month, Apple announced new security precautions it is implementing with help from the National Center for Missing and Exploited Children (NCMEC). These precautions will scan images on iPhones and other iOS devices and compare them against NCMEC’s database of known Child Sexual Abuse Material (CSAM). The intention is to protect children and shut down the accounts of users who are breaking the law. The announcement has received backlash because it has implications for much broader privacy invasions and censorship. If you already know the details of the updates, skip to the end to learn how they could potentially affect comics.

A Quick Note

Before getting into the security updates, it is worth clarifying that this is not an attempt to sideline the issue. The online abuse of children, and the effect it has on them and their families, is a serious problem. We need large tech companies like Apple to continue to step up and work to eradicate it. However, the methods used need to be thoughtful and must not exacerbate other digital problems. It is worth looking at the Apple security updates critically to understand how we can improve the digital landscape.

What are the Changes?

The security updates consist of three major changes. For an in-depth examination of the updates, you can find analyses here and an explanation from Apple here. The three security functions are as follows.

1) Search function and Siri: Searches for CSAM will be directed to resources for reporting abuse or seeking help.

2) Blocking sexually explicit images: Before loading explicit images in Messages, devices will require an extra click-through for users under 18. When a user is under 13, the account holder will receive a notification that an explicit image was viewed from that account.

3) Scanning iCloud Photos for CSAM: iCloud photos will be compared against a CSAM database on-device. If matches are found, Apple will be notified, its team will review the findings, and the authorities will be alerted.

What are the Specifics?

The Second Update

For users under 18, images received through Messages will be scanned. If an algorithm deems an image sexually explicit, a warning message will appear over it. Users will need to acknowledge the warning and click through to the image. If the user is under 13 and clicks through, the primary account holder for the plan receives an alert that someone on the account viewed sexually explicit material. The account holder is only issued a warning and is not shown the actual image.
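To make that flow concrete, here is a minimal sketch in Python of the logic as described above. Every name here is hypothetical; Apple has not published these APIs, and the classifier is a stub.

```python
# Hypothetical sketch of the Messages flow described above.
# None of these names are Apple's actual APIs; the classifier is a stub.

def is_flagged_explicit(image_bytes: bytes) -> bool:
    """Stand-in for the on-device ML classifier."""
    return False  # placeholder

def notify_account_holder() -> None:
    """The parent gets an alert only; the image itself is never shared."""
    print("Alert: explicit material was viewed on a child account.")

def handle_image(image_bytes: bytes, user_age: int, clicked_through: bool) -> str:
    if user_age >= 18 or not is_flagged_explicit(image_bytes):
        return "display"                  # adults and benign images: unchanged
    if not clicked_through:
        return "show warning overlay"     # the child must confirm to proceed
    if user_age < 13:
        notify_account_holder()           # only accounts under 13 trigger this
    return "display"
```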

The Third Update

This is by far the most controversial update, and much of the criticism focuses on it. Apple will use a proprietary technology that reduces each image in the NCMEC CSAM database to a string of numbers. Each image, regardless of its resolution or cropping, will be identified by its string. Apple calls this technique “NeuralHash.” Apple has assured the public that the original images can’t be reverse-engineered from these strings, or hashes.
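For a sense of how an image can be reduced to a short, comparison-friendly string, here is a toy perceptual hash (the classic “average hash”). It is emphatically not NeuralHash, which is a learned and far more robust function, but it illustrates the same idea: visually similar images yield nearby fingerprints.

```python
# Toy perceptual hash ("aHash") -- NOT Apple's NeuralHash. Visually similar
# images (resized, recompressed) yield nearby fingerprints, unlike a
# cryptographic hash, which changes completely on any byte-level edit.

from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size, grayscale, threshold each pixel on the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Bits that differ between two hashes; a small distance suggests a match."""
    return bin(a ^ b).count("1")
```

With a hash like this, a resized copy of a photo typically lands within a few bits of the original, while unrelated images differ in roughly half their bits.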

Apple uses these hashes in a process it calls Private Set Intersection (PSI). With the new security update, iOS will contain a database of the CSAM hashes. Any image on a user’s phone that is saved to iCloud will be converted to a hash on-device and matched against that database. When a match is found, the image will be encrypted and, to put it simply, placed in a holding pattern of sorts. These encrypted images are not yet accessible to Apple.
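Stripped of its cryptography, the matching step amounts to something like the sketch below. The real PSI protocol is blinded so the device itself never learns whether a match occurred; this plain-lookup version, with hypothetical names throughout, only shows the data flow described above.

```python
# Unblinded sketch of on-device matching; all names are hypothetical.
# It reuses average_hash() from the previous sketch as a stand-in for
# NeuralHash, and a plain set lookup as a stand-in for the PSI protocol.

known_csam_hashes: set[int] = set()   # hash database shipped inside iOS
voucher_cache: list[bytes] = []       # the "holding pattern" described above

def encrypt_match(image_path: str) -> bytes:
    """Placeholder: in reality, Apple cannot decrypt this below the threshold."""
    return image_path.encode()

def process_icloud_upload(image_path: str) -> None:
    if average_hash(image_path) in known_csam_hashes:
        voucher_cache.append(encrypt_match(image_path))  # held, not yet reported
    # ...the photo itself uploads to iCloud as usual...
```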

The last step of the process is called Threshold Secret Sharing (TSS). Apple is only notified of matches to the CSAM database once a certain number of them, the threshold, is reached. At that point, the encrypted images in the holding pattern, or cache, are released to Apple and decrypted. Apple subjects the images to human review and, if necessary, locks down the account and notifies the authorities.
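Finally, the threshold step. The plain counter below stands in for real threshold secret sharing (for example, Shamir’s scheme), in which a decryption key is split into shares so it can only be reconstructed once enough of them exist; the numbers and names here are illustrative only.

```python
# Illustrative threshold check; a real TSS scheme makes decryption
# mathematically impossible, not merely forbidden, below the threshold.

THRESHOLD = 30  # illustrative value only; not from Apple's announcement

def maybe_review(voucher_cache: list[bytes]) -> None:
    if len(voucher_cache) < THRESHOLD:
        return                                    # vouchers stay opaque to Apple
    images = [decrypt(v) for v in voucher_cache]  # key is now reconstructible
    if human_review_confirms(images):
        lock_account()
        notify_authorities()

def decrypt(voucher: bytes) -> bytes:
    return voucher  # placeholder

def human_review_confirms(images: list[bytes]) -> bool:
    return False  # placeholder for Apple's manual review team

def lock_account() -> None:
    print("Account locked.")

def notify_authorities() -> None:
    print("Report filed with the authorities.")
```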

What are the Objections?

So why are folks worried about the security updates? Many view them as a breach of privacy and fear the creation of an exploitable backdoor. The real issues lie with the second and third updates.

The Second Update

The second update has drawn criticism for two reasons. The first is its reliance on algorithms to decide whether material is sexually explicit; we have seen time and time again how inaccurate algorithms can be. The second is its potential to endanger kids, particularly LGBTQ+ kids, whom the parental notifications could out to unsupportive families. More in-depth coverage of this issue can be found here.

The Third Update

The issue with the third update centers on the scanning and matching of images. Many companies already scan data for CSAM, but they do so on their own servers. The difference with Apple is that the scan happens on the user’s personal device. Though iCloud Photos is not end-to-end encrypted, this technique could give Apple a backdoor around encryption if it were ever extended. Apple notes that users can opt out by disabling iCloud Photos on their device.

So…What About Comics?

Okay, so what does all of this have to do with comics?

These security changes pose the greatest danger to editorial cartoonists and their audiences. Across the globe, cartoonists are targeted by illiberal and authoritarian governments. It is easy to imagine these protections for children being expanded to cover counterterrorism, public health, and national security. What if a cartoon is labeled seditious or misinformation?

It is conceivable that NeuralHash could be used on images outside the CSAM database. A government could create its own database of hashed images. What if opposition cartoons were in that database, or works of art by marginalized groups? Bad actors could repurpose the technology to invade privacy, pursue their political agendas, and censor ideas or identities they don’t like. In a recent FAQ, Apple responded to this fear:

We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.

Apple has stood up against censorship, but it has also assisted government censorship. Tension between the tech industry and national governments has been rising. Governments are pressuring companies to help them police information and to keep servers in-country. Non-compliance can lead to fines or expulsion from the market. Caving in to these demands can cause additional problems; recently, Apple’s censorship in China bled over into Taiwan and Hong Kong. Apple aside, is there a possibility of this security update creating a backdoor for hackers? We don’t know.

Conclusion

Is this take on the security updates alarmist? Maybe. However, we need to discuss and examine digital policies with potentially damaging consequences before they are implemented. Currently, there is not enough information about how users’ privacy is protected under these new policies. Opening a door now may let in others later and may increase the vulnerability of cartoonists and others who are outspoken. As we come to rely ever more on digital technology in our lives, it’s important to weigh the costs against the benefits and strive for the best solution, not the quickest.