
Apple’s iOS 15 sparks concerns over user privacy

Updated: Sep 8, 2021

The new update pushes the boundaries of user privacy in an attempt to stop the dissemination of child pornography.

By Isaac Dektor, News Editor


Surrounded by Apple products, two boys try out the latest devices on display at an Apple store. (Photo by Luis Flores/The Valley Star)

Set to be released this fall, Apple’s latest operating system, iOS 15, has ignited debate over privacy rights as the company prepares to implement new child safety features.


The company will target child sexual abuse material, or CSAM, using algorithms to comb through users’ iCloud photos for potentially exploitative images. The technique, known as digital hashing, identifies copies of previously flagged images or videos by their digital fingerprints, or hashes. Apple plans to use it in conjunction with the National Center for Missing and Exploited Children (NCMEC), which will provide the image hashes that Apple’s systems search for.
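Conceptually, the matching step amounts to computing a fingerprint for each photo and checking it against a set of known fingerprints. The sketch below illustrates the idea in Python with a hypothetical known-hash set; note that it uses an ordinary cryptographic hash, which only catches exact copies, whereas Apple’s NeuralHash is a perceptual hash designed to also match images that have been resized or re-encoded.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints for previously flagged images.
# In Apple's system, these hashes would be supplied by NCMEC.
known_hashes: set[str] = set()

def fingerprint(path: Path) -> str:
    """Return a SHA-256 fingerprint of the file's raw bytes.

    A cryptographic hash like this matches only byte-for-byte copies;
    a perceptual hash (such as NeuralHash) maps visually similar images
    to the same fingerprint even after minor edits.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_photos(photos: list[Path]) -> list[Path]:
    """Return the photos whose fingerprints appear in the known set."""
    return [p for p in photos if fingerprint(p) in known_hashes]
```

Matching fingerprints rather than image content is what allows the flagged-image database to be distributed without containing the images themselves; only the hashes are ever compared.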


Reports to the NCMEC flagged more than 65 million images, videos and other files of suspected child exploitation on the internet in 2020.


Valley College student Anthony Ramirez does not use Apple products, but believes that the ends may not justify the means.


“It’s a double-edged sword,” Ramirez said. “It’s a good thing you might catch somebody, or you might intrude on someone’s privacy. I don’t think they should be able to just - by themselves - scan through your phone.”


Roughly 47 percent of U.S. citizens use iPhones as of 2021, according to Statista.


Apple released an overview of its new safety features’ security model that addresses privacy concerns.


“The matching process must only reveal CSAM, and must learn no information about any non-CSAM image,” the report says.
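Apple’s published technical summary frames that guarantee in terms of cryptographic techniques, including private set intersection and threshold secret sharing: the server is meant to learn nothing about an account’s photos until a threshold number of matches is crossed. The sketch below illustrates only the threshold idea, using Shamir’s secret sharing in Python; the prime, names and parameters are illustrative, not Apple’s actual protocol.

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is done modulo this

def make_shares(secret: int, threshold: int, count: int) -> list[tuple[int, int]]:
    """Split `secret` into `count` shares so that any `threshold` of them
    reconstruct it, while any fewer reveal nothing about it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares: list[tuple[int, int]]) -> int:
    """Reconstruct the secret via Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Any three of five shares recover the secret; two alone do not.
shares = make_shares(secret=42, threshold=3, count=5)
assert recover(shares[:3]) == 42
```

In Apple’s description, each match uploads an encrypted “safety voucher,” and the vouchers can only be decrypted for human review once an account crosses the match threshold.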


In addition to actively seeking out CSAM, Siri and Search in iOS 15 will point users who attempt CSAM-related searches toward mental health resources where they can seek help, and will walk users through filing a report.


In a future update, Apple will also add a feature to Messages that blurs sexually explicit images received by minors, giving them the choice of whether to view them and notifying parents if the child decides to do so. The same protocol will apply when a child attempts to send similar material.
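As a rough sketch of that flow (the names and structure are hypothetical, not Apple’s implementation), the logic reduces to a blur-first, notify-on-view policy:

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    """Hypothetical model of a minor's account in Family Sharing."""
    name: str
    parent_contact: str

def handle_flagged_image(child: ChildAccount, chooses_to_view: bool) -> str:
    """Sketch of the Messages policy: a flagged image arrives blurred, the
    child chooses whether to view it, and the parent is notified only if
    the child views it."""
    if not chooses_to_view:
        return "image remains blurred; no notification sent"
    print(f"Notifying {child.parent_contact}: {child.name} viewed a flagged image")
    return "image unblurred; parent notified"
```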


Over 90 civil rights, human rights and digital rights organizations from across the globe signed an open letter urging Apple CEO Tim Cook to abandon the company’s planned new features.

“We are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter says.


Apple’s new features stand in stark contrast to the hard line the company took in 2016, when it refused to comply with a court order to unlock an iPhone belonging to Syed Farook, one of the perpetrators of the San Bernardino shootings. Cook opposed the order on the grounds that it would jeopardize the privacy of all of Apple’s users.


The FBI eventually unlocked the iPhone, but Apple stood firm against the U.S. government’s demand to create a backdoor into its devices.


"Our battle was over whether or not the government could force Apple to create a tool that could put hundreds of millions of people at risk in order to get into a phone," Cook said about the matter.


Edward Snowden, the former NSA contractor turned whistleblower, issued an ominous warning about the slippery slope Apple’s new features could create.


"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this,” Snowden wrote. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow."
