For years, Apple has closely guarded its reputation for privacy among data-hungry, growth-hungry tech companies.
In multi-platform advertising campaigns, the company told consumers that “what happens on your iPhone stays on your iPhone” and equated its products with security through slogans such as “Privacy. That’s iPhone.”
But experts say that while Apple sets the bar when it comes to hardware and, in some cases, software security, the company could do more to protect user data from falling into the hands of police and other authorities.
In recent years, US law enforcement has increasingly used data collected and held by technology companies in investigations and prosecutions. Experts and civil liberties advocates have raised concerns about authorities’ widespread access to consumer digital information, warning that it could violate Fourth Amendment protections against unreasonable searches. These concerns have only intensified as once-protected activities, such as access to abortion, have been criminalized as felonies in many states.
“The more a company like Apple can do to either not get requests from law enforcement, or to be able to say they can’t comply using tools like end-to-end encryption, the better it will be for the company,” said Caitlin Seeley George, campaigns and managing director of the digital rights advocacy group Fight for the Future.
Apple passed data to law enforcement 90% of the time
According to its own transparency reports, Apple receives thousands of requests from law enforcement agencies each year for user data and, in the vast majority of cases, cooperates with them.
In the first half of 2021, Apple received 7,122 requests from US law enforcement for data pertaining to 22,427 accounts. According to the company’s latest transparency report, Apple shared some level of data in response to 90% of those requests. Of the 7,122 requests, the iPhone maker challenged or rejected 261.
The company’s compliance rate is largely in line with, and sometimes slightly higher than, that of competitors like Facebook and Google. Both of those companies, however, receive far more requests from authorities than the iPhone maker does.
Facebook received nearly 60,000 law enforcement requests from US authorities in the second half of 2021, according to the company’s latest transparency report, and provided data 88% of the time. In the same period, Google received 46,828 law enforcement requests affecting more than 100,000 accounts, and shared some level of data in response to more than 80% of the requests, according to the search giant’s transparency report. This is more than six times the number of law enforcement requests received by Apple in a comparable time period.
That’s because the amount of data Apple collects about its users pales in comparison to other players in the space, says Jennifer Golbeck, a professor of computer science at the University of Maryland. She noted that Apple’s business model relies less than its competitors’ on the marketing and advertising operations that are built on data collection. “They aren’t naturally mining people’s data for analytics the way Google and many other places do,” she said.
Apple has developed detailed guidelines outlining exactly what data authorities can obtain and how they can obtain it — a level of detail that the company says is in line with best practice.
Despite “secure” hardware, iCloud and other services pose a risk
But serious gaps remain, privacy advocates say.
While iMessages sent between Apple devices are end-to-end encrypted, preventing anyone but the sender and recipient from accessing them, not all information backed up to iCloud, Apple’s cloud storage service, has the same level of encryption.
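The practical difference is that with end-to-end encryption the keys exist only on the two endpoints, so a server that merely stores or relays the ciphertext has nothing readable to hand over in response to a legal request. A deliberately simplified sketch, using a toy one-time pad rather than Apple’s actual protocol (iMessage uses asymmetric cryptography):

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy one-time pad: XOR each byte of the data with the key.
    # Real end-to-end systems (iMessage, Signal) use vetted ciphers,
    # but the key property is the same: no key, no plaintext.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"what happens on your iPhone"
# The key is generated on, and never leaves, the two devices.
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(key, message)    # all a relay server ever sees
recovered = xor_cipher(key, ciphertext)  # only a key-holding endpoint can do this

assert recovered == message
```

A backup that is not end-to-end encrypted is the opposite case: the provider holds the keys alongside the data, which is why such content can be produced in response to a warrant.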
“iCloud content as it exists in the customer’s account” may be released to law enforcement in response to a search warrant, Apple’s law enforcement guidelines say. This includes everything from detailed logs of the time, date, and recipient of emails sent in the previous 25 days to “saved photos, documents, contacts, calendars, bookmarks, Safari browsing history, map search history, messages, and iOS device backups.” The device backups themselves may include “camera roll photos and videos, device settings, app data, iMessage, Business Chat, SMS, and MMS [multimedia messaging service] messages and voicemail,” according to Apple.
Golbeck is an iPhone user herself but refuses to use iCloud because she is worried about the system’s vulnerability to hacking and to law enforcement requests. “I’m one of those people who, if someone asks if they should get an Android or an iPhone, I’m like, ‘Well, the iPhone will be better protected than Android, but the bar is just really low,’” she said.
“[Apple’s] hardware is the most secure on the market,” echoed Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, a privacy rights organization. But the company’s iCloud data policy worries him too: “I have to spend so much time opting out of things they try to automatically push me into using, things that are supposed to make my life better but really just put me at risk.”
“As long as Apple continues to limit privacy to a matter of hardware design rather than considering the full lifecycle of data and considering the full range of government surveillance threats, Apple will fail,” he said.
It’s a double standard, Cahn says, one that was already evident in Apple’s handling of its most high-profile privacy case: the 2015 mass shooting in San Bernardino, California.
At the time, Apple refused to comply with the FBI’s request to create a backdoor to access the shooter’s locked iPhone. The company claimed that the security bypass could be used by hackers as well as law enforcement officials in future cases.
But the company said in court documents that if the FBI had not changed the phone’s iCloud password, it would not have needed to create a backdoor, because the data would have been backed up and therefore available through legal process.
Indeed, the company said that, up to that point, it had already “provided all the data it had regarding the attackers’ accounts.”
“They were very clear that they didn’t want to hack their own iPhones, but they really wanted to hack an iCloud backup,” Cahn said.
Apple said in a statement that it considers privacy a fundamental human right and argued that users have always been given the option to opt out when the company collects their data.
“Our products incorporate innovative privacy technologies and practices designed to minimize the amount of your data that we, or anyone else, can access,” said Apple spokesman Trevor Kincaid, adding that the company is proud of newer privacy features such as App Tracking Transparency and Mail Privacy Protection, which give users more control over what information is shared with third parties.
“Wherever possible, data is processed on the device, and in many cases we use end-to-end encryption. Where Apple collects personal information, we communicate clearly and transparently to users how their data is used and how to opt out at any time.”
Apple reviews all legal requests and is required to comply with them when valid, Kincaid added, but stressed that the personal data Apple collects is limited to begin with. For example, the company encrypts all health data and does not collect device location data.
People are ‘completely unaware of what’s going on with their data’
Meanwhile, privacy groups like the Electronic Frontier Foundation (EFF) are urging Apple to implement end-to-end encryption for iCloud backups.
“When we say they’re better than everyone else, it’s more of an accusation of what everyone else is doing, and not necessarily that Apple is particularly good,” said EFF staff technologist Erica Portnoy.
Portnoy credits Apple for protecting some services, like iMessage, by default. “In some ways, some of the defaults might be a little better [than other companies’], which isn’t nothing,” she said. But, she noted, messages are only protected if they are sent between iPhones.
“We know that when messages are not end-to-end encrypted, many people can access those communications,” said George, whose organization, Fight for the Future, has launched a campaign pushing Apple and other companies to improve the security of their messaging systems.
That’s a problem the company could address, for example, by implementing the Google-backed messaging standard Rich Communication Services (RCS), George argued. The standard itself doesn’t come with end-to-end encryption, she said, but unlike SMS and MMS it can support encryption, and it would allow Apple to secure messages sent between iPhones and Android devices.
At the Code 2022 tech conference, Apple CEO Tim Cook indicated that the company has no plans to support RCS, arguing that users have not called it a priority. But they “don’t know what RCS is,” George said. “If Apple really doesn’t want to use RCS because it comes from Google, they might come up with other solutions to show a good faith effort to protect people’s messages.”
Kincaid said consumers didn’t ask for another messaging service because there are many encrypted offerings like Signal. He also said that Apple is concerned that RCS is not a modern standard and is not encrypted by default.
Golbeck, who has a TikTok channel about privacy, says people are “totally unaware of what’s going on with their data” and “think they have privacy that they don’t.”
“We really don’t want our own devices to turn into surveillance tools for the state,” Golbeck said.