
It may soon be more difficult for hackers to grab your iCloud data, and even Apple is rethinking its access to sensitive content. The company is introducing a raft of security measures that run the gamut from expanded end-to-end encryption to a reversal of a controversial plan to scan for child sexual abuse material. The launch is headlined by Advanced Data Protection, an optional feature that applies end-to-end encryption to more categories of iCloud data. Where Apple already protected 14 data categories this way, the new offering covers 23, including iCloud device backups, photos and notes. Calendar, Contacts and iCloud Mail still aren’t end-to-end encrypted, as they have to interoperate with external email, contacts and calendar systems.
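To make the end-to-end model concrete, here’s a minimal sketch in Swift using CryptoKit, assuming a simple upload flow: the key is generated and held on-device, so the server only ever stores ciphertext it can’t read. This is illustrative only; Apple’s actual implementation manages keys in the Secure Enclave and syncs them across a user’s trusted devices.

```swift
import CryptoKit
import Foundation

// Minimal sketch of the end-to-end model: the key never leaves the device,
// so the cloud stores only ciphertext. (Illustrative; not Apple's actual
// key-management design.)
let deviceKey = SymmetricKey(size: .bits256)        // generated and kept on-device
let note = Data("Meet at 9am, draft attached".utf8)

do {
    // Authenticated encryption on the device before anything is uploaded.
    let sealed = try AES.GCM.seal(note, using: deviceKey)
    let uploadPayload = sealed.combined!            // the only thing the server sees

    // Without deviceKey, the server (or anyone who breaches it) can't read this.
    let box = try AES.GCM.SealedBox(combined: uploadPayload)
    let roundTrip = try AES.GCM.open(box, using: deviceKey)
    print(String(decoding: roundTrip, as: UTF8.self)) // recovered on a trusted device
} catch {
    print("crypto error: \(error)")
}
```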

Advanced Data Protection is available to try in the US today as part of the Apple Beta Software Program, with broader American access by the end of 2022 and a rollout to other countries in early 2023. You’ll have to set up an alternative recovery method if you enable the feature, as Apple won’t hold the keys needed to recover your data.
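For a sense of what that recovery method involves: one option is a 28-character recovery key that the user must store themselves, since Apple can’t regenerate it. The sketch below produces a key in that general shape; the alphabet and dash grouping are assumptions for illustration, not Apple’s published format.

```swift
import Foundation

// Sketch of generating a 28-character recovery key. The length matches
// Apple's recovery keys; the alphabet and grouping are assumptions.
func makeRecoveryKey() -> String {
    let alphabet = Array("ABCDEFGHJKMNPQRSTUVWXYZ23456789")     // ambiguous glyphs omitted
    let chars = (0..<28).map { _ in alphabet.randomElement()! } // secure RNG on Apple platforms
    return stride(from: 0, to: chars.count, by: 4)
        .map { String(chars[$0..<min($0 + 4, chars.count)]) }   // group as XXXX-XXXX-...
        .joined(separator: "-")
}

print(makeRecoveryKey()) // shown once at setup, then stored by the user
```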

Two further safeguards are aimed at preventing misuse of accounts and devices. iMessage Contact Key Verification will help those who face “extraordinary” threats (such as activists, government officials and journalists) ensure that chat participants are who they claim to be. You’ll get an automatic alert if a state-sponsored hacker or similar intruder manages to add a rogue device to an account, and users with the feature enabled can compare verification codes over FaceTime, on secure calls or in person.
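The out-of-band comparison works because both parties can derive the same short code from the conversation’s public keys; a rogue device means a new key, which changes the code. Apple hasn’t published iMessage’s exact construction, so the CryptoKit sketch below is just one plausible shape for the idea:

```swift
import CryptoKit
import Foundation

// Both parties derive a short code from the identity public keys in the
// chat and compare it over FaceTime or in person. A rogue device would
// introduce a new key and change the code. (One plausible construction;
// Apple hasn't published iMessage's actual scheme.)
func verificationCode(_ a: Curve25519.Signing.PublicKey,
                      _ b: Curve25519.Signing.PublicKey) -> String {
    // Sort the raw keys so both sides hash them in the same order.
    let parts = [a.rawRepresentation, b.rawRepresentation]
        .sorted { $0.lexicographicallyPrecedes($1) }
    let digest = SHA256.hash(data: parts[0] + parts[1])
    // Render a few bytes as a short, human-comparable code.
    return digest.prefix(4).map { String(format: "%02X", $0) }.joined()
}

let alice = Curve25519.Signing.PrivateKey().publicKey
let bob = Curve25519.Signing.PrivateKey().publicKey
print(verificationCode(alice, bob)) // both devices should display...
print(verificationCode(bob, alice)) // ...the same code, in either order
```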

iCloud users will also have the option of using hardware security keys as part of two-factor authentication. These include both plug-in keys and NFC keys that only need to be held near your iPhone. Both the iMessage and security key protections will be available worldwide in 2023.
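What makes a hardware key stronger than a texted code is the underlying challenge-response: the private key never leaves the hardware, so only someone physically holding it can answer the server’s challenge, and there’s no code to phish. Real keys speak the FIDO2/WebAuthn protocol; the sketch below uses CryptoKit signatures as a stand-in for that exchange.

```swift
import CryptoKit
import Foundation

// Conceptual challenge-response behind hardware-key 2FA. Real security
// keys use FIDO2/WebAuthn; CryptoKit signing stands in here.

// Enrollment: the key's public half is registered with the account.
let securityKey = Curve25519.Signing.PrivateKey() // sealed inside the hardware
let registeredPublicKey = securityKey.publicKey   // stored server-side

// Sign-in: the server issues a fresh one-time challenge...
let challenge = Data((0..<32).map { _ in UInt8.random(in: .min ... .max) })

// ...the key signs it, which requires physical possession...
let signature = try! securityKey.signature(for: challenge)

// ...and the server verifies against the registered public key.
let accepted = registeredPublicKey.isValidSignature(signature, for: challenge)
print(accepted ? "second factor satisfied" : "sign-in rejected")
```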

At the same time, Apple is backing away from its controversial efforts to screen for child sexual abuse material (CSAM). The company tells Wired it has shelved a technology that would have detected known CSAM photos in iCloud and flagged accounts for review if they held a certain number of the offending images. The change of heart comes after “extensive consultation” with experts, according to Apple, which has decided it can protect children without combing through this data. Instead, it’s focusing on opt-in Communication Safety features that warn parents about nudity in iMessage photos and flag attempts to search for CSAM through Safari, Siri and Spotlight.
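For context on what was shelved: as Apple described the system publicly, it would have compared perceptual hashes of photos against a database of known CSAM and flagged an account only after a match threshold was crossed. The sketch below captures just that threshold logic, with SHA256 standing in for Apple’s NeuralHash perceptual hash and placeholder values for the database and threshold:

```swift
import CryptoKit
import Foundation

// Threshold logic of the shelved scheme: hash each photo, check against
// known-bad hashes, and flag only past a threshold. SHA256 stands in for
// Apple's NeuralHash; the database and threshold here are placeholders.
let knownBadHashes: Set<String> = []  // would hold hashes of known CSAM
let flagThreshold = 30                // placeholder; Apple discussed a figure around 30

func shouldFlag(photos: [Data]) -> Bool {
    let matchCount = photos.filter { photo in
        let hex = SHA256.hash(data: photo)
            .map { String(format: "%02x", $0) }
            .joined()
        return knownBadHashes.contains(hex)
    }.count
    return matchCount >= flagThreshold // a handful of matches never flags an account
}

print(shouldFlag(photos: [])) // false: no matches, no flag
```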

Apple plans to expand Communication Safety to recognize nudity in videos as well as content in other communications apps. The company further hopes to enable third-party support for the feature so that many apps can flag child abuse. There’s no timeframe for when these extra capabilities will arrive, but Apple added that it would continue making it easy to report exploitative material.

The tech giant is pitching the new security features as useful tools for its most privacy- and security-conscious users, whether they’re high-profile targets or simply people willing to trade some convenience for peace of mind. However, they could also set up further conflicts between Apple and law enforcement. The FBI and other agencies have frequently attacked Apple for making it difficult to crack suspects’ iPhones through device encryption. Now, police might also be shut out of iCloud data they could previously obtain through legal requests; with end-to-end encryption in place, Apple couldn’t comply with orders even if it wanted to.

This new approach might rankle some in government as well. In recent years, politicians have put forward bills that would require cooperation with court orders for encrypted data, and that would either mandate or encourage the creation of encryption backdoors. While these measures haven’t succeeded, their supporters could easily be frustrated by stronger digital locks. Not that Apple is likely to back down. Like Meta and other industry heavyweights, Apple has argued that backdoors expose data to any would-be intruder, not just police making lawful requests.