In July 2022, Apple announced a new protection feature for its devices. Called “Lockdown Mode,” it severely restricts the functionality of your Apple smartphone, tablet or laptop. Its purpose is to reduce the success rate of targeted attacks, which politicians, activists and journalists, among others, are subjected to. Lockdown Mode is set to appear in the upcoming releases of iOS 16 (for smartphones), iPadOS 16 (for tablets) and macOS 13 Ventura (for desktops and laptops).
For ordinary users, this operating mode is likely to be more of an inconvenience than a real benefit. For this reason, Apple recommends it only for users whose activities mean they are likely to face targeted attacks. In this post, we analyze the ins and outs of Lockdown Mode, compare the new restrictions with the capabilities of well-known exploits for Apple smartphones, and examine why this mode, although useful, is no silver bullet.
Lockdown Mode in detail
Before the end of this year, with the release of the new versions of iOS, your Apple smartphone or tablet (if relatively recent, that is, no earlier than 2018) will have the new Lockdown Mode in its settings.
After activation, the phone will reboot, and some small (but, for some people, vital) features will stop working. For example, iMessage attachments will be blocked, and some websites may stop working properly in the browser. It will also be harder for people you've had no prior contact with to reach you. All of these restrictions are an effort to close the entry points most commonly exploited by attackers.
Digging deeper, Lockdown Mode introduces the following restrictions on your Apple device:
- In iMessage chats, you can see only text and images sent to you. All other attachments will be blocked.
- Some technologies will be disabled in the browser, including just-in-time (JIT) compilation.
- All incoming invitations to communicate through Apple services will be blocked. For example, you will be unable to make a FaceTime call if you have not previously chatted with the other user.
- While locked, your smartphone will not interact in any way with a computer (or any other external device) connected to it by cable.
- It won’t be possible to install configuration profiles or enroll the phone into Mobile Device Management (MDM).
The first three measures aim to limit the most common remote targeted attack vectors on Apple devices: an infected iMessage, a link to a malicious website and an incoming video call.
The fourth is designed to prevent someone from connecting your iPhone, if left unattended, to a computer and stealing valuable information through a vulnerability in the communication protocol.
And the fifth restriction makes it impossible to enroll a smartphone in Lockdown Mode into an MDM system. Companies often use MDM for security purposes, such as wiping data from a lost phone. But this feature can also be used to steal data, since it gives the MDM administrator wide-ranging control over the device.
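Some of these restrictions are even visible to app developers. Below is a minimal Swift sketch, assuming the isLockdownModeEnabled property that WebKit exposes on WKWebpagePreferences starting with iOS 16 (the class name EmbeddedBrowserViewController and the example URL are illustrative): an app that embeds web content could use it to notice that Lockdown Mode's reduced browser feature set applies to its pages and warn the user accordingly.

```swift
import UIKit
import WebKit

// Sketch: a view controller that hosts a WKWebView and checks, on each
// navigation, whether the system has applied Lockdown Mode's restricted
// WebKit configuration to the page being loaded.
final class EmbeddedBrowserViewController: UIViewController, WKNavigationDelegate {
    private let webView = WKWebView(frame: .zero, configuration: WKWebViewConfiguration())

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.navigationDelegate = self
        webView.frame = view.bounds
        view.addSubview(webView)
        webView.load(URLRequest(url: URL(string: "https://example.com")!))
    }

    // WebKit hands us the per-page preferences before each navigation;
    // isLockdownModeEnabled (iOS 16+) reflects whether the hardened,
    // feature-reduced engine configuration is in effect for this load.
    func webView(_ webView: WKWebView,
                 decidePolicyFor navigationAction: WKNavigationAction,
                 preferences: WKWebpagePreferences,
                 decisionHandler: @escaping (WKNavigationActionPolicy, WKWebpagePreferences) -> Void) {
        if #available(iOS 16.0, *), preferences.isLockdownModeEnabled {
            // Hypothetical reaction: tell the user why a page may look broken.
            print("Lockdown Mode is active: some web content may not render correctly")
        }
        decisionHandler(.allow, preferences)
    }
}
```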
All in all, Lockdown Mode sounds like a good idea. Maybe we should all put up with some inconvenience to stay safe?
Features versus bugs
Before addressing this question, let’s assess how radical Apple’s solution actually is. If you think about it, it’s the exact opposite of all established norms in the industry. Usually, it goes like this: first, a developer comes up with a new feature, rolls it out and then wrestles to rid the code of bugs. With Lockdown Mode, on the other hand, Apple proposes giving up a handful of existing features for the sake of better protection.
A simple (and purely theoretical) example: suppose the maker of a messenger app adds the ability to exchange beautiful animated emojis, and even create your own. Then it turns out that it’s possible to create an emoji that causes the devices of all recipients to constantly reboot. Not nice.
To avoid this, the feature should have been scrapped, or more time should have been spent on vulnerability analysis. But it was more important to release and monetize the product as quickly as possible. In this behind-the-scenes struggle between security and convenience, convenience has always won. Until now, that is, because Apple's new mode places security ahead of everything else. There's only one word to describe it: cool.
Does it mean that iPhones without Lockdown Mode are unsafe?
Apple mobile devices are already pretty secure, which is important in the context of this announcement. Stealing data from an iPhone isn’t easy, and Apple is bending over backwards to keep it that way.
For example, your biometric information for unlocking your phone is stored only on the device and is not sent to the server. Data in the phone’s storage is encrypted. Your PIN to unlock the phone cannot be brute-forced: after several wrong attempts, the device is locked. Smartphone apps run in isolation from each other and cannot, generally speaking, access data stored by other apps. Hacking an iPhone is getting harder every year. For most users, this level of security is more than sufficient.
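The point about biometrics staying on the device is visible in how apps use it. Here is a minimal sketch using Apple's LocalAuthentication framework (the function name unlockSensitiveData and the prompt text are illustrative): the app asks the system to authenticate the user with Face ID or Touch ID and only ever receives a success-or-failure answer; the biometric data itself never reaches the app, let alone a server.

```swift
import Foundation
import LocalAuthentication

// Sketch: request biometric authentication before revealing app data.
// The app gets only a Boolean result; the fingerprint/face templates
// stay on the device and are never exposed to the app or sent anywhere.
func unlockSensitiveData(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var availabilityError: NSError?

    // Make sure Face ID / Touch ID is set up and available on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &availabilityError) else {
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, _ in
        // The callback arrives on a private queue; hop back to main for UI work.
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```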
So why add yet more protection?
The question concerns a fairly small number of users whose data is so valuable that those who want it are prepared to go to extraordinary lengths to get it. Extraordinary lengths, in this context, means spending a lot of time and money on developing complex exploits capable of bypassing known protection systems. Such sophisticated cyberattacks threaten only a few tens of thousands of people in the whole world.
This ballpark figure is known to us from the Pegasus Project. In 2020, a list of some 50,000 names and phone numbers was leaked, belonging to individuals who had allegedly been (or could have been) attacked using spyware developed by NSO Group. This Israeli company has long been criticized for its “legal” development of hacking tools for clients, which include many intelligence agencies worldwide.
NSO Group itself denied any link between its solutions and the leaked list of targets, but evidence later emerged that activists, journalists and politicians (all the way up to heads of state and government) had indeed been attacked using the company’s technologies. Developing exploits, even legally, is a dodgy business that can result in the leakage of extremely dangerous attack methods, which anyone can then use.
How sophisticated are exploits for iOS?
The complexity of these exploits can be gauged by looking at a zero-click attack that Google’s Project Zero team investigated at the end of last year. Normally, the victim at least has to click a link to activate the attacker’s malware, but “zero-click” means that no user action is required to compromise the targeted device.
In the particular case described by Project Zero, it is sufficient to send a malicious message to the victim in iMessage, which on most iPhones is enabled by default and replaces regular text messages. In other words, it is enough for an attacker to know the victim’s phone number and send a message, whereupon they gain remote control over the targeted device.
The exploit is very complicated. In iMessage, the victim receives a file with a .gif extension that is actually not a GIF at all, but a PDF compressed using an algorithm that was fairly popular back in the early 2000s. The victim’s phone attempts to show a preview of this document. In most cases Apple’s own code is used for this, but for this particular compression format a third-party library is employed. And in it a vulnerability was found: a not particularly remarkable buffer overflow error. To put it as simply as possible, built around this minor vulnerability is a separate, self-contained computational system that ultimately executes malicious code.
In other words, the attack exploits a number of non-obvious flaws in the system, each of which seems insignificant in isolation. However, if they are strung together in a chain, the net result is iPhone infection by means of a single message, with no user clicks required.
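The mismatch between a file’s name and its real contents is easy to demonstrate. Here is a simplified, purely illustrative Swift snippet (not Apple’s parsing code; the detectFormat helper is made up for this example) that inspects a file’s leading “magic bytes” instead of trusting its extension, which is enough to reveal that a “.gif” attachment is really something else.

```swift
import Foundation

// Illustrative only: decide what a file really is by its magic bytes,
// ignoring whatever extension the sender chose to give it.
enum DetectedFormat {
    case gif      // Real GIFs start with "GIF87a" or "GIF89a"
    case pdf      // PDFs start with "%PDF-"
    case unknown
}

func detectFormat(of url: URL) throws -> DetectedFormat {
    let handle = try FileHandle(forReadingFrom: url)
    defer { try? handle.close() }
    let header = handle.readData(ofLength: 4)

    if header.starts(with: Array("GIF8".utf8)) { return .gif }
    if header.starts(with: Array("%PDF".utf8)) { return .pdf }
    return .unknown
}

// Usage: a file delivered as "photo.gif" may still turn out to be a PDF.
// let format = try detectFormat(of: URL(fileURLWithPath: "photo.gif"))
```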
This, quite frankly, is not something a teenage hacker might accidentally stumble across. And not even what a team of regular malware writers might create: they are usually after a much more direct route to monetization. Such a sophisticated exploit must have required many thousands of hours and many millions of dollars to create.
But let’s remember a key feature of Lockdown Mode mentioned above: almost all attachments are blocked. This is precisely to make zero-click attacks far harder to pull off, even if the iOS code does contain the corresponding bug.
The remaining features of Lockdown Mode are there to close other common “entry points” for targeted attacks: web browser, wired connection to a computer, incoming FaceTime calls. For these attack vectors, there already exist quite a few exploits, though not necessarily in Apple products.
What are the chances of such an elaborate attack being deployed against you personally if you are not on the radar of intelligence services? Pretty much zero, unless you get hit by accident. Therefore, for the average user, Lockdown Mode doesn’t make much sense. There is little point in making your phone or laptop less usable in exchange for a slight decrease in the chances of ending up on the receiving end of a successful attack.
Not by lockdown alone
On the other hand, for those who are in the circle of potential targets of Pegasus and similar spyware, Apple’s new Lockdown Mode is certainly a positive development, but not a silver bullet.
In addition to (and, until its release, instead of) Lockdown Mode, our experts have a few other recommendations. Keep in mind, this is about a situation in which someone very powerful and very determined is hunting for your data. Here are a few tips:
- Reboot your smartphone daily. Creating an iPhone exploit is already hard; making it survive a reboot is much harder. Turning your phone off regularly will provide a little more protection.
- Disable iMessage altogether. Apple is unlikely to recommend this, but you can do it yourself. Why just reduce the chances of an iMessage attack when you can eliminate the whole threat in one fell swoop?
- Do not open links. In this case, it doesn’t even matter who sent them. If you really need to open a link, use a separate computer, and preferably the Tor Browser, which hides your identity and location.
- If possible, use a VPN to mask your traffic. Again, this will make it harder to determine your location and harvest data about your device for a future attack.
For more tips, see Costin Raiu’s post “Staying safe from Pegasus, Chrysaor and other APT mobile malware.”