FBI couldn't get into WaPo reporter's iPhone because Lockdown Mode enabled
522 points
7 hours ago
| 25 comments
| 404media.co
| HN
nova22033
5 hours ago
[-]
Remember...they can make you use touch id...they can't make you give them your password.

https://x.com/runasand/status/2017659019251343763?s=20

The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, meaning the agents were allowed to require her to unlock it.

reply
wackget
5 hours ago
[-]
Link which doesn't directly support website owned by unscrupulous trillionaire: https://xcancel.com/runasand/status/2017659019251343763?s=20
reply
throwawayfour
4 hours ago
[-]
Good reminder to also set up something that does this automatically for you:

https://news.ycombinator.com/item?id=46526010

reply
JimA
1 hour ago
[-]
I generally avoid extensions that can read all sites (even if technically necessary), so use the suggestion found here [1] instead.

A few bookmarklets:

javascript:(function(){if (location.host.endsWith('x.com')) location.host='xcancel.com';})()

javascript:(function(){if (location.host.endsWith('youtube.com')) location.host='inv.nadeko.net';})()

javascript:(function(){if (location.hostname.endsWith('instagram.com')) {location.replace('https://imginn.com' + location.pathname);}})()

[1] https://www.reddit.com/r/uBlockOrigin/comments/1cc0uon/addin...
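Those three one-liners can be folded into a single host-mapping table. A sketch (same frontends as the bookmarklets above; the `FRONTENDS` table and `rewriteUrl` helper are just illustrative names):

```javascript
// Sketch: one redirect table instead of three separate bookmarklets.
// The privacy frontends are the same ones suggested above.
const FRONTENDS = {
  'x.com': 'xcancel.com',
  'youtube.com': 'inv.nadeko.net',
  'instagram.com': 'imginn.com',
};

function rewriteUrl(href) {
  const url = new URL(href);
  for (const [host, frontend] of Object.entries(FRONTENDS)) {
    // Match the host itself or any subdomain of it.
    if (url.hostname === host || url.hostname.endsWith('.' + host)) {
      url.hostname = frontend;
      return url.toString();
    }
  }
  return href; // leave other sites untouched
}
```

As a bookmarklet it would be wrapped as `javascript:(function(){location.href = rewriteUrl(location.href);})()`.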

reply
forgotTheLast
4 hours ago
[-]
I actually think it is fitting to read about a government agency weaponized by an unscrupulous billionaire going after journalists working for an unscrupulous billionaire on an unscrupulous trillionaire owned platform.
reply
b8
3 hours ago
[-]
They can hold you in contempt for 18 months for not giving your password, https://arstechnica.com/tech-policy/2020/02/man-who-refused-....
reply
ElevenLathe
3 hours ago
[-]
Being held in contempt at least means you got a day in court first. A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.
reply
C6JEsQeQa5fCjE
1 hour ago
[-]
> A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.

Yes, a judge is unlikely to order your execution if you refuse. Based on recent pattern of their behavior, masked secret police who are living their wildest authoritarian dreams are likely to execute you if you anger them (for example by refusing to comply with their desires).

reply
noident
3 hours ago
[-]
That's a very unusual and narrow exception involving "foregone conclusion doctrine", an important fact missed by Ars Technica but elaborated on by AP: https://apnews.com/general-news-49da3a1e71f74e1c98012611aedc...
reply
OGWhales
2 hours ago
[-]
> Authorities, citing a “foregone conclusion exception” to the Fifth Amendment, argued that Rawls could not invoke his right to self-incrimination because police already had evidence of a crime. The 3rd Circuit panel agreed, upholding a lower court decision.

I do not follow the logic here, what does that even mean? It seems very dubious. And what happens if one legitimately forgets? They just get to keep you there forever?

reply
direwolf20
30 minutes ago
[-]
And why do they need to unlock your phone if they already proved you did the crime?
reply
seanw444
1 hour ago
[-]
You're delusional. When ICE starts executing people on the spot for not giving up iPhone passwords, I'll eat my words.
reply
OGWhales
1 hour ago
[-]
???
reply
teejmya
4 hours ago
[-]
I previously commented a solution to another problem, but it assists here too:

https://news.ycombinator.com/item?id=44746992

This command will make your MacBook hibernate when the lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down. The downside is that it increases the time it takes to resume.

A nice side benefit is that a fingerprint is not accepted on first unlock; I believe secrets are still encrypted at this stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).
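For reference, the standard way to get this behavior is via `pmset` (an assumption that this mirrors the linked comment; check `man pmset` before applying):

```shell
# Sketch of the hibernate setup described above (assumption: this mirrors
# the linked comment; verify against `man pmset` for your macOS version).
# hibernatemode 25: on sleep, write RAM to disk and power down completely,
# so keys are not held in memory while the lid is closed.
sudo pmset -a hibernatemode 25

# Verify the current setting:
pmset -g | grep hibernatemode
```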

reply
goda90
3 hours ago
[-]
Remember that our rights aren't laws of nature. They have to be fought for to be respected by the government.
reply
patrickmay
5 hours ago
[-]
Is the knowledge of which finger to use protected as much as a passcode? Law enforcement might have the authority to physically hold the owner's finger to the device, but it seems that the owner has the right to refuse to disclose which finger is the right one. If law enforcement doesn't guess correctly in a few tries, the device could lock itself and require the passcode.

Another reason to use my dog's nose instead of a fingerprint.

reply
parl_match
4 hours ago
[-]
I really wish Apple would offer a PIN option on macOS, for precisely this reason. Either that, or an option to automatically disable Touch ID after a short amount of time (e.g. an hour, or when my phone doesn't connect to the laptop).
reply
fpoling
4 hours ago
[-]
You can set up a separate account with a long password on macOS and remove your primary user account from the list of accounts that can unlock FileVault. Then you can change your primary account to use a short password. You can also adjust settings for how long the Mac must sleep before FileVault requires unlocking again.
reply
AnonHP
4 hours ago
[-]
I didn’t understand how a user that cannot unlock FileVault helps. Can you please elaborate on this setup? Thanks.
reply
fpoling
38 minutes ago
[-]
With that setup, on boot or after a long sleep you must first log in to the account with the longer password. Then you log out of that and switch to your primary account with the short password.
reply
xoa
4 hours ago
[-]
As another alternative, rather than using Touch ID you can set up a YubiKey or similar hardware key for login to macOS. Your login then becomes a PIN with three tries before lockout. That plus a complex password is pretty convenient but not biometric. It's what I've done for a long time on my desktop devices.
reply
Wistar
3 hours ago
[-]
On my Macbook Pro, I usually need to use both touch and a password but that might be only when some hours have passed between log ins.
reply
NetMageSCW
2 hours ago
[-]
You can script a time out if desired.
reply
redeeman
1 hour ago
[-]
Uhm, are you saying it's not possible to require an actual password to unlock macOS?
reply
thecapybara
3 hours ago
[-]
There are only ten possible guesses, and most people use their thumb and/or index finger, leaving four much likelier candidates.

Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so immediately the instant they see you - they could have someone follow you and watch to see how you unlock your phone before seizing it.

reply
z3phyr
3 hours ago
[-]
A 0.1 chance is by itself pretty good odds, and over n tries it's better still. Also, most people register Touch ID for two fingers, which brings the real-world odds closer to a coin flip.
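For the curious, those odds can be computed exactly. A quick sketch (assumes each guessed finger is distinct; `hitProbability` is a made-up name):

```javascript
// Chance of hitting a registered finger within n distinct guesses,
// if k of the 10 fingers are enrolled. Sketch of the odds the comments
// above are estimating; assumes k + n <= 10 and no repeated guesses.
function hitProbability(k, n) {
  // P(miss every guess) = (10-k)/10 * (9-k)/9 * ... over n draws
  // without replacement.
  let missEvery = 1;
  for (let i = 0; i < n; i++) {
    missEvery *= (10 - k - i) / (10 - i);
  }
  return 1 - missEvery;
}
```

With one enrolled finger and one guess this gives 0.1; with two enrolled fingers and three guesses it is about 0.53, matching the "close to half" estimate above.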
reply
deltastone
54 minutes ago
[-]
Also, having biometrics enabled on a device, and your biometrics actually unlocking it, does wonders for proving to a jury that you owned and operated that device. So you're doubly screwed in that regard.
reply
notyourwork
1 hour ago
[-]
I don't get why I can be forced to use my biometrics to unlock but cannot be forced to give up a PIN. It doesn't jibe in my brain.
reply
direwolf20
28 minutes ago
[-]
When they arrest you, they have physical control of your body. You're in handcuffs. They can put your fingers against the unlock button. You can make a fist, but they have more strength and leverage to pry it open.

There's no known technique to force you to input a password.

reply
deltastone
1 hour ago
[-]
It's something you know vs. something you have. That's how the legal system sees it. You might not tell someone the pin to your safe, but if police find the key to it, or hire a locksmith to drill out your safe, it's theirs with a warrant.

It's interesting in the case of social media companies. Technically the data held is the companies data (Google, Meta, etc.) however courts have ruled that a person still has an expectation of privacy and therefore police need a warrant.

reply
wan23
1 hour ago
[-]
The fifth amendment gives you the right to be silent, but they didn't write in anything about biometrics.
reply
rustyhancock
20 minutes ago
[-]
As far as I know, Lockdown Mode and the BFU state prevent Touch ID unlocking.

A password or PIN, at least, is something you choose to hand over.

reply
direwolf20
31 minutes ago
[-]
Remember, this isn't how it works in every country.
reply
joecool1029
1 hour ago
[-]
> they can't make you give them your password.

Except when they can: https://harvardlawreview.org/print/vol-134/state-v-andrews/

reply
p0w3n3d
3 hours ago
[-]
"Allowed to require" - a very mildly worded phrase, which could cover torture or abuse of force...

https://xkcd.com/538/

reply
mbil
5 hours ago
[-]
Reminder that you can press the iPhone power button five times to require passcode for the next unlock.
reply
rawgabbit
4 hours ago
[-]
Serious question. If I am re-entering the US after traveling abroad, can customs legally ask me to turn the phone back on and/or seize my phone? I am a US citizen.

Out of habit, I keep my phone off during the flight and turn it on after clearing customs.

reply
verall
4 hours ago
[-]
my understanding is that they can hold you for a couple days without charges for your insubordination but as a citizen they have to let you back into the country or officially arrest you, try to get an actual warrant, etc.
reply
direwolf20
28 minutes ago
[-]
they can just break the law
reply
Analemma_
4 hours ago
[-]
If you are a US citizen, you legally cannot be denied re-entry into the country for any reason, including not unlocking your phone. They can make it really annoying and detain you for a while, though.
reply
thecapybara
3 hours ago
[-]
Did you know that on most models of iPhone, saying "Hey Siri, whose iPhone is this?" will disable biometric authentication until the passcode is entered?
reply
rconti
2 hours ago
[-]
hm. didn't work on my 17 pro :( might be due to a setting i have.
reply
fragmede
2 hours ago
[-]
They disabled that in like iOS 18.
reply
kstrauser
5 hours ago
[-]
Or squeeze the power and volume buttons for a couple of seconds. It’s good to practice both these gestures so that they become reflex, rather than trying to remember them when they’re needed.
reply
regenschutz
3 hours ago
[-]
Sad, neither of those works on Android. Pressing the power button activates the emergency call screen with a countdown to call emergency services, and power + volume either just takes a screenshot or enables vibrations/haptics depending on which volume button you press.
reply
silisili
2 hours ago
[-]
Did you check your phone settings? Mine has an option to add it to the power menu, so you get to it by whichever method you use to do that (which itself is sad that phones are starting to differ in what the power key does).
reply
thallium205
3 hours ago
[-]
On Pixel phones, Power + Volume Up retrieves a menu where you can select "Lockdown".
reply
rationalist
3 hours ago
[-]
Not on my Pixel phone, that just sets it to vibrate instead of ring. Holding down the power button retrieves a menu where you can select "Lockdown".
reply
zerocrates
2 hours ago
[-]
On my 9 you get a setting to choose if holding Power gets you the power menu or activates the assistant (I think it defaulted to assistant? I have it set to the power menu because I don't really ever use the assistant.)
reply
rationalist
46 minutes ago
[-]
Yes, that was the default for me, but I changed it in settings.
reply
pkulak
4 hours ago
[-]
Oh wow, just going into the "should I shutdown" menu also goes into pre-boot lock state? I didn't know that.
reply
duskwuff
4 hours ago
[-]
It doesn't reenter a BFU state, but it requires a passcode for the next unlock.
reply
snuxoll
3 hours ago
[-]
It's close enough, because (most of) the encryption keys are wiped from memory every time the device is locked, and this action makes the secure enclave require PIN authentication to release them again.
reply
overfeed
2 hours ago
[-]
> It's close enough

Not really, because tools like Cellebrite are more limited against devices in BFU, hence the manuals instructing LEO to keep (locked) devices charged, and the countermeasure of iOS forcefully rebooting devices that have been locked for too long.

reply
CGMthrowaway
1 hour ago
[-]
There is a way now to force BFU from a phone that is turned on, I can't remember the sequence
reply
duskwuff
38 minutes ago
[-]
Eh? BFU ("before first unlock") is, by definition, the state that a phone is in when it is turned on. There's no need to "force" it.

If you mean forcing an iOS device out of BFU, that's impossible. The device's storage is encrypted using a key derived from the user's passcode. That key is only available once the user has unlocked the device once, using their passcode.

reply
paulsmith
5 hours ago
[-]
Alternately, hold the power button and either volume button together for a few seconds.
reply
tosapple
4 hours ago
[-]
This is the third person advocating button squeezing, as a reminder: IF a gun is on you the jig is up, you can be shot for resisting or reaching for a potential weapon. Wireless detonators do exist, don't f around please.
reply
fogzen
5 hours ago
[-]
In case anyone is wondering: in newer versions of macOS, the user must log out to require a password. Locking the screen no longer requires a password if Touch ID is enabled.
reply
alistairSH
4 hours ago
[-]
Is that actually true? I'm fairly confident my work Mac requires a password if it's idle more than a few days (typically over the weekend).
reply
jen729w
4 hours ago
[-]
Shift+Option+Command+Q is your fastest route there, but unsaved work will block.
reply
raw_anon_1111
3 hours ago
[-]
Settings -> lock screen -> “Require password after screen saver begins or display is turned off”
reply
neves
2 hours ago
[-]
I just searched for the case. I'm appalled. It looks like the USA doesn't have legal protection for reporters' sources. Or rather, Biden created some, but it was revoked by the current administration.

The real news here isn't privacy controls in a consumer OS or the right to privacy, but the USA, the leader of the free world, becoming an autocracy.

reply
raw_anon_1111
3 hours ago
[-]
As if the government is not above breaking the law and using rubber hose decryption. The current administration’s justice department has been caught lying left and right
reply
direwolf20
27 minutes ago
[-]
Plausible deniability would still work: you enter your duress code and the system boots into a secondary partition with Facebook and Snapchat. No such OS exists, though.
reply
TheDong
5 hours ago
[-]
I find it so frustrating that Lockdown Mode is so all-or-nothing.

I want some of the lockdown stuff (No facetime and message attachments from strangers, no link previews, no device connections), but like half of the other ones I don't want.

Why can't I just toggle an iMessage setting for "no link preview, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dickpicks from strangers on iMessage" without also turning off my browser's javascript JIT and a bunch of other random crap?

Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.

Just to go through the features I don't want:

* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more

* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them

* Configuration profiles - I need this to install custom fonts

Apple's refusal to split out more granular options here hurts my security.

reply
quizzical8432
1 hour ago
[-]
I’m with you on the shared photo albums. I’d been using lockdown mode for quite a while before I discovered this limitation, though. For me, this is one I’d like to be able to selectively enable (like the per-website/app settings). In my case, it was a one-off need, so I disabled lockdown mode, shared photos, then enabled it again.

The other feature I miss is Screen Time requests. This one is kinda weird - I'm sure there's a reason they're blocked, but it's a message from Apple (or directly from a trusted family member? I'm not 100% sure how they work). I still _receive_ the notification, but it's not actionable.

While I share your frustration, I do understand why Apple might want to keep it all-or-nothing. If users can re-enable even one "dangerous" setting, that ultimately compromises the entire security model. An attacker doesn't care which way they get into your device; if there's _one_ way in, that's all they need.

Ultimately, for me the biggest PiTA with lockdown mode is not knowing if it’s to blame for a problem I’m having. I couldn’t tell you how many times I’ve disabled and re-enabled it just to test something that should work, or if it’s the reason a feature/setting is not showing up. To be fair, most of the time it’s not the issue, but sometimes I just need to rule it out.

reply
Terretta
5 hours ago
[-]
The profiles language may be confusing -- what you can't do is change them while in Lockdown mode.
reply
ectospheno
5 hours ago
[-]
Family albums work with lockdown mode. You can also disable web restrictions per app and website.
reply
everdrive
3 hours ago
[-]
>* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more

This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.

reply
rantingdemon
2 hours ago
[-]
I'll bite. Why is it so terrible? I'm browsing this site right now on my phone and don't see the horror.
reply
mghackerlady
2 hours ago
[-]
Phone networks by design track you more precisely than is possible over a conventional internet connection, in order to hand you off automatically to the nearest available tower. For similar reasons, the network also has to know that a given phone is yours.
reply
LoganDark
1 hour ago
[-]
You don't need to connect to the internet for that. It has nothing to do with web browsing at all.
reply
jgwil2
3 hours ago
[-]
I think that ship has sailed.
reply
eth0up
1 minute ago
[-]
Every time I see these articles about iphones posing trouble for authorities, I always think of it as free (and fraudulent) advertisement.

I could be naive, but just don't think they'd really have any difficulty getting what they needed. Not that I give a fuck, but I guess I've seen one too many free ads.

reply
nxobject
6 hours ago
[-]
Sadly, they still got to her Signal via her desktop - her sources might still be compromised. It's inherent to desktop applications, but I wish more people knew that Signal for Desktop is much, much less secure against adversaries who have your laptop.
reply
tadzikpk
3 hours ago
[-]
> I'm sad that a lot more people don't know that Signal for Desktop is much, much less secure against adversaries with your laptop

Educate us. What makes it less secure?

reply
armadyl
2 hours ago
[-]
In addition to what the other person who replied said, ignoring that iOS/Android/iPadOS is far more secure than macOS, laptops have significantly less hardware-based protections than Pixel/Samsung/Apple mobile devices do. So really the only way a laptop in this situation would be truly secure from LEO is if its fully powered off when it’s seized.
reply
digiown
3 hours ago
[-]
My assumption is that in the desktop version the key is not always stored in a secure enclave (plaintext storage is definitely supported). Theoretically this makes it possible to extract the key for the message database, and a different malicious program could read it. But this is moot anyway if the FBI can simply browse through the chats - that isn't what failed here.
reply
anigbrowl
1 hour ago
[-]
Also, last time I looked (less than a year ago), files sent over Signal are stored in plaintext on desktop, just with obfuscated filenames. So even without access to Signal itself, it's easy to see what message attachments a person has received, and to copy any interesting ones.
reply
stronglikedan
5 hours ago
[-]
If people don't have Signal set to delete sensitive messages quickly, then they may as well just be texting.
reply
AdamN
5 hours ago
[-]
That's a strong statement. Also imho it's important that we use Signal for normal stuff like discussing where to get coffee tomorrow - no need for disappearing messages there.
reply
CGMthrowaway
1 hour ago
[-]
Strong and accurate. Saying that non-disappearing messages are equivalent to texts is not the same as saying all Signal messages ought to disappear or else the app is useless.

Telegram lets you set a distinct disappearing-message timer for each chat/group. Not sure how it works on Signal, but a solution like this could be possible.

reply
aschobel
4 hours ago
[-]
I'm weird, i even have disappearing messages for my coffee chats. It's kind of refreshing not having any history.
reply
zikduruqe
3 hours ago
[-]
I'm an inbox zero person... I keep even my personal notes to disappear after 2 days. For conversations 1 day.
reply
tptacek
4 hours ago
[-]
Not if you're using Signal for life-and-death secure messaging; in that scenario it's table stakes.
reply
mrandish
4 hours ago
[-]
I would have thought reporters with confidential sources at that level would already exercise basic security hygiene. Hopefully, this incident is a wake up call for the rest.
reply
NewsaHackO
5 hours ago
[-]
Yea, I also would want to question the conclusions in the article. Was the issue that they couldn't unlock the iPhone, or that they had no reason to pursue the thread? To my understanding, the Apple ecosystem means that everything is synced together. If they already got into her laptop, wouldn't all of the iMessages, call history, and iCloud material already be synced there? What would be the gain of going after the phone, other than to make the case slightly more watertight?
reply
pbhjpbhj
6 hours ago
[-]
Did she have Bitlocker or FileVault or other disk encryption that was breeched? (Or they took the system booted as TLAs seek to do?)
reply
deltastone
6 minutes ago
[-]
Bitlocker isn't secure, for several reasons, that I won't get into on here.
reply
bmicraft
4 hours ago
[-]
There was a story here the other day: BitLocker keys stored in your Microsoft account will be handed over.
reply
deltastone
4 minutes ago
[-]
This has been known for a while, though I don't know if the typical layperson was aware until recently. People need to remember that any access a company has to a device, law enforcement has too with a warrant - even more so once you get into federal resources and FISA courts.
reply
direwolf20
26 minutes ago
[-]
Which Windows does by default and makes hard to turn off.
reply
MoonWalk
5 hours ago
[-]
breached
reply
827a
4 hours ago
[-]
Is there an implication here that they could get into an iPhone with lower security settings enabled? There's Advanced Data Protection, which E2EEs more of your data in iCloud. There's the FaceID unlock state, which US law enforcement can compel you to unlock; but penta-click the power button and you go into PIN unlock state, which they cannot compel you to unlock.

My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?

reply
kingnothing
2 hours ago
[-]
It's relatively well known that NSO Group's Pegasus is what governments use to access locked phones.
reply
827a
49 minutes ago
[-]
That was known in the past, but since it relies on zero-days that Apple and Google are adversarially trying to keep up with and patch, my assumption is not that Pegasus can breach a fully updated iPhone at any given moment. Rather, there are probably windows of a few months at a time where they have a working exploit, until Apple discovers and patches it, repeating indefinitely.
reply
direwolf20
26 minutes ago
[-]
How does Apple discover their exploits? I'm sure they keep some around for extremely high value targets.
reply
zymhan
3 hours ago
[-]
Yes
reply
macintux
7 hours ago
[-]
> Natanson said she does not use biometrics for her devices, but after investigators told her to try, “when she applied her index finger to the fingerprint reader, the laptop unlocked.”

Curious.

reply
QuantumNomad_
7 hours ago
[-]
Probably enabled it at some point and forgot. Perhaps even during setup when the computer was new.
reply
intrasight
4 hours ago
[-]
My recollection is the computers do by default ask the user to set up biometrics
reply
NewsaHackO
5 hours ago
[-]
I want to say that was generous of her, but one thing is weird: if I didn't want someone getting into my laptop and they tried to force me to use my fingerprint to unlock it, I definitely wouldn't use the finger that unlocks it on the first try. Hopefully Apple locks it out and forces a password if you use the wrong finger "accidentally" a couple of times.
reply
altairprime
4 hours ago
[-]
Correct. That’s why my Touch ID isn’t configured to use the obvious finger.
reply
b112
7 hours ago
[-]
Very much so, because the question is... did she set it up in the past?

How did it even know the print?

reply
ezfe
6 hours ago
[-]
Why is this curious?
reply
macintux
5 hours ago
[-]
There appear to be a relatively few possibilities.

* The reporter lied.

* The reporter forgot.

* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).

* The government hacked the computer such that it would unlock this way (probably impossible as well).

* The fingerprint security is much worse than years of evidence suggests.

Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.

reply
orwin
3 hours ago
[-]
My opinion is that she set it up, it didn't work at first, she didn't use it, forgot that it existed, and here we are.

> Apple devices share fingerprint matching details and another device had her details

I looked into this quite seriously for Windows ThinkPads; unless Apple does it differently, you cannot share fingerprints - they live in a local chip and never leave it.

reply
fragmede
2 hours ago
[-]
So how does TouchID on an external keyboard work without having to re-set up fingerprints?
reply
piperswe
1 hour ago
[-]
Presumably the fingerprint data is stored in the Mac's Secure Enclave, and the external keyboard is just a reader
reply
ezfe
3 hours ago
[-]
The reporter lying or forgetting seems to be the clear answer, there's really no reason to believe it's not one of those. And the distinction between the two isn't really important from a technical perspective.

Fingerprint security being poor is also unlikely, because that would only apply if a different finger had been registered.

reply
dyauspitr
6 hours ago
[-]
She has to have set it up before; there is no other way to divine a fingerprint. The only alternative I can imagine is a faulty fingerprint sensor, but that should default to denying entry.
reply
quesera
5 hours ago
[-]
> faulty fingerprint sensor

The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).

reply
d1sxeyes
44 minutes ago
[-]
If you're interested in this in more detail, check this out:

https://blackwinghq.com/blog/posts/a-touch-of-pwn-part-i/

reply
giraffe_lady
6 hours ago
[-]
Could be a parallel construction type thing. They already have access but they need to document a legal action by which they could have acquired it so it doesn't get thrown out of court.

I think this is pretty unlikely here but it's within the realm of possibility.

reply
tsol
6 hours ago
[-]
Seems like it would be hard to fake. The way she tells it, she put her finger on the pad and the OS unlocked the account. That sounds very difficult to stage.
reply
operator-name
5 hours ago
[-]
I think they mean that if they already had her fingerprint from somewhere else, plus a secret backdoor into the laptop, they could log in, set up biometrics, and pretend they got first access when she unlocked it - all without revealing their backdoor.
reply
niemandhier
1 hour ago
[-]
Depending on your jurisdiction, Face ID is safer than a fingerprint, because Face ID won't unlock while your eyes are closed.

In many European countries, forcing your finger onto a scanner would be permissible under certain circumstances; forcing your eyes open has so far been deemed unacceptable.

reply
1vuio0pswjnm7
3 hours ago
[-]
"Lockdown Mode is a sometimes overlooked feature of Apple devices that broadly make[sic] them harder to hack."

Funny to see disabling "features" itself described as "feature"

Why not call it a "setting"

Most iPhone users do not change default settings. That's why Google pays Apple billions of dollars for a default setting that sends data about users to Google

"Lockdown Mode" is not a default setting

The phrase "sometimes overlooked" is an understatement. It's not a default setting and almost no one uses it

If it is true Lockdown Mode makes iPhones "harder to hack", as the journalist contends, then it is also true that Apple's default settings make iPhones "easier to hack"

reply
rick_dalton
2 hours ago
[-]
The intention behind Lockdown Mode is protection for a select few groups of people, such as journalists, who are at risk of having software like Pegasus used against them. It's meant to reduce the attack surface. The average user wouldn't want most of it as a default: almost no message attachments allowed, no FaceTime calls from people you haven't called, and Safari is kneecapped. Making this a default setting for most people is unrealistic, and it probably wouldn't help their cybersecurity anyway, since they wouldn't be targeted.
reply
1vuio0pswjnm7
1 hour ago
[-]
A "reduced attack surface" can also be a reduced surface for telemetry, data collection, surveillance and advertising services, thereby directly or indirectly causing a reduction in Apple revenues

Perhaps this could be a factor in why it's not a default setting

reply
throwmeaway820
7 hours ago
[-]
It seems unfortunate that enhanced protection against physically attached devices requires enabling a mode that is much broader, and sounds like it has a noticeable impact on device functionality.

I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"

reply
jonpalmisc
6 hours ago
[-]
There is a setting as of iOS 26 under "Privacy & Security > Wired Accessories" in which you can make data connections always prompt for access. Not that there haven't been bypasses for this before, but perhaps still of interest to you.
reply
H8crilA
7 hours ago
[-]
GrapheneOS does this by default - only power delivery when locked. Also it's a hardware block, not software. Seems to be completely immune to these USB exploit tools.
reply
aaronmdjones
6 hours ago
[-]
It also has various options to adjust the behaviour, from no blocks at all to not even being able to charge the phone (or use the phone to charge something else), even when unlocked. Changing the mode of operation requires the device PIN, just as changing the device PIN does.

Note that it behaves subtly differently from how you described if the phone was connected to something before being locked. In that case data access will remain, even though the phone is now locked, until the device is disconnected.

reply
Terretta
5 hours ago
[-]
> I would totally enable an "enhanced protection for external accessories" mode.

Anyone has been able to do this for over a decade now, and it's fairly straightforward:

- 2014: https://www.zdziarski.com/blog/?p=2589

- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...

This goes beyond the "wired accessories" toggle.

reply
pkteison
6 hours ago
[-]
It isn’t. Settings > Privacy & Security > Wired Accessories

Set to ask for new accessories or always ask.

reply
sodality2
5 hours ago
[-]
I have to warn you, it does get annoying when you plug in your power-only cable and it still nags you with the question. But it does work as intended!
reply
neilalexander
3 hours ago
[-]
You might want to check that charger. I have the same option set to ask every time and it never appears for chargers.
reply
mrandish
3 hours ago
[-]
> it has a noticeable impact on device functionality.

The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.

reply
UltraSane
7 hours ago
[-]
Computer security is generally inversely proportional to convenience. Best opsec is generally to have multiple devices.
reply
ur-whale
7 hours ago
[-]
> I never attach my iPhone to anything that's not a power source.

It's "attached" to Wi-Fi and to the cell network. Pretty much the same thing.

reply
boring-human
7 hours ago
[-]
Can a hacked phone (such as one that was not in Lockdown Mode at one point in time) persist in a hacked state?

Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?

This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.

reply
quenix
6 hours ago
[-]
Forget OS updates. The biggest obstacle to exploit persistence: a good old hard system reboot.

Modern iOS has an incredibly tight secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run all the way from secureROM to iOS userland. The exception is if the secureROM is somehow compromised and exploited, but that requires hardware access at boot time, so I don't buy it.

So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).

But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of USD); persistence is an order of magnitude more difficult.

reply
doublerabbit
6 hours ago
[-]
It's why I keep my old iPhone XR on 15.x for jailbreaking reasons. I purchased a new phone specifically for the later versions and online banking.

Apple bought out all the jailbreakers, as Denuvo did the game crackers.

reply
noname120
5 hours ago
[-]
> Apple bought out all the jail breakers as Denuvo did for the game crackers

Do you have sources for these statements?

reply
doublerabbit
1 hour ago
[-]
Like anything in that field, it's more NDA-bound and anecdotal.

> in 2018, the prominent Denuvo cracker known as "Voksi" (of REVOLT) was arrested in Bulgaria following a criminal complaint from Denuvo.

https://www.dsogaming.com/news/denuvo-has-sued-revolts-found...

That's how you get off such charges: "I'll work for you if you drop the charges." There was a Reddit post I can't find from when EMPRESS had one of her episodes and was asked if she wanted to work for them. It's happened in the cracking scene before.

> The jailbreaking community is fractured, with many of its former members having joined private security firms or Apple itself. The few people still doing it privately are able to hold out for big payouts for finding iPhone vulnerabilities. And users themselves have stopped demanding jailbreaks, because Apple simply took jailbreakers’ best ideas and implemented them into iOS.

https://www.vice.com/en/article/iphone-jailbreak-life-death-...

And from the jail break community discord.

reply
digiown
7 hours ago
[-]
Secure boot and verified system partition is supposed to help with that. It's for the same reason jailbreaks don't persist across reboots these days.
reply
nxobject
6 hours ago
[-]
Re: reboots – TFA states that recent iPhones reboot every 3 days when inactive for the same reasons. Of course, now that we know that it's linked to inactivity, black hatters will know how to avoid it...
reply
maldev
4 hours ago
[-]
You should read up on iOS internals before commenting stuff like this. Your answer is wrong: rootkits have been dead on most OSes for years, and ESPECIALLY on iOS. Not every OS is like Linux, where security comes second.

Even a cursory glance would show it's literally impossible on iOS.

reply
ramuel
3 hours ago
[-]
Can't they just use Pegasus or Cellebrite?
reply
aw1621107
45 minutes ago
[-]
It's unlikely that Pegasus would work since Apple patched the exploit it used.

I think it's unclear whether Cellebrite can or cannot get around Lockdown Mode as it would depend very heavily on whether the technique(s)/exploit(s) Cellebrite uses are suitable for whatever bugs/vulnerabilities remain exposed in Lockdown Mode.

reply
cdrnsf
2 hours ago
[-]
Given Cook's willing displays of fealty to Trump this time around I wouldn't be shocked if they were to remove lockdown mode in a future release.
reply
aquir
7 hours ago
[-]
We need a Lockdown mode for MacBooks as well!
reply
steve-atx-7600
7 hours ago
[-]
Looks like it’s a feature: https://support.apple.com/en-us/105120
reply
LordGrey
6 hours ago
[-]
To save a click:

* Lockdown Mode needs to be turned on separately for your iPhone, iPad, and Mac.

* When you turn on Lockdown Mode for your iPhone, it's automatically turned on for your paired Apple Watch.

* When you turn on Lockdown Mode for one of your devices, you get prompts to turn it on for your other supported Apple devices.

reply
mmooss
5 hours ago
[-]
Don't be idiots. The FBI may say that whether or not they can get in:

1. If they can get in, now people - including high-value targets like journalists - will use bad security.

2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in or reveal their capabilities to all adversaries, including to even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.

3. Similarly if Apple helped them, Apple might insist that is not revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)

Also, the target might think they are safe, which could be a tactical advantage. It also may exclude recovered data from rules of handling evidence, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit to this OS version someday, and the FBI can try again then.

reply
coppsilgold
1 hour ago
[-]
I would not recommend that one trust a secure enclave with full disk encryption (FDE). This is what you are doing when your password/PIN/fingerprint can't contain sufficient entropy to derive a secure encryption key.

The problem with low-entropy security measures arises from the fact that the low-entropy input merely instructs the secure enclave (TEE) to release/use the actual high-entropy key. So the key must be stored physically (e.g. as voltage levels) somewhere in the device.

It's a similar story when the device is locked, on most computers the RAM isn't even encrypted so a locked computer is no major obstacle to an adversary. On devices where RAM is encrypted the encryption key is also stored somewhere - if only while the device is powered on.
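To put rough numbers on that entropy gap (a back-of-envelope sketch, not tied to any particular device's key-derivation scheme), compare a 6-digit PIN to the 256-bit keys typically used for FDE:

```python
import math

# Entropy of a 6-digit numeric PIN: log2 of the number of possible PINs.
pin_entropy_bits = math.log2(10 ** 6)  # ~19.9 bits

# A modern FDE key is typically 256 random bits.
key_entropy_bits = 256

print(f"6-digit PIN:  {pin_entropy_bits:.1f} bits")
print(f"AES-256 key: {key_entropy_bits} bits")

# Without the enclave's rate limiting, an attacker who can guess
# offline only needs to try at most 10^6 PINs, which is why the
# real key has to live in hardware rather than be derived from
# the PIN alone.
max_guesses = 10 ** 6
print(f"Worst-case offline guesses: {max_guesses:,}")
```

The hypothetical guess rate aside, the point stands: a PIN on its own cannot carry enough entropy to be the encryption key, so the enclave must hold (and gate access to) the real one.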

reply
pregnenolone
34 minutes ago
[-]
RAM encryption doesn't prevent DMA attacks, and performing a DMA attack is quite trivial as long as the machine is running. Secure enclaves do prevent those, and they're a good solution. If implemented correctly, they have no downsides. I'm not referring to TPMs, due to their inherent flaws; I'm talking about SoC crypto engines like those found in Apple's M series or Intel's latest Panther Lake lineup. They prevent DMA attacks and side-channel vulnerabilities. True, I wouldn't trust any secure enclave never to be breached; that's an impossible promise to make, even though breaking one would require a nation-state-level attack. But even this concern can be easily addressed by making the final encryption key depend on both software key derivation and the secret stored within the enclave.
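A minimal sketch of that last idea: bind the disk key to both a passphrase (via a software KDF) and a device-bound secret, so neither input alone reconstructs the key. The `enclave_secret` here is just a stand-in bytes value; a real enclave performs the mixing step internally and never exposes its secret.

```python
import hashlib
import hmac
import os

def derive_disk_key(passphrase: bytes, enclave_secret: bytes, salt: bytes) -> bytes:
    # Step 1: slow software key derivation from the user's passphrase.
    kdf_out = hashlib.pbkdf2_hmac("sha256", passphrase, salt, iterations=600_000)
    # Step 2: mix in the device-bound secret. An attacker needs both
    # the passphrase and physical access to the enclave's secret.
    return hmac.new(enclave_secret, kdf_out, hashlib.sha256).digest()

salt = os.urandom(16)
secret = os.urandom(32)  # stand-in for the enclave-held secret
key = derive_disk_key(b"correct horse battery staple", secret, salt)
assert len(key) == 32  # 256-bit key
```

With this construction, extracting the enclave secret alone still leaves the attacker facing a full offline KDF search over the passphrase.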
reply
KKKKkkkk1
6 hours ago
[-]
What is she investigated for?
reply
buckle8017
6 hours ago
[-]
They're not actually investigating her; they're investigating a source that leaked classified materials to her.
reply
zozbot234
2 hours ago
[-]
If they're not investigating her she doesn't have any 5th-amendment protection and can be compelled to testify on anything relevant, including how to unlock her devices.
reply
deltastone
15 seconds ago
[-]
This is true. 5th Amendment protections only protect you from SELF-incrimination and, in some ways, your spouse. They do not apply to protecting others. Some have tried arguing that they are protecting themselves, but that requires effectively admitting to having committed ANOTHER crime, which doesn't look good to a jury.
reply
jimt1234
3 hours ago
[-]
Did the individual store the classified material in the bathroom at his beach-side resort?
reply
PlatoIsADisease
6 hours ago
[-]
A little too late for the 1,000+ people hacked by Pegasus.
reply
ChrisArchitect
6 hours ago
[-]
Previously, direct link to the court doc:

FBI unable to extract data from iPhone 13 in Lockdown Mode in high profile case [pdf]

https://storage.courtlistener.com/recap/gov.uscourts.vaed.58...

(https://news.ycombinator.com/item?id=46843967)

reply
kittikitti
5 hours ago
[-]
It sounds like almost all of our devices have security by annoyance as default. Where are the promises of E2E encryption and all the privacy measures? When I turned on lockdown mode on my iPhone, there were a few notifications where the random spam calls I get were attempting a FaceTime exploit. How come we have to wait until someone can prove ICE can't get into our devices?
reply
davidfekke
4 hours ago
[-]
I guess they got a 404
reply
mrexcess
7 hours ago
[-]
I trust 404 media more than most sources, but I can’t help but reflexively read every story prominently showcasing the FBI’s supposed surveillance gaps as attempted watering hole attacks. The NSA almost certainly has hardware backdoors in Apple silicon, as disclosed a couple of years ago by the excellent researchers at Kaspersky. That being the case, Lockdown Mode is not even in play.
reply
chuckadams
7 hours ago
[-]
The NSA is not going to tip its hand about any backdoors it had built into the hardware for something as small as this.
reply
ddtaylor
6 hours ago
[-]
It depends on whether parallel construction can be used to provide deniability.
reply
chuckadams
6 hours ago
[-]
Even a parallel construction has limited uses, since you can't use the same excuse every time. The NSA probably doesn't trust the FBI to come up with something plausible.
reply
UltraSane
7 hours ago
[-]
Samsung phones have the Secure Folder which can have a different, more secure password and be encrypted when the phone is on.
reply
Itoldmyselfso
6 hours ago
[-]
Secure Folder uses, or is in the process of moving to, Android's native Private Space feature, which is available on all Android 15 phones.
reply
delichon
6 hours ago
[-]
I use the Cryptomator app for this, it works as advertised. I keep ~60 GiB of personal files in there that would be an easy button to steal my identity and savings. I'm just hoping it doesn't include an NSA back door.
reply
piperswe
1 hour ago
[-]
The NSA definitely has easier ways to steal your identity and savings if they wanted to anyways
reply
vorticalbox
6 hours ago
[-]
You can check the GitHub repo: https://github.com/cryptomator/ios
reply
delichon
6 hours ago
[-]
Even if I had the skills to confirm the code is secure, how could I know that this is the code running on my phone, without also having the skills to build and deploy it from source?
reply
warkdarrior
4 hours ago
[-]
Also, you need to make sure that the installation process does not insert a backdoor into the code you built from source.
reply
fragmede
1 hour ago
[-]
reply
mandeepj
7 hours ago
[-]
For now! They'll get something from the open market, like last time, when Apple refused to unlock a phone for them.
reply
PlatoIsADisease
4 hours ago
[-]
Yeah, this is low-stakes stuff; Pegasus historically breaks Apple phones easily. Bezos's nudes and Khashoggi know. (Not really; Khashoggi is dead.)
reply
PunchyHamster
5 hours ago
[-]
They just need to ask Apple to unlock it. And Apple can't really refuse under US law.
reply
quesera
5 hours ago
[-]
They can refuse, and they have refused. See San Bernardino and the concept of "compelled work".
reply
direwolf20
22 minutes ago
[-]
That was the old US law, not the one where Tim Cook delivered gold bars to Trump
reply
dec0dedab0de
7 hours ago
[-]
Every time something like this happens I assume it is a covert marketing campaign.

If the government wants to get in, they're going to get in. They can also hold you in contempt until you comply.

Don't get me wrong, it's a good thing that law enforcement can't easily access this on their own. It just feels like the government is working with Apple here to help move some phones.

reply
Cthulhu_
6 hours ago
[-]
Better to be held in contempt than to give up constitutional rights under pressure. Most functioning democracies have and defend the right to a free press, protect press sources, and can't make you incriminate yourself.

Anyway, it's a good thing to be skeptical about claims that iphones can't be hacked by government agencies, as long as it doesn't mean you're driven to dodgier parties (as those are guaranteed honeypots).

reply
pc86
7 hours ago
[-]
"Government propaganda to help one of the richest companies in the history of the world sell 0.000000001% more phones this quarter" is quite frankly just idiotic.

You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."

If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.

reply