https://x.com/runasand/status/2017659019251343763?s=20
The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, which meant agents could compel her to unlock it with her fingerprint (biometric unlocks generally get weaker legal protection than passcodes).
A few bookmarklets:
javascript:(function(){if (location.host.endsWith('x.com')) location.host='xcancel.com';})()
javascript:(function(){if (location.host.endsWith('youtube.com')) location.host='inv.nadeko.net';})()
javascript:(function(){if (location.hostname.endsWith('instagram.com')) {location.replace('https://imginn.com' + location.pathname);}})()
[1] https://www.reddit.com/r/uBlockOrigin/comments/1cc0uon/addin...
Yes, a judge is unlikely to order your execution if you refuse. Based on the recent pattern of their behavior, masked secret police who are living their wildest authoritarian dreams are likely to execute you if you anger them (for example, by refusing to comply with their desires).
I do not follow the logic here. What does that even mean? It seems very dubious. And what happens if someone legitimately forgets? Do they just get to keep you there forever?
https://news.ycombinator.com/item?id=44746992
This command will make your MacBook hibernate when the lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down. The downside is that it increases the time it takes to resume.
A nice side benefit, though, is that a fingerprint is not accepted on first unlock; I believe secrets are still encrypted at this stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).
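For reference, the command being discussed is presumably pmset's hibernatemode setting (an assumption on my part; check `man pmset` for the modes your hardware supports):

```shell
# Show current power management settings (hibernatemode, standby, ...)
pmset -g

# hibernatemode 25: on sleep, write RAM to disk and power the system
# down entirely (true hibernate). The laptop default is 3 (sleep with
# safe-sleep: RAM stays powered, but is also written to disk).
sudo pmset -a hibernatemode 25

# To revert to the laptop default:
# sudo pmset -a hibernatemode 3
```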
Another reason to use my dog's nose instead of a fingerprint.
Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so the instant they see you - they could have someone follow you and watch how you unlock your phone before seizing it.
There's no known technique to force you to input a password.
It's interesting in the case of social media companies. Technically the data held is the company's (Google's, Meta's, etc.); however, courts have ruled that a person still has an expectation of privacy, and therefore police need a warrant.
At least a password or PIN is something you choose whether to hand over.
Except when they can: https://harvardlawreview.org/print/vol-134/state-v-andrews/
Out of habit, I keep my phone off during the flight and turn it on after clearing customs.
Not really, because tools like Cellebrite are more limited with BFU (Before First Unlock) - hence the manual informing LEO to keep (locked) devices charged, and the countermeasure of iOS forcefully rebooting devices that have been locked for too long.
If you mean forcing an iOS device out of BFU, that's impossible. The device's storage is encrypted using a key derived from the user's passcode. That key is only available once the user has unlocked the device once, using their passcode.
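To illustrate the idea (this is not Apple's actual construction; the real scheme runs inside the Secure Enclave and entangles the passcode with a per-device hardware key that never leaves the chip; this is just a conceptual sketch):

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Illustrative KDF. Apple's real derivation runs inside the
    # Secure Enclave and mixes in a hardware UID, so it can't even
    # be brute-forced off-device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy cipher for illustration only; real data protection uses AES.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

salt = os.urandom(16)
key = derive_key("123456", salt)
ciphertext = xor_cipher(key, b"journalist's notes")

# Before first unlock, only the salt and ciphertext exist on flash.
# Without the passcode, the decryption key cannot be reconstructed.
assert xor_cipher(derive_key("123456", salt), ciphertext) == b"journalist's notes"
assert xor_cipher(derive_key("000000", salt), ciphertext) != b"journalist's notes"
```

That's why BFU is the hard state for forensics tools: until the passcode is entered once, the key simply isn't anywhere in memory to extract.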
The real news here isn't privacy control in a consumer OS or the right to privacy, but the USA, the leader of the free world, becoming an autocracy.
I want some of the lockdown stuff (No facetime and message attachments from strangers, no link previews, no device connections), but like half of the other ones I don't want.
Why can't I just toggle an iMessage setting for "no link preview, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dick pics from strangers on iMessage" without also turning off my browser's javascript JIT and a bunch of other random crap?
Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.
Just to go through the features I don't want:
* Lockdown Mode disables the javascript JIT in the browser - I want fast javascript, some websites and apps I use cannot function without it, and non-JIT js drains the battery more
* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them
* Configuration profiles - I need this to install custom fonts
Apple's refusal to split out more granular options here hurts my security.
The other feature I miss is screen time requests. This one is kinda weird - I’m sure there’s a reason they’re blocked, but it’s a message from Apple (or, directly from a trusted family member? I’m not 100% sure how they work). I still _receive_ the notification, but it’s not actionable.
While I share your frustration, I do understand why Apple might want to have it as “all-or-nothing”. If they allow users to enable even one “dangerous” setting, that ultimately compromises the entire security model. An attacker doesn’t care which way they compromise your device. If there’s _one_ way in, that’s all they need.
Ultimately, for me the biggest PiTA with lockdown mode is not knowing if it’s to blame for a problem I’m having. I couldn’t tell you how many times I’ve disabled and re-enabled it just to test something that should work, or if it’s the reason a feature/setting is not showing up. To be fair, most of the time it’s not the issue, but sometimes I just need to rule it out.
This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.
I could be naive, but just don't think they'd really have any difficulty getting what they needed. Not that I give a fuck, but I guess I've seen one too many free ads.
Educate us. What makes it less secure?
Telegram allows you to have distinct disappearing settings for each chat/group. Not sure how it works on Signal, but a solution like this could be possible.
My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?
Curious.
How did it know the print even?
* The reporter lied.
* The reporter forgot.
* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).
* The government hacked the computer such that it would unlock this way (probably impossible as well).
* The fingerprint security is much worse than years of evidence suggests.
Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.
> Apple devices share fingerprint matching details and another device had her details
I looked into this quite seriously for Windows ThinkPads: unless Apple does it differently, you cannot share fingerprints - they live in a local chip and never leave it.
Fingerprint security being poor is also unlikely, because that would only apply if a different finger had been registered.
The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).
I think this is pretty unlikely here but it's within the realm of possibility.
In many European countries forcing your finger on a scanner would be permissible under certain circumstances, forcing your eyes open so far has been deemed unacceptable.
Funny to see disabling "features" itself described as "feature"
Why not call it a "setting"
Most iPhone users do not change default settings. That's why Google pays Apple billions of dollars for a default setting that sends data about users to Google
"Lockdown Mode" is not a default setting
The phrase "sometimes overlooked" is an understatement. It's not a default setting and almost no one uses it
If it is true Lockdown Mode makes iPhones "harder to hack", as the journalist contends, then it is also true that Apple's default settings make iPhones "easier to hack"
Perhaps this could be a factor in why it's not a default setting
I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"
Note that it behaves subtly differently from how you described if the phone was connected to something before being locked. In that case data access remains - even though the phone is now locked - until the device is disconnected.
Anyone has been able to do this for over a decade now, and it's fairly straightforward:
- 2014: https://www.zdziarski.com/blog/?p=2589
- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...
This goes beyond the "wired accessories" toggle.
Set to ask for new accessories or always ask.
The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.
It's "attached" to the wifi and to the cell network. Pretty much the same thing.
Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?
This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.
Modern iOS has an incredibly tight, secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run, all the way from secureROM to iOS userland. The exception is if the secureROM is somehow compromised and exploited (this requires hardware access at boot time, so I don't buy it as a remote threat).
So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).
But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of US dollars); persistence is an order of magnitude more difficult.
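The chain-of-trust idea can be sketched in miniature: each boot stage refuses to run the next unless its signature checks out, which is why persistence requires either breaking a signature or re-exploiting the device on every boot. (Toy model: a symmetric HMAC stands in for Apple's asymmetric code signing, and the stage names are just labels.)

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's signing key, baked into ROM.
# In reality this is an asymmetric public key; an HMAC secret
# plays that role here for simplicity.
APPLE_KEY = b"apple-signing-key"

def sign(code: bytes) -> bytes:
    return hmac.new(APPLE_KEY, code, hashlib.sha256).digest()

def boot(stages):
    # Each (code, signature) stage must verify before it is "executed";
    # any mismatch halts the boot rather than running untrusted code.
    for code, sig in stages:
        if not hmac.compare_digest(sign(code), sig):
            return False
    return True

chain = [(c, sign(c)) for c in [b"secureROM", b"iBoot", b"kernel", b"userland"]]
assert boot(chain) is True

# Tampering with any stage on disk breaks the chain:
tampered = list(chain)
tampered[2] = (b"malicious kernel", chain[2][1])
assert boot(tampered) is False
```

So malware written to flash won't survive a clean reboot unless it can forge a signature; the practical alternative is planting a malicious input that re-triggers the original exploit after boot.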
Apple bought out all the jailbreakers, as Denuvo did for the game crackers.
Do you have sources for these statements?
> in 2018, the prominent Denuvo cracker known as "Voksi" (of REVOLT) was arrested in Bulgaria following a criminal complaint from Denuvo.
https://www.dsogaming.com/news/denuvo-has-sued-revolts-found...
That's how you get off such charges: "I'll work for you if you drop the charges." There was a reddit post I can't find from when EMPRESS had one of their episodes, where she was asked if she wanted to work for them. It's happened in the cracking scene before.
> The jailbreaking community is fractured, with many of its former members having joined private security firms or Apple itself. The few people still doing it privately are able to hold out for big payouts for finding iPhone vulnerabilities. And users themselves have stopped demanding jailbreaks, because Apple simply took jailbreakers’ best ideas and implemented them into iOS.
https://www.vice.com/en/article/iphone-jailbreak-life-death-...
And from the jail break community discord.
Even a cursory glance, with even a basic understanding, would show it's literally impossible on iOS.
I think it's unclear whether Cellebrite can or cannot get around Lockdown Mode as it would depend very heavily on whether the technique(s)/exploit(s) Cellebrite uses are suitable for whatever bugs/vulnerabilities remain exposed in Lockdown Mode.
* Lockdown Mode needs to be turned on separately for your iPhone, iPad, and Mac.
* When you turn on Lockdown Mode for your iPhone, it's automatically turned on for your paired Apple Watch.
* When you turn on Lockdown Mode for one of your devices, you get prompts to turn it on for your other supported Apple devices.
1. If they can get in but say they can't, people - including high-value targets like journalists - will keep using security that is actually broken.
2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in or reveal their capabilities to all adversaries, including to even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.
3. Similarly if Apple helped them, Apple might insist that is not revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)
Also, the target might think they are safe, which could be a tactical advantage. It also may exclude recovered data from rules of handling evidence, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit to this OS version someday, and the FBI can try again then.
The problem with low-entropy security measures is that the low-entropy secret is used to instruct the secure enclave (TEE) to release/use the actual high-entropy key. So the key must be stored physically (e.g. as voltage levels) somewhere in the device.
It's a similar story when the device is locked: on most computers the RAM isn't even encrypted, so a locked computer is no major obstacle to an adversary. On devices where RAM is encrypted, the encryption key is also stored somewhere - if only while the device is powered on.
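A toy model of that gatekeeper arrangement (purely illustrative; a real TEE enforces the guess limit and key erasure in tamper-resistant hardware, not in software):

```python
import os
import secrets

class ToyEnclave:
    """Toy model only: a real Secure Enclave keeps the key in
    tamper-resistant silicon and throttles/erases in hardware."""

    def __init__(self, pin, max_attempts=10):
        self._pin = pin
        self._key = os.urandom(32)        # high-entropy key, never exported raw
        self._attempts_left = max_attempts

    def release_key(self, pin_guess):
        if self._key is None:
            return None                   # key erased after too many failures
        if secrets.compare_digest(pin_guess, self._pin):
            return self._key              # low-entropy PIN gates the real key
        self._attempts_left -= 1
        if self._attempts_left <= 0:
            self._key = None              # hardware-erase equivalent
        return None

enclave = ToyEnclave("4821")
assert enclave.release_key("0000") is None       # wrong PIN
assert enclave.release_key("4821") is not None   # correct PIN releases the key

# A brute force of the 10^4 PIN space dies after the guess budget:
for guess in range(10000):
    if enclave.release_key(f"{guess:04d}"):
        break
assert enclave.release_key("4821") is None       # key gone, even the right PIN fails
```

The point is that a 4-6 digit PIN is only safe because the enclave caps the number of guesses; without that cap, the whole PIN space falls in seconds.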
FBI unable to extract data from iPhone 13 in Lockdown Mode in high profile case [pdf]
https://storage.courtlistener.com/recap/gov.uscourts.vaed.58...
https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_Ref...
If the government wants to get in, they’re going to get in. They can also hold you in contempt until you let them.
Don’t get me wrong, it’s a good thing that law enforcement can’t easily access this on their own. It just feels like the government is working with Apple here to help move some phones.
Anyway, it's a good thing to be skeptical about claims that iphones can't be hacked by government agencies, as long as it doesn't mean you're driven to dodgier parties (as those are guaranteed honeypots).
You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."
If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.