Microsoft gave FBI set of BitLocker encryption keys to unlock suspects' laptops
239 points | 1 hour ago | 23 comments | techcrunch.com
Aurornis
1 hour ago
[-]
FYI BitLocker is on by default in Windows 11. The defaults will also upload the BitLocker key to a Microsoft Account if available.

This is why the FBI can compel Microsoft to provide the keys. It's possible, perhaps even likely, that the suspect didn't even know they had an encrypted laptop. Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works. If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.

This makes the privacy purists angry, but in my opinion it's the reasonable default for the average computer user. It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.

Any power users who prefer their own key management should follow the steps to enable BitLocker without uploading keys to a connected Microsoft account.

reply
thewebguyd
41 minutes ago
[-]
> Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.

Except the steps to do that are: disable BitLocker, create a local user account (assuming you initially signed in with a Microsoft account, since MS now forces one on you for home editions of Windows), delete your existing keys from OneDrive, then re-encrypt using your local account and make sure not to sign into your Microsoft account or link it to Windows again.

A much more sensible default would be to give the user a choice right from the beginning, much like Apple does. When you go through Setup Assistant on a Mac, it doesn't assume you are an idiot: it literally asks you up front, "Do you want to store your recovery key in iCloud or not?"

reply
dgrunwald
6 minutes ago
[-]
> make sure not to sign into your Microsoft account or link it to Windows again

That's not so easy. Microsoft tries really hard to get you to use a Microsoft account. For example, logging into MS Teams will automatically link your local account with the Microsoft account, thus starting the automatic upload of all kinds of stuff unrelated to MS Teams.

In the past I also had Edge importing Firefox data (including stored passwords) without me agreeing to do so, and then uploading those into the Cloud.

Nowadays you just need to assume that all data on Windows computers is available to Microsoft; even if you temporarily find a way to keep your data out of their hands, an update will certainly change that.

reply
LtdJorge
4 minutes ago
[-]
Teams inside a VM it is, then.
reply
modeless
8 minutes ago
[-]
They don't do that for iMessage though... https://james.darpinian.com/blog/apple-imessage-encryption
reply
cesarb
1 hour ago
[-]
> Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.

Once the feature exists, it's much easier to use it by accident. A finger slip, a bug in a Windows update, or even a cosmic ray flipping the "do not upload" bit in memory, could all lead to the key being accidentally uploaded. And it's a silent failure: the security properties of the system have changed without any visible indication that it happened.

reply
jollyllama
54 minutes ago
[-]
There are a lot of sibling comments to mine here that are reading this literally, but instead I would suggest the following reading: "I never selected that option!" "Huh, must have been a cosmic ray that uploaded your keys ;) Modern OS updates never obliterate user-chosen configurations."
reply
Aurornis
57 minutes ago
[-]
If users are so paranoid that they worry about a cosmic ray bit flipping their computer into betraying them, they're probably not using a Microsoft account at all with their Windows PC.
reply
SoftTalker
41 minutes ago
[-]
If your security requirements are such that you need to worry about legally-issued search warrants, you should not connect your computer to the internet. Especially if it's running Windows.
reply
direwolf20
37 minutes ago
[-]
In the modern political environment, everyone should be worried about that.
reply
oskarw85
34 minutes ago
[-]
Because all cops are honest, all warrants are lawful and nothing worrying happens in the land of freedom right now.
reply
qmr
7 minutes ago
[-]
Appeal to the law fallacy.
reply
bobbob1921
53 minutes ago
[-]
This is correct. While preparing several ThinkPads for a customer based on a Windows 11 image I made, I also discovered that even if you have BitLocker disabled, you may need to check that hardware disk encryption is disabled as well (it was enabled by default in my case). Although this differs from BitLocker in that the encryption key is stored in the TPM, it is something to be aware of as it may be unexpected.
reply
tokyobreakfast
1 hour ago
[-]
>even a cosmic ray flipping the "do not upload" bit in memory

Stats on this very likely scenario?

reply
homebrewer
58 minutes ago
[-]
Given enough computers, anything will happen. Apparently enough bit flips happen in domains (or their DNS resolution) that registering domains one bit away from the most popular ones (e.g. something like gnogle.com for google.com) might be worth it for bad actors. There was a story a few years ago, but I can't find it right now; perhaps someone will link it.
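This is likely the "bitsquatting" research (registering one-bit-flip neighbors of popular domains). As a rough sketch, with hypothetical helper names, the candidate domains are easy to enumerate:

```python
def one_bit_variants(name: str) -> set[str]:
    """All strings that differ from `name` by exactly one flipped bit."""
    out = set()
    for i, ch in enumerate(name):
        for bit in range(8):
            out.add(name[:i] + chr(ord(ch) ^ (1 << bit)) + name[i + 1:])
    return out

def registrable(variants: set[str]) -> list[str]:
    """Keep only variants that are still plausible hostname labels."""
    ok = set("abcdefghijklmnopqrstuvwxyz0123456789-")
    return sorted(v for v in variants if set(v) <= ok)

# 'o' is 0x6F and 'n' is 0x6E; they differ in a single bit,
# so "gnogle" really is one flipped bit away from "google".
squats = registrable(one_bit_variants("google"))
print(squats)
```

Only a handful of the 48 one-bit neighbors of a 6-letter name survive the hostname filter, which is what makes pre-registering them cheap for a bad actor.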
reply
pixl97
56 minutes ago
[-]
reply
homebrewer
55 minutes ago
[-]
Great, thanks. Here's a discussion on this site:

https://news.ycombinator.com/item?id=4800489

reply
lanyard-textile
44 minutes ago
[-]
A very old game speedrun -- from an era when speedruns weren't really a "thing" like they are today -- apparently greatly benefited from a hardware bit flip, and it was only recently discovered.

Can't find an explanatory video though :(

reply
direwolf20
35 minutes ago
[-]
The Tick Tock Clock upwarp in Super Mario 64. All evidence that exists of it happening is a video recording. The most similar recording was generated by flipping a single bit in Mario's Y position, compared to other possibilities that were tested, such as warping Mario up to the closest ceiling directly above him.
reply
strbean
1 hour ago
[-]
> IBM estimated in 1996 that one error per month per 256 MiB of RAM was expected for a desktop computer.

From the wikipedia article on "Soft error", if anyone wants to extrapolate.

reply
d1sxeyes
44 minutes ago
[-]
That makes it vanishingly unlikely. On a 16GB RAM computer with that rate, you can expect 64 random bit flips per month.

So you could expect this to happen roughly once every two hundred million years on any given machine.

Assuming there are about 2 billion Windows computers in use, that’s about 10 computers a year that experience this bit flip.
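The arithmetic above can be sanity-checked directly; a minimal sketch using the 1996 IBM figure quoted upthread (all inputs are the thread's own rough assumptions, not measurements):

```python
# Rough sanity check of the estimate above, using the figures quoted
# in this thread (IBM 1996: ~1 soft error per month per 256 MiB).
errors_per_month_per_256mib = 1
ram_mib = 16 * 1024                         # a 16 GiB machine
ram_bits = ram_mib * 1024**2 * 8            # total bits of RAM
flips_per_month = errors_per_month_per_256mib * (ram_mib // 256)  # 64

# Chance that one *specific* bit is among the flips in a given month:
p_specific_per_month = flips_per_month / ram_bits
years_per_event = 1 / (p_specific_per_month * 12)   # ~179 million years

machines = 2_000_000_000                    # rough count of Windows PCs
events_per_year = machines / years_per_event        # ~11 machines/year

print(f"{years_per_event / 1e6:.0f} million years between hits per machine")
print(f"{events_per_year:.1f} affected machines per year worldwide")
```

This matches the numbers in the comment: vanishingly unlikely for any one machine, yet roughly ten machines a year across the installed base.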

reply
eszed
33 minutes ago
[-]
> 10 computers a year experience this bit flip

That's wildly more than I would have naively expected to experience a specific bit-flip. Wow!

reply
halfmatthalfcat
1 hour ago
[-]
It's "HN-likely" which translates to "almost never" in reality.
reply
patja
1 hour ago
[-]
Especially since HN readers are more likely to be using ECC memory
reply
smegger001
1 hour ago
[-]
If cosmic ray bit flips were so rare, then ECC RAM wouldn't be a thing.
reply
Sayrus
57 minutes ago
[-]
ECC protects against more events than cosmic rays. Those events are much more likely to be, for instance, magnetic/electrical interference or chip defects.
reply
direwolf20
35 minutes ago
[-]
Those random unexplainable events are also referred to casually as "cosmic rays"
reply
wang_li
42 minutes ago
[-]
In the 2010 era of RAM density, random bit flips were really uncommon. I worked with over a thousand systems that would report ECC errors when they happened, and the only memorable events at all were actual DIMM failures.

Also, around 1999-2000, Sun blamed cosmic rays for bit flips for random crashes with their UltraSPARC II CPU modules.

reply
drysine
55 minutes ago
[-]
At Google, "more than 8% of DIMM memory modules were affected by errors per year" [0]

More on the topic: Single-event upset[1]

[0] https://en.wikipedia.org/wiki/ECC_memory

[1] https://en.wikipedia.org/wiki/Single-event_upset

reply
egorfine
47 minutes ago
[-]
Oh, no accidents needed. Microsoft will soon forcibly extract and upload keys to their servers.

Before you downvote, please entertain one question: could you have predicted that mandatory identification of online users, under the guise of protecting children, would be implemented in leading Western countries this quickly? If you could, then upvote my comment, because you know this will happen too. If you couldn't even imagine it in, say, 2023, then upvote my comment anyway, because neither can you imagine mandatory key extraction.

reply
zdragnar
38 minutes ago
[-]
I can't believe it took this long.

We have mandatory identification for all kinds of things that are illegal to purchase or engage in under a certain age. Nobody wants to prosecute 12 year old kids for lying when they clicked the "I am at least 13 years old" checkbox when registering an account. The only alternative is to do what we do with R-rated movies, alcohol, tobacco, firearms, risky physical activities (i.e. bungee jumping liability waivers), etc.: we put the onus of verifying identification on the suppliers.

I've always imagined this was inevitable.

reply
thewebguyd
13 minutes ago
[-]
The problem is the implementation is hasty.

When I go buy a beer at the gas station, all I do is show my ID to the cashier. They look at it to verify DOB and then that's it. No information is stored permanently in some database that's going to get hacked and leaked.

We can't trust every private company that now has to verify age to not store that information with whatever questionable security.

If we aren't going to do a national registry that services can query to get back only a "yes or no" on whether a user is of age or not, then we need regulation to prevent the storage of ID information.

We should still be able to verify age while remaining pseudo-anonymous.

reply
gruez
1 hour ago
[-]
>A finger slip, a bug in a Windows update, or even a cosmic ray flipping the "do not upload" bit in memory, could all lead to the key being accidentally uploaded.

This is absurd, because it's basically a generic argument about any sort of feature that vaguely reduces privacy. Sorry guys, we can't have automated backups in windows (even opt in!), because if the feature exists, a random bitflip can cause everything to be uploaded to microsoft against the user's will.

reply
redox99
32 minutes ago
[-]
Uploading your encryption keys is not just "any sort of feature".
reply
gruez
30 minutes ago
[-]
You're right, it's less intrusive than uploading your files directly, like a backup does.
reply
salawat
39 minutes ago
[-]
What part of "We can't have nice things" do you not understand?
reply
gruez
37 minutes ago
[-]
The part where you're asking me about the phrase when it's not been used anywhere in this thread prior to your comment.
reply
vik0
1 hour ago
[-]
You can always count on someone coming along and defending the multi-trillion dollar corporation that just so happens to take a screenshot of your screen every few seconds (among many, many - too many other things)
reply
Aurornis
1 hour ago
[-]
Sorry to interrupt the daily rage session with some neutral facts about how Windows and the law work.

> that just so happens to take a screenshot of your screen every few seconds

Recall is off by default. You have to go turn it on if you want it.

reply
dns_snek
47 minutes ago
[-]
It only became off by default after those "daily rage sessions" created sufficient public pressure to turn them off.

Microsoft also happens to own LinkedIn which conveniently "forgets" all of my privacy settings every time I decide to review them (about once a year) and discover that they had been toggled back to the privacy-invasive value without my knowledge. This has happened several times over the years.

reply
yoyohello13
1 hour ago
[-]
A big demographic of HN users are people who want to be the multi-trillion dollar corporation, so it's not too surprising. In this case though I think they are right. And I'm a big time Microsoft hater.
reply
zer00eyz
1 hour ago
[-]
https://en.wikipedia.org/wiki/Room_641A ... Then, years later, everyone acts like Snowden had some big reveal.

There is the old passwords-for-chocolate study: https://blog.tmb.co.uk/passwords-for-chocolate

Do users care? I would posit that the bulk of them do not, because they just don't see how it applies to them, until they run into some type of problem.

reply
patja
1 hour ago
[-]
Are you referring to Microsoft Recall? My understanding is that it's opt-in and only stored locally.
reply
parliament32
59 minutes ago
[-]
Stored locally.. until it's uploaded by OneDrive or Windows Backup?
reply
mcmcmc
1 hour ago
[-]
AI enshittification is irrelevant here. Why is someone pointing out that sensible secure defaults are a good thing suddenly defending the entire company?
reply
LoganDark
1 hour ago
[-]
Microsoft doesn't take the screenshot; their operating system does if Recall is enabled, and although the screenshots themselves are stored in an insecure format and location, Microsoft doesn't get them by default.
reply
gruez
1 hour ago
[-]
Yes, because object level facts matter, and it's intellectually dishonest to ignore the facts and go straight into analyzing which side is the most righteous, like:

>Microsoft is an evil corporation, so we must take all bad stories about them at face value. You're not some corpo bootlicker, now, are you? Now, in unrelated news, I heard Pfizer, another evil corporation with a dodgy history[1] is insisting their vaccines are safe...

[1] https://en.wikipedia.org/wiki/Pfizer#Legal_issues

reply
drnick1
1 hour ago
[-]
> Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.

The real issue is that you can't be sure that the keys aren't uploaded even if you opt out.

At this point, the only thing that can restore trust in Microsoft is open sourcing Windows.

reply
Aurornis
55 minutes ago
[-]
> The real issue is that you can't be sure that the keys aren't uploaded even if you opt out.

The fully security conscious option is to not link a Microsoft account at all.

I just did a Windows 11 install on a workstation (Windows mandatory for some software) and it was really easy to set up without a Microsoft account.

reply
MereInterest
42 minutes ago
[-]
Last time I needed to install Windows 11, avoiding making a Microsoft account required (1) opening a command line to run `oobe/bypassnro`, and (2) skipping past the wifi config screen. While these are quick steps, neither of those are at all "easy", since they require a user to first know that it is an option in the first place.

And newer builds of Windows 11 are removing these methods, to force use of a Microsoft account. [0]

[0] https://www.windowslatest.com/2025/10/07/microsoft-confirms-...

reply
epistasis
33 minutes ago
[-]
> it was really easy to set up without a Microsoft account.

By "really easy" do you mean you had a checkbox? Or "really easy" in that there's a secret sequence of key presses at one point during setup? Or was it the domain join method?

Googling around, I'm not sure any of the methods could be described as "really easy" since it takes a lot of knowledge to do it.

reply
vanviegen
48 minutes ago
[-]
And how do you know the keys are never uploaded if you don't have an account?
reply
jjnoakes
24 minutes ago
[-]
The same way you know that your browser session secrets, bank account information, crypto private keys, and other sensitive information is never uploaded. That is to say, you don't, really - you have to partially trust Microsoft and partially rely on folks that do black-box testing, network analysis, decompilation, and other investigative techniques on closed-source software.
reply
postalcoder
1 hour ago
[-]
I'm not sure how to do this on Windows, but to disable FileVault cloud key backup on Mac, go to `Settings > Users & Groups > click on the (i) tooltip next to your account` and uncheck "Allow user to reset password using Apple Account".

This is a part of Settings that you will never see at a passing glance, so it's easy to forget that you may have it on.

I'd also like to gently push back against the cynicism expressed about having a feature like this. There are more people who benefit from a feature like this than not. They're more likely thinking "I forgot my password and I want to get the pictures of my family back" than fully internalizing the principles and practices of self custody - one of which is that if you lose your keys, you lose everything.

reply
Melatonic
1 hour ago
[-]
Or use a local account to login ?
reply
dcrazy
54 minutes ago
[-]
I’m not sure if you misunderstand how macOS accounts work or how FileVault works.

There are two ways to log into macOS: a local user account or an LDAP (e.g. OpenDirectory, Active Directory) account. Either of these types of accounts may be associated with an iCloud account. macOS doesn’t work like Windows where your Microsoft account is your login credential for the local machine.

FileVault key escrow is something you can enable when enabling FileVault, usually during initial machine setup. You must be logged into iCloud (which happens in a previous step of the Setup Assistant) and have iCloud Keychain enabled. The key that wraps the FileVault volume encryption key will be stored in your iCloud Keychain, which is end-to-end encrypted with a key that Apple does not have access to.

If you are locked out of your FileVault-encrypted laptop (e.g. your local user account has been deleted or its password has been changed, and therefore you cannot provide the key to decrypt the volume encryption key), you can instead provide your iCloud credentials, which will use the wrapping key stored in escrow to decrypt the volume encryption key. This will get you access to the drive so you can copy data off or restore your local account credentials.

reply
duskwuff
35 minutes ago
[-]
> There are two ways to log into macOS: a local user account or an LDAP (e.g. OpenDirectory, Active Directory) account.

And just in case it wasn't clear enough, I'd add: a local user account is the default. The only way you'd end up with an LDAP account is if you're in an organization that deliberately set your computer up for networked login; it's not a standard configuration, nor is it a component of iCloud.

reply
matheusmoreira
27 minutes ago
[-]
Power users should stop bothering with Windows nonsense and install Linux instead so that they can actually have control over their system.

It's 2026. The abuses of corporations are well documented. Anyone who still chooses Windows of their own volition is quite literally asking for it and they deserve everything that happens to them.

reply
jbstack
12 minutes ago
[-]
You only have to run through a modern Windows installer to understand how screwed you are if you install it. Last time I did this for a disposable Windows VM (a couple of years ago) I remember having to click through a whole bunch of prompts asking about all the different types of data Microsoft wanted my computer to send them. Often the available answers weren't "yes" or "no" but more like "share all data" vs "share just some data". After that I recall being forced to sign up for an outlook account just to create a local login unless I unplugged my network cable during the install. I've heard they have closed that loophole in recent installers.

I'd already long since migrated away from Windows but if I'd been harbouring any lingering doubts, that was enough to remove them.

reply
SmellTheGlove
19 minutes ago
[-]
I’ll bite. What Linux distro currently has the nicest desktop experience? I work on a MacBook but my desktop is a windows PC that I use for gaming and personal projects. I hear Proton has made the former pretty good now, and the latter is mostly in WSL for me anyway. Maybe a good time to try.

What do you suggest? I’ll try it in a VM or live usb.

reply
amitav1
8 minutes ago
[-]
Something with KDE. Never used KDE extensively because I hate non-tiling WMs, but something like Kubuntu would give you a more windows-esque experience by default. Here's the download link:

https://kubuntu.org/download/

Bon appetit!

reply
g947o
57 minutes ago
[-]
> It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.

False. If you only put the keys on the Microsoft account, and Microsoft closes your account for whatever reason, you are done.

reply
Melatonic
1 hour ago
[-]
Exactly. And any halfway decent corporate IT setup would be managing the keys themselves as well (although I would imagine many third party tools could also be compelled to do this with a proper warrant)

BitLocker on by default (even if Microsoft does have the keys and complies with warrants) is still a hell of a lot better than the old default of no encryption. At least some rando can't steal your laptop, pop out the HDD, and take whatever data they want.

reply
Hizonner
1 hour ago
[-]
The "reasonable default" is to force the user to actually make the choice, probably after forcing the user to prove they understand the implications.
reply
x0x0
35 minutes ago
[-]
I don't think there's a good answer here.

Users absolutely 100% will lose their password and recovery key and not understand that even if the bytes are on a desk physically next to you, they are gone. Gone baby gone.

In university, I helped a friend set up encryption on a drive w/ his work after a pen drive with work on it was stolen. He insisted he would not lose the password. We went through the discussion of "this is real encryption. If you lose the password, you may as well have wiped the files. It is not in any way recoverable. I need you to understand this."

6 weeks is all it took him.

reply
thewebguyd
31 minutes ago
[-]
Apple gives users the choice during set up assistant, no reason Microsoft can't.
reply
knollimar
17 minutes ago
[-]
I bet he learned a valuable lesson
reply
armada651
49 minutes ago
[-]
> If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.

They can fight the warrant. If you don't at least object to it, then "giving the keys away" is not an incorrect characterization.

reply
mattmaroon
1 hour ago
[-]
It’s definitely better than no encryption at all, which would be what most people would have otherwise.
reply
giancarlostoro
29 minutes ago
[-]
To be fair, if they didn't have BitLocker enabled at all, the FBI would have just scanned the hard-drive as-is. The only usefulness of BitLocker is if a stranger steals your laptop, assuming Microsoft doesn't hand out the keys to just anybody, your files should be safe, in theory.
reply
throwway120385
56 minutes ago
[-]
Correct me if I'm wrong, but isn't forcing you to divulge your encryption password compelled speech? So the police can crack my phone but they can't force me to tell them my PIN.
reply
thewebguyd
32 minutes ago
[-]
Yes, you cannot be compelled to testify against yourself, but Microsoft is under no such obligation when served a warrant, because of third-party doctrine. Microsoft holding BitLocker recovery keys is treated as you voluntarily giving the information to a third party, so the warrant isn't compelling you to do anything, and thus isn't a rights violation.

But the 5th Amendment is also why it's important to not rely on biometrics. Generally (there are some gray areas) in the US you cannot be compelled to give up your password, but biometrics are viewed as physical evidence and not protected by the 5th.

reply
dcrazy
43 minutes ago
[-]
Warrants are a mechanism by which speech is legally compelled.

The 5th Amendment gives you the right to refuse speech that might implicate you in a crime. It doesn’t protect Microsoft from being compelled to provide information that may implicate one of its customers in a crime.

reply
salawat
30 minutes ago
[-]
Indeed. Third-party doctrine has undermined 4th/5th Amendment protections via the harebrained power grab that was "if you share info with a third party as part of the ordinary course of doing business, you waive 4th Amendment protections." Ironically, Boomers basically kneecapped Constitutional protections for the very data most critically in need of protection in a networked state.

The only fix is apparently waiting until there's enough support to cram through an Amendment or set a precedent to fix it.

reply
qingcharles
14 minutes ago
[-]
Well, SCOTUS has ummed and ahhed over several cases about whether to extend the 4th Amendment to third-party data in some scenarios. IIRC there is an online email case working its way up through the 9th Circuit right now.

One of the reasons given for (usually) now requiring a warrant to open a phone they grab from you is the amount of third-party data you can access through it, although IIRC they framed it as a regular 4th Amendment issue, reasoning that if you had a security camera inside your house, the police would be bypassing the warrant requirement by seeing directly into your abode.

reply
direwolf20
33 minutes ago
[-]
They can't force you to tell them your PIN in some countries, but they can try all PINs, and they can search your desk drawer to find the post-it where you wrote your PIN.
reply
qingcharles
14 minutes ago
[-]
They can also hold you in a jail cell until the end of time until you give it up, in many places.
reply
fn-mote
54 minutes ago
[-]
In the US.

But this is irrelevant to the argument made above, right?

reply
throwaway85825
39 minutes ago
[-]
That would be all well and good if any of this was communicated to the user.
reply
SilverElfin
9 minutes ago
[-]
Doesn’t Windows 11 force you to use a Microsoft account?
reply
wolvoleo
1 hour ago
[-]
Of course, it would make me a lot less angry if Microsoft didn't go out of their way to force people to use a Microsoft account.
reply
joering2
22 minutes ago
[-]
> you have no choice but to give it to them

Will they shoot me in head?

What if I truly forgot the password to my encrypted drive? Will they also shoot me in the head?

reply
qingcharles
12 minutes ago
[-]
Do they need to actually shoot you? Have you had a loaded gun pressed to your head and asked for your password?

What about your wife's head? Your kids' heads?

reply
kypro
52 minutes ago
[-]
I think this is a fair position and believe you're making it in good faith, but I can't help but disagree.

I think the reasonable default here would be to not upload to MS servers without explicit consent about what that means in practice. I suspect if you actually asked the average person whether they're okay with MS having access to all of the data on their device (including browser history, emails, photos), they'd probably say no, if given the choice.

Maybe I'm wrong though... I admit I have a bad theory of mind when it comes to this stuff because I struggle to understand why people don't value privacy more.

reply
whalesalad
1 hour ago
[-]
Any power users should avoid Windows entirely.
reply
drnick1
56 minutes ago
[-]
This. Real "power users" (as opposed to people who aren't completely computer-illiterate) use the likes of Arch Linux and Gentoo and self-host whatever "cloud" services they need, they aren't running Windows and paying for Copilot 365 subscriptions.
reply
bigyabai
1 hour ago
[-]
If by "power user" you mean "enemy of the state", there's a lot of software you'd be better-off avoiding.
reply
wolvoleo
1 hour ago
[-]
"enemy of the state" depends a lot on the current state of the state.

Eg in England you're already an enemy of the state when you protest against Israel's actions in Gaza. In America if you don't like civilians being executed by ICE.

This is really a bad time to throw "enemy of the state" around as if this only applies to the worst people.

Current developments are the ideal time to show that these powers can be abused.

reply
phanimahesh
1 hour ago
[-]
That is a strange viewpoint. Are we calling everyone who wants some control over their computers enemies of the state?
reply
WarOnPrivacy
1 hour ago
[-]
> Are we calling everyone who wants some control over their computers enemies of the state?

As of today at 00:00 UTC, no.

But there's an increasingly possible future where authoritarian governments will brand users who practice 'non-prescribed use' as enemies of the state.

And when we have a government whose leader openly gifts deep, direct access to federal power to unethical tech leaders who've funded elections (ex: Thiel), that branding would be a powerful perk to have access to (even if indirectly).
reply
bigyabai
1 hour ago
[-]
It's holistic philosophy. You're not going to save yourself from FBI surveillance by avoiding Windows, I guarantee that to you.
reply
thewebguyd
29 minutes ago
[-]
You're not going to avoid any state surveillance if the state is really interested in you specifically.

But you can still help prevent abuses of mass surveillance without probable cause by making such surveillance as expensive and difficult as possible for the state

reply
pawelduda
1 hour ago
[-]
Maybe he's just trying to avoid Candy Crush Saga
reply
amitav1
4 minutes ago
[-]
I can't think of anybody apart from Osama bin Laden who wouldn't want to play Candy Crush. /s
reply
anonym29
57 minutes ago
[-]
https://news.ycombinator.com/item?id=46700219

Criticizing the current administration? That sounds like something an enemy of the state would do!

Prepare yourself for the 3am FBI raid, evildoer! You're an enemy of the state, after all, that means you deserve it! /s

reply
mistercheph
48 minutes ago
[-]
Yeah guys, if it's encrypted by default, it's not a violation of user security or privacy expectations to have a set of master keys that you hold onto and give to third parties to decrypt user devices. I mean it was just encrypted by default... by default...
reply
riversflow
1 hour ago
[-]
> you have no choice but to give it to them

There is always a choice.

reply
paulpauper
1 hour ago
[-]
VeraCrypt exists for this reason or other open source programs. Why would you ever trust encryption to closed source?
reply
tokyobreakfast
1 hour ago
[-]
> This makes the privacy purists angry, but in my opinion it's the reasonable default for the average computer user. It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.

In the opposite scenario - without key escrow - we would be reading sob stories of "Microsoft raped my data and I'm going to sue them for irreparably ruining my business" after someone's drive fails and they realize the data is encrypted and they never saved a backup key (and of course have no backups either).

>Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works.

Because in 2026 "journalism" has devolved into ragebait farming for clicks. Getting the details correct is secondary, and sometimes optional.

reply
pjc50
1 hour ago
[-]
Microsoft shouldn't be uploading keys, but nor should they be turning bitlocker on without proper key backup. Therefore it should be left as an optional feature.
reply
devkit1
1 hour ago
[-]
The quality of journalism you consume is highly dependent on the sources you choose. Some outlets still highly value journalistic integrity. I prefer to read those. Not that any of them are perfect. But it makes a huge difference and they typically provide a much more nuanced view. The Atlantic and the Wall Street Journal are good examples of this in my opinion.
reply
ferrouswheel
1 hour ago
[-]
It's interesting how many comments these days are like, "well of course".

Back in the day hackernews had some fire and resistance.

Too many tech workers decided to roll over for the government, and that's why we're in this mess now.

This isn't an argument about law, it's about designing secure systems. And lazy engineers build lazy key escrow the government can exploit.

reply
Aurornis
1 hour ago
[-]
> Back in the day hackernews had some fire and resistance.

Most of the comments are fire and resistance, but they commonly take ragebait and run with the assumptions built into clickbait headlines.

> Too many tech workers decided to rollover for the government and that's why we are in this mess now.

I take it you've never worked at a company when law enforcement comes knocking for data?

The internet tough guy fantasy where you boldly refuse to provide the data doesn't last very long when you realize that it just means you're going to be crushed by the law and they're getting the data anyway.

reply
thewebguyd
28 minutes ago
[-]
> I take it you've never worked at a company when law enforcement comes knocking for data?

The solution to that is to not have the data in the first place. You can't avoid warrants for data you hold, so the next best thing is to never collect it.

reply
nemomarx
52 minutes ago
[-]
If you design it so you don't have access to the data, what can they do? I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.
reply
t-3
29 minutes ago
[-]
If you design it so you don't have access to the data, how do you make money?

Microsoft (and every other corporation) wants your data. They don't want to be a responsible custodian of your data, they want to sell it and use it for advertising and maintaining good relationships with governments around the world.

reply
caminante
25 minutes ago
[-]
What are you talking about?

> I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.

FTA (3rd paragraph): don't default upload the keys to MSFT.

>If you design it so you don't have access to the data, what can they do?

You don't have access to your own data? If not, they can compel testimony about who or what the next step to accessing the data is, and they chase that.

reply
direwolf20
30 minutes ago
[-]
"Good" companies in the old days would ensure they don't have your data, so they don't have to give it to the police.
reply
matheusmoreira
11 minutes ago
[-]
Plenty of companies would do that if they could. The problem is it has become illegal for them to do that now. KYC/AML laws form the financial arm of warrantless global mass surveillance.
reply
direwolf20
8 minutes ago
[-]
KYC/AML is luckily still confined to the financial sector. There's no law requiring operating system vendors to do KYC/AML.
reply
egorfine
45 minutes ago
[-]
> This isn't an argument about law, it's about designing secure systems

False. You can design a truly end-to-end encrypted secure system, and then the state comes to you and says that this is not allowed, period. [1]

[1] https://medium.com/@tahirbalarabe2/the-encryption-dilemma-wh...

reply
al_borland
42 minutes ago
[-]
I'd love to see companies stop service in countries that request things like this, to put pressure on the governments to not be scumbags.
reply
hmokiguess
38 minutes ago
[-]
I actually understood that as in “of course . . . because Microsoft”
reply
smegger001
57 minutes ago
[-]
It's the natural result of this site catering not just to tech nerds but to ones chasing venture capital money. It's an industry that has never seen a dark pattern it didn't like. We've gone from "don't be evil" to "be evil if it makes the stonks go up."
reply
salawat
14 minutes ago
[-]
It's why tech loves young engineers who just do what they're told, and old engineers only so long as they won't say no. Once you dig into the system and see how all the pieces fit together, you can't ethically or morally continue to participate. Learned that the hard way. I'm in the middle of an attempted midlife career change because of it, hoping to free myself to write software that needs to be written instead of keeping a lawyer on retainer to wrangle employment-contract clauses so my work stays mine.
reply
CodingJeebus
59 minutes ago
[-]
It’s not about engineers being lazy, it’s about money.

Trying to resist building ethically questionable software usually means quitting or being fired from a job.

reply
conception
46 minutes ago
[-]
No this is lazy. Microsoft shouldn’t have access to your keys. If they do, anyone who hacks Microsoft (again) also has them.
reply
kypro
29 minutes ago
[-]
I agree with you, but also think this is only true because we as an industry have been so completely corrupted by money at this point.

In the 90s and 00s people overwhelmingly built stuff in tech because they cared about what they were building. The money wasn't bad, but no one started coding for the money. And that mindset was so obvious when you looked at the products and cultures of companies like Google and Microsoft.

Today however people largely come into this industry and stay in it for the money. And increasingly tech products are reflecting the attitudes of those people.

reply
thinkingtoilet
47 minutes ago
[-]
Saying "of course" doesn't mean we agree with it or won't resist it. It's simply not surprising that this happened.

When you get high up in an org, choosing Microsoft is the equivalent of the old "nobody ever got fired for buying IBM". You are off-loading responsibility. If you ever get high up at a fortune 500 company, good luck trying to get off of behemoths like Microsoft.

reply
kccqzy
41 minutes ago
[-]
I don’t see that at all. Instead, I think tech workers, including the engineers and the product managers, are correctly prioritizing user convenience over resistance to government abuse. It’s honestly the right trade off to make. Most users worry about casual criminals, not governments. Say a criminal snatching your laptop and accessing your files that way. If you worry about governments you should already know what to do.
reply
MattSteelblade
28 minutes ago
[-]
Based on the comments in the thread, I sense I will be in the minority, but for most consumers this is a reasonable default. Broadly speaking, the threat model most users are concerned with doesn't account for their government. The previous default is no encryption at rest, which doesn't protect from the most common threats, like theft or tampering. With BitLocker on, a new risk for users is created: loss of access to their data because they don't have their recovery key. You are never forced to keep your recovery keys in Microsoft's servers and it's not a default for corporate users.
reply
nancyminusone
12 minutes ago
[-]
I'll always remember - when I was first learning about it, one of the interesting counter-arguments to ignoring privacy was "what if the Nazis come back, would you want them to have your data?". I suppose there's some debate these days, but hostile governments seem a lot closer than they were 10-15 years ago.

Will this make people care? Probably not, but you never know.

reply
observationist
1 hour ago
[-]
Hear that? It's the sound of the year of the Linux desktop.

It's time - it's never been easier, and there's nothing you'll miss about Windows.

reply
AmazingTurtle
2 minutes ago
[-]
I have opted out of all cloud services in my Windows installation, and I use a passphrase too (entered before the computer even boots). I feel like this is pretty safe.
reply
cogman10
1 hour ago
[-]
> Microsoft told Forbes that the company sometimes provides BitLocker recovery keys to authorities, having received an average of 20 such requests per year.

At least they are honest about it, but a good reason to switch over to linux. Particularly if you travel.

If microsoft is giving these keys out to the US government, they are almost certainly giving them to all other governments that request them.

reply
Aurornis
1 hour ago
[-]
It's not like companies have a choice. If they have a key in their possession and law enforcement gets an order for it, they have to provide it.
reply
function_seven
1 hour ago
[-]
That only strengthens the parent point. Switch to an OS where this requirement doesn't come into play if you're worried about any governments having a backdoor into your own machine.
reply
Aurornis
49 minutes ago
[-]
> Switch to an OS where this requirement doesn't come into play

I use BitLocker on my Windows box without uploading the keys. I don't even have it connected to a Microsoft account. This isn't a requirement.

reply
charcircuit
1 hour ago
[-]
If you sync your Linux machine's key to the cloud, police could subpoena it too. The solution is not to switch to Linux, but to stop storing the key in plain text in the cloud.
reply
NewsaHackO
18 minutes ago
[-]
Do you know what a private key means in this context?
reply
charcircuit
17 seconds ago
[-]
No, I don't. The BitLocker key is a symmetric key.
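
That symmetry is the whole problem with escrow. A toy sketch (XOR against a SHAKE-256 keystream stands in for the AES-XTS that BitLocker actually uses; everything here is illustrative, not the real construction):

```python
import hashlib
import secrets

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher: XOR the data against a keystream derived
    # from the key. The point is the symmetry: the exact same key
    # both encrypts and decrypts.
    keystream = hashlib.shake_256(key).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, keystream))

volume_key = secrets.token_bytes(32)  # stand-in for the volume key
ciphertext = xor_stream(volume_key, b"disk sector contents")

# An escrowed copy of the key is a full decryption key; there is no
# "private half" that stays on the device:
escrowed_copy = bytes(volume_key)
assert xor_stream(escrowed_copy, ciphertext) == b"disk sector contents"
```

So whoever holds the escrowed copy (Microsoft, or anyone they're compelled to hand it to) can decrypt the disk outright.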
reply
Zambyte
1 hour ago
[-]
> It's not like companies have a choice.

> If they have a key in their possession [...]

So they do have a choice.

reply
mc32
1 hour ago
[-]
People/users have an option to keep the key themselves. Most wouldn’t bother to manage encryption keys.
reply
egorfine
44 minutes ago
[-]
And even if they don't have the key. Case in point: https://medium.com/@tahirbalarabe2/the-encryption-dilemma-wh...
reply
TrainedMonkey
1 hour ago
[-]
"All other governments" is a stretch, but the likelihood of at least one other government getting the same privileges is extremely high.
reply
slashdave
1 hour ago
[-]
Why take the drastic step of switching to Linux (a difficult endeavor) when you can simply turn off key uploading?
reply
varun_ch
1 hour ago
[-]
Why continue to use an operating system that’s adversarial towards you?
reply
bogwog
55 minutes ago
[-]
I will never understand this from software engineers/tech people in general. That demographic knows how technology works, and is equipped to see exactly where and how Microsoft is taking advantage of them, and how the relationship is all take and zero give from Microsoft's end. These people are also in the strongest position to switch to Linux.

The only explanation that makes sense to me is that there's an element of irrationality to it. Apple has a well known cult, but Microsoft might have one that's more subtle? Or maybe it's a reverse thing where they hate Linux for some equally irrational reasons? That one is harder to understand because Linux is just a kernel, not a corporation with a specific identity or spokesperson (except maybe Torvalds, but afaik he's well-regarded by everyone)

reply
wolvoleo
1 hour ago
[-]
Because that gives you a lot more control over your computer than just solving this particular issue. If you care about privacy it's definitely a good idea.
reply
egorfine
43 minutes ago
[-]
Because Microsoft absolutely will make it mandatory somewhere in the not so distant future.
reply
axus
48 minutes ago
[-]
Here's a story about what the FBI may do when they don't unlock the laptop:

https://cointelegraph.com/news/fbi-cant-be-blamed-for-wiping...

Perhaps next time, an agent will copy the data, wipe the drive, and say they couldn't decrypt it. 10 years ago agents were charged for diverting a suspect's Bitcoin, I feel like the current leadership will demand a cut.

reply
tokyobreakfast
1 hour ago
[-]
This is almost certainly users who elect to store their BitLocker keys in OneDrive.

Don't think Apple wouldn't do the same.

If you don't want other people to have access to your keys, don't give your keys to other people.

reply
piccirello
1 hour ago
[-]
In Apple's case, starting with macOS Tahoe, Filevault saves your recovery key to your iCloud Keychain [0]. iCloud Keychain is end-to-end encrypted, and so Apple doesn't have access to the key.

As a US company, it's certainly true that given a court order Apple would have to provide these keys to law enforcement. That's why getting the architecture right is so important. Also check out iCloud Advanced Data Protection for similar protections over the rest of your iCloud data.

[0] https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-...

reply
eddyg
1 hour ago
[-]
You shouldn't include Apple in this.

As of macOS Tahoe, the FileVault key you (optionally) escrow with Apple is stored in the iCloud Keychain, which is cryptographically secured by HSM-backed, rate-limited protections.

You can (and should) watch https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for all the details about how iCloud is protected.

reply
giobox
1 hour ago
[-]
> Don't think Apple wouldn't do the same.

Of course Apple offers a similar feature. I know lots of people here are going to argue you should never share the key with a third party, but if Apple and Microsoft didn't offer key escrow they would be inundated with requests from ordinary users to unlock computers they have lost the key for. The average user does not understand the security model and is rarely going to store a recovery key at all, let alone safely.

> https://support.apple.com/en-om/guide/mac-help/mh35881/mac

Apple will escrow the key to allow decryption of the drive with your iCloud account if you want, much like Microsoft will optionally escrow your BitLocker drive encryption key with the equivalent Microsoft account feature. If I recall correctly it's the default option for FileVault on a new Mac too.

reply
ezfe
1 hour ago
[-]
Apple's solution is iCloud Keychain which is E2E encrypted, so would not be revealed with a court order.
reply
tokyobreakfast
1 hour ago
[-]
What is your proof they don't have a duplicate key that also unlocks it? A firm handshake from Tim?
reply
eddyg
1 hour ago
[-]
You should watch the whole BlackHat talk (from 2016!) from Apple's Head of Security Engineering and Architecture, but especially this part:

https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s

reply
otterley
1 hour ago
[-]
If they say they don't, and they do, then that's fraud, and they could be held liable for any damages that result. And, if word got out that they were defrauding customers, that would result in serious reputational damage to Apple (who uses their security practices as an industry differentiator) and possibly a significant customer shift away from them. They don't want that.
reply
direwolf20
27 minutes ago
[-]
The government would never prosecute a company for fraud where that fraud consists of cooperating with the government after promising to a suspected criminal that they wouldn't.
reply
otterley
17 minutes ago
[-]
That's not the scenario I was thinking of. There are other possibilities here, like providing a decryption key (even if by accident) to a criminal who's stolen a business's laptop, or if a business had made contractual promises to their customers, based on Apple's promises to them. The actions would be private (civil) ones, not criminal fraud prosecution.

Besides, Apple's lawyers aren't stupid enough to forget to carve out a law-enforcement demand exception.

reply
tokyobreakfast
54 minutes ago
[-]
Absent the source code, it's incredibly difficult to disprove when the only proof you have is good vibes.
reply
otterley
51 minutes ago
[-]
There are many things you can't prove or disprove in this world. That's where trust and reputation come in: to fill the uncertainty gap.
reply
jcalvinowens
1 hour ago
[-]
> Apple's solution is iCloud Keychain which is E2E encrypted, so would not be revealed with a court order.

Nope. For this threat model, E2E is a complete joke when both E's are controlled by the third party. Apple could be compelled by the government to insert code in the client to upload your decrypted data to another endpoint they control, and you'd never know.

reply
dcrazy
34 minutes ago
[-]
That was tested in the San Bernardino shooter case. Apple stood up and the FBI backed down.
reply
jcalvinowens
28 minutes ago
[-]
It's incredibly naive to believe Apple will continue to be able to do that.
reply
tokyobreakfast
1 hour ago
[-]
That's what I said. I admit the double-negative grammar is a bit confusing.
reply
teejmya
1 hour ago
[-]
> Don't think Apple wouldn't do the same.

Except for that time they didn't.

https://www.apple.com/customer-letter/

reply
malfist
1 hour ago
[-]
It is the default setting on Windows 11 to share your key with Microsoft.
reply
raverbashing
1 hour ago
[-]
It's also the "default" in Windows 11 to require the BitLocker recovery key every time you make a minor modification to the "BIOS", like changing the boot order.
reply
parineum
1 hour ago
[-]
Both Microsoft and Apple (I think) offer the option to encrypt those keys with the user's password before storing them.
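
A sketch of what that client-side wrapping could look like, in Python stdlib only. This is a toy construction under assumptions (the XOR stream cipher, the function names, and the iteration count are all illustrative, not what either vendor actually ships), but it shows why the server-side copy is useless without the passphrase:

```python
import hashlib
import hmac
import secrets

ITERATIONS = 200_000  # assumed work factor; real services tune this

def _wrap_key(passphrase: str, salt: bytes) -> bytes:
    # Derive a wrapping key from the user's passphrase.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, ITERATIONS)

def escrow(recovery_key: bytes, passphrase: str) -> dict:
    # Wrap the recovery key client-side BEFORE upload. The server only
    # ever sees salt + wrapped key + MAC, none of which decrypts anything.
    salt = secrets.token_bytes(16)
    wk = _wrap_key(passphrase, salt)
    stream = hashlib.shake_256(b"enc" + wk).digest(len(recovery_key))
    wrapped = bytes(a ^ b for a, b in zip(recovery_key, stream))
    tag = hmac.new(b"mac" + wk, wrapped, "sha256").digest()
    return {"salt": salt, "wrapped": wrapped, "tag": tag}

def recover(blob: dict, passphrase: str) -> bytes:
    # Unwrap on the client; a wrong passphrase fails the MAC check.
    wk = _wrap_key(passphrase, blob["salt"])
    expected = hmac.new(b"mac" + wk, blob["wrapped"], "sha256").digest()
    if not hmac.compare_digest(expected, blob["tag"]):
        raise ValueError("wrong passphrase")
    stream = hashlib.shake_256(b"enc" + wk).digest(len(blob["wrapped"]))
    return bytes(a ^ b for a, b in zip(blob["wrapped"], stream))
```

With this design a warrant to the server yields only the wrapped blob; the provider can't comply with a "hand over the key" demand because it never had the key.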
reply
paulpauper
1 hour ago
[-]
Just use open source encryption
reply
t1234s
11 minutes ago
[-]
If you use a local windows account does it still upload your bitlocker key to M$?
reply
masfuerte
2 minutes ago
[-]
No, and by default the keys are stored on the disk so it's not actually secure.

If you open the BitLocker control panel applet, your drive(s) will be labelled "BitLocker waiting for activation".

reply
aeternum
1 hour ago
[-]
Not your keys not your {thing}
reply
uriegas
40 minutes ago
[-]
The problems of centralization. Some economic sectors are centralized by nature, IT is not.
reply
dmitrygr
1 hour ago
[-]
This is why local account setup is so important on windows, and why microsoft makes it harder and harder each update.
reply
paulpauper
1 hour ago
[-]
or not use microsoft products for encryption
reply
gethly
6 minutes ago
[-]
It's like Microsoft has nothing better to do than keep digging the hole to bury Windows as the mainstay operating system, deeper with every new day.
reply
alexfromapex
40 minutes ago
[-]
I don't know how many bad things Microsoft has to do before consumers realize they are a terrible company and you should stop buying their stuff.
reply
Jigsy
1 hour ago
[-]
This is by far one of the best advertisements for LUKS/VeraCrypt I've ever seen.
reply
g947o
59 minutes ago
[-]
So forcing users to connect to the Internet and log in to a Microsoft account is about more than tracking you and selling ads -- Microsoft may be intentionally helping law enforcement unlock your computer -- and that's not a conspiracy theory.
reply
kittikitti
35 minutes ago
[-]
The Zionist cloud is not secure.
reply
diego_moita
9 minutes ago
[-]
This isn't even about Microsoft or BitLocker. This is about the U.S.A.: anyone who trusts the rule of law in the U.S. is a fool.

Yes, the American government retrieves these keys "legally". But so what? The American courts won't protect foreigners, even if they are heads of state or dictators. The American government routinely frees criminals (the ones that donate to Republicans) and persecutes lawful citizens (the ones that cause trouble for Republicans). The "rule of law" in the U.S. is a farce.

And this is not just about the U.S. Under the "Five Eyes" agreement, the governments of Canada, the UK, Australia and New Zealand could also grab your secrets.

Never trust the United States. We live in dangerous times. Ignore it at your own risk.

reply
bigyabai
1 hour ago
[-]
Quid pro quo.
reply
advisedwang
59 minutes ago
[-]
What quid pro quo? Is there an allegation that the FBI gave Microsoft something in exchange?

As far as I can see this particular case is a straightforward search warrant. A court absolutely has the power to compel Microsoft to hand over the keys.

The bigger question is why Microsoft has the recovery feature at all. But honestly I believe Microsoft cares so little about privacy and security that they would do it just to end the "help customers who lose their key" support tickets, with no shady government deal required. I'd want to see something more than speculation to convince me otherwise.

reply
SilverElfin
1 hour ago
[-]
This is disappointing but I wonder if this is quid pro quo. Microsoft and Nadella want to appear to be cooperating with the government, so they are given more government contracts and so they don’t get regulatory problems (like on antitrust or whatever).
reply
tucnak
1 hour ago
[-]
Water is wet. More news at 11
reply
yndoendo
50 minutes ago
[-]
Water is not wet. Water makes non-hydrophobic materials wet.

This news piece from a non-tech organization will help educate non-tech people.

reply
londons_explore
1 hour ago
[-]
> The case involved several people suspected of fraud related to the Pandemic Unemployment Assistance program

If it were preventing a mass murder I might feel differently...

But this is protecting the money supply (and, indirectly, the government's control).

Not a reason to violate privacy IMO, especially when at the time this was done these people were only suspected of fraud, not convicted.

reply
Aurornis
1 hour ago
[-]
> Not a reason to violate privacy IMO, especially when at the time this was done these people were only suspected of fraud, not convicted.

Well you can't really wait until the conviction to collect evidence in a criminal trial.

There are several stages law enforcement must go through to get a warrant like this. The police didn't literally phone up Microsoft and ask for the keys to someone's laptop on a hunch. They had to have already confiscated the laptop, which means they had to have collected enough evidence to establish probable cause, get a judge to sign off, and so on.

reply
SoftTalker
1 hour ago
[-]
They had a warrant. That's enough. Nobody at Microsoft is going to be willing to go to jail for contempt to protect fraudsters grifting off of the public taxpayer. Would you?
reply