Take Ubuntu, for example. It’s one of the most popular and recommended distros for non-techy users, but just look at the install process: https://ubuntu.com/tutorials/install-ubuntu-desktop#1-overvi...
Let’s be honest, I don’t think most people would actually go through with that.
One idea to fix this and get more people to switch would be for Ubuntu to offer a Windows app that handles everything. It could download the ISO in the background, format the flash drive, install Ubuntu in dual boot with Windows by default, and clearly explain each step so users know how to start using Ubuntu or go back to Windows.
EDIT: Beyond skill, just getting the external media is substantial friction. I haven't used a thumb drive for anything except Linux install media in 15 years; I'm good at computers, but just finding / buying one of those things is its own roadblock.
This sort of thing used to be more common. My first exposure to Linux was before CD-Rs were ubiquitous so there was often no possibility of using external media if you downloaded Linux. Partitioning the drive and installing there was typical.
They were almost never all busy. Then in summer of '95 or so, they installed 100 Mbps Ethernet.
That was a big jump. And security was a huge afterthought at the time; many, many people shared their entire hard drive with no password. If only it had been a few years later with MP3s and affordable CD burners...
This is far from a simple solution (for the layman end-user) compared to the parent comment.
On the other hand, if someone finds that part too complicated to follow, perhaps they wouldn't be able to install Linux - or Windows, for that matter - by themselves, and would run into other issues down the line. Ultimately, replacing your OS with another one does require some minimum level of technical knowledge that you either need to have already or be fine with picking up during the process.
Most people don’t want “tools” — they want a magic button with no guesswork, no fear of nuking the wrong drive, and no tutorial rabbit holes.
A win32 installer that bundles the ISO, sets up the USB, and gently walks you through the transition? That’s the move.
We don’t need smarter users. We need smoother defaults.
I am almost certain something like this existed 15-20 years ago from Canonical.
- Avoid requiring the user to figure out how to get into BIOS/EFI and change boot order. Windows has APIs for manipulating EFI things; may be worth looking into that (see the bcdedit sketch below).
- Replace GRUB with something more modern like rEFInd or Clover with a nice looking theme.
For the latter point, while GRUB is technically functional, it looks scary and arcane to new users and has little resiliency to things like Windows updates mucking with boot entries. It makes for a bad first impression (“why is my computer showing hacker screens suddenly”) and when it breaks your average user doesn’t have a prayer of fixing it. Something that looks more modern and self-heals would be a big improvement.
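On the first point, here's a rough sketch of what Windows already exposes from an admin prompt; the GUID is a placeholder for whatever entry a Linux installer would register, not a real identifier:

    rem list the UEFI boot entries Windows can see (including non-Windows ones)
    bcdedit /enum firmware

    rem boot a given entry once, on the next restart ("reboot into the installer")
    bcdedit /set {fwbootmgr} bootsequence {your-linux-entry-guid}

    rem or move it to the top of the firmware boot order permanently
    bcdedit /set {fwbootmgr} displayorder {your-linux-entry-guid} /addfirst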
Replace Grub with nothing. If you're not doing bootable snapshots like openSUSE, then there is virtually no benefit in a "boot loader". The linux kernel + cmdline (+other stuff like ucode or secure boot signing stuff) can easily be packed into a single bootable .efi file.
That efi file will then get an entry in your uefi boot device list just like windows already has/had. This way is better anyway, since windows will overwrite your uefi boot order with every significant update, meaning users will already need to know how to boot other os's.
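For anyone curious what that looks like in practice, here's a minimal sketch using systemd's ukify plus efibootmgr; the paths, ESP layout, and disk/partition numbers are all assumptions you'd adjust per distro:

    # build a unified kernel image (kernel + initrd + cmdline in one .efi, ready for signing)
    ukify build \
        --linux /boot/vmlinuz-linux \
        --initrd /boot/initramfs-linux.img \
        --cmdline "root=UUID=<your-root-uuid> rw quiet" \
        --output /boot/efi/EFI/Linux/linux.efi

    # register it directly with the firmware, alongside the existing Windows entry
    efibootmgr --create --disk /dev/nvme0n1 --part 1 \
        --label "Linux" --loader '\EFI\Linux\linux.efi'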
If the idea is they go cold turkey full Linux, good luck with that.
If the idea is they use their UEFI firmware boot menu, you're forgetting how unintuitive that is for most users with most UEFI interfaces (spam a hotkey at boot, wait for the slow-loading UEFI, navigate to the subscreen with the boot order, find the right menu item, then either reorder and save or press an F-key combo to "boot once now").
If you managed to install linux then this really shouldn't be a thing to get hung up on.
The biggest sticking point is the fear of losing what they do have, but we're at the point where even their previous generation computer could be made to run Linux.
I guess I'm not surprised with how frequently "reinstall Windows" is offered as a solution, that there is now some lighter version of that. But really I was talking about obtaining/creating installation media and reinstalling from scratch.
Except of course, licenses and copy protection. That stuff is gone and you have to buy it all again, since the install-id is regenerated.
Installing Ubuntu bricked a Samsung laptop I had some years back. Never again.
What? How? I've never seen an installation break the BIOS. I'm sure it's possible, but I wonder what went wrong here.
I do think the UX could be improved somewhat. What about an app called "OS changer" or some similarly accessible, user-friendly name, that shows a list of options with screenshots, short descriptions, and perhaps some categorization/tagging/rating system (this one is good for gaming, this one has lots of support for old hardware, this one is user-friendly)? If you select an option it starts downloading the ISO silently while asking you to insert a USB key; when you insert the key, it shows its contents if it's not empty, along with a confirmation that it will be erased. Hopefully the image has finished downloading by the time formatting is done, and it then creates a bootable USB key from it. It could even read system information to suggest which key to hold during boot, then reboot the system.
However you do it, I don't think there's any way around needing some intermediate to boot into. Come to think of it, a live distro where that intermediate basically is the eventual system seems very user-friendly.
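Under the hood, the scary part - picking the right drive to wipe - is at least easy to query. A sketch of what such a tool would check, with lsblk shown just as one option (a real app would more likely talk to udisks):

    # list whole drives only; RM=1 plus TRAN=usb marks removable USB media
    lsblk -d -o NAME,SIZE,MODEL,RM,TRAN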
Ubuntu and Linux Mint are now recommending balenaEtcher, which is easier to use than Rufus.
For the tech-savvy, sure, but for common people not so much.
Why can't Ubuntu just offer a downloadable media creation tool like Windows does? Surely it's not that hard to couple dd with a basic GUI.
Although, the `cat liveimage.iso > /dev/sdX` tip mentioned in this thread is very handy and is probably enough for me. Anything I can do without a distro specific tool is a win.
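The dd spelling of the same tip, for completeness (sdX is a placeholder; double-check the device with lsblk before running, since this overwrites it):

    sudo dd if=liveimage.iso of=/dev/sdX bs=4M status=progress oflag=sync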
A fresh install of Windows on consumer laptops requires users to locate drivers and supporting software from the OEM's website and not infect themselves with malicious software in the process.
> to fix your busted drive, just nuke the boot sector and send it
>
>     dd if=/dev/zero of=/dev/xxx bs=512 count=1 conv=notrunc
Install Ventoy on a USB flash drive. Copy the ISO - copy as in cp. That's it.
You can add as many ISOs and select which one to boot with.
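Roughly, assuming the stock install script from the Ventoy release tarball and that your desktop mounts the stick somewhere like /run/media/$USER/Ventoy (both assumptions on my part):

    # one-time setup: put Ventoy on the stick (sdX is a placeholder; this wipes it)
    sudo sh Ventoy2Disk.sh -i /dev/sdX

    # afterwards, adding a distro is literally just a copy
    cp ubuntu.iso /run/media/$USER/Ventoy/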
Most people I know do exactly the same: take Rufus or similar and download the official image from Microsoft.
If you are on Windows it is possible, but it involves either going the hard route and downloading special tools from Microsoft (the Media Creation Tool) or the easy route and getting Rufus.
On Linux it is bewildering, but there are tools you can find that do it.
On OpenBSD (that's me, the weirdo using OpenBSD on the desktop) you are out of luck. Partially it's my fault - who even uses OpenBSD on the desktop? But when every single Linux distro is "just dump this DVD image to a USB drive" and OpenBSD itself is "here is a USB image to dump to a drive", you have to wonder why Microsoft makes it so difficult.
I ended up getting it done, but after 30 minutes of chewing through the Linux script trying to get it to work on OpenBSD, I went "this is not worth it for a task I am only going to do once", so I booted a Linux USB and used the script from there.
https://atkdinosaurus.wordpress.com/2023/03/24/another-way-t...
What worked for me was to try Arch. With Arch you basically start with nothing and build it up from there. It'll take a few hours of reading the install process on the wiki, but it's otherwise very easy. In the end you'll have a full understanding of what is happening on your machine. When something happens you will know what program is causing it, because you installed and configured it yourself. Or at least it makes me feel like that, because nothing has ever actually broken or stopped working on my current Arch install, which I've been using for over 8 years. My install seems to 'just work' and I can highly recommend it, even though the install process is a bit more involved.
The only problem with Arch is that it's not afraid to introduce breaking changes in its packages, so sometimes an update requires some manual intervention. It's best to update at least once a week, and check the website once in a while.
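In practice that routine is a single command; the only discipline is glancing at the Arch news page before the occasional flagged update:

    sudo pacman -Syu    # sync the repos and upgrade everything in one go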
> Take a look at a default emacs
Is that the default terminal editor on Ubuntu? I fully agree those text editors where typing doesn't enter text should never be the default. Something like nano should always be the default imo.
And I assume he means that emacs is the default $EDITOR on Ubuntu? I don't know if that's true, but I think emacs is one of those editors that is seen as having very bad usability. I can't imagine that emacs is the default $EDITOR on a recent Ubuntu version, but I agree with the sentiment that those kinds of editors should never be the default.
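Either way, overriding whatever a distro picked is a one-liner; the second command is the Debian/Ubuntu-family way of changing the system-wide default, assuming that's the target system:

    export EDITOR=nano                         # per-user, e.g. in ~/.bashrc or ~/.profile
    sudo update-alternatives --config editor   # Debian/Ubuntu: pick the system default interactively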
"Running Linux in a VM", as you have put it, is miles better because it works all the time with zero friction: no driver issues, random freezes, reboots, etc.
Hardware support issues are certainly understandable, but blaming "opinionated nerds" for them is asinine. It cannot be overstated how difficult it is to deal with certain OEMs.
Let's not forget that computer science and programming were initially fields for and by experts and academics. A lot of tools are written by experts, people who were used to writing and reading long documents of instructions, are intimately familiar with their systems, and often write similar software themselves too. Nowadays that has changed, of course, but the field still has a lot of these experts.
And experts in any field often forget how it feels to not be an expert, and as such assume a lot of things to be obvious and often forget to mention or explain crucial things. But you can't expect every expert in some field to also be an expert in educational psychology, that is a whole field in itself. And even then, you might not have the creativity or writing prowess to write clear and intuitive documentation and pick variable and function names.
On top of that, people are free to do what they want. That they work on this stuff publicly and make it available is in itself something worthy of praise. They don't suddenly have a responsibility to make the stuff easy to use for people who aren't at the same level.
A distribution like Ubuntu is in part basically a specific collection of all these free tools and software and presets. And maybe there doesn't exist a super user-friendly alternative for every one, and there is basically no incentive for people to spend free time making these.
Even if Ubuntu specifically claims to be user-friendly (I don't know if they do), numerous valid reasons might exist for that. It could be that they'd rather spend their time refining and improving an often-used part of the system. Or that they see the terminal itself as an expert tool, and would rather have non-experts use the configuration that's available through the UI.
Although it's beside the point I think it's also important to realize that it's basically inevitable that there's going to be at least some difference between the quality of software that's free and open source, versus a company that has the ability to spend, and stands to profit from it. In my opinion the free part means infinite value and trumps any gap in quality or functionality.
Still agree that nano or something similar should be the default and vi or emacs should be an option.
At the same time, we still have a major problem at work if Microsoft goes through with this. I work in a research lab with 10s of 1000s of dollars worth of Windows 10 workstations that cannot be upgraded. We use Windows remote desktop and plenty of other software that is Windows only. The hardware is still pretty new and capable. With NIH cuts the last thing we need now is to have to spend money and lots of time to replace all that for no good reason.
You can buy extended support for orgs like yours that require it - https://learn.microsoft.com/en-us/windows/whats-new/extended...
1. in higher use than its successors
2. only had one possible successor
3. the successor did not support hardware in use at the time
?
I'm sure it won't stop them, as you say, but really Microsoft, as someone who used to be a (relatively rare at the time) defender of yours, get fucked. The Raymond Chen camp is truly dead (https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost...)
2. ... I mean, that's every version of Windows. XP? Vista. Vista? 7, etc. The last time you had two choices of Windows was in the '90s.
3. It does support hardware in use 'at the time'. I upgraded from 10 to 11 on existing hardware.
If you mean older hardware, 98 and NT4 were the last to support the 486, yet 486s were still in use by the time of release of Me/2000 (I sadly had to interact with said 486s in a school lab). XP -> Vista made the jump from a Pentium 233Mhz minimum to 800Mhz minimum, /and/ caused many issues due to the introduction of WDDM causing a lot of graphics hardware to become incompatible.
This is nothing new. Those pulling the shocked pikachu face perhaps just haven't been around the Windows block enough to realize... this is nothing new.
Good for you. There is plenty of hardware out there without TPM 2.0 that is not allowed to upgrade, even if it is in every other respect more than capable.
Starting with this in 2021 https://christitus.com/update-any-pc-to-windows11/ and likely (I'd have to check) integrated into Chris Titus's WinUtil by now.
Some combo of tweaking registry values or zero sizing a DLL has done the trick so far (but perhaps not into the future with upgrades and patches).
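For reference, these are the widely circulated registry values behind most of those tweaks; I'm repeating them as published elsewhere, not vouching that they'll keep working through future updates. The LabConfig keys apply to clean installs from the setup USB, MoSetup to in-place upgrades:

    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassTPMCheck /t REG_DWORD /d 1 /f
    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassSecureBootCheck /t REG_DWORD /d 1 /f
    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassRAMCheck /t REG_DWORD /d 1 /f
    reg add HKLM\SYSTEM\Setup\MoSetup /v AllowUpgradesWithUnsupportedTPMOrCPU /t REG_DWORD /d 1 /f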
Now let's have a long prattle about our environment stewardship: https://www.microsoft.com/en-us/corporate-responsibility/sus...
> It does support hardware in use 'at the time'. I upgraded from 10 to 11 on existing hardware.
Of course it supports some hardware in use right now. But the core requirements used to be essentially about speed; now, even with a fast processor, you're SOL if your system lacks a TPM or your CPU isn't on the supported-model list. Vista had more compatibility issues than usual with peripherals, but that's quite different from having to toss the whole machine. And even then: Vista was released in 2007. You had 7 more years to stay on XP.
Not only are we handwaving the obvious reality that hardware used to have a shorter effective life because it was advancing so rapidly, but the Pentium 233 came out in 1997. XP went EOL in 2014. That's almost 20 years of hardware support. My family has various machines from 2015, 2017, etc. that otherwise work perfectly fine but don't support W11. I have an older laptop with a 4 core (8 HT) 2.6 GHz CPU (3.6 Turbo) with a 1 TB SSD and 16 GB of RAM, amply powerful, but nope, no Windows 11.
Not just speed but instructions.
> you're SOL if your system doesn't support TPM and specific models
TPM support at this point in time is very old, roughly 7 years or so, along with processor model. Newer processors lack the appropriate features to support the security features of Windows 11, i.e. VBS.
New OSes have new features which require new hardware; new being highly relative here as it's quite old hardware at this point.
In fact, let's compare this pointless consumer-hostile debacle with XP, where MS went out of their way to actually improve security by heavily revamping XP and keeping it alive longer than it would have been. Meanwhile, the obvious reality that's going to happen this time around is people are not going to throw out their machines, those machines are just going to stop getting security updates. Great work, Microsoft.
So really then, what is it you're trying to advocate, that this is all...good? Or is it just argument for argument's sake?
Microsoft (well, the Windows part) is looking more and more like the Apple and Sun in that article. It’s the #2 or #3 user-facing OS these days. The fancy new programming environment happened and most stuff moved there, but it’s JavaScript and the browser rather than C# and .NET. Running old software is becoming a niche and getting more so by the day.
I've given up on my hobby projects because it was to the point where each time I got a few hours to look at them I'd spend it all doing updates or adjusting to deprecations.
One thing that struck me rereading Joel's article: those shiny new APIs he rattled off, indeed almost none of them gained any traction. And he was spot on about the UI framework fragmentation too.
Recently Windows Phone came up, and a lot of the same themes popped up, for example changing the SDKs repeatedly, charging for the privilege of using the app store (so much for giving the tools away), etc. I think part of the issue is that Apple somehow gets away with doing this sort of thing, but Microsoft doesn't have anything close to the marketing chops to brainwash people into getting screwed over and liking it. Maybe because Apple goes out of its way to make buying new Apple products a positive experience, rather than a trip to a dealership for a new car.
But perhaps one difference is that with consoles there's a free Sisyphean "reset" with each generation, which never happened with the phone. That gives a spot to enter the race.
Plus the whole thing with the phone carriers...in the US at least I'd wager that if the carriers don't offer your phone, and the salespeople don't talk it up (which phones are going to give the salesperson the most lucrative commission, by the way? Don't forget accessories...), then that's the ball game
Fortunately there might be hope on that. Pathetic that it had to be someone presumably doing it on their own time, after all we know how resource-constrained a small business like Microsoft is
https://www.reddit.com/r/WindowsMR/comments/1l65ji8/things_a...
It's the same situation as last time with Windows 7. You can get three years of extended support for the monthly cumulative update, which I assume is being done given it is fairly inexpensive. The US government gets favorable pricing from Microsoft.
The consumer price for Windows 10 ESU is $30/$60/$90 for the first/second/third year.
Some companies may be buying an extension for specific equipment that runs Win10.
Computers are cheap!
I've been there.
I assume the research equipment is the usual uncommon specialized non-consumer (expensive) electronics, and the PCs involved are of course the exact opposite, cheap commodity office machines no matter what you do.
And Windows comes along for the ride.
Naturally IT has trouble just keeping up with the OS as it morphs, and their ratio of non-technical to technical-minded users is through the roof on the office machines; those are packed to the gills with boneheaded problems even if Windows never changed, so IT is never going to have what a research place needs.
All the technology is there, you just have an IT gap.
I was too.
I've put Linux on PCs for the occasional everyday user, usually multi-booting. But for labs, mostly they need more help with Windows than they can expect from IT.
Now when I started building labs it was before there was DOS or Windows and that kind of stuff, and naturally no IT yet either :)
So when I got to a place where they had the modern shitshow already in progress (or lack of progress), I could cry foul as soon as any significant delay was introduced by IT - anything, really, that would not have been even a speedbump if there were no IT, much less a roadblock or show-stopper. But I didn't try to say there was an answer right away; eventually nature took its course and the delays alone accumulated enough to allow for a site-specific correction. So I stepped up to fill the gap.
>they are basically gaming machines that we bought and maintain independently of the IT department because we have specific computing needs.
Often the only effective approach, and one that nobody else would dream of.
Looks like you are already about in position :)
It may be uncharted territory and you may need to keep on going until you reach unquestionably more effective performance. Otherwise some pressure could develop to turn back
If you're going to get the most out of your scientific equipment, you're going to need a scientist on your team who can navigate the installation & maintenance of the specialized software for various vintages of instruments across multiple versions of Windows and different generations of PC hardware. At the same time moving toward mastery of each instrument itself as a primary goal, so much of the time you need Windows and the PC component of the apparatus to "just work", get out of the way, and never change. Not exactly a good fit for any IT roadmap when you have a PC with needs very far outside their routine or comfort zone. Must be able to do things with Windows that IT can not, resulting in a much more trouble-free experience overall compared to the routine office machines.
IT can rarely do an ideal job even when they have only the office machines to worry about, and some labs have really been needing a lot more than that for a while. If you don't do it, who will?
>if I could find some extra time.
Only took a few short years, and it's really like getting two years experience for every year :\
After that you sure can get a lot more out of the electronics though :)
Oh yeah, this is just Windows, that was challenging enough but the mission-critical instruments depend on it.
You must also be able to demonstrate comprehensive backup and rapid full recovery way more effectively than any alternative if you're doing your best.
As for Linux, the next step at this site would be migration of the internet and office machines to a stable Linux or two, but their IT is not ready for that and I don't know exactly how I would substitute for everything they are doing with Windows either.
I'm glad it's not my turf anyway and took the time to develop a good bridge to IT by reducing their headaches rather than encroaching on their domain. That could not have happened overnight but it was worth it to get stuff accomplished in the labs.
Windows 10 ending in October blows my mind in contrast to the free as in beer near GUI-less Microsoft Hyper-V Server 2019 receiving extended support (security updates) until 2029. I'll probably assemble a patched-up/slipstreamed installer for recycling older equipment!
While the TPM and RAM requirements can safely be bypassed, user apps (e.g. Adobe CC and certain anticheats) may assume those requirements are satisfied.
And that calculation is affected by so many things, including the physical security and if it's internet-connected.
None the less, it was never a Microsoft official statement.
Even if it said go install Ubuntu or something... Very few people think of a kernel and OS as separate things. Hardware and software separation is already sketchy enough. Instead of people interjecting for a moment, can there just be a penguin-branded "Linux" OS already?
Nobody in their right mind would claim that they are building the official Linux OS without turning the whole community against them.
And it's not as if the average user needs to use Linux. If developers move from Windows 10 to Linux, the impact would be huge.
Nobody is upset that there's an official Linux kernel. Of course it takes Linus Torvalds to declare it, and he's understandably not interested in designating an official OS, but this is a consequence.
Because he was literally the creator of the whole thing. And the word "official" means little in the open source community. Yt-dlp took the crown from youtube-dl's hands when it comes to downloading videos. Is yt-dlp official? What does official even mean?
And it's fine that many devs want things that just work. Little by little, everyone is noticing that Windows is not only not improving but taking direct action to make the experience worse. The balance is tilting in favor of Linux not only because Linux is getting better but because Windows is also getting worse.
But some people are more curious than others. And among them, chances are that a large fraction can understand what a distro is.
One of the biggest faults of Linux is that we don't have an easy, user-friendly, idiot-proof distro for normies; the usual recommendation, Ubuntu, is just broken corporate slop.
When I was wearing the various "save users from themselves" hats in my previous life, Ubuntu users were 100% the bane of my existence... since they were all server customers, the ones that took my advice and let me help them switch over to Debian suddenly stopped being frequent footgun fliers, no matter what their original issue was.
Ubuntu, to me, is simply Debian that has been aggressively turned into enterprise slop.
There has to be some acceptable default that doesn't change too much, even if it's not the best thing ever. Ubuntu changed DEs twice even though the original was fine. Windows UI is intentionally bad at this point, but at least it's stable.
Windows 2000 to XP to Vista to 7 were big breaks in UI. 7 to 10 was a break. 10 to 11 was a break. When I now click the lower left corner, weather opens. When I'm pressing the Windows key, no applications menu to be seen, just some web search slop.
The only thing that's constant with windows are the lying percentages, where 99% and 100% take as long as 0-98%...
Unless I'm misremembering, Windows key still opens the start menu with your apps. It's just that they added tons of adware and crap next to it.
Windows updates are a support nightmare. It's just that everyone accepts those and forgets about the pain they cause. Whereas some tiny change in Linux desktop environments is always a catastrophe when people tell about it.
Microsoft gets a free pass because of Stockholm Syndrome, I guess...
I would totally use Linux on desktop if I didn't have a Mac. But it's only because I'm stubborn enough not to tolerate Windows treating me like dirt, not because it's actually a rational decision to use Linux.
Also, who googles now anyways? Ask your local ChatGPT haha.
Jokes aside, ideally would be to make a decision and recommend Ubuntu (or PopOS).
Anticheats like BattlEye started as private-server add-ons like this too, not official support, but admins chose to install them. I even remember Brood War's private ICCUP servers had their own anti-hack, as they called it.
Of course the well known gaming company that releases a distro is Valve. But, rootkits don’t seem like they fit their particular ethos (they are well known for their less annoying DRM scheme, right?). TBH, it seems like a rare opportunity to break the hold they have on the “game store” concept.
Fortnite uses EAC which does work on Linux, only they decide to block it.
> The openness of the Linux operating systems makes it an attractive one for cheaters and cheat developers. Linux cheats are indeed harder to detect and the data shows that they are growing at a rate that requires an outsized level of focus and attention from the team for a relatively small platform. There are also cases in which cheats for the Windows OS get emulated as if it’s on Linux in order to increase the difficulty of detection and prevention. We had to weigh the decision on the number of players who were legitimately playing on Linux/the Steam Deck versus the greater health of the population of players for Apex. While the population of Linux users is small, their impact infected a fair amount of players’ games. This ultimately brought us to our decision today.
Linux's inability to run specific anti-cheat solutions is a vendor support issue on the anti-cheat maker's part, because they don't care about your security, and they've managed to convince game developers that this practice is acceptable. It's not. Vote with your wallet.
If a user agrees to a kernel level anti-cheat, it's not a rootkit.
Who reads the EULA? Nobody knows what they're agreeing to, ever. Even for computer-savvy individuals, do they know all of what the kernel-level anti-cheat does? Of course not. Even their consent isn't informed. For normal users, they don't know anything about anything.
I'm not holding my breath for this to happen though.
https://support.apple.com/guide/security/securely-extending-...
But with Linux being open, they certainly would produce a loadable module if there was enough install base to justify it.
True, but the main point of a kernel mode anticheat is the ability to verify that the OS and game aren't being tampered with. If the OS has that capability already built in, then the need for a kernel mode anticheat diminishes.
>they certainly would produce a loadable module if there was enough install base to justify it
It's not realistic for there to be such an install base to support such complexity compared to having them implement a simple API into their game and server.
It's not actually the message from the kernel that provides the value, it's the work needed to fake such a message.
The issue is that Windows is designed to be able to protect the will of proprietary software publishers against the will of users that want to assert control over the software running on their computer. It's very similar to the story with DRM.
Linux desktop OSes will never put in place the measures to make a Vanguard-like system work, because it's just unethical for a bunch of reasons, the most basic of which being that it's a violation of freedoms 0 and 1.
This isn't true. And supply chain wise just look at the xz backdoor. A random person was able to compromise the supply chain of many Linux distros. Security also is not just supply chain integrity.
>Windows is designed to be able to protect the will of proprietary software publishers against the will of users
I'm not sure what you mean by this. Just because Microsoft cares about developers, it doesn't mean they don't care about users.
>that it's a violation of freedoms 0 and 1
It's not. Freedoms 0 and 1 do not give you the freedom to cheat against other players without being banned. You can be free to modify the game client, but you aren't entitled to play with others using it.
For a multiplayer game, I'd argue that playing with others (even if you're restricted to private servers, not that most games support that anymore..) is running the software. Being able to use a piece of software for its intended purpose is more relevant than a literal reading "you are allowed to exec the binary and nothing more"
It's very obviously true. Linux culture is installing software from trusted repositories. Windows culture is downloading random .exe or .msi from websites and then immediately running them with full permissions.
That's why Windows has a lot of malware and Linux doesn't. It's trivial really to smuggle malware into closed-source applications that are distributed like the wild west.. If I google a popular Windows program right now, I'm going to get a lot of download websites that supply me a sketchy exe.
Some of the malware difference is because of popularity, sure. But ultimately it's 10x easier for me to add a virus to Photoshop and upload that exe to download.com than to smuggle malware into open-source software in the Debian repository.
> I'm not sure what you mean by this.
It means that when companies want capabilities X Y Z which limit user actions on their own computers, Microsoft will cave. They do it all the time. Microsoft cares about making companies happy and they don't care too much about keeping power users happy.
> It's not.
It is. You're constructing a strawman. You're saying that freedoms 0 and 1 don't allow you to cheat freely. Okay, you're correct - nobody has ever said that.
What we're saying is that building kernel-level APIs to hook in anti-cheat or other anti-user software is antithetical to freedoms 0 and 1. Which it is.
I was talking more about the supply chain of the operating system itself, but let's not forget Linux has a culture of people running random commands off the internet, which is also an easy vector to get people to install malware. Also, I think you are overconfident in how much vetting repositories like npm do. I'm sure Linux people download random stuff off of GitHub too, like AppImages.
>it's 10x easier for me to add a virus to photoshop and upload that exe to download.com
You can do the same thing but with a Linux binary of "photoshop."
>That's why Windows has a lot of malware and Linux doesn't.
This is due to more consumers using Windows than Linux.
>You're constructing a strawman.
I'm trying to infer what you mean, since this is asynchronous communication, because the claim that attestation is related to freedoms 0 and 1 is not true. One is about proving information to another party, and the other is about having freedom over what you are running on your computer.
>What we're saying is that building kernel-level APIs to hook in anti-cheat or other anti-user software is antithetical to freedoms 0 and 1.
In this case being able to prove with relatively high confidence that no one in a game is cheating is a pro-user feature.
Being able to attest to the system state does not limit freedom 0. Anyone is still free to run any system they want, they just can't attest to their system being trusted if they are not running something trusted. Attestation doesn't make software any harder to modify than before, freedom 1, it only prevents you from attesting that you are using unmodified software when you aren't. Linux distros are not arms of the free software foundation so I don't think trying to argue about what they think is free or not is necessarily relevant to something like this being created.
It's really not and the culture is not that big.
In Windows, ALL software is installed through suspicious means. In Linux world, MOST software is not. That's the difference.
If some dumbass wants to curl a random URL into a shell that's their problem. That's a very rare occurrence.
> You can do the same thing but with a Linux binary of "photoshop."
Yes, but it seems to me you are choosing to be dense on purpose and it's irritating me.
Please read what I am actually saying. I'm not saying it's impossible to make malware for Linux systems. I'm saying the CULTURE of Linux users is not to download random executables. So if I do that, it wouldn't be very effective. If I upload a random ELF executable to download.com, close to nobody is going to download it. On Windows, this is not the case.
> This is due to more consumers using Windows than Linux.
Again, I've already addressed this. It seems you cut off the quote too early.
This is PART of the reason, but we have to acknowledge how much easier it is to actually distribute malware on Windows.
The "popularity" argument is also just a bad argument. Linux is absolutely not unpopular - almost all the servers worldwide run some Linux distro. Those servers, lots of them, contain valuable data. They are absolutely a target for malware authors. There's probably more servers running Debian alone than Windows Server and it's not even close. Even still, there's a lot more malware that runs on Windows Server than Debian.
> Linux Distros...
There seems to be a fundamental misunderstanding here.
What you are proposing is a change to the Linux kernel which allows it to not be modified in some way. That's not something that is a distribution concern - that's a kernel API concern. Which will never be implemented in the kernel for the reasons already specified.
>What you are proposing is a change to the Linux kernel which allows it to not be modified in some way.
Linux already supports attestation and it doesn't prevent the kernel from being modified.
>That's not something that is a distribution concern
It is, because distros like Android already expose such an API to apps.
>Which will never be implemented in the kernel for the reasons already specified.
Again it already exists in the kernel.
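To make that concrete, the building blocks are already visible from userspace on a mainline kernel with a TPM and IMA enabled; the paths and the tpm2-tools invocation below assume that configuration, and key.ctx is a placeholder for a previously created attestation key:

    # read an individual PCR value exposed by the kernel's TPM driver
    cat /sys/class/tpm/tpm0/pcr-sha256/7

    # IMA's running measurement log of what has been loaded/executed
    cat /sys/kernel/security/ima/ascii_runtime_measurements

    # produce a signed quote over selected PCRs with tpm2-tools
    tpm2_quote -c key.ctx -l sha256:0,2,4,7 -q 1234abcd -m quote.msg -s quote.sig -o quote.pcrs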
As for `curl ... | bash` that's a developer only thing. No user space normal applications are installed that way. I've never seen it.
Is this method good? No. Is it used exclusively by power users who presumably know what they're installing and from where? Yes.
The difference here is ALL software on Windows is installed this way. There's basically no exceptions. And don't even try bringing up the Windows store.
Switched to Fedora, and now the majority of things are recently updated in the repos. (The Flatpak library is increasingly robust, but that of course applies to Debian too.)
The xz backdoor was successfully caught before it landed in mainstream release branches, because it's free software.
But broadening the scope a bit, the norms of using package managers as opposed to the norm on Windows of "download this .exe" is a much stronger security posture overall.
I am aware the Windows Store exists; it's not widely used enough to make exes a marginal distribution pathway. I am aware curl | bash exists; it's more common than it should be, but even in those cases the source is visible and auditable, and it's very uncommon for non-technical users to ever do that (unlike downloading random exes).
> Freedom 0 and 1 does not give you the freedom to cheat against other players without being banned.
That's a strawman, I never claimed you should have the right to cheat against other players.
> You can be free to modify the game client, but you aren't entitled to play with others using it.
And that's the issue, Windows has functionality to impede your ability to run the software as you see fit and modify it to your needs. Perhaps you want to run your own server, with different moderation policies.
What? It literally got included with several distros. It wasn't caught before it shipped to end users. Just because it got caught before slower to update distros got it, that doesn't mean it is okay. It reveals how low the barrier is for an anonymous person to get code into the OS.
>I never claimed you should have the right to cheat against other players.
Attestation doesn't take away your ability to modify and run software, which means that you still have freedoms 0 and 1. It just means that you cannot prove to a remote server that you are running unmodified software. To me you were implying that the server being able to kick people who modified the client to cheat was violating their freedom.
>Perhaps you want to run your own server, with different moderation policies.
Nothing would stop you from running your own server like that.
What exactly do you mean by this? Right now no users can use Linux and play the game at all. Allowing more Linux operating systems to be able to play the game gives users more choice than before.
>Client-side anticheat is inherently security through obscurity
There is nothing fundamentally wrong with security through obscurity. It's just that for some problems the return on investment (security gained for the resources needed) is not worth it. For anticheat the obscurity can slow down cheat developers and raise the barrier to entry for developing cheats. Cheaters just have to make one mistake to get caught.
Realistically most Linux users are using a stock kernel and not something custom compiled. You can have both customization and a way to offer a secure environment for apps that need it. Even if you want to allow for custom kernels and drivers, the game could be set up to run in a secure virtual machine.
>The only way for this kind of anticheat to work is by introducing some part of the kernel that users can't touch.
To be clear, attestation is not anticheat. But yes, there would be components that end users would be unable to modify without removing their ability to attest to there being a secure environment for the game. Either these customizations need to be turned into policy for a trusted component to handle, or the customization needs to itself become trusted.
>but Linux isn't about obscuring the system from its owner.
Nothing about attestation requires obfuscation.
What you're asking for does exist though, in the form of Android devices and game consoles. Was curious about Steam Deck and... turns out it doesn't have secure boot. Someone could build a desktop OS on top of an anticheat-friendly kernel, but it'd probably not be big enough for gamemakers to care, and Linux desktop people would be uninterested in it to say the least. (I'm on a Mac btw, I have no horse in this race, just understand people who do)
I only wish the process/instructions were a little more friendly for normies.
In practice, it may not work properly even on their "supported" models. For example, sound does not work on my Dell E7270. Secondly, you must be willing to use the Chrome browser. I will not, because Chrome no longer has the option to always show the scrollbars. I am convinced that modern UX/UI designers hate their users.
This is the first post to get substantial conversation, though. The impression I get is that on-topic reposts are fine until such time as they get traction - provided that they a. aren’t self-promotion and b. are made by different users.
As someone who regularly gets setting up new computers dumped on them, having to click through all of those dumb screens before being allowed to start using the browser has been the biggest contributor to my decision to ditch Windows.
Anecdata: a mate of mine plays Helldivers 2, and thought he couldn't play it on Linux or it wouldn't work well. I told him I had played it and it worked fine. Two days later, he's using Linux and getting better performance than he was on Windows.
It has been five years of gaming exclusively on Linux, and I have yet to find a game I can't play, with the only exceptions (for me) being League of Legends and iRacing. But I can live without them. If you don't play extremely competitive online games, you can probably play whatever you like. My rule of thumb is, "are there IRL pro tournaments for money?" If there aren't, it'll very likely just work.
My only tip is to just use something common: Ubuntu, Mint, PopOS, Arch, ZorinOS, Kubuntu... all will probably work with zero effort. Don't go mucking about with weird distros and bizarre tweaks, and you're more than likely going to have the most stable system you've ever used.
I cannot recommend Linux highly enough. Five years ago I was skeptical and unsure but tired of Windows bullshit and here I am— still loving it. I’ve fully upgraded the system recently, except for the GPU (because 5090 prices are ridiculous and I don’t want less VRAM than my 3090 has) and it even booted from my old install and just worked.
Try Linux, friends. It’s pretty freaking great these days.
Can't help thinking that should be in a bigger font. It's a shame there doesn't seem to be a way to install Linux and keep your Documents directory at least. Is that due to file systems?
[Yes, yes, backup to memory stick/external drive but I'm talking about for your average person on the street]
So long as enough contiguous space is available to install the desired Linux distro.
You can't do this all on the same drive, because you need a place to copy the documents directory to. You need to delete the NTFS partition to create the place to copy the files to, but by the time you've done that, the Documents are inaccessible. You could do it in memory, feasibly, if you create a RAMdisk and are lucky enough to have enough memory for all your documents, but then you're still gambling on not running out of memory during the install.
So it is possible to copy the documents on the same device, and it's possible to even automate the process, but it's not possible to do it reliably or safely, and the reliability is so low that it's not worth even offering the possibility. If somebody has a handful of gigabytes of documents, it's already a nonstarter. To be safe you'd demand the user make a backup onto another device anyway, in which case they might as well do that and then copy the files into a fresh install themselves
It's not just shrinking and copying over to the new `/home` because of the locality of the data. If your NTFS partition is taking the entirety of the disk (minus EFI and system partitions), shrinking it will then make it take up the first X% of the disk. Then you have to make the linux installation on the last (100-X)% of the disk, copy the files over, and then when you delete the NTFS partition, your Linux filesystem is on the last half of the disk with a big blank unallocated area on the beginning. BTRFS or LVM2 could help a little bit there, but that's far from ideal in any case.
Probably the best approach would be to shrink NTFS, create a new partition at the end of at least the right size, copy the files over, then wipe the NTFS partition, install Linux as the first partition (after system/EFI and such), then copy the files into the user's home, and then remove the documents partition. That's still not super reliable, though. You are at the mercy of your documents sizes, filesystem fragmentation (remember, even if your filesystem is mostly empty, you might not be able to shrink if fragmentation is in a bad place. You could defrag, but then the install time can balloon up many hours for the defrag process alone, just to shrink a filesystem that you're going to delete anyway), how big the Linux install will end up being, and many other factors. You'd have a lot of people who simply can't copy their documents over on install who will be simply SOL. I can't think of a situation where this kind of thing wouldn't be better served by just telling the user to backup their documents to a USB drive and move them back afterward, because many people are going to have to do that anyway.
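To spell out why it's fragile, here's roughly what that automation would have to run under the hood; device names, sizes, and partition numbers are placeholders, and every one of these steps is destructive if pointed at the wrong thing:

    ntfsresize --info /dev/sda3                        # how far the existing NTFS filesystem can actually shrink
    ntfsresize --size 200G /dev/sda3                   # shrink the filesystem first...
    parted /dev/sda resizepart 3 200GiB                # ...then the partition to match
    parted /dev/sda mkpart primary ntfs 200GiB 230GiB  # carve out a temporary documents partition
    mkfs.ntfs -Q /dev/sda4                             # quick-format it
    mount -t ntfs3 /dev/sda4 /mnt/docs                 # (or ntfs-3g) copy the user's files here, then hand sda3 to the installer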
I like to preserve the ability for the user to boot to the original Windows they have become familiar with, so they will have continued access to their local files using established permissions, apps, bitlocker, etc. Going forward on their own time at their leisure.
Shrink that old NTFS volume quite a bit, which the user won't be using that much more anyway, and make a new NTFS partition for W11. Pay attention to the usual optimizations like no hibernation or auto Daylight Savings adjustment and nothing beats dual booting the regular NT6 way. I also disable bitlocker for the Windows install process once that became the default, this must be carefully reserved for intentional deployment with the user's involvement afterward. Then in remaining space on a third major partition, install Linux to a single dedicated EXT-formatted type {0FC63DAF-8483-4772-8E79-3D69D8477DE4} volume.
Usr, swap, home, etc will all be there in one place (not unlike Windows which most often is confined to a single main partition itself, utilizing only the boot files located on a separate dedicated boot volume), and Grub will point to the still-existing functional NT6 bootloader when you need W10 or W11.
You create the new Linux {0FC63DAF-8483-4772-8E79-3D69D8477DE4} partition in Windows beforehand, which does a good job of alignment on SSDs, and leave the intended Linux partition unformatted. In Diskpart SET ID={0FC63DAF-8483-4772-8E79-3D69D8477DE4} to categorize a selected ordinary Windows OS partition as a blanket Linux partition instead, regardless of whether either one is formatted yet or not. Once the Windows system is OK (multiboot or not), then boot to the setup USB of the chosen Linux distro instead, and if everything is nominal the established boot volume will be autorecognized, you can choose the target unformatted space for your root, making no other choices for things like USR, Linux will install to that single target partition and it will just work. Linux goes onto its own partition, never touches Windows at all, nor anything in the \EFI\Microsoft folder. In this case Linux merely replaces the \EFI\Microsoft-seeking \EFI\BOOT\bootx64.efi, one single file (which you can "easily" back up once you gain access to the EFI folder :\), with an identically-named \EFI\BOOT\bootx64.efi which instead seeks a \EFI\ubuntu folder, for instance.
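The diskpart portion of that, as a sketch; the disk number and partition size are examples only:

    diskpart
      list disk
      select disk 0
      create partition primary size=102400
      set id={0FC63DAF-8483-4772-8E79-3D69D8477DE4}

(Size is in MB, the new partition is left unformatted for the Linux installer to claim, and create partition leaves it selected so the set id applies to it.)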
Where the \EFI\ubuntu folder is its own boot folder autocreated during an Ubuntu install process. Not much differently than the \EFI\Microsoft folder that was autocreated during the initial NT6 installation process. Where additional Windows versions installed later to other partitions do not create additional \EFI\Microsoft folders, but instead add a bootmenu entry within, pointing to the newest Windows install as the new default. Leaving previous Windows installs as non-default entries.
You also may have to put adequate focus on the EFI subfolders on the SSD so they can handle the boot process completely, without any dependency on the actual UEFI firmware boot entries within the mainboard, but with some optimized settings this just works too as soon as the Linux install is complete. Regardless it often may be best to delete the mainboard entries themselves once this is confirmed. But different mainboard UEFIs can have different approaches to the settings needed for this to work to your advantage. UEFI may be stupidly more complex than BIOS, but there are still not that many different settings compared to most user software and it doesn't take that much effort to become more familiar than the average person. After all this time has passed, the bar is still very low due to so many mainstream users recoiling in absolute learned fear at the thought of even looking at their BIOS settings. Not a problem for a true tech noob if they put their mind to it, but when does the average noob get around to that? I thought so.
In an ideal UEFI implementation proper firmware entries are autogenerated from what is found on the SSD. But sometimes only when there are no existing entries in the UEFI or anything else unexpected, and not often will unused entries be autoremoved properly once a particular SSD has been intentionally disconnected and is no longer part of the system. UEFI Shellx64.efi can be your friend which is like DOS only simpler, but few go there either. If you can do a hello world in any language you can probably remove unwanted entries with a UEFI shell though. Can also be accomplished from the command line in Linux or Windows but Shell is easier.
If you do get a wild hair and manually put the original \EFI\Microsoft-seeking bootx64.efi file back into the \EFI\BOOT\ folder, to replace a Linux version of bootx64.efi, well the PC will again act like there is no Linux at all then. No sign of Grub will exist and Windows will naturally not natively see the files on the EXT volume.
Upon startup a Linux-seeking bootx64.efi proceeds to Grub in the \EFI\ubuntu folder, where Linux is the default but you can choose to (multi)boot Windows as the main alternate choice any time you are at the Grub bootmenu. Because the Windows bootloader is detected and autoadded to the Grub bootmenu during the Linux install from the beginning. Additional versions of Linux installed to further partitions will become the new default in Grub. To get Windows to be the default in Grub you have to take action yourself though.
If you then install an additional Windows version, or re-install Windows in many nominal ways, it will usually overwrite the Linux-seeking bootx64.efi with a Windows-seeking version, and then it acts like there is no Linux any more either, but Linux is still there untouched assuming you always correctly direct any Windows or Linux installs to a single partition at most, without overwriting, formatting or deleting anything else, especially not re-partitioning of any kind.
This is all provided you have almost every single option on the Linux install routine very carefully chosen to achieve this exact scenario. Once you determine the settings it's a breeze to get there every time.
Unfortunately there can be a big difference between distros as to how to get the settings right so serious rehearsal using SSDs containing no valuable data is a must :\
When the time is right, any OS on the drive can have its ethernet & wifi devices disabled once it is no longer being effectively secured from network threats. So the user doesn't accidentally go on the web with an OS that they shouldn't.
Sure is a lot easier to just discard everything on the SSD, turn it over to Linux completely and kill 'em all, but it's not for everybody and some people do really well with a transition piece or two.
Think about the demand and supply curves of calculations (or computation). For most of history, they moved in tandem, with supply moving slightly faster, so computers would always do more at slightly lower costs.
Now both curves are speeding up, but demand is moving faster, so the costs of hardware are going up. And when high end servers (with GPUs) are unavailable, people hold onto the older ones longer.
Examples:
- Screensharing in Teams. There was a gaussian blur over everything. I had this happen during a work call.
- Nvidia. I kept getting screen-tearing. I went through various guides, installed drivers and so on, but it never worked properly.
- Office. LibreOffice slaughters my Office docs. The formatting is wrong, things are broken.
- Media. I had issues watching things that I could just watch on Windows.
Those kinds of issues were fun to me 20 years ago; they were part of the adventure of roughing it and sticking it to the man. Today, I don't have the time or energy. I'd rather use an OS that Just Works. When I need Linux, WSL has worked great.
But Arch basically starts you with nothing and you build up from there; you'll know exactly what software does what, and when problems arise you know where the problem is coming from. That being said, I've never had a system that is as stable and 'just works' like my current Arch install, and aside from twice needing manual intervention after not updating for a while, it's been going for 8 years. I'm currently at the point that whenever something is not working (adb not recognizing my phone, an external USB hard-drive bay not being properly recognized), I always know that those things will just work as soon as I reboot my machine into Arch. Although on Win10 most things 'just work' as well, so those moments aren't that common.
As a side note Arch has some very good informative wiki pages on basically any software you might want to use on linux. Even if you're not using arch it's very useful for basically any linux user, and it often has a section with known issues and workarounds.
1. Nothing is googleable. People have to google how to do things like adjust the layout of external monitors, and it's significantly harder to do that on linux.
2. There are a lot of different ways to install applications, and different options are available depending on which distro or application you're targeting
2. Most distros have an App Store that’s easy to find these days. Works great for non-cli tools
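Or from the terminal, the equivalent is a one-liner once an app is actually packaged; GIMP here is just an example, and the Flathub ID is the published one for it:

    sudo apt install gimp                    # from the distro repo (Debian/Ubuntu family)
    flatpak install flathub org.gimp.GIMP    # same app from Flathub, distro-agnostic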
It's like 900x easier to install random software you find about online on a Mac (there's zip containing the .app directory, done), and about 10x easier to install random software on Windows (they give you a .exe you double click, click next a few times, done). Versus Linux where you look at a list of different file types, consider the differences between a .deb, .rpm, figure out if it should come from Flathub, deal with enabling unverified Flathub packages, possibly disable a Flathub package from your distro that sucks and overrides the maintainer's package, etc. See things like https://www.reddit.com/r/pcmasterrace/comments/1htu87i/it_to...
I doubt many people are interested in googling apps, finding an exe and clicking a "next" button 5 or 6 times in 2025.
I know some distros hide, by default, repos and flatpaks with non-free software, but even then there is plenty of open source music software that supports streaming from Spotify and advertises it.
Having set one parent up on Mint, I can say categorically that it is still a bit of a config nightmare.
There's been a ton of progress; thankfully people keep using Linux despite the very vocal, frustrated "failed" migrations.
In a more reasonable world they’d owe their customers a recall.
Untrue.
https://arstechnica.com/gadgets/2025/05/linux-to-end-support...
I tend to play around with old machines (late 80s, early 90s mostly) and getting any kind of modern-ish distro working on old machines, even distros really cut down, can be quite challenging.
As a response to the kernel's various SNAFUs, I've gone ahead and refunded to myself all of the money I've spent on Linux kernels over the past several decades -- and updated my install to the new version for free.
Aren't we just trying to do too much and releasing defective software? Why is it accepted?
1) there’s no implied warranty of merchantability with the hobbyist system
2) the “business model” (such as it is) of open source doesn’t push distros to hide security updates behind a paywall
3) Linux is generally getting better, so I want to update anyway
A well-configured firewall between your computer and the internet, uBlock Origin in the browser, and not downloading untrusted files off the internet can go a long way. It won't stop everything, but it at least shields you from the worst.
I think the bigger issue is like on iPhones and Androids. Your software and apps stop supporting your OS long before the hardware or OS fails you.
From what I understand, even Windows 11 still has support for SMBv1.
But my point was that your standard “up to date” XP install in 2016 was highly vulnerable and could effectively be nuked by such an attack. It took nearly 7 years after support ended for that to happen. So you could theoretically get another 7 years out of Windows 10 before a similar global cyberattack hits you with no way to protect yourself, because your OS doesn’t support a configuration that would keep you from being a victim.
Btw, I do have a spare PC; it only got Win10 because the GPU didn't support 7, and it's not getting 11 even though it supports it. It's Microsoft's job to keep that secure.
It is definitely possible to heavily lock down a Windows computer to prevent 99% of attacks, and if you don’t need WAN access especially, that becomes significantly easier.
It is far more likely browsers will drop support for 10 in a few years and that will be what stops the average user from being able to continue to use their Windows 10 computer.
If you are not driven by curiosity, most of the time the driver is money, a view of software as just an occupation, work-life balance, etc.
Which usually means the kind of people who aren't excited by software, don't have a passion for it, and even take the passion away from others.
Then again plenty of modern browsers have some type of profile syncing built in, which does all this for you.
> email inboxes
Please don't use POP3. Your inbox should live on a remote server and simply follow your account. Storing your inbox exclusively on your PC will make you very sad some day.
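A minimal sketch of the difference, using Python's standard imaplib: an IMAP client reads messages in place and they stay on the server, whereas POP3 clients are typically set up to download and then delete. The host and credentials below are placeholders:

    import imaplib

    HOST, USER, PASSWORD = "imap.example.com", "you@example.com", "app-password"

    # Read-only IMAP session: count the inbox without touching anything.
    with imaplib.IMAP4_SSL(HOST) as imap:
        imap.login(USER, PASSWORD)
        imap.select("INBOX", readonly=True)   # nothing gets flagged or removed
        status, data = imap.search(None, "ALL")
        print(f"{len(data[0].split())} messages, all still on the server")

Lose the laptop and the mail is still there; with a POP3-only setup it very likely isn't.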
Most cheaper/free email providers have a storage limit.
Besides, I disagree conceptually. If I want to reduce the risk of my email being read or handed to someone I don't trust, then removing it from the server is a good idea. I can make my own backups.
On the desktop side, the GNOME online accounts feature is pretty good at getting you most of the way there.
Then everything works... until you try to adjust the display brightness.
This was on pre-2020 Lenovo laptops.
At a fraction of the time spent following this guide, you can extend Win 10 by a few more years by switching to LTSC, or go to Win 11 by bypassing all the software restrictions.
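For the Win 11 route specifically, the usual trick is the MoSetup registry value (Microsoft has documented it for in-place upgrades on unsupported CPUs; a TPM 1.2 module is still required). A minimal sketch using Python's winreg, to be run as Administrator on the Windows 10 box, entirely at your own risk:

    import winreg

    # Registry value for upgrading on an officially unsupported CPU/TPM combo.
    KEY_PATH = r"SYSTEM\Setup\MoSetup"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "AllowUpgradesWithUnsupportedTPMOrCPU",
                          0, winreg.REG_DWORD, 1)
    print(r"Set SYSTEM\Setup\MoSetup\AllowUpgradesWithUnsupportedTPMOrCPU = 1")

Setup will then let you upgrade, after warning you that unsupported devices may not be entitled to updates.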
It’s still a great device, it just sucks I’m stuck with windows (10).
This varies a lot on hardware, drivers, and other configuration choices. I've had plenty of machines that went to sleep quickly on a lid close and would actually stay asleep. I've had a desktop which the default configuration for the network adapter would have it wake the machine any time it saw any traffic come across the interface.
With my personal and work laptops, I can close the lid and know they're asleep; I never get a hot bag with them. My Legion Go, though, will almost certainly wake itself up if I put it to sleep and put it in its case. Almost certainly a hardware event causing it to wake.
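If you want to chase that down, here's a rough sketch (Windows-side, assuming the device is still on Windows) that just wraps two built-in powercfg queries; some of them want an elevated prompt:

    import subprocess

    def run(cmd):
        """Run a command and print whatever it reports."""
        print(f"> {cmd}")
        print(subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout)

    # What caused the most recent wake from sleep?
    run("powercfg /lastwake")
    # Which devices are currently allowed to wake the machine?
    run("powercfg /devicequery wake_armed")

From there you can disarm the offending device in Device Manager or with powercfg /devicedisablewake.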
And that's it, they are lost and tired at that point. They will just go back to Windows.
In a way I kind of wish this was how more Windows support was handled, just because PowerShell is so uhh... powerful.
It might be that Linux is less capable for your use case, but people seem to be generally content with ChromeOS and I think that the standard Fedora desktop install is more capable than that so I think the market exists.
Granted we use remote desktop but still.
Love your blog btw.
Yes, it is often possible to upgrade your PC hardware to make it compatible with Windows 11, but the feasibility and cost depend heavily on which specific requirements your current PC fails to meet.
Windows 11 has stricter hardware requirements than Windows 10, primarily focusing on security and modern capabilities. The key hurdles for older PCs are usually:
CPU (Processor) Compatibility:
- Requirement: 1 GHz or faster with 2 or more cores on a compatible 64-bit processor. Microsoft maintains a list of approved CPUs; generally this means Intel 8th Gen (Coffee Lake) or newer, or AMD Ryzen 2000 series or newer.
- Upgradability: This is often the trickiest and most expensive upgrade. If your CPU isn't on the list, you would likely need to replace the motherboard AND CPU (and possibly RAM, as newer motherboards often require different RAM types). This is essentially building a new core system and might not be cost-effective for an older PC.

TPM (Trusted Platform Module) 2.0:
- Requirement: TPM version 2.0, a hardware security module that stores cryptographic keys.
- Upgradability: Enable it in BIOS/UEFI: many PCs manufactured in the last 5-7 years actually have TPM 2.0 (or fTPM/PTT, firmware-based TPM) but it may be disabled in the BIOS/UEFI settings; this is the easiest fix. Add a TPM module: some older motherboards (typically around the Intel 6th/7th gen or similar AMD era) have a TPM header where you can purchase and install a physical TPM 2.0 module, a relatively inexpensive upgrade if your board supports it. Motherboard replacement: if your motherboard has no integrated fTPM/PTT and lacks a TPM header, you would need to replace the motherboard (which usually means a new CPU and RAM too).

UEFI Firmware with Secure Boot Capability:
- Requirement: the system firmware must be UEFI (Unified Extensible Firmware Interface, the modern BIOS replacement) and Secure Boot capable.
- Upgradability: Switch to UEFI mode: as with TPM, many modern PCs are UEFI-capable but may be running in "Legacy BIOS"/"CSM" (Compatibility Support Module) mode; you can often switch to UEFI mode in the firmware settings. Enable Secure Boot: once in UEFI mode, you can usually enable Secure Boot from the same settings. Motherboard limitation: very old PCs might only support Legacy BIOS and not UEFI at all, in which case a motherboard replacement would be necessary.

RAM (Memory):
- Requirement: 4 GB or greater.
- Upgradability: usually the easiest and cheapest upgrade; most desktops and many laptops allow you to add more RAM.

Storage:
- Requirement: 64 GB or larger storage device.
- Upgradability: easily upgradable; you can replace a smaller HDD/SSD with a larger one.

Graphics Card:
- Requirement: compatible with DirectX 12 or later with a WDDM 2.0 driver.
- Upgradability: most integrated and dedicated graphics cards from the last several years meet this. If yours doesn't, you could install a new graphics card (on a desktop) or be out of luck (on a laptop).

How to check your PC's compatibility: the best way to determine what specifically is holding your PC back is to use Microsoft's PC Health Check app. It will tell you exactly which requirements your system meets and which it doesn't.

Summary of upgrade possibilities:
- Most common and easiest: enabling TPM 2.0 in BIOS/UEFI, enabling Secure Boot in BIOS/UEFI (after switching to UEFI mode if needed), adding more RAM (if less than 4 GB), upgrading the storage drive.
- More involved and potentially costly: adding a physical TPM 2.0 module (if your motherboard has the header), upgrading the CPU (often requires a new motherboard and RAM too), replacing the motherboard (almost always requires a new CPU and RAM), upgrading the graphics card (desktops only).

Is it worth it? For older PCs that require a new CPU and motherboard, it often makes more sense financially to purchase a new PC that comes with Windows 11 pre-installed or is fully compatible out of the box. The cost of individual component upgrades can quickly add up, and you'll end up with a system that's still fundamentally older than a brand-new one.
However, if you only need to enable TPM/Secure Boot in BIOS or add RAM, it's definitely a viable and cheap way to get on Windows 11.
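If you'd rather script the same checks, here's a rough self-check; PC Health Check remains the authoritative tool, and this sketch assumes Python on the Windows 10 machine in question. tpmtool and Confirm-SecureBootUEFI are built into Windows; the rest is standard library:

    import ctypes
    import shutil
    import subprocess

    def ram_gb():
        """Total physical memory via GlobalMemoryStatusEx (kernel32)."""
        class MEMORYSTATUSEX(ctypes.Structure):
            _fields_ = [("dwLength", ctypes.c_ulong),
                        ("dwMemoryLoad", ctypes.c_ulong),
                        ("ullTotalPhys", ctypes.c_ulonglong),
                        ("ullAvailPhys", ctypes.c_ulonglong),
                        ("ullTotalPageFile", ctypes.c_ulonglong),
                        ("ullAvailPageFile", ctypes.c_ulonglong),
                        ("ullTotalVirtual", ctypes.c_ulonglong),
                        ("ullAvailVirtual", ctypes.c_ulonglong),
                        ("ullAvailExtendedVirtual", ctypes.c_ulonglong)]
        stat = MEMORYSTATUSEX()
        stat.dwLength = ctypes.sizeof(stat)
        ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))
        return stat.ullTotalPhys / 2**30

    def shell(cmd):
        """Run a built-in Windows command and return its output."""
        return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout

    if __name__ == "__main__":
        print(f"RAM: {ram_gb():.1f} GB (need 4 or more)")
        disk = shutil.disk_usage("C:\\").total / 2**30
        print(f"System drive: {disk:.0f} GB (need 64 or more)")
        # TPM presence and version; tpmtool ships with Windows 10/11.
        print(shell("tpmtool getdeviceinformation"))
        # Secure Boot state; needs an elevated prompt and UEFI mode.
        print(shell("powershell -Command Confirm-SecureBootUEFI"))

CPU-list membership is the one thing this can't tell you; for that, the PC Health Check app or Microsoft's published CPU lists are the way to go.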
And if enough people move to Linux even those holdouts will eventually have to support it. The Steam deck has been the gateway drug to Linux for the masses, and I’m stoked for it. Moving to Linux for my desktop gaming machine was the single best decision I made 5 years ago, and I haven’t used Windows since. It’s more stable than Windows ever was, and I also don’t have an errant update break a game, the system, or cause a reboot at the worst possible time.
I play Battlefield 2042, Call of Duty Warzone, Apex Legends, PUBG, Rainbow 6 Siege, and Fortnite, and none of these seem to work on Linux.
The only games I regularly play that work on Linux are DotA 2 and CS2, but I would also prefer using faceit for CS2 as there are way too many cheaters without it, and faceit does not work on Linux.
My point is, more or less: if someone mostly plays solo games, or games that will never have professional players (because there won’t be enough money to pay someone to be a professional), Linux slaps. Use Linux. It’s free. It’s great. It’s stable and it works.
Is it for literally everyone? No. But it is probably for a lot more people than realize it.
Additionally, if the market becomes big enough, those games that don’t work will be forced to. Money talks.
So what?
It is a fact that having an old PC AND using Linux limits the options further, compared to using an old box running Windows with extended support.
I think one of the main issues with the Linux world is that people pretend it is easy to get on the bandwagon. "There is xyz distro that makes it really simple and it is 90% the same." Well, that's a lot of bull. It results in people believing you and then silently running away, never to come back.
I have worked with Linux and Mac for about 15 years, and I will definitely not give up my gaming PC with Windows 10, because I would lose access to some of the games I play, plus there is a performance hit that would bug me to no end (and would end up putting me in an optimization loop where I spend more time tuning the PC than actually playing).
I've found Ubuntu's default desktop and "vanilla" GNOME Shell to both be pretty cohesive and "modern".
And at the same time, I've never really felt like Windows or Mac actually end up with a more cohesive UI than the various Linux desktop environments. For every Qt/GTK theming mismatch, I find a Windows mismatch between apps, because Windows is 12+ generations of design languages and toolkits built on top of each other (e.g. the 3+ distinct "current" Windows control panel looks: 11, then 10, then 7, then XP as you keep digging into more and more obscure settings). And apps typically "freeze" at whatever UI design they were born with, e.g. XP apps still look like XP, and so on.
And on Mac, you have the (relatively!) small number of apps actually artfully designed for macOS. And then you have all the other ones: Electron, Java-based, cross-platform Qt apps (which naturally look like Qt apps... just like on KDE/GNOME).
There are of course various quibbles over font rendering that have existed since time immemorial. I don't think any one platform really wins hands-down here, though it's my understanding that the Mac typically does the best (as long as none of the non-Mac-native apps manage to mess it up).
I really think people just have double standards at this point, where their "home" platform's flaws are minor, and candidates to replace it must be flawless. (I'll also admit I'm the same, though NATURALLY I think I'm right: I figure if everything is Electron and mismatched anyway, I might as well have a free-as-in-freedom operating system under it. Nobody is putting ads in my Start menu or advertising Xbox Game Pass to me in my notifications.)
There are a huge number of examples here: https://www.reddit.com/r/unixporn/
I used to use Openbox and compile my own FreeType with patches, but these days I want to spend my time on other things, so I'm just using macOS, which has the best out-of-the-box experience and the shortest TODO list when setting up a new computer.
It's hard for me to imagine anything uglier than the above, but beauty is in the eye of the beholder as they say.