Blue95: a desktop for your childhood home's computer room
555 points | 2 days ago | 46 comments | github.com
metadat
2 days ago
[-]
This looks nice and easy to use.

My hypothesis is that today's "modern" OS user interfaces are objectively worse from a usability perspective, obfuscating key functionality behind layers of confusing menus.

It reminds me of these "OS popularity since the 70s" time lapse views:

https://youtube.com/watch?v=cTKhqtll5cQ

The dominance of Windows is crazy; even today, Mac desktops and laptops are comparatively niche.

reply
hi_hi
1 day ago
[-]
As a kid, the OSes supported me in learning. They were simple, intuitive, and rewarding. I'd click around and explore, and discover cool things like a Weezer music video, or engaging puzzle games.

There was no one who could help me when I got stuck, beyond maybe an instruction manual. I just had to figure it out, mostly by trial and error. I learned so much, eventually being able to replace hardware, install and upgrade drivers, re-install the entire OS and partition the hard drive, figure out networking and filesystems. It built confidence.

Now my kid sits in front of an OS (Windows, Mac, it doesn't really matter) and there's so much noise. Things popping up, demanding attention. Scary-looking warnings. So much choice. There are so many ways to do simple things. Actions buried deep within menus. They have no hope of building up a mental model or understanding how the OS connects them to the foundations of computing.

Even I'm mostly lost now if there's a problem. I need to search the internet, find a useful source, and filter out the things that are similar to my problem but not the same. It isn't rewarding any more; it's frustrating. How is a young child meant to navigate that by themselves?

This looks like a step in the right direction. I look forward to testing it out.

reply
accrual
1 day ago
[-]
> Things popping up

This is one of my biggest frustrations with modern GUI computing. It's especially bad with Windows and Office, but it happens to an extent on iOS and macOS too. Even though I've had Office installed for weeks, I still get a "look over here at this new button!" pop-up while I'm in the middle of some Excel task. Pop-up here, pop-up there. The number of little bubbles and pop-ups and noise we experience in modern computing is insane.

reply
askvictor
1 day ago
[-]
Even on GNOME, I regularly have applications stealing focus when they decide they're the most important thing. As well as being really annoying, it's a security risk if an application steals focus while you're typing your password or OTP key.
reply
accrual
1 day ago
[-]
> if an application steals focus while you're typing your password

Definitely. Bitwarden does this, ironically. It will pop up "UPDATE AVAILABLE!" halfway through typing a passphrase. Why not suppress the pop-up if the user is typing, or make it non-modal? Every few days I am interrupted just trying to unlock a vault.

reply
boudin
1 day ago
[-]
Did you install an extension for that? By default GNOME prevents this and shows a notification instead. It's extensions like Steal My Focus that allow focus stealing.
reply
askvictor
9 hours ago
[-]
I'm using PopOS, so there's a chance that it's related to that, but no, I haven't. And I've tried the gsettings tweak that supposedly helps prevent it. I've only ever seen Zoom and VSCode do this; I get the feeling that they don't quite follow the same conventions that 'native' Linux applications use, or use some trickery (as they see themselves as the most important thing on your computer).
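For anyone else chasing this, the setting usually suggested is the window manager's new-window focus preference; the exact behavior can vary by GNOME/Mutter version and distro, and some apps bypass it anyway:

```shell
# Tell the window manager not to give focus to newly opened windows;
# with "strict", new windows demand attention in the task list instead
# of stealing focus. (Default is "smart"; behavior varies by version.)
gsettings set org.gnome.desktop.wm.preferences focus-new-windows strict

# Inspect the current value:
gsettings get org.gnome.desktop.wm.preferences focus-new-windows
```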
reply
overgard
1 day ago
[-]
Apple has kind of made things worse in recent macOS versions, where my phone's notifications show up on the desktop now. Like, man, I was already drowning in them before; I don't want them on two screens now.
reply
LoganDark
1 day ago
[-]
You probably already know this if it was bothering you, but just in case: System Settings -> Notifications -> toggle off "Allow notifications from iPhone".
reply
ascagnel_
1 day ago
[-]
I would also argue that, regardless of what mobile OS you're on, quieting, delaying, or disabling notifications on a regular basis (and taking stock of what you let through) is prudent.
reply
LoganDark
1 day ago
[-]
I'm always shocked whenever I see someone having hundreds or even thousands of unread notifications. It pains me to see that instead of controlling their feed to only what interests them, they've just let everything pile up forever, completely unread.

For example, on Discord, I sometimes see people with unreads for every server, and dozens to hundreds of completely unread DMs, just because they don't know that you can turn notifications off for the stuff you don't care about. Instead of doing that they just learned to ignore everything, leading to a disorganized mess.

I'm somewhat familiar with what typically leads to this (usually something like ADHD), but when you let it go for so many years it's such a big task to fix it that the fixing never happens and you're just kind of screwed for eternity.

In Discord, I have no unread servers and no unread DMs, despite being at the server limit. This is because all my servers are completely silenced and all my DMs are read immediately. My only unread email is one I marked as such because I still plan to reply to it soon. I have the attention for every single notification because I aggressively optimize the notifications I receive to the point where they all are typically things I care about. Back in the day I would instantly report every email to SpamCop, typically in under one minute, but I eventually stopped doing that because there's no point.

I simultaneously do and don't understand people who just submit to a flood of irrelevant garbage. Control it!!

reply
oarsinsync
1 day ago
[-]
> Instead of doing that they just learned to ignore everything

Leading to notifications being ignored, and not mattering at all.

A well curated set of notifications that only gives you the things you actually need is superb, but incredibly difficult to get right.

Learning how to ignore all of the noise is probably a more valuable skill for a future where control is slowly wrestled away from the user. A certain Black Mirror episode (Fifteen Million Merits) comes to mind.

reply
LoganDark
1 day ago
[-]
> Leading to notifications being ignored, and not mattering at all.

You found my point! Some notifications may be important, but if one has learned to ignore all notifications, then they won't catch the important ones anymore.

When I get important notifications, I can act on them immediately, because I have not learned to ignore all my notifications, because I don't need to. I have taken care to block any notifications I don't care about, leaving only the ones that I do.

> A well curated set of notifications that only gives you the things you actually need is superb, but incredibly difficult to get right.

Ehh... maybe it's incredibly difficult to fix once you are already drowning in them, but since I immediately nuke anything I don't like from orbit, and have done so my whole life, there are a lot fewer things I don't like than there would be if I hadn't.

> Learning how to ignore all of the noise is probably a more valuable skill for a future where control is slowly wrestled away from the user. A certain Black Mirror episode (Fifteen Million Merits) comes to mind.

I have a really hard time ignoring noise. I think that is because of my autism. Ignoring noise would be a nice skill, I guess, but I feel significantly better when the noise is simply not there in the first place. I'd imagine most would, but for some reason I seem to break a lot more easily when uncomfortable.

reply
oarsinsync
1 day ago
[-]
> > Learning how to ignore all of the noise is probably a more valuable skill for a future where control is slowly wrestled away from the user. A certain Black Mirror episode (Fifteen Million Merits) comes to mind.

> I have a really hard time ignoring noise. I think that is because of my autism. Ignoring noise would be a nice skill, I guess, but I feel significantly better when the noise is simply not there in the first place. I'd imagine most would, but for some reason I seem to break a lot more easily when uncomfortable.

I share this discomfort, but only in physical space. I struggle with visual and audible noise in the real world, but my screen I’ve become very adept at ignoring. Giving up on inbox-zero in my personal life and ending up with 2000+ unread emails in my personal mailbox is where that started.

reply
LoganDark
1 day ago
[-]
> my screen I’ve become very adept at ignoring.

I unfortunately don't share this ease. As an example, I have to use custom CSS to completely remove blocked messages from Discord, because once I know a blocked message exists I cannot seem to ignore it. The only way to properly protect myself from blocked users is to ensure that I can never even know they're there.

I do a similar thing for muted DMs. Once I know that someone has sent a message, I cannot seem to avoid checking the message. The only way for me to properly take a break is to ensure that there's no way for me to accidentally discover the existence of new messages. So I have some JavaScript that completely removes muted DMs from the list so that I won't even see them jumping back up to the top every new message.

I don't know what causes this to happen or how to fix it, but I do know that ignoring things has always been nearly impossible for me. Whenever there is anything I need to protect myself from, I need to also protect myself from ever noticing any related activity.

Maybe this is some sort of OCD, I'm not sure. Pretty sure it would meet the criteria, I guess.

reply
nkrisc
1 day ago
[-]
That sounds like a lot of work. Alternatively, I can just ignore them all. I don’t care if there’s a red circle with a number in it. I don’t notice it and it doesn’t bother me.
reply
LoganDark
1 day ago
[-]
> That sounds like a lot of work.

Again, it is a lot of work if you have to do it at once. If you've done it naturally over the course of your life, you would never have had a big pile of things to take care of at once.

> Alternatively, I can just ignore them all. I don’t care if there’s a red circle with a number in it. I don’t notice it and it doesn’t bother me.

Suit yourself. Personally, I care about responding promptly to certain things, like instant messages or emails, but I don't appreciate unwelcome distractions.

See: https://vxtwitter.com/AutisticCallum_/status/190533113360614... & https://vxtwitter.com/AutisticCallum_/status/190533113707064...

reply
nkrisc
1 day ago
[-]
> Personally, I care about responding promptly to certain things, like instant messages or emails, but I don't appreciate unwelcome distractions.

Sure, if I'm being paid to respond right away then I will. The "instant" in "instant message" describes the delivery time, not the required response time. Email and IM are things I treat as asynchronous communications. Anything urgent should be a phone call, ideally. If there's a particular email address I consider important I'll make a separate inbox for it.

Otherwise, I find it sort of fun to see how big the number can get.

On my work email I just mark everything as read at the end of the day so the number starts over each day.

reply
mrehler
18 hours ago
[-]
Discord is really awful about this, and that's one of the many reasons I dislike it. Its defaults are bad. I should be able to set, at the account level, a default that servers don't send me notifications. I believe by default I only get @everyone and @role notifications, and turning those off is only a few taps per server, but every time I join a new server I have to remember to do it. If I didn't actively care about notifications from one particular server, I'd just block Discord notifications entirely and stop managing them in their app at all.
reply
overgard
1 day ago
[-]
It's not really a matter of not knowing it can be done, it's that the mental effort of curating it is not really worth it (short term, anyway), because then I have to decide what's a worthwhile notification and what isn't.
reply
m463
19 hours ago
[-]
Do you turn off AMBER alerts? Emergency alerts?
reply
m463
19 hours ago
[-]
I'm reminded of the aircraft crash investigations where the pilots would have a frequent warning, learn to ignore it, then miss the important one.

Unnecessary notifications hurt us all. I wonder how many people have turned off the emergency/AMBER alerts because of non-critical use of the system.

reply
int_19h
1 day ago
[-]
Microsoft has a similar thing on Windows with Phone Link.
reply
mrob
1 day ago
[-]
The worst example of things popping up I've seen is YouTube's sponsored content warnings. These immediately appear above the active click target (the video thumbnail) when you move your mouse over it, hijacking the expected action there. If you click things as a single action like every skilled mouse user does (rather than pointing and then clicking as two separate actions), it's physically impossible to react in time to avoid clicking them. And because only a part of the click target gets hijacked, it's too inconsistent to learn to avoid by intuition. I've clicked them accidentally several times, and every time it's been disturbing because it feels like some serious and unexpected error.
reply
fragmede
1 day ago
[-]
Omfg, I do not need national political news shoved into my face on the left side of my taskbar while I'm trying to focus on work. Thank you Microsoft, k thx bye!
reply
netsharc
1 day ago
[-]
Google Android phones do so freaking much of this too. Open the Google app (or swipe left from the home screen on the Pixel Launcher): gone are the days when it was a copy of their original homepage, just a search bar; now it has news... Go to the search bar, and it shows trending searches. Die in a tucking fire!
reply
red_trumpet
1 day ago
[-]
You can definitely disable the "swipe left" thing on the launcher.
reply
oefnak
1 day ago
[-]
Yea, but you can't make it a home screen like when you swipe right.
reply
imgabe
1 day ago
[-]
You can turn that off, it’s the first thing I do on Windows. I agree you shouldn’t have to though.
reply
booleandilemma
1 day ago
[-]
Yeah I just wanted to check the weather...
reply
Andrex
1 day ago
[-]
Humans hate being bored, but only dick around and learn things like this when they're bored. Speaking personally, I guess.

Since the 90s we've found "better" ways of "curing" our boredom. Put this UI on a modern OS in front of a kid today and they would just download Steam, Chrome, and Discord. And be assured, they're very proficient at the ins and outs of those platforms.

Just some random thoughts I had, not sure any of it tracks...

reply
jon_richards
1 day ago
[-]
I used to think I watched TV or scrolled Reddit because I didn’t have the energy to pursue more interesting things. I blocked Reddit and TV. Turns out I have plenty of energy, those were just stealing it from me.
reply
_carbyau_
1 day ago
[-]
I think part of it is the ubiquitous internet access.

Take that away so you have a standalone computer that you "run programs on" and it becomes simpler.

reply
whatevertrevor
1 day ago
[-]
Yeah I agree. The trial and error mentioned in GP needed a good amount of focused time commitment, the sort of thing at a premium in the modern attention economy of tech.
reply
girvo
1 day ago
[-]
> I learned so much, eventually being able to replace hardware

As a young teenager in the early-to-mid 2000s, I learned the hard way what the little standoffs are for: I killed a motherboard by screwing it directly into the steel case :')

Never made that mistake again, that's for sure. And I share all the same experiences as you.

reply
xandrius
1 day ago
[-]
There should be a club for people like us: we learnt the hard way to double-check and never ever fully trust ourselves, especially with hardware connections.

The first mobo I ever purchased with my own money was insta-fried exactly like that; it still hurts a little to think about it.

reply
interludead
1 day ago
[-]
I did the exact same thing - mounted a shiny new board straight onto the case, powered it on, and… nothing. Spent hours troubleshooting before realizing I'd basically shorted the whole thing
reply
brulard
1 day ago
[-]
I share your nostalgia. I did learn a lot by exploring Amiga OS and later Windows 98 without anyone to help or an internet connection. It was fun, but we had time to spend back then. Today time feels much more scarce, and I no longer appreciate having to learn by trial and error. Now it feels like you have to keep up with tech progress, which is crazy fast. And youth has too many more appealing things to do, like social media, YouTube, loads of games, etc. For them, exploring an OS or some software is not attractive anymore.
reply
underlipton
1 day ago
[-]
There are also performance issues. Building muscle memory (which means offloading tasks from working memory, leaving it open for learning) can't happen if you're constantly trying to figure out when the system is going to actually respond to your input.
reply
titzer
1 day ago
[-]
We largely abandoned an unbelievably efficient form of human input in favor of big fat dumb slow touchscreens. Can you imagine where we'd be as a species if we'd gotten our shit together 25 years ago and standardized a few of the most important keyboard shortcuts and layouts, and that was the default everywhere? I won't advocate terminal-only, but even classical GUIs with windows and icons and all could have been much more efficient if keyboard input and navigation had been given priority, instead of the comedy of using a pixel-precise indirect pointer to fart around a virtual screen selecting some button when... there are buttons under my fingers.
reply
int_19h
1 day ago
[-]
We did standardize a lot of keyboard shortcuts and layouts. And it was fairly widespread, even, with both Windows and macOS retaining some bits of it (sadly, fewer and fewer with each new update).

https://en.wikipedia.org/wiki/IBM_Common_User_Access

reply
crims0n
1 day ago
[-]
This is so true. The most frustrating thing in the world to me is waiting for the UI to catch up to my actions… that should just never happen in 2025. Not only is it frustrating to wait, but as you eloquently stated, it forces the menial task to enter working memory.
reply
LoganDark
1 day ago
[-]
Win32 would buffer keystrokes so that a sequence of commands wouldn't be lost even if the UI took a while to respond (e.g. if a dialog was slow to open), but that has mostly been lost in the era of web apps and other similar bullshit.
reply
int_19h
1 day ago
[-]
Modern web (and web-like) apps all too often don't even bother supporting the keyboard for anything other than text-field input. What I find especially infuriating are the dialog boxes where the only things you have are a textbox and a button, and yet you cannot press Enter to submit the dialog once you finish typing; nope, you have to reach for the mouse and click that button.
reply
LoganDark
1 day ago
[-]
This is why I commonly have a trackpad in addition to a mouse when I use Windows. Literally, mouse on right, trackpad on left. It's much faster to use the trackpad for annoyances like these. I also find that trackpads are ten times better for scrolling.
reply
hulitu
1 day ago
[-]
> Actions buried deep within menus.

Maybe they are taught in CS school that every layer of abstraction is better. I don't see any other explanation for this level of stupidity.

reply
mananaysiempre
2 days ago
[-]
> This looks nice

These kinds of things almost always give me an uncanny-valley feeling. Here I'm looking at the screenshot and can’t help noticing that the taskbar buttons are too close to the taskbar’s edge, the window titles are too narrow, the folders are too yellow, and so on and so forth. (To its credit, Wine is the one exception that is not susceptible to this, even when configured to use a higher DPI value so the proportions aren’t actually the ones I’m used to.) I’m not so much criticizing the theme’s authors as wondering why this is so universal across the many replicas.

reply
mouse_
2 days ago
[-]
Computing is largely a cargo cult thing these days.

The problem is that the interfaces these bootleg skins draw "inspiration" from were designed on the back of millions of pre-inflationary dollars of R&D from only the best at Golden-Age IBM, Microsoft, Apple, etc. BeOS, OS/2, and Windows 95-2000 do not look the way they do because it looks good; they look the way they do because it works good, and countless man-hours went into ensuring that. Simply designing an interface that looks similar is not going to bring back the engineering prowess of those Old Masters.

reply
mananaysiempre
2 days ago
[-]
I’m less inclined to attribute it to “these days”, as I remember the contemporary copycat themes in e.g. KDE and Tk looking off as well. Even Swing with the native look-and-feel didn’t quite look or feel right, IIRC.

As a (weak) counterpoint to supplicating ourselves to the old UI masters, I submit Raymond Chen’s observations from 2004[1] that the flat/3D/flat cycle is largely fashion, e.g. how the toolbars in Office 97 (and subsequent “coolbars”) had buttons that did not look like buttons until you hovered over them, in defiance of the Windows 95 UI standard. (Despite Chen’s characteristic confident tone, he doesn’t at all acknowledge the influence of the limited palettes of baseline graphics adapters on the pre-Win95 “flat” origins of that cycle.)

Also worth noting are the scathing critiques of some Windows 95 designs[2,3] in the Interface Hall of Shame (2000). I don’t necessarily agree with all of them (having spent the earlier part of my childhood with Norton Commander, the separate folder/file selectors in Windows 3.x felt contrived to me even at the time) but it helps clear up some of the fog of “it has always been this way” and remember some things that fit badly at first and never felt quite right (e.g. the faux clipboard in file management). And yes, it didn’t fail to mention the Office 97 UI, either[4,5]. (Did you realize Access, VB, Word, and IE used something like three or four different forks of the same UI toolkit, “Forms3”, among them—a toolkit that looked mostly native but was in fact unavailable outside of Microsoft?..)

None of that is meant to disagree with the point that submitting to the idea of UI as branding is where it all went wrong. (I’ll never get tired of mentioning that the futuristic UI of the in-game computers of the original Deus Ex, from 2000, supported not only Tab to go between controls and Enter and Esc to submit and dismiss, but also Alt accelerators, complete with underlined letters in the labels.)

[1] https://devblogs.microsoft.com/oldnewthing/20040728-00/?p=38...

[2] http://hallofshame.gp.co.at/file95.htm

[3] http://hallofshame.gp.co.at/explore.htm

[4] http://hallofshame.gp.co.at/visual.html#VISUAL36

[5] http://hallofshame.gp.co.at/visual.html#VISUAL38

reply
Uvix
1 day ago
[-]
> Despite Chen’s characteristic confident tone, he doesn’t at all acknowledge the influence of the limited palettes of baseline graphics adapters on the pre-Win95 “flat” origins of that cycle.

It's right in the second sentence: "...Windows 1.0, which looked very flat because... color depth was practically non-existent."

reply
int_19h
1 day ago
[-]
> I’ll never get tired of mentioning that the futuristic UI of the in-game computers of the original Deus Ex, from 2000, supported not only Tab to go between controls and Enter and Esc to submit and dismiss, but also Alt accelerators, complete with underlined letters in the labels

I think that's because they used the stock UI toolkit of the original Unreal Engine, which also had all these things. If you recall, UT'99 actually had a UI more like a desktop app at the time, complete with a menu bar and tabbed dialogs:

http://hw-museum.cz/data/article/VGA-Benchmarks/Benchmark-VG...

reply
charcircuit
1 day ago
[-]
>they look the way they do because it works good

In modern times, telemetry can show how well new designs work. The industry never forgot how to measure and do user research for UI changes. We've only gotten better at it.

reply
everdrive
1 day ago
[-]
I've had an alternate theory for a while. Prior to verbose metrics, UIs could only be designed by experts and via small samples of feedback sessions. And UIs used to be much, much better. I suspect two things have happened:

- With a full set of metrics, we're now designing toward the bottom half of the bell curve, i.e., toward the users who struggle the most. Rather than building UIs which are very good but must be learned, we're now building UIs which must suit the weakest users. This might seem like a good thing, but it's really not. It's a race to the bottom, and it robs those novice users of ever having the chance to become experts.

- Worse, because UIs must always serve the interests of the bottom of the bell curve, this is actually why we have constant UI churn. What's worse than a bad UI? 1,000 bad UIs which each change every 1-6 months. No one can really learn the UIs if they're always churning, and the metrics and the novice users falsely encourage teams to constantly churn their UIs.

I strongly believe that you'd see better UIs either with far fewer metrics, or with products that have smaller, expert-level user bases.

reply
cosmic_cheese
1 day ago
[-]
I don’t believe either is the primary driver of modern UI design. Cynical as it may be, I think the only things that get any level of thought are:

1. Which design is most effective at steering the most users to the most lucrative actions

2. What looks good in screenshots, presentations, and marketing

The rest is tertiary or an afterthought at best. Lots of modern UI is actually pretty awful for those mentioned bottom of the bell curve users and not much better for anybody else in terms of being easy to use or serving the user’s needs.

Proper use of analytics might be of assistance here, but those are also primarily used to figure out the most profitable usage patterns, not what makes a program more pleasant or easier to use. They're also often twisted or misused to justify whatever course of action the PM in question wants to take, which is often to degrade the user experience in some way.

reply
cyberax
1 day ago
[-]
There's a much simpler explanation. At some point, the UI becomes about as good as it can be. It can't really be improved any further without changing the whole paradigm, and just needs to be maintained.

But product managers inside the large corporations can't get promoted for merely maintaining the status quo. So they push for "reimagining" projects, like Google's "Material Screw You" UI.

And we get a constant treadmill of UI updates that don't really make anything better.

reply
hakfoo
1 day ago
[-]
Just because they're measuring doesn't mean they're measuring the same things as before.

The goal in 1995 might be "The user can launch the text editor, add three lines to a file, and save it from a fresh booted desktop within 2 minutes".

The goal in 2015 might be "we can get them from a bare desktop to signing up for a value-add service within 2 minutes"

I'd actually be interested if there's a lot of "regression testing" for usability-- if they re-run old tests on new user cohorts or if they assume "we solved XYZ UI problem in 1999" and don't revisit it in spite of changes around the problem.

reply
wlesieutre
1 day ago
[-]
With so many things being ad-funded, I always wonder if what they're optimizing for is "it took the user 50% longer to complete a task"
reply
II2II
1 day ago
[-]
Telemetry may tell you the "what" but, at best, it will only allow you to infer the "why". It may provide insights into how people do things, yet it will say nothing about how they feel about it. Most of all, telemetry will only answer the questions it is designed to answer. The only surprises will be in the answers (sometimes). There is no opportunity to be surprised by how the end user responds.
reply
Narishma
1 day ago
[-]
That assumes that they are using the telemetry to create a better product for the user rather than the developer.
reply
Lammy
1 day ago
[-]
Fuck telemetry. Don't spy on me while telling me it's in my best interest. Don't spy on me at all.
reply
goosedragons
1 day ago
[-]
It can look better. This is basically a distro with Chicago95 out of the box, not especially well configured. If you take the time, it can look more like 95. The Chicago95 screenshots IMO look better:

https://github.com/grassmunk/Chicago95
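If you want to try the theme itself on an existing Xfce install, the setup is roughly the following; the paths assume the repo's layout at the time of writing, so check its README for the authoritative steps:

```shell
# Fetch the theme and copy the GTK/xfwm4 theme and icon set into the
# per-user directories Xfce looks in. (Paths assume the repo layout;
# see the Chicago95 README for current instructions.)
git clone https://github.com/grassmunk/Chicago95.git
mkdir -p ~/.themes ~/.icons
cp -r Chicago95/Theme/Chicago95 ~/.themes/
cp -r Chicago95/Icons/Chicago95 ~/.icons/

# Activate it under Xfce:
xfconf-query -c xsettings -p /Net/ThemeName -s "Chicago95"
xfconf-query -c xsettings -p /Net/IconThemeName -s "Chicago95"
xfconf-query -c xfwm4 -p /general/theme -s "Chicago95"
```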

reply
int_19h
1 day ago
[-]
Fonts make the biggest difference here. Tahoma would also be decent (if not quite right).
reply
Tade0
1 day ago
[-]
To be fair, at least the title bar height was configurable, and I recall at least one original Windows theme taking advantage of that.
reply
zestyping
1 day ago
[-]
Ouch. That screenshot is uncomfortable to look at. The window title bars are painfully narrow, the frame borders have inconsistent thicknesses, the Start menu overlaps the taskbar, the vertical centering of text is wrong.

The answer to your question is that these replicas are of low quality. This one looks like the whole thing was made by someone (or a committee of people) lacking attention to detail.

reply
bowlofhummus
1 day ago
[-]
The text is the worst. The icons are nice and pixely, but the fonts are baby-butt-smooth antialiased.
reply
j45
2 days ago
[-]
Someone who has spent too much time using the original would likely notice these things when replicating it.

Still, it is hopefully a nice introduction for some.

reply
voidfunc
2 days ago
[-]
I got in an argument with an accessibility engineer about this recently...

The whole UI as branding thing has utterly killed usability.

reply
burnte
1 day ago
[-]
> The whole UI as branding thing has utterly killed usability.

This is caused by a change in who is hired as UI/UX developers. In days past it was HCI experts and engineers; now it's graphic designers. "Pretty" is the order of the day, not "useful". "There are too many menu items" is now answered with "So let's hide them", when it used to be "How can we organize them in the UI in a simple, discoverable manner?" But then that "overflow" menu (really? Needed menu commands are now OVERFLOW?) gets crowded, so they start just removing features so the UI stays nice.

reply
girvo
1 day ago
[-]
Having worked with amazing HCI experts over the years, I can say you've hit the nail on the head. It's wild how much design is done for design's sake at my work, with nary a nod to HCI given. The a11y team tries to patch over it as best as possible, but we end up with a mess, and I'm treated like a pariah for pushing back on some of it.
reply
burnte
21 hours ago
[-]
I'm glad to hear my suspicions and impressions are accurate. I have a number of friends who have gone into UI over the past 20 years, and while they're all very smart people, in my opinion exactly one of them has what it takes to work in UI/UX. The rest are mediocre graphic designers with average to no training in graphic design and zero experience in anything technical. Only one knew what Fitts's law was, or what HCI meant, or what GUI meant.
reply
ivan_gammel
1 day ago
[-]
>This is caused by a change in who is hired as UI/UX developers.

„UX/UI developers“ is a strange name for it.

In the 2000s the web enabled more sophisticated presentation designs, and there was a push from client-server to web-based applications using incredibly strange technologies for building UIs — HTML, CSS and JavaScript — which gave rise to UX design as an interdisciplinary job (HCI + digital graphic design). By 2010 the internet of applications had kicked off, and in the mid-2010s it moved to mobile, dramatically increasing the demand for UX designers. By then it actually mattered more who was hiring designers, not who was hired. Since only a relatively small fraction of hiring managers understands the scope of this job even now, they started calling it „UX/UI designers“ or „Product designers“, as if that name change could help, still judging design work by often-fake screenshots on Behance rather than by case studies in the portfolio. Even HCI professionals are often reduced to mere graphic designers by managers who skip research and apply „taste“ to a science-based discipline. At the same time, since UX design is one of the best-paid and least stressful creative jobs, a lot of people switched to it without proper education or experience, having no idea what statistical significance is or how to design a survey. And voilà, we are here.

reply
burnte
20 hours ago
[-]
> „UX/UI developers“ is a strange name for it.

I agree. It's a UI Engineer. User Experience is just the fluffification of the title to something that sounds expansive and nebulous when it's actually pretty focused and critical.

reply
ivan_gammel
7 hours ago
[-]
Then you don’t understand it either. Software solves user problems by offering experience, not UI. Not every solution requires a UI, but every solution creates user experience.
reply
hyperbrainer
2 days ago
[-]
It's interesting especially because companies today seem to pour tens of millions into "accessibility", yet usability in the sense of simple, easy-to-do-what-I-want UX never seems to fall into the same category.
reply
cenamus
2 days ago
[-]
Even just simple UX testing with people that have never seen or used your software seems to be a lost art.
reply
hnthrowaway0315
1 day ago
[-]
Companies are outsourcing testing. I'm not surprised that they get rid of UI testing. Back in the day companies used to invite people to sit down and use their software. Nowadays they just push out whatever they have and then start collecting bug tickets. Then they let the community vote on the tickets. It's basically a huge "pay to be a beta tester" scheme.
reply
ivan_gammel
1 day ago
[-]
UX testing is not UI testing and it is not QA.
reply
girvo
1 day ago
[-]
Quite, but lots of companies jam them all together these days.
reply
hnthrowaway0315
19 hours ago
[-]
You are probably right, just saying in general...
reply
rafram
2 days ago
[-]
One is required by law and/or contract terms, the other is just nice to have.
reply
layer8
1 day ago
[-]
And that is why we can’t have nice things, apparently.
reply
cosmic_cheese
2 days ago
[-]
It’s a completely predictable result if you think about it.

Old style UI was developed with the findings of countless man-hours of UX research performed by field experts, while branded UI is typically whipped together purely based on trends and vibes in an evening by a visual designer who’s probably never performed an ounce of serious research or user trials. It’s natural that the latter is only going to be good at the most superficial of purposes. UI as branding is the McMansion of UX.

reply
bri3d
2 days ago
[-]
I think it’s worse from a time wasting standpoint, really - a lot of modern UX does have thousands of hours of UX research dumped into it, but with faulty metrics driven goal seeking and internal politics bolted on. I agree that Vibe Branding killed UX in the way you describe in the 2000s (remember when every company had some abominable Flash site?!), but now, we’ve come full circle: from the ashes we’ve allowed warring factions of UX researchers to return to create hundreds of carefully constructed disparate systems with no consistency.
reply
cosmic_cheese
1 day ago
[-]
I don’t think we’re quite back to where we were, because branded UI widgets are almost always devoid of functionality compared to their traditional UI toolkit counterparts. If a feature is even slightly “power user”, branded UI widgets probably don’t implement it, even in tools made for technical users.

One of my favorite examples is tree-style lists (“outline views” in AppKit nomenclature). On macOS these have a very convenient functionality where holding down option while expanding/collapsing a section performs that action on all children as well, and it’s practically never implemented in custom-built tree widgets even in cases where the primary audience skews Mac-heavy.

reply
Lorkki
1 day ago
[-]
It's also repeating what the hellscape of inconsistent skinned UIs did in the late 90s and early 2000s. People are looking back at those times with a rather selective memory.
reply
Gormo
1 day ago
[-]
The themed UIs of that era were very superficial -- if they applied to serious software at all, they were just a cosmetic layer on top of an otherwise well-engineered interface, and could be easily disabled. Most people I knew, for example, disabled the theming engine that shipped with Windows XP. Most applications that supported UI skinning still had a default or fallback UI that adhered well enough to modern conventions.

Not so much anymore. The abandonment of any coherent organizing principle to UI layout in favor of pure aesthetics has been a massive regression. Reasonably complex software often doesn't even include menu bars anymore, for example.

reply
xandrius
1 day ago
[-]
People forget having to use IE with 12 toolbars when going over to a friend's house.
reply
pwg
2 days ago
[-]
"It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

https://www.goodreads.com/quotes/21810-it-is-difficult-to-ge...

reply
WarOnPrivacy
1 day ago
[-]
> The whole UI as branding thing has utterly killed usability.

Imagine if Active Desktop had taken over.

I eventually came up with a not-awful use for AD but that was a few years after it went away.

reply
anthk
1 day ago
[-]
It did, in several ways, ever since Windows 98 SE merged Explorer with IE.
reply
leonidasv
1 day ago
[-]
When I used XFCE as my daily driver, I once tried installing Chicago95 just for nostalgia, and it stuck as my daily driver for almost a year! The UI is less distracting than modern UIs, and there's something to it that makes it easier to just know which window is open over which, something that's lacking in modern UIs (I think it's the over-reliance on soft shadows and the borderless windows).

Eventually, I stopped using it because: 1- it was always annoying to send a screenshot to someone and have to explain that no, I wasn't using Windows 95, and why; 2- the grey-ish look of everything started to bother me over time; 3- I wanted a more integrated desktop experience and moved to KDE Plasma. Still, I configured my Plasma to work like old Windows: window titles on the taskbar, zero to no animations, etc.

reply
caseyy
1 day ago
[-]
I also dailied it on XFCE. The UI was very utilitarian and purposeful. I suppose aesthetically it is unimpressive and not streamlined, but it serves the purpose of being a good interface to do a task.

Same as you say, people have asked me a lot about it and even asked me if I could set it up for them. The theme is evangelizing Linux a little bit, and that is interesting. In the right hands, these UI principles could convert many people to some product.

P.S. You can now change the grey-ish look with Win95-style theming support. I've not used it, but here's more info: https://github.com/grassmunk/Chicago95/blob/master/Plus/READ...

reply
keyringlight
1 day ago
[-]
>and there's something to it that makes it easier to just know which window is open over which window that's lacking in modern UIs (I think it's the over-reliance on soft shadows and the borderless windows).

I think this started with Vista, I remember watching a video criticizing the new love of glass effects on UI chrome as it got rid of or minimized the color/shading difference between focused/unfocused windows. The example the video used was 6 notepad windows and pick which one was focused, and the main cue you'd need to look for is that the window with focus had a close button colored red.

Thin borders and minimalist/hiding scrollbars is another one that annoys me, give me something graphical for my gaze to grasp.

reply
fc417fc802
1 day ago
[-]
This entire comment section has me repeatedly thinking to myself "you don't run into that problem with i3-alikes" over and over. My choice not to put up with modern UX bullshit is feeling strongly reaffirmed right now. Not that it needed to be.

> it was always annoying to send an screenshot to someone and have to explain that no, I wasn't using Windows 95

That's not a negative, that's a fringe benefit as an endless source of entertainment.

reply
breadwinner
2 days ago
[-]
Agree that "modern" OS user interfaces are objectively worse from a usability perspective. That's thanks to Flat UI, mostly.

In my opinion, nothing beats the 35-year-old NeXTSTEP interface (which W95 is a weak imitation of):

https://www.gnustep.org/carousel/PC_1300x650.png

reply
esafak
2 days ago
[-]
Microsoft Windows programs hid functionality under layers of menus and the registry. MacOS, at least, surfaces much less functionality, because it offers sensible defaults. I never had to do anything akin to fiddling with the Windows Registry.

I did like some Windows things, though, like the ribbon, and reconfigurable UIs. Today's UIs are more immutable, for the worse.

reply
zamadatix
2 days ago
[-]
I'd agree macOS surfaces much less functionality but I feel like it's more "because they don't want you to feel like there is a choice to make in the first place" rather than "because the defaults are ideal for everyone". Over time it feels like "layers of menus" have definitely made their way into Apple's software anyways.

The replacement to the registry seems to half be "magic CLI incantations for settings which can't be found in the GUI for some reason" and half "here's a $4.99 app to 3 finger tap to close tabs".

reply
p_l
2 days ago
[-]
And the defaults system is just registry by another name
reply
cosmic_cheese
2 days ago
[-]
Not really, defaults are stored in per-application plist files rather than in a singular database.
reply
p_l
1 day ago
[-]
And what difference does it make to the end user where exactly the key/value data is stored? There's no real difference between HKEY_CURRENT_USER\Software\MyAppName and com.my.app when you're trying to coerce some internals whose configuration is not exposed because you're not worthy of it.
reply
Lammy
1 day ago
[-]
It was common in the Windows 9x days for the two Registry Hives (SYSTEM.DAT and USER.DAT) to get corrupted leading to an unbootable system or to get fragmented and/or full of disused values from poorly-uninstalled software leading to increased memory usage.

Here are some KB articles to check out for context:

- https://helparchive.huntertur.net/document/105563

- https://helparchive.huntertur.net/document/89799

- https://helparchive.huntertur.net/document/89794

reply
p_l
1 day ago
[-]
Been there, know the pain, still not actually a big difference to the question of modifying "unexported" settings
reply
julik
8 hours ago
[-]
There is - removing a wonky preference namespace is as easy as `rm ~/Library/Preferences/com.cheapskatesoftware.wonko.plist`. Whereas the Windows Registry is a monolithic piece of gunk you need a Microsoft editor to zap anything out of.
reply
p_l
5 hours ago
[-]
Considering the day I once spent hunting for all possible plist locations of a single program, I'd rate it about same for registry and plists
reply
cosmic_cheese
1 day ago
[-]
I’d say that a quick defaults command is probably on the whole more friendly than trawling around in the arcane mess that is the Windows registry. It’s not as friendly as it could be, but at least it’s a somewhat human readable one liner.

It’s also reasonable to back up plists and/or sync them between machines like some users do with their dotfiles, because they’re just files.

reply
bshacklett
1 day ago
[-]
Registry settings can be modified via CLI, too. Windows users are just far more averse to the command line.
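To that point, reg.exe covers the common cases from PowerShell; a sketch, with a made-up key path (`ExampleVendor\ExampleApp` and `ShowHints` are invented purely for illustration):

```shell
# Create or overwrite a DWORD value (/f skips the confirmation prompt).
reg add "HKCU\Software\ExampleVendor\ExampleApp" /v ShowHints /t REG_DWORD /d 1 /f

# Read it back.
reg query "HKCU\Software\ExampleVendor\ExampleApp" /v ShowHints

# Export the subtree to a .reg file -- roughly the analogue of copying a plist.
reg export "HKCU\Software\ExampleVendor\ExampleApp" exampleapp.reg /y
```

The exported .reg file is plain text, so it can be diffed, versioned, and re-imported on another machine with `reg import`.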
reply
p_l
1 day ago
[-]
I have never seen anyone backup defaults database between macs[1], I have seen a lot of scripts calling setting by setting instead.

Which has direct equivalent in "reg" files, to be quite honest.

[1] Other than restoring time machine backup to another system or similar cloning setups

reply
cosmic_cheese
1 day ago
[-]
It’s not a database, they’re individual files. Most are even plain XML that can be hand written and edited with a text editor.
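Concretely: because XML plists are just files, even a stdlib can round-trip them. A minimal Python sketch (the file name and keys here are invented for illustration, not real macOS preferences):

```python
import plistlib
from pathlib import Path

# Hypothetical preferences file; real ones live under ~/Library/Preferences
# with names like com.vendor.app.plist.
path = Path("com.example.demo.plist")

# Write a small preferences dictionary as XML.
prefs = {"ShowStatusBar": True, "RecentCount": 5}
with path.open("wb") as f:
    plistlib.dump(prefs, f)

# The file is plain XML, so a text editor (or grep) can read it...
assert b"<key>ShowStatusBar</key>" in path.read_bytes()

# ...and plistlib round-trips it losslessly.
with path.open("rb") as f:
    assert plistlib.load(f) == prefs
```

That readability is what makes the dotfiles-style backup/sync workflow practical.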
reply
p_l
5 hours ago
[-]
This is not a gotcha, really. An XML file can be considered a database just as well (similarly, part of the registry on NT is portable between machines).
reply
cosmic_cheese
2 days ago
[-]
It’s not a 1:1 mapping, but much power user functionality in macOS is designed to progressively reveal itself as the user becomes more technically capable, a type of design known as progressive disclosure. This allows newbies to not feel overwhelmed while also allowing power users to feel at home.

The problem is that way too many people approach macOS with the Windows way of doing things firmly planted in their minds as “correct”, which interferes with this process. For example, over the years I’ve encountered numerous posters complaining about how macOS can’t do X thing, after which I point out that X thing is right there as an easy to find top level menu item, but the poster in question never bothered to take a look around and just assumed the functionality didn’t exist since it wasn’t surfaced the same way as under Windows or KDE or whatever they were coming from.

Of course there are things macOS just doesn’t do, but there’s plenty that it does if users are willing to set their preconceptions aside for a moment.

reply
exiguus
2 days ago
[-]
If you approach macOS the Linux or BSD way, it feels like Windows PowerShell. Of course you can use brew and such, set up your dev environments, etc. But when it comes to system settings, it's bad, very bad. Also, stuff like Docker and k8s suffers in performance and usability.
reply
cosmic_cheese
2 days ago
[-]
Docker, etc are going to suffer on anything that’s not Linux due to how coupled they are to Linux. Even WSL isn’t as good as bare metal Linux in that regard. To me it speaks to a need for return to platform agnosticism in dev tooling more than anything.
reply
cogman10
1 day ago
[-]
WSL2 works fine, but that's merely because it's a linux VM with a little polish.

I'd kill to do dev on a linux machine, but alas it's not company policy :(

reply
int_19h
1 day ago
[-]
One thing that I always hated about macOS is the menu bar placement.

Ironically, in the long run, it has proven to be an asset for the simple reason that any macOS app has to have a main menu with commands in it, if it doesn't want to look silly. So this whole modern trend of replacing everything with "hamburger" menus that don't have the functionality isn't killing UX quite so bad there.

Although some apps - Electron ones, especially - stick a few token options there, and then the rest still has to be pixel-hunted in the window. Some don't even put "Settings" where it's supposed to be (under the application menu). Ugh.

reply
cosmic_cheese
1 day ago
[-]
On the last paragraph, something is better than nothing, though. It’s always bugged me that Electron doesn’t offer Windows and Linux users a way to enable the menus that’ve been provided for the Mac version.
reply
SlackSabbath
1 day ago
[-]
As somebody who recently had to switch to Mac for work, my experience has been the exact opposite of this. Every other OS I've used since Windows 95 I've been able to get to grips with the same way: start off using the mouse to find my way around the UI, and introduce keyboard shortcuts as and when I find them useful. Eventually I get to the point of being able to use either exclusively keyboard or exclusively mouse for most tasks.

MacOS seems to _require_ some unergonomic combination of both from the get go. Some basic things are easy with the keyboard but hard/impossible with the mouse and vice versa. The Finder app doesn't even have a button to go 'up' a directory for god's sake.

reply
wpm
16 hours ago
[-]
Right click the window title and you get a drop down navigation menu of every folder in the hierarchy up to the "/Volumes/$VOLUME" the folder is on.

I'd kill for this on Windows or any mainstream Linux DE.

reply
cosmic_cheese
1 day ago
[-]
It helps to keep in mind where the OS and its core user base is coming from.

For the case of the up button, for example, prior to OS X the Finder was a spatial file manager where each folder had a single corresponding window that remembered its position on screen, allowing users to rely on spatial memory to quickly navigate filesystems. Its windows didn’t even have toolbars, because they weren’t navigator windows — every time you opened a folder you got a new window (unless that folder’s window was already open, in which case it was foregrounded).

So when OS X rolls around in ~2000 and switches the Finder to navigator windows, they’re looking at what existing users will find familiar. Back/forward is easy, since most had used a web browser by that point and it maps cleanly to most people’s mental models (“I want to go back”), but up? That’s a lot rarer. A handful of folks who’d used an FTP client might’ve been familiar with the concept, but few outside that group would’ve been, and how “up” relates to a filesystem is not in any way obvious. And so, the Finder never got an up button, just a key shortcut, because anybody advanced enough to be hunting down shortcuts is going to understand the notion of “up” in a filesystem.

reply
diggan
2 days ago
[-]
> I never had to do anything akin to fiddling with the Windows Registry

If I recall correctly, when I got my first Macbook, I had to edit plist files or something similar in order to do basic things like permanently showing hidden files, showing the full path in Finder, show file extensions for all file types, increase the animation speed so the computer didn't feel slow as molasses, etc, etc.

Maybe these things are now easier to configure via GUI on macOS?
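(For what it's worth, the usual route for those tweaks is the `defaults` CLI rather than hand-editing plists. A sketch of the classic incantations — macOS-only, and the exact keys have varied across versions, so treat these as illustrative:)

```shell
# Always show hidden files in Finder (persists across reboots)
defaults write com.apple.finder AppleShowAllFiles -bool true

# Show the full POSIX path in the Finder window title
defaults write com.apple.finder _FXShowPosixPathInTitle -bool true

# Show extensions for all file types, system-wide
defaults write NSGlobalDomain AppleShowAllExtensions -bool true

# Relaunch Finder so the changes take effect
killall Finder
```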

reply
baq
1 day ago
[-]
One thing that is weird is that you’re expected to look around the menu bar while holding the Option key, since the menu contents change when it’s pressed (this also applies to tray icon menus, e.g. the WiFi icon shows a lot of extra stuff when Option-clicked). IIRC some of what you mention can be toggled with Option menu items.
reply
cosmic_cheese
2 days ago
[-]
Toggling hidden file visibility in Finder and open/save dialogs has been doable with the key shortcut Command-Shift-. for quite some time now.
reply
foobarchu
1 day ago
[-]
The last time I was setting up a system, it was still very difficult to find in the menus. If it's not discoverable and I have to know the incantation/shortcut to do it, then that's bad UI.
reply
diggan
2 days ago
[-]
Is that permanent across reboots and all? I think that was the main issue I had with it, but was a long time ago now.
reply
wpm
16 hours ago
[-]
There is a checkbox for file extensions and a View menu item for full paths. Hidden files is still not surfaced in the GUI, but is persistent.
reply
bbqfog
2 days ago
[-]
MacOS is pretty cursed. The equivalent to registry fiddling is doing anything in ~/Library/Application Support

It still has "Services" as a holdover from NeXT that is completely broken and unused (but still present in every app for some reason). Now you also have the joy of diving deep into Settings every time an app needs some sort of permission.

I'd say something about .DS_Store files, but that's not really UI.

reply
wpm
16 hours ago
[-]
I'd much rather work with plain-text human readable property list files with straightforward `defaults` commands than the hive of Hell called the registry.
reply
anthk
1 day ago
[-]
NeXTStep/GNUStep/Cocoa 'defaults' commands.
reply
burnte
1 day ago
[-]
> I never had to do anything akin to fiddling with the Windows Registry.

I don't believe you. You have never, EVER, NOT ONCE run a terminal command to change an option on MacOS? I just refuse to believe anyone on HN hasn't altered preferences in the terminal on MacOS.

reply
esafak
1 day ago
[-]
Not with anything near the frequency of Windows. The last such thing I remember doing is restarting the locate indexing service with launchctl. I do lots of things in the command line, of course, but not so much to configure MacOS itself.
reply
brandon272
2 days ago
[-]
There is an entire ecosystem of free and paid Mac apps meant to augment the Mac experience because MacOS does not provide functionality or configuration needed for a sensible computing experience out of the box.
reply
exiguus
2 days ago
[-]
I think the main difference between macOS and Windows is that Windows allows third-party drivers. macOS does not. Drivers also mean hardware, so you can build your own PC. Same as with Linux.

This is the Apple secret of success, IMO. No third-party drivers and hardware means it will just work, and no one will blame you for stuff third parties messed up.

But it's also like: there is only a red and a blue t-shirt. Choose. No gray, no white, no yellow, no prints.

reply
cosmic_cheese
2 days ago
[-]
macOS allows third-party drivers too, Apple just wants vendors to write them in userspace rather than kernelspace. That’s probably not the worst thing, because proprietary driver code is notoriously shoddy and should be run somewhere that limits the blast radius.
reply
exiguus
2 days ago
[-]
Sure. I think the userspace restriction is also the reason that nearly no third-party hardware for the Mac exists.
reply
cosmic_cheese
2 days ago
[-]
That’s mainly restricted to graphics cards. Audio cards like those used for production, as well as I/O (USB, etc.) and networking cards, have drivers and work fine given you have a PCI-E slot to plug them into, and of course almost anything external connected via USB or Thunderbolt works fine. For GPUs, it’s only a specific subset of users that needs a discrete GPU, especially as the GPU built into M-series SoCs has become powerful enough for most uses outside of high-end gaming.
reply
exiguus
2 days ago
[-]
Selecting the appropriate tool for the task at hand is crucial, in my opinion. However, I believe the choice is often influenced by companies mandating the use of Microsoft and Mac systems due to cost and maintenance considerations, rather than allowing employees to choose between Mac, Windows, or Linux based on their preferences. Proprietary software that only runs on Mac or Windows was never an argument, because you can just RDP-stream remote desktop apps or use the browser.
reply
happyopossum
1 day ago
[-]
> My hypothesis is today's "modern" OS user interfaces are objectively worse from a usability perspective, obfuscating key functionality behind layers of confusing menus.

I think if you went back and actually tried to use these old UIs you would realize that one of the reasons stuff isn't hidden behind layers of menus is that in a lot of cases the 'hidden' features just didn't exist back then.

reply
epolanski
1 day ago
[-]
Mobile too. I'm sick of those OSes updating every 18 months because some product person, along with marketing, decides the wheel has to be reinvented or there will be no buzz.

I have a harder and harder time navigating both iOS and Android as time goes, should be the opposite.

Same for Windows or MacOs.

reply
hx8
1 day ago
[-]
> The dominance of Windows is crazy, even today, Mac desktops and laptops are comparatively niche

I was actually surprised that macOS/Linux/ChromeOS together are >20% of all desktop/laptops. I would have expected Microsoft machines to be closer to 90% than 80%.

reply
everdrive
1 day ago
[-]
I'm surprised by the breakdown as well. At least according to these two citations in Wikipedia, the breakdown is:

---

For desktop computers and laptops, Microsoft Windows has 71%, followed by Apple's macOS at 16%, unknown operating systems at 8%, desktop Linux at 4%, then Google's ChromeOS at 2%.[3][4]

---

[3] "Now more than ever, ChromeOS is Linux with Google's desktop environment". About Chromebooks. 1 August 2023. Retrieved 25 September 2024.

[4]"Desktop Operating System Market Share Worldwide". StatCounter Global Stats. Retrieved 9 March 2025.

reply
toast0
2 days ago
[-]
Honestly, I don't think anyone has done real user research on basic interfaces since Microsoft did it for Windows 95. I'm pretty sure that was the last publication I've seen.

It's a lot of time, effort, and expense to run user research, but the potential benefits to the users are big.

reply
hnthrowaway0315
1 day ago
[-]
For modern Windows operating systems, i.e. >= Windows 10, getting rid of ads, weather, auto-update and the search bar can improve UI usability significantly.

Windows 95 was not stable enough back then. I believe Windows 2000 was the first OS that was easy to use and relatively stable. XP and 7 are both solid options too.

I have used both Mac OS X and modern macOS (15), and the UI of X is definitely way better than the UI of 15. It is more clean-cut.

reply
epolanski
1 day ago
[-]
98 was a very good OS. I don't think many got 2000; XP was the major Windows for millennials.
reply
anthk
1 day ago
[-]
XP was almost a reskinned 2000. Most drivers for XP worked under 2000 and vice versa.
reply
trbutler
1 day ago
[-]
Yes. I'd love to see someone take the basic design of Windows 95 or even early OS X and reimplement it not so much visually, but tactilely. Make something that works as well, is as simple but isn't nostalgia.

In any case though, this particular attempt at giving a complete Windows 95 experience is quite cool.

reply
interludead
1 day ago
[-]
The Windows dominance really puts things in perspective... Even with the rise of mobile and Macs making some headway, it's amazing how entrenched it still is on the desktop.
reply
walrus01
1 day ago
[-]
Give today's XFCE4 a try, on debian stable or testing. It's a remarkably no bullshit GUI for use with xorg. You can of course still install all the gnome and kde libraries and run all the gnome and kde applications, though things will look a little bit mismatched.
reply
TacticalCoder
2 days ago
[-]
> The dominance of Windows is crazy, even today, Mac desktops and laptops are comparatively niche

But for whom and to do what? People around me, like my wife and mother in law, are happy with an Android phone. If they use a computer, the only thing they need is a browser.

I wouldn't be surprised if 90%+ of all Windows users would use Windows for one thing: click on a browser icon.

That's why after one malware too many I confiscated my mother-in-law's Windows laptop and got her a Chromebook.

I'd say that's why the shit UI/UX doesn't even matter anymore for most users: they only use a browser anyway.

reply
exiguus
2 days ago
[-]
What do you mean exactly? Like the menu issues in Windows 10? Because from a UX perspective, basically nothing has changed. The UI has, of course, but the UX is the same as in the 90's, following "The Design of Everyday Things" by Donald Norman.

I think its more about the change management, expectations. For example in Win XP you had the option to use the NT theme. As a user: "I can decide when to move on to the new design."

Usually around 50% of your users are conservative about change. You have to keep this in mind when you change a design. On the other hand, if you sell a product with a subscription, you have to introduce new features, or users will move to another product. But when you introduce new features, the UI gets more complicated and users will blame you for that.

reply
myself248
2 days ago
[-]
Like making window borders 1px wide, even as screen pixel density increases. It's darn near impossible to resize a window anymore.

Like making buttons auto-hide unless you mouse-over them. I don't remember when this came in, but the default PDF viewer in something did this, and I spent _weeks_ being baffled that some jerk made a PDF viewer that couldn't zoom in on the page, until I randomly waggled the mouse for some reason and the missing buttons magically appeared. I have no words for how upsetting this was.

Like having icons-only for many functions, with no text-and-icons or text-only option to replace them. I'm sure some people are fine with that, but other people can scan a screen for a desired word MUCH faster than they can scan for a desired icon, and removal of text labels is just an insult to that segment of the userbase.

Like no longer highlighting, or even having, hotkeys for many menus. I can alt-space or alt-menukey my way through a late-90s menu tree _way_ faster than I can mouse through it, even with today's better mice, but that simply doesn't work anymore in a great many programs.

It's one thing for people who've never known a different UI to just be slow in this one and that's all they've known, and that's fine for them I guess, if it's pretty and they prefer that, or if keyboards frighten them.

But for people who have DECADES of reflexes invested in these shortcuts to suddenly find that they don't work anymore, and we're forced to SLOW DOWN and be less productive than in the past, that's a high insult.

Microsoft spilled tankers of ink in the 90s talking about how their new GUI patterns would make people more productive by unifying these things across programs (which was true; in the DOS era every program made up its own shortcuts and ways to access them), and folks who learned them are now being punished for trusting MS with our loyalty.

"Basically nothing has changed" my ass.

reply
4k93n2
2 days ago
[-]
> Like making window borders 1px wide, even as screen pixel density increases. It's darn near impossible to resize a window anymore.

Check out AltDrag if you're on Windows (it's discontinued now, but I think I remember seeing newer forks).

It lets you hold down a key and then drag the cursor in one of the 4 quarters of a window to resize it.

A lot of the Ubuntu-based distros I've tried have had this feature built in for a while now, and it's far superior.

reply
myself248
1 day ago
[-]
Thank you for the recommendation, I'll look at installing that.

But the problem with add-ons is that every machine you use will have a different combination of them installed. Maybe you're at work and don't have privileges to install them. Maybe you forgot to install one on your desktop even if it's on your laptop, etc.

And building it into "a lot" of Ubuntu-based distros lacks discoverability. I might have that now on the machine I'm typing this on, but it does me no good if I don't know it's there. (Everything in Linux has worse-than-terrible discoverability, but that's another rant entirely.)

MS's dominant position meant their defaults Just Worked everywhere, and when those defaults were good, they were really, really good, by virtue of their ubiquity. Then they fucked us by using their dominant position to just... I don't know... completely lose the plot? Aside from HiDPI fractional scaling and support for large monitor "maximize to a quadrant" and stuff, I can't point to a single MS UI improvement since the XP days. Everything else has just gotten worse, fragmented, and for no good reason.

reply
exiguus
2 days ago
[-]
I agree, that's bad. And, for example, the "icon only" thing follows a bad but hip UI pattern where designers assume the user knows what the icons mean. They should not, in my opinion. I mean, in the end, you can decide: 1. learn all these new patterns in Windows, or 2. move on to another, more stable window environment like GNOME or KDE or whatever. In the end, it's all about the effort, now and in the long run. And you're forced to calculate that because of the introduced change.
reply
zozbot234
2 days ago
[-]
The 'icon only' thing is less of a problem when you can hover your mouse pointer on the icon and get a tooltip that tells you what it's for.
reply
exiguus
2 days ago
[-]
There is also a very good example about this in The Design of Everyday Things - the one with the revolving doors.
reply
exiguus
2 days ago
[-]
No no no. That's bad, really bad, because this involves at minimum 2 unnecessary steps:

1. touch the mouse (if not already)
2. move the mouse to the button
3. wait until the tooltip appears

reply
floundy
1 day ago
[-]
Have you used Windows 11? When right clicking, there's now a context menu within the right click context menu. To see what you could see in Win10, you have to right click, then select "See more options" or something. Which just opens up the "old" Win10 context menu with a totally different visual appearance than the Win11 one. Talk about jank and bloat.
reply
selfhoster11
2 days ago
[-]
I'm sorry, but absolutely no. Fuck no. Nothing from Microsoft has even been in the same building as a copy of "The Design of Everyday Things", or as a copy of any old-school UX book from before UX meant "Electron". UX is just as much about the "how" as it is about the "what", and Microsoft has been failing everyone lately on this count.
reply
exiguus
2 days ago
[-]
That's not true. Plenty of it is derived from that, from simple stuff like toggles all the way to the whole window management. IMO the huge change in Windows 11 is how the menu, app starter and so on work (if you use the mouse).
reply
haswell
1 day ago
[-]
Lately I've been strongly considering helping migrate my parents to Linux. Their needs are primarily web-based with some basic productivity tools mixed in, and Windows has just been getting more and more hostile. On top of this, they're at an age where they're now more susceptible than ever to various scams/attacks, and shutting down an entire category of problems by removing Windows from the picture is increasingly attractive.

I had forgotten that Chicago95 exists, but this might be exactly the right thing. They'd immediately find it familiar, and while the theme isn't the whole story, this would go a long way in easing the transition I think.

I miss this era of computing.

reply
ianmcgowan
1 day ago
[-]
A chromebox mounted behind the monitor did the trick for me. Haven't had an emergency wipe/reinstall in years. Also, a tablet with keyboard takes some of the pressure off having a "computer" and you can go with iOS or Android depending on what phone they use.
reply
haswell
1 day ago
[-]
I've been evaluating a Lenovo ThinkCentre m920q tiny I picked up for not very much money on eBay (the m720q models are even cheaper) and they seem like perfect machines for the task.

My parents use some tools and hardware that require a full OS so the tablet route isn't an option, but I'm starting to really like the idea of deploying a couple of these micro PCs.

reply
mixmastamyk
1 day ago
[-]
Out of the frying pan, into the fire. Exposing your parents to total surveillance (from one corp to another) is not what I'd characterize as safe or friendly. Linux is fine these days if the hardware is supported, and you can use an immutable distro if extra reliability is warranted.
reply
hattmall
1 day ago
[-]
Strong disagree. I mean, tracking is what it is, but it's happening regardless; if they are using Chrome to browse and Google services, they are being tracked.

ChromeOS seems to work really well though, and is dead simple and intuitive. It used to be incredibly awesome with Crouton, but that's mostly dead. Crostini is acceptable though. I would absolutely recommend a Chrome device for aging people, for security and simplicity.

Plus, running Android apps on the desktop gives even more software options than any other desktop for most consumers.

reply
mixmastamyk
1 day ago
[-]
Reliability is a commodity these days, and we don't use Google services.

Apparently, ubiquitous surveillance is acceptable to you. Not an uncommon stance, but I'd never recommend it to others. It's a disservice when freedom and privacy respecting choices, that are just as reliable, already exist.

reply
fc417fc802
1 day ago
[-]
You also can't properly back up the system unless you install a custom ROM and take on the associated maintenance burden. (At least on Android; I'm not sure to what extent these things are left up to the whims of the developer on iOS.)
reply
arcmechanica
1 day ago
[-]
My parents run linux because mom likes coupon websites and I can't repair the thing every week
reply
veqq
1 day ago
[-]
What are coupon websites? Can you get free coupons from them?
reply
thoughtpalette
22 hours ago
[-]
retailmenot, etc. There's some pretty dubious ones that come up in search results, e.g. try searching "levis.com coupon codes" or something.
reply
txdv
1 day ago
[-]
I installed Ubuntu for my mother; she just needs to download and read PDFs, look at images, and use Gmail. Sometimes she opens a document with LibreOffice, but no power usage.

Seems to work, and the maintenance is now super easy: ssh in, update. Something wrong and she needs support? I ssh in, open up a tunnel, and connect via Remmina to her desktop to explain.

I had a situation once where Ubuntu would literally not start the desktop environment anymore, but all I did was update and upgrade the packages and it started working again.
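That tunnel-then-connect flow can be sketched roughly like this (the hostname and port below are hypothetical placeholders, not the commenter's real setup):

```shell
# Rough sketch of the remote-support flow described above.
# "mom-pc.example" and port 5900 (default VNC display :0) are placeholders.
MOM_HOST="mom-pc.example"
VNC_PORT=5900

# -N: no remote command, just forward the port. Remmina (or any VNC
# client) then connects to vnc://localhost:5900 through the tunnel.
TUNNEL_CMD="ssh -N -L ${VNC_PORT}:localhost:${VNC_PORT} ${MOM_HOST}"
echo "$TUNNEL_CMD"
```

Run the printed command in one terminal, then point the VNC client at localhost while it stays open.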

reply
dpflug
1 day ago
[-]
IMHO, Fedora's Atomic Desktops[^1] are the way to go for that. Automatic upgrades you can roll back if something breaks? Yes, please.

Universal Blue[^2] has some spins that got a glow up, but their dev team gives a bit of the "everything old is bad" vibe.

OpenSUSE's MicroOS[^3] desktops aren't ready for nontechnical people, but their atomic upgrade strategy is much faster and simpler (btrfs snapshots). I'm keeping an eye on it.

^1: https://fedoraproject.org/atomic-desktops/

^2: https://universal-blue.org

^3: https://microos.opensuse.org

reply
haswell
23 hours ago
[-]
Good call on Fedora's Atomic options.

My daily driver is NixOS and part of me really wants that level of predictability and rollback for them. For a brief period, I had started thinking through what it might look like to remotely manage this for them. But my ultimate goal is to help them achieve autonomy, stepping in only when necessary.

reply
cryptoegorophy
1 day ago
[-]
Why not just iPad?
reply
haswell
1 day ago
[-]
I may have understated their needs somewhat. Most of what they do is browsing and document editing, but a few key use cases make a real computer necessary (or at least highly desirable):

- Document scanning

- Label printing (my mom buys/sells stuff on eBay)

- My dad still works and writes proposals/manages invoices/does complex taxes

At a minimum, they need a full desktop environment. Most of these things have decent 1:1 Linux alternatives, but one or two might necessitate a single-purpose Windows VM when all else fails.

Two pretty decent used micro PCs will also cost less than a single iPad.

reply
doright
2 days ago
[-]
I like themes like this. The only thing that hampers the authenticity for me, and this isn't the fault of the author really, is the super high resolution fonts compared to what was available back then. There's just something charming about low resolution fonts that are readable enough on screen, probably nostalgia.

I think any type of pixel font authentic to a couple decades ago won't look good on a 4K monitor, unfortunately. It got to the point where I ordered a 1024x768 monitor just to play old games with a period system.

reply
jeroenhd
1 day ago
[-]
Pixel fonts don't accurately represent the 90's UIs because we don't use CRTs anymore. The poor souls buying the very first terrible flat-screen monitors may have used computers like that, but most of that era was experienced on smudgy, edge-blurring CRTs.

You could probably create a CRT-filter-based font for high resolution screens (though you'd probably still need to optimise for subpixel layout for accuracy, even on 4k monitors).

reply
Gormo
1 day ago
[-]
Most people vastly overstate the effect that CRT displays had on the appearance of legacy software.

Yes, very early on, when people used TVs or cheap composite monitors as the display devices for their computers, there were blurry pixel edges, bloom effects, dot crawl, color artifacting, and all the rest.

But by the '90s, we had high-quality monitors designed for high-resolution graphics with fast refresh rates, with crisp pixel boundaries and minimal artifacting. CRT filters overcompensate for this a lot, and end up making SVGA-era graphics anachronistically look like they're being displayed on composite monitors.

reply
zozbot234
1 day ago
[-]
CRT monitors did not have "crisp pixel boundaries". A CRT pixel is a Gaussian-blurred dot, not a "crisp" square as it is on modern displays. What "high-quality" CRT monitors did have was higher resolutions, even as high as 1600x1200, where individual pixels are basically not distinguishable.
reply
Gormo
5 hours ago
[-]
By the early '90s, high-quality CRT displays had low dot pitches or very precise aperture grilles in addition to supporting a wider range of refresh rates, and better clarity of display was a major selling point.

People were typically using 640x480 or 800x600 in GUI environments, and most DOS games were at 320x200. 1600x1200 was incredibly uncommon, even where the video hardware and monitors supported it -- people were usually using 14" or 15" 4:3 displays, and that resolution was way too high to be usable on displays that size, and the necessarily lower refresh rates made flicker unbearable at higher resolutions.

At the common resolutions and with purpose-built CRT monitors, pixel boundaries were quite clear and distinguishable.

reply
zozbot234
4 hours ago
[-]
> At the common resolutions and with purpose-built CRT monitors, pixel boundaries were quite clear and distinguishable.

Being able to clearly resolve individual pixels (which I agree was a thing at resolutions like 640x480 or 800x600. 1024x768 is pushing it already though) is not the same as seeing "crisp" boundaries between them. The latter is what I was objecting to. 320x200 (sometimes also 320x240 or the like) is a special case since it was pixel-doubled on more modern VGA/SVGA display hardware, so that's the one case where a single pixel was genuinely seen as a small square with rather crisp boundaries, as opposed to a blurry dot.

reply
dfox
2 days ago
[-]
Another issue with modern recreations of old UIs is that the dimensions are usually subtly wrong, which for me ruins the feeling. Some of that is related to the fonts having different heights, but in many cases something is just one pixel off and looks wrong. For the 95-style UI the common issue is the control borders (especially the highlight side of "3d" controls), of which there are plenty of examples in the screenshot.
reply
wobfan
2 days ago
[-]
I actually think of this less as a look back into the past and more as, hopefully, a real alternative to the current DEs, which obviously then needs to have high-res fonts. That would be nice.
reply
selfhoster11
2 days ago
[-]
I wouldn't say that's so "obvious". I for one would prefer the original pixel fonts, but size adjusted to fit my screen density. By hand, if required.
reply
zozbot234
2 days ago
[-]
If we're talking Windows 9x, the "original fonts" could also be TrueType, hence arbitrarily resized. Yes, the original Windows 95 included a pixel font for the UI but then TrueType fonts like Verdana and Tahoma were added soon after that and were commonly used.
reply
selfhoster11
1 day ago
[-]
But didn't they include handcrafted hinting bytecode or something like that?
reply
selfhoster11
2 days ago
[-]
For 4K monitors, why not just pixel-double? Integer scaling will solve many issues introduced by pixel fonts.
reply
doright
1 day ago
[-]
You're right in that there's nothing stopping one from doing so (I even use an integer scaler for old games on my main computer), it's just a tradeoff between "doing what's possible" and "having the most authentic experience one can".

If we're talking about the subjective experience of recreating "a child's bedroom computer" from the mid 90s-early 00s, a widescreen aspect ratio alone would be jarring, since my conception of a monitor for such a system is a 4:3 CRT. So for me, little else would reach that level except a system with the same aspect ratio and a similar DPI.

Not only that, but UI design itself has undergone many shifts since that era to account for the types of monitors those UIs are being designed for. There's not as much of a need for pixel-perfect design when vector-based web UIs dominate the desktop application space nowadays, relegating those that go back to older UI paradigms to enthusiasts who still remember earlier times. Or maybe people who develop for fantasy consoles.

I should mention while I'm at it that those sorts of faux-pixel-art shaders used in some games come off as quite jarring to me, since I expect UIs to be meticulously laid out for the original screen size, not just blown up 2x or 4x on a huge widescreen monitor. I sometimes feel those are meant to represent a nostalgic feeling of some kind, being pixelated and all, but really it just makes me wish there were some alternate reality in which people still designed games and desktop applications for 800x600 or 1024x768 monitors.

It's interesting at present how there's stuff like 4K and then there's the playdate with a relatively tiny handheld resolution, but relatively little interest for new content for those types of resolutions in-between.

reply
Gormo
1 day ago
[-]
> If we're talking about the subjective experience of recreating "a child's bedroom computer" from the mid 90s-early 00s

Is that what this project is going for? I understood it to be attempting to apply design elements from that era to create a superior UI for a modern "child's bedroom computer".

reply
creatonez
1 day ago
[-]
libpango's removal of bitmapped fonts in 2019 did serious harm to retro theming.
reply
interludead
1 day ago
[-]
I love that you went all-in with a 1024x768 monitor
reply
MarkusWandel
1 day ago
[-]
Three modern desktop environments that I use:

- Windows 10/11. Especially in 11, it's easiest just to type the start of an app's name into the search box. As opposed to the two clicks it takes to get to the "traditional" menu where you still have to scroll to find it.

- Gnome (only on fresh Linux installs, usually replaced with Mate pretty soon). Has a smartphone-style app grid, but here, too, it's quickest just to type the start of the app's name.

- Mate: Modern, but still has the Windows 95 paradigm (easy enough to collapse the two toolbars into just one bottom one). Still my favourite desktop environment.

Not all fancy graphic stuff is good. And don't even get me started on how hard it is to drag an app window to another screen these days - on Windows. You really have to find the 2% or so of the top bar that's still draggable and not cluttered up by other stuff.

reply
teamonkey
1 day ago
[-]
How do you get Windows to launch an installed app after you type the first few letters, instead of searching the web with Edge and Bing?
reply
Mogzol
1 day ago
[-]
I used winaero tweaker [1] to disable web search, the search is infinitely better now.

You can do the same tweak by editing the registry [2] if you don't want to download an app for it (though the app includes a lot of additional useful tweaks).

[1] https://winaerotweaker.com/

[2] https://www.tomshardware.com/how-to/disable-windows-web-sear...
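For reference, the registry route in [2] comes down to a single per-user policy value; this is a sketch of the .reg file (key names as documented in the linked guide; verify against your Windows build before importing, since policy keys can change between releases):

```
Windows Registry Editor Version 5.00

; Disable web/Bing suggestions in Start menu search (per-user policy)
[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\Explorer]
"DisableSearchBoxSuggestions"=dword:00000001
```

Signing out and back in (or restarting Explorer) is reportedly needed for it to take effect.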

reply
hn92726819
1 day ago
[-]
You can download Open-Shell and it will replace the start menu with one from whatever era you want (XP, 7, 8, I think 10 too). It's open source as well.
reply
MarkusWandel
1 day ago
[-]
Type into the search box and when the app icon comes up, click on it.
reply
arcmechanica
1 day ago
[-]
"I searched bing for you and here's the link to Word on the web"
reply
MarkusWandel
1 day ago
[-]
Dunno. On the relatively clean Windows 11 on my laptop, on which I'd never run Teams before, I typed "tea" into the search box (no ENTER), and while it also offered to look up "Tea" on the web, it did show "Microsoft Teams (personal) (app)" at the top right with a large, clickable icon.
reply
wavemode
1 day ago
[-]
The decline in usability and organization of the Windows start menu over the years has been frankly staggering.

Whenever I see screenshots of the old menu I get pangs of nostalgia.

reply
MarkusWandel
1 day ago
[-]
I think they drank the Macintosh kool-aid and expect you to have all your commonly used apps pinned. In Win11 this even looks sort of like a Mac dock. Still made no sense to ruin the usability of the start menu which they invented.
reply
arcmechanica
1 day ago
[-]
That's what happens when you are no longer the darling: PMs need to ship something new to get noticed, so they screw it all up.
reply
protocolture
1 day ago
[-]
I am 99% sure that this is the result of academic GUI design and focus testing.

I read all the Windows 8 development blogs, and everything they wrote about seemed absolutely justified. Then you actually use the thing and it was a nightmare.

Same with their approach to hardware: the Duke Xbox controller tested really well, but then someone with daintier hands went to use it and, uh, actually it's great for like 15% of the user base.

reply
leptons
1 day ago
[-]
I use two free programs to return my computer usability to Windows 10 days.

https://open-shell.github.io/Open-Shell-Menu/

https://github.com/valinet/ExplorerPatcher

Without these I would probably give up on computers and go live under a bridge.

reply
MarkusWandel
1 day ago
[-]
At least you get the right-click menu that has a lot of handy stuff in the old format.
reply
interludead
1 day ago
[-]
Sometimes I wonder if the people designing this stuff ever actually use dual displays day to day
reply
emidln
2 days ago
[-]
This looks neat. I remember the various fvwm95 and icewm themes doing a similar number in the late 90s and early 2000s.

It would be fun to pair this with Gambas[0], a free VB6 clone that works with GTK.

[0] https://gambaswiki.org/website/en/main.html

reply
ThinkingGuy
1 day ago
[-]
qvwm was another window manager that sought to emulate the look and feel of Windows:

https://qvwm.sourceforge.net/index_en.html

reply
bsnnkv
2 days ago
[-]
This still remains the absolute pinnacle of cohesive desktop environment design in my books.
reply
InsideOutSanta
2 days ago
[-]
I think the desktop operating systems of that era were at a sweet spot. They were technically advanced enough to render good-looking, crisp color user interfaces. However, most people were still novices at using computers, so OS designers consciously designed their operating systems to be as clear as possible. Applications tended to be written for each individual platform and to follow its UI guidelines.

Windows 95, NT, System 7 and System 8, BeOS, and NextSTEP all had really clear UX. You always knew where to drag a window, what was clickable, where to find settings, etc.

reply
cosmic_cheese
2 days ago
[-]
An aspect of System 7/Mac OS 8/9 that I find criminally underrated is how flexible it is.

For those versions, a good bulk of the “system” isn’t part of the system proper but instead implemented by way of extensions and control panels loaded at startup. The OS itself is extremely minimal, basically just enough to provide a barebones desktop and act as a substrate for apps to run on. Everything else, including “core” functionality like audio and networking, was implemented in an extension.

This meant that you could pare back the OS to be extremely lean and only have the exact functionality you personally needed at that precise moment and nothing else, and doing so didn’t require you to recompile a kernel or anything like that — just create an extension set that only loaded what you needed. This was excellent for use cases like games and emulators where you wanted every last ounce of resources to go to those, and nice for single purpose machines too (no point in loading game controller support on a box that only ever runs Photoshop and Illustrator).

Of course the way it was implemented is awful by modern standards, but the concept is golden and I think there should be OS projects trying to replicate it.

reply
InsideOutSanta
2 days ago
[-]
I remember creating different extension sets using the built-in Extension Manager and the third-party tool Conflict Catcher. I had sets for gaming, video editing, and normal usage. It was a simple matter of selecting the correct set and rebooting. Or you could hit shift on startup and start into a minimal system without any extensions.

There's a good reason the third-party extension manager was called "Conflict Catcher," but the power and flexibility such a system grants users is unmatched.

reply
vanschelven
1 day ago
[-]
> However, most people were still novices at using computers

It has (to my surprise, initially) been my experience that "kids these days" are more novice at (desktop) computer-usage than the people of the 90s

reply
stonogo
1 day ago
[-]
"I've come up with a set of rules that describe our reactions to technologies: 1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works. 2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it. 3. Anything invented after you're thirty-five is against the natural order of things." -- Douglas Adams
reply
mfro
2 days ago
[-]
Is it really necessary to spin up an entirely new distro for an XFCE+GTK theme?
reply
grayhatter
2 days ago
[-]
> Stop spending time on things I don't care about

It's ok for people to waste time building stuff they think is cool. Did it need to be a distro? No, but it also didn't need to exist. I'm glad it exists though; I think it really whips the llama's ass!

reply
charcircuit
1 day ago
[-]
This attitude is why Linux based operating systems have such poor market share on the desktop. Opportunity costs are real. Friction is real. You don't see Windows creating a new OS for a single theme. You don't see macOS do it either.
reply
keyringlight
1 day ago
[-]
I see it as one of the consequences of freedom, but perhaps also a gap in packaging where they can't bundle up their changes in a form to be applied onto another base.

That 'base' is one issue I've been thinking about with Linux; I have similar concerns about the cost of everyone being able to make their own distro for their own slight variation on something else. It's not that I think it's a bad thing to pathfind in new areas, but the duplication in building/supporting it all, making users pick between four similar variants of the same thing, and accounting for "you're using KustomLinux, which is 2 steps removed from CommonLinux" and all the little differences between them add real cost. It's an interesting contrast with standardization, and I can't help wondering how it would change the approachability of Linux if the starting point were limited to one of the big distros, with variants layered on top of that.

reply
accrual
1 day ago
[-]
I agree. I've gone down the "make Linux look like Windows 95" path a couple times and while it's a fun yak shave, it's tough to get all the little changes and details right on one system, let alone on your friend's system who wants to play with the same DE for a day.
reply
grayhatter
1 day ago
[-]
The reason Linux has poor desktop market share has nothing to do with a fun themed distro someone created as a side project.

> You don't see Windows creating a new OS for a single theme. You don't see macOS do it either.

I'd also consider the behavior of both Windows and OSX to be a warning to avoid, and not an example to emulate.

But the line doesn't always have to go up with every breath everyone takes. It's ok to do stuff just because it's fun. Not every single action needs to increase market share.

reply
Gormo
1 day ago
[-]
Why would the Linux ecosystem -- a diverse community of lots of different individuals and organizations all pursuing their own particular goals -- be singularly concerned with increasing desktop market share of Linux as a whole, and all pursue that in the exact same way?
reply
immibis
1 day ago
[-]
You don't see Windows themes at all.
reply
corank
1 day ago
[-]
I wouldn't call it an entirely new distro. It's just a Fedora image bundled with the necessary changes to create the UX. It doesn't provide its own software repositories; it's more like an unofficial Fedora Spin.
reply
mfro
1 day ago
[-]
I see. It still seems like the kind of project that would be much better suited to a DE package-group style release. I think very few people will want to reinstall their OS just to try it.
reply
haunter
2 days ago
[-]
A better modern middle ground imo is the KDE3 continuation project https://www.trinitydesktop.org/
reply
bitbasher
2 days ago
[-]
It's not complete until you have a comet cursor and several IE toolbars that were somehow installed.
reply
hybrid_study
2 days ago
[-]
Or Microsoft Bob somewhere
reply
OsrsNeedsf2P
2 days ago
[-]
Does this project offer anything besides Chicago95's UI pre-installed?
reply
fsiefken
2 days ago
[-]
It would be nice if it had Wine installed so it could run the Windows apps for which there is no good Linux alternative: XYplorer, Sumatra PDF, IrfanView.

perhaps a shell where root is mapped to C:\
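As a sketch of how that mapping could work: Wine already resolves DOS drive letters through symlinks in the prefix's dosdevices directory (z: points at / by default), so a root-as-C: setup is just another symlink. The prefix path below is an example, not anything the distro ships:

```shell
# Wine maps Unix paths to DOS drive letters via symlinks in the
# prefix's dosdevices directory (by default z: -> /). Pointing c:
# at the Unix root is just another symlink. Example prefix path:
export WINEPREFIX="$HOME/.wine95-demo"
mkdir -p "$WINEPREFIX/dosdevices"
ln -sfn / "$WINEPREFIX/dosdevices/c:"
```

Note that this shadows the prefix's usual drive_c, so it's more a proof of the mechanism than a recommended configuration.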

reply
benrutter
1 day ago
[-]
I love the niche of enthusiasm that exists for the Windows 95 UI. It's not an original point that aside from nostalgia it's a really clear and usable design; but that leads me to wonder, are there any modern UIs/themes/etc that are inspired by (rather than necessarily directly mimicking) Windows 95?

Would be interesting to see what a modern version of Windows 95 would look like, or what general design lessons can be learned from its niceties.

reply
vardump
2 days ago
[-]
My childhood home's computer said 38911 BYTES FREE.
reply
jhbadger
2 days ago
[-]
And mine just said ]▩ as it waited for you to type an Applesoft command. It is always weird when people say something is from "your childhood" as opposed to theirs. I remember the 1990s, sure, but I was already an adult.
reply
WarOnPrivacy
1 day ago
[-]
Computers from my childhood wouldn't fit in my bedroom but I did bring punchcards home.
reply
probably_wrong
1 day ago
[-]
Mine said C:\>, because I was cool enough to have a 34MB hard drive.
reply
ipcress_file
1 day ago
[-]
Was that a C64? My VIC-20 had about a tenth of that!
reply
myself248
2 days ago
[-]
SYS 64738
reply
qwertox
2 days ago
[-]
`/var/home/afidel/`

There are some pretty good desktop environments for Linux which emulate the Windows desktop, so that old Windows users would feel at home immediately.

But I've never seen them emulate the filesystem, which is what took most old Windows users the most effort to understand. And the Linux filesystem raises it to a new level of complexity, which makes every old Windows user want to go back to Windows immediately.

With "old" users I don't mean experienced users.

Is there some kind of overlay which does all this C:\Users\afidel\Desktop mapping for those users?
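I'm not aware of a distro that ships one, but a crude version of such an overlay is just symlinks. A hypothetical sketch (nothing here is standard on any desktop environment):

```shell
# Fake a Windows-style C:\Users\<name> layout on top of a Linux home.
# Purely illustrative; no distro ships this by default.
me="${USER:-$(id -un)}"
mkdir -p "$HOME/C/Users"
ln -sfn "$HOME" "$HOME/C/Users/$me"
# Now ~/C/Users/<name>/Desktop resolves to the real ~/Desktop.
```

A real overlay would also need backslash and case-insensitive path translation in the file manager, which is presumably why nobody has bothered.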

reply
account42
1 day ago
[-]
Surely you mean C:\Documents and Settings\afidel\Desktop

Or maybe it's not actually a real problem for users if these paths change.

reply
qwertox
1 day ago
[-]
My Windows 11 C:\ no longer has a `Documents and Settings`. It still works, though. Windows 8 has it as a hidden junction targeting C:\Users, so it was already deprecated there.
reply
mg794613
1 day ago
[-]
That's just the Chicago95 theme I've had for years now on my XFCE. I think it's a bit lame to steal that theme (not from Microsoft, but from the maker of that Xfce theme), put it in a repository, give it a different name, and pass it off as your own work.

The repo literally adds nothing, just a name change.

reply
zild3d
1 day ago
[-]
> a bit lame to steal that theme

Do you mean reuse/copy/redistribute? Which is what the GPL-3.0+/MIT license that the theme uses is meant for?

> Based on Fedora Atomic Xfce with the Chicago95 theme.

The readme has a direct link out to it in the 2nd sentence, that's pretty clear credit

reply
bobbyraduloff
1 day ago
[-]
I think the main difference is that it’s a fully set up installable ISO of Fedora Atomic XFCE with the theme and other tweaks preloaded which is a convenient thing to have, I guess. Also they do credit Chicago95 pretty much at the top of the README.
reply
0dayz
1 day ago
[-]
While neat, I would love to see a desktop with a similar design to Windows XP/2000, classic Mac OS, or Mac OS X Cheetah.

Turns out they do have Windows XP: https://github.com/winblues/bluexp

reply
CapsAdmin
1 day ago
[-]
I've done something like this with the Windows 2000 look from time to time, but something I found frustrating when theming xfwm4 window decorations is that there's no way to create a horizontal gradient across the top of the window like in Windows 2000.

I believe this is the reason you cannot find a proper Windows 2000 theme for Xfce.

reply
trts
1 day ago
[-]
anyone remember XPDE? I'm not sure it was ever finished / packaged in any major repo but came across someone doing a walkthrough of it the other week and it looked pretty complete.

https://www.youtube.com/watch?v=xFKx8nCl1Vw

It is hard for me to distinguish between the functional simplicity of desktop computing in that era and the overall excitement that the explosion of connectivity brought to the world. The internet was a lot of fun and had so many surprising corners. Practically all of it was personal, niche, or experimental content for a while.

I wonder if Windows 9X was really all that exceptional, or if it was just what people remember driving with as they navigated the new world.

The best modern equivalent to that desktop paradigm I've found is LXQt, although when I use it I find I kind of miss some of the accouterments of the modern desktops.

reply
herrherrmann
2 days ago
[-]
This looks great with some apps that have matching themes, but I wonder if it quickly falls apart once you rely on apps with very non-consistent UIs (audio/video software, Discord, Spotify, Slack, and basically all other Electron-based apps). Although I guess there might be some matching CSS injection hacks available for the Electron ones?
reply
overgard
1 day ago
[-]
Neat! It's probably just nostalgia, but I still don't think any modern desktop has been as good as Windows 2000 was. Perfect blend of minimal without hiding things (well, excluding the Office 2000 menu hiding disaster which we will conveniently ignore)
reply
qalmakka
1 day ago
[-]
This is awesome. IMHO the Windows 2000 UI was the pinnacle of UX, it's still unmatched to this day. I'd love to see the SerenityOS UI ported on Linux+Wayland someday.
reply
acyou
1 day ago
[-]
What are the benefits/drawbacks of using this vs. actually running Windows 95 or XP?

I'm assuming the PC will be mostly used for "educational software" (games), which you would want to run on XP. What benefit is there to running Fedora?

reply
ha1zum
1 day ago
[-]
Fedora is a modern Linux desktop distribution with active development and support for modern software, while using an old OS such as Windows XP is a huge security risk with very limited ability to run modern software. Even major browsers like Firefox and Chrome won't run on XP anymore.
reply
Gormo
1 day ago
[-]
Well, this is a modern OS capable of running modern software, for starters.

It's just Linux with a Win95-lookalike theme.

reply
mixmastamyk
1 day ago
[-]
95 and XP are different in terms of software selection.

Linux Benefits: Security patches (safety), software updates (convenience), hardware support, freedom, and privacy. Not to mention a modern browser, terminal, TLS, and filesystems like exfat.

Fedora runs Wine as well. Maybe not every 95/XP program will run, but I'd guess a lot of them do. You could also run the others in VirtualBox when needed.

reply
acyou
1 day ago
[-]
Ok, but you don't need any of that if you're just running 1990s/2000s Windows educational software on an air gapped OS with no Internet access, which is what you would want for this application.

Why would you want to mess around trying to get programs to half work on Wine when you can just have the real thing?

If you want your kid to have a web browser for educational purposes, I feel like you probably might as well just hand them the iPad.

reply
mixmastamyk
17 hours ago
[-]
Why tie up a whole machine/desk-space for something so niche? Probably a big 90s/2k sized desktop at that.
reply
forgotmypw17
18 hours ago
[-]
I can testify that Xfce+Chicago95 is great as a daily driver, and one of the most consistent GUIs across different distros (some differences in defaults still exist)
reply
flas9sd
1 day ago
[-]
I've seen the Win32 aesthetic used for higher-order reasons: separating "work mode" from playtime. A separate user account on the same machine to do so helps.

Connect the dots by reading https://www.marginalia.nu/log/99_context/ and watching the UI change between old and new screencasts: https://www.youtube.com/@ViktorLofgren/videos

reply
fc417fc802
1 day ago
[-]
> I get significantly more work done when I unplug my computer from the Internet.

I tried this, but documentation is often a huge problem. Increasing amounts of it are primarily online and not particularly straightforward to mirror locally.

reply
flas9sd
17 hours ago
[-]
You could try Zeal; it was a reason to read more of each project's own documentation pages. There are converters from whatever doc/SSG framework a project uses to Docset, the packaged format Zeal relies on. That said, how approachable the docs are to different levels of experience is another matter.

I learned heaps from treating a REPL as an (offline) escape room, in terms of how to get inline help, variable introspection and debugging tricks. Not every language offers a convenient one though.
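For Python specifically, those escape-room tricks are mostly built in and work entirely offline; a quick sketch:

```shell
# Python ships its documentation locally; none of this needs a network.
python3 -m pydoc str       # full help text for the built-in str type
python3 -m pydoc -k http   # keyword search across installed modules

# Inside the REPL the equivalents are help(obj), dir(obj), vars(obj),
# and breakpoint() to drop into the pdb debugger.
python3 -c 'print([m for m in dir(str) if "case" in m])'
```

Other languages vary: `man` pages, `:help` in vim, `?` in an R or IPython prompt all fill the same role.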

reply
rbanffy
2 days ago
[-]
How cute… imagine if my childhood home had had a computer with a graphical desktop…

I feel so old now…

reply
cmrdporcupine
2 days ago
[-]
Yeah as far as GUIs go, Win95 isn't in the "nostalgia" category for me, I was already well into adulthood.

I kind of get the appeal, but it's also unnecessarily skeuomorphic/fake-3D, and there were some UX choices that made little sense, especially lumping all the window controls together (including the destructive "close" X) where MacOS smartly separated them.

reply
abraxas
2 days ago
[-]
The fake 3D is actually very useful in communicating what is a button or other interactive element and what's not. The modern clean UIs, where everything is a thin rectangle or just text you're supposed to click, are a nightmare.
reply
alabastervlog
2 days ago
[-]
Old screen caps of UIs with depth feel so relaxing to look at, and I don’t think it’s just a nostalgia effect.

It’s like there’s always just a little extra brain power and attention being used by modern flat UIs, and you get to shut that off when you look at a depth-enhanced UI.

reply
rbanffy
2 days ago
[-]
The most important part is that controls are consistent across applications. In that regard, tools and libraries that implement some look and feel rather than deferring it to the underlying environment are a disservice to users.

Windows, for instance, has dozens of ways to do that, and you can find parts of Windows that use an archeological version of the controls. Nobody, it seems, bothered to reimplement the older APIs on top of the new ones.

reply
rbanffy
2 days ago
[-]
I agree. Early Macs had to give buttons a distinctive 2D look. A good thing was that the look and feel were part of the OS and not the application, so everything would be consistent.
reply
mrweasel
2 days ago
[-]
> where MacOS smartly separated them.

Interesting that modern macOS now have them next to each other, like Windows.

You'd be hard pressed to call the Windows 95 UI pretty, but it is really functional. I'm still a firm believer that the majority of the work we do with computers today could be done within the Windows 95 shell. We need 64-bit, more memory, faster CPUs, GPUs, all that, but the modern UIs aren't really "better"; if anything, many of them are more confusing. I think a lot of office workers would have been happy to just keep the Windows 95-era UI for Windows and Office.

reply
Gormo
1 day ago
[-]
> I'm still a firm believer that the majority of the work we do with computers today could be done within the Windows 95 shell.

Wasn't that one of the ideas behind SerenityOS?

reply
rbanffy
2 days ago
[-]
> You'd be hard pressed to call the Windows 95 UI pretty, but it is really functional.

Ironically, the Windows 95 look seems a lot like a copy of the NeXT look, which is the OS all modern Macs are kind of running.

reply
cmrdporcupine
2 days ago
[-]
Yeah, frankly I'd take the NeXT UI over any of them, including Mac OS X, which felt like a huge step backwards compared to NeXTSTEP.

EDIT: Sun's OpenLook is the other one from that era that was fantastic

reply
chuckadams
1 day ago
[-]
The window decorations in Win95 are in fact pixel-for-pixel copies of the ones in NeXTSTEP.
reply
cmrdporcupine
1 day ago
[-]
Note that NeXTSTEP actually did the right thing with window controls: close is on the opposite side of the window from the others, so you can't hit it accidentally.

Windows 3 and Motif hid this stuff under a menu, so it wasn't a huge concern.

But then Windows 95, and then (oddly) MacOS, threw this away in favour of lumping them all together.

Awareness of spatial patterns and frequency of use seems to have been higher among early UX/UI designers than among later ones. I guess maybe because mice became more accurate?

reply
fallsoffbikes
2 days ago
[-]
Though Apple has forgotten or ignored this in products like the Apple TV, where restart and factory reset are right next to each other.
reply
pwython
2 days ago
[-]
Especially when you're using that tiny trackpad remote. Overall, Apple TV works, but most of the apps' UX sucks, and that's not Apple's fault.
reply
spacedcowboy
2 days ago
[-]
Yeah. My first computer, I soldered the chips/resistors/capacitors/etc. to the PCB myself... It had 1K of RAM, and the screen memory had to come out of that too...

Someone wrote chess for it.

[aside] It was a Sinclair ZX-81, and I was 11 at the time. My parents bought the kit and a second-hand black & white TV with a dial-tuner (no pushbuttons to change the channel) as an Xmas present ...

I loved the TV, it was my TV when we only had one other in the house. I watched everything on that TV (even snooker and swore I could tell which ball was which)... After a couple of months, my dad started to get annoyed I'd not bothered to build the computer, so I was dispatched to the shed to build it.

A few days later (hey, I was in school), the thing worked and I was working my way through the (rather excellent) manual that came with it, getting to know it. One of the logic chapters had an example:

[P]RINT 1+1=2

(It was tokenised input, so you just pressed P and PRINT would come up in the built-in BASIC). Anyone here can see that the answer would be logical-true because 1+1 does equal 2, and indeed the computer printed "1" on the next line.

Anyway, flush with this futuristic knowledge, I set it all up using the family TV in the lounge, and we went through the same thing, just to prove to everyone that it worked...

[P]RINT 1+1=2

1

"I knew it. You've buggered it", said my dad in disgust as he got up and walked out the room. I tried to explain the (new to me) concept of logical truth to him and how the computer represented it, but I don't think he ever really believed me...

[/aside]

Anyway, that Sinclair ZX81 fundamentally changed my life. Computers and computing opened up a whole new world. Some 45 years later I'm about to retire from Apple as one of their most senior engineers, having been here for the last 20 years. Anyone with any Apple device is running some of the software I've written over the years which is kind of cool, but it's time to bow out.

reply
jemurray
2 days ago
[-]
My childhood home would need DOS. Maybe deskview for multitasking. :)
reply
xtracto
2 days ago
[-]
DOSSHELL, ha! Or XTree Gold. Great times.
reply
esafak
2 days ago
[-]
DESQview with a Q for Quarterdeck :)
reply
skissane
1 day ago
[-]
A Windows 3.x UI theme would get the childhood nostalgia going much stronger for me than a 95/NT4/98/2000/Me theme does.

Occasionally I boot up Windows 3.1 in a VM and play a game of Solitaire, for old time’s sake - I can run Windows 95 in a VM too, but it just doesn’t have the same pull.

reply
Gormo
1 day ago
[-]
Win 3.1 works great in DOSBox, and you can get an even more authentic '80s/'90s IBM-compatible experience using PCem or 86Box, which emulate a wide variety of PC motherboards, graphics cards, and sound cards.
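A sketch of the kind of dosbox.conf that boots a Win3.1 install (the mount path and cycle count below are placeholders to tune per machine):

```ini
[cpu]
# Fixed cycles approximate a fast 486; cycles=auto also works for most titles
cycles=fixed 20000

[autoexec]
# Mount a host directory as drive C: and start Windows 3.1
mount c ~/dosroot
c:
cd \windows
win
```

You'd first install Windows 3.1 into that mounted directory from its setup disks the same way, then boot straight into it with `dosbox -conf win31.conf`.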
reply
geor9e
1 day ago
[-]
I remember a couple of kids with Windows 95 PCs at home. They seemed like such Richie Riches. We'd all play Wolfenstein when we slept over at their houses. My childhood computer was a WebTV, hacked to get dial-up internet for free, on a 100 lb CRT TV from Goodwill. I finally scraped up the money for an actual PC sometime in high school.
reply
brundolf
1 day ago
[-]
Based on the description, I was hoping it would include some classic DOS and Windows games to play via Wine :)
reply
jwitchel
1 day ago
[-]
Usually the observation that "a widely used thing is objectively bad" is a strong market signal for entrepreneurial opportunity in a big TAM.

I, for one, would welcome a set of deeply integrated UI improvements on the Mac: a better file manager, better window management, better desktop search, a contact manager that just worked, a messaging client that just worked, audio and camera controls that just worked, a calculator that didn't suck, etc.

I’d pay at least $100 a year for that tool set.

reply
rauli_
1 day ago
[-]
That Git GUI got my attention. Does anyone have an idea what it might be? The titlebar says "aurora", but when I searched for it all I got was a commercial product with AI nonsense stuck into it.
reply
hexmiles
1 day ago
[-]
From the look of it, I suspect it is gitg, a Git frontend for GNOME.

https://wiki.gnome.org/Apps/Gitg

reply
priteau
1 day ago
[-]
Looks like it. The aurora title comes from the Git repository they are viewing: https://github.com/ublue-os/aurora
reply
grimgrin
1 day ago
[-]
Well, this is simply adorable, down to the Neocities-hosted website.

This is already my stack, though I need to move over to the immutable variety. I'm on Fedora 41 i3-spin, using Chicago95, and have been for years, with the Plymouth bit that gives you a proper Win95 startup background during boot:

https://wiki.archlinux.org/title/Plymouth

reply
DiggyJohnson
1 day ago
[-]
Sweet! I have a very similar desktop that I’ve shoehorned support for i3 into as well. Will do likewise with this.
reply
karunamurti
1 day ago
[-]
Nice. I use KDE + Kvantum + the Expose theme and various Windows XP themes myself for the Windows XP experience.
reply
keyle
1 day ago
[-]
How messed up is the current state of play that half the population yearns for the late 90s?
reply
zeroq
1 day ago
[-]
This really resonates with me.

Every time I try desktop linux it feels like it's 30 years behind in usability.

reply
fc417fc802
1 day ago
[-]
That surprises me. The major DEs (KDE and GNOME) have felt reasonably polished to me for quite a few years now, they haven't felt buggy for even longer, and they've always felt like they cater to power users, unlike Windows and Mac, which actively get in your way when you try to customize things.

Several times they've been ahead of the curve and I've been surprised to learn that Windows or Mac were only just getting a feature that I'd had access to for years or more.

Of course at present I only use i3-alikes. I'd characterize those as a return to the sanity of the past.

reply
hattmall
1 day ago
[-]
What about the old-school Packard Bell OS, with Kidspace and other "areas"?
reply
nostrademons
2 days ago
[-]
My childhood computer was a Mac. Anyone have a theme that emulates System 6.0.8?
reply
theandrewbailey
1 day ago
[-]
reply
aantix
1 day ago
[-]
This looks wonderful to use. Is there video playback?

I’ve been thinking lots about the YouTube algorithm, how it’s based on engagement, not educational value, and how it’s such a huge missed opportunity for our kids.

So I’ve been building my own YouTube exploration app, for my own kids.

Email me if you’re interested in testing.

jim.jones1@gmail.com

reply
guest__user
1 day ago
[-]
If we could get TabWorks, I would love that.
reply
anthk
2 days ago
[-]
It needs the BSOD screensaver from XScreenSaver, for sure. And maybe DOSBox-X or dosemu2.

Also:

- Pan

- Sylpheed

- Audacious with the WinAMP theme

- HexChat kinda has an mIRC vibe

- Parole looks like WMP from the pre-v9 releases

- You can't simulate a dialer, but with trickle you can mimic a 56k/ISDN connection pretty well

- SyncTERM for BBS's

- ScummVM, with just a bilinear filter, because I played tons of adventures

- There's an SDL2 reimplementation of Space Cadet Pinball on GitHub.

- Trigger Rally would look like a great shareware game

- Pidgin, hands down. Either you were an AOL user in America or an MSN user in Europe. It had emoticons, not emojis. Add that annoying notification theme with a sound and that would be the very late 90s/early 00s (my early teen years)
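On the trickle point, a sketch of period-plausible rates (trickle's -d/-u take KB/s, so the numbers below are my rough conversions, not canon):

```shell
# trickle shapes per-process bandwidth via an LD_PRELOAD shim.
# 56 kbit/s down is about 7 KB/s; V.90 uploads were ~33.6 kbit/s, about 4 KB/s.
trickle -d 7 -u 4 firefox
```

It only works on dynamically linked programs that do their own socket I/O, but a browser qualifies, and pages crawl in exactly the way you remember.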

reply
brandon272
2 days ago
[-]
Emoticons! I completely forgot about that term.
reply
esafak
2 days ago
[-]
Trillian > Pidgin. What's Pan?
reply
anthk
1 day ago
[-]
A newsreader (NNTP). Of course, geeks would use slrn, but we are mimicking Windows.
reply
interludead
1 day ago
[-]
I love that it's not trying to be a 1:1 clone of Windows 95, but more of a vibe-match
reply
eldjee25
1 day ago
[-]
Sounds good
reply
deadbabe
1 day ago
[-]
This looks neat, but the problem with all these nostalgia-fueled projects is that if you use one seriously you're basically LARPing a kinder past. Eventually it just leaves an empty feeling, and you long for the days when interfaces didn't look like this for fun; they looked like this because that's simply what they were, the state of the art. But you can never go back to those days.
reply
calvinmorrison
1 day ago
[-]
If you're interested in golden-era UI computing, check out the Trinity Desktop Environment, which is a fork of KDE 3.5.
reply
ranger_danger
2 days ago
[-]
Why are recreations of old UIs always differently wrong? Is pixel-perfection too much to ask for?
reply
rikthevik
2 days ago
[-]
Well, it is difficult and a lot of work.

Maybe _you_ can help.

reply
ranger_danger
2 days ago
[-]
I have made pixel-perfect renditions before, but I would be worried to release them for fear of legal problems.
reply
shrx
2 days ago
[-]
I guess you self-answered your question then.
reply
wvbdmp
1 day ago
[-]
Not really, because these guys don't seem to be afraid to use the Windows logo, so why not use historically accurate fonts, icons, etc.?
reply
esafak
2 days ago
[-]
I'm trying to forget that horror show and you want to put it under a spotlight! If I never see a BSOD or the Windows registry again, it'll be too soon.
reply
stavros
2 days ago
[-]
Well, at least buttons looked like buttons.
reply