An Ode to the Return of Wysiwyg
27 points | 3 days ago | 9 comments | jeffverkoeyen.com
pimlottc
9 hours ago
WYSIWYG is a concept that pre-dates the web, and what this article is talking about is not the same thing. WYSIWYG was coined to describe word processing and desktop publishing software where the appearance of your text on screen matched the final printed output: the same fonts, weights, sizes, styles, etc. That's it.

It's something we mostly take for granted today but was a real advancement over earlier, often text-based, programs that used simple text effects like highlighting or different colors to represent visual effects that were only fully realized when you printed your document.

It has nothing to do with being able to view source, or copy other designs, or any of that.

layer8
8 hours ago
The term was later also extended to things like visual GUI builders, where the appearance in the editing interface matches the appearance of the final GUI (e.g. the Visual Basic form editor). This specific WYSIWYG variation mostly hasn't returned, unfortunately.
skissane
8 hours ago
> It's something we mostly take for granted today but was a real advancement over earlier, often text-based, programs that used simple text effects like highlighting or different colors to represent visual effects that were only fully realized when you printed your document.

I am tasked with maintaining documentation in Confluence and Notion, and I wasn't enjoying it. Then I built a system with bidirectional sync between the two of them and a Git repo full of Markdown documents, and now I find the task much more pleasant.
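
For the curious, here is a minimal sketch of what one direction of such a sync might look like, assuming Confluence Cloud's REST API plus the third-party requests and markdown packages; the base URL, page ID, credentials, and file path are all placeholder values, and the Notion side and conflict handling are left out:

    # Push one Markdown file from the Git repo to an existing Confluence page.
    import requests                    # pip install requests
    from markdown import markdown      # pip install markdown

    BASE_URL = "https://example.atlassian.net/wiki"   # placeholder
    PAGE_ID = "123456"                                # hypothetical page ID
    AUTH = ("user@example.com", "api-token")          # placeholder credentials

    def push(path):
        with open(path, encoding="utf-8") as f:
            html = markdown(f.read())  # Markdown -> HTML for the storage format
        # Confluence requires the next version number on every update,
        # so read the current version first.
        page = requests.get(f"{BASE_URL}/rest/api/content/{PAGE_ID}",
                            params={"expand": "version"}, auth=AUTH).json()
        requests.put(f"{BASE_URL}/rest/api/content/{PAGE_ID}", auth=AUTH, json={
            "id": PAGE_ID,
            "type": "page",
            "title": page["title"],
            "version": {"number": page["version"]["number"] + 1},
            "body": {"storage": {"value": html, "representation": "storage"}},
        }).raise_for_status()

    push("docs/getting-started.md")    # hypothetical path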

WalterBright
7 hours ago
WYSIWYG came about when displays became bit-mapped graphics with sufficient dots per inch.

Previously, displays used a character generator ROM chip that mapped each character code to a fixed glyph bitmap. For a terminal I designed and built in those days, I used an off-the-shelf character generator chip with a 5x7 font.

The original IBM PC used a character generator.
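
To make the mechanism concrete: a character generator is essentially a lookup table from character code and scanline row to pixel bits. A toy sketch in Python, with a made-up 5x7 glyph for "A" only (a real ROM held the whole character set):

    # Each glyph row holds 5 pixel bits; the real ROM was indexed by
    # character code and scanline row to drive the display one row at a time.
    FONT_5X7 = {
        "A": [0b01110, 0b10001, 0b10001, 0b11111, 0b10001, 0b10001, 0b10001],
    }

    for row_bits in FONT_5X7["A"]:
        # Scan out the 5 columns of this row, most significant bit first.
        print("".join("#" if row_bits >> col & 1 else "." for col in range(4, -1, -1)))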

anonymous908213
10 hours ago
Aside from the LLM writing vibes, or perhaps because it was written by an LLM, I think this article has very little tether to reality.

> It’s bringing back something we collectively gave away in the 2010’s when the algorithmic feed psycho-optimized its way into our lives: being weird.

It's really not. Prompting an LLM for a website is the exact opposite of being weird. It spits out something bland that follows corporate design fads and which contains no individuality. If you want to see weird websites, people are still making those by hand; the recently posted webtiles[1] is a nice way to browse a tiny slice of the human internet, with all its weirdness and chaotic individuality.

[1] https://webtiles.kicya.net/

hackyhacky
9 hours ago
> It's really not. Prompting an LLM for a website is the exact opposite of being weird. It spits out something bland that follows corporate design fads and which contains no individuality. If you want to see weird websites,

I see your point, but I disagree. You consider part of the "weirdness" to be how it's done; and yes, it is indeed "weird" to learn several languages, consisting mostly of punctuation, in order to create an online self-promotion. But I think for most people, the "weirdness" (or its absence) is to be found in the end result. To that end, if a person wants a personal web page with animated tentacles around the edges, flying bananas in the background, and pictures of demonic teddy bears, that is something an AI can easily do when asked.

bigbuppo
9 hours ago
Back in the bad old days, people created websites because they had no choice in the matter. You simply had to do that to share anything with the rest of the world. Most of the tools we had back then still exist. The barrier to entry has never been lower, and those who are motivated to tinker do just that. But going through history: once mainstream blogging became a thing, and then social media conquered all, the motivation to share with others became monetized, as did the methods of sharing. AI isn't going to fix that. On the flip side, those same monsters that destroyed the world we knew through monetizing everything are the ones now spending trillions of dollars on AI.
WalterBright
7 hours ago
> those same monsters that destroyed the world we knew through monetizing everything

That's why we get to use Google for free.

I use a ton of excellent free software.

Terr_
6 hours ago
"You're not the customer, you're the product being sold."

It's all very "free" until ICE rams my car and drags me away because someone sold them the geolocation and facial-recognition data being automatically collected by that "excellent" software.

OK, sure, that's a dramatic example, but the same principle holds for plenty of other scenarios involving employers, insurance rates, etc.

WalterBright
5 hours ago
The government already has your facial features recorded and databased (from your passport photo, your driver's license photo, and whenever you board an airplane). License plate readers are being installed all over town by the government.
Terr_
5 hours ago
I'm not sure what you mean here: Are you agreeing and layering on other depressing considerations, or are you downplaying that kind of privacy-break as having no effect?

It does matter. Imagine if Anne Frank (or Anna Franco) is hiding in my attic, and then I or a guest accidentally takes a picture, perhaps without disabling the internet connection.

There's also the social graph it allows someone to construct:

https://kieranhealy.org/blog/archives/2013/06/09/using-metad...

what
4 hours ago
Are you an illegal alien? Or why are you concerned about ICE dragging you away?
mikestew
3 hours ago
Because ICE has been known to drag away U.S. citizens? C'mon, man, pick up a newspaper.
WalterBright
2 hours ago
Our justice system has been known to convict innocent people, too.
niko_dex
10 hours ago
This reads like a love letter to our collective youth. I like the perspective! It's interesting too, because I feel a lot of programmer types might see WYSIWYG and AI both as stepping stones towards a more disciplined approach to engineering.
dtgriscom
9 hours ago
I've always wanted a DWIMNWIS code editor: "Do What I Mean, Not What I Say". These days it's likely that AI at least tries to provide this.
bluedino
9 hours ago
No mention of Geocities?!
mempko
7 hours ago
Read it again.
wahern
3 hours ago
You mean, rewrite the prompt: "Please summarize the article again, but this time identify and explain any references to Geocities".

P.S. I don't mean to assume the previous commenter used ML to summarize, but it just occurred to me that some people probably do, and a summary will naturally drop details like that; missing a reference this way is probably more common than missing it the classic way. At the same time, those people may still consider themselves to have read the article.

sph
4 hours ago
Any article on the front page these days takes a random topic, then, however unrelated or tenuous the connection, turns it into a pitch for AI.

I’m so tired of this bait-and-switch /rant

kylehotchkiss
10 hours ago
> The barrier to entry is lower than it’s ever been.

I don't see a web full of projects created by people who aren't technical. A substantial number of young people grew up on phones and iPads and may not understand filesystems well enough to even imagine creating things like this. So the power exists, but the people taking best advantage of it seem, to me, to be the same people who were building things before LLMs came along.

hackyhacky
9 hours ago
> I don't see a web full of projects created by people who aren't technical.

Sure, but this is very new technology. It will take some time for the idea of building software easily to seep into the public consciousness. In that time, AI will get better and the barrier to entry will get even lower.

For comparison, the internet has been around in some form since the 1960s (more or less, depending on which technology you consider its beginning), but it took until the late 1990s or even early 2000s before most people were aware of it, and longer still before it became central to their lives. I would expect the development of AI-coding-for-the-masses to happen much faster, but not instantaneously.

bigbuppo
7 hours ago
So the internet is newer than AI?
hackyhacky
6 hours ago
The internet is older than LLMs.
dismalaf
9 hours ago
What's a "project"? How about a Shopify store? A Substack or WordPress site?
airstrike
7 hours ago
This is slop.
add-sub-mul-div
6 hours ago
We're becoming NPCs who publish what the AI writes about the AI creating web sites.