It's something we mostly take for granted today, but it was a real advance over earlier, often text-based programs, which used simple on-screen effects like highlighting or different colors to stand in for formatting that was only fully realized when you printed your document.
It has nothing to do with being able to view source, or copy other designs, or any of that.
I'm tasked with maintaining documentation in Confluence and Notion, and I wasn't enjoying it. Then I built a system with bidirectional sync between the two of them and a Git repo full of Markdown documents, and now I find the task much more pleasant.
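For anyone curious what that kind of setup can look like, here is a minimal sketch of one sync pass, treating the Markdown files in Git as the canonical copy. The two wiki helpers are hypothetical placeholders (a real version would call the Confluence and Notion REST APIs and convert their page formats to and from Markdown), and the conflict rule is simply "newest edit wins"; this is a sketch of the general shape, not the commenter's actual implementation.

```python
# Sketch of one pass of a Markdown-in-Git <-> wiki sync. The two wiki helpers
# are hypothetical placeholders; a real version would call the Confluence and
# Notion REST APIs and convert their page formats to and from Markdown.
import subprocess
from pathlib import Path

DOCS = Path("docs")  # Markdown files tracked in Git, one file per wiki page


def fetch_remote_pages() -> dict[str, tuple[str, float]]:
    """Return {page_slug: (markdown_body, last_modified_unix_time)}.
    Placeholder for the fetch-and-convert step against the wiki APIs."""
    return {}


def push_remote_page(slug: str, body: str) -> None:
    """Placeholder for writing an updated page back to the wiki."""


def sync_once() -> None:
    for slug, (remote_body, remote_mtime) in fetch_remote_pages().items():
        local = DOCS / f"{slug}.md"
        local_mtime = local.stat().st_mtime if local.exists() else 0.0
        if remote_mtime > local_mtime:
            local.write_text(remote_body)              # newer remote edit wins
        elif local.read_text() != remote_body:
            push_remote_page(slug, local.read_text())  # newer local edit wins
    # Record whatever changed on the Git side so the history stays reviewable.
    subprocess.run(["git", "add", "-A"], cwd=DOCS, check=True)
    subprocess.run(["git", "commit", "--allow-empty", "-m", "sync wikis"],
                   cwd=DOCS, check=True)


if __name__ == "__main__":
    sync_once()
```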
Previously, displays used a character generator ROM chip that mapped each ASCII code to a fixed bitmap glyph. For a terminal I designed and built in those days, I used an off-the-shelf character generator chip with a 5x7 font.
The original IBM PC used a character generator.
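For anyone who never worked with one, the idea is simple: the character code indexes a small table of pixel rows, and the video circuitry shifts each row out as dots on a scan line. A minimal sketch in Python (the glyph data is illustrative only, not any particular chip's actual font):

```python
# A character generator ROM is conceptually just a lookup table: character
# code in, rows of pixels out. The 5x7 glyph below is illustrative, not the
# font of any specific chip.
FONT_5X7 = {
    ord("A"): [0b01110, 0b10001, 0b10001, 0b11111, 0b10001, 0b10001, 0b10001],
}


def render(code: int) -> None:
    """Print the 5x7 glyph for a character code, one row per scan line,
    roughly the way the video hardware shifts pixels out."""
    for row in FONT_5X7.get(code, [0] * 7):
        print("".join("#" if row & (1 << (4 - col)) else "." for col in range(5)))


render(ord("A"))
```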
> It’s bringing back something we collectively gave away in the 2010’s when the algorithmic feed psycho-optimized its way into our lives: being weird.
It's really not. Prompting an LLM for a website is the exact opposite of being weird. It spits out something bland that follows corporate design fads and contains no individuality. If you want to see weird websites, people are still making those by hand; the recently posted webtiles[1] is a nice way to browse a tiny slice of the human internet, with all its weirdness and chaotic individuality.
I see your point, but I disagree. You consider part of the "weirdness" to be how it's done; and yes, it is indeed "weird" to learn several languages, consisting mostly of punctuation, in order to create an online self-promotion. But I think for most people, the "weirdness" (or its absence) is to be found in the end result. To that end, if a person wants a personal web page with animated tentacles around the edges, flying bananas in the background, and pictures of demonic teddy bears, that is something an AI can easily do when asked.
That's why we get to use Google for free.
I use a ton of excellent free software.
It's all very "free" until ICE rams my car and drags me away because someone sold them the geolocation and facial-recognition data being automatically collected by that "excellent" software.
OK, sure, that's a dramatic example, but the same principle holds for plenty of other scenarios involving employers, insurance rates, etc.
It does matter. Imagine if Anne Frank (or Anna Franco) is hiding in my attic, and then I or a guest accidentally take a picture, perhaps without disabling the internet connection.
There's also the social graph it allows someone to construct:
https://kieranhealy.org/blog/archives/2013/06/09/using-metad...
P.S. I don't mean to assume the previous commenter used ML to summarize, but it just occurred to me that some people probably do, and that missing details like that is probably common (more common than missing a reference the old-fashioned way; otherwise it wouldn't be a summary). At the same time, they may still consider themselves to have read the article.
I’m so tired of this bait-and-switch /rant
I don't see a web full of projects created by people who aren't technical. A substantial number of young people grew up on phones and iPads and might not even understand filesystems well enough to have the imagination to create things like this. So the power exists, but the people taking the best advantage of it seem, to me, to be the same people who were already building things before LLMs came along.
Sure, but this is very new technology. It will take some time for the idea of building software easily to seep into the public consciousness. In that time, AI will get better and the barrier to entry will get even lower.
For comparison, the internet has been around in some form since the 1960s (more or less, depending on which technology you consider its beginning), but it took until the late 1990s or even early 2000s before most people were aware of it, and longer still before it became central to their lives. I would expect the development of AI-coding-for-the-masses to happen much faster, but not instantaneously.