Bill Atkinson has died
1524 points
1 day ago
| 64 comments
| daringfireball.net
https://facebook.com/story.php?story_fbid=10238073579963378&...
tdhz77
3 hours ago
[-]
I asked Bill if he thought I could become an engineer even after earning my degree in sociology and political science. I really enjoyed writing software at the time but had no formal training. He laughed, as he always did, and said of course, and that I would be better than most. He saw it as a strength, not a weakness. I will miss him.
reply
vovavili
2 hours ago
[-]
These degrees tend to prepare you quite well for programming, since at top-tier universities they are quite heavy on the use of R and statistical modelling.
reply
pjmorris
14 minutes ago
[-]
I'm willing to bet a cup of coffee that the OP's degree and Atkinson's advice preceded the existence of R. I'm going to excerpt a mid-'80s interview of Butler Lampson from Susan Lammers's book 'Programmers at Work' to illustrate my guess at what Atkinson might've been thinking...

LAMPSON: I used to think that undergraduate computer-science education was bad, and that it should be outlawed. Recently I realized that position isn’t reasonable. An undergraduate degree in computer science is a perfectly respectable professional degree, just like electrical engineering or business administration. But I do think it’s a serious mistake to take an undergraduate degree in computer science if you intend to study it in graduate school.

INTERVIEWER: Why?

LAMPSON: Because most of what you learn won’t have any long-term significance. You won’t learn new ways of using your mind, which does you more good than learning the details of how to write a compiler, which is what you’re likely to get from undergraduate computer science. I think the world would be much better off if all the graduate computer-science departments would get together and agree not to accept anybody with a bachelor’s degree in computer science. Those people should be required to take a remedial year to learn something respectable like mathematics or history, before going on to graduate-level computer science. However, I don’t see that happening.

reply
matthewn
1 day ago
[-]
In an alternate timeline, HyperCard was not allowed to wither and die, but instead continued to mature, embraced the web, and inspired an entire genre of software-creating software. In this timeline, people shape their computing experiences as easily as one might sculpt a piece of clay, creating personal apps that make perfect sense to them and fit like a glove; computing devices actually become (for everyone, not just programmers) the "bicycle for the mind" that Steve Jobs spoke of. I think this is the timeline that Atkinson envisioned, and I wish I lived in it. We've lost a true visionary. Memory eternal!
reply
asnyder
6 hours ago
[-]
His legacy still exists and continues today, even updated to modern sensibilities, made cross-platform, and kept compatible with all your legacy HyperCard stacks!

As far as I remember, the progression was HyperCard -> MetaCard -> Runtime Revolution -> LiveCode.

https://livecode.com

I was a kid when this progression first happened; my older brother Tuviah Snyder (now at Apple) was responsible for many of these updates and changes, first at MetaCard and then at its acquirer Runtime Revolution.

I even wrote some of my first programs as HyperCard-compatible stacks. It was quite fun to see my apps on download.com, back in the day when that meant something :).

I always joked that its verbosity practically required please and thank you, but it was super simple, accessible, and it worked!

How nice that even today one can take their legacy HyperCard stacks and run them on the web, mobile, etc. Or create something new in what was structured vibecoding before vibecoding :).

reply
mort96
6 hours ago
[-]
This seems like something completely different? LiveCode looks like just another toolkit or SDK for developing standalone apps, which might be great for the handful of developers using it, but it certainly doesn't do anything to reshape how users interact with their computers.
reply
asnyder
6 hours ago
[-]
Nope, it's completely the same base. Scroll the homepage and you'll see an example of LiveCode (updated HyperTalk).

You can open your HyperCard stacks, or MetaCard stacks, or Runtime/LiveCode stacks in their IDE and code, edit, etc., much as you would have back in the HyperCard days, but with modern features, updates, and additions.

It's backward compatible with HyperTalk; its current language is an updated HyperTalk (i.e. an updated MetaTalk) that incorporates all that was, but adds new features for today.

Your LiveCode apps can be deployed and run as cross-platform desktop applications (Mac, Win, *nix), mobile applications, and, as far as I remember, web applications with HTML5 deployment (so they say).

Not affiliated with them in any way, just sharing my understanding and memories.

reply
asveikau
1 day ago
[-]
Maybe there's some sense of longing for a similar tool today, but there's no real way of measuring how much of the impact you're describing HyperCard actually had. For example, many of us reading here experienced HyperCard. It planted seeds in our future endeavors.

I remember in elementary school, I had some computer lab classes where the whole class worked in HyperCard on some task. Multiply that by however many classrooms did something like that in the '80s and '90s. That's a lot of brains that can be influenced, and have been.

We can judge it as a success in its own right, even if it never entered the next paradigm or never had quite an equivalent later on.

reply
lambdaone
1 day ago
[-]
HyperCard was undoubtedly the inspiration for Visual Basic, which for quite some time dominated the bespoke UI industry in the same way web frameworks do today.
reply
Stratoscope
22 hours ago
[-]
HyperCard was great, but it wasn't the inspiration for Visual Basic.

I was on the team that built Ruby (no relation to the programming language), which became the "Visual" side of Visual Basic.

Alan Cooper did the initial design of the product, via a prototype he called Tripod.

Alan had an unusual design philosophy at the time. He preferred to not look at any existing products that may have similar goals, so he could "design in a vacuum" from first principles.

I will ask him about it, but I'm almost certain that he never looked at HyperCard.

reply
canucker2016
10 hours ago
[-]
A blog post about Tripod/Ruby/VB history - https://retool.com/visual-basic

  Cooper's solution to this problem didn't click until late 1987, when a friend at Microsoft brought him along on a sales call with an IT manager at Bank of America. The manager explained that he needed Windows to be usable by all of the bank's employees: highly technical systems administrators, semi-technical analysts, and even users entirely unfamiliar with computers, like tellers. Cooper recalls the moment of inspiration:

  In an instant, I perceived the solution to the shell design problem: it would be a shell construction set—a tool where each user would be able to construct exactly the shell that they needed for their unique mix of applications and training. Instead of me telling the users what the ideal shell was, they could design their own, personalized ideal shell.
Thus was born Tripod, Cooper's shell construction kit.
reply
cortesoft
1 day ago
[-]
HyperCard was the foundation of my programming career. I treated the HyperCard Bible like an actual Bible.
reply
leakycap
2 hours ago
[-]
I miss the days of For Dummies, Bibles, and all the rest. If you'd read that thing carefully a few times, you usually knew your stuff. There was a finish line.

Modern continual versioning and constant updates means there is no finish line. No Bible could ever be printed. Ah, nostalgia.

reply
jkestner
1 day ago
[-]
Word. This is the Papert philosophy of constructionism, learning to think by making, that so many of us still carry. I’m still trying to build software-building software. We do live in that timeline; it’s just unevenly distributed.
reply
jostylr
38 minutes ago
[-]
There is hyperscript: https://hyperscript.org which claims descent from HyperCard and certainly embraces the web.

Also, this might happen in a few years if AI improves enough to be trusted to make things by novices. Hard to imagine, but just maybe.

reply
nostrademons
1 day ago
[-]
The Web was significantly influenced by HyperCard. Tim Berners-Lee's original prototypes envisioned it as bidirectional, with a hypertext editor shipping alongside the browser. In that sense it does live on, and serves as the basis for much of the modern Internet.
reply
ebcode
12 hours ago
[-]
IIRC, the mouse pointer turning into a hand when you mouse over something clickable was original to HyperCard. And I think Brendan Eich was under the heavy influence of HyperTalk when he created JavaScript.
reply
jjcob
11 hours ago
[-]
JavaScript felt like it took the best parts of C (concise expressiveness) and the ease of use of HyperTalk (event handlers, easy hierarchical access to objects, etc). It was pretty sweet.
reply
Tabular-Iceberg
11 hours ago
[-]
Wasn't the pointer always a hand in HyperCard?
reply
WillAdams
6 hours ago
[-]
Depended on context, and on what the stack programmer set it to. The possibilities (per Fig. 51-1 in _The Complete HyperCard Handbook_, 2nd edition) were:

- watch

- busy

- hand

- arrow

- iBeam

- cross

- plus

reply
snickerbockers
12 hours ago
[-]
I honestly don't think the modern web is a legitimate hypertext system at this point. It was already bad enough 20 years ago with Flash and server-side CGI, but now most of the major websites just serve JavaScript programs that then fetch data using a dedicated API. And then there are all the paywalls and constant CAPTCHA checks to make sure you aren't training an LLM on their content without a license.

Look up Hyperland, an early-'90s documentary by Douglas Adams and the guy from Doctor Who about the then-future hypermedia revolution. I can remember the web resembling that a long time ago, but the modern web is very far removed from anything remotely resembling hypertext.

reply
WillAdams
6 hours ago
[-]
This is discussed a bit in the book:

https://www.goodreads.com/book/show/192405005-hypermedia-sys...

which maybe argues for a return to early ideas of the web as a successor to Hypercard...

reply
zahlman
1 day ago
[-]
Mr. Atkinson's passing was sad enough without thinking about this.

(More seriously: I can still recall using ResEdit to hack a custom FONT resource into a HyperCard stack, then using string manipulation in a text field to create tiled graphics. This performed much better than button icons or any other approach I could find. And then it stopped working in System 7.)

reply
Arathorn
1 day ago
[-]
It’s ironic that the next graphical programming environment similar to HyperCard was probably Flash - and it obviously died too.

What actually are the best successors now, at least for authoring generic apps for the open web? (Other than vibe coding things)

reply
crucialfelix
7 hours ago
[-]
- Minecraft

- Roblox

- LittleBigPlanet

- Mario Maker

This is what kids do to be creative.

Slightly more serious (and therefore less successful):

- Logo/Turtle Graphics

- Scratch

- HyperStudio

HyperCard was both graphic design and hypertext (links). These two modalities got separated, and I think there are practical reasons for that, because HTML/CSS design actually sucks and never became an amateur art form.

For writing and publishing we got wikis, Obsidian et al., blogs (RIP), forums, and social media. Not meant to be interactive or programmable, but these fulfill people's needs for publishing.

reply
WillAdams
6 hours ago
[-]
Yeah, that sums things up well --- the problem of course is what happens when one works on a project which blurs boundaries.

I had to drop into BlockSCAD to rough out an arc algorithm for my current project:

https://github.com/WillAdams/gcodepreview

(see the subsubsection "Arcs for toolpaths and DXFs")

Jupyter Notebooks come close to allowing a seamless blending of text and algorithm, but they are sorely missing on the graphic design and vector graphics front --- which now that I write that, makes me realize that that is the big thing which I miss when trying to use them. Makes me wish for JuMP, a Jupyter Notebook which incorporates METAPOST --- if it also had an interactive drawing mode, it would be perfect.... (for my needs).

reply
jx47
1 day ago
[-]
I think that would be Decker (https://internet-janitor.itch.io/decker). Not my project, but I found it some time ago when I searched for HyperCard successors. The neat thing is that it works in the browser.
reply
WillAdams
1 day ago
[-]
This gets mentioned pretty much every time HyperCard is --- but I can't see that anyone has done anything with it.

Why use it rather than Livecode (aside from the licensing of the latter) or Hypernext Studio?

reply
RodgerTheGreat
1 day ago
[-]
Some programs, games, and zines made with Decker: https://itch.io/games/tag-decker

Unlike LiveCode (or so far as I am aware HyperNext), Decker is free and open-source: https://github.com/JohnEarnest/Decker

HyperNext doesn't appear to be actively developed; the most recent updates I see are from last year, and it can't be used on modern computers. Decker's most recent release was yesterday morning.

I'd be happy to go into more detail if you like.

reply
WillAdams
22 hours ago
[-]
LiveCode used to be open source, which made me want to use it, but that window closed.

I guess I want a Flash replacement....

reply
leakycap
2 hours ago
[-]
https://ruffle.rs/ recently came to my attention when I needed to resuscitate a tool that had been built entirely in Macromedia products.
reply
jhbadger
23 hours ago
[-]
There's a fair amount of usage of it on itch.io, if you are into that indie crowd. I was skeptical of it at first -- the whole 1-bit dithering aesthetic seems a bit too retro-twee -- but I find it is the best HyperCard-alike in terms of functionality. It "just works", as compared to most HyperCard clones that seem more like a proof of concept than a functional program.
reply
RossBencina
20 hours ago
[-]
Pretty sure the next one after HyperCard was MacroMind (later Macromedia) Director. I recall running an early version of a Director animation on a black-and-white Mac not long after I started playing with HyperCard. Later I was a Director developer. I recall when FutureSplash was released -- the fast-scaling vector graphics were a new and impressive thing. The web browser plugin helped a lot, and it really brought multimedia to the browser. It was only later that Macromedia acquired FutureSplash and renamed it Flash.
reply
jonnytran
4 hours ago
[-]
Have you seen Scrappy? It’s still early, but it’s the most interesting thing I’ve seen in a while.

https://pontus.granstrom.me/scrappy/

reply
DonHopkins
1 day ago
[-]
Flash completely missed the most important point of HyperCard, which was that end users could put it into edit mode, explore the source code, learn from it, extend it, copy parts of it out, and build their own user interfaces with it.

It's not just "View Source", but "Edit Source" with a built-in, easy to use, scriptable, graphical, interactive WYSIWYG editor that anyone can use.

HyperCard did all that and more long before the web existed, was fully scriptable years before JavaScript existed, was extensible with plug-in XCMDs long before COM/OLE/ActiveX or even OpenDoc/CyberDog or Java/HotJava/Applets, and was widely available and embraced by millions of end-users, was used for games, storytelling, art, business, personal productivity, app development, education, publishing, porn, and so much more, way before merely static web page WYSIWYG editors (let alone live interactive scriptable extensible web application editors) ever existed.

LiveCard (HyperCard as a live HTTP web app server back-end via WebStar/MacHTTP) was probably the first tool that made it possible to create live web pages with graphics and forms with an interactive WYSIWYG editor that even kids could use to publish live HyperCard apps, databases, and clickable graphics on the web.

HyperCard deeply inspired HyperLook for NeWS, which was scripted, drawn, and modeled with PostScript, that I used to port SimCity to Unix:

Alan Kay on “Should web browsers have stuck to being document viewers?” and a discussion of Smalltalk, HyperCard, NeWS, and HyperLook

https://donhopkins.medium.com/alan-kay-on-should-web-browser...

>"Apple’s Hypercard was a terrific and highly successful end-user authoring system whose media was scripted, WYSIWYG, and “symmetric” (in the sense that the “reader” could turn around and “author” in the same high-level terms and forms). It should be the start of — and the guide for — the “User Experience” of encountering and dealing with web content.

>"The underlying system for a browser should not be that of an “app” but of an Operating System whose job would be to protectively and safely run encapsulated systems (i.e. “real objects”) gotten from the web. It should be the way that web content could be open-ended, and not tied to functional subsets in the browser." -Alan Kay

>[...] This work is so good — for any time — and especially for its time — that I don’t want to sully it with any criticisms in the same reply that contains this praise.

>I will confess to not knowing about most of this work until your comments here — and this lack of knowledge was a minus in a number of ways wrt some of the work that we did at Viewpoints since ca 2000.

>(Separate reply) My only real regret about this terrific work is that your group missed the significance for personal computing of the design of Hypertalk in Hypercard.

>It’s not even that Hypertalk is the very best possible way to solve the problems and goals it took on — hard to say one way or another — but I think it is the best example ever actually done and given to millions of end users. And by quite a distance.

>Dan Winkler and Bill Atkinson violated a lot of important principles of “good programming language design”, but they achieved the first overall system in which end-users “could see their own faces”, and could do many projects, and learn as they went.

>For many reasons, a second pass at the end-user programming problem — that takes advantage of what was learned from Hypercard and Hypertalk — has never been done (AFAIK). The Etoys system in Squeak Smalltalk in the early 2000s was very successful, but the design was purposely limited to 8–11 year olds (in part because of constraints from working at Disney).

>It’s interesting to contemplate that the follow on system might not have a close resemblance to Hypertalk — perhaps only a vague one ….

SimCity, Cellular Automata, and Happy Tool for HyperLook (nee HyperNeWS (nee GoodNeWS))

https://donhopkins.medium.com/hyperlook-nee-hypernews-nee-go...

>HyperLook was like HyperCard for NeWS, with PostScript graphics and scripting plus networking. Here are three unique and wacky examples that plug together to show what HyperNeWS was all about, and where we could go in the future!

>The Axis of Eval: Code, Graphics, and Data

>Hi Alan! Outside of Sun, at the Turing Institute in Glasgow, Arthur van Hoff developed a NeWS based reimagination of HyperCard in PostScript, first called GoodNeWS, then HyperNeWS, and finally HyperLook. It used PostScript for code, graphics, and data (the axis of eval). [...]

>What’s the Big Deal About HyperCard?

>"I thought HyperCard was quite brilliant in the end-user problems it solved. (It would have been wonderfully better with a deep dynamic language underneath, but I think part of the success of the design is that they didn’t have all the degrees of freedom to worry about, and were just able to concentrate on their end-user’s direct needs.

>"HyperCard is an especially good example of a system that was “finished and smoothed and documented” beautifully. It deserved to be successful. And Apple blew it by not making the design framework the basis of a web browser (as old PARC hands advised in the early 90s …)" -Alan Kay

HyperLook SimCity Demo Transcript

https://donhopkins.medium.com/hyperlook-simcity-demo-transcr...

>[...] All this is written in PostScript, all the graphics. The SimCity engine is in C, but all the user interface and the graphics are in PostScript.

>The neat thing about doing something like this in HyperLook is that HyperLook is kind of like HyperCard, in that all of the user interface is editable. So these windows we’re looking at here are like stacks, that we can edit.

>Now I’ll flip this into edit mode, while the program’s running. That’s a unique thing.

>Now I’m in edit mode, and this reset button here is just a user interface component that I can move around, and I can hit the “Props” key, and get a property sheet on it.

>I’ll show you what it really is. See, every one of these HyperLook objects has a property sheet, and you can define its graphics. I’ll zoom in here. We have this nice PostScript graphics editor, and we could turn it upside down, or sideways, or, you know, like that. Or scale it. I’ll just undo, that’s pretty useful.

https://news.ycombinator.com/item?id=34134403

DonHopkins on Dec 26, 2022 | parent | context | favorite | on: The Psychedelic Inspiration for Hypercard (2018)

Speaking about HyperCard, creating web pages, and publishing live interactive HyperCard stacks on the web, I wrote this about LiveCard:

https://news.ycombinator.com/item?id=22283045

DonHopkins on Feb 9, 2020 | parent | context | favorite | on: HyperCard: What Could Have Been (2002)

Check out this mind-blowing thing called "LiveCard" that somebody made by combining HyperCard with MacHTTP/WebStar (a Mac web server by Chuck Shotton that supported integration with other apps via Apple Events)! It was like implementing interactive graphical CGI scripts with HyperCard, without even programming (but also allowing you to script them in HyperTalk, and publish live HyperCard databases and graphics)! Normal HyperCard stacks would even work without modification. It was far ahead of its time, and inspired me to integrate WebStar with ScriptX to generate static and dynamic HTML web sites and services!

https://news.ycombinator.com/item?id=16226209

MacHTTP / WebStar from StarNine by Chuck Shotton, and LiveCard HyperCard stack publisher:

CGI and AppleScript:

http://www.drdobbs.com/web-development/cgi-and-applescript/1...

>Cal discusses the Macintosh as an Internet platform, then describes how you can use the AppleScript language for writing CGI applications that run on Macintosh servers.

https://news.ycombinator.com/item?id=7865263

MacHTTP / WebStar from StarNine by Chuck Shotton! He was also VP of Engineering at Quarterdeck, another pioneering company.

https://web.archive.org/web/20110705053055/http://www.astron...

http://infomotions.com/musings/tricks/manuscript/0800-machtt...

http://tidbits.com/article/6292

>It had an AppleScript / OSA API that let you write handlers for responding to web hits in other languages that supported AppleScript.

I used it to integrate ScriptX with the web:

http://www.art.net/~hopkins/Don/lang/scriptx/scriptx-www.htm...

https://medium.com/@donhopkins/1995-apple-world-wide-develop...

The coolest thing somebody did with WebStar was to integrate it with HyperCard so you could actually publish live INTERACTIVE HyperCard stacks on the web, that you could see as images you could click on to follow links, and followed by html form elements corresponding to the text fields, radio buttons, checkboxes, drop down menus, scrolling lists, etc in the HyperCard stack that you could use in the browser to interactive with live HyperCard pages!

That was the earliest easiest way that non-programmers and even kids could both not just create graphical web pages, but publish live interactive apps on the web!

Using HyperCard as a CGI application

https://web.archive.org/web/20060205023024/http://aaa-protei...

https://web.archive.org/web/20021013161709/http://pfhyper.co...

http://www.drdobbs.com/web-development/cgi-and-applescript/1...

https://web.archive.org/web/19990208235151/http://www.royals...

What was it actually ever used for? Saving kid's lives, for one thing:

>Livecard has exceeded all expectations and allows me to serve a stack 8 years in the making and previously confined to individual hospitals running Apples. A whole Childrens Hospital and University Department of Child Health should now swing in behind me and this product will become core curriculum for our medical course. Your product will save lives starting early 1997. Well done.

- Director, Emergency Medicine, Mater Childrens Hospital

reply
rezmason
2 hours ago
[-]
You're right. Flash and its legacy would have been better if it had built in "Edit Source".

The earliest Flash projects were these artful assemblages of scripts dangling from nested timelines, like an Alexander Calder mobile. They were at times labyrinthine, like they are in many similar tools, but there were ways to mitigate that. Later on, AS3 code was sometimes written like Java, because we wanted to be taken seriously.

Many Flash community members wanted to share their source, wanted a space where interested people could make changes. We did the best we could, uploading FLA files and zipped project directories. None of it turned out to be especially resilient.

It's one of the things I admire about Scratch. If you want, you can peek inside, and it's all there, for you to learn from and build off of, with virtually no arbitrary barriers in place.

reply
jjcob
8 hours ago
[-]
We kind of had that for a time with FileMaker and MS Access. People could build pretty amazing stuff with those apps, even without being a programmer.

I think the reason those apps never became mainstream is that they didn't have a good solution for sharing data. There were some ways you could use them to access database servers, but setting them up was so difficult that they were for all intents and purposes limited to local, single user programs.

HTML, CSS, PHP and MySQL had a learning curve, but you could easily make multi-user programs with them. That's why the web won.

reply
crucialfelix
7 hours ago
[-]
Yes! I used FileMaker a lot, and built my first journaling system with it, like a cross between HyperCard and a wiki. It really changed my life, and this led to programming.
reply
skeeter2020
6 hours ago
[-]
the stuff I made in Access (and later Excel) looks a lot like the stuff I generate with AI these days!
reply
specialist
7 hours ago
[-]
Yes. They didn't survive the transition from workgroup (shared files on a LAN) to client/server.
reply
garyrob
20 hours ago
[-]
"In an alternate timeline, HyperCard was not allowed to wither and die, but instead continued to mature, embraced the web..."

In yet another alternate timeline, someone thought to add something like URLs with something like GET, PUT, etc. to HyperCard, and Tim Berners-Lee's invention of the web browser never happened because HyperCard already did it all.

reply
jandrese
20 hours ago
[-]
On one hand this would be simply amazing, on the other hand it would have been a total security nightmare that makes early Javascript look like a TPM Secure Enclave.
reply
duskwuff
19 hours ago
[-]
Those who were there will remember:

  on openbackground --merryxmas
    merryxmas "on openbackground --merryxmas"
  end openbackground
(And now I'm curious if this post will trip anyone's antivirus software...)
reply
dan-robertson
1 day ago
[-]
Not sure that sculpting clay is the best analogy. Lots of sculpting is hard, as is turning clay, especially if you want to successfully fire the result. Maybe it is an accurate analogy, but people may understand the difficulty differently.
reply
bombcar
21 hours ago
[-]
HyperCard is more like Lego - you can simply buy completed sets (use others' HyperCard programs) - or you can put together things according to instructions - but you can always take them apart and change them, and eventually build your own.
reply
jchrisa
1 day ago
[-]
I haven't posted it here yet b/c it's not show ready, but we have been building this vision -- I like to think of it as an e-bike for the mind.

https://vibes.diy/

We had a lot of fun last night with Vibecode Karaoke, where you code an app at the same time as you sing a song.

reply
mannyv
1 day ago
[-]
HyperCard must have been a support nightmare.
reply
moffkalast
11 hours ago
[-]
Looking at the HyperTalk syntax [0], it's interesting how we take left-hand variable assignment as a given, while math typically teaches the exact opposite, since you can't really write the answer before you have the question.

It makes you wonder whether lambda expressions would be more consistent with the rest if they were reversed.

[0] https://en.wikipedia.org/wiki/HyperTalk#Fundamental_operatio...
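For a concrete sense of the contrast, here is a minimal sketch; the HyperTalk line follows the `put ... into` form described in the linked article, shown as comments, with ordinary Python for the conventional direction:

```python
# HyperTalk reads like an instruction, with the answer flowing
# rightward into the destination at the end:
#
#   put 2 + 2 into total
#
# Most languages put the destination on the left instead:
total = 2 + 2

# A reversed lambda in the same spirit would read argument-first.
# Conventional (left-hand) Python form:
doubler = lambda x: x * 2
result = doubler(21)
print(total, result)  # → 4 42
```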

reply
DonHopkins
23 hours ago
[-]
https://news.ycombinator.com/item?id=22285675

DonHopkins on Feb 10, 2020 | parent | context | favorite | on: HyperCard: What Could Have Been (2002)

Do you have the first commercial HyperCard stack ever released: the HyperCard SmutStack? Or SmutStack II, the Carnal Knowledge Navigator, both by Chuck Farnham? SmutStack was the first commercial HyperCard product available at rollout, released two weeks before HyperCard went public at a MacWorld Expo, cost $15, and made a lot of money (according to Chuck). SmutStack 2, the Carnal Knowledge Navigator, had every type of sexual adventure you could imagine in it, including information about gays, lesbians, transgendered, HIV, safer sex, etc. Chuck was also the marketing guy for Mac Playmate, which got him on Geraldo, and sued by Playboy.

https://www.zdnet.com/article/could-the-ios-app-be-the-21st-...

>Smut Stack. One of the first commercial stacks available at the launch of HyperCard was Smut Stack, a hilarious collection (if you were in sixth grade) of somewhat naughty images that would make joke, present a popup image, or a fart sound when the viewer clicked on them. The author was Chuck Farnham of Chuck's Weird World fame.

>How did he do it? After all, HyperCard was a major secret down at Cupertino, even at that time before the wall of silence went up around Apple.

>It seems that Farnham was walking around the San Jose flea market in the spring of 1987 and spotted a couple of used Macs for sale. He was told that they were broken. Carting them home, he got them running and discovered several early builds of HyperCard as well as its programming environment. Fooling around with the program, he was able to build the Smut Stack, which sold out at the Boston Macworld Expo, being one of the only commercial stacks available at the show.

https://archive.org/stream/MacWorld_9008_August_1990/MacWorl...

Page 69 of https://archive.org/stream/MacWorld_9008_August_1990

>Farnham's Choice

>This staunch defender was none other than Chuck Farnham, whom readers of this column will remember as the self-appointed gadfly known for rooting around in Apple’s trash cans. One of Farnham’s myriad enterprises is Digital Deviations, whose products include the infamous SmutStack, the Carnal Knowledge Navigator, and the multiple-disk set Sounds of Susan. The last comes in two versions: a $15 disk of generic sex noises and, for $10 more, a personalized version in which the talented Susan moans and groans using your name. I am not making this up.

>Farnham is frank about his participation in the Macintosh smut trade. “The problem with porno is generic,” he says, sounding for the briefest moment like Oliver Wendell Holmes. “When you do it, you have to make a commitment ... say you did it and say it’s yours. Most people would not stand up in front of God and country and say, ‘It’s mine.’ I don’t mind being called Mr. Scum Bag.”

>On the other hand, he admits cheerily, “There’s a huge market for sex stuff.” This despite the lack of true eroticism. “It’s a novelty,” says Farnham. Sort of the software equivalent of those ballpoint pens with the picture of a woman with a disappearing bikini.

https://archive.org/stream/NewComputerExpress110/NewComputer...

Page 18 of https://archive.org/stream/NewComputerExpress110

>“Chuck developed the first commercial stack, the Smutstack, which was released two weeks before HyperCard went public at a MacWorld Expo. He’s embarrassed how much money a silly collection of sounds, cartoons, and scans of naked women brought in. His later version, the Carnal Knowledge Navigator, was also a hit.

I've begged Chuck to dig around to see if he has an old copy of the floppy lying around and upload it, but so far I don't know of a copy online you can run. Its bold pioneering balance of art and sleaze deserves preservation, and the story behind it is hilarious.

Edit: OMG I've just found the Geraldo episode with Chuck online, auspiciously titled "Geraldo: Sex in the 90's. From Computer Porn to Fax Foxes", which shows an example of Smut Stack:

https://visual-icon.com/lionsgate/detail/?id=67563&t=ts

I love the way Chuck holds his smirk throughout the entire interview. And Geraldo's reply to his comment: "I was a fulfillment house for orders."

"That sounds sexual in itself! What was a fulfilment house?"

reply
al_borland
22 hours ago
[-]
I actually had an experience like this yesterday. After reading Gruber talk about how Markdown was never meant for notes, I started to rethink things. I wanted plain text, to be future proof, then stumbled across CotEditor as a means to edit. Inside I was able to use the code highlighting and outline config to define my own regex and effectively create my own markup language with just a dash of regex and nothing more. I then jumped over to Shortcuts and dragged and dropped some stuff together to open/create yearly and daily notes (on either my computer or phone), or append to a log with a quick action.

It is a custom system that didn’t require any code (if you don’t count the very minor bits of regex, just a lot of stuff like… ^\s- .).

Is it a good system? Probably not, but we’ll see where it goes.

reply
kadushka
1 day ago
[-]
inspired an entire genre of software-creating software. In this timeline, people shape their computing experiences as easily as one might sculpt a piece of clay, creating personal apps that make perfect sense to them and fit like a glove

LLMs inspired vibe coding - that’s our timeline.

reply
JKCalhoun
1 day ago
[-]
When I was on the ColorSync team at Apple we, the engineers, got an invite to his place-in-the-woods one day.

I knew who he was at the time, but for some reason I felt I was more or less beholden to conversing only about color-related issues and how they applied to a computer workflow. Now that I've retired, I have been kicking myself for some time not just chatting with him about ... whatever.

He was at the time I met him very into a kind of digital photography. My recollection was that he had a high-end drum scanner and was in fact scanning film negatives (medium format camera?) and then going with a digital workflow from that point on. I remember he was excited about the way that "darks" could be captured (with the scanner?). A straight analog workflow would, according to him, cause the darks to roll off (guessing the film was not the culprit then, perhaps the analog printing process).

He excitedly showed us on his computer photos he took along the Pacific Ocean of large rock outcroppings against the ocean — pointing out the detail that you could see in the shadow of the rocks. He was putting together a coffee table book of his photos at the time.

I have to say that I mused at the time about a wealthy, retired, engineer who throws money at high end photo gear and suddenly thinks they're a photographer. I think I was weighing his "technical" approach to photography vs. a strictly artistic one. Although, having learned more about Ansel Adams's technical chops, perhaps for the best photographers there is overlap.

reply
rezmason
19 hours ago
[-]
> I have been kicking myself for some time not just chatting with him about ... whatever.

Maybe I should show some initiative! See, for a little while now I've wanted to just chat with you about whatever.

At this moment I'm working on a little research project about the advent of color on the Macintosh, specifically the color picker. Would you be interested in a casual convo that touches on that? If so, I can create a BlueSky account and reach out to you over there. :)

https://merveilles.town/deck/@rezmason/114586460712518867

reply
diskzero
18 hours ago
[-]
John is cool, but I don't think he was around when the Macintosh II software and hardware was being designed for color support. I did work with Eric Ringewald at Be and he was one of the Color Quickdraw engineers. He would be fun to talk to. Michael Dhuey worked on the hardware of the Mac II platform. I guess we can give some credit to Jean-Louis Gassée as well. Try to talk to those people! I got to work with a lot of these Apple legends at General Magic, Be, Eazel and then back at Apple again. I never got to work on a project with JKCalhoun directly, but I did walk by his office quite frequently.
reply
JKCalhoun
16 hours ago
[-]
True. I showed up at Apple in '95 after Color Quickdraw was already a thing.

Hilariously though, I did get handed the color pickers to "port" to PowerPC. In fact one of the first times I thought I was in over my head being at Apple was when I was staring at 68030 assembly and thinking, "Fuck, I have to rewrite this in C perhaps."

From your username, I feel like we've chatted before (but I don't know your real name).

reply
rezmason
16 hours ago
[-]
> I never got to work on a project with JKCalhoun directly, but I did walk by his office quite frequently.

Did you ever get hit with a paper airplane as you did? ;)

Thanks for this reply, and if you're who I think you are, thank you for all the good work you did alongside these other folks :D

reply
JKCalhoun
16 hours ago
[-]
We can certainly chat.
reply
throwanem
23 hours ago
[-]
There probably still isn't a good way to get that kind of dynamic range entirely in the digital domain. Oh, I'm sure the shortfall today is smaller, say maybe four or five stops versus probably eight or twelve back then. Nonetheless, I've done enough work in monochrome to recognize an occasional need to work around the same limitations he was, even though very few of my subjects are as demanding.
reply
JKCalhoun
23 hours ago
[-]
I wish a good monochrome digital camera didn't cost a small fortune. And I'm too scared to try to remove the Bayer grid from a "color" CCD.

Seems that, without the color/Bayer thing, you could get an extra stop or two for low-light.

I had a crazy notion to make a camera around an astronomical CCD (often monochrome) but they're not cheap either — at least one with a good pixel count.

reply
fractallyte
14 hours ago
[-]
I've had the same journey, and opted instead for a Sigma Foveon camera.

Comparisons and advantages: https://www.photigy.com/school/sigma-foveon-sensor-review-dp...

For black and white photography, the best high-end camera seemed to be the Leica M Monochrom (https://en.wikipedia.org/wiki/Leica_M_Monochrom), but to my mind, it's trounced by the Foveon:

https://youtu.be/OODMWXX_N7A

https://www.stevehuffphoto.com/2013/01/14/quick-comparison-l...

THIS is the photo that really sold it for me:

https://www.fredmiranda.com/forum/topic/1806915/0&year=2023#...

That's from a modified DP1m, but the SD Quattro H has an easily-removable IR filter and a huge sensor.

reply
throwanem
22 hours ago
[-]
I've replaced my D5300's viewfinder focusing screen a couple of times, back before I outgrew the need for focusing aids. I also wouldn't try debayering its sensor! But that sort of thing is what cheap beater bodies off your friendly local camera store's used counter, or eBay, were made for. Pixel count isn't everything, and how better to find out whether the depth of your interest would reward serious investment, than to see whether and how soon it outgrows unserious? Indeed, my own entire interest in photography has developed just so, out of a simple annoyance at having begun to discover what a 2016 phone camera couldn't do.
reply
JKCalhoun
21 hours ago
[-]
I like that idea. I should start watching eBay.
reply
formerly_proven
20 hours ago
[-]
You would also remove the microlenses, which increase sensitivity.
reply
herodotus
6 hours ago
[-]
Bill showed up at one of the WWDCs (2011?). I sat next to him during a lunch, not knowing who he was! He told me his name, and then showed me some photos he had taken. He seemed to me to be a gentle and kind soul. So sad to read this news.
reply
lanyard-textile
22 hours ago
[-]
:) Color in the computer is a good “whatever” topic.

Sometimes it’s just nice to talk about the progress of humanity. Nothing better than being a part of it, the gears that make the world turn.

reply
JKCalhoun
21 hours ago
[-]
Ha ha, but it's also "talking shop". I'm sure Bill preferred it to talking about his Quickdraw days.
reply
Aloha
21 hours ago
[-]
You always lose something when doing optical printing - you can often gain things too, but it's not 1:1.

I adore this hybrid workflow, because I can pick how the photo will look, color palette, grain, whatever by picking my film, then I can use digital to fix (most if not all of) the inherent limitations in analog film.

Sadly, film is too much of a pain today. Photography has long been about composition for me, not cameras or process - I liked film because I got a consistent result, but I can use digital too, and I do today.

reply
hugs
20 hours ago
[-]
"When art critics get together they talk about form and structure and meaning. When artists get together they talk about where you can buy cheap turpentine."
reply
tejtm
16 hours ago
[-]
-- Picasso
reply
sneak
17 hours ago
[-]
> I have to say that I mused at the time about a wealthy, retired, engineer who throws money at high end photo gear and suddenly thinks they're a photographer.

Duchamp would like a word.

Seriously though, as someone this describes to a T (though “suddenly” in this case is about 19 years), I was afraid to call myself any sort of artist for well over a decade, thinking I was just acquiring signal with high end gear. I didn’t want to try to present myself as something I’m not. After all, I just push the button, the camera does all the work.

I now have come to realize that this attitude is toxic and unnecessary. Art (even bad art!) doesn’t need more gatekeeping or gatekeepers.

I am a visual artist. A visual artist with perhaps better equipment than my skill level or talent justifies, but a visual artist nonetheless.

reply
gxs
1 day ago
[-]
> I have to say that I mused at the time about a wealthy, retired, engineer who throws money at high end photo gear and suddenly thinks they're a photographer

I think this says more about you than it does about him

reply
dang
1 day ago
[-]
Please don't cross into personal attack. The cost outweighs any benefit.

https://news.ycombinator.com/newsguidelines.html

reply
gxs
22 hours ago
[-]
Ugh I hate that you’re almost always right

I was about to argue but then I saw this part

> The cost outweighs any benefit.

And this is absolutely true - there is a benefit but it doesn’t mean it’s worth it

Either way my bad, I should have elaborated and been more gentle instead of just that quip

reply
viccis
1 day ago
[-]
It's true though. This effect is what keeps companies like PRS in business.
reply
bombcar
1 day ago
[-]
There’s a whole industry of prosumer stuff in … well, many industries.

Power tools definitely have it!

reply
JKCalhoun
23 hours ago
[-]
I don't deny that. That's probably true about a lot of observations.
reply
spiralcoaster
1 day ago
[-]
This is absolutely true and I don't understand why you're being downvoted. Especially in the context of this man just recently dying, there's someone throwing in their elitist opinion about photographers and how photography SHOULD be done, and apparently Bill was doing it wrong.
reply
JKCalhoun
23 hours ago
[-]
Well, I certainly didn't mean for it to come across that way. I wasn't saying this was the case with Bill. To be clear, I saw nothing bad about Bill's photos. (Also I'm not really versed enough in professional photography to have a valid opinion even if I didn't like them and so would not have publicly weighed in on them anyway.)

I was though being honest about how I felt at that time — debated whether to keep it to myself or not today (but I always foolishly err on the side of being forthcoming).

Perhaps it's a strange thing to imagine that someone would pursue in their spare time, especially after retiring, what they did professionally.

reply
brulard
22 hours ago
[-]
He said "at the time". If I say "I thought X at the time" it implies I have reconsidered since. Your parent's comment was unnecessarily condescending.
reply
gxs
21 hours ago
[-]
It’s just the timing and how he said it, especially considering the tone of the message overall

But the irony isn’t lost on me that I myself shouldn’t have been so mean about it

reply
JKCalhoun
20 hours ago
[-]
You're right about the timing.
reply
dkislyuk
1 day ago
[-]
From Walter Isaacson's _Steve Jobs_:

> One of Bill Atkinson’s amazing feats (which we are so accustomed to nowadays that we rarely marvel at it) was to allow the windows on a screen to overlap so that the “top” one clipped into the ones “below” it. Atkinson made it possible to move these windows around, just like shuffling papers on a desk, with those below becoming visible or hidden as you moved the top ones. Of course, on a computer screen there are no layers of pixels underneath the pixels that you see, so there are no windows actually lurking underneath the ones that appear to be on top. To create the illusion of overlapping windows requires complex coding that involves what are called “regions.” Atkinson pushed himself to make this trick work because he thought he had seen this capability during his visit to Xerox PARC. In fact the folks at PARC had never accomplished it, and they later told him they were amazed that he had done so. “I got a feeling for the empowering aspect of naïveté”, Atkinson said. “Because I didn’t know it couldn’t be done, I was enabled to do it.” He was working so hard that one morning, in a daze, he drove his Corvette into a parked truck and nearly killed himself. Jobs immediately drove to the hospital to see him. “We were pretty worried about you”, he said when Atkinson regained consciousness. Atkinson gave him a pained smile and replied, “Don’t worry, I still remember regions.”

reply
JKCalhoun
1 day ago
[-]
With overlapping rectangular windows (slightly simpler case than ones with rounded corners) you can expect visible regions of windows that are not foremost to be, for example, perhaps "L" shaped, perhaps "T" shaped (if there are many windows and they overlap left and right edges). Bill's region structure was, as I understand it, more or less an RLE (run-length encoded) representation of the visible rows of a window's bounds. The region for the topmost window (not occluded in any way) would indicate the top row as running from 0 to width-of-window (or the right edge of the display if clipped by the display). I believe too there was a shortcut to indicate "oh, and the following rows are identical" so that an un-occluded rectangular window would have a pretty compact region representation.

Windows partly obscured would have rows that may not begin at 0, may not continue to width-of-window. Window regions could even have holes if a skinnier window was on top and within the width of the larger background window.

The cleverness, I think, was then to write fast routines to add, subtract, intersect, and union regions, and rectangles of this structure. Never mind quickly traversing them, clipping to them, etc.
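A rough Python sketch of the idea described above (the rectangle tuples and the `visible_rows` helper are illustrative, not Bill's actual encoding): the per-row visible spans fall out of subtracting the front window's rectangle, and runs of identical rows are exactly what the "following rows are identical" shortcut would compress:

```python
def visible_rows(window, front):
    """Per-row visible spans of `window` after `front` occludes it.

    Rectangles are (left, top, right, bottom), half-open. A sketch of
    the idea behind regions, not QuickDraw's actual representation.
    """
    l, t, r, b = window
    fl, ft, fr, fb = front
    rows = {}
    for y in range(t, b):
        if not (ft <= y < fb) or fr <= l or fl >= r:
            rows[y] = [(l, r)]                 # row unobscured: full width
        else:
            spans = []
            if fl > l:
                spans.append((l, min(fl, r)))  # visible piece left of the front window
            if fr < r:
                spans.append((max(fr, l), r))  # visible piece right of it
            rows[y] = spans
    return rows

# Front window covers the top-right corner of the back window:
rows = visible_rows((0, 0, 100, 40), (60, 0, 120, 20))
rows[10]   # [(0, 60)]  -- a clipped row
rows[30]   # [(0, 100)] -- full width; runs of identical rows compress well
```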

reply
duskwuff
23 hours ago
[-]
The QuickDraw source code refers to the contents of the Region structure as an "unpacked array of sorted inversion points". It's a little short on details, but you can sort of get a sense of how it works by looking at the implementation of PtInRgn(Point, RegionHandle):

https://github.com/historicalsource/supermario/blob/9dd3c4be...

As far as I can tell, it's a bounding box (in typical L/T/R/B format), followed by a sequence of the X/Y coordinates of every "corner" inside the region. It's fairly compact for most region shapes which arise from overlapping rectangular windows, and very fast to perform hit tests on.
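A minimal Python sketch of that hit test, assuming the "sorted inversion points" reading is right (illustrative only, not the 68k code): on the point's scanline, count how many toggle coordinates lie at or left of the point; an odd count means the point is inside:

```python
def pt_in_scanline(x, inversions):
    """Hit-test one scanline of a region.

    `inversions` is the sorted list of x-coordinates where the region
    toggles between 'outside' and 'inside' (half-open spans).
    """
    inside = False
    for ix in inversions:
        if ix > x:
            break
        inside = not inside   # each inversion point flips the state
    return inside

# Scanline of an L-shaped region: inside on [10, 50) and [80, 120)
spans = [10, 50, 80, 120]
pt_in_scanline(30, spans)   # True  (between 10 and 50)
pt_in_scanline(60, spans)   # False (in the gap)
```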

reply
JKCalhoun
21 hours ago
[-]
Thanks for digging deeper.
reply
gblargg
12 hours ago
[-]
The key seems to have been recognizing the utility of the region concept and making it fundamental to the QuickDraw API (along with the clever representation that made finding the main rectangular portions easy). This insulated QuickDraw from the complexity of windowing-system operations. Once you go about implementing region operations, you probably find that it's fairly efficient to work out the major rectangular regions so you can use normal graphics operations on them, leaving small areas that can just be done inefficiently as a bunch of tiny rectangles. All this work for clipped graphics was applicable to far more than just redrawing obscured window content, so it could justify more engineering time for polishing. Given how easy regions were to use, more things could leverage the optimization (e.g. using them to redraw only the dirty region when a window was uncovered).
reply
rjsw
1 day ago
[-]
I think the difference between the Apple and Xerox approach may be more complicated than the people at PARC not knowing how to do this. The Alto doesn't have a framebuffer, each window has its own buffer and the microcode walks the windows to work out what to put on each scanline.
reply
JKCalhoun
1 day ago
[-]
Not doubting that, but what is the substantive difference here? Does the fact that there is a screen buffer on the Mac facilitate clipping that is otherwise not possible on the Alto?
reply
aaronharder
1 day ago
[-]
reply
lambdaone
1 day ago
[-]
It allows the Mac to use far less RAM to display overlapping windows, and doesn't require any extra hardware. Individual regions are refreshed independently of the rest of the screen, with occlusion, updates, and clipping managed automatically.
reply
atombender
7 hours ago
[-]
So when the OS needs to refresh a portion of the screen (e.g. everything behind a top window that was closed), what happens?

My guess is it asks each application that overlapped those areas to redraw only those areas (in case the app is able to be smart about redrawing incrementally), and also clips the following redraw so that any draw operations issued by the app can be "culled". If an app isn't smart and just redraws everything, the clipping can still eliminate a lot of the draw calls.
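A toy Python model of that guess, using sets of cells in place of real regions (all names here are hypothetical, not any OS's API): the dirty area is handed to windows front-to-back, each window repaints only its clipped share, and whatever it claims is removed from the remaining damage:

```python
def redraw_exposed(dirty, windows):
    """Ask each window to redraw only its part of `dirty`.

    `windows` is a front-to-back list of (visible_cells, draw_fn);
    draw_fn receives the clip set and should touch only those cells.
    A sketch of expose handling, not a real windowing system.
    """
    for visible, draw_fn in windows:
        clip = visible & dirty      # this window's share of the damage
        if clip:
            draw_fn(clip)
        dirty -= clip               # front windows claim cells first

painted = []
win_a = ({(0, 0), (0, 1)}, lambda clip: painted.append(("A", clip)))
win_b = ({(0, 1), (1, 1)}, lambda clip: painted.append(("B", clip)))
redraw_exposed({(0, 1), (1, 1)}, [win_a, win_b])
# win_a (in front) repaints (0, 1); win_b only gets the leftover (1, 1)
```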

reply
saghm
1 day ago
[-]
Yeah, it seems like the hard part of this problem isn't merely coming up with a solution that technically is correct, but one that also is efficient enough to be actually useful. Throwing specialized or more expensive hardware at something is a valid approach for problems like this, but all else being equal, having a lower hardware requirement is better.
reply
al_borland
22 hours ago
[-]
I was just watching an interview with Andy Hertzfeld earlier today and he said this was the main challenge of the Macintosh project. How to take a $10k system (Lisa) and run it on a $3k system (Macintosh).

He said they drew a lot of inspiration from Woz on the hardware side. Woz was well known for employing lots of little hacks to make things more efficient, and the Macintosh team had to apply the same approach to software.

reply
ehaliewicz2
23 hours ago
[-]
It definitely makes it simpler. You can do a per-screen window sort, rather than per-pixel :).

Per-pixel sorting while racing the beam is tricky, game consoles usually did it by limiting the number of objects (sprites) per-line, and fetching+caching them before the line is reached.

reply
scripturial
15 hours ago
[-]
I remember coding games for the C64 with an 8 sprite limit, and having to swap sprites in and out for the top and bottom half of the screen to get more than 8.
reply
rsync
1 day ago
[-]
Displaying graphics (of any kind) without a framebuffer is called "racing the beam" and is technically quite difficult: it involves managing the real-world speed of the electron beam against the CPU clock speed ... as in, if you tax the CPU too much the beam goes by and you missed it ...

The very characteristic horizontally stretched graphics of the Atari 2600 are due to this - the CPU was actually too slow, in a sense, for the electron beam which means your horizontal graphic elements had a fairly large minimum width - you couldn't change the output fast enough.

I strongly recommend:

https://en.wikipedia.org/wiki/Racing_the_Beam

... which goes into great detail on this topic and is one of my favorite books.

reply
peter303
20 hours ago
[-]
Frame buffer memory was still incredibly expensive in 1980. Our labs 512 x 512 x 8bit table lookup color buffer cost $30,000 in 1980. Mac's 512 x 384 x 8bit buffer in 1984 had to fit the Macs $2500 price. The Xerox Alto was earlier than these two devices and would have cost even more if it had a full frame buffer.
reply
sroussey
17 hours ago
[-]
Wasn’t the original Mac at 512 x 342 x 1bit?
reply
mkl
17 hours ago
[-]
Yes: https://512pixels.net/2025/05/original-macintosh-resolution/

There was a discussion here a couple of weeks ago (with a typo in the title): https://news.ycombinator.com/item?id=44110219

reply
mjevans
1 day ago
[-]
Reminds me of a GPU's general workflow. (like the sibling comment, 'isn't that the obvious way this is done'? Different drawing areas being hit by 'firmware' / 'software' renderers?)
reply
heresie-dabord
23 hours ago
[-]
Bill Atkinson, all smiles as he receives applause from the audience for his work on Mac Paint: https://www.youtube.com/watch?v=nhISGtLhPx4
reply
JKCalhoun
23 hours ago
[-]
That's a great video. Everything he does gets applause and he is all (embarrassed?) grins.
reply
rezmason
19 hours ago
[-]
I like how he pronounces "pix-els". Learning how we arrived at our current pronunciation is the kind of computer history I can't get enough of
reply
pducks32
23 hours ago
[-]
Would someone mind explaining the technical aspect here? I feel with modern compute and OS paradigms I can’t appreciate this. But even now I know that feeling when you crack it and the thrill of getting the impossible to work.

It’s on all of us to keep the history of this field alive and honor the people who made it all possible. So if anyone would nerd out on this, I’d love to be able to remember him that way.

(I did read this https://www.folklore.org/I_Still_Remember_Regions.html but might be not understanding it fully)

reply
giovannibajo1
22 hours ago
[-]
There were far fewer abstraction layers than today. Today when your desktop application draws something, it gets drawn into a context (a "buffer") which holds the picture of the whole window. Then the window manager / compositor simply paints all the windows on the screen, one on top of the other, in the correct priority (I'm simplifying a lot, but just to get the idea). So when you are programming your application, you don't care about other applications on the screen; you just draw the contents of your window and that's done.

Back at the time, there wouldn't be enough memory to hold a copy of the full contents of all possible windows. In fact, there were actually zero abstraction layers: each application was responsible for drawing itself directly into the framebuffer (array of pixels), into its correct position. So how to handle overlapping windows? How could each application draw itself on the screen, but only on the pixels not covered by other windows?

QuickDraw (the graphics API written by Atkinson) contained this data structure called "region" which basically represent a "set of pixels", like a mask. And QuickDraw drawing primitives (eg: text) supported clipping to a region. So each application had a region instance representing all visible pixels of the window at any given time; the application would then clip all its drawing to the region, so that only the visibile pixels would get updated.

But how was the region implemented? Obviously it could not have been a mask of pixels (as in, a bitmask) as it would use too much RAM and would be slow to update. In fact, consider that the region data structure had to be quick at operations like intersections, unions, etc., as the operating system had to update the regions for each window as windows got dragged around by the mouse.

So the region was implemented as a bounding box plus a list of visible horizontal spans (I think, I don't know exactly the details). When you represent a list of spans, a common hack is to simply store the coordinates at which the "state" switches between "inside the span" and "outside the span". This approach makes for some nice tricks when doing operations like intersections.

Hope this answers the question. I'm fuzzy on many details so there might be several mistakes in this comment (and I apologize in advance) but the overall answer should be good enough to highlight the differences compared to what computers do today.
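Here's a hedged Python sketch of the inversion-point trick for one scanline (illustrative only, not QuickDraw's implementation): merge the two sorted toggle lists, track whether each region is currently "inside", and emit a point whenever the combined state flips. The same merge handles intersection, union, etc., just by swapping the predicate:

```python
def region_op(a, b, keep):
    """Combine two scanline inversion lists.

    a, b: sorted x-coordinates where each region toggles inside/outside.
    keep: predicate on (in_a, in_b) deciding whether the result is
    'inside', e.g. intersection = lambda p, q: p and q.
    Returns the inversion list of the combined region.
    """
    out, in_a, in_b, was_in = [], False, False, False
    i = j = 0
    while i < len(a) or j < len(b):
        # Next toggle coordinate from either list.
        x = min(a[i] if i < len(a) else float("inf"),
                b[j] if j < len(b) else float("inf"))
        if i < len(a) and a[i] == x:
            in_a, i = not in_a, i + 1
        if j < len(b) and b[j] == x:
            in_b, j = not in_b, j + 1
        now_in = keep(in_a, in_b)
        if now_in != was_in:        # combined state flipped: record a point
            out.append(x)
            was_in = now_in
    return out

region_op([10, 50], [40, 80], lambda p, q: p or q)   # union: [10, 80]
region_op([10, 50], [40, 80], lambda p, q: p and q)  # intersection: [40, 50]
```

Note the win: cost depends on the number of toggle points (shape complexity), not on how many pixels wide the spans are.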

reply
II2II
19 hours ago
[-]
It's a good description, but I'm going to add a couple of details since details that are obvious to someone who lived through that era may not be obvious to those who came after.

> Obviously it could have not been a mask of pixels

To be more specific about your explanation of too much memory: many early GUIs were 1 bit-per-pixel, so the bitmask would use the same amount of memory as the window contents.

There was another advantage to the complexity of only drawing regions: the OS could tell the application when a region was exposed, so you only had to redraw a region when it needed an update or had just been exposed. Unless you were doing something complex and could justify buffering the results, you were probably re-rendering it. (At least those are my recollections from making a Mandelbrot fractal program for a compact Mac, several decades back.)

reply
gblargg
12 hours ago
[-]
And even ignoring memory requirements, an uncompressed bitmap mask would have taken a lot of time to process (especially when combining regions where one was shifted by a non-multiple of 8 pixels with respect to the other). With just the horizontal coordinates of inversions, it takes the same amount of time for a region 8 pixels wide and 800 pixels wide, given the same shape complexity.
reply
duskwuff
17 hours ago
[-]
> But how was the region implemented?

The source code describes it as "an unpacked array of sorted inversion points". If you can read 68k assembly, here's the implementation of PtInRgn:

https://github.com/historicalsource/supermario/blob/9dd3c4be...

reply
giovannibajo1
9 hours ago
[-]
Yeah those are the horizontal spans I was referring to.

It’s a sorted list of X coordinates (left to right). If you group them in couples, they are begin/end intervals of pixels within region (visibles), but it’s actually more useful to manipulate them as a flat array, as I described.

I studied a bit the code and each scanline is prefixed by the Y coordinates, and uses an out of bounds terminator (32767).

reply
bluedino
1 day ago
[-]
> In fact the folks at PARC had never accomplished it, and they later told him they were amazed that he had done so.

Reminds me of the story where some company was making a new VGA card, and it was rumored a rival company had implemented a buffer of some sort in their card. When both cards came out the rival had either not actually implemented it or implemented a far simpler solution

reply
Grosvenor
20 hours ago
[-]
Michael Abrash's black book of graphics programming. They heard about a "buffer", so they implemented the only non-stupid thing - a write FIFO. It turns out the competition had done the most stupid thing and built a read buffer.

I teach this lesson to my mentees. Knowing that something is possible gives you significant information. Also, don't brag - It gives away significant information.

Just knowing something is possible makes it much, much easier to achieve.

https://valvedev.info/archives/abrash/abrash.pdf

reply
bogantech
2 minutes ago
[-]
> Turns out the competition had done the most stupid thing and built a read buffer

This isn't really stupid though as explained in the pdf

> Paradise had stuck a read FIFO between display memory and the video output stage of the VGA, allowing the video output to read ahead, so that when the CPU wanted to access display memory, pixels could come from the FIFO while the CPU was serviced immediately. That did indeed help performance--but not as much as Tom’s write FIFO.

VRAM accesses are contended, so during the visual display period the VGA circuitry has priority. CPU accesses result in wait states - a FIFO between the VRAM and the VGA means less contention and more cycles for CPU accesses

Why improve read performance though? Games accessing VRAM I presume would be 99% write. Perhaps it was to improve performance in GUIs like Windows?

reply
alanfalcon
22 hours ago
[-]
An infamous StarCraft example tells a similar story: the team was so humbled by a competitor's demo (and by criticism that their own game was simply "Warcraft in space") that they went back and significantly overhauled their game.

Former Ion Storm employees later revealed that Dominion’s E3 1996 demo was pre-rendered, with actors pretending to play, not live gameplay.

reply
stevenwoo
19 hours ago
[-]
I got a look at an early version of StarCraft source code as a reference for the sound library for Diablo 2 and curiosity made me do a quick analysis of the other stuff - they used a very naive approach to C++ and object inheritance to which first time C++ programmers often fall victim. It might have been their first C++ project so they probably needed to start over again anyways. We had an edict on Diablo 2 to make the C++ look like recognizable C for Dave Brevik's benefit which turned out pretty well I think (it was a year late but we shipped).
reply
genewitch
19 hours ago
[-]
Diablo II is in my top 3 games of all time, i still play it all the time. Thanks for contributing so much fun to my life!

(for ref, diablo III is also in my top 3 :)

reply
stevenwoo
15 hours ago
[-]
I was only one of many programmers at Blizzard North + others at Blizzard and our parent company at the time, but you are welcome from me.
reply
selimthegrim
5 hours ago
[-]
What exactly did they do that was naive?
reply
mhh__
22 hours ago
[-]
A similar tale: propaganda, and stats with missing asterisks, about the MiG-25 led to the requirements for the F-15 being set very high.
reply
jajko
1 day ago
[-]
Pretty awesome story, but also with a bit of a dark lining. Of course any owner, and triple that for Jobs, loves over-competent guys who work themselves to death, here almost literally.

But that's not a recipe for personal happiness for most people, and most of us would not end up contributing revolutionary improvements even if we did. The world needs awesome workers, but it also needs e.g. awesome parents, or just happy, balanced, content people (or at least some mix of those).

reply
1123581321
1 day ago
[-]
Pretty much. Most of us have creative itches to scratch that make us a bit miserable if we never get to pursue them, even if given a comfortable life. It’s circumstantial whether we get to pursue them as entrepreneurs or employees. The users or enjoyers of our work benefit either way.
reply
kevinventullo
23 hours ago
[-]
Just to add on, some of us have creative itches that are not directly monetizable, and for which there may be no users or enjoyers of our work at all (if there are, all the better!).

Naturally I don’t expect to do such things for a living.

reply
1123581321
16 hours ago
[-]
Yes, and thankful for these. Good addition.
reply
richardw
23 hours ago
[-]
Survivorship bias. The guys going home at 5 went home at 5 and their companies are not written about. It’s dark but we’ve been competing for a while as life forms and this is “dark-lite” compared to what our previous generations had to do.

Some people are competing, and need to make things happen that can’t be done when you check out at 5. Or more generally: the behaviour that achieves the best outcome for a given time and place, is what succeeds and forms the legends of those companies.

If you choose one path, know your competitors are testing the other paths. You succeed or fail partly based on what your most extreme competitors are willing to do, sometimes with some filters for legality and morality. (I.e. not universally true for all countries or times.)

Edit: I currently go home at 5, but have also been the person who actually won the has-no-life award. It’s a continuum, and is context specific. Both are right and sometimes one is necessary.

reply
duskwuff
23 hours ago
[-]
That's not quite how I read the story. Jobs didn't ask Atkinson if he remembered regions - Atkinson brought it up.
reply
asveikau
21 hours ago
[-]
It's also a joke, and a pretty good one at that. Shows a sense of humor.
reply
bowsamic
1 day ago
[-]
What is the dark lining? Do you think Atkinson did not feel totally satisfied with his labour?

And I don't think anyone said that that's the only way to be

reply
brentjanderson
19 hours ago
[-]
Bill's contribution with HyperCard is of course legendary. Apart from the experience of classrooms and computer labs in elementary schools, it was also the primary software powering a fusion of bridge-simulator-meets-live-action-drama field trips (among many other things) for over 20 years at the Space Center in central Utah.[0] I was one of many beneficiaries of this program as a participant, volunteer, and staff member. It was among the best things I've ever done.

That seed crystal of software shaped hundreds of thousands of students who to this day continue to rave about this program (although the last bits of HyperCard retired permanently about 12 years ago; nowadays it's primarily web-based tech).

HyperCard's impact on teaching students to program starship simulators, and then telling compelling, interactive, immersive, multi-player dramatic stories in those ships is something enabled by Atkinson's dream in 1985.

May your consciousness journey between infinite pools of light, Bill.

Also, if you've read this far, go donate to Pancreatic Cancer research.[1]

[0]: https://spacecenter.alpineschools.org

[1]: https://pancan.org

reply
betamaxthetape
18 hours ago
[-]
Is that stack available anywhere? Or do you have a copy?
reply
brentjanderson
5 hours ago
[-]
Sadly, most of them are lost to time. The one I'm aware of, https://archive.org/details/hypercard_voyager-engineer-new, is just one station of about 15 from one of the ships.

https://thoriumsim.com is a modern incarnation of the same software.

reply
davisr
1 day ago
[-]
I first met Bill over video-chat during 2020 and we got to know each other a bit. He later sent me a gift that changed my life. We hadn't talked for the past couple years, but I know he experienced "death" before and was as psychologically prepared as anyone could be. I have no doubt that he handled the biggest trip of his life with grace. We didn't always see eye-to-eye when it came to software, but we did share a mutual interest in the unknown, and the meaning of it all. Meet ya on the other side, Bill.
reply
fingerlocks
1 day ago
[-]
Don’t leave us hanging. What was the gift?
reply
cess11
23 hours ago
[-]
Perhaps there were enough clues in the message to figure it out.
reply
DonHopkins
22 hours ago
[-]
Perhaps a colorful postcard with the message:

* <= Lick This Spot

(You may be one of the Lucky 20!)

reply
cess11
2 hours ago
[-]
Perhaps!
reply
thought_alarm
1 day ago
[-]
Bill Atkinson was a very fascinating guy. His interview with Leo Laporte from 2013 is a great listen.

Here's a little 6 minute clip: An acid trip, and the origins of Hypercard.

https://www.youtube.com/watch?v=bdJKjBHCh18

reply
gavmor
1 day ago
[-]
If you haven't, check out the documentary[0] on General Magic which Bill co-founded in 1990. Among the more remarkable scenes in there is when a member of the public seems perplexed by the thought that they would even want to "check email from Times Square."

An unthinkable future, but they thought it. And yet, most folks have never heard of General Magic.

0. https://www.youtube.com/watch?v=JQymn5flcek

reply
JKCalhoun
16 hours ago
[-]
Atkinson at this timestamp: https://youtu.be/JQymn5flcek?si=2TMJ8b9zsR_Kitj-&t=1297

Also, it's here in the documentary that someone expresses the excitement of anticipating the smartphone. It's hard for me to watch now and not shake my head: "Oh, it's not quite as wonderful as you imagined."

reply
dboreham
1 day ago
[-]
Invaluable film if you believe Apple invented the smart phone.
reply
gavmor
1 day ago
[-]
Invaluable film if you believe invention is what made Apple valuable.
reply
fotta
1 day ago
[-]
Wow. Rest in peace Bill. I think he is deserving of a black stripe up top.
reply
djmips
1 day ago
[-]
100%
reply
zahlman
1 day ago
[-]
You can set your 'topcolor' in preferences, but this will obscure the links in the sidebar (barring local CSS hacking).
reply
froggertoaster
23 hours ago
[-]
I think you missed the point.
reply
zahlman
20 hours ago
[-]
What other stripe would there be to refer to? I haven't seen anything like that here.
reply
Centigonal
20 hours ago
[-]
Check the site now. HN has a "flag at half-mast" feature for when tech visionaries are freed by the great GC of the cosmos.
reply
zahlman
20 hours ago
[-]
Ah, I had to change topcolor back to notice....
reply
mjbamford
19 hours ago
[-]
I never met Bill, and he never knew I existed, but he has had such a huge impact on my career, my family and my prosperity. I started my programming passion on the Apple II and switched to the Mac in 1984 after seeing MacPaint. HyperCard was very impactful on my logical thinking, paraded the incredible possibilities of this machine, and taught me how to conceptualise information. His humble efforts have had such a profound effect. I'm so very full of grief upon hearing this news.
reply
bill_mcgonigle
1 day ago
[-]
People today take the WIMP interface for granted and forget about the pioneers who invented it.

It's really sad to see desktop apps adopt hamburger menus and things that make sense on mobile but make life harder on a desktop built for WIMP.

Thank you, Bill! Some days I'd rather be using your interface.

reply
WillAdams
1 day ago
[-]
Some notable stories from Folklore.org:

https://www.folklore.org/Joining_Apple_Computer.html

https://www.folklore.org/Negative_2000_Lines_Of_Code.html --- something to bring up whenever lines of code as a metric is put forward

https://www.folklore.org/Rosings_Rascals.html --- story of how the Macintosh Finder came to be

https://www.folklore.org/I_Still_Remember_Regions.html --- surviving a car accident

reply
leoc
22 hours ago
[-]
“Busy Being Born” https://www.folklore.org/Busy_Being_Born.html , with its priceless early glimpses of the Lisa/Mac UI preserved in Polaroid photos, may be the best.
reply
bombcar
21 hours ago
[-]
This shows amazing foresight on Bill's part. It would have been easy for all that to be lost.
reply
pmoriarty
16 hours ago
[-]
Here is a video of Bill showing and discussing those Polaroids: https://m.youtube.com/watch?v=Qg0mHFcB510
reply
LorenDB
1 day ago
[-]
The lines of code story is a timeless classic.
reply
90s_dev
1 day ago
[-]
> He thought and wrote "-2000 lines". Management stopped asking Bill to fill out the form.

This lesson stuck with me for years. Final results alone are measurable, not productivity.

reply
worik
17 hours ago
[-]
Imagine being the manager that had to assess Bill Atkinson's productivity!

> Final results alone are measurable

Not even those, for individuals

Mostly we work in teams. I myself like to work in "plumbing" that is arbitrarily far from "final results"

reply
garciasn
1 day ago
[-]
Score code on line count and runtime golf. Shorter, faster, and fastest time to completion is best.

Code that’s 4K, runs slightly faster, and took slightly less time to write still doesn’t get the best score compared to 400-byte code that took another 30m to write.

reply
dan-robertson
1 day ago
[-]
I kind of think metrics are not the answer and that instead one needs taste. Obviously performance is multidimensional, both in what one measures (latency vs throughput) and as a function of the input. The solution you imagine that is slightly faster in the test could avoid (or introduce) different worst-case or asymptotic behaviour, for example.
reply
garciasn
1 day ago
[-]
I argue we shouldn’t be doing this at all; but if we have to, then for whatever insanely arbitrary metric a project/product/eng leader wants, this is probably a better metric than code length.
reply
pmoriarty
16 hours ago
[-]
> Shorter, faster, and fastest time to completion is best.

What about correctness, robustness, readability, clarity, maintainability?

reply
beagle3
13 hours ago
[-]
Thanks, Bill. Rest in Peace.

I was amazed by Bill's software seeing it on a Mac back then - MacPaint mostly, then HyperCard. I was not even 10, but I was already programming, and spent hours trying to figure out how to implement MacPaint's Lasso on my humble ZX Spectrum. (With some success, but not quite as elegant...)

If you want to experience HyperCard, John Earnest (RodgerTheGreat on HN[0]) built Decker[1] that runs on both the web and natively, and captures the aesthetic and most stuff perfectly. It uses Lil as a programming language - it is different than HyperTalk, but beautiful in its own right. (It doesn't read as English quite the way HyperTalk does, but it is more regular and easier to write - it's a readable/writable vector language, quite unlike those other ones ...)

[0] https://news.ycombinator.com/user?id=RodgerTheGreat

[1] https://beyondloom.com/decker/

reply
wesnerm2
1 day ago
[-]
Atkinson's HyperCard was released in 1987, before the widespread adoption of the web. HyperCard introduced concepts like interactive stacks of cards, scripting, and linking, which were later adopted and expanded upon in the web. Robert Cailliau, who assisted Tim Berners-Lee in developing the first web browser, was influenced by HyperCard's hyperlink concept.
reply
bicepjai
20 hours ago
[-]
Please give a modern day metaphor or similar feature for HyperCard
reply
bigstrat2003
1 day ago
[-]
For anyone (like me) wondering who this guy was, he was a prominent UI guy at Apple back in the day. According to Wikipedia he created the menu bar, QuickDraw, and HyperCard.

For whoever submits stories like this, please say who the person was. Very few people are so famous that everyone in tech knows who they were, and Mr. Atkinson was not one of them. I've heard of his accomplishments, but never of the man himself.

reply
gdubs
1 day ago
[-]
Adding a bit more context: The World Wide Web arguably exists because of HyperCard. The idea that information can be hyperlinked together.

Atkinson was a brilliant engineer. As critical to the launch of the Macintosh as anyone — efficient rendering of regions, overlapping windows, etc.

And last but not least, Mac Paint. Every computer painting program in existence owes Atkinson a nod.

reply
btilly
23 hours ago
[-]
The idea that information can be hyperlinked together predated HyperCard by decades. It goes back to https://www.theatlantic.com/magazine/archive/1945/07/as-we-m..., which was written in 1945. The same essay also has the fundamental ideas for a citation index.

This gave rise both to the Science Citation Index and to various hypertext systems. For example the famous 1968 presentation https://www.youtube.com/watch?v=yJDv-zdhzMY, now known as "The Mother of All Demos", demonstrated a working hypertext system among the other jaw-dropping accomplishments.

HyperCard brought hypertext to commodity hardware. The Web made a distributed hypertext system viable. Google's PageRank recombined hypertext and the Science Citation Index to make the web more usable. And all of the key insights trace back to Vannevar Bush. Who was able to have such deep insights in 1945 because he had been working in, and thinking about, computing at least since 1927.

The history of important ideas in computing generally goes far deeper than most programmers are aware.

reply
gdubs
22 hours ago
[-]
I'm not claiming the idea didn't exist but Atkinson's HyperCard turned it into a viable product and the creators of the web credited him for their inspiration.
reply
DonHopkins
22 hours ago
[-]
It's not just the links, more importantly it had:

  on mouseDown
    answer "HyperTalk!" with "OK"
  end mouseDown
reply
jcynix
22 hours ago
[-]
> The idea that information can be hyperlinked together.

HyperCard was really cool and I miss it. Its most important feature IMO was enabling non-programmers to rather easily author useful software. As happened with Excel.

The idea that information can be hyperlinked is much older than HyperCard. Check out Ted Nelson and his https://en.wikipedia.org/wiki/Project_Xanadu which predates HyperCard by more than a decade.

And then there was the https://en.wikipedia.org/wiki/Symbolics_Document_Examiner, or GNU Texinfo and its precursors besides many other attempts.

reply
gdubs
4 hours ago
[-]
I'm pretty obsessed with these 'branches not taken' concepts in computing. Like, everything is the same today. And there's good usability arguments that things shouldn't be different for the sake of being different. But, there are so many forgotten concepts of the past that were arguably much more powerful, simple, expressive ways to interact with machines.
reply
djmips
1 day ago
[-]
He was more than a prominent UI guy - back then he was both designer and programmer, designing and coding the foundations.
reply
justin66
22 hours ago
[-]
People are showing you respect when they credit you with the ability to Google things yourself.
reply
pmoriarty
16 hours ago
[-]
The NYT credits him with inventing the double click.[1]

[1] - https://www.nytimes.com/2025/06/07/technology/bill-atkinson-...

reply
jcynix
42 minutes ago
[-]
Maybe the NYT should check Wikipedia first: https://en.wikipedia.org/wiki/Double-click
reply
zahlman
1 day ago
[-]
Several previous top-level comments address Atkinson's accomplishments, but I agree with you in principle.
reply
mathattack
1 hour ago
[-]
"What a man, what a mind, what gifts to the world he left us."

What a tribute! He was famous at the time, though now perhaps an unsung hero in leading us into a GUI world.

reply
erksa
15 hours ago
[-]
Not just a huge influence w/ Apple. Bill Atkinson was a dedicated photographer who did a lot to help bring the idea of sharing memories together.

He and his associates printed and sent tons of photography all around the world.

He was loved among photographers as well. https://apps.apple.com/us/app/photocard-by-bill-atkinson/id3...

reply
pmoriarty
15 hours ago
[-]
More of his photos can be found at https://billatkinson.com/
reply
iainmerrick
1 day ago
[-]
One of my favourite Atkinson stories -- I can't remember if this is on folklore.org or somewhere else -- is that he actually implemented editable text in MacPaint, by scanning the bitmap for character shapes, but chose not to ship that feature because it could never be perfect. Amazing technical skill and great taste and judgement.
reply
duskwuff
1 day ago
[-]
reply
leakycap
2 hours ago
[-]
I truly believe Mr. Atkinson's dedication to his craft was part of the reason I and so many others loved the Mac and got into computers.

I will continue to admire him, his way of problem solving, and his way of speaking about past work -- successes and lessons learned.

reply
brador
18 minutes ago
[-]
Pancreatic cancer. That’s what got Steve too.
reply
vercantez
23 hours ago
[-]
Wow. One of the absolute greatest. The world truly is a different place because of Bill. Bill’s importance in the history of computing cannot be overstated. Hypercard is probably my favorite invention of his. So ahead of its time. Rest in peace Bill
reply
koops
17 hours ago
[-]
Bill Atkinson and Andy Hertzfeld were my childhood heroes through their work. Inside Macintosh was a series that enlightened my teen years. Thanks, Bill.
reply
THENATHE
22 hours ago
[-]
I know nothing about the fundamentals of “old computing” like what Mr. Atkinson worked on as I am only 27 and have much more contemporary experience. That being said, I still very greatly mourn the loss of these old head techs because the world of tech I use today would not have been possible if not for these incredibly smart and talented individuals. To learn to code without YouTube is truly a feat I could not imagine, and the world will be a lesser place without this kind of ingenuity. Hopefully he’s making some computers in the sky a bit better!
reply
bombcar
21 hours ago
[-]
It's amazing to remember that there was an entire generation of computers and users for whom a command line was a new and modern invention!
reply
agumonkey
1 day ago
[-]
reply
scrlk
1 day ago
[-]
Bill Atkinson demoing MacPaint at the Macintosh introduction in 1984: https://youtu.be/1tQ5XwvjPmA?t=1781
reply
JKCalhoun
1 day ago
[-]
Plenty of interviews will Bill on YouTube. Example: https://youtu.be/dhlKTRU--VA

Also, perhaps the General Magic documentary is a fun watch too: https://youtu.be/JQymn5flcek

reply
happycube
1 day ago
[-]
CHM posted MacPaint and QuickDraw source: https://computerhistory.org/blog/macpaint-and-quickdraw-sour...
reply
cxr
20 hours ago
[-]
In the corresponding interview, Bill makes several pointed remarks that the state of the code published—or at least the code as it was when presented to him for comment—is not his QuickDraw code.

<https://www.youtube.com/watch?v=kGIIwyJ7G94>

reply
blindriver
22 hours ago
[-]
Another death from pancreatic cancer. I really hope we can figure out why rates are skyrocketing because it is a silent killer and usually isn’t detected until it’s too late.
reply
ksec
19 hours ago
[-]
He was 74. I think age has more to do with it.
reply
djmips
21 hours ago
[-]
I'm a little shook. A hero to many GenX coders I'm sure - I'm one of them. What a legend.
reply
soulofmischief
1 day ago
[-]
Atkinson is a legendary UX pioneer. Great technical skill and a deep understanding of the principles of interaction. His work, from the double click to HyperCard, continues to inspire my own work. You will be missed.
reply
yreg
10 hours ago
[-]
"How many man-years did it take to write QuickDraw?", the Byte magazine reporter asked Steve.

Steve turned to look at Bill. "Bill, how long did you spend writing Quickdraw?"

"Well, I worked on it on and off for four years", Bill replied.

Steve paused for a beat and then turned back to the Byte reporter. "Twenty-four man-years. We invested twenty-four man-years in QuickDraw."

Obviously, Steve figured that one Atkinson year equaled six man years, which may have been a modest estimate.

http://folklore.org/StoryView.py?story=Mythical_Man_Year.txt

reply
carlosdp
23 hours ago
[-]
I was just telling someone about the story of how he invented bitmapping for overlapping windows in the first Mac GUI in like two weeks, largely because he mis-remembered that being already a feature in the Xerox PARC demo and was convinced it was already possible.

RIP to a legend

reply
edbaskerville
1 day ago
[-]
I wish I could have met him before he died.

I'm yet another child of HyperCard. It opened my mind to what computers could be for, and even though the last two decades have been full primarily of disappointment, I still hold onto that other path as a possibility, or even as a slice of reality---a few weeds growing in the cracks of our dystopian concrete.

reply
jprd
21 hours ago
[-]
HyperCard opened my mind as a kid in a way that I couldn't grok until the first time I took Mushrooms. What a genius.
reply
replwoacause
5 hours ago
[-]
Did that single experience with mushrooms change you forever, or was your insight transient?
reply
jmwilson
1 day ago
[-]
HyperCard was my introduction to programming and delivered on the vision of personal computing as "bicycle for the mind." RIP
reply
yardie
19 hours ago
[-]
The Mac, HyperCard, MacPaint, and General Magic - he's one of the few engineers who's had such a substantial impact on my life. Rest in Peace.
reply
gdubs
1 day ago
[-]
Atkinson's work is so influential. From his contributions to the Macintosh team, to HyperCard, Bill was an inspiration to me and showed the power of merging art & technology.

Thanks for everything, Bill — Rest in Peace.

reply
baumgarn
1 day ago
[-]
I fondly remember creating simple narrative stories and games with HyperCard at 6 years old on my dad's Macintosh SE. It was my first contact with programming and a fundamental seed to using the computer as a creative tool. It has shaped my life in a substantial way. RIP Bill - HN bar should be blacked out.
reply
bilekas
1 day ago
[-]
I need a facebook account to see this post ?

Can we get a better link maybe on the homepage ?

reply
rmason
1 day ago
[-]
Here's a post that quotes the original Facebook post and adds some personal comments.

https://daringfireball.net/linked/2025/06/07/bill-atkinson-r...

reply
dang
1 day ago
[-]
Thanks! I've changed the link to that from https://m.facebook.com/story.php?story_fbid=1023807357996337... above (but put the original URL in the top text so people can read both).
reply
asveikau
1 day ago
[-]
Some of his old demos of graphics capabilities on the Mac or hypercard are around on YouTube, and I watched some maybe 10 years ago. He displayed not just the tech chops but he was a good communicator. RIP.
reply
pcunite
1 day ago
[-]
"Some say Steve used me, but I say he harnessed and motivated me, and drew out my best creative energy." - Bill Atkinson
reply
TruffleLabs
1 day ago
[-]
I loved his PhotoCard app, as it allowed for image customization of the stamp and the ability to print on very high quality card stock and ink.
reply
jnaina
10 hours ago
[-]
A sad day for me.

I spent countless hours building HyperCard stacks and creating artwork in MacPaint, in college. A true legend.

RIP. Fat Bits forever.

reply
kabdib
2 hours ago
[-]
the source code of quickdraw is art
reply
malwrar
22 hours ago
[-]
This post is a really beautiful farewell, thanks author for including some examples of his work to smile at.
reply
jonstewart
23 hours ago
[-]
I was just musing to a young team member the other day that I think OOP comes easy to me because I learned HyperCard (v1.2 on System 6 on an SE) at a young age. RIP.
reply
analog31
22 hours ago
[-]
This was my experience too. My mom had a subscription to Byte Magazine, and I remember trying to read the articles on OOP when they came out. It was utterly opaque to me. When I started using HyperCard, the light bulb turned on.

I think a subtle factor is that when learning HC (or Visual Basic, or LabVIEW), you started using objects before you learned how to create them. All of these packages came with lots of pre-written objects that were easy to use. In the case of VB, you had to buy a special version if you wanted to create your own objects, and very few people did.

I think when teaching newer languages like Python, this is done as a matter of course. For instance if you show someone how to calculate a function and graph it, you're probably using objects from something like Matplotlib, before being shown how to create your own. And once again, among casual programmers, relatively few people define their own classes.

reply
zahlman
19 hours ago
[-]
>And once again, among casual programmers, relatively few people define their own classes.

I find that I'm less interested in defining my own classes today than I was 10 or so years ago. https://us.pycon.org/2012/schedule/presentation/352/ left a big impression on me (though I didn't see it until a fair bit after the fact).

reply
bhk
19 hours ago
[-]
There were giants in the Earth in those days...
reply
bhouston
18 hours ago
[-]
Why are so many original Apple people dying of pancreatic cancer? Is it that common and this a coincidence?
reply
empressplay
1 day ago
[-]
HyperCard Simulator: https://hcsimulator.com

ViperCard, a HyperCard re-imagining: https://www.vipercard.net/
reply
sgt
1 day ago
[-]
HyperCard also inspired Myst (the game), if I recall correctly
reply
Uvix
23 hours ago
[-]
The initial (Mac-only) version of Myst was built in HyperCard.
reply
WillAdams
16 hours ago
[-]
And it was preceded by _The Manhole_ which was billed as "Where Alice would have gone if she had Hypercard".
reply
lutusp
16 hours ago
[-]
My time with Atkinson came before the Macintosh, before Hypercard. As a company Apple was struggling and we were preparing for what, in retrospect, was the really terrible Apple III. It was a less optimistic time -- after the Apple II and before the Macintosh.

A digression: the roster of Apple-related pancreatic cancer victims is getting longer -- Jef Raskin (2005), Steve Jobs (2011), now Bill Atkinson (2025). The overall pancreatic cancer occurrence rate is 14 per 100,000, so such a cluster is surprising within a small group, but the scientist in me wants to argue that it's just a coincidence, signifying nothing.

Maybe it's the stress of seeing how quickly one's projects become historical footnotes, erased by later events. And maybe it's irrational to expect anything else.

reply
tw1984
15 hours ago
[-]
Steve Jobs had pancreatic neuroendocrine tumor, which is not the traditional form of the pancreatic cancer people usually talk about. It is far less aggressive and completely treatable, in fact almost 100% curable as Jobs had it diagnosed at such an early stage.
reply
LightBug1
1 day ago
[-]
Wow ... made my first app on Hypercard in high school ... Loved it.

RIP Mr Bill Atkinson

reply
CodeWriter23
12 hours ago
[-]
FatBits was life altering for me.
reply
pcdoodle
1 day ago
[-]
Oh man, he's a legend. My condolences to any family members passing by in remembrance. My highest respect goes to those with the tenacity and character required to force a good idea into existence. Bill inspired many people. While reading about him in "Revolution in the Valley", it felt like it recalibrated my own personal compass and gave me a sense of purpose in my own endeavors.
reply
solarized
17 hours ago
[-]
Inna lillahi wa inna ilaihi rojiun. ("Indeed we belong to God, and indeed to Him we return.")
reply
kapitanjakc
1 day ago
[-]
I've read stories about him on folklore.

He was a good man and great engineer.

RIP

reply
ayaros
1 day ago
[-]
A sad day for everyone. R.I.P. <3
reply
iwontberude
1 day ago
[-]
HyperCard was my introduction to programming. It was the first time I used a programming language on my mom’s old Macintosh IIci. It really has been a long time. Thank you, Bill.
reply
dondakirme
1 day ago
[-]
RIP
reply
msie
1 day ago
[-]
RIP programming god.
reply
mitchbob
1 day ago
[-]
reply
cm2187
1 day ago
[-]
worth watching the full show, was very interesting
reply
winterrx
1 day ago
[-]
Rest in peace.
reply
DonHopkins
1 day ago
[-]
https://news.ycombinator.com/item?id=21779399

DonHopkins on Dec 13, 2019 | parent | context | favorite | on: Bill Atkinson: Reflections on the 40th anniversary...

I recently posted these thoughts about Bill Atkinson, and links to articles and a recent interview he gave to Brad Myers' user interface class at CMU: https://news.ycombinator.com/item?id=21726302

Bill Atkinson is the humblest, sweetest, most astronomically talented guy -- practically the opposite of Rony Abovitz! I think they're on very different drugs. The Psychedelic Inspiration For Hypercard, by Bill Atkinson, as told to Leo Laporte.

"In 1985 I swallowed a tiny fleck of gelatin containing a medium dose of LSD, and I spent most of the night sitting on a concrete park bench outside my home in Los Gatos, California." ...

https://www.mondo2000.com/2018/06/18/the-inspiration-for-hyp...

Full interview with lots more details about the development of HyperCard:

https://twit.tv/shows/triangulation/episodes/247?autostart=f...

Bill Atkinson's guest lecture in Brad Meyer's CMU 05-640 Interaction Techniques class, Spring 2019, Feb 4, 2019:

https://scs.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=...

Including polaroids of early Lisa development.

About PhotoCard:

https://web.archive.org/web/20110303033205/http://www.billat...

PhotoCard by Bill Atkinson is a free app available from the iTunes App store, that allows you to create custom postcards using Bill's nature photos or your own personal photos, then send them by email or postal mail from your iPad, iPhone or iPod touch.

Bill Atkinson, Mac software legend and world renowned nature photographer, has created an innovative application that redefines how people create and send postcards.

With PhotoCard you can make dazzling, high resolution postcards on your iPad, iPhone or iPod touch, and send them on-the-spot, through email or the US Postal Service. The app is amazingly easy to use. To create a PhotoCard, select one of Bill's nature photos or one of your own personal photos. Then, flip the card over to type your message. For a fun touch, jazz up your PhotoCard with decorative stickers and stamps. If you're emailing your card, it can even include an audible greeting. When you've finished your creation, send it off to any email or postal address in the world!

pvg on Dec 13, 2019 | prev [–]

Was this bit about LSD and Hypercard covered before what seems like a 2016 interview and some later articles? So much has been written about HyperCard (and MacPaint and QuickDraw) I'm wondering if I somehow managed to miss it in all that material.

DonHopkins on Dec 13, 2019 | parent | next [–]

As far as I know, the first time Bill Atkinson publically mentioned that LSD inspired HyperCard was in an interview with Leo Laporte on Apr 25th 2016, which claims to be "Part 2". I have searched all over for part 1 but have not been able to find it. Then Mondo 2000 published a transcript of that part of the interview on June 18 2018, and I think a few other publications repeated it around that time.

And later on Feb 4, 2019 he gave a live talk to Brad Myers' "05-640: Interaction Techniques" user interface design class at CMU, during which he read the transcript.

http://www.cs.cmu.edu/~bam/uicourse/05440inter2019/schedule....

It's well worth watching that interview. He went over and explained all of his amazing Polaroids of Lisa development, which I don't think have ever been published anywhere else.

See Bill Atkinson's Lisa development polaroids:

http://www.cs.cmu.edu/~bam/uicourse/05440inter2019/Bill_Atki...

Then at 1:03:15 a student asked him the million dollar question: what was the impetus and motivation behind HyperCard? He chuckled, reached for the transcript he had off-camera, and then out of the blue he asked the entire class "How many of you guys have done ... a psychedelic?" (Brad reported "No hands", but I think some may have been embarrassed to admit it in front of their professor). So then Bill launched into reading the transcript of the LSD HyperCard story, and blew all the students' minds.

See video of Bill's talk:

https://scs.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=...

The next week I gave a talk to the same class that Bill had just traumatized by asking if they'd done illegal drugs, and (at 37:11) I trolled them by conspiratorially asking: "One thing I wanted to ask the class: Have any of you ever used ... (pregnant pause) ... HyperCard? Basically, because in 1987 I saw HyperCard, and it fucking blew my mind." Then I launched into my description of how important and amazing HyperCard was.

See video of Don's talk:

https://scs.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=...

Here is an index of all of the videos from Brad Myers' interaction techniques class, including Rob Haitani (Palm Pilot), Shumin Zhai (text input and swipe method), Dan Bricklin (spreadsheets, Demo prototyping tool), Don Hopkins (pie menus), and Bill Atkinson (Mac, HyperCard):

https://scs.hosted.panopto.com/Panopto/Pages/Sessions/List.a...

reply
mistrial9
22 hours ago
[-]
I remember Bill from the halcyon days, surrounded by smoke and mirrors. Amazing individual -- rest in peace.
reply
rcarmo
1 day ago
[-]
Oh. I came here to pass the time as I built a TinyMac with a Pi and was compiling BasiliskII in SDL mode. I'm quite saddened by the news, as Bill was one of the people who had the most influence on the technical design of early Macs (and a brilliant engineer by all accounts).

Why isn't the black bar up atop the site?

reply
dlachausse
1 day ago
[-]
He’s definitely deserving of the black bar.

This post is only an hour old as I’m writing this, so give it time. It’s a weekend, and as far as I’m aware there are only 2 mods, unless there are others empowered to turn on the black bar in their absence.

reply
dakiol
1 day ago
[-]
RIP. It still surprises me that people with resources die so early (he died at 74).
reply
monster_truck
1 day ago
[-]
There isn't an amount of resources in the world that will protect you from cancer, despite what some claim. Like my grandma said, "it is your reward for surviving absolutely everything else that could have got you" (she beat 3 different kinds of cancer before losing to a 4th, with 'resources')
reply
melling
1 day ago
[-]
Pancreatic cancer. Still quite deadly. It has been 17 years since Randy Pausch’s The Last Lecture.

https://youtu.be/ji5_MqicxSo?si=TlgWzgQ7bD3Usvu3

reply
mixmastamyk
1 day ago
[-]
Steve also, correct? Wonder if it has anything to do with the chemical dumping in silicon valley.
reply
movingontonext
22 hours ago
[-]
Jeff Raskin too. Three key people at various points in the original Macintosh’s development.
reply
melling
21 hours ago
[-]
There were a lot of Superfund sites in Silicon Valley.

https://www.nytimes.com/2018/03/26/lens/the-superfund-sites-...

reply
mitchbob
1 day ago
[-]
Bill pushed himself to his limits. I saw this first hand at General Magic, and heard the stories about the development of the Macintosh. People can wear themselves out.
reply
andoando
1 day ago
[-]
I wouldn't consider 74 early.
reply
melling
1 day ago
[-]
It is early.

“A 60-year-old male in the US can expect to live until about age 82”

Pancreatic cancer usually is hard to detect until it has reached an advanced stage. We really should invest more into research.

reply
saalweachter
1 day ago
[-]
It's not "he was so young", but it's still a few years shy of "he had a good, long life" IMO.
reply
IncreasePosts
1 day ago
[-]
Unless you're getting frequent preventative screenings, pancreatic cancer can be one of those that show no symptoms until you're already in stage 4. And most normal doctors will tell you not to do large amounts of preventative screening.
reply
throw_m239339
1 day ago
[-]
> RIP. It still suprises me that people with resources die so early (he died at 74).

You don't know how long he had that disease; if anything, resources might have afforded him many more years of life in the first place. So your comment strikes me as odd, given that you can't judge how long he lived with it.

A friend's dad died from the same kind of cancer. Only two months passed between the diagnosis and his death, and that person had plenty of "resources"...

reply
busymom0
1 day ago
[-]
For a friend of mine, diagnosis to death was less than a week. It all happened so fast, they couldn't process what had just happened.

It happened during a family reunion for Christmas, so at least everyone was present.

reply
deadbabe
1 day ago
[-]
Resources only help you reach your genetic potential, but if you’re just not built for longevity you still may not live long.

And some people with no resources, no reason to live, but have incredible genetics will linger for many years beyond what people think is possible, like a weed.

reply
jamessinghal
1 day ago
[-]
People without resources or purpose are a weed?
reply
accrual
23 hours ago
[-]
Like a weed, in the sense of living in spite of one's circumstances. For example, a person with limited resources living a long time is like a weed with little sunlight still growing from a crack in concrete.
reply
deadbabe
18 hours ago
[-]
In contrast to some house plants that are meticulously watered and cared for in perfect conditions, and yet they still die in a week.
reply
mindslight
1 day ago
[-]
Life is not guaranteed. Once you've seen it happen a few times, you realize how stochastic death really is (or really, how stochastic living is). 74 is at least not the territory where people generally gasp at how young he was.
reply
djmips
1 day ago
[-]
reply