Before HD, almost all video used non-square pixels. DVD is 720x480. SD channels on cable TV systems are 528x480.
Correct. This came from the ITU-R BT.601 standard, one of the first digital video standards, whose authors chose to define digital video as a sampled analog signal. Analog video never had a concept of pixels and operated on lines instead. The rate at which you sampled it could be arbitrary, and it affected only the horizontal resolution. The rate chosen by BT.601 was 13.5 MHz, which resulted in a 10/11 pixel aspect ratio for 4:3 NTSC video and 59/54 for 4:3 PAL.
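A back-of-the-envelope sketch of how that plays out (the 704 and 702 active-sample counts below are my own illustrative assumptions about the visible part of a 720-sample line, not something from the standard text itself): the display aspect ratio is just the storage aspect ratio multiplied by the pixel aspect ratio.

    # Sketch: how a pixel aspect ratio (PAR) turns stored samples into display
    # proportions. Active-sample counts (704 NTSC, 702 PAL = 52 us * 13.5 MHz)
    # are assumptions for illustration only.
    from fractions import Fraction

    def display_aspect(width, height, par):
        # display aspect ratio = storage aspect ratio * pixel aspect ratio
        return Fraction(width, height) * par

    print(float(display_aspect(704, 480, Fraction(10, 11))))  # 1.333... = exactly 4:3 (NTSC)
    print(float(display_aspect(702, 576, Fraction(59, 54))))  # ~1.332, i.e. roughly 4:3 (PAL)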
>SD channels on cable TV systems are 528x480
I'm not actually sure about America, but here in Europe most digital cable and satellite SDTV is delivered as 720x576i 4:2:0 MPEG-2 Part 2. There are some outliers that use 544x576i, however.
Doing my part and sending you some samples of UPC cable from the Czech Republic :)
720x576i 16:9: https://0x0.st/P-QU.ts
720x576i 4:3: https://0x0.st/P-Q0.ts
That one weird 544x576i channel I found: https://0x0.st/P-QG.ts
I also have a few decrypted samples from the Hot Bird 13E, public DVB-T and T2 transmitters and Vectra DVB-C from Poland, but for that I'd have to dig through my backups.
CRTs of course had no fixed horizontal resolution.
Edit: I just realized I forgot about PAL DVDs which were 720x576. But the same principle applies.
My understanding is that televisions would mostly have square/rectangular pixels, while computer monitors often had circular pixels.
Or are you perhaps referring to pixel aspect ratios instead?
For example, in the case of a "4:3 720x480" frame, a quick test: 720/4 = 180 and 480/3 = 160. Different results, which means the pixels in this frame are not square, just rectangular. (Comparing 720/480 vs. 4/3 works too, of course.)
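A rough sketch of that same test in Python, turned around to compute the pixel aspect ratio directly (the frame sizes are just the examples already mentioned in this thread):

    # Pixels are square exactly when the display aspect ratio (DAR) equals the
    # storage aspect ratio (SAR); otherwise PAR = DAR / SAR tells you how much
    # each pixel is stretched.
    from fractions import Fraction

    def pixel_aspect(width, height, dar):
        sar = Fraction(width, height)
        return dar / sar  # PAR; == 1 means square pixels

    print(pixel_aspect(720, 480, Fraction(4, 3)))     # 8/9 -> non-square (full-width DVD NTSC frame)
    print(pixel_aspect(1920, 1080, Fraction(16, 9)))  # 1   -> square (HD)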
Some modern films are still shot with anamorphic lenses because the director/DP likes the look, so in the VFX industry we have to deal with plate footage that way: the software handling the images has to account for non-square pixels (to de-squash the image, even though the digital camera sensor pixels that recorded it were square) so that it displays correctly, i.e. so that round things still look round and aren't squashed.
Even to the degree that full CG element renders (i.e. rendered to EXR with a pathtracing renderer) should really use anisotropic pixel filter widths to look correct.
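A toy illustration of the de-squash step, using Pillow purely for brevity (a real pipeline would do this in Nuke or at render time); the 2x squeeze factor and the file names are assumed examples, not from any actual show:

    from PIL import Image

    def desqueeze(path_in, path_out, par=2.0):
        # Stretch the width by the pixel aspect ratio so circles look round again.
        # par=2.0 assumes a 2x anamorphic squeeze; real plates may use other ratios.
        img = Image.open(path_in)
        display_width = round(img.width * par)
        img.resize((display_width, img.height), Image.LANCZOS).save(path_out)

    # desqueeze("plate_squeezed.tif", "plate_display.tif")  # hypothetical file names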
If a video file only stores a single color value for each pixel, why does it care what shape the pixel is when it's displayed? It would be filled in with that single color value regardless.
What you're referring to stems from an assumption made a long time ago by Microsoft, later adopted as a de facto standard by most computer software. The assumption was that the pixel density of every display, unless otherwise specified, was 96 pixels per inch [1].
The value stuck and started being taken for granted, while the pixel density of displays grew well beyond that (a move mostly popularized by Apple's Retina displays). A solution was needed to allow new software to take advantage of the increased detail provided by high-density displays while still accommodating legacy software written exclusively for 96 PPI. This resulted in the decoupling of "logical" pixels from "physical" pixels, with the logical resolution most commonly defined as "what the resolution of the display would be given its physical size and a PPI of 96" [2], and the physical resolution representing the actual number of pixels. The 100x100 and 200x200 values in your example are respectively the logical and physical resolutions of your screenshot.
Different software vendors refer to these "logical" pixels differently, but the names you're going to encounter most often are points (Apple), density-independent pixels ("DPs", Google), and device-independent pixels ("DIPs", Microsoft). The value of 96, while the most common, is also not a standard per se: Android uses 160 PPI as its base, and Apple for a long time used 72.
[1]: https://learn.microsoft.com/en-us/archive/blogs/fontblog/whe...
[2]: https://developer.mozilla.org/en-US/docs/Web/API/Window/devi...
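A rough sketch of that mapping in Python (the 96 and 160 baselines are the ones mentioned above; the display densities are made-up examples):

    # Sketch: deriving logical pixels from physical pixels, assuming 96 PPI as
    # the "1x" baseline. The PPI values passed in below are illustrative only.
    def logical_resolution(phys_w, phys_h, ppi, base_ppi=96):
        scale = ppi / base_ppi  # e.g. 2.0 on an assumed 192 PPI panel
        return round(phys_w / scale), round(phys_h / scale)

    print(logical_resolution(200, 200, 192))                  # (100, 100) -- the screenshot example at 2x
    print(logical_resolution(3840, 2160, 192))                # (1920, 1080)
    print(logical_resolution(1080, 1920, 480, base_ppi=160))  # (360, 640), Android-style 3x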
From what I recall only Microsoft had problems with this, and specifically on Windows. You might be right about software that was exclusive to desktop Windows. I don't remember having scaling issues even on other Microsoft products such as Windows Mobile.
If my memory serves, it was Apple that popularized high pixel density in displays with the iPhone 4. They weren't the first to use such a display [1], but certainly the ones to start a chain reaction that resulted in phones adopting crazy resolutions all the way up to 4K.
It's the desktop software that mostly had problems scaling. I'm not sure about Windows Mobile. Windows Phone and UWP have adopted an Android-like model.
[1]: https://en.wikipedia.org/wiki/Retina_display#Competitors