When square pixels aren't square
52 points | 4 hours ago | 6 comments | alexwlchan.net | HN
drmpeg
3 hours ago
[-]
> Videos with non-square pixels are pretty rare...

Before HD, almost all video was non-square pixels. DVD is 720x480. SD channels on cable TV systems are 528x480.

reply
m132
2 hours ago
[-]
>Before HD, almost all video was non-square pixels

Correct. This came from the ITU-R BT.601 standard, one of the first digital video standards, whose authors chose to define digital video as a sampled analog signal. Analog video never had a concept of pixels; it operated on lines instead. The rate at which you sampled it could be arbitrary and affected only the horizontal resolution. The rate chosen by BT.601 was 13.5 MHz, which resulted in a 10/11 pixel aspect ratio for 4:3 NTSC video and 59/54 for 4:3 PAL.
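
As a rough sketch of where a figure like 10/11 comes from (assuming the commonly used 704-sample active 4:3 picture, which is my assumption, not something BT.601-specific in the parent):

    # Sketch: PAR = display aspect ratio divided by the storage aspect ratio.
    from fractions import Fraction

    def pixel_aspect_ratio(width, height, dar):
        return dar / Fraction(width, height)

    print(pixel_aspect_ratio(704, 480, Fraction(4, 3)))  # 10/11 (4:3 NTSC)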

>SD channels on cable TV systems are 528x480

I'm not actually sure about America, but here in Europe most digital cable and satellite SDTV is delivered as 720x576i 4:2:0 MPEG-2 Part 2. There are some outliers that use 544x576i, however.

reply
drmpeg
2 hours ago
[-]
Here are some captures from my Comcast system here in Silicon Valley.

https://www.w6rz.net/528x480.ts

https://www.w6rz.net/528x480sp.ts

reply
m132
1 hour ago
[-]
Cool!

Doing my part and sending you some samples of UPC cable from the Czech Republic :)

720x576i 16:9: https://0x0.st/P-QU.ts

720x576i 4:3: https://0x0.st/P-Q0.ts

That one weird 544x576i channel I found: https://0x0.st/P-QG.ts

I also have a few decrypted samples from the Hot Bird 13E, public DVB-T and T2 transmitters and Vectra DVB-C from Poland, but for that I'd have to dig through my backups.

reply
badc0ffee
1 hour ago
[-]
Displaying content from a DVD on a panel with square pixels (LCD, plasma, etc.) required stretching or omitting some pixels. For widescreen content you'd need to stretch that 720x480 to 848x480, and for 4:3 content you'd need to stretch it to 720x540, or shrink it to 640x480, depending on the resolution of the panel.

CRTs of course had no fixed horizontal resolution.

Edit: I just realized I forgot about PAL DVDs which were 720x576. But the same principle applies.
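
A minimal sketch of that arithmetic, for anyone who wants to play with it (nothing DVD-specific, just height times the display aspect ratio; the 848 above looks like 853.3 rounded down to a multiple of 16):

    # Square-pixel display width for a stored frame, keeping the height fixed
    # (or the width fixed, for the 720x540 case).
    def display_width(stored_h, dar_w, dar_h):
        return stored_h * dar_w / dar_h

    print(display_width(480, 16, 9))  # 853.3... -> 848 or 854 in practice
    print(display_width(480, 4, 3))   # 640.0
    print(display_width(540, 4, 3))   # 720.0, i.e. 720x480 stretched to 720x540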

reply
binaryturtle
2 hours ago
[-]
Just look at Japanese television… most channels are broadcast at 1440x1080i for 16:9 content instead of the full 1920x1080i (to save bandwidth for other things, I assume), so non-square pixels are still very common with HD too.
reply
ndiddy
1 hour ago
[-]
It may also be due to legacy reasons. Japan was a pioneer in adopting HD TV years before the rest of the world, but early HD cameras and video formats like HDCAM and HDV only recorded 1080i at 1440x1080. If their whole video processing chain is set up for 1440x1080, they’d likely have to replace a lot of equipment to switch over to full 1920x1080i.
reply
ranger_danger
2 hours ago
[-]
I'm confused... what does DVD, SD or any arbitrary frame size have to do with the shape of pixels themselves? Is that not only relevant to the display itself and not the file format/container/codec?

My understanding is that televisions would mostly have square/rectangular pixels, while computer monitors often had circular pixels.

Or are you perhaps referring to pixel aspect ratios instead?

reply
badc0ffee
58 minutes ago
[-]
I'm not 100% sure I understand your question, but in order to display a DVD correctly, you need to either display the pixels stored in the video stream wider than they are tall (for widescreen), or narrower than they are tall (for 4:3). Displaying those pixels 1:1 on a display with square pixels would never be correct for DVD video.
reply
binaryturtle
2 hours ago
[-]
A square pixel has a 1:1 aspect ratio (its width is the same as its height). Any rectangular pixel whose width differs from its height is considered "non-square".

F.ex. in the case of a "4:3 720x480" frame… a quick test: 720/4=180 and 480/3=160… 180 vs. 160… different results… which means the pixels in this frame are not square, just rectangular. Alternatively, comparing 720/480 with 4/3 works too, of course.
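
Or the same test as a tiny sketch, just comparing the storage aspect ratio to the display aspect ratio:

    from fractions import Fraction

    def pixels_are_square(w, h, dar_w, dar_h):
        # square pixels iff the storage aspect ratio equals the display aspect ratio
        return Fraction(w, h) == Fraction(dar_w, dar_h)

    print(pixels_are_square(720, 480, 4, 3))     # False -> non-square pixels
    print(pixels_are_square(1920, 1080, 16, 9))  # True  -> square pixels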

reply
ranger_danger
1 hour ago
[-]
Again I think you're talking about pixel aspect ratios instead, and not physically non-square pixels, which would be display-dependent. OP only said "square pixels" but then only talked about aspect ratios, hence my confusion.
reply
formerly_proven
39 minutes ago
[-]
Dots on a CRT are not pixels. Their shape depends on the shadow mask.
reply
a012
2 hours ago
[-]
I’m no expert but this sounds like a digital version of the anamorphic lens/system, doesn’t it?
reply
pixelesque
1 hour ago
[-]
It is.

Some modern films are still shot with anamorphic lenses because the director / DP likes that look, so in the VFX industry we have to deal with plate footage shot that way. That means handling non-square pixels in the software working with the images, de-squashing the image for display (even though the digital camera sensor pixels that recorded it were square) so that round things still look round and aren't squashed.

Even to the degree that full CG element renders (i.e. rendered to EXR with a pathtracing renderer) should really use anisotropic pixel filter widths to look correct.
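
A minimal sketch of that de-squash step with Pillow, assuming a hypothetical 2x anamorphic squeeze and placeholder file names (real pipelines do this in the compositing and review tools rather than PIL, but the math is the same):

    from PIL import Image

    PIXEL_ASPECT = 2.0  # assumed squeeze factor of the anamorphic lens

    img = Image.open("plate_frame.tif")  # placeholder path
    w, h = img.size
    # Stretch the width by the pixel aspect ratio so circles look round again.
    display = img.resize((round(w * PIXEL_ASPECT), h), Image.LANCZOS)
    display.save("plate_frame_display.tif")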

reply
shrinks99
1 hour ago
[-]
Yes, and when working with footage shot with anamorphic lenses, you have to render it with non-square pixels mapped onto the square pixels of our screens to view it at its intended aspect ratio. This is done either at the beginning (conforming the footage before sending it to editorial / VFX) or at the end (conforming to square pixels as a final step) of the post-production workflow, depending on the show.
reply
alberth
2 hours ago
[-]
Am I missing the obvious, or is the author just messing with the aspect ratio?
reply
ranger_danger
2 hours ago
[-]
Yes I think they are conflating square pixels with square pixel aspect ratios.

If a video file only stores a single color value for each pixel, why does it care what shape the pixel is when it's displayed? It would be filled in with that single color value regardless.

reply
drob518
2 hours ago
[-]
Proving that everything is more complicated than you first think it is when you lift up a corner of the rug.
reply
sbondaryev
2 hours ago
[-]
This reminded me of Retina screenshots on the Mac: selecting a 100×100 area can produce a 200×200 file. Different cause, but same idea: the stored pixels don't always match what you see on screen.
reply
m132
2 hours ago
[-]
This is indeed similar in effect, but completely different in cause, to the phenomenon referenced in the article (device pixel ratio vs. pixel aspect ratio).

What you're referring to stems from an assumption made a long time ago by Microsoft, later adopted as a de facto standard by most computer software. The assumption was that the pixel density of every display, unless otherwise specified, was 96 pixels per inch [1].

The value stuck and started being taken for granted, while the pixel density of displays grew well beyond it, a shift mostly popularized by Apple's Retina displays. A solution was needed to let new software take advantage of the extra detail on high-density displays while still accommodating legacy software written exclusively for 96 PPI. The result was the decoupling of "logical" pixels from "physical" pixels, with the logical resolution most commonly defined as "what the resolution of the display would be given its physical size and a PPI of 96" [2], and the physical resolution representing the real number of pixels. The 100x100 and 200x200 values in your example are, respectively, the logical and physical resolutions of your screenshot.

Different software vendors refer to these "logical" pixels differently, but the names you're most likely to encounter are points (Apple), density-independent pixels ("DPs", Google), and device-independent pixels ("DIPs", Microsoft). The value of 96, while the most common, is also not a standard per se: Android uses 160 PPI as its base, and Apple long used 72.
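
As a toy illustration of that split, using the numbers from the screenshot example above (the 2x scale factor and the 192 PPI panel are my own example values):

    def to_physical(logical_w, logical_h, scale):
        # logical ("point") size -> physical pixel size
        return logical_w * scale, logical_h * scale

    def to_logical(physical_w, physical_h, ppi, base_ppi=96):
        # "what the resolution would be at base_ppi", per the definition above
        scale = ppi / base_ppi
        return round(physical_w / scale), round(physical_h / scale)

    print(to_physical(100, 100, 2))     # (200, 200), the Retina screenshot case
    print(to_logical(2560, 1440, 192))  # (1280, 720) for a hypothetical 192 PPI panel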

[1]: https://learn.microsoft.com/en-us/archive/blogs/fontblog/whe...

[2]: https://developer.mozilla.org/en-US/docs/Web/API/Window/devi...

reply
sublinear
1 hour ago
[-]
I might be misunderstanding what you're saying, but I'm pretty sure print and web were already more popular than anything Apple did. The need to be aware of output size and scale pixels was not at all uncommon by the time retina displays came out.

From what I recall only Microsoft had problems with this, and specifically on Windows. You might be right about software that was exclusive to desktop Windows. I don't remember having scaling issues even on other Microsoft products such as Windows Mobile.

reply
m132
1 hour ago
[-]
Print was always density-independent, but that didn't translate into high-density displays. The web, at least as I remember it, was for the longest time "best viewed in Internet Explorer at 800x600", and later 1024x768, until vector-based Flash came along :)

If my memory serves, it was Apple that popularized high pixel density in displays with the iPhone 4. They weren't the first to use such a display [1], but certainly the ones to start a chain reaction that resulted in phones adopting crazy resolutions all the way up to 4K.

It's the desktop software that mostly had problems scaling. I'm not sure about Windows Mobile. Windows Phone and UWP have adopted an Android-like model.

[1]: https://en.wikipedia.org/wiki/Retina_display#Competitors

reply
fasterik
2 hours ago
[-]
Obligatory "A Pixel Is Not A Little Square"

https://alvyray.com/Memos/CG/Microsoft/6_pixel.pdf

reply