sRGB profile comparison (ninedegreesbelow.com) | 48 points | 3 days ago | 6 comments
LegionMammal978
3 hours ago
sRGB has bugged me from the start, since it's not even clear to me which matrix to actually use to convert between linear sRGB colors and XYZ colors. I count at least 3 different matrices in IEC 61966-2-1, each of which I have seen people treat as the true version:

1. The matrix implied by the reference primaries in Table 1: [X; Y; Z] = [506752/1228815, 87881/245763, 12673/70218; 87098/409605, 175762/245763, 12673/175545; 7918/409605, 87881/737289, 1001167/1053270]*[R; G; B].

2. The matrix in section 5.2: [X; Y; Z] = [1031/2500, 447/1250, 361/2000; 1063/5000, 447/625, 361/5000; 193/10000, 149/1250, 1901/2000]*[R; G; B].

3. The inverse of the matrix in section 5.3: [X; Y; Z] = [248898325000/603542646087, 71938950000/201180882029, 36311670000/201180882029; 128304856250/603542646087, 143878592500/201180882029, 14525360000/201180882029; 11646692500/603542646087, 23977515000/201180882029, 191221850000/201180882029]*[R; G; B].

The distinction starts to matter for 16-bit color. The CSS people seem to take the position that the matrix implied by primaries is the true version, but meanwhile, the same document's Annex F (in Amd. 1) seems to suggest that the 5.2 matrix is the true version, and that the 5.3 matrix should be rederived to the increased precision. There's no easy way to decide, as far as I can tell.
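How much the Table 1 matrix and the rounded 5.2 matrix disagree can be checked with exact rational arithmetic. A quick sketch (the fractions are the ones quoted above, both written row-major; nothing else is assumed):

```python
from fractions import Fraction as F

# Matrix implied by the Table 1 reference primaries
M1 = [[F(506752, 1228815), F(87881, 245763), F(12673, 70218)],
      [F(87098, 409605), F(175762, 245763), F(12673, 175545)],
      [F(7918, 409605), F(87881, 737289), F(1001167, 1053270)]]

# Four-decimal matrix printed in section 5.2
M2 = [[F(1031, 2500), F(447, 1250), F(361, 2000)],
      [F(1063, 5000), F(447, 625), F(361, 5000)],
      [F(193, 10000), F(149, 1250), F(1901, 2000)]]

# Worst per-entry disagreement, computed exactly
worst = max(abs(M1[i][j] - M2[i][j]) for i in range(3) for j in range(3))
print(float(worst))          # a few 1e-5
print(float(worst) * 65535)  # more than one 16-bit code value
```

The largest entry-wise gap is a few parts in 10^5, which is invisible at 8 bits but spans more than one code value at 16 bits, which is exactly why the choice of matrix starts to matter there.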

Meanwhile, I agree with the author that the ICC's black-point finagling in their published profiles has not helped with the confusion over what exactly sRGB colors are supposed to map to.

kurthr
52 minutes ago
If you're using sRGB with 16-bit color, you already have problems. It is an 8-bit-per-channel hack that worked perfectly well with CRTs and early LCDs. Different vendors shipped multiple hacky variants that were visually indistinguishable on displays of the day.

Even most modern displays are not really capable of more than 10-bit color (RGB miniLED and QD-OLED barely are). Even Rec. 2020 doesn't need 16-bit.

sRGB doesn't even have a consistent gamma, and it doesn't come close to covering the color volume uniformly. Why use it? DCI-P3 works fine.
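The "no consistent gamma" point is easy to see numerically: the sRGB decode curve is a linear toe stitched onto an offset 2.4-power segment, so the effective exponent drifts with signal level. A minimal sketch (the piecewise curve is the standard one; the probe values are arbitrary):

```python
import math

def srgb_decode(v):
    """sRGB encoded [0,1] -> linear: linear toe below 0.04045, power segment above."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

for v in (0.02, 0.1, 0.5, 0.9):
    lin = srgb_decode(v)
    gamma = math.log(lin) / math.log(v)  # the g such that lin == v**g
    print(f"encoded {v}: effective gamma {gamma:.3f}")
```

The effective gamma runs from roughly 1.65 in the toe up to about 2.27 near white, so neither "gamma 2.2" nor "gamma 2.4" describes the whole curve.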

bflesch
1 hour ago
It's perfectly fine for fingerprinting, though. Innocuous artifacts in file formats, such as custom matrices, digits in the seventh decimal place of a floating-point number, or millisecond-precision timestamps, allow identification and cross-referencing of internet users.

Just last week I noticed that when a Reddit user uploads a macOS screenshot to a Reddit post as a PNG, the PNG still contains uniquely identifying information about the monitor attached to the macOS system and when it was last calibrated. You can deduce which type of MacBook they are using from the screen resolution, and see when they switched machines once a different monitor-calibration timestamp shows up. All from a single PNG image the user uploaded themselves. And if those two pieces of information are not in the PNG, you know they must be a Windows or Linux user.
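The monitor information rides along in the PNG's iCCP chunk, which carries a named, embedded color profile. A sketch of walking the chunk list to pull out that profile name; the bytes here are synthetic and the profile name is made up for illustration:

```python
import struct, zlib

def png_chunks(data):
    """Yield (chunk_type, chunk_data) for each chunk in a PNG byte string."""
    assert data[:8] == b"\x89PNG\r\n\x1a\n"
    pos = 8
    while pos < len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        yield ctype.decode("ascii"), data[pos + 8:pos + 8 + length]
        pos += 12 + length  # length + type + data + CRC

def chunk(ctype, payload):
    return (struct.pack(">I", len(payload)) + ctype + payload
            + struct.pack(">I", zlib.crc32(ctype + payload)))

# Toy PNG: signature, 1x1 IHDR, an iCCP chunk carrying a profile name, IEND.
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)
iccp = b"Calibrated Display-2024\x00" + b"\x00" + zlib.compress(b"fake profile")
png = (b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
       + chunk(b"iCCP", iccp) + chunk(b"IEND", b""))

for ctype, data in png_chunks(png):
    if ctype == "iCCP":
        name = data.split(b"\x00", 1)[0].decode("latin-1")
        print("embedded profile name:", name)
```

On a real screenshot, the interesting bits are the profile name and the timestamps inside the compressed ICC blob itself, both of which can be display-specific.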

It's these small breadcrumbs all over the place which make forensics so interesting.

gpvos
3 hours ago
I would have loved to find this page back when I was adapting a PDF-generating program to conform to PDF/A (which requires a colour profile in some cases). I found several sRGB profiles and could see that they differed, but knowing almost nothing about them, I just chose the one that seemed to come from the most authoritative source (I forget which). This page must have existed then, actually.
grvbck
2 hours ago
It is a rabbit hole. I just checked the latest release of GIMP (3.2.4). The "GIMP built-in sRGB" profile is supposed to be a functional match to the ArgyllCMS sRGB color space – the true sRGB profile according to the addendum in the above profile comparison.

But if I embed it in a photo and then open the photo in GraphicConverter, it shows up as "sRGB IEC61966-2.1", which to my understanding is identical to Apple’s sRGB Color Space Profile.icm.
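The name GraphicConverter shows comes from the profile's embedded description tag, not from the file name or its colorimetric contents, which is why two functionally different profiles can display the same label. A sketch of pulling that string out of a v2 profile; the byte layout below is a hand-built toy, not a real profile:

```python
import struct

def icc_description(profile):
    """Return the ASCII name from a v2 ICC profile's 'desc' tag (textDescriptionType)."""
    tag_count = struct.unpack(">I", profile[128:132])[0]
    for i in range(tag_count):
        sig, offset, size = struct.unpack(">4sII", profile[132 + 12*i:144 + 12*i])
        if sig == b"desc":
            count = struct.unpack(">I", profile[offset + 8:offset + 12])[0]
            return profile[offset + 12:offset + 12 + count - 1].decode("ascii")
    return None

# Toy profile: zeroed 128-byte header, one tag-table entry pointing at a desc element.
name = b"sRGB IEC61966-2.1\x00"
desc = b"desc" + b"\x00" * 4 + struct.pack(">I", len(name)) + name
tag_table = struct.pack(">I", 1) + struct.pack(">4sII", b"desc", 144, len(desc))
profile = bytes(128) + tag_table + desc
print(icc_description(profile))  # sRGB IEC61966-2.1
```

So GIMP's built-in profile may simply carry the same description string as Apple's, even if the two are not byte-identical.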

But that's an sRGB v2 profile. Should I download and use a v4 profile instead? Or download the ArgyllCMS sRGB.icm [1] and convert all photos to it? Or just select the Apple default sRGB profile everywhere?

I'm not a pro and don't have a calibrated display, but it annoys me when photos I upload online look vastly different in my browser from how they look in my editing software on the same display.

[1] https://argyllcms.com/icclibsrc.html

magicalhippo
4 hours ago
From 2012, updated 2015, it says. It would have been interesting to have a recent update to compare against.
smallstepforman
1 hour ago
No mention of BT.601, BT.709, BT.2020, BT.2100, etc. He did mention the P and D profiles. Unorm vs. linear.

There is always a historical reason for a colour profile; sadly, most software avoids the terminology like the plague.

voidUpdate
3 hours ago
Wow, I'm glad I'm not a graphic designer. My head hurts just trying to understand this. I just pick the colours that look good to me.
kllrnohj
2 hours ago
Graphic designers don't really see any of this, either. It's going to be photo junkies or people working on image-processing systems (either building them or using them) who have to deal with this.

But for the most part this shouldn't really matter much. A huge number of things these days are properly color managed, so as long as the thing that wrote the profile actually wrote what it meant, it'll display just fine regardless of how many different "sRGB" profiles are floating around. We're largely past the days of just hoping that the image and the display happen to agree on roughly the same colors.

gpvos
1 hour ago
The problem, as I described in another comment, is that the average programmer doesn't know enough about colour spaces, and sometimes must choose a colour profile without knowing or understanding what they actually want. They can figure out that an "sRGB" profile is probably what they want, but then there shouldn't be such a plethora of different versions of it, since choosing between them is impossible for anyone not in the know.
esafak
2 hours ago
This is like the stuff Linux users had to endure in the bad old days of setting up drivers: concerns of twenty years ago. I remember the days people compared their colorimeters and profiled their own monitors. I'm too old for this.
kllrnohj
2 hours ago
> I remember the days people compared their colorimeters and profiled their own monitors.

That would be calibration, and it's still necessary if you want color accuracy. It's about ensuring that what your monitor thinks it's displaying and what it's actually physically emitting are the same. The main thing that's changed is that factory calibration has become a lot more common and is often more than good enough for anything short of serious professional work, even on displays that aren't professional. Most flagship and even midrange smartphones are factory calibrated with dE values that would make reference monitors from 20 years ago blush. Right up until the OEM intentionally shoves a gaudy color curve on top to make it "pop" or look more "vibrant" (Samsung calls this "Vivid", Pixel calls it "Adaptive", etc.), though they at least usually have a "Natural" option that gets you back to the properly calibrated display.
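For reference, those dE numbers are colour differences in CIELAB; the simplest version (CIE76) is just Euclidean distance between two Lab values, with dE around 1 near the threshold of a visible difference. A quick sketch with made-up target and measured values:

```python
# CIE76 colour difference: Euclidean distance in CIELAB (L*, a*, b*).
def delta_e76(lab1, lab2):
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

target = (50.0, 20.0, -10.0)     # what the calibration asked the panel to show
measured = (50.5, 19.0, -9.0)    # what a colorimeter hypothetically read back
print(delta_e76(target, measured))  # 1.5
```

Later formulas (CIE94, CIEDE2000) weight the axes perceptually, but factory calibration reports are often quoted in terms like "average dE < 1" against targets like this.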
