
Creating/comparing Lab-profiled colors in iOS

For an application I'm working on, I'm trying to create and compare (i.e., compute the distance between) CIE Lab-profiled colors.

I know there are ways of converting from the typical RGB to Lab with some math, but I'd like to do this in a more elegant way if possible.

   

Creating CIE LAB Colors

So far I've tried the following… (The white/black point and range numbers come from here)

    // Grab the RGBA components of an sRGB UIColor: {r, g, b, alpha}.
    const CGFloat *components = CGColorGetComponents([UIColor colorWithRed:.11 green:.33 blue:.55 alpha:1.0].CGColor);

    CGFloat whitePoint[3] = {0.95, 1.0, 1.09};
    CGFloat blackPoint[3] = {0.0, 0.0, 0.0};
    CGFloat range[4]      = {-127.0, 127.0, -127.0, 127.0}; // a* and b* ranges
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateLab(whitePoint, blackPoint, range);

    // Note: this reinterprets the first three RGBA components as L*, a*, b*.
    CGColorRef labColorRef = CGColorCreate(colorSpaceRef, components);
    [self.someArray addObject:[UIColor colorWithCGColor:labColorRef]];
    CGColorRelease(labColorRef);
    CGColorSpaceRelease(colorSpaceRef);

I'm not sure this is the right way to create colors in this space, especially since the RGB space UIColor uses is more restricted than Lab in the first place! Additionally, when converting from CGColor to UIColor, do I lose my color space in the process? (The largest color space in the image is Lab.)
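One way to answer the round-trip question empirically is to inspect the color-space model behind the resulting UIColor. A small sketch (someLabUIColor is a stand-in for one of the colors created above):

    // Check which color-space model backs the UIColor's CGColor.
    CGColorSpaceModel model =
        CGColorSpaceGetModel(CGColorGetColorSpace(someLabUIColor.CGColor));
    BOOL stillLab = (model == kCGColorSpaceModelLab);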

Something else to think about: can iDevice screens properly display Lab colors, given that their pixels are RGB?

Essentially, what I am looking for is the right way to generate random colors inside the Lab space.


Comparing CIE LAB Colors

Then, once I have these Lab colors, how can I get the three values (L, a, and b) to compute the distance between one color and another? Once I've created a color and encapsulated it in a UIColor, will reading and comparing the CGComponents of the UIColor maintain the accuracy I'm looking for? (I've done this before in RGB space by taking a Euclidean distance, and it should be the same concept here, except that in Lab the distance accounts for human perceptual differences, which is what I'm striving for.)
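For reference, here is a minimal sketch of that comparison as a CIE76 distance (plain Euclidean distance in L*a*b*), assuming both UIColors really were created in a Lab color space so that CGColorGetComponents returns {L*, a*, b*, alpha}; the helper name deltaE76 is my own:

    #include <math.h>

    // CIE76 color difference: plain Euclidean distance in L*a*b*.
    // lab1/lab2 each point at a Lab color's components: {L*, a*, b*, alpha}.
    static CGFloat deltaE76(const CGFloat *lab1, const CGFloat *lab2) {
        CGFloat dL = lab1[0] - lab2[0];
        CGFloat da = lab1[1] - lab2[1];
        CGFloat db = lab1[2] - lab2[2];
        return sqrt(dL * dL + da * da + db * db);
    }

    // Usage: read the components straight off the CGColors and compare.
    const CGFloat *lab1 = CGColorGetComponents(color1.CGColor);
    const CGFloat *lab2 = CGColorGetComponents(color2.CGColor);
    CGFloat distance = deltaE76(lab1, lab2);

Note that CIE76 is only a first approximation of perceptual difference; ΔE*94 and CIEDE2000 refine it if you need better perceptual uniformity.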

If you've read this far, I thank you, and I appreciate any input you may have or similar experiences you can share!


Firstly, your picture is misleading: it shows out-of-gamut colors as in-gamut colors, it shows different lightness for different chroma, and it lacks an embedded color profile.

Secondly, iOS devices have so far not been color-calibrated; they are used in so many different environments that calibration would not be very useful. You therefore cannot expect to calculate exactly which color a user will perceive. I treat them as sRGB displays.

Thirdly, not all CIELAB colors can be converted to valid RGB values, regardless of the RGB primaries.

If you wish to uniformly sample the CIELAB space for valid colors, I would suggest generating random L*a*b* coordinates and discarding the out-of-gamut ones (rejection sampling):

    CGFloat r, g, b;
    while (1) {
        // Sample uniformly over the full CIELAB coordinate ranges.
        CGFloat L     = random_from_range(0.0, 100.0);
        CGFloat aStar = random_from_range(-128.0, 128.0);
        CGFloat bStar = random_from_range(-128.0, 128.0);
        CIELAB_to_sRGB(L, aStar, bStar, &r, &g, &b);
        // Accept the sample only if it lands inside the sRGB gamut.
        if (0.0 <= r && r <= 1.0 && 0.0 <= g && g <= 1.0 && 0.0 <= b && b <= 1.0) {
            break;
        }
    }
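
CIELAB_to_sRGB and random_from_range are left as placeholders above; here is a sketch of what they might look like, assuming a D65 white point and the standard sRGB matrix (not a production-grade implementation):

    #include <math.h>
    #include <stdint.h>
    #include <stdlib.h>

    // One possible uniform sampler, built on arc4random().
    static CGFloat random_from_range(CGFloat lo, CGFloat hi) {
        return lo + (hi - lo) * (CGFloat)arc4random() / UINT32_MAX;
    }

    // Gamma-encode one linear sRGB channel (IEC 61966-2-1).
    static CGFloat srgbGamma(CGFloat c) {
        return c <= 0.0031308 ? 12.92 * c : 1.055 * pow(c, 1.0 / 2.4) - 0.055;
    }

    // Sketch of L*a*b* (D65) -> sRGB. For out-of-gamut colors the outputs fall
    // outside [0, 1], which is exactly what the rejection loop above tests for.
    static void CIELAB_to_sRGB(CGFloat L, CGFloat aStar, CGFloat bStar,
                               CGFloat *r, CGFloat *g, CGFloat *b) {
        // 1. L*a*b* -> XYZ, using the D65 reference white (0.95047, 1.0, 1.08883).
        CGFloat fy = (L + 16.0) / 116.0;
        CGFloat fx = fy + aStar / 500.0;
        CGFloat fz = fy - bStar / 200.0;
        const CGFloat eps = 216.0 / 24389.0, kappa = 24389.0 / 27.0;
        CGFloat X = 0.95047 * (fx*fx*fx > eps ? fx*fx*fx : (116.0 * fx - 16.0) / kappa);
        CGFloat Y = 1.00000 * (L > kappa * eps ? fy*fy*fy : L / kappa);
        CGFloat Z = 1.08883 * (fz*fz*fz > eps ? fz*fz*fz : (116.0 * fz - 16.0) / kappa);

        // 2. XYZ -> linear sRGB (standard D65 matrix).
        CGFloat rl =  3.2406 * X - 1.5372 * Y - 0.4986 * Z;
        CGFloat gl = -0.9689 * X + 1.8758 * Y + 0.0415 * Z;
        CGFloat bl =  0.0557 * X - 0.2040 * Y + 1.0570 * Z;

        // 3. Linear -> gamma-encoded sRGB.
        *r = srgbGamma(rl);
        *g = srgbGamma(gl);
        *b = srgbGamma(bl);
    }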