Using iPhone camera as brightness sensor
I'm trying to use the front-facing camera as a brightness sensor (there is no public API for the separate brightness sensor that's used to adjust the screen brightness, apparently).
I've managed to set up a video capture session and grab frames from the video using AVCaptureVideoDataOutput, and calculate the brightness from each frame. However, the camera is constantly adjusting its exposure settings to compensate for the brightness, which makes perfect sense for recording video, but prevents me from getting actual brightness values.
For example, if I put my finger over the camera, the brightness value drops to 0 quickly, but then after a few seconds it creeps back up again as the camera compensates.
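For reference, the brightness calculation itself is roughly the following (a sketch rather than my exact code; it assumes the video data output is configured for kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, so plane 0 is the luma plane):

// AVCaptureVideoDataOutput delegate callback (sketch).
// Assumes a BiPlanar YUV pixel format, so plane 0 is luma.
- (void)captureOutput:(AVCaptureOutput*)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection*)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t* luma   = (uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t width    = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height   = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
    size_t rowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

    // Average the luma plane to get a crude per-frame brightness (0..255).
    uint64_t total = 0;
    for (size_t y = 0; y < height; y++) {
        for (size_t x = 0; x < width; x++) {
            total += luma[y * rowBytes + x];
        }
    }
    double brightness = (double)total / (double)(width * height);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    NSLog(@"frame brightness: %f", brightness);
}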
So... is there some way to manually set the exposure and disable the automatic adjustment? I tried setting AVCaptureDevice.exposureMode, but it didn't seem to make any difference.
Or, is there a way to get the exposure information from the capture output somehow, so I can appropriately bias my brightness calculation?
UPDATE: I was able to get the EXIF information this way; now I just need to figure out how to bias my brightness calculation.
// Requires ImageIO (CGImageProperties.h) for kCGImagePropertyExifDictionary
NSDictionary* exifDict = (NSDictionary*)CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
NSNumber* exposureTime = [exifDict objectForKey:@"ExposureTime"];
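The idea (an untested sketch; it assumes the same dictionary also carries the standard ISOSpeedRatings entry) would be to divide the measured luma by the exposure time and ISO, so that frames the camera brightened with a longer exposure or higher gain don't read as a brighter scene:

// Sketch: normalize the measured average luma by the camera's exposure
// settings so that auto-exposure compensation roughly cancels out.
static double NormalizedBrightness(double averageLuma, NSDictionary* exifDict)
{
    double exposureTime = [[exifDict objectForKey:@"ExposureTime"] doubleValue]; // seconds
    double iso = [[[exifDict objectForKey:@"ISOSpeedRatings"] objectAtIndex:0] doubleValue];
    if (exposureTime <= 0.0 || iso <= 0.0) {
        return averageLuma; // EXIF not populated yet; fall back to the raw value
    }
    return averageLuma / (exposureTime * iso);
}

It might also be worth checking whether the dictionary contains a BrightnessValue entry (the EXIF APEX brightness), which could make the manual scaling unnecessary.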
Did you remember to call lockForConfiguration: before setting the exposure mode?
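Something along these lines (a sketch; check isExposureModeSupported: as well, since not every camera supports locked exposure):

// Lock exposure on the capture device so it stops auto-compensating.
// Sketch: device is the front-facing AVCaptureDevice you're already using.
static void LockExposure(AVCaptureDevice* device)
{
    NSError* error = nil;
    if ([device isExposureModeSupported:AVCaptureExposureModeLocked] &&
        [device lockForConfiguration:&error]) {
        device.exposureMode = AVCaptureExposureModeLocked;
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock exposure: %@", error);
    }
}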
You can access a variety of metadata using CVBufferGetAttachment on the pixel buffer you get from the sample buffer; it probably includes the exposure status.
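For example, to see what's actually attached before picking specific keys (a sketch; I don't know offhand exactly which keys the camera populates):

// Dump all attachments on the pixel buffer to see what metadata is available.
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CFDictionaryRef attachments = CVBufferGetAttachments(pixelBuffer, kCVAttachmentMode_ShouldPropagate);
NSLog(@"pixel buffer attachments: %@", (NSDictionary*)attachments);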