I'm using the AVFoundation framework. In my sample buffer delegate I have the following code: -(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuf
How do you adjust contrast or sharpness using the Core Image framework? Which filters should be used, and why?
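Core Image does ship filters for both operations: CIColorControls (its inputContrast key) for contrast, and CISharpenLuminance or CIUnsharpMask for sharpness. The contrast adjustment itself is simple per-channel math, scaling each value about the mid-gray point; here is a minimal sketch of that math in Python for illustration (the function name and clamping to [0, 1] are my own choices, not part of any Core Image API):

```python
def adjust_contrast(pixels, contrast):
    """Scale each normalized channel value about the 0.5 midpoint,
    roughly what a contrast slider like CIColorControls' inputContrast does.
    contrast > 1.0 increases contrast, contrast < 1.0 flattens it."""
    return [min(1.0, max(0.0, (p - 0.5) * contrast + 0.5)) for p in pixels]
```

With contrast 2.0, a dark value 0.25 clamps down to 0.0 and a bright value 0.75 saturates to 1.0, while mid-gray 0.5 is unchanged.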
I'm using the new Core Image APIs in iOS 5 that do auto enhancements. However, the array returned from autoAdjustmentFilters or autoAdjustmentFiltersWithOptions never remove
I have a requirement to implement functionality that draws the edge of an object in a given image. So, when the user taps on an object in the image, it should draw that object's edge. For example, if I have an image which ha
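Whatever the tap-handling looks like, the edge-drawing step usually starts from a gradient-based edge detector. As an illustrative sketch (not the questioner's actual approach), here is the classic Sobel gradient magnitude in Python, operating on a grayscale image represented as a list of rows; Core Image's CIEdges filter produces a comparable edge map on the GPU:

```python
def sobel_magnitude(img):
    """Gradient magnitude of a 2-D grayscale image via the 3x3 Sobel kernels.
    Border pixels are left at 0 for simplicity."""
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

A vertical step edge (0s on the left, 1s on the right) produces a strong response exactly along the step and zero response in flat regions.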
I am using Apple's Core Image Filter Reference. It references: "A CIVector class whose attribute type is CIAttributeTypeRectangle and whose display name is Rectangle."
I seem to be tying myself up in knots trying to read up on all of the different ways you can represent images in a Cocoa app for OS X.
Now that Apple has ported the Core Image framework over to iOS 5.0, I'm wondering: is Core Image fast enough to apply live filters and effects to camera video?
What is the fastest way to trim an image (NSImage or CGImageRef) so that all transparent areas around the image are removed?
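However the pixels are read out (for a CGImageRef that typically means rendering into a CGBitmapContext first), the core computation is just the bounding box of all pixels with non-zero alpha. A minimal sketch of that step in Python, taking the alpha channel as a list of rows (the function name and the (x, y, w, h) return convention are my own assumptions):

```python
def opaque_bounds(alpha):
    """Return (x, y, width, height) of the smallest rectangle containing
    every pixel with non-zero alpha, or None if the image is fully transparent."""
    rows = [y for y, row in enumerate(alpha) if any(row)]
    cols = [x for x in range(len(alpha[0])) if any(row[x] for row in alpha)]
    if not rows:
        return None
    return (cols[0], rows[0], cols[-1] - cols[0] + 1, rows[-1] - rows[0] + 1)
```

The resulting rectangle can then be handed to a single crop call (e.g. CGImageCreateWithImageInRect), which is far cheaper than compositing.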
My application reads and resizes images that are loaded from the internet, and unfortunately I can't control the creation of these images.
I want to implement the seam carving algorithm by Avidan/Shamir. After the energy-computing stage, which can be implemented using a Core Image filter, I need to compute the seams with the lowest energy
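The seam-finding stage is a plain dynamic-programming pass over the energy map, which is awkward to express as a Core Image filter and is usually done on the CPU. A minimal sketch in Python (the energy map is a list of rows; function name and return convention are my own):

```python
def min_vertical_seam(energy):
    """Find the vertical seam with minimal total energy via dynamic programming.
    Returns one column index per row, top to bottom; each step moves at most
    one column left or right, as in Avidan/Shamir's formulation."""
    h, w = len(energy), len(energy[0])
    # cost[y][x] = minimal energy of any seam from the top row down to (x, y)
    cost = [energy[0][:]]
    for y in range(1, h):
        prev = cost[y - 1]
        cost.append([energy[y][x] + min(prev[max(0, x - 1):min(w, x + 2)])
                     for x in range(w)])
    # backtrack from the cheapest cell in the bottom row
    x = min(range(w), key=lambda i: cost[-1][i])
    seam = [x]
    for y in range(h - 2, -1, -1):
        x = min(range(max(0, x - 1), min(w, x + 2)), key=lambda i: cost[y][i])
        seam.append(x)
    return seam[::-1]
```

Removing the returned seam (one pixel per row) and repeating narrows the image by one column per iteration while avoiding high-energy regions.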