Invalid initializer with NSRect frame = [self frame]
I'm currently trying to figure out how to alter frames from a webcam for a motion detection game. I am very new to Objective-C, and I have been unable to find a simple way to do this.
My question here is about an error message related to this method:
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    CIContext *myCIContext;
    const NSOpenGLPixelFormatAttribute attr[] = {
        NSOpenGLPFAAccelerated,
        NSOpenGLPFANoRecovery,
        NSOpenGLPFAColorSize, 32,
        0
    };
    NSOpenGLPixelFormat *pf = [[NSOpenGLPixelFormat alloc] initWithAttributes:(void *)&attr];
    myCIContext = [CIContext contextWithCGLContext:CGLGetCurrentContext()
                                       pixelFormat:[pf CGLPixelFormatObj]
                                           options:nil];

    CVImageBufferRef releasedImageBuffer;
    CVBufferRetain(videoFrame);

    CIImage *picture = [CIImage imageWithCVImageBuffer:releasedImageBuffer];
    NSRect frame = [self frame];
    CGRect imageRect;
    imageRect = [picture extent];

    [colorCorrectionFilter setValue:picture forKey:@"inputImage"];
    [effectFilter setValue:[colorCorrectionFilter valueForKey:@"outputImage"] forKey:@"inputImage"];

    // render our resulting image into our context
    [ciContext drawImage:[compositeFilter valueForKey:@"outputImage"]
                 atPoint:CGPointMake((int)((frame.size.width - imageRect.size.width) * 0.5),
                                     (int)((frame.size.height - imageRect.size.height) * 0.5)) // use integer coordinates to avoid interpolation
                fromRect:imageRect];

    @synchronized(self)
    {
        // basically, have the frame to be released refer to the current frame,
        // then update the reference to the current frame with the next frame in the "video stream"
        releasedImageBuffer = mCurrentImageBuffer;
        mCurrentImageBuffer = videoFrame;
    }
    CVBufferRelease(releasedImageBuffer);
}
The compiler reports:
warning: 'MyRecorderController' may not respond to '-frame'
error: invalid initializer
and the line highlighted is
NSRect frame = [self frame];
My header is currently like so:
#import <QuickTime/ImageCompression.h>
#import <QuickTime/QuickTime.h>
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>
#import <OpenGL/OpenGL.h>
#import <QuartzCore/QuartzCore.h>
#import <CoreVideo/CoreVideo.h>
@interface MyRecorderController : NSObject {
    IBOutlet QTCaptureView *mCaptureView;
    IBOutlet NSPopUpButton *videoDevicePopUp;

    NSMutableDictionary *namesToDevicesDictionary;
    NSString *defaultDeviceMenuTitle;
    CVImageBufferRef mCurrentImageBuffer;
    QTCaptureDecompressedVideoOutput *mCaptureDecompressedVideoOutput;

    // filters for CI rendering
    CIFilter *colorCorrectionFilter;  // hue/saturation/brightness control through one CI filter
    CIFilter *effectFilter;           // zoom blur filter
    CIFilter *compositeFilter;        // composites the timecode over the video
    CIContext *ciContext;

    QTCaptureSession *mCaptureSession;
    QTCaptureMovieFileOutput *mCaptureMovieFileOutput;
    QTCaptureDeviceInput *mCaptureDeviceInput;
}
@end
I have looked at tutorial code, and I cannot see what I have done wrong. Judging by that sample code, I should not need to adopt a protocol for this, which is what other websites have suggested. I tried that anyway, and while the project then compiles, at runtime it outputs:
2011-01-18 10:19:11.511 MyRecorder[9972:c903] -[MyRecorderController frame]: unrecognized selector sent to instance 0x1001525f0
2011-01-18 10:19:11.512 MyRecorder[9972:c903] *** Ignoring exception: -[MyRecorderController frame]: unrecognized selector sent to instance 0x1001525f0
Is there anything I have done wrong that causes this? If not, is there a better way to manipulate frames from a webcam (and output them to the screen)?
Thanks heaps!
You are trying to call a method frame on MyRecorderController, which simply doesn't have that method: the class inherits from NSObject, not from NSView, so it has no notion of a frame. Because the compiler cannot find -frame, it treats the unknown selector as returning id, and initializing an NSRect struct from an id is what produces the "invalid initializer" error. Adopting a protocol only silences the compile-time warning; the method still doesn't exist at runtime, which is exactly why you then get the unrecognized selector exception.
Either the class should inherit from NSView, or you need to implement that method yourself.
Ask yourself what frame you mean, and write the appropriate method.
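For example, your controller already owns a QTCaptureView outlet (mCaptureView in your header), and QTCaptureView is an NSView subclass, so one option is to forward to that view's geometry. This is only a minimal sketch under that assumption; whether you want the view's frame or its bounds depends on which coordinate space you are centering in:

    // MyRecorderController.h — declare the method so the compiler
    // knows -frame returns an NSRect rather than id:
    - (NSRect)frame;

    // MyRecorderController.m — a minimal sketch, assuming the geometry
    // you want is that of the capture view you are drawing into:
    - (NSRect)frame
    {
        // mCaptureView is the IBOutlet already declared in the header
        return [mCaptureView frame];
    }

With the declaration visible in the @interface, NSRect frame = [self frame]; compiles without the warning and no longer throws at runtime.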