
Video capturing in iOS with MonoTouch

I have code that creates, configures, and starts a video capture session in Objective-C, and it runs without problems. I'm porting the sample to C# and MonoTouch 4.0.3 and have run into a few problems. Here is the code:

    void Initialize ()
    {   
        // Create notifier delegate class 
        captureVideoDelegate = new CaptureVideoDelegate(this);

        // Create capture session
        captureSession = new AVCaptureSession();
        captureSession.SessionPreset = AVCaptureSession.Preset640x480;

        // Create capture device
        captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);

        // Create capture device input
        NSError error;
        captureDeviceInput = new AVCaptureDeviceInput(captureDevice, out error);
        captureSession.AddInput(captureDeviceInput);

        // Create capture device output
        captureVideoOutput = new AVCaptureVideoDataOutput();
        captureSession.AddOutput(captureVideoOutput);
        captureVideoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV32BGRA;
        captureVideoOutput.MinFrameDuration = new CMTime(1, 30);
        //
        // ISSUE 1
        // In the original Objective-C code I was creating a dispatch_queue_t object, passing it to
        // setSampleBufferDelegate:queue message and worked, here I could not find an equivalent to 
        // the queue mechanism. Also not sure if the delegate should be used like this).
        //
        captureVideoOutput.SetSampleBufferDelegatequeue(captureVideoDelegate, ???????);

        // Create preview layer
        previewLayer = AVCaptureVideoPreviewLayer.FromSession(captureSession);
        previewLayer.Orientation = AVCaptureVideoOrientation.LandscapeRight;
        //
        // ISSUE 2:
        // Didn't find any VideoGravity related enumeration in MonoTouch (not sure if string will work)
        //
        previewLayer.VideoGravity = "AVLayerVideoGravityResizeAspectFill";
        previewLayer.Frame = new RectangleF(0, 0, 1024, 768);
        this.View.Layer.AddSublayer(previewLayer);

        // Start capture session
        captureSession.StartRunning();

    }


    public class CaptureVideoDelegate : AVCaptureVideoDataOutputSampleBufferDelegate
    {
        private VirtualDeckViewController mainViewController;

        public CaptureVideoDelegate(VirtualDeckViewController viewController)
        {
            mainViewController = viewController;
        }

        public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
        {
            // TODO: Implement - see: http://go-mono.com/docs/index.aspx?link=T%3aMonoTouch.Foundation.ModelAttribute

        }
    }

Issue 1: I'm not sure how to correctly use the delegate in the SetSampleBufferDelegatequeue method. I also haven't found an equivalent of the dispatch_queue_t object, which works fine in Objective-C, to pass as the second parameter.

Issue 2: I did not find any VideoGravity enumerations in the MonoTouch libraries, and I'm not sure whether passing a string with the constant value will work.

I have looked for clues to solve this, but there are no clear samples around. Any sample or information on how to do the same in MonoTouch would be highly appreciated.

Many thanks.


This is my code; use it well. I've cut it down to the important parts: all the initialization is there, as well as the reading of the sample output buffer.

I then have code that processes the CVImageBuffer from a linked custom Objective-C library. If you need to process the buffer in MonoTouch, you need to go the extra mile and convert it to a CGImage or UIImage. There is no function for that in MonoTouch (AFAIK), so you need to bind it yourself from plain Objective-C. An Objective-C sample is here: how to convert a CVImageBufferRef to UIImage
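
If you'd rather do that conversion in C#, a rough sketch (my own helper, not from the original post) could look like the following, assuming the output is configured for CV32BGRA and that your MonoTouch version exposes CVPixelBuffer's Lock/BaseAddress/BytesPerRow members:

    // Hedged sketch: wrap a BGRA pixel buffer in a CGBitmapContext and snapshot it.
    // Needs MonoTouch.CoreVideo, MonoTouch.CoreGraphics and MonoTouch.UIKit.
    static UIImage ImageFromSampleBuffer (CMSampleBuffer sampleBuffer)
    {
        using (var pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer)
        {
            pixelBuffer.Lock (CVOptionFlags.None);
            try
            {
                using (var colorSpace = CGColorSpace.CreateDeviceRGB ())
                using (var context = new CGBitmapContext (
                           pixelBuffer.BaseAddress,
                           pixelBuffer.Width, pixelBuffer.Height,
                           8, pixelBuffer.BytesPerRow, colorSpace,
                           CGBitmapFlags.PremultipliedFirst | CGBitmapFlags.ByteOrder32Little))
                using (var cgImage = context.ToImage ())
                {
                    return UIImage.FromImage (cgImage);
                }
            }
            finally
            {
                pixelBuffer.Unlock (CVOptionFlags.None);
            }
        }
    }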

    public void InitCapture ()
    {
        try
        {
            // Setup the input (no need to allocate the NSError; the out parameter fills it in)
            NSError error;
            captureInput = new AVCaptureDeviceInput (AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video), out error);

            // Setup the output
            captureOutput = new AVCaptureVideoDataOutput ();
            captureOutput.AlwaysDiscardsLateVideoFrames = true;
            captureOutput.SetSampleBufferDelegateAndQueue (avBufferDelegate, dispatchQueue);
            captureOutput.MinFrameDuration = new CMTime (1, 10);

            // Set the video output to store frames in BGRA (compatible across devices)
            captureOutput.VideoSettings = new AVVideoSettings (CVPixelFormatType.CV32BGRA);

            // Create a capture session
            captureSession = new AVCaptureSession ();
            captureSession.SessionPreset = AVCaptureSession.PresetMedium;
            captureSession.AddInput (captureInput);
            captureSession.AddOutput (captureOutput);

            // Setup the preview layer
            prevLayer = new AVCaptureVideoPreviewLayer (captureSession);
            prevLayer.Frame = liveView.Bounds;
            prevLayer.VideoGravity = "AVLayerVideoGravityResize"; // image may be slightly distorted, but red bar position will be accurate

            liveView.Layer.AddSublayer (prevLayer);

            StartLiveDecoding ();
        }
        catch (Exception ex)
        {
            Console.WriteLine (ex.ToString ());
        }
    }
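
Note that the snippet uses a few members it never defines (dispatchQueue, avBufferDelegate, isScanning, StartLiveDecoding) because the initialization was trimmed. A plausible reconstruction of the missing pieces, with names carried over from the snippet and the delegate class name assumed, might be:

    // Assumed fields, not shown in the original answer
    DispatchQueue dispatchQueue;       // MonoTouch.CoreFoundation wrapper for a GCD queue
    CaptureDelegate avBufferDelegate;  // hypothetical AVCaptureVideoDataOutputSampleBufferDelegate subclass
    bool isScanning;

    void CreateDelegateAndQueue ()
    {
        avBufferDelegate = new CaptureDelegate ();
        dispatchQueue = new DispatchQueue ("bufferQueue");
    }

    void StartLiveDecoding ()
    {
        isScanning = true;               // lets DidOutputSampleBuffer call the decoder
        captureSession.StartRunning ();  // frames start arriving on dispatchQueue
    }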

    public void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        Console.WriteLine ("DidOutputSampleBuffer: enter");

        if (isScanning)
        {
            CVImageBuffer imageBuffer = sampleBuffer.GetImageBuffer ();

            Console.WriteLine ("DidOutputSampleBuffer: calling decode");

            // NSLog(@"got image w=%d h=%d bpr=%d", CVPixelBufferGetWidth(imageBuffer), CVPixelBufferGetHeight(imageBuffer), CVPixelBufferGetBytesPerRow(imageBuffer));
            // call the decoder
            DecodeImage (imageBuffer);
        }
        else
        {
            Console.WriteLine ("DidOutputSampleBuffer: not scanning");
        }

        Console.WriteLine ("DidOutputSampleBuffer: quit");
    }


All issues are solved and everything is finally working fine. The freezing was happening because my test was not yet disposing the sampleBuffer in the DidOutputSampleBuffer method. The final code for my view is below.

UPDATE 1: Changed the assignment of the VideoSettings pixel format; it was incorrect and would cause a wrong BytesPerPixel in the sampleBuffer.

public partial class VirtualDeckViewController : UIViewController
{   
    public CaptureVideoDelegate captureVideoDelegate;

    public AVCaptureVideoPreviewLayer previewLayer;
    public AVCaptureSession captureSession;
    public AVCaptureDevice captureDevice;
    public AVCaptureDeviceInput captureDeviceInput;
    public AVCaptureVideoDataOutput captureVideoOutput;

...

    public override void ViewDidLoad ()
    {
        base.ViewDidLoad ();

        SetupVideoCaptureSession();
    }

    public void SetupVideoCaptureSession()
    {
        // Create notifier delegate class 
        captureVideoDelegate = new CaptureVideoDelegate();

        // Create capture session
        captureSession = new AVCaptureSession();
        captureSession.BeginConfiguration();
        captureSession.SessionPreset = AVCaptureSession.Preset640x480;

        // Create capture device
        captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);

        // Create capture device input
        NSError error;
        captureDeviceInput = new AVCaptureDeviceInput(captureDevice, out error);
        captureSession.AddInput(captureDeviceInput);

        // Create capture device output
        captureVideoOutput = new AVCaptureVideoDataOutput();
        captureVideoOutput.AlwaysDiscardsLateVideoFrames = true;
        // UPDATE: wrong VideoSettings assignment
        //captureVideoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV32BGRA;
        // UPDATE: correct VideoSettings assignment
        captureVideoOutput.VideoSettings = new AVVideoSettings(CVPixelFormatType.CV32BGRA);
        captureVideoOutput.MinFrameDuration = new CMTime(1, 30);
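        // MonoTouch.CoreFoundation.DispatchQueue is the managed counterpart of the
        // dispatch_queue_t created with dispatch_queue_create in the Objective-C version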
        DispatchQueue dispatchQueue = new DispatchQueue("VideoCaptureQueue");
        captureVideoOutput.SetSampleBufferDelegateAndQueue(captureVideoDelegate, dispatchQueue);
        captureSession.AddOutput(captureVideoOutput);

        // Create preview layer
        previewLayer = AVCaptureVideoPreviewLayer.FromSession(captureSession);
        previewLayer.Orientation = AVCaptureVideoOrientation.LandscapeLeft;
        previewLayer.VideoGravity = "AVLayerVideoGravityResizeAspectFill";
        previewLayer.Frame = new RectangleF(0, 0, 1024, 768);
        this.View.Layer.AddSublayer(previewLayer);

        // Start capture session
        captureSession.CommitConfiguration();
        captureSession.StartRunning();  
    }

    public class CaptureVideoDelegate : AVCaptureVideoDataOutputSampleBufferDelegate
    {   
        public CaptureVideoDelegate() : base()
        {   
        }

        public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
        {
            // TODO: Implement buffer processing

            // Very important (buffer needs to be disposed or it will freeze)
            sampleBuffer.Dispose();
        }
    }
}
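
In case it helps with the TODO above, here is a hedged sketch of what the processing step could look like before the Dispose call; the geometry logging mirrors the NSLog in Pavel's answer, and the decoder hand-off is only a placeholder:

    public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        // Sketch only: grab the BGRA pixel buffer and inspect its geometry
        using (var pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer)
        {
            Console.WriteLine ("got image w={0} h={1} bpr={2}",
                pixelBuffer.Width, pixelBuffer.Height, pixelBuffer.BytesPerRow);
            // ... hand pixelBuffer to a decoder or convert it to a UIImage here ...
        }

        // Still essential: dispose the sample buffer or the capture will freeze
        sampleBuffer.Dispose ();
    }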

The final piece of the puzzle was solved by the Miguel de Icaza sample I finally found here: link

Thanks to Miguel and Pavel
