How to get video frames into GL Textures in WebGL?
I'm interested in being able to apply filters from glfx.js to live video. I have succeeded in importing and processing the frames as I desire, but the method is inefficient. In my page setup, I do this:
var fbCanvas = document.getElementById('framebuffer');
var fb = fbCanvas.getContext('2d');
var video = document.getElementById('video');
var output = fx.canvas();
And then, at 25 Hz (the video's play rate), I do this:
fb.drawImage(video, 0, 0);
var frame = output.texture(fbCanvas);
output.draw(frame).hueSaturation(-0.5, 0).update();
But I would like to be able to do this:
var frame = output.texture(video);
output.draw(frame).hueSaturation(-0.5, 0).update();
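In raw WebGL terms, the direct upload I'm hoping for would look roughly like this (a sketch; `gl` is assumed to be a WebGL context obtained elsewhere, and `uploadVideoFrame` is my own illustrative name, not part of glfx.js):

```javascript
// Sketch: upload the current frame of a <video> element straight into a
// GL texture, with no intermediate 2D canvas. `gl` is a WebGL context and
// `videoEl` a playing <video> element.
function uploadVideoFrame(gl, texture, videoEl) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Pass the video element itself as the pixel source.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, videoEl);
}
```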
The call to output.texture is just a wrapper around gl.texImage2D, which it seems will only accept images or canvases, not a video element.
My question is, how much of a performance hit am I taking by doing the extra drawImage to the hidden canvas? What would be the fastest way of getting video frames into GL Textures so that I can run GL shaders on them in realtime?
Thanks.
Live video processing (video in/out, as opposed to just record or just playback) with desktop OpenGL is often done multithreaded, but this requires support for context sharelisting (so that separate contexts in separate threads can reference the same resource handle space). Context sharelisting is supported in OpenGL ES, but not in WebGL (yet), though I think it is supported in WebCL. So, although WebWorkers are available, the use of WebGL is effectively limited to a single thread.
But when (not if) WebGL supports context sharelisting, the fastest way, I believe, would be to isolate the prep of the textures in an auxiliary thread with a sharelisted context, then run the GL shaders on them in a main compositing thread.
In desktop OpenGL, one way this is done is by declaring offscreen 1x1 windows (the analog of a canvas element) with their own contexts, each associated with a unique thread that does the prep. These contexts are sharelisted with a main thread in which the final compositing occurs.
When (I think, not if) WebGL supports sharelisting, look for an alternate signature on getContext() that permits sharelisting with another context.
If you try to do this kind of video processing single-threaded, you are bucking a 'two clock' problem (the input video contract and the output video contract). You must isolate processing latency from those hard clocks via a FIFO/cache and multithreading, or else you will get glitches on the output or miss input frames. The necessary FIFO/cache introduces video processing lag, and if the audio bypasses this path, you will need to delay the audio to match. You can do that easily with a circular ring buffer, reading at an offset behind the write position.
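The ring-buffer audio delay mentioned above can be sketched as follows (names and sizes are illustrative; the buffer capacity must exceed the delay, and the buffer starts out holding silence):

```javascript
// Sketch of a circular ring buffer delay line: each sample is written at
// one index and the output is read back a fixed offset behind it, so
// playback lags record by exactly `delaySamples`.
function DelayLine(delaySamples, capacity) {
  this.buf = new Float32Array(capacity); // zero-filled: silence until primed
  this.capacity = capacity;
  this.delay = delaySamples;
  this.writeIdx = 0;
}

DelayLine.prototype.process = function (sample) {
  this.buf[this.writeIdx] = sample;
  // The read pointer trails the write pointer by the configured delay.
  var readIdx = (this.writeIdx - this.delay + this.capacity) % this.capacity;
  var out = this.buf[readIdx];
  this.writeIdx = (this.writeIdx + 1) % this.capacity;
  return out;
};
```

Usage would be along the lines of `var line = new DelayLine(1024, 4096);` and then feeding every incoming sample through `line.process(s)` to get the delayed stream.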