Streaming XNA output via the Expression Encoder SDK for a live Augmented Reality feed
I am attempting to write an application that first merges video into the XNA game engine, adds some augmented reality content, and then streams the rendered content out to another user. All of this needs to happen in real time or near real time. Essentially the system would look like this:
Video -> XNA (Game) -> (???) -> Expression Encoder -> Live Stream -> Network -> Client
The clean and elegant way to do this would be to treat XNA's rendered output as a LiveDeviceSource and feed it directly into an Expression Encoder stream Job. Is it possible to write a custom LiveDeviceSource or something similar to do this? Basically, does anyone know how to create a custom source for Expression Encoder so that I can push the rendered XNA buffer straight to the encoder? It would be fairly easy to dump the buffer to a file and have Expression Encoder encode that, but I am afraid the disk writes would add too much latency.
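For context, grabbing the rendered frame itself seems straightforward; it is getting that frame into the encoder without a disk round-trip that I am stuck on. Here is a minimal sketch of the per-frame grab (XNA 4, called at the end of Draw; the size math assumes a 32-bit back buffer):

int w = GraphicsDevice.PresentationParameters.BackBufferWidth;
int h = GraphicsDevice.PresentationParameters.BackBufferHeight;
byte[] frame = new byte[w * h * 4];      // one RGBA frame
GraphicsDevice.GetBackBufferData(frame); // copy the current back buffer
// frame now holds the rendered pixels; writing it to disk per frame is the latency worry.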
The alternative, quick-and-dirty hack is to have Expression Encoder do a screen capture of the output window and then stream that. I can get Expression Encoder to capture the screenshot, but I can't get it to stream live. Can someone suggest how to go about doing both simultaneously?
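For reference, this is the shape of the live job I have been attempting (Expression Encoder 4 SDK; the device index and port are placeholders, and I am not certain this is the right way to combine capture and broadcast):

using Microsoft.Expression.Encoder.Devices;
using Microsoft.Expression.Encoder.Live;

LiveJob job = new LiveJob();
// Pick the first video/audio capture devices the SDK can find.
EncoderDevice video = EncoderDevices.FindDevices(EncoderDeviceType.Video)[0];
EncoderDevice audio = EncoderDevices.FindDevices(EncoderDeviceType.Audio)[0];
LiveDeviceSource source = job.AddDeviceSource(video, audio);
job.ActivateSource(source);
// Broadcast as Windows Media on a pull port that clients connect to.
job.OutputFormat = new WindowsMediaBroadcastOutputFormat { BroadcastPort = 8080 };
job.StartEncoding();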
Apologies in advance for this question not being all that well formed, as I am still getting familiar with the Expression Encoder SDK. For reference, I am using XNA 4, Microsoft Expression Encoder 4, and Visual Studio 2010. Our goal is to eventually integrate this technology with a forthcoming release of Goblin XNA.
I would also be open to suggestions for other ways to stream live video directly from XNA. We're not married to Expression Encoder; it just appears to be the best streaming solution we've come across so far (we are having problems with ffmpeg).
You need to implement an AVStream minidriver to complete the pipeline, thereby creating a driver that gets recognised as a live source (see http://msdn.microsoft.com/en-us/library/ff554228(v=VS.85).aspx for details). Another solution is to write a DirectShow filter or a Media Foundation Transform (MFT), but I cannot help you much in that department.
I have worked with the Expression Encoder SDK, but I have not encountered a solution for what you are asking yet. However, you can VERY easily transfer image data to Expression Encoder. One way would be to retrieve the bytes off the render target, transfer them to Expression Encoder by any means you prefer (WCF is a good choice for this), and reconstruct the data into a bitmap to use for streaming.
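For illustration, a hypothetical one-way WCF contract for pushing frames could be as simple as this (the interface and member names are placeholders of mine, not part of the Encoder SDK):

using System.ServiceModel;

[ServiceContract]
public interface IFrameSink
{
    // One-way so the game loop never blocks waiting on the encoder side.
    [OperationContract(IsOneWay = true)]
    void PushFrame(int width, int height, byte[] bgraPixels);
}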
Below is my source for retrieving the byte data from a Texture2D (and hence a RenderTarget2D) and then constructing a Bitmap object from it.
public static Bitmap ToBitmap(this Microsoft.Xna.Framework.Graphics.Texture2D rd, int Width, int Height)
{
    // Width/Height must match the texture's dimensions, since the pixel
    // buffer is copied over verbatim.
    var Bmp = new Bitmap(Width, Height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
    byte[] data = ToBytes(rd);
    var bmpData = Bmp.LockBits(new System.Drawing.Rectangle(0, 0, Width, Height),
        System.Drawing.Imaging.ImageLockMode.WriteOnly,
        System.Drawing.Imaging.PixelFormat.Format32bppArgb);
    System.Runtime.InteropServices.Marshal.Copy(data, 0, bmpData.Scan0, data.Length);
    Bmp.UnlockBits(bmpData);
    return Bmp;
}
public static byte[] ToBytes(this Microsoft.Xna.Framework.Graphics.Texture2D rd, byte[] data = null)
{
    // Reuse the caller's buffer when it is the right size (4 bytes per pixel).
    if (data == null || data.Length != 4 * rd.Height * rd.Width)
        data = new byte[4 * rd.Height * rd.Width];
    rd.GetData<byte>(data);
    // XNA surfaces are RGBA; GDI+ expects BGRA, so swap the R and B channels.
    SwapBytes(data);
    return data;
}
private static void SwapBytes(byte[] data)
{
    // Swap the R and B channels of every 4-byte pixel, in parallel.
    var po = new System.Threading.Tasks.ParallelOptions { MaxDegreeOfParallelism = -1 };
    System.Threading.Tasks.Parallel.For(0, data.Length / 4, po, t =>
    {
        int bi = t * 4;
        byte temp = data[bi];
        data[bi] = data[bi + 2];
        data[bi + 2] = temp;
    });
}
public static void FromBytes(this Texture2D texture, byte[] bytes)
{
    // Inverse direction: convert BGRA back to RGBA before uploading to the GPU.
    SwapBytes(bytes);
    texture.SetData<byte>(bytes);
}
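For example, per-frame usage could look like this (DrawScene stands in for your own drawing code, and the 1280x720 target size is arbitrary):

// Created once at startup in practice, not every frame.
RenderTarget2D rt = new RenderTarget2D(GraphicsDevice, 1280, 720);
// Render the scene into the off-screen target instead of the back buffer.
GraphicsDevice.SetRenderTarget(rt);
DrawScene();
GraphicsDevice.SetRenderTarget(null);
// Convert the captured frame to a GDI+ Bitmap for the encoder side.
Bitmap frame = rt.ToBitmap(rt.Width, rt.Height);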
I hope this helps.