
MJPEG Stream from iPhone to Server

I am writing an app to stream MJPEG from the iPhone to a server. My problem is that the JPEGs I see are a weird collection of colored lines. I can post the specific code if more information is needed.

These are the steps:

  1. Get the frames via AVCaptureVideoDataOutput
  2. Convert them to a UIImage
  3. Get NSData from it via UIImageJPEGRepresentation()
  4. Build an NSMutableData wrapping the image with "Content-Type: image/jpeg" and --BOUNDARY
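The framing in step 4 can be sketched on the server side as well. Here is a minimal Java sketch of wrapping one JPEG frame in MJPEG multipart headers; the class name `MjpegFrame`, the `BOUNDARY` string, and the exact header layout are assumptions, not the poster's code, and the boundary must match whatever the HTTP response advertises.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Sketch (assumed names): wrap one JPEG frame in MJPEG multipart headers.
public class MjpegFrame {
    static final String BOUNDARY = "BOUNDARY";

    static byte[] wrap(byte[] jpeg) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Headers are plain ASCII text...
        out.write(("--" + BOUNDARY + "\r\n").getBytes(StandardCharsets.US_ASCII));
        out.write("Content-Type: image/jpeg\r\n".getBytes(StandardCharsets.US_ASCII));
        out.write(("Content-Length: " + jpeg.length + "\r\n\r\n").getBytes(StandardCharsets.US_ASCII));
        // ...but the payload is raw binary, written as-is, never via a String.
        out.write(jpeg);
        out.write("\r\n".getBytes(StandardCharsets.US_ASCII));
        return out.toByteArray();
    }
}
```

Keeping the binary payload out of any text conversion is the part that matters for image integrity.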

Now I want to send them as UDP packets to my server. I use AsyncUdpSocket on the iPhone and a small Java datagram server on the PC. The first problem is the MTU: AsyncUdpSocket says "message too long" if the NSData is larger than 9 KB. Therefore:

  1. Split the image into 9 KB chunks
  2. Send them with AsyncUdpSocket
  3. Accept a TCP socket on port 2020 on the PC
  4. Receive the DatagramPackets and stream them to the connected TCP socket
  5. Open Firefox at localhost:2020 and watch the MJPEG
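Steps 3 and 4 above can be sketched as a small Java relay. This is a minimal sketch under stated assumptions: the UDP port 9090 and the class and method names are hypothetical, and there is no handling of chunk loss or reordering, which plain UDP does not guarantee.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Sketch of the PC-side relay: each UDP packet from the phone is forwarded
// verbatim to the first TCP client that connects on port 2020.
public class MjpegRelay {

    // Forward one datagram's payload to the TCP stream, byte for byte.
    static void forward(DatagramPacket packet, OutputStream out) throws IOException {
        out.write(packet.getData(), packet.getOffset(), packet.getLength());
        out.flush();
    }

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(2020);
             Socket browser = server.accept();               // e.g. Firefox at localhost:2020
             DatagramSocket udp = new DatagramSocket(9090)) { // hypothetical UDP port
            OutputStream out = browser.getOutputStream();
            // Minimal HTTP response announcing the multipart stream; the
            // boundary must match the one written between frames on the phone.
            out.write(("HTTP/1.0 200 OK\r\n"
                    + "Content-Type: multipart/x-mixed-replace; boundary=BOUNDARY\r\n\r\n")
                    .getBytes(StandardCharsets.US_ASCII));
            byte[] buf = new byte[9 * 1024];                 // matches the 9 KB chunk size
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            while (true) {
                udp.receive(packet);
                forward(packet, out);                        // raw bytes, never via a String
            }
        }
    }
}
```

The important detail is that `forward()` copies the raw `byte[]` straight to the `OutputStream`; any text conversion in between would corrupt the JPEG data.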

I've tested each part separately and can tell that they're okay: I can show the images in a UIImageView, I can send, receive, and show the bytes from the socket, and I can stream JPEGs over a server socket and watch them in Firefox.

The image fits in one DatagramPacket if it's small enough (which is sometimes the case), and I can see the header, bytes, and the boundary. But why is my MJPEG stream still showing me a weirdly colored image (instead of the black or white image it should be)?


I'm finally finished with this app.

My problem was that I first converted the bytes from DatagramPacket.getData() to a String and wrote that, instead of using writeBytes(). Now I'm writing the bytes directly to the OutputStream.
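The corruption can be reproduced in isolation. The sketch below (class and method names are mine, not the original code) round-trips binary data through a String; UTF-8 is fixed here to make the result deterministic, while the original code presumably used the platform default charset.

```java
import java.nio.charset.StandardCharsets;

// Sketch of the bug: binary JPEG bytes round-tripped through a String are
// corrupted, because byte sequences that are invalid in the charset get
// replaced with U+FFFD during decoding.
public class BytesVsString {
    static byte[] viaString(byte[] raw) {
        // Lossy: 0xFF, 0xD8, etc. are not valid UTF-8 on their own.
        return new String(raw, StandardCharsets.UTF_8).getBytes(StandardCharsets.UTF_8);
    }
}
```

A JPEG starts with the bytes 0xFF 0xD8, which no text charset decoding is guaranteed to preserve, so every frame routed through a String arrived damaged, which explains the colored-lines output.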
