MJPEG Stream from iPhone to Server
I am writing an app that sends an MJPEG stream from the iPhone to a server. My problem is that the JPEGs I see are a weird collection of colored lines. I can post the specific code if more information is needed.
These are the steps:
- Get the frames via AVCaptureVideoDataOutput
- Convert them to a UIImage
- Get NSData from it via UIImageJPEGRepresentation()
- Make an NSMutableData and wrap the image with "Content-Type: image/jpeg" and --BOUNDARY
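The wrapping step above can be sketched in Java (the same framing the server must eventually deliver to the browser). The boundary string "BOUNDARY" and the exact header layout are assumptions based on the post; whatever boundary you use here has to match the one announced in the stream's outer Content-Type header.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class MjpegFrame {
    static final String BOUNDARY = "--BOUNDARY";

    // Wrap one JPEG in a multipart part: boundary, headers, raw bytes.
    static byte[] wrap(byte[] jpeg) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        String header = BOUNDARY + "\r\n"
                + "Content-Type: image/jpeg\r\n"
                + "Content-Length: " + jpeg.length + "\r\n\r\n";
        out.write(header.getBytes(StandardCharsets.US_ASCII));
        out.write(jpeg); // the raw JPEG bytes, never converted to a String
        out.write("\r\n".getBytes(StandardCharsets.US_ASCII));
        return out.toByteArray();
    }
}
```

Note that the headers are text but the JPEG payload is binary, so the whole frame has to be handled as a byte array end to end.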
Now I want to send them as UDP packets to my server. I use AsyncUdpSocket on the iPhone and a small Java datagram server on the PC. The first problem is the MTU: AsyncUdpSocket says "message too long" if the NSData is larger than 9 kB.
Therefore:
- Split the image into 9 kB chunks
- Send them with AsyncUdpSocket
- Accept a TCP socket on port 2020 on the PC
- Receive the DatagramPackets and stream them to the connected TCP socket
- Open Firefox with localhost:2020 and watch the MJPEG
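The relay steps above could look roughly like this on the Java side. The UDP port (2021) and the 9 kB buffer are assumptions from the post; only port 2020 for the TCP side is given. The forwarding itself is pulled into a small helper so the byte handling is explicit.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.ServerSocket;
import java.net.Socket;

public class UdpToTcpRelay {

    // Forward one received datagram to the TCP client as raw bytes,
    // honoring the packet's actual offset and length.
    static void forward(DatagramPacket packet, OutputStream out) throws IOException {
        out.write(packet.getData(), packet.getOffset(), packet.getLength());
    }

    public static void main(String[] args) throws Exception {
        try (DatagramSocket udp = new DatagramSocket(2021);      // assumed UDP port
             ServerSocket tcp = new ServerSocket(2020);
             Socket client = tcp.accept()) {                     // e.g. Firefox connects
            OutputStream out = client.getOutputStream();
            byte[] buf = new byte[9 * 1024];                     // one 9 kB chunk
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            while (true) {
                udp.receive(packet);
                forward(packet, out);
                out.flush();
            }
        }
    }
}
```

Writing getLength() bytes rather than the whole buffer matters: the last chunk of an image is usually shorter than 9 kB, and flushing trailing buffer garbage into the stream would corrupt the JPEG.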
I've tested each part separately and can tell that they're okay. I can show the images in a UIImageView, I can send, receive and show the bytes from the socket, and I can stream JPEGs over a server socket and watch them in Firefox.
The image fits in one datagram packet if it's small enough (which is sometimes the case), and I can see the header, the bytes and the boundary. But why is my MJPEG stream still showing me a weird colored image (instead of the black or white image it should be)?
I'm finally finished with this app.

My problem was that I converted the bytes from DatagramPacket.getData() to a String first and wrote that with writeBytes(), instead of writing the raw bytes. Now I'm writing the bytes directly with the OutputStream.
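A minimal demonstration of why the String detour breaks the image: round-tripping binary JPEG data through a String is lossy, because byte values that are not valid in the charset get replaced. The exact corruption depends on the charset involved, but the data never survives intact; writing the raw bytes does.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class StringRoundTrip {
    public static void main(String[] args) {
        // Bytes like these appear throughout a JPEG (0xFF 0xD8 is the SOI marker).
        byte[] binary = { (byte) 0xFF, (byte) 0xD8, (byte) 0xFF, (byte) 0xE0 };

        // Buggy path: decode to a String, then re-encode. Invalid UTF-8
        // sequences become U+FFFD replacement characters along the way.
        byte[] corrupted = new String(binary, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);

        System.out.println(Arrays.equals(binary, corrupted)); // false: data mangled
    }
}
```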