
How to send IplImage from server to iPod client UIImage via TCP

I have a server on Linux using Berkeley sockets, and I create a TCP connection with an iPod client. I have an IplImage* img; to send from the server to the iPod. I use the write(socket, /*DATA*/, 43200); call, and the data I tried to send was reinterpret_cast<char*>(img), img, and img->imageData. All of these choices do send some kind of data.

On the iPod side I receive the data this way (as I've seen here on SO; don't mind the complicated arithmetic, it's just there to accumulate the whole image across multiple reads):

bytesRead = [iStream read: (char*)[buffer mutableBytes] + totalBytesRead maxLength: 43200 - totalBytesRead];

After receiving the whole image, I have this:

[buffer setLength: 43200];
NSData *imagem = [NSData dataWithBytes:buffer length:43200];
UIImage *final = [self UIImageFromIplImage:imagem];

Now, I know I could get OpenCV working on the iPod, but I couldn't find a simple explanation of how to set it up, so I used the second code sample from this webpage and adapted it, since I know all the specifications of my image (for instance, I set all the parameters of the CGImageCreate() function by hand):

- (UIImage *)UIImageFromIplImage:(NSData *)image {

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();

// Allocating the buffer for CGImage
NSData *data = [NSData dataWithBytes:image length:43200];

CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)data);

// Creating CGImage from chunk of IplImage    
size_t width = 240;
size_t height = 180;
size_t depth = 8;             //bitsPerComponent
size_t depthXnChannels = 8;   //bitsPerPixel
size_t widthStep = 240;       //bytesPerRow

CGImageRef imageRef = CGImageCreate(width, height, depth, depthXnChannels, widthStep, colorSpace, kCGImageAlphaNone|kCGBitmapByteOrderDefault,provider, NULL, false, kCGRenderingIntentDefault);

// Getting UIImage from CGImage
UIImage *ret = [UIImage imageWithCGImage:imageRef];
lolView.image = ret;
CGImageRelease(imageRef);
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpace);
return ret;

}
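Those hard-coded parameters have to be mutually consistent: for an 8-bit single-channel image, bitsPerPixel equals bitsPerComponent, bytesPerRow equals the width, and the buffer must hold exactly height * bytesPerRow bytes. A mismatch here (for example, row padding on the sender so that widthStep is larger than width) is a classic cause of scrambled-looking output. A quick sanity check of the arithmetic, using an illustrative helper:

```cpp
#include <cstddef>

// Expected buffer size for an 8-bit grayscale image with no row padding.
size_t grayBufferSize(size_t width, size_t height) {
    const size_t bitsPerComponent = 8;
    const size_t bitsPerPixel = 8;  // 1 channel * 8 bits
    const size_t bytesPerRow = width * (bitsPerPixel / bitsPerComponent);
    return height * bytesPerRow;    // total bytes the provider must supply
}
```

For 240x180 this yields 43200, matching the write()/read() sizes used throughout; if the IplImage's widthStep were padded to, say, 256, the sender would need to copy row by row instead.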

THE PROBLEM: When I display the image, it comes out completely scrambled and 'random', even though the image sent is always the same. I really have no idea what's wrong.

PS: The TCP connection works fine with other data, like numbers or words. And the image is grayscale.

Thanks for all the help.


I got it working like this. On the server side (Code::Blocks on Linux with openFrameworks (& ofxOpenCv)):

img.allocate(240, 180, OF_IMAGE_COLOR);                    //ofImage
img2.allocate(240, 180);                                   //ofxCvColorImage
frame = cvCreateImage(cvSize(240,180), IPL_DEPTH_8U, 3);   //IplImage
bw = cvCreateImage(cvSize(240,180), IPL_DEPTH_8U, 1);      //IplImage
gray.allocate(240, 180);                                   //ofxCvGrayscaleImage


///ofImage
img.loadImage("lol.jpg");

///ofImage -> ofxCvColor
img2.setFromPixels(img.getPixels(), 240, 180);

///ofxCvColor -> IplImage
frame = img2.getCvImage();

///IplImage in GRAY
cvCvtColor(frame,bw,CV_RGB2GRAY);
cvThreshold(bw,bw,200,255,CV_THRESH_BINARY);  //It is actually a binary image
gray = bw;
pix = gray.getPixels();

n=write(newsockfd,pix,43200);

On the client side (iPod, iOS 4.3):

- (UIImage *)dataFromIplImageToUIImage:(unsigned char *)rawData
{
    size_t width = 240;
    size_t height = 180;
    size_t depth = 8;             // bitsPerComponent
    size_t depthXnChannels = 8;   // bitsPerPixel (not a CGBitmapContextCreate parameter; kept for reference)
    size_t widthStep = 240;       // bytesPerRow

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    CGContextRef ctx = CGBitmapContextCreate(rawData, width, height, depth, widthStep, colorSpace, kCGImageAlphaNone);

    CGImageRef imageRef = CGBitmapContextCreateImage(ctx);
    UIImage *rawImage = [UIImage imageWithCGImage:imageRef];

    // Release everything we created; the UIImage retains what it needs.
    CGImageRelease(imageRef);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);

    myImageView.image = rawImage;
    return rawImage;
    // Note: the original had free(rawData) after the return, which is
    // unreachable; free the buffer at the call site once you are done with it.
}

There's probably an easier way to do this, but hey, it gets the work done. Hope this helps someone.
