Resizing a UIImage on the iPhone with a small memory footprint
I've seen this question asked dozens of times but never answered.
How do you resize a UIImage (specifically one returned from the UIImagePickerController camera)? When I try any of the methods that are out there, I get a memory spike of between 20 and 40 MB. It does go away, but I know that on some hardware this is completely unacceptable.
I've tried the methods that use the following operations: drawInRect:, CGContextDrawImage(), and imageWithCGImage:scale:orientation:.
I understand that uncompressed images living in memory take up more space than on disk, but it seems like the most common UIImage resize operations involve creating copies of the image data.
Even Apple recommends immediately resizing a picture taken with the camera. However (because I believe they know this topic is intensely complex), they offer no guidance on how to manage that, especially how to do it the moment the image data is returned.
Does anyone have a smooth method to resize a large UIImage while conserving memory? I know that's a tall order.
A method that uses little memory is to create a bitmap context with CGBitmapContextCreate and draw the UIImage into it. The only additional memory this will use is what you've malloc'ed, plus some small CGContext overhead.
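A minimal sketch of that approach using CoreGraphics C calls (this is my illustration, not code from the answer; the function name `CreateResizedImage` and the ARGB pixel format are assumptions). Passing NULL as the data pointer lets CoreGraphics own the buffer; you could instead malloc it yourself and pass it in, in which case you must keep it alive as long as anything references the context's image:

```
#include <CoreGraphics/CoreGraphics.h>

// Illustrative sketch: downscale `source` to targetW x targetH by drawing it
// into a bitmap context. The context's backing store is the only large
// allocation: targetW * targetH * 4 bytes for 8-bit ARGB.
CGImageRef CreateResizedImage(CGImageRef source, size_t targetW, size_t targetH) {
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL,            // CG allocates the buffer
                                             targetW, targetH,
                                             8,               // bits per component
                                             0,               // let CG pick bytesPerRow
                                             space,
                                             kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(space);
    if (!ctx) return NULL;

    // Drawing into the smaller context performs the scale-down.
    CGContextDrawImage(ctx, CGRectMake(0, 0, targetW, targetH), source);
    CGImageRef result = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    return result;   // caller releases; wrap with [UIImage imageWithCGImage:] as needed
}
```

The key point is that this never materializes a second full-resolution copy: the only new pixel buffer is at the target size.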
If you want to get fancy, you could instead mmap a buffer with the PROT_WRITE flag set and be limited only by the virtual address space.
Have you tried benchmarking the memory footprint used when resizing a UIImage with the category methods described in the following blog post?
I've used similar resizing to make thumbnails for larger pictures downloaded by the app. My pictures probably aren't as large as the ones you're picking from your image picker controller, but I've found the performance to be pretty good.
Hope it helps in some way.
Here are a couple of git projects that cover UIImage resizing. I use the second and it works like a charm. It has a nice sample project included so you can see exactly how to use it. I think the first one has a sample as well, although I haven't tried using it or looked closely.
https://github.com/coryalder/UIImage_Resize
https://github.com/AliSoftware/UIImage-Resize
If you're really concerned about memory footprint, you could do something crazy-ish like resizing chunks of the image and stitching them back together in memory... but that would probably only be worthwhile for very large images. Pics from the built-in camera should be manageable with the available memory.
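The chunk-and-stitch idea could be sketched with CoreGraphics like this (my illustration only; the function name is made up, and note the caveat that CGImageCreateWithImageInRect on a lazily-decoded JPEG may still decode the full image, so in practice you'd want the source already in bitmap form):

```
#include <CoreGraphics/CoreGraphics.h>

// Illustrative only: scale a large image down in horizontal strips, drawing
// each strip into its slot of a single destination context.
CGImageRef CreateResizedByStrips(CGImageRef src, size_t dstW, size_t dstH,
                                 size_t stripRows) {
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, dstW, dstH, 8, 0, space,
                                             kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(space);
    if (!ctx) return NULL;

    size_t srcW = CGImageGetWidth(src), srcH = CGImageGetHeight(src);
    double scaleY = (double)dstH / (double)srcH;
    for (size_t y = 0; y < srcH; y += stripRows) {
        size_t rows = (y + stripRows > srcH) ? srcH - y : stripRows;
        CGImageRef strip = CGImageCreateWithImageInRect(
            src, CGRectMake(0, (CGFloat)y, (CGFloat)srcW, (CGFloat)rows));
        // CG's origin is bottom-left, so flip the strip's vertical placement.
        CGRect dstRect = CGRectMake(0, dstH - (y + rows) * scaleY,
                                    dstW, rows * scaleY);
        CGContextDrawImage(ctx, dstRect, strip);
        CGImageRelease(strip);
    }
    CGImageRef out = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    return out;
}
```

A real implementation would also have to deal with visible seams between strips when interpolation is enabled.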
I don't have a 100% great answer here, but you have a couple of options. The iPhone will let you enter the edit/crop mode; I think that's what Apple wants you to use. For us that wasn't quite acceptable. We grab the image data, get the pixel data, and release the original image. Then we resize the top half, then the second half, ourselves. I'd post some code but I'm not at work. I can in a few days, after vacation...
You probably have bigger images than I've handled in my apps, but you don't say what sizes we're talking about. I can only suggest Quartz, or writing your own resize filter that processes the picture line by line.
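A line-by-line filter like the one suggested could look like this minimal nearest-neighbor sketch in plain C (my own illustration; a production filter would use a better kernel, and would stream source rows from a decoder instead of holding the whole image in memory):

```c
#include <assert.h>
#include <stddef.h>

/* Nearest-neighbor downscale, producing the output one row at a time.
 * src/dst are tightly packed 4-byte-per-pixel (e.g. ARGB) buffers.
 * Each output row needs exactly one source row, so a streaming decoder
 * could supply rows on demand rather than a full in-memory image. */
static void resize_nearest(const unsigned char *src, size_t srcW, size_t srcH,
                           unsigned char *dst, size_t dstW, size_t dstH) {
    for (size_t dy = 0; dy < dstH; dy++) {
        size_t sy = dy * srcH / dstH;                 /* source row for this line */
        const unsigned char *srow = src + sy * srcW * 4;
        unsigned char *drow = dst + dy * dstW * 4;
        for (size_t dx = 0; dx < dstW; dx++) {
            size_t sx = dx * srcW / dstW;             /* source column */
            for (int c = 0; c < 4; c++)
                drow[dx * 4 + c] = srow[sx * 4 + c];
        }
    }
}
```

The peak footprint here is one source row plus the full destination image, which is exactly the property the answer is after.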