Why is image data in a plist dictionary?

I inherited the code for an OpenGL iPhone app which has some large PNG files (say, sunshine.png, 10,000 px by 80 px) and a corresponding PLIST file for each PNG (say, ImageData/sunshine.plist).

The trouble is, when I replace the PNG files, the app doesn't pick them up, apparently because it doesn't really read the PNGs at all, just the corresponding PLIST image-data files, which are texture-mapped using OpenGL. My suspicion is that the original image has been split up into tiles and that the PLIST somehow contains some binary encoding of each tile to be rendered. Problem is, I have no idea what that encoding is, and therefore no idea how to regenerate the PLIST files!

I tried making the PLIST files 0-byte, and tried deciphering them through NSData with UTF-8 encoding, but only got garbage. I have confirmed that PVRTC encoding is not used. Any idea what this PLIST format is, or where it comes from, would be greatly appreciated! I'm stuck, unable to update this app, since its main function is to display large, scrollable images.
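
For concreteness, this is roughly the kind of probing I've done so far (just loading the plist and poking at the first <data> entry; the length check is only there to see how big the blobs are):

NSDictionary *tiles = [NSDictionary dictionaryWithContentsOfFile:
    [[NSBundle mainBundle] pathForResource:@"sunshine"
                                    ofType:@"plist"
                               inDirectory:@"ImageData"]];

// Each <data> element comes back as an NSData object.
NSData *tile = [tiles objectForKey:@"0"];
NSLog(@"tile 0 is %lu bytes", (unsigned long)[tile length]);

// Trying to read it as text just gives garbage (or nil):
NSString *asText = [[[NSString alloc] initWithData:tile
                                          encoding:NSUTF8StringEncoding] autorelease];
NSLog(@"as UTF-8: %@", asText);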

Here's the sunshine.plist file format (data shortened):

<dict>
    <key>0</key>
    <data>
    12LXYpValWKVYtdilWLXYpVi12KVWpVi12KVWpVi12LXYtdilVqVYpVi12KVYtdilWLX
    YpValWLXYpValWLXYtdi12KVWpVilWLXYpVi12KVYtdilVqVYtdilVqVYtdi12LXYpVa...
    </data>
    <key>1</key>
    <data>...</data>
</dict>

Here's what one of the dictionary values looks like inside the Xcode plist editor (data shortened):

<d762d762 955a9562 9562d762 9562d762 9562d762 955a9562 d762955a 9562d762 d762d762 955a9562 9562d762 9562d762 9562d762 955a9562 d762955a 9562d762 d762d762 955a9562 9562d762 9562d762 9562d762 955a9562 d762955a 9562d762 d762d762 955a9562 9562d762 9562d762 9562d762 955a9562 d762955a 9562d762 73ce31c6 31c673ce 73ce31c6 31c631c6 31c631c6 73ce31c6 31c673ce 31c631c6 73ce31c6 31c673ce 73ce31c6 31c631c6 31c631c6 73ce31c6 31c673ce 31c631c6 73ce31c6 31c673ce 73ce31c6 31c631c6 31c631c6 73ce31c6 31c673ce 31c631c6 73ce31c6 31c673ce 73ce31c6 31c631c6 31c631c6 73ce31c6 31c673ce 31c631c6 e5f6e7fe e5f6e5f6 a5f6e5fe e5fee7fe e7fee5f6 e5fee7fe e5f6e7fe ...>

Thanks, StackOverflow community!


Sounds like the original programmer used something like...

UIImage *anImage;
...
NSData *imgData = UIImagePNGRepresentation(anImage);

... then they stored the imgData in an NSDictionary and wrote it out to a file.
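
If so, the write side probably looked something like the sketch below. To be clear, this is my guess, not your original code: the tiling loop just follows your hunch that the big image was split into fixed-width tiles, and the tile width and output path are made up.

#import <UIKit/UIKit.h>

// Hypothetical export step: slice a wide image into fixed-width tiles,
// PNG-encode each tile, and write them to a plist keyed by tile index.
static void WriteTilePlist(UIImage *bigImage, CGFloat tileWidth, NSString *outPath)
{
    NSMutableDictionary *tileDict = [NSMutableDictionary dictionary];
    CGImageRef cgImage = [bigImage CGImage];
    size_t totalWidth  = CGImageGetWidth(cgImage);
    size_t totalHeight = CGImageGetHeight(cgImage);

    NSUInteger index = 0;
    for (size_t x = 0; x < totalWidth; x += (size_t)tileWidth, index++) {
        CGFloat w = MIN(tileWidth, (CGFloat)(totalWidth - x));
        CGRect tileRect = CGRectMake((CGFloat)x, 0, w, (CGFloat)totalHeight);
        CGImageRef tileRef = CGImageCreateWithImageInRect(cgImage, tileRect);
        UIImage *tile = [UIImage imageWithCGImage:tileRef];
        CGImageRelease(tileRef);

        // UIImagePNGRepresentation returns an NSData blob; plist serialization
        // stores NSData values as <data> elements.
        [tileDict setObject:UIImagePNGRepresentation(tile)
                     forKey:[NSString stringWithFormat:@"%lu", (unsigned long)index]];
    }

    // writeToFile: produces the <dict>/<key>/<data> structure you posted.
    [tileDict writeToFile:outPath atomically:YES];
}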

To reverse it, you need to load the dictionary, then extract the individual data objects. Then you can create images from the data like so:

NSDictionary *imgDict = [NSDictionary dictionaryWithContentsOfFile:plistFilePath];
NSData *imgData = [imgDict valueForKey:@"keyForImage"];
UIImage *newImg = [UIImage imageWithData:imgData];

UIImage can puzzle out the file format from the data. The keys for the images are the values in the plist's <key> elements (e.g. <key>1</key>), and they are NSStrings.

Once you have the UIImage, you can convert it to other formats.
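
For example, something along these lines would dump every entry in the dictionary back out as a PNG file (plistFilePath is the same path as above, and outputDir is a placeholder for wherever you want the tiles written):

// Walk every key in the plist, rebuild a UIImage from each blob,
// and re-encode it as a PNG on disk.
NSDictionary *imgDict = [NSDictionary dictionaryWithContentsOfFile:plistFilePath];

for (NSString *key in imgDict) {
    NSData *imgData = [imgDict objectForKey:key];
    UIImage *tile = [UIImage imageWithData:imgData];
    if (tile == nil) {
        // imageWithData: returns nil if the blob isn't a format UIImage understands.
        NSLog(@"key %@ doesn't look like PNG/JPEG data", key);
        continue;
    }
    NSData *png = UIImagePNGRepresentation(tile);
    [png writeToFile:[NSString stringWithFormat:@"%@/tile_%@.png", outputDir, key]
          atomically:YES];
}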


Edit01:

You might find this helpful: Archives and Serialization Programming for Cocoa


Could it be that the previous programmer skipped the UIImage stage and just used NSData to store the texture without any further encoding? When I look at the dictionary stream you posted here, it looks like some raw image format. It seems to use 16 bits (or maybe 32?!) per pixel, but the color format and the width & height might then be hardcoded (or, hopefully, stored in the dictionary).
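
If that guess is right, the app presumably hands each blob more or less straight to glTexImage2D. A minimal sketch of what that could look like under OpenGL ES 1.1, assuming 16-bit RGB565 pixels (the pixel format and the tile dimensions are pure guesses on my part):

#import <Foundation/Foundation.h>
#import <OpenGLES/ES1/gl.h>

// Upload one raw tile as a texture. Width, height and the 5-6-5 format
// would have to match whatever the original exporter used.
static GLuint UploadRawTile(NSData *tileData, GLsizei width, GLsizei height)
{
    GLuint texture = 0;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // 2 bytes per pixel: width * height * 2 should equal [tileData length].
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, [tileData bytes]);
    return texture;
}

A quick sanity check is to divide [tileData length] by the tile's pixel count; if it comes out to 2, it's 16 bits per pixel.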
