
How to import a large text file containing many small records in Objective-C without using large amounts of memory

I need the ability to import some fairly large text files (100MB+) into Core Data in an application targeted at mobile devices, where memory is constrained. Each file contains a large number of small records which will be processed before being added to the database. Looking through many sources, the recommended method for reading in a text file seems to be:

NSString *stringFromFileAtPath = [[NSString alloc] initWithContentsOfURL:url encoding:NSUTF8StringEncoding error:&error];

At first glance this seems like a very memory-intensive way of doing what I require, but given that there seems to be no other recommended way to read the file, would I be right in guessing that Apple has taken this into account and does its own memory management, perhaps faulting in data from the file only when necessary?

If not, would the best way to proceed be to use NSStream and NSScanner to retrieve and process one line of text at a time?

If the recommended method does handle memory well, then the next step is often:

NSArray *lines = [stringFromFileAtPath componentsSeparatedByCharactersInSet:[NSCharacterSet newlineCharacterSet]];

If I use this method, I'm assuming it would need the complete text file in memory, so again it would be memory intensive. To save memory, would I be better off using NSScanner, or, given the limited processing power of mobile devices (certainly some of the older ones), would it take forever to complete?
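
For concreteness, the NSScanner loop I have in mind looks roughly like this; it avoids building the intermediate array of lines, though as far as I can tell it still needs the whole string resident in memory (processLine: is just a placeholder for my record handling):

NSCharacterSet *newlines = [NSCharacterSet newlineCharacterSet];
NSScanner *scanner = [NSScanner scannerWithString:stringFromFileAtPath];
NSString *line = nil;

while (![scanner isAtEnd]) {
    if ([scanner scanUpToCharactersFromSet:newlines intoString:&line]) {
        [self processLine:line];   // placeholder for per-record processing
    }
    // Consume the line break so the next scan starts on the following line.
    [scanner scanCharactersFromSet:newlines intoString:NULL];
}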

Thanks in advance for any help you can give me with this question.

Dave


Memory-mapped files can be a good way of examining the contents of a large file. And it looks like -[NSString initWithBytesNoCopy:length:encoding:freeWhenDone:] will let you create an NSString that uses a memory-mapped file's contents as the string's value directly.

I haven't used that particular function, I must admit, but my app does use memory-mapped files and they were dead easy to get working. So at the very least, you have an easy way of not needing to have 100MB of data loaded at once, but all the convenience of having 100MB of data appearing to be loaded at once.

Use the POSIX function mmap to map a file into memory. You'll need a POSIX file descriptor, which my code gets from open (after doing the usual song and dance to get a UTF-8 copy of the right file name in the bundle), but which can probably be obtained using one of the NS facilities too.
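
As a rough sketch of the whole thing (error handling trimmed, file name hypothetical, manual retain/release assumed; note that NSString may still convert or copy the bytes internally for some encodings, so zero-copy isn't guaranteed):

#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

NSString *path = [[NSBundle mainBundle] pathForResource:@"records" ofType:@"txt"]; // hypothetical file
int fd = open([path fileSystemRepresentation], O_RDONLY);
struct stat st;
fstat(fd, &st);

void *bytes = mmap(NULL, (size_t)st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
if (bytes != MAP_FAILED) {
    // freeWhenDone:NO because we unmap the region ourselves below.
    NSString *contents = [[NSString alloc] initWithBytesNoCopy:bytes
                                                        length:(NSUInteger)st.st_size
                                                      encoding:NSUTF8StringEncoding
                                                  freeWhenDone:NO];
    // ... scan `contents`; pages are faulted in from disk only as they are touched ...
    [contents release];
    munmap(bytes, (size_t)st.st_size);
}
close(fd);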


Have you tried this solution?

How to read data from NSFileHandle line by line?
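
In case the link goes stale, the gist of that approach is something like this sketch (assuming UTF-8 text with \n line endings; the 64 KB chunk size is arbitrary):

NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:path];
NSData *separator = [NSData dataWithBytes:"\n" length:1];
NSMutableData *buffer = [NSMutableData data];
NSData *chunk;

while ((chunk = [handle readDataOfLength:64 * 1024]).length > 0) {
    [buffer appendData:chunk];
    NSRange eol;
    // Emit every complete line in the buffer; a partial trailing line carries over.
    while ((eol = [buffer rangeOfData:separator
                              options:0
                                range:NSMakeRange(0, [buffer length])]).location != NSNotFound) {
        NSData *lineData = [buffer subdataWithRange:NSMakeRange(0, eol.location)];
        NSString *line = [[[NSString alloc] initWithData:lineData
                                                encoding:NSUTF8StringEncoding] autorelease];
        // ... process `line` as one record here ...
        [buffer replaceBytesInRange:NSMakeRange(0, NSMaxRange(eol)) withBytes:NULL length:0];
    }
}
[handle closeFile];

At any moment only one chunk plus a partial line is in memory, regardless of how large the file is.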


I think the best way to get a lot of data into a database is to deliver a pre-populated database (a pre-built .sqlite file).
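
For example, a sketch of first-launch seeding, assuming a pre-built Records.sqlite shipped in the bundle (names are illustrative):

#import <CoreData/CoreData.h>

static NSURL *SeededStoreURL(void)
{
    NSFileManager *fm = [NSFileManager defaultManager];
    NSURL *docs = [[fm URLsForDirectory:NSDocumentDirectory
                              inDomains:NSUserDomainMask] lastObject];
    NSURL *storeURL = [docs URLByAppendingPathComponent:@"Records.sqlite"];

    // Copy the bundled, pre-populated store out on first launch only.
    if (![fm fileExistsAtPath:[storeURL path]]) {
        NSURL *seed = [[NSBundle mainBundle] URLForResource:@"Records"
                                              withExtension:@"sqlite"];
        [fm copyItemAtURL:seed toURL:storeURL error:NULL];
    }
    return storeURL;
}

Then pass that URL to -addPersistentStoreWithType:configuration:URL:options:error: as usual, and Core Data opens the already-populated store with no import step at runtime.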
