Unserializing large Blobs from the Blobstore
I store large Blobs in the Blobstore. These are XML files that can be as large as 20 MB. Storing a single 20 MB XML file is fine; the issue comes when I need to unserialize it.
There are all sorts of limitations to this:
- I cannot read more than 1 MB from the Blobstore in a single call
- even if I could, I'm still confronted with the 5 MB RAM limitation (since I'd need to hold the entire XML in RAM before unserializing it)
How would you say I can handle this? I'm open to all sorts of solutions, but hopefully not something that involves using another hosting provider.
You should switch to a SAX parser and stream the data from the Blobstore with the BlobstoreInputStream class.
Because a SAX parser processes the document event by event, the full XML never has to sit in memory at once, which lets you work around the GAE RAM restriction.
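As a minimal sketch of the streaming approach: the SAX handler below counts `<item>` elements while reading from a plain `InputStream`. On App Engine you would pass a `BlobstoreInputStream` instead; here a `ByteArrayInputStream` stands in so the example is self-contained, and the element name `item` is hypothetical.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class BlobXmlStream {
    // Counts <item> elements without ever holding the whole document in RAM:
    // the parser pulls bytes from the stream and fires callbacks as it goes.
    static int countItems(InputStream in) throws Exception {
        final int[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(in, new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                if ("item".equals(qName)) {
                    count[0]++;
                }
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        // On App Engine this would be:
        //   InputStream in = new BlobstoreInputStream(blobKey);
        String xml = "<root><item/><item/><item/></root>";
        InputStream in =
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8));
        System.out.println(countItems(in));  // prints 3
    }
}
```

The same pattern works for building your domain objects incrementally: accumulate state in the handler callbacks and discard each fragment once processed.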
Alternatively, go with a GAE Backend; Backends have configurable memory limits, so you are not bound by the frontend instance's RAM restriction.
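For reference, Backends are declared in `backends.xml` in `WEB-INF/`; the backend name below is made up, and the instance class controls how much memory the backend gets:

```xml
<backends>
  <backend name="xml-worker">
    <!-- B4 is a larger instance class with more memory than the default -->
    <class>B4</class>
  </backend>
</backends>
```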