System.OutOfMemoryException when calling MemoryStream.ToArray() after serializing an object with DataContractSerializer
I am getting an intermittent "out of memory" exception at this statement:
return ms.ToArray();
In this method:
public static byte[] Serialize(Object inst)
{
    Type t = inst.GetType();
    DataContractSerializer dcs = new DataContractSerializer(t);
    MemoryStream ms = new MemoryStream();
    dcs.WriteObject(ms, inst);
    return ms.ToArray();
}
How can I prevent it? Is there a better way to do this?
The length of ms is 182,870,206 bytes (174.4 MB)
I am putting this into a byte array so that I can then run it through compression and store it to disk. The data is (obviously) a large list of a custom class that I am downloading from a WCF server when my Silverlight application starts. I am serializing and compressing it so it uses only about 6MB in isolated storage. The next time the user visits and runs the Silverlight application from the web, I check the timestamp, and if it is current I just open the file from isolated storage, decompress it, deserialize it, and load my structure. I am keeping the entire structure in memory because the application is mostly geared around manipulating its contents.
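For context, the compress-and-cache step described above might look roughly like the sketch below. This is only an illustration: SaveToIsolatedStorage is a made-up helper name, and DeflateStream assumes a desktop-style BCL (Silverlight lacks System.IO.Compression, so a third-party compressor such as SharpZipLib would have to stand in for it there).

```csharp
using System.IO;
using System.IO.Compression;
using System.IO.IsolatedStorage;

// Sketch only: compress the serialized bytes while writing them to
// isolated storage, so no second large buffer is needed on the way out.
public static void SaveToIsolatedStorage(byte[] serialized, string fileName)
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    using (var file = store.CreateFile(fileName))
    using (var deflate = new DeflateStream(file, CompressionMode.Compress))
    {
        deflate.Write(serialized, 0, serialized.Length);
    }
}
```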
@configurator is correct. The size of the array was too big. I rolled my own serializer by declaring a byte array of [list record count * byte count per record], then stuffed it directly myself using statements like this:
Buffer.BlockCopy(
    BitConverter.GetBytes(node.myInt), 0, destinationArray, offset, sizeof(int));
offset += sizeof(int);
and this to get it back:
newNode.myInt = BitConverter.ToInt32(sourceByteArray, offset);
offset += sizeof(int);
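Put together, the pack/unpack pattern above might look like this end-to-end sketch (the record layout of one int plus one double is hypothetical; real code repeats the pattern for every field of every record):

```csharp
using System;

// Sketch of the hand-rolled approach for a single two-field record.
public static byte[] Pack(int myInt, double myDouble)
{
    byte[] dest = new byte[sizeof(int) + sizeof(double)];
    int offset = 0;
    Buffer.BlockCopy(BitConverter.GetBytes(myInt), 0, dest, offset, sizeof(int));
    offset += sizeof(int);
    Buffer.BlockCopy(BitConverter.GetBytes(myDouble), 0, dest, offset, sizeof(double));
    return dest;
}

public static void Unpack(byte[] src, out int myInt, out double myDouble)
{
    int offset = 0;
    myInt = BitConverter.ToInt32(src, offset);
    offset += sizeof(int);
    myDouble = BitConverter.ToDouble(src, offset);
}
```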
Then I compressed it and stored it to isolated storage.
My size went from 174MB with the DataContractSerializer to 14MB with mine. After compression it went from a 6MB to a 1MB file in isolated storage.
Thanks to Configurator and Filip for their help.
The problem seems to be that you're expecting to return a 180MB byte array. That means the framework would need to find and allocate a contiguous 180MB block of free memory to copy the stream data into, which is usually quite hard, hence the OutOfMemoryException. If you need to keep handling this amount of data in memory, use the memory stream itself (reading and writing to it as you need) to hold the buffer; otherwise, save it to a file (or to whatever other place you need it, e.g. serving it over a network) directly instead of going through the memory stream.
I should mention that the memory stream holds a 180MB array of its own, so it is also in a bit of trouble and could cause an OutOfMemoryException during serialization; it would likely be more robust to serialize to a temporary file instead. You might also want to consider a more compact, though possibly less readable, serialization format such as JSON, binary serialization, or Protocol Buffers.
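To use the memory stream itself rather than ToArray(), you can rewind it and copy it out in small chunks; on .NET 3.5/Silverlight there is no Stream.CopyTo, so a manual loop does the job. A sketch (CopyToFile is a hypothetical helper name):

```csharp
using System.IO;

// Sketch: copy the MemoryStream's contents to disk in 80KB chunks,
// so no second ~180MB array is ever allocated.
public static void CopyToFile(MemoryStream ms, string path)
{
    ms.Position = 0;                 // rewind before reading back
    byte[] buffer = new byte[81920];
    using (var file = File.OpenWrite(path))
    {
        int read;
        while ((read = ms.Read(buffer, 0, buffer.Length)) > 0)
            file.Write(buffer, 0, read);
    }
}
```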
In response to the comment: to serialize directly to disk, use a FileStream instead of a MemoryStream:
public static void Serialize(Object inst, string filename)
{
    Type t = inst.GetType();
    DataContractSerializer dcs = new DataContractSerializer(t);
    using (FileStream stream = File.OpenWrite(filename))
    {
        dcs.WriteObject(stream, inst);
    }
}
I don't know how you use that code, but one thing that strikes me is that you don't release your resources. For instance, if you call Serialize(obj) many times with large objects, the MemoryStream instances will keep their memory until the GC collects them. The GC should handle that eventually, but you should still dispose of your streams deterministically.
I've tried this piece of code:
public static byte[] Serialize(object obj)
{
    Type type = obj.GetType();
    DataContractSerializer dcs = new DataContractSerializer(type);
    using (var stream = new MemoryStream())
    {
        dcs.WriteObject(stream, obj);
        return stream.ToArray();
    }
}
With the following Main method in a console application:
static void Main(string[] args)
{
    var filipEkberg = new Person { Age = 24, Name = @"Filip Ekberg" };
    var obj = Serialize(filipEkberg);
}
However, my byte array is not nearly as big as yours. Having a look at this similar issue, you might want to consider checking out protobuf-net.
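A rough sketch of what the protobuf-net route could look like, assuming the v2-style attribute API (the Person shape here mirrors the example above; the method name is made up):

```csharp
using System.IO;
using ProtoBuf;

// Sketch: protobuf-net attributes replace the DataContract ones and
// Serializer.Serialize writes a compact binary wire format.
[ProtoContract]
public class Person
{
    [ProtoMember(1)] public int Age { get; set; }
    [ProtoMember(2)] public string Name { get; set; }
}

public static void SerializeWithProtobuf(Person p, Stream destination)
{
    Serializer.Serialize(destination, p);
}
```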
It might also be interesting to know what you intend to do with the serialized data: do you need it as a byte array, or could it just as well be XML written to a text file?
Try serializing to a stream (e.g. a FileStream) instead of a byte array. This way you can serialize gigabytes of data without an OutOfMemoryException.
public static void Serialize<T>(T obj, string path)
{
    DataContractSerializer serializer = new DataContractSerializer(typeof(T));
    using (Stream stream = File.OpenWrite(path))
    {
        serializer.WriteObject(stream, obj);
    }
}

public static T Deserialize<T>(string path)
{
    DataContractSerializer serializer = new DataContractSerializer(typeof(T));
    using (Stream stream = File.OpenRead(path))
    {
        return (T)serializer.ReadObject(stream);
    }
}
Try setting the memory stream's Position to 0, and only then call ToArray().
Regards.