How to increase deserialization speed?
I'm serializing/deserializing with BinaryFormatter; the resulting serialized file is ~80 MB, and deserializing it takes a few minutes. How can I improve on this? Here's the deserialization code:
public static Universe DeserializeFromFile(string filepath)
{
    Universe universe = null;
    FileStream fs = new FileStream(filepath, FileMode.Open);
    BinaryFormatter bf = new BinaryFormatter();
    try
    {
        universe = (Universe)bf.Deserialize(fs);
    }
    catch (SerializationException e)
    {
        Console.WriteLine("Failed to deserialize. Reason: " + e.Message);
        throw;
    }
    finally
    {
        fs.Close();
    }
    return universe;
}
Maybe read everything into memory before deserializing, or use some other serialization technique?
Try UnsafeDeserialize. It is said to improve speed.
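As a rough sketch, the method above could be switched to UnsafeDeserialize, which skips some permission checks and can be faster in full-trust scenarios on .NET Framework. The Universe class shown here is a hypothetical stand-in for your own serializable type; the second argument is an optional HeaderHandler and may be null.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

// Hypothetical stand-in for your real serializable type.
[Serializable]
public class Universe
{
    public int StarCount;
}

public static class FastLoader
{
    public static Universe DeserializeFromFile(string filepath)
    {
        using (FileStream fs = new FileStream(filepath, FileMode.Open, FileAccess.Read))
        {
            BinaryFormatter bf = new BinaryFormatter();
            // UnsafeDeserialize bypasses some security checks that
            // Deserialize performs; requires full trust.
            return (Universe)bf.UnsafeDeserialize(fs, null);
        }
    }
}
```

Note that this only helps in partially trusted contexts where the checks actually cost time; measure before and after.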
I know this is an old question, but I stumbled upon a solution that improved my deserialization speed substantially. It is useful if you have large sets of data.
Upgrade your target framework to 4.7.1+ and enable the following switch in your app.config:
<runtime>
  <!-- Use this switch to make BinaryFormatter fast with large object graphs starting with .NET 4.7.2 -->
  <AppContextSwitchOverrides value="Switch.System.Runtime.Serialization.UseNewMaxArraySize=true" />
</runtime>
Sources: BinaryFormatter AppContextSwitchOverrides
Please take a look at this thread.
Try reading the file into a memory stream first in one go, then deserialize using the memory stream.
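The suggestion above can be sketched like this: read the whole file with a single File.ReadAllBytes call, then hand a MemoryStream over the buffer to the formatter, so deserialization does not issue many small disk reads. The Universe class is a hypothetical placeholder for your own type.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

// Hypothetical placeholder for your real serializable type.
[Serializable]
public class Universe
{
    public int StarCount;
}

public static class MemoryLoader
{
    public static Universe DeserializeFromFile(string filepath)
    {
        // One sequential read of the whole file into memory...
        byte[] data = File.ReadAllBytes(filepath);

        // ...then deserialize from the in-memory buffer.
        using (var ms = new MemoryStream(data))
        {
            var bf = new BinaryFormatter();
            return (Universe)bf.Deserialize(ms);
        }
    }
}
```

A BufferedStream wrapped around the FileStream is a lighter-weight variant of the same idea if the file is too large to hold in memory twice.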
How complex is the data? If it is an object tree (rather than a full graph), then you might get some interesting results from trying protobuf-net. It is generally pretty easy to fit onto existing classes, and is generally much smaller, faster, and less brittle (you can change the object model without trashing the data).
Disclosure: I'm the author, so might be biased - but it really isn't terrible... I'd happily lend some* time to help you try it, though.
*=within reason
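To illustrate the protobuf-net suggestion: a minimal sketch, assuming the protobuf-net NuGet package and hypothetical Universe/Star classes; you would decorate your real classes with the same attributes and pick your own field numbers.

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf; // protobuf-net NuGet package

// Hypothetical shape of the data; the field numbers (1, 2, ...) identify
// members in the wire format, so keep them stable across versions.
[ProtoContract]
public class Star
{
    [ProtoMember(1)] public string Name;
    [ProtoMember(2)] public double Mass;
}

[ProtoContract]
public class Universe
{
    [ProtoMember(1)] public List<Star> Stars = new List<Star>();
}

public static class ProtoStorage
{
    public static void SaveToFile(Universe u, string filepath)
    {
        using (var fs = File.Create(filepath))
            Serializer.Serialize(fs, u);
    }

    public static Universe LoadFromFile(string filepath)
    {
        using (var fs = File.OpenRead(filepath))
            return Serializer.Deserialize<Universe>(fs);
    }
}
```

Because only numbered members are written, the output is typically much smaller than a BinaryFormatter payload, and renaming a class or member does not break old files as long as the field numbers stay the same.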
Implement ISerializable in the Universe class
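A minimal sketch of that idea, with hypothetical fields: implementing ISerializable lets you write only the fields you actually need (and under short names), which can shrink the payload and cut deserialization work. BinaryFormatter calls GetObjectData when writing and the special protected constructor when reading.

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
public class Universe : ISerializable
{
    // Hypothetical fields standing in for the real data.
    public int StarCount;
    public string Name;

    public Universe() { }

    // Called by BinaryFormatter when serializing: store only what we need.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("n", StarCount);
        info.AddValue("name", Name);
    }

    // Called by BinaryFormatter when deserializing.
    protected Universe(SerializationInfo info, StreamingContext context)
    {
        StarCount = info.GetInt32("n");
        Name = info.GetString("name");
    }
}
```

Derived or recomputable state (caches, lookup tables) can be rebuilt in the deserialization constructor instead of being stored, which is often where the size and speed win comes from.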