
Using a file/DB as a buffer for a very big numpy array to yield data and prevent overflow?

While using numpy.ndarray, I ran into a memory overflow problem due to the size of the data. For example:

Suppose I have a 100000000 * 100000000 * 100000000 float64 array as the data source. When I want to read the data and process it in memory with np, it raises a MemoryError, because storing such a big array uses up all the available memory.

So maybe using a disk file / database as a buffer to store the array is a solution: when I want to use the data, it fetches just the necessary parts from the file / database; otherwise, it is only a Python object taking little memory.

Is it possible to write such an adapter?

Thanks.

Rgs, KC
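Such an adapter is indeed possible. Below is a minimal sketch; the class name `DiskBackedArray`, the file name, and the use of `numpy.memmap` as the backing store are illustrative assumptions, not an established API:

```python
import numpy as np

class DiskBackedArray:
    """Hypothetical adapter: keeps the data in a file on disk and
    materializes only the requested slices in memory."""

    def __init__(self, filename, dtype, shape):
        self.dtype = np.dtype(dtype)
        self.shape = shape
        # Create/open the backing file; nothing is loaded into RAM yet.
        self._mm = np.memmap(filename, dtype=self.dtype, mode="w+",
                             shape=shape)

    def __getitem__(self, index):
        # Copy the requested slice into a plain in-memory array.
        return np.array(self._mm[index])

    def __setitem__(self, index, value):
        self._mm[index] = value
        self._mm.flush()  # push the change to disk

# Usage: only the touched slices ever occupy memory.
buf = DiskBackedArray("buffer.dat", np.float64, (1000, 1000))
buf[5, :10] = np.arange(10)
row = buf[5, :10]
```

The key design point is that `__getitem__` / `__setitem__` translate array indexing into file reads and writes, so the object itself stays small regardless of the logical array size.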


Take a look at PyTables or numpy.memmap; maybe they fit your needs.

best, Peter
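A minimal sketch of the `numpy.memmap` approach mentioned above. The file name and the (deliberately modest) shape are assumptions for illustration; the OS pages in only the parts of the file that are actually accessed:

```python
import numpy as np

filename = "big_array.dat"
shape = (1000, 1000)  # ~8 MB of float64; scale up as the disk allows

# Create a disk-backed array; it behaves like an ndarray, but the
# data lives in the file rather than in RAM.
arr = np.memmap(filename, dtype=np.float64, mode="w+", shape=shape)

# Write a slice and flush it to disk.
arr[0, :100] = np.arange(100)
arr.flush()

# Reopen read-only and process a chunk without loading the whole file.
view = np.memmap(filename, dtype=np.float64, mode="r", shape=shape)
chunk_sum = view[0, :100].sum()
```

Processing the array chunk by chunk this way keeps peak memory bounded by the chunk size, not the array size.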


If you have matrices with lots of zeros, use scipy.sparse.csc_matrix. And it is possible to implement such an adapter yourself; for example, you can subclass the numarray array class.
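A short sketch of the sparse-matrix suggestion, with illustrative values: a matrix that is mostly zeros stores only its nonzero entries (plus index arrays) in CSC form, instead of one float64 per cell.

```python
import numpy as np
from scipy.sparse import csc_matrix

# A 1000 x 1000 matrix with just two nonzero entries.
dense = np.zeros((1000, 1000))
dense[0, 0] = 1.0
dense[999, 999] = 2.0

sparse = csc_matrix(dense)

# Only the 2 nonzero values are stored instead of 1,000,000 cells.
nnz = sparse.nnz
total = sparse.sum()
```

In practice one would build the sparse matrix directly from coordinate data rather than from a dense array, so the dense version never has to exist in memory at all.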

