
Is there a way to read and write in-memory files in R?

I am trying to use R to analyze large DNA sequence files (fastq files, several gigabytes each), but the standard R interface to these files (ShortRead) has to read the entire file at once. This doesn't fit in memory, so it causes an error. Is there any way that I can read a few (thousand) lines at a time, stuff them into an in-memory file, and then use ShortRead to read from that in-memory file?

I'm looking for something like Perl's IO::Scalar, for R.


I don’t know much about R, but have you had a look at the mmap package?
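For what it's worth, a rough sketch of what that might look like (this assumes the CRAN mmap package; the file name is just a placeholder, and the exact mode/indexing semantics may differ):

library(mmap)
m <- mmap("big.fastq", mode = char())   # map the file as raw characters, without loading it into RAM
length(m)                               # total number of bytes in the mapped file
m[1:80]                                 # first 80 bytes, fetched lazily from disk
munmap(m)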


It looks like ShortRead is soon to add a "FastqStreamer" class that does what I want.
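Once it's available, usage should look roughly like this (a sketch based on the streaming interface as documented; the file name and chunk size are placeholders):

library(ShortRead)
fq <- FastqStreamer("big.fastq", n = 100000)    # stream 100,000 records at a time
while (length(chunk <- yield(fq))) {
    # process 'chunk', a ShortReadQ object holding only this batch of records
}
close(fq)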


Well, I don't know whether readFastq accepts anything other than a file...

But if it does, or for other functions that accept a connection, you can use the R function pipe() to open a Unix connection, and combine the Unix commands head and tail to pull out just the lines you want.

For example, to get lines 91 to 100, you would use this:

head -n 100 file.txt | tail -n 10

So you can just read the file in chunks.
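In R that could look something like this (a sketch; the file name and line range are placeholders):

con <- pipe("head -n 100 file.txt | tail -n 10")   # connection to the shell pipeline
lines <- readLines(con)                            # read just those ten lines into R
close(con)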

If you have to, you can always use these Unix utilities to create a temporary file and then read that in with ShortRead. It's a pain, but if readFastq can only take a file, at least it works.
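For instance, something along these lines (a sketch; file names and record counts are placeholders, each fastq record spans four lines, and depending on your ShortRead version readFastq may want a directory plus a pattern rather than a single path):

tmp <- tempfile(fileext = ".fastq")
system(paste("head -n 800000 big.fastq | tail -n 400000 >", tmp))   # records 100,001 to 200,000
library(ShortRead)
reads <- readFastq(tmp)
unlink(tmp)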


Incidentally, the answer to the general question of how to make an in-memory file in R (like Perl's IO::Scalar) is the textConnection function. Unfortunately, the ShortRead package cannot handle textConnection objects as inputs, so while the idea I expressed in the question, reading the file in small chunks into in-memory files that are then parsed bit by bit, is certainly possible for many applications, it does not work for my particular application because ShortRead does not accept textConnections. The solution there is the FastqStreamer class described above.
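For reference, the in-memory-file pattern itself looks roughly like this (a sketch with placeholder content):

chunk <- c("@read1", "ACGTACGT", "+", "IIIIIIII")   # some lines held in memory
con <- textConnection(chunk)                         # treat the character vector as a file
readLines(con)                                       # any reader that accepts a connection can use it
close(con)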
