
Use a file as a buffer for a producer-consumer system of two programs

I have two programs, one of which writes some entries into a file and the other one reads all the entries from the file and processes them.

Currently, the programs are executed sequentially. That is, the first program produces the file completely and exits before the second program is run. Now I want the second program to run simultaneously, in a producer-consumer fashion, without many modifications. I know I should use interprocess communication, but at this point I want to make minimal changes to the programs to get this running.

Specifically, I want the second program to process the entries from the file in real time, as they are generated by the first program.

I am using gcc on Ubuntu 11.04.


If you are using a Unix-like operating system, may I suggest pipes? Modify your first program to write to standard output (instead of opening a file and passing references to that ofstream around, pass std::cout). Modify your second program to read from standard input (ditto, but replace your ifstream references with std::cin).
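
A minimal sketch of that change, assuming your I/O logic sits in functions (produce_entries and process_entries are hypothetical names and the entry format is made up); the only edit is that they take generic stream references, so prog1 can pass std::cout and prog2 can pass std::cin:

#include <iostream>
#include <string>

// Was: void produce_entries(std::ofstream& out)
void produce_entries(std::ostream& out) {
    for (int i = 0; i < 5; ++i)
        out << "entry " << i << '\n';
}

// Was: void process_entries(std::ifstream& in)
void process_entries(std::istream& in) {
    std::string line;
    while (std::getline(in, line))
        std::cerr << "processed: " << line << '\n';
}

int main() {
    // prog1's main calls produce_entries(std::cout);
    // prog2's main calls process_entries(std::cin) instead.
    produce_entries(std::cout);
    return 0;
}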

Then, instead of

prog1 -o some-tmp-file.txt
prog2 -i some-tmp-file.txt

do this:

prog1 | prog2


EDIT: If your existing programs are based on <cstdio> instead of <iostream>, the same principle applies. Use stdout instead of your existing FILE* in the first program, and stdin instead of your FILE* in the second program.
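
A matching sketch for the <cstdio> case, shown here from the reader's side (the entry format is invented; only the stdin/stdout substitution matters):

#include <cstdio>

int main() {
    // prog1: FILE* out = fopen("some-tmp-file.txt", "w");  becomes  FILE* out = stdout;
    // prog2: FILE* in  = fopen("some-tmp-file.txt", "r");  becomes  FILE* in  = stdin;
    char line[256];
    while (std::fgets(line, sizeof line, stdin) != nullptr)
        std::fprintf(stdout, "processed: %s", line);
    return 0;
}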


EDIT #2: If you want to make absolutely no change to the second program, and perhaps only minimal changes to the first program, try using named pipes.

mkfifo /tmp/some-tmp-file.txt
prog2 -i /tmp/some-tmp-file.txt &
prog1 -o /tmp/some-tmp-file.txt
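
Neither program has to know it is talking to a FIFO: opening that path with an ofstream/ifstream (or fopen) works exactly as before, the writer blocks until a reader has the pipe open, and the reader sees entries as they are written. If you would rather create the FIFO from the first program than from the shell, mkfifo(3) does the same job; a sketch, with the path chosen only to match the example above:

#include <sys/stat.h>
#include <cerrno>
#include <cstdio>

int main() {
    // Create the named pipe if it does not exist yet; EEXIST is harmless
    // when the FIFO is already there from an earlier run or from mkfifo(1).
    const char* path = "/tmp/some-tmp-file.txt";
    if (mkfifo(path, 0600) == -1 && errno != EEXIST) {
        std::perror("mkfifo");
        return 1;
    }
    // ...then open `path` for writing exactly as the program already does.
    return 0;
}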