Suggestions for faster file processing?

I have a cluster of 4 servers. Each physical input file consists of many logical documents (as many as 300,000), and each file is processed as its own workflow, so one workflow runs on a server per input file. At any given time about 80 workflows are running concurrently across the cluster. Is there a way to speed up the file processing? Is file splitting a good alternative? Any suggestions? Everything is Java based, running on a Tomcat servlet engine.


Try processing the files with Oracle Coherence, which provides grid processing so the work can be distributed across the cluster. Coherence also provides data persistence.
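Independently of whether a data grid like Coherence is adopted, the file-splitting idea from the question can be sketched with plain JDK concurrency: partition the logical documents of one file into chunks and process the chunks in parallel on a thread pool. This is a minimal illustration, not the asker's actual pipeline; the `Document` record and the processing step are hypothetical placeholders.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelDocumentProcessor {

    // Hypothetical stand-in for one logical document parsed from the input file.
    record Document(String id, String payload) {}

    public static void main(String[] args) throws Exception {
        // Simulate a physical file containing many logical documents.
        List<Document> documents = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) {
            documents.add(new Document("doc-" + i, "payload-" + i));
        }

        int workers = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(workers);

        // Split the document list into roughly equal chunks, one task per
        // worker, so the file is processed in parallel instead of serially.
        int chunkSize = (documents.size() + workers - 1) / workers;
        List<Future<Integer>> results = new ArrayList<>();
        for (int start = 0; start < documents.size(); start += chunkSize) {
            List<Document> chunk =
                documents.subList(start, Math.min(start + chunkSize, documents.size()));
            results.add(pool.submit(() -> {
                int processed = 0;
                for (Document d : chunk) {
                    // Real work (parsing, validation, persistence) would go here.
                    processed++;
                }
                return processed;
            }));
        }

        // Gather per-chunk counts; Future.get() also surfaces any task failure.
        int total = 0;
        for (Future<Integer> f : results) {
            total += f.get();
        }
        pool.shutdown();
        System.out.println("Processed " + total + " documents");
    }
}
```

Within a single server this keeps all cores busy on one large file; across the cluster, the same chunking idea is what a grid product automates, with the added benefit of moving chunks to whichever node has spare capacity.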

