Use of thrift/avro for a hadoop job to communicate between Java and C++

Right now we have a Hadoop job in Java that works with some C++ binaries. Our current form of communication is writing files to NFS, which both the Java and C++ sides then read, and that prevents us from scaling. I'm looking into Protocol Buffers, Thrift, and Avro to get away from NFS. Would that approach definitely be better than the NFS approach?
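
Since any of these options comes down to both sides agreeing on a shared schema, here is a minimal sketch of what the Java side could look like with Avro's generic API; the `Task` record, its fields, and the values are hypothetical and only for illustration. The same schema, kept in a shared `.avsc` file, could be used by Avro's C++ library to decode the resulting bytes.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class AvroTaskWriter {
    public static void main(String[] args) throws IOException {
        // Hypothetical schema; in practice this would live in a shared .avsc file
        // so the C++ side generates/reads against the exact same definition.
        String schemaJson = "{\"type\":\"record\",\"name\":\"Task\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"long\"},"
                + "{\"name\":\"payload\",\"type\":\"string\"}]}";
        Schema schema = new Schema.Parser().parse(schemaJson);

        // Build a record without generated classes, using the generic API.
        GenericRecord record = new GenericData.Record(schema);
        record.put("id", 42L);
        record.put("payload", "hello from Java");

        // Serialize to Avro binary; these bytes can be handed to the C++ binary
        // over whatever transport replaces the NFS files (pipe, socket, etc.).
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        GenericDatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        writer.write(record, encoder);
        encoder.flush();

        byte[] bytes = out.toByteArray();
        System.out.println("Serialized " + bytes.length + " bytes");
    }
}
```

The serialization format only solves the encoding half of the problem; you still need to choose a transport (e.g. streaming the bytes over stdin/stdout or a socket to the C++ process) to actually replace the shared NFS files.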
