Using Thrift/Avro in a Hadoop job to communicate between Java and C++
Right now we have a Hadoop job in Java that works with some C++ binaries. Our current form of communication is writing files to NFS, which both the Java and C++ sides read, and that is preventing us from scaling. I'm looking into Protocol Buffers, Thrift, and Avro to get away from the NFS hand-off. One of these approaches would definitely be better than the NFS approach, right?
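For context, here is the kind of thing I have in mind on the Java side, a minimal sketch assuming Thrift RPC over a plain TCP socket. The service name `TaskService`, the port, and the IDL are hypothetical; the `TaskService` class would be generated by the Thrift compiler from an IDL like the one in the comment, and the C++ binaries would use the C++ stubs generated from the same IDL instead of reading files off NFS:

```java
// Hypothetical IDL (task.thrift), compiled with `thrift --gen java` / `--gen cpp`:
//   service TaskService {
//     binary process(1: binary input)
//   }
import java.nio.ByteBuffer;

import org.apache.thrift.TException;
import org.apache.thrift.server.TServer;
import org.apache.thrift.server.TSimpleServer;
import org.apache.thrift.transport.TServerSocket;
import org.apache.thrift.transport.TTransportException;

public class TaskServer {
    // Implements the TaskService.Iface interface generated from the IDL above.
    static class Handler implements TaskService.Iface {
        @Override
        public ByteBuffer process(ByteBuffer input) throws TException {
            // Do whatever work used to be exchanged through files on NFS.
            return input; // placeholder: echo the payload back
        }
    }

    public static void main(String[] args) throws TTransportException {
        TServerSocket transport = new TServerSocket(9090);          // listen on TCP 9090 (arbitrary)
        TaskService.Processor<Handler> processor =
                new TaskService.Processor<>(new Handler());
        TServer server = new TSimpleServer(
                new TServer.Args(transport).processor(processor));
        server.serve();  // C++ clients built from the same IDL connect over the socket
    }
}
```

The idea is that the Java task and the C++ binary would exchange serialized structures directly over a socket instead of going through the shared filesystem.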