Linker error with Hadoop Pipes

Hadoop n00b here, just started playing around with Hadoop Pipes. I'm getting linker errors while compiling a simple WordCount example using hadoop-0.20.203 (the most recent version at the time of writing) that did not appear for the same code in hadoop-0.20.2.

Linker errors of the form: undefined reference to `EVP_sha1' in HadoopPipes.cc.

EVP_sha1 (and all of the other undefined references I get) are part of the OpenSSL library, which HadoopPipes.cc uses in hadoop-0.20.203 but not in hadoop-0.20.2.

I've tried adjusting my makefile to link to the ssl libraries, but I'm still out of luck. Any ideas would be greatly appreciated. Thanks!

PS, here is my current makefile:

CC = g++
HADOOP_INSTALL = /usr/local/hadoop-0.20.203.0
SSL_INSTALL = /usr/local/ssl
PLATFORM = Linux-amd64-64
CPPFLAGS = -m64 -I$(HADOOP_INSTALL)/c++/$(PLATFORM)/include -I$(SSL_INSTALL)/include

WordCount: WordCount.cc
	$(CC) $(CPPFLAGS) $< -Wall -Wextra -L$(SSL_INSTALL)/lib -lssl -lcrypto -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils -lpthread -g -O2 -o $@

The actual program I'm using can be found at http://cs.smith.edu/dftwiki/index.php/Hadoop_Tutorial_2.2_--_Running_C%2B%2B_Programs_on_Hadoop


I had the same problem here: the answer is to add -lcrypto to the link command line:

http://grokbase.com/p/hadoop.apache.org/common-user/2011/06/re-linker-errors-with-hadoop-pipes/09zqdt5grdudu7no7q6k3gfcynpy
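Note that position matters, not just presence: GNU ld resolves symbols left to right, so a library that satisfies references coming from libhadooppipes must appear *after* -lhadooppipes on the command line. The questioner's makefile already passes -lcrypto, but before the Hadoop libraries, which is why the references stay undefined. A minimal sketch of a reordered link rule, reusing the variables from the makefile above (same paths assumed, only the library order changed):

```make
# Sketch: OpenSSL libraries moved *after* -lhadooppipes so the linker
# can resolve EVP_sha1 and the other OpenSSL symbols it references.
WordCount: WordCount.cc
	$(CC) $(CPPFLAGS) $< -Wall -Wextra \
	    -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils \
	    -L$(SSL_INSTALL)/lib -lssl -lcrypto -lpthread \
	    -g -O2 -o $@
```

Alternatively, passing --start-group/--end-group around the archives makes ld iterate until all references resolve, but reordering is the simpler fix here.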

Here is a patch to fix the build process:

diff --git src/examples/pipes/Makefile.in src/examples/pipes/Makefile.in
index 17efa2a..1d8af8e 100644
--- src/examples/pipes/Makefile.in
+++ src/examples/pipes/Makefile.in
@@ -233,7 +233,7 @@ AM_CXXFLAGS = -Wall -I$(HADOOP_UTILS_PREFIX)/include \
         -I$(HADOOP_PIPES_PREFIX)/include

LDADD = -L$(HADOOP_UTILS_PREFIX)/lib -L$(HADOOP_PIPES_PREFIX)/lib \
-      -lhadooppipes -lhadooputils
+      -lhadooppipes -lhadooputils -lcrypto


# Define the sources for each program


You just need to make some changes to your Makefile. The libraries that ship with Hadoop don't seem to handle this on their own; you'll need to "re-make" them and change your link path accordingly. A comprehensive answer can be found at http://goo.gl/y5iGZF.

