CompressionTest for LZO fails

I sincerely thank you for reading my post.

I'm trying to install the LZO 2.03 compression codec for HBase on my server (running Xeon CPUs). I'm currently running Hadoop 0.20.1 with HBase 0.90.2.

I've followed the guidelines at http://wiki.apache.org/hadoop/UsingLzoCompression and downloaded the LZO native connector (Hadoop-GPL-Compression) from http://code.google.com/p/hadoop-gpl-compression/.

I installed the LZO library using:

./configure --prefix=/home/ckwon/wks/test/lzo_lib_x64 --enable-shared --disable-asm --build=x86_64-pc-linux-gnu

(followed by make install into that custom directory).

I then copied all of the LZO library files and the Hadoop-GPL-Compression files (including native/) into $HADOOP_HOME/lib/ and $HBASE_HOME/lib.
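Roughly, the copy step was along these lines (illustrative only; the source paths and the exact jar name are placeholders for where I unpacked things):

# LZO runtime libraries built with the custom prefix above (illustrative paths)
cp /home/ckwon/wks/test/lzo_lib_x64/lib/liblzo2.* ${HADOOP_HOME}/lib/
cp /home/ckwon/wks/test/lzo_lib_x64/lib/liblzo2.* ${HBASE_HOME}/lib/native/
# Hadoop-GPL-Compression jar and its native/ directory (names are placeholders)
cp hadoop-gpl-compression-*/hadoop-gpl-compression-*.jar ${HADOOP_HOME}/lib/
cp -r hadoop-gpl-compression-*/lib/native ${HBASE_HOME}/lib/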

Then I ran bin/hbase org.apache.hadoop.hbase.util.CompressionTest via the following csh script:

setenv CLASSPATH_HBASEI    `ls ${HBASE_HOME}/*.jar |& awk '{printf( ":%s", $1 );}'`
setenv CLASSPATH_HBASELIBI `ls ${HBASE_HOME}/lib/*.jar |& awk '{printf( ":%s", $1 );}'`
setenv CLASSPATH_LZO  $HBASE_HOME/lib/native/liblzo2.so

setenv CLASSPATH ${CLASSPATH_HBASEI}${CLASSPATH_HBASELIBI}

setenv LD_LIBRARY_PATH64 $HBASE_HOME/lib/native
#setenv LD_LIBRARY $HBASE_HOME/lib/native

ls -l $LD_LIBRARY_PATH64

set JAVA=$JAVA_HOME/bin/java

set   JAVA_PLATFORM=`CLASSPATH=${CLASSPATH};${JAVA} org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`
echo JP=$JAVA_PLATFORM

set      JAVA_LIBRARY_PATH=${HBASE_HOME}/lib/native/${JAVA_PLATFORM}
echo
echo java_lib_path---
echo
echo $JAVA_LIBRARY_PATH

cd $HBASE_HOME
./bin/hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://SERVER:PORT/COMPRESSION_TEST_RUNNER.sh lzo

And it's generating:

INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
WARN lzo.LzoCompressor: java.lang.UnsatisfiedLinkError: Cannot load liblzo2.so.2 (liblzo2.so.2: cannot open shared object file: No such file or directory)!
ERROR lzo.LzoCodec: Failed to load/initialize native-lzo library
    java.lang.RuntimeException: native-lzo library not available
            at com.hadoop.compression.lzo.LzoCodec.getCompressorType(LzoCodec.java:135)
            at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:98)
            at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:200)
            at org.apache.hadoop.hbase.io.hfile.HFile$Writer.getCompressingStream(HFile.java:397)
            at org.apache.hadoop.hbase.io.hfile.HFile$Writer.newBlock(HFile.java:383)
            at org.apache.hadoop.hbase.io.hfile.HFile$Writer.checkBlockBoundary(HFile.java:354)
            at org.apache.hadoop.hbase.io.hfile.HFile$Writer.append(HFile.java:536)
            at org.apache.hadoop.hbase.io.hfile.HFile$Writer.append(HFile.java:515)
            at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:126)

I've tried rebuilding the LZO library for x86_64-pc with the i386 assembly code disabled, but it still produces the same error.
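For reference, a way to check whether the GPL native connector can actually resolve liblzo2.so.2 (using the JAVA_LIBRARY_PATH computed in the script above; the path is an assumption about my layout) would be:

# does libgplcompression.so find the liblzo2.so.2 named in the warning?
ldd ${JAVA_LIBRARY_PATH}/libgplcompression.so | grep lzo2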

I'd be grateful for any suggestions.


Install the native LZO libraries on both the master and the slave servers, e.g. on Ubuntu:

sudo apt-get install liblzo2-dev
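To confirm that the loader can now find the library named in the error message (liblzo2.so.2), a quick check is:

# liblzo2.so.2 should now appear in the dynamic loader cache
ldconfig -p | grep liblzo2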

If you still have problems after following the Cloudera setup instructions, try copying the libgplcompression* files into your Hadoop lib folder. On Ubuntu, assuming the Cloudera layout, that would be:

sudo cp /usr/local/hadoop/lib/native/Linux-amd64-64/lib/libgplcompression.* \
    /usr/lib/hadoop/lib/native/
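Once the libraries are in place, re-running the test from the question should no longer fail with the native-lzo error:

cd $HBASE_HOME
./bin/hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://SERVER:PORT/COMPRESSION_TEST_RUNNER.sh lzo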