Hadoop on OSX "Unable to load realm info from SCDynamicStore"
I am getting this error on startup of Hadoop on OSX 10.7:
Unable to load realm info from SCDynamicStore
put: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/travis/input/conf. Name node is in safe mode.
It doesn't appear to be causing any issues with the functionality of Hadoop.
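Note that the output above is really two separate messages: the SCDynamicStore warning and a SafeModeException from the put. The latter simply means the namenode is still in safe mode; as a sketch, you can check or force-leave it with the standard dfsadmin command (Hadoop 1.x syntax shown; on 2.x use hdfs dfsadmin):
# Check safe-mode status, and leave it if the namenode is stuck there
hadoop dfsadmin -safemode get
hadoop dfsadmin -safemode leave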
Matthew Buckett's suggestion in HADOOP-7489 worked for me. Add the following to your hadoop-env.sh file:
export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
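For the new options to take effect you need to restart the daemons. A minimal sketch, assuming the stock Hadoop 1.x control scripts (paths may differ per install):
# Restart so the new HADOOP_OPTS are picked up by every daemon
$HADOOP_HOME/bin/stop-all.sh
$HADOOP_HOME/bin/start-all.sh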
As an update to this (and to address David Williams' point about Java 1.7), I found that setting only the .realm and .kdc properties was insufficient to stop the offending message.
However, by examining the source file that emits the message, I was able to determine that setting the .krb5.conf property to /dev/null was enough to suppress it. Obviously, if you actually have a krb5 configuration, it is better to specify the actual path to it.
In total, my hadoop-env.sh snippet is as follows:
HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf=/dev/null"
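To confirm the flags actually reached the running JVMs, you can use jps, which ships with the JDK and can print each Java process's JVM arguments; a quick check, as a sketch:
# -l prints the full main class, -v prints the JVM arguments; grep for our overrides
jps -lv | grep krb5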
I'm having the same issue on OS X 10.8.2, Java version 1.7.0_21. Unfortunately, the above solution does not fix the problem with this version :(
Edit: I found the solution to this, based on a hint I saw here. In the hadoop-env.sh file, change the JAVA_HOME setting to:
export JAVA_HOME=`/usr/libexec/java_home -v 1.6`
(Note the backticks here, which perform command substitution.)
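You can verify what the backticks will expand to before restarting Hadoop; the output path below is only illustrative and will differ per machine:
# Print the JDK home that -v 1.6 resolves to
/usr/libexec/java_home -v 1.6
# e.g. /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home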
FYI, you can simplify this further by only specifying the following:
export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="
This is mentioned in HADOOP-7489 as well.
I had a similar problem on Mac OS, and after trying different combinations this is what worked for me universally (both Hadoop 1.2 and 2.2). In $HADOOP_HOME/conf/hadoop-env.sh, set the following lines:
# Set Hadoop-specific environment variables here.
export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="
# The java implementation to use.
export JAVA_HOME=`/usr/libexec/java_home -v 1.6`
Hope this helps.
On cdh4.1.3, also add the following before executing start-yarn.sh (or start-all.sh):
YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
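If you prefer not to export it by hand each time, one persistent place for it (assuming a Hadoop 2 layout where the YARN scripts source etc/hadoop/yarn-env.sh; your CDH packaging may differ) is:
# Sketch: append the override to yarn-env.sh so every YARN daemon picks it up
echo 'YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"' >> "$HADOOP_HOME/etc/hadoop/yarn-env.sh"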
I had this error when debugging MapReduce from Eclipse, but it was a red herring. The real problem was that I should have been remote debugging, by adding debugging parameters to JAVA_OPTS:
-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044
Then I created a new "Remote Java Application" profile in the debug configuration, pointed at port 1044.
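As a sketch of wiring that up from the shell (whether the job JVM reads JAVA_OPTS depends on how you launch it; with the Hadoop scripts you may need HADOOP_OPTS instead):
# suspend=y blocks the JVM until the debugger attaches, so early breakpoints are not missed
export JAVA_OPTS="$JAVA_OPTS -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044"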
This article has some more in-depth information about the debugging side of things. It's about Solr, but the process works much the same with Hadoop. If you have trouble, stick a message below and I'll try to help.