
Hadoop DFS is pointing to current directory

A few months ago, we installed Cloudera Hadoop 3 on our local machine and everything was fine. Recently we also installed Whirr to start working with clusters. Although we hit some problems, after a while we could start up a cluster, log into its master node, and begin work. However, I recently found that when I type:

hadoop dfs -ls

on our local machine, it now displays everything in the local directory I am currently in, not the contents of the DFS. This didn't happen before, so we think something got messed up when we installed Whirr.

What could have caused this, and more importantly, how can we get our local hadoop dfs to point to the correct location?


Is `fs.default.name` in the core-site.xml in your Hadoop installation's conf directory set to `file:///`? If so, the "default filesystem" is the local filesystem, which is exactly the behavior you're describing.
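If Whirr (or its setup scripts) rewrote that file, the default filesystem may have been switched from HDFS to the local one. A minimal sketch of what a corrected core-site.xml could look like, assuming a pseudo-distributed NameNode on localhost:8020 (the host and port are placeholders, so substitute whatever your NameNode actually listens on):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- fs.default.name is the pre-Hadoop-2 name for the default filesystem.
       If it is file:///, "hadoop dfs -ls" lists the local working directory
       instead of HDFS. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>
```

After fixing this, `hadoop dfs -ls` should list your HDFS home directory again. As a quick check without editing the file, you can also override the filesystem for a single command with the generic `-fs` option, e.g. `hadoop dfs -fs hdfs://localhost:8020 -ls /`.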
