
Hadoop put command doing nothing!

I am running Cloudera's distribution of Hadoop and everything is working perfectly. HDFS contains a large number of .seq files. I need to merge the contents of all the .seq files into one large .seq file. However, the getmerge command did nothing for me. I then used cat and piped the data of some .seq files into a local file. When I try to "put" this file into HDFS, it does nothing. No error message shows up, and no file is created.

I am able to "touchz" files in HDFS, and user permissions are not a problem here. The put command simply does not work. What am I doing wrong?
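For what it's worth, checking each command's exit status explicitly makes silent failures visible. A minimal sketch of the steps described above, with hypothetical HDFS paths. Note also that getmerge and cat concatenate raw bytes, and each SequenceFile carries its own header and sync markers, so simple concatenation does not in general produce one valid .seq file:

```shell
# Hypothetical paths; substitute your own.
hadoop fs -getmerge /user/hadoop/seqs merged.seq   # byte-level merge attempt
echo "getmerge exit status: $?"

hadoop fs -put merged.seq /user/hadoop/merged.seq  # explicit destination path
echo "put exit status: $?"                         # non-zero means it failed

hadoop fs -ls /user/hadoop                         # did the file appear?
```

If put reports status 0 but no file shows up, the shell wrapper itself may be exiting before the command runs, which is what the second answer below investigates.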


Write a job that merges all the sequence files into a single one. It's just the standard identity mapper and reducer, with only one reduce task.
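The answer above is terse, so here is a hedged sketch of what submitting such a job might look like. The jar and class names (`seqmerge.jar`, `MergeSeqFiles`) and the paths are hypothetical; the comments describe the driver settings the answer refers to:

```shell
# Hypothetical jar/class names. The driver inside would use the stock
# identity Mapper and Reducer, configured roughly as:
#   job.setNumReduceTasks(1);                              // one output file
#   job.setInputFormatClass(SequenceFileInputFormat.class);
#   job.setOutputFormatClass(SequenceFileOutputFormat.class);
hadoop jar seqmerge.jar MergeSeqFiles \
    /user/hadoop/seqs \
    /user/hadoop/merged

# The single reducer writes one part file, itself a valid SequenceFile:
hadoop fs -ls /user/hadoop/merged    # expect a single part-r-00000
```

Unlike cat or getmerge, this preserves the SequenceFile format, because the output format writes a proper header rather than concatenating raw bytes.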


If the "hadoop" command fails silently, you should have a look at it.

Just type `which hadoop`; this will give you the location of the "hadoop" executable. It is a shell script, so just edit it and add logging to see what's going on.

If the hadoop bash script fails at the beginning, it is no surprise that the `hadoop dfs -put` command does not work.
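The debugging technique above can be sketched without a cluster. Here `/tmp/wrapper.sh` is a stand-in for the real `hadoop` script that `which hadoop` points at, and the JAVA_HOME check imitates an early, silent bail-out; running it under `bash -x` traces every command so the exit point becomes visible:

```shell
# Create a dummy wrapper that, like a broken hadoop script, exits early
# without printing anything when its environment check fails.
cat > /tmp/wrapper.sh <<'EOF'
#!/bin/sh
[ -n "$JAVA_HOME" ] || exit 1   # silent early exit, no message printed
echo "running: $@"
EOF
chmod +x /tmp/wrapper.sh

# Run it with tracing (-x) and JAVA_HOME unset, capturing the trace:
rc=0
env -u JAVA_HOME bash -x /tmp/wrapper.sh fs -put file.seq /dest \
    2> /tmp/trace.log || rc=$?
echo "exit status: $rc"          # prints: exit status: 1
tail -n 1 /tmp/trace.log         # the trace ends exactly where it died
```

The same invocation against the real script (`bash -x "$(which hadoop)" fs -put ...`) shows which line the wrapper dies on before it ever reaches the put.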

