Hadoop map-reduce code fails to pick up driver file libcuddpp.so

I came across a strange problem with non-root users on Linux (CentOS).

I'm able to compile and run a Java program with the following commands:

[root@cuda1 hadoop-0.20.2]# javac EnumDevices.java
[root@cuda1 hadoop-0.20.2]# java EnumDevices
Total number of devices: 1
Name: Tesla C1060
Version: 1.3
Clock rate: 1296000 MHz
Threads per block: 512

But I need to run it as another user, hadoop, in CentOS:

[hadoop@ws37-mah-lin hadoop-0.20.2]$ javac EnumDevices.java
[hadoop@ws37-mah-lin hadoop-0.20.2]$ java EnumDevices
NVIDIA: could not open the device file /dev/nvidiactl (Permission denied).
Exception in thread "main" CUDA Driver error: 100
       at jcuda.CUDA.setError(CUDA.java:1874)
       at jcuda.CUDA.init(CUDA.java:62)
       at jcuda.CUDA.<init>(CUDA.java:42)
       at EnumDevices.main(EnumDevices.java:20)
[hadoop@ws37-mah-lin hadoop-0.20.2]$

Actually I need to run a map-reduce job, but first I want to get this simple program working.

Please guide me on how to solve this issue; the CLASSPATH is the same for all users.


Looks like you're running into a problem with device file permissions. Hadoop has nothing to do with this, and neither does the Java classpath. This might be useful:

http://www.linuxquestions.org/questions/slackware-14/could-not-open-dev-nvidiactl-310026/
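In short, root can open /dev/nvidiactl but the hadoop user cannot. A rough sketch of how you might fix this, run as root (the group name "video" is an assumption and varies by distro; check the actual owner/group with ls first):

```shell
# Inspect the current permissions on the NVIDIA device nodes
ls -l /dev/nvidia*

# Option 1: add the hadoop user to the group that owns the device nodes
# ("video" here is an assumption -- substitute whatever group ls showed)
usermod -a -G video hadoop
# hadoop must log out and back in for the new group to take effect

# Option 2 (less secure): open the device nodes to all users
chmod 0666 /dev/nvidiactl /dev/nvidia0
```

Note that a plain chmod does not persist across reboots if the device nodes are recreated, so on a long-lived cluster node you'd want a udev rule or an init script to reapply it.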
