Posted to common-user@hadoop.apache.org by Adarsh Sharma <ad...@orkash.com> on 2011/02/24 05:43:51 UTC
Library Issues
Dear all,
I am confused about the concepts involved in running map-reduce jobs on a
Hadoop cluster.
I have attached the program that I run on the cluster. Please find
the attachment.
I can run this program successfully with the commands below, from the
/home/hadoop/project/hadoop-0.20.2 directory of both the master and slave nodes:
[hadoop@cuda1 hadoop-0.20.2]$ javac EnumDevices.java
[hadoop@cuda1 hadoop-0.20.2]$ java EnumDevices
Total number of devices: 1
Name: Tesla C1060
Version: 1.3
Clock rate: 1296000 MHz
Threads per block: 512
but when I use the same code (the jcuda libraries) in a map-reduce job, I
get the errors below:
[hadoop@ws37-mah-lin hadoop-0.20.2]$ bin/hadoop jar wordcount1.jar
org.myorg.WordCount /user/hadoop/gutenberg /user/hadoop/output1
11/02/22 09:59:21 INFO input.FileInputFormat: Total input paths to process : 3
11/02/22 09:59:22 INFO mapred.JobClient: Running job: job_201102220937_0001
11/02/22 09:59:23 INFO mapred.JobClient: map 0% reduce 0%
11/02/22 09:59:33 INFO mapred.JobClient: Task Id : attempt_201102220937_0001_m_000000_0, Status : FAILED
11/02/22 09:59:33 INFO mapred.JobClient: Task Id : attempt_201102220937_0001_m_000001_0, Status : FAILED
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:115)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:569)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
        ... 3 more
*Caused by: java.lang.UnsatisfiedLinkError: no jcuda in java.library.path*
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734)
        at java.lang.Runtime.loadLibrary0(Runtime.java:823)
        at java.lang.System.loadLibrary(System.java:1028)
        at jcuda.driver.CUDADriver.<clinit>(CUDADriver.java:909)
        at jcuda.CUDA.init(CUDA.java:62)
        at jcuda.CUDA.<init>(CUDA.java:42)
        at org.myorg.WordCount$TokenizerMapper.<init>(WordCount.java:28)
        ... 8 more
11/02/22 09:59:42 INFO mapred.JobClient: Task Id : attempt_201102220937_0001_m_000001_1, Status : FAILED
11/02/22 09:59:42 INFO mapred.JobClient: Task Id : attempt_201102220937_0001_m_000000_1, Status : FAILED
11/02/22 09:59:42 INFO mapred.JobClient: Task Id : attempt_201102220937_0001_m_000002_1, Status : FAILED
[each of these attempts failed with the same java.lang.UnsatisfiedLinkError stack trace as above]
11/02/22 09:59:57 INFO mapred.JobClient: Job complete: job_201102220937_0001
11/02/22 09:59:57 INFO mapred.JobClient: Counters: 3
11/02/22 09:59:57 INFO mapred.JobClient: Job Counters
11/02/22 09:59:57 INFO mapred.JobClient: Launched map tasks=12
11/02/22 09:59:57 INFO mapred.JobClient: Data-local map tasks=12
11/02/22 09:59:57 INFO mapred.JobClient: Failed map tasks=1
[hadoop@ws37-mah-lin hadoop-0.20.2]$
My PATH variable shows that it includes all the libraries:
[hadoop@cuda1 ~]$ echo $PATH
/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/home/hadoop/project/hadoop-0.20.2/jcuda.jar:/usr/local/cuda/lib:/home/hadoop/bin
[hadoop@cuda1 ~]$
I don't know how to resolve this error. Please help.
Thanks & best Regards,
Adarsh Sharma
Re: Library Issues
Posted by Harsh J <qw...@gmail.com>.
Hey,
On Thu, Feb 24, 2011 at 10:13 AM, Adarsh Sharma
<ad...@orkash.com> wrote:
> Dear all,
>
> I am confused about the concepts used while running map-reduce jobs in
> Hadoop Cluster.
> I attached a program that is used to run in Hadoop Cluster. Please find the
> attachment.
> My PATH Variable shows that it includes all libraries as
>
> [hadoop@cuda1 ~]$ echo $PATH
> /usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/home/hadoop/project/hadoop-0.20.2/jcuda.jar:/usr/local/cuda/lib:/home/hadoop/bin
> [hadoop@cuda1 ~]$
AFAIK, on Linux java.library.path is seeded from LD_LIBRARY_PATH, while on
Windows it comes from PATH.
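One quick way to check this on a task node (a sketch; "libjcuda.so" is an
assumption inferred from the "no jcuda" error above, and your paths may differ):

```shell
# Run as the hadoop user on a task node. On Linux, java.library.path is
# seeded from LD_LIBRARY_PATH, so that is the variable to inspect; jars on
# $PATH are never consulted for native libraries.
echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-}"

# "no jcuda in java.library.path" means the JVM is looking for libjcuda.so
# (on Linux), so check each directory on the path for it:
found=""
for d in $(echo "${LD_LIBRARY_PATH:-}" | tr ':' ' '); do
    [ -e "$d/libjcuda.so" ] && found="$d/libjcuda.so"
done
echo "libjcuda.so: ${found:-not found on LD_LIBRARY_PATH}"
```

If the second line reports "not found", the child JVMs will throw exactly the
UnsatisfiedLinkError shown in the job output.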
Also, the build/lib/native or lib/native folders (if present) are added to
Hadoop's java.library.path when Hadoop is launched via the bin scripts. You
can place your native libraries there (under the proper Java platform
subdirectory).
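Concretely, the options above could be sketched like this (a sketch only;
the CUDA library path and the library file name are assumptions taken from
the session and error above, so adjust them to your install):

```shell
# Assumed locations (adjust to your cluster):
CUDA_LIB=/usr/local/cuda/lib                      # assumed home of libjcuda.so
HADOOP_HOME=/home/hadoop/project/hadoop-0.20.2    # from the session above

# Option 1: export LD_LIBRARY_PATH in conf/hadoop-env.sh on EVERY node, so
# the TaskTracker (and the child JVMs it spawns) inherit it. Line to add:
echo "export LD_LIBRARY_PATH=${CUDA_LIB}:\$LD_LIBRARY_PATH"

# Option 2: copy the native library into Hadoop's own native-lib directory,
# which the bin scripts add to java.library.path automatically:
echo "cp ${CUDA_LIB}/libjcuda.so ${HADOOP_HOME}/lib/native/Linux-amd64-64/"

# Option 3: pass java.library.path to every child JVM via mapred-site.xml:
cat <<'EOF'
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx200m -Djava.library.path=/usr/local/cuda/lib</value>
</property>
EOF
```

Whichever option you pick, it has to be applied on every node that runs
tasks, and the TaskTracker restarted so the children pick it up.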
--
Harsh J
www.harshj.com