Posted to common-user@hadoop.apache.org by amit kumar verma <v....@verchaska.com> on 2010/07/08 11:22:15 UTC
Fwd: jni files
-------- Original Message --------
Subject: jni files
Date: Thu, 08 Jul 2010 13:38:26 +0530
From: amit kumar verma <v....@verchaska.com>
Reply-To: general@hadoop.apache.org
To: general@hadoop.apache.org
Hi,
I developed a project that uses some native JNI libraries
(liblemur_jni.so). Earlier I used to run the application jar with
-Djava.library.path=/PATH_TO_JNI_FILES, but I am not able to do the same with
the ./hadoop jar command.
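For reference, the standalone invocation was roughly like this (the jar name is a placeholder, not a value from this thread):
    java -Djava.library.path=/PATH_TO_JNI_FILES -jar myproject.jar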
I followed
http://hadoop.apache.org/common/docs/r0.18.3/native_libraries.html
1. First copy the library to the HDFS.
bin/hadoop fs -copyFromLocal mylib.so.1 /libraries/mylib.so.1
2. The job launching program should contain the following:
DistributedCache.createSymlink(conf);
DistributedCache.addCacheFile("hdfs://*
/192.168.0.153:50075*/libraries/mylib.so.1#mylib.so", conf);
3. The map/reduce task can contain:
System.loadLibrary("mylib.so");
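Putting the three steps together, a minimal launcher sketch might look like the following (the NameNode host and port, the paths, and the class name are illustrative assumptions, not values confirmed by this thread):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class NativeLibJob {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Step 2a: have the framework symlink cached files into each
            // task's working directory.
            DistributedCache.createSymlink(conf);
            // Step 2b: the #mylib.so fragment is the symlink name the task
            // sees. Note that addCacheFile takes a java.net.URI, not a plain
            // String. "namenode-host:9000" is an assumed fs.default.name
            // authority, not one taken from this thread.
            DistributedCache.addCacheFile(
                new URI("hdfs://namenode-host:9000/libraries/mylib.so.1#mylib.so"),
                conf);
            Job job = new Job(conf, "native-lib-example");
            job.setJarByClass(NativeLibJob.class);
            // ... set mapper, reducer, and key/value classes here ...
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Step 3 then lives in the mapper, typically as a static initializer that calls System.loadLibrary("mylib.so").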
But when I run the job, I get this error:
Exception in thread "main" java.io.IOException: Call to /192.168.0.153:50075 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
    at org.apache.hadoop.ipc.Client.call(Client.java:743)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy1.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
    at org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:506)
    at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:640)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
    at com.i4dweb.trobo.grid.WordCountNew.main(WordCountNew.java:49)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:375)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:508)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
Please advise.
--
Thanks,
Amit Kumar Verma
Verchaska Infotech Pvt. Ltd.
RE: jni files
Posted by Michael Segel <mi...@hotmail.com>.
Silly question...
Why do you have * in your line?
> DistributedCache.addCacheFile("hdfs://*/192.168.0.153:50075*/libraries/mylib.so.1#mylib.so", conf);
According to the link, the line should be:
DistributedCache.addCacheFile("hdfs://host:port/libraries/mylib.so.1#mylib.so", conf);
So you should have:
DistributedCache.addCacheFile("hdfs://192.168.0.153:50075/libraries/mylib.so.1#mylib.so", conf);
Or am I missing something?
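One more thing worth checking (this is an assumption on my part, not something from the docs): port 50075 is by default a DataNode HTTP port, not the NameNode RPC port, so the hdfs:// URI should probably use the authority from fs.default.name (commonly port 8020 or 9000). That would also fit the EOFException the IPC client throws when calling 192.168.0.153:50075. Also note that addCacheFile takes a java.net.URI, so a corrected call might look like this sketch:

    // 9000 is an assumed NameNode RPC port; take the real value from
    // fs.default.name in your cluster configuration.
    DistributedCache.addCacheFile(
        new URI("hdfs://192.168.0.153:9000/libraries/mylib.so.1#mylib.so"),
        conf);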